SOTA Verified

Reranking

Papers

Showing 426–450 of 586 papers

Title | Status | Hype
WikiUMLS: Aligning UMLS to Wikipedia via Cross-lingual Neural Ranking | Code | 1
Fine-grained Morphosyntactic Analysis and Generation Tools for More Than One Thousand Languages | - | 0
Tired of Topic Models? Clusters of Pretrained Word Embeddings Make for Fast and Good Topics too! | Code | 1
Complementing Lexical Retrieval with Semantic Residual Embedding | - | 0
Fast and Memory-Efficient Neural Code Completion | - | 0
Fast and Accurate Deep Bidirectional Language Representations for Unsupervised Learning | Code | 1
Have Your Text and Use It Too! End-to-End Neural Data-to-Text Generation with Semantic Fidelity | Code | 1
TREC CAsT 2019: The Conversational Assistance Track Overview | Code | 1
Overview of the TREC 2019 Fair Ranking Track | - | 0
Pseudo Labeling and Negative Feedback Learning for Large-scale Multi-label Domain Classification | - | 0
Fine-Grained Fashion Similarity Learning by Attribute-Specific Embedding Network | Code | 1
Cost-effective Interactive Attention Learning with Neural Attention Process | Code | 0
Samsung and University of Edinburgh’s System for the IWSLT 2019 | - | 0
The OSU/Facebook Realizer for SRST 2019: Seq2Seq Inflection and Serialized Tree2Tree Linearization | - | 0
KNU-HYUNDAI's NMT system for Scientific Paper and Patent Tasks on WAT 2019 | - | 0
Team SVMrank: Leveraging Feature-rich Support Vector Machines for Ranking Explanations to Elementary Science Questions | - | 0
UCSMNLP: Statistical Machine Translation for WAT 2019 | - | 0
Putting Machine Translation in Context with the Noisy Channel Model | - | 0
Improving Quality and Efficiency in Plan-based Neural Data-to-Text Generation | Code | 0
A Sketch-Based System for Semantic Parsing | Code | 0
Subword Language Model for Query Auto-Completion | Code | 0
Simple and Effective Noisy Channel Modeling for Neural Machine Translation | - | 0
A Multi-Type Multi-Span Network for Reading Comprehension that Requires Discrete Reasoning | Code | 0
The NiuTrans Machine Translation Systems for WMT19 | - | 0
GTCOM Neural Machine Translation Systems for WMT19 | - | 0
Page 18 of 24
