
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
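To make the idea concrete, the sketch below shows the core operation: a lookup table from words to dense real-valued vectors, with cosine similarity as the usual closeness measure. The vocabulary, dimensionality, and random vectors are illustrative placeholders; real embeddings are learned from data using the techniques described next.

import numpy as np

# Minimal sketch: map each vocabulary word to a dense real-valued vector.
# The vectors here are random placeholders; trained embeddings would place
# semantically related words close together.
rng = np.random.default_rng(seed=0)
vocab = ["king", "queen", "apple", "banana"]
dim = 8  # embedding dimensionality (small, for illustration only)
embeddings = {word: rng.normal(size=dim) for word in vocab}

def cosine_similarity(u, v):
    # Cosine of the angle between two vectors, in [-1, 1].
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Look up two word vectors and compare them.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))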

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that are trained on an NLP task such as language modeling or document classification.
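As one example of these techniques, the sketch below trains a small skip-gram Word2Vec model with the gensim library (assuming gensim 4.x). The toy corpus and the hyperparameter values are illustrative assumptions, not recommended settings; in practice the model is trained on a large corpus.

from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences. Real training data would
# contain millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["apples", "and", "bananas", "are", "fruit"],
]

# Train a skip-gram model (sg=1). vector_size, window, min_count, and
# epochs are illustrative values chosen to work on this tiny corpus.
model = Word2Vec(sentences, vector_size=32, window=2, min_count=1, sg=1, epochs=50)

# Look up the learned vector for a word and query its nearest neighbours.
vec = model.wv["king"]
print(model.wv.most_similar("king", topn=3))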

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1361–1370 of 4002 papers

Each entry lists the paper title, whether code is available, and its hype score (0 for all ten papers shown):

With More Contexts Comes Better Performance: Contextualized Sense Embeddings for All-Round Word Sense Disambiguation
WLV-RIT at HASOC-Dravidian-CodeMix-FIRE2020: Offensive Language Identification in Code-switched YouTube Comments
Robust Backed-off Estimation of Out-of-Vocabulary Embeddings
ISWARA at WNUT-2020 Task 2: Identification of Informative COVID-19 English Tweets using BERT and FastText Embeddings
From Zero to Hero: On the Limitations of Zero-Shot Language Transfer with Multilingual Transformers
Vocabulary Adaptation for Domain Adaptation in Neural Machine Translation (code available)
Alignment-free Cross-lingual Semantic Role Labeling
Low-Resource Unsupervised NMT: Diagnosing the Problem and Providing a Linguistically Motivated Solution (code available)
Machine Translation for English–Inuktitut with Segmentation, Data Acquisition and Pre-Training
The Chilean Waiting List Corpus: a new resource for clinical Named Entity Recognition in Spanish

No leaderboard results yet.