Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
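
To make the mapping concrete, here is a minimal sketch in Python: a hand-built word-to-vector lookup table compared with cosine similarity. The words and vector values are invented for illustration and do not come from any trained model.

```python
import numpy as np

# Toy lookup table mapping words to dense real-valued vectors.
# These 4-dimensional vectors are made up for illustration; learned
# embeddings typically have 50-300+ dimensions.
embeddings = {
    "king":  np.array([0.50, 0.70, -0.20, 0.10]),
    "queen": np.array([0.45, 0.75, -0.15, 0.12]),
    "apple": np.array([-0.60, 0.10, 0.80, -0.30]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should sit closer together in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```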

Techniques for learning word embeddings include Word2Vec, which trains a shallow neural network on a word-prediction task, and GloVe, which fits word vectors to global co-occurrence statistics; embeddings can also be learned as a by-product of training a neural network on an NLP task such as language modeling or document classification.
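
As a hedged sketch of one such technique, the snippet below trains a skip-gram Word2Vec model on a toy corpus using the gensim library (assuming gensim >= 4.0 is installed); the corpus and hyperparameter values are illustrative assumptions, not drawn from any paper listed here.

```python
from gensim.models import Word2Vec

# Tiny tokenized corpus, invented for the example.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

model = Word2Vec(
    sentences=corpus,  # tokenized sentences
    vector_size=50,    # embedding dimensionality
    window=2,          # context window size
    min_count=1,       # keep every token in this tiny corpus
    sg=1,              # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["embeddings"]  # the learned 50-dimensional vector
print(model.wv.most_similar("embeddings", topn=3))
```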

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1241–1250 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| The Chilean Waiting List Corpus: a new resource for clinical Named Entity Recognition in Spanish | | 0 |
| Domain adaptation challenges of BERT in tokenization and sub-word representations of Out-of-Vocabulary words | | 0 |
| Embedding Structured Dictionary Entries | | 0 |
| Cross-Lingual Suicidal-Oriented Word Embedding toward Suicide Prevention | | 0 |
| Cross-lingual Embeddings Reveal Universal and Lineage-Specific Patterns in Grammatical Gender Assignment | | 0 |
| Neutralizing Gender Bias in Word Embeddings with Latent Disentanglement and Counterfactual Generation | | 0 |
| Robust Backed-off Estimation of Out-of-Vocabulary Embeddings | | 0 |
| Analysing Word Representation from the Input and Output Embeddings in Neural Network Language Models | Code | 0 |
| Word associations and the distance properties of context-aware word embeddings | | 0 |
| Rethinking Topic Modelling: From Document-Space to Term-Space | | 0 |
Page 125 of 401

No leaderboard results yet.