
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
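
As a minimal illustration of the mapping (a sketch with made-up vectors, not learned ones; the vocabulary and dimensionality here are arbitrary), the snippet below represents a toy vocabulary as real-valued vectors and compares words by cosine similarity:

```python
import numpy as np

# Toy vocabulary mapped to made-up 4-dimensional real-valued vectors.
# Learned embeddings typically have 50-300 dimensions.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10, 0.20]),
    "queen": np.array([0.82, 0.60, 0.15, 0.25]),
    "apple": np.array([0.10, 0.05, 0.90, 0.30]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors; close to 1.0 for similar words."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```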

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
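
For instance, a Word2Vec model can be trained in a few lines with the gensim library (a sketch, assuming gensim >= 4.0 is installed; the toy corpus and hyperparameters are illustrative only):

```python
from gensim.models import Word2Vec  # assumes gensim >= 4.0

# A tiny illustrative corpus; real models train on millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "farmer", "grows", "an", "apple"],
]

# Skip-gram Word2Vec (sg=1): vector_size sets the embedding dimensionality,
# window the context size around each target word.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vector = model.wv["king"]             # the learned 50-dimensional vector
print(model.wv.most_similar("king"))  # nearest neighbours in embedding space
```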

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1961–1970 of 4002 papers

Title | Status | Hype
Semantic Change in the Language of UK Parliamentary Debates | | 0
Contextualized Diachronic Word Representations | Code | 0
Constrained Sequence-to-sequence Semitic Root Extraction for Enriching Word Embeddings | | 0
Grammar and Meaning: Analysing the Topology of Diachronic Word Embeddings | | 0
Unsupervised Compositional Translation of Multiword Expressions | | 0
Measuring Diachronic Evolution of Evaluative Adjectives with Word Embeddings: the Case for English, Norwegian, and Russian | | 0
JHU System Description for the MADAR Arabic Dialect Identification Shared Task | | 0
ArbEngVec : Arabic-English Cross-Lingual Word Embedding Model | | 0
Equipping Educational Applications with Domain Knowledge | | 0
Enhancing biomedical word embeddings by retrofitting to verb clusters | Code | 0

Leaderboards

No leaderboard results yet.