
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
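To make the definition concrete, here is a minimal sketch of the core idea: each word maps to a dense real-valued vector, and geometric closeness (e.g., cosine similarity) stands in for semantic similarity. The 4-dimensional vectors below are toy values chosen purely for illustration, not the output of any trained model.

```python
import numpy as np

# Toy embedding table: word -> real-valued vector (values are illustrative).
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10, 0.05]),
    "queen": np.array([0.78, 0.70, 0.12, 0.08]),
    "apple": np.array([0.05, 0.10, 0.90, 0.60]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # close to 1.0
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # noticeably lower
```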

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
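As a hedged sketch of what such training looks like in practice, the example below fits Word2Vec embeddings with the gensim library. Gensim is one common implementation, not something prescribed by this page, and the tiny corpus is illustrative only; useful embeddings require far more text.

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (illustrative data only).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "co-occurrence", "counts"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this toy corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["embeddings"]                       # learned 50-dim vector
print(model.wv.most_similar("embeddings", topn=3)) # nearest neighbors by cosine
```

The skip-gram objective (sg=1) predicts surrounding context words from a center word; CBOW does the reverse. Either way, the learned vectors are the mapping from vocabulary to real-number space described above.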

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 451–460 of 4,002 papers

Title | Status | Hype
Evaluating Multilingual Text Encoders for Unsupervised Cross-Lingual Retrieval | Code | 0
Evaluating Neural Word Embeddings for Sanskrit | Code | 0
Attentive Mimicking: Better Word Embeddings by Attending to Informative Contexts | Code | 0
Attentive Neural Network for Named Entity Recognition in Vietnamese | Code | 0
Analyzing Structures in the Semantic Vector Space: A Framework for Decomposing Word Embeddings | Code | 0
Evaluating Word Embeddings in Multi-label Classification Using Fine-grained Name Typing | Code | 0
Evaluation of Croatian Word Embeddings | Code | 0
Evaluation of sentence embeddings in downstream and linguistic probing tasks | Code | 0
Evolution of emotion semantics | Code | 0
Cross-Lingual BERT Transformation for Zero-Shot Dependency Parsing | Code | 0
