
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
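A minimal illustration of this vector-space view: the sketch below compares toy hand-picked embeddings using cosine similarity, the standard measure of closeness between embedding vectors. The words, dimensionality, and vector values are illustrative assumptions, not output of a trained model.

```python
import math

# Toy 3-dimensional embeddings (illustrative values, not from a trained model).
embeddings = {
    "king":  [0.80, 0.65, 0.10],
    "queen": [0.75, 0.70, 0.12],
    "apple": [0.10, 0.05, 0.90],
}

def cosine_similarity(u, v):
    """Cosine of the angle between vectors u and v; 1.0 means identical direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words should end up with higher similarity.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```

With a real trained model the same computation applies; only the vectors (typically 100 to 300 dimensions) come from the learning procedure rather than being written by hand.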

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
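To make one such technique concrete, here is a bare-bones sketch of skip-gram with negative sampling, the training objective at the core of Word2Vec: each word predicts its neighbors (positive pairs) against randomly drawn words (negative pairs) via logistic updates. The tiny corpus, dimensionality, window size, learning rate, and epoch count are illustrative assumptions, not a production implementation.

```python
import math
import random

random.seed(0)

# Toy corpus and vocabulary (illustrative, far too small for real training).
corpus = "the king rules the kingdom the queen rules the kingdom".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

dim = 8
# Input ("center word") and output ("context word") embedding tables.
W_in = [[random.uniform(-0.5, 0.5) / dim for _ in range(dim)] for _ in vocab]
W_out = [[0.0] * dim for _ in vocab]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_pair(center, context, label, lr=0.05):
    """One logistic-regression step: push the pair's score toward label (1 or 0)."""
    v, u = W_in[center], W_out[context]
    score = sigmoid(sum(a * b for a, b in zip(v, u)))
    g = (score - label) * lr
    for i in range(dim):
        # Simultaneous assignment so both updates use the pre-update vectors.
        v[i], u[i] = v[i] - g * u[i], u[i] - g * v[i]

window = 2
for epoch in range(50):
    for pos, w in enumerate(corpus):
        for off in range(-window, window + 1):
            ctx = pos + off
            if off == 0 or not (0 <= ctx < len(corpus)):
                continue
            train_pair(idx[w], idx[corpus[ctx]], 1.0)  # observed (positive) pair
            for _ in range(2):
                # Negative sample: a random word (may occasionally hit the center
                # word; real implementations use a frequency-based sampler).
                train_pair(idx[w], random.randrange(len(vocab)), 0.0)
```

After training, the rows of `W_in` serve as the word vectors; real systems differ mainly in scale (corpus size, vocabulary, vectorized math) rather than in this core update.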

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2751-2760 of 4002 papers

Title | Hype
Word embedding and neural network on grammatical gender -- A case study of Swedish | 0
Word Embedding and WordNet Based Metaphor Identification and Interpretation | 0
Word Embedding-based Antonym Detection using Thesauri and Distributional Information | 0
Word Embedding-Based Automatic MT Evaluation Metric using Word Position Information | 0
Word-Embedding based Content Features for Automated Oral Proficiency Scoring | 0
Word Embedding Calculus in Meaningful Ultradense Subspaces | 0
Word Embedding Evaluation and Combination | 0
Word Embedding Evaluation Datasets and Wikipedia Title Embedding for Chinese | 0
Word Embedding Evaluation for Sinhala | 0
Word Embedding Evaluation in Downstream Tasks and Semantic Analogies | 0
Page 276 of 401

No leaderboard results yet.