
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
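As a concrete illustration of the idea, the sketch below trains skip-gram embeddings with the gensim library's Word2Vec class. The toy corpus and hyperparameter values are illustrative assumptions, not settings drawn from any paper listed on this page.

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec.
# Corpus and hyperparameters below are toy assumptions for demonstration.
from gensim.models import Word2Vec

# Each document is a list of tokens; a real corpus would be far larger.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["vectors", "of", "real", "numbers", "capture", "meaning"],
    ["word2vec", "and", "glove", "learn", "such", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the embedding vectors
    window=3,        # context window size
    min_count=1,     # keep every token in this tiny corpus
    sg=1,            # 1 = skip-gram; 0 = CBOW
    epochs=50,
)

# Every vocabulary word is now mapped to a dense real-valued vector ...
vec = model.wv["vectors"]
print(vec.shape)  # (50,)

# ... and words used in similar contexts end up close in the vector space.
print(model.wv.most_similar("vectors", topn=3))
```

In practice the same API is used with corpora of millions of sentences, and the resulting vectors are then fed into downstream NLP models.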

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 2771–2780 of 4002 (page 278 of 401)

Title | Status | Hype
Lexical and Semantic Features for Cross-lingual Text Reuse Classification: an Experiment in English and Latin Paraphrases | | 0
Lexical Chains meet Word Embeddings in Document-level Statistical Machine Translation | | 0
Lexical Coherence Graph Modeling Using Word Embeddings | | 0
Lexical Comparison Between Wikipedia and Twitter Corpora by Using Word Embeddings | | 0
Lexical Induction of Morphological and Orthographic Forms for Low-Resourced Languages | | 0
Lexicalized vs. Delexicalized Parsing in Low-Resource Scenarios | | 0
Lexical Relation Mining in Neural Word Embeddings | | 0
Lexical semantics enhanced neural word embeddings | | 0
Lexical Simplification with Neural Ranking | | 0
Lexical Simplification with the Deep Structured Similarity Model | | 0

No leaderboard results yet.