
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
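For concreteness, the sketch below shows what such a mapping looks like in practice: each word in a small vocabulary is assigned a dense, low-dimensional real-valued vector, and similarity between words can be measured as the cosine similarity of their vectors. The vocabulary, dimensionality, and (random) vectors here are purely illustrative; in a trained model the vectors would be learned from data.

```python
import numpy as np

# Hypothetical toy example: map each word in a small vocabulary
# to a dense 4-dimensional vector of real numbers.
rng = np.random.default_rng(seed=0)
vocabulary = ["king", "queen", "apple", "orange"]
embedding_dim = 4

# In a trained embedding model these vectors are learned; here
# they are random placeholders just to show the data structure.
embeddings = {word: rng.standard_normal(embedding_dim) for word in vocabulary}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embeddings["king"])  # a real-valued vector
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
```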

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
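As a minimal sketch of one such technique, the example below trains a skip-gram Word2Vec model with the gensim library (assuming gensim 4.x, where the dimensionality parameter is named vector_size). The tiny corpus and hyperparameter values are illustrative assumptions, not recommendations.

```python
from gensim.models import Word2Vec

# Hypothetical toy corpus: each sentence is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# Train a skip-gram Word2Vec model (sg=1); parameters are illustrative.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep all tokens in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vector = model.wv["embeddings"]                 # the learned vector for a word
print(model.wv.most_similar("embeddings", topn=3))
```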

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3611–3620 of 4002 papers

Title | Status | Hype
Sensing Ambiguity in Henry James' "The Turn of the Screw" | Code | 0
Assessing Social and Intersectional Biases in Contextualized Word Representations | Code | 0
Paraphrases do not explain word analogies | Code | 0
Learning Word Embeddings with Domain Awareness | Code | 0
Evaluation of Croatian Word Embeddings | Code | 0
textTOvec: Deep Contextualized Neural Autoregressive Topic Models of Language with Distributed Compositional Prior | Code | 0
Sentence Alignment with Parallel Documents Facilitates Biomedical Machine Translation | Code | 0
Learning Word Meta-Embeddings by Autoencoding | Code | 0
Truly unsupervised acoustic word embeddings using weak top-down constraints in encoder-decoder models | Code | 0
Urdu Word Embeddings | Code | 0

No leaderboard results yet.