
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
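
To make "mapped to vectors of real numbers" concrete, here is a minimal sketch that looks up word vectors and compares them with cosine similarity, the usual measure of closeness in embedding space. The words and 4-dimensional vectors are illustrative stand-ins, not learned embeddings.

```python
import numpy as np

# Toy embedding table mapping words to vectors of real numbers.
# These 4-dimensional vectors are hand-picked for illustration;
# learned embeddings typically have 50-300+ dimensions.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.1, 0.9]),
    "apple": np.array([0.1, 0.9, 0.8, 0.2]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should sit closer together in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```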

Techniques for learning word embeddings include neural network-based methods such as Word2Vec, count-based methods such as GloVe, and other approaches that train on an NLP task such as language modeling or document classification. A training sketch follows below.
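
As one example of how such embeddings can be trained in practice, the sketch below uses the gensim library's Word2Vec implementation in its skip-gram variant. The tiny corpus and all hyperparameter values are illustrative assumptions, not settings taken from any paper listed here.

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: a list of tokenized sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "uses", "global", "cooccurrence", "statistics"],
]

# Train a skip-gram model (sg=1); hyperparameters are illustrative.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["embeddings"]           # learned vector for one word
print(model.wv.most_similar("word"))   # nearest neighbors in embedding space
```

A realistic setup would train on a corpus of millions of tokens; on a corpus this small the nearest neighbors are essentially noise.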

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 2611-2620 of 4002

Knowing the Author by the Company His Words Keep
Knowledge-Base Enriched Word Embeddings for Biomedical Domain
Knowledge Distillation for Bilingual Dictionary Induction
Knowledge-Driven Event Embedding for Stock Prediction
Knowledge Graph and Text Jointly Embedding
Knowledge Graph-Augmented Language Models for Knowledge-Grounded Dialogue Generation
L2F/INESC-ID at SemEval-2017 Tasks 1 and 2: Lexical and semantic features in word and textual similarity
L3Cube-MahaCorpus and MahaBERT: Marathi Monolingual Corpus, Marathi BERT Language Models, and Resources
Lacking the embedding of a word? Look it up into a traditional dictionary
LACoS-BLOOM: Low-rank Adaptation with Contrastive objective on 8 bits Siamese-BLOOM
