Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
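To make the word-to-vector mapping concrete, here is a minimal sketch of training a Word2Vec model on a toy corpus with the gensim library (assuming its 4.x API); the corpus and hyperparameter values are illustrative only, not a reference implementation.

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens (illustrative data only).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
]

# Train a small skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Each vocabulary word is now a 50-dimensional real-valued vector.
vec = model.wv["words"]
print(vec.shape)  # (50,)

# Cosine similarity in the embedding space ranks related words.
print(model.wv.most_similar("words", topn=3))
```

On a corpus this small the similarities are essentially noise; the point is only the shape of the workflow: tokenized sentences in, a dense real-valued vector per vocabulary word out.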

Papers

Showing 3041–3050 of 4002 papers

Title | Status | Hype
Nonsymbolic Text Representation | | 0
Normalization of Transliterated Words in Code-Mixed Data Using Seq2Seq Model & Levenshtein Distance | | 0
NORMA: Neighborhood Sensitive Maps for Multilingual Word Embeddings | | 0
NormCo: Deep Disease Normalization for Biomedical Knowledge Base Construction | | 0
Norm of Word Embedding Encodes Information Gain | | 0
NormXLogit: The Head-on-Top Never Lies | | 0
Not just about size - A Study on the Role of Distributed Word Representations in the Analysis of Scientific Publications | | 0
Not wacky vs. definitely wacky: A study of scalar adverbs in pretrained language models | | 0
NRC-Canada at SMM4H Shared Task: Classifying Tweets Mentioning Adverse Drug Reactions and Medication Intake | | 0
NRC: Infused Phrase Vectors for Named Entity Recognition in Twitter | | 0
Page 305 of 401

No leaderboard results yet.