Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
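To make the mapping from words to real-valued vectors concrete, here is a minimal sketch that trains skip-gram Word2Vec embeddings with the gensim library. The toy corpus and all hyperparameter values are illustrative assumptions, not taken from any of the papers listed below:

```python
# Minimal sketch: skip-gram Word2Vec with gensim (assumes gensim >= 4.0 is installed).
from gensim.models import Word2Vec

# A tiny tokenized toy corpus; real training uses millions of sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["embeddings", "map", "words", "to", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=100,  # dimensionality of the embedding vectors
    window=5,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
)

vec = model.wv["king"]  # a 100-dimensional real-valued vector
print(vec.shape)        # (100,)

# Nearest neighbors by cosine similarity between embedding vectors.
print(model.wv.most_similar("king", topn=3))
```

On a realistic corpus, semantically related words (e.g. "king" and "queen") end up with nearby vectors, which is what downstream NLP tasks exploit.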

Papers

Showing 1231–1240 of 4002 papers

| Title | Status | Hype |
|---|---|---|
| One of these words is not like the other: a reproduction of outlier identification using non-contextual word representations | Code | 0 |
| Evaluating Word Embeddings on Low-Resource Languages | — | 0 |
| Diachronic Embeddings for People in the News | — | 0 |
| Evaluating Word Embeddings for Language Acquisition | — | 0 |
| Low-Resource Unsupervised NMT: Diagnosing the Problem and Providing a Linguistically Motivated Solution | Code | 0 |
| How does BERT capture semantics? A closer look at polysemous words | Code | 0 |
| Going Beyond T-SNE: Exposing whatlies in Text Embeddings | — | 0 |
| ISWARA at WNUT-2020 Task 2: Identification of Informative COVID-19 English Tweets using BERT and FastText Embeddings | — | 0 |
| Combining BERT with Static Word Embeddings for Categorizing Social Media | — | 0 |
| “Did you really mean what you said?”: Sarcasm Detection in Hindi-English Code-Mixed Data using Bilingual Word Embeddings | Code | 1 |

Leaderboards

No leaderboard results yet.