
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
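As a minimal sketch of the idea (with made-up vectors, not learned ones), the snippet below maps a few words to dense real-valued vectors and compares them with cosine similarity, the standard measure of closeness in embedding space:

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a dense real-valued vector.
# These 4-dimensional vectors are invented for illustration; learned embeddings
# typically have 50-300 dimensions.
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1, 0.2]),
    "queen": np.array([0.7, 0.7, 0.1, 0.3]),
    "apple": np.array([0.1, 0.0, 0.9, 0.8]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1 mean similar."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low: unrelated words
```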

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network approaches that train on an NLP task such as language modeling or document classification.
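As one concrete example, the sketch below trains a small skip-gram Word2Vec model with the gensim library (assumed installed via pip install gensim); the toy corpus and hyperparameters are illustrative only:

```python
from gensim.models import Word2Vec

# A tiny tokenized corpus, purely for demonstration.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=3,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,        # many passes, since the corpus is tiny
)

vec = model.wv["embeddings"]                      # learned vector for one word
print(model.wv.most_similar("embeddings", topn=3))  # nearest neighbors by cosine
```

On a realistic corpus, larger values such as vector_size in the 100-300 range and min_count of 5 or more are more typical.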

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 1761–1770 of 4002 (page 177 of 401)

- Incorporating Sub-Word Level Information in Language Invariant Neural Event Detection
- Deconstructing and reconstructing word embedding algorithms
- Inducing Relational Knowledge from BERT
- RETRO: Relation Retrofitting For In-Database Machine Learning on Textual Data
- Word Embedding based New Corpus for Low-resourced Language: Sindhi
- Taking a Stance on Fake News: Towards Automatic Disinformation Assessment via Deep Bidirectional Transformer Language Models for Stance Detection
- Hybrid Text Feature Modeling for Disease Group Prediction using Unstructured Physician Notes
- Word-Class Embeddings for Multiclass Text Classification (code available)
- City2City: Translating Place Representations across Cities
- A Causal Inference Method for Reducing Gender Bias in Word Embedding Relations (code available)

No leaderboard results yet.