
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
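As an illustration of the idea above, here is a minimal sketch of a count-based embedding in the spirit of the GloVe family: build a word-word co-occurrence matrix from a toy corpus and factorize it with SVD to get dense real-valued vectors (an LSA-style approach). The corpus, window size, and dimensionality are arbitrary choices for the example, not from the source.

```python
import numpy as np

# Toy corpus; real embeddings are trained on corpora with billions of tokens.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat and a dog played".split(),
]

# Build the vocabulary and a word -> row index map.
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a +/-2-token window.
window = 2
C = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if j != i:
                C[idx[w], idx[sent[j]]] += 1.0

# Factorize the count matrix; the top-k singular directions give
# dense word vectors, one row per vocabulary word.
U, S, _ = np.linalg.svd(C, full_matrices=False)
k = 2
embeddings = U[:, :k] * S[:k]

def most_similar(word):
    """Rank all other vocabulary words by cosine similarity to `word`."""
    v = embeddings[idx[word]]
    norms = np.linalg.norm(embeddings, axis=1) * np.linalg.norm(v)
    sims = embeddings @ v / np.maximum(norms, 1e-12)
    order = np.argsort(-sims)
    return [vocab[i] for i in order if vocab[i] != word]

print(most_similar("cat")[:3])
```

Prediction-based methods such as Word2Vec instead learn the vectors by gradient descent on a context-prediction objective, but the end product is the same: one real-valued vector per word, with similar words close together.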

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1721–1730 of 4002 papers

Title | Status | Hype
EduBERT: Pretrained Deep Language Models for Learning Analytics | — | 0
Incorporating Sub-Word Level Information in Language Invariant Neural Event Detection | — | 0
Deconstructing and reconstructing word embedding algorithms | — | 0
RETRO: Relation Retrofitting For In-Database Machine Learning on Textual Data | — | 0
Word Embedding based New Corpus for Low-resourced Language: Sindhi | — | 0
Inducing Relational Knowledge from BERT | — | 0
DeFINE: DEep Factorized INput Token Embeddings for Neural Sequence Modeling | Code | 1
Taking a Stance on Fake News: Towards Automatic Disinformation Assessment via Deep Bidirectional Transformer Language Models for Stance Detection | — | 0
Hybrid Text Feature Modeling for Disease Group Prediction using Unstructured Physician Notes | — | 0
Word-Class Embeddings for Multiclass Text Classification | Code | 0

No leaderboard results yet.