
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
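As a toy illustration of that mapping (the vocabulary, dimensionality, and all numeric values below are made up for the sketch, not taken from any trained model), an embedding is simply a lookup from a vocabulary index into a dense real-valued matrix:

```python
import numpy as np

# Hypothetical 5-word vocabulary mapped to 4-dimensional vectors.
# The values are random placeholders, purely for illustration.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3, "apple": 4}
embeddings = np.random.default_rng(0).normal(size=(len(vocab), 4))

def embed(word: str) -> np.ndarray:
    """Map a word to its real-valued vector via a table lookup."""
    return embeddings[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual way to compare embedded words."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))                           # a 4-dimensional real vector
print(cosine(embed("king"), embed("queen")))   # similarity of two word vectors
```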

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.
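A minimal sketch of learning such embeddings with the gensim implementation of Word2Vec (the three-sentence corpus is a stand-in for a real tokenized corpus, and the `vector_size`, `window`, and `epochs` values are illustrative choices, not recommendations; the API shown is gensim 4.x):

```python
from gensim.models import Word2Vec

# Tiny stand-in corpus: in practice this would be millions of tokenized sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "vectors", "from", "context"],
    ["glove", "factorizes", "co-occurrence", "statistics"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=20)

vec = model.wv["vectors"]                        # the learned 50-dimensional vector
print(model.wv.most_similar("vectors", topn=3))  # nearest neighbors in embedding space
```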

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2241–2250 of 4002 papers

Title | Hype
SimpleNets: Quality Estimation with Resource-Light Neural Networks | 0
SimpleScience: Lexical Simplification of Scientific Terminology | 0
Simple task-specific bilingual word embeddings | 0
Simplifying Sentences with Sequence to Sequence Models | 0
SimplifyUR: Unsupervised Lexical Text Simplification for Urdu | 0
Simulating ASR errors for training SLU systems | 0
SINAI at SemEval-2017 Task 4: User based classification | 0
SINAI at SemEval-2021 Task 5: Combining Embeddings in a BiLSTM-CRF model for Toxic Spans Detection | 0
SINAI-DL at SemEval-2019 Task 5: Recurrent networks and data augmentation by paraphrasing | 0
Singleton Detection using Word Embeddings and Neural Networks | 0
Page 225 of 401

No leaderboard results yet.