Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification. A minimal training sketch is shown below.
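
As a concrete illustration, here is a minimal sketch of training skip-gram Word2Vec embeddings with the gensim library. The toy corpus, the 50-dimensional vector size, and the other hyperparameters are illustrative assumptions, not drawn from any paper listed on this page:

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec
# (toy corpus and hyperparameters are illustrative only).
from gensim.models import Word2Vec

# Toy corpus: each document is a pre-tokenized list of words.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "embeddings", "from", "text"],
    ["vectors", "capture", "semantic", "similarity", "between", "words"],
]

# Train a skip-gram model: each vocabulary word is mapped to a
# 50-dimensional real-valued vector learned from its contexts.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,
)

# Look up the learned vector for a word and query nearest neighbors.
vec = model.wv["embeddings"]                    # numpy array, shape (50,)
print(model.wv.most_similar("words", topn=3))   # most similar vocabulary words
```

On a real corpus the learned vectors place semantically related words close together, which is what makes them useful as features for downstream NLP tasks.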

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 871-880 of 4,002 papers (page 88 of 401)

| Title | Status | Hype |
| --- | --- | --- |
| TUDA-CCL at SemEval-2021 Task 1: Using Gradient-boosted Regression Tree Ensembles Trained on a Heterogeneous Feature Set for Predicting Lexical Complexity | | 0 |
| GlossReader at SemEval-2021 Task 2: Reading Definitions Improves Contextualized Word Embeddings | | 0 |
| Stanford MLab at SemEval-2021 Task 1: Tree-Based Modelling of Lexical Complexity using Word Embeddings | | 0 |
| RS_GV at SemEval-2021 Task 1: Sense Relative Lexical Complexity Prediction | | 0 |
| GX at SemEval-2021 Task 2: BERT with Lemma Information for MCL-WiC Task | Code | 0 |
| Learning Embeddings for Rare Words Leveraging Internet Search Engine and Spatial Location Relationships | | 0 |
| Realised Volatility Forecasting: Machine Learning via Financial Word Embedding | | 0 |
| Arabic aspect sentiment polarity classification using BERT | | 0 |
| Language Models as Zero-shot Visual Semantic Learners | | 0 |
| Stress Test Evaluation of Biomedical Word Embeddings | Code | 0 |

No leaderboard results yet.