
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that train on an NLP task such as language modeling or document classification.
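As a concrete illustration of the word-to-vector mapping described above, the following is a minimal sketch that trains skip-gram Word2Vec embeddings with the gensim library. The toy corpus and hyperparameters are purely illustrative assumptions (not drawn from any paper listed on this page), and it assumes the gensim >= 4.0 API:

```python
# Minimal sketch: learning word embeddings with skip-gram Word2Vec via gensim.
# Assumes gensim >= 4.0; the corpus and hyperparameters are illustrative only.
from gensim.models import Word2Vec

# Each sentence is a list of tokens; real use would tokenize a large corpus.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["word2vec", "trains", "on", "a", "language", "modeling", "task"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the real-valued vectors
    window=2,        # context window size
    min_count=1,     # keep every token in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
)

vec = model.wv["words"]                 # learned 50-dim vector for "words"
print(model.wv.most_similar("words"))   # nearest neighbors by cosine similarity
```

After training, the `model.wv` interface exposes the learned vectors directly, so similarity queries such as `most_similar` amount to cosine-similarity lookups in the embedding space.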

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 2821–2830 of 4002 (page 283 of 401)

Title | Status | Hype
LSTMEmbed: Learning Word and Sense Representations from a Large Semantically Annotated Corpus with Long Short-Term Memories | | 0
LSX_team5 at SemEval-2022 Task 8: Multilingual News Article Similarity Assessment based on Word- and Sentence Mover’s Distance | | 0
LT3 at SemEval-2020 Task 7: Comparing Feature-Based and Transformer-Based Approaches to Detect Funny Headlines | | 0
LT3 at SemEval-2020 Task 9: Cross-lingual Embeddings for Sentiment Analysis of Hinglish Social Media Text | | 0
LTSG: Latent Topical Skip-Gram for Mutually Learning Topic Model and Vector Representations | | 0
Lucene for Approximate Nearest-Neighbors Search on Arbitrary Dense Vectors | | 0
Lump at SemEval-2017 Task 1: Towards an Interlingua Semantic Similarity | | 0
LyS at SemEval-2016 Task 4: Exploiting Neural Activation Values for Twitter Sentiment Classification and Quantification | | 0
Machine Comprehension with Syntax, Frames, and Semantics | | 0
Machine Learning Based on Natural Language Processing to Detect Cardiac Failure in Clinical Narratives | | 0

No leaderboard results yet.