
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
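The core idea above, that each vocabulary word is mapped to a dense vector of real numbers and that geometric closeness stands in for semantic similarity, can be sketched in a few lines. This is a minimal illustration with randomly initialized (untrained) vectors, not a trained Word2Vec or GloVe model; the vocabulary, dimensionality, and `cosine` helper are assumptions for the example.

```python
import math
import random

random.seed(0)

# Hypothetical vocabulary and embedding size; real models use
# vocabularies of 10^5+ words and 50-300 dimensions.
vocab = ["king", "queen", "man", "woman"]
dim = 8

# Each word maps to a vector of real numbers. Here the vectors are
# random; Word2Vec/GloVe would instead learn them from a corpus.
embeddings = {w: [random.gauss(0.0, 1.0) for _ in range(dim)] for w in vocab}

def cosine(u, v):
    """Cosine similarity, the usual way to compare embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

print(cosine(embeddings["king"], embeddings["queen"]))
```

In a trained model, similarity scores like this would reflect distributional semantics (e.g. "king" closer to "queen" than to an unrelated word); with random vectors the value is meaningless and only the mechanics are shown.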

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 991–1000 of 4002 papers

Title | Status | Hype
Opinions are Made to be Changed: Temporally Adaptive Stance Classification | Code | 0
Sarcasm Detection in Twitter -- Performance Impact while using Data Augmentation: Word Embeddings | Code | 0
Yseop at FinSim-3 Shared Task 2021: Specializing Financial Domain Learning with Phrase Representations | - | 0
How Cute is Pikachu? Gathering and Ranking Pokémon Properties from Data with Pokémon Word Embeddings | - | 0
FeelsGoodMan: Inferring Semantics of Twitch Neologisms | - | 0
Challenges and Applications of Automated Extraction of Socio-political Events from Text (CASE 2021): Workshop and Shared Task Report | - | 0
Diachronic Analysis of German Parliamentary Proceedings: Ideological Shifts through the Lens of Political Biases | Code | 0
Statistical Dependency Guided Contrastive Learning for Multiple Labeling in Prenatal Ultrasound | - | 0
Efficacy of BERT embeddings on predicting disaster from Twitter data | Code | 0
Deriving Disinformation Insights from Geolocalized Twitter Callouts | Code | 0
Page 100 of 401

No leaderboard results yet.