
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
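As a concrete illustration of the definition above, the sketch below trains Word2Vec embeddings on a toy corpus and queries the resulting vector space. The gensim library, the corpus, and all hyperparameters are illustrative assumptions; the page itself does not prescribe a toolkit.

```python
# A minimal sketch of learning word embeddings with gensim's Word2Vec.
# Corpus and hyperparameters are made up purely for illustration.
from gensim.models import Word2Vec

# Toy corpus: each document is a pre-tokenized list of words.
corpus = [
    ["king", "rules", "the", "kingdom"],
    ["queen", "rules", "the", "kingdom"],
    ["man", "walks", "in", "the", "city"],
    ["woman", "walks", "in", "the", "city"],
]

# Train skip-gram embeddings (sg=1); each word is mapped to a
# 50-dimensional real-valued vector, per the definition above.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["king"]   # numpy array of shape (50,)
print(vec.shape)

# Nearby words in the embedding space, ranked by cosine similarity.
print(model.wv.most_similar("king", topn=3))
```

On a real corpus the same calls apply unchanged; only the input sentences and hyperparameters (vector size, window, epochs) would grow to match the data.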

Papers

Showing 831–840 of 4,002 papers

Title | Status | Hype
Train Short, Test Long: Attention with Linear Biases Enables Input Length Extrapolation | Code | 2
Sarcasm Detection in Twitter -- Performance Impact while using Data Augmentation: Word Embeddings | Code | 0
Yseop at FinSim-3 Shared Task 2021: Specializing Financial Domain Learning with Phrase Representations | | 0
How Cute is Pikachu? Gathering and Ranking Pokémon Properties from Data with Pokémon Word Embeddings | | 0
FeelsGoodMan: Inferring Semantics of Twitch Neologisms | | 0
Challenges and Applications of Automated Extraction of Socio-political Events from Text (CASE 2021): Workshop and Shared Task Report | | 0
IsoScore: Measuring the Uniformity of Embedding Space Utilization | Code | 1
Diachronic Analysis of German Parliamentary Proceedings: Ideological Shifts through the Lens of Political Biases | Code | 0
Statistical Dependency Guided Contrastive Learning for Multiple Labeling in Prenatal Ultrasound | | 0
Efficacy of BERT embeddings on predicting disaster from Twitter data | Code | 0
Page 84 of 401

Leaderboard

No leaderboard results yet.