
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
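
To make the mapping concrete, here is a minimal Python sketch of the idea: each word corresponds to a real-valued vector, and geometric closeness stands in for semantic similarity. The 3-dimensional toy vectors below are invented for illustration; real embeddings typically have 100 to 300 dimensions.

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a vector of real numbers.
# These 3-dimensional values are made up purely for illustration.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.75, 0.70, 0.15]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```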

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
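
As a hedged sketch of one such technique, the snippet below trains Word2Vec with the gensim library (one common implementation, not the only one; the tiny corpus is a stand-in for real training text, and the hyperparameter values are illustrative assumptions).

```python
from gensim.models import Word2Vec

# Toy corpus: in practice this would be millions of tokenized sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["apples", "grow", "on", "trees"],
]

# vector_size: embedding dimensionality; window: context size;
# min_count=1 keeps every word in this tiny toy corpus.
model = Word2Vec(sentences=corpus, vector_size=50, window=2,
                 min_count=1, epochs=50)

vector = model.wv["king"]              # the learned 50-dimensional vector
print(model.wv.most_similar("king"))   # nearest neighbors in embedding space
```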

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3841–3850 of 4002 papers

Title | Status | Hype
QuanTaxo: A Quantum Approach to Self-Supervised Taxonomy Expansion | Code | 0
GiBERT: Enhancing BERT with Linguistic Information using a Lightweight Gated Injection Method | Code | 0
Word Similarity Datasets for Thai: Construction and Evaluation | Code | 0
The Undesirable Dependence on Frequency of Gender Bias Metrics Based on Word Embeddings | Code | 0
Give your Text Representation Models some Love: the Case for Basque | Code | 0
SocialVec: Social Entity Embeddings | Code | 0
Quantifying Lexical Semantic Shift via Unbalanced Optimal Transport | Code | 0
Argument from Old Man's View: Assessing Social Bias in Argumentation | Code | 0
GLoMo: Unsupervisedly Learned Relational Graphs as Transferable Representations | Code | 0
Aggressive Language Identification Using Word Embeddings and Sentiment Features | Code | 0
Page 385 of 401

No leaderboard results yet.