
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
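To make the definition concrete, here is a minimal sketch of an embedding lookup and a similarity query. The vectors below are made up for illustration, not learned; real embeddings are trained from data and typically have 50 to 300 dimensions.

```python
import numpy as np

# Toy vocabulary and a hypothetical 4-dimensional embedding matrix
# (one row per word; values are invented for this example).
vocab = {"king": 0, "queen": 1, "apple": 2}
embeddings = np.array([
    [0.8, 0.1, 0.6, 0.2],   # king
    [0.7, 0.2, 0.6, 0.3],   # queen
    [0.1, 0.9, 0.0, 0.8],   # apple
])

def vector(word):
    """Map a word to its real-valued vector."""
    return embeddings[vocab[word]]

def cosine_similarity(u, v):
    """Semantically related words should score closer to 1."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(vector("king"), vector("queen")))  # high
print(cosine_similarity(vector("king"), vector("apple")))  # low
```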

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
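As one illustration, a Word2Vec model can be trained with the gensim library. This is a sketch assuming gensim 4.x; the tiny corpus and the hyperparameter values are placeholders, not a recommended setup.

```python
from gensim.models import Word2Vec

# Placeholder corpus: a list of tokenized sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["i", "ate", "an", "apple"],
]

# Train a skip-gram Word2Vec model (sg=1); vector_size is the
# embedding dimension, window the context size.
model = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=1)

# Look up a learned vector and query nearest neighbors.
vec = model.wv["king"]
print(model.wv.most_similar("king", topn=3))
```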

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3651–3660 of 4002 papers

Title | Status | Hype
Word Embedding Approach for Synonym Extraction of Multi-Word Terms | Code | 0
BERTRAM: Improved Word Embeddings Have Big Impact on Contextualized Model Performance | Code | 0
A Simple and Effective Usage of Word Clusters for CBOW Model | Code | 0
Investigating the Frequency Distortion of Word Embeddings and Its Impact on Bias Metrics | Code | 0
Explaining word embeddings with perfect fidelity: Case study in research impact prediction | Code | 0
Plumeria at SemEval-2022 Task 6: Robust Approaches for Sarcasm Detection for English and Arabic Using Transformers and Data Augmentation | Code | 0
Sentiment Lexicon Construction with Representation Learning Based on Hierarchical Sentiment Supervision | Code | 0
Exploiting Debate Portals for Semi-Supervised Argumentation Mining in User-Generated Web Discourse | Code | 0
Poincaré GloVe: Hyperbolic Word Embeddings | Code | 0

No leaderboard results yet.