
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
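To make the definition concrete, here is a minimal sketch of the idea of mapping vocabulary words to real-valued vectors and comparing them. The toy vocabulary, the dimension, and the random vectors are illustrative assumptions; trained embeddings would come from a model such as Word2Vec or GloVe.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabulary; each word is mapped to a dense real-valued vector.
# In a trained model these vectors would encode distributional similarity.
vocab = ["king", "queen", "apple", "banana"]
dim = 8
embeddings = {w: rng.normal(size=dim) for w in vocab}

def cosine(u, v):
    """Cosine similarity between two embedding vectors, in [-1, 1]."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# With trained embeddings, semantically related words score higher.
sim = cosine(embeddings["king"], embeddings["queen"])
print(round(sim, 3))
```

Because the vectors here are random, the similarity score is arbitrary; the point is only the interface: word → vector, and vector arithmetic for comparison.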

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
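The training idea behind Word2Vec's skip-gram variant can be sketched in a few lines of numpy: each center word is pulled toward the output vectors of its context words (positive pairs) and pushed away from a few randomly sampled words (negative sampling). The toy corpus, window size, dimensions, and learning rate below are illustrative assumptions, not a faithful reimplementation of any library.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical toy corpus; a real setup would use a large tokenized text.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, dim, window = len(vocab), 16, 2

# Skip-gram keeps two matrices: input (center-word) and output (context) vectors.
W_in = rng.normal(scale=0.1, size=(V, dim))
W_out = rng.normal(scale=0.1, size=(V, dim))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr, neg_k = 0.05, 3
for epoch in range(200):
    for pos, word in enumerate(corpus):
        center = idx[word]
        for off in range(-window, window + 1):
            ctx_pos = pos + off
            if off == 0 or ctx_pos < 0 or ctx_pos >= len(corpus):
                continue
            # One observed (positive) pair plus neg_k random (negative) pairs.
            # A random negative may collide with the true context; a sketch ignores that.
            targets = [(idx[corpus[ctx_pos]], 1.0)]
            targets += [(int(rng.integers(V)), 0.0) for _ in range(neg_k)]
            for ctx, label in targets:
                score = sigmoid(W_in[center] @ W_out[ctx])
                g = score - label          # gradient of logistic loss
                d_in = g * W_out[ctx]      # save before updating W_out
                W_out[ctx] -= lr * g * W_in[center]
                W_in[center] -= lr * d_in

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Words appearing in similar contexts ("cat"/"dog") tend to move closer.
print(cosine(W_in[idx["cat"]], W_in[idx["dog"]]))
```

GloVe instead fits word vectors to global co-occurrence counts rather than scanning windows online, but the resulting objects are the same: one dense vector per vocabulary word.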

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3651–3675 of 4002 papers

Title | Status | Hype
Word Embedding Approach for Synonym Extraction of Multi-Word Terms | Code | 0
BERTRAM: Improved Word Embeddings Have Big Impact on Contextualized Model Performance | Code | 0
A Simple and Effective Usage of Word Clusters for CBOW Model | Code | 0
Investigating the Frequency Distortion of Word Embeddings and Its Impact on Bias Metrics | Code | 0
Explaining word embeddings with perfect fidelity: Case study in research impact prediction | Code | 0
Plumeria at SemEval-2022 Task 6: Robust Approaches for Sarcasm Detection for English and Arabic Using Transformers and Data Augmentation | Code | 0
Sentiment Lexicon Construction with Representation Learning Based on Hierarchical Sentiment Supervision | Code | 0
Exploiting Debate Portals for Semi-Supervised Argumentation Mining in User-Generated Web Discourse | Code | 0
Poincaré GloVe: Hyperbolic Word Embeddings | Code | 0
Poincare Glove: Hyperbolic Word Embeddings | Code | 0
Aligning Multilingual Word Embeddings for Cross-Modal Retrieval Task | Code | 0
A Neural Language Model for Dynamically Representing the Meanings of Unknown Words and Entities in a Discourse | Code | 0
The Dynamic Embedded Topic Model | Code | 0
Exploration of register-dependent lexical semantics using word embeddings | Code | 0
Political Stance in Danish | Code | 0
The Early Modern Dutch Mediascape. Detecting Media Mentions in Chronicles Using Word Embeddings and CRF | Code | 0
ViCE: Improving Dense Representation Learning by Superpixelization and Contrasting Cluster Assignment | Code | 0
A Simple and Effective Approach for Fine Tuning Pre-trained Word Embeddings for Improved Text Classification | Code | 0
ViCo: Word Embeddings from Visual Co-occurrences | Code | 0
Exploring Diachronic Lexical Semantics with JeSemE | Code | 0
Portuguese Word Embeddings: Evaluating on Word Analogies and Natural Language Tasks | Code | 0
SeVeN: Augmenting Word Embeddings with Unsupervised Relation Vectors | Code | 0
When and Why are Pre-trained Word Embeddings Useful for Neural Machine Translation? | Code | 0
SexWEs: Domain-Aware Word Embeddings via Cross-lingual Semantic Specialisation for Chinese Sexism Detection in Social Media | Code | 0
When do Word Embeddings Accurately Reflect Surveys on our Beliefs About People? | Code | 0
Page 147 of 161

No leaderboard results yet.