Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) in which words or phrases from the vocabulary are mapped to vectors of real numbers.
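
As a minimal illustration of the idea, the sketch below represents words as real-valued vectors and compares them with cosine similarity. The vectors here are made up for the example; a trained model would learn them from data.

```python
import numpy as np

# Toy 4-dimensional embeddings; real models typically use 100-1000
# dimensions and learn the values from a corpus. These are arbitrary.
embeddings = {
    "king":  np.array([0.8, 0.1, 0.7, 0.2]),
    "queen": np.array([0.7, 0.2, 0.8, 0.2]),
    "apple": np.array([0.1, 0.9, 0.0, 0.6]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```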

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
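
For instance, a small Word2Vec model can be trained with the gensim library. This is a sketch under the assumption that gensim (4.x) is installed; the corpus and hyperparameters are purely illustrative.

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: each document is a list of tokens.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["similar", "words", "get", "similar", "vectors"],
]

# Hyperparameters are arbitrary for this toy example; real training
# uses far larger corpora and tuned settings.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, epochs=100)

vec = model.wv["embeddings"]           # the learned 50-dimensional vector
print(model.wv.most_similar("words"))  # nearest neighbors in embedding space
```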

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 131–140 of 4002 papers

Title | Status | Hype
Improved Semantic Role Labeling using Parameterized Neighborhood Memory Adaptation | Code | 1
Exploring Text Specific and Blackbox Fairness Algorithms in Multimodal Clinical NLP | Code | 1
IndicNLPSuite: Monolingual Corpora, Evaluation Benchmarks and Pre-trained Multilingual Language Models for Indian Languages | Code | 1
CODER: Knowledge infused cross-lingual medical term embedding for term normalization | Code | 1
"Did you really mean what you said?": Sarcasm Detection in Hindi-English Code-Mixed Data using Bilingual Word Embeddings | Code | 1
Emotion Understanding in Videos Through Body, Context, and Visual-Semantic Embedding Loss | Code | 1
Multimodal Metric Learning for Tag-based Music Retrieval | Code | 1
Combining Self-Training and Self-Supervised Learning for Unsupervised Disfluency Detection | Code | 1
Named Entity Recognition for Social Media Texts with Semantic Augmentation | Code | 1
Learning Contextualised Cross-lingual Word Embeddings and Alignments for Extremely Low-Resource Languages Using Parallel Corpora | Code | 1

No leaderboard results yet.