Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) in which words or phrases from the vocabulary are mapped to vectors of real numbers.
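
In code terms, an embedding is simply a lookup from a word to a dense real-valued vector, and similarity between words can then be measured by the cosine of the angle between their vectors. The sketch below uses NumPy with invented 4-dimensional vectors purely for illustration; real embeddings are learned from corpora and typically have tens to hundreds of dimensions.

```python
import numpy as np

# Hypothetical 4-dimensional embeddings, invented for illustration only;
# learned embeddings usually have 50-300 dimensions.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.15, 0.10]),
    "queen": np.array([0.48, 0.71, -0.12, 0.14]),
    "apple": np.array([-0.30, 0.05, 0.88, -0.20]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two vectors; near 1.0 for similar words."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```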

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train embeddings as part of an NLP task such as language modeling or document classification.
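
As a concrete example, the sketch below trains a skip-gram Word2Vec model on a toy corpus using the gensim library; this is one possible implementation, not a toolkit the page itself prescribes, and the corpus, hyperparameters, and query words are illustrative assumptions only.

```python
from gensim.models import Word2Vec

# Toy tokenized corpus; any iterable of token lists works.
sentences = [
    ["natural", "language", "processing", "maps", "words", "to", "vectors"],
    ["word", "embeddings", "capture", "semantic", "similarity"],
    ["language", "models", "learn", "word", "embeddings", "from", "text"],
]

# sg=1 selects the skip-gram objective (sg=0 would use CBOW);
# min_count=1 keeps rare words so the tiny corpus is not filtered out.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

vector = model.wv["embeddings"]                     # learned 50-dim vector for a word
neighbors = model.wv.most_similar("word", topn=3)   # nearest words by cosine similarity
print(vector.shape, neighbors)
```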

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3961–3970 of 4002 papers

Title | Status | Hype
Topic Modeling on User Stories using Word Mover's Distance | Code | 0
Recurrent neural networks with specialized word embeddings for health-domain named-entity recognition | Code | 0
Topic Modeling over Short Texts by Incorporating Word Embeddings | Code | 0
Acquiring Common Sense Spatial Knowledge through Implicit Spatial Templates | Code | 0
[RE] Double-Hard Debias: Tailoring Word Embeddings for Gender Bias Mitigation | Code | 0
Morphology-Aware Meta-Embeddings for Tamil | Code | 0
Acoustic word embeddings for zero-resource languages using self-supervised contrastive learning and multilingual adaptation | Code | 0
Morphosyntactic Tagging with a Meta-BiLSTM Model over Context Sensitive Token Encodings | Code | 0
IITK at SemEval-2024 Task 1: Contrastive Learning and Autoencoders for Semantic Textual Relatedness in Multilingual Texts | Code | 0
Automatic Extraction of Nested Entities in Clinical Referrals in Spanish | Code | 0
