Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
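To make the definition concrete, here is a minimal sketch of such a mapping, using a toy hand-made vocabulary and NumPy; the vectors and words are purely illustrative (real embeddings are learned from data), and cosine similarity is one common way to compare them:

```python
import numpy as np

# Hypothetical toy embedding table: each word maps to a dense
# real-valued vector. The values here are made up for illustration;
# real systems learn them from a training corpus.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.05, 0.10, 0.90]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1.0 mean similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low: unrelated words
```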

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
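As one concrete example, a hedged sketch of training a Word2Vec model with the gensim library on a tiny corpus; the corpus and hyperparameters here are illustrative assumptions, not a recommended configuration:

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: each sentence is a list of tokens.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["apples", "grow", "on", "trees"],
]

# Train a skip-gram Word2Vec model (sg=1). vector_size, window,
# min_count, and epochs are small illustrative values chosen so the
# example runs quickly on a toy corpus.
model = Word2Vec(
    sentences,
    vector_size=50,
    window=2,
    min_count=1,
    sg=1,
    epochs=100,
)

vec = model.wv["king"]                        # learned 50-dimensional vector for "king"
print(model.wv.most_similar("king", topn=3))  # nearest words by cosine similarity
```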

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 741–750 of 4002 papers (page 75 of 401)

Title | Status | Hype
Caveats of Measuring Semantic Change of Cognates and Borrowings using Multilingual Word Embeddings | Code | 0
Learning and Evaluating Character Representations in Novels | Code | 0
Can We Use Word Embeddings for Enhancing Guarani-Spanish Machine Translation? | Code | 0
Estimating word co-occurrence probabilities from pretrained static embeddings using a log-bilinear model | — | 0
Binary Encoded Word Mover’s Distance | — | 0
Evaluating Biomedical Word Embeddings for Vocabulary Alignment at Scale in the UMLS Metathesaurus Using Siamese Networks | — | 0
LM-BFF-MS: Improving Few-Shot Fine-tuning of Language Models based on Multiple Soft Demonstration Memory | Code | 0
“Vaderland”, “Volk” and “Natie”: Semantic Change Related to Nationalism in Dutch Literature Between 1700 and 1880 Captured with Dynamic Bernoulli Word Embeddings | — | 0
English-Malay Cross-Lingual Embedding Alignment using Bilingual Lexicon Augmentation | — | 0
A Neural Model for Compositional Word Embeddings and Sentence Processing | — | 0

No leaderboard results yet.