
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
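As a minimal illustration of that mapping, the Python sketch below stores a hypothetical, hand-picked 3-dimensional vector for each vocabulary word and compares words by cosine similarity (real embeddings are learned from data and typically have 50-300 dimensions):

```python
import numpy as np

# Hypothetical toy embeddings: each vocabulary word maps to a
# 3-dimensional real-valued vector. The values here are invented
# for illustration, not learned.
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1]),
    "queen": np.array([0.7, 0.7, 0.2]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should sit closer together in the space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```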

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
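For instance, a skip-gram Word2Vec model can be trained in a few lines with the gensim library; this is a sketch assuming the gensim 4.x API and a toy corpus, whereas real training uses far more text:

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences. In practice Word2Vec
# is trained on millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["king", "and", "queen", "are", "royal", "words"],
    ["the", "queen", "rules", "the", "kingdom"],
]

# sg=1 selects the skip-gram objective; vector_size is the
# embedding dimensionality (gensim 4.x argument name).
model = Word2Vec(sentences, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=50)

vector = model.wv["queen"]                     # the learned 50-d embedding
print(model.wv.most_similar("queen", topn=3))  # nearest words in the space
```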

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3711–3720 of 4002 papers (page 372 of 401)

Title | Status | Hype
Should All Cross-Lingual Embeddings Speak English? | Code | 0
Living Machines: A study of atypical animacy | Code | 0
LM-BFF-MS: Improving Few-Shot Fine-tuning of Language Models based on Multiple Soft Demonstration Memory | Code | 0
Baselines and test data for cross-lingual inference | Code | 0
Siamese CBOW: Optimizing Word Embeddings for Sentence Representations | Code | 0
Eye-tracking based classification of Mandarin Chinese readers with and without dyslexia using neural sequence models | Code | 0
Two Methods for Domain Adaptation of Bilingual Tasks: Delightfully Simple and Broadly Applicable | Code | 0
Word Embeddings for the Construction Domain | Code | 0
The Interplay of Semantics and Morphology in Word Embeddings | Code | 0
Word Mover's Embedding: From Word2Vec to Document Embedding | Code | 0

No leaderboard results yet.