
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
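As a concrete illustration of the definition above, here is a minimal sketch of mapping vocabulary words to real-valued vectors via an embedding matrix. The vocabulary, dimensionality, and random initialization are illustrative assumptions, not anything taken from this page.

```python
import numpy as np

# Illustrative vocabulary and embedding dimensionality (assumptions).
vocab = ["king", "queen", "man", "woman"]
word_to_index = {word: i for i, word in enumerate(vocab)}

embedding_dim = 4
rng = np.random.default_rng(0)
# One row of real numbers per vocabulary word; in practice these rows
# would be learned, not randomly initialized and left as-is.
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Look up the real-valued vector for a word."""
    return embeddings[word_to_index[word]]

print(embed("king"))  # a 4-dimensional real vector
```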

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification. A minimal training sketch follows below.
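A minimal training sketch along these lines, using the gensim library's Word2Vec implementation in its skip-gram variant. The toy corpus and hyperparameters are illustrative choices, not taken from any paper listed here.

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (illustrative only).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
]

# Train a skip-gram model (sg=1); vector_size, window, and epochs
# are small here only so the example runs quickly.
model = Word2Vec(sentences, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=50)

vector = model.wv["embeddings"]          # 50-dimensional vector for a word
similar = model.wv.most_similar("word")  # nearest neighbours in vector space
```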

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3921-3930 of 4002 papers

Title | Status | Hype
Visual Word2Vec (vis-w2v): Learning Visually Grounded Word Embeddings Using Abstract Scenes | Code | 0
How Familiar Does That Sound? Cross-Lingual Representational Similarity Analysis of Acoustic Word Embeddings | Code | 0
How Gender and Skin Tone Modifiers Affect Emoji Semantics in Twitter | Code | 0
RAW-C: Relatedness of Ambiguous Words--in Context (A New Lexical Resource for English) | Code | 0
A Comprehensive Comparison of Word Embeddings in Event & Entity Coreference Resolution | Code | 0
Sparse Victory -- A Large Scale Systematic Comparison of count-based and prediction-based vectorizers for text classification | Code | 0
"Wikily" Supervised Neural Translation Tailored to Cross-Lingual Tasks | Code | 0
A Bi-Encoder LSTM Model For Learning Unstructured Dialogs | Code | 0
The Pupil Has Become the Master: Teacher-Student Model-Based Word Embedding Distillation with Ensemble Learning | Code | 0
Specialising Word Vectors for Lexical Entailment | Code | 0
Page 393 of 401

No leaderboard results yet.