
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
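
As a toy illustration of this mapping, the sketch below uses NumPy and entirely made-up vector values (real embeddings are learned from large corpora and typically have 50–300 dimensions), showing how semantic relatedness can be read off as vector similarity:

```python
import numpy as np

# Toy illustration: each vocabulary word maps to a dense real-valued vector.
# These values are invented for demonstration, not learned embeddings.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.05, 0.10, 0.90]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words end up close together in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```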

Techniques for learning word embeddings include count-based methods such as GloVe, shallow neural models such as Word2Vec, and other neural approaches that learn embeddings while training on an NLP task such as language modeling or document classification.
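
As a rough sketch of how such embeddings are commonly trained in practice, the example below uses the third-party gensim library (assuming version 4 or later) with a tiny hand-written corpus; real training runs use millions of sentences:

```python
from gensim.models import Word2Vec  # assumes gensim >= 4.0 is installed

# A tiny pre-tokenized toy corpus, purely for demonstration.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "child", "eats", "an", "apple"],
]

# sg=1 selects the skip-gram objective (predict context words from the
# center word); sg=0 would use CBOW (predict the center word from context).
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every word, even rare ones (toy corpus)
    sg=1,
    epochs=50,
    seed=0,
)

vec = model.wv["king"]  # the learned 50-dimensional vector for "king"
print(model.wv.most_similar("king", topn=2))  # nearest neighbors by cosine
```

On a corpus this small the neighbors are not meaningful; the point is only the API shape: train on tokenized sentences, then look up vectors and nearest neighbors through `model.wv`.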

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3931–3940 of 4002 papers

Title | Status | Hype
How to evaluate word embeddings? On importance of data efficiency and simple supervised tasks | Code | 0
How to Evaluate Word Representations of Informal Domain? | Code | 0
How to Generate a Good Word Embedding? | Code | 0
How to (Properly) Evaluate Cross-Lingual Word Embeddings: On Strong Baselines, Comparative Analyses, and Some Misconceptions | Code | 0
Specializing Word Embeddings (for Parsing) by Information Bottleneck | Code | 0
How to Train good Word Embeddings for Biomedical NLP | Code | 0
Modelling Salient Features as Directions in Fine-Tuned Semantic Spaces | Code | 0
Word Embeddings Are Capable of Capturing Rhythmic Similarity of Words | Code | 0
Vocabulary Adaptation for Domain Adaptation in Neural Machine Translation | Code | 0
Humor in Word Embeddings: Cockamamie Gobbledegook for Nincompoops | Code | 0
