
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an auxiliary NLP task such as language modeling or document classification.
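As a concrete illustration of the Word2Vec approach mentioned above, the sketch below trains skip-gram embeddings with the gensim library. The toy corpus and hyperparameters are assumptions chosen for demonstration, not taken from this page; it assumes gensim >= 4.0, where the dimensionality parameter is named vector_size.

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec
# (skip-gram). Toy corpus and hyperparameters are illustrative only.
from gensim.models import Word2Vec

corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["word2vec", "and", "glove", "learn", "embeddings", "from", "text"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued vectors
    window=3,         # context window size
    min_count=1,      # keep every token in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,
)

vec = model.wv["embeddings"]           # a 50-dimensional numpy array
print(model.wv.most_similar("words"))  # nearest neighbours by cosine similarity
```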

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1951-1960 of 4002 papers

Title | Status | Hype
Dialog State Tracking: A Neural Reading Comprehension Approach | - | 0
Word2vec to behavior: morphology facilitates the grounding of language in machines | Code | 0
Creative Contextual Dialog Adaptation in an Open World RPG | Code | 0
SyntaxFest 2019 Invited talk - Quantitative Computational Syntax: dependencies, intervention effects and word embeddings | - | 0
Noisy Parallel Corpus Filtering through Projected Word Embeddings | - | 0
WMDO: Fluency-based Word Mover's Distance for Machine Translation Evaluation | - | 0
The LMU Munich Unsupervised Machine Translation System for WMT19 | - | 0
Learning Joint Acoustic-Phonetic Word Embeddings | - | 0
Contributions to Clinical Named Entity Recognition in Portuguese | Code | 0
Identification of Adjective-Noun Neologisms using Pretrained Language Models | Code | 0

No leaderboard results yet.