
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
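As a minimal sketch of what these techniques produce, the snippet below trains a skip-gram Word2Vec model and looks up the resulting vectors. It assumes gensim 4.x is installed; the toy corpus and hyperparameters are illustrative only and are not drawn from any of the papers listed below.

```python
# Minimal Word2Vec sketch (assumes gensim >= 4.0; corpus is made up for illustration).
from gensim.models import Word2Vec

# A toy corpus: each sentence is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "word", "vectors"],
    ["vectors", "capture", "semantic", "similarity", "between", "words"],
]

# Train skip-gram (sg=1) embeddings of dimension 50.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

vec = model.wv["words"]                        # a 50-dimensional real-valued vector
print(vec.shape)                               # (50,)
print(model.wv.most_similar("words", topn=3))  # nearest neighbours by cosine similarity
```

On a real corpus the nearest neighbours of a word are typically semantically or syntactically related words, which is what makes these vectors useful as features for downstream NLP tasks.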

Papers

Showing 3251–3260 of 4002 papers

Title | Status | Hype
Bidirectional Recurrent Convolutional Neural Network for Relation Classification | | 0
Bidirectional Retrieval Made Simple | | 0
Evaluating Contextualized Representations of (Spanish) Ambiguous Words: A New Lexical Resource and Empirical Analysis | | 0
Big Data Small Data, In Domain Out-of Domain, Known Word Unknown Word: The Impact of Word Representations on Sequence Labelling Tasks | | 0
Big Data Small Data, In Domain Out-of Domain, Known Word Unknown Word: The Impact of Word Representation on Sequence Labelling Tasks | | 0
Bilexical Embeddings for Quality Estimation | | 0
Bilingual Autoencoders with Global Descriptors for Modeling Parallel Sentences | | 0
Bilingual Correspondence Recursive Autoencoder for Statistical Machine Translation | | 0
Bilingual Distributed Word Representations from Document-Aligned Comparable Data | | 0
Bilingual Embeddings and Word Alignments for Translation Quality Estimation | | 0
Page 326 of 401

Leaderboard

No leaderboard results yet.