
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
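To make the mapping concrete, here is a minimal sketch in Python; the vocabulary and 4-dimensional vectors are invented for illustration (real embeddings typically have 50 to 1000 dimensions):

```python
import numpy as np

# Toy embedding table: each word in the vocabulary maps to a
# real-valued vector. These values are invented for illustration.
embeddings = {
    "king":  np.array([0.50, 0.70, 0.10, 0.90]),
    "queen": np.array([0.52, 0.68, 0.12, 0.88]),
    "apple": np.array([0.90, 0.05, 0.80, 0.10]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; near 1.0 means similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up with nearby vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```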

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
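As a minimal sketch of learning embeddings in practice, the snippet below trains a Word2Vec model with the gensim library (assuming gensim ≥ 4.0; the tiny corpus is a stand-in for real training data):

```python
from gensim.models import Word2Vec

# Stand-in corpus: each document is a list of tokens. A real corpus
# would contain millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "builds", "vectors", "from", "cooccurrence", "counts"],
]

# Train a skip-gram Word2Vec model (sg=1). vector_size is the embedding
# dimensionality, window the context size, min_count the frequency cutoff.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1)

vec = model.wv["embeddings"]          # learned 50-dim vector for a word
print(model.wv.most_similar("word"))  # nearest neighbours in vector space
```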

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1051–1060 of 4002 papers

Title | Status | Hype
An Iterative Approach for Unsupervised Most Frequent Sense Detection using WordNet and Word Embeddings | | 0
Bilingual Lexicon Induction by Learning to Combine Word-Level and Character-Level Representations | | 0
Determining Code Words in Euphemistic Hate Speech Using Word Embedding Networks | | 0
Bilingual Lexicon Induction across Orthographically-distinct Under-Resourced Dravidian Languages | | 0
An Investigation of the Interactions Between Pre-Trained Word Embeddings, Character Models and POS Tags in Dependency Parsing | | 0
Acoustically Grounded Word Embeddings for Improved Acoustics-to-Word Speech Recognition | | 0
Determining Gains Acquired from Word Embedding Quantitatively Using Discrete Distribution Clustering | | 0
DFKI-MLT System Description for the WMT18 Automatic Post-editing Task | | 0
Bilingual Embeddings with Random Walks over Multilingual Wordnets | | 0
Bilingual Embeddings and Word Alignments for Translation Quality Estimation | | 0

No leaderboard results yet.