
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
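
Concretely, a trained embedding model amounts to a lookup from a vocabulary index into a real-valued matrix, one vector per word. Below is a minimal sketch of that mapping, using a hypothetical toy vocabulary and a randomly initialized matrix standing in for learned weights:

```python
import numpy as np

# Hypothetical toy vocabulary; in practice this is built from the training corpus.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
dim = 8  # embedding dimensionality (real models typically use 100-300)

# Randomly initialized stand-in for a learned embedding matrix:
# row i is the vector for the word with index i.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector by indexing into the embedding matrix."""
    return embeddings[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual way embedding vectors are compared."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king").shape)  # (8,)
# With random vectors this number is meaningless; after training,
# it reflects semantic similarity between the two words.
print(cosine(embed("king"), embed("queen")))
```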

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
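
For instance, training a skip-gram Word2Vec model on a small tokenized corpus might look like the following sketch (assuming the gensim library is available; the corpus and hyperparameters here are purely illustrative):

```python
from gensim.models import Word2Vec

# Illustrative tokenized corpus; real training uses millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "city"],
    ["a", "woman", "walks", "in", "the", "city"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW instead.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=3,        # context window size around each target word
    min_count=1,     # keep every word in this tiny corpus
    sg=1,
    epochs=50,
)

vec = model.wv["queen"]                # the learned 50-dimensional vector
print(model.wv.most_similar("queen"))  # nearest neighbors by cosine similarity
```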

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1031-1040 of 4002 papers

An Ontology-Based Method for Extracting and Classifying Domain-Specific Compositional Nominal Compounds
Closed Form Word Embedding Alignment
Bilingual Word Embeddings with Bucketed CNN for Parallel Sentence Extraction
Bilingual Word Embeddings from Parallel and Non-parallel Corpora for Cross-Language Text Classification
A non-DNN Feature Engineering Approach to Dependency Parsing -- FBAML at CoNLL 2017 Shared Task
Detecting Metaphorical Phrases in the Polish Language
Bilingual Word Embeddings from Non-Parallel Document-Aligned Data Applied to Bilingual Lexicon Induction
Bilingual Word Embeddings for Phrase-Based Machine Translation
Bilingual Word Embeddings for Bilingual Terminology Extraction from Specialized Comparable Corpora
Bilingual Topic Models for Comparable Corpora
Page 104 of 401

No leaderboard results yet.