
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
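As a minimal sketch of that mapping (the toy vocabulary, dimension, and helper names below are illustrative assumptions, not drawn from any paper listed here), an embedding is simply a lookup table from vocabulary indices into rows of a real-valued matrix:

```python
import numpy as np

# Hypothetical toy vocabulary; real systems use tens of thousands of words.
vocab = {"king": 0, "queen": 1, "apple": 2}

# Embedding table: one d-dimensional real vector per word (d = 4 here).
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 4))

def embed(word):
    """Map a word to its real-valued vector."""
    return embeddings[vocab[word]]

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))                          # a vector of real numbers
print(cosine(embed("king"), embed("queen")))  # near-random here; training is
                                              # what makes this meaningful
```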

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches trained on an NLP task such as language modeling or document classification.
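For instance, a skip-gram Word2Vec model can be trained with gensim. This is a minimal sketch, assuming gensim 4.x; the tiny corpus and hyperparameters are illustrative only:

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: one tokenized sentence per list.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding
# dimension (the gensim 4.x parameter name; earlier versions used `size`).
model = Word2Vec(
    sentences=corpus,
    vector_size=50,
    window=3,
    min_count=1,
    sg=1,
    epochs=50,
)

vec = model.wv["embeddings"]  # the learned real-valued vector for a word
print(vec.shape)              # (50,)
print(model.wv.most_similar("embeddings", topn=3))
```

On a corpus this small the neighbors are noise; with a realistic corpus, nearby vectors correspond to semantically related words.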

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2971–2980 of 4002 papers

Title | Status | Hype
A Multi-task Learning Approach to Adapting Bilingual Word Embeddings for Cross-lingual Named Entity Recognition | - | 0
Training Word Sense Embeddings With Lexicon-based Regularization | - | 0
Embracing Non-Traditional Linguistic Resources for Low-resource Language Name Tagging | - | 0
Hyperspherical Query Likelihood Models with Word Embeddings | - | 0
Grammatical Error Detection Using Error- and Grammaticality-Specific Word Embeddings | Code | 0
Correlation Analysis of Chronic Obstructive Pulmonary Disease (COPD) and its Biomarkers Using the Word Embeddings | - | 0
MIPA: Mutual Information Based Paraphrase Acquisition via Bilingual Pivoting | Code | 0
XMU Neural Machine Translation Systems for WAT 2017 | - | 0
A Bag of Useful Tricks for Practical Neural Machine Translation: Embedding Layer Initialization and Large Batch Size | Code | 0
Comparing Recurrent and Convolutional Architectures for English-Hindi Neural Machine Translation | - | 0

No leaderboard results yet.