
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
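As a minimal, stdlib-only sketch of the idea, the snippet below maps words to vectors of real numbers using raw co-occurrence counts over a toy corpus, then compares two words with cosine similarity. This is a simplified count-based cousin of the trained methods named above, not Word2Vec or GloVe themselves; the corpus and window size are illustrative.

```python
import math
from collections import defaultdict

# Toy corpus; real embedding methods train on millions of sentences.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "the cat chased the dog".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
index = {w: i for i, w in enumerate(vocab)}

# Count-based embeddings: each word's vector holds its co-occurrence
# counts with every vocabulary word inside a +/-2 word window.
window = 2
vectors = {w: [0.0] * len(vocab) for w in vocab}
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if j != i:
                vectors[w][index[sent[j]]] += 1.0

def cosine(u, v):
    """Cosine similarity between two vectors of equal length."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

# "cat" and "dog" occur in similar contexts, so their vectors align.
print(round(cosine(vectors["cat"], vectors["dog"]), 3))  # → 0.982
```

Trained methods such as Word2Vec improve on this by learning low-dimensional dense vectors that generalize beyond raw counts, but the interface is the same: a lookup from word to real-valued vector.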

Papers

Showing 2976-3000 of 4002 papers

Title | Status | Hype
Correlation Analysis of Chronic Obstructive Pulmonary Disease (COPD) and its Biomarkers Using the Word Embeddings | | 0
MIPA: Mutual Information Based Paraphrase Acquisition via Bilingual Pivoting | Code | 0
XMU Neural Machine Translation Systems for WAT 2017 | | 0
A Bag of Useful Tricks for Practical Neural Machine Translation: Embedding Layer Initialization and Large Batch Size | Code | 0
Comparing Recurrent and Convolutional Architectures for English-Hindi Neural Machine Translation | | 0
Semantic Structure and Interpretability of Word Embeddings | Code | 0
Deep word embeddings for visual speech recognition | Code | 0
Topic Based Sentiment Analysis Using Deep Learning | | 0
One-shot and few-shot learning of word embeddings | | 0
ALL-IN-1: Short Text Classification with One Model for All Languages | Code | 1
Linking Tweets with Monolingual and Cross-Lingual News using Transformed Word Embeddings | | 0
NileTMRG at SemEval-2017 Task 4: Arabic Sentiment Analysis | | 0
Local Word Vectors Guiding Keyphrase Extraction | Code | 0
Unsupervised Sentence Representations as Word Information Series: Revisiting TF--IDF | | 0
RETUYT in TASS 2017: Sentiment Analysis for Spanish Tweets using SVM and CNN | | 0
Convolutional Neural Networks for Sentiment Classification on Business Reviews | | 0
Word Translation Without Parallel Data | Code | 0
MoNoise: Modeling Noise Using a Modular Normalization System | Code | 0
Deep Learning Paradigm with Transformed Monolingual Word Embeddings for Multilingual Sentiment Analysis | | 0
Clickbait detection using word embeddings | | 0
Learning Word Embeddings for Hyponymy with Entailment-Based Distributional Semantics | | 0
Low-resource bilingual lexicon extraction using graph based word embeddings | | 0
BPEmb: Tokenization-free Pre-trained Subword Embeddings in 275 Languages | Code | 0
Cross-Language Question Re-Ranking | | 0
Syntactic and Semantic Features For Code-Switching Factored Language Models | | 0
Page 120 of 161

No leaderboard results yet.