
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
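As a rough illustration of this idea, the sketch below builds a tiny hand-made embedding table (the vectors are invented for demonstration, not learned from data) and uses cosine similarity as the measure of closeness between word vectors:

```python
# Toy illustration: each word is mapped to a fixed-length vector of real
# numbers, and geometric closeness (cosine similarity) stands in for
# semantic relatedness. These vectors are made up, not learned.
import numpy as np

embeddings = {
    "king":  np.array([0.50, 0.68, 0.12]),
    "queen": np.array([0.52, 0.71, 0.15]),
    "apple": np.array([0.91, 0.02, 0.33]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Related words end up with a higher similarity than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~1.0
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.6
```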

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
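As a concrete example, here is a minimal Word2Vec training sketch using the gensim library (a toolkit chosen for illustration; the toy tokenized corpus is made up). It assumes gensim 4.x, where the dimensionality parameter is named vector_size:

```python
# Minimal Word2Vec training sketch with gensim (assumes gensim >= 4.0).
from gensim.models import Word2Vec

# A toy tokenized corpus; in practice this would be many sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["similar", "words", "get", "similar", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # sg=1: skip-gram objective; sg=0: CBOW
    epochs=50,
)

vector = model.wv["embeddings"]                          # 50-dim vector
neighbors = model.wv.most_similar("embeddings", topn=3)  # nearest words
print(vector.shape, neighbors)
```

Setting sg=1 selects the skip-gram objective; sg=0 would train the CBOW variant instead.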

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1711–1720 of 4002 papers

Title | Status | Hype
Improving Interpretability of Word Embeddings by Generating Definition and Usage | – | 0
CoSimLex: A Resource for Evaluating Graded Word Similarity in Context | Code | 0
Machine Translation with Cross-lingual Word Embeddings | Code | 0
Multilingual aspect clustering for sentiment analysis | Code | 0
Can AI Generate Love Advice?: Toward Neural Answer Generation for Non-Factoid Questions | – | 0
Massive vs. Curated Word Embeddings for Low-Resourced Languages. The Case of Yorùbá and Twi | Code | 0
Measuring Social Bias in Knowledge Graph Embeddings | – | 0
Natural Alpha Embeddings | – | 0
A Robust Self-Learning Method for Fully Unsupervised Cross-Lingual Mappings of Word Embeddings: Making the Method Robustly Reproducible as Well | Code | 0
TU Wien @ TREC Deep Learning '19 -- Simple Contextualization for Re-ranking | Code | 1

No leaderboard results yet.