
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
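As a rough illustration of the idea, the sketch below trains skip-gram Word2Vec embeddings with gensim on a toy corpus. The corpus, hyperparameters, and token choices are assumptions for demonstration only, not settings from any paper listed on this page.

```python
# Minimal sketch: learning word embeddings with Word2Vec via gensim.
# Assumes gensim >= 4.0; the toy corpus and hyperparameters below are
# illustrative assumptions, not taken from any paper listed here.
from gensim.models import Word2Vec

# Each sentence is a list of tokens; a real corpus would be much larger.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "vectors", "from", "raw", "text"],
    ["similar", "words", "end", "up", "with", "similar", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the real-valued vectors
    window=2,        # context window around each target word
    min_count=1,     # keep every token in this tiny corpus
    sg=1,            # skip-gram objective (sg=0 would use CBOW)
    epochs=100,
)

vec = model.wv["vectors"]   # a 50-dimensional numpy array
print(vec.shape)            # (50,)
print(model.wv.most_similar("words", topn=3))
```

Setting sg=0 instead switches to the CBOW objective; pretrained vectors such as GloVe can also be loaded read-only through gensim's downloader API rather than trained from scratch.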

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2761–2770 of 4002 papers

Title | Status | Hype
Baseline Needs More Love: On Simple Word-Embedding-Based Models and Associated Pooling Mechanisms | Code | 0
Enhancing Chinese Intent Classification by Dynamically Integrating Character Features into Word Embeddings with Ensemble Techniques | - | 0
How much does a word weigh? Weighting word embeddings for word sense induction | - | 0
Scoring Lexical Entailment with a Supervised Directional Similarity Network | Code | 0
Bilingual Sentiment Embeddings: Joint Projection of Sentiment Across Languages | Code | 0
Morphosyntactic Tagging with a Meta-BiLSTM Model over Context Sensitive Token Encodings | Code | 0
Aff2Vec: Affect-Enriched Distributional Word Representations | - | 0
Sentence Modeling via Multiple Word Embeddings and Multi-level Comparison for Semantic Textual Similarity | - | 0
Unsupervised Cross-Modal Alignment of Speech and Text Embedding Spaces | - | 0
A robust self-learning method for fully unsupervised cross-lingual mappings of word embeddings | Code | 1
