
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
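
As a toy illustration of the idea (the vocabulary and vector values below are invented for the example; real models learn them from data), an embedding is just a lookup table from words to dense real-valued vectors, and word relatedness can be measured by cosine similarity:

```python
import numpy as np

# Toy 4-dimensional embeddings; real models use 50-1000 dimensions
# and learn these values from large corpora (values here are made up).
embeddings = {
    "king":  np.array([0.8, 0.1, 0.6, 0.2]),
    "queen": np.array([0.7, 0.2, 0.6, 0.3]),
    "apple": np.array([0.1, 0.9, 0.2, 0.7]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~0.99)
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low  (~0.36)
```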

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
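
As a rough sketch of what training such a model looks like in practice, the snippet below fits a small skip-gram Word2Vec model with gensim (assuming gensim 4.x; the tiny corpus is invented for illustration):

```python
from gensim.models import Word2Vec

# A tiny invented corpus; real training uses millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "apple", "fell", "from", "the", "tree"],
]

model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["king"]                # the learned 50-dim vector
print(model.wv.most_similar("king"))  # nearest neighbours by cosine similarity
```

On a corpus this small the nearest neighbours are essentially noise; meaningful vector-space geometry only emerges with large amounts of training text.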

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1491–1500 of 4002 papers

Exploring Distributional Representations and Machine Translation for Aspect-based Cross-lingual Sentiment Classification.
Exploring Embeddings for Measuring Text Relatedness: Unveiling Sentiments and Relationships in Online Comments
Exploring Fine-Tuned Embeddings that Model Intensifiers for Emotion Analysis
Exploring Human Gender Stereotypes with Word Association Test
Extrapolating Binder Style Word Embeddings to New Words
Exploring Intra and Inter-language Consistency in Embeddings with ICA
Detecting Figurative Word Occurrences Using Recurrent Neural Networks
Exploring Numeracy in Word Embeddings
Exploring Semantic Representation in Brain Activity Using Word Embeddings
Detecting Fake News with Capsule Neural Networks

No leaderboard results yet.