
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
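
To make the mapping concrete, here is a minimal sketch of such a lookup table from words to real-valued vectors, with cosine similarity used to compare them. The tiny vocabulary and the three-dimensional vectors are invented purely for illustration; learned embeddings typically have hundreds of dimensions.

```python
import numpy as np

# Toy embedding table: each word in the vocabulary maps to a fixed-length
# vector of real numbers (values invented for illustration only).
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(a, b):
    # Standard cosine similarity: values near 1.0 mean the vectors point
    # in similar directions, i.e. the words are "close" in embedding space.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```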

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification. A short training sketch follows.
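
As one example of such a technique, the sketch below trains a small Word2Vec model with the gensim library. The toy corpus is invented and gensim is assumed to be installed; this is an illustrative sketch, not the method of any paper listed on this page.

```python
from gensim.models import Word2Vec

# Tiny invented corpus: in practice Word2Vec is trained on millions of
# sentences, each given as a list of tokens.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding
# dimension; min_count=1 keeps every token in this tiny vocabulary.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=20)

vec = model.wv["embeddings"]  # the learned 50-dimensional vector for a word
print(model.wv.most_similar("embeddings", topn=3))
```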

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1321–1330 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Argument from Old Man's View: Assessing Social Bias in Argumentation | Code | 0 |
| Neural Text Classification by Jointly Learning to Cluster and Align |  | 0 |
| Acoustic span embeddings for multilingual query-by-example search | Code | 0 |
| Unequal Representations: Analyzing Intersectional Biases in Word Embeddings Using Representational Similarity Analysis | Code | 0 |
| Advancing Humor-Focused Sentiment Analysis through Improved Contextualized Embeddings and Model Architecture |  | 0 |
| Evaluating Input Representation for Language Identification in Hindi-English Code Mixed Text |  | 0 |
| DiaLex: A Benchmark for Evaluating Multidialectal Arabic Word Embeddings | Code | 0 |
| Sensing Ambiguity in Henry James' "The Turn of the Screw" | Code | 0 |
| A semi-supervised model for Persian rumor verification based on content information |  | 0 |
| Self-Supervised learning with cross-modal transformers for emotion recognition |  | 0 |
Page 133 of 401

No leaderboard results yet.