Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
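
For example, once words are mapped to vectors, semantic relatedness between words can be measured geometrically, e.g. with cosine similarity. A minimal sketch in Python (the three-dimensional vectors below are made-up toy values, not learned embeddings):

    import numpy as np

    # Toy embedding table: each word maps to a dense vector of real numbers.
    # These values are illustrative only; real embeddings are learned from data.
    embeddings = {
        "king":  np.array([0.80, 0.30, 0.10]),
        "queen": np.array([0.75, 0.35, 0.15]),
        "apple": np.array([0.10, 0.90, 0.80]),
    }

    def cosine_similarity(u, v):
        # Cosine of the angle between u and v; closer to 1.0 means more similar.
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
    print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower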

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
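
As a concrete illustration of the training step, the sketch below fits a small skip-gram Word2Vec model with the gensim library (assuming gensim >= 4.0; the tokenized corpus is a made-up toy example):

    from gensim.models import Word2Vec

    # Tiny tokenized corpus (toy example; real training uses large text corpora).
    sentences = [
        ["word", "embeddings", "map", "words", "to", "vectors"],
        ["word2vec", "learns", "embeddings", "from", "raw", "text"],
        ["similar", "words", "receive", "similar", "vectors"],
    ]

    # Train a skip-gram model (sg=1); vector_size is the embedding dimension.
    model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

    vector = model.wv["embeddings"]             # learned 50-dimensional vector
    neighbors = model.wv.most_similar("words")  # nearest words in the vector space
    print(vector.shape, neighbors[:3])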

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1301-1310 of 4002 papers

Title | Status | Hype
Multi-Adversarial Learning for Cross-Lingual Word Embeddings | - | 0
Multi-label Few/Zero-shot Learning with Knowledge Aggregated from Multiple Label Graphs | Code | 1
A Self-supervised Representation Learning of Sentence Structure for Authorship Attribution | Code | 0
From Language to Language-ish: How Brain-Like is an LSTM's Representation of Nonsensical Language Stimuli? | - | 0
Legal Document Classification: An Application to Law Area Prediction of Petitions to Public Prosecution Service | - | 0
BRUMS at SemEval-2020 Task 3: Contextualised Embeddings for Predicting the (Graded) Effect of Context in Word Similarity | Code | 0
Multilingual Offensive Language Identification with Cross-lingual Embeddings | Code | 0
gundapusunil at SemEval-2020 Task 9: Syntactic Semantic LSTM Architecture for SENTIment Analysis of Code-MIXed Data | - | 0
comp-syn: Perceptually Grounded Word Embeddings with Color | Code | 1
MuSeM: Detecting Incongruent News Headlines using Mutual Attentive Semantic Matching | - | 0

Leaderboards

No leaderboard results yet.