
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
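As a toy illustration of that mapping (the vocabulary, dimension, and random values below are purely illustrative, not a trained model), a word embedding is essentially a lookup table from vocabulary items to real-valued vectors:

```python
# A minimal sketch of the core idea: a vocabulary mapped to dense
# real-valued vectors via a lookup table (toy, untrained values).
import numpy as np

rng = np.random.default_rng(0)
vocab = ["the", "cat", "sat", "on", "mat"]
dim = 4  # embedding dimension; real systems typically use 50-300+

# The embedding table: one row of real numbers per vocabulary word.
embeddings = {word: rng.standard_normal(dim) for word in vocab}

# Embedding a phrase is just looking up (and here, averaging) its word vectors.
phrase = ["the", "cat"]
phrase_vector = np.mean([embeddings[w] for w in phrase], axis=0)
print(phrase_vector.shape)  # (4,)
```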

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.
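As a rough sketch of how such embeddings are trained in practice, here is a minimal Word2Vec example using gensim (assuming gensim 4.x; the tiny corpus and hyperparameters are illustrative only):

```python
# A minimal sketch of training Word2Vec embeddings with gensim 4.x.
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens.
sentences = [
    ["king", "rules", "the", "kingdom"],
    ["queen", "rules", "the", "kingdom"],
    ["man", "walks", "in", "the", "city"],
    ["woman", "walks", "in", "the", "city"],
]

# sg=1 selects the skip-gram objective; vector_size sets the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

# Each word is now a dense real-valued vector.
vec = model.wv["king"]  # numpy array of shape (50,)
print(vec.shape)

# Nearest neighbours in the learned vector space.
print(model.wv.most_similar("king", topn=3))
```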

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 981–990 of 4,002 papers

Title | Hype
Automatic coding of students' writing via Contrastive Representation Learning in the Wasserstein space | 0
Data-Driven Mitigation of Adversarial Text Perturbation | 0
A Hierarchical Knowledge Representation for Expert Finding on Social Media | 0
Data Filtering using Cross-Lingual Word Embeddings | 0
Bio-inspired Structure Identification in Language Embeddings | 0
Data Sets: Word Embeddings Learned from Tweets and General Data | 0
Automatic Detection of Incoherent Speech for Diagnosing Schizophrenia | 0
DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison | 0
DCC-Uchile at SemEval-2020 Task 1: Temporal Referencing Word Embeddings | 0
BioAMA: Towards an End to End BioMedical Question Answering System | 0

No leaderboard results yet.