
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
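The mapping described above can be illustrated with a toy example (an untrained, randomly initialized embedding table; the vocabulary and dimensionality are arbitrary choices for illustration, not from any real model):

```python
import numpy as np

# Each word in a small vocabulary is mapped to a dense vector of real numbers.
# These vectors are random placeholders, not trained embeddings.
rng = np.random.default_rng(0)
vocab = ["king", "queen", "apple", "banana"]
dim = 8
embeddings = {w: rng.standard_normal(dim) for w in vocab}

def cosine(u, v):
    """Cosine similarity, the usual way of comparing embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# With trained embeddings, semantically related words score higher here.
sim = cosine(embeddings["king"], embeddings["queen"])
```

In a trained model the geometry of this vector space carries meaning: nearby vectors correspond to words used in similar contexts.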

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
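As a rough sketch of how such training works, the following implements skip-gram with negative sampling (the Word2Vec objective) in plain numpy on a hypothetical toy corpus; hyperparameters and corpus are illustrative assumptions, and real implementations use frequency-weighted negative sampling and much larger data:

```python
import numpy as np

rng = np.random.default_rng(0)
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 16

W_in = rng.normal(scale=0.1, size=(V, D))   # target-word vectors
W_out = rng.normal(scale=0.1, size=(V, D))  # context-word vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr, window, k = 0.05, 2, 3  # learning rate, context window, negatives per positive
for _ in range(200):
    for pos, word in enumerate(corpus):
        t = idx[word]
        lo, hi = max(0, pos - window), min(len(corpus), pos + window + 1)
        for ctx_pos in range(lo, hi):
            if ctx_pos == pos:
                continue
            c = idx[corpus[ctx_pos]]
            # One observed (positive) context plus k uniformly drawn negatives;
            # Word2Vec proper samples negatives by smoothed unigram frequency.
            samples = [(c, 1.0)] + [(int(rng.integers(V)), 0.0) for _ in range(k)]
            grad_t = np.zeros(D)
            for o, label in samples:
                g = sigmoid(W_in[t] @ W_out[o]) - label  # dLoss/dscore
                grad_t += g * W_out[o]
                W_out[o] -= lr * g * W_in[t]
            W_in[t] -= lr * grad_t
```

After training, words that share contexts (here "cat" and "dog") tend to drift toward each other in the embedding space, which is the effect the surveyed techniques exploit at scale.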

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2301–2310 of 4002 papers

Title | Hype
Structured Generative Models of Continuous Features for Word Sense Induction | 0
Interpretable Structure-aware Document Encoders with Hierarchical Attention | 0
STUFIIT at SemEval-2019 Task 5: Multilingual Hate Speech Detection on Twitter with MUSE and ELMo Embeddings | 0
Sub-label dependencies for Neural Morphological Tagging -- The Joint Submission of University of Colorado and University of Helsinki for VarDial 2018 | 0
Subsumption Preservation as a Comparative Measure for Evaluating Sense-Directed Embeddings | 0
Subword-based Cross-lingual Transfer of Embeddings from Hindi to Marathi | 0
Subword-based Cross-lingual Transfer of Embeddings from Hindi to Marathi and Nepali | 0
Subword-level Composition Functions for Learning Word Embeddings | 0
Sub-Word Similarity based Search for Embeddings: Inducing Rare-Word Embeddings for Word Similarity Tasks and Language Modelling | 0
Suicide Risk Assessment on Social Media: USI-UPF at the CLPsych 2019 Shared Task | 0
Page 231 of 401

No leaderboard results yet.