
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
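
To make the mapping concrete, here is a minimal sketch of an embedding lookup table in Python. The three-word vocabulary, the 4-dimensional vector size, and the random initialization are all illustrative assumptions; in practice the vectors are learned, not sampled.

```python
import numpy as np

# Hypothetical 3-word vocabulary; real vocabularies have tens of thousands of entries.
vocab = ["king", "queen", "apple"]
word_to_idx = {w: i for i, w in enumerate(vocab)}

# Each row of the embedding matrix is the real-valued vector for one word.
rng = np.random.default_rng(0)
embedding_dim = 4  # assumed dimensionality; 50-300 is typical in practice
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Look up the dense vector assigned to a word."""
    return embeddings[word_to_idx[word]]

print(embed("king"))  # a 4-dimensional vector of real numbers
```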

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.
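
As a sketch of the Word2Vec route, the example below trains a tiny skip-gram model with the gensim library (4.x API). The two-sentence corpus and every hyperparameter value are toy assumptions chosen only to make the example run; real training uses large corpora.

```python
from gensim.models import Word2Vec

# Hypothetical toy corpus of pre-tokenized sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
]

# Train a small skip-gram Word2Vec model.
model = Word2Vec(
    sentences,
    vector_size=50,  # embedding dimensionality
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,       # many passes, since the corpus is tiny
)

vec = model.wv["embeddings"]          # the learned 50-dimensional vector
print(model.wv.most_similar("word"))  # nearest neighbors by cosine similarity
```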

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2601–2610 of 4002 papers

Title | Status | Hype
Key Phrase Extraction & Applause Prediction | | 0
Keyphrase Extraction from Disaster-related Tweets | | 0
Keyphrase Extraction from Scholarly Articles as Sequence Labeling using Contextualized Embeddings | | 0
Keyphrase Extraction Using Neighborhood Knowledge Based on Word Embeddings | | 0
Keyphrases Extraction from User-Generated Contents in Healthcare Domain Using Long Short-Term Memory Networks | | 0
KeyVec: Key-semantics Preserving Document Representations | | 0
Keyword-centered Collocating Topic Analysis | | 0
KIT-Multi: A Translation-Oriented Multilingual Embedding Corpus | | 0
KNET: A General Framework for Learning Word Embedding using Morphological Knowledge | | 0
Know-Center at SemEval-2019 Task 5: Multilingual Hate Speech Detection on Twitter using CNNs | | 0
Page 261 of 401

Leaderboards

No leaderboard results yet.