
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
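Concretely, the learned mapping is usually stored as an embedding matrix with one row of real numbers per vocabulary word. The sketch below illustrates this data structure with a hypothetical toy vocabulary and a randomly initialized matrix (training, e.g. with Word2Vec, is what makes the values meaningful); the cosine-similarity helper shows how nearness between vectors is typically measured.

```python
import numpy as np

# Hypothetical toy vocabulary; real vocabularies contain tens of thousands of words.
vocab = ["the", "cat", "sat", "on", "mat"]
word_to_index = {word: i for i, word in enumerate(vocab)}

# The embedding matrix: one row of real numbers per word.
# Randomly initialized here; training would adjust these values.
embedding_dim = 4
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

# Mapping a word to its vector is a row lookup.
vector = embeddings[word_to_index["cat"]]
print(vector)  # a length-4 vector of real numbers

def cosine_similarity(a, b):
    # Standard similarity measure between two embedding vectors.
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(cosine_similarity(embeddings[word_to_index["cat"]],
                        embeddings[word_to_index["mat"]]))
```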

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
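As a minimal sketch of training such embeddings in practice, the example below uses the gensim library's Word2Vec implementation (assuming the gensim 4.x API; the corpus and hyperparameter values are placeholders, not recommendations).

```python
from gensim.models import Word2Vec

# Placeholder corpus: a list of tokenized sentences. A real corpus would be
# far larger and tokenized by a proper preprocessing pipeline.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["dogs", "and", "cats", "are", "animals"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the embedding vectors
    window=3,        # context window size
    min_count=1,     # keep all words in this toy corpus
    sg=1,
    epochs=50,
)

# Look up the learned vector for a word and query its nearest neighbors.
print(model.wv["cat"])
print(model.wv.most_similar("cat", topn=3))
```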

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 741–750 of 4002.

Each paper is listed with its Hype score (all currently 0):

- Clustering Prominent People and Organizations in Topic-Specific Text Corpora (Hype: 0)
- ARHNet - Leveraging Community Interaction for Detection of Religious Hate Speech in Arabic (Hype: 0)
- An Unsupervised Approach for Mapping between Vector Spaces (Hype: 0)
- Cluster Labeling by Word Embeddings and WordNet's Hypernymy (Hype: 0)
- Comparison of Representations of Named Entities for Document Classification (Hype: 0)
- CNN- and LSTM-based Claim Classification in Online User Comments (Hype: 0)
- BLISS in Non-Isometric Embedding Spaces (Hype: 0)
- Blinov: Distributed Representations of Words for Aspect-Based Sentiment Analysis at SemEval 2014 (Hype: 0)
- Code-Switched Named Entity Recognition with Embedding Attention (Hype: 0)
- Measuring Societal Biases from Text Corpora with Smoothed First-Order Co-occurrence (Hype: 0)

No leaderboard results yet.