
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
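As an illustration of the idea behind Word2Vec-style training, the sketch below implements a tiny skip-gram model in plain NumPy: each word gets a dense vector, and the vectors are adjusted so a word's embedding predicts its neighbors. This is a minimal toy sketch, not the reference Word2Vec implementation; the corpus, embedding dimension, window size, and learning rate are illustrative assumptions.

```python
# Minimal skip-gram sketch in NumPy (toy example, not reference Word2Vec).
import numpy as np

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
]
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8                 # vocabulary size, embedding dimension

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (word) embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # output (context) embeddings

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Training pairs: (center word, context word) within a window of 2.
pairs = []
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - 2), min(len(sent), i + 3)):
            if j != i:
                pairs.append((idx[w], idx[sent[j]]))

lr = 0.1
for _ in range(200):                 # a few passes over the tiny corpus
    for c, o in pairs:
        h = W_in[c]                  # hidden layer = center word's vector
        p = softmax(W_out @ h)       # predicted context distribution
        grad = p.copy()
        grad[o] -= 1.0               # cross-entropy gradient w.r.t. scores
        eh = W_out.T @ grad          # backprop into the center embedding
        W_out -= lr * np.outer(grad, h)
        W_in[c] -= lr * eh

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```

After training, words that appear in similar contexts (here "cat" and "dog") should receive similar vectors, which can be checked with `cosine(W_in[idx["cat"]], W_in[idx["dog"]])`. Real systems such as Word2Vec add negative sampling or hierarchical softmax to make this scale to large vocabularies.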

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1631–1640 of 4002 papers

Title | Status | Hype
Distant Supervision and Noisy Label Learning for Low Resource Named Entity Recognition: A Study on Hausa and Yorùbá | — | 0
A Machine Learning Application for Raising WASH Awareness in the Times of COVID-19 Pandemic | — | 0
Leveraging Foreign Language Labeled Data for Aspect-Based Opinion Mining | — | 0
Text Similarity Using Word Embeddings to Classify Misinformation | — | 0
Word Sense Disambiguation for 158 Languages using Word Embeddings Only | — | 0
Deep Representation Learning of Electronic Health Records to Unlock Patient Stratification at Scale | Code | 1
A Graph Convolutional Topic Model for Short and Noisy Text Streams | Code | 1
Using word embeddings to improve the discriminability of co-occurrence text networks | — | 0
Hurtful Words: Quantifying Biases in Clinical Contextual Word Embeddings | Code | 1
A Precisely Xtreme-Multi Channel Hybrid Approach For Roman Urdu Sentiment Analysis | — | 0

No leaderboard results yet.