
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) in which words or phrases from the vocabulary are mapped to vectors of real numbers.
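
As a minimal sketch of this mapping, the snippet below builds a toy lookup table in NumPy: a hypothetical three-word vocabulary, randomly initialized vectors standing in for learned embeddings, and cosine similarity as the usual way word vectors are compared. All names and values are illustrative assumptions, not taken from any particular model.

```python
import numpy as np

# Toy illustration: a word embedding is a lookup table mapping each
# vocabulary word to a dense vector of real numbers. The vocabulary and
# dimensionality are hypothetical; real embeddings are learned, not random.
vocab = {"king": 0, "queen": 1, "apple": 2}
embedding_dim = 4

rng = np.random.default_rng(0)
embedding_matrix = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Return the embedding vector for a word in the vocabulary."""
    return embedding_matrix[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the standard measure of semantic relatedness
    between two word vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))                          # a 4-dimensional real vector
print(cosine(embed("king"), embed("queen")))  # similarity in [-1, 1]
```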

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.
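
As an example of the first of these, here is a minimal sketch of training a skip-gram Word2Vec model with the gensim library (assuming gensim >= 4.0, where the dimensionality parameter is named vector_size). The corpus and hyperparameters are illustrative toy choices, not a recommended configuration.

```python
from gensim.models import Word2Vec

# Hypothetical toy corpus of pre-tokenized sentences; in practice
# Word2Vec is trained on a large tokenized text collection.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# Train a small skip-gram model (sg=1); vector_size, window, min_count,
# and epochs are illustrative values, not tuned ones.
model = Word2Vec(
    sentences, vector_size=50, window=3, min_count=1, sg=1, epochs=50
)

vec = model.wv["embeddings"]   # the learned 50-dimensional vector
print(vec.shape)               # (50,)
print(model.wv.most_similar("embeddings", topn=3))
```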

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2091–2100 of 4002 papers

- Medical Word Embeddings for Spanish: Development and Evaluation
- Using time series and natural language processing to identify viral moments in the 2016 U.S. Presidential Debate
- Neural Text Simplification in Low-Resource Conditions Using Weak Supervision
- Neural Machine Translation between Myanmar (Burmese) and Rakhine (Arakanese)
- Semantics and Homothetic Clustering of Hafez Poetry
- Ensemble Methods to Distinguish Mainland and Taiwan Chinese
- Naive Bayes and BiLSTM Ensemble for Discriminating between Mainland and Taiwan Variation of Mandarin Chinese
- Dyr Bul Shchyl. Proxying Sound Symbolism With Word Embeddings
- Coherence models in schizophrenia
- EusDisParser: improving an under-resourced discourse parser with cross-lingual data
