
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
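
As a minimal sketch of this mapping, the snippet below assigns a toy vocabulary to random real-valued vectors and compares two of them with cosine similarity. The vocabulary, dimension, and random initialization are illustrative assumptions; a trained model learns these vectors from data rather than sampling them.

```python
import numpy as np

# Toy lookup table: each vocabulary word maps to a fixed-length vector
# of real numbers. Values are random here purely for illustration.
rng = np.random.default_rng(0)
vocab = ["king", "queen", "man", "woman"]
dim = 4  # real models typically use 50-300 dimensions

embeddings = {word: rng.normal(size=dim) for word in vocab}

def cosine_similarity(u, v):
    """Standard measure of similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embeddings["king"])  # a 4-dimensional real vector
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
```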

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
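
As one hedged illustration of such training, the sketch below fits a small skip-gram Word2Vec model with gensim on a placeholder corpus. The corpus and hyperparameters are assumptions chosen for brevity, not a recipe from any of the papers listed here; GloVe and other approaches are trained differently.

```python
from gensim.models import Word2Vec

# Placeholder corpus of tokenized sentences -- far too small for
# meaningful embeddings; real training uses millions of tokens.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "man", "walks", "in", "the", "city"],
    ["the", "woman", "walks", "in", "the", "city"],
]

# sg=1 selects the skip-gram objective; all hyperparameters are illustrative.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vector = model.wv["king"]  # the learned 50-dimensional embedding
print(model.wv.most_similar("king", topn=2))
```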

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 511–520 of 4,002 papers

A Methodology for Studying Linguistic and Cultural Change in China, 1900-1950
Bilingual Lexicon Induction across Orthographically-distinct Under-Resourced Dravidian Languages
A Challenge Set and Methods for Noun-Verb Ambiguity
A data-driven strategy to combine word embeddings in information retrieval
Bilingual Embeddings with Random Walks over Multilingual Wordnets
Bilingual Lexicon Induction by Learning to Combine Word-Level and Character-Level Representations
ARHNet - Leveraging Community Interaction for Detection of Religious Hate Speech in Arabic
Argument from Old Man’s View: Assessing Social Bias in Argumentation
A Margin-based Loss with Synthetic Negative Samples for Continuous-output Machine Translation
A Machine Learning Application for Raising WASH Awareness in the Times of COVID-19 Pandemic

No leaderboard results yet.