Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
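
To make the definition concrete, here is a toy sketch in NumPy of such a mapping: a small embedding table that assigns each word a vector of real numbers, with relatedness measured by cosine similarity. The vocabulary and vector values are invented purely for illustration, not taken from any trained model.

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a 4-dimensional
# real-valued vector. Values are hand-picked for illustration only.
vocab = ["king", "queen", "apple", "orange"]
E = np.array([
    [0.90, 0.80, 0.10, 0.00],  # king
    [0.85, 0.82, 0.05, 0.10],  # queen
    [0.10, 0.00, 0.90, 0.80],  # apple
    [0.05, 0.10, 0.85, 0.90],  # orange
])
word2idx = {w: i for i, w in enumerate(vocab)}

def embed(word: str) -> np.ndarray:
    """Look up the embedding vector for a word."""
    return E[word2idx[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Related words receive nearby vectors, so their cosine similarity is high.
print(cosine(embed("king"), embed("queen")))   # ~0.99
print(cosine(embed("king"), embed("apple")))   # ~0.12
```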

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
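
As a minimal sketch of one such technique, the example below trains a skip-gram Word2Vec model with the gensim library (assuming the gensim 4.x API); the toy corpus and all hyperparameter values are chosen for illustration only.

```python
# Minimal Word2Vec training sketch using gensim (assumed 4.x API).
from gensim.models import Word2Vec

# Tiny invented corpus: a list of tokenized sentences.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["a", "cat", "and", "a", "dog", "played", "outside"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the learned word vectors
    window=2,        # context window around each target word
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=100,      # many passes, since the corpus is tiny
    seed=0,
)

# Each word is now mapped to a 50-dimensional real-valued vector.
vec = model.wv["cat"]
print(vec.shape)                             # (50,)
print(model.wv.most_similar("cat", topn=3))  # nearest words by cosine similarity
```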

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1581–1590 of 4002 papers

Title | Hype
Sentiment Analysis for Hinglish Code-mixed Tweets by means of Cross-lingual Word Embeddings | 0
Massive vs. Curated Embeddings for Low-Resourced Languages: the Case of Yorùbá and Twi | 0
Usability and Accessibility of Bantu Language Dictionaries in the Digital Age: Mobile Access in an Open Environment | 0
From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer with Multilingual Transformers | 0
A Closer Look on Unsupervised Cross-lingual Word Embeddings Mapping | 0
Lexicon-Enhancement of Embedding-based Approaches Towards the Detection of Abusive Language | 0
TF-IDF Character N-grams versus Word Embedding-based Models for Fine-grained Event Classification: A Preliminary Study | 0
FlorUniTo@TRAC-2: Retrofitting Word Embeddings on an Abusive Lexicon for Aggressive Language Detection | 0
Figure Me Out: A Gold Standard Dataset for Metaphor Interpretation | 0
French Contextualized Word-Embeddings with a sip of CaBeRnet: a New French Balanced Reference Corpus | 0

Page 159 of 401

No leaderboard results yet.