Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
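To make the definition concrete, the sketch below shows what such a mapping looks like in practice: a lookup table from words to real-valued vectors, compared with cosine similarity. The words and vector values are made up for illustration, not learned from data.

import numpy as np

# Toy 4-dimensional embedding table; the vectors here are
# illustrative values, not the output of any training procedure.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.2, 0.9]),
    "apple": np.array([0.1, 0.9, 0.8, 0.0]),
}

def cosine(u, v):
    # Cosine similarity: the standard closeness measure in embedding space.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine(embeddings["king"], embeddings["apple"]))  # lower: unrelated words

Words that appear in similar contexts end up with nearby vectors, which is what makes the similarity scores meaningful.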

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that are trained on an NLP task such as language modeling or document classification.
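As a hedged example of one such technique, the sketch below trains a small skip-gram Word2Vec model with the gensim library (parameter names follow the gensim 4.x API, where the dimensionality argument is vector_size). The toy corpus and all hyperparameter values are illustrative assumptions, not the setup of any paper listed on this page.

from gensim.models import Word2Vec

# A tiny toy corpus; real training would use millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "apple", "fell", "from", "the", "tree"],
]

# Skip-gram Word2Vec; gensim 4.x API assumed.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=3,        # context window size
    min_count=1,     # keep every word in this toy corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
)

vec = model.wv["king"]                # the learned 50-d vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbours in embedding space

GloVe, by contrast, fits vectors to global word co-occurrence counts rather than predicting context words, but the result is the same kind of word-to-vector table.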

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1971-1980 of 4002 papers

Title (Hype)
- Evaluation of Morphological Embeddings for the Russian Language (0)
- Is there Gender bias and stereotype in Portuguese Word Embeddings? (0)
- Is "Universal Syntax" Universally Useful for Learning Distributed Word Representations? (0)
- ISWARA at WNUT-2020 Task 2: Identification of Informative COVID-19 English Tweets using BERT and FastText Embeddings (0)
- Evaluation of Greek Word Embeddings (0)
- Classification of Micro-Texts Using Sub-Word Embeddings (0)
- It's All in the Name: Mitigating Gender Bias with Name-Based Counterfactual Data Substitution (0)
- IxaMed at PharmacoNER Challenge 2019 (0)
- A Review of Standard Text Classification Practices for Multi-label Toxicity Identification of Online Content (0)
- A Locally Linear Procedure for Word Translation (0)

No leaderboard results yet.