
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
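As a minimal sketch of that idea, the snippet below represents a few words as hypothetical 4-dimensional real-valued vectors and compares them with cosine similarity. The vectors are invented for illustration only, not taken from any trained model.

import numpy as np

# Hypothetical toy embeddings; real models use hundreds of dimensions
# learned from large corpora.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.2, 0.9]),
    "apple": np.array([0.1, 0.9, 0.8, 0.0]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low: unrelated words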

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches trained on an NLP task such as language modeling or document classification.
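For a concrete example, here is a minimal sketch of training Word2Vec embeddings with the gensim library; the choice of toolkit, the toy corpus, and the hyperparameter values are all assumptions made for illustration, not something the page prescribes.

from gensim.models import Word2Vec

# Tiny toy corpus (hypothetical); real training uses millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# Train a skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["embeddings"]          # the learned 50-dimensional vector
print(model.wv.most_similar("word"))  # nearest neighbours in embedding space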

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2561–2570 of 4002 papers

Title | Status | Hype
Isomorphic Cross-lingual Embeddings for Low-Resource Languages | – | 0
Is Stance Detection Topic-Independent and Cross-topic Generalizable? - A Reproduction Study | – | 0
Is there Gender bias and stereotype in Portuguese Word Embeddings? | – | 0
Is "Universal Syntax" Universally Useful for Learning Distributed Word Representations? | – | 0
ISWARA at WNUT-2020 Task 2: Identification of Informative COVID-19 English Tweets using BERT and FastText Embeddings | – | 0
Is Wikipedia succeeding in reducing gender bias? Assessing changes in gender bias in Wikipedia using word embeddings | – | 0
It's All in the Name: Mitigating Gender Bias with Name-Based Counterfactual Data Substitution | – | 0
IxaMed at PharmacoNER Challenge 2019 | – | 0
Jabberwocky Parsing: Dependency Parsing with Lexical Noise | – | 0
Page 257 of 401

No leaderboard results yet.