
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
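
To make the mapping concrete, here is a minimal sketch of looking up word vectors and comparing them with cosine similarity. The vocabulary, vector values, and helper function are invented for illustration; real embeddings are learned from data.

```python
import numpy as np

# Toy embedding table: each word maps to a 4-dimensional real-valued vector.
# (Values are made up for illustration; learned embeddings are typically
# 50-300 dimensions and trained on large corpora.)
embeddings = {
    "king":  np.array([0.50, 0.68, -0.21, 0.10]),
    "queen": np.array([0.52, 0.71, -0.30, 0.15]),
    "apple": np.array([-0.40, 0.05, 0.80, -0.12]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: closer to 1.0 = more similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words tend to receive higher similarity scores.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```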

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches, many of them neural network-based, that train on an NLP task such as language modeling or document classification.
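
For example, a Word2Vec skip-gram model can be trained on a tokenized corpus with gensim. This is a minimal sketch assuming gensim 4.x; the toy corpus and hyperparameter values are placeholders for illustration, not recommendations.

```python
from gensim.models import Word2Vec

# Tiny tokenized corpus; in practice this would be millions of sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["i", "ate", "an", "apple"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,
    window=2,
    min_count=1,
    sg=1,
    epochs=100,
    seed=42,
)

print(model.wv["king"])               # the learned 50-dimensional vector
print(model.wv.most_similar("king"))  # nearest neighbors in embedding space
```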

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1031–1040 of 4002 papers (page 104 of 401)

Title | Status | Hype
Towards Multi-Sense Cross-Lingual Alignment of Contextual Embeddings | Code | 0
Unsupervised Transfer Learning in Multilingual Neural Machine Translation with Cross-Lingual Word Embeddings | - | 0
DebIE: A Platform for Implicit and Explicit Debiasing of Word Embedding Spaces | Code | 0
Evaluation of Morphological Embeddings for the Russian Language | - | 0
CNN-based Spoken Term Detection and Localization without Dynamic Programming | - | 0
A Comparison of Word2Vec, HMM2Vec, and PCA2Vec for Malware Classification | - | 0
Semantic-aware Knowledge Distillation for Few-Shot Class-Incremental Learning | - | 0
Overcoming Poor Word Embeddings with Word Definitions | - | 0
WordBias: An Interactive Visual Tool for Discovering Intersectional Biases Encoded in Word Embeddings | Code | 1
Lex2vec: making Explainable Word Embeddings via Lexical Resources | - | 0
