
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
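As a minimal illustration of that mapping, the sketch below uses a hand-written table of three-dimensional vectors (the values are invented purely for illustration; learned embeddings typically have 50 to 300 dimensions) and cosine similarity to compare words:

```python
import numpy as np

# Toy embedding table: each word maps to a vector of real numbers.
# These values are made up for illustration, not learned from data.
embeddings = {
    "king":  np.array([0.50, 0.68, 0.12]),
    "queen": np.array([0.48, 0.71, 0.15]),
    "apple": np.array([0.91, 0.02, 0.33]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors; values near 1.0 mean similar."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words end up closer together in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~0.99)
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower (~0.62)
```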

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
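A minimal training sketch, assuming the gensim library (version 4.0 or later, which uses the vector_size argument); the toy corpus and all hyperparameter values here are illustrative only, not a recipe from any of the papers listed below:

```python
from gensim.models import Word2Vec

# Toy corpus: a real application would use millions of tokenized sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "factorizes", "a", "co-occurrence", "matrix"],
]

# Train a skip-gram Word2Vec model on the toy corpus.
model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the learned vectors
    window=3,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

# Look up a learned vector and its nearest neighbors in the vocabulary.
vec = model.wv["embeddings"]
print(vec.shape)                                   # (50,)
print(model.wv.most_similar("embeddings", topn=3))
```

Skip-gram (sg=1) predicts context words from a center word, while CBOW (sg=0) predicts the center word from its context; skip-gram tends to work better for rare words at the cost of slower training.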

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 611–620 of 4,002 papers

Title | Status | Hype
Acoustic word embeddings for zero-resource languages using self-supervised contrastive learning and multilingual adaptation | Code | 0
Definition Modeling: Learning to define word embeddings in natural language | Code | 0
Black is to Criminal as Caucasian is to Police: Detecting and Removing Multiclass Bias in Word Embeddings | Code | 0
BLCU-ICALL at SemEval-2022 Task 1: Cross-Attention Multitasking Framework for Definition Modeling | Code | 0
Crossmodal ASR Error Correction with Discrete Speech Units | Code | 0
Dependency Sensitive Convolutional Neural Networks for Modeling Sentences and Documents | Code | 0
Deriving Word Vectors from Contextualized Language Models using Topic-Aware Mention Selection | Code | 0
Design and Implementation of a Quantum Kernel for Natural Language Processing | Code | 0
BL.Research at SemEval-2022 Task 1: Deep networks for Reverse Dictionary using embeddings and LSTM autoencoders | Code | 0
Aggressive Language Identification Using Word Embeddings and Sentiment Features | Code | 0
Page 62 of 401

Leaderboard

No leaderboard results yet.