
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
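As a minimal, self-contained sketch of the idea (not the Word2Vec or GloVe algorithms themselves), the snippet below builds a word-context co-occurrence matrix over a toy corpus and factorizes it with SVD to obtain dense real-valued vectors; the corpus, window size, and dimensionality are illustrative assumptions.

```python
# Count-based word embedding sketch: co-occurrence counts + SVD.
# Illustrative only; real systems train Word2Vec/GloVe on large corpora.
import numpy as np

corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs are animals",
]
window = 2  # symmetric context window
dim = 4     # embedding dimensionality (tiny, for the toy corpus)

# Build the vocabulary and a word -> index map
tokens = [sent.split() for sent in corpus]
vocab = sorted({w for sent in tokens for w in sent})
index = {w: i for i, w in enumerate(vocab)}

# Count word-context co-occurrences within the window
counts = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                counts[index[w], index[sent[j]]] += 1.0

# Factorize log-scaled counts; scaled left singular vectors serve as embeddings
u, s, _ = np.linalg.svd(np.log1p(counts), full_matrices=False)
embeddings = u[:, :dim] * s[:dim]

def most_similar(word, k=3):
    """Return the k nearest words to `word` by cosine similarity."""
    v = embeddings[index[word]]
    norms = np.linalg.norm(embeddings, axis=1) * np.linalg.norm(v)
    sims = embeddings @ v / np.maximum(norms, 1e-9)
    return [vocab[i] for i in np.argsort(-sims) if vocab[i] != word][:k]

print(most_similar("cat"))
```

Neural methods such as Word2Vec replace the explicit count matrix with a shallow network trained to predict context words, but the end product is the same: one real-valued vector per vocabulary item.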

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3131–3140 of 4002 papers

Title | Status | Hype
Contrastive Loss is All You Need to Recover Analogies as Parallel Lines | Code | 0
Contrastive Learning in Distilled Models | Code | 0
Multilingual transfer of acoustic word embeddings improves when training on languages related to the target zero-resource language | Code | 0
BPEmb: Tokenization-free Pre-trained Subword Embeddings in 275 Languages | Code | 0
Boosting Zero-shot Cross-lingual Retrieval by Training on Artificially Code-Switched Data | Code | 0
Representation of linguistic form and function in recurrent neural networks | Code | 0
Cross-Lingual Word Embeddings for Turkic Languages | Code | 0
BL.Research at SemEval-2022 Task 1: Deep networks for Reverse Dictionary using embeddings and LSTM autoencoders | Code | 0
BLCU-ICALL at SemEval-2022 Task 1: Cross-Attention Multitasking Framework for Definition Modeling | Code | 0
An Analysis of Euclidean vs. Graph-Based Framing for Bilingual Lexicon Induction from Word Embedding Spaces | Code | 0
Page 314 of 401

No leaderboard results yet.