
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
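As a concrete illustration of this mapping, here is a minimal sketch of a lookup table from words to real-valued vectors, compared by cosine similarity; the toy vocabulary and the vector values are invented for the example, not taken from any trained model.

```python
import numpy as np

# Toy vocabulary mapped to invented 3-dimensional real-valued vectors.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.1, 0.9]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should sit closer together in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```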

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.
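For instance, Word2Vec embeddings can be trained in a few lines; the sketch below uses the gensim library (assuming gensim >= 4.0, where the dimensionality parameter is named vector_size) on a tiny invented corpus.

```python
from gensim.models import Word2Vec

# Tiny invented corpus: each sentence is a list of tokens.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "apple", "fell", "from", "the", "tree"],
]

# Train skip-gram Word2Vec (sg=1) with 50-dimensional vectors.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Look up the learned vector for a word and its nearest neighbours.
vector = model.wv["king"]                    # 50-dimensional numpy array
print(model.wv.most_similar("king", topn=3))
```

On a real corpus the nearest neighbours of "king" would be semantically related words; on this three-sentence toy corpus the output is not meaningful and serves only to show the API.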

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 941–950 of 4,002 papers (page 95 of 401)

Title | Status | Hype
Cross-lingual Models of Word Embeddings: An Empirical Comparison | Code | 0
Robust Gram Embeddings | Code | 0
BERTRAM: Improved Word Embeddings Have Big Impact on Contextualized Model Performance | Code | 0
Russian word sense induction by clustering averaged word embeddings | Code | 0
Dependency Sensitive Convolutional Neural Networks for Modeling Sentences and Documents | Code | 0
Design and Implementation of a Quantum Kernel for Natural Language Processing | Code | 0
An Analysis of Euclidean vs. Graph-Based Framing for Bilingual Lexicon Induction from Word Embedding Spaces | Code | 0
SART - Similarity, Analogies, and Relatedness for Tatar Language: New Benchmark Datasets for Word Embeddings Evaluation | Code | 0
SChME at SemEval-2020 Task 1: A Model Ensemble for Detecting Lexical Semantic Change | Code | 0
Diagnosing BERT with Retrieval Heuristics | Code | 0
