Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
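
To make the mapping concrete, the sketch below uses a tiny, hand-made embedding table (the vectors are invented for illustration only; real embeddings are learned from data) and shows how similarity between words is measured in the vector space:

```python
# Hypothetical illustration: a hand-made embedding table mapping words to
# real-valued vectors. The numbers are made up purely for demonstration.
import numpy as np

embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.68, 0.12]),
    "apple": np.array([0.05, 0.10, 0.90]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two word vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should lie closer together in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```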

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
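
As a minimal sketch of how such embeddings can be trained in practice, the example below fits a Word2Vec model with the gensim library (assuming gensim 4.x is installed; the toy corpus and hyperparameters are illustrative, not a recommendation):

```python
# Minimal Word2Vec training sketch using gensim (pip install gensim).
# The corpus and parameter values are toy examples for illustration.
from gensim.models import Word2Vec

corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=100,   # dimensionality of the learned word vectors
    window=5,          # context window size
    min_count=1,       # keep every word in this tiny corpus
    sg=1,              # 1 = skip-gram, 0 = CBOW
    epochs=50,         # more passes, since the corpus is tiny
)

vector = model.wv["embeddings"]                     # learned 100-d vector
print(model.wv.most_similar("embeddings", topn=3))  # nearest neighbors
```

GloVe, by contrast, fits vectors to global word co-occurrence statistics rather than local context windows, but the resulting embeddings are used in the same way.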

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3471-3480 of 4002 papers

Title | Status | Hype
On a Novel Application of Wasserstein-Procrustes for Unsupervised Cross-Lingual Learning | Code | 0
Embeddings Evaluation Using a Novel Measure of Semantic Similarity | Code | 0
Training Temporal Word Embeddings with a Compass | Code | 0
Embeddings for Word Sense Disambiguation: An Evaluation Study | Code | 0
On Dimensional Linguistic Properties of the Word Embedding Space | Code | 0
ValNorm Quantifies Semantics to Reveal Consistent Valence Biases Across Languages and Over Centuries | Code | 0
Embedding Strategies for Specialized Domains: Application to Clinical Entity Recognition | Code | 0
One of these words is not like the other: a reproduction of outlier identification using non-contextual word representations | Code | 0
Embedding Transfer for Low-Resource Medical Named Entity Recognition: A Case Study on Patient Mobility | Code | 0
A Method for Studying Semantic Construal in Grammatical Constructions with Interpretable Contextual Embedding Spaces | Code | 0
