
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
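
To make the idea concrete, here is a minimal sketch of training word embeddings with Word2Vec using the gensim library (assuming gensim >= 4.0 is installed). The toy corpus and all hyperparameter values are illustrative and not taken from any paper listed on this page.

from gensim.models import Word2Vec

# Tiny tokenized corpus; real embeddings need far larger text collections.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["king", "queen", "man", "woman"],
    ["paris", "france", "berlin", "germany"],
]

# Train a small skip-gram model; all hyperparameters here are illustrative.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the real-valued vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vector = model.wv["king"]                      # a 50-dimensional numpy array
print(vector.shape)                            # (50,)
print(model.wv.most_similar("king", topn=3))   # nearest words by cosine similarity

In practice, pretrained vectors (e.g., GloVe or Word2Vec trained on large corpora) are usually loaded rather than trained from scratch on a small corpus.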

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2011-2020 of 4002 papers

Title | Status | Hype
Morphological Word Embeddings | - | 0
Word Embeddings for the Analysis of Ideological Placement in Parliamentary Corpora | Code | 1
Interpretable Segmentation of Medical Free-Text Records Based on Word Embeddings | Code | 0
Neural Semantic Parsing with Anonymization for Command Understanding in General-Purpose Service Robots | Code | 0
Assessing Wordnets with WordNet Embeddings | Code | 0
Visualising WordNet Embeddings: some preliminary results | - | 0
Fitting Semantic Relations to Word Embeddings | - | 0
Synthetic, yet natural: Properties of WordNet random walk corpora and the impact of rare words on embedding performance | - | 0
Apprentissage de plongements lexicaux par une approche réseaux complexes (Complex networks based word embeddings) | - | 0
Apprentissage de plongements de mots dynamiques avec régularisation de la dérive (Learning dynamic word embeddings with drift regularisation) | - | 0
Page 202 of 401

No leaderboard results yet.