
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
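As a rough illustration of the idea, the sketch below trains word vectors with the gensim library's Word2Vec implementation; the choice of gensim and the toy corpus are assumptions for illustration, not something this page prescribes.

```python
from gensim.models import Word2Vec

# Toy corpus (illustrative only): each document is a pre-tokenized
# list of words.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["glove", "and", "word2vec", "learn", "such", "vectors"],
]

# Train a skip-gram Word2Vec model (sg=1). vector_size is the
# embedding dimensionality, window the context size; min_count=1
# keeps every word of this tiny corpus in the vocabulary.
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=1)

# Each vocabulary word is now mapped to a dense real-valued vector.
vec = model.wv["words"]   # a 50-dimensional numpy array
print(vec.shape)          # (50,)

# Nearest neighbours by cosine similarity in the embedding space.
print(model.wv.most_similar("vectors", topn=3))
```

Setting sg=0 instead would train the CBOW variant, which predicts a word from its context rather than the context from a word.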

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2021–2030 of 4002 papers

Title | Status | Hype
Hybridation d'un agent conversationnel avec des plongements lexicaux pour la formation au diagnostic médical (Hybridization of a conversational agent with word embeddings for medical diagnostic training) | | 0
Exploring Numeracy in Word Embeddings | | 0
EigenSent: Spectral sentence embeddings using higher-order Dynamic Mode Decomposition | Code | 0
Towards Incremental Learning of Word Embeddings Using Context Informativeness | Code | 0
Better OOV Translation with Bilingual Terminology Mining | | 0
Self-Attention Architectures for Answer-Agnostic Neural Question Generation | | 0
Diachronic Sense Modeling with Deep Contextualized Word Embeddings: An Ecological View | | 0
Predicting Humorousness and Metaphor Novelty with Gaussian Process Preference Learning | Code | 0
Collocation Classification with Unsupervised Relation Vectors | Code | 0
De-Mixing Sentiment from Code-Mixed Text | | 0

No leaderboard results yet.