Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and a range of neural network-based approaches that train on an NLP task such as language modeling or document classification.
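For readers new to the area, here is a minimal sketch of learning such embeddings with Word2Vec via the gensim library; the toy corpus, parameter values, and gensim 4.x API usage below are illustrative assumptions, not taken from any of the papers listed on this page.

    # Minimal Word2Vec sketch (assumes gensim 4.x is installed).
    from gensim.models import Word2Vec

    # Toy corpus: each document is a list of tokens. Real training
    # uses millions of sentences; this only shows the API shape.
    corpus = [
        ["word", "embeddings", "map", "words", "to", "vectors"],
        ["word2vec", "learns", "embeddings", "from", "local", "context"],
        ["glove", "fits", "embeddings", "to", "global", "cooccurrence", "counts"],
    ]

    # Train a small skip-gram model (sg=1); vector_size is the
    # dimensionality of the learned real-valued vectors.
    model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, seed=0)

    vec = model.wv["embeddings"]   # a 50-dimensional numpy vector
    print(vec.shape)               # -> (50,)

    # Nearest neighbors by cosine similarity in the embedding space.
    print(model.wv.most_similar("embeddings", topn=3))

The skip-gram objective trains a shallow network to predict nearby context words from a center word; the network's input weights, one row per vocabulary word, are the learned embeddings.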

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1761–1770 of 4,002 papers

Humpty Dumpty: Controlling Word Meanings via Corpus Poisoning
"Hunt Takes Hare": Theming Games Through Game-Word Vector Translation
Improving Neural Network Performance by Injecting Background Knowledge: Detecting Code-switching and Borrowing in Algerian texts
Des pseudo-sens pour améliorer l'extraction de synonymes à partir de plongements lexicaux (Pseudo-senses for improving the extraction of synonyms from word embeddings)
Hybridation d'un agent conversationnel avec des plongements lexicaux pour la formation au diagnostic médical (Hybridization of a conversational agent with word embeddings for medical diagnostic training)
Hybrid Code Networks using a convolutional neural network as an input layer achieves higher turn accuracy
Designing a Russian Idiom-Annotated Corpus
Hybrid Text Feature Modeling for Disease Group Prediction using Unstructured Physician Notes
Hyperbolic Centroid Calculations for Text Classification
BERTrade: Using Contextual Embeddings to Parse Old French

Leaderboards

No leaderboard results yet.