
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
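For intuition, the sketch below uses hand-made toy vectors (hypothetical values, not learned embeddings) to show what mapping words to vectors of real numbers buys you: semantic relatedness becomes a geometric quantity such as cosine similarity.

```python
import numpy as np

# Toy, hand-made embeddings; real models learn vectors with
# hundreds of dimensions from large corpora.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.10, 0.05, 0.90]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words end up with nearby vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # close to 1.0
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # much lower
```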

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
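As a concrete illustration, here is a minimal skip-gram Word2Vec training run using the gensim library; the toy corpus and hyperparameter values are placeholders for the example, not a recommended configuration.

```python
from gensim.models import Word2Vec

# Tiny placeholder corpus: one tokenized sentence per list entry.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["skip", "gram", "predicts", "context", "words", "from", "a", "target", "word"],
    ["glove", "factorizes", "global", "cooccurrence", "statistics"],
]

# sg=1 selects the skip-gram objective (sg=0 would use CBOW).
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window on each side of the target word
    min_count=1,      # keep every token in this tiny corpus
    sg=1,
    epochs=50,
)

vec = model.wv["embeddings"]           # the learned 50-dimensional vector
print(model.wv.most_similar("words"))  # nearest neighbors in embedding space
```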

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 1761–1770 of 4002 (page 177 of 401)

Humpty Dumpty: Controlling Word Meanings via Corpus Poisoning
"Hunt Takes Hare": Theming Games Through Game-Word Vector Translation
Exploring Adequacy Errors in Neural Machine Translation with the Help of Cross-Language Aligned Word Embeddings
Exploration on Grounded Word Embedding: Matching Words and Images with Image-Enhanced Skip-Gram Model
Hybridation d'un agent conversationnel avec des plongements lexicaux pour la formation au diagnostic médical (Hybridization of a conversational agent with word embeddings for medical diagnostic training)
Hybrid Code Networks using a convolutional neural network as an input layer achieves higher turn accuracy
Exploration des relations sémantiques sous-jacentes aux plongements contextuels de mots (Exploring semantic relations underlying contextual word embeddings)
Clustering Prominent People and Organizations in Topic-Specific Text Corpora
ARHNet - Leveraging Community Interaction for Detection of Religious Hate Speech in Arabic
Exploiting Task-Oriented Resources to Learn Word Embeddings for Clinical Abbreviation Expansion

Leaderboard

No leaderboard results yet.