
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an auxiliary NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
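
As a rough illustration, the sketch below trains skip-gram Word2Vec embeddings on a toy corpus with gensim (gensim >= 4.0 API; the corpus and hyperparameters are illustrative assumptions, not drawn from any paper listed here):

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec.
# The toy corpus and hyperparameters are illustrative only.
from gensim.models import Word2Vec

# Each sentence is a list of pre-tokenized words.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["similar", "words", "get", "similar", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["embeddings"]   # a 50-dimensional real-valued vector
print(vec.shape)               # (50,)
print(model.wv.most_similar("words", topn=3))
```

After training, each vocabulary word is mapped to a real-valued vector, and words that appear in similar contexts end up with nearby vectors, which is the property the task description above refers to.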

Papers

Showing 1471–1480 of 4002 papers

Title | Status | Hype
Better Early than Late: Fusing Topics with Word Embeddings for Neural Question Paraphrase Identification | — | 0
Predicting Job-Hopping Motive of Candidates Using Answers to Open-ended Interview Questions | — | 0
Morphological Skip-Gram: Using morphological knowledge to improve word representation | — | 0
On a Novel Application of Wasserstein-Procrustes for Unsupervised Cross-Lingual Learning | Code | 0
An Enhanced Text Classification to Explore Health based Indian Government Policy Tweets | — | 0
A Feature Analysis for Multimodal News Retrieval | — | 0
Topic Modeling on User Stories using Word Mover's Distance | Code | 0
Pre-trained Word Embeddings for Goal-conditional Transfer Learning in Reinforcement Learning | — | 0
Cultural Cartography with Word Embeddings | — | 0
Automatic Detection of Sexist Statements Commonly Used at the Workplace | Code | 0

No leaderboard results yet.