
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
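To make the "words mapped to vectors" idea concrete, here is a minimal sketch with hypothetical, hand-picked 4-dimensional vectors (real models learn vectors of 50-300 dimensions from large corpora); it shows the common practice of comparing embeddings by cosine similarity:

import numpy as np

# Hypothetical 4-dimensional embeddings for illustration only; real
# embeddings are learned from data, not hand-written.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.59, 0.12]),
    "queen": np.array([0.54, 0.71, -0.55, 0.33]),
    "apple": np.array([-0.20, 0.05, 0.80, -0.41]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; values near 1.0 mean similar."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~0.98)
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # much lower

The point of the toy numbers is that semantically related words ("king", "queen") sit close together in the vector space, while unrelated words ("king", "apple") do not.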

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.
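As a concrete example of one such technique, the following is a minimal training sketch using the gensim library (assumed installed, gensim >= 4.0); the toy corpus and hyperparameters are illustrative only:

# Train a skip-gram Word2Vec model on a tiny toy corpus.
from gensim.models import Word2Vec

corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "log"],
    ["cats", "and", "dogs", "are", "animals"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size on each side of the target word
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
)

vec = model.wv["cat"]                          # the 50-dimensional vector for "cat"
print(model.wv.most_similar("cat", topn=3))    # nearest neighbors by cosine similarity

On a corpus this small the neighbors are essentially noise; with a realistic corpus the same few lines yield vectors whose nearest neighbors reflect genuine semantic similarity.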

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 661–670 of 4,002 papers (page 67 of 401)

Can AI Generate Love Advice?: Toward Neural Answer Generation for Non-Factoid Questions
Can Domain Adaptation be Handled as Analogies?
Can Existing Methods Debias Languages Other than English? First Attempt to Analyze and Mitigate Japanese Word Embeddings
Can Eye Movement Data Be Used As Ground Truth For Word Embeddings Evaluation?
Aligning Open IE Relations and KB Relations using a Siamese Network Based on Word Embedding
Active Discriminative Text Representation Learning
Captioning Images with Novel Objects via Online Vocabulary Expansion
Capturing Pragmatic Knowledge in Article Usage Prediction using LSTMs
Card-660: Cambridge Rare Word Dataset - a Reliable Benchmark for Infrequent Word Representation Models
Boosting Named Entity Recognition with Neural Character Embeddings

No leaderboard results are available for this task yet.