
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
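As a concrete illustration, the sketch below trains a small skip-gram Word2Vec model with the gensim library and queries the resulting vectors. The toy corpus and hyperparameter values are illustrative assumptions, not part of this page; real embeddings require training on a much larger corpus.

```python
# A minimal sketch of learning word embeddings with Word2Vec
# (assumes gensim 4.x; corpus and hyperparameters are illustrative).
from gensim.models import Word2Vec

# Tiny tokenized corpus; real training needs far more text.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "trains", "embeddings", "on", "a", "language", "modeling", "task"],
    ["similar", "words", "end", "up", "with", "similar", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued vectors
    window=3,         # context window size
    min_count=1,      # keep every token in this toy corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["embeddings"]           # a 50-dimensional numpy array
print(vec.shape)                       # (50,)
print(model.wv.most_similar("words"))  # nearest neighbors by cosine similarity
```

After training, each vocabulary word maps to a fixed real-valued vector, and words that appear in similar contexts end up close together under cosine similarity.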

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3261–3270 of 4002 papers

Title | Status | Hype
Adversarial Training for Unsupervised Bilingual Lexicon Induction | - | 0
Recurrent neural networks with specialized word embeddings for health-domain named-entity recognition | Code | 0
Neural Question Answering at BioASQ 5B | - | 0
Beyond Bilingual: Multi-sense Word Embeddings using Multilingual Context | - | 0
Jointly Learning Word Embeddings and Latent Topics | - | 0
End-to-End Neural Ad-hoc Ranking with Kernel Pooling | Code | 0
A Mixture Model for Learning Multi-Sense Word Embeddings | - | 0
A Survey Of Cross-lingual Word Embedding Models | - | 0
Query-by-Example Search with Discriminative Neural Acoustic Word Embeddings | Code | 0
Neural Domain Adaptation for Biomedical Question Answering | Code | 0

No leaderboard results yet.