
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
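To make the mapping concrete, here is a minimal sketch of the core idea in Python: a lookup table that maps each vocabulary word to a dense vector of real numbers. The vocabulary, dimension, and random initialization are invented for illustration; trained systems learn these vectors rather than drawing them at random.

import numpy as np

# Hypothetical three-word vocabulary; real systems use tens of thousands of words.
vocab = {"king": 0, "queen": 1, "apple": 2}
dim = 4  # embedding dimension, typically 50-300 in practice

# Randomly initialized lookup table: one real-valued row vector per word.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), dim))

def embed(word: str) -> np.ndarray:
    # Map a word to its vector of real numbers via table lookup.
    return embeddings[vocab[word]]

print(embed("king"))  # a 4-dimensional real vector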

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
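As a sketch of how such embeddings are trained in practice, the following example uses the gensim library's Word2Vec implementation (assuming gensim 4.x; the toy corpus here is invented for the example and far too small to yield meaningful vectors).

from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
]

# Train a skip-gram Word2Vec model (sg=1); CBOW would be sg=0.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=20)

# Look up the learned vector for a word and its nearest neighbours.
vector = model.wv["embeddings"]  # a 50-dimensional numpy array
print(model.wv.most_similar("embeddings", topn=3))

GloVe, by contrast, is typically trained from precomputed word co-occurrence statistics rather than by sliding a context window over raw text during training.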

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2621–2630 of 4002 papers (page 263 of 401)

Title | Status | Hype
Low-Resource Machine Transliteration Using Recurrent Neural Networks of Asian Languages | - | 0
BioAMA: Towards an End to End BioMedical Question Answering System | - | 0
Keyphrases Extraction from User-Generated Contents in Healthcare Domain Using Long Short-Term Memory Networks | - | 0
Investigating Domain-Specific Information for Neural Coreference Resolution on Biomedical Texts | - | 0
Investigating Effective Parameters for Fine-tuning of Word Embeddings Using Only a Small Corpus | - | 0
Natural Language Inference with Definition Embedding Considering Context On the Fly | - | 0
Multilingual Seq2seq Training with Similarity Loss for Cross-Lingual Document Classification | - | 0
Attention-based Semantic Priming for Slot-filling | - | 0
Joint learning of frequency and word embeddings for multilingual readability assessment | - | 0
A Sequence Learning Method for Domain-Specific Entity Linking | - | 0

No leaderboard results yet.