
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that train on an NLP task such as language modeling or document classification.
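The core idea above, that each word is mapped to a vector of real numbers and that nearby vectors indicate related words, can be sketched in a few lines. This is a minimal illustration using numpy with hand-picked toy vectors, not a trained Word2Vec or GloVe model; the words and values are assumptions chosen only to make the similarity comparison visible.

```python
import numpy as np

# Toy vocabulary: each word maps to a dense real-valued vector.
# In practice these vectors would be learned by a method such as
# Word2Vec or GloVe; here they are hand-picked for illustration.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.10, 0.05, 0.90]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors (1.0 = same direction)."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])

# Semantically related words end up with a higher cosine similarity.
print(sim_royal > sim_fruit)
```

Cosine similarity is the standard way to compare embedding vectors, since it measures direction rather than magnitude.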

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2831–2840 of 4002 papers

Machine Learning to Promote Translational Research: Predicting Patent and Clinical Trial Inclusion in Dementia Research
Machine Translation Evaluation for Arabic using Morphologically-enriched Embeddings
Machine Translation for Accessible Multi-Language Text Analysis
Machine Translation for English–Inuktitut with Segmentation, Data Acquisition and Pre-Training
Macquarie University at BioASQ 6b: Deep learning and deep reinforcement learning for query-based summarisation
Macquarie University at BioASQ 6b: Deep learning and deep reinforcement learning for query-based multi-document summarisation
MainiwayAI at IJCNLP-2017 Task 2: Ensembles of Deep Architectures for Valence-Arousal Prediction
Manifold Learning-based Word Representation Refinement Incorporating Global and Local Information
Mapping Text to Knowledge Graph Entities using Multi-Sense LSTMs
Mapping Unparalleled Clinical Professional and Consumer Languages with Embedding Alignment
