
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
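To make the mapping concrete, here is a minimal sketch of the core idea: each vocabulary word is assigned a dense vector of real numbers, and geometric closeness between vectors stands in for semantic relatedness. The vectors and vocabulary below are made-up toy values for illustration only, not trained embeddings.

```python
import numpy as np

# Toy vocabulary and embedding matrix (values invented for illustration).
vocab = {"king": 0, "queen": 1, "apple": 2}
embedding_matrix = np.array([
    [0.8, 0.3, 0.1],   # "king"
    [0.7, 0.4, 0.1],   # "queen"
    [0.1, 0.1, 0.9],   # "apple"
])

def embed(word: str) -> np.ndarray:
    """Look up a word's vector in the embedding matrix."""
    return embedding_matrix[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity: semantically related words should score higher."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embed("king"), embed("queen")))  # high: related words
print(cosine(embed("king"), embed("apple")))  # lower: unrelated words
```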

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches, typically neural network-based, that train on an NLP task such as language modeling or document classification.
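As a hedged sketch of the training step, the snippet below fits Word2Vec embeddings with the gensim library on a tiny tokenized corpus. gensim and all hyperparameter values are assumptions for illustration; the text above does not prescribe a particular implementation or settings.

```python
from gensim.models import Word2Vec

# Tokenized toy corpus; a real corpus would be far larger.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["an", "apple", "falls", "from", "the", "tree"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window around each target word
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram; 0 = CBOW
    epochs=50,
)

vector = model.wv["king"]                     # the learned embedding
print(model.wv.most_similar("king", topn=2))  # nearest neighbors by cosine
```

After training, `model.wv` holds one vector per vocabulary word, which can then be fed into downstream tasks such as named entity recognition or sentiment analysis.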

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 331–340 of 4,002 papers

Title | Status | Hype
Analogical Proportions and Creativity: A Preliminary Study | | 0
300-sparsans at SemEval-2018 Task 9: Hypernymy as interaction of sparse attributes | | 0
A Multi-tiered Solution for Personalized Baggage Item Recommendations using FastText and Association Rule Mining | | 0
A Multitask Objective to Inject Lexical Contrast into Distributional Semantics | | 0
A Deep Learning Architecture for De-identification of Patient Notes: Implementation and Evaluation | | 0
Aspect-Based Sentiment Analysis Using Bitmask Bidirectional Long Short Term Memory Networks | | 0
Assessing multiple word embeddings for named entity recognition of professions and occupations in health-related social media | | 0
Assessing the Unitary RNN as an End-to-End Compositional Model of Syntax | | 0
A Multi-task Learning Approach to Adapting Bilingual Word Embeddings for Cross-lingual Named Entity Recognition | | 0
A Multi-task Approach to Learning Multilingual Representations | | 0
