
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
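As a minimal sketch of the idea (the words and 4-dimensional vectors below are invented for illustration; real embeddings are learned from data and typically have 50 to 300 dimensions), an embedding is just a lookup table from words to dense vectors, and semantic relatedness can be compared with cosine similarity:

```python
import numpy as np

# Toy embedding table: each word maps to a dense real-valued vector.
# These vectors are made up for the example, not learned.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10, 0.20]),
    "queen": np.array([0.75, 0.70, 0.15, 0.25]),
    "apple": np.array([0.10, 0.05, 0.90, 0.40]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```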

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.
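As a sketch of how such embeddings might be trained in practice, here is a Word2Vec example assuming the gensim library (the Word2Vec class, vector_size parameter, and wv attribute are gensim 4.x names; the tiny corpus is invented for the example):

```python
from gensim.models import Word2Vec

# Tiny invented corpus: each document is a list of tokens.
# A real setup would tokenize a large text corpus first.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["apples", "grow", "on", "trees"],
]

# Train a skip-gram Word2Vec model (sg=1); vector_size sets the
# embedding dimensionality and window the context size.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,
    window=2,
    min_count=1,
    sg=1,
    epochs=50,
)

# Look up the learned vector for a word ...
vec = model.wv["king"]  # a 50-dimensional numpy array

# ... and find its nearest neighbours in embedding space.
print(model.wv.most_similar("king", topn=3))
```

On a corpus this small the neighbours are not meaningful; the point is only the API shape: train on tokenized sentences, then query the resulting vector table.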

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 371–380 of 4002 papers (page 38 of 401)

Title | Hype
Automated Scoring of Clinical Expressive Language Evaluation Tasks | 0
A Model of Zero-Shot Learning of Spoken Language Understanding | 0
Amobee at SemEval-2018 Task 1: GRU Neural Network with a CNN Attention Mechanism for Sentiment Classification | 0
Addressing Biases in the Texts using an End-to-End Pipeline Approach | 0
A Trie-Structured Bayesian Model for Unsupervised Morphological Segmentation | 0
A Mixture Model for Learning Multi-Sense Word Embeddings | 0
Artificial mental phenomena: Psychophysics as a framework to detect perception biases in AI models | 0
Additional Shared Decoder on Siamese Multi-view Encoders for Learning Acoustic Word Embeddings | 0
Artificial intelligence prediction of stock prices using social media | 0
Article citation study: Context enhanced citation sentiment detection | 0
