
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
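Concretely, an embedding is a lookup into a matrix that holds one real-valued vector per vocabulary word. The sketch below illustrates this with a hypothetical four-word vocabulary and randomly initialized vectors; in a trained model these values would be learned, not set by hand.

```python
import numpy as np

# Hypothetical toy vocabulary; real models index tens of thousands of words.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}

# Embedding matrix: one d-dimensional real vector per word.
# Randomly initialized for illustration; training would learn these values.
d = 4
rng = np.random.default_rng(0)
E = rng.normal(size=(len(vocab), d))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector of real numbers by table lookup."""
    return E[vocab[word]]

print(embed("queen"))  # a 4-dimensional vector of real numbers
```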

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
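As a concrete illustration, the sketch below trains a small skip-gram Word2Vec model on a toy corpus using the gensim library (an assumption made here, not something this page prescribes); real training corpora contain millions of sentences.

```python
from gensim.models import Word2Vec

# Toy corpus of tokenized sentences; real corpora are far larger.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "city"],
    ["a", "woman", "walks", "in", "the", "city"],
]

# Skip-gram (sg=1) Word2Vec; hyperparameters are scaled down for the toy data.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context words considered on each side
    min_count=1,     # keep every word, even singletons
    sg=1,            # 1 = skip-gram, 0 = CBOW
    seed=42,
)

vec = model.wv["queen"]                       # learned 50-dim vector for "queen"
print(model.wv.most_similar("king", topn=2))  # nearest neighbors by cosine similarity
```

GloVe, by contrast, learns its vectors by factorizing a global word co-occurrence matrix rather than by training a sliding-window predictor.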

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3061–3070 of 4002 papers

Title | Status | Hype
Lexical Chains meet Word Embeddings in Document-level Statistical Machine Translation | | 0
Bilexical Embeddings for Quality Estimation | | 0
Towards the Understanding of Gaming Audiences by Modeling Twitch Emotes | | 0
Lexicalized vs. Delexicalized Parsing in Low-Resource Scenarios | | 0
UWat-Emote at EmoInt-2017: Emotion Intensity Detection using Affect Clues, Sentiment Polarity and Word Embeddings | | 0
Variable Mini-Batch Sizing and Pre-Trained Embeddings | | 0
Neural Networks and Spelling Features for Native Language Identification | | 0
Detecting Sarcasm Using Different Forms Of Incongruity | | 0
Predicting Pronouns with a Convolutional Network and an N-gram Model | | 0
LIPN-UAM at EmoInt-2017: Combination of Lexicon-based features and Sentence-level Vector Representations for Emotion Intensity Determination | | 0
