Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches trained on an NLP task such as language modeling or document classification.
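As a concrete illustration, here is a minimal sketch of training word embeddings with the gensim library's Word2Vec implementation. The toy corpus, parameter values, and choice of gensim are assumptions for this example rather than anything prescribed by the page above; it assumes gensim >= 4.0, where the embedding dimension parameter is named vector_size.

    # Minimal Word2Vec sketch (assumes gensim >= 4.0 is installed).
    from gensim.models import Word2Vec

    # Toy corpus: each sentence is a list of tokens.
    corpus = [
        ["word", "embeddings", "map", "words", "to", "vectors"],
        ["word2vec", "learns", "embeddings", "from", "context"],
        ["similar", "words", "get", "similar", "vectors"],
    ]

    # Train a skip-gram model (sg=1); vector_size is the embedding dimension.
    model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

    # Each vocabulary word is now a 50-dimensional real-valued vector.
    vec = model.wv["embeddings"]
    print(vec.shape)  # (50,)

    # Cosine similarity in the learned space reflects distributional similarity.
    print(model.wv.most_similar("words", topn=3))

On a realistic corpus, nearest neighbors under cosine similarity tend to be semantically related words; the toy corpus here is only large enough to show the API shape.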

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2681–2690 of 4002 papers

Morphological Word Embeddings for Arabic Neural Machine Translation in Low-Resource Settings
Automatic Detection of Incoherent Speech for Diagnosing Schizophrenia
Literal, Metaphorical or Both? Detecting Metaphoricity in Isolated Adjective-Noun Phrases
Phrase-Level Metaphor Identification Using Distributed Representations of Word Meaning
Di-LSTM Contrast : A Deep Neural Network for Metaphor Detection
Detecting Figurative Word Occurrences Using Recurrent Neural Networks
Fast Query Expansion on an Accounting Corpus using Sub-Word Embeddings
Subword-level Composition Functions for Learning Word Embeddings
#TeamINF at SemEval-2018 Task 2: Emoji Prediction in Tweets
UWB at SemEval-2018 Task 3: Irony detection in English tweets
Page 269 of 401

No leaderboard results yet.