
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
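
As a minimal illustration of this mapping, the sketch below builds a tiny hand-written embedding table (the vocabulary and 3-dimensional vectors are hypothetical toy values, not learned embeddings) and compares words by cosine similarity:

```python
import numpy as np

# A hypothetical embedding table: each row is the real-valued vector for one word.
vocab = {"king": 0, "queen": 1, "apple": 2}
embeddings = np.array([
    [0.8, 0.3, 0.1],   # king
    [0.7, 0.4, 0.1],   # queen
    [0.1, 0.0, 0.9],   # apple
])

def cosine(u, v):
    """Cosine similarity: near 1 for words whose vectors point the same way."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

king, queen, apple = (embeddings[vocab[w]] for w in ("king", "queen", "apple"))
print(cosine(king, queen))  # ~0.99: the toy vectors place these words close together
print(cosine(king, apple))  # ~0.22: these words are far apart
```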

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
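
As a concrete (and non-authoritative) sketch of such training, the snippet below fits a skip-gram Word2Vec model with the gensim library, assuming gensim >= 4.0; the two-sentence corpus and all hyperparameters are toy values chosen only for illustration:

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens (hypothetical data).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "embeddings", "from", "text"],
]

# Train a skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1, epochs=20)

# Look up the learned vector for a word and list its nearest neighbours.
vec = model.wv["embeddings"]                     # a 50-dimensional numpy array
print(model.wv.most_similar("embeddings", topn=3))
```

On a real corpus, the same call with a larger vector_size, a higher min_count, and many more sentences yields embeddings whose neighbourhoods reflect word usage.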

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2821–2830 of 4002 papers

Title | Status | Hype
YZU-NLP at EmoInt-2017: Determining Emotion Intensity Using a Bi-directional LSTM-CNN Model | | 0
YZU-NLP Team at SemEval-2016 Task 4: Ordinal Sentiment Classification Using a Recurrent Convolutional Network | | 0
Zara Returns: Improved Personality Induction and Adaptation by an Empathetic Virtual Agent | | 0
Zero-Inflated Exponential Family Embeddings | | 0
Zero-Shot Activity Recognition with Videos | | 0
Zero-Shot Cross-Lingual Opinion Target Extraction | | 0
Zero-Shot Cross-Lingual Transfer is a Hard Baseline to Beat in German Fine-Grained Entity Typing | | 0
Zero-Shot Visual Question Answering | | 0
Text Classification Components for Detecting Descriptions and Names of CAD models | | 0
Zur Darstellung eines mehrstufigen Prototypbegriffs in der multilingualen automatischen Sprachgenerierung: vom Korpus über word embeddings bis hin zum automatischen Wörterbuch [On the Representation of a Multi-Level Prototype Concept in Multilingual Automatic Language Generation: From the Corpus via Word Embeddings to the Automatic Dictionary] | | 0
Page 283 of 401

Leaderboards

No leaderboard results yet.