
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
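
As a minimal illustration of this word-to-vector mapping, the Python sketch below builds a toy embedding lookup table; the vocabulary, dimensionality, and random initialization are illustrative assumptions rather than part of any particular method.

```python
# A minimal sketch of the core idea: each word in a small, hypothetical
# vocabulary is mapped to a dense vector of real numbers via a lookup table.
import numpy as np

vocab = ["king", "queen", "man", "woman"]  # toy vocabulary (assumption)
word_to_index = {word: i for i, word in enumerate(vocab)}

embedding_dim = 4  # illustrative dimensionality, not a recommended value
rng = np.random.default_rng(seed=0)
embedding_matrix = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Look up the vector for a word; the mapping word -> R^d is the embedding."""
    return embedding_matrix[word_to_index[word]]

# After training, similar words should have similar vectors;
# cosine similarity is a common way to compare them.
def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embed("king"), embed("queen")))
```

Here the vectors are random, so the similarity score is meaningless; the learning techniques described below are what give the vectors their semantic structure.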

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
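
As a sketch of how one such technique is typically trained, the snippet below uses the gensim library's Word2Vec implementation on a toy corpus; the sentences and hyperparameter values are assumptions chosen for illustration, not settings from any cited paper.

```python
# A hedged example of training Word2Vec with gensim on a toy corpus.
from gensim.models import Word2Vec

sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
]

model = Word2Vec(
    sentences,
    vector_size=100,  # embedding dimensionality
    window=5,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram; 0 = CBOW
)

vector = model.wv["embeddings"]          # the learned vector for a word
similar = model.wv.most_similar("word")  # nearest neighbors in embedding space
```

On a real corpus, `most_similar` returns semantically related words because words that appear in similar contexts are pushed toward nearby points in the vector space.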

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2651–2660 of 4002 papers

Title | Status | Hype
Utilizing Pre-Trained Word Embeddings to Learn Classification Lexicons with Little Supervision | | 0
Utilizing Word Embeddings based Features for Phylogenetic Tree Generation of Sanskrit Texts | | 0
Utterance Intent Classification of a Spoken Dialogue System with Efficiently Untied Recursive Autoencoders | | 0
UWat-Emote at EmoInt-2017: Emotion Intensity Detection using Affect Clues, Sentiment Polarity and Word Embeddings | | 0
UWB at IEST 2018: Emotion Prediction in Tweets with Bidirectional Long Short-Term Memory Neural Network | | 0
UWB at SemEval-2016 Task 7: Novel Method for Automatic Sentiment Intensity Determination | | 0
UWB at SemEval-2018 Task 1: Emotion Intensity Detection in Tweets | | 0
UWB at SemEval-2018 Task 3: Irony detection in English tweets | | 0
UZH at SemEval-2020 Task 3: Combining BERT with WordNet Sense Embeddings to Predict Graded Word Similarity Changes | | 0
“Vaderland”, “Volk” and “Natie”: Semantic Change Related to Nationalism in Dutch Literature Between 1700 and 1880 Captured with Dynamic Bernoulli Word Embeddings | | 0
Page 266 of 401

No leaderboard results yet.