
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
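For concreteness, here is a minimal sketch of learning such embeddings with the gensim library's Word2Vec implementation; the toy corpus and every hyperparameter below are illustrative assumptions, not drawn from any of the papers listed on this page.

# A minimal sketch: learning word embeddings with gensim's Word2Vec
# (skip-gram). The corpus and hyperparameters are toy-scale assumptions.
from gensim.models import Word2Vec

# Tiny tokenized corpus; real embeddings are trained on millions of sentences.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["a", "cat", "chased", "a", "dog"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # each word is mapped to a 50-dimensional real vector
    window=2,        # context window used to form training pairs
    min_count=1,     # keep every word in this toy vocabulary
    sg=1,            # 1 = skip-gram; 0 = CBOW
    epochs=50,
)

vec = model.wv["cat"]                      # the learned vector for "cat"
print(vec.shape)                           # (50,)
print(model.wv.similarity("cat", "dog"))   # cosine similarity of two embeddings

Pre-trained vectors such as GloVe can typically be loaded into the same KeyedVectors interface for lookup and similarity queries rather than trained from scratch.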

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1371–1380 of 4002 papers

Title | Status | Hype
Detecting Semantically Equivalent Questions in Online User Forums | | 0
Entropy-Based Subword Mining with an Application to Word Embeddings | | 0
Changes in Commuter Behavior from COVID-19 Lockdowns in the Atlanta Metropolitan Area | | 0
Equalizing Gender Bias in Neural Machine Translation with Word Embeddings Techniques | | 0
An Experimental Study of Deep Neural Network Models for Vietnamese Multiple-Choice Reading Comprehension | | 0
Error Analysis for Vietnamese Named Entity Recognition on Deep Neural Network Models | | 0
Character aware models with similarity learning for metaphor detection | | 0
Character and Subword-Based Word Representation for Neural Language Modeling Prediction | | 0
Estimating Mutual Information Between Dense Word Embeddings | | 0
Detecting Sarcasm Using Different Forms Of Incongruity | | 0

No leaderboard results yet.