
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that train on an NLP task such as language modeling or document classification.
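To make this concrete, here is a minimal, illustrative sketch of training skip-gram Word2Vec embeddings with the gensim library (assuming gensim ≥ 4.0); the toy corpus and hyperparameters are placeholders, not a recommended configuration.

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec
# (assumes gensim >= 4.0; corpus and hyperparameters are illustrative).
from gensim.models import Word2Vec

# A toy corpus: each sentence is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "word", "vectors"],
    ["vectors", "capture", "semantic", "similarity", "between", "words"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Each vocabulary word now maps to a 50-dimensional real-valued vector.
vec = model.wv["word"]          # numpy array of shape (50,)
print(vec.shape)

# Nearest neighbours by cosine similarity in the embedding space.
print(model.wv.most_similar("word", topn=3))
```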

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1051–1060 of 4002 papers

Title | Status | Hype
Hate speech detection using static BERT embeddings | | 0
Multilingual transfer of acoustic word embeddings improves when training on languages related to the target zero-resource language | Code | 0
Mixtures of Deep Neural Experts for Automated Speech Scoring | | 0
Clinical Named Entity Recognition using Contextualized Token Representations | | 0
Membership Inference on Word Embedding and Beyond | | 0
Deep Learning Models in Detection of Dietary Supplement Adverse Event Signals from Twitter | | 0
WG4Rec: Modeling Textual Content with Word Graph for News Recommendation | Code | 0
An Improved Single Step Non-autoregressive Transformer for Automatic Speech Recognition | | 0
Do Acoustic Word Embeddings Capture Phonological Similarity? An Empirical Study | Code | 0
PairConnect: A Compute-Efficient MLP Alternative to Attention | | 0

No leaderboard results yet.