
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an auxiliary NLP task such as language modeling or document classification.
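As a minimal sketch of the idea, the snippet below trains a Word2Vec skip-gram model with the gensim library on a toy corpus; both gensim and the corpus are illustrative assumptions, not something this page prescribes. Each vocabulary word ends up mapped to a 50-dimensional vector of real numbers.

```python
# Illustrative sketch only: gensim and the toy corpus are assumptions,
# not part of the original page.
from gensim.models import Word2Vec

# Toy corpus: each document is a pre-tokenized list of words.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["vectors", "of", "real", "numbers", "represent", "words"],
]

# Train a skip-gram model (sg=1): every word is mapped to a
# 50-dimensional real-valued vector.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    seed=42,
)

# Look up the learned vector for a word.
vec = model.wv["king"]
print(vec.shape)  # (50,)

# Nearest neighbours in the embedding space.
print(model.wv.most_similar("king", topn=3))
```

On a corpus this small the neighbours are not meaningful; with realistic amounts of text, words that occur in similar contexts end up with nearby vectors.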

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3621–3630 of 4002 papers

Title | Status | Hype
TF-CR: Weighting Embeddings for Text Classification | Code | 0
Evaluation of sentence embeddings in downstream and linguistic probing tasks | Code | 0
Learning Zero-Shot Multifaceted Visually Grounded Word Embeddings via Multi-Task Training | Code | 0
URLNet: Learning a URL Representation with Deep Learning for Malicious URL Detection | Code | 0
TF-IDF vs Word Embeddings for Morbidity Identification in Clinical Notes: An Initial Study | Code | 0
The 2022 n2c2/UW Shared Task on Extracting Social Determinants of Health | Code | 0
Evaluation Of Word Embeddings From Large-Scale French Web Content | Code | 0
Evaluation of Word Vector Representations by Subspace Alignment | Code | 0
Aspect Detection using Word and Char Embeddings with (Bi)LSTM and CRF | Code | 0
PejorativITy: Disambiguating Pejorative Epithets to Improve Misogyny Detection in Italian Tweets | Code | 0
