
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
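
To make the definition concrete, here is a minimal sketch of the idea: a toy vocabulary, a small hand-picked embedding matrix (not learned from data), and cosine similarity as the usual notion of closeness between word vectors. The vocabulary, vector values, and function names are illustrative assumptions, not taken from any paper on this page.

```python
import numpy as np

# Toy 4-word vocabulary mapped to 3-dimensional real-valued vectors.
# The values are hand-picked for illustration, not learned.
vocab = {"king": 0, "queen": 1, "apple": 2, "orange": 3}
embeddings = np.array([
    [0.8, 0.1, 0.6],   # king
    [0.7, 0.2, 0.7],   # queen
    [0.1, 0.9, 0.2],   # apple
    [0.2, 0.8, 0.1],   # orange
])

def vector(word):
    """Look up the embedding vector for a word."""
    return embeddings[vocab[word]]

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; higher means more similar."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up with more similar vectors.
print(cosine_similarity(vector("king"), vector("queen")))   # high
print(cosine_similarity(vector("king"), vector("apple")))   # lower
```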

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
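
As a sketch of how such embeddings are trained in practice, the following uses the gensim library's Word2Vec implementation (gensim 4.x API) on a toy corpus. The corpus, hyperparameters, and printed examples are illustrative assumptions, not a reference implementation of any paper listed below.

```python
from gensim.models import Word2Vec

# A toy corpus: a list of tokenized sentences.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "animals"],
]

model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the learned word vectors
    window=2,         # context window size around each target word
    min_count=1,      # keep every word; the corpus is tiny
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,       # extra passes to compensate for the tiny corpus
)

print(model.wv["cat"])               # the learned 50-d vector for "cat"
print(model.wv.most_similar("cat"))  # nearest neighbors by cosine similarity
```

Skip-gram (sg=1) trains the vectors to predict context words from a target word; CBOW does the reverse. Either way, the embedding matrix is a by-product of the prediction task.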

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1941–1950 of 4002 papers

Title | Status | Hype
Shallow Domain Adaptive Embeddings for Sentiment Analysis |  | 0
Disentangling Latent Emotions of Word Embeddings on Complex Emotional Narratives |  | 0
Natural Language Processing of Clinical Notes on Chronic Diseases: Systematic Review |  | 0
Sex Trafficking Detection with Ordinal Regression Neural Networks |  | 0
On-Device Text Representations Robust To Misspellings via Projections |  | 0
Generative Question Refinement with Deep Reinforcement Learning in Retrieval-based QA System | Code | 0
BERT-based Ranking for Biomedical Entity Normalization |  | 0
Debiasing Embeddings for Reduced Gender Bias in Text Classification |  | 0
Text mining policy: Classifying forest and landscape restoration policy agenda with neural information retrieval |  | 0
A Simple and Effective Approach for Fine Tuning Pre-trained Word Embeddings for Improved Text Classification | Code | 0
