
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
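To make the mapping concrete, the minimal sketch below (plain NumPy; the 4-dimensional vectors are invented for illustration, whereas trained embeddings typically have 50-300 dimensions) shows a small vocabulary mapped to real-valued vectors and cosine similarity used to compare words:

```python
import numpy as np

# Toy embedding table: each word in the vocabulary maps to a vector of
# real numbers. These values are made up for illustration only.
embeddings = {
    "king":  np.array([0.80, 0.45, 0.10, 0.05]),
    "queen": np.array([0.78, 0.48, 0.12, 0.09]),
    "apple": np.array([0.05, 0.02, 0.90, 0.10]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```

Because semantically related words end up with nearby vectors, similarity in the vector space can stand in for similarity in meaning.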

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
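As a sketch of one such technique, the snippet below trains a small skip-gram Word2Vec model with the gensim library (parameter names follow the gensim 4.x API; the miniature corpus is made up for illustration and far too small to learn useful embeddings):

```python
from gensim.models import Word2Vec

# A made-up miniature corpus; real training uses millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# sg=1 selects the skip-gram objective (predict context words from the
# target word); vector_size is the embedding dimensionality.
model = Word2Vec(sentences, vector_size=32, window=2,
                 min_count=1, sg=1, epochs=50)

vec = model.wv["embeddings"]          # learned vector for a word
print(model.wv.most_similar("word"))  # nearest neighbors in embedding space
```

GloVe differs in that it fits embeddings to global word co-occurrence counts rather than predicting words from local context windows, but the end product is the same: one real-valued vector per vocabulary item.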

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1041–1050 of 4002 papers

Title | Status | Hype
CG-CNN: Self-Supervised Feature Extraction Through Contextual Guidance and Transfer Learning | - | 0
Spanish Biomedical and Clinical Language Embeddings | - | 0
Abelian Neural Networks | - | 0
The Sensitivity of Word Embeddings-based Author Detection Models to Semantic-preserving Adversarial Perturbations | - | 0
Paraphrases do not explain word analogies | Code | 0
Co-occurrences using Fasttext embeddings for word similarity tasks in Urdu | Code | 0
Image Captioning using Deep Stacked LSTMs, Contextual Word Embeddings and Data Augmentation | - | 0
Knowledge-Base Enriched Word Embeddings for Biomedical Domain | - | 0
Towards Emotion Recognition in Hindi-English Code-Mixed Data: A Transformer Based Approach | Code | 0
Revisiting Language Encoding in Learning Multilingual Representations | Code | 1
