Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
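
As a minimal sketch of the idea, the snippet below trains a small Word2Vec model with the gensim library (gensim, the toy corpus, and all parameter values here are illustrative assumptions, not something this page prescribes). After training, every vocabulary word maps to a vector of real numbers, and word similarity can be measured with cosine similarity over those vectors.

```python
# Minimal Word2Vec sketch using gensim (assumed library; toy corpus for illustration only).
from gensim.models import Word2Vec

# Toy corpus: each "document" is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "tend", "to", "get", "similar", "vectors"],
    ["word2vec", "learns", "vectors", "from", "surrounding", "context"],
]

# vector_size: embedding dimensionality; window: context size;
# min_count=1 keeps every token in this tiny corpus.
model = Word2Vec(sentences=corpus, vector_size=50, window=2, min_count=1, epochs=100)

# Each vocabulary word is now mapped to a 50-dimensional vector of real numbers.
vec = model.wv["embeddings"]
print(vec.shape)  # (50,)

# Nearest neighbors by cosine similarity in the learned vector space.
print(model.wv.most_similar("words", topn=3))
```

On a corpus this small the neighbors are essentially noise; the point is only the shape of the interface: tokens in, dense real-valued vectors out.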

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 801–810 of 4002 papers

Title | Status | Hype
Regional Differences in Information Privacy Concerns After the Facebook-Cambridge Analytica Data Scandal | | 0
An experimental study of the vision-bottleneck in VQA | | 0
Hindi/Bengali Sentiment Analysis Using Transfer Learning and Joint Dual Input Learning with Self Attention | Code | 0
Bench-Marking And Improving Arabic Automatic Image Captioning Through The Use Of Multi-Task Learning Paradigm | | 0
HistBERT: A Pre-trained Language Model for Diachronic Lexical Semantic Analysis | Code | 0
Fairness for Text Classification Tasks with Identity Information Data Augmentation Methods | | 0
L3Cube-MahaCorpus and MahaBERT: Marathi Monolingual Corpus, Marathi BERT Language Models, and Resources | | 0
Towards a Theoretical Understanding of Word and Relation Representation | | 0
Learning Representations of Entities and Relations | | 0
Recognition of Implicit Geographic Movement in Text | | 0
Page 81 of 401

No leaderboard results yet.