
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.
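As a concrete illustration, here is a minimal sketch of training word embeddings with gensim's Word2Vec implementation (assuming gensim ≥ 4.0; the toy corpus and all hyperparameter values are arbitrary examples, not taken from any of the papers listed below):

```python
# Minimal sketch: training skip-gram word embeddings with gensim's Word2Vec.
# Assumes gensim >= 4.0; the toy corpus and hyperparameters are illustrative only.
from gensim.models import Word2Vec

# A tiny tokenized corpus (in practice: millions of sentences).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "co-occurrence", "counts"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued vectors
    window=5,         # context window size
    min_count=1,      # keep every word in this toy vocabulary
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=10,
)

vec = model.wv["embeddings"]          # a 50-dimensional numpy vector
print(vec.shape)                      # (50,)
print(model.wv.most_similar("embeddings", topn=3))
```

Words that occur in similar contexts end up with nearby vectors, which is what makes the learned space useful for downstream NLP tasks.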

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3221–3230 of 4,002 papers

Title | Status | Hype
Bag-of-Words Transfer: Non-Contextual Techniques for Multi-Task Learning | - | 0
BAHP: Benchmark of Assessing Word Embeddings in Historical Portuguese | - | 0
Balancing the composition of word embeddings across heterogenous data sets | - | 0
Batch IS NOT Heavy: Learning Word Representations From All Samples | - | 0
Bayesian Paragraph Vectors | - | 0
BB_twtr at SemEval-2017 Task 4: Twitter Sentiment Analysis with CNNs and LSTMs | - | 0
Beats of Bias: Analyzing Lyrics with Topic Modeling and Gender Bias Measurements | - | 0
Bench-Marking And Improving Arabic Automatic Image Captioning Through The Use Of Multi-Task Learning Paradigm | - | 0
Benchmarking zero-shot and few-shot approaches for tokenization, tagging, and dependency parsing of Tagalog text | - | 0
BERT-based Ensembles for Modeling Disclosure and Support in Conversational Social Media Text | - | 0
Page 323 of 401

No leaderboard results yet.