
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
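
To make the mapping concrete, the following minimal sketch (plain NumPy, with hand-written illustrative vectors rather than values from any trained model) represents a tiny hypothetical vocabulary as a lookup table from words to dense real-valued vectors and compares entries with cosine similarity:

```python
import numpy as np

# Hypothetical 4-word vocabulary mapped to illustrative 3-dimensional
# real-valued vectors; real embeddings are learned from data, not hand-written.
embeddings = {
    "king":  np.array([0.80, 0.31, 0.54]),
    "queen": np.array([0.78, 0.35, 0.60]),
    "apple": np.array([0.12, 0.90, 0.05]),
    "fruit": np.array([0.15, 0.85, 0.10]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up with more similar vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```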

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
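
As one hedged illustration of such a technique, the sketch below trains a small Word2Vec model with the gensim library (assuming gensim >= 4.0, whose `Word2Vec` constructor takes `vector_size`; the toy corpus is an assumption for demonstration only and far too small for meaningful vectors):

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (illustrative only; real
# training uses large corpora).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["similar", "words", "get", "similar", "vectors"],
]

# sg=1 selects the skip-gram variant; sg=0 would use CBOW instead.
model = Word2Vec(
    sentences=sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=3,        # context window size
    min_count=1,     # keep every token in this tiny corpus
    sg=1,
    epochs=50,
)

# Look up the learned vector for a word and find its nearest neighbours.
vec = model.wv["embeddings"]                   # a 50-dimensional NumPy array
print(model.wv.most_similar("words", topn=3))  # (word, similarity) pairs
```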

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2981–2990 of 4002 papers

Title | Hype
Naive Bayes and BiLSTM Ensemble for Discriminating between Mainland and Taiwan Variation of Mandarin Chinese | 0
Named Entity Recognition Only from Word Embeddings | 0
Named Entity Recognition on Twitter for Turkish using Semi-supervised Learning with Word Embeddings | 0
NARNIA at NLP4IF-2021: Identification of Misinformation in COVID-19 Tweets Using BERTweet | 0
Natural Alpha Embeddings | 0
Quantum Natural Language Processing | 0
Natural Language Inference with Definition Embedding Considering Context On the Fly | 0
Natural Language Processing for Diagnosis and Risk Assessment of Cardiovascular Disease | 0
Natural Language Processing of Clinical Notes on Chronic Diseases: Systematic Review | 0
N-best Rescoring for Parsing Based on Dependency-Based Word Embeddings | 0
Page 299 of 401

No leaderboard results yet.