
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
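The core idea can be illustrated without any training framework: represent each word by a vector derived from the contexts it appears in, then compare words by cosine similarity. The sketch below is a minimal, count-based co-occurrence version of this (not Word2Vec or GloVe themselves); the toy corpus, window size, and all names are illustrative assumptions.

```python
import math

# Illustrative toy corpus (an assumption, not from any benchmark dataset).
corpus = [
    "king rules the kingdom".split(),
    "queen rules the kingdom".split(),
    "dog chases the cat".split(),
    "cat chases the dog".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
index = {w: i for i, w in enumerate(vocab)}

# Each word's vector counts its context words within a +/-2 token window.
window = 2
vectors = {w: [0.0] * len(vocab) for w in vocab}
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if j != i:
                vectors[w][index[sent[j]]] += 1.0

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Words that share contexts end up with similar vectors.
print(f'{cosine(vectors["king"], vectors["queen"]):.2f}')  # prints 1.00
print(f'{cosine(vectors["king"], vectors["dog"]):.2f}')    # prints 0.50
```

"king" and "queen" occur in identical contexts here, so their vectors coincide; neural methods like Word2Vec learn dense low-dimensional vectors with the same goal, rather than raw counts.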

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3201–3210 of 4002 papers

Title | Status | Hype
Squashed Shifted PMI Matrix: Bridging Word Embeddings and Hyperbolic Spaces | Code | 0
Deeper Text Understanding for IR with Contextual Neural Language Modeling | Code | 0
DeepHateExplainer: Explainable Hate Speech Detection in Under-resourced Bengali Language | Code | 0
Towards Arabic Sentence Simplification via Classification and Generative Approaches | Code | 0
Deep Image-to-Recipe Translation | Code | 0
Inorganic Materials Synthesis Planning with Literature-Trained Neural Networks | Code | 0
BiLSTM-CRF for Persian Named-Entity Recognition; ArmanPersoNERCorpus: the First Entity-Annotated Persian Dataset | Code | 0
Deep Learning for Hate Speech Detection in Tweets | Code | 0
Deep learning for language understanding of mental health concepts derived from Cognitive Behavioural Therapy | Code | 0
Insights into Analogy Completion from the Biomedical Domain | Code | 0
Page 321 of 401

No leaderboard results yet.