
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
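For concreteness, here is a minimal sketch of such a mapping as an embedding lookup table. The tiny vocabulary and the randomly initialized vectors are hypothetical, standing in for the vectors a trained model would learn.

```python
# Minimal sketch: a word-embedding lookup table.
# The vocabulary and random vectors below are illustrative only;
# a real model would learn these vectors from data.
import numpy as np

vocab = ["the", "king", "queen", "kingdom"]
word_to_index = {word: i for i, word in enumerate(vocab)}

# Each row of the matrix is the real-valued vector for one word.
embedding_dim = 4
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Map a word from the vocabulary to its vector of real numbers."""
    return embeddings[word_to_index[word]]

print(embed("king"))  # a length-4 vector of floats
```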

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification; a short training sketch follows below.
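As one concrete example, the following sketch trains Word2Vec embeddings with the gensim library. It assumes gensim >= 4.0, and the toy tokenized corpus is invented here purely for illustration.

```python
# Sketch: training Word2Vec embeddings with gensim (assumes gensim >= 4.0).
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (illustrative only).
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["word", "embeddings", "map", "words", "to", "vectors"],
]

model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every token in this toy corpus
    sg=1,             # sg=1 selects skip-gram; sg=0 would use CBOW
)

# Each word is now mapped to a dense vector of real numbers.
vector = model.wv["king"]                          # shape: (50,)
similar = model.wv.most_similar("king", topn=3)    # nearest words by cosine
print(vector[:5], similar)
```

GloVe instead fits vectors to global word co-occurrence counts rather than local context windows, but the end product is the same kind of lookup table.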

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2091–2100 of 4002 papers

Title | Status | Hype
Embeddings as representation for symbolic music | | 0
Embeddings for Named Entity Recognition in Geoscience Portuguese Literature | | 0
Embeddings in Natural Language Processing | | 0
Embedding Space Correlation as a Measure of Domain Similarity | | 0
Embedding Structured Dictionary Entries | | 0
Embedding Words and Senses Together via Joint Knowledge-Enhanced Training | | 0
Embodying Pre-Trained Word Embeddings Through Robot Actions | | 0
Embracing Non-Traditional Linguistic Resources for Low-resource Language Name Tagging | | 0
Emerging Cross-lingual Structure in Pretrained Language Models | | 0
EmoDet at SemEval-2019 Task 3: Emotion Detection in Text using Deep Learning | | 0

No leaderboard results yet.