
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
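As a minimal illustration of the idea (the vectors below are made up for the example, not the output of any trained model), the sketch maps a tiny vocabulary to dense vectors and compares words by cosine similarity, the usual way to measure semantic relatedness in an embedding space:

```python
import numpy as np

# Toy illustration: each word in the vocabulary maps to a dense vector
# of real numbers. These vectors are invented for the example.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.1, 0.9]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; related words score higher."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```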

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification. A short training sketch follows below.
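For instance, the sketch below trains a skip-gram Word2Vec model, assuming gensim 4.x is installed (pip install gensim); the toy corpus and all hyperparameter values are invented for illustration:

```python
from gensim.models import Word2Vec

# Toy corpus, invented for this example: a list of tokenized sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,        # many passes, since the corpus is tiny
)

vec = model.wv["embeddings"]                      # learned vector for a word
print(model.wv.most_similar("embeddings", topn=3))  # nearest neighbors
```

On a real corpus you would raise min_count and vector_size; the choice between skip-gram and CBOW (the sg flag) trades training speed against quality on rare words.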

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1291-1300 of 4002 papers

Title | Hype
Detection of Adverse Drug Reaction in Tweets Using a Combination of Heterogeneous Word Embeddings | 0
Embeddings as representation for symbolic music | 0
Embeddings for Named Entity Recognition in Geoscience Portuguese Literature | 0
BUCC2020: Bilingual Dictionary Induction using Cross-lingual Embedding | 0
Embeddings in Natural Language Processing | 0
Embedding Space Correlation as a Measure of Domain Similarity | 0
Building a robust sentiment lexicon with (almost) no resource | 0
Embedding Structured Dictionary Entries | 0
Building a Web-Scale Dependency-Parsed Corpus from CommonCrawl | 0
Detecting weak and strong Islamophobic hate speech on social media | 0

Leaderboards

No leaderboard results yet.