
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
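To make the mapping concrete, here is a minimal NumPy sketch of the lookup-table view of an embedding: each word indexes a row of a real-valued matrix. The three-word vocabulary is made up, and the vectors are randomly initialized for illustration; in a real system they would be learned from data.

```python
import numpy as np

# Hypothetical toy vocabulary; in practice this is built from a corpus.
vocab = {"king": 0, "queen": 1, "apple": 2}
embedding_dim = 4  # real systems typically use tens to hundreds of dimensions

# The embedding table: one real-valued vector per vocabulary word.
# Random here; training (e.g. Word2Vec) would learn these values.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector by table lookup."""
    return embeddings[vocab[word]]

print(embed("king"))  # a 4-dimensional real vector
```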

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network approaches that learn embeddings as a by-product of training on an NLP task such as language modeling or document classification.
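As a rough sketch of the training side, the snippet below uses the gensim library's Word2Vec implementation on a tiny made-up corpus; the corpus and all hyperparameter values are illustrative toy choices, not recommendations.

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: a list of tokenized sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "apple", "fell", "from", "the", "tree"],
]

# Train a skip-gram model (sg=1); vector_size, window, and epochs
# are toy values chosen so the example runs quickly.
model = Word2Vec(
    corpus,
    vector_size=32,
    window=2,
    min_count=1,
    sg=1,
    epochs=50,
    seed=0,
)

vec = model.wv["king"]                   # the learned embedding vector
similar = model.wv.most_similar("king")  # nearest neighbors by cosine similarity
print(vec.shape, similar[:3])
```

Skip-gram training pushes words that occur in similar contexts (here, "king" and "queen") toward nearby vectors, which is what makes the learned space useful for similarity queries like most_similar.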

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1881-1890 of 4002 papers

Title | Hype
Indigenous Language Revitalization and the Dilemma of Gender Bias | 0
In-domain Context-aware Token Embeddings Improve Biomedical Named Entity Recognition | 0
Evolving Large Text Corpora: Four Versions of the Icelandic Gigaword Corpus | 0
Clinical Abbreviation Disambiguation Using Neural Word Embeddings | 0
Inducing Distant Supervision in Suggestion Mining through Part-of-Speech Embeddings | 0
Evolving Hate Speech Online: An Adaptive Framework for Detection and Mitigation | 0
Inducing Embeddings for Rare and Unseen Words by Leveraging Lexical Resources | 0
Inducing Relational Knowledge from BERT | 0
Clickbait detection using word embeddings | 0
Are Word Embeddings Really a Bad Fit for the Estimation of Thematic Fit? | 0
