
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) in which words or phrases from the vocabulary are mapped to vectors of real numbers.
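
To make the mapping concrete, the following minimal sketch uses hand-picked toy 3-dimensional vectors (the words, values, and dimensionality are illustrative assumptions, not learned embeddings; real embeddings are typically learned from data and have 50-1000 dimensions):

```python
import numpy as np

# Toy word -> vector mapping (values are made up for illustration).
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: under a good embedding,
    # semantically similar words score close to 1.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```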

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
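
As one hedged illustration of such a technique, the sketch below trains a skip-gram Word2Vec model with the gensim library (assuming gensim 4.x; the toy corpus and parameter values are assumptions chosen for demonstration, not recommended settings):

```python
from gensim.models import Word2Vec

# Tiny made-up corpus: a list of tokenized sentences.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "pets"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=100,  # dimensionality of the learned vectors
    window=5,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
)

vector = model.wv["cat"]                        # 100-dimensional embedding of "cat"
similar = model.wv.most_similar("cat", topn=3)  # nearest neighbors by cosine similarity
print(vector.shape, similar)
```

On a corpus this small the learned vectors are essentially noise; the point is only the API shape: training produces a lookup from each vocabulary word to a real-valued vector, plus similarity queries over those vectors.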

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1591-1600 of 4002 papers

- Combining Character and Word Embeddings for the Detection of Offensive Language in Arabic
- Extrapolating Binder Style Word Embeddings to New Words
- Representation Learning for Unseen Words by Bridging Subwords to Semantic Networks
- Legal-ES: A Set of Large Scale Resources for Spanish Legal Text Processing
- Exploring Bilingual Word Embeddings for Hiligaynon, a Low-Resource Language
- Usability and Accessibility of Bantu Language Dictionaries in the Digital Age: Mobile Access in an Open Environment
- Offensive Language Detection Using Brown Clustering
- Are Word Embeddings Really a Bad Fit for the Estimation of Thematic Fit?
- Urban Dictionary Embeddings for Slang NLP Applications
- CLFD: A Novel Vectorization Technique and Its Application in Fake News Detection

No leaderboard results yet.