
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
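To make the definition concrete, here is a minimal sketch in Python of what such a mapping looks like and why it is useful: the four-dimensional vectors and three-word vocabulary below are invented for illustration, not taken from any trained model, and real embeddings are learned from data with tens to hundreds of dimensions.

```python
import numpy as np

# Toy illustration only: these vectors are hand-written for the example.
# A trained model would learn them from a large corpus.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10, 0.05]),
    "queen": np.array([0.78, 0.70, 0.12, 0.04]),
    "apple": np.array([0.05, 0.10, 0.90, 0.70]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words end up with nearby vectors; unrelated ones do not.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```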

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification; a minimal training sketch is given below.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
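As one illustration of how such embeddings can be trained in practice, the sketch below uses the gensim library's Word2Vec implementation with the skip-gram objective, assuming the gensim 4.x API. The tiny corpus and the parameter values are placeholders chosen for the example; a real run would use millions of tokenized sentences, and nearest-neighbor results on a corpus this small are essentially noise.

```python
from gensim.models import Word2Vec  # pip install gensim (4.x API assumed)

# Placeholder corpus: each sentence is a list of already-tokenized words.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["word2vec", "trains", "on", "raw", "text"],
]

# sg=1 selects the skip-gram objective (predict context words from a target
# word); sg=0 would select CBOW instead.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=3,        # max distance between target and context word
    min_count=1,     # keep every word (real corpora use a higher cutoff)
    sg=1,
)

vector = model.wv["words"]                          # learned embedding
neighbors = model.wv.most_similar("words", topn=3)  # nearest neighbors
```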

Papers

Showing 1621–1630 of 4002 papers

Title (Hype)

- Abusive language in Spanish children and young teenager's conversations: data preparation and short text classification with contextual word embeddings (0)
- Building Sense Representations in Danish by Combining Word Embeddings with Lexical Resources (0)
- Translating Knowledge Representations with Monolingual Word Embeddings: the Case of a Thesaurus on Corporate Non-Financial Reporting (0)
- Building Semantic Grams of Human Knowledge (0)
- Embedding Space Correlation as a Measure of Domain Similarity (0)
- BUCC2020: Bilingual Dictionary Induction using Cross-lingual Embedding (0)
- Embeddings for Named Entity Recognition in Geoscience Portuguese Literature (0)
- TF-IDF Character N-grams versus Word Embedding-based Models for Fine-grained Event Classification: A Preliminary Study (0)
- “A Passage to India”: Pre-trained Word Embeddings for Indian Languages (0)
- SimplifyUR: Unsupervised Lexical Text Simplification for Urdu (0)
Page 163 of 401

Leaderboards

No leaderboard results yet.