
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification. A minimal training sketch is shown below.
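As a rough illustration of how such embeddings are learned, here is a minimal sketch using gensim's Word2Vec on a toy corpus. The corpus, hyperparameters, and variable names below are illustrative assumptions, not taken from any paper listed on this page.

```python
# Minimal Word2Vec sketch using gensim (assumes: pip install gensim).
# The toy corpus and hyperparameters are illustrative only.
from gensim.models import Word2Vec

# A tiny pre-tokenized corpus; real models train on millions of sentences.
toy_corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "co-occurrence", "counts"],
    ["similar", "words", "end", "up", "with", "similar", "vectors"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(
    sentences=toy_corpus,
    vector_size=50,
    window=3,
    min_count=1,
    sg=1,
    epochs=50,
    seed=42,
)

# Each vocabulary word now maps to a dense real-valued vector.
vec = model.wv["embeddings"]  # numpy array of shape (50,)
print(vec.shape)

# Cosine-similarity neighbors in the learned space; noisy on a toy corpus,
# but the same API applies at scale.
print(model.wv.most_similar("embeddings", topn=3))
```

Setting `sg=0` instead would use the CBOW objective, which predicts a word from its surrounding context rather than the reverse; on small corpora the two often behave similarly.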

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 891–900 of 4002 papers

Title | Status | Hype
--- | --- | ---
Leveraging Domain Agnostic and Specific Knowledge for Acronym Disambiguation | — | 0
Cross-lingual alignments of ELMo contextual embeddings | — | 0
A Simple and Efficient Probabilistic Language model for Code-Mixed Text | — | 0
Hate speech detection using static BERT embeddings | — | 0
SAT Based Analogy Evaluation Framework for Persian Word Embeddings | — | 0
A Source-Criticism Debiasing Method for GloVe Embeddings | Code | 1
Multilingual transfer of acoustic word embeddings improves when training on languages related to the target zero-resource language | Code | 0
Clinical Named Entity Recognition using Contextualized Token Representations | — | 0
Mixtures of Deep Neural Experts for Automated Speech Scoring | — | 0
Membership Inference on Word Embedding and Beyond | — | 0
Page 90 of 401

Leaderboards

No leaderboard results yet.