
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
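
To make that mapping concrete, here is a minimal Python sketch: a toy vocabulary, a random embedding matrix standing in for learned vectors, and a cosine-similarity comparison between two words. All names, dimensions, and values are illustrative placeholders, not taken from any particular model.

```python
import numpy as np

# Toy setup: in practice the embedding matrix is learned by a model,
# not random, and dimensions are typically 50-300, not 4.
vocab = {"king": 0, "queen": 1, "apple": 2}
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 4))  # one row per word

def vector(word):
    """Look up a word's embedding row via its vocabulary index."""
    return embeddings[vocab[word]]

def cosine(u, v):
    """Cosine similarity, the usual closeness measure for embeddings."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(vector("king"), vector("queen")))
```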

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
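
As a sketch of one such technique, the snippet below trains a tiny Word2Vec model with the gensim library (assuming gensim 4.x); the corpus and hyperparameters are placeholder values chosen for illustration, and a real model needs a far larger corpus.

```python
from gensim.models import Word2Vec

# Placeholder corpus: a real model needs millions of tokens.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "vectors", "from", "context", "windows"],
]

# Hyperparameter values here are illustrative, not tuned.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=3,        # context window size
    min_count=1,     # keep every word in this toy corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
)

print(model.wv["vectors"].shape)            # (50,)
print(model.wv.most_similar("word", topn=2))
```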

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1141–1150 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| SMM4H Shared Task 2020 - A Hybrid Pipeline for Identifying Prescription Drug Abuse from Twitter: Machine Learning, Deep Learning, and Post-Processing |  | 0 |
| Unmasking Contextual Stereotypes: Measuring and Mitigating BERT’s Gender Bias | Code | 1 |
| Lexical Induction of Morphological and Orthographic Forms for Low-Resourced Languages |  | 0 |
| Go Simple and Pre-Train on Domain-Specific Corpora: On the Role of Training Data for Text Classification |  | 0 |
| Augmenting NLP models using Latent Feature Interpolations |  | 0 |
| A Co-Attentive Cross-Lingual Neural Model for Dialogue Breakdown Detection | Code | 0 |
| Expert Concept-Modeling Ground Truth Construction for Word Embeddings Evaluation in Concept-Focused Domains | Code | 0 |
| Manifold Learning-based Word Representation Refinement Incorporating Global and Local Information |  | 0 |
| Combining Word Embeddings with Bilingual Orthography Embeddings for Bilingual Dictionary Induction |  | 0 |
| CogniVal in Action: An Interface for Customizable Cognitive Word Embedding Evaluation |  | 0 |

No leaderboard results yet.