
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
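As an illustration of the idea, the sketch below trains tiny skip-gram (Word2Vec-style) embeddings on a toy corpus using plain NumPy. The corpus, vector dimensionality, and hyperparameters are arbitrary choices made for readability, and a full softmax is used in place of the negative-sampling or hierarchical-softmax tricks that practical Word2Vec implementations rely on.

```python
# Minimal skip-gram sketch: map each word to a dense vector and train the
# vectors to predict nearby context words. Illustrative only; hyperparameters
# and the toy corpus are assumptions, not values from any particular paper.
import numpy as np

corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "cats and dogs are animals",
]
tokens = [s.split() for s in corpus]
vocab = sorted({w for sent in tokens for w in sent})
word2id = {w: i for i, w in enumerate(vocab)}
V, D, window, lr = len(vocab), 16, 2, 0.05

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # "input" (center-word) vectors
W_out = rng.normal(scale=0.1, size=(V, D))  # "output" (context-word) vectors

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Build (center, context) pairs from a sliding window over each sentence.
pairs = []
for sent in tokens:
    ids = [word2id[w] for w in sent]
    for i, center in enumerate(ids):
        start, stop = max(0, i - window), min(len(ids), i + window + 1)
        pairs.extend((center, ids[j]) for j in range(start, stop) if j != i)

# Skip-gram training with a full softmax (real Word2Vec uses negative sampling).
for epoch in range(200):
    for center, context in pairs:
        v = W_in[center].copy()            # center-word vector
        probs = softmax(W_out @ v)         # P(context word | center word)
        grad = probs
        grad[context] -= 1.0               # gradient of cross-entropy wrt scores
        W_in[center] -= lr * (W_out.T @ grad)
        W_out -= lr * np.outer(grad, v)

def most_similar(word, k=3):
    """Nearest neighbours of `word` by cosine similarity of the learned vectors."""
    q = W_in[word2id[word]]
    sims = W_in @ q / (np.linalg.norm(W_in, axis=1) * np.linalg.norm(q) + 1e-9)
    return [vocab[i] for i in np.argsort(-sims) if vocab[i] != word][:k]

print(most_similar("cat"))
```

After training, words that appear in similar contexts (here, "cat" and "dog") end up with nearby vectors, which the cosine-similarity lookup surfaces.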

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 1371–1380 of 4002

Title | Status | Hype
Evaluating Word Embeddings for Language Acquisition | | 0
Diachronic Embeddings for People in the News | | 0
Task-oriented Domain-specific Meta-Embedding for Text Classification | | 0
Unsupervised Cross-Lingual Part-of-Speech Tagging for Truly Low-Resource Scenarios | | 0
Rethinking Topic Modelling: From Document-Space to Term-Space | | 0
Beyond Adjacency Pairs: Hierarchical Clustering of Long Sequences for Human-Machine Dialogues | | 0
Revisiting Representation Degeneration Problem in Language Modeling | | 0
Machine Translation for English–Inuktitut with Segmentation, Data Acquisition and Pre-Training | | 0
Neutralizing Gender Bias in Word Embeddings with Latent Disentanglement and Counterfactual Generation | | 0
Robust Backed-off Estimation of Out-of-Vocabulary Embeddings | | 0

No leaderboard results yet.