
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
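As a minimal illustration of this mapping, the sketch below stores one real-valued vector per word in a lookup matrix (NumPy assumed; the vocabulary, dimensionality, and random initialization here are made up for illustration, not taken from any of the papers listed on this page):

```python
import numpy as np

# Toy embedding lookup table: each vocabulary word maps to one row
# of a real-valued matrix. All values here are illustrative.
vocab = {"king": 0, "queen": 1, "apple": 2}  # word -> row index
embedding_dim = 4                            # tiny, for readability
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Look up the real-valued vector for a word."""
    return embeddings[vocab[word]]

print(embed("king"))  # a 4-dimensional real vector
```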

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
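For instance, a skip-gram Word2Vec model can be trained on a small corpus with the gensim library. This is a hedged sketch, not the method of any paper listed below: the corpus is a toy example, and the gensim 4.x API is assumed.

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (illustrative only).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "embeddings"],
    ["language", "modeling", "is", "an", "nlp", "task"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # embedding dimensionality
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
)

vector = model.wv["embeddings"]          # the learned 50-d vector
similar = model.wv.most_similar("word")  # nearest neighbors by cosine similarity
```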

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 151–160 of 4,002 papers

Title | Status | Hype
GLOW: Global Weighted Self-Attention Network for Web Search | Code | 1
DiLM: Distilling Dataset into Language Model for Text-level Dataset Distillation | Code | 1
Discovering and Categorising Language Biases in Reddit | Code | 1
Discovering Differences in the Representation of People using Contextualized Semantic Axes | Code | 1
Apples to Apples: A Systematic Evaluation of Topic Models | Code | 1
AI4Bharat-IndicNLP Corpus: Monolingual Corpora and Word Embeddings for Indic Languages | Code | 1
Dynamic Contextualized Word Embeddings | Code | 1
Effective Seed-Guided Topic Discovery by Integrating Multiple Types of Contexts | Code | 1
Can a Fruit Fly Learn Word Embeddings? | Code | 1
Comparative Evaluation of Pretrained Transfer Learning Models on Automatic Short Answer Grading | Code | 1
Page 16 of 401

Leaderboard

No leaderboard results yet.