Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches, many of them neural network-based, that train on an NLP task such as language modeling or document classification.
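
As a concrete illustration of the word-to-vector mapping described above, the following is a minimal sketch that trains Word2Vec embeddings on a toy corpus. It assumes the gensim library (version 4.x); the corpus and hyperparameters are purely illustrative and are not prescribed by this page.

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec
# (assumed library; corpus and hyperparameters are illustrative only).
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "dense", "vectors"],
    ["language", "modeling", "trains", "useful", "representations"],
]

# Train a skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(sentences=corpus, vector_size=50, window=3, min_count=1, sg=1)

# Each vocabulary word is now mapped to a real-valued vector.
vec = model.wv["vectors"]          # numpy array of shape (50,)
print(vec.shape)

# Words that appear in similar contexts end up close in the embedding space.
print(model.wv.most_similar("vectors", topn=3))
```

On a realistic corpus, the nearest neighbors returned by most_similar reflect semantic similarity; on this toy corpus they are essentially arbitrary.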

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1061–1070 of 4002 papers (page 107 of 401)

Title | Status | Hype
PairConnect: A Compute-Efficient MLP Alternative to Attention | - | 0
Deriving Word Vectors from Contextualized Language Models using Topic-Aware Mention Selection | Code | 0
Shape of Elephant: Study of Macro Properties of Word Embeddings Spaces | - | 0
Predicting the Ordering of Characters in Japanese Historical Documents | - | 0
CogAlign: Learning to Align Textual Neural Representations to Cognitive Language Processing Signals | Code | 0
Case Studies on using Natural Language Processing Techniques in Customer Relationship Management Software | - | 0
Low-Dimensional Structure in the Space of Language Representations is Reflected in Brain Responses | Code | 0
Denoising Word Embeddings by Averaging in a Shared Space | - | 0
A General Method for Event Detection on Social Media | - | 0
Evaluating Word Embeddings with Categorical Modularity | Code | 0

No leaderboard results yet.