
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
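
For illustration, here is a minimal sketch (not from the page) of what such a mapping looks like: each word in a small, hypothetical vocabulary is assigned a dense real-valued vector, and semantic similarity between words is measured as geometric closeness between their vectors.

```python
import numpy as np

# Toy illustration: map each vocabulary word to a dense real-valued vector.
# The vocabulary, dimensionality, and random initialization are all
# hypothetical; real embeddings are learned from data, not sampled randomly.
vocab = ["king", "queen", "man", "woman"]
dim = 8
rng = np.random.default_rng(0)
embeddings = {word: rng.normal(size=dim) for word in vocab}

def cosine(u, v):
    """Cosine similarity: the standard closeness measure in embedding spaces."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))
```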

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
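
As a concrete example of one such technique, the sketch below trains skip-gram Word2Vec embeddings with the gensim library; the toy corpus and hyperparameter values are illustrative choices, not taken from the page.

```python
# Minimal Word2Vec training sketch (assumes `pip install gensim`).
from gensim.models import Word2Vec

# Tiny illustrative corpus: a list of tokenized sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "should", "get", "similar", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=100,  # dimensionality of the embedding space
    window=5,         # context window around each target word
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
)

vector = model.wv["embeddings"]        # the learned 100-dimensional vector
print(model.wv.most_similar("words"))  # nearest neighbors by cosine similarity
```

Pretrained vectors (including GloVe models such as glove-wiki-gigaword-100) can be loaded through the same gensim interface via its downloader module.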

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 911–920 of 4,002 papers

Title | Status | Hype
CogAlign: Learning to Align Textual Neural Representations to Cognitive Language Processing Signals | Code | 0
Low-Dimensional Structure in the Space of Language Representations is Reflected in Brain Responses | Code | 0
Case Studies on using Natural Language Processing Techniques in Customer Relationship Management Software | – | 0
Obtaining Better Static Word Embeddings Using Contextual Embedding Models | Code | 1
Combining Static Word Embeddings and Contextual Representations for Bilingual Lexicon Induction | Code | 1
Denoising Word Embeddings by Averaging in a Shared Space | – | 0
A General Method for Event Detection on Social Media | – | 0
Evaluating Word Embeddings with Categorical Modularity | Code | 0
Looking for a Role for Word Embeddings in Eye-Tracking Features Prediction: Does Semantic Similarity Help? | – | 0
Experiments on a Guarani Corpus of News and Social Media | – | 0

No leaderboard results yet.