
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
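
As a concrete illustration of this mapping, here is a minimal sketch in Python. The three-dimensional vectors are made-up toy values, not output from a trained model; real embeddings typically have 50 to 300 dimensions.

```python
import numpy as np

# Toy embedding table: each word maps to a vector of real numbers.
# These values are invented for illustration; a trained model learns them.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.2]),
    "apple": np.array([0.1, 0.9, 0.8]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1.0 mean similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Words used in similar contexts should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.98
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.41
```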

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural or count-based approaches that train on an NLP task such as language modeling or document classification.
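
The sketch below shows one common way to train such embeddings, using the Word2Vec skip-gram model from the gensim library (gensim and the toy corpus are assumptions made for illustration; the page itself does not prescribe a toolkit).

```python
# A minimal Word2Vec training sketch, assuming gensim is installed
# (pip install gensim). The three-sentence corpus is a toy example.
from gensim.models import Word2Vec

corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "farmer", "grows", "an", "apple"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the embedding vectors
    window=3,        # context window size on each side of the target word
    min_count=1,     # keep every word (the corpus is tiny)
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=100,      # many passes, since the corpus is so small
)

vec = model.wv["king"]                # the learned 50-dim vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbours in embedding space
```

With a realistic corpus, words that appear in similar contexts (such as "king" and "queen" here) end up near each other in the learned vector space.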

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1121–1130 of 4,002 papers

Title | Status | Hype
Words with Consistent Diachronic Usage Patterns are Learned Earlier: A Computational Analysis Using Temporally Aligned Word Embeddings | Code | 0
Deep Clustering with Measure Propagation | | 0
Group-Sparse Matrix Factorization for Transfer Learning of Word Embeddings | | 0
From Fully Trained to Fully Random Embeddings: Improving Neural Machine Translation with Compact Word Embedding Tables | | 0
Sentence Alignment with Parallel Documents Facilitates Biomedical Machine Translation | Code | 0
Frequency-based Distortions in Contextualized Word Embeddings | | 0
A multilabel approach to morphosyntactic probing | | 0
Embodying Pre-Trained Word Embeddings Through Robot Actions | | 0
Multi-source Neural Topic Modeling in Multi-view Embedding Spaces | Code | 0
"Wikily" Supervised Neural Translation Tailored to Cross-Lingual Tasks | Code | 0

No leaderboard results yet.