Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
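
As a minimal illustration of this mapping (a toy sketch, not drawn from any paper listed on this page), the snippet below builds a small random embedding matrix and compares two word vectors by cosine similarity; the vocabulary, the 4-dimensional size, and the `embed`/`cosine` helpers are all hypothetical choices for demonstration:

```python
import numpy as np

# Toy vocabulary mapped to row indices of an embedding matrix.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}

# Random dense vectors stand in for learned embeddings here;
# a real model would learn these values from text.
rng = np.random.default_rng(0)
embedding_matrix = rng.normal(size=(len(vocab), 4))  # 4-dimensional vectors

def embed(word):
    """Look up the real-valued vector for a word."""
    return embedding_matrix[vocab[word]]

def cosine(u, v):
    """Cosine similarity between two word vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))                          # a 4-d vector of real numbers
print(cosine(embed("king"), embed("queen")))  # similarity in embedding space
```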

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
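
As a hedged sketch of one such technique, the example below trains a skip-gram Word2Vec model with the gensim library (assuming gensim 4.x is installed) on a toy corpus; the corpus and all hyperparameter values are illustrative assumptions, and real embeddings are trained on much larger text collections:

```python
from gensim.models import Word2Vec  # assumes gensim >= 4.0

# Tiny illustrative corpus: a list of tokenized sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# Skip-gram (sg=1): predict surrounding context words from a center word.
# Hyperparameters here are arbitrary demo values.
model = Word2Vec(sentences, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=50)

vec = model.wv["embeddings"]  # a 50-dimensional real-valued vector
print(model.wv.most_similar("embeddings", topn=3))
```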

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 891–900 of 4002 papers

Title | Status | Hype
Monitoring geometrical properties of word embeddings for detecting the emergence of new topics | | 0
My House, My Rules: Learning Tidying Preferences with Graph Neural Networks | | 0
Leveraging Advantages of Interactive and Non-Interactive Models for Vector-Based Cross-Lingual Information Retrieval | | 0
UnClE: Explicitly Leveraging Semantic Similarity to Reduce the Parameters of Word Embeddings | | 0
On the Cross-lingual Transferability of Contextualized Sense Embeddings | | 0
Named Entity Recognition in the Romanian Legal Domain | Code | 0
Do not neglect related languages: The case of low-resource Occitan cross-lingual word embeddings | | 0
HOTTER: Hierarchical Optimal Topic Transport with Explanatory Context Representations | Code | 0
Decoding Word Embeddings with Brain-Based Semantic Features | | 0
Increasing Sentence-Level Comprehension Through Text Classification of Epistemic Functions | | 0

No leaderboard results yet.