
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
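To make the mapping concrete, here is a minimal sketch in Python (the vocabulary, dimensionality, and random vectors are illustrative assumptions, not taken from any particular model): each word indexes a row of a real-valued embedding matrix, and relatedness between words is measured on the vectors.

```python
import numpy as np

# Toy vocabulary and embedding matrix (illustrative assumptions):
# each word maps to a dense 4-dimensional real-valued vector.
vocab = {"king": 0, "queen": 1, "apple": 2}
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 4))  # one row per word

def embed(word: str) -> np.ndarray:
    """Look up the vector assigned to a word."""
    return embeddings[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual relatedness measure on embeddings."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))                              # the vector for "king"
print(cosine(embed("king"), embed("queen")))      # similarity of two words
```

In a trained model the rows are learned rather than random, so cosine similarity between vectors comes to reflect semantic relatedness between words.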

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
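As a hedged sketch of one such technique, the following trains a small skip-gram Word2Vec model with the gensim library; the toy corpus and hyperparameter values are assumptions chosen for illustration, not a prescribed setup.

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus (an assumption); real training uses
# millions of tokenized sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "apple", "fell", "from", "the", "tree"],
]

# Train a skip-gram Word2Vec model (gensim 4.x API).
model = Word2Vec(
    sentences,
    vector_size=50,  # embedding dimensionality
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

print(model.wv["king"])               # learned vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbors by cosine similarity
```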

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 1111–1120 of 4002 (page 112 of 401)

- Diachronic Embeddings for People in the News (Hype: 0)
- An Exploratory Analysis on the Explanatory Potential of Embedding-Based Measures of Semantic Transparency for Malay Word Recognition (Hype: 0)
- A bilingual approach to specialised adjectives through word embeddings in the karstology domain (Hype: 0)
- Distilled embedding: non-linear embedding factorization using knowledge distillation (Hype: 0)
- Distilled Wasserstein Learning for Word Embedding and Topic Modeling (Hype: 0)
- Distributed Representations for Unsupervised Semantic Role Labeling (Hype: 0)
- Diachronic degradation of language models: Insights from social media (Hype: 0)
- Beyond Offline Mapping: Learning Cross-lingual Word Embeddings through Context Anchoring (Hype: 0)
- An Exploration of Word Embedding Initialization in Deep-Learning Tasks (Hype: 0)
- D-Graph: AI-Assisted Design Concept Exploration Graph (Hype: 0)

Leaderboard

No leaderboard results yet.