
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
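As a minimal sketch of the idea, the snippet below trains a toy skip-gram Word2Vec model with the gensim library (an assumption; this page does not prescribe any particular implementation) and reads off the learned real-valued vectors. The tiny corpus is illustrative only.

# A minimal sketch of learning word embeddings with gensim's Word2Vec
# (assumes gensim is installed; the toy corpus below is illustrative only).
from gensim.models import Word2Vec

# Tiny tokenized corpus; a realistic model needs millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "vectors", "from", "local", "context"],
    ["glove", "uses", "global", "co-occurrence", "statistics"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

# Each vocabulary word is now mapped to a vector of real numbers.
vec = model.wv["vectors"]          # numpy array of shape (50,)
print(vec[:5])

# On a real corpus, nearby vectors correspond to distributionally similar words.
print(model.wv.most_similar("word", topn=3))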

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3461–3470 of 4002 papers

Title | Status | Hype
Eliciting Explicit Knowledge From Domain Experts in Direct Intrinsic Evaluation of Word Embeddings for Specialized Domains | Code | 0
Global Textual Relation Embedding for Relational Understanding | Code | 0
Asynchronous Training of Word Embeddings for Large Text Corpora | Code | 0
Beyond One-Hot-Encoding: Injecting Semantics to Drive Image Classifiers | Code | 0
Unsupervised Open Relation Extraction | Code | 0
Unsupervised Parallel Sentence Extraction with Parallel Segment Detection Helps Machine Translation | Code | 0
Interpretable Segmentation of Medical Free-Text Records Based on Word Embeddings | Code | 0
Clustering-Based Article Identification in Historical Newspapers | Code | 0
InceptionXML: A Lightweight Framework with Synchronized Negative Sampling for Short Text Extreme Classification | Code | 0
Training Cross-Lingual embeddings for Setswana and Sepedi | Code | 0
Page 347 of 401

No leaderboard results yet.