SOTAVerified

Representation Learning

Representation Learning is a process in machine learning where algorithms extract meaningful patterns from raw data to create representations that are easier to understand and process. These representations can be designed for interpretability, reveal hidden features, or be used for transfer learning. They are valuable across many fundamental machine learning tasks like image classification and retrieval.

Deep neural networks can be viewed as representation learning models: they encode their input into an intermediate representation, typically by projecting it into a different subspace. These representations are then usually passed to a simple predictor, such as a linear classifier, that is trained for the task at hand.
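This "encode, then classify linearly" pattern can be sketched in plain NumPy. The random-projection encoder below is a hypothetical stand-in for the frozen penultimate layer of a pretrained network, and the synthetic blob data is only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "raw" data: two Gaussian blobs in 20-D, with binary labels.
X = np.concatenate([rng.normal(-1.0, 1.0, size=(100, 20)),
                    rng.normal(+1.0, 1.0, size=(100, 20))])
y = np.concatenate([np.zeros(100), np.ones(100)])

# Frozen "encoder": a random projection + ReLU, standing in for the
# penultimate layer of a pretrained network (hypothetical weights).
W_enc = rng.normal(size=(20, 8))
Z = np.maximum(X @ W_enc, 0.0)                      # representations, shape (200, 8)
Z = (Z - Z.mean(axis=0)) / (Z.std(axis=0) + 1e-8)   # standardise the features

# Linear classifier on top of the frozen representations
# (logistic regression trained by gradient descent).
w, b = np.zeros(8), 0.0
for _ in range(500):
    logits = np.clip(Z @ w + b, -30, 30)   # clip for numerical stability
    p = 1.0 / (1.0 + np.exp(-logits))      # predicted P(y = 1)
    w -= 0.3 * Z.T @ (p - y) / len(y)
    b -= 0.3 * np.mean(p - y)

acc = np.mean(((Z @ w + b) > 0) == (y == 1))
```

On this easily separable toy problem the accuracy ends up near 1; the point is only that the classifier itself stays linear while the representation does the heavy lifting.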

Representation learning can be divided into:

  • Supervised representation learning: learning representations on task A using annotated data, then reusing them to solve task B
  • Unsupervised representation learning: learning representations on a task in an unsupervised way (from label-free data). These representations are then used to address downstream tasks, reducing the need for annotated data when learning new tasks. Powerful models like GPT and BERT leverage unsupervised representation learning to tackle language tasks.
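As a label-free illustration of the second category, here is a minimal linear autoencoder in NumPy. The synthetic data and every hyperparameter are assumptions made for the sketch; the model learns a 2-D code for 10-D inputs purely from reconstruction error, without ever seeing a label:

```python
import numpy as np

rng = np.random.default_rng(1)

# Unlabelled data lying near a 2-D subspace of a 10-D space.
latent = rng.normal(size=(300, 2))
B = rng.normal(size=(2, 10)) / np.sqrt(10)
X = latent @ B + 0.05 * rng.normal(size=(300, 10))

# Linear autoencoder: encode to 2-D, decode back, minimise MSE.
W_enc = 0.1 * rng.normal(size=(10, 2))
W_dec = 0.1 * rng.normal(size=(2, 10))
lr = 0.1
for _ in range(3000):
    Z = X @ W_enc                       # 2-D codes (the learned representation)
    err = Z @ W_dec - X                 # reconstruction error
    g_dec = Z.T @ err / len(X)          # gradient of MSE w.r.t. decoder
    g_enc = X.T @ (err @ W_dec.T) / len(X)  # gradient w.r.t. encoder
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

mse = np.mean((X @ W_enc @ W_dec - X) ** 2)
```

After training, the reconstruction error is a small fraction of the data's power, because the 2-D code has captured the subspace the data lives in; the same code could then be fed to a downstream predictor.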

More recently, self-supervised learning (SSL) has become one of the main drivers of unsupervised representation learning in fields like computer vision and NLP.
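As a concrete illustration, the contrastive InfoNCE objective used by many SSL methods (e.g. SimCLR-style training) fits in a few lines of NumPy. The random embeddings below are stand-ins for encoder outputs of two augmented views of the same batch:

```python
import numpy as np

rng = np.random.default_rng(2)

def info_nce(z1, z2, temperature=0.1):
    """InfoNCE loss between two batches of embeddings, where z1[i] and
    z2[i] come from two augmented views of the same example (positives)
    and every other pairing acts as a negative."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature               # pairwise cosine similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))            # positives sit on the diagonal

z = rng.normal(size=(32, 16))                        # stand-in encoder outputs
aligned = info_nce(z, z)                             # views agree -> low loss
mismatched = info_nce(z, rng.normal(size=(32, 16)))  # unrelated views -> high loss
```

Minimising this loss pulls the two views of each example together and pushes all other pairs apart, which is what drives the encoder to learn useful representations without labels.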


(Image credit: Visualizing and Understanding Convolutional Networks)

Papers


  • CARL: Aggregated Search with Context-Aware Module Embedding Learning
  • Representation Learning with Autoencoders for Electronic Health Records: A Comparative Study
  • A Prior Guided Adversarial Representation Learning and Hypergraph Perceptual Network for Predicting Abnormal Connections of Alzheimer's Disease
  • Adversarial Representation Sharing: A Quantitative and Secure Collaborative Learning Framework
  • Label Noise Robust Image Representation Learning based on Supervised Variational Autoencoders in Remote Sensing
  • Discrimination-Aware Mechanism for Fine-Grained Representation Learning
  • Label-guided Learning for Text Classification
  • Label-efficient Time Series Representation Learning: A Review
  • Discriminability-enforcing loss to improve representation learning
  • Label Distribution Learning via Implicit Distribution Representation
  • Label Distribution Learning Forests
  • Representation Learning With Hidden Unit Clustering For Low Resource Speech Applications
  • Representation Learning with Information Theory for COVID-19 Detection
  • Discriminability Distillation in Group Representation Learning
  • Representation Learning with Multisets
  • Label Consistent Transform Learning for Hyperspectral Image Classification
  • Label-Consistency based Graph Neural Networks for Semi-supervised Node Classification
  • Label Aware Speech Representation Learning For Language Identification
  • Representation Learning with Video Deep InfoMax
  • Improve Variational Autoencoder for Text Generation with Discrete Latent Bottleneck
  • Label-Aware Graph Convolutional Networks
  • Discrete Infomax Codes for Supervised Representation Learning
  • Representations for Stable Off-Policy Reinforcement Learning
  • Careful Selection and Thoughtful Discarding: Graph Explicit Pooling Utilizing Discarded Nodes
  • Representation Transfer for Differentially Private Drug Sensitivity Prediction
  • A Preliminary Study of Disentanglement With Insights on the Inadequacy of Metrics
  • Kriformer: A Novel Spatiotemporal Kriging Approach Based on Graph Transformers
  • KQGC: Knowledge Graph Embedding with Smoothing Effects of Graph Convolutions for Recommendation
  • Reproducible, incremental representation learning with Rosetta VAE
  • Reprogramming Foundational Large Language Models (LLMs) for Enterprise Adoption for Spatio-Temporal Forecasting Applications: Unveiling a New Era in Copilot-Guided Cross-Modal Time Series Representation Learning
  • Reprogramming Language Models for Molecular Representation Learning
  • Reprogramming Pretrained Language Models for Protein Sequence Representation Learning
  • Repurposing Foundation Model for Generalizable Medical Time Series Classification
  • Discrete Audio Representation as an Alternative to Mel-Spectrograms for Speaker and Speech Recognition
  • KoreALBERT: Pretraining a Lite BERT Model for Korean Language Understanding
  • Koopman-Equivariant Gaussian Processes
  • Research on Domain Information Mining and Theme Evolution of Scientific Papers
  • K-ON: Stacking Knowledge On the Head Layer of Large Language Model
  • Research Team Identification Based on Representation Learning of Academic Heterogeneous Information Network
  • Discovery of Visual Semantics by Unsupervised and Self-Supervised Representation Learning
  • Residual Connections Encourage Iterative Inference
  • Career Path Prediction using Resume Representation Learning and Skill-based Matching
  • Residual Contrastive Learning for Image Reconstruction: Learning Transferable Representations from Noisy Images
  • Residual Contrastive Learning: Unsupervised Representation Learning from Residuals
  • Knowledge Router: Learning Disentangled Representations for Knowledge Graphs
  • Residual Relaxation for Multi-view Representation Learning
  • Re-Simulation-based Self-Supervised Learning for Pre-Training Foundation Models
  • Discovery and Separation of Features for Invariant Representation Learning
  • Knowledge Representation with Conceptual Spaces

Benchmark Results

  #  Model        Metric  Claimed  Verified  Status
  1  SciNCL       Avg.    81.8     -         Unverified
  2  SPECTER      Avg.    80       -         Unverified
  3  Citeomatic   Avg.    76       -         Unverified
  4  Sci-DeCLUTR  Avg.    66.6     -         Unverified
  5  SciBERT      Avg.    59.6     -         Unverified
  6  BioBERT      Avg.    58.8     -         Unverified
  7  CiteBERT     Avg.    58.8     -         Unverified

  #  Model                        Metric        Claimed  Verified  Status
  1  top_model_weights_with_3d_2  1:1 Accuracy  0.75     -         Unverified

  #  Model      Metric        Claimed  Verified  Status
  1  Resnet 18  Accuracy (%)  97.05    -         Unverified

  #  Model                  Metric    Claimed  Verified  Status
  1  Morphological Network  Accuracy  97.3     -         Unverified

  #  Model                   Metric            Claimed  Verified  Status
  1  Max Margin Contrastive  Silhouette Score  0.56     -         Unverified