SOTAVerified

Data-free Knowledge Distillation
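
Data-free knowledge distillation (DFKD) compresses a pretrained teacher network into a student without access to the teacher's original training data, typically by synthesizing surrogate inputs (e.g. with a generator or model inversion) and training the student to match the teacher's outputs on them. As a minimal, illustrative sketch of the generator-based variant (the architectures, temperature, and adversarial generator objective below are our assumptions, not the recipe of any specific paper listed here):

```python
# Minimal sketch of an adversarial, generator-based DFKD training step
# (PyTorch). Hyperparameters and the exact losses are illustrative
# assumptions, not taken from any paper on this page.
import torch
import torch.nn.functional as F

def dfkd_step(generator, teacher, student, opt_g, opt_s,
              batch_size=64, z_dim=100, temperature=4.0):
    T = temperature
    teacher.eval()  # the teacher stays frozen throughout

    # --- Student update: match the teacher's soft labels on synthetic data.
    z = torch.randn(batch_size, z_dim)
    x = generator(z).detach()  # don't train the generator in this step
    with torch.no_grad():
        t_logits = teacher(x)
    s_logits = student(x)
    kd_loss = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                       F.softmax(t_logits / T, dim=1),
                       reduction="batchmean") * (T * T)
    opt_s.zero_grad()
    kd_loss.backward()
    opt_s.step()

    # --- Generator update (adversarial variant): seek inputs on which the
    # student still disagrees with the teacher. Freeze student parameters so
    # only the generator receives this gradient.
    for p in student.parameters():
        p.requires_grad_(False)
    x = generator(torch.randn(batch_size, z_dim))
    with torch.no_grad():
        t_logits = teacher(x)
    disagreement = F.kl_div(F.log_softmax(student(x) / T, dim=1),
                            F.softmax(t_logits / T, dim=1),
                            reduction="batchmean")
    g_loss = -disagreement  # maximize student-teacher disagreement
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
    for p in student.parameters():
        p.requires_grad_(True)
    return kd_loss.item(), g_loss.item()
```

In broad terms, the papers listed below differ mainly in how the synthetic inputs are produced and regularized (e.g. BatchNorm-statistics matching, contrastive model inversion, diffusion models) and in the setting they target (federated learning, continual learning, model stealing, segmentation, and so on).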

Papers

Showing 1–50 of 75 papers

Title | Status | Hype
Prompt2Model: Generating Deployable Models from Natural Language Instructions | Code | 4
Data-Free Knowledge Distillation for Deep Neural Networks | Code | 2
One-shot Federated Learning via Synthetic Distiller-Distillate Communication | Code | 1
Diffusion-Driven Data Replay: A Novel Approach to Combat Forgetting in Federated Class Continual Learning | Code | 1
MAZE: Data-Free Model Stealing Attack Using Zeroth-Order Gradient Estimation | Code | 1
Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation? | Code | 1
DKDM: Data-Free Knowledge Distillation for Diffusion Models with Any Architecture | Code | 1
Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning | Code | 1
EchoDFKD: Data-Free Knowledge Distillation for Cardiac Ultrasound Segmentation using Synthetic Data | Code | 1
NAYER: Noisy Layer Data Generation for Efficient and Effective Data-free Knowledge Distillation | Code | 1
Training Generative Adversarial Networks in One Stage | Code | 1
Small Scale Data-Free Knowledge Distillation | Code | 1
Contrastive Model Inversion for Data-Free Knowledge Distillation | Code | 1
Data-Free Knowledge Distillation for Heterogeneous Federated Learning | Code | 1
ZeroGen: Efficient Zero-shot Learning via Dataset Generation | Code | 1
Up to 100× Faster Data-free Knowledge Distillation | Code | 1
Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay | Code | 1
Data-Free Knowledge Distillation via Feature Exchange and Activation Region Constraint | Code | 1
Revisiting Data-Free Knowledge Distillation with Poisoned Teachers | Code | 1
Relation-Guided Adversarial Learning for Data-free Knowledge Transfer | Code | 1
ProGen: Progressive Zero-shot Dataset Generation via In-context Feedback | Code | 1
Does Training with Synthetic Data Truly Protect Privacy? | Code | 0
AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation | Code | 0
CDFKD-MFS: Collaborative Data-free Knowledge Distillation via Multi-level Feature Sharing | Code | 0
Cross-feature Contrastive Loss for Decentralized Deep Learning on Heterogeneous Data | Code | 0
Customizing Synthetic Data for Data-Free Student Learning | Code | 0
DAD++: Improved Data-free Test Time Adversarial Defense | Code | 0
Data-free Knowledge Distillation for Segmentation using Data-Enriching GAN | Code | 0
Data-free Knowledge Distillation for Fine-grained Visual Categorization | Code | 0
Data-Free Knowledge Distillation for Image Super-Resolution | Code | 0
Handling Data Heterogeneity in Federated Learning via Knowledge Distillation and Fusion | Code | 0
Dynamic Data-Free Knowledge Distillation by Easy-to-Hard Learning Strategy | Code | 0
Hybrid Data-Free Knowledge Distillation | Code | 0
Knowledge Extraction with No Observable Data | Code | 0
Large-Scale Data-Free Knowledge Distillation for ImageNet via Multi-Resolution Data Generation | Code | 0
Mosaic: Data-Free Knowledge Distillation via Mixture-of-Experts for Heterogeneous Distributed Environments | Code | 0
Robustness and Diversity Seeking Data-Free Knowledge Distillation | Code | 0
Synthetic data generation method for data-free knowledge distillation in regression neural networks | Code | 0
Teacher as a Lenient Expert: Teacher-Agnostic Data-Free Knowledge Distillation | Code | 0
Towards Data-Free Domain Generalization | Code | 0
Towards Effective Data-Free Knowledge Distillation via Diverse Diffusion Augmentation | Code | 0
HFedCKD: Toward Robust Heterogeneous Federated Learning via Data-free Knowledge Distillation and Two-way Contrast | - | 0
Data-Free Distillation of Language Model by Text-to-Text Transfer | - | 0
Dense Depth Distillation with Out-of-Distribution Simulated Images | - | 0
Distribution Shift Matters for Knowledge Distillation with Webly Collected Images | - | 0
Conditional Generative Data-free Knowledge Distillation | - | 0
Knowledge Migration Framework for Smart Contract Vulnerability Detection | - | 0
Completely Heterogeneous Federated Learning | - | 0
Learning to Retain while Acquiring: Combating Distribution-Shift in Adversarial Data-Free Knowledge Distillation | - | 0
Leveraging Foundation Models for Efficient Federated Learning in Resource-restricted Edge Networks | - | 0

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | GOLD (T5-base) | Accuracy | 91.7 | - | Unverified
2 | ZeroGen (T5-base) | Accuracy | 88.5 | - | Unverified
3 | ProGen (T5-base) | Accuracy | 85.9 | - | Unverified
4 | Prompt2Model (T5-base) | Accuracy | 62.2 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | GOLD (T5-base) | Exact Match | 75.2 | - | Unverified
2 | Prompt2Model (T5-base) | Exact Match | 74.4 | - | Unverified
3 | ZeroGen (T5-base) | Exact Match | 69.4 | - | Unverified
4 | ProGen (T5-base) | Exact Match | 68.1 | - | Unverified
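
All of the numbers above are claimed results that have not yet been independently reproduced on this page. For reference, a minimal sketch of how exact-match scoring is commonly computed for generated text (the SQuAD-style normalization rules below are a widespread convention and an assumption here, not necessarily the evaluation code behind these figures):

```python
import re
import string

def normalize(text: str) -> str:
    # Common SQuAD-style normalization: lowercase, strip punctuation,
    # drop articles, collapse whitespace. Assumed convention, not the
    # leaderboard's own evaluation code.
    text = text.lower()
    text = "".join(ch for ch in text if ch not in set(string.punctuation))
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())

def exact_match(prediction: str, reference: str) -> float:
    # 1.0 if the normalized strings are identical, else 0.0.
    return float(normalize(prediction) == normalize(reference))

# e.g. exact_match("The Eiffel Tower!", "eiffel tower") -> 1.0
```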