SOTAVerified

Data-free Knowledge Distillation

Papers

Showing 1–10 of 75 papers

Title | Status | Hype
Mosaic: Data-Free Knowledge Distillation via Mixture-of-Experts for Heterogeneous Distributed Environments | Code | 0
CAE-DFKD: Bridging the Transferability Gap in Data-Free Knowledge Distillation | — | 0
Adversarial Curriculum Graph-Free Knowledge Distillation for Graph Neural Networks | — | 0
HFedCKD: Toward Robust Heterogeneous Federated Learning via Data-free Knowledge Distillation and Two-way Contrast | — | 0
Does Training with Synthetic Data Truly Protect Privacy? | Code | 0
Hybrid Data-Free Knowledge Distillation | Code | 0
Relation-Guided Adversarial Learning for Data-free Knowledge Transfer | Code | 1
Knowledge Migration Framework for Smart Contract Vulnerability Detection | — | 0
One-shot Federated Learning via Synthetic Distiller-Distillate Communication | Code | 1
Large-Scale Data-Free Knowledge Distillation for ImageNet via Multi-Resolution Data Generation | Code | 0

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | GOLD (T5-base) | Exact Match | 75.2 | — | Unverified
2 | Prompt2Model (T5-base) | Exact Match | 74.4 | — | Unverified
3 | ZeroGen (T5-base) | Exact Match | 69.4 | — | Unverified
4 | ProGen (T5-base) | Exact Match | 68.1 | — | Unverified