SOTAVerified

Data-free Knowledge Distillation

Papers

Showing 41–50 of 75 papers

| Title | Status | Hype |
| --- | --- | --- |
| Towards Effective Data-Free Knowledge Distillation via Diverse Diffusion Augmentation | Code | 0 |
| HFedCKD: Toward Robust Heterogeneous Federated Learning via Data-free Knowledge Distillation and Two-way Contrast | | 0 |
| Data-Free Distillation of Language Model by Text-to-Text Transfer | | 0 |
| Dense Depth Distillation with Out-of-Distribution Simulated Images | | 0 |
| Distribution Shift Matters for Knowledge Distillation with Webly Collected Images | | 0 |
| Conditional Generative Data-free Knowledge Distillation | | 0 |
| Knowledge Migration Framework for Smart Contract Vulnerability Detection | | 0 |
| Completely Heterogeneous Federated Learning | | 0 |
| Learning to Retain while Acquiring: Combating Distribution-Shift in Adversarial Data-Free Knowledge Distillation | | 0 |
| Leveraging Foundation Models for Efficient Federated Learning in Resource-restricted Edge Networks | | 0 |

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | GOLD (T5-base) | Accuracy | 91.7 | | Unverified |
| 2 | ZeroGen (T5-base) | Accuracy | 88.5 | | Unverified |
| 3 | ProGen (T5-base) | Accuracy | 85.9 | | Unverified |
| 4 | Prompt2Model (T5-base) | Accuracy | 62.2 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | GOLD (T5-base) | Exact Match | 75.2 | | Unverified |
| 2 | Prompt2Model (T5-base) | Exact Match | 74.4 | | Unverified |
| 3 | ZeroGen (T5-base) | Exact Match | 69.4 | | Unverified |
| 4 | ProGen (T5-base) | Exact Match | 68.1 | | Unverified |