SOTAVerified

Data-free Knowledge Distillation
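For context, data-free knowledge distillation trains a student model to mimic a teacher model without access to the teacher's original training data, typically by feeding both models synthesized inputs and matching the student's outputs to the teacher's soft labels. A minimal sketch with toy linear-softmax models (all shapes, learning rates, and names here are illustrative assumptions, not taken from any listed paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "teacher": a fixed linear map followed by softmax.
W_teacher = rng.normal(size=(4, 3))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def teacher(x):
    return softmax(x @ W_teacher)

# "Student" starts from random weights; no real training data is used.
W_student = rng.normal(size=(4, 3))

lr = 0.5
for step in range(2000):
    # Data-free step: synthesize inputs instead of reading a dataset.
    x = rng.normal(size=(32, 4))
    t = teacher(x)                 # soft labels from the teacher
    s = softmax(x @ W_student)     # student predictions
    # Gradient of cross-entropy between teacher and student distributions.
    grad = x.T @ (s - t) / len(x)
    W_student -= lr * grad

# After distillation the student should agree with the teacher on fresh inputs.
x_test = rng.normal(size=(256, 4))
agreement = np.mean(
    teacher(x_test).argmax(axis=1) == softmax(x_test @ W_student).argmax(axis=1)
)
```

The papers indexed below vary mainly in how the synthetic inputs are produced (e.g. adversarial generators, out-of-distribution language data, or open-world samples) rather than in this basic matching objective.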

Papers

Showing 71–75 of 75 papers

Title | Status | Hype
FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning | | 0
Adversarial Curriculum Graph-Free Knowledge Distillation for Graph Neural Networks | | 0
Generative Adversarial Simulator | | 0
GOLD: Generalized Knowledge Distillation via Out-of-Distribution-Guided Language Data Generation | | 0
Sampling to Distill: Knowledge Transfer from Open-World Data | | 0
Page 8 of 8

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | GOLD (T5-base) | Accuracy | 91.7 | | Unverified
2 | ZeroGen (T5-base) | Accuracy | 88.5 | | Unverified
3 | ProGen (T5-base) | Accuracy | 85.9 | | Unverified
4 | Prompt2Model (T5-base) | Accuracy | 62.2 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | GOLD (T5-base) | Exact Match | 75.2 | | Unverified
2 | Prompt2Model (T5-base) | Exact Match | 74.4 | | Unverified
3 | ZeroGen (T5-base) | Exact Match | 69.4 | | Unverified
4 | ProGen (T5-base) | Exact Match | 68.1 | | Unverified