Improving Question Answering Performance Using Knowledge Distillation and Active Learning (Sep 26, 2021). Tags: Active Learning, Knowledge Distillation
Improving Respiratory Sound Classification with Architecture-Agnostic Knowledge Distillation from Ensembles (May 28, 2025) [code available]. Tags: Knowledge Distillation, Sound Classification
Infusing Sequential Information into Conditional Masked Translation Model with Self-Review Mechanism (Oct 19, 2020) [code available]. Tags: Decoder, Knowledge Distillation
Attention-Based Depth Distillation with 3D-Aware Positional Encoding for Monocular 3D Object Detection (Nov 30, 2022) [code available]. Tags: 3D Object Detection, Depth Estimation
DSMix: Distortion-Induced Sensitivity Map Based Pre-training for No-Reference Image Quality Assessment (Jul 4, 2024) [code available]. Tags: Data Augmentation, Image Quality Assessment
DSG-KD: Knowledge Distillation from Domain-Specific to General Language Models (Sep 23, 2024) [code available]. Tags: Knowledge Distillation, Transfer Learning
DS_FusionNet: Dynamic Dual-Stream Fusion with Bidirectional Knowledge Distillation for Plant Disease Recognition (Apr 29, 2025) [code available]. Tags: Fine-Grained Image Classification, Image Classification
Improving Generalizability of Distilled Self-Supervised Speech Processing Models Under Distorted Settings (Oct 14, 2022) [code available]. Tags: Knowledge Distillation
Improving Knowledge Distillation via Transferring Learning Ability (Apr 24, 2023) [code available]. Tags: Knowledge Distillation
DROP: Poison Dilution via Knowledge Distillation for Federated Learning (Feb 10, 2025) [code available]. Tags: Data Poisoning, Federated Learning
Improving End-to-End Speech Translation by Imitation-Based Knowledge Distillation with Synthetic Transcripts (Jul 17, 2023) [code available]. Tags: Automatic Speech Translation, Imitation Learning
Improving Robustness by Enhancing Weak Subnets (Jan 30, 2022) [code available]. Tags: Adversarial Robustness, Data Augmentation
Ensemble Diverse Hypotheses and Knowledge Distillation for Unsupervised Cross-Subject Adaptation (Apr 15, 2022) [code available]. Tags: Activity Recognition, Domain Adaptation
Ensemble Knowledge Distillation for Learning Improved and Efficient Networks (Sep 17, 2019) [code available]. Tags: Ensemble Learning, General Classification
Improving Neural Architecture Search Image Classifiers via Ensemble Learning (Mar 14, 2019) [code available]. Tags: Ensemble Learning, Image Classification
Do You Remember... the Future? Weak-to-Strong Generalization in 3D Object Detection (Aug 3, 2024) [code available]. Tags: 3D Object Detection, Knowledge Distillation
Improved Knowledge Distillation for Crowd Counting on IoT Device (Aug 2, 2023) [code available]. Tags: Crowd Counting, Knowledge Distillation
Improved Knowledge Distillation via Teacher Assistant (Feb 9, 2019) [code available]. Tags: Knowledge Distillation
IE-GAN: An Improved Evolutionary Generative Adversarial Network Using a New Fitness Function and a Generic Crossover Operator (Jul 25, 2021) [code available]. Tags: Evolutionary Algorithms, Generative Adversarial Network
Improving Adversarial Robust Fairness via Anti-Bias Soft Label Distillation (Dec 9, 2023) [code available]. Tags: Adversarial Robustness, Fairness
Are All Linear Regions Created Equal? (Feb 23, 2022) [code available]. Tags: Knowledge Distillation
Image Recognition with Online Lightweight Vision Transformer: A Survey (May 6, 2025) [code available]. Tags: Knowledge Distillation, Survey
Hybrid Data-Free Knowledge Distillation (Dec 18, 2024) [code available]. Tags: Data-Free Knowledge Distillation, Generative Adversarial Network
Hybrid Attention Model Using Feature Decomposition and Knowledge Distillation for Glucose Forecasting (Nov 16, 2024) [code available]. Tags: Knowledge Distillation
Human Guided Exploitation of Interpretable Attention Patterns in Summarization and Topic Segmentation (Dec 10, 2021) [code available]. Tags: Extractive Summarization, Knowledge Distillation
HVDistill: Transferring Knowledge from Images to Point Clouds via Unsupervised Hybrid-View Distillation (Mar 18, 2024) [code available]. Tags: Knowledge Distillation, NER
Class Incremental Fault Diagnosis under Limited Fault Data via Supervised Contrastive Knowledge Distillation (Jan 16, 2025) [code available]. Tags: Fault Diagnosis, Knowledge Distillation
Domain-Lifelong Learning for Dialogue State Tracking via Knowledge Preservation Networks (Nov 1, 2021) [code available]. Tags: Dialogue State Tracking, Diversity
HRKD: Hierarchical Relational Knowledge Distillation for Cross-domain Language Model Compression (Oct 16, 2021) [code available]. Tags: Few-Shot Learning, Knowledge Distillation
Attention to Detail: Inter-Resolution Knowledge Distillation (Jan 11, 2024) [code available]. Tags: Knowledge Distillation, Whole Slide Images
Domain Knowledge Transferring for Pre-trained Language Model via Calibrated Activation Boundary Distillation (May 1, 2022) [code available]. Tags: Knowledge Distillation, Language Modeling
How to Train the Teacher Model for Effective Knowledge Distillation (Jul 25, 2024) [code available]. Tags: Knowledge Distillation
HTR-JAND: Handwritten Text Recognition with Joint Attention Network and Knowledge Distillation (Dec 24, 2024) [code available]. Tags: Computational Efficiency, Handwritten Text Recognition
Classification Under Misspecification: Halfspaces, Generalized Linear Models, and Connections to Evolvability (Jun 8, 2020) [code available]. Tags: Fairness, General Classification
How Knowledge Distillation Mitigates the Synthetic Gap in Fair Face Recognition (Aug 30, 2024) [code available]. Tags: Face Recognition, Fairness
Dynamic Data-Free Knowledge Distillation by Easy-to-Hard Learning Strategy (Aug 29, 2022) [code available]. Tags: Data-Free Knowledge Distillation, Knowledge Distillation
Domain Generalization for Crop Segmentation with Standardized Ensemble Knowledge Distillation (Apr 3, 2023) [code available]. Tags: Domain Generalization, Knowledge Distillation
HiTSR: A Hierarchical Transformer for Reference-based Super-Resolution (Aug 30, 2024) [code available]. Tags: Image Super-Resolution, Knowledge Distillation
Highlight Every Step: Knowledge Distillation via Collaborative Teaching (Jul 23, 2019) [code available]. Tags: Knowledge Distillation
Domain Adaptable Fine-Tune Distillation Framework for Advancing Farm Surveillance (Feb 10, 2024) [code available]. Tags: Computational Efficiency, Knowledge Distillation
DOGe: Defensive Output Generation for LLM Protection Against Knowledge Distillation (May 26, 2025) [code available]. Tags: Knowledge Distillation
Does Training with Synthetic Data Truly Protect Privacy? (Feb 18, 2025) [code available]. Tags: Data-Free Knowledge Distillation, Dataset Distillation
Holistic White-light Polyp Classification via Alignment-free Dense Distillation of Auxiliary Optical Chromoendoscopy (May 25, 2025) [code available]. Tags: Diagnostic, Knowledge Distillation
HDKD: Hybrid Data-Efficient Knowledge Distillation Network for Medical Image Classification (Jul 10, 2024) [code available]. Tags: Computational Efficiency, Image Classification
Approximating Interactive Human Evaluation with Self-Play for Open-Domain Dialog Systems (Jun 21, 2019) [code available]. Tags: Dialogue Evaluation, Knowledge Distillation
CKD: Contrastive Knowledge Distillation from a Sample-wise Perspective (Apr 22, 2024) [code available]. Tags: Contrastive Learning, Image Classification
DMSSN: Distilled Mixed Spectral-Spatial Network for Hyperspectral Salient Object Detection (Mar 31, 2024) [code available]. Tags: Dimensionality Reduction, Knowledge Distillation
Low-Cost Self-Ensembles Based on Multi-Branch Transformation and Grouped Convolution (Aug 5, 2024) [code available]. Tags: Classification, Diversity
Handling Data Heterogeneity in Federated Learning via Knowledge Distillation and Fusion (Jul 23, 2022) [code available]. Tags: Data-Free Knowledge Distillation, Fairness
Group Multi-View Transformer for 3D Shape Analysis with Spatial Encoding (Dec 27, 2023) [code available]. Tags: 3D Classification, 3D Shape Recognition