Aligning (Medical) LLMs for (Counterfactual) Fairness Aug 22, 2024 Counterfactual Fairness
D^2TV: Dual Knowledge Distillation and Target-oriented Vision Modeling for Many-to-Many Multimodal Summarization May 22, 2023 Knowledge Distillation
Learning Compact and Representative Features for Cross-Modality Person Re-Identification Mar 26, 2021 Cross-Modality Person Re-identification Knowledge Distillation
cViL: Cross-Lingual Training of Vision-Language Models using Knowledge Distillation Jun 7, 2022 Knowledge Distillation Question Answering
BEBERT: Efficient and Robust Binary Ensemble BERT Oct 28, 2022 Binarization Computational Efficiency
Customizing Synthetic Data for Data-Free Student Learning Jul 10, 2023 Data-free Knowledge Distillation Knowledge Distillation
Language Model Knowledge Distillation for Efficient Question Answering in Spanish Dec 7, 2023 Knowledge Distillation Language Modeling
Adaptive Distillation: Aggregating Knowledge from Multiple Paths for Efficient Distillation Oct 19, 2021 Knowledge Distillation Neural Network Compression
CSE: Surface Anomaly Detection with Contrastively Selected Embedding Mar 4, 2024 Anomaly Detection Knowledge Distillation
Language-Universal Adapter Learning with Knowledge Distillation for End-to-End Multilingual Speech Recognition Feb 28, 2023 Automatic Speech Recognition (ASR)
Large-Scale Data-Free Knowledge Distillation for ImageNet via Multi-Resolution Data Generation Nov 26, 2024 Data-free Knowledge Distillation Diversity
Learning to Maximize Mutual Information for Chain-of-Thought Distillation Mar 5, 2024 Knowledge Distillation Language Modeling
Knowledge Transfer Graph for Deep Collaborative Learning Sep 10, 2019 Knowledge Distillation Transfer Learning
Cross-View Consistency Regularisation for Knowledge Distillation Dec 21, 2024 Knowledge Distillation
KS-DETR: Knowledge Sharing in Attention Learning for Detection Transformer Feb 22, 2023 Knowledge Distillation Transfer Learning
Knowledge Grafting of Large Language Models May 24, 2025 Continual Learning Knowledge Distillation
BAM! Born-Again Multi-Task Networks for Natural Language Understanding Jul 10, 2019 Knowledge Distillation Natural Language Understanding
Adaptive Decoupled Pose Knowledge Distillation Oct 1, 2023 Knowledge Distillation Pose Estimation
Cross-modal Knowledge Distillation for Vision-to-Sensor Action Recognition Oct 8, 2021 Action Recognition Activity Recognition
Knowledge Extraction with No Observable Data Dec 1, 2019 Data-free Knowledge Distillation Knowledge Distillation
KnowledgeSG: Privacy-Preserving Synthetic Text Generation with Knowledge Distillation from Server Oct 8, 2024 Federated Learning Knowledge Distillation
Knowledge Distillation with Adversarial Samples Supporting Decision Boundary May 15, 2018 Adversarial Attack Knowledge Distillation
Cross Modality Knowledge Distillation for Multi-Modal Aerial View Object Classification Jun 19, 2021 Image Classification Knowledge Distillation
Alignahead: Online Cross-Layer Knowledge Extraction on Graph Neural Networks May 5, 2022 Knowledge Distillation
Knowledge Distillation with Reptile Meta-Learning for Pretrained Language Model Compression Oct 1, 2022 Knowledge Distillation Language Modeling
Knowledge Distillation via Instance Relationship Graph Jun 1, 2019 Knowledge Distillation
Knowledge Distillation of Russian Language Models with Reduction of Vocabulary May 4, 2022 Knowledge Distillation
Knowledge Distillation Performs Partial Variance Reduction May 27, 2023 Knowledge Distillation
A Lightweight Target-Driven Network of Stereo Matching for Inland Waterways Oct 10, 2024 Autonomous Navigation Knowledge Distillation
Knowledge Distillation in RNN-Attention Models for Early Prediction of Student Performance Dec 19, 2024 Knowledge Distillation Student dropout
Backdoor for Debias: Mitigating Model Bias with Backdoor Attack-based Artificial Bias Mar 1, 2023 Backdoor Attack Knowledge Distillation
Knowledge Distillation Layer that Lets the Student Decide Sep 6, 2023 Knowledge Distillation
Knowledge distillation to effectively attain both region-of-interest and global semantics from an image where multiple objects appear Jul 11, 2024 Knowledge Distillation object-detection
Cross-feature Contrastive Loss for Decentralized Deep Learning on Heterogeneous Data Oct 24, 2023 Data-free Knowledge Distillation Knowledge Distillation
Knowledge Distillation from Cross Teaching Teachers for Efficient Semi-Supervised Abdominal Organ Segmentation in CT Nov 11, 2022 Image Segmentation Knowledge Distillation
Few Sample Knowledge Distillation for Efficient Network Compression Dec 5, 2018 Knowledge Distillation Network Pruning
Knowledge Distillation from Single to Multi Labels: an Empirical Study Mar 15, 2023 Classification image-classification
Knowledge Distillation For Wireless Edge Learning Apr 3, 2021 Cloud Computing Federated Learning
Knowledge Distillation for Singing Voice Detection Nov 9, 2020 Information Retrieval Knowledge Distillation
Knowledge Distillation for Detection Transformer with Consistent Distillation Points Sampling Nov 15, 2022 General Knowledge Knowledge Distillation
Knowledge Distillation for End-to-End Person Search Sep 3, 2019 Knowledge Distillation Model Compression
Knowledge Distillation-Based Model Extraction Attack using GAN-based Private Counterfactual Explanations Apr 4, 2024 Counterfactual Knowledge Distillation
Knowledge Distillation by On-the-Fly Native Ensemble Jun 12, 2018 Computational Efficiency image-classification
Knowledge Distillation as Semiparametric Inference Apr 20, 2021 Knowledge Distillation Model Compression
AVQACL: A Novel Benchmark for Audio-Visual Question Answering Continual Learning Jan 1, 2025 Audio-visual Question Answering Continual Learning
Knowledge Distillation By Sparse Representation Matching Mar 31, 2021 Knowledge Distillation Representation Learning
Knowledge Distillation for Multi-Target Domain Adaptation in Real-Time Person Re-Identification May 12, 2022 Domain Adaptation Knowledge Distillation
Co-Teaching for Unsupervised Domain Adaptation and Expansion Apr 4, 2022 Domain Adaptation image-classification
Knowledge Distillation approach towards Melanoma Detection Oct 14, 2022 Knowledge Distillation
Auxiliary Learning for Self-Supervised Video Representation via Similarity-based Knowledge Distillation Dec 7, 2021 Auxiliary Learning Knowledge Distillation