Modality-Independent Brain Lesion Segmentation with Privacy-aware Continual Learning (Mar 26, 2025). Tags: Continual Learning, Knowledge Distillation.
Squeezed Deep 6DoF Object Detection Using Knowledge Distillation (Mar 30, 2020). Tags: Knowledge Distillation, Object. Code available.
Unsupervised Spike Depth Estimation via Cross-modality Cross-domain Knowledge Transfer (Aug 26, 2022). Tags: Autonomous Driving, Depth Estimation. Code available.
Adaptive Modality Balanced Online Knowledge Distillation for Brain-Eye-Computer based Dim Object Detection (Jul 2, 2024). Tags: Electroencephalogram (EEG). Code available.
Distilling Influences to Mitigate Prediction Churn in Graph Neural Networks (Oct 2, 2023). Tags: Knowledge Distillation, Node Classification. Code available.
Model-Architecture Co-Design for High Performance Temporal GNN Inference on FPGA (Mar 10, 2022). Tags: Knowledge Distillation. Code available.
Closest Neighbors are Harmful for Lightweight Masked Auto-encoders (Jan 1, 2025). Tags: Knowledge Distillation. Code available.
CLIMB-3D: Continual Learning for Imbalanced 3D Instance Segmentation (Feb 24, 2025). Tags: 3D Instance Segmentation, Continual Learning. Code available.
RAIN: RegulArization on Input and Network for Black-Box Domain Adaptation (Aug 22, 2022). Tags: Data Augmentation, Domain Adaptation. Code available.
Reprogramming Distillation for Medical Foundation Models (Jul 9, 2024). Tags: Knowledge Distillation, Lightweight Deployment. Code available.
Model Compression Techniques in Biometrics Applications: A Survey (Jan 18, 2024). Tags: Fairness, Knowledge Distillation. Code available.
Unsupervised Training of a Dynamic Context-Aware Deep Denoising Framework for Low-Dose Fluoroscopic Imaging (Oct 29, 2024). Tags: Denoising, Diagnostic. Code available.
Comb, Prune, Distill: Towards Unified Pruning for Vision Model Compression (Aug 6, 2024). Tags: Image Classification. Code available.
Class Incremental Fault Diagnosis under Limited Fault Data via Supervised Contrastive Knowledge Distillation (Jan 16, 2025). Tags: Fault Diagnosis, Knowledge Distillation. Code available.
StableKD: Breaking Inter-block Optimization Entanglement for Stable Knowledge Distillation (Dec 20, 2023). Tags: Knowledge Distillation. Code available.
Toward Extremely Lightweight Distracted Driver Recognition With Distillation-Based Neural Architecture Search and Knowledge Transfer (Feb 9, 2023). Tags: Knowledge Distillation, Neural Architecture Search. Code available.
Distilling Implicit Multimodal Knowledge into Large Language Models for Zero-Resource Dialogue Generation (May 16, 2024). Tags: Dialogue Generation, Knowledge Distillation. Code available.
Distilling Image Dehazing With Heterogeneous Task Imitation (Jun 1, 2020). Tags: Image Classification. Code available.
Enhancing Heterogeneous Federated Learning with Knowledge Extraction and Multi-Model Fusion (Aug 16, 2022). Tags: Federated Learning, Knowledge Distillation. Code available.
Modeling Document-level Temporal Structures for Building Temporal Dependency Graphs (Oct 21, 2022). Tags: Knowledge Distillation, Sentence. Code available.
Guiding Frame-Level CTC Alignments Using Self-knowledge Distillation (Jun 12, 2024). Tags: Automatic Speech Recognition (ASR). Code available.
Data Efficient Stagewise Knowledge Distillation (Nov 15, 2019). Tags: Knowledge Distillation, Model Compression. Code available.
Weight-Inherited Distillation for Task-Agnostic BERT Compression (May 16, 2023). Tags: Knowledge Distillation. Code available.
StatsMerging: Statistics-Guided Model Merging via Task-Specific Teacher Distillation (Jun 5, 2025). Tags: Knowledge Distillation. Code available.
Why Not Transform Chat Large Language Models to Non-English? (May 22, 2024). Tags: Knowledge Distillation. Code available.
Response Ranking with Deep Matching Networks and External Knowledge in Information-seeking Conversation Systems (May 1, 2018). Tags: Knowledge Distillation, Retrieval. Code available.
Distilling Global and Local Logits With Densely Connected Relations (Jan 1, 2021). Tags: Image Classification. Code available.
UPFL: Unsupervised Personalized Federated Learning towards New Clients (Jul 29, 2023). Tags: Federated Learning, Knowledge Distillation. Code available.
GSSF: Generalized Structural Sparse Function for Deep Cross-modal Metric Learning (Oct 20, 2024). Tags: Image Retrieval, Image-text Retrieval. Code available.
Two-stage Textual Knowledge Distillation for End-to-End Spoken Language Understanding (Oct 25, 2020). Tags: Automatic Speech Recognition (ASR). Code available.
Distilling Focal Knowledge From Imperfect Expert for 3D Object Detection (Jan 1, 2023). Tags: 3D Geometry, 3D Object Detection. Code available.
Distilling and Transferring Knowledge via cGAN-generated Samples for Image Classification and Regression (Apr 7, 2021). Tags: General Classification, Image Classification. Code available.
MoMA: Momentum Contrastive Learning with Multi-head Attention-based Knowledge Distillation for Histopathology Image Analysis (Aug 31, 2023). Tags: Contrastive Learning, Knowledge Distillation. Code available.
Exploring Inconsistent Knowledge Distillation for Object Detection with Data Augmentation (Sep 20, 2022). Tags: Data Augmentation, Knowledge Distillation. Code available.
GSB: Group Superposition Binarization for Vision Transformer with Limited Training Samples (May 13, 2023). Tags: Binarization, Knowledge Distillation. Code available.
Distilled Non-Semantic Speech Embeddings with Binary Neural Networks for Low-Resource Devices (Jul 12, 2022). Tags: Emotion Recognition, Keyword Spotting. Code available.
Group Multi-View Transformer for 3D Shape Analysis with Spatial Encoding (Dec 27, 2023). Tags: 3D Classification, 3D Shape Recognition. Code available.
Greedy-layer Pruning: Speeding up Transformer Models for Natural Language Processing (May 31, 2021). Tags: Knowledge Distillation, Unsupervised Pre-training. Code available.
Automatic Assignment of Radiology Examination Protocols Using Pre-trained Language Models with Knowledge Distillation (Sep 1, 2020). Tags: Data Augmentation, Knowledge Distillation. Code available.
Graph Knowledge Distillation to Mixture of Experts (Jun 17, 2024). Tags: Knowledge Distillation, Mixture-of-Experts. Code available.
Mosaic: Data-Free Knowledge Distillation via Mixture-of-Experts for Heterogeneous Distributed Environments (May 26, 2025). Tags: Data-free Knowledge Distillation, Federated Learning. Code available.
Graph Entropy Minimization for Semi-supervised Node Classification (May 31, 2023). Tags: Classification, Knowledge Distillation. Code available.
Rethinking Intermediate Layers Design in Knowledge Distillation for Kidney and Liver Tumor Segmentation (Nov 28, 2023). Tags: Diagnostic, Knowledge Distillation. Code available.
AdaBERT: Task-Adaptive BERT Compression with Differentiable Neural Architecture Search (Jan 13, 2020). Tags: Knowledge Distillation, Neural Architecture Search. Code available.
Graph-based Knowledge Distillation by Multi-head Attention Network (Jul 4, 2019). Tags: Inductive Bias, Knowledge Distillation. Code available.
Gradient Knowledge Distillation for Pre-trained Language Models (Nov 2, 2022). Tags: Knowledge Distillation. Code available.
MSE-Optimal Neural Network Initialization via Layer Fusion (Jan 28, 2020). Tags: General Classification, Knowledge Distillation. Code available.
Automatic Adaptation of Object Detectors to New Domains Using Self-training (Apr 15, 2019). Tags: Domain Adaptation, Knowledge Distillation. Code available.
MST-KD: Multiple Specialized Teachers Knowledge Distillation for Fair Face Recognition (Aug 29, 2024). Tags: Face Recognition, Knowledge Distillation. Code available.
STKDRec: Spatial-Temporal Knowledge Distillation for Takeaway Recommendation (Dec 21, 2024). Tags: Knowledge Distillation, Knowledge Graphs. Code available.