Improved Techniques for Training Adaptive Deep Networks (Aug 17, 2019). Tasks: Computational Efficiency, Knowledge Distillation.
ConcealGS: Concealing Invisible Copyright Information in 3D Gaussian Splatting (Jan 7, 2025). Tasks: 3D Reconstruction, Knowledge Distillation. [Code available]
Improving Continual Relation Extraction by Distinguishing Analogous Semantics (May 11, 2023). Tasks: Continual Relation Extraction, Knowledge Distillation. [Code available]
Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation (Jul 3, 2021). Tasks: Knowledge Distillation, Model Compression. [Code available]
Distilled Split Deep Neural Networks for Edge-Assisted Real-Time Systems (Oct 1, 2019). Tasks: Edge Computing, Image Classification. [Code available]
Hybrid Inverted Index Is a Robust Accelerator for Dense Retrieval (Oct 11, 2022). Tasks: Knowledge Distillation, Quantization. [Code available]
Enhancing Low-Density EEG-Based Brain-Computer Interfaces with Similarity-Keeping Knowledge Distillation (Dec 6, 2022). Tasks: EEG, EEG Decoding. [Code available]
Bit-mask Robust Contrastive Knowledge Distillation for Unsupervised Semantic Hashing (Mar 10, 2024). Tasks: Image Retrieval, Knowledge Distillation. [Code available]
AMFD: Distillation via Adaptive Multimodal Fusion for Multispectral Pedestrian Detection (May 21, 2024). Tasks: Knowledge Distillation, Pedestrian Detection. [Code available]
BKDSNN: Enhancing the Performance of Learning-based Spiking Neural Networks Training with Blurred Knowledge Distillation (Jul 12, 2024). Tasks: Knowledge Distillation. [Code available]
Black-Box Attacks on Sequential Recommenders via Data-Free Model Extraction (Sep 1, 2021). Tasks: Data Poisoning, Knowledge Distillation. [Code available]
Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning (Jun 11, 2023). Tasks: Knowledge Distillation, Meta-Learning. [Code available]
Model LEGO: Creating Models Like Disassembling and Assembling Building Blocks (Mar 25, 2022). Tasks: Incremental Learning, Knowledge Distillation. [Code available]
Coaching a Teachable Student (Jun 16, 2023). Tasks: CARLA longest6, Knowledge Distillation. [Code available]
Improving Simultaneous Machine Translation with Monolingual Data (Dec 2, 2022). Tasks: Hallucination, Knowledge Distillation. [Code available]
Are Intermediate Layers and Labels Really Necessary? A General Language Model Distillation Method (Jun 11, 2023). Tasks: Knowledge Distillation, Language Modeling. [Code available]
Understanding the Role of the Projector in Knowledge Distillation (Mar 20, 2023). Tasks: Image Classification. [Code available]
Block-Wisely Supervised Neural Architecture Search With Knowledge Distillation (Jun 1, 2020). Tasks: Knowledge Distillation, Neural Architecture Search. [Code available]
Blockwisely Supervised Neural Architecture Search with Knowledge Distillation (Nov 29, 2019). Tasks: Knowledge Distillation, Neural Architecture Search. [Code available]
Distilling Holistic Knowledge with Graph Neural Networks (Aug 12, 2021). Tasks: Knowledge Distillation. [Code available]
Distilling Knowledge from Refinement in Multiple Instance Detection Networks (Apr 23, 2020). Tasks: Knowledge Distillation, Multiple Instance Learning. [Code available]
Distilling Large Vision-Language Model with Out-of-Distribution Generalizability (Jul 6, 2023). Tasks: Few-Shot Image Classification, Image Classification. [Code available]
Distilling Knowledge from Reader to Retriever for Question Answering (Dec 8, 2020). Tasks: Information Retrieval, Knowledge Distillation. [Code available]
Distilling Knowledge from Ensembles of Acoustic Models for Joint CTC-Attention End-to-End Speech Recognition (May 19, 2020). Tasks: Automatic Speech Recognition (ASR). [Code available]
Distilling Knowledge from Self-Supervised Teacher by Embedding Graph Alignment (Nov 23, 2022). Tasks: Knowledge Distillation, Representation Learning. [Code available]
Distilling Knowledge via Knowledge Review (Apr 19, 2021). Tasks: Instance Segmentation, Knowledge Distillation. [Code available]
Boosting Light-Weight Depth Estimation Via Knowledge Distillation (May 13, 2021). Tasks: Computational Efficiency, Depth Estimation. [Code available]
Distilling Knowledge via Intermediate Classifiers (Feb 28, 2021). Tasks: Knowledge Distillation, Transfer Learning. [Code available]
CMD: Self-supervised 3D Action Representation Learning with Cross-modal Mutual Distillation (Aug 26, 2022). Tasks: 3D Action Recognition, Action Recognition. [Code available]
Efficient Fine-Tuning and Concept Suppression for Pruned Diffusion Models (Dec 19, 2024). Tasks: Bilevel Optimization, Knowledge Distillation. [Code available]
Distilling Meta Knowledge on Heterogeneous Graph for Illicit Drug Trafficker Detection on Social Media (Dec 1, 2021). Tasks: Knowledge Distillation, Marketing. [Code available]
Intra-Document Cascading: Learning to Select Passages for Neural Document Ranking (May 20, 2021). Tasks: Document Ranking, Knowledge Distillation. [Code available]
Distilling Object Detectors via Decoupled Features (Mar 26, 2021). Tasks: Image Classification. [Code available]
Distilling Out-of-Distribution Robustness from Vision-Language Foundation Models (Nov 2, 2023). Tasks: Data Augmentation, Domain Generalization. [Code available]
Distilling Object Detectors with Feature Richness (Nov 1, 2021). Tasks: Knowledge Distillation, Model Compression. [Code available]
itKD: Interchange Transfer-based Knowledge Distillation for 3D Object Detection (May 31, 2022). Tasks: 3D Object Detection, Cloud Detection. [Code available]
Bootstrapping meaning through listening: Unsupervised learning of spoken sentence embeddings (Oct 23, 2022). Tasks: Acoustic Unit Discovery, Contrastive Learning. [Code available]
Adapt Your Teacher: Improving Knowledge Distillation for Exemplar-free Continual Learning (Aug 18, 2023). Tasks: Class-Incremental Learning. [Code available]
3D Annotation-Free Learning by Distilling 2D Open-Vocabulary Segmentation Models for Autonomous Driving (May 24, 2024). Tasks: Autonomous Driving, Knowledge Distillation. [Code available]
JiuZhang3.0: Efficiently Improving Mathematical Reasoning by Training Small Data Synthesis Models (May 23, 2024). Tasks: Knowledge Distillation, Math. [Code available]
BPKD: Boundary Privileged Knowledge Distillation For Semantic Segmentation (Jun 13, 2023). Tasks: Knowledge Distillation, Segmentation. [Code available]
Breaking Modality Gap in RGBT Tracking: Coupled Knowledge Distillation (Oct 15, 2024). Tasks: Knowledge Distillation, RGB-T Tracking. [Code available]
KDAS: Knowledge Distillation via Attention Supervision Framework for Polyp Segmentation (Dec 13, 2023). Tasks: Knowledge Distillation, Medical Image Segmentation. [Code available]
KD-Lib: A PyTorch library for Knowledge Distillation, Pruning and Quantization (Nov 30, 2020). Tasks: Knowledge Distillation, Model Compression. [Code available]
Bridge Past and Future: Overcoming Information Asymmetry in Incremental Object Detection (Jul 16, 2024). Tasks: Knowledge Distillation, Object Detection. [Code available]
Distilling Visual Priors from Self-Supervised Learning (Aug 1, 2020). Tasks: Classification, Contrastive Learning. [Code available]
Advantage-Guided Distillation for Preference Alignment in Small Language Models (Feb 25, 2025). Tasks: Knowledge Distillation. [Code available]
Bridging Cross-task Protocol Inconsistency for Distillation in Dense Object Detection (Aug 28, 2023). Tasks: Binary Classification, Classification. [Code available]
Distill the Image to Nowhere: Inversion Knowledge Distillation for Multimodal Machine Translation (Oct 10, 2022). Tasks: Knowledge Distillation, Machine Translation. [Code available]
APSNet: Attention Based Point Cloud Sampling (Oct 11, 2022). Tasks: 3D Point Cloud Classification, Knowledge Distillation. [Code available]
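The common thread across the papers above is knowledge distillation. As a minimal sketch of the classic objective most of these methods build on (the Hinton-style soft-target loss; this example is generic and not taken from any listed paper, and the function name and hyperparameter values are illustrative assumptions):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Classic KD loss: temperature-softened KL term plus hard-label CE term.

    T and alpha are illustrative defaults; papers above tune or replace them.
    """
    # Soft-target term: match the student's softened distribution to the teacher's.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitude is independent of T
    # Hard-label term: ordinary cross-entropy against ground truth.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

In use, `teacher_logits` come from a frozen (no-grad) forward pass of the larger model, and only the student is updated; the feature-, relation-, and graph-based methods listed above replace or augment the soft-target term while keeping this overall teacher-student structure.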