Distilling GANs with Style-Mixed Triplets for X2I Translation with Limited Data (Sep 29, 2021) · Image Generation, Knowledge Distillation
Distilling Generative-Discriminative Representations for Very Low-Resolution Face Recognition (Sep 10, 2024) · Face Recognition, Knowledge Distillation
Distilling HuBERT with LSTMs via Decoupled Knowledge Distillation (Sep 18, 2023) · Automatic Speech Recognition, Knowledge Distillation
Distilling Inductive Bias: Knowledge Distillation Beyond Model Compression (Sep 30, 2023) · Inductive Bias, Knowledge Distillation
Distilling Inter-Class Distance for Semantic Segmentation (May 7, 2022) · Knowledge Distillation, Position
Distilling Invariant Representations with Dual Augmentation (Oct 12, 2024) · Knowledge Distillation
Distilling Knowledge for Short-to-Long Term Trajectory Prediction (May 15, 2023) · Knowledge Distillation, Prediction
Distilling Knowledge from CNN-Transformer Models for Enhanced Human Action Recognition (Nov 2, 2023) · Action Recognition, Knowledge Distillation
Distilling Knowledge from Deep Networks with Applications to Healthcare Domain (Dec 11, 2015) · Computational Phenotyping, Decision Making
Distilling Knowledge from Heterogeneous Architectures for Semantic Segmentation (Apr 10, 2025) · Knowledge Distillation, Semantic Segmentation
Distilling Knowledge from Pre-trained Language Models via Text Smoothing (May 8, 2020) · Knowledge Distillation, Language Modeling
Distilling Knowledge from Resource Management Algorithms to Neural Networks: A Unified Training Assistance Approach (Aug 15, 2023) · Knowledge Distillation, Management
Distilling Knowledge into Quantum Vision Transformers for Biomedical Image Classification (Mar 10, 2025) · Image Classification
Distilling Large Language Models for Efficient Clinical Information Extraction (Dec 21, 2024) · Knowledge Distillation, Named Entity Recognition
Distilling Missing Modality Knowledge from Ultrasound for Endometriosis Diagnosis with Magnetic Resonance Images (Jul 5, 2023) · Knowledge Distillation
Distilling Monocular Foundation Model for Fine-grained Depth Completion (Jan 1, 2025) · Autonomous Driving, Depth Completion
Distilling Multi-Level X-vector Knowledge for Small-footprint Speaker Verification (Mar 2, 2023) · Knowledge Distillation, Speaker Verification
Distilling Named Entity Recognition Models for Endangered Species from Large Language Models (Mar 13, 2024) · In-Context Learning, Knowledge Distillation
Distilling Normalizing Flows (Jun 26, 2025) · Density Estimation, Knowledge Distillation
Distilling Object Detectors with Task Adaptive Regularization (Jun 23, 2020) · Knowledge Distillation, Object Detection
Distilling ODE Solvers of Diffusion Models into Smaller Steps (Sep 28, 2023) · Denoising, Knowledge Distillation
Distilling Optimal Neural Networks: Rapid Search in Diverse Spaces (Dec 16, 2020) · GPU, Knowledge Distillation
Distilling Pixel-Wise Feature Similarities for Semantic Segmentation (Oct 31, 2019) · Knowledge Distillation, Neural Network Compression
Distilling portable Generative Adversarial Networks for Image Translation (Mar 7, 2020) · Image-to-Image Translation, Knowledge Distillation
Distilling Privileged Multimodal Information for Expression Recognition using Optimal Transport (Jan 27, 2024) · Diversity, Knowledge Distillation
Distilling Spatially-Heterogeneous Distortion Perception for Blind Image Quality Assessment (Jan 1, 2025) · Blind Image Quality Assessment, Image Quality Assessment
Distilling Spikes: Knowledge Distillation in Spiking Neural Networks (May 1, 2020) · Image Classification
Distilling Structured Knowledge for Text-Based Relational Reasoning (Nov 1, 2020) · Contrastive Learning, Knowledge Distillation
Distilling Temporal Knowledge with Masked Feature Reconstruction for 3D Object Detection (Jan 3, 2024) · 3D Object Detection, Knowledge Distillation
Distilling Text Style Transfer With Self-Explanation From LLMs (Mar 2, 2024) · In-Context Learning, Knowledge Distillation
Distilling the Knowledge in Data Pruning (Mar 12, 2024) · Knowledge Distillation
Distilling BERT into Simple Neural Networks with Unlabeled Transfer Data (Oct 4, 2019) · Knowledge Distillation, NER
Distilling Vision-Language Foundation Models: A Data-Free Approach via Prompt Diversification (Jul 21, 2024) · Data-free Knowledge Distillation, Image Generation
Distill or Annotate? Cost-Efficient Fine-Tuning of Compact Models (May 2, 2023) · Knowledge Distillation
DistillSpec: Improving Speculative Decoding via Knowledge Distillation (Oct 12, 2023) · Knowledge Distillation, Language Modeling
Distill-then-prune: An Efficient Compression Framework for Real-time Stereo Matching Network on Edge Devices (May 20, 2024) · Knowledge Distillation, Stereo Matching
Distill to Delete: Unlearning in Graph Networks with Knowledge Distillation (Sep 28, 2023) · GPU, Graph Neural Network
Distill-to-Label: Weakly Supervised Instance Labeling Using Knowledge Distillation (Jul 26, 2019) · Breast Cancer Detection, Instance Segmentation
DistillW2V2: A Small and Streaming Wav2vec 2.0 Based ASR Model (Mar 16, 2023) · Automatic Speech Recognition (ASR)
DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization (Apr 12, 2022) · Knowledge Distillation, Meta-Learning
Distributed Learning for Wi-Fi AP Load Prediction (Apr 22, 2024) · Federated Learning, Knowledge Distillation
Distribution Shift Matters for Knowledge Distillation with Webly Collected Images (Jul 21, 2023) · Contrastive Learning, Data-free Knowledge Distillation
Diverse Knowledge Distillation for End-to-End Person Search (Dec 21, 2020) · Human Detection, Knowledge Distillation
Divide and Conquer: Leveraging Intermediate Feature Representations for Quantized Training of Neural Networks (Jun 14, 2019) · Knowledge Distillation, Quantization
DLIP: Distilling Language-Image Pre-training (Aug 24, 2023) · Image Captioning, Image-text Retrieval
DL-KDD: Dual-Light Knowledge Distillation for Action Recognition in the Dark (Jun 4, 2024) · Action Recognition, Knowledge Distillation
DMKD: Improving Feature-based Knowledge Distillation for Object Detection Via Dual Masking Augmentation (Sep 6, 2023) · Knowledge Distillation, Object Detection
DNA 1.0 Technical Report (Jan 18, 2025) · Belebele, GSM8K
DocKD: Knowledge Distillation from LLMs for Open-World Document Understanding Models (Oct 4, 2024) · Document Understanding, Knowledge Distillation
Does Knowledge Distillation Matter for Large Language Model based Bundle Generation? (Apr 24, 2025) · In-Context Learning, Knowledge Distillation