Rethinking the Knowledge Distillation From the Perspective of Model Calibration · Oct 31, 2021 · Knowledge Distillation
Rethinking the Upsampling Layer in Hyperspectral Image Super Resolution · Jan 30, 2025 · Hyperspectral Image Super-Resolution, Image Super-Resolution
Retrieve Anything To Augment Large Language Models · Oct 11, 2023 · Knowledge Distillation, Retrieval
Revealing the Two Sides of Data Augmentation: An Asymmetric Distillation-based Win-Win Solution for Open-Set Recognition · Apr 28, 2024 · Data Augmentation, Knowledge Distillation
Reverse-engineering recurrent neural network solutions to a hierarchical inference task for mice · Dec 1, 2020 · Knowledge Distillation, Model Compression
Reverse Thinking Makes LLMs Stronger Reasoners · Nov 29, 2024 · Data Augmentation, Knowledge Distillation
Review helps learn better: Temporal Supervised Knowledge Distillation · Jul 3, 2023 · image-classification, Image Classification
Revisiting Architecture-aware Knowledge Distillation: Smaller Models and Faster Search · Jun 27, 2022 · Bayesian Optimization, Knowledge Distillation
Revisiting Data Augmentation in Model Compression: An Empirical and Comprehensive Study · May 22, 2023 · Data Augmentation, Knowledge Distillation
Revisiting Graph based Social Recommendation: A Distillation Enhanced Social Graph Network · Apr 25, 2022 · Knowledge Distillation, Recommendation Systems
Revisiting Intermediate-Layer Matching in Knowledge Distillation: Layer-Selection Strategy Doesn't Matter (Much) · Feb 6, 2025 · Knowledge Distillation
Revisiting Knowledge Distillation for Object Detection · May 22, 2021 · Domain Adaptation, Knowledge Distillation
Revisiting Multi-modal 3D Semantic Segmentation in Real-world Autonomous Driving · Oct 13, 2023 · 3D Semantic Segmentation, Autonomous Driving
Revisiting Self-Distillation · Jun 17, 2022 · Knowledge Distillation, Model Compression
Reward-Based 1-bit Compressed Federated Distillation on Blockchain · Jun 27, 2021 · Federated Learning, Knowledge Distillation
Reward Modeling with Ordinal Feedback: Wisdom of the Crowd · Nov 19, 2024 · Knowledge Distillation
Rich Feature Distillation with Feature Affinity Module for Efficient Image Dehazing · Jul 13, 2022 · Contrastive Learning, image-classification
RKLD: Reverse KL-Divergence-based Knowledge Distillation for Unlearning Personal Information in Large Language Models · Jun 4, 2024 · Knowledge Distillation, Language Modeling
RL-based Query Rewriting with Distilled LLM for online E-Commerce Systems · Jan 29, 2025 · Knowledge Distillation, Natural Language Understanding
RNAS-CL: Robust Neural Architecture Search by Cross-Layer Knowledge Distillation · Jan 19, 2023 · Knowledge Distillation, Neural Architecture Search
Robust Active Distillation · Oct 3, 2022 · Active Learning, Informativeness
Robust Distillation for Worst-class Performance · Jun 13, 2022 · Knowledge Distillation
Robust Distillation via Untargeted and Targeted Intermediate Adversarial Samples · Jan 1, 2024 · Adversarial Robustness, Knowledge Distillation
RobustDistiller: Compressing Universal Speech Representations for Enhanced Environment Robustness · Feb 18, 2023 · Knowledge Distillation, Multi-Task Learning
Robust feature knowledge distillation for enhanced performance of lightweight crack segmentation models · Apr 9, 2024 · Crack Segmentation, Knowledge Distillation
Robust Knowledge Distillation from RNN-T Models With Noisy Training Labels Using Full-Sum Loss · Mar 10, 2023 · Knowledge Distillation
Robustly Optimized and Distilled Training for Natural Language Understanding · Mar 16, 2021 · Knowledge Distillation, Machine Reading Comprehension
Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning · Nov 23, 2023 · Data Augmentation, Knowledge Distillation
Robustness to distribution shifts of compressed networks for edge devices · Jan 22, 2024 · Knowledge Distillation, Quantization
Robust Overfitting may be mitigated by properly learned smoothening · Jan 1, 2021 · Knowledge Distillation
Robust & Precise Knowledge Distillation-based Novel Context-Aware Predictor for Disease Detection in Brain and Gastrointestinal · May 9, 2025 · Disease Prediction, Knowledge Distillation
Role of Mixup in Topological Persistence Based Knowledge Distillation for Wearable Sensor Data · Feb 2, 2025 · Data Augmentation, Knowledge Distillation
RoSearch: Search for Robust Student Architectures When Distilling Pre-trained Language Models · Jun 7, 2021 · Adversarial Robustness, Knowledge Distillation
RoS-KD: A Robust Stochastic Knowledge Distillation Approach for Noisy Medical Imaging · Oct 15, 2022 · Classification, Knowledge Distillation
RTSR: A Real-Time Super-Resolution Model for AV1 Compressed Content · Nov 20, 2024 · 4k, Knowledge Distillation
RW-KD: Sample-wise Loss Terms Re-Weighting for Knowledge Distillation · Nov 1, 2021 · Knowledge Distillation
S2HPruner: Soft-to-Hard Distillation Bridges the Discretization Gap in Pruning · Oct 9, 2024 · Knowledge Distillation
S2OSC: A Holistic Semi-Supervised Approach for Open Set Classification · Aug 11, 2020 · General Classification, Knowledge Distillation
S2P3: Self-Supervised Polarimetric Pose Prediction · Dec 2, 2023 · Knowledge Distillation, Pose Prediction
Safe Distillation Box · Dec 5, 2021 · Knowledge Distillation
SafetyAnalyst: Interpretable, Transparent, and Steerable Safety Moderation for AI Behavior · Oct 22, 2024 · Knowledge Distillation
SAM-COD: SAM-guided Unified Framework for Weakly-Supervised Camouflaged Object Detection · Aug 20, 2024 · Knowledge Distillation, object-detection
SAM-Guided Masked Token Prediction for 3D Scene Understanding · Oct 16, 2024 · 3D Object Detection, Knowledge Distillation
SAM-Guided Robust Representation Learning for One-Shot 3D Medical Image Segmentation · Apr 29, 2025 · General Knowledge, Image Segmentation
Sample, Translate, Recombine: Leveraging Audio Alignments for Data Augmentation in End-to-end Speech Translation · Mar 16, 2022 · Data Augmentation, Knowledge Distillation
Sampling to Distill: Knowledge Transfer from Open-World Data · Jul 31, 2023 · Data-free Knowledge Distillation, Knowledge Distillation
Samsung R&D Institute Poland submission to WAT 2021 Indic Language Multilingual Task · Aug 1, 2021 · Domain Adaptation, Knowledge Distillation
SC2 Benchmark: Supervised Compression for Split Computing · Mar 16, 2022 · Data Compression, Edge-computing
Scalable Collaborative Learning via Representation Sharing · Nov 20, 2022 · Federated Learning, Knowledge Distillation
Scalable Detection of Salient Entities in News Articles · May 30, 2024 · Articles, Knowledge Distillation