Mixed Sample Augmentation for Online Distillation (Jun 24, 2022). Tags: Data Augmentation, Knowledge Distillation
Online Distilling from Checkpoints for Neural Machine Translation (Jun 1, 2019). Tags: Knowledge Distillation, Machine Translation
Online Hyperparameter Meta-Learning with Hypergradient Distillation (Oct 6, 2021). Tags: Hyperparameter Optimization, Knowledge Distillation
Online Knowledge Distillation via Multi-branch Diversity Enhancement (Oct 2, 2020). Tags: Diversity, Image Classification
Online Knowledge Distillation with Reward Guidance (May 25, 2025). Tags: Imitation Learning, Knowledge Distillation
Online Policy Distillation with Decision-Attention (Jun 8, 2024). Tags: Deep Reinforcement Learning, Knowledge Distillation
Online pre-training with long-form videos (Aug 28, 2024). Tags: Action Recognition, Contrastive Learning
Online Sensor Hallucination via Knowledge Distillation for Multimodal Image Classification (Aug 28, 2019). Tags: Classification, Decision Making
On Multilingual Encoder Language Model Compression for Low-Resource Languages (May 22, 2025). Tags: Knowledge Distillation, Language Modeling
On Neural Network Equivalence Checking using SMT Solvers (Mar 22, 2022). Tags: Knowledge Distillation
On Reducing Activity with Distillation and Regularization for Energy Efficient Spiking Neural Networks (Jun 26, 2024). Tags: Knowledge Distillation
On Self-Distilling Graph Neural Network (Nov 4, 2020). Tags: Graph Embedding, Graph Neural Network
On student-teacher deviations in distillation: does it pay to disobey? (Jan 30, 2023). Tags: Knowledge Distillation
On the benefits of knowledge distillation for adversarial robustness (Mar 14, 2022). Tags: Adversarial Robustness, Knowledge Distillation
On the Compression of Language Models for Code: An Empirical Study on CodeBERT (Dec 18, 2024). Tags: Code Search, Code Summarization
On the Demystification of Knowledge Distillation: A Residual Network Perspective (Jun 30, 2020). Tags: Knowledge Distillation, Model Compression
On The Distribution of Penultimate Activations of Classification Networks (Jul 5, 2021). Tags: Classification, Conditional Image Generation
On the Efficacy of Knowledge Distillation (Oct 3, 2019). Tags: Knowledge Distillation
On the Efficiency of Subclass Knowledge Distillation in Classification Tasks (Sep 12, 2021). Tags: Binary Classification, Classification
On the Impact of Knowledge Distillation for Model Interpretability (May 25, 2023). Tags: Knowledge Distillation
On the Impact of White-box Deployment Strategies for Edge AI on Latency and Model Performance (Nov 1, 2024). Tags: Knowledge Distillation
On the Interplay Between Sparsity, Naturalness, Intelligibility, and Prosody in Speech Synthesis (Oct 4, 2021). Tags: Knowledge Distillation, Speech Synthesis
On the Orthogonality of Knowledge Distillation with Other Techniques: From an Ensemble Perspective (Sep 9, 2020). Tags: Data Augmentation, Efficient Neural Network
On the Query Strategies for Efficient Online Active Distillation (Sep 4, 2023). Tags: Active Learning, Continual Learning
Analysis of Knowledge Transfer in Kernel Regime (Mar 30, 2020). Tags: Knowledge Distillation, Transfer Learning
Open-Set Fine-Grained Retrieval via Prompting Vision-Language Evaluator (Jan 1, 2023). Tags: Knowledge Distillation, Retrieval
Open-set Short Utterance Forensic Speaker Verification using Teacher-Student Network with Explicit Inductive Bias (Sep 21, 2020). Tags: Inductive Bias, Knowledge Distillation
Open Vocabulary 3D Scene Understanding via Geometry Guided Self-Distillation (Jul 18, 2024). Tags: Knowledge Distillation, Representation Learning
Open-Vocabulary Object Detection using Pseudo Caption Labels (Mar 23, 2023). Tags: Image Captioning, Knowledge Distillation
Open-Vocabulary Object Detection with Meta Prompt Representation and Instance Contrastive Optimization (Mar 14, 2024). Tags: Contrastive Learning, Knowledge Distillation
Open World DETR: Transformer based Open World Object Detection (Dec 6, 2022). Tags: Knowledge Distillation, Object
Leveraging Complementary Attention maps in vision transformers for OCT image analysis (Oct 21, 2023). Tags: Knowledge Distillation
OplixNet: Towards Area-Efficient Optical Split-Complex Networks with Real-to-Complex Data Assignment and Knowledge Distillation (Dec 3, 2023). Tags: Knowledge Distillation
Optical Flow Distillation: Towards Efficient and Stable Video Style Transfer (Jul 10, 2020). Tags: Knowledge Distillation, Optical Flow Estimation
Optimising TinyML with Quantization and Distillation of Transformer and Mamba Models for Indoor Localisation on Edge Devices (Dec 12, 2024). Tags: Knowledge Distillation, Mamba
Optimizing Knowledge Distillation in Transformers: Enabling Multi-Head Attention without Alignment Barriers (Feb 11, 2025). Tags: Image Classification
Optimizing LLMs for Resource-Constrained Environments: A Survey of Model Compression Techniques (May 5, 2025). Tags: Knowledge Distillation, Mixture-of-Experts
Optimizing Multi-Gateway LoRaWAN via Cloud-Edge Collaboration and Knowledge Distillation (Apr 13, 2025). Tags: Decision Making, Knowledge Distillation
Optimizing speed/accuracy trade-off for person re-identification via knowledge distillation (Dec 7, 2018). Tags: Deep Learning, General Classification
Learning Deep and Compact Models for Gesture Recognition (Dec 29, 2017). Tags: Gesture Recognition, Knowledge Distillation
Improved Knowledge Distillation via Full Kernel Matrix Transfer (Sep 30, 2020). Tags: Knowledge Distillation, Model Compression. [Code available]
Learning Efficient Detector with Semi-supervised Adaptive Distillation (Jan 2, 2019). Tags: Image Classification. [Code available]
Efficient Logit-based Knowledge Distillation of Deep Spiking Neural Networks for Full-Range Timestep Deployment (Jan 27, 2025). Tags: Knowledge Distillation. [Code available]
TernaryBERT: Distillation-aware Ultra-low Bit BERT (Sep 27, 2020). Tags: Knowledge Distillation, Quantization. [Code available]
Training on the Test Model: Contamination in Ranking Distillation (Nov 4, 2024). Tags: Knowledge Distillation. [Code available]
Leaning Compact and Representative Features for Cross-Modality Person Re-Identification (Mar 26, 2021). Tags: Cross-Modality Person Re-identification, Knowledge Distillation. [Code available]
Beyond the Limitation of Monocular 3D Detector via Knowledge Distillation (Jan 1, 2023). Tags: Knowledge Distillation. [Code available]
ADD: Frequency Attention and Multi-View based Knowledge Distillation to Detect Low-Quality Compressed Deepfake Images (Dec 7, 2021). Tags: DeepFake Detection, Face Swapping. [Code available]
Beyond Conventional Transformers: The Medical X-ray Attention (MXA) Block for Improved Multi-Label Diagnosis Using Knowledge Distillation (Apr 3, 2025). Tags: Anomaly Detection, Knowledge Distillation. [Code available]
Pretrained Speech Encoders and Efficient Fine-tuning Methods for Speech Translation: UPC at IWSLT 2022 (May 1, 2022). Tags: Decoder, Knowledge Distillation. [Code available]