DεpS: Delayed ε-Shrinking for Faster Once-For-All Training · Jul 8, 2024 · All, GPU
Boosting Graph Neural Networks via Adaptive Knowledge Distillation · Oct 12, 2022 · Graph Classification, Graph Mining
Deploying a BERT-based Query-Title Relevance Classifier in a Production System: a View from the Trenches · Aug 23, 2021 · CPU, Data Augmentation
Analyzing Knowledge Distillation in Neural Machine Translation · Oct 1, 2018 · Knowledge Distillation, Machine Translation
Efficient Intent-Based Filtering for Multi-Party Conversations Using Knowledge Distillation from LLMs · Mar 21, 2025 · intent-classification, Intent Classification
Densely Distilling Cumulative Knowledge for Continual Learning · May 16, 2024 · All, Continual Learning
Boosting Contrastive Learning with Relation Knowledge Distillation · Dec 8, 2021 · Contrastive Learning, Knowledge Distillation
BoostingBERT: Integrating Multi-Class Boosting into BERT for NLP Tasks · Sep 13, 2020 · Ensemble Learning, Knowledge Distillation
Denoising Mutual Knowledge Distillation in Bi-Directional Multiple Instance Learning · May 17, 2025 · Denoising, image-classification
Analyzing Compression Techniques for Computer Vision · May 14, 2023 · Knowledge Distillation, Quantization
Efficient Image Compression Using Advanced State Space Models · Sep 4, 2024 · Computational Efficiency, Image Compression
Demystifying Catastrophic Forgetting in Two-Stage Incremental Object Detector · Feb 8, 2025 · Incremental Learning, Knowledge Distillation
Delving Deep into Semantic Relation Distillation · Mar 27, 2025 · Knowledge Distillation, Model Compression
Boosting Accuracy and Robustness of Student Models via Adaptive Adversarial Distillation · Jan 1, 2023 · Adversarial Robustness, Knowledge Distillation
BOLT: Bootstrap Long Chain-of-Thought in Language Models without Distillation · Feb 6, 2025 · In-Context Learning, Knowledge Distillation
An Active Learning Framework for Inclusive Generation by Large Language Models · Oct 17, 2024 · Active Learning, Clustering
Adaptive Regularization of Labels · Aug 15, 2019 · Data Augmentation, Knowledge Distillation
Efficient Inference via Universal LSH Kernel · Jun 21, 2021 · Knowledge Distillation, Quantization
Efficient Knowledge Distillation of SAM for Medical Image Segmentation · Jan 28, 2025 · Computational Efficiency, Decoder
DeGAN: Data-Enriching GAN for Retrieving Representative Samples from a Trained Classifier · Dec 27, 2019 · Data-free Knowledge Distillation, Incremental Learning
BLSP-KD: Bootstrapping Language-Speech Pre-training via Knowledge Distillation · May 29, 2024 · Instruction Following, Knowledge Distillation
AMTSS: An Adaptive Multi-Teacher Single-Student Knowledge Distillation Framework For Multilingual Language Inference · May 13, 2023 · Knowledge Distillation
Defending against Data-Free Model Extraction by Distributionally Robust Defensive Training · Sep 21, 2023 · Knowledge Distillation, Model extraction
Deep versus Wide: An Analysis of Student Architectures for Task-Agnostic Knowledge Distillation of Self-Supervised Speech Models · Jul 14, 2022 · Automatic Speech Recognition, Automatic Speech Recognition (ASR)
Deep-to-bottom Weights Decay: A Systemic Knowledge Review Learning Technique for Transformer Layers in Knowledge Distillation · Nov 16, 2021 · Knowledge Distillation
Block-wise Intermediate Representation Training for Model Compression · Oct 20, 2018 · Knowledge Distillation, model
Efficient Gravitational Wave Parameter Estimation via Knowledge Distillation: A ResNet1D-IAF Approach · Dec 11, 2024 · Astronomy, Computational Efficiency
Deep Serial Number: Computational Watermarking for DNN Intellectual Property Protection · Nov 17, 2020 · Knowledge Distillation, valid
Bidirectional Distillation: A Mixed-Play Framework for Multi-Agent Generalizable Behaviors · May 16, 2025 · Knowledge Distillation, Multi-agent Reinforcement Learning
Deep Semi-Supervised and Self-Supervised Learning for Diabetic Retinopathy Detection · Aug 4, 2022 · Diabetic Retinopathy Detection, Knowledge Distillation
Amortized Noisy Channel Neural Machine Translation · Dec 16, 2021 · Imitation Learning, Knowledge Distillation
Efficient Federated Learning for AIoT Applications Using Knowledge Distillation · Nov 29, 2021 · Federated Learning, Knowledge Distillation
Deep Representation Learning of Patient Data from Electronic Health Records (EHR): A Systematic Review · Oct 6, 2020 · Articles, Deep Learning
Black-box Source-free Domain Adaptation via Two-stage Knowledge Distillation · May 13, 2023 · Domain Adaptation, Knowledge Distillation
Deep Neural Network Models Compression · Mar 4, 2021 · Knowledge Distillation, Quantization
Deep Neural Compression Via Concurrent Pruning and Self-Distillation · Sep 30, 2021 · Knowledge Distillation, Language Modeling
Efficient Evaluation-Time Uncertainty Estimation by Improved Distillation · Jun 12, 2019 · Knowledge Distillation
Efficient Hybrid Language Model Compression through Group-Aware SSM Pruning · Apr 15, 2025 · Knowledge Distillation, Language Modeling
Efficient Knowledge Distillation via Curriculum Extraction · Mar 21, 2025 · Knowledge Distillation, Language Modeling
Deep Net Triage: Analyzing the Importance of Network Layers via Structural Compression · Jan 15, 2018 · Knowledge Distillation
Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack · May 3, 2021 · Knowledge Distillation, Self-Knowledge Distillation
Deep Learning for Medical Text Processing: BERT Model Fine-Tuning and Comparative Study · Oct 28, 2024 · Knowledge Distillation
Knowledge Distillation-aided End-to-End Learning for Linear Precoding in Multiuser MIMO Downlink Systems with Finite-Rate Feedback · Aug 10, 2020 · Binarization, Knowledge Distillation
Deep Epidemiological Modeling by Black-box Knowledge Distillation: An Accurate Deep Learning Model for COVID-19 · Jan 20, 2021 · Diversity, Knowledge Distillation
BJTU-WeChat's Systems for the WMT22 Chat Translation Task · Nov 28, 2022 · Denoising, Knowledge Distillation
AMLN: Adversarial-based Mutual Learning Network for Online Knowledge Distillation · Aug 1, 2020 · Knowledge Distillation, Transfer Learning
Efficient Compression of Multitask Multilingual Speech Models · May 2, 2024 · Automatic Speech Recognition, Automatic Speech Recognition (ASR)
Efficient AI in Practice: Training and Deployment of Efficient LLMs for Industry Applications · Feb 20, 2025 · Knowledge Distillation, Model Compression
Deep Collective Knowledge Distillation · Apr 18, 2023 · Knowledge Distillation, Model Compression