Fast DistilBERT on CPUs (Oct 27, 2022) [Knowledge Distillation, Model Compression]
AKD: Adversarial Knowledge Distillation For Large Language Models Alignment on Coding Tasks (May 5, 2025) [Code Completion, Code Generation]
Cooperative Learning for Cost-Adaptive Inference (Dec 13, 2023) [Knowledge Distillation]
FAN-Trans: Online Knowledge Distillation for Facial Action Unit Detection (Nov 11, 2022) [Action Unit Detection, Face Alignment]
A Knowledge Distillation Approach for Sepsis Outcome Prediction from Multivariate Clinical Time Series (Nov 16, 2023) [Knowledge Distillation, Time Series]
Cooperative Denoising for Distantly Supervised Relation Extraction (Aug 1, 2018) [Denoising, Information Retrieval]
On Importance of Pruning and Distillation for Efficient Low Resource NLP (Sep 21, 2024) [Document Classification, GPU]
Fast and Efficient Once-For-All Networks for Diverse Hardware Deployment (Sep 29, 2021) [GPU]
Automated Graph Self-supervised Learning via Multi-teacher Knowledge Distillation (Oct 5, 2022) [Graph Representation Learning, Knowledge Distillation]
Convolutional Neural Network Compression through Generalized Kronecker Product Decomposition (Sep 29, 2021) [Image Classification]
Automated Channel Pruning with Learned Importance (Sep 29, 2021) [Denoising, GPU]
Control Policy Correction Framework for Reinforcement Learning-based Energy Arbitrage Strategies (Apr 29, 2024) [Knowledge Distillation, Reinforcement Learning]
Controlling the Quality of Distillation in Response-Based Network Compression (Dec 19, 2021) [Knowledge Distillation]
Fast and High-Performance Learned Image Compression With Improved Checkerboard Context Model, Deformable Residual Module, and Knowledge Distillation (Sep 5, 2023) [Image Compression, Knowledge Distillation]
Fast Real-time Personalized Speech Enhancement: End-to-End Enhancement Network (E3Net) and Knowledge Distillation (Apr 2, 2022) [Automatic Speech Recognition (ASR)]
Federated Deconfounding and Debiasing Learning for Out-of-Distribution Generalization (May 8, 2025) [Attribute, Benchmarking]
Contrast-reconstruction Representation Learning for Self-supervised Skeleton-based Action Recognition (Nov 22, 2021) [Action Recognition, Contrastive Learning]
Contrast R-CNN for Continual Learning in Object Detection (Jul 11, 2021) [Continual Learning, Image Classification]
AUTOKD: Automatic Knowledge Distillation Into A Student Architecture Family (Nov 5, 2021) [Bayesian Optimization, Knowledge Distillation]
Contrastive Representation Distillation via Multi-Scale Feature Decoupling (Feb 9, 2025) [Knowledge Distillation, Transfer Learning]
A Joint Sequential and Relational Model for Frame-Semantic Parsing (Sep 1, 2017) [Knowledge Distillation, Machine Translation]
AirNet: Neural Network Transmission over the Air (May 24, 2021) [Knowledge Distillation]
Contrastive Learning-Based Spectral Knowledge Distillation for Multi-Modality and Missing Modality Scenarios in Semantic Segmentation (Dec 4, 2023) [Benchmarking, Contrastive Learning]
AutoDistill: An End-to-End Framework to Explore and Distill Hardware-Efficient Language Models (Jan 21, 2022) [Bayesian Optimization, Knowledge Distillation]
AdapterDistillation: Non-Destructive Task Composition with Knowledge Distillation (Dec 26, 2023) [Knowledge Distillation, Retrieval]
Fairness Continual Learning Approach to Semantic Scene Understanding in Open-World Environments (May 25, 2023) [Continual Learning, Continual Semantic Segmentation]
Contrastive Continual Multi-view Clustering with Filtered Structural Fusion (Sep 26, 2023) [Clustering, Contrastive Learning]
AutoDistil: Few-shot Task-agnostic Neural Architecture Search for Distilling Large Language Models (Jan 29, 2022) [Inductive Bias, Knowledge Distillation]
Continuous Sign Language Recognition Based on Cross-Resolution Knowledge Distillation (Mar 13, 2023) [Knowledge Distillation, Sign Language Recognition]
Dynamic Object Queries for Transformer-based Incremental Object Detection (Jul 31, 2024) [Knowledge Distillation, Object]
Fair Feature Importance Scores for Interpreting Tree-Based Methods and Surrogates (Oct 6, 2023) [Fairness, Feature Importance]
Continuous Concepts Removal in Text-to-image Diffusion Models (Nov 30, 2024) [Knowledge Distillation]
Continuation KD: Improved Knowledge Distillation through the Lens of Continuation Optimization (Dec 12, 2022) [Knowledge Distillation, Natural Language Understanding]
AutoADR: Automatic Model Design for Ad Relevance (Oct 14, 2020) [AutoML, Knowledge Distillation]
Continual Self-Supervised Learning with Masked Autoencoders in Remote Sensing (Jun 26, 2025) [Continual Learning, Continual Self-Supervised Learning]
Continual Segment: Towards a Single, Unified and Non-forgetting Continual Segmentation Model of 143 Whole-body Organs in CT Scans (Jan 1, 2023) [Continual Semantic Segmentation, Decoder]
Adapter-based Selective Knowledge Distillation for Federated Multi-domain Meeting Summarization (Aug 7, 2023) [Federated Learning, Knowledge Distillation]
Accelerating Large Scale Knowledge Distillation via Dynamic Importance Sampling (Dec 3, 2018) [Knowledge Distillation, Machine Translation]
Fairly Predicting Graft Failure in Liver Transplant for Organ Assigning (Feb 18, 2023) [Fairness, Knowledge Distillation]
Fair Text to Medical Image Diffusion Model with Subgroup Distribution Aligned Tuning (Jun 21, 2024) [Knowledge Distillation]
Continual Segment: Towards a Single, Unified and Accessible Continual Segmentation Model of 143 Whole-body Organs in CT Scans (Feb 1, 2023) [Continual Semantic Segmentation, Decoder]
A Unified Knowledge Distillation Framework for Deep Directed Graphical Models (Sep 29, 2021) [Continual Learning, Federated Learning]
Accelerating Diffusion Models with One-to-Many Knowledge Distillation (Oct 5, 2024) [Image Generation, Knowledge Distillation]
AI-KD: Adversarial Learning and Implicit Regularization for Self-Knowledge Distillation (Nov 20, 2022) [Knowledge Distillation, Self-Knowledge Distillation]
A Unified Knowledge-Distillation and Semi-Supervised Learning Framework to Improve Industrial Ads Delivery Systems (Feb 5, 2025) [Knowledge Distillation]
Adapt-and-Distill: Developing Small, Fast and Effective Pretrained Language Models for Domains (Jun 25, 2021) [Knowledge Distillation]
Failure-Resilient Distributed Inference with Model Compression over Heterogeneous Edge Devices (Jun 20, 2024) [Knowledge Distillation, Model Compression]
Continual Learning with Dirichlet Generative-based Rehearsal (Sep 13, 2023) [Continual Learning, Incremental Learning]
Continual Learning with Diffusion-based Generative Replay for Industrial Streaming Data (Jun 22, 2024) [Continual Learning, Knowledge Distillation]
A Unified Framework for Continual Learning and Unlearning (Aug 21, 2024) [Continual Learning, Knowledge Distillation]