- GenDistiller: Distilling Pre-trained Language Models based on an Autoregressive Generative Model (Jun 12, 2024): Knowledge Distillation, Self-Supervised Learning
- Extreme Compression for Pre-trained Transformers Made Simple and Efficient (Jun 4, 2022): Knowledge Distillation, Quantization
- Compression of Deep Learning Models for Text: A Survey (Aug 12, 2020): Deep Learning, Information Retrieval
- Extremely Small BERT Models from Mixed-Vocabulary Training (Sep 25, 2019): Knowledge Distillation, Language Modelling
- Continual Self-Supervised Learning with Masked Autoencoders in Remote Sensing (Jun 26, 2025): Continual Learning, Continual Self-Supervised Learning
- Face to Cartoon Incremental Super-Resolution using Knowledge Distillation (Jan 27, 2024): Hallucination, Incremental Learning
- Continuation KD: Improved Knowledge Distillation through the Lens of Continuation Optimization (Dec 12, 2022): Knowledge Distillation, Natural Language Understanding
- AutoADR: Automatic Model Design for Ad Relevance (Oct 14, 2020): AutoML, Knowledge Distillation
- Generalized Supervised Contrastive Learning (Jun 1, 2022): Contrastive Learning, Knowledge Distillation
- Factorized Distillation: Training Holistic Person Re-identification Model by Distilling an Ensemble of Partial ReID Models (Nov 20, 2018): Knowledge Distillation, Person Re-Identification
- Compression of Acoustic Event Detection Models With Quantized Distillation (Jul 1, 2019): Event Detection, Knowledge Distillation
- Factual Dialogue Summarization via Learning from Large Language Models (Jun 20, 2024): Contrastive Learning, Data Augmentation
- Selective Cross-Task Distillation (Apr 25, 2022): Knowledge Distillation
- Failure-Resilient Distributed Inference with Model Compression over Heterogeneous Edge Devices (Jun 20, 2024): Knowledge Distillation, Model Compression
- G-DetKD: Towards General Distillation Framework for Object Detectors via Contrastive and Semantic-guided Feature Imitation (Aug 17, 2021): Knowledge Distillation, Object Detection
- Fair Feature Distillation for Visual Recognition (May 27, 2021): Fairness, Knowledge Distillation
- Generalized Continual Zero-Shot Learning (Nov 17, 2020): Continual Learning, Knowledge Distillation
- Fairly Predicting Graft Failure in Liver Transplant for Organ Assigning (Feb 18, 2023): Fairness, Knowledge Distillation
- Compressing Visual-linguistic Model via Knowledge Distillation (Apr 5, 2021): Image Captioning, Knowledge Distillation
- Enhancing Generalization in Chain of Thought Reasoning for Smaller Models (Jan 16, 2025): Knowledge Distillation, Memorization
- Fair Text to Medical Image Diffusion Model with Subgroup Distribution Aligned Tuning (Jun 21, 2024): Knowledge Distillation
- Faithful Knowledge Distillation (Jun 7, 2023): Adversarial Robustness, Knowledge Distillation
- A Theoretical Analysis of Soft-Label vs Hard-Label Training in Neural Networks (Dec 12, 2024): Binary Classification, Knowledge Distillation
- Enhancing Few-shot Keyword Spotting Performance through Pre-Trained Self-supervised Speech Models (Jun 21, 2025): Dimensionality Reduction, Keyword Spotting
- Enhancing Data-Free Adversarial Distillation with Activation Regularization and Virtual Interpolation (Feb 23, 2021): Knowledge Distillation
- Fall Detection using Knowledge Distillation Based Long Short-Term Memory for Offline Embedded and Low Power Devices (Aug 24, 2023): Knowledge Distillation, Time Series
- Compressing VAE-Based Out-of-Distribution Detectors for Embedded Deployment (Sep 2, 2024): CPU, GPU
- GAN-Knowledge Distillation for One-Stage Object Detection (Jun 20, 2019): Knowledge Distillation, Object
- Enhancing CTC-Based Visual Speech Recognition (Sep 11, 2024): Automatic Speech Recognition (ASR)
- Compressing Recurrent Neural Networks for FPGA-accelerated Implementation in Fluorescence Lifetime Imaging (Oct 1, 2024): Computational Efficiency, Knowledge Distillation
- Enhancing Content Representation for AR Image Quality Assessment Using Knowledge Distillation (Dec 8, 2024): Image Quality Assessment, Knowledge Distillation
- Fast and Efficient Once-For-All Networks for Diverse Hardware Deployment (Sep 29, 2021): All, GPU
- Fast and High-Performance Learned Image Compression With Improved Checkerboard Context Model, Deformable Residual Module, and Knowledge Distillation (Sep 5, 2023): Image Compression, Knowledge Distillation
- Contrast-reconstruction Representation Learning for Self-supervised Skeleton-based Action Recognition (Nov 22, 2021): Action Recognition, Contrastive Learning
- Enhancing Chinese Multi-Label Text Classification Performance with Response-based Knowledge Distillation (Nov 1, 2022): Knowledge Distillation, Multi-Label Text Classification
- A Technical Study into Small Reasoning Language Models (Jun 16, 2025): Code Generation, Computational Efficiency
- Gap Preserving Distillation by Building Bidirectional Mappings with A Dynamic Teacher (Oct 5, 2024): Knowledge Distillation
- Enhancing Adversarial Training with Prior Knowledge Distillation for Robust Image Compression (Mar 11, 2024): Backdoor Attack, Image Compression
- Convolutional Neural Network Compression through Generalized Kronecker Product Decomposition (Sep 29, 2021): Image Classification
- Compressing Image-to-Image Translation GANs Using Local Density Structures on Their Learned Manifold (Dec 22, 2023): Density Estimation, Image-to-Image Translation
- Faster Inference of Integer SWIN Transformer by Removing the GELU Activation (Feb 2, 2024): GPU, Image Classification
- Compressing GANs using Knowledge Distillation (Feb 1, 2019): Knowledge Distillation, Super-Resolution
- Enhancing Action Recognition from Low-Quality Skeleton Data via Part-Level Knowledge Distillation (Apr 28, 2024): Action Recognition, General Knowledge
- Fast Real-time Personalized Speech Enhancement: End-to-End Enhancement Network (E3Net) and Knowledge Distillation (Apr 2, 2022): Automatic Speech Recognition (ASR)
- Fast Sampling Through The Reuse Of Attention Maps In Diffusion Models (Dec 13, 2023): Image Generation, Knowledge Distillation
- A Generalized and Robust Method Towards Practical Gaze Estimation on Smart Phone (Oct 16, 2019): Gaze Estimation, Knowledge Distillation
- FastSR-NeRF: Improving NeRF Efficiency on Consumer Devices with A Simple Super-Resolution Pipeline (Dec 15, 2023): GPU, Knowledge Distillation
- Fast Streaming Transducer ASR Prototyping via Knowledge Distillation with Whisper (Sep 20, 2024): Automatic Speech Recognition (ASR)
- Galileo at SemEval-2020 Task 12: Multi-lingual Learning for Offensive Language Identification using Pre-trained Language Models (Oct 7, 2020): All, Knowledge Distillation
- Enhancing Accuracy and Parameter-Efficiency of Neural Representations for Network Parameterization (Jun 29, 2024): Knowledge Distillation