An Efficient Method of Training Small Models for Regression Problems with Knowledge Distillation (Feb 28, 2020) [Knowledge Distillation, Memorization]
DiffusionTalker: Personalization and Acceleration for Speech-Driven 3D Face Diffuser (Nov 28, 2023) [3D Face Animation, Contrastive Learning]
Towards Complementary Knowledge Distillation for Efficient Dense Image Prediction (Jan 24, 2024) [Implicit Relations, Instance Segmentation]
Diffusion-Augmented Coreset Expansion for Scalable Dataset Distillation (Dec 5, 2024) [Bilevel Optimization, Computational Efficiency]
Improving Neural Ranking via Lossless Knowledge Distillation (Sep 30, 2021) [Knowledge Distillation, Learning-To-Rank]
Diffusion Glancing Transformer for Parallel Sequence to Sequence Learning (Dec 20, 2022) [Knowledge Distillation, Machine Translation]
Differentiable Feature Aggregation Search for Knowledge Distillation (Aug 2, 2020) [Knowledge Distillation, Model Compression]
DiDOTS: Knowledge Distillation from Large-Language-Models for Dementia Obfuscation in Transcribed Speech (Oct 5, 2024) [Hallucination, Knowledge Distillation]
An Efficient Federated Distillation Learning System for Multi-task Time Series Classification (Dec 30, 2021) [Knowledge Distillation, Time Series]
Add a SideNet to your MainNet (Jul 14, 2020) [General Classification, Knowledge Distillation]
Bootstrapping Chest CT Image Understanding by Distilling Knowledge from X-ray Expert Models (Apr 7, 2024) [Contrastive Learning, Diagnostic]
DFM: Dialogue Foundation Model for Universal Large-Scale Dialogue-Oriented Task Learning (May 25, 2022) [Dialogue Generation, Diversity]
Bootstrapped Representation Learning for Skeleton-Based Action Recognition (Feb 4, 2022) [Action Recognition, Data Augmentation]
An Efficient Detection and Control System for Underwater Docking using Machine Learning and Realistic Simulation: A Comprehensive Approach (Nov 2, 2023) [Generative Adversarial Network, Image-to-Image Translation]
Dialect Identification through Adversarial Learning and Knowledge Distillation on Romanian BERT (Apr 1, 2021) [Automatic Speech Recognition (ASR)]
DiagrammaticLearning: A Graphical Language for Compositional Training Regimes (Jan 2, 2025) [Knowledge Distillation, Multi-Task Learning]
BOOT: Data-free Distillation of Denoising Diffusion Models with Bootstrapping (Jun 8, 2023) [Denoising, Knowledge Distillation]
DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning (Sep 24, 2023) [Data-free Knowledge Distillation, Diversity]
Boost Vision Transformer with GPU-Friendly Sparsity and Quantization (May 18, 2023) [Benchmarking, GPU]
An Efficient Active Learning Pipeline for Legal Text Classification (Nov 15, 2022) [Active Learning, Classification]
DFMSD: Dual Feature Masking Stage-wise Knowledge Distillation for Object Detection (Jul 18, 2024) [Knowledge Distillation, Object]
DeViT: Decomposing Vision Transformers for Collaborative Inference in Edge Devices (Sep 10, 2023) [Collaborative Inference, GPU]
Device-Directed Speech Detection: Regularization via Distillation for Weakly-Supervised Models (Mar 30, 2022) [Knowledge Distillation]
Boosting Self-Supervision for Single-View Scene Completion via Knowledge Distillation (Apr 11, 2024) [Depth Estimation, Depth Prediction]
Developing Multi-Task Recommendations with Long-Term Rewards via Policy Distilled Reinforcement Learning (Jan 27, 2020) [Deep Reinforcement Learning, Knowledge Distillation]
DETRDistill: A Universal Knowledge Distillation Framework for DETR-families (Nov 17, 2022) [Knowledge Distillation, Object Detection]
Detecting Optimism in Tweets using Knowledge Distillation and Linguistic Analysis of Optimism (Jun 1, 2022) [Hate Speech Detection, Knowledge Distillation]
An Effective Deep Network for Head Pose Estimation without Keypoints (Oct 25, 2022) [Gaze Estimation, Head Pose Estimation]
Analyzing the Importance of Blank for CTC-Based Knowledge Distillation (Jun 2, 2025) [Automatic Speech Recognition, Knowledge Distillation]
A Cohesive Distillation Architecture for Neural Language Models (Jan 12, 2023) [Knowledge Distillation, Language Modeling]
DistillGrasp: Integrating Features Correlation with Knowledge Distillation for Depth Completion of Transparent Objects (Aug 1, 2024) [Depth Completion, Feature Correlation]
Designing Parameter and Compute Efficient Diffusion Transformers using Distillation (Feb 20, 2025) [Knowledge Distillation, NVIDIA Jetson Orin Nano]
Designing an Improved Deep Learning-based Model for COVID-19 Recognition in Chest X-ray Images: A Knowledge Distillation Approach (Jan 6, 2023) [Knowledge Distillation]
Designing and Training of Lightweight Neural Networks on Edge Devices using Early Halting in Knowledge Distillation (Sep 30, 2022) [Knowledge Distillation]
Boosting Lossless Speculative Decoding via Feature Sampling and Partial Alignment Distillation (Aug 28, 2024) [Knowledge Distillation, Language Modelling]
DεpS: Delayed ε-Shrinking for Faster Once-For-All Training (Jul 8, 2024) [All, GPU]
Deploying a BERT-based Query-Title Relevance Classifier in a Production System: a View from the Trenches (Aug 23, 2021) [CPU, Data Augmentation]
Boosting Graph Neural Networks via Adaptive Knowledge Distillation (Oct 12, 2022) [Graph Classification, Graph Mining]
Analyzing Knowledge Distillation in Neural Machine Translation (Oct 1, 2018) [Knowledge Distillation, Machine Translation]
Densely Distilling Cumulative Knowledge for Continual Learning (May 16, 2024) [All, Continual Learning]
Boosting Contrastive Learning with Relation Knowledge Distillation (Dec 8, 2021) [Contrastive Learning, Knowledge Distillation]
Denoising Mutual Knowledge Distillation in Bi-Directional Multiple Instance Learning (May 17, 2025) [Denoising, Image Classification]
BoostingBERT: Integrating Multi-Class Boosting into BERT for NLP Tasks (Sep 13, 2020) [Ensemble Learning, Knowledge Distillation]
Analyzing Compression Techniques for Computer Vision (May 14, 2023) [Knowledge Distillation, Quantization]
Demystifying Catastrophic Forgetting in Two-Stage Incremental Object Detector (Feb 8, 2025) [Incremental Learning, Knowledge Distillation]
Delving Deep into Semantic Relation Distillation (Mar 27, 2025) [Knowledge Distillation, Model Compression]
Boosting Accuracy and Robustness of Student Models via Adaptive Adversarial Distillation (Jan 1, 2023) [Adversarial Robustness, Knowledge Distillation]
BOLT: Bootstrap Long Chain-of-Thought in Language Models without Distillation (Feb 6, 2025) [In-Context Learning, Knowledge Distillation]
An Active Learning Framework for Inclusive Generation by Large Language Models (Oct 17, 2024) [Active Learning, Clustering]
Adaptive Regularization of Labels (Aug 15, 2019) [Data Augmentation, Knowledge Distillation]