EchoAtt: Attend, Copy, then Adjust for More Efficient Large Language Models (Sep 22, 2024) [Knowledge Distillation]
Addressing Bias Through Ensemble Learning and Regularized Fine-Tuning (Feb 1, 2024) [Ensemble Learning, Knowledge Distillation]
DiffusionTalker: Personalization and Acceleration for Speech-Driven 3D Face Diffuser (Nov 28, 2023) [3D Face Animation, Contrastive Learning]
Towards Complementary Knowledge Distillation for Efficient Dense Image Prediction (Jan 24, 2024) [Implicit Relations, Instance Segmentation]
Diffusion-Augmented Coreset Expansion for Scalable Dataset Distillation (Dec 5, 2024) [Bilevel Optimization, Computational Efficiency]
Disentanglement, Visualization and Analysis of Complex Features in DNNs (Jan 1, 2021) [Disentanglement, Knowledge Distillation]
Improving Neural Ranking via Lossless Knowledge Distillation (Sep 30, 2021) [Knowledge Distillation, Learning-To-Rank]
An Efficient Federated Distillation Learning System for Multi-task Time Series Classification (Dec 30, 2021) [Knowledge Distillation, Time Series]
DualDE: Dually Distilling Knowledge Graph Embedding for Faster and Cheaper Reasoning (Sep 13, 2020) [Graph Embedding, Knowledge Distillation]
Bridging the gap between Human Action Recognition and Online Action Detection (Jan 21, 2021) [Action Detection, Action Recognition]
Distill, Adapt, Distill: Training Small, In-Domain Models for Neural Machine Translation (Mar 5, 2020) [Domain Adaptation, Knowledge Distillation]
Distill and De-bias: Mitigating Bias in Face Verification using Knowledge Distillation (Dec 17, 2021) [Attribute, Face Recognition]
Diffusion Glancing Transformer for Parallel Sequence to Sequence Learning (Dec 20, 2022) [Knowledge Distillation, Machine Translation]
A Cohesive Distillation Architecture for Neural Language Models (Jan 12, 2023) [Knowledge Distillation, Language Modeling]
Differentiable Feature Aggregation Search for Knowledge Distillation (Aug 2, 2020) [Knowledge Distillation, Model Compression]
DiDOTS: Knowledge Distillation from Large-Language-Models for Dementia Obfuscation in Transcribed Speech (Oct 5, 2024) [Hallucination, Knowledge Distillation]
Knowledge Distillation Decision Tree for Unravelling Black-box Machine Learning Models (Jun 9, 2022) [Knowledge Distillation]
Distillation-Enabled Knowledge Alignment for Generative Semantic Communications in AIGC Provisioning Tasks (Jun 24, 2025) [Knowledge Distillation, Semantic Communication]
Distillation-Enhanced Physical Adversarial Attacks (Jan 4, 2025) [Adversarial Attack, Knowledge Distillation]
ECAT: A Entire space Continual and Adaptive Transfer Learning Framework for Cross-Domain Recommendation (Jul 2, 2024) [Domain Adaptation, Knowledge Distillation]
StableMamba: Distillation-free Scaling of Large SSMs for Images and Videos (Sep 18, 2024) [Action Recognition, image-classification]
Bootstrapping Chest CT Image Understanding by Distilling Knowledge from X-ray Expert Models (Apr 7, 2024) [Contrastive Learning, Diagnostic]
DFM: Dialogue Foundation Model for Universal Large-Scale Dialogue-Oriented Task Learning (May 25, 2022) [Dialogue Generation, Diversity]
Bootstrapped Representation Learning for Skeleton-Based Action Recognition (Feb 4, 2022) [Action Recognition, Data Augmentation]
An Efficient Detection and Control System for Underwater Docking using Machine Learning and Realistic Simulation: A Comprehensive Approach (Nov 2, 2023) [Generative Adversarial Network, Image-to-Image Translation]
Dialect Identification through Adversarial Learning and Knowledge Distillation on Romanian BERT (Apr 1, 2021) [Automatic Speech Recognition (ASR)]
DiagrammaticLearning: A Graphical Language for Compositional Training Regimes (Jan 2, 2025) [Knowledge Distillation, Multi-Task Learning]
BOOT: Data-free Distillation of Denoising Diffusion Models with Bootstrapping (Jun 8, 2023) [Denoising, Knowledge Distillation]
DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning (Sep 24, 2023) [Data-free Knowledge Distillation, Diversity]
Boost Vision Transformer with GPU-Friendly Sparsity and Quantization (May 18, 2023) [Benchmarking, GPU]
An Efficient Active Learning Pipeline for Legal Text Classification (Nov 15, 2022) [Active Learning, Classification]
DistillGrasp: Integrating Features Correlation with Knowledge Distillation for Depth Completion of Transparent Objects (Aug 1, 2024) [Depth Completion, Feature Correlation]
DFMSD: Dual Feature Masking Stage-wise Knowledge Distillation for Object Detection (Jul 18, 2024) [Knowledge Distillation, Object]
An Effective Deep Network for Head Pose Estimation without Keypoints (Oct 25, 2022) [Gaze Estimation, Head Pose Estimation]
DeViT: Decomposing Vision Transformers for Collaborative Inference in Edge Devices (Sep 10, 2023) [Collaborative Inference, GPU]
Device-Directed Speech Detection: Regularization via Distillation for Weakly-Supervised Models (Mar 30, 2022) [Knowledge Distillation]
Boosting Self-Supervision for Single-View Scene Completion via Knowledge Distillation (Apr 11, 2024) [Depth Estimation, Depth Prediction]
Deep Face Recognition Model Compression via Knowledge Transfer and Distillation (Jun 3, 2019) [Face Recognition, Knowledge Distillation]
Developing Multi-Task Recommendations with Long-Term Rewards via Policy Distilled Reinforcement Learning (Jan 27, 2020) [Deep Reinforcement Learning, Knowledge Distillation]
DETRDistill: A Universal Knowledge Distillation Framework for DETR-families (Nov 17, 2022) [Knowledge Distillation, object-detection]
Detecting Optimism in Tweets using Knowledge Distillation and Linguistic Analysis of Optimism (Jun 1, 2022) [Hate Speech Detection, Knowledge Distillation]
Analyzing the Importance of Blank for CTC-Based Knowledge Distillation (Jun 2, 2025) [Automatic Speech Recognition, Knowledge Distillation]
Dynamic Y-KD: A Hybrid Approach to Continual Instance Segmentation (Mar 10, 2023) [Continual Learning, Incremental Learning]
EasyDistill: A Comprehensive Toolkit for Effective Knowledge Distillation of Large Language Models (May 27, 2025) [Knowledge Distillation]
EasyNLP: A Comprehensive and Easy-to-use Toolkit for Natural Language Processing (Apr 30, 2022) [Few-Shot Learning, Knowledge Distillation]
EchoLM: Accelerating LLM Serving with Real-time Knowledge Distillation (Jan 22, 2025) [Knowledge Distillation, Response Generation]
Boosting Lossless Speculative Decoding via Feature Sampling and Partial Alignment Distillation (Aug 28, 2024) [Knowledge Distillation, Language Modelling]
Designing Parameter and Compute Efficient Diffusion Transformers using Distillation (Feb 20, 2025) [Knowledge Distillation, NVIDIA Jetson Orin Nano]
A Closer Look at Wav2Vec2 Embeddings for On-Device Single-Channel Speech Enhancement (Mar 3, 2024) [Automatic Speech Recognition, Keyword Spotting]
Designing an Improved Deep Learning-based Model for COVID-19 Recognition in Chest X-ray Images: A Knowledge Distillation Approach (Jan 6, 2023) [Knowledge Distillation]