Continual Learning for Neural Machine Translation · Jun 1, 2021 · Continual Learning, Knowledge Distillation
AIDE: Agentically Improve Visual Language Model with Domain Experts · Feb 13, 2025 · Knowledge Distillation, Language Modeling
A Unified Compression Framework for Efficient Speech-Driven Talking-Face Generation · Apr 2, 2023 · Face Generation, Knowledge Distillation
Continual Learning for Fake Audio Detection · Apr 15, 2021 · Continual Learning, Knowledge Distillation
Continual Learning for Class- and Domain-Incremental Semantic Segmentation · Sep 16, 2022 · Class-Incremental Learning
Augmenting Knowledge Distillation With Peer-To-Peer Mutual Learning For Model Compression · Oct 21, 2021 · Knowledge Distillation, Model Compression
Continual Face Forgery Detection via Historical Distribution Preserving · Aug 11, 2023 · Knowledge Distillation
Augmentation with Projection: Towards an Effective and Efficient Data Augmentation Paradigm for Distillation · Oct 21, 2022 · Data Augmentation, Diversity
Continual Distillation Learning: Knowledge Distillation in Prompt-based Continual Learning · Jul 18, 2024 · Continual Learning, Knowledge Distillation
Continual Detection Transformer for Incremental Object Detection · Apr 6, 2023 · Class-Incremental Object Detection, Knowledge Distillation
AI can evolve without labels: self-evolving vision transformer for chest X-ray diagnosis through knowledge distillation · Feb 13, 2022 · Deep Learning, Diagnostic
Adam: Dense Retrieval Distillation with Adaptive Dark Examples · Dec 20, 2022 · Knowledge Distillation, Retrieval
FedD2S: Personalized Data-Free Federated Knowledge Distillation · Feb 16, 2024 · Data-Free Knowledge Distillation, Fairness
FedDTG: Federated Data-Free Knowledge Distillation via Three-Player Generative Adversarial Networks · Jan 10, 2022 · Data-Free Knowledge Distillation, Federated Learning
Feature-map-level Online Adversarial Knowledge Distillation · Feb 5, 2020 · Knowledge Distillation
AdaKD: Dynamic Knowledge Distillation of ASR models using Adaptive Loss Weighting · May 11, 2024 · Knowledge Distillation, Model Compression
Feature-Rich Audio Model Inversion for Data-Free Knowledge Distillation Towards General Sound Classification · Mar 14, 2023 · Data-Free Knowledge Distillation, Knowledge Distillation
Audio Representation Learning by Distilling Video as Privileged Information · Feb 6, 2023 · Emotion Recognition, Knowledge Distillation
Contextual Knowledge Distillation for Transformer Compression · Jan 1, 2021 · Knowledge Distillation, Language Modeling
A Good Student is Cooperative and Reliable: CNN-Transformer Collaborative Learning for Semantic Segmentation · Jul 24, 2023 · Knowledge Distillation, Semantic Segmentation
Contextualized Attention-based Knowledge Transfer for Spoken Conversational Question Answering · Oct 21, 2020 · Audio Signal Processing, Conversational Question Answering
Contextual Distillation Model for Diversified Recommendation · Jun 13, 2024 · Diversity, Knowledge Distillation
Audio-Oriented Multimodal Machine Comprehension: Task, Dataset and Model · Jul 4, 2021 · Knowledge Distillation, Machine Reading Comprehension
Feature Kernel Distillation · Sep 29, 2021 · Image Classification
Feature Structure Distillation for BERT Transferring · Nov 16, 2021 · Knowledge Distillation
Contextual Affinity Distillation for Image Anomaly Detection · Jul 6, 2023 · Anomaly Detection, Knowledge Distillation
Feature Distillation is the Better Choice for Model-Heterogeneous Federated Learning · Jul 14, 2025 · Federated Learning, Knowledge Distillation
Feature Fusion and Knowledge-Distilled Multi-Modal Multi-Target Detection · May 31, 2025 · Domain Adaptation, Knowledge Distillation
Constructing Deep Spiking Neural Networks from Artificial Neural Networks with Knowledge Distillation · Apr 12, 2023 · Knowledge Distillation
A Gift From Knowledge Distillation: Fast Optimization, Network Minimization and Transfer Learning · Jul 1, 2017 · Knowledge Distillation, Transfer Learning
Inference Optimizations for Large Language Models: Effects, Challenges, and Practical Considerations · Aug 6, 2024 · Knowledge Distillation
Feature Interaction Fusion Self-Distillation Network For CTR Prediction · Nov 12, 2024 · Click-Through Rate Prediction, Knowledge Distillation
MKD: a Multi-Task Knowledge Distillation Approach for Pretrained Language Models · Nov 9, 2019 · Knowledge Distillation, Multi-Task Learning
Conformer with dual-mode chunked attention for joint online and offline ASR · Jun 22, 2022 · Knowledge Distillation
Agglomerating Large Vision Encoders via Distillation for VFSS Segmentation · Apr 3, 2025 · Image Segmentation, Knowledge Distillation
Configurable Holography: Towards Display and Scene Adaptation · Mar 24, 2024 · Depth Estimation, Knowledge Distillation
Confidence Preservation Property in Knowledge Distillation Abstractions · Jan 21, 2024 · Classification, Knowledge Distillation
AttentionLite: Towards Efficient Self-Attention Models for Vision · Dec 21, 2020 · Knowledge Distillation
Ada-DQA: Adaptive Diverse Quality-aware Feature Acquisition for Video Quality Assessment · Aug 1, 2023 · Diversity, Knowledge Distillation
ACAM-KD: Adaptive and Cooperative Attention Masking for Knowledge Distillation · Mar 8, 2025 · Autonomous Driving, Feature Selection
Confidence Conditioned Knowledge Distillation · Jul 6, 2021 · Knowledge Distillation
Confidence Based Bidirectional Global Context Aware Training Framework for Neural Machine Translation · Feb 28, 2022 · Decoder, Knowledge Distillation
Attention is all you need for boosting graph convolutional neural network · Mar 10, 2024 · Knowledge Distillation
Confidence Attention and Generalization Enhanced Distillation for Continuous Video Domain Adaptation · Mar 18, 2023 · Autonomous Driving, Domain Adaptation
Attention-guided Feature Distillation for Semantic Segmentation · Mar 8, 2024 · Knowledge Distillation, Segmentation
AgentDistill: Training-Free Agent Distillation with Generalizable MCP Boxes · Jun 17, 2025 · Knowledge Distillation, Transfer Learning
Conditional Generative Data-free Knowledge Distillation · Dec 31, 2021 · Conditional Image Generation, Data-Free Knowledge Distillation
Attention-Guided Answer Distillation for Machine Reading Comprehension · Aug 23, 2018 · Knowledge Distillation, Machine Reading Comprehension
Conditional Autoregressors are Interpretable Classifiers · Mar 31, 2022 · Classification, Image Classification
A Generative Framework for Personalized Learning and Estimation: Theory, Algorithms, and Privacy · Jul 5, 2022 · Federated Learning, Knowledge Distillation