BJTU-WeChat's Systems for the WMT22 Chat Translation Task (Nov 28, 2022): Denoising, Knowledge Distillation
AMLN: Adversarial-based Mutual Learning Network for Online Knowledge Distillation (Aug 1, 2020): Knowledge Distillation, Transfer Learning
Emo Pillars: Knowledge Distillation to Support Fine-Grained Context-Aware and Context-Less Emotion Classification (Apr 23, 2025): Emotion Classification, GPU
Embedding Compression for Teacher-to-Student Knowledge Transfer (Feb 9, 2024): Knowledge Distillation, Transfer Learning
Deep Collective Knowledge Distillation (Apr 18, 2023): Knowledge Distillation, Model Compression
A metric learning approach for endoscopic kidney stone identification (Jul 13, 2023): Few-Shot Learning, Knowledge Distillation
A Closer Look at Rehearsal-Free Continual Learning (Mar 31, 2022): Continual Learning, Knowledge Distillation
AdvFunMatch: When Consistent Teaching Meets Adversarial Robustness (May 24, 2023): Adversarial Robustness, Knowledge Distillation
A method for estimating forest carbon storage distribution density via artificial intelligence generated content model (Feb 2, 2025): Knowledge Distillation
Adaptive Multiplane Image Generation from a Single Internet Picture (Nov 26, 2020): Depth Estimation, Image Generation
Embedded Knowledge Distillation in Depth-Level Dynamic Neural Network (Mar 1, 2021): Dynamic Neural Networks, Knowledge Distillation
EmbedDistill: A Geometric Knowledge Distillation for Information Retrieval (Jan 27, 2023): Information Retrieval, Knowledge Distillation
Knowledge distillation for optimization of quantized deep neural networks (Sep 4, 2019): Knowledge Distillation
Decoupling Dark Knowledge via Block-wise Logit Distillation for Feature-level Alignment (Nov 3, 2024): Knowledge Distillation, Philosophy
Decouple Non-parametric Knowledge Distillation For End-to-end Speech Translation (Apr 20, 2023): Knowledge Distillation, Machine Translation
Decoupled Transformer for Scalable Inference in Open-domain Question Answering (Sep 1, 2021): Knowledge Distillation, Machine Reading Comprehension
Biologically inspired structure learning with reverse knowledge distillation for spiking neural networks (Apr 19, 2023): Knowledge Distillation
AMD: Automatic Multi-step Distillation of Large-scale Vision Models (Jul 5, 2024): Image Classification
BiM-VFI: Bidirectional Motion Field-Guided Frame Interpolation for Video with Non-uniform Motions (Jan 1, 2025): Knowledge Distillation, Motion Estimation
AMD: Adaptive Masked Distillation for Object Detection (Jan 31, 2023): Knowledge Distillation, Model Compression
A Closer Look at Knowledge Distillation with Features, Logits, and Gradients (Mar 18, 2022): Incremental Learning, Knowledge Distillation
Decoupled Alignment for Robust Plug-and-Play Adaptation (Jun 3, 2024): Knowledge Distillation
De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts (Mar 28, 2024): Causal Inference, Data-free Knowledge Distillation
Bilateral Memory Consolidation for Continual Learning (Jan 1, 2023): Continual Learning, Knowledge Distillation
Always Strengthen Your Strengths: A Drift-Aware Incremental Learning Framework for CTR Prediction (Apr 17, 2023): Click-Through Rate Prediction, Diversity
Sentence-wise Speech Summarization: Task, Datasets, and End-to-End Modeling with LM Knowledge Distillation (Aug 1, 2024): Automatic Speech Recognition (ASR)
Decision Boundary-aware Knowledge Consolidation Generates Better Instance-Incremental Learner (Jun 5, 2024): Class Incremental Learning
Debias the Black-box: A Fair Ranking Framework via Knowledge Distillation (Aug 24, 2022): Fairness, Information Retrieval
Decentralized and Model-Free Federated Learning: Consensus-Based Distillation in Function Space (Apr 1, 2021): Federated Learning, Knowledge Distillation
Adaptively Integrated Knowledge Distillation and Prediction Uncertainty for Continual Learning (Jan 18, 2023): Continual Learning, Knowledge Distillation
ELAICHI: Enhancing Low-resource TTS by Addressing Infrequent and Low-frequency Character Bigrams (Oct 23, 2024): Automatic Speech Recognition (ASR)
ELiTe: Efficient Image-to-LiDAR Knowledge Transfer for Semantic Segmentation (May 7, 2024): Knowledge Distillation, LiDAR Semantic Segmentation
Empirical Evaluation of Knowledge Distillation from Transformers to Subquadratic Language Models (Apr 19, 2025): Knowledge Distillation, State Space Models
Enhancing Once-For-All: A Study on Parallel Blocks, Skip Connections and Early Exits (Feb 3, 2023): Knowledge Distillation
Debiased Distillation by Transplanting the Last Layer (Feb 22, 2023): Attribute, Knowledge Distillation
Debate, Reflect, and Distill: Multi-Agent Feedback with Tree-Structured Preference Optimization for Efficient Language Model Enhancement (Jun 4, 2025): Knowledge Distillation, Language Modeling
DearKD: Data-Efficient Early Knowledge Distillation for Vision Transformers (Apr 27, 2022): Knowledge Distillation
Dealing with training and test segmentation mismatch: FBK@IWSLT2021 (Jun 23, 2021): Action Detection, Activity Detection
Bi-CryptoNets: Leveraging Different-Level Privacy for Encrypted Inference (Feb 2, 2024): Knowledge Distillation, Privacy Preserving
Dealing with Missing Modalities in the Visual Question Answer-Difference Prediction Task through Knowledge Distillation (Apr 13, 2021): Knowledge Distillation, Triplet
ALP-KD: Attention-Based Layer Projection for Knowledge Distillation (Dec 27, 2020): Knowledge Distillation
Adaptive Label Smoothing with Self-Knowledge in Natural Language Generation (Oct 22, 2022): Knowledge Distillation, Text Generation
DDK: Distilling Domain Knowledge for Efficient Large Language Models (Jul 23, 2024): Knowledge Distillation
DCSNet: A Lightweight Knowledge Distillation-Based Model with Explainable AI for Lung Cancer Diagnosis from Histopathological Images (May 14, 2025): Diagnostic, Knowledge Distillation
Be Your Own Best Competitor! Multi-Branched Adversarial Knowledge Transfer (Oct 9, 2020): Decoder, Image Classification
ESGN: Efficient Stereo Geometry Network for Fast 3D Object Detection (Nov 28, 2021): 3D Object Detection, Knowledge Distillation
DC-CCL: Device-Cloud Collaborative Controlled Learning for Large Vision Models (Mar 18, 2023): Knowledge Distillation
Beyond the Tip of Efficiency: Uncovering the Submerged Threats of Jailbreak Attacks in Small Language Models (Feb 27, 2025): Knowledge Distillation, Model Compression
Efficient Video Segmentation Models with Per-frame Inference (Feb 24, 2022): Image Matting, Instance Segmentation