- Dual-Student Knowledge Distillation Networks for Unsupervised Anomaly Detection (Feb 1, 2024). Tags: Anomaly Detection, Anomaly Segmentation
- Dual-Teacher Class-Incremental Learning With Data-Free Generative Replay (Jun 17, 2021). Tags: Class-Incremental Learning
- Dual-Teacher: Integrating Intra-domain and Inter-domain Teachers for Annotation-efficient Cardiac Segmentation (Jul 13, 2020). Tags: Cardiac Segmentation, Domain Adaptation
- Dual Teacher Knowledge Distillation with Domain Alignment for Face Anti-spoofing (Jan 2, 2024). Tags: Adversarial Attack, Face Anti-Spoofing
- DualVC 2: Dynamic Masked Convolution for Unified Streaming and Non-Streaming Voice Conversion (Sep 27, 2023). Tags: Decoder, Knowledge Distillation
- DualVC: Dual-mode Voice Conversion using Intra-model Knowledge Distillation and Hybrid Predictive Coding (May 21, 2023). Tags: Data Augmentation, Decoder
- DuckSegmentation: A segmentation model based on the AnYue Hemp Duck Dataset (Mar 27, 2025). Tags: Knowledge Distillation, Object Recognition
- DVFL: A Vertical Federated Learning Method for Dynamic Data (Nov 5, 2021). Tags: Federated Learning, Knowledge Distillation
- DyLiN: Making Light Field Networks Dynamic (Mar 24, 2023). Tags: Attribute, Knowledge Distillation
- Dynamic Activation with Knowledge Distillation for Energy-Efficient Spiking NN Ensembles (Feb 19, 2025). Tags: Disentanglement, Ensemble Learning
- Dynamically pruning segformer for efficient semantic segmentation (Nov 18, 2021). Tags: Knowledge Distillation, Segmentation
- DynamicKD: An Effective Knowledge Distillation via Dynamic Entropy Correction-Based Distillation for Gap Optimizing (May 9, 2023). Tags: Knowledge Distillation
- Dynamic Knowledge Distillation for Black-box Hypothesis Transfer Learning (Jul 24, 2020). Tags: Knowledge Distillation, Transfer Learning
- Dynamic Knowledge Distillation With Noise Elimination for RGB-D Salient Object Detection (Jun 17, 2021). Tags: Knowledge Distillation, Object Detection
- Dynamic Low-Resolution Distillation for Cost-Efficient End-to-End Text Spotting (Jul 14, 2022). Tags: Global Optimization, Knowledge Distillation
- Dynamics-Adaptive Continual Reinforcement Learning via Progressive Contextualization (Sep 1, 2022). Tags: Bayesian Inference, Knowledge Distillation
- Dynamic Self-Distillation via Previous Mini-batches for Fine-tuning Small Language Models (Nov 25, 2024). Tags: Knowledge Distillation, Natural Language Understanding
- Dynamic Textual Prompt For Rehearsal-free Lifelong Person Re-identification (Nov 9, 2024). Tags: Knowledge Distillation, Person Re-Identification
- Dynamic Transformer Architecture for Continual Learning of Multimodal Tasks (Jan 27, 2024). Tags: Continual Learning, Edge Computing
- Dynamic Y-KD: A Hybrid Approach to Continual Instance Segmentation (Mar 10, 2023). Tags: Continual Learning, Incremental Learning
- EasyDistill: A Comprehensive Toolkit for Effective Knowledge Distillation of Large Language Models (May 27, 2025). Tags: Knowledge Distillation
- EasyNLP: A Comprehensive and Easy-to-use Toolkit for Natural Language Processing (Apr 30, 2022). Tags: Few-Shot Learning, Knowledge Distillation
- ECAT: A Entire space Continual and Adaptive Transfer Learning Framework for Cross-Domain Recommendation (Jul 2, 2024). Tags: Domain Adaptation, Knowledge Distillation
- ECG-guided individual identification via PPG (Dec 30, 2024). Tags: Knowledge Distillation
- EchoAtt: Attend, Copy, then Adjust for More Efficient Large Language Models (Sep 22, 2024). Tags: Knowledge Distillation
- EchoLM: Accelerating LLM Serving with Real-time Knowledge Distillation (Jan 22, 2025). Tags: Knowledge Distillation, Response Generation
- Edge AI-Enabled Chicken Health Detection Based on Enhanced FCOS-Lite and Knowledge Distillation (Jul 3, 2024). Tags: Knowledge Distillation, Quantization
- Edge-Efficient Deep Learning Models for Automatic Modulation Classification: A Performance Analysis (Apr 11, 2024). Tags: Knowledge Distillation, Model Optimization
- EdgeFormer: A Parameter-Efficient Transformer for On-Device Seq2seq Generation (Feb 16, 2022). Tags: Grammatical Error Correction, Knowledge Distillation
- Edge-free but Structure-aware: Prototype-Guided Knowledge Distillation from GNNs to MLPs (Mar 24, 2023). Tags: Knowledge Distillation
- EdgeFusion: On-Device Text-to-Image Generation (Apr 18, 2024). Tags: Image Generation, Knowledge Distillation
- EDocNet: Efficient Datasheet Layout Analysis Based on Focus and Global Knowledge Distillation (Feb 23, 2025). Tags: Document Layout Analysis, Knowledge Distillation
- Education distillation: getting student models to learn in schools (Nov 23, 2023). Tags: Incremental Learning, Knowledge Distillation
- EduPal leaves no professor behind: Supporting faculty via a peer-powered recommender system (Apr 20, 2021). Tags: Chatbot, Knowledge Distillation
- EEGMobile: Enhancing Speed and Accuracy in EEG-Based Gaze Prediction with Advanced Mobile Architectures (Aug 6, 2024). Tags: Brain Computer Interface, EEG
- EFCM: Efficient Fine-tuning on Compressed Models for deployment of large models in medical image analysis (Sep 18, 2024). Tags: Knowledge Distillation, Medical Image Analysis
- Effective Decision Boundary Learning for Class Incremental Learning (Jan 12, 2023). Tags: Class-Incremental Learning
- Effectiveness of Arbitrary Transfer Sets for Data-free Knowledge Distillation (Nov 18, 2020). Tags: Data-Free Knowledge Distillation, Knowledge Distillation
- Effectiveness of Function Matching in Driving Scene Recognition (Aug 20, 2022). Tags: Autonomous Driving, Image Classification
- Effective Training of Convolutional Neural Networks with Low-bitwidth Weights and Activations (Aug 10, 2019). Tags: Knowledge Distillation, Quantization
- Efficiency optimization of large-scale language models based on deep learning in natural language processing tasks (May 20, 2024). Tags: Inference Optimization, Knowledge Distillation
- Efficient AI in Practice: Training and Deployment of Efficient LLMs for Industry Applications (Feb 20, 2025). Tags: Knowledge Distillation, Model Compression
- Efficient and Robust Knowledge Distillation from A Stronger Teacher Based on Correlation Matching (Oct 9, 2024). Tags: Knowledge Distillation, Neural Network Compression
- Efficient Compression of Multitask Multilingual Speech Models (May 2, 2024). Tags: Automatic Speech Recognition (ASR)
- Efficient Controllable Multi-Task Architectures (Aug 22, 2023). Tags: Decoder, Knowledge Distillation
- Efficient Convolutional Neural Networks for Depth-Based Multi-Person Pose Estimation (Dec 2, 2019). Tags: 2D Pose Estimation, Domain Adaptation
- Efficient Evaluation-Time Uncertainty Estimation by Improved Distillation (Jun 12, 2019). Tags: Knowledge Distillation
- Efficient Federated Learning for AIoT Applications Using Knowledge Distillation (Nov 29, 2021). Tags: Federated Learning, Knowledge Distillation
- Efficient Gravitational Wave Parameter Estimation via Knowledge Distillation: A ResNet1D-IAF Approach (Dec 11, 2024). Tags: Astronomy, Computational Efficiency
- Efficient Hybrid Language Model Compression through Group-Aware SSM Pruning (Apr 15, 2025). Tags: Knowledge Distillation, Language Modeling