Dynamic Self-Distillation via Previous Mini-batches for Fine-tuning Small Language Models Nov 25, 2024 Knowledge Distillation Natural Language Understanding
Dynamic Textual Prompt For Rehearsal-free Lifelong Person Re-identification Nov 9, 2024 Knowledge Distillation Person Re-Identification
Dynamic Transformer Architecture for Continual Learning of Multimodal Tasks Jan 27, 2024 Continual Learning Edge-computing
Dynamic Y-KD: A Hybrid Approach to Continual Instance Segmentation Mar 10, 2023 Continual Learning Incremental Learning
EasyDistill: A Comprehensive Toolkit for Effective Knowledge Distillation of Large Language Models May 27, 2025 Knowledge Distillation
EasyNLP: A Comprehensive and Easy-to-use Toolkit for Natural Language Processing Apr 30, 2022 Few-Shot Learning Knowledge Distillation
ECAT: A Entire space Continual and Adaptive Transfer Learning Framework for Cross-Domain Recommendation Jul 2, 2024 Domain Adaptation Knowledge Distillation
ECG-guided individual identification via PPG Dec 30, 2024 Knowledge Distillation
EchoAtt: Attend, Copy, then Adjust for More Efficient Large Language Models Sep 22, 2024 Knowledge Distillation
EchoLM: Accelerating LLM Serving with Real-time Knowledge Distillation Jan 22, 2025 Knowledge Distillation Response Generation
Edge AI-Enabled Chicken Health Detection Based on Enhanced FCOS-Lite and Knowledge Distillation Jul 3, 2024 Knowledge Distillation Quantization
Edge-Efficient Deep Learning Models for Automatic Modulation Classification: A Performance Analysis Apr 11, 2024 Knowledge Distillation Model Optimization
EdgeFormer: A Parameter-Efficient Transformer for On-Device Seq2seq Generation Feb 16, 2022 Grammatical Error Correction Knowledge Distillation
Edge-free but Structure-aware: Prototype-Guided Knowledge Distillation from GNNs to MLPs Mar 24, 2023 Knowledge Distillation
EdgeFusion: On-Device Text-to-Image Generation Apr 18, 2024 Image Generation Knowledge Distillation
EDocNet: Efficient Datasheet Layout Analysis Based on Focus and Global Knowledge Distillation Feb 23, 2025 Document Layout Analysis Knowledge Distillation
Education distillation:getting student models to learn in shcools Nov 23, 2023 Incremental Learning Knowledge Distillation
EduPal leaves no professor behind: Supporting faculty via a peer-powered recommender system Apr 20, 2021 Chatbot Knowledge Distillation
EEGMobile: Enhancing Speed and Accuracy in EEG-Based Gaze Prediction with Advanced Mobile Architectures Aug 6, 2024 Brain Computer Interface EEG
EFCM: Efficient Fine-tuning on Compressed Models for deployment of large models in medical image analysis Sep 18, 2024 Knowledge Distillation Medical Image Analysis
Effective Decision Boundary Learning for Class Incremental Learning Jan 12, 2023 Class Incremental Learning
Effectiveness of Arbitrary Transfer Sets for Data-free Knowledge Distillation Nov 18, 2020 Data-free Knowledge Distillation Knowledge Distillation
Effectiveness of Function Matching in Driving Scene Recognition Aug 20, 2022 Autonomous Driving Image Classification
Effective Training of Convolutional Neural Networks with Low-bitwidth Weights and Activations Aug 10, 2019 Knowledge Distillation Quantization
Efficiency optimization of large-scale language models based on deep learning in natural language processing tasks May 20, 2024 Inference Optimization Knowledge Distillation
Efficient AI in Practice: Training and Deployment of Efficient LLMs for Industry Applications Feb 20, 2025 Knowledge Distillation Model Compression
Efficient and Robust Knowledge Distillation from A Stronger Teacher Based on Correlation Matching Oct 9, 2024 Knowledge Distillation Neural Network Compression
Efficient Compression of Multitask Multilingual Speech Models May 2, 2024 Automatic Speech Recognition (ASR)
Efficient Controllable Multi-Task Architectures Aug 22, 2023 Decoder Knowledge Distillation
Efficient Convolutional Neural Networks for Depth-Based Multi-Person Pose Estimation Dec 2, 2019 2D Pose Estimation Domain Adaptation
Efficient Evaluation-Time Uncertainty Estimation by Improved Distillation Jun 12, 2019 Knowledge Distillation
Efficient Federated Learning for AIoT Applications Using Knowledge Distillation Nov 29, 2021 Federated Learning Knowledge Distillation
Efficient Gravitational Wave Parameter Estimation via Knowledge Distillation: A ResNet1D-IAF Approach Dec 11, 2024 Astronomy Computational Efficiency
Efficient Hybrid Language Model Compression through Group-Aware SSM Pruning Apr 15, 2025 Knowledge Distillation Language Modeling
Efficient Image Compression Using Advanced State Space Models Sep 4, 2024 Computational Efficiency Image Compression
Efficient Inference via Universal LSH Kernel Jun 21, 2021 Knowledge Distillation Quantization
Efficient Intent-Based Filtering for Multi-Party Conversations Using Knowledge Distillation from LLMs Mar 21, 2025 Intent Classification
Efficient Knowledge Distillation: Empowering Small Language Models with Teacher Model Insights Sep 19, 2024 Decision Making Knowledge Distillation
Efficient Knowledge Distillation of SAM for Medical Image Segmentation Jan 28, 2025 Computational Efficiency Decoder
Efficient Knowledge Distillation via Curriculum Extraction Mar 21, 2025 Knowledge Distillation Language Modeling
Efficient Machine Translation with Model Pruning and Quantization Nov 1, 2021 CPU Decoder
Efficient Object Detection in Optical Remote Sensing Imagery via Attention-based Feature Distillation Oct 28, 2023 Knowledge Distillation Object Detection
Efficient Open-world Reinforcement Learning via Knowledge Distillation and Autonomous Rule Discovery Nov 24, 2023 Deep Reinforcement Learning Knowledge Distillation
Efficient Point Cloud Classification via Offline Distillation Framework and Negative-Weight Self-Distillation Technique Sep 3, 2024 Data Augmentation Knowledge Distillation
Efficient Speech Command Recognition Leveraging Spiking Neural Network and Curriculum Learning-based Knowledge Distillation Dec 17, 2024 Edge-computing Knowledge Distillation
Efficient speech detection in environmental audio using acoustic recognition and knowledge distillation Dec 14, 2023 Knowledge Distillation Model Selection
Efficient Technical Term Translation: A Knowledge Distillation Approach for Parenthetical Terminology Translation Oct 1, 2024 Knowledge Distillation Machine Translation
Efficient training of lightweight neural networks using Online Self-Acquired Knowledge Distillation Aug 26, 2021 Density Estimation Knowledge Distillation
Efficient Transformer-based Large Scale Language Representations using Hardware-friendly Block Structured Pruning Sep 17, 2020 Edge-computing Knowledge Distillation
Efficient Transformer Knowledge Distillation: A Performance Review Nov 22, 2023 Knowledge Distillation Model Compression