- Knowledge Distillation from Multiple Foundation Models for End-to-End Speech Recognition (Mar 20, 2023) · Automatic Speech Recognition · Automatic Speech Recognition (ASR)
- Efficient Compression of Multitask Multilingual Speech Models (May 2, 2024) · Automatic Speech Recognition · Automatic Speech Recognition (ASR)
- Collaborative Learning for Deep Neural Networks (May 30, 2018) · Knowledge Distillation · Multi-Task Learning
- Efficient and Robust Knowledge Distillation from A Stronger Teacher Based on Correlation Matching (Oct 9, 2024) · Knowledge Distillation · Neural Network Compression
- Collaborative Inter-agent Knowledge Distillation for Reinforcement Learning (Sep 25, 2019) · Decision Making · Knowledge Distillation
- A Survey of Techniques for Optimizing Transformer Inference (Jul 16, 2023) · Knowledge Distillation · Neural Architecture Search
- Adverse Weather Optical Flow: Cumulative Homogeneous-Heterogeneous Adaptation (Sep 25, 2024) · Domain Adaptation · Knowledge Distillation
- Efficient AI in Practice: Training and Deployment of Efficient LLMs for Industry Applications (Feb 20, 2025) · Knowledge Distillation · Model Compression
- Collaborative Distillation in the Parameter and Spectrum Domains for Video Action Recognition (Sep 15, 2020) · Action Recognition · Knowledge Distillation
- Efficiency optimization of large-scale language models based on deep learning in natural language processing tasks (May 20, 2024) · Inference Optimization · Knowledge Distillation
- A Survey of Model Compression and Acceleration for Deep Neural Networks (Oct 23, 2017) · Benchmarking · Knowledge Distillation
- A Bayesian Optimization Framework for Neural Network Compression (Oct 1, 2019) · Bayesian Optimization · Knowledge Distillation
- Effective Training of Convolutional Neural Networks with Low-bitwidth Weights and Activations (Aug 10, 2019) · Knowledge Distillation · Quantization
- Collaborative Distillation for Top-N Recommendation (Nov 13, 2019) · Collaborative Filtering · Knowledge Distillation
- Effectiveness of Function Matching in Driving Scene Recognition (Aug 20, 2022) · Autonomous Driving · image-classification
- A Survey of Methods for Low-Power Deep Learning and Computer Vision (Mar 24, 2020) · Knowledge Distillation · Quantization
- A Study on the Efficiency and Generalization of Light Hybrid Retrievers (Oct 4, 2022) · Adversarial Attack · Contrastive Learning
- Effectiveness of Arbitrary Transfer Sets for Data-free Knowledge Distillation (Nov 18, 2020) · Data-free Knowledge Distillation · Knowledge Distillation
- Effective Decision Boundary Learning for Class Incremental Learning (Jan 12, 2023) · class-incremental learning · Class Incremental Learning
- EFCM: Efficient Fine-tuning on Compressed Models for deployment of large models in medical image analysis (Sep 18, 2024) · Knowledge Distillation · Medical Image Analysis
- EEGMobile: Enhancing Speed and Accuracy in EEG-Based Gaze Prediction with Advanced Mobile Architectures (Aug 6, 2024) · Brain Computer Interface · EEG
- Cold & Warm Net: Addressing Cold-Start Users in Recommender Systems (Sep 27, 2023) · Knowledge Distillation · Meta-Learning
- Active Data Curation Effectively Distills Large-Scale Multimodal Models (Nov 27, 2024) · Decoder · Image Captioning
- Knowledge Distillation from Few Samples (Sep 27, 2018) · Knowledge Distillation
- Knowledge Distillation from Non-streaming to Streaming ASR Encoder using Auxiliary Non-streaming Layer (Aug 31, 2023) · Automatic Speech Recognition · Automatic Speech Recognition (ASR)
- Knowledge Distillation Neural Network for Predicting Car-following Behaviour of Human-driven and Autonomous Vehicles (Nov 8, 2024) · Autonomous Vehicles · Descriptive
- Knowledge Distillation via Weighted Ensemble of Teaching Assistants (Jun 23, 2022) · Ensemble Learning · Knowledge Distillation
- LAMeTA: Intent-Aware Agentic Network Optimization via a Large AI Model-Empowered Two-Stage Approach (May 18, 2025) · Deep Reinforcement Learning · Knowledge Distillation
- EduPal leaves no professor behind: Supporting faculty via a peer-powered recommender system (Apr 20, 2021) · Chatbot · Knowledge Distillation
- A Study on Knowledge Distillation from Weak Teacher for Scaling Up Pre-trained Language Models (May 26, 2023) · Knowledge Distillation
- Education distillation:getting student models to learn in shcools (Nov 23, 2023) · Incremental Learning · Knowledge Distillation
- EDocNet: Efficient Datasheet Layout Analysis Based on Focus and Global Knowledge Distillation (Feb 23, 2025) · Document Layout Analysis · Knowledge Distillation
- CoDERT: Distilling Encoder Representations with Co-learning for Transducer-based Speech Recognition (Jun 14, 2021) · Decoder · Knowledge Distillation
- Active Class Incremental Learning for Imbalanced Datasets (Aug 25, 2020) · class-incremental learning · Class Incremental Learning
- EdgeFusion: On-Device Text-to-Image Generation (Apr 18, 2024) · Image Generation · Knowledge Distillation
- CoCo DistillNet: a Cross-layer Correlation Distillation Network for Pathological Gastric Cancer Segmentation (Aug 27, 2021) · Image Segmentation · Knowledge Distillation
- Edge-free but Structure-aware: Prototype-Guided Knowledge Distillation from GNNs to MLPs (Mar 24, 2023) · Knowledge Distillation
- EdgeFormer: A Parameter-Efficient Transformer for On-Device Seq2seq Generation (Feb 16, 2022) · Grammatical Error Correction · Knowledge Distillation
- A Study of Non-autoregressive Model for Sequence Generation (Apr 22, 2020) · Automatic Speech Recognition · Automatic Speech Recognition (ASR)
- Knowledge Distillation for Underwater Feature Extraction and Matching via GAN-synthesized Images (Apr 11, 2025) · General Knowledge · Knowledge Distillation
- Edge-Efficient Deep Learning Models for Automatic Modulation Classification: A Performance Analysis (Apr 11, 2024) · Knowledge Distillation · Model Optimization
- Edge AI-Enabled Chicken Health Detection Based on Enhanced FCOS-Lite and Knowledge Distillation (Jul 3, 2024) · Knowledge Distillation · Quantization
- EchoLM: Accelerating LLM Serving with Real-time Knowledge Distillation (Jan 22, 2025) · Knowledge Distillation · Response Generation
- CMU's IWSLT 2022 Dialect Speech Translation System (May 1, 2022) · Decoder · Knowledge Distillation
- Adversarial Sparse Teacher: Defense Against Distillation-Based Model Stealing Attacks Using Adversarial Examples (Mar 8, 2024) · Knowledge Distillation
- EchoAtt: Attend, Copy, then Adjust for More Efficient Large Language Models (Sep 22, 2024) · Knowledge Distillation
- ECG-guided individual identification via PPG (Dec 30, 2024) · Knowledge Distillation
- ECAT: A Entire space Continual and Adaptive Transfer Learning Framework for Cross-Domain Recommendation (Jul 2, 2024) · Domain Adaptation · Knowledge Distillation
- Asterisk*: Keep it Simple (Nov 8, 2024) · Classification · Knowledge Distillation
- A baseline revisited: Pushing the limits of multi-segment models for context-aware translation (Oct 19, 2022) · Knowledge Distillation · Translation