Energy-efficient Knowledge Distillation for Spiking Neural Networks | Jun 14, 2021 | Knowledge Distillation, Model Compression
Comprehensive Survey of Model Compression and Speed up for Vision Transformers | Apr 16, 2024 | Computational Efficiency, Edge Computing
After-Stroke Arm Paresis Detection using Kinematic Data | Nov 3, 2023 | Action Classification, Knowledge Distillation
Fixing the Teacher-Student Knowledge Discrepancy in Distillation | Mar 31, 2021 | Image Classification
End-to-End Speech-Translation with Knowledge Distillation: FBK@IWSLT2020 | Jun 4, 2020 | Data Augmentation, Knowledge Distillation
FLAR: A Unified Prototype Framework for Few-Sample Lifelong Active Recognition | Jan 1, 2021 | Knowledge Distillation, Lifelong Learning
FlyKD: Graph Knowledge Distillation on the Fly with Curriculum Learning | Mar 16, 2024 | Knowledge Distillation
Cross-Task Knowledge Distillation in Multi-Task Recommendation | Feb 20, 2022 | Knowledge Distillation, Multi-Task Learning
End-to-End Speech Translation with Knowledge Distillation | Apr 17, 2019 | Knowledge Distillation, Speech Recognition
Comprehensive Study on Performance Evaluation and Optimization of Model Compression: Bridging Traditional Deep Learning and Large Language Models | Jul 22, 2024 | Deep Learning, Image Classification
End-to-End Simultaneous Speech Translation with Pretraining and Distillation: Huawei Noah’s System for AutoSimTranS 2022 | Jul 1, 2022 | Decoder, Knowledge Distillation
Follow Your Path: a Progressive Method for Knowledge Distillation | Jul 20, 2021 | Knowledge Distillation
End-to-end fully-binarized network design: from Generic Learned Thermometer to Block Pruning | May 5, 2025 | Knowledge Distillation, Quantization
Comprehensive Pathological Image Segmentation via Teacher Aggregation for Tumor Microenvironment Analysis | Jan 6, 2025 | Decision Making, Diversity
Edge Bias in Federated Learning and its Solution by Buffered Knowledge Distillation | Oct 20, 2020 | Federated Learning, Knowledge Distillation
Foundational Model for Electron Micrograph Analysis: Instruction-Tuning Small-Scale Language-and-Vision Assistant for Enterprise Adoption | Aug 23, 2024 | Instruction Following, Knowledge Distillation
CTC Blank Triggered Dynamic Layer-Skipping for Efficient CTC-based Speech Recognition | Jan 4, 2024 | Knowledge Distillation, Speech Recognition
ActivityCLIP: Enhancing Group Activity Recognition by Mining Complementary Information from Text to Supplement Image Modality | Jul 29, 2024 | Activity Recognition, Group Activity Recognition
FPGA Resource-aware Structured Pruning for Real-Time Neural Networks | Aug 9, 2023 | Classification, Image Classification
CULL-MT: Compression Using Language and Layer pruning for Machine Translation | Nov 10, 2024 | Knowledge Distillation, Machine Translation
End-to-End Automatic Speech Recognition with Deep Mutual Learning | Feb 16, 2021 | Automatic Speech Recognition (ASR)
Endpoints Weight Fusion for Class Incremental Semantic Segmentation | Jan 1, 2023 | Class Incremental Learning
EncodeNet: A Framework for Boosting DNN Accuracy with Entropy-driven Generalized Converting Autoencoder | Apr 21, 2024 | Image Classification
Enabling Weak Client Participation via On-device Knowledge Distillation in Heterogenous Federated Learning | Mar 14, 2025 | Federated Learning, Knowledge Distillation
Compositional Data Augmentation for Abstractive Conversation Summarization | Nov 16, 2021 | Conversation Summarization, Data Augmentation
Asynchronous Convergence in Multi-Task Learning via Knowledge Distillation from Converged Tasks | Jul 1, 2022 | Knowledge Distillation, Multi-Task Learning
Enabling Multimodal Generation on CLIP via Vision-Language Knowledge Distillation | Mar 12, 2022 | Image Captioning, Knowledge Distillation
FReTAL: Generalizing Deepfake Detection using Knowledge Distillation and Representation Learning | May 28, 2021 | DeepFake Detection, Domain Adaptation
From Algorithm to Hardware: A Survey on Efficient and Safe Deployment of Deep Neural Networks | May 9, 2024 | Knowledge Distillation, Model Compression
From Data to Modeling: Fully Open-vocabulary Scene Graph Generation | May 26, 2025 | Graph Generation, Knowledge Distillation
From Easy to Hard: Learning Curricular Shape-aware Features for Robust Panoptic Scene Graph Generation | Jul 12, 2024 | Graph Generation, Knowledge Distillation
Adaptive Explicit Knowledge Transfer for Knowledge Distillation | Sep 3, 2024 | Knowledge Distillation, Transfer Learning
Enabling Multimodal Generation on CLIP via Vision-Language Knowledge Distillation | Nov 16, 2021 | Image Captioning, Knowledge Distillation
From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels | Mar 23, 2023 | Knowledge Distillation, Self-Knowledge Distillation
From Large to Super-Tiny: End-to-End Optimization for Cost-Efficient LLMs | Apr 18, 2025 | Knowledge Distillation, Model Compression
From LLM to NMT: Advancing Low-Resource Machine Translation with Claude | Apr 22, 2024 | Knowledge Distillation, Language Modeling
From Multimodal to Unimodal Attention in Transformers using Knowledge Distillation | Oct 15, 2021 | Knowledge Distillation, Multimodal Deep Learning
Asymmetric Temperature Scaling Makes Larger Networks Teach Well Again | Oct 10, 2022 | Knowledge Distillation
From Two-Stream to One-Stream: Efficient RGB-T Tracking via Mutual Prompt Learning and Knowledge Distillation | Mar 25, 2024 | Knowledge Distillation, Object Tracking
DA-CIL: Towards Domain Adaptive Class-Incremental 3D Object Detection | Dec 5, 2022 | 3D Object Detection, Class-Incremental Learning
Complex Emotion Recognition System using basic emotions via Facial Expression, EEG, and ECG Signals: a review | Sep 9, 2024 | Electroencephalogram (EEG)
FSAR: Federated Skeleton-based Action Recognition with Adaptive Topology Structure and Knowledge Distillation | Jun 19, 2023 | Action Recognition, Federated Learning
Empowering Knowledge Distillation via Open Set Recognition for Robust 3D Point Cloud Classification | Oct 25, 2020 | 3D Point Cloud Classification, General Classification
AfroXLMR-Comet: Multilingual Knowledge Distillation with Attention Matching for Low-Resource Languages | Feb 25, 2025 | Knowledge Distillation, Language Modeling
Highly Constrained Coded Aperture Imaging Systems Design Via a Knowledge Distillation Approach | Jun 25, 2024 | Image Reconstruction, Knowledge Distillation
Fusing Bidirectional Chains of Thought and Reward Mechanisms: A Method for Enhancing Question-Answering Capabilities of Large Language Models for Chinese Intangible Cultural Heritage | May 13, 2025 | Knowledge Distillation, Large Language Model
Future-Guided Incremental Transformer for Simultaneous Translation | Dec 23, 2020 | Knowledge Distillation, Translation
Fuzzy Knowledge Distillation from High-Order TSK to Low-Order TSK | Feb 16, 2023 | Benchmarking, Knowledge Distillation
High Performance Natural Language Processing | Nov 1, 2020 | Knowledge Distillation, Quantization
Empowering Dual-Encoder with Query Generator for Cross-Lingual Dense Retrieval | Mar 27, 2023 | Knowledge Distillation, Retrieval