- Efficient Speech Command Recognition Leveraging Spiking Neural Network and Curriculum Learning-based Knowledge Distillation (Dec 17, 2024) [Edge-computing, Knowledge Distillation]
- Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding (Jun 7, 2022) [Graph Embedding, Knowledge Distillation]
- Improving Mathematical Reasoning Capabilities of Small Language Models via Feedback-Driven Distillation (Nov 22, 2024) [Knowledge Distillation, Mathematical Reasoning]
- Improving Multi-Task Deep Neural Networks via Knowledge Distillation for Natural Language Understanding (Apr 20, 2019) [Ensemble Learning, Knowledge Distillation]
- Batch Selection and Communication for Active Learning with Edge Labeling (Nov 14, 2023) [Active Learning, Knowledge Distillation]
- Active Large Language Model-based Knowledge Distillation for Session-based Recommendation (Dec 15, 2024) [Active Learning, Knowledge Distillation]
- Improving Neural Machine Translation by Denoising Training (Jan 19, 2022) [Denoising, Knowledge Distillation]
- Improving Neural ODEs via Knowledge Distillation (Mar 10, 2022) [Knowledge Distillation]
- Efficient Point Cloud Classification via Offline Distillation Framework and Negative-Weight Self-Distillation Technique (Sep 3, 2024) [Data Augmentation, Knowledge Distillation]
- Diffusion-Augmented Coreset Expansion for Scalable Dataset Distillation (Dec 5, 2024) [Bilevel Optimization, Computational Efficiency]
- Efficient Open-world Reinforcement Learning via Knowledge Distillation and Autonomous Rule Discovery (Nov 24, 2023) [Deep Reinforcement Learning, Knowledge Distillation]
- Improving Pronunciation and Accent Conversion through Knowledge Distillation And Synthetic Ground-Truth from Native TTS (Oct 19, 2024) [Knowledge Distillation]
- DiffusionTalker: Personalization and Acceleration for Speech-Driven 3D Face Diffuser (Nov 28, 2023) [3D Face Animation, Contrastive Learning]
- Towards Complementary Knowledge Distillation for Efficient Dense Image Prediction (Jan 24, 2024) [Implicit Relations, Instance Segmentation]
- ComKD-CLIP: Comprehensive Knowledge Distillation for Contrastive Language-Image Pre-traning Model (Aug 8, 2024) [Contrastive Learning, Knowledge Distillation]
- Improving Route Choice Models by Incorporating Contextual Factors via Knowledge Distillation (Mar 27, 2019) [Knowledge Distillation, Management]
- Keep Decoding Parallel with Effective Knowledge Distillation from Language Models to End-to-end Speech Recognisers (Jan 22, 2024) [Automatic Speech Recognition (ASR)]
- Efficient Object Detection in Optical Remote Sensing Imagery via Attention-based Feature Distillation (Oct 28, 2023) [Knowledge Distillation, Object]
- CoMBO: Conflict Mitigation via Branched Optimization for Class Incremental Segmentation (Jan 1, 2025) [Knowledge Distillation, Semantic Segmentation]
- Improving Speech Translation by Understanding and Learning from the Auxiliary Text Translation Task (Jul 12, 2021) [Decoder, Knowledge Distillation]
- A Survey on Model Compression for Large Language Models (Aug 15, 2023) [Benchmarking, Knowledge Distillation]
- Improving Task-Agnostic BERT Distillation with Layer Mapping Search (Dec 11, 2020) [Knowledge Distillation]
- KDSM: An uplift modeling framework based on knowledge distillation and sample matching (Mar 6, 2023) [Counterfactual, Knowledge Distillation]
- Improving the Interpretability of Deep Neural Networks with Knowledge Distillation (Dec 28, 2018) [Ethics, Knowledge Distillation]
- KDSTM: Neural Semi-supervised Topic Modeling with Knowledge Distillation (Jul 4, 2023) [Classification, Knowledge Distillation]
- Improving the Transferability of Adversarial Examples by Inverse Knowledge Distillation (Feb 24, 2025) [Adversarial Attack, Diversity]
- Improving Video Model Transfer With Dynamic Representation Learning (Jan 1, 2022) [Action Classification, Knowledge Distillation]
- Efficient Machine Translation with Model Pruning and Quantization (Nov 1, 2021) [CPU, Decoder]
- Combining Curriculum Learning and Knowledge Distillation for Dialogue Generation (Nov 1, 2021) [Dialogue Generation, Knowledge Distillation]
- Improving Zero-Shot Multilingual Text Generation via Iterative Distillation (Oct 1, 2022) [Knowledge Distillation, Text Generation]
- Combining Compressions for Multiplicative Size Scaling on Natural Language Tasks (Aug 20, 2022) [Knowledge Distillation, Neural Network Compression]
- In-Context Learning Distillation for Efficient Few-Shot Fine-Tuning (Dec 17, 2024) [In-Context Learning, Knowledge Distillation]
- ABC-KD: Attention-Based-Compression Knowledge Distillation for Deep Learning-Based Noise Suppression (May 26, 2023) [Knowledge Distillation]
- Incorporating Ultrasound Tongue Images for Audio-Visual Speech Enhancement through Knowledge Distillation (May 24, 2023) [Automatic Speech Recognition (ASR)]
- KD-VLP: Improving End-to-End Vision-and-Language Pretraining with Object Knowledge Distillation (Jan 16, 2022) [Cross-modal Alignment, Knowledge Distillation]
- Incremental Classifier Learning Based on PEDCC-Loss and Cosine Distance (Jun 11, 2019) [Incremental Learning, Knowledge Distillation]
- Incremental-DETR: Incremental Few-Shot Object Detection via Self-Supervised Learning (May 9, 2022) [Few-Shot Object Detection, Knowledge Distillation]
- Incremental Knowledge Based Question Answering (Jan 18, 2021) [Incremental Learning, Knowledge Distillation]
- Incremental Learning for End-to-End Automatic Speech Recognition (May 11, 2020) [Automatic Speech Recognition (ASR)]
- Direct Distillation between Different Domains (Jan 12, 2024) [Domain Adaptation, Knowledge Distillation]
- Kendall's τ Coefficient for Logits Distillation (Sep 26, 2024) [Knowledge Distillation]
- Knowledge Adaptation for Efficient Semantic Segmentation (Mar 12, 2019) [Knowledge Distillation, Segmentation]
- Efficient Knowledge Distillation via Curriculum Extraction (Mar 21, 2025) [Knowledge Distillation, Language Modeling]
- Efficient Knowledge Distillation of SAM for Medical Image Segmentation (Jan 28, 2025) [Computational Efficiency, Decoder]
- Collective Wisdom: Improving Low-resource Neural Machine Translation using Adaptive Knowledge Distillation (Oct 12, 2020) [Knowledge Distillation, Low Resource Neural Machine Translation]
- Efficient Knowledge Distillation: Empowering Small Language Models with Teacher Model Insights (Sep 19, 2024) [Decision Making, Knowledge Distillation]
- Incrementer: Transformer for Class-Incremental Semantic Segmentation With Knowledge Distillation Focusing on Old Class (Jan 1, 2023) [Class-Incremental Semantic Segmentation, Decoder]
- DiReDi: Distillation and Reverse Distillation for AIoT Applications (Sep 12, 2024) [Knowledge Distillation, Management]
- Collective Knowledge Graph Completion with Mutual Knowledge Distillation (May 25, 2023) [Knowledge Distillation, Knowledge Graph Completion]
- Efficient Intent-Based Filtering for Multi-Party Conversations Using Knowledge Distillation from LLMs (Mar 21, 2025) [Intent Classification]