Knowledge Distillation for Anomaly Detection (Oct 9, 2023). Tags: Anomaly Detection, Knowledge Distillation
Knowledge Distillation for Bilingual Dictionary Induction (Sep 1, 2017). Tags: Knowledge Distillation, Translation
Knowledge Distillation of Black-Box Large Language Models (Jan 13, 2024). Tags: Knowledge Distillation, Transfer Learning
Knowledge Distillation for Efficient Sequences of Training Runs (Mar 11, 2023). Tags: Knowledge Distillation
Knowledge Distillation for Efficient Audio-Visual Video Captioning (Jun 16, 2023). Tags: Audio-Visual Video Captioning, Caption Generation
Knowledge Distillation for Enhancing Walmart E-commerce Search Relevance Using Large Language Models (May 11, 2025). Tags: Knowledge Distillation
Knowledge distillation for fast and accurate DNA sequence correction (Nov 17, 2022). Tags: Knowledge Distillation
Knowledge Distillation for Federated Learning: a Practical Guide (Nov 9, 2022). Tags: Federated Learning, Knowledge Distillation
Knowledge Distillation for Image Restoration: Simultaneous Learning from Degraded and Clean Images (Jan 16, 2025). Tags: Decoder, Image Reconstruction
Knowledge Distillation for Improved Accuracy in Spoken Question Answering (Oct 21, 2020). Tags: Automatic Speech Recognition (ASR)
Knowledge Distillation for Incremental Learning in Semantic Segmentation (Nov 8, 2019). Tags: Image Classification
Knowledge Distillation for Mobile Edge Computation Offloading (Apr 9, 2020). Tags: Imitation Learning, Knowledge Distillation
Knowledge Distillation for Multilingual Unsupervised Neural Machine Translation (Apr 21, 2020). Tags: Decoder, Knowledge Distillation
Knowledge Distillation for Multimodal Egocentric Action Recognition Robust to Missing Modalities (Apr 11, 2025). Tags: Action Recognition, Knowledge Distillation
Knowledge Distillation for Neural Transducer-based Target-Speaker ASR: Exploiting Parallel Mixture/Single-Talker Speech Data (May 25, 2023). Tags: Knowledge Distillation, Speech Extraction
Knowledge Distillation for Neural Transducers from Large Self-Supervised Pre-trained Models (Oct 7, 2021). Tags: Automatic Speech Recognition (ASR)
Knowledge Distillation for Object Detection via Rank Mimicking and Prediction-guided Feature Imitation (Dec 9, 2021). Tags: Image Classification
Knowledge Distillation for Object Detection: from generic to remote sensing datasets (Jul 18, 2023). Tags: Knowledge Distillation, Model Compression
Knowledge Distillation for Oriented Object Detection on Aerial Images (Jun 20, 2022). Tags: Knowledge Distillation, Model Compression
Knowledge Distillation for Real-Time Classification of Early Media in Voice Communications (Oct 28, 2024). Tags: Audio Tagging, Classification
Knowledge Distillation For Recurrent Neural Network Language Modeling With Trust Regularization (Apr 8, 2019). Tags: Knowledge Distillation, Language Modeling
Knowledge Distillation for Reservoir-based Classifier: Human Activity Recognition (May 29, 2025). Tags: Activity Recognition, Edge-computing
Knowledge Distillation for Road Detection based on cross-model Semi-Supervised Learning (Feb 7, 2024). Tags: Knowledge Distillation, Road Segmentation
Knowledge distillation for semi-supervised domain adaptation (Aug 16, 2019). Tags: Domain Adaptation, Knowledge Distillation
Knowledge Distillation for Small-footprint Highway Networks (Aug 2, 2016). Tags: Acoustic Modelling, Knowledge Distillation
Knowledge Distillation for Speech Denoising by Latent Representation Alignment with Cosine Distance (May 6, 2025). Tags: Denoising, Knowledge Distillation
Knowledge Distillation for Sustainable Neural Machine Translation (Sep 1, 2022). Tags: Knowledge Distillation, Machine Translation
Knowledge Distillation for Swedish NER models: A Search for Performance and Efficiency (May 1, 2021). Tags: Knowledge Distillation, Model Compression
Knowledge Distillation for Underwater Feature Extraction and Matching via GAN-synthesized Images (Apr 11, 2025). Tags: General Knowledge, Knowledge Distillation
Knowledge Distillation Framework for Accelerating High-Accuracy Neural Network-Based Molecular Dynamics Simulations (Jun 18, 2025). Tags: Knowledge Distillation
Boosting of Head Pose Estimation by Knowledge Distillation (Aug 20, 2021). Tags: Head Pose Estimation, Knowledge Distillation
Knowledge Distillation from Few Samples (Sep 27, 2018). Tags: Knowledge Distillation
Knowledge Distillation from Internal Representations (Oct 8, 2019). Tags: Knowledge Distillation
Knowledge distillation from language model to acoustic model: a hierarchical multi-task learning approach (Oct 20, 2021). Tags: Knowledge Distillation, Language Modeling
Knowledge Distillation from Language-Oriented to Emergent Communication for Multi-Agent Remote Control (Jan 23, 2024). Tags: Deep Reinforcement Learning, Knowledge Distillation
Knowledge distillation from multi-modal to mono-modal segmentation networks (Jun 17, 2021). Tags: Brain Tumor Segmentation, Image Segmentation
Knowledge Distillation from Multiple Foundation Models for End-to-End Speech Recognition (Mar 20, 2023). Tags: Automatic Speech Recognition (ASR)
Knowledge Distillation from Non-streaming to Streaming ASR Encoder using Auxiliary Non-streaming Layer (Aug 31, 2023). Tags: Automatic Speech Recognition (ASR)
Knowledge Distillation Improves Stability in Retranslation-based Simultaneous Translation (Dec 17, 2021). Tags: Knowledge Distillation, Translation
Knowledge Distillation in Automated Annotation: Supervised Text Classification with LLM-Generated Training Labels (Jun 25, 2024). Tags: Articles, In-Context Learning
Knowledge Distillation in Deep Learning and its Applications (Jul 17, 2020). Tags: Deep Learning, Knowledge Distillation
Knowledge Distillation in Document Retrieval (Nov 11, 2019). Tags: Knowledge Distillation, Retrieval
Knowledge Distillation in Federated Learning: a Survey on Long Lasting Challenges and New Solutions (Jun 16, 2024). Tags: Federated Learning, Knowledge Distillation
Knowledge Distillation in Generations: More Tolerant Teachers Educate Better Students (May 15, 2018). Tags: General Classification, Image Classification
Knowledge Distillation in Vision Transformers: A Critical Review (Feb 4, 2023). Tags: Decoder, Image Classification
Knowledge Distillation in Wide Neural Networks: Risk Bound, Data Efficiency and Imperfect Teacher (Oct 20, 2020). Tags: Knowledge Distillation, Model Compression
Knowledge Distillation Meets Few-Shot Learning: An Approach for Few-Shot Intent Classification Within and Across Domains (May 1, 2022). Tags: Cross-Domain Few-Shot Learning
Knowledge Distillation Methods for Efficient Unsupervised Adaptation Across Multiple Domains (Jan 18, 2021). Tags: Domain Adaptation, Image Classification
Knowledge Distillation Neural Network for Predicting Car-following Behaviour of Human-driven and Autonomous Vehicles (Nov 8, 2024). Tags: Autonomous Vehicles, Descriptive
Knowledge Distillation of Convolutional Neural Networks through Feature Map Transformation using Decision Trees (Mar 10, 2024). Tags: Knowledge Distillation