A Closer Look at Wav2Vec2 Embeddings for On-Device Single-Channel Speech Enhancement (Mar 3, 2024) [Automatic Speech Recognition, Keyword Spotting]
DeGAN: Data-Enriching GAN for Retrieving Representative Samples from a Trained Classifier (Dec 27, 2019) [Data-free Knowledge Distillation, Incremental Learning]
BLSP-KD: Bootstrapping Language-Speech Pre-training via Knowledge Distillation (May 29, 2024) [Instruction Following, Knowledge Distillation]
Defending against Data-Free Model Extraction by Distributionally Robust Defensive Training (Sep 21, 2023) [Knowledge Distillation, Model Extraction]
Deep versus Wide: An Analysis of Student Architectures for Task-Agnostic Knowledge Distillation of Self-Supervised Speech Models (Jul 14, 2022) [Automatic Speech Recognition (ASR)]
Deep-to-bottom Weights Decay: A Systemic Knowledge Review Learning Technique for Transformer Layers in Knowledge Distillation (Nov 16, 2021) [Knowledge Distillation]
Block-wise Intermediate Representation Training for Model Compression (Oct 20, 2018) [Knowledge Distillation]
AMTSS: An Adaptive Multi-Teacher Single-Student Knowledge Distillation Framework For Multilingual Language Inference (May 13, 2023) [Knowledge Distillation]
Deep Serial Number: Computational Watermarking for DNN Intellectual Property Protection (Nov 17, 2020) [Knowledge Distillation]
Deep Semi-Supervised and Self-Supervised Learning for Diabetic Retinopathy Detection (Aug 4, 2022) [Diabetic Retinopathy Detection, Knowledge Distillation]
Deep Representation Learning of Patient Data from Electronic Health Records (EHR): A Systematic Review (Oct 6, 2020) [Articles, Deep Learning]
Deep Neural Network Models Compression (Mar 4, 2021) [Knowledge Distillation, Quantization]
Deep Neural Compression Via Concurrent Pruning and Self-Distillation (Sep 30, 2021) [Knowledge Distillation, Language Modeling]
Black-box Source-free Domain Adaptation via Two-stage Knowledge Distillation (May 13, 2023) [Domain Adaptation, Knowledge Distillation]
Amortized Noisy Channel Neural Machine Translation (Dec 16, 2021) [Imitation Learning, Knowledge Distillation]
Bidirectional Distillation: A Mixed-Play Framework for Multi-Agent Generalizable Behaviors (May 16, 2025) [Knowledge Distillation, Multi-agent Reinforcement Learning]
Deep Net Triage: Analyzing the Importance of Network Layers via Structural Compression (Jan 15, 2018) [Knowledge Distillation]
Deep Learning for Medical Text Processing: BERT Model Fine-Tuning and Comparative Study (Oct 28, 2024) [Knowledge Distillation]
Holistic Approach to Measure Sample-level Adversarial Vulnerability and its Utility in Building Trustworthy Systems (May 5, 2022) [Adversarial Attack, Knowledge Distillation]
Knowledge Distillation-aided End-to-End Learning for Linear Precoding in Multiuser MIMO Downlink Systems with Finite-Rate Feedback (Aug 10, 2020) [Binarization, Knowledge Distillation]
Black-Box Dissector: Towards Erasing-based Hard-Label Model Stealing Attack (May 3, 2021) [Knowledge Distillation, Self-Knowledge Distillation]
HKD4VLM: A Progressive Hybrid Knowledge Distillation Framework for Robust Multimodal Hallucination and Factuality Detection in VLMs (Jun 16, 2025) [Hallucination, Knowledge Distillation]
HIRE: Distilling High-order Relational Knowledge From Heterogeneous Graph Neural Networks (Jul 25, 2022) [Knowledge Distillation, Vocal Bursts Intensity Prediction]
Hint-dynamic Knowledge Distillation (Nov 30, 2022) [Knowledge Distillation]
Highly Constrained Coded Aperture Imaging Systems Design Via a Knowledge Distillation Approach (Jun 25, 2024) [Image Reconstruction, Knowledge Distillation]
Deep Epidemiological Modeling by Black-box Knowledge Distillation: An Accurate Deep Learning Model for COVID-19 (Jan 20, 2021) [Diversity, Knowledge Distillation]
BJTU-WeChat's Systems for the WMT22 Chat Translation Task (Nov 28, 2022) [Denoising, Knowledge Distillation]
AMLN: Adversarial-based Mutual Learning Network for Online Knowledge Distillation (Aug 1, 2020) [Knowledge Distillation, Transfer Learning]
High-Fidelity Pseudo-label Generation by Large Language Models for Training Robust Radiology Report Classifiers (May 3, 2025) [Diagnostic, Knowledge Distillation]
High-dimensional Analysis of Knowledge Distillation: Weak-to-Strong Generalization and Scaling Laws (Oct 24, 2024) [Knowledge Distillation, Regression]
Hierarchical Transformer-based Large-Context End-to-end ASR with Large-Context Knowledge Distillation (Feb 16, 2021) [Automatic Speech Recognition (ASR)]
Hierarchical Selective Classification (May 19, 2024) [Classification, Knowledge Distillation]
Hierarchical Knowledge Distillation on Text Graph for Data-limited Attribute Inference (Jan 10, 2024) [Attribute, Few-Shot Learning]
Hierarchical Knowledge Distillation for Dialogue Sequence Labeling (Nov 22, 2021) [Knowledge Distillation, Scene Segmentation]
Deep Collective Knowledge Distillation (Apr 18, 2023) [Knowledge Distillation, Model Compression]
A metric learning approach for endoscopic kidney stone identification (Jul 13, 2023) [Few-Shot Learning, Knowledge Distillation]
HFedCKD: Toward Robust Heterogeneous Federated Learning via Data-free Knowledge Distillation and Two-way Contrast (Mar 9, 2025) [Data-free Knowledge Distillation, Federated Learning]
Heterogeneous Generative Knowledge Distillation with Masked Image Modeling (Sep 18, 2023) [Image Classification]
High Performance Natural Language Processing (Nov 1, 2020) [Knowledge Distillation, Quantization]
Heterogeneous Federated Learning Using Knowledge Codistillation (Oct 4, 2023) [Federated Learning, Image Classification]
Heterogeneous Federated Knowledge Graph Embedding Learning and Unlearning (Feb 4, 2023) [Federated Learning, Graph Embedding]
Heterogeneous Continual Learning (Jun 14, 2023) [Continual Learning, Knowledge Distillation]
Heterogeneous-Branch Collaborative Learning for Dialogue Generation (Mar 21, 2023) [Attribute, Dialogue Generation]
A method for estimating forest carbon storage distribution density via artificial intelligence generated content model (Feb 2, 2025) [Knowledge Distillation]
Adaptive Multiplane Image Generation from a Single Internet Picture (Nov 26, 2020) [Depth Estimation, Image Generation]
A Closer Look at Rehearsal-Free Continual Learning (Mar 31, 2022) [Continual Learning, Knowledge Distillation]
Heterogeneity-aware Personalized Federated Learning via Adaptive Dual-Agent Reinforcement Learning (Jan 28, 2025) [Federated Learning, Knowledge Distillation]
HeteFedRec: Federated Recommender Systems with Model Heterogeneity (Jul 24, 2023) [Knowledge Distillation]
Decoupling Dark Knowledge via Block-wise Logit Distillation for Feature-level Alignment (Nov 3, 2024) [Knowledge Distillation, Philosophy]