Digital Twin-Assisted Knowledge Distillation Framework for Heterogeneous Federated Learning. Mar 10, 2023. Tags: Federated Learning, Knowledge Distillation.
Dynamic Y-KD: A Hybrid Approach to Continual Instance Segmentation. Mar 10, 2023. Tags: Continual Learning, Incremental Learning.
Robust Knowledge Distillation from RNN-T Models With Noisy Training Labels Using Full-Sum Loss. Mar 10, 2023. Tags: Knowledge Distillation.
Learning the Wrong Lessons: Inserting Trojans During Knowledge Distillation. Mar 9, 2023. Tags: Knowledge Distillation.
NIFF: Alleviating Forgetting in Generalized Few-Shot Object Detection via Neural Instance Feature Forging. Mar 9, 2023. Tags: Data-free Knowledge Distillation, Few-Shot Object Detection.
Gradient-Guided Knowledge Distillation for Object Detectors. Mar 7, 2023. Tags: Knowledge Distillation, Object.
Adaptive Knowledge Distillation between Text and Speech Pre-trained Models. Mar 7, 2023. Tags: Knowledge Distillation, Spoken Language Understanding.
PreFallKD: Pre-Impact Fall Detection via CNN-ViT Knowledge Distillation. Mar 7, 2023. Tags: Data Augmentation, Knowledge Distillation.
KDSM: An uplift modeling framework based on knowledge distillation and sample matching. Mar 6, 2023. Tags: Counterfactual, Knowledge Distillation. [Code available]
Students Parrot Their Teachers: Membership Inference on Model Distillation. Mar 6, 2023. Tags: Knowledge Distillation.
IKD+: Reliable Low Complexity Deep Models For Retinopathy Classification. Mar 4, 2023. Tags: Classification, Knowledge Distillation.
X^3KD: Knowledge Distillation Across Modalities, Tasks and Stages for Multi-Camera 3D Object Detection. Mar 3, 2023. Tags: 3D Object Detection, Instance Segmentation.
Pre-trained Model Representations and their Robustness against Noise for Speech Emotion Analysis. Mar 3, 2023. Tags: Emotion Recognition, Knowledge Distillation.
Unsupervised Deep Digital Staining For Microscopic Cell Images Via Knowledge Distillation. Mar 3, 2023. Tags: Colorization, Knowledge Distillation.
Letz Translate: Low-Resource Machine Translation for Luxembourgish. Mar 2, 2023. Tags: Knowledge Distillation, Machine Translation.
Distilling Multi-Level X-vector Knowledge for Small-footprint Speaker Verification. Mar 2, 2023. Tags: Knowledge Distillation, Speaker Verification.
Weakly-supervised HOI Detection via Prior-guided Bi-level Representation Learning. Mar 2, 2023. Tags: Human-Object Interaction Detection, Knowledge Distillation.
Distilled Reverse Attention Network for Open-world Compositional Zero-Shot Learning. Mar 1, 2023. Tags: Compositional Zero-Shot Learning, Knowledge Distillation.
Backdoor for Debias: Mitigating Model Bias with Backdoor Attack-based Artificial Bias. Mar 1, 2023. Tags: Backdoor Attack, Knowledge Distillation.
Towards domain generalisation in ASR with elitist sampling and ensemble knowledge distillation. Mar 1, 2023. Tags: Domain Adaptation, Knowledge Distillation. [Code available]
Incremental Learning of Acoustic Scenes and Sound Events. Feb 28, 2023. Tags: Acoustic Scene Classification, Audio Tagging.
Learning to Retain while Acquiring: Combating Distribution-Shift in Adversarial Data-Free Knowledge Distillation. Feb 28, 2023. Tags: Data-free Knowledge Distillation, Knowledge Distillation.
Language-Universal Adapter Learning with Knowledge Distillation for End-to-End Multilingual Speech Recognition. Feb 28, 2023. Tags: Automatic Speech Recognition (ASR).
Leveraging Angular Distributions for Improved Knowledge Distillation. Feb 27, 2023. Tags: Knowledge Distillation. [Code available]
A Light-weight Deep Learning Model for Remote Sensing Image Classification. Feb 25, 2023. Tags: Image Classification.
Ensemble knowledge distillation of self-supervised speech models. Feb 24, 2023. Tags: Automatic Speech Recognition (ASR).
A Knowledge Distillation framework for Multi-Organ Segmentation of Medaka Fish in Tomographic Image. Feb 24, 2023. Tags: Computed Tomography (CT), Image Segmentation.
Teacher Intervention: Improving Convergence of Quantization Aware Training for Ultra-Low Precision Transformers. Feb 23, 2023. Tags: Knowledge Distillation, Quantization.
Personalized Decentralized Federated Learning with Knowledge Distillation. Feb 23, 2023. Tags: Federated Learning, Knowledge Distillation. [Code available]
Exploring Social Media for Early Detection of Depression in COVID-19 Patients. Feb 23, 2023. Tags: Knowledge Distillation.
Practical Knowledge Distillation: Using DNNs to Beat DNNs. Feb 23, 2023. Tags: Denoising, Knowledge Distillation. [Code available]
Debiased Distillation by Transplanting the Last Layer. Feb 22, 2023. Tags: Attribute, Knowledge Distillation.
Distilling Calibrated Student from an Uncalibrated Teacher. Feb 22, 2023. Tags: Data Augmentation, Knowledge Distillation.
KS-DETR: Knowledge Sharing in Attention Learning for Detection Transformer. Feb 22, 2023. Tags: Knowledge Distillation, Transfer Learning.
CADIS: Handling Cluster-skewed Non-IID Data in Federated Learning with Clustered Aggregation and Knowledge DIStilled Regularization. Feb 21, 2023. Tags: Federated Learning, Knowledge Distillation. [Code available]
Two-in-one Knowledge Distillation for Efficient Facial Forgery Detection. Feb 21, 2023. Tags: Knowledge Distillation, Vocal Bursts Valence Prediction. [Code available]
The Role of Masking for Efficient Supervised Knowledge Distillation of Vision Transformers. Feb 21, 2023. Tags: Knowledge Distillation.
HomoDistil: Homotopic Task-Agnostic Distillation of Pre-trained Transformers. Feb 19, 2023. Tags: Knowledge Distillation, Model Compression.
RobustDistiller: Compressing Universal Speech Representations for Enhanced Environment Robustness. Feb 18, 2023. Tags: Knowledge Distillation, Multi-Task Learning.
Fairly Predicting Graft Failure in Liver Transplant for Organ Assigning. Feb 18, 2023. Tags: Fairness, Knowledge Distillation.
Explicit and Implicit Knowledge Distillation via Unlabeled Data. Feb 17, 2023. Tags: Data-free Knowledge Distillation, Knowledge Distillation.
Few-shot 3D LiDAR Semantic Segmentation for Autonomous Driving. Feb 17, 2023. Tags: Autonomous Driving, Few-Shot Learning.
Learning From Biased Soft Labels. Feb 16, 2023. Tags: Knowledge Distillation.
Cross Modal Distillation for Flood Extent Mapping. Feb 16, 2023. Tags: Knowledge Distillation.
Fuzzy Knowledge Distillation from High-Order TSK to Low-Order TSK. Feb 16, 2023. Tags: Benchmarking, Knowledge Distillation.
LEALLA: Learning Lightweight Language-agnostic Sentence Embeddings with Knowledge Distillation. Feb 16, 2023. Tags: Knowledge Distillation, Sentence.
New Insights on Relieving Task-Recency Bias for Online Class Incremental Learning. Feb 16, 2023. Tags: Class Incremental Learning.
ST-MFNet Mini: Knowledge Distillation-Driven Frame Interpolation. Feb 16, 2023. Tags: Knowledge Distillation, Network Pruning. [Code available]
Offline-to-Online Knowledge Distillation for Video Instance Segmentation. Feb 15, 2023. Tags: Data Augmentation, Instance Segmentation. [Code available]
A lightweight network for photovoltaic cell defect detection in electroluminescence images based on neural architecture search and knowledge distillation. Feb 15, 2023. Tags: Data Augmentation, Defect Detection.