Knowledge Distillation Label Smoothing: Fact or Fallacy? Jan 30, 2023 Knowledge Distillation text-classification
— Unverified 0FractalAD: A simple industrial anomaly detection method using fractal anomaly generation and backbone knowledge distillation Jan 30, 2023 Anomaly Detection Knowledge Distillation
Code Code Available 0Knowledge Transfer from Pre-trained Language Models to Cif-based Speech Recognizers via Hierarchical Distillation Jan 30, 2023 Automatic Speech Recognition Knowledge Distillation
Code Code Available 1On student-teacher deviations in distillation: does it pay to disobey? Jan 30, 2023 Knowledge Distillation
— Unverified 0Few-shot Face Image Translation via GAN Prior Distillation Jan 28, 2023 Knowledge Distillation Translation
— Unverified 0Supervision Complexity and its Role in Knowledge Distillation Jan 28, 2023 image-classification Image Classification
— Unverified 0MVKT-ECG: Efficient Single-lead ECG Classification on Multi-Label Arrhythmia by Multi-View Knowledge Transferring Jan 28, 2023 Diagnostic ECG Classification
— Unverified 0Improved knowledge distillation by utilizing backward pass knowledge in neural networks Jan 27, 2023 Knowledge Distillation Model Compression
— Unverified 0EmbedDistill: A Geometric Knowledge Distillation for Information Retrieval Jan 27, 2023 Information Retrieval Knowledge Distillation
— Unverified 0Can We Use Probing to Better Understand Fine-tuning and Knowledge Distillation of the BERT NLU? Jan 27, 2023 Knowledge Distillation Natural Language Understanding
— Unverified 0Improving Text-based Early Prediction by Distillation from Privileged Time-Series Text Jan 26, 2023 Knowledge Distillation Prediction
— Unverified 0OvarNet: Towards Open-vocabulary Object Attribute Recognition Jan 23, 2023 Attribute Knowledge Distillation
Code Code Available 1A Simple Recipe for Competitive Low-compute Self supervised Vision Models Jan 23, 2023 Knowledge Distillation
— Unverified 0Unifying Synergies between Self-supervised Learning and Dynamic Computation Jan 22, 2023 image-classification Image Classification
Code Code Available 0The Best of Both Worlds: Accurate Global and Personalized Models through Federated Learning with Data-Free Hyper-Knowledge Distillation Jan 21, 2023 Federated Learning Knowledge Distillation
— Unverified 0ProKD: An Unsupervised Prototypical Knowledge Distillation Network for Zero-Resource Cross-Lingual Named Entity Recognition Jan 21, 2023 Contrastive Learning Cross-Lingual NER
— Unverified 0RNAS-CL: Robust Neural Architecture Search by Cross-Layer Knowledge Distillation Jan 19, 2023 Knowledge Distillation Neural Architecture Search
— Unverified 0Adaptively Integrated Knowledge Distillation and Prediction Uncertainty for Continual Learning Jan 18, 2023 Continual Learning Knowledge Distillation
— Unverified 0Knowledge Distillation in Federated Edge Learning: A Survey Jan 14, 2023 Knowledge Distillation Survey
— Unverified 0A Cohesive Distillation Architecture for Neural Language Models Jan 12, 2023 Knowledge Distillation Language Modeling
— Unverified 0Effective Decision Boundary Learning for Class Incremental Learning Jan 12, 2023 class-incremental learning Class Incremental Learning
— Unverified 0TinyHD: Efficient Video Saliency Prediction with Heterogeneous Decoders using Hierarchical Maps Distillation Jan 11, 2023 Knowledge Distillation Prediction
Code Code Available 1Synthetic data generation method for data-free knowledge distillation in regression neural networks Jan 11, 2023 Data-free Knowledge Distillation Knowledge Distillation
Code Code Available 0Online Hyperparameter Optimization for Class-Incremental Learning Jan 11, 2023 class-incremental learning Class Incremental Learning
Code Code Available 1ERNIE 3.0 Tiny: Frustratingly Simple Method to Improve Task-Agnostic Distillation Generalization Jan 9, 2023 Knowledge Distillation Language Modelling
Code Code Available 0Designing an Improved Deep Learning-based Model for COVID-19 Recognition in Chest X-ray Images: A Knowledge Distillation Approach Jan 6, 2023 Knowledge Distillation
— Unverified 0Reference Twice: A Simple and Unified Baseline for Few-Shot Instance Segmentation Jan 3, 2023 Benchmarking Few-shot Instance Segmentation
Code Code Available 1RELIANT: Fair Knowledge Distillation for Graph Neural Networks Jan 3, 2023 Fairness Graph Learning
Code Code Available 0Knowledge-guided Causal Intervention for Weakly-supervised Object Localization Jan 3, 2023 Knowledge Distillation Object
Code Code Available 0Label-Guided Knowledge Distillation for Continual Semantic Segmentation on 2D Images and 3D Point Clouds Jan 1, 2023 Continual Semantic Segmentation Knowledge Distillation
Code Code Available 1Multi-Task Learning with Knowledge Distillation for Dense Prediction Jan 1, 2023 Boundary Detection Depth Estimation
— Unverified 0Automated Knowledge Distillation via Monte Carlo Tree Search Jan 1, 2023 image-classification Image Classification
Code Code Available 0TripLe: Revisiting Pretrained Model Reuse and Progressive Learning for Efficient Vision Transformer Scaling and Searching Jan 1, 2023 Knowledge Distillation Neural Architecture Search
— Unverified 0Continual Segment: Towards a Single, Unified and Non-forgetting Continual Segmentation Model of 143 Whole-body Organs in CT Scans Jan 1, 2023 Continual Semantic Segmentation Decoder
— Unverified 0Knowledge-Spreader: Learning Semi-Supervised Facial Action Dynamics by Consistifying Knowledge Granularity Jan 1, 2023 Knowledge Distillation
— Unverified 0UniKD: Universal Knowledge Distillation for Mimicking Homogeneous or Heterogeneous Object Detectors Jan 1, 2023 Knowledge Distillation
— Unverified 0Alleviating Catastrophic Forgetting of Incremental Object Detection via Within-Class and Between-Class Knowledge Distillation Jan 1, 2023 Knowledge Distillation object-detection
— Unverified 0Remembering Normality: Memory-guided Knowledge Distillation for Unsupervised Anomaly Detection Jan 1, 2023 Anomaly Detection Knowledge Distillation
Code Code Available 1MI-GAN: A Simple Baseline for Image Inpainting on Mobile Devices Jan 1, 2023 Efficient Neural Network Image Inpainting
Code Code Available 2Tiny Updater: Towards Efficient Neural Network-Driven Software Updating Jan 1, 2023 Efficient Neural Network image-classification
Code Code Available 0Data-Free Class-Incremental Hand Gesture Recognition Jan 1, 2023 class-incremental learning Class Incremental Learning
Code Code Available 1Distilling DETR with Visual-Linguistic Knowledge for Open-Vocabulary Object Detection Jan 1, 2023 Knowledge Distillation Language Modeling
Code Code Available 1Masked Autoencoders Are Stronger Knowledge Distillers Jan 1, 2023 Decoder Knowledge Distillation
— Unverified 0Dual Learning with Dynamic Knowledge Distillation for Partially Relevant Video Retrieval Jan 1, 2023 Knowledge Distillation Language Modelling
Code Code Available 1ICD-Face: Intra-class Compactness Distillation for Face Recognition Jan 1, 2023 Face Recognition Knowledge Distillation
— Unverified 0Beyond the Limitation of Monocular 3D Detector via Knowledge Distillation Jan 1, 2023 Knowledge Distillation
Code Code Available 0Data-Free Knowledge Distillation via Feature Exchange and Activation Region Constraint Jan 1, 2023 Data Augmentation Data-free Knowledge Distillation
Code Code Available 1ScaleKD: Distilling Scale-Aware Knowledge in Small Object Detector Jan 1, 2023 Knowledge Distillation object-detection
— Unverified 0Probabilistic Knowledge Distillation of Face Ensembles Jan 1, 2023 Face Image Quality Face Recognition
— Unverified 0Multi-Level Logit Distillation Jan 1, 2023 Knowledge Distillation Prediction
Code Code Available 1