An Empirical Study of Leveraging Knowledge Distillation for Compressing Multilingual Neural Machine Translation Models Apr 19, 2023 Knowledge Distillation Machine Translation
— Unverified 0Biologically inspired structure learning with reverse knowledge distillation for spiking neural networks Apr 19, 2023 Knowledge Distillation
— Unverified 0Deep Collective Knowledge Distillation Apr 18, 2023 Knowledge Distillation Model Compression
— Unverified 0LaSNN: Layer-wise ANN-to-SNN Distillation for Effective and Efficient Training in Deep Spiking Neural Networks Apr 17, 2023 Knowledge Distillation
— Unverified 0Always Strengthen Your Strengths: A Drift-Aware Incremental Learning Framework for CTR Prediction Apr 17, 2023 Click-Through Rate Prediction Diversity
— Unverified 0OVTrack: Open-Vocabulary Multiple Object Tracking Apr 17, 2023 Denoising Hallucination
Code Code Available 1Learning to "Segment Anything" in Thermal Infrared Images through Knowledge Distillation with a Large Scale Dataset SATIR Apr 17, 2023 Image Segmentation Knowledge Distillation
Code Code Available 0Robust Cross-Modal Knowledge Distillation for Unconstrained Videos Apr 16, 2023 Action Recognition Audio Tagging
Code Code Available 1Teacher Network Calibration Improves Cross-Quality Knowledge Distillation Apr 15, 2023 image-classification Image Classification
Code Code Available 0Learn What Is Possible, Then Choose What Is Best: Disentangling One-To-Many Relations in Language Through Text-based Games Apr 14, 2023 Knowledge Distillation text-based games
Code Code Available 0Multi-Mode Online Knowledge Distillation for Self-Supervised Visual Representation Learning Apr 13, 2023 Knowledge Distillation Representation Learning
Code Code Available 1Class-Incremental Learning of Plant and Disease Detection: Growing Branches with Knowledge Distillation Apr 13, 2023 class-incremental learning Class Incremental Learning
— Unverified 0Constructing Deep Spiking Neural Networks from Artificial Neural Networks with Knowledge Distillation Apr 12, 2023 Knowledge Distillation
— Unverified 0SFT-KD-Recon: Learning a Student-friendly Teacher for Knowledge Distillation in Magnetic Resonance Image Reconstruction Apr 11, 2023 Image Reconstruction Knowledge Distillation
Code Code Available 0Grouped Knowledge Distillation for Deep Face Recognition Apr 10, 2023 Face Recognition Knowledge Distillation
— Unverified 0A Survey on Recent Teacher-student Learning Studies Apr 10, 2023 Knowledge Distillation Survey
— Unverified 0HyperINR: A Fast and Predictive Hypernetwork for Implicit Neural Representations via Knowledge Distillation Apr 9, 2023 Knowledge Distillation Novel View Synthesis
— Unverified 0Homogenizing Non-IID datasets via In-Distribution Knowledge Distillation for Decentralized Learning Apr 9, 2023 image-classification Image Classification
— Unverified 0A Comprehensive Survey on Knowledge Distillation of Diffusion Models Apr 9, 2023 Knowledge Distillation Survey
— Unverified 0Model-Agnostic Decentralized Collaborative Learning for On-Device POI Recommendation Apr 8, 2023 Knowledge Distillation Privacy Preserving
— Unverified 0Continual Learning for LiDAR Semantic Segmentation: Class-Incremental and Coarse-to-Fine strategies on Sparse Data Apr 8, 2023 class-incremental learning Class Incremental Learning
Code Code Available 1Masked Student Dataset of Expressions Apr 7, 2023 Contrastive Learning Facial Expression Recognition
Code Code Available 0Continual Detection Transformer for Incremental Object Detection Apr 6, 2023 Class-Incremental Object Detection Knowledge Distillation
— Unverified 0DiGA: Distil to Generalize and then Adapt for Domain Adaptive Semantic Segmentation Apr 5, 2023 Data Augmentation Knowledge Distillation
Code Code Available 1Towards Efficient Task-Driven Model Reprogramming with Foundation Models Apr 5, 2023 Knowledge Distillation Transfer Learning
— Unverified 0Self-Distillation for Gaussian Process Regression and Classification Apr 5, 2023 Classification GPR
Code Code Available 0MadEye: Boosting Live Video Analytics Accuracy with Adaptive Camera Configurations Apr 4, 2023 Knowledge Distillation
— Unverified 0Selective Knowledge Sharing for Privacy-Preserving Federated Distillation without A Good Teacher Apr 4, 2023 Federated Learning Knowledge Distillation
Code Code Available 1Cross-Class Feature Augmentation for Class Incremental Learning Apr 4, 2023 class-incremental learning Class Incremental Learning
— Unverified 0Knowledge-Distilled Graph Neural Networks for Personalized Epileptic Seizure Detection Apr 3, 2023 channel selection EEG
— Unverified 0Vision-Language Models for Vision Tasks: A Survey Apr 3, 2023 Benchmarking Knowledge Distillation
Code Code Available 4Domain Generalization for Crop Segmentation with Standardized Ensemble Knowledge Distillation Apr 3, 2023 Domain Generalization Knowledge Distillation
Code Code Available 0A Unified Compression Framework for Efficient Speech-Driven Talking-Face Generation Apr 2, 2023 Face Generation Knowledge Distillation
— Unverified 0Selective Knowledge Distillation for Non-Autoregressive Neural Machine Translation Mar 31, 2023 Knowledge Distillation Machine Translation
— Unverified 0Quick Dense Retrievers Consume KALE: Post Training Kullback Leibler Alignment of Embeddings for Asymmetrical dual encoders Mar 31, 2023 Knowledge Distillation Language Modeling
— Unverified 0GVP: Generative Volumetric Primitives Mar 31, 2023 Image Generation Knowledge Distillation
— Unverified 0Knowledge Distillation for Feature Extraction in Underwater VSLAM Mar 31, 2023 Binarization Knowledge Distillation
Code Code Available 1oBERTa: Improving Sparse Transfer Learning via improved initialization, distillation, and pruning regimes Mar 30, 2023 Knowledge Distillation Model Compression
— Unverified 0If At First You Don't Succeed: Test Time Re-ranking for Zero-shot, Cross-domain Retrieval Mar 30, 2023 Image Retrieval Knowledge Distillation
— Unverified 0Kaizen: Practical Self-supervised Continual Learning with Continual Fine-tuning Mar 30, 2023 Continual Learning Knowledge Distillation
Code Code Available 1KD-DLGAN: Data Limited Image Generation via Knowledge Distillation Mar 30, 2023 Diversity Image Generation
— Unverified 0Asymmetric Image Retrieval with Cross Model Compatible Ensembles Mar 30, 2023 Diversity Face Recognition
— Unverified 0SimDistill: Simulated Multi-modal Distillation for BEV 3D Object Detection Mar 29, 2023 3D geometry 3D Object Detection
Code Code Available 1Dice Semimetric Losses: Optimizing the Dice Score with Soft Labels Mar 28, 2023 Knowledge Distillation
Code Code Available 1Information-Theoretic GAN Compression with Variational Energy-based Model Mar 28, 2023 Image Enhancement Knowledge Distillation
— Unverified 0HOICLIP: Efficient Knowledge Transfer for HOI Detection with Vision-Language Models Mar 28, 2023 Decoder Human-Object Interaction Detection
Code Code Available 1SELF-VS: Self-supervised Encoding Learning For Video Summarization Mar 28, 2023 Knowledge Distillation Representation Learning
Code Code Available 0Projected Latent Distillation for Data-Agnostic Consolidation in Distributed Continual Learning Mar 28, 2023 Continual Learning Knowledge Distillation
Code Code Available 0DisWOT: Student Architecture Search for Distillation WithOut Training Mar 28, 2023 Knowledge Distillation
Code Code Available 1Improving Neural Topic Models with Wasserstein Knowledge Distillation Mar 27, 2023 Knowledge Distillation Topic Models
Code Code Available 0