Do Not Blindly Imitate the Teacher: Using Perturbed Loss for Knowledge Distillation (May 8, 2023). Tags: Knowledge Distillation.
Web Content Filtering through knowledge distillation of Large Language Models (May 8, 2023). Tags: Knowledge Distillation.
NeuroComparatives: Neuro-Symbolic Distillation of Comparative Knowledge (May 8, 2023). Tags: Knowledge Distillation.
Structural and Statistical Texture Knowledge Distillation for Semantic Segmentation (May 6, 2023). Tags: Knowledge Distillation, Quantization.
Distilled Mid-Fusion Transformer Networks for Multi-Modal Human Activity Recognition (May 5, 2023). Tags: Activity Recognition, Feature Engineering.
Smaller3d: Smaller Models for 3D Semantic Segmentation Using Minkowski Engine and Knowledge Distillation Methods (May 4, 2023). Tags: 3D Semantic Segmentation, Knowledge Distillation.
A Systematic Study of Knowledge Distillation for Natural Language Generation with Pseudo-Target Training (May 3, 2023). Tags: Knowledge Distillation, Text Generation. [Code available]
Structure Aware Incremental Learning with Personalized Imitation Weights for Recommender Systems (May 2, 2023). Tags: Incremental Learning, Knowledge Distillation. [Code available]
Distill or Annotate? Cost-Efficient Fine-Tuning of Compact Models (May 2, 2023). Tags: Knowledge Distillation.
Detect, Distill and Update: Learned DB Systems Facing Out of Distribution Data (May 1, 2023). Tags: Knowledge Distillation, Synthetic Data Generation.
Scaffolding a Student to Instill Knowledge (May 1, 2023). Tags: Knowledge Distillation. [Code available]
Refined Response Distillation for Class-Incremental Player Detection (May 1, 2023). Tags: Knowledge Distillation, Object Detection. [Code available]
Ensemble Modeling with Contrastive Knowledge Distillation for Sequential Recommendation (Apr 28, 2023). Tags: Attribute, Contrastive Learning. [Code available]
Multi-to-Single Knowledge Distillation for Point Cloud Semantic Segmentation (Apr 28, 2023). Tags: Knowledge Distillation, Semantic Segmentation. [Code available]
CORSD: Class-Oriented Relational Self Distillation (Apr 28, 2023). Tags: Knowledge Distillation, Model Compression. [Code available]
Learning Human-Human Interactions in Images from Weak Textual Supervision (Apr 27, 2023). Tags: Human-Human Interaction Recognition, Image Captioning.
Shape-Net: Room Layout Estimation from Panoramic Images Robust to Occlusion using Knowledge Distillation with 3D Shapes as Additional Inputs (Apr 25, 2023). Tags: 3D Geometry, 3D Reconstruction.
A Forward and Backward Compatible Framework for Few-shot Class-incremental Pill Recognition (Apr 24, 2023). Tags: Class Incremental Learning.
Interruption-Aware Cooperative Perception for V2X Communication-Aided Autonomous Driving (Apr 24, 2023). Tags: Autonomous Driving, Autonomous Vehicles. [Code available]
Improving Knowledge Distillation via Transferring Learning Ability (Apr 24, 2023). Tags: Knowledge Distillation.
Decouple Non-parametric Knowledge Distillation For End-to-end Speech Translation (Apr 20, 2023). Tags: Knowledge Distillation, Machine Translation. [Code available]
Word Sense Induction with Knowledge Distillation from BERT (Apr 20, 2023). Tags: Knowledge Distillation, Language Modeling.
Biologically inspired structure learning with reverse knowledge distillation for spiking neural networks (Apr 19, 2023). Tags: Knowledge Distillation.
Knowledge Distillation Under Ideal Joint Classifier Assumption (Apr 19, 2023). Tags: Domain Adaptation, Knowledge Distillation.
An Empirical Study of Leveraging Knowledge Distillation for Compressing Multilingual Neural Machine Translation Models (Apr 19, 2023). Tags: Knowledge Distillation, Machine Translation.
Deep Collective Knowledge Distillation (Apr 18, 2023). Tags: Knowledge Distillation, Model Compression.
Learning to "Segment Anything" in Thermal Infrared Images through Knowledge Distillation with a Large Scale Dataset SATIR (Apr 17, 2023). Tags: Image Segmentation, Knowledge Distillation.
LaSNN: Layer-wise ANN-to-SNN Distillation for Effective and Efficient Training in Deep Spiking Neural Networks (Apr 17, 2023). Tags: Knowledge Distillation. [Code available]
Always Strengthen Your Strengths: A Drift-Aware Incremental Learning Framework for CTR Prediction (Apr 17, 2023). Tags: Click-Through Rate Prediction, Diversity.
Teacher Network Calibration Improves Cross-Quality Knowledge Distillation (Apr 15, 2023). Tags: Image Classification.
Learn What Is Possible, Then Choose What Is Best: Disentangling One-To-Many Relations in Language Through Text-based Games (Apr 14, 2023). Tags: Knowledge Distillation, Text-based Games. [Code available]
Class-Incremental Learning of Plant and Disease Detection: Growing Branches with Knowledge Distillation (Apr 13, 2023). Tags: Class Incremental Learning. [Code available]
Constructing Deep Spiking Neural Networks from Artificial Neural Networks with Knowledge Distillation (Apr 12, 2023). Tags: Knowledge Distillation.
SFT-KD-Recon: Learning a Student-friendly Teacher for Knowledge Distillation in Magnetic Resonance Image Reconstruction (Apr 11, 2023). Tags: Image Reconstruction, Knowledge Distillation.
Grouped Knowledge Distillation for Deep Face Recognition (Apr 10, 2023). Tags: Face Recognition, Knowledge Distillation. [Code available]
A Survey on Recent Teacher-student Learning Studies (Apr 10, 2023). Tags: Knowledge Distillation, Survey.
HyperINR: A Fast and Predictive Hypernetwork for Implicit Neural Representations via Knowledge Distillation (Apr 9, 2023). Tags: Knowledge Distillation, Novel View Synthesis.
Homogenizing Non-IID datasets via In-Distribution Knowledge Distillation for Decentralized Learning (Apr 9, 2023). Tags: Image Classification.
A Comprehensive Survey on Knowledge Distillation of Diffusion Models (Apr 9, 2023). Tags: Knowledge Distillation, Survey.
Model-Agnostic Decentralized Collaborative Learning for On-Device POI Recommendation (Apr 8, 2023). Tags: Knowledge Distillation, Privacy Preserving.
Masked Student Dataset of Expressions (Apr 7, 2023). Tags: Contrastive Learning, Facial Expression Recognition.
Continual Detection Transformer for Incremental Object Detection (Apr 6, 2023). Tags: Class-Incremental Object Detection, Knowledge Distillation. [Code available]
Self-Distillation for Gaussian Process Regression and Classification (Apr 5, 2023). Tags: Classification, GPR.
Towards Efficient Task-Driven Model Reprogramming with Foundation Models (Apr 5, 2023). Tags: Knowledge Distillation, Transfer Learning. [Code available]
MadEye: Boosting Live Video Analytics Accuracy with Adaptive Camera Configurations (Apr 4, 2023). Tags: Knowledge Distillation.
Cross-Class Feature Augmentation for Class Incremental Learning (Apr 4, 2023). Tags: Class Incremental Learning.
Domain Generalization for Crop Segmentation with Standardized Ensemble Knowledge Distillation (Apr 3, 2023). Tags: Domain Generalization, Knowledge Distillation.
Knowledge-Distilled Graph Neural Networks for Personalized Epileptic Seizure Detection (Apr 3, 2023). Tags: Channel Selection, EEG. [Code available]
A Unified Compression Framework for Efficient Speech-Driven Talking-Face Generation (Apr 2, 2023). Tags: Face Generation, Knowledge Distillation.
Quick Dense Retrievers Consume KALE: Post Training Kullback Leibler Alignment of Embeddings for Asymmetrical dual encoders (Mar 31, 2023). Tags: Knowledge Distillation, Language Modeling.
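Nearly every entry above builds on the same teacher-student objective: the student is trained to match the teacher's temperature-softened output distribution via a KL-divergence term. A minimal sketch of that loss, using only the Python standard library (the function names and the temperature value are illustrative, not drawn from any paper listed):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl

# A student that already matches the teacher incurs zero loss;
# any mismatch produces a positive loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # prints 0.0
```

In practice this term is combined with the ordinary cross-entropy on hard labels, weighted by a mixing coefficient; several of the papers above differ mainly in what replaces or augments this KL term.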