Analyzing Compression Techniques for Computer Vision May 14, 2023 Knowledge Distillation Quantization
— Unverified 0On enhancing the robustness of Vision Transformers: Defensive Diffusion May 14, 2023 Computational Efficiency Denoising
Code Code Available 0Towards Understanding and Improving Knowledge Distillation for Neural Machine Translation May 14, 2023 Knowledge Distillation Machine Translation
Code Code Available 0AMTSS: An Adaptive Multi-Teacher Single-Student Knowledge Distillation Framework For Multilingual Language Inference May 13, 2023 Knowledge Distillation
— Unverified 0Black-box Source-free Domain Adaptation via Two-stage Knowledge Distillation May 13, 2023 Domain Adaptation Knowledge Distillation
— Unverified 0GSB: Group Superposition Binarization for Vision Transformer with Limited Training Samples May 13, 2023 Binarization Knowledge Distillation
Code Code Available 0A Lightweight Domain Adversarial Neural Network Based on Knowledge Distillation for EEG-based Cross-subject Emotion Recognition May 12, 2023 EEG Electroencephalogram (EEG)
— Unverified 0Knowledge distillation with Segment Anything (SAM) model for Planetary Geological Mapping May 12, 2023 Decoder Image Segmentation
— Unverified 0Improving Continual Relation Extraction by Distinguishing Analogous Semantics May 11, 2023 Continual Relation Extraction Knowledge Distillation
Code Code Available 1Long-Tailed Question Answering in an Open World May 11, 2023 Knowledge Distillation Language Modelling
— Unverified 0Serial Contrastive Knowledge Distillation for Continual Few-shot Relation Extraction May 11, 2023 Contrastive Learning Knowledge Distillation
Code Code Available 1A Survey on the Robustness of Computer Vision Models against Common Corruptions May 10, 2023 Data Augmentation Knowledge Distillation
Code Code Available 0Explainable Knowledge Distillation for On-device Chest X-Ray Classification May 10, 2023 Explainable artificial intelligence Explainable Artificial Intelligence (XAI)
— Unverified 0Multi-Teacher Knowledge Distillation For Text Image Machine Translation May 9, 2023 Decoder Knowledge Distillation
Code Code Available 0SRIL: Selective Regularization for Class-Incremental Learning May 9, 2023 class-incremental learning Class Incremental Learning
— Unverified 0FedNoRo: Towards Noise-Robust Federated Learning by Addressing Class Imbalance and Label Noise Heterogeneity May 9, 2023 Federated Learning Knowledge Distillation
Code Code Available 1SUR-adapter: Enhancing Text-to-Image Pre-trained Diffusion Models with Large Language Models May 9, 2023 Image Generation Knowledge Distillation
Code Code Available 1DynamicKD: An Effective Knowledge Distillation via Dynamic Entropy Correction-Based Distillation for Gap Optimizing May 9, 2023 Knowledge Distillation
— Unverified 0Distilling Script Knowledge from Large Language Models for Constrained Language Planning May 9, 2023 Knowledge Distillation
Code Code Available 1Do Not Blindly Imitate the Teacher: Using Perturbed Loss for Knowledge Distillation May 8, 2023 Knowledge Distillation
— Unverified 0NeuroComparatives: Neuro-Symbolic Distillation of Comparative Knowledge May 8, 2023 Knowledge Distillation valid
— Unverified 0Web Content Filtering through knowledge distillation of Large Language Models May 8, 2023 Knowledge Distillation
— Unverified 0Structural and Statistical Texture Knowledge Distillation for Semantic Segmentation May 6, 2023 Knowledge Distillation Quantization
— Unverified 0Distilled Mid-Fusion Transformer Networks for Multi-Modal Human Activity Recognition May 5, 2023 Activity Recognition Feature Engineering
— Unverified 0Smaller3d: Smaller Models for 3D Semantic Segmentation Using Minkowski Engine and Knowledge Distillation Methods May 4, 2023 3D Semantic Segmentation Knowledge Distillation
Code Code Available 0Avatar Knowledge Distillation: Self-ensemble Teacher Paradigm with Uncertainty May 4, 2023 Knowledge Distillation object-detection
Code Code Available 1SCOTT: Self-Consistent Chain-of-Thought Distillation May 3, 2023 counterfactual Counterfactual Reasoning
Code Code Available 1A Systematic Study of Knowledge Distillation for Natural Language Generation with Pseudo-Target Training May 3, 2023 Knowledge Distillation Text Generation
Code Code Available 0DeepAqua: Self-Supervised Semantic Segmentation of Wetland Surface Water Extent with SAR Images using Knowledge Distillation May 2, 2023 Knowledge Distillation Semantic Segmentation
Code Code Available 1Structure Aware Incremental Learning with Personalized Imitation Weights for Recommender Systems May 2, 2023 Incremental Learning Knowledge Distillation
— Unverified 0Distill or Annotate? Cost-Efficient Fine-Tuning of Compact Models May 2, 2023 Knowledge Distillation
— Unverified 0Detect, Distill and Update: Detect, Distill and Update: Learned DB Systems Facing Out of Distribution Data May 1, 2023 Knowledge Distillation Synthetic Data Generation
Code Code Available 0Refined Response Distillation for Class-Incremental Player Detection May 1, 2023 Knowledge Distillation object-detection
Code Code Available 0Scaffolding a Student to Instill Knowledge May 1, 2023 Knowledge Distillation
Code Code Available 0Multi-to-Single Knowledge Distillation for Point Cloud Semantic Segmentation Apr 28, 2023 Knowledge Distillation Semantic Segmentation
Code Code Available 0CORSD: Class-Oriented Relational Self Distillation Apr 28, 2023 Knowledge Distillation Model Compression
— Unverified 0Ensemble Modeling with Contrastive Knowledge Distillation for Sequential Recommendation Apr 28, 2023 Attribute Contrastive Learning
Code Code Available 0Learning Human-Human Interactions in Images from Weak Textual Supervision Apr 27, 2023 Human-Human Interaction Recognition Image Captioning
— Unverified 0A Symmetric Dual Encoding Dense Retrieval Framework for Knowledge-Intensive Visual Question Answering Apr 26, 2023 Decoder Knowledge Distillation
Code Code Available 1Shape-Net: Room Layout Estimation from Panoramic Images Robust to Occlusion using Knowledge Distillation with 3D Shapes as Additional Inputs Apr 25, 2023 3D geometry 3D Reconstruction
— Unverified 0Class Attention Transfer Based Knowledge Distillation Apr 25, 2023 Knowledge Distillation Model Compression
Code Code Available 1Improving Knowledge Distillation via Transferring Learning Ability Apr 24, 2023 Knowledge Distillation
Code Code Available 0A Forward and Backward Compatible Framework for Few-shot Class-incremental Pill Recognition Apr 24, 2023 class-incremental learning Class Incremental Learning
Code Code Available 0Interruption-Aware Cooperative Perception for V2X Communication-Aided Autonomous Driving Apr 24, 2023 Autonomous Driving Autonomous Vehicles
— Unverified 0Knowledge Distillation from 3D to Bird's-Eye-View for LiDAR Semantic Segmentation Apr 22, 2023 Autonomous Driving Knowledge Distillation
Code Code Available 1Decouple Non-parametric Knowledge Distillation For End-to-end Speech Translation Apr 20, 2023 Knowledge Distillation Machine Translation
— Unverified 0Train Your Own GNN Teacher: Graph-Aware Distillation on Textual Graphs Apr 20, 2023 Knowledge Distillation Node Classification
Code Code Available 1Word Sense Induction with Knowledge Distillation from BERT Apr 20, 2023 Knowledge Distillation Language Modeling
— Unverified 0Attention Weighted Local Descriptors Apr 19, 2023 3D Reconstruction Homography Estimation
Code Code Available 1Knowledge Distillation Under Ideal Joint Classifier Assumption Apr 19, 2023 Domain Adaptation Knowledge Distillation
— Unverified 0