Efficient and Robust Knowledge Distillation from A Stronger Teacher Based on Correlation Matching (Oct 9, 2024). Topics: Knowledge Distillation, Neural Network Compression.
KnowledgeSG: Privacy-Preserving Synthetic Text Generation with Knowledge Distillation from Server (Oct 8, 2024). Topics: Federated Learning, Knowledge Distillation. [Unverified]
Progressive distillation induces an implicit curriculum (Oct 7, 2024). Topics: Knowledge Distillation. [Code Available]
ReasoningRank: Teaching Student Models to Rank through Reasoning-Based Knowledge Distillation (Oct 7, 2024). Topics: Decision Making, Information Retrieval. [Unverified]
DAdEE: Unsupervised Domain Adaptation in Early Exit PLMs (Oct 6, 2024). Topics: Domain Adaptation, Knowledge Distillation. [Unverified]
CAPEEN: Image Captioning with Early Exits and Knowledge Distillation (Oct 6, 2024). Topics: Descriptive Image Captioning. [Code Available]
DiDOTS: Knowledge Distillation from Large-Language-Models for Dementia Obfuscation in Transcribed Speech (Oct 5, 2024). Topics: Hallucination, Knowledge Distillation. [Code Available]
Distillation-Free One-Step Diffusion for Real-World Image Super-Resolution (Oct 5, 2024). Topics: Image Super-Resolution, Knowledge Distillation. [Unverified]
Accelerating Diffusion Models with One-to-Many Knowledge Distillation (Oct 5, 2024). Topics: Image Generation, Knowledge Distillation. [Code Available]
Gap Preserving Distillation by Building Bidirectional Mappings with A Dynamic Teacher (Oct 5, 2024). Topics: Knowledge Distillation. [Unverified]
Self-Supervised Keypoint Detection with Distilled Depth Keypoint Representation (Oct 4, 2024). Topics: Keypoint Detection, Knowledge Distillation. [Unverified]
Learning from Committee: Reasoning Distillation from a Mixture of Teachers with Peer-Review (Oct 4, 2024). Topics: Knowledge Distillation, Logical Reasoning. [Unverified]
DocKD: Knowledge Distillation from LLMs for Open-World Document Understanding Models (Oct 4, 2024). Topics: Document Understanding, Knowledge Distillation. [Code Available]
Dataset Distillation via Knowledge Distillation: Towards Efficient Self-Supervised Pre-Training of Deep Networks (Oct 3, 2024). Topics: Dataset Distillation, Knowledge Distillation. [Unverified]
BLEND: Behavior-guided Neural Population Dynamics Modeling via Privileged Knowledge Distillation (Oct 2, 2024). Topics: Knowledge Distillation, Time Series Analysis. [Code Available]
PairDistill: Pairwise Relevance Distillation for Dense Retrieval (Oct 2, 2024). Topics: Information Retrieval, Knowledge Distillation. [Code Available]
PHI-S: Distribution Balancing for Label-Free Multi-Teacher Distillation (Oct 2, 2024). Topics: Knowledge Distillation. [Code Available]
"No Matter What You Do": Purifying GNN Models via Backdoor Unlearning (Oct 2, 2024). Topics: Backdoor Attack, Backdoor Defense. [Unverified]
Foldable SuperNets: Scalable Merging of Transformers with Different Initializations and Tasks (Oct 2, 2024). Topics: Knowledge Distillation. [Code Available]
HarmAug: Effective Data Augmentation for Knowledge Distillation of Safety Guard Models (Oct 2, 2024). Topics: Data Augmentation, Knowledge Distillation. [Code Available]
Self-Updatable Large Language Models with Parameter Integration (Oct 1, 2024). Topics: Continual Learning, Conversational Recommendation. [Code Available]
Local-to-Global Self-Supervised Representation Learning for Diabetic Retinopathy Grading (Oct 1, 2024). Topics: Diabetic Retinopathy Grading, Image Classification. [Unverified]
AMR-Evol: Adaptive Modular Response Evolution Elicits Better Knowledge Distillation for Large Language Models in Code Generation (Oct 1, 2024). Topics: Code Generation, HumanEval. [Unverified]
Compressing Recurrent Neural Networks for FPGA-accelerated Implementation in Fluorescence Lifetime Imaging (Oct 1, 2024). Topics: Computational Efficiency, Knowledge Distillation. [Code Available]
Efficient Technical Term Translation: A Knowledge Distillation Approach for Parenthetical Terminology Translation (Oct 1, 2024). Topics: Knowledge Distillation, Machine Translation. [Unverified]
Advancing Medical Radiograph Representation Learning: A Hybrid Pre-training Paradigm with Multilevel Semantic Granularity (Oct 1, 2024). Topics: Decoder, Knowledge Distillation. [Unverified]
Enhancing Romanian Offensive Language Detection through Knowledge Distillation, Multi-Task Learning, and Data Augmentation (Sep 30, 2024). Topics: Data Augmentation, Knowledge Distillation. [Unverified]
Classroom-Inspired Multi-Mentor Distillation with Adaptive Learning Strategies (Sep 30, 2024). Topics: 2D Human Pose Estimation, Image Classification. [Unverified]
HYDRA-FL: Hybrid Knowledge Distillation for Robust and Accurate Federated Learning (Sep 30, 2024). Topics: Federated Learning, Knowledge Distillation. [Unverified]
Linear Projections of Teacher Embeddings for Few-Class Distillation (Sep 30, 2024). Topics: Binary Classification, Knowledge Distillation. [Unverified]
Domain Consistency Representation Learning for Lifelong Person Re-Identification (Sep 30, 2024). Topics: Attribute, Knowledge Distillation. [Unverified]
Tailored Federated Learning: Leveraging Direction Regulation & Knowledge Distillation (Sep 29, 2024). Topics: Federated Learning, Knowledge Distillation. [Code Available]
InfantCryNet: A Data-driven Framework for Intelligent Analysis of Infant Cries (Sep 29, 2024). Topics: Knowledge Distillation, Model Compression. [Unverified]
Mind the Gap: Promoting Missing Modality Brain Tumor Segmentation with Alignment (Sep 28, 2024). Topics: Brain Tumor Segmentation, Knowledge Distillation. [Unverified]
Multi-modal Cross-domain Self-supervised Pre-training for fMRI and EEG Fusion (Sep 27, 2024). Topics: Data Augmentation, EEG. [Unverified]
Semi-Supervised Bone Marrow Lesion Detection from Knee MRI Segmentation Using Mask Inpainting Models (Sep 27, 2024). Topics: Anomaly Detection, Knowledge Distillation. [Unverified]
Student-Oriented Teacher Knowledge Refinement for Knowledge Distillation (Sep 27, 2024). Topics: Knowledge Distillation, Transfer Learning. [Unverified]
Towards Diverse Device Heterogeneous Federated Learning via Task Arithmetic Knowledge Integration (Sep 27, 2024). Topics: Federated Learning, Knowledge Distillation. [Unverified]
MiniVLN: Efficient Vision-and-Language Navigation by Progressive Knowledge Distillation (Sep 27, 2024). Topics: Knowledge Distillation, Vision and Language Navigation. [Code Available]
Harmonizing knowledge Transfer in Neural Network with Unified Distillation (Sep 27, 2024). Topics: Knowledge Distillation, Transfer Learning. [Unverified]
Kendall's τ Coefficient for Logits Distillation (Sep 26, 2024). Topics: Knowledge Distillation. [Unverified]
Weak-to-Strong Backdoor Attack for Large Language Models (Sep 26, 2024). Topics: Backdoor Attack, Knowledge Distillation. [Unverified]
Shape-intensity knowledge distillation for robust medical image segmentation (Sep 26, 2024). Topics: Image Segmentation, Knowledge Distillation. [Unverified]
MT2KD: Towards A General-Purpose Encoder for Speech, Speaker, and Audio Events (Sep 25, 2024). Topics: Audio Tagging, Automatic Speech Recognition. [Code Available]
Adverse Weather Optical Flow: Cumulative Homogeneous-Heterogeneous Adaptation (Sep 25, 2024). Topics: Domain Adaptation, Knowledge Distillation. [Unverified]
SelectiveKD: A semi-supervised framework for cancer detection in DBT through Knowledge Distillation and Pseudo-labeling (Sep 25, 2024). Topics: Cancer Classification, Knowledge Distillation. [Unverified]
Privacy Evaluation Benchmarks for NLP Models (Sep 24, 2024). Topics: Knowledge Distillation. [Unverified]
AIM 2024 Challenge on UHD Blind Photo Quality Assessment (Sep 24, 2024). Topics: 4K, Computational Efficiency. [Code Available]
Twin Network Augmentation: A Novel Training Strategy for Improved Spiking Neural Networks and Efficient Weight Quantization (Sep 24, 2024). Topics: Knowledge Distillation, Quantization. [Code Available]
TS-HTFA: Advancing Time Series Forecasting via Hierarchical Text-Free Alignment with Large Language Models (Sep 23, 2024). Topics: Contrastive Learning, Cross-Modal Alignment. [Unverified]