Aligning in a Compact Space: Contrastive Knowledge Distillation between Heterogeneous Architectures (May 28, 2024). Tags: Contrastive Learning, Knowledge Distillation.
UniCompress: Enhancing Multi-Data Medical Image Compression with Knowledge Distillation (May 27, 2024). Tags: Image Compression, Knowledge Distillation.
TIMA: Text-Image Mutual Awareness for Balancing Zero-Shot Adversarial Robustness and Generalization Ability (May 27, 2024). Tags: Adversarial Robustness, Knowledge Distillation.
P4: Towards private, personalized, and Peer-to-Peer learning (May 27, 2024). Tags: Knowledge Distillation.
A Classifier-Free Incremental Learning Framework for Scalable Medical Image Segmentation (May 25, 2024). Tags: Contrastive Learning, Image Segmentation.
Noisy Data Meets Privacy: Training Local Models with Post-Processed Remote Queries (May 25, 2024). Tags: Knowledge Distillation, Model Extraction.
Harnessing Increased Client Participation with Cohort-Parallel Federated Learning (May 24, 2024). Tags: Federated Learning, Image Classification.
Leveraging knowledge distillation for partial multi-task learning from multiple remote sensing datasets (May 24, 2024). Tags: Knowledge Distillation, Multi-Task Learning.
AdaGMLP: AdaBoosting GNN-to-MLP Knowledge Distillation (May 23, 2024). Tags: Knowledge Distillation. [Code available]
Pre-Trained Vision-Language Models as Partial Annotators (May 23, 2024). Tags: Contrastive Learning, Image Classification. [Code available]
Efficient Multitask Dense Predictor via Binarization (May 23, 2024). Tags: Binarization, Knowledge Distillation.
HoverFast: an accurate, high-throughput, clinically deployable nuclear segmentation tool for brightfield digital pathology images (May 22, 2024). Tags: GPU, Knowledge Distillation. [Code available]
Low-Resolution Chest X-ray Classification via Knowledge Distillation and Multi-task Learning (May 22, 2024). Tags: Diagnostic, Knowledge Distillation.
Data-Free Federated Class Incremental Learning with Diffusion-Based Generative Memory (May 22, 2024). Tags: Class Incremental Learning.
Joint Optimization of Streaming and Non-Streaming Automatic Speech Recognition with Multi-Decoder and Knowledge Distillation (May 22, 2024). Tags: Automatic Speech Recognition (ASR).
Why Not Transform Chat Large Language Models to Non-English? (May 22, 2024). Tags: Knowledge Distillation.
Exploring Dark Knowledge under Various Teacher Capacities and Addressing Capacity Mismatch (May 21, 2024). Tags: Knowledge Distillation. [Code available]
Active Object Detection with Knowledge Aggregation and Distillation from Large Models (May 21, 2024). Tags: Active Object Detection, Decision Making.
TinyM^2Net-V3: Memory-Aware Compressed Multimodal Deep Neural Networks for Sustainable Edge Deployment (May 20, 2024). Tags: Knowledge Distillation, Model Compression. [Code available]
GeoMask3D: Geometrically Informed Mask Selection for Self-Supervised Point Cloud Learning in 3D (May 20, 2024). Tags: Knowledge Distillation, Self-Supervised Learning.
Federated Learning for Time-Series Healthcare Sensing with Incomplete Modalities (May 20, 2024). Tags: Computational Efficiency, Federated Learning.
Stereo-Knowledge Distillation from dpMV to Dual Pixels for Light Field Video Reconstruction (May 20, 2024). Tags: Autonomous Driving, Knowledge Distillation. [Code available]
Evolving Storytelling: Benchmarks and Methods for New Character Customization with Diffusion Models (May 20, 2024). Tags: Knowledge Distillation, Story Generation.
Efficiency optimization of large-scale language models based on deep learning in natural language processing tasks (May 20, 2024). Tags: Inference Optimization, Knowledge Distillation.
Distill-then-prune: An Efficient Compression Framework for Real-time Stereo Matching Network on Edge Devices (May 20, 2024). Tags: Knowledge Distillation, Stereo Matching.
Hierarchical Selective Classification (May 19, 2024). Tags: Classification, Knowledge Distillation.
Nickel and Diming Your GAN: A Dual-Method Approach to Enhancing GAN Efficiency via Knowledge Distillation (May 19, 2024). Tags: Knowledge Distillation.
Cross-Domain Knowledge Distillation for Low-Resolution Human Pose Estimation (May 19, 2024). Tags: Knowledge Distillation, Pose Estimation.
INDUS: Effective and Efficient Language Models for Scientific Applications (May 17, 2024). Tags: Contrastive Learning, Information Retrieval.
Densely Distilling Cumulative Knowledge for Continual Learning (May 16, 2024). Tags: Continual Learning.
Distilling Implicit Multimodal Knowledge into Large Language Models for Zero-Resource Dialogue Generation (May 16, 2024). Tags: Dialogue Generation, Knowledge Distillation.
QCRD: Quality-guided Contrastive Rationale Distillation for Large Language Models (May 14, 2024). Tags: Contrastive Learning, Denoising. [Code available]
GLiRA: Black-Box Membership Inference Attack via Knowledge Distillation (May 13, 2024). Tags: Image Classification.
Meta-Learned Modality-Weighted Knowledge Distillation for Robust Multi-Modal Learning with Missing Data (May 12, 2024). Tags: Brain Tumor Segmentation, Classification. [Code available]
AdaKD: Dynamic Knowledge Distillation of ASR models using Adaptive Loss Weighting (May 11, 2024). Tags: Knowledge Distillation, Model Compression. [Code available]
For the Misgendered Chinese in Gender Bias Research: Multi-Task Learning with Knowledge Distillation for Pinyin Name-Gender Prediction (May 10, 2024). Tags: Gender Prediction, Knowledge Distillation.
MH-pFLID: Model Heterogeneous personalized Federated Learning via Injection and Distillation for Medical Data Analysis (May 10, 2024). Tags: Federated Learning, Knowledge Distillation.
Attend, Distill, Detect: Attention-aware Entropy Distillation for Anomaly Detection (May 10, 2024). Tags: Anomaly Detection, Knowledge Distillation.
From Algorithm to Hardware: A Survey on Efficient and Safe Deployment of Deep Neural Networks (May 9, 2024). Tags: Knowledge Distillation, Model Compression. [Code available]
CourseGPT-zh: an Educational Large Language Model Based on Knowledge Distillation Incorporating Prompt Optimization (May 8, 2024). Tags: Diversity, Knowledge Distillation.
Less-supervised learning with knowledge distillation for sperm morphology analysis (May 8, 2024). Tags: Anomaly Detection, Knowledge Distillation.
Markowitz Meets Bellman: Knowledge-distilled Reinforcement Learning for Portfolio Management (May 8, 2024). Tags: Knowledge Distillation, Management. [Code available]
A Review on Discriminative Self-supervised Learning Methods in Computer Vision (May 8, 2024). Tags: Clustering, Knowledge Distillation.
ELiTe: Efficient Image-to-LiDAR Knowledge Transfer for Semantic Segmentation (May 7, 2024). Tags: Knowledge Distillation, LiDAR Semantic Segmentation.
GOVERN: Gradient Orientation Vote Ensemble for Multi-Teacher Reinforced Distillation (May 6, 2024). Tags: Knowledge Distillation, Question Answering.
Mind the Gap Between Synthetic and Real: Utilizing Transfer Learning to Probe the Boundaries of Stable Diffusion Generated Data (May 6, 2024). Tags: Data-Free Knowledge Distillation, Knowledge Distillation.
Exploring Extreme Quantization in Spiking Language Models (May 4, 2024). Tags: Knowledge Distillation, Language Modeling.
Sub-goal Distillation: A Method to Improve Small Language Agents (May 4, 2024). Tags: Imitation Learning, Knowledge Distillation.
Semantic Objective Functions: A distribution-aware method for adding logical constraints in deep learning (May 3, 2024). Tags: Knowledge Distillation. [Code available]
Efficient Compression of Multitask Multilingual Speech Models (May 2, 2024). Tags: Automatic Speech Recognition (ASR).