Why Not Transform Chat Large Language Models to Non-English? (May 22, 2024) - Knowledge Distillation
[Code] HoverFast: an accurate, high-throughput, clinically deployable nuclear segmentation tool for brightfield digital pathology images (May 22, 2024) - GPU, Knowledge Distillation
Low-Resolution Chest X-ray Classification via Knowledge Distillation and Multi-task Learning (May 22, 2024) - Diagnostic, Knowledge Distillation
Exploring Dark Knowledge under Various Teacher Capacities and Addressing Capacity Mismatch (May 21, 2024) - Knowledge Distillation
AMFD: Distillation via Adaptive Multimodal Fusion for Multispectral Pedestrian Detection (May 21, 2024) - Knowledge Distillation, Pedestrian Detection
[Code] Active Object Detection with Knowledge Aggregation and Distillation from Large Models (May 21, 2024) - Active Object Detection, Decision Making
[Code] CLRKDNet: Speeding up Lane Detection with Knowledge Distillation (May 21, 2024) - Autonomous Driving, Knowledge Distillation
[Code] GeoMask3D: Geometrically Informed Mask Selection for Self-Supervised Point Cloud Learning in 3D (May 20, 2024) - Knowledge Distillation, Self-Supervised Learning
TinyM^2Net-V3: Memory-Aware Compressed Multimodal Deep Neural Networks for Sustainable Edge Deployment (May 20, 2024) - Knowledge Distillation, Model Compression
Distill-then-prune: An Efficient Compression Framework for Real-time Stereo Matching Network on Edge Devices (May 20, 2024) - Knowledge Distillation, Stereo Matching
Evolving Storytelling: Benchmarks and Methods for New Character Customization with Diffusion Models (May 20, 2024) - Knowledge Distillation, Story Generation
Efficiency optimization of large-scale language models based on deep learning in natural language processing tasks (May 20, 2024) - Inference Optimization, Knowledge Distillation
Stereo-Knowledge Distillation from dpMV to Dual Pixels for Light Field Video Reconstruction (May 20, 2024) - Autonomous Driving, Knowledge Distillation
Federated Learning for Time-Series Healthcare Sensing with Incomplete Modalities (May 20, 2024) - Computational Efficiency, Federated Learning
[Code] Overcoming Data and Model Heterogeneities in Decentralized Federated Learning via Synthetic Anchors (May 19, 2024) - Domain Adaptation, Federated Learning
[Code] Cross-Domain Knowledge Distillation for Low-Resolution Human Pose Estimation (May 19, 2024) - Knowledge Distillation, Pose Estimation
Hierarchical Selective Classification (May 19, 2024) - Classification, Knowledge Distillation
Nickel and Diming Your GAN: A Dual-Method Approach to Enhancing GAN Efficiency via Knowledge Distillation (May 19, 2024) - Knowledge Distillation
INDUS: Effective and Efficient Language Models for Scientific Applications (May 17, 2024) - Contrastive Learning, Information Retrieval
Densely Distilling Cumulative Knowledge for Continual Learning (May 16, 2024) - Continual Learning
Distilling Implicit Multimodal Knowledge into Large Language Models for Zero-Resource Dialogue Generation (May 16, 2024) - Dialogue Generation, Knowledge Distillation
[Code] QCRD: Quality-guided Contrastive Rationale Distillation for Large Language Models (May 14, 2024) - Contrastive Learning, Denoising
GLiRA: Black-Box Membership Inference Attack via Knowledge Distillation (May 13, 2024) - Image Classification
[Code] Meta-Learned Modality-Weighted Knowledge Distillation for Robust Multi-Modal Learning with Missing Data (May 12, 2024) - Brain Tumor Segmentation, Classification
[Code] AdaKD: Dynamic Knowledge Distillation of ASR models using Adaptive Loss Weighting (May 11, 2024) - Knowledge Distillation, Model Compression
Attend, Distill, Detect: Attention-aware Entropy Distillation for Anomaly Detection (May 10, 2024) - Anomaly Detection, Knowledge Distillation
[Code] For the Misgendered Chinese in Gender Bias Research: Multi-Task Learning with Knowledge Distillation for Pinyin Name-Gender Prediction (May 10, 2024) - Gender Prediction, Knowledge Distillation
MH-pFLID: Model Heterogeneous personalized Federated Learning via Injection and Distillation for Medical Data Analysis (May 10, 2024) - Federated Learning, Knowledge Distillation
From Algorithm to Hardware: A Survey on Efficient and Safe Deployment of Deep Neural Networks (May 9, 2024) - Knowledge Distillation, Model Compression
Less-supervised learning with knowledge distillation for sperm morphology analysis (May 8, 2024) - Anomaly Detection, Knowledge Distillation
[Code] CourseGPT-zh: an Educational Large Language Model Based on Knowledge Distillation Incorporating Prompt Optimization (May 8, 2024) - Diversity, Knowledge Distillation
Markowitz Meets Bellman: Knowledge-distilled Reinforcement Learning for Portfolio Management (May 8, 2024) - Knowledge Distillation, Management
A Review on Discriminative Self-supervised Learning Methods in Computer Vision (May 8, 2024) - Clustering, Knowledge Distillation
ELiTe: Efficient Image-to-LiDAR Knowledge Transfer for Semantic Segmentation (May 7, 2024) - Knowledge Distillation, LiDAR Semantic Segmentation
GOVERN: Gradient Orientation Vote Ensemble for Multi-Teacher Reinforced Distillation (May 6, 2024) - Knowledge Distillation, Question Answering
Mind the Gap Between Synthetic and Real: Utilizing Transfer Learning to Probe the Boundaries of Stable Diffusion Generated Data (May 6, 2024) - Data-free Knowledge Distillation, Knowledge Distillation
Sub-goal Distillation: A Method to Improve Small Language Agents (May 4, 2024) - Imitation Learning, Knowledge Distillation
[Code] Exploring Extreme Quantization in Spiking Language Models (May 4, 2024) - Knowledge Distillation, Language Modeling
Semantic Objective Functions: A distribution-aware method for adding logical constraints in deep learning (May 3, 2024) - Knowledge Distillation
Advancing Pre-trained Teacher: Towards Robust Feature Discrepancy for Anomaly Detection (May 3, 2024) - Anomaly Detection, Attribute
[Code] Efficient Compression of Multitask Multilingual Speech Models (May 2, 2024) - Automatic Speech Recognition (ASR)
Error Exponent in Agnostic PAC Learning (May 1, 2024) - Binary Classification, Knowledge Distillation
Wake Vision: A Tailored Dataset and Benchmark Suite for TinyML Computer Vision Applications (May 1, 2024) - Human Detection, Knowledge Distillation
CrossMatch: Enhance Semi-Supervised Medical Image Segmentation with Perturbation Strategies and Knowledge Distillation (May 1, 2024) - Image Segmentation, Knowledge Distillation
[Code] Distillation Matters: Empowering Sequential Recommenders to Match the Performance of Large Language Model (May 1, 2024) - Knowledge Distillation, Language Modeling
[Code] Why does Knowledge Distillation Work? Rethink its Attention and Fidelity Mechanism (Apr 30, 2024) - Data Augmentation, Diversity
[Code] Knowledge Distillation vs. Pretraining from Scratch under a Fixed (Computation) Budget (Apr 30, 2024) - Knowledge Distillation, Language Modeling
Control Policy Correction Framework for Reinforcement Learning-based Energy Arbitrage Strategies (Apr 29, 2024) - Knowledge Distillation, Reinforcement Learning
Revealing the Two Sides of Data Augmentation: An Asymmetric Distillation-based Win-Win Solution for Open-Set Recognition (Apr 28, 2024) - Data Augmentation, Knowledge Distillation
Retrieval-Oriented Knowledge for Click-Through Rate Prediction (Apr 28, 2024) - Click-Through Rate Prediction, Contrastive Learning