Enhancing Low-Resource NMT with a Multilingual Encoder and Knowledge Distillation: A Case Study Jul 9, 2024 Knowledge Distillation Language Modeling
Code Code Available 0Reprogramming Distillation for Medical Foundation Models Jul 9, 2024 Knowledge Distillation Lightweight Deployment
Code Code Available 0Less is More: Efficient Brain-Inspired Learning for Autonomous Driving Trajectory Prediction Jul 9, 2024 Autonomous Driving Decision Making
— Unverified 0DεpS: Delayed ε-Shrinking for Faster Once-For-All Training Jul 8, 2024 All GPU
— Unverified 0Leveraging Topological Guidance for Improved Knowledge Distillation Jul 7, 2024 image-classification Image Classification
Code Code Available 0Topological Persistence Guided Knowledge Distillation for Wearable Sensor Data Jul 7, 2024 Activity Recognition Deep Learning
— Unverified 0Mind the Interference: Retaining Pre-trained Knowledge in Parameter Efficient Continual Learning of Vision-Language Models Jul 7, 2024 class-incremental learning Class Incremental Learning
Code Code Available 2Federated Knowledge Transfer Fine-tuning Large Server Model with Resource-Constrained IoT Clients Jul 7, 2024 Federated Learning Knowledge Distillation
— Unverified 0Improving Knowledge Distillation in Transfer Learning with Layer-wise Learning Rates Jul 5, 2024 Knowledge Distillation Transfer Learning
— Unverified 0Understanding the Gains from Repeated Self-Distillation Jul 5, 2024 Knowledge Distillation regression
— Unverified 0AMD: Automatic Multi-step Distillation of Large-scale Vision Models Jul 5, 2024 image-classification Image Classification
— Unverified 0DASS: Distilled Audio State Space Models Are Stronger and More Duration-Scalable Learners Jul 4, 2024 Audio Classification Audio Tagging
Code Code Available 1DSMix: Distortion-Induced Sensitivity Map Based Pre-training for No-Reference Image Quality Assessment Jul 4, 2024 Data Augmentation Image Quality Assessment
Code Code Available 0Relative Difficulty Distillation for Semantic Segmentation Jul 4, 2024 Knowledge Distillation Semantic Segmentation
Code Code Available 0Fully Fine-tuned CLIP Models are Efficient Few-Shot Learners Jul 4, 2024 Domain Generalization Few-Shot Learning
— Unverified 0Edge AI-Enabled Chicken Health Detection Based on Enhanced FCOS-Lite and Knowledge Distillation Jul 3, 2024 Knowledge Distillation Quantization
— Unverified 0Supporting Cross-language Cross-project Bug Localization Using Pre-trained Language Models Jul 3, 2024 Contrastive Learning CPU
— Unverified 0MLKD-BERT: Multi-level Knowledge Distillation for Pre-trained Language Models Jul 3, 2024 Extractive Question-Answering Knowledge Distillation
— Unverified 0Improving Conversational Abilities of Quantized Large Language Models via Direct Preference Alignment Jul 3, 2024 Chatbot Computational Efficiency
— Unverified 0Unified Anomaly Detection methods on Edge Device using Knowledge Distillation and Quantization Jul 3, 2024 Anomaly Detection CPU
— Unverified 0Improving Zero-shot Generalization of Learned Prompts via Unsupervised Knowledge Distillation Jul 3, 2024 Domain Generalization Knowledge Distillation
Code Code Available 2Accelerated Proton Resonance Frequency-based Magnetic Resonance Thermometry by Optimized Deep Learning Method Jul 3, 2024 Knowledge Distillation
Code Code Available 0A Unified Framework for 3D Scene Understanding Jul 3, 2024 Contrastive Learning Knowledge Distillation
Code Code Available 2Advancing Compressed Video Action Recognition through Progressive Knowledge Distillation Jul 2, 2024 Action Recognition Knowledge Distillation
Code Code Available 0ECAT: A Entire space Continual and Adaptive Transfer Learning Framework for Cross-Domain Recommendation Jul 2, 2024 Domain Adaptation Knowledge Distillation
— Unverified 0Adaptive Modality Balanced Online Knowledge Distillation for Brain-Eye-Computer based Dim Object Detection Jul 2, 2024 EEG Electroencephalogram (EEG)
Code Code Available 0Survey on Knowledge Distillation for Large Language Models: Methods, Evaluation, and Application Jul 2, 2024 Knowledge Distillation Survey
— Unverified 0Self-Cooperation Knowledge Distillation for Novel Class Discovery Jul 2, 2024 Knowledge Distillation Novel Class Discovery
— Unverified 0uDistil-Whisper: Label-Free Data Filtering for Knowledge Distillation in Low-Data Regimes Jul 1, 2024 Knowledge Distillation
Code Code Available 0AdaDistill: Adaptive Knowledge Distillation for Deep Face Recognition Jul 1, 2024 Face Recognition Knowledge Distillation
Code Code Available 1BAPO: Base-Anchored Preference Optimization for Overcoming Forgetting in Large Language Models Personalization Jun 30, 2024 Continual Learning General Knowledge
— Unverified 0FANFOLD: Graph Normalizing Flows-driven Asymmetric Network for Unsupervised Graph-Level Anomaly Detection Jun 29, 2024 Anomaly Detection Knowledge Distillation
Code Code Available 0Enhancing Accuracy and Parameter-Efficiency of Neural Representations for Network Parameterization Jun 29, 2024 Knowledge Distillation
— Unverified 0CSAKD: Knowledge Distillation with Cross Self-Attention for Hyperspectral and Multispectral Image Fusion Jun 28, 2024 Knowledge Distillation Super-Resolution
Code Code Available 1MuGSI: Distilling GNNs with Multi-Granularity Structural Information for Graph Classification Jun 28, 2024 Classification Graph Classification
Code Code Available 0Direct Preference Knowledge Distillation for Large Language Models Jun 28, 2024 Knowledge Distillation
— Unverified 0Instance Temperature Knowledge Distillation Jun 27, 2024 Decision Making Efficient Exploration
Code Code Available 0Aligning Teacher with Student Preferences for Tailored Training Data Generation Jun 27, 2024 In-Context Learning Knowledge Distillation
— Unverified 0On Reducing Activity with Distillation and Regularization for Energy Efficient Spiking Neural Networks Jun 26, 2024 Knowledge Distillation
— Unverified 0ConStyle v2: A Strong Prompter for All-in-One Image Restoration Jun 26, 2024 All GPU
Code Code Available 1Towards Optimal Trade-offs in Knowledge Distillation for CNNs and Vision Transformers at the Edge Jun 25, 2024 Knowledge Distillation
— Unverified 0Highly Constrained Coded Aperture Imaging Systems Design Via a Knowledge Distillation Approach Jun 25, 2024 Image Reconstruction Knowledge Distillation
— Unverified 0Sequential Editing for Lifelong Training of Speech Recognition Models Jun 25, 2024 Automatic Speech Recognition Automatic Speech Recognition (ASR)
— Unverified 0Preserving Node Distinctness in Graph Autoencoders via Similarity Distillation Jun 25, 2024 Decoder Knowledge Distillation
— Unverified 0Knowledge Distillation in Automated Annotation: Supervised Text Classification with LLM-Generated Training Labels Jun 25, 2024 Articles In-Context Learning
— Unverified 0MAGIC: Meta-Ability Guided Interactive Chain-of-Distillation for Effective-and-Efficient Vision-and-Language Navigation Jun 25, 2024 Knowledge Distillation Test unseen
Code Code Available 1Dual-Space Knowledge Distillation for Large Language Models Jun 25, 2024 Instruction Following Knowledge Distillation
Code Code Available 2InFiConD: Interactive No-code Fine-tuning with Concept-based Knowledge Distillation Jun 25, 2024 Knowledge Distillation
— Unverified 0Three-Stream Temporal-Shift Attention Network Based on Self-Knowledge Distillation for Micro-Expression Recognition Jun 25, 2024 Knowledge Distillation Micro Expression Recognition
Code Code Available 1WAVE: Weight Template for Adaptive Initialization of Variable-sized Models Jun 25, 2024 Knowledge Distillation Transfer Learning
— Unverified 0