Knowledge Distillation in RNN-Attention Models for Early Prediction of Student Performance Dec 19, 2024 Knowledge Distillation Student dropout
Code Code Available 0SCKD: Semi-Supervised Cross-Modality Knowledge Distillation for 4D Radar Object Detection Dec 19, 2024 3D Object Detection Autonomous Vehicles
Code Code Available 0Enhancing Knowledge Distillation for LLMs with Response-Priming Prompting Dec 18, 2024 GSM8K Knowledge Distillation
Code Code Available 0Canine EEG Helps Human: Cross-Species and Cross-Modality Epileptic Seizure Detection via Multi-Space Alignment Dec 18, 2024 Brain Computer Interface Diagnostic
— Unverified 0A Survey on Inference Optimization Techniques for Mixture of Experts Models Dec 18, 2024 Computational Efficiency Distributed Computing
Code Code Available 3Learnable Prompting SAM-induced Knowledge Distillation for Semi-supervised Medical Image Segmentation Dec 18, 2024 Image Segmentation Knowledge Distillation
Code Code Available 2Hybrid Data-Free Knowledge Distillation Dec 18, 2024 Data-free Knowledge Distillation Generative Adversarial Network
Code Code Available 0On Explaining Knowledge Distillation: Measuring and Visualising the Knowledge Transfer Process Dec 18, 2024 Knowledge Distillation Transfer Learning
— Unverified 0Scaling of Search and Learning: A Roadmap to Reproduce o1 from Reinforcement Learning Perspective Dec 18, 2024 Knowledge Distillation
Code Code Available 0On the Compression of Language Models for Code: An Empirical Study on CodeBERT Dec 18, 2024 Code Search Code Summarization
— Unverified 0Entire-Space Variational Information Exploitation for Post-Click Conversion Rate Prediction Dec 17, 2024 Knowledge Distillation Recommendation Systems
— Unverified 0Split Knowledge Distillation for Large Models in IoT: Architecture, Challenges, and Solutions Dec 17, 2024 Knowledge Distillation Management
— Unverified 0In-Context Learning Distillation for Efficient Few-Shot Fine-Tuning Dec 17, 2024 In-Context Learning Knowledge Distillation
— Unverified 0Modality-Inconsistent Continual Learning of Multimodal Large Language Models Dec 17, 2024 Continual Learning Knowledge Distillation
— Unverified 0Efficient Speech Command Recognition Leveraging Spiking Neural Network and Curriculum Learning-based Knowledge Distillation Dec 17, 2024 Edge-computing Knowledge Distillation
— Unverified 0PromptDet: A Lightweight 3D Object Detection Framework with LiDAR Prompts Dec 17, 2024 3D Object Detection Depth Estimation
— Unverified 0Relation-Guided Adversarial Learning for Data-free Knowledge Transfer Dec 16, 2024 Data-free Knowledge Distillation Data Free Quantization
Code Code Available 1BiM-VFI: directional Motion Field-Guided Frame Interpolation for Video with Non-uniform Motions Dec 16, 2024 Knowledge Distillation Motion Estimation
Code Code Available 2Neural Collapse Inspired Knowledge Distillation Dec 16, 2024 Knowledge Distillation
— Unverified 0Active Large Language Model-based Knowledge Distillation for Session-based Recommendation Dec 15, 2024 Active Learning Knowledge Distillation
— Unverified 0Knowledge Migration Framework for Smart Contract Vulnerability Detection Dec 15, 2024 Data-free Knowledge Distillation Knowledge Distillation
— Unverified 0ProFe: Communication-Efficient Decentralized Federated Learning via Distillation and Prototypes Dec 15, 2024 Federated Learning Knowledge Distillation
— Unverified 0Wearable Accelerometer Foundation Models for Health via Knowledge Distillation Dec 15, 2024 Activity Recognition cross-modal alignment
— Unverified 0Leveraging Large Language Models for Active Merchant Non-player Characters Dec 15, 2024 Knowledge Distillation
Code Code Available 0On Distilling the Displacement Knowledge for Few-Shot Class-Incremental Learning Dec 15, 2024 class-incremental learning Class Incremental Learning
— Unverified 0Redefining Normal: A Novel Object-Level Approach for Multi-Object Novelty Detection Dec 15, 2024 Knowledge Distillation Novelty Detection
Code Code Available 0Can Students Beyond The Teacher? Distilling Knowledge from Teacher's Bias Dec 13, 2024 Knowledge Distillation Model Compression
— Unverified 0LLM Distillation for Efficient Few-Shot Multiple Choice Question Answering Dec 13, 2024 Few-Shot Learning Knowledge Distillation
— Unverified 0ScaleOT: Privacy-utility-scalable Offsite-tuning with Dynamic LayerReplace and Selective Rank Compression Dec 13, 2024 Knowledge Distillation Privacy Preserving
— Unverified 0Dynamic Contrastive Knowledge Distillation for Efficient Image Restoration Dec 12, 2024 Contrastive Learning Image Restoration
Code Code Available 1Optimising TinyML with Quantization and Distillation of Transformer and Mamba Models for Indoor Localisation on Edge Devices Dec 12, 2024 Knowledge Distillation Mamba
— Unverified 0Multimodal Industrial Anomaly Detection by Crossmodal Reverse Distillation Dec 12, 2024 Anomaly Detection Knowledge Distillation
Code Code Available 0All You Need in Knowledge Distillation Is a Tailored Coordinate System Dec 12, 2024 All Few-Shot Learning
— Unverified 0SnapGen: Taming High-Resolution Text-to-Image Models for Mobile Devices with Efficient Architectures and Training Dec 12, 2024 Knowledge Distillation Text-to-Image Generation
— Unverified 0DASK: Distribution Rehearsing via Adaptive Style Kernel Learning for Exemplar-Free Lifelong Person Re-Identification Dec 12, 2024 Exemplar-Free Knowledge Distillation
Code Code Available 0A Theoretical Analysis of Soft-Label vs Hard-Label Training in Neural Networks Dec 12, 2024 Binary Classification Knowledge Distillation
— Unverified 0Efficient Gravitational Wave Parameter Estimation via Knowledge Distillation: A ResNet1D-IAF Approach Dec 11, 2024 Astronomy Computational Efficiency
— Unverified 0Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation Dec 11, 2024 image-classification Image Classification
Code Code Available 2DAKD: Data Augmentation and Knowledge Distillation using Diffusion Models for SAR Oil Spill Segmentation Dec 11, 2024 Data Augmentation Knowledge Distillation
— Unverified 0Cloud Object Detector Adaptation by Integrating Different Source Knowledge Dec 10, 2024 Domain Adaptation Knowledge Distillation
Code Code Available 1TT-MPD: Test Time Model Pruning and Distillation Dec 10, 2024 Knowledge Distillation model
— Unverified 0Unlocking the Potential of Reverse Distillation for Anomaly Detection Dec 10, 2024 Anomaly Detection Decoder
Code Code Available 1FM2DS: Few-Shot Multimodal Multihop Data Synthesis with Knowledge Distillation for Question Answering Dec 9, 2024 Knowledge Distillation Question Answering
Code Code Available 0U-Know-DiffPAN: An Uncertainty-aware Knowledge Distillation Diffusion Framework with Details Enhancement for PAN-Sharpening Dec 9, 2024 Knowledge Distillation
— Unverified 0Domain-Specific Translation with Open-Source Large Language Models: Resource-Oriented Analysis Dec 8, 2024 Decoder Knowledge Distillation
— Unverified 0Enhancing Content Representation for AR Image Quality Assessment Using Knowledge Distillation Dec 8, 2024 Image Quality Assessment Knowledge Distillation
— Unverified 0Neighborhood Commonality-aware Evolution Network for Continuous Generalized Category Discovery Dec 7, 2024 Contrastive Learning Incremental Learning
Code Code Available 0CCS: Continuous Learning for Customized Incremental Wireless Sensing Services Dec 6, 2024 Action Recognition Knowledge Distillation
— Unverified 0BEExformer: A Fast Inferencing Transformer Architecture via Binarization with Multiple Early Exits Dec 6, 2024 Binarization Knowledge Distillation
— Unverified 0One-shot Federated Learning via Synthetic Distiller-Distillate Communication Dec 6, 2024 Data-free Knowledge Distillation Federated Learning
Code Code Available 1