- Distilling Large Language Models for Efficient Clinical Information Extraction (Dec 21, 2024; tags: Knowledge Distillation, Named-Entity Recognition)
- BabyHGRN: Exploring RNNs for Sample-Efficient Training of Language Models (Dec 20, 2024; tags: Knowledge Distillation, Language Modeling; code unverified)
- A New Method to Capturing Compositional Knowledge in Linguistic Space (Dec 20, 2024; tags: Image Retrieval, Knowledge Distillation; code unverified)
- Knowledge Distillation in RNN-Attention Models for Early Prediction of Student Performance (Dec 19, 2024; tags: Knowledge Distillation, Student Dropout; code unverified)
- SCKD: Semi-Supervised Cross-Modality Knowledge Distillation for 4D Radar Object Detection (Dec 19, 2024; tags: 3D Object Detection, Autonomous Vehicles; code available)
- Uncertainty-Guided Cross Attention Ensemble Mean Teacher for Semi-supervised Medical Image Segmentation (Dec 19, 2024; tags: Domain Generalization, Image Segmentation; code available)
- Self-Evolution Knowledge Distillation for LLM-based Machine Translation (Dec 19, 2024; tags: Knowledge Distillation, Machine Translation; code available)
- Canine EEG Helps Human: Cross-Species and Cross-Modality Epileptic Seizure Detection via Multi-Space Alignment (Dec 18, 2024; tags: Brain-Computer Interface, Diagnostic; code unverified)
- Hybrid Data-Free Knowledge Distillation (Dec 18, 2024; tags: Data-Free Knowledge Distillation, Generative Adversarial Network; code unverified)
- Enhancing Knowledge Distillation for LLMs with Response-Priming Prompting (Dec 18, 2024; tags: GSM8K, Knowledge Distillation; code available)
- On the Compression of Language Models for Code: An Empirical Study on CodeBERT (Dec 18, 2024; tags: Code Search, Code Summarization; code available)
- Scaling of Search and Learning: A Roadmap to Reproduce o1 from Reinforcement Learning Perspective (Dec 18, 2024; tags: Knowledge Distillation; code unverified)
- On Explaining Knowledge Distillation: Measuring and Visualising the Knowledge Transfer Process (Dec 18, 2024; tags: Knowledge Distillation, Transfer Learning; code unverified)
- Modality-Inconsistent Continual Learning of Multimodal Large Language Models (Dec 17, 2024; tags: Continual Learning, Knowledge Distillation; code unverified)
- Entire-Space Variational Information Exploitation for Post-Click Conversion Rate Prediction (Dec 17, 2024; tags: Knowledge Distillation, Recommendation Systems; code unverified)
- Efficient Speech Command Recognition Leveraging Spiking Neural Network and Curriculum Learning-based Knowledge Distillation (Dec 17, 2024; tags: Edge Computing, Knowledge Distillation; code unverified)
- Split Knowledge Distillation for Large Models in IoT: Architecture, Challenges, and Solutions (Dec 17, 2024; tags: Knowledge Distillation, Management; code unverified)
- In-Context Learning Distillation for Efficient Few-Shot Fine-Tuning (Dec 17, 2024; tags: In-Context Learning, Knowledge Distillation; code unverified)
- PromptDet: A Lightweight 3D Object Detection Framework with LiDAR Prompts (Dec 17, 2024; tags: 3D Object Detection, Depth Estimation; code unverified)
- Neural Collapse Inspired Knowledge Distillation (Dec 16, 2024; tags: Knowledge Distillation; code unverified)
- ProFe: Communication-Efficient Decentralized Federated Learning via Distillation and Prototypes (Dec 15, 2024; tags: Federated Learning, Knowledge Distillation; code unverified)
- On Distilling the Displacement Knowledge for Few-Shot Class-Incremental Learning (Dec 15, 2024; tags: Class-Incremental Learning; code unverified)
- Active Large Language Model-based Knowledge Distillation for Session-based Recommendation (Dec 15, 2024; tags: Active Learning, Knowledge Distillation; code unverified)
- Wearable Accelerometer Foundation Models for Health via Knowledge Distillation (Dec 15, 2024; tags: Activity Recognition, Cross-Modal Alignment; code unverified)
- Redefining Normal: A Novel Object-Level Approach for Multi-Object Novelty Detection (Dec 15, 2024; tags: Knowledge Distillation, Novelty Detection; code unverified)
- Knowledge Migration Framework for Smart Contract Vulnerability Detection (Dec 15, 2024; tags: Data-Free Knowledge Distillation, Knowledge Distillation; code available)
- Leveraging Large Language Models for Active Merchant Non-player Characters (Dec 15, 2024; tags: Knowledge Distillation; code unverified)
- ScaleOT: Privacy-utility-scalable Offsite-tuning with Dynamic LayerReplace and Selective Rank Compression (Dec 13, 2024; tags: Knowledge Distillation, Privacy Preserving; code available)
- LLM Distillation for Efficient Few-Shot Multiple Choice Question Answering (Dec 13, 2024; tags: Few-Shot Learning, Knowledge Distillation; code unverified)
- Can Students Beyond The Teacher? Distilling Knowledge from Teacher's Bias (Dec 13, 2024; tags: Knowledge Distillation, Model Compression; code unverified)
- Optimising TinyML with Quantization and Distillation of Transformer and Mamba Models for Indoor Localisation on Edge Devices (Dec 12, 2024; tags: Knowledge Distillation, Mamba; code unverified)
- All You Need in Knowledge Distillation Is a Tailored Coordinate System (Dec 12, 2024; tags: Few-Shot Learning; code unverified)
- DASK: Distribution Rehearsing via Adaptive Style Kernel Learning for Exemplar-Free Lifelong Person Re-Identification (Dec 12, 2024; tags: Exemplar-Free, Knowledge Distillation; code unverified)
- A Theoretical Analysis of Soft-Label vs Hard-Label Training in Neural Networks (Dec 12, 2024; tags: Binary Classification, Knowledge Distillation; code available)
- Multimodal Industrial Anomaly Detection by Crossmodal Reverse Distillation (Dec 12, 2024; tags: Anomaly Detection, Knowledge Distillation; code unverified)
- SnapGen: Taming High-Resolution Text-to-Image Models for Mobile Devices with Efficient Architectures and Training (Dec 12, 2024; tags: Knowledge Distillation, Text-to-Image Generation; code available)
- DAKD: Data Augmentation and Knowledge Distillation using Diffusion Models for SAR Oil Spill Segmentation (Dec 11, 2024; tags: Data Augmentation, Knowledge Distillation; code unverified)
- Efficient Gravitational Wave Parameter Estimation via Knowledge Distillation: A ResNet1D-IAF Approach (Dec 11, 2024; tags: Astronomy, Computational Efficiency; code unverified)
- TT-MPD: Test Time Model Pruning and Distillation (Dec 10, 2024; tags: Knowledge Distillation, model; code unverified)
- FM2DS: Few-Shot Multimodal Multihop Data Synthesis with Knowledge Distillation for Question Answering (Dec 9, 2024; tags: Knowledge Distillation, Question Answering; code unverified)
- U-Know-DiffPAN: An Uncertainty-aware Knowledge Distillation Diffusion Framework with Details Enhancement for PAN-Sharpening (Dec 9, 2024; tags: Knowledge Distillation; code available)
- Enhancing Content Representation for AR Image Quality Assessment Using Knowledge Distillation (Dec 8, 2024; tags: Image Quality Assessment, Knowledge Distillation; code unverified)
- Domain-Specific Translation with Open-Source Large Language Models: Resource-Oriented Analysis (Dec 8, 2024; tags: Decoder, Knowledge Distillation; code unverified)
- Neighborhood Commonality-aware Evolution Network for Continuous Generalized Category Discovery (Dec 7, 2024; tags: Contrastive Learning, Incremental Learning; code unverified)
- BEExformer: A Fast Inferencing Transformer Architecture via Binarization with Multiple Early Exits (Dec 6, 2024; tags: Binarization, Knowledge Distillation; code available)
- CCS: Continuous Learning for Customized Incremental Wireless Sensing Services (Dec 6, 2024; tags: Action Recognition, Knowledge Distillation; code unverified)
- Diffusion-Augmented Coreset Expansion for Scalable Dataset Distillation (Dec 5, 2024; tags: Bilevel Optimization, Computational Efficiency; code unverified)
- FedDW: Distilling Weights through Consistency Optimization in Heterogeneous Federated Learning (Dec 5, 2024; tags: Federated Learning, Knowledge Distillation; code unverified)
- Expanding Deep Learning-based Sensing Systems with Multi-Source Knowledge Transfer (Dec 5, 2024; tags: Deep Learning, Knowledge Distillation; code available)
- Enhancing CLIP Conceptual Embedding through Knowledge Distillation (Dec 4, 2024; tags: Contrastive Learning, Knowledge Distillation; code unverified)
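Nearly every entry above builds on the same core recipe (the subject of, e.g., the soft-label vs hard-label analysis of Dec 12): a student model is trained to match a teacher's temperature-softened output distribution in addition to the ground-truth labels. A minimal NumPy sketch of that canonical distillation loss is below; the function name and the hyperparameters `T` and `alpha` are illustrative choices, not taken from any paper listed here.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T flattens the distribution."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()                     # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, hard_label, T=2.0, alpha=0.5):
    """alpha * CE(hard label) + (1 - alpha) * T^2 * KL(teacher_T || student_T).

    The T^2 factor keeps the soft-label gradient magnitude roughly
    constant as the temperature changes.
    """
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    eps = 1e-12                         # guard against log(0)
    kl = np.sum(p_teacher * (np.log(p_teacher + eps) - np.log(p_student + eps)))
    ce = -np.log(softmax(student_logits)[hard_label] + eps)
    return alpha * ce + (1.0 - alpha) * (T ** 2) * kl

# When student and teacher agree exactly, the KL term vanishes and only
# the hard-label cross-entropy remains.
loss = distillation_loss([0.1, 0.2, 0.3], [3.0, -1.0, -1.0], hard_label=0)
```

In practice the same weighted sum appears in most of the LLM- and vision-distillation papers above, with the KL term replaced or augmented by feature-level or attention-level matching.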