DSG-KD: Knowledge Distillation from Domain-Specific to General Language Models (Sep 23, 2024). Tags: Knowledge Distillation, Transfer Learning.
Pre-trained Language Model and Knowledge Distillation for Lightweight Sequential Recommendation (Sep 23, 2024). Tags: Knowledge Distillation, Language Modeling. Code available.
Prior Knowledge Distillation Network for Face Super-Resolution (Sep 22, 2024). Tags: Knowledge Distillation, Super-Resolution.
DilateQuant: Accurate and Efficient Diffusion Quantization via Weight Dilation (Sep 22, 2024). Tags: Image Generation, Knowledge Distillation.
EchoAtt: Attend, Copy, then Adjust for More Efficient Large Language Models (Sep 22, 2024). Tags: Knowledge Distillation.
Generalization in birdsong classification: impact of transfer learning methods and dataset characteristics (Sep 21, 2024). Tags: Knowledge Distillation, Sound Classification.
On Importance of Pruning and Distillation for Efficient Low Resource NLP (Sep 21, 2024). Tags: Document Classification, GPU.
Neural-Symbolic Collaborative Distillation: Advancing Small Language Models for Complex Reasoning Tasks (Sep 20, 2024). Tags: ARC, GSM8K.
Fast Streaming Transducer ASR Prototyping via Knowledge Distillation with Whisper (Sep 20, 2024). Tags: Automatic Speech Recognition (ASR). Code available.
Simple Unsupervised Knowledge Distillation With Space Similarity (Sep 20, 2024). Tags: Knowledge Distillation, Self-Supervised Learning.
Enhancing TinyBERT for Financial Sentiment Analysis Using GPT-Augmented FinBERT Distillation (Sep 19, 2024). Tags: Data Augmentation, Edge Computing.
LLMR: Knowledge Distillation with a Large Language Model-Induced Reward (Sep 19, 2024). Tags: Dialogue Generation, Knowledge Distillation. Code available.
Exploring and Enhancing the Transfer of Distribution in Knowledge Distillation for Autoregressive Language Models (Sep 19, 2024). Tags: Knowledge Distillation.
Bayesian-Optimized One-Step Diffusion Model with Knowledge Distillation for Real-Time 3D Human Motion Prediction (Sep 19, 2024). Tags: Bayesian Optimization, Human Motion Prediction.
Small Language Models are Equation Reasoners (Sep 19, 2024). Tags: Arithmetic Reasoning, Knowledge Distillation.
Towards Low-latency Event-based Visual Recognition with Hybrid Step-wise Distillation Spiking Neural Networks (Sep 19, 2024). Tags: Knowledge Distillation.
Enhancing SLM via ChatGPT and Dataset Augmentation (Sep 19, 2024). Tags: Knowledge Distillation, Natural Language Inference. Code available.
Efficient Knowledge Distillation: Empowering Small Language Models with Teacher Model Insights (Sep 19, 2024). Tags: Decision Making, Knowledge Distillation.
Improving Cone-Beam CT Image Quality with Knowledge Distillation-Enhanced Diffusion Model in Imbalanced Data Settings (Sep 19, 2024). Tags: Computed Tomography (CT), Image Generation.
Enhancing Knowledge Distillation of Large Language Models through Efficient Multi-Modal Distribution Alignment (Sep 19, 2024). Tags: Knowledge Distillation, Model Compression.
StableMamba: Distillation-free Scaling of Large SSMs for Images and Videos (Sep 18, 2024). Tags: Action Recognition, Image Classification. Code available.
EFCM: Efficient Fine-tuning on Compressed Models for Deployment of Large Models in Medical Image Analysis (Sep 18, 2024). Tags: Knowledge Distillation, Medical Image Analysis.
RUIE: Retrieval-based Unified Information Extraction using Large Language Model (Sep 18, 2024). Tags: Contrastive Learning, In-Context Learning.
Applications of Knowledge Distillation in Remote Sensing: A Survey (Sep 18, 2024). Tags: Computational Efficiency, Instance Segmentation. Code available.
Data Efficient Acoustic Scene Classification using Teacher-Informed Confusing Class Instruction (Sep 18, 2024). Tags: Acoustic Scene Classification, Data Augmentation.
Time-Series Forecasting, Knowledge Distillation, and Refinement within a Multimodal PDE Foundation Model (Sep 17, 2024). Tags: Knowledge Distillation, Operator Learning.
Unleashing the Potential of Mamba: Boosting a LiDAR 3D Sparse Detector by Using Cross-Model Knowledge Distillation (Sep 17, 2024). Tags: 3D Object Detection, Autonomous Driving. Code available.
Single-stage TTS with Masked Audio Token Modeling and Semantic Knowledge Distillation (Sep 17, 2024). Tags: Knowledge Distillation, Speech Synthesis.
Frequency-Guided Masking for Enhanced Vision Self-Supervised Learning (Sep 16, 2024). Tags: Few-Shot Learning, Image Classification.
Human Insights Driven Latent Space for Different Driving Perspectives: A Unified Encoder for Efficient Multi-Task Inference (Sep 16, 2024). Tags: Autonomous Driving, Knowledge Distillation. Code available.
Integrated Multi-Level Knowledge Distillation for Enhanced Speaker Verification (Sep 14, 2024). Tags: Knowledge Distillation, Speaker Verification.
Effective Pre-Training of Audio Transformers for Sound Event Detection (Sep 14, 2024). Tags: Data Augmentation, Event Detection.
Joint Semantic Knowledge Distillation and Masked Acoustic Modeling for Full-band Speech Restoration with Improved Intelligibility (Sep 14, 2024). Tags: Knowledge Distillation, Language Modeling. Code available.
AWF: Adaptive Weight Fusion for Enhanced Class Incremental Semantic Segmentation (Sep 13, 2024). Tags: Class-Incremental Semantic Segmentation, Knowledge Distillation.
DiReDi: Distillation and Reverse Distillation for AIoT Applications (Sep 12, 2024). Tags: Knowledge Distillation, Management.
Ruri: Japanese General Text Embeddings (Sep 12, 2024). Tags: Knowledge Distillation.
Learn from Balance: Rectifying Knowledge Transfer for Long-Tailed Scenarios (Sep 12, 2024). Tags: Knowledge Distillation, Transfer Learning. Code available.
Privacy-Preserving Federated Learning with Consistency via Knowledge Distillation Using Conditional Generator (Sep 11, 2024). Tags: Diversity, Federated Learning.
DS-ViT: Dual-Stream Vision Transformer for Cross-Task Distillation in Alzheimer's Early Diagnosis (Sep 11, 2024). Tags: Classification, Knowledge Distillation.
EchoDFKD: Data-Free Knowledge Distillation for Cardiac Ultrasound Segmentation using Synthetic Data (Sep 11, 2024). Tags: Data-Free Knowledge Distillation, Knowledge Distillation.
Enhancing CTC-Based Visual Speech Recognition (Sep 11, 2024). Tags: Automatic Speech Recognition (ASR). Code available.
A Continual and Incremental Learning Approach for TinyML On-device Training Using Dataset Distillation and Model Size Adaption (Sep 11, 2024). Tags: Anomaly Detection, Computational Efficiency.
How Redundant Is the Transformer Stack in Speech Representation Models? (Sep 10, 2024). Tags: Knowledge Distillation, Speaker Identification.
EasyST: A Simple Framework for Spatio-Temporal Prediction (Sep 10, 2024). Tags: Knowledge Distillation, Prediction.
Knowledge Distillation via Query Selection for Detection Transformer (Sep 10, 2024). Tags: Knowledge Distillation, Object Detection. Code available.
Applied Federated Model Personalisation in the Industrial Domain: A Comparative Study (Sep 10, 2024). Tags: Active Learning, Federated Learning.
Distilling Generative-Discriminative Representations for Very Low-Resolution Face Recognition (Sep 10, 2024). Tags: Face Recognition, Knowledge Distillation.
Complex Emotion Recognition System using Basic Emotions via Facial Expression, EEG, and ECG Signals: A Review (Sep 9, 2024). Tags: Electroencephalogram (EEG).
LEROjD: Lidar Extended Radar-Only Object Detection (Sep 9, 2024). Tags: 3D Object Detection, Knowledge Distillation.
FedBrain-Distill: Communication-Efficient Federated Brain Tumor Classification Using Ensemble Knowledge Distillation on Non-IID Data (Sep 9, 2024). Tags: Brain Tumor Classification, Federated Learning. Code available.
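Most entries in this list build on the same core mechanism: training a compact student model to match a larger teacher's softened output distribution. Below is a minimal sketch of the classic temperature-scaled distillation loss in plain Python; the logit values are purely illustrative, and real systems would typically use a framework such as PyTorch and combine this term with a standard task loss.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; a higher T yields a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence from the teacher's soft targets to the student's soft
    predictions, scaled by T^2 so gradient magnitudes stay comparable
    across temperatures."""
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student soft predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# Identical logits give zero loss; diverging logits give a positive loss.
print(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # → 0.0
print(kd_loss([0.5, 1.5, 0.2], [2.0, 1.0, 0.1]))  # positive
```

The T^2 factor follows the original formulation by Hinton et al.; papers above that distill representations, attention maps, or retrieval scores replace this output-matching term with losses over intermediate features.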