Title | Date | Tags | Code
UB-FineNet: Urban Building Fine-grained Classification Network for Open-access Satellite Images | Mar 4, 2024 | Classification, Denoising | code unverified
PowerSkel: A Device-Free Framework Using CSI Signal for Human Skeleton Estimation in Power Station | Mar 4, 2024 | Knowledge Distillation, Pose Estimation | code unverified
A Closer Look at Wav2Vec2 Embeddings for On-Device Single-Channel Speech Enhancement | Mar 3, 2024 | Automatic Speech Recognition, Keyword Spotting | code available
Align-to-Distill: Trainable Attention Alignment for Knowledge Distillation in Neural Machine Translation | Mar 3, 2024 | Knowledge Distillation, Machine Translation | code unverified
Logit Standardization in Knowledge Distillation | Mar 3, 2024 | Knowledge Distillation | code available
Hyperspectral Image Analysis in Single-Modal and Multimodal setting using Deep Learning Techniques | Mar 3, 2024 | Dimensionality Reduction, Hyperspectral Image Analysis | code available
On the Road to Portability: Compressing End-to-End Motion Planner for Autonomous Driving | Mar 2, 2024 | Autonomous Driving, Knowledge Distillation | code unverified
Teaching MLP More Graph Information: A Three-stage Multitask Knowledge Distillation Framework | Mar 2, 2024 | Knowledge Distillation | code available
Distilling Text Style Transfer With Self-Explanation From LLMs | Mar 2, 2024 | In-Context Learning, Knowledge Distillation | code unverified
Differentially Private Knowledge Distillation via Synthetic Text Generation | Mar 1, 2024 | Knowledge Distillation, Model Compression | code unverified
Data-efficient Event Camera Pre-training via Disentangled Masked Modeling | Mar 1, 2024 | Knowledge Distillation, Self-Supervised Learning | code available
Direct Alignment of Draft Model for Speculative Decoding with Chat-Fine-Tuned LLMs | Feb 29, 2024 | Dataset Generation, Knowledge Distillation | code unverified
A Cognitive-Based Trajectory Prediction Approach for Autonomous Driving | Feb 29, 2024 | Autonomous Driving, Decision Making | code unverified
Weakly Supervised Monocular 3D Detection with a Single-View Image | Feb 29, 2024 | Knowledge Distillation, Object Localization | code available
MIKO: Multimodal Intention Knowledge Distillation from Large Language Models for Social-Media Commonsense Discovery | Feb 28, 2024 | Knowledge Distillation, Language Modeling | code unverified
A Lightweight Low-Light Image Enhancement Network via Channel Prior and Gamma Correction | Feb 28, 2024 | Image Enhancement, Knowledge Distillation | code unverified
3MVRD: Multimodal Multi-task Multi-teacher Visually-Rich Form Document Understanding | Feb 28, 2024 | Document Understanding, Form | code unverified
Sunshine to Rainstorm: Cross-Weather Knowledge Distillation for Robust 3D Object Detection | Feb 28, 2024 | 3D Object Detection, Knowledge Distillation | code available
Gradient Reweighting: Towards Imbalanced Class-Incremental Learning | Feb 28, 2024 | Class-Incremental Learning | code available
Sinkhorn Distance Minimization for Knowledge Distillation | Feb 27, 2024 | Decoder, Knowledge Distillation | code unverified
PromptMM: Multi-Modal Knowledge Distillation for Recommendation with Prompt-Tuning | Feb 27, 2024 | Knowledge Distillation, Model Compression | code available
Structural Teacher-Student Normality Learning for Multi-Class Anomaly Detection and Localization | Feb 27, 2024 | Anomaly Detection, Knowledge Distillation | code available
SDDGR: Stable Diffusion-based Deep Generative Replay for Class Incremental Object Detection | Feb 27, 2024 | Class-Incremental Learning | code unverified
MCF-VC: Mitigate Catastrophic Forgetting in Class-Incremental Learning for Multimodal Video Captioning | Feb 27, 2024 | Class-Incremental Learning | code unverified
DTCM: Deep Transformer Capsule Mutual Distillation for Multivariate Time Series Classification | Feb 26, 2024 | Knowledge Distillation, Relation Network | code unverified
m2mKD: Module-to-Module Knowledge Distillation for Modular Transformers | Feb 26, 2024 | Knowledge Distillation, Mixture-of-Experts | code unverified
SKILL: Similarity-aware Knowledge distILLation for Speech Self-Supervised Learning | Feb 26, 2024 | Knowledge Distillation, Self-Supervised Learning | code available
LLM-based Privacy Data Augmentation Guided by Knowledge Distillation with a Distribution Tutor for Medical Text Classification | Feb 26, 2024 | Data Augmentation, Knowledge Distillation | code unverified
LLM Inference Unveiled: Survey and Roofline Model Insights | Feb 26, 2024 | Knowledge Distillation, Language Modeling | code unverified
Distilling Adversarial Robustness Using Heterogeneous Teachers | Feb 23, 2024 | Adversarial Robustness, Knowledge Distillation | code available
Practical Insights into Knowledge Distillation for Pre-Trained Models | Feb 22, 2024 | Federated Learning, Knowledge Distillation | code unverified
TIE-KD: Teacher-Independent and Explainable Knowledge Distillation for Monocular Depth Estimation | Feb 22, 2024 | Depth Estimation, Knowledge Distillation | code unverified
Rethinking Invariance Regularization in Adversarial Training to Improve Robustness-Accuracy Trade-off | Feb 22, 2024 | Adversarial Defense, Knowledge Distillation | code available
Enhancing Systematic Decompositional Natural Language Inference Using Informal Logic | Feb 22, 2024 | Formal Logic, Knowledge Distillation | code unverified
PaCKD: Pattern-Clustered Knowledge Distillation for Compressing Memory Access Prediction Models | Feb 21, 2024 | Image Classification | code unverified
Wisdom of Committee: Distilling from Foundation Model to Specialized Application Model | Feb 21, 2024 | Knowledge Distillation, Model | code available
In-Distribution Consistency Regularization Improves the Generalization of Quantization-Aware Training | Feb 21, 2024 | Knowledge Distillation, Quantization | code unverified
Unsupervised Text Style Transfer via LLMs and Attention Masking with Multi-way Interactions | Feb 21, 2024 | In-Context Learning, Knowledge Distillation | code unverified
PIRB: A Comprehensive Benchmark of Polish Dense and Hybrid Text Retrieval Methods | Feb 20, 2024 | Information Retrieval, Knowledge Distillation | code unverified
PromptKD: Distilling Student-Friendly Knowledge for Generative Language Models via Prompt Tuning | Feb 20, 2024 | Instruction Following, Knowledge Distillation | code unverified
FGAD: Self-boosted Knowledge Distillation for An Effective Federated Graph Anomaly Detection Framework | Feb 20, 2024 | Anomaly Detection, Federated Learning | code available
A Survey on Knowledge Distillation of Large Language Models | Feb 20, 2024 | Data Augmentation, Knowledge Distillation | code unverified
Improve Cross-Architecture Generalization on Dataset Distillation | Feb 20, 2024 | Dataset Distillation, Knowledge Distillation | code available
ELAD: Explanation-Guided Large Language Models Active Distillation | Feb 20, 2024 | Active Learning, Knowledge Distillation | code available
Induced Model Matching: How Restricted Models Can Help Larger Ones | Feb 19, 2024 | Knowledge Distillation, Language Modeling | code unverified
Towards Cross-Tokenizer Distillation: the Universal Logit Distillation Loss for LLMs | Feb 19, 2024 | Knowledge Distillation | code available
Revisiting Knowledge Distillation for Autoregressive Language Models | Feb 19, 2024 | Knowledge Distillation | code available
On the Byzantine-Resilience of Distillation-Based Federated Learning | Feb 19, 2024 | Federated Learning, Knowledge Distillation | code available
Teacher as a Lenient Expert: Teacher-Agnostic Data-Free Knowledge Distillation | Feb 18, 2024 | Data-Free Knowledge Distillation, Knowledge Distillation | code available
GraphKD: Exploring Knowledge Distillation Towards Document Object Detection with Structured Graph Creation | Feb 17, 2024 | Knowledge Distillation, Object Detection | code available