- Evolving Knowledge Distillation with Large Language Models and Active Learning (Mar 11, 2024). Tags: Active Learning, Knowledge Distillation.
- Enhancing Adversarial Training with Prior Knowledge Distillation for Robust Image Compression (Mar 11, 2024). Code: unverified. Tags: Backdoor Attack, Image Compression.
- Enhanced Sparsification via Stimulative Training (Mar 11, 2024). Code: unverified. Tags: Knowledge Distillation, Model Compression.
- Answering Diverse Questions via Text Attached with Key Audio-Visual Clues (Mar 11, 2024). Code: unverified. Tags: Audio-Visual Question Answering (AVQA).
- Cooperative Classification and Rationalization for Graph Generalization (Mar 10, 2024). Code: available. Tags: Classification, Graph Classification.
- Attention is all you need for boosting graph convolutional neural network (Mar 10, 2024). Code: available. Tags: Knowledge Distillation.
- Knowledge Distillation of Convolutional Neural Networks through Feature Map Transformation using Decision Trees (Mar 10, 2024). Code: unverified. Tags: Knowledge Distillation.
- Weakly Supervised Change Detection via Knowledge Distillation and Multiscale Sigmoid Inference (Mar 9, 2024). Code: unverified. Tags: Change Detection, Knowledge Distillation.
- Fine-tuning a Multiple Instance Learning Feature Extractor with Masked Context Modelling and Knowledge Distillation (Mar 8, 2024). Code: available. Tags: Image Generation, Knowledge Distillation.
- Scene Graph Aided Radiology Report Generation (Mar 8, 2024). Code: unverified. Tags: Decoder, Knowledge Distillation.
- Attention-guided Feature Distillation for Semantic Segmentation (Mar 8, 2024). Code: unverified. Tags: Knowledge Distillation, Segmentation.
- Adversarial Sparse Teacher: Defense Against Distillation-Based Model Stealing Attacks Using Adversarial Examples (Mar 8, 2024). Code: unverified. Tags: Knowledge Distillation.
- MKF-ADS: Multi-Knowledge Fusion Based Self-supervised Anomaly Detection System for Control Area Network (Mar 7, 2024). Code: unverified. Tags: Anomaly Detection, Intrusion Detection.
- Privacy-preserving Fine-tuning of Large Language Models through Flatness (Mar 7, 2024). Code: unverified. Tags: Knowledge Distillation, Privacy Preserving.
- A Study of Dropout-Induced Modality Bias on Robustness to Missing Video Frames for Audio-Visual Speech Recognition (Mar 7, 2024). Code: unverified. Tags: Audio-Visual Speech Recognition, Knowledge Distillation.
- Can Small Language Models be Good Reasoners for Sequential Recommendation? (Mar 7, 2024). Code: available. Tags: Knowledge Distillation, Recommendation Systems.
- A Teacher-Free Graph Knowledge Distillation Framework with Dual Self-Distillation (Mar 6, 2024). Code: unverified. Tags: Knowledge Distillation.
- Learning to Maximize Mutual Information for Chain-of-Thought Distillation (Mar 5, 2024). Code: available. Tags: Knowledge Distillation, Language Modeling.
- JEP-KD: Joint-Embedding Predictive Architecture Based Knowledge Distillation for Visual Speech Recognition (Mar 4, 2024). Code: available. Tags: Automatic Speech Recognition (ASR).
- CSE: Surface Anomaly Detection with Contrastively Selected Embedding (Mar 4, 2024). Code: unverified. Tags: Anomaly Detection, Knowledge Distillation.
- UB-FineNet: Urban Building Fine-grained Classification Network for Open-access Satellite Images (Mar 4, 2024). Code: available. Tags: Classification, Denoising.
- PowerSkel: A Device-Free Framework Using CSI Signal for Human Skeleton Estimation in Power Station (Mar 4, 2024). Code: unverified. Tags: Knowledge Distillation, Pose Estimation.
- Distilled ChatGPT Topic & Sentiment Modeling with Applications in Finance (Mar 4, 2024). Code: available. Tags: Knowledge Distillation, Sentiment Analysis.
- Hyperspectral Image Analysis in Single-Modal and Multimodal setting using Deep Learning Techniques (Mar 3, 2024). Code: unverified. Tags: Dimensionality Reduction, Hyperspectral Image Analysis.
- A Closer Look at Wav2Vec2 Embeddings for On-Device Single-Channel Speech Enhancement (Mar 3, 2024). Code: unverified. Tags: Automatic Speech Recognition, Keyword Spotting.
- Align-to-Distill: Trainable Attention Alignment for Knowledge Distillation in Neural Machine Translation (Mar 3, 2024). Code: unverified. Tags: Knowledge Distillation, Machine Translation.
- Teaching MLP More Graph Information: A Three-stage Multitask Knowledge Distillation Framework (Mar 2, 2024). Code: available. Tags: Knowledge Distillation.
- Distilling Text Style Transfer With Self-Explanation From LLMs (Mar 2, 2024). Code: unverified. Tags: In-Context Learning, Knowledge Distillation.
- Differentially Private Knowledge Distillation via Synthetic Text Generation (Mar 1, 2024). Code: unverified. Tags: Knowledge Distillation, Model Compression.
- Data-efficient Event Camera Pre-training via Disentangled Masked Modeling (Mar 1, 2024). Code: available. Tags: Knowledge Distillation, Self-Supervised Learning.
- Direct Alignment of Draft Model for Speculative Decoding with Chat-Fine-Tuned LLMs (Feb 29, 2024). Code: unverified. Tags: Dataset Generation, Knowledge Distillation.
- Weakly Supervised Monocular 3D Detection with a Single-View Image (Feb 29, 2024). Code: unverified. Tags: Knowledge Distillation, Object Localization.
- MIKO: Multimodal Intention Knowledge Distillation from Large Language Models for Social-Media Commonsense Discovery (Feb 28, 2024). Code: unverified. Tags: Knowledge Distillation, Language Modeling.
- Gradient Reweighting: Towards Imbalanced Class-Incremental Learning (Feb 28, 2024). Code: unverified. Tags: Class-Incremental Learning.
- A Lightweight Low-Light Image Enhancement Network via Channel Prior and Gamma Correction (Feb 28, 2024). Code: unverified. Tags: Image Enhancement, Knowledge Distillation.
- 3MVRD: Multimodal Multi-task Multi-teacher Visually-Rich Form Document Understanding (Feb 28, 2024). Code: unverified. Tags: Document Understanding, Form.
- Structural Teacher-Student Normality Learning for Multi-Class Anomaly Detection and Localization (Feb 27, 2024). Code: available. Tags: Anomaly Detection, Knowledge Distillation.
- SDDGR: Stable Diffusion-based Deep Generative Replay for Class Incremental Object Detection (Feb 27, 2024). Code: unverified. Tags: Class-Incremental Learning.
- MCF-VC: Mitigate Catastrophic Forgetting in Class-Incremental Learning for Multimodal Video Captioning (Feb 27, 2024). Code: unverified. Tags: Class-Incremental Learning.
- LLM-based Privacy Data Augmentation Guided by Knowledge Distillation with a Distribution Tutor for Medical Text Classification (Feb 26, 2024). Code: unverified. Tags: Data Augmentation, Knowledge Distillation.
- SKILL: Similarity-aware Knowledge distILLation for Speech Self-Supervised Learning (Feb 26, 2024). Code: unverified. Tags: Knowledge Distillation, Self-Supervised Learning.
- m2mKD: Module-to-Module Knowledge Distillation for Modular Transformers (Feb 26, 2024). Code: unverified. Tags: Knowledge Distillation, Mixture-of-Experts.
- DTCM: Deep Transformer Capsule Mutual Distillation for Multivariate Time Series Classification (Feb 26, 2024). Code: available. Tags: Knowledge Distillation, Relation Network.
- Distilling Adversarial Robustness Using Heterogeneous Teachers (Feb 23, 2024). Code: unverified. Tags: Adversarial Robustness, Knowledge Distillation.
- Rethinking Invariance Regularization in Adversarial Training to Improve Robustness-Accuracy Trade-off (Feb 22, 2024). Code: unverified. Tags: Adversarial Defense, Knowledge Distillation.
- Practical Insights into Knowledge Distillation for Pre-Trained Models (Feb 22, 2024). Code: unverified. Tags: Federated Learning, Knowledge Distillation.
- Enhancing Systematic Decompositional Natural Language Inference Using Informal Logic (Feb 22, 2024). Code: unverified. Tags: Formal Logic, Knowledge Distillation.
- TIE-KD: Teacher-Independent and Explainable Knowledge Distillation for Monocular Depth Estimation (Feb 22, 2024). Code: unverified. Tags: Depth Estimation, Knowledge Distillation.
- Unsupervised Text Style Transfer via LLMs and Attention Masking with Multi-way Interactions (Feb 21, 2024). Code: available. Tags: In-Context Learning, Knowledge Distillation.
- In-Distribution Consistency Regularization Improves the Generalization of Quantization-Aware Training (Feb 21, 2024). Code: unverified. Tags: Knowledge Distillation, Quantization.
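Most entries above build on the classic knowledge-distillation objective: a student model is trained to match the teacher's temperature-softened output distribution, with the KL term scaled by the squared temperature (Hinton et al., 2015). As a point of reference for the listing, here is a minimal, stdlib-only sketch of that loss; the function names and toy logits are illustrative and not taken from any listed paper:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over a list of raw logits.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL(teacher || student) between temperature-softened distributions,
    # multiplied by T^2 so gradient magnitudes stay comparable across T.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2
```

When the student's logits equal the teacher's, the loss is zero; raising the temperature exposes more of the teacher's "dark knowledge" in the non-argmax classes. In practice this term is combined with a weighted cross-entropy on the ground-truth labels.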