- Towards Few-Call Model Stealing via Active Self-Paced Knowledge Distillation and Diffusion-Based Image Generation (Sep 29, 2023). Tags: Image Generation, Knowledge Distillation
- An Enhanced Low-Resolution Image Recognition Method for Traffic Environments (Sep 28, 2023). Tags: Computational Efficiency, Knowledge Distillation
- Distill to Delete: Unlearning in Graph Networks with Knowledge Distillation (Sep 28, 2023). Tags: GPU, Graph Neural Network
- Distilling ODE Solvers of Diffusion Models into Smaller Steps (Sep 28, 2023). Tags: Denoising, Knowledge Distillation
- Inherit with Distillation and Evolve with Contrast: Exploring Class Incremental Semantic Segmentation Without Exemplar Memory (Sep 27, 2023). Tags: Class-Incremental Semantic Segmentation, Contrastive Learning
- DualVC 2: Dynamic Masked Convolution for Unified Streaming and Non-Streaming Voice Conversion (Sep 27, 2023). Tags: Decoder, Knowledge Distillation
- Cold & Warm Net: Addressing Cold-Start Users in Recommender Systems (Sep 27, 2023). Tags: Knowledge Distillation, Meta-Learning
- VideoAdviser: Video Knowledge Distillation for Multimodal Transfer Learning (Sep 27, 2023). Tags: Knowledge Distillation, Regression
- Contrastive Continual Multi-view Clustering with Filtered Structural Fusion (Sep 26, 2023). Tags: Clustering, Contrastive Learning
- ADU-Depth: Attention-based Distillation with Uncertainty Modeling for Depth Estimation (Sep 26, 2023). Tags: 3D Geometry, Depth Estimation
- DONNAv2 -- Lightweight Neural Architecture Search for Vision Tasks (Sep 26, 2023). Tags: Denoising, Image Denoising
- Learning Using Generated Privileged Information by Text-to-Image Diffusion Models (Sep 26, 2023). Tags: Classification, Knowledge Distillation
- Noise-Tolerant Few-Shot Unsupervised Adapter for Vision-Language Models (Sep 26, 2023). Tags: Image Classification
- Data Upcycling Knowledge Distillation for Image Super-Resolution (Sep 25, 2023). Tags: Image Super-Resolution, Knowledge Distillation
- Unsupervised 3D Perception with 2D Vision-Language Distillation for Autonomous Driving (Sep 25, 2023; code available). Tags: Autonomous Driving, Knowledge Distillation
- DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning (Sep 24, 2023). Tags: Data-Free Knowledge Distillation, Diversity
- Multivariate Prototype Representation for Domain-Generalized Incremental Learning (Sep 24, 2023). Tags: Class-Incremental Learning
- VIC-KD: Variance-Invariance-Covariance Knowledge Distillation to Make Keyword Spotting More Robust Against Adversarial Attacks (Sep 22, 2023). Tags: Adversarial Robustness, Keyword Spotting
- Poster: Self-Supervised Quantization-Aware Knowledge Distillation (Sep 22, 2023). Tags: Knowledge Distillation, Quantization
- Triple-View Knowledge Distillation for Semi-Supervised Semantic Segmentation (Sep 22, 2023). Tags: Decoder, Feature Importance
- Defending against Data-Free Model Extraction by Distributionally Robust Defensive Training (Sep 21, 2023; code available). Tags: Knowledge Distillation, Model Extraction
- Elevating Skeleton-Based Action Recognition with Efficient Multi-Modality Self-Supervision (Sep 21, 2023). Tags: Action Recognition, Knowledge Distillation
- Dense 2D-3D Indoor Prediction with Sound via Aligned Cross-Modal Distillation (Sep 20, 2023). Tags: 3D Scene Reconstruction, Depth Estimation
- Language-Oriented Communication with Semantic Coding and Knowledge Distillation for Text-to-Image Generation (Sep 20, 2023; code available). Tags: Image Generation, In-Context Learning
- Improving CLIP Robustness with Knowledge Distillation and Self-Training (Sep 19, 2023). Tags: Knowledge Distillation
- Incorporating Ultrasound Tongue Images for Audio-Visual Speech Enhancement (Sep 19, 2023). Tags: Automatic Speech Recognition (ASR)
- Distilling HuBERT with LSTMs via Decoupled Knowledge Distillation (Sep 18, 2023). Tags: Automatic Speech Recognition, Knowledge Distillation
- Facilitating NSFW Text Detection in Open-Domain Dialogue Systems via Knowledge Distillation (Sep 18, 2023). Tags: Chatbot, Knowledge Distillation
- Heterogeneous Generative Knowledge Distillation with Masked Image Modeling (Sep 18, 2023; code available). Tags: Image Classification
- UNIDEAL: Curriculum Knowledge Distillation Federated Learning (Sep 16, 2023). Tags: Federated Learning, Knowledge Distillation
- Two-Step Knowledge Distillation for Tiny Speech Enhancement (Sep 15, 2023). Tags: Knowledge Distillation, Model Compression
- Cross-lingual Knowledge Distillation via Flow-based Voice Conversion for Robust Polyglot Text-To-Speech (Sep 15, 2023). Tags: Knowledge Distillation, Speech Synthesis
- Privacy-preserving Early Detection of Epileptic Seizures in Videos (Sep 15, 2023). Tags: Knowledge Distillation, Optical Flow Estimation
- One-Class Knowledge Distillation for Spoofing Speech Detection (Sep 15, 2023; code available). Tags: Binary Classification, Knowledge Distillation
- ChromaDistill: Colorizing Monochrome Radiance Fields with Knowledge Distillation (Sep 14, 2023). Tags: 3DGS, Colorization
- A Novel Local-Global Feature Fusion Framework for Body-weight Exercise Recognition with Pressure Mapping Sensors (Sep 14, 2023). Tags: Knowledge Distillation, Object Detection
- CoLLD: Contrastive Layer-to-layer Distillation for Compressing Multilingual Pre-trained Speech Encoders (Sep 14, 2023). Tags: Contrastive Learning, Knowledge Distillation
- Adaptive Prompt Learning with Distilled Connective Knowledge for Implicit Discourse Relation Recognition (Sep 14, 2023). Tags: Knowledge Distillation, Prompt Learning
- Continual Learning with Dirichlet Generative-based Rehearsal (Sep 13, 2023; code available). Tags: Continual Learning, Incremental Learning
- Self-Training and Multi-Task Learning for Limited Data: Evaluation Study on Object Detection (Sep 12, 2023). Tags: Knowledge Distillation, Multi-Task Learning
- KD-FixMatch: Knowledge Distillation Siamese Neural Networks (Sep 11, 2023). Tags: Knowledge Distillation
- DeViT: Decomposing Vision Transformers for Collaborative Inference in Edge Devices (Sep 10, 2023). Tags: Collaborative Inference, GPU
- DAD++: Improved Data-free Test Time Adversarial Defense (Sep 10, 2023). Tags: Adversarial Defense, Adversarial Robustness
- Exploiting CLIP for Zero-shot HOI Detection Requires Knowledge Distillation at Multiple Levels (Sep 10, 2023; code available). Tags: Human-Object Interaction Detection, Knowledge Distillation
- Speech Emotion Recognition with Distilled Prosodic and Linguistic Affect Representations (Sep 9, 2023; code available). Tags: Emotion Recognition, Knowledge Distillation
- Knowledge Distillation-Empowered Digital Twin for Anomaly Detection (Sep 8, 2023). Tags: Anomaly Detection, Knowledge Distillation
- 3D Denoisers are Good 2D Teachers: Molecular Pretraining via Denoising and Cross-Modal Distillation (Sep 8, 2023). Tags: Denoising, Knowledge Distillation
- Towards Mitigating Architecture Overfitting on Distilled Datasets (Sep 8, 2023). Tags: Dataset Distillation, Knowledge Distillation
- Decoding visual brain representations from electroencephalography through Knowledge Distillation and latent diffusion models (Sep 8, 2023; code available). Tags: Brain Decoding, EEG