Facilitating NSFW Text Detection in Open-Domain Dialogue Systems via Knowledge Distillation | Sep 18, 2023 | Chatbot, Knowledge Distillation
Distilling HuBERT with LSTMs via Decoupled Knowledge Distillation | Sep 18, 2023 | Automatic Speech Recognition, Knowledge Distillation | Code Available
DFIL: Deepfake Incremental Learning by Exploiting Domain-invariant Forgery Clues | Sep 18, 2023 | Continual Learning, Contrastive Learning | Unverified
Heterogeneous Generative Knowledge Distillation with Masked Image Modeling | Sep 18, 2023 | Image Classification | Code Available
FDCNet: Feature Drift Compensation Network for Class-Incremental Weakly Supervised Object Localization | Sep 17, 2023 | Class-Incremental Learning, Incremental Learning | Unverified
UNIDEAL: Curriculum Knowledge Distillation Federated Learning | Sep 16, 2023 | Federated Learning, Knowledge Distillation | Code Available
One-Class Knowledge Distillation for Spoofing Speech Detection | Sep 15, 2023 | Binary Classification, Knowledge Distillation | Unverified
Privacy-preserving Early Detection of Epileptic Seizures in Videos | Sep 15, 2023 | Knowledge Distillation, Optical Flow Estimation | Unverified
Cross-lingual Knowledge Distillation via Flow-based Voice Conversion for Robust Polyglot Text-To-Speech | Sep 15, 2023 | Knowledge Distillation, Speech Synthesis | Code Available
Two-Step Knowledge Distillation for Tiny Speech Enhancement | Sep 15, 2023 | Knowledge Distillation, Model Compression | Unverified
Adaptive Prompt Learning with Distilled Connective Knowledge for Implicit Discourse Relation Recognition | Sep 14, 2023 | Knowledge Distillation, Prompt Learning | Unverified
ChromaDistill: Colorizing Monochrome Radiance Fields with Knowledge Distillation | Sep 14, 2023 | 3DGS, Colorization | Code Available
CoLLD: Contrastive Layer-to-layer Distillation for Compressing Multilingual Pre-trained Speech Encoders | Sep 14, 2023 | Contrastive Learning, Knowledge Distillation | Unverified
A Novel Local-Global Feature Fusion Framework for Body-weight Exercise Recognition with Pressure Mapping Sensors | Sep 14, 2023 | Knowledge Distillation, Object Detection | Unverified
Continual Learning with Dirichlet Generative-based Rehearsal | Sep 13, 2023 | Continual Learning, Incremental Learning | Unverified
Self-Training and Multi-Task Learning for Limited Data: Evaluation Study on Object Detection | Sep 12, 2023 | Knowledge Distillation, Multi-Task Learning | Unverified
KD-FixMatch: Knowledge Distillation Siamese Neural Networks | Sep 11, 2023 | Knowledge Distillation | Unverified
DeViT: Decomposing Vision Transformers for Collaborative Inference in Edge Devices | Sep 10, 2023 | Collaborative Inference, GPU | Unverified
DAD++: Improved Data-free Test Time Adversarial Defense | Sep 10, 2023 | Adversarial Defense, Adversarial Robustness | Unverified
Exploiting CLIP for Zero-shot HOI Detection Requires Knowledge Distillation at Multiple Levels | Sep 10, 2023 | Human-Object Interaction Detection, Knowledge Distillation | Code Available
Speech Emotion Recognition with Distilled Prosodic and Linguistic Affect Representations | Sep 9, 2023 | Emotion Recognition, Knowledge Distillation | Code Available
Decoding visual brain representations from electroencephalography through Knowledge Distillation and latent diffusion models | Sep 8, 2023 | Brain Decoding, EEG | Unverified
Knowledge Distillation-Empowered Digital Twin for Anomaly Detection | Sep 8, 2023 | Anomaly Detection, Knowledge Distillation | Code Available
Towards Mitigating Architecture Overfitting on Distilled Datasets | Sep 8, 2023 | Dataset Distillation, Knowledge Distillation | Unverified
3D Denoisers are Good 2D Teachers: Molecular Pretraining via Denoising and Cross-Modal Distillation | Sep 8, 2023 | Denoising, Knowledge Distillation | Code Available
Towards Comparable Knowledge Distillation in Semantic Image Segmentation | Sep 7, 2023 | Image Segmentation, Knowledge Distillation | Unverified
Leveraging ASR Pretrained Conformers for Speaker Verification through Transfer Learning and Knowledge Distillation | Sep 6, 2023 | Knowledge Distillation, Speaker Verification | Unverified
Knowledge Distillation Layer that Lets the Student Decide | Sep 6, 2023 | Knowledge Distillation | Unverified
DMKD: Improving Feature-based Knowledge Distillation for Object Detection Via Dual Masking Augmentation | Sep 6, 2023 | Knowledge Distillation, Object Detection | Code Available
Rethinking Momentum Knowledge Distillation in Online Continual Learning | Sep 6, 2023 | Continual Learning, Knowledge Distillation | Unverified
A deep Natural Language Inference predictor without language-specific training data | Sep 6, 2023 | Aspect-Based Sentiment Analysis, Knowledge Distillation | Code Available
Fast and High-Performance Learned Image Compression With Improved Checkerboard Context Model, Deformable Residual Module, and Knowledge Distillation | Sep 5, 2023 | Image Compression, Knowledge Distillation | Unverified
TODM: Train Once Deploy Many Efficient Supernet-Based RNN-T Compression For On-device ASR Models | Sep 5, 2023 | Automatic Speech Recognition (ASR) | Unverified
Probabilistic Self-supervised Learning via Scoring Rules Minimization | Sep 5, 2023 | Knowledge Distillation, Out-of-Distribution Detection | Unverified
A survey on efficient vision transformers: algorithms, techniques, and performance benchmarking | Sep 5, 2023 | Benchmarking, Knowledge Distillation | Unverified
On the Query Strategies for Efficient Online Active Distillation | Sep 4, 2023 | Active Learning, Continual Learning | Unverified
Prior Knowledge Guided Network for Video Anomaly Detection | Sep 4, 2023 | Anomaly Detection, Knowledge Distillation | Unverified
COMEDIAN: Self-Supervised Learning and Knowledge Distillation for Action Spotting using Transformers | Sep 3, 2023 | Action Detection, Action Spotting | Unverified
Knowledge Distillation from Non-streaming to Streaming ASR Encoder using Auxiliary Non-streaming Layer | Aug 31, 2023 | Automatic Speech Recognition (ASR) | Code Available
Adversarial Finetuning with Latent Representation Constraint to Mitigate Accuracy-Robustness Tradeoff | Aug 31, 2023 | Knowledge Distillation | Unverified
MoMA: Momentum Contrastive Learning with Multi-head Attention-based Knowledge Distillation for Histopathology Image Analysis | Aug 31, 2023 | Contrastive Learning, Knowledge Distillation | Unverified
Towards Long-Tailed Recognition for Graph Classification via Collaborative Experts | Aug 31, 2023 | Contrastive Learning, Graph Classification | Code Available
Exploring Multi-Modal Contextual Knowledge for Open-Vocabulary Object Detection | Aug 30, 2023 | Knowledge Distillation, Language Modeling | Unverified
SpikeBERT: A Language Spikformer Learned from BERT with Knowledge Distillation | Aug 29, 2023 | Knowledge Distillation, Text Classification | Unverified
SynthDistill: Face Recognition with Knowledge Distillation from Synthetic Data | Aug 28, 2023 | Face Recognition, Knowledge Distillation | Code Available
Bridging Cross-task Protocol Inconsistency for Distillation in Dense Object Detection | Aug 28, 2023 | Binary Classification, Classification | Code Available
Distilled GPT for Source Code Summarization | Aug 28, 2023 | Code Summarization, GPU | Code Available
Boosting Residual Networks with Group Knowledge | Aug 26, 2023 | Knowledge Distillation | Code Available
DM-VTON: Distilled Mobile Real-time Virtual Try-On | Aug 26, 2023 | GPU, Human Parsing | Code Available
Improving Knowledge Distillation for BERT Models: Loss Functions, Mapping Methods, and Weight Tuning | Aug 26, 2023 | Knowledge Distillation, Model Compression | Code Available
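Most entries above build on knowledge distillation. As shared background (not the method of any specific listed paper), the classic starting point is the softened-softmax KL objective of Hinton et al.: the student matches the teacher's temperature-smoothed output distribution, with the loss scaled by T^2 so gradients stay comparable across temperatures. A minimal NumPy sketch, with illustrative function names:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax; higher T flattens the distribution."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(teacher_logits, student_logits, T=2.0):
    """KL(teacher || student) on T-softened distributions, scaled by T^2
    as in the classic distillation formulation."""
    p = softmax(teacher_logits, T)   # teacher's soft targets
    q = softmax(student_logits, T)   # student's soft predictions
    return (T ** 2) * float(np.sum(p * (np.log(p) - np.log(q))))
```

In practice this term is mixed with the ordinary cross-entropy on ground-truth labels; when the student's logits equal the teacher's, the KL term is zero.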