Deep Clustering with Diffused Sampling and Hardness-aware Self-distillation Jan 25, 2024 Clustering Contrastive Learning
Code Code Available 0Communication-Efficient Federated Learning through Adaptive Weight Clustering and Server-Side Distillation Jan 25, 2024 Clustering Federated Learning
Code Code Available 1Self-supervised Video Object Segmentation with Distillation Learning of Deformable Attention Jan 25, 2024 Knowledge Distillation Object
— Unverified 0Towards Complementary Knowledge Distillation for Efficient Dense Image Prediction Jan 24, 2024 Implicit Relations Instance Segmentation
— Unverified 0Contrastive Learning in Distilled Models Jan 23, 2024 Contrastive Learning Knowledge Distillation
Code Code Available 0Knowledge Distillation from Language-Oriented to Emergent Communication for Multi-Agent Remote Control Jan 23, 2024 Deep Reinforcement Learning Knowledge Distillation
— Unverified 0A Novel Garment Transfer Method Supervised by Distilled Knowledge of Virtual Try-on Model Jan 23, 2024 Disentanglement Knowledge Distillation
— Unverified 0Stereo-Matching Knowledge Distilled Monocular Depth Estimation Filtered by Multiple Disparity Consistency Jan 22, 2024 Depth Estimation Knowledge Distillation
— Unverified 0Knowledge Distillation on Spatial-Temporal Graph Convolutional Network for Traffic Prediction Jan 22, 2024 Graph Neural Network Knowledge Distillation
— Unverified 0Robustness to distribution shifts of compressed networks for edge devices Jan 22, 2024 Knowledge Distillation Quantization
— Unverified 0Rethinking Centered Kernel Alignment in Knowledge Distillation Jan 22, 2024 image-classification Image Classification
Code Code Available 1Zoom-shot: Fast and Efficient Unsupervised Zero-Shot Transfer of CLIP to Vision Encoders with Multimodal Loss Jan 22, 2024 Knowledge Distillation zero-shot-classification
— Unverified 0Keep Decoding Parallel with Effective Knowledge Distillation from Language Models to End-to-end Speech Recognisers Jan 22, 2024 Automatic Speech Recognition Automatic Speech Recognition (ASR)
— Unverified 0Confidence Preservation Property in Knowledge Distillation Abstractions Jan 21, 2024 Classification Knowledge Distillation
— Unverified 0HiCD: Change Detection in Quality-Varied Images via Hierarchical Correlation Distillation Jan 19, 2024 Change Detection Knowledge Distillation
Code Code Available 1Enhancing Scalability in Recommender Systems through Lottery Ticket Hypothesis and Knowledge Distillation-based Neural Network Pruning Jan 19, 2024 GPU Knowledge Distillation
— Unverified 0Large Language Models are Efficient Learners of Noise-Robust Speech Recognition Jan 19, 2024 Automatic Speech Recognition Automatic Speech Recognition (ASR)
Code Code Available 2Model Compression Techniques in Biometrics Applications: A Survey Jan 18, 2024 Fairness Knowledge Distillation
Code Code Available 0TelME: Teacher-leading Multimodal Fusion Network for Emotion Recognition in Conversation Jan 16, 2024 Emotion Recognition Emotion Recognition in Conversation
Code Code Available 1Bayes Conditional Distribution Estimation for Knowledge Distillation Based on Conditional Mutual Information Jan 16, 2024 Knowledge Distillation
Code Code Available 1Cross-Level Multi-Instance Distillation for Self-Supervised Fine-Grained Visual Categorization Jan 16, 2024 Fine-Grained Visual Categorization Knowledge Distillation
— Unverified 0OBSeg: Accurate and Fast Instance Segmentation Framework Using Segmentation Foundation Models with Oriented Bounding Box Prompts Jan 16, 2024 Amodal Instance Segmentation Instance Segmentation
Code Code Available 2Generative Denoise Distillation: Simple Stochastic Noises Induce Efficient Knowledge Transfer for Dense Prediction Jan 16, 2024 Instance Segmentation Knowledge Distillation
Code Code Available 0A Deep Hierarchical Feature Sparse Framework for Occluded Person Re-Identification Jan 15, 2024 Data Augmentation Knowledge Distillation
— Unverified 0Lightweight Modality Adaptation to Sequential Recommendation via Correlation Supervision Jan 14, 2024 Knowledge Distillation Representation Learning
— Unverified 0Knowledge Distillation of Black-Box Large Language Models Jan 13, 2024 Knowledge Distillation Transfer Learning
— Unverified 0EVOKE: Emotion Enabled Virtual Avatar Mapping Using Optimized Knowledge Distillation Jan 13, 2024 Emotion Recognition Knowledge Distillation
— Unverified 0Direct Distillation between Different Domains Jan 12, 2024 Domain Adaptation Knowledge Distillation
— Unverified 0An Empirical Investigation into the Effect of Parameter Choices in Knowledge Distillation Jan 12, 2024 Knowledge Distillation
— Unverified 0Graph Relation Distillation for Efficient Biomedical Instance Segmentation Jan 12, 2024 Instance Segmentation Knowledge Distillation
Code Code Available 1Exploring Self- and Cross-Triplet Correlations for Human-Object Interaction Detection Jan 11, 2024 Human-Object Interaction Detection Knowledge Distillation
— Unverified 0Attention to detail: inter-resolution knowledge distillation Jan 11, 2024 Knowledge Distillation whole slide images
Code Code Available 0Object-Centric Diffusion for Efficient Video Editing Jan 11, 2024 Knowledge Distillation Object
— Unverified 0Hierarchical Knowledge Distillation on Text Graph for Data-limited Attribute Inference Jan 10, 2024 Attribute Few-Shot Learning
— Unverified 0Translate-Distill: Learning Cross-Language Dense Retrieval by Translation and Distillation Jan 9, 2024 Information Retrieval Knowledge Distillation
— Unverified 0Logits Poisoning Attack in Federated Distillation Jan 8, 2024 Federated Learning Knowledge Distillation
— Unverified 0Multi-Channel Multi-Domain based Knowledge Distillation Algorithm for Sleep Staging with Single-Channel EEG Jan 7, 2024 EEG Knowledge Distillation
— Unverified 0SeqNAS: Neural Architecture Search for Event Sequence Classification Jan 6, 2024 Bayesian Optimization Classification
Code Code Available 0Progressive Knowledge Distillation Of Stable Diffusion XL Using Layer Level Loss Jan 5, 2024 Knowledge Distillation
Code Code Available 2Bridging Modalities: Knowledge Distillation and Masked Training for Translating Multi-Modal Emotion Recognition to Uni-Modal, Speech-Only Emotion Recognition Jan 4, 2024 Emotion Recognition Knowledge Distillation
Code Code Available 0Distillation-based fabric anomaly detection Jan 4, 2024 Anomaly Detection Defect Detection
Code Code Available 0Exploring Vacant Classes in Label-Skewed Federated Learning Jan 4, 2024 Federated Learning Knowledge Distillation
Code Code Available 0CTC Blank Triggered Dynamic Layer-Skipping for Efficient CTC-based Speech Recognition Jan 4, 2024 Knowledge Distillation speech-recognition
— Unverified 0Distilling Temporal Knowledge with Masked Feature Reconstruction for 3D Object Detection Jan 3, 2024 3D Object Detection Knowledge Distillation
— Unverified 0Self-supervised Reflective Learning through Self-distillation and Online Clustering for Speaker Representation Learning Jan 3, 2024 Clustering Knowledge Distillation
— Unverified 0Exploring Hyperspectral Anomaly Detection with Human Vision: A Small Target Aware Detector Jan 2, 2024 Anomaly Detection Knowledge Distillation
Code Code Available 0HAAQI-Net: A Non-intrusive Neural Music Audio Quality Assessment Model for Hearing Aids Jan 2, 2024 Audio Quality Assessment Audio Signal Processing
Code Code Available 1Query-Based Knowledge Sharing for Open-Vocabulary Multi-Label Classification Jan 2, 2024 Knowledge Distillation Multi-Label Classification
— Unverified 0Dual Teacher Knowledge Distillation with Domain Alignment for Face Anti-spoofing Jan 2, 2024 Adversarial Attack Face Anti-Spoofing
— Unverified 0Distilling Local Texture Features for Colorectal Tissue Classification in Low Data Regimes Jan 2, 2024 Knowledge Distillation
Code Code Available 0