- Knowledge Distillation from Language-Oriented to Emergent Communication for Multi-Agent Remote Control (Jan 23, 2024). Tags: Deep Reinforcement Learning, Knowledge Distillation
- A Novel Garment Transfer Method Supervised by Distilled Knowledge of Virtual Try-on Model (Jan 23, 2024). Tags: Disentanglement, Knowledge Distillation
- Contrastive Learning in Distilled Models (Jan 23, 2024). Tags: Contrastive Learning, Knowledge Distillation
- Zoom-shot: Fast and Efficient Unsupervised Zero-Shot Transfer of CLIP to Vision Encoders with Multimodal Loss (Jan 22, 2024) [Code available]. Tags: Knowledge Distillation, Zero-Shot Classification
- Keep Decoding Parallel with Effective Knowledge Distillation from Language Models to End-to-end Speech Recognisers (Jan 22, 2024). Tags: Automatic Speech Recognition (ASR)
- Knowledge Distillation on Spatial-Temporal Graph Convolutional Network for Traffic Prediction (Jan 22, 2024). Tags: Graph Neural Network, Knowledge Distillation
- Robustness to distribution shifts of compressed networks for edge devices (Jan 22, 2024). Tags: Knowledge Distillation, Quantization
- Stereo-Matching Knowledge Distilled Monocular Depth Estimation Filtered by Multiple Disparity Consistency (Jan 22, 2024). Tags: Depth Estimation, Knowledge Distillation
- Confidence Preservation Property in Knowledge Distillation Abstractions (Jan 21, 2024). Tags: Classification, Knowledge Distillation
- Enhancing Scalability in Recommender Systems through Lottery Ticket Hypothesis and Knowledge Distillation-based Neural Network Pruning (Jan 19, 2024). Tags: GPU, Knowledge Distillation
- Model Compression Techniques in Biometrics Applications: A Survey (Jan 18, 2024). Tags: Fairness, Knowledge Distillation
- Cross-Level Multi-Instance Distillation for Self-Supervised Fine-Grained Visual Categorization (Jan 16, 2024) [Code available]. Tags: Fine-Grained Visual Categorization, Knowledge Distillation
- Generative Denoise Distillation: Simple Stochastic Noises Induce Efficient Knowledge Transfer for Dense Prediction (Jan 16, 2024). Tags: Instance Segmentation, Knowledge Distillation
- A Deep Hierarchical Feature Sparse Framework for Occluded Person Re-Identification (Jan 15, 2024) [Code available]. Tags: Data Augmentation, Knowledge Distillation
- Lightweight Modality Adaptation to Sequential Recommendation via Correlation Supervision (Jan 14, 2024). Tags: Knowledge Distillation, Representation Learning
- EVOKE: Emotion Enabled Virtual Avatar Mapping Using Optimized Knowledge Distillation (Jan 13, 2024). Tags: Emotion Recognition, Knowledge Distillation
- Knowledge Distillation of Black-Box Large Language Models (Jan 13, 2024). Tags: Knowledge Distillation, Transfer Learning
- Direct Distillation between Different Domains (Jan 12, 2024). Tags: Domain Adaptation, Knowledge Distillation
- An Empirical Investigation into the Effect of Parameter Choices in Knowledge Distillation (Jan 12, 2024). Tags: Knowledge Distillation
- Object-Centric Diffusion for Efficient Video Editing (Jan 11, 2024). Tags: Knowledge Distillation, Object
- Exploring Self- and Cross-Triplet Correlations for Human-Object Interaction Detection (Jan 11, 2024). Tags: Human-Object Interaction Detection, Knowledge Distillation
- Attention to detail: inter-resolution knowledge distillation (Jan 11, 2024). Tags: Knowledge Distillation, Whole Slide Images
- Hierarchical Knowledge Distillation on Text Graph for Data-limited Attribute Inference (Jan 10, 2024) [Code available]. Tags: Attribute, Few-Shot Learning
- Translate-Distill: Learning Cross-Language Dense Retrieval by Translation and Distillation (Jan 9, 2024). Tags: Information Retrieval, Knowledge Distillation
- Logits Poisoning Attack in Federated Distillation (Jan 8, 2024). Tags: Federated Learning, Knowledge Distillation
- Multi-Channel Multi-Domain based Knowledge Distillation Algorithm for Sleep Staging with Single-Channel EEG (Jan 7, 2024). Tags: EEG, Knowledge Distillation
- SeqNAS: Neural Architecture Search for Event Sequence Classification (Jan 6, 2024). Tags: Bayesian Optimization, Classification
- CTC Blank Triggered Dynamic Layer-Skipping for Efficient CTC-based Speech Recognition (Jan 4, 2024) [Code available]. Tags: Knowledge Distillation, Speech Recognition
- Exploring Vacant Classes in Label-Skewed Federated Learning (Jan 4, 2024). Tags: Federated Learning, Knowledge Distillation
- Distillation-based fabric anomaly detection (Jan 4, 2024) [Code available]. Tags: Anomaly Detection, Defect Detection
- Bridging Modalities: Knowledge Distillation and Masked Training for Translating Multi-Modal Emotion Recognition to Uni-Modal, Speech-Only Emotion Recognition (Jan 4, 2024) [Code available]. Tags: Emotion Recognition, Knowledge Distillation
- Distilling Temporal Knowledge with Masked Feature Reconstruction for 3D Object Detection (Jan 3, 2024) [Code available]. Tags: 3D Object Detection, Knowledge Distillation
- Self-supervised Reflective Learning through Self-distillation and Online Clustering for Speaker Representation Learning (Jan 3, 2024). Tags: Clustering, Knowledge Distillation
- Exploring Hyperspectral Anomaly Detection with Human Vision: A Small Target Aware Detector (Jan 2, 2024). Tags: Anomaly Detection, Knowledge Distillation
- Distilling Local Texture Features for Colorectal Tissue Classification in Low Data Regimes (Jan 2, 2024) [Code available]. Tags: Knowledge Distillation
- Query-Based Knowledge Sharing for Open-Vocabulary Multi-Label Classification (Jan 2, 2024) [Code available]. Tags: Knowledge Distillation, Multi-Label Classification
- Dual Teacher Knowledge Distillation with Domain Alignment for Face Anti-spoofing (Jan 2, 2024). Tags: Adversarial Attack, Face Anti-Spoofing
- Building Vision-Language Models on Solid Foundations with Masked Distillation (Jan 1, 2024). Tags: Contrastive Learning, Knowledge Distillation
- Distilling CLIP with Dual Guidance for Learning Discriminative Human Body Shape Representation (Jan 1, 2024). Tags: Knowledge Distillation, Person Re-Identification
- Uncertainty-Guided Never-Ending Learning to Drive (Jan 1, 2024). Tags: Autonomous Driving, Continual Learning
- IQ-VFI: Implicit Quadratic Motion Estimation for Video Frame Interpolation (Jan 1, 2024). Tags: Knowledge Distillation, Motion Estimation
- Curriculum-scheduled Knowledge Distillation from Multiple Pre-trained Teachers for Multi-domain Sequential Recommendation (Jan 1, 2024). Tags: Knowledge Distillation, Recommendation Systems
- SecFormer: Fast and Accurate Privacy-Preserving Inference for Transformer Models via SMPC (Jan 1, 2024) [Code available]. Tags: Knowledge Distillation, Privacy Preserving
- C2KD: Bridging the Modality Gap for Cross-Modal Knowledge Distillation (Jan 1, 2024) [Code available]. Tags: Knowledge Distillation, Transfer Learning
- KD-DETR: Knowledge Distillation for Detection Transformer with Consistent Distillation Points Sampling (Jan 1, 2024). Tags: General Knowledge, Knowledge Distillation
- Robust Distillation via Untargeted and Targeted Intermediate Adversarial Samples (Jan 1, 2024). Tags: Adversarial Robustness, Knowledge Distillation
- Scene-adaptive and Region-aware Multi-modal Prompt for Open Vocabulary Object Detection (Jan 1, 2024). Tags: Knowledge Distillation, Object Detection
- Compressing Deep Image Super-resolution Models (Dec 31, 2023). Tags: Image Super-Resolution, Knowledge Distillation
- Explainability-Driven Leaf Disease Classification Using Adversarial Training and Knowledge Distillation (Dec 30, 2023). Tags: Adversarial Attack, Classification
- ClST: A Convolutional Transformer Framework for Automatic Modulation Recognition by Knowledge Distillation (Dec 29, 2023). Tags: Automatic Modulation Recognition, Knowledge Distillation