Understanding and Improving Knowledge Distillation for Quantization-Aware Training of Large Transformer Encoders. Nov 20, 2022. Tags: Knowledge Distillation, Model Compression.
EEG aided boosting of single-lead ECG based sleep staging with Deep Knowledge Distillation. Nov 18, 2022. Tags: ECG based Sleep Staging, EEG. Code: available.
DASECount: Domain-Agnostic Sample-Efficient Wireless Indoor Crowd Counting via Few-shot Learning. Nov 18, 2022. Tags: Crowd Counting, Few-Shot Learning. Code: available.
Is Smaller Always Faster? Tradeoffs in Compressing Self-Supervised Speech Transformers. Nov 17, 2022. Tags: Knowledge Distillation, Model Compression. Code: unverified.
Knowledge distillation for fast and accurate DNA sequence correction. Nov 17, 2022. Tags: Knowledge Distillation. Code: available.
DETRDistill: A Universal Knowledge Distillation Framework for DETR-families. Nov 17, 2022. Tags: Knowledge Distillation, Object Detection. Code: unverified.
ConNER: Consistency Training for Cross-lingual Named Entity Recognition. Nov 17, 2022. Tags: Cross-Lingual NER, Knowledge Distillation. Code: unverified.
Sub-Graph Learning for Spatiotemporal Forecasting via Knowledge Distillation. Nov 17, 2022. Tags: Diversity, Graph Learning. Code: available.
BEVDistill: Cross-Modal BEV Distillation for Multi-View 3D Object Detection. Nov 17, 2022. Tags: 3D Object Detection, Depth Estimation. Code: unverified.
D^3ETR: Decoder Distillation for Detection Transformer. Nov 17, 2022. Tags: Decoder, Knowledge Distillation. Code: available.
Yield Evaluation of Citrus Fruits based on the YoloV5 compressed by Knowledge Distillation. Nov 16, 2022. Tags: Knowledge Distillation. Code: unverified.
Knowledge Distillation for Detection Transformer with Consistent Distillation Points Sampling. Nov 15, 2022. Tags: General Knowledge, Knowledge Distillation. Code: unverified.
An Efficient Active Learning Pipeline for Legal Text Classification. Nov 15, 2022. Tags: Active Learning, Classification. Code: available.
An Investigation of the Combination of Rehearsal and Knowledge Distillation in Continual Learning for Spoken Language Understanding. Nov 15, 2022. Tags: Class Incremental Learning. Code: unverified.
Instance-aware Model Ensemble With Distillation For Unsupervised Domain Adaptation. Nov 15, 2022. Tags: Domain Adaptation, Knowledge Distillation. Code: available.
FedCL: Federated Multi-Phase Curriculum Learning to Synchronously Correlate User Heterogeneity. Nov 14, 2022. Tags: Federated Learning, Knowledge Distillation. Code: unverified.
Feature Correlation-guided Knowledge Transfer for Federated Self-supervised Learning. Nov 14, 2022. Tags: Feature Correlation, Federated Learning. Code: available.
An Interpretable Neuron Embedding for Static Knowledge Distillation. Nov 14, 2022. Tags: Knowledge Distillation. Code: unverified.
Structured Knowledge Distillation Towards Efficient and Compact Multi-View 3D Detection. Nov 14, 2022. Tags: Knowledge Distillation. Code: unverified.
Cross-Modality Knowledge Distillation Network for Monocular 3D Object Detection. Nov 14, 2022. Tags: 3D Object Detection, Knowledge Distillation. Code: unverified.
Fcaformer: Forward Cross Attention in Hybrid Vision Transformer. Nov 14, 2022. Tags: Image Classification, Knowledge Distillation. Code: available.
Long-Range Zero-Shot Generative Deep Network Quantization. Nov 13, 2022. Tags: Knowledge Distillation, Quantization. Code: available.
MDFlow: Unsupervised Optical Flow Learning by Reliable Mutual Knowledge Distillation. Nov 11, 2022. Tags: Blocking, Data Augmentation. Code: unverified.
Knowledge Distillation from Cross Teaching Teachers for Efficient Semi-Supervised Abdominal Organ Segmentation in CT. Nov 11, 2022. Tags: Image Segmentation, Knowledge Distillation. Code: available.
FAN-Trans: Online Knowledge Distillation for Facial Action Unit Detection. Nov 11, 2022. Tags: Action Unit Detection, Face Alignment. Code: available.
PILE: Pairwise Iterative Logits Ensemble for Multi-Teacher Labeled Distillation. Nov 11, 2022. Tags: Knowledge Distillation. Code: unverified.
Efficient Large-scale Audio Tagging via Transformer-to-CNN Knowledge Distillation. Nov 9, 2022. Tags: Audio Classification, Audio Tagging. Code: unverified.
Knowledge Distillation for Federated Learning: a Practical Guide. Nov 9, 2022. Tags: Federated Learning, Knowledge Distillation. Code: available.
Understanding the Role of Mixup in Knowledge Distillation: An Empirical Study. Nov 8, 2022. Tags: Attribute, Data Augmentation. Code: unverified.
Bridging Fairness and Environmental Sustainability in Natural Language Processing. Nov 8, 2022. Tags: Dimensionality Reduction, Fairness. Code: available.
CoNMix for Source-free Single and Multi-target Domain Adaptation. Nov 7, 2022. Tags: Domain Adaptation, Knowledge Distillation. Code: unverified.
AlphaPose: Whole-Body Regional Multi-Person Pose Estimation and Tracking in Real-Time. Nov 7, 2022. Tags: Knowledge Distillation, Multi-Person Pose Estimation. Code: available.
Closing the Gap between Client and Global Model Performance in Heterogeneous Federated Learning. Nov 7, 2022. Tags: Federated Learning, Knowledge Distillation. Code: available.
Peak-First CTC: Reducing the Peak Latency of CTC Models by Applying Peak-First Regularization. Nov 7, 2022. Tags: Knowledge Distillation. Code: unverified.
Breaking the trade-off in personalized speech enhancement with cross-task knowledge distillation. Nov 5, 2022. Tags: Knowledge Distillation, Speech Enhancement. Code: unverified.
SSDA-YOLO: Semi-supervised Domain Adaptive YOLO for Cross-Domain Object Detection. Nov 4, 2022. Tags: Domain Adaptation, Knowledge Distillation. Code: unverified.
LightVessel: Exploring Lightweight Coronary Artery Vessel Segmentation via Similarity Knowledge Distillation. Nov 2, 2022. Tags: Decoder, Knowledge Distillation. Code: available.
MPCFormer: fast, performant and private Transformer inference with MPC. Nov 2, 2022. Tags: Knowledge Distillation. Code: unverified.
Gradient Knowledge Distillation for Pre-trained Language Models. Nov 2, 2022. Tags: Knowledge Distillation. Code: available.
Multi-level Distillation of Semantic Knowledge for Pre-training Multilingual Language Model. Nov 2, 2022. Tags: Knowledge Distillation, Language Modeling. Code: available.
Fairness without Demographics through Knowledge Distillation. Nov 1, 2022. Tags: Fairness, Knowledge Distillation. Code: unverified.
Lightweight Sound Event Detection Model with RepVGG Architecture. Nov 1, 2022. Tags: Event Detection, Knowledge Distillation. Code: available.
Enhancing Chinese Multi-Label Text Classification Performance with Response-based Knowledge Distillation. Nov 1, 2022. Tags: Knowledge Distillation, Multi-Label Text Classification. Code: unverified.
Maximum Likelihood Distillation for Robust Modulation Classification. Nov 1, 2022. Tags: Classification, Knowledge Distillation. Code: unverified.
ARDIR: Improving Robustness using Knowledge Distillation of Internal Representation. Nov 1, 2022. Tags: Knowledge Distillation. Code: unverified.
Predicting Multi-Codebook Vector Quantization Indexes for Knowledge Distillation. Oct 31, 2022. Tags: Automatic Speech Recognition (ASR). Code: unverified.
Lightweight Neural Network with Knowledge Distillation for CSI Feedback. Oct 31, 2022. Tags: Knowledge Distillation. Code: unverified.
QuaLA-MiniLM: a Quantized Length Adaptive MiniLM. Oct 31, 2022. Tags: Computational Efficiency, Knowledge Distillation. Code: unverified.
Generative Negative Text Replay for Continual Vision-Language Pretraining. Oct 31, 2022. Tags: Continual Learning, Image Classification. Code: unverified.
Application of Knowledge Distillation to Multi-task Speech Representation Learning. Oct 29, 2022. Tags: Keyword Spotting, Knowledge Distillation. Code: unverified.