Knowledge Distillation from Cross Teaching Teachers for Efficient Semi-Supervised Abdominal Organ Segmentation in CT (Nov 11, 2022). Tags: Image Segmentation, Knowledge Distillation. Code available.
PILE: Pairwise Iterative Logits Ensemble for Multi-Teacher Labeled Distillation (Nov 11, 2022). Tags: Knowledge Distillation. Code available.
FAN-Trans: Online Knowledge Distillation for Facial Action Unit Detection (Nov 11, 2022). Tags: Action Unit Detection, Face Alignment.
Knowledge Distillation for Federated Learning: a Practical Guide (Nov 9, 2022). Tags: Federated Learning, Knowledge Distillation.
Bridging Fairness and Environmental Sustainability in Natural Language Processing (Nov 8, 2022). Tags: Dimensionality Reduction, Fairness.
Understanding the Role of Mixup in Knowledge Distillation: An Empirical Study (Nov 8, 2022). Tags: Attribute, Data Augmentation.
Peak-First CTC: Reducing the Peak Latency of CTC Models by Applying Peak-First Regularization (Nov 7, 2022). Tags: Knowledge Distillation. Code available.
Closing the Gap between Client and Global Model Performance in Heterogeneous Federated Learning (Nov 7, 2022). Tags: Federated Learning, Knowledge Distillation.
Breaking the trade-off in personalized speech enhancement with cross-task knowledge distillation (Nov 5, 2022). Tags: Knowledge Distillation, Speech Enhancement.
LightVessel: Exploring Lightweight Coronary Artery Vessel Segmentation via Similarity Knowledge Distillation (Nov 2, 2022). Tags: Decoder, Knowledge Distillation.
Multi-level Distillation of Semantic Knowledge for Pre-training Multilingual Language Model (Nov 2, 2022). Tags: Knowledge Distillation, Language Modeling.
Gradient Knowledge Distillation for Pre-trained Language Models (Nov 2, 2022). Tags: Knowledge Distillation.
ARDIR: Improving Robustness using Knowledge Distillation of Internal Representation (Nov 1, 2022). Tags: Knowledge Distillation. Code available.
Fairness without Demographics through Knowledge Distillation (Nov 1, 2022). Tags: Fairness, Knowledge Distillation.
Lightweight Sound Event Detection Model with RepVGG Architecture (Nov 1, 2022). Tags: Event Detection, Knowledge Distillation. Code available.
Enhancing Chinese Multi-Label Text Classification Performance with Response-based Knowledge Distillation (Nov 1, 2022). Tags: Knowledge Distillation, Multi-Label Text Classification.
Maximum Likelihood Distillation for Robust Modulation Classification (Nov 1, 2022). Tags: Classification, Knowledge Distillation.
Predicting Multi-Codebook Vector Quantization Indexes for Knowledge Distillation (Oct 31, 2022). Tags: Automatic Speech Recognition (ASR).
Lightweight Neural Network with Knowledge Distillation for CSI Feedback (Oct 31, 2022). Tags: Knowledge Distillation.
Generative Negative Text Replay for Continual Vision-Language Pretraining (Oct 31, 2022). Tags: Continual Learning, Image Classification.
QuaLA-MiniLM: a Quantized Length Adaptive MiniLM (Oct 31, 2022). Tags: Computational Efficiency, Knowledge Distillation.
Application of Knowledge Distillation to Multi-task Speech Representation Learning (Oct 29, 2022). Tags: Keyword Spotting, Knowledge Distillation.
Completely Heterogeneous Federated Learning (Oct 28, 2022). Tags: Data-free Knowledge Distillation, Federated Learning.
Teacher-Student Architecture for Knowledge Learning: A Survey (Oct 28, 2022). Tags: Knowledge Distillation, Multi-Task Learning.
Can Current Explainability Help Provide References in Clinical Notes to Support Humans Annotate Medical Codes? (Oct 28, 2022). Tags: Knowledge Distillation, Medical Code Prediction.
BEBERT: Efficient and Robust Binary Ensemble BERT (Oct 28, 2022). Tags: Binarization, Computational Efficiency.
Semi-UFormer: Semi-supervised Uncertainty-aware Transformer for Image Dehazing (Oct 28, 2022). Tags: Image Dehazing, Knowledge Distillation. Code available.
Li3DeTr: A LiDAR based 3D Detection Transformer (Oct 27, 2022). Tags: Autonomous Driving, Decoder.
Weight Averaging: A Simple Yet Effective Method to Overcome Catastrophic Forgetting in Automatic Speech Recognition (Oct 27, 2022). Tags: Automatic Speech Recognition (ASR).
Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks (Oct 27, 2022). Tags: Knowledge Distillation, Quantization.
Fast DistilBERT on CPUs (Oct 27, 2022). Tags: Knowledge Distillation, Model Compression.
QUILL: Query Intent with Large Language Models using Retrieval Augmentation and Multi-stage Distillation (Oct 27, 2022). Tags: Feature Engineering, Knowledge Distillation.
Long-tailed Food Classification (Oct 26, 2022). Tags: Classification, Data Augmentation.
Online Cross-Layer Knowledge Distillation on Graph Neural Networks with Deep Supervision (Oct 25, 2022). Tags: Knowledge Distillation, Model Compression.
An Effective Deep Network for Head Pose Estimation without Keypoints (Oct 25, 2022). Tags: Gaze Estimation, Head Pose Estimation.
Referee: Reference-Free Sentence Summarization with Sharper Controllability through Symbolic Knowledge Distillation (Oct 25, 2022). Tags: Knowledge Distillation, Sentence.
Legal-Tech Open Diaries: Lesson learned on how to develop and deploy light-weight models in the era of humongous Language Models (Oct 24, 2022). Tags: Knowledge Distillation, Model Compression.
Respecting Transfer Gap in Knowledge Distillation (Oct 23, 2022). Tags: Knowledge Distillation.
Adaptive Label Smoothing with Self-Knowledge in Natural Language Generation (Oct 22, 2022). Tags: Knowledge Distillation, Text Generation.
Hard Gate Knowledge Distillation -- Leverage Calibration for Robust and Reliable Language Model (Oct 22, 2022). Tags: Knowledge Distillation, Language Modeling.
Performance-Efficiency Trade-Offs in Adapting Language Models to Text Classification Tasks (Oct 21, 2022). Tags: Knowledge Distillation, Text Classification.
Augmentation with Projection: Towards an Effective and Efficient Data Augmentation Paradigm for Distillation (Oct 21, 2022). Tags: Data Augmentation, Diversity.
Modeling Document-level Temporal Structures for Building Temporal Dependency Graphs (Oct 21, 2022). Tags: Knowledge Distillation, Sentence.
Distilling the Undistillable: Learning from a Nasty Teacher (Oct 21, 2022). Tags: Knowledge Distillation. Code available.
Semi-supervised object detection based on single-stage detector for thighbone fracture localization (Oct 20, 2022). Tags: Fracture Detection, Image Augmentation. Code available.
Toward Multiple Specialty Learners for Explaining GNNs via Online Knowledge Distillation (Oct 20, 2022). Tags: Knowledge Distillation.
Similarity of Neural Architectures using Adversarial Attack Transferability (Oct 20, 2022). Tags: Adversarial Attack, Diversity.
ADPS: Asymmetric Distillation Post-Segmentation for Image Anomaly Detection (Oct 19, 2022). Tags: Anomaly Detection, Anomaly Localization.
A baseline revisited: Pushing the limits of multi-segment models for context-aware translation (Oct 19, 2022). Tags: Knowledge Distillation, Translation.
On effects of Knowledge Distillation on Transfer Learning (Oct 18, 2022). Tags: Image Classification.