Take a Prior from Other Tasks for Severe Blur Removal (Feb 14, 2023). Tags: Deblurring, Image Deblurring.
Learning from Noisy Crowd Labels with Logics (Feb 13, 2023). Tags: Knowledge Distillation, Named Entity Recognition.
[Code] NYCU-TWO at Memotion 3: Good Foundation, Good Teacher, then you have Good Meme Analysis (Feb 13, 2023). Tags: Knowledge Distillation, Sentiment Analysis.
SCLIFD: Supervised Contrastive Knowledge Distillation for Incremental Fault Diagnosis under Limited Fault Data (Feb 12, 2023). Tags: Class Incremental Learning.
Feature Affinity Assisted Knowledge Distillation and Quantization of Deep Neural Networks on Label-Free Data (Feb 10, 2023). Tags: Knowledge Distillation, Quantization.
SOCRATES: Text-based Human Search and Approach using a Robot Dog (Feb 10, 2023). Tags: Knowledge Distillation.
Toward Extremely Lightweight Distracted Driver Recognition With Distillation-Based Neural Architecture Search and Knowledge Transfer (Feb 9, 2023). Tags: Knowledge Distillation, Neural Architecture Search.
[Code] Knowledge Distillation-based Information Sharing for Online Process Monitoring in Decentralized Manufacturing System (Feb 8, 2023). Tags: Knowledge Distillation.
SLaM: Student-Label Mixing for Distillation with Unlabeled Examples (Feb 8, 2023). Tags: Knowledge Distillation.
Enhancing Modality-Agnostic Representations via Meta-Learning for Brain Tumor Segmentation (Feb 8, 2023). Tags: Brain Tumor Segmentation, Image Generation.
An Empirical Study of Uniform-Architecture Knowledge Distillation in Document Ranking (Feb 8, 2023). Tags: Document Ranking, Knowledge Distillation.
Audio Representation Learning by Distilling Video as Privileged Information (Feb 6, 2023). Tags: Emotion Recognition, Knowledge Distillation.
Knowledge Distillation in Vision Transformers: A Critical Review (Feb 4, 2023). Tags: Decoder, Image Classification.
Heterogeneous Federated Knowledge Graph Embedding Learning and Unlearning (Feb 4, 2023). Tags: Federated Learning, Graph Embedding.
Revisiting Intermediate Layer Distillation for Compressing Language Models: An Overfitting Perspective (Feb 3, 2023). Tags: Knowledge Distillation.
[Code] Enhancing Once-For-All: A Study on Parallel Blocks, Skip Connections and Early Exits (Feb 3, 2023). Tags: Knowledge Distillation.
Generalized Uncertainty of Deep Neural Networks: Taxonomy and Applications (Feb 2, 2023). Tags: Knowledge Distillation, Model Compression.
Adaptive Search-and-Training for Robust and Efficient Network Pruning (Feb 1, 2023). Tags: Knowledge Distillation, Network Pruning.
[Code] Improved Knowledge Distillation for Pre-trained Language Models via Knowledge Selection (Feb 1, 2023). Tags: Knowledge Distillation.
Distill-DBDGAN: Knowledge Distillation and Adversarial Learning Framework for Defocus Blur Detection (Feb 1, 2023). Tags: Defocus Blur Detection, Generative Adversarial Network.
[Code] Continual Segment: Towards a Single, Unified and Accessible Continual Segmentation Model of 143 Whole-body Organs in CT Scans (Feb 1, 2023). Tags: Continual Semantic Segmentation, Decoder.
Knowledge Distillation on Graphs: A Survey (Feb 1, 2023). Tags: Knowledge Distillation, Model Compression.
AMD: Adaptive Masked Distillation for Object Detection (Jan 31, 2023). Tags: Knowledge Distillation, Model Compression.
Knowledge Distillation ≈ Label Smoothing: Fact or Fallacy? (Jan 30, 2023). Tags: Knowledge Distillation, Text Classification.
On student-teacher deviations in distillation: does it pay to disobey? (Jan 30, 2023). Tags: Knowledge Distillation.
FractalAD: A simple industrial anomaly detection method using fractal anomaly generation and backbone knowledge distillation (Jan 30, 2023). Tags: Anomaly Detection, Knowledge Distillation.
[Code] Few-shot Face Image Translation via GAN Prior Distillation (Jan 28, 2023). Tags: Knowledge Distillation, Translation.
MVKT-ECG: Efficient Single-lead ECG Classification on Multi-Label Arrhythmia by Multi-View Knowledge Transferring (Jan 28, 2023). Tags: Diagnostic, ECG Classification.
Supervision Complexity and its Role in Knowledge Distillation (Jan 28, 2023). Tags: Image Classification.
Can We Use Probing to Better Understand Fine-tuning and Knowledge Distillation of the BERT NLU? (Jan 27, 2023). Tags: Knowledge Distillation, Natural Language Understanding.
Improved knowledge distillation by utilizing backward pass knowledge in neural networks (Jan 27, 2023). Tags: Knowledge Distillation, Model Compression.
EmbedDistill: A Geometric Knowledge Distillation for Information Retrieval (Jan 27, 2023). Tags: Information Retrieval, Knowledge Distillation.
Improving Text-based Early Prediction by Distillation from Privileged Time-Series Text (Jan 26, 2023). Tags: Knowledge Distillation, Prediction.
A Simple Recipe for Competitive Low-compute Self supervised Vision Models (Jan 23, 2023). Tags: Knowledge Distillation.
Unifying Synergies between Self-supervised Learning and Dynamic Computation (Jan 22, 2023). Tags: Image Classification.
[Code] The Best of Both Worlds: Accurate Global and Personalized Models through Federated Learning with Data-Free Hyper-Knowledge Distillation (Jan 21, 2023). Tags: Federated Learning, Knowledge Distillation.
ProKD: An Unsupervised Prototypical Knowledge Distillation Network for Zero-Resource Cross-Lingual Named Entity Recognition (Jan 21, 2023). Tags: Contrastive Learning, Cross-Lingual NER.
RNAS-CL: Robust Neural Architecture Search by Cross-Layer Knowledge Distillation (Jan 19, 2023). Tags: Knowledge Distillation, Neural Architecture Search.
Adaptively Integrated Knowledge Distillation and Prediction Uncertainty for Continual Learning (Jan 18, 2023). Tags: Continual Learning, Knowledge Distillation.
Knowledge Distillation in Federated Edge Learning: A Survey (Jan 14, 2023). Tags: Knowledge Distillation, Survey.
A Cohesive Distillation Architecture for Neural Language Models (Jan 12, 2023). Tags: Knowledge Distillation, Language Modeling.
Effective Decision Boundary Learning for Class Incremental Learning (Jan 12, 2023). Tags: Class Incremental Learning.
Synthetic data generation method for data-free knowledge distillation in regression neural networks (Jan 11, 2023). Tags: Data-free Knowledge Distillation, Knowledge Distillation.
[Code] ERNIE 3.0 Tiny: Frustratingly Simple Method to Improve Task-Agnostic Distillation Generalization (Jan 9, 2023). Tags: Knowledge Distillation, Language Modelling.
[Code] Designing an Improved Deep Learning-based Model for COVID-19 Recognition in Chest X-ray Images: A Knowledge Distillation Approach (Jan 6, 2023). Tags: Knowledge Distillation.
RELIANT: Fair Knowledge Distillation for Graph Neural Networks (Jan 3, 2023). Tags: Fairness, Graph Learning.
[Code] Knowledge-guided Causal Intervention for Weakly-supervised Object Localization (Jan 3, 2023). Tags: Knowledge Distillation, Object.
[Code] Open-Set Fine-Grained Retrieval via Prompting Vision-Language Evaluator (Jan 1, 2023). Tags: Knowledge Distillation, Retrieval.
CaPriDe Learning: Confidential and Private Decentralized Learning Based on Encryption-Friendly Distillation Loss (Jan 1, 2023). Tags: Federated Learning, Knowledge Distillation.
[Code] UniKD: Universal Knowledge Distillation for Mimicking Homogeneous or Heterogeneous Object Detectors (Jan 1, 2023). Tags: Knowledge Distillation.