- Collaborative Distillation in the Parameter and Spectrum Domains for Video Action Recognition (Sep 15, 2020). Tags: Action Recognition, Knowledge Distillation.
- Autoregressive Knowledge Distillation through Imitation Learning (Sep 15, 2020). Tags: Imitation Learning, Knowledge Distillation.
- DualDE: Dually Distilling Knowledge Graph Embedding for Faster and Cheaper Reasoning (Sep 13, 2020). Tags: Graph Embedding, Knowledge Distillation. [Code available]
- SSKD: Self-Supervised Knowledge Distillation for Cross Domain Adaptive Person Re-Identification (Sep 13, 2020). Tags: Clustering, Domain Adaptive Person Re-Identification.
- BoostingBERT: Integrating Multi-Class Boosting into BERT for NLP Tasks (Sep 13, 2020). Tags: Ensemble Learning, Knowledge Distillation.
- Extending Label Smoothing Regularization with Self-Knowledge Distillation (Sep 11, 2020). Tags: Knowledge Distillation, Self-Knowledge Distillation.
- On the Orthogonality of Knowledge Distillation with Other Techniques: From an Ensemble Perspective (Sep 9, 2020). Tags: Data Augmentation, Efficient Neural Network.
- Lifelong Object Detection (Sep 2, 2020). Tags: Knowledge Distillation, Lifelong Learning.
- SAIL: Self-Augmented Graph Contrastive Learning (Sep 2, 2020). Tags: Contrastive Learning, Knowledge Distillation.
- Automatic Assignment of Radiology Examination Protocols Using Pre-trained Language Models with Knowledge Distillation (Sep 1, 2020). Tags: Data Augmentation, Knowledge Distillation.
- Classification of Diabetic Retinopathy Using Unlabeled Data and Knowledge Distillation (Sep 1, 2020). Tags: Classification, General Classification. [Code available]
- Initial Classifier Weights Replay for Memoryless Class Incremental Learning (Aug 31, 2020). Tags: Class-Incremental Learning.
- MetaDistiller: Network Self-Boosting via Meta-Learned Top-Down Distillation (Aug 27, 2020). Tags: Knowledge Distillation, Meta-Learning.
- Point Adversarial Self Mining: A Simple Method for Facial Expression Recognition (Aug 26, 2020). Tags: Adversarial Attack, Data Augmentation.
- Active Class Incremental Learning for Imbalanced Datasets (Aug 25, 2020). Tags: Class-Incremental Learning.
- Learn to Talk via Proactive Knowledge Transfer (Aug 23, 2020). Tags: de-en, Knowledge Distillation.
- Multi-Person Full Body Pose Estimation (Aug 23, 2020). Tags: Knowledge Distillation, Multi-Person Pose Estimation.
- Rectified Decision Trees: Exploring the Landscape of Interpretable and Effective Machine Learning (Aug 21, 2020). Tags: BIG-bench Machine Learning, Knowledge Distillation.
- Learning to Extract Attribute Value from Product via Question Answering: A Multi-task Approach (Aug 20, 2020). Tags: Attribute, Attribute Value Extraction.
- Cascaded channel pruning using hierarchical self-distillation (Aug 16, 2020). Tags: Knowledge Distillation, Model Compression.
- An Ensemble of Knowledge Sharing Models for Dynamic Hand Gesture Recognition (Aug 13, 2020). Tags: Gesture Recognition, Hand Gesture Recognition.
- Compression of Deep Learning Models for Text: A Survey (Aug 12, 2020). Tags: Deep Learning, Information Retrieval.
- Towards Unsupervised Crowd Counting via Regression-Detection Bi-knowledge Transfer (Aug 12, 2020). Tags: Crowd Counting, Knowledge Distillation.
- Compact Speaker Embedding: lrx-vector (Aug 11, 2020). Tags: Knowledge Distillation, Speaker Recognition.
- S2OSC: A Holistic Semi-Supervised Approach for Open Set Classification (Aug 11, 2020). Tags: General Classification, Knowledge Distillation.
- Knowledge Distillation and Data Selection for Semi-Supervised Learning in CTC Acoustic Models (Aug 10, 2020). Tags: Knowledge Distillation, Speech Recognition.
- Knowledge Distillation-aided End-to-End Learning for Linear Precoding in Multiuser MIMO Downlink Systems with Finite-Rate Feedback (Aug 10, 2020). Tags: Binarization, Knowledge Distillation.
- LRSpeech: Extremely Low-Resource Speech Synthesis and Recognition (Aug 9, 2020). Tags: Automatic Speech Recognition (ASR).
- MED-TEX: Transferring and Explaining Knowledge with Less Data from Pretrained Medical Imaging Models (Aug 6, 2020). Tags: Image Classification.
- Prime-Aware Adaptive Distillation (Aug 4, 2020). Tags: Knowledge Distillation, Metric Learning.
- TutorNet: Towards Flexible Knowledge Distillation for End-to-End Speech Recognition (Aug 3, 2020). Tags: Knowledge Distillation, Model Compression.
- Teacher-Student Training and Triplet Loss for Facial Expression Recognition under Occlusion (Aug 3, 2020). Tags: Facial Expression Recognition (FER).
- Differentiable Feature Aggregation Search for Knowledge Distillation (Aug 2, 2020). Tags: Knowledge Distillation, Model Compression.
- Feature Normalized Knowledge Distillation for Image Classification (Aug 1, 2020). Tags: Classification, General Classification.
- YOLO in the Dark: Domain Adaptation Method for Merging Multiple Models (Aug 1, 2020). Tags: Domain Adaptation, Knowledge Distillation. [Code available]
- Exclusivity-Consistency Regularized Knowledge Distillation for Face Recognition (Aug 1, 2020). Tags: Diversity, Face Recognition.
- Local Correlation Consistency for Knowledge Distillation (Aug 1, 2020). Tags: Knowledge Distillation.
- AMLN: Adversarial-based Mutual Learning Network for Online Knowledge Distillation (Aug 1, 2020). Tags: Knowledge Distillation, Transfer Learning.
- Weight Decay Scheduling and Knowledge Distillation for Active Learning (Aug 1, 2020). Tags: Active Learning, Knowledge Distillation.
- Dynamic Knowledge Distillation for Black-box Hypothesis Transfer Learning (Jul 24, 2020). Tags: Knowledge Distillation, Transfer Learning.
- Multi-label Contrastive Predictive Coding (Jul 20, 2020). Tags: Knowledge Distillation, Multi-class Classification.
- Interpretable Foreground Object Search As Knowledge Distillation (Jul 20, 2020). Tags: Knowledge Distillation, Object.
- CovidCare: Transferring Knowledge from Existing EMR to Emerging Epidemic for Interpretable Prognosis (Jul 17, 2020). Tags: Diagnostic, Knowledge Distillation.
- Knowledge Distillation in Deep Learning and its Applications (Jul 17, 2020). Tags: Deep Learning, Knowledge Distillation.
- UniTrans: Unifying Model Transfer and Data Transfer for Cross-Lingual Named Entity Recognition with Unlabeled Data (Jul 15, 2020). Tags: Cross-Lingual NER, Cross-Lingual Transfer.
- P-KDGAN: Progressive Knowledge Distillation with GANs for One-class Novelty Detection (Jul 14, 2020). Tags: Anomaly Detection, Decoder. [Code available]
- Add a SideNet to your MainNet (Jul 14, 2020). Tags: General Classification, Knowledge Distillation.
- Dual-Teacher: Integrating Intra-domain and Inter-domain Teachers for Annotation-efficient Cardiac Segmentation (Jul 13, 2020). Tags: Cardiac Segmentation, Domain Adaptation.
- Representation Transfer by Optimal Transport (Jul 13, 2020). Tags: Knowledge Distillation, Model Compression.
- Optical Flow Distillation: Towards Efficient and Stable Video Style Transfer (Jul 10, 2020). Tags: Knowledge Distillation, Optical Flow Estimation.
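Most entries above build on the standard soft-target distillation objective of Hinton et al.: a temperature-softened KL term between teacher and student predictions, plus a cross-entropy term on the hard labels. As shared context for the list, here is a minimal NumPy sketch; the function name, `T`, and `alpha` defaults are our own illustration, not taken from any paper listed.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax, computed stably."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # avoid overflow in exp
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style KD loss:
    alpha * T^2 * KL(teacher_T || student_T) + (1 - alpha) * CE(labels, student).
    The T^2 factor keeps gradient magnitudes comparable across temperatures."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    p_hard = softmax(student_logits)  # T = 1 for the hard-label term
    ce = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * ce))
```

When student and teacher logits agree, the KL term vanishes and only the hard-label cross-entropy remains; a mismatched teacher raises the loss, which is the signal the student learns from.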