Unsupervised Multi-Target Domain Adaptation Through Knowledge Distillation (Jul 14, 2020) [Code available] Domain Adaptation, Knowledge Distillation
Knowledge Distillation for Multi-task Learning (Jul 14, 2020) [Code available] Knowledge Distillation, Multi-Task Learning
Learning to Learn Parameterized Classification Networks for Scalable Input Images (Jul 13, 2020) [Code available] Classification, General Classification
Towards Practical Lipreading with Distilled and Efficient Models (Jul 13, 2020) [Code available] Knowledge Distillation, Lipreading
RATT: Recurrent Attention to Transient Tasks for Continual Image Captioning (Jul 13, 2020) [Code available] Continual Learning, Image Captioning
Representation Transfer by Optimal Transport (Jul 13, 2020) [Unverified] Knowledge Distillation, Model Compression
Dual-Teacher: Integrating Intra-domain and Inter-domain Teachers for Annotation-efficient Cardiac Segmentation (Jul 13, 2020) [Unverified] Cardiac Segmentation, Domain Adaptation
Temporal Self-Ensembling Teacher for Semi-Supervised Object Detection (Jul 13, 2020) [Code available] Image Classification
Optical Flow Distillation: Towards Efficient and Stable Video Style Transfer (Jul 10, 2020) [Unverified] Knowledge Distillation, Optical Flow Estimation
Data-Efficient Ranking Distillation for Image Retrieval (Jul 10, 2020) [Unverified] Image Retrieval, Knowledge Distillation
Robust Re-Identification by Multiple Views Knowledge Distillation (Jul 8, 2020) [Code available] Knowledge Distillation, Person Re-Identification
Tracking-by-Trackers with a Distilled and Reinforced Model (Jul 8, 2020) [Code available] Knowledge Distillation, Object Tracking
Improving Weakly Supervised Visual Grounding by Contrastive Knowledge Distillation (Jul 3, 2020) [Code available] Contrastive Learning, Knowledge Distillation
Knowledge Distillation Beyond Model Compression (Jul 3, 2020) [Unverified] Knowledge Distillation, model
Interactive Knowledge Distillation (Jul 3, 2020) [Unverified] Image Classification
Improving Autoregressive NMT with Non-Autoregressive Model (Jul 1, 2020) [Unverified] Decoder, de-en
Xiaomi's Submissions for IWSLT 2020 Open Domain Translation Task (Jul 1, 2020) [Unverified] Domain Adaptation, Knowledge Distillation
CASIA's System for IWSLT 2020 Open Domain Translation (Jul 1, 2020) [Unverified] Knowledge Distillation, Machine Translation
Exploring the Limits of Simple Learners in Knowledge Distillation for Document Classification with DocBERT (Jul 1, 2020) [Unverified] Document Classification, General Classification
SimulSpeech: End-to-End Simultaneous Speech to Text Translation (Jul 1, 2020) [Unverified] Automatic Speech Recognition (ASR)
Improving Event Detection via Open-domain Trigger Knowledge (Jul 1, 2020) [Code available] Event Detection, Knowledge Distillation
On the Demystification of Knowledge Distillation: A Residual Network Perspective (Jun 30, 2020) [Unverified] Knowledge Distillation, Model Compression
Extracurricular Learning: Knowledge Transfer Beyond Empirical Distribution (Jun 30, 2020) [Unverified] Image Classification, Knowledge Distillation
Interpreting and Disentangling Feature Components of Various Complexity from DNNs (Jun 29, 2020) [Code available] Knowledge Distillation
Motion Pyramid Networks for Accurate and Efficient Cardiac Motion Estimation (Jun 28, 2020) [Unverified] Knowledge Distillation, Motion Estimation
Diverse Knowledge Distillation (DKD): A Solution for Improving The Robustness of Ensemble Models Against Adversarial Attacks (Jun 26, 2020) [Unverified] Ensemble Learning, Image Classification
Streaming Transformer ASR with Blockwise Synchronous Inference (Jun 25, 2020) [Unverified] Automatic Speech Recognition (ASR)
Distilling Object Detectors with Task Adaptive Regularization (Jun 23, 2020) [Unverified] Knowledge Distillation, Object
Self-Knowledge Distillation with Progressive Refinement of Targets (Jun 22, 2020) [Code available] Image Classification
Paying more attention to snapshots of Iterative Pruning: Improving Model Compression via Ensemble Distillation (Jun 20, 2020) [Code available] Image Classification
Deep Encoder, Shallow Decoder: Reevaluating Non-autoregressive Machine Translation (Jun 18, 2020) [Code available] Decoder, Knowledge Distillation
Self-supervised Knowledge Distillation for Few-shot Learning (Jun 17, 2020) [Code available] Few-Shot Image Classification, Few-Shot Learning
Prior knowledge distillation based on financial time series (Jun 16, 2020) [Unverified] Knowledge Distillation, Time Series
Multi-fidelity Neural Architecture Search with Knowledge Distillation (Jun 15, 2020) [Code available] Knowledge Distillation, Neural Architecture Search
AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks (Jun 15, 2020) [Code available] AutoML, Knowledge Distillation
Pixel Invisibility: Detecting Objects Invisible in Color Images (Jun 15, 2020) [Unverified] Knowledge Distillation, Object Detection
Knowledge Distillation Meets Self-Supervision (Jun 12, 2020) [Code available] Contrastive Learning, Knowledge Distillation
Ensemble Distillation for Robust Model Fusion in Federated Learning (Jun 12, 2020) [Code available] BIG-bench Machine Learning, Federated Learning
Real-Time Video Inference on Edge Devices via Adaptive Model Streaming (Jun 11, 2020) [Code available] Knowledge Distillation, Semantic Segmentation
Adjoined Networks: A Training Paradigm with Applications to Network Compression (Jun 10, 2020) [Code available] Knowledge Distillation, Neural Architecture Search
Knowledge Distillation: A Survey (Jun 9, 2020) [Unverified] Knowledge Distillation, Model Compression
Continual Representation Learning for Biometric Identification (Jun 8, 2020) [Code available] Continual Learning, Knowledge Distillation
Classification Under Misspecification: Halfspaces, Generalized Linear Models, and Connections to Evolvability (Jun 8, 2020) [Code available] Fairness, General Classification
FastSpeech 2: Fast and High-Quality End-to-End Text to Speech (Jun 8, 2020) [Code available] Knowledge Distillation, Speech Synthesis
ResKD: Residual-Guided Knowledge Distillation (Jun 8, 2020) [Unverified] Knowledge Distillation
Multi-view Contrastive Learning for Online Knowledge Distillation (Jun 7, 2020) [Code available] Classification, Contrastive Learning
ADMP: An Adversarial Double Masks Based Pruning Framework For Unsupervised Cross-Domain Compression (Jun 7, 2020) [Unverified] Domain Adaptation, Knowledge Distillation
Peer Collaborative Learning for Online Knowledge Distillation (Jun 7, 2020) [Code available] Knowledge Distillation
An Empirical Analysis of the Impact of Data Augmentation on Knowledge Distillation (Jun 6, 2020) [Unverified] Data Augmentation, Knowledge Distillation
An Overview of Neural Network Compression (Jun 5, 2020) [Unverified] Knowledge Distillation, Model Compression