- Simplified TinyBERT: Knowledge Distillation for Document Retrieval (Sep 16, 2020). Tags: Document Ranking, Knowledge Distillation. [code available]
- Noisy Self-Knowledge Distillation for Text Summarization (Sep 15, 2020). Tags: Knowledge Distillation, Self-Knowledge Distillation. [code available]
- Simulating Unknown Target Models for Query-Efficient Black-box Attacks (Sep 2, 2020). Tags: Knowledge Distillation, Meta-Learning. [code available]
- Semantics-aware Adaptive Knowledge Distillation for Sensor-to-Vision Action Recognition (Sep 1, 2020). Tags: Action Recognition, Image Generation. [code available]
- Unpaired Learning of Deep Image Denoising (Aug 31, 2020). Tags: Denoising, Image Denoising. [code available]
- Performance Optimization for Federated Person Re-identification via Benchmark Analysis (Aug 26, 2020). Tags: Federated Learning, Knowledge Distillation. [code available]
- PARADE: Passage Representation Aggregation for Document Reranking (Aug 20, 2020). Tags: Ad-Hoc Information Retrieval, Document Ranking. [code available]
- Knowledge Transfer via Dense Cross-Layer Mutual-Distillation (Aug 18, 2020). Tags: Knowledge Distillation, Representation Learning. [code available]
- Distilling the Knowledge of BERT for Sequence-to-Sequence ASR (Aug 9, 2020). Tags: Automatic Speech Recognition (ASR). [code available]
- Improving Knowledge Distillation via Category Structure (Aug 1, 2020). Tags: Knowledge Distillation. [code available]
- Distilling Visual Priors from Self-Supervised Learning (Aug 1, 2020). Tags: Classification, Contrastive Learning. [code available]
- Intra-class Feature Variation Distillation for Semantic Segmentation (Aug 1, 2020). Tags: Knowledge Distillation, Segmentation. [code available]
- Weakly Supervised 3D Object Detection from Point Clouds (Jul 28, 2020). Tags: 3D Object Detection, Knowledge Distillation. [code available]
- Group Knowledge Transfer: Federated Learning of Large CNNs at the Edge (Jul 28, 2020). Tags: Federated Learning, Knowledge Distillation. [code available]
- Deep Semi-supervised Knowledge Distillation for Overlapping Cervical Cell Instance Segmentation (Jul 21, 2020). Tags: Instance Segmentation, Knowledge Distillation. [code available]
- Resolution Switchable Networks for Runtime Efficient Image Recognition (Jul 19, 2020). Tags: Knowledge Distillation, Quantization. [code available]
- Self-supervision on Unlabelled OR Data for Multi-person 2D/3D Human Pose Estimation (Jul 16, 2020). Tags: 3D Human Pose Estimation, 3D Pose Estimation. [code available]
- Defocus Blur Detection via Depth Distillation (Jul 16, 2020). Tags: Decoder, Defocus Blur Detection. [code available]
- Knowledge Distillation for Multi-task Learning (Jul 14, 2020). Tags: Knowledge Distillation, Multi-Task Learning. [code available]
- Unsupervised Multi-Target Domain Adaptation Through Knowledge Distillation (Jul 14, 2020). Tags: Domain Adaptation, Knowledge Distillation. [code available]
- RATT: Recurrent Attention to Transient Tasks for Continual Image Captioning (Jul 13, 2020). Tags: Continual Learning, Image Captioning. [code available]
- Towards Practical Lipreading with Distilled and Efficient Models (Jul 13, 2020). Tags: Knowledge Distillation, Lipreading. [code available]
- Learning to Learn Parameterized Classification Networks for Scalable Input Images (Jul 13, 2020). Tags: Classification, General Classification. [code available]
- Temporal Self-Ensembling Teacher for Semi-Supervised Object Detection (Jul 13, 2020). Tags: Image Classification. [code available]
- Robust Re-Identification by Multiple Views Knowledge Distillation (Jul 8, 2020). Tags: Knowledge Distillation, Person Re-Identification. [code available]
- Tracking-by-Trackers with a Distilled and Reinforced Model (Jul 8, 2020). Tags: Knowledge Distillation, Object Tracking. [code available]
- Improving Weakly Supervised Visual Grounding by Contrastive Knowledge Distillation (Jul 3, 2020). Tags: Contrastive Learning, Knowledge Distillation. [code available]
- Improving Event Detection via Open-domain Trigger Knowledge (Jul 1, 2020). Tags: Event Detection, Knowledge Distillation. [code available]
- Self-Knowledge Distillation with Progressive Refinement of Targets (Jun 22, 2020). Tags: Image Classification. [code available]
- Paying more attention to snapshots of Iterative Pruning: Improving Model Compression via Ensemble Distillation (Jun 20, 2020). Tags: Image Classification. [code available]
- Deep Encoder, Shallow Decoder: Reevaluating Non-autoregressive Machine Translation (Jun 18, 2020). Tags: Decoder, Knowledge Distillation. [code available]
- Self-supervised Knowledge Distillation for Few-shot Learning (Jun 17, 2020). Tags: Few-Shot Image Classification, Few-Shot Learning. [code available]
- AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks (Jun 15, 2020). Tags: AutoML, Knowledge Distillation. [code available]
- Knowledge Distillation Meets Self-Supervision (Jun 12, 2020). Tags: Contrastive Learning, Knowledge Distillation. [code available]
- Real-Time Video Inference on Edge Devices via Adaptive Model Streaming (Jun 11, 2020). Tags: Knowledge Distillation, Semantic Segmentation. [code available]
- Adjoined Networks: A Training Paradigm with Applications to Network Compression (Jun 10, 2020). Tags: Knowledge Distillation, Neural Architecture Search. [code available]
- FastSpeech 2: Fast and High-Quality End-to-End Text to Speech (Jun 8, 2020). Tags: Knowledge Distillation, Speech Synthesis. [code available]
- Multi-view Contrastive Learning for Online Knowledge Distillation (Jun 7, 2020). Tags: Classification, Contrastive Learning. [code available]
- Peer Collaborative Learning for Online Knowledge Distillation (Jun 7, 2020). Tags: Knowledge Distillation. [code available]
- Channel Distillation: Channel-Wise Attention for Knowledge Distillation (Jun 2, 2020). Tags: Knowledge Distillation. [code available]
- Block-Wisely Supervised Neural Architecture Search With Knowledge Distillation (Jun 1, 2020). Tags: Knowledge Distillation, Neural Architecture Search. [code available]
- Distilling Cross-Task Knowledge via Relationship Matching (Jun 1, 2020). Tags: Knowledge Distillation. [code available]
- Online Knowledge Distillation via Collaborative Learning (Jun 1, 2020). Tags: Knowledge Distillation, Model Compression. [code available]
- Transferring Inductive Biases through Knowledge Distillation (May 31, 2020). Tags: Knowledge Distillation. [code available]
- Distilling Knowledge from Ensembles of Acoustic Models for Joint CTC-Attention End-to-End Speech Recognition (May 19, 2020). Tags: Automatic Speech Recognition (ASR). [code available]
- MicroNet for Efficient Language Modeling (May 16, 2020). Tags: Knowledge Distillation, Language Modeling. [code available]
- Data-Free Network Quantization With Adversarial Knowledge Distillation (May 8, 2020). Tags: Knowledge Distillation, Model Compression. [code available]
- ProSelfLC: Progressive Self Label Correction for Training Robust Deep Neural Networks (May 7, 2020). Tags: Knowledge Distillation, Self-Knowledge Distillation. [code available]
- MAZE: Data-Free Model Stealing Attack Using Zeroth-Order Gradient Estimation (May 6, 2020). Tags: Data-free Knowledge Distillation, Knowledge Distillation. [code available]
- Heterogeneous Knowledge Distillation using Information Flow Modeling (May 2, 2020). Tags: Knowledge Distillation. [code available]