ESPnet-ST IWSLT 2021 Offline Speech Translation System | Jul 1, 2021 | Decoder, Knowledge Distillation
Local-Global Knowledge Distillation in Heterogeneous Federated Learning with Non-IID Data | Jun 30, 2021 | Federated Learning, Knowledge Distillation
Reward-Based 1-bit Compressed Federated Distillation on Blockchain | Jun 27, 2021 | Federated Learning, Knowledge Distillation
Learning without Forgetting for 3D Point Cloud Objects | Jun 27, 2021 | Knowledge Distillation
[Code available] Adapt-and-Distill: Developing Small, Fast and Effective Pretrained Language Models for Domains | Jun 25, 2021 | Knowledge Distillation
PQK: Model Compression via Pruning, Quantization, and Knowledge Distillation | Jun 25, 2021 | Keyword Spotting, Knowledge Distillation
Dealing with training and test segmentation mismatch: FBK@IWSLT2021 | Jun 23, 2021 | Action Detection, Activity Detection
Efficient Inference via Universal LSH Kernel | Jun 21, 2021 | Knowledge Distillation, Quantization
Knowledge Distillation via Instance-level Sequence Learning | Jun 21, 2021 | General Knowledge, Knowledge Distillation
Positive-Unlabeled Data Purification in the Wild for Object Detection | Jun 19, 2021 | Knowledge Distillation, Object Detection
Data-Free Knowledge Distillation for Image Super-Resolution | Jun 19, 2021 | Data-free Knowledge Distillation, Image Super-Resolution
[Code available] Space-Time Distillation for Video Super-Resolution | Jun 19, 2021 | Knowledge Distillation, Super-Resolution
Cross Modality Knowledge Distillation for Multi-Modal Aerial View Object Classification | Jun 19, 2021 | Image Classification, Knowledge Distillation
[Code available] Minimally Invasive Surgery for Sparse Neural Networks in Contrastive Manner | Jun 19, 2021 | Knowledge Distillation, Model Compression
Teacher's pet: understanding and mitigating biases in distillation | Jun 19, 2021 | Image Classification
Tree-Like Decision Distillation | Jun 19, 2021 | Decision Making, Knowledge Distillation
CapsuleRRT: Relationships-Aware Regression Tracking via Capsules | Jun 19, 2021 | Image Classification
Recurrent Stacking of Layers in Neural Networks: An Application to Neural Machine Translation | Jun 18, 2021 | Knowledge Distillation, Machine Translation
Dual-Teacher Class-Incremental Learning With Data-Free Generative Replay | Jun 17, 2021 | Class Incremental Learning
Knowledge distillation from multi-modal to mono-modal segmentation networks | Jun 17, 2021 | Brain Tumor Segmentation, Image Segmentation
Dynamic Knowledge Distillation With Noise Elimination for RGB-D Salient Object Detection | Jun 17, 2021 | Knowledge Distillation, Object Detection
Topology Distillation for Recommender System | Jun 16, 2021 | Knowledge Distillation, Model Compression
Simon Says: Evaluating and Mitigating Bias in Pruned Neural Networks with Knowledge Distillation | Jun 15, 2021 | Fairness, Knowledge Distillation
[Code available] CoDERT: Distilling Encoder Representations with Co-learning for Transducer-based Speech Recognition | Jun 14, 2021 | Decoder, Knowledge Distillation
Energy-efficient Knowledge Distillation for Spiking Neural Networks | Jun 14, 2021 | Knowledge Distillation, Model Compression
LENAS: Learning-based Neural Architecture Search and Ensemble for 3D Radiotherapy Dose Prediction | Jun 12, 2021 | Diversity, Ensemble Learning
[Code available] Guiding Teacher Forcing with Seer Forcing for Neural Machine Translation | Jun 12, 2021 | Decoder, Knowledge Distillation
Generate, Annotate, and Learn: NLP with Synthetic Text | Jun 11, 2021 | Few-Shot Learning, Image Classification
[Code available] RefBERT: Compressing BERT by Referencing to Pre-computed Representations | Jun 11, 2021 | Knowledge Distillation
Marginal Utility Diminishes: Exploring the Minimum Knowledge for BERT Knowledge Distillation | Jun 10, 2021 | Knowledge Distillation
[Code available] AKE-GNN: Effective Graph Learning with Adaptive Knowledge Exchange | Jun 10, 2021 | Classification, Graph Classification
Learning by Distillation: A Self-Supervised Learning Framework for Optical Flow Estimation | Jun 8, 2021 | Knowledge Distillation, Optical Flow Estimation
RoSearch: Search for Robust Student Architectures When Distilling Pre-trained Language Models | Jun 7, 2021 | Adversarial Robustness, Knowledge Distillation
MergeDistill: Merging Pre-trained Language Models using Distillation | Jun 5, 2021 | Cross-Lingual Transfer, Knowledge Distillation
ERNIE-Tiny: A Progressive Distillation Framework for Pretrained Transformer Compression | Jun 4, 2021 | Knowledge Distillation
[Code available] Not All Knowledge Is Created Equal: Mutual Distillation of Confident Knowledge | Jun 2, 2021 | Knowledge Distillation
Rejuvenating Low-Frequency Words: Making the Most of Parallel Data in Non-Autoregressive Translation | Jun 2, 2021 | Knowledge Distillation, Translation
[Code available] One Teacher is Enough? Pre-trained Language Model Distillation from Multiple Teachers | Jun 2, 2021 | Knowledge Distillation, Language Modeling
Cost-effective Deployment of BERT Models in Serverless Environment | Jun 1, 2021 | Knowledge Distillation, Semantic Textual Similarity
Modality-specific Distillation | Jun 1, 2021 | Knowledge Distillation, Meta-Learning
Multi-Grained Knowledge Distillation for Named Entity Recognition | Jun 1, 2021 | Knowledge Distillation, Named Entity Recognition
Natural Statistics of Network Activations and Implications for Knowledge Distillation | Jun 1, 2021 | Knowledge Distillation
Reinforced Iterative Knowledge Distillation for Cross-Lingual Named Entity Recognition | Jun 1, 2021 | Cross-Lingual NER, Knowledge Distillation
Continual Learning for Neural Machine Translation | Jun 1, 2021 | Continual Learning, Knowledge Distillation
Claim Matching Beyond English to Scale Global Fact-Checking | Jun 1, 2021 | Fact Checking, Knowledge Distillation
Greedy-layer Pruning: Speeding up Transformer Models for Natural Language Processing | May 31, 2021 | Knowledge Distillation, Unsupervised Pre-training
[Code available] FReTAL: Generalizing Deepfake Detection using Knowledge Distillation and Representation Learning | May 28, 2021 | DeepFake Detection, Domain Adaptation
Not Far Away, Not So Close: Sample Efficient Nearest Neighbour Data Augmentation via MiniMax | May 28, 2021 | Data Augmentation, Knowledge Distillation
[Code available] Joint-DetNAS: Upgrade Your Detector with NAS, Pruning and Dynamic Distillation | May 27, 2021 | Knowledge Distillation, Neural Architecture Search
Towards Understanding Knowledge Distillation | May 27, 2021 | Knowledge Distillation, Transfer Learning