- FedRAD: Federated Robust Adaptive Distillation · Dec 2, 2021 · Federated Learning, Knowledge Distillation
- Shapeshifter: a Parameter-efficient Transformer using Factorized Reshaped Matrices · Dec 1, 2021 · Knowledge Distillation, Model Compression
- [Code] Unsupervised Representation Transfer for Small Networks: I Believe I Can Distill On-the-Fly · Dec 1, 2021 · Knowledge Distillation, Linear Evaluation
- Handling Long-tailed Feature Distribution in AdderNets · Dec 1, 2021 · Knowledge Distillation
- Analyzing the Confidentiality of Undistillable Teachers in Knowledge Distillation · Dec 1, 2021 · Knowledge Distillation
- [Code] Adversarial Teacher-Student Representation Learning for Domain Generalization · Dec 1, 2021 · Data Augmentation, Domain Generalization
- [Code] Using a GAN to Generate Adversarial Examples to Facial Image Recognition · Nov 30, 2021 · Face Recognition, Generative Adversarial Network
- Improved Knowledge Distillation via Adversarial Collaboration · Nov 29, 2021 · Knowledge Distillation
- Efficient Federated Learning for AIoT Applications Using Knowledge Distillation · Nov 29, 2021 · Federated Learning, Knowledge Distillation
- ESGN: Efficient Stereo Geometry Network for Fast 3D Object Detection · Nov 28, 2021 · 3D Object Detection, Knowledge Distillation
- Ensembling of Distilled Models from Multi-task Teachers for Constrained Resource Language Pairs · Nov 26, 2021 · Knowledge Distillation, Translation
- Domain-Agnostic Clustering with Self-Distillation · Nov 23, 2021 · Clustering, Data Augmentation
- Semi-Online Knowledge Distillation · Nov 23, 2021 · Knowledge Distillation, Model Compression
- [Code] Local-Selective Feature Distillation for Single Image Super-Resolution · Nov 22, 2021 · Image Super-Resolution, Knowledge Distillation
- Contrast-reconstruction Representation Learning for Self-supervised Skeleton-based Action Recognition · Nov 22, 2021 · Action Recognition, Contrastive Learning
- Hierarchical Knowledge Distillation for Dialogue Sequence Labeling · Nov 22, 2021 · Knowledge Distillation, Scene Segmentation
- Teacher-Student Training and Triplet Loss to Reduce the Effect of Drastic Face Occlusion · Nov 20, 2021 · Age Estimation, Facial Expression Recognition
- Toxicity Detection can be Sensitive to the Conversational Context · Nov 19, 2021 · Data Augmentation, Knowledge Distillation
- Dynamically pruning segformer for efficient semantic segmentation · Nov 18, 2021 · Knowledge Distillation, Segmentation
- Hierarchical Knowledge Guided Learning for Real-world Retinal Diseases Recognition · Nov 17, 2021 · Knowledge Distillation
- One General Teacher for Multi-Data Multi-Task: A New Knowledge Distillation Framework for Discourse Relation Analysis · Nov 16, 2021 · Knowledge Distillation, Multi-Task Learning
- Redistributing Low-Frequency Words: Making the Most of Monolingual Data in Non-Autoregressive Translation · Nov 16, 2021 · Knowledge Distillation, Translation
- NVIDIA NeMo Neural Machine Translation Systems for English-German and English-Russian News and Biomedical Tasks at WMT21 · Nov 16, 2021 · Data Augmentation, Knowledge Distillation
- Feature Structure Distillation for BERT Transferring · Nov 16, 2021 · Knowledge Distillation
- An Unsupervised Multiple-Task and Multiple-Teacher Model for Cross-lingual Named Entity Recognition · Nov 16, 2021 · Cross-Lingual NER, Knowledge Distillation
- [Code] Multi-stage Distillation Framework for Cross-Lingual Semantic Similarity Matching · Nov 16, 2021 · Contrastive Learning, Knowledge Distillation
- Learning to Teach with Student Feedback · Nov 16, 2021 · Knowledge Distillation
- Multi-Granularity Contrastive Knowledge Distillation for Multimodal Named Entity Recognition · Nov 16, 2021 · Knowledge Distillation, Multi-modal Named Entity Recognition
- Enabling Multimodal Generation on CLIP via Vision-Language Knowledge Distillation · Nov 16, 2021 · Image Captioning, Knowledge Distillation
- Deep-to-bottom Weights Decay: A Systemic Knowledge Review Learning Technique for Transformer Layers in Knowledge Distillation · Nov 16, 2021 · Knowledge Distillation
- Self-Distilled Pruning of Neural Networks · Nov 16, 2021 · Knowledge Distillation, Language Modeling
- When Chosen Wisely, More Data Is What You Need: A Universal Sample-Efficient Strategy For Data Augmentation · Nov 16, 2021 · Data Augmentation, HellaSwag
- Sparse Progressive Distillation: Resolving Overfitting under Pretrain-and-Finetune Paradigm · Nov 16, 2021 · Knowledge Distillation
- Making Small Language Models Better Few-Shot Learners · Nov 16, 2021 · Few-Shot Learning, Knowledge Distillation
- Aligned Weight Regularizers for Pruning Pretrained Neural Networks · Nov 16, 2021 · Knowledge Distillation, Language Modeling
- A Flexible Multi-Task Model for BERT Serving · Nov 16, 2021 · Knowledge Distillation, Model
- Compositional Data Augmentation for Abstractive Conversation Summarization · Nov 16, 2021 · Conversation Summarization, Data Augmentation
- Synthetic Unknown Class Learning for Learning Unknowns · Nov 15, 2021 · Diversity, Knowledge Distillation
- Robust and Accurate Object Detection via Self-Knowledge Distillation · Nov 14, 2021 · Adversarial Robustness, Knowledge Distillation
- [Code] Facial Landmark Points Detection Using Knowledge Distillation-Based Neural Networks · Nov 13, 2021 · Face Alignment, Facial Landmark Detection
- [Code] Domain Generalization on Efficient Acoustic Scene Classification using Residual Normalization · Nov 12, 2021 · Acoustic Scene Classification, Classification
- Learning Interpretation with Explainable Knowledge Distillation · Nov 12, 2021 · Knowledge Distillation, Model Compression
- Incremental Meta-Learning via Episodic Replay Distillation for Few-Shot Image Recognition · Nov 9, 2021 · Continual Learning, Knowledge Distillation
- [Code] A Survey on Green Deep Learning · Nov 8, 2021 · Deep Learning, Knowledge Distillation
- Class Token and Knowledge Distillation for Multi-head Self-Attention Speaker Verification Systems · Nov 6, 2021 · Knowledge Distillation, Philosophy
- AUTOKD: Automatic Knowledge Distillation Into A Student Architecture Family · Nov 5, 2021 · Bayesian Optimization, Knowledge Distillation
- Visualizing the Emergence of Intermediate Visual Patterns in DNNs · Nov 5, 2021 · Knowledge Distillation
- Oracle Teacher: Leveraging Target Information for Better Knowledge Distillation of CTC Models · Nov 5, 2021 · Knowledge Distillation, Machine Translation
- A methodology for training homomorphic encryption friendly neural networks · Nov 5, 2021 · Knowledge Distillation, Privacy Preserving
- DVFL: A Vertical Federated Learning Method for Dynamic Data · Nov 5, 2021 · Federated Learning, Knowledge Distillation