- Knowledge Distillation via Weighted Ensemble of Teaching Assistants (Jun 23, 2022): Ensemble Learning, Knowledge Distillation
- Conformer with dual-mode chunked attention for joint online and offline ASR (Jun 22, 2022): Knowledge Distillation
- Knowledge Distillation for Oriented Object Detection on Aerial Images (Jun 20, 2022): Knowledge Distillation, Model Compression
- Revisiting Self-Distillation (Jun 17, 2022): Knowledge Distillation, Model Compression
- Multi-scale Feature Extraction and Fusion for Online Knowledge Distillation (Jun 16, 2022): Knowledge Distillation, Transfer Learning
- FreeKD: Free-direction Knowledge Distillation for Graph Neural Networks (Jun 14, 2022): Knowledge Distillation, Reinforcement Learning
- Toward Student-Oriented Teacher Network Training For Knowledge Distillation (Jun 14, 2022): Data Augmentation, Knowledge Distillation
- FreeTransfer-X: Safe and Label-Free Cross-Lingual Transfer from Off-the-Shelf Models (Jun 14, 2022): Cross-Lingual Transfer, Diagnostic
- Better Teacher Better Student: Dynamic Prior Knowledge for Knowledge Distillation (Jun 13, 2022): Image Classification
- Robust Distillation for Worst-class Performance (Jun 13, 2022): Knowledge Distillation [Code Available]
- Federated Bayesian Neural Regression: A Scalable Global Federated Gaussian Process (Jun 13, 2022): Federated Learning, Knowledge Distillation
- Reducing Capacity Gap in Knowledge Distillation with Review Mechanism for Crowd Counting (Jun 11, 2022): Computational Efficiency, Crowd Counting
- SDQ: Stochastic Differentiable Quantization with Mixed Precision (Jun 9, 2022): Knowledge Distillation, Neural Architecture Search [Code Available]
- Knowledge Distillation Decision Tree for Unravelling Black-box Machine Learning Models (Jun 9, 2022): Knowledge Distillation
- Narrowing the Coordinate-frame Gap in Behavior Prediction Models: Distillation for Efficient and Accurate Scene-centric Motion Forecasting (Jun 8, 2022): Autonomous Driving, Knowledge Distillation
- cViL: Cross-Lingual Training of Vision-Language Models using Knowledge Distillation (Jun 7, 2022): Knowledge Distillation, Question Answering
- Self-Knowledge Distillation based Self-Supervised Learning for Covid-19 Detection from Chest X-Ray Images (Jun 7, 2022): Knowledge Distillation, Self-Knowledge Distillation [Code Available]
- Reconsidering Learning Objectives in Unbiased Recommendation with Unobserved Confounders (Jun 7, 2022): Generalization Bounds, Knowledge Distillation
- Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding (Jun 7, 2022): Graph Embedding, Knowledge Distillation
- Evaluation-oriented Knowledge Distillation for Deep Face Recognition (Jun 6, 2022): Face Recognition, Knowledge Distillation
- Lip-Listening: Mixing Senses to Understand Lips using Cross Modality Knowledge Distillation for Word-Based Models (Jun 5, 2022): Knowledge Distillation, Lipreading
- Point-to-Voxel Knowledge Distillation for LiDAR Semantic Segmentation (Jun 5, 2022): 3D Semantic Segmentation, Knowledge Distillation
- Vanilla Feature Distillation for Improving the Accuracy-Robustness Trade-Off in Adversarial Training (Jun 5, 2022): Knowledge Distillation [Code Available]
- Extreme Compression for Pre-trained Transformers Made Simple and Efficient (Jun 4, 2022): Knowledge Distillation, Quantization
- Guided Deep Metric Learning (Jun 4, 2022): Few-Shot Learning, Knowledge Distillation
- 3D-Augmented Contrastive Knowledge Distillation for Image-based Object Pose Estimation (Jun 2, 2022): Contrastive Learning, Knowledge Distillation
- ORC: Network Group-based Knowledge Distillation using Online Role Change (Jun 1, 2022): Knowledge Distillation
- Generalized Supervised Contrastive Learning (Jun 1, 2022): Contrastive Learning, Knowledge Distillation [Code Available]
- Detecting Optimism in Tweets using Knowledge Distillation and Linguistic Analysis of Optimism (Jun 1, 2022): Hate Speech Detection, Knowledge Distillation
- Searching for COMETINHO: The Little Metric That Could (Jun 1, 2022): Computational Efficiency, Knowledge Distillation
- What Knowledge Gets Distilled in Knowledge Distillation? (May 31, 2022): Knowledge Distillation
- VFed-SSD: Towards Practical Vertical Federated Advertising (May 31, 2022): Federated Learning, Knowledge Distillation
- Spectral Maps for Learning on Subgraphs (May 30, 2022): Graph Learning, Knowledge Distillation
- Knowledge Distillation for 6D Pose Estimation by Aligning Distributions of Local Predictions (May 30, 2022): 6D Pose Estimation, 6D Pose Estimation using RGB
- A General Multiple Data Augmentation Based Framework for Training Deep Neural Networks (May 29, 2022): Data Augmentation, Image Classification
- MiniDisc: Minimal Distillation Schedule for Language Model Compression (May 29, 2022): Knowledge Distillation, Language Modeling
- One Reference Is Not Enough: Diverse Distillation with Reference Selection for Non-Autoregressive Translation (May 28, 2022): Knowledge Distillation, Machine Translation [Code Available]
- Parameter-Efficient and Student-Friendly Knowledge Distillation (May 28, 2022): Knowledge Distillation, Transfer Learning [Code Available]
- Region-aware Knowledge Distillation for Efficient Image-to-Image Translation (May 25, 2022): Contrastive Learning, Image Classification
- Do we need Label Regularization to Fine-tune Pre-trained Language Models? (May 25, 2022): Knowledge Distillation, Model Compression
- DFM: Dialogue Foundation Model for Universal Large-Scale Dialogue-Oriented Task Learning (May 25, 2022): Dialogue Generation, Diversity
- CDFKD-MFS: Collaborative Data-free Knowledge Distillation via Multi-level Feature Sharing (May 24, 2022): Data-free Knowledge Distillation, Knowledge Distillation
- LILA-BOTI: Leveraging Isolated Letter Accumulations By Ordering Teacher Insights for Bangla Handwriting Recognition (May 23, 2022): Handwriting Recognition, Knowledge Distillation [Code Available]
- Aligning Logits Generatively for Principled Black-Box Knowledge Distillation (May 21, 2022): Federated Learning, Knowledge Distillation [Code Available]
- InDistill: Information flow-preserving knowledge distillation for model compression (May 20, 2022): Knowledge Distillation, Model Compression [Code Available]
- Simple Regularisation for Uncertainty-Aware Knowledge Distillation (May 19, 2022): BIG-bench Machine Learning, Diversity [Code Available]
- ERNIE-Search: Bridging Cross-Encoder with Dual-Encoder via Self On-the-fly Distillation for Dense Passage Retrieval (May 18, 2022): Knowledge Distillation, Open-Domain Question Answering
- Prompting to Distill: Boosting Data-Free Knowledge Distillation via Reinforced Prompt (May 16, 2022): Data-free Knowledge Distillation, Knowledge Distillation
- Chemical transformer compression for accelerating both training and inference of molecular modeling (May 16, 2022): Knowledge Distillation, Model Compression
- Not to Overfit or Underfit the Source Domains? An Empirical Study of Domain Generalization in Question Answering (May 15, 2022): Domain Generalization, Knowledge Distillation [Code Available]