How to Select One Among All ? An Empirical Study Towards the Robustness of Knowledge Distillation in Natural Language Understanding Nov 1, 2021 Adversarial Robustness All
— Unverified 0Limitations of Knowledge Distillation for Zero-shot Transfer Learning Nov 1, 2021 CPU Cross-Lingual Transfer
— Unverified 0Distilling Object Detectors with Feature Richness Nov 1, 2021 Knowledge Distillation Model Compression
Code Code Available 1Learning Distilled Collaboration Graph for Multi-Agent Perception Nov 1, 2021 3D Object Detection Knowledge Distillation
Code Code Available 1PP-ShiTu: A Practical Lightweight Image Recognition System Nov 1, 2021 Face Recognition Knowledge Distillation
Code Code Available 0Rethinking the Knowledge Distillation From the Perspective of Model Calibration Oct 31, 2021 Knowledge Distillation
— Unverified 0Estimating and Maximizing Mutual Information for Knowledge Distillation Oct 29, 2021 Knowledge Distillation
— Unverified 0On Cross-Layer Alignment for Model Fusion of Heterogeneous Neural Networks Oct 29, 2021 Knowledge Distillation Model Compression
— Unverified 0NxMTransformer: Semi-Structured Sparsification for Natural Language Understanding via ADMM Oct 28, 2021 Knowledge Distillation Natural Language Understanding
— Unverified 0Towards Model Agnostic Federated Learning Using Knowledge Distillation Oct 28, 2021 Federated Learning Knowledge Distillation
— Unverified 0Temporal Knowledge Distillation for On-device Audio Classification Oct 27, 2021 Audio Classification Classification
— Unverified 0GenURL: A General Framework for Unsupervised Representation Learning Oct 27, 2021 Contrastive Learning Dimensionality Reduction
— Unverified 0Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data Oct 27, 2021 Knowledge Distillation Semantic Segmentation
Code Code Available 1Beyond Classification: Knowledge Distillation using Multi-Object Impressions Oct 27, 2021 Classification Knowledge Distillation
— Unverified 0Response-based Distillation for Incremental Object Detection Oct 26, 2021 Incremental Learning Knowledge Distillation
— Unverified 0Instance-Conditional Knowledge Distillation for Object Detection Oct 25, 2021 Image Classification Knowledge Distillation
Code Code Available 1Reconstructing Pruned Filters using Cheap Spatial Transformations Oct 25, 2021 Feature Compression Knowledge Distillation
— Unverified 0MUSE: Feature Self-Distillation with Mutual Information and Self-Information Oct 25, 2021 image-classification Image Classification
— Unverified 0Anti-Distillation Backdoor Attacks: Backdoors Can Really Survive in Knowledge Distillation Oct 24, 2021 Backdoor Attack Knowledge Distillation
Code Code Available 1X-Distill: Improving Self-Supervised Monocular Depth via Cross-Task Distillation Oct 24, 2021 Depth Estimation Knowledge Distillation
— Unverified 0How and When Adversarial Robustness Transfers in Knowledge Distillation? Oct 22, 2021 Adversarial Robustness Knowledge Distillation
— Unverified 0Pseudo Supervised Monocular Depth Estimation with Teacher-Student Network Oct 22, 2021 Depth Estimation Knowledge Distillation
— Unverified 0Pixel-by-Pixel Cross-Domain Alignment for Few-Shot Semantic Segmentation Oct 22, 2021 Autonomous Driving Cross-Domain Few-Shot
Code Code Available 1Augmenting Knowledge Distillation With Peer-To-Peer Mutual Learning For Model Compression Oct 21, 2021 Knowledge Distillation Model Compression
— Unverified 0Knowledge distillation from language model to acoustic model: a hierarchical multi-task learning approach Oct 20, 2021 Knowledge Distillation Language Modeling
— Unverified 0Class Incremental Online Streaming Learning Oct 20, 2021 class-incremental learning Class Incremental Learning
— Unverified 0FedHe: Heterogeneous Models and Communication-Efficient Federated Learning Oct 19, 2021 Federated Learning Knowledge Distillation
Code Code Available 0Adaptive Distillation: Aggregating Knowledge from Multiple Paths for Efficient Distillation Oct 19, 2021 Knowledge Distillation Neural Network Compression
Code Code Available 0Graph-less Neural Networks: Teaching Old MLPs New Tricks via Distillation Oct 17, 2021 Knowledge Distillation Node Classification
Code Code Available 1Know your tools well: Better and faster QA with synthetic examples Oct 16, 2021 Diversity Knowledge Distillation
— Unverified 0HRKD: Hierarchical Relational Knowledge Distillation for Cross-domain Language Model Compression Oct 16, 2021 Few-Shot Learning Knowledge Distillation
Code Code Available 0A Short Study on Compressing Decoder-Based Language Models Oct 16, 2021 Decoder Knowledge Distillation
— Unverified 0Robustness Challenges in Model Distillation and Pruning for Natural Language Understanding Oct 16, 2021 Knowledge Distillation Model Compression
— Unverified 0Pro-KD: Progressive Distillation by Following the Footsteps of the Teacher Oct 16, 2021 image-classification Image Classification
— Unverified 0From Multimodal to Unimodal Attention in Transformers using Knowledge Distillation Oct 15, 2021 Knowledge Distillation Multimodal Deep Learning
— Unverified 0Sparse Progressive Distillation: Resolving Overfitting under Pretrain-and-Finetune Paradigm Oct 15, 2021 Knowledge Distillation
— Unverified 0Multilingual Neural Machine Translation:Can Linguistic Hierarchies Help? Oct 15, 2021 Knowledge Distillation Machine Translation
— Unverified 0Kronecker Decomposition for GPT Compression Oct 15, 2021 Knowledge Distillation Language Modeling
— Unverified 0FocusNet: Classifying Better by Focusing on Confusing Classes Oct 14, 2021 Classification image-classification
Code Code Available 1Symbolic Knowledge Distillation: from General Language Models to Commonsense Models Oct 14, 2021 Knowledge Distillation Knowledge Graphs
Code Code Available 1Language Modelling via Learning to Rank Oct 13, 2021 Knowledge Distillation Language Modelling
— Unverified 0False Negative Distillation and Contrastive Learning for Personalized Outfit Recommendation Oct 13, 2021 Contrastive Learning Data Augmentation
— Unverified 0Object DGCNN: 3D Object Detection using Dynamic Graphs Oct 13, 2021 2D Object Detection 3D Object Detection
Code Code Available 1CONetV2: Efficient Auto-Channel Size Optimization for CNNs Oct 13, 2021 Knowledge Distillation Neural Architecture Search
Code Code Available 0Rectifying the Data Bias in Knowledge Distillation Oct 11, 2021 Face Recognition Face Verification
— Unverified 0Compact CNN Models for On-device Ocular-based User Recognition in Mobile Devices Oct 11, 2021 Knowledge Distillation Network Pruning
— Unverified 0Towards Streaming Egocentric Action Anticipation Oct 11, 2021 Action Anticipation Knowledge Distillation
— Unverified 0Visualizing the embedding space to explain the effect of knowledge distillation Oct 9, 2021 Knowledge Distillation
— Unverified 0Towards Data-Free Domain Generalization Oct 9, 2021 Data-free Knowledge Distillation Domain Generalization
Code Code Available 0Cross-modal Knowledge Distillation for Vision-to-Sensor Action Recognition Oct 8, 2021 Action Recognition Activity Recognition
Code Code Available 0