Comprehensive Knowledge Distillation with Causal Intervention Dec 1, 2021 Causal Inference Knowledge Distillation
Code Code Available 1Formalizing Generalization and Adversarial Robustness of Neural Networks to Weight Perturbations Dec 1, 2021 Adversarial Robustness Model Compression
— Unverified 0Aligned Structured Sparsity Learning for Efficient Image Super-Resolution Dec 1, 2021 Image Super-Resolution Knowledge Distillation
Code Code Available 1A Unified Pruning Framework for Vision Transformers Nov 30, 2021 Model Compression object-detection
Code Code Available 1FedHM: Efficient Federated Learning for Heterogeneous Models via Low-rank Factorization Nov 29, 2021 Distributed Computing Federated Learning
— Unverified 0Exploring Low-Cost Transformer Model Compression for Large-Scale Commercial Reply Suggestions Nov 27, 2021 Model Compression
— Unverified 0Accelerating Deep Learning with Dynamic Data Pruning Nov 24, 2021 Attribute Deep Learning
— Unverified 0NAM: Normalization-based Attention Module Nov 24, 2021 Model Compression
Code Code Available 1Sharpness-aware Quantization for Deep Neural Networks Nov 24, 2021 Image Classification Model Compression
Code Code Available 1Semi-Online Knowledge Distillation Nov 23, 2021 Knowledge Distillation Model Compression
Code Code Available 0Automatic Mapping of the Best-Suited DNN Pruning Schemes for Real-Time Mobile Acceleration Nov 22, 2021 Model Compression
— Unverified 0Local-Selective Feature Distillation for Single Image Super-Resolution Nov 22, 2021 Image Super-Resolution Knowledge Distillation
— Unverified 0Structured Pruning Learns Compact and Accurate Models Nov 16, 2021 Model Compression
— Unverified 0Weight Squeezing: Reparameterization for Knowledge Transfer and Model Compression Nov 16, 2021 Model Compression text-classification
— Unverified 0Learning-Based Symbol Level Precoding: A Memory-Efficient Unsupervised Learning Approach Nov 15, 2021 Model Compression
— Unverified 0Learning Interpretation with Explainable Knowledge Distillation Nov 12, 2021 Knowledge Distillation Model Compression
— Unverified 0Domain Generalization on Efficient Acoustic Scene Classification using Residual Normalization Nov 12, 2021 Acoustic Scene Classification Classification
— Unverified 0A Survey on Green Deep Learning Nov 8, 2021 Deep Learning Knowledge Distillation
— Unverified 0SEOFP-NET: Compression and Acceleration of Deep Neural Networks for Speech Enhancement Using Sign-Exponent-Only Floating-Points Nov 8, 2021 Model Compression regression
— Unverified 0LiMuSE: Lightweight Multi-modal Speaker Extraction Nov 7, 2021 Model Compression Quantization
Code Code Available 1Oracle Teacher: Leveraging Target Information for Better Knowledge Distillation of CTC Models Nov 5, 2021 Knowledge Distillation Machine Translation
— Unverified 0Weight, Block or Unit? Exploring Sparsity Tradeoffs for Speech Enhancement on Tiny Neural Accelerators Nov 3, 2021 Model Compression Speech Enhancement
— Unverified 0How to Select One Among All ? An Empirical Study Towards the Robustness of Knowledge Distillation in Natural Language Understanding Nov 1, 2021 Adversarial Robustness All
— Unverified 0Distilling Object Detectors with Feature Richness Nov 1, 2021 Knowledge Distillation Model Compression
Code Code Available 1ILMPQ : An Intra-Layer Multi-Precision Deep Neural Network Quantization framework for FPGA Oct 30, 2021 Edge-computing Model Compression
— Unverified 0On Cross-Layer Alignment for Model Fusion of Heterogeneous Neural Networks Oct 29, 2021 Knowledge Distillation Model Compression
— Unverified 0Generalized Depthwise-Separable Convolutions for Adversarially Robust and Efficient Neural Networks Oct 28, 2021 Model Compression
Code Code Available 1Reconstructing Pruned Filters using Cheap Spatial Transformations Oct 25, 2021 Feature Compression Knowledge Distillation
— Unverified 0Exploring Gradient Flow Based Saliency for DNN Model Compression Oct 24, 2021 Denoising image-classification
Code Code Available 0How and When Adversarial Robustness Transfers in Knowledge Distillation? Oct 22, 2021 Adversarial Robustness Knowledge Distillation
— Unverified 0Analysis of memory consumption by neural networks based on hyperparameters Oct 21, 2021 Deep Learning Model Compression
— Unverified 0Augmenting Knowledge Distillation With Peer-To-Peer Mutual Learning For Model Compression Oct 21, 2021 Knowledge Distillation Model Compression
— Unverified 0Accelerating Framework of Transformer by Hardware Design and Model Compression Co-Optimization Oct 19, 2021 CPU GPU
— Unverified 0Pro-KD: Progressive Distillation by Following the Footsteps of the Teacher Oct 16, 2021 image-classification Image Classification
— Unverified 0A Short Study on Compressing Decoder-Based Language Models Oct 16, 2021 Decoder Knowledge Distillation
— Unverified 0HRKD: Hierarchical Relational Knowledge Distillation for Cross-domain Language Model Compression Oct 16, 2021 Few-Shot Learning Knowledge Distillation
Code Code Available 0Robustness Challenges in Model Distillation and Pruning for Natural Language Understanding Oct 16, 2021 Knowledge Distillation Model Compression
— Unverified 0Differentiable Network Pruning for Microcontrollers Oct 15, 2021 Model Compression Network Pruning
— Unverified 0Joint Channel and Weight Pruning for Model Acceleration on Moblie Devices Oct 15, 2021 Model Compression
Code Code Available 1Kronecker Decomposition for GPT Compression Oct 15, 2021 Knowledge Distillation Language Modeling
— Unverified 0A Memory-Efficient Learning Framework for SymbolLevel Precoding with Quantized NN Weights Oct 13, 2021 Model Compression Quantization
— Unverified 0Rectifying the Data Bias in Knowledge Distillation Oct 11, 2021 Face Recognition Face Verification
— Unverified 0FedDQ: Communication-Efficient Federated Learning with Descending Quantization Oct 5, 2021 Federated Learning Model Compression
— Unverified 0Robot Intent Recognition Method Based on State Grid Business Office Sep 29, 2021 Intent Detection Intent Recognition
— Unverified 0A Unified Knowledge Distillation Framework for Deep Directed Graphical Models Sep 29, 2021 Continual Learning Federated Learning
— Unverified 0KIMERA: Injecting Domain Knowledge into Vacant Transformer Heads Sep 29, 2021 Information Retrieval Model Compression
— Unverified 0Sparse Unbalanced GAN Training with In-Time Over-Parameterization Sep 29, 2021 Model Compression
— Unverified 0Prototypical Contrastive Predictive Coding Sep 29, 2021 Contrastive Learning Knowledge Distillation
— Unverified 0Model Compression via Symmetries of the Parameter Space Sep 29, 2021 model Model Compression
— Unverified 0Learning Efficient Image Super-Resolution Networks via Structure-Regularized Pruning Sep 29, 2021 Image Super-Resolution Knowledge Distillation
— Unverified 0