Code is available for every paper listed below.

- General Instance Distillation for Object Detection (Mar 3, 2021). Tags: Knowledge Distillation, Model Compression
- An Information-Theoretic Justification for Model Pruning (Feb 16, 2021). Tags: Data Compression, Model Compression
- FAT: Learning Low-Bitwidth Parametric Representation via Frequency-Aware Transformation (Feb 15, 2021). Tags: Model Compression, Neural Network Compression
- LightSpeech: Lightweight and Fast Text to Speech with Neural Architecture Search (Feb 8, 2021). Tags: CPU, Model Compression
- Topology-Aware Network Pruning using Multi-stage Graph Embedding and Reinforcement Learning (Feb 5, 2021). Tags: Graph Embedding, Model Compression
- Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching (Feb 5, 2021). Tags: General Knowledge, Knowledge Distillation
- Improving Neural Network Efficiency via Post-Training Quantization With Adaptive Floating-Point (Jan 1, 2021). Tags: Model Compression, Quantization
- Improve Object Detection with Feature-based Knowledge Distillation: Towards Accurate and Efficient Detectors (Jan 1, 2021). Tags: Image Classification
- EarlyBERT: Efficient BERT Training via Early-bird Lottery Tickets (Dec 31, 2020). Tags: Model Compression
- Computation-Efficient Knowledge Distillation via Uncertainty-Aware Mixup (Dec 17, 2020). Tags: Informativeness, Knowledge Distillation
- Neural Pruning via Growing Regularization (Dec 16, 2020). Tags: L2 Regularization, Model Compression
- Progressive Network Grafting for Few-Shot Knowledge Distillation (Dec 9, 2020). Tags: Knowledge Distillation, Model Compression
- DE-RRD: A Knowledge Distillation Framework for Recommender System (Dec 8, 2020). Tags: Knowledge Distillation, Model Compression
- Going Beyond Classification Accuracy Metrics in Model Compression (Dec 3, 2020). Tags: Classification, Edge Computing
- Multi-level Knowledge Distillation via Knowledge Alignment and Correlation (Dec 1, 2020). Tags: Contrastive Learning, Knowledge Distillation
- KD-Lib: A PyTorch library for Knowledge Distillation, Pruning and Quantization (Nov 30, 2020). Tags: Knowledge Distillation, Model Compression
- Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-Constrained Edge Computing Systems (Nov 20, 2020). Tags: Edge Computing, Image Classification
- HAWQV3: Dyadic Neural Network Quantization (Nov 20, 2020). Tags: Model Compression, Quantization
- Gaussian RAM: Lightweight Image Classification via Stochastic Retina-Inspired Glimpse and Reinforcement Learning (Nov 12, 2020). Tags: Classification, General Classification
- VEGA: Towards an End-to-End Configurable AutoML Pipeline (Nov 3, 2020). Tags: AutoML, BIG-bench Machine Learning
- Passport-aware Normalization for Deep Model Protection (Oct 29, 2020). Tags: Model Compression
- CompRess: Self-Supervised Learning by Compressing Representations (Oct 28, 2020). Tags: Linear Evaluation, Model Compression
- Towards Compact Neural Networks via End-to-End Training: A Bayesian Tensor Approach with Automatic Rank Determination (Oct 17, 2020). Tags: Model Compression, Tensor Decomposition
- BERT-EMD: Many-to-Many Layer Mapping for BERT Compression with Earth Mover's Distance (Oct 13, 2020). Tags: Model Compression
- Contrastive Distillation on Intermediate Representations for Language Model Compression (Sep 29, 2020). Tags: Knowledge Distillation, Language Modeling
- Densely Guided Knowledge Distillation using Multiple Teacher Assistants (Sep 18, 2020). Tags: Knowledge Distillation, Model Compression
- Implicit Regularization via Neural Feature Alignment (Aug 3, 2020). Tags: Feature Selection, Model Compression
- Paying more attention to snapshots of Iterative Pruning: Improving Model Compression via Ensemble Distillation (Jun 20, 2020). Tags: Image Classification
- Improving Post Training Neural Quantization: Layer-wise Calibration and Integer Programming (Jun 14, 2020). Tags: Model Compression, Quantization
- Knowledge Distillation Meets Self-Supervision (Jun 12, 2020). Tags: Contrastive Learning, Knowledge Distillation
- Communication-Computation Trade-Off in Resource-Constrained Edge Inference (Jun 3, 2020). Tags: Edge Computing, Model Compression
- Online Knowledge Distillation via Collaborative Learning (Jun 1, 2020). Tags: Knowledge Distillation, Model Compression
- Position-based Scaled Gradient for Model Quantization and Pruning (May 22, 2020). Tags: Model Compression, Position
- TinyLSTMs: Efficient Neural Speech Enhancement for Hearing Aids (May 20, 2020). Tags: Model Compression, Quantization
- MicroNet for Efficient Language Modeling (May 16, 2020). Tags: Knowledge Distillation, Language Modeling
- Data-Free Network Quantization With Adversarial Knowledge Distillation (May 8, 2020). Tags: Knowledge Distillation, Model Compression
- WoodFisher: Efficient Second-Order Approximation for Neural Network Compression (Apr 29, 2020). Tags: Image Classification
- Training with Quantization Noise for Extreme Model Compression (Apr 15, 2020). Tags: Image Classification, Image Generation
- KD-MRI: A knowledge distillation framework for image reconstruction and image restoration in MRI workflow (Apr 11, 2020). Tags: CPU, GPU
- Orthant Based Proximal Stochastic Gradient Method for ℓ1-Regularized Optimization (Apr 7, 2020). Tags: Feature Selection, Model Compression
- Variational Bayesian Quantization (Feb 18, 2020). Tags: Image Compression, Model Compression
- BERT-of-Theseus: Compressing BERT by Progressive Module Replacing (Feb 7, 2020). Tags: Knowledge Distillation, Model Compression
- Discrimination-aware Network Pruning for Deep Model Compression (Jan 4, 2020). Tags: Face Recognition, Image Classification
- ZeroQ: A Novel Zero Shot Quantization Framework (Jan 1, 2020). Tags: Data Free Quantization, Model Compression
- Learning from a Teacher using Unlabeled Data (Nov 13, 2019). Tags: Knowledge Distillation, Model Compression
- Contrastive Representation Distillation (Oct 23, 2019). Tags: Contrastive Learning, Knowledge Distillation
- Compacting, Picking and Growing for Unforgetting Continual Learning (Oct 15, 2019). Tags: Age And Gender Classification, Continual Learning
- Structured Pruning of Large Language Models (Oct 10, 2019). Tags: Language Modeling
- Distilled Split Deep Neural Networks for Edge-Assisted Real-Time Systems (Oct 1, 2019). Tags: Edge Computing, Image Classification
- Global Sparse Momentum SGD for Pruning Very Deep Neural Networks (Sep 27, 2019). Tags: Model Compression
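Many of the entries above (e.g. Contrastive Representation Distillation, Online Knowledge Distillation via Collaborative Learning) build on the classic distillation objective: matching the student's temperature-softened output distribution to the teacher's. As orientation only, here is a minimal pure-Python sketch of that standard loss; it is not taken from any listed paper, and the function names are illustrative:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 so gradients keep roughly the same magnitude as T varies."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# Identical logits give zero loss; any mismatch gives a positive loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # 0.0
print(distillation_loss([0.0, 3.0, 0.0], [3.0, 0.0, 0.0]) > 0)  # True
```

In practice this term is combined with the ordinary cross-entropy on hard labels, weighted by a mixing coefficient.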