Semi-Online Knowledge Distillation (Nov 23, 2021). Knowledge Distillation, Model Compression. Code available.
Automatic Mapping of the Best-Suited DNN Pruning Schemes for Real-Time Mobile Acceleration (Nov 22, 2021). Model Compression. Code available.
Local-Selective Feature Distillation for Single Image Super-Resolution (Nov 22, 2021). Image Super-Resolution, Knowledge Distillation.
Structured Pruning Learns Compact and Accurate Models (Nov 16, 2021). Model Compression.
Weight Squeezing: Reparameterization for Knowledge Transfer and Model Compression (Nov 16, 2021). Model Compression, Text Classification.
Learning-Based Symbol Level Precoding: A Memory-Efficient Unsupervised Learning Approach (Nov 15, 2021). Model Compression.
Domain Generalization on Efficient Acoustic Scene Classification using Residual Normalization (Nov 12, 2021). Acoustic Scene Classification, Classification.
Learning Interpretation with Explainable Knowledge Distillation (Nov 12, 2021). Knowledge Distillation, Model Compression.
SEOFP-NET: Compression and Acceleration of Deep Neural Networks for Speech Enhancement Using Sign-Exponent-Only Floating-Points (Nov 8, 2021). Model Compression, Regression.
A Survey on Green Deep Learning (Nov 8, 2021). Deep Learning, Knowledge Distillation.
Oracle Teacher: Leveraging Target Information for Better Knowledge Distillation of CTC Models (Nov 5, 2021). Knowledge Distillation, Machine Translation.
Weight, Block or Unit? Exploring Sparsity Tradeoffs for Speech Enhancement on Tiny Neural Accelerators (Nov 3, 2021). Model Compression, Speech Enhancement.
How to Select One Among All? An Empirical Study Towards the Robustness of Knowledge Distillation in Natural Language Understanding (Nov 1, 2021). Adversarial Robustness.
ILMPQ: An Intra-Layer Multi-Precision Deep Neural Network Quantization Framework for FPGA (Oct 30, 2021). Edge Computing, Model Compression.
On Cross-Layer Alignment for Model Fusion of Heterogeneous Neural Networks (Oct 29, 2021). Knowledge Distillation, Model Compression.
Reconstructing Pruned Filters using Cheap Spatial Transformations (Oct 25, 2021). Feature Compression, Knowledge Distillation.
Exploring Gradient Flow Based Saliency for DNN Model Compression (Oct 24, 2021). Denoising, Image Classification.
How and When Adversarial Robustness Transfers in Knowledge Distillation? (Oct 22, 2021). Adversarial Robustness, Knowledge Distillation. Code available.
Analysis of memory consumption by neural networks based on hyperparameters (Oct 21, 2021). Deep Learning, Model Compression.
Augmenting Knowledge Distillation With Peer-To-Peer Mutual Learning For Model Compression (Oct 21, 2021). Knowledge Distillation, Model Compression.
Accelerating Framework of Transformer by Hardware Design and Model Compression Co-Optimization (Oct 19, 2021). CPU, GPU.
HRKD: Hierarchical Relational Knowledge Distillation for Cross-domain Language Model Compression (Oct 16, 2021). Few-Shot Learning, Knowledge Distillation.
A Short Study on Compressing Decoder-Based Language Models (Oct 16, 2021). Decoder, Knowledge Distillation. Code available.
Robustness Challenges in Model Distillation and Pruning for Natural Language Understanding (Oct 16, 2021). Knowledge Distillation, Model Compression.
Pro-KD: Progressive Distillation by Following the Footsteps of the Teacher (Oct 16, 2021). Image Classification.
Kronecker Decomposition for GPT Compression (Oct 15, 2021). Knowledge Distillation, Language Modeling.
Differentiable Network Pruning for Microcontrollers (Oct 15, 2021). Model Compression, Network Pruning.
A Memory-Efficient Learning Framework for Symbol-Level Precoding with Quantized NN Weights (Oct 13, 2021). Model Compression, Quantization.
Rectifying the Data Bias in Knowledge Distillation (Oct 11, 2021). Face Recognition, Face Verification.
FedDQ: Communication-Efficient Federated Learning with Descending Quantization (Oct 5, 2021). Federated Learning, Model Compression.
KIMERA: Injecting Domain Knowledge into Vacant Transformer Heads (Sep 29, 2021). Information Retrieval, Model Compression.
A Unified Knowledge Distillation Framework for Deep Directed Graphical Models (Sep 29, 2021). Continual Learning, Federated Learning.
Sparse Unbalanced GAN Training with In-Time Over-Parameterization (Sep 29, 2021). Model Compression.
HFSP: A Hardware-friendly Soft Pruning Framework for Vision Transformers (Sep 29, 2021). Image Classification.
Model Compression via Symmetries of the Parameter Space (Sep 29, 2021). Model Compression.
Learning Efficient Image Super-Resolution Networks via Structure-Regularized Pruning (Sep 29, 2021). Image Super-Resolution, Knowledge Distillation.
Robot Intent Recognition Method Based on State Grid Business Office (Sep 29, 2021). Intent Detection, Intent Recognition.
Prototypical Contrastive Predictive Coding (Sep 29, 2021). Contrastive Learning, Knowledge Distillation.
Bayesian Optimization with Clustering and Rollback for CNN Auto Pruning (Sep 22, 2021). Bayesian Optimization, Clustering.
Classification-based Quality Estimation: Small and Efficient Models for Real-world Applications (Sep 17, 2021). Machine Translation, Model Compression. Code available.
Experimental implementation of a neural network optical channel equalizer in restricted hardware using pruning and quantization (Sep 15, 2021). CPU, Edge Computing.
A Note on Knowledge Distillation Loss Function for Object Classification (Sep 14, 2021). Knowledge Distillation, Model Compression.
Multihop: Leveraging Complex Models to Learn Accurate Simple Models (Sep 14, 2021). Explainable Artificial Intelligence, Knowledge Distillation.
KroneckerBERT: Learning Kronecker Decomposition for Pre-trained Language Models via Knowledge Distillation (Sep 13, 2021). Knowledge Distillation, Language Modeling.
Causal Explanation of Convolutional Neural Networks (Sep 13, 2021). Counterfactual Explanation.
BioNetExplorer: Architecture-Space Exploration of Bio-Signal Processing Deep Neural Networks for Wearables (Sep 7, 2021). Model Compression. Code available.
GDP: Stabilized Neural Network Pruning via Gates with Differentiable Polarization (Sep 6, 2021). Channel Selection, Model Compression.
Full-Cycle Energy Consumption Benchmark for Low-Carbon Computer Vision (Aug 30, 2021). Deep Learning, Model Compression.
Lipschitz Continuity Guided Knowledge Distillation (Aug 29, 2021). Knowledge Distillation, Model Compression.
DKM: Differentiable K-Means Clustering Layer for Neural Network Compression (Aug 28, 2021). Clustering, Model Compression.