SOTAVerified

Neural Network Compression

Papers

Showing 1–25 of 193 papers

| Title | Status | Hype |
| --- | --- | --- |
| DepGraph: Towards Any Structural Pruning | Code | 4 |
| A Survey on Deep Neural Network Pruning-Taxonomy, Comparison, Analysis, and Recommendations | Code | 2 |
| Data-Free Learning of Student Networks | Code | 2 |
| Torch2Chip: An End-to-end Customizable Deep Neural Network Compression and Deployment Toolkit for Prototype Hardware Accelerator Design | Code | 2 |
| Neural Network Compression Framework for fast model inference | Code | 2 |
| T-Basis: a Compact Representation for Neural Networks | Code | 1 |
| SwiftTron: An Efficient Hardware Accelerator for Quantized Transformers | Code | 1 |
| Neural network compression via learnable wavelet transforms | Code | 1 |
| PD-Quant: Post-Training Quantization based on Prediction Difference Metric | Code | 1 |
| Learning Filter Basis for Convolutional Neural Network Compression | Code | 1 |
| Spectral Tensor Train Parameterization of Deep Learning Layers | Code | 1 |
| SPIN: An Empirical Evaluation on Sharing Parameters of Isotropic Networks | Code | 1 |
| Prune Your Model Before Distill It | Code | 1 |
| The continuous categorical: a novel simplex-valued exponential family | Code | 1 |
| Efficient Deep Learning: A Survey on Making Deep Learning Models Smaller, Faster, and Better | Code | 1 |
| FAT: Learning Low-Bitwidth Parametric Representation via Frequency-Aware Transformation | Code | 1 |
| Distilled Split Deep Neural Networks for Edge-Assisted Real-Time Systems | Code | 1 |
| Few-Bit Backward: Quantized Gradients of Activation Functions for Memory Footprint Reduction | Code | 1 |
| CHIP: CHannel Independence-based Pruning for Compact Neural Networks | Code | 1 |
| Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-Constrained Edge Computing Systems | Code | 1 |
| NeRV: Neural Representations for Videos | Code | 1 |
| Quantisation and Pruning for Neural Network Compression and Regularisation | Code | 1 |
| REST: Robust and Efficient Neural Networks for Sleep Monitoring in the Wild | Code | 1 |
| Robustness and Transferability of Universal Attacks on Compressed Models | Code | 1 |
| Towards Meta-Pruning via Optimal Transport | Code | 1 |
Page 1 of 8

No leaderboard results yet.