
HMANet: Hybrid Multi-Axis Aggregation Network for Image Super-Resolution

2024-05-08 · Code Available

Shu-Chuan Chu, Zhi-Chao Dou, Jeng-Shyang Pan, Shaowei Weng, Junbao Li


Abstract

Transformer-based methods have demonstrated excellent performance on super-resolution tasks, surpassing conventional convolutional neural networks. However, existing work typically restricts self-attention computation to non-overlapping windows to save computational cost, which means Transformer-based networks can only use input information from a limited spatial range. Therefore, this paper proposes a novel Hybrid Multi-Axis Aggregation network (HMA) to better exploit the potential information in features. HMA is constructed by stacking Residual Hybrid Transformer Blocks (RHTB) and Grid Attention Blocks (GAB). On the one hand, RHTB combines channel attention and self-attention to enhance non-local feature fusion and produce more visually pleasing results. On the other hand, GAB is used for cross-domain information interaction to jointly model similar features and obtain a larger receptive field. In addition, a novel pre-training method is designed for the super-resolution task to further enhance the model's representation capability, and its effectiveness is validated through extensive experiments. The experimental results show that HMA outperforms state-of-the-art methods on benchmark datasets. Code and models are provided at https://github.com/korouuuuu/HMA.
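
The abstract's architecture outline (RHTBs that mix self-attention with channel attention, interleaved with GABs, followed by upsampling) can be illustrated with a minimal PyTorch-style sketch. The block internals below are simplified placeholders, not the authors' released implementation at the GitHub link above: the self-attention here is global rather than windowed, and the grid attention is stood in for by a dilated convolution.

```python
# Minimal sketch of the stacking pattern described in the abstract.
# Assumes PyTorch; RHTB/GAB internals are simplified placeholders.
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (simplified)."""
    def __init__(self, dim, reduction=4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(dim, dim // reduction, 1), nn.ReLU(inplace=True),
            nn.Conv2d(dim // reduction, dim, 1), nn.Sigmoid())
    def forward(self, x):
        return x * self.fc(self.pool(x))

class RHTB(nn.Module):
    """Placeholder RHTB: token self-attention (global here for simplicity;
    the paper uses windowed attention) fused with channel attention, residual."""
    def __init__(self, dim, num_heads=6):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.ca = ChannelAttention(dim)
        self.conv = nn.Conv2d(dim, dim, 3, padding=1)
    def forward(self, x):
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)           # (B, H*W, C)
        attn_out, _ = self.attn(tokens, tokens, tokens)
        attn_out = attn_out.transpose(1, 2).reshape(b, c, h, w)
        return x + self.conv(self.ca(attn_out))

class GAB(nn.Module):
    """Placeholder GAB: a dilated conv stands in for grid attention,
    enlarging the receptive field for cross-region interaction."""
    def __init__(self, dim):
        super().__init__()
        self.grid = nn.Conv2d(dim, dim, 3, padding=2, dilation=2)
    def forward(self, x):
        return x + self.grid(x)

class HMA(nn.Module):
    """Stack of RHTB/GAB pairs with a residual body and pixel-shuffle upsampler."""
    def __init__(self, dim=60, n_blocks=6, scale=4):
        super().__init__()
        self.head = nn.Conv2d(3, dim, 3, padding=1)
        body = []
        for _ in range(n_blocks):
            body += [RHTB(dim), GAB(dim)]
        self.body = nn.Sequential(*body)
        self.tail = nn.Sequential(
            nn.Conv2d(dim, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale))
    def forward(self, lr):
        feat = self.head(lr)
        feat = feat + self.body(feat)
        return self.tail(feat)

if __name__ == "__main__":
    sr = HMA(scale=4)(torch.randn(1, 3, 64, 64))
    print(sr.shape)  # torch.Size([1, 3, 256, 256])
```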

Tasks

Image Super-Resolution

Benchmark Results

Dataset                 | Model | Metric | Claimed | Verified | Status
BSD100 - 2x upscaling   | HMA†  | PSNR   | 32.79   | —        | Unverified
BSD100 - 3x upscaling   | HMA†  | PSNR   | 29.66   | —        | Unverified
BSD100 - 4x upscaling   | HMA†  | PSNR   | 28.13   | —        | Unverified
Manga109 - 2x upscaling | HMA†  | PSNR   | 41.13   | —        | Unverified
Manga109 - 3x upscaling | HMA†  | PSNR   | 36.1    | —        | Unverified
Manga109 - 4x upscaling | HMA†  | SSIM   | 0.93    | —        | Unverified
Set14 - 2x upscaling    | HMA†  | PSNR   | 35.33   | —        | Unverified
Set14 - 3x upscaling    | HMA†  | PSNR   | 31.47   | —        | Unverified
Set14 - 4x upscaling    | HMA†  | PSNR   | 29.51   | —        | Unverified
Set5 - 2x upscaling     | HMA†  | PSNR   | 38.95   | —        | Unverified
Set5 - 3x upscaling     | HMA†  | PSNR   | 35.35   | —        | Unverified
Set5 - 4x upscaling     | HMA†  | PSNR   | 33.38   | —        | Unverified
Urban100 - 2x upscaling | HMA†  | PSNR   | 35.24   | —        | Unverified
Urban100 - 3x upscaling | HMA†  | PSNR   | 31      | —        | Unverified
Urban100 - 4x upscaling | HMA†  | PSNR   | 28.69   | —        | Unverified
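
The claimed values above are PSNR in dB (SSIM for the Manga109 4x entry). Below is a minimal sketch of the conventional way such PSNR numbers are computed for super-resolution benchmarks, assuming NumPy, 8-bit RGB inputs, Y-channel evaluation, and border cropping by the scale factor; the exact protocol behind these claimed numbers may differ, and the function names are illustrative.

```python
# Illustrative PSNR check for the benchmark table; assumes NumPy and 8-bit RGB arrays.
# Super-resolution papers usually report PSNR on the Y channel with `scale` pixels
# cropped from each border; conversion and cropping details can vary per paper.
import numpy as np

def rgb_to_y(img):
    """ITU-R BT.601 luma from an 8-bit RGB array of shape (H, W, 3)."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    return 16.0 + (65.481 * r + 128.553 * g + 24.966 * b) / 255.0

def psnr(sr, hr, scale):
    """PSNR in dB between a super-resolved image and its ground truth."""
    sr_y = rgb_to_y(sr.astype(np.float64))[scale:-scale, scale:-scale]
    hr_y = rgb_to_y(hr.astype(np.float64))[scale:-scale, scale:-scale]
    mse = np.mean((sr_y - hr_y) ** 2)
    return 10.0 * np.log10(255.0 ** 2 / mse)
```

For example, psnr(sr, hr, scale=4) averaged over BSD100 would be the number compared against the claimed 28.13 dB in the table.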

Reproductions

No reproductions have been submitted yet.