
Multi-Task Accelerated MR Reconstruction Schemes for Jointly Training Multiple Contrasts

2021-10-19 · NeurIPS Workshop Deep Inverse 2021 · Code Available

Victoria Liu, Kanghyun Ryu, Cagan Alkan, John M. Pauly, Shreyas Vasanawala


Abstract

Model-based accelerated MRI reconstruction methods leverage large datasets to reconstruct diagnostic-quality images from undersampled k-space. These networks require matching training and test-time distributions to achieve high-quality reconstructions. However, MR datasets are inherently variable, spanning different contrasts, orientations, anatomies, and institution-specific protocols. The current paradigm is to train a separate model for each dataset, but this is a demanding process and cannot exploit information that may be shared across datasets. To address this issue, we propose multi-task learning (MTL) schemes that jointly reconstruct multiple datasets. We test multiple MTL architectures and weighted loss functions against single-task learning (STL) baselines. Our quantitative and qualitative results suggest that MTL can outperform STL across a range of dataset ratios for two knee contrasts.
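The abstract mentions weighted loss functions for jointly training tasks with different dataset ratios. As a minimal sketch (not the authors' implementation), one common way to combine per-contrast reconstruction losses is a weighted sum, with weights chosen to offset imbalanced dataset sizes; the function name and weighting rule below are illustrative assumptions.

```python
# Hypothetical sketch of a weighted multi-task loss: each task's
# reconstruction loss is weighted by the inverse of its dataset size,
# so under-represented contrasts are not drowned out during training.

def weighted_mtl_loss(task_losses, dataset_sizes):
    """Combine per-task losses with inverse-size weights.

    task_losses   -- list of scalar reconstruction losses, one per task
    dataset_sizes -- parallel list of training-set sizes per task
    """
    inv = [1.0 / n for n in dataset_sizes]
    total = sum(inv)
    weights = [w / total for w in inv]  # normalize weights to sum to 1
    return sum(w * l for w, l in zip(weights, task_losses))

# Example: two knee contrasts with a 4:1 dataset ratio.
# The smaller dataset (1000 scans) receives the larger weight (0.8).
loss = weighted_mtl_loss([0.8, 1.2], [4000, 1000])
```

In practice the weights could also be learned (e.g. via uncertainty weighting) or tuned as hyperparameters; this fixed inverse-size rule is only one option under the stated assumption of imbalanced dataset ratios.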
