This is an overview page with metadata for this scientific paper. The full article is available from the publisher.
ReconFormer: Accelerated MRI Reconstruction Using Recurrent Transformer
87 citations · 5 authors · 2023
Abstract
Accelerated magnetic resonance imaging (MRI) reconstruction is a challenging ill-posed inverse problem due to the excessive under-sampling operation in k-space. In this paper, we propose a recurrent Transformer model, namely ReconFormer, for MRI reconstruction, which can iteratively reconstruct high-fidelity magnetic resonance images from highly under-sampled k-space data (e.g., up to 8× acceleration). In particular, the proposed architecture is built upon Recurrent Pyramid Transformer Layers (RPTLs). The core design of the proposed method is Recurrent Scale-wise Attention (RSA), which jointly exploits intrinsic multi-scale information at every architecture unit as well as the dependencies of deep feature correlation through recurrent states. Moreover, benefiting from its recurrent nature, ReconFormer is lightweight compared to other baselines and contains only 1.1M trainable parameters. We validate the effectiveness of ReconFormer on multiple datasets with different magnetic resonance sequences and show that it achieves significant improvements over state-of-the-art methods with better parameter efficiency. The implementation code and pre-trained weights are available at https://github.com/guopengf/ReconFormer.
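The abstract describes iterating between a learned refinement and the measured under-sampled k-space data. The sketch below illustrates that general unrolled data-consistency pattern on a toy 1-D signal; it is a generic illustration of the reconstruction setting, not the ReconFormer architecture itself, and the Transformer refinement is replaced by a dummy scaling step for brevity.

```python
import cmath

def dft(x):
    # naive 1-D discrete Fourier transform (the toy "k-space" operator)
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    # inverse DFT back to "image" space
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def data_consistency(x, y, mask):
    # enforce agreement with the measured k-space samples:
    # keep predicted k-space values only where no measurement exists
    X = dft(x)
    X_dc = [y[k] if mask[k] else X[k] for k in range(len(X))]
    return idft(X_dc)

# toy "image" and fully sampled k-space
img = [complex(v) for v in [0, 1, 2, 3, 2, 1, 0, 1]]
full_k = dft(img)

# 2x under-sampling: keep every other k-space line
mask = [1, 0, 1, 0, 1, 0, 1, 0]
y = [full_k[k] if mask[k] else 0 for k in range(len(full_k))]

# zero-filled starting point, then a few unrolled iterations
x = idft(y)
for _ in range(3):
    x = [0.9 * v for v in x]          # placeholder for the learned refinement
    x = data_consistency(x, y, mask)  # restore measured k-space samples
```

After each iteration the measured k-space lines of the estimate match the acquired data exactly; in ReconFormer the placeholder refinement is instead the recurrent Transformer, which shares weights across iterations and so keeps the parameter count small.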
Similar Works
Advances in functional and structural MR image analysis and implementation as FSL
2004 · 13,969 citations
A default mode of brain function
2001 · 12,296 citations
FSL
2011 · 11,553 citations
Improved Optimization for the Robust and Accurate Linear Registration and Motion Correction of Brain Images
2002 · 10,586 citations
Functional connectivity in the motor cortex of resting human brain using echo‐planar mri
1995 · 9,996 citations