This is an overview page with metadata for this scientific work. The full article is available from the publisher.
MNet: Rethinking 2D/3D Networks for Anisotropic Medical Image Segmentation
Citations: 57
Authors: 8
Year: 2022
Abstract
The nature of thick-slice scanning causes severe inter-slice discontinuities in 3D medical images, and vanilla 2D/3D convolutional neural networks (CNNs) fail to represent sparse inter-slice information and dense intra-slice information in a balanced way, leading to severe underfitting of inter-slice features (for vanilla 2D CNNs) and overfitting to noise from long-range slices (for vanilla 3D CNNs). In this work, a novel mesh network (MNet) is proposed to balance the spatial representation across axes via learning. 1) Our MNet latently fuses many representation processes by embedding multi-dimensional convolutions deeply into its basic modules, making the selection of representation processes flexible and thus adaptively balancing the representation of sparse inter-slice and dense intra-slice information. 2) Our MNet latently fuses multi-dimensional features inside each basic module, simultaneously taking advantage of 2D representations (high segmentation accuracy for regions easily recognized in a 2D view) and 3D representations (high smoothness of 3D organ contours), thus modeling target regions more accurately. Comprehensive experiments on four public datasets (CT & MR) consistently demonstrate that the proposed MNet outperforms other methods. The code and datasets are available at: https://github.com/zfdong-code/MNet
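To make the fusion idea in the abstract concrete, below is a minimal PyTorch sketch of a basic module that combines an intra-slice (2D-style, 1×3×3) convolution with a volumetric (3D, 3×3×3) convolution and lets training weight the two paths. This is an illustrative assumption, not the authors' actual MNet module; the class name `Fused2D3DBlock` and the sum-based fusion are hypothetical, and the real architecture is in the linked repository.

```python
import torch
import torch.nn as nn

class Fused2D3DBlock(nn.Module):
    """Hypothetical block fusing 2D and 3D representations, in the
    spirit of the abstract; the authors' module may differ."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        # 2D path: a 1x3x3 kernel sees only dense in-plane context.
        self.conv2d = nn.Conv3d(in_ch, out_ch, kernel_size=(1, 3, 3),
                                padding=(0, 1, 1))
        # 3D path: a 3x3x3 kernel also sees sparse inter-slice context.
        self.conv3d = nn.Conv3d(in_ch, out_ch, kernel_size=3, padding=1)
        self.norm = nn.InstanceNorm3d(out_ch)
        self.act = nn.LeakyReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Summing the two paths fuses the representations latently;
        # learned weights decide how much each axis contributes.
        return self.act(self.norm(self.conv2d(x) + self.conv3d(x)))

# Example: a thick-slice volume shaped (batch, channel, depth, H, W).
x = torch.randn(1, 1, 16, 128, 128)
y = Fused2D3DBlock(1, 32)(x)
print(y.shape)  # torch.Size([1, 32, 16, 128, 128])
```

The anisotropic 1×3×3 kernel keeps the depth axis untouched, mimicking slice-wise 2D convolution, while the 3×3×3 path propagates inter-slice information; summing before normalization is one simple way to let the network balance the two adaptively.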
Related Works
Deep Residual Learning for Image Recognition
2016 · 219,281 citations
U-Net: Convolutional Networks for Biomedical Image Segmentation
2015 · 87,509 citations
ImageNet classification with deep convolutional neural networks
2017 · 75,673 citations
Very Deep Convolutional Networks for Large-Scale Image Recognition
2014 · 75,503 citations
Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks
2016 · 53,477 citations