Related Researcher

Lyu, Ilwoo (류일우)
3D Shape Analysis Lab.

Fully convolutional neural networks improve abdominal organ segmentation

Author(s)
Bobo, M.F.; Bao, S.; Huo, Y.; Yao, Y.; Virostko, J.; Plassard, A.J.; Lyu, Ilwoo; Assad, A.; Abramson, R.G.; Hilmes, M.A.; Landman, B.A.
Issued Date
2018-02-11
DOI
10.1117/12.2293751
URI
https://scholarworks.unist.ac.kr/handle/201301/50144
Citation
Medical Imaging 2018: Image Processing
Abstract
Abdominal image segmentation is a challenging yet important clinical problem. Variations in body size, position, and relative organ positions greatly complicate the segmentation process. Historically, multi-atlas methods have achieved leading results across imaging modalities and anatomical targets. However, deep learning is rapidly overtaking classical approaches for image segmentation. Recently, Zhou et al. showed that fully convolutional networks produce excellent results in abdominal organ segmentation of computed tomography (CT) scans. Yet, deep learning approaches have not been applied to whole-abdomen magnetic resonance imaging (MRI) segmentation. Herein, we evaluate the applicability of an existing fully convolutional neural network (FCNN) designed for CT imaging to segment abdominal organs on T2-weighted (T2w) MRIs with two examples. In the primary example, we compare a classical multi-atlas approach with the FCNN on forty-five T2w MRIs acquired from splenomegaly patients with five organs labeled (liver, spleen, left kidney, right kidney, and stomach). Thirty-six images were used for training and nine for testing. The FCNN achieved a Dice similarity coefficient (DSC) of 0.930 for spleens, 0.730 for left kidneys, 0.780 for right kidneys, 0.913 for livers, and 0.556 for stomachs. The performance measures for livers, spleens, right kidneys, and stomachs were significantly better than multi-atlas (p < 0.05, Wilcoxon rank-sum test). In a secondary example, we compare the multi-atlas approach with the FCNN on 138 distinct T2w MRIs with manually labeled pancreases (one label). On the pancreas dataset, the FCNN achieved a median DSC of 0.691 versus 0.287 for multi-atlas. The results are highly promising given the relatively limited training data and the absence of task-specific training of the FCNN model, and they illustrate the potential of deep learning approaches to transcend imaging modalities. © COPYRIGHT SPIE. Downloading of the abstract is permitted for personal use only.
Publisher
SPIE
ISSN
1605-7422
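
The abstract reports Dice similarity coefficients (DSC) and a Wilcoxon rank-sum comparison between the FCNN and multi-atlas results. For reference only, the sketch below shows how a DSC could be computed between a predicted and a manual binary segmentation mask and how such a comparison could be run; it is an illustrative example, not code from the paper, and the names (dice_coefficient, fcnn_dsc_scores, multiatlas_dsc_scores) are hypothetical.

import numpy as np
from scipy.stats import ranksums

def dice_coefficient(pred, truth):
    # Dice similarity coefficient between two binary masks:
    # 2 * |A ∩ B| / (|A| + |B|)
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * intersection / denom if denom > 0 else 1.0

# Toy 3D masks (hypothetical shapes, not the paper's data)
pred = np.zeros((4, 4, 4), dtype=bool); pred[1:3, 1:3, 1:3] = True
truth = np.zeros((4, 4, 4), dtype=bool); truth[1:3, 1:3, 1:4] = True
print(dice_coefficient(pred, truth))  # 0.8 for these toy masks

# Per-organ DSC scores from the two methods could then be compared as in the abstract:
# stat, p = ranksums(fcnn_dsc_scores, multiatlas_dsc_scores)  # p < 0.05 => significant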
