Related Researcher

Lee, Jimin
Radiation & Medical Intelligence Lab.

Detailed Information
Resolution-dependent MRI-to-CT translation for orthotopic breast cancer models using deep learning

Author(s)
Tessema, Abel Worku; Ambaye, Dagnachew Tessema; Ryu, Jiwon; Jeong, Jiwoo; Yu, Tosol; Lee, Jimin; Cho, Hyungjoon
Issued Date
2024-12
DOI
10.1088/1361-6560/ad9076
URI
https://scholarworks.unist.ac.kr/handle/201301/84537
Citation
PHYSICS IN MEDICINE AND BIOLOGY, v.69, no.23, pp. 235005
Abstract
This study investigates the feasibility of using generative adversarial networks (GANs) to synthesize high-fidelity computed tomography (CT) images from lower-resolution MR images. The goal is to reduce patient exposure to ionizing radiation while maintaining treatment accuracy and accelerating MR image acquisition. The primary focus is to determine the extent to which low-resolution MR images can be used to generate high-quality CT images, through a systematic study of spatial resolution-dependent magnetic resonance imaging (MRI)-to-CT image conversion.

Approach. Paired MRI-CT images were acquired from healthy control and tumor models, generated by injecting MDA-MB-231 and 4T1 tumor cells into the mammary fat pad of nude and BALB/c mice to ensure model diversity. To explore various MRI resolutions, we downscaled the highest-resolution MR image into three lower resolutions. Using a customized U-Net model, we automated region-of-interest masking for both MRI and CT modalities with precise alignment, achieved through three-dimensional affine paired MRI-CT registration. Our customized models, Nested U-Net GAN and Attention U-Net GAN, were then employed to translate low-resolution MR images into high-resolution CT images, followed by evaluation on separate testing datasets.

Main results. Our approach successfully generated high-quality CT images (0.14² mm²) from both lower-resolution (0.28² mm²) and higher-resolution (0.14² mm²) MR images, with no statistically significant differences between them, effectively doubling the speed of MR image acquisition. Our customized GANs preserved anatomical details, addressing the detail-loss issue seen in other MRI-to-CT translation techniques, across all resolutions of MR image inputs.

Significance. This study demonstrates the potential of using low-resolution MR images to generate high-quality CT images, thereby reducing radiation exposure and expediting MRI acquisition while maintaining accuracy for radiotherapy.
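To illustrate the resolution-dependence setup described in the abstract, the sketch below simulates a lower-resolution MR acquisition by block-averaging a high-resolution slice (e.g. halving 0.14 mm pixels to 0.28 mm). This is a minimal illustrative example, not the authors' code; the function name `downsample` and the random test image are assumptions for demonstration only.

```python
import numpy as np

def downsample(img: np.ndarray, factor: int) -> np.ndarray:
    """Simulate a lower-resolution acquisition by block-averaging.

    E.g. factor=2 turns 0.14 mm pixels into 0.28 mm pixels,
    mimicking the paper's downscaled MR inputs.
    """
    h, w = img.shape
    # Crop so both dimensions divide evenly by the factor
    img = img[: h - h % factor, : w - w % factor]
    return img.reshape(img.shape[0] // factor, factor,
                       img.shape[1] // factor, factor).mean(axis=(1, 3))

# Example: halve the resolution of a 256x256 "high-resolution" MR slice
hi = np.random.rand(256, 256).astype(np.float32)
lo = downsample(hi, 2)
print(lo.shape)  # (128, 128)
```

A lower-resolution input produced this way would then be fed to the translation network, with the original high-resolution CT as the target.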
Publisher
IOP PUBLISHING LTD
ISSN
0031-9155
Keyword (Author)
MRI-to-CT conversion; generative adversarial network
Keyword
NETWORKS


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.