File Download

There are no files associated with this item.

  • Find it @ UNIST can give you direct access to the published full text of this article. (UNISTARs only)
Related Researcher

이지민

Lee, Jimin
Radiation & Medical Intelligence Lab.

Detailed Information

Metadata Downloads

Full metadata record

DC Field Value Language
dc.citation.number 23 -
dc.citation.startPage 235005 -
dc.citation.title PHYSICS IN MEDICINE AND BIOLOGY -
dc.citation.volume 69 -
dc.contributor.author Tessema, Abel Worku -
dc.contributor.author Ambaye, Dagnachew Tessema -
dc.contributor.author Ryu, Jiwon -
dc.contributor.author Jeong, Jiwoo -
dc.contributor.author Yu, Tosol -
dc.contributor.author Lee, Jimin -
dc.contributor.author Cho, Hyungjoon -
dc.date.accessioned 2024-11-22T14:35:08Z -
dc.date.available 2024-11-22T14:35:08Z -
dc.date.created 2024-11-21 -
dc.date.issued 2024-12 -
dc.description.abstract This study aims to investigate the feasibility of utilizing generative adversarial networks (GANs) to synthesize high-fidelity computed tomography (CT) images from lower-resolution MR images. The goal is to reduce patient exposure to ionizing radiation while maintaining treatment accuracy and accelerating MR image acquisition. The primary focus is to determine the extent to which low-resolution MR images can be utilized to generate high-quality CT images through a systematic study of spatial resolution-dependent magnetic resonance imaging (MRI)-to-CT image conversion. Approach. Paired MRI-CT images were acquired from healthy control and tumor models, generated by injecting MDA-MB-231 and 4T1 tumor cells into the mammary fat pad of nude and BALB/c mice to ensure model diversification. To explore various MRI resolutions, we downscaled the highest-resolution MR image into three lower resolutions. Using a customized U-Net model, we automated region-of-interest masking for both MRI and CT modalities with precise alignment, achieved through three-dimensional affine paired MRI-CT registrations. Then our customized models, Nested U-Net GAN and Attention U-Net GAN, were employed to translate low-resolution MR images into high-resolution CT images, followed by evaluation with separate testing datasets. Main Results. Our approach successfully generated high-quality CT images (0.14² mm²) from both lower-resolution (0.28² mm²) and higher-resolution (0.14² mm²) MR images, with no statistically significant differences between them, effectively doubling the speed of MR image acquisition. Our customized GANs successfully preserved anatomical details, addressing the typical loss issue seen in other MRI-CT translation techniques across all resolutions of MR image inputs. Significance. This study demonstrates the potential of using low-resolution MR images to generate high-quality CT images, thereby reducing radiation exposure and expediting MRI acquisition while maintaining accuracy for radiotherapy. -
dc.identifier.bibliographicCitation PHYSICS IN MEDICINE AND BIOLOGY, v.69, no.23, pp. 235005 -
dc.identifier.doi 10.1088/1361-6560/ad9076 -
dc.identifier.issn 0031-9155 -
dc.identifier.scopusid 2-s2.0-85209926012 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/84537 -
dc.identifier.wosid 001361037400001 -
dc.language English -
dc.publisher IOP PUBLISHING LTD -
dc.title Resolution-dependent MRI-to-CT translation for orthotopic breast cancer models using deep learning -
dc.type Article -
dc.description.isOpenAccess FALSE -
dc.relation.journalWebOfScienceCategory Engineering, Biomedical;Radiology -
dc.relation.journalResearchArea Engineering;Radiology -
dc.type.docType Article -
dc.description.journalRegisteredClass scie -
dc.description.journalRegisteredClass scopus -
dc.subject.keywordAuthor MRI-to-CT conversion -
dc.subject.keywordAuthor generative adversarial network -
dc.subject.keywordPlus NETWORKS -

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.