Large-scale Medical Image Processing
|dc.contributor.advisor||Chun, Se Young||-|
|dc.description||Department of Electrical Engineering||-|
|dc.description.abstract||Deep learning based approaches for vision have motivated many researchers in the medical image processing field due to their powerful performance. Compared to natural image datasets, medical image datasets commonly consume huge amounts of memory and have complex data structures. In addition, large-scale images for clinical purposes, such as CT scans or pathological image data, are known to be difficult to process, so the direct application of conventional deep models with typical GPU usage must be reconsidered. For example, pathological data consist of microscope images of human cells used to classify whether cells are tumorous; each image slide is far larger than a natural high-resolution image, while the field of view (FOV) containing the region of interest is tiny. On the other hand, when handling large-scale CT data, which use X-ray beams to visualize in-vivo hard structures, the memory limitation of the GPU device forces patch-wise methods that both limit performance and slow computation. Thus, in this paper, we investigate how a data balancing method effectively enhances deep learning approaches when only an unbalanced dataset is available. Furthermore, we propose an efficient multi-GPU memory utilization method for deep learning with large-scale CT images.||-|
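The data balancing idea mentioned in the abstract can be sketched as inverse-frequency weighted sampling, where rare classes (e.g. tumor patches) are drawn as often as common ones. The snippet below is a minimal hypothetical illustration using only the standard library; `balanced_sample` is an assumed helper name, not the thesis's actual method.

```python
import random
from collections import Counter

def balanced_sample(labels, n, seed=0):
    """Draw n indices with per-class weights inversely proportional
    to class frequency, so each class is sampled roughly equally.
    Hypothetical sketch, not the thesis's actual balancing method."""
    counts = Counter(labels)
    # A sample from a rare class gets a proportionally larger weight.
    weights = [1.0 / counts[y] for y in labels]
    rng = random.Random(seed)
    return rng.choices(range(len(labels)), weights=weights, k=n)

# Unbalanced toy dataset: 90 "normal" patches, 10 "tumor" patches.
labels = ["normal"] * 90 + ["tumor"] * 10
idx = balanced_sample(labels, 1000)
drawn = Counter(labels[i] for i in idx)
# After weighting, each class appears roughly half the time.
```

In practice such weights are often plugged into a framework's weighted sampler rather than computed by hand, but the inverse-frequency principle is the same.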
|dc.publisher||Graduate School of UNIST||-|
|dc.title||Large-scale Medical Image Processing||-|