In-silico Histopathology using Near-Infrared Quantitative Phase Imaging

Author(s)
Aimakov, Nurbolat
Advisor
Jung, Woonggyu
Issued Date
2024-08
URI
https://scholarworks.unist.ac.kr/handle/201301/84107
http://unist.dcollection.net/common/orgView/200000813757
Abstract
Histological staining has been established as the fundamental tool in disease diagnostics, as biological specimens inherently exhibit negligible light absorption. Certain cells and tissue structures of interest are highlighted through the application of various combinations of histological dyes. Although considered the gold standard, the staining process can be time-consuming, costly, and labor-intensive. In addition, the resulting image quality depends on the sample preparation protocols, the technician's skills, the state of the staining equipment, etc. As an alternative to brightfield (BF) imaging, phase-contrast imaging modalities, e.g., Zernike's Phase Contrast and Nomarski Differential Interference Contrast, are able to convert the phase delay of light introduced by the optical thickness of the specimen into intensity variations detectable by a camera. Among them, quantitative phase imaging (QPI) techniques have been gaining popularity among research groups around the world for their ability to provide quantitative information about the specimen's optical thickness distribution. Despite the advantages that QPI brings to the field of digital histopathology, transitioning these techniques into the clinic in their pure form is impractical, as medical personnel might not be accustomed to the image contrast that QPI provides. Hence, to bridge this gap, deep learning techniques have been extensively utilized to digitally generate histological stains using trained neural networks. However, conventional approaches to virtual staining involve complicated image registration steps to obtain precisely aligned image pairs for the training process. This is mostly caused by two factors. First, imaging the sample with label-free and brightfield microscopes is generally separated from staining the specimen with the target dyes; hence, the sample might undergo structural changes due to physical damage or chemical exposure. Second, images acquired from two different optical systems generally tend to have translational, rotational, sampling, and other mismatches. While these issues may pose little difficulty for image processing and deep learning experts, they dramatically complicate the user experience for conventional pathologists and medical personnel. Therefore, the goal of this thesis is to propose a fully automated virtual staining platform that optimizes and simplifies the workflow of training virtual staining models down to several mouse clicks by means of a custom LED array that incorporates patterned near-infrared and white-light LEDs.
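As background for readers unfamiliar with QPI, the phase delay mentioned in the abstract is commonly written with the standard textbook relation below; this is general context, not a formula taken from the thesis, and the symbols (wavelength, refractive indices, thickness) are the usual conventions rather than the author's notation.

% Standard QPI phase-delay relation (general background, not a result of this thesis):
% the measured phase map \phi(x,y) encodes the specimen's optical thickness.
\begin{equation}
  \phi(x, y) = \frac{2\pi}{\lambda}\,\bigl[\, n_s(x, y) - n_m \,\bigr]\, h(x, y)
\end{equation}
% where \lambda is the illumination wavelength (near-infrared in this work),
% n_s and n_m are the refractive indices of the specimen and the surrounding
% medium, and h(x, y) is the local physical thickness of the sample.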
Publisher
Ulsan National Institute of Science and Technology
Degree
Doctor
Major
Department of Biomedical Engineering