KEYWORDS: Tissues, Microscopy, Tumors, Standards development, In vivo imaging, Education and training, Data modeling, 3D imaging standards, Spatial resolution, Phase imaging
Quantitative oblique back-illumination microscopy (qOBM) is a novel technology for label-free imaging of thick (unsectioned) tissue specimens, demonstrating high spatial resolution and 3-D capabilities. Its grayscale contrast, however, is unfamiliar to pathologists and histotechnologists, limiting its adoption. We used deep learning techniques to convert qOBM images into virtual H&E, observing successful conversion of both healthy and tumor-containing thick (unsectioned) specimens. Transfer learning was demonstrated on a second collection of qOBM and H&E images of human astrocytoma specimens. With some improvement in robustness and generalizability, we anticipate that this approach can find clinical application.
KEYWORDS: 3D image processing, Biological imaging, Tissues, Stereoscopy, Real time imaging, Light sources and illumination, Biomedical applications, Phase contrast, In vivo imaging, 3D applications
Quantitative oblique back-illumination microscopy (qOBM) enables quantitative phase imaging (QPI) with epi-illumination, and thus permits the use of phase contrast in applications that were previously out of reach for QPI, including clinical medicine. Here, I will discuss our latest efforts to apply qOBM in clinical settings, specifically tissue imaging for non-invasive diagnostics and image-guided therapy. Our approach uses an unsupervised cycle-consistent generative adversarial network to translate 3D phase images of thick fresh tissues to appear like H&E-stained tissue sections. This work paves the way for non-invasive, label-free, real-time 3D H&E imaging, which can be transformative for disease detection and guided therapy.
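The cycle-consistency idea behind this kind of unpaired qOBM-to-H&E translation can be sketched briefly. The abstract does not give implementation details, so the following is a minimal illustrative sketch, not the authors' pipeline: `G` maps qOBM to H&E, `F` maps H&E back to qOBM, and the cycle loss penalizes `F(G(x))` differing from `x` (and symmetrically for `y`). The generator functions and the weight `lam` here are stand-in assumptions.

```python
import numpy as np

def cycle_consistency_loss(G, F, x_qobm, y_he, lam=10.0):
    """L1 cycle loss used in CycleGAN-style training:
    lam * (||F(G(x)) - x||_1 + ||G(F(y)) - y||_1), averaged per pixel.
    G and F stand in for trained generator networks."""
    forward = np.abs(F(G(x_qobm)) - x_qobm).mean()   # qOBM -> H&E -> qOBM
    backward = np.abs(G(F(y_he)) - y_he).mean()      # H&E -> qOBM -> H&E
    return lam * (forward + backward)
```

In full CycleGAN training this term is added to the adversarial losses of two discriminators; perfect generators that invert each other drive the cycle term to zero.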
Slide-free microscopy techniques have been proposed for accelerating standard histopathology and intraoperative guidance. One such technology is quantitative oblique back-illumination microscopy (qOBM), which enables real-time, label-free quantitative phase imaging of thick, unsectioned in-vivo and ex-vivo tissues. However, the grayscale phase contrast provided by qOBM differs from the colored histology images familiar to pathologists and clinicians, limiting its current potential for adoption. Here we demonstrate the application of unsupervised deep learning using a Cycle-consistent Generative Adversarial Network (CycleGAN) model to transform qOBM images into virtual hematoxylin and eosin (H&E)-stained images. The models were trained on a dataset of qOBM and H&E images of similar regions in excised brain tissue from a 9L gliosarcoma rat tumor model. We observed successful qOBM-to-H&E conversion of both uninvolved and tumor-containing specimens, as demonstrated by a classifier test. We describe several crucial preprocessing steps that improve the quality of conversion, such as intensity inversion, pixel harmonization, and color normalization. This unsupervised deep learning framework does exhibit occasional subpar performance; for example, as with GANs in general, it can create so-called “hallucinations”, displaying features not actually present in the original qOBM images. We anticipate that this behavior can be minimized with more extensive training and deployment of advanced ML techniques, and that virtual-H&E-converted qOBM imaging will prove safe and appropriate for rapid tissue imaging applications.
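The three preprocessing steps named above — intensity inversion, pixel harmonization, and color normalization — can be sketched as simple array operations. This is an illustrative sketch only: the function names, the nearest-neighbor resampling, and the Reinhard-style per-channel normalization are assumptions for clarity, not the authors' actual preprocessing code.

```python
import numpy as np

def invert_intensity(img):
    """Invert a grayscale qOBM phase image so cell nuclei appear dark,
    matching the polarity of hematoxylin staining."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    norm = (img - lo) / (hi - lo + 1e-12)  # rescale to [0, 1]
    return 1.0 - norm                       # invert

def harmonize_pixels(img, src_um_per_px, dst_um_per_px):
    """Resample a 2D tile so qOBM and H&E images share one pixel pitch
    (nearest-neighbor for brevity; a real pipeline would interpolate)."""
    scale = src_um_per_px / dst_um_per_px
    h, w = img.shape
    rows = np.clip((np.arange(int(h * scale)) / scale).astype(int), 0, h - 1)
    cols = np.clip((np.arange(int(w * scale)) / scale).astype(int), 0, w - 1)
    return img[np.ix_(rows, cols)]

def normalize_color(rgb, target_mean, target_std):
    """Shift each channel of an H&E RGB tile toward target statistics,
    a simple per-channel (Reinhard-style) color normalization."""
    rgb = rgb.astype(np.float64)
    mean = rgb.mean(axis=(0, 1))
    std = rgb.std(axis=(0, 1)) + 1e-12
    return (rgb - mean) / std * target_std + target_mean
```

Steps like these matter for CycleGAN training because the two image domains must be brought onto comparable scales and sampling grids before the generators can learn a plausible mapping.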