(WHS-L3.06) Wound Histology Through Virtual Staining Using Generative Adversarial Networks
Thursday, May 16, 2024
10:30 AM – 11:30 AM Eastern Time (US)
Hematoxylin and Eosin (H&E) staining is the gold standard for visualizing wound microstructural features. However, traditional biopsy, sectioning, and staining are destructive, variable, and cannot capture dynamic events. Label-free multiphoton microscopy (MPM) allows for non-invasive in vivo 3D visualization of tissue microstructure using the endogenous fluorescence of cellular metabolic cofactors as well as second harmonic generation of collagen. However, MPM-generated images are unfamiliar to pathologists, clinicians, and biologists accustomed to interpreting H&E images. Bridging this familiarity gap is crucial for integrating advanced imaging into established diagnostic practices.

Generative Adversarial Networks (GANs) are deep learning models used for style transfer and image-to-image translation tasks. A typical GAN consists of a Generator network that creates or modifies images and a Discriminator network that tries to distinguish real from computer-generated images, with both networks trained against adversarial loss functions. A Cycle GAN is a specific GAN architecture suited to unpaired image translation: it employs two Generators (along with two adversarial Discriminators) to translate images between two domains.

The objective of this project was to develop a Cycle GAN capable of translating unstained MPM images of wounds to resemble H&E sections. A Cycle GAN architecture was written in Python using PyTorch. The Generator networks learned to transform MPM images of unstained tissue to resemble H&E-stained sections and vice versa, while the Discriminator networks learned to distinguish the generated images from real H&E or MPM images. All networks were updated from the Discriminators' classification errors during training, and the Generators were additionally penalized by a cycle-consistency loss measuring how closely an image translated to the other domain and back matched the original. Our Cycle GAN was trained over 80 epochs on diverse MPM and H&E image datasets of excisional skin wounds.
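The training scheme described above can be sketched in PyTorch. This is a minimal illustration only, assuming least-squares adversarial losses and tiny stand-in convolutional networks; the abstract does not specify the actual network architectures or hyperparameters, and all names here (e.g. `G_mpm2he`, `lambda_cyc`) are hypothetical.

```python
import torch
import torch.nn as nn

def make_generator():
    # Tiny image-to-image network standing in for a real translator
    # (e.g. a ResNet- or U-Net-style generator).
    return nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 3, 3, padding=1), nn.Tanh(),
    )

def make_discriminator():
    # PatchGAN-like critic: outputs a grid of real/fake scores.
    return nn.Sequential(
        nn.Conv2d(3, 16, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.Conv2d(16, 1, 4, stride=2, padding=1),
    )

G_mpm2he, G_he2mpm = make_generator(), make_generator()   # two Generators
D_he, D_mpm = make_discriminator(), make_discriminator()  # two Discriminators

adv_loss = nn.MSELoss()  # least-squares adversarial loss (assumed)
cyc_loss = nn.L1Loss()   # cycle-consistency loss

opt_G = torch.optim.Adam(
    list(G_mpm2he.parameters()) + list(G_he2mpm.parameters()), lr=2e-4)
opt_D = torch.optim.Adam(
    list(D_he.parameters()) + list(D_mpm.parameters()), lr=2e-4)

def train_step(real_mpm, real_he, lambda_cyc=10.0):
    # --- Generator update: fool the Discriminators and close the cycle ---
    fake_he, fake_mpm = G_mpm2he(real_mpm), G_he2mpm(real_he)
    pred_he, pred_mpm = D_he(fake_he), D_mpm(fake_mpm)
    loss_adv = (adv_loss(pred_he, torch.ones_like(pred_he)) +
                adv_loss(pred_mpm, torch.ones_like(pred_mpm)))
    # MPM -> virtual H&E -> reconstructed MPM should match the input,
    # and vice versa (the cycle-consistency constraint).
    loss_cyc = (cyc_loss(G_he2mpm(fake_he), real_mpm) +
                cyc_loss(G_mpm2he(fake_mpm), real_he))
    loss_G = loss_adv + lambda_cyc * loss_cyc
    opt_G.zero_grad(); loss_G.backward(); opt_G.step()

    # --- Discriminator update: separate real from generated H&E images ---
    pred_real = D_he(real_he)
    pred_fake = D_he(fake_he.detach())
    loss_D = 0.5 * (adv_loss(pred_real, torch.ones_like(pred_real)) +
                    adv_loss(pred_fake, torch.zeros_like(pred_fake)))
    # (the symmetric D_mpm update is omitted for brevity)
    opt_D.zero_grad(); loss_D.backward(); opt_D.step()
    return loss_G.item(), loss_D.item()

# One step on random stand-in image batches of shape (N, C, H, W).
g, d = train_step(torch.randn(2, 3, 32, 32), torch.randn(2, 3, 32, 32))
```

The unpaired setting is what makes the cycle term essential: without paired MPM/H&E images of the same tissue, the cycle-consistency loss is the only signal tying a virtual stain back to the structure of its source image.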
Visual comparisons between virtually- and chemically-stained images revealed promising outcomes, with the Generator preserving tissue morphology at high resolution while accurately staining epidermis and follicles purple and collagen pink. Most measures of loss (i.e., network error) decreased throughout training, with the noteworthy exception of the Discriminator loss. This pattern suggests a successful outcome: the Generators produced virtual staining convincing enough that the Discriminators ultimately could not discern it from real chemically-stained sections. This study demonstrates the potential of virtual staining via Cycle GAN to bridge the familiarity gap between MPM and traditional H&E staining. It provides a framework for obtaining high-resolution H&E images of wounds from in vivo imaging without the need for tissue biopsy, sectioning, or staining.