Abstract: Double staining in histopathology, particularly for metaplastic breast cancer, typically employs H&E and P63 stains. However, P63 staining damages the tissue and is costly, which motivates alternative methods. This study introduces xAI-CycleGAN, an advanced architecture that combines Mask CycleGAN with explainability features and structure-preserving capabilities to transform H&E-stained breast tissue images into P63-like images. The architecture also allows the output to be edited, enhancing its resemblance to real images and enabling further model refinement. We showcase xAI-CycleGAN's efficacy in preserving structural integrity and generating high-quality images. Additionally, a survey of histopathologists indicates that the realism of the generated images is often comparable to that of real images, validating the model's high-quality output.
Abstract: In the domain of unsupervised image-to-image transformation with generative models, CycleGAN has become the architecture of choice. One of its primary downsides is a relatively slow rate of convergence. In this work, we use discriminator-driven explainability to speed up the convergence of the generative model: following the work of Nagisetty et al., saliency maps from the discriminator mask the generator's gradients during backpropagation, and, building on Wang M.'s Mask CycleGAN, the saliency map is also introduced on the input, added onto a Gaussian noise mask through an interpretable latent variable. This fuses explainability in both directions and lets the noise-added saliency map on the input act as evidence-based counterfactual filtering. The resulting architecture converges much faster than a baseline CycleGAN while preserving image quality.
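To make the gradient-masking idea summarized above concrete, the following is a minimal PyTorch-style sketch of how a discriminator saliency map can mask the generator's gradients during backpropagation. The function names, normalization, and training-loop wiring are illustrative assumptions rather than the exact implementation used in this work, and the input-side noise-added saliency mask of Mask CycleGAN is omitted for brevity.

```python
import torch


def discriminator_saliency(discriminator, fake_img):
    """Gradient-based saliency of the discriminator score w.r.t. a
    generated image, normalized to [0, 1] per sample (illustrative)."""
    fake = fake_img.detach().requires_grad_(True)
    score = discriminator(fake).sum()
    grad, = torch.autograd.grad(score, fake)
    sal = grad.abs()
    sal = sal / (sal.amax(dim=(1, 2, 3), keepdim=True) + 1e-8)
    return sal


def generator_step(generator, discriminator, real_A, opt_G, adv_loss):
    """One adversarial generator update with saliency-masked gradients
    (a sketch; cycle-consistency and identity terms are left out)."""
    fake_B = generator(real_A)

    # Saliency map from the discriminator for this batch.
    sal = discriminator_saliency(discriminator, fake_B)

    # Mask the gradient flowing back into the generated image:
    # regions the discriminator attends to keep their gradient,
    # the rest is attenuated.
    fake_B.register_hook(lambda g: g * sal)

    pred = discriminator(fake_B)
    loss = adv_loss(pred, torch.ones_like(pred))

    opt_G.zero_grad()
    loss.backward()
    opt_G.step()
    return loss.item()
```

In this sketch the masking is applied with a backward hook on the generated image, so only the generator's parameters receive the re-weighted gradients; how the masking is combined with the other CycleGAN losses is an implementation choice not specified here.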