Computed Tomography (CT) plays a pivotal role in medical diagnosis; however,
variability across reconstruction kernels hinders data-driven approaches, such
as deep learning models, from achieving reliable and generalizable performance.
To address this, CT data harmonization has emerged as a promising solution
that minimizes such non-biological variance by standardizing data across
different sources or conditions. In this context, Generative Adversarial
Networks (GANs) have proven to be a powerful framework for harmonization,
framing it as a style-transfer problem. However, GAN-based approaches still face limitations in
capturing complex relationships within the images, which are essential for
effective harmonization. In this work, we propose a novel texture-aware StarGAN
for CT data harmonization, enabling one-to-many translations across different
reconstruction kernels. Although the StarGAN model has been successfully
applied in other domains, its potential for CT data harmonization remains
unexplored. Furthermore, our approach introduces a multi-scale texture loss
function that embeds texture information across different spatial and angular
scales into the harmonization process, effectively addressing kernel-induced
texture variations. We conducted extensive experimentation on a publicly
available dataset, using a total of 48,667 chest CT slices from 197 patients
distributed over three different reconstruction kernels, demonstrating the
superiority of our method over the baseline StarGAN.
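To make the one-to-many setup concrete, the sketch below shows how a StarGAN-style generator is commonly conditioned on the target domain: a one-hot label identifying the target reconstruction kernel is broadcast to a spatial map and concatenated with the input slice, so a single generator can translate between any pair of kernels. This is a minimal illustration of the standard StarGAN conditioning mechanism, not the authors' exact architecture; the layer configuration and the `num_kernels` and `base_channels` parameters are assumptions.

```python
import torch
import torch.nn as nn

class KernelConditionedGenerator(nn.Module):
    """Minimal StarGAN-style generator stub: the target-kernel label is
    broadcast to a spatial map and concatenated with the input CT slice,
    so one network handles all kernel-to-kernel translations."""

    def __init__(self, num_kernels: int = 3, base_channels: int = 64):
        super().__init__()
        # Input: 1 image channel + num_kernels label channels.
        self.net = nn.Sequential(
            nn.Conv2d(1 + num_kernels, base_channels, 7, padding=3),
            nn.InstanceNorm2d(base_channels, affine=True),
            nn.ReLU(inplace=True),
            # ... down-sampling / residual / up-sampling blocks omitted ...
            nn.Conv2d(base_channels, 1, 7, padding=3),
            nn.Tanh(),
        )

    def forward(self, x: torch.Tensor, target_kernel: torch.Tensor) -> torch.Tensor:
        # target_kernel: (N, num_kernels) one-hot -> (N, num_kernels, H, W)
        label_map = target_kernel[:, :, None, None].expand(
            -1, -1, x.size(2), x.size(3)
        )
        return self.net(torch.cat([x, label_map], dim=1))

# Usage: translate a batch of normalized slices to the second kernel.
g = KernelConditionedGenerator(num_kernels=3)
slices = torch.randn(4, 1, 256, 256)
target = torch.eye(3)[torch.tensor([1, 1, 1, 1])]  # one-hot target labels
harmonized = g(slices, target)
```

A single conditional generator of this kind replaces the quadratic number of pairwise models that one-to-one translation frameworks would otherwise require for three kernels.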
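The phrase "spatial and angular scales" suggests texture statistics computed at multiple pixel offsets and directions, as in gray-level co-occurrence matrix (GLCM) analysis. The sketch below is one plausible reading of such a multi-scale texture comparison using scikit-image; it illustrates the general idea rather than the paper's actual loss, and the distance grid, angle grid, property set, and `levels` value are all assumptions.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# Assumed scale grids: four pixel offsets (spatial) and four directions (angular).
DISTANCES = [1, 2, 4, 8]
ANGLES = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]

def multiscale_texture_distance(pred: np.ndarray, target: np.ndarray) -> float:
    """Distance between GLCM texture descriptors of two 8-bit grayscale
    slices, aggregated over multiple spatial and angular scales.

    Illustrative only: skimage's GLCM is not differentiable, so an actual
    training loss would require a differentiable reformulation.
    """
    glcm_p = graycomatrix(pred, DISTANCES, ANGLES, levels=256,
                          symmetric=True, normed=True)
    glcm_t = graycomatrix(target, DISTANCES, ANGLES, levels=256,
                          symmetric=True, normed=True)
    # Each property is a (len(DISTANCES), len(ANGLES)) array; average the
    # absolute gaps over every scale/direction and sum over the properties.
    props = ("contrast", "homogeneity", "correlation", "energy")
    return float(sum(
        np.abs(graycoprops(glcm_p, p) - graycoprops(glcm_t, p)).mean()
        for p in props
    ))

# Usage with two random uint8 slices of the same shape.
a = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
b = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
print(multiscale_texture_distance(a, b))
```

Comparing co-occurrence statistics at several offsets and directions, rather than raw pixels alone, is one way a loss can penalize the kernel-induced texture differences that per-pixel objectives tend to miss.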