COMBINED METRIC FOR EVALUATING THE QUALITY OF SYNTHESIZED BIOMEDICAL IMAGES
DOI: https://doi.org/10.15588/1607-3274-2025-2-15

Keywords: metric, IS metric, FID metric, histopathological images, deep neural networks, diffusion models, Stable Diffusion

Abstract
Context. This study addresses the problem of developing a new metric for evaluating the quality of synthesized images. The problem is relevant because artificially generated images require quantitative quality assessment. The study also highlights the potential of biomedical image synthesis based on diffusion models. The results can be applied to biomedical image generation and to the quantitative quality assessment of synthesized images.
Objective. The aim of this study is to develop a combined metric for assessing the quality of synthesized images and an algorithm for biomedical image synthesis.
Method. A combined metric M_C for evaluating the quality of synthesized images is proposed. The metric is based on two existing metrics: the Inception Score metric M_IS and the Fréchet Inception Distance metric M_FID. In addition, an algorithm for histopathological image synthesis based on diffusion models has been developed.
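The abstract does not state the closed-form definition of M_C. As an illustration only, the sketch below assumes M_C is a weighted combination of a normalized M_IS term (higher is better) and an inverted, normalized M_FID term (lower is better); the weights, normalization bounds, and function name are assumptions, not the authors' formula.

def combined_metric(m_is, m_fid, w_is=0.5, w_fid=0.5, is_max=10.0, fid_max=300.0):
    """Hypothetical combination of M_IS and M_FID into a single score in [0, 1].

    m_is  -- Inception Score of the synthesized set (higher is better).
    m_fid -- Frechet Inception Distance to the real set (lower is better).
    Weights and normalization bounds are illustrative assumptions, not the
    M_C definition from the paper.
    """
    is_term = min(m_is / is_max, 1.0)            # map IS into [0, 1]
    fid_term = 1.0 - min(m_fid / fid_max, 1.0)   # invert FID so higher is better
    return w_is * is_term + w_fid * fid_term


# Example: IS = 3.2 and FID = 45.0 for one synthesized class gives ~0.585.
print(combined_metric(3.2, 45.0))

Any monotone aggregation of the two terms would fit the same description; the weighted sum is used here only because it is the simplest such choice.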
Results. To study the M_IS, M_FID, and M_C metrics, histopathological images available on the Zenodo platform were used. The dataset contains three classes of histopathological images, G1, G2, and G3, representing pathological conditions of breast tissue. Using the developed image synthesis algorithm, three classes of artificial histopathological images were generated. Quality assessments of the synthesized histopathological images were then obtained with the M_IS, M_FID, and M_C metrics. The developed metric will form the basis of a software module for metric-based image quality assessment, which will be integrated into CAD systems.
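A minimal sketch of this kind of per-class evaluation is given below, using the Hugging Face diffusers library for Stable Diffusion-based generation and torchmetrics for M_IS and M_FID. The checkpoint name, prompt, directory layout, and batch sizes are assumptions for illustration; the paper's actual synthesis algorithm and evaluation code are not reproduced here.

from pathlib import Path

import numpy as np
import torch
from PIL import Image
from diffusers import StableDiffusionPipeline
from torchmetrics.image.fid import FrechetInceptionDistance
from torchmetrics.image.inception import InceptionScore

device = "cuda" if torch.cuda.is_available() else "cpu"

# 1. Generate a small batch of synthetic images for one class (e.g. G1).
#    The checkpoint and prompt are assumptions, not the paper's fine-tuned model.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16 if device == "cuda" else torch.float32,
).to(device)
fake_pil = pipe("breast tissue histopathology, grade G1, H&E stain",
                num_images_per_prompt=8).images

# 2. Load real images of the same class (the directory layout is an assumption).
real_pil = [Image.open(p).convert("RGB") for p in Path("data/G1").glob("*.png")]

def to_uint8_batch(images, size=(299, 299)):
    """Stack PIL images into an (N, 3, H, W) uint8 tensor, as torchmetrics expects."""
    arrays = [np.array(img.resize(size)) for img in images]
    return torch.from_numpy(np.stack(arrays)).permute(0, 3, 1, 2)

real = to_uint8_batch(real_pil)
fake = to_uint8_batch(fake_pil)

# 3. M_IS: Inception Score computed on the synthesized images only.
inception = InceptionScore()
inception.update(fake)
is_mean, is_std = inception.compute()

# 4. M_FID: Frechet distance between real and synthesized Inception features.
fid = FrechetInceptionDistance(feature=2048)
fid.update(real, real=True)
fid.update(fake, real=False)
fid_value = fid.compute()

print(f"M_IS = {is_mean:.3f} +/- {is_std:.3f}, M_FID = {fid_value:.3f}")

The per-class M_IS and M_FID values obtained this way would then be combined into M_C, for example as in the sketch after the Method paragraph.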
Conclusions. A combined metric for evaluating the quality of synthesized images has been developed, and an algorithm for biomedical image synthesis has been proposed. The software implementation of the combined metric and of the synthesis algorithm has been integrated into an image quality assessment module.
References
Cancer Facts for Women. American Cancer Society. [Electronic resource]. Access mode: https://www.cancer.org/cancer/risk-prevention/understanding-cancer-risk/cancer-facts/cancer-facts-for-women.html
Cancer in Ukraine 2021–2022: Incidence, mortality, prevalence and other relevant statistics. Bulletin of the National Cancer Registry of Ukraine № 24, 2021–2022. [Electronic resource]. Access mode: http://www.ncru.inf.ua/publications/BULL_24/PDF_E/bull_eng_24.pdf
Berezsky O., Liashchynskyi P., Pitsun O., Izonin I. Synthesis of Convolutional Neural Network architectures for biomedical image classification, Biomedical Signal Processing and Control, 2024, Vol. 95, P. 106325. DOI: 10.1016/j.bspc.2024.106325
Nie D., Cao X., Gao Y., Wang L., Shen D. Estimating CT image from MRI data using 3D fully convolutional networks, In: Deep Learning and Data Labeling for Medical Applications. Springer International Publishing, 2016, pp. 170–178. DOI: 10.1007/978-3-319-46976-8_18
Nie D., Trullo R., Lian J., Petitjean C., Ruan S., Wang Q., Shen D. Medical image synthesis with context-aware generative adversarial networks, In: Medical Image Computing and Computer Assisted Intervention – MICCAI 2017. Springer International Publishing, 2017, pp. 417–425. DOI: 10.1007/978-3-319-66179-7_48
Khader F., Müller-Franzes G., Arasteh S. T., Han T., Haarburger C., Schulze Hagen M., Schad P., Engelhardt S., Baeßler B., Foersch S., Stegmaier J., Kuhl C., Nebelung S., Kather J. N., Truhn D. Denoising diffusion probabilistic models for 3D medical image generation, Scientific Reports, 2023, Vol. 13, № 1, P. 7303. DOI: 10.1038/s41598-023-34341-2
Xu Q., Huang G., Yuan Y., Guo C., Sun Y., Wu F., Weinberger K. An empirical study on evaluation metrics of generative adversarial networks, arXiv:1806.07755 [cs, stat], 2018. DOI: 10.48550/arXiv.1806.07755
Borji A. Pros and cons of GAN evaluation measures, Computer Vision and Image Understanding, 2019, Vol. 179, pp. 41–65. DOI: 10.1016/j.cviu.2018.10.009
Rendon N., Giraldo J. H., Bouwmans T., Rodríguez-Buritica S., Ramirez E., Isaza C. Uncertainty clustering internal validity assessment using Fréchet distance for unsupervised learning, Engineering Applications of Artificial Intelligence, 2023, Vol. 124, P. 106635. DOI: 10.1016/j.engappai.2023.106635
Buzuti L. F., Thomaz C. E. Fréchet AutoEncoder Distance: A new approach for evaluation of Generative Adversarial Networks, Computer Vision and Image Understanding, 2023, Vol. 235, P. 103768. DOI: 10.1016/j.cviu.2023.103768
Heusel M., Ramsauer H., Unterthiner T., Nessler B., Hochreiter S. GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium [Electronic resource]. Access mode: https://arxiv.org/abs/1706.08500. DOI: 10.48550/arXiv.1706.08500
Bińkowski M., Sutherland D. J., Arbel M., Gretton A. Demystifying MMD GANs [Electronic resource]. Access mode: https://arxiv.org/abs/1801.01401. DOI: 10.48550/arXiv.1801.01401
Chong M. J., Forsyth D. Effectively Unbiased FID and Inception Score and where to find them [Electronic resource]. Access mode: https://arxiv.org/abs/1911.07023v3. DOI: 10.48550/arXiv.1911.07023
Parmar G., Zhang R., Zhu J.-Y. On Aliased Resizing and Surprising Subtleties in GAN Evaluation [Electronic resource]. Access mode: https://arxiv.org/abs/2104.11222v3. DOI: 10.48550/arXiv.2104.11222
Jayasumana S., Ramalingam S., Veit A., Glasner D., Chakrabarti A., Kumar S. Rethinking FID: Towards a Better Evaluation Metric for Image Generation [Electronic resource]. Access mode: https://arxiv.org/abs/2401.09603v2. DOI: 10.48550/arXiv.2401.09603
Betzalel E., Penso C., Fetaya E. Evaluation Metrics for Generative Models: An Empirical Study, Machine Learning and Knowledge Extraction, 2024, Vol. 6, № 3, pp. 1531–1544. DOI: 10.3390/make6030073
Benny Y., Galanti T., Benaim S., Wolf L. Evaluation Metrics for Conditional Image Generation, International Journal of Computer Vision, 2021, Vol. 129, № 5, pp. 1712–1731. DOI: 10.1007/s11263-020-01424-w
Arjovsky M., Chintala S., Bottou L. Wasserstein GAN [Electronic resource]. Access mode: https://arxiv.org/abs/1701.07875. DOI: 10.48550/ARXIV.1701.07875
Dayarathna S., Islam K. T., Uribe S., Yang G., Hayat M., Chen Z. Deep learning based synthesis of MRI, CT and PET: Review and analysis, Medical Image Analysis, 2024, Vol. 92, P. 103046. DOI: 10.1016/j.media.2023.103046
Luz D. S., Lima T. J. B., Silva R. R. V., Magalhães D. M. V., Araujo F. H. D. Automatic detection metastasis in breast histopathological images based on ensemble learning and color adjustment, Biomedical Signal Processing and Control, 2022, Vol. 75, P. 103564. DOI: 10.1016/j.bspc.2022.103564
Jiménez-Gaona Y., Carrión-Figueroa D., Lakshminarayanan V., Rodríguez-Álvarez M. J. GAN-based data augmentation to improve breast ultrasound and mammography mass classification, Biomedical Signal Processing and Control, 2024, Vol. 94, P. 106255. DOI: 10.1016/j.bspc.2024.106255
Ukwuoma C. C., Cai D., Eziefuna E. O., Oluwasanmi A., Abdi Sabirin F., Muoka G. W., Thomas D., Sarpong K. Enhancing histopathological medical image classification for Early cancer diagnosis using deep learning and explainable AI – LIME & SHAP, Biomedical Signal Processing and Control, 2025, Vol. 100, P. 107014. DOI: 10.1016/j.bspc.2024.107014
Shi S., Li H., Zhang Y., Wang X. Semantic information-guided attentional GAN-based ultrasound image synthesis method, Biomedical Signal Processing and Control, 2025, Vol. 102, P. 107273. DOI: 10.1016/j.bspc.2024.107273
Zeng X., Lu B., Zhang J. Medical image synthesis algorithm based on vision graph neural network with manifold matching, Biomedical Signal Processing and Control, 2025, Vol. 103, P. 107381. DOI: 10.1016/j.bspc.2024.107381
Hu Y., Zhang S., Li W., Sun J., Xu L. X. Unsupervised medical image synthesis based on multi-branch attention structure, Biomedical Signal Processing and Control, 2025, Vol. 104, P. 107495. DOI: 10.1016/j.bspc.2025.107495
Jose L., Liu S., Russo C., Nadort A., Di Ieva A. Generative Adversarial Networks in Digital Pathology and Histopathological Image Processing: A Review, Journal of Pathology Informatics, 2021, Vol. 12, № 1, P. 43. DOI: 10.4103/jpi.jpi_103_20
Giri S. K., Dash S. Synthesis of clinical images for oral cancer detection and prediction using deep learning, Mining Biomedical Text, Images and Visual Features for Information Retrieval, 2025, pp. 339–356. DOI: 10.1016/b978-0-443-15452-2.00017-0
Berezsky O. M., Pitsun O. Y. Evaluation methods of image segmentation quality, Radio Electronics, Computer Science, Control, 2018, № 1, pp. 119–128. DOI: 10.15588/1607-3274-2018-1-14
Berezsky O. M., Liashchynskyi P. B. Method of generative-adversarial networks searching architectures for biomedical images synthesis, Radio Electronics, Computer Science, Control, 2024, № 1, pp. 104–117. DOI: 10.15588/1607-3274-2024-1-10
Berezsky O. M., Liashchynskyi P. B., Pitsun O. Y., Melnyk G. M. Deep network-based method and software for small sample biomedical image generation and classification, Radio Electronics, Computer Science, Control, 2023, № 4, P. 76. DOI: 10.15588/1607-3274-2023-4-8
Rantalainen M., Hartman J. ACROBAT – a multi-stain breast cancer histological whole-slide-image data set from routine diagnostics for computational pathology, Karolinska Institutet [Electronic resource]. Access mode: https://doi.org/10.48723/w728-p041. DOI: 10.48723/w728-p041
Wan Siti Halimatul Munirah Wan Ahmad, Mohammad Faizal Ahmad Fauzi, Md Jahid Hasan, Zaka Ur Rehman, Jenny Tung Hiong Lee, See Yee Khor, Lai Meng Looi, Fazly Salleh Abas, Afzan Adam, Elaine Wan Ling Chan, Sei-ichiro Kamata UMMC ER-IHC Breast Histopathology Whole Slide Image and Allred Score, IEEE Dataport, 2023. [Electronic resource]. Access mode: https://dx.doi.org/10.21227/9gbq-gz50. DOI: 10.21227/9gbq-gz50
Berezsky O., Datsko T., Melnyk G. Cytological and histological images of breast cancer [Electronic resource]. Access mode: https://zenodo.org/records/7890874. DOI: 10.5281/zenodo.7890873
Berezsky O., Liashchynskyi P., Melnyk G., Dombrovskyi M., Berezkyi M. Synthesis of biomedical images based on generative intelligence tools, Informatics & Data-Driven Medicine (IDDM 2024): 7th International Conference, Birmingham, UK, November 14–16, 2024, CEUR Workshop Proceedings, 2024, Vol. 3892, pp. 349–362. [Electronic resource]. Access mode: https://ceur-ws.org/Vol-3892/paper23.pdf
Deza M. M., Deza E. Encyclopedia of Distances. Springer-Verlag Berlin Heidelberg, 2013, 583 p.
GitHub – liashchynskyi/rudi: Lightweight image converter and dataset augmentor, 2024. [Electronic resource]. Access mode: https://github.com/liashchynskyi/rudi
License
Copyright (c) 2025 O. M. Berezsky, M. O. Berezkyi, M. O. Dombrovskyi, P. B. Liashchynskyi, G. M. Melnyk

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.