July 2025 | Volume 04 | Issue 03
Explainable AI (XAI) Techniques to Enhance Cancer Diagnosis
Harmeet Khera
Scholar, National P.G. College, Lucknow, India
Nikhil Pandey
Scholar, National P.G. College, Lucknow, India
Dr. Shalini Lamba
Department Head, Computer Science Department, National P.G. College, Lucknow, India
📌 DOI: https://doi.org/10.63920/tjths.43006
🔑 Keywords: Explainable AI; Cancer Diagnosis; Oncology; XAI; Medical Imaging
📅 Publication Date: 01 July 2025
📜 License:
This work is licensed under a Creative Commons Attribution 4.0 International License.
- Share — copy and redistribute the material
- Adapt — remix, transform, and build upon the material
- The licensor cannot revoke these freedoms as long as you follow the license terms.
Abstract:
This study examines how Explainable Artificial Intelligence (XAI) enhances cancer diagnosis, emphasizing the importance of trust and transparency in clinical settings. While artificial intelligence (AI) models have demonstrated impressive accuracy in detecting cancer from medical images, their black-box nature often prevents clinicians from understanding how predictions are made. This lack of interpretability creates hesitation about adopting AI in real-world healthcare. The paper surveys popular XAI methods that provide visual and feature-based explanations of model outputs, and discusses how these methods can enhance clinician confidence, reduce diagnostic errors, and meet regulatory requirements for accountability. Based on a review of recent studies and a proposed framework for applying XAI in oncology, the research highlights the promise of interpretable models in bridging the precision of machine learning with the trustworthiness of medical practice. It also outlines future strategies for incorporating XAI into clinical workflows.