ThyExp: an explainable AI-assisted decision making toolkit for thyroid nodule diagnosis based on ultra-sound images

Morris, J., Liu, Z., Liang, H., Nagala, S. and Hong, X. (2023) ThyExp: an explainable AI-assisted decision making toolkit for thyroid nodule diagnosis based on ultra-sound images. In: 32nd ACM International Conference on Information and Knowledge Management, Saturday 21 - Wednesday 25 October 2023, Birmingham, UK, pp. 5371-5375, https://doi.org/10.1145/3583780.3615131.

Published Version (Open Access) - Text, 1MB. Available under a Creative Commons Attribution licence. Please see our End User Agreement before downloading.
Accepted Version - Text, 1MB. Restricted to Repository staff only.

It is advisable to refer to the publisher's version if you intend to cite from this work. See Guidance on citing.

To link to this item, use this DOI: 10.1145/3583780.3615131

Abstract/Summary

Radiologists have the important task of diagnosing thyroid nodules present in ultrasound images. Although reporting systems exist to aid the diagnosis process, these systems do not provide explanations of the diagnosis results. We present ThyExp, a web-based toolkit for use by medical professionals that employs artificial intelligence models to deliver accurate, explained diagnoses of thyroid nodules present in ultrasound images. The proposed web-based toolkit can be easily incorporated into current medical workflows and allows medical professionals to draw on a highly accurate machine learning model, with explanations that provide supplementary diagnostic information. The solution provides classification results together with their predicted probabilities, as well as explanations that present the key features or characteristics contributing to each classification. Experiments conducted on a real-world UK NHS hospital patient dataset demonstrate the effectiveness of the proposed approach. This toolkit can improve the trust of medical professionals by helping them understand both the model's confidence in its predictions and the reasoning behind them.
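The abstract describes output of a particular shape: a classification with an associated probability, accompanied by the key features that contributed to it. The sketch below is a minimal, hypothetical illustration of that output pattern only; it is not the ThyExp implementation. The feature names, the synthetic data, and the use of a linear model with coefficient-times-value contributions as the explanation are all assumptions made for illustration, and the paper's actual models and explanation method may differ.

```python
# Hypothetical sketch: shows the *kind* of output the abstract describes
# (class probability plus ranked feature contributions). Not the ThyExp code.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Assumed nodule descriptors, chosen only for illustration.
FEATURES = ["echogenicity", "margin_irregularity", "microcalcifications", "taller_than_wide"]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, len(FEATURES)))                                   # synthetic data
y = (X[:, 1] + X[:, 2] + rng.normal(scale=0.5, size=200) > 0).astype(int)   # synthetic labels

model = LogisticRegression().fit(X, y)

def explain(sample):
    """Return malignancy probability and per-feature contributions
    (coefficient * feature value) for a single nodule, largest first."""
    prob = model.predict_proba(sample.reshape(1, -1))[0, 1]
    contributions = model.coef_[0] * sample
    ranked = sorted(zip(FEATURES, contributions), key=lambda t: -abs(t[1]))
    return prob, ranked

prob, ranked = explain(X[0])
print(f"Predicted probability of malignancy: {prob:.2f}")
for name, contribution in ranked:
    print(f"  {name}: {contribution:+.2f}")
```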

Item Type: Conference or Workshop Item (Paper)
Refereed: Yes
Divisions: Science > School of Mathematical, Physical and Computational Sciences > Department of Computer Science
ID Code: 113015
