Brian McCrindle, Katherine Zukotynski, Thomas E Doyle, and Michael D Noseworthy (2021)
A Radiology-focused Review of Predictive Uncertainty for AI Interpretability in Computer-assisted Segmentation
Radiology: Artificial Intelligence, 3(6):e210031.
Recent advances in computer hardware, software tools, and massive digital data archives have enabled the rapid development of artificial intelligence (AI) applications. Concerns over whether AI tools can "communicate" decisions to radiologists and primary care physicians are of particular importance because automated clinical decisions can substantially impact patient outcomes. A challenge facing the clinical implementation of AI stems from the potential lack of trust clinicians have in these predictive models. This review expands on the existing literature on interpretability methods for deep learning and surveys state-of-the-art methods for predictive uncertainty estimation in computer-assisted segmentation tasks. Finally, we discuss how uncertainty can improve predictive performance and model interpretability, and can act as a tool to help foster trust.
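To make the notion of predictive uncertainty estimation concrete, the sketch below illustrates one widely used technique in this literature, Monte Carlo dropout, applied to a segmentation model. This is an illustration only, not code from the article; the function name `mc_dropout_uncertainty` and the assumption of a PyTorch model containing dropout layers are hypothetical.

```python
import torch

def mc_dropout_uncertainty(model, image, n_samples=20):
    """Estimate per-pixel predictive uncertainty via Monte Carlo dropout.

    Runs several stochastic forward passes with dropout kept active;
    the variance across passes serves as a simple uncertainty map.
    Assumes `model` is a segmentation network with dropout layers that
    outputs per-pixel class logits of shape (B, C, H, W).
    """
    model.eval()
    # .eval() disables dropout, so re-enable dropout layers only.
    for module in model.modules():
        if isinstance(module, torch.nn.Dropout):
            module.train()

    with torch.no_grad():
        # Stack per-pixel class probabilities from each stochastic pass.
        probs = torch.stack(
            [torch.softmax(model(image), dim=1) for _ in range(n_samples)]
        )

    mean_probs = probs.mean(dim=0)             # average segmentation prediction
    uncertainty = probs.var(dim=0).sum(dim=1)  # per-pixel variance, summed over classes
    return mean_probs, uncertainty
```

High-variance pixels (typically near lesion or organ boundaries) flag regions where the model's segmentation is least reliable, which is one way such maps can support clinician trust.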