
Trust in AI


Research Project Overviews

Learn more about the Biomedic.AI Lab's past and current research on trust in medical AI.

Factors Influencing Trust in Medical Artificial Intelligence for Healthcare Professionals

Artificially intelligent technology is revolutionizing healthcare. However, a lack of trust in the output of complex decision support systems poses challenges and barriers to their adoption and implementation in clinical practice. Artificial intelligence (AI) in clinical settings can augment clinical decision-making and provide diagnostic support by translating uncertainty and complexity in patient data into actionable suggestions. Nevertheless, the successful integration of AI-based technologies as non-human, yet collaborative, members of a healthcare team depends largely on the trust that other team members place in these systems.

We performed a comprehensive review of the literature to better understand the trust dynamics between medical AI and healthcare expert end-users, exploring the factors that influence trust in these technologies and how they compare to established concepts of trust in the engineering discipline. By identifying the qualitatively and quantitatively assessed factors that influence trust in medical AI, we gain insight into how autonomous systems can be optimized during the development phase to improve decision-making support and clinician-machine teaming. This yields a clearer understanding of the qualities that healthcare professionals seek in AI before considering it trustworthy. We also highlight key considerations for promoting ongoing improvement of trust in autonomous medical systems to support the adoption of medical technologies into practice.

Explainability, transparency, interpretability, usability, and education are among the key identified factors thought to influence healthcare professionals' trust in medical AI and to enhance clinician-machine teaming in critical decision-making healthcare environments. We also identified the need to better evaluate and incorporate other critical factors that promote trust by consulting medical professionals when developing AI systems for clinical decision-making and diagnostic support.

Citation

Tucci V, Saary J, Doyle TE. Factors influencing trust in medical artificial intelligence for healthcare professionals: a narrative review. J Med Artif Intell 2021. https://dx.doi.org/10.21037/jmai-21-25

Principal Investigator

Dr. Thomas E. Doyle, PhD, PEng, Associate Professor, Department of Electrical and Computer Engineering and School of Biomedical Engineering, McMaster University; Vector Institute

Researchers

Victoria Tucci, Honours Bachelor of Health Sciences Candidate, Faculty of Health Sciences, McMaster University

Dr. Joan Saary, MD, PhD, FRCPC, FAsMA, FACOEM, Division of Occupational Medicine, Department of Medicine, University of Toronto, Canada; Canadian Forces Environmental Medicine Establishment, Toronto, Canada