Gaube, Susanne; Suresh, Harini; Raue, Martina; Lermer, Eva; Koch, Timo K.; Hudecek, Matthias F. C.; Ackery, Alun D.; Grover, Samir C.; Coughlin, Joseph F.; Frey, Dieter; Kitamura, Felipe C.; Ghassemi, Marzyeh; Colak, Errol (2023): Non-task expert physicians benefit from correct explainable AI advice when reviewing X-rays. Scientific Reports, 13 (1). ISSN 2045-2322
This publication is available under the Creative Commons Attribution (CC BY) license.
Abstract
Artificial intelligence (AI)-generated clinical advice is becoming more prevalent in healthcare. However, the impact of AI-generated advice on physicians’ decision-making is underexplored. In this study, physicians received X-rays with correct diagnostic advice and were asked to make a diagnosis, rate the advice’s quality, and judge their own confidence. We manipulated whether the advice came with or without a visual annotation on the X-rays, and whether it was labeled as coming from an AI or a human radiologist. Overall, receiving annotated advice from an AI resulted in the highest diagnostic accuracy. Physicians rated the quality of AI advice higher than human advice. We did not find a strong effect of either manipulation on participants’ confidence. The magnitude of the effects varied between task experts and non-task experts, with the latter benefiting considerably from correct explainable AI advice. These findings raise important considerations for the deployment of diagnostic advice in healthcare.
Document type: | Article (LMU) |
---|---|
Organizational unit (faculties): | 11 Psychology and Education > Department of Psychology |
DFG classification of research areas: | Humanities and Social Sciences |
Date of publication: | 12 Jun 2023 13:50 |
Last modified: | 07 Dec 2023 12:18 |
URI: | https://oa-fund.ub.uni-muenchen.de/id/eprint/759 |
DFG: | Funded by the German Research Foundation (DFG) - 491502892 |