Background: It is increasingly recognised that the success of artificial intelligence-based clinical decision support (AI/CDS) tools will depend on physician and patient trust, but factors impacting patients' views on clinical care reliant on AI have been less explored.
Objective: This pilot study explores whether, and in what contexts, the level of detail of explanation provided about AI/CDS tools impacts patients' attitudes toward the tools and their clinical care.
Methods: We designed a Sequential Multiple Assignment Randomized Trial (SMART) vignette-based web survey. Participants, recruited through Amazon Mechanical Turk, were presented with hypothetical vignettes describing health concerns and were sequentially randomised along three factors: (1) the level of detail of explanation regarding an AI/CDS tool; (2) the AI/CDS result; and (3) the physician's level of agreement with the AI/CDS result. We compared mean ratings of comfort and confidence by level of detail of explanation using t-tests. Regression models were fitted to confirm the conditional effects of detail of explanation.
Results: The level of detail of explanation provided about the AI/CDS tools was positively associated with respondents' comfort and confidence in the use of the tools and with their perception of the physician's final decision. The effect of detail of explanation on perception of the physician's final decision varied with the AI/CDS result and with the physician's agreement or disagreement with that result.
Conclusions: More information provided by physicians about the use of AI/CDS tools may improve patient attitudes toward healthcare involving such tools, both in general and in particular contexts defined by the AI/CDS result and the physician's level of agreement with it.
Keywords: Ethics; Ethics, Medical; Ethics, Research; Information Technology; Quality of Health Care.
© Author(s) (or their employer(s)) 2024. No commercial re-use. See rights and permissions. Published by BMJ.