As breast screening services move towards use of healthcare AI (HCAI) for screen reading, research on public views of HCAI can inform more person-centred implementation. We synthesise reviews of public views of HCAI in general, and review primary studies of women's views of AI in breast screening. People generally appear open to HCAI and its potential benefits, despite a wide range of concerns; similarly, women are open towards AI in breast screening because of its potential benefits, but are concerned about a wide range of risks. Women want radiologists to remain central; oversight, evaluation and performance, care, equity and bias, transparency, and accountability are key issues; and women may be less tolerant of AI error than of human error. Using our recent Australian primary study, we illustrate both the value of informing participants before collecting data and women's views. The 40 screening-age women in this study stipulated four main conditions for breast screening AI implementation: 1) maintaining human control; 2) demonstrating strong evidence of performance; 3) supporting familiarisation with AI; and 4) providing adequate reasons for introducing AI. Three solutions were offered to support familiarisation: transparency and information; slow and staged implementation; and allowing women to opt out of AI reading. We provide recommendations to guide both the implementation of AI in healthcare and research on public views of HCAI. Breast screening services should be transparent about AI use and share information about breast screening AI with women. Implementation should be slow and staged, with opt-out options provided where possible. Screening services should demonstrate strong governance to maintain clinician control, demonstrate excellent AI system performance, ensure data protection and bias mitigation, and give good reasons to justify implementation. When these measures are put in place, women are more likely to see HCAI use in breast screening as legitimate and acceptable.