Background: Automated conversational agents, or chatbots, can reinforce evidence-based guidance delivered through other media and offer an accessible, individually tailored channel for public engagement. In early-to-mid 2021, young adults and minority populations disproportionately affected by COVID-19 in the United States were more likely to be hesitant toward COVID-19 vaccines, citing concerns regarding vaccine safety and effectiveness. Successful chatbot communication requires a purposeful understanding of user needs.
Objective: We aimed to assess the acceptability of messages to be delivered by a chatbot named VIRA, developed at Johns Hopkins University. The study investigated which message styles were preferred by young, urban-dwelling Americans and by public health workers, since we anticipated that the latter would use the chatbot as a job aid.
Methods: We conducted 4 web-based focus groups with 20 racially and ethnically diverse young adults aged 18-28 years and public health workers aged 25-61 years living in or near eastern US cities. We tested 6 message styles, asking participants to select a preferred response style for a chatbot answering common questions about COVID-19 vaccines. We transcribed the discussions and coded and categorized emerging themes related to message content, style, and framing.
Results: Participants preferred messages that began with an empathetic reflection of a user concern and concluded with a straightforward, fact-supported response. Most participants disapproved of moralistic or reasoning-based appeals to get vaccinated, although public health workers felt that such strong statements appealing to communal responsibility were warranted. Responses that used humor or testimonials did not appeal to participants.
Conclusions: To foster credibility, chatbots targeting young people with vaccine-related messaging should aim to build rapport with users by deploying empathetic, reflective statements, followed by direct and comprehensive responses to user queries. Further studies are needed to inform the appropriate use of user-customized testimonials and humor in the context of chatbot communication.
Keywords: AI; COVID-19; artificial intelligence; chatbots; conversational agent; digital health; health communication; infodemic; infodemiology; misinformation; natural language processing; online health information; public health; social media; user need; vaccination; vaccine communication; vaccine hesitancy.
©Rose Weeks, Lyra Cooper, Pooja Sangha, João Sedoc, Sydney White, Assaf Toledo, Shai Gretz, Dan Lahav, Nina Martin, Alexandra Michel, Jae Hyoung Lee, Noam Slonim, Naor Bar-Zeev. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 06.07.2022.