Crowdsourcing medical expertise in near real time

J Hosp Med. 2014 Jul;9(7):451-6. doi: 10.1002/jhm.2204. Epub 2014 Apr 17.

Abstract

Given the pace of discovery in medicine, accessing the literature to make informed decisions at the point of care has become increasingly difficult. Although the Internet provides unprecedented access to information, gaps in the medical literature and inefficient searches often leave healthcare providers' questions unanswered. Advances in social computation and human-computer interaction offer a potential solution to this problem. We developed and piloted the mobile application DocCHIRP, which uses a system of point-to-multipoint push notifications designed to help providers solve problems by crowdsourcing answers from their peers. Over the 244-day pilot period, 85 registered users logged 1544 page views and sent 45 consult questions. The median time to first response from the crowd was 19 minutes. Review of the transcripts revealed several dominant themes, including complex medical decision making and inquiries related to prescription medication use. Feedback from the post-trial survey identified potential hurdles to medical crowdsourcing, including a reluctance to expose personal knowledge gaps and the potential risk of "distracted doctoring." Users also suggested program modifications that could support future adoption, including changes to the mobile interface and mechanisms to expand the crowd of participating healthcare providers.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Aged
  • Clinical Competence / standards*
  • Computer Systems*
  • Cross-Sectional Studies
  • Crowdsourcing / instrumentation
  • Crowdsourcing / methods*
  • Female
  • Humans
  • Internet* / instrumentation
  • Male
  • Middle Aged
  • Pilot Projects