Identifying stigmatizing language in clinical documentation: A scoping review of emerging literature

PLoS One. 2024 Jun 28;19(6):e0303653. doi: 10.1371/journal.pone.0303653. eCollection 2024.

Abstract

Background: Racism and implicit bias underlie disparities in health care access, treatment, and outcomes. An emerging area of study in examining health disparities is the use of stigmatizing language in the electronic health record (EHR).

Objectives: We sought to summarize the existing literature on stigmatizing language documented in the EHR. To this end, we conducted a scoping review to identify, describe, and evaluate the current body of literature on stigmatizing language in clinician notes.

Methods: We searched PubMed, Cumulative Index to Nursing and Allied Health Literature (CINAHL), and Embase databases in May 2022, and conducted a hand search of IEEE to identify studies investigating stigmatizing language in clinical documentation. We included all studies published through April 2022. The results for each search were uploaded into EndNote X9 software, de-duplicated using the Bramer method, and then exported to Covidence software for title and abstract screening.

Results: Studies (N = 9) used cross-sectional (n = 3), qualitative (n = 3), mixed methods (n = 2), and retrospective cohort (n = 1) designs. Stigmatizing language was defined via content analysis of clinical documentation (n = 4), literature review (n = 2), interviews with clinicians (n = 3) and patients (n = 1), expert panel consultation, and task force guidelines (n = 1). Natural language processing (NLP) was used in four studies to identify and extract stigmatizing words from clinical notes. All of the studies reviewed concluded that negative clinician attitudes and the use of stigmatizing language in documentation could negatively impact patient perception of care or health outcomes.

Discussion: The current literature indicates that NLP is an emerging approach to identifying stigmatizing language documented in the EHR. NLP-based solutions can be developed and integrated into routine documentation systems to screen for stigmatizing language and alert clinicians or their supervisors. Potential interventions resulting from this research could generate awareness about how implicit biases affect communication patterns and work to achieve equitable health care for diverse populations.
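As a rough illustration of the kind of documentation screening the discussion envisions, the sketch below flags stigmatizing terms in a note with simple keyword matching. The term list and function are hypothetical examples for illustration only; they are not drawn from the reviewed studies, and a real system would derive its lexicon from content analysis, literature review, or clinician and patient interviews, as the included studies did.

```python
import re

# Hypothetical lexicon for illustration; not taken from the reviewed studies.
STIGMATIZING_TERMS = ["non-compliant", "drug-seeking", "frequent flyer", "refused"]

def screen_note(note: str) -> list[str]:
    """Return the lexicon terms found in a clinical note (case-insensitive)."""
    found = []
    for term in STIGMATIZING_TERMS:
        # \b word boundaries avoid matching inside longer words
        if re.search(r"\b" + re.escape(term) + r"\b", note, flags=re.IGNORECASE):
            found.append(term)
    return found

note = "Patient was non-compliant with medication and refused physical therapy."
print(screen_note(note))  # ['non-compliant', 'refused']
```

A screener like this could run at note-signing time and surface flagged terms to the clinician; the NLP approaches in the reviewed studies are more sophisticated than plain keyword matching, which cannot account for context or negation.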

Publication types

  • Review

MeSH terms

  • Documentation*
  • Electronic Health Records*
  • Humans
  • Language
  • Racism
  • Stereotyping

Grants and funding

Columbia University Data Science Institute Seed Funds Program (VB, MT, KC). https://datascience.columbia.edu/ The Gordon and Betty Moore Foundation (Grant number: GBMF9048) (VB, MT, KC). https://health.ucdavis.edu/nursing/NurseLeaderFellows/index.html The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.