Stigma, biomarkers, and algorithmic bias: Recommendations for precision behavioral health with artificial intelligence

Colin G. Walsh, Beenish Chaudhry, Prerna Dua, Kenneth W. Goodman, Bonnie Kaplan, Ramakanth Kavuluru, Anthony Solomonides, Vignesh Subbian

Research output: Contribution to journal › Review article › peer-review

39 Scopus citations

Abstract

Effective implementation of artificial intelligence in behavioral healthcare delivery depends on overcoming challenges that are pronounced in this domain. Self and social stigma contribute to under-reported symptoms, and under-coding worsens ascertainment. Health disparities contribute to algorithmic bias. Lack of reliable biological and clinical markers hinders model development, and model explainability challenges impede trust among users. In this perspective, we describe these challenges and discuss design and implementation recommendations to overcome them in intelligent systems for behavioral and mental health.

Original language: English
Pages (from-to): 9-15
Number of pages: 7
Journal: JAMIA Open
Volume: 3
Issue number: 1
DOIs
State: Published - 2020

Bibliographical note

Funding Information:
Authors’ efforts were partially supported by the following grants: grant #W81XWH-10-2-0181 and R01 MH116269-01 (CGW); the National Institute of General Medical Sciences of the National Institutes of Health under grant #P20 GM103424-17 (PD); the U.S. National Center for Advancing Translational Sciences under grant #UL1TR001998 (RK); and the National Science Foundation under grant #1838745 (VS).

Publisher Copyright:
© The Author(s) 2020.

Keywords

  • Algorithms
  • Artificial intelligence
  • Behavioral health
  • Ethics
  • Health disparities
  • Mental health
  • Precision medicine
  • Predictive modeling

ASJC Scopus subject areas

  • Health Informatics
