Faculty members offer guidance to help implement new federal standards set to take effect later this year
Many physicians now use clinical decision-making algorithms, such as heart disease risk assessment tools, that take a patient’s sex into account. Reliance on these algorithms may result in men and women receiving different care or having different eligibility for healthcare resources, such as placement on an organ transplant list.
Yet despite their widespread use, the clinical community has never defined standards for determining when relying on sex-inclusive algorithms is medically appropriate. New federal healthcare regulations aimed at preventing discrimination in clinical algorithms also take effect later this year, but the government has not provided guidance on when including biological sex is unlawful.
This dearth of guidance creates a quandary for healthcare organizations: which sex-inclusive algorithms are fair and legal to keep using? Without clear standards, many organizations may stop using these algorithms altogether to avoid legal liability, and that withdrawal could harm patients.
To help address this issue, faculty members from the University of Maryland School of Medicine (UMSOM) and the University of Maryland Francis King Carey School of Law have co-authored a new framework to guide the medical establishment in evaluating the inclusion of sex in these algorithms.
The article was published on January 8 in The New England Journal of Medicine.
“When sex appears in an algorithm, its presence tells us little about why risk differs between men and women. Deciding the legality and appropriateness of sex’s inclusion requires going beyond the math to probe the ‘why’ for inclusion,” said lead author Katherine E. Goodman, PhD, JD, Assistant Professor of Epidemiology and Public Health at the University of Maryland School of Medicine and a faculty member at the University of Maryland Institute for Health Computing. “In our view, when risk differs primarily due to biological factors, it is appropriate and lawful to take those factors into account in algorithmic decision-making. But if risk differs between men and women for nonbiological reasons, such as sex-based stereotypes or unconscious biases in medical treatment, that can make adjusting algorithmic predictions for sex unfair and likely unlawful.”
Dr. Goodman and her colleagues propose a framework to guide algorithm developers and the healthcare providers who use these tools in daily practice. The framework poses several questions, such as whether including sex in a tool is “prognostically necessary,” meaning the tool would be less accurate without considering a patient’s sex. Other questions include why risk and outcomes are believed to differ between male and female patients and whether keeping sex in a particular algorithm would “penalize” the sex disadvantaged by bias or stereotypes.
"This framework will help healthcare systems improve outcomes and should save more lives and reduce morbidity by ensuring that when sex is considered in clinical algorithms, it's being used appropriately and not based on biases or stereotypes," said Diane Hoffmann, JD, MSc, Jacob A. France Professor of Health Law at the University of Maryland Francis King Carey School of Law and director of the law school's Law & Health Care Program. "With new HHS guidelines going into effect this May, healthcare providers need clear guidance on when and how to consider sex in their clinical decision-making tools."
The other authors of the commentary are Jennifer Blumenthal-Barby, PhD, of Baylor College of Medicine, and Rita Redberg, MD, of the University of California, San Francisco.