Researchers discovered that an algorithm widely used by hospitals to decide which patients receive extra care is racially biased.
“We found that a category of algorithms that influences health care decisions for over a hundred million Americans shows significant racial bias,” said Sendhil Mullainathan, senior author of a study published Friday in the academic journal Science.
Black patients assigned the same level of risk by the algorithm were sicker than white patients, according to the study.
In fact, the researchers estimated that this racial bias reduces the number of Black patients identified for extra care by more than half. “Bias occurs because the algorithm uses health costs as a proxy for health needs,” they wrote in the study.
Because less money is spent on Black patients, the algorithm falsely concludes that they are healthier than “equally sick white patients,” the researchers wrote.
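To make the proxy problem concrete, here is a minimal sketch on synthetic data. The spending gap (the access factor), the patient counts and the score band are hypothetical numbers chosen for illustration, not figures from the study; the point is only that a score trained on cost inherits any gap between spending and true need:

```python
import random

random.seed(0)

def make_patient(group):
    """Return (true_need, observed_cost) for one synthetic patient."""
    need = random.uniform(0.0, 10.0)             # underlying illness burden
    access = 0.6 if group == "black" else 1.0    # hypothetical spending gap
    cost = need * access + random.gauss(0, 0.5)  # spending tracks need AND access
    return need, cost

patients = [(group, *make_patient(group))
            for group in ("white", "black") for _ in range(50_000)]

# A model trained on cost can at best recover observed cost, so use cost
# itself as the "risk score" and compare true need at equal scores.
band = [(group, need) for group, need, cost in patients if 5.5 <= cost <= 6.5]
for group in ("white", "black"):
    needs = [need for g, need in band if g == group]
    print(f"{group}: mean true need at the same risk score = "
          f"{sum(needs) / len(needs):.2f}")
```

In this toy setting, patients from the group with suppressed spending land in the same score band only when they are substantially sicker, mirroring the study’s finding that Black patients assigned the same risk level were sicker than white patients.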
The study was a collaboration among researchers Ziad Obermeyer, Brian Powers, Christine Vogeli and Sendhil Mullainathan, spanning several public health and academic institutions.
Those included the University of California, Berkeley’s School of Public Health, Brigham and Women’s Hospital in Boston, the Mongan Institute Health Policy Center and the University of Chicago’s Booth School of Business.
“Because of the structural inequalities in our health care system, blacks at a given level of health end up generating lower costs than whites,” Obermeyer said in a Booth School of Business news release about the research. “As a result, black patients were much sicker at a given level of the algorithm’s predicted risk.”
The researchers also reported that reformulating the algorithm so that cost is no longer a proxy for need “eliminates the racial bias in predicting who needs extra care.”
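One way to picture that reformulation, continuing the synthetic example above (again with hypothetical numbers, and a noisy “chronic conditions” count as an illustrative stand-in for a direct health measure), is to keep everything else fixed and swap the ranking label from cost to health:

```python
import random

random.seed(0)

def make_patient(group):
    """Return (true_need, cost_label, health_label) for one synthetic patient."""
    need = random.uniform(0.0, 10.0)
    access = 0.6 if group == "black" else 1.0
    cost = need * access                      # biased label: carries the spending gap
    conditions = need + random.gauss(0, 0.5)  # health label: tracks need only
    return need, cost, conditions

patients = [(group, *make_patient(group))
            for group in ("white", "black") for _ in range(50_000)]

def flagged_shares(label_index):
    """Share of each group among the sickest ~3% when ranked by one label."""
    ranked = sorted(patients, key=lambda p: p[label_index], reverse=True)
    top = ranked[: len(ranked) // 33]
    return {g: sum(1 for p in top if p[0] == g) / len(top)
            for g in ("white", "black")}

print("ranked by cost:      ", flagged_shares(2))  # Black patients under-flagged
print("ranked by conditions:", flagged_shares(3))  # shares roughly equalize
```

Ranked by cost, almost no one from the under-spent group is flagged for extra care; ranked by the health label, the two groups are flagged at roughly equal rates, which is the sense in which retargeting the label removes the bias in this toy setting.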