Written by Jonathan Greig, Staff Writer
Jonathan Greig is a journalist based in New York City.
December 13, 2021 | Topic: Digital Health and Wellness
New York City recently announced an effort to end race adjustments in the clinical algorithms used by the city’s hospitals. Experts say these adjustments perpetuate racist assumptions in healthcare and can lead to substandard care for minority populations.
“The concept of biological differences based on race existed long before AI, but this technology has been weaponized to exacerbate these racist practices. There are so many algorithms in place that mostly work in a black box with little-to-no transparency into how they make their decisions. In order to tear down racist structures across our society we need to open up these algorithms to find what other racist beliefs are a part of their calculations,” Seeley George said.
“We need all of our health systems to do this work, we need legislation that stops harmful algorithms from being adopted, and in New York City it must go beyond one clinical algorithm (which is their current scope). And these efforts must also include the real people who have been harmed by these algorithms as well as human rights organizations and experts who can give critical insight into the harms of these algorithms on real people.”
Dr. Danya Glabau, assistant professor at the NYU Tandon School of Engineering and director of Science and Technology Studies in the university’s Department of Technology, Culture, and Society, said “algorithms” in the context of medicine have a much longer history than the computerized systems the council seeks to reform.
Medical algorithms, according to Glabau, are essentially any kind of decision tree doctors use to make treatment decisions, and using non-computerized medical algorithms has been a cornerstone of evidence-based medicine for decades.
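To make the idea of a “race adjustment” concrete, here is a minimal sketch using the published 2009 CKD-EPI kidney-function (eGFR) equation, probably the most widely cited example of such a coefficient; the article does not confirm which specific algorithm the city’s effort targets, and this code is purely illustrative, not clinical software.

```python
# Illustrative sketch: the 2009 CKD-EPI eGFR equation, a widely cited
# example of a clinical formula that included a race coefficient.
# Constants follow the published 2009 equation; for illustration only,
# not for clinical use.

def egfr_ckd_epi_2009(serum_creatinine_mg_dl: float, age: int,
                      female: bool, black: bool) -> float:
    """Estimated glomerular filtration rate (mL/min/1.73 m^2)."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = serum_creatinine_mg_dl / kappa

    egfr = (141
            * min(ratio, 1.0) ** alpha
            * max(ratio, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        # The race adjustment at issue: identical lab values yield a
        # roughly 16% higher estimate for Black patients, which can delay
        # referral for specialist care or transplant listing.
        egfr *= 1.159
    return egfr

# Same creatinine, age, and sex; only the race flag differs.
print(egfr_ckd_epi_2009(1.4, 60, female=False, black=False))  # ~54
print(egfr_ckd_epi_2009(1.4, 60, female=False, black=True))   # ~63
```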
Automated algorithms, however, take the physician’s judgment out of treatment decisions to a greater or lesser extent, because a computer makes the call. When digital algorithms were first rolled out, the thinking was that removing human judgment would also remove human racism, Glabau explained.
“However, since automated algorithms’ decisions are based on data from past human decisions, human biases like racism and classism still factor into these tools. So they don’t really solve the problem of racism on their own because the history of medicine is racist,” Glabau said.
“It’s hard to say exactly how widespread digital algorithms are and how many of them look at race in particular. But the chances are that most providers use several on a daily basis, and may or may not be aware of it.”
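The mechanism Glabau describes can be illustrated with a toy sketch: a model trained on biased historical decisions reproduces the bias even when race is never an input, because other recorded features act as proxies. Everything below is fabricated for demonstration and has no connection to any hospital system.

```python
# Toy illustration: training on biased past decisions bakes the bias into
# the model, even without race as a feature, via a proxy (here, a synthetic
# neighborhood group). All data is fabricated purely for demonstration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Synthetic patients: a clinical need score and a neighborhood group.
# In this toy setup, group membership is correlated with (unrecorded) race.
need = rng.normal(size=n)
group = rng.integers(0, 2, size=n)          # 0 or 1

# Historical human decisions: care was granted based on need, but patients
# in group 1 were systematically less likely to receive it.
granted = (need + rng.normal(scale=0.5, size=n) - 0.8 * group) > 0

# Train on those past decisions; the bias is baked into the labels.
X = np.column_stack([need, group])
model = LogisticRegression().fit(X, granted)

# The learned model recommends care less often for group 1 at the same need.
same_need = np.array([[0.5, 0], [0.5, 1]])
print(model.predict_proba(same_need)[:, 1])  # lower probability for group 1
```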
Researchers like Amy Moran-Thomas have shown that even simple devices like pulse oximeters can have racist outcomes.
“In this case, designers simply did not consider how skin color would affect the readings given by an optical sensor. We also know that tools like scheduling software can have racist outcomes even though scheduling doesn’t seem to have anything to do with race. But Black patients in particular were double or triple booked because many had difficulties making it to appointments on time due to factors outside of their control,” Glabau said.
“These examples show how tricky it can be to anticipate how algorithmic and other digital systems will have racist outcomes. In a city like New York, where COVID has hit BIPOC communities hard and where zip code is correlated with income and racial segregation, such seemingly mundane technologies can have significant consequences for health.”
But the only way for the council to succeed is for it to be given full access to hospital system operations, software, and technical documentation from the companies that produce the algorithms, Glabau added. The council also needs the authority to issue binding guidelines that can be implemented across the city.
“If this council isn’t given teeth, it may find shocking information or make well-intended recommendations, but it will not change anything for patients or accomplish its anti-racist mandate,” Glabau explained.
Glabau has written extensively about how mundane technologies can have a significant effect on health outcomes.