2023 Author: Bryan Walter | [email protected]. Last modified: 2023-05-21 22:24
One of the algorithms used in American hospitals to prioritize patients for treatment has turned out to be racially biased: it assigns black and white patients the same level of risk for developing certain conditions, even though the black patients are, on average, significantly sicker. The cause is that the algorithm uses spending on medical services as a proxy for patients' health. The work is described in an article published in Science.
One of the obvious advantages of using automated systems for analyzing information is the absence of the human factor, which very often affects the result. A striking example is the "satisfaction of search" error, well known in medical diagnostics: having found, say, one fracture on an X-ray, a specialist may fail to notice a second. Computer vision algorithms are not prone to this error.
Another example is the analysis of people's profiles, for instance when screening job applicants or allocating medical care. In theory, algorithms that specialize in such analysis should be unbiased and free of discrimination on any grounds. In practice, however, such algorithms are still trained on data collected by people, and that cannot be avoided: recall, for example, Amazon's résumé-screening algorithm, which was accused of sexism.
Moreover, bias can arise even when the algorithm does not directly use parameters such as gender or race. This is shown in a new study led by Ziad Obermeyer of the University of California, Berkeley. The researchers analyzed data on more than 50 thousand patients used by a medical computer program that automatically recommends treatment for chronic diseases.
The authors found that, at the same risk score assigned by the algorithm, black patients had 26.3 percent more chronic diseases than white patients: in other words, their health in the sample was much worse. Among specific conditions, black patients had higher blood pressure, higher cholesterol, and more severe diabetes. Notably, this pattern held across all risk scores: at the same assigned risk of developing a given condition, of disease, or of death, black patients' actual health was considerably worse. As a result, comparatively healthier white patients received additional care more often than sicker black patients.
Interestingly, the algorithm does not take race into account at all, yet the bias still arises. Analyzing how the algorithm works, the scientists found that, when calculating disease risk and the need for treatment, the system relies primarily on the cost of care. On the one hand, this is reasonable: severe conditions require more expensive treatment. On the other hand, lower medical spending also reflects patients' socioeconomic status and standard of living, which may be lower for black patients.
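The mechanism can be illustrated with a toy simulation (all numbers here are invented for illustration and are not from the study): two groups of patients have the same distribution of actual disease burden, but one group systematically spends less on care. A risk score built purely on cost then understates that group's need.

```python
import random

random.seed(0)

def simulate(n=10000, spending_factor=1.0):
    """Generate (disease_burden, annual_cost) pairs for n patients.

    Spending roughly tracks the latent disease burden, but is scaled
    down for groups with less access to care (spending_factor < 1).
    """
    patients = []
    for _ in range(n):
        burden = random.randint(0, 10)  # number of chronic conditions
        cost = burden * 1000 * spending_factor + random.gauss(0, 500)
        patients.append((burden, cost))
    return patients

group_a = simulate(spending_factor=1.0)  # full access to care
group_b = simulate(spending_factor=0.7)  # systematically lower spending

# A cost-based "risk score": flag patients whose spending exceeds the
# same threshold in both groups.
threshold = 5000
flagged_a = [burden for burden, cost in group_a if cost > threshold]
flagged_b = [burden for burden, cost in group_b if cost > threshold]

avg_a = sum(flagged_a) / len(flagged_a)
avg_b = sum(flagged_b) / len(flagged_b)

print(f"avg disease burden among flagged patients: "
      f"group A {avg_a:.1f}, group B {avg_b:.1f}")
# Patients in group B who clear the same cost threshold carry a higher
# actual disease burden, and fewer of them get flagged at all: the
# cost proxy understates the lower-spending group's medical need.
```

Even though the simulated score never looks at group membership, the lower-spending group must be sicker to reach the same cost threshold, which is exactly the kind of proxy-driven bias the study describes.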
The paper itself does not name the algorithm. However, The Washington Post reports that it is a product of the healthcare company Optum; the editorial note in Science says the same. The authors note that they had no contact with the company's representatives before publication, but add that they subsequently managed to work with the developers and reduce the racial bias in the algorithm's output by 84 percent.
Algorithmic bias that stems from bias in the underlying data can sometimes even be put to good use: last year, for example, scientists used a large corpus of texts to track how attitudes toward women and Asians have changed over time.