That is concerning, the researchers say, because doctors use algorithms for help with decisions such as whether patients are candidates for chemotherapy or admission to an intensive care unit. These findings raise the possibility that the algorithms are “looking at your race, ethnicity, sex, whether you’re incarcerated or not—even if all of that information is hidden,” says coauthor Leo Anthony Celi, SM ’09, a principal research scientist at IMES and an associate professor at Harvard Medical School.
Celi thinks clinicians and computer scientists should turn to social scientists for insight. “We need another group of experts to weigh in and to provide input and feedback on how we design, develop, deploy, and evaluate these algorithms,” he says. “We need to also ask the data scientists, before any exploration of the data: Are there disparities? Which patient groups are marginalized? What are the drivers of those disparities?”
Algorithms often have access to information that humans do not, which means experts must work to understand their unintended consequences. Otherwise, there is no way to keep the algorithms from perpetuating the existing biases in medical care.