Let’s be open about who gets vaccinated first

Be as transparent as possible, to keep the discussion about priorities out in the open; hiding behind opaque algorithms is a terrible idea.

Cathy O’Neil

Who goes first? Deciding who gets the COVID-19 vaccine — and who has to wait — will be no easy task.

Essentially, authorities around the world are building myriad algorithms to assess who is important, who is most at risk, and how to balance those considerations.

I have one piece of advice: Be as transparent as possible, to keep the discussion about priorities out in the open.

Secrecy breeds mistrust, frustration and snafus. Last week, Stanford Medical School had to do damage control after its hastily designed algorithm bypassed frontline hospital workers, prompting protests including slogans such as “The algorithm sucks!”

The U.S. government has also raised concerns by entrusting the job of allocating vaccine across the country to the opaque Tiberius software, developed by Peter Thiel’s company Palantir.

Conflicts are inevitable in such a vast and complex undertaking. The distribution of doses must go through at least three levels, starting at the federal government: to the states, to institutions such as hospitals and nursing homes, and ultimately to actual people.


So there will be algorithms at every one of those levels, and innumerable ways they can go wrong.

At the federal level, Palantir seems to be doing all right so far — if you believe a Department of Defense report persuasively named “Pro Rata Vaccine Distribution is Fair, Equitable.”

As high-level plans go, sending more doses to states that have more people seems pretty straightforward, especially in this age of partisan divide.

But even such an apparently simple approach can perpetuate inequalities: The Census data on which it relies, for example, is known to undercount Black men, who are among the groups hardest hit by COVID-19.
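To see how that bias propagates, consider a rough sketch of what pro-rata allocation amounts to: proportional scaling off the population counts. The function and the population figures below are illustrative placeholders, not the actual formula or data behind Tiberius.

```python
# Illustrative sketch of pro-rata dose allocation: each state's share of the
# national supply matches its share of the counted population. The populations
# here are made-up placeholders, not real census figures.

def allocate_pro_rata(total_doses: int, populations: dict[str, int]) -> dict[str, int]:
    total_pop = sum(populations.values())
    # Exact proportional shares, which are generally not whole numbers.
    exact = {s: total_doses * p / total_pop for s, p in populations.items()}
    # Round down, then hand out leftover doses by largest fractional remainder.
    alloc = {s: int(x) for s, x in exact.items()}
    leftovers = total_doses - sum(alloc.values())
    for s in sorted(exact, key=lambda k: exact[k] - alloc[k], reverse=True)[:leftovers]:
        alloc[s] += 1
    return alloc

print(allocate_pro_rata(1_000_000, {"A": 39_500_000, "B": 29_000_000, "C": 8_900_000}))
# If the census undercounts a group concentrated in state C, state C's count
# shrinks and so does its allocation: the bias flows straight into the formula.
```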

The Centers for Disease Control and Prevention has been tweaking its recommendations on whom to prioritize.

It has, for example, divided the 65+ population into two groups — 65-74 and 75+ — and given higher priority to the latter, sparking heated debate in realms such as Twitter. Although this might not look like an algorithm, it is: An algorithm is anything that automates a process, especially by risk-scoring individuals and thereby ranking them, which is exactly what vaccination schedules require.
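To make that concrete, here is a minimal sketch of what “risk-scoring and ranking” looks like as code. The attributes and weights are invented for illustration; they are not the CDC’s actual criteria.

```python
# Minimal sketch of prioritization as an algorithm: score each person, then rank.
# The fields and weights below are hypothetical, chosen only to show the shape
# of the calculation.
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    age: int
    frontline_worker: bool
    high_risk_condition: bool

def priority_score(p: Person) -> int:
    score = 0
    if p.age >= 75:
        score += 3        # highest-priority age band
    elif p.age >= 65:
        score += 2
    if p.frontline_worker:
        score += 3
    if p.high_risk_condition:
        score += 1        # this weight is a value judgment, not a fact
    return score

people = [
    Person("A", 79, False, False),
    Person("B", 40, True, False),
    Person("C", 68, False, True),
]
# The vaccination "schedule" is nothing more than this ranking.
for p in sorted(people, key=priority_score, reverse=True):
    print(p.name, priority_score(p))
```

Every line of that sketch encodes a human decision: where the age cutoffs fall, how much a health condition counts, who gets classified as “frontline.”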

If we had perfect information, we could theoretically put clusters of people in line with the aim of saving the greatest number of lives.

It’s pretty clear that hospital workers, elderly people, and frontline workers should come before younger people who can stay home for a couple more months, even if they don’t want to.

Beyond that are people not on the frontlines but at higher risk, such as those with health conditions.

There’s plenty of room for judgment and argument here: I doubt, for example, that obesity will be prioritized, even though obese people are at much greater risk of dying from COVID-19.

Big questions abound. Can vaccinated people still transmit the virus? If so, a lot changes.

One example: Instead of vaccinating nursing-home workers on the assumption that they will no longer present a threat to residents, it might make more sense to vaccinate the residents first. How should second-order effects be weighted?

If, for instance, policy makers want to help some of the hardest-hit demographic groups — such as Black women and Latinas — get back to work, they might want to prioritize vaccinating childcare workers.

The power of money and corporate interests adds further complication. Uber, Con Edison and trade unions all want to get their people vaccinated, and they have a fair amount of sway in the world of state politics.

Then of course there are rich individuals trying to pay to cut the line.

Given all the pressures and pitfalls, it’s easy to understand why officials might want to offload the decision making to a computer.

But algorithms aren’t objective and unbiased: They are human judgments and decisions embedded in code.

When the stakes are this high, those decisions need to be as transparent as possible. Let’s have our fights in the open, instead of not knowing what’s in the black box.

That calculus assumes we value the lives of 79-year-olds as much as those of 35-year-olds, which is easier to do now that the pandemic is expected to end relatively soon.

Cathy O’Neil is a Bloomberg Opinion columnist and a mathematician who has worked as a professor, hedge-fund analyst and data scientist.
