As Covid vaccines drew closer to reality, there was a lot of discussion about who should be prioritised. In a dystopian world it might have been decided by lottery, like in the movie Contagion. Thankfully, it wasn't so.
Stanford University decided to use an algorithm to prioritise. It placed administrators and physicians working from home ahead of frontline workers, though the policy was swiftly reversed. The administrators squarely blamed the algorithm. Was it the algorithm's fault? Caitlin argues it wasn't.
Algorithms are designed, created, implemented, and tested by people. If algorithms aren’t performing appropriately, responsibility lies with the people who made them.
I won’t spoil the article for you, but the culprit was the final “human-in-the-loop” check that the administration skipped in an attempt to iterate fast.