
Black boxes all the way down | The Cat Herder, Volume 3, Issue 49

December 20 · Issue #113
The Cat Herder
This is the last Cat Herder of 2020. The first Cat Herder of 2020 had a quick summary of the main themes running through 2019. Locally, the Public Services Card and its underlying biometric database, and “the peculiar decision of the state’s investment arm to throw a lot of money at a privately owned DNA harvesting company”. Internationally, growing alarm at the use of facial recognition and the realisation that internet-connected devices with microphones and cameras could listen to and watch you.
2020 saw bumper profits for data harvesting firms, Google and Apple showing just how much power comes from owning the operating systems the world’s smartphones run on and a belated realisation that you can’t app your way out of a pandemic.
2021 is likely to bring more of the same. There’ll be a focus on vaccination or immunisation passports. There will be misuse of these. Probably more comedy appearances in court from the Department of Employment Affairs and Social Protection. Possibly the conclusion of the second DPC investigation into the Public Services Card. Multiple issues with workplace monitoring as many people’s homes will remain their workplaces for the foreseeable future. Increasing concerns over excessive and unnecessary surveillance through edtech and remote proctoring tools. Continued and increased use of facial recognition and thermal scanning ‘solutions’ to provide or deny access to locations.
Thank you all for reading through the year and have a richly deserved break. See you in 2021.
😼

Your Credit Score Should Be Based on Your Web History, IMF Says
The COVID-19 pandemic is unprecedented in our lifetimes, but there are lessons we can learn from the past. In 2009, the H1N1 (“swine flu”) vaccination rollout was plagued with inequitable access. With supply potentially limited for COVID-19 vaccinations for the next 6 months, more of the same can occur. A digitized system based on proof of immunization will amplify the lack of access.
Vaccine Passports: A Stamp of Inequity | Electronic Frontier Foundation
Oh yes we did
The European Commission approved Google’s acquisition of Fitbit, subject to some conditions. The main condition appears to be that Google has promised not to use health data gleaned from Fitbit users to target them with ads. Which has more than a hint of the Commission building a Maginot Line: successfully fighting the last war.
The broader point is that such behavioural remedies are much harder to enforce than straightforward prohibitions. Companies are often two steps ahead: the commission in 2017 fined Facebook 110 million euros for providing what it regarded as inaccurate information as part of a 2014 merger review. According to the commission, the social-media giant said it couldn’t automatically link users’ WhatsApp and Facebook profiles, but then went on to do precisely that. The risk is that, by the time Vestager finds loopholes in her Google-Fitbit safeguards, the damage will already have been done.
Breakingviews - EU’s Google-Fitbit approval sets risky precedent | Reuters
“The DPC has imposed an administrative fine of €450,000 on Twitter as an effective, proportionate and dissuasive measure.” The DPC’s first fine of a major multinational finally landed after wending its way through the EDPB’s dispute resolution process.
Of particular interest to students of regulator Kremlinology is that the DPC’s decision (direct link to PDF) and the binding decision of the EDPB in the Article 65 resolution process (direct link to PDF) were both published in full by the EDPB.
The Swedish Data Protection Authority issued an administrative fine of SEK 300,000 (~€30,000) against a housing company for unlawful video surveillance in an apartment building.
The Swedish DPA also fined Umeå University SEK 550,000 (~€55,000) for processing special categories of personal data without applying appropriate technical and organisational measures to protect the data.
The EDPB published draft guidelines on restrictions of rights under Article 23 of the GDPR, which are open for comment until 12 February 2021.
Reclaim Your Face
This week in NL 🇳🇱 the data protection authority clearly states that public "face recognition is forbidden." 🚫 It "turns us into walking barcodes," the DPA adds. @bitsoffreedom and the #ReclaimYourFace coalition welcome this enforcement of EU law.
https://t.co/YmvtUg1XLh
Bumper year-end edition. Some of these have featured in previous issues of this newsletter.
In recent months, these garden walls have been upgraded to safeguard not only the “spiritual” but the physical health of inhabitants. And in the eyes of authorities, the two are increasingly one and the same. In May, proactive officials in Hangzhou proposed expanding the health code system and integrating it even more into daily life. Citizens would be assigned a health score between 0 and 100 that would fluctuate based on lifestyle choices: 15,000 daily steps could boost a score by 5 points, while 200 milliliters of baijiu (a noxious Chinese liquor) could mean a reduction of 1.5 points. In September, officials in the neighboring city of Suzhou followed suit, introducing plans for a “civility code” that would apply a similar rubric to activities, such as doing volunteer work and jaywalking. Both plans were quickly retracted in the wake of a public backlash, but the technology needed to implement them is already in place, and most crucially, so is the vision of this future world.
Yi-Ling Liu: ‘Returning to China’s walled garden’, Rest of World
In the 1990s, insurers began using external data sources like credit scores to predict accident risk. Since then, rate filings have become increasingly filled with proprietary, opaque algorithms, according to regulators.
Gennady Stolyarov II, a lead actuary at the Nevada Division of Insurance, said all this secrecy and complexity leaves drivers in the dark about how to keep their rates low.
“If a behavior in another sphere of life affects insurance premiums in a way that consumers can’t readily anticipate,” he said in an interview, “that could lead to a cascade of financial consequences arising from what seems to be an innocuous decision.”
Yet regulators play a role in helping insurers keep what they’re doing out of the public eye. Rules vary by state, but insurance companies don’t always submit full details of their pricing algorithms to regulators unless those documents are specifically requested. And insurance companies at times file documents with confidential attachments, blocked from public disclosure due to trade secrets rules.
This ProPublica investigation is worth reading in the light of this week’s finding by the Central Bank of Ireland that the majority of insurers in Ireland operate so-called ‘differential pricing’.
“The fundamental issue,” says Wizner, “is simply that surveillance used to be expensive, and now it’s cheap. That’s something that we have to confront, centrally, as one of the main challenges of our time. It used to be that our privacy was protected more by cost than by law, but that cost protection is gone. If governments wanted to know where you were, a generation ago, they had to assign a team of agents to track you 24 hours a day. There was no real legal barrier to doing that, but there was a huge resource barrier. There had to be a pretty good reason for it. Now, our technological systems are passively collecting all of this intimate information about all of us. The cost of storing it, forever, has plunged from being very expensive to almost trivially cheap.”
Will Dunn talks to Ben Wizner: ‘Databases of Ruin’, New Statesman
Your conflicted Facebook-using friends and relations are living in what frustrated security researchers call the “privacy paradox”: most people will swear up and down that privacy is important to them, and then will continue to share their personal information widely on the Internet.
This is not because they are stupid. It is because they believe that they live in a dark, howling Internet panopticon from which they cannot escape. (I’m exaggerating, but only kinda).
This 2016 focus-group study found that young people were aware of the risks of sharing their information online. They just didn’t think they could do anything about those risks: they felt that “privacy violations are inevitable and opting out is not an option.” They’ve fallen prey to privacy cynicism, which is defined rather succinctly by these researchers as “an attitude of uncertainty, powerlessness and mistrust towards the handling of personal data by online services, rendering privacy protection behavior subjectively futile.”
Sandra Smith waited more than a year for a spot in public housing in her hometown, Jacksonville, Fla. After a divorce, she had been staying with friends and her mother. Unemployed while recovering from health problems, Smith was relieved she’d finally be able to move back out of her teenage bedroom.
“I’m 55,” she said. “Having my own space is something I haven’t had in a long time, and I felt like I was ready for that.”
But the housing authority turned her down after a background check reported a 2013 eviction for a different Sandra Smith.
Lauren Kirchner and Matthew Goldstein: ‘Access Denied: Faulty Automated Background Checks Freeze Out Renters’, ProPublica & The New York Times
Deng: One thing that surprised me was how fast the number of surveillance cameras was growing. When I returned to the street two weeks after my first visit, more cameras had appeared. I had to adjust my route to take them into account.
It’s now almost impossible to be completely unseen by the cameras. The best we can do is avoid having our faces scanned.
“All these stories involve individuals or entire communities whose personhood is threatened by systems that are not designed by them or even keeping them in mind. These are systems that have become embedded in existing systems, creating entanglements that have evolved faster than the law or policymakers can keep up with. These software spaces will increasingly be home to what Madeleine Clare Elish calls moral crumple zones, instances where responsibility and liability are obscured in complex, automated systems.
We have crossed the uncanny valley of digital personhood, and if we want to have agency over that personhood, we have to have agency over our data.”




Endnotes & Credits
Find us on the web at myprivacykit.com and on Twitter at @PrivacyKit. Of course we’re not on Facebook or LinkedIn.
If you know someone who might enjoy this newsletter do please forward it on to them.
If you don't want these updates anymore, please unsubscribe here.
If you were forwarded this newsletter and you like it, you can subscribe here.
Powered by Revue
Privacy Kit, Made with 💚 in Dublin, Ireland