Bark Like A Dog | The Cat Herder, Volume 4, Issue 4

January 31 · Issue #117
Secret settlements and stolen health data, biometric data deletions and facial recognition guidelines, invalid consent and illegitimate interests. Oh, and barking Amazon devices.

Additionally, by using the mics built into Amazon Echo devices, Alexa Guard Plus can detect unusual sounds and send an alert to your phone, and even play a siren on command. With Guard Plus, Alexa’s audio detection is even able to recognize specific sounds like smoke alarms, glass breaking, or carbon monoxide warnings and then ping you about them. And in cases when an Echo hears an unusual noise, you can play back the recording or use the Drop In feature to listen to what your Echo is hearing in real time.
Alexa Can Now Bark Like a Dog to Scare People Away When You're Not at Home
Maybe I’m misremembering 2019 but I could have sworn all the manufacturers of in-home listening devices devoted considerable energies to insisting - extremely implausibly - that the microphones in their listening devices only turned on when a wake word was spoken nearby.
As mentioned last week, Elizabeth Denham appeared in front of the Digital, Culture, Media and Sport Sub-committee on Online Harms and Disinformation for a bit of a chat. Two startling pieces of information emerged from this.
Firstly, Facebook has managed to get the ICO to sign a gagging order of some sort, preventing the ICO from talking about aspects of the Cambridge Analytica fallout.
Secondly, the ICO found that the Conservative Party racially profiled 10 million people but decided to take no enforcement action because the Tories agreed to delete the data.
The stolen GGD data comes from two of its COVID-19 systems: CoronIT, which contains the private data of Dutch individuals who have had a coronavirus test, and HPzone Light, a COVID-19 source and contact tracking system, RTL Nieuws reports.
RTL Nieuws also reports that the online ads for the information included photos of computer screens listing data for Dutch citizens.
The Netherlands police statement makes no mention of how many individuals may have had their data breached in the GGD incident. But RTL Nieuws reports that it appears “millions of address data, telephone and Social Security numbers are traded on a large scale, originating from the GGD’s two main corona systems.”
2 Arrested for Alleged Theft of COVID-19 Patient Data
Fear of the front page headline is a great motivator
Professional services advisory firm KPMG has commented that Canadian businesses respond immediately to data breaches – but only when the incidents are reported by the media, not because legislation compels them to do so. According to Imraan Bashir, KPMG partner & national public sector cyber leader, several of the more notable recent data breach incidents were set aside by the affected companies until news of them went public.
Media coverage, not legislation, prompts businesses to reveal data breaches - KPMG | Insurance Business
The Norwegian DPA announced its intention to fine Grindr 100 million krone (~€10 million) for invalid consent. If upheld this decision sets a strong precedent for many other data controllers who also rely on consent which is not “freely given, specific, informed and unambiguous”.
It seems Grindr is now claiming legitimate interests as a lawful basis for sharing special categories of personal data with ad networks, which is a no-no (see below). So this story may be far from over.
The Hamburg DPA decided Clearview AI was processing the biometric data of a complainant without his consent and ordered the company to delete the unlawfully processed personal data [direct link to PDF].
While this decision concerns only one individual - and, as it stands, anyone else in Europe who wants Clearview AI to stop processing their personal data will have to make a Subject Access Request to check whether the company holds it and then complain to their own DPA - it may have ramifications for other data controllers whose biometric databases were created on very shaky legal ground. Such as the Department of Employment Affairs and Social Protection.
On a similar theme the Council of Europe marked the fortieth anniversary of Convention 108 opening for signature with a set of guidelines for facial recognition technologies.
The Council of Europe has called for strict rules to avoid the significant risks to privacy and data protection posed by the increasing use of facial recognition technologies. Furthermore, certain applications of facial recognition should be banned altogether to avoid discrimination.
In a new set of guidelines addressed to governments, legislators and businesses, the 47-state human rights organisation proposes that the use of facial recognition for the sole purpose of determining a person’s skin colour, religious or other belief, sex, racial or ethnic origin, age, health or social status should be prohibited.
  • “What should I do if I mistakenly bought one of these dodgy CMPs? Preferably, replace it. If the option to disable this ‘legitimate interests’ nonsense exists, avail yourself of it ASAP. Cut off the data flows which rely on illegitimate-interest cookies. Question the vendor hard as to why they sold you a product which was designed to break the law. Invest in better data protection/ePrivacy knowledge and process integration so that you don’t make this sort of mistake again.” A comprehensive takedown by Miss IG Geek of the current trend among clueless controllers to cite legitimate interests as a lawful basis for setting cookies.
  • “Most significant, surveillance exceptionalism has meant that the United States and many other liberal democracies chose surveillance over democracy as the guiding principle of social order. With this forfeit, democratic governments crippled their ability to sustain the trust of their people, intensifying the rationale for surveillance.” From a long piece from Shoshana Zuboff for the New York Times titled ‘The Coup We Are Not Talking About’.
  • “The main thing to remember about DPIA is that it asks two core questions: -   should we be doing this? and - what could possibly go wrong? It is especially this last question that requires ingenuity. Will databases be created that are attractive to hackers or inside attackers, such as was reported this week from the Netherlands with their contact tracing databases? Or would managers be tempted with “function-creep” thoughts starting “If we collect this data anyway, we could also …”? Can the data collected be combined with other data to cause bad privacy effects?” Eerke Boiten on DPIAs and the UK’s Covid proximity monitoring app.
  • “As a company, under the Facebook Pages judgment and the Schrems II judgment from the Court of Justice of the European Union, you are jointly liable for any breaches of GDPR so if you are planning to use this platform for customer engagement… DON’T. At least, not if you want to avoid a big, fat fine from an EU regulator - if you don’t mind that then you might mind a class action lawsuit under Article 80 of GDPR and I suspect your investors probably will mind. That aside, if you want to totally nuke your brand - GO FOR IT! Clubhouse is a shining example of HOW TO BREAK EU LAW - they are so good at it they could and probably should, write a book on the subject.” Alexander Hanff has a look at current VC darling Clubhouse and doesn’t like it one bit.

Endnotes & Credits
Find us on the web and on Twitter at @PrivacyKit. Of course we’re not on Facebook or LinkedIn.
If you know someone who might enjoy this newsletter do please forward it on to them.
Powered by Revue
Privacy Kit, Made with 💚 in Dublin, Ireland