Secret settlements and stolen health data, biometric data deletions and facial recognition guidelines
January 31 · Issue #117
Secret settlements and stolen health data, biometric data deletions and facial recognition guidelines, invalid consent and illegitimate interests. Oh, and barking Amazon devices.
Additionally, by using the mics built into Amazon Echo devices, Alexa Guard Plus can detect unusual sounds and send an alert to your phone, and even play a siren on command. With Guard Plus, Alexa's audio detection is even able to recognize specific sounds like smoke alarms, glass breaking, or carbon monoxide warnings and then ping you about them. And in cases when an Echo hears an unusual noise, you can play back the recording or use the Drop In feature to listen to what your Echo is hearing in real time.
Alexa Can Now Bark Like a Dog to Scare People Away When You're Not at Home
You typically use Amazon's Alexa to play music, provide weather forecasts, and respond to other household requests. But Alexa just got a somewhat unusual new skill: the ability to bark like a dog.
Maybe I'm misremembering 2019 but I could have sworn all the manufacturers of in-home listening devices devoted considerable energies to insisting - extremely implausibly - that the microphones in their listening devices only turned on when a wake word was spoken nearby.
As mentioned last week, Elizabeth Denham appeared in front of the Digital, Culture, Media and Sport Sub-committee on Online Harms and Disinformation for a bit of a chat. Two startling pieces of information emerged from this.
The stolen GGD data comes from two of its COVID-19 systems: CoronIT, which contains the private data of Dutch individuals who have had a coronavirus test, and HPzone Light, a COVID-19 source and contact tracking system, RTL Nieuws reports. RTL Nieuws also reports that the online ads for the information included photos of computer screens listing data for Dutch citizens. The Netherlands police statement makes no mention of how many individuals may have had their data breached in the GGD incident. But RTL Nieuws reports that it appears "millions of address data, telephone and Social Security numbers are traded on a large scale, originating from the GGD's two main corona systems."
2 Arrested for Alleged Theft of COVID-19 Patient Data
Police in the Netherlands have arrested two health ministry workers for allegedly stealing COVID-19 patient data from the agency's systems and offering it for sale.
Fear of the front page headline is a great motivator
Professional services advisory firm KPMG has commented that Canadian businesses respond immediately to data breaches, but only when the incidents get reported by the media, not because legislation compels them to do so. According to Imraan Bashir, KPMG partner & national public sector cyber leader, several of the more notable recent data breach incidents were brushed aside by the affected companies until news of them went public.
Media coverage, not legislation, prompts businesses to reveal data breaches - KPMG | Insurance Business
Experts warn that unless Canadian firms rethink how they view data, it will end up in the wrong hands
The Norwegian DPA announced its intention to fine Grindr 100 million krone (~€10 million) for invalid consent. If upheld this decision sets a strong precedent for many other data controllers who also rely on consent which is not "freely given, specific, informed and unambiguous".
This decision concerns only one individual, and as it stands any other individual in Europe who wishes to stop Clearview AI processing their personal data will have to make a Subject Access Request to check whether the company is processing their data and then complain to their DPA. Even so, it may have ramifications for other data controllers whose biometric databases were created on very shaky legal ground. Such as the Department of Employment Affairs and Social Protection.
On a similar theme, the Council of Europe marked the fortieth anniversary of Convention 108 opening for signature with a set of guidelines for facial recognition technologies.
|
The Council of Europe has called for strict rules to avoid the significant risks to privacy and data protection posed by the increasing use of facial recognition technologies. Furthermore, certain applications of facial recognition should be banned altogether to avoid discrimination. In a new set of guidelines addressed to governments, legislators and businesses, the 47-state human rights organisation proposes that the use of facial recognition for the sole purpose of determining a person's skin colour, religious or other belief, sex, racial or ethnic origin, age, health or social status should be prohibited.
- "What should I do if I mistakenly bought one of these dodgy CMPs? Preferably, replace it. If the option to disable this 'legitimate interests' nonsense exists, avail yourself of it ASAP. Cut off the data flows which rely on illegitimate-interest cookies. Question the vendor hard as to why they sold you a product which was designed to break the law. Invest in better data protection/ePrivacy knowledge and process integration so that you don't make this sort of mistake again." A comprehensive takedown by Miss IG Geek of the current trend among clueless controllers to cite legitimate interests as a lawful basis for setting cookies.
- "Most significant, surveillance exceptionalism has meant that the United States and many other liberal democracies chose surveillance over democracy as the guiding principle of social order. With this forfeit, democratic governments crippled their ability to sustain the trust of their people, intensifying the rationale for surveillance." From a long piece by Shoshana Zuboff for the New York Times titled "The Coup We Are Not Talking About".
- "The main thing to remember about a DPIA is that it asks two core questions: should we be doing this? And what could possibly go wrong? It is especially this last question that requires ingenuity. Will databases be created that are attractive to hackers or inside attackers, such as was reported this week from the Netherlands with their contact tracing databases? Or would managers be tempted with 'function-creep' thoughts starting 'If we collect this data anyway, we could also …'? Can the data collected be combined with other data to cause bad privacy effects?" Eerke Boiten on DPIAs and the UK's Covid proximity monitoring app.
- "As a company, under the Facebook Pages judgment and the Schrems II judgment from the Court of Justice of the European Union, you are jointly liable for any breaches of GDPR, so if you are planning to use this platform for customer engagement… DON'T. At least, not if you want to avoid a big, fat fine from an EU regulator - and if you don't mind that then you might mind a class action lawsuit under Article 80 of GDPR, and I suspect your investors probably will mind. That aside, if you want to totally nuke your brand - GO FOR IT! Clubhouse is a shining example of HOW TO BREAK EU LAW - they are so good at it they could, and probably should, write a book on the subject." Alexander Hanff has a look at current VC darling Clubhouse and doesn't like it one bit.
If you know someone who might enjoy this newsletter, do please forward it on to them.
If you don't want these updates anymore, please unsubscribe here.
If you were forwarded this newsletter and you like it, you can subscribe here.
Privacy Kit, Made with ❤ in Dublin, Ireland