Facebergstagram | The Cat Herder, Volume 3, Issue 39

October 11 · Issue #103
The CJEU provides roughly the same answers to the latest round of member state questions about indiscriminate mass surveillance as it has on multiple previous occasions. The state of biometrics, immunity passports and cookies. A bad week for Microsoft in both Germany and France.
😼

It turns out that taking steps to avoid being profiled and tracked online can single you out and allow you to be, err, tracked and profiled. What an online experience we’ve built for ourselves.
EFF off: Privacy Badger disables by default anti-tracking safeguard that can be abused to track you online • The Register
The committee could do a bit more than “specifically suggest” the HSE not break the law by giving special categories of personal data to employers with no lawful basis.
Significantly, the committee specifically suggests that the results should be returned “to the individual workers”.
In its final report, the committee says it welcomes an investigation by the Data Protection Commissioner of potential breaches of data belonging to meat plant workers in relation to test results.
Covid outbreaks in meat plants likely to come under further scrutiny
As the European Commission and the academic community have stated, any public health monitoring systems are high risk measures that must be shown to be lawful and necessary in a democratic society. These should be adopted with legal safeguards put in place by design and default in order to counter or mitigate such risks.
These conflicts with human rights particularly include high levels of interference with the rights to private life, data protection, and non-discrimination which are protected by Articles 8 and 14 of the European Convention on Human Rights, the EU Charter of Fundamental Rights, and the EU General Data Protection Regulation (GDPR).
Dr Nóra Ní Loideáin on immunity passports, which we’ll be hearing a lot more about in the coming months.

The sensitive nature of biometric data, recognised both within the EU legal framework, as well as in the framework of the Council of Europe’s Modernised Convention 108+, makes it subject to special protection: the processing of biometric data is prohibited in principle and there are only a limited number of conditions under which such processing is lawful.
The quote above is from a keynote speech on ‘The State of Biometrics’ (direct link to PDF) the European Data Protection Supervisor gave during the week.
The quote below is from the DPC’s final report from its investigation “in respect of the processing of personal data by the Department of Employment Affairs and Social Protection in relation to THE PUBLIC SERVICES CARD (“PSC”) examining compliance with the obligations in relation to LEGAL BASIS AND TRANSPARENCY”, published, grudgingly, by DEASP in August 2019.
 A further report will shortly make provisional findings to the DEASP, including matters relating to data security; arithmetic template generation (and associated processing of personal data) for SAFE 2 and the PSC; and in relation to the DEASP’s processing of personal data generated in connection with the use of the free travel variant of the PSC. Once the DPC has considered any final submissions of DEASP in relation to that second report, it will finalise its report and decision on those particular issues.
Tick tock.
It definitely could.
The original warrant sent to Google is still sealed, but the report provides another example of a growing trend of data requests to the search engine giant in which investigators demand data on a large group of users rather than a specific request on a single suspect.
The keyword warrants are similar to geofence warrants, in which police make requests to Google for data on all devices logged in at a specific area and time. Google received 15 times more geofence warrant requests in 2018 compared with 2017, and five times more in 2019 than 2018. The rise in reverse requests from police has troubled Google staffers, according to internal emails.
Google is giving data to police based on search keywords, court docs show - CNET
There’s some very interesting detail in a Telegraph story (hat tip TJ McIntyre) which is ostensibly about Instagram’s failure to crack down on self-harm content on its platform. When contacted by the Telegraph Facebergstagram blamed data protection law and the mean ol’ Data Protection Commission for its inability to manage its own platform.
As is fairly typical in these situations, an assertion is made without evidence: that adding more technology is the only reasonable and available solution to the problems caused by the existing technology.
The DPC quite rightly didn’t agree with this.
A slap on the wrist for Wexford County Council.
“The DPC considers that the drones deployed by [Wexford County Council] constituted a system carrying out surveillance which had the potential to collect personal data and therefore a DPIA should have been carried out by WCC prior to the drones being deployed,” Eunice Delaney, assistant commissioner, said in her ruling.
However, she said that, given the county council had moved to amend its drone policy so that a DPIA will be carried out before the future purchase or use of drones, and given that no identifiable footage had been recorded, no further action would be taken.
After an investigation triggered by complaints from groups including Liberty, the ICO found that the DfE had failed to comply with sections of the general data protection regulation (GDPR). It said there was “no clear picture of what data is held by the DfE” and that its handling of millions of pupil records “could result in multiple data breaches”.
“The audit found that data protection was not being prioritised and this had severely impacted the DfE’s ability to comply with the UK’s data protection laws,” the ICO said.
The ICO published draft statutory guidance on regulatory action (direct link to PDF), including calculating monetary penalties, to sit alongside its regulatory action policy.
The CNIL has asked the Conseil d'État to order that the French Health Data Hub stop being hosted by Microsoft.
This week the CNIL also published the final version of its guidance on cookies. Enforcement will begin in April 2021.
Coincidentally this week was also the week the DPC began enforcing its cookie guidance, published six months ago. Our money is on there being plenty of data controllers who still aren’t abiding by the rules on this. The easiest infringement to spot is the use of implied consent to set cookies. So if you notice an Irish website with a banner that says something like “by continuing to use this website you consent to our use of cookies” then do drop the DPC a line. They’ll probably be delighted to hear from you.
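If you fancy doing a little checking yourself, here's a minimal sketch of the idea, using only the Python standard library. Everything in it is illustrative, not official DPC or CNIL tooling: the function names and the tiny "essential cookie" allow-list are our own assumptions, and a real check would need a headless browser to catch cookies set by JavaScript rather than by the server's first response.

```python
# Hypothetical sketch: flag cookies a site sets before any consent interaction.
# The "essential" allow-list below is illustrative only, not a legal standard.
from urllib.request import Request, urlopen

ESSENTIAL_PREFIXES = ("csrf", "session")  # cookies plausibly exempt from consent

def cookies_set_before_consent(set_cookie_headers):
    """Return names of cookies set in a response where no consent was given."""
    flagged = []
    for header in set_cookie_headers:
        name = header.split("=", 1)[0].strip()
        if not name.lower().startswith(ESSENTIAL_PREFIXES):
            flagged.append(name)
    return flagged

def check_site(url):
    """Fetch a page with no consent interaction; report the cookies it sets."""
    resp = urlopen(Request(url, headers={"User-Agent": "cookie-check/0.1"}))
    return cookies_set_before_consent(resp.headers.get_all("Set-Cookie") or [])
```

Anything a site drops on that very first, consent-free request (analytics IDs being the classic example) is a candidate for a polite note to the regulator.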
The Belgian DPA reprimanded a public body for “wrongful processing of personal data from the National Register”. In an echo of what we discussed last week concerning Irish public bodies wishing to use CCTV footage to prosecute individuals for littering, this body has the competence to fine individuals for littering but does not have the competence to search the National Register and infer family connections.

  • “Current thinking around tech addiction is largely based in biological determinism—the idea that we “can’t help ourselves” from becoming addicted to technology—and tech solutionism—a belief that technological changes alone can solve for digital well-being. Neither of these approaches are grounded in empirical evidence, and both put the blame on the individual, rather than the platform.” ‘Good Intentions, Bad Inventions The Four Myths of Healthy Tech’ by Amanda Lenhart and Kellie Owens deflates some common talking points deployed by moral panic merchants.
  • “In nearly every case, we have absolutely no idea what determinations go into these algorithms. We do not know who coded them. We do not know how they work; how they judge. By their very nature – hidden lines of complex code, obscured by laws protecting business assets – they function invisibly. They are shielded as corporate proprietary information and “intellectual” property – even though it is our intellects that they lay claim to, judging us by the data they gather (typically, without us knowing). This data then, ludicrously, becomes their property, not ours. Whole, revealing chunks of us, some of it extremely revealing and sensitive, owned not by us, but by them.” Karlin Lillington awards the grading of the 2020 Leaving Cert an F.
  • “The Court also relied on Schrems II, implicitly confirming aspects of its approach there and embedding that decision in its jurisprudence. The underlying concern in Schrems II was the same as here: that is, data collected by private actors are accessed by state actors. In sum, even in the interests of national security, general and indiscriminate surveillance does not satisfy the test of strict necessity and proportionality.” Lorna Woods reviews this week’s Privacy International, La Quadrature du Net and Ordre des barreaux francophones et germanophone CJEU judgments in ‘When is mass surveillance justified? The CJEU clarifies the law in Privacy International and other cases’


Endnotes & Credits
Find us on the web at myprivacykit.com and on Twitter at @PrivacyKit. Of course we’re not on Facebook or LinkedIn.
If you know someone who might enjoy this newsletter do please forward it on to them.
Powered by Revue
Privacy Kit, Made with 💚 in Dublin, Ireland