Privacy Kit

September 5, 2021

A Needlessly Frustrating Exercise | The Cat Herder, Volume 4, Issue 34

Fines. Lobbyists. Guns. An Apple about turn. It's practically a Warren Zevon song this week. 😼
 
September 5 · Issue #147
The Cat Herder

Damien (@Damokieran)
Soooo my identical twin brother’s face is able unlock my @Apple iPhone 12. 🤔
7:50 PM - 31 Aug 2021
—
The platforms started asking for people’s dates of birth, apparently in order to comply with the UK’s Age Appropriate Design Code.
Neil Brown (@neil_neilzone)
Welcome to the new world of online #AgeVerification. Where a platform says it needs you to hand over your date of birth to help it "comply with the law" (but doesn't say which law), and says it will *also* use it "to make the ads that you see more relevant to you". https://t.co/LQGBTYA7y0
3:48 PM - 1 Sep 2021
The DPC’s WhatsApp decision was published (direct link to PDF). The initial media coverage was of mixed quality with many articles assuming that the monetary penalty was the only corrective power applied. The size of the fine and the disagreements between the DPC and other European Supervisory Authorities attracted much attention. There was less comment on how - creaky, slow and unwieldy though it might be - the final decision was the result of Article 60 (cooperation and consistency) and Article 65 (dispute resolution) of the GDPR working as intended.
The EDPB’s binding decision (direct link to PDF) has clarified some ambiguities in interpretation which should hopefully reduce the amount of back and forth between Supervisory Authorities on these issues in future.
Will Goodbody’s piece for RTÉ reflected on some of the broader implications of the decision.
Today’s decision is also significant because it provides data controllers everywhere with fresh insight into what regulators are thinking about very specific issues, in this case transparency.
Users do need to start paying more attention to what they are being told and more importantly what they aren’t about how their data is processed and used before signing up.
Awareness is key. And a €225m fine is a pretty effective way of raising it.
WhatsApp fine offers pause for thought - between pinging messages
www.rte.ie
It is used by two billion people globally each month.
Robert Bateman has a useful piece examining one element of the decision, the obligation to link purposes and legitimate interests to each processing operation, and stepping through the processes involved inside both the DPC and the EDPB.
The Irish supervisory authority (SA) found multiple violations of the GDPR’s transparency obligations.
But after circulating a draft decision with other EU SAs, Ireland was forced to add a further finding—that WhatsApp had not properly described its legitimate interests to data subjects.
The decision-maker—and, therefore, the Irish SA as a whole—found that WhatsApp had conveyed information about its legitimate interests appropriately, in a clear and transparent manner that gave data subjects “a meaningful overview of the legitimate interests being relied upon”.
The EDPB concludes that providing “full information on each and every processing operation respectively” is the “only approach” that will enable data subjects to exercise their data subject rights.
WhatsApp’s €225 Million GDPR Fine — Disagreements Over Legitimate Interests and Transparency | Analysis | GRC World Forums
www.grcworldforums.com
WhatsApp received a €225 million fine on September 2—the second biggest GDPR penalty on record.
Techcrunch’s Natasha Lomas was one of the few journalists in the initial wave of coverage to mention that the corrective powers in the DPC’s decision went beyond the attention-grabbing fine.
In addition to issuing a sizeable financial penalty, it has ordered WhatsApp to take a number of actions to improve the level of transparency it offers users and non-users — giving the tech giant a three-month deadline for making all the ordered changes.
Neither €50 million nor €225 million is a large enough amount to bother Facebook’s bottom line in the slightest, and it’s almost certain the lower amount would have been appealed with just as much vigour as the final amount will be.
Given the staggering amounts of money Facebook makes, it’s hard to see either amount as dissuasive.
The Journal has a decent piece by Ian Curran.
While the disagreement between the Irish DPC and the EDPB over the calculation of the fine took centre stage following the publication of the decision this week, the overwhelming majority of Dixon’s findings were uncontested by other European supervisory authorities on the board.
The findings also seem to reveal plenty about WhatsApp Ireland’s approach to its transparency obligations under GDPR thus far — and leave no doubt about the gravity of the breaches involved.
… the WhatsApp decision itself should make clear to businesses like Facebook what, exactly, their transparency obligations are under GDPR.
'Patent ambiguity': WhatsApp's record €225 million fine underlines grave transparency issues
www.thejournal.ie
The Data Protection Commission’s final 250-plus page decision could be significant for the application of GDPR.
Apple hit pause on its plans to scan the contents of your phone.
It isn’t clear how Apple could implement the system in a way that eliminates its critics’ biggest privacy concerns. Apple has claimed it would refuse government demands to expand photo-scanning beyond CSAM. But privacy and security advocates argue that once the system is deployed, Apple likely won’t be able to avoid giving governments more user content.
“Once this capability is built into Apple products, the company and its competitors will face enormous pressure—and potentially legal requirements—from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable,” 90 policy groups from the US and around the world said in an open letter to Apple last month. “Those images may be of human rights abuses, political protests, images companies have tagged as ‘terrorist’ or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them. And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance and persecution on a global basis.”
Amid backlash, Apple will change photo-scanning plan but won’t drop it completely | Ars Technica
arstechnica.com
Apple issues vague statement promising “improvements” but still plans to scan photos.
A good Twitter thread from Matthew Green on this.
Matthew Green (@matthew_d_green)
I’m so, so tired of talking about Apple photo scanning but I just want to say one more thing about their (thankfully now paused!) proposal:

One of the leading indicators of whether a scanning proposal is “ok” is “is anyone else doing it.” 1/
4:28 PM - 3 Sep 2021
—
All data leaks eventually, and the ability to take that data and make something new and remarkably dangerous with it is within reach of almost anyone.
Investigation into hacked "map" of UK gun owners - BBC News
www.bbc.com
Animal rights activists have published a “map” of thousands of gun owners and their addresses.
  • “The use of algorithms is often seen as a way to improve, increase efficiency or lower costs of public services. Growing evidence suggests that algorithmic systems in public-service delivery can cause harm and frequently lack transparency in their implementation, including opacity around decisions about whether and why to use them. Most countries have yet to resource efforts to raise awareness and engage the wider public about the use of algorithms in public-service delivery. In recognition of these conditions, regulators, lawmakers and governmental accountability organisations have turned to regulatory and policy tools, hoping to ensure ‘algorithmic accountability’ across countries and contexts. These responses are emergent and fast evolving, and vary widely in form and substance – from legally binding commitments, to high-level principles and guidelines. Lessons from their early implementation raise important challenges and pose questions about the future of governing algorithmic systems.” From a report by the Ada Lovelace Institute, AI Now and the Open Government Partnership, ‘Algorithmic Accountability for the Public Sector’.
  • “The aim of Big Tech and its intermediaries seems to [be to] make sure there are as few hard regulations as possible – for example those that tackle issues around privacy, disinformation, and market distortion – to preserve their profit margins and business model. If new rules can’t be blocked, then they aim to at least water them down. In recent years these firms started embracing regulation in public, yet continue pushing back against behind closed doors. There are some differences between what different tech firms want in terms of EU policy, but the desire to remain ‘unburdened’ by urgently needed regulations is shared by most of the large platforms.” From ‘The Lobby Network: Big Tech’s Web of Influence in the EU’ (direct link to PDF) by Dr. Max Bank, Felix Duffy, Verena Leyendecker and Margarida Silva for the Corporate Europe Observatory.
  • “Afghanistan is not the only country to embrace biometrics. Many countries are concerned about so-called ‘ghost beneficiaries’—fake identities that are used to illegally collect salaries or other funds. Preventing such fraud is a common justification for biometric systems, says Amba Kak, the director of global policy and programs at the AI Now Institute and a legal expert on biometric systems … It’s widely recognized that having legal identification documents is a right, but ‘conflating biometric ID as the only efficient means for legal identification is,’ she says, ‘flawed and a little dangerous.’ Kak questions whether biometrics—rather than policy fixes—are the right solution to fraud, and adds that often it is ‘not evidence-based.’” From ‘This is the real story of the Afghan biometric databases abandoned to the Taliban’ by Eileen Guo and Hikmat Noori for the MIT Technology Review.
Endnotes & Credits
  • The elegant Latin bon mot “Futuendi Gratia” is courtesy of Effin’ Birds.
  • As always, a huge thank you to Regina Doherty for giving the world the phrase “mandatory but not compulsory”.
  • The image used in the header is by Krystian Tambur on Unsplash.
  • Any quotes from the Oireachtas we use are sourced from KildareStreet.com. They’re good people providing a great service. If you can afford to then donate to keep the site running.
  • Digital Rights Ireland have a storied history of successfully fighting for individuals’ data privacy rights. You should support them if you can.
Find us on the web at myprivacykit.com and on Twitter at @PrivacyKit. Of course we’re not on Facebook or LinkedIn.
If you know someone who might enjoy this newsletter do please forward it on to them.
Privacy Kit, Made with 💚 in Dublin, Ireland
