Privacy Kit

January 22, 2023

A Cybernetic Frankenstein | The Cat Herder, Volume 6, Issue 02

Microsoft’s chief privacy officer Julie Brill is quoted in the New York Times piece in the What We’re Reading section below as saying “The Netherlands punches above its weight”. Punching above its weight was something Ireland Inc. used to pride itself on. However, the processes and engagement with technology companies described in the article feel well beyond the capabilities of the Irish state, which has proven itself over many years and many high-profile debacles to be profoundly unable and unwilling to understand data protection. Besides that, it’s more of the usual: breaches and enforcement. Also, NoFly.csv.

😼

--------

Futuendi Gratia

“The server contained data from a 2019 version of the federal no-fly list that included first and last names and dates of birth,” CommuteAir Corporate Communications Manager Erik Kane said. “In addition, certain CommuteAir employee and flight information was accessible. We have submitted notification to the Cybersecurity and Infrastructure Security Agency and we are continuing with a full investigation.”

Daily Dot: ‘U.S. No Fly List Left on Unprotected Airline Server’

 

(๑◕ܫ◕๑)

 

Nobody Could Have Seen This Coming

ODIN Intelligence, a company which provides a platform that police forces in the US use to coordinate activities such as raids, was hacked.

The data also contains a large amount of personal information about individuals, including the surveillance techniques that police use to identify or track them. TechCrunch found several screenshots showing people’s faces matched against a facial recognition engine called AFR Engine, a company that provides face-matching technology to police departments. One photo appears to show an officer forcibly holding a person’s head in front of another officer’s phone camera.

TechCrunch: ‘A hack at ODIN Intelligence exposes a huge trove of police raid files’

 

(๑◕ܫ◕๑)

 

It Could Never Happen Here

It seems unlikely that a similar event could happen here, regardless of whether the UK government is prepared to listen to the contributions made at this one held in Westminster last month. Not while fully grown government ministers are still wandering around proposing dumb things like this.

One data protection expert noted that many clients seek to deploy established products from other jurisdictions (particularly the USA and Asia) and want to know how they interact with UK GDPR – often by asking what the bare minimum they can get away with is. People in these decision-making positions are generally from ‘advantaged’ demographic groups, and focus on innovation without seeing the risks – ethics and safe business practice are often seen as ‘nice to have’ rather than core, with their business models built on revenues rather than rights. There are no business incentives to do the right thing and care about human rights because data is so monetizable.

Connected by Data: ‘Ensuring People Have a Say in Future Data Governance’

 

(๑◕ܫ◕๑)

 

Regulators

The DPC followed up its Instagram and Facebook decisions with one for WhatsApp, which imposed a fine of €5.5 million and a direction to WhatsApp “to bring its data processing operations into compliance within a period of six months.”


The Garante fined Clubhouse, once the hot new thing, €2 million for “processing of personal data in violation of lawfulness, transparency and storage limitation principles; infringement of Articles 6 and 7 GDPR; failure to provide the information set out by Articles 13 and 14 or provision of incomplete, unclear, non-transparent and unintelligible information; failure to designate a representative in the EU in breach of Article 27(4); failure to carry out a DPIA with regard to processing operations for profiling purposes.”


The Polish DPA fined P4, the legal successor to Virgin Mobile Polska, €340,717.27 after a re-examination of a personal data breach first notified in December 2019.

 

(๑◕ܫ◕๑)

 

What We’re Reading

  • “It can be assumed that many public sector processing operations relying on cloud services would be likely to result in a high risk to the rights and freedoms of natural persons (for instance due to processing of sensitive data or data of a highly personal nature – like health data or personal data relating to criminal convictions and offences referred to in Article 10 of the GDPR – and processing is on a large scale) … However, only thirty-two out of the eighty-six stakeholders that use CSPs indicated that a DPIA has been conducted, before the intended processing itself. The EDPB would like to reiterate that the deployment of cloud services by public bodies will often trigger a likely high risk under the GDPR. Based on the information received by SAs from public bodies, in many cases where a DPIA was not carried out, the reason for not doing so was unclear for SAs. This could be a potential violation of the GDPR. Public bodies that have not (yet) conducted a DPIA when deploying cloud services should therefore (re)evaluate in the short term whether a DPIA should be conducted and document this evaluation.” From the EDPB‘s initial report on 2022’s Coordinated Enforcement Action ‘Use of cloud-based services by the public sector’ [direct link to PDF].

  • “Dutch technical expertise has helped privacy auditors gain unusually granular insights into how some of the largest software companies amass personal data on hundreds of millions of people. It has also allowed Dutch experts to call out companies for practices that appear to violate European rules. Some large American tech firms balk at first, said Sjoera Nas, a senior adviser at the Privacy Company, a consulting firm in The Hague that conducts the data risk assessments for the Dutch government and other institutions. “We are so small that, initially, many cloud providers just look at us, raise an eyebrow and say: ‘So what? You’re the Netherlands. You don’t matter,’” said Ms. Nas, who helped lead the Dutch negotiations with Microsoft, Zoom and Google. But then, she said, the companies begin to understand that the Dutch teams are negotiating compliance for the Netherlands with data protection rules that also apply across the European Union.” From ‘How the Netherlands Is Taming Big Tech’ (€) by Natasha Singer for the New York Times. ↪ archive.org link

  • “The organizing and ordering of potentially disparate bytes of information theoretically makes it possible for the state to construct a cybernetic Frankenstein of each of its citizens. The cryptic databases where such portraits are stored are harmful when they work as designed—and more so when they fail. In Pakistan, Rida Qadri writes about how the nation’s Computerized National Identity Card’s operating database produces errors if someone doesn’t have married parents, thereby cutting them off from all kinds of other societal and social benefits (such as being able to vote, or opening a bank account). In 2021, a massive breach of an Indian government database meant that people were able to buy the details of individuals—their names, addresses, phone numbers and sometimes, photos—for as little as $8. In Afghanistan, biometric information, including family trees, was left on insecure databases used by the government, which then became open to capture by the Taliban in late 2021.” From ‘Database States’ by Sanjana Varghese for The Baffler. Just by the by, a quick search for datasets tagged “Personal” in the Public Service Data Catalogue returns 921 datasets.
