Privacy Kit

November 7, 2021

Counterfactual | The Cat Herder, Volume 4, Issue 43

State surveillance, clashing transparency reports and a counterfactual cost-benefit analysis. 😼
 
November 7 · Issue #156

The Black Shadow hacking group stole a file containing information on around a million listings with full details on the users, according to France 24, many of whom are closeted. They released the database on Tuesday (2 November) after their $1 million ransom was not met within 48 hours.
Hackers release Israeli LGBT+ app’s entire database in potentially ‘life-threatening’ cyberattack
www.pinknews.co.uk – Share
Hackers have leaked the entire online database of LGBT+ dating app Atraf, exposing highly sensitive information of thousands of users.
“This is exactly one of the things that people are constantly warning about, especially when it comes to government surveillance and corporate data mining,” Best told WIRED in a text message interview. “Not only is the surveillance itself problematic and worrisome, but the data is not handled in the ideal conditions we’re always promised.”
“It’s a crystal-clear example of why mass surveillance makes our society less safe, not more safe,” says Evan Greer, deputy director of the digital rights group Fight for the Future, of the data leak. “Both corporations and governments are terrible at safeguarding the sensitive data that they collect.”
1.8 TB of Police Helicopter Surveillance Footage Leaks Online | WIRED
www.wired.com – Share
DDoSecrets published the trove Friday afternoon. Privacy advocates say it shows how pervasive law enforcement’s eye has become, and how lax its data protection can be.
—
Clearview AI was effectively kicked out of Australia, after being effectively kicked out of Canada earlier in the year (see The Cat Herder, Volume 4, Issue 05).
In a statement, Australian Information Commissioner and Privacy Commissioner Angelene Falk said the “covert collection of this kind of sensitive information is unreasonably intrusive and unfair,” claiming it “carries significant risk of harm to individuals, including vulnerable groups such as children and victims of crime, whose images can be searched on Clearview AI’s database.”
Clearview AI Forced to Cease Data Scraping Operations in Australia
gizmodo.com – Share
Australia’s national privacy regulator determined Clearview AI breached users’ privacy and violated the Australian Privacy Act 1988.
“Google is sitting on so much location data,” explained Smith. “That’s the gold mine of location information of all the providers. Phone companies have location data, but typically it’s not as precise as what Google has, and it’s not as extensive.”
Google, in its transparency report, noted that geofence warrants have increased dramatically over the past two years and recently made up “more than 25% of all warrants we receive in the United States.”
Thousands of Geofence Warrants Appear to Be Missing from a California DOJ Transparency Database – The Markup
themarkup.org – Share
California requires law enforcement to report the controversial warrants to a state database—but The Markup found massive discrepancies in how they’re reported
A cost-benefit analysis of the SAFE-PSC-MyGovID Framework written by the head of investment analysis at the Department of Social Protection (which is still busy delaying its own appeal against the DPC’s decision of more than two years ago, which came after an investigation into this system which lasted close to two years, which is in turn delaying the completion of the second part of the DPC’s investigation) was published by the Department of Public Expenditure and Reform (the subject of an investigation by the DPC into this system which was opened in August of this year) on Friday afternoon.
Friday afternoon not being the traditional best time for publication of something which you hope will be picked up.
The last two links on this page are the relevant ones if you want to have a read of this yourself.
This analysis uses the word “counterfactual” 207 times. It mentions the Data Protection Commission zero times. Ignoring the DPC decision and ongoing investigations is certainly staying true to the principles of the counterfactual approach or, as it’s more commonly known, speculative fiction.
A couple of highlights from a quick flick through the analysis and slides are:
‘A substantial benefit of doing the thing was doing the thing’
‘If we hadn’t collected all the biometric data for our facial recognition system then people would have been less willing to share their personal data with us.’
Counterfactual indeed.
Anyway, at the end of all this the head of investment analysis at the Department of Social Protection concludes that this investment by the Department of Social Protection was a good investment.
In the accompanying slides containing the “Key Policy Relevant Findings”, i.e. the parts the departments involved would like you to read, the authors estimate that, if they twiddled a few knobs and manipulated a few more levers, they could easily find a value for the whole system in excess of €1 billion.
A venture capitalist would bite your hand off at the prospect of a 10x return on investment.
Not mentioned in the “Key Policy Relevant Findings” is the €98 million cost of the system to date.
However, the position vis-à-vis the whole system’s legality remains entirely unchanged by this peculiar counterfactual foray.
Loughlin O’Nolan (@loughlin), 8:59 AM · 6 November 2021:
“What we have not seen to date is any organ of the state making a coherent defence of the system’s lawfulness. The ‘incredibly strong’ legal advice of over two years ago remains shrouded in secrecy.”
The Belgian DPA is apparently on the brink of declaring the IAB’s Transparency and Consent Framework non-compliant with the GDPR. Since this will echo what the ICO declared back in June 2019 [direct link to PDF], and then didn’t do anything about, what happens next is a mystery.
  • “Around the world, there is a push by corporations and international institutions such as the World Bank to create these kinds of databases to identify people and conflate two things: the right of every person to be recognized legally by a government and an identification system that intermediates people’s transactions with public and even private services … The government should collect the least amount of information possible. Right now, the inertia is the reverse. We are being held captive by this notion that technological progress should be as fast as possible. And then by the time that we see the effects of those technologies, it is very difficult to scale back those systems or their effects are impossible to mitigate.” From an interview with Luis Fernando García by Leo Schwartz for Rest Of World.
  • “What is becoming increasingly apparent is that - perhaps now more than ever? - lots of people have suggestions for solutions, but without a clear idea of what they are trying to solve. They throw out ‘big tech must be made to do [x]’ pronouncements and claims that a change in the law to impose a new obligation on tech companies will fix decades-old systemic, societal problems, without a clear, documented problem statement against which their solution is to be assessed. Without sufficiently-defined parameters for what ‘good’ looks like, or what pitfalls they need to avoid.” From ‘Introducing the Internet policy “red team”: the underappreciated scrutineers of online regulatory discourse’ by Neil Brown. Since the Oireachtas Joint Committee on Tourism, Culture, Arts, Sport and Media published its ‘Report on Pre-Legislative Scrutiny of the General Scheme of the Online Safety and Media Regulation Bill 2020’ during the week, and the bill will now wend its way into the Seanad and Dáil for debate, this is a useful primer for what to watch out for. Because precious little of this is new.
  • “Right now, the people deciding whether Clearview AI should be allowed to operate are its own executives and the law enforcement community. Those might not be the right people to determine the rules of engagement when matters of grave consequence, such as the Constitutionally-protected right to privacy that every US citizen is guaranteed, are what’s at stake. Ultimately, none of us consented to Clearview AI’s use of our images. Ton-That’s lined his pockets selling a product built on our photos. And you and I haven’t seen a penny of profit from it.” From ‘Why are people with nothing to hide so scared of Clearview AI’s facial recognition?’ by Tristan Greene for TNW.
—
Endnotes & Credits
  • The elegant Latin bon mot “Futuendi Gratia” is courtesy of Effin’ Birds.
  • As always, a huge thank you to Regina Doherty for giving the world the phrase “mandatory but not compulsory”.
  • The image used in the header is by Krystian Tambur on Unsplash.
  • Any quotes from the Oireachtas we use are sourced from KildareStreet.com. They’re good people providing a great service. If you can afford to then donate to keep the site running.
  • Digital Rights Ireland have a storied history of successfully fighting for individuals’ data privacy rights. You should support them if you can.
Find us on the web at myprivacykit.com and on Twitter at @PrivacyKit. Of course we’re not on Facebook or LinkedIn.
If you know someone who might enjoy this newsletter do please forward it on to them.
Privacy Kit, Made with 💚 in Dublin, Ireland
