Privacy Kit

November 17, 2019

Less usable than Excel | The Cat Herder, Volume 2, Issue 44

A whole lot of facial recognition stories and some more peculiar pirouettes around the head of a pin from the joint controller of Ireland’s largest database-of-faces-which-are-used-for-facial-recognition.
😼

Apple store employee texted himself a private photo from customer's phone, woman alleges - The Washington Post
www.washingtonpost.com – Share
In a statement, Apple said the employee was “no longer associated with our company.”
Article 4 of the GDPR is called ‘Definitions’. Here are two of those definitions.
Article 4(2): ‘processing’ means any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction;
Article 4(14): ‘biometric data’ means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data;
The astute reader will observe that the definition of processing is very broad. Doing anything with or to personal data besides thinking about it constitutes processing.
The Secretary General of the Department of Employment Affairs and Social Protection answered a question from Catherine Murphy TD about biometric data during his time in front of the Public Accounts Committee a few weeks back. The session took place 1,300 days after the GDPR was adopted and 532 days after it became enforceable. Surely long enough for his departmental officials to have read the definitions reproduced above.
I will clarify it for the Deputy. In public commentary three different things tend to get conflated. For example, people have said that the public services card contains biometric data. We are very clear that it does not. There is no biometric data in the photograph or on the card or chip. We are very clear on that. Second, people say we share biometric data. We are very clear we do not share biometric data with anybody. Third, people say we collect biometric data. Again, we are very clear we do not collect biometric data. The term “collect” has a particular meaning in a data protection context. We do not collect biometric data. For example, it has been said that when people walk into our offices our cameras can immediately scan their faces and collect biometric data. None of that happens. In terms of what we do, we process the photograph to create a biometric template. We have always said that. In replies to parliamentary questions we have always been clear that we produce an arithmetic template. We have called it biometric and we say that on our website. We say it in the frequently asked questions and comprehensive guide for the Committee of Public Accounts. We do create a biometric template for our own purposes. That is what enables us to identify issues of identity fraud. The information does not leave the Department. It is not on the card, it is not collected and it is not shared. People hear that there are no biometric data on the card and even though biometric is referenced they are different things.
In this answer Mr McKeon asserts that the term “collect” has a particular meaning in a data protection context. He does not elaborate. As we can see from the above definition of processing, collection falls under processing. As do all the other things one could possibly do with personal data. So does processing “the photograph to create a biometric template.” So does storing all of this biometric data in a database.
In short, the Regulation doesn’t care how the biometric data came to be, nor how it came to end up in your database, under your control. It simply is. It is personal data, and it is jointly controlled by the Department of Employment Affairs and Social Protection and the Department of Public Expenditure and Reform.
It’s not clear what purpose officials see in continuing to use this peculiar misinterpretation. But they’re showing no signs of stopping, or of accepting the reality that they process biometric data.
More PSC: ‘Pulling mandatory PSC for passports had “whole of government repercussions”, civil servants warned’
Of course it could
Will Goodbody explores whether what Google is doing in the US, gaining access to the health data of a significant share of the population there, could ever happen here.
The unwritten answer to this is yes, it could.
  • Google wouldn’t comment.
  • The Data Protection Commission “would expect to hear from those involved before such a project was established.” Left hanging is the question of what happens when the drastically under-resourced commission doesn’t hear from those involved.
  • The boilerplate response from the HSE, which talks about information security and confidentiality, would seem to indicate that the HSE didn’t understand the question asked. Information security is not the same thing as data protection. That the HSE still doesn’t understand and hasn’t even got a boilerplate response which covers data protection issues is concerning.
  • The responses from hospital groups appear to be in the time-honoured format of ‘Absolutely not. Not right now at this very moment.’
All of these add up to a not hugely reassuring piece.
As Will points out, getting access to vast amounts of sensitive categories of personal data is very useful for the future bottom line of the technology companies. At the front line of healthcare delivery the technology story is starkly different.
The transition to electronic health records (EHRs) was supposed to improve the quality and efficiency of healthcare for doctors and patients alike — but these technologies get an “F” rating for usability from health care professionals, and may be contributing to high rates of professional burnout, according to a new Yale-led study.
—
In a pretty closely related story the Financial Times examined the rampant sharing of health data across the web - ‘How top health websites are sharing sensitive data with advertisers’ (€).
—
237 UK police force staff punished for misusing IT systems in last 2 years • The Register
www.theregister.co.uk – Share
One UK police staffer is disciplined every three days for breaking data protection rules or otherwise misusing IT systems, according to a Freedom of Information request by think tank Parliament Street.
Related: “No individual or organisation has ever been held responsible for this catastrophic violation of Dara’s rights. GSOC confirmed to the Irish Times that a garda accused of sharing the footage will not face criminal charges”
They did.
‘India is going ahead with its facial recognition despite privacy concerns’.
The Indian government has played down fears of mass surveillance in response to concerns that its proposed facial recognition system lacks adequate oversight.
Replying to a legal notice filed by the Internet Freedom Foundation (IFF), a Delhi-based non-profit that works on digital liberties, the country’s National Crime Record Bureau (NCRB) defended the move, stating it doesn’t interfere with privacy of citizens as it “only automates the existing police procedure of comparing suspects’ photos with those listed in LEA’s [Law Enforcement Agency] databases.”
In France a plan to introduce a facial recognition system which will allow access to government services is seemingly going ahead. Which threw up this gem from Didier Baichère, facial recognition fan and La République En Marche! member.
“Just ignore all the fear-mongering for a moment, and you’ll see that there are very interesting and positive ways of using facial recognition,” he said. “You can deploy it in the area of security, to manage crowds accessing events or in shops to make offers that are especially adapted to the way people look,” he said.
So. Possible special offers in shops. Doesn’t seem like an entirely fair swap for massive interference with your data protection and privacy rights but hey …
Meanwhile, in the US …
As facial recognition technology has become more popular across law enforcement organizations, so has the backchannel sharing of related databases.
…
“In the context of face surveillance, a lot of access actually happens ad hoc, instead of through a written agreement or policy,” says Narayan. “So an agency without a policy may get access [to facial recognition technology] through these requests.”
The European Data Protection Board published guidelines on the territorial scope of the GDPR.
—
The Hamburg Data Protection Authority published a short guidance note on the use of Google Analytics on websites (direct link to PDF).
—
  • “the majority of advertising companies feed their complex algorithms silos full of data even though the practice never delivers the desired result. In the worst case, all that invasion of privacy can even lead to targeting the wrong group of people.” Jesse Frederik and Maurits Martijn, writing for The Correspondent: ‘The new dot com bubble is here: it’s called online advertising’
  • “The key with the unfreedom of the algorithm is that it knows everything and it feeds back everything. So, you can no longer have this bit of humanity which is absolutely necessary — privacy: the sacred space in which you do not know what the other thinks of you.” Zadie Smith, in an interview with The Toronto Star.
  • “It’s not just Google that benefits. It may treat Facebook as a bitter rival, but both companies have a shared interest in limiting the ability of users to shape how the web works.” Alex Hern on ‘Firefox’s fight for the future of the web’ in The Observer.
  • New research from Pew which comes with the not especially cheery headline ‘Americans and Privacy: Concerned, Confused and Feeling Lack of Control Over Their Personal Information’.
——
Endnotes & Credits
  • The elegant Latin bon mot “Futuendi Gratia” is courtesy of Effin’ Birds.
  • As always, a huge thank you to Regina Doherty for giving the world the phrase “mandatory but not compulsory”.
  • The image used in the header is by Krystian Tambur on Unsplash.
  • Any quotes from the Oireachtas we use are sourced from KildareStreet.com. They’re good people providing a great service. If you can afford to then donate to keep the site running.
  • Digital Rights Ireland have a storied history of successfully fighting for individuals’ data privacy rights. You should support them if you can.
Find us on the web at myprivacykit.com and on Twitter at @PrivacyKit. Of course we’re not on Facebook or LinkedIn.
Barring a disaster we’ll be in your inbox again next weekend.
If you know someone who might enjoy this newsletter do please forward it on to them.
In order to unsubscribe, click here.
If you were forwarded this newsletter and you like it, you can subscribe here.
Powered by Revue
Privacy Kit, Made with 💚 in Dublin, Ireland
