June 13, 2021
Misadvised | The Cat Herder, Volume 4, Issue 22
It’s the Department of Determined Not To Learn Anything again. It’s the NHS data grab again. On the plus side, a Colorado ban on the use of dark patterns. Though the ubiquity of these may make any ban extremely difficult to police. 😼
Apple continues its confusion of privacy and confidentiality by getting deeper into the ID business, laying the pathway for people to be asked to prove any characteristic, to anyone, at any time, with limited to no friction. But it's in a secure enclave, so that's ok then! https://t.co/WYG0YvXZYu
Same old same old
The Department of Children maintains its posture of ‘nope’ when it comes to releasing medical records in the archive of the Mother and Baby Homes Commission; the UK government backs down (for a while) on the GP data grab. Yes, it’s the same two stories again this week in this section. And in both these stories we see organisations which have lost the trust of data subjects. In the case of the various parts of the Irish state which are involved in the ongoing shambolic, disrespectful and offensive mishandling of every aspect of the Mother and Baby Homes Commission report and what has come after, there’s little indication that the ability to regain that trust even exists.
Children’s Minister Roderic O'Gorman has raised concerns about the continued redaction of records and is seeking the advice of the Data Protection Commissioner.
On reading this you could easily be forgiven for thinking the minister wasn’t - at least nominally - in charge of his department. Because it’s his department which is carrying out the “continued redaction of records” and unlawful withholding of records.
Survivors 'infantilised' by records being withheld
Mother and baby home survivors have been told health records will not be passed onto them directly but to a nominated GP or other doctor who will …
If you or anyone you know is affected by this refusal by the department, please let them know about Simon McGarr’s sample letter:
Sample letter in response to Dept of Children refusal to provide Health Data in response to Mother and Baby Home survivor Data Subject Access Request
Meanwhile, across the Irish Sea …
Now the government have conceded that a delay is necessary after maintaining as late as Friday that none was needed. It’s news that will be greeted with a strong sense of déjà vu by those who remember the cancelled Care.data programme, a previous effort to collect GP record data centrally. It foundered in part because of a lack of awareness among patients, in spite of a national information campaign. Today the Information Commissioner’s Office told me “the success of any project will rely on people trusting and having confidence in how their personal data will be used”. The NHS will need to use the time this delay affords to rebuild just that: trust.
New NHS patient data store delayed by two months - BBC News
The creation of a central store of data from GP records in England had been due to start on 1 July.
The House of Commons Public Administration and Constitutional Affairs Committee examined the UK government’s plans for a Covid-Status Certification system. Didn’t like them much at all.
The Committee also noted that a Covid-status certification system would, by its very nature, be discriminatory, and would likely disproportionately discriminate against some people on the basis of race, religion and socio-economic background, as well as on the basis of age due to the sequencing of the vaccine rollout. We found no justification for introducing a Covid-status certification system that would be sufficient to counter what is likely to be a significant infringement of individual rights. There are also legitimate concerns over the serious data protection risks that would be involved in setting up a Covid-status certification system to the extent that the Committee cannot see how establishing the infrastructure necessary for such a system could be an effective use of resources.
Covid-Status Certification, Second Report of Session 2021–22 [direct link to PDF]
Once I started making a video, the change to my jaw shape was obvious. I suspected, but couldn’t tell for sure, that my skin had been smoothed as well. I sent a video of it in action to coworkers and my Twitter followers, asking them to open the app and try the same thing on their own phones: from their responses, I learned that the effect only seemed to affect Android phones. I reached out to TikTok, and the effect stopped appearing two days later. The company later acknowledged in a short statement that there was an issue that had been resolved, but did not provide further details.
TikTok changed the shape of some people's faces without asking | MIT Technology Review
Users noticed what appeared to be a beauty filter they couldn’t turn off.
The European Data Protection Supervisor published ‘From Lindqvist to Schrems II: case law of the CJEU on transfers of personal data to third countries’.
The European Commission opened an infringement procedure against Belgium regarding the independence of its Supervisory Authority.
The Wall Street Journal (paywalled) reported that Amazon faces a possible $425 million fine from the Luxembourg DPA. According to the report, the Luxembourg DPA has circulated a draft decision among the other DPAs.
- “It may be that witnesses explicitly consented to those acts of deletion, but this would have to have been done by means of full information being provided to them about the full consequences of deletion for them, including by leaving them unable to query the Commission’s summary of their evidence by reference to the original, full record of that evidence. In addition, the relationship between the Commission (as data controller) and the witness (as data subject) is an imbalanced one, and per the GDPR, it would not in such circumstances be generally open to the Commission to rely on consent (under Article 6(1)(a) GDPR) or explicit consent (under Article 9(2)(a) GDPR) to ground the Commission’s processing activities (and indeed you characterise the witnesses’ disclosures of their personal data as having been “induced”).” From correspondence between the DPC, the Mother and Baby Homes Commission of Investigation and the Department of Children obtained by Ken Foxe under FOI.
- “Polly Sanderson, policy counsel at privacy-focused think tank Future of Privacy Forum, notes that another key difference concerns CPA’s overarching position on consent and its implications for marketing practices. “Coupled with [the bill’s requirement that businesses obtain opt-in for the processing of ‘sensitive’ information], the consent standard explicitly bans covered entities from using so-called ‘dark patterns,’ which means manipulative user interfaces or design. Taken together, this sets a higher bar than both California (which is opt-out) and Virginia (which does not include anti-dark patterns language).” The bill’s language will set a higher bar for marketers to be transparent in their communications and avoid the use of potentially deceptive tracking methodologies and interfaces. Bartoletti, like many techno-ethicists, is particularly pleased with this restriction. “I like the focus on dark patterns — I think this is excellent, as dark patterns are interfaces that really impair people’s dignity and autonomy,” she says.” From ‘Why Colorado’s data privacy bill may be a big mountain to climb for marketers’ by Kendra Clark for The Drum.
- “We know that the use of data analytics and algorithmic decision-making is growing in the public sector. This can have significant implications for how citizens experience and engage with public services. Yet citizens are often left out of the debate about the development, implementation and uses of these emerging technologies. From planning, procurement, strategy, and general discussion, there are many ways to involve the public. In this guidebook we shed light on some of the ways civic participation can be enhanced in relation to algorithmic decision-making, focusing particularly on the public sector.” From ‘Advancing civic participation in algorithmic decision-making: a guidebook for the public sector’ [direct link to PDF] published by the Data Justice Lab.
Endnotes & Credits
Find us on the web at myprivacykit.com and on Twitter at @PrivacyKit. Of course we’re not on Facebook or LinkedIn.
If you know someone who might enjoy this newsletter do please forward it on to them.
Privacy Kit, Made with 💚 in Dublin, Ireland