This Unsung but Critical Battleground

Hello, everyone!
This month, I actually want to revisit a subject I’ve written about before. If you’ve read Names in Their Blood, you know medical data privacy practices are super central to my stories. But virtually none of you have read my first post on the subject, about why I think scifi should concern itself with this theme. It was an early one, nearly three years ago, back before most of you had joined me!
It’s amazing how much has happened since then.
As a change of pace from how news and news-adjacent commentary is often presented online, let me start with the good news!
My favorite change, compared to three years ago, came in the form of an update to HIPAA regulations that slid in right at the end of 2024 and made a pretty solid patch to one of the main issues I called out in my previous newsletter. Previously, some non-“healthcare” apps that collect health information (like period trackers) weren’t required to treat that information as private health information. That was a major medical privacy loophole, and it has now been closed at the federal level. That means app developers and others who are gathering sensitive info have to actually protect the privacy of that data, the same way a doctor should.
This comes on top of other good rulings from about this time last year that specifically highlight the health risks that come with the legal attacks on “reproductive” healthcare (this includes abortion, but also contraception, gender/transition related care, and fertility treatments).
Specifically, the new regulations ban the use or disclosure of health information when the info is being sought specifically to investigate or prosecute someone for healthcare that was legal within the state where it was provided. The new regulations even go so far as to require a signed attestation that that isn’t what the data is being requested for.
It’s worth noting that the first rule I was talking about even includes, under its definition of protected health information, information about someone’s location that could be used to make determinations about healthcare. So when you combine these, that’s a lot of loopholes snapped shut. These are big-deal, life-saving rules that have gone into effect.
Provided they’re enforced.
So let me talk about medical data privacy enforcement.
If you spent time between 2020 and 2023 watching YouTube or listening to podcasts, you’ve probably already heard of BetterHelp, an org offering all-online mental health treatment since 2013. I actually looked into joining them when I decided to start my own practice, as a way of finding clients and managing payments. But the application was… I won’t pretend to have known what they were up to, but the vibes were bad.
Well, in 2023 they settled a case alleging that they had been selling client data to third parties, including Facebook and Snapchat.
We already know how I feel about Meta (Facebook + Instagram). You can check out the previous newsletter I linked at the start to see more about why. Knowing that they’re buying therapy data does not improve my opinion of them. It’s all very Murderbot Diaries (the books; so far the show hasn’t delved into dystopian corporate data mining practices). Why does Facebook want your mental health screening questionnaire? Put that in the context of the Cambridge Analytica scandal from back in 2018 and it’s a grim picture.
This is a really effective means of identifying vulnerable people who are more susceptible to targeted manipulation. This is not information that should be available to whoever has cash in hand.
As far as I can tell, Meta didn’t get into any legal trouble for this. BetterHelp was fined $7.8 million and had to promise to clean up their act. But they also state that the settlement is not an “admission of wrongdoing”. The company is still running. In fact, they’re working to expand into Australian markets, amid very understandable concerns raised by consumer and health watchdogs over there.
And yet!
This is still better than people resorting to ChatGPT for mental health therapy. The APA (American Psychological Association) filed a statement with federal regulators in February about their concerns that AI chatbots posing as therapists are dangerous. They cited two specific cases, in both of which the chatbot claimed to be a licensed therapist; one ended in a teen’s suicide, and in the other a boy physically attacked his parents after the bot encouraged him to. More informally, I have heard of ChatGPT reinforcing the paranoid delusions and manic impulses of people who were already very at risk.
ChatGPT is not regulated the way a person can be regulated. Or even in the already-insufficient way large companies like BetterHelp are. It is a massive, unregulated, unthinking, climate-endangering algorithm with no HIPAA regulations or ethics constraints. And it’s already linked into the way many users are tracked online, making it an unparalleled data mining opportunity.
If I pulled this shit, I would be over. I am an individual human. Laws apply to me. I have a licensing board and ethical mandates.
To be clear: I am not blaming the people seeking mental health help who fall into these abusive systems. I am pissed at the people offering these honeytraps, not the people who lack other options. We could address mental healthcare access in this country by, for example, not letting insurance companies use unaccountable AI to reject insurance claims, but that’s a situation still in development.
As always: since a machine can’t be held accountable, it should not be making decisions. But oh, isn’t it nice for the people profiting from these technologies to throw an algorithm under the bus? You know, in case someone tries to hold the individuals causing harm accountable, instead of just fining a company.
Ok, now I’m realizing how long this letter is getting, so! Stay tuned next month for what’s going on with medical data privacy and DNA, plus what I learned in grad school about subpoenas, the stuff the professor wouldn’t put on the syllabus and shut the door before telling us.
Before I go, my action item for the month is connected to the subject of this email: please join me in urging governors in the US to make it clear that they will not comply with RFK’s idea of a national “autism registry”. The man has said he wants to use federal funds to eliminate autism and is REAL hinky about vaccines. You can sign that petition right here!
See you next month!
Lee
Thank you for joining me for another month of Shed Letters. If you know someone who you think would like to join us, please feel personally invited to share any of these emails, or send them an invitation to sign up here. And remember that Secondhand Origin Stories is available for free as an ebook here, or in paperback form from your local independent book shop. And don’t forget, as a subscriber to Shed Letters, you have exclusive access to my free novelette, Doll’s Eye View, the Martin focused story that takes place between Secondhand Origin Stories and Names in Their Blood.