Nov. 20, 2025, 5:24 p.m.

The affiliate marketing HHS

Closed Form

I’m bumping what I had planned to work on today, a bit of critical reflection on AI, in order to write this up instead while everyone is reacting to the CDC’s addition of a web page promoting long-debunked “information” about vaccines and autism. What follows isn’t going to be as thorough as the big thing that I have coming about MAHA, but it picks up on one strand of an argument I develop in that piece.

The new CDC website in question is here. The “key points” as they appear on the page now are as follows:

  • The claim “vaccines do not cause autism” is not an evidence-based claim because studies have not ruled out the possibility that infant vaccines cause autism.

  • Studies supporting a link have been ignored by health authorities.

  • HHS has launched a comprehensive assessment of the causes of autism, including investigations on plausible biologic mechanisms and potential causal links.

It has me feeling a bit like Bruno Latour in “Why has critique run out of steam?” Yes, be critical about how evidence is established, but no, not like that! Wellness grifters, Covid contrarians, and regulation slashers are very skilled at using the language of scientific process and evidence to nefarious ends. It works because one has to be fairly literate, in the sense of understanding the technical nuances of that scientific process, to immediately get why something like the above text is so infuriating and bad-faith. This literacy is what I think some scicomm people believe they are imparting. A sample of the Bluesky Discourse about the addition of the web page: the CDC is “disseminating disinformation” (Yaver), “now directed to push disinformation” (Cruickshank), engaged in a “stunt” to “spread vaccine/autism disinformation” (Alt CDC) or, if you prefer, a “disinformation escalation” (Offir). The accompanying tone of moral outrage is also exactly what you’d predict. It’s “difficult to overstate just how dangerous this is” (Yaver), not that anyone will refrain from trying. The “vile dribble of a webpage update needs to be removed” (Alt CDC); it’s “intentionally weaponized eugenicist rhetoric that will kill people” (Tran); it’s “an incredibly sad and devastating place for our country to be” (Jetelina).

I’ll leave aside the worthwhile question of whether this is effective in correcting disinformation, or whether it succumbs to the logic of the disinformation framework itself. I want to focus on the obvious pitfalls of fighting disinformation qua disinformation. Framing the issue as one of disinformation suggests that the core social problem at hand is that people see wrong facts and, for any reason at all, believe them. As Quinta Jurecic wrote in a great 2024 essay about the failure of the anti-disinformation effort No License for Disinformation, “lies, it turns out, have a constituency,” though the nature of this constituency is never investigated. The content and pace of the furious posting are a problem – if everything is a devastating outrage with incalculable human cost, then really, in our oversaturated media environment, nothing is. But the bigger problem is the theory of the case underpinning furious posting as a strategy, something like: people are seeing wrong facts, from people who don’t have sufficient expertise to speak on behalf of the scientific consensus, and so we need to expose people to real facts, and loudly and insistently establish our credentials for conveying these facts, which is to say, to insist that we should be believed because we’re educated.

The main problem with this is that it doesn’t work. At all. It hasn’t ever worked. For one thing, plenty of people with impeccable credentials are inveterate bad actors and shit-peddlers – take Martin Kulldorff, for example. For another, as Jurecic notes in her essay, there is great “difficulty” in “distinguish[ing] dangerous quackery from productive disagreement.” Attempts to understand MAHA and what is happening in the federal health bureaucracy through the disinformation framework will never not be plagued by these issues. They are epistemic issues inherent in science itself and hence in any attempt to communicate science; they cannot be overcome by establishing expertise or authority. In light of the overwhelming evidence that the disinformation framework leads nowhere productive, I want to suggest that we act like the scientists we play online and abandon it in favor of a framework that I think does a better job of explaining many of the relevant data points. The right framework – the one that gets us asking the right questions, and opens rather than closes avenues for political possibility – is marketing.

If we read the above text as marketing copy, then the natural questions to ask are: Why say these things? Who is benefiting from saying these things to the American public? Rather than being forced to conclude that RFK Jr. is simply a malign bad actor who wants kids to get vaccine-preventable illnesses because (to use one egregious example of psychologizing) other people’s expertise makes him feel bad, the line of analysis coming from the marketing framework leads down an avenue of greater possibility. Walker Bragman has reported that the new page cites a lot of RFK Jr.’s fellow-travelers in the anti-vaccine griftersphere, the citational equivalent of the affiliate marketing networks that dominate the social media disinformation environment and the world that RFK Jr. comes from – his Children’s Health Defense makes its money from individual donations and, you guessed it, affiliate marketing. (We desperately need good investigative journalism mapping out these affiliate marketing networks; I am speaking it into existence!) The additional web page is not just “one battle after another” in the disinfo wars; it’s one part of a bigger program to repurpose HHS for self-dealing. Is this particular part strategically important to focus on?

I’d argue that it isn’t, if by “focusing on it” we mean yet more furious posting. We know that good marketing is psychologically effective, but somehow we think that good science communication just is. That’s a grave mistake, and it leads to the plainly insane implicit strategy of “replacing bad thoughts with good ones, at population scale, by voicing our authoritative disgust or horror in increasingly high dudgeon.” To be effective, science communication needs to learn something from marketing strategy. It needs to reflect a coherent understanding of the world back to people, and to help make sense of their actual fears and grievances, not the fears and grievances they would have if they were smarter, more scientifically literate, or more politically sophisticated. (I also think the MAHA marketing works on something of a sublimated wish here, a wish for actual political reality to be as degraded and hopeless as people’s emotional lives, but that’s a subject for another essay.) To have an effective science communication strategy, we simply can no longer bury our heads in the sand about health care and medicine. Part of the reason wellness marketing works is that we’re used to being aggressively marketed legitimate medical treatments by legitimate professionals. When I was a teenager, for example, a family doctor – a woman of prestigious education and tremendous professional and personal integrity – tried to market to me, during a checkup, the skin care products she was selling from her office! Another reason it works is that science communication is leaving the entire experience of health care on the table.

Consider the actual experience of trying to access behavioral health services for autism – it’s almost impossible in this country, destabilizing and financially grueling for most families. This creates a rich deposit of anxiety in the population that MAHA is happily mining for its own enrichment while we, the experts, tell ourselves that nothing can be done if people just really want to believe falsehoods. Jurecic writes that there is “political benefit” to promoting falsehoods, and I really wish she had followed that thought out to its conclusion. Political benefit to whom? What if we tried, in terms of science communication, to confront the psychological appeal of wellness bullshit? For example: we see and understand how impossible it is to get your kid a behavioral health diagnosis, or any follow-up care. We understand that this creates real, cascading problems for you that you have to contend with and that demand immediate solutions. There are real treatments out there that can help you, but they’re unacceptably hard to get, and we, the experts, are committed to the political advocacy necessary to reform our health care system so that it works for you, not the insurance executives or the supplement hawkers – but we need you, and we can’t do it alone.

This last part is literal, by the way. We really can’t do it alone. And we can’t just scicomm it all away. We have to fucking mean what we say. This is all disastrous, but I must continue to emphasize that it can be undone. It could be undone tomorrow, if we were to gain some political power tomorrow. It’s much easier to get political power to change the leadership of HHS than it is to convince everyone to “trust science.” But it’s much easier to post expert cringe on main than it is to make meaningful commitments to concrete political efforts and to the “public” that public health is supposed to serve.

