Mystery AI Hype Theater 3000: The Newsletter

December 12, 2025

Report Reveals the Devastating Cost of AI Intimacy

Plus, what an uncanny customer service experience taught us about global systems of labor exploitation

By: Decca Muldowney and Emily M. Bender

This week, the Data Workers’ Inquiry project released a powerful new report detailing the experiences of Michael Geoffrey Asia, a text chat operator from Nairobi, Kenya. In “The Emotional Labor Behind AI Intimacy,” Michael describes how he was working as a data labeler when he heard from colleagues that there was money to be made in chat moderation. What he didn’t know at the time was that the new role would require him to assume countless false identities and engage in romantic and sexual chats with people all over the world, or that his work would likely be used to train AI bots to replace him.

“One day, I might be Jessica, a 24-year-old lesbian college student from California, and Joe, a 30-year-old gay man from Florida,” Michael writes. He was unable to tell his wife what his job entailed due to the NDA he’d signed, and every day of deception made him feel more alienated and torn. The psychological scars of the work were profound. “Every moment of pretense fractured something inside my spirit, and my sense of self. I was losing touch with who I really was, a feeling that has never left me.”

Of course, Michael’s salary was tied to his output. He would get $0.05 per message, and had to type at least 40 words per minute, while juggling multiple chats simultaneously. The message interface had a dashboard displaying live tallies of the number of messages he was sending, meaning “every line of manufactured affection, every fake confession, every scripted ‘I love you’ had a price tag.” On top of that, Michael believed his work was being used to train chatbots. 

“I was training my own replacement, teaching machines how to manipulate lonely people the same way I was being forced to,” he writes. “Remember that an AI girlfriend responding to your loneliness might just be a man in a Nairobi slum, wondering if he'll ever feel real love again.”

Michael’s story of the intense alienation of chat work—an industry that sells connection and instead ends up leaving workers psychologically and spiritually wounded—is just one example of the countless ways “AI” technologies (whether actually automated or not) are appearing, without our consent, in our everyday interactions. The insertion of automation takes interactions that previously might have been local, between two people who could actually perceive each other, and instead feeds systems of global labor exploitation.

Earlier this week, Emily had an uncanny experience when she called a hotel about an upcoming booking. A prerecorded message on the other end of the line informed her that her call might "be recorded or AI-assisted for quality assurance." She was then redirected to a customer service voice that was natural(ish) sounding, but a little too wordy, and cheerful, and didn’t react normally to Emily’s questions. 

Emily suspected the voice was synthetic and was likely reading off LLM-generated strings that were lightly vetted or accepted by a call center worker somewhere. Like Michael’s experience of chat moderation, this kind of outsourcing is terrible for everyone involved. Why? 

Because the two people immediately involved in the interaction (Emily and the customer service representative) are prevented from communicating directly. Companies want to take people out of the equation, pretending that conversations are fully automated. But there will always be someone needed to shepherd the interaction so that it feels like you’re talking to a person. This is what Mary L. Gray, the author of Ghost Work, calls “the last mile of automation.” The call center worker is likely working under poor conditions, with pressure to complete as many calls as possible. The on-the-ground hotel employees have to deal with guests who have no doubt received false information from the remote worker or the bot itself. And in order to switch over to this “AI”-powered system, the hotel likely outsourced some fairly decent jobs that would previously have been done in-house by a person.

And, to the extent that the hotel chain is paying the LLM company for their part in this, they're helping to build the case for the next enormous model training run and the next hyperscale data center, so we all lose. And this is what Michael is experiencing in his work of impersonating intimate companions, having to appear as an automated partner because companies are attempting to get their chatbots to a place where the interaction feels “human” even while they still need real humans to make it run. This dehumanizes real people for the veneer of automation.

“AI hype at work is designed to hide the move employers make towards the degradation of jobs and the workplace behind the shiny claims of techno-optimism,” Emily and Alex write in The AI Con. “But when we look behind the curtain, we see instead that automation is being wielded as a cudgel against workers and trotted out as a cost-saving device for employers, leaving workers the tasks of cleaning up after it, tasks that are devalued, and more poorly paid while also being less creative, engaging, and fulfilling—or at worst, outright traumatic to carry out.”

Here are some Mystery AI Hype Theater 3000 episodes that dig into the issue of automation at work and the impacts of “AI” tools in the workplace and beyond. 

  • In Episode 13: Beware the Robo-Therapist, UC Berkeley historian of medicine and technology Hannah Zeavin tells us why the datafication and automation of mental health services are an injustice that will disproportionately affect the already vulnerable. [Livestream, Podcast, Transcript]

  • In Episode 37: Chatbots Aren't Nurses, registered nurse, nursing care advocate, and Director of Nursing Practice at National Nurses United Michelle Mahon explains why generative AI falls far, far short of the work nurses do. [Livestream, Podcast, Transcript]

  • Chapter 3 of The AI Con, “Leisure For Me, Gig Work for Thee: AI Hype at Work” digs into these issues, particularly in the section “AI is Always People” (p.58) where Alex and Emily write: “Most AI tools require a huge amount of hidden labor to make them work at all [...] It’s not an exaggeration to say that we wouldn’t have the current wave of “AI” if it weren’t for the availability of on-demand laborers who could be called upon at any time to perform a set of tasks whenever some AI researchers or corporate engineers demanded it.”


Our book, The AI Con, is now available wherever fine books are sold!

[Cover image of The AI Con. Available now: thecon.ai]
