Kev in Progress

April 30, 2025

Circular Desires for Generative Algorithmic Systems or: How everyone came to believe in AI

From talking to friends, family, and strangers, I feel like I'm in the perpetual limbo of an echo chamber; a circular maze of reflective mirrors. Folks ask me questions like, "Will AI take all our jobs?" and "What should I teach my child so that they have an advantage over AI?" I see articles with headlines like A.I. And Vibecoding Helped Me to Create My Own Software and More than a quarter of computer programming jobs just vanished, which suggest a bleak future for anyone with a Computer Science degree or an interest in programming, present or future. Similarly, new research papers are published weekly exploring the possibilities of using generative algorithms in a new domain, in a new context, or for some understudied population.

Take, for example, the evidence the vibecoding article uses to argue that programming jobs are at risk:

"Many A.I. companies are working on software engineering agents that could fully replace human programmers. Already, A.I. is achieving world-class scores on competitive programming tests, and several big tech companies, including Google, have outsourced a large chunk of their engineering work to A.I. systems. (Sundar Pichai, Google’s chief executive, recently said A.I.-generated code made up more than one-fourth of all new code deployed at Google.)"

This raised a red flag in my head immediately. Before any actual analysis: using evidence from what is, more or less, an advertisement, delivered by the CEO of the product being advertised, is questionable at best, deceptive at worst. And when that evidence rests on a single metric, and that metric is the amount of new code, it begins to resemble the period when Elon Musk asked engineers to print out their code so he could evaluate it on paper, judging their value to X by the number and size of their commits. In other words, the claim about AI begins to look nonsensical and half-baked because the evidence is inadequate.

Plausibly, though, if 1/4 of the aggregate coding output is truly algorithmically generated, that could indicate a growing shift toward displacement of human workers. That is, assuming workers currently type prompts, accept the resulting code output, and close their laptops for the day. That is, assuming any code runs perfectly, without any bugs, the first time. For my non-technical folks who may be missing the joke: this is highly unlikely.

Anecdotally, I have found algorithmically generated code to be quite helpful for speeding up the process of figuring out how to manipulate data in very particular ways. Tasks that might have taken hours and required a whiteboard session have, admittedly, sped up with the use of these systems. I just think that not every programmer or scenario will benefit from this in the same way.

I've talked with several engineers at large companies (e.g., FAANG) about how they use algorithmically generated code in their work, and in the sea of AI reporting that exists, theirs is a perspective I have rarely seen expressed. One engineer said, "All the AI generated code [takes] so much work just to actually work half the time... Last time I used AI to generate tests I got 8 comments during review asking why my tests were bad." They reported that their managers directed them to maintain a quota: a set percentage of their work had to be algorithmically generated code.

The scenario plays out in my head like this: the CEO tells the VPs, "start using our algorithmic code generator tool"; the VPs tell the division leads, "let's increase our usage of algorithmic code generation by X%"; the division leads tell the managers, "every team's new release must contain X% algorithmically generated code"; the managers tell the engineers, "25% of your code should be algorithmically generated" or "your tests should be created using AI"; then the CEO tells the press, "algorithmically generated code makes up more than 1/4 of all new code at my company"; and the press tells us, "algorithmically generated code could fully replace human programmers, sooner than you think".

This recursive logic was somewhat verified when another engineer told me that a division lead told employees, "It is an existential threat to your job if you are not using AI for your job."

It's not atypical for engineering teams to test their own products themselves; the practice is known as 'dogfooding'. It does seem atypical when the dogfooding directives originate from the CEO and are then used directly as fuel for the AI hype engine in the press. It places the average person in the unfortunate position of being at the butt end of the AI dogfood circle.

My qualms with "AI" have less to do with efficacy than with how it is used to wield power over groups of people and individuals. The language around so-called AI lends it power through opacity and mysticism, so I intentionally use terms like generative algorithmic system instead of artificial intelligence, to push back on the notion that what these algorithms are doing resembles intelligence. Mysticism allows Google (and other companies) to convince consumers to think/study/create only with their platform's "intelligent" assistant software, and to convince others to spread the message because they profit from the use of this software and from the ownership of AI infrastructure. I worry about over-reliance on company-mediated platforms because they can disappear or change drastically at a rich man's whim. And yes, over-reliance on company-mediated platforms reminds me of Dune:

"Once men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them."

While I certainly have a Luddite bias in my view of AI, I think we could all use some critical evaluation of our beliefs about, and usage of, systems that are bringing coal back and significantly impacting education and artistic labor worldwide.


It's been a while (about 5 months) since my last post, thanks for being here and reading anyway. Maybe I will get on a monthly cadence with this? Maybe the posts will be shorter? Who knows.

- Kev in Progress
