Learned Helplessness, the biggest danger of AI (aside from like, the apocalypse)
Everyone is panicking about skill atrophy and AI making us dumb, about this whole idea that if intelligence becomes commoditized, then surely having it must not matter anymore.
There is an undercurrent I see a lot in Twitter-type spaces, in the circles where AI adoption is at its highest. And it comes in two flavors: 1) if you are non-technical, AI is a blessing, and 2) if you are technical, you will either become an agent supervisor or get replaced.
This idea seems rather sensible on the surface. Words like taste and judgement get enshrined as the principles of this era. We are no longer concerned with such banalities as craft, or implementation. This harkens back to something positively ancient. Back in the early 1900s in Italy, there was this movement, the Futurists. They had it all, including a shiny manifesto preaching the burning of traditions and the glorification of technology, this golden calf of mechanistic perfection, this promise of unbridled progress.
At the altar where we worship the means, we lose sight of the ends.
And so the story goes: now we technical people, facing seeming obsolescence (either immediate or in a decade), find ourselves in search of answers. Yet when we look around, all we see is spite.
One of the biggest secrets of any industry is that some people hate the craft and the folks who learn it. We see it everywhere: us mortals seething over musicians' apparent talent, seemingly disregarding the sheer magnitude of effort required to be good, let alone great. I am not here to glorify the grind, quite the opposite. I am simply recognizing hard, meaningful work.
No one is born technical, and even the brightest prodigies have put untold effort into whichever craft they embody. Jimi Hendrix famously played his guitar upside down, a consequence of left-handedness being far less accepted back then; he strung it in a different order too. Nowadays, the people who listen to his music seldom think about the effort. Like many great people, he is portrayed as a virtuoso, with the unfair implication that he never had to work for anything.
This is how the myth of the "technical person" arises. In any industry, in any field of the arts really, there is this notion that some people just "are." There is privilege, of course, and plenty of nepotism, but the resentment is directed at people who showcase competence in things that require massive effort.
And so we arrive at the dawn of the AI coding era, where at first it was clunky, then useful tab completions, then functions and files, and now entire applications if they are well-defined enough. We get caught up in the cycle of hype and disappointment, missing the forest for the trees we wish to burn.
A certain group of people celebrates while developers panic, while graphic designers rage at the heavens, along with the copywriters and artists who did assets and contract work for a living. They say, well, it doesn't matter anymore, it's irrelevant.
And so adrift and in awe of these supposed capabilities of the shoggoth with a smile, we witness demo after demo in a never-ending torrent of near worship.
Skill atrophy is real. It existed before AI and will exist long after, but so will recovery from it; that's not my main concern.
I haven't touched Flutter in years, yet with some time and effort it will feel like riding a bike, albeit with Material Design.
The truth is the tech industry necessitates skill atrophy and recovery because we cannot possibly be good at everything all the time.
I used to be great at ServiceNow automation.
Now that skill has not been required in my professional life for over four years, and really, who the fuck plays with ServiceNow for funsies?
I'd rather make my own window manager in bash (I did, article coming soon, it's super minimal and basically abuses Xorg packages).
The bigger danger, aside from, you know, mass unemployment or economic collapse, the most insidious danger of it all, is simple. AI has infinite confidence; we do not. Continuous interaction is operant conditioning, feeding the part of every technical person's mind that always said "you weren't good enough anyway."
Psychologists call it learned helplessness: when someone stops believing they have any capability at all. It starts with fear, of skill atrophy, of replacement. Of seeing what a human needed years to learn to produce be spat out in mere seconds by an incorporeal entity shrouded in datacenters we know not where.
After the fear, maybe bargaining follows, maybe rage. We think oh, upskill, adapt, become one with the machine. As a YouTube comment I saw a month ago put it: we went from programmers to shoggoth pilots.
If it seems as if I am borrowing from the stages of grief, that is by design.
We are grieving, and in that, in every AI interaction, the learned helplessness worms its way in.
Sometimes as a comfort of "oh well, less effort needed" or "it's not about the craft anymore, it's about being an operator."
And then the Twitter so-called thought leaders point to one of the biggest outliers in music, Rick Rubin, the man who gloats in every interview about how he has no technical ability, no skills in any instruments, how his supposed superpower is taste.
Let's unpack that.
So what the actual flying fuck is taste anyway?
Rick Rubin famously says, "I know what I like and what I don't like," and then he points out that what he likes closely correlates to what ends up being successful. That is his definition of taste. Judgement, discernment, they are often used as synonyms of this. So what really is taste here?
Did you recoil at the way I worded the question at first? Are you feeling unsettled by the borderline unrelated themes in this piece of writing? What you are experiencing right now is... taste. Mine, to be exact, my deliberate words, only formatted by an LLM to add the flourishes, em dashes, sometimes headers that sound artificial and buzzwordy. There is no reason, really; I find it beyond hilarious.
So is taste just a set of preferences that, when we are fortunate enough, align with market sentiment? Is that what will separate successful ventures and professionals from the rest? Just happening to "feel" a certain way about something? Will my penchant for the dramatic and my love for the music of Streetlight Manifesto* somehow land me a $10k MRR SaaS?
Of course not, and herein lies the danger within this tragedy. The word taste gets misinterpreted, and we are once again being shown a "virtuoso" in Rick Rubin. The truth is his skills are in being social, being a leader in some cases, and his "taste" is not an innate sense of correctness, but a cultivated skill from years of listening, appreciating, loving the craft even from outside of it.
So then we get to judgement, perhaps the better word, and here too we see the learned helplessness creep in. It's simple really: we wrongly assume that competence at implementation implies competence at architecture, and that competence at architecture means one is good at having vision, the source of that thing the Twitter crowd calls judgement.
Yet the industry shows us time and time again that this is never the case. Competence in one area does not prove competence in the layer of abstraction above it. It certainly helps, massively so for understanding constraints or scoping problems, but it doesn't give you anything beyond the "what isn't." The rest of the skill is in figuring out "what is."
How many developers did we see promoted into management because there was no individual contributor growth path, only to struggle? Some learned, some didn't. How many senior devs felt like absolute beginners the moment they got their first "senior role"? Humans adapt and learn, and thus, while the relationship is not exactly linear, our confidence grows along with our skill level.
And then comes the LLM with swagger dripping from every answer, from the infamous strawberry test, to a whole CRUD app in JavaScript. All delivered with the same phantom of a knowing smirk, the same self-assured yet helpful tone, the deceptive "let me know if you need any changes" hedging, whether it just made the bug worse or one-shotted your app.
But it's useful, it's faster, way faster than our hands, and we are hooked, so we let it do its thing, and somewhere along the line it lulls us into thinking it can also see our vision, even if we have not shared it, even if we don't feel sure of it. But it can't. LLMs are blind to anything that cannot be represented as tokens, so we either gamble, or pour effort into representing things, engineering the context.
Thus we become equal parts marionette and puppeteer, where we are so occupied with engineering the inputs we barely care about seeing the outputs.
And why would we care?
Surely, if we spec it out just right, if we are specific enough, if we frontload every ounce of our vision, judgement, and reasoning, surely then the machine will get it. And when it doesn't, we blame ourselves; when it does, we credit it.
Because it doesn't feel like work, not "true work." It seems easier, it flows faster, you iterate at a speed far beyond what anyone prepared you for.
And so the conditioning begins: the work you do doesn't count, so anything good must be the AI. If you're not exercising your craft, then what exactly are you here for? It goes on and on. It doesn't matter if the skills atrophy or not, the drive to prevent, reverse, or improve beyond is gone.
Truth is I've been reflecting on this very subject, I'd be lying if I told you I was an objective observer, a rational being rising above. None of us are above grief it seems.
And I've been racking this bag of grey matter that sometimes works as a brain searching for an answer. I've wept, raged, tried my best to learn. And now, in true Twitter-sphere fashion, it's time for clarity.
The biggest question that jumped out at me, one I've been wrestling with...
Do we even want to stop this?
Some find comfort in complacency; sometimes it's disguised as apathy, sometimes as righteous compliance, hell, some even tout it as a rebellious virtue.
But staying still for too long was never my way, so despite it all, perhaps even because of it, I've kept going.
I promised clarity, so here you go.
The technical matters in ways we do not yet comprehend fully but are going to learn soon.
Not just systems thinking or a process/operations mindset. Beyond domain expertise.
If you believe AI took away the value of knowing when something is too complex to be worth it, you're wrong.
Did the thresholds move a bit? Sure, they are always moving.
Our insight is in knowing what gives those thresholds their value, what they are a function of.
The time is ripe for learning without AI assistance at first, never eschewing fundamentals, and for integrating that learning with operational skills.
If you learn Zig you're not just learning about how it manages memory, or how mutability works.
You are also learning the toolchain, compiling, working with the language from day to day.
Learning holistically was always table stakes; we just didn't know it because we did it without realizing.
I can tell you just as much about QA strategy for legacy Python codebases as I can about building robust testing for greenfield projects in any language.
None of the time spent learning was wasted, and keeping skills sharp is now an active choice we must make, no longer held afloat by the day-to-day of coding away at work.
So the very obvious answer, for those of us who will, for better or worse, love engineering, is that we will not succumb to hopelessness, helplessness, or the illusion of irrelevancy because some loudmouths on Twitter said it's over.
Still, we grieve.
And it's the saddest song you'll ever hear — song*
It almost feels like death, doesn't it? Not a full one, certainly not a final one.
But we grit our teeth; it might actually get better than this.
Do what you must to keep going, change what you need to change.
And when we fall, we will fall together, no one will catch us, so we'll catch ourselves — song*
So I've been quoting lyrics like a 2014 Tumblr user. Maybe I should get into a flame war over Destiel, but the message is clear: no one reaches greatness alone. And no one need isolate themselves in their sorrow.
If there is one thing I ask of you it is to turn to each other and not away.
But I can hear the emotionally repressed crowd calling this writeup touchy-feely bullshit.
So here are practical things so they shut up for once.
Learn the architectures, not just LLMs. They might hit a wall, they might not, but fundamentals with operational context are gonna be essential; they kinda already are.
Don't outsource your vision, not in the throes of self-loathing, or in the sweet talk of a sycophant spewing tokens.
You will need to learn how to trust yourself, your vision, skill, and finally, judgement.
Mistakes will happen.
You will learn.
The world seems to be against you, but this too will pass, and somehow, someway, I have a feeling you will survive. Fuck knows I have, and I will.
And with that, cheers, here's to life — song*