
L30 Transmissions

February 17, 2026

Political Philosophy and the Rise of LLMs

Foreword

In my last transmission, I promised to answer a number of questions. The following will attempt to answer the first question:

What do you actually want to achieve with all of this?

This is not a question that I have either a complete or concise answer to: change in the environment is constant and rapid, and so naturally my specific goals must evolve in response. I will try to balance analysis of the present situation, predictions for the future, and an explanation of the philosophical ground that these rest on.

Political Philosophy and the Rise of LLMs

"Agreement at all costs is possible only as agreement at the cost of the meaning of human life. For agreement at all costs is possible only if man has relinquished asking the question of what is right, and if man relinquishes that question, he relinquishes being a man. But if he seriously asks the question of what is right, the quarrel will be ignited... the life and death quarrel, the political- the grouping of humanity into friends and enemies- owes its legitimation to the seriousness of the question of what is right."

— Leo Strauss, Notes on Carl Schmitt, The Concept of the Political (1932)

My motivation for the work that I'm doing now comes originally from a personal search for meaning in my own life: an attempt to answer the question of what I should do with my life and why, a question whose consideration I had almost unwittingly surrendered to blind obedience to one ideology or another, to unquestioned traditions, and to a tendency towards escapism. When, finally, I managed to recover this question and to think about it seriously, I learned that, in doing so, I had taken my first baby steps down the endless path of political philosophy.

Apparently, I'm far from the only person in tech to do so. The days of technologists being allowed to skate by on vague promises of "making the world a better place" are long over. We've all watched as the utopian promises of the early internet descended into a nightmare of surveillance and extraction, an ouroboros of misinformation and censorship. Global connectivity turns out to be at least as traumatic as it is enlightening. CEOs and investors have suddenly decided that STEM isn't the only thing that matters after all, and some have even begun to publicly position themselves as a new generation of philosopher kings, although Alex Karp's piggybacking on what is arguably one of the greatest works of political philosophy of all time in order to create a glorified advertisement for Palantir leaves much to be desired for anyone with a real interest in the topic. The fact that pointing out the obvious— that profit maximization ought not be the sole goal of intelligent and ambitious citizens of the free world— gains Karp such favorable reviews from leading institutions only reveals the severity of our affliction.

But if the purpose of technology isn't merely to profit, then what is it? For Karp, it's to attempt to rescue Western hegemony through technological dominance. With Sam Altman and Dario Amodei, we're back to making the world a better place, but this time through Machines of Loving Grace that also happen to be useful for toppling foreign governments. Although I have taken some comfort in the fact that Anthropic has maintained a lead over OpenAI in the current AI arms race, we should have learned by now that we can never trust in the benevolent intentions of a multi-billion dollar corporation that holds a critical geopolitical position, at least no more than we can trust in our own governments. Even the mediocre moral standard of "Don't Be Evil" was dropped by Google as they reached scale. After being founded in 2015 as a 501(c)(3) with assets "irrevocably dedicated to charitable purpose," OpenAI completed the conversion to a for-profit entity in 2025. At this rate, the far more ambitious charter of Anthropic doesn't stand a chance, as foreshadowed by the recent public resignation of the leader of their Safeguards Research Team for the reason that employees "constantly face pressures to set aside what matters most."

It's possible that time is already up and all we can do at this point is watch as power continues to centralize in the hands of a few tech companies and their benefactors, but personally I'm not yet ready to accept my position in the new permanent underclass or rest all of my hopes on the benevolence of mega-corporations. If we wish to find another way, then there's a lot that needs to change very quickly. The machines do not rest. But first we must begin to understand the predicament we're in.

Outside of tech, most people seem to intuitively grasp the relationship money ↔ power but have not yet updated their thinking for the 21st century:

money ↔ compute ↔ power

Intelligence used to be something that money couldn't buy, and while this will likely continue to be true in some domains, it is broadly untrue today. There has never been a more fluid conversion process between capital and hard power. Accepting this premise inevitably results in a few conclusions:

  1. Although rational by the logic of neoliberalism, relying on a single company in a nation under constant threat of invasion from a neighboring superpower for all of our computer chips is an incredibly arrogant and foolish strategic decision, or it would be if it was a decision at all and not simply the result of decades of blind deference to market logic. This is deeply troubling but is also a prohibitively difficult problem to approach as a mere civilian.
  2. Each time we pay for a cloud service, we are renting computing power from others rather than owning it ourselves. We either pay with our money or with our attention and personal data, but either way, we're paying rent. After the transaction is complete, we, the tenants, have less power to purchase our own compute while the landlord has more, and before we can blink, our entire block belongs to them. This is what I want to focus on.

I'm not here to scold anyone for watching Netflix, buying things on Amazon, or scrolling Instagram, but rather to work towards providing alternatives that fulfill similar functions while leaving the exploitative business practices behind.

To understand the difference between technologists who, whether they are conscious of it or not, wish to exploit us and those who are genuinely providing a service, I'll lean on Heidegger's concept of Fürsorge, which literally means "caring for," usually translated as "concern." In Being and Time, Heidegger distinguishes between two modes of concern for others: one that jumps in and takes over for the other person, and one that instead helps them find their own way. When someone leaps in and does things for us, we become dependent on them. Our own possibilities become limited by what they will allow us to do or what they will do for us, and our relationship with ourselves and with the world suffers as a result. The former mode of concern, though arguably necessary at times and often well-intentioned, is a form of domination.

For this reason, no honest, well-meaning, and discerning technologist will offer to completely take care of things for you unless they have reason to believe you incapable of doing so yourself. Capability, however, is not only a function of aptitude, education, and will. These are important inputs, but capability in the context of technology arises from interaction with the technology itself, meaning that the current state of the technology is just as important as, if not more important than, your own knowledge and abilities. It would not matter how good at programming you are— given an alien machine that does not work according to your logic, you would not gain any new capabilities, and no one would wonder why. No individual can therefore be blamed for failing to gain control over technology that was primarily designed to make them dependent on it.

Fortunately for us, countless technologists have worked together to ensure our continued access to technology which aims to empower its users rather than to feed off of them. Participants in open source and free software movements have been doing so for decades, usually without any compensation at all beyond the satisfaction of performing a generous act. But as AI trained on their charitable work has accelerated a dramatic centralization of money and power while making no attempt to offer any compensation, many are running out of goodwill. This has led us to a point where Small Tech must now move underground. We can continue the spirit of freely sharing and collaborating with one another within our trusted circles, our Dark Forests, but producing free training data should no longer be seen as a moral imperative. Our challenge now is to compete with Big Tech by doing precisely what they will never do: genuinely attempting to empower less technical people, to help them help themselves, to declare independence from the tyranny of the nerds. This is not to say that everyone ought to become a computer expert. No one has the capacity to learn everything. The real goal is to restore a baseline of trust between technologists and the rest of society by demonstrating that user empowerment and user experience need not be mutually exclusive, that, in fact, our experience of technology will improve when the technology serves us instead of the other way around.

I'll leave the question of how exactly I want to translate the goals presented here into action until next time. For now, I'd be happy to learn where you agree with, disagree with, or were just confused by what I've written here. Many of the problems that I laid out are likely caused in part by a lack of communication between the left and the right brain of our collective intelligence, and solving them will mean that we have to learn how to talk to each other.

Join the discussion:
  1. poeticsuicide
    February 18, 2026, morning

    Your comment on money equals computing equals power is exactly what the economist Yanis Varoufakis defines as technofeudalism.

    Here's an interview: https://youtu.be/sZDh8JvUG1Q?si=B2MaX8STOtNgKOJl

    Funnily enough, he used to work at Valve, the company behind Steam

  2. ↳ In reply to poeticsuicide
    Lane Spangler (Author)
    February 18, 2026, afternoon

    Ah, right: I'm sure I originally got the idea from him, but I didn't think of it when I wrote this. He has a lot of good insight, but I think his lack of knowledge about the details of how the technology in question works leads him towards solutions that feel outdated to me.

    Maybe I'm foolish to imagine technical solutions to a fundamentally social/political problem, but I think the real problem isn't clearly one or the other. It's in the feedback loops between them.

  3. ↳ In reply to Lane Spangler
    poeticsuicide
    February 18, 2026, evening

    Not at all. I'm no historian, but I think feudalism lost its hold when a burgeoning class of merchants had enough buying power to act as its own cohort, hence the rise of capitalism. Therefore, if this equation of money -> computation -> power holds, it would mean that giving the working class access to computational power would be a way to shake up the current order.

    It would be interesting to see what the proletariat's total computational power is when aggregating everyone's phones/PCs vs the server farms big tech owns.


