Remember Co-Star? The algorithmic astrology app? (My Co-Star horoscope today: "You don't need to catalog everything you think." Well, you're not the boss of me.) A typical headline from late 2019, from Vogue: "Meet the astrology app that's intriguing millennials everywhere." From the article,
The app’s proprietary AI technology takes data from the publicly accessible Jet Propulsion Laboratory to map out the position of the planets. The team, which consists of 10 people, has writers who translate the astrological data into thoughtful daily horoscopes and push notifications.
Public data, proprietary algorithm. It's a common enough scenario, and an interesting one. What does it really mean to encode the esoteric knowledge of astrology practice -- in this case, the knowledge of the relative positions of planets -- in a mathematical algorithm that a tech company owns?
This is basically the question addressed in the first half of The Eye of the Master, a "social history of artificial intelligence" by Matteo Pasquinelli. (I got a galley copy of this book from a Verso pal I met at the conference in September, but Google informs me it actually came out yesterday!) It's definitely "a choice" to begin a social history of AI from the industrial revolution in England, and to start with a close reading of Marx's writings on machinery in the Grundrisse and Capital. I think it's a choice that makes sense. As Pasquinelli argues, following Marx and Charles Babbage, science isn't the inventor of technology -- labor, or rather the division of labor, is. This did not make sense to me for the first few chapters of the book, to be quite honest, but I think it's a valuable insight and I'm going to try to explain why.
Pasquinelli takes Babbage's "labor theory of the machine" as a point of departure, as illustrated by his unsuccessful invention, the Difference Engine. The Difference Engine is often called a precursor to modern computers, or even "the first computer," and it never actually worked, but that doesn't much matter. What is interesting about it is the "problematic" that motivated its invention. The point was to automate the mental labor of calculation. As anyone who has ever written software code knows, calculation gets repetitive fast. Instead of farming out hand calculations to "computers" (office clerks or at-home workers, usually women), the Difference Engine would replace this system with a single, centralized, steam-powered machine. As ever, there was an economic imperative here -- the demand for logarithmic tables, difficult and tedious to compute by hand without error but essential to navigation by the stars and thus to British naval power. (Oceans are now battlefields, etc.)
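To make "the mental labor of calculation" concrete, here's a little sketch of my own (not Babbage's, not Pasquinelli's) of the method of finite differences, the trick the Difference Engine was built to mechanize: for a polynomial, the n-th differences are constant, so once the first few values are set up, every further table entry is pure addition -- exactly the kind of repetitive mental work you can hand to a crank.

```python
# A sketch of the method of finite differences, which the Difference Engine
# was designed to mechanize. For a polynomial, the n-th differences are
# constant, so after an initial setup the whole table can be cranked out
# with nothing but addition -- no multiplication, no judgment.

def tabulate(coeffs, x0, step, count):
    """Tabulate a polynomial (coefficients low to high) by repeated addition."""
    def f(x):
        return sum(c * x**i for i, c in enumerate(coeffs))

    degree = len(coeffs) - 1

    # Seed the engine: the value at x0 and its first `degree` differences.
    row = [f(x0 + i * step) for i in range(degree + 1)]
    diffs = []
    for _ in range(degree + 1):
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]

    # Turn the crank: each new entry comes from adding neighboring differences.
    table = []
    for _ in range(count):
        table.append(diffs[0])
        for i in range(degree):
            diffs[i] += diffs[i + 1]
    return table

# Illustrative polynomial (my choice, not from the book): x^2 + x + 41.
print(tabulate([41, 1, 1], x0=0, step=1, count=8))
# [41, 43, 47, 53, 61, 71, 83, 97]
```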
Part of the mental labor here is easy to automate -- if the calculations proceed according to a determined sequence of rules (an algorithm) then you can program a computer to do them (we can; Babbage couldn't, to be clear). But the algorithm doesn't really know what the calculations mean or how to interpret them. This is true even today; despite the dire warnings periodically issued by tech executives trying to sell and hype their oh-so-powerful AI systems, burnishing the noumenal halo around their proprietary algorithms, there has been no actual breakthrough in machine intelligence. AI systems are, as far as I can tell, all just machine learning systems at bottom, machine learning being a series of statistical tools for signal detection and prediction developed in the 1980s. More on this later, but per Alanis, you oughta know.
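To make that concrete with a toy of my own (not an example from the book): here's logistic regression, the sort of decades-old statistical workhorse I mean, fit by plain gradient descent. It detects a "signal" and spits out predictions without the faintest idea of what the measurements measure.

```python
# A toy version of the statistical machinery behind "prediction": logistic
# regression fit by gradient descent, detecting which of two noisy signals a
# measurement came from. The fitting is pure arithmetic; nothing in it knows
# or cares what the numbers are measurements of. (Synthetic data, my example.)

import numpy as np

rng = np.random.default_rng(1)

# Class 0 measurements cluster around 0, class 1 around 2.
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(2.0, 1.0, 200)])
y = np.concatenate([np.zeros(200), np.ones(200)])

w, b = 0.0, 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(w * x + b)))  # predicted probability of class 1
    w -= 0.1 * np.mean((p - y) * x)         # gradient step on the log-loss
    b -= 0.1 * np.mean(p - y)

print(f"learned parameters: w = {w:.2f}, b = {b:.2f}")
print("P(class 1 | x = 3.0) ≈", round(float(1.0 / (1.0 + np.exp(-(w * 3.0 + b)))), 3))
```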
Mental effort is perhaps not the best way to understand the labor theory of the machine, however. Physical labor (which is also mental labor, unfortunately for bosses everywhere) gives us an easier entry point. Succinctly stated by Pasquinelli himself: the labor theory of the machine describes how "a new machine comes to imitate and replace a previous division of labor." By way of explanation, Pasquinelli reproduces Marx paraphrasing Babbage in both the Grundrisse:
The hand tool makes the worker independent -- posits him as proprietor. Machinery -- as fixed capital -- posits him as dependent, posits him as appropriated.
[It] is the machine which possesses skills and strength in place of the worker, is itself the virtuoso, with a soul of its own in the mechanical laws acting through it... The worker's activity, reduced to a mere abstraction of activity, is determined and regulated on all sides by the movement of the machinery, and not the opposite. The science which compels the inanimate limbs of the machinery, by their construction, to act purposefully, as an automaton, does not exist in the worker's consciousness, but rather acts upon him through the machine as an alien power, as the power of the machine itself.
And in Capital:
The knowledge, judgment, and will which, even though to a small extent, are exercised by the independent peasant or handicraftsman... are faculties now required only for the workshop as a whole. The possibility of an intelligent direction of production expands in one direction, because it vanishes in many others. What is lost by the specialized workers is concentrated in the capital which confronts them. It is a result of the division of labor in manufacture that the worker is brought face to face with the intellectual potentialities of the material process of production as the property of another and as a power which rules over him.
Machinery -- technology -- solidifies and objectifies the division of labor, whatever it may be, and also alienates the knowledge contained in the division of labor, appropriating it for the machinery or the technology itself. The knowledge of workers, individually and collectively, is thus appropriated by capital, as capital, where it "confronts" workers not as the fruits of their own labor and mental effort but rather as a controlling "alien power." This is exacerbated by the fundamental uninterpretability of many machine learning algorithms, particularly things like neural nets -- the inputs and outputs are known, but how input becomes output is black-boxed by technical or computational complexity.
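A tiny illustration of the black-box point, again my own sketch rather than anything from the book: a minimal neural net trained on XOR. The inputs and outputs are perfectly legible; the learned weights in between are just a grid of floats, with nothing in them a person could read off as knowledge.

```python
# A minimal neural net (my sketch) trained on XOR: inputs and outputs are
# perfectly legible, but the learned weights in between are just a grid of
# numbers -- there's nothing in them a person can read off as "knowledge".

import numpy as np

rng = np.random.default_rng(2)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # hidden layer, 8 units
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(20000):
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backpropagation of the squared error, written out by hand.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out;  b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h;    b1 -= 0.5 * d_h.sum(axis=0)

print("outputs:", out.round(2).ravel())  # typically ≈ [0, 1, 1, 0] -- legible
print("hidden weights:\n", W1.round(2))  # a grid of floats -- not legible
```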
This is what's at stake with technology in general, AI specifically: the alienation of knowledge and the disciplining of labor. The dream of AI is a machine that can do everything a thinking human being can do, with none of the pesky human agency that can make people so supremely difficult to manage and exploit. There is not yet, and probably never will be, a technical end-run around this. Remember, the brain is the guiding metaphor for the computer, not the other way around. Any "intelligence" encoded in the machine or in the system is still human intelligence, however alienated. In the low-stakes instance of Co-Star, actual humans (not automata) are still interpreting the data that is collected automatically and fed through the proprietary algorithm.
Mary L. Gray and Siddharth Suri have written a book about this called Ghost Work. Ghost workers "make the internet seem smart" with their "high-tech piecework: flagging X-rated content, proofreading, designing engine parts, and much more." Or as another article describes it,
In the narratives spun by start-ups and industry behemoths, AI is the mysterious creation of those few with six- to seven-figure salaries and degrees from ivy-covered universities.
But behind our increasingly smooth experiences online are a vast number of people that label the data that trains these systems, often precariously employed as contractors without sick leave or bargaining power.
I'm gonna leave it here for today. You may be wondering, what the fuck does any of this have to do with the pandemic? Well... stay tuned. I have some ideas that I am going to develop further here, and I hope to (successfully) pitch a real review of Pasquinelli's book soon. Till then,