The engineer problem, part 1
by Matt May
A couple weeks ago, I did a thing.
I stumbled across a property auction with literally lots and lots of iMacs, and I won. Several times. Ultimately, I ended up with eleven iMacs pulled from a local school district, most of them 5K Retina models from 2014 or newer. They were $20 each. Overall, they’re in pretty good shape. Some just needed a new operating system; others need a little soldering. But I’ve nursed eight of them back to health, and the supplies to crack the other three (hopefully not literally) are coming today. I can now strip an iMac down to the chassis and rebuild it in under an hour.
I got my start as an engineer. My first real job was actually on the (sigh, aging myself) Windows 95 support line, on the day it was released. It was there that I learned web design, which led to consulting, my first startup, accessibility work, the W3C, and, well, here I am, all these years later.
I got into tech because I loved problem-solving, and on so many levels, nothing gives you clear victories like a computer can. For those of you who haven’t successfully hacked some piece of tech together, I can tell you, it’s a hell of a drug. From installing a new component and getting the thing to boot up again, to fixing the bug that keeps an app from compiling (or that finally gets a page to render properly), flashes of competence with technology can provide large hits of dopamine. I still derive a lot of enjoyment from doing something as simple as a nice operating system reinstall. So clean. So much promise.
Being good at something builds confidence. This is an important and valuable feeling to have, especially when you’re young. The inventor of the Perl programming language, Larry Wall, described the three virtues of a great programmer: laziness, impatience and hubris. He didn’t mean that they were virtues in themselves, but that they lead engineers to solve problems in ways that reduce the overall effort expended, to anticipate and mitigate future problems, and to produce work of a quality that improves their status among their peers.
I have worked with some truly great engineers in my time. The ones I find memorable are the ones who just like problem-solving. They’re the ones who have hobbies that are often as intense as coding: the rock climbers, the musicians, the woodworkers. Being expert in one system tended to lead them to seek out other systems where they weren’t yet good, so that they had room to grow in a new direction. They may be especially social, or not, but the best ones I’ve known have been exceptionally humble people, and that’s what made them such good technical leaders.
Unfortunately, they’re a small minority of the engineers I’ve worked with, and I don’t think my experience is unusual. Stories of engineers ignoring or belittling their peers are common. In many developer circles, your value (not just as an employee, but even as a person) is tied intrinsically to what you know. It’s also an intramural sport, with some calling themselves “10x” engineers to imply they’re 10 times as effective as their peers. It’s that status among peers that some are hooked on, and they’ve found venues to perpetuate the status-seeking, from GitHub stars to Stack Overflow reputation points, to being the oracle of some Discord somewhere. It’s easy to cloister yourself in these communities, to the exclusion of anyone outside of tech, and end up in an echo chamber where the social norms are dramatically different.
Most industry folks dread working with engineering teams because of that echo chamber, and because of engineers’ tendency to demand respect and status without reciprocating either. Many engineers have drawn the conclusion that being very good at some things logically means they must be pretty good at all things, which means they consider their armchair quarterbacking of, oh, you name it (designers, product managers, executives, DEI specialists, systems of government) to be equivalent in value to the expertise of those professionals, and if you disagree, well, why don’t you back up your argument with data? (“Go away before I replace you with a very small shell script” has been an engineering joke for as long as I’ve been around.) The worst evolution of the engineering mindset is the Master of Truth and Reason: someone who has learned that they can play logic games with any position, making straw-man arguments to drive debates into gray areas so that their black-and-white position can carry the day.
When you are conditioned to be rewarded with clear answers (solved equations, compiled code), genuinely human problems don’t provide much dopamine. Humans are frustratingly analog. The most beloved of my peers recognized that, and it’s part of what made them great. But there’s another avenue many engineers take, which is to try to systematize humankind as though it were the same problem set as a computer system. It is human nature to break predictable patterns, which to an engineer is suboptimal. This has led many to treat humans as not-good-enough computers, particularly the humans whose behavior confounds them. Some at the extreme wish we would all act like computers. A few actually dream of becoming digital beings themselves.
This is where politics comes in. Populist politicians sell overly simple solutions, usually of the format: “people like you are good; it’s the people not like you who are the problem.” If you look, that’s where a loooooot of crypto money is going in the upcoming US election. Here, at least, there’s now a loose alliance between technolibertarians (your Elons, your Peter Thiels, your Bill Ackmans, basically everyone in crypto) and the political right. It’s no surprise to me that Elon is out there pumping up Trump and saying Germany’s xenophobic AfD party “don’t sound extremist.” That’s not to say all engineers (maybe not even most) are sexist, racist right-wingers. But if you read about a “manifesto” declaring “social responsibility” the enemy, or a letter sent to a company mailing list complaining that the company is discriminating against whites, Asians, men and conservatives, chances are the person who wrote it is an engineer (or was one, until he got rich enough to become a venture capitalist).
Data is a convenient sticking point for engineers, because it forces every argument to happen on their terms. To an engineering lead, something like “how many blind users are we talking about, really?” isn’t actually a question. To them, the argument is already over. I call this the User Data Paradox:
Any number of affected users you give an engineer is either too small to matter, or too big to be believed.
Data-driven decision making is a great way to hide one’s biases behind a veneer of fact. An engineering mindset, the story goes, is best suited to analyze and decide what’s right, because engineers can do it free of the biases that more “emotional” people have. This is a logical closure that eliminates any moral or ethical wiggle room: we, the engineers, know what the real answer is, and if you don’t agree, that’s because you’re not thinking right. A Master of Truth and Reason cannot be questioned.
This is how essentially all of the on-the-ground arguments over AI have come to be led by two factions of engineers: the “effective altruists,” who have calculated how many bed nets they have to buy in order not to care about Africa anymore; and the “accelerationists,” who want to become nation-states, colonize space and host backups of their brains in the Alpha Centauri system. All while the people pointing out actual harm in the here and now get shuffled to the exits. Just last week, outlets like the New York Times and Washington Post breathlessly interviewed a guy who believes the chance that AI will kill us all (colloquially known as P(doom)) is 70%. Where did he get that number? Where they all get theirs: they made them up. See, you are too emotional to have a P(doom). But we are Masters of Truth and Reason, so if we think we’re all gonna die, it must be within a standard deviation of correct.
Due to the self-selecting nature of this newsletter, I suspect there are a number of engineers who have made it this far and want to tell me they’re not this kind of engineer. To which I say: good. That’s a great first step. But we do have to reckon with the fact that, not just in DEI work but in a whole lot of human-computer interaction issues, engineers and engineering culture are part of the problem.
It feels like we put so much of the responsibility for making equitable products on designers because repairing this breach seems so hopeless. Designers need to recognize that addressing product equity is not exclusively, or even mostly, about their own deliverables. It is about addressing weaknesses in the system that result in inequitable outcomes. It’s time we bring engineers into the conversation, which means meeting many of them where they’re at. It also means bending some of the reward systems they operate in toward what benefits the systems we are all a part of, rather than simply how much code they write or how many bugs they fix. More on that next week.
Epilogue
These iMacs are really well-made machines. The 5K Retina displays are still top of the line; their speakers are even a little too powerful. With 16GB of RAM, at most a $25 upgrade, even a late-2014 model would be a fantastic machine for 90% of users. Bottom line: buy used computers. It keeps these devices from rusting away (and releasing harmful chemicals) in a landfill, and also pushes off all the carbon cost of another one being made.
If you're in the Seattle area, let me know and I'll cut you a deal. ;)