Hello, Governor!

Archive

Engineering and Postmodernism

Apologies for the short newsletter - I have a few other drafts in the works, but none of them are ready yet. The following is from my response to a friend's Facebook post about the value of postmodernism to engineers, and the role that education plays in introducing engineers to postmodernism.

As a software engineer with a liberal arts education, as someone with little knowledge of what a traditional engineering education even looks like, I desperately wish more people in this industry had a background in postmodernism. By that I mean: I wish they had a deep understanding of the way that human beings, individually and collectively, construct their own realities. How everything looks different depending on your perspective, and how institutions and cultural norms shape both the perspectives one has and the objects one is perceiving. I wish they understood that there are many different lenses through which to view a project or event, and that those lenses are sometimes in conflict, sometimes orthogonal to each other, sometimes complementary, but rarely commensurable.

There are individual fields I wish engineers spent more time with - sociology, social psychology, race studies, disability studies, etc. - but no one person can study everything, so the most important thing is the core skill of realizing: "This is a really complex topic. There's no way for me to be certain in my perception of it. I better explore the different possibilities with an open mind and collaborate with others towards a shared understanding of it."

Instead, a lot of engineers fall prey to "Engineer's Disease" - the belief that there's A Problem and they can construct A Solution. And then, when they're critiqued because their solution doesn't take into account X, or Y, or Z, instead of saying, "Yeah, wow, there's this whole other perspective I missed, let's work together to incorporate it," they say, "Well, you can't please everyone!" Or, worse, "Those people will find fault with anything." They see critique as nitpicking rather than as an articulation of the way a given tool or process or artefact fails to take into account the critic's experience of the world.

#9
July 1, 2021

Code as Contestable Law

Last week I gave two short talks as part of the IASC Knowledge Commons conference: one on our ability to contest rules in digital spaces, and another on how, more generally, we can engineer more responsive governance. Those links go to YouTube, where you can watch the talks. (Update: the panel discussions for these talks are now also available.)

#8
June 17, 2021

Three Governance Case Studies

One of my communities is choosing a new governance structure, and in preparation for our first conversation on the topic, I wrote up three governance case studies. I covered DisCOs (Distributed Cooperative Organizations), Enspiral, and Python.

I want to continue writing up these case studies and share them in a more navigable and findable way, but for now, I thought I’d share what I’ve got with you.

DisCO

#7
May 28, 2021

Facebook's Everything Problem

Facebook's 'Oversight Board' recently affirmed the company's choice to suspend then-President Donald Trump from its platform the day after the Capitol riots. Although many critics - including myself - support the decision to keep Trump off Facebook, we're not too pleased with the process itself. "[It's] a red herring, substituting a simulacrum of due process in certain high-profile cases for substantive reform," Will Oremus writes in a New York Times op-ed. Oremus and others point out that while Facebook has promised to obey the board's content moderation decisions, it can choose to ignore any policy recommendations. "[Facebook] did not empower the board to watch over its products or systems — only its rules and how it applies them."

And yet it is Facebook's product design, policies, and internal governance mechanisms that cause systemic harm on and through the platform. Three recent reports on systemic content moderation issues show this clearly. In late March, the Tech Transparency Project published a report on Facebook's continued promotion of militia pages, despite the company's insistence that those pages were being removed. In mid-April, Julia Carrie Wong at the Guardian published an article about data scientist Sophie Zhang, who was fired from Facebook after pushing too hard to remove government-sponsored fake engagement in over a dozen countries. Then, just a few weeks ago, BuzzFeed News reporters published a leaked internal report on how Facebook had failed to stop the "Stop the Steal" movement from organizing the Capitol riots.

There are many similarities in these stories. I recommend reading each of the linked reports, but I'll try to highlight some important patterns here. I'll start up close and zoom out, looking first at the individual design decisions that allowed these problems to take root, then at the policies that prevented the problems from being addressed, and finally at the overall culture of governance at Facebook that allows ineffective policies and exploitable design to remain the status quo.

Design Decisions

#6
May 12, 2021

On Digital Disobedience

There’s not enough disobedience online. Yes, that’s right, you heard me: there’s not enough space for disobedience online.

This may seem like a ludicrous claim, especially when leveled at the major social media platforms, which are currently struggling to deal with overwhelming tides of misinformation, harassment, conspiracy, demagoguery, and more. But these content moderation issues are symptoms of a greater problem with how the platforms are governed: in an extremely hierarchical, top-down manner.

Platform authoritarianism is fractal. Content moderators are low-wage workers whose critiques of moderation guidelines are valued even less than the moderators’ (largely ignored) mental and physical health. Product teams treat users as subjects to be surveilled rather than partners in design. The companies that own the platforms craft their goals around the needs of distant shareholders rather than immediate stakeholders such as workers and users. Even at the societal level, we’ve chosen to reify these power structures by protecting platforms from responsibility for bad decisions (see Section 230 of the Communications Decency Act) and by criminalizing users’ efforts to work around these poorly designed systems (see the Computer Fraud and Abuse Act). From lines of code to lines of law, there’s very little room for disobedience.

It’s easy to see that as a good thing when we’re visualizing a Twitter troll flouting the site’s content policy in order to spit invective at an activist. But there are many different kinds of disobedience. One kind, civil disobedience, has helped rectify some of the greatest injustices in our history.

#5
April 30, 2021

Collective Accountability in Digital Spaces

It can be a struggle to hold groups accountable for misbehavior, even when the group is structured through legal forms, with public membership and clear leadership. On social media platforms populated by anonymous accounts, with membership and leadership often informal and invisible, accountability feels like a pipe dream.

And yet we urgently need accountability - for misinformation, for harassment, for exploitation, and for so much more. There is a deep anger towards social media platforms like Facebook and Twitter, both for their negligence in allowing these problems to spread and because expressing anger towards them is our only mechanism for regulating online behavior. Lacking other options, we resort to demanding that specific users be banned and specific behavior suppressed.

Platforms protest - rightly, I believe - that they shouldn’t be in control of public debate. But this is also an evasion of responsibility. Companies could design mechanisms of accountability into their platforms, allowing users themselves to collectively control debate, but they have chosen not to. I suspect that, for all their rhetoric about empowering users, they are afraid to let users control what they see. What if they made decisions that hurt the company’s bottom line?

#4
April 14, 2021

Yo Dawg, I Heard You Like Tech Governance...

... so I put some tech governance controversy on the platform you were using for your tech governance newsletter.

That's right: Substack, which hosted this newsletter for exactly one issue, is drawing some criticism for its decision to offer large cash advances to, among others, some transphobic and alt-right writers. At the heart of the controversy lies a question: is Substack a platform which hosts general content or a publisher which curates and incentivizes certain types of content?

This question comes up again and again in debates about tech governance, in part due to the influential Section 230. Section 230 marks out the legal liability of services for the content they host, while Substack's critics are largely arguing about ethics. But law is just the codification of ethics, so in some ways these are just two fronts of the same war: the war over whose voices are heard, and who is held accountable when those voices cause harm.

There's more about Section 230 in the Reading section below. In the meantime, I've moved this newsletter to Buttondown. Now, on with the show.

#3
March 30, 2021

The Federal Trade Commission vs Facebook

Below, I summarize the FTC complaint against Facebook, citing the relevant paragraphs so you can get more detail as needed. This was a useful exercise in helping me understand the case; I hope it also helps you.

Overview & Background

The greatest competitive threats to Facebook “come not from ‘Facebook clones’ but from differentiated services during periods of transition” (8). In buying Instagram and WhatsApp, Mark Zuckerberg and Facebook saw themselves as “buying time” to dominate the mobile market before anyone else could compete (14). In addition to buying these rivals, Facebook behaves anticompetitively by restricting third-party API access to developers who don’t compete with Facebook (23).

Facebook chose to monetize by using proprietary algorithms to analyze user data and target advertisements to those users (43), and it earns all of its revenue this way (50). This type of advertising, called social advertising, is distinct because it targets users via personal data (47), promotes engagement with the advertising, and facilitates users sharing the advertising with personal connections (48).

#2
February 27, 2021

Hello, World!

Hello, and welcome to the very first edition of this newsletter!

My goal is to explore the intersection of technology and governance. This includes questions about how we govern technology, and about how technology impacts how we govern. How does the product design of modern software impact community building? Should governments regulate Big Tech? How does the sociotechnical infrastructure of the internet allow or constrain privacy, equality or freedom? What's at the cutting edge of governance tech?

I'll also, with some regularity, geek out about a paper or topic or news item that's "just governance" or "just technology". My newsletter, my rules. ;)

I'm still figuring out the structure and content of the newsletter, so please bear with me as these change from month to month. If there's something you especially like, or something you wish I'd include, please let me know!

#1
February 25, 2021