Welcome to 2021!
We’ve crossed over into the new year, yet barely one week in and it seems determined to best last year. As I write this, white supremacists empowered by algorithms that amplify their hate have stormed the U.S. Capitol building, with what appears to be the aid of the Capitol Police. I’m saddened and I’m angry, but I’m writing this and sending it out anyway. Because these events are a stark reminder that we in tech have a massive responsibility to think very carefully about the tools we create, and what those tools can be harnessed to accomplish.
In this issue: Is your product ripe for abuse?, 0-indexed arrays and the habits of the wealthy, Facebook silences its employees, the double-edged sword that is Section 230, how calling startups “cults” reinforces the status quo, and Google’s homophobic advertising AI.
If you have a recommendation for a theme, or an article you’ve seen recently that you think I ought to share, please do let me know. You can reply to this email, or hit me up on Twitter.
As always, you can find back issues of The Ethical Technologist in the archives. And if you found this issue thought-provoking and informative, please share it with your friends and colleagues!
So let’s begin this week with something constructive. PlatformAbuse.org wants to be a clearinghouse of product design worst practices that you can consult when designing your next product. Although the current list is small, it does a great job highlighting potential ways in which bad actors can harness platforms to abuse others.
When you and I count things, we start at “1”. But most programmers start at “0”. It’s a technical curiosity that causes no end of confusion for novice and expert programmers alike. But why is it that way? Unreflectively, many programmers (including myself in the past!) give the pat answer that it’s because it increased the efficiency of the [C programming language](https://en.wikipedia.org/wiki/C_(programming_language)), which underlies most of modern computing. The real answer has more to do with a disliked IBM executive with a taste for gambling on yacht races.
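(For the curious, here’s a minimal sketch of where that pat answer comes from: in C, the subscript `a[i]` is defined as pointer arithmetic, `*(a + i)`, so an index is an offset from the array’s starting address, and the first element naturally sits at offset 0.)

```c
#include <stdio.h>

int main(void) {
    int a[3] = {10, 20, 30};

    /* In C, a[i] is defined as *(a + i): the index is an offset from
       the array's starting address. The first element is at offset 0,
       which is the usual "efficiency" story for 0-indexing. */
    printf("%d %d\n", a[0], *(a + 0)); /* prints: 10 10 */
    printf("%d %d\n", a[2], *(a + 2)); /* prints: 30 30 */

    return 0;
}
```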
(As a bonus, we get a stark lesson in how putting academic research behind paywalls kills further academic research.)
[DEGW]
There is a strong argument to be made that social media helped fuel this week’s ugly events. Some employees at Facebook are beginning to wake up to the reality they helped create. But when they tried to discuss it on the corporate forums, their conversation was shut down. For some of us, this hardly seems like news. But situated in this week’s context, it’s worth remembering that our interests as employees are not our employers’ top priorities.
[DEGW & CH]
By the mid-1990s, the Internet had gone from being “too obscure to regulate” to “too popular not to regulate,” and many who had grown accustomed to the Wild West days of digital communications were worried about the end of unrestricted free speech. Thanks to the efforts of groups like the Electronic Frontier Foundation, Section 230 of the Communications Decency Act ensured that proprietors of web forums would face no legal repercussions for content posted by users — and thus would have no incentive to moderate content at all. The Atlantic examines how this one law both allowed for the growth of gigantic social media sites, and all but guaranteed that misinformation would be allowed to spread there unchecked.
[CH]
When tech companies demand that their employees devote ever-higher percentages of their lives to their work, and promote “grind” cultures that view company success as the greatest good, it is tempting to refer to them as “cults”. However, as this article shows, the very term “cult” has a fraught history as a label specifically designed to reinforce a Protestant status quo and exclude the marginalized, and its use also obscures the role of public policy in contributing to such toxic cultures.
[CH]
A company that collects tons of data from everyone who uses the Internet presumably gets really good at targeted advertising. So good, in fact, that when they provide that targeting as a service, they enjoy a near-monopoly, and every ad-supported website comes to depend on them. If you run such a website, and Google’s state-of-the-art algorithms decide that your content is inappropriate, they can lock you out of your only source of revenue with no appeal. This article from freshfruit investigates one such case, that of GlitterbombTV, a website whose LGBTQ-themed content was demonetized by Google’s algorithms while far more offensive straight content made it through the filters.
[CH]
Does your business need a developer strategy? You’ve heard of developer relations, but what is that? Do you even need it? Katsudon.tech can help you navigate the complex world of developer relations. Every business is different, and we can help you evaluate your developer goals, determine the best way to get started, and ensure you feel confident that you are spending your money effectively. To find out more, and contact us to set up an initial consultation, visit our website: https://katsudon.tech/.
That’s it for Issue 14! Thanks for subscribing and reading, and thanks to the many folks who have continued to share content with us! If you enjoyed this issue, share it with your friends! If you have an article that should be featured in an upcoming issue of The Ethical Technologist, let me know by either replying to this email (what, a newsletter you can reply to!?), or pinging me on Twitter.
Until next time, yours, Chip Hollingsworth & Don Goodman-Wilson