Welcome back!
This week, Chip and I take you on a tour of the hidden decisions that guide our lives in unseen ways. We rely on streaming TV, social media, and online banks, but we have little insight into how these things work and the influence they have over our lives. Computerized algorithms obscure these influences even further, as decisions are taken out of human hands. It’s not the algorithms themselves that are to blame, but the lack of transparency around how they work and how they are deployed. This lack of transparency, as we’ll see in today’s issue, is unjust. That is why efforts to create transparency, like those by the cities of Amsterdam and Helsinki, are so important.
In this issue: Ethical smartphones, Facebook’s doomsday machine, fighting the hidden oppression of automated credit scores, the concentration of power in media, and the hidden incentive structure driving QAnon.
If you have a recommendation for a theme, or an article you’ve seen recently that you think I ought to share, please do let me know. You can reply to this email, or hit me up on Twitter.
As always, you can find back issues of The Ethical Technologist in the archives. And if you found this issue thought-provoking and informative, please share it with your friends and colleagues!
But let’s start this week’s newsletter with something inspiring. Even as the new iPhone is getting dunked on for its low repairability, a new entrant to the smartphone market offers hope. Fairphone’s Fairphone 3+ is a smartphone designed to be upgraded, repaired, and recycled as easily as possible, with minimal environmental impact. It even includes a screwdriver for disassembly. What began ten years ago as an awareness campaign about our dependence on conflict minerals has transformed into a mission to show the phone industry that we can have exciting, innovative phones that are healthier for the environment.
[DEGW]
The Atlantic compares Facebook (and other social media platforms) to the Cold War-era “doomsday machine” designed to take mutually assured destruction out of human control. Facebook’s algorithms, designed in the service of a “more engagement at any cost” ideology, allow disinformation to spread at a scale that makes human moderation all but impossible. The only way to stop the spread of harmful content is to change the algorithm, which runs counter to the site’s business model:
> The only problem is that reducing the prevalence of content that Facebook calls “bad for the world” also reduces people’s engagement with the site. In its experiments with human intervention, the Times reported, Facebook calibrated the dial so that just enough harmful content stayed in users’ news feeds to keep them coming back for more.
[CH]
When I moved to Europe, I was quite surprised as an American to learn that credit scores are not a universal thing here. For our European audience: Imagine if your ability to take out a loan or make a major purchase were mediated by a hidden algorithm that decided whether you were a risk based on a set of obscure factors, some of which you might not have control over. That’s been the norm in the US for decades. And it’s not just financial loans: access to many services in the US is now mediated by a range of proprietary algorithms modeled on the credit scoring system.
So for many, a dropping credit score triggers a self-reinforcing downward spiral, slowly cutting them off from financial services, from housing, from job opportunities, from healthcare. Naturally, those with low incomes are disproportionately affected.
So what happens when you lose access to basic services, access governed by a network of proprietary algorithms? How do you litigate against an algorithm? A group of lawyers and academics led by Michele Gilman is on a mission to fight this invisible influence.
[DEGW]
Sometimes there isn’t an algorithm to blame: Sometimes it’s a simple desire to maintain the status quo. The media we consume helps define our culture, the ways that we think about and discuss ideas, the ways we interact with others. But commentators have pointed out for years that the power to decide what we watch is concentrated in a very small number of hands. This remains true in the age of digital streaming: the New York Times reports on how Tim Cook, CEO of Apple, Inc., killed a show about the rise and decline of a media outlet (Gawker) that was itself killed off by another Silicon Valley luminary (Peter Thiel). How very meta.
Whatever you might think of Gawker, the question being raised is worth reflecting on: Is this concentration of power just? And if it isn’t, how can we build a system that is more just (and still entertaining)?
[DEGW]
Computer games are designed to suck you in and hold your attention. Game designers know plenty of psychological tricks to keep you playing long after you’ve decided it’s time for bed. While you’re enjoying a sense of fun, the game is deliberately nudging you toward addiction through the use (or withholding) of rewards. In much the same way, QAnon has managed to maintain a stranglehold on the attention of the American (and increasingly the European) political right.
In this insightful piece, Reed Berkowitz dissects the incentives at the heart of QAnon from the perspective of a game designer, showing how it creates self-fulfilling prophecies and sucks ever more people into its clutches. This despite the fact that, unlike a game, there is no central designer, no single human running the show. Rather, QAnon is the confluence of several self-reinforcing tendencies, resulting in a highly decentralized system of positive reinforcement that will be very difficult to combat.
[DEGW]
Does your business need a developer strategy? You’ve heard of developer relations, but what is it? Do you even need it? Katsudon.tech can help you navigate the complex world of developer relations. Every business is different, and we can help you evaluate your developer goals, determine the best way to get started, and ensure you feel confident that you are spending your money effectively. To find out more, and to set up an initial consultation, visit our website: https://katsudon.tech/.
That’s it for Issue 13! Thanks for subscribing and reading, and thanks to the many folks who have continued to share content with us! If you enjoyed this issue, share it with your friends! If you have an article that should be featured in an upcoming issue of The Ethical Technologist, let me know by either replying to this email (what, a newsletter you can reply to!?), or pinging me on Twitter.
Until next time, yours, Chip Hollingsworth & Don Goodman-Wilson