DashKite: Week In Review
AI Ethics, Your Data Is For Sale, Journalists Against Journalism
The dangers of AI are probably not the ones you imagine.
The Big Picture: AI Ethics
People sometimes think I'm engaging in hyperbole when I say things like "Big Tech is evil." The assumption of good faith feels more reasonable: Big Tech is just in over its head. These are hard problems. And so on.
Let's consider the debacle that is AI Ethics at Google. They hire a bunch of accomplished researchers, ostensibly to help ensure that they use AI ethically. Those ethicists begin publishing results that show they aren't. Which is what researchers are supposed to do: publish their research. But Google doesn't like the results, so they fire one—Dr. Timnit Gebru—and, shortly thereafter, another—Dr. Margaret Mitchell. Oh, and just for good measure, both of the researchers they fire are women, one of them a Black woman. Google engages in a smear campaign against them, clumsily attempting to cover up the obvious: they never intended for their AI Ethics research to be anything more than theater.
The whole "it's a hard problem" rationale kind of goes out the window when you fire people for identifying the problem.
Questions about AI ethics may seem far removed from our daily lives. People worry about a distant future dystopia with killer robots, but the reality is scarier and close at hand. AI already helps us drive our cars, diagnose patients, make hiring decisions, and, last but not least, moderate the social Web. And it isn't intelligent, not really. It’s algorithmic. Our new overlords will vigorously execute their algorithms, utterly blind to the consequences. Consequences that will tend to favor the interests of the people who defined the algorithm. The people who just fired the researchers who pointed that out.
Google doesn't want you to think about that. That's not a hard problem, just a willful ethical failure.
Your Data Is For Sale
We see the same pattern with privacy. Big Tech wants to sell your data so badly that it isn't surprising they're bad at protecting it. No one wants to have to tell their boss that they can't sell customer data because it's safely locked away. Again, we see that Big Tech isn't even trying to solve the problem. In fact, they're actively making it worse.
Journalists Against Journalism
We're big fans of Dr. Sarah Roberts. Her research on content moderation, engagingly summarized in her book Behind The Screen, heavily influenced our work. She recently observed that the migration of journalists to newsletter platforms like Substack might be bad for journalism. As I understood it, her argument was that newsletter platforms enforce a de facto editorial policy decided by venture capital firms. The backlash was revealing. You could argue that it proved her point, since the substance of her arguments was never addressed. A good editor would have insisted on it. And, of course, this doesn't even address the fact that newsletter platforms are another moderation crisis in the making.
By the way, Dr. Roberts has an essay entitled Your AI Is Human in this intriguing collection of essays, Your Computer Is On Fire.
Noteworthy
- Oh, God, it’s even worse than we thought, part eleventy-thousand-and-ninety-nine.
- Moving away from cookies is a start, but don’t be fooled into thinking Google cares about your privacy.
- Don’t be fooled, part 2: this proposed Federal privacy law is regulatory capture by Big Tech.
- More evidence that disinformation kills.
- The Antitrust case against Facebook…
- …and Google…
- …you can follow the author on Twitter.
- What facial recognition and the racist pseudoscience of phrenology have in common.
(Still) Upcoming
We knocked one of these off our list. And added one.
- Part 2 of It's Right There In The Name
- What should we make of Block Party?
- Does Section 230 need updating? If so, how?
- What does it mean when you fire your ethicists? 🎉
- What’s all this about No Code?
As always, if you enjoyed this post and want to support our mission, please subscribe to our newsletter. Thank you! 🙏