S06E05 of Connection Problem: Road Tripping (Part 1)
Aloha & hey hey,
I’m sending this from the train; specifically, from one of many legs of a series of trips. Today, I’m headed to Hamburg to speak at NEXT, which I co-directed together with the lovely Monique van Dusseldorp for a few years, though I haven’t been back in a while. Tuesday had me in Stockholm for one workshop, yesterday briefly back in Berlin for another. Tomorrow morning, M and I will be headed for Scotland, since we’re both part of a program where we work with the University of Dundee. (I’ll try to write an episode next week from the road, but I can’t promise anything.) So this episode is very much written in motion. Outside, green fields are flying by, drenched in mist and early morning light. It’s gorgeous.
×
If you'd like to work with me or bounce ideas, let's have a chat.
×
What type of Smart City do we want to live in?
Warning: Trick question! The framing should of course be: What type of city do we want to live in? But that’s the title of my presentation at the NEXT conference later today. Slides here.
Also, Ann Cavoukian (executive director of the Global Privacy and Security by Design Centre) sums it up nicely in her piece on Sidewalk Labs (where she used to be a consultant):
In a smart city, technologies are gathering information 24-7. There’s no opportunity for people to consent or revoke consent, so we must protect their personal data for them (…) Privacy forms the foundation of our freedom. You cannot have free and democratic societies without it. It’s no accident that Germany is the world leader in privacy. They looked at the abuses under the Third Reich and said “never again.” Privacy isn’t about secrecy. It’s about retaining personal control over your information—over your own life.
×
Should this exist? (Feat. The Great Reckoning)
danah boyd, always an outstanding mind to follow, has given a speech that caused some ripples; deservedly so. It’s her acceptance speech for an EFF award. She pulls no punches here, at this odd time when many of the early pioneers (some would say heroes) of the open and free internet turn out to be pretty flawed products of their time, and we’ll see if they take their affiliated organizations down with them or if the whole scene comes out stronger after a dust-up.
I don’t know danah personally, but I highly respect her; I especially appreciate how she explores her own role as a beneficiary of some of these people and that part of the system, and hence as complicit. It takes guts to say that.
"I realized I needed to reckon with how I have benefited from men whose actions have helped uphold a patriarchal system that has hurt so many people. I needed to face my past in order to find a way to create space to move forward."
Luckily, she doesn’t leave it at that but also looks at the bigger picture, the larger role tech plays in the world now:
"Change also means that the ideas and concerns of all people need to be a part of the design phase and the auditing of systems, even if this slows down the process. We need to bring back and reinvigorate the profession of quality assurance so that products are not launched without systematic consideration of the harms that might occur. Call it security or call it safety, but it requires focusing on inclusion. After all, whether we like it or not, the tech industry is now in the business of global governance."
Also, this quote, which just has a very nice ring to it:
"In a healthy society, we strategically design to increase social cohesion because binaries are machine logic not human logic."
"The Great Reckoning is in front of us. How we respond to the calls for justice will shape the future of technology and society. We must hold accountable all who perpetuate, amplify, and enable hate, harm, and cruelty. But accountability without transformation is simply spectacle. We owe it to ourselves and to all of those who have been hurt to focus on the root of the problem. We also owe it to them to actively seek to not build certain technologies because the human cost is too great."
That last point, especially, feels particularly salient today. Which is why I'm more than happy to keep banging the drum about smart cities, IoT and all the other ways that algorithmic decision-making and other technological governance systems sneak into the fabric of our daily lives.
×
That’s not my face
If you haven’t seen it yet, ImageNet Roulette is an excellent bit of educational/provocative art by the ever-brilliant Kate Crawford and Trevor Paglen. It’s a simple demonstrator of how facial recognition will label you based on its training data set. Concretely, it’s based on ImageNet, which according to this project is the dominant training set out there, widely used in the making of real-world products. I highly encourage you to try it out, but with a warning: The results might be hilarious, or pretty disturbing.
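For the technically curious, here’s a rough, purely illustrative sketch of the underlying mechanic: a classifier trained on ImageNet can only ever map your photo onto whatever categories its training set happens to contain. This is not ImageNet Roulette’s actual code (the project draws on ImageNet’s “person” categories, which off-the-shelf models don’t ship with), and the file name my_face.jpg is just a placeholder:

```python
# Illustrative sketch only: label a photo with an off-the-shelf
# ImageNet-trained classifier. The point is that the labels come
# straight from the training set's category list -- nothing else.
import torch
from torchvision import models
from PIL import Image

weights = models.ResNet50_Weights.IMAGENET1K_V2
model = models.resnet50(weights=weights)
model.eval()
preprocess = weights.transforms()

img = Image.open("my_face.jpg").convert("RGB")  # placeholder: any photo
batch = preprocess(img).unsqueeze(0)

with torch.no_grad():
    probs = torch.softmax(model(batch)[0], dim=0)

# The only possible outputs are the categories baked into the training data.
categories = weights.meta["categories"]
top_p, top_idx = probs.topk(5)
for p, idx in zip(top_p.tolist(), top_idx.tolist()):
    print(f"{categories[idx]}: {p:.1%}")
```

Whatever assumptions and skews sit in that category list and its example images get projected straight onto you.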
ImageNet Roulette is best enjoyed side by side with Excavating AI, which explains a little better what’s going on:
“Training sets, then, are the foundation on which contemporary machine-learning systems are built. They are central to how AI systems recognize and interpret the world. These datasets shape the epistemic boundaries governing how AI systems operate, and thus are an essential part of understanding socially significant questions about AI. But when we look at the training images widely used in computer-vision systems, we find a bedrock composed of shaky and skewed assumptions.”
I tested a few configurations of photos of myself, and depending on context and my facial expression had the photos labeled as a wild, and wildly wrong, range of things:
Now assume we build algorithmic systems that play a significant role in our lives on top of a foundation that introduces these kinds of flaws - what would you say to that? Right. Well, the thing is, that's exactly what's happening.
See also: AI-based surveillance running amok. A super short “best of”:
- At least seventy-five out of 176 countries globally are actively using AI technologies for surveillance purposes. This includes: smart city/safe city platforms (fifty-six countries), facial recognition systems (sixty-four countries), and smart policing (fifty-two countries).
- Liberal democracies are major users of AI surveillance.
- Democracies are not taking adequate steps to monitor and control the spread of sophisticated technologies linked to a range of violations.
China is the leading exporter of this type of surveillance tech, but US and European countries are in on it, too. Seriously, read this one, it’s as quick to read as it is horrifying.
×
Pro regulation?
In meeting after meeting after meeting these last few months, I’ve been getting the impression that regulators and policy makers are getting emboldened to take a stand, to assert themselves vis-à-vis the big tech multinationals. And it seems they’re backed up by public opinion, too. So it’s an interesting moment in time.
This conversation is happening in the hallways as much as it is at the top. Reuters reports on EU anti-trust chief Vestager (and soon-to-be European Commission vice president):
She said the bloc’s data protection rules, adopted last year, gave Europeans control over personal data but did not help in instances when problems arose from companies misusing data to draw conclusions about individuals or to undermine democracy.
×
I’m wondering…
Multiple sources have, independently from one another, in different organizations and countries, pointed out to me that most political parties and their foundations don’t really have folks thinking about the impact of digital in a structured way, especially not with a focus on consumer protection and digital rights. Can that be true? Have I bumped by chance into the handful of orgs who do? It does seem unlikely to me, but now I’m worried that those efforts aren’t underway in other places. Please help me calm my frayed nerves by sharing a few more examples of players in the policy area that really tackle this.
×
Team Robber Baron
It seems that Kickstarter has been… less than helpful in their staff’s efforts to unionize. (Should I say “now ex-staff” for some of the people who were vocal about it? 😬) If they’re really fighting unionizing efforts within their staff, that would put them squarely in Team Robber Baron. That's a real bummer.
I guess it’s hard for an org that considers itself a startup to admit it’s business as usual, and not the good kind. Sigh. It's been proven over and over that while collective bargaining might be a short-term inconvenience for business leaders, it's a stabilizing force over time. Everybody benefits. If any company claims a union will kill their business, then they don't have a real business. (I'm looking at you, Uber!)
Repeat it all together: Unions are not 👏 an 👏 issue 👏 for business. In countries like Germany, the biggest corporations have had them for decades, and overall everybody benefits.
×
Is Purpose the new black?
Is the new focus on “purpose”, as heralded just now by traditionally pro-business orgs like the Financial Times, going to be more than astroturfing? Somehow I can't quite see this morphing into more than a new company claim. But hey, there’s hope. (Including hope that “purpose” won’t just replace “responsibility & accountability”.)
×
If you’d like to work with me or explore a collaboration, let’s chat!
×
Miscellanea
- Backblaze now has a European data center. Want cloud backup? Backblaze are great. And now your data is around the corner, in Amsterdam. If they also start offering hard drive-by-mail recovery, I’ll be a truly happy camper.
- How the Sidewalk Labs proposal landed in Toronto: the backstory (TorontoLife.com). Not the most neutral story but it’s an interesting read.
×
Currently reading: Leviathan Wakes (James S. A. Corey), Four Futures (Peter Frase), Lost Japan (Alex Kerr)
×
What's next?
Hamburg today to speak at NEXT. Tomorrow has me on the way to Dundee as part of the OpenDott PhD program, where I’m an industry supervisor. We’ll be low-carboning this one: train-boat-train combo ftw! Then, directly from there or via a one-night stopover in Berlin, it’s on to Boston & New York. In November you can come see me speak at Tech Care Copenhagen, a brand new event about responsible technology in the public interest.
Have a lovely end of the week!
Yours truly,
Peter
×
Who writes here? Peter Bihr explores the impact of emerging technologies — like Internet of Things (IoT) and artificial intelligence. He is the founder of The Waving Cat, a boutique research, strategy & foresight firm. He co-founded ThingsCon, a non-profit that explores fair, responsible, and human-centric technologies for IoT and beyond. Peter was a Mozilla Fellow (2018-19) and is currently an Edgeryders fellow. He tweets at @peterbihr. Interested in working together? Let’s have a chat.
Know someone who might enjoy this newsletter? Please feel free to forward your copy or send folks to tinyletter.com/pbihr. If you'd like to support my independent writing directly, the easiest way is to join the Brain Trust membership.
×
Header image: Weimar (my own)