S06E17 of Connection Problem: Farewell, 20-teens
"The end is coming, but in this moment, it has not. In this moment, there is so much time.”
— Joshua Rivera
A metaphorical god jul, everyone,
This is the last installment of Connection Problem for the year. Most likely we’ll meet in your inbox again in the second week of January 2020. It’s been a great ride sharing thoughts and being in conversations with so many of you. Thank you. Also, as always, feel free to hit that reply button!
×
Vintage woodblock print of Japanese textiles (Rawpixel/public domain)
×
If you'd like to work with me or bounce ideas, let's have a chat.
×
ThingsCon Rotterdam is a wrap
Last week we held our annual ThingsCon conference. It’s such a lovely experience to reconnect with friendly faces and meet so many new folks, all in the service of making tech work better and for everyone. This community has never failed to inspire me — I always leave energized.
For what it’s worth, the slides from my closing remarks are on Slideshare. More importantly, though, we’ll have the videos and more stuff ready to share soon. Follow updates on the website and on Twitter to see them when they drop.
One of the key takeaways from my talk was that the whole debate around responsible tech has matured so much over the years. It’s kind of mind-blowing that this is now a top priority even in government circles: The new European Commission, which started its work just a couple of weeks ago, has made reining in the negative effects of big tech platforms a top priority right alongside fighting climate change. A world of better tech (and better tech regulation) in which society, economy and climate are all aligned? That’s a vision I’ll happily sign on to.
In the meantime, we’ve also released the 2019 edition of our annual RIOT report, “The State of Responsible IoT”, an essay collection curated and edited this year by Andrea Krajewski and Max Krüger (🙏!) and available as a PDF download. Individual articles will be posted on Medium soon; and if you’d like a print copy, let us know by mid-January or so for a second print run, probably by the end of January. The theme this year was “Small escapes from surveillance capitalism.”
Also, there are shirts available. They’re bottle green and look a little like something Saul Bass could have made, but that Pieter made instead:
Image source: indiewire.com
×
Surveillance, unchecked
“A Surveillance Net Blankets China’s Cities, Giving Police Vast Powers,” headlines the New York Times, outlining a vast — if somewhat bricolaged — system of state surveillance, often installed with minimal oversight by local police departments, and/or with shabby protection of the collected data (and of course no respect whatsoever for privacy in the first place).
The rollout has come at the expense of personal privacy. The Times found that the authorities parked the personal data of millions of people on servers unprotected by even basic security measures. It also found that private contractors and middlemen have wide access to personal data collected by the Chinese government.
This build-out has only just begun, but it is sweeping through Chinese cities. The surveillance networks are controlled by local police, as if county sheriffs in the United States ran their own personal versions of the National Security Agency.
By themselves, none of China’s new techniques are beyond the capabilities of the United States or other countries. But together, they could propel China’s spying to a new level, helping its cameras and software become smarter and more sophisticated.
Part of the fear being, of course, that similar surveillance can happen elsewhere, too. Turns out, unchecked power is bad and will be abused, as the story shows. (Did I just hear someone cough Snowden/PRISM under their breath?) Because let’s face it, this is happening in the West, too, with very, very varying degrees of oversight, if maybe a little less centralized. But then again, China’s state apparatus is also reportedly not as centralized and coordinated as it appears from the outside. Factions within factions.
It’s all the more uplifting to read how local residents respond to surveillance they don’t appreciate, like front doors that open based on facial recognition:
In the Shijiachi residential complex, where the facial recognition replaced key card locks, the rebellion has been powered by wire and plywood. On a brisk day in November, the doors of a number of buildings had been propped open with crude doorstops, making facial scans unnecessary.
The street famously finds its own uses for technology. Equally, it finds its own countermeasures.
×
Should AI systems be treated as state actors?
Fascinating academic paper about how to treat AI systems in terms of legal liability. “AI Systems as State Actors” (Columbia Law Review) is way above my pay grade but/and very accessibly written.
Some quotes:
If governments don’t understand these systems, they can’t understand the risks:
government agencies commonly outsource the development—and sometimes the implementation—of these systems to third-party vendors. This outsourcing often leaves public officials and employees without any real understanding of those systems’ inner workings or, more importantly, the variety of risks they might pose.
A new paradigm to apply to AI systems: treat them as state actors.
it is time to consider new paradigms for accountability, especially for potential constitutional violations. One underexplored approach is the possibility of holding AI vendors accountable for constitutional violations under the state action doctrine. Although state actors are typically governmental employees, a private party may be deemed a state actor if (1) the private party performs a function that is traditionally and exclusively performed by the state, (2) the state directs or compels the private party’s conduct, or (3) the private party acts jointly with the government.
AI is more than a tool, and AI systems can be the cause of constitutional harm:
this Essay argues that—unlike traditional technology vendors that supply government actors with primarily functional tools, such as a computer operating system, word processing program, or web browser—AI vendors provide government with tools that assist or supply the core logic, justification, or action that is the source of the constitutional harm. Thus, much like other private parties whose conduct is fairly attributable to the state, vendors who build AI systems may also subject themselves to constitutional liability.
×
Of VCs and Robber Barons
The Marker discusses Away and its cultural problems: turns out the luggage startup has serious work culture issues (The Verge), especially around work requirements, overtime, and overall boundaries. The article quotes a VC defending this as an integral part of start-up culture, and frames the VC model as a structural factor in promoting these kinds of organizational behaviors. This is serious stuff, and I tend to agree.
But I want to end on a lighter note — if you’ll indulge me for a quick tongue-in-cheek exercise: Let’s just swap out VC/venture capital in this article for robber baron. I’m happy to report that this delighted me to no end.
Robber barons rarely seem to think about how ideas they fund fit within the existing social and economic infrastructures. “There is nothing innovating about underpaying someone for their labor and basing an entire business model on misclassifying workers,” California State Senator Maria Durazo said of Uber. (…)
The robber baron industry seems unwilling to acknowledge network externalities that don’t benefit it. It also seems unwilling to admit that it’s destroying economic value by rewarding growth metrics versus profitability metrics. The result is a breakneck push for growth at all costs, which can make companies ignore problems like treating employees unfairly and failing to build a nurturing work culture. Those problems don’t get fixed as the companies scale. Lavishly funded by robber baron capital, startups are in the position to undercut incumbents on price and service, all the while being unprofitable. The result is that money-losing companies can go on undercutting competition far longer than before.
You’re welcome.
×
Miscellanea
- Is the AI party over for now? (WIRED) Slowing progress in AI research could be a sign that the next AI winter is coming.
- IKEA is apparently introducing buttons to its smart home efforts (The Verge). Overall, I’ve been consistently impressed with IKEA’s smart home stuff, to the degree that I’d ever be impressed with anything smart home-y.
×
If you’d like to work with me or explore potential collaborations, let’s have a chat!
×
Currently reading: Medallion Status (John Hodgman), Aurora (Kim Stanley Robinson)
×
What's next?
End of year sprint, few days off, then right back into sprint mode until the end of January. It’s not entirely clear to me what the coming year will look like, so there’s some thinking to do, but I do know that there’s first some more research & policy work to come; there’s a lot of ThingsCon stuff that’s likely to happen. And maybe more. Almost certainly more!
I wish you a great end of the year & look forward to re-connecting with you in the new decade.
Yours truly,
Peter
×
Who writes here? Peter Bihr explores how emerging technologies — like Internet of Things (IoT) and artificial intelligence — can have a positive social impact. He is the founder of The Waving Cat, a boutique research, strategy & foresight firm. He co-founded ThingsCon, a non-profit that explores fair, responsible, and human-centric technologies for IoT and beyond. Peter was a Mozilla Fellow (2018-19) and is currently an Edgeryders fellow. He tweets at @peterbihr. Interested in working together? Let’s have a chat.
Know someone who might enjoy this newsletter? Please feel free to forward your copy or send folks to tinyletter.com/pbihr. If you'd like to support my independent writing directly, the easiest way is to join the Brain Trust membership.
×