Let's talk tech Thursday #14
Welcome to another edition of Let's talk tech Thursday, the newsletter that refuses to work in this heat unless it has 3 separate fans pointed at it.
Two top stories this week, and both of them AI related. Possibly because I have AI on the brain as I head to the West Midlands Funder Network's Annual Conference tomorrow to talk about AI in Grantmaking. If you're going, come and say hi!
Anyway, this week we talk about:
- A woman in Manchester who was incorrectly accused of stealing by an AI system,
- How worries surrounding AI and copyright law might destroy creativity (but not in the way you might think).
Also, we look at the environmental impact of more battery-powered things, and what happens to the internet when Google breaks.
Let's dig in...
Top Stories
Woman mistaken for thief after shop face scan alert
Summary
A Home Bargains customer, Danielle Horan, was wrongly accused of shoplifting due to a facial recognition error and was ejected from two stores in Manchester. Only after persistent communication with both the retailer and the company that provided the system did both parties admit that she was innocent.
So what?
When I saw the headlines for this story, I assumed it meant that AI had made a mistake with someone's identity. Not, in itself, an interesting story - there are plenty of those. But in most of them, the victims of mistaken identity are not white. Here, then, was a story about AI falsely accusing a white woman - an altogether rarer occurrence.
But that wasn't actually what the story was about. Facewatch - the company that owns the facial recognition security system - maintains that the system "worked correctly". It correctly identified Horan as she entered a different store from the one where she had "stolen" - doing exactly what it was told to do. The only issue is that in that first store, it was the staff at Home Bargains who incorrectly marked Horan as a thief, and so her details were uploaded to the national network for her to be picked up at the next store she went to.
In response, Facewatch has now "suspended" Home Bargains from using its systems. But of course that doesn't really address any of the problems. For one, it seems we've just accepted that autonomous and unchecked systems can declare people guilty, and everyone involved in security seemed very happy to wash their hands of it. No one seemed to take into account the nature of the supposed crime - just £10 of product was "stolen" - or the logic behind it. The item was a large pack of toilet rolls - hardly something you can just slip into your pocket on the way out of the store.
More and more decisions are being made by machines. In some cases, that's fine. But what we're talking about here is someone's data being uploaded without their knowledge to a centralised system. That data was then shared, ultimately without a good basis, and then that data was used to persecute the same person. Horan appears to have ultimately taken it all in her stride, even finding the space to laugh about parts of it. But there are darker implications here.
AI copyright will hold back creativity
Summary
In this article from MIT Technology Review, Nitin Nohria argues that creativity has always relied on reinterpreting existing works, and AI is just a new tool for this process. Current copyright laws struggle to address how generative AI uses prior art, leading to confusion about ownership and originality. To foster creativity, Nohria posits, we should adapt copyright standards to include meaningful human input, regardless of the medium used.
So what?
There's a lot of talk around AI and copyright at the moment. A lot of it, rightly so, is focused on the fact that many of the Large Language Models that power the likes of ChatGPT, Gemini, and Copilot were trained on stolen material.
Faced with this, it's quite easy to think that we should enforce a blanket ban on AI in art. And yet, it's hard to argue with the central premise of this article: very little that is created doesn't rely on pre-existing work. We're told time and again that AI can't create new things, that it can only rehash the old into different shapes. But isn't that exactly what a lot of people get to call innovation? Apple hasn't invented anything new for years, content with just copying other players in the space.
But there's another angle I often consider. One of GenAI's greatest strengths is its ability to remove barriers. So why should that stop at language barriers, or difficulty focusing? If someone can't afford a piano, and AI can help them arrange chords, is that wrong? They were only ever going to learn by buying sheet music and rearranging it anyway.
To be clear, I'm not talking here about the underlying copyright issue inherent in the creation of LLMs. While it might seem strange to separate the two, I think for the purposes of this conversation it's important. No, AI shouldn't be trained on stolen material. But should people using AI to create art be penalised because of how that art is created?
See also: Data bill opposed by Sir Elton John and Dua Lipa finally passes
Unrelatedly, I wrote a blog post a while ago about AI in the arts: Read it here if you like
Any other news?
Race to mine metals for EV batteries threatens marine paradise
Indonesia's Raja Ampat archipelago is known for its rich marine biodiversity, but it is coming under severe threat as a result of nickel mining. The Indonesian government has recently revoked the permits of some mining companies, but concerns remain about the impact of mining on local ecosystems and communities. The thing about those eco-friendly electric cars is that getting the materials to make them is often very much not in the planet's interest. While battery technology is crucial to storing the energy we might get from solar, wind, or other green means, we still need better ways to build the batteries themselves.
The internet went down on Thursday: Here's what we know now
About a week ago, a huge number of websites all stopped working. In what might be a surprise, it appears to have been down to the failure of just one company - and a big one at that. Google's cloud services host a frankly staggering number of the internet's websites - and parts of even more besides. This was exacerbated because Cloudflare - a service that helps websites make themselves faster and more secure - uses Google for some of its workload. Cloudflare itself is hugely popular, and so between it and Google, a huge number of other big names fell. It was a nice reminder of how dependent so many have become on so few.
And with that, I'm going to go get an ice cream. Hope you're enjoying the weather, and I'll see you next week.
Will