Let's Make AI Work for 8 Billion People, Not 8 Billionaires
In this month's edition of Careful Trouble’s newsletter:
A catch-up from AI and Society Forum
Come to the Community Tech Autumn Gathering!
We're hiring a Chairperson for Promising Trouble
Out and about at conferences and events
Our recommendations
Hello hello
It's Rachel Coldicutt here, with an update on what we're up to at Careful Trouble this November.
Firstly, a reminder that we're available for hire. One of the ways we're funded is by working with clients, and we're currently open for commissions. The Careful Industries team specialise in foresight, understanding the social impacts of new technologies, technology governance and equitable policymaking. Get in touch if you'd like to collaborate.
150 AI changemakers can’t be wrong
It’s unlikely you missed it, but just in case: at the end of October we hosted the AI and Society Forum, an urgent gathering of 150 civil society changemakers coming together to shape better outcomes for AI and to write an alternative agenda to the AI Safety Summit.
Over the day we hosted 40 contributors across 18 sessions, examining different societal impacts of AI. Topics covered on the day included race, bias and stereotyping; the impacts of automation in education; how money and power flow through the tech industry; the potential of labour organising; the use of AI at borders and in the justice system; the intensification of global inequalities; and opportunities for (h)acktivism. We were also delighted to host the Australian Minister for Industry and Science, the Hon Ed Husic MP, for a side conversation during the afternoon.
The energy in the room was amazing. Thanks to everyone for making it such a fantastic day, particularly the brilliant speakers and contributors. We're currently working out what's next, so fill in this form to let us know what you thought, and whether you'd like to be involved in further convenings related to equity, justice and AI.
Also! If you want to join us in our mission to make AI work for 8 billion people, not 8 billionaires, I can heartily recommend buying one of our t-shirts. All proceeds go to the Save the Children Gaza Emergency Appeal.
(In fact, our t-shirt shop is full of classics at the moment, including a Luxury Beliefs t-shirt, on the off-chance you'd like to do something special to mark Suella Braverman's departure from the Home Office.)
Community Tech Autumn Gathering
If you missed the AI and Society Forum, there's another chance to see us: we're hosting the Community Tech Autumn Gathering at Stretford Public Hall in Manchester on 29 November, an afternoon of networking, reflection and community building for everyone who makes, uses or is interested in Community Tech. One of the highlights is a session facilitated by Reina Yaidoo and Annette Joseph MBE on Decolonising Community Tech, and I'll be sharing some of the findings from our forthcoming Community Data paper. There are a very small number of tickets left, so reserve yours here.
Find out more about the Community Tech Community of Practice.
We’re recruiting a Chairperson
We're building a board for Promising Trouble and are looking for an entrepreneurial, values-driven Chair to support us through our next stage of growth and help us become more financially robust so we can continue putting community power at the heart of technology and innovation. All the information is on our website and in the recruitment pack, so please share widely to help us find the right person (and no recruiters please).
Also, I've already had lots of men ping me on LinkedIn to express their interest in the Chair role. If you could email jobs@promisingtrouble.net with any questions (as requested in the recruitment pack) that would be great; messaging me directly adds to my workload, means your message might get lost, and isn't great for fairness and transparency. And if you're a woman or non-binary person who's interested but is thinking "that's probably not for me", then it almost certainly IS for you, and we'd love to hear from you too. Go on, take a look.
Out and about at conferences and events
As well as our own events (run by the brilliant Careful Trouble team), we're at these conferences over the next couple of weeks.
Anna Dent, Head of Research, is talking about community tech and sociodigital futures in Bristol on Thursday 16 November
I'm a late-breaking addition to the line-up at the APPG on the Future of Work tomorrow, reflecting on the AI Summit, and on Thursday I'm heading to Sheffield to join a panel at the Social Enterprise Exchange Conference
At Beyond, the conference for R&D in the creative industries, I'm part of a responsible innovation panel with some amazing pioneers from the digital art world, sharing findings from the Networked Shift report we published with MyWorld Bristol and Bath and the Creative Industries PEC earlier this year. (If that sounds interesting, you might enjoy these thoughts on responsible R&D from Stephen Hilton.)
Also, if you really want more stuff on AI, this Science Gallery panel on ethics and governance - with Rachel Adams from Research ICT Africa, DeepMind's Jennifer Beroshi, and Dr Sanjay Modgil - was a lot of fun to record.
Our November recommendations
If you're in the US, the SSRC's incredible Just Tech Fellowship programme is about to open for applications.
The Careful Trouble team always drop everything when Abeba Birhane publishes a paper, and this one - which reflects some of the themes from her keynote at the AI and Society Forum - is no exception. Co-authored with Vinay Uday Prabhu, Sanghyun Han, Vishnu Boddeti and Sasha Luccioni, "Into the LAION’s Den: Investigating Hate in Multimodal Datasets" explores the ways that the trend towards scale in generative AI leads to hate scaling.
Novelist Naomi Alderman has published a novel about billionaires, tech, power and religion: The Future. You can watch her talking about it with Kara Swisher.
Deb Chachra's fantastic How Infrastructure Works is finally out in the UK and extracted in The Guardian.
Central to much of our work at Careful Trouble are the ways that technologies can be used to intensify power, exacerbate binaries, increase hate, and hasten loss of life. This letter from the Responsible AI Community, calling for a ceasefire in Gaza, is essential reading on ways technologies are being weaponised.
And, in the week the OECD has redefined AI, this fantastic paper by Lucy Suchman, "The uncontroversial 'thingness' of AI", is a must-read.
That's it for now. See you again in December.
Thanks for reading. If you're a fan of the newsletter, we'd love it if you could forward this to anyone you think might be interested.
If you want to read more from us, take a look at our websites – Careful Industries and Promising Trouble.