Data, Tech & Black Communities

October 26, 2021

Data, Tech & Black Communities (DTBC) Newsletter

(October Issue)

Welcome to our first Data, Tech & Black Communities (DTBC) newsletter! This newsletter exists to keep this Community of Impact informed about our work, our events (the next one is on 25th November), and key data and technology news affecting Black communities.

What do we mean by Community of Impact? Well, the thing that binds us all together is our interest in ensuring that data and data-centric technologies work to enhance Black lives, rather than to curtail or surveil them. DTBC exists to raise awareness of data-centric technologies and to support collaboration amongst activists, grassroots organisation leaders, technologists and practitioners, boosting these technologies’ positive effects and mitigating the negative ones. In other words, we’re the community of impact.

What have we been up to?

Writing

After the roundtables in March we took some time out for review and analysis. We summarised the results of that process in posts on our Medium blog, which also includes a fantastic article from community member Kimberly Oula, who outlines what algorithmic bias is, the real effect it has on our lives, and the steps we can take to reduce the power it has over us.

Governance and funding

Being a sustainable community of impact requires clear, effective and accountable governance processes. That’s why we are setting up a Community Interest Company (CIC). If you have some spare hours and want to be involved in the management of this community, please do get in touch with us at data-tech-black-communities@protonmail.com

Another reason that becoming a CIC makes sense is that it makes it easier for us to access funding. This is helpful because some grantmakers have already expressed an interest in funding our work.

Upcoming project: Education

Illustrations from BlackIllustrations.com — The Education Pack.

Conversations during the Education roundtable and elsewhere highlighted the impact of digital exclusion on Black children’s learning and how the growth of EdTech (technology in education) in schools could compound that. 

We foresee EdTech playing a significant supplementary role in the government’s planned Education Recovery programme (and beyond), and given the aforementioned concerns about the digital exclusion of Black children, this seems like an important area to gather data on. Having access to the data (as well as the analysis and methodology) will support community organisations’ efforts to advocate for and help Black children and children from other marginalised groups.

How we design and deliver this project is as important as what we discover. During the roundtable discussion we talked a lot about “data about us, being for us”: developing and retaining data skills within the community, i.e. we the parents, practitioners, technologists, academics and community leaders. We want to run the project in two or three pilot areas, alongside grassroots organisation leaders who will be compensated for their time.

What do you think?

  • Could the scope of this project be improved?

  • Would you be interested in participating? 

  • Which geographical areas should we begin this work in? 

📆EVENTS

JOIN US at our education roundtable on Thursday 25th November at 6pm. We will present our findings to date and have a community conversation about how you think we should proceed. SIGN UP here

Community growth

Mozilla – the organisation behind the Firefox browser – wants to bring together AI builders and civil society to share their work and collaborate on projects on ‘Trustworthy AI’. We’re always on the lookout for fellow travellers, so we applied and were accepted (see bullet 5 under “Civil Society Actors for Trustworthy AI Working Group”). Do have a look at what other groups are working on, and feel free to get in touch if you have any questions.

How data and data-centric technologies are showing up in our world 

Here we share stories (in our four areas of interest) which have made us stop and think. These and other stories can all be found on our curated Raindrop page. Please tell us about stories that you’ve found interesting! 

🧑🏿‍🎓Education 

Kids’ data up for grabs. Remember when the news broke in 2019 that the Department for Education (DfE) had inadvertently given gambling companies access to millions of records from the Learners’ Record Service (data on children aged 14+)? If not, here’s a short tweet thread about it. It was bad enough to warrant an audit by the ICO (the data regulator). The ICO did not release details of the problems it identified, even though this is a matter of public interest. This summer, the DfE finally published an update on what it is doing to plug the many gaps highlighted by the ICO. DefendDigitalMe has a good post on why and how the DfE continues to fall short on this issue, and a good explainer about the very real problems that the DfE-maintained National Pupil Database poses for children’s data rights.

The DfE has form for dubious data-sharing decisions, like when it chose to share data with the police and the Home Office. This is one (of many!) government departments we need to keep an eye on. Activist and parental pressure works: this year the DfE finally deleted the nationality and country-of-birth data that it controversially collected years earlier.

🧑🏿‍💻Employment & Enterprise

Exploitative labour + tech. Ever read an egregious example of worker exploitation and thought to yourself ‘that could never happen here’? Don’t be so sure: companies are using data-centric technologies to coerce and exploit workers, even in places like the UK. Yet another reason why worker solidarity is so important. Here’s a news story about the lived experience of a food delivery driver in India, and here’s one about Ocado delivery drivers in the UK. Their employers (who would no doubt reject this term) are working from the same playbook, right down to making drivers pay for the clothes they have to wear on the job. The result is the same: poorly paid, precarious work.

The algorithm-driven gig economy isn’t just negatively impacting workers; it’s also hurting small, independent businesses such as restaurants, and driving up prices for their customers. This episode of the Which? Money Podcast explores the hidden cost of food delivery apps.

Interested in ethical tech jobs? The Tech for Good newsletter includes a jobs board (it is also a good read).

🦸🏿‍♀️Crime & justice

Facial recognition technology. London’s Metropolitan Police continues to expand its use of surveillance technology, with a £3 million system to match images of people against the force’s internal database (Wired UK). This has been approved by the Mayor of London, despite the well-known higher error rates this type of technology has for Black people.

Surveillance society, courtesy of the Labour Party. In an effort to present Priti Patel as soft on crime, Labour announced at its party conference a raft of ‘tough on crime’ actions, including a planned “next-generation neighbourhood watch” using technology such as video doorbells and WhatsApp groups. Surveillance society, here we come (but not on our watch!)

Computer says no. In case you’re in any doubt about the devastating consequences of being misidentified or ending up on the wrong database, here’s the story of Adekunle “Femi” Adenuga, whose passport was withheld by the Home Office for five years because he was incorrectly added to the Met’s Crime Reporting Information System.

🧘🏿‍♂️Health 

The tech isn’t working for us. At the height of the Covid-19 outbreak, awareness of pulse oximeters became mainstream because of their ability to help doctors (and even patients themselves) detect hypoxaemia (low oxygen levels) even in the absence of typical symptoms. However, these devices don’t work as well on patients with darker skin; in other words, pulse oximeters may underestimate hypoxaemia in Black patients. Although the issue has been known for decades, a recent study by Sjoding et al. has brought it back into the limelight. This structural problem needs a structural solution: healthcare providers should refuse to purchase devices that are inherently racially biased, and regulators should impose more stringent testing requirements to pressure manufacturers into improving the accuracy of their devices for all people.

Hidden algorithmic decision-making. Algorithmic transparency has become something of a buzzword, and implementation ideas have been fuzzy enough that it can be easy to dismiss. But the implications of hidden algorithmic decision-making can be serious, as the Wired article about US patients with chronic pain being refused treatment because of an unaccountable algorithm shows. This algorithm, designed by a private-sector organisation which has been given access to all sorts of health databases, flags patient behaviour that fits a “drug shopping” profile. Seeking a correction if you’ve been flagged can be difficult because so much about the algorithm’s operation is opaque. The story is US-focused, but it’s not hard to imagine the same thing happening here. This is why Black grassroots organisations and activists need to be involved in conversations about algorithmic transparency.

UK GP data grab halted (for now). Staying with the use of data to develop algorithms: GPDPR, the new central system for patient data currently held by GP surgeries, is on hold after wide-scale criticism of its lack of transparency and its privacy risks. We’re in favour of data being used for research, but without transparency around usage and clearer safeguards, the risks outweigh the benefits. Unsurprisingly, over a million people have opted out. It’s a personal choice, but if you wish to opt out, here is some advice from MedConfidential on how to do so.

🤳🏿What are we watching/listening to?

We’ve spotted this series on Race and Data Science developed by Dr Desmond Upton Patton. The first speaker in the series is our friend William Isaac, who discusses the importance of ‘foresight’ in mitigating the societal harms of AI.

Do you like hip hop? Spoken word? Interested in hearing about climate change through the lens of a Black British activist? Then check out this work from Complicité.

🙌🏿What’s bringing us joy this month?

Ok, so reading about the myriad ways that data and data-centric technologies are being used to drag us down can, well, drag us down. This is why we need to create space to celebrate the warmth and vibrancy of who we are. This video of a beautiful child celebrating his classmate’s achievements not only filled us with joy but also reminded us that we need to do the same: celebrate and support each other for our many gifts and acts of service.

In the next newsletter (which will be sent out in mid-December), we would love to share some of the things that are bringing you joy. Please drop us an email at data-tech-black-communities@protonmail.com

💪🏿Get involved

The Open Government Partnership is a worldwide movement seeking to secure government commitments to directly engage citizens and civil society organisations in decision-making. We’re on the fence about how effective it is, but certainly some useful ideas have come out of past action plans, e.g. support for more reporting transparency in government contracts. The fifth action plan is underway, and Algorithmic Transparency is one of the themes. Unfortunately, the civil society participants who have engaged so far are not very representative of the racial and socioeconomic makeup of the UK. A belated drive to seek more ‘diverse voices’ is now underway, and there is talk of improving practices for the next action plan. If you are interested in getting involved, do get in touch. Tracey is currently listening in on the algorithmic transparency group discussions.

Data: A new direction sets out the government’s proposed plans for watering down our current data protection regime, the UK GDPR. The deadline for providing feedback is 19th November.

The National AI Strategy sets out the government’s plans to increase the use of AI. To get involved in the upcoming conversations, register here by 17th November.

DTBC will be engaging with these consultations.

👨🏿‍💻Get in touch!

We hope you enjoyed this first newsletter. We promise to get more succinct (unless you prefer our little summaries). Either way, if you have thoughts, news to share with us, or want to get involved in any way do get in touch: data-tech-black-communities@protonmail.com

We’ll be hitting your inbox in two months’ time!


