Biometrics in education: tech-enabled harm
I am not sure how we got to a world where schools are happily using biometric payment systems. I understand the appeal: a child will always have their face or fingerprint, whereas they might lose a lanyard or forget a passcode. But many things are convenient, and yet schools reject them. It would be more convenient not to have school uniforms, and simply to have a three-colour, no-logo dress code. Cheaper for families too. It would be more convenient to hold parent meetings online, yet many schools moved back to in person, despite the costs already sunk into those online systems.
If even the DHS can have sensitive biometric data stolen, why are we happy to trust notoriously insecure school systems with such valuable data? Schools face regular ransomware attacks, as well as simpler cyber-criminal activity that steals or leaks data. Every tech or cyber conference has talks and policy panels on some form of “how can we better protect education settings”, yet we aren’t actively helping schools to make good decisions or calling out bad tech. I don’t blame schools entirely for this. I blame an edtech gold rush and terrible regulation. Schools and education settings tend to trust the technology sold to them. There is an assumption that it wouldn’t be sold if it wasn’t regulated, and surely the regulator is watching? But sadly, regulator action is often retroactive, after the harm is done. If it happens at all. The FTC, CNIL and the German authorities, for example, have broadly better records on this, but the ICO has done very little.
I often hear people argue that the data is stored securely, but I have worked on this for years and have yet to see a school or company be 100% transparent about where and how biometric data is stored. Furthermore, if a school system itself can be broken into and hit with ransomware, with all the sensitive SEND, socioeconomic and child protection information that includes, I do not believe that any school-linked system is safe. I also think that the golden rule of data collection is forgotten: only collect what is absolutely necessary, and store it carefully, and only if you HAVE to.
Advice in schools
I think schools need better tech advisors, and particularly ones who understand tech harms. It isn’t enough to have people with IT or tech experience. Those of us who have worked in tech-enabled harm, tech policy and ethics should offer support, and be the ones asked to contribute too. Just because you know how to code does not mean that you understand the harms of a product, as my work has shown time and time again. A good recent example is the launch of Apple AirTags, which had to be adjusted to alert nearby people to their presence, as they were otherwise a perfectly designed tool for stalkers and abusers.
My concern is also that schools get “governance checklists” that assume honesty and transparency from edtech vendors, as well as the capability within schools to interpret their responses. I have seen very opaque, or downright untruthful, replies to very good school privacy and security questions. The market is far from transparent. This is not to suggest that all edtech is nefarious; simply that we cannot rely on checklists without governance and a robust understanding of what harm or risk might look like for a child using a technology.
What is a harm? Or a risk?
This paper is a rich deep dive into the definitions necessary when we discuss harms or wrongs. We could summarise it for this blog as: a harm is an act that causes a setback or damage to your interests as an individual, one that could have been avoided. There is much nuance to the debate that is not necessary here. But we do need to ensure that when tech is used in education, all possible harms to those involved are considered. A concern shared by many in my network is that far too many people taking these decisions cannot imagine possible harms, due to their own privilege. Balanced rooms make better choices. There are many marginalised demographics who already experience tech harms: AI face recognition systems used in law enforcement that misidentify Black people, for example. I am personally very wary of mass collection of biometrics, as we have seen how it has been misused in the past, with Google or car manufacturers sharing data with law enforcement without consumer consent as one example. Tech companies always beg forgiveness instead of asking permission, because there is so much money in selling data.
The future is passwordless; we will be using biometrics and other security methods as tech progresses. And unlike a password, you cannot change your fingerprint if it is stolen. The use of biometrics in schools carries a very real risk of that important data being stolen from children, for life. I am not sure why we would impose that risk on any child. How will we defend that choice in the future?
Schools and Surveillance
The Runnymede Trust has many reports on racialised surveillance and harms in schools. I do think that there is a tendency to use surveillance tech in schools. A controversial point, but we do need to ask ourselves why we are happy creating a world where children as young as 11 have biometric systems normalised to them. Where we are saying “giving us your data is required to access life”.
The same could be said for the normalisation of CCTV in bathrooms, which I have written about before. All of it is tech that would be rejected in adult environments.
Schools and communities that are successful are built on trust. Young people need to be shown ethical and responsible use of tech. Because, as I like to remind people: your future will be in the hands of this generation, who have had their privacy violated from birth (parent vloggers, constant sharing on socials by adults, the huge amounts of data shared by schools). There is a risk that they simply won’t understand privacy or security in the future as we currently do. Or say that we do. Because they haven’t experienced a life where they ever had it themselves.
All of this is to say that we have a responsibility to young people. And that we cannot rant about the dangers of mobile phones and the internet while using the internet and technology to share their data in ways that could harm them. There is a real risk that students will leave school at 18, having never been on social media or had a phone, but still with a massive digital footprint and their biometrics stolen. All because they attended school. How will we answer for that?