The Weekly Cybers #111
Big tech is “taking the piss”, says minister, as two-thirds of under-16s still on the socials; draft Children’s Online Privacy Code released; Mark Zuckerberg’s makeover backfires; and more.
2 April 2026
Welcome
Laws about who can talk to whom, what can and can't be said, and where, are perhaps the most fundamental of all. To be human is to communicate. Which is why things like Australia's teen social media ban deserve so much attention, and why I've been giving it exactly that.
The proposed Children's Online Privacy Code is therefore also incredibly relevant, despite getting only a brief mention in this humble newsletter. Children need to be protected, but they also have rights. They are, after all, people, not property.
This week there’s also a bunch of new rules for telcos, and more. Enjoy. And have a great long weekend.
As predicted, teen social media ban doesn’t work
Three months into Australia's teen social media ban, more than two-thirds of under-16s are still on the supposedly banned platforms. I think we can call that "not working".
The ultimate aim is to (waves hand vaguely) protect young people, rather than just have a ban for a ban’s sake, but of course we won’t know whether that part works for some time. And it’s not like we’re running a separate control group. The experiment isn’t valid.
Still, if the protections are only reaching a third of the teens then at the very least it’s a case of “could do better”.
“Of the parents who reported their child had an account on each platform prior to 10 December 2025, around 7 in 10 reported that their child still had an account on Facebook (63.6%), Instagram (69.1%), Snapchat (69.4%), and TikTok (69.3%),” according to the eSafety Commissioner’s Compliance Update (PDF) released on Tuesday.
“Around 3 in 10 reported that their child no longer had an account. One in two of these parents (48.5%) reported that their child still had an account on YouTube following the age restrictions coming into effect.”
The most common reason the teens lost their account was that the platforms had closed it, selected by 43.6% of parents whose child no longer had at least one social media account. Some 36.3% had closed their own accounts, and 26.6% had been closed by parents or carers.
The sample size doesn’t really justify three significant figures in those percentages, but here we are.
“The most common reason children still had their social media accounts was that they had not yet been asked by the platform to verify their age,” the report says.
The commissioner’s report is of course relatively upbeat, referring to steps having been taken, and how “gaps remain”. Blame is placed on platforms with “poor practices”.
Communications minister Anika Wells was blunter.
“This new report from eSafety Commissioner shows that social media giants seem to be trying to get away with doing the bare minimum — I have serious concerns about their compliance with the law,” Wells said.
“If these social media companies want to do business in Australia, they must obey Australian laws.”
Or as she expressed during a press conference, “That is big tech taking the piss, to be honest.”
The eSafety report listed four key observations, which I’ll paraphrase:
- The messaging from platforms to under-16s was a trigger for them to change their stated age ahead of the ban.
- Under-16s have been able to try age assurance repeatedly until it eventually worked for them.
- Pathways for reporting under-16s using banned platforms haven’t been accessible or effective.
- Some platforms haven’t done enough to implement effective systems, it appears. “However, eSafety is continuing its investigations to enable it to form a concluded view as to whether any platform has not taken reasonable steps to comply with the SMMA [Social Media Minimum Age] obligation,” the agency reports.
eSafety is actively investigating five platforms for potential non-compliance: Instagram, Facebook, Snapchat, TikTok, and YouTube.
At the Guardian, analysis by Josh Taylor notes that apart from wishful thinking about the accuracy of age estimation technology, the ban is still facing two High Court challenges.
“Many of those who had been pushing for the ban had simply accepted the technology trial report in terms of how easy it would be to achieve, but those who actually delved into the data noted at the time that facial age estimation would be less accurate for those aged 14 and 15 years old — exactly the age group the platforms were supposed to be blocking,” he wrote.
IF YOU’VE BEEN FINDING THIS NEWSLETTER HELPFUL, PLEASE SUPPORT IT: The Weekly Cybers is currently unfunded. It’d be lovely if you threw a few dollars into the tip jar at stilgherrian.com/tip. And thanks to those of you who’ve already done so. Much appreciated.
Also in the news
- Anthropic has signed an AI safety agreement with Australia. “Under the agreement, the company will share findings on the risks and capabilities of AI, collaborate with research institutions, and take part in safety and security evaluations as part of a commitment to work with Australia’s AI Safety Institute,” reports the Canberra Times.
- Meanwhile Anthropic's boss Dario Amodei has opinions on copyright.
- The Federal Court has hit crypto exchange Binance’s local operator with a $10 million fine for misclassifying more than 85% of its Australian clients, exposing them to high-risk crypto products.
- Scammers are using a variety of new tactics to increase their effectiveness, including “replicating the hold music used by financial institutions and simultaneously calling both victims and banks so they can bypass authentication,” reports Cyber Daily. Oh, and terrorist groups are posing as charities, so watch out for that.
- There are new rules to make telcos more accountable for outages, per the exciting Telecommunications (Customer Communications for Outages) Industry Standard Variation 2026 (No. 1). "Under the updated requirements telcos must publish information about every major outage and significant local outage across all their networks resolved on or after 31 March 2026," writes the Australian Communications and Media Authority (ACMA).
- There are also new rules for how mobile networks report their network coverage. The relevant document is the Telecommunications (Mobile Network Coverage Maps) Industry Standard 2026. "Mobile providers make available network coverage maps, but they are measured and presented differently. We know that consumers are frustrated that, as a result, they can't make any meaningful comparison between them," said ACMA chair Nerida O'Loughlin.
- The ACCC released its latest Measuring Broadband Australia report. Steady as she goes.
- Australia Post has dumped its Digital iD™, which was Australia’s first mainstream digital identity credential. This makes sense to me, given that the Department of Finance’s myID will soon be the god-emperor of ID.
- From friend of the Cybers Kate Carruthers, the observation that data governance has a brand problem. “Inside many organisations it still sounds like control, cost, and constraint, not value. Even when the underlying work is critical, the label ‘data governance’ often lands with executives as bureaucracy: steering committees, policies, and standards that slow things down,” she writes.
Elsewhere
- “Folk are getting dangerously attached to AI that always tells them they’re right,” reports The Register.
- Oligarch Watch has some interesting analysis of why Mark Zuckerberg’s extreme makeover has backfired, particularly given Meta’s recent courtroom losses.
- From Tech Policy Press, How the internet can survive an era of rivalry and fragmentation.
FINALLY, NEW PODCASTS Look for “The 9pm Edict” in your podcast app for two new episodes. In The 9pm Peak Sheep with David F Porteous, the Scottish author and social researcher joins me to chat about everything from looksmaxxing and Nigel Farage to snakes and how we’d renovate Washington DC. And then there’s a space chat in The 9pm Artemis II and the Space Cockroaches with Dr Alice Gorman and Rami Mandow, particularly relevant given today’s launch.
Inquiries of note
- The Office of the Australian Information Commissioner (OAIC) has published an exposure draft of the landmark Children’s Online Privacy Code. There’s some analysis at The Conversation. Submissions close 5 June as part of a complex consultation process.
- The ACCC has launched a consultation on NBN Co’s Special Access Undertaking (SAU), which is a bunch of stuff about how access to the NBN is regulated. Submissions close 28 April.
- The Parliamentary Joint Committee on Intelligence and Security (PJCIS) has kicked off an inquiry into the Australian Criminal Intelligence Commission Bill 2026 and the Crimes and Other Legislation Amendment (Omnibus No. 1) Bill 2026. Submissions close 5 June.
What’s next?
Parliament is now on break until Tuesday 12 May, when it’s Budget Night — but I’ll be watching out for public hearings for the various inquiries under way.
DOES SOMETHING IN THE EMAIL LOOK WRONG? Let me know. If there’s ever a factual error, editing mistake, or confusing typo, it’ll be corrected in the web archives.