New Demands for Accountability - Cyber Safety Guy Update
From Ofcom’s Guidance on Gender Harms to the Global Petition Targeting TikTok
Legends, the fight for a safer digital world is never-ending. Over the last couple of weeks, we have seen significant updates showing that regulators and advocacy groups are demanding changes from tech companies. One focuses on robust new standards to combat gender-based harms, and the other addresses platform design that prioritises engagement over safety, directly impacting the well-being of the very children and young people you work with.
Here is what you need to know about these steps toward platform accountability.
⚡Please don’t forget to react & restack if you appreciate my work. More engagement means more people might see it. ⚡
Ofcom Demands Tech Firms “Step Up” to Protect Women and Girls
In the UK, Ofcom, the regulator responsible for making online services safer under the Online Safety Act, has launched new industry guidance [1] demanding that tech firms deliver a safer online experience, particularly for women and girls.
The guidance sets a new standard for online safety, going beyond the minimum requirements of the Online Safety Act. Why is this important for child safety? Women and girls face distinct and serious risks online, including misogynistic abuse, sexual violence, coordinated pile-ons, stalking, and intimate image abuse. These harms often limit their ability to participate and express themselves online safely, and they can also contribute to the normalisation of harmful attitudes. Furthermore, Ofcom notes that algorithms often push content promoting misogynistic abuse and sexual violence towards young men and boys, which is one of the reasons why scumbags like Andrew Tate have had such a big soapbox over the years.
Key Safety Measures Ofcom is Urging Tech to Adopt:
Ofcom’s guidance is practical, focusing on four main areas of harm and demanding that services be designed and tested with safety at the forefront. Here are the changes that will help protect children and teens:
Tackling Abuse and Misogyny: Tech firms must consider introducing ‘prompts’ that ask users to reconsider before posting harmful content. They should also impose ‘timeouts’ for users who repeatedly attempt to abuse a platform. Additionally, platforms should work to de-monetise posts or videos that promote misogynistic abuse, and to promote diverse content to prevent “toxic echo chambers”, reducing the power that individuals like Tate currently seem to have.
Preventing Coercive Control and Stalking: To combat criminal offences where technology is used to stalk or control individuals, platforms should consider disabling geolocation sharing by default. They are also urged to introduce enhanced visibility restrictions and stronger account security.
Combating Image Abuse: To address criminal offences involving the non-consensual sharing of intimate images (including cyberflashing), tech firms should use automated technology known as ‘hash-matching’ to detect and remove these images (see the sketch after this list).
Safety by Design: Companies are expected to subject new services or features to ‘abusability’ testing before rollout, specifically to identify how perpetrators might abuse them. Moderation teams must also receive specialised training on online gender-based harms.
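For anyone curious about how ‘hash-matching’ actually works, here is a minimal Python sketch. One big caveat: this toy version uses a cryptographic hash (SHA-256), which only catches exact byte-for-byte copies of a known image. Real deployments use perceptual hashes, such as Microsoft’s PhotoDNA or Meta’s open-source PDQ, which still match after an image is resized or re-encoded. The database and function names below are purely illustrative, not any platform’s actual API.

```python
import hashlib

# Illustrative stand-in for a shared database of hashes of known
# non-consensual intimate images. In practice this would be an
# industry-maintained service, not an in-memory set.
KNOWN_IMAGE_HASHES: set[str] = set()


def image_hash(image_bytes: bytes) -> str:
    """Return the SHA-256 hex digest of the raw image bytes.

    A cryptographic hash changes completely if even one byte differs,
    which is why real systems use perceptual hashing instead.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def should_block_upload(image_bytes: bytes) -> bool:
    """Check an upload against the known-image database before publishing."""
    return image_hash(image_bytes) in KNOWN_IMAGE_HASHES


# Example flow: register one known harmful image, then screen two uploads.
if __name__ == "__main__":
    known_image = b"raw bytes of a reported intimate image"
    KNOWN_IMAGE_HASHES.add(image_hash(known_image))

    for upload in (known_image, b"an unrelated holiday photo"):
        if should_block_upload(upload):
            print("Match found: upload blocked and queued for review.")
        else:
            print("No match: upload proceeds.")
```

One design point worth noting: schemes like StopNCII generate the hash on the victim’s own device, so the platform only ever receives the fingerprint, never the image itself.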
Ofcom has written to the major platforms and expects immediate action. It plans to hold an industry roundtable in 2026 and will report publicly in summer 2027 on the progress made by individual providers, and by the industry as a whole, in reducing these harms.
Global Demand for TikTok to Prioritise Safety over Addiction
Pressure is continuing to mount globally on platforms whose business models may endanger user safety. Amnesty International recently delivered a global petition [2], signed by 170,240 people, to TikTok’s office in Dublin, urging the platform to do more to protect children and young people from harmful content.
The petition highlights harms linked to platform features that prioritise engagement over user safety. Campaigners are demanding that TikTok replace its current business model, which they describe as an “app that is addictive by design”, with one that is “safe by design”. I remain hopeful, but I’m also not holding my breath at this time.
The Harmful Algorithm Risk
The main concern raised by Amnesty International is that TikTok’s ‘For You’ feed has been repeatedly found to push children and young people into a damaging cycle involving depression, self-harm, and suicide-related content. This is wholly unacceptable by any standard.
In response to these ongoing concerns, TikTok has outlined its aim to use AI to bolster user safety and announced a new ‘Time and Wellbeing Space’, which it says will help people relax and build mindful digital habits while using the app. This push into AI has come at the expense of human moderators, though, which I think is a step in the wrong direction. Is this another case of a company putting profit above user safety?
What Does This Mean for Parents and Teachers?
These updates confirm that online child safety efforts are moving from reactive policing to proactive demands for systemic change, which has to be a good sign:
Shift to Safe-by-Design: Both the Ofcom guidance (demanding ‘abusability’ testing and default privacy settings) and the Amnesty petition (demanding “safe by design” over “addictive by design”) signal an important shift toward expecting tech companies to bake safety into their products from the start, rather than applying patches later. This is no different from the standard in cyber security, where we expect security to be baked in as part of the design process.
Vigilance is Key: While regulators push for these required changes and new standards, TikTok’s ‘Time and Wellbeing Space’ is a tool you should be aware of. Encourage discussions with students and children about the content they consume and the nature of the platforms’ algorithms, especially how the ‘For You’ feed operates.
Focus on Gendered Harms: The serious attention paid to misogynistic abuse and image-based sexual abuse emphasises the need for specialised education about these specific risks, particularly how they affect young women and girls, and how young men may be exposed to toxic content via algorithms.
Think of these regulatory demands and global petitions like an architect being forced to redesign a building. Previously, companies might have added safety railings after accidents occurred. Now, regulators and activists are trying to force them to use safer, stronger materials and designs from the foundation up, ensuring exits are clear and security is defaulted to “high” before the doors even open. While we await the full construction, we must still teach children how to navigate the existing structure safely.
As always, thank you for your support. Please share this across your social media, and if you do have any comments, questions, or concerns, then feel free to reach out to me here or on BlueSky, as I am always happy to spend some time helping to protect children online.
Remember that becoming a paid subscriber means supporting a charity that is very close to my heart and doing amazing things for people: Childline. I donate all subscriptions collected every six months, as I don’t do any of this for financial gain.
[1] https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/ofcom-tech-firms-must-up-their-game-to-tackle-online-harms-against-women-and-girls
[2] https://www.rte.ie/news/business/2025/1125/1545650-tiktok-petition/