It Starts in a Game. It Doesn't End There.
Roblox and Deepfakes
Chances are, your child is playing Roblox right now, or has played it this week. It is one of the most popular platforms on the planet, it looks like colourful building blocks, and it is rated suitable for children as young as seven. Over 40% of its 400 million users are under the age of 13.
But I want to talk to you about what is happening inside it.
In 2019, Roblox reported 675 child exploitation cases to authorities. By 2024, that number had risen to 24,522. Not a typo. That is a thirty-six-fold increase in five years. And in 2026, the platform is being sued simultaneously by the County of Los Angeles, the State of Nebraska, the State of Georgia, and several others, all making the same allegation: that Roblox has knowingly allowed its platform to become a hunting ground for predators, and has put profit ahead of the children it claims to protect.
I am also going to tell you about a second threat: a newer one that most parents have never heard of, and one that is being used as a weapon against children who are groomed through games like this. These two things, the game and the tool, are increasingly being used together.
Related: Sextortion: Why Your Son Is the Target Right Now | Grooming: What It Is, How It Starts, and How to Spot It
The Game Your Child Trusts
Part of what makes Roblox such fertile ground for predators is exactly what makes it so popular with children. It is a platform where you can build things, play hundreds of user-created games, and do so alongside others. The social element is the point. You are meant to meet people here. You are meant to make friends.
And that is precisely the problem.
Predators on Roblox do not announce themselves. They do not approach a child with an immediate request. They start by playing. They join the same game. They offer help, share in-game currency (Robux) and pay compliments. They might tell a lonely 12-year-old that they are the most fun person they have played with, that they are mature for their age and that they really get each other. Over days, sometimes weeks, they build something that feels like genuine friendship.
Roblox’s in-house moderation tools are limited, and the platform has consistently struggled to prevent adult accounts from reaching child users. Researchers have shown that creating a fake account that appears to be a teenager can take minutes. The avatar system means there are no faces, no voices, no age markers. A 40-year-old can present as a 13-year-old and a child would have no reason to question it.
What happens next is what I need you to understand.
The Pipeline
Predators do not want to stay on Roblox. Roblox, whatever its failures, does have some moderation. The game’s chat filters and reporting mechanisms, inconsistent as they are, represent a risk to someone trying to exploit a child.
So they move the child off the platform.
The most common destination is Discord, a messaging and voice chat service that is genuinely excellent for gaming communities and completely legitimate in its design, but which has very limited parental visibility and where private servers and direct messages can happen away from any oversight at all. Snapchat is another common step, where disappearing messages make evidence difficult to preserve.
Families, investigators, and legal cases all describe the same pattern: a child is contacted via Roblox, the predator builds trust through gameplay, and then at some point, they suggest moving the conversation “somewhere we can talk properly.” The child, who now trusts this person, agrees.
Once off-platform, the grooming accelerates. Conversations become more personal, more intimate. Requests for photographs begin, framed carefully at first. And then, increasingly, something else enters the picture.
I am glad you are still here, because this next part is the reason I sat down to write this post.
The Weapon: AI Deepfakes and Nudifying Apps
In February 2026, UNICEF published a statement with four words at the centre of it: “Deepfake abuse is abuse.”
A report by Internet Matters estimated that approximately 529,632 teenagers in the UK, roughly 4 in every class of 30, have already had an encounter with a nude deepfake. A joint study by UNICEF, ECPAT, and Interpol across 11 countries found that at least 1.2 million children globally disclosed having had their images manipulated into sexually explicit deepfakes in the past year alone. These are frightening numbers, and far too few people are aware of them.
Here is how it works. There are tools available right now, free to access online, no age verification required, that can take a photograph of a clothed person and generate an image of that person without clothes. These are called nudifying apps. An investigation found that none of the 21 tested sites required any form of age check before use. They are being used against children in two distinct and equally disturbing ways.
The first is by adult predators. Once a child has been groomed and has shared photographs, those images can be fed into a nudifying tool and the result used as leverage. “Send me what I’m asking for, or I’ll send this to your school.” A child looking at a realistic image of themselves, terrified, with no idea that it is entirely fabricated, will often comply. 19% of reports made to the UK’s Report Remove helpline now involve imagery that has been digitally altered or AI-manipulated. The same risk applies to any pictures you share of your children on your own social media accounts, which is a very good reason to lock down who can view them.
The second is by other children. This is the part that concerns me most. Boys, predominantly, are using nudifying tools on photographs of girls they know, classmates, girls in their year. Not necessarily because they intend to use the images maliciously, sometimes because they can, because the tool is there and it feels abstract on a screen, and they have no real concept of the impact. But the impact is the same. The UK’s Children’s Commissioner called for an immediate ban on nudification apps in 2025. The UK Government announced plans to criminalise them in December 2025. As of this writing, those apps are still accessible.
How These Two Threats Connect
I wanted to write about Roblox and AI deepfakes in the same post because they are increasingly part of the same threat to the same children.
A predator locates a child on Roblox. They build trust through gameplay over days or weeks. They move the child to Discord. Through a combination of emotional manipulation and gentle escalation, they obtain a photograph, a face, a name. That is sometimes all they need. From a single social media photo or an image shared during a conversation, a nudifying tool can generate explicit content. That material becomes the weapon. The blackmail begins.
Your child did not do anything wrong. They played a video game. They made what they thought was a friend. And somewhere in the gap between those two things, something was done to them without their knowledge, using technology that costs nothing and takes seconds.
This is the new stranger danger. And I am telling you about it because most parents have no idea it exists.
Warning Signs
Children caught in this kind of situation will rarely come forward unprompted. But there are signs:
Becoming withdrawn or distressed after gaming sessions or time on their phone
Secretive about who they are playing with online, or who they are messaging
Unexplained anxiety about going to school or seeing certain friends
Sudden reluctance to use a platform they previously loved
Requests for money, gift cards, or unexplained small purchases
Visible upset after receiving messages, particularly late at night
If your child’s mood changes and you can’t find the reason, trust that instinct. Ask the question gently. And make sure they know that whatever they tell you, your answer will not be anger.
If You Think Your Child Is Being Targeted
Stay calm. If your child comes to you, they have done the hardest thing. Meet their courage with yours.
Do not delete anything yet. Before blocking any accounts, screenshot everything: profile names, conversations, usernames, and any images. This is your evidence.
Block the accounts on every platform once you have that evidence preserved.
Report to CEOP at ceop.police.uk. This is exactly what they are there for, and it is free, confidential, and available to any family in the UK.
Contact the Internet Watch Foundation at iwf.org.uk if any images have been shared online. They have the power to work with platforms to have images removed.
Call Childline on 0800 1111, free and available 24 hours a day. The counsellors understand this, and sometimes a child needs to talk to someone who is not a parent.
Report to the platform. Roblox, Discord, and Snapchat all have reporting mechanisms. Use them. It will not feel like enough, but it matters.
The Conversation Worth Having Today
You do not need to turn this into a lecture. You do not need to take the game away or make your child feel accused of something they have not done.
You just need to open a door.
Something like: “I read something today about Roblox that I wanted you to know about, not because I think anything like this is happening to you, but because I want you to know what it looks like so you can spot it. And I want you to know that if anyone ever made you feel uncomfortable online, or asked you to keep something a secret, you could always come to me and I would not be angry. I would help.”
That is the whole thing. Open, honest, non-judgmental. The technology will keep evolving. The tactics will change. The one constant that protects a child is knowing that the door to a trusted adult is open.
FAQs
Is Roblox safe for children? Roblox has significant child safety problems that the company has repeatedly failed to address adequately. It reported 24,522 child exploitation cases in 2024. While many children use it without incident, parents should be aware of the risks and use parental controls, talk to their child about online safety, and monitor who their child is interacting with.
What are nudifying apps and how do they work? Nudifying apps use AI to generate sexualised images of clothed people by digitally removing their clothing. They are freely accessible online with no age verification required, and are being used both by adult predators to create blackmail material and by children against their peers.
How do predators use Roblox to groom children? Predators create accounts on Roblox, present as peers, build trust through gameplay and gifts of in-game currency, then move children off-platform to Discord or Snapchat where there is less moderation. Grooming then escalates in private messaging.
What is the Roblox to Discord pipeline? This refers to the documented pattern where predators initiate contact with a child on Roblox, build a relationship over time, then suggest moving to Discord or another messaging platform where conversations are private, unmonitored, and where exploitation can progress more freely.
Are AI deepfake images of children illegal in the UK? AI-generated sexual images of children constitute child sexual abuse material and are illegal in the UK. The government has also announced plans to criminalise nudification tools specifically. If you encounter such images, report them to the Internet Watch Foundation at iwf.org.uk.
What should I do if my child has been sent or shown a nude deepfake? Do not delete any evidence. Screenshot everything. Report to CEOP at ceop.police.uk and to the platform where it occurred. Contact the Internet Watch Foundation if images are being distributed. Contact Childline on 0800 1111 for support for your child.
You are not alone in this. The gap between what is happening online and what most parents know about is vast, and it is not your fault. These platforms are designed to be engaging, to feel safe, and to look like fun. The dark corners are not signposted.
I spent 8 years inside the evidence. Now I spend my time making sure children never become part of it. This is why I keep writing, even when the algorithms do not serve it, even when it would be easier to stop.
Talk to your child. Today, if you can.
Keep fighting the good fight, stay informed, and keep the conversations alive. Remember that I am here to guide you through the maze as we make sure your children can enjoy their online experiences and flourish in life. I don’t want them to become just another statistic, and neither do you.
Useful resources: CEOP | Internet Watch Foundation | Childline — 0800 1111 (free, 24/7) | Internet Matters deepfakes research | UNICEF: Deepfake abuse is abuse
As always, thank you for your support. Please share this across your social media, and if you do have any comments, questions, or concerns, then feel free to reach out to me here or on BlueSky, as I am always happy to spend some time helping to protect children online.
Remember that becoming a paid subscriber means supporting a charity that is very close to my heart and doing amazing things for people: Childline. I donate all subscriptions collected every six months, as I don’t do any of this for financial gain.