I Made Joe Marler Say Something He Never Said
Just with a dodgy accent
This week I posted a Reel on Instagram. In it, Joe Marler, the England rugby prop and one of the most entertainingly chaotic people in sport, appeared to speak directly to the camera, promoting my blog.
He absolutely did not do that.
What I actually did was take a photo I’d had taken with Joe at a match (something I’m genuinely chuffed about, because he’s a brilliant bloke), run it through an AI tool that animated the still image, and add my own voiceover. Thirty seconds of work. The result was a video of Joe Marler appearing to endorse my blog. His mouth moved. His expression changed. At a glance, it looked like a real video.
The accent wasn’t quite right and the lip sync wasn’t perfect, but bear in mind that with a paid tool I could have used samples of his voice to make it sound exactly like him. Most adults watching it could probably clock that something was off. Most children would not.
And that’s the point.
I made this as a demonstration. A harmless, light-hearted one, with a public figure I’d actually met and in a context where nobody was going to be deceived or harmed. But I need you to sit with that for a second.
A photograph. An AI tool. Thirty seconds.
That is all it takes to make someone appear to say something they never said. To make a face move. To make a voice come out of a still image.
Now imagine it isn’t a joke.
Imagine it’s your child’s favourite YouTuber, their favourite footballer, their favourite gaming streamer. And instead of promoting a blog about online safety, that person is saying: “Hey. I’ve seen your messages. You seem like a really special kid. I’d love to meet you. Here’s my number. Keep it between us.”
Dramatic? Yes. Impossible? No. This technology is here. It’s free to access in many cases. There’s no age gate. And the people who want to reach your child know how to use it.
HOW DOES THIS ACTUALLY WORK?
What I used is one of many AI-powered animation tools now widely available online. Some are free. Some require a small subscription. None of the ones I tested for this post required any form of age verification. You upload a photograph, add audio, and the tool does the rest.
This is separate from deepfake video, which goes further still, blending a person’s likeness into existing footage. That technology is also freely accessible. A joint study by UNICEF, ECPAT and Interpol across 11 countries found that at least 1.2 million children globally reported having their images manipulated into sexually explicit deepfakes in a single year. These are not fringe cases.
There are also tools known as nudification apps, which can take a clothed photograph of any person and generate an image of that person without clothes. An investigation found that none of the 21 tested sites required any form of age verification before use. None. The UK’s Children’s Commissioner, Dame Rachel de Souza, called for an immediate ban in 2025. The UK Government announced plans to criminalise these tools in December 2025. As of writing, they remain accessible.
SO WHAT DOES THIS MEAN FOR GROOMING?
I’ve spent years working in digital forensics and incident response, and in my experience, the technology predators use evolves faster than the awareness around it. The animated photo or deepfake video is a new tool in an old playbook. The goal is the same as it always was: build trust, create a sense of relationship, and move the child to a private space.
A video of your child’s idol, appearing to speak directly to them, appearing to know them, is one of the most effective trust-building mechanisms I can imagine. It bypasses the stranger danger instinct entirely. It is not a stranger. It is someone they already love.
The platforms are not doing enough. The AI companies creating these tools are doing almost nothing. The legislative response, welcome as it is, is playing catch-up with technology that moves faster than Parliament can legislate.
⚡Please don’t forget to react & restack if you appreciate my work. More engagement means more people might see it. ⚡
WHAT CAN YOU DO?
First, watch the Reel if you haven’t already. Show it to your child. Use it as a starting point. Ask them: does that look real to you? What would you do if you got a video message from someone famous you’d never actually spoken to?
Second, read my post from March, It Starts in a Game. It Doesn’t End There, which covers deepfakes in the context of gaming platforms and grooming in much more depth. That post covers nudification apps, the stats behind the threat, and what the current UK legislative picture actually looks like. The post is just below.
Third, check your own social media privacy settings. Every photograph of your child you post publicly is a potential asset for someone building a deepfake. That’s not a reason to never post. It is a reason to make sure your profile is locked down and only visible to people you know.
Fourth, if your child receives a video message, a voice note, or any kind of media from someone they haven’t physically met, encourage them to come and talk to you before responding. No panic, no confiscating devices, just let them know you are always available and can look at it together and talk about it.
FINALLY
This post came from a genuinely funny moment. A rugby match. A dodgy AI accent. Joe Marler looking mildly animated. I’m glad that’s how it started.
But this technology is not funny in the wrong hands. And the wrong hands have it.
I don’t write this to frighten you. I write it because most of what I witnessed during my career in digital forensics didn’t have to happen. Education and conversation are still the most powerful safeguarding tools we have.
PS: Joe, if you are reading this, that picture was taken early in my journey with my lifetime companion, C-PTSD, at a time when you were being very open about your mental health. I told you then that it was an inspiration to me as somebody who had to hide my illness whilst I was in the RAF. This was also before you were a world-famous hundy great guy, but I am happy to see you are still talking openly about mental health, as I try to.
As always, thank you for your support. Please share this across your social media, and if you do have any comments, questions, or concerns, then feel free to reach out to me, as I am always happy to spend some time helping to protect children online.
Remember that becoming a paid subscriber means supporting a charity that is very close to my heart and doing amazing things for people: Childline. I donate all subscriptions collected every six months, as I don’t do any of this for financial gain.
If you or a child you know needs support:
Childline: 0800 1111 | childline.org.uk
Available 24/7, 365 days a year. Free, confidential, and here for every child.