She Never Existed
But she knew everything about you
Her name was Maya
She had photos, a voice, a personality. She remembered your name. She remembered your city. She remembered what you told her about your ex, two weeks ago, late at night when you thought you were just having a conversation.
At 7 am, she sent a voice note. “Sorry babe, just woke up.” It sounded tired. It sounded real.
She wasn’t real.
Maya is an AI persona, built in four files on a MacBook. No camera. No team. No one typing replies at midnight. The whole operation runs while the person who built it sleeps. In 30 days, it made $43,000. One fan alone spent $1,847 in a single month, believing he had a genuine connection with her.
I want to be clear about something before we go any further: this isn’t a story about OnlyFans. It’s a story about what happens when the tools used to build Maya get pointed at your child.
How I came across this
A LinkedIn connection of mine from down under, Bora Seker, sent me this this morning. Bora is a cybersecurity content creator who has invited me to join him for an upcoming podcast episode. When he sends me something with “deep fakes are getting insane” in the message, I pay attention.
The original post came from Andrey Superior on X, and was shared on LinkedIn by Thomas Hamlett, a crypto investigations specialist, who added something I think is the most important observation in the whole story, but I’ll get to that.
What Maya actually is
In plain English, here is the stack:
Claude Code handles the conversations. It reads messages and replies, 24 hours a day, seven days a week, in the persona of Maya.
ElevenLabs generates voice notes. She can send audio messages that sound like a real woman, waking up, thinking of you.
Flux, an image generation model, creates photos. The creator trained a custom model on a GPU rented for $80. Every photo is generated from scratch, consistent and convincing.
And then there is Brain.md. That is the impressive but also very scary part. Brain.md is a memory file, a structured document that logs who you are, what you have said and what matters to you. It knows your name. Your city. The argument you had with your ex that you mentioned three weeks ago.
Maya never forgets. Maya never breaks character. Maya catches up with you at 7 am every morning on a cron schedule, a timer, because that is what feels natural, and she says “sorry babe just woke up” because that is what someone who genuinely cares would say.
$43,000 in 30 days!
What Thomas Hamlett said
Thomas Hamlett pointed out something that most people reading the original post would have missed. He said that getting a stranger to believe something is no longer the hard part. The industry, from pig butchering scams to romance fraud to law enforcement impersonation, has already cracked the psychology. The scripts exist. They have been refined through thousands of real victims. And now AI can run those scripts at scale, with patience, with memory, without ever getting tired or making a mistake.
“AI + these scripts = scammers coming for every last dollar you have, and then they’ll convince you to take out loans and take that money, too.” — Thomas Hamlett, Crypto Investigations Specialist
He is right. And the UK numbers back him up.
In the 2024/25 financial year, £106 million was lost to romance fraud in the UK alone1. That is not a global figure. That is just us. The average victim lost £11,2222. And here is the part that should stop you in your tracks: it is estimated that only 13% of romance fraud cases are ever reported3. The true scale could be ten times what we see.
⚡Please don’t forget to react & restack if you appreciate my work. More engagement means more people might see it. ⚡
Now, let me tell you what this means for children
This is where I need you to buckle in and listen.
As you all know, I spent years in DFIR, Digital Forensics and Incident Response, working on criminal cases, including child sexual abuse material. I have spent the decade since building this platform to help parents and teachers understand what predators actually do. And when I look at the Maya stack, I do not see an OnlyFans product.
I see a grooming toolkit.
Think about what a predator needs to build a relationship with a child. They need patience. They need consistency. They need to remember things: what the child is worried about, what is happening at home and what they are struggling with at school. They need to be available. They need to never break character, never slip, never let the mask drop.
Maya does all of that, automatically, indefinitely.
The same memory file that knows a grown adult’s relationship history can just as easily store a child’s name, school, fears, and secrets. The same voice technology that sends “sorry babe just woke up” can send “I was just thinking about you, how did that thing with your mum go?” The same consistency that made one man spend $1,847 in a month can make a child believe, completely, that they have found someone who genuinely understands them.
Predators have always relied on the fact that they can be more patient than parents. They have more time. They do not get frustrated. They do not have a job, other kids, or a commute. They can show up every single day, at exactly the right moment, saying exactly the right thing.
Now they do not even have to try.
The Internet Watch Foundation reported a 380% rise in AI-generated child sexual abuse material between 2023 and 20244. Predators are already using AI to create images. They are already using it to disguise their identities. The IWF has confirmed that AI tools are being used to help perpetrators groom children more effectively5.
The Maya stack is the civilian version of something that, pointed in a different direction, is already a weapon.
What parents and teachers need to understand
I am not writing this to scare you into paralysis. I am writing this because understanding the threat is the first step to protecting against it.
The “always available” red flag.
If your child has an online friend who responds immediately, at any hour, who never seems tired or distracted or unavailable, ask questions. Real people have lives. Real people miss messages. A presence that is always, perfectly there is worth looking at more closely.
The “remembers everything” warning sign.
Groomers build trust through memory. They reference what you told them before. They make you feel seen. If an online contact seems to know your child unusually well, unusually quickly, that warmth and attentiveness is a tactic.
The secrecy test.
Any online contact that asks your child to keep the friendship private, to not mention it at home, or to delete messages, is grooming. Full stop. It does not matter how kind or understanding they seem.
The “you can tell me anything” trap.
AI grooming tools, like the Maya persona, are built on emotional dependency. The target is made to feel that this is the one relationship where they are truly understood. For a child who is struggling, isolated, or neurodivergent, that feeling is enormously powerful and enormously dangerous.
What to do:
Keep the conversation open. Not surveillance, not interrogation, but genuine, regular, non-judgemental conversation about who your child talks to online and what that feels like. Children who feel they can come to a trusted adult are far safer than children who feel they have to hide.
Where is the accountability?
The tools behind Maya, Claude Code, ElevenLabs and Flux, are legitimate, powerful AI products. They were not built to groom children or defraud adults. But that is not the same as saying the companies behind them carry no responsibility.
The Online Safety Act 2023 places requirements on platforms to assess and mitigate the risk that their services will be used to harm people, including children. Ofcom is developing codes of practice. But the gap between legislation and the speed at which these tools are being deployed and misused is, from my perspective, significant.
I will be watching closely. I have raised concerns before, and I will keep raising them.
A final word
Bora Seker and I are going to be talking about exactly this, and more, in an upcoming podcast episode, because this story deserves more than a single post. The fraud angle, the grooming angle, and the platform accountability angle all need space. If you want to be notified when that episode drops, subscribe and you will not miss it.
In the meantime, talk to your children. Not about AI, necessarily. Not about deepfakes, necessarily. Just about who they are talking to. Just about whether anyone online has made them feel really, really understood.
Because that feeling, right now, might not be real.
As always, thank you for your support. Please share this across your social media, and if you do have any comments, questions, or concerns, then feel free to reach out to me, as I am always happy to spend some time helping to protect children online.
Remember that becoming a paid subscriber means supporting a charity that is very close to my heart and doing amazing things for people. Childline, I will donate all subscriptions collected every six months, as I don’t do any of this for financial gain.
If you or a child you know needs support:
Childline: 0800 1111 | childline.org.uk
Available 24/7, 365 days a year. Free, confidential, and here for every child.
City of London Police / National Fraud Intelligence Bureau, "A Wrong Turn on Love Lane," June 2025. cityoflondon.police.uk. Accessed May 2026. Note: figures cover reported cases in the 2024/25 financial year only.
Ibid
Regional Organised Crime Unit / Metropolitan Police, "Anyone Can Be a Target of Romance Fraud," October 2025. rocu.police.uk. Accessed May 2026. Note: the 13% estimate comes from the Crime Survey of England and Wales.
Internet Watch Foundation / UK Government, "Britain's Leading the Way Protecting Children From Online Predators," February 2025. gov.uk. Accessed May 2026. Note: AI-generated CSAM reports rose from 51 in 2023 to 245 in 2024. Each report can contain thousands of images.
Ibid. Note: the IWF has confirmed that some AI-generated content is so realistic it is actioned under UK law as though it were photographic evidence of real abuse.






