The Simple Question the Roblox CEO Refused to Answer
(And Why It Matters)
I often write about the technical settings on apps like Roblox, such as how to turn off chat or restrict games, and I have covered Roblox specifically on a few occasions. But sometimes, to truly understand the risk, you need to look at the philosophy of the people running the platform.
I am a fan of the Hard Fork Podcast with Kevin Roose and Casey Newton, and I have just watched their latest ‘crazy’ interview (link below) with the Roblox CEO, David Baszucki. I say crazy because, when pressed on the rampant issues of predators and grooming on his platform, his responses ranged from evasive to genuinely concerning.
Here is what you need to know about the man running the world’s most popular digital playground, and why his answers suggest that safety might not be the priority we would want it to be.
The “Scale” Excuse
Roblox is undoubtedly huge: it has over 150 million daily active users, and we know that about 40% of them are under 13. Yet when the podcast hosts asked Baszucki if he was confident in his technology’s ability to moderate such a massive space, his defence was telling.
He repeatedly described the scale of Roblox as “absolutely mind-boggling”. Instead of offering a concrete reassurance of safety, he framed the challenge of moderation as an “opportunity” to innovate.
Why this matters: When a CEO answers a question about child safety by marvelling at how big their company is, it makes me think that their growth has outpaced their ability to protect your children.
Denial of the Predator Problem
The interviewers pointed out that Roblox is currently facing over 20 lawsuits accusing the platform of enabling sexual exploitation, with reports of predators using the site to lure children.
When asked directly if he felt Roblox had a problem with predators, Baszucki’s response was not to acknowledge the severity of the issue, but to pivot. He claimed the company is doing an “incredible job at innovating relative to the number of people on our platform”. He went so far as to “categorically reject” the description of Roblox as a place where predators go to find children, despite the mounting legal evidence and news reports stating otherwise.
The Red Flag: Denial is the enemy of safety. If the leadership cannot admit there is a severe problem, they are unlikely to take the drastic measures needed to fix it. He clearly lives on a different planet if he thinks there are no predators on the Roblox platform using it as their hunting ground.
The “Stealth Support” Delusion
In one of the most bizarre moments of the interview, the hosts pressed him on Roblox’s reliance on AI moderation, which historically fails to catch everything, and Baszucki seemed confused by the tough questions.
He asked the hosts, “Is this a stealth interview where actually you love everything we’re doing and you’re here to stealthily support it?”. He seemed to believe that because he views his company’s use of AI as “innovative”, everyone else, including journalists covering child safety, should blindly support it. Ask most people who work in cyber security about AI and they will tell you it should be used as a tool to support you, not something to believe in blindly.
Shifting the Blame to Parents
Roblox is rolling out new “facial age estimation” technology (scanning your face with a phone) to gate certain chat features. Whilst better verification is generally good, Baszucki’s final stance on safety responsibility was a classic deflection.
He stated that the “parent is the ultimate arbiter of responsibility”. While parents obviously play a huge role, this statement conveniently sidesteps the company’s duty to ensure its product is safe by design. He implies that if you aren’t comfortable, you simply shouldn’t let your kid on the platform. That is a fair point, but one that ignores how deceptive the platform’s safety marketing can be. So my advice is to do just that: don’t let your child use a platform where the person at the very top seems averse to taking responsibility for your child’s safety, with a complete disregard for the duty of care that should exist.
Gambling for Kids?
Perhaps most shockingly, when the conversation turned to “prediction markets” (essentially betting on future outcomes), Baszucki called the idea of putting them inside Roblox a “brilliant idea”.
He suggested a scenario where children could use Robux (digital currency bought with real money) to bet on outcomes, calling it educational. For a platform struggling to keep predators away from children, expressing enthusiasm for introducing gambling-style mechanics to its youngest users shows a disturbing lack of judgment regarding child welfare.
⚡Please don’t forget to react & restack if you appreciate my work. More engagement means more people might see it. ⚡
The Bigger Picture: What Police Data Tells Us
While tech CEOs like Baszucki offer optimistic views on “innovation” and “opportunities,” real-world data often tells a much grimmer story. A recent Sky News report revealed the “worst social media app for child abuse offences,” highlighting a massive disconnect between corporate narratives and police reality. According to the report (which identifies Snapchat as a primary offender), thousands of child abuse image offences are recorded annually on these platforms.
This connects directly to the Roblox interview: whether it is Snapchat or Roblox, these platforms often rely on “reactionary” safety models, waiting for reports rather than designing safety in from the start. The Sky News data serves as a stark reminder that while CEOs debate “scale” and “metrics”, police are dealing with the actual victims of these design flaws. It reinforces why, as parents and teachers, you must demand “Safety by Design” rather than accepting the “mind-boggling” growth of apps at the expense of children’s safety.
The Bottom Line
At the end of the interview, the host asked a simple question: In a few years, when his own toddler is old enough to ask for Roblox, will the platform’s safety problems be fixed?
Baszucki did not say “Yes.”
Instead, he offered to take the host under an NDA (Non-Disclosure Agreement) to show him internal metrics to “get him over the hump”.
My advice to parents: If the CEO of Roblox cannot publicly promise that his platform will be safe for a toddler in three years, you should remain extremely vigilant today. Keep those chat filters on, monitor their usage, and remember: the “mind-boggling” scale of Roblox means your child is just one of millions in a system that is still figuring out how to keep them safe.
But to be brutally honest, given the impression this CEO gave by the end of the interview, I wouldn’t let a child anywhere near Roblox until the company seriously changes its viewpoint on online predators, grooming and the very real risks that are present within the Roblox environment.
As always, thank you for your support. Please share this across your social media, and if you do have any comments, questions, or concerns, then feel free to reach out to me here or on BlueSky, as I am always happy to spend some time helping to protect children online.
Remember that becoming a paid subscriber means supporting a charity that is very close to my heart and doing amazing things for people: Childline. I donate all subscriptions collected every six months, as I don’t do any of this for financial gain.





