Two Verdicts. Two Days. One Very Clear Message.
Meta has been found liable for child sexual exploitation on its platforms. Here's what the evidence actually showed, and why this verdict matters to every parent and teacher in the UK.
Not one landmark verdict against a social media company, but two, in two days.
On Tuesday, a jury in New Mexico found Meta liable on all counts for child sexual exploitation on its platforms and ordered the company to pay $375 million. On Wednesday, a jury in Los Angeles found Meta and YouTube liable for addicting a child to their platforms, the first verdict of its kind for that specific harm, ever.
Two separate courts. Two separate legal theories. Two separate sets of plaintiffs. And in both cases, twelve ordinary people sat through weeks of evidence and came back with the same answer: Meta knew, and it chose profit anyway.
I want to walk you through both verdicts, because they are different stories with the same ending. And I want to be clear about why, sitting here in the UK, this week should matter to every parent and teacher reading this.
The New Mexico Case: They Knew About the Predators
The New Mexico case was brought by the state’s Attorney General, Raúl Torrez, a former prosecutor who spent years working on internet crimes against children. He filed the lawsuit in December 2023.
It started with an undercover investigation. The AG’s office created fake Facebook and Instagram profiles posing as children under the age of 14. Within a very short time, those accounts were flooded with predatory contact, sexually explicit material, and solicitations from adults seeking sexual content from minors. Three criminal arrests followed, including two men who arrived at a location believing they would meet a 12-year-old girl.
The jury found Meta acted willfully. Not negligently. Willfully: the legal finding that Meta knowingly and intentionally deceived the public about the safety of its platforms for children.
The most damaging evidence didn’t come from prosecutors. It came from inside Meta itself.
In 2019, Meta pushed ahead with end-to-end encryption on Facebook Messenger. Their own teams raised the alarm. An internal briefing predicted the move would cause reports of child sexual abuse material to law enforcement to drop from 18.4 million annually to 6.4 million. A 65% reduction. Not because the abuse was stopping. Because the ability to detect and report it would be gone.
Monika Bickert, Meta’s own Head of Content Policy, wrote internally at the time:
“We are about to do a bad thing as a company. This is so irresponsible.”
Antigone Davis, Meta’s global head of safety, warned that encrypted Messenger would be ‘far, far worse than anything we have seen on WhatsApp.’
They went ahead anyway.
Former Meta engineering director Arturo Bejar, whose own 14-year-old daughter received sexual solicitations on Instagram, testified:
“The product is very good at connecting people with interests, and if your interest is little girls, it will be really good at connecting you with little girls.”
The jury awarded $375 million in civil penalties. Meta says it will appeal. A Phase 2 hearing is scheduled for May, where a judge could order Meta to make structural changes to its platforms, including mandatory age verification.
The Los Angeles Case: They Knew About the Addiction
The LA case was different in nature, but the pattern was identical.
The plaintiff, referred to as Kaley, started using YouTube at age six and Instagram at age nine. By her teens, she had depression, anxiety, body dysmorphia, and suicidal thoughts. Her legal team argued the platforms’ design was responsible for that harm.
The jury was explicitly told not to consider what content Kaley saw. Only how the platforms were built. Infinite scroll, autoplay, algorithmic recommendation feeds, push notifications, and like counts. All of it engineered to create compulsive use, to make putting the phone down feel difficult, to reward engagement and punish absence.
The jury deliberated for nearly 44 hours over nine days. They found both Meta and YouTube negligent on every count. They found the companies knew or should have known their designs would cause harm to children. And critically, they found both companies acted with malice, oppression, or fraud. That last finding opens a punitive damages phase that could dwarf the $3 million compensatory award.
Meta’s own internal research, shown in court, found that Instagram makes body image issues worse for one in three teen girls. A Meta researcher wrote that Instagram ‘is a drug’ and that Meta ‘are basically pushers.’ They published none of it.
Mark Zuckerberg testified in person for nearly eight hours. He maintained the science hasn’t proved that social media causes mental health harm. He admitted a 2015 internal review found over four million users under 13 on the platform. When asked about his qualifications to assess causation, he said: ‘I don’t have a college degree in anything.’
The Same Pattern, Twice
I’ve spent years in digital forensics with the Royal Air Force. I’ve reviewed material that most people will never see and should never have to. And what strikes me about both of these cases is not the verdicts themselves. It’s how consistent the internal evidence was.
In both cases, Meta’s own employees raised serious concerns about child safety. In both cases, those concerns were documented. In both cases, the company moved forward regardless. In the New Mexico case, that meant rolling out encryption that their own safety team predicted would destroy their ability to detect child abuse. In the LA case, that meant building and refining features they knew were psychologically harmful to young users.
To be brutally honest, none of this surprised me. But that doesn’t make it any less important that two separate juries, looking at two separate bodies of evidence, arrived at the same conclusion in the same week.
⚡Please don’t forget to react & restack if you appreciate my work. More engagement means more people might see it. ⚡
Why This Matters in the UK Right Now
I hear this from parents every week: ‘This all sounds terrible, but it’s happening in America.’
It isn’t.
Your child is on the same Instagram. The same Facebook. The same YouTube. The same algorithms. The same infinite scroll. The same notification system. The same features that two separate juries this week described as negligent and harmful are running on the device in your child’s pocket right now.
Ofcom wrote directly to Facebook, Instagram, Roblox, Snapchat, TikTok, and YouTube on 12 March, just two weeks before these verdicts. Four demands: effective minimum age policies, failsafe grooming protections, safer feeds for children, and an end to product testing on children. Platforms have until 30 April to respond. Ofcom’s CEO Dame Melanie Dawes stated: ‘These online services are household names, but they’re failing to put children’s safety at the heart of their products.’
The Online Safety Act is not a proposal; it is law in force. Children’s protection duties have applied since July 2025. Ofcom has launched over 80 investigations and issued its first fines. Reddit was fined £14 million in February 2026 for unlawfully processing children’s data.
The UK government is consulting on going further, a potential ban on social media for under-16s, restrictions on infinite scroll and autoplay, and raising the digital age of consent from 13. The consultation closes on 26 May. The internal Meta documents exposed in US courts this week will almost certainly be cited in UK regulatory proceedings.
The Numbers You Need to Know
In the UK, 40% of children under 13 have social media profiles despite minimum age rules. Instagram ownership among 8 to 9-year-olds nearly doubled in a single year, from 8% to 14%. Meta’s own leaked research found 13.5% of British teen girls traced suicidal thoughts to Instagram. A major study in JAMA Psychiatry found that adolescents spending more than three hours daily on social media face double the risk of depression and anxiety. Ofcom data shows 37% of children aged 3 to 5 used at least one social media app in 2024.
These are not edge cases. This is the landscape your children are growing up in.
What Comes Next
In New Mexico, the Phase 2 hearing in May could compel Meta to make structural changes to its platforms. In Los Angeles, the punitive damages phase is still to come. A second California bellwether trial is set for May, a third for July. The federal multi-district litigation in Oakland covers over 10,000 individual cases and nearly 800 school district lawsuits. At least 41 state attorneys general have active suits against Meta.
Legal experts have compared this to 1990s tobacco litigation. The early verdicts looked small. The eventual tobacco master settlement was $206 billion. The question is whether the momentum holds.
What This Means For You
I want to be honest with you. Neither verdict fixes anything overnight. Meta and YouTube will appeal. The platforms your children are using today are the same ones that were in those courtrooms this week.
But these verdicts matter because they say, publicly and legally, what many of us have known for a long time. These companies knew. They had internal research. Their own employees raised the alarm. And they chose profit.
Talk to your children about how social media makes them feel, not just what they see on it. Ask them what happens when they put the phone down. Do they feel anxious? Do they immediately want to pick it back up? That feeling is not a coincidence. It is the product working exactly as designed.
Turn off push notifications on every app. Turn off autoplay on YouTube. Set a daily time limit. On Instagram, go into Settings, then Your Activity, then Time in App. These are not perfect solutions. But they put some friction back into a system designed to remove all friction.
If your child is under 13, they should not be on Facebook, Instagram, or YouTube. The age restrictions exist because of exactly what came out in those courtrooms.
If your child needs someone to talk to right now:
Childline: 0800 1111 | childline.org.uk | Free, confidential, 24 hours a day
Two juries. Two days. Two verdicts that said the same thing.
They knew. And now, so do we.
As always, thank you for your support. Please share this across your social media, and if you do have any comments, questions, or concerns, then feel free to reach out to me, as I am always happy to spend some time helping to protect children online.
Remember that becoming a paid subscriber means supporting a charity that is very close to my heart and doing amazing things for people. Childline, I will donate all subscriptions collected every six months, as I don’t do any of this for financial gain.





