Teenagers still exposed to harmful content despite Online Safety Act, BBC finds
A BBC investigation has revealed that teenagers continue to encounter disturbing content on social media platforms, including posts about bullying, suicide, and weapons, despite new legal protections under the UK's Online Safety Act, which took effect in July 2025.
The findings, part of a follow-up study conducted with BBC Morning Live, suggest that while some platforms have improved, others still fail to shield young users from harmful material.
Repeated experiment shows mixed progress
Researchers recreated six fictional social media profiles, three boys and three girls aged 13 to 15, matching a May 2025 study. Each profile scrolled for 10 minutes daily over a week on TikTok, YouTube, and Instagram.
Results were uneven. Instagram showed no harmful content this time, in line with its new PG-13-style restrictions for teen accounts. However, TikTok repeatedly exposed the 15-year-old girl profile, "Maya," to posts about bullying, child suicide, terminal illness, and violent abuse against women and children. Another 15-year-old girl, "Sophie," saw less of this content, while 13-year-old "Aisha" encountered minimal issues.
On YouTube, two of the three boy profiles saw no concerning material, but 15-year-old "Harry" was shown videos reviewing knives, guns, and crossbows, alongside footage of animals being shot, a first in the BBC's research. These clips appeared abruptly amid unrelated content like football and gaming.
Industry response and expert warnings
Both TikTok and YouTube defended their safety measures. TikTok cited over 50 privacy and content restrictions for teens, including a 60-minute screen-time limit, while YouTube highlighted expanded protections for younger viewers. A YouTube spokesperson cautioned that test accounts "may not reflect real user behavior."
Child safety advocates, however, urged stronger action. Emma Motherwell of the NSPCC called for "safety by design," emphasizing open parent-child conversations and parental controls. David Wright CBE, director of the UK Safer Internet Centre, acknowledged progress, particularly on Instagram, but warned that change would be gradual: "This isn't a binary fix. Ofcom must keep enforcing the law."
"More needs to be done. Companies should integrate safety by design-it can't just be an afterthought."
Emma Motherwell, NSPCC
What parents can do
Experts recommend:
- Regularly reviewing children's devices and apps with them.
- Using parental controls to block high-risk sites.
- Encouraging open, non-judgmental discussions about online experiences.
Wright added: "Don't assume the new rules mean instant safety. Vigilance is still key."
Next steps
Ofcom, the UK's communications regulator, is expected to continue monitoring compliance with the Online Safety Act. The BBC's findings will be shared with policymakers as part of ongoing oversight efforts.