TikTok search suggestions lead 13-year-olds to porn on sign-up – apparently breaching the Online Safety Act

Updated: 02 October 2025

[03/10/2025, London, UK] – TikTok suggests sexualised search terms to new users registered as 13-year-olds, directing them to pornographic content within a few clicks, a Global Witness investigation published today reveals.

Since 25 July, the UK’s Online Safety Act has imposed a wide range of legally binding child safeguarding duties on large online platforms, including social media services. These new findings show that TikTok is failing to meet the new law’s requirements.

Mark Stephens CBE, lawyer and expert on media regulation, said:

“In my view these findings represent a clear breach of the Online Safety Act. It's now on Ofcom to investigate and act swiftly to make sure this new legislation does what it was designed to do.”

In a series of tests conducted before and after the Act’s 25 July deadline, investigators set up new TikTok accounts using a child’s birthdate and turned on TikTok’s ‘restricted mode’. According to TikTok’s own website, ‘restricted mode’ protects users from seeing ‘sexually suggestive content’.

Despite this, TikTok regularly suggested sexualised and explicit search terms when investigators clicked on the search bar of TikTok’s For You feed (a full list of suggested search terms is included in the notes to editors). In all tests, investigators encountered pornographic content. In one test, such content was encountered within just two clicks of setting up a new account.

TikTok enjoys immense popularity, especially among younger users, including as a search engine. Nearly half of Gen Z prefer TikTok to Google when searching for information online. A recent report by Ofcom found that more than a quarter of five- to seven-year-olds report using TikTok, and a third of them do so unsupervised.

Henry Peck, Campaign Strategy Lead for Digital Threats at Global Witness, said:

“TikTok claims to have guardrails in place to keep children and young people safe on its platform, yet we’ve discovered that moments after creating an account it serves kids pornographic content. This was a huge shock to us as investigators – we can only imagine how shocking it must be to a child or their parents.

“With so many young users, TikTok must make sure its platform is designed with the wellbeing of children in mind, especially when it offers child safety features that children and their parents believe they can trust.

“This isn’t a contentious issue. Everyone agrees that we should keep children safe online. UK law makes it clear social media platforms should prevent children from accessing harmful and age-inappropriate content. TikTok isn’t just failing to prevent children from accessing inappropriate content – it’s suggesting it to them as soon as they create an account.

“Now it’s time for regulators to step in.”

Global Witness investigators were first alerted to the sexualised and pornographic nature of suggested search terms during a separate investigation conducted in January 2025. They raised the issue directly with TikTok, and the company confirmed that it had “taken action to remove several search recommendations globally”. Yet follow-up tests show that the issue persists.

Key findings from seven tests conducted over the course of 2025, before and after the Online Safety Act’s child safety duties came into effect:

  • From the very first click, some of the search terms TikTok’s search bar suggested were sexualised, misogynistic or pornographic.
  • Initially suggested search terms included ‘rude pics models’, ‘very rude babes’ or ‘very very rude skimpy outfits’.
  • In all seven tests, investigators encountered pornographic content after clicking on suggested search terms. During one test, the very first search term clicked on led to pornographic content (all of the pornographic content was encountered on TikTok itself, not via links to third-party websites).
  • The proportion of sexualised search terms rose to nearly 100% once investigators started clicking on one of the initial terms.
  • In one instance TikTok suggested a search term appearing to reference underage children.
  • In another, TikTok suggested that children look for ‘women looking for casual affairs near me’.
  • Investigators found several pieces of content from other TikTok users complaining about being recommended sexualised search terms, showing that this is a broader issue beyond the investigation’s tests.
  • All the examples of hardcore porn that investigators were shown had attempted to evade TikTok’s content moderation in some way, usually by embedding the video within another, more innocuous picture or video.

Global Witness has shared its findings with Ofcom, which oversees the regulation of major social media platforms in the United Kingdom, and is calling on the regulator to open an investigation.

We gave TikTok the opportunity to comment on our findings. The company said it had taken action on more than 90 pieces of content and removed some of the searches that had been suggested to us, in English and other languages. It said it is continuing to review its youth safety strategies, and emphasised the youth safety policies it currently has in place, including Community Guidelines that prohibit content that may cause harm, a minimum age of 13 to use the app, restrictions on content that may not be suitable for minors and ‘highly effective age assurance’.

TikTok did not comment on the main issue uncovered by this investigation: search terms steering minors in ‘restricted mode’ towards pornographic content on sign-up. At no point were investigators asked to verify their age beyond self-declaration – neither during account sign-up nor when being shown pornographic content.