
A new investigation by the UK watchdog group Global Witness has found that TikTok’s search algorithm may be directing teenage users toward pornographic and sexually explicit content. The findings have sparked renewed concerns over the platform’s safety measures and its handling of underage accounts.
INVESTIGATION REVEALS DISTURBING SEARCH SUGGESTIONS
According to Global Witness, seven TikTok accounts were created in the UK to simulate 13-year-old users—the minimum age required to open an account. Each account was set up on a factory-reset device with no prior search history. Despite having TikTok’s “restricted mode” enabled, a setting meant to limit exposure to adult or suggestive material, the accounts were immediately shown highly sexualized search suggestions.
The organization reported that some accounts were recommended explicit terms the first time they tapped into the search bar. Within a few clicks, all seven test profiles encountered pornographic content, demonstrating what Global Witness described as a troubling flaw in TikTok’s content recommendation system.
“Our point isn’t just that TikTok shows pornographic content to minors,” Global Witness stated. “It is that TikTok’s search algorithms actively push minors towards pornographic content.”
The report highlights potential violations of the UK’s Online Safety Act 2023, which requires digital platforms to protect children from harmful material, including pornography and self-harm content. Media lawyer Mark Stephens, commenting on the findings, called TikTok’s practices “a clear breach” of the new law.
TIKTOK RESPONDS, DEFENDS ITS SAFETY FEATURES
In response to the report, TikTok said it acted swiftly to investigate and remove any material violating its community standards. A company spokesperson emphasized that TikTok is “fully committed to providing safe and age-appropriate experiences” and pointed out that it removes “nine in ten violative videos before they are ever viewed.” The platform also stated it has implemented over 50 features and tools specifically designed to enhance teen safety.
TikTok’s community guidelines prohibit nudity, sexual activity, and sexually suggestive content, with especially strict rules for material involving minors. The company said its transparency report for early 2025 showed that about 30% of removed videos were taken down for mature or sensitive themes.
TikTok also says it removes around six million suspected underage accounts each month through a combination of automated age-detection tools and human moderation. These systems, the company says, help identify users who may be younger than 13, in line with global child safety standards.
Global Witness conducted portions of its research both before and after the UK’s new Online Safety Act child protection provisions came into force in July 2025. TikTok maintains it is cooperating with Ofcom, the UK’s communications regulator, and continues to strengthen compliance through stricter age checks and safer browsing modes.
Despite these assurances, the findings have intensified scrutiny of TikTok and other major social media platforms. Platforms such as YouTube and Instagram have likewise introduced new artificial intelligence and privacy controls aimed at shielding minors from harmful online material.
As public and governmental pressure mounts, regulators are urging tech firms to prove that their safety systems truly protect young users—not merely in policy, but in practice.