A human rights group claims TikTok recommends pornography and sexualised videos to minors. Researchers created fake child accounts with safety settings enabled, yet still received sexualised search suggestions that surfaced explicit videos, including clips of simulated masturbation and pornographic sex. TikTok says it acted quickly after being alerted and insists it prioritises safe experiences for young users.
Child accounts reveal explicit material
In July and August, Global Witness researchers set up four TikTok profiles, posing as 13-year-olds with false birth dates; the platform requested no further proof of age. They enabled TikTok’s “restricted mode”, a feature the company promotes as protection against sexual or mature content. Despite this, the accounts received sexualised search suggestions in the “you may like” section, which led to videos of women flashing underwear, exposing breasts and simulating masturbation. At its most extreme, pornography was hidden inside ordinary-looking clips to evade moderation.
Global Witness issues warning
Ava Lee from Global Witness called the findings a “huge shock”. She said TikTok not only fails to protect children but actively recommends harmful content. Global Witness usually investigates how technology affects human rights, democracy and climate change. The organisation first noticed TikTok’s explicit content during unrelated research in April.
TikTok defends safety measures
Global Witness reported its findings to TikTok earlier this year, and the company said it removed the material and introduced fixes. But when researchers repeated the test in late July, sexual videos appeared again. TikTok says it offers more than 50 safety features for teenagers and claims nine out of ten violating clips are deleted before anyone views them. After the report, the company said it had upgraded its search tools and removed further harmful content.
Children’s Codes demand stronger action
On 25 July, the Children’s Codes under the Online Safety Act came into force. Platforms must now enforce strict age verification and block children from accessing pornography, and algorithms must filter content linked to suicide, self-harm or eating disorders. Global Witness repeated its research after the new rules took effect. Ava Lee urged regulators to intervene, stressing that the new protections for children online must be enforced.
Users express concern
During the investigation, researchers also observed reactions from other TikTok users, some of whom questioned why sexualised content was appearing in their feeds. One wrote: “can someone explain to me what is up with my search recs pls?” Another commented: “what’s wrong with this app?”