TikTok Allegedly Leads Child Accounts to Explicit Material in Just a Few Taps
According to a new study, the widely used social media app TikTok has been found to steer children's accounts toward pornographic videos within a small number of clicks.
Research Methodology
The campaign group Global Witness set up fake profiles using a 13-year-old's date of birth and turned on the app's "restricted mode" setting, which is designed to limit exposure to inappropriate content.
The researchers found that TikTok suggested sexualised and adult-themed search terms to these fake accounts, even though they were created on clean phones with no search history.
Concerning Search Suggestions
Terms recommended by the "suggested searches" feature included "very very rude skimpy outfits" and "very rude babes", and later escalated to phrases such as "hardcore pawn [sic] clips".
For three of the accounts, the sexualised search terms were suggested immediately.
Fast Track to Adult Material
With only a few interactions, the researchers encountered pornographic videos, including depictions of graphic sexual acts.
Global Witness said the content appeared designed to evade moderation, often by embedding the explicit video inside an otherwise harmless image or video.
For one account, the process took just two taps after signing in: one on the search bar and a second on the suggested query.
Regulatory Context
The campaign group, whose remit includes researching technology companies' influence on public safety, said it carried out multiple rounds of testing.
Initial tests took place before child protection rules under the UK's Online Safety Act came into force on 25 July; further tests were carried out after the rules took effect.
Concerning Discoveries
The researchers added that some of the content appeared to feature someone under the age of 18 and had been reported to a child protection organization that tracks online child sexual abuse material.
The campaign group said TikTok was in breach of the Online Safety Act, which requires social media firms to prevent children from accessing harmful content such as pornography.
Regulatory Response
A spokesperson for Ofcom, the regulator responsible for enforcing the act, said: "We appreciate the work behind this investigation and will analyze its results."
Ofcom's codes of practice for complying with the act state that platforms posing a significant risk of showing harmful content must "adjust their systems to block harmful content from young users' timelines".
TikTok's own rules prohibit pornographic content.
TikTok's Statement
TikTok said that, after being notified by Global Witness, it had taken down the violating content and made changes to its search recommendations.
"Immediately after notification" of these allegations, we acted promptly to examine the issue, delete material that violated our policies, and implement enhancements to our search prompt functionality," stated a company representative.