TikTok Expands AI-Powered Efforts to Detect Underage Users

As more governments push for stricter online age restrictions, TikTok has detailed its latest initiatives to better identify underage users and safeguard teens on its platform.

In a new overview, the company explained that it’s expanding its use of AI-driven age detection and reinforcing its moderation practices to ensure compliance with global regulations.

“In most parts of the world, the minimum age to use TikTok is 13,” the platform stated. “We use a multi-layered approach to confirm someone’s age or detect when they may not actually be the age they claim.”

Multi-Layered Age Verification

TikTok’s baseline safeguard is requiring new users to enter their birth date when creating an account. If a user fails to meet the minimum age, TikTok blocks them from simply re-registering with a different birth date.

The company is also scaling its AI-based age assurance tools, which were first piloted in the U.K.

“We’ve been piloting new AI technologies in the U.K. over the last year and found they’ve strengthened our efforts to remove thousands of additional accounts under 13. We’re planning to roll this technology out more widely, including in the EU, and are currently discussing it with our European privacy regulator.”

In addition to automated systems, TikTok’s moderation teams are trained to identify accounts potentially belonging to minors.

“If moderators suspect an account belongs to someone under 13, they can escalate it to a specialized review team. When in doubt, we remove any account we suspect may be underage,” TikTok explained. “We also allow anyone — even non-users — to report accounts they believe belong to minors.”

Strengthened Teen Protections

Alongside these detection systems, TikTok enforces strict protections for teen users. Direct messaging is disabled for users under 16, and default screen time limits help younger users manage app use. Collectively, these measures have supported TikTok’s broader effort to minimize exposure to potentially harmful content.

TikTok reports that it removes around 6 million underage accounts per month worldwide.

Global Push for Stricter Age Laws

The company’s growing focus on age assurance aligns with an international movement to tighten youth social media access. Over the past year:

  • France, Greece, and Denmark have supported proposals banning users under 15 from social platforms.
  • Spain has suggested raising the minimum age to 16.
  • Australia, New Zealand, Norway, and Papua New Guinea are drafting similar restrictions for those under 16.

While most major social networks already restrict accounts to users aged 13 or 14 and older, regulators are now pressing for tougher detection and enforcement, placing responsibility directly on platforms — with steep financial penalties for non-compliance.

The Challenge of a Unified Standard

Despite these efforts, a key issue remains: there’s still no global, legally enforced standard for digital age verification. Each platform employs its own systems, creating inconsistent enforcement across the industry.

TikTok has acknowledged this gap and is collaborating with industry partners to establish common frameworks.

“Since its first session last year, TikTok has engaged in the Global Multistakeholder Dialogue on Age Assurance, convened by the Centre for Information Policy Leadership (CIPL) and WeProtect Global Alliance,” the company said. “We’re also exploring whether the European Commission’s planned age verification app could serve as an effective additional tool.”

TikTok emphasized that for any approach to work, a level playing field must exist — one where all platforms adhere to the same rules and are evaluated under consistent regulatory expectations.

The outcome of these discussions could shape the future of online safety — not just for TikTok, but for the broader social media landscape.
