TikTok took down more than 580,000 videos in Kenya between July and September 2025 after they were found to violate the platform’s Community Guidelines.
According to a report released by the social media giant, 99.7% of these videos were flagged and removed automatically by TikTok’s moderation systems before any user reported them, and about 94.6% were deleted within 24 hours of being uploaded.
In addition to videos, roughly 90,000 live sessions in Kenya were interrupted for breaching content rules, representing about 1% of all live streams during the period.
Globally, TikTok removed over 204 million videos in the same quarter, accounting for 0.7% of all uploads. The platform also deleted more than 118 million fake accounts and over 22 million accounts suspected to belong to users under the age of 13.
TikTok said it uses a combination of automated moderation technologies and thousands of trust and safety professionals to handle content violations. These experts also manage appeals, consult external specialists, and respond to fast-moving events.
“By integrating advanced automated moderation technologies with the expertise of thousands of trust and safety professionals, TikTok ensures swift and consistent enforcement of content that violates its Community Guidelines,” the company said.
The platform said these measures are central to protecting users from harmful content, including misinformation and hate speech.
Last November, TikTok launched a dedicated Time and Well-being hub and introduced four new Well-being Missions to encourage safer and more mindful use of the platform, especially for teenagers.
This latest moderation report underscores TikTok’s commitment to keeping the platform safe while balancing user engagement and creativity.