TikTok says it removed a total of 60,465 Kenyan accounts from the platform between April and June this year for violating community guidelines.
TikTok's second quarter Community Guidelines Enforcement Report for Kenya further shows that the short video platform removed another 57,262 accounts in the country suspected of being run by users under the age of 13.
Globally, TikTok says a total of 178.8 million videos were removed, of which 144.4 million were taken down through automation. Only 5.4 million videos were restored.
“With over a billion people and millions of pieces of content posted to our platform every day, we continue to prioritize and enhance TikTok’s automated moderation technology as such technology enables faster and consistent removal of content that violates our rules,” said the firm.
In Kenya, at least 360,000 videos were removed from the platform, representing 0.3pc of the total videos uploaded in the country in the latest reporting period.
According to the firm, 99.1pc of the videos were proactively removed before users reported them, with 95pc taken down within 24 hours.
“We invest in technologies that improve content understanding and predict potential risks so that we can take action on violative content before it’s viewed. These technical investments also reduce the volume of content that moderators review, helping minimize human exposure to violative content. As a result, automated technology now removes 80% of violative videos, up from 62% a year ago,” said TikTok in the report.
Of the total content removed, 31pc involved sensitive and mature themes, 27.9pc regulated goods and commercial activities, 19.1pc mental and behavioral health, 15.1pc safety and civility, 4.7pc privacy and security, and 2.1pc integrity and authenticity.