Last Updated on: 10th July 2021, 03:24 pm
The short-video app TikTok is rolling out a new feature that will automatically remove content that violates its community guidelines.
In the United States and Canada, the company will begin using automated review systems to remove videos that contain nudity, violence, graphic content, or illegal activity, or that violate its minor safety policies.
Currently, all uploaded videos pass through automated tools that flag potential violations, which are then reviewed by a member of the safety team.
If a violation is confirmed, the video is removed and the user is notified, TikTok said.
The ByteDance-owned company added that, over the next several weeks, it will begin automatically removing certain types of content that violate its minor safety policy.
This is in addition to the removals confirmed by TikTok’s safety team.
The company said the change would help its safety team focus on highly contextual and nuanced areas such as bullying and harassment. TikTok added that an in-app alert will be sent the first time a user commits a violation.
However, repeated violations will result in further notifications and, ultimately, permanent account deletion.
Platforms including Facebook and TikTok have previously been criticized for amplifying hate speech and misinformation around the world.