TikTok recently detailed how the platform takes action against undesired content, revealing that it removed more than 49 million videos during the last six months of 2019. The transparency report comes at a time when TikTok faces increased pressure and criticism from individuals and government agencies over the platform's potential privacy and security risks.
The popular video-sharing platform has found itself under intense scrutiny lately due to privacy fears, and in recent days politicians have even considered banning the app outright. Many TikTok supporters are fighting the potential ban by arguing that the service gives them a platform to share their creativity and political voice. While some government organizations have already banned the app, TikTok remains especially popular with younger generations.
TikTok's transparency report explains that the platform removed 49 million videos as it looks to build trust in the service and its moderation efforts. While 49 million videos may sound impressive, the report notes that this amounts to less than one percent of the videos users created during the six-month period. Moreover, the system removed nearly 100 percent of videos that violated the Community Guidelines before users reported them, and in another impressive feat, it took down almost 90 percent of those videos before members had a chance to view them.
First, it is important to understand what type of content violates the video-sharing platform's Community Guidelines. Content that promotes criminal activity or dangerous individuals and organizations is prohibited. Violent and graphic content, including depictions of self-harm, is also against the guidelines, as are bullying and hate speech. Videos that promote illegal or deceptive activities are likewise not permitted. As a platform that caters largely to a younger audience, TikTok treats child safety as a high priority; in these cases, the platform takes additional steps and alerts the appropriate legal authorities about content that poses a risk to a child's well-being.
TikTok explains that it cooperates with officials who submit valid legal requests, and the video-sharing app will sometimes act without a legal process as a measure of good faith, providing user information to prevent greater harm. To demonstrate transparency, the report includes charts showing how it cooperated with requests from various countries. Other charts break down the total number of videos removed in countries with exceptionally high violation counts; notably, India and the United States had the most content removed. TikTok took roughly a quarter of these videos down because of nudity, and it removed another 25 percent over child safety concerns. A further breakdown shows that around 20 percent of videos were taken down for illegal activities, while another 25 percent of removed content involved self-harm or violence. The remaining videos the system caught related to harassment and hate speech.
The platform appears to be trying to dispel privacy concerns by publishing transparency reports that showcase its ability to crack down on inappropriate content. TikTok also works with a team of experts and groups, like the National Center for Missing and Exploited Children, to improve its policies and moderation strategies. The report reads as both a display of strength and an olive branch, as the platform tries to ease concerns and mend rifts with various countries.
Source: TikTok
from ScreenRant - Feed https://ift.tt/2Dqzc9s