How TikTok Uses Censorship on Its “For You” Page

Are you one of the 1.3 billion people who use TikTok? If so, you’ve probably found yourself wondering at some point: how are so many of these TikTokers so good-looking? Why do so many of them have such nice houses? Why does everyone look so rich?

To put it simply, it’s because that’s what TikTok wants you to see. According to internal documents acquired by The Intercept and interviews with internal sources conducted by Netzpolitik, TikTok moderators received instructions to suppress images of “undesirable” users deemed unattractive, queer, poor, or disabled. In these leaked documents, specific guidelines advised moderators to pay special attention to physical and environmental features that would “decrease the short-term new user retention rate” since, “if the character’s appearance or the shooting environment is not good, the video will be much less attractive, not worthing [sic] to be recommended to new users.” As a result, staff were tasked with examining videos for undesirable facial characteristics, “abnormal” body shapes, and even cracks in the walls or outdated décor in order to prevent such videos from reaching the “For You” page, the central organ of the platform that allows users to become “TikTok famous.”

Though representatives from the company have explained that these restrictions were implemented only to protect “special users” from cyberbullying, many argue that this form of prevention has simply led to discriminatory practices that punish the very people who are supposed to be protected. The leaked guidelines, moreover, clearly suggested otherwise.

“We perceive this as censorship for which there is no basis,” assert Manuela Hannen and Christoph Krachten, members of German disability support foundation Evangelische Stiftung Hephata. “It is completely absurd not to punish the trolls, but the victims of cyberbullying.”

Furthermore, while TikTok has dismissed these claims, describing the rules as “an early blunt attempt at preventing bullying” that is “no longer in place” and was only meant as a temporary measure after TikTok’s creation in 2017, sources point out that such policies were still in use as recently as late 2019. Moreover, multiple creators have still noticed their videos being censored or removed for “violating community guidelines” even as other creators on the platform share similar videos without repercussion.

One of these creators, Ashley Reeves, voiced her outrage on her Instagram account: “This video got taken down and flagged as ‘OFFENSIVE’ on tiktok… if you’ve been on tiktok, there is [sic] P L E N T Y of women dancing in swimsuits who’s [sic] videos seem to stay put- what’s the difference? From what I can tell they’re all thin women.”

Videos featuring “undesirable” traits have not been the only ones facing censorship. Users have also noticed that, although searches for the Hong Kong protests on other social media platforms yield a vast assortment of posts and pictures, conducting the same search on the China-owned platform results in very few videos of these events. The same occurs when searching for content related to Tiananmen Square, Tibetan independence, or Uyghur persecution. 

This has led to speculation that moderators purposefully remove content running counter to the directives of the Communist Party of China (CPC). Indeed, leaked documents published by both The Intercept and The Guardian have revealed that accounts are in fact suspended or banned if they display political imagery defaming political leaders, endanger national security, or violate any of the other listed guidelines. Although representatives argue that videos are removed only for violent political imagery and not specifically for contradicting the CPC’s ideals, many remain skeptical because of the parallels between TikTok and other Chinese media platforms, which must comply with the CPC’s standards or face punishment. Because of TikTok’s extensive international audience, some are concerned this will allow the Chinese government to use it as a propaganda tool.

Even if the platform’s rules seek only to limit political activity in general, many still find this problematic, since it can lead to unequal representation on the app, especially in times of political crisis. Recently, for example, TikTok faced backlash for allegedly censoring videos supporting the Black Lives Matter movement, which creators argue has long been an issue on the app, along with the underrepresentation of Black users.

Critics therefore argue that, given the company’s wealth and influence, TikTok can and must do better. Whether or not it truly plans to commit, only time will tell.
