TikTok: App Censorship in Europe

‘Make your day.’ TikTok’s slogan conveys the idea of another playful company to add to the list of social media giants. The reality, though, is much darker.

Initially released as ‘Musical.ly’ — a platform that allowed users to upload videos of themselves miming to songs — the company was later acquired by the Chinese firm ByteDance, which then launched TikTok. Rather than focusing on music videos, the app is now a place where users can create, share, and watch short videos of any kind on a continuous loop.

According to GlobalWebIndex, TikTok’s target demographic is primarily Gen Z and Millennial users between 16 and 24 years old. In November 2019, it hit an impressive 1.5 billion downloads worldwide, placing the app third in the list of most-downloaded non-gaming apps, after WhatsApp and Facebook Messenger.

Operating on such a huge scale, TikTok has come under harsh scrutiny from governments and the press over its transparency and content regulation. With its headquarters in Beijing, questions about TikTok’s ethics arose long before the company expanded into Europe in September 2017. ByteDance has repeatedly had to defend itself against allegations of propagandist aims and of suppressing mentions of Tiananmen Square. With an already-tarnished reputation in countries like Indonesia, Bangladesh and India, can the app survive fresh accusations of censorship in a relatively new European market?

User demographics 

Of the 500 million active TikTok users, 150 million are located in China and use the Chinese version of the app (called Douyin). Other Asian countries like Japan, Vietnam, and Thailand show similarly high engagement.

A November 2018 breakdown of TikTok’s European markets noted that Germany was the top country for users, with 4.1 million active, followed closely by France with 4 million; total views averaged 6.5 billion and 5 billion respectively. In both countries, TikTok users open the app around 8 times a day.

As of October 2019, Germany’s user base had more than doubled to 8.8 million users, giving it the 10th-largest TikTok following in the world.

Concerns over content

Indonesia was one of the first countries to block TikTok after videos on the platform were deemed blasphemous and pornographic. As a result, TikTok was removed from all app download stores for a week, beginning in July 2018. Only once TikTok agreed to clear all negative content, apply additional restrictions for 14-to-18-year-olds, and set up a team of censors in Indonesia to “sanitize” content was the app reinstated in the country.

Later, in February 2019, after declaring a war on pornography, Bangladesh followed suit and also shut down TikTok. Currently, the app remains unavailable in the country. 

More recently, in April this year, India joined the list of countries hitting back at TikTok. Accusations surfaced of content featuring child pornography, accidental suicides and killings, as well as dangerous trends like jumping in front of cars. Other issues included the spread of fake news and cyberbullying. New downloads of the app were banned across all of India, costing TikTok $500,000 in revenue each day. Following an appeal from TikTok’s parent company ByteDance, the Madras High Court reversed its decision after one week.

TikTok content moderation

In November 2019, the German digital rights blog Netzpolitik gained access to TikTok’s moderation rules. The site noted that, although the guidelines were extremely loose, the strategy behind them was clear: certain content is given the widest possible reach, while other content is systematically suppressed.

Netzpolitik also discovered that unwanted content on TikTok is divided into four categories: ‘deletion’, ‘visible to self’, ‘not for feed’, and ‘not recommended’. Videos that do not fall into these categories can still be marked ‘risk’ and blocked by location. TikTok claims this moderation ensures content complies with the laws of each country.

“The strategy, however, is clear: certain content is given the widest possible reach, while others are systematically suppressed.”

‘TikTok – Cheerfulness and Censorship’, Netzpolitik

An unnamed source at TikTok also told Netzpolitik that protests are generally not welcome on the app. Since its parent company ByteDance is Chinese, the recent protests in Hong Kong, for example, have little — if any — exposure on the platform.

“Special user” lists

Only a month later, Netzpolitik broke another story about TikTok’s use of censorship. This time, a leaked document revealed how TikTok made videos of people with disabilities less visible. The app also hid videos of overweight people and people identifying as LGBTQ by grouping them on “special user” lists, deemed higher risk. 

The justification? TikTok claimed the action was to protect vulnerable users and those “susceptible to harassment or cyberbullying based on their physical or mental condition.” These special user lists were curated by a team of moderators who had to base their judgements on nothing more than 15-second video uploads.

The controversial measure meant that videos by users with disabilities, or perceived disabilities, were only shown in the country where they were uploaded. In Germany, this shrank a potential audience of 500 million to 8.8 million.

Long-term consequences

TikTok claims the “special user” lists were never intended as a long-term solution and says it has since changed them. The platform has also emphatically denied censoring politicized content. Nonetheless, its moderation guidelines and the likely pressure from ByteDance to further Chinese foreign policy still compromise the platform.

By systematically disadvantaging unfavorable content, TikTok has fueled suspicions of censorship and of broader political aims. While TikTok’s growth shows no sign of slowing, it remains uncertain whether this skepticism will have a lasting impact on its user base.