TikTok neglected user safety

Social media | The popular app TikTok censored political videos and gave little priority to combating online child predators.

A man holding a phone walks past a sign of Chinese company ByteDance's app TikTok, Reuters

For a long time, TikTok neglected the safety of its users and their rights. The social network, popular among young children, did little to counter online child predators or to remove sexually inappropriate comments.

TikTok censored videos from, among others, disabled people, obese users and transgender people, and the platform restricted the distribution of virtually all political content, according to research by NRC.

Over the past few months, NRC spoke with former employees of TikTok who worked for the company until well into 2019. This newspaper also has access to hundreds of pages of company documents and dozens of hours of audio recordings. With that material, NRC was able to reconstruct the company's internal policy up to the summer of 2019.


On TikTok – owned by the Chinese tech company ByteDance – users share short music and dance clips. The app has 700,000 users in the Netherlands and a billion worldwide. According to research agency Multiscope, TikTok is most popular among 10-, 11- and 12-year-olds in the Netherlands.

Also read the reconstruction: TikTok hardly did anything about safety (in Dutch: Jonge kinderen zitten graag op TikTok. Maar hoe veilig is het daar?)

Employees of the Dutch-German team in Berlin were explicitly told in the spring of 2019 that tackling inappropriate comments from adults underneath children's videos was not part of their core business. In the preceding months, various organisations had warned TikTok about the problem.

Sexual abuse

At Helpwanted, a Dutch helpline that advises young people about online sexual abuse, reports about TikTok come in every day, according to a spokesperson. Almost every child on TikTok will eventually be confronted with inappropriate remarks, says Helpwanted.

The problem is also known internally; employees encounter inappropriate behaviour on a daily basis. A former employee: "TikTok wants to grow fast, no matter how."

TikTok employees in Berlin worked with a list of 'special users' for whom special policies existed. On this list were mostly disabled people, but also obese users and transgender people. The reach of their clips was limited without their knowledge.

TikTok acknowledges in a response that there have been 'blunt policies' and incorrect guidelines. "We have no higher priority than keeping our users safe." To this end, the company will soon open a Trust and Safety department in Dublin. According to TikTok, videos of disabled people and transgender people are no longer censored, and the same goes for political material.

Restricting political videos

TikTok has long been criticised for restricting political videos and material from LGBTI users. NRC found ample evidence for this in the instructions to moderators, who had to pay specific attention to "rainbow earrings or wristbands" in videos and to material in which politicians were mocked. TikTok limited the reach of these clips. A former moderator: "Nothing that was offensive in the eyes of Beijing went through."

Moderators working in the Berlin office not only denounce the moderation policy, they also call the working climate unprofessional. Until the spring of 2019, employees were allowed to drink alcohol during working hours, and on several occasions visitors were able to see computer screens displaying privacy-sensitive information. Moderators had to work with poorly translated Chinese guidelines, which left a lot of room for interpretation, were often contradictory and were constantly being changed. TikTok declined to answer questions about the Berlin office.

For more information about this research, contact the authors at media@nrc.nl. Translation by Menno Grootveld.