The immensely popular TikTok has many rules about what is and is not allowed on the platform. More than ten thousand moderators around the world watch videos and check that those rules are followed. But the rules often leave a gray area, and moderators work under great time pressure: on average they have 14.4 seconds to assess each video.
TikTok's internal rules do not always match what the app publicly claims, RTL Nieuws notes. For example, the public community guidelines prohibit posting videos with lyrics like “I want to kill myself” or “I'm going to cut myself”. But according to the internal moderators' manual, such a video should not be removed unless the user shows an “intention” to harm themselves.
Intention is difficult to determine
Establishing that intent is not easy given the upbeat and often sarcastic tone of many TikTok users. Moderators have to decide within seconds and see no further context, such as other videos the same user has uploaded.
Nor are moderators supposed to classify everything that comes close to self-harm under this heading just to be on the safe side: that would create a lot of work for the other teams that have to deal with those reports later, they are told during an internal training session.
This reluctance to moderate means that many videos on TikTok violate the Community Guidelines. There are, for example, plenty of videos in which users lip-sync to lyrics about suicide. “We don't want to overdo it with a video” is the sarcastic catchphrase moderators hear over and over.
A billion users
The Chinese-owned TikTok has grown enormously since its launch in 2017. Last year, according to research firm Newcom, there were one billion users worldwide; in the Netherlands there are now 3 million. Users must be at least 13 years old to use TikTok, but in practice many users are children as young as 9.
Disinformation
RTL Nieuws also notes that TikTok's internal rules are very limited when it comes to fake news and disinformation. In its public guidelines, TikTok says it removes content that is “wrong or inaccurate” and causes “significant damage”. The internal guide, however, contains a limited list of only 16 identified examples of misinformation. These are well-known fake stories, for example about 5G masts or chips in corona vaccines.
Due to the heavy workload, moderators have no way to verify other, lesser-known claims in videos. During training they are even told not to google whether something is true. In practice, this means that nonsense stories that are not on the list of examples flow freely across the platform, such as videos claiming that sunscreen causes cancer.
Undercover at TikTok
A journalist from RTL Nieuws worked undercover for a month as a moderator on TikTok's Amsterdam team. He had been approached by TikTok via LinkedIn about a vacancy, and since this was the only way to get a look behind the scenes at the influential tech company, he decided to apply. During that month he completed the training given to new recruits, in which the moderation guide is explained, and then put that knowledge into practice for another two days. In this video, Timo talks about his experiences.
Inside the manuals
TikTok publicly says it offers help to people with suicidal thoughts, referring them to 113. In reality, that help does not appear to be very extensive. Internally, TikTok distinguishes between emergencies and non-emergencies. Someone posting a video saying they are cutting themselves counts as an emergency, but if the same images are posted as a slideshow, it does not: TikTok then assumes there is no life-threatening situation.
In an emergency, moderators can only alert TikTok's internal emergency response team (ERT). These are the people who can call emergency services; moderators themselves cannot. After filing a report, moderators do not hear what happens next or whether the ERT has even reviewed it. In non-emergency cases, the video is simply removed and the creator is not referred to a helpline.
At 113, nothing is known about these referrals. “I have never heard of anyone being referred and then contacting us,” says Scarlet Hemkes of the suicide prevention line.
Fake news slips through
During the undercover investigation, RTL Nieuws discovered that TikTok's definition of fake news is very narrow. Moderators learn that a video should only be flagged as fake news if, for example, the disinformation is presented from a news studio or as “breaking news”. A vlogger spreading nonsense from his back garden can therefore simply carry on. In practice, moderators hardly ever flag videos as fake news, as becomes clear during the training.
Social media expert Marieke Kuypers believes that TikTok does not do enough to tackle disinformation. “The guidelines are not being implemented correctly.” Notably, TikTok does not work with independent fact-checkers the way Facebook does, for example. Kuypers therefore checks the facts behind stories circulating on TikTok on her own initiative. “But it's impossible to deal with the sheer amount of misinformation that currently exists.”
New rules signed
This month, TikTok signed a new European code of conduct to combat the spread of misinformation, committing to more collaboration with independent fact-checkers and less room for misinformation. Failure to comply with the rules can lead to heavy fines. The code has also been signed by other major tech companies, such as Google and Meta.
All social media platforms struggle with such videos and are looking for the best ways to moderate them. The way the TikTok app is designed makes it especially important that moderation works well: 70 percent of TikTok videos are viewed on the “For You” page, where an algorithm serves users videos that closely match their interests.
Unlike on Instagram or Facebook, it does not matter whether you follow the uploader or how many followers they have. Users can hardly adjust this algorithm themselves. That helps TikTok hold your attention, but it can also push you into a tunnel of, for example, an endless stream of self-harm videos.
TikTok's response
In a response, TikTok says it works with thousands of people to keep the platform safe and fun. The company states that content glorifying or promoting suicide must be removed and that moderators are trained to do so. It also says it uses fact-checkers, “security experts” and moderators to protect the platform from misinformation.
At the same time, TikTok says it is continuously working to improve its content moderation. That includes reviewing moderators' decisions in order to “understand why certain content was not removed earlier”. Read the full response here.
Questions about suicide?
Stichting 113 Suicide Prevention: Call 113 or 0800-0113 (free) or anonymously via chat on the website 113.nl.
Available 24/7.
Source: RTL