Well, it seems TikTok has some questionable moderation and algorithm policies.
https://netzpolitik.org/2019/discrimination-tiktok-curbed-reach-for-people-with-disabilities/
Leaked documents reveal how TikTok hid videos of people with disabilities. Queer and fat users were also pushed out of view. The Chinese company says the rules were meant to protect vulnerable users.
TikTok, the fast-growing social network from China, has used unusual measures to protect supposedly vulnerable users. The platform instructed its moderators to mark videos of people with disabilities and limit their reach. Queer and fat people also ended up on a list of "special users" whose videos were regarded as a bullying risk by default and capped in their reach – regardless of the content.
...
People with disabilities were kept away from the big stage
For users that were considered particularly vulnerable, TikTok had even further-reaching regulations. When their videos popped up on the screens of the TikTok moderation teams in Berlin, Beijing or Barcelona – after 6,000 to 10,000 views – they were automatically tagged as "Auto R".
As a result, once these videos exceeded a certain number of views, they automatically ended up in the "not recommend" category. That categorization means the video no longer appears in the algorithmically compiled "For You" feed, which users see when they open the app.
Strictly speaking, such videos are not deleted – but in effect they hardly reach an audience.
The guidelines also give examples of users to whom this applies: "facial disfigurement," "autism," and "Down syndrome". Moderators were supposed to judge whether someone had these characteristics and mark the video accordingly during the review process. On average, they had about half a minute to do this, as our source at TikTok reports.
...
One source familiar with moderation reported that staff repeatedly flagged the problems with this policy and asked for more sensitive and meaningful rules.
However, their objections were dismissed by the Chinese decision-makers; the rules were largely handed down from Beijing. This matches what the Washington Post learned from former TikTok employees in the USA.
...
A list of "special users"
In addition to the rules above, TikTok moderators maintained a list of "special users" who were considered especially vulnerable to bullying. These users were treated as a risk by default: their videos were automatically tagged "Auto R" so that they could not exceed a certain number of views.
The list names 24 accounts, including people who post videos with hashtags such as #disability or write "Autist" in their biographies. But the list also includes users who are simply fat and self-confident. A striking number show a rainbow flag in their biographies or describe themselves as lesbian, gay or non-binary.