Controversial and extremist content has been increasing on social media platforms, while social sites are working to curb it. Just recently, Facebook announced that it will be enhancing its AI technology to filter malicious or extremist content from its platform. Similarly, YouTube has announced that it will suppress controversial or extremist content. However, there is a catch to this announcement: even if a video does not violate YouTube's policies, it can still be suppressed if it is deemed controversial. The company said it will place such videos in a “limited state”.
YouTube is working on a “trusted flagger” function that will allow users to mark and report videos or content that is potentially harmful or extremist in nature, even if it does not breach the company's policies. Analysts have condemned this step, as it means a video can be suppressed simply because a large number of people vote against it. As one YouTuber told a media outlet, “The popular opinion isn't always the right opinion.”
The company is working with more than 15 institutions to make this system possible. YouTube said in its blog post that it will soon be “applying tougher treatment to videos that aren't illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism. If we find that these videos don't violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state.”
YouTube has also rolled out a new feature called the Redirect Method, which redirects individuals searching for terrorist content to anti-terrorist content, in an effort to stop them from becoming potential terrorist recruits.