We can all turn a blind eye to it, but the fact remains that a great deal of inappropriate content exists on YouTube, currently the world’s largest video-sharing website.
In a Reddit post, YouTuber Matt Watson has highlighted a concerning issue: just a couple of extra clicks can lead users into what he calls a “wormhole” of a “soft-core pedophilia ring” on YouTube.
YouTube’s recommendation algorithm suggests such videos in the “Up Next” section or in users’ feeds. For example, when someone searches for a video of bikini-clad adult women, the algorithm soon gets to show how “smart” it is.
The worst part, as Matt points out, is that such inappropriate videos are being monetized with ads from big brands like McDonald’s, Disney, and other well-known names.
Matt has created a 20-minute video demonstrating the whole process, showing how YouTube keeps suggesting videos of a similar nature. He points out that many of the children featured in these videos are pre-pubescent.
Moreover, the comment sections of such videos are clogged with links to actual child pornography, allowing people with a similar mindset to connect with one another. The comments often include timestamps marking the moments in videos where kids are in compromising positions.
He also notes that comments are disabled on a fair number of such YouTube videos, which suggests that YouTube is aware these videos exist.
But he raises the question of why such content is not reviewed by humans once it has been detected. He also reported a “channel and a user for child abuse,” yet the channel was still online 60 hours later, at the time he made the Reddit post.
This isn’t the first time such an issue has been brought into the limelight. Back in 2017, the company got caught up in the “Elsagate” controversy and promised to make changes to its policies and algorithms. But it turns out such content still exists on its platform.