
Facebook-owned instant messenger WhatsApp is being used on a massive scale to distribute child exploitation material, according to a report from two Israeli online safety groups.

The report was analyzed, translated, and followed up by TechCrunch. The publication states that users rely on third-party group-discovery apps for WhatsApp to find such groups, which offer invite links to join.

It’s worth noting that encryption isn’t the primary reason Facebook has been unable to detect these groups. The groups aren’t even attempting to hide their nature, which suggests WhatsApp is simply unaware of them. Groups with names like “child porn xvideos” and “10years Hardcore” have operated openly in the past, and WhatsApp failed to ban them.

[Image: Sample of child exploitation groups found on WhatsApp. Source: TechCrunch]

It’s surprising that while Facebook doubled its moderation staff from 10,000 to 20,000 in 2018, it did little for WhatsApp. The instant messaging giant has just 300 employees while continuing to serve 1.5 billion users.

A similar review of such groups was also performed by the Financial Times. The publication found that many of them remained active even after researchers reported them to WhatsApp.

“It is a disaster: this sort of material was once mostly found on the darknet, but now it’s on WhatsApp,” said a member of one of the Israeli NGOs.

It’s worth noting that Google’s Play Store has allowed apps that aggregate such adult content groups to be listed; no such apps appear on Apple’s App Store.

Overall, the repeated reluctance to act on such issues shows how earning billions in profit remains a top priority for tech giants even as they serve enormous user bases. It’s worth highlighting once again that WhatsApp chooses to employ and pay only 300 people, which, in my opinion, is an exercise in boosting its profit margin.
