Microsoft’s search engine Bing served child pornography images in response to certain queries, according to an investigative report from AntiToxin. Worse, the search engine even suggested additional search terms that led to more illegal images.
Researchers at AntiToxin, a security firm that builds technologies to protect people against online abuse, discovered that Bing surfaced child porn images when the “safe search” filter was turned off.
During their research, conducted from December 30th, 2018 to January 7th, 2019, AntiToxin found that terms such as “porn kids” and “nude family kids” served up illegal imagery. More alarming still, Bing suggested further porn-related keywords when users clicked on the photos.
Following an anonymous tip, TechCrunch commissioned a report from online safety startup AntiToxin to investigate the matter. When TechCrunch shared the report’s findings with Microsoft, Jordi Ribas, the corporate vice president in charge of the search engine, said:
“We acted immediately to remove them, but we also want to prevent any other similar violations in the future. We’re focused on learning from this so we can make any other improvements needed.”
However, even after Microsoft blocked the reported terms on Bing, AntiToxin found that some keywords, though not all, were still serving up illegal content.
Bing is not the first platform found serving child pornography. WhatsApp is still struggling to stop the sharing of such illegal content, and the Tumblr app was temporarily removed from Apple’s App Store over child porn on its platform. It is disheartening to see yet another incident in which big technology companies prioritize their own growth over users’ safety.