Google Launches AI-Based API To Identify Child Sexual Abuse Material


The internet is flooded with child sexual abuse material, and every tech company works to weed out such content before it becomes visible to users.

In a bid to automate this process, Google has launched an AI-powered API that helps identify child sexual abuse material (CSAM). This will not only speed up the process but also limit human reviewers' exposure to illegal and disturbing content.

The earlier approach adopted by companies to track such content was to match suspected images against previously flagged content. The new AI-based API instead uses deep neural networks to scan images, prioritizing likely CSAM content for review and thereby speeding up the review process.
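The difference between the two approaches can be sketched roughly as follows. This is a minimal, hypothetical illustration only; the function names, scores, and thresholds are assumptions and do not reflect Google's actual implementation.

```python
import hashlib

# Approach 1: hash matching. A suspected image can only be caught if it
# exactly matches content that was previously flagged and hashed.
KNOWN_FLAGGED_HASHES = {
    hashlib.sha256(b"previously-flagged-image-bytes").hexdigest(),
}

def matches_known_content(image_bytes: bytes) -> bool:
    """Return True if the image matches previously flagged content."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_FLAGGED_HASHES

# Approach 2: a classifier scores never-before-seen images, and the review
# queue is sorted so the highest-risk items reach human reviewers first.
def prioritize_for_review(image_ids, classifier):
    """Sort images by descending classifier score (estimated risk)."""
    return sorted(image_ids, key=classifier, reverse=True)

# Usage with a stand-in scoring function (the real classifier would be a
# deep neural network, which hash matching cannot replicate):
fake_scores = {"img_a": 0.10, "img_b": 0.95, "img_c": 0.40}
review_queue = prioritize_for_review(list(fake_scores), fake_scores.get)
# img_b, the highest-scoring image, is reviewed first.
```

The key advantage of the second approach is that it can surface new, never-before-hashed material, while hash matching only recognizes exact copies of known content.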

Google has said that this new tool allows companies to identify and report 700% more CSAM content than was possible with human review alone.

The API is available free of charge to corporate partners and non-governmental organizations through Google's Content Safety API toolkit.

This is a welcome step from Google, given the volume of CSAM content available on the internet. Hopefully, the new AI-powered API will speed up the review process and help protect children from sexual abuse.

Anmol Sachdeva

Anmol is a tech journalist who handles reportage of cybersecurity and Apple and OnePlus devices at Fossbytes. He's an ambivert who is striving hard to appease existential crisis by eating, writing, and scrolling through memes.