The internet is flooded with child sexual abuse material, and tech companies prioritize weeding out such content before it ever reaches users.
In a bid to automate this process, Google has launched an AI-powered API that helps identify child sexual abuse material (CSAM). This should not only speed up the process but also reduce human reviewers' exposure to illegal and disturbing content.
The earlier approach adopted by companies to track such content was to match suspected images against previously flagged material. The new AI-based API instead uses deep neural networks for image processing, prioritizing the content most likely to be CSAM for human review and thereby speeding up the review process.
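The Content Safety API itself is only available through Google's partner program, but the workflow it supports can be sketched roughly: known images are still caught by hash matching, while a classifier score pushes the most likely new CSAM to the top of the human review queue. The sketch below is illustrative only; the function names, the `classify_image` stub, and the use of SHA-256 are assumptions for demonstration, not Google's actual API.

```python
import hashlib
from dataclasses import dataclass

# Hypothetical database of hashes of previously flagged images (the traditional
# hash-matching approach). A real system would use a perceptual-hash service
# such as PhotoDNA rather than a plain SHA-256 set.
KNOWN_HASHES = {"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"}

@dataclass
class ReviewItem:
    image_id: str
    score: float        # classifier confidence that the image is abusive
    known_match: bool   # True if the image matched a previously flagged hash

def classify_image(image_bytes: bytes) -> float:
    """Stand-in for the deep-neural-network classifier; returns a score in [0, 1].
    A real system would call the vendor's classification service here."""
    return 0.0  # placeholder

def triage(images: dict[str, bytes]) -> list[ReviewItem]:
    """Build a review queue: known matches first, then unknown images ordered
    by classifier score, so reviewers see the likeliest CSAM earliest."""
    queue = []
    for image_id, data in images.items():
        digest = hashlib.sha256(data).hexdigest()
        matched = digest in KNOWN_HASHES
        score = 1.0 if matched else classify_image(data)
        queue.append(ReviewItem(image_id, score, matched))
    return sorted(queue, key=lambda item: (item.known_match, item.score), reverse=True)

if __name__ == "__main__":
    for item in triage({"img-001": b"", "img-002": b"example"}):
        print(item)
```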
Google says the new tool allows reviewers to identify and take action on 700% more CSAM content over the same time period than they could without it.
The Content Safety API is being made available free of charge to industry partners and non-governmental organizations.
This is a welcome step from Google given the volume of CSAM available on the internet. Hopefully, the new AI-powered API will speed up the review process and help protect children who are being sexually abused.