AFP, Monash Uni Crowdsource Images to Fight Child Abuse with AI

The AFP wants images of young people to help train its algorithm for detecting child abuse.

Monash University partners with the Australian Federal Police (AFP) to tackle child abuse.

The pair will create an “ethically-sourced” database of images that can train artificial intelligence algorithms to detect child exploitation.

The project, an initiative of the AiLECS laboratory – a collaboration between the Faculty of Information Technology at Monash University and the Australian Federal Police – will aim to collect at least 100,000 images from the community over the next six months.

AiLECS researchers are asking individuals aged 18 and over to donate images of themselves as children to help populate the My Pictures Matter crowdsourcing campaign database.

AiLECS Lab Co-Director Associate Professor Campbell Wilson said machine learning models are often trained on images of people that have been scraped from the internet or used without documented permission.

Associate Professor Wilson said: “To develop AI that can identify exploitation images, we need a very large number of childhood images in everyday ‘safe’ contexts that can train and evaluate the AI models intended to combat child exploitation.”

To protect contributors’ privacy, the email addresses used to submit images are stored separately from the images themselves.

“It’s problematic to get these images off the internet if there’s no way of knowing whether the kids in those photos actually consented to have their photos uploaded or used for research,” said Associate Professor Wilson.

“By obtaining photos from adults, with informed consent, we are trying to build technologies that are ethical and transparent,” he said.

AiLECS researchers have also developed comprehensive strategies for storing and using the data while preserving the privacy of those depicted in the images.

Individuals who contribute photos can receive details and updates at each stage of the research. They can also change usage rights, revoke their consent to the research, and have their pictures removed from the database.

AiLECS Lab research is funded by Monash University, the Australian Federal Police, and the Westpac Safer Children Safer Communities Scholarship Program.

In 2020, the AFP admitted it had briefly trialled Clearview AI, a controversial facial recognition tool that lets users search a database of images scraped from the internet.

It was one of four Australian police forces – alongside those of Victoria, Queensland, and South Australia – among some 2,200 agencies worldwide reported to have used the platform.

The limited pilot was conducted by the AFP-led Australian Centre to Counter Child Exploitation (ACCCE) to assess whether the tool could be used in child exploitation investigations.

In 2021, the ACCCE received more than 33,000 reports of online child exploitation, and each report may contain large volumes of images and videos of children being sexually abused or exploited by criminals.

By the end of 2022, researchers aim to have gathered a database of at least 100,000 ethically sourced images through the My Pictures Matter campaign, large enough to train the AI algorithms.


The TBN team is an established group of technology industry professionals with backgrounds in IT systems, business communications, and journalism.