Colm Gannon, CEO of the International Centre for Missing and Exploited Children (ICMEC), recently called it a crisis. In his testimony, he warned about the growing use of artificial intelligence (AI) in the sex trafficking of children. His message is clear: the Australian government must prioritise legislation that tackles this rapidly escalating threat. Gannon notes that Australia's child protection policy, adopted only three years ago, fails to address the hardest problems, particularly the dangers posed by AI technologies.
A recent series of reports from Graphika points to an even more disturbing trend. The business of synthetic, non-consensual intimate imagery has spread from underground online forums to widely available automated platforms, making reform an urgent imperative. This trend underscores the need for strong regulatory protections. Gannon states, "What we need to do is look at legislation that's going to be comprehensive – comprehensive for protection, comprehensive for enforcement and comprehensive to actually be technology neutral."
With AI tools increasingly available to anyone, Gannon cautions that the potential for exploitation is growing at a frightening pace. He argues that major technology companies, such as Apple, must be held accountable for embedding safety features into their products. In 2021, Apple introduced several features aimed at protecting its younger users, including warnings when users are about to receive or attempt to send images containing nudity. Even with these measures, Gannon argues that stronger steps are needed.
Today, Gannon is working with Childlight Australia to promote such legislation within the country. He contends that the current framework includes no measures for responding to AI-fuelled threats such as deepfakes. "By bringing it into the national framework that was brought around in 2021, a 10-year framework that doesn't mention AI, we're working to hopefully develop solutions for government to bring child safety in the age of AI to the forefront," he said.
The problem of AI-enabled exploitation is not unique to Australia. The United Kingdom, by contrast, has moved decisively, becoming the first country to create specific offences covering AI-driven sexual exploitation and abuse in order to protect children. Gannon calls on Australia to follow suit and enact legislation that shields children from the vulnerabilities technology has exposed.
At the same time, Professor Jon Rouse has issued a stark warning about the broader societal harms of unregulated digital ecosystems. He notes, "Apple has got accountability here as well, they just put things on the app store… but there's no regulation around this." Rouse emphasizes the need for a community-wide education approach, stating, "Perpetrators are not just grooming their individual victims; they're grooming their entire environments to create a regime of control in which abuse can operate in plain sight."
Gannon also calls on law enforcement to adopt available technologies that could help identify victims of exploitation more swiftly. He remarks, "The other thing that we want to do is use law enforcement as a tool to help identify victims. There is technology out there that can assist in rapid and easy access to victim ID."