Bill to criminalise AI child abuse apps to be introduced to parliament

A bill to criminalise the use of AI tools purpose-built to create child sexual abuse material is set to be introduced to parliament.

Independent MP Kate Chaney, who will introduce the bill, says the urgent issue cannot wait for the government’s wider response to artificial intelligence.

While it is an offence to possess or share child abuse material, there is no criminal prohibition on downloading or distributing the wave of emerging AI generators designed to create the illegal material.

The tools are becoming easier to access online, with some of the most popular visited millions of times. 

Their spread is diverting police resources and allowing material to be created offline, where it is harder to track.

A roundtable convened last week to address the issue recommended swift action to make the tools illegal, prompting Ms Chaney’s bill.

“[This] clearly needs to be done urgently and I can’t see why we need to wait to respond to this really significant and quite alarming issue,” Ms Chaney said.

“I recognise the challenges of regulating AI — the technology is changing so fast it’s hard to even come up with a workable definition of AI — but while we are working on that holistic approach, there are gaps in our existing legislation we can plug to address the highest-risk-use cases like this, so we can continue to build trust in AI.”

Ms Chaney said she had met with Attorney-General Michelle Rowland, who she said recognised there was a gap in the law.

Tools enable ‘on-demand, unlimited’ abuse material, Chaney warns

The MP for Curtin’s bill would create a new offence for using a carriage service to download, access, supply or facilitate technologies that are designed to create child abuse material.

A new offence for scraping or distributing data with the intention of training or creating those tools would also be created.

The offences would carry a maximum 15-year term of imprisonment.

A defence would be available for law enforcement, intelligence agencies and others with express authorisation, so they can investigate child abuse cases.

“There are a few reasons we need this,” Ms Chaney said. 

“These tools enable the on-demand, unlimited creation of this type of material, which means perpetrators can train AI tools with images of a particular child, delete the offending material so they can’t be detected, and then still be able to generate material with word prompts.

“It also makes police work more challenging. It is [getting] harder to identify real children who are victims. 

“And every AI abuse image starts with photos of a real child, so a child is harmed somewhere in the process.”

Child safety experts say bill addresses ‘urgent’ gap

The federal government continues to develop its response to the explosion in the use of AI tools, including by enabling the tools where they are productive and useful.

It is yet to respond to a major review of the Online Safety Act handed to the government last year, which also recommended that so-called “nudify” apps be criminalised.

Members of last week’s roundtable said there was no public benefit to consider in the case of these child abuse generators, and there was no reason to wait for a whole-of-economy response to criminalise them.

Former police detective inspector Jon Rouse, who participated in that roundtable, said Ms Chaney’s bill addressed an urgent legislative gap.

“While existing Australian legislation provides for the prosecution of child sexual abuse material production, it does not yet address the use of AI in generating such material,” Professor Rouse said.

Colm Gannon, Australian chief of the International Centre for Missing and Exploited Children, said there was a strong consensus that the AI tools had no place in society and Ms Chaney’s bill was a “clear and targeted step to close an urgent gap”.

In a statement, Attorney-General Michelle Rowland said the foremost priority of any government was “to keep our most vulnerable safe”.

“As Attorney-General, I am fully committed to combating child sexual exploitation and abuse in all settings, including online, and the government has a robust legislative framework in place to support this,” Ms Rowland said.

“Keeping young people safe from emerging harms is above politics, and the government will carefully consider any proposal that aims to strengthen our responses to child sexual exploitation and abuse.”

Ms Chaney said regulating AI must become a priority for the government this term.

“This is going to have to be an urgent focus for this government, regulating the AI space,” she said.

“Existing laws do apply to AI, and so we need to plug the gaps in those so they continue to be fit-for-purpose. 

“We do also need a coordinated approach and a holistic approach so we can balance individual rights with productivity, global governance and trust in information and institutions.

“The challenge is the technology moves fast and government does not move fast, so we need to get it right but we also need to plug these gaps as they appear.”

An inquiry established by former industry minister Ed Husic last year recommended the government take the strongest option in regulating AI by creating standalone laws that could adapt to the rapidly shifting technology.

Source: https://www.abc.net.au/