Australia has announced plans to outlaw deepfake "nudify" apps and tools used for online stalking.
Under the reforms announced on Tuesday by the Australian government, tech platforms will be tasked with preventing access to “nudify” and undetectable online stalking tools.
Anika Wells, the minister for communications, said Australia would work with businesses to stop “abhorrent technologies” and make sure “legitimate and consent-based” artificial intelligence (AI) and online tracking services were not adversely impacted.
In a statement, Wells said abusive technologies were now widespread and readily accessible.
“We’ll work closely with industry to achieve this, as these new, constantly evolving technologies demand a new, proactive approach to harm prevention,” she said.
“This action, along with existing laws and our world-leading online safety reforms, will make a significant difference in protecting Australians,” she continued.
With the proliferation of platforms that can produce photo-realistic material with the click of a mouse, there is growing concern about the use of AI to create sexually explicit images of people without their consent.
In a survey of 1,200 young people conducted last year by the advocacy group Thorn, 6% of respondents said they had been a direct victim of such abuse, while a further 6% said they knew someone who had created fake nude imagery of them.
In recent years, Australia has implemented a number of significant legal reforms, including a world-first ban on social media use by minors.
Source: Aljazeera