Australia clamps down on ‘nudify’ sites used for AI-generated child abuse | Social Media News


Three websites used to create abuse imagery had received 100,000 monthly visits from Australians, watchdog says.

Internet users in Australia have been blocked from accessing several websites that used artificial intelligence to create child sexual exploitation material, the country’s internet regulator has announced.

The three “nudify” sites withdrew from Australia following an official warning, eSafety Commissioner Julie Inman Grant said on Thursday.

Grant’s office said the sites had been receiving approximately 100,000 visits a month from Australians and featured in high-profile cases of AI-generated child sex abuse imagery involving Australian school students.

Grant said such “nudify” services, which allow users to make images of real people appear naked using AI, have had a “devastating” effect in Australian schools.

“We took enforcement action in September because this provider failed to put in safeguards to prevent its services being used to create child sexual exploitation material and were even marketing features like undressing ‘any girl,’ and with options for ‘schoolgirl’ image generation and features such as ‘sex mode,’” Grant said in a statement.

The development comes after Grant’s office issued a formal warning to the United Kingdom-based company behind the sites in September, threatening civil penalties of up to 49.5 million Australian dollars ($32.2m) if it did not introduce safeguards to prevent image-based abuse.

Grant said Hugging Face, a hosting platform for AI models, had separately also taken steps to comply with Australian law, including changing its terms of service to require account holders to take steps to minimise the risks of misuse involving their platforms.

Australia has been at the forefront of global efforts to prevent the online harm of children, banning social media for under-16s and cracking down on apps used for stalking and creating deepfake images.

The use of AI to create non-consensual sexually explicit images has been a growing concern amid the rapid proliferation of platforms capable of creating photo-realistic material at the click of a mouse.

In a survey carried out by the US-based advocacy group Thorn last year, 10 percent of respondents aged 13-20 reported knowing someone who had deepfake nude imagery created of them, while 6 percent said they had been a direct victim of such abuse.
