US bill proposes to let victims sue over digitally fake sexual images
IANS 31 January 2024
As AI-generated explicit images of Taylor Swift spark intense debate among policymakers, US lawmakers have proposed a bill that would let victims sue over digitally faked sexual images.
The ‘Disrupt Explicit Forged Images and Non-Consensual Edits’ (DEFIANCE) Act would add a civil right of action for intimate "digital forgeries" depicting an identifiable person without their consent, letting victims collect financial damages from anyone who “knowingly produced or possessed” the image with the intent to spread it.
The bill has been introduced by Senate Majority Whip Dick Durbin (D-IL), joined by Senators Lindsey Graham (R-SC), Amy Klobuchar (D-MN), and Josh Hawley (R-MO), reports The Verge.
"An identifiable individual who is the subject of a digital forgery may bring a civil action in an appropriate district court of the United States for relief against any person that knowingly produced or possessed the digital forgery with intent to disclose it, or knowingly disclosed or solicited the digital forgery," read the bill.
The term ‘digital forgery’ means any intimate visual depiction of an identifiable individual created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means, including by adapting, modifying, manipulating, or altering an authentic visual depiction, to appear to a reasonable person to be indistinguishable from an authentic visual depiction of the individual.
Pornographic AI-manipulated images, frequently referred to as deepfakes, have grown in popularity and sophistication since the term was coined in 2017.
Meanwhile, Microsoft has also introduced more protections to Designer, its AI text-to-image generation tool, which users had been exploiting to create non-consensual sexual images of celebrities.
Elon Musk-run X has also lifted its ban on searches for Swift's name, which it imposed last week after explicit, AI-generated images of her went viral on the platform. The images were seen by millions before X removed them, and the company was criticised for its slow response.
A ban on search results came after the White House weighed in last week, calling the fake images "alarming" and emphasising that social media companies have a responsibility to prevent the spread of such misinformation.
Disclaimer: Information, facts or opinions expressed in this news article are presented as sourced from IANS and do not reflect views of Moneylife and hence Moneylife is not responsible or liable for the same. As a source and news provider, IANS is responsible for accuracy, completeness, suitability and validity of any information in this article.