Trump signs bill banning nonconsensual AI deepfake porn

Trump signs the Take It Down Act, banning non-consensual explicit content, including AI deepfakes. Platforms must remove such material within 48 hours.

President Donald Trump signed into law a ban on the non-consensual dissemination of explicit content, covering both real and AI-generated images.

According to the new law, platforms must take down such content within 48 hours after someone notifies them.

Senators Ted Cruz of Texas and Amy Klobuchar of Minnesota introduced the Take It Down Act, which passed the Senate and House before the President officially signed it into law at the White House on Monday.

The Take It Down Act is a bipartisan measure targeting people who distribute non-consensual intimate imagery, such as deepfakes and revenge porn.

The law prohibits anyone from posting non-consensual sexually explicit visuals online, whether authentic or AI-generated.

Violators who publish such material face mandatory restitution and criminal penalties, including fines, imprisonment, or both.

The law also criminalizes threatening to publish such intimate imagery, including AI-generated content.

The Federal Trade Commission will enforce the requirement that websites remove offending images within 48 hours of receiving a complaint, and will also press them to take down any copies of the material.

After lobbying members of Congress herself, Melania Trump declared the law a “national victory” in an official statement.

Only two representatives in the House opposed the Take It Down Act, while the rest of Congress overwhelmingly supported it.

The law is one of the first federal statutes in the U.S. designed to address the harms of AI-generated content, criminalizing the dissemination of such imagery as the technology rapidly evolves.

Although many states have laws against non-consensual intimate imagery and explicit deepfakes, the severity of charges and penalties varies.

Victims face major difficulties removing such content from the web, which allows the material to keep circulating and perpetuate emotional distress.

“With the rise of AI image generation, countless women have been harassed with deepfakes and other explicit images distributed against their will. This is wrong, and it’s just so horribly wrong,” Trump said during the signing ceremony. “It’s a very abusive situation, like in some cases, people have never seen before. And today we’re making it totally illegal.”

Harmful deepfake incidents have become increasingly common.

In early 2024, X experienced a high-profile case involving the swift spread of deepfake-generated explicit images of Taylor Swift.

Following the incident, X temporarily banned searches for Taylor Swift’s name, coinciding with lawmakers’ efforts to enact laws against deepfake creation.

The UK, among other nations, included the prohibition of sharing deepfake pornography in its 2023 Online Safety Act.

A 2023 study by Security Hero found that the vast majority of deepfakes on the internet are pornographic, and that nearly all victims (99%) are women.
