Trump Signs 'Take It Down Act,' Criminalizing Deepfake and Revenge Porn

On Monday, President Donald Trump signed the “Take It Down Act” into law, making the United States one of the first countries to impose criminal penalties on those who publish non-consensual deepfake and revenge porn content. The bill, passed by a broad bipartisan majority in Congress, targets the rising threat of AI-generated explicit imagery and video, especially material shared online without the subject’s consent.

The law was the First Lady’s marquee project within her rebooted Be Best campaign. She told the crowd that child and family well-being depend on curbing digital abuse, pointing to the speed and realism with which generative AI can fabricate sexual content.

Senators Cruz and Klobuchar co-sponsored the bill after months of hearings in which victims described how manipulated photos spread faster than takedown notices could keep up. The two dissenting House members, both citing free-speech concerns, were outvoted when the chamber passed the bill 409-2 in April, following a unanimous Senate vote in February.

Now that the measure is law, anyone who knowingly posts non-consensual intimate imagery faces federal charges that can include prison time. The statute also compels social-media platforms to remove flagged images within 48 hours of a valid request, and it hands the Federal Trade Commission authority to fine companies that sit on complaints.

Cruz called the moment “an historic win for victims,” stressing that tech firms “will no longer be allowed to turn a blind eye.” Klobuchar pointed to bipartisan unity, framing the act as proof that Congress can still move on privacy when parents pressure lawmakers directly.

Digital rights groups applauded the new criminal penalties yet warned that the takedown mandate could chill legitimate speech if platforms over-filter. They also questioned whether the FTC has the resources to police thousands of daily removal requests once the takedown requirements take effect within the next year.

Victim-advocacy organizations, including the Cyber Civil Rights Initiative, praised the measure’s narrow focus on non-consensual sexual material, contrasting it with broader content-moderation bills that have stalled over First Amendment concerns. Several groups said they will monitor court cases to ensure prosecutors rely on the statute rather than lesser state-level charges.

Tech policy analysts see the act as a rare piece of tech legislation to clear a deeply fractured Congress. They note that its narrow scope, limited to deepfakes and revenge porn, sidestepped the larger debates over Section 230, data privacy, and AI liability that continue to stall broader reforms.

The Justice Department must now draft guidance telling U.S. attorneys how to handle evidence from overseas servers, while the FTC prepares rulemaking on response-time benchmarks for takedown requests. Civil liberties attorneys will watch those rules for signs that the line between illegal exploitation and constitutionally protected satire remains intact.

Alex Cooke is a Cleveland-based photographer and meteorologist. He teaches music and enjoys time with horses and his rescue dogs.
