Victims of non-consensual intimate online images, including deepfake AI-generated content, will soon have new legal options to have the content removed after US President Donald Trump signed what has become known as the Take It Down Act.
First lady Melania Trump, who has been a major proponent of the legislation, gave a speech just before Mr Trump signed the bill into law.
“Over the past few months I have met with brave survivors, deeply loving families and advocates who know first-hand the emotional and psychological toll of NCII and deepfake abuse,” she said at the bill-signing ceremony outside the White House. “Many thanks to both parties for passing this legislation.”
According to the US Senate committee on commerce, science and transportation, the Take It Down Act criminalises the publication of non-consensual intimate imagery, often referred to as “revenge porn”.
The law requires that social media sites or other content-hosting websites, along with service providers, “remove such content within 48 hours of notice from victims”.
Just before signing the bill into law, Mr Trump said it increases penalties and introduces civil liabilities for online platforms that do not act to take such content down.
The act also includes provisions related to content generated with artificial intelligence tools.

According to The 19th, a non-profit newsroom focused on gender, politics and policy, internet platforms will have approximately one year to establish a process by which users can report the non-consensual content.
Though the Take It Down Act passed almost unanimously in the US House of Representatives and the Senate, it is not without critics.
The Electronic Frontier Foundation (EFF), a non-profit group promoting civil liberties in the tech world, has voiced frequent concerns.
“Good intentions don’t make good laws,” the EFF said in a news release when the act was first introduced in January.
It said the legislation's 48-hour deadline would place too heavy a burden on smaller websites and service providers, making it more likely that they would comply quickly, rather than accurately, to avoid litigation.
“Instead, services will rely on automated filters – infamously blunt tools that frequently flag legal content, from fair-use commentary to news reporting,” the EFF said.
“Take It Down is the wrong approach to helping people whose intimate images are shared without their consent. We can help victims of online harassment without embracing a new regime of online censorship.”
The commerce committee, however, insists that the act is narrowly tailored to uphold the First Amendment and, in turn, avoid affecting “lawful speech”.
According to the committee, the Take It Down Act also has the support of more than 120 organisations and companies including Meta, Snap, Google, Microsoft, TikTok and X.
Linda Yaccarino, chief executive of X, attended the bill-signing ceremony.
In February, as Take It Down legislation was gaining momentum, the EFF continued to oppose the bill, pointing out that victims of non-consensual intimate imagery already had legal options.
“In addition to 48 states that have specific laws criminalising the distribution of non-consensual pornography, there are defamation, harassment and extortion statutes that can all be wielded against people abusing NCII,” it said.
“Congress should focus on enforcing and improving these existing protections.”
The Take It Down Act is not the first law aimed at protecting reputations from being unfairly compromised.

In 2014, a ruling by the European Union's top court established what has become known as the “right to be forgotten”, which makes it easier for people to request deletion of certain private data held by digital entities.
Much like the Take It Down Act, however, the “right to be forgotten” is not without critics, and has been subject to legal challenges in parts of the world.