As Deepfakes Proliferate, Legal Protections Struggle to Keep Pace

October 28, 2025

In early October, OpenAI soft-launched a new video-creation app called Sora. The New York Times called it both “jaw-dropping” for its ability to create hyper-realistic simulations of faces, bodies and voices, and “disconcerting” for its ability to “pour gasoline on disinformation.”[1]

“Welcome to the era of fakery,” the Times wrote.[2]

While “deepfakes” have lurked beneath the surface of mainstream media for nearly a decade, new technology is rapidly making them ubiquitous across the media landscape. As deepfakes suddenly proliferate, the law is struggling to keep pace. Deepfakes implicate an individual’s right of publicity, control over one’s identity, the integrity of elections and core First Amendment freedoms—thorny issues that defy easy answers. As a result, efforts to regulate deepfakes in Congress have been halting, leaving a patchwork of state laws—and litigation—to govern this important and rapidly evolving issue.

Federal Efforts

In Congress, the debate over deepfakes has moved slowly as lawmakers try to define what constitutes a deepfake, which deepfakes should be prohibited, what the penalties should be, whether platforms should be liable and/or required to remove such content, and whether any of these potential regulations comport with the First Amendment.

To date, Congress has enacted only the TAKE IT DOWN Act, which criminalizes the publication of non-consensual intimate imagery (NCII), whether real or AI-generated, and requires websites and social media platforms to take down such content within 48 hours of notice. Following an uproar over sexually explicit deepfakes, the bill passed both chambers with only two “no” votes. But the law remains largely untested, and First Amendment advocates have raised concerns about the statute’s vague wording and the potential for selective enforcement by the Federal Trade Commission.

While Congress came together to address sexually explicit deepfakes, regulating non-explicit deepfakes has proven more difficult. In 2024, bipartisan groups[3] in the House and Senate introduced the Nurture Originals, Foster Art and Keep Entertainment Safe (NO FAKES) Act. The bill defines a “digital replica” as a “newly-created, computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual” where that individual did not actually perform or appear in the content.

The bill creates a private, enforceable, property-like right for individuals that extends for 10 years beyond death, with the possibility of additional renewals. A person or entity violates the law by producing a digital replica without consent, or by publishing, reproducing, displaying, distributing, transmitting or otherwise making such a replica publicly available. To be liable, a violator must have actual knowledge, or have willfully avoided knowledge, that the digital replica was not authorized. Platforms can avoid liability if they comply with notice-and-takedown rules upon being notified.

Individuals can seek statutory damages ranging from $5,000 to $25,000 per violation, along with actual damages and punitive damages when the defendant acts with “malice, fraud, knowledge or willful avoidance of knowledge that the conduct violated the law.” The bill also allows prevailing plaintiffs to recover attorneys’ fees, and prevailing defendants to recover fees when an action was brought in bad faith.

The NO FAKES Act attempts to ameliorate First Amendment concerns through carveouts for the use of digital replicas in news reports; in historical or biographical works; in “bona fide commentary, criticism, scholarship, satire or parody;” and in fleeting or negligible uses.

The most recent version of the bill has the backing of several key industry players, including the Motion Picture Association, the Screen Actors Guild-American Federation of Television and Radio Artists, the Recording Industry Association of America, the Recording Academy, OpenAI, IBM, and Google/YouTube. But First Amendment advocates and some consumer groups continue to express concerns that the bill is overly broad and threatens freedom of expression. The bill has been referred to the Judiciary Committee in each chamber but has yet to be marked up or voted out of either committee.

State Efforts

As Congress moves slowly to address these issues, several states enacted measures to regulate digital replicas in 2024 and 2025, including California, Illinois, Montana, New York, and Tennessee. These bills generally create or extend a property right in digital replicas, or regulate contracts that allow for the production of such digital replicas. While the NO FAKES Act would preempt similar state legislation in the future, it would grandfather in most existing laws.

States have also aggressively targeted the use of deepfakes to disrupt elections. More than 25 states have enacted laws to protect voters from political deepfakes in election communications—up from just five prior to 2024.

California passed the most sweeping of these regulations, targeting not only election-related materials but also videos and images that misrepresent election integrity. The legislation also covers materials that depict election workers and voting machines.

Two separate laws enacted by California were challenged by conservative social media influencer Christopher Kohls, known as "Mr. Reagan" on X, who was joined by the social media platform X and conservative outlets the Babylon Bee and Rumble. The first law would have prohibited online platforms from hosting deceptive, AI-generated election-related content in the run-up to an election. In August, U.S. District Judge John Mendez in the Eastern District of California struck down that law on Section 230 grounds, but declined to reach the First Amendment issues.[4] The second law required digitally altered campaign materials and ads to contain labels identifying them as digitally altered. Judge Mendez struck down that law on First Amendment grounds.[5] The State of California has appealed those rulings to the Ninth Circuit.

Kohls also challenged a similar law in Minnesota, joined by Republican state representative Mary Franson. U.S. District Judge Laura M. Provinzino declined to issue a preliminary injunction in that case.[6] The Eighth Circuit heard oral argument in Kohls’ appeal of that denial on October 22. In a separate suit brought by X, Judge Provinzino stayed proceedings on the constitutional issues pending the Eighth Circuit’s ruling, but allowed the Section 230 claim to proceed.[7]

Conclusion

Deepfake technology continues to improve rapidly while Congress and the courts deliberate over the complex legal issues it raises. In the meantime, consumers are largely left to their own devices when it comes to identifying deepfakes, and the subjects of these deceptive videos are left with only limited recourse.

[1] Mike Isaac and Eli Tan, OpenAI’s New Video App Is Jaw-Dropping (for Better and Worse), N.Y. Times (Oct. 2, 2025) at https://www.nytimes.com/2025/10/02/technology/openai-sora-video-app.html.

[2] Brian X. Chen, A.I. Video Generators Are Now So Good You Can No Longer Trust Your Eyes, N.Y. Times (Oct. 9, 2025) at https://www.nytimes.com/2025/10/09/technology/personaltech/sora-ai-video-impact.html.

[3] The bill’s Senate sponsors included Chris Coons (D-DE), Marsha Blackburn (R-TN), Amy Klobuchar (D-MN), and Thom Tillis (R-NC). The bill’s House sponsors included Reps. María Elvira Salazar (R-FL), Madeleine Dean (D-PA), Nathaniel Moran (R-TX), Adam Schiff (D-CA), Rob Wittman (R-VA) and Joe Morelle (D-NY), among others.

[4] Order and Final Judgment and Permanent Injunction as to AB 2655, Dkt. 98, Kohls v. Bonta, No. 2:24-CV-02527-JAM-CKD, 2025 WL 2495613 (E.D. Cal. Aug. 20, 2025).

[5] Kohls v. Bonta, No. 2:24-CV-02527-JAM-CKD, 2025 WL 2495613 (E.D. Cal. Aug. 29, 2025).

[6] Kohls v. Ellison, No. 0:24-cv-03754 (LMP/DLM), 2025 WL 66765 (D. Minn. Jan. 10, 2025).

[7] Order Granting in Part and Denying in Part Defendant’s Motion to Stay, X Corp. v. Ellison, No. 25-cv-1649 (D. Minn. July 3, 2025).