Congress Takes On Deepfakes with the NO FAKES Act of 2025
Also: Deepfake Legislation in PA and Growing Film Infrastructure in NJ
The issue of deepfakes, of both likeness and voice, has reached a point where the US Congress is attempting to impose some guardrails. Below is a summary of the most current version of the proposed federal law, followed by a look at a somewhat similar but weaker attempt by Pennsylvania at the state level.
The NO FAKES Act of 2025
This Act is an attempt by the US Congress to regulate the use of deepfakes without the consent of the person(s) portrayed in the deepfake. For reasons which will become apparent later in this piece, the NO FAKES Act, if signed into law, will be a hotbed of First Amendment free speech litigation.
The "Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2025" (the “NO FAKES Act” or the “Act”) aims to protect intellectual property rights related to the voice and visual likeness of individuals, including digital replicas. It establishes that individuals or their designated right holders have exclusive rights to authorize the use of their voice or likeness in digital replicas or related products and services. These rights are considered property rights, transferable post-mortem, and renewable for up to 70 years after the individual's death under specific conditions. The 70 year post-death period is consistent with US and EU Copyright law.
The Act also outlines licensing requirements, including written agreements and court approval for minors, and provides mechanisms for post-mortem registration and renewal of these rights.
The Act imposes liability on individuals or entities that publicly display, distribute, or use unauthorized digital replicas or products designed to create such replicas without proper authorization. It includes safe harbor provisions for online service providers, protecting them from liability if they adopt policies to address repeat violations and promptly remove unauthorized content upon notification. Exclusions are provided for uses in news, commentary, satire, or historical works, as long as they do not create false impressions of authenticity or involve sexually explicit content.
The legislation preempts state laws regarding voice and likeness rights in digital replicas but allows exceptions for certain pre-existing state statutes and causes of action. It explicitly states that providers of online services are not required to monitor their platforms for unauthorized digital replicas. The Act applies retroactively to individuals who died before its enactment, granting rights to their heirs, and takes effect 180 days after its enactment. It aims to balance intellectual property protection with the public interest in creative and expressive works.
The Act defines its key terms as follows:
Digital Fingerprint: An electronic label or identifier created by a cryptographic hash function (or similar function) or any other digital process, tool, or technique selected by the provider of an online service. It is unique to a specific piece of material, ensuring that the material will not be misidentified as a match for a different piece of material. (A short illustrative hashing sketch follows these definitions.)
Digital Replica: A newly created, computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual. It includes:
Representations embodied in sound recordings, images, audiovisual works, or transmissions where the individual did not actually perform or appear.
Versions of sound recordings, images, or audiovisual works where the individual's performance or appearance has been materially altered.
It excludes electronic reproductions, sampling, remixing, mastering, or digital remastering authorized by the copyright holder.
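To make the “digital fingerprint” concept concrete, here is a minimal sketch, assuming Python and its standard hashlib library, of how a provider might derive a cryptographic-hash fingerprint for an uploaded media file. The function name and file path are hypothetical and not drawn from the Act; the Act leaves the choice of hashing process, tool, or technique to the online service provider.

```python
import hashlib
from pathlib import Path

def digital_fingerprint(media_path: str) -> str:
    """Return a SHA-256 hex digest that uniquely identifies a media file.

    Illustrative only: the Act lets a provider choose any cryptographic
    hash (or similar) function; SHA-256 is simply one common choice.
    """
    hasher = hashlib.sha256()
    with Path(media_path).open("rb") as f:
        # Read in chunks so large audio/video files don't exhaust memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            hasher.update(chunk)
    return hasher.hexdigest()

# Hypothetical usage: two byte-identical uploads produce the same
# fingerprint, while any alteration to the file yields a different one.
# print(digital_fingerprint("clip_upload.mp4"))
```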
The NO FAKES Act protects voice likeness by granting individuals and their designated right holders exclusive rights to authorize the use of their voice in digital replicas or related products and services. This protection is established as a property right, which is licensable during the individual's lifetime and transferable post-mortem. The Act ensures that unauthorized use of an individual's voice likeness in digital replicas or products designed to create such replicas is subject to liability, including civil penalties and damages.
Licensing requirements are outlined to ensure proper authorization, including written agreements specifying intended uses. For minors, additional safeguards such as court approval are required. The Act also provides mechanisms for heirs or assigns to register and renew post-mortem rights for up to 70 years after the individual's death, ensuring long-term protection of voice likeness.
Furthermore, the Act imposes liability on entities that publicly display, distribute, or use unauthorized voice likenesses in digital replicas. It includes safe harbor provisions for online service providers, requiring them to remove unauthorized content upon notification while protecting them from liability if they comply with the Act's requirements.
More Details
Under the NO FAKES Act, the following activities are considered unauthorized:
Use of Digital Replicas Without Authorization:
Public display, distribution, transmission, or communication of a digital replica without authorization from the applicable right holder.
Making a digital replica available to the public without authorization.
Products or Services Designed to Create Unauthorized Digital Replicas:
Distributing, importing, transmitting, or making available products or services primarily designed to produce digital replicas of specifically identified individuals without authorization.
Products or services with limited commercially significant purposes other than creating unauthorized digital replicas.
Marketing, advertising, or promoting products or services designed to produce unauthorized digital replicas.
Knowledge or Willful Avoidance:
Liability applies if the individual or entity has actual knowledge or willfully avoids knowledge that the material is an unauthorized digital replica or a product/service designed to create such replicas.
Exclusions:
Certain uses are excluded from liability, such as bona fide news, public affairs, sports broadcasts, documentaries, historical or biographical works, commentary, criticism, scholarship, satire, or parody, provided they do not create false impressions of authenticity or involve sexually explicit conduct.
These unauthorized activities are subject to civil liability, including damages, penalties, and injunctive relief, such as the following (a brief worked example appears after this list):
For the unauthorized public display, transmission, or communication of a digital replica:
Individual: $5,000 per work embodying the unauthorized digital replica.
Provider of an online service (good faith effort to comply): $25,000 per work embodying the unauthorized digital replica.
Provider of an online service (no good faith effort): $5,000 per display, copy made, transmission, or instance of the unauthorized digital replica, up to $750,000 per work.
Entity not a provider of an online service: $25,000 per work embodying the unauthorized digital replica.
Actual damages: Any actual damages suffered by the injured party, plus profits from the unauthorized use not accounted for in computing actual damages.
For making a service available to the public that is primarily designed to produce unauthorized digital replicas:
Individual: $5,000 per product or service.
Provider of an online service (good faith effort to comply): $25,000 per product or service.
Provider of an online service (no good faith effort): $750,000 per product or service.
Entity not a provider of an online service: $25,000 per product or service.
Actual damages: Any actual damages suffered by the injured party, plus profits from the unauthorized use not accounted for in computing actual damages.
Additional Remedies:
Injunctive or other equitable relief.
Punitive damages for willful activity proven to involve malice, fraud, or knowledge of legal violations.
Reasonable attorney’s fees for the prevailing party:
Awarded to the plaintiff if they win.
Awarded to the defendant if the court determines the action was not brought in good faith.
Penalties for false or deceptive notices:
$25,000 per notification containing a misrepresentation.
Actual damages, including costs and attorney’s fees, incurred by the alleged violator or the provider of an online service.
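To see how these statutory tiers compound in practice, here is a short worked sketch in Python using entirely hypothetical infringement counts; it simply applies the per-work amounts and the $750,000-per-work cap summarized above and is not legal advice.

```python
# Hypothetical arithmetic only: statutory-damage tiers from the NO FAKES Act
# summary above, applied to made-up infringement counts.

PER_WORK_INDIVIDUAL = 5_000
PER_WORK_COMPLIANT_PROVIDER = 25_000
PER_INSTANCE_NONCOMPLIANT = 5_000
PER_WORK_CAP_NONCOMPLIANT = 750_000

def individual_exposure(works: int) -> int:
    return works * PER_WORK_INDIVIDUAL

def compliant_provider_exposure(works: int) -> int:
    return works * PER_WORK_COMPLIANT_PROVIDER

def noncompliant_provider_exposure(instances_per_work: list[int]) -> int:
    # Each work is capped at $750,000 no matter how many displays occurred.
    return sum(
        min(n * PER_INSTANCE_NONCOMPLIANT, PER_WORK_CAP_NONCOMPLIANT)
        for n in instances_per_work
    )

# Example: 3 unauthorized works, one of which was displayed 200,000 times.
print(individual_exposure(3))                              # 15000
print(compliant_provider_exposure(3))                      # 75000
print(noncompliant_provider_exposure([10, 500, 200_000]))  # 1550000
```

Note that for a provider with no good-faith compliance effort, each additional display adds another $5,000 until the per-work cap is reached, which is what drives the much larger exposure in the third line.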
You can read the text of the NO FAKES Act below:
Pennsylvania Anti-Deepfake Legislation
Pennsylvania Governor Shapiro recently signed a bill into law that creates a criminal offense for using artificial intelligence to create non-consensual “forged digital likenesses” — including deepfakes and voice clones — with the intent to defraud or injure another person.
For the purposes of the new law, a forged digital likeness means a computer-generated visual representation of an actual and identifiable individual, or an audio recording of an actual and identifiable individual's voice, that:
has been created, adapted or modified to closely resemble a genuine visual representation or audio recording of the individual;
materially misrepresents the appearance, speech or behavior of the individual such that the fundamental character of the individual's appearance, speech or behavior is changed;
is likely to deceive a reasonable person to believe that the visual representation or audio recording is genuine; and
is created and distributed without the consent of the individual.
The new digital forgery offense occurs when a person generates or creates and distributes a forged digital likeness as genuine and knows or reasonably should know the visual representation or audio recording is a forged digital likeness.
EXCEPTION: There is a general carveout from the definition of the offense for “a constitutionally protected activity.” The bill provides no additional language defining that exception, and its vagueness makes the law ripe for a challenge under the First Amendment.
The text of Senate Bill 649 is below:
This is a good start, but a far cry from what some other states are doing on this front. Click here to see my analysis of Tennessee’s anti-deepfake legislation.
Will New Jersey Become Hollywood East?
There was an interesting story in The Hollywood Reporter profiling New Jersey’s growing film industry infrastructure, specifically the groundbreaking of Netflix’s 292-acre film and TV production facility in Monmouth County. Given the Lehigh Valley’s proximity to the Garden State, this is an interesting read.
Also an interesting read is the NJ Film, TV and Digital Media Tax Credit program.
J. Bryan Tuk is the founder of Tuk Business & Entertainment Law, and the author of risk, create change: a survival guide for startups and creatives. In addition to twenty-five years of private practice as an attorney, Bryan is a professional musician and producer, and a member of The Recording Academy and the American Society of Composers and Producers.