Protecting Creators And Limiting Unauthorized Digital Replicas in the Pending NO FAKES Act
By Sez Harmon
June 10, 2025
The 2025 bipartisan bill known as the Nurture Originals, Foster Art, and Keep Entertainment Safe Act, or "NO FAKES" Act, has significant potential to set several precedents in how the U.S. governs GenAI applications. Representative Salazar reintroduced the bill in April to protect every individual's voice and visual likeness from GenAI recreations, and its revisions to the original 2023 draft are receiving widespread industry support. The bill has since been referred to the House Judiciary Committee, however, signaling that several key legal issues are at stake. The language of the NO FAKES Act could shape how intellectual property law is applied to biometric data and where the limits of free speech are drawn for digital recreations. To set this bill up for success, I believe Congress will need to adopt well-scoped parameters for the misuse of biometric data and thoroughly consider how liability for unauthorized deepfakes should be applied under American jurisprudence.
The current text of the NO FAKES Act treats an individual's voice and visual likeness as property rights that cannot be reassigned during the individual's lifetime but can be licensed to external parties. Under the NO FAKES Act, any individual whose voice or visual appearance is used without consent could bring a civil action against the violating party and sue for damages, regardless of whether libel is involved. Transforming biometric features into IP rights would be a monumental change in federal law, and it builds on legal scholars' advocacy for treating personal and biometric data as trade secrets subject to restricted access. Although this new protection could significantly support creators whose artistic abilities have been used nonconsensually in AI replicas, it also raises several questions: (1) how will this statute apply to AI-generated recreations from systems that are not trained on biometric data? (2) are there contexts in which using an individual's voice or likeness in a deepfake should be permitted to uphold free speech, especially when no financial gain is involved? and (3) would setting parameters for banned use cases of biometric data circumvent the free speech issue and get to the root of the problem? I believe the answers to these questions are critical to ensuring this bill can be signed into law and enforced in the U.S.
This IP protection in the NO FAKES Act would also be monumental because it diverges from how states are treating the issue: 25 states have passed deepfake-related laws, but those statutes focus on the misuse of biometric data in political campaigning or obscenity cases. By expanding the prohibition on nonconsensual biometric data usage in deepfakes to every context, Congress would also curb political critique and satire in AI recreations, expression that should be protected under the First Amendment. Agencies like the Federal Trade Commission have shown that deepfake prohibitions can be structured to apply only when the technology leverages biometric data for financial gain, and the NO FAKES Act may benefit from this distinction. Finally, if the intent of this bill is truly to support artists' agency over how their identities are used, Congress may want to limit biometric data usage in arts-specific domains rather than target deepfake creation in general. While I fully support protecting artists' voices, likenesses, and talents, and I want to ensure they have a say over how these attributes are used, the bill's wide scope could harm other facets of American democracy that should not slide by unaddressed.
Overall, the NO FAKES Act could become a law that genuinely prioritizes identity protection, free speech, and biometric consent, but it has a long way to go before reaching that status. I am excited to see how the bill develops and whether it will set a federal precedent, once and for all, for how biometric data is treated under IP law. The bill demonstrates a growing recognition that artists' voices need to be heard in the pursuit of AI innovation, and I hope the NO FAKES Act continues to evolve to best protect those voices.
Sez Harmon is an AI Policy & Data Governance professional passionate about ensuring emerging technologies serve the public good and empower human talent. Sez has more than six years of experience leading technology policy initiatives and specializes in AI regulatory compliance, interdisciplinary research, client services, and program management. Currently, Sez serves as an Independent AI Governance & Data Privacy Consultant for multinational organizations pursuing robust data governance solutions internationally.