Katten

Tennessee Expands Right-of-Publicity Statute to Cover AI-Generated Deepfakes

On March 21, 2024, Tennessee Governor Bill Lee signed into law the Ensuring Likeness, Voice, and Image Security Act of 2024 (the ELVIS Act), an unprecedented piece of legislation aiming to ban unauthorized artificial intelligence reproductions of individuals’ likenesses and voices. The new Tennessee law follows the current trend of federal and state lawmakers and regulators seeking to address “deepfakes” and pursue other “anti-impersonation” measures. The ELVIS Act overtly targets AI-generated songs by imposing civil and criminal liability for reproduction of any voice that is readily identifiable and attributable to a particular individual, “regardless of whether the sound contains the actual voice or a simulation of the voice of the individual.”

Importantly, the ELVIS Act extends liability not just to the end-user who creates the infringing work using AI, but also to anyone who “publishes, performs, distributes, transmits, or otherwise makes available to the public” the infringing work or “makes available an algorithm, software, tool, or other technology, service, or device” that assists in producing such infringing work. Three categories of individuals or entities can thus be targeted by the law: end-users who create the infringing work, publishers of the infringing work, and any company or individual that makes the infringement possible through technology. The text of the bill limits end-user liability to commercial uses (e.g., “advertising,” “fundraising,” “solicitation”). The text is less clear on that point in the subsections relating to publishers, tech companies, and app developers, but the statute elsewhere includes carve-outs for First Amendment-protected activities.

The Act requires that end-users and publishers have actual knowledge of the infringement for liability to attach. For tech companies and developers, the Act requires that the software, app, algorithm, or other technology have a “primary purpose or function” of producing such infringing works. The Act also imposes liability where the publisher of an advertisement (e.g., a newspaper or television station) reasonably should have known of the unauthorized use of an individual’s voice or likeness, meaning advertising publishers can be held liable without actual knowledge of the conduct.

The ELVIS Act received broad bipartisan support in the Tennessee Legislature. Several industry groups also supported the ELVIS Act, including Broadcast Music Inc. and the Screen Actors Guild.

Tags

artificial intelligence, entertainment and media, litigation, advertising marketing and promotions