By Isaiah Poritz
The swift rise of “deepfakes” and similar artificial intelligence technology that allows users to digitally swap or blend faces has spurred questions about what legal remedies are available when such programs exploit a person’s image and voice without permission.
The burgeoning technology has already resulted in a proposed class action from a reality TV star who claims that the app Reface allows users to digitally paste their own faces on top of images and videos of him and other celebrities in violation of California’s right of publicity law.
Those types of AI programs—driven by algorithms trained on large swaths of images and videos—are behind recent viral memes like an AI-generated image of the Pope wearing a white puffer coat and a TikTok account featuring a deepfake of Tom Cruise.
For those unwillingly featured in deepfakes and other AI-manipulated content, right of publicity laws that protect one’s image and likeness from commercial exploitation may serve as a legal remedy, attorneys said. But those laws vary from state to state: some, like California and New York, provide clear statutory guidelines, while others have far less defined bodies of law. They could also come into conflict with federal laws such as the tech industry’s legal shield, Section 230 of the Communications Decency Act.
The mesh of statutory regimes could create a patchwork of decisions as generative AI applications continue their rise—and litigation mounts.
“The challenge is that this technology is moving so quickly and there are so many potential variations that could come up,” said Eleanor Lackman, an intellectual property attorney at Mitchell Silberberg & Knupp LLP. “It’s certainly hard to tell what will happen, particularly in the entertainment context.”
The lawsuit against Reface, which was created by the Ukrainian company NeoCortext Inc., was brought by Kyland Young, a finalist on the reality competition show “Big Brother.” He claims to have found images and videos of himself on the app, which allows users to digitally scan their own face to paste over his.
Young’s complaint said he hopes to represent a class of California actors, athletes, and musicians whose “names, voices, photographs, and likenesses” are exploited by NeoCortext to “pitch its product for profit.”
NeoCortext didn’t respond to Bloomberg Law’s requests for comment.
Robert Freund, a California-based attorney specializing in branding and advertising, said the lawsuit’s claims appeared straightforward: Young never authorized NeoCortext to use his images for the app, which has both free and paid versions.
What was more interesting about the complaint, Freund said, was the decision not to include a false advertising claim, which is commonly brought alongside right of publicity suits.
The lack of any clear deception of consumers could be one reason.
Lackman said that she wouldn’t call the Reface app a deepfake because it appears the app is only mashing up faces. “I think most people in the industry would consider a deepfake to be something where you legitimately believe it’s that person, like having someone say words that they didn’t actually say.”
Although Young’s complaint said the case isn’t about the legality of deepfake technology, Jonathan Blavin, a technology attorney at Munger, Tolles & Olson LLP, said the lawsuit will likely have “more significant, broader consequences.”
“There are different statutory regimes relating to the right of publicity in each state,” he said. “Some have common law protection, some have statutory protections. The scope of those protections may diverge in some respects.”
California’s right of publicity statute is one of the most comprehensive in the nation and requires permission to use one’s “name, voice, signature, photograph, or likeness” on products or in advertising.
Unlike other states’ laws, California’s doesn’t necessarily require the plaintiff to be a celebrity or to show that there’s commercial value in their identity.
The Screen Actors Guild-American Federation of Television and Radio Artists, a major entertainment industry union, has called for updating laws to account for unauthorized deepfakes and digital avatars.
In 2020, SAG-AFTRA successfully lobbied for legislation in New York that expanded the state’s right of publicity law to include protection against “digital replicas” of deceased celebrities and deepfake pornography.
But Young and other potential plaintiffs turning to the right of publicity will have to overcome some hurdles that also vary by location.
The First Amendment is a prominent legal defense against publicity rights lawsuits, especially in cases where the underlying product or service is protected by free speech, like comic books or documentaries.
Jennifer Rothman, a University of Pennsylvania law professor who researches right of publicity laws, said the Reface app “may not be protected in the same way by the First Amendment,” in contrast to recent right of publicity cases over public records databases.
The court will “have to take into account whether the activity is somehow in the public interest, whether it serves a parody purpose, or whether or not there’s a significant cultural or entertainment value relating to the app,” Blavin said.
Other federal statutes may also serve as shields against such lawsuits, including Section 230, which immunizes online platforms from liability over user-created content.
Courts have yet to rule on whether generative AI programs—which are often at least partially trained on user content—are protected by that law, and legal experts have debated the question.
Whether Reface is “creating its own content such that Section 230 doesn’t apply, I think that’s an open issue, and it certainly could be a defense that the app attempts to raise,” Blavin said.
But even if Section 230 were to apply to generative AI programs, courts are split over whether the right of publicity is considered an intellectual property right, which is exempt from the statute. A New York federal court ruled earlier this year that the state’s publicity law is more akin to a privacy right, while California’s law has been interpreted as an intellectual property right.
“You’re going to get various cases brought in various jurisdictions and potentially diverging opinions in courts on some of the critical issues” relating to the First Amendment and Section 230, Blavin said.
The case is Young v. NeoCortext Inc., C.D. Cal., No. 2:23-cv-02496, 4/3/23.