Businesses Beware: Using AI to Create Fake Celebrity Advertisements Can Get You in Trouble
Tuesday, May 21st, 2024
Imagine Taylor Swift endorsing your product. Can’t afford her? AI can fake it for you!
Thanks to AI deepfake technology, you can create a counterfeit but realistic Taylor Swift endorsement without involving her. Deepfakes use machine learning to create eerily realistic images, videos, and audio mimicking real people.
Scammers used AI to create a deepfake video in which a phony Taylor Swift announced she was giving away cookware sets from Le Creuset, a real luxury cookware brand. After following some prompts, deceived shoppers were asked to pay a small shipping fee. Those who paid didn’t receive cookware but did get hit with a hidden monthly credit card charge.
Faking ads and endorsements by living celebrities is obviously illegal. But what about deceased ones? What about mimicking government officials, such as the president? What if you only replicate the person’s famous voice but don’t identify the person? What if your fake endorsement is an obvious parody of the celebrity?
Any such use is legal if you get a license from the person whose name, image, or likeness (“NIL”) is used. For example, James Earl Jones licensed Disney to replicate his vocal performance as Darth Vader in future projects using an AI voice-modeling tool called Respeecher.
But what if you can’t buy a license from the celebrity? The law is clear that you can’t use the NIL of a living celebrity without permission for a commercial purpose, such as advertising or endorsement.
Doing so violates the right of publicity, which is a person’s right to control the commercial use of his or her NIL. The right of publicity is a matter of state law and varies from state to state.
About two-thirds of states recognize a right of publicity by statute, common law, or both. Other states usually have a “right of privacy,” which accomplishes roughly the same thing. Most states, including Virginia, hold that the right of publicity protects everyone. Still, some states protect a person’s NIL only if it has commercial value, which essentially means celebrities.
Most (but not all) states with a right of publicity hold that it continues after death, but the length of protection varies. In Virginia, protection lasts 20 years after death. In other states giving postmortem rights, the length runs from 10 to 100 years. Sometimes, the length of postmortem protection depends on whether the person is famous or whether the person’s estate continued to exploit the deceased celebrity’s NIL commercially.
For a business advertising using the NIL of others, it’s best to presume your activity will be governed by the most protective right of publicity in the country. Presume that the right of publicity protects everybody’s NIL, including politicians and non-celebrities, and that it protects not only living people but anyone who lived in the past 100 years. That’s because it’s difficult to determine which state’s law would apply to your activity, and you might get sued in another state.
Also, don’t get cute by mimicking a celebrity’s voice while not identifying the person. Most states that recognize the right of publicity extend it to a person’s recognizable voice.
For example, in the 1980s, Ford Motor Company produced an ad for the Mercury Sable using a voice impersonator singing “Do You Want to Dance” in Bette Midler’s style without her permission. Midler sued Ford and won.
Reacting to rising AI voice mimicry, Tennessee recently enacted a law that imposes criminal and civil liability for using AI to mimic someone’s recognizable voice without permission. The law extends liability to people who knowingly publish a fake voice and, in the case of advertisers, to those who should have known it was fake. It also extends liability to any company or individual producing AI technology with a “primary purpose or function” of making AI fakes.
What about political figures? Don’t you have a First Amendment free speech right to mimic them in advertisements? Generally, no. Politicians also receive protection against unauthorized use of their NIL for commercial purposes, including postmortem rights for as long as applicable state law gives such rights. Free speech principles don’t override that.
What if your AI fakery is obviously a parody, such as a phony Joe Biden endorsing hair-care products or a counterfeit Donald Trump endorsing a gym chain? Trying this is legally risky. You will be liable if some members of the public don’t get the joke, meaning they think the endorsement might be real. And even if everybody gets the joke, if a parody is an advertisement to sell a good or service, that commercial aspect might leave it legally unprotected.
Finally, what if your AI-generated person happens to match a non-celebrity’s appearance and voice? Because the right of publicity in most states protects all people, this too requires getting a license to use the NIL of the person depicted.
So, don’t use AI to create a fake Taylor Swift endorsement for your business. You might be “Enchanted” by her market appeal, but when she sues you, it will be hard to “Shake It Off.”
NOTE: A longer, more detailed version of this column is available on John Farmer’s Substack.
Written by John B. Farmer
© 2024 Leading-Edge Law Group, PLC. All rights reserved.