As “Swift”ly As You Can Post, the Backlash Comes: Using AI-Generated, Deep Fake, and Other Celebrity “Endorsements”
by: Rosanne Yang
The backlash was swift this week after former President Trump posted images, at least digitally edited if not entirely AI-generated deep fakes, falsely suggesting that Taylor Swift had endorsed him in his bid for re-election. This is, in some ways, an old story – brands routinely find themselves in the hot seat for using celebrity images to create unauthorized endorsements of their brands. (Are the names Katherine Heigl and Duane Reade ringing a bell?) The modern twist is the use of AI to produce ever-more-realistic but completely made-up images. The problem has become so widespread that the FTC published an alert about fake celebrity endorsements.
Meanwhile, marketing departments everywhere remain intensely interested in these activities. Celebrity endorsements are gold, and AI is cheap. Sounds like a match made in heaven, but for those pesky little things called laws. Today we talk about the inevitable questions from marketing departments relating to these kinds of posts.
First, let’s level set on what rights and legal issues are in play.
Rights of Publicity. Celebrities – and sometimes regular people – have rights in their names, images, voices, likenesses, signatures, and other indicators of their identities. Third parties may not use these identifiers, without permission, to promote, endorse, or advertise a commercial endeavor, and in some states these rights continue after the death of the person whose likeness was used. These rights are granted as a matter of state law and vary from state to state. Determining which state's law applies can be tricky.
Note: There is a pending bill called the NO FAKES Act which, if passed, would create a national right of publicity for all persons specifically with respect to digital replicas of their visual likeness or voice. The right would extend beyond death for a period of 10 years, extendable up to 70 years, depending on the circumstances.
False Endorsement. The Lanham Act provides recourse to those who have been falsely associated with or said to endorse another brand.
False Advertising. The FTC Act and its corollary laws in all 50 states uniformly prohibit false advertising of one's brand, which includes fake endorsements. Again, the FTC considers this problem big enough that it issued guidelines on the subject, which are actively enforced. In addition, it just finalized a new rule on fake testimonials (and reviews).
Defamation. When a person is falsely said to have endorsed a brand in a way that casts that person in a negative light, defamation claims may also be possible.
Copyright. Let’s just pretend we have the consent of the celebrity or other person featured in the image or video. Unless the image being used is a selfie taken by the person featured, the consent does not extend to the image or video itself. To use the specific materials, the consent of the photographer, videographer, artist, or other creator is required.
When advised of these laws, marketing teams are likely to have a variety of questions and responses. So, let's take a look at those next.
1.) But they are a public figure – doesn’t that give us some right to talk about them?
Not for commercial purposes. Let’s be clear – if a brand posts or asks others to post on its behalf, it’s for commercial purposes. It’s not news coverage.
2.) But it’s just a meme!
Memes have been a thing for a while and by all indications are here to stay. When it’s just Suzie posting on her personal account for personal reasons, there may be some fair use protection depending on the circumstances, but other than that possibility, there is no exception in the law for them. There is just a tacit understanding that it’s typically not worth going after individuals using third party content for personal purposes. But when it comes to commercial posts, the incentive to enforce is present.
And to be clear, the fact that the origin of the meme – including what an AI may have used from its training data to generate the image – may be unknown doesn't change the calculus. When a brand posts for commercial purposes, all rights should be cleared.
3.) AI generated it, and I thought you told me that no one owns what AI generates.
While it is true that a work produced purely via generative AI is not currently afforded copyright protection in the U.S., that doesn't give carte blanche to do whatever one pleases with the output. The celebrity still has rights, and if the AI borrowed too heavily from the images it was trained on, the creators of those training images may have copyright claims. In addition, if the work was not produced purely by AI but rather was edited significantly by a human after the AI produced it, that human may have copyright rights as well.
4.) It’s funny, and I thought parody/satire was protected by the First Amendment.
Being funny doesn’t make something parody or satire. The legal definition of parody under copyright law requires more than being funny: it requires that the new work comment upon or criticize the original work. To qualify as satire, the use must be made as commentary on society, people, or issues rather than on the original work. Traditionally, these have been issues of copyright law, though similar concepts have been brought into trademark law. In any case, though, it is rare that posts by marketing departments would meet the criteria to be either protected parody or satire.
5.) It’s political speech, and I thought that was protected by the First Amendment.
Most companies require some fairly high-up-the-chain approvals before a political statement or post can be made by the company. So, for starters, marketing departments should make sure that anything that smacks of politics is cleared internally for alignment on the message. Beyond that, the line between "political" speech and commercial speech is thin and fuzzy when it is a brand doing the talking. Even if the message leans into the political realm, it's still not an anything-goes situation. The message still should not be false, and regardless of whether it is ultimately found to be protected political speech, you can rest assured that if a copyright owner or the individual featured doesn't appreciate the post, bad press and/or an expensive piece of litigation could ensue before that outcome is ever reached.
At the end of the day, while AI may make fake celebrity endorsements easier to create and harder to detect (and therefore more convincing to viewers), existing laws uniformly require brands to obtain permission before, and to tell the truth when, posting a celebrity endorsement. If anything, these laws will only be strengthened as deep fakes become increasingly common.
Originally published by InfoLawGroup LLP.