Morals clauses in the age of AI: Artificial misconduct

Working with celebrities or public figures involves risk. If the talent’s reputation suffers, the damage reflects on the company. Celebrity endorsement contracts manage this risk with a morals or conduct clause. The experts at Pryor Cashman LLP* explain why this is particularly important in connection with AI.

A key consideration when engaging talent (a celebrity, actor, singer, or other public figure) is that the value of the association between a company and that talent depends entirely on the talent’s reputation. When an agreement is signed, that reputation can be vetted and appear intact; it is what might happen later, over the course of the relationship, that causes concern. To shield against that risk, talent agreements of all kinds (endorsement, co-branding, license, and sponsorship agreements) have, especially since the MeToo era, typically included a “morals” or “conduct” clause.

Speak to us if you have any questions about AI risk and morals clauses.
Dyan Finguerra-DuCharme, Partner, Pryor Cashman LLP*, New York, USA

What a morals clause provides

The purpose of a morals clause is to provide an express remedy against talent whose conduct damages the value of the association and, accordingly, the potential success of a motion picture, television series, marketing campaign, or product, thereby jeopardising the accompanying investment and the goodwill associated with a corporate brand.

The dawn of artificial intelligence (AI) technology has seen the birth of “deep fakes” (or, as SAG-AFTRA calls them, “digital replicas”): at a high level, AI-generated versions of a celebrity, typically a video or photograph that appears to be real, in which the celebrity expresses a particular view or encourages consumers to unwittingly fall victim to a scam. The possibility of a deep fake complicates the question of what constitutes (or, rather, should constitute) “conduct”, a question that must be answered if the purpose of the morals clause is to be met.

This might also interest you

Would you like to know more about AI risks in celebrity endorsement deals? Read the article “Artificial Misconduct: Morals Clauses in the Age of AI” here: https://www.pryorcashman.com/publications/artificial-misconduct-morals-clauses-in-the-age-of-ai

The challenges of AI

The advent of AI has brought new challenges both for talent and for the companies engaging them. Each must recognise that producing false and deceptive information about or involving talent, information that can harm his, her, or their reputation, has become not just possible but remarkably easy. It is therefore more important than ever for both parties to an agreement involving the use of someone’s persona to anticipate such concerns and to pre-emptively establish remedial protocols to address them in the unfortunate event that they arise.

For further information please contact:

Dyan Finguerra-DuCharme, Partner, Pryor Cashman LLP*, New York, USA
Email: dfinguerra-ducharme@pryorcashman.com

Simon Pulman, Partner, Pryor Cashman LLP*, New York, USA
Email: spulman@pryorcashman.com

Laure Sawaya, Special Counsel, Pryor Cashman LLP*, New York, USA
Email: lsawaya@pryorcashman.com


*Ecovis cooperates with Pryor Cashman LLP (www.pryorcashman.com), a full-service, US-based law firm with offices in New York City, Los Angeles and Miami.

Contact us:

Pryor Cashman LLP
7 Times Square
New York, NY 10036-6569
Phone: +1 212 421 4100
www.pryorcashman.com