The AI Playbook: What sports stars must do now to protect their IP in the age of artificial intelligence
Not long ago, we thought that social media offered near-limitless opportunities to monetize athlete intellectual property (IP). Generative AI has been a sobering development. From digital replicas and voice cloning to highly realistic deepfake videos, artificial intelligence creates a portfolio of new risks that athletes must confront. Endorsement, NIL and similar deals through which sports stars capitalize on their brands have a long history and involve standard industry practices. However, AI raises new legal and practical issues that demand attention at every stage.
Growing threat of unauthorized AI use
AI creates a growing threat of unauthorized use. AI deepfake technology allows bad actors to generate convincing content depicting a star endorsing a product, delivering a political message, or associating with a cause. Incidents have already been documented globally, including a recent deepfake of Brady Tkachuk mocking Canadians after the Winter Olympics and nonconsensual imagery of female Olympians posted on 4chan.
This category of harm sits at the intersection of multiple legal theories. A false AI-generated endorsement may give rise to right-of-publicity, Lanham Act, state consumer protection and common law fraud claims. The challenge is speed: AI-generated content spreads across platforms in minutes, and by the time takedown notices are issued or litigation initiated, the damage is done. Statutes also have their limitations, in large part because most were not drafted with AI in mind, and the patchwork of state right-of-publicity laws compounds the problem: California and New York provide relatively broad protection, while many other states offer far less. At the federal level, the bipartisan NO FAKES Act would establish a federal right of publicity in digital replicas, create a private right of action against unauthorized use of voice or likeness and impose a DMCA-style takedown regime. The bill has industry support but has not been enacted, leaving athletes reliant on the existing patchwork, which creates real exposure.
For sports stars, the stakes of unauthorized AI use are extreme. Fabricated endorsements can damage carefully managed brand relationships and violate existing agreements with partners. Proactive monitoring using AI-detection platforms is prudent but must be paired with experienced counsel prepared to take enforcement action.
AI in commercial deals and preventing catastrophe
It is critical for athletes and their representatives to be meticulous in deals. Provisions that were once uncontroversial now require careful negotiation.
AI restrictions: Many contracts contain broad grants of rights authorizing the use of athletes’ NIL across all media and by all technologies now known or hereafter developed. Courts have not yet definitively resolved whether such language is broad enough to authorize use in connection with AI. As such, broad “all media and all technologies” clauses should be narrowed or modified to exclude use in connection with AI, particularly in training, cloning and digital replica creation. While some state statutes, such as California AB 2602 (which renders digital replica provisions unenforceable in performance contracts), may provide some protection, they are not perfect or well tested in the courts. Careful contractual negotiation is essential to best protect against risks posed by AI.
Appropriate compensation: When athletes allow the use of AI in connection with the exploitation of their IP, it is critical that compensation attached to that use is scrutinized and negotiated. Flat fees in traditional endorsement deals often don’t account for AI use. Equity grants are increasingly common and may be better suited to the wide-scale use of athlete IP. Tiered compensation with upfront fees, plus use-based royalties for AI-generated derivative content, are other options to address commercial value in the age of AI.
Written approvals: As an additional protection against excessive AI exploitation by commercial partners, even authorized AI use can be further conditioned on written consent. Rather than default authorization, athletes may require commercial partners to request and obtain specific prior written approvals before AI is used in conjunction with their IP. This is especially important as AI capabilities continue to improve rapidly.
Scope and territory: Perpetual, worldwide grants are always dangerous, particularly with respect to AI. Replicas trained and released today may be generating additional content decades from now. Contracts should define a term for any AI-related use, require destruction of content upon license expiration, and include geographic restrictions.
Indemnification and takedown obligations: Athletes should negotiate robust indemnification from commercial partners for any claims arising from AI-generated content and require contractual commitments to the rapid takedown of unauthorized AI-generated content that violates their publicity and IP rights.
NIL agreements: NIL agreements pose acute challenges, as college athletes are often unrepresented and negotiate against sophisticated partners. Compressed timelines and lower NIL deal values create additional pressure to accept form agreements. Yet the AI issues are, if anything, more pronounced in NIL agreements because college athletes are younger and contractual provisions can allow the use of their NIL long after they graduate, turn pro, and their IP becomes more valuable.
AI estates of sports stars
AI complicates the already nuanced area of postmortem publicity rights. While California and New York extend rights of publicity beyond death, many states have more limited protection, and AI creates new threats and opportunities across the landscape. Laws such as California Assembly Bill 1836, enacted in 2024 to regulate digital replicas, have attempted to address this threat by narrowing the statutory exemption that had previously permitted "expressive works" to incorporate a deceased celebrity's voice or likeness without consent. However, they are not comprehensive. For athletes, estate planning should now explicitly address AI rights: who controls digital likeness after death, on what terms it may be licensed, and what residuals or compensation flows to heirs. Estates should also carefully monitor the use of a deceased athlete's IP.
The game plan
AI has fundamentally altered the playing field for athletes. The regulatory and contractual frameworks are developing rapidly, and those who fail to be proactive risk exposure. Athletes cannot afford to wait for a statutory solution that may never come. Some immediate priorities are clear: audit existing agreements for unintended AI grants; negotiate AI-specific protections in new deals; implement monitoring and enforcement protocols; and ensure that estate plans address digital NIL rights. Athletes and their representatives who act now will be better positioned than those who treat AI as tomorrow’s problem.
--
This article first appeared in the April 9, 2026, edition of the Sports Business Journal. All rights reserved. Further duplication without permission is prohibited.