Your Agency Contract Was Written Before AI. Here's What It's Missing.

What procurement teams need to know about updating agency agreements for how production works today. With legal perspective from Roch Glowacki, Lewis Silkin LLP.

Most agency contracts were negotiated before AI became a standard part of the production workflow. The pricing, the deliverables, the representations about originality and ownership, and the way risk is allocated between the parties were all written for a world where human professionals did the work.

That world has changed. Agencies are using AI tools across the production process, from ideation and copywriting to image generation and video editing. In many cases, this means work is being produced faster and with fewer people. For procurement, this raises a straightforward question: if the way work is being done has changed, shouldn't the terms you're operating under change too?

This is the second article in a three-part series on the AI legal landscape for global advertisers. The first article examined whether AI-generated campaign assets can be legally protected. This one looks at what your contracts need to say to reflect how production is actually happening.

Roch Glowacki of Lewis Silkin describes contracts as "the building block of AI governance" right now. With IP law uncertain, regulation still forming, and case law thin, the only way to get real clarity between you and your agency is to write it down.

What AI addenda should cover

AI addenda to agency and vendor contracts are becoming standard practice. Roch identified the provisions they need to include.

Ownership of AI-generated assets. Who owns what, and what "ownership" means when copyright may not apply. As the first article in this series explored, the protectability of AI-generated content varies by jurisdiction.

Data usage. Whether your campaign data, brand assets, or proprietary material is being fed into AI systems that may use it to train models or improve tools used by other clients. This is a particularly important question for advertisers whose brand strategy and creative assets represent significant competitive value.

Tool visibility. What AI tools are being used, by whom, and for what purpose. Without this visibility, you cannot assess whether the production costs you're being charged reflect the way the work is actually being done.

Risk assessment. Even basic due diligence on the AI tools being deployed and the risks they carry. Roch emphasised that some form of risk assessment, even a simple one, is a prerequisite for informed decisions about what you're approving.

Disclosure and transparency. What the agency must tell you about AI involvement in the work, including when disclosure is required and what level of detail is expected. This is not about policing your agency. It is about having the information you need to manage the relationship and evaluate what you are receiving.

The calibration challenge

Roch highlighted a practical problem. Many contract clauses say "we need approval over any AI that you're using," but the definition of "AI" has expanded dramatically. A simple scheduling algorithm, a media planning tool, and a generative AI image creator are all technically "AI" but carry vastly different risk profiles.

Blanket prohibitions or approval requirements create friction without delivering meaningful protection. The question "what AI are you using?" needs to be followed by "for what?" and "with what data?" Contract language needs to match the actual risk of the specific tool, not treat every AI application the same way.

The RFP gap

Roch also flagged a gap that occurs before contracts are even negotiated. When advertisers put out requests for proposals, AI usage is still not being captured properly. Even when due diligence questionnaires are included, it can be difficult to determine what is actually happening with AI in the proposed workflow.

If the questions in the RFP are broad and unspecific, the answers will be too. Questions about what tools are being used, for what purpose, with what data, and with what governance are more likely to surface useful information. And knowing how AI is being used in the proposed production process is directly relevant to evaluating whether the proposed pricing makes sense.

What advertisers should consider:

Audit your current production contracts and agency agreements across all markets for AI-specific provisions. Where those provisions are missing, add them, calibrated to the risk level of the specific AI deployment rather than applied as a blanket clause. Strengthen your RFP processes to capture AI usage at the proposal stage, and evaluate whether the production costs being proposed reflect the tools and methods actually being used to do the work.

Roch Glowacki is a commercial contracts lawyer at Lewis Silkin LLP. BBS Worldwide is an independent advertising production consulting firm.

This article is for informational purposes only and does not constitute legal advice. The legal perspectives shared here reflect general observations about the current state of the law and should not be relied upon as guidance for specific legal decisions. Readers should consult qualified legal counsel for advice on their particular circumstances.
