
Diritto al Digitale
Diritto al Digitale is the must-listen podcast on innovation law, brought to you by Giulio Coraggio, data and technology lawyer at the global law firm DLA Piper. Each episode explores the cutting-edge legal challenges shaping our digital world—from data privacy and artificial intelligence to the Internet of Things, outsourcing, e-commerce, and intellectual property.
Join us as we illuminate the legal frameworks behind today’s breakthroughs and provide insider insights on how innovation is transforming the future of business and society.
You can contact us using the details available on dlapiper.com
How the EU’s New AI Model Clauses Can Help Your Business
The European Commission has just released an updated version of the Model Contractual Clauses for AI Procurement — but what do they actually mean for your business? In this episode of Diritto al Digitale, Giulio Coraggio, technology and data lawyer at DLA Piper, explores how these clauses aim to simplify compliance with the AI Act, reduce legal uncertainty, and reshape the way public and private players negotiate AI contracts. Whether you’re an AI buyer, supplier, or legal counsel, this is your essential guide to turning regulation into opportunity.
📌 You can find our contacts 👉 www.dlapiper.com
What if I told you that the European Commission has just handed us a new legal toolkit to navigate the complex world of AI contracts? A toolkit that could radically change how we structure the purchase, sale, and deployment of artificial intelligence systems in both the public and private sectors? Whether you're a public authority buying an algorithm to assess social benefits, or a private company integrating AI into your customer service, these new clauses could soon become your best legal ally — or your worst headache.
I’m Giulio Coraggio, a technology and data lawyer at the global law firm DLA Piper. Diritto al Digitale is the podcast where we explore the intersection between law and innovation — and today, we dive into the updated Model Contractual Clauses for AI Procurement, also known as MCC-AI, just released by the European Commission in light of the AI Act coming into force.
So, what are we talking about?
The Model Contractual Clauses for AI Procurement were first introduced in September 2023 and have just been updated to reflect the obligations imposed by Regulation (EU) 2024/1689 — better known as the AI Act. They are a practical, modular contractual template created to help both public buyers and private sector operators source and provide AI systems in compliance with the new regulatory landscape.
Why is this important?
Because under the AI Act, using AI isn’t just about innovation — it’s about accountability, transparency, risk management, and data governance. And when you put these requirements into a contractual relationship, things get very complicated, very quickly. That’s where the MCC-AI come in.
The MCC-AI are not a complete contract. Instead, they’re a set of clauses meant to be integrated into a broader agreement, such as a SaaS license or an IT procurement deal. They do not cover general terms like intellectual property rights, indemnities, or governing law; they focus purely on AI-specific obligations under the AI Act.
There are two versions of the clauses:
- A full version for high-risk AI systems — think facial recognition, credit scoring, or recruitment tools.
- A light version for non-high-risk systems, but still aligned with key AI Act principles like transparency and technical documentation.
Both versions are supported by an explanatory guide, use case examples, and templates for data governance and compliance documents.
The MCC-AI cover five main areas:
- AI System Compliance – defining the legal and ethical standards the system must meet.
- Supplier Roles & Responsibilities – establishing who does what in terms of risk, transparency, and compliance.
- Data Governance – who owns the data, who can access it, and how it’s used.
- Verifiability & Traceability – how the AI system will be audited and monitored.
- Cost Allocation – who pays for what, especially when changes are needed to ensure compliance.
In a world where AI is often a “black box,” these clauses aim to make the contractual relationship much more transparent and accountable.
Although the MCC-AI are targeted primarily at public procurement, private companies should not ignore them. They can — and probably should — be adapted for B2B AI transactions as well.
But here's the catch: in the private sector, contractual negotiation is a different game. AI vendors, especially the big ones, often hold the stronger bargaining power. Many operate on standard T&Cs and offer AI-as-a-service models where custom clauses are hard to integrate.
Still, even if you can’t impose them wholesale, borrowing language or principles from the MCC-AI can help companies show regulatory readiness and reduce legal uncertainty — especially if things go wrong and there’s a dispute.
Let’s be honest — the practical adoption of MCC-AI may face headwinds.
Why?
Because the balance of power is rarely in favor of the buyer. Many AI deployments today happen via standardized, pre-built platforms, often open-source or proprietary systems with short-form licensing models. In these contexts, it’s hard to negotiate anything — let alone a 30-page compliance annex.
But if you're a regulated entity, or if your AI use cases impact human rights, consumer rights, or safety, regulators will expect more. Adopting — or at least aligning with — the MCC-AI might not be optional for long.
So what should companies do?
- Map your AI supply chain – identify where you're buying or building high-risk systems.
- Compare existing contracts to the MCC-AI – are you covering compliance, data governance, audit rights?
- Adapt the MCC-AI to your industry context — don’t copy-paste them blindly.
- Train your procurement and legal teams — they need to understand not only the clauses, but the risks they’re meant to mitigate.
Above all, view these clauses not as bureaucracy, but as a competitive advantage. If your business can prove it has responsible AI contracts in place, you're not only mitigating risk — you’re building trust.
The MCC-AI are not perfect. They’re not magic. And they won’t instantly solve the complex power dynamics of today’s AI market.
But they are a starting point — and perhaps the most structured one we’ve had so far — to bring legal certainty into an area dominated by technical complexity and ethical grey zones.
What do you think? Are these clauses something you’d actually use? Or are they just another layer of red tape?
Let me know — I’d love to hear your views. Write to me at giulio.coraggio@dlapiper.com.
And if you found this episode useful, subscribe to Diritto al Digitale, turn on the bell so you don’t miss upcoming episodes, and leave us five stars on Apple Podcasts or Spotify.
I am Giulio Coraggio, this is Diritto al Digitale. Arrivederci.