
"Uri made one point that should hit close to home for in-house teams: AI is already in your systems. It is embedded in software tools, layered into services, and working behind the scenes in platforms your business uses every day. Most of that came through the door via contracts, vendor agreements, procurement deals, license terms, and partner relationships. If you don't know what you've agreed to, you can't govern it."
""You have to watch the regulations," Uri said. "There's a lot of regulatory uncertainty. But you also have to understand your own internal use. What AI is already in your company? What have you already exposed yourself to?" That's not just a procurement problem. It's a legal risk. Terms around data use, liability, privacy, and intellectual property are often buried deep in vendor contracts."
Generative AI adoption in legal departments can outpace foundational governance, creating unmanaged risk. Governance begins with contracts, since AI often enters organizations through vendor agreements, procurement deals, license terms, and partnerships. Many agreements contain terms about data use, liability, privacy, and intellectual property that determine exposure. Organizations must inventory and map AI presence across systems and software to understand internal use and contractual obligations. Emerging regulations such as the EU AI Act increase expectations for transparency about internal AI usage, making contractual visibility and proactive governance essential to manage legal and compliance risk.
Read at Above the Law