AI governance your firm can stand behind.
The SRA expects law firms to have effective governance in place for how AI is used. We help you build it, document it, and where appropriate, certify it to the world’s first international AI management standard, ISO 42001.
Firms must have effective governance structures, arrangements, systems and controls in place to ensure they comply with regulatory and legislative requirements. The use of AI does not change this obligation. It extends it.
This is not optional anymore.
The SRA is clear. Firms using AI must have governance in place. Recent tribunal decisions have confirmed that using AI without adequate supervision is a professional conduct risk, not just a technology risk. The question is not whether your firm needs a governance framework. It is whether yours is fit for purpose.
The SRA expects it
Rule 2.1(a) of the Code of Conduct for Firms requires effective governance, systems and controls. The SRA has confirmed that compliance officers for legal practice are responsible for regulatory compliance when new technology is introduced. That includes AI.
The courts have weighed in
Recent tribunal decisions have established that supervising AI output is a professional obligation, not a choice. Firms that cannot demonstrate how that supervision works are exposed. Documentation is not bureaucracy. It is protection.
Clients are beginning to ask
Commercially sophisticated clients, particularly those in financial services, energy, and the public sector, are starting to include AI governance in their supplier due diligence. ISO 42001 certification answers those questions before they become a barrier to work.
AI mistakes scale differently
A junior lawyer’s error affects one matter. An ungoverned AI tool used across a practice can introduce the same error across dozens. Governance frameworks exist to catch systemic risk before it becomes systemic harm.
Governance that is fit for purpose.
The SRA has stated that firms should appoint a senior individual with overall oversight of AI systems, establish a committee responsible for training and monitoring, carry out regular audits, and keep governance structures agile enough to respond to a changing regulatory landscape. Most firms have not yet done any of this in a documented way.
Practical governance, not paper exercises.
Everything we deliver is grounded in how your firm actually works. We do not apply a generic template and call it a framework. We look at your practice, your tools, your people, and your risk profile, and we build governance that is specific, defensible, and usable.
AI Policy for your firm
A written policy covering acceptable use, prohibited uses, data handling, supervision requirements, and staff responsibilities. Written in plain English. Specific to your practice area and size.
AI Risk Register
A documented register of the AI systems and tools your firm uses, the risks associated with each, the controls in place, and who is responsible. The foundation of any credible governance framework.
Roles and Responsibilities Framework
Clarity on who owns AI governance at firm level: who the senior responsible individual is, what the COLP’s role covers, how oversight is structured, and what gets escalated.
Audit and Review Process
A structured process for regular governance reviews. What to check, how often, who is responsible, and what good looks like. Designed to be sustainable, not a one-off exercise.
ISO 42001 Certification Pathway
For firms seeking formal certification, we build the complete AI Management System required by the standard and guide you through to independent audit and certification. See below for detail.
The standard that changes things.
The world’s first certifiable AI management standard
Published in December 2023, ISO 42001 is the international standard for establishing, implementing, maintaining, and continually improving an AI Management System within an organisation. It is the only AI governance framework in the world that is independently certifiable.
It is not a technical standard about how AI works. It is a governance standard about how organisations use AI responsibly. That distinction matters enormously for law firms, where the question is never about the algorithm. It is always about the people, the processes, and the accountability.
Two stages to certification.
Certification follows a clear sequence. Everything has to be in place before the audit can happen. We handle stage one. The audit itself is conducted independently.
Get everything in place
A certified ISO implementation lead works with your firm to build the complete governance foundation: every policy, framework, and documented process your firm needs, written, structured, and ready for audit. This is the work that has to be done before certification is possible.
The audit
Once everything is in place, an independent auditor conducts the formal ISO 42001 audit. We prepare your firm, coordinate with the certification body, and stay alongside you throughout. The independence of the audit is what gives the certification its weight.
Things people usually ask.
Does my firm actually need ISO 42001 certification?
How is ISO 42001 different from a generic AI policy?
How long does the ISO 42001 certification process take?
What does ISO 42001 certification actually cost?
We already have a data protection policy. Does that cover AI governance?
Not sure what your firm needs?
Tell us where you are and we will tell you what makes sense. If a standalone governance framework is the right fit, we will say so. If ISO 42001 certification makes sense for your firm, we will explain exactly what that involves.