77% of companies using AI fail to govern client data

New research commissioned by the secure legal AI provider LEGALFLY has revealed widespread enthusiasm for AI adoption across organisations, but also a significant governance gap: 77% of firms using AI with client data lack a fully developed AI policy.

The research found that 90% of firms are actively using AI tools, yet only 18% have fully implemented governance frameworks, raising concerns over data protection and compliance risks.

The study, based on a survey of 154 General Counsels across the UK, Germany, and France, found that legal and compliance departments are among the heaviest AI users, with 77% regularly using tools such as ChatGPT and Microsoft Copilot. However, just 29% of organisations report that their AI policies are always followed – creating substantial risks around how sensitive client data is protected.

The governance challenge is compounded by the fact that 34% of organisations process client and vendor data through AI systems, yet many lack adequate safeguards for this sensitive information.

Strikingly, 24% of firms admit they have limited understanding of which AI tools are being used within their organisation and no centralised record of AI usage, exposing them and their clients to significant compliance and data protection risks. This comes as high-profile incidents such as the Legal Aid Agency data breach, widely reported as an IT failure, and courtroom AI hallucinations have highlighted the real-world consequences of inadequate AI oversight.

The findings highlight a growing disconnect between AI usage and oversight, particularly where sensitive information is involved. More than half (56%) of organisations surveyed use AI on internal company information, including financial data, while 34% process client and vendor data. Yet governance frameworks often lag far behind actual usage.

The governance gap is particularly pronounced among smaller businesses in the research. Among firms with 200–499 employees, 29% say they have no visibility over AI tool usage – and none reported having a fully developed governance framework. Larger enterprises (2,500+ employees) fare only slightly better: just 32% have comprehensive AI governance policies, highlighting significant room for improvement.

The report emphasises that governance accelerates AI adoption rather than hindering it: organisations with formal AI governance frameworks report 70% daily AI usage, compared with just 7% daily use where no framework exists. It urges organisations to embed AI governance into their core operations rather than treating it as an add-on, establishing regular audits, clear approval processes, and comprehensive data protection protocols.

Based on the research findings, LEGALFLY has published a six-step framework for General Counsels to strengthen AI governance, including conducting urgent AI usage audits and implementing mandatory governance training.

Ruben Miessen, LEGALFLY Co-Founder and CEO, commented: “We’re seeing companies race ahead with AI implementation while leaving their risk frameworks in the dust. Incidents like the Legal Aid Agency data breach and reports of AI hallucinations causing errors in court filings show systemic vulnerabilities can be exposed, often without clear attribution.

“But the answer isn’t to limit AI usage – it’s about enabling it responsibly. AI adoption is progressing at breakneck speed – 28% of enterprises with 2500+ employees already see it as an ‘essential tool’ to their business operations. Policies that limit innovation are therefore likely to result in increased 'shadow AI' usage.

“Strong governance enables safe, confident adoption at scale. Our key recommendations to plug this gap include prioritising AI governance at board level, establishing common risk frameworks, and replacing shadow AI with approved, monitored tools. General Counsels must use this data to make the urgent case for AI governance investment.”

The full report is available to download here.