
Looking for legal advice? Half of Brits would ask AI over a lawyer
Artificial intelligence (AI) usage is on the rise. While many people use it for basic admin tasks and everyday queries, a striking half of Brits would also use it for legal decision-making.
That is the finding of a new survey of UK adults by The Legal Director, in which half of the public said they would trust AI with their legal matters and 56% would trust it to read or interpret contracts or terms and conditions.
In a surprising twist, respondents were more willing to trust AI with legal advice than to have it choose their friends (32%) or give them health advice (46%).
The data also showed that men are more likely to recruit AI as an “instant solicitor”, with 55% saying they would rely on it for legal guidance. Women were more cautious, though 47% said they would still use it for advice.
Age played a significant role too. Gen Z topped the list when it came to being pro-AI, while hesitancy increased with every decade. Among those aged 75 and over, 61% said they wouldn’t trust AI with legal advice.
Kiley Tan, a lawyer at The Legal Director, which provides fractional general counsel, says that although AI may seem like a cost-effective solution, people must be aware of potential implications: “Legal services can be expensive and there’s no doubt that AI seems like a clever workaround. But when it comes to law, large language models are not fit for purpose.
“They’re not trained on verified legal content and they don’t understand legal context. The result might look convincing but could be wildly inaccurate. And in law, close isn’t good enough. Most contracts aren’t public documents, so AI lacks access to the depth of content needed to draft them properly. Even the best systems struggle to get it right.”
Despite the proliferation of AI across professional services, the public remains sceptical about its use.
The more personal the task, the lower the trust
The survey also asked people which tasks they would not trust AI to handle. What emerged is a clear pattern. Trust decreases sharply as the task becomes more personal or carries greater consequences.
Surgery topped the list: almost two-thirds (65%) said they wouldn’t trust AI to operate on them or someone they care about, while half (50%) wouldn’t rely on AI to plan a wedding.
People were also wary of handing over everyday responsibilities. Nearly half said they wouldn’t trust AI to pay their bills or do their shopping. These are the kinds of jobs AI is often marketed as being helpful with, but the public’s willingness to delegate them is clearly mixed.
Younger people are more open, but not fully convinced
The 18 to 29 age group was the most open to AI across the board. Even so, 43% still wouldn’t trust it to offer legal advice, and 39% wouldn’t rely on it to read a contract. That openness dropped steadily with age: among those aged 65 and over, more than half were against trusting AI in most of these scenarios.
A small group are fully on board, but they’re the minority
Only 15% of people said they would trust AI to do all the tasks listed. The vast majority want a human involved, especially when the job involves risk, judgement or emotional intelligence.
Sarah Clark, Chief Revenue Officer at The Legal Director says: “AI is brilliant for tasks like scheduling, sorting data or speeding up admin, but when it comes to complex areas like legal advice, it’s crucial not to put too much trust in technology alone. You still need human knowledge and skill to navigate the nuances. It’s not just about getting an answer – it’s about understanding the context, the consequences, and the details. That’s where the human touch really matters.”