AI in healthcare: why regulation is the key to trust
Artificial intelligence is reshaping healthcare. From diagnostic imaging to patient monitoring, AI systems are being deployed to enhance efficiency, accuracy, and access to care. But in a field where lives are at stake, innovation cannot come at the expense of safety. This is why regulation in AI health technology is not a bureaucratic obstacle, but a foundation for trust and sustainable adoption.
The benefits of regulation extend beyond individual patients. In countries such as the UK, where the NHS faces rising costs from increasing skin cancer diagnoses, tools that are both innovative and certified can relieve pressure on the system. Non-melanoma skin cancer (NMSC) is the most common cancer in the UK, and projections estimate that by 2025, NMSC diagnoses will approach 400,000 annually, with associated costs expected to rise from around £289–£399 million in 2020 to approximately £338–£465 million in 2025.
Raising the bar for digital health
AI health technologies hold immense potential, but they also carry risks if not carefully validated. Algorithms trained on incomplete or biased data may produce inaccurate results. Systems not tested across diverse conditions may fail in real-world use. And without clear oversight, claims about accuracy or safety can remain unverified.
Regulation ensures that technologies introduced to patients meet consistent standards of safety, reliability, and transparency. In Europe, the Medical Device Regulation (MDR) sets the strictest regulatory framework for demonstrating safety and reliability. It applies not just to traditional medical devices such as CT scanners and syringes, but also to software and AI-driven health apps. By holding these tools to the same bar, regulators safeguard patients while guiding the industry towards higher quality.
The MDR has fundamentally reshaped the medical software landscape, but few AI health apps have met its standards. According to a 2024 MedTech Europe survey, many manufacturers are transitioning only a fraction of their portfolios, sometimes less than 5%, and large device makers are now 33% less likely to launch in the EU first. A peer-reviewed analysis of over 2,000 legacy software entries shows the impact of MDR’s Rule 11: where 53% of devices were previously classified as low-risk Class I, today 55% are Class IIa. As a result, roughly three-quarters of existing products will require costly reclassification by 2028.
The technology industry and regulation have not always been pulling in the same direction. Take the EU AI Act, which has caused heated debate about whether regulatory oversight hinders innovation. In the healthcare industry, though, it is widely accepted that regulation is critical to building trust among patients and medical professionals. If AI-enabled technologies are to be applied to support people’s health, it is essential that the software and devices undergo rigorous testing and approval. In our experience at SkinVision, the work we have undertaken has enabled us to build a body of scientific evidence and validation that has given healthcare professionals and consumers confidence. The discipline of going through this process has been integral to the commercial progress we have achieved, and we would encourage medical device and software vendors to embrace it.
Yes, the process is rigorous, and for smaller or early-stage companies it will be resource-intensive, so a balance between rules and support for innovation is important. However, having the right framework for AI adoption has benefited SkinVision: it has made the technology more robust and strengthened the company’s credibility in the eyes of commercial partners.
Understand the regulator’s process
SkinVision benefited from early engagement with the regulatory compliance process. For example, in working towards CE marking under the EU Medical Device Directive (MDD), the company gained a clear understanding of how the European regulator operates, which enabled the right internal structures to be established and guided the company forward. That insight proved invaluable when working towards the higher Class IIa certification under the EU MDR. The MDR ensures that unverified apps are not allowed to operate unchecked, and that digital tools marketed as medical devices meet the same evidence-based standards as more traditional technologies.
To achieve Class IIa certification, SkinVision had to clearly demonstrate its medical purpose: detecting skin cancer risk through photos taken with the app. The company also needed to prove its accuracy, safety, and consistency across different device types and conditions. An independent EU-appointed team of medical technology experts reviewed the system and confirmed its reliability. Along the way, a body of scientific evidence was built, including 11 peer-reviewed clinical studies published in leading medical journals. For any company looking to commercialise AI technology in a healthcare setting, this examination – and the subsequent approval – underlines its credibility and commitment to prioritising patient safety.
Regulatory compliance supports commercialisation
Compliance with medical device regulations is a cornerstone of successful commercialisation. It enables market access by meeting essential buyer requirements. It builds trust among clinicians, patients, and partners by proving that a product is safe, effective, and reliable.
Strong compliance systems should also reduce liability and the risk to partners from adverse events and other service-related incidents, protecting both revenue and reputation. For growing companies, a quality management system certified to ISO 13485 supports scalability and smoother entry into new markets.
Regulatory compliance also attracts investors and partners who look for operational discipline and long-term reliability. Beyond meeting legal standards, it strengthens brand credibility and ensures sustained success in competitive healthcare markets.
Having achieved regulatory compliance, SkinVision has rapidly scaled to more than 30 global partnerships with insurers and health providers, demonstrating that commercial success stems from an effective compliance and governance strategy.
Regulation as a path to trust
The central question for AI in healthcare is not whether it can innovate (it already has), but whether patients, providers, and policymakers can trust it. Regulation provides that assurance. It requires transparency of claims, evidence of effectiveness, and ongoing monitoring for safety.
Without regulation, AI health tools risk being dismissed as unproven or unsafe. With it, they can become trusted components of clinical practice. In this sense, MDR and similar frameworks are not barriers to progress, but enablers of responsible adoption.
SkinVision’s achievement of MDR Class IIa certification is just one example of how AI health tools can meet this bar. More importantly, it shows that regulation works: it protects patients, guides innovation, and builds the trust necessary for long-term impact.
As AI continues to advance, regulation will remain essential. It is not a hurdle to overcome, but the foundation of safe, effective, and trusted healthcare innovation.