Innovation in regulatory approaches to AI

Locked or continuously learning data-driven AI systems will need an innovative regulatory approach

Modifications to regulatory approaches for AI-based medical device software will depend on the type and nature of the algorithm and on the associated risks. Existing principles for categorizing software as a medical device (SaMD) should form the basis for considering these different approaches.

The International Medical Device Regulators Forum (IMDRF) classifies software according to the state of the healthcare situation (critical, serious, or non-serious) and the significance of the information the software provides (to treat or diagnose, drive clinical management, or inform clinical management). In addition, the international standard IEC 62304[1] defines three software safety classes (A, B, and C), based on whether a hazardous situation could arise from failure of the software and on the severity of injury that could result.
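
To make the IMDRF scheme concrete, the short Python sketch below encodes its risk categorization table (categories I–IV) as a simple lookup. The function and key names are illustrative only and are not taken from the IMDRF document itself.

# Illustrative sketch of the IMDRF SaMD risk categorization table.
# Key: (state of healthcare situation, significance of information) -> category.
IMDRF_CATEGORY = {
    ("critical", "treat_or_diagnose"): "IV",
    ("critical", "drive_management"): "III",
    ("critical", "inform_management"): "II",
    ("serious", "treat_or_diagnose"): "III",
    ("serious", "drive_management"): "II",
    ("serious", "inform_management"): "I",
    ("non-serious", "treat_or_diagnose"): "II",
    ("non-serious", "drive_management"): "I",
    ("non-serious", "inform_management"): "I",
}

def samd_category(situation: str, significance: str) -> str:
    """Return the SaMD risk category for a given combination of factors."""
    return IMDRF_CATEGORY[(situation, significance)]

# Example: software that drives clinical management of a serious condition.
print(samd_category("serious", "drive_management"))  # -> "II"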

The level of adaptation of an AI solution will also be important when considering the regulatory approach. Rules-based AI systems can generally be treated in the same way as traditional software[2], whereas locked or continuously learning data-driven AI systems will need innovative treatment. The FDA discussion document, Proposed Regulatory Framework for Modifications to Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD), notes that all AI solutions approved to date have been locked at the point of providing patient care, but that there is an ambition to use continuously learning systems in the healthcare sector in the future.
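
As a minimal illustration of why that distinction matters for regulators, consider a hypothetical linear predictor (the class names and update rule below are assumptions for the sketch, not anything defined in the FDA framework): a locked model's parameters are fixed at clearance, whereas a continuously learning model keeps changing its own behaviour from real-world data after deployment.

import numpy as np

class LockedModel:
    """Parameters are frozen at release; behaviour changes only via a new review."""
    def __init__(self, weights: np.ndarray):
        self.weights = weights  # fixed at the point of regulatory clearance

    def predict(self, x: np.ndarray) -> float:
        return float(self.weights @ x)

class ContinuouslyLearningModel(LockedModel):
    """Updates its parameters from incoming real-world data after deployment."""
    def update(self, x: np.ndarray, y: float, lr: float = 0.01) -> None:
        # One step of online gradient descent on squared error: every call
        # changes the fielded device's behaviour, which premarket review of
        # a single frozen snapshot cannot fully anticipate.
        error = self.predict(x) - y
        self.weights -= lr * error * x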

Collaboration and coproduction between developers, healthcare providers, academia, patients, governments, and statutory bodies across the AI life cycle will be essential for maximizing the deployment of AI. A recent Harvard Business Review article (July 2019) discussed the concept of “AI marketplaces” for radiology, which aim to allow discovery, distribution, and monetization of AI models while providing a feedback loop between users and developers. Similar collaborations could support the life-cycle requirements of AI models, and we therefore recommend establishing a relationship with the IMDRF to develop standardized terminologies, guidance, and good regulatory practices.

The FDA is currently collaborating with stakeholders to build the U.S. National Evaluation System for health Technology (NEST),[3] which aims to generate better evidence for medical devices more efficiently by applying advanced analytics to real-world data gathered from multiple sources.

Similarly, in the UK, new evidence standards have been developed to ensure that digital health technologies are clinically effective and offer economic value.[4] These help innovators and commissioners understand what a good level of evidence looks like.

The impact of AI beyond the traditional boundaries of medical device regulation will also be an important factor, particularly where AI is applied in research, health administration, and general wellness scenarios. Alignment with other regulators (e.g., those governing professional practice, clinical services, research, and privacy) will be critical to successful deployment across the healthcare system. The IMDRF is well suited to host such discussions and to develop the related potential regulatory approaches.

Because AI solutions can learn and adapt in real time, organization-based approaches that assess the capability of software developers to respond to real-world AI performance could become crucial. The U.S. FDA is already considering such approaches, although they may not necessarily align with the EU Medical Device Regulation.


[1] IEC 62304:2006, Medical device software – Software life cycle processes.

[2] See clause 2 of Machine learning AI in medical devices: adapting regulatory frameworks and standards to ensure safety and performance, https://pages.bsigroup.com/l/35972/2020-05-06/2dkr8q4

[3] https://www.fda.gov/about-fda/cdrh-reports/national-evaluation-system-health-technology-nest

[4] https://www.nice.org.uk/Media/Default/About/what-we-do/our-programmes/evidence-standards-framework/digital-evidence-standards-framework.pdf

This is an excerpt from the BSI/AAMI white paper: Machine learning AI in medical devices: adapting regulatory frameworks and standards to ensure safety and performance. To browse our collection of medical device white papers, please visit the Insight page on the Compliance Navigator website.

The Compliance Navigator blog is issued for information only. It does not constitute an official or agreed position of BSI Standards Ltd or of the BSI Notified Body. The views expressed are entirely those of the authors.