MHRA, FDA and Health Canada offer “guiding principles” for predetermined change control plans
Regulators from the U.S., the U.K., and Canada previously released guiding principles for good machine learning practices. Now, the trio has published a new series of “guiding principles” for predetermined change control plans (PCCPs) for machine learning-enabled medical devices. The regulators suggest that PCCPs need to be transparent, risk-based, and bounded, and should operate throughout the product life cycle.
International regulators are working on policy for artificial intelligence
- According to the International Medical Device Regulators Forum (IMDRF), machine learning (ML), a subset of artificial intelligence (AI), “allows ML models to be developed by ML training algorithms through analysis of data, without models being explicitly programmed.” In 2022, IMDRF finalized a technical document defining key terms related to AI/ML in medical devices, including the term machine learning-enabled medical device (MLMD): “A medical device that uses machine learning, in part or in whole, to achieve its intended medical purpose.” The IMDRF AI/ML working group is working “to develop new documentation on the topic of Good Machine Learning Practice (GMLP), to provide internationally harmonized principles to help promote the development of safe and effective artificial intelligence/machine learning-enabled (AI/ML) medical devices.”
- Internationally, regulators are building a framework for AI/ML in regulated products. In 2021, a collaboration between the U.S. FDA, the U.K. Medicines and Healthcare products Regulatory Agency (MHRA) and Health Canada released a series of “joint principles” on GMLPs. The ten principles were intended to “lay the foundation for developing” GMLPs and “identify areas” that should be addressed by IMDRF. At the time, the regulators specifically noted that the areas identified could serve to inform future research, resources, opportunities for policy harmonization, or the building of consensus standards. It’s worth noting that the IMDRF AI/ML Working Group is chaired by Health Canada (Russel Pearson) and the U.S. FDA (Matthew Diamond), and MHRA is a participant in that working group.
- There are still regulatory questions on AI/ML in medical device development and the total product life cycle that need to be addressed. In particular, the U.S. FDA has been advancing policy on predetermined change control plans (PCCPs). The FDA first issued a 2019 discussion draft that outlined a potential regulatory framework; that document was followed by legislative changes to the FDA’s authorizing statute in late 2022 that clarified the legality of such plans. Earlier this year, the agency issued draft guidance on submissions of PCCPs for AI/ML medical devices. The FDA’s high-level concept of PCCPs is that they are “reviewed as part of a marketing submission to ensure the continued safety and effectiveness of the device without necessitating additional marketing submissions for implementing each modification described in the PCCP.” In effect, the sponsor can pre-specify (and FDA can authorize) anticipated significant changes to a device under a PCCP, which can then be implemented not through a new marketing submission but through documentation in the quality system, as long as the PCCP is followed. The PCCP itself, once authorized, is considered part of the device (a “technical aspect” of the device).
- That draft guidance introduced a new term: ML-DSF, or machine learning-enabled device software function. The definition generally aligns with the IMDRF’s definition of a machine learning-enabled medical device (MLMD), although it also incorporates the FDA’s statutory directive to define (and regulate) software as a medical device by function.
This week, MHRA, FDA and Health Canada released a second collaboration: Principles for predetermined change control plans
- The PCCP guiding principles are similar to those from the GMLP collaboration and, per the FDA’s announcement, were informed by that effort. In fact, the new PCCP “guiding principles… draw upon the overarching GMLP guiding principles, in particular principle 10” from that collaboration: that deployed ML models are “monitored for performance” over time and that “re-training risks are managed.” PCCPs are intended to help developers move post-deployment considerations and guardrails earlier in the process, to help plan for adjustments and potential changes over time. According to the MHRA, “these guiding principles will help to ensure alignment between our jurisdictions on PCCPs and products utilising them.”
- Similar to the GMLP principles, the PCCP principles appear to be intended as a jumping-off point. The principles themselves are not meant to serve as robust guidance for developers; instead, they are intended to “provide foundational considerations that highlight the characteristics of robust PCCPs” and “foster ongoing engagement and collaboration among stakeholders,” as well as “lay a foundation for PCCPs and encourag[e] international harmonization.” In effect, they are intended to help build consensus on what should be top of mind for regulators developing guidance on the subject, and to lay out a framework of elements to be included that will align across jurisdictions.
- The guiding principles define a PCCP as “a plan, proposed by the manufacturer, that specifies” three key things: the planned modifications to a device, the protocol for implementing and controlling those modifications, and how the impacts of the modifications will be assessed. Notably, the document acknowledges that “PCCPs may be developed and implemented in different ways in different regulatory jurisdictions.”
- The five guiding principles are that PCCPs should be: focused and bounded; risk-based; evidence-based; transparent; and focused on the total product life cycle (TPLC). Drilling down, a PCCP needs to describe the specific, planned changes and how device modifications will be verified and validated using defined methods and metrics, setting out in advance what would happen if and when performance criteria are not met. PCCPs should follow the principles of risk management from inception through use, drawing on evidence gathered during the TPLC to show that benefits continue to outweigh risks and that risks are controlled.
- Metrics and methods that measure device performance should be based on science and clinical evidence, and should “demonstrate the benefits and risks of the device before and after changes specified in the PCCP are implemented.” To ensure transparency, manufacturers should provide detailed plans so that stakeholders understand the device’s performance both before and after any planned or potential modifications. Data used for both development and modifications should reflect the intended use population, and the PCCP should describe how device performance will be monitored and how deviations will be detected and addressed throughout the total product life cycle.
- FDA has an open docket for feedback on the PCCP guiding principles document. It’s worth noting that this isn’t a new docket; instead, the FDA has added the guiding principles to its existing docket on AI/ML issues, which also includes its 2019 Discussion Paper (cited above) and a 2021 workshop on transparency issues for AI/ML. [ Read AgencyIQ’s analysis of that meeting here.]
Analysis
- The PCCP guiding principles come after draft guidance from Health Canada and the FDA. AgencyIQ discussed FDA’s draft guidance on PCCPs in March of this year. That document offered extensive background information on how PCCPs fit into the U.S. regulatory space and how to include them in premarket submissions. Notably, the FDA also maintains a list of AI/ML-enabled medical devices that have been authorized in the U.S., which after an October 2023 update stands at nearly 700 products (primarily in radiology).
- Meanwhile, Health Canada recently released a draft guidance on premarket submissions for MLMDs. The Canadian draft document focused on transparency, but also introduced the concept of the PCCP. Health Canada referenced the IMDRF document on key terms and definitions for MLMDs (IMDRF N67). The Canadian regulator recognized that MLMDs may span all of the risk classes (I to IV) and may be classified as either a medical device or an IVD. The regulator’s recommendations encompassed GMLP and PCCPs, and noted that Canadian populations should be explicitly addressed in these submission types.
- Engagement with and from IMDRF. The collaboration among FDA, MHRA, and Health Canada continues to put out specific guiding principles on AI/ML issues, even as FDA and Health Canada co-chair the AI/ML working group at IMDRF (with MHRA as a participant). Since the trio put out the GMLP “joint principles” in 2021, that IMDRF working group finalized its first-edition technical document on key terms and definitions of MLMDs in May 2022. However, it’s not entirely clear whether the guiding principles documents are informing IMDRF workstreams, or whether the trio of FDA, MHRA and Health Canada is building its own consensus documents. With the FDA’s Center for Devices and Radiological Health set to monitor its own progress on international harmonization over the next few years, these “guiding principles” efforts are likely to feature prominently.
To contact the author of this item, please email Corey Jaseph (cjaseph@agencyiq.com) or Laura DiAngelo (ldiangelo@agencyiq.com).
To contact the editor of this item, please email Kari Oakes (koakes@agencyiq.com).
Key Documents and Dates
- Guidance: Predetermined Change Control Plans for Machine Learning-Enabled Medical Devices: Guiding Principles, published October 24, 2023.
- News story: MHRA and international partners publish five guiding principles for machine learning-enabled medical devices, published October 24, 2023.
- FDA Bulletin: Predetermined Change Control Plans for Machine Learning-Enabled Medical Devices: Guiding Principles, published October 24, 2023.
- Feedback: FDA Public Docket FDA-2019-N-1185