Software to Predict Risk of Sepsis and Stroke Should Be Regulated as a Medical Device, Says FDA


Dive Brief

  • Certain types of risk-assessment tools should be regulated as medical devices, according to final guidance issued by the Food and Drug Administration on September 28.
  • Many of these software tools have been exempted from FDA regulation under the 21st Century Cures Act, as long as the health care provider can independently review the basis for the recommendations and does not rely on them to make a diagnostic or treatment decision.
  • Some device makers welcomed the rules, saying they provide more clarity on the FDA’s thinking, though they warned that products could take longer to reach the market; others saw the guidance as exceeding congressional intent.

Dive Insight

The guidelines apply to clinical decision support software, a broad category of software tools that can help physicians and other healthcare providers detect disease and generate alerts when the status of a patient changes. For example, software that uses electrocardiogram (ECG) data to detect arrhythmias, such as that used by Apple and Fitbit smartwatches, is regulated as a medical device. The same goes for software that analyzes images to differentiate between ischemic and hemorrhagic strokes, or that is used by radiologists to “triage” patients and review potential cases of pulmonary embolism.

“The 21st Century Cures Act was very clear that anything that directly scans a medical image would be a medical device,” said Christina Silcox, research director for digital health at the Duke-Margolis Center for Health Policy.

The biggest policy change, the FDA noted in the guidance, is that certain types of predictive tools, which previously fell into a gray area, should now be regulated as medical devices. The agency gave an example of software that analyzes patient information to detect stroke or sepsis, and generates an alarm to notify a healthcare provider.

Sepsis detection

Several of these risk-scoring tools are already in use: Electronic health record companies Epic Systems and Cerner have both created sepsis monitoring tools meant to flag the deadly condition before patients deteriorate. Neither is FDA-approved, and a study last year by Michigan Medicine found that Epic’s tool performed worse than expected.

“I think there’s a distinction in people’s minds around prediction and diagnosis. And you’ve seen that with sepsis, where we’re not saying they’ve got sepsis now, we’re predicting they’re going to have sepsis,” Silcox said. “In a way, I think the FDA said a risk score, the prediction, it’s still a device. That was a space where people weren’t quite sure.”

Bradley Merrill Thompson, an attorney at Epstein Becker Green in Washington, D.C., who advises clients on FDA regulations, wrote that part of the FDA’s argument is about automation bias, the concern that healthcare providers will rely too heavily on automated suggestions.

“The FDA is desperately trying to distinguish between software that is overconfident in its recommendations (Dr., the diagnosis is X) and software that is more hesitant in its recommendation (Dr., the diagnosis is probably X, but possibly also something else). But this whole line of argument is a new addition since the draft guidance,” he wrote in an email. “Apparently the FDA has a very low opinion of healthcare professionals if the agency thinks that simple software can ‘direct’ doctors what to do, as if the doctor then had no choice.”

For patients, these tools are often used without their knowledge, and nothing in the guidelines suggests they should be told, Silcox said. But patients could still benefit from the assurance that someone has reviewed the software tools to make sure they are going to work, she said.

“There are always trade-offs, but in my view, this guidance could bring a lot of benefit to patients,” Kellie Owens, assistant professor of medical ethics at NYU’s Grossman School of Medicine, wrote in an email. “It could help prevent patients from living under the jurisdiction of clinical decision support software that either doesn’t work or that systematically blocks certain groups from getting the care they deserve.”

The guidance comes amid a wider discussion about how biases in medical devices can affect patient care. Several recent studies have found that pulse oximeters overestimate blood oxygen levels in patients with darker skin. Sutter Health researchers recently found that the inaccuracy led to hours-long delays in Black patients with COVID-19 receiving supplemental oxygen treatment.

What this means for software developers

Epic Systems declined to comment on the guidance. Suchi Saria, an associate professor at Johns Hopkins and CEO of Bayesian Health, a startup with a sepsis alert platform, supported the changes.

The lack of clarity on how to assess the quality of software solutions has made it difficult for health systems and provider groups to confidently adopt these kinds of tools, Saria said. They have had to turn to published, peer-reviewed research, which is not available for all of the sepsis alert tools on the market.

“Traditionally, when it comes to drugs and devices, organizations have relied on groups like the FDA to validate, evaluate and approve certain products. In that sense, I think it’s very exciting that there’s more clarity now through organizations like the FDA in the evaluation of these types of products,” Saria said.

She cautioned that FDA review of these software products could slow their path to market, given the many processes involved in evaluating the safety and quality of machine learning solutions.

Despite these concerns, Saria expects the changes to be a net benefit for her business overall.

“[The] FDA coming in and wanting to play a bigger role in oversight really only helps Bayesian’s case because it allows us to accelerate that program,” she said. “We believe there is so much noise in the market, and it is preventing credible solutions from being implemented more quickly.”

The Mayo Clinic, which is developing a suite of ECG algorithms with spin-off company Anumana, said the examples in the guidelines are helpful in evaluating how the law applies to software products.

“Overall, FDA’s regulatory policy for the use of clinical decision support software is substantially the same as its draft guidance and is consistent with FDA’s risk-based approach to regulating medical devices more broadly,” wrote Colin Rom, who leads science and innovation policy for Mayo Clinic’s government engagement team. “Mayo Clinic believes that these policies and a risk-based approach will enable the development and deployment of safe, effective, and ethical digital health technologies for our patients.”

Enforcement

There are still a lot of details to iron out, including how the changes will be applied to software tools that are now considered devices.

“I think the guidelines provide useful clarity about the agency’s intentions to regulate certain clinical decision support tools, but also leave many questions about what that regulation will ultimately entail,” NYU’s Owens wrote.

An FDA spokesperson wrote in an email that the agency has “the discretion about when to take appropriate risk-based action, as facilitated by publicly available guidance. The guidance outlines current thinking of the agency on a subject and should be considered recommendations, unless specific regulatory or statutory requirements are cited.”

The agency added that developers are encouraged to work with the FDA as early in the process as possible to get their questions answered, through the Center for Devices and Radiological Health’s pre-submission program.

“I absolutely believe that we should take [the FDA] literally, and that they would apply these guidelines to regulate many software programs that are currently on the market but for which developers have not sought to comply with FDA requirements,” Thompson wrote.

Whether the agency will regulate risk-scoring tools that hospitals develop and use internally was not addressed in the guidelines, but the FDA has long had a policy of not regulating the practice of medicine, Thompson added. Vendors that develop products for sale to multiple institutions would have to comply, he said.

Martin E. Berry