Jodi G. Daniel and Maya Uppaluru

This morning, the Food and Drug Administration released highly anticipated guidance on clinical and patient decision support that has been in the works at the agency for several years, advising the digital health community about how it plans to regulate software that offers recommendations or feedback to its users—both healthcare professionals and patients and caregivers. The guidance also sets out the FDA’s interpretation of the new software provisions in Section 3060 of the 21st Century Cures Act.

Given the explosion of these innovative digital health tools and their strong potential to transform healthcare, this guidance is a significant development for tech companies and investors focusing on this space. Comments will be accepted for 60 days.

Clearer Guidelines for Industry on Clinical Decision Support

Industry has been innovating and evolving rapidly. Machine learning and artificial intelligence (AI) are powering many diverse technologies, both provider and consumer facing. AI products can be found in many households, and are starting to be used for healthcare and wellness purposes. Against this changing landscape, the FDA has released its Clinical Decision Support (CDS) guidance, mandated by the Cures Act.

The FDA’s focus has historically been on functionality rather than on the category of product. This means that a product may have some functionality that is within the FDA’s purview and some functionality that is not. The central concept is that CDS software that allows the user to independently review the basis for its recommendations won’t be subject to FDA regulation as a medical device—because the user, typically a healthcare professional with specialized knowledge and training, won’t need to rely primarily on the software’s recommendations when making care decisions, and instead can rely on their own judgment.

The FDA guidance states, “The intended user should be able to reach the same recommendation on his or her own without relying primarily on the software function.” Information that is available publicly, such as FDA-approved drug labeling or commonly used clinical practice standards, is the type of information that could be independently verified, and therefore the FDA would likely not be interested in regulating software based on that kind of information.

On the other hand, proprietary algorithms are specifically called out as likely requiring regulation. This suggests that many of the products on the market or under development that are based on continuously evolving algorithms will be treated as medical devices.

The FDA lists several examples of software that it says are not regulated medical devices. Common themes among these examples of unregulated products are:

  • Software that references information the medical community “routinely uses in clinical practice, e.g. practice guidelines.”
  • Software that provides recommendations that are “consistent with FDA-required drug labeling.”
  • Software that uses “rule-based tools” that compare patient-specific information with publicly available, generally accepted practice guidelines.

On the other hand, characteristics suggesting that a product is in fact a medical device include:

  • Software that creates individualized or customized recommendations or feedback, such as an individualized care plan, where the health care provider is intended to rely primarily on such feedback in making care decisions.
  • Software that in some way manipulates, analyzes, calculates, or interpolates data—including, for example, use of a proprietary algorithm—in developing its recommendations to the user, such that the user may not be able to independently review the basis for the recommendation. (The FDA clarifies that it will exercise enforcement discretion for software that performs calculations that are routinely used in clinical practice.)
  • Software that analyzes sound waves, breathing patterns, or images is specifically mentioned.

These themes are in line with the FDA’s stance to date, although it is not always easy to predict how they will be applied to different scenarios. As digital health products increasingly adopt machine learning algorithms and AI, it will become harder for providers to independently review the basis for recommendations, and such products may therefore be viewed as higher risk.

New Focus on Decision Support for Patients and Caregivers

Going beyond its mandate in the Cures Act, the FDA also declared that it will apply these same principles of CDS regulation to the regulation of “patient decision support,” or “PDS”—including the question of whether a patient or caregiver can independently review the basis for a product’s recommendations or feedback.

The FDA’s addition of this new category is notable. First, it signals that the FDA recognizes and is responding to the growing influx of health technology products that are being used directly by patients and caregivers. Second, the concept of “independent review” is trickier in this context. Will a patient or caregiver have the time, inclination, and expertise to understand the basis for a software product’s decision support?

Fewer examples are listed for PDS, but the FDA does offer that lower-risk PDS would look something like software that reminds a patient how or when to take a prescribed drug, consistent with the drug’s labeling—again, based on publicly available information or common guidelines. An example of higher-risk PDS that would be regulated as a medical device is a warfarin monitoring device that makes recommendations for dosing based on the outcome of a home blood test.

How Does All This Relate to Wellness Products, EHRs, and Other Software?

Finally, the FDA also has released guidance on categories of digital health products that the Cures Act specified would not be subject to FDA regulation, such as software that is intended to promote general wellness or a healthy lifestyle. Key takeaways include:

  • Some general wellness products will still fall under the category of regulated medical devices, if their intended use is for the diagnosis, cure, mitigation, prevention, or treatment of a disease or condition. But the FDA will not enforce premarket review and premarket notification requirements for general wellness products that present a low risk of harm to patients.
  • Products that make “healthy lifestyle” claims that are unrelated to the diagnosis, cure, mitigation, prevention, or treatment of a specific disease or condition will not be regulated medical devices. Examples could include weight management or mindfulness applications.

FDA also clarified that ONC-certified electronic health record (EHR) systems, as well as mobile apps that enable individuals to interact with these EHR systems, are not medical devices—with the rationale that these systems are the equivalent of a paper medical chart. It’s unclear how this concept will apply in the future as EHR products and connected mobile apps continue to advance and look less like their paper chart predecessors. Additionally, the FDA notes that personal health records (PHRs) also are not medical devices as long as they meet the same requirements listed above for other non-regulated products, and alludes to future guidance on multi-functionality.

The FDA seeks public feedback on these proposals. This represents a major opportunity for digital health companies to make their voices heard in an area that is critically important for the design of products that will shape the future of healthcare. Electronic comments can be submitted until February 6, 2018 via https://www.regulations.gov. For further assistance, please contact Jodi Daniel (jdaniel@crowell.com) and Maya Uppaluru (muppaluru@crowell.com).