Note: this is a recording of a LinkedIn audio event with Eric Maass. The article below highlights key insights from the discussion.
Imagine you are silently observing a team analyzing the risk(s) of harm for their new medical device under development. It may be a slight tweak to an existing design, or a completely new and innovative product. The team is working through a design FMEA [1] template led by a facilitator who is trying to capture notes on their computer. Everyone can see what is being entered in each line item of the template on the big screen in the conference room.
The progress is slow. There is a lot of discussion on each line item intended to document different ways a medical device can fail to perform its intended function. The team must identify these failure modes, their causes and effects, and then estimate their individual risk level. Even for a simple device, there may be hundreds of different failure modes, and each failure mode has to be clearly identified and analyzed.
The discussion becomes more animated when it is time to estimate the risk level based on the severity and likelihood of a specific failure mode, cause and effect combination. It is not unusual to see many divergent opinions, especially when not a lot of data is available to quantitatively estimate these parameters.
As individuals, we each look at risk differently. Some of us are highly analytical, while others take a more intuitive approach to risk and go with their gut. In a group setting, our thinking and behavior are also influenced by organizational culture.
Risk management is about uncertainty management. If you are facilitating a meeting where your team is working on risk analysis, you have to be aware of these undercurrents that can inadvertently introduce a lot of cognitive bias in the process.
Listen to the recording of our conversation above to learn more.
Here are a few key highlights from this discussion.
1. Three approaches to making sense of uncertainty: heuristic, deterministic, probabilistic
In our personal lives, we make many decisions involving risk on a daily basis. In most situations, our decisions are quick without any detailed analysis. This is because we have learned the rules of the game in life that help us make quick decisions that are lawful, safe and effective. These decisions are driven by heuristics, which are mental shortcuts that allow us to solve problems and make judgments efficiently.
When we don’t have a lot of objective and quantifiable information available about potential outcomes, we tend to project based on our prior experiences, values and beliefs. Sometimes this is called using common sense, or going with our gut. In most situations involving uncertainty, we make a reasonably good decision, but our subjective judgments are often prone to bias and error [2].
We rely on a limited number of heuristic principles which reduce the complex tasks of assessing probabilities and predicting values to simpler judgmental operations. In general, these heuristics are quite useful, but sometimes they lead to severe and systematic errors.
A heuristic-based approach to risk often results in wide variations in risk perceptions under different circumstances. This article [3] explores how different people rate the same situation differently based on their individual risk perception.
Engineers, on the other hand, are trained to take a more systematic approach by developing models that relate inputs to outputs in a cause-and-effect manner to explain and predict behavior more accurately.
These can be conceptual or physical models. They can also be mathematical models described by one or more equations, or empirical models built from observational data gathered through carefully designed experiments or simulations. We can think of this model-based approach as a deterministic approach to risk. It is more powerful than the heuristic-based approach, but it is not always reliable due to assumptions or other limitations in the model. We cannot extrapolate predictions beyond the boundaries of the model, and point estimates may carry a significant degree of variance.
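To make this concrete, here is a minimal sketch of what a deterministic, model-based estimate looks like in code. The infusion pump example, the parameter names and the nominal values are hypothetical, chosen purely for illustration; the point is that the model returns a single point estimate and says nothing about the variation in its inputs.

```python
# A minimal sketch of a deterministic, model-based estimate (hypothetical example).
# A simple transfer function relates inputs to an output as a single point estimate,
# with no accounting for variation in the inputs.

def delivered_volume_ml(flow_rate_ml_per_hr: float, infusion_time_hr: float) -> float:
    """Deterministic model: delivered volume = flow rate x infusion time."""
    return flow_rate_ml_per_hr * infusion_time_hr

# Point estimate using nominal input values (illustrative numbers only)
nominal_volume = delivered_volume_ml(flow_rate_ml_per_hr=50.0, infusion_time_hr=2.0)
print(f"Nominal delivered volume: {nominal_volume:.1f} mL")  # 100.0 mL
```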
There is a third, more recent approach to understanding and evaluating uncertainty, derived from probabilistic (or stochastic) modeling techniques such as Monte Carlo simulation, variance transmission, discrete event simulation, or Bayesian methods. These methods help us estimate the level of uncertainty due to factors beyond our control.
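As one illustration of how a probabilistic technique differs from a point estimate, the sketch below runs a simple Monte Carlo simulation on the hypothetical infusion model above. The input distributions are assumptions made up for this example; instead of a single number, we get a distribution of outcomes and an interval that conveys the level of uncertainty.

```python
# A minimal Monte Carlo sketch (hypothetical example, building on the deterministic
# model above): sample the inputs from assumed distributions and look at the spread
# of the output instead of a single point estimate.
import random

def delivered_volume_ml(flow_rate_ml_per_hr: float, infusion_time_hr: float) -> float:
    return flow_rate_ml_per_hr * infusion_time_hr

random.seed(42)
n_trials = 100_000
volumes = []
for _ in range(n_trials):
    # Assumed input distributions (illustrative only)
    flow = random.gauss(50.0, 2.5)      # flow rate: mean 50, std dev 2.5 mL/hr
    time_hr = random.gauss(2.0, 0.1)    # infusion time: mean 2, std dev 0.1 hr
    volumes.append(delivered_volume_ml(flow, time_hr))

volumes.sort()
mean = sum(volumes) / n_trials
p05 = volumes[int(0.05 * n_trials)]
p95 = volumes[int(0.95 * n_trials)]
print(f"Mean delivered volume: {mean:.1f} mL")
print(f"90% interval: {p05:.1f} to {p95:.1f} mL")
```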
Medical devices operate in a complex environment. It is not easy to model their behavior precisely in different use environments. Although regulatory controls for labeling [4] can help limit many unforeseen errors during use, we cannot account for all potential scenarios at the design stage. As an example, we cannot account for all patient-related factors that may be important. That is why it is a good idea to use these more advanced techniques to estimate the level of uncertainty in the predictions and estimates from our models. As more information becomes available during the post-market phase, we can refine these estimates to improve our level of confidence.
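As a hedged illustration of refining an estimate with post-market data, the sketch below applies a simple Bayesian (Beta-Binomial) update to a made-up event rate. The prior, the observed counts and the choice of model are all assumptions for illustration only, not the specific methods discussed in the conversation.

```python
# A minimal sketch of refining an estimate as post-market data arrives (hypothetical numbers).
# A Beta prior on the probability of a harmful event is updated with observed field data,
# narrowing the uncertainty around the estimate.
import random

# Assumed prior belief: roughly 1 event per 1,000 uses
alpha_prior, beta_prior = 1.0, 999.0

# Hypothetical post-market data: 3 events observed in 10,000 uses
events, uses = 3, 10_000

# Conjugate Beta-Binomial update
alpha_post = alpha_prior + events
beta_post = beta_prior + (uses - events)

posterior_mean = alpha_post / (alpha_post + beta_post)
print(f"Posterior mean event probability: {posterior_mean:.2e}")

# Approximate 95% credible interval by sampling the posterior
random.seed(0)
samples = sorted(random.betavariate(alpha_post, beta_post) for _ in range(50_000))
lo, hi = samples[int(0.025 * len(samples))], samples[int(0.975 * len(samples))]
print(f"Approximate 95% credible interval: {lo:.2e} to {hi:.2e}")
```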
Understanding uncertainty in both benefits and risks is important to ensure continued safety and effectiveness of medical devices.
2. Watch out for groupthink
There are other factors at play when we are working in a group. In an environment which is unsupportive of independent thinking and experimentation, we tend to seek the relative safety of the group to avoid criticism and failure. Even when we disagree with the dominant position in the group, we tend to go along and not speak up. In other situations, particularly where organizational hierarchy plays an important role in decisions, we tend to hold back in deference to those in a higher position.
We don’t know what to do when we don’t know everything, and in those situations we rely on what other people are doing. This heuristic-based thinking leads to the cognitive bias of groupthink.
As an example, if the dominant group position is that life-threatening injury or death could occur in a certain situation, we are likely to go along with the conclusion that risk is very high even when we personally believe that the likelihood of such a scenario is extremely low.
On the other hand, if the dominant group position is that the likelihood of a certain scenario is extremely low, we are likely to go along with the conclusion that risk is very low even when we personally believe that the severity is very high.
It is important to remind ourselves that risk is not either-or. It is a combination of both severity of the outcome and its likelihood of occurrence.
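A minimal sketch of this idea is shown below. The rating scales, the multiplicative scoring and the acceptability thresholds are all made up for illustration (real criteria are defined in each manufacturer's risk management plan); the point is simply that neither severity nor likelihood alone determines the risk level.

```python
# A minimal sketch of combining severity and likelihood into a risk level
# (illustrative scales and acceptability thresholds, not from any standard or the discussion).

SEVERITY = ["negligible", "minor", "serious", "critical", "catastrophic"]
LIKELIHOOD = ["improbable", "remote", "occasional", "probable", "frequent"]

def risk_level(severity: str, likelihood: str) -> str:
    """Look up a qualitative risk level from a simple 5x5 matrix."""
    s = SEVERITY.index(severity) + 1
    l = LIKELIHOOD.index(likelihood) + 1
    score = s * l  # combine both dimensions
    if score >= 15:
        return "unacceptable"
    if score >= 5:
        return "further risk reduction required"
    return "acceptable"

# Neither dimension alone determines the outcome.
print(risk_level("catastrophic", "improbable"))  # further risk reduction required
print(risk_level("minor", "frequent"))           # further risk reduction required
print(risk_level("catastrophic", "frequent"))    # unacceptable
```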
This kind of groupthink [5] is not limited to analyzing and evaluating risk. Even so-called brainstorming activities are prone to distortion from groupthink, where individual ideas and creativity may be suppressed (or shot down) by the group.
It is very important for the facilitator to recognize the effect of groupthink and call it out for everyone to see.
As a best practice, take a pause and ask team members to first work on their own. Then go around the room asking each individual to present their ideas (or analysis of risk) without any interruption from others. Once all individual ideas are accurately documented, the group discussion can lead to a better result.
Another best practice is to avoid involving people in a position of higher authority in brainstorming or working group discussions. The presence of a person of higher authority generally introduces other complex dynamics in the group that do not directly contribute to the problem at hand. It is best to work with the group first, then summarize the key conclusions in a separate presentation to the management team.
Groupthink is real! As practitioners of risk management and as leaders in this field, we need to be aware of this phenomenon and take appropriate steps to avoid it.
3. Focus on compliance leads to heuristics-based thinking
In heavily regulated industries such as medical devices, there is a strong focus on regulatory compliance. Non-compliance can have serious consequences such as warning letters, mandatory recalls, product seizures, injunctions and criminal prosecution [6]. The natural tendency toward loss avoidance is at play when we focus more on documentation than on the true quality that is essential for consistently delivering safe and effective products.
However, Compliance is not the same as Quality. Even the FDA has realized that an excessive focus on compliance has not led to the expected level of improvement in product quality and patient safety. That is why the Agency launched the Case for Quality [7] (CfQ) program in 2011 and encouraged medical device manufacturers to participate in a Voluntary Manufacturing and Product Quality Pilot Program.
CfQ is intended to elevate the focus of all medical device stakeholders from baseline regulatory compliance to sustained, predictive practices that advance medical device quality and safety to achieve better patient outcomes.
Focusing on Quality involves applying statistical and other quantitative techniques to optimize the achievement of quality and process performance objectives. This is different from the fear-driven, deterministic approach to compliance. The automotive industry, for example, is heavily regulated and has a strong focus on compliance, but it has also evolved and matured in its application of basic quality principles over the last several decades.
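As one hypothetical illustration of such a quantitative technique, the sketch below computes a process capability index (Cpk) from made-up measurement data against assumed specification limits; the data, the limits and the choice of Cpk are assumptions for illustration only.

```python
# A minimal sketch of one quantitative quality technique: a process capability index (Cpk)
# computed from hypothetical measurement data against assumed specification limits.
from statistics import mean, stdev

# Hypothetical measurements of a critical dimension (mm) and assumed spec limits
measurements = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.00, 10.04, 9.96]
LSL, USL = 9.85, 10.15  # lower / upper specification limits (assumed)

mu = mean(measurements)
sigma = stdev(measurements)  # sample standard deviation

# Cpk reflects both process spread and centering relative to the spec limits
cpk = min(USL - mu, mu - LSL) / (3 * sigma)
print(f"Mean = {mu:.3f} mm, std dev = {sigma:.3f} mm, Cpk = {cpk:.2f}")
```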
It is now the medical device industry’s turn to focus on quality!
In Conclusion
As individuals, we generally make decisions and judgments using heuristics-based thinking. This is not sufficient for formal risk management of medical devices as it is prone to bias and errors in judgment.
The engineering approach to risk management involves building cause-and-effect relationships between inputs and outputs. We can think of this as deterministic, model-based thinking.
However, medical devices operate in a complex environment. This requires us to combine a probabilistic approach with our model-based approach to understand and manage uncertainty.
As practitioners of risk management, especially in a leadership role, we have to be aware of individual differences and the potential role of groupthink.
As our world becomes more complex, we can no longer rely on simple heuristics and deterministic thinking alone. We have to adopt a more probabilistic approach.
About Eric Maass
Eric Maass currently serves as an adjunct professor at Arizona State University. He was one of the original founders of the Six Sigma program at Motorola, where he served in different roles for nearly 30 years. Most recently, he was at Medtronic as the senior director of design, reliability and manufacturing (DRM), a technical fellow and master black belt. He retired from Medtronic in 2020 and now provides training, coaching and consulting services in product/process development and optimization.
About Let’s Talk Risk with Dr. Naveen Agarwal
Let’s Talk Risk with Dr. Naveen Agarwal is a weekly live audio event on LinkedIn, where we talk about risk management related topics in a casual, informal way. Join us at 11:00 am EST every Friday on LinkedIn.
Disclaimer
Information and insights presented in this article are for educational purposes only. Views expressed by all speakers are their own and do not reflect those of their respective organizations.
[1] FMEA: Failure Modes and Effects Analysis is the most common risk analysis technique currently used in the medical device industry.
[2] Amos Tversky and Daniel Kahneman, Judgment under Uncertainty: Heuristics and Biases, Science, New Series, Vol. 185, No. 4157 (Sep. 27, 1974).
[4] Medical devices are required to include Instructions for Use (IFU) or Directions for Use (DFU) that clearly describe the intended use, indications and contraindications, warnings and precautions, and other instructions for safe use.
[5] Wikipedia: Groupthink is a psychological phenomenon that occurs within a group of people in which the desire for harmony or conformity in the group results in an irrational or dysfunctional decision-making outcome.
[7] FDA: Case for Quality