Debunking Simulation Myths
We believe simulation is the most underutilized tool for process improvement in healthcare today. Whether in hospital operations or space planning, simulation offers a unique lens for considering the random or uncertain nature of patient arrivals and/or patient needs. After years of using simulation as a tool to design and improve Emergency Departments and Surgery Departments and to plan staffing for areas such as Environmental Services (Housekeeping), we still find that there are myths about simulation that prevent process improvement professionals from fully leveraging these powerful tools.
Before we get to the myths, we should discuss why hospitals need simulation.
Simulation can be used to understand system barriers and constraints and identify factors that support efficient workflow and adequate facilities, minimize avoidable suffering, and enhance patient safety.
Simulation is a critical tool for the following situations:
1. Volume (Demand) fluctuates in random and difficult-to-predict ways.
2. The expected volume for a new facility or a new service is uncertain.
3. The effects of volume fluctuations on resource utilization (e.g., staffing, movable equipment, supplies) or facility usage (e.g., exam rooms, ORs, etc.) are unclear.
4. Another, less obvious, situation is the uncertain nature of patient treatment needs, such as surgery. Two patients may have the same procedure, a coronary artery bypass graft, for example, and yet how long the surgery takes will depend on factors unique to each patient, such as how easy or difficult it is to suture the arteries (see the short sketch after this list).
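To make that fourth point concrete, here is a minimal sketch of how patient-to-patient variability might be represented in a model. The choice of a lognormal distribution and its parameters are purely illustrative assumptions, not data from any actual facility.

```python
import random

# Illustrative only: CABG case durations drawn from a lognormal distribution,
# averaging roughly four hours with substantial patient-to-patient spread.
rng = random.Random(42)
durations_min = [rng.lognormvariate(mu=5.45, sigma=0.25) for _ in range(1000)]

print(f"shortest: {min(durations_min):.0f} min, "
      f"average: {sum(durations_min) / len(durations_min):.0f} min, "
      f"longest: {max(durations_min):.0f} min")
```

A simulation model samples a fresh duration like this for every case, so the downstream effects on OR schedules, PACU demand, and staffing reflect the spread, not just the average.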
Healthcare organizations and hospitals can reap many benefits from simulation in three key areas: operations, operations planning, and facilities planning, design, and construction/renovation.
In day-to-day operations, hospitals realize many benefits from simulation, including, but not limited to, improved order response and turnaround times; better supply chain management; improved patient safety; better utilization of resources such as staff, facilities, and equipment; improved workflow; reduced idle time leading to increased throughput; and increased revenue while costs remain constant.
For operational planning, hospitals have seen benefits from policy and procedure development, needs assessment, design development, construction cost minimization, and future staffing and operating cost minimization.
For facility planning, the main benefit is rightsizing the facility (e.g., the number of exam rooms, treatment spaces, patient rooms, ORs, etc.). Simulation can demonstrate a facility’s ability to handle projected volumes, as well as “what if” increases in volume or changes in patient mix, assuring executives that the facility will provide the right amount of space.
Let’s examine some myths that keep organizations from achieving these benefits and see if we can offer a different perspective.
Myth Number 1: You Need Dedicated, Full-Time Resources to Do Simulation:
We have been using simulation intermittently for over thirty years and much more frequently in the last 20 years. Why the increase? Thirty to forty years ago, simulation relied heavily on programming and thus was time-consuming. It needed to run on mainframes or minicomputers, where access was limited for many process improvement practitioners. With the growth of personal computers, both in terms of affordability and in terms of computing power, simulation programs evolved and became easier to use. Modern software, with its object-oriented programs and ever-evolving host of features built into the software itself, makes it much more intuitive to create and run a simulation model.
Of course, simulation software requires training and practice for teams to become proficient. An organization with sufficient need may have full-time dedicated staff just like they may have staff dedicated to productivity, Lean, or any other discipline. That said, in our experience, we have never had a sufficient need to dedicate staff to simulation (or any other single discipline for that matter).
In our experience at a large hospital management company (with 27 acute care hospitals), two individuals were the go-to folks for simulation. They used simulation about 25% to 30% of their time. We had three more individuals who used simulation models extensively, but this only accounted for 10 to 20% of their time. All these individuals had been trained to create models using our software of choice, FlexSim. We tended to split the actual application into model building and model use. The two individuals who spent the most time on simulation (whom we’ll refer to as Darrell I and Darrell II) were the primary model builders. They were technically inclined and had less experience than the other three, so it was efficient to have them focus on model building. Once they did, their increasing familiarity with the software meant they could build models faster and, thus, more efficiently than the other three.
Familiarity with a particular software product creates efficiency, which is true of any technical effort. Once folks become more proficient, they get more opportunities to apply their knowledge and thus become even more adept. One of the dangers in this circle of efficiency is that others will lose their skills, and if you lose one of the proficient individuals, you will be hampered in continuing to use simulation effectively. While this danger is obvious when you have full-time staff dedicated to simulation, it can be even more pressing when you have staff who only use it part-time. If you invest in training multiple individuals from diverse backgrounds in a particular software, you should ensure they receive sufficient practice to remain reasonably proficient.
Myth Number 2: Simulation Requires a Significant Amount of Time:
Does building and using a model require more time than exploring a process through other means? The answer is sometimes yes and sometimes no. The old expression, “Don’t swat flies with dynamite,” applies here. If you can get the same result with an easier and faster tool, you should use that. While we believe that simulation is underutilized, we are not advocating using simulation for everything.
Let’s go back to our last job and the two Darrells. The two Darrells focused on model building, while the other three focused on model use, which also included model set-up, a significant part of what we call practical simulation. Model set-up precedes model building: understanding the processes and gathering the initial data that describe them is critical to building a good model.
Here, we should separate the knowledge of what to do from how to do it. What to do in a simulation is software-agnostic, while how to do it is software-dependent. How you build a model in FlexSim differs from how you build one in MedModel or Arena. Many people focus too much on building the model, the “how,” and not enough on the “what,” the overall simulation effort.
A simulation is a project and requires project management, but the main steps or the “what” are:
1. Understand the system
2. Choose the right software
3. Determine the model’s system components
4. Collect data
5. Plan the model
6. Build and test the model
7. Verify, Validate, and Accredit
8. Use Sensitivity Analysis and Experimentation
9. Document
10. Implement the approved changes.
Before we build and test the model, we must understand the system, determine the system components, collect data, and plan the model. After the model is built, tested, verified, validated, and accredited, we must perform sensitivity analysis, experiment with possible solutions, document our results, and implement the approved changes. This is the “what.” With today’s software, these surrounding steps generally take longer than building the model itself. Most of them would also be required with any other process improvement or facility design technique, so using simulation is often no more time-consuming than the alternatives.
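To make the sensitivity analysis and experimentation step concrete, here is a minimal sketch in Python. It deliberately uses a toy single-queue model with made-up parameters (exponential arrivals and service, one server) rather than a full hospital model; the point is the experiment loop, which runs many replications at each demand level and compares average waits.

```python
import random
import statistics

def simulate_day(mean_interarrival, mean_service, n_patients=200, seed=None):
    """Average wait in a single-server queue, via the Lindley recursion."""
    rng = random.Random(seed)
    wait, waits = 0.0, []
    for _ in range(n_patients):
        waits.append(wait)                               # this patient's wait
        service = rng.expovariate(1.0 / mean_service)
        gap_to_next = rng.expovariate(1.0 / mean_interarrival)
        wait = max(0.0, wait + service - gap_to_next)    # next patient's wait
    return statistics.mean(waits)

# Sensitivity analysis: how does average wait respond as demand creeps up?
for interarrival in (12, 11, 10, 9):  # minutes between arrivals (made up)
    reps = [simulate_day(interarrival, mean_service=8, seed=s) for s in range(30)]
    print(f"arrival every {interarrival} min -> "
          f"avg wait {statistics.mean(reps):.1f} min")
```

The pattern, vary one input while holding the rest constant and replicate each scenario many times, is what the sensitivity analysis and experimentation step formalizes, whichever simulation package builds the underlying model.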
Myth Number 3: Simulation is Complicated:
Simulation is, admittedly, complicated. But let’s put that in context. Complicated compared to what? Simulation is undoubtedly more complex than calculating the mean, or average, of a group of numbers. When the authors learned statistics, we had to do the calculations using simple math and a slide rule or, later, a calculator. So, to calculate the standard deviation, you had to find the mean, subtract the mean from each data point, square each of those differences, sum the squares, divide by n-1, and then take the square root. This gets pretty tedious when you have dozens or hundreds of data points. Today, if we need the standard deviation, we use that function in our spreadsheet application and instantly get the answer.
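For readers who like to see it spelled out, here is a minimal sketch, using made-up numbers, of that hand calculation next to the one-line library call that replaces it.

```python
import math
import statistics

data = [12, 15, 9, 22, 18, 14, 11, 20]  # made-up turnaround times, in minutes

# The hand calculation: mean, squared differences, divide by n - 1, square root.
mean = sum(data) / len(data)
sum_of_squares = sum((x - mean) ** 2 for x in data)
std_dev = math.sqrt(sum_of_squares / (len(data) - 1))

# Today, one function call does the same thing.
assert abs(std_dev - statistics.stdev(data)) < 1e-9
print(f"sample standard deviation: {std_dev:.2f}")
```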
Modern simulation is similar; with the advent of functions and objects within the simulation software, we can easily construct a model, enter the parameters of any number of arrival (or service) distributions, such as lognormal, triangular, or exponential, and use pre-built objects that are set up to function in specific ways (with healthcare-specific software such as FlexSim, these may be exam rooms, radiology equipment, lab equipment, etc.).
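As a sense of scale, here is a minimal sketch of that same idea in code, built with the open-source SimPy library rather than a commercial, healthcare-specific package such as FlexSim; the arrival rate, exam time, and room count are illustrative assumptions only.

```python
import random
import simpy  # open-source discrete-event simulation library

rng = random.Random(1)
waits = []  # minutes each patient waits for an exam room

def patient(env, exam_rooms, mean_exam):
    arrived = env.now
    with exam_rooms.request() as room:       # queue for the next free exam room
        yield room
        waits.append(env.now - arrived)
        yield env.timeout(rng.expovariate(1.0 / mean_exam))  # exam time

def arrivals(env, exam_rooms, mean_interarrival, mean_exam):
    while True:
        yield env.timeout(rng.expovariate(1.0 / mean_interarrival))
        env.process(patient(env, exam_rooms, mean_exam))

env = simpy.Environment()
exam_rooms = simpy.Resource(env, capacity=6)  # six exam rooms (illustrative)
env.process(arrivals(env, exam_rooms, mean_interarrival=7, mean_exam=40))
env.run(until=24 * 60)  # one simulated day, in minutes

print(f"patients seen: {len(waits)}, average wait: "
      f"{sum(waits) / len(waits):.1f} min")
```

A package like FlexSim wraps this same logic in drag-and-drop objects and 3D animation; the underlying mechanics, random arrivals competing for finite resources, are what the code above makes explicit.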
Let’s also put “complicated” in perspective. We have taught Lean Six Sigma for over 20 years, and many of our Black Belt students, and some of our Green Belt students, would say that Lean Six Sigma is “complicated.” Before simulation became more straightforward to use, we used Queueing Theory to analyze situations and develop countermeasures. That was not easy. While simulation models require education and training, they are far easier than most people imagine.
Myth Number 4: Simulation Is Only Helpful for Large and Complex Problems:
This myth partially stems from the prior myths: if simulation requires significant investments of time and money, then it can only be justified for large, complex problems. This was perhaps true in the early days of simulation, when the software was costly and the learning curve was much steeper. Today, with easier-to-use software, simulation can be applied quickly to problems that are not “large” and perhaps not even especially “complex.” If the problem is simple, you might not need a tool like simulation to understand the situation and develop countermeasures. However, we caution that simple issues sometimes require more sophisticated solutions or analysis to truly understand them. This can be especially true when dealing with complex, adaptive systems.
One of the early projects I mentored was with a Black Belt team working to improve an Emergency Department. The team used DMAIC (Define, Measure, Analyze, Improve, and Control) as its methodology but got stuck in the Improve phase. They defined the problem, collected and measured data, analyzed it, and developed improvements, or countermeasures. The first countermeasures were implemented but didn’t eliminate or significantly reduce the problem. They went back to work, brainstormed additional countermeasures, and implemented them. This second round greatly reduced the problems but still didn’t meet requirements. So once again, they developed new and additional countermeasures, eliminated the ones that had not worked, and implemented again. Finally, success.
This was a brand-new Black Belt, and it was the first time this team had worked on an improvement project, so we were focused on DMAIC. But what if we had had access to simulation? We could have tested the first set of countermeasures in a model, seen that they didn’t deliver the needed results, and saved a great deal of time.
What was the cost of the three implementations? Fortunately, we had good change management, but there was definite staff frustration as the team implemented new processes, then eliminated some and implemented still more. We worked through those issues, but it could have gone the other way, with staff resistance blocking us from succeeding; the cost would then have gone up because we would have failed to fix the problems. The “false starts” in implementation also had a price. Overall, this was not a complex problem. Still, the Emergency Department is a complex, adaptive environment, so simulation could have mitigated these costs. When considering simulation, weigh the costs it can help you avoid against the price of the software and the time spent modeling.
Myth Number 5: Simulation Is Helpful in Manufacturing but Unsuitable for Patient Care Environments:
This myth also grows out of the other myths. The thinking goes like this: hospitals deal with people, not widgets, and simulation, some say, works well with inanimate objects but not with people. Healthcare in general, and hospitals in particular, are slow to embrace new technologies for a variety of reasons, including risk-averse cultures, limited capital with competing priorities, hard-to-compute ROI due to soft savings, and fear of the unknown. However, the very fact that working with people (customers, patients, clients, etc.) adds complexity to a situation is what makes simulation such an effective tool.
Above, we listed four situations where simulation is helpful. The first: volume (Demand) fluctuates in random and difficult-to-predict ways, and it certainly does in a hospital setting. Patients arrive at an Emergency Department more or less at random. The same is true of Surgery, the Cardiac Cath Lab, and other areas because emergent or urgent patients must be accommodated. Any area that accepts walk-in patients, such as the Lab or Radiology, will also experience unpredictable patient volumes.
Item three: the effects of volume fluctuations on resource utilization (e.g., staffing, movable equipment, supplies) or facility usage (e.g., exam rooms, ORs, etc.) are unclear. In the areas mentioned above, because the volume fluctuations are difficult to predict, the effects on staffing, the number of exam rooms needed, and so on are also difficult to predict using basic math. There is mathematics designed specifically for this, Queueing Theory, much of which is built on Markov chains. Modern simulation, especially healthcare-specific simulation, is as easy to use (for many of us, easier) and much easier to explain than Queueing Theory. Having used Queueing Theory in the past, we can attest that a 3D simulation model is a great deal easier to explain.
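For comparison, here is the kind of closed-form calculation Queueing Theory asks of you; this sketch computes the Erlang C probability that an arriving patient must wait in an M/M/c system, with purely illustrative numbers.

```python
from math import factorial

def erlang_c_wait_probability(arrival_rate, service_rate, servers):
    """Probability an arriving patient must wait (M/M/c queue, Erlang C)."""
    offered_load = arrival_rate / service_rate            # in Erlangs
    utilization = offered_load / servers                   # must be < 1
    top = (offered_load ** servers / factorial(servers)) / (1 - utilization)
    bottom = sum(offered_load ** k / factorial(k) for k in range(servers)) + top
    return top / bottom

# Illustrative: 12 arrivals/hour, exams averaging 20 minutes (3/hour), 5 rooms
print(f"{erlang_c_wait_probability(12, 3, 5):.1%} of patients would wait")
```

Explaining that formula to a room of clinicians and executives is a much harder sell than showing them an animated model of their own department, which is exactly the point about explainability.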
The fourth item is the uncertain nature of patient treatment needs. By this, we mean that each patient is unique, a situation not generally faced by manufacturing organizations (except perhaps job shops). Two patients can arrive in our ED suffering from a heart attack, and their treatment requirements could differ depending on their prior medical history, especially previous cardiac events, other medical conditions, and so on. This is where simulation can help create a better understanding of staff and exam room utilization, as well as other treatment-related aspects of providing service.
While hospitals have been slow to embrace new technology and methods, especially in the management sciences, the use of simulation is growing in healthcare spaces. Unfortunately, it is still woefully underutilized in a setting for which it is uniquely suited.
Chapter 1 of our book Simulation Solutions: A Practical Guide to Improve Patient Flow & Facility Design in Healthcare Operations (available from Amazon) covers this and the risks of not using simulation in more depth.