Elusive Cost of Risk
For some time now, I have been researching, or rather trying to research, the concept of risk associated with infection transmission through inadequately decontaminated surgical instruments. When I began looking into this back in 2014, the available research and data on this topic were minimal. This remains true to this day. Currently, we are operating in a fuzzy logic domain of "low" and "high" risk, with no universal method of quantifying risk or relating it to an end cost. This makes decision-making based on this risk difficult.
Excluding the worst-case scenario (patients dying because of failures in decontamination), it is possible to estimate the cost of the risk. We can do this by looking at the cost of the treatment used to counter the particular infection, thereby quantifying the cost of failure. We can calculate the cost of any extra time in the hospital, necessary medication, the extra staff who have to commit their time, aftercare, or whatever else is necessary to help the patient recover from such an infection.
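The arithmetic behind this idea is simple enough to sketch. In the fragment below, the cost components, the figures, and the infection probability are all illustrative assumptions, not published data; the point is only the shape of the calculation:

```python
# Hypothetical sketch: the cost of a decontamination failure is the sum
# of the treatment components described above, and the expected risk
# cost per procedure is that cost weighted by the probability of failure.
# All component names and figures are invented for illustration.

def cost_of_failure(components: dict) -> float:
    """Total direct cost of treating one instrument-borne infection."""
    return sum(components.values())

def expected_risk_cost(failure_cost: float, infection_probability: float) -> float:
    """Expected cost per procedure = cost of failure x probability of failure."""
    return failure_cost * infection_probability

treatment = {
    "extra_bed_days": 4 * 400.0,   # 4 extra days at an assumed 400 per day
    "medication": 250.0,
    "extra_staff_time": 600.0,
    "aftercare": 300.0,
}

total = cost_of_failure(treatment)                  # 2750.0
per_procedure = expected_risk_cost(total, 0.002)   # 5.5 at an assumed 0.2% risk
```

Once such a per-procedure figure exists, even as a rough estimate, it can be weighed directly against the per-procedure cost of any risk-reducing measure.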
Such data could, first of all, help to justify any further investment in equipment, training, or additional staff that could reduce this risk. Detailed analysis of particular risks could further point out areas where risk could be minimized at a given cost. Quantified risks, or risk costs, would become a driver behind the means engaged to reduce them.
In the absence of scientific evidence, it is easy to ask questions like whether dentists should follow the same decontamination procedures as other healthcare units when it comes to reusable instruments. I think they should, but what will happen if they do not? How significant is the risk and cost associated with the treatment if something happens?
From the decontamination process optimization point of view, we can already start analyzing data, provided we use quantitative methods to evaluate the performance of individual processes. In other words, we need to know how much contamination was removed and inactivated in the overall process. If we had a universal method of contamination quantification that could be correlated with a risk of infection, we could use this data to determine the value of implementing new processes or technologies at almost any stage of the process. For example, let’s imagine that a hospital is considering adopting a new technology or method for precleaning dirty surgical instruments. If the hospital were to trial this new method, resulting in a lower level of quantified contamination, and if we had a correlation between the level of quantified contamination and percentage risk of infection, we could accurately determine the money saved by introducing this new technology.
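The hospital trial described above can be reduced to a short calculation, provided one assumes a mapping from quantified contamination to infection risk. In the sketch below that mapping is a simple linear model, and every number is an assumption chosen for illustration, not a measured correlation:

```python
# Illustrative sketch of the precleaning-trial scenario: estimate the
# money saved by a new method, given an assumed linear correlation
# between residual contamination and infection risk. All parameters
# are hypothetical placeholders.

def infection_risk(contamination_units: float, risk_per_unit: float) -> float:
    """Assumed linear mapping from residual contamination to infection risk."""
    return contamination_units * risk_per_unit

def annual_savings(old_units: float, new_units: float, risk_per_unit: float,
                   cost_per_infection: float, procedures_per_year: int) -> float:
    """Expected annual saving from the reduction in infection risk."""
    old_risk = infection_risk(old_units, risk_per_unit)
    new_risk = infection_risk(new_units, risk_per_unit)
    return (old_risk - new_risk) * cost_per_infection * procedures_per_year

saving = annual_savings(
    old_units=50.0,            # quantified contamination, current method
    new_units=20.0,            # quantified contamination, trialed method
    risk_per_unit=0.0001,      # assumed risk added per contamination unit
    cost_per_infection=3000.0, # assumed treatment cost per infection
    procedures_per_year=10_000,
)
# (0.005 - 0.002) * 3000 * 10000 = 90000.0
```

The real correlation would almost certainly not be linear, but any empirically validated mapping could be dropped into `infection_risk` without changing the rest of the calculation.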
At first glance, a study such as this seems to be a monumental task due to the sheer number of variables. On the other hand, in the age of big data and number crunching, there may be a way to design an experiment so that data gathering and analysis is greatly simplified and automated. Data from decontamination processes and average costs of treating infections already exist but are not being integrated, partly because there is too much of it and partly because it sits in different data centers and in different data formats. Because of this, it is a great challenge to analyze it and draw meaningful conclusions, especially if this type of data analysis were to be done on a routine basis. That is where we could perhaps train artificial intelligence (AI) algorithms to look for correlations, anomalies, and similarities.
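Long before anything deserving the name AI is involved, the first pass over integrated data would look like ordinary statistics: a correlation between residual contamination and treatment cost, and a flag for records that deviate sharply from the norm. The sketch below uses only standard-library functions, and the data it would run on is invented for illustration:

```python
# Minimal sketch of a first-pass analysis over integrated records:
# a Pearson correlation between two measured series, and a simple
# z-score anomaly flag. Real inputs would come from hospital systems;
# nothing here is specific to any actual dataset.

from statistics import mean, stdev

def pearson(xs: list, ys: list) -> float:
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def anomalies(xs: list, threshold: float = 2.0) -> list:
    """Indices of values more than `threshold` standard deviations from the mean."""
    m, s = mean(xs), stdev(xs)
    return [i for i, x in enumerate(xs) if abs(x - m) / s > threshold]
```

Only once such baseline methods run routinely over integrated data would it make sense to train more sophisticated models on top of them.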
The key to the implementation of such a project at a larger scale is digitization of all relevant data. Even in the age of computers and smartphones, there are a large number of activities performed manually that are critical to the process outcome; for example, manual preparation and precleaning of surgical instruments before an automated process. Quality control of manual tasks on a routine basis is difficult, but in this case, the added challenge is that most manual activities produce no quantifiable results or outputs, and therefore generate no data that could be captured and analyzed.
One solution to this problem is greater automation of those procedures, because all digitally controlled processes are data driven. They use digitized inputs and control the most critical process parameters. They produce data, but in many cases controlling the process is not enough to determine the quality of the outcome, and the outcome is the critical variable for the end result. One example is the cleanliness of surgical instruments. Consider a case where instruments are loaded incorrectly into the washer or are shifted mid-process by the water jets, creating excessive shadowing that prevents other instruments from being adequately cleaned. Even if the washer executes its process perfectly, instruments may come out dirty. The only solution is to evaluate the instruments' cleanliness after the process. Here again, we can see that a simple visual test will not produce quantifiable data that could be used as an input to the AI algorithms. Unless, of course, the evaluation method produces a quantifiable measure of remaining contamination, as in the case of the ProReveal or some protein-detecting dyes.
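For such a post-process measurement to feed any analysis, it has to be captured as a structured record rather than a pass/fail glance. The sketch below shows one possible shape for such a record; the field names and the acceptance threshold are assumptions for illustration, not the specification of ProReveal or any other test:

```python
# Hedged sketch: turning a quantified post-wash protein reading into a
# machine-readable record that downstream analysis can consume. Field
# names and the pass threshold are illustrative assumptions only.

from dataclasses import dataclass, asdict

@dataclass
class CleanlinessRecord:
    instrument_id: str
    residual_protein_ug: float
    threshold_ug: float = 5.0  # assumed local acceptance threshold

    @property
    def passed(self) -> bool:
        """Pass if residual protein is at or below the threshold."""
        return self.residual_protein_ug <= self.threshold_ug

    def to_row(self) -> dict:
        """Flatten to a dict ready for a database table or CSV export."""
        row = asdict(self)
        row["passed"] = self.passed
        return row

record = CleanlinessRecord("forceps-0042", 3.2)
# record.passed is True; record.to_row() is ready for storage and analysis
```

A stream of such records, one per instrument per cycle, is exactly the kind of quantifiable output that manual visual inspection cannot provide.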
We can already identify critical elements of the process and the outputs we need to quantify that have a direct effect on the risk of infections transmitted by inadequately reprocessed instrumentation. What we need to build is a system that integrates and analyzes this data. There is certainly a lot more research necessary to look into categorizing, quantifying, and costing risk for it to become a tool for decision-making involving processes, equipment, and management of human resources.
In any case, the ability to use quantifiable data from decontamination processes to determine risk levels that relate directly to cost would be of huge benefit to hospitals worldwide. This crucial information would make decision-making and the adoption of new technologies and processes a far simpler task. The task is surely difficult but certainly not impossible, since insurance companies and banks do it every day already.
Pawel de Sternberg Stojalowski, MSc, BSc, MBA, is an R&D engineer with a passion for clever solutions. His innovations stem from his background in mechanical engineering, automation, robotics, and business, and he takes an interdisciplinary and cooperative approach to projects. He has been involved in R&D of equipment and technologies for decontamination and sterilization of complex surgical instruments since 2007. Today, Pawel leads an expert team researching, designing, prototyping, and evaluating innovative solutions aimed at solving challenges in the decontamination of complex surgical instruments, contamination detection and identification, as well as fluid dynamics and ultrasonic waves in washers.
Sign up to receive NewSplash free and to read our weekly feature articles by many distinguished and experienced authors!
NewSplash is a trusted free weekly digital newsletter dedicated to providing useful information to CS and IP professionals who strive to keep patient safety high.