Adverse events and untoward outcomes in any industry, including healthcare, are almost always the result of a cascade of errors rather than a single error. And seldom is the adverse event related purely to human error by one individual; rather, it results from the interaction of a variety of factors that come together to enable those errors to lead to the unwanted results. We usually refer to these as latent factors. The classic example of a latent factor was described in what we think of as one of the first-ever patient safety books, Steven Casey’s “Set Phasers on Stun” (Casey 1993). In that case, a broken intercom system that had not been repaired prevented the patient, who knew something wrong was happening, from communicating with the radiation technologist, who might have been able to stop the procedure. The intercom had, of course, not been functioning for some time, yet in earlier cases no adverse events had occurred. Had it been repaired, the fatal outcome might have been averted. And, of course, there were multiple other latent factors in the case Casey described.
Latent factors may be of a variety of types. Last month there were two excellent articles that looked at such contributing factors in surgical settings. One group, using the SEIPS model, identified safety hazards in cardiovascular operating rooms (Gurses 2012). Another group used the LOTICS scale to identify a variety of latent factors in university hospital operating rooms (van Beuzekom 2012).
The cardiovascular study was done as part of the Locating Errors Through Networked Surveillance (LENS) project (Martinez 2010). That project brings together contributors from multiple disciplines (organizational sociology, industrial psychology, human factors engineering, clinical medicine, informatics, epidemiology and biostatistics, economics, and health services research). The project is designed to identify hazards in healthcare settings and develop risk reduction interventions. A key philosophy is development of a peer-to-peer assessment process, akin to what occurs in the nuclear industry: peers who work in similar settings come to an organization to do a safety assessment and share feedback in a confidential fashion, rather than having such assessments done by external regulatory agencies. The focus is on learning rather than judging, and part of the project is to develop tools to use in such assessments. One such tool involves direct observation of processes and events as they actually occur rather than simply reviewing what written policies state should occur.
The current study (Gurses 2012) involved a sampling of hospitals participating in the LENS project that perform cardiovascular surgery. In addition to the direct observational method noted above, the researchers also did contextual interviews with healthcare workers to get further insight into some of the events they witnessed. They also took photographs of the physical aspects of the environment and some of the tools and technologies used. All of this was done using the Systems Engineering Initiative for Patient Safety (SEIPS) model (Carayon 2006), which articulates that hazards can emerge from any of multiple components of the healthcare system and from the interactions amongst these various components. Those components include care providers, tasks, tools and technologies, care processes, physical environment, organization, and other processes.
Gurses and colleagues observed 20 cardiac surgeries at the 5 centers selected and identified 59 hazard categories. They did not include portions of the pre-op process, such as case scheduling, and only followed the patients to the handoff to the post-anesthesia care unit or recovery unit. They developed a classification tool and provided examples of hazards identified in each category. Though the procedures observed were cardiac surgeries, you will see that most of the hazards are likely present in your ORs regardless of what types of surgery you are doing. If you have access to the article, go to the online tables and spend some time looking at the categories identified and the examples provided. You’ll find yourself saying “Yup. That happens here all the time”.
Note that some of the examples given actually fit under more than one category. For example, under “inadequate knowledge and skills due to lack of education, experience or training” they include an instance where a technician is unaware of a newly purchased piece of equipment. The same example appears under the category of organization/education and training.
One example under the providers category was use of non-standardized approaches. There were often substantial practice variations and preferences that led to confusion for other healthcare workers and often led to workarounds. They also identified many examples of “unprofessional” behavior among care providers, encompassing most of the more subtle forms of disrespectful behavior Lucian Leape has emphasized (see our July 2012 What’s New in the Patient Safety World column “A Culture of Disrespect”).
Under tasks they identified the usual problems of interruptions, time pressures, workload demands, and non-value-added tasks. Here they also found numerous examples of poor planning leading to delays. Many of the latter might be avoidable with better use of pre-op huddles. Examples of non-value-added tasks included things like anesthesiologists preparing most of their own medication doses rather than having them prepared by pharmacy.
Under tools and technology they note one of our favorite pet peeves: multiple different types of IV pumps in different areas of the hospital. Problems with alarms, hardware and software malfunctions were also common.
Under the workspace design category were frequent examples of cluttered or congested workplaces and of how such physical design elements added to the workload of various workers, such as someone having to physically change position to reach another item.
The organization category included not only issues related to training and education but also purchasing, planning, ancillary service provision, policies and procedures, and safety and teamwork culture.
Some of the suggested solutions to the identified hazards include:
·        Standardization (practices, equipment, etc.)
·        More coordinated purchasing practices, with input from frontline staff
·        Teamwork training
·        More consistent application of communications practices (hearback, etc.)
·        Better use of tools like checklists, pre-op huddles, and post-op debriefings
·        Proactive design of operating rooms
·        Use of CUSP-like interventions (see our March 2011 What’s New in the Patient Safety World column “Michigan ICU Collaborative Wins Big” for more on CUSP)
The second study (van Beuzekom 2012) used the previously validated Leiden Operating Theatre and Intensive Care Safety (LOTICS) scale to assess the impact of an intervention targeting multiple latent factors in the OR. The LOTICS scale identifies latent risk factors in the OR and includes many of the same categories and elements found in the tools used in the Gurses study. Among them are communication, design, maintenance, material resources, planning and coordination, teamwork, procedures, situational awareness, team instructions, training, staffing resources, and error reporting. Based on LOTICS scale results and other staff input, they decided to focus their intervention on Material Resources, Training, and Staffing Resources. The intervention included considerable training on latent risk factors but focused especially on standardization and reduction of variation in materials and equipment. Using a pre/post design (with a comparable unit at another university hospital serving as a control group), they were able to demonstrate a reduction in the perceived incident rate and fewer problems related to material and staffing resources. The contribution of technical factors to incident causation also decreased significantly.
Undoubtedly, hazards in many of the same categories found in both the Gurses and van Beuzekom studies would have been identified had they also analyzed the pre- and post-surgical parts of surgical care.
We encourage you to use some of the tools developed or used in these two studies and do your own observational assessments of your ORs (or any other unit in your hospital, for that matter). It’s critical that these be done in an open fashion, making it clear that the goal is learning and identifying hazards so that improvements can ultimately be made not only in patient safety and patient outcomes but also in staff safety, work satisfaction, and organizational outcomes.
References:
Casey S. Set Phasers on Stun and Other True Tales of Design, Technology, and Human Error. Santa Barbara, California: Aegean Publishing Company, 1998 (first published 1993)
Gurses AP, Kim G, Martinez EA, et al. Identifying and categorising patient safety hazards in cardiovascular operating rooms using an interdisciplinary approach: a multisite study. BMJ Qual Saf 2012; 21: 810-818
http://qualitysafety.bmj.com/content/21/10/810.short?g=w_qs_current_tab
Carayon P, Hundt AS, Karsh B-T, et al. Work system design for patient safety: the SEIPS model. Qual Saf Health Care 2006; 15(suppl 1): i50-i58
Martinez EA, Marsteller JA, Thompson DA, et al. The Society of Cardiovascular Anesthesiologists’ FOCUS Initiative: Locating Errors Through Networked Surveillance (LENS) Project Vision. Anesthesia & Analgesia 2010; 110(2): 307-311
http://www.anesthesia-analgesia.org/content/110/2/307.full.pdf
van Beuzekom M, Boer F, Akerboom S, Hudson P. Patient safety in the operating room: an intervention study on latent risk factors. BMC Surgery 2012; 12: 10 (22 June 2012)
http://www.biomedcentral.com/content/pdf/1471-2482-12-10.pdf