It is indeed human to err, yet unrealistic expectations of perfection persist throughout the practice of medicine. For radiology, the challenge has grown substantially in recent years: rapidly advancing imaging technologies, rising study volumes with far more images per study, and poor overall communication of needed clinical information from referring physicians all contribute to an increased risk of radiologist error. If errors cannot be eliminated, then we must develop systems and procedures to reduce and manage them. If radiologists and their professional societies do not meet this challenge, governing bodies such as The Joint Commission (TJC) will attempt to externally enforce the development of comprehensive error detection systems and procedures for active error prevention and remediation.
In this chapter we discuss the human factors involved in the occurrence of error in radiology, along with recommended strategies to address these factors with the aim of improving error detection and prevention and of managing errors most effectively when they (inevitably) occur. We also focus on communication in radiology and on strategies by which more effective radiologic communication can reduce error. The concept of a sentinel event response aimed at preventing potential future harm is presented. Finally, we examine the types of occurrences that constitute an error in radiology and the methods that can be applied continuously and rigorously to detect and prevent radiologic error, including the role of (now mandated) peer review.
In a healthcare delivery system there are numerous daily interactions among workers, equipment, and environment, which ultimately lead to positive or negative outcomes in patient care. Human factors engineering is a relatively new but growing discipline in multiple fields, including healthcare. It focuses on analyzing complex systems and understanding how a system of people and machines works in actual practice. The aim is then to design or optimize equipment and human-machine interfaces, with the human users’ strengths and limitations in mind, to increase safety and minimize the risk of error in complex environments. This is done by dissecting complex activities or processes into smaller component tasks and then assessing the demands placed on the operator at each stage, including physical demands, skill demands, mental workload, team dynamics, and environmental adaptations (e.g., lighting, noise, distractors, ergonomics). The discipline attempts to pair human strengths and limitations in the performance of each task with the core design of the equipment and physical environment in which the task is performed.
Usability testing refers to the use of equipment and systems by trained users under real-world conditions to identify unintended flaws in new technologies before they reach the user market. An example is a recent study that found an unexpected increase in patient mortality in a pediatric intensive care unit as a result of the use of a computerized physician order entry (CPOE) system. Upon close evaluation, it was found that time demands resulting from a cumbersome order entry process were drawing clinicians away from the bedside, leading them to overlook signs of distress in their patients. Usability testing was later applied with simulated clinical scenarios and showed severe limitations of the installed CPOE system.
Usability testing, as it applies to radiology, means testing the interaction between equipment and the user to find the best-suited equipment, technology, or system to maximize ease and intuitive use and thereby optimize radiologist performance. Usability in this sense is defined as the effectiveness, efficiency, and satisfaction with which a radiologist can interact with a system. In other words, functionality determines usability.
More simply put, usability testing answers the question: “How user-friendly is this system in achieving its desired purpose?”
Usability testing also impacts equipment design. Usability problems can arise not only when there is poor design of equipment but also when there are poor instructions for its use. One published example (see the Suggested Readings) is an implantable inferior vena cava (IVC) filter system that can be inserted by a femoral or brachial approach. Depending on the approach, the filter unit must be attached to the introducer sheath in a particular orientation for correct placement, and incorrect attachment of the filter (e.g., using the brachial attachment for a femoral approach procedure) can result in an incorrect orientation of the filter within the patient’s IVC and thus lead to potential for harm by dislodgement or nonfunction. The authors point to the need for very clear documentation of how the system must be used and further suggest that usability problems in the future may be detected by use of a shared online database of users, as long as there is frequent database review by these users.
Workarounds are perhaps the most common class of methods used to accomplish an activity when the established methods are cumbersome or not working well. They range from isolated improvisations to situations in which practitioners consistently bypass established policies and safety procedures, increasing the risk of error and patient harm. Although the goal is generally to get work done more efficiently, workarounds can be dangerous and illustrate the need for good system design. When a flawed, poorly designed system forces workers to spend excessive time or effort to complete a task with every safety step followed precisely, workers will cut corners and find alternatives of varying effectiveness and safety. The identification of workarounds within a system can signal to leaders that a faulty process is in place.
Workarounds are often created spontaneously and used with good intentions by skilled staff to promote patient comfort or speed up medical interventions in an emergency situation. When cumbersome systems exist, healthcare workers may quickly determine that it is not always practical (or safe) in an emergency situation to follow all steps and comply with the prescribed process; these workers will generally find ways to circumvent the system to accomplish tasks more efficiently. There is substantial risk, however, of unintended downstream consequences: no matter how carefully done, and even with the best intentions, the use of workarounds promotes error and compromises patient safety, partly by overriding safety features that are built into systems. This is especially true in tense, emergent situations, when human attention to detail may be suboptimal.
In most instances, workarounds do not result in patient harm, and the increased efficiency they provide creates a type of reward for their creator; the positive feedback of enhanced efficiency or ease reinforces continued use, leading to complacency, which only amplifies the latent risks. An effective workaround can also prevent the underlying system problems from being fully recognized and addressed.
Workarounds, however, can have positive effects. They can be viewed as a warning sign of an underlying system failure that requires attention and resolution. Analyzing a workaround process may lead to definitive solutions for more global issues within a poorly functioning or cumbersome system, identifying problems with existing technologies or uncovering unnecessarily complex processes. For example, it was discovered at a Veterans Affairs hospital that staff were forced to use informal patient identification processes because the barcodes on patient armbands were not water resistant and easily washed off. This example demonstrates how a significant patient identification error could easily occur as the result of working around an ineffective, unusable system. Addressing the underlying system failure, particularly by applying usability testing to the armbands, would both resolve the system failure and remove the incentive for a potentially dangerous workaround.
In other words, a workaround is most often an answer to the question, “How can I make this cumbersome, time-consuming, or complicated system more user-friendly so that I can do my job more efficiently?” Usability testing can be employed wherever workarounds are identified. For example, when a new piece of equipment or a new system is introduced into the work environment, there perhaps ought to be a beta-testing period in which the new system is used by a limited number of staff to identify potential usability nuisances and provide feedback to the institution before the equipment or system is rolled out to the larger group.
A forcing function is an aspect of system design that prevents an undesirable function from being performed or allows its performance only after another function is performed first, such as when a prefunction is needed to make the main intended function safe. A simple example of a forcing function in commonly used technology is how a microwave oven is designed not to function while the door is open. This forces the user to close the microwave door first. Applying principles of human factors engineering to error management and prevention, the designers of forcing functions attempt to anticipate the types of error that may occur and to build a function or failsafe directly into products and processes that prevents that error from occurring. (One must be careful not to create a function that only serves to force a workaround, however.)
Forcing functions are often embedded in medical equipment, from small syringes to complex magnetic resonance imaging (MRI) scanning machines. Although some make this equipment more cumbersome to use and require specialized training, this type of human factors engineering works to ensure patient safety as long as the forcing function mechanism is intact. One such example is the use of a Luer-Lok system for syringes and indwelling lines, which must be matched to catheters before an infusion is possible. Another common example is how the connectors for anesthetic and oxygen gas lines are incompatible with each other, so it is not possible to inadvertently misconnect them.
An example of a forcing function related to radiology equipment is the use of automatic exposure control (AEC) in both film-screen and digital radiography. The purpose of AEC is to limit patient radiation dose while still generating an optimal image by controlling exposure time. An AEC system uses radiation detectors in the form of ionization chambers (usually three per system), which are calibrated with phantoms and positioned in a specific orientation. AEC reduces radiation exposure by controlling the total milliampere-second (mAs) output of the x-ray tube.
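To make the cutoff behavior concrete, the following is a minimal sketch, assuming a highly simplified model of an AEC circuit: the exposure continues only until the ionization chambers report that a calibrated reference dose has been reached, and the resulting mAs follows from the tube current and the exposure time. The function name and all numeric values are illustrative assumptions, not vendor specifications.

```python
# Simplified, hypothetical model of automatic exposure control (AEC):
# the exposure terminates automatically once the detector signal reaches
# a calibrated reference value, which limits the total mAs delivered.

def run_aec_exposure(tube_current_ma, reference_dose, detector_signal_per_ms, max_time_ms=500):
    """Accumulate detector signal in 1 ms steps; stop when the reference dose is reached.

    Returns the exposure time (ms) and the resulting tube output in mAs.
    A backup timer (max_time_ms) guards against detector failure.
    """
    accumulated = 0.0
    elapsed_ms = 0
    while accumulated < reference_dose and elapsed_ms < max_time_ms:
        accumulated += detector_signal_per_ms  # signal from the ionization chambers
        elapsed_ms += 1
    mas = tube_current_ma * (elapsed_ms / 1000.0)  # mA x seconds = mAs
    return elapsed_ms, mas

# A thicker patient attenuates more of the beam, so the per-millisecond detector
# signal is lower, the exposure runs longer, and the total mAs rises accordingly.
print(run_aec_exposure(tube_current_ma=200, reference_dose=1.0, detector_signal_per_ms=0.02))
print(run_aec_exposure(tube_current_ma=200, reference_dose=1.0, detector_signal_per_ms=0.01))
```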
Another example of a forcing function, designed specifically for procedural environments, is the preprocedural time-out. Instituting this practice forces the team to stop what they are doing and focus on the pertinent details of the case about to start, including confirming the correct patient, the requested procedure, acceptable laboratory studies, and the presence of any possible limitations such as drug allergies. It also focuses the attention of everyone present in the room and enables anyone to speak up before the case starts if the presented information is incorrect to his or her knowledge. This universal protocol is a requirement for all hospital procedures today.
Simply put, a forcing function is a stronger version of a constraint: a constraint makes it more difficult to do the wrong thing, whereas a forcing function, at least theoretically, makes it impossible. Employing the concept of the forcing function answers the following questions: How do I make this system impossible to mess up? What steps can I anticipate and pre-perform or take out of the hands of the operator? How can I make my system’s safety more operator independent?
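In software terms, a forcing function can be sketched as an interface in which the risky action cannot be invoked until the prerequisite step has been completed. The snippet below is a minimal illustration using the microwave example; the class and method names are hypothetical. In a statically typed language the unsafe call would not even compile, whereas in Python it fails immediately at runtime.

```python
# A minimal sketch of a forcing function as an API design: start() exists only
# on the object returned by close_door(), so heating with the door open cannot
# be performed -- calling start() on the open-door state raises AttributeError.
# (MicrowaveReady / MicrowaveDoorOpen are hypothetical names for illustration.)

class MicrowaveReady:
    """State in which the door is closed; only here can heating begin."""
    def start(self, seconds: int) -> None:
        print(f"Heating for {seconds} seconds with the door closed.")

class MicrowaveDoorOpen:
    """State in which the door is open; no start() method exists here."""
    def close_door(self) -> MicrowaveReady:
        return MicrowaveReady()

oven = MicrowaveDoorOpen()
# oven.start(30)             # fails immediately: the open-door state has no start()
oven.close_door().start(30)  # the prerequisite step is forced before the risky one
```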
Human factors engineering was responsible for bringing about many of the equipment and process standards that we enjoy today in various medical domains. A key concept is that equipment and processes should be standardized whenever possible to increase reliability, improve information flow, minimize cross-training needs, and prevent operator confusion. Consistency in equipment function, for example, prevents errors caused by deviations from the usual or expected workflow. It also allows staff to move between sites without having to learn different equipment at each one.
Although standardization of equipment is optimal, it may not always be possible in a large institution, which may have equipment that differs in age and manufacturer for economic reasons such as contracting variability over time. It is, however, always possible to strive for standardization of processes. Consistently following the same steps over time minimizes variation and has been shown to improve both efficacy and safety. Standardizing processes ensures that the same steps are followed by different staff working in a rotation or in a changing environment, and it builds resiliency into a system. More important, it makes deviations from the norm easier to detect, which can trigger closer analysis and error prevention. The use of checklists and templates can help to promote standardization in radiology.
Checklists ensure consistency of procedural steps and communication. They provide a concrete listing of required information so that all parties involved in a procedure are fully aware of the circumstances and can speak up if something seems off. They also provide organized, readily available lists of steps to follow in an emergency or high-risk situation. Some groups even advocate printing resuscitation or contrast reaction steps on cards worn on lanyards by primary staff, so this information is readily available when needed.
In surgery, institution of checklists prior to operative procedures has been shown to reduce mortality from 1.5% to 0.8% and complications from 11% to 7% (see Suggested Readings). Radiology checklists are useful in both diagnostic and procedural settings. In diagnostic radiology, TJC mandates the use of magnetic resonance (MR) and computed tomography (CT) screening questionnaires. These questionnaires are designed to detect potential safety issues before the scan so that they can be corrected, or so that scanning can be withheld when the potential harm may outweigh the benefit. Potential safety issues screened by these questionnaires include contrast material allergies, pregnancy and breast-feeding status, renal function, medication interactions and allergies, intravenous access, and the presence of a cardiac pacemaker or aneurysm clips. In procedure-based radiology practices, such as interventional, drainage, aspiration, and injection services, checklists take the form of a time-out, with an actual pause before the procedure and active participation by all team members in the surgical safety checklist, which is also mandated by TJC. This checklist includes a review of the patient’s identification, allergies, correct procedure, site of procedure, labs, collection of specimens, and medications that must be discontinued or administered.
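As a rough illustration of how a time-out acts as a gate, the sketch below models a pre-procedure checklist that refuses to let a case begin until every required item has been explicitly confirmed; the item names are generic examples, not TJC’s official checklist wording.

```python
# Rough sketch of a pre-procedure time-out as a gate: the procedure cannot
# start until every required checklist item has been explicitly confirmed.
# Item names are generic illustrations, not official checklist wording.

REQUIRED_ITEMS = [
    "correct patient identity confirmed",
    "correct procedure and site confirmed",
    "allergies reviewed",
    "relevant labs reviewed",
    "required medications held or administered",
]

def start_procedure(confirmations: dict) -> None:
    missing = [item for item in REQUIRED_ITEMS if not confirmations.get(item, False)]
    if missing:
        raise RuntimeError(f"Time-out incomplete; unconfirmed items: {missing}")
    print("Time-out complete -- procedure may begin.")

# Usage: any unconfirmed item blocks the start of the case.
start_procedure({item: True for item in REQUIRED_ITEMS})
```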
The use of report templates is a form of standardization of the radiology report; it has generated much controversy among radiologists in recent years but is gaining traction throughout the country. Although TJC recommends the use of templates for reporting, it is not yet mandated, except in the reporting of mammography results. Reports written using templates have been shown to have greater clarity and consistency of content, leading to improved clinician understanding. For example, in one study evaluating the introduction of templated radiology reporting for presurgical staging of pancreatic cancer, surgeons reported an increase in the presence of all information needed for surgical planning (from 69% to 98% with structured reporting, compared with an increase from 25% to 43% with unstructured reporting, which was used as a comparison standard). Currently, several radiologic societies, including the Radiological Society of North America (RSNA) and the American College of Radiology (ACR), offer sample report templates, such as those of the Breast Imaging Reporting and Data System (BI-RADS) and the Liver Imaging Reporting and Data System (LI-RADS), in an effort to standardize the reporting process across the nation. However, each radiology practice remains free to implement its own versions of templates as appropriate for its specific practice.
Some authors in radiology have suggested that a checklist approach to imaging interpretation ought to decrease error rates. A checklist function embedded in a template might well help to prevent perceptual error, in which a finding is simply missed, by reminding the radiologist to look for it in order to fill in a blank in the template. A radiology checklist in this scenario might include the common diagnoses and misdiagnoses typically seen for a specific body part or condition, so that these are always checked before a report is finalized.
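One way to picture a checklist embedded in a reporting template is as a structure that tracks which expected findings have been explicitly addressed and blocks finalization while any remain blank. The sketch below is a hypothetical illustration; the chest radiograph checklist items are examples, not a validated interpretation checklist.

```python
# Hypothetical sketch of a checklist embedded in a reporting template: each
# expected finding for a given study type must be explicitly addressed
# (even if only as "normal") before the report can be finalized.

CHEST_RADIOGRAPH_CHECKLIST = [
    "lungs", "pleura", "heart and mediastinum", "bones", "lines and tubes",
]

class TemplatedReport:
    def __init__(self, checklist):
        self.findings = {item: "" for item in checklist}

    def record(self, item, text):
        self.findings[item] = text

    def finalize(self):
        blanks = [item for item, text in self.findings.items() if not text.strip()]
        if blanks:
            raise ValueError(f"Report incomplete; unaddressed items: {blanks}")
        return "\n".join(f"{item.upper()}: {text}" for item, text in self.findings.items())

report = TemplatedReport(CHEST_RADIOGRAPH_CHECKLIST)
report.record("lungs", "Clear.")
# report.finalize()  # raises ValueError: the remaining items have not been addressed
```

Used this way, the template itself acts as a forcing function for completeness of the interpretation.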