Human Factors: Identifying the Root Causes of Use Errors

Instead of blaming test participants for use errors, look more carefully at your device’s design.

July 8, 2013 | 20 Min Read

Have you either observed or perhaps even conducted a summative usability test of a medical device? If so, you have undoubtedly witnessed a use error—an instance when a test participant made a mistake while performing a task. And perhaps the mistake occurred while the test participant was performing a high-risk task, thereby causing angst among the development team members because the use error imperiled a successful validation test.

Following a high-risk use error, the device manufacturer must perform a root cause analysis. The cause might initially appear to be the user’s failure to do the right thing. For example, a usability test participant might have forgotten to open the clamp on a fluid line or calibrate a sensor before using it to measure a vital parameter value. However, jumping to blame the user for a use error is perilous.

Did you know? 

The term “use error,” as distinct from “user error,” was coined in the pages of MD+DI in 1995, when William A. Hyman’s guest editorial, “The Issue Is ‘Use,’ Not ‘User,’ Error,” appeared in the May issue.

A hasty decision to blame the user might overlook the possibility that the medical device itself was at fault. Yet, on the surface, many use errors seem to be the user’s fault. It is tempting to draw such conclusions when test participants make comments such as “That mistake was totally my fault,” “I forgot to follow the instructions,” or “My bad!”

Manufacturers seeking a use error’s true root cause should resist test participants’ readiness to blame themselves for mistakes. There are usually underlying, design-related causes for use errors that might at first seem to be the user’s fault. That is both good and bad news for manufacturers seeking regulatory clearance for new products.

On the positive side, implementing a design change could prevent a use error in the future. However, manufacturers rushing to validate their devices might need to take time-consuming and expensive steps backward to fix their flawed designs. This explains why the instinct to blame the user prevails among many manufacturers, particularly those conducting their first validation (i.e., summative) usability test.

About Blaming the User

Some manufacturers inadvisably adopt a Teflon-coated approach to root cause analysis of use errors, preventing use errors from sticking to the device’s user interface design. These manufacturers might blame the user for a use error because of one or more of the following reasons:

  • Forgetfulness—the user knew the correct way to operate the device but experienced a mental lapse.

  • Inattentiveness—the user did not appear to be concentrating on the task at hand.

  • Fatigue—the user appeared exhausted, which seemed to affect her concentration.

  • Carelessness—the user did not seem particularly concerned with performing the task correctly.

  • Distraction—the user seemed to have something else on his mind.

  • Noncompliance—the user knew the prescribed way to perform a task but chose to take a different approach.

  • Haste—the user seemed to be in a hurry to complete the task or test session.

  • Confusion—the user seemed easily perplexed by the device.

  • Habit—the user followed a routine but incorrect approach to performing the task.

  • Disregard for learning aids—the user neglected available resources (e.g., instructions for use, quick reference cards, online help, hotline) that might have helped him perform the task.

  • Nervousness—the user seemed unsettled by being the center of attention and being recorded.

  • Blunder—the user erred for no apparent reason, at least none attributable to the test’s artificiality.

Before resorting to such design-sparing findings, analysts should explore other potential reasons why a test participant committed a use error. An intensive analysis of use errors usually reveals user interface design flaws, such as missing risk-control measures that should have accounted for normal human failings and protected against related use errors.

Table I. Sample root causes of use errors for various medical devices, including an IV pump, dialysis machine, heart-assist pump, drug injector, and insulin pump. Note that some use errors might have more than one cause.

Assume the Device Is at Fault

When a usability test participant acts forgetful, inattentive, fatigued, careless, or in another way that appears to cause an error, view the user’s behavior as normal and expected rather than the root cause of the error. For example, a test participant might forget to perform a step when setting up a device because there are 25 separate steps that must first be remembered and performed in order. A superior design would not place such a burden on the user.
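
As a purely illustrative sketch of how a design can shift that burden from the user’s memory to the device, consider software that presents one setup step at a time and requires confirmation before proceeding. The step names and prompts in this hypothetical Python sketch are invented for the example and are not drawn from any particular device.

# Illustrative only: the device, not the user's memory, carries the setup sequence.
# Step names and prompts are invented for this example.
SETUP_STEPS = [
    "Hang the fluid bag",
    "Insert the tubing set with the labeled side facing out",
    "Prime the line",
    "Close the downstream clamp",
    "Connect the line to the patient",
]

def run_guided_setup(confirm=input) -> bool:
    # Present one step at a time and require confirmation before proceeding.
    for number, step in enumerate(SETUP_STEPS, start=1):
        answer = confirm(f"Step {number} of {len(SETUP_STEPS)}: {step}. Done? (y/n) ")
        if answer.strip().lower() != "y":
            print(f"Setup paused at step {number}. Complete this step before continuing.")
            return False
    print("Setup complete.")
    return True

if __name__ == "__main__":
    run_guided_setup()

The specifics matter less than the design choice: the device carries the sequence and confirms completion, so nothing hinges on the user recalling a long series of steps in order.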

As another example, a test participant might not bother to prime a tubing set because she previously used a comparable device that did not require priming. While one might cite this as a case of negative transfer and habit, the requirement to prime might not have been communicated well enough to ensure compliance. Moreover, the device might not give any indication (labeling or otherwise) that priming was necessary. A better design would bring the need to prime the device to the user’s attention and check to ensure the task was completed successfully.

The sensible approach to root cause analysis of use errors is to start by assuming that the medical device, not the usability test participant, is to blame. It takes discipline—courage even—to adopt and maintain this mindset. This is particularly true when the development organization is pressing to validate a device, perhaps against a looming deadline to apply for regulatory clearance.

Further impetus to embrace the mindset to seek device-related root causes of use errors is that regulatory authorities—FDA in particular—expect it. Try submitting a human factors engineering (HFE) report that places the majority of the blame for summative usability test use errors onto the participants.1  You might receive a response letter from the agency challenging the rigor of the root cause analysis and associated conclusions. The agency might reject the proposition that users are at fault for use errors and suggest the manufacturer conduct additional studies to identify and mitigate design-related causes.

How do you converge on a device-related root cause for a use error? As suggested, you can identify discrepancies between the device’s user interface and established HFE design principles, such as those found in AAMI HE75 and multiple HFE textbooks.2 Identify ways that the user interface was poorly matched to the use scenario. For example, a given use scenario might require a layperson with a visual impairment (common among users of certain devices) to read a label with small print. As another example, a use scenario might call upon a clinician to detect an audible alarm within a loud working environment. It is important to fully consider user and use-environment characteristics to identify design shortcomings.

Usability test participants who aim to please are prone to blame themselves for mistakes. However, they can still help pinpoint a use error’s design-related causes. Usability test administrators can put test participants on the right path by responding to self-incriminating statements with statements and questions such as “Don’t blame yourself. We are here to evaluate the device and not you. Is there anything about the device that might have triggered the mistake? Is there any way that the device could have prevented you from making the mistake?”

A naïve observer might regard this as baiting the test participant to say something negative about the device. It is actually a simple way to get to the point—obtaining test participant feedback that hints at or specifically cites a use error’s root cause. Following the pattern suggested, a test participant might first state that she wasn’t paying attention when she installed a disposable tubing set backwards on an intravenous pump. After further prompting, she might explain that the set’s two sides look almost identical at first and that it would have helped her if the outward-facing side was labeled as the front.

Sometimes a test participant does not progress from self-incrimination to pinpointing a design flaw. The participant might insist the error was his fault, that the design is perfect, and that he was the culprit. Moreover, the participant might not have the knack for finding fault in a given device. This is when the usability test administrator must back off and move on to the next step in the test rather than push too hard for the test participant to come up with something. Just because a participant maintains that an error was his fault does not necessarily mean that is the case. It means the HFE specialists and supporting cast must derive root causes from what they observed.

In the case of the IV pump’s tubing set, a human factors specialist might conclude that the use error—installing the component backwards—was due to a lack of visual and tactile cues, and that the device passively enabled the user to install the component backwards rather than making the use error impossible by design. The specialist might advise the development team to shape the component and receptacle to communicate the proper means of insertion, introduce a mechanical constraint to prevent any other means of insertion, and add a label indicating which side is the front. While such a use error might not directly cause patient harm, it might lead to a delay in therapy that poses an unacceptable risk to the patient. (See Table I for sample root causes for various medical devices.)

One Format for Documenting a Use Error

Use Error

Did not close downstream line clamp after priming.

Use Error Rate

Five of 30 test participants committed the use error a total of 7 times. Two of these test participants committed the use error twice.

Associated Task and Priority

Task 3 (Priority 4 of 12)

Use-Related Risks

u-FMEA nos. 1.16 and 4.5

Description

Participants RN2, RN9, and LPN5 committed the use error once, with RN7 and LPN11 each committing the use error twice.

Participant Comments About Use Errors

Participants RN2, RN9, and LPN11 speculated that they forgot to close the clamp because the line was not dripping after completing priming. LPN5 reported that she thought the clamp was closed because there was little visual distinction between an open and closed clamp, particularly when viewed in relatively dim lighting. RN7 said that the trainer did not tell him that the clamp needed to be closed prior to patient connection.

(Note that the video recording of this participant’s training session documented that the trainer did in fact instruct the nurse to close the clamp after priming.)

Root Cause Analysis

The use error occurrences appear to have multiple root causes:

  1. The line does not drip substantially once priming is completed. Therefore, it does not provide a conspicuous indication that the line is open versus closed (i.e., clamped).

  2. There is little visual distinction between an open and closed clamp. (Option to include photo of both clamp positions.)

  3. The device does not automatically check or direct the user to check to ensure the clamp is closed prior to patient connection.
     

Reporting

After identifying the root cause(s) of use errors that occur in a usability test, one must document the findings in a clear and compelling manner. One possible model for documenting use errors is shown in the accompanying sidebar.

Such use error reports can contrast sharply with a manufacturer’s optimism that its device will pass its summative usability test. Nevertheless, test report writers should document the results in an objective, straightforward manner that facilitates a residual risk analysis.
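
For teams that also keep their findings in a structured, machine-readable form, a simple record mirroring the sidebar’s headings can help keep entries consistent across a report. The following Python sketch is hypothetical; its field names merely echo the sidebar and do not represent a prescribed or standard format.

# Hypothetical sketch: a structured record mirroring the sidebar's headings so
# use errors are documented consistently across a test report.
from dataclasses import dataclass, field
from typing import List

@dataclass
class UseErrorRecord:
    # All field names are illustrative; they simply mirror the sidebar's headings.
    use_error: str                  # e.g., "Did not close downstream line clamp after priming"
    occurrences: int                # total number of times the use error was observed
    participants: List[str]         # participant codes, e.g., ["RN2", "RN9", "LPN5"]
    task_and_priority: str          # e.g., "Task 3 (Priority 4 of 12)"
    related_risk_ids: List[str]     # e.g., ["u-FMEA 1.16", "u-FMEA 4.5"]
    participant_comments: List[str] = field(default_factory=list)
    root_causes: List[str] = field(default_factory=list)

if __name__ == "__main__":
    record = UseErrorRecord(
        use_error="Did not close downstream line clamp after priming",
        occurrences=7,
        participants=["RN2", "RN9", "LPN5", "RN7", "LPN11"],
        task_and_priority="Task 3 (Priority 4 of 12)",
        related_risk_ids=["u-FMEA 1.16", "u-FMEA 4.5"],
    )
    print(record)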

Conclusion

The well-worn adage “to err is human” is valid. People sometimes make mistakes that have no cause other than human imperfection. But when it comes to mistakes made by medical device users, repeated failures are induced by user interface flaws more often than not. This is borne out in the results of innumerable usability tests in which multiple test participants have made the same kinds of use errors—evidence that use errors are most often induced by common user interface shortcomings such as undersized text, ambiguous labels, lengthy procedures, and a lack of visual or audible feedback.

If a port allows the insertion of the wrong tube, you have a use error waiting to happen. If an alarm signal frequency is too high and the tone too quiet, someone will fail to detect it. If a drug-delivery rate should be in the 0–100 mL/h range, but a device accepts an entry of 1000 mL/h (10 times the intended maximum), there is a good chance someone will overdose a patient. If a procedure is long and lacks sufficient prompts and safety checks, users will inevitably forget to perform certain procedural and perhaps safety-critical steps.
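
To make the rate-entry example concrete, the following Python sketch shows one form of software risk control: rejecting values outside the clinically intended range rather than passing them along to the pump. The 0–100 mL/h limits and the function name are assumptions carried over from the hypothetical scenario above, not any device’s actual specification.

# Illustrative sketch only: a software risk control that rejects out-of-range rate
# entries instead of passing them to the pump. The limits and names are assumptions
# based on the hypothetical 0-100 mL/h example, not any specific device's values.
MIN_RATE_ML_PER_H = 0.0
MAX_RATE_ML_PER_H = 100.0

def validate_rate_entry(entered_rate: float) -> float:
    # Reject values outside the clinically intended range so the user interface
    # must prompt for re-entry or confirmation instead of silently accepting a
    # tenfold overdose such as 1000 mL/h.
    if not (MIN_RATE_ML_PER_H <= entered_rate <= MAX_RATE_ML_PER_H):
        raise ValueError(
            f"Rate {entered_rate} mL/h is outside the allowed range of "
            f"{MIN_RATE_ML_PER_H}-{MAX_RATE_ML_PER_H} mL/h."
        )
    return entered_rate

if __name__ == "__main__":
    try:
        validate_rate_entry(1000.0)  # the mis-entry described in the text
    except ValueError as error:
        print(f"Entry rejected: {error}")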

Poor design can trigger use errors. This is why manufacturers should always begin a root cause analysis by assuming a use error was induced by the device rather than blaming the user. Medical device manufacturers should not turn users into scapegoats for use errors induced by user interface design shortcomings. Pure human blunder is much less common than some people might want to believe, so it should be a root cause of last resort.

References

1. FDA. CDRH. Draft Guidance for Industry and Food and Drug Administration Staff—Applying Human Factors and Usability Engineering to Optimize Medical Device Design. June 22, 2011. Appendix A HFE/UE Report, 30–32. 

2. ANSI/AAMI HE75: 2009, "Human Factors Engineering—Design of Medical Devices" (Arlington, VA: AAMI, 2009).

Michael E. Wiklund, PE, CHFP, is general manager of UL’s human factors engineering practice (Concord, MA). He is also a member of MD+DI’s editorial advisory board. Wiklund coauthored Usability Testing of Medical Devices and Designing Usability into Medical Products and coedited Handbook of Human Factors in Medical Device Design. He is a voting member of the AAMI and IEC human factors engineering committees.
