A patient was admitted to the hospital for uncontrolled diabetes. A nurse taking care of her checked her blood glucose level. The glucometer showed the patient’s blood glucose was dangerously high. Repeated tests yielded the same result.
The patient was given insulin, and her condition deteriorated. The rapid response team was called; the patient was in a hypoglycemic coma. She was given multiple doses of glucose and transferred to the intensive care unit. The human resources department suspended the nurse.
Another nurse in the same healthcare system had made the same apparent mistake a few days earlier. This sequence of events prompted safety leadership to request a human-factors review of the glucometer. When engineers simulated use of the glucometer, a pop-up text box appeared that read, “Critical Value, Repeat Lab draw for > 600.” This was the message the nurse saw; it obscured the actual blood glucose reading, which was low.
Hospital staff had customized the glucometer so that a pop-up message would alert users to critical readings, but the pop-up partially covered the reading itself. Why would a glucometer be designed to allow the blood glucose reading to be hidden?
The patient recovered. She was informed of the events and received an apology. The nurse was reinstated and the hospital apologized to her. The nurse agreed to be interviewed; her interview can be found at https://www.youtube.com/watch?v=yVStl7zqmjM.
Human factors is a scientific discipline that studies how human beings interact with technology and equipment. A guiding principle is that equipment and devices should be designed to match the cognitive processes and work environments of their users. Human-factors engineering principles are applied in airplane cockpits; no new equipment or software is used without thorough testing for safety, reliability, and interoperability with other equipment.
But in health care, new devices and equipment are routinely placed in the “healthcare cockpit” with little, if any, testing for usability and interoperability. Nurses are expected to just deal with them.
Contrast this approach with what happened when the iPhone 4 was introduced. The phone’s design made it easy for users to cover the antenna with their fingers, causing failed connections and dropped calls. Consumers responded vociferously, and Apple fixed the design flaw. The manufacturer didn’t tell users to keep their fingers off the antenna.
Yet that is effectively the advice given to nurses when equipment and devices aren’t designed with end users in mind. A more satisfying upstream approach would be for national physician and nursing organizations to advocate on behalf of healthcare professionals, pressing manufacturers to remove designed-in defects from their equipment, just as consumers pressed Apple to fix the iPhone 4.
This approach would convey an important message: equipment, computers, and devices that nurses use every day should be designed to prevent defects that can harm patients and distress the nurses who care for them.
Resource
Gibson R, Singh JP. If Only They Were iPhones. In: The Battle Over Health Care: What Obama’s Reform Means for America’s Future. Rowman & Littlefield; 2012:109-118.
Rosemary Gibson is the coauthor of Wall of Silence: The Untold Story of the Medical Mistakes That Kill and Injure Millions of Americans. She is also a senior advisor to The Hastings Center.