Dr. Wachter and the hospital are to be commended for publicizing this incident so others may learn from it. The hospital staff, the patient, and his mother also deserve credit for allowing their stories to be told.
A synopsis does not do justice to this well-written account of the boy's near-death experience in a top hospital in San Francisco. In short, he somehow received a massive overdose of the antibiotic Septra despite the presence of a sophisticated electronic medical record and multiple systems in place that were supposed to prevent such a thing from happening.
After the patient recovered from receiving 38½ pills when he should have been given only one, a root cause analysis identified numerous system flaws: an electronic ordering program that was overly complex, a nurse "floating" on an unfamiliar floor, a satellite pharmacy that was too busy and susceptible to distractions, "alert fatigue" among hospital staff, and a culture, like that of most hospitals, that may have discouraged questioning both authority and the almighty computer.
Dr. Wachter contrasted the error-prone way we used to order medications on paper, which he said could take up to 50 different steps before the medication reached the patient, with the electronic process, which even uses a “smart” robot instead of a human to count out the number of pills to be dispensed.
But in this case, errors such as those caused by illegible handwriting, transcription errors, and the like were replaced with errors we never dreamed of.
Twenty years ago, a human pharmacist probably would have questioned the order as he was counting out 38½ pills of Septra to be given as a single dose. But the "smart" robot didn't bat an eye [robots don't have eyelids].
And most of the nurses of that era would have balked at giving any patient 38½ pills of a single drug at one time.
A French airliner crashed because the pilots didn’t know what to do when the plane’s computer malfunctioned. The author of the lengthy Vanity Fair piece about it said, “automation has made it more and more unlikely that ordinary airline pilots will ever have to face a raw crisis in flight—but also more and more unlikely that they will be able to cope with such a crisis if one arises.”
A brief article called “The case for dangerous roads and low-tech cars,” adapted from Matthew B. Crawford's book “The World Beyond Your Head: On Becoming an Individual in an Age of Distraction,” discusses the possibility that so-called safety advances in automobile design may lull drivers into a false sense of security.
New options such as automatic braking when a car ahead slows down or an alert warning about a car in your blind spot may isolate drivers too much. Crawford says, “The animating ideal seems to be that the driver should be a disembodied observer, moving through a world of objects that present themselves as though on a screen.”
On the subject of roads, he writes, “When roads look dangerous, people slow down and become more heedful” and says that some new roads deliberately built with “less safe” features yield fewer crashes.
Like pilots and drivers, are hospital personnel becoming less vigilant by trusting computers and automation too much?