Case Study: An Insulin Overdose
Morana Lasic, MD, Clinical Instructor in Anesthesia, Harvard Medical School, Brigham and Women's Hospital
At the end of this activity, you will be able to:
Discuss how errors in communication can lead to adverse events.
Identify two systemic factors that can lead to medication errors.
List at least two ways to change systems to prevent such errors.
You are a senior anesthesia resident on call at a large hospital and suddenly find yourself at the center
of an avoidable medication error. You’re devastated and left wondering — is it all your fault? In this
case, we’ll consider a number of factors that might have contributed to the drug error that occurred,
as well as some possible ways in which the error might have been prevented.
Related IHI Open School Online Courses:
PS 101: Fundamentals of Patient Safety
PS 102: Human Factors and Safety
PS 105: Communicating with Patients after Adverse Events
Keywords: Communication, patient safety, adverse event, adverse drug event (ADE), teamwork, medication safety, surgical safety, environmental design, reliable processes.
It is 3 a.m. on your call night. You are a senior anesthesia resident on call at a large hospital and you
have been working non-stop since 6:30 a.m. the day before. You have just finished a long surgery and
your beeper alerts you to a very sick patient from the Cardiac ICU. You phone the surgery resident
and she gives you the following patient history:
Mr. H is a 78-year-old gentleman who was admitted to the cardiac ICU last night for an active
MI (myocardial infarction). On admission exam, his abdomen was found to be distended and
rigid, and the surgical team suspects ischemic bowel, for which immediate surgical
exploration is warranted. Mr. H is intubated, his vital signs are currently stable on several
inotropes (IV drips supporting heart function), and his kidney function has been getting worse.
You call your attending on call and assemble the team who will be caring for Mr. H. Several of the
anesthesia residents work on getting the operating room ready while another resident runs to the
Cardiac ICU to evaluate Mr. H and to obtain informed consent from his family.
You are keenly aware of the high-risk nature of this procedure. In your head, you quickly run through
some possible “worst case” scenarios in order to better anticipate complications. You discuss these
with your attending and your colleagues.
The patient arrives in the operating room and the surgery begins. Soon after the incision is made, the
diagnosis of ischemic bowel is confirmed. The patient loses some blood and the decision is made to
transfuse a unit of packed red cells. At the same time, labs are drawn and, soon after, your team is
alerted to a dangerously abnormal result. Mr. H’s potassium is high and his creatinine (an indicator
for kidney function) continues to increase, showing signs of further kidney deterioration.
The high potassium is a cause for concern. In a setting of worsening kidney failure and an ongoing
blood transfusion, Mr. H’s body is ill equipped to eliminate the excess potassium. With this in mind,
your attending tells you to administer 10 units of insulin (in addition to the other concurrently
utilized potassium-lowering treatments). You reach for the insulin vial and draw 10 ccs. You then
attach your 10-cc syringe to the IV line and inject all of the insulin. You are stricken with panic as you
register what you have just done. You remember that each cc contains 100 units of insulin. You just
delivered 1000 units of insulin to an already compromised patient.
1) What factors about this case made this error more likely to occur?
The circumstance described in the case was ripe for miscommunication and false assumptions.
For one, the anesthesia team was fatigued. For another, the patient was very sick and the surgery
would have posed a significant challenge to any team, even under the most optimal conditions.
The insulin was packaged in a way that was not useful to the OR team. An insulin vial contains
highly concentrated insulin (100 units/cc) since it is packaged for SUBCUTANEOUS use (where
much larger doses are required), rather than INTRAVENOUS administration. For intravenous
use, insulin needs to get diluted into a much lower concentration (usually 1 unit/cc). Thus, each
time anesthesiologists in the OR use IV insulin, they first need to physically dilute it. Doesn't this
practice seem like a mistake waiting to happen, especially under the circumstances described in
this case?
2) Can you think of ways this error could have been prevented?
This is a good example of a systemic problem that requires a systemic solution to reliably
minimize a potential for future drug error. One possible approach might be to cease having the
concentrated insulin vials available in the OR pharmacy, and to have the pharmacists take over
the preparation of all diluted insulin solutions (for example, 100 units of insulin in 100 ccs of fluid
for a very user-friendly concentration of 1 unit/cc).
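The dose-to-volume arithmetic at the heart of this error, and of the proposed fix, can be made concrete. The sketch below is purely illustrative (it is not part of the case, and the function name is hypothetical); it simply applies the concentrations discussed above: a dose in units divided by a vial's concentration gives the volume to draw.

```python
def volume_for_dose(dose_units, concentration_units_per_cc):
    """Return the volume (in cc) to draw up for a given insulin dose."""
    return dose_units / concentration_units_per_cc

# Concentrated vial packaged for subcutaneous use: 100 units/cc.
# A 10-unit dose should have been only 0.1 cc from this vial.
print(volume_for_dose(10, 100))  # 0.1

# Pharmacy-diluted solution: 1 unit/cc -- here 10 units really is 10 cc.
print(volume_for_dose(10, 1))    # 10.0

# What actually happened: 10 cc drawn from the concentrated vial.
print(10 * 100)                  # 1000 units delivered
```

The "user-friendly" 1 unit/cc dilution works precisely because it makes the number of units equal the number of ccs, removing the mental conversion step at the moment of highest stress.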
Let us consider a few other examples in which medication administration is made safer by
changing the system, rather than by solely relying on practitioners’ vigilance to ensure patient
safety. For instance, anesthesiologists utilize color-coded labels for medications so that each
commonly used drug family has its own label color (red for paralytics, blue for
narcotics, gray for local anesthetics, etc.). Thus, even a quick glance at a syringe can be very
helpful in reducing the mix-up of the different drug categories (imagine administering a long
acting paralytic instead of a narcotic at the end of the surgery!).
Similarly, different concentrations of medications like Ketamine (a potent anesthetic) are
packaged completely differently, by varying the vial size, color of the label, etc. None of these
strategies is meant to replace vigilance, but each can greatly augment the safety of our practice.
Safety means fully acknowledging that to err is human.
3) What communication challenges and failures contributed to this medical error?
The attending assumed that the senior resident was well aware of the insulin dosing and the way
insulin was packaged. The attending may have felt that reviewing the insulin dosing with this
seasoned resident would have been construed as an insult, or at the very least, as a statement of
distrust.
Meanwhile, the resident might have felt uncomfortable confirming the appropriate dosing with
the attending. It is not uncommon for residents to fear being judged for asking “obvious”
questions. The working environment did not foster a level of safety where all questions were
welcome, as obvious as they may have seemed.
4) How could you improve communication on the unit to prevent a similar error
from occurring next time?
Although all U.S. residency programs now limit work hours, it is still likely that a clinician will
one day find herself tackling a complex situation when she is very fatigued. Given circumstances
like these, every member of the medical team must feel comfortable voicing his or her concerns at
all times.
In a hierarchical system like the medical residency training program, the person with higher
authority has a responsibility to establish a safe working environment. The attending, keenly
aware of everyone’s fatigue as well as the high stakes of this case, might have stated something
like this: “Insulin comes in 100 units/cc. We need to dilute it into 1 unit/cc before administering
it to the patient. How do you plan to do that?” Not only would this have clarified an important
dosing point, but it would have also helped foster a safe environment in which all questions and
concerns were actively encouraged. Additionally, the team might have practiced closed loop
conversations so that everyone would have known when an important task was completed (“I just
gave 10 units of insulin”).
On the other hand, the resident might have somehow mustered the courage to ask how to
administer insulin. Although it may be difficult for a resident to ask what seems to be an obvious
question, it is even more difficult when a patient suffers as a consequence of not asking.
You immediately inform your attending and, while one of you starts administering IV dextrose, the
other prepares to check the blood glucose level. You inform the surgeon of the adverse event.
After the surgery is completed, you transfer Mr. H back to the Cardiac ICU and accompany the
surgeon to speak to the family.
Mr. H’s glucose levels fluctuate over the course of the next few hours, but due to the continued
vigilant blood glucose monitoring, Mr. H’s blood sugar levels never get dangerously low. Nevertheless,
his overall condition continues to deteriorate and Mr. H expires later that day.
You leave the hospital after a quick and awkward conversation with your attending and you feel awful.
You blame yourself for the drug error—after all, if you had just been more careful, this would not have
happened (and getting the strange looks from some of the residents and faculty certainly did not make
you feel better…). You wonder why you went into medicine anyway and if you will ever make a decent
doctor.
1) In this case, insulin was routinely distributed from the OR pharmacy in high concentrations,
thereby posing a great danger if administered incorrectly. Can you think of an example in your
own work environment where you thought to yourself, “This is a disaster waiting to happen!”
2) Now that you have identified examples from your work environment, take a closer look at the
suboptimal systems that may be at work. Why does this “disaster waiting to happen” exist in your
workplace? Is it because there is no reporting system for such situations? Do the employees feel
disempowered to change their workplace? There are many possibilities.
3) Next, what are some ways in which those systems could be improved?
4) Let us assume that the resident caught her mistake before she injected the insulin, so
that no harm was done. We may think of such a scenario as a “near miss” event. In many work
environments “near misses” are greatly underreported, and the opportunity to learn from them
gets lost. If you are a clinician, can you think of a “near miss” experience of your own? How did
you change your practice as a result?
5) Mistakes and adverse events often trigger the “who” rather than the “how” question, thereby
fostering a blaming environment in which reporting mistakes and near misses does not feel safe.
Think of a mistake or near miss that you observed at work. How was it handled? Did the
organization’s response to the error make you more or less likely to report your own mistakes?
6) If your answer to Question 5 was “less likely,” what could the organization have done differently
to make you more likely to report your mistakes?