
A Doctor Confronts Medical Errors — And Flaws In The System That Create Mistakes

Dr. Danielle Ofri's new book, When We Do Harm, explores health care system flaws that foster mistakes — many of which are committed by caring, conscientious medical providers. She notes that many errors go unreported, especially "near misses," in which a mistake was made, but the patient didn't suffer an adverse response.


Transcript

DAVE DAVIES, HOST:

This is FRESH AIR. I'm Dave Davies, in today for Terry Gross. The COVID-19 pandemic has brought more Americans into contact with a strained and stressed health care system. Our guest Dr. Danielle Ofri has spent more than two decades as an internist at New York City's Bellevue Hospital and was there when the surge of coronavirus cases hit the city.

She wrote a piece last month for The New York Times about some of the things hospitals got wrong in treating COVID-19, and it was a subject she had a keen eye for. She'd spent the previous four years researching her new book about medical errors - how frequently they occur, which is pretty often; the kind of harm they cause, which varies widely; and why they happen. She found most errors are caused by conscientious, committed health providers, including herself, and that errors are often abetted by health care delivery systems that can undermine accurate diagnosis and effective treatment. She also compares the U.S. system for dealing with errors, often through litigation, with models from other countries.

Besides her work at Bellevue, Dr. Ofri is a clinical professor of medicine at the New York University Medical School. She's written several previous books and is a regular contributor to The New York Times as well as the New England Journal of Medicine and The Lancet. She spoke to me from her home in New York about her new book, "When We Do Harm: A Doctor Confronts Medical Error."

Well, Danielle Ofri, welcome to FRESH AIR.

DANIELLE OFRI: Thank you, Dave. It's so nice to be here.

DAVIES: I think we have to talk first about what you've gone through over the past couple of months with the COVID-19 epidemic. You have been an internist at Bellevue Hospital for a long time. Describe some of the stresses that you saw this flood of COVID-19 patients putting on the hospital and its staff.

OFRI: Well, the hospital was actually somewhat prepared in that we have a special pathogens unit that was developed at the time of the Ebola crisis. So we had a special, you know, isolation unit and were really preparing for the handful of patients that would probably show up. I think when things got heavy in Italy, we realized that we could be hitting that kind of surge. And the ramp-up that I witnessed - the mobilization is unlike anything I've ever seen. I mean, for an ICU that normally handles 10, 15 patients, there were 110 at the peak. And the general wards were close to 400 patients. That's half of our total beds.

So what we saw was a number of patients and an acuity that was more than I think anyone had expected. But what was impressive was the fact that people were able to step up to the plate. And really, we pulled in everyone from every department in every area, and really, nobody complained. There wasn't a peep of protest. And really, it was impressive to see how people pitched in. But the patients really did suffer quite a bit. They were quite sick and were quite isolated from their families.

DAVIES: You had people working out of their specialty, I assume.

OFRI: Everywhere. I mean, we had to pull - a typical medical team now instead of having, you know, a medical attending physician and a few medical interns would have maybe a pulmonologist or a cardiologist or a dermatologist and a psychiatrist. A couple of orthopedic residents would be there, some visiting nurses from Ohio and everyone in between. Medical students were graduated a few months early. Pretty much anyone who could did. And people came back, volunteered - people who graduated years ago who used to work here. We had quite a coterie of people, and it was impressive because you got perspectives from different specialties. But there also were challenges because people were out of their element.

DAVIES: You said medical students were graduated early. They got their stripes and got to work.

OFRI: They did. Well, it was voluntary. The school offered medical students the option to graduate early and become a kind of junior intern. And about 50 of our students accepted, and they were put right, you know, on the medical teams as interns.

DAVIES: Administrators putting lab coats on and getting back to work, too?

OFRI: Well, you know, there were, you know, nursing administrators who hadn't done ward time who came back. And doctors whose work was predominantly administrative, and some who had retired, came back, too. Really, everyone came back to help out, and not just at Bellevue. All the hospitals, I think, saw that kind of pitching in from every corner.

DAVIES: You wrote in this piece for the Times that hospitals got a lot of things right, but given the stresses, there were some things done wrong. What sort of errors did you see?

OFRI: So I'll stress that what went right was far greater than what went wrong. But in the same way, in my book, I ended up deciding or learning that the idea of medical errors has really expanded to the idea of patient harm because patients can be harmed even when things don't go wrong.

And so my larger perspective in that piece was that patients definitely suffered harms in the system, not always because there were errors, though certainly there were some. I mean, the volume of patients - patients were almost identical in their conditions, with similar names, and it was easy to mix them up. And, you know, it was very overwhelming. But the idea is that patients didn't get as good care as they might have if we had time to think things out - now, of course, we didn't, but we will for the second wave, which we're fairly sure is coming.

So, for example, we did pull a lot of people out of their range of specialties, and it was urgent. But now that we have some advance warning on that, I think we could take the time to train people better. Another example is we got many donated ventilators. Many hospitals got them, and we needed them. But they're all different. The basics are the same, but it's like having 10 different remote controls for 10 different TVs. You know, it takes some time to figure that out. And we definitely saw things go wrong as people struggled to figure out how this remote control worked versus that one. And so trying to coordinate donations to be the same type and the same unit would be one way of minimizing patient harm.

And I'll say that the other area was the patients who don't have COVID. Care for a lot of their medical illnesses suffered - not because hospitals got things wrong, but because we didn't have a way to take care of them. But now we might want to think ahead. What do we do for the things that are maybe not emergencies but are urgent - cancer surgeries, heart valve surgeries that maybe can wait a week or two but probably can't wait three months?

DAVIES: So when you say people suffered harm, that's in addition to the harm that the disease itself brings. But you were saying there were issues in care that really weren't related to specific misjudgments, but just to the fact that the system was overwhelmed.

OFRI: The system was definitely overwhelmed. And so partly it's that the whole country didn't really think we'd be seeing a pandemic of that proportion, but there were things that got mixed up. I mean, we had many patients being transferred from overloaded hospitals. And when patients come in a batch of 10 or 20, 30, 40, you know, it's really a setup for things going wrong. So you have to be extremely careful in keeping the patients distinguished. We have to have a system set up to accept the transfers in which we can take the time to carefully sort patients out. Especially when every patient comes with the same diagnosis, it is easy to mix patients up. And so, thinking ahead, what does it take to have enough time and space and resources to make sure that nobody gets mixed up?

DAVIES: What was the impact on your hours and the stress that you felt?

OFRI: I think everyone worked more hours. And it was both stressful and, if I can say - invigorating would be the wrong word because it wasn't a pleasurable thing in that so many patients were suffering and dying. But we also felt this unity of purpose in responding to this unprecedented situation. I mean, many people in our country or in the world were on lockdown and were perennially bored and had nothing to do, and we were in some ways fortunate that we had something to do and a reason to go to work. We had the chance to go to work. And being with colleagues who were all committed in the same direction was very inspiring. I mean, that's really the right word.

So it was stressful and hard and long hours, but it also felt very right. And this is why we all went into this profession. So it was very gratifying in the end to realize that we had accomplished something and that patients - although there was probably some harm suffered, overall, I think patients got very well cared for. And the mortality rates were lower than might have been expected given the chaos that could have been there.

DAVIES: How much did you worry for your own safety?

OFRI: I think everyone did, especially in the beginning, when we didn't know how much was circulating from asymptomatic patients. Once we had the right PPE and the right, you know, setup for that, I think we were fine. I felt very safe with the PPE, and our hospital never ran low on PPE, and, you know, we felt protected. But I think we hadn't realized that we'd been unprotected for so long in our community and even between each other, that we ourselves were probably passing it to each other unknowingly.

DAVIES: Did you have colleagues who contracted the virus?

OFRI: Many, yes. Many of our staff got it. And most, you know, thankfully, had mild to moderate cases. But we did have - one of our head nurses died, and there were several staff members who died, some who may have contracted it outside the hospital, perhaps some in the hospital - you know, we'll never know. But, you know, the death of staff members was tragic.

And I think - you know, maybe it's the historians who will have to mark the turning point, but I think for us at Bellevue, when one of our head nurses died in our own hospital, that felt like the turning point, where this felt different than an external force. It felt internal now, and it put ourselves and each other at risk. And to lose someone you've worked with for so many years affected the staff deeply.

DAVIES: Let me reintroduce you. We're speaking with Dr. Danielle Ofri. She is an internist at Bellevue Hospital in New York City. Her new book about medical errors is called "When We Do Harm." We'll continue our conversation in just a moment. This is FRESH AIR.

(SOUNDBITE OF TODD SICKAFOOSE'S "TINY RESISTORS")

DAVIES: This is FRESH AIR, and we're speaking with Dr. Danielle Ofri. She has been an internist at the Bellevue Hospital in New York for more than two decades. She has a new book called "When We Do Harm: A Doctor Confronts Medical Error."

Let's talk about your book. You know, you write that you were inspired to attack the subject of medical errors in part because of a headline that you saw in 2016. I guess an editor saw this headline and asked you about it. What was the headline and your reaction?

OFRI: The headline was from a study that suggested that medical error was the third-leading cause of death in the United States. And she emailed that to me and said, is this really true? And I have to admit I was kind of stymied. I honestly didn't know because on the one hand, if it is true, I feel like I should be seeing it all the time.

I mean, I work in a very busy city hospital. And if medical error is No. 3, I should be seeing almost as much medical error as cancer and heart disease, which are the first and second causes of death. But I wasn't. Or at least, I didn't feel like I was. So either we simply have our blinders on, or maybe it really isn't the third-leading cause of death. And that became kind of my quest, which started the book.

DAVIES: Right. You write about a number of interesting studies, one of them from the Institute of Medicine. They looked at data from a group of hospitals, extrapolated it over the entire United States and came up with this really shocking number - that up to 98,000 people a year could be dying from medical errors. That was roughly equivalent to a jumbo jetliner going down every day. It had a huge impact. What are some of the distinctions that people need to make when they're looking at those numbers - or that you made when you began to look at this more closely?

OFRI: Well, this famous Institute of Medicine report, which was called "To Err Is Human," as well as this third-leading-cause-of-death study - both of them were analyses of previously published data. They weren't primary studies where someone went and looked at the medical records and went through cases in operating rooms. They were looking back at studies that were done mostly of hospitalized patients.

Now, hospitalized patients don't represent the whole population, right? Those are, you know, a small number. They tend to be sicker and older than the average population. And the other thing is looking at the errors being the cause of death. Well, how do you know if an error causes a death?

You could have, for example, a patient who has end-stage liver disease, and they're given the wrong antibiotic. Well, that's an error. But the patient also dies. But did that error cause the death? It can be very hard to tell, especially when you're looking at a study that's reanalyzing a study, and you don't have the primary data. And because those studies are being extrapolated to 350 million people, even a small change in those numbers when multiplied out can have a vast, you know, difference in the numbers that are reported.

DAVIES: But I guess you did come to conclude that there are a lot of errors, some of which cause serious harm, some of which cause not-so-serious harm. You wanted to dig into this deeper.

OFRI: So, I mean, not to give a spoiler - I don't think we'll ever know, you know, what number medical error is in terms of cause of death, but it's not small. It's certainly there. And even if it's No. 9 or 10, it's still quite a lot. And so we should be trying to make the system safer.

The other thing I was interested in researching or looking into was the idea of the near miss. You know, a near miss is when an error happens, but the patient is OK. You gave the wrong medication, but they did fine; there was no ill effect. And that's still an error. A near miss just means the patient got lucky and didn't have the ill effect. But it's still an error. In another situation, that same error could be deadly.

And so near misses are the huge iceberg below the surface, where all of the future errors are occurring, but we don't know where they are. And we don't know where they are because who ever wants to report a near miss? It's embarrassing. It's shameful. It's a pain in the neck. It takes time. And so we don't know where these are happening, so we don't know where to, you know, send our resources to fix them or make them less likely to happen.

DAVIES: Right. And as you get into later in the book, there probably are some ways that you can encourage providers to report more of this and build some useful information. You know, you write that one of the most effective tools for improving patient safety, especially in hospitals, really grew out of the aviation industry. You want to explain this?

OFRI: In the aviation industry, there was a whole development of the process called the checklist. And some people date this back to 1935, when a very complex plane - the B-17 Flying Fortress - was being tested with the head of the military aviation division, and it crashed, and the pilot unfortunately died. And when they analyzed what happened, they realized that this high-tech airplane was so complex that a human being could not keep track of everything, and that even if he was the smartest, most experienced pilot, it was just too much. You were bound to have an error. And so they developed the idea of making a checklist to make sure that every single thing you have to check gets done. And so it put more of the onus on the system - on checking up on the system - rather than on the pilot to keep track of everything. And the checklist quickly decreased the adverse events and bad outcomes in the aviation industry. And that's been adapted to medicine.

And most famously, Peter Pronovost at Johns Hopkins developed a checklist to decrease the rate of infection when putting in catheters, you know, large IVs, in patients. And the checklist is very simple. You know, make sure the site is clean. Put on a clean dressing. Make sure you're wearing the right PPE - nothing, you know, unusual. It's kind of like checklisting (ph) how to brush your teeth. Yet, the rate of infections came right down. And it seemed to be a miracle. Once you start paying attention to the steps of a process, it's much easier to minimize the errors that can happen with it.

DAVIES: Right. And then you write that this was adopted for some other processes and had similar impacts. But then there was a case in Canada where they developed a checklist for safety in operating rooms. And they did a survey that showed that everybody did it - 100% compliance - and it had no impact on safety. There was a reason for this that's really revealing.

OFRI: Well, it's interesting. If you think about the idea of a checklist, this little, tiny, low-tech, inexpensive idea can keep all those jumbo jets from falling out of the sky. And so the media loved it. And the health care industry loved it. And government ministers loved it. And they started checklisting everything, you know - checking for blood clots, discharge planning, advance end-of-life care planning - everything. I mean, you practically couldn't get something, you know, a cup of coffee, without having to do a checklist.

But the problem is, once you have a million checklists, how do you get your work done as an average nurse or doctor? You've got to make these checklists just go away because they just get in the way of getting through your day. And so we just check all of the boxes to get rid of them. And that's what happened with this preop checklist in Canada. And, again, the preoperative checklist was making sure you have the right patient, the right procedure, the right blood type - very simple. And it showed impressive improvements in complication rates in hospitals both academic and high-end, and even in hospitals in developing countries.

So in 2010, the minister of health in Ontario mandated that every hospital use it, planning to show an improvement in patient safety on a grand scale. And as you said, the data did not budge at all, despite an almost 100% compliance rate. And that lets you know that at some point, people just check the boxes to make them go away. They're not really gaming the system per se. But it lets you know that the system wasn't implemented in a way that's useful for how health care workers actually work.

And this brings up the very sort of boring science of implementation, right? These are the dull details: Where are the supplies kept? Who do we call when things run out? What do we do if we're short-staffed one day, you know? Who's going to provide the coffee? What are we measuring? Have we thought about unintended consequences? And this is really not so interesting. It doesn't make great headlines. Hospitals don't put these things on their billboards. But if you don't do them, you can see a 100% compliance rate and no effect on the outcome.

DAVIES: Right, devil's in the details, as they say. Well, you know, one of the things that was fascinating about this was that Peter Pronovost, the guy who really came up with this first checklist, said that there is a critical element to it particularly when you're trying to get surgeons and doctors to be very careful about hygiene, about always washing their hands, about making sure everything is sterile. And that really involved the nurses, right?

OFRI: Right. He said, you know, the secret sauce - if he could pick one - was empowering the nurses, because the nurses are seeing what's happening in the OR, in the ICU, on the wards, when central lines are being put in. And they'll notice these details. Now, the way the hospital hierarchy is set up, it's very hard to speak up against the hierarchy, whether you're the nurse, the medical student or the orderly. Anyone who's perceived to be lower on the totem pole has a disincentive to speak up. And so nurses, like anyone else who is seeing things on the ground, you know, would have to pick and choose what's worth speaking up about and upsetting the applecart over.

But then, you know, he empowered the nurses and said, we'll have your back - you can bring any procedure to a halt, whether it be the biggest surgery or the smallest central line, if you see something that's not being done appropriately, and we will back you up. And once they could do that, they had the clout to say, this isn't being done right, without getting, you know, treated poorly. And that's how we ended up decreasing the harm from these procedures.

DAVIES: Dr. Danielle Ofri is an attending physician at Bellevue Hospital in New York and a clinical professor of medicine at the NYU Medical School. Her new book is "When We Do Harm: A Doctor Confronts Medical Error." She'll be back to talk more about the causes of medical errors after a break. Also rock critic Ken Tucker reviews Bob Dylan's new album, "Rough And Rowdy Ways."

I'm Dave Davies. And this is FRESH AIR.

(SOUNDBITE OF CHRISTIAN SANDS' "ARMANDO'S SONG")

DAVIES: This is FRESH AIR. I'm Dave Davies, in for Terry Gross. My guest is Dr. Danielle Ofri, who spent more than two decades treating patients at New York's Bellevue Hospital. She has a new book in which she writes frankly about medical errors, including some of her own. She analyzes some of the systemic causes of errors and what we do - and, sometimes, fail to do - to prevent them. Her book is "When We Do Harm: A Doctor Confronts Medical Error." You know, you say in the book that most medical errors are committed by dedicated, conscientious providers. Describe an error that you've made.

OFRI: Well, I'd say, how much time do you have...

(LAUGHTER)

OFRI: ...Because we've all committed a lot of errors. When I was a resident, I had a patient admitted for so-called altered mental status. It was an elderly patient from a nursing home. And they were sent in because someone thought they looked a little more demented today than they looked yesterday. And, of course, we were really busy. And this seemed like a ridiculous admission. And the labs were fine. The radiology was fine. And so I just basically thought, let me get this patient back to the nursing home. It's all fine. So I sent the patient to kind of an intermediate holding area to just wait until their bed opened up back at the nursing home.

Well, it turns out that the patient was actually bleeding into his brain. But I missed it because I hadn't looked at the CAT scan myself. Somebody said to me, radiology's fine. And so I took them at their word and didn't look at the scan myself, as I should have. Now, luckily, someone else saw the scan. The patient was whisked straight to the OR and had the bleed drained. And the patient did fine. So in fact, this was a near-miss error because the patient didn't get harmed. Their medical care went just as it should have.

But, of course, it was still an error. It was an error because I didn't do what I should have done. And had the patient gone home, they could have died. But, of course, this error never got reported because the patient did OK. So we don't know - it never got studied or tallied. So it was missed kind of in the greater scheme of how we improve things. But when I look back at how I made the error and, of course - why didn't I speak up? - I was too embarrassed. I mean, how humiliating to tell my attending - my supervisor, to tell my team and, worse, to tell the patient or their family that I almost killed them. So I didn't say a word. And it took me, you know, almost 20 years to speak about it and write about it publicly.

You know, now that it's been some time, it's given me some perspective. I have some empathy for my younger self. And I recognize that the emotional part of medicine is so critical because it wasn't science that kept me quiet. It was shame. It was guilt. It was all of the emotions. And so while we have all of these efforts to decrease medical error, we have checklists and different procedures, if we don't talk about the emotions that keep doctors and nurses from speaking up, we'll never solve this problem.

DAVIES: You know, you write that a lot of the serious problems that come in the treatment of patients involve systems, not just individual decisions by people. And so I thought we'd talk a bit about some of the things that you describe which can lead to or encourage medical error. And one of them is electronic medical records. And, of course, there's a lot that's great about getting records, you know, in digital form. They're easy to read. They're permanent. They're easily shared.

But you certainly have a lot - (laughter) some choice words in the book about some of their limitations or drawbacks. Let's just begin - why don't you describe how you used to write notes from an exam in a patient's chart, you know, in longhand, before there were these digital records. And then describe how electronic medical records alter the way you present information or think about it.

OFRI: All right. So back in the Pleistocene era, when we had paper charts, it was, you know, a straightforward process. The patient would come in, and they would have their chief complaint, you know - I'm having chest pain, or my stomach hurts. And then you'd write the history of the present illness - the HPI - and that's the story. When did it start, you know? Just give me the story of when this pain began. Tell me more about it. Then you'd go back to their past medical and past surgical history. You'd do a bit about their social history - where they're from, do they smoke, do they drink, what do they do for a living, who do they live with?

Then you'd do a physical exam. You'd gather some data from, you know, lab tests or X-rays. And then you'd come up with an assessment and plan. You assess what's going on. You synthesize the data. You come up with a differential diagnosis: What are the possible things that it could be? And here's the plan for how I'm going to figure it out and what I plan to do. It kind of takes the form of a story with a beginning, middle and end. And there was something sort of very unitary about that. And now, with the electronic medical record - the EMR, as we often call it - it's a different process.

And part of it is understanding how the EMR came to be. It really started as a method for billing - for interfacing with insurance companies and medical billing with diagnosis codes. That's the origin. And then it kind of retroactively was expanded to include patient care. And so you see that difference now. For example, when I was seeing patients this morning, I had a patient with diabetes. And it won't let me just put diabetes. I have to pick out one of the 50 possible variations - on or off insulin, with kidney problems, with neurologic problems, in what degree and what stage - which are important. But I know that it's there for billing.

And each time I'm about to write about it, these, you know, 25 different things pop up. And I have to address them right now. But, of course, I'm not thinking about the billing diagnosis. I want to think about the diabetes. But this gets in the way of my train of thought. And it distracts me. And so I lose what I'm doing as I have to attend to these many things. And that's really kind of the theme of medical records in the electronic form: they're made to be simple for billing. And they're not as logical - or they don't think in the same logical way that clinicians do. And it's very fragmented. Things are in different places. Whereas in the old paper chart, everything was in one spot. And now, things are in many spots.

DAVIES: It's almost like you're trying to tell a story about your patient, and you are interrupted every five seconds by somebody asking you a question. And so in order to proceed with an examination, you have to fill out all of these fields in the electronic medical record. And then these alerts pop up as you're prescribing a medication. You have to affirm you understand it can have effects on people who are a certain age, that it can have effects on people who are pregnant.

You're essentially - it's a liability issue, isn't it, really? I mean, you are taking responsibility for the fact that, yes, every single one of these alerts, I have seen the alert. I have considered its meaning and taken the appropriate action in my treatment.

OFRI: I mean, that's the goal. I mean, that's why they're there. And the truth is, I want to be reminded of important things because I can't know them all. And one day I thought, you know what? I'm going to read every single alert that comes up because there could be something important. And I was defeated by my very first patient. This was a gentleman. He was on some blood thinners, which notoriously interact with everything under the sun. He was on, you know, a dozen medications. And so I had, practically, a hundred alerts to go through.

And I realized if I read every one, it'll take me three hours to finish with this patient, so I can't. I just have to say, OK, OK, OK. And I recognized that it's a transfer of liability from the system, from the hospital, from whatever, onto the doctor's shoulders. Yes, well, Dr. Ofri said OK. So if something happens, you know, we can sue her and not the system. And so it feels like a transfer of liability, as well as a transfer of workload, onto the clinicians. And it's not necessarily giving us better patient care.

DAVIES: Let me reintroduce you. We're speaking with Dr. Danielle Ofri. She is an internist at Bellevue Hospital in New York City. She's been there for more than two decades. She has a new book about medical error called "When We Do Harm." We'll continue our conversation in just a moment. This is FRESH AIR.

(SOUNDBITE OF THE ACORN SONG, "LOW GRAVITY")

DAVIES: This is FRESH AIR. And we're speaking with Dr. Danielle Ofri. She is an internist at Bellevue Hospital in New York City. She's been there for more than 20 years. Her new book is called "When We Do Harm: A Doctor Confronts Medical Error."

You cite a really interesting example of how electronic medical records affected care in the case of a man who had traveled from Liberia to Dallas, Texas. It turned out he had been infected with Ebola. But the way the records were set up made it harder for the staff to connect critical information. You want to just explain this?

OFRI: It was a fascinating and tragic case. So this gentleman came in. And at the time, the thing to screen for was travel history, of course, in the presence of fever. And he had just come from Liberia. And the nurse had noted these things. But when the doctor went to see the patient with the electronic medical record, their fields didn't connect. The travel history was sort of tied to the vaccination field. And so it didn't pop up in what the doctor saw. Now, of course, the doctor probably should've asked about travel history given that Ebola was on the horizon, but he didn't. And so he sent the patient home. And the patient, of course, got sicker and sicker. And by the time the patient came back, he was gravely ill, you know, vomiting and bleeding. And, of course, the ambulance was contaminated.

In the end, two nurses contracted Ebola from this. And that could've been avoided if, you know, the doctor's and nurse's fields had connected. In fact, it could've been avoided if the doctor and nurse had been there together and actually talking to each other, which is kind of what we did in the old days. But now, everyone's off with their little computer or tablet seeing the patient independently. And there's no - there's very little conversation between team members. And that was a tiny thing. But the ramification was enormous. I think almost 160 people had to be quarantined based on the exposures that that one lapse ended up causing.

DAVIES: You know, it's interesting. I picture people listening to us who work in electronic medical records or feel like they're great objecting to some of this conversation. But it gets to an idea which you mentioned in the book, which is that implementation is key. You can have great ideas. But unless you really work through how it actually affects the people who use it, you can get into trouble.

OFRI: Yeah. And, of course, listen; there are wonderful things about the EMR. I don't mean to sound like it's all a disaster. In fact, the COVID crisis really pointed that out. Patients were getting transferred, you know, right from an ER of one hospital to an ICU of another, you know, 30 or 40 at a time. If we didn't have the EMR, we'd have no idea what was going on with these patients. So it was critical that we had EMRs that we could use between our hospitals. It has an amazing potential.

Another place where it can be very helpful is outbreaks of infections within a hospital. There was a case where a hospital had an outbreak of C. difficile, which is a terrible diarrheal illness from which patients can die. And it's very, very hard to get rid of. And by using the EMR, the hospital could track where every single patient in the entire hospital had been every minute of the day, in essence. And they could track down which staff they came in contact with.

And so they could churn through an amount of data that no human could and were able to identify that the source was a single CAT scan machine - just one of many - that hadn't been cleaned properly. And that would have been impossible with just humans and pen and paper. So it has many wonderful things. Listen - it's nice not to have to decipher handwriting. I can actually read, you know, the surgeon's note, and the X-rays aren't sitting in someone's back pocket. These are very, very helpful things. But it has downsides, too. And if it's not implemented correctly, it can actually make more work and not make things easier.

DAVIES: How effective are malpractice suits as a remedy for medical error as you've considered this?

OFRI: You know, we tend to think of malpractice suits as, oh, someone cut off the wrong leg or operated on the wrong side. That's a very tiny sliver of malpractice suits. A majority will boil down to some form of communication error - either between the doctor and the patient, the doctor and the family, or between the teams. And often, people sue, as in this case, to get information, to find out what happened. That's really a shame, because it shouldn't have to take a legal procedure to find out, you know, what happened.

And nobody wins. That's really the truth. Even if the patient wins the case or the doctor wins the case, nobody comes out feeling victorious. For both parties, it's painful. It's excruciating. It ruins their lives for years. The process typically drags on for three to five years or even more. And no one comes out unscathed. And it doesn't really solve the problem.

DAVIES: Yeah. It sounds like, in some respects, I mean, one of the clearest paths to better patient safety is giving doctors a little more time with their examinations and giving hospitals a little better staffing.

OFRI: Well, yeah. But that all comes down to money because the number of patients a doctor sees each day is how the hospital pays the bills. And if you see 30 patients instead of 20, you know, you make more money. And if you see fewer patients, you make less money. So it's really a tradeoff. It's really a manifestation of what we value financially in the medical system. So for example, I have a lot of patients with diabetes. And a big part of their care is talking about what they eat and how they exercise and how they take their medications. This takes a lot of time. It's not very dramatic and exciting. And you don't get reimbursed much for that.

But if, while I'm talking to my patient, you know, about how to cook broccoli, I thread a catheter into one of their orifices - you can pick any orifice you want - the reimbursement will go up, like, tenfold. And if I did a CAT scan at the same time, it's, like, twentyfold. So the more procedures you do, the more the hospital will get paid. But, in fact, the real way to help the patient's illness and to prevent medical error is to spend the time talking to them. But that's not reimbursed very much in our current system. We don't value it that much, so we don't give ourselves much time for it.

DAVIES: Do you have advice for people who are going into the hospital, or family members?

OFRI: Yes. So one thing is to, you know, be as aware as you can. Now, of course, you're busy being sick. You don't necessarily have the bandwidth to be on top of everything. But to the best that you can, have someone with you and keep a notebook. Ask what every medication is for and why you're getting it. What are the side effects? And if people are too busy to give you an answer, remind them that that's their job, and it's your right to know and your responsibility to know.

And if you can't get the information you want, there's almost always a patient advocate office or some kind of ombudsman either at the hospital or with your insurance company. You should feel free to take advantage of that. The information in the charts is yours. You own it. And so if someone's not giving you the time of day or the explanation, it's your right to demand it. Now, of course, recognize that people are busy. And most people, you know, are trying their best. And you could certainly acknowledge how hard everyone's working. But don't be afraid to speak up and say, I need to know what's going on.

DAVIES: And have someone with you who can ask the questions that you may be too sick or distracted to come up with.

OFRI: Right, of course. During COVID-19, though, when you can't have someone with you, it's been very, very difficult. And certainly, there are times when people maybe don't have family members. It can be very challenging in certain situations.

DAVIES: Dr. Danielle Ofri, thank you so much for speaking with us.

OFRI: Thank you. It was a pleasure.

DAVIES: Dr. Danielle Ofri is an attending physician at Bellevue Hospital in New York and a clinical professor of medicine at the NYU Medical School. She's also co-founder and editor-in-chief of the Bellevue Literary Review. Her new book is "When We Do Harm: A Doctor Confronts Medical Error."

Coming up, rock critic Ken Tucker reviews Bob Dylan's new album, "Rough And Rowdy Ways." This is FRESH AIR.

(SOUNDBITE OF SONG, "WIGWAM")

BOB DYLAN: (Vocalizing)

Transcript provided by NPR, Copyright NPR.
