June 15, 1997

How Can We Save the Next Victim?

By LISA BELKIN

On a Friday afternoon last summer, tiny Jose Eric Martinez was brought to the outpatient clinic of Hermann Hospital in Houston for a checkup. The 2-month-old looked healthy to his parents, and he was growing well, so they were rattled by the news that the infant had a ventricular septal defect, best described as a hole between the pumping chambers of his heart. He was showing the early signs of congestive heart failure, the doctors said, and those symptoms would need to be brought under control by a drug, Digoxin, which would be given intravenously during a several-day stay. The child's long-term prognosis was good, the doctors explained. Time would most likely close the hole, and if it did not, routine surgery in a year or so would fix things. The Digoxin was a bridge between here and there. There was nothing to worry about.

The lesson of what happened next is not one of finger-pointing or blame. In fact, the message of this story is quite the opposite: that finger-pointing does not provide answers, and that often no one - no one - is to blame. No single person caused the death of that child in the pediatric special care unit of Hermann Hospital on Aug. 2, 1996. No isolated error led his heart to slow and then stop, suddenly and irreversibly, while his mother, Maria, was cuddling him and coaxing him to suck on a bottle. No one person was responsible, because it is virtually impossible for one mistake to kill a patient in the highly mechanized and backstopped world of a modern hospital. A cascade of unthinkable things must happen, meaning catastrophic errors are rarely the failure of a single person, and almost always the failure of a system.

It seems an obvious point, one long understood in other potentially deadly industries like aviation, aerospace and nuclear power. In those realms, a finding of human error is likely to be the start of an investigation, not its conclusion.
''If a pilot taxis out and takes off with the flaps up, yes, it's human error,'' says John Nance, an airline pilot and aviation analyst who has spent much of his time this past year as an adviser to the health care industry. ''But the next question is, 'What caused the error?' It's not because the folks in the cockpit say, 'O.K., guys, we can go take off with the flaps up and die, or we can put them down and make it home for dinner.' Were they confused? Tired? Misinformed? That's still not an answer. What caused them to be confused or tired or misinformed? That's where you learn something useful.''

This systems approach to errors has been slow in coming to the health care industry. Perhaps that is because operating room slips are far less obvious and dramatic than plane crashes, and to discuss error as an integral part of medicine is to shine a light on how many errors there actually are. Or maybe it is because we accept that machines are in charge of the space shuttle, but still want to believe that human beings are in charge of our health. Possibly it is because doctors have long been trained to think that they can be - must be - perfect, and patients have been conditioned to accept no less. Whatever the reason, medicine continues to focus on who while other fields try to focus on why.

''The mentality has always been: 'Who's the person? Who do I blame? Give me a name,' '' says James Conway, who became chief operating officer of the Dana-Farber Cancer Institute in Boston in 1995 during the restructuring after the death of one patient and the injury of another from a medication error. ''But that 'going for the jugular' approach hides problems in the system, problems you don't see if you don't look at it as a system error. People don't make errors because they want to, or because they're bad people. Everybody makes errors. Every human being.
What we need to focus on is how best to design our systems so that those errors are caught before they reach the patient.''

Slowly, tentatively and very recently, health care has started to shift that focus. Patient safety is coming to be recognized as a systems problem, for a chain of reasons - the death at Dana-Farber, the tort reform movement in Congress, fears that quality is suffering under managed care. That awareness is growing throughout medicine - at the American Medical Association, at several major malpractice insurers, in the offices of academic researchers and at dozens of medical centers nationwide. Soon it will be everywhere, because the organization that accredits hospitals has announced that systemic evaluation of errors will be required at all hospitals that report a serious mistake.

One of the newly converted is Hermann Hospital, which stumbled into this burgeoning revolution by accident and as a result of an accident. The significance of the death of one baby, therefore, lies not only in how he died, but also in what happened at the hospital after he died. ''The entire organization was mobilized to go back and look at this from a systemic point of view,'' says Lynn Walts, Hermann's chief operating officer. The internal investigation found that six separate people had noticed, or had a chance to notice, that the infant was being given 10 times the appropriate dose of Digoxin. As a result, nearly every procedure at Hermann is now being looked at anew.

These changes in philosophy and procedure, Walts knows, will not undo the damage already done. They will not ease the anguish of the Martinez family nor soothe the psyches of their baby's doctors and nurses. Nor will the new approach absolve the hospital of liability for its mistake. But what it can do - indeed, is already doing - is keep mistakes like that from happening again. And it can replace a paralyzing atmosphere of blame with the healing sense of moving forward, toward a goal.
''If we had looked at things differently five years ago,'' Walts says, ''maybe this mistake wouldn't have happened. We can't change it, but we can make sure we don't look back with the same regret five years from now.''

The night Jose Martinez died, Hermann Hospital was two weeks away from a visit from the Joint Commission on the Accreditation of Healthcare Organizations, a body with the power to take away a hospital's economic lifeblood by making it ineligible for payments from H.M.O.'s and other health care organizations. The commission, sanctioned by organizations like the A.M.A. and the American Hospital Association, is an example of an industry governing itself. It has its share of critics, who call it a watchdog with no teeth; in fact, the joint commission rarely removes hospital accreditations. But whatever teeth it has, they are sharp enough to cause great stress at hospitals preparing for a joint commission inspection. At Hermann, those preparations had been under way for a year, and in the final weeks employees were given lists of possible questions, with the proper answers. Administrators walked around conducting mock reviews: What do you do in case of a fire? What is your job? How does it fit into the hospital system?

When word spread about the overdose given to Jose Martinez, therefore, the first question asked on the executive floors at Hermann was, ''How did this happen?'' The question asked immediately after that was, ''Should we tell the joint commission?'' ''They don't directly ask you, 'Did you have a sentinel event?' '' Walts says, using the commission's term for a major mishap. ''And it was so recent that they probably wouldn't have discovered it in our paper trail.'' It was soon decided, she says, that full confession would serve the hospital best. Joanne Turnbull, the chief quality and utilization officer, was assigned the task of figuring out what happened and explaining it to the visitors.
Turnbull, a take-charge woman who is a social worker and psychologist by training, spoke with each of the central figures in the case, and each interview seemed to widen the circle. Within a few days, in time for the all-important meeting, she had a sense of the scenario that had caused Jose Martinez to die. On the Friday afternoon that the boy was admitted, she says, the attending doctor discussed the Digoxin order in detail with the resident. First, the appropriate dose was determined in micrograms, based on the baby's weight, then the micrograms were converted to milligrams. They did those calculations together, double-checked them and determined that the correct dose was .09 milligrams, to be injected into an intravenous line. They went on to discuss a number of tests that also needed to be done, and the resident left to write the resulting list of orders on the baby's chart. With a slip of the pen that would prove fatal, the resident ordered 0.9 milligrams of Digoxin rather than .09. The list complete, the resident went back to the attending doctor and asked, ''Is there anything else I need to add on here?'' The attending scanned the list, and said no, there was nothing to add. The error went unnoticed. A copy of the order was faxed to the pharmacy, and a follow-up original copy was sent by messenger. The pharmacist on duty read the fax and thought that the amount of Digoxin was too high. The pharmacist paged the resident, and then put the order on top of the pharmacy's coffeepot, site of the unofficial ''important'' pile. What the pharmacist did not know was that the resident had left for the day and did not receive the page. Sometime later, the backup copy of the as-yet-unfilled order arrived at the pharmacy. This time a technician looked at it and filled a vial with 0.9 milligrams of Digoxin. The technician then set the order and the vial together on the counter so that the pharmacist could double-check the work. 
The pharmacist verified that the dosage on the prescription matched the dosage in the vial, apparently without remembering having questioned that dosage in the first place. The Digoxin order was sent up to the pediatric floor. A nurse there took the vial, read its dosage and worried that it was wrong. She approached a resident who was on call but had not personally gone over the drug calculation with the attending. ''Would you check this order,'' she asked. Or maybe she said, ''Is this what you want me to give?'' The resident took out a calculator, redid the math and came up with .09, the correct dose. Looking from the calculator to the vial, the resident saw a ''0'' and a ''9'' on both and did not notice the difference in the placement of the decimal point. There was one remaining step. Following procedure, the first nurse asked a second nurse to verify that the order in the chart was the same as the label on the vial. She did, and it was.

At 9:35 P.M., a troubled nurse gave Jose Martinez a dose of Digoxin that was 10 times what was intended. It took 20 minutes for the entire dose to drip through his IV tube. At 10 P.M., the baby began to vomit while drinking a bottle, the first sign of a drug overdose. Digoxin works by changing the flux of ions in the heart, altering the cell membranes. Too much allows the heart to flood with calcium, so it cannot contract. There is an antidote, called Digibind, and the nurse, her fears confirmed, called for it immediately. But even immediately was too late.

''They killed my son,'' the boy's father, Jose Leonel Martinez, sobbed on the local TV news. ''Those people who work there are not professional and they shouldn't be there.'' A restaurant worker who had moved his family from Mexico a few years earlier, Martinez was shocked that the world's best health care system could make such a mistake. ''When I asked the doctor if the medicine they were going to put in him was strong, the doctor said no, that it was normal,'' he said through an interpreter.
''That it was just so the child would function better.''

The residents and the nurse were ''given some time off'' during the investigation, Walts says; no one was fired. ''It sobered us to realize that we've always dealt with errors as a discipline problem, yet we're not eliminating errors by firing people,'' she adds. All those in the chain of error are back at work, and all are still haunted by the death of Jose Martinez. When the system fails, the patient is not the only victim. ''It was an absolutely devastating thing,'' the attending doctor says. ''The loss to the parents was indescribable. There are no words. ... The only thing that made it possible for me to struggle through was my concern for these young people'' - meaning the two residents. ''I had to make them understand that this did not mean they were bad doctors.''

After hearing Turnbull's account, the joint commission placed Hermann on ''accreditation watch,'' a category so new that the hospital was in the first group to receive the designation. It required that Hermann analyze the root cause of the error - not only what went wrong, but also why it went wrong - and develop a plan to fix it.

The change of approach at Hermann, the change of approach throughout medicine, all of it had its start during those frightening, disorienting, reorienting months in 1995. That was the year, says Dr. Dennis S. O'Leary, the joint commission's president, that ''medicine went to hell in a handbasket.'' It is impossible to tell whether errors increased significantly in 1995 and, if so, whether the increasing complexity of medicine and the concomitant cost-cutting of managed care were to blame. What is clear is that it seemed as if error was everywhere, as if the system was out of control.
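The Hermann overdose came down to a single decimal place: 0.9 milligrams transcribed instead of .09. The kind of automated safeguard the human factors researchers describe can be sketched in a few lines of code. This is an illustration only, not Hermann's system; the function name, the acceptable dose range and the patient weight here are assumptions chosen for the example, not clinical guidance.

```python
# Illustrative sketch of a weight-based dose range check, the kind of
# safeguard computerized order entry can provide. The safe range and the
# weight below are assumptions for illustration, not clinical guidance.

SAFE_RANGE_UG_PER_KG = (15.0, 30.0)  # assumed acceptable range, micrograms per kilogram

def check_digoxin_order(dose_mg: float, weight_kg: float) -> bool:
    """Return True if the ordered dose falls inside the assumed safe range."""
    dose_ug_per_kg = dose_mg * 1000.0 / weight_kg  # milligrams -> micrograms per kg
    low, high = SAFE_RANGE_UG_PER_KG
    return low <= dose_ug_per_kg <= high

weight = 4.5  # assumed weight of a 2-month-old, in kilograms
print(check_digoxin_order(0.09, weight))  # intended dose, 20 micrograms/kg -> True
print(check_digoxin_order(0.9, weight))   # transcribed dose, 10 times higher -> False
```

Under these assumptions, the intended order passes and the tenfold order is flagged before it ever reaches a pharmacist's coffeepot pile - the point of such a check is that it depends on neither memory nor concentration.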
The year began with the news that Betsy Lehman, 39, a health columnist for The Boston Globe, had died not of breast cancer but of a fourfold miscalculation in the amount of Cytoxan she was being given at Dana-Farber to battle her breast cancer. It happened because the total dose to be given over four days was instead given on each of the four days, an error that was not corrected by doctors, nurses or pharmacists. At about the same time, a vascular surgeon at University Community Hospital, in Tampa, Fla., was accused of amputating the wrong leg. Then came reports that the wrong side of a brain had been operated on at Memorial Sloan-Kettering Cancer Center in New York. In an echo of the Dana-Farber error, a patient at the University of Chicago Hospitals died of a huge overdose of chemotherapy medication, also because the wrong dose was written down and the error wasn't noticed until the drug was administered.

The joint commission had recently visited and accredited all those hospitals. That fact alone was reason for concern at the commission's headquarters in Oakbrook Terrace, Ill., and the immediate response was to conduct some surprise reinspections. The accreditation of the Tampa hospital was temporarily lifted, and Dana-Farber was placed on probation. The longer-term response was to create the category of ''accreditation watch.'' It was intended to replace the punishment of probation with a more collaborative, problem-solving approach. ''The policy should not focus on what happened from an action standpoint, because you can't undo what's been done,'' O'Leary says. ''What can be done is to reduce the likelihood of this happening in the future.''

While the public feared more mistakes, and the joint commission feared overlooking mistakes, doctors began to fear something else entirely. What sobered and stunned organized medicine was not only the enormity of these errors, but also how quickly they became infamous single-sentence sound bites.
The stories took on lives of their own, told and retold with no room for nuance, no blame for anyone but the bungling idiot of a doctor. The wrong-side-of-the-brain incident at Sloan-Kettering, for instance, was quickly abbreviated into a description of a doctor who could not tell his left from his right. In reality, the case was never that simple. When the surgeon stood at the operating table and cut into the wrong side of a patient's brain, it was because he was being guided by a CAT scan and an M.R.I. showing a large tumor on the side he cut into. The wrong set of films, those belonging to another patient, had been brought into the operating room. The surgeon was fired. Now the policy at Sloan-Kettering requires a more comprehensive preoperation check, including matching X-rays with the patient's ID bracelet.

The case of the wrong leg, in Tampa, was similarly simplified into a story of a doctor who could not tell the difference between a diseased leg and a healthy one. To the contrary, the patient had two seriously diseased legs, a result of diabetes. The circulation in both was so poor that there was no pulse in either leg and both feet were cold to the touch. A mistake on the surgical schedule said that the left leg was to be amputated. The error was noticed, but only one copy of the schedule was corrected. When the surgeon scrubbed for the actual operation, therefore, he did so staring at an O.R. schedule saying that the left leg should be amputated. He then walked into the operating room, past a blackboard at the control desk, which also indicated that the left leg was to be amputated. The schedule inside the O.R. said the same thing. When the surgeon approached the operating table, he found his patient already completely draped, save for his ulcerated and swollen left leg, which had been prepped by the nurse for amputation.
The surgeon was placed on probation for two years (though his license was reinstated after six months), and he was fined $10,000. The hospital has changed its policy so that corrections on one copy of the schedule must appear on all copies of the schedule.

For decades, the American Medical Association's approach to error has been to describe it as an aberration in a system that is basically safe. For instance, when researchers at Harvard University released a 1993 study estimating that one million preventable injuries and 120,000 preventable deaths occurred in American hospitals in a single year, the A.M.A. dismissed the study's method as unsound and its conclusions as alarmist. ''We were very defensive on the Harvard study,'' says Martin J. Hatlie, executive director of the National Patient Safety Foundation at the A.M.A. ''We found ourselves in a position of denying it had any validity at all,'' a position that sounded hollow and disingenuous, even to those within the A.M.A.

But in 1995, when everything seemed to go wrong, the denials stopped. Not directly because of the errors, but because of the fallout from the errors. Things were bad enough for medicine when doctors were seen merely as arrogant and insensitive. Now a new popular caricature was taking shape, portraying doctors as inept, if not downright murderous. Hatlie marks the moment of change as an afternoon when the new, 104th Congress began a debate long at the center of the A.M.A.'s agenda - limits on the amount patients can be awarded in malpractice suits. Hatlie remembers losing all hope for that provision when Representative Greg Ganske, a Republican from Iowa who is also a doctor, tried to speak in favor of the bill and was interrupted by a fellow Republican, Ed Bryant of Tennessee. ''Last week a member of the gentleman's profession did some surgery down in Florida,'' Bryant drawled. ''I heard on the radio, he was supposed to cut off a person's foot.
He amputated it, and when that person woke up, they had cut off the wrong foot.'' Ganske, flustered, did not respond eloquently: ''It is inevitable that mistakes are going to be made.''

That was when insiders at the A.M.A. stopped quibbling over how many mistakes doctors make and decided to be seen as trying to do something about those mistakes. It is when they stopped arguing about the methodology of the Harvard study and instead turned for help to Dr. Lucian Leape, the report's primary author, who was a surgeon for 20 years and now studies medical errors. That two-year-old partnership has become the basis of some tangible, structured efforts. The National Patient Safety Foundation, for instance, was designed to root out and reduce error in the way that a similar organization, the Anesthesia Patient Safety Foundation, revolutionized that segment of the industry in the 1980's. Similarly, the U.S. Pharmacopeia Convention established the National Coordinating Council for Medication Error Reporting and Prevention to track medication errors. And the Institute for Healthcare Improvement began a project to reduce adverse drug events.

There are also less tangible but equally important results. Specifically, all these separate efforts add up to a growing recognition that the health of health care may lie in its ability to admit and to prevent its mistakes. ''In hockey,'' says Nancy W. Dickey, a Texas family practitioner and chairman of the A.M.A. board of trustees, ''you don't go where the puck is. You go where the puck is going to be. We're trying to go where the puck is going to be.''

Human factors experts are a patient group. They sit in their laboratories and huddle over their research papers, waiting until whole industries are ready - really ready - to hear what they have to say. Skeptics turn into believers, they know, during times of confusion and remorse.
The human factors field began in the 1940's, when psychologists and engineers came together to prevent assembly-line errors that were threatening the war effort. The 1970's brought another burst of interest, with the Three Mile Island nuclear accident and a series of plane crashes. In 1986 came the Challenger explosion. Now it seems to be medicine's turn, and human factors researchers have been expecting the call. Even before anyone asked for their help, they had spent a lot of time analyzing the industry. From where they sat, health care was the pinnacle of challenges, the most complex of industries, the ultimate test of systems theory. ''Health care is really interesting from our point of view because it straddles the entire span, the spectrum of accidents,'' says James Reason, a psychology professor at the University of Manchester and one of the first human factors researchers. In other industries, the relationships are between operators and equipment - pilots and their airplanes, nuclear plant personnel and their walls of confusing displays and dials. In medicine, the relationships involve operators, equipment and patients - patients who, by definition, are not in perfect working order, creating infinitely more ways for things to go wrong.

It was inevitable, human factors experts agree, that medicine would eventually shed its resistance to being seen as a system in which human beings were but one fallible component. That it took a string of tragedies to spur that realization is probably also inevitable. The first step, researchers are telling clinicians, is to accept that perfect human performance is not an attainable goal. People are not perfect outside a hospital. Half asleep, they spray deodorant on their heads and hair spray under their arms. Distracted, they write the wrong dates on checks. On autopilot, they leave their phone numbers when they mean to leave their fax numbers. They are equally imperfect inside a hospital.
They write the cumulative chemotherapy dose instead of the daily dose. They write ''left leg'' when they mean ''right leg.'' Admitting to imperfection is a first step for medicine, because many in the profession seem to believe that they can be perfect, says Robert L. Helmreich, a human factors expert at the University of Texas at Austin who spent years helping airlines teach pilots that they were fallible. ''They think they're bulletproof,'' Helmreich says. He cites a 1988 survey of pilots in which 42 percent agreed with the statement, ''Even when I am fatigued, I perform effectively.'' He was amazed by that until 1996, when he gave the same survey to surgeons, anesthesiologists and nurses and found that 60 percent agreed with the statement.

The central problem with the belief in perfection is that hospital systems are designed around it. They rely on concentration - that the nurse, for instance, will connect the nutrition bag to the nutrition line and not to the dialysis line. But things should be designed, human factors experts would argue, so that the connective port on the nutrition bag fits only into the connective port on the nutrition line. Systems based on perfection also deny the possibility of confusion - trusting that a nurse will always double-check whether she is dispensing the right drug and not a similarly packaged or similarly named one. And they depend on memory - on a resident remembering to write down the correct dose of Digoxin, rather than on a computer system devised so that it won't accept a prescription with an erroneous dose of Digoxin.

The possible remedies for this dependence are many, and they vary with the problems of each hospital, which leads to perhaps the most important message of human factors research. ''You cannot solve your problems,'' says David Woods, a professor of cognitive systems engineering at Ohio State University, ''until you know what they are.
And you will not know what they are unless you create an environment where people feel free to tell you.'' Every industry that has substantially reduced error, experts say, has created a blame-free environment for reporting errors. And that includes not only actual errors but also near misses, which have traditionally been seen as evidence of the strength of the system, but are more likely examples of errors waiting to happen. For instance, an airline pilot carries a form as part of the Aviation Safety Reporting System. When he makes or sees a major foul-up, Helmreich says, he fills out that form - including his name - and sends it to NASA. The agency has a week to contact the pilot for amplification or clarification. Then the information is put in a computer, but without the pilot's name.

While the thorough and honest reporting of error is a central message of the experts, it is the trickiest for medicine to hear. Some hospitals are trying: James Conway at Dana-Farber, for one, has sent thank-you notes to staff members who report errors. But this zealous honesty does not come easily to a profession trained to understand that anything written down is discoverable evidence in a malpractice suit; it doesn't help that many states have chosen to confront error by publishing lists of doctors who have been disciplined or sued.

In fact, no one is more interested in these latest changes by doctors than malpractice lawyers. ''On the surface, it does sound intriguing,'' says Judith A. Livingston, a personal injury lawyer at Kramer, Dillof, Tessel, Duffy & Moore in Manhattan. ''But of course I'm skeptical.'' She does not see health care workers the same way human factors researchers do, and worries that this is merely a way for doctors to avoid blame. Yes, she says, the system often fails, but that does not mean that the individuals in that system are not responsible for their contributory actions. ''You can't just say, 'Everyone makes mistakes,' '' she says.
''If a reporter makes a mistake in a magazine article, you can run a correction,'' but when health care workers make a mistake, ''someone dies. The gravity is so much greater. The responsibility should be greater, too.''

It is too early to tell, malpractice lawyers say, what the effects of this systems approach will be on medical malpractice suits. If the new thinking does in fact reduce errors, it follows that it will reduce lawsuits. But the methods used to root out error - admitting it - have the side effect of providing evidence of error, evidence that plaintiffs' lawyers are eager to see. In most states the results of internal surveys would be protected by laws of privilege. But Thomas Demetrio of the Chicago personal injury firm Corboy & Demetrio believes public pressure might cause that to change. If hospitals begin legitimate investigations of their error rate, ''the consumer is entitled to know the findings,'' he says. ''If the numbers are there, and they are solid, they'll come out.''

To hospitals, anything that intrigues malpractice lawyers is unsettling, which is the major stumbling block for those who would like to see a human factors takeover of health care. When the joint commission created the category of accreditation watch, for instance, the agency saw it as a nonpunitive way to monitor hospital error. The hospitals saw it as something else entirely. ''It was supposed to be a neutral, nonjudgmental designation,'' says O'Leary, of the joint commission. ''We thought we were saying, 'You tell us about your sentinel events, and we'll work together on the solutions.' '' Turnbull, of Hermann Hospital, responds: ''They say, 'Report, report, report,' and then when you report, they punish you.
They give the information to the newspapers.'' O'Leary admits that ''the policy is a work in progress - I would not be surprised to see us come back with a further iteration this fall.''

Ben Kolb was scared when he arrived at Martin Memorial Hospital, in Stuart, Fla., in December 1995. This was to be the third ear operation for the 7-year-old; his doctor wanted to remove scar tissue left from the prior surgeries, at ages 2 and 5. So his mother, Tammy, spent the time before surgery joking with her son, talking about soccer (he was the captain of his team) and Christmas (when he would be singing in the yearly pageant at school). By the time an orderly came to take the boy into surgery, he was calm. ''Give your mom a kiss,'' the nurse said, and he did. ''Have fun,'' his mother said, waving as he left.

Ben was given general anesthesia, and about 20 minutes later it took full effect. His surgeon was handed what everyone thought was a syringe of lidocaine, a local anesthetic, which reduces bleeding. He injected it inside and behind Ben's ear. Moments later, for no apparent reason, Ben's heart rate and blood pressure increased alarmingly. Dr. George McLain, an anesthesiologist on standby for emergencies, was summoned. McLain helped to stabilize the child, but a short time later Ben's heart rate and blood pressure dropped precipitously. For an hour and 40 minutes, frantic doctors performed CPR on the boy, knowing it was futile.

More than a year later, the memories are fresh, and McLain sits at lunch, crying as he speaks. The other diners stare, but he makes no attempt to hide the tears. How long would he have kept up the CPR? ''If it was my kid, I would want them to keep trying,'' he says. ''I think we were never going to stop.'' Ben's heart did begin to beat again, and he was transferred to Martin Memorial's intensive-care unit. The surgeon, who had known Ben since he was a baby, went with McLain to talk to Tammy Kolb.
''There has been a serious problem with your son,'' McLain remembers telling the woman. ''His heart stopped. We had to restart his heart. He is extremely critical, in a coma-like state.'' He winces at the memory: ''You don't know how strong to be to get your point across. You want her to understand, but you can't stick a knife in her.'' At first, Tammy Kolb did not seem to understand. ''I know he's going to wake up just fine,'' she said. ''I don't -,'' McLain said. ''I've seen this on TV. As soon as he wakes up I have a Christmas present for him. I brought it for him early.''

Ben remained in a coma for nearly 24 hours. His parents and older sister remained at his bedside as their fog of denial slowly lifted. The next day they agreed that his ventilator should be removed, and he was declared brain dead.

As with the death of Jose Martinez, a lot can be learned from what happened after Ben Kolb died. First, the hospital's risk manager, Doni Haas, had all the syringes and vials used on Ben locked away, then sent to an independent laboratory for analysis. Second, Haas promised Ben's parents that she ''was going to find them an answer, if there was one.'' There was. Tests showed that there had been a mixup, a mistake, a human error in a system that made that error more likely. Ben Kolb, lab reports showed, was never injected with lidocaine at all. The syringe that was supposed to contain lidocaine actually contained adrenaline, in a highly concentrated form intended only for external use.

Procedure in the Martin Memorial operating room at the time was for topical adrenaline to be poured into one cup, made of plastic, and lidocaine to be poured into a cup nearby, made of metal. The lidocaine syringe was then filled by placing it in the metal cup. It is a procedure used all over the country, a way of getting a drug from container to operating table.
According to Richmond Harman, the hospital's C.E.O., ''It has probably been done 100,000 times in our facility without error.'' But it is a flawed procedure, the hospital learned. It allows for the possibility that the solution can be poured into or drawn out of the wrong cup. Instead, a cap, called a spike, could be put on the vial of lidocaine, allowing the drug to be drawn directly out of the labeled bottle and into a labeled syringe. The elimination of one step eliminates one opportunity for the human factor to get in the way.

Haas received the lab results three weeks after Ben died. The family had hired an attorney by then, and Haas and McLain drove two hours to meet with the Kolbs at Krupnick, Campbell, Malone, Roselli, Buster, Slama & Hancock. ''It was very unusual,'' says Richard J. Roselli, one of Florida's most successful malpractice lawyers and the president of the Academy of Florida Trial Lawyers. ''This is the first occasion where I ever had a hospital step forward, admitting their responsibility and seeking to do everything they can to help the family.''

A financial settlement was reached by nightfall, but neither side will confirm the amount paid to the Kolbs. After the papers were signed, the family asked for a chance to talk with the doctors at the hospital. The first thing Ben's father, Tim, did when he entered the emotion-filled room was to hug his son's surgeon. Then came the torrent of questions, questions that had kept the Kolbs awake at night, questions they might never have been able to ask had the case spent years in court. Was Ben scared when his heart rate started dropping? Was he in pain? How much did he suffer? The doctors explained what the Kolbs did not know, that Ben had been put under general anesthesia long before anything went wrong. ''The decisions I made for him were the same I would have made if it were my child,'' McLain said. Just before the family left, they asked if it would be O.K.
for them to continue to use Martin Memorial for their medical care. ''Of course,'' Haas said, grateful and amazed. Would the hospital promise to spread the word about how Ben died, so that the procedure in question could be changed in other places? Haas promised.

With that, the Kolb case was closed, but it wasn't over. Tim Kolb still coaches his son's soccer team. The family still grieves. The doctors in the operating room that day still have nightmares of their own. ''I let that child's life slip through my fingers,'' McLain says. ''They tell me there was nothing I could do. I know there was nothing I could do. But it's like I was a lifeguard and he died on my watch. There must have been something.''

And the lawyers at Krupnick, Campbell are still searching, too. ''We're not done with this - yet,'' Roselli says. Why, he asks, was it possible to mix up the lidocaine and the adrenaline? Did the two bottles look alike? ''We're still investigating the product liability aspect of it,'' he says. ''The questions of packaging and labeling.''

JOANNE TURNBULL did not fully realize she was part of a sea change until she was in a Palm Springs auditorium last October - two months after the death of Jose Martinez, nearly two years after the death of Betsy Lehman, one day after what would have been Ben Kolb's eighth birthday - and marveled at what was going on around her. At the front of the room were representatives from Martin Memorial Hospital. Keeping their promise to the Kolb family, the group spent nearly two hours retelling the story of how and why the boy died. In the audience, fighting tears as they took notes, were more than 300 doctors, nurses, pharmacists and administrators, each of whom had been given a smiling photograph of the child at the three-day conference on ''Examining Errors in Health Care.''

''Five years ago if we'd held this conference, very few people would have come,'' Leape, of Harvard, said at the opening session.
But the list of sponsors included not only the A.M.A. and the joint commission, but also four medical insurance organizations, three pharmaceutical groups and the American Hospital Association. ''This is a miracle that this is happening,'' Turnbull remembers thinking as the session concluded. ''Everyone's telling the truth.''

In the months since Jose Martinez died, Turnbull has become an expert herself on the world of human factors research. She has done her root-cause analysis for the joint commission and learned a lot more about what went wrong that night: that the hospital's pharmacy was short one technician because someone called in sick and that policies there require that the phone be answered in four rings and visitors greeted within five seconds of their arrival; that the nurse who questioned the order was trained in a country where women rarely confront men and nurses rarely confront doctors; that the first resident was distracted by personal problems.

Turnbull has made changes, too, ones she hopes will be strong enough to make such human imperfections matter less. Hermann Hospital's computer now flags questionable orders for the most dangerous drugs, including Digoxin. The hospital is looking into a paging system that alerts a caller when the person being paged has his beeper turned off. Double copies of a prescription are no longer sent to the pharmacy unless it is a prescription that must be filled within 15 minutes. Administrators even asked the joint commission to schedule its accreditation review for April, not August, to ease the stress during the hospital's busiest month. Two research experiments are being planned to increase error reporting.

As a result of Turnbull's analysis, and the accompanying changes, Hermann Hospital is no longer under ''accreditation watch.'' On Dec. 27, 1996, it was given full accreditation, with commendation, the highest designation given by the joint commission.
That brings satisfaction, she says, but mostly she feels sadness for the little boy and his family and concern for the staff members who are also still burdened by the event. She is wary, too, because everything she has learned tells her that no institution can ever be certain that something like this will not happen again. And she feels a sense of responsibility, an understanding that this moment and movement might well be seen as a crossroads for medicine. ''We've become part of something,'' she says. ''We want to make sure it is something that's done right.''

To Err Is Human, Even in Medicine

The most unsettling lesson of human factors research is this: Despite all best efforts, mistakes happen. Careful planning can reduce their number, and mitigate the consequences, but it cannot completely prevent mistakes. The aviation industry has learned that in the decade since it began to confront human error head-on. Crew training was overhauled, equipment modified and procedures changed. The number of crashes attributed to human error (as distinct from insufficient maintenance and aging airplanes) decreased significantly. So impressive were the changes, says John Nance, an aviation analyst, that the industry was feeling pleased with itself. Then, in December 1995, an airplane crashed outside Cali, Colombia - the result of human error. Even with all the changes, tragedies continue.

Now medicine is starting to grapple with the same lesson. One morning this winter, I was observing a meeting of the pediatric oncology staff at the Dana-Farber Cancer Institute in Boston. Suddenly the session was interrupted with the news that a 6-year-old patient had erroneously been given a double dose of his chemotherapy medication. Perhaps no institution has been more aggressive in error prevention than Dana-Farber. And yet somehow a pharmacist had misread a confusing recipe card for Cytarabine, a drug used to fight leukemia.
The error was caught because the pharmacist was given the same prescription for a second patient a few hours later. When he mixed that second prescription, a mental alarm bell rang, and he realized he had made a mistake earlier in the day. By the time word reached this meeting, the medicine had already been administered.

The first thing the staff did was look for someone to blame, taking momentary comfort in the fact that the overdose was not actually given at Dana-Farber: the little boy had visited the Dana-Farber clinic that morning but had been sent across the street, to Boston Children's Hospital, which is affiliated with Dana-Farber and has made most of the same procedural changes. ''It's not our problem - it's Children's problem,'' one pediatric oncology fellow volunteered. ''It's our patient, it's our service, it's our problem,'' his attending, Dr. Amy Billett, said sharply.

All energy then turned to a search of the available literature to see what effects a double dose of Cytarabine would have on the child. Five doctors crammed into a closet-size office, reading textbooks and journal articles aloud. Eventually Billett found a 1979 article revealing that early patients were given far larger doses of the drug with no evidence of toxicity. Relieved, the doctors decided no remedial action was called for.

Then, armed with reassuring knowledge, they told the family. ''A mistake was made,'' Billett remembers saying. ''Your son received too much medication.'' She then assured the parents that their son was well, and was likely to remain well, although he would be closely monitored. ''We were able to offer them reassurances,'' Billett says. ''They were not particularly concerned.'' An internal investigation was begun immediately. The preparation card was changed so that it was less confusing. For Billett, for everyone, it was a sobering lesson.
The staff learned that all the recent changes, all the heightened awareness, while necessary and effective, were far from foolproof. ''You can never design a truly perfect system,'' Billett says. ''There are no guarantees.''

Correction: The cover of The Times Magazine today and an article on page 28 about medical mistakes render a prescription figure incorrectly. The order for digoxin given for Jose Eric Martinez, a 2-month-old who died in Hermann Hospital in Houston, was written as .90 milligram (not 0.9 milligram, a different way of writing the same amount). The dosage the child was supposed to receive, under the original doctor's orders, was .09 milligram. The 0.9 rendering was supplied by the Houston hospital; the error was discovered after the article had gone to press.

Lisa Belkin, a contributing writer for The New York Times Magazine, wrote about breast cancer fund-raising in December.