Original Article

Production Pressure And A Culture Of Deviance In The Insular Operating Room And The Consequences Of Their “Normalization”: Have We Reached A Hooper Moment?

K Kirsner, C Biddle

Citation

K Kirsner, C Biddle. Production Pressure And A Culture Of Deviance In The Insular Operating Room And The Consequences Of Their “Normalization”: Have We Reached A Hooper Moment?. The Internet Journal of Law, Healthcare and Ethics. 2012 Volume 8 Number 1.

Abstract

The consensus view of enlightened healthcare providers is that medical error is a matter of national public health importance. While not the first publication to draw attention to the concern, the Institute of Medicine’s landmark report To Err is Human (1999) certainly created what amounted to an epiphany regarding the widespread patient safety concerns that permeate healthcare delivery. As a result of this and subsequent work, we realize that errors and misadventures in patient care are more often the result of system design and process failures than of inept or misguided providers.

Introduction

The consensus view of enlightened healthcare providers is that medical error is a matter of national public health importance. While not the first publication to draw attention to the concern, the Institute of Medicine’s landmark report To Err is Human (1999) certainly created what amounted to an epiphany regarding the widespread patient safety concerns that permeate healthcare delivery. 1 As a result of this and subsequent work, we realize that errors and misadventures in patient care are more often the result of system design and process failures than of inept or misguided providers.

The environment of the operating room is a complex organization involving a patient, multiple staff, technology and drugs, all focused on the completion of a procedure. Most errors that occur arise from human and environmental factors. In his classic portrayal of error, Reason delineated human error as “person-based” or “systems-based,” neither of which accounts for all errors. He further argued that the prevailing thinking in healthcare was that error was “person-based,” a cognitive approach to error that is inherently flawed because even well-intentioned, highly trained and thoughtful individuals can be predisposed to making errors by the very system within which they work. 2

In our reading of To Err is Human, we were left with the strong message that at the core of any successful system-based approach to optimizing patient safety are 1) the need to design and nurture a strong patient-safety culture, 2) appropriate multidisciplinary leadership in attending to errors, and 3) the simplification and standardization of as many complex processes as possible (e.g., medication delivery, blood transfusion, cleaning a hospital room that housed a patient with an infectious disease). Although discussed, what was not emphasized strongly enough, in our opinion, was the degree of production pressure that many providers experience in their work environment, particularly in the operating room. Likewise, a recent review of 13 patient safety culture measurement tools available to researchers and clinicians revealed that most do not even assess the domain of “production pressure” or “work pressure.” 3 In our view this oversight represents an obvious failure.

We embrace the definition of production pressure used by the Institute of Medicine, that is, the overt or covert pressures and incentives on personnel to place production rather than safety as the primary priority. 4 The domain of anesthesia care services has been held up as an iconic leader in the patient safety movement. As such, we believe a deconstruction of this environment (one that ostensibly has undergone a transformation) represents a valuable opportunity to illustrate the erosive potential of production pressure. This is especially important because, in our opinion, an open dialogue focusing upon patient safety and productivity in this domain does not occur frequently enough. Because patient safety and productivity are characterized by conflicting goals, serious consequences can (and do) result. Too often the adage of “doing more, better, faster and cheaper” prevails.

Links between organizational factors and patient safety: the “culture of safety”

A culture of safety is defined by the Centers for Disease Control and Prevention as “the shared commitment of management and employees to ensure the safety of the work environment.” Implicit in this is the acknowledgement by those in an organization, whether it is a nuclear-powered U.S. Navy submarine, a mass public transit system such as an airline, an automobile production plant, or a health care institution, that error is inevitable and that an aggressive, forward-thinking approach designed to identify and correct “pathogens” in the system is absolutely vital. Such an approach demands that those at all levels of the system (from leadership down to “worker bee”) be willing to actively participate. It exemplifies the notion of a chain being only as strong as its weakest link.

As noted in many of the Institute of Medicine’s reports, those organizations with strong cultures of safety encourage the reporting of errors and mistakes, acknowledge the imperfection of human performance, and recognize the role that process or systems failures play in the genesis of errors. The delivery of anesthesia services within the confines of a surgical operating room is one that ideally exemplifies what is known as a “high reliability organization” (HRO), defined as a complex and dynamic entity that has a commitment to safety as the dominant organizational priority, and one that has a focused awareness of the risks of failure and aggressively seeks methods to prevent failure. Here we see complex, high risk decision making by humans resulting in relatively error free, high quality outcomes over long time epochs.

There are many practices in the operating room, however, that call into question whether it truly has the culture appropriate for an HRO. In fact, the typical operating room environment could fairly be described as insular for at least two reasons. First, the operating room is a special place with poor accessibility even for other health care workers in the hospital; a nurse or physician from another area of the hospital might need to change clothes and obtain special permission in order to enter the operating room and observe. Second, many tasks there are performed by a single person without observation by any other health care worker. Operating rooms may also have special rules for many tasks that differ from how those tasks are performed in other areas of the hospital, and in some cases rules that apply elsewhere in the hospital are simply ignored in the operating room. Additionally, the tightly knit team culture that characterizes the modern surgical suite may be prone to protective behavior toward fellow teammates, leading to a culture of silence. Perhaps because of this insular nature, operating room practices have continued over the years with little questioning, even when strong arguments can be made against them.

These practices, no matter how common, may be wrong in the light of current evidence. We believe that some of these common practices are indefensible in any scientific, professional, or ethical review of appropriate health care. We also believe that they have brought healthcare organizations to a point at which serious legal liability for common practices may be on the horizon. These practices have become normalized and are common in the operating room, which would usually mean that they constitute the standard of care. However, there are times in other industries when courts have held that even universal practices can be deviations from the standard of care, and in some cases we believe that common practices in operating rooms may have reached that point. This point was eloquently stated in a classic and oft-quoted opinion holding that tug boat companies could be held liable for failing to have radios on board even though that was not the industry standard at the time. Judge Learned Hand wrote: 5

“Indeed in most cases reasonable prudence is in fact common prudence; but strictly it is never its measure; a whole calling may have unduly lagged in the adoption of new and available devices. It never may set its own tests, however persuasive be its usages. Courts must in the end say what is required; there are precautions so imperative that even their universal disregard will not excuse their omission.”

Likewise, the fact that practices are commonly performed in a particular manner in the operating room will not prevent them from being found substandard in a courtroom. Texting while driving, although not against the law in every state, can still result in a negligence verdict. Texting while operating trains and boats, even though the law may not specifically preclude it, has resulted in criminal convictions. 6 We believe that practices in the operating room must be critically examined; many common systems and practices may be exceeding that Hooper threshold.

By its very nature, an HRO contains embedded and intrinsic hazards that must be considered and managed a priori. Many workers, especially those in health care in general and anesthesia delivery in particular, are educated, trained and otherwise inculcated to believe the specious notion of ‘professional infallibility,’ thus focusing on the errors made by workers rather than on flaws in the organization or the system. While it is clear that some errors are the result of human carelessness or even misguided behavior, many can be averted, or their consequences lessened, with appropriately designed systems. In anesthesia care delivery it is generally believed that human error is implicated in perhaps as many as 80% of mishaps. 7

A variety of theories and models have emerged that have accelerated our understanding of the HRO and of what lessons can be learned from it. 8

As will be discussed in more detail below, the “enemy” of the culture of safety is normalized deviance, in which gradual and accepted erosions of safe operating procedures occur until they become routine attributes of the work environment and culture. In industrial enterprises, speed, efficiency and output are targeted goals, and a great deal of effort and resources go into achieving these ends. However, whether it involves the manufacture of cars, the air transport of people, generating electricity at a nuclear power plant, or providing anesthesia care, an imbalance between production efficiency and safety can easily occur, especially when some degree of normalized deviance is present. These changes may permeate an organization or even an entire profession. This deviance may obscure the effects of such practices, so that an entire profession fails to recognize that it has become negligent until its practices are tested in court. In a very different type of organization, railroading at the turn of the 20th century, the United States Supreme Court held that a jury could find a railroad company’s practice negligent even though it was clearly the usual practice in that industry. Justice Oliver Wendell Holmes, writing for the Court, said: 9

“What usually is done may be evidence of what ought to be done, but what ought to be done is fixed by a standard of reasonable prudence, whether it usually is complied with or not.”

In the domain of perioperative patient care, production pressure often threatens the culture of safety. Consider the following: financial incentives to reduce turnover time between cases, yielding to the demands of a surgeon to proceed with a scheduled case even when circumstances are not in the patient’s best interest, working to execute a task in an artificially abbreviated time frame, reusing drugs and apparatus not intended for reuse, and any behavior that induces haste or circumvents established safe procedures or protocols. Humans working under organizationally sanctioned production pressure are more likely to commit unintentional errors of both judgment and performance.

Figure 1: Reason’s Model of Organizational Failure summarized

The Theory of Organizational Accidents provides a foundation for studying the adverse consequences of production pressure in the workplace. (1) This model is useful in revealing how human and organizational failures conspire to produce negative outcomes; fundamentally, misadventures and errors are not simply the result of the individual but arise from the interaction of workers and the organization in which they work. (See Figure 1) The environment of health care, with the operating room as an iconic example, is the focus of intense administrative efforts to increase efficiency and reduce cost while still ensuring safe care. Production pressure, especially if excessive, can challenge even the most skillful and experienced provider in balancing these objectives. 10

Figure 2: The so-called Swiss Cheese Model of Reason demonstrates how defense barriers and safeguards can be penetrated by an accident trajectory, leading to a negative outcome. (5) Production pressure applied at any “level” of the model can amplify the risk and rate of error.

Production pressure is like the proverbial 800-pound gorilla in the room; we know it is there, but we ignore or passively accept its presence. Many providers feel a sense of overconfidence and underestimate the potential hazards of production pressure because things almost always go well. However, rushing through or abbreviating the preoperative anesthesia evaluation, failing to fully check the anesthesia machine, inadequately securing backup equipment, handing off the patient to the PACU crew too hastily, and otherwise rushing through care to increase productivity and efficiency are classic examples of personnel and organizational factors that combine to create gaps in the culture of safety.
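The point of Figure 2, that pressure applied across every defensive layer widens all of the “holes” at once, can be made concrete with a back-of-the-envelope calculation. The sketch below is our own illustration rather than material from the original article: the layer names echo the examples above, the failure probabilities are assumed values, and the layers are treated as statistically independent for simplicity.

```python
# A minimal sketch (illustrative assumptions only) of the Swiss Cheese idea:
# an accident occurs only when every defensive layer fails at once, so the
# overall accident probability is the product of the per-layer failure
# probabilities (layers assumed independent for simplicity).

def accident_probability(layer_failure_probs):
    """Probability that a hazard penetrates every defensive layer."""
    p = 1.0
    for p_fail in layer_failure_probs:
        p *= p_fail
    return p

# Hypothetical layers drawn from the examples in the text, with assumed
# per-case probabilities of missing a hazard (not real data).
baseline = {
    "preoperative evaluation": 0.01,
    "anesthesia machine checkout": 0.01,
    "backup equipment check": 0.02,
    "handoff to the PACU": 0.05,
}

# Production pressure "widens the holes": suppose rushing doubles each
# layer's chance of missing the hazard.
rushed = {layer: min(1.0, 2 * p) for layer, p in baseline.items()}

p_base = accident_probability(baseline.values())
p_rush = accident_probability(rushed.values())

print(f"Baseline accident probability:    {p_base:.2e}")
print(f"Accident probability when rushed: {p_rush:.2e}")
print(f"Relative increase under pressure: {p_rush / p_base:.0f}x")
# Doubling the failure rate of four layers multiplies the chance that a
# hazard slips through all of them by 2**4 = 16, even though each layer,
# taken in isolation, still "almost always" works.
```

The arithmetic illustrates why overconfidence is so seductive: each individual shortcut still succeeds nearly every time, yet modest, simultaneous erosions across the layers multiply the overall risk many-fold.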

Production Pressure I & II: A case of wanting to “please” the surgeon and a case of trying to make up time by skipping essential procedures

Production pressure is a well-known but poorly analyzed factor in the operating room. It has led to numerous incidents, many resulting in injury and death. In a landmark study, 49% of anesthesiologists reported that they had observed, or themselves felt, pressure to conduct anesthesia in a fashion that they considered unsafe given the level of urgency of the situation. 11 Avoiding delays in surgery, getting along with the surgeons, and avoiding litigation were among the most intense production pressures the anesthesiologists felt, although other “pressures” were experienced as well. Two real-world examples of production pressure (both from closed-claims material to which the authors have access) follow:

Case 1: An anesthesia provider was relatively new to a facility. She had requested that she be scheduled to do bigger cases. She was scheduled to administer anesthesia to a patient for a surgeon who had been described to her as difficult and aggressive.

The patient was a fifty-one year old internist scheduled for a colectomy for inflammatory bowel disease. It was the first case of a very busy day for the surgeon. The surgeon requested that a central line be placed during the case for postoperative intravenous nutritional support; noting that he had a very busy surgical day and an office full of patients awaiting his arrival, he stated that he did not have the time to place one himself. He expressed a desire for a subclavian line placement.

The anesthesia provider had recently been granted privileges to place central lines. During her training she had placed four central lines, all using the jugular approach and all under close supervision by her clinical instructors. She had never placed a subclavian line and noted afterward that she felt very uncomfortable and awkward with both the anatomy and the technique. During placement of the subclavian line, air was entrained and the patient suffered an immediate cardiac arrest. Resuscitation was promptly instituted and continued for nearly two hours but was not successful. The patient was pronounced dead in the operating room.

In retrospect it is easy to see the confluence of events that led to catastrophe. The desire to prevent delay, the wish to get along with (and not irritate) the surgeon, the failure to request assistance in the face of an unfamiliar procedure, and provider inexperience and obsequiousness all played critical roles in this case.

Case 2: The anesthesia provider was engaging in a relaxing conversation with a patient as part of the induction of general anesthesia for a laparoscopic abdominal procedure. Following a standard dose of hypnotic (propofol) and muscle relaxant (rocuronium), the patient lost consciousness and his breathing ceased, necessitating “bag-mask” ventilation using the anesthesia apparatus. To the surprise (shock) of the provider, she was completely unable to pressurize the circuit and deliver a breath to the patient; the problem was due neither to the patient nor to the mask fit. The equipment had failed, and a quick check of the breathing circuit did not reveal the cause.

On her way to work that morning, contemplating that she was already behind because of some early morning domestic issues, the anesthesia provider arrived feeling stressed. Seeing her busy caseload, and reflecting on a recent operating room staff meeting at which she was reminded that “time was money,” she abbreviated her preparation for the day’s cases by performing an inadequate anesthesia apparatus checkout, which in this case included little beyond making sure that a breathing circuit was present and that the suction device was operative. In failing to adequately prepare, and falling victim to perceptions of production pressure, she failed to ensure the integrity of the apparatus, thus placing the patient, now unconscious and unable to breathe for himself, in great peril. A routine checkout would have revealed that the machine was neither connected to the hospital’s pipeline source of oxygen nor fitted with backup oxygen tanks; all had been removed the day before during scheduled routine equipment maintenance.

Production Pressure III—Failure to follow the standard of care for all the wrong reasons

Case(s) 3: The recent hepatitis outbreak in a Las Vegas ambulatory care facility resulted in the documented transmission of hepatitis C to at least 7 patients. Many more were exposed to the disease as well as placed at potential risk for any number of other viral and bacterial diseases. The exposures are alleged to be the result of a particularly nefarious type of production pressure: the pressure to intentionally cut corners in order to save money. Two nurse anesthetists and the physician owner of the clinics were indicted for intentionally administering propofol from single-use vials to more than one patient, reusing syringes, needles and bite blocks, and falsifying patient records. Procedures were rushed at the expense of patient safety, and an unreasonable number of patients were scheduled and treated each day. Furthermore, insurance fraud was allegedly committed by falsely listing anesthesia times.

Administering drugs from vials labeled for single-patient use to more than one patient and using single-use syringes on more than one patient are practices that are clearly unacceptable, even to non-professional observers. Yet in the desire to maximize the number of patients seen in a day and minimize expenditure, these practices became normalized. The urge to increase productivity, create workarounds, and save money likely continues in many practices, albeit in less dramatic fashion. Indeed, the use of drugs from single-patient-use vials on more than one patient is a persistent problem in anesthesia despite efforts of the American Association of Nurse Anesthetists, the American Society of Anesthesiologists, and the Centers for Disease Control and Prevention to eliminate it. 12

Production pressure and provider attitudes

Anesthesia provider attitudes may be antithetical to modern industrial error prevention strategies. One example is the hierarchical nature of the operating room illustrated in Case 1 above. Aviation has flattened the power structure that was an important factor in the deadliest plane crash in history, the 1977 Tenerife collision of two 747s, one a KLM Royal Dutch Airlines flight and one a Pan American World Airways flight. The pilot of the KLM flight was the airline’s chief of training for its 747 pilots. Analysis of the crash revealed that, while the causes were multifactorial, the KLM aircraft unquestionably took off without clearance to do so while the Pan Am aircraft was still on the runway, resulting in the deaths of 583 passengers and crew.

The analysis revealed that the crew may have been unwilling to challenge such a senior pilot when he chose to take off without clearance. In response, aviation moved to procedures and training called crew resource management (CRM), focusing on interpersonal communication, leadership and decision making in the cockpit. The surgical “time out” and the World Health Organization surgical checklist are similar moves in healthcare. Any member of the crew, no matter how junior, is given the authority to question the most senior crew member and stop the process before critical events such as takeoff occur.

In studies of anesthesia providers’ attitudes, little has changed. In a recent study, 75% of providers agreed with the statement, “The senior person, if available, should take over and make all decisions in life-threatening emergencies.” The statement, “Junior operating theatre team members should not question the decisions made by senior personnel,” was agreed to by 89% of respondents. 13 Contrast this with the characteristics of the HRO described above.

Clearly two things must occur: 1) CRM needs to be integrated into the education and training of anesthesia providers as well as into the culture of the operating room, and 2) individuals within the culture of the operating room, no matter what their hierarchical position, must be empowered to speak up or take action. Our insular world must be penetrated by well-known industrial engineering factors and HRO principles and opened up to critiques of attitudes and behaviors that we have taken for granted for years.

Is there a production pressure irony to the rapid response team?

The rapid response team (RRT) mobilizes promptly when a patient’s condition suddenly deteriorates. The purpose of the RRT is to bring a multidisciplinary team with critical care expertise to the patient before things worsen, marshaling ICU-level care, if needed, to the patient who is outside the ICU. The effectiveness of RRTs nationwide is currently under debate and is not the focus of this discussion. What we would like to consider is the possibility that the RRT is a symptom of problems resulting from widespread production pressure.

It is logical to argue that prevention, recognition and early intervention in a patient who is deteriorating are good things. Some patients, as a consequence of their illness or even of the care provided to them, will experience a deteriorating event, and preventing deterioration is obviously a health system goal. In deconstructing the purpose of the RRT, consider the two reasons why a patient might deteriorate. Certainly some patients deteriorate no matter what level of care is provided; these patients would likely benefit from the RRT, where an organized system (team) is in place to intervene rapidly. The other reason a patient’s condition deteriorates is that an inadequate level of care is provided relative to the severity of the condition. This occurs when there is an imbalance in the matching of patient severity and personnel readiness (e.g., staffing shortage, inadequate knowledge, insufficient training or expertise, inadequate resources). This is why thoracic surgery is not performed in the pharmacy, and why we do not offer mechanical ventilation or intravenous vasoactive drug therapy in the outpatient radiology unit.

These absurd examples aside, one can argue that the fundamental philosophy guiding arguments in favor of the RRT is that if the current level of care is inadequate, the RRT will supply what is missing. Inadequate care suggests that a patient has been admitted to a hospital unit that is unable to meet his or her unique set of needs. Likewise, we suggest that if the current level of care is sufficient, mobilization of an RRT is not likely to be of benefit. Because we cannot admit all patients to the ICU, a scarce hospital resource (current estimates place a capital cost of more than $1 million on each ICU bed), some patients will inevitably be admitted to a unit where there is a disconnect between resources (level of monitoring, staffing, equipment) and need (patient acuity). 14

Research clearly reveals that hospitals are episodically overcrowded but generally have an average occupancy of about 66%. 15 However, peak time-based bed requirements and “high-stress” days are largely predictable. Patients admitted on high-peak or high-stress days may not be optimally matched to a unit. In such circumstances the discord between resources, care allocation and patient needs is heightened, with a resultant measurable impact on RRT utilization. 16 The irony is that the value of the RRT (in terms of “saves”) may be a consequence of production pressure caused by logistical mistakes made in patient triage, staff assignment and the allocation of other institutional resources.

How we have come to normalize deviance: The insular nature of the operating room

Somehow in anesthesia (and even in our personal lives) we have too often come to normalize deviant practices. (Table 2) This is due, in part, to the insular nature of the operating room. In other areas of the hospital many health care providers observe one another’s practices; indeed, patients and their families witness those practices. A failure to follow sterile procedure, a failure to observe proper hand hygiene, or the administration of a contraindicated drug may be seen by all and may result in discipline of the health care provider who fails to follow established protocols.

Table 1: Levels of organizational analysis in studying error and safety efficiency

Table 2: A few examples of how we normalize otherwise “deviant” behaviors in professional and personal life in part, due to “production pressure”

Likewise, if a given procedure, for example central line insertion, has an established protocol in a given ICU (removal of rings and watches, hand scrub, full gowning and gloving), it is unlikely that other areas of the hospital could fail to follow the same procedure; health care providers from the area with the higher standard would likely observe the other areas, mandating that one standard be followed by all. But few people other than those who actually work in the operating room see what goes on there. In particular, anesthesia care is often physically cordoned off from the rest of the operating room by an individualized work station, drapes tethered to IV poles, and the presence of a “sole provider” in the room tending to the patient.

Practices that are long standing, behaviors learned during training, and actions repeated as a matter of routine are hard to break. The absence of sinks available to anesthesia providers inside the operating room contributes to poor hand hygiene, as does the extreme task density that occurs episodically, and often unpredictably, with changes in a patient’s condition. The result is an extremely poor overall rate of quality hand cleansing, which is known to be associated with anesthesia provider-related nosocomial infection. 17 The recent advent of foam and liquid hand sanitizers has improved the situation somewhat, but poor hand hygiene prevails in our specialty. This has been the case for generations and was not aggressively questioned until recently.

Other common-sense practices, like full scrubbing, gowning and draping for regional anesthesia procedures, particularly central neuraxial blocks, too often lag in adoption because “we have always done it a certain way,” despite excellent studies showing marked decreases in central line infections with the use of full aseptic technique. It was not until 2007 that the Healthcare Infection Control Practices Advisory Committee recommended for the first time that surgical masks be worn during spinal procedures. 18 What possible reasons could there be for not scrubbing, gowning and draping for a central neuraxial block when the possibility of bacterial, viral or fungal inoculation of the central nervous system, as rare as it might be, is so catastrophic? Speed, production pressure, inertia, a sense of inconvenience, and an absence of appreciation for the growing evidence too often rule the day. When production pressure is normalized, when safeguards are marginalized in the quest to get things done faster or cheaper though at greater risk, and when HRO tenets are softened, safety is likely to erode, increasing the rate of near misses and overt accidents and misadventures. (Figure 3)

What can (should) be done?

Figure 3: Where might PRODUCTION PRESSURE erode safety?

There is little question that production pressure is a reality of the modern practice environment for anesthesia providers. Theories of unsafe practice abound and include a wide range of internal and external factors: task density, economics, human fallibility, systems design, stress, personality attributes, organizational constraints, and organizational culture, to mention just a few. 19 Lessons learned in other HROs have application and benefit if aggressively applied to the anesthesia care and surgical care domain. (Figure 4)

Figure 4: High reliability organizations (HROs) have people in them, prone to make mistakes

Certainly there are tools that force a slowdown in pace and mandate that essential steps not be overlooked. Examples include the surgical “time-out,” which ensures the opportunity to secure essential items and information before engaging in dangerous care procedures, and requirements that certain checkout procedures occur. Such activity is routine in many HROs, such as commercial aviation’s non-violable rules requiring pilot-performed equipment checks prior to flight.

The authors do not claim to have the final answer on all the issues raised in this paper. The evidence for, and costs of, each practice discussed, and of others not raised, must be analyzed. But we believe that it is time for anesthesia providers to lead a reassessment of our work practices and to increase the transparency of the operating room environment, evaluating all of our workspaces and procedures not only in terms of how we have always done them in the past, but also on the basis of the evidence and of how they would be viewed by those looking in. We will be judged negatively if we fail to adopt reasonable safety practices.

References

1. Kohn LT, Corrigan JM, Donaldson MS (eds.). To Err is Human: Building a Safer Health System. Committee on Quality of Health Care in America, Institute of Medicine; 1999.
2. Reason J. Safety in the operating theatre: human error and organizational failure. Qual Saf Health Care. 2005;14:56-60.
3. Singla AK et al. Assessing patient safety culture. J Patient Saf. 2006;2:105-115.
4. Kohn LT, Corrigan JM, Donaldson MS (eds.). To Err is Human: Building a Safer Health System. Committee on Quality of Health Care in America, Institute of Medicine; 1999.
5. The T.J. Hooper v. Northern Barge, 60 F.2d 737 (2d Cir. 1932).
6. Tug Boat Pilot Sentenced to a Year and a Day in Duck Boat Crash. http://www.nbcphiladelphia.com/news/local/Tug-Boat-Pilot-Sentenced-to-a-Year-and-a-Day-133015843.html. Accessed March 17, 2012.
7. Cooper JB. Accidents and mishaps in anesthesia: how they occur and how to prevent them. Minerva Anestesiol. 2001;67:310-313.
8. Weick K, Sutcliffe K. Managing the Unexpected. San Francisco: Jossey-Bass; 2001.
9. Texas & Pacific Ry. Co. v. Behymer, 189 U.S. 468 (1903).
10. Weick K, Sutcliffe K. Managing the Unexpected. San Francisco: Jossey-Bass; 2001.
11. Gaba DM, Howard SK, Jump B. Production pressure in the work environment. Anesthesiology. 1994;81:488-500.
12. Safe Injection Practices to Prevent Transmission of Infections to Patients. http://www.cdc.gov/injectionsafety/IP07_standardPrecaution.html. Accessed March 17, 2012.
13. Flin R, Fletcher G, McGeorge P, Sutherland A, Patey R. Anaesthetists’ attitudes to teamwork and safety. Anaesthesia. 2003;58:233-242.
14. Personal communication. Dr. Sheldon Retchin, CEO, VCU-MCV Health System, Richmond, VA. February 21, 2011.
15. Litvak E (ed). Managing Patient Flow in Hospitals. 2nd ed. Oak Brook, IL: Joint Commission; 2009.
16. Litvak E, Pronovost P. Rethinking rapid response teams. JAMA. 2010;304:1375-1376.
17. Loftus R et al. Hand contamination of anesthesia providers is an important risk factor for intraoperative bacterial transmission. Anesth Analg. 2011;112:98-105.
18. U.S. Department of Health and Human Services. Bacterial meningitis after intrapartum spinal anesthesia - New York and Ohio, 2008-2009. MMWR. 2010;59:65-69.
19. Marx D. Whack-a-Mole: The Price We Pay for Expecting Perfection. Plano, TX: Your Side Studios; 2009.

Author Information

Kenneth Kirsner, CRNA, JD
Lincoln Memorial University

Chuck Biddle, CRNA, PhD
Virginia Commonwealth University
