Swiss Cheese model

The Swiss Cheese model of accident causation is used in the risk analysis and risk management of human systems. It likens human systems to multiple slices of Swiss cheese stacked side by side. It was originally propounded by British psychologist James T. Reason in 1990, and has since gained widespread acceptance and use in healthcare, in the aviation safety industry, and in emergency service organizations. It is sometimes called the cumulative act effect.

Reason hypothesizes that most accidents can be traced to one or more of four levels of failure: organizational influences, unsafe supervision, preconditions for unsafe acts, and the unsafe acts themselves. In the Swiss Cheese model, an organization's defences against failure are modelled as a series of barriers, represented as slices of Swiss cheese. The holes in the slices represent weaknesses in individual parts of the system, and they continually vary in size and position across all slices. The system as a whole produces a failure when the holes in all of the slices momentarily align, permitting (in Reason's words) "a trajectory of accident opportunity", so that a hazard passes through the holes in all of the defences. [cite book|title=Controlling Pilot Error|author=Daryl Raymond Smith, David Frazier, L W Reithmaier, and James C Miller|pages=10|date=2001|publisher=McGraw-Hill Professional|id=ISBN 0071373187] [cite book|pages=4–6|title=Clinical Risk Management in Midwifery: the right to a perfect baby?|author=Jo. H. Wilson, Andrew Symon, Josephine Williams, and John Tingle|date=2002|publisher=Elsevier Health Sciences|id=ISBN 0750628510] [cite book|title=Clinical Governance in Mental Health and Learning Disability Services: A Practical Guide|editor=Adrian J. B. James, Tim Kendall, and Adrian Worrall|pages=176|date=2005|publisher=Gaskell|id=ISBN 1904671128|chapter=Risk management|author=Tim Amos and Peter Snowden]
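The aligned-holes mechanism lends itself to a simple probabilistic illustration. The following Python sketch is not part of Reason's formulation: the layer count, per-layer hole probability, and trial count are arbitrary assumptions chosen only to show how the chance of a complete "trajectory of accident opportunity" falls as independent defensive layers are added.

```python
import random

def accident_probability(num_layers: int = 4,
                         hole_prob: float = 0.1,
                         trials: int = 100_000) -> float:
    """Monte Carlo estimate of the chance that a hazard penetrates
    every defensive layer.

    Each layer independently presents a momentary hole with
    probability hole_prob; an accident requires holes in all
    layers to align at once.
    """
    accidents = 0
    for _ in range(trials):
        if all(random.random() < hole_prob for _ in range(num_layers)):
            accidents += 1
    return accidents / trials

if __name__ == "__main__":
    for layers in range(1, 5):
        est = accident_probability(num_layers=layers)
        print(f"{layers} layer(s): estimated accident probability ~ {est:.5f}")
```

With independent layers the estimate converges on hole_prob ** num_layers, so each added barrier reduces risk multiplicatively; it also suggests why correlated holes, such as a single organizational weakness that thins several slices at once, are especially dangerous.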

Frosch [cite book|title=Seeds of Disaster, Roots of Response: How Private Action Can Reduce Public Vulnerability|editor=Philip E Auerswald, Lewis M Branscomb, Todd M La Porte, and Erwann Michel-Kerjan|author=Robert A. Frosch|pages=88|chapter=Notes toward a theory of the management of vulnerability|date=2006|publisher=Cambridge University Press|id=ISBN 0521857961] describes Reason's model in mathematical terms as being a model in percolation theory, which he analyses as a Bethe lattice.
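To make Frosch's framing concrete, a standard result from percolation theory can serve as a sketch (the coordination number chosen below is an illustrative assumption, not a figure taken from Frosch's chapter). On a Bethe lattice in which every node has $z$ neighbours, an infinite open path, the analogue of a hazard trajectory threading every defence, first appears when the probability $p$ that an individual bond is open exceeds the critical threshold

$$p_c = \frac{1}{z - 1}.$$

For $z = 3$, for instance, $p_c = 1/2$: below that density of weaknesses only finite clusters of holes form and no system-spanning failure path exists, while above it such a path becomes possible.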

The Swiss Cheese model includes, in the causal sequence of human failures that leads to an accident or an error, both active failures and latent failures. Active failures encompass the unsafe acts that can be directly linked to an accident, such as (in the case of aircraft accidents) pilot errors. Latent failures are a particularly useful concept in aircraft accident investigation, since they encourage the study of contributory factors in the system that may have lain dormant for a long time (days, weeks, or months) until they finally contributed to the accident. Latent failures span the first three levels of failure in Reason's model. Preconditions for unsafe acts include fatigued air crew and improper communications practices. Unsafe supervision encompasses, for example, pairing two inexperienced pilots and sending them on a flight into known adverse weather at night. Organizational influences encompass such things as reduced expenditure on pilot training in times of financial austerity. [cite book|title=A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis and Classification System|author=Douglas A. Wiegmann and Scott A. Shappell|pages=48–49|date=2003|publisher=Ashgate Publishing, Ltd.|id=ISBN 0754618730]
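The four levels, and the split between active and latent failures across them, can be summarized as a small taxonomy. The Python sketch below is a hypothetical illustration built from the aviation examples in this article; the class and field names are this sketch's own, not a published coding scheme such as HFACS.

```python
from dataclasses import dataclass
from enum import Enum

class Level(Enum):
    ORGANIZATIONAL_INFLUENCES = 1
    UNSAFE_SUPERVISION = 2
    PRECONDITIONS_FOR_UNSAFE_ACTS = 3
    UNSAFE_ACTS = 4

@dataclass
class Factor:
    description: str
    level: Level

    @property
    def is_latent(self) -> bool:
        # Latent failures span the first three levels; only the
        # unsafe acts themselves are active failures.
        return self.level is not Level.UNSAFE_ACTS

factors = [
    Factor("reduced expenditure on pilot training", Level.ORGANIZATIONAL_INFLUENCES),
    Factor("inexperienced pilots paired on a night flight into known adverse weather",
           Level.UNSAFE_SUPERVISION),
    Factor("fatigued air crew", Level.PRECONDITIONS_FOR_UNSAFE_ACTS),
    Factor("pilot error", Level.UNSAFE_ACTS),
]

for factor in factors:
    kind = "latent" if factor.is_latent else "active"
    print(f"[{kind}] {factor.level.name}: {factor.description}")
```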

The same analyses and models apply in the field of healthcare, and many researchers have provided descriptive summaries, anecdotes, and analyses of Reason's work in that context. For example, a latent failure could be the similar packaging of two different prescription drugs that are then stored close to each other in a pharmacy; such a failure would be a contributory factor in the administration of the wrong drug to a patient. Research along these lines has led to the realization that medical error can be the result of "system flaws, not character flaws", and that individual greed, ignorance, malice, or laziness are not the only causes of error. [cite book|title=Annual Review of Nursing Research Volume 24: Focus on Patient Safety|editor=Joyce J. Fitzpatrick and Patricia Hinton-Walker|chapter=The intersection of patient safety and nursing research|author=Patricia Hinton-Walker, Gaya Carlton, Lela Holden, and Patricia W. Stone|date=2006-06-30|publisher=Springer Publishing|id=ISBN 0826141366|pages=8–9]

Lubnau, Lubnau, and Okray [cite book|title=Crew Resource Management for the Fire Service|author=Thomas Lubnau II, Randy Okray, and Thomas Lubnau|date=2004|publisher=PennWell Books|id=ISBN 1593700067|pages=20–21] apply Reason's Swiss Cheese model to the engineering of human systems in the field of firefighting, with the aim of reducing human errors by "inserting additional layers of cheese into the system", namely the techniques of Crew Resource Management.

Further reading

* Westrum and Adamski relate Reason's Swiss Cheese model to Westrum's "human envelope" model, in which "around every complex operation there is a human envelope that develops, operates, maintains, interfaces, and evaluates the function of the sociotechnological system" and the system "depends on the integrity of this envelope, on its thickness and strength". Westrum models latent failures as voids within this envelope, and active failures as factors external to the envelope that act to breach it.
* (also available on-line [http://www.tc.gc.ca/CivilAviation/publications/tp185/2-06/Pre-flight.htm#Seeking here]) A reminder that while Reason's model extends causation to latent failures, it does not thereby eliminate active failure from consideration.
* An application of the Swiss Cheese model to a specific case of medical error.

See also

* Systems engineering
* Root cause analysis
* Latent human error
* Tenerife disaster

