Normalcy bias
The normalcy bias, or normality bias, refers to a mental state people enter when facing a disaster. It causes them to underestimate both the possibility of a disaster occurring and its possible effects. This often results in situations where people fail to adequately prepare for a disaster and, on a larger scale, in the failure of governments to include the populace in their disaster preparations. The assumption underlying the normalcy bias is that because a disaster has never occurred, it never will occur. The bias also leaves people unable to cope with a disaster once it occurs; people with a normalcy bias have difficulty reacting to something they have not experienced before. They also tend to interpret warnings in the most optimistic way possible, seizing on any ambiguities to infer a less serious situation.[1]
Possible causes
The normalcy bias may be caused in part by the way the brain processes new data. Research suggests that even when the brain is calm, it takes 8–10 seconds to process new information. Stress slows the process, and when the brain cannot find an acceptable response to a situation, it fixates on a single solution that may or may not be correct. An evolutionary reason for this response could be that paralysis gives an animal a better chance of surviving an attack; predators are less likely to eat prey that is not struggling.[2]
Effects
The normalcy bias often results in unnecessary deaths in disaster situations. The lack of preparation for disasters often leads to inadequate shelter, supplies, and evacuation plans. Even when all these things are in place, individuals with a normalcy bias often refuse to leave their homes. Studies have shown that more than 70% of people check with others before deciding to evacuate.[2]
The normalcy bias also causes people to drastically underestimate the effects of a disaster. They assume that everything will be all right even when information from the radio, television, or neighbors gives them reason to believe there is a risk. This creates a cognitive dissonance that they then must work to eliminate. Some manage to eliminate it by refusing to believe new warnings as they come in and refusing to evacuate (maintaining the normalcy bias), while others eliminate the dissonance by escaping the danger. The possibility that some may refuse to evacuate causes significant problems in disaster planning.[3]
Examples
- The Little Sioux Scout camp tornado in June 2008. Despite being in the middle of "Tornado Alley," the campground had no tornado shelter to offer protection from a strong tornado.[4]
- New Orleans before Hurricane Katrina. Inadequate government and citizen preparation and the denial that the levees could fail were examples of the normalcy bias, as were the thousands of people who refused to evacuate.
- The September 11 attacks. Although the US government received signals that an attack was being organized, inadequate steps were taken to prevent it.
Prevention
The negative effects of the normalcy bias can be combated through the four stages of disaster response:
- preparation, including publicly acknowledging the possibility of disaster and forming contingency plans
- warning, including issuing clear, unambiguous, and frequent warnings and helping the public to understand and believe them
- impact, the stage at which the contingency plans take effect and emergency services, rescue teams, and disaster relief teams work in tandem
- aftermath, or reestablishing equilibrium after the fact by providing supplies and aid to those in need
See also
- Disaster relief
- Disaster planning
- Black Swan Theory
References
- Doswell, Chuck. "Thoughts about Tornadoes and Camping Safety after the Iowa Tragedy on June 11, 2008." Flame.org, 26 July 2008. http://www.flame.org/~cdoswell/scout_tragedy/scout_tragedy_2008.html
- Oda, Katsuya. "Information Technology for Advancement of Evacuation." http://www.ysk.nilim.go.jp/kakubu/engan/engan/taigai/hapyoronbun/07-17.pdf
- Ripley, Amanda. "How to Get Out Alive." Time, 25 Apr. 2005.
- Valentine, Pamela V., and Thomas E. Smith. "Finding Something to Do: The Disaster Continuity Care Model." Brief Treatment and Crisis Intervention 2 (2002): 183–96.
Categories:
- Cognitive biases
- Disaster preparedness
- Emergency management
- Sociology