The Psychology of Hubris in Disaster Response

The appointment of community leaders in emergency management contexts often overlooks the need for comprehensive expertise across all phases of the field, favouring individuals with backgrounds in event management, project management, and social services. This practice produces a leadership cohort that excels primarily in the preparedness phase while lacking crucial knowledge of prevention, response, recovery, and mitigation. The imbalance can lead to significant inefficiencies, as leaders either delegate unfamiliar tasks to subordinates or neglect them altogether in what is sometimes referred to as the "ostrich method". This lack of holistic understanding can compromise the effectiveness of high-consequence emergency management programmes and increase the community's vulnerability to hazards (Kruger & Dunning, 1999).

The Dunning-Kruger Effect plays a pivotal role here: individuals with limited experience often overestimate their own efficacy and fail to recognise the gaps in their knowledge. This cognitive bias is particularly dangerous in emergency management, where overconfidence can lead to poor decision-making with catastrophic consequences. As Stojiljković et al. (2018) highlight, the inability of unskilled individuals to accurately assess their own capabilities is a significant source of human error in safety and emergency response settings. The researchers emphasise that this overconfidence often stops individuals from seeking further training, perpetuating a cycle of inadequacy in both participation and leadership.

Furthermore, the Peter Principle exacerbates this problem. Originally written as satire, the principle has since gained traction in academic circles following qualitative meta-analysis. Individuals are promoted on the strength of performance in previous, often unrelated, roles until they reach their level of incompetence, at which point they can no longer meet the demands of the new position (Peter & Hull, 1969). This phenomenon is particularly evident in emergency management, where theoretical expertise alone is insufficient for the strategic oversight that complex emergencies demand. As Benson and Campbell (2007) note, leaders who are unprepared for the strategic challenges of emergency management are likely to struggle, making errors that could endanger lives.

In addressing these challenges, Stojiljković et al. (2018) propose several training interventions aimed at mitigating the Dunning-Kruger Effect. These include comprehensive training programmes that incorporate self-assessment and critical thinking exercises, as well as regular performance evaluations. By integrating these elements into training, organisations can help individuals develop a more accurate self-perception and improve their decision-making capabilities. Moreover, incorporating metacognitive training into emergency management education can help leaders recognise their limitations and seek additional expertise when necessary (Ehrlinger et al., 2008).

Ultimately, effective emergency management requires leaders who are not only grounded in response theory but also equipped with a broad practical understanding of all phases of emergency management. The integration of interdisciplinary training programmes that address cognitive biases and enhance metacognitive abilities is crucial. As Kontogiannis et al. (2017) suggest, a holistic approach to training that combines technical and psychological expertise can significantly reduce the risk of human error and improve safety outcomes. Fortunately, the Dunning-Kruger Effect also works in reverse: the more qualified training participants undertake, the more aware they become of their limitations and of their need for further development. By fostering a culture of continuous learning and self-awareness, emergency management organisations can better prepare for the complex challenges they face.

References

Benson, M. J., & Campbell, J. P. (2007). To be, or not to be, linear: An expanded representation of personality and its relationship to leadership performance. International Journal of Selection and Assessment, 15(2), 232–249. https://doi.org/10.1111/j.1468-2389.2007.00383.x

Ehrlinger, J., Johnson, K., Banner, M., Dunning, D., & Kruger, J. (2008). Why the unskilled are unaware: Further explorations of (absent) self-insight among the incompetent. Organizational Behavior and Human Decision Processes, 105(1), 98–121. https://doi.org/10.1016/j.obhdp.2007.05.002

Kontogiannis, T., Malakis, S., & McDonald, N. (2017). Integrating operational and risk information with system risk models in air traffic control. Cognition, Technology & Work, 19(2–3), 345–361. https://doi.org/10.1007/s10111-017-0415-6

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134. https://doi.org/10.1037/0022-3514.77.6.1121

Peter, L. J., & Hull, R. (1969). The Peter Principle: Why things always go wrong. William Morrow and Company.

Stojiljković, E., Bozilov, A., Golubovic, T., & Glisovic, S. (2018). Human error prevention by training design: Suppressing the Dunning-Kruger effect in its infancy. In P. M. Arezes et al. (Eds.), Occupational Safety and Hygiene VI (pp. 219–224). Taylor & Francis Group.
