Revisiting the NIMS debate
We have to be careful with NIMS as with the many other blind orthodoxies we are encouraged to accept
A few columns ago I wrote about the National Incident Management System (NIMS). It was the most polarizing article I have ever written, but I think most people missed my point about NIMS. Using four "problems," I will try to clarify my position on the system.
Problem 1: There is not enough space
Many of the essays I write are extracts from much longer works. My original work on the NIMS issue was 25 pages. The daily fire service Web news magazines are not the forum for discussions that run over dozens of pages — and judging by the hit rate neither is my website.
In order to have a voice in the easily digestible marketplace of the Internet, you have to be as concise as possible, which means that sometimes — many times — a bit of the point gets lost in the process. So my discussion of NIMS, like most other discussions on the Net, is necessarily simpler than I would like.
Problem 2: Control
Interestingly, what no one has argued against is that NIMS, like any incident management system, is at its core an attempt to assert control. And that is its primary weakness. We are encouraged to believe that using NIMS will provide at least the operational superstructures from which a single point of control can emerge. But control is not that simple. Control requires four general conditions to be met:
- There must be a goal
- It must be possible to ascertain the state of the system to be controlled
- It must be possible to change the state of the system
- There must be a model of the system to be controlled
Think about this as you attempt to apply NIMS. Can all of these conditions be met at all events? If not, then it is likely you cannot control them; it's likely that NIMS won't be the answer.
Problem 3: What's in a name?
In "From Disaster to Catastrophe: The Limits of Preparedness," author Andrew Lakoff explains of NIMS, "…the all-hazards rubric was not only a set of techniques and protocols, but also a shared ethos [emphasis added]: the injunction to be prepared."  Ethos and injunction are powerful words.
Injunction means that NIMS is beyond the tired metaphorical association with tools and toolboxes. NIMS is more than a tool, even more than the toolbox. It is the idea of tools; an idea with serious implications for how we frame problems of emergency response.
I get the exhortation to be prepared. But if the unimaginable is truly beyond our capacity even to imagine, how can we dare claim to have a tool that asserts control over it?
If the events of 9/11 were, as the 9/11 Commission wrote, "unimaginable"  and if "all-hazards" planning is the answer, then we should be able to apply the same planning modalities of NIMS equally as well to the unimaginable events of 9/11 and any other unimaginable event. But before you try, consider what Lee Clarke noted:
"To rationally plan for a mass nuclear war is an attempt to claim that after the usual routines of everyday life are gone they can still be had. The very existence of such planning constitutes a claim that adequate thought and hard work can allow adequate control over highly uncertain and unpredictable events. More broadly, it is a rhetorical claim that a meaningful knowledge base can be constructed: that the information can be gathered, that it will be valid and reliable, that it can be drawn upon."
Can an all-hazards approach even have meaning? It is possible to look back on any incident and imagine how the application of NIMS — or any other system, for that matter — would have made the response better or more efficient, but this retrospective value attribution is dangerous.
"Once we attribute a certain value to a person or thing, it dramatically alters our perceptions of subsequent information. This power of value attribution is so potent that it affects us even when the value is assigned completely arbitrarily."
NIMS works. It works every single day all over America as a methodology for exerting positive control over events with limited time horizons and those that fit within the realm of organizational knowledge. It works within its limits.
I did not invent NIMS, nor did I invent the "all-hazards" moniker. But, "Like any scientific hypothesis, [it] can never be proved to the extent that a mathematical theorem can be; but it can be disproven by a single counter example."
In other words, to prove that NIMS is not "all hazards," I simply have to provide a single example of a hazard to which it does not apply. I choose a mass nuclear war. Try planning for that with NIMS.
Problem 4: The triplet of opacity
We, as firefighters and as planners but mostly as people, suffer from what Nassim Nicholas Taleb calls "the triplet of opacity":
- The illusion of understanding, or how everyone thinks he knows what is going on in a world that is more complicated (or random) than they realize
- The retrospective distortion, or how we can assess matters only after the fact, as if they were in a rearview mirror (history seems clearer and more organized in history books than in empirical reality); and
- The overvaluation of factual information and the handicap of authoritative and learned people, particularly when they create categories — when they "Platonify."
I hope that this latest discussion sheds light on the fundamental fact that I understand NIMS. But I also understand the sinister implications of an uncritical acceptance of NIMS, and what such an acceptance says about both our ability to know and our ability to control.
We have to be careful with NIMS as with myriad other blind orthodoxies we are encouraged to accept. We cannot afford to let NIMS become a deconditionalized concept: a concept removed from the context of the conditions bearing on it.
1. Berndt Brehmer. Dynamic Decision Making in Command and Control. Electronically retrieved from http://tinyurl.com/2uarkyn
2. Andrew Lakoff. 2006. From Disaster to Catastrophe: The Limits of Preparedness. Electronically retrieved from http://tinyurl.com/2eme63k
3. The 9/11 Commission Report: Final Report of the National Commission on Terrorist Attacks Upon the United States. W.W. Norton & Company. New York, New York. p. 315
4. Lee Clarke. 1999. Mission Improbable: Using Fantasy Documents to Tame Disaster. University of Chicago Press. Chicago, Illinois.
5. Ori Brafman & Rom Brafman. 2008. Sway: The Irresistible Pull of Irrational Behavior. Broadway Books. New York, New York.
6. Henry Petroski. 2006. Success Through Failure: The Paradox of Design. Princeton University Press. Princeton, New Jersey.
7. Nassim Nicholas Taleb. 2007. The Black Swan: The Impact of the Highly Improbable. Random House. New York, New York.