The practical value of emergency response exercises

A report by the “Trust for America's Health” reviews pandemic planning and makes some interesting points about exercises.

The report scores the 50 US states (and DC) against a set of criteria and finds some states lacking on all but one of them:

– seven states don't stockpile antivirals
– eight do not have adequate laboratories
– labs in three states don't have adequate surge capacity
– 12 states don't have adequate biosurveillance systems
– 21 states lack adequate legal protection for volunteers
– 13 states fall below the minimum threshold for medical volunteers
– flu vaccination rates for seniors fell in 11 states
– funding fell in six states

The one question to which every state gets a yes is: did it hold “an emergency preparedness drill or exercise in 2007 with health department officials and the state National Guard?”

This is partly because states are legally obliged to hold exercises. The December 2006 Pandemic and All-Hazards Preparedness Act requires tabletop exercises including outcome measures, lessons learned, and future planning. Quite large amounts of money have been put into emergency response preparations.

But I think it's also because exercises can be an easy hit. They take a few weeks to set up and a day to hold, they provide photo opportunities for local politicians and officials, and they put a tick in the box.

The report itself, despite the 100% pass rate, finds that “There remains limited, non-systematic testing and exercising of emergency health plans, and inconsistent mechanisms for incorporating lessons learned into future planning.”

It goes on to say: “Often, emergency plans are evaluated using written assessments that include surveys, checklists, and written reports. …. Written assessments are favored by many preparedness officials because they tend to be inexpensive, especially when compared to the cost of holding live exercises or drills. A growing number of experts, however, both within and outside of government, are urging federal, state, and local emergency planners to incorporate drills and real-time exercises into their preparedness training and evaluation….. The PAHPA legislation ties state and local preparedness funding to states’ incorporation of drills and exercises to test emergency preparedness. While many public health experts applaud this, they caution that simply holding an exercise or drill does not mean the state or local government would be able to respond adequately in a real emergency situation. One major flaw with the current drilling system is a lack of clear criteria for evaluating the quality of performance. At present, there are no evidence-based guidelines from the federal government regarding conduct of an emergency preparedness exercise in terms of what outcomes are expected from each drill.”

In other words, you can do as many exercises as you want, but you also have to listen to the results and act on them, and the TFAH report is evidence that a significant number of states don't do this.

As an afterthought: the TFAH report says: “Until such guidance is developed, one way state and local planners can incorporate more drills into their preparedness evaluations is to use so-called embedded assessments. This type of exercise makes use of ongoing public health activities, such as an annual flu clinic, to measure a state’s or locality’s ability to vaccinate populations against smallpox or influenza. Planners can use the results of this embedded assessment to determine where bottlenecks occur or whether other barriers to a successful deployment of vaccine exist.”

Googling the concept of embedded assessment finds papers about its use in education, medical diagnosis, health care, and measuring the effect of computer games on cognitive ability.

I suspect this is one of those terms that means different things to different people, but it does seem to involve:
– systematically embedding test questions or measurements into normal activities
– systematically scoring these on an ongoing basis
– sometimes providing feedback, training, or remedial measures during the period of activity, rather than at the end – and, of course, monitoring these measures and their effects.

One report on the technique as used in education sums it up as “Standard assessment practices generally entail assessing learning at the end of a period of learning….Embedded assessment practices involve incremental evaluations of learning as the course or training progresses….” adding “…In traditional assessment, embedded tests would most likely be multiple choice tests rather than small skills demonstrations. The term for meaningful tasks that replicate real world challenges based assessment is authentic assessment…”

So – exercises, done properly, are really systematic authentic embedded assessments?
