Challenges and Approaches to Measuring Hospital Emergency Preparedness

Slide presentation from the AHRQ 2010 conference.

On September 28, 2010, David Chin and Sheryl Davies made this presentation at the 2010 Annual Conference.


Slide 1

Challenges and Approaches to Measuring Hospital Emergency Preparedness

Sheryl Davies, MA
Stanford University
Center for Primary Care and Outcomes Research

David Chin
University of California, Davis
Center for Healthcare Policy and Research

AHRQ Annual Meeting; September 26-29, 2010; Bethesda, MD

Slide 2

Acknowledgements

Project team:

  • Kathryn M. McDonald, MM1
  • Sheryl M. Davies, MA1
  • Tamara R. Chapman, MA1
  • Christian Sandrock, MD, MPH2
  • Patrick Romano, MD, MPH2
  • Patricia Bolton, PhD3
  • Jeffrey Geppert, JD3
  • Eric Schmidt, BA1
  • Howard Kress, PhD3
  • James Derzon, PhD3

1Stanford University; Center for Health Policy, Freeman Spogli Institute for International Studies; Center for Primary Care and Outcomes Research, School of Medicine
2University of California at Davis School of Medicine
3Battelle

AHRQ: Mamatha Pancholi, Sally Phillips, Kelly Johnson

ASPR: RADM Ann Knebel, Margaret (Peggy) Sparr, Ibrahim Kamara, Torrance Brown

This project was funded by a contract from the Agency for Healthcare Research and Quality (#290-04-0020)

Slide 3

Outline

  • Approach to development
  • Content Domain Development
  • Evaluation of existing measures
  • Challenges specific to EP
  • Adapting measurement approaches to EP
  • Solutions to challenges
  • Example measures and validation

Slide 4

Measure Development and Validation Process

Image: A flow chart shows the measure development and validation process. Under the heading of "Sources" are three text boxes:

  • Literature
  • Actual Use
  • Concept

Arrows point from each box to a large oval labeled "Candidate Indicators"; an arrow points from this oval to a text box labeled "Evaluation," and another arrow points from "Evaluation" to a diamond shape labeled "Selection." A series of arrows lead from "Selection" back to "Sources."

Notes: This is a simplified graphic that illustrates the general process of developing and validating measures, both for this set and past sets. The feedback loop shows the iterative nature of the process, by which information from literature reviews and users' evaluation and experience is used to select the most promising measures and then refine their definitions.

Slide 5

Obstacles to Usual Developmental Framework

  • Broad swath of potential indicators
  • Available data inconsistent
  • Many standards without evidence base to guide selection
  • No standardized data collection system
  • Challenges specific to measuring EP

Slide 6

Revised Measure Development Framework

Image: A chart displays the following framework process. An arrow points down from each item to the one below it:

  • Identify Potential EP Topics
  • Identify Ideal Set of EP Topics
  • Identify and Specify Measures within Topics
  • Develop Implementation Guidelines
  • Data Collection (Feasibility and Validity of Measures)
  • Re-Specification (if needed)
  • Validation of Measure Set

To the right of these steps is a cog captioned, "Ongoing Feedback, Refinement, and Reassessment"; an arrow indicates that the cog is turning.

Slide 7

Our Process: Where Have We Been?

  • Literature Review
  • Review of Existing Guidelines
  • Expert Panel Evaluation of Indicator Topics
  • MMWG Feedback:
    • Focus on Functionality and Outcomes
  • Indicator Development and Justification

Image: A series of arrow shapes pointing from left to right are captioned with the following text:

  • 900+ Potential Indicators
  • 179 Indicator Topics in Initial Evaluation
  • 47 Indicator Topics in Conference Calls
  • 42 Indicator Topics in Final Evaluation
  • Priority Levels 1-4

Slide 8

Panel Methods

  • Review to identify guidelines, checklists, etc.
  • Group together like guidelines to identify general topics
  • Topics evaluated by expert panel (nominal group technique):
    • 43 panelists assigned to 3 duplicative panels
    • Rated topics on importance for inclusion in the report, participated in a conference call, then re-rated a subset of topics
    • Each call summarized and shared with other panels
    • Only highest rated topics moved to next step
    • Final rating also included set-building task
    • Ratings used to prioritize topics (priority level 1-4)
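
To make the aggregation concrete, here is a minimal sketch (Python; the names and data are illustrative, as the slides do not specify tooling) of how each topic's two reported statistics, the median importance rating and the percent of panelists including the topic in their set, could be computed from raw panel responses:

```python
from statistics import median

def summarize_topic(ratings, include_votes):
    """Aggregate one topic's panel results.

    ratings       -- list of 1-5 importance ratings, one per panelist
    include_votes -- list of booleans: did the panelist place the
                     topic in their ideal measure set?
    """
    return {
        "median_rating": median(ratings),
        "pct_including": 100.0 * sum(include_votes) / len(include_votes),
    }

# Illustrative data for a single topic rated by a small panel.
ratings = [5, 5, 4, 5, 3, 5, 4]
include_votes = [True, True, False, True, False, True, True]
print(summarize_topic(ratings, include_votes))
# {'median_rating': 5, 'pct_including': 71.42857142857143}
```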

Slide 9

Priority One Topics

Indicator Topic | Median Rating (1-5) | Percent Including Topic in Set | Concept Area
Hospital's emergency operations plan (EOP) identifies a chain of command. | 5 | 73.0 | Emergency Management Procedures and Planning
Hospital has a plan for unsupported functioning/self-sufficiency, including through the use of alternative sources of potable water and electricity, for 96 hours. | 5 | 70.3 | Continuity of Operations
Hospital has a plan for alternative means of communication or backup communication systems. | 5 | 64.9 | Communications
Hospital has a plan for coordinating all levels of communication, including both intra- and inter-organizational communication, as well as required technology. | 4 | 75.7 | Communications

Slide 10

Priority One Topics (Continued)

Indicator Topic | Median Rating (1-5) | Percent Including Topic in Set | Concept Area
Hospital has a plan specifically for protecting staff and other responders using countermeasures, supplies, and personal protective equipment (PPE). | 4 | 64.9 | Countermeasures, Supplies, and PPE
Hospital has a plan for safety and security of people, including staff, patients, and supplies, which may involve partnering with local law enforcement agencies. | 4 | 54.0 | Safety and Security
Hospital has a plan for evacuation, including transport of patients and information to alternate care sites. | 4 | 51.4 | Evacuation and Shelter in Place

Slide 11

Priority Two Topics

Indicator Topic | Median Rating (1-5) | Percent Including Topic in Set | Concept Area
Surge capacity is addressed at various levels in the hospital (i.e., not just in the emergency department) and with community partners. | 5 | 78.4 | Surge Capacity
Hospital's emergency operations plan (EOP) contains specific plans for communications. | 5 | 67.6 | Emergency Management Procedures and Planning
Hospital has a plan for treatment and management of contaminated persons. | 4 | 64.9 | Decontamination
Hospital has a plan for evacuation in general. | 4 | 64.9 | Evacuation and Shelter in Place
Hospital has a plan for tracking both patients and the deceased. | 4 | 62.2 | Patient Management and Care
Staff training is ongoing. | 4 | 59.5 | Staff Training

Slide 12

Priority Two Topics (Continued)

Indicator Topic | Median Rating (1-5) | Percent Including Topic in Set | Concept Area
Hospital inventory of equipment and supplies includes items such as vents, PPE, negative pressure isolation, ICU beds, decontamination showers, antidote kits, and pediatric equipment. | 4 | 56.8 | Countermeasures, Supplies, and PPE
Hospital has a plan for facility access control and staff is able to gain access to the facility when called back to duty. | 4 | 56.8 | Safety and Security
In ramping up for surge, hospital has the ability to increase physical space and resource capacity through tactics such as rapid discharge, home care, and alternate care sites. | 4 | 56.8 | Surge Capacity
Drills are executed in collaboration with other organizations. | 4 | 54.1 | Community Integration

Slide 13

Priority Two Topics (Continued)

Indicator Topic | Median Rating (1-5) | Percent Including Topic in Set | Concept Area
Hospital has a plan for decontamination that is specific to chemical/biological/radiological/nuclear/high-yield explosive (CBRNE) hazards. | 4 | 51.4 | Decontamination
Hospital's emergency operations plan (EOP) is modified based on exercises or actual emergencies. | 4 | 51.4 | Emergency Management Procedures and Planning
Criteria for evacuation and shelter in place decision-making are in place. | 4 | 51.4 | Evacuation and Shelter in Place
Hospital has a plan for modification of normal clinical activities (including specialized care) or standards of care as related to disaster response. | 4 | 51.4 | Patient Management and Care
Staff training incorporates the incident command system (ICS). | 4 | 51.4 | Staff Training

Slide 14

Results: Concept Areas Covered by Highest Priority Topics

  • 6 of the 15 concept areas were covered by the highest priority topics.
  • Priority 1 and 2 covered all except:
    • Staff and volunteer management
    • Fatality management
    • Disease reporting and surveillance
  • Ratings did not favor topics derived from guidelines from any single source, or from multiple sources.

Slide 15

Database Analysis

Image: A chart displays the following process. An arrow points from each step to the next:

  • Database search
  • Database narrowing
  • Linkage of dataset measures across databases
  • Linkage of dataset measures to priority areas
  • Validation and outcome analysis

Slide 16

Search for EP Data Sources

  • Literature Review using Web-based aggregators:
    • ISI Web of Knowledge, PubMed, Google Scholar
  • State and federal sources:
    • ASPR, DHS
  • National and regional EP expert feedback

547 sources initially selected; 44 unique EP data sources identified

Slide 17

Database Identification and Narrowing

  • Inclusion Criteria:
    • Must focus on hospital or healthcare system in part or whole
    • Must be state or regional data (or aggregates above the state level, if available)
    • Data must be available and accessible (some EP data lost or "secure")
    • Must have available data dictionary
  • Identified: 44
  • Data sources which met criteria: 11

Slide 18

Database Selection: The Final 11

  • Databases with EP information:
    • PricewaterhouseCoopers' Public Health Emergency Preparedness (PHEP)
    • The Joint Commission
    • Government Accountability Office 2003
    • Government Accountability Office 2008
    • American Hospital Association Health Forum
      • Annual Survey, TrendWatch
    • National Hospital Ambulatory Medical Care Survey (NHAMCS) Pandemic & Emergency Response Preparedness Supplement 2008
    • National Hospital Discharge Survey
    • South Bay Disaster Resource Center at Harbor-UCLA Medical Center
    • Veterans Health Administration data
    • Hospital Preparedness Program, ASPR
    • The Pediatric Preparedness of Emergency Departments: A 2003 Survey

Slide 19

Identifying Links

  • 11 databases evaluated in detail:
    • Characteristics
    • Quality
    • Size
    • Temporal relationship
  • Indicators identified with relevance to MMWG:
    • Only 16, drawn from:
      • National Hospital Discharge Survey
      • The Hospital Preparedness Program
      • American Hospital Association
      • The Joint Commission

Slide 20

Example of Identified Link

  • HPP SI 25
    • Number of participating hospitals statewide that have access to pharmaceutical caches sufficient to cover hospital personnel (medical and ancillary), hospital based emergency first responders and family members associated with their facilities for a 72-hour period.
  • Links to TJC EC.4.14.3
    • The organization plans for replenishing medical supplies that will be required throughout response and recovery, including access to and distribution of caches (stockpiled by the hospital or its affiliates, local, state, or federal sources) to which the hospital has access.

Slide 21

Example of Identified Link

  • HPP SI6 Drills
    • Number of drills conducted during the FY 2005 budget period that included hospital personnel, equipment or facilities.
  • Links to TJC EM.03.01.03, EP 3
    • For each site of the hospital that offers emergency services or is a community-designated disaster receiving station, at least one of the hospital's two emergency response exercises includes an escalating event in which the local community is unable to support the hospital.

Slide 22

Example of Identified Link

  • HPP SI 26 A3
    • Number and level of PPE statewide to protect current and additional health care workers during an event
    • Possess sufficient numbers of PPE to protect both the current and additional health care personnel deployed in support of an event.
  • Links to TJC EC.4.11.9
    • The organization keeps a documented inventory of the assets and resources it has on site that would be needed during an emergency (at a minimum, personal protective equipment, water, fuel, staffing, medical, surgical, etc.).

Slide 23

Links to Priority Areas 1 and 2

  • Only 7 of the 16 linked indicators were represented in the major topic areas in Priority 1 and 2.
  • None of the major function areas (e.g., surge capacity) were represented.
  • None of the patient care areas were represented.
  • Unable to provide any link between databases and the priority areas determined by the group.

Slide 24

Issues with Linkages Between Databases

  • Lack of clear definitions
  • Lack of similarity
  • Extensive assumptions required
  • From an EP perspective, indicators from databases do NOT accurately reflect EP function

Slide 25

Additional Database Problems

  • Most linkages between only 2 datasets: HPP and TJC
  • Data for measures collected and recorded differently:
    • HPP: mixed (continuous, categorical, rank)
    • TJC: binary (compliant, non-compliant)
  • Do not measure EP function or outcome during a clinical or simulated situation
  • Thus, data are inconsistent within and between datasets
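
To make the mismatch concrete: any crosswalk between the two datasets must collapse HPP's continuous or ranked values into TJC's binary compliant/non-compliant flags, which discards information and forces an arbitrary cut point. A hypothetical Python sketch (the variable and threshold are illustrative, not defined by either dataset):

```python
def to_binary(value, threshold):
    """Collapse a continuous HPP-style value into a TJC-style
    compliant / non-compliant flag. The threshold is hypothetical;
    neither dataset defines an official crosswalk."""
    return "compliant" if value >= threshold else "non-compliant"

# Hypothetical example: hours of unsupported self-sufficiency on hand,
# using the 96-hour planning target from the priority topics as the cut point.
print(to_binary(120, 96))  # compliant
print(to_binary(48, 96))   # non-compliant
```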

Slide 26

Data Quality Example

Correlation:

  • American Hospital Association Survey 2008
    • AHA: Total licensed beds: the total number of beds authorized by the state licensing (certification) agency
  • Hospital Preparedness Program Survey 2006
    • HPP: Number of beds statewide, above the current daily staffed bed capacity, that the awardee is capable of surging to within 24 hours post-event

Note: These variables differ from an EP perspective but are collected from the same agency (Licensing & Certification) in each state.

Slide 27

Data Correlation

  • Number of hospital beds available:
    • ρ = 0.8179
    • t* = 9.7456 >> 3.496 (95% confidence)
  • State population:
    • ρ = 0.9948
    • t* = 66.96 >> 3.496 (95% confidence)

ρ: Spearman Rank Correlation Coefficient
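
As a worked check, the reported t* values follow from the usual t approximation for Spearman's ρ, t* = ρ·sqrt((n − 2)/(1 − ρ²)). A short Python sketch; the sample size n = 49 is our inference rather than a value stated on the slide, chosen because it reproduces both reported statistics almost exactly:

```python
import math

def spearman_t(rho, n):
    """t statistic for Spearman's rho: rho * sqrt((n - 2) / (1 - rho**2))."""
    return rho * math.sqrt((n - 2) / (1 - rho ** 2))

# Reported correlations from the slide; n = 49 paired state-level
# observations is inferred, not stated.
for label, rho in [("hospital beds", 0.8179), ("state population", 0.9948)]:
    print(f"{label}: t* = {spearman_t(rho, 49):.2f}")
# hospital beds: t* = 9.75
# state population: t* = 66.96

# rho itself would be computed from paired state-level counts, e.g.:
#   from scipy.stats import spearmanr
#   rho, p = spearmanr(aha_licensed_beds, hpp_surge_beds)  # hypothetical arrays
```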

Slide 28

Summary for Databases

  • Unable to identify existing measures:
    • No outcome or function analysis
    • Hypothetical, not real patient care events
    • Limited in scope of EP
    • Few measures in major priority areas
  • Unable to perform validation of existing measures:
    • Lack of adequate linkages across datasets with similar data
    • Inconsistently defined data
    • Absence of patient outcome data

Slide 29

Challenges in Measuring Preparedness

Challenge | Clinical Measurement | EP Measurement
1: Infrequent Events | Observe patient outcomes on a daily basis | Few small-scale responses, very rare large-scale responses
2: Measurement Requires Additional Effort | Can observe daily patient care | Requires proxy events to regularly observe
3: Hospitals Control Simulated Events | Limited ability to "cherry pick" patients | Parameters of proxy events often controlled by the measured entity
4: Link Between Performance in Proxy Events and Actual Events Not Fully Established | Limited need to rely on proxy measures; proxy measures based on evidence | Proxy measures not yet linked to outcomes, and limited ability to establish link given frequency of actual events

Slide 30

Challenges in Measuring Preparedness

Challenge | Clinical Measurement | EP Measurement
5: Response System Complexity | Outside entities have limited impact on care; can isolate care for clinical groups | Outside entities (e.g., public health system) integral to response; difficult to isolate response activities
6: Limited Evidence Base for Best Practices | Extensive literature based on RCTs and scientific evaluation of interventions | Limited knowledge about best "preparation" to improve outcomes, limited ability to establish
7: Variations in Scale and Types of Disasters | Daily care somewhat homogeneous, can isolate clinical groups | Small-scale to large-scale events; different types require different response

Slide 31

Challenges in Measuring Preparedness

Challenge | Clinical Measurement | EP Measurement
8: Potential Variation in Need for Preparedness | Most hospitals will care for common diseases | Major differences in scale and type of disasters likely to occur
9: Exact Nature of Potential Events Uncertain | Day-to-day clinical care predictable | When, what, where, how big? All uncertain
10: Impact of Resource Dedication to EP | Improving performance on QIs theoretically improves day-to-day care | Resources dedicated to EP and EP measurement may draw resources away from day-to-day clinical care

Slide 32

Conceptual Models Related to Measurement

  • Donabedian Model of Clinical Measurement:
    • Structure:
      • Material, human resources, hospital characteristics
      • Lack evidence linking structure with outcome
    • Process:
      • What you do: includes planning and response
      • Doing the right thing well
      • Includes functional measures
      • Assumed to be associated with outcomes
    • Outcomes:
      • True outcomes are difficult to measure
      • Approaches to estimating outcomes during exercise not established
      • Risk adjustment required

Slide 33

Guiding Principles to Address Challenges

  • Aim to measure functionality.
  • Identify a goal outcome.
  • Seek continuous outcomes.
  • Constrain the focus to the hospital.
  • Consider the potential data and distributions.

Slide 34

Potential Approaches to Measurement

  • Survey of preparedness activities:
    • Example: Elements included in Emergency Operation Plan
  • Exercise based measures of functionality:
    • Example: Time to establish a functional security checkpoint.
  • Exercise + modeling based measures:
    • Example: Time to evacuate a hospital, based on small demonstration evacuation and modeling to extrapolate time to evacuate the entire hospital.

Slide 35

Steps Undertaken to Develop Measures

  • Identify potential ways to measure topics:
    • Review existing metrics and concepts.
    • Identify most salient functionality reflected in topic.
    • Consider how well metric fits topic area.
    • Consider potential performance.
  • Draft specifications:
    • Consider feasibility of implementation.
    • Consider how well the metric reflects actual functionality.

Slide 36

Steps Undertaken to Develop Measures, Cont.

  • Define each component:
    • Consider alternative interpretations of specification.
  • Justify choices based on literature and case studies.
  • Define how to move from hospital-based data collection to aggregate measures at the state level.
  • Iterative process.

Slide 37

Example Indicator: Functional Measures

Topics:

  1. Hospital has a plan for alternative means of communication or backup communication systems.
  2. Hospital has a plan for coordinating all levels of communication, including both intra- and inter-organizational communication, as well as required technology.

Proposed Measure 1: The time to relay a field asset request or critical field information to a non-hospital-based emergency operations center (EOC) during an exercise. (Repeated for secondary and tertiary communication modalities.) [Preliminary recommendation for state level reporting: Mean time for all hospitals.]
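
As a sketch of how the state-level rollup might look in practice (Python; the hospital names and times are hypothetical, and the slide only specifies "mean time for all hospitals"):

```python
from statistics import mean

# Hypothetical exercise results: minutes to relay a field asset
# request to the non-hospital-based EOC, one entry per hospital,
# for the primary communication modality.
primary_relay_minutes = {
    "Hospital A": 4.5,
    "Hospital B": 12.0,
    "Hospital C": 7.25,
}

# Preliminary state-level measure suggested on the slide:
# mean time across all participating hospitals.
print(f"State mean relay time: {mean(primary_relay_minutes.values()):.1f} min")
# State mean relay time: 7.9 min
```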

Slide 38

Example Indicator: Modeling + Measure Based Indicators

Proposed Measure: The time to evacuate the hospital. [This time is to be based on the time to evacuate a sample of X patients, the time for planning evacuation, and the subsequent extrapolation to the entire hospital.]

Preliminary recommendation for state level reporting: Mean time for all hospitals.

  • Modeling helps to reduce measurement burden
  • Potential to reduce measurement bias
  • Requires extensive development and validation
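
One simple way the extrapolation could work is a linear model: measure per-patient evacuation time on the demonstration sample, then scale to the full census and add planning time. This is a hypothetical sketch; the slide leaves the model unspecified, and refining it is part of the "extensive development and validation" noted above:

```python
def estimated_evacuation_time(planning_min, sample_min, sample_n, total_patients):
    """Linear extrapolation from a demonstration evacuation of sample_n
    patients to the whole hospital census. Assumes per-patient time
    scales linearly, a simplification; a validated model would need to
    account for patient acuity, routes, and parallel evacuation teams."""
    per_patient_min = sample_min / sample_n
    return planning_min + per_patient_min * total_patients

# Hypothetical exercise: 30 min of planning, 10 patients moved in
# 45 min, 220 patients in house.
print(estimated_evacuation_time(30, 45, 10, 220))  # 1020.0 minutes
```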

Slide 39

Example Indicators: Using Multiple Approaches

Topic:

  • Hospital has a plan for safety and security of people, including staff, patients, and supplies, which may involve partnering with local law enforcement agencies.

Proposed Measure 1: The time to establish a functioning security screening checkpoint during an exercise, according to the hospital's EOP.

Proposed Measure 2: Does the hospital have an MOU or MOA with a security agency for security support?

Slide 40

Evaluation Criteria Based on National Quality Forum

Importance:

  1. Is the concept important to measure?
  2. Is there opportunity for improvement?

Usability:

  1. Does the measure foster true quality improvement instead of gaming or adverse consequences?
  2. Is the measure harmonized with similar measures?
  3. Is the measure meaningful, understandable, and useful?

Feasibility:

  1. Does the measure minimize burden?
  2. Are data collection and implementation feasible?

Scientific Acceptability:

  1. Is the measure precisely defined?
  2. Is it reliable (test-retest and inter-rater)?
  3. Does the measure demonstrate face validity, construct validity, and predictive validity?
  4. Is there systematic bias, and can that bias be addressed with adjustment?
  5. Does it detect meaningful differences in performance?

Slide 41

Proposed Indicators: Known Evidence Base

Axis | Criterion | Known Evidence Base
Importance | Concept is important | Panel/MMWG
Importance | Opportunity for improvement | Actual performance
Usability | Fosters true improvement | Theoretical
Usability | Harmonization | Theoretical
Usability | Meaningfulness | Theoretical
Feasibility | Minimizes burden | Theoretical
Feasibility | Implementation | Unknown
Scientific acceptability | Precise definition | Specifications
Scientific acceptability | Reliability | Unknown
Scientific acceptability | Face/consensual validity | Panel/MMWG, literature
Scientific acceptability | Construct validity | Unknown
Scientific acceptability | Criterion validity | Unknown
Scientific acceptability | Bias and risk adjustment | Theoretical issues
Scientific acceptability | Power | Theoretical issues

Slide 42

Validation Recommendations

  • Step 1: Establish consensual validity through structured panel review process.
  • Step 2: Develop data collection processes.
  • Step 3: Develop methods to assess feasibility.
  • Step 4: Develop methods for assessing proxy outcomes in an exercise (optional).
  • Step 5: Identify a representative sample of hospitals to pilot test measures.
  • Step 6: Collect pilot data, including test-retest reliability, inter-rater reliability, and measure performance.
  • Step 7: Assess the distribution of performance and relationship between measures.
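
A minimal sketch of the Step 6 reliability analyses (Python; the pilot data are hypothetical). Test-retest reliability is checked here with a Spearman correlation and inter-rater reliability with Cohen's kappa; these are standard choices, though the slides do not mandate specific statistics:

```python
from collections import Counter
from scipy.stats import spearmanr

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[k] * c2[k] for k in c1) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical pilot data for one timed measure at six hospitals.
test   = [12.0, 8.5, 20.0, 15.0, 9.0, 11.0]  # first administration (min)
retest = [11.0, 9.0, 19.5, 16.0, 8.0, 12.5]  # repeat administration (min)
rho, _ = spearmanr(test, retest)
print(f"test-retest rho = {rho:.2f}")        # 0.89

# Two raters scoring the same six exercises against the EOP.
rater_a = ["pass", "fail", "pass", "pass", "fail", "pass"]
rater_b = ["pass", "fail", "pass", "fail", "fail", "pass"]
print(f"inter-rater kappa = {cohens_kappa(rater_a, rater_b):.2f}")  # 0.67
```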