Measurement and Evaluation of TeamSTEPPS: Slide Presentation
TeamSTEPPS®: National Implementation
Slide 1
Measurement and Evaluation of TeamSTEPPS
National Conference 2012
June 21-22
Nashville, Tennessee
Slide 2
Measurement and Evaluation of TeamSTEPPS
TeamSTEPPS National Conference
June 22, 2012
Images: Three photographs show medical personnel at work in a hospital setting.
Slide 3
Session Moderator
- Andrea Amodeo, MS, Research Associate, IMPAQ International.
Slide 4
Purpose
- Provide Information on How to Measure and Evaluate TeamSTEPPS' Success.
- Provide Best Practices.
- Describe an Example.
Slide 5
Agenda
- Introduction.
- Why Measure and What to Measure: Perspectives from the C-Suite.
- Ten Considerations for Measurement
- Developing a Measure of Trauma Team Performance: An Example.
- Panel Q&A.
Slide 6
Panel Introductions
- Anthony Slonim, MD, DrPH, Barnabas Health.
- Eduardo Salas, PhD, University of Central Florida.
- David P. Baker, PhD, IMPAQ International.
Slide 7
Why Measure and What to Measure: Perspectives from the C-Suite
Anthony Slonim, MD, DrPH.
Executive Vice President and Chief Medical Officer, Barnabas Health.
Professor of Medicine, Pediatrics, and Preventive Medicine and Community Health, University of Medicine and Dentistry of New Jersey, New Jersey Medical School.
Slide 8
Acknowledgments
- No Conflicts of Interest.
- AHRQ Funding and Contracts:
- Contract #HHSA290200600019i, Task Order #12; Subcontract: SAMIKE, LLC; "Proactive Risk Assessment in Ambulatory Surgery Centers," Anthony D. Slonim, MD, DrPH, Principal Investigator.
- Objectives:
- Participants will understand the importance of measuring teamwork success.
- Participants will understand what measures are important for leadership.
Slide 9
Why Measure?
- To know if your TeamSTEPPS implementation worked:
- Step 5 of Implementation Planning.
- Assist in modifying implementation plan.
- Planning future implementations.
- Information for leadership.
- Producing a scholarly product.
Slide 10
TeamSTEPPS Phases
Image: The shift process has three phases. Phase 1: Assessment. Pre-Training Assessment includes Site Assessment, Culture Survey and Data/Measures. Are these ready? If No, pass through Climate Improvement and return to Pre-Training Assessment. If Yes proceed to Action Plan and then move on to Phase 2. Phase 2: Planning, Training and Implementation. Training leads to Intervention. Intervention includes testing and leads to Phase 3. Phase 3: Sustainment. This phase includes Culture Change: Coach and Integrate, Monitor the Plan, and Continuous Improvement. Continuous Improvement includes going back to Training to lead to more Culture Change.
Slide 11
TeamSTEPPS Phases
Image: The shift process model described on Slide 10 is repeated. The Intervention section is highlighted.
Slide 12
TeamSTEPPS Phases
Image: The shift process model described on Slide 10 is repeated. The Intervention section has exploded into a red burst captioned "Outcomes".
Slide 13
Teamwork
Image: Interlinked puzzle pieces are captioned Macrosystem, Provider to Provider, Climate, Provider to Patient, Outcomes, and Patient. The word Microsystem appears below, bracketing all but the Macrosystem piece.
Slide 14
Teamwork
Image: The interlinked puzzle pieces from Slide 13 are repeated. Macrosystem is highlighted.
Slide 15
Macrosystem
Image: A box captioned Macrosystem contains three arrows captioned Safe, Timely, and Efficient. Overlapping this box is a smaller box captioned Microsystem and containing two arrows captioned Safe, Equitable, and Patient-Centered, and Effective, Timely, and Efficient.
Slide 16
How do Hospitals Work?
Image: A triangle is shown with a large red H in the center. The three sides of the triangle are captioned Medical Staff, Board of Directors, and CEO Management Team.
Slide 17
Teamwork
Image: Interlinked puzzle pieces are captioned Macrosystem, Provider to Provider, Climate, Provider to Patient, Outcomes, and Patient. The word Microsystem appears below, bracketing all but the Macrosystem piece.
Slide 18
The Microsystem Team
Image: Three double-headed arrows form a triangle. The corners where two arrows meet are captioned Patient, Family, and Provider. At the center of the triangle is Teamwork.
Slide 19
Teamwork
Image: The interlinked puzzle pieces from Slide 17 are repeated. Provider to Provider and Provider to Patient are highlighted.
Slide 20
Patient
Image: A large arrow captioned Patient crosses the top of the slide from left to right. In the first section below, on the left side of the slide, is the word "Access" separated by a dotted line from "Structure"; below Structure are People, Buildings, Technology, and Organization, each in a small text box. Beneath this section is a bracket and the following list:
- Trained Providers.
- Primary Care Services.
- EMS.
- Emergency Departments.
- ICUs.
- Specialized Equipment.
- Regionalized Care.
- Financing.
In the second section is "Process, Interactions between" with Providers and Patients, Providers and Providers, and Providers and Technology, each in a small text box; the first two are outlined in red. Beneath this section is a bracket and the following list:
- Evidence Based Practices.
- 'Bundles' of Care.
- Specialist Availability.
- Family Centered Care.
- Trainees.
- Teamwork.
- Multidisciplinary Care.
In the third section is "Outcomes" with Vitality, Economic, and Quality, each in a small text box. Beneath this section is a bracket and the following list:
- Mortality.
- Cost.
- Length of Stay.
- Infections.
- Complication Rates.
- Quality of Life.
Slonim AD, Marcin JP, and Pollack MM. Outcomes in Pediatric Critical Care Medicine: Implications for Health Services Research and Patient Care. In: Fuhrman B, Zimmerman J, eds. Pediatric Critical Care, 4th ed. Mosby; 2011.
Slide 21
Clinical Processes
Provider to Provider:
- Policies and Procedures.
- Core Measures.
- Bundle Adherence.
- Family Centered.
- Teamwork.
- Availability.
- Consistency.
- Knowledgeable.
Provider to Patient:
- Safe.
- Effective.
- Outcome Based.
- Clinical.
- Experiential.
Slide 22
Bedside Care
Welcome, Register, Risk Assess, Triage
Image: A chart shows the following process: Outreach (Sales, Marketing—Relationships and Transport) → Welcome, Register, Risk Assess, Triage (Needed Services) → Provide Feedback and Monitor Service (Care Team Introductions and Follow-up—Assess, Plan and Initiate Care—Direct Care Delivery OR Technology Care Delivery—Treatment Plan and Care Delivery—Deliver Care OR Clinical Support Services) → Evaluate Care, Outcomes and Business.
Slide 23
Teamwork
Image: The interlinked puzzle pieces from Slide 17 are repeated. Climate is highlighted.
Slide 24
Culture vs. Climate
Climate:
- Organizational Structure:
- Connectiveness.
- Historical/Environmental Forces.
- Vision/Strategy.
- Standards:
- Accountability.
- Behavior.
- Communication.
- Rewards.
- Trust.
Culture:
- Values.
- Beliefs.
- Myths.
- Norms.
- Traditions.
Source: Kennedy Group Executive Strategies, Consulting at the Kennedy Group.com.
Slide 25
Teamwork
Image: The interlinked puzzle pieces from Slide 17 are repeated. Outcomes is highlighted.
Slide 26
What Leadership Cares About: An Integrated Quality Program: Content
Clinical Quality:
- Continual Survey Readiness.
- Core Measures + Pain.
- Nursing and Doctor Quality.
- Readmissions.
- Standardized Ratios:
- LOS.
- Mortality.
- PSIs/PQIs/PPCs.
Patient Experience:
- Patient Centered:
- HCAHPS Survey Scores.
- Likelihood to Recommend.
- Wait times and Pain/empathy.
- Employee Centered:
- Engaged in Mission.
- Accountable.
- Delivering on Excellence.
Images: Two figures with interlocking shapes.
Slide 27
Practice Evaluations and Learning Activities
- Performance achievement:
- High: recognize and promote.
- Moderate: target improvement opportunities.
- Low:
- Graduated Counseling.
- Education.
- Supervision.
- Assurance of competency.
- Behavior modification.
- Other areas for improvement.
Slide 28
Teamwork
Image: The interlinked puzzle pieces from Slide 17 are repeated. Patient is highlighted.
Slide 29
The Patient's Responsibility
- Open, Honest.
- Follows Through:
- Therapy.
- Recommendations.
- Rules of engagement.
- Asks Questions, Demands Answers.
Slide 30
Questions?
Slide 31
Designing a Team Performance Measurement System: Ten Considerations
Eduardo Salas, PhD Department of Psychology & Institute of Simulation & Training University of Central Florida
esalas@ist.ucf.edu
Slide 32
To Begin…
- Measurement is not "Sexy, Flashy"…Yet, of the utmost importance!
- No "Silver Bullet"…
- No Perfect Tools!
- All need to be adapted, refined, expanded for different purposes!
Slide 33
1. Consider Level of Analysis
- Individual:
- MD.
- Nurse.
- Technician.
- Team:
- Emergency Department Team.
- Radiology Team.
- Multi-team—Team Structure:
- Core Care Team.
- Contingency Team (e.g., RRT).
- Administrative Team.
Slide 34
2. Clarify the Purpose
- Diagnose root causes of performance deficiencies:
- Identify specific weaknesses.
- Provide feedback:
- Relay information regarding strengths and weaknesses as a remediation plan.
- Assessment:
- Evaluate the level of proficiency or readiness.
Slide 35
3. Decide What to Measure: Outcomes vs. Processes
- Extent that outcome was successful:
- Accuracy: Precision of performance (e.g., correct medication).
- Timeliness: How long (e.g., time to incision).
- Productivity: How much (e.g., patient volume in ED).
- Efficiency: Ratio of resources required vs. used (e.g., OR supplies).
- Diagnose root causes of performance deficiencies:
- Feedback or follow-up training.
- Types of Process:
- Taskwork:
- Procedural.
- Non-procedural.
- Teamwork.
Slide 36
4. Select a Measure: Checklist
- Items/actions use dichotomous responses:
- YES/NO.
- RIGHT/WRONG.
- PERFORMED/NOT PERFORMED.
Action / Behavior | Yes | No |
---|---|---|
Assess Airway | ||
Breathing | ||
Circulation / FAST Exam | ||
Disability | ||
Exposure and Environment |
Slide 37
4. Select a Measure: Frequency Count
- # of times a behavior, action, or error occurs:
- Better for measuring acts of commission vs. acts of omission.
- Useful when purpose = know how often a specific action is taken or task is performed.
- Can be recorded during a critical event in an exercise or across the entire scenario.
Communication | Positive Instances |
---|---|
Check-back | 8 |
Call-outs | |
SBAR | 2 |
Unintelligible Communications | |
Mutual Support | Positive Instances |
Two-Challenge | |
CUS | 4 |
Task Assistance |
Slide 38
4. Select a Measure: Graphic Rating Scale
- Numeric or descriptive judgment of how well a task was performed.
1. The team leader assigned roles to the Trauma Team.
Scale legend: 1 = ineffective through 6 = very effective
Ineffective | Very Effective | ||||
1 | 2 | 3 (checked) | 4 | 5 | 6 |
2. The PGY2 used check-back to confirm orders.
Scale legend: 1 = strongly disagree through 6 = strongly agree
Strongly Disagree | Strongly Agree | ||||
1 | 2 | 3 | 4 | 5 (checked) | 6 |
Slide 39
4. Select a Measure: Anchored Rating Scale
Communication: Used check-back during trauma resuscitation.
Did not use check- back | Used check-back once to confirm care plan at end of case | Used check-back to confirm all medication orders | Used check-back to confirm critical orders, primary and secondary survey | Used check-back to confirm all critical orders (checked) |
Slide 40
5. Decide the Timing; When?
Image: Chart shows the points at which Training and Retraining should occur along a timeline for Level 1: Reactions, Level 2: Learning, Level 3: Behavior, and Level 4: Results.
Slide 41
6. Consider Fidelity of Setting; Where?
Type of Location | Good for: | Trouble with: |
---|---|---|
Actual clinical context (i.e., on the job, in the wild) | | |
Learning environment (i.e., in situ or in vivo simulations) | | |
Hybrid approaches | | |
Slide 42
7. Train Observers; Who?
- Choosing observers:
- Clinical + Teamwork competence.
- Training and supporting raters:
- Develop and maintain high inter-rater reliability:
- Rater training.
- Scoring guides.
- Developing coaches and facilitators:
- How is data going to be used to improve performance?
- Facilitation skills.
- Debrief assessment.
Slide 43
8. Calibrate the Measurement System: Test It!
Images: Two medical staff are shown taking test samples from a dummy patient; below, two close-up photographs of hands holding pens and writing on paper are captioned Observer 1 and Observer 2.
Slide 44
9. Dilemma—Generic vs. Specific Tools
Image: Model depicts the process for choosing tools. An arrow points from Expectations for Performance to Behavioral Specificity of Content. Four arrows point from Behavioral Specificity of Content to Abstract/Generic Content, Specific/Concrete Content, The Rater/Observer, and The Tool Protocol. Between The Rater/Observer and The Tool Protocol is the question, "Where is the knowledge burden?"
Slide 45
10. It is Paramount for Debriefing!
- To maximize learning from experience:
- Practice alone isn't good enough.
- Structured practice with diagnostic feedback is required.
- To ensure that the 'right' lessons are learned:
- Everyone walks away with the same lessons learned.
- The team's interpretation of what happened and why it happened is cross-checked with standards.
- To promote self-reflection and continuous learning:
- Develop the team's self-correction skills.
- 'Calibrate' team members to rate their own performance.
Slide 46
Remember the Issues to Consider…
- Why…Always keep purpose in mind!
- "In the end, are the questions this measurement tool could answer what I really want to know?"
- What…What content needs to be captured by the measurement system?
- Where…Where will team be assessed?
- Training room vs. in-situ.
- When…When is it best to measure each competency in the scenario?
- Who…Who will be using this measure? What training will they have?
- How… What scale is best?
Slide 47
Developing a Measure of Trauma Team Performance: An Example
David P. Baker, PhD
IMPAQ International
Slide 48
Acknowledgements
- No conflicts of interest.
- Part of Dr. Jeannette Capella's Surgical Education and Research Fellowship.
- Research Team:
- Jeannette Capella, MD (PI).
- Andi Wright, RN.
- Ellen Harvey, RN.
- Sonya Ranson, PhD.
Slide 49
Purpose
- To develop and test a new tool for observing and measuring team performance during trauma resuscitation:
- Trauma Team Performance Observation Tool (TPOT).
- To conduct an investigation of the impact of TeamSTEPPS on:
- Trauma team performance.
- Patient outcomes.
Slide 50
Background
- Within the OR and trauma room, there has been a growing body of evidence validating the importance of teamwork.
- Christian et al. (2006).
- A prospective study of ten general surgery cases.
- Problems in communication, managing workload, and prioritizing competing tasks within the surgical team were found in all ten cases.
- Few team performance evaluation tools, particularly in the areas of surgery and trauma.
Slide 51
Methods
- Conducted interviews:
- 31 trauma team members (e.g., physicians, nurses, residents).
- Two different organizations.
- Goals of the interviews:
- To identify the steps in trauma resuscitation.
- To identify technical and team requirements that comprise each step.
- Similar to a mini FMEA.
Slide 52
Phase | Technical | Teamwork |
---|---|---|
Transport | 1. EMS/rescue team or trauma team brings patient to resuscitation/trauma area 2. EMS/rescue team and/or trauma team continues ABCs: assesses patient airway, breathing, circulation, disability, and exposure/environment 3. EMS/rescue team conducts verbal handoff of patient information to trauma team |
(Communication) 1. Team members are quiet while EMS/rescue team gives report |
Primary Survey | 1. Perform ABCs; this should be completed in 60-90 seconds: A: Airway (secure airway; identify problems; initiate timely intervention) B: Breathing (assess lung sounds; identify problems; initiate timely intervention) [....and much more....] |
(Leadership) 1. Team leader continually communicates and advocates the plan of care (Situation Monitoring) 1. Team members know their role and responsibilities 2. Team member(s) prepare(s) patient and/or equipment |
Slide 53
Methods
- An initial pool of items was developed for the TPOT through an extensive item writing process.
- Items were linked to the following four team constructs:
- Leadership.
- Situation monitoring.
- Mutual support.
- Communication.
Slide 54
Phase | Technical | Teamwork |
---|---|---|
Transport | 1. EMS/rescue team or trauma team brings patient to resuscitation/trauma area 2. EMS/rescue team and/or trauma team continues ABCs: assesses patient airway, breathing, circulation, disability, and exposure/environment 3. EMS/rescue team conducts verbal handoff of patient information to trauma team |
(Communication) 1. Team members are quiet while EMS/rescue team gives report |
Primary Survey | 1. Perform ABCs; this should be completed in 60-90 seconds: A: Airway (secure airway; identify problems; initiate timely intervention) B: Breathing (assess lung sounds; identify problems; initiate timely intervention) [....and much more....] |
(Leadership) 1. Team leader continually communicates and advocates the plan of care (Situation Monitoring) 1. Team members know their role and responsibilities 2. Team member(s) prepare(s) patient and/or equipment |
Slide 55
Leadership—The team leader…..
- Conducts a brief prior to patient arrival (e.g., identifies self, assigns members roles and responsibilities, discusses initial plan based on current information, anticipates interventions [e.g., chest tube, OR]).
- Continually communicates the plan of care to the team.
- Feedback provided to team members is constructive.
- Ensures task prioritization (e.g., important tasks performed first, ABC's and survey sequence are being completed).
- Asks non-responding team members to leave when they are distracting.
Slide 56
Situation Monitoring—Team members….
- Prepare equipment before patient arrival (e.g., set up IV, ultrasound machine, suction).
- Work quickly and efficiently.
- Conduct tasks in right order.
- Are not distracted by major injuries.
- Ensure that NEW team members perform expected role and responsibilities.
- Adapt quickly and efficiently to deterioration of patient's condition (e.g., decreased O2 sats, decreased blood pressure, decreased mental status).
Slide 57
Mutual Support—Team members….
- Feedback provided to other team members is constructive.
- Assist when moving patient to next unit (e.g., CT scanner, OR, ICU).
- Provide assistance when needed/Complete other team members' tasks.
- Identify/Call out when patient safety issue is suspected.
Slide 58
Communication—Team members….
- Remain quiet while team gives report.
- Request additional information from EMS (e.g., medications given, vital signs, mechanism of injury).
- Use call-outs to share important patient information (e.g., Team leader: "Airway status?" Airway doc responds: "Airway clear!").
- Use check-backs to verify important information is exchanged (e.g., Doctor: "Give 25 mg Benadryl IV." Nurse confirms: "25 mg Benadryl IV." Doctor: "That's correct.").
- Use clear and concise language.
- Request information from others when it's not readily shared.
Slide 59
Scoring
Rating Scale | |||||
---|---|---|---|---|---|
1 | 2 | 3 | 4 | 5 | NA |
Very poor | Poor | Average | Good | Excellent | Not applicable |
Very poorly done | Poorly done | Acceptable performance | Good performance | Perfect Performance | Did not need to be done |
Should have been done but was not | Should have been more often | Could have been done more often/consistently but is acceptable as is | Done most of the time | Done at all times appropriately | Was not done and did not need to be done |
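A practical detail of this scale is that NA ("Did not need to be done") items must be excluded, not scored as zero, when item ratings are rolled up into a construct score. As a minimal sketch of that rule (not the study's actual code; the item data are invented):

```python
# Hypothetical roll-up of TPOT-style item ratings into a construct score.
# "NA" (not applicable) items are excluded from the average rather than
# counted as low performance.

NA = None  # stands in for the scale's "Not applicable" rating

def construct_score(ratings):
    """Average the 1-5 item ratings for one construct, skipping NA items."""
    scored = [r for r in ratings if r is not NA]
    if not scored:
        return NA  # every item was NA for this resuscitation
    return sum(scored) / len(scored)

# One observed resuscitation: five leadership items on the 1-5 scale
leadership_items = [4, 3, NA, 5, 4]
print(construct_score(leadership_items))  # 4.0
```

Averaging over only the applicable items keeps a team's score comparable across resuscitations where different items applied.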
Slide 60
Observer Training
- Five staff (two trauma nurses and three trauma registrars) were trained to use the TPOT.
- Training involved four steps:
- Reviewing and discussing the TPOT.
- Practice and feedback rating the videotapes.
- Revising the TPOT items for clarity, as needed.
- Independent evaluation of the videotapes.
Slide 61
Rater Agreement
Scenario | Team | Mean | ICC | Inter-Rater Agreement |
---|---|---|---|---|
1 | 1 | 2.67 | .44 | 65% |
2 | 1 | 1.22 | .64 | 82% |
1 | 2 | 2.98 | .47 | 69% |
2 | 2 | 1.21 | .58 | 86% |
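Both statistics in the table can be computed directly from paired item ratings: percent agreement is the share of items rated identically, and the ICC here is sketched as a one-way random-effects ICC(1,1). This is an illustrative pure-Python sketch with invented ratings, not the study's actual computation:

```python
# Sketch of two inter-rater statistics: exact percent agreement and a
# one-way random-effects ICC(1,1). Rating data below are invented.
from statistics import mean

def percent_agreement(r1, r2):
    """Share of items on which the two raters gave the same rating."""
    same = sum(1 for a, b in zip(r1, r2) if a == b)
    return same / len(r1)

def icc1(ratings):
    """ICC(1,1): ratings is one row per item, one rating per rater."""
    n, k = len(ratings), len(ratings[0])
    grand = mean(x for row in ratings for x in row)
    row_means = [mean(row) for row in ratings]
    # between-item and within-item mean squares from a one-way ANOVA
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(ratings, row_means) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

rater1 = [3, 4, 2, 5, 3, 4]
rater2 = [3, 3, 2, 4, 3, 4]
print(percent_agreement(rater1, rater2))         # 4 of 6 items match
print(round(icc1(list(zip(rater1, rater2))), 2))  # 0.8
```

Note that agreement and ICC can diverge, as in the table: two raters who disagree by one point on many items can still yield a moderate ICC if they rank teams similarly.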
Slide 62
Implementation
- Part of a larger study to assess the impact of TeamSTEPPS.
- Trained raters:
- Observed and rated 33 live trauma resuscitations in the trauma bay over a 3-month period (Pre-Training).
- Observed and rated 40 live trauma resuscitations in the trauma bay over a 3-month period (Post-Training).
Slide 63
Alphas and Inter-correlations
Team Skills | LDR | SM | MS | COM | Mean | Stdev |
---|---|---|---|---|---|---|
Leadership | 2.90 | .68 | ||||
Situation Monitoring | .92 | 3.29 | .62 | |||
Mutual Support | .75 | .75 | 3.88 | .47 | ||
Communication | .85 | .82 | .70 | 2.92 | .58 | |
Alpha | .53 | .57 | .64 | .63 |
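The Alpha row reports Cronbach's alpha, the internal-consistency statistic alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A small sketch with hypothetical item ratings (not the study's data):

```python
# Hedged sketch of Cronbach's alpha for a set of construct items.
# The item ratings below are invented for illustration.
from statistics import pvariance

def cronbach_alpha(items):
    """items: one list of scores per item, all scored over the same teams."""
    k = len(items)
    item_vars = sum(pvariance(scores) for scores in items)
    totals = [sum(team) for team in zip(*items)]  # each team's summed score
    return (k / (k - 1)) * (1 - item_vars / pvariance(totals))

# Three hypothetical leadership items rated across five observed teams
items = [
    [3, 4, 2, 5, 3],
    [3, 4, 3, 5, 2],
    [2, 4, 2, 4, 3],
]
print(round(cronbach_alpha(items), 2))  # 0.91
```

Alphas in the .53 to .64 range, as in the table, suggest the items within each construct only moderately hang together, one reason the authors later discuss measurement challenges.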
Slide 64
Impact Study
- Design:
- Quasi-experiment.
- Pre-test/Post-test, no control group design.
- Interventions:
- Training—Didactic and Simulation.
- 2-hour TeamSTEPPS Essentials; 2 hours simulation (skills practice and feedback):
- Nurses and Doctors separately trained.
- Trauma Room Roles and Responsibilities Policy.
- Briefing, STEP, CUS, Call-Outs and Check-Backs.
- One-day Nursing Crash Course.
Slide 65
Measures (Kirkpatrick)
- Level I Reactions:
- End of Training.
- Level II Learning—Pre/Post Test:
- Knowledge, Attitudes, Skills.
- Level III Transfer—Observed Teamwork (TPOT):
- 3 months pre and post-training.
- Level IV Outcomes:
- Clinical Outcomes (ICU LOS, Hospital LOS, Complication Rate, Mortality).
- Clinical Process (Time to CT Scan, Surgery, Intubation).
Slide 66
Expected Findings
- Reactions—Positive
- Learning:
- Knowledge:
- No change—staff know what to do.
- Attitudes:
- Positive—Believe in teamwork or social desirability.
- Skills:
- Significant improvement in simulator.
- Transfer:
- Not sure about trauma bay—many environmental factors.
- Outcomes:
- Unlikely due to base rate issues.
Slide 67
Descriptive Data
Pre-training | Post-training | |
---|---|---|
Observed | 33 | 40 |
Trauma | 176 | 263 |
ISS | M=13.97, SD=11.85 | M=11.63, SD=11.04 |
Slide 68
Level III Transfer
Team Skill | Pre-training (N=33) | Post-training (N=40) | p value |
---|---|---|---|
Leadership | 2.87 | 3.46 | 0.003 |
Situation monitoring | 3.30 | 3.91 | 0.009 |
Mutual support | 3.40 | 3.96 | 0.004 |
Communication | 2.90 | 3.46 | 0.001 |
Total | 3.12 | 3.70 | <0.001 |
Note: Pre-training and post-training reflect observations in trauma bay by 4 trained raters.
Slide 69
Pre-training | Post-training | p value | |||||
---|---|---|---|---|---|---|---|
N | M | SD | N | M | SD | ||
ICU LOS (days) | 73 | 5.50 | 6.37 | 82 | 5.61 | 6.45 | |
Ventilator days | 53 | 4.49 | 5.08 | 55 | 6.42 | 7.27 | |
Hospital LOS (days) | 176 | 7.63 | 13.99 | 263 | 6.25 | 5.81 | |
% without complications | 176 | 70.45 | 263 | 76.80 | |||
% alive at discharge | 176 | 86.93 | 263 | 91.54 | |||
Time to FAST | 123 | 8.3 | 5.7 | 221 | 9.6 | 7.8 | |
Time to CT | 124 | 26.4 | 14.5 | 174 | 22.1 | 11.7 | 0.005 |
Time to ETT | 21 | 10.1 | 6.8 | 22 | 6.6 | 4.2 | 0.049 |
Time to OR | 46 | 130.1 | 82.7 | 47 | 94.5 | 63.8 | 0.021 |
Time in ED | 176 | 186.1 | 151.0 | 263 | 187.4 | 159.3 |
Slide 70
Challenges
- Measurement:
- How best to train observers?
- Moderate reliability.
- Capella & Baker, ACOS Grant to study different training strategies.
- Available tools:
- Had to develop tools, few available, few validated.
Slide 71
Thank You!
For more information, please contact our team at:
dbaker@impaqint.com