Facilitator's Notes (continued)

Training Guide: Using Simulation in TeamSTEPPS Training

Materials | Activities | Times
IV. Phase 2—Development of Performance Measures 
PPT slide 21

Welcome participants back from break and show PPT slide #21, Phase 2. Tell them that you will now move them into Phase 2, in which they will learn to develop performance measures. Explain that the preceding steps they performed in Phase 1, Scenario Development, particularly the targeted response, lead directly to the process of generating event-based diagnostic measurement tools. These can take the form of simple checklists of observable behaviors, frequency counts, and rating scales, among other measures.

(2 min.)

Explain that performance measures should diagnose both individual and team performance and provide information about why things did or did not go well. Performance measures need to accurately assess performance within a scenario to determine if the objectives of training were achieved, to determine the degree to which training participants are learning, and to identify areas for improvement. Performance measurement can help you to (1) evaluate and continually improve your simulation-based training scenarios and (2) scaffold the process of providing feedback in team debriefs. Tell them that there are four critical actions involved in developing measures and that you will elaborate on each of these:
(1) Considering the level of analysis, (2) Clarifying the purpose of the assessment, (3) Deciding what to measure, and (4) Selecting a measure.

(3 min.)
PPT slide 22

Show PPT slide #22, Team Performance Observation Tool, and tell them that this is an example of an observation tool. Using this tool, observer(s) can rate individual and team behaviors under each skill area of leadership, situation monitoring, mutual support, and communication. Tell them that they may also design their own performance measurement tools.

(3 min.)
PPT slide 23

Show PPT slide #23, Consider Level of Analysis. This slide provides examples of individual positions and of teams whose performance is to be measured. Explain that whether they select an existing performance measurement tool or develop their own, they need to consider the individual personnel and teams for which the measure is intended.

(2 min.)
PPT slide 24

Show PPT slide #24, Clarify the Purpose. Measurement is the only means of determining actual changes in behavior resulting from simulation-based learning. Simulations are only effective if you can follow up with participants by measuring behaviors that need to change. Measuring these behaviors starts with identifying the purpose. With simulation, measurement should be used for two purposes:

  1. Identifying deficiencies in performance.
  2. Assessing whether performance has improved following learning.
Tell participants that before they determine the measures they will use, they need to clarify the purpose of the assessment. They will need to diagnose root causes of performance deficiencies and identify specific weaknesses to be addressed. They will need to relay information concerning strengths and weaknesses in the form of a remediation plan. They will need to determine if the purpose is to evaluate the level of proficiency or readiness.
(2 min.)
PPT slide 25

Show PPT slide #25, Decide What to Measure. Tell participants that in developing a performance tool they must decide whether to measure performance outcomes, processes, or both. Ideally, they should measure both. Outcomes generally are more quantifiable than processes. They are the results of an individual's or team's performance and answer the questions “What happened?” and “What was the end result?” Process, on the other hand, encompasses the steps, strategies, or procedures used to accomplish a task and answers the questions “How well did the team communicate with one another?” and “Why did it happen?”

(3 min.)
PPT slide 26

Show PPT slide #26, Outcomes, and tell participants that outcomes are sometimes referred to as measures of effectiveness (MOEs). They provide an indication of the extent to which the outcome of the task was successful and are important for most measurement purposes.
Typical outcomes that we can assess include the following:
  1. Accuracy—Precision of task performance (e.g., was the correct medication and correct dosage administered to the patient).
  2. Timeliness—Length of time it took to perform a task (e.g., how much time elapsed from the onset of an event to the team's response).
  3. Productivity—Volume (e.g., volume of patients in emergency department).
  4. Efficiency—Rate of resources required versus those used (e.g., operating room supplies).
(5 min.)
PPT slide 27

Show PPT slide #27, Process, and tell participants that processes are sometimes referred to as measures of performance (MOPs). They provide an indication of why certain outcomes were obtained. They typically measure the question “Was the decision made right?” (i.e., in the right way) versus “Was the right decision made?” Process measures are extremely important in diagnosing the root causes of performance deficiencies. They also are critical in providing feedback and designing additional training.

Explain that there are two types of processes: Taskwork and teamwork. Taskwork refers to the specific steps and strategies associated with a particular job. This includes team members' responsibility to understand the nature of the task, how to interact with equipment, and how to follow proper policies and procedures. Taskwork activities have been referred to as operational or technical skills. They are representative of “what” teams do. They can be procedural or non-procedural. For example, in the context of an emergency department, taskwork can refer to the act of intubating a patient experiencing trauma.

Teamwork refers to the steps or procedures that team members use to coordinate their actions or tasks. They describe “how” teams accomplish their work, such as how priorities are established and how team members monitor each other's performance to ensure that tasks are accomplished correctly. For example, in the context of an emergency department, teamwork can refer to the communication between providers during the act of intubating a patient experiencing trauma. In other words, if a team huddles up before treating a trauma patient, that is an example of teamwork.

(5 min.)
PPT slide 28

Show PPT slide #28, Measurement Tips. Review the tips on this slide with participants. Explain that they should strive to assess both processes and outcomes when they are diagnosing performance deficiencies or providing feedback to teams. They also should consider measuring at multiple levels to help them identify the weak link(s). They should also provide training participants multiple opportunities for practice, such as performing the same skills or task repeatedly throughout the course of an exercise.

(2 min.)
PPT slide 29

Show PPT slide #29, Select a Measure. Reiterate that performance measures can take the form of checklists of observable behaviors, frequency counts, or rating scales, as described below:

  1. Behavioral checklists consist of items or actions that have dichotomous answers, such as Yes/No, Right/Wrong, Performed/Not Performed. An example of this is the Primary Survey ABCs.
  2. Frequency counts provide an indication of the number of times that a specific behavior, action, or error occurs. An example of this is the use of CUS, SBAR, two-challenge. One way to achieve frequency counts is via videotaped performances.
  3. Rating scales consist of a numerical or descriptive judgment of how well a task was performed.
(5 min.)
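For facilitators who want to tabulate results with a spreadsheet or a short script, the first two measure types can be represented as simple data structures. The sketch below is illustrative only; the item wording, behavior labels, and scoring rules are hypothetical and are not taken from the TeamSTEPPS materials.

```python
# Illustrative sketch only: item text and scoring rules are hypothetical,
# not part of the TeamSTEPPS materials.

def checklist_score(items):
    """Score a dichotomous (Performed/Not Performed) behavioral checklist.

    items: dict mapping item text -> True (performed) or False (not performed).
    Returns the proportion of items performed.
    """
    return sum(items.values()) / len(items)

def frequency_count(observations, behavior):
    """Count how often a specific behavior was logged during the scenario."""
    return sum(1 for b in observations if b == behavior)

# Hypothetical checklist for one targeted response
abc_checklist = {
    "Airway assessed": True,
    "Breathing assessed": True,
    "Circulation assessed": False,
}

# Hypothetical event log from a videotaped performance
log = ["SBAR", "check-back", "SBAR", "CUS"]

print(checklist_score(abc_checklist))  # 2 of 3 items performed
print(frequency_count(log, "SBAR"))    # SBAR observed twice
```

Rating scales, in contrast, record a graded judgment per item rather than a yes/no answer, so they are not reducible to a simple proportion in this way.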
PPT slide 30

Show PPT slide #30, Checklist Tips, which provides suggestions for successful use of checklists. For example:
  1. Checklists are best used with scripted (rather than “free play”) scenarios in which the observer looks for specific behaviors that are expected given the trigger and distracters that were written into the scenario. Because the checklists are event-based and the scenarios are scripted in advance, the rater knows when each event will occur and can direct his or her attention to the targeted responses.
  2. Checklist items should be related to triggers that are embedded in the scenario.
  3. Each checklist item should represent a single action taken by the individual or team.
  4. The response category used on the checklist should be labeled.
(3 min.)
PPT slide 31

Show PPT slide #31, Checklist, as an example.
PPT slide 32

Show PPT slide #32, Frequency Count Tips, which provides suggestions for successful use of frequency count measures. For example:
  1. Frequency counts are more effective when measuring overt actions or errors (acts of commission) than when measuring failure to demonstrate specific behaviors (acts of omission).
  2. Frequency counts are most effective for determining how often a specific action is taken or a specific task is performed.
  3. Frequency counts can be recorded during a critical event in an exercise or throughout the entire scenario.
(3 min.)
PPT slide 33

Show PPT slide #33, Frequency Count, as an example.
PPT slide 34

Show PPT slide #34, Graphic Rating Scale, which is designed to capture a range of behavior from “ineffective” to “highly effective” or from “strongly disagree” to “strongly agree.” On such a scale, the rater marks the degree to which he or she observes the targeted behavior. This type of rating scale allows for some rater variation in opinion. Note that it may require additional training to ensure inter-rater reliability.

(5 min.)
PPT slide 35

Show them PPT slide #35, Anchored Rating Scale, and tell them that, in contrast to the Graphic Rating Scale, the Behaviorally Anchored Rating Scale (Smith & Kendall, 1963) is designed so that each point on the rating scale is explicitly linked to a specific behavior, leaving less opportunity for rater variance. Be sure to emphasize that both types of rating scale are effective and may be used to measure individual and team performance.

(5 min.)
PPT slide 36

Show PPT slide #36, Rating Scales, and emphasize that rating scales are best used to assess quality when expected behavior does not equate to a sum or quantity. They are best suited to tasks that are less procedural in nature and are particularly effective for assessing performance on a continuum, as shown in the previous examples. However, explain that if the rating scale is designed to rate the quality of a response (and not just whether or not the response occurred), additional training will be required to ensure inter-rater reliability and that the ratings do not drift over time.

(2 min.)
PPT slide 37

Show PPT slide #37, Measures, and tell participants that it's now their turn to develop a measure. In the same groups in which they developed scenarios with event sets and targeted responses, the groups are to develop a measure (using a checklist, a frequency count, or a rating scale) to assess individual and/or team performance for their scenario.

Allow participants 25 minutes for this exercise. Tell them to be prepared to make a 3-minute presentation in which their group will share their results with the total group.

(25 min.)

After about 25 minutes (longer, if participants need more time and your schedule allows), ask each group to report to the total group the measure they developed and the reasons they selected the type of measure they did for their specific scenario. As a group completes its report, suggest that others in the room ask questions about the measure, such as “Could you use a checklist equally effectively for this scenario?” or “What methods would you use to obtain the frequency count?” After all groups report, give the groups about 5 minutes to make revisions to their performance measures.

In summary, tell participants that the combination of focusing on the presence or absence of key behaviors and controlling events that occur during the simulation allows for valid and reliable measurement of complex performance.

Note: Keep in mind that the timing listed here for each part of the training is only a suggestion. You will need to modify the timing for exercises depending on the size of the group for which you are providing training as well as for the content of the training.

(15 min.)
Break (15 min.)
V. Phase 3—Debriefing (45 min.)
PPT slide 38

Welcome participants back from break and show PPT slide #38, Phase 3. Explain that this phase concerns the debriefing process, which is critical to maximizing participants' learning. This phase consists of four steps, which you will cover in depth: (1) Introducing the debrief process, (2) Describing what happened, (3) Conducting an analysis of performance, and (4) Identifying lessons learned. Not all team debriefs will flow in this exact order, but you can use this framework to help guide the discussion.

Tell them that in introducing the debrief process to their participants, they should make clear that the purpose of the debriefing is to help participants understand the complex team skills and knowledge required for quality patient care. They need to explain that the debriefing will be team centered and that all team members are expected to participate. They also need to explain the format that the debriefing will take. To describe what happened during the simulation, they can show a videotape, provide a description, or ask team members to objectively describe the outcome without making judgments of one another's behavior.

(3 min.)
PPT slide 39

Show PPT slide #39, Description Phase. Explain that during this phase, each team member objectively shares his or her perspectives of events that unfolded during the scenario. When facilitated skillfully, the description phase helps team members gain insight into each other's perspectives, helps the team to reach consensus on what happened during the scenario, and helps ensure that everyone takes away similar lessons from the experience. Tell participants that measurement can help during this phase because it describes human behavior in concrete terms, thereby providing a structure for understanding the scenario. During this phase, be sure to focus the discussion on critical aspects of performance related to the learning objectives.

(2 min.)
PPT slide 40

Show PPT slide #40, Analysis Phase. Explain that this phase entails a systematic investigation of why things happened in the scenario. In this phase, the team focuses on what went well and what could have been done better. Tell them that they should encourage team members to address each other directly. They should ask team members to discuss how they were affected by each other's actions. And they should encourage team members to discuss what they were each thinking.

Tell them that measurement can shed some light on performance if they compare the team's performance with standards of performance for the tasks in the scenario. They can review the TeamSTEPPS behaviors associated with the tasks and determine whether the team used these behaviors when necessary. If so, they can identify whether the behaviors were performed correctly or if they could be improved.

Share the following tips to use when they serve as facilitators for debriefings of teams that take their training courses:

  • Give your analysis and evaluation only after the team being debriefed has completed its analysis and evaluation.
  • Ask team members to talk about what went well, what could be improved, and how it could be improved.
  • Push the team to go beyond just describing what happened. Ask follow-up questions that require in-depth analysis, such as asking team members to analyze why they made the decisions they made.
  • Encourage team members to discuss what they were thinking during the simulation.
  • Encourage team members to discuss the factors that enabled or impeded the team's success.
(5 min.)
PPT slide 41

Show PPT slide #41, Application/Generalization Phase. Tell participants that this phase helps the team look beyond the simulation and apply or generalize what they learned to their daily practice. Ask team members to discuss specific ways they can apply what they learned in the simulation exercise to their care of patients. It is helpful in this debrief phase to generate a list of lessons learned. In other words, team members can discuss what went wrong and how it can be corrected.

Explain that measurement can shed some light on the team's performance if they use explicit event sets to draw parallels between the scenario and the actual clinical environment. Explicit measures associated with these events can help promote team members' reflections about how to transfer what went well to the actual clinical environment.
(3 min.)
PPT slide 42

Show PPT slide #42, Tip for Success 1. Remind participants to keep things simple. People can integrate only a few key learning points from a scenario. Observers have a limited attention span and often have to multitask, so don't ask too much of them. A good rule of thumb is to include a key event every 1 to 2 minutes of scenario time.

(2 min.)
PPT slide 43

Show PPT slide #43, Tip for Success 2. Remind participants that it's not enough to tell someone that they did well. They need diagnostic feedback that is specific, behaviorally focused, and descriptive. For example:

  • “The S and B of that SBAR were great. I didn't notice you making a recommendation or request, though. What did you want the team to do?”
  • Using the same SBAR example, ask the team to discuss what they thought about the A and R. For example, “I heard a clear S and B in that SBAR. Did anyone hear the A and R? Did you know what the assessment was or what you were supposed to do?”
  • “When you did that check-back, repeating the order, it was really clear that there was a mistake and they corrected it before it did any harm. That was a good catch.”
  • “You shared the care plan with the team, but no one had a chance to speak up or ask questions.”
(3 min.)
PPT slide 44

Show PPT slide #44, Tip for Success 3. Remind participants that they will need to train observers so that measurement is systematic, reliable, and valid. In other words, the same ratings should be obtained no matter who observes and no matter how many times the selected measurement is used. They will need to ensure that everyone has common expectations about performance, and they will need to develop and use a scoring guide with rubrics that are understandable to all raters/observers.

(3 min.)
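The "same ratings no matter who observes" criterion can be checked numerically after observer training. A minimal sketch, assuming two raters have scored the same set of dichotomous checklist items; simple percent agreement is used here for clarity, though chance-corrected statistics such as Cohen's kappa are also widely used for this purpose:

```python
def percent_agreement(rater_a, rater_b):
    """Proportion of items on which two raters gave the same score.

    rater_a, rater_b: equal-length lists of scores for the same items
    (hypothetical data; any scoring scale works as long as both raters
    used the same one).
    """
    if len(rater_a) != len(rater_b):
        raise ValueError("Raters must score the same set of items")
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

# Hypothetical Yes(1)/No(0) ratings from two trained observers on 5 items
a = [1, 1, 0, 1, 0]
b = [1, 0, 0, 1, 0]
print(percent_agreement(a, b))  # 0.8 (agreement on 4 of 5 items)
```

A low agreement score signals that the observers do not yet share common expectations about performance and that the scoring guide or rater training needs revision.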
PPT slide 45

PPT slide #45 presents a sample Rater Scoring Guide. Review this with participants.
PPT slide 46

Show PPT slide #46, Tip for Success 4. Remind participants that they need to keep teamwork and clinical skills separate and not overcomplicate the clinical nature of the scenario when their main purpose is to train teamwork. To do this, they can use dual debriefs. In other words, they can provide feedback on teamwork skills as a team, and they can address major clinical deficiencies through individual follow-up sessions. Participants who are teamwork novices should be allowed to focus on teamwork, not on complex clinical issues, in the scenario. Remind them that, as team members become more sophisticated in their teamwork skills, more complex clinical scenarios can be used.

(2 min.)
PPT slide 47

Show PPT slide #47, Tip for Success 5. Remind participants that event-based methods involve more than just measurement. Event-based methods involve good training design practices, good scenario design practices, and good debrief facilitation practices.

(2 min.)
PPT slide 48

Show PPT slide #48, Teamwork Actions. Review with participants the content of the training they have just completed. The following are things that facilitators of simulation training should be able to do:

  • Apply the event-based approach to training.
  • Develop TeamSTEPPS training scenarios.
  • Develop TeamSTEPPS performance measures.
  • Conduct effective debriefs of team performance.
 
Flipchart page marked “Parking Lot Issues”

Now collect the flipchart page marked “Parking Lot Issues” that you posted at the beginning of the course. Review Post-It Notes on this page to determine if the questions have been answered during the training. Provide answers to unanswered questions or, if the questions need to be referred to others or if they need research, give participants an approximate date by which they can expect to receive either the answers or referrals to other information sources. Also ask if participants have any questions, items, or issues that still need to be clarified.

(20 min.)
Flipchart page of pluses and deltas, + and Δ

Now tell participants that you would like to gauge the group's reactions to the training they just completed. Use the flipchart page that you pre-labeled “Participant Feedback” that has two columns, one with a plus sign [+] and one with a delta [Δ]. Ask them to call out those things that they liked about the day's training. Accept all comments and write them under the [+] column. When there are no more responses, ask them to identify those things that they felt could have been improved about the day's training. Again, accept all comments and write them under the [Δ] column. Tell them that you appreciate and take their comments seriously and, to the extent possible, you will attempt to address those items in the [Δ] column that are under your control in future training courses.

Ask them to complete the formal evaluation form (see the sample Course Evaluation Form under Tab B, Appendix A, of the TeamSTEPPS Instructor Guide, which you can adapt as needed). Thank them for their participation and enthusiasm and tell them that you look forward to seeing them at future training courses.

 


Page last reviewed November 2008
Internet Citation: Facilitator's Notes (continued): Training Guide: Using Simulation in TeamSTEPPS Training. November 2008. Agency for Healthcare Research and Quality, Rockville, MD. http://www.ahrq.gov/professionals/education/curriculum-tools/teamstepps/simulation/traininggd1.html