Patient Safety Training Evaluations: Reflections on Level 4 and more

Slide presentation from the AHRQ 2009 conference.

On September 15, 2009, Eduardo Salas, Ph.D., made this presentation at the 2009 Annual Conference.


Slide 1

Patient Safety Training Evaluations: Reflections on Level 4 and more.

Eduardo Salas, Ph.D. 
Department of Psychology &
Institute of Simulation & Training
University of Central Florida

esalas@ist.ucf.edu

 

Slide 2

Purpose Today.

  1. Challenge
    • Offer some observations & myths
  2. Proposal
    • Time to think differently
  3. Guide
    • Best Practices

 

Slide 3

A few thoughts about the science of training.

 

Slide 4

What do we know about training?

  • The science has evolved & matured.
  • The past decade: an explosion of research!
    • More empirical work
    • Research conducted in organizations
    • New, more & deeper theories and models
    • More evaluations reported
  • Huge military investment.
  • Influence of cognitive psychology.
    • Expertise

 

Slide 5

What do we know about training?

  • Much progress in:
    • Organizational needs analysis
    • Cognitive task analysis
    • Transfer of training
    • Instructional design
    • Feedback
    • Training evaluation
    • Simulation-based training
    • Individual characteristics

 

Slide 6

Observations From the Science

  • The quality and quantity of research have increased
  • Cognitive and organizational concepts are revolutionizing the field
  • The field is multi-disciplinary
  • The influence of technology will continue
  • Training is part of an organizational system
  • There are more guidelines, tools and approaches for practitioners

 

Slide 7

Framework for Training Effectiveness

 

Slide 8

Myths & misconceptions about training.

 

Slide 9

The Simplistic View of Training

  1. Unskilled Worker
  2. Training Program
  3. Skilled Worker
  • Uninformed About the Science
  • Erroneous Assumptions

 

Slide 10

Myth

Everyone Who Has Ever Learned Anything is a Training Expert
 

 

Slide 11

Reality

  • Opinions aside, training is a behavioral/cognitive event that can be subjected to empirical investigation.
  • There is a science of training that should be exploited to optimize training design.
  • Processes exist which, if appropriately and consistently applied, can help to ensure that effective training is designed.

 

Slide 12

Myth

Task Experts Can Articulate Training Needs
 

 

Slide 13

Reality

  • Experts do not have access to their own expertise.
    • Knowledge becomes "compiled"
  • Task experts do not necessarily understand the learning process or how learning progresses.
  • Task experts are crucial, but they must be paired with learning experts.
    • Partnership

Slide 14

Myth

Reactions to training = Learning

 

 

Slide 15

Reality

  • Just because trainees are having fun doesn't mean they are learning anything.
    • Very little or no relationship
  • "Instrumentality" does seem to be a factor.
    • Does seem to be related to learning
    • Affects motivation to learn
  • Simple measures of training outcomes are insufficient to judge training quality.

 

Slide 16

Myth

Learning will translate into Behavior change
 

 

Slide 17

Reality


  • Training transfer is a very complex phenomenon.
  • Some of the factors:
    • Supervisor & peer support
    • Climate for Transfer
    • Opportunity to perform/practice
  • Even when trainees demonstrate learning after training, it does not mean that they can or will transfer it back to the job.

 

Slide 18

Thinking Differently about Training Evaluation.

 

Slide 19

Kirkpatrick's Model of Training Evaluation

  • Level 5 - Return on Investment
    Was the training worth the cost?
  • Level 4 - Results
    Did the change in behavior positively affect the organization?
  • Level 3 - Behavior / Training Transfer
    Did the participants change their behavior on the job based on what they learned?
  • Level 2 - Learning
    What skills, knowledge, or attitudes changed after training? By how much?
  • Level 1 - Reaction
    Did the participants like the training? What do they plan to do with what they learned?

Slide 20

This Model.

  • Has served us well!
  • Used, misused & abused!
  • Created a misconception that Level 1 is all one needs
  • Oversimplified evaluations
  • Links among levels are weak
  • Minimal impact of training on Level 4
    • Clinical outcomes

 

Slide 21

So.

What if we reverse Kirkpatrick's model?

 

Slide 22

Start at Level 4.

What are the outcomes/results we want out of this training?

 

Slide 23

Level 3: Given these desired outcomes.

What behaviors do we want/need from our trainees?

 

Slide 24

Level 2: Given these needed behaviors.

What KSAs do we want our trainees to have?

 

Slide 25

Level 1: Given those KSAs.

What reactions do we want our trainees to have?

 

Slide 26

What do you get by reversing Kirkpatrick's typology?

  • Precise learning outcomes
  • Better links among Levels
  • Better link of training to outcomes
    • Clinically-relevant
  • Hints for performance assessment/observation
  • Tailor training program better
  • Better accountability
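
The reversed planning flow above can also be written down as a simple structure so that each level stays traceable to the result it serves. Below is a minimal sketch in Python, assuming a hypothetical team training for patient hand-offs; the class, field names, and example values are illustrative, not from the slides.

  from dataclasses import dataclass
  from typing import List

  @dataclass
  class ReversedKirkpatrickPlan:
      """Evaluation planned backwards, from Level 4 results down to Level 1 reactions."""
      level4_results: List[str]    # organizational/clinical outcomes the training should produce
      level3_behaviors: List[str]  # on-the-job behaviors that drive those outcomes
      level2_ksas: List[str]       # knowledge, skills, & attitudes behind those behaviors
      level1_reactions: List[str]  # trainee reactions that support learning those KSAs

  # Hypothetical example (illustrative only): team training for patient hand-offs.
  plan = ReversedKirkpatrickPlan(
      level4_results=["Fewer hand-off-related adverse events"],
      level3_behaviors=["Uses a structured hand-off protocol on every patient transfer"],
      level2_ksas=["Knows the protocol steps", "Values closed-loop communication"],
      level1_reactions=["Sees the training as directly useful on the unit"],
  )

  # Planning and reporting proceed top-down (Level 4 -> 1), the reversal proposed above.
  for level, items in [
      ("Level 4 (results)", plan.level4_results),
      ("Level 3 (behavior)", plan.level3_behaviors),
      ("Level 2 (learning)", plan.level2_ksas),
      ("Level 1 (reaction)", plan.level1_reactions),
  ]:
      print(f"{level}: " + "; ".join(items))

Writing the plan this way keeps every measure tied to the clinically relevant outcome it is meant to support, which is the accountability benefit listed above.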

 

Slide 27

Best Practices for Training Evaluation in Healthcare & Aviation.

 

Slide 28

Best Practices

  1. Even before designing your training, start backwards: Think about evaluation first.
  2. Accept that effective training does not exist without effective evaluation.
  3. Strive for a robust experimental design in your evaluation: It is worth the headache.
  4. When designing your evaluation plan and metrics, ask the experts: your frontline staff.
  5. Do not reinvent the wheel; leverage existing data relevant to your training objectives.

 

Slide 29

Best Practices (cont)

  1. When developing measures: Consider multiple aspects of performance.
  2. When developing measures: Design for variance.
  3. Evaluation is affected by more than just training itself: Consider organizational, team, or other factors that may help (or hinder) the effects of training (and thus the outcome of your evaluation).
  4. Engage socially powerful players early:
    Buy-in from physicians, nursing, & executive management is crucial to evaluation success.

 

Slide 30

Best Practices (cont)

  1. Ensure evaluation continuity: Have a plan for employee turnover at both the participant & evaluation administration team level.
  2. Environmental signals before, during, and after training must indicate that the trained KSAs & the evaluation itself are valued by the organization.
  3. Get in the game, coach! Feed evaluation results back to frontline providers & facilitate continual improvement through constructive coaching.
  4. Report evaluation results in a meaningful way.

 

Slide 31

Conclusions

  • Avoid Myths!
  • Training Evaluation matters!
  • Reverse Kirkpatrick's typology!

Current as of December 2009

Internet Citation: Patient Safety Training Evaluations: Reflections on Level 4 and more. December 2009. Agency for Healthcare Research and Quality, Rockville, MD. http://www.ahrq.gov/news/events/conference/2009/salas/index.html