Advanced Methods in Delivery System Research - Planning, Executing, Analyzing, and Reporting Research on Delivery System Improvement

Webinar #4: Formative Evaluation (Slide Presentation)

This slide presentation was delivered at a webinar on July 15, 2013.

Presenter: Jeffrey Smith, PhD(c)
Discussant: Cheryl McDonnell, PhD
Moderator: Michael I. Harrison, PhD

Sponsored by AHRQ's Delivery System Initiative in partnership with the AHRQ PCMH program.

Slide 1

Advanced Methods in Delivery System Research - Planning, Executing, Analyzing, and Reporting Research on Delivery System Improvement

Webinar #4: Formative Evaluation

Presenter: Jeffrey Smith, PhD(c)
Discussant: Cheryl McDonnell, PhD
Moderator: Michael I. Harrison, PhD

Sponsored by AHRQ's Delivery System Initiative in partnership with the AHRQ PCMH program.

July 15, 2013

Slide 2

Speaker Introductions

Jeffrey Smith, PhD candidate, is the Implementation Research Coordinator (IRC) for the Department of Veterans Affairs’ Mental Health Quality Enhancement Research Initiative (QUERI). Jeff’s presentation today will draw on his paper with Kristin Geonnotti, Deborah Peikes, and Winnie Wang on formative evaluation. This AHRQ PCMH Research Methods Brief is posted on the AHRQ PCMH website.

Cheryl McDonnell, PhD is an experimental psychologist with over 30 years’ experience in evaluation and management of large-scale public health projects. Her presentation today draws on her work with an AHRQ grant entitled Accelerating Implementation of Comparative Effectiveness Findings on Clinical and Delivery System Interventions by Leveraging AHRQ Networks.

Image of Jeffrey Smith on the upper left side of the slide.

Slide 3

Formative Evaluation in Implementation Research: An Overview

Jeffrey L. Smith, PhD(c)
Implementation Research Coordinator
VA Mental Health QUERI

Email: Jeffrey.Smith6@va.gov

Slide 4

Objectives

  • Describe goals of evaluation in implementation science. 
  • Offer perspectives on what constitutes ‘successful implementation’. 
  • Describe the four stages of formative evaluation.

Slide 5

Goals of Evaluation in Implementation Science

  • Conduct formative evaluation:
    • Rigorous assessment process designed to identify potential and actual influences on the progress and effectiveness of implementation efforts (Stetler et al, JGIM 2006; 21(Suppl 2):S1-8.)
  • Conduct summative evaluation:
    • Systematic process of collecting and analyzing data on impacts, outputs, products, outcomes and costs in an implementation study.
  • Evaluate usefulness of selected theory, in terms of:
    • Planning implementation strategy.
    • Identifying unanticipated elements critical to successful implementation but unexplained by selected theory.
    • Helping to understand findings and relationships between domains or constructs.

Slide 6

What is Successful Implementation?

  • Implementation plan and its realization.
  • Evidence-based practice (EBP) innovation uptake:
    • I.e., clinical interventions and/or delivery system interventions.
  • Patient and organizational outcomes achievement.

Slide 7

Does the concept of implementation success apply to implementation strategy as well as to the innovation?

Image of flowchart showing the measures of implementation success in relation to the intervention, based on implementation strategies and clinical innovation. Measures include process outcomes and health outcomes.

Adapted from: Lukas CV, Hall C. Challenges in Measuring Implementation Success. 3rd Annual NIH Conference on the Science of Implementation and Dissemination:  Methods and Measurement. March 15-16, 2010. Bethesda, MD.

Slide 8

Four Stages of Formative Evaluation (FE)

  • Developmental
  • Implementation-Focused
  • Progress-Focused
  • Interpretive

Slide 9

Developmental Formative Evaluation

  • Aka “needs assessment”, “organizational diagnosis”.
  • Involves data collection on…
    • Actual degree of less-than-best practice (need for improvement).
    • Determinants of current practice (including context).
    • Potential barriers / facilitators to practice change.
    • Feasibility of (initial) implementation strategy.
  • Goals:
    • Identify determinants and potential problems, and address them in the implementation strategy; refine strategy as needed.
    • Avoid negative unintended consequences.
    • Engage stakeholders in defining the problem and potential solutions.

Slide 10

Implementation-Focused Formative Evaluation

  • Occurs during implementation of project plan.
  • Focuses on assessing discrepancies between implementation plan and execution.
  • Enables researchers to…
    • Ensure fidelity (both to implementation strategy and clinical intervention).
    • Understand nature and implications of local adaptation.
    • Identify barriers.
    • Identify new intervention components or refine original strategy to optimize potential for success.
    • Identify critical details necessary to replicate implementation strategy in other settings.

Slide 11

Progress-Focused Formative Evaluation

  • Occurs during implementation of project plan.
  • Focuses on monitoring indicators of progress toward implementation or clinical quality improvement (QI) goals:
    • Audit/feedback of clinical performance data.
    • Progress in relation to pre-determined timelines for implementing intervention components.
  • Used to inform need to modify or refine original strategy.
  • May also be used as positive reinforcement for high performing sites; negative reinforcement for low performers.

Slide 12

Interpretive Evaluation

  • Uses data from other stages and data collected from stakeholders at end of project.
  • Obtain stakeholder views on:
    • Usefulness or value of intervention.
    • Barriers and facilitators to implementation success or failure.
    • Satisfaction with implementation strategy.
    • Recommendations for refinements to implementation strategy.
  • Can provide working hypotheses on implementation success / failure.

Slide 13

Formative Evaluation Assessment Methods/Tools

  • Quantitative:
    • Structured surveys / tools:
      • Instruments assessing context (e.g., organizational culture, readiness to change), provider receptivity to evidence-based practices.
      • Intervention fidelity measures.
    • Audit / feedback of clinical performance data.
  • Qualitative:
    • Semi-structured interviews w/ clinical stakeholders (pre-/post-).
    • Focus groups.
    • Direct (non-participant) observation of clinical structure and processes in site visits.
    • Document review.
  • Mixed Methods (i.e., Quantitative + Qualitative)

Slide 14

Stages of Formative Evaluation

Flowchart showing the stages:

Pre-Implementation

Developmental

  • Identify determinants of current practice.
  • Identify barriers and facilitators.
  • Assess feasibility of proposed intervention.
  • Integrate findings into intervention design and refinement prior to implementation.

Implementation

Implementation-Focused

  • Assess discrepancies between implementation plan and execution, exploring issues of fidelity, intensity, and exposure.
  • Understand and document nature and implications of local adaptation.

Progress-Focused

  • Monitor impacts and indicators of progress toward project goals.
  • Use data to inform need for modifying original strategy.
  • Provide positive reinforcement to high performers; negative reinforcement to low performers.

Post-Implementation

Interpretive

  • Assess intervention usefulness/value from stakeholder perspective.
  • Elicit stakeholder recommendations for further intervention refinements.
  • Assess satisfaction with intervention and implementation process.
  • Identify additional barriers/facilitators.

Slide 15

Limitations

  • Requires additional time and resources.
  • Methodological challenges.
  • Necessitates care in interpreting results:
    • Intermediate vs. final results.
  • Preserving objectivity.
  • FE is part of the intervention.

Slide 16

Advantages

  • Increase understanding of key barriers and facilitators to implementation.
  • Facilitate mid-stream modifications:
    • Process for adapting tools and strategies to increase chances for implementation success.
  • Refine complex interventions:
    • Patient-Centered Medical Home (PCMH) Interventions:
      • Multiple components.
      • New roles for clinical staff.
      • Variable local resources to support implementation.

Slide 17

Questions?

Slide 18

Leveraging Networks to Spread Evidence: The Role of Formative Evaluation

Cheryl McDonnell, PhD
James Bell Associates

Slide 19

Grant Overview

  • Accelerating Implementation of Comparative Effectiveness Findings on Clinical and Delivery System Interventions by Leveraging AHRQ Networks (R18); Project Officer: Dina Moss.
  • Purpose: Spread CER findings by leveraging the capacities of multi-stakeholder or multi-site networks.
  • Goal: Implement existing evidence.

Slide 20

Evaluation Objectives

Identify effective methods of dissemination and diffusion of evidence-based practices, and barriers and facilitators to diffusion.

The evidence-based practices included activities intended to assist clinical providers and/or patients to: 

  • Choose a course of treatment.
  • Identify the most effective method of screening for a disease within a population.
  • Change the process of care delivery.
  • Promote self-management of chronic diseases.

Slide 21

Grantee Projects

  • Leveraging PBRNs for Chronic Kidney Disease Guideline Dissemination: James Mold, MD.
  • Comparative Effectiveness of Asthma Interventions Within an AHRQ PBRN: Michael Dulin, MD.
  • The Teen Mental Health Project: Dissemination of a Model for Adolescent Depression Screening & Management in Primary Care: Ardis Olson, MD.
  • Partners in Integrated Care (PIC): Keith Kanel, MD.
  • Accelerating Utilization of Comparative Effectiveness Findings in Medicaid Mental Health: Stephen Crystal, PhD.
  • Cardiac Surgery Outcomes – Comparing CUSP and TRiP to Passive Reporting: Peter Pronovost, MD, and David Thompson, DNSc.

Slide 22

CER Dissemination Grants

  • Examples of ‘T3’ phase translational research incorporating:
    • Effectiveness.
    • Dissemination.
    • Implementation.
  • Applied knowledge about interventions in a real-world setting.
  • At the ‘make it happen’ end of the continuum.

Slide 23

Formative Evaluation Approach

  • Mixed methods:
    • Qualitative.
    • Quantitative.

  • Four areas of focus:
    • Needs Assessment.
    • Evaluability assessment.
    • Implementation evaluation.
    • Process evaluation.

Slide 24

Conceptual Model

Flowchart showing linkage between resource system and system antecedents, diffusion of innovation by knowledge purveyors, and linkage between change agency and user system, including adoption of innovation, implementation, and consequences.

Greenhalgh et al. 2004

Slide 25

Formative Evaluation Tasks

  • Ensure an evaluation is feasible.
  • Determine the extent to which the program is being implemented according to plan on an ongoing basis.
  • Assess and document the degree of fidelity and variability in program implementation, expected or unexpected, planned or unplanned.
  • Provide information on what components of the intervention are potentially responsible for outcomes.

Slide 26

Formative Evaluation Tasks (cont.)

  • Describe the relationship between program context (i.e., setting characteristics) and program processes (i.e., levels of implementation).
  • Provide feedback on the status of implementation.
  • Identify barriers and facilitators to implementation.
  • Refine the delivery components.
  • Provide program accountability to stakeholders and funders.

Slide 27

Initial Site Visit Focus

  • Evidence base.
  • Value and relevance of the evaluation process to the implementation team.
  • Identified outputs.
  • Role of the research team in the implementation process.
  • Perceived degree of influence of the PI.
  • Scalability of the intervention.
  • Organizational variables.

Slide 28

Ongoing Monitoring

  • Measurement of outputs:
    • Number of clinics enrolled/services delivered/training sessions completed/meetings held.
    • Frequency/duration/dosage.
  • Measurement of Study Characteristics:
    • Clinics participating.
    • Clients served.
    • Staff involved.
  • Integration of Tracking and Reporting.
  • Site Visits.
  • Input from external expert advisors.

Slide 29

Follow-up Site Visits

  • Current Status:
    • Evidence.
    • External context.
    • Partnerships/collaborations.
    • Study design.
  • Progress to date.
  • Identified barriers/proposed solutions.
  • Identified facilitators.

Slide 30

Common Challenges

  • Resource Constraints.
  • IT Integration Challenges.
  • Infrastructure Limitations.
  • Communication and collaboration.
  • Intervention fidelity vs. flexibility.
  • Practice site engagement and sustainability.

Slide 31

Questions?

Slide 32

Thank you for attending!

For more information about the AHRQ PCMH Research Methods briefs, please visit:
http://www.pcmh.ahrq.gov/portal/server.pt/community/pcmh__home/1483/pcmh_evidence___evaluation_v2

Page last reviewed January 2014
Internet Citation: Advanced Methods in Delivery System Research - Planning, Executing, Analyzing, and Reporting Research on Delivery System Improvement: Webinar #4: Formative Evaluation (Slide Presentation). January 2014. Agency for Healthcare Research and Quality, Rockville, MD. http://www.ahrq.gov/professionals/prevention-chronic-care/improve/coordination/webinar04/formativeevalsl.html