Regional Collaboratives as Catalysts for Quality Reporting and Improvement (Text Version)

Slide presentation from the AHRQ 2009 conference

On September 15, 2009, Mary McWilliams made this presentation at the 2009 Annual Conference.


Slide 1

Regional Collaboratives as Catalysts for Quality Reporting and Improvement

September 15, 2009

Mary McWilliams
Executive Director

 

Slide 2

Health Alliance: A Regional Coalition

  • Collaborative of purchasers, providers, plans, patients
  • Private, non-partisan, non-profit 501(c)(3)
  • Created in 2004; hired staff in 2005
  • Now ~150 organizations with 2 million lives
  • Focus: 5 counties in Puget Sound region
  • Funded by participants and grants
  • RWJF Aligning Forces for Quality grantee
  • HHS Chartered Value Exchange

 

Slide 3

A Sampling of Who We Are...

  • Alaska Airlines
  • The Boeing Company
  • REI
  • Starbucks Coffee Company
  • Puget Sound Energy
  • WA State Health Care Authority
  • County Governments
  • City of Seattle
  • UFCW/Teamsters
  • Group Health
  • Polyclinic
  • Everett Clinic
  • Northwest Physicians Network

 

Slide 4

What We Do

  • Build local agreement on evidence-based care and standard measures
  • Report on health care performance in quality and cost
  • Provide resources to help with decision-making
  • Encourage incentives to reward value (e.g., medical home payment pilot)

Transparency. Value. Consumerism.

 

Slide 5

Community Checkup:

  • Signature report & web site
    www.WAcommunitycheckup.org
  • Quality performance measure results for the community
    • 1st release - January 2008
      • 14 volunteer medical groups; 97 clinics
    • 2nd release - November 2008
      • 47 medical groups; 170 clinics
      • 25 hospitals
    • 3rd release - July 2009
      • 80 medical groups; 270 clinics
      • 25 hospitals

 

Slide 6

Slide 6. Alliance Data Processes

Data suppliers, largely health plans, submit claims and enrollment data to Milliman. Milliman de-identifies the claims by removing patient names, combines the data from all sources, and then runs it through algorithms for the 23 measures of effective care to produce measure results. The Alliance has access only to these de-identified results.

Meanwhile, the Alliance works with medical groups to ensure that we have the right clinicians in the right medical group, which is no mean feat. We supply provider information, including practice location, so that Milliman can attribute results to the right group. Medical groups receive results on individual clinicians, but the Alliance reports publicly only at the clinic level, and only for clinics of 4 or more clinicians.
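
To make the roll-up and suppression step concrete, here is a minimal Python sketch of the rule just described. All field names and data structures are illustrative assumptions; this is not the Alliance's or Milliman's actual code.

```python
# Hypothetical sketch of the attribution and suppression rule described
# above. Field names and structures are assumptions for illustration.
from collections import defaultdict

MIN_CLINICIANS_FOR_PUBLIC_REPORT = 4  # public reporting threshold from the slide

def clinic_level_results(clinician_results):
    """Roll per-clinician measure results (numerator/denominator pairs)
    up to the clinic level, suppressing clinics below the threshold."""
    clinics = defaultdict(lambda: {"num": 0, "den": 0, "clinicians": set()})
    for r in clinician_results:
        # r is assumed to look like:
        # {"clinic": "Elm St Clinic", "npi": "1234567890", "num": 8, "den": 10}
        c = clinics[r["clinic"]]
        c["num"] += r["num"]
        c["den"] += r["den"]
        c["clinicians"].add(r["npi"])
    public = {}
    for name, c in clinics.items():
        # Publish a rate only for clinics with 4+ clinicians and a nonzero denominator.
        if len(c["clinicians"]) >= MIN_CLINICIANS_FOR_PUBLIC_REPORT and c["den"] > 0:
            public[name] = c["num"] / c["den"]
    return public
```

The key point is the threshold: medical groups still see the per-clinician inputs, while only the thresholded clinic-level rates reach the public report.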

 

Slide 7

3rd Community Checkup (July 2009):

Ambulatory Quality of Care Measures:

  • 23 process of care measures including chronic disease management, prevention, and appropriate use of services.
  • Claims and encounter data for 2 million lives from 18 health plans, self-insured purchasers, union trusts and Medicaid.
  • Separate results for Commercial and Medicaid
  • Custom reports for data suppliers' workforce/dependents

Hospital Quality of Care Reports:

  • Repackaging of publicly available Medicare hospital results (www.hospitalcompare.hhs.gov), including heart attack, heart failure, pneumonia, surgical care, and patient experience

eValue8™ Health Plan Results:

  • Summary level comparison of six health plans through standardized tool of National Business Coalition on Health

 

Slide 8

Slide 8. Diabetes Care

This chart summarizes the results for 4 of our chronic care measures covering management of diabetes. The measures include hemoglobin A1c testing, eye exams, cholesterol testing, and kidney screening.

For each measure we show the top- and bottom-performing medical group and the regional average, and compare them to the top 10% of HEDIS results from health plans (based on claims data only).

You can see that our regional averages generally compare very favorably to the top 10%, but there is still wide variation across medical groups, and the size of that range differs from measure to measure.

 

Slide 9

Slide 9. Image of Web Screen Capture

Moving from the summary chart to the web view, this is a screenshot of those same diabetes measures as we show them online. We start with information on why these measures are important and then show results on these measures for each medical group, and even for each clinic. For simplicity, we categorize results as above, at, or below the regional average, but people can drill down to see the individual results and the confidence intervals around them.
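
As a minimal sketch of what such a confidence interval might look like, the Python below computes a Wilson score interval, a common choice when a measure rate is a simple proportion (patients meeting the measure over eligible patients). The slide does not say which method the Alliance uses, so treat this as illustrative only.

```python
# Wilson score interval for a proportion; illustrative, not necessarily
# the method behind the Community Checkup's published intervals.
import math

def wilson_interval(successes, n, z=1.96):
    """95% (by default) Wilson score interval for a proportion."""
    if n <= 0:
        raise ValueError("denominator must be positive")
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return max(0.0, center - half), min(1.0, center + half)

# Hypothetical example: 82 of 100 eligible diabetic patients had an HbA1c test.
low, high = wilson_interval(82, 100)
print(f"rate 82.0%, 95% CI ({low:.1%}, {high:.1%})")  # roughly (73.3%, 88.3%)
```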

 

Slide 10

Next Steps for Quality Reports:

  • Additional measures
  • Additional data suppliers
  • Expand medical group auditing capabilities
  • Expand data elements (e.g., cost data, vision data, benefit information, PHI for linking member data)
  • Incorporate clinical data

 

Slide 11

The Next Horizon: Cost Transparency

  • Need cost transparency to complete the value equation
  • Currently lack pricing data from data suppliers
  • Interim strategy: report on relative resource use
  • Alliance developing 4 report 'tracks' to capture service intensity across hospitals and medical groups

 

Slide 12

Resource Measurement Report Plan

  1. Mine public sources for acute care facility information
    • Dartmouth Atlas service intensity
    • CHARS (WA hospital data) gross charges and net revenue per case
  2. Use Alliance multi-payer data set to analyze selected high-cost, high-volume hospitalizations
    • Facility & professional care dyad resource use during hospitalization
    • Apply APR-DRG (3M) and RVUs (Milliman)
  3. Use Alliance multi-payer data set to analyze selected surgical procedures (preference-sensitive conditions)
    • Procedure frequency by patients' area of residence
  4. Use Alliance multi-payer data set to aggregate episodes of care
    • Resource use across care continuum: professional, facility, Rx
    • QASC (Brookings/ABMS) definitions - link to HEDIS measure results
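
As an illustrative sketch of step 4 above, the Python below groups one patient's claims into episodes of care and totals resource use by claim type (professional, facility, Rx). The 30-day gap rule, the field names, and the use of allowed dollars as the resource measure are all assumptions for illustration; the actual QASC (Brookings/ABMS) episode definitions are considerably more detailed.

```python
# Hypothetical episode grouper: a 30+ day gap between claims starts a
# new episode; dollars are summed by claim type within each episode.
from collections import defaultdict
from datetime import date, timedelta

EPISODE_GAP = timedelta(days=30)  # assumed gap rule, not the QASC definition

def group_episodes(claims):
    """Split date-sorted claims into episodes and sum dollars by claim type."""
    episodes, current, last_date = [], None, None
    for c in sorted(claims, key=lambda c: c["date"]):
        if last_date is None or c["date"] - last_date > EPISODE_GAP:
            current = defaultdict(float)  # gap exceeded: start a new episode
            episodes.append(current)
        current[c["type"]] += c["allowed"]
        last_date = c["date"]
    return episodes

claims = [  # hypothetical claims for one patient and one condition
    {"date": date(2009, 1, 5), "type": "professional", "allowed": 120.0},
    {"date": date(2009, 1, 7), "type": "rx", "allowed": 35.0},
    {"date": date(2009, 6, 1), "type": "facility", "allowed": 900.0},
]
for i, ep in enumerate(group_episodes(claims), 1):
    print(f"episode {i}: {dict(ep)}")
```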

 

Slide 13

What's Made the Alliance Successful (to date)?

  • Personal, sustained leadership of respected purchaser executive
  • Purchaser driven, multi-stakeholder effort
  • Quality first; efficiency second
  • National standards but local stakeholder input
  • Community-wide report rather than payer-specific
  • Limitations on first uses of report
  • Public reporting at the medical group level
  • Trial run of 14 volunteer clinics for first report
  • Positioning as "Community Checkup"
  • Addition of NBCH's eValue8™ health plan results for purchasers

 

Slide 14

What Challenges Lie Ahead?

  • Expanding the content and uses of the quality report and scope of reporting
  • Navigating the development and release of resource use reports and subsequent cost measures
  • Engaging health plans in collaborative projects
  • Continuing to afford the cost of data aggregation
  • Demonstrating ROI to stakeholders, especially purchasers
  • Expanding the use of the all-payer database as a community asset
  • Balancing national standardization and local flexibility

 

Slide 15

Transparency. Value. Consumerism.

www.WaCommunityCheckup.org

Puget Sound Health Alliance
2003 Western Avenue, Suite 600, Seattle, WA 98121
(206) 448-2570
www.pugetsoundhealthalliance.org

Current as of December 2009
Internet Citation: Regional Collaboratives as Catalysts for Quality Reporting and Improvement (Text Version). December 2009. Agency for Healthcare Research and Quality, Rockville, MD. http://www.ahrq.gov/news/events/conference/2009/mcwilliams/index.html