Part V. Interpreting Quality and Resource Use Measures

Selecting Quality and Resource Use Measures: A Decision Guide for Community Quality Collaboratives

Question 26. How can quality and resource use measures be evaluated together to help identify high-value and low-value providers?

As community quality collaboratives progress beyond their initial measure selection, data collection, and measurement efforts, new challenges and opportunities will arise. One such challenge is designing a useful construct for evaluating quality and resource use measures concurrently.

Very little research has been done in this area, although interest is building.79 The Centers for Medicare & Medicaid Services (CMS) is currently exploring how to combine resource use metrics for episodes of care with quality metrics to differentiate physicians and to tie a portion of their payment to improvements or achievable benchmarks of efficiency. The CMS acute care episode (ACE) demonstration focuses on promoting efficiency by bundling all care delivered for an inpatient stay. Similar experiments are underway in the private sector; at least one health system is bundling payment for care in a hospital with the care delivered before and after the hospitalization for particular conditions.114

Figure 4 illustrates one collaborative's approach, plotting a quality composite score against severity-adjusted hospital charges. In this example, providers falling in the upper-left quadrant score best on both quality and resource use. The Consumer-Purchaser Disclosure Project presented the diagram in Figure 5 to summarize the distribution of physicians in terms of the "value" they provide. Here, "value" is again constructed from both quality and resource use measures, but consumers and purchasers are encouraged to seek physicians in the upper-right quadrant, where high quality is delivered with low resource use.
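The quadrant analysis described above can be sketched in code. The example below is a minimal, hypothetical illustration only: the provider names and numbers are invented, and the choice of group medians as the cut points on each axis is an assumption (a collaborative might instead use risk-adjusted benchmarks or statistical significance tests).

```python
# Hypothetical sketch of a quadrant ("value") classification: each provider
# has a quality composite score and a severity-adjusted charge measure, and
# is assigned to a quadrant relative to the group medians. All data below
# are illustrative, not drawn from any actual collaborative.
from statistics import median

providers = {
    "Hospital A": {"quality": 92.0, "adjusted_charge": 11500.0},
    "Hospital B": {"quality": 88.0, "adjusted_charge": 15200.0},
    "Hospital C": {"quality": 79.0, "adjusted_charge": 10900.0},
    "Hospital D": {"quality": 75.0, "adjusted_charge": 16800.0},
}

# Median cut points are one simple, assumed choice of threshold.
quality_cut = median(p["quality"] for p in providers.values())
charge_cut = median(p["adjusted_charge"] for p in providers.values())

def quadrant(p):
    """Classify a provider relative to the median on each axis."""
    high_quality = p["quality"] >= quality_cut
    low_charge = p["adjusted_charge"] <= charge_cut
    if high_quality and low_charge:
        return "high quality / low resource use (best value)"
    if high_quality:
        return "high quality / high resource use"
    if low_charge:
        return "low quality / low resource use"
    return "low quality / high resource use"

for name, p in providers.items():
    print(f"{name}: {quadrant(p)}")
```

With these invented data, Hospital A lands in the best-value quadrant (above the median quality of 83.5 with charges below the median of 13,350), while Hospital D lands in the worst.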

Community quality collaboratives can serve as laboratories for both national and regional initiatives to package information on health care resource use and quality in ways that will promote more efficient delivery of high-quality care. AHRQ has supported several useful tools to assist in this process, including:

  1. The "Talking Quality" Web site at https://cahps.ahrq.gov/consumer-reporting/talkingquality/ provides specific guidance about "what to say" and "how to say it" when communicating information on health care quality to consumers.
  2. The "Health Care Report Card Compendium" is a searchable directory of health care report cards that provide comparative information on the quality of health plans, hospitals, medical groups, individual physicians, and nursing homes.
  3. Evidence-based, empirically tested model reports are available for public reporting of hospital performance on the AHRQ Quality Indicators site at: http://qualityindicators.ahrq.gov/Modules/Default.aspx. Similarly, templates for reporting on patient experience (based on the CAHPS® family of surveys) are available at: https://cahps.ahrq.gov/Surveys-Guidance/CG/index.html.
  4. MONAHRQ (My Own Network, powered by AHRQ) is a Web-based application that will enable community collaboratives to input their own hospital administrative data and generate a data-driven Web site. MONAHRQ will be released in 2010.
  5. "Model Public Report Elements: A Sampler" is an illustrative Web-based sampler of model public report elements. It spans five core Web pages that constitute a public report as well as functionality that facilitates use by consumers. The Sampler will be released in 2010 and accessible on AHRQ's community quality collaboratives Web page at http://www.ahrq.gov/qual/value/localnetworks.htm.
  6. "Methodological Considerations in Generating Provider Performance Scores for Use in Public Reporting" is a report focused on a set of 20 key methodological decisions associated with producing provider (e.g., hospital, physician, physician group) performance scores for use in public reporting. It also includes an explanation of the practical importance of each decision, a review of alternative decision paths, discussion of the pros and cons of each option, and examples from collaboratives. This resource will be released in 2010 and accessible on AHRQ's community quality collaboratives Web page at http://www.ahrq.gov/qual/value/localnetworks.htm.

Similar tools and examples of successful dissemination approaches are likely to emerge from the Robert Wood Johnson Foundation's Aligning Forces for Quality program (www.rwjf.org/qualityequality/af4q/focusareas/index.jsp). Through this program, 15 vanguard communities are bringing together key stakeholders to improve quality of care, measure and publicly report on quality of care, and engage consumers to make informed choices about their own health care. These efforts by both public and private funding agencies will provide a stronger evidence base for future initiatives to promote value-based purchasing and to increase consumer demand for high-value health care.

Figure 4. Example of "value" plot diagram

Diagram comparing hospital charges and quality of care for heart attack (also called acute myocardial infarction, or AMI). This quadrant analysis attempts to quantify the value each hospital provides when caring for patients with heart attacks. The quality score is a composite number that takes into account how well a hospital performed in delivering the recommended care.

Source: Wisconsin Collaborative for Healthcare Quality, 2009.

Figure 5. Example of plotting physician value

Scatter plot showing the distribution of physicians by quality and efficiency. The x axis is the Efficiency Index (higher efficiency equals lower relative cost for an episode of care). The y axis is the Quality Index (quality outcomes of care processes). The four possible combinations are high quality, low efficiency; high quality, high efficiency (the ideal); low quality, high efficiency; and low quality, low efficiency. The 50th percentile is noted. Most of the data points cluster around the 50th percentile.

Source: Price & Cost Transparency: Understanding the Issues — Shaping the Agenda. Consumer-Purchaser Disclosure Project Invitational Working Session; May 25, 2006.

Current as of May 2010
Internet Citation: Part V. Interpreting Quality and Resource Use Measures: Selecting Quality and Resource Use Measures: A Decision Guide for Community Quality Collaboratives. May 2010. Agency for Healthcare Research and Quality, Rockville, MD. http://www.ahrq.gov/professionals/quality-patient-safety/quality-resources/tools/perfmeasguide/perfmeaspt5.html