Best Practices in Public Reporting No. 1: How To Effectively Present Health Care Performance Data To Consumers

The purpose of the Best Practices in Public Reporting series is to provide practical approaches to designing public reports that make health care performance information clear, meaningful, and usable by consumers. Report 1 focuses on the presentation of comparative health care performance data.


Prepared by Judith Hibbard, Dr.P.H., and Shoshanna Sofaer, Dr.P.H., Center for Health Improvement.

Prepared for the Agency for Healthcare Research and Quality, Contract No: HHSA290200710022T.

Contents

Purpose
Importance of Reporting to the Public
Challenges in Designing a Report Card
   Consumers do not know that there is a quality gap
   Consumers and clinical experts define quality differently
   Quality measures are often hard to understand or are not meaningful to consumers
   Using quality information to inform choices is hard cognitive work
Practical Report Design Solutions
   Make the information more relevant to what consumers already understand and care about
   Make it easy for consumers to understand and use the comparative information
   Test reports with consumers during development
Cost and Efficiency
   Cost
   Efficiency
Issues Addressed in Reports 2 and 3
References and Resources
Acknowledgments

Purpose

The purpose of the Best Practices in Public Reporting series is to provide practical approaches to designing public reports that make health care performance information clear, meaningful, and usable by consumers. This series consists of three reports:

  • Report 1: This report focuses on the presentation of comparative health care performance data.
  • Report 2 (Maximizing Consumer Understanding of Public Comparative Quality Reports: Effective Use of Explanatory Information): This report focuses on the background information contained in public reports that frames the decision question, provides a context for using the information, and details the specifics of the data.
  • Report 3 (How To Maximize Public Awareness and Use of Comparative Quality Reports Through Effective Promotion and Dissemination Strategies): This report focuses on the promotion and dissemination of reports.

Together the three reports cover the wide range of issues and challenges faced by report sponsors. The audiences for the three reports are community collaboratives and others involved in the production, packaging, promotion, and dissemination of comparative health care quality and cost information for consumers, patients, and the general public. The goal is to help sponsors present information so that it can be easily understood and processed by people who may have limited time or motivation and who lack technical training in this area.

Importance of Reporting to the Public

Consumers have been slow to use comparative performance reports to make health care choices. Their use of reports, however, can influence quality in at least three ways:

  • Informed choices make it more likely that consumers will obtain high-quality health care for themselves and their family members.
  • The collective effect of many informed choices may stimulate quality improvement among providers. That is, providers may be motivated to improve as a way to protect or enhance their market share.
  • Public reports that identify providers as high quality or low quality affect their public image and may encourage them to improve the quality of care they provide in order to protect or enhance their reputations.

Finding ways to make public reports more relevant and useful to consumers is part of an overall strategy to improve health care.

Challenges in Designing a Report Card

Consumers do not know that there is a quality gap

All of us have heard over and over again that the United States has the best medical care in the world. Messages about the significant and pervasive quality gaps in health care, by contrast, have been far less prominent. It is not surprising, then, that consumers widely believe that the technical quality of care is high and uniform across physicians, hospitals, and other providers. If the technical quality of care were actually uniformly high, as many believe, then ignoring public performance reports would make sense. Consumers do know, from their own experiences, that the interpersonal aspects of health care vary considerably. Pairing information on the technical aspects of quality with patient experience data may show consumers that quality is a concept they know something about and can learn more about.

Consumers and clinical experts define quality differently

In a national survey, the top three factors that consumers identified as "most important" in determining the quality of health care patients receive were affordability of care, the doctor's qualifications, and access to care for everyone (Kaiser Family Foundation, 2004). This is quite different from the more multifaceted concept of health care quality represented in most public performance reports, which typically include technical quality-of-care measures as well as patient experience measures. There is thus a serious communication gap between what is contained in reports and what consumers think quality of care means.

Quality measures are often hard to understand or are not meaningful to consumers

Measures of quality are frequently misinterpreted. Misunderstanding takes many forms, and sometimes measures are interpreted in exactly the opposite way they are intended. For example, some hospital reports include length of stay (LOS) or readmissions as performance indicators. Longer LOS and higher readmissions are intended to indicate poor performance. However, many consumers will view these as measures of access and interpret them in the opposite way—they may think that a high score shows that patients are able to stay in the hospital for as long as they need or be readmitted when necessary.

Some measures are incomprehensible to consumers. For example, reporting on measures such as administration of beta blockers and angiotensin-converting enzyme (ACE) inhibitors assumes a higher level of clinical knowledge than most consumers have.

What is not understood is often ignored or viewed as unimportant. The assumption that consumers will "click through" to learn or look up the meaning of an indicator is faulty. If labels are not clear to begin with, they are likely to be ignored.

Using quality information to inform choices is hard cognitive work

Using performance reports to inform choices involves reviewing and processing a large amount of information and then applying that information to a choice. As the number of pieces of information or decision factors to consider increases, an individual's ability to use that information to make a choice decreases. Indeed, giving people a lot of information can be counterproductive (Vaiana and McGlynn, 2002; Hibbard, Slovic, and Jewett, 1997). Humans can integrate only a limited number of factors into a choice. When asked, consumers often indicate they want more information; when faced with actually using that information to make a choice, however, they feel overwhelmed by the amount.

Making tradeoffs among different categories of factors (e.g., a doctor who communicates well but whose patients face long waits for appointments) is a very difficult cognitive task. Most providers are not going to score well on everything, necessitating tradeoffs and differential weighting of factors. Differential weighting of factors in a choice is problematic for people.

Most performance reports are constructed on the assumption that different people will care about different elements of care. The inclusion of multiple performance measures on different elements of care in one report is typically done so that people can pick and choose and differentially weight factors according to their preferences. In reality, people have a very hard time differentially weighting, and even when they think they are doing so, they often are not (Hibbard and Peters, 2003).

In sum, using comparative data to select a provider involves three tasks. Consumers must process a large amount of information, select relevant factors and differentially weight them, and bring all the factors together into a choice. However, research shows that these are very onerous cognitive tasks at which human beings are not very adept.

Thus, for a variety of reasons, consumers have not been quick to use these reports. In addition to all the reasons cited above, reports have not provided guidance on how to act on the information. Finally, consumers are inundated on a daily basis with other kinds of information and demands on their attention. Figuring out what to pay attention to and what is credible information can be additional challenges for them.

Practical Report Design Solutions

Make the information more relevant to what consumers already understand and care about

1. Present an overall definition of quality

Because consumers do not understand quality of care in the same way that it is often measured and reported, communicating an overall definition that is understandable and salient may help consumers see the relevance of the information. Further, because consumers tend to define quality of care narrowly (e.g., understanding it in terms of the quality of the relationship with their clinician), broadening their concept of quality is likely a necessary prerequisite to making comparative performance information meaningful.

Consumers have a relatively easy time understanding patient experience measures and some patient safety measures. They have a much harder time understanding and relating to process of care measures, volume measures, and many structural measures. Even some classic outcome measures, such as mortality, may be rejected by some consumers, in part because they do not want to think about such a negative idea.

2. Define the elements of quality and use them as the reporting categories

A definition of quality of care that is communicated in everyday language and kept to a few simple ideas will likely work best. For example, phrases using a modified Institute of Medicine framework such as "care that is proven to work," "care that is responsive to a patient's needs," and "care that does not cause harm" communicate what is meant by quality without using jargon or technical terms. Sponsors should consider using these same categories, or ones like them, to frame the decision about choosing a provider, as well as to label overarching reporting categories for displaying the indicators.

The indicators included under the reporting categories also must be salient and easy to understand (and relate to the overarching framework). That is, the indicators too must be in everyday language and speak to the things consumers already care about. For example, "effective and appropriate treatment" is easier to understand than the label "ACE inhibitors." By allowing users to drill down or otherwise find details on the measure, reports can still make specifics available without burdening most users, who will be perfectly happy with the more general label. When reports have categories or category names that are difficult to understand or are viewed as meaningless, sponsors risk discouraging the use of public reports or having users draw inappropriate conclusions about the meaning of the data.

3. Include information on sponsor and methods

The information in reports must be viewed as credible and sponsored by a trusted source. Generally, consumers prefer information from their own physician or from sources that are independent, objective, and knowledgeable. Consumers mistrust information when it appears to come from the organization being evaluated. This means giving full information on sponsorship and methodologies, as well as access to the more granular data. It also requires providing assurances that the information comes from a trusted and reliable source. This detail should not be on the "top layer" of the report, but it needs to be available, and made known to be available, in a drill-down layer.

Make it easy for consumers to understand and use the comparative information

1. Reduce the cognitive burden by summarizing, interpreting, highlighting meaning, and narrowing options

How much information is presented, and the way that information is displayed, will make a difference in whether consumers can actually process it and use it in decision-making. Among other things, this means being consistent in whatever metrics are used. When a high number (or long bar) indicates good performance on one type of indicator and poor performance on another, the chances of confusing or misleading people are increased. It is always a good idea to make clear whether a high value means good or bad performance.

Information displays that help consumers quickly see the meaning in the data increase both motivation to use the data and actual use. This means summarizing information and even interpreting data for consumers. One of the more powerful display strategies is to rank order providers by performance, with the top performers at the top and the bottom performers at the bottom. Even if providers are ordered within tiers, with providers in the same tier being roughly equal performers, this helps consumers immensely by reducing the work required to make a choice. Ordering by performance also helps highlight differences in performance for the user.

Labeling performance helps consumers in the same way. Labeling performance as "excellent" or "good" does some of the cognitive work for viewers by telling them what the data mean, and it even calls out outstanding examples for them. As an example, Figure 1 shows performance with high performers at the top and low performers at the bottom. The summary bar reflects the percentage of indicators on which the physician scored at or above the 70th percentile. If physicians were listed in some other order (e.g., by last name, by ZIP code, by clinic affiliation), consumers would have to review all performance results, doing the difficult cognitive work of identifying and arranging the different performance levels to make a choice.

Figure 1. Sample performance report ranking physicians*

Recommended: Summarize and order performance results to help bring the information together for the user.

* Physician names and addresses are fictitious.
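
The kind of summary score and rank ordering shown in Figure 1 can be sketched in a few lines of code. The sketch below is purely illustrative: the physician names, indicator names, and scores are invented, and the approach (counting the share of indicators on which a physician scores at or above the 70th percentile of all physicians, then listing physicians from highest to lowest summary score) is one plausible reading of the figure, not a prescribed method.

    # Illustrative sketch only: hypothetical physicians, indicators, and scores.
    # A real report would draw on many physicians; three are used here for brevity.
    import math

    def percentile(values, pct):
        """Return the pct-th percentile of a list of scores (linear interpolation)."""
        ordered = sorted(values)
        k = (len(ordered) - 1) * pct / 100.0
        lower, upper = math.floor(k), math.ceil(k)
        if lower == upper:
            return ordered[int(k)]
        return ordered[lower] + (ordered[upper] - ordered[lower]) * (k - lower)

    # Hypothetical indicator scores (0-100) for three fictitious physicians.
    scores = {
        "Physician A": {"diabetes care": 92, "preventive screening": 88, "patient experience": 75},
        "Physician B": {"diabetes care": 78, "preventive screening": 70, "patient experience": 93},
        "Physician C": {"diabetes care": 60, "preventive screening": 65, "patient experience": 80},
    }

    indicators = sorted(next(iter(scores.values())))
    thresholds = {i: percentile([s[i] for s in scores.values()], 70) for i in indicators}

    def summary_score(name):
        """Percentage of indicators at or above the 70th-percentile threshold."""
        hits = sum(scores[name][i] >= thresholds[i] for i in indicators)
        return 100.0 * hits / len(indicators)

    # Rank order so the consumer does not have to sort the physicians mentally.
    for name in sorted(scores, key=summary_score, reverse=True):
        print(f"{name}: {summary_score(name):.0f}% of indicators at or above the 70th percentile")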

Figure 2 shows how to present data that include more than one variable, such as cost and quality. Here, the "Best Value" label reflects the rater's evaluation of multiple factors and reduces the burden on users, allowing them to identify the top performers regardless of the order in which providers are presented.

Figure 2. Sample report with multiple variables*

Recommended: Create a report that summarizes and interprets information for consumers.

* Provider names are fictitious.

Using symbols that are inherently meaningful also can help people quickly discern the meaning of data. About one-half of the population has difficulty deriving meaning from numbers. Facing a sea of numbers can be daunting for them, so using symbols rather than numbers can help. The best symbols are those that tell the meaning as part of the symbol.

The examples in Figure 3 show symbols that incorporate the interpretive label as part of the symbol; they combine a shape and a color so that it is easy to see patterns in the data. When symbols are presented this way, users can interpret them immediately, without the added burden of holding information in mind while consulting a legend.

2. Reduce the cognitive burden by helping to bring the information together into a choice

Many of the strategies discussed above will help users bring the information together into a choice. Any strategy that narrows options, highlights differences, and helps users differentially weight factors will help them arrive at a choice. Using summary measures, and symbols to represent them, helps people "count up" the good attributes. This is how most people will use the data, rather than differentially weighting factors according to personal preferences. Providing them with this framework will help them use the data in decision-making.

Making full use of Web-based data allows the user to narrow down the number of data points in a variety of ways; the simplest is to look only at options in one's geographic region. Web-based information can be manipulated in a number of ways and customized by users to narrow and order their choices. A PDF file, by contrast, although simpler to produce, does not allow this customization and is therefore less desirable than a Web-based approach.

Figure 3. Example of symbols in reports

Recommended: Use colors, symbols, and simple words to help consumers process and interpret data quickly.

Finally, even though it is often technically correct to present confidence intervals along with comparative performance data, doing so should be avoided. Understanding the concept of confidence intervals requires statistical knowledge that most consumers do not have. Research also shows that consumers tend to discount information when a report communicates that there is uncertainty about the data (Schapira, Nattinger, and McHorney, 2001). Confidence intervals should be used to determine performance levels; however, consumers should not be burdened with interpreting them.

Ironically, the characteristics of reports that help consumers the most (e.g., ordering by performance) are also the ones most often resisted by providers. Summarizing and interpreting data for consumers, while helping users, places greater responsibility on report sponsors. This responsibility may include determining what constitutes a meaningful difference among providers or labeling performance to indicate which levels should be interpreted as good or poor.

Test reports with consumers during development

While we can do our best to produce reports we think consumers will understand and find meaningful, it is always best to test the information with consumers. Such tests will reveal areas consumers do not understand, specific misinterpretations, difficulties users have finding information within reports, and users' perceptions of the information's relevance. These tests can be done with a small number of consumers, preferably members of the population who will have access to the report.

Consumer input can be obtained through a variety of questions and tasks. Getting individuals to say in their own words what they think a label or a phrase means reveals quite a lot about their comprehension. Asking people to find the top three and bottom three performing providers will indicate how easy it is for people to put this information together and interpret it correctly. Asking about whether they would use this information to choose a doctor or hospital will reveal how much it is valued.

Research Identifies Critical Design Elements

A recent experiment showed what helped consumers the most:

  • Rank ordering by performance as opposed to alphabetical ordering.
  • Using symbols (such as the ones shown in Figure 3) instead of numbers.
  • Providing an overall summary measure.
  • Including fewer reporting categories (5 vs. 9).

The findings showed that combining all four design approaches produced the best results: 89% of consumers who viewed reports with all four approaches were able to correctly identify the top three and bottom three performers. Only 16% of consumers who viewed the same information with none of these design characteristics could do so.

Carman KL. Improving quality information in a consumer-driven era: showing differences is crucial to informed consumer choice. Presentation at the 10th National CAHPS® User Group Meeting, Baltimore, MD, 2006.


Cost and Efficiency

Cost

Increasingly, we are seeing the inclusion of cost data in public reports. While useful to consumers, cost information adds complexity to the choices and can be misleading. Aside from issues of accuracy and reliability of the data, there is a major concern with the interpretation of cost data. Typically, American consumers believe that a more expensive "anything" is of higher quality. If consumers get cost data alone, without quality data, they are likely to use the cost data as a proxy for quality.

Alternatively, reports can show quality within cost strata, or cost within quality strata. In this context, it is easier for the consumer to see that it is possible to get good quality at a good price. An example is a comparison report on care systems from the Patient Choice program offered by Medica Health Plans: http://www.pchealthcare.com/consumers/midwest_patientchoice/aboutpcs/consumersurvey.html. However, if quality information is not well understood or is not integrated with the presentation of cost information (e.g., quality within cost strata), misinterpretations may lead to the choice of higher-cost options.
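
To make the "quality within cost strata" idea concrete, the short sketch below groups a handful of invented providers into cost tiers and then orders them by quality rating within each tier. The provider names, tiers, and ratings are hypothetical and are not drawn from the Patient Choice report or any actual data; the sketch simply shows one way such a stratified display could be assembled.

    # Illustrative sketch only: hypothetical providers, cost tiers, and quality ratings.
    # Groups providers into cost strata and orders them by quality within each stratum,
    # so a consumer can see that good quality is available at a lower cost.
    from collections import defaultdict

    # Hypothetical data: (provider, cost tier, quality rating).
    providers = [
        ("Provider A", "$",   "Excellent"),
        ("Provider B", "$$",  "Good"),
        ("Provider C", "$",   "Fair"),
        ("Provider D", "$$$", "Excellent"),
        ("Provider E", "$$",  "Excellent"),
    ]

    quality_rank = {"Excellent": 0, "Good": 1, "Fair": 2}

    # Group providers by cost stratum.
    by_cost = defaultdict(list)
    for name, cost, quality in providers:
        by_cost[cost].append((name, quality))

    # Display each stratum from lowest to highest cost, best quality first within it.
    for cost in sorted(by_cost, key=len):
        print(f"Cost tier {cost}:")
        for name, quality in sorted(by_cost[cost], key=lambda p: quality_rank[p[1]]):
            print(f"  {name}  {quality}")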

Efficiency

Efficiency is not an attribute of health care that consumers are familiar with, nor is it one they are looking for. The term does not resonate well with consumers, and it can evoke concerns about medical care that cuts corners or saves money for their health plan or their employer at the consumer's expense. At this point, we do not have a way to communicate effectively with consumers about this issue. Labeling a provider as "efficient" will likely not be viewed as a positive attribute. Different ways of conveying the concept, such as "uses health care dollars wisely," "is careful with your health care dollars," or "is a high-value provider," may work better than the term "efficient," but whatever language is used needs to be tested with consumers first.

Issues Addressed in Reports 2 and 3

Report 2:

  • How to get started with public reporting.
  • How to make the information in reports actionable.
  • How to highlight the fact that quality differences exist.
  • How to integrate cost and/or efficiency data with quality data.

Report 3:

  • How to increase the credibility of the reports.
  • How to promote the reports.

References and Resources

2006 survey: what consumers want to know about their HMOs. Policy Brief. Sacramento, CA: Center for Health Improvement; September 2006. Available at: http://www.chipolicy.org/doc.asp?id=6059.

TalkingQuality: Guidance for sponsors of consumer reports on health care quality. Rockville, MD: Agency for Healthcare Research and Quality. Available at: http://www.talkingquality.gov/.

Carman KL. Improving quality information in a consumer-driven era: showing the differences is crucial to informed consumer choice. Presentation at the 10th National CAHPS® User Group Meeting, Baltimore, MD, 2006.

Hibbard JH, Peters EM. Supporting informed consumer health care decisions: data presentation approaches that facilitate the use of information in choice. Annu Rev Public Health 2003;24:413-33.

Hibbard JH, Slovic P, Jewett JJ. Informing consumer decisions in health care: implications from decision-making research. Milbank Q 1997;75(3):395-414.

National survey on consumers' experiences with patient safety and quality information. Menlo Park, CA: Kaiser Family Foundation; 2004. Available at: http://www.kff.org/kaiserpolls/7209.cfm.

Kanouse D, Spranca M, Vaiana M. Reporting about health care quality: a guide to the galaxy. Health Promot Pract 2004;5(3):222-31.

Schapira MM, Nattinger AB, McHorney CA. Frequency or probability? A qualitative study of risk communication formats used in health care. Med Decis Making 2001;21:459-67.

Shaller Consulting. Consumers in health care: creating decision support tools that work. Oakland: California HealthCare Foundation; 2006. Available at: http://www.chcf.org/publications/2006/06/consumers-in-health-care-creating-decisionsupport-tools-that-work.

Shaller Consulting. Designing consumer guides on quality for Medi-Cal managed care beneficiaries. Oakland: California HealthCare Foundation; 2003. Available at: http://www.chcf.org/topics/medi-cal/index.cfm?itemID=20536.

Shaller D. Consumers in health care: the burden of choice. Oakland: California HealthCare Foundation; 2005. Available at: http://www.chcf.org/topics/view.cfm?itemID=115327.

Vaiana ME, McGlynn EA. What cognitive science tells us about the design of reports for consumers. Med Care Res Rev 2002;59:3-35.

Acknowledgments

The authors would like to thank the following people for reviewing this document: Bruce A. Boissonnault, President and Chief Executive Officer, Niagara Health Quality Coalition; Doug Libby, R.Ph., Executive Director, Maine Health Management Coalition; Dale Shaller, M.P.A., Principal, Shaller Consulting; and the teams at the Agency for Healthcare Research and Quality (Peggy McNamara, M.S.P.H.; Jan De La Mare, M.P.Aff.; and Katherine Crosson, M.P.H.) and Center for Health Improvement (Patricia E. Powers, M.P.P.A.; and Karen Shore, Ph.D.).

We consider our Learning Network tools to be works in progress and always welcome your comments. Please forward suggestions to Peggy McNamara at peggy.mcnamara@ahrq.hhs.gov.

Page last reviewed May 2010
Internet Citation: Best Practices in Public Reporting No. 1: How To Effectively Present Health Care Performance Data To Consumers. May 2010. Agency for Healthcare Research and Quality, Rockville, MD. http://archive.ahrq.gov/professionals/quality-patient-safety/quality-resources/tools/pubrptguide1/pubrptguide1.html