Chapter 9. Developing Metrics and Collecting Data

Regional Coalition Collaboration Guide

Developing Metrics

A metric is a standard measure for assessing performance in a particular area. Metrics are essential for any program directed at continuous improvement. Regional coalitions should develop metrics that cross hospitals, physicians, and employers. Doing so shows how stakeholders are interconnected and ensures compatibility of results.

In general, a measure should target a specific area and be based on accurate and complete data. A metric also should convey performance clearly, in a timely and relevant manner. Regardless of which metrics a coalition settles on, the Better Quality Information (BQI) sites recommend carefully building consensus around a small number of measures (3 priorities rather than 30) at the outset. Once these measures are put into practice and trust among participants grows, coalitions can expand the number of metrics.
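
As a concrete illustration only, the attributes above can be captured in a simple data structure. The field names below are hypothetical, not a BQI specification; a minimal sketch in Python:

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """One candidate performance measure (field names are illustrative)."""
    name: str             # e.g., "Patient experience with hospital care"
    target_area: str      # the specific area of performance being assessed
    data_source: str      # where accurate, complete data will come from
    reporting_cycle: str  # how often results are conveyed (timeliness)

# BQI advice: build consensus around a small starter set before expanding.
MAX_STARTER_MEASURES = 3

def starter_set_ok(measures: list[Metric]) -> bool:
    """True if the initial consensus set stays small (3 priorities, not 30)."""
    return 0 < len(measures) <= MAX_STARTER_MEASURES
```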

In its early stages, Massachusetts Health Quality Partners worked with a consultant who recommended that the coalition begin by measuring patient experience with hospital care. This idea appealed to participants because there was a clear path for collecting and using these data (for example, a data collection instrument had already been developed, and scientific methods for interpreting the data were well established). Although not all clinicians were convinced that patient experience is an important part of quality care, Massachusetts Health Quality Partners saw the potential for an early success in this approach and recognized how well the public could relate to these data.

Collecting Data

Each BQI site has devised its own approach to data collection. Regardless of whether a coalition uses health plan claims data, Healthcare Effectiveness Data and Information Set (HEDIS) results, or data from physician practices, there are general issues that new coalitions need to consider. The Institute of Medicine has identified six challenges in collecting and reporting data that coalitions should expect:

  1. Inefficiencies associated with performance measurement.
  2. Variations among performance measurement systems.
  3. Organizational and cultural issues.
  4. Technological barriers.
  5. Economic pressures.
  6. Competing priorities.

Common questions for newly forming regional coalitions to consider involve data collection and analysis: Who is the primary customer for the data, physicians or consumers? Will the data be used for quality improvement or as leverage for change? Currently, the primary users of data reported by BQI sites are physicians, who use the data to improve their care delivery. Only in the past couple of years has the focus shifted to include consumer use. Many sites agree that, at this point, consumers either are not interested in publicly reported data or may not know how to use them. However, one of the consumer advocacy groups represented on Massachusetts Health Quality Partners' board developed a quality council to engage the public in determining the kind of information that is most useful for consumers.

The following sections address basic data issues that new coalitions must handle to reassure stakeholders and maintain trust.

Data Validity

Data validation is a systematic process for reviewing a body of data against a set of criteria to ensure the data are adequate for their intended use. Developing a data validation process early is important for securing participant buy-in and for building trust that the process is effective.

Data validation is an integral component of the Wisconsin Collaborative for Healthcare Quality's measurement model. The collaborative uses a Web-based data submission tool that allows participating medical groups to submit performance measure results for reporting through the collaborative's Web site. The tool requires the groups to organize the administrative and clinical data files required for calculating the collaborative's measures in a consistent format, which facilitates auditing the data behind each measure.

Careful checks determine whether member organizations applied the measurement specifications in a manner that yields comparable results. Review of the data warehouse construction also ensures that the data included have been pulled from all available and appropriate sources. Each organization within the Wisconsin collaborative must validate its denominator files for each data submission in the spring and fall. Additionally, members are randomly assigned numerator validations during each cycle and must supply the data files and programming code used to obtain their results. Audits of the data are conducted randomly or as requested by the Wisconsin collaborative's board or business partners.
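
A minimal sketch of what such checks might look like, assuming a simplified record layout (the `is_eligible` predicate and `pseudo_mrn` field are hypothetical stand-ins for the collaborative's actual specifications):

```python
import random

def validate_denominator(rows, is_eligible):
    """Flag denominator rows that fail the measure's eligibility criteria
    (age range, diagnosis codes, and so forth)."""
    return [row["pseudo_mrn"] for row in rows if not is_eligible(row)]

def assign_numerator_validations(member_orgs, k, seed=None):
    """Randomly select member organizations for numerator validation this
    cycle; selected members must also supply data files and program code."""
    rng = random.Random(seed)
    return rng.sample(member_orgs, k)
```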

Minnesota Community Measurement has two levels of data validation because it aggregates data from 10 different sources. The first level is at the health plan, through the HEDIS validation required of accredited health plans. The second level occurs when the data reach the Minnesota coalition: each file is validated for accuracy, and any questions that cannot be resolved easily are sent back to the data source.

Minnesota Community Measurement also has data submitted directly from medical groups. The coalition has an extensive guide that walks a medical group through each step of pulling the data. Coalition staff members are available to answer questions or to visit the medical group on site to clarify the process and overcome barriers. The Minnesota coalition's policy requires a Minnesota Community Measurement staff member to certify the denominator of each measure at midprocess, before the group moves forward with extracting data from an electronic medical record or abstracting data from a paper record. Further, the coalition reserves the right to visit each medical group on site to validate each step of the process and certify that it meets coalition requirements.
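
The flow of a two-level review with a midprocess certification checkpoint might be sketched as follows; the check structure and function names are assumptions for illustration, not Minnesota Community Measurement's actual tooling:

```python
def coalition_validate(file_rows, checks):
    """Second-level validation at the coalition: run each accuracy check
    and collect the questions that must go back to the data source."""
    questions = []
    for description, passes in checks.items():  # {description: predicate}
        if not all(passes(row) for row in file_rows):
            questions.append(f"Please clarify: {description}")
    return questions  # an empty list means the file is accepted

def certify_denominator(measure, denominator, certified_by=None):
    """Midprocess checkpoint: a staff member must sign off on the
    denominator before the group extracts or abstracts chart data."""
    if not certified_by:
        raise RuntimeError(f"Denominator for {measure!r} is not certified; "
                           "do not proceed to data extraction.")
    return {"measure": measure, "n": len(denominator),
            "certified_by": certified_by}
```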

Transparent Data Collection Methods

The coalition's data collection methods must be transparent to the entity being reported on. When there is trust in the credibility of the data and results, medical groups are more likely to support publicly reporting their data.

At the Indiana Health Information Exchange, clinical data from labs, hospitals, transcription notes, and so forth are collected electronically from those institutions without requiring manual effort by physicians. The exchange gathers claims data from payers. Minimal, specific point-of-care data are gathered from physician offices through a variety of tools designed to be as unintrusive as possible to the physician's environment. The Indiana exchange does not pull data from paper patient files or require physician office staff to do so.

For tests performed in the office, and when results or procedures are not available through insurance claims or labs, data for specific measures need to be collected and forwarded to the Indiana exchange. These data can be extracted electronically from the physician's electronic medical record; faxed on scannable, optical character recognition forms; entered through a Web application by physician office staff; or faxed to the exchange for manual data entry. Labs that are not currently contracted with the Indiana Health Information Exchange to send data directly to the data aggregator (Regenstrief Institute) can send spreadsheets or other electronic files directly to the exchange.
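
One way to picture the intake side is a simple dispatch on the submission channel. The channel names and payload handling below are hypothetical; the point is only that each route ends in the same normalized form:

```python
def parse_emr_extract(payload):
    """Electronic pull from the practice's EMR; assume structured records."""
    return {"status": "ready", "records": payload}

def parse_ocr_fax(payload):
    """Scannable optical character recognition form; flag for human review."""
    return {"status": "needs_review", "records": payload}

# Hypothetical channel tags for office results not found in claims or labs.
PARSERS = {
    "emr_extract": parse_emr_extract,
    "ocr_fax": parse_ocr_fax,
    "web_entry": parse_emr_extract,   # staff entry via the Web application
    "manual_fax": parse_ocr_fax,      # plain fax, keyed in by exchange staff
}

def ingest(submission):
    """Route an office-reported result to the handler for its channel."""
    handler = PARSERS.get(submission["channel"])
    if handler is None:
        raise ValueError(f"unknown channel: {submission['channel']}")
    return handler(submission["payload"])
```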

Data Testing and Credibility

Because the Indiana Health Information Exchange uses medical claims, point-of-care data, and clinical data collected from hospitals, labs, radiology, and the RxHub National Patient Health Information Network™ (a network that provides secure access to more than 90 percent of people with commercial prescription coverage in the United States), its data are richer than claims data alone. The Indiana exchange bases scores on data found for all patients, not simply those covered by Medicare or by commercial payers enrolled in the program. Scores are determined by evaluating results for all of a physician's patients, including those who are uninsured, members of nonparticipating payers, and so forth. Because the coalition draws on hospitals, clinics, and labs throughout the area, it captures data on many patients.

The Indiana exchange first tests reports internally and then tests them with its Measures Subcommittee, which consists of physicians and payer representatives. After this step, the coalition tests reports with physicians and physician groups and then tests them with its larger Measures Committee before moving to production.

Confidentiality

Creating internal safeguards for data and establishing confidentiality protocols before recruiting stakeholders also will enhance the coalition's credibility and build trust among stakeholders. All BQI sites encrypt their data and require all stakeholders to sign confidentiality agreements. Members of the Wisconsin Collaborative for Healthcare Quality assign pseudo-medical record numbers to files submitted during the data validation process, preserving the ability to cross-reference patient files if necessary. This step also protects patient confidentiality by ensuring that any patient-identifiable information remains with the host organization. Minnesota Community Measurement's policy addresses data use, including health data collection and measurement specification, data confidentiality, data contributor participation, public reporting of data, and release of coalition data.
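
The pseudo-number step can be pictured as follows. This is a sketch under an assumed field name (`mrn`), not the Wisconsin collaborative's actual procedure; the essential property is that the crosswalk never leaves the host organization:

```python
import secrets

def assign_pseudo_mrns(records):
    """Replace real medical record numbers with random pseudo IDs before
    submission. The crosswalk stays with the host organization, so files
    can still be cross-referenced if a validation question arises."""
    crosswalk = {}  # real MRN -> pseudo MRN; retained by the host org only
    sanitized = []
    for rec in records:
        if rec["mrn"] not in crosswalk:
            crosswalk[rec["mrn"]] = secrets.token_hex(8)
        sanitized.append({**rec, "mrn": crosswalk[rec["mrn"]]})
    return sanitized, crosswalk
```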

Data Concerns

Self-Promotion

Stakeholders often express concern that competitors in the coalition will use publicly reported data for marketing purposes (for example, "We're rated number 1."). Data-use agreements should address this concern by having participants agree not to use data results for self-promotion.

Low Rating

One particularly difficult aspect of reporting involves low ratings that make a participant look bad. The California Cooperative Healthcare Reporting Initiative facilitates participant forums to help address and resolve the issues underlying accurate reporting of scores. During these forums, every participant can express his or her concerns, problems, and questions about the data and their impact. Coalitions should frame low ratings as opportunities for improvement rather than react punitively or treat them as shameful.

Determining the Cut Point

If the cut points between high and low ratings are not carefully defined, one group can end up with three stars and another with two when in fact their performance is not significantly different. The California Cooperative Healthcare Reporting Initiative worked with its stakeholders and national analytic experts to define a methodology that addressed this issue.
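
One common way to make cut points respect statistical significance (offered as an illustration, not as the initiative's actual methodology) is to rate each group by where its confidence interval falls relative to the overall rate, so that statistically indistinguishable groups receive the same number of stars:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a group's measure rate."""
    if n == 0:
        return 0.0, 1.0
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

def stars(successes, n, overall_rate):
    """Three stars if the interval lies wholly above the overall rate,
    one if wholly below, and two if it overlaps the overall rate."""
    low, high = wilson_interval(successes, n)
    if low > overall_rate:
        return 3
    if high < overall_rate:
        return 1
    return 2
```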

Data Challenges

Data challenges in regional coalitions range from how to use the data to who "owns" the data. New coalition leaders need to be ready to address these concerns up front and use them as opportunities to increase transparency and improve the coalition's credibility.

All BQI sites have developed processes to help resolve data concerns and challenges. Specific examples of data partner meetings follow.

Data Partner Meetings

The Center for Health Information and Research holds quarterly meetings that bring together all the data partners to discuss current and future initiatives. One goal of the meetings is to build relationships; once built and maintained, these relationships foster more collaboration and information sharing, with the ultimate goal of improving community health in Arizona.

All Participants Meetings

The California Cooperative Healthcare Reporting Initiative holds biannual All-Participants Meetings where staff present results for the:

  • HEDIS data collection project.
  • Health Maintenance Organization Consumer Assessment of Healthcare Providers and Systems member survey.
  • Patient assessment survey.
  • Special studies.

During a HEDIS results presentation, for instance, the analyst identified quality improvement opportunities based on low rates, large variation across the California Cooperative Healthcare Reporting Initiative's plans, and poor performance compared with the National Committee for Quality Assurance 2006 national percentiles. Among measures with rates below 60 percent, the analyst highlighted some that could be improved by sharing best practices and others where an opportunity exists for all plans to improve.
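
The screens just described lend themselves to a simple filter. The 60 percent floor comes from the presentation above; the variation threshold and data layout are assumptions for illustration:

```python
def flag_opportunities(results, national_percentile, floor=0.60, spread=0.15):
    """Flag measures for quality improvement discussion: low rates, wide
    variation across plans, or performance below a national benchmark.
    `results` maps each measure to {plan: rate}."""
    flagged = {}
    for measure, plan_rates in results.items():
        rates = list(plan_rates.values())
        reasons = []
        if min(rates) < floor:
            reasons.append("rate below 60 percent")
        if max(rates) - min(rates) > spread:
            reasons.append("large variation across plans")
        if sum(rates) / len(rates) < national_percentile.get(measure, 0.0):
            reasons.append("below national percentile benchmark")
        if reasons:
            flagged[measure] = reasons
    return flagged
```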

Throughout each presentation, members of the group are encouraged to ask questions and raise concerns. At the meeting's end, participants are invited to provide thoughts on opportunities for improvement for particular measures.

Other Approaches

Physician Council

Massachusetts Health Quality Partners also holds meetings similar to those of the Arizona and California coalitions, meeting quarterly with the coalition's Physician Council and board (data partners sit on one or both of these groups). The council consists of medical directors from physician organizations across Massachusetts who have come together under the coalition's umbrella. The Physician Council's top priority is guiding Massachusetts Health Quality Partners in establishing a collective set of clinical and service quality improvement priorities that can best be accomplished through collaboration with other coalition health care stakeholders. Two members of the council sit on Massachusetts Health Quality Partners' board of directors.

In addition to the Physician Council meetings, Massachusetts Health Quality Partners has regular meetings with data partners and Physician Council committees, such as the BQI Rapid Response Team, on specific uses of the data it receives as well as on reporting formats and messages.

The Massachusetts coalition has established a process in which physicians review physician grouping data and final results through a secure, private Web site or on compact discs so that grouping inaccuracies can be corrected before public release. If physicians express concerns about the measures, those concerns are discussed with the Physician Council. Coalition staff and board members review any Physician Council recommendation that a measure not go public; if all are in agreement, the measure is not made public. Withholding questionable measures increases confidence that any data reported will be accurate.

"Road Show" Approach

Before its first public launch, Minnesota Community Measurement conducted a 30-city "road show" around Minnesota for coalition leaders to present data results to providers. During the tour, the Minnesota coalition was successful in defusing provider concerns by framing the launch as a way to improve the system, not as a way to punish or embarrass anyone.

In general, the Minnesota coalition's system of review and validation before public reporting allows for discussion and debate. The National Committee for Quality Assurance also has been involved with Minnesota Community Measurement since its inception and has helped with implementation, especially with the sampling methodology needed for hybrid measures.

Member Work Groups

The Wisconsin Collaborative for Healthcare Quality has benefited from the work of an ambulatory care specifications workgroup that has been in existence for more than 3 years. Meeting once a week through teleconference, the workgroup is a vivid example of the power of collaboration in devising innovative approaches to complex measurement issues. Composed of quality measurement and improvement professionals from the Wisconsin collaborative's member organizations, the workgroup is the source of the collaborative's distinctive "all patient, all payer" measurement methodology that focuses reporting at the population level for all eligible patients, regardless of source of payment. The ambulatory care specifications workgroup oversees the development and maintenance of measurement specifications, the cycle of data submission and reporting, and enhancements to the Web-based suite of measurement tools.
