Laying the Groundwork for Pharmacy Quality Measurement (Text Version)

Slide presentation from the AHRQ 2010 conference.

On September 28, 2010, Joyce McMahon, Laura T. Pizzi, and Elizabeth Schaefer made this presentation at the 2010 Annual Conference.


Slide 1

Laying the Groundwork for Pharmacy Quality Measurement

Results from the Evaluation of the PQA Phase I Demonstration Project
September 28, 2010

Slide 2

Overview

  • Introduction and background for PQA Phase I demonstration projects:
    • Evaluation goals, overall approach, key research areas, and challenges:
      • Joyce McMahon, PhD (CNA)
  • Results, findings, and recommendations:
    • Generalizable findings across sites/models.
    • Key differences in results across sites/models.
    • Pharmacy staff survey and PQA Consumer Survey:
      • Elizabeth Schaefer, MA (CNA)
    • Interviews with key project staff and overall recommendations:
      • Laura T. Pizzi, PharmD, MPH (Jefferson)

Slide 3

Evaluation of PQA Phase I Demonstration Projects

  • AHRQ contracted with CNA and CNA's subcontractor, Thomas Jefferson University, in June 2008.
  • Primary focus: evaluate PQA's Phase I demonstrations:
    • Phase I: test the feasibility of creating pharmacy performance reports.
      • PQA selected 5 demonstration sites under 3 models.
      • Overall purpose was to evaluate implementation of the programs at the 5 demonstration sites.
      • We determined lessons learned and best practices.
      • We looked for generalizable results as well as different outcomes across demonstration sites.
    • Results can inform PQA's follow-on Phase II plans to test the use of performance reports to improve quality of care.
  • By contract, our focus was on implementation issues only.

Slide 4

Demonstration Models and PQA-Selected Sites

  • Model 1—Health Plan:
    • Highmark in collaboration with CE City and Rite Aid
  • Model 2—Coalition of Health Care Plans:
    • Pharmacy Society of Wisconsin (representing the Wisconsin Pharmacy Quality Collaborative)
      • In collaboration with the University of Wisconsin-Madison; the Wisconsin Department of Health and Family Services; Unity Health Insurance; and Group Health Cooperative of South Central Wisconsin
    • Purdue University School of Pharmacy and Pharmaceutical Sciences
      • In collaboration with the Regenstrief Institute; the Indiana Health Information Exchange; and the Indiana Pharmacists Alliance
    • University of Iowa and the Iowa Foundation for Medical Care
      • In collaboration with Wellmark Blue Cross and Blue Shield, Iowa Medicaid Enterprise, and Iowa pharmacies
  • Model 3—Community Pharmacy Corporation:
    • Outcomes Pharmaceutical Health Care in collaboration with Kerr Drug

Slide 5

Overall Evaluation Approach

  • Our approach considered both primary and secondary data:
  • Primary data collected by the CNA team:
    • Pharmacy staff survey for each demonstration project:
      • Survey created by CNA team—required OMB approval.
    • Interviews of key demonstration project staff:
      • Interview guide created by CNA team—required OMB approval.
    • Teleconferences with PQA and demonstration Project Leaders.
  • Secondary data derived from demonstration projects:
    • Consumer survey data on pharmacy user satisfaction by site:
      • Collected by Avatar International LLC.
    • Claims data collected and analyzed by each demonstration project:
      • De-identified data provided to CNA team.

Slide 6

Key Research Questions (1)

  • What is the most efficacious way to collect and aggregate data on the pharmacy performance measures and from the PQA consumer survey about pharmacy services?
    • For example, were there difficulties in acquiring claims data and/or in calculating claims-based performance measures? Were there difficulties in obtaining representative samples for the PQA Consumer Survey?
  • How well did the demonstration sites reach their objectives?
    • For example, were they able to maintain the pharmacy reporting system and periodically refresh and update the pharmacy reports?
  • How could the demonstration sites have strengthened their measurement efforts?
    • For example, how might the quality of the claims data be improved? Should additional measures be included?

Slide 7

Key Research Questions (2)  

  • What challenges, issues, and technical problems were encountered in creating and populating the template reports? How were these issues resolved?
    • For example, were there difficulties with obtaining and processing claims data? Were there issues with creating Web sites and loading reports?
  • How well were the reports understood by users, such as pharmacy staff? How well were the reports received by pharmacists, managers, and others?
    • For example, do pharmacists believe that the reports could help them improve pharmacy quality in real-world applications?

Slide 8

Key Research Questions (3)

  • How could the PQA-endorsed report templates and the reporting process be improved (e.g., with respect to user-friendliness, comprehension, and the ability to act on the information provided in the reports)?
    • For example, was training sufficient to allow the reports to be easily understood? How should reporting elements be standardized?
  • What are the operational costs and non-financial burdens encountered in collecting the data, generating reports, and using the performance data?
    • For example, how were key staff utilized to conduct required demonstration site activities, such as building reporting templates and creating/populating Web sites?

Slide 9

PQA Claims-Based Pharmacy Performance Measures

  1. Proportion of Days Covered: Beta Blockers
  2. Proportion of Days Covered: ACE Inhibitors/ARBs
  3. Proportion of Days Covered: Calcium Channel Blockers
  4. Proportion of Days Covered: Dyslipidemia Medications
  5. Proportion of Days Covered: Diabetes Medications
  6. Gap in Therapy: Beta Blockers
  7. Gap in Therapy: ACE Inhibitors/ARBs
  8. Gap in Therapy: Calcium Channel Blockers
  9. Gap in Therapy: Dyslipidemia Medications
  10. Gap in Therapy: Diabetes Medications
  11. Diabetes: Excessive Doses of Oral Medications
  12. Diabetes: Suboptimal Treatment of Hypertension
  13. Asthma: Suboptimal Control
  14. Asthma: Absence of Controller Therapy
  15. High-Risk Medications in the Elderly
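The adherence measures above rest on two claims-based calculations: Proportion of Days Covered (the share of days in the measurement period on which the patient had medication on hand, based on fill dates and days supplied) and Gap in Therapy (whether any run of uncovered days exceeds a threshold). The sketch below illustrates the general logic only; the threshold and measurement period are illustrative, not PQA's official technical specifications.

```python
from datetime import date, timedelta

def covered_days(fills, period_start, period_end):
    """Days within the period covered by any fill.
    Each fill is (fill_date, days_supply)."""
    covered = set()
    for fill_date, days_supply in fills:
        for i in range(days_supply):
            day = fill_date + timedelta(days=i)
            if period_start <= day <= period_end:
                covered.add(day)
    return covered

def proportion_of_days_covered(fills, period_start, period_end):
    """PDC = covered days / total days in the measurement period."""
    total_days = (period_end - period_start).days + 1
    return len(covered_days(fills, period_start, period_end)) / total_days

def has_gap_in_therapy(fills, period_start, period_end, gap_days=30):
    """True if any run of consecutive uncovered days exceeds gap_days."""
    covered = covered_days(fills, period_start, period_end)
    run = 0
    day = period_start
    while day <= period_end:
        run = 0 if day in covered else run + 1
        if run > gap_days:
            return True
        day += timedelta(days=1)
    return False

# Illustrative patient: two 30-day fills separated by a 2-month break.
fills = [(date(2010, 1, 1), 30), (date(2010, 4, 1), 30)]
start, end = date(2010, 1, 1), date(2010, 6, 30)
pdc = proportion_of_days_covered(fills, start, end)  # 60 of 181 days covered
gap = has_gap_in_therapy(fills, start, end)          # uncovered run > 30 days
```

As the example shows, the two measures overlap in concept: the same adherence pattern drives both, which is the overlap respondents later flagged when suggesting that either PDC or Gap in Therapy be chosen, but not both.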

Slide 10

Challenges Encountered

  • OMB approval process was slow:
    • Resulted in delays and made it difficult to match interviews and surveys to demonstration site activities as initially planned.
  • Demonstration sites discovered layers of administrative delays:
    • Site-specific IRB approvals were sometimes quite lengthy.
  • Contractual agreements took a long time for demonstration sites to establish:
    • Legal staffs had to review contracts.
    • The more partners, the more complex and/or time-consuming it became to establish agreements.
  • Obtaining claims data posed difficulties as well:
    • Partly due to need for data use agreements.
    • Data formatting issues/inconsistencies caused some delays.

Slide 11

Results, Findings, and Recommendations

  • Overview of substantive project conclusions:
    • Generalizable findings across sites/models
    • Key differences in results across sites/models
    • Focus on results from surveys and interviews
  • Pharmacy staff survey and PQA Consumer Survey:
    • Elizabeth Schaefer (CNA)
  • Interviews with key project staff and overall recommendations:
    • Laura Pizzi (Jefferson)

Slide 12

Pharmacy Staff Survey: Background

  • Purpose: Pharmacists' perspectives on the PQA measures and on quality improvement
  • Development of the survey instrument:
    • Paper survey to be distributed by mail
    • Pilot testing
    • IRB and OMB approval
  • Modules:
    • Respondent characteristics
    • Opinions about the performance measures
    • Opinions about the performance reports
    • Barriers and beliefs about pharmacy quality

Slide 13

Pharmacy Staff Survey: Administration

  • Sampling goal: Draw sample of 100 pharmacists from each demonstration project:
    • Most projects had <100 participating pharmacists.
    • Ultimately, distributed 370 surveys (ranged from 16 to 181 per site).
  • Schedule:
    • December 2009: advance letter
    • January 2010: cover letter and survey instrument
      • Followed 2 weeks later by a reminder letter.
    • Survey responses continued to be returned through March 2010.
  • Received 50 responses (13.5 percent response rate).
  • Probable reasons for low response rate:
    • Time between demonstration project and survey (OMB approval).
    • December through February is an especially busy time for pharmacists.

Slide 14

Pharmacy Staff Survey: Results (1)

  • Training on use of the pharmacy performance reports:
    • About a quarter of respondents were not offered a training program.
    • Respondents generally considered the training materials to be useful:
      • However, most respondents thought there wasn't enough information about actions they could take to improve pharmacy quality.
  • Opinions about the pharmacy performance measures:
    • Most respondents felt that the measures were important to improving patient care.
    • Some respondents were concerned about the effect that product shortages could have on the "gap in therapy" performance measures.
  • Opinions about the pharmacy performance reports:
    • Varying opinions about whether the reports were easy to understand.
    • Varying opinions about the reports overall (Question: "What was your overall perception about the pharmacy performance report?").

Slide 15

Pharmacy Staff Survey: Results (2)

  • Sense of empowerment to improve pharmacy quality:
    • Most respondents (78 percent) agreed that they "felt empowered to improve pharmacy quality as a result of the project."
  • Use of incentives to participate in demonstration project:
    • Very few respondents reported receiving any incentives to participate:
      • Monetary incentive (9 percent among those reporting any incentive).
      • Less tangible incentives (11 percent among those reporting any incentive).
    • Respondents had mixed opinions about whether incentives increased participation.

Slide 16

Pharmacy Staff Survey: Results (3)

Beliefs about barriers to pharmacy quality

  • Barriers in daily work environment (in order of perceived importance):
    • Lack of time
    • Frequent job interruptions
    • Fear of making a medication error
  • Barriers in broader work environment (in order of perceived importance):
    • Physicians not engaged in pharmacy quality programs
    • Lack of organizational support
    • Pharmacy technicians not engaged in quality improvement

Slide 17

PQA Consumer Survey: Background

  • Purpose: Obtain information on consumer satisfaction with pharmacy services so that the information can be included in the pharmacy performance reports.
  • Survey instrument was developed prior to the demonstration:
    • Paper survey distributed and returned by mail.
  • Modules:
    • Pharmacy staff communication with consumers.
    • Written information provided to consumers.
    • Information that consumers received about new prescriptions.
    • Health care provided by pharmacists.
    • Respondent characteristics.

Slide 18

PQA Consumer Survey: Administration

  • Consumer survey was administered for each of the demonstration projects by Avatar International, LLC.
  • Sampled from consumers who had used a pharmacy three or more times in the previous 12 months.
  • Not all demonstration site partners were willing to provide consumer names for the survey:
    • Thus, not all pharmacies and insurers participating in the demonstration are represented in the consumer survey results.
  • Numbers of pharmacies participating in the consumer survey:
    • Highmark / Rite Aid: 51
    • Iowa / Osterhaus: 1
    • Purdue / Wishard: 3
    • Outcomes / Kerr: 5
    • Wisconsin / Unity: 67

Slide 19

PQA Consumer Survey: Results (1)

  • Typical challenges from low response rates:
    • Increases potential for biased responses.
    • Decreases the statistical reliability of the results:
      • Typical criterion for reporting results: 30 responses or more
  • Survey response rates:
    • Highmark / Rite Aid: 10% (ranged from 4% to 18% per pharmacy)
    • Iowa / Osterhaus: 38% (range not applicable because 1 pharmacy)
    • Purdue / Wishard: 22% (range: 19% to 24%)
    • Outcomes / Kerr: 38% (range: 31% to 52%)
    • Wisconsin / Unity: 37% (range: 12% to 60%)

Slide 20

PQA Consumer Survey: Results (2)

  • Average numbers of respondents per pharmacy:
    • Highmark / Rite Aid: 25 (ranged from 7 to 46 per pharmacy)
    • Iowa / Osterhaus: 570 (range: not applicable)
    • Purdue / Wishard: 109 (range: 95 to 121 per pharmacy)
    • Outcomes / Kerr: 94 (range: 78 to 129 per pharmacy)
    • Wisconsin / Unity: 27 (range: 9 to 48 per pharmacy)
  • Numbers of core survey questions (out of 28) for which any pharmacy had fewer than 30 responses:
    • Highmark / Rite Aid: 28 questions (ranged from 36 to 51 pharmacies)
    • Iowa / Osterhaus: 0 questions
    • Purdue / Wishard: 5 questions (ranged from 1 to 3 pharmacies)
    • Outcomes / Kerr: 6 questions (ranged from 1 to 2 pharmacies)
    • Wisconsin / Unity: 28 questions (ranged from 44 to 67 pharmacies)

Slide 21

On-Site Interviews: Background

  • Purpose: obtain in-depth information from project team members at each demonstration site regarding Phase I implementation structure, execution, and lessons learned.
  • Development of the interview guide:
    • Pilot testing
    • IRB and OMB approval
    • 10 constructs
  • Sampling goal: 6 project team members from each demonstration site
  • Schedule: late December 2009—mid-January 2010
  • Each 1-hour interview was conducted by two evaluators in a private setting and recorded with permission.

Slide 22

Analytical Methods and Data Sources: Performance Reports

  • Who completed the analyses for performance measure calculations?
    • University groups, vendors (individual consultants and separate companies); Model 1 completed analysis internally.
  • Did the specifications provided for each measure by PQA, with support from NCQA, meet the project's analytical needs?
    • Overall, respondents felt that the specifications were easy to understand.
  • How did the demonstration sites consider statistical error in reporting actual performance differences among pharmacists?
    • Reporting of variation was not required; 2 sites reported confidence intervals.
  • How did [demonstration site name] employ case-mix and severity adjustment to make fair comparisons?
    • No sites used case mix adjustment to account for differences in populations at the pharmacy level, though 2 sites explored it.
  • How did the demonstration site exclude outlier cases in the measurement?
    • No sites excluded outliers.
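For the two sites that did report confidence intervals, one standard way to express statistical error around a pharmacy's measure rate is a Wilson score interval on the proportion. This is a generic sketch of that approach, not a reconstruction of what those sites actually computed:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a proportion, e.g. the number of
    patients meeting a measure out of a pharmacy's denominator."""
    if n == 0:
        return (0.0, 1.0)  # no data: the interval is uninformative
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return (center - half, center + half)

# A pharmacy where 24 of 40 patients meet a measure (rate 0.60):
lo, hi = wilson_ci(24, 40)
```

An interval this wide for a denominator of 40 illustrates why reporting variation matters: apparent performance differences between pharmacies with small denominators may not be statistically meaningful.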

Slide 23

Analytical Methods and Data Sources: Pharmacy Consumer Survey

  • How was the sample selected to complete the Pharmacy Consumer Survey?
    • All sites used a random sample, but 2 sites generated sample from only 1 pharmacy.
  • How many customers were administered the Pharmacy Consumer Survey?
    • 1 site was able to clearly answer this question; that site reported mailing the survey to 250 patients per store across 50 stores, for a total sample of 12,000.
  • How many customers responded to the Pharmacy Consumer Survey?
    • Only 1 site was able to clearly provide a response rate (Model 1).
  • Was an incentive offered to customers who completed the questionnaire?
    • 3 sites did not offer an incentive; the other 2 sites were unsure as to whether an incentive was offered.

Slide 24

Analytical Methods and Data Sources: Data Verification

  • How did [demonstration site name] verify or audit the data generated for the performance reports?
    • Face validity of the analytic reports was checked by analyst and/or the project leader.
    • Reports were examined for glaring/obvious problems such as unusually high or low quality scores, unusually high or low numerators or denominators, or missing data.
    • One site re-ran the reports to verify technical accuracy.
  • Who verified the results generated for the performance reports and the Pharmacy Consumer Survey?
    • Team analyst generally held this responsibility.

Slide 25

Personnel Training about Pharmacy Quality

  • What were the topic(s) covered?
    • At minimum: PQA overview, definition of measures, report format, and participation requirements.
  • Who participated in the training?
    • All participating pharmacists were asked to complete the training.
  • What was the format of the training?
    • 3 sites used computer-based training; 1 site used live training; 1 site used computer-based training + clinical pharmacists to train staff pharmacists.
  • Besides the initial session, were any additional sessions offered?
    • No other formal training was offered.
  • Overall, how would you rate the training program?
    • Responses were mixed; most were satisfied with the training, but a few felt there was too much content and/or the program was not engaging.
  • What could be done to improve the quality of the training?
    • More interactivity; continuity throughout the project.

Slide 26

Performance Measure Evaluation: Measure Importance

  • Were the measures easily interpreted?
    • Denominators required in NCQA technical specifications were too high; respondents suggested rolling up the measures by drug category.
    • Also issues with score interpretation: what constitutes a "good" score and how to interpret adherence measures (PDC vs. GAP) that overlap in concept.
  • Did the measures address significant health conditions?
    • All respondents said "yes."
  • Did the measures relate to activities that have high financial impact to the pharmacy?
    • Improving the measures would have a modest positive financial impact to the pharmacy, driven by increased medication adherence.
  • Did the measures relate to activities that have high financial impact to the health care system?
    • Respondents agreed that the measures could have a high financial impact to the health care system by reducing use of medical services.

Slide 27

Performance Measure Evaluation: Scientific Acceptability and Feasibility

  • Scientific Acceptability: Do the measures make sense logically and clinically?
    • All respondents with clinical knowledge agreed that the measures make sense clinically.
    • Respondents did not feel that the measures are logical because (1) they fail to capture other conceptual elements of pharmacy quality (i.e., patient satisfaction and medication errors) or (2) they are based solely on pharmacy data.
  • Feasibility: Do you think the measures impose an inappropriate level of burden on the pharmacy system?
    • Models 1 and 3 acquired and analyzed the data without difficulty.
    • Model 2 (coalitions) reported data acquisition to be a major challenge, due to necessary data use agreements.

Slide 28

Performance Measure Evaluation: Overall Perceptions about PQA Measures

  • Do you have doubts about the accuracy of the measures?
    • Respondents reported that the results were accurate assuming the source data were accurate.
    • Concerns pertaining to cash-paying pharmacy customers, "store hoppers," and data entry error were expressed.
  • Do you have doubts about the measures overall?
    • Respondents suggested that asthma measures be re-examined (could not be run for many pharmacies due to insufficient number of patients).
    • Respondents also suggested that either GAP or PDC should be chosen for adherence measures, but not both.
  • Do you have doubts about the relationship between the measures and the quality of customer care at the participating pharmacy?
    • Respondents did not feel the measures were "top of mind" when one thinks about pharmacy quality; "top of mind" issues were customer service attributes (e.g., speed of prescription processing, pharmacists' level of engagement at the point of service, and accuracy of the dispensed prescription).

Slide 29

Data or Measures That Do Not Exist But Would Be Useful

  • Thinking about the 15 PQA measures, are there any diseases that were not but should have been included in the measure set?
    • Other topics that were stated as important: depression, HIV, heart failure, chronic pain control, anti-coagulation, arthritis, COPD, coronary artery disease, oncology, biologics, osteoporosis, hypertension, and antipsychotics.
  • Thinking about the 15 PQA measures, are there any additional elements that should be included in the measure set?
    • Integration of laboratory values (particularly hemoglobin A1c, cholesterol, renal and liver function) was stated as important to future pharmacy quality endeavors.
    • Respondents also reported that measures of patient safety, such as prescription misfill rates, are important.
    • Finally, measures pertaining to medication therapy management (MTM) were suggested (e.g., pharmacists' identification of patients meeting MTM eligibility requirements and the number of eligible patients who successfully received MTM services).

Slide 30

Usability of the Performance Reports

  • How confident are you that the performance reports are accurate measures of quality?
    • Answers ranged from "very confident" to "not confident."
    • Respondents were confident that performance data in the reports were accurate; however, respondents viewed the 15 measures and consumer survey responses as only partially reflective of pharmacy quality.
  • Did you find the performance reports easy to interpret?
    • 2 sites stated that the reports were easy to interpret, while 3 sites expressed pharmacists' challenges in interpreting reports.
    • Sites attempted to proactively address interpretation problems by using a simple format, with key terms defined.
    • Icons used to indicate whether the measure was met vs. not met were reported as useful.
  • Were the performance reports easy to access?
    • Lack of Internet access at the store level (3 sites) posed implementation challenges.
    • These sites pursued an alternative approach whereby the reports were uploaded onto the pharmacies' intranets.

Slide 31

Additional Findings

  • Was any type of incentive program implemented as part of this project?
    • 2 sites did not use incentives; 2 sites provided continuing education credit; 2 sites provided modest monetary incentives.
  • Was there a commitment by the executive leadership of the demonstration site in supporting the PQA Phase I Demonstration project?
    • All sites reported that senior leadership demonstrated commitment, though leaders were typically removed from project-level tasks.
  • Did participation lead individuals to feel empowered to improve quality in the pharmacy?
    • Respondents disagreed that the project made participants feel empowered, citing the word "empowered" as overly emphatic.
  • Will the site continue to report on the pharmacy quality measures after completion of the PQA Phase I Demonstration project?
    • Only 1 site planned to continue use of the PQA measures.

Slide 32

Recommendations (1)

  1. Refine the measures:
    • Revise the adherence measures to include Gap in Therapy or Proportion of Days Covered, but not both.
    • Test aggregate measures for diabetes, dyslipidemia, and hypertension whereby the relevant drug classes are rolled up into a single measure.
    • Redefine the Asthma Control measure such that fewer cans of albuterol are required for inclusion in the numerator, or the measurement period is expanded beyond 90 days.
  2. Refine the technical specifications:
    • Specifications should direct analysts on how to report the measures if the denominator is less than 30 patients.
    • Specifications should provide clarity on how to handle patients who purchase prescriptions at multiple pharmacies, particularly those who have an equivalent number of relevant prescriptions during the measurement period.
  3. Refine the reporting template:
    • Standardized scoring should be implemented such that high (or low) scores would always mean a good outcome.
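Recommendation 2's small-denominator rule can be implemented as simple cell suppression in the reporting pipeline. The sketch below is illustrative (the function name is hypothetical); the threshold of 30 follows the reporting criterion cited earlier for the consumer survey:

```python
def reportable_rate(numerator, denominator, min_denominator=30):
    """Return the measure rate, or None when the denominator is too
    small for the result to be reported reliably (suppressed cell)."""
    if denominator < min_denominator:
        return None
    return numerator / denominator

# A pharmacy with 18 of 25 patients adherent: suppressed (25 < 30).
suppressed = reportable_rate(18, 25)
# A pharmacy with 45 of 60 patients adherent: reportable.
rate = reportable_rate(45, 60)
```

Making suppression explicit in the specifications, rather than leaving it to each analyst, would keep reports comparable across sites.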

Slide 33

Recommendations (2)

  1. Further develop pharmacist training programs such that material is delivered interactively and continuously throughout pharmacy quality endeavors:
    • Training should focus on how to interpret performance reports and what actions can be taken to improve quality.
  2. In future demonstrations, coordinate the calculation of consumer survey results:
    • It would be more efficient for the organization administering the survey to provide tabulations of the survey responses for all of the demonstration projects, in addition to providing the raw data.
  3. Ensure that consumer survey sample sizes start with a sufficient number of customers:
    • Work to obtain even participation across types of pharmacies, in order to obtain a more representative mix of all pharmacy customers.
  4. Consider the merits and problems associated with coalition models:
    • Coalition models are likely to be slow to execute, due to the need to contract with partners and establish extensive sets of Business Associate Agreements (BAAs).

Slide 34

Lessons Learned: Quotes from Interviewees (1)

  • "We learned that it's difficult to give a clear picture of the pharmacy quality with the current set of measures. The measures provided a good overview, but are not comprehensive enough. I think we learned that we are heading in the right direction in evaluating something other than prescription volume to measure pharmacy quality, [but] we need more pharmacist-specific and patient-specific measures."
  • "I was very impressed from the start [by] the interest level of pharmacists in looking at how they can move quality in a positive direction. I was very pleased to see that the majority of pharmacists clearly saw this as a step into the future. What we learned is that we need to give them some additional tools to move this in a positive direction. Although that wasn't necessarily warranted in Phase I, I think based on [our] feedback, it became fairly clear that we needed some additional tools to move PQA forward."
  • "There has to be a clear understanding and set of expectations. Training is key where pharmacists are not really familiar with quality and healthcare reform. They are in the trenches and want to intervene, but they need additional resources and training on quality."

Slide 35

Lessons Learned: Quotes from Interviewees (2)

  • "My biggest concern with the whole project was the number of responses that we're getting, and I think from our perspective, it would have been more prudent to get more corporate buy-in from the beginning."
  • "[I have] concerns about sample size. Getting the desired N of say 30 per pharmacy is a challenge. In my opinion, we should just lower the bar."
  • "I learned that anything that you do with a pharmacist who is currently working needs to be well timed and efficient. If they were concerned that it was going to take any time from their day, they were apprehensive about it."
  • "I would like to see pharmacies and pharmacists move to the point where we control our own scores. We [also] need a way to verify disease states of patients; the only idea I have is that ICD-9 codes be put on all prescriptions. Pharmacists are very much capable and willing to provide quality care, and if they were called out [for a] measure [where they are] lagging behind peers, they would be motivated to improve that. I think they would step up to the challenge."

Slide 36

Lessons Learned: Quotes from Interviewees (3)

  • "My impression at the end was that pharmacies are at the same early stage as we've seen with other venues when addressing quality. They are faced with many of the same kinds of issues but in some ways more complicated because of the attribution issues. In terms of things that are lacking at this point—validation of the quality indicators (why is it a 30-day gap instead of a 40-day gap, where is the empirical evidence for that?)."
  • "It can be done—this was an important lesson we learned. It takes a lot of effort and a lot of resources to do it."
  • "One lesson is that there is still a lot more to do. This is, at best, a toe in the water. It will take a substantial commitment of resources, particularly by the federal government to advance the development and meaningful use of these measures. I think this initiative has a lot of merit and has the potential to really change the medication use process. The potential is there to develop meaningful measures that can be objectively evaluated by providers and those external to payers, and can lead towards improvement. But it's going to take a lot to get there."

Slide 37

Additional Recommendations for Furthering Pharmacy Quality Initiatives

  1. Determine which health care organization(s) should measure pharmacy quality.
  2. Consider an alternative model of quality where pharmacists and physicians are aligned in the measures for which they are held accountable.
  3. Work with pharmacy providers and policymakers to integrate quality improvement services into the pharmacy business model.
  4. Integrate quality into pharmacy school accreditation requirements.

Slide 38

Supplemental Slide: Resources Necessary for Expansion of Pharmacy Quality Initiatives

Examples cited by interviewees, by resource category:
Personnel
  • Project management
  • Analytic resources
  • Data collection/data management
  • Staffing improvements, reworking workflow, more technicians
  • Need for communicating and interacting with the pharmacists
  • Expansion of staff
Technology
  • Information Technology systems (hardware and software required for reporting and analytical components)
  • Internet access at the stores
Funding
  • Equipment
  • Compensation for training staff and training time spent by pharmacists
  • Reimbursement mechanism to compensate pharmacists for achieving quality goals (e.g., pay-for-performance system)
Other
  • More support from corporate office (particularly with respect to having IT needs met via resources from the corporate office)
  • Further pharmacist training aimed at developing their knowledge and skills on how to use performance measure information
  • Ongoing agreement with the payers to continue to provide the data
  • Space (e.g., office space, meeting rooms)

Current as of December 2010
Internet Citation: Laying the Groundwork for Pharmacy Quality Measurement (Text Version). December 2010. Agency for Healthcare Research and Quality, Rockville, MD. http://www.ahrq.gov/news/events/conference/2010/mcmahon-pizzi/index.html