Webinar Transcript - National Quality Strategy Webinar: Using the Nine Levers to Achieve Results

August 19, 2014

Download accessible version of slides (PDF, 2.4 MB)

National Quality Strategy Webinar: Using the Nine Levers to Achieve Results [Slide 1]

Slide 1. Introducing Nine Levers to Support the Aims and Priorities.

Ann Gordon: Good afternoon everyone. We want to go through a few housekeeping items just before we get started with today's presentation.

Housekeeping [Slide 2]

Slide 2. Housekeeping.

Just to let you know, this session is being recorded and an archive, including a transcript, will be available on the Working for Quality Web site in 2 weeks. We will share the link to that site at the end of the presentation. A copy of the slides was distributed with the email reminder for this webinar, and you can also download them from the console. We will have an opportunity to talk with our presenters at the end of the presentation to ask questions. Of course, you can always submit questions through the chat box at the lower left corner of your console. Again, at the end of the presentation we will have a very short survey—it'll only take about a minute to complete. We're just getting some feedback on today's session, and we would like to use that information to help plan more Webinar events like this in the future.

Agenda [Slide 3]

Slide 3. Agenda.

All right—I'd like to take a quick look at our agenda. First, we will get an overview of the nine National Quality Strategy levers from Dr. Nancy Wilson, who is the executive lead for the National Quality Strategy. Then we'll hear from Chris Queram from the Wisconsin Collaborative for Healthcare Quality and Mylia Christensen, Oregon Health Care Quality Corporation, who will talk about how they've used the levers within their organizations to improve quality. We will move to the question-and-answer session for the presenters and for everyone who is attending today. I'd like to hand it over to Dr. Nancy Wilson to provide an overview of the National Quality Strategy and the nine levers. Nancy?

The National Quality Strategy and Nine Levers for Program Alignment [Slide 4]

Slide 4. The National Quality Strategy and Nine Levers for Program Alignment.

Nancy Wilson: Thanks, Ann. For those of you that don't know me, I will just give a 2-minute bio, and that is that I'm a psychology nurse and primary care physician by training, and I've used the skills from both of those careers in my policy work today. I've been involved in quality improvement for most of my career—quality improvement and patient safety. So I would say my vision is really to figure out how we get to a high-reliability, high-quality health system for the Nation. Everything I've been doing is geared in that direction. I think that having a coherent strategy and a focus on priorities and levers can help us in that work. Next slide, please.

Background on the National Quality Strategy [Slide 5]

Slide 5. Background on the National Quality Strategy.

The National Quality Strategy was really a part of what came from the Affordable Care Act. The mandate was really to improve health care, patient outcomes, and population health. That really made it broader than health care and, I think, really challenges us to think about and grapple with the fact that health care may contribute about 20 percent to the health of Americans, but that there's 80 percent that's beyond what health care does. And so, not that health care isn't critically important—it's an incredible piece of how people and families stay healthy. But we have to think more broadly. I say that as someone who has been in health care since the '70s. I really think the National Quality Strategy is a nationwide effort. This isn't a "Fed" initiative, and it's not an HHS initiative. It's really a gift to the Nation to say, can we align, can we think about ways to focus, and how do we do that iteratively together, both public and private sectors. Next slide, please.

The IHI Triple Aim and NQS Three Aims [Slide 6]

Slide 6. The IHI Triple Aim and NQS Three Aims.

So, the three aims of the National Quality Strategy (better care, healthy people/healthy communities, affordable care) might look a lot like the Triple Aim and might sound like the Triple Aim, and that's a good thing. It's really an elaboration on the Triple Aim. Whether you say in your organization, "We've embraced the Triple Aim," or you want to say the National Quality Strategy, it's all okay. I think my friends at IHI and I have had conversations about the fact that the National Quality Strategy really elaborates on the original Triple Aim. So embracing one is embracing the other, and that's a good thing. It means we are aligned. Next slide.

The National Quality Strategy: How It Works [Slide 7]

Slide 7. The National Quality Strategy: How It Works.

This is a busy slide, but I will start with who's involved. And, as I mentioned before, we are really talking about multi-stakeholder groups: private, Fed, States. The stakeholders are everyone. It's really the content of what it is you focus on that really is the priority. That's patient safety, person- and family-centered care, care coordination, prevention and treatment of the leading causes of mortality and morbidity, starting with cardiovascular disease. Until we get that one, we are going to keep plugging away on it. Really enabling health and well-being within communities, and then trying to make health care more affordable. Those are the priorities. They are not going to change until we've solved them, if you will. I had a call the other day where somebody said, "Are the priorities going to change this year?" I said no. We haven't gotten these—we haven't fixed these. We're going to keep these priorities until we really top out on them, if you will.

The levers are really our attempt to describe how everybody, no matter what your business is, can contribute to aligning and driving forward work on the priorities of the National Quality Strategy. It's really an attempt to be inclusive and explain how everyone has a role to play. Yes, it's measurement and feedback and yes, it's public reporting. And we are going to hear some fabulous examples in a few minutes about public reporting and quality improvement, because I think that the folks who are going to talk to you will be able to share how what they do really pulls multiple levers, if you will. But it's also certification and accreditation. There are roles to play no matter what business you are in as it relates to health care or health. You can align with the National Quality Strategy. And, of course, the ultimate goals are the three aims: better health, better care, lower cost. Next slide.

Why We're Here Today: Levers [Slide 8]

Slide 8. Why We're Here Today: Levers.

We had a Webinar a little bit ago talking about the various levers that you can pull, if you will: the core business functions. If you look at what you do, you ask, "How can I incorporate the National Quality Strategy into the work that I do?"

That's really the focus—what can I do? Because it's one thing to say we have these aims and we have these priorities, but if we don't do anything, if we have no intervention, then really we are not making any progress. Next slide, please.

Nine National Quality Strategy Levers [Slide 9]

Slide 9. Nine National Quality Strategy Levers.

So, again, just to elaborate a little bit, this list is not exhaustive.

I'm sure there are more. I'm sure we've missed things. But when we sat down and wrote the National Quality Strategy, what we thought about were critical areas that needed to be addressed. I'm going to start at the bottom. One is workforce. How do you deal with innovation and diffusion? How do you grapple with that issue of standardization versus innovation that we constantly debate about in our discussions between local communities and Feds? That's where it plays out. It's kind of a Fed/State or local/national debate. It's something we all have, which is: how do we support innovation and diffusion of innovation at the same time that we support some standardization, so that we can identify common practices around the country? Health information technology is always an issue that we are grappling with in this day and age. Payment, payment reform, value-based purchasing, et cetera. What is it that we pay for? If we stay in the transactional model that we've been in for the past 50, 100 years, I don't know—how does that impact our ability to get to the outcomes that we want? I think everybody's been trying to say, how can we move away from transactions to funding outcomes over the life span? That's a difficult transition to make.

Consumer incentives and benefits, certification—these are all things that you know. Hopefully, you can see yourself in at least one of these things if not multiple areas where you might be able to contribute. Next slide.

2014 Annual Progress Report: Levers in Action [Slide 10]

Slide 10. 2014 Annual Progress Report: Levers in Action.

I'm going to turn it over to the featured speakers for today and just flag for you that we do have a 2014 Annual Progress Report coming out. It's been a little late, but I'm hoping it's out in September. We are going to feature several State programs that have been doing fabulous work. I would love to continue to hear from you and connect with you, because we are now, even as we speak, out there looking for the programs that we can feature in next year's Annual Report. One of the things, just to give you a bit of a tip, is that we've focused on the New York State Health Foundation diabetes campaign. That has been a terrific collaboration, where it's improved diabetes care for more than 600,000 people. It really focused on diabetes prevention and self-management. It was collaborative in focusing with the work in the Institute for Leadership. It's a great story, and they have data showing that cholesterol management really improved and diabetes management improved as well.

I hope when we are able to release the Annual Progress Report, you take a look at it and get the details of that. Of course, we will share with you how to connect with any of these entities so that you can ask the questions that you have on how they actually did what they did. Another one is the Minnesota Statewide Health Improvement Program. It really focuses on nutrition and physical activity and decreased exposure to tobacco products. It's really translated into fabulous work: 14,000 workers benefited from worksite wellness programs, 429 schools incorporated changes into their food policies by supporting school gardens, and 160 schools adopted the State program to make walking and biking to school safer. They're really bread-and-butter interventions that I think we've all heard about, but putting them into practice is really what the challenge is. It's great to have a good idea, but to actually make it happen and be able to demonstrate that you've got better health outcomes is where we are at this point in time. The California Quality Collaborative is also really focused on collaborating with 27 participating hospitals to prevent readmissions, focusing on high-risk patients. It's doing terrific work. I'm going to stop there and turn it over to Ann as the facilitator and really get into the meat of what this Webinar is about, which is to peel back another layer and say, how did you do this in your community? Because it's one thing for me to be talking about it at the national level, and it's another to be on the ground working with people and actually putting these levers into action. Ann?

Levers in Action: Wisconsin Collaborative for Healthcare Quality [Slide 11]

Slide 11. Levers in Action: Wisconsin Collaborative for Healthcare Quality.

Ann Gordon: Thank you, Dr. Wilson. We are very excited to introduce Chris Queram, President and CEO of the Wisconsin Collaborative for Healthcare Quality.

Chris Queram: Thank you very much. Ann, Nancy, and all of the others, thanks for making it possible for me to present to you. And let me thank those who have joined us today for taking time from their schedules to listen to our story, as well as the one you'll hear next from my friend and colleague out west, Mylia. I am with the Wisconsin Collaborative for Healthcare Quality. If we go to the next slide, I will tell you a little bit about WCHQ.

Our Founding Premise, 2003 [Slide 12]

Slide 12. Our Founding Premise, 2003.

We are a voluntary statewide consortium of organizations that are learning and working together to improve health care in Wisconsin. I purposely used this founding premise, which was put together at the time that we were created, now almost 11 years ago, because I think there are some important words in here that provide a thread that runs through a number of the levers that we will be talking about. One, for starting off, is "voluntary." The collaborative is a voluntary organization. There is no mandate that compels organizations in our State to join us and to work with us. And I say join us because we are structured as a membership organization, and we draw much of our revenue, but also a lot of the human capital that we need to do our work, not only from our member organizations but from other stakeholders, which is the second important word: "consortium." The collaborative is composed at the core of provider organizations that form our membership. We also work very closely with purchasers and payers and consumer advocates, our public sector colleagues in Wisconsin State government, academicians, and others, to do our work in the full sense of the spirit of the term multi-stakeholder. I also thought it was important to emphasize the learning and working together; the presence of the word "collaborative" in our name may be another signal that much of what we do revolves around bringing these disparate perspectives and these different stakeholders together to address very complex and challenging issues related to the three aims of the National Quality Strategy. Whether it's improving health or making care better or making care more affordable, I think there's widespread acceptance now of the reality that no one sector of the industry can tackle those complex problems alone. We've put a great amount of emphasis on a collaborative model that brings all of the key players together, in concert, to address those three overarching aims.

The other comment that I'd make is about "improve health care in Wisconsin." At the time that we were formed, the focus was solely, as the words suggest, on the health care delivery system: improving quality and making care more affordable. Certainly, through the advent of the Triple Aim and then its incorporation in the National Quality Strategy, like many of the organizations similar to us around the country, we do have an interest in gearing our work toward the Three Aims. But when we came together, much of the focus of the initial work of the collaborative was around improving health care. The last comment I'll make about our founding premise, which is then a segue to some of these specific levers, is that from the very beginning our value proposition was conceived around two interrelated activities: one is performance measurement and reporting, and the other is serving as a convener and facilitator of shared learning.

Implicit in that founding premise that I just walked you through are the tools that we use to accomplish those important objectives: performance measurement and shared learning. If we could go to the next slide, that takes us into the first of the levers, and that's Measurement and Feedback.

Measurement and Feedback [Slide 13]

Slide 13. Measurement and Feedback.

If we could go to the next slide, I will give you a few examples of the Collaborative's work in this regard. I mentioned that a core premise or core element of the value proposition is performance measurement and reporting.

Measurement and Feedback in Action [Slide 14]

Slide 14. Measurement and Feedback in Action.

We issued our first public report in the fall of 2003. I will talk about that a little bit later when we talk about another related lever, which is public reporting. But I mention that now because we have almost 11 years of experience in publicly reporting comparative measures. The first report we issued in 2003 was probably more significant for the fact that we released it than for what it contained, because many of the measures that were in existence at that point in time were harvested from secondary data sources—Joint Commission core measures, HEDIS measures—but the aspiration of the founding organizations of WCHQ was to move beyond those established methods of measurement and to look for a way to measure an entire population of patients, not one that's tied to a particular health plan. In the early days of WCHQ, there was an intensely creative period of time from 2003 to 2004 where we built not only the tools but the political will and the organizational capacity to begin to collect data directly from provider organizations, so that we were not dependent upon data from a disparate group of health plans. By doing so, we were able to build a method of reporting that captured all patients and all payers. That was critical to our success because that method of measurement, very early on, gained the respect and the trust of the provider organizations as producing valid and reliable indicators of performance. But a byproduct of that creative process was that we built the capacity, as I mentioned, to take data directly—we needed to be able to store it and retrieve it and use it at different points in time. And so we have also built the capacity to provide feedback to organizations, separate from our public reporting activities, that can be used to identify variation in practice within a physician group at the group level, at the practice site level, or even down to the individual practitioner level.
We do provide regular feedback to our member organizations, and that, I think, is a good illustration of this particular lever. The other comment that I'd make regarding the last bullet, before we move on, is that as time has gone on, we have built a very transparent and consensus-oriented process to identify the measures that we report. It's not only multi-stakeholder, including those three key groups (provider, purchaser, payer) as well as consumers, but also multi-organizational. As this work has matured and the quality ecosystem in Wisconsin has become more complex, as I'm sure it has in Oregon and every other jurisdiction in the country, this work has to be done through multi-organizational consensus processes. We pay a great amount of attention and invest a considerable amount of time and energy ensuring that the measures that we report reflect the areas of greatest interest to a pretty diverse audience.

Public Reporting [Slide 15]

Slide 15. Public Reporting.

The next lever — I've already touched on this a little bit. If we could then go to the next slide I will talk about our public reporting.

Public Reporting in Action [Slide 16]

Slide 16. Public Reporting in Action.

I mentioned that we issued our first public report in the fall of 2003, notable more by virtue of its being released than in terms of what it contained. We then generated our second report, which contained what I'll describe as indigenous measures—ones that we had to build ourselves to accomplish that objective of measuring an entire population of patients. At that point in time, back 10 years ago, performance measurement was very much in its infancy, and there were no good measures that existed for measuring a practice or a group. There were good measures for measuring hospital performance and good measures for measuring health plan performance, but not for provider groups. So we published our first measure with our own attribution logic and method to establish a denominator, and populated that with a numerator composed of both billing and clinical data.
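In concrete terms, a measure built this way reduces to a denominator of attributed, eligible patients and a numerator of those meeting the clinical target. The following sketch is purely illustrative; the diagnosis code, HbA1c threshold, and field names are hypothetical and are not WCHQ's actual specification:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Patient:
    diagnosis_codes: set        # from billing data
    hba1c: Optional[float]      # latest result from clinical data, if any

def blood_sugar_control_rate(patients: List[Patient]) -> Optional[float]:
    """Hypothetical diabetes blood-sugar-control measure.

    Denominator: patients with a diabetes diagnosis (billing data).
    Numerator: those whose latest HbA1c is under 8.0 (clinical data).
    """
    denominator = [p for p in patients if "E11" in p.diagnosis_codes]
    numerator = [p for p in denominator
                 if p.hba1c is not None and p.hba1c < 8.0]
    return len(numerator) / len(denominator) if denominator else None

patients = [
    Patient({"E11"}, 7.2),   # diabetic, controlled
    Patient({"E11"}, 9.1),   # diabetic, not controlled
    Patient({"I10"}, None),  # hypertension only; excluded from denominator
]
print(blood_sugar_control_rate(patients))  # 0.5
```

The same denominator logic, run over every patient a group sees regardless of payer, is what makes an all-patient, all-payer rate possible.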

We issued that first report, which contained a subset of what is now a more comprehensive suite of diabetes measures; they were blood sugar control and cholesterol control measures, and we debuted those not only as our first homegrown measures but also as the first that we reported online. Since that time, we have expanded the measures that we report to over 30, and we report them on two Web sites. The next slide shows the core WCHQ Web site, which is at WCHQ.org.

Public Reporting in Action [Slide 17]

Slide 17. Public Reporting in Action, continued.

It is very robust. It has longitudinal information. For as long as we've been reporting a measure, you can go back and track changes in performance over time, and it provides detail at the practice level and at the practice site level. You can sort it by geography. There are a number of different filters that you can use depending on your level of interest. This particular slide is for osteoporosis screening, and you can see that it shows performance for a subset of the member organizations in WCHQ. Our core Web site is used primarily by member organizations. It doesn't have confidence intervals, stars, or other features that are more commonly found on consumer-facing Web sites. This is more of a benchmarking Web site.

Public Reporting in Action [Slide 18]

Slide 18. Public Reporting in Action, continued.

The next slide shows a Web site that we've experimented with a bit over the last 2 years, with funding from the Robert Wood Johnson Foundation through the Aligning Forces for Quality program. It is a more consumer-oriented Web site that packages a subset of our measures in the context of narrative storytelling, based on research suggesting that people are more inclined to go deeper in exploring information for situations or conditions that are similar to ones they're experiencing. We designed this Web site around that technique for a subset of our measures related to diabetes and hypertension, and we introduced a new character to deal with cancer screening measures. If we could move to the next slide.

Learning and Technical Assistance [Slide 19]

Slide 19. Learning and Technical Assistance.

Being cognizant of time and making sure that Mylia has time for her presentation, I will talk a little bit about Learning and Technical Assistance, if we could go on to the examples of some of the work that we do in this area.

Learning and Technical Assistance in Action [Slide 20]

Slide 20. Learning and Technical Assistance in Action.

I mentioned that a core element of our value proposition revolves around convening and facilitating shared learning sessions, and we've been doing that for the full time that we've been in existence, since 2003. We convene what we refer to as bimonthly assembly meetings. We've had Nancy out for one of those, somewhere along the line, as well as a number of other speakers from national, regional, or State perspectives. We have Learning Action Network events that are geared more specifically to a provider audience and go deeper into process improvement initiatives related to measures that we report: office practice redesign, things of that sort that might be more uniquely of interest to a provider audience. Even though it's a hypercompetitive time in health care, with dramatic amounts of merger and acquisition activity, we have been able to maintain a collaborative spirit in our work and enjoy a high degree of participation from our member organizations, both as learners and as teachers sharing their best practices. We do have an online community, a portal that we use to provide another way to connect our members.

Innovation and Diffusion [Slide 21]

Slide 21. Innovation and Diffusion.

If we can move to Innovation and Diffusion.

Innovation and Diffusion in Action [Slide 22]

Slide 22. Innovation and Diffusion in Action.

Go to the next slide, and I will comment on one of the innovative components of our work that we are very proud of, which is our direct data submission tool, which we refer to as RBS: repository-based data submission. It allows members to securely provide us with a rich amount of patient-level data that we use to generate our measures and feedback reports. And notably, this RBS tool is a CMS-approved qualified clinical data registry for PQRS reporting in 2014, and we're very excited about the possibilities inherent in this designation because we think it will go a long way toward reducing some of the measurement burden associated with responding to data initiatives from multiple sources.

Levers in Action Drive Results [Slide 23]

Slide 23. Levers in Action Drive Results.

If we could go to the next slide, I also wanted to comment briefly on what impact our work has had. This particular slide displays the gains that we've seen on our Web site in uncomplicated hypertension; we just took that as one example. You can see that the trajectory has been upward as a result of reporting, which is good. If we go to the next slide, our member organizations have been very committed to using our experience to contribute to the evidence base that makes this work not only possible but, hopefully, an imperative going forward.

Levers in Action Drive Results [Slide 24]

Slide 24. Levers in Action Drive Results.

We were fortunate to have two studies published in Health Affairs. The first, published in March 2012, was a qualitative study that examined the impact of public reporting on investments made by physician practices in quality improvement interventions, and our takeaway from that was that reporting definitely drives investments in improvement. The second study, which was published a year later, was a more quantitative, empirical analysis that established that there was some correlation between our public reporting and the observed trends in improvement—similar to work that Judy Hibbard did in the early days of hospital performance measurement. This particular study, I think, does help make the business case that what gets measured gets improved. Some of the data shows that for groups that are publicly reported, the rate of improvement is faster than for groups that don't have data publicly reported, which leads us to modify that and say what gets measured and reported publicly gets improved faster. We're trying to contribute to the aphorisms that govern so much of the work in this country.

Reported Reasons for Initiating Quality Improvement Measures, Physician Groups in the Wisconsin Collaborative for Healthcare Quality (WCHQ) [Slide 25]

Slide 25. Reported Reasons for Initiating Quality Improvement Measures, Physician Groups in the Wisconsin Collaborative for Healthcare Quality (WCHQ).

The last slide shows this information graphically and is drawn directly from the second study, which was published in 2013. I will stop there. Hopefully that leaves enough time for Mylia.

Levers in Action: Oregon Health Care Quality Corporation [Slide 26]

Slide 26. Levers in Action: Oregon Health Care Quality Corporation.

Mylia Christensen: Great. Thank you. This is Mylia Christensen in Oregon. Delighted to be with Ann, Nancy, and Chris as we work our way across from east to west this afternoon, looking at levers in action. I believe that in the presentation, beginning with the next slide, you are going to see some similarities between what Chris has presented and what was discussed earlier about the levers.

Mission [Slide 27]

Slide 27. Mission.

The Oregon Quality Corp is an independent nonprofit organization dedicated to improving the quality and affordability of health care in Oregon. We do that by leading community collaboration and by producing unbiased information. Also to Chris's point about stakeholders, we take that very seriously. We work with all members of our community and that includes consumers, providers, employers, policymakers, and health insurers to improve the health of all Oregonians so we are statewide. Chris got a little ahead of us in his start line for public reporting, but we have been around for 14 years. We too have a large cadre of volunteers, over 200 volunteers who participate in 11 working committees that contribute to our work and our progress along the way, and I'm delighted to be able to share some highlights of that work and how well they dovetail into the levers that have been introduced to you earlier in today's Webinar.

Measurement and Feedback [Slide 28]

Slide 28. Measurement and Feedback.

I will start with the next slide, which is again focused on Measurement and Feedback, and this one has to do with providing performance feedback to both plans and providers. We began our public reporting journey in 2006. We have really promoted the idea that with collaborative partners working together, the ability to aggregate large data sets and provide feedback really means that we are all better together in terms of the magnitude of the data and the impact of the data. Just like Chris explained, we too have a very rigorous measurement and reporting technical committee that has met every month over those last 8 years to really guide and advise us on the selection of measures, the rigor of looking at methodology and results and helping us all along the way.

Measurement and Feedback in Action [Slide 29]

Slide 29. Measurement and Feedback in Action.

The next slide just gives you a sense of the backdrop to what kind of data we currently have. We, in Oregon, are also a voluntary program. The members of our collaboration have all agreed, in the spirit of improving quality and affordability, to come together and work as a collaborative. We currently have 15 health plans, and we are also very proud to have added CMS as a new data partner as part of the recent Qualified Entity program. We also have the Oregon Health Authority, which provides Medicaid data on its membership to the database. As of today, we have over 80% of the commercially insured population; we have 92% of Medicare, which now includes Medicare Advantage plans in Oregon and also the newly acquired Medicare fee-for-service data, which we are very excited about; and we have 100% of the Oregon Medicaid population. Similar to Chris in Wisconsin, we have over 30 quality improvement and utilization measures. We started out focused on primary care and chronic care, primarily in the ambulatory care setting, and we grew to include measures focused on the pediatric population and measures around utilization. We are very excited this year — as part of the Robert Wood Johnson Foundation's Aligning Forces for Quality program, we are expanding our measures to include a total cost of care and resource use measure. As part of that activity, we are also working with the Network for Regional Healthcare Improvement; five of us nationally are working on a collaborative project together to roll out this new measure and to really work again on the implementation and also the reporting of that information. We hope to add, later this year and early next year, some specialty measures that will initially focus on maternity and perinatal care, and that is where we are going to add some clinical measures that will really help us to focus more on outcomes.

Measurement and Feedback in Action [Slide 30]

Slide 30. Measurement and Feedback in Action, continued.

The next slide is an example of some of the kinds of measurement and reporting that we do, just as an overview, similar to Chris's. We have multiple audiences for our data. We publish a statewide report every year, which is about to be published in the next couple of weeks. It's really a recap, at a very high level, of what's going on. We have a consumer-friendly Web site, which I will show you an example of in a moment, that's really focused on translating results to consumers. We also have a private and secure provider Web site, which I will share with you in a moment, and we have special reports and analyses that are done from time to time on topics like readmissions or low back pain. All of those are different in their approach and the level of information, but again they are really touching on the multiple stakeholders who can use this information. This is an example of a well-child visit measure for children in the third, fourth, fifth, and sixth years of life. It's an example, again, of the community saying that this was an interesting measure to them, and it was important both for people tracking quality and for improvements and incentives. This was a particular example where we heard feedback about the challenges for clinics that have large Medicaid populations, how difficult it was to really approach or achieve the benchmark, and the challenges between commercial and Medicaid. This was an example of how we were able to break out the results of this measure, show how clinics were doing across Oregon, and compare, again, the performance of those clinics. It also demonstrates, if you look to the far right of the graph, that there were actually clinics with over 30 percent of their clinic population covered by Medicaid that were among the high-performing clinics. Back to how you put this into action.
Some of the work that we do is to actually have clinics share best practices so that others can understand what kinds of things clinics do differently to really raise the bar in this measure and in others, even if they have a different payer mix for clinics that may present particular challenges.

Measurement and Feedback in Action [Slide 31]

Slide 31. Measurement and Feedback in Action, continued.

On the next slide is another slice of how we look at opportunities both for reporting information and for what you can do with it. This is the breast cancer screening measure. As many of you may be aware, the U.S. Preventive Services Task Force has made changes over the last several years, updating the recommended age groups for breast cancer screening. One of the things our measurement and reporting committee wanted to do was to start showing screening information by age band, targeting this information to help several groups. In some cases, a physician may want to know: how was I doing before they changed the age bands, and how am I doing after? Others want to know how we do overall as the age groups have changed. For us, one of the most important areas highlighted in the last couple of years was the screening occurring in populations over age 84. As some of you may know, the evidence about the effectiveness of this kind of screening becomes less clear as age increases, and folks here were very surprised by the number of women over the age of 84 who were getting screened in Oregon. There has been quite a bit of conversation and education about this as an example of using this information to change and inform practice.

Report 3: View Provider Scores by Clinic [Slide 32]

Slide 32. Report 3: View Provider Scores by Clinic.

The next slide is an example of what providers can see; it's one of many pages on the secure Web site. If you look at the blue areas highlighted on the left, these are the measures on this page that represent the results for this individual provider. If you look across the page, you can see how this score compares to the clinic's overall score, the Oregon average, the ABC benchmark, which reflects high performers in Oregon, and then how they did compared to the HEDIS benchmark. If you are a provider, you can click on this and actually drill down to the patient level and see which patients were attributed to these measures. To Chris's point, in an effort to make this as friendly as possible, this information is all downloadable in the provider's office to help facilitate quality improvement activities and tracking of improvements.

Increasing Traffic to Portal [Slide 33]

Slide 33. Increasing Traffic to Portal.

On the next slide, to Chris's point about the benefits of public reporting: none of this work makes any difference unless people are looking at it and paying attention to it. This is actually tracking of the secure provider portal, where you can see over 160,000 page hits last year during one of our big releases, and we're hoping for a similar release this year. For those of you who live in much larger States, this may seem modest, but it is very encouraging data to us over time, because the high points are getting higher and the frequency of ongoing use is also increasing.

Getting Patients In For Needed Services [Slide 34]

Slide 34. Getting Patients In For Needed Services.

Lastly, just a little storytelling. One of our star physicians, Susie Clack with the Pacific Medical Group, has said repeatedly how important these measures and this feedback are to the clinic and the practice. One example was how patients who were getting care from endocrinologists were being lost to followup. By having our information, the clinic was able to see that patients actually weren't getting the recommended treatments, because the primary care physician was missing a very important piece of the loop: regular and routine testing for diabetes patients.

Public Reporting [Slide 35]

Slide 35. Public Reporting.

Next, just kind of taking off on public reporting.

Public Reporting in Action [Slide 36]

Slide 36. Public Reporting in Action.

The next slide is an example of our consumer-facing Web site. You can see, similar to Chris in Wisconsin, options to help people with their care: how to get better care, how to compare care, and content presenting the voice of a patient.

Public Reporting in Action [Slide 37]

Slide 37. Public Reporting in Action, continued.

On the next slide, you can see what patients see as they drill down to a particular geography for a particular heart disease care measure, which gives them greater specificity. There are also drilldown features; to Chris's point about transparency, regardless of which stakeholder you are, you can see how each one of these measures was developed and get any level of technical information that you would like.

Learning and Technical Assistance [Slide 38]

Slide 38. Learning and Technical Assistance.

This next slide, Learning and Technical Assistance, again speaks to how we translate this work and how we make sure that improvement is happening at the practice level.

Learning and Technical Assistance in Action [Slide 39]

Slide 39. Learning and Technical Assistance in Action.

The next slide is the Patient Centered Primary Care Institute Web site. It's a very special project which brings together technical experts, health care providers, staff, patients, advisors, policymakers, academic centers, and others to share information about transformation in primary care in Oregon. We have a huge statewide initiative that is also very tied to the certification and recognition process. We have a whole bevy of resources to help facilitate connections, learning collaboratives, technical assistance, and online modules, and those can be found on this Web site as well.

Health Information Technology [Slide 40]

Slide 40. Health Information Technology.

On the Health Information Technology front, on the next slide and the following slide, this is just one example of how we're using health information technology to focus on population health. In this case, Quality Corp partnered with the Coalition for a Livable Future, and we provided the health data that we collect to allow them to focus on specific mapping around chronic disease, pediatric preventive care, and potentially avoidable ED visits on their Regional Equity Atlas. This is really exciting work, also sponsored by the Robert Wood Johnson Foundation. We are currently working on a toolkit to see if this project might be scalable to other communities. It started with four counties in the Portland area, and it continues to get a lot of great feedback and traction in terms of looking at population health.

Health Information Technology in Action [Slide 41]

Slide 41. Health Information Technology in Action.

The next slide is just an example of statewide results for avoidable hospital ED visits. The most noteworthy thing is the striking variation that occurs across the State, both in commercial and Medicaid and especially when comparing adult and pediatric populations, and the remarkable opportunities to improve care and change where care is delivered, to again advance the Triple Aim.

Health Information Technology in Action [Slide 42]

Slide 42. Health Information Technology in Action, continued.

Lastly, levers in action, this is an example of potentially avoidable ED visits, an area being tracked nationally and regionally, and we are about to come out with another measurement year of reporting on this one.

Levers in Action Drive Results [Slide 43]

Slide 43. Levers in Action Drive Results.

Again, the differences between children and adults, and between Medicaid and other payers, are just striking. This is one we are tracking very closely in Oregon because of the introduction of coordinated care organizations and the transformation underway in Oregon. Those are just really very high-level examples of these levers in action. I'm delighted to be part of the virtual panel today, and I will turn it back to the facilitator for questions.

Questions? [Slide 44]

Slide 44. Questions.


Ann Gordon: Great. Thank you to all of our presenters. We have a lot of great information to share with our audience today. Before we get started in the questions, I want to say we only have about 5 minutes left. We have a hard stop at 4:00 p.m. For the questions that are coming into the chat box, we will follow up with you individually to make sure you get an answer to your questions. Operator, are there any folks in the queue who have a question?

Question: One of the questions we have from the chat box is—I think it's specifically for Chris. The question is, is the repository-based data submission proprietary, or is it publicly available software programmed for your needs?

Chris Queram: That's a great question, first of all. It is a tool that we've built in partnership with a technology company here in Wisconsin; jointly we've developed it and operate it for the benefit of our members. As CMS has shown interest in moving in the direction of more clinical data reporting, and in making tools like this available as one way to accomplish that for PQRS and Meaningful Use, we've been very interested in the possibility of making that technology more widely available. But it's not open source, per se. It is something that we built and would make available.

Question: I was wondering, do either of your groups track information or include membership of providers who primarily provide dental care or eye exams, and relate that to, say, the prevalence of getting eye exams and keeping diabetes under control or diagnosed? Or are they all physician-based?

Chris Queram: This is Chris. Ours is all physician-based. We don't have any experience working with dental providers at this point.

Mylia Christensen: This is Mylia. Dental providers are one of those areas that we would love to move into, because in Oregon the new coordinated care organizations for Medicaid include physical, mental, and dental health. It's becoming a higher priority, so that's one that's on our wish list and very interesting to us. We do collect eye exam information in our diabetes measures. Ours are claims-based measures, and because we get downloads from the health plans and data suppliers I mentioned, we are able to track quite a few of the eye exams that occur, although it's one of our largest areas of challenge, because whether or not someone has vision coverage determines whether a claim is generated. So it's a challenge in terms of that measure.

Questioner: Thank you.

Nancy Wilson: This is Nancy. I just wanted to follow up real quickly on the question. Was there a message behind your question that you wanted to get across to people?

Questioner: No. I just always seem to notice that there doesn't seem to be a connection. I work in insurance, providing dental and vision coverage to members, and I'm frustrated that I can never get measures about those.

Nancy Wilson: Well, it's a really great point, and it is part of where I think we are trying to head, in terms of getting beyond the walls of what is considered traditional health care. We've got a long way to go. So connect with me and help me figure out how to help make that happen.

Ann Gordon: I think we have time for one last question and I know there are several questions in the chat box. Again, we will connect with you individually to make sure your question gets answered and provide contact information. One question that just came in, do you partner with the QIN/QIOs in your own State for the levers in action that you described? Chris, Mylia?

Chris Queram: Sure. This is Chris. Thanks for that question. We are working collaboratively with our quality improvement organization, MetaSTAR, which is now part of the Lake Superior Quality Improvement Network, and we are working together in two areas. One is around data: given that PQRS reporting is a prominent part of the new scope of work for the QINs, we are very interested in exploring ways to leverage our data capabilities to help MetaSTAR in that regard. The other is around ambulatory quality improvement, looking to create greater synergy between our work with our assembly and Learning Action Networks and the work they will be doing under the 11th Scope of Work. So, yes, absolutely.

Mylia Christensen: Same for Oregon. We work closely with Accumeasure, and now Health Insight under the new scope of work; we have partnered for years, and much to Chris's point, we look forward to working together even more closely as things unfold with the new scope of work.

Ann Gordon: Thank you. All right. We are actually at a point where we need to end the session. Nancy, would you like to give any final remarks?

Nancy Wilson: A terrific thank-you to those who have dialed in, and please take a moment to fill out the survey. It literally takes about 30 seconds. We really want to provide you with what you need, so please let us know how we can be most helpful. Thank you. Thank you all for attending on this late August day.

Ann Gordon: Thank you very much. Thank you to our presenters and to everyone for participating today. We look forward to seeing you on a future National Quality Strategy Webinar. Thanks.


Page last reviewed November 2016
Page originally created November 2016
Internet Citation: Webinar Transcript - National Quality Strategy Webinar: Using the Nine Levers to Achieve Results. Content last reviewed November 2016. Agency for Healthcare Research and Quality, Rockville, MD. http://www.ahrq.gov/workingforquality/events/webinar-using-the-nine-levers-to-achieve-results.html