Appendix VII: Public Comments Submitted Between March 7 and March 21, 2008 (continued)

Environmental Scan of Measures for Medicaid Title XIX Home and Community-Based Services Measure Scan as of July 5, 2007

Charles R. Moseley, Ed.D.
Associate Executive Director
National Association of State Directors of Developmental Disabilities Services

Medicaid Home and Community-Based Services Measure Scan Project: Comments and Suggestions

January 8, 2008

This is in response to the request from the Agency for Healthcare Research and Quality (AHRQ) for comments on documents the Agency prepared addressing: (a) program performance measures, (b) client functioning measures, (c) client experience measures, (d) measure constructs and (e) final report outline. As requested by D.E.B. Potter, in her email of 1/1/08, these comments address "big picture" issues regarding the identified instruments, including assessment administration, applicability and usage.

AHRQ summarizes the various existing measures identified through the project across key assessment categories related to program performance, client functioning and client experience in the three documents mentioned above. The charts provide information on each measure and attest to the wide variety of instruments being used by states to assess the quality of home and community based services (HCBS) furnished to the broad scope of Medicaid beneficiaries. The Draft Final Report Outline describes the general path the AHRQ intends to take in its final report but does not provide detail on intended project outcomes or conclusions.

Comments

AHRQ has performed a detailed analysis of the component questions contained in the wide range of assessment tools reviewed by the project. The analysis appears to be consistent with the project's goal of identifying individual measures from existing standardized and nonstandardized instruments that could be used to assess changes in the functioning and experience of individuals receiving Medicaid-financed home and community-based services (HCBS). The Draft Final Report Outline offers an overview of the steps AHRQ has taken to evaluate the measures and some of the key parameters that should be considered in the assessment. The Report Outline does not, however, provide insight into the ways in which AHRQ intends to address many of the critical issues identified by NASDDDS in its previous comments regarding the Measure Scan Project. The following issues and concerns identified in the Association's comments issued on June 12, 2007 remain.

Inappropriate Use of a Single Set of Measures for All Populations. Noting that relatively few formal measures have been specifically designed and tested to address HCBS, AHRQ appears to be following its original plan to develop a single assessment tool with individual measures drawn from existing instruments that have been designed for specific populations and purposes. As mentioned in the Association's previous comments, the appropriateness of this one-size-fits-all approach must be questioned from the outset, given the varying characteristics and needs of the population groups to be assessed and the desire for content validity and test-retest reliability.

Recommendation. NASDDDS recommends that the most valid and appropriate approach to assessing multiple populations is the use of currently available, statistically sound instruments that have been designed to evaluate individual characteristics, experiences and outcomes that are relevant to each of the various population groups under consideration.

Focus on Individual Measures Rather than Instruments. AHRQ identified the need for scientific soundness, reliability and validity, but continues to focus the unit of measurement narrowly, identifying and gathering information on existing performance measures of quality services rather than on the instruments of which the individual measures are a part. This approach appears to ignore the fact that the assessment of reliability and validity typically is made on the entire instrument or evaluation scale and not on its individual component questions. The use of single measures from valid and reliable instruments does not provide any guarantee that the items will continue to be valid and reliable when removed from their original context. This approach does not make it possible to derive benefit from the numerous existing assessment instruments that have demonstrated reliability and validity. Nor does it take advantage of previous years of data that may have been gathered by states using national, standardized, valid and reliable evaluation tools for specific populations.

Recommendation. It is recommended that AHRQ collaborate with state agencies serving each of the population groups receiving HCB services to: (a) identify valid and reliable instruments that currently are in use for assessing the specific population groups for which they are designed, (b) determine the applicability of these instruments for meeting DRA requirements without further modification, and (c) develop implementation strategies for using data drawn from current instruments to evaluate HCB services. This approach strengthens states' current efforts to assess and manage service quality, takes advantage of current valid and reliable measures and produces a more credible outcome.

Quality Domains. The Methodology Report does not reference the measures included in the CMS Quality Inventory. As noted in the Association's previous comments, the CMS Quality Inventory Initiative documented state quality assessment practices in HCB services for persons with developmental disabilities and individuals who are aging across all 50 states and the District of Columbia. The format was developed with significant input from state executive branch associations, including the National Association of State Medicaid Directors, the National Association of State Units on Aging, the National Association of State Directors of Developmental Disabilities Services, the National Association of State Brain Injury Administrators and representatives of the Cash and Counseling demonstration states. Organized within a "Quality Framework," a broad series of quality measures were developed and revised for seniors and individuals with developmental disabilities and organized into seven quality domains. The components of the Quality Framework were built into a new HCBS Medicaid waiver template, backed by new Federal expectations of states regarding the implementation of comprehensive quality management strategies to ensure appropriate state oversight of HCBS waiver-financed services.

Recommendation. It is recommended that AHRQ/Medstat review the quality measures identified in the Quality Inventory and examine the CMS Quality Inventory with particular reference to its method of organization, and consider basing the measurement scan on the seven quality domains contained in the CMS Quality Framework. This approach appears to offer significant advantages over the Institute of Medicine medical quality categories.

Review of State Programs. Over the past 10 years, state Medicaid agencies, developmental disabilities agencies, units on aging, agencies serving individuals with physical disabilities, traumatic brain injury agencies, and others have utilized a variety of national and state-derived assessment instruments to measure system performance and personal outcomes. The Draft Final Report Outline and accompanying charts suggest that AHRQ has reviewed some of the instruments currently in use, but they do not mention the importance of state programs as critical and current sources of information on quality assurance and assessment. No systematic plan was offered to gather feedback from more than a few of the state program agencies that are responsible for overseeing the delivery and financing of home and community-based long-term supports. In view of the extensive investments that states have made in developing their HCBS Quality Management Strategies over the past three years, a more rigorous and objective effort to gather state feedback on the measures would seem to be in order.

NASDDDS believes that it is ill-advised to ignore these efforts by limiting the focus of the review of state practices to contacts with a limited number of individuals and programs. The recommendations made in the previous comments are restated for consideration.

Recommendations.

  1. Quality assurance and management is performed by state agencies. Revise the assessment process outlined in the Methodology Report to include a comprehensive review of current state practice identifying the perspectives of state Medicaid and program agency officials on what matters, what works, what doesn't work, and the linkage between performance measurement, remediation and performance improvement.
  2. States use different approaches to evaluating and tracking individual recipient outcomes, program performance and system responsiveness. As noted above, the National Core Indicators Program is currently being used by 30 states to assess outcomes achieved by and on behalf of individuals with developmental disabilities. Additional states use the Participant Experience Survey to evaluate elder services or employ their own "home grown" measures. It is recommended that the assessment process develop an understanding of each approach, including its strengths and weaknesses.
  3. Gather data from officials who represent state agencies other than the Medicaid agency. As noted above, in the majority of states operational authority for Medicaid programs serving individuals with developmental disabilities has, by interagency agreement, been delegated to the state developmental disabilities agency.
  4. Directly involve representatives from selected national associations representing state executive branch agencies in the Measure Scan process in operational roles that go beyond participation on the Technical Expert Panel, such as advisory groups or focused ad hoc committees.

Methodology Report for Medicaid Home and Community Based Services Measure Scan: Comments and Suggestions
June 12, 2007

This paper offers comments from the National Association of State Directors of Developmental Disabilities Services (NASDDDS) on the recently completed Methodology Report for Medicaid Home and Community Based Services Measure Scan. The Methodology Report is written to describe a "consistent criteria for including and excluding potential measures in the final compendium," and to identify data elements that should comprise each of the potential measures. In reviewing the document, NASDDDS identified several concerns regarding: (a) the proposed focus areas and domains; (b) the proposed process for reviewing state programs and approaches to outcome measurement; (c) the criteria for including or excluding measures as part of the scan; and (d) the proposed data elements. These comments are offered in response to AHRQ's request for feedback, and in an effort to improve the outcome and usability of the final product.

Background. The Deficit Reduction Act of 2005 tasked the Agency for Healthcare Research and Quality (AHRQ) with responsibility for identifying existing measures that could be used or modified for the purpose of assessing Medicaid Home and Community Services in three areas: (a) program performance, (b) client functioning and (c) client satisfaction. In response, as the first step in fulfilling this Congressional directive, AHRQ launched the Measure Scan Project to complete the following activities:

  • Develop, in consultation with relevant stakeholders, measures of program performance, client functioning, and client satisfaction with respect to HCBS services provided through State Medicaid programs.
  • Assess the quality of HCBS services and their outcomes, along with the overall system for providing such services through the Medicaid program.
  • Disseminate best practices information distilled from a comparative analysis of the system features of each State.

AHRQ contracted with the MEDSTAT Group Inc. (Medstat) to conduct "a comprehensive environmental scan of existing measures, to set the baseline of available science for the process of developing any new measures." Medstat is familiar with several current approaches to human service outcome measurement through its staff's development of the Participant Experience Survey, one of the "Current HCBS Quality Measures and Databases" that will be reviewed as a part of the project.

Comments

I. Focus Areas and Domains.

Issue. The Background section of the Methodology Report describes the broad scope of Home and Community-Based Services furnished by the Medicaid program and the extensive array of service populations that are eligible to receive Medicaid benefits. The challenges involved in supporting such a broad range of individuals, with varying treatment and support needs, are noted, as well as the additional issues brought on by the recent trend toward self-directed supports. In the second section, the Methodology Report describes the challenge of identifying a single set of quality measures and domains with the capacity to adequately assess program performance, client functioning and client satisfaction across the broad scope of services, populations, providers and administrative structures that are covered by the Medicaid program.

In view of the extensive reach of the Medicaid program and the unique needs of the populations it serves, one must question whether the adoption of a one-size-fits-all approach to performance measurement is conceptually appropriate or practical, given the varying characteristics and needs of the population groups to be assessed. While theoretically attractive, the emphasis on uniformity derived from a single set of measures should be challenged from the outset as one that necessarily will drive the evaluation measures, and the process itself, to the lowest common denominator, resulting in the selection of indicators so general that they lose their relevance, utility and validity for particular subpopulations. This point was underscored during the implementation of the Participant Experience Survey (PES). In developing the PES, CMS initially set out to create a single instrument that would be applicable to all HCBS waiver programs. After consulting with representatives from the National Association of State Units on Aging and the National Association of State Directors of Developmental Disabilities Services, however, CMS concluded that two versions of the instrument should be prepared: one normed on the elderly/disabled population and another based on the ID/DD population.

Medicaid-eligible individuals who are frail and elderly, have intellectual and developmental disabilities, head injuries, mental illnesses, physical disabilities, or who are eligible for and covered by Medicaid for other reasons have distinctive disability characteristics and long term support needs. Typically, these individuals receive assistance through an array of HCB Medicaid services and supports that operate under service definitions that are tailored to the unique characteristics and needs of each population. Over the years, specific programs, services, providers, funding mechanisms and state/local agencies have been developed to specialize in the provision of necessary supports and services to each group. This focus has been particularly important for individuals with intellectual and developmental disabilities and has made it possible for states to tailor services and supports to meet their unique needs. As a group, these persons experience a wider range and severity of disabilities over a sustained (typically lifelong) period than any other group of Medicaid beneficiaries. As a result, the levels of services and supports that they receive typically are high, intensive, lifelong in duration and expensive to provide. To effectively deliver services and supports to individuals with intellectual and developmental disabilities, state Medicaid authorities in over eighty percent of the states have delegated financial and operational responsibility for managing Medicaid-funded services furnished to these individuals to the state developmental disabilities agency. In the vast majority of cases, the single state Medicaid Agency is only tangentially aware of or involved in the delivery of services to this group of individuals.

Recommendation. Based on the material that has been provided to the Technical Expert Panel, the DRA does not appear to require the development of a single set of measures to apply to all groups of recipients of Medicaid-financed home and community-based services. While acknowledging that a number of performance indicators exist that can appropriately be applied across populations, NASDDDS believes that the most valid and appropriate approach would be to focus on the identification of performance measures that have been designed to assess issues relevant to each of the various population groups. It is recommended that as a first step, AHRQ refocus the approach to the Measure Scan project to one that seeks to identify sets of measures and instruments that effectively and appropriately address individual characteristics and outcomes for each of the population groups served.

II. Program Performance.

Issue. The Methodology Report identifies several areas related to the performance of Medicaid-funded HCB programs and services. All of the areas that are mentioned are covered within CMS's Quality Framework, a format that was developed with significant input from representatives of affected state executive branch associations, including the National Association of State Medicaid Directors, the National Association of State Units on Aging, the National Association of State Directors of Developmental Disabilities Services, the National Association of State Brain Injury Administrators and representatives of the Cash and Counseling demonstration states. The CMS Quality Framework currently is being used by states as the template for organizing the comprehensive Quality Management Strategies that are being designed and implemented as a required component of each state's HCBS Medicaid waiver programs.

Recommendation. It is recommended that AHRQ accept the Quality Framework as the format to organize the selection of performance measures. Further, it is recommended that: (a) an additional "cultural appropriateness" element be included to assess the ability of state Medicaid programs and Medicaid provider agencies to deliver services and supports in a manner that respects the cultural beliefs, practices and languages of the individuals receiving support; and (b) a "timely access" element be added to evaluate the extent to which eligible individuals are able to access needed services and supports in a timely fashion.

III. Client Functioning

Issue. The section of the Methodology Report dealing with client functioning draws an appropriate distinction between acute health care and the long term services and supports furnished to individuals receiving HCB services. The report suggests the focus be on the measurement of two functional outcome components: (a) the unmet need for supports in activities of daily living and in instrumental activities of daily living, and (b) social role functioning. While addressing two key functional areas, this approach omits consideration of several relevant issues that should be addressed.

Recommendation. A more effective approach, we would argue, would be to conceptualize client functioning as described by Rosalie Kane1 in terms of the needs that individuals have to receive: (a) direct support to access work, family and home-life, and community activities; (b) skill development and training to teach individuals to complete daily living tasks and responsibilities without, or at reduced levels of, assistance, access the community and perform work and gainful activity; (c) needed ancillary services, including transportation, assistive technology and case management; and (d) medical and health-related treatment to meet the health, dental, physical, speech and occupational needs of participants in HCB services. This approach places the concept of client functioning within the context of the person's everyday life and offers a more practical format for identifying and tracking performance measures.

Additionally, we believe that the performance measures identified under client functioning should address:

  • Outcome performance: assessing the extent to which the HCB services that are provided are effective in achieving the intended outcomes. That is, do the services accomplish the goals identified in the individual's plan of care?
  • Improving self-direction, autonomy, and personal responsibility: assessing the extent to which Medicaid HCB services build the capacity of the individual, regardless of his or her level of functioning, to take on increasing responsibility for her/his own care and direction.

IV. Client Satisfaction and Experience

Issue. We agree with the proposal to expand satisfaction to include the recipient's perspective on the quality, responsiveness and effectiveness of the services provided and received.

Recommendation. It is recommended that performance measures be included under the aforementioned areas of Client Functioning and Program Performance to address the extent to which Medicaid HCB programs, services and providers actively solicit and respond to information from individuals receiving Medicaid services.

V. Quality Domains

Issue. The Methodology Report proposes that the Institute of Medicine's fundamental aims of health care quality be used as the organizational domains for the quality measures selected under the Measure Scan project. Although these focus areas may be suitable for assuring the quality of acute medical services, their appropriateness for gauging the quality of long term HCB supports furnished under the Federal-State Medicaid program must be seriously questioned. A much more appropriate approach, we believe, would be to build upon the numerous measures and systems that have been developed by states and CMS, separately and in collaboration over the past ten years, to identify, verify and measure quality domains in long term services. In 1999, for example, CMS launched the Quality Inventory Initiative, an ambitious effort to catalog quality assurance practices in states and develop a framework for organizing and measuring key performance outcomes related to the assurance, management and improvement of quality. A broad series of quality measures were developed and revised for seniors and individuals with developmental disabilities. The quality measures were organized into seven quality domains, later called "Focus Areas." A detailed survey called the Quality Inventory was prepared and distributed to all state developmental disabilities agencies, as well as state agencies for the aging. Medicaid program waiver services in each state were surveyed on a broad range of key quality performance measures. Responses were received from all 50 state developmental disabilities agencies plus the District of Columbia and a majority of state programs serving seniors. The Quality Framework was vetted and approved by relevant state agencies and stakeholder groups and widely distributed to the field for review and comment. 
The components of the Quality Framework were built into a new HCBS Medicaid waiver template, backed by new federal expectations of states regarding the implementation of comprehensive quality management strategies to ensure appropriate state oversight of HCBS waiver-financed services.

Recommendation. It is recommended that AHRQ/Medstat rescind its proposal to utilize the Institute of Medicine medical quality categories and instead base the measurement scan on the seven quality domains contained in the CMS Quality Framework. Further, it is recommended that AHRQ review the quality measures identified in the Quality Inventory.

VI. Sources

Issue. The Methodology Report does not reference the need to assess and utilize measures included in the National Core Indicators Program, the broadest and most complete quality indicator measurement system for persons with developmental disabilities in existence. Additional information about the NCI program has been furnished to Medstat and AHRQ under separate cover. The National Core Indicators Program is currently used to assess a wide range of satisfaction and system performance indicators in 24 states.

Recommendation. It is recommended that AHRQ review the structure and functioning of the National Core Indicators performance measures in developmental disabilities programs nationwide and use this process as the core of further development of measures for persons with developmental disabilities receiving Medicaid waiver services. As noted above, NASDDDS does not recommend that the project attempt to develop a single set of indicators for all population groups and waiver programs. A more appropriate and useful approach, it is argued, would be to identify separate performance assessment tools and measures with proven validity and reliability to assess performance in each group served.

VII. Review of State Programs

Issue. While acknowledging the importance of state programs as critical and current sources of information on quality assurance and assessment, Medstat/AHRQ, in the Methodology Report, proposes no systematic plan to gather feedback from state program agencies that are responsible for overseeing the delivery and financing of home and community-based long-term supports. Instead, AHRQ states its intention to use Medstat, a current CMS quality contractor and publisher of the Participant Experience Survey, one of the instruments under review, to "synthesize our existing knowledge of quality measures…and fill in gaps with calls to known state contacts." Given the extensive investments that states have made in developing their HCBS Quality Management Strategies over the past three years, a more rigorous and objective effort to gather state feedback on the measures would seem to be in order.

Over the past 10 years, separate state Medicaid agencies, state developmental disabilities agencies, state units on aging, agencies serving individuals with physical disabilities, traumatic brain injury agencies, and others have utilized, and in many cases developed, a variety of instruments and approaches for measuring system performance and personal outcomes. NASDDDS believes that it is ill-advised to ignore these efforts by limiting the focus of the review of state practices to contacts with a limited number of individuals and programs.

Recommendations.

  • Quality assurance and performance measurement is performed by state agencies. Revise the assessment process outlined in the Methodology Report to include a comprehensive review of current state practice identifying the perspectives of state Medicaid and program agency officials on what matters, what works, what doesn't work, and the linkage between performance measurement, remediation and performance improvement.
  • States use different approaches to evaluating and tracking individual recipient outcomes, program performance and system responsiveness. As noted above, the National Core Indicators Program is currently being used by 24 states to assess outcomes achieved by and on behalf of individuals with developmental disabilities. Additional states use the Participant Experience Survey to evaluate elder services or employ their own "home grown" measures. It is recommended that the assessment process develop an understanding of each approach, including its strengths and weaknesses.
  • Gather data from officials who represent state agencies other than the Medicaid agency. As noted above, in the majority of states operational authority for Medicaid programs serving individuals with developmental disabilities has, by interagency agreement, been delegated to the state developmental disabilities agency.
  • Directly involve representatives from selected national associations representing state executive branch agencies in the Measure Scan process in operational roles that go beyond participation on the Technical Expert Panel, such as advisory groups or focused ad hoc committees.

VIII. Data Elements

Issue. AHRQ has expressed its intention to focus the unit of measurement narrowly, identifying and gathering information on existing performance measures of quality services rather than on the instruments of which the individual measures are a part. This "reinventing the wheel" approach seems ill-advised. First, the assessment of reliability and validity is typically performed on the entire instrument or evaluation scale and not on its individual component questions. The selection of single measures would not provide any information regarding validity or reliability. Second, focusing on individual measures would make it impossible to derive any benefit from the numerous existing assessment tools that have demonstrated, over time, their ability to effectively and reliably measure key characteristics and abilities of the individuals and populations being assessed. Third, the results of this approach may encourage states to abandon effective assessment instruments that are able to produce detailed and relevant performance and outcome data on specific population groups in favor of more generalized, and less useful, information.

Recommendation. It is recommended that AHRQ perform an analysis of existing quality assessment instruments, such as the National Core Indicators, the Council on Quality and Leadership Quality Outcomes, the PES and state developed quality management and assessment tools, to identify currently used measurement approaches with proven reliability and validity data.

IX. Conclusion

The Measure Scan project is designed to identify key performance measures to assess the performance and outcomes of home and community based services furnished through the Medicaid program. Medstat's Methodology Report outlines an ambitious process designed to identify a single set of indicators that can be used to assess the services and supports received by all Medicaid long term support beneficiaries regardless of the nature of their needs and the types of services required. NASDDDS believes that this approach ignores important differences between the populations being served that directly impact the nature of the services provided, the outcomes that are expected to be achieved and the measures that should be selected to evaluate and track performance. We are concerned that the process will result in a series of broad indicators that are "a mile wide and an inch deep," providing minimal amounts of useful information across a range of variables that may or may not be directly relevant to each of the various population groups being reviewed. Recommendations are provided to improve the methodology of the scan process.

NASDDDS appreciates the decision of AHRQ to involve representatives of the affected state executive branch associations as members of the Technical Expert Panel and its willingness to receive what we hope are considered constructive suggestions on the methodology it intends to use to complete the Measure Scan project. We recognize the scope and challenge of AHRQ's task and are committed to providing whatever assistance we can to help ensure that the final product of this work can be effectively utilized by state developmental disabilities agencies.


1 Kane RA, Kane RL, Ladd RC. The heart of long term care. New York: Oxford University Press; 1998.


Jeffery Thompson, M.D.
Medical Director
Washington Department of Social and Health Services

While I think that all these tools are great and certainly add to the quality information, I see a hole in using claims-based information to either compare HCBS contractors or look at regional variation.

For example, we at Medicaid know that poor adherence to mental health medications leads to increased rehospitalization, excessive emergency room (ER) use, and increased skilled nursing facility (SNF) placements. Correlating medical claims data with HCBS data yields a rich data set for detecting variation in care and identifying "best practices." Along those lines, looking at transfer rates between HCBS and SNF/LTC settings, rehospitalization rates, and ER use rates would add a great deal.

Finally, looking toward the future and the "never events," I did not see any use of these measures, including:

  • Stage 3 or 4 pressure ulcers acquired after admission to a health care facility.
  • Patient death or serious disability associated with a medication error (e.g., error involving the wrong drug, wrong dose, wrong patient, wrong time, wrong rate, wrong preparation, or wrong route of administration).
  • Patient death associated with a fall while being cared for in a health care facility.
  • Patient death or serious disability associated with the use of restraints or bedrails while being cared for in a health care facility.
  • Sexual assault on a patient within or on the grounds of a health care facility.

James Gardner, Ph.D.
President and CEO
Council on Quality and Leadership

March 12, 2008

D.E.B. Potter
Project Officer, HCBS Measure Scan Project
Agency for Healthcare Research & Quality
540 Gaither Road, Suite 500
Rockville, MD 20850

Dear Ms. Potter:

Let me take this opportunity to thank you, Sara Galantowicz, and all of your colleagues for the fine work that you have done in preparing the Draft Final Report of the Environmental Scan of Measures for Medicaid Title XIX Home and Community-Based Services.

I want to call your attention to two technical issues and then close with a question.

  1. In Table a.V.1a, CQL is incorrectly identified as the Center on Quality and Leadership. CQL should be identified as the Council on Quality and Leadership.
  2. CQL designed and validated the Personal Outcome Measures with people with a wide range of disabilities. We have always included in our research significant samples of people with mental illness, developmental disabilities, and intellectual disabilities. You can find these data and research in Gardner, Nudler, and Chapman (1997), "Personal Outcomes as Measures of Quality," Mental Retardation, 35(4); Gardner, Carran, and Nudler (2001), "Measuring Quality of Life and Quality of Services through Personal Outcome Measures: Implications for Public Policy," International Review of Research in Mental Retardation, 24; and Gardner and Carran (2005), "Attainment of Personal Outcomes by People with Developmental Disabilities," Mental Retardation, 43(3).
  3. My query is why the Personal Outcome Measures did not meet the threshold requirements for inclusion in Table A.5.2.b. In short, we have published our research in peer-reviewed scientific journals (Scientific Evidence); we have designed and tested the measures for people with mental illness, developmental disabilities, and intellectual disabilities (HCBS Populations of Interest); and data are obtained from a single survey respondent and reported in aggregate as the number of respondents with the outcome present divided by the total number of respondents sampled (Feasibility of Data Collection). Finally, the Personal Outcome Measures have a documented use in state programs supporting community-based services for people with disabilities, are used in multiple clinical settings, and have been, or are currently, used in 2-4 states as quality indicators for the HCBS Waiver program.

Thank you for this opportunity to contribute to this important national initiative.

Sincerely,

James F. Gardner, Ph.D.
President and CEO

Page last reviewed June 2010
Internet Citation: Appendix VII: Public Comments Submitted Between March 7 and March 21, 2008 (continued): Environmental Scan of Measures for Medicaid Title XIX Home and Community-Based Services Measure Scan as of July 5, 2007. June 2010. Agency for Healthcare Research and Quality, Rockville, MD. http://www.ahrq.gov/professionals/systems/long-term-care/resources/hcbs/hcbsreport/hcbsapviia.html