Outcome Measurement Program Case Studies
Summary, Key Findings, and Recommendations
Methodological Components Needed to Ensure Measure Administration Fidelity
A key element of fidelity is the full implementation of critical components of an intervention (Century et al., 2010). The level of administration fidelity of a particular measurement tool can be assessed by examining the key components of the tool’s design and observing the extent to which these components are implemented.
Implementation fidelity, however, does not assess the extent to which a measurement tool is measuring relevant or appropriate constructs. For example, a measurement tool may be fully implemented as the designers intended, but may not measure the most relevant HCBS outcomes, or it may not measure outcomes at all. Therefore, it may be an inadequate or inappropriate instrument to measure HCBS quality outcomes. The NQF report on measurement gaps in HCBS provides a comprehensive framework for the domains of HCBS quality that are important to measure. Using the NQF domains as a framework to select an instrument or combination of instruments to measure HCBS quality will lead to relevant data that can be used to assess program quality, in addition to fidelity in implementation.
Strengths and Challenges in Measure Administration
Strengths of the IM4Q:
- The emphasis on independence in monitoring ensures that data is collected by organizations that do not have a financial or other interest in the service delivery system.
- The mandate that monitoring teams include a person with a disability or a family member of someone with a disability.
- Implementation occurs at the local level, meaning that monitors, the local agencies charged with implementing IM4Q, and AEs understand the local context.
- IM4Q measures key HCBS outcomes, including choice and control, relationships, and inclusion.
- IM4Q provides local and state-level reports that empower local agencies to evaluate system performance and allow Pennsylvania to make data-driven policy and programming decisions.
- Because small organizations are implementing IM4Q at a local level, the IM4Q coordinators at these organizations can observe monitors in the field regularly, contributing to the fidelity of implementation.
- AEs and local agencies noted that they received high-quality technical assistance regarding any implementation questions that arose during the monitoring process.
- A strong relationship between the Institute on Disabilities at Temple University and local and state organizations.
- IM4Q provides opportunities for individuals to participate in an interview process that directly relates to policy at the system level.
- One of the primary strengths of the IM4Q is the extent to which it is woven into the state’s quality assurance processes. The Considerations process, for example, means that actions can be taken to address an individual’s needs while still giving the state system-level data to monitor trends and address system-level needs. An example of this is the focus on ensuring that individuals’ communication support needs are addressed.
- IM4Q has longitudinal data on participants, including a subsample in state-operated residential centers. This allows the state to monitor the effects of policy changes on outcomes.
Challenges of the IM4Q:
- Some of the AE and local agency staff interviewed noted that while the local focus of the IM4Q had positive aspects, the lack of centralized training was a potential barrier to high fidelity.
- Coordinators and interviewers reported challenges recruiting participants due to participant fatigue, as well as the increased difficulty of reaching participants as more people live in their own or family homes. Informational materials about the IM4Q and its purpose have been developed for families in order to improve participation. Reaching the intended sample is essential for fidelity.
- Participants from ODP, AEs, and local agencies noted that some monitors underutilize the Considerations process. Monitors could be more creative or forward-thinking about what might improve the quality of life for the individual they are interviewing.
- Reliability and validity testing of the EDE and other aspects of the IM4Q program is not robust.
Factors, Characteristics, or Components that May Strengthen or Deter Effective Outcome Measurement of HCBS Programs
Positive drivers include:
- IM4Q has buy-in from stakeholder groups and support from ODP, which provides ongoing funding for the program.
- The ability to add and revise items to measure local and state-level goals.
- IM4Q is a person-centered process.
- IM4Q is woven into the state’s quality assurance and monitoring process.
- The ability of local organizations to measure their performance against other organizations and to learn from each other.
- The trust between the independent monitors and participants.
- State agency leadership’s commitment and support for the measurement process to drive quality improvement for LTSS.
- Measurement protocols and training that are designed to assure fidelity in implementation.
- Sufficient levels of training and technical assistance (from the Institute on Disabilities at Temple University) are very important for implementing and interpreting measures.
- The transparency of the use of data and the ability to share information across counties.
- The new provider profiles are designed to show providers the value and usefulness of IM4Q in achieving their goals and objectives.
Deterrents to implementation include:
- In-person interviews conducted by a team of interviewers are costly.
- Decentralized training introduces opportunities for reduced fidelity.
- The time between administering the survey and receiving the results is too lengthy.
- The extensive set of questions may be time-consuming and may limit participants' willingness to participate.
- Some concepts and questions on the survey instrument are too complex for some participants, leading to the possibility of unreliable or missing data. Some sections of the survey can, however, be answered by proxy.
Recommendations:
- The timeliness and accessibility of data are essential for making decisions promptly.
- Some interview participants noted that video-conferencing could be used to reach participants in different areas of the state and in different types of residential settings. Trials could be conducted to determine whether this approach undermines the benefits of face-to-face interviews.
- Information should be shared with support coordinators from the local programs, as support coordinators are key to ensuring quality supports.