Outcome Measurement Program Case Studies

Appendix B:
Modified Consolidated Framework for Implementation Research Constructs

Alignment of POMs® Components with CFIR

Framework Domain: Intervention Characteristics

External vs. Internal Development:  The first factor considered is the degree to which stakeholders “buy” into the tool. CFIR research indicates that programs that are centrally controlled, rather than developed by stakeholders as a “grassroots” solution to a problem or challenge, risk implementation failure.  Externally developed tools like the POMs® are more likely to be successful if local decision-making occurs and if the tool is tailored to the organization.

The Intervention Source for the POMs® is external, and this influences the selection of the tool within the small group of organizations in this study. Key informants involved with a consortium of HCBS organizations receiving grant funding to implement the same outcome measurement tool reported that the consortium members had lengthy discussions regarding creating their own tool or adopting an existing tool. Their decision was to select the POMs®, an externally developed tool.

Within the HCBS provider community, there are very few examples of a regional group of HCBS providers organized as a learning community around quality improvement, as we observed in this case.  Members reported that initially some companies were wary of discussing quality issues with competitors, but this concern diminished over time.  Our observation was that using the same tool in a specific region promotes a common vocabulary across agencies regarding quality outcomes and creates an opportunity for cross-fertilization of quality improvement ideas and practices when employees from different agencies gather for POMs® training and other consortium meetings.

Its external development did not appear to negatively affect implementation in agencies that had used the tool for accreditation and not just as a training vehicle. 

There was evidence in one organization that there was significant buy-in of POMs® by stakeholders.  The POMs® aligned with the agency’s person-centered approaches and concern for outcomes more than procedures.  For training, and possibly for accreditation purposes in which in-house certified interviewers are used, staff have substantial involvement in implementation and control over decision-making regarding the extent to which desired quality of life outcomes are present in an individual’s life.  Because the POMs® measures are based on people’s lives rather than professional standards, it may be easier for users to feel invested in the implementation of the POMs® because it focuses on the people and situations they know.  In agencies whose approaches are not explicitly grounded in person-centered and outcome-oriented methods, there was less enthusiasm regarding the POMs®.

Evidence Strength and Quality: The CFIR foundation literature indicates that a tool’s success is strengthened when stakeholders believe in the tool’s credibility.  Perceptions of tool relevance and effectiveness may derive from a range of sources such as the influence and views of colleagues, studies of reliability and validity, as well as pilot results and feedback from implementation teams. 

The end-users we interviewed, including policymakers, were not very familiar with the psychometric properties of the tool nor the methods of developing and revising the tool, indicating that these issues were not a high priority in their decision to use the tool.  Nonetheless, they perceived the tool as highly relevant and useful for learning more about the people they support.  We did find that implementation was positively influenced by the strong alignment of the POMs® constructs with person-centered approaches as well as its focus on outcomes.  Respondents noted that these factors positively influenced organizations to adopt the POMs®, and also influenced the recommendations of thought leaders familiar with quality improvement regarding which tools HCBS organizations should consider when choosing to measure outcomes.

Organizations within the regional consortium with a higher level of commitment to person-centered approaches and to an outcomes perspective were more engaged with POMs® implementation and with using the results to make changes at the individual and organizational levels.  This factor also appeared to build and sustain engagement with implementation over time.  This was especially noticeable in smaller organizations whose employees could directly apply what they learned to the supports they provided and could see how changes in support affected outcomes.  Those agencies that were influenced by observing the growth in person-centered interventions over time appeared to have a cadre of stable staff who were employed for several years and able to see change.  This observation suggests that it is essential to empower direct support professionals with the opportunity to learn and use the tool to raise their awareness of the challenges identified by the tool and how their day-to-day support can be adjusted to improve outcomes.

Relative Advantage:  The CFIR research evidence supporting this construct finds that there is stronger implementation when stakeholders view their selected intervention as providing an advantage over similar interventions. One organization we visited in our exploration of POMs® indicated that they had chosen the POMs® tool because the state required accreditation by an external agency, either CQL or CARF.  The agency had done some initial self-assessment using POMs® and felt this was the best choice for an agency with a person-centered philosophy (as they described themselves).  The agency was part of a regional learning community of similar organizations that was funded by a grant from a private foundation. This consortium of HCBS provider agencies was encouraged by the consortium funder to use the same measurement tool and wrestled with the question of which tool to select.  An organization with some experience with POMs® recommended this tool and the recommendation was accepted.  Afterward, the consortium received grant funds to initiate a structured process of training using POMs® as a vehicle for organizational learning.  The initiating agency had also committed to moving forward with accreditation under the aegis of the CQL.  Using the POMs® for two different purposes, as a context for training and as part of an accreditation process, indicated to the study team that another relative advantage of the POMs® is that it can be used for different purposes.

Observations and discussions with staff from a much larger organization within the consortium suggested that this agency did not see the tool as offering an advantage over other methods of quality measurement.  A review of this agency’s annual plan made it apparent that, unlike staff in the smaller organization in the consortium, these informants did not describe how the tool deepened their knowledge with every use and did not identify ways that the POMs® results were integrated into support plans or organizational goals.

In the case of the POMs®, in the limited context we explored, its use and implementation did not appear to be motivated by a perceived sense of advantage over other HCBS service providers, nor did it appear to provide a marketing advantage in growing the organization’s service base.  Currently, most publicly funded HCBS services do not operate in response to typical business growth and profitability drivers.  The system has not yet matured to the level where significant numbers of consumers are making provider choices based upon the outcomes an organization helps consumers to achieve. This is an area where greater standardization of measures would empower consumers to make informed choices in the selection of providers.

Adaptability:  In every intervention or program proposed for implementation, there will be essential components that must be unchanged regardless of local culture or requirements. However, measurement programs that can be readily introduced to new environments with easily accomplished customization of non-essential components are more successful. The core of the POMs® is an interview process where trained staff conduct interviews with program recipients using specific questions.  The results of these interviews are then discussed in a group format, and established rubrics are used to guide the decisions about whether an outcome is present.  It was evident in observing trainees learning the POMs® methods that the interview and decision process generated a high degree of engagement among end-users. 

Some organizations that use the POMs® select employees to serve as interviewers (agencies may also use certified interviewers who work elsewhere).  Interviewers must participate in a 4-day training workshop and pass a reliability test to become a “certified” interviewer. To the degree possible, certified employee interviewers are scheduled for interviews with people and staff from different work sites.  The POMs® is quite flexible in that it can be implemented wherever there are trained interviewers and HCBS recipients willing to be interviewed.  No special equipment or meeting rooms (other than a private, comfortable space) are necessary unless the HCBS recipient requires some form of communication or other support accommodation.

One constraint to adaptability is the capacity of an organization to provide additional staffing resources to cover direct support duties for any frontline staff who are part of the interview team.  This might make it more difficult for smaller or more poorly resourced organizations to participate. It may also deter organizations from including direct support staff in the training or accreditation process.  It is our view that failure to include direct support staff in becoming familiar with the tool and its findings regarding the people they support is likely to diminish the use of tool results to improve support.

Another constraint to adaptability is that the POMs® does not support the use of “proxies” for people who are unable to communicate with words.  This may present challenges to organizations supporting people with limited language skills, making it difficult to find volunteers to be interviewed. It does, however, as part of its standard protocol involve interviewing staff who work with a specific person. One organization we observed using the POMs® as a training experience did serve many people with verbal communication challenges; nonetheless, the key informants from this agency were very positive about the knowledge they obtained using the POMs® and its overall benefit to the organization’s quest for quality.

Trialability:  Research establishes a link between the ability to pilot test a tool and successful implementation.  A trial period that can be reversed provides an organization with the opportunity to adapt and retool implementation based on the trial experience.  The POMs® offers significant scope for experimentation with the tool on a smaller scale before investing considerable time, staff resources, and capital in a larger scale implementation effort.

Organizations considering utilizing the POMs® for training, quality enhancement, or accreditation can connect with other organizations that have used the tool to discuss the experience and its properties.  The qualitative nature of the tool and its intuitive methodology make it an accessible instrument to review and understand holistically before any form of implementation.  Also, a wealth of peer-reviewed literature focused on the constructs that are at the core of the POMs® is available for review before implementation.  The POMs® is flexible and can be implemented on a smaller scale suitable for piloting a larger-scale implementation, with appropriate guidance on interpreting the indicators.  One organization we interviewed had initiated training activities on a modest scale without technical guidance from CQL on methods of interviewing and interpreting results and found they had been implementing the tool incorrectly.  Therefore, we must revisit the concept discussed earlier in this report: while a tool may be flexible and adaptable, certain core components cannot be eliminated.  This agency’s experience indicates that sufficient training and technical assistance is critical to implementation fidelity.

Complexity:  There is an inverse relationship between the complexity of a proposed intervention and the likelihood of its successful implementation. Logistical considerations comprise one area of complexity for the POMs® concerning the steps necessary to support the interview process. Identifying and coordinating schedules of interviewers, support participants, and staff is essential to ensure their participation. This involves outreach and communication with all interview parties and identification of appropriate interview locations.   The complexity of this task will vary based on the scale of the POMs® implementation.  As with any measurement process, a statistically sound sample frame is needed if the end-users seek to use the results for inferential statistical analysis.  

HCBS sites are frequently decentralized, presenting logistical challenges and adding a layer of complexity to aggregating and interpreting trends across mid- to larger-size organizations.  For example, highly decentralized HCBS organizations serving 500 to 1,000 people may be providing supports in hundreds of locations, making it more challenging to arrange interview schedules and transportation. Depending on the purpose of the implementation (training vs. accreditation) and the number of people receiving support being interviewed, the level of logistical complexity will differ.

The POMs® tool requires that the staff being interviewed are those staff who have the most knowledge about the person receiving supports. This is often the direct support professionals working directly with that person.  This increases the complexity of logistical planning as coverage must be secured to release DSPs to participate in interviews. 

Using POMs® as a training tool is less complicated as it is not necessary to identify certified interviewers and training may be implemented slowly over time allowing for greater ease in making logistical arrangements. Also, it is not necessary to follow a scientific sampling plan. 

Complexity is introduced by the qualitative nature of the tool.  For meaningful results, interviewers must be knowledgeable about the quality indicators and the types of evidence that contribute to the decision about whether a specific outcome is present and whether support is present. Our observations suggest that in addition to the four-day training on the tool content and interpretation, effective interviewers must also have a base of relevant work experience to complement knowledge acquired in training.  Trainees from organizations that operate with an explicit and well-developed person-centered approach may find it easier to grasp the meaning and intent of the POMs®.  Depth of work experience will also aid learners even where a person-centered, outcomes-based support philosophy is less developed.

Regardless of the complexity of the POMs® core concepts, they are aligned with much of the content and expectations of the 2014 changes in the HCBS “settings” rule and are therefore important for staff to understand. Staff competence would increase with additional uses of the tool. However, frequent turnover of frontline staff may present a barrier to using the POMs® well and adopting work strategies that support quality outcomes.  Nonetheless, using staff to implement the POMs® increases the potential for learning about and understanding outcomes related to quality supports.  This can be more beneficial for quality improvement in organizations than a survey measurement process in which staff are not involved with data collection and do not engage in looking deeply at the evidence of desired outcomes.  The first-hand experience of interviewing people, followed by reflective discussions on the findings, provides a powerful opportunity to gain knowledge and understanding about quality outcomes that can be applied when providing supports.

Despite the complexity in mastering the support constructs in POMs®, the instrument is streamlined with just 21 quality indicators to be measured.  Complexity is also reduced because interviewers who score the protocol only have to determine whether the outcome is present or not and if support is present to facilitate the outcome. Complexity is further reduced in making this decision by the presence of clear and concise decision rubrics for each area.  Well-defined rubrics are an essential element when assessing qualitative evidence.

Design quality and packaging:  When components of a measurement tool or program are well-designed and easily accessible to users, the intervention is more likely to be implemented successfully.  Flawed or poorly designed components can cause user dissatisfaction and/or lead to incomplete or inaccurate results.  The POMs® manual is a primary component used in training and accreditation activities.  It is an attractive publication that offers a layout characterized by sufficient white space, engaging photographs, and inspirational quotes.  It also includes simple and intuitive forms that interviewers may use to document information collected and to make decisions about whether outcomes are present. 

Training materials that the research team observed were attractive, visually accessible, and arranged in an organized and logical sequence.  The instruction we observed demonstrated a high level of mastery of the process and content of the POMs® tool.  It was communicated with professionalism, humor, and accuracy, and punctuated by many relevant stories about people who rely on long-term support that illustrated the theoretical aspects of the instruction. The presentation also provided a mix of brief didactic segments followed by group exercises, in keeping with effective adult learning principles.  The training facility was comfortable and attractive.

Cost:  The implementation literature finds a negative association between the cost of the intervention and its success with implementation.  In addition to the cost of the tool and associated implementation costs, this part of the analytic framework also prompts consideration of “opportunity costs,” those items, projects, wages, or other financial obligations that cannot be pursued, completed, or prioritized due to the POMs® implementation.  For example, if the intervention costs reduce the funds available to provide training release coverage for direct support professionals, these staff may not obtain sufficient training to deliver quality supports.  Another opportunity cost is the redirection of resources to quality enhancement and away from providing direct support.  For a poorly resourced organization, this may create a paradoxical situation in which quality enhancement efforts contribute to diminished availability of support, which can affect quality negatively.

Another distinct opportunity cost is the intrusion on people’s lives.  HCBS strives to provide people with experiences similar to those of people not receiving HCBS. The experience of people systematically probing the quality of one’s life is not a typical community living experience.  Even so, our observations of interviews indicated that most people enjoyed the interview experience.  It is crucial to weigh the intrusion against the importance of hearing first-hand from people about their lives.  The importance of this goal, coupled with our observation that HCBS users enjoyed the interviews, diminishes the concern about the intrusion.

Our observation is that the cost of using the POMs® for accreditation is comparable to other major accreditation programs. However, as described earlier, coverage challenges may occur if an agency chooses to use certified interviewers from its own staff in this labor-intensive process.  While first-person interviews may be more costly than other types of surveys, applying the results to improve support may lead to outcomes that reduce agency expenses.

Framework Domain: Outer Setting

Focus on Individual Needs and Resources:  In addition to the various components and characteristics of an intervention, the implementation literature identifies significant associations between specific environmental factors and successful implementation.    One construct in this domain is the degree of person-centeredness of an organization implementing a measurement tool.  The implementation literature indicates that implementation success will increase if implementers are knowledgeable about the individual needs of people receiving support as well as factors that hinder the provision of support.

With the recent HCBS settings requirements, HCBS providers will likely be seeking quality enhancement tools like the POMs® that are aligned with new requirements for person-centered strategies of support and individually measured outcomes. It will be beneficial for organizations if the tool both measures outcomes and is useful in training organization staff about personal outcomes.  Based on our observations, it is likely that the person-centered POMs® content, if carefully implemented, will bring an organization to a more in-depth knowledge of the individualized needs and preferences of the people it supports.  While it is not known to what degree the POMs® may move an organization toward a person-centered philosophy and support strategies, the tool provides a fertile opportunity for this type of growth to occur. Indeed, some key informants we interviewed who were employed by an organization with an explicit person-centered philosophy identified the organization as learning more about people with each POMs® implementation.  They had been using the POMs® for several years and reported that despite the verbal limitations of people receiving support, their knowledge of each person’s needs and preferences had increased over time with each use of the POMs®. We were told that the organization incorporated new learning about an individual into his or her person-centered plan. The organization also used information about the trends in support quality to develop their organization’s annual goals.  POMs® appears to contribute to continuous quality improvement at both the individual level and the organization level.

Utilization of POMs® results to improve support plans and to shape agency direction was not apparent in all organizations using the tool.  Some organizations used the tool for training purposes and did not systematically use the results to make organizational or individual changes through agency-wide systems change or through modification of individual support plans.

Information Sharing and Shared Vision (Cosmopolitanism):  Another factor external to the measurement tool that affects implementation is the extent and nature of organizational support for staff to expand their roles to include keeping up with research, external training, and participating in professional groups. The implementation literature indicates that organizations that support these kinds of efforts are faster at implementing new practices. However, these factors do not impact implementation until a specific intervention, such as the POMs®, is accepted by others in the peer network.  From this perspective, the POMs® has been used nationally by many organizations and therefore has a growing body of users who share information and similar approaches to outcomes measurement. This provides a fertile opportunity to establish regional or national learning communities or discussion groups focused on how organizations are using POMs® findings.  Our investigation revealed that local groups that form an intentional learning community around quality improvement benefit by using the same tool because it creates scope for trading ideas and information in joint training sessions and provides the group with a shared vocabulary focused on quality improvement.

Peer Pressure:  In cases where an organization seeks competitive advantage among peers who are using a specific intervention, or where pressure exists within an organization or group to mimic an intervention used elsewhere, peer pressure is a factor in accelerating implementation.  To some degree, this was a factor among the organizations within the consortium group using the POMs®. State policy required that organizations choose an accreditation process, and the foundation sponsoring special projects in the consortium asked the group to select an outcomes-based tool. The POMs® was selected based upon its use by one of the member organizations in the group. 

It is unlikely that the POMs®, or any other HCBS OM tool, is a factor in competition to attract a customer base for support services.  It is only recently that the shift to outcomes-based measurement has been at the forefront of looking at service quality. Therefore, it is not likely that the general public makes this distinction when selecting a support setting.  Also, the NQF report on HCBS measurement gaps (2016) indicates that HCBS outcome measurement data is rarely published.  With the growth of managed care and HCBS outcome measurement and publication, consumer choice may increase, and, with this, provider competition will likely become a more salient factor in the future. It is important to note that POMs® results may be analyzed within the PORTAL system to identify important service patterns and to permit some comparability with other organizations.

External Policy and Incentives:  Motivation to implement an outcomes measurement tool is increased by policies or regulations that call for a focus on outcomes.  This has been a factor in the growth of the POMs® tool as a method of measuring outcomes. Within HCBS, earlier forms of monitoring and measurement emphasized examining service inputs and professional practice recommendations with little attention to the actual circumstances of a person’s life.  There has not been a federal mandate for any form of required, standard outcomes measurement for the HCBS program.  Instead, each state was required to submit evidence of compliance with the commitments made in waiver agreements or state plans.  The 2014 revisions to the settings rule will require organizations and states to assure that they are achieving important outcomes in the areas of choice-making, self-determination, human rights, community access and inclusion, and person-centered strategies of support, among other requirements. The new rule does not prescribe the method of outcomes measurement, but as states prepare their HCBS transition plans it is likely that some will choose to implement tools like the POMs® that measure several of these domains.

Framework Domain: Inner Setting

Inner setting addresses organizational attributes including structure, networks and communication, culture, implementation climate, tension for change, compatibility with the intervention, incentives and rewards, goals and feedback, learning environment, readiness, available resources, and access to knowledge and information.  This study was not designed to examine the capability of one or more specific organizations to implement the OM tool effectively. Instead, it looks at the characteristics of the tool that facilitate or hinder implementation. There are, however, some general observations from our experience that apply to this domain.

One organization we encountered using the POMs® was relatively small (serving fewer than 300 people) and had been supporting many of the same people for several years.  The smaller size of the organization enabled it to conduct the POMs® with most of the people they support thus providing them with a comprehensive window into their overall supports.  It also offered the flexibility to change each person’s individual plan based on POMs® results and to track results over time.  The culture of the organization was very person-centered and therefore compatible, enthusiastic, and ready to take on the POMs®.  In addition to a robust person-centered philosophy, it was clear that the leadership of the agency strongly valued the importance of measuring the impact of support on people’s lives and, more importantly, using that information to change individual support plans as well as inform the strategic direction of the agency.  Readiness was also enhanced by the organization’s participation in a regional consortium focused on quality improvement with funding from a local foundation.   This funding was crucial to their financial ability to implement POMs®.  Economies of scale make it difficult for smaller organizations to obtain the resources needed for quality measurement activities.

Key informants and other evidence from a larger consortium organization suggested that this organization had not used the POMs® to change individual support plans or to chart future strategic directions. Website information describing the history of this organization and its annual report did not emphasize a person-centered or outcome-based support philosophy.   This may be one reason that the POMs® had not penetrated practice as profoundly as in the smaller organization where there was a strong affinity between the organization’s philosophy of support and the core constructs of the POMs®.  More research would be necessary to determine whether the lower utilization was related to size, espoused mission, or other factors or combinations of factors.

The regional consortium also provided a vehicle for cross-organizational networks and discussion about quality outcomes and about the POMs®, contributing to a fertile learning climate.  The local structure made it possible for smaller organizations to access the knowledge and training they needed for implementation.  As described in the preceding sections of this report, smaller organizations often have difficulty finding coverage for direct support staff who are released for training activities.  In a consortium, it may be possible to offer the same training opportunity on several occasions, thus relieving the burden of releasing significant numbers of staff from the same organization to attend training.

Quality of life (QOL) is a multidimensional construct that has been applied across individuals, organizations, and systems. The construct is increasingly impacting the redefinition of organizations and the services and supports they provide to persons with intellectual and closely related developmental disabilities. Because of this impact, it is essential that evidence-based outcomes are both understood and incorporated into best practices (Schalock & Verdugo, 2012b, in van Loon, J.H., Bonham, G.S., Peterson, D.D., Schalock, R.L., Claes, C., & Decramer, A.E. (2013). The use of POMs® outcomes in systems and organizations providing services and supports to persons with intellectual disability. Evaluation and Program Planning, 36(1), 80-87).

Framework Domain:  Characteristics of Individuals

The constructs identified in this domain include individual knowledge and beliefs about the intervention, individual stage of change (degree of progress toward skilled implementation), individual identification with the organization, as well as other personal attributes such as motivation and flexibility.  Similar to the previous domain, this study was not focused on how the characteristics of specific opinion makers, leaders, or stakeholders mediate implementation.  It is, however, important to note some general observations.

In several sections of this report, we noted that the support approach of one agency we observed was highly person-centered.  This was demonstrated in the organization members we saw in training and the enthusiasm they brought to the task of interviewing people.  It makes sense that in hiring people, a person-centered organization will seek employees who are willing to learn this philosophy of support or who already embrace it.  Individual belief and commitment to this philosophy helped these trainees to become more proficient with the POMs®.  It is interesting to note that these “person-centered” staff had attempted to implement the POMs® on their own but, after contracting with the Council for training, soon realized they had been implementing it incorrectly. Thus, it appears in this case that commitment to a person-centered approach must be augmented by training targeted toward recognizing and interpreting evidence of personal outcomes to assure effective measurement of outcomes.

Framework Domain: Process

The CFIR developers identify the “process” domain as the most challenging domain in implementation science to evaluate because theories of implementation range across topics including total quality improvement and management, integrated support, complexity theory, and organizational learning (Damschroder et al., 2009).  The CFIR framework focuses on four constructs that are commonly found, either explicitly or implicitly, across a range of theories and frameworks:  1) Planning, 2) Engaging, 3) Executing, and 4) Evaluating.

Planning:  The level of advance planning for an intervention is positively associated with its success.  From what we observed, there is a robust and well-structured capacity to support the planning and preparation process for implementing the POMs®.  When an organization contracts to use the POMs® in some manner, the Council works closely with the group to support it through the ramp-up to implementation.  Interviewers must participate in an intensive four-day training.  The Council has experimented with providing adequate training in fewer days or through distance learning to ease the burden of planning on end-users but has found that these efforts yielded less than optimal results.  The training curriculum is well-designed and executed and uses high-quality materials. The CQL has supported implementation of the POMs® for several decades, and the maturity of that experience is demonstrated in the quality of its website and materials and in the quality and availability of technical assistance, all essential factors in supporting organizations to complete a comprehensive planning effort before initiation.

As mentioned in the preceding sections, the POMs® may be applied to achieve different goals within an organization, and the CQL website provides organizations with a clear and comprehensive overview of the various options.  The relatively recent addition of a service that enables organizations to access advanced analytics based on POMs® data through the user-friendly “PORTAL” platform expands the range of output organizations can use to strengthen planning with their POMs® results. The PORTAL also supports organizations in examining potential interviewer reliability problems, enabling them to plan for reliable POMs® interviews.

Engaging:  One organization we observed was using the POMs® as a training tool, with longer-term plans for accreditation. It was clear that the tool’s use was enthusiastically supported at all levels of the organization, from the board and leadership to middle management and direct support.  We were not able to analyze how this strong engagement with the tool had become part of the organization’s practice, but leaders spoke highly of the training and their access to technical assistance from the CQL.  Several key informants also spoke positively about how POMs® results were used in adjusting support plans and, at the board and executive level, in crafting strategic directions for the organization.  One factor of the POMs® implementation that appears to motivate staff engagement is a structure that welcomes staff, regardless of their position in the organization, to gain experience conducting POMs® interviews either as trainees or as certified interviewers.  This direct experience provides a vehicle for staff to become very knowledgeable about the content of the tool and the experience of implementing it, and it appears to have galvanized a positive opinion of the overall experience and an appetite for the knowledge gained about the people they support.

Executing:  Within the CFIR, execution, or fidelity, is one of multiple components considered when examining barriers and facilitators to successful implementation.  To determine the essential components of the POMs®, the research team asked CQL leaders as well as trainers/facilitators at CQL to identify the core elements of the POMs® intervention so that we could focus on these components in examining the quality of its implementation.  To assist them in this process, we provided definitions and examples of essential and non-essential components. The research team then reviewed the POMs® website and related materials to determine whether any critical component had been omitted (none was identified). We then consolidated the CQL staff responses and edited the list to exclude any components that the team judged non-essential.  This is the list of essential components we reviewed to examine execution:

  • Outcome measures that probe the conditions most relevant to HCBS quality
  • POMs® manual including 21 valid outcome indicators and outcome decision guidelines
  • Expertise and knowledge of POMs® trainers/facilitators – Trainers certification
  • Experiential learning in the four-day POMs® training workshop, including materials and hands-on activities
  • Interviews focused on consumers and staff with in-depth knowledge of consumers interviewed
  • Interviewer expertise and reliability (certified through reliability testing)
  • Outcomes decision discussion and reflection based on Decision Matrix (written guidelines for making decisions regarding each of the 21 outcome areas)
  • Technical assistance before and after POMs® training
  • Summative quality data that is usable at the individual and the systems level for HCBS improvement, available through the CQL “PORTAL” analytics database service

During a three-day site visit, through both observations and interviews, we were able to validate the presence and strength of most of the components on the list, except for the process of testing reliability and certifying interviewers and the procedures used for reporting summative quality data. The POMs® training activity we observed was not intended to test reliability or certify interviewers or trainers, so we did not have the opportunity to observe this process first-hand. However, we did discuss the process with key informants and reviewed relevant protocols.  We were also able to examine the capacity and applications of the online “PORTAL” analytics service.

Based on observations, interviews, and reviews of documents and reports, it is clear that the POMs® is implemented with robust, well-designed components and is executed with fidelity.

According to CQL staff analysis of data available in the PORTAL analytics service, there is an observable difference between the decisions made about the presence of outcomes by certified and non-certified interviewers: non-certified interviewers tend to find more outcomes present than certified interviewers do.  This underscores the importance of the reliability testing procedure as a critical element of an effective implementation process.

Additionally, the background research we conducted on the development of the POMs® outcome indicators suggests that both the indicator development process and subsequent psychometric evaluations have been rigorous and completed with a frequency that ensures validity and consistency with current practice (Friedman, 2017; Gardner, 2005).

Reflecting and Evaluating:  The CQL undertook significant evaluations of the POMs® in 2005 and 2017, resulting in refinement of the outcome indicators. Recognizing the burden of sending employees to a four-day training, CQL informants told us that they have experimented with different training models, including online training and intervals shorter than four days.  The results of these experiments were unsatisfactory, so they have maintained the four-day model.