Outcome Measurement Program Case Studies
About the National Core Indicators (NCI)
National Core Indicators® (NCI®) is a collaboration between the National Association of State Directors of Developmental Disabilities Services (NASDDDS), the Human Services Research Institute (HSRI), and participating states. NCI includes a portfolio of surveys: surveys of adults receiving services under the auspices of a state’s Developmental Disabilities (DD) system, surveys of families of people receiving services, and a survey assessing the quality and stability of the direct support workforce. The National Core Indicators In-Person Survey (NCI-IPS) is an in-person survey of adults who receive these supports and services. Participating states use NCI-IPS results to guide policy and assess policy outcomes. For 2017-2018, 46 states, the District of Columbia, and 22 sub-state entities participated in NCI (HSRI, 2019).
The NCI started in 1997 with seven states (Smith & Ashbaugh, 2001). Specific outcomes were developed with the guidance of an advisory group and based on a review of outcome research, national goals, state agency missions, and objectives. The NCI survey gathers the following information:
descriptive (demographics, functioning, diagnoses, and behavioral health); service use (setting size and service funding source); service and lifestyle (friendships, community participation, family involvement, participation in a self-advocacy group, choices, respect, and employment); and wellbeing (individual satisfaction with one’s life and services). Demographic variables are completed primarily from administrative records as part of the background section of the NCI survey. Section I of the survey can only be answered by the participant with a disability. Variables in Section II are completed by the person with a disability whenever possible; when the person is unable to answer independently, a response is sought from a knowledgeable proxy (a family member, advocate, or staff member). For items allowing proxy responses, the source of the information (person with a disability or proxy) is recorded.
The NCI has revised its In-Person Survey since its inception in response to feedback from states. Recent revisions include the addition of questions related to person-centered practices in the 2018-2019 survey year. The program has also been assessing the instrument’s psychometric properties (NCI, 2018) in preparation for submitting measures to the National Quality Forum (NQF) for endorsement.
States draw a random sample calculated to yield a 95% confidence interval with a 5% margin of error (typically 400 individuals) from recipients of at least one LTSS service in addition to case management. Individuals are at least 18 years of age and are served by the state’s DD agency. States have the option of oversampling particular populations to understand how subpopulations compare to state and national samples; for example, a state may oversample a specific geographic region or a particular racial or ethnic group. HSRI assists states in developing an appropriate sampling plan to assure unbiased samples.
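The stated sampling target (a 95% confidence interval with a 5% margin of error, typically about 400 individuals) is consistent with the standard sample-size formula for estimating a proportion. The sketch below illustrates the arithmetic under that assumption; the population figure and the use of a finite population correction are illustrative, not drawn from NCI documentation.

```python
import math

def required_sample_size(population: int, confidence_z: float = 1.96,
                         margin_of_error: float = 0.05, p: float = 0.5) -> int:
    """Sample size for estimating a proportion, with finite population correction.

    p = 0.5 is the most conservative assumption (largest required n);
    z = 1.96 corresponds to a 95% confidence level.
    """
    # Infinite-population sample size: n0 = z^2 * p * (1 - p) / e^2
    n0 = (confidence_z ** 2) * p * (1 - p) / (margin_of_error ** 2)
    # Finite population correction: n = n0 / (1 + (n0 - 1) / N)
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# Hypothetical state service population of 10,000 individuals:
print(required_sample_size(10_000))
```

With the conservative p = 0.5, the infinite-population requirement is about 385; rounding up to roughly 400, as the text describes, leaves a margin that may also cover non-response.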
NCI-IPS training takes one of two forms: a “train-the-trainer” session conducted by the NCI national team for the designated lead trainer(s) in a state, or NCI-team-conducted training for all of a state’s surveyors. These training sessions occur annually. In all training modalities, trainees learn essential skills for interviewing persons with IDD, including people with communication support needs; procedures for practice interviews and coding; and a review of the survey questions and their intent. Videos are used to practice coding and to test inter-rater reliability against a criterion level. Inter-rater reliability of the NCI-IPS was initially calculated at 92% (Smith & Ashbaugh, 2001). Other training materials include scripts for follow-up conversations, procedures for monitoring a sample of the participants to assure interviews occurred and were positive, visual aids to assist with interviewing, and procedures for preparing new interviewers to shadow experienced ones.
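The 92% inter-rater reliability figure is the kind of result produced by a simple percent-agreement calculation over coded items. Smith and Ashbaugh (2001) are not quoted here on their exact formula, so the sketch below illustrates percent agreement only, with entirely hypothetical ratings:

```python
def percent_agreement(rater_a, rater_b):
    """Proportion of items two raters coded identically (simple agreement)."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("Raters must code the same non-empty set of items")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Hypothetical codes from a lead interviewer and a trainee
# scoring the same 25-item practice video:
lead    = [1, 2, 1, 1, 3, 2, 1, 1, 2, 1, 1, 3, 1, 2, 1, 1, 1, 2, 3, 1, 1, 2, 1, 1, 2]
trainee = [1, 2, 1, 1, 3, 2, 1, 2, 2, 1, 1, 3, 1, 2, 1, 1, 1, 2, 3, 1, 1, 2, 1, 1, 1]
print(f"{percent_agreement(lead, trainee):.0%}")  # 23 of 25 items match: 92%
```

Training to a criterion level would then mean requiring new interviewers to meet or exceed a fixed agreement threshold before surveying independently.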
Interviewers from one UCEDD in Oklahoma (2 interviewers), two national survey organizations (4), and local non-profit entities in Pennsylvania (4) were interviewed about their experiences with NCI interviewer training and support. Interviewers from the various entities reported that the training was detailed: it provided in-depth information about the questions and the intent behind them, explained how to conduct interviews in a way that assured consistency, and allowed time for interviewers to practice interviews at the training. In all instances, interviewers reported that they shadowed experienced interviewers before going out on their own and had their own interviews observed by an experienced interviewer or mentor. All interviewers receive annual training. Survey coordinators and lead interviewers reported that retraining occurs as needed when interview observations show that a specific interviewer needs additional training. Staff who work for the agencies that conduct interviews also reported regular in-person or virtual meetings during the data collection period to answer interviewer questions and share information that might benefit interviewers in the field.
Interviewers and coordinators reported that one of their biggest challenges was finding participants willing to be interviewed. Reaching potential participants was difficult, and those who were reached often declined, either because they were unsure of the interview’s purpose or because they were tired of doing surveys and did not want to participate in another one. This difficulty was reported by interviewers or coordinators across the states represented in this case study.
Technical Assistance for Surveyors
Technical assistance for NCI-IPS interviewers includes the opportunity to submit questions via the state's coordinator to the national project coordinator. Interviewers reported being able to reach a supervisor or a knowledgeable “lead” interviewer with questions while in the field. Interviewers also said that videos were available to refresh their knowledge of the interview process. Agency survey coordinators reported having access to technical assistance as needed from the NCI National Team.
HSRI developed an online database (ODESA) for states to use to report all of their data, ensuring standardized data formatting. All states collect data on standard timelines and submit their data files by June 30th of each year. HSRI carefully reviews and cleans the data and maintains it on a secure server. The entities involved in IPS surveying handle data review and entry differently. Interviewers from the larger survey entities (such as Vital Research) reported conducting their interviews on tablets, with data submitted to a database after the interview; they were unsure what happened to the data they submitted, including how it was checked for accuracy and completeness. Smaller entities, such as the non-profits that collected NCI data in Pennsylvania, reported a process in which the local survey coordinator reviewed the data before it was entered, allowing apparent errors or missing data to be corrected while the interview was still fresh in the interviewer’s mind. Interviewers in Florida, Pennsylvania, and Oklahoma were aware of how those states used the data and could describe how their state’s DD agency used it to learn more about services and supports. Interviewers in Minnesota were not aware of how Minnesota’s DD agency made use of the NCI data.
The NCI National Team provides additional technical assistance to help states use their data meaningfully to monitor their systems’ performance and inform policy decisions. For example, an implementation guide developed for this purpose explains the technical aspects of the survey design and sampling frame, reliability and validity, and other key concepts necessary for using data to improve quality. It also offers examples of how the data should and should not be used (NCI, 2017). For instance, a manual aimed at helping states use NCI data identifies one use of the data as identifying areas for improvement and developing change strategies (NCI, 2017, pp. 19-20). The manual further states that NCI data is not meant to be a monitoring or quality assurance tool at the individual level (NCI, 2017, p. 14). The NCI Team holds regular calls with states about their data collection plans and how to interpret and use the data collected. NCI-IPS also provides a chart generator of key core indicators that lets stakeholders access data for the states participating in NCI-IPS, increasing access to information in a user-friendly format. Yearly reports are also produced for each participating state. State DD agency staff participating in interviews reported using NCI data for decision-making; for example, efforts to increase employment were initiated in several states after NCI-IPS data showed low levels of employment despite participants reporting interest in working for pay. One reported challenge, however, was the lag between data collection and data reporting. State agency staff thought NCI would be strengthened if they had access to their data sooner so they could make timely data-driven decisions, and DD agency staff also noted the need for current information to provide to legislators during legislative sessions.