NCEO Report 445:
Developing Requests for Proposals (RFPs) for Interim Assessments that Include All Students
January 2025
Executive Summary
This report provides a resource for state and district assessment staff as they develop requests for proposals (RFPs) for interim assessments[1] that include all of their students, both those who require accessibility features to demonstrate their knowledge and skills and those who may need an alternate assessment because of a significant cognitive disability. In 2021, a review by the National Center on Educational Outcomes (NCEO) and the National Center for the Improvement of Educational Assessment (NCIEA) identified a range of concerns with existing interim assessments and found that none of the available interim assessments met the needs of students who require alternate assessments based on alternate academic achievement standards (AA-AAAS) (Boyer & Landl, 2021).
This report provides ideas for ensuring that bids for interim assessments address the needs of all students, including students with disabilities and English learners. This is not a trivial undertaking for states and districts because the assessment requirements of both the Individuals with Disabilities Education Act (IDEA) and Section 504 of the Rehabilitation Act of 1973 (Section 504) must be addressed. For example, IDEA requires the inclusion of all students with disabilities in all state- and district-wide assessment administrations. Section 504 prohibits discrimination against public school students with disabilities and ensures equal opportunity regardless of disability.
States’ procurement practices, assessment policies, and internal capacities vary widely, so this report deliberately presents a range of example language that might be used in an RFP. The example language was derived from actual state RFPs. States should factor in the costs (both time and financial) associated with the different approaches and select a path that best meets their assessment needs and the resources available in the state.
Introduction
The past few years have seen increasing state attention turned toward interim assessments as some states examine how to replace their summative assessments with through-year or through-course assessments modeled on the general form of interim assessments. Interim assessments are administered several times during a school year, and fall between formative assessment processes that teachers use to track progress during instruction, and summative assessments that are administered once a year and have historically been administered as part of state accountability systems (Lazarus et al., 2021; Perie et al., 2007).
Some states have provided interim assessments to schools and districts or supported school and district use of interim assessments selected from an approved list. For example, some states use interim assessments to track student progress toward reaching grade 3 literacy guarantees. Additionally, the vast educational disruptions caused by the COVID-19 pandemic in 2020 and the accompanying impact on student learning have led other states to provide interim assessments as a tool for educators to assess the scope of learning loss and track student recovery from that loss.
However, states and districts have not always included some students with disabilities and English learners in interim assessments. The purpose of this report is to provide ideas for how to ensure that bids for interim assessments address the needs of all students, including students with disabilities and English learners.
Under the Individuals with Disabilities Education Act (IDEA), each state must ensure that all children with disabilities are included in all general state and districtwide assessment programs, including assessments required by the Elementary and Secondary Education Act (ESEA), commonly known as the Every Student Succeeds Act (ESSA), with appropriate accommodations and alternate assessments, if necessary, as indicated in their respective individualized education programs (IEPs). ESSA additionally requires the assessment of all English learners, including those with disabilities.
Some states that are moving toward replacing their summative assessments with through-year assessments have likely planned to meet the accessibility needs of students with disabilities and English learners, including students with the most significant cognitive disabilities who require alternate assessments based on alternate academic achievement standards (AA-AAAS). However, states that have adopted existing published interim assessments as their through-course assessment, or that have provided a menu or approved list of interim assessments for district and school use, may not have considered whether those assessments meet the needs of all students.
Interim assessments have not yet come under the same level of scrutiny that summative assessments have. This may be because individual districts were making the decisions and interim assessments were seen as low stakes. As states begin to provide interim assessments and approved lists of interim assessments, there is a responsibility to ensure that those assessments meet the needs of all students and provide valid scores for decision making. Even when assessments are “optional,” the state is responsible for ensuring that the assessments it provides or recommends comply with federal requirements and are fair, accessible, and valid for all students.
In the spring of 2021, the National Center on Educational Outcomes (NCEO) released two briefs, Interim Assessment Practices for Students with Disabilities (Boyer & Landl, 2021) and Alternate Interim Assessments for Students with the Most Significant Cognitive Disabilities (Browder et al., 2021), and a report from an expert panel, Using Interim Assessments to Appropriately Measure What Students with Disabilities Know and Can Do: Advisory Panel Takeaways and NCEO Recommendations (Lazarus et al., 2021), which examined how some current interim assessments document the ways that the needs of students with disabilities are met. The NCEO publications described the extent to which interim assessments met the needs of students with disabilities and identified areas of concern and gaps. Although NCEO focused specifically on students with disabilities, the findings apply equally to the access and accommodation needs of English learners.
The Browder et al. (2021) review found that none of the available interim assessments met the needs of students who required alternate assessments based on alternate academic achievement standards (AA-AAAS). States and districts cannot afford to ignore the needs of students who take AA-AAAS and their educators simply because existing interim assessment products do not include appropriate supports for those students.
The purpose of this report is to provide states with ideas for how to systematically address gaps that may exist in the current generation of interim assessments. The resource takes the form of request for proposals (RFP) language that specifically addresses the needs of students with disabilities and English learners.
Method
We reviewed selected state and district RFPs to learn more about how they address students with disabilities. When a state or district reviews a vendor’s “off the shelf” assessments, there are two key standards that need to be examined: (a) the product’s Accessibility Conformance Report (ACR),[2] which describes how a technology product or service conforms to Section 508; and (b) the current version of the Question & Test Interoperability (QTI, v3.0)[3] specification, along with the vendor’s QTI Conformance Certification.
We focused on providing examples of RFP language that responded to identified areas of concern. The language, drawn and adapted from actual state and district documents, can be used or adjusted for use in RFPs or Requests for Information (RFIs). This will help to ensure that interim assessment vendors supply the documentation that states need to be confident that the needs of students with disabilities and English learners have been addressed and that the scores from the interims will be valid for instructional use with all students.
To help ensure the accuracy and usefulness of this report, two groups affiliated with the Council of Chief State School Officers (CCSSO), the Students with Disabilities Assessment Advisory Task Force and the Assessment, Standards, and Education for Students with Disabilities (ASES) Collaborative, reviewed and provided comments on the contents of this report. Revisions were made to the content of this report based on their input and feedback.
Findings
AA-AAAS
As interim assessments play a larger and more consistent role within assessment systems, states and districts must ensure that students who participate in the state’s AA-AAAS are included in interim assessments. As previously noted, IDEA requires that state and districtwide assessments meet the needs of all students; this includes students who participate in AA-AAAS. The fact that most interim assessments do not currently meet the needs of students who take AA-AAAS does not relieve districts or states of their obligation under the law. IDEA further requires that an alternate assessment be developed for each state and districtwide assessment, regardless of whether it is used for ESSA accountability or another purpose. For any assessment that a state or district requires, an alternate assessment must be provided for students with the most significant cognitive disabilities who pursue alternate academic achievement standards.
We found that some states with AA-AAAS for students with the most significant cognitive disabilities were exploring options for progress monitoring tools for this population that would minimize impact on instructional time while measuring what students know and can do throughout the year. Those states are investigating ways to develop and implement alternate progress monitoring solutions for students with the most significant cognitive disabilities.
Accessibility Features
The full range of accessibility features (e.g., universal features, designated features, accommodations) provided by an interim assessment must be understood. It is also important to know how the interim assessment’s accessibility features compare to those of the summative assessment. The mechanism for assigning these features should be delineated, along with the process for how educators document that students actually received assigned accessibility features during the assessment. Several examples of RFP language address this need:
The list shall describe the test accessibility features that allow students with disabilities and English learners to participate as fully as possible in each assessment without interfering with the measurement of the constructs. Offerors shall also discuss accommodations that would threaten the validity of the assessment by interfering with the construct being measured.
****
The Offeror shall discuss how the system will assign specific accessibility features only if the student’s Personal Needs Profile (PNP) indicates the need and how cross checks will be used to only allow certain accommodations for some subpopulations.
****
Respondents should propose a wide range of accessibility features, including accommodations for students with sensory disabilities (e.g., refreshable braille, closed captions, sign language interpretation, etc.) and English learners with disabilities (e.g., glossaries, translations, etc.). Students with disabilities, including those with sensory disabilities and English learners with disabilities, should have a variety of ways to be able to access the assessments so they can meaningfully show what they know and can do with respect to all of our state early learning standards.
****
For each assessment produced, the Contractor will provide braille (Unified English Braille [UEB], contracted and uncontracted versions), large-print, regular-print, and one-item-per-page versions in each grade and subject for students with disabilities. The Contractor must also provide a variety of computer-based accommodations for students with disabilities.
****
Students with disabilities and English learners will take the assessments with appropriate accessibility features. Respondents must provide a full list of all accessibility features currently provided within the state’s test delivery platform and those anticipated with a defined timeline for availability.
****
Assessments administered by the Respondent must be available to all students, and the assessments must be offered in both online and paper-and-pencil modes. Large print and braille versions of the assessments must also be made available.
****
Accessibility refers to test interface and administration solutions that keep students’ assessment results from being affected by disability, gender, ethnicity, or English language ability. These options ensure the validity of assessments and provide equitable testing opportunities.
****
Because content delivered in each program, subject area, and grade is different, flexibility in the availability of accessibility features is necessary to ensure that content validity is retained. Without this flexibility, it will not be possible to control which accessibility features are made available to students at specific points in the test. For example, there will be some test items on which students can use a calculator, while on other test items calculators are not allowed. These differences, as noted, are due to the content standards and knowledge being tested by specific test items. Student-level accessibility feature usage data available at the completion of testing provide the details needed to analyze the validity and effectiveness of these features and to refine their availability for future administrations.
****
Our assessments must be accessible to all participating students, including English learners and students with disabilities, and must include appropriate accessibility features. Accessible assessments will allow all individuals taking the assessments to participate and engage in a meaningful and appropriate manner, with the goal of ensuring that results are valid for each and every student. The results will yield information that supports valid inferences about the performance of students with diverse characteristics.
****
The Contractor shall provide a viable mechanism to ensure that accessibility features can be configured to meet the specific needs of each student, can be defaulted to reduce repetitive entry for the same student, and are only available to the students for whom they have been configured.
****
The solution shall provide logs of accessibility feature use including, but not limited to: assignment of features, number and percentage of items on which the feature was used, length of time it was used in seconds, etc. Logs will be used to determine trends, needs for updates, and other information needed by the state. Some features will be allowable for all students while others are considered accommodations requiring special criteria for use. The solution shall provide for assignment of accessibility features at the feature/student/content area level, and assignments may be entered manually for each individual student and/or through a bulk-batch process.
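The assignment and usage-logging requirements illustrated above lend themselves to a fairly simple record structure. The Python sketch below shows one way such a per-student, per-feature, per-content-area log record might be organized; it is only an illustration, and every field name is hypothetical rather than drawn from any vendor’s actual system.

# Minimal sketch of an accessibility-feature usage log record of the kind the
# example RFP language above might require. All field names are hypothetical.
from dataclasses import dataclass, asdict

@dataclass
class FeatureUsageRecord:
    student_id: str        # state or district student identifier
    content_area: str      # e.g., "ELA" or "Mathematics"
    feature: str           # e.g., "text_to_speech" or "color_contrast"
    assigned: bool         # was the feature configured for this student (e.g., from a PNP)?
    items_used_on: int     # number of items on which the feature was activated
    items_delivered: int   # number of items delivered in the session
    seconds_active: int    # total time, in seconds, the feature was in use

    def percent_items_used(self) -> float:
        """Percentage of delivered items on which the feature was used."""
        if self.items_delivered == 0:
            return 0.0
        return 100.0 * self.items_used_on / self.items_delivered

# Example: one record a state could aggregate to spot features that are
# assigned but rarely used, or used without having been assigned.
record = FeatureUsageRecord(
    student_id="0123456", content_area="Mathematics", feature="text_to_speech",
    assigned=True, items_used_on=18, items_delivered=40, seconds_active=960)
print(asdict(record), f"{record.percent_items_used():.0f}% of items")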
Accessibility Feature Guidance
Accessibility features must be identified and the specifics of how to appropriately administer the assessment using them must be clear and easy to implement. Examples of language that address this include:
Required documents include test administrator and coordinator manuals, training manuals, accessibility manuals, data interpretation manuals, as well as various order and request forms.
***
The initial training must include, at a minimum, information on how to access the accessibility feature; how to use the feature during assessment with validity and reliability; and test administration guidelines including information about the accessibility features, student data submission, student data reporting tools, and reporting to parents.
Alignment
The use of an interim assessment for instructional decisions or to predict future performance depends on its alignment with the state’s academic standards. The interim assessment should provide evidence that both test items and the test blueprint align with the state’s academic standards and reflect the depth, breadth, and complexity of those standards. Examples of RFP language that address alignment are:
For all assessments, the Successful Offeror shall set aside resources for independently conducted alignment studies of the assessments to the revised state standards and to its performance standards.
***
Responses should address the extent to which the proposed solution is aligned in the following ways:
a. Test content aligns with the state’s standards and benchmarks for subject, grade level and learning outcomes.
b. Test content is written to the correct skill level of the standards and benchmarks.
***
All item specifications and a sample of items must represent coverage of state standards (in depth and breadth), be representative of the item types to be used on the summative assessments, and be reviewed for alignment to the state standards. Additionally, all items must be reviewed for adequate depth of knowledge (DOK) and coverage of the full range of DOK.
***
For a bank of items licensed from previous development undertaken by the respondent, the respondent must contract with an independent organization approved by the Department to complete an alignment study with the state’s academic standards within three months of the contract execution.
***
Contractor must provide evidence of alignment, for all content areas and grade levels, to the state’s academic standards. The evidence must include an independent alignment study of the proposed assessment solution. If a commercial off-the-shelf solution is proposed, the contractor must provide an independent alignment study with their proposal.
***
The administration vendor and item, test form, and instructional instruments development vendors in collaboration with the scoring and reporting vendor shall provide documentation to ensure that assessments and instructional instruments provided are aligned to the state academic or English language proficiency standards. The administration vendor shall support the logistics of additional alignment studies conducted by the scoring and reporting vendor, as requested by the state education agency, for future revisions to the academic or language proficiency standards that cause a significant change in the content being assessed. Such alignment studies may include the use of an outside third-party that is not affiliated with the assessment program.
Assistive Technology
Assistive technology covers a wide range of products, technology, and software. As with accessibility features, vendors should support on interim assessments any needed assistive technology that students regularly use during instruction. It is particularly important to know how the assistive technology supported for the interim assessment compares with the assistive technology for the summative assessment. Examples of language for RFPs include:
Respondent must provide a description of how non-embedded assistive technology devices that students utilize on a regular basis can be used during secure testing. Respondent must provide the functionality to track and capture a student’s use of tools and accessibility features by item.
***
The online vendor shall work cooperatively with the state education agency to support and maintain compliance with specified assistive technologies (ATs) as well as any additional assistive technology that becomes available during the life of the contract resulting from this solicitation or is requested by the state education agency. The online vendor shall also provide a practice environment that allows teachers to test specific student AT within the system prior to use on either instructional instruments or assessments.
***
Contractor’s test delivery platform shall support refreshable braille devices and vision enhancing software. Contractor shall work with the state to explore the feasibility of supporting additional assistive technology including, but not necessarily limited to, screen reader and text-to-speech software, speech-to-text software, screen enlargement, and alternative input devices and software. If the state requests test access through a specific assistive technology device, contractor shall make provisions to support the aforementioned assistive technology but would not be responsible for providing any needed hardware or software (such as refreshable braille devices) for school districts or the state.
Field Testing
Field testing of items should include the full range of students for whom the items are intended. Following field testing, differential item functioning (DIF) or other statistical analyses should be conducted to assess possible item bias. Examples of RFP language on field testing include:
Item data from the field test must include the appropriate item response theory (IRT) item and task parameters, distractor and bias sensitivity analysis, and fit and differential item functioning (DIF) statistics based on the selected IRT model.
***
Following each field test or operational administration, differential item functioning (DIF) analyses will be conducted to detect possible item bias, preferably using the Mantel-Haenszel method. DIF analyses will be conducted for Caucasian, African American, and Hispanic racial/ethnic groups and by gender, disability status, and English learner status. Values for items resulting from these analyses will be included in the item-banking system. Changes in DIF values across administrations will be analyzed and presented in the technical report.
***
The scoring and reporting vendor shall incorporate differential item functioning (DIF) analyses with multiple procedures in the test development process and minimize the use of items with large DIF. If such items are necessary, a thorough content review should be conducted to ensure these items are not biased against certain subgroups of students. The scoring and reporting vendor shall examine DIF for subgroups including gender, race/ethnicity, special education status, English learner status, and socioeconomic status.
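The Mantel-Haenszel procedure named in the example language above can be summarized with a short computation. The following Python sketch, written under simplified assumptions (one dichotomous item, a reference and a focal group, and examinees matched on total test score), computes the Mantel-Haenszel common odds ratio and the ETS delta-scale statistic often used to flag items for review. It is an illustration of the method, not a vendor-ready analysis.

# Minimal sketch of a Mantel-Haenszel DIF screen for a single dichotomous item.
# Inputs are hypothetical: item_scores are 0/1, groups are "ref" or "focal",
# and total_scores are the matching variable used to form strata.
from collections import defaultdict
from math import log

def mantel_haenszel_dif(item_scores, groups, total_scores):
    """Return the MH common odds ratio and the ETS delta-scale statistic."""
    strata = defaultdict(lambda: {"A": 0, "B": 0, "C": 0, "D": 0})
    for item, group, total in zip(item_scores, groups, total_scores):
        cell = strata[total]
        if group == "ref":
            cell["A" if item == 1 else "B"] += 1   # reference right / wrong
        else:
            cell["C" if item == 1 else "D"] += 1   # focal right / wrong

    numerator = denominator = 0.0
    for cell in strata.values():
        n = cell["A"] + cell["B"] + cell["C"] + cell["D"]
        if n == 0:
            continue
        numerator += cell["A"] * cell["D"] / n
        denominator += cell["B"] * cell["C"] / n
    alpha_mh = numerator / denominator if denominator else float("inf")
    # ETS delta scale: values near zero indicate little DIF; absolute values
    # of roughly 1.5 or more are commonly flagged for content review.
    delta_mh = -2.35 * log(alpha_mh)
    return alpha_mh, delta_mh

# Toy example with eight examinees matched on two total-score strata.
alpha, delta = mantel_haenszel_dif(
    item_scores=[1, 0, 1, 1, 0, 1, 0, 0],
    groups=["ref", "ref", "ref", "ref", "focal", "focal", "focal", "focal"],
    total_scores=[10, 8, 10, 8, 10, 8, 10, 8])
print(f"MH odds ratio = {alpha:.2f}, MH delta = {delta:.2f}")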
Implementation Training and Support
For an assessment to be valid and reliable it must be implemented as designed. Educators must be trained in how to correctly administer the assessment and understand the reports. The ongoing support of educators and schools is critical in maintaining the integrity of the assessment program. Language that reflects this requirement in RFPs includes:
The contractor shall provide initial and ongoing training each year of the contract. The training must be provided for district-level teams, including at least five individuals in small districts, at least ten individuals in medium districts, and up to fifteen individuals in large districts. The district-level teams should consist of district-level staff responsible for implementation, district-identified school administrators, lead literacy coaches, and lead teachers. Regional trainings are permissible if they do not require overnight travel for district participants. For cost estimate purposes, respondents should plan for five annual regional trainings throughout the state. The contractor will be responsible for travel costs incurred by its own staff, as well as those of up to three Department staff. Meeting venues and costs for district attendance will not be the responsibility of the contractor.
***
The training meetings shall be completed by August 1st of each year of the contract. The purpose of these meetings will be to train district-level teams on the procedures for administering the assessment and reporting and interpreting the screening results. The contractor shall provide an Internet-delivered version of the training program for use by districts in training their school personnel. The goal of these materials is to enable district staff to train as many school personnel as needed within a short period of time.
***
The contractor is also required to provide and maintain an in-person presence throughout the state to assist independent school districts, districts, and schools in the preparation for and administration of the assessments, as well as support for the interpretation and use of reports. The contractor is required to submit an onboarding plan in support of the state’s schools for the first test administration cycle as well as a maintenance plan for at least two subsequent years.
***
Provide an overview of the implementation training approach. Describe whether the contractor approaches training through a train-the-trainer approach, turn-key implementation, or other strategy. Be specific about the number of staff that will be directly trained by contractor personnel under the proposal. Define whether training will be conducted in person, remotely (synchronous), or via on-demand tools. Provide an outline of the proposed training content and sample training support materials.
***
The lead supervisor and other contractor-trained staff shall be available to answer calls from 7:00 a.m. to 6:00 p.m. Eastern Time each day, excluding federal holidays. The contractor shall describe its proposed procedures for providing telephone support to the state. The contractor shall provide email responses from Customer Service agents within 24 hours. The contractor shall provide chat support from 7:00 a.m. to 6:00 p.m. Eastern Time each day using experienced agents.
Item Sensitivity and Bias
Item reviews (sensitivity and bias) need to consider which students will be taking the test, and reviewers with appropriate expertise should be part of the reviews. Examples of RFP language are:
Each item must be reviewed by a Content and Bias/Sensitivity Review Committee in each content area to assure that the item is of high quality, that it is aligned with a skill in the content area, that it measures the skill in a sound manner, that it does not unfairly advantage/disadvantage any student, and that it is not offensive to students, parents, or the public.
***
The committees are normally composed of representatives of the state’s citizens and educators from various backgrounds who review each newly developed test item. The primary purpose of each committee is to consider whether the subject matter and language used are free of potential bias and acceptable to the state’s students, parents, and other community members.
***
The items must also be reviewed for bias and accessibility to ensure that the assessments provide equitable measures for students with diverse cultural and ethnic backgrounds and learning styles.
***
…the contractor must ensure the pool of questions for the assessments shall be subject to a transparent review process for quality, bias, sensitivity, and accessibility issues that involves review and comment by the state’s educators.
***
Content, bias, and accessibility reviews shall be provided for passages/stimulus materials for language arts. Additional reviews may be required for social studies or other content areas as determined by the state educational agency. Reviews for item content, bias, and accessibility shall be provided for all content areas for all assessments and instructional instruments. Representatives from the state, including, but not limited to, members of higher education, legislators, and other members of the public, shall have the opportunity to participate in all reviews.
***
During the bias and/or accessibility review, items, passages, and graphics/illustrations shall be reviewed for bias and/or accessibility (visual, hearing, English learners, etc.). The item, test form, and instructional instrument development vendors shall present an adequate number of passages to provide enough items and passages to maintain the item banks for all content areas and domains, including Spanish versions, for future form development.
***
A sight review committee shall have time and opportunity to review test forms for the ability to adapt for large print or translate into braille. For the purpose of sight review, three types of meetings are herein described; these meetings should be conducted in a mutually agreed upon location prior to the completion of a braille form of the assessments.
***
Describe the process for participation in the review of test items that ensures that the assessment meets the qualifications around universal design and the needs of special populations, including comprehensive review of all accessibility features.
Item Writing
Item writers need to understand the full range of students who will take the assessment so that they proactively consider the entire population as items are developed. RFP language that addresses this includes:
Offeror shall present its plan to ensure that items will be developed that will permit students with disabilities and English learners to fully participate in the assessments and receive valid scores, while minimizing the need for accessibility features. The offeror shall fully explain how the needs of students with disabilities, especially sensory, physical, and language disabilities, as well as English learners, will be taken into consideration during item writing.
***
The contractor will fully explain how the needs of students with disabilities, especially sensory, physical and language disabilities, as well as English learners will be taken into consideration during item writing.
***
High quality items:
i. measure what they claim to measure (are clearly aligned to a standard);
ii. meet all requirements of the state education agency’s item writing style guides;
iii. make sense to students;
iv. are equally accessible to all students – regardless of socioeconomic status, race or ethnicity, special needs, English language acquisition, etc.
Reporting
It is important that reports for students, parents, teachers, and administrators are clear and accessible, and that guidance or resources are provided for the proper use and interpretation of reports. Examples of how these are addressed in RFP language include:
The reports must include an indication of measurement error, such as error band graphics; relevant comparative information, such as a bar chart displaying the student scale score, school scale score mean, and district scale score mean; and explanatory narrative on all reports where appropriate.
***
Respondents should propose a solution that includes a web-based portal for parents and families to securely access student assessment data and review their student’s Individual Student Reports (ISR) in near real-time following the student’s assessment.
***
This new assessment system will involve a vast increase in the amount of data and information educators, students, and parents will be provided. The state education agency wants to ensure that educators are prepared to use this information effectively to improve instruction in their schools. To that end, the vendor shall be prepared to assist for at least the first 2 years of this contract in providing and coordinating professional development to schools in collaboration with the state education agency.
***
Assessment results are to be reported in a “user friendly” format. We are especially interested in reporting approaches that provide actionable information for students, parents, and classroom teachers. The reporting system must be designed to complement instruction and to facilitate the use of assessment results to improve student achievement. Reports must reflect areas of strength as well as areas that could be targeted for instruction.
***
For each assessment, annually, the contractor shall develop an Interpretive Guide to assist parents and educators with interpreting assessment results. The Interpretive Guide shall be formatted in such a way that pertinent information can easily be copied at the school building for distribution to teachers or parents.
***
The contractor shall utilize feedback from students, parents, administrators, and teachers on report shells and content when designing and creating the reporting system for the assessments expected to be reported following the first test administration.
Staff Experience/Qualifications
The vendor’s staff, in both content areas and psychometrics, needs to include individuals with expertise in the assessment of the full range of students, including students with disabilities and English learners. RFP language that reflects this includes:
Contractor team members assigned to this contract must have, at a minimum, the necessary technical experience, knowledge, and operational experience in the following areas:
….
- Academic, technical, and operational experience in working with a statewide assessment for students with disabilities and English learners;
Standard Setting
All student groups should be proportionately represented in the data that are the basis of standard setting, and the process used to set standards must ensure the validity of the results for all student groups. Although examples of language to address this are rare, some examples are:
At minimum, technical reports should include:
- Norming studies that show dates of the studies, definition of the populations sampled, including four-year-old students, the procedure used to draw the samples, sample sizes, participation rates, and any weighting or smoothing procedure used to make the sample data better represent the population. Norming studies should have been conducted within the last 10 years, with 5 years being preferable.
***
…. the respondent will develop a technical report of the standard setting that describes the implementation of the standard setting workshop. The respondent will provide the state education agency with an initial draft of the technical report within 30 days of the workshop.
***
The standard setting workshops shall include, without limitation, the following:
A panel of educators, who are knowledgeable in the grade level/content areas, familiar with the state’s academic standards and graduation requirements, and who are drawn from various stakeholder groups (including representatives for special education, English Learners, visually and hearing-impaired populations, colleges, universities and other community groups).
Test Administration and Security
Historically, interim assessments have been seen as “low stakes,” and standardized administration practices and appropriate test security have sometimes been neglected. Vendors should have processes and procedures in place to ensure that the interim assessments are delivered as designed so that results are valid. Examples of RFP language that addresses this include:
The offeror shall describe in detail the steps that it will take to monitor the fidelity with which the test administration and security procedures are being applied. This shall include a plan for on-site monitoring of paper-based and computer-based administrations, as well as the use of forms, to be signed by district assessment coordinators, school assessment coordinators, and test administrators, certifying that applicable test administration and security procedures were followed.
***
Minimally, manuals will contain instructions and scripts for administering each portion of the assessments for the appropriate grade levels/subject areas for that administration. They also include information about security of materials, packing and returning materials, receipt and distribution of materials, and …
***
Standardized test directions ensure that all students have the same testing experience. Students must be provided with an opportunity to understand the task at hand. Failing to do so may increase security risks and deny students a valid opportunity to show their knowledge of a content area.
***
Test Administration Manual (TAM) – Contractor shall develop, for electronic distribution, TAMs that clearly explain all procedures relative to test administration. Where applicable, contractor shall develop individual TAMs for each assessment. Additionally, if assessment-specific manuals are developed, the TAM will clearly delineate each content/grade-level assessment by section.
Universal Design
The vendor should provide documentation that shows how the assessments were developed, with clear links to how fully the principles of Universal Design were incorporated into the development of the assessment. Universal Design principles help ensure that tests are fair for all students and that students can access the test items and respond appropriately. Examples of RFP language include:
The offeror shall explain how universal design principles will be applied to both paper-based and online administration and how universal design principles will be considered and applied to items starting at the beginning stages of item development. In addition, the offeror shall explain how construct-irrelevant language load and unnecessary visual dependencies will be addressed.
***
The system should ensure adherence to universal design concepts.
***
Items must meet the principles of Universal Design. A review of the item specifications and audit of items must be conducted with the state’s educators in advance of the operational use…
***
Universal Design: If the four principles of accessibility (perceptibility, operability, simplicity, and forgiveness) are employed by the assessment system, it will ensure that the widest possible range of students can use the system with ease…
***
The contractor shall include a description of how item writers will consider and be trained to incorporate Universal Design Principles and a description of strategies that will be used to ensure that item developers use the universal design guidelines, including, but not limited to, the following:
A. the item or task measures what it is intended to measure;
B. the item or task respects the diversity of the assessment population and allows the full range of eligible students to respond to the item/stimulus;
C. how decisions will be made to ensure that items and tasks measure what they are intended to measure for English learner students with different levels of English proficiency and/or first language proficiency;
D. multiple means of presentation, expression, and engagement have been considered with regard to individual items/tasks for both students with disabilities and English learners;
E. the item or task material uses a clear and accessible text format;
F. the item or task material uses clear and accessible visual elements (when essential to the item). Web Content Accessibility Guidelines (WCAG) 2.0 AA must be met for all visual elements.
***
Assessments and instructional instruments, including operational items, field test items, and test bank items, are developed and administered in a manner that represents Universal Design principles to maximize participation of students with disabilities and allow for accessibility features, to the extent reasonable, in accordance with the Individuals with Disabilities Education Act, 20 U.S.C. § 1400 et seq. (IDEA) and the state’s requirements and guidelines.
Validity
Clear documentation must be provided showing that the interim assessment’s scores and reports are reliable and valid for all student groups. RFP language that addresses this includes:
Considering the use of scores for accountability calculations, such as growth or value-added models, the respondent shall provide evidence that the selected scaling and design features would support such uses, for example, a satisfactory relationship between the total-test score, sub-scores, and external variables from other measures of the same construct.
***
Respondents must ensure the reliability and validity of individual student scores. The technical analyses conducted by the respondent for the state assessment must meet nationally recognized professional and technical standards, as established by the Standards for Educational and Psychological Testing, published jointly by the American Educational Research Association, the American Psychological Association, and the National Council on Measurement in Education.
***
As part of these studies, the respondent shall describe in detail how it will conduct studies to verify and support the validity of interpretations drawn from test scores.
***
Accessibility features should not provide the student with an unfair advantage or interfere with the validity of a test, and must not change the underlying skills that are being measured by the test.
***
The administration vendor shall collaborate with the scoring and reporting vendor to conduct a study and provide recommendations, based upon psychometric standards and the state’s policies, as to whether specific accessibility features required or requested for students with an individualized education program (IEP), Section 504 plan, or English language proficiency plan would alter the validity, reliability, and equity of the standards being measured. The study shall evaluate all accessibility features to determine that they: (a) are appropriate and effective, (b) do not alter the construct being assessed, and (c) allow meaningful interpretations of results. The final study report shall also provide a comparison of scores for students with disabilities and English learners as compared to other students.
***
…review of accessibility feature validity – they are appropriate and effective for meeting the individual student’s need(s) to participate in the assessments, do not alter the construct being assessed, and allow meaningful interpretations of results and comparison of scores for students with disabilities and English learners as compared to other students.
***
Contractor will provide evidence of validity of any allowable accessibility features.
***
At a minimum, the contractor’s technical reports must provide all technical data consistent with the State’s Guide to the U.S. Department of Education’s Assessment Peer Review Process and the Standards for Educational and Psychological Testing, published jointly in 2014 by the American Educational Research Association, the American Psychological Association, and the National Council on Measurement in Education. Specific technical information shall include:
- Documentation that accessibility features commonly used by students with disabilities yield valid and reliable scores;
- Documentation that accessibility features commonly used by English learners yield valid and reliable scores;
- Documentation of steps to ensure fairness in development of assessments (to include bias review, differential item functioning analysis, and impact statistics) relative to all subgroups.
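Several of the examples above call for comparing the scores of students with disabilities and English learners with those of other students, and for evidence that accessibility features yield valid scores. As a purely illustrative sketch with hypothetical data, the following Python code computes a standardized mean difference (Cohen's d with a pooled standard deviation) between the scale scores of students who used a given accessibility feature and those who did not; a real validity study would involve far more than this single descriptive comparison.

# Minimal sketch of one descriptive comparison a technical report might include.
# The score values below are invented for illustration only.
from statistics import mean, stdev

def standardized_mean_difference(scores_a, scores_b):
    """Cohen's d using a pooled standard deviation."""
    n_a, n_b = len(scores_a), len(scores_b)
    pooled_variance = ((n_a - 1) * stdev(scores_a) ** 2 +
                       (n_b - 1) * stdev(scores_b) ** 2) / (n_a + n_b - 2)
    return (mean(scores_a) - mean(scores_b)) / pooled_variance ** 0.5

scores_with_feature = [212, 225, 198, 240, 230]      # e.g., used text-to-speech
scores_without_feature = [220, 235, 210, 245, 238]   # comparison group
d = standardized_mean_difference(scores_with_feature, scores_without_feature)
print(f"Standardized mean difference: d = {d:.2f}")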
Conclusions
Some students, including many students with disabilities and English learners, require accessibility features to be able to demonstrate their knowledge and skills on interim assessments. A few students with the most significant cognitive disabilities may need an alternate assessment. It is important that states’ RFPs for interim assessments address the needs of all students, including students with disabilities and English learners. States’ procurement practices, assessment policies, and internal capacities vary widely, so this report presented a range of example language that might be used in an RFP. States should factor in the costs (both time and financial) associated with the different approaches and select a path that best meets their assessment needs and the resources available in the state.
References
- Boyer, M., & Landl, E. (2021). Interim assessment practices for students with disabilities (NCEO Brief #22). National Center on Educational Outcomes and National Center for the Improvement of Educational Assessment. https://nceo.umn.edu/docs/OnlinePubs/NCEOBrief22.pdf
- Browder, D. M., Lazarus, S. S., & Thurlow, M. L. (2021). Alternate interim assessments for students with the most significant cognitive disabilities (NCEO Brief #23). National Center on Educational Outcomes. https://nceo.umn.edu/docs/OnlinePubs/NCEOBrief23.pdf
- Lazarus, S. S., Hinkle, A. R., Liu, K. K., Thurlow, M. L., & Ressa, V. A. (2021). Using interim assessments to appropriately measure what students with disabilities know and can do: Advisory panel takeaways and NCEO recommendations (NCEO Report #427). National Center on Educational Outcomes. https://nceo.umn.edu/docs/OnlinePubs/NCEOReport427.pdf
- Perie, M., Marion, S., Gong, B., & Wurtzel, J. (2007). The role of interim assessments in a comprehensive assessment system: A policy brief. Aspen Institute.
All rights reserved. Any or all portions of this document may be reproduced without prior permission, provided the source is cited as:
Bruce, W. (2025). Developing requests for proposals (RFPs) for interim assessments that include all students (NCEO Report 445). National Center on Educational Outcomes.
The Center is supported through a Cooperative Agreement (#H326G210002) with the Research to Practice Division, Office of Special Education Programs, U.S. Department of Education. The Center is affiliated with the Institute on Community Integration at the College of Education and Human Development, University of Minnesota. Consistent with EDGAR §75.62, the contents of this report were developed under the Cooperative Agreement from the U.S. Department of Education, but do not necessarily represent the policy or opinions of the U.S. Department of Education or Offices within it. Readers should not assume endorsement by the federal government. Project Officer: David Egnor

In collaboration with:
NCEO Core Staff
Andrew R. Hinkle, Co-Director
Kristi K. Liu, Co-Director
Jessica Bowman
Gail Ghere
Linda Goldstone
Michael L. Moore
Darrell Peterson
Mari Quanbeck
Virginia A. Ressa
Kathy Strunk
Yi-Chen Wu
National Center on Educational Outcomes
University of Minnesota
2025 East River Parkway, Room 1-330
Minneapolis, MN 55414
Phone 612/626-1530
The University of Minnesota shall provide equal access to and opportunity in its programs, facilities, and employment without regard to race, color, creed, religion, national origin, gender, age, marital status, disability, public assistance status, veteran status, sexual orientation, gender identity, or gender expression.
This document is available in alternative formats upon request.
NCEO is an affiliated center of the Institute on Community Integration