Test Breaks: Research
This fact sheet on test breaks is part of the Accommodations Toolkit published by the National Center on Educational Outcomes (NCEO). It summarizes information and research findings on test breaks as an accommodation. The toolkit also contains a summary of states’ accessibility policies for test breaks.
What are test breaks?
The test breaks accommodation is the provision of breaks beyond those provided as part of the standard test administration. Breaks may be taken as needed or on a schedule, supervised or unsupervised, and within or outside of the testing setting. When test breaks are used as an accommodation, the break time is not subtracted from the allowable test time. Some states and accessibility manuals refer to breaks as frequent breaks, extended breaks, more breaks, periodic breaks, and short supervised breaks (Joakim, 2015; One Feather, 2010; Ysseldyke et al., 2001). The test breaks accommodation is most often “bundled” with other accommodations such as extended time and setting (e.g., individual administration, small group administration) (One Feather, 2010).
What are the research findings on who should use this accommodation?
Test breaks may be useful for students who have an emotional difficulty that might interfere with performance, students with attention and concentration issues, and students with challenges related to fatigue and frustration (Ganguly, 2010). The accommodation may also be useful for English learners with disabilities who find tests in English overwhelming, tiring, or stressful (Abedi et al., 2008). However, the research findings do not provide a clear indication that test breaks actually improve the performance of students with disabilities. For example, Joakim (2015) found that fourth and fifth grade students with disabilities did not perform better when they used breaks during a state-wide writing test.
What are the research findings on implementation of test breaks?
Twelve studies were located that addressed test breaks. Ten of these examined the use of test breaks as an accommodation.
- Test breaks are one of the most commonly allowed accommodations in state accessibility policies (Abedi et al., 2008; Albus & Thurlow, 2008; One Feather, 2010). Similarly, test breaks are one of the most frequently included accommodations on students’ Individualized Education Programs (IEPs) (Finizio, 2008; Ganguly, 2010; Kern et al., 2019; Ysseldyke et al., 2001).
- Students who use breaks as an accommodation during testing should also be afforded the opportunity for similar breaks during instruction (Finizio, 2008; Kern et al., 2019; Ysseldyke et al., 2001). However, students with disabilities who use breaks as an accommodation often do not have a similar accommodation in instructional settings (Finizio, 2008; Ganguly, 2010; Gibson et al., 2005).
- One study (Abedi et al., 2008) explored the possibility of having breaks “built in” to the test by segmenting the content so students would experience a compulsory break between each segment. Abedi and his colleagues found that segmenting a reading assessment this way did not result in any significant improvement in the scores for students with and without disabilities. Further, incorporating breaks into the test design did not produce improved student motivation, mood, or general emotions.
- Ganguly (2010) studied the use of sets of bundled accommodations with students with emotional and behavioral disabilities, and found that educators wanted to compensate for emotional difficulties that might interfere with student performance on the assessment when selecting this accommodation.
- Three studies found that the use of bundled accommodations, including test breaks, did not interfere with the validity of the assessment or have a significant impact on the performance of students with disabilities (Cho, Lee, & Kingston, 2012; George-Ezzelle & Skaggs, 2005; Joakim, 2015).
What perceptions do students and teachers have about test breaks?
Four research studies examined student and teacher perceptions about the use of test breaks.
- Researchers found that teachers perceived test breaks to be a useful accommodation for students who may have an emotional difficulty that might interfere with performance (Ganguly, 2010). Similarly, teachers perceived test breaks as an appropriate accommodation for students with attention and concentration issues, or challenges related to fatigue and frustration (Abedi et al., 2008).
- One study found that most general education teachers in grades 9–12 perceived their knowledge of timing accommodations, including test breaks, to be “somewhat high” to “high” (Davis, 2011).
- One Feather (2010) found that teachers perceived test breaks as important for English learners with disabilities for multiple reasons, including that tests in English may be overwhelming, tiring, or stressful. One teacher noted that breaks relieved student anxiety and believed that they were necessary for accurate testing.
- Teachers reported providing breaks for English learners with disabilities who received extended time because of the extra time students need to translate test items (One Feather, 2010).
- Students who took a segmented version of a reading test, with built-in breaks, did not report higher motivation than students taking the standard version of the test (Abedi et al., 2008).
What have we learned overall?
Test breaks are one of the most frequently included accommodations on student IEPs, yet research on test breaks as an assessment accommodation on its own, not bundled with other accommodations, is limited. There is a need for additional research that focuses only on test breaks. The research that is available does not show that test breaks have a significant effect on the performance of students with disabilities; however, teachers perceive that test breaks may be useful for students who are easily fatigued or frustrated, or who have challenges with attention and concentration. Teacher perceptions also suggest that English learners with disabilities may find this accommodation useful.
References
Abedi, J., Kao, J. C., Leon, S., Sullivan, L., Herman, J. L., Pope, R., … Mastergeorge, A. M. (2008). Exploring factors that affect the accessibility of reading comprehension assessments for students with disabilities: A study of segmented text (CRESST Report No. 746). Retrieved from University of California-Los Angeles, National Center for Research on Evaluation, Standards, and Student Testing website: http://www.cse.ucla.edu/products/reports/R746.pdf
Albus, D., & Thurlow, M. (2008). Accommodating students with disabilities on state English language proficiency assessments. Assessment for Effective Intervention, 33(3), 156–166. https://doi.org/10.1177/1534508407313241
Cho, H. J., Lee, J., & Kingston, N. (2012). Examining the effectiveness of test accommodation using DIF and a mixture IRT model. Applied Measurement in Education, 25(4), 281–304. https://doi.org/10.1080/08957347.2012.714682
Davis, J. E. (2011). Secondary education teachers’ perceptions related to their knowledge and effectiveness of accommodations for students with mild disabilities. Dissertation Abstracts International: Section A. Humanities and Social Sciences, 72(10). Retrieved from http://search.proquest.com/docview/884226584/abstract
Finizio, N. J., II. (2008). The relationship between instructional and assessment accommodations on student IEPs in a single urban school district. Dissertation Abstracts International: Section A. Humanities and Social Sciences, 69(05). Retrieved from http://search.proquest.com/docview/304817648/abstract
Ganguly, R. (2010). Testing accommodations for students with emotional or behavioral disorders: A national survey of special education teachers. Dissertation Abstracts International: Section A. Humanities and Social Sciences, 71(12). Retrieved from http://search.proquest.com/docview/787893086/abstract
George-Ezzelle, C. E., & Skaggs, G. (2005). Examining the validity of GED tests scores with scheduling and setting accommodations. Retrieved from GED Testing Service, American Council on Education website: http://www.gedtestingservice.com/uploads/files/9964eb8d2ccc151283f1d09277755696.pdf
Gibson, D., Haeberlie, F. B., Glover, T. A., & Witter, E. A. (2005). Use of recommended and provided testing accommodations. Assessment for Effective Intervention, 31(1), 19–36. https://doi.org/10.1177/073724770503100103
Joakim, S. E. (2015). Help me fail: A study on testing accommodations for students with disabilities in writing assessments. Dissertation Abstracts International: Section A. Humanities and Social Sciences, 77(03E). Retrieved from http://search.proquest.com/docview/1735801010/abstract
Kern, L., Hetrick, A. A., Custer, B. A., & Commisso, C. E. (2019). An evaluation of IEP accommodations for secondary students with emotional and behavioral problems. Journal of Emotional and Behavioral Disorders, 27(3), 178–192. https://doi.org/10.1177/1063426618763108
One Feather, M. (2010). Test accommodations and standardized assessment for students with learning disabilities who are second language learners. Dissertation Abstracts International: Section A. Humanities and Social Sciences, 71(11). Retrieved from http://search.proquest.com/docview/761141872/abstract
Ysseldyke, J., Thurlow, M., Bielinski, J., House, A., Moody, M., & Haigh, J. (2001). The relationship between instructional and assessment accommodations in an inclusive state accountability system. Journal of Learning Disabilities, 34(3), 212–220. https://doi.org/10.1177/002221940103400302
All rights reserved. Any or all portions of this document may be reproduced and distributed without prior permission, provided the source is cited as:
Ressa, V., Lazarus, S. S., Rogers, C. M., & Goldstone, L. (2021). Test breaks: Research (NCEO Accommodations Toolkit #7a). National Center on Educational Outcomes.
NCEO is supported through a Cooperative Agreement (#H326G160001) with the Research to Practice Division, Office of Special Education Programs, U.S. Department of Education. The Center is affiliated with the Institute on Community Integration at the College of Education and Human Development, University of Minnesota. NCEO does not endorse any of the commercial products used in the studies. The contents of this report were developed under the Cooperative Agreement from the U.S. Department of Education, but do not necessarily represent the policy or opinions of the U.S. Department of Education or Offices within it. Readers should not assume endorsement by the federal government. Project Officer: David Egnor