
What Did You Learn Over Summer Break? Understanding and Combating Summer Learning Loss

As teachers across the nation ask returning students, “What did you learn over summer break?” this fall, educators are acutely aware that for many students, particularly those from historically disadvantaged backgrounds, the answer may be less than encouraging academically. This phenomenon, widely recognized as summer learning loss, summer setback, or the “summer slide,” has concerned education researchers for over a century: studies of summer vacation’s impact on students’ academic progress date back to at least 1906. This article delves into the current understanding of summer learning loss and offers insights for schools and districts seeking effective strategies to mitigate its effects.

Early research provided a foundational understanding of summer learning loss. A comprehensive review of studies in the field highlighted several key findings. First, students’ achievement scores declined, on average, by roughly one month of school-year learning over summer vacation. Second, this decline was more pronounced in mathematics than in reading. Third, the extent of learning loss tended to increase at higher grade levels. Crucially, the review also pointed to a widening of income-based reading gaps over the summer: while students from middle-class backgrounds often improved their reading skills during the summer months, students from lower-income backgrounds were more likely to lose reading proficiency. The review did not, however, find significant income-based differences in summer math loss, nor differences by gender or race in either subject.

More recent studies on summer learning loss have presented a more nuanced and sometimes mixed picture. One study, analyzing data from over half a million students in grades 2-9 in a southern state between 2008 and 2012, revealed that students, on average, lost between 25% and 30% of their school-year learning over the summer. Furthermore, this study indicated that Black and Latino students tended to experience less academic gain during the school year and greater learning loss during the summer compared to their white peers. However, an analysis of the nationally representative Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011) presented a different perspective, finding limited evidence of overall learning loss during the summers following kindergarten and first grade. The study also noted that while socioeconomic status gaps widened in some subjects and grades during the summer, this was not a consistent pattern across all areas. Further complicating the picture, Von Hippel and Hamrock re-analyzed earlier datasets and concluded that academic gaps “do not necessarily…grow fastest over the summer.” These diverse findings suggest that while summer learning loss and the widening of achievement gaps do occur, they are not universal phenomena, varying across geographic locations, grade levels, and subject areas.

The “faucet theory,” developed by Entwisle, Alexander, and Olson, provides a compelling explanation for why students from lower-income backgrounds may experience greater summer learning loss compared to their higher-income counterparts. This theory posits that during the school year, the “resource faucet” is turned on for all students, providing equal access to educational resources and enabling learning gains for everyone. However, during the summer months, the flow of these resources diminishes significantly for students from disadvantaged backgrounds, while students from more privileged backgrounds continue to have access to enriching opportunities. Higher-income students often benefit from continued access to financial and human capital resources, such as parental education and involvement, over the summer, which facilitates ongoing learning and skill development.

Traditionally, educators and policymakers have turned to summer school programs as a primary strategy to combat summer learning loss and mitigate the widening of achievement gaps. A comprehensive meta-analysis of classroom-based summer programs conducted in 2000 by Cooper and colleagues found that these programs generally had positive effects on student achievement. However, their research also indicated that middle-income students tended to benefit more from summer programming than lower-income students. This disparity led to speculation that programs serving more advantaged students might be of higher quality, or that there might be an interactive effect between the programming and the home resources available to students. This finding raised concerns that well-intentioned efforts to address summer learning loss could inadvertently exacerbate achievement gaps if not carefully targeted and implemented.

In contrast to Cooper’s findings, a meta-analysis by Kim and Quinn of 41 summer reading programs from 35 studies published after Cooper et al.’s review presented a different perspective. Like Cooper and colleagues, Kim and Quinn found that summer reading programs were generally effective in raising test scores. Significantly, however, they found that low-income students actually benefited more from summer reading programs than higher-income students, even when comparing students from different income levels attending the same program. Their analysis suggested that lower-income students benefited more because they were more likely to experience summer reading loss when they did not participate in summer programs. The authors identified several key differences between their review and Cooper et al.’s that could explain the contrasting results: 1) Kim and Quinn focused exclusively on reading programs, whereas Cooper and colleagues included both math and reading programs; 2) Kim and Quinn included only two-group experimental and quasi-experimental studies, while Cooper and colleagues also included single-group pre/post-test designs; and 3) Kim and Quinn’s review incorporated home-based programs in addition to classroom-based ones.

The effectiveness of school-based summer school programs can vary considerably. Many recommendations for creating high-quality programs are based on expert opinion and best practices. Common suggestions include integrating academic learning with hands-on or recreational activities to enhance student engagement, professionalizing summer school staff to ensure quality instruction, and establishing partnerships with community organizations to leverage resources and broaden program offerings. Research also provides valuable insights into effective program design. For example, Kim and Quinn’s meta-analysis found that programs were more effective when they employed research-based literacy instruction. Specifically, programs that utilized instructional strategies identified by the National Reading Panel as best practices had the largest impact on students’ reading comprehension scores, equivalent to moving a student from the 50th to the 65th percentile. Program effectiveness also varied across literacy domains: programs proved effective in improving reading comprehension and fluency/decoding scores but less so in enhancing vocabulary scores. Unsurprisingly, research also indicates that program effectiveness is directly linked to student attendance and the amount of time students spend actively engaged in academic tasks.
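
To make the percentile comparison concrete: under a normal distribution, a standardized effect size maps directly onto the percentile a median student would reach. The short sketch below is a minimal illustration, not part of the meta-analysis; the 0.39 standard deviation input is an assumed value chosen only because it reproduces the 50th-to-65th percentile shift described above.

```python
from scipy.stats import norm

# Minimal sketch: convert a standardized effect size (in SD units) into the
# percentile a student starting at the 50th percentile would reach, assuming
# normally distributed test scores. The 0.39 SD input is an assumption chosen
# to reproduce the 50th-to-65th percentile shift described in the text; it is
# not a figure quoted from the meta-analysis.
def percentile_after_gain(effect_size_sd: float) -> float:
    """Percentile reached by a median student after gaining `effect_size_sd` SDs."""
    return norm.cdf(effect_size_sd) * 100

print(f"{percentile_after_gain(0.39):.1f}")  # ~65.2
```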

While school-based summer learning programs hold significant promise when they adhere to these quality criteria, they often fall short of expectations in practice. Two major challenges contribute to the ineffectiveness of some school-based summer programs: difficulty in attracting and retaining high-quality teachers and challenges in appealing to students and families for whom attending summer school may represent a significant opportunity cost, potentially conflicting with summer jobs or other activities. Furthermore, school-based programs can be quite expensive to operate. These challenges have led researchers to explore and experiment with lower-cost, home-based summer programming models, with encouraging initial success.

One successful example of a home-based summer reading program specifically designed for low-income upper elementary school students is READS for Summer Learning. This program, refined through multiple randomized trials, provides students with eight books mailed to their homes over the summer, carefully selected to match their reading level and interests. Each book is accompanied by a tri-fold paper activity guide that leads students through a pre-reading activity and a post-reading comprehension check. Students are asked to mail back the postage-prepaid tri-fold for review and feedback, and families receive reminders if tri-folds are not returned. Importantly, teachers deliver scripted lessons at the end of the school year to prepare students to effectively use the tri-fold scaffold and engage in productive independent reading over the summer. A recent study demonstrated that READS had a positive impact on low-income students’ reading comprehension in the spring following their participation in the intervention (ES=.05 SD on the state reading test). Additional research suggests that the tri-fold activity guide plays a key role in mediating the program’s positive effects.

Another recent randomized trial demonstrated the surprising effectiveness of a simple and low-cost intervention: sending text messages to families of elementary school students at risk of summer learning loss. This study found that summer text messages improved the reading scores of third- and fourth-graders (but not first- or second-graders), with effect sizes ranging from .21 to .29. The text messages included practical tips on accessing available summer resources, ideas for engaging learning activities to do with children, and information highlighting the value of summer learning activities.

Home-based programs like READS and text messaging interventions offer a more cost-effective approach to combating summer learning loss compared to traditional school-based programs. For instance, the estimated cost of READS per student ranges from $250 to $480, while other supplementary education service programs can cost as much as $1,700 per student, often with similar or even less favorable cost-effectiveness ratios.
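
As a rough illustration of how such cost-effectiveness comparisons are made, the sketch below computes dollars spent per 0.01 standard deviations of achievement gain. The per-student costs come from the figures above; the effect sizes are illustrative assumptions (0.05 SD for READS, as reported earlier, and a hypothetical 0.10 SD for a $1,700 supplementary program), not estimates from any particular evaluation.

```python
# Back-of-the-envelope cost-effectiveness sketch, not an evaluation result.
# Per-student costs come from the figures cited in the text; the effect sizes
# are illustrative assumptions (0.05 SD for READS as reported above, and a
# hypothetical 0.10 SD for a $1,700 supplementary program).
def cost_per_hundredth_sd(cost_per_student: float, effect_size_sd: float) -> float:
    """Dollars spent per 0.01 SD of achievement gain, per student."""
    return cost_per_student / (effect_size_sd / 0.01)

print(cost_per_hundredth_sd(480, 0.05))   # READS at its upper-bound cost: 96.0
print(cost_per_hundredth_sd(1700, 0.10))  # hypothetical pricier program: 170.0
```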

Kim and Quinn’s meta-analysis, which included home-based programs, provided encouraging evidence that the effects of home-based programs were not significantly different from those of their more expensive classroom-based counterparts. However, these effects may not be as substantial as those achieved by the highest-quality school-based programs that pair research-backed instructional strategies with intensive interventions.

Addressing summer learning loss is crucial for schools and districts, not only to prevent the exacerbation of achievement gaps but also to avoid “wasting” the knowledge and skills students acquire during the school year. Summer learning loss undoubtedly increases the amount of time teachers must dedicate to “re-teaching” the previous year’s content at the beginning of each academic year, likely contributing to the perceived repetitiveness of the typical U.S. curriculum. While investing in extensive school-based summer options may be financially challenging for many districts, implementing targeted out-of-school interventions for students most vulnerable to summer backsliding can be a cost-effective and strategic approach. When designing such programs, policymakers and educators should consider the following research-backed recommendations:

  • Center the program around evidence-based curriculum: Utilize instructional materials and methods proven to be effective in promoting learning gains.
  • Incorporate hands-on or recreational activities alongside academic content: Blend learning with engaging and enjoyable activities to attract and motivate students.
  • Ensure sufficient time on task and encourage consistent attendance: Structure programs to maximize learning time and implement policies or incentives to promote regular participation.
  • Invest in hiring effective and qualified teachers: Prioritize attracting and retaining skilled educators for summer programs.

Regardless of the specific program design, it is essential that summer learning initiatives offer engaging and enriching options for students so that participation feels like an opportunity rather than a punitive measure imposed on their summer vacation. By implementing thoughtful and effective summer learning strategies, we can better equip all students for success as the school year commences and beyond.

The authors did not receive financial support from any firm or person for this article or from any firm or person with a financial or political interest in this article. Neither author is currently an officer, director, or board member of any organization with an interest in this article.

Authors

David M. Quinn Assistant Professor of Education – University of Southern California

Morgan Polikoff Associate Professor of Education – USC Rossier School of Education @mpolikoff

Cooper, H., Nye, B., Charlton, K., Lindsay, J., & Greathouse, S. (1996). The effects of summer vacation on achievement test scores: A narrative and meta-analytic review. Review of Educational Research, 66(3), 227–268. http://journals.sagepub.com/doi/10.3102/00346543066003227

Ibid.

Atteberry, A., & McEachin, A. (2016). School’s out: Summer learning loss across grade levels and school contexts in the United States today. In Alexander, K., Pitcock, S., & Boulay, M. (Eds.), Summer learning and summer learning loss (pp. 35-54). New York: Teachers College Press.

Quinn, D.M., Cooc, N., McIntyre, J., & Gomez, C.J. (2016). Seasonal dynamics of academic achievement inequality by socioeconomic status and race/ethnicity: Updating and extending past research with new national data. Educational Researcher, 45(8), 443-453. http://journals.sagepub.com/doi/abs/10.3102/0013189X16677965?journalCode=edra

Von Hippel, P.T., & Hamrock, C. (2016). Do test score gaps grow before, during, or between the school years? Measurement artifacts and what we can know in spite of them. (Social Science Research Network working paper). Retrieved from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2745527

Entwisle, D. R., Alexander, K. L., & Olson, L. S. (2000). Summer learning and home environment. In Kahlenberg, R. D. (Ed.), A notion at risk: Preserving public education as an engine for social mobility (pp. 9–30). New York, NY: Century Foundation Press.

Borman, G. D., Benson, J., & Overman, L. T. (2005). Families, schools, and summer learning. The Elementary School Journal, 106(2), 131–150. http://www.journals.uchicago.edu/doi/abs/10.1086/499195

Cooper, H., Charlton, K., Valentine, J. C., & Muhlenbruck, L. (2000). Making the most of summer school: A meta-analytic and narrative review. Monographs of the Society for Research in Child Development, 65, i-127. https://www.jstor.org/stable/3181549

Kim, J. S., & Quinn, D. M. (2013). The effects of summer reading on low-income children’s literacy achievement from kindergarten to grade 8: A meta-analysis of classroom and home interventions. Review of Educational Research, 83(3), 386–431. http://journals.sagepub.com/doi/10.3102/0034654313483906

McLaughlin, B., & Pitcock, S. (2009). Building quality in summer learning programs: Approaches and recommendations (White paper commissioned by the Wallace Foundation). Retrieved from: http://www.wallacefoundation.org/knowledge-center/documents/building-quality-in-summer-learning-programs.pdf

Augustine, C. H., Sloan McCombs, J., Pane, J. F., Schwartz, H. L., Schweig, J., McEachin, A., & Siler-Evans, K. (2016). Learning from summer: Effects of voluntary summer learning programs on low-income urban youth. Santa Monica, CA: RAND Corporation. Retrieved from: https://www.rand.org/pubs/research_reports/RR1557.html

Denton, D. R. (2002). Summer school: Unfulfilled promise. Atlanta, GA: Southern Regional Education Board. Retrieved from: http://files.eric.ed.gov/fulltext/ED467662.pdf

McLaughlin & Pitcock (2009)

e.g., Kim, J. S., Guryan, J., White, T. G., Quinn, D. M., Capotosto, L., & Kingston, H. C. (2016). Delayed effects of a low-cost and large-scale summer reading intervention on elementary school children’s reading comprehension. Journal of Research on Educational Effectiveness, 9(sup1), 1-22. http://www.tandfonline.com/doi/abs/10.1080/19345747.2016.1164780?journalCode=uree20

Ibid.

Guryan, J., Kim, J.S., & Quinn, D.M. (2014). Does reading during the summer build reading skills? Evidence from a randomized experiment in 463 classrooms. NBER Working Paper No. 20689. http://www.nber.org/papers/w20689

Kraft, M.A., & Monti-Nussbaum, M. (in press). Can schools empower parents to prevent summer learning loss? A text messaging field experiment to promote literacy skills. The ANNALS of the American Academy of Political and Social Science. https://scholar.harvard.edu/files/mkraft/files/kraft_monti-nussbaum_2017_can_schools_empower_parents_to_prevent_summer_learning_loss_annals.pdf

Polikoff, M. S. (2012). The redundancy of mathematics instruction in US elementary and middle schools. The Elementary School Journal, 113(2), 230-251. http://web-app.usc.edu/web/rossier/publications/66/The%20Redundancy%20of%20Math%20Instruction.pdf
