31 October 2012

Breaking Up Summer Vacation Doesn't Work

Steven C. McMullen and Kathryn E. Rouse, in their paper "The Impact of Year-Round Schooling on Academic Achievement: Evidence from Mandatory School Calendar Conversions," published in the American Economic Journal, report:
In 2007, 22 Wake County, North Carolina traditional calendar schools were switched to year-round calendars, spreading the 180 instructional days evenly across the year. . . . We then exploit the natural experiment to evaluate the impact of year-round schooling on student achievement. . . . Results suggest that year-round schooling has essentially no impact on academic achievement of the average student. Moreover, when the data are broken out by race, we find no evidence that any racial subgroup benefits from year-round schooling.


Via Tyler Cowen at Marginal Revolution (full text of the study available here).

There was good reason to think that this approach would work, and that it would particularly benefit working-class and minority children: evidence famously cited in Malcolm Gladwell's book "Outliers" showed that most of the loss in academic achievement for these children, relative to their more affluent peers, occurred during long summer vacations. The paper's review of the literature describes that research and is followed by the paper's summary of its results (emphasis added; the included footnotes are relabeled and paragraph breaks modified for readability in a blog format):

Proponents of YRS [Year Round School] calendars argue that they are beneficial to students because they help alleviate human capital loss during the long summer break (“summer learning loss”). Supporters further contend that the long break is particularly harmful for low-income, low-performing students who are less able to afford supplemental learning opportunities in the summer (Von Drehle, 2010). These assertions are largely supported by a wide literature on summer learning loss, which has found that student achievement stagnates over the summer, and that for low achieving and disadvantaged students especially, achievement can often decline while not in school (Cooper et al. 1996; Jamar 1994; Alexander et al. 2007).*

* It is well documented that inequalities in student achievement are generally exacerbated over the summer months (Downey et al. 2004; Reardon 2003; Alexander et al. 2007).

Alexander et al. (2007) finds that by the end of ninth grade, almost two-thirds of the socioeconomic achievement gap can be explained by differential summer learning loss. It is important to note, however, that the ability of YRS to address this problem depends crucially upon the nature of the human capital accumulation process. In this paper, we present a simple model that illustrates YRS can only improve achievement if learning loss accelerates with the number of days out of school or if there are diminishing returns to learning.**

** Some critics also argue the more frequent breaks actually create more disruption in the learning process (Rasberry 1992). More frequent breaks could negatively impact achievement if learning was convex in the number of days of school.

Thus, even if disadvantaged students lose more human capital than their wealthier counterparts over summer, YRS cannot alleviate the problem unless there are specific non-linearities in the human capital process. If YRS acts largely as a remedy for summer learning loss, the impact should be no greater than the documented negative impact of a summer vacation away from school, which is rarely larger than a loss of 0.1 standard deviations of student achievement per year, and often close to zero (Downey et al. 2004; Cooper et al. 1996).

Our study adds to a body of literature, primarily coming from outside of the field of economics, that is well-summarized by the meta-analysis performed by Cooper et al. (2003). The general consensus coming out of that review is that the impact of year-round education on student achievement is, on average, nearly negligible. On the other hand, the evidence suggests the modified calendar does benefit low performing and economically disadvantaged students.

McMillen (2001) finds similar results using a cross-sectional dataset from North Carolina. The primary drawback of these early studies is their failure to account for non-random student and school selection. The studies included in Cooper et al. (2003) do not adequately control for student and school characteristics, and none attempt to control for both unobserved student and school heterogeneity. Cooper et al. (2003) thus concludes that it “would be difficult to argue with policymakers who choose to ignore the existent database because they feel that the research designs have been simply too flawed to be trusted (p. 43).” McMillen (2001) is able to control for a student’s previous year end-of-grade test score, gender, ethnicity, and parents’ highest level of education. However, data limitations prevent him from controlling for other student, family, and school characteristics that may also impact student achievement, making it difficult to draw causal inferences.

Cooper et al. (2003) report that those studies that do a better job controlling for student and school characteristics find smaller YRS effect sizes, indicating that the lack of proper controls may bias the results of previous studies upward. This result may be indicative of non-random selection of high-achieving students into YRS or could also reflect the non-random implementation of year-round calendars in high-income, high achieving areas.

Most recently, Graves (2010) uses detailed longitudinal school-level data from California to estimate the impact of the multi-track year-round calendar on academic achievement. By including school fixed effects and school-specific time trends, Graves is able to mitigate concerns over non-random year-round calendar implementation. In contrast to much of the prior research on YRS, Graves finds achievement in multi-track year-round schools is 1 to 2 percentile points lower than that in traditional calendar schools. However, without student-level data, she is not able to control for non-random student selection into YRS or to estimate the impacts separately by race. . . .

Our paper adds to this literature in three important ways. First, we perform the first study that controls for both observed and unobserved student and school heterogeneity, which is vital given the concerns in the literature about both student and school selection effects (McMillen 2001; Cooper et al. 2003). Second, because we use student-level panel data, we examine not only the impact of YRS on the level of achievement, but also look at the impact on the achievement growth. Finally, our data and methodology allow us to estimate the impact of YRS by race. . . .

Consistent with the existing literature, our results suggest YRS has essentially no impact on the academic achievement of the average student. Moreover, when the data is broken down by racial sub-group, the evidence indicates that, contrary to some previous studies, disadvantaged racial groups do not benefit from YRS. Taken as a whole, these results are consistent with the assertion that dividing a long summer break into several shorter breaks will not improve student achievement or address achievement gaps.

REFERENCES

Alexander, Karl L., Doris R. Entwisle, and Linda Steffel Olson. 2007. “Lasting Consequences of the Summer Learning Gap.” American Sociological Review, 72: 167-180. . . .

Cooper, Harris, Jeffrey C. Valentine, Kelly Charlton, and April Melson. 2003. “The Effects of Modified School Calendars on Student Achievement and on School and Community Attitudes.” Review of Educational Research, 73(1): 1-52.

Cooper, Harris, Barbara Nye, Kelly Charlton, James Lindsay, and Scott Greathouse. 1996. “The Effects of Summer Vacation on Achievement Test Scores: A Narrative and Meta-Analytic Review.” Review of Educational Research, 66(3): 227-268.

Downey, Douglas B., Paul T. von Hippel, and Beckett A. Broh. 2004. “Are Schools the Great Equalizer? Cognitive Inequality during the Summer Months and the School Year.” American Sociological Review, 69: 613-635. . . .

Graves, Jennifer. 2010. “The Academic Impact of Multi-Track Year-Round School Calendars: A Response to School Overcrowding.” Journal of Urban Economics, 67: 378-391. . . .

Jamar, Idorenyin. 1994. “Fall Testing: Are Some Students Differentially Disadvantaged?” Pittsburgh, PA: University of Pittsburgh Learning Research and Development Center. . . .

McMillen, Bradley J. 2001. “A Statewide Evaluation of Academic Achievement in Year-Round Schools.” The Journal of Educational Research, 95(2): 67-73. . . .

Rasberry, Quinn. 1992. “Year Round Schools May Not be the Answer.” Time To Learn Report.

Reardon, Sean F. 2003. “Sources of Educational Inequality: The Growth of Racial/Ethnic and Socioeconomic Test Score Gaps in Kindergarten and First Grade.” Population Research Institute, Pennsylvania State University Working Paper No. 03-05R. . . .
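
For readers who want to see what the identification strategy described in the quoted passage amounts to in practice, here is a minimal sketch of that kind of fixed-effects specification. This is my own construction, not the authors' code: the variable names, the toy data-generating process, and the use of statsmodels are all hypothetical, and the true year-round effect is set to zero by design.

    # A minimal sketch of a panel regression in which student (and year) dummies
    # absorb time-invariant heterogeneity; all names and numbers are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n_students, n_years = 300, 4
    df = pd.DataFrame({
        "student": np.repeat(np.arange(n_students), n_years),
        "year": np.tile(np.arange(n_years), n_students),
    })
    df["school"] = df["student"] % 10                                        # toy school assignment
    df["year_round"] = ((df["school"] < 3) & (df["year"] >= 2)).astype(int)  # some schools convert in year 2
    student_ability = rng.normal(size=n_students)                            # unobserved student heterogeneity
    df["score"] = student_ability[df["student"]] + 0.1 * df["year"] + rng.normal(scale=0.5, size=len(df))
    # The true effect of the year-round calendar on scores is zero in this toy data.

    # Student fixed effects absorb everything constant about a student (and, since
    # students in this toy panel never change schools, school effects as well); in
    # the actual study, school effects are separately identified because students move.
    fit = smf.ols("score ~ year_round + C(year) + C(student)", data=df).fit()
    print(round(fit.params["year_round"], 3))   # should come out close to zero

The point of such a design is that any estimated year-round effect cannot be driven by fixed differences between the students or schools that happen to end up on a year-round calendar.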


This study implies that the relationship between the number of days you spend in school and what you learn is predominantly linear, with the gains from each day in a formal school setting being greater if you are a low-income or minority student than if you are a middle-class white student.

The language from the paper that I have emphasized explains that how a 180-day school year is distributed across the calendar year matters only to the extent that the relationship between the amount learned and the number of consecutive days in school (or out of school) is non-linear.
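
To make that condition concrete, here is a toy simulation of the argument. Everything in it is invented for illustration — the gain and loss rates, the convexity factor, and the stylized calendars are my assumptions, not numbers from the paper — but it shows why constant per-day gains and losses make the arrangement of school days irrelevant, while accelerating forgetting does not.

    # Toy illustration only: hypothetical per-day gain and loss rates.
    GAIN_PER_SCHOOL_DAY = 1.0   # achievement gained per day in school (made up)
    LOSS_PER_BREAK_DAY = 0.3    # achievement lost per day out of school (made up)

    def year_end_achievement(calendar, convex_loss=False):
        """calendar: 365 booleans, True = school day, False = break day.
        With convex_loss=True, forgetting accelerates the longer a break runs."""
        achievement = 0.0
        consecutive_days_off = 0
        for is_school_day in calendar:
            if is_school_day:
                achievement += GAIN_PER_SCHOOL_DAY
                consecutive_days_off = 0
            else:
                consecutive_days_off += 1
                if convex_loss:
                    # the per-day loss grows with the length of the current break
                    achievement -= LOSS_PER_BREAK_DAY * (1 + 0.02 * consecutive_days_off)
                else:
                    achievement -= LOSS_PER_BREAK_DAY
        return achievement

    # Traditional calendar: 180 school days in one block, then a long summer.
    traditional = [True] * 180 + [False] * 185
    # Stylized year-round calendar: the same 180 days as four 45-day terms
    # separated by shorter breaks (not Wake County's actual schedule).
    year_round = ([True] * 45 + [False] * 46) * 4 + [False]

    for label, convex in [("linear loss", False), ("convex (accelerating) loss", True)]:
        t = year_end_achievement(traditional, convex)
        y = year_end_achievement(year_round, convex)
        print(f"{label}: traditional = {t:.1f}, year-round = {y:.1f}")

Under the linear assumption the two calendars end the year exactly tied; only the convex version separates them. That is the kind of non-linearity the paper's model requires, and the one its data do not find.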

Educators and education policy analysts have reasonably suggested that such a non-linear relationship might exist: sustained periods away from school might cause students to forget more, or frequently interrupted periods of school attendance might disrupt students' progress. But the empirical evidence appears to establish that there is not, in fact, such a non-linear relationship.

Taken together with the prior work showing that most of the achievement gap for low-income and minority students arises during summer vacation, this paper suggests that working-class and minority children are simply losing ground relative to their more affluent peers every day that they are not in school, and that the effect is most visible during the summer simply because that is when students are away from school for stretches long enough for the loss to be measurable.
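
A rough back-of-the-envelope calculation, sketched below, shows why only summer is long enough for the loss to show up. The 0.1 standard deviation figure is the upper bound quoted from the paper above; the roughly 75-day summer, the other break lengths, and the assumption that the loss accrues at a constant daily rate are my own.

    # If at most ~0.1 SD of loss accrues at a constant rate over a ~75-day
    # summer, the implied per-day rate is tiny, so only long breaks produce
    # an effect large enough to measure in test scores.
    SUMMER_LOSS_SD = 0.1      # upper bound on summer learning loss quoted above
    SUMMER_LENGTH_DAYS = 75   # assumed length of a traditional summer break
    per_day_loss = SUMMER_LOSS_SD / SUMMER_LENGTH_DAYS

    for label, days in [("long weekend", 3), ("one-week break", 7),
                        ("three-week track break", 21), ("full summer", 75)]:
        print(f"{label}: ~{per_day_loss * days:.3f} SD of relative loss")

On that reading, the shorter breaks of a year-round calendar do not avoid the loss; they just slice it into pieces too small to see in any single comparison of test scores.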
