Good News for New Orleans: Early evidence shows reforms lifting student achievement
What happened to the New Orleans public schools following the tragic levee breaches after Hurricane Katrina is truly unprecedented. Within the span of one year, all public-school employees were fired, the teacher contract expired and was not replaced, and most attendance zones were eliminated. The state took control of almost all public schools and began holding them to relatively strict standards of academic achievement. Over time, the state turned all the schools under its authority over to charter management organizations (CMOs) that, in turn, dramatically reshaped the teacher workforce.
A few states and districts nationally have experimented with one or two of these reforms; many states have increased the number of charter schools, for example. But no city had gone as far on any one of these dimensions or considered trying all of them at once. New Orleans essentially erased its traditional school district and started over. In the process, the city has provided the first direct test of an alternative to the system that has dominated American public education for more than a century.
Dozens of districts around the country are citing the New Orleans experience to justify their own reforms. The reforms have been hailed by Democratic president Barack Obama and Louisiana’s Republican governor, Bobby Jindal, and parliamentary delegations from at least two countries have visited the city to learn about its schools.
The unprecedented nature of the reforms and level of national and international attention by themselves make the New Orleans experience a worthy topic of analysis and debate. But also consider that the underlying principles are what many reformers have dreamed about for decades—that schools would be freed from most district and union contract rules and allowed to innovate. They would be held accountable not for compliance but for results.
There is clearly a lot of hype. The question is, are the reforms living up to it? Specifically, how did the reforms affect school practices and student learning? My colleagues and I at the Education Research Alliance for New Orleans (ERA-New Orleans) at Tulane University have carried out a series of studies to answer these and other questions. Our work is motivated by the sheer scale of the Katrina tragedy and the goal of supporting students, educators, and city leaders in their efforts to make the schools part of the city’s revitalization. The rest of the country wants to know how well the New Orleans school reforms have worked. But the residents of New Orleans deserve to know. Here’s what we can tell them so far.
Before the Storm
Assessing the effects of this policy experiment involves comparing the effectiveness of New Orleans schools before and after the reforms. As in most districts, before Hurricane Katrina, an elected board set New Orleans district policies and selected superintendents, who hired principals to run schools. Principals hired teachers, who worked under a union contract. Students were assigned to schools based mainly on attendance zones.
The New Orleans public school district was highly dysfunctional. In 2003, a private investigator found that the district system, which had about 8,000 employees, inappropriately provided checks to nearly 4,000 people and health insurance to 2,000 people. In 2004, the Federal Bureau of Investigation (FBI) issued indictments against 11 people for criminal offenses against the district related to financial mismanagement. Eight superintendents served between 1998 and 2005, lasting on average just 11 months.
This dysfunction, combined with the socioeconomic background of city residents—83 percent of students were eligible for free or reduced-price lunch—contributed to poor academic results. In the 2004‒05 school year, Orleans Parish public schools ranked 67th out of 68 Louisiana districts in math and reading test scores. The graduation rate was 56 percent, at least 10 percentage points below the state average.
As a result, some reforms were already under way when Katrina hit in August 2005. The state-run Recovery School District (RSD) had already been created to take over low-performing New Orleans schools. The state had appointed an emergency financial manager to handle the district’s finances. There were some signs of improvement in student outcomes just before the storm, but, as we will see, these were relatively modest compared with what came next.
A Massive Experiment
After Katrina, state leaders quickly moved almost all public schools under the umbrella of the RSD, leaving the higher-performing ones under the Orleans Parish School Board (OPSB). Gradually, the RSD turned schools over to charter operators, and the teacher workforce shifted toward alternatively prepared teachers from Teach for America and other programs. So new was the system that a new name was required—longtime education reformer Paul Hill called it the “portfolio” model.
Researchers often refer to such sudden changes as “natural experiments” and study them using a technique called “difference-in-differences.” The idea is to first take the difference between outcomes before and after the policy, in the place where it was implemented—the treatment group. This first difference is insufficient, however, because other factors may have affected the treatment group at the same time. This calls for making the same before-and-after comparison in a group that is identical, except for being unaffected by the treatment. Subtracting these two—taking the difference of the two differences between the treatment and comparison groups—yields a credible estimate of the policy effect.
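To make that logic concrete, here is a minimal sketch of a difference-in-differences calculation in Python. The data, column names, and numbers are hypothetical, invented only to illustrate the arithmetic; they are not the ERA-New Orleans data.

```python
import pandas as pd

# Hypothetical standardized test scores; not the actual New Orleans data.
df = pd.DataFrame({
    "group":  ["new_orleans"] * 4 + ["comparison"] * 4,
    "period": ["pre", "pre", "post", "post"] * 2,
    "score":  [-0.55, -0.45, -0.10, -0.06,   # New Orleans: before vs. after the reforms
               -0.35, -0.31, -0.33, -0.29],  # comparison districts: before vs. after
})

means = df.groupby(["group", "period"])["score"].mean()

# First difference: the before-to-after change within each group.
change_treated    = means["new_orleans"]["post"] - means["new_orleans"]["pre"]
change_comparison = means["comparison"]["post"] - means["comparison"]["pre"]

# Difference-in-differences: the treated group's change net of the comparison group's change.
did_estimate = change_treated - change_comparison
print(f"New Orleans change: {change_treated:+.2f} SD")
print(f"Comparison change:  {change_comparison:+.2f} SD")
print(f"DiD estimate:       {did_estimate:+.2f} SD")
```

In practice such estimates typically come from a regression with group and period indicators and their interaction, but the core subtraction is the same.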
We have carried out two difference-in-differences strategies:
1) Returnees only. We study only those students who returned to New Orleans after Hurricane Katrina. The advantage of this approach is that it compares the same students over time. One disadvantage is that it omits nonreturnees. Also, we can only study returnees over a short period of time—after 2009, they no longer have measurable outcomes to study.
2) Different cohorts. We consider the achievement growth of different cohorts of students before and after the reforms—for example, students in 3rd grade in 2005 and students in 3rd grade in 2012. The advantages here are that we can include both returnees and nonreturnees, and we can use this strategy to study longer-term effects. But the students are no longer the same.
In both strategies, the New Orleans data set includes all publicly funded schools in the city, including those governed by the district (OPSB), since all public schools were influenced by the reforms. The main comparison group includes other districts in Louisiana that were affected by Hurricane Katrina, and by Hurricane Rita, which came soon afterward. This helps account for at least some of the trauma and disruption caused by the storms, the quality of schools students attended in other regions while their local schools were closed, and any changes in the state tests and state education policies that affected both groups.
Effects on Average Achievement
Figure 1 shows the scores for each cohort, separately for New Orleans and the matched comparison group. The scores cover grades 3 through 8, are averaged across subjects, and are standardized so that zero refers to the statewide mean. The first thing to notice is that before the reforms, students in New Orleans performed far below the Louisiana average, at about the 30th percentile statewide. Students from the comparison districts also lagged behind the rest of the state, but by a lesser amount. The New Orleans students and the comparison group were moving in parallel before the reforms, however, suggesting that our matching process produced a comparison group that is more appropriate than the state as a whole.
The performance of New Orleans students shot upward after the reforms. In contrast, the comparison group largely continued its prior trajectory. Between 2005 and 2012, the performance gap between New Orleans and the comparison group closed and eventually reversed, indicating a positive effect of the reforms of about 0.4 standard deviations, enough to improve a typical student’s performance by 15 percentile points.
The estimates we obtain when we focus just on returnees are smaller and often not statistically significant, although the discrepancies are predictable: first, the returnees were probably more negatively affected by trauma and disruption; second, creating a new school system from scratch takes time, so we would expect any effects to be larger in later years; and third, the effects of the reforms seem more positive in early elementary grades, and the returnees were generally in middle school when they returned. Even so, the combination of analyses suggests effects of at least 0.2 standard deviations, or enough to improve a typical student’s performance by 8 percentile points.
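Translating effect sizes into percentile terms can be illustrated under a normality assumption. The sketch below is only an approximation of that kind of conversion; the real figures depend on the actual score distribution, and placing the "typical student" at the median is an assumption.

```python
from scipy.stats import norm

def percentile_gain(start_percentile: float, effect_sd: float) -> float:
    """Percentile-point gain from an effect of `effect_sd` standard deviations,
    assuming normally distributed scores (an illustrative simplification)."""
    z_start = norm.ppf(start_percentile / 100)             # z-score at the starting percentile
    new_percentile = norm.cdf(z_start + effect_sd) * 100   # percentile after the shift
    return new_percentile - start_percentile

# A student starting at the median (50th percentile):
print(f"0.4 SD -> {percentile_gain(50, 0.4):.1f} percentile points")  # roughly 15-16
print(f"0.2 SD -> {percentile_gain(50, 0.2):.1f} percentile points")  # roughly 8
```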
But there is still the possibility that what appear to be reform effects are actually the result of other factors.
Addressing Additional Concerns
The goal of any analysis like this is to rule out explanations for the changes in outcomes other than the reforms themselves. Our main comparisons deal with many potential problems, such as changes in state tests and policies. Here we consider in more depth four specific factors that could bias the estimated effects on achievement: population change, interim school effects, hurricane-related trauma and disruption, and test-based accountability distortions.
Population change. Hurricane Katrina forced almost everyone to leave the city. Some returned and some did not. The most heavily flooded neighborhoods were (not coincidentally) those where family incomes were lowest, and people in these neighborhoods returned at much lower rates than people who lived in other parts of the city. Given the strong correlation between poverty and student outcomes, this could mean that the higher test scores shown in Figure 1 are driven not by the reforms but by schools serving more-advantaged students.
Observers have pointed out that the share of the student population eligible for free or reduced-price lunch (FRL) actually increased slightly in New Orleans after the storm. But there are many reasons not to trust FRL data. For example, they reflect crude yes/no measures and are unlikely to capture extreme poverty of the sort common in New Orleans. Also, what really matters here is not whether poverty increased in New Orleans, but whether poverty increased more than in the comparison group. Therefore, we also gathered data from the U.S. Census on income and on the percentages of the population with various levels of education. We carried out the difference-in-differences analysis on these demographic measures to gauge how New Orleans changed relative to the matched comparison group of hurricane-affected districts, and then simulated the effect of those changes in family background characteristics on test scores using data from the federal Early Childhood Longitudinal Study.
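One way to read that simulation step: estimate how strongly family-background measures predict test scores in a reference survey, then multiply those relationships by the net (difference-in-differences) change in each measure. The sketch below illustrates only the logic; the variable names, coefficients, and demographic changes are hypothetical, not the actual Census or ECLS figures.

```python
# Hypothetical illustration of the population-change simulation; all numbers are invented.

# Relationships between family-background measures and test scores (in test-score SD per
# unit change), of the sort one might estimate from a reference survey such as the ECLS.
score_per_unit = {
    "log_family_income": 0.15,  # SD gain per one-unit rise in log income
    "share_parents_ba":  0.50,  # SD gain per 1.0 (100-point) rise in the share with a BA
}

# Net change in each measure in New Orleans relative to the matched comparison districts,
# i.e., the difference-in-differences in the Census measures (hypothetical values).
did_change = {
    "log_family_income": 0.10,
    "share_parents_ba":  0.05,
}

# Predicted shift in average test scores attributable to population change alone.
predicted_shift = sum(score_per_unit[k] * did_change[k] for k in score_per_unit)
print(f"Predicted effect of population change: {predicted_shift:.2f} SD")
```

Under these invented numbers the predicted shift is about 0.04 standard deviations, an order of magnitude smaller than the roughly 0.4 standard-deviation difference-in-differences estimate; the actual finding, reported next, falls in a similar range.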
We also examined pre-Katrina characteristics to see whether the returnees were different from nonreturnees and found that returnees did have slightly higher scores. In fact, we come to the same conclusion in both analyses: the expected increase in student outcomes after the hurricanes due to population change is no more than 0.02 to 0.06 standard deviations, or about 10 percent of the difference-in-differences estimates in Figure 1.