Last week, we wrote about an analysis of the newly released standardized test scores that offered some evidence of a widening gap in the Hartford area between historically high-performing school districts and those that have traditionally had below-average scores.
We’ve now expanded that analysis to all school districts in the state and found the same general trend: For most subject and grade groupings, school districts that exceeded the statewide average on a key measure of achievement in 2013 reported scores on the 2015 test that were even farther above the state average. Likewise, districts with below-average performance generally lost ground, falling farther below the statewide average.
In elementary school math, for example, among the 20 highest-performing districts two years ago, 18 had scores this year that were farther above the state average. Among the 20 lowest-performing districts two years ago, 16 dropped farther below the average.
Put another way, the above-average districts in 2013 outperformed the state by an average of 16 percent, while the below-average districts lagged the statewide figures by an average of 11 percent. By 2015, the higher-performing districts had widened their margin to 29 percent above the state average, while the lower-performing districts had dropped to 25 percent below it.
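The group comparison above can be sketched in a few lines of code. The numbers here are invented for illustration – each value stands for a district’s percent above (+) or below (-) the state average, not an actual Courant figure – and the district names are placeholders:

```python
# Invented example: each district's margin relative to the state
# average (percent above or below) in 2013 and 2015.
margins = {
    "District A": (44.0, 85.0),
    "District B": (10.0, 15.0),
    "District C": (-8.0, -20.0),
    "District D": (-14.0, -30.0),
}

# Split districts into those above and those below the state
# average in 2013, based on the first (2013) margin.
above = [m for m in margins.values() if m[0] > 0]
below = [m for m in margins.values() if m[0] <= 0]

def mean(vals):
    """Simple arithmetic mean of a list of margins."""
    return sum(vals) / len(vals)

print(mean([m[0] for m in above]))  # 27.0 — above-average group, 2013
print(mean([m[1] for m in above]))  # 50.0 — same group, wider margin in 2015
print(mean([m[0] for m in below]))  # -11.0 — below-average group, 2013
print(mean([m[1] for m in below]))  # -25.0 — same group, farther behind in 2015
```

With these made-up inputs, the above-average group’s mean margin widens while the below-average group’s mean margin drops – the same pattern the actual figures show.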
That’s what the numbers show. Divining what the numbers mean is a far harder task. It’s complicated by the switchover this year to a new test, developed by the Smarter Balanced Assessment Consortium and known as SBAC. If the Courant’s analysis does indeed reflect a wider performance gap among districts, it could still be the result of factors beyond any actual changes in academic ability. District-by-district differences in test preparation and familiarity with computers could affect student scores, and it’s possible the SBAC test is simply better or worse at identifying student achievement, and therefore the achievement gap.
Most years, it’s easy to establish whether test scores point to a widening gap between high- and low-performing districts. But with a radically different test, it would be invalid to simply compare performance on the SBAC test to performance on the CMT and CAPT tests that Connecticut students have taken in the past.
So the Courant sought to compare how districts performed against the state average on both the old tests and the new test. We looked at the percentage of students in each district who scored at or above goal on the 2013 CMT and CAPT tests, and the percentage deemed to meet or exceed the “target achievement level” on the 2015 SBAC test – and then compared those numbers to the state average for those years.
As noted in the previous post, we averaged scores in grades 3, 4 and 5 to report a single elementary school score, and averaged scores in grades 6, 7 and 8 to create a single middle school score. (At the high school level, the test was given only once – in 10th grade for the CAPT test and in 11th for the SBAC.) We also averaged reading and writing scores on the CMT and CAPT tests, to compare to the single English Language Arts/Literacy score on the SBAC test.
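The district-versus-state comparison described above amounts to a simple calculation: average the per-grade percent-at-goal figures into a single score, then express the district’s score as a percent above or below the state’s. A minimal sketch, using invented figures rather than actual CMT or SBAC results:

```python
# A minimal sketch of the comparison described above. All numbers
# are invented for illustration, not actual CMT/SBAC results.

def relative_to_state(district_scores, state_scores):
    """Average per-grade percent-at-goal figures into a single score,
    then express the district's score as a percent above (+) or
    below (-) the matching statewide average."""
    district_avg = sum(district_scores) / len(district_scores)
    state_avg = sum(state_scores) / len(state_scores)
    return (district_avg - state_avg) / state_avg * 100

# Hypothetical elementary-school math figures: percent of students
# at or above goal in grades 3, 4 and 5.
district_2013 = [80.0, 82.0, 84.0]   # invented district figures
state_2013 = [60.0, 62.0, 58.0]      # invented statewide figures

margin_2013 = relative_to_state(district_2013, state_2013)
print(round(margin_2013, 1))  # 36.7 — i.e., 36.7 percent above the state average
```

The same calculation applies to the 2015 SBAC figures; comparing the two margins shows whether a district gained or lost ground relative to the state.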
The charts below show the results of that analysis. Each blue or orange bar represents a school district, and the districts are arranged from left to right according to how far they outperformed or lagged the state average on the CMT and CAPT tests two years ago. Blue bars indicate districts that improved their relative standing in 2015 compared to the state as a whole; orange bars indicate districts that lost ground compared to the state average. The ends of the bars indicate the district’s performance relative to the statewide average each year: For districts that improved their relative standing, the 2015 figure is at the top of the bar; for districts that declined, the 2015 figure is at the bottom. Lastly, the wide horizontal bar represents the statewide average on each test.
In the first chart below, for example, the blue line at far left shows that for that district – New Canaan – the percentage of elementary school students who met goal on the math portion of the CMT in 2013 was 44 percent above the state average, while this year, the percentage of students meeting the target achievement level on the SBAC test was 85 percent above the state average. (There are too many school districts to display on the charts below. But click on any of the images to download a spreadsheet of that chart in Excel. Hovering over any of the bars in Excel will identify the school district.)
The charts show that most higher-performing districts – those on the left side of the charts – generally extended their margins against the state average, while lower-performing districts – those on the right side – generally fell even farther below the state average. There is one striking exception: high school English, where the scores indicate the exact opposite outcome. There, higher-performing districts generally lost ground compared to the state as a whole, while districts that performed below average two years ago generally saw significant gains compared to the statewide average. The cause of that anomaly is not clear.
It is worth noting that state officials conducted a different achievement-gap analysis and found no significant differences between 2013 and 2015. The state compared statewide performance across racial and ethnic lines and among designated high-needs students – English-language learners, students with cognitive disabilities and students eligible for free and reduced-price lunch. Gaps persist, they found, but at levels similar to those seen in 2013.
A deeper analysis will be possible in the months ahead, when the state releases more detailed school-level and district-level data. And the best comparison is likely a year away, when the state releases a fresh set of SBAC data from tests administered next spring, for comparison to this year’s long-awaited baseline.