Retail gas prices in the region have been cut in half since 2011, but a degree of pain persists for one class of drivers: those with high-end cars that require high-octane fuel.
If you luxury-car drivers feel like you’ve been paying an extra premium for premium fuel lately, you’re right. Data from the U.S. Energy Information Administration show that the price gap between regular unleaded and super unleaded has soared in the past year – the result, experts say, of tight competition for regular gas and costlier processing for the high-end stuff.
For most of the last decade, as the chart below shows, premium fuel was 7 or 8 percent more expensive than regular. But the gap began rising sharply in mid-2014. After a dip in early 2015, the gap skyrocketed, rising from a 12-percent surcharge last June to nearly 31 percent on Monday.
That translates to an extra 55 cents a gallon for premium unleaded today, more than twice the extra cost for most of the decade, even when fuel prices were far higher. And at some stations, the surcharge is far higher. At one West Hartford Shell station, premium users this week were paying an extra 90 cents a gallon – a 47-percent bump – for super unleaded. That’s an extra $25 every time a Lincoln Navigator owner fills up.
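The gap figures above are simple percentage arithmetic; here is a minimal sketch (the per-gallon prices are approximations implied by the numbers in this story, not official EIA data, and the function name is ours):

```python
def premium_surcharge(regular_price, premium_price):
    """Percent by which premium unleaded exceeds regular, per gallon."""
    return (premium_price - regular_price) / regular_price * 100

# Approximate recent prices implied above: regular ~$1.77/gal,
# premium ~$2.32/gal -- a 55-cent gap (assumed figures, for illustration).
gap = premium_surcharge(1.77, 2.32)
print(round(gap))  # roughly a 31 percent surcharge
```

The same formula applied to the West Hartford Shell example (a 90-cent gap on roughly $1.92 regular) lands near the 47 percent bump cited above.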
Why the big gap? Part of the explanation is the hyper-competitive market for regular gas, which makes up the vast majority of a typical service station’s sales. That’s the price most prominently featured on the giant lighted signs at gas stations, and it generates the thinnest profit margins as stations compete for price-conscious consumers. But drivers of pricier cars with more demanding engines tend to be less price-conscious and make for welcome targets as gas station owners look to make up for thin margins on regular gas.
“Stations find competitive benefits in dropping regular prices as a way to attract customers,” AAA reported in response to questions about gas prices. “The reality is that many premium customers will continue to shop at the same station regardless as to whether the price is dropping as fast as regular.”
At the same time, even with the hefty surcharge, some regular-unleaded users are splurging, since even the inflated prices for high-octane gas are far lower than the prices charged for regular gas just months ago. The current average price for premium gas in New England – about $2.32 a gallon – is 45 cents a gallon cheaper than customers were paying for regular gas last June. As regular-gas users treat themselves to high-test, that changes the supply/demand equation.
And some experts also say that boosting the octane of the raw fuel delivered to stations has become more expensive, particularly as the formulation of North American shale oil – the fuel currently in abundant supply – is better suited to regular unleaded than super. “As a result,” according to AAA, “supplies of regular gasoline have remained more plentiful than premium.” And that, too, changes the supply/demand equation.
But there may be relief on the way. AAA reports that historically, the price gap between regular and premium gas is at its highest in the winter. “In the summer,” the association reports, “the spread between regular and premium is at its closest.”
And summer is a mere 111 days away.
For years, it has been the policy of the state Department of Education to give school superintendents an advance peek at their district’s standardized test scores, a week or so before the high-stakes numbers are released to the public. And for roughly as long, transparency advocates have groused that such a policy seemed to put the public-relations concerns of insiders above the rights of the public to have prompt access to the government records they own.
But it took an enterprising reporter from the Journal Inquirer of Manchester to actually do something about it.
Last Aug. 19, a deputy commissioner at the education department sent an email to every superintendent in the state, announcing that aggregate district-wide data for the Smarter Balanced Assessment Consortium test would be made available to them later that day, but that the data was not to be shared with the press or discussed at a board of education meeting.
“As a courtesy, the embargoed results are made available for districts before results are made public,” the email stated. And breaking that embargo, the email warned, “jeopardizes your district’s access to future embargoed releases.”
Michael Savino, the J-I’s Capitol reporter, caught wind of the Aug. 19 email and immediately requested a copy of the district-by-district data the Department of Education had distributed to districts. The department declined, explaining that the data would be released to the public only after a “final quality control check” was completed.
“We set a high bar regarding accuracy of information that we generate and the public should expect nothing less from us,” a department official wrote in rejecting the request.
Nine days later, the data that had been released to the superintendents was released to the public. And two months after that, the press and the state squared off before a hearing officer with the Freedom of Information Commission to determine whether that nine-day delay was legal.
The department argued that an advance copy of the test data – including student-by-student results – is sent to local school districts so superintendents can notify the department of any anomalies that might indicate errors. The department’s chief performance officer testified that the participation of superintendents is an important part of the department’s verification process.
“In essence,” wrote Attorney Kathleen K. Ross, the hearing officer, “the [department officials] argued that the record containing the aggregate district-wide test results was a ‘preliminary draft’ at the time it was requested because such results had not been verified by the superintendents and the department.”
But Ross wasn’t buying it.
“The Commission does not credit the testimony of the [chief performance officer] regarding the superintendents’ role in verifying the test scores, in light of the conflicting evidence that the aggregate district-wide test results were sent to the superintendents ‘as a courtesy’,” Ross wrote in a proposed decision released last week. Moreover, she said, there was no evidence that the department had asked superintendents to verify the data and report any problems, and no evidence that any superintendents had in fact provided feedback on the results.
Beyond that challenge to the department’s testimony, Ross wrote that the data did not meet the legal definition of a preliminary draft, and that the department violated the Freedom of Information Act when it delayed releasing the data to Savino.
After Ross’s proposed decision was released, an education department spokeswoman told the Journal Inquirer: “We are strong proponents of transparency, but we are also bound by our duty to ensure the information we present to the public is accurate.”
If the full Freedom of Information Commission adopts Ross’s findings at its Feb. 24 meeting, the department will know which of those two interests is paramount.
The state Department of Education took extraordinary measures last week to prevent even the slightest risk that you might find out how qualified your kid’s teacher is.
If that sounds ridiculous, it’s not entirely the department’s fault. A three-decade-old state law exempts from disclosure all records of teacher performance and evaluation (because why would taxpayers care about the quality of the teachers they employ?). So when the state submitted aggregate school-by-school data on teacher ratings as part of a court case last week, officials suppressed more than half the data points to make sure there was no possible way anyone could link a particular rating to a particular teacher.
The data show the number of teachers who fell into each of four categories in a new and controversial assessment of the state’s teaching force. The department did release statewide figures, showing that about a third of teachers were rated as “exemplary” by their school districts, a little under two-thirds were deemed “proficient,” and fewer than 2 percent were labeled “developing” or “below standard.”
But when reporting the data at the school level, the state withheld at least some of that information for nearly 7 out of 10 schools. For just under half the state’s schools, all of the numbers were suppressed. And all in service of making sure no one outside of the school system knows which teachers are extraordinary and which are struggling.
The department used an exceptionally strict suppression strategy, hiding the numbers if a particular category had been assigned to between one and five teachers at a school. The thinking apparently was that if four teachers at a school were rated exemplary, and they all knew each other’s rating, then they would know that no other teacher at the school had received that rating. To keep that category’s number secret, they also blacked out the number for the next-highest category. And by the time they were done, most of the numbers had been suppressed.
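One plausible reading of that suppression rule can be sketched in code. This is an illustration of the logic described above, not the department's actual procedure; the function name and data shape are our assumptions:

```python
def suppress(counts):
    """counts: dict mapping rating category -> number of teachers at a
    school. Returns a copy with suppressed counts replaced by None."""
    out = dict(counts)
    # Hide any category assigned to between one and five teachers.
    small = {k for k, v in counts.items() if 1 <= v <= 5}
    for k in small:
        out[k] = None
    if small:
        # Complementary suppression: also hide the smallest remaining
        # nonzero count, so a hidden cell can't be recovered by
        # subtracting the visible cells from the school's total.
        visible = [k for k, v in out.items() if v is not None and v > 0]
        if visible:
            out[min(visible, key=counts.get)] = None
    return out
```

Run against a typical small school, nearly every cell goes dark, which helps explain why most of the school-level numbers ended up suppressed.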
There are, for example, 66 teachers statewide rated “below standard.” But what schools they’re assigned to is a mystery; not one of the schools with a below-standard teacher is identified in the data.
There is plenty of reason to question the validity of the rankings, as reported by my colleague, Kathy Megan. But this is the teacher-performance scheme the state has imposed on its teachers, and taken $13.5 million from its taxpayers to implement. Does the public have no interest in seeing what the ratings revealed?
As The Scoop has written before, the secrecy surrounding teacher performance dates to 1984, when the legislature was hoodwinked into shutting off access to evaluations of public school teachers, purportedly to thwart teacher-shopping by parents. The law, however, applied to every certified school official below the superintendent, and before long, was extended to professors throughout UConn and the state university system as well.
While the statute was initially seen as a way to keep written performance reviews secret, educational leaders are afraid that releasing even anonymous data will put them on the wrong side of the law. That’s what’s behind a Freedom of Information Commission case in which a New Milford Board of Education member was turned down when he asked for the breakdown of teacher ratings for his district. The local board of ed is wary of releasing the data without direction from the FOI Commission. The teachers union, meanwhile, has intervened to try to keep the numbers from seeing the light of day.
While the Commission’s eventual ruling may resolve the issue for other school boards, the New Milford numbers came out in the court exhibit submitted by the state. Here’s what the union was trying to keep secret: Of the 346 New Milford teachers evaluated under the new system, 72 percent were deemed proficient and 28 percent were deemed exemplary. Not a single one was rated developing or below standard.
Does that sound like the sort of information that should be kept out of the hands of the public by the force of law?
A year ago, The Scoop published “A Transparency Advocate’s Legislative Wish List,” with eight suggestions for improving the public’s access to government records. In the 2015 session, legislators resolved one of the issues, fixing a confusing statute covering what arrest information police agencies must release. The other seven options are still on the table, including addressing that 1984 law that exempts teachers from the same level of accountability to which every other public employee is held. This year’s short session will be dominated by the budget. But there’s no reason our lawmakers can’t multitask.
So, legislators: Anyone care to step up in favor of government transparency?
For all the data analysis conducted by The Courant, no other topic comes close to generating the reader reaction we receive when looking at apparent racial disparities in policing.
That was evident once again this week with the release of fresh police-stop data showing – as previous releases have – that statewide, black and Hispanic motorists are stopped and ticketed at higher rates than white drivers.
That led to a flurry of comments on The Courant’s website – more than 100 at last count – most of which took exception to any implication that police officers might be treating minority drivers more harshly. Some were simply self-disproving diatribes – posts that used overtly racist slurs in arguing that racism was non-existent. But others took aim at the statistical methodology applied, some raising legitimate points, others misinterpreting or making incorrect assumptions about the analysis applied.
So, as we have in the past, here’s a primer on the data collected by the Connecticut Racial Profiling Prohibition Project and on the Courant’s analysis of post-stop behavior, along with responses to the most frequent issues that are raised whenever we dig into this data.
Hartford Mayor Pedro Segarra often insisted that race was not an issue in his unsuccessful battle with Luke Bronin for the Democratic mayoral nod. But balloting in Wednesday’s primary election suggests otherwise, with voting patterns highlighting the city’s division along racial and ethnic lines.
As the election map below illustrates, Bronin, represented by the green shading, was strong in precincts in the northern half of the city, while Segarra, represented by orange, did well in the south. The maps farther below, drawn from Census data, show that the city has a similar division racially and ethnically, with most black residents concentrated in the North End, and most Latinos in the South. (Click shaded areas in the election map for vote information and the Census maps for demographic details.)
While the final vote spread was 55 to 45 percent, voting in individual precincts was far more lopsided. Where the candidates won, they won big. Out of 24 precincts in the city, Bronin won seven with more than 70 percent of the vote. Segarra topped 60 percent in five precincts.
So Bronin captured the nomination by winning – and winning decisively – in predominantly black precincts, overcoming Segarra’s generally strong support in precincts with large numbers of Hispanics. Bronin also did well in the West End districts that are home to large concentrations of white residents.
On the campaign trail, Bronin frequently promised an administration that would work “for all of Hartford’s residents.” Wednesday’s vote could be an indication of how difficult it may be to unify all of the city’s constituencies.
Population concentrations in Hartford for Hispanics (above left), blacks (above right), and whites (below).
Last week, we wrote about an analysis of the newly released standardized test scores that offered some evidence of a widening gap in the Hartford area between historically high-performing school districts and those that have traditionally had below-average scores.
We’ve now expanded that analysis to all school districts in the state, and have found the same general trend: For most subject and grade groupings, school districts that exceeded the statewide average on a key measure of achievement in 2013 reported scores on the 2015 test that were even farther above the state average. Likewise, districts with below-average performance generally lost ground, falling farther below the statewide average.
In elementary school math, for example, among the 20 highest-performing districts two years ago, 18 had scores this year that were farther above the state average. Among the 20 lowest-performing districts two years ago, 16 dropped farther below the average.
Put another way, the above-average districts in 2013 outperformed the state by an average of 16 percent, while the below-average districts lagged the statewide figures by an average of 11 percent. But in 2015, the higher performing schools had widened their margin to 29 percent above the state average, while the lower-performing schools had dropped to 25 percent below the state average.
That’s what the numbers show. Divining what the numbers mean is a far harder task.
Local school officials got something of a break this year from the ritualized hand-wringing that typically accompanies the release of the state’s standardized-test results. With the first statewide administration of the Smarter Balanced Assessment Consortium test, 2015 is a benchmark year, setting a baseline of scores that officials will be hand-wringing over in 12 months.
But while state officials correctly warn against comparing achievement scores on the SBAC test to the prior CMT and CAPT tests, there is a way to get some insight into how schools and districts are progressing. The Courant conducted that analysis for 19 towns in the Hartford region, and the results offer some evidence of a widening gap between the highest- and lowest-performing districts in the state.
For the analysis, the Courant examined the percentage of students scoring at high achievement levels on the SBAC test (those who met or exceeded the target achievement level) and calculated how each district performed compared to the state average. We did the same thing for the CMT and CAPT tests in 2013 – the last year the tests were widely administered (looking at students deemed “at goal”). Then we looked at whether districts had improved their position relative to the state average, or whether they lost ground compared to the state as a whole.
For simplicity’s sake, we grouped elementary grades and middle-school grades, and we combined data for the reading and writing portions of the CMT and CAPT tests, as the SBAC test includes a single English-language test.
As shown in the graphs below, for nearly all subject/grade levels we analyzed – high school English was the only exception – there is a clear trend in which towns that outperformed the state as a whole in 2013 generally extended their margins over the state average on the SBAC test, and towns with achievement levels below the state average two years ago fell farther below the average this year.
To read the charts: Blue arrows indicate an improvement in a district’s position relative to the state as a whole, and orange arrows indicate a decline. Data points above the zero line indicate a performance exceeding the state average, and points below the zero line indicate performance lagging the state average.
So the first arrow below shows that 3rd, 4th and 5th graders in Avon had an at-goal percentage in math in 2013 that was 30 percent above the state average. That’s the bottom of the arrow. On the newly released SBAC test, the percentage of elementary students meeting or exceeding the target achievement level was 73 percent higher than the state average. That’s the top of the arrow.
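The comparison behind each arrow is a simple relative margin. As a sketch (the at-goal rates below are hypothetical, chosen only to reproduce the Avon figures cited above; the function name is ours, not the Courant's actual code):

```python
def margin_vs_state(district_rate, state_rate):
    """District's high-achievement rate expressed as a percent
    above (+) or below (-) the state average."""
    return (district_rate - state_rate) / state_rate * 100

# Hypothetical rates matching the Avon elementary-math example:
# 30 percent above the state in 2013, 73 percent above in 2015.
m_2013 = margin_vs_state(78.0, 60.0)   # 30.0
m_2015 = margin_vs_state(43.25, 25.0)  # 73.0
widened = m_2015 > m_2013              # True: the margin grew
```

Because each year's rate is measured against that year's own state average, the comparison sidesteps the warning about matching raw SBAC scores to raw CMT/CAPT scores.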
For elementary school math, the chart shows that of the 11 towns that performed above the state average in 2013, all but one extended their margins. Similarly, of the eight towns that lagged the state average two years ago, all but one lost more ground compared to the state average. Most of the other charts show a similar trend.
We’ll extend our analysis beyond these Hartford-area towns and see if the trend holds statewide. To explore SBAC scores for your town, see the chart and visualization created by my colleague Stephen Busemeyer.
Racial and ethnic disparities in policing have long been an uneasy topic in Connecticut and across the country. And that was reflected in reaction to a Sunday story in the Courant reporting that black and Hispanic motorists pulled over for traffic violations were more likely to receive a ticket than were white motorists pulled over for the same offense.
Many commenters and email writers were quick to challenge the findings, advancing a slew of reasons why the data or the analysis was flawed, and confidently asserting that there was a legitimate reason for any disparities in policing. Some raised legitimate questions. Others misunderstood the analysis.
The Courant performed a similar analysis in 2012 – and received a similarly visceral reaction from many readers. So as we did three years ago, here’s an elucidation on a few of the topics raised by readers.
The most common misconception was that the reported disparities simply indicate that black and Hispanic drivers violate traffic laws at higher rates than white motorists. “Could minority drivers commit more motor vehicle violations than non-minority drivers?” one poster asked. “No, this can’t be true. that would be racist.”