Illinois Sucks at Measuring Temperature

Or at least, that's what BEST says. Here are three figures showing temperature trends at Illinois stations according to BEST: the first shows trends in the raw data, the second shows trends in the quality-controlled data, and the third shows trends after BEST's adjustments:

[Figure: BEST_Illinois_Station_Trends]

The average values in these are 0.59, 0.63 and 1.49°C/Century. That is, BEST's Breakpoint Adjusted Station Data shows trends an average of nearly a degree higher in Illinois than its raw data shows. Illinois must really suck at measuring temperatures!
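The arithmetic behind those averages is just a per-category mean over the reported station trends. A minimal sketch, using toy numbers rather than the actual station values:

```python
def mean_trend(trends):
    """Arithmetic mean of a list of reported trends (in °C/century)."""
    return sum(trends) / len(trends)

# Toy values standing in for the 244 real station trends:
raw = [0.2, 0.6, 1.0]
adjusted = [1.2, 1.5, 1.8]

raw_mean = mean_trend(raw)
adj_mean = mean_trend(adjusted)

# The figure quoted in the text is the difference between these two means:
net_adjustment = adj_mean - raw_mean
```

With the real data, `raw_mean` would be 0.59 and `adj_mean` 1.49, giving a net adjustment of about 0.9°C/century.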

I'm sure that's not actually what this means, but what does it mean? To gain some insight, we can check BEST's description of this data:

During the Berkeley Earth averaging process we compare each station to other stations in its local neighborhood, which allows us to identify discontinuities and other heterogeneities in the time series from individual weather stations. The averaging process is then designed to automatically compensate for various biases that appear to be present. After the average field is constructed, it is possible to create a set of estimated bias corrections that suggest what the weather station might have reported had apparent biasing events not occurred.

According to BEST, the data is adjusted to show what records would have been like had there been no "apparent biasing events." I'm at a loss as to what that could mean here. What "apparent biasing events" happened that introduced nearly a degree of cooling into Illinois' average temperature trend? What "apparent biasing events" happened that introduced artificial cooling trends in 166 of 244 temperature records?

I have no idea. What I do know is if Illinois really is such a hot spot for artificially induced cooling trends, that's major news. BEST should publish a paper discussing how terribly inaccurate Illinois data is. They should investigate whether Illinois's neighbors are as bad. Who knows? Maybe all of the Midwest has terrible temperature stations whose records need to be severely adjusted.

While they're at it, they probably ought to update their Station Quality paper. That paper concludes station quality isn't important for temperature trends, but that seems hard to believe if "apparent biasing events" are adding a degree to states' century-scale temperature trends.

The only way I can see that would be true is if "apparent biasing events" in other regions introduce lots of artificial warming trends. That'd be a heck of a coincidence.

Now to explain my methodology. My choice of state to examine wasn't random. I picked Illinois because I live in Illinois. I was curious what BEST's page on Illinois results had to show. That's when I saw I could view a list of stations within Illinois. I did, and I looked at some records of stations whose locations I recognized. That's when I noticed an apparent pattern in the reported trends.

Intrigued, I started working my way through the list. I started at the top and worked my way through ten stations before deciding this wasn't a coincidence. To investigate, I collected the reported trends for all 334 stations listed as inside Illinois (the list also includes stations that are only near Illinois). I filtered out 90 of these stations as they did not have reported trends for all three categories. That left me with the 244 pictured above.
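The filtering step described above is straightforward: drop any station missing a reported trend in any of the three categories. A minimal sketch, with made-up station entries (the field names here are my own, not BEST's):

```python
# Each station record holds the three reported trends (°C/century), or None
# where BEST's page reported no trend for that category.
stations = [
    {"name": "Station A", "raw": 0.5, "qc": 0.6, "adj": 1.4},
    {"name": "Station B", "raw": 0.7, "qc": None, "adj": 1.2},  # dropped: no QC trend
    {"name": "Station C", "raw": None, "qc": 0.4, "adj": 0.9},  # dropped: no raw trend
]

# Keep only stations with trends reported in all three categories.
complete = [s for s in stations
            if all(s[k] is not None for k in ("raw", "qc", "adj"))]
```

Applied to the real list, this is the step that reduced 334 stations to the 244 used in the figures.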

Now then, the station records are not all of the same length. They don't even all start or end at the same point. These differences mean averaging them as I did is not rigorous. It'd be worth investigating whether there are any patterns as to which stations received what sorts of adjustment. That'd take quite a bit of time, though, so for the moment I've done some rudimentary checking to ensure what I've found isn't an artifact of shorter records or the like. There may be valuable information I've missed, but this pattern is definitely present in the BEST Illinois temperature station records.

I think I'll try to look at the full data sets next. My hope is to find out which regions were adjusted in what ways. With that in hand, we can try to figure out if those adjustments make sense. I don't know if it'll happen though. It'd be quite a bit of work.

In the meantime, I'd like to highlight an oddity in these reported trends. One station I examined reported a trend in the raw data of 1.54°C/Century. After quality control, that dropped to 0.05°C/Century. Only two points of data were removed, and neither was particularly extreme, yet somehow the reported trend changed by 1.49°C/Century. It's weird.
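One way to sanity-check that oddity is to compute an ordinary least-squares trend before and after removing a couple of data points, since that's presumably what changes between the raw and quality-controlled series. This is a sketch with invented numbers, not the actual station record, and I'm assuming a simple OLS fit, which may not match BEST's trend calculation:

```python
def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Invented annual anomalies for illustration:
years = list(range(1900, 1910))
temps = [10.0, 10.1, 9.9, 10.2, 10.0, 10.3, 10.1, 10.4, 10.2, 10.5]

full_trend = ols_slope(years, temps)

# Drop two points (indices chosen arbitrarily here) and recompute:
keep = [i for i in range(len(years)) if i not in (2, 7)]
reduced_trend = ols_slope([years[i] for i in keep],
                          [temps[i] for i in keep])
```

For a well-behaved record, removing two unexceptional points should barely move the slope, which is why a 1.49°C/century swing from two removed points is so strange.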

One comment

  1. A couple additional comments. First, I used a variety of Python and R code while doing this. I'm happy to answer any questions about the details of it, but right now, I can't provide any sort of turnkey code. What I can provide is the HTML files I got while collecting data from the BEST website as well as a copy of the data tables I extracted from them.
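For anyone wanting to replicate the extraction step, pulling trend values out of saved HTML pages can be done with a simple pattern match. The markup below is a stand-in I invented; BEST's actual pages are structured differently, so treat this purely as a sketch of the approach:

```python
import re

# Stand-in HTML resembling a saved station-list page (not BEST's real markup):
html = """
<table>
  <tr><td>Station A</td><td>0.59</td></tr>
  <tr><td>Station B</td><td>1.49</td></tr>
</table>
"""

# Pull (station name, trend) pairs out of each table row.
rows = re.findall(r"<tr><td>(.*?)</td><td>([-\d.]+)</td></tr>", html)
trends = {name: float(val) for name, val in rows}
```

A real scrape would loop this over each saved file and write the pairs out to a data table for the averaging step.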

    Second, I suspect the description I quoted of the Breakpoint Adjusted Station Data may be flawed. Here is a line I left out of my quote:

    This breakpoint-adjusted data set provides a collection of adjusted, homogeneous station data that is recommended for users who want to avoid heterogeneities in station temperature data.

Notice the phrases "homogeneous station data" and "avoid heterogeneities." There's no description of what scale this is in reference to. We'd expect heterogeneity at some scales due to the nature of weather and climate. That means there is no particular reason "apparent biasing events" must explain all heterogeneity. It is possible BEST has made the data overly homogeneous by "smoothing" too much. If so, the Breakpoint Adjusted Station Data is terrible for understanding the temperature of an area, but the overall data set may still be good.

    Given that, it's interesting to note Missouri and Indiana have similar trends to Illinois for the last century, all above the national average.
