Anthony Watts has just pre-released a game-changing paper that he co-authored, and it can be found here. This paper has the potential to rock the Anthropogenic Global Warming (AGW) theory and the Al Gore-approved hockey stick graph to their foundations, and to prove that Team AGW is guilty of statistical chicanery. But before I get to that, I need to review some basics for those who don’t follow Climate Science news closely.
Measuring Land Temperatures in the US
Meteorological monitoring stations are installed at over 1,200 locations in the 48 contiguous United States (CONUS), where they monitor various weather variables (temperature, humidity, etc.). These weather stations make up the United States Historical Climatology Network (USHCN), and below is a short description of the data set.
“The United States Historical Climatology Network (USHCN) is a high-quality data set of daily and monthly records of basic meteorological variables from 1218 observing stations across the 48 contiguous United States. Daily data include observations of maximum and minimum temperature, precipitation amount, snowfall amount, and snow depth; monthly data consist of monthly-averaged maximum, minimum, and mean temperature and total monthly precipitation.”
AGW skeptics have long contended that many of these weather stations, located in urban areas and some even at airports, suffer from what is called the Urban Heat Island (UHI) effect. Anybody who lives in or near a large metropolitan area knows that temperatures in the city are usually higher than in the surrounding rural areas, because concrete, asphalt and other manmade, heat-trapping structures raise the measured land temperatures.
In 2011, Professor Richard Muller led the Berkeley Earth Surface Temperature Project (BEST), which sought to analyze the USHCN data and determine whether UHI altered the results. The project separated the weather stations into different classifications based on whether they were sited well (in rural areas) or sited poorly (in urban areas where UHI could affect the measurements). To the dismay of AGW skeptics, myself included, the BEST report stated that UHI didn’t have an appreciable effect on land temperatures across the US. You can find the results in this link, which states:
“The BEST analyses closely match existing land temperature records produced by NASA, NOAA, and the United Kingdom’s Hadley Centre, despite using differing techniques for station data analysis. They present additional evidence indicating that neither urbanization nor poor station siting has much of an influence on temperature records.”
So maybe this UHI effect was just a crazy theory of AGW skeptics and we should all just move on.
But the story doesn’t end there…
The BEST report accounted for the effects of UHI using station classification methods developed in a 1999 paper by Michel Leroy, but in 2010 Leroy authored another paper that modified this method. The importance of the new method is explained in the Watts 2012 paper summary on the WUWT website:
“Watts et al 2012 has employed a new methodology for station siting, pioneered by Michel Leroy of METEO France in 2010, in the paper Leroy 2010, and endorsed by the World Meteorological Organization (WMO) Commission for Instruments and Methods of Observation (CIMO-XV, 2010) Fifteenth session, in September 2010 as a WMO-ISO standard, making it suitable for reevaluating previous studies on the issue of station siting.”
“Previous papers all used a distance only rating system from Leroy 1999, to gauge the impact of heat sinks and sources near thermometers. Leroy 2010 shows that method to be effective for siting new stations, such as was done by NCDC adopting Leroy 1999 methods with their Climate Reference Network (CRN) in 2002 but ineffective at retroactive siting evaluation.”
“Leroy 2010 adds one simple but effective physical metric; surface area of the heat sinks/sources within the thermometer viewshed to quantify the total heat dissipation effect.”
“Using the new Leroy 2010 classification system on the older siting metadata used by Fall et al. (2011), Menne et al. (2010), and Muller et al. (2012), yields dramatically different results.”
Summarizing the above quote: three prior studies investigated the UHI impacts on the USHCN – Fall et al. (2011), Menne et al. (2010) and Muller et al. (2012). While all three found minimal to no impact on the temperature records, all three used the older method (Leroy 1999) to rate the temperature stations likely to suffer from UHI, instead of the more accurate rating method (Leroy 2010). Stations that would be classified as ‘good’ under the older Leroy 1999 method can be classified as ‘poor’ under the peer-reviewed Leroy 2010 method, and the new paper by Watts uses this newer rating method, which accounts for the surface area of heat sinks/sources.
Leroy 2010 breaks weather stations down into classes numbered 1 through 5, with Class 1 being an ideal site that is not impacted by external objects or structures (e.g., the sensor must be more than 100 meters from a parking lot, among other criteria) and Class 5 being the worst possible site for a weather station, where the temperature data can be skewed by as much as 5°C.
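To make the difference between the two rating approaches concrete, here is a minimal Python sketch. It is an illustration only: the distance thresholds, the area cut-offs and the example station are hypothetical placeholders I chose for this post, not the actual Leroy 2010 / WMO criteria, which are far more detailed.

```python
from dataclasses import dataclass

@dataclass
class Station:
    name: str
    # distance (m) from the sensor to the nearest artificial heat source/sink
    dist_to_heat_source_m: float
    # fraction of the sensor's viewshed covered by heat sinks/sources (0.0 - 1.0)
    heat_sink_area_fraction: float

def rate_distance_only(st: Station) -> int:
    """Leroy-1999-style rating: distance to the nearest heat source only.
    The thresholds below are illustrative, not the published criteria."""
    d = st.dist_to_heat_source_m
    if d >= 100: return 1
    if d >= 30:  return 2
    if d >= 10:  return 3
    if d >= 5:   return 4
    return 5

def rate_with_surface_area(st: Station) -> int:
    """Leroy-2010-style rating: start from the distance rating, then degrade
    the class when heat sinks/sources cover a large share of the viewshed.
    Again, the cut-offs here are hypothetical placeholders."""
    rating = rate_distance_only(st)
    if st.heat_sink_area_fraction > 0.10:
        rating = max(rating, 4)   # substantial heat-sink area -> at best Class 4
    elif st.heat_sink_area_fraction > 0.05:
        rating = max(rating, 3)
    return rating

# A station far from any single heat source can still rate poorly once the
# total heat-sink area around it is taken into account.
example = Station("hypothetical site", dist_to_heat_source_m=120,
                  heat_sink_area_fraction=0.12)
print(rate_distance_only(example))      # 1 under the distance-only scheme
print(rate_with_surface_area(example))  # 4 once surface area is considered
```

The point of the second function is simply that a station which looks fine on a distance-only test can still be rated poorly once the total surface area of nearby heat sinks and sources is considered – which is exactly the shift from Leroy 1999 to Leroy 2010 described above.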
The Results of the Watts et al. 2012 Paper
I don’t want to understate the fact that the results of the Watts 2012 paper are huge!
When Watts and his team went back through the data and used the Leroy 2010 rating method to account for UHI at the weather stations, here is what they found (emphasis mine):
“A reanalysis of U.S. surface station temperatures has been performed using the recently WMO-approved Siting Classification System devised by METEO-France’s Michel Leroy. The new siting classification more accurately characterizes the quality of the location in terms of monitoring long-term spatially representative surface temperature trends. The new analysis demonstrates that reported 1979-2008 U.S. temperature trends are spuriously doubled, with 92% of that over-estimation resulting from erroneous NOAA adjustments of well-sited stations upward. The paper is the first to use the updated siting system which addresses USHCN siting issues and data adjustments.”
“The new improved assessment, for the years 1979 to 2008, yields a trend of +0.155°C per decade from the high quality sites, a +0.248° C per decade trend for poorly sited locations, and a trend of +0.309° C per decade after NOAA adjusts the data. This issue of station siting quality is expected to be an issue with respect to the monitoring of land surface temperature throughout the Global Historical Climate Network and in the BEST network.”
Watts 2012 found that, over the years 1979 to 2008, Class 1 and 2 stations have lower warming trends than Class 3, 4 and 5 stations (+0.155°C versus +0.248°C per decade, respectively), which makes sense even to those individuals who don’t have degrees in Climate Science. A thermometer sitting in the middle of a parking lot will measure a higher temperature than a thermometer a mile away in a grass field. From the graph below, you can see that the vast majority of monitoring stations are characterized as Class 3, 4 or 5, and that fact alone would skew our land temperatures to the high side.
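For readers wondering what a ‘°C per decade’ trend means in practice, the sketch below fits an ordinary least-squares line to an annual temperature series and expresses the slope per decade. The series is synthetic, generated with a built-in trend purely for illustration; I am not claiming this is the exact procedure used in Watts 2012, only showing the standard way such a number is produced.

```python
import numpy as np

def trend_per_decade(years, temps):
    """Fit a straight line (ordinary least squares) to temperature vs. time
    and return the slope expressed in degrees C per decade."""
    slope_per_year, _intercept = np.polyfit(years, temps, deg=1)
    return slope_per_year * 10.0

# Synthetic example: a 1979-2008 annual series with a built-in 0.155 C/decade
# trend plus noise, mimicking the well-sited (Class 1/2) trend quoted above.
rng = np.random.default_rng(0)
years = np.arange(1979, 2009)
temps = 0.0155 * (years - 1979) + rng.normal(0.0, 0.1, years.size)

print(f"{trend_per_decade(years, temps):+.3f} C per decade")
```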
By the way, if you are wondering what a poorly sited weather station looks like, the Watts 2012 paper has a picture of one.
So important point #1 is this: Team AGW stacks the deck by relying on a high proportion of weather stations that skew the temperature data to the high side. 80% of the weather stations are classified as ‘poorly sited’ by Leroy 2010 and therefore read higher than a well-sited instrument would (sometimes by as much as 5°C).
But this is only half of Team AGW’s errors and the other half is the real bombshell.
The National Oceanic and Atmospheric Administration (NOAA) adjusted the database to account for poorly sited weather instruments, and you’d think they would have adjusted the temperature measurements DOWN to offset the UHI. But no, NOAA adjusted the temperatures UP!
Yes, you read that right. NOAA adjustments to the data (over that same 1979–2008 period) produced a warming trend of 0.309°C per decade, roughly twice the warming shown by the well-sited monitoring stations (0.155°C per decade). Below are a couple of graphs from the Watts 2012 paper that show this unbelievable breach of scientific integrity. The graphs show the average temperature increase per decade by region, with the light blue bar representing data from weather stations rated Class 1 or 2, the yellow bar representing data from weather stations rated Class 3, 4 or 5, and the red bar representing the value NOAA assigned to those weather stations to ‘adjust’ for poor siting. You will see that in every case except region 9-W, NOAA raised the overall trend ABOVE the average temperature increase for the worst-sited weather stations (Class 3, 4 and 5). It makes no sense! Unless, of course, you are trying to willingly deceive…
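As a quick sanity check on the ‘spuriously doubled’ claim, the snippet below uses nothing but the three headline per-decade trends quoted above:

```python
# Trends (C per decade) for 1979-2008 as quoted from the Watts 2012 summary.
well_sited    = 0.155  # Class 1 and 2 stations
poorly_sited  = 0.248  # Class 3, 4 and 5 stations
noaa_adjusted = 0.309  # after NOAA adjustments

print(noaa_adjusted / well_sited)    # ~1.99: the adjusted trend is roughly double
print(noaa_adjusted - well_sited)    # ~0.154 C/decade of apparent over-estimation
print(noaa_adjusted > poorly_sited)  # True: adjusted trend exceeds even the poor sites
```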
Which is the more reliable source of data? Raw temperatures from well-sited monitoring stations that are unaffected by manmade and natural obstructions, or data that has been manually biased (manipulated) by humans, where the bias makes no sense given the underlying measurements?
I’ll leave you with two more images from Watts 2012 that summarize the conclusions of the paper.
The maps above show the average temperature increase per decade using three data sets: the upper-right map shows only the data from well-sited monitoring stations, the lower-right map shows only the data from poorly sited monitoring stations, and the left map shows the NOAA-adjusted (biased) data for all weather stations.
The graph on the left shows the temperature trend for well-sited stations in rural, suburban and urban locations, and you can see the trend rise as you move from rural to urban surroundings. Note the green triangle, which shows the NOAA temperature trend after the data were adjusted. The graph on the right shows the temperature trend for poorly sited stations in rural, suburban and urban locations, and again the trend rises from rural to urban, but there is another key takeaway from this graph: the spread between the minimum, mean and maximum temperature trends is larger, a clear indication that the poorly sited stations suffer from higher variability (which should also raise a red flag that the data from these stations are unreliable).
It wasn’t enough for Team AGW to stack the deck and have 80% of their weather stations classified as poorly sited (giving a temperature bias due to UHI); they doubled down and threw a ‘fudge factor’ on top of the data, which further increased the temperature trend per decade. This is not only bad science, it’s criminal.
Further Reading – Joanne Nova also has a great summary of the Watts 2012 paper and you can find it here.