Overheated claims on temperature records

It’s time for sober second thoughts on climate alarms

By Dr. Tim Ball and Tom Harris

Now that the excitement has died down over the news that Earth’s surface temperature made 2017 one of the hottest years on record, it is time for sober second thoughts.

Did the January 18 announcement by the National Oceanic and Atmospheric Administration (NOAA) that 2017 was our planet’s third-hottest year since 1880, and NASA’s claim that it was the second hottest year, actually mean anything?

Although the Los Angeles Times called 2017 “a top-three scorcher for planet Earth,” neither the NOAA nor the NASA records are significant. One would naturally expect the warmest years to come during the most recent years of a warming trend. And thank goodness we have been in a gradual warming trend since the depths of the Little Ice Age in the late 1600s! Back then, the River Thames was covered by a meter of ice, as Jan Griffier’s 1683 painting “The Great Frost” illustrates.

Regardless, recent changes have been too small for even most thermometers to notice. More important, they are often less than the government’s estimates of uncertainty in the measurements. In fact, we lack the data to properly and scientifically compare today’s temperatures with the past.

This is because, until the 1960s, surface temperature data was collected using mercury thermometers located at weather stations situated mostly in the United States, Japan, the United Kingdom, and eastern Australia. Most of the rest of the planet had very few temperature-sensing stations. And none of the Earth’s oceans, which constitute 70% of the planet’s surface area, had more than an occasional station, each separated from its neighbors by thousands of kilometers.

The data collected at the weather stations in this sparse grid had, at best, an accuracy of ±0.5°C (±0.9°F). In most cases, the real-world accuracy was no better than ±1°C (±1.8°F). Averaging such poor data in an attempt to determine global conditions cannot yield anything meaningful. Displaying average global temperature to tenths or even hundredths of a degree, as is done in the NOAA and NASA graphs, clearly defies common sense.
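The arithmetic behind this dispute turns on the distinction between random and systematic error. A minimal simulation (invented numbers, not real station data) shows both behaviors: independent random errors shrink under averaging, while a shared systematic bias – of the kind critics attribute to siting and instrument changes – survives it:

```python
import random

# Illustrative sketch only: average N readings of a known "true" value,
# each with random instrument error of up to +/-0.5 C, then compare with
# the case where every reading also shares one systematic bias.
random.seed(0)
true_temp = 15.0
n = 10_000

# Random error: independent per reading, so it averages down (~1/sqrt(N)).
random_readings = [true_temp + random.uniform(-0.5, 0.5) for _ in range(n)]
avg_random = sum(random_readings) / n

# Systematic error: a shared bias (e.g. a mis-calibrated instrument type)
# survives averaging no matter how many readings are taken.
bias = 0.3
biased_readings = [r + bias for r in random_readings]
avg_biased = sum(biased_readings) / n

print(round(avg_random - true_temp, 3))  # small: random error averages down
print(round(avg_biased - true_temp, 3))  # near the bias: it does not average down
```

Which kind of error dominates the historical station record is exactly what is contested here.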

Modern weather station surface temperature data is now collected using precision thermocouples. But, starting in the 1970s, less and less ground surface temperature data was used for plots such as those by NOAA and NASA. This was done initially because governments believed satellite monitoring could take over from most of the ground surface data collection.

However, the satellites did not show the warming forecast by computer models, which had become so crucial to climate studies and energy policymaking. So bureaucrats closed most of the colder rural surface temperature sensing stations – the ones furthest from much warmer urban areas – thereby yielding the warming desired for political purposes.

Today, virtually no data exist for approximately 85% of the Earth’s surface. Indeed, fewer weather stations are in operation now than in 1960.

That means surface temperature computations by NOAA and NASA after about 1980 are meaningless. Combining this with the problems with earlier data renders an unavoidable conclusion: It is not possible to know how Earth’s so-called average surface temperature has varied over the past century and a half.

The data are therefore useless for input to the computer models that form the basis of policy recommendations produced by the United Nations Intergovernmental Panel on Climate Change (IPCC) and used to justify eliminating fossil fuels, and replacing them with renewable energy.

But the lack of adequate surface data is only the start of the problem. The computer models on which the climate scare is based are mathematical constructions that require the input of data above the surface, as well as on it. The models divide the atmosphere into cubes piled on top of each other, ideally with wind, humidity, cloud cover, and temperature conditions known for different altitudes. But we currently have even less data above the surface than on it, and there is essentially no historical data at altitude.
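The cube-and-layers description above can be made concrete as a toy data structure. This is a hedged illustration only – the field names and grid sizes are invented and do not correspond to any actual climate model:

```python
from dataclasses import dataclass

# One state record per grid cell, per the "stacked cubes" description.
@dataclass
class GridCell:
    temperature_c: float  # air temperature in this cell
    humidity: float       # relative humidity, 0..1
    cloud_cover: float    # fractional cloud cover, 0..1
    wind_u: float         # eastward wind component, m/s
    wind_v: float         # northward wind component, m/s

# A 3-D array of cells: latitude x longitude x altitude layers.
n_lat, n_lon, n_levels = 4, 8, 3
atmosphere = [[[GridCell(15.0, 0.5, 0.3, 1.0, 0.0)
                for _ in range(n_levels)]
               for _ in range(n_lon)]
              for _ in range(n_lat)]

# A model steps every cell forward from its neighbors' state; sparse
# observations at altitude mean the initial state is poorly constrained.
print(len(atmosphere), len(atmosphere[0]), len(atmosphere[0][0]))
```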

Many people think the planet is adequately covered by satellite observations, data that represents global 24/7 coverage and is far more accurate than anything determined at weather stations. But the satellites are unable to collect data from the north and south poles, regions that the IPCC, NOAA, and NASA tout as critical to understanding global warming. Besides, space-based temperature data collection did not start until 1979, and 30 years of weather data are required to generate a single data point on a climate graph.

So the satellite record is far too short to allow us to come to useful conclusions about climate change.
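The 30-year convention mentioned above is simple arithmetic: one climate data point is the mean of 30 consecutive annual values, so a record beginning in 1979 contains only a single full non-overlapping window. A sketch with invented annual values:

```python
# One climate "normal" = mean over a full 30-year block of annual values.
def climate_normals(annual_by_year, window=30):
    """Mean over each full non-overlapping 30-year window of the record."""
    years = sorted(annual_by_year)
    normals = []
    for start in range(0, len(years) - window + 1, window):
        block = years[start:start + window]
        normals.append((block[0], block[-1],
                        sum(annual_by_year[y] for y in block) / window))
    return normals

# Hypothetical annual means for 1979-2017: 39 years -> one full window.
data = {year: 14.0 + 0.01 * (year - 1979) for year in range(1979, 2018)}
print(climate_normals(data))
```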

In fact, there are insufficient data of any kind – temperature, land and sea ice, glaciers, sea level, extreme weather, ocean pH, and so on – to be able to determine how today’s climate differs from the past. Lacking such fundamental data, climate forecasts cited by climate activists therefore have no connection with the real world.

British Professor Hubert Lamb is often identified as the founder of modern climatology. In his comprehensive 1972 treatise, Climate: Present, Past and Future, he clearly showed that it is not possible to understand climate change without having vast amounts of accurate weather data over long time frames. Lamb also noted that funding for improving the weather database was dwarfed by money being spent on computer models and theorizing. He warned that this would result in wild and unsubstantiated theories and assertions, while predictions failed to improve. That is precisely what happened.

Each and every prediction made by the computer models cited by the IPCC has turned out to be incorrect. Indeed, the first predictions they made for the IPCC’s 1990 Assessment Report were so wrong that the panel started to call them “projections” and offered low, medium, and high “confidence” ranges for future guesstimates, which journalists, politicians, and others nevertheless treated as reliable predictions for future weather and climate.

IPCC members seemed to conclude that, if they provided a broad enough range of forecasts, one was bound to be correct. Yet, even that was too optimistic. All three ranges predicted by the IPCC have turned out to be wrong.

US Environmental Protection Agency (EPA) Administrator Scott Pruitt is right to speak about the need for a full-blown public debate among scientists about the causes and consequences of climate change. In his February 6 television interview on KSNV, an NBC affiliate in Las Vegas, Mr. Pruitt explained:

“There are very important questions around the climate issue that folks really don’t get to. And that’s one of the reasons why I’ve talked about having an honest, open, transparent debate about what do we know, and what don’t we know, so the American people can be informed and they can make decisions on their own with respect to these issues.”

On January 30, Pruitt told the Senate Environment and Public Works Committee that a “red team-blue team exercise” (an EPA-sponsored debate between climate scientists holding differing views) is under consideration. It is crucially important that such a debate take place.

The public needs to understand that even the most basic assumptions underlying climate concerns are either in doubt or simply wrong. The campaign to force America, Canada, Europe, and the rest of the world to switch from abundant, affordable coal and other fossil fuels to expensive, unreliable, land-intensive alternatives – supposedly to control Earth’s always-fluctuating climate – will then finally be exposed for what it really is: the greatest, most damaging hoax in history.

———-

Dr. Tim Ball is an environmental consultant and former climatology professor at the University of Winnipeg in Manitoba. Tom Harris is executive director of the Ottawa, Canada-based International Climate Science Coalition.

 

About the Author: CFACT

CFACT defends the environment and human welfare through facts, news, and analysis.

6 Comments
  1. yoda

    “That means surface temperature computations by NOAA and NASA after about 1980 are meaningless.” But why? Why do NOAA and NASA, two “august” organisations, resort to such activities? Why are they doing it?

  2. FrankSW

    Endlessly discussing whose version of the temperature record is right is the wrong line of defence; publishing temperatures and related effects is just the alarmists’ smokescreen for confirming that their CO2 theory is right.

    If indeed CO2 has a minimal effect on temperature, then the correct rebuttal should always include an independent attack on their core CO2 belief, and weave in other reasons, independent of temperature, that CO2 is not dangerous – such as the saturation of overall atmospheric opacity.

    Without the assumed truth of “dangerous CO2 causes temperature changes” the political decision makers can move on from CO2 emissions reduction.

  3. Brin Jenkins

    Indeed, the man-made bit is infinitesimal, but still we get bombarded with scare scenarios. Also to the point is how our carbon taxes are currently used to uplift low-carbon economies so that they now require manufactured goods and fuels. This converts low-carbon economies into carbon producers, negating any dubious benefit while we destroy ours. An honest approach would be to reduce our standard of living to a low-carbon level, but that would fly like a lead balloon.

  4. cgs

    Dr. Ball and Mr. Harris,

    Berkeley Earth calculates for 2017 an annual temperature anomaly of 0.47 ± 0.05°C. This number, they report, is 0.11°C cooler than the anomaly for 2016. Since this difference is roughly twice the stated uncertainty, they claim it clearly shows that 2017 was cooler than 2016.

    http://berkeleyearth.org/global-temperatures-2017/

    They also show that the difference between 2017 and 2015 is not greater than the calculated error, thus they assert that they cannot unambiguously order these two years.

    But 2017 is 0.17°C greater than 2014, so, since this number is 3x the calculated error, they claim they can clearly order 2017 against 2014 (and all previous years).

    If we are to accept this, it would tell us that the changes we measure currently are indeed large enough to allow for an analysis of warming or cooling.

    Additionally, as you can also see, their measurement of the anomaly and its error are presented to the hundredths of a degree.

    So, a question:

    When you write that “recent changes have been too small for even most thermometers to notice” and “displaying average global temperature to tenths or even hundred[th]s of a degree…clearly defies common sense,” are you claiming that the calculations that Berkeley Earth does are incorrect?
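The comparisons in this comment all reduce to one test: is the difference between two years larger than some multiple of the stated uncertainty? A minimal sketch of that arithmetic, using only the figures quoted in the comment above (the threshold factor k is an illustrative assumption):

```python
def distinguishable(diff, err, k=2.0):
    """True if the difference exceeds k times the stated uncertainty."""
    return abs(diff) > k * err

# Figures quoted above (Berkeley Earth, degrees C): uncertainty 0.05,
# 2017 vs 2016 differ by 0.11, 2017 vs 2014 differ by 0.17.
err = 0.05
print(distinguishable(0.11, err))  # 2017 vs 2016: difference ~2x the error
print(distinguishable(0.17, err))  # 2017 vs 2014: difference ~3x the error
```

A smaller difference, like the 2017-vs-2015 case the comment mentions, would fail this test and leave the two years unordered.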