Average annual global temperatures have risen a degree or two since the Little Ice Age ended some 150 years ago. Thank goodness. The LIA was not a particularly pleasant time.
Prolonged winters, advancing glaciers, colder summers, more frequent storms and extended cloudiness reduced arable land, shortened growing seasons, rotted grain in wet fields, and brought famine, disease and death. Coming after the prosperous Medieval Warm Period – when farmers grew wine grapes in England and Vikings raised crops and cattle in Greenland – it must have been quite a shock.
The LIA underscored how much better a warmer planet is than a colder one. Moderate warming above today’s norm would likely bring expanded cultivation during longer growing seasons in northern latitudes, fewer people dying from hypothermia during frigid winters, and many other benefits.
What caused the Medieval Warm Period to end, and the Little Ice Age to come and go, is still debated. Even the best scientists don’t fully understand what alignments of solar, cosmic, oceanic, atmospheric and planetary forces control this millennial warm-cool rhythm.
In any event, the initial warming of 1850-1900 was followed by perhaps an additional overall 1.4 degrees Fahrenheit (0.8 degrees Celsius) of warming during the twentieth century. However, it was not a steady rise in temperatures, proportionate to increasing atmospheric carbon dioxide levels, as “manmade climate disaster” themes suggest. Instead, Earth warmed noticeably 1900-1940, cooled slightly 1940-1975 (when “most scientists” worried about another little ice age), warmed again 1975-1995 (when “most scientists” feared global warming), and exhibited little change from then to the present.
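As a quick sanity check on such figures, note that a temperature *change* converts from Celsius to Fahrenheit by multiplying by 9/5 only — the familiar +32 offset applies to absolute readings, not to differences. A minimal sketch:

```python
def delta_c_to_f(delta_c: float) -> float:
    """Convert a temperature *change* (not an absolute reading)
    from Celsius to Fahrenheit. Differences need no +32 offset."""
    return delta_c * 9.0 / 5.0

# 0.8 C of warming corresponds to roughly 1.4 F
print(round(delta_c_to_f(0.8), 2))
```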
The 20-year late-twentieth-century warming supposedly justifies demands that we stop using hydrocarbon fuels, halt US economic growth, hold back Third World development, ban incandescent light bulbs, blanket the planet with unreliable wind turbines and solar panels, make recompense to poor nations for emitting CO2 and “causing global climate disruption,” and even consider “geo-engineering” (putting dust particles or tiny mirrors into space to block the sun’s rays) to prevent warming that stopped in 1995. Even though no reliable or factual evidence shows that this recent warming was (primarily) human-caused!
These are important issues for the next Congress (and others) to grapple with. But an even more fundamental question is rarely raised, and almost never addressed.
How much credence can we give any claim that average global temperatures have risen or fallen X degrees over a certain period, or that this year or decade is “the warmest ever,” or “since record-keeping began” – especially when the alleged difference is measured in tenths or hundredths of a degree?
The answer: Not much. The truth is, we cannot trust the hype and numbers that routinely come out of the IPCC, NOAA, NASA, CRU, White House and other branches of the climate crisis industry.
Certainly, satellites have gathered arguably reliable atmospheric temperature data since 1980. However, they obviously provide no insights into pre-1980 warming and cooling trends. And for 1850 to 1930, we must rely on scattered land and oceanic thermometer measurements; historic anecdotes, diary entries and paintings that give only general descriptions of climate, heat waves, floods and blizzards; and “proxy” records like tree rings. Even together, this evidence is so sparse, scattered and of uneven quality that it cannot and must not be used to drive major energy, economic and environmental policy decisions.
Calibrated thermometers were invented in 1724, but until well into the twentieth century they provided only scattered measurements for vast continental land masses – and they still do across much of Africa, Asia and South America. No one can calculate 1850-1950 average global temperatures from that. To fill in the huge gaps, scientists often utilize tree rings. However, annual tree growth is determined as much by rainfall as by temperature. Far worse, researchers have been caught selecting twelve trees out of hundreds from Siberia to generate desired “warming trends,” and splicing thermometer measurements onto tree ring data that had begun showing inconvenient “cooling” trends.
Temperature data from the 71% of Planet Earth covered by oceans is even more sporadic. Today, buoys and satellites cover large expanses that previously were measured only by ships traveling different routes, during favorable times of the year, using a variety of methods to measure seawater and air temperatures. But even today only a small portion of Earth’s oceans is measured regularly or accurately.
Compounding these problems, 55% of the 12,000 surface temperature stations operating in 1990 have been closed down – and many of the now missing stations were in Siberia and other cold regions. This alone has created a significant 20-year “warming” bias, notes former University of Winnipeg climatology professor Tim Ball.
Today, nearly half of the world’s remaining stations are located in the United States, on 1.9% of the Earth’s surface. The vast majority are in the Lower 48 States. And as meteorologist Anthony Watts has documented, most of those stations are near parking lots, air conditioning exhaust ports, highways, airport tarmac and other artificial heat sources – all of which skew the recorded temperatures upward. His report, “Is the US surface temperature record reliable?” is a real eye-opener.
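The station arithmetic above is easy to verify. The 12,000-station and 55% figures come from the article; the exact US share is given only as “nearly half,” so 50% is assumed here purely for illustration:

```python
# Figures from the article; the 50% US share is an illustrative assumption
STATIONS_1990 = 12_000
CLOSED_FRACTION = 0.55
US_SHARE_OF_REMAINING = 0.50  # "nearly half" -- assumed as 0.5

remaining = round(STATIONS_1990 * (1 - CLOSED_FRACTION))   # stations still open
us_stations = round(remaining * US_SHARE_OF_REMAINING)     # of those, in the US

print(remaining)     # 5400 stations worldwide
print(us_stations)   # ~2700 concentrated on 1.9% of Earth's surface
```

Under those assumptions, roughly 2,700 of the world’s remaining stations would sit on less than 2% of the planet’s surface — the concentration the article describes.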
However, none of this sobering reality deters climate chaos alarmists, who consistently show a penchant for distributing dire news releases on the eve of important global warming votes and conferences.
2000-2010 was “the hottest decade ever,” and 2010 “is shaping up to be the hottest year on record,” NASA and NOAA breathlessly announced … on July 28, prior to hoped-for Senate votes and the Cancun summit. “World temperatures in 2010 may be the warmest on record. 2010 will be one of the two warmest years, going back to 1850,” Britain’s Met Office intoned … in late November.
“This year will be the third warmest year on record, since 1850,” the World Meteorological Organization declaimed … on December 3. Other organizations issued similar headline-grabbing alarums.
But before you say kaddish or “requiescat in pace” for Mother Earth, keep the previous caveats in mind and note a few other realities. One, only a few hundredths of a degree separate the 2000-2010 decade from the similarly very warm 1930s – and NASA and other researchers refuse to release their raw temperature data and analytical methods, so that independent researchers can examine their calculations and claims.
Two, most of 2010 was marked by El Niño, the warming phase of the periodic climate pattern across the tropical Pacific Ocean that typically makes summer months warmer than usual. Three, the pre-Cancun pronouncements were based on January-through-October temperatures, and an assumption that November and December will be “average.”
Four, the climate and record books are not cooperating with that assumption or the hype, headlines and summit on Climate Armageddon. South Florida just had its coldest night in 169 years, Wales its coldest since recordkeeping began; and in the middle of its global warming gabfest, Cancun set four record low temperatures in a row. Other local cold records are falling all over the Northern Hemisphere, hot on the heels of record cold and snow during the 2009-2010 winter in both hemispheres.
But then “climate policy has almost nothing to do anymore with environmental protection,” IPCC Working Group III co-chair Ottmar Edenhofer reminded us recently. In fact, “the world climate summit in Cancun is actually an economy summit, during which the distribution of the world’s resources will be negotiated.” [emphasis added] Keep that in mind, too, next time someone says we have a climate crisis.
Magic is delightful when it’s Criss Angel or Harry Potter. Magic temperature numbers – pulled out of hats, computers and fertile imaginations – are a lousy, fraudulent, redistributionist way to set public policy.