Last summer’s headlines blared, “Hottest July in the history of the United States.” The National Climatic Data Center (NCDC) of the U.S. National Oceanic and Atmospheric Administration (NOAA) said so, so it must be true.
This week, the NCDC is reporting the same, with the added alarm that 2012 was the warmest year on record and one of the top two years for extreme weather in America.
Climate activists are linking this to man-made global warming, ignoring the fact that the area covered in the NCDC reports, the contiguous United States (excluding Alaska and Hawaii), comprises only about 2 percent of the Earth’s surface. Trends observed in the United States do not, by themselves, indicate global phenomena. In fact, the United Kingdom’s Meteorological Office has said that there has been no global warming for 16 years and this week announced that temperatures are expected to stay relatively stable for another five years.
Regardless, all NCDC temperature proclamations must be taken with a large grain of salt. Here’s why.
Until the use of thermocouple temperature indicators became common in the U.S. climate network, temperatures were determined with mercury thermometers that are, at best, accurate to within 0.9 degrees Fahrenheit. Even today, many U.S. stations record temperatures only to the nearest whole degree. Thus, breaking the 1936 high-temperature record by 0.2 degrees F, as the NCDC claimed occurred last July, is not meaningful: the change falls within the uncertainty of the measurement. It is akin to being alarmed that the moon has moved a millimeter closer when we can only measure the Earth-moon distance to within a few centimeters.
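The arithmetic behind this objection can be sketched in a few lines. This is an illustrative check of our own, not anything from the NCDC; the helper name is hypothetical, and the two figures are simply those cited in the paragraph above.

```python
def exceeds_uncertainty(margin_f: float, uncertainty_f: float) -> bool:
    """Return True only if a claimed record margin is larger than the
    instrument's measurement uncertainty (hypothetical helper)."""
    return margin_f > uncertainty_f

RECORD_MARGIN_F = 0.2       # July 2012 vs. 1936 margin cited above
THERMOMETER_UNCERTAINTY_F = 0.9  # mercury-thermometer accuracy cited above

# The 0.2-degree margin does not exceed the 0.9-degree uncertainty,
# so the "record" cannot be distinguished from measurement noise.
print(exceeds_uncertainty(RECORD_MARGIN_F, THERMOMETER_UNCERTAINTY_F))  # False
```

By this simple test, a margin would need to exceed 0.9 degrees F before it could be called significant on these instruments.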
Read more at The Washington Times. By Tom Harris and Tim Ball.
Photo credit: Shazz Mack (Creative Commons)