Is It Really The Warmest Ever?
But after the incredibly cold and snowy winters of 2008/09 and 2009/10, and so far in 2010/11, claims that each year is the warmest ever are falling on increasingly deaf ears. Public doubt about global warming has grown since the Climategate disclosures suggested scientists had been ‘cooking the books’, especially after earlier promises of warm, snowless mid-latitude winters failed miserably.
Back on March 20, 2000, The Independent, a British newspaper, reported the warning of Dr. David Viner of the UK’s Climatic Research Unit that within a few years snowfall would become “a very rare and exciting event.” Indeed, Viner opined, “Children just aren’t going to know what snow is.”
Similarly, David Parker, at the UK’s Hadley Centre for Climate Prediction and Research, said that eventually British children could have only “virtual” experience of snow via movies and the Internet.
The UK Met Office forecast each of the last three UK winters to be mild and snowless. Instead, brutal cold and snow have the Met Office on its heels; indeed, the conditions were a throwback to the age of Dickens in the early 1800s. UK MPs called for an official parliamentary probe into whether the UKMO’s reliance on its ideology and CO2 models had biased its predictions.
In the United States, NOAA, echoing the UN IPCC, claimed snow would retreat north with the storm tracks and major cities would get more rain and milder winters. In 2004, the Union of Concerned Scientists claimed winters were becoming warmer and less snowy. In 2008, Robert F. Kennedy Jr. bemoaned that children in the DC area would be robbed of the childhood joys of sledding and skiing due to global warming. A year later, the area set a new seasonal snowfall record with 5 to 6 feet of snow, and sleds and skis were the only way to get around.
The winter of 2009/10 was the coldest on record in parts of the southeastern United States and parts of Siberia, and the coldest since 1977/78 or 1962/63 in many parts of the United States, Europe, and Asia.
The spirits of alarmists and their cheerleaders in the media were buoyed by the hot summer in the eastern United States and western Russia, even though that is the normal result when a strong La Nina follows on the heels of a strong El Nino winter. As is usually the case in La Ninas, global cooling followed within 6 months: temperatures plunged as winter approached, and December 2010 was the second coldest in the entire Central England Temperature record extending back to 1659. It was the coldest December on record in locations as diverse as Ireland, Sweden, and Florida.
Reluctantly, alarmists and their cheerleaders in the media changed their tune and the promise of warm and snowless winters with ‘global warming’ morphed into global warming means cold and snowy winters. ABC News even said cold and snowy winters would be the new norm because of global warming. Non sequiturs like that have sadly become ‘the new norm’ in the wacky world of the mainstream media.
In Australia, the government’s Bureau of Meteorology and university alarmist scientists predicted major drought and blocked dams and flood-mitigation projects; yet when devastating floods occurred this summer, they blamed those on global warming, and again the environmentalists and government agencies escaped blame. Other scientists had warned that changes in the Pacific would lead to a return of flood years like 1974, but they were ignored by an agenda-driven, green-leaning government.
In fact, environmentalists and alarmist scientists have reinvented global warming and now attribute all weather to it – cold, warm, drought and flood. They call it ‘climate disruption’. But the climate has not been cooperating in a way that convinces the public they must sacrifice even more to stop a problem they don’t sense is real. Just imagine if they knew how much the proposed remedies would really cost (trillions – several thousand dollars per year per family) and how little those deep sacrifices would change the climate (not measurably).
Despite claims to the contrary, global temperatures stopped warming in recent years. Even Phil Jones of the UK’s Climatic Research Unit admitted after Climategate that there had been no statistically significant warming since 1995 (15 years), and that between 2002 and 2009 global temperatures declined 0.12C (0.22F).
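The phrase ‘statistically significant warming’ can be made concrete: fit a linear trend to an annual anomaly series and ask whether the slope’s confidence interval excludes zero. A minimal sketch using ordinary least squares and synthetic data (the series below is illustrative, not the actual HadCRUT record):

```python
import numpy as np

def trend_significance(years, anomalies):
    """Return OLS trend (deg C per decade) and its ~95% half-width."""
    x = np.asarray(years, dtype=float)
    y = np.asarray(anomalies, dtype=float)
    n = len(x)
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    # Standard error of the slope from the residual variance
    se = np.sqrt(np.sum(resid**2) / (n - 2) / np.sum((x - x.mean())**2))
    half_width = 1.96 * se  # normal approximation
    return slope * 10, half_width * 10  # express per decade

# Illustrative 15-year series: a small trend buried in year-to-year noise
rng = np.random.default_rng(0)
years = np.arange(1995, 2010)
anoms = 0.005 * (years - 1995) + rng.normal(0.0, 0.1, len(years))
slope, hw = trend_significance(years, anoms)
verdict = "significant" if abs(slope) > hw else "not significant"
print(f"trend {slope:.3f} +/- {hw:.3f} C/decade -> {verdict}")
```

Over short windows the year-to-year noise dominates, so even a real trend can fail this test; that is the sense in which Jones’s 1995-onward statement should be read.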
To try and stop the bleeding, NOAA and NASA took steps to reduce or eliminate the cooling.
This aggravated an already bad situation. CRU database programmer Ian ‘Harry’ Harris’s frustrated rants in his Climategate log were eye-opening: “[The] hopeless state of their (CRU) data base. No uniform data integrity, it’s just a catalogue of issues that continues to grow as they’re found…There are hundreds if not thousands of pairs of dummy stations…and duplicates… Aarrggghhh! There truly is no end in sight. This whole project is SUCH A MESS. No wonder I needed therapy!!”
Furthermore, in a candid interview on the BBC, CRU’s Director Phil Jones admitted his “surface temperature data are in such disarray they probably cannot be verified or replicated”.
So should we avoid CRU and focus on NOAA and NASA instead? The answer is an unequivocal no.
In a Climategate email, Phil Jones acknowledges that CRU mirrors the NOAA data. “Almost all the data we have in the CRU archive is exactly the same as in the Global Historical Climatology Network (GHCN) archive used by the NOAA National Climatic Data Center.” And NASA uses NOAA data applying their own adjustments. All three data bases suffer from the same flaws.
All three have managed to extract a warming trend from data that suggests cyclical changes and little long-term trend. See how the three data centers, working off the same data, have reconstructed the global temperature history: NASA (in green) shows the warmest anomalies, CRU generally the lowest. Part of the difference is the base period used for computing averages (NASA uses the cold 1951 to 1980 period for its normals, CRU uses 1961 to 1990, and NOAA uses the entire period of record).
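The base-period issue is simple arithmetic: an anomaly is a temperature minus the average over a chosen reference period, so a colder baseline yields larger positive anomalies for identical data. A minimal sketch (the temperature series is illustrative):

```python
import numpy as np

def anomalies(temps, years, base_start, base_end):
    """Anomalies relative to the mean over [base_start, base_end]."""
    temps = np.asarray(temps, dtype=float)
    years = np.asarray(years, dtype=float)
    mask = (years >= base_start) & (years <= base_end)
    return temps - temps[mask].mean()

years = np.arange(1951, 2011)
temps = 14.0 + 0.01 * (years - 1951)  # illustrative, steadily rising series

# Same data, three baselines: the earlier (colder) baseline gives the
# largest present-day anomaly, the full period of record the smallest.
a_nasa = anomalies(temps, years, 1951, 1980)  # NASA-style normals
a_cru  = anomalies(temps, years, 1961, 1990)  # CRU-style normals
a_por  = anomalies(temps, years, 1951, 2010)  # full period of record
print(a_nasa[-1], a_cru[-1], a_por[-1])
```

Note that the trend is identical in all three series; only the offset differs. The baseline choice matters mainly when anomaly values from different centers are compared side by side as if they were on a common scale.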
All show a warming period from the 1920s to the early 1940s, a cooling from the 1940s to the 1970s, another warming from the late 1970s to around 1998, and then, as Jones noted, a flattening. The warming early in the century, before the industrial boom, was very similar to that from 1978 to 1998, while the cooling after WWII came during the post-war boom.
In the detailed working paper I coauthored with Anthony Watts and others, we concluded: There has clearly been evidence of some cyclical warming in recent decades, most notably 1979 to 1998. However, the global surface-station data is seriously compromised. The data suffers significant contamination by urbanization and other local factors such as land-use/land-cover changes. In a majority of cases studied, station siting does not meet the published criteria, with contamination by very local heat sources. There was a major station dropout, which occurred suddenly around 1990, and a significant increase in missing monthly data in the stations that remained. (Note: this increases uncertainty – greatest in the regions where they claim the warming is the greatest.) There are huge uncertainties in ocean temperatures; no small issue, as oceans cover 71% of the earth’s surface.

These factors lead to significant uncertainty and a tendency to over-estimate century-scale temperature trends. The findings together suggest that the global data bases are seriously flawed and can no longer be trusted to assess climate trends or rankings or to validate model forecasts. Consequently, such surface data should be ignored for decision making.
In this story, we will look at two of the best documented issues, urban contamination and poor siting of instruments.
Urban Heat Island
Everyone recognizes that urban areas are warmer, especially at night, than the surrounding rural and suburban areas. Airports originally on the outskirts of urban areas have seen cities grow up around them and temperatures artificially rise. Tim Oke (1973) and Torok et al. (2001) experimentally determined the amount of contamination as a function of population; they found that even a town of 1,000 could produce an artificial warming of 2.2C (4.0F).
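Oke’s relationship is logarithmic in population: each tenfold increase in population adds roughly the same increment of maximum heat-island intensity. A minimal sketch of that functional form; the default coefficients below are illustrative values of an Oke-style fit, not numbers taken from the paper itself:

```python
import math

def uhi_max(population, a=3.06, b=-6.79):
    """Oke-style estimate of maximum urban heat island intensity (deg C).

    The log-linear form follows Oke (1973); the coefficients a and b are
    illustrative assumptions, not an endorsed fit from the source.
    """
    return max(0.0, a * math.log10(population) + b)

for pop in (1_000, 10_000, 100_000, 1_000_000):
    print(f"pop {pop:>9,}: ~{uhi_max(pop):.1f} C")
```

With these coefficients a town of 1,000 comes out near the ~2C figure cited above, and a city of a million several degrees higher; actual intensity on any given night also depends on wind and cloud cover.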
In the original NOAA US database, called USHCN version 1, NOAA (Karl 1988) used Oke’s work and others’ to develop an adjustment for urban heat island contamination. The combination of longer-term station stability and this adjustment made that database (1,221 climate stations) the best in the world. The data showed cyclical changes, with warmth peaking in the 1930s, a cooling that bottomed out in the 1960s and 1970s, and a modest warming thereafter that fell short of the 1930s peak. This is a screen capture of the US annual temperatures from NOAA, as posted on the NASA web site in 1999.
Indeed James Hansen in 1999 remarked correctly about this plot “The U.S. has warmed during the past century, but the warming hardly exceeds year-to-year variability. Indeed, in the U.S. the warmest decade was the 1930s and the warmest year was 1934.”
NOAA and NASA constantly had to explain why their global data sets, which had no such adjustment, were showing warming while the US data showed much less. NOAA began reducing the UHI adjustment around 2000 (noticed by state climatologists and seen in this analysis of New York City’s Central Park data here), and then in USHCN version 2, released for the US stations in 2009, the urban heat island adjustment was eliminated entirely, which increased the warming trend since the 1930s by almost 0.3F. See the animating GIF here.
David Easterling, Chief of the Scientific Services Division at NOAA admitted in one of the NASA FOIA emails: “One other fly in the ointment, we have a new adjustment scheme for USHCN (V2) that appears to adjust out some, if not most, of the “local” trend that includes land use change and urban warming.”
Brian Stone of Georgia Tech in a 2009 paper found “Across the U.S. as a whole, approximately 50 percent of the warming that has occurred since 1950 is due to land use changes (usually in the form of clearing forest for crops or cities) rather than to the emission of greenhouse gases,” said Stone. “Most large U.S. cities, including Atlanta, are warming at more than twice the rate of the planet as a whole – a rate that is mostly attributable to land use change.”
NOAA used a paper by Peterson (2003) to justify removing the urban adjustment. Peterson had claimed, “Contrary to generally accepted wisdom, no statistically significant impact of urbanization could be found in annual temperatures.” Steve McIntyre challenged that finding by showing that the difference between urban and rural temperatures for the full Peterson station set was 0.7°C, and between large cities and rural areas 2°C.
CRU had done the same for their global data, relying on the findings of Jones (1990) and Wang (1990). Those 1990 papers were shown by Keenan to be based on fabricated China data. Ironically, in 2008 Jones himself found that contamination by urbanization in China was a very non-trivial 1C per century, but that did not prompt the data centers to begin adjusting, as doing so would have eliminated the global warming.
Bad station siting
According to NOAA guidelines, climate temperature sensors are to be located away (100 feet or more) from local heat sources and sheltered from direct sunlight on the sensing element, while allowing for ventilation by the wind.
Watts found that 89 percent of the more than 1,000 U.S. ground temperature stations surveyed do not meet NOAA’s published standards for distance between stations and adjacent heat sources, seriously compromising readings. “(Even) The raw temperature data produced by the … stations are not sufficiently accurate to use in scientific studies or as a basis for public policy decisions,” Watts concludes.
Just one example among thousands: the Urbana, Ohio climate station, shown below with its sensor surrounded by multiple heat sources.
NOAA first denied the issue in an internal talking-points memo, then in a rushed ‘pal-reviewed’ paper (Menne et al. 2010), but then asked the government for $100 million to upgrade or correct the siting of 1,000 climate stations.
Indeed, numerous peer-reviewed papers catalogued here have estimated that these local issues with the observing networks may account for 30%, 50% or more of the warming shown since 1880.
STILL MORE ADJUSTMENTS
After the data, with all its warts, is collected, further adjustments are made, each producing more warming. MIT meteorologist Dr. Richard Lindzen commented: “[W]hen data conflicts with models, a small coterie of scientists can be counted upon to modify the data” to agree with the models’ projections.
Over time in the global data bases, the warming trend has been steadily increasing. This has been accomplished by cooling off prior decades while increasing the warming in recent years. Many examples are provided in the paper and case studies here.
For example, by extracting old data from papers by James Hansen and comparing it with data downloaded from NASA’s GISS site in 2007 and 2010, we can see the progressive ‘man-made’ global warming (the men here, though, are at NASA). This was accomplished by adjusting the data, homogenizing it (blending urban stations with rural ones and well-sited stations with poorly sited ones), and then, in 2007, removing the urban adjustment in the United States.
The frequency and direction of NASA US adjustments stepped up in 2007 as temperatures began to cool (here).
NASA/NOAA’s homogenization process has been shown to significantly alter the trends at many stations whose siting and rural character suggest the data is reliable. In fact, adjustments account for virtually all of the trend in the data. Unadjusted data for the best-sited and rural stations show cyclical multi-decadal variations but no net long-term trend, as former NASA scientist Dr. Ed Long showed here. He showed, however, that after adjustment the rural data trend was made consistent with the urban data set, with an artificial warming introduced. So in these data sets the urban warming is allowed to remain, while a warm bias is artificially introduced into the rural and well-sited data, which in its unadjusted state shows no warming.
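The mechanism Long describes can be shown with a toy example: if a homogenization step nudges each station toward a regional average that includes urban-trending neighbors, a flat rural series acquires part of the urban trend. This is a deliberately simplified stand-in, not NOAA’s actual pairwise algorithm:

```python
import numpy as np

years = np.arange(1950, 2011)
rural = np.zeros(len(years))              # flat rural series: no trend
urban = 0.03 * (years - years[0])         # urban series warming 0.3 C/decade

def blend_toward_regional(station, neighbors, weight=0.5):
    """Nudge a station series toward the mean of its neighbors.

    Toy stand-in for homogenization: the real pairwise method is far more
    elaborate, but the trend-blending effect is the same in kind.
    """
    regional = np.mean(neighbors, axis=0)
    return (1 - weight) * station + weight * regional

adjusted_rural = blend_toward_regional(rural, [urban])
trend = np.polyfit(years, adjusted_rural, 1)[0] * 10
print(f"rural trend after blending: {trend:.2f} C/decade")  # half the urban trend leaks in
```

With a 50/50 weight, exactly half of the neighbor’s 0.3 C/decade trend appears in the “adjusted” rural series even though the raw rural data was flat.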
In the graph above from Climate Audit, the difference in adjustments made before (red) and after 2007 (black) is dramatic.
Record highs and lows are based on raw, unadjusted data. They show the same pattern we find in raw, unadjusted data from rural and well-sited stations: a cyclical change but no long-term trend. They suggest the 1930s is still the warmest decade, as Hansen stated in 1999. This can be seen by looking at the records city by city. Here we look at state record highs and lows: the chart shows the decade with the highest and lowest temperature for each month and state through 2009. Instead of being the warmest decade on record, the 2000s turn out to be unusually benign, with fewer records than any decade since the 1880s.
Though both NOAA and NASA have resisted FOIA requests for release of all the unadjusted data and documentation for all the adjustments made, that may change in the new congress. The Data Quality Act requires that any published data must be able to be replicated by independent audits. That is currently not possible given the resistance posed, despite promises of transparency.
Georgia Tech’s Dr. Judith Curry’s comments on Roger Pielke Jr.’s blog support such an independent effort: “In my opinion, there needs to be a new independent effort to produce a global historical surface temperature dataset that is transparent and that includes expertise in statistics and computational science…The public has lost confidence in the data sets…Some efforts are underway in the blogosphere to examine the historical land surface data (e.g. such as GHCN), but even the GHCN data base has numerous inadequacies.”
How did we get here?
Dwight Eisenhower in his 1961 Farewell Address to the Nation warned: “The prospect of domination of the nation’s scholars by Federal employment, project allocations, and the power of money is ever present – and is gravely to be regarded.”
NOAA Administrator Dr. Jane Lubchenco, when she was president of the AAAS in 1999, urged: “Urgent and unprecedented environmental and social changes challenge scientists to define a new social contract … a commitment on the part of all scientists to devote their energies and talents to the most pressing problems of the day, in proportion to their importance, in exchange for public funding.”
NOAA and NASA together receive nearly a billion dollars in direct government climate research funding and up to $600 million more from the Recovery Act of 2009. For that, they are expected to support environmental, social, and political agendas. You can see how quickly the political operatives and their media enablers respond to those press releases (here). In an act of unbelievable hypocrisy (and perhaps desperation), Congressman Waxman even wants to challenge the industry funding of skeptic Pat Michaels (who testified in front of his majesty’s committee), ignoring the clear government-sponsored bias of the grant-toting alarmists who testified before the same committee. Instead of focusing on where skeptics get their money, Congress should be asking whether it can trust the global warming scientists in data centers, labs, and most universities who have benefited to the tune of over $73 billion in the last two decades.
Ronald Coase, Nobel Economic Sciences, said in 1991 “If we torture the data long enough, it will confess.”
So is 2010 the warmest year? Is the 2000s the warmest decade? … Don’t bet on it!
Jones, P.D., P.Ya. Groisman, M. Coughlan, N. Plummer, W.C. Wang, and T.R. Karl, 1990: Assessment of urbanization effects in time series of surface air temperatures over land. Nature, 347, 169-172.
Jones, P.D., D.H. Lister, and Q. Li, 2008: Urbanization effects in large-scale temperature records, with an emphasis on China. J. Geophys. Res., 113, D16122, doi:10.1029/2008JD009916.
Karl, T.R., H.F. Diaz, and G. Kukla, 1988: Urbanization: its detection and effect in the United States climate record. J. Climate, 1, 1099-1123.
Menne, M.J., C.N. Williams, Jr., and M.A. Palecki, 2010: On the reliability of the U.S. surface temperature record. J. Geophys. Res., doi:10.1029/2009JD013094, in press.
Oke, T.R. 1973. City size and the urban heat island. Atmospheric Environment 7: 769-779.
Peterson, T.C., 2003: Assessment of urban versus rural in situ surface temperatures in the contiguous United States: No difference found. J. Climate, 16(18), 2941-2959.
Stone, Brian Jr., 2009. Land Use as Climate Change Mitigation, Environmental Science & Technology, 43: 9052-9056.
Torok, S.J., Morris, C.J.G., Skinner, C. and Plummer, N., 2001. Urban heat island features of southeast Australian towns. Australian Meteorological Magazine 50: 1-13.
Joseph D’Aleo (BS, MS Meteorology, University of Wisconsin, Doctoral Program at NYU, CCM, AMS Fellow) has over 35 years experience in professional meteorology. He was the first Director of Meteorology and co-founder of the cable TV Weather Channel. Mr. D’Aleo was Chief Meteorologist at Weather Services International Corporation and Senior Editor for WSI’s popular Intellicast.com web site. He is a former college professor of Meteorology/ Climatology at Lyndon State College. He is the author of a Resource Guide on El Nino and La Nina. Mr. D’Aleo has frequently written about and made presentations on how research into ENSO and other atmospheric and oceanic phenomena has made skillful long-range forecasts possible and has helped develop statistical models using these global teleconnections which he and others use in forecasting for energy and agriculture traders. He has also studied, published and presented on the roles these cycles in the sun and oceans have played in multidecadal climate change. He is currently Executive Director of the International Climate and Environmental Change Assessment Project (http://icecap.us).