Last month the Berkeley Earth Surface Temperature Project released the findings of its extensive study on global land temperatures over the past century. Physics professor Richard Muller, who led the study, heralded the findings with a number of controversial statements in the press, including an op-ed in this newspaper titled "The Case Against Global-Warming Skepticism." And yet Mr. Muller remains a true skeptic: a searcher for scientific truth. I congratulate Mr. Muller and his Berkeley Earth team for undertaking this difficult task in the realm of climate.
The Berkeley study reported a warming trend of about 1° Celsius since 1950, even greater than the warming reported by the U.N.'s Intergovernmental Panel on Climate Change (IPCC). I disagree with this result, which perhaps makes me a little more of a skeptic than Mr. Muller.

Mr. Muller has been brutally frank about the poor quality of the weather-station data, noting that 70% of U.S. stations involve uncertainties of between two and five degrees Celsius. One could interpret the Berkeley study's results as confirmation of earlier studies and of the IPCC's conclusions, despite the poor quality of the stations used. But perhaps the issue is that the Berkeley study, and the ones that came before it, suffer from common errors. I suspect that the temperature records are still affected by the urban heat-island effect (a term given to any local warming, whatever its cause) despite efforts to correct for this. The urban heat-island effect could include heat produced not only in urban areas, but also by changes in land use or poor station siting. Therefore, I suggest additional tests:
1. Disassemble the global average temperature to get a better picture of what's going on regionally. This could involve plotting both the IPCC's and the Berkeley study's data only for tropical regions, separating the northern and southern hemispheres, and testing for seasonal variation and differences between day and night.
2. Better describe what we can think of as the "demographics" of weather stations, a major source of possible error. The IPCC used 6,000 stations in 1970 and only about 2,000 in 2000. Let's examine their latitude, altitude and possible urbanization, and see if there have been major changes in the stations sampled between 1970 and 2000. For example, it is very likely that airports were used as temperature stations in both 1970 and 2000, because airport stations are generally of high quality. But airports are likely warming rapidly because of increasing traffic and urbanization. So if the number of airport stations remained constant at, say, 1,200 while the total sample shrank from 6,000 to 2,000, their share of the sample would have risen from 20% to 60% over that 30-year interval, thereby producing an artificial warming trend.
3. The Berkeley study used a total of 39,000 weather stations, an impressive number. But again, we need to know if that number changed significantly between 1970 and 2000, and how the demographics of the stations changed, both for stations that showed cooling and for those that showed warming.
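The station-share arithmetic behind the airport example in point 2 can be sketched as follows. The 1,200 airport-station count is the hypothetical figure from the text ("say, 1,200"); the 6,000 and 2,000 totals are the IPCC station counts cited above.

```python
# Sketch of the airport-station concern: if a fixed number of
# (warming-biased) airport stations stays in a shrinking sample,
# their share of the global average grows.
airport_stations = 1_200   # hypothetical constant from the text
total_1970 = 6_000         # IPCC stations used in 1970
total_2000 = 2_000         # IPCC stations used in 2000

share_1970 = airport_stations / total_1970   # 0.20, i.e. 20% of the sample
share_2000 = airport_stations / total_2000   # 0.60, i.e. 60% of the sample

print(f"{share_1970:.0%} -> {share_2000:.0%}")  # prints "20% -> 60%"
```

The point of the sketch is only that a constant count becomes a growing weight as the denominator shrinks, which would tilt the average toward whatever bias those stations share.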
But the main reason I am skeptical about the IPCC findings, and now the Berkeley findings, is that they disagree with almost every other data source I can find. I confine this critique to the period between 1978 and 1997, thereby avoiding the Super El Niño of 1998, which had nothing to do with greenhouse gases or other human influences.
Contrary to both global-warming theory and climate models, data from weather satellites show no atmospheric temperature increase over this period, and neither do the entirely independent radiosondes carried in weather balloons. The Berkeley study confined its findings to land temperatures as recorded by weather stations. Yet oceans cover 71% of the earth's surface, and the marine atmosphere shows no warming trend. The absence of warming is in accord with the theory that climate is heavily impacted by solar variability, and agrees with the solar data presented in a 2007 paper by Danish physicist Henrik Svensmark in the journal Proceedings of the Royal Society A.
Moreover, independent data using temperature proxies (various non-thermometer sources such as tree rings, ocean and lake sediments, ice cores, stalagmites, and so on) also support an absence of warming between 1978 and 1997. Coral data also show no pronounced warming trend of the sea surface, and there are good reasons to believe that reported sea-surface warming is an artifact of thermometer measurements.
The IPCC's 2007 Summary for Policymakers claims that "Most of the observed increase in global average [surface] temperatures since the mid-20th century is very likely [90-99% sure] due to the observed increase in anthropogenic greenhouse gas concentrations." While Mr. Muller now seems to agree that there has been such global average warming since the mid-20th century, he nonetheless ended his op-ed by disclaiming that he knows the cause of any temperature increase. Moreover, the Berkeley team's research paper comments: "The human component of global warming may be somewhat overestimated." I commend Mr. Muller and his team for their honesty and skepticism.
Atmospheric physicist S. Fred Singer is a Research Fellow at the Independent Institute, Professor Emeritus of Environmental Sciences at the University of Virginia, and former founding Director of the U.S. Weather Satellite Service. He is author of "Hot Talk, Cold Science: Global Warming's Unfinished Debate" (The Independent Institute).