January 26, 2005
Uncertainty in the GCMs

There's been a lot of noise today about a paper in Nature on the range of uncertainty in future climate change. It's a terrifically interesting and useful paper, but if you've only read the press coverage that I've been reading, you may have a rather wrong idea about why.

Using a SETI@home-style distributed computing system, the researchers were able to perform far more climate model simulation runs than is normally possible. By varying the models' adjustable parameters across a reasonable but much wider range than usual, they found a significantly larger range of possible temperature increases.

The 2,017 simulation runs showed possible temperature increases under doubled atmospheric CO2 ranging from 2 degrees C to more than 11 degrees C. That's a bigger range, especially at the top end, than the 1.4 to 5.8 degrees C cited by the IPCC. Note - and this is important - that the Oxford team that did the calculations did not say the high end of the range was the most likely outcome. Quite the opposite: the frequency distribution in their paper suggests the lower end, in the 2 to 4 degree range, is the most likely.
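
To see how a long upper tail can coexist with a most-likely answer at the low end, here is a toy sketch in Python (my own illustration using a standard back-of-the-envelope relationship, not the Oxford team's actual model). Equilibrium warming scales roughly as the CO2 forcing divided by the net climate feedback parameter, so even a flat, even-handed spread of plausible feedback values piles most of the runs up at low warming while stretching a thin tail toward high warming. The 3.7 W/m^2 forcing figure is a standard estimate; the feedback range below is an assumption picked purely for illustration.

    import random

    # Toy illustration (not the paper's model): equilibrium warming under
    # doubled CO2 is roughly dT = F / lam, where F is the radiative forcing
    # from a CO2 doubling (~3.7 W/m^2, a standard estimate) and lam is the
    # net climate feedback parameter in W/m^2 per degree C.
    F_2X = 3.7
    N_RUNS = 2017  # matching the ensemble size in the paper, just for flavor

    random.seed(0)
    warmings = []
    for _ in range(N_RUNS):
        # Flat spread of feedback strengths; these endpoints are
        # illustrative assumptions, not values from the paper.
        lam = random.uniform(0.3, 1.8)
        warmings.append(F_2X / lam)

    warmings.sort()
    print(f"min {warmings[0]:.1f} C, "
          f"median {warmings[N_RUNS // 2]:.1f} C, "
          f"max {warmings[-1]:.1f} C")
    # Dividing by lam squeezes most runs toward low warming and stretches
    # a thin tail toward high warming, the same qualitative shape as the
    # frequency distribution in the paper.

Run that and the median lands in the 3 to 4 degree neighborhood while the maximum pushes past 11 - which is exactly the situation where a headline writer gets to choose which number to report.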

Now let's look at the press coverage.

First, from The Telegraph: "Screen saver weather trial predicts 10 degree C rise in British temperatures"

Or MSNBC: "Computer models for climate change allow for a global temperature rise of as much as 20 degrees Fahrenheit (11 degrees Celsius), according to the first results from the world’s largest climate-modeling experiment." (This was accompanied by a map color-coded for the extreme case.)

From The Scotsman: "THE biggest computer calculation of its kind has forecast a degree of global warming greater than previously predicted, with Britain becoming an unrecognisable tropical country in a few generations."

Even Nature itself, in the Web news story accompanying the paper, fell into the trap: "Biggest-ever climate simulation warns temperatures may rise by 11 C." (To be fair, Michael Hopkin's second paragraph gets to the meat: "But as well as predicting a bigger maximum rise, the project has also increased the range of possible temperature changes.")

I could find no one who wrote a headline saying "Biggest-ever climate simulation finds temperatures may rise by only 2 degrees C," though that would have been equally well supported by the data.

For a more useful take, without the big-number alarmism, there is Newsday's Bryn Nelson, who did a good job of laying out where this fits and why it matters:

In formulating reasonable global warming predictions, researchers say they have long been stymied by the inherent uncertainty in their models. The climateprediction.net experiment, run by a British research consortium, is trying to address this frustration with a deceptively simple solution: unused computer space.

Calling upon thousands of volunteers to run different versions of the same model with their computers' spare capacity, the team hopes to refine a long list of atmospheric and oceanic variables. Eventually, the researchers hope to produce a 21st-century forecast that neither over-hypes nor underestimates warming scenarios.

This is an important exercise in understanding and refining the range of possibilities, and it's pretty clear from the research that some of those possibilities are extreme and worthy of attention. But a lot of the press coverage I read today makes me think that some of the skeptic crowd's worst fears about biased media coverage on this issue are not entirely unjustified.

Posted by John Fleck at January 26, 2005 09:49 PM
Comments