Wednesday, July 07, 2010

The CCE Review report

Today, the Independent Climate Change Email Review report was issued. As a bit of background, I'll point you to some of my old blog posts on the subject:

1. Met Office confirms that the station errors in CRUTEM3 are incorrectly calculated

2. Something a bit confusing from UEA/CRU

3. Something odd in CRUTEM3 station errors

4. New version of CRUTEM3 and HADCRUT3

5. Well, I was right about one thing

6. Bugs in the software flash the message "Something's out there"

7. Whoops. There's a third bug in that code

And here's what one of the people involved in the report wrote:

First, climategate reveals the urgent demand by a new breed of citizen-scientist for access to the raw data scientists use to do their work. Simply accepting a scientist's assurance that data are accurate and reliable is no longer enough. Scientists will have to make their data available for independent audit.

Second, climategate shows that science must change its idea of accountability. Traditionally, scientists have been accountable only to one another. But with the advent of new critical public voices in science – the birth of the blogosphere, for example – scientists must redefine who is a legitimate critic and who isn't. It is easy to brand the blogosphere as universally damaging and defamatory. But climategate has shown that while some critics do enjoy abusing scientists, others ask tough and illuminating questions, exposing important errors and elisions. These critics have an important part to play in shaping scientific debate and dialogue.

The report itself says:

37. Making source code publicly available. We believe that, at the point of publication, enough information should be available to reconstruct the process of analysis. This may be a full description of algorithms and/or software programs where appropriate. We note the action of NASA's Goddard Institute for Space Science in making the source code used to generate the GISTEMP gridded dataset publically available. We also note the recommendation of the US National Academy of Sciences in its report "Ensuring the Integrity, Accessibility and Stewardship of Research Data in the Digital Age" that: "…the default assumption should be that research data, methods (including the techniques, procedures and tools that have been used to collect, generate or analyze data, such as models, computer code and input data) and other information integral to a publically reported result will be publically accessible when results are reported." We commend this approach to CRU.


Interestingly, Sir Muir Russell's review did what I did: they wrote code (in their case in C++) to reproduce CRUTEM3 (see Appendix 7). Unfortunately, they didn't go all the way to check the error ranges and find the bug that Ilya Goz and I found.

PS: I've asked the review team if they would give me a copy of their C++ code (in the spirit of openness :-).

1 comment:

Phil said...

It's a shame that they didn't practice what they preach and include the code referred to in Appendix Seven. I hope they put it up on their website for peer review.