Summary: This month might be an inflection point in the three decades of struggle over the best policy to deal with climate change. Not because of the new IPCC special report, which contains little that is new, but because a new paper questions the global temperature records, the foundation of the debate.
“But facts are chiels that winna ding, and downa be disputed.”
— “A Dream” by Robert Burns (1786).
An Audit of the Creation and Content of the HadCRUT4 Temperature Dataset
By John D. McLean, October 2018.
PhD in Physics, James Cook University (2017).
Preface
“This report is based on a thesis for my PhD, which was awarded in December 2017 by James Cook University, Townsville, Australia. …The thesis was examined by experts external to the university, revised in accordance with their comments and then accepted by the university. This process was at least equivalent to ‘peer review’ as conducted by scientific journals.”
His thesis is “An audit of uncertainties in the HadCRUT4 temperature anomaly dataset plus the investigation of three other contemporary climate issues”, submitted for a Ph.D. in physics from James Cook University (2017). The HadCRUT dataset is a collaboration of the Met Office’s Hadley Centre and the Climatic Research Unit at the University of East Anglia.
Here are two excerpts from Dr. McLean’s report.
From the Introduction.
… The key temperature data used by the Intergovernmental Panel on Climate Change (IPCC) is the HadCRUT dataset, now in its fourth version and known as HadCRUT4. When I was an Expert Reviewer of the IPCC’s 2013 Climate Assessment report I raised questions as to whether the HadCRUT4 dataset and the associated HadSST3 dataset had been audited. The response both times was that it hadn’t.
Further indication that no-one has independently audited the HadCRUT4 dataset came early in my analysis, when I found that certain associated files published simultaneously with the main dataset contained obvious errors. Given the nature of the errors and the years in which some of the errors occurred, it seemed that they probably existed for at least five years. (At the time I notified the relevant people and the files have since been corrected.) It seems very strange that man-made warming has been a major international issue for more than 30 years and yet the fundamental data has never been closely examined. …
Almost all of the published papers about the HadCRUT4 dataset and its two associated datasets were written by people involved in the construction and maintenance of them, which hardly makes for unbiased analysis. …
Some issues in this study focus on individual situations, such as a single observation station, that would have negligible impact on global average values. Similar issues could exist elsewhere in the data and processing, perhaps less obviously, and the fact that issues can be identified at all suggests a variety of problems including lack of attention to detail and possible problems with fundamental procedures or processing. Above all they show that considerable uncertainty exists about the accuracy of the HadCRUT4.
The PhD candidature on which this work is based was funded on the normal “per candidate” basis by the Australian government and had no additional funding. The creation of this report itself had no funding whatsoever.
From the Executive Summary.
…As far as can be ascertained, this is the first audit of the HadCRUT4 dataset, the main temperature dataset used in climate assessment reports from the Intergovernmental Panel on Climate Change (IPCC). Governments and the United Nations Framework Convention on Climate Change (UNFCCC) rely heavily on the IPCC reports so ultimately the temperature data needs to be accurate and reliable.
This audit shows that it is neither of those things. More than 70 issues are identified, covering the entire process from the measurement of temperatures to the dataset’s creation, to data derived from it (such as averages) and to its eventual publication. The findings (shown in consolidated form in Appendix 6) even include simple issues of obviously erroneous data, glossed-over sparsity of data, significant but questionable assumptions and temperature data that has been incorrectly adjusted in a way that exaggerates warming.
It finds, for example, an observation station reporting average monthly temperatures above 80°C, two instances of a station in the Caribbean reporting December average temperatures of 0°C and a Romanian station reporting a September average temperature of -45°C when the typical average in that month is 10°C. On top of that, some ships that measured sea temperatures reported their locations as more than 80km inland.
It appears that the suppliers of the land and sea temperature data failed to check for basic errors and the people who create the HadCRUT dataset didn’t find them and raise questions either.
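The missing checks described above are not exotic. A minimal sketch of the kind of plausibility screening the audit implies was absent (illustrative station names and thresholds of my own choosing, not HadCRUT4’s actual QC rules):

```python
# Simple plausibility checks for monthly average temperatures.
# Thresholds here are illustrative, not the Hadley Centre's actual rules.
def plausibility_flags(station, month, avg_temp_c, climatology_c=None):
    """Return a list of reasons a reported monthly average looks suspect."""
    flags = []
    # No monthly *average* on Earth plausibly falls outside this range.
    if not -90.0 <= avg_temp_c <= 60.0:
        flags.append("outside physically plausible range for a monthly mean")
    # A 20 degree departure from the station's typical value for that month
    # is a strong hint of a transcription or unit error.
    if climatology_c is not None and abs(avg_temp_c - climatology_c) > 20.0:
        flags.append("departs wildly from the station's typical value")
    return flags

# The examples quoted above would all trip these checks:
print(plausibility_flags("tropical station", "Jul", 83.0))       # flagged: > 60 C
print(plausibility_flags("Romanian station", "Sep", -45.0, 10.0))  # flagged: typical is 10 C
print(plausibility_flags("normal station", "Jan", 10.0, 9.0))      # no flags
```

Checks this cheap would have caught every error quoted in the excerpt, which is the force of McLean’s point.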
The processing that creates the dataset does remove some errors but it uses a threshold set from two values calculated from part of the data but errors weren’t removed from that part before the two values were calculated.
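The circularity described above can be sketched in a few lines (hypothetical numbers and a generic mean-plus-k-standard-deviations rule, not the dataset’s actual algorithm): when the rejection threshold is computed from data still containing the errors, the errors inflate the threshold and slip through their own quality check.

```python
import statistics

# Hypothetical monthly temperatures (deg C) with one gross error (80.0).
readings = [10.2, 10.5, 9.8, 10.1, 80.0, 10.3, 9.9, 10.4]

def threshold_bounds(data, k=3):
    """Outlier bounds of mean +/- k standard deviations, from `data` as-is."""
    mean = statistics.mean(data)
    sd = statistics.pstdev(data)
    return mean - k * sd, mean + k * sd

# Bounds computed WITH the error included: the 80 C value inflates both
# the mean and the standard deviation, so the bounds are far too wide...
lo, hi = threshold_bounds(readings)

# ...and the gross error survives the quality check built from it.
survivors = [x for x in readings if lo <= x <= hi]
print(80.0 in survivors)  # the erroneous reading passes
```

Cleaning the sample before computing the bounds (or using robust statistics such as the median and interquartile range) avoids this failure mode.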
Data sparsity is a real problem. The dataset starts in 1850, but for just over two years at the start of the record the only land-based data for the entire Southern Hemisphere came from a single observation station in Indonesia. Five years in, just three stations reported data in that hemisphere. Global averages are calculated from the averages for each of the two hemispheres, so these few stations have a large influence on what is supposedly “global”.
Related to the amount of data is the percentage of the world (or hemisphere) that the data covers. According to the method of calculating coverage for the dataset, 50% global coverage wasn’t reached until 1906, and 50% coverage of the Southern Hemisphere wasn’t reached until about 1950.
In May 1861 global coverage was a mere 12% – that’s less than one-eighth. In much of the 1860s and 1870s most of the supposedly global coverage was from Europe and its trade sea routes and ports, covering only about 13% of the Earth’s surface. To calculate averages from this data and refer to them as “global averages” is stretching credulity.
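The weighting problem behind these numbers is easy to see in a toy calculation (hypothetical station anomalies, not HadCRUT4’s gridded method): because the global figure is the mean of the two hemispheric means, a lone Southern Hemisphere station carries as much weight as every Northern Hemisphere station combined.

```python
# Hypothetical anomalies (deg C): several NH stations, a single SH station,
# mimicking the early-1850s situation described above.
nh_stations = [0.1, 0.2, 0.0, 0.15, 0.1, 0.05]
sh_stations = [1.5]  # one station speaks for half the planet

nh_mean = sum(nh_stations) / len(nh_stations)
sh_mean = sum(sh_stations) / len(sh_stations)

# The global mean is the average of the hemispheric means, so the single
# SH station contributes a full 50% of the "global" value.
global_mean = (nh_mean + sh_mean) / 2
print(round(global_mean, 3))  # 0.8 - dominated by the one SH reading
```

One anomalous or unrepresentative station in the sparse hemisphere can therefore move the “global” average by a large amount, which is McLean’s objection to calling the early record global.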
Another important finding of this audit is that many temperatures have been incorrectly adjusted. {Technical explanation follows.} …
The overall conclusion (see chapter 10) is that the data is not fit for global studies. Data prior to 1950 suffers from poor coverage and very likely multiple incorrect adjustments of station data. Data since that year has better coverage but still has the problem of data adjustments and a host of other issues mentioned in the audit.
Calculating the correct temperatures would require a huge amount of detailed data, time and effort, which is beyond the scope of this audit and perhaps even impossible. The primary conclusion of the audit is however that the dataset shows exaggerated warming and that global averages are far less certain than have been claimed.
One implication of the audit is that climate models have been tuned to match incorrect data, which would render incorrect their predictions of future temperatures and estimates of the human influence on temperatures.
Another implication is that the proposal that the Paris Climate Agreement adopt 1850-1899 averages as ‘indicative’ of pre-industrial temperatures is fatally flawed. During that period global coverage is low – it averages 30% across that time – and many land-based temperatures are very likely to be excessively adjusted and therefore incorrect. …
Ultimately it is the opinion of this author that the HadCRUT4 data, and any reports or claims based on it, do not form a credible basis for government policy on climate ….
———————————————-
You can buy a copy of this report for $8 US. I recommend reading it!
The Met Office responds. McLean replies.
From The Australian: “Britain’s Met Office welcomes audit by Australian researcher about HadCRUT errors” by Graham Lloyd, 15 October 2018. As usual, climate science institutions blow the opportunity to boost their credibility with a strong response to challenges (which is what scientists are supposed to do).
Here is McLean’s response at JoNova’s website. From high up in the peanut gallery, both look like climate science at its worst. But that’s an amateur opinion.
My comments
“It is a capital mistake to theorize before you have all the evidence. It biases the judgment.”
— Sherlock Holmes in A Study in Scarlet, by Arthur Conan Doyle (1887).
This report makes an astonishing claim: that one of the principal global temperature datasets has poor quality control. If true, that is negligence, given the importance of this data. I look forward to experts on the climate data records critiquing his report. If McLean is wrong, MET-CRU can easily disprove his claims. Their response (or non-response) might tell us much about the institutions of climate science.
I hope Dr. McLean publishes this in a peer-reviewed journal, giving it a higher level of review than a dissertation receives. But that might be difficult for a green PhD who challenges the core climate science narrative. My guess is that only public pressure will make this possible (assuming, of course, that MET-CRU do not quickly disprove his claims).
Ten years ago there was considerable discussion about the poor quality control in the surface temperature record. I wrote about it, and made this my top recommendation: “More funding for climate sciences. Many key aspects (e.g., global temperature data collection and analysis) are grossly underfunded.” The cost would be pocket lint compared to our total spending on climate science, and well worth it. In 2009 Ross McKitrick gave a powerful call for action …
“I have often used the analogy of national Consumer Price Indexes to illustrate the ridiculous situation of the “Global Temperature” data. Each country has large professional staffs at their Stat agencies working on the monthly CPI using international protocols, using transparent methods, with independent academics looking over their shoulders weighing the various aggregation methodologies …and with historical archiving rules that allow backward revisions periodically if needed. It’s by no means perfect, but it’s a far cry from the f**king gong show we’re seeing here. …
“By contrast the Global Temperature numbers are coming from a bunch of disorganized academics chipping away at it periodically in their spare time. GISS numbers are handled (on Gavin’s admission) by a single half-time staffer, and the CRU says they’re stumped trying to find their original files back into the 70s and 80s, as well as the agreements under which they obtained the data and which to this day they invoke to prevent independent scrutiny.”
Assurances were given that the temperature datasets were reliable. Skeptics were mocked. But this new paper suggests that the problem was worse than most of us thought, and little or nothing has been done to improve what might be the most important data record used today. The other major global temperature datasets might be better, but I doubt it.
Nothing will change without public pressure. Push Congress to fund a full review and, if necessary, an update of these systems. Push Team Trump to implement these measures immediately.
About the author
McLean has 40 years’ experience as an IT professional. The Department of Physics at James Cook University awarded him a PhD in 2017. He has published four papers in peer-reviewed journals (links here). The most recent: “Late Twentieth-Century Warming and Variations in Cloud Cover” in Atmospheric and Climate Sciences, October 2014. See his website.
Trivia note: McLean has a mark of distinction – a page about him at Skeptical Science. These pages are crude propaganda, taking statements out of context and sneering at legitimate points of debate among scientists. For an extreme example, see their page about the eminent climate scientist Roger Pielke Sr.
Coverage of this story (will be updated)
“Claims of 70 problems found with key temperature dataset used by climate models” by Graham Lloyd at The Australian, 8 October 2018 (gated). Excerpt …
“An audit of the key temperature dataset used by climate models claims to have identified more than 70 problems which the Australian author said made it “unfit for global studies”. Problems include zero degree temperatures in the Caribbean, 82 degree C temperatures in Colombia and ship based recordings taken 100km inland. …The audit is an extension of the PhD thesis by Dr John McLean awarded by James Cook University. Dr McLean has previously identified anomalies in the data set which were acknowledged by the Met Office and corrected.”
For More Information
See the new IPCC report: “Global Warming of 1.5 °C.” SR15 differs from AR5 in one major way: it assumes that +1.5°C over pre-industrial levels creates Armageddon. That’s odd, since we are already 1°C over (much of that natural warming). So McLean’s concerns about the global temperature dataset are important: they could mean we are farther from the 1.5°C red line, itself a political revision of a politically chosen target. See “The Invention of the Two-Degree Target” in Der Spiegel.
If you liked this post, like us on Facebook and follow us on Twitter. For more information about this vital issue see the keys to understanding climate change and these posts about the wars over the historical temperature record…
- Important: climate scientists can restart the climate change debate – & win.
- Surprising news about trend of America’s temperature and precipitation.
- Alabama debunks the Times’ story about our warming world.
- About those headlines of the past century about global cooling….
- The facts about the 1970s Global Cooling scare.
- Start of another swing of the media narrative – to global cooling?
- Global Cooling returns to the news, another instructive lesson about America.
- Did NASA and NOAA dramatically alter US climate history to exaggerate global warming? – Spoiler: no.
- The climate wars get exciting. Government conspiracy! Shattered warming records! Global cooling!
- Have the climate skeptics jumped the shark, taking the path to irrelevance?
Alarmists worked hard to keep you from reading this book.
Alarmists have worked long and hard to discredit Roger Pielke Jr., because he tells us about the IPCC and peer-reviewed research: things that violate the “narrative” about our imminent doom.
They really do not want you to read the revised second edition of The Rightful Place of Science: Disasters & Climate Change
“After nearly every hurricane, heatwave, drought, or other extreme weather event, commentators rush to link the disaster with climate change. But what does the science say?
“In this fully revised and updated edition of Disasters & Climate Change, renowned political scientist Roger Pielke Jr. takes a close look at the work of the Intergovernmental Panel on Climate Change, the underlying scientific research, and the climate data to give you the latest science on how climate change is related to extreme weather. What he finds may surprise you and raise questions about the role of science in political debates.”

