Summary: How warm was the world in November? How fast is it warming? See the numbers. They might surprise you.
The world has been warming for two centuries. Seldom mentioned is how much it has warmed, an omission that allows alarmists to arouse fear more easily (e.g., see Joe Romm’s latest: difficult-to-read graphs but no numbers). For the answer we turn to the NASA-funded global temperature data from satellites. This post shows the numbers: the warming since 1979 has been small (so far; the future might be quite different). The truth is out there for people willing to see it. Only with it can we prepare for our future.
Click to enlarge the graphs. This is the first of two posts today.
“It is extremely likely (95 – 100% certain) that human activities caused more than half of the observed increase in global mean surface temperature from 1951 to 2010.”
— conclusion of the IPCC’s AR5 Working Group I
- What do satellites tell us about global warming?
- Was this the hottest November?
- The long-term history of warming
- Who produces this satellite data & analysis?
- For More Information
(1) What do satellites tell us about global warming?
Satellites provide the most comprehensive and reliable record of the atmosphere’s warming since 1979.
The November 2014 Global Temperature Report
by the Earth System Science Center of the University of Alabama in Huntsville
(Blue is cold; red is warm.) Click to enlarge.
See the equivalent graph from the surface temperature stations of the Climate Anomaly Monitoring System (CAMS) of the Climate Prediction Center (CPC) at the U.S. National Centers for Environmental Prediction (NCEP).
Key points from the UAH report (prepared under contract for NASA), which show a world that has warmed since 1979, but only slightly (few alarmists know this; even fewer admit it):
- The global composite temperature in November was +0.33°C (0.60°F) above the average for November during 1981-2010.
- Global temperature trend since 16 November 1978: +0.14°C (about 0.25°F) per decade.
- Anomalies are computed per the World Meteorological Organization (WMO) recommended method, comparing the current temperatures vs. a 30 year base period ending with the latest decade.
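That baseline comparison is simple arithmetic; here is a minimal sketch (with hypothetical numbers — the actual gridded satellite analysis is far more involved):

```python
# Sketch of a WMO-style temperature anomaly: compare a month's reading
# against the mean of the same calendar month over a 30-year base period.
# All temperatures here are hypothetical, for illustration only.

def monthly_anomaly(current_temp_c, base_period_temps_c):
    """Anomaly = current value minus the base-period average."""
    baseline = sum(base_period_temps_c) / len(base_period_temps_c)
    return current_temp_c - baseline

# Thirty hypothetical November global means (deg C) for 1981-2010:
base_novembers = [14.0 + 0.01 * year for year in range(30)]
anomaly = monthly_anomaly(14.5, base_novembers)
print(round(anomaly, 3))  # current reading minus the 1981-2010 November mean
```

A positive anomaly means the month was warmer than its own 30-year baseline, which is why anomalies (rather than raw temperatures) are comparable across regions and seasons.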
That warming has not, however, been uniform around the globe.
- The fastest warming has been over the Arctic Ocean and the Arctic portions of the Atlantic and Pacific oceans. Those areas have warmed at the rate of 0.49°C per decade, or more than 1.76°C (about 3.17°F) in 36 years.
- The oceans surrounding the Antarctic are cooling at the rate of 0.02°C per decade, or 0.07°C since December 1978.
- The Northern Hemisphere is warming more than twice as fast as the Southern Hemisphere (0.19°C per decade vs. 0.09°C per decade).
- The contiguous 48 U.S. states have an average warming rate of 0.22°C (almost 0.40°F) per decade during the past 36 years. That means the average atmospheric temperature over the lower 48 has warmed by 0.79°C or about 1.43°F during that time.
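The per-decade rates above convert to totals over the 36-year record by straightforward multiplication; a quick check of the report’s arithmetic:

```python
# Convert a warming rate (deg C per decade) into a total change over the
# 36-year satellite record, and a deg C change into deg F.
# Rates are those quoted in the UAH report.

def total_change_c(rate_c_per_decade, years):
    return rate_c_per_decade * years / 10.0

def c_to_f_change(delta_c):
    # A temperature *difference*, so no +32 offset.
    return delta_c * 9.0 / 5.0

arctic = total_change_c(0.49, 36)    # Arctic rate: ~1.76 deg C total
lower48 = total_change_c(0.22, 36)   # lower-48 rate: ~0.79 deg C total
print(round(arctic, 2), round(lower48, 2), round(c_to_f_change(lower48), 2))
```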
(2) Was this the hottest November?
Before we look at the numbers, Colin Morice (climate monitoring scientist at the UK Met Office) warns us that…
Record or near-record years are interesting, but the ranking of individual years should be treated with some caution because the uncertainties in the data are larger than the differences between the top ranked years. We can say this year will add to the set of near-record temperatures we have seen over the last decade.
Dr. Morice refers to the records from the hodgepodge of surface temperature stations. Their HadCRUT4 data is on course to exceed the 2010 record by 0.01°C (0.57°C above the 1961–1990 average). Neither of the two satellite datasets shows a new record for 2014. They measure lower troposphere temperature, with better coverage and consistency than the many national weather bureaus (many quite poor) that collect surface temperatures.
From the report of the UAH satellite data for November 2014 (the graph shows YTD thru November):
November 2014 was the second warmest November in the 36-year global satellite temperature record, according to Dr. John Christy, a professor of atmospheric science and director of the Earth System Science Center at The University of Alabama in Huntsville.
With a global average temperature that was 0.33°C (about 0.60°F) warmer than seasonal norms, November 2014 trailed only November 2009, which averaged 0.39°C (about 0.70°F) warmer than seasonal norms.
Update: NASA funds two satellite temperature datasets. The second comes from Remote Sensing Systems (RSS). It shows 2014 so far as the 7th warmest since 1979, not even close to the El Niño years of 1998 and 2010.
(c) A key thing to know: since 1980 climate models have predicted too much heat.
Here is a comparison of actual global average temperature from Remote Sensing Systems (the other satellite dataset) vs forecasts of the major climate models.
The thick black line is the observed time series from RSS V3.3 MSU/AMSU Temperatures. The yellow band is the 5% to 95% range of output from CMIP-5 climate simulations. The mean value of each time series averaged over 1979–1984 is set to zero so the changes over time can be more easily seen. Note that after 1998 the observations tend to run below the simulated values, indicating that the simulations as a whole are predicting too much warming.
CMIP-5 is model output from the Coupled Model Intercomparison Project Phase 5, used in the IPCC’s AR5.
(3) The vital context: a longer-term temperature history
Summary: Two decades of cool weather, followed by 18 years of warm weather. Wide swings in temperature; a relatively flat trend starting in 1998 – 2000. For more about the pause see these posts.
(a) From the UAH monthly report: a graph of the full record of UAH satellite data (started in 1979).
(b) A different view of the UAH data, by Roy Spencer, principal scientist on the UAH team (at his website).
(4) Who produces this satellite data and analysis?
(a) About the global satellite data, from the November report:
As part of an ongoing joint project between UAHuntsville, NOAA and NASA, Christy and Dr. Roy Spencer, an ESSC principal scientist, use data gathered by advanced microwave sounding units on NOAA and NASA satellites to get accurate temperature readings for almost all regions of the Earth. This includes remote desert, ocean and rain forest areas where reliable climate data are not otherwise available.
The satellite based instruments measure the temperature of the atmosphere from the surface up to an altitude of about eight kilometers above sea level.
(b) About John R. Christy
He is Professor of Atmospheric Science at U Al-Huntsville, and Director of their Earth System Science Center (ESSC). He is also Alabama’s State Climatologist. See his full profile and publications here.
(c) Roy Spencer is a principal research scientist at the ESSC and member of the Affiliated Faculty at the U AL-Huntsville.
(5) For More Information
(a) Reference Pages about climate on the FM sites:
- The important things to know about global warming
- My posts
- Studies & reports, by subject
- The history of climate fears
(b) An introduction to climate change:
- What we know about our past climate, and its causes
- Good news! Global temperatures have stabilized, at least for now.
- What can climate scientists tell about the drivers of future warming?
- What can climate scientists tell us about the drivers of future warming? – part two of two
12 thoughts on “How much did the world warm in November? How fast is it warming? See the numbers.”
It is quite obvious that the views about AGW are significantly influenced by political factors. Therefore we should ignore all political aspects and try to examine the DATA to determine if the globe is warming, cooling or just “pausing”.
Under the assumption that the GSS data derived from satellite instrumentation is the most uniformly consistent data set, as relates to GLOBAL TEMPERATURE, let us examine that data set for trends. After all, we are trying to predict the century scaled future.
Using exponential smoothing of the data, in which a 20%*** (five-year “equivalent”) weighting factor is applied to each year’s GSS temperature value and averaged with the previous year’s smoothed result, yields an “up to date” picture of where break points may exist in the continuum. Exponential smoothing also eliminates the “drop off” effect of front-end data found in “x”-year segment averaging.
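That smoothing scheme can be sketched generically (the series below is made up for illustration; the commenter applies the method to the satellite record):

```python
# Exponential smoothing with a constant weighting factor alpha: each
# smoothed value is alpha * this year's reading plus (1 - alpha) * last
# year's smoothed value. alpha = 0.2 gives roughly a "five-year" memory.

def exponential_smooth(values, alpha=0.2):
    smoothed = [values[0]]  # seed with the first observation
    for v in values[1:]:
        smoothed.append(alpha * v + (1 - alpha) * smoothed[-1])
    return smoothed

# Hypothetical annual temperature anomalies (deg C):
anomalies = [0.10, 0.15, 0.05, 0.20, 0.25, 0.18]
print([round(s, 3) for s in exponential_smooth(anomalies)])
```

Unlike a trailing x-year average, each smoothed point uses every observation up to that year, so the series has no front-end “drop off”.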
Applying LMS analysis to each of those “perceived” sections leads to the following:
1880 to 1916 = 0.2 degree decline
1916 to 1945 = 0.4 degree rise
1945 to 1976 = dead flat zero change
1976 to 1996 = 0.6 degree rise
1996* to 2013 = dead flat zero change (the “pause”)
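The “LMS analysis” of each segment reduces to an ordinary least-squares line fit; a generic sketch using a hypothetical segment, not the actual record:

```python
# Ordinary least-squares slope over a segment of (year, anomaly) pairs:
# fit a straight line and read off the trend per year (or per decade).

def ols_slope(years, values):
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den  # deg C per year

# Hypothetical segment: a steady 0.03 deg C/year rise from 1976 to 1996.
years = list(range(1976, 1997))
values = [0.03 * (y - 1976) for y in years]
trend_per_decade = ols_slope(years, values) * 10
print(round(trend_per_decade, 2))
```

Run over each break-point segment in turn, this yields the per-segment trends listed above.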
It is obvious that selection of “break points” will influence these change values, however the prediction of future global temperatures must use the latest and best continuous measurement technology available, and so selection of the last segment break point is inconsequential.
It would thus appear obvious, from a data standpoint, that we should let the “pause” data from 1996 onward be our guide until NEW data is added that inclines UPWARD or DOWNWARD for some significant period of time.
Prediction of century long global temperature justifies more than a few select words such as “the hottest October on record since 1980” or “2014 projected to be warmest U.S. on record since the dust bowl”. Assuming the truth**** of the one-degree GT rise from 1916 to 1996, obviously the present temperature will always be “one of the hottest on record”. Those verbal declarations add nothing but confusion for the lay person to the GW discussion.
The process of predicting future century trends in global temperature requires non-biased calculation of the present DATA TREND.
To those points, the following data must be considered “noise” and should not be quoted to signal any real trend information.
Regional values (not global).
Climate change values (not global temperature data) –
(These are results and may have time delays).
Only unimpeachable CONTINUOUS DATA STREAM data must be used to calculate trends of at least five to ten years. The data stream should also be constantly evaluated for any measurement TREND** bias. The expenditure of trillions of dollars in the world’s resources justifies a rigorous analysis of up-to-date TREND DATA.
CONCLUSION: When global temperature trends have been rigorously evaluated, they must then be correlated with perceived causes. If they do not correlate, remediation efforts will be useless.
*Of interest is an apparent two-year drop following the El Niño spike in 1997, as if in compensation. The pause is LMS flat with or without 1996 to 2000 data.
**Such as Urban heat islands or parking lot expansion or adding ocean current analysis.
***Various constant percentages may be used to instantly visualize the effect on the smoothing.
**** A “patch” of land based old data with satellite data is inevitable at some point in time.
Thank you for the interesting comment.
Such analysis is over my head, both in statistics and climate science. My only thought is to suggest sticking with the pros’ work, as they know the many factors involved (e.g., cycles) and have supercomputers!
Trivia: I think you mean RSS (Remote Sensing Systems), not GSS. They are one of the two satellite temperature datasets funded by NASA. They’re roughly similar. There is a revision to UAH coming, which it is said will decrease the differences.
Methinks you must love confusion. The climate “experts” have developed lots of theories to try to match reality to the models, all to no avail.
If “climate change” is the result of Global Warming, and global temperature trends are measurable as per GISS satellite, then only numerical analysis expertise is required to see the 1997 to 2013 FLAT global temperature record. When new data begins to deviate from this 0.0 degree / decade trend line, in order to foretell some disastrous temperature in the future, numerical trend analysis will be the first to spot it.
All valid points! A few details…
(1) People often look to the clashing theories on the frontiers of science as evidence that scientists are confused. There is always a frontier, the edge between the vast dark of what we don’t know and the small but growing circle of light. We see that today as climate scientists study the pause, its causes, and its likely duration.
(2) “if climate change is the result of global warming …”
Just to be clear: climate change is omnipresent, everywhere and always; global warming is one of the many factors pushing it forward.
(3) GISS is a surface temperature dataset. I suspect you’re thinking of a satellite temperature dataset: RSS (remote sensing system) or UAH (U of AL-Huntsville).
I do not wish to dispute the warming that took place between 1976 and 1996. That was data from land based instruments, corrected by thousands of corrective algorithms.
Another trivia point, expanding what you said — the Earth has warmed since the mid-19th century, when the Little Ice Age ended. The evidence is massive and incontrovertible. Of course, anthropogenic CO2 was insignificant back then. Which is why the consensus of climate scientists is, per the IPCC, “more than half of the warming since 1950” is from us.
The real question is: What warming is NOW taking place that can be used to predict future extreme warming?
Yes, from a public policy standpoint it’s all about forecasts. The internal dynamics of science as a profession and public policy often get conflated, resulting in confusion.
What’s happening “now” (today, this season, this year, this decade) is of interest, but only as it provides scientists with more data to improve their models — and as an out-of-sample test for past and current models.
We might learn as much or more from more research. Such as, my favorite, updating the proxies used in climatology — for example, the tree-ring data that formed the foundation for the famous “hockey stick” temperature graphs that initiated the current round of public concern. Most of these series ended in the 1970s. Updating them and comparing them with the instrumental temperature data would help validate them as a temperature proxy — or discredit them, as many scientists have warned. Unfortunately it appears that neither side is interested in more data, just more smoke.
You do of course realise that satellites actually measure atmospheric temperature about 1 km above our heads, and cannot be expected to show the same results as thermometers on the surface?
Did you also know that orbital decay and equipment degradation has to be allowed for, and this makes the measurement problematic and possibly inaccurate?
You did not know that? Not surprised.
I am always astonished by the rude comments of people on these weather threads, especially by people with so little knowledge of the subject.
(1) “You do of course realise that satellites actually measure atmospheric temperature about 1 km above our heads”
(a) What about these sentences was not clear to you: “They measure lower troposphere temperature…” and “The satellite based instruments measure the temperature of the atmosphere from the surface up to an altitude of about eight kilometers above sea level.”
(b) Yes, the temperatures are different at the surface and higher. Both should show warming (there’s a large literature on models’ estimates of the differences, but I suspect you haven’t seen that).
(2) “Did you also know that orbital decay and equipment degradation has to be allowed for”
What makes you believe that the NASA-funded scientists running these programs don’t do so? Have you read their peer-reviewed papers explaining their adjustments?
(3) “this makes the measurement problematic and possibly inaccurate?”
Let’s see the peer-reviewed research supporting your claim. NASA spends a great deal of money running those satellites and the two data analysis programs. I believe everybody involved is competent.
Also, what do you mean by “inaccurate”? No measurement of something so complex as any aspect of Earth’s average temperature can be “accurate”. It’s a matter of estimated accuracy and error bars. The surface land temperature measurements come from the many national weather agencies, run with different equipment (designed to measure weather, not climate trends) and different standards — and very different funding levels. The global datasets adjust for this, but there is only so much that can be done with such varied data. Also, their coverage of the poles and oceans is not as complete as the satellites.
On the other hand, the surface temperature record is far longer than the satellite record (starting ~1979) and vital for many purposes. Both surface and satellite datasets have their uses, their relative strengths and weaknesses.