Authors: Ronald Bailey
In a January 2015 Nature article, two European climate researchers report the results of comparing the outputs of eighteen climate models used by the IPCC to simulate average global temperature trends from 1900 to 2010 against observed temperature trends. They find that the models do in fact simulate similarly lengthy hiatuses during that period; the simulated pauses just don't happen to coincide with the current observational hiatus. They find that, owing to natural variation, the observed warming might lie at the upper or lower limit of simulated rates, but there is no indication of a systematic bias in the models. “Our conclusion is that climate models are fundamentally doing the right thing,” University of Leeds researcher Piers Forster explained. “They [climate models] do in fact correctly represent these 15-year short-term fluctuations but because they are inherently chaotic they don't get them at the right time.” The Nature article concludes, “The claim that climate models systematically overestimate the response to radiative forcing from increasing greenhouse gas concentrations therefore seems to be unfounded.” Accordingly, the current pause in global average temperature increases is just the result of natural fluctuations in the climate, and the man-made trend toward higher temperatures will resume eventually. What natural fluctuations might be responsible for slowing global temperature increases? In a February 2015 article in Science, Pennsylvania State University climatologist Michael Mann and his colleagues used climate model simulations to estimate natural variability in North Atlantic and North Pacific Ocean temperatures. They conclude that temperatures in the northern Pacific just so happen to be in a cold phase right now, which has “produce[d] a slowdown or ‘false pause’ in warming of the past decade.”
Interestingly, the IPCC's Synthesis Report found that “ocean warming dominates the increase in energy stored in the climate system, accounting for more than 90% of the energy accumulated between 1971 and 2010 (high confidence) with only about 1% stored in the atmosphere.” The report also suggested that most of the excess heat was stored in the upper ocean, but added that “it is likely that the ocean warmed from 700 m[eters] to 2000 m[eters] from 1957 to 2009 and from 3000 m[eters] to the bottom for the period 1992 to 2005.” Of course, the JPL studies published the month before contradict this assertion.
Just how long the temperature pause must last before it would falsify the more catastrophic versions of man-made climate change obviously remains an open question for many researchers. For the time being, most are betting that it will get real hot real fast when the hiatus ends.
The upshot is that many researchers remain convinced that natural fluctuations in the climate unaccounted for in the computer models are responsible for keeping average global temperature flat for the past sixteen to eighteen years. The IPCC's Physical Science report asserts that the models cannot be expected to simulate the timing of the sort of natural climate variability that has produced the current sixteen- to eighteen-year pause. Georgia Tech climatologist Judith Curry contrarily observed, “If the IPCC attributes the pause to natural internal variability, then this begs the question as to what extent the warming between 1975 and 2000 can also be explained by natural internal variability. Not to mention raising questions about the confidence that we should place in the IPCC's projections of future climate change.”
Overall, the IPCC suggests that the difference between the models and the actual recent temperature trend “could be caused by some combination of (a) internal climate variability, (b) missing or incorrect radiative forcing, and (c) model response error.” That is to say, the projections may be off owing to pesky natural climate fluctuations, to errors in the estimates of the radiative forcing produced by given increases in greenhouse gases, and/or to models that respond too strongly to those increases, boosting temperature projections too high.
Nevertheless, the IPCC believes that the current temperature slowdown will soon end and states, “It is more likely than not that internal climate variability in the near-term will enhance and not counteract the surface warming expected to arise from the increasing anthropogenic forcing.” In other words, when the warm-up resumes, it will soar. By how much? The IPCC Physical Science report projects, “The global mean surface temperature change for the period 2016–2035 relative to 1986–2005 will likely be in the range of 0.3°C to 0.7°C.” This implies increases of 0.15°C to 0.35°C per decade. That would mean that warming could increase at nearly triple the rate the IPCC reports for the period after 1951, and seven times higher than the rate of increase it reports for the last fifteen years. Researchers from the Pacific Northwest National Laboratory published an article in Nature Climate Change in March 2015 in which they compared past rates of temperature change over forty-year periods with future projections. They predict that global average temperature will be increasing at a rate of 0.25°C per decade by 2020. That rate of change would be “unprecedented for at least the past 1,000 years.” If average global temperature began to rise at this rate, it would vindicate the climate models. If not, then what?
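The rate arithmetic above is easy to check in a few lines. A rough sketch in Python, spreading the projected change over the two decades of the projection window as the text does; the roughly 0.12°C and 0.05°C per decade baseline rates are assumed here to be the AR5 figures the comparison relies on:

```python
# Back-of-the-envelope check of the projected warming rates discussed above.
projected_change_c = (0.3, 0.7)          # °C, 2016-2035 vs. 1986-2005 (IPCC)
decades = 2.0                            # the 20-year projection window

per_decade = tuple(round(c / decades, 2) for c in projected_change_c)
print(per_decade)                        # (0.15, 0.35) °C per decade

# Assumed baseline rates (AR5): ~0.12 °C/decade since 1951 and
# ~0.05 °C/decade over roughly the last fifteen years.
print(round(per_decade[1] / 0.12, 1))    # ~2.9, i.e. "nearly triple"
print(round(per_decade[1] / 0.05, 1))    # 7.0, i.e. "seven times higher"
```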
What sorts of changes in “internal climate variability” might soon increase global average temperatures? Warmer and colder water sloshes back and forth periodically in the tropical Pacific Ocean, producing significant changes in global weather. When this El Niño Southern Oscillation (ENSO) pattern is in its warm phase, it substantially boosts the average global temperature. In 2014, many meteorologists were waiting to see if 2014–2015 would conjure up a big ENSO warm phase that would end the hiatus and finally increase global average temperatures above the big 1998 ENSO spike. As of January 2015, the National Oceanic and Atmospheric Administration noted the existence of mild El Niño–like conditions and suggested that Pacific Ocean sea surface temperatures have a good chance of subsiding to a neutral state in 2015.
The computer climate models are supposed to give policymakers reliable data regarding future trends in man-made global warming. The failure to predict the sixteen- to eighteen-year temperature hiatus has caused some policymakers to wonder if the findings in the IPCC's Physical Science report really do inspire the kind of confidence that could justify the entailed multitrillion-dollar bet on massive changes to humanity's energy supply programs.
The Science Is Settled – Climate Sensitivity
Another possible explanation for why the computer climate models may be running too hot is what the IPCC refers to as model response error. That is, the models may overestimate the amount of warming that results from a given increase in greenhouse gas concentrations in the atmosphere. This brings up the crucial issue of climate sensitivity, conventionally defined as the amount of warming that doubling carbon dioxide in the atmosphere would eventually produce. Temperature increases lag increases in atmospheric carbon dioxide, so another important measure is the transient climate response (TCR), the amount of warming expected at the time the carbon dioxide concentration crosses the doubling line.
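Because sensitivity is defined per doubling, the standard logarithmic relationship between concentration and forcing turns it into an estimate of eventual warming for any concentration change. A minimal sketch; the function name and the 280-to-400 ppm example are illustrative, not from the text:

```python
import math

def equilibrium_warming(sensitivity_per_doubling, c_new, c_old):
    """Eventual warming (°C) implied by a CO2 change, assuming the
    standard logarithmic concentration-forcing approximation."""
    return sensitivity_per_doubling * math.log2(c_new / c_old)

# With a sensitivity of 3.0 °C per doubling and CO2 up from a
# preindustrial ~280 ppm to ~400 ppm:
print(round(equilibrium_warming(3.0, 400, 280), 2))  # 1.54
```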
In its 2007 report, the IPCC estimated that climate sensitivity was between 2° and 4.5°C, with the best estimate being 3°C. The 2013 IPCC Physical Science report drops the lower bound and finds that climate sensitivity is likely in the range 1.5° to 4.5°C. It also states that it is extremely unlikely that climate sensitivity is less than 1°C and very unlikely to be greater than 6°C. In addition, the report notes with “high confidence” that the transient climate response is “likely in the range 1° to 2.5°C and extremely unlikely greater than 3°C, based on observed climate change and climate models.” In IPCC parlance, likely means that the authors believe there is more than a 66 percent chance that the value falls in the stated range, whereas extremely unlikely means that they think there is less than a 5 percent chance that it does. Just how sensitive the climate is to increases in greenhouse gases is a controversial and hotly disputed area of climate research.
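The IPCC's calibrated vocabulary can be summarized as a lookup table. A sketch using the probability bounds from the AR5 uncertainty guidance; the dictionary and function names are my own:

```python
# IPCC likelihood terms mapped to (low, high) probability bounds,
# per the AR5 guidance note on consistent treatment of uncertainties.
IPCC_LIKELIHOOD = {
    "virtually certain":    (0.99, 1.00),
    "extremely likely":     (0.95, 1.00),
    "very likely":          (0.90, 1.00),
    "likely":               (0.66, 1.00),
    "more likely than not": (0.50, 1.00),
    "unlikely":             (0.00, 0.33),
    "very unlikely":        (0.00, 0.10),
    "extremely unlikely":   (0.00, 0.05),
}

def probability_range(term):
    """Return the (low, high) probability bounds for an IPCC likelihood term."""
    return IPCC_LIKELIHOOD[term.lower()]

print(probability_range("likely"))              # (0.66, 1.0)
print(probability_range("extremely unlikely"))  # (0.0, 0.05)
```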
Several recent studies have reported that climate sensitivity could be lower than the Physical Science summary suggests. For example, an article in the June 2013 Nature Geoscience concluded that the “most likely value of equilibrium climate sensitivity based on the energy budget of the most recent decade is 2.0 °C, with a 5–95% confidence interval of 1.2–3.9 °C.” A confidence interval is a range believed, with a stated probability, to contain the true value. In other words, these researchers are 90 percent confident that climate sensitivity lies somewhere between 1.2° and 3.9°C. The researchers also reported that the best estimate for transient climate response based on observations of the most recent decade is 1.3°C, ranging between 0.9° and 2.0°C.
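The 5–95% interval idea can be demonstrated numerically: it is the range spanning the middle 90 percent of a distribution. A sketch with a made-up skewed distribution, not the study's actual one:

```python
import random
import statistics

# Simulate an arbitrary skewed distribution purely to illustrate
# what a 5-95% interval means; the parameters are invented.
random.seed(0)
samples = [random.lognormvariate(0.7, 0.3) for _ in range(100_000)]

# statistics.quantiles with n=20 returns 19 cut points: 5%, 10%, ..., 95%.
cuts = statistics.quantiles(samples, n=20)
low, high = cuts[0], cuts[-1]

inside = sum(low <= s <= high for s in samples) / len(samples)
print(f"middle interval: {low:.2f} to {high:.2f}; fraction inside: {inside:.2f}")
```

By construction, about 90 percent of the simulated values fall between the two cut points, which is exactly what the 5–95% interval in the study asserts about climate sensitivity.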
The Nature Geoscience estimate is a whole degree lower than the best estimate calculated by the IPCC in 2007. Interestingly, the 2013 Physical Science report, unlike its predecessor reports, provides no best estimate of climate sensitivity. Research on this topic continues. For example, in a March 2014 study, Norwegian researchers take into account the last ten years of temperature records and trends in ocean heat buildup. Their estimates for climate sensitivity and transient climate response are even lower. They report that the best estimate of climate sensitivity “is 1.8°C, with 90% C.I. [confidence interval], ranging from 0.9 to 3.2°C, which is tighter than most previously published estimates.” They also calculate that there is only a 1.4 percent chance that climate sensitivity would turn out to be greater than 4.5°C. Since the amount of warming that a doubling of carbon dioxide would produce is lower, so too is the transient climate response, which the researchers estimate to be about 1.4°C, ranging between 0.8° and 2.2°C.
Other researchers, however, come to more worrisome conclusions about how sensitive the climate is. Two new studies, one in March and another in May 2014, argue that the many researchers who have reported lower climate sensitivities have failed to take the cooling effects of various pollutants into account. Once the effects of airborne particles like sulfates and black soot, along with ground-level ozone, are properly included in the calculations, the March Nature Climate Change study by GISS researcher Drew Shindell calculated a transient climate response of about 1.7°C. Building on Shindell's insights, researchers at Texas A&M University estimated in May 2014 that climate sensitivity is likely to be 3.0°C, ranging between 1.9° and 6.8°C. This range is a bit higher and wider than the one reported by the IPCC in 2013. However, more recent research cited earlier, which finds that man-made aerosols have had a negligible effect on current global average temperatures, might suggest that these climate sensitivity ranges are too high.
In September 2014 the climate sensitivity research pendulum swung back toward a lower estimate. In their article published in Climate Dynamics, Georgia Tech climatologist Judith Curry and British statistician Nicholas Lewis reported results using temperature data from 1850 to 2011 along with estimates of the effects on climate of various factors taken from the IPCC's Fifth Assessment Report, such as land-use changes, volcanic activity, and atmospheric pollutants. They calculate that the best estimate for climate sensitivity is 1.64°C, with an uncertainty range of 1.05° to 4.05°C. The corresponding transient climate response is 1.33°C, with an uncertainty range of 0.90° to 2.50°C. These new values are at the low end of the IPCC climate sensitivity and transient climate response estimates.
Is there any way to resolve this scientific dispute? Yes. Wait and see what the climate actually does. But climatologists may have to wait at least a couple of decades before they can know the answer for sure. In an April 16, 2014, article in Geophysical Research Letters, researchers sort through various climate sensitivity scenarios ranging from a low of 1.5° to a high of 6.0°C. They calculate that it would take another twenty years of temperature observations for us to be confident that climate sensitivity is on the low end and more than fifty years of data to confirm the high end of the projections. This ongoing controversy is important because lower climate sensitivity would mean that future warming will be slower, giving humanity more time to adapt and to decarbonize its energy production technologies. Higher climate sensitivity would mean the opposite.
Ocean Acidification
As the oceans absorb carbon dioxide from the atmosphere, carbonic acid forms, making the ocean more acidic. As noted previously, the acidity of the surface waters of the oceans has increased by about 26 percent since the beginning of the Industrial Revolution. The IPCC 2014 Adaptation report observes, “Impacts of ocean acidification range from changes in organismal physiology and behavior to population dynamics and will affect marine ecosystems for centuries if emissions continue.” Some ocean denizens like plants and algae will most likely benefit from increased carbon dioxide levels, whereas other creatures such as corals and mollusks might suffer significant harm. Corals, echinoderms, and mollusks absorb carbonate minerals from the oceans to make their shells, and higher acid levels lower the dissolved amounts of those minerals in seawater. Various computer model projections suggest that as acidity increases, it will be harder for calcifying creatures to survive. However, the 2014 Adaptation report observes, “Limits to adaptive capacity exist but remain largely unexplored.”