Collision Course: Endless Growth on a Finite Planet
Kerryn Higgs
Throughout his chapter on what he calls “the depletion myth,” Bailey makes much of the falling prices of commodities and food, as did his colleague, Julian Simon, and most of the other economists who disparaged the Limits work. Most rely on the seminal 1963 research by the economists Barnett and Morse,[20] which found that resources became ever cheaper through the twentieth century as their extraction costs, in capital and labor terms, declined; this was due, they thought, to technological advance. Simon and others who denigrated the Limits work were confident that resources would remain cheap as technology continued to improve. The energy analyst and geographer Cutler Cleveland, however, has argued that Barnett and Morse overlooked the role of energy in general, and cheap oil in particular, during the period of their study (1900–1960). For Cleveland, cheap energy is the crucial factor in the declining price of resources, since all resource extraction depends on the application of energy; energy is not just one resource among many but the “master resource” essential to the recovery and production of every other commodity. Cleveland’s view is consistent with John McNeill’s thesis that cheap and plentiful fossil fuels, especially cheap oil, underpin the unprecedented economic growth of the twentieth century.[21]

Other than the spike in the oil price during the 1970s, energy prices remained flat until the early twenty-first century. So too did those of key minerals. The belief in never-ending cheap resources appeared plausible. But from 2003 on, commodity prices began to soar: in five years the price of copper tripled and that of zinc doubled.[22] Despite the downturn that followed the global financial collapse of 2008, the upward trend in prices of food, oil, and several other commodities had resumed by 2010. Economists insist that price signals “solve” all scarcities: as the price goes up, new options become affordable and substitutes are developed. The evidence regarding prices in the early twenty-first century indicates that they are at best an imperfect and tardy method of detecting scarcity. Until a resource is close to actual depletion, market forces do not recognize its decline.

In the second half of the 1990s, the petroleum geologist Colin Campbell began publishing analyses of world oil production that indicated it was likely to “peak” sometime in the first decade of the new century.[23] Not equivalent to running out, the term peak oil indicates that production has reached a level beyond which it can no longer expand. Notwithstanding drilling in deep water on the edge of technological feasibility and rash plans for the Arctic, the production of conventional petroleum reached a plateau in recent years and seems unlikely to rise significantly in the future. Though the OECD’s International Energy Agency has resisted the idea of peak oil for many years, Fatih Birol, its chief economist, conceded in 2011 that crude oil production had peaked in 2006.[24] Campbell and Laherrère’s figures show that the discovery of liquid petroleum peaked in 1964.[25] Increases in oil production in the second decade of the twenty-first century will depend on less accessible sources, a reality that also exacerbates global warming, since emissions from the application of energy during production must be added to emissions when the oil is burned.

The recent US boom in shale gas and shale oil,[26] for example, captures hydrocarbons from source rocks rather than from reservoirs where the material has migrated over millions of years. It is no longer a matter of drilling into a cavity and recovering the liquid or gas, which gushed out under pressure in early wells. Instead, extraction relies on the newly developed technology of large-scale, multistage hydraulic fracturing (fracking) of the rock, through wells drilled horizontally. Energy and capital have to be applied continually to keep the shale producing. Although there is considerable exuberance about “the end of peak oil,” “a new age of plenty,” and even “energy independence” for the United States,[27] outcomes are unlikely to be so positive. Production from fractured shale falls away dramatically over the first and second years, demanding the ongoing drilling of new wells. Major new investment is constantly required to do this, so that the price required for profitable production is high and likely to rise further as the best well sites are exhausted.[28]
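The arithmetic of this drilling treadmill can be sketched with a simple calculation. The decline profile below is an illustrative assumption, not data from the studies cited here; it shows how total output from a shale field plateaus at a level that is sustained only by constant new drilling.

```python
# Sketch of a shale field's "drilling treadmill." Each well's relative output
# by year of age follows an assumed decline profile (illustrative numbers only);
# wells are retired once the profile ends.

decline_profile = [1.0, 0.40, 0.25, 0.15, 0.10, 0.07, 0.05]

def field_output(new_wells_per_year, years):
    """Total yearly output from all wells drilled so far."""
    totals = []
    for year in range(years):
        total = 0.0
        for vintage in range(year + 1):
            age = year - vintage
            if age < len(decline_profile):
                total += new_wells_per_year * decline_profile[age]
        totals.append(round(total, 1))
    return totals

# Drilling 100 wells a year, output climbs and then plateaus at
# 100 * sum(decline_profile), about 202 well-equivalents, held there
# only by the constant addition of new wells.
print(field_output(100, 10))
```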

In addition to these difficulties, opposition on environmental and social grounds is increasing. The multiplicity of wellheads has to be connected by a lattice of pipelines and roads, which transforms rural or urban landscapes into industrial ones. The process also requires the injection of massive quantities of water and sand and generates a great amount of polluted wastewater that has to be disposed of. Water supply is already stressed in almost half of the areas where shale energy development is most intense. Although it is denied by industry sources, the contamination of groundwater has been linked to fracking in several studies (including one by a team of researchers at Duke University and two by the EPA).[29] The level of methane leakage from the whole production and distribution cycle for shale gas is also unsettled. Researchers at Cornell University have found that, because of methane leakage, greenhouse gas emissions from shale gas exceed those from conventional gas and oil and are comparable with, or greater than, those from coal. Unburned, methane has far greater greenhouse potential than CO2.[30] Although shale gas is argued to be a temporary answer to the decline in cheap oil and conventional gas, perhaps enough to buy time if environmental side effects can be controlled, the fact that the industry is predicated on releasing hydrocarbons from solid rock is emblematic of ever-increasing degrees of difficulty or, as some might say, desperation.

Bailey’s claims about Limits to Growth have been widely repeated, from the think tank extremes to the economic mainstream. All ignore the fact that precise prediction was never the aim of the Limits project, even though the media and most economists focused on dates and figures. The idea of Limits to Growth was to alert an apparently unconscious world to the longer-term consequences of exponential growth as the scale of the human enterprise ballooned. The Meadows team avoided exact dates and quantities and kept many axes on their graphs nonspecific as to timing. “In terms of exact predictions, the output is not meaningful,” the original text stated,[31] and Dennis Meadows later described the purpose in this way: “We used it [World3] to determine the main behavioral tendencies of the global system.”[32] This was a study of process and trend. The object was a broad understanding of the way the global economic system unfolds, not a set of exact forecasts. Though the MIT team reiterated this throughout their work and in both subsequent updates, critics rarely acknowledged these objectives, or perhaps did not even understand them. The critiques arising from or allied to the Bailey attack are all based on a misunderstanding of the purpose and research strategy of the Meadows team and an incorrect reading of one table.

How Close to the Mark?

Into the Future: Testing the Model

It was always my purpose to ask, as I drew to the end of this book, whether the projections made by the Meadows team in 1972 approximated actual conditions observable decades later. In fact, this question has been pursued by several investigators.

Graham Turner of the Sustainable Ecosystems Unit at Australia’s Commonwealth Scientific and Industrial Research Organisation (CSIRO) compared the output generated by various runs of the World3 model in the Limits to Growth work with readily available public statistics for the period 1970–2000. Turner stresses the way the Meadows team was interested in process and trend rather than precise prediction, as noted above. Turner also emphasizes the crucial role of feedback loops and delayed effects in the behavior of the planetary system, and reflects on the distortions circulated by Bailey.[33]

But Turner’s main purpose was to compare actual data from the real world for the period 1970–2000 with the data generated in three of the key scenarios run by the Limits team. The team ran the World3 program many times using differing assumptions about several key variables: population, services per capita (health and education), food per capita, industrial output per capita, consumption of nonrenewable natural resources, and persistent pollution. Turner examined the output from three of these runs: the “standard run,” reflecting business as usual, where rates of growth, industrialization, and resource use continued on their 1970 trajectory; the “comprehensive technology” run, modeling what would occur if huge improvements in agricultural productivity, resource extraction, pollution mitigation, and birth control were achieved; and the “stabilized world” run, in which population and capital investment were stabilized alongside significant technological advance.[34]

The standard run of the World3 model resembled the world that actually transpired. For net population, Turner found that actual data for 1970–2000 agree closely with the standard run.[35] When he used electricity supply as his proxy for services per capita, the real-world data were again very close to the standard run; when he used literacy rates as the proxy for services, the standard run was more optimistic than reality. Food per capita has been slightly better than the standard run, though broadly similar, while industrial output per capita is virtually identical to the standard run.
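A comparison of this kind can be made concrete with a simple measure of fit. The sketch below uses placeholder numbers, not Turner’s published data, and a generic normalized root-mean-square deviation rather than any particular metric he may have used; the scenario with the smallest value is the closest to observation.

```python
# Sketch: measuring how closely an observed series tracks each scenario.
# All scenario numbers are placeholders, not Turner's published data.

def normalized_rmsd(observed, modeled):
    """Root-mean-square deviation, normalized by the mean of the observed series."""
    n = len(observed)
    rmsd = (sum((o - m) ** 2 for o, m in zip(observed, modeled)) / n) ** 0.5
    return rmsd / (sum(observed) / n)

# World population (billions) at 1970, 1980, 1990, 2000: approximate observations
# versus illustrative scenario trajectories.
observed = [3.7, 4.4, 5.3, 6.1]
scenarios = {
    "standard run": [3.7, 4.5, 5.3, 6.2],
    "comprehensive technology": [3.7, 4.3, 5.0, 5.6],
    "stabilized world": [3.7, 4.2, 4.6, 4.8],
}

# The smallest value marks the scenario closest to observation.
for name, modeled in scenarios.items():
    print(f"{name}: {normalized_rmsd(observed, modeled):.3f}")
```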

In dealing with nonrenewable resources, Turner examined a range of data sets to reflect the ongoing uncertainty surrounding the future availability of energy resources in particular. If more speculative technologies such as nuclear fusion and effective carbon capture do not come into play and coal burning is limited owing to greenhouse concerns, the real world tracks World3’s standard run fairly closely, though depletion in the real world is slightly slower than in the model. If much of the remaining coal is burned, the data approximate the resources output for the “comprehensive technology” run. For persistent pollution data, Turner used CO2 emissions as a proxy since data on such pollutants as heavy metals, radioactive wastes, and organic pollutants were inadequate and insufficiently available as aggregated global totals. The actual rise in CO2 emissions was somewhat lower than persistent pollution in the standard run.

In general, the real-world data do not match the “stabilized world” run at all, which is hardly surprising, since ideas about biophysical limits have been ignored in most policymaking since 1980. The “comprehensive technology” run is also shown to be wide of the mark, being much more optimistic than the observed data in most cases. But the standard, or business as usual, model (which trends toward collapse in the middle of the twenty-first century) is a close match, yielding outcomes that tally well with what has actually occurred. This result is compelling, in light of the complexity of the feedbacks between sectors that are incorporated into the model. Turner believes that the close correlation between the real world and the standard run supports the conclusion “that the global system is on an unsustainable trajectory unless there is substantial and rapid reduction in consumptive behaviour, in combination with technological progress.”[36]

Turner also points out that the Jevons paradox, or “rebound effect,” is at the root of the ambiguous benefits brought to the system by technical efficiencies: significant gains in efficiency do not moderate consumption but rather facilitate expansion. Although such gains should theoretically reduce impacts, this does not happen in practice, something William Jevons, one of the pioneers of neoclassical economics, observed in the middle of the nineteenth century. For example, the carbon intensity of industrial production has declined for almost a century, while the rate of carbon emissions has continued to grow exponentially. Indeed, it is arguable that there is no real paradox here. As engineer Michael Huesemann notes, “technological innovation has never been used to stabilize the size of the economy; [its] main role has always been exactly the opposite, namely the enhancement of productivity, consumption, and economic growth.”[37]
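The arithmetic behind this observation is simple compounding. In the illustrative sketch below (rates chosen for simplicity, not taken from the text), carbon intensity falls by 1.5 percent a year while output grows by 3 percent a year, and total emissions still roughly double over fifty years.

```python
# Sketch: efficiency gains outpaced by growth. Illustrative rates only.

intensity = 1.0  # carbon emitted per unit of output (arbitrary units)
output = 1.0     # economic output (arbitrary units)

for year in range(50):
    intensity *= 1 - 0.015  # carbon intensity falls 1.5% per year
    output *= 1 + 0.03      # output grows 3% per year

emissions = intensity * output
print(f"After 50 years, emissions stand at {emissions:.2f} times the starting level")
# About double: the decline in intensity is overwhelmed by the growth in output.
```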

The systems ecologists Charles Hall and John Day have also compared the standard run with actual 2008 data and, like Turner, have found that the model’s output matches population and industrial output per capita in 2008; for resources, they looked at individual resources such as copper, oil, soil, and fish and found that the actual data in 2008 were very close to the model’s predicted values for the early twenty-first century. For pollution, they looked at CO2 and nitrogen and, again like Turner, found levels that were close to, though somewhat less than, the standard run. Despite the prevailing perception of the abject failure of the Limits work, Hall and Day point out that, whatever occurs later, the model’s performance has not been invalidated so far. “We are not aware,” they write, “of any model made by economists that is accurate over such a long time span.”[38]

Hall and Day also explore reasons for the decline in resource prices through the twentieth century. Like McNeill and Cleveland, they attribute the long run of cheap resources to the availability of cheap energy rather than to “technology,” as economists tend to believe. In any case, “technology does not work for free,” they argue, giving the example of US agriculture, where up to ten calories of petroleum are used to generate every calorie of food. Since energy is the master resource on which everything depends, wealth is produced with huge amounts of oil and other fuels, to the extent that everyone in the developed world has the energy equivalent of somewhere between thirty and sixty “slaves” working all day, every day. In these circumstances, cheap fuel is of the essence. Hall and Day argue that energy return on investment (EROI) is more critical than price. EROI measures how much energy is produced for each unit of energy invested. While the ratio between energy produced and energy expended for new oil in the United States stood at 100:1 in 1930, 40:1 in 1970, and 14:1 in 2000, they suggest that this ratio for oil production will approach 1:1 within a few decades. Hall and Day end their paper with a plea for the teaching of economics from a biophysical as well as a social perspective: “The concept of the possibility of a huge multifaceted failure of some substantial part of industrial civilization is so completely outside the understanding of our leaders that we are almost totally unprepared for it.”[39]
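The significance of a falling EROI can be expressed as the net energy left over for society: an EROI of r returns r units of energy for each unit invested, so only the fraction 1 - 1/r of gross production is net gain. The sketch below applies that identity to the ratios quoted above, with 1.5:1 as an illustrative stand-in for “approaching 1:1”; it is not a reproduction of Hall and Day’s calculations.

```python
# Sketch: net energy as a function of EROI. Net fraction of gross output = 1 - 1/EROI.
# The 1.5:1 entry is an illustrative stand-in for "approaching 1:1."

eroi_by_era = {"1930": 100, "1970": 40, "2000": 14, "approaching 1:1": 1.5}

for era, eroi in eroi_by_era.items():
    net_fraction = 1 - 1 / eroi
    print(f"EROI {eroi}:1 ({era}): {net_fraction:.0%} of gross energy is net gain")
# At 100:1, 99% of the energy produced is surplus; at 1.5:1 only a third is,
# and at 1:1 extraction delivers no net energy at all.
```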
