Citation: Energy Policy 23(4/5): 411–416, 1995. Also pp. 501–512 in Integrated Assessment of Mitigation, Impacts, and Adaptation to Climate Change, N. Nakicenovic, W.D. Nordhaus, R. Richels, and F.L. Toth (eds), International Institute for Applied Systems Analysis, Laxenburg, Austria, 1994.
The global warming debate has neglected, and thus underestimated, the importance of technical change for reducing greenhouse gases and adapting to climate change. Relevant quantitative cases of long-run technical change during the past 100 years are presented in computing, communications, transport, energy, and agriculture. A noteworthy technological trajectory is that of decarbonization, the decreasing carbon intensity of primary energy. If we are not at the end of the history of technology, the cost structure for mitigation and adaptation changes, and both could prove cheap.
Keywords: climate, energy, natural gas, decarbonization, hydrogen, carbon dioxide, CO2
Areas of Research: Energy and Climate
1. Introduction
One hundred years ago icebergs were a major climatic threat impeding travel between North America and Europe. 1,513 lives ended when the British liner Titanic collided with one on 14 April 1912. Fifty years later jets overflew liners. Anticipating the solution to the iceberg danger required understanding not only the rates and paths at which icebergs travel but the ways humans travel, too.
My premise is that nearly everyone in the global warming debate, from atmospheric scientists and agronomists to energy engineers and politicians, largely neglects, and thus underestimates, the importance of technical change in reducing greenhouse gases and adapting to climate change.
Of course, not all technical change is good, with respect to climate or any other facet of our world. Technology can destroy as well as better us. Advances in technology such as the internal combustion engine generated the outpouring of greenhouse gases in the first place. When Alfred J. Lotka made his landmark projection of anthropogenic climatic change in 1924, he figured 500 years to double atmospheric carbon. He did not foresee the explosion of energy demand or the gadgets that collectively would make that explosion possible.
Technical change, the blind spot in Lotka’s otherwise remarkably perceptive work, is precisely my focus. To think reliably about the long term, we must question carefully what stays the same and what can change.
For purposes of this paper, let us assume that most innovation is humane and responsible. A companion exercise which emphasizes demonic aspects of technology and technological failures in the face of climate change would certainly also be worthwhile.
Not all human societies need have asked our question about technical progress in the face of climatic change. For some societies, time stands still or cycles with little development. Of course, the function of innovation has existed in all civilizations. Medieval European guilds, for example, transmitted knowledge about their crafts across dozens of generations, combining it with many inventions. Many inventions originated in China, of which gunpowder and the spinning wheel are among the most famous.
But something new happened in Western civilization about 300 years ago. One might call it organized social learning. Successful societies are learning systems, as Cesare Marchetti observed (1980). In fact, the greatest contribution of the West during the last few centuries has been the zeal with which it has systematized the learning process. The main mechanisms include the invention and fostering of modern science, institutions for the retention and transmission of knowledge (such as universities), and the aggressive diffusion of research and development throughout the economic system.
Attempts have been made to quantify the takeoff of modern science and technology. Early in the 20th century the German chemist Ludwig Darmstaedter carefully listed important scientific and technological discoveries and inventions back to 1400 AD. The list is certainly not complete, but it may be representative. The message is firm: some kind of take-off, albeit bumpy, did occur about 1700, and by 1900 the level of activity was an order of magnitude higher (Figure 1).
Figure 1. Decadal number of scientific and technological discoveries, 1400–1900. Source of data: L. Darmstaedter, 1908.
Fear that humanity was running out of inventions partly motivated Darmstaedter’s history. Scientists and engineers themselves have often stated that the pool of ideas is near exhaustion. In 1899 Charles H. Duell, U.S. Commissioner of Patents, urged President William McKinley to abolish the Patent Office, stating “Everything that can be invented has been invented” (Cerf and Navasky, 1984, p. 203). After the telephone and electric light what could follow? Darmstaedter’s lists peaked about 1880.
In fact, we do know that invention and innovation are not distributed evenly but come in spurts (Mensch, 1979). Yet they have come with ever-increasing intensity. The slow periods for diffusion of innovations flatten the world economy and drain confidence from many of us. Perhaps the 1990s are an economic trough. In any case, understanding the accumulated surges of technical progress during the 20th century can help us glimpse 2050 and 2100, when the heat may be on.
2. Evidence of Technical Progress
Let us begin with examples from computing, communications, and transport.
Modern computing began in the late 1940s with the ENIAC machine, operating on vacuum tubes. One of the first customers for the most advanced machines was always the U.S. military, in particular, the national laboratories such as Los Alamos, which designed nuclear weapons. The top computer speed at Los Alamos, shown in Figure 2, increased one billion times in 43 years.
Figure 2. Computer speed, Los Alamos National Laboratory. Source of data: Worlton, 1988.
We know that mechanical and electromechanical calculating machines had a history of improvement before John von Neumann and others began to tinker in the 1940s. And we know that the current Cray machines are not the ne plus ultra. Parallel machines already promise a further pulse of speed. Quantum computing looms beyond (Lloyd, 1993).
When Darmstaedter wanted to telephone, he was no doubt excited by the speed and distance his message could travel but probably frustrated by the capacity of the available lines. Long-distance calls had to be booked in advance. In the days of the telegraph it was one line, one message. In one hundred years, as Figure 3 shows, engineers have upped relative channel capacity by one hundred million times. In fact, fiber optics appear to initiate a new trajectory, above the line that described best performance from 1890 to 1980.
Figure 3. Communication channel capacity. Source: Patel, 1987.
Without computers, modern numerical climate models would not be tractable. Without telecommunications, global conferences would be difficult to organize. Without airplanes, Americans would rarely attend meetings in Europe. In 1893 it probably would have required three weeks to travel from a laboratory in Stanford, California, to a conference hall in Laxenburg, Austria, assuming no detours from icebergs. Airplanes first shrank our continents and then made it possible to hop from one to another.
Propulsion for aircraft, shown in Figure 4, has improved by a factor of one hundred thousand in 90 years. In fact, we can see clearly that the aeronauts have exploited two trajectories, one for pistons, ending about 1940, and one for jets, culminating in the present.
Figure 4. Performance of aircraft engines. After Gruebler, 1990. Sources of data: Angelucci and Matricardi, 1977; Grey, 1969; Taylor, 1984.
The aircraft engines exemplify how the continuing improvement of any technology eventually becomes limited by some physical principle. A new technology then overtakes the old by becoming more cost effective and permitting a broader range of operating characteristics such as speed or bandwidth. The present wave of jet development may have broken. But linear motors are just starting. These may power the magnetically levitated trains (terra-planes) of the 21st century at 2000-3000 km/hour.
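The scale of these factors is easier to grasp as implied average growth rates. The following is a minimal sketch that converts the round improvement factors and time spans quoted above for computing, communications, and propulsion into compound annual rates and doubling times; the inputs are the text's approximate figures, not fitted data.

```python
import math

# Convert an overall improvement factor over a period into the implied
# compound annual growth rate and doubling time. Factors and spans below
# are the approximate round numbers quoted in the text, not fitted data.
def trajectory_rate(factor, years):
    annual_growth = factor ** (1.0 / years) - 1.0             # compound annual rate
    doubling_time = years * math.log(2.0) / math.log(factor)  # years per doubling
    return annual_growth, doubling_time

for name, factor, years in [
    ("Los Alamos computer speed", 1e9, 43),
    ("Communication channel capacity", 1e8, 100),
    ("Aircraft engine performance", 1e5, 90),
]:
    g, d = trajectory_rate(factor, years)
    print(f"{name}: roughly {g:.0%} per year, doubling about every {d:.1f} years")
```

Even the slowest of the three trajectories, aircraft propulsion, implies a doubling of performance roughly twice per decade.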
These examples from information, communications, and transport bear importantly on the economy and society as a whole. Yet, one can argue that they matter only indirectly for emissions of greenhouse gases and for adaptation to climatic change. In fact, this view is wrong. Simply recall that weather forecasts, a pre-eminent form of adaptation, are the product of satellites, computers, radio, and video (and earlier, telegraphs and telephones). Assessing prospects for climate change requires broad consideration of technical progress. Nevertheless, let us look at agriculture and energy, where the links between climate and technology are most obvious.
It is common to believe that the revolution in agricultural productivity preceded the revolution in industrial productivity. In the United States, this was not the case. Thomas Jefferson’s Virginia fields yielded roughly the same number of bushels of wheat in 1800 as the average American field yielded until about 1940. Americans harvested more by bringing in more land.
Productivity per hectare took off in the United States in the 1940s, just like jet engines and computers, as is evident from Figure 5. U.S. wheat yields have tripled since 1940, and corn yields have quintupled. Other crops show similar trajectories. Yields in agriculture synthesize a cluster of innovations, including tractors, seeds, chemicals, and irrigation, joined through timely information flows and better organized markets.
Figure 5. Yields of wheat and corn per hectare in the United States, 1880–1990. After Waggoner, 1994.
Fears are chronic that societies have exhausted their agricultural potential. The Latin church father Tertullian wrote circa 200 AD: “The most convincing examinations of the phenomenon of overpopulation hold that we humans have by this time become a weight on the Earth, that the fruits of nature are hardly sufficient for our needs, and that a general scarcity of provisions exists which carries with it dissatisfaction and protests, given that the Earth is no more able to guarantee the sustenance of all. We thus ought not to be astonished that plagues and famines, wars and earthquakes come to be considered as remedies, with the task, held necessary, of reordering and limiting the excess population.”
Two millennia later the agricultural frontier is still spacious, even without invoking genetic engineering of plants. Figure 6, which contrasts annual corn yields for the best growers in Iowa, the average Iowa grower, and the world average, shows that the world average is only about 20 percent of the yield of the top Iowa farmer. Interestingly, the ratio between the performers has not changed much since 1960. Even in Iowa, the average performer lags more than 30 years behind the state of the art. While technology may progress, rates of diffusion appear to remain stable. And conservative.
Figure 6. Corn yields, Iowa and world, 1960-1991. “Iowa Master” refers to the winner of the annual Iowa Master Corn Growers Contest. Source: Waggoner, 1994.
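The stable ratio between leaders and laggards has a simple reading: if yields grow exponentially at a common rate, a constant ratio is equivalent to a constant time lag. The sketch below illustrates the arithmetic; the growth rate is an assumption inferred only from the text's remark that U.S. corn yields roughly quintupled over about 50 years, not a value read from Figure 6.

```python
import math

# If yields grow exponentially at rate g, a grower whose yield is a factor
# r below the leader is lagging the leader by ln(r)/g years. The rate g is
# an illustrative assumption inferred from the quintupling of U.S. corn
# yields over roughly 50 years, not a value read from Figure 6.
g = math.log(5.0) / 50.0  # about 3.2% per year

def implied_lag_years(ratio_behind, growth_rate):
    """Years of lag implied by yielding a factor `ratio_behind` below the leader."""
    return math.log(ratio_behind) / growth_rate

print(f"Assumed yield growth: {g:.1%} per year")
print(f"A grower at 20% of the leader's yield trails by about "
      f"{implied_lag_years(5.0, g):.0f} years")
```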
Though societies are cautious in adopting new practices, recall that the doubling of the pre-industrial level of CO2 often cited as hazardous is probably 75 or more years in the future. If we had performed a study prior to 1940 of the impacts of CO2 doubling and climate change on U.S. wheat and corn, the most easily defended assumption would have been constant yields per hectare as a baseline. Because it neglects technical progress, that assumption would have produced misleading results. Modern science can now penetrate to every field, cell, and sector of society. It must be taken into account in assessing costs and benefits of strategies for mitigation and adaptation.
One of the technical quests that began about 1700 was to build efficient steam engines. As shown in Figure 7, engineers have taken about 300 years to increase the efficiency of the generators to about 50 percent. Alternatively, we are about mid-way in a 600-year struggle toward perfectly efficient generating machines. What is clear is that the struggle for energy efficiency is not something new to the 1980s; only the widespread recognition of it is new.
Figure 7 also explains why we have been changing many light bulbs recently. We have been zooming up a one-hundred-year trajectory of increasing lamp efficiency. The struggle with the generators is measured in centuries. The lamps glow better each decade. The current pulse will surely not exhaust our ideas for illumination. The next century could well reveal new ways to see in the dark, just as quantum computing, linear motors, and bioengineering will reshape our calculations, travel, and food.
Figure 7. Efficiency of energy technologies. Sources: Starr and Rudman, 1973; Marchetti, 1979.
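One way to read the "mid-way in a 600-year struggle" remark is as a logistic trajectory: plotted as the logit of efficiency, a logistic improvement path is a straight line in time, and 50 percent efficiency marks its midpoint. The sketch below is an illustration under that assumption; the midpoint year and time constant are chosen only to echo the text (roughly 1 percent around 1700, roughly 50 percent around 2000), not fitted to the data behind Figure 7.

```python
import math

# Assumed logistic efficiency trajectory: 50% marks the midpoint, and the
# logit log(e/(1-e)) is linear in time. Parameters are illustrative choices
# echoing the text, not a fit to the data behind Figure 7.
def logistic_efficiency(year, midpoint=2000.0, k=0.0153):
    """Efficiency between 0 and 1 along the assumed logistic path."""
    return 1.0 / (1.0 + math.exp(-k * (year - midpoint)))

def logit(e):
    """Transform under which a logistic trajectory plots as a straight line."""
    return math.log(e / (1.0 - e))

for year in (1700, 1850, 2000, 2150, 2300):
    e = logistic_efficiency(year)
    print(f"{year}: efficiency ~{e:.0%}, logit {logit(e):+.2f}")
```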
The “cost” of reducing greenhouse gas emissions cannot be properly estimated without understanding the directions in which technical change will drive the energy sector anyway, with regard to preferred primary fuels as well as efficiency. What appear as costs in our current cost-benefit calculus for mitigating, and adapting to, the greenhouse effect may largely be adjustments that will necessarily occur in any case.
This possibility is illustrated by the final technological trajectory discussed here, that of decarbonization, or the decreasing carbon intensity of primary energy, measured in tons of carbon per kilowatt-year of electricity (or its equivalent) (Figure 8). As is evident, the global energy system has been steadily economizing on carbon. Without gloomy climate forecasts or dirty taxes.
Figure 8. Carbon intensity of primary energy, 1900-1990, with projections to 2100. The projection that halts the historic trend of decarbonization is the IPCC 1990 “Business as Usual” (BAU) scenario; IPCC IS92a and IS92c are high and low energy scenarios from the 1992 Supplement. Sources: Intergovernmental Panel on Climate Change (IPCC), 1990, 1992; Ausubel et al., 1988.
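To see how such a trend enters cost calculations, carbon intensity can be summarized by an average annual rate of decline and extrapolated. The sketch below shows only the mechanics; the endpoint intensities are placeholders indexed to 1.0 in 1900, not values read from Figure 8.

```python
# A sketch of summarizing and extrapolating a decarbonization trend.
# The endpoint intensities below are placeholders (indexed to 1.0 in 1900),
# NOT values read from Figure 8; substitute the figure's data to reproduce it.

def annual_decline(intensity_start, intensity_end, years):
    """Average annual rate of decline in carbon intensity over `years`."""
    return 1.0 - (intensity_end / intensity_start) ** (1.0 / years)

def extrapolate(intensity_now, rate, years_ahead):
    """Carbon intensity after `years_ahead` years at a constant rate of decline."""
    return intensity_now * (1.0 - rate) ** years_ahead

rate = annual_decline(intensity_start=1.0, intensity_end=0.75, years=90)  # 1900-1990
print(f"Implied average decline: {rate:.2%} per year")
print(f"Fraction of 1990 intensity remaining by 2100 at that rate: "
      f"{extrapolate(1.0, rate, 110):.2f}")
```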
In a peculiar choice of words, the Intergovernmental Panel on Climate Change in 1990 designated as “Business-as-Usual” its scenario which stifled and even reversed the 130-year trend. “Business as Usual” was a scenario of technical regression. It essentially ignored the scientific and technical achievement of the past 300 years, including the achievements that make identification and estimation of the greenhouse effect possible. Mr. Duell would have been quite at home with the 1990 IPCC.
For contrast, consider the “methane economy” scenario which essentially squeezes carbon out of the energy system by 2100 (Ausubel et al., 1988). It is perfectly consistent with the technical history and evolution of the energy system. In its 1992 scenarios the IPCC reluctantly began to reflect that society is a learning system and that we are learning to leave carbon.
3. Conclusions
The essential fact is that technological trajectories exist. Technical progress in many fields is quantifiable. Moreover, rates of growth or change tend to be self-consistent over long periods of time. These periods of time are often of the same duration as the time horizon of climatic change potentially induced by additions of greenhouse gases. Thus, we may be able to predict quite usefully certain technical features of the world of 2050 or 2070 or even 2100.
The hard part may be believing that in a few generations our major socio-technical systems will perform a thousand or a million or a billion times better than today.
If we accept that we are not at the end of the history of technology, surely our cost structure for mitigation and adaptation changes. In some cases it may be possible to summarize improving performance in a simple coefficient, such as that used for “autonomous energy efficiency improvement” (Nordhaus, 1992). The need is to have a long and complete enough historic record from which to establish the trend. Most prognosticators live life on the tangent, projecting on the basis of the last 15 minutes of system behavior. Our methods must advance to encompass long time frames.
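As an illustration of how such a coefficient operates, the sketch below applies an "autonomous energy efficiency improvement" style parameter: energy per unit of output is assumed to fall at a fixed exogenous rate regardless of prices. The growth rate and AEEI values are placeholders for illustration, not Nordhaus's figures.

```python
# An AEEI-style projection: output grows at a fixed rate while energy
# intensity (energy per unit of output) declines autonomously at rate `aeei`.
# The numbers below are placeholders for illustration, not Nordhaus's values.
def projected_energy(gdp_growth, aeei, years, energy_now=1.0):
    """Index of energy demand after `years` relative to today's level."""
    return energy_now * ((1.0 + gdp_growth) * (1.0 - aeei)) ** years

# Example: 2% annual output growth with a 1% AEEI yields roughly 1% net
# growth in energy demand, about a 63% rise over 50 years.
print(f"Energy demand index in 50 years: {projected_energy(0.02, 0.01, 50):.2f}")
```

Such a coefficient is only as good as the length and completeness of the historic record from which it is estimated, which is precisely the point about long time frames made above.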
A complicating factor is that technologies form clusters to reinforce one another and create whole new capabilities. Imagining how the clusters will affect lifestyles and restructure the economy, and thus affect emissions and vulnerability to climate, is a tremendous intellectual challenge. Lotka saw cars and compressors, but he probably could not envision vast air-conditioned cities and suburbs growing in Arizona, Texas, and Florida.
We also do not understand well the malleability of the time constants or rates of technical change. The technical clock ticks. The West did something a few centuries ago to set the whole machine in motion. Over the last 100 years the United States and other countries have gone much further in establishing systems for research and development. The global research and development enterprise is now about $200 billion annually. Will higher investments speed up the clock? Or, are they required just to maintain current rates of progress, with each increment coming at greater cost? The question is open.
Some object to the trajectories of technology because they seem to limit freedom. In fact, the trajectories point out promising channels for society to explore. Discovery and innovation can be costly games. Scientists and engineers should be grateful for signs pointing in the right directions and use them to make mitigation of and adaptation to climate change cheap.
Acknowledgement
I am grateful to Perrin Meyer for assistance.
References
Angelucci, E., and Matricardi, P., 1977, Practical Guide to World Airplanes, Vols. 1-4, Mondadori, Milan, Italy (in Italian).
Ausubel, J.H., Gruebler, A., and Nakicenovic, N., 1988, Carbon dioxide emissions in a methane economy, Climatic Change 12:245-263.
Cerf, C., and Navasky, V., 1984, The Experts Speak: The Definitive Compendium of Authoritative Misinformation, Pantheon, New York, USA.
Darmstaedter, L., 1908, Handbuch zur Geschichte der Naturwissenschaften und der Technik, Springer-Verlag, Berlin, Germany.
Grey, C.G., ed., 1969, Jane’s All the World’s Aircraft, reprint of 1919 Edition, David & Charles, London, UK.
Gruebler, A., 1990, The Rise and Fall of Infrastructures, Physica, Heidelberg, Germany.
Intergovernmental Panel on Climate Change (IPCC), 1990, Climate Change: The IPCC Scientific Assessment, Cambridge University Press, Cambridge, UK.
Intergovernmental Panel on Climate Change (IPCC), 1992, Climate Change 1992: The Supplementary Report to the IPCC Scientific Assessment, Cambridge University Press, Cambridge, UK.
Lloyd, S., 1993, A potentially realizable quantum computer, Science 261:1569-1571.
Lotka, A.J., 1924, Elements of Physical Biology, Williams & Wilkins, Baltimore, MD, USA; reprinted 1956, Dover, New York, USA.
Marchetti, C., 1979, Energy systems: The broader context, Technological Forecasting and Social Change 15:79-86.
Marchetti, C., 1980, Society as a learning system, Technological Forecasting and Social Change 18:267-282.
Mensch, G., 1979, Stalemate in Technology, Ballinger, Cambridge, MA, USA.
Nordhaus, W.D., 1992, An optimal transition path for controlling greenhouse gases, Science 258:1315-1319.
Patel, C.K.N., 1987, Lasers in communications and information processing, in Ausubel, J.H., and Langford, W.D., eds., Lasers: Invention to Application, National Academy, Washington, DC, USA, pp. 45-100.
Starr, C. and Rudman, R., 1973, Parameters of technological growth, Science 182:358-364.
Taylor, J.W.R., ed., 1984, Jane’s All the World’s Aircraft 1984-1985, Jane’s, London, UK.
Tertullian, Apology, Loeb Classical Library 250, Harvard University, Cambridge, MA, USA.
Waggoner, P.E., 1994, How much land can ten billion people spare for Nature? Council for Agricultural Science and Technology, Ames, IA, USA.
Worlton, J., 1988, Some patterns of technological change in high performance computers, CH2617-9, Institute for Electrical and Electronics Engineers, New York, USA.