Lecture #24: People Who Live in Glass Houses

Suggested Readings:

Spencer Weart, The Discovery of Global Warming (2003); see also his very helpful website: http://www.aip.org/history/climate/

Richard Elliot Benedick, Ozone Diplomacy (1991) (insider's account of the Montreal Protocol)

John Houghton, Global Warming: The Complete Briefing, 5th ed. (2015)

Intergovernmental Panel on Climate Change (IPCC), Climate Change 2014 (2014) http://www.ipcc.ch/

RealClimate website (extensive web guide to current climate science discussions)

Bjorn Lomborg, Cool It: The Skeptical Environmentalist's Guide to Global Warming (2007) (a leading skeptic)

Naomi Oreskes & Erik M. Conway, Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming (2010) (critical history of climate change skeptics)

PBS Frontline, Climate of Doubt (hour-long documentary on efforts to promote climate change skepticism)

Mike Hulme, Why We Disagree About Climate Change: Understanding Controversy, Inaction and Opportunity (2009)

John McPhee, "Atchafalaya," in The Control of Nature (1990) (also available on New Yorker website at https://www.newyorker.com/magazine/1987/02/23/atchafalaya)

Richard Campanella, Geographies of New Orleans: Urban Fabrics Before the Storm (2006)

Craig Colten, An Unnatural Metropolis: Wresting New Orleans from Nature (2004)

Douglas Brinkley, The Great Deluge: Hurricane Katrina, New Orleans, and the Mississippi Gulf Coast (2006)

Joshua Howe, Behind the Curve: Science and the Politics of Global Warming (2014)

Joshua Howe, Making Climate Change History: Documents from Global Warming's Past (2017) (a useful collection of primary documents on history of climate change science)

Outline

I. The Limits to Growth

A powerful irony of the 1980s was that at the very moment when issues of toxicity were politicizing people at the local level to an unprecedented degree, a new kind of environmental politics was also emerging at the much larger scale of the planet as a whole.

To see where this comes from, we have to revisit the question of population and also trace the emerging science of climatology over the course of the twentieth century. As we do so, we’ll encounter several themes that will already be familiar to you from earlier in the course:

  • The importance of new scientific techniques and instrumentation in helping people recognize the existence of environmental problems hitherto invisible to them.
  • In particular, the rise of digital computing technologies that permitted modeling of earth systems on a scale never before possible.
  • An increasingly political role for science itself, with numerous disagreements among scientists and a dilemma for non-scientific members of the public about how to evaluate the meaning and implications of scientific “uncertainty.”
  • The inescapable challenge of managing nature, now apparently at the scale of the planet as a whole.
  • A whole new vision of potential apocalypse, symbolized in many ways by Hurricane Katrina in August 2005.
  • All of this in the context of increasingly hostile partisan politics, with sustained efforts to raise doubts in the minds of the public about scientific findings about climate change.

Let's start by reviewing efforts to predict future environmental change by using mathematical models.

Malthusian demography can be seen as a crucial source for this effort at modeling future population change.

Limits to Growth: in 1972, a team led by Dennis and Donella Meadows (Donella was the chief author) produced the most famous such prophecy of the early 1970s, using MIT computer modeling as a new, more sophisticated version of Malthusian logic: plug complex economic, social, and ecological variables into an elaborate set of equations, then run various "scenarios" to see likely future outcomes.

The "standard run" of their model produced a scenario leading to overpopulation, then falling resources, soaring pollution, and societal collapse. Even their "unlimited resources" scenario ultimately hit limits driven by pollution effects. Only their "stabilized model" held out hope for the human future.

Limits to Growth became an international bestseller. The public read the book as an apocalyptic prophecy. Perfectly timed (by accident) to coincide with the Arab oil embargo of 1973-74, it seemed a fulfillment of the worst environmental nightmares. Most readers missed its intended message of future hope if more responsible practices were put in place in a timely fashion. Its conclusion read:

If there is cause for deep concern, there is also cause for hope. Deliberately limiting growth would be difficult, but not impossible. The way to proceed is clear, and the necessary steps, although they are new ones for human society, are well within human capabilities. Man possesses, for a small moment in his history, the most powerful combination of knowledge, tools, and resources the world has ever known. He has all that is physically necessary to create a totally new form of human society--one that would be built to last for generations. The two missing ingredients are a realistic, long-term goal that can guide mankind to the equilibrium society and the human will to achieve that goal. Without such a goal and a commitment to it, short-term concerns will generate the exponential growth that drives the world system toward the limits of the earth and ultimate collapse. With that goal and that commitment, mankind would be ready now to begin a controlled, orderly transition from growth to global equilibrium.

For a fuller discussion, see
https://en.wikipedia.org/wiki/The_Limits_to_Growth
For a free downloadable copy of the 1972 edition of the book, see
http://www.donellameadows.org/wp-content/userfiles/Limits-to-Growth-digital-scan-version.pdf

The views of Limits to Growth achieved a kind of official government endorsement when President Jimmy Carter commissioned the Global 2000 Report, authored by Gus Speth and published just as Carter was leaving office. It provided abundant statistical evidence suggesting that the predictions of Limits to Growth were coming true, and that the 21st century would be worse than the 20th in myriad ways. Its key conclusion:

If present trends continue, the world in 2000 will be more crowded and more vulnerable to disruption than the world we live in now. Serious stresses involving population, resources, and environment are clearly visible ahead. Despite greater material output, the world's people will be poorer in many ways than they are today.

A conservative rejoinder came from the economist Julian Simon and the futurist Herman Kahn in a book entitled The Resourceful Earth (1983), which rebutted the Global 2000 Report point by point, arguing that the world was getting better, not worse, in all ways.

For our purposes, the key insight from these debates about Malthusian "limits to growth" was that computer models were now becoming indispensable tools for speculating about ways in which past and present environmental trends might be projected into the future. In one very important sense, our understanding of future environmental problems was becoming increasingly virtual.

II. A Hole in the Sky

The 1980s would see computer modeling of environmental problems turn to a new realm: the atmosphere.

James Lovelock argued in his influential 1979 book Gaia that the unusual combination of highly unstable gases in Earth's oxidizing atmosphere could not possibly exist without life on the planet. In Lovelock's view, Earth's atmosphere was fundamentally a product of homeostatic biological processes. His striking conclusion:

The chemical composition of the atmosphere bears no relation to the expectations of steady-state chemical equilibrium. The presence of methane, nitrous oxide, and even nitrogen in our present oxidizing atmosphere represents a violation of the rules of chemistry to be measured in tens of orders of magnitude. Disequilibria on this scale suggest that the atmosphere is not merely a biological product, but more probably a biological construction: not living, but like a cat's fur, a bird's feathers, or the paper of a wasp's nest, an extension of the living system designed to maintain a chosen environment.... We have since defined Gaia as a complex entity involving the Earth's biosphere, atmosphere, oceans, and soil; the totality constituting a feedback or cybernetic system which seeks an optimal physical and chemical environment for life on this planet.

For more on Lovelock and his Gaia Hypothesis, see:
https://en.wikipedia.org/wiki/James_Lovelock
https://en.wikipedia.org/wiki/Gaia_hypothesis

The early 1980s saw computer modeling used to project the likely effects of a full-scale nuclear exchange between the US and the USSR: a "nuclear winter," a darkening of the atmosphere so complete as to yield long-term biological failure of natural and agricultural systems alike, with devastating consequences for human civilization.
https://en.wikipedia.org/wiki/Nuclear_winter

In the early 1970s, there had been a debate over whether the supersonic transport (SST) passenger jet might potentially damage the Earth's ozone layer (which filters much of the sun's ultraviolet radiation so that it doesn't reach the surface of the planet); this was soon followed by a hypothetical argument that the chlorofluorocarbons (CFCs) used as propellants in aerosol sprays might also damage the ozone layer.

CFCs had been invented in 1930 for General Motors' Frigidaire Division by the chemist Thomas Midgley. Midgley had also invented tetraethyl lead for GM in 1923 as a gasoline additive to reduce "knocking" in the internal combustion engines of automobiles. (Given the eventual environmental impacts of these two inventions, the environmental historian John McNeill has quipped that "Midgley, the same research chemist who figured out that lead would enhance engine performance, had more impact on the atmosphere than any other single organism in earth history.")
https://en.wikipedia.org/wiki/Thomas_Midgley_Jr.

CFCs appeared to be ideal refrigerants: non-toxic, highly efficient, non-explosive, extraordinarily stable. But the very stability that made them chemically inert and therefore safe under normal conditions meant that they were persistent enough to be transported into the upper atmosphere, where they could be broken down by sunlight and the resulting chlorine atoms could then serve as catalysts, breaking down ozone molecules over and over again, as the reactions sketched below illustrate:
https://en.wikipedia.org/wiki/Chlorofluorocarbon
https://en.wikipedia.org/wiki/Freon
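In outline, the chemistry at stake (the catalytic cycle Rowland and Molina would describe in 1974) can be written as three reactions, shown here in LaTeX with CFC-11 as the example molecule; note that the chlorine atom emerges intact, free to destroy ozone again:

\begin{align*}
\mathrm{CFCl_3} + h\nu &\longrightarrow \mathrm{CFCl_2} + \mathrm{Cl} \\
\mathrm{Cl} + \mathrm{O_3} &\longrightarrow \mathrm{ClO} + \mathrm{O_2} \\
\mathrm{ClO} + \mathrm{O} &\longrightarrow \mathrm{Cl} + \mathrm{O_2} \\
\text{net:}\quad \mathrm{O_3} + \mathrm{O} &\longrightarrow 2\,\mathrm{O_2}
\end{align*}

Because the catalyst is regenerated, a single chlorine atom can destroy many thousands of ozone molecules before it is finally removed from the stratosphere.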

In 1974, the chemists Sherwood Rowland and Mario Molina published a paper in Nature (for which they would eventually win the Nobel Prize in Chemistry) hypothesizing the potential impacts of CFCs on atmospheric ozone.

It was a remarkable example of how a seemingly benign solution to one set of environmental and health problems (the toxicity and explosive risks associated with early refrigerants) could have the unintended consequence of causing other problems in places where no one ever expected they might arise.

In 1985, British scientists discovered an ozone "hole" over Antarctica, prompting fears about the effects of rising levels of ultraviolet radiation on plankton and marine food supplies, about crop damage if ozone depletion spread to the planet's mid-latitudes, and about potential increases in human skin cancer from UV exposure.
https://en.wikipedia.org/wiki/Ozone_depletion

There had in fact been a steady rise in melanoma and basal cell skin cancers since the 1930s...but this was caused as much by changes in fashion--the boom in tanning and exposing bare skin to summer sunlight--as by shifts in the atmosphere.

In 1987, most nations in the world signed the Montreal Protocol, agreeing to phase out CFCs. It was successful partly because relatively easy replacements for CFCs seemed to be available, making possible a technical fix to a well-defined problem about which most scientists agreed.
https://en.wikipedia.org/wiki/Montreal_Protocol

III. Peering Skyward Into the Future

What seemed relatively easy in addressing the challenges associated with CFCs would prove far harder for other aspects of the Earth's atmosphere.

Through the 1970s, the principal source of climate fear was the threat that we might be returning to Ice Age conditions, caused by natural cooling and anthropogenic dust pollution. A national leader in advocating this view was UW-Madison's Reid Bryson, founder of what is now known as the Nelson Institute for Environmental Studies.

Starting in the late 1970s, there was increasing interest in the possibility of "greenhouse warming" from growing quantities of certain trace gases in the atmosphere, like CO2, CH4 (methane), and CFCs (and water vapor, always the most important).

We should start by noting that no one doubts the reality of Earth's greenhouse effect, which is a natural physical process in the planet's atmosphere. Trace gases like CO2, CH4 (methane), CFCs, and most of all water vapor absorb outgoing heat and re-radiate it back toward the lower atmosphere, thereby warming the planet. The greenhouse effect is currently (and naturally) responsible for 33°C (59°F) of warming. (If the greenhouse effect were solely responsible for the planet's temperature, the mean temperature would be 77°C (171°F)...but it's not: a variety of other effects dampen the greenhouse effect.)
https://en.wikipedia.org/wiki/Greenhouse_effect
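The 33°C figure comes from a standard back-of-the-envelope energy balance, which is worth seeing once. With no greenhouse effect, the planet's temperature settles where absorbed sunlight equals emitted blackbody radiation (in LaTeX, using the usual textbook values):

\[
T_{\mathrm{eff}} = \left[\frac{S(1-A)}{4\sigma}\right]^{1/4}
= \left[\frac{1361\,(1-0.3)}{4 \times 5.67\times 10^{-8}}\right]^{1/4}
\approx 255\ \mathrm{K} \approx -18\,^{\circ}\mathrm{C}
\]

where S is the solar constant, A the planetary albedo, and \sigma the Stefan-Boltzmann constant. The observed mean surface temperature is about 288 K (15°C); the difference, roughly 33°C, is the natural greenhouse effect.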

The earliest prediction of this possibility was made by the Swedish scientist Svante Arrhenius, who calculated in 1896 that a doubling of atmospheric CO2 from coal combustion could yield a 5°C global temperature increase. He also predicted:

  • greater warming in winter than summer;
  • greater warming on land than over the oceans;
  • more in the northern hemisphere than the southern;
  • more at night than during the day;
  • more at the poles than in mid-latitudes or on the equator.

All of these predictions were backed with plausible reasoning, but there was essentially no way to measure or test his hypotheses: instrumentation and data processing had not yet reached the point where such calculations could be made. For more on Arrhenius, see:
https://en.wikipedia.org/wiki/Svante_Arrhenius
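Arrhenius's central insight, that warming depends on the ratio of CO2 concentrations rather than on their absolute amounts, survives in the simplified expression still used today for CO2 radiative forcing (the 5.35 coefficient is the modern empirical fit, not Arrhenius's own number):

\[
\Delta F = 5.35 \,\ln\!\left(\frac{C}{C_0}\right)\ \mathrm{W\,m^{-2}},
\qquad
\Delta T \approx S \,\log_2\!\left(\frac{C}{C_0}\right)
\]

where C/C_0 is the ratio of new to original concentration and S is the "climate sensitivity," the equilibrium warming per doubling of CO2. Arrhenius put S at roughly 5°C, not far above the range modern assessments report.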

In 1938 the British engineer Guy Stewart Callendar performed a series of calculations by hand, essentially a back-of-the-envelope exercise, and predicted a rise in world temperature.

The International Geophysical Year (IGY) of 1957-58 created new institutional arrangements for climatic data collection: international scientific cooperation during the Cold War, permanent stations in Antarctica, and new efforts at climate monitoring. It was driven partly by Cold War military interests: calculating the flight paths of long-range missiles, navigation, growing computational power, concern about radiation from nuclear testing, and more. All of these contributed to the technologies and instrumentation that made it possible to study climate change (and produced other major discoveries in the earth sciences, especially the oceanographic data-gathering that finally demonstrated the reality of plate tectonics).

A key leader among American scientists during the IGY was Roger Revelle, head of the Scripps Institution of Oceanography, who declared in 1957:

Human beings are now carrying out a large scale geophysical experiment of a kind that could not have happened in the past nor be reproduced in the future. Within a few centuries we are returning to the atmosphere and oceans the concentrated organic carbon stored in sedimentary rocks over hundreds of millions of years. This experiment, if adequately documented, may yield a far-reaching insight into the processes determining weather and climate.

Revelle proposed a key experiment: Charles David Keeling's monitoring station atop Mauna Loa, whose measurements of rising CO2 eventually yielded what is now arguably among the most important scientific graphs in history: the "Keeling Curve."
https://en.wikipedia.org/wiki/Keeling_Curve

Reconstructing past temperatures and climates is an essentially historical enterprise, albeit with "documents" different from the ones most used by traditional historians: tree rings, Greenland and Antarctic ice cores, pollen, carbon-14, etc.

Computer models were key here too: indeed, in subsequent years they came to be relied on at least as much as actual climatological data. The earliest models (produced during the same period when the graphs in Limits to Growth were being assembled) were quite crude, with very large "pixels" representing big chunks of Earth's surface and only a limited number of variables in the model equations. Over time, models became increasingly complex, with more and more variables, pixels covering ever smaller units of Earth's surface, and growing accuracy of fit with historical datasets.
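To see what even the crudest version of this looks like, here is a hypothetical zero-dimensional energy balance model in Python: the entire planet treated as a single "pixel," stepped forward in time under a doubled-CO2 forcing. Real general circulation models solve atmospheric physics on three-dimensional grids; this sketch (standard textbook constants, illustrative tuning) only conveys the underlying logic:

import math

SOLAR = 1361.0      # solar constant, W/m^2
ALBEDO = 0.3        # planetary albedo (fraction of sunlight reflected)
SIGMA = 5.67e-8     # Stefan-Boltzmann constant, W/m^2/K^4
EMISSIVITY = 0.612  # effective emissivity, tuned so balance sits near 288 K
HEAT_CAP = 2.0e8    # effective heat capacity of ocean mixed layer, J/m^2/K
YEAR = 3.15e7       # seconds per year

def step(temp, co2_ratio):
    """Advance global mean temperature one year under a CO2 forcing."""
    absorbed = SOLAR * (1 - ALBEDO) / 4        # incoming sunlight
    emitted = EMISSIVITY * SIGMA * temp ** 4   # outgoing infrared
    forcing = 5.35 * math.log(co2_ratio)       # simplified CO2 forcing
    return temp + YEAR * (absorbed - emitted + forcing) / HEAT_CAP

temp = 288.0  # start at roughly the observed mean surface temperature (K)
for _ in range(200):
    temp = step(temp, co2_ratio=2.0)  # the "scenario": doubled CO2
print(f"after 200 years of doubled CO2: {temp:.1f} K")

The roughly one degree of warming this toy produces is the no-feedback response; the water vapor, cloud, and ice-albedo feedbacks that real models struggle to capture are what push estimates higher.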

A severe midwestern drought during the spring of 1988 was the context for James Hansen (a NASA climate modeler) testifying before the Senate that this regional US weather crisis was a product of anthropogenic warming: a controversial claim, even among climate scientists, but one that attracted enormous media attention.

The Intergovernmental Panel on Climate Change (IPCC) was founded under UN auspices in 1988, and began preparing a series of synthetic reports designed to identify the consensus understanding among climate scientists. These are available for download at http://www.ipcc.ch/

In January 1989 Time Magazine took the unprecedented step of replacing the "Man of the Year" on its cover with "Planet of the Year: Endangered Earth":
http://content.time.com/time/covers/0,16641,19890102,00.html

Bill McKibben published The End of Nature in 1989 (initially serialized in the New Yorker), the first popular book on climate change, a national bestseller, declaring that human influences on the climate and Earth were now everywhere...and that therefore "nature" as something separate from humanity had "ended."

By the late 1990s, an influential "hockey stick" graph seemed to show a clear signal that mean global temperatures were rising as a result of anthropogenic increases in greenhouse gases. Computer models were also becoming better able to mimic historical data. There would be subsequent controversy over this graph, which is summarized at
https://en.wikipedia.org/wiki/Hockey_stick_graph

CO2 is not remotely as easy to regulate as CFCs, given its central role in all agricultural, industrial, and physiological activity: it's no exaggeration to say that CO2 is central to the very metabolism of modernity.

By the early twenty-first century, there was mounting evidence of climate change and growing public concern: fear of flooding in coastal areas from sea level rise; shrinking mountain glaciers worldwide; spread of insect-borne diseases; even decreasing duration of ice cover on Madison's Lake Mendota.

For a graph of Lake Mendota's ice, see
http://climatewisconsin.org/story/ice-cover (requires Flash)
http://www.aos.wisc.edu/%7Esco/lakes/mendota-dur.gif

IV. Katrina

Al Gore's An Inconvenient Truth (2006) played an influential role in focusing public attention on climate change, an issue that had concerned Gore for many years, and earned him (and the IPCC) the Nobel Peace Prize in 2007.

The film (and Gore's unpopularity among conservatives) led to large investments by oil companies and conservative groups seeking to cast doubt on climate science, with the Heartland Institute (https://www.heartland.org) playing a lead role in organizing such efforts after 2008. For an hour-long PBS documentary on these efforts, see
https://www.pbs.org/wgbh/frontline/film/climate-of-doubt/

Promotional posters for the film showed an ominous hurricane vortex emanating from industrial smokestacks. The clear implication was that greenhouse gases were contributing to hurricane frequency and/or severity.

The use of this image was no accident: a year before the film came out, on August 29, 2005, Hurricane Katrina had become a powerful global symbol of severe weather.
https://en.wikipedia.org/wiki/Hurricane_Katrina

After an extended period of relatively infrequent hurricanes during the second half of the twentieth century, their frequency seemed to be on the rise by the 1990s, with some climatologists hypothesizing that warmer ocean temperatures would increase the number of severe hurricanes. Independent of this hypothesized effect, property damage from hurricanes was also rising as people built expensive vacation houses on exposed shorelines during decades of relative calm.

Although Katrina and New Orleans became symbols of global warming, we know from this course that they in fact reflect much deeper environmental historical phenomena that we've been studying all semester. Let's use them to practice some of the environmental history skills we've learned.

The city's location at the mouth of the Mississippi River meant that the drainage of much of the nation's interior between the Appalachians and the Rockies flowed past it.

People had been constructing levees along the river for centuries to prevent flooding, with large-scale levee building promoted by the Army Corps of Engineers from the second half of the nineteenth century forward.

The theory was that levees would not only protect property adjacent to the river, but would speed the flow of the water, scouring and deepening the river bed. But floods kept getting higher, requiring levees to be raised repeatedly.

New Orleans had suffered from flooding from the beginning, whenever "crevasses" opened in its levees.

The oldest part of the city, the French Quarter, had been constructed on a natural levee that was the highest ground adjacent to the Mississippi. There was relatively little construction in the low marshy ground between the Mississippi and Lake Pontchartrain.

The great 1927 Mississippi River flood was widely blamed on bad levee design, producing strong commitment afterwards to higher levees, pumping facilities, and spillways where excess water could be directed.

The effectiveness of the new flood-control structures accelerated urban settlement onto lower ground, while at the same time drying and compressing the soil so that the city began to subside, making low ground even lower. By the early 21st century, 49% of the city was below sea level: New Orleans was effectively a bowl between the Mississippi and Lake Pontchartrain.

The containment of the river between levees meant that marshes in the Louisiana delta were no longer replenished with silt, so that they too began to sink and disappear, thereby reducing the buffer they had previously provided against Gulf storms.

Hurricane Katrina made landfall east of New Orleans as a Category 3 storm (it had earlier been a Category 5) on August 29, 2005. After initial elation that the city had avoided a direct hit, people realized that the levees had been breached at several locations, resulting in 80% of the city being under deep water at the height of the flooding.

More than 1,800 people lost their lives in the storm and subsequent floods; property damage reached $81 billion; 90,000 square miles (an area nearly the size of the United Kingdom) were declared a disaster area; and 3 million people were left without electricity.

The differential impact on poor people and the linkage of poverty with race made the storm's aftermath a scandal: Americans were ashamed at the racial injustices that were so obviously revealed in the wake of Katrina, and angry that their leaders were so ineffective in responding. A "natural" disaster had proven also to be a profound example of environmental injustice.

Although Katrina became a symbol of global warming for many, it was a tragedy with much deeper and more complicated historical roots.