preindustrial levels, they put the likely global temperature rise at between two and a half and eight degrees Fahrenheit. The panel members weren’t sure how long it would take for changes already set in motion to become manifest, mainly because the climate system has a built-in time delay. The effect of adding CO₂ to the atmosphere is to throw the earth out of “energy balance.” In order for balance to be restored—as, according to the laws of physics, it eventually must be—the entire planet has to heat up, including the oceans, a process, the Charney panel noted, that could take “several decades.” Thus, what might seem like the most conservative approach—waiting for evidence of warming to make sure the models were accurate—actually amounted to the riskiest possible strategy: “We may not be given a warning until the CO₂ loading is such that an appreciable climate change is inevitable.”
It is now more than twenty-five years since the Charney panel issued its report, and, in that period, Americans have been alerted to the dangers of global warming so many times that reproducing even a small fraction of these warnings would fill several volumes; indeed, entire books have been written just on the history of efforts to draw attention to the problem. (Since the Charney report, the National Academy of Sciences alone has produced nearly two hundred more studies on the subject, including, to name just a few, “Radiative Forcing of Climate Change,” “Understanding Climate Change Feedbacks,” and “Policy Implications of Greenhouse Warming.”) During this same period, worldwide carbon-dioxide emissions have continued to increase, from five billion to seven billion metric tons a year, and the earth’s temperature, much as predicted by Manabe’s and Hansen’s models, has steadily risen. The year 1990 was the warmest year on record until 1991, which was equally hot. Almost every subsequent year has been warmer still. As of this writing, 1998 ranks as the hottest year since the instrumental temperature record began, but it is closely followed by 2002 and 2003, which are tied for second; 2001, which is third; and 2004, which is fourth. Since climate is innately changeable, it’s difficult to say when, exactly, in this sequence natural variation could be ruled out as the sole cause. The American Geophysical Union, one of the nation’s largest and most respected scientific organizations, decided in 2003 that the matter had been settled. At the group’s annual meeting that year, it issued a consensus statement declaring, “Natural influences cannot explain the rapid increase in global near-surface temperatures.” As best as can be determined, the world is now warmer than it has been at any point in the last two millennia, and, if current trends continue, by the end of the century it will likely be hotter than at any point in the last two million years.
In the same way that global warming has gradually ceased to be merely a theory, so, too, its impacts are no longer just hypothetical. Nearly every major glacier in the world is shrinking; those in Glacier National Park are retreating so quickly it has been estimated that they will vanish entirely by 2030. The oceans are becoming not just warmer but more acidic; the difference between daytime and nighttime temperatures is diminishing; animals are shifting their ranges poleward; and plants are blooming days, and in some cases weeks, earlier than they used to. These are the warning signs that the Charney panel cautioned against waiting for, and while in many parts of the globe they are still subtle enough to be overlooked, in others they can no longer be ignored. As it happens, the most dramatic changes are occurring in those places, like Shishmaref, where the fewest people tend to live. This disproportionate effect of global warming in the far north was also predicted by early climate models, which forecast, in column after column of FORTRAN-generated figures,