How To Predict the Rising Tides?
Insurance Regulators Prompt Action; Recent Report on Mid-Atlantic Coast a Test Case
The National Association of Insurance Commissioners last week suddenly made the science of calculating the consequences of a rising sea level anything but academic. These state insurance regulators will now require insurance companies to submit annual “climate-risk” reports, which must include (among many other things) the risks posed by extreme weather events and by inexorably rising seas. The decision follows a March 12 blast from the National Research Council complaining that state and local government officials were also guilty of ignoring climate change in their infrastructure decisions.
The back-to-back demands for insurers and government officials to get real about climate change highlight the significance of a recent study on the sensitivity of Mid-Atlantic shorelines to rising sea level. The report, titled “Coastal Sensitivity to Sea-Level Rise: A Focus on the Mid-Atlantic Region,” laudably attempts to address that need by translating the global effects of climate change to a geographic scale where local, state, and federal policymakers can act—as well as insurers.
That report, however, shows just how hard it is to reconcile policy needs with stringent standards for uncertainty before coastlines from New York to North Carolina slip anywhere from 12 to 40 inches (30 to 100 centimeters) below rising sea water. Writ large, the study presents policymakers with a conundrum they must overcome: balancing the urge to keep gathering data to refine the analysis of a dynamic problem against the need to act on climate change in the face of clearly intolerable consequences.
The Environmental Protection Agency took the helm of the study, and the U.S. Geological Survey and National Oceanic and Atmospheric Administration served as co-authors. Their report assumed three scenarios of relative sea-level rise between 30 cm and 100 cm by 2100. The authors then asked experts to explore the corresponding consequences. The report finds that shoreline erosion will likely increase, as will damage from storm surges. Some land will become submerged. More wetlands will probably disappear than will migrate inland, with a concomitant blow to wildlife that loses its habitat, including valuable spawning areas. Groundwater also may be contaminated.
Contributing author Denise Reed, professor and director of the Pontchartrain Institute for Environmental Sciences at the University of New Orleans, explains that the report creates frameworks for understanding what might befall a particular beach, estuary, or marsh—even if it cannot specify the future impact of rising seas on any particular piece of coastline. Adding to the uncertainty of the predictions is a further conclusion: the resilience of different coastlines will be unevenly distributed. As Reed explained in an interview, “Some places are in worse shape than others, or some might be okay if we don’t screw them up.”
The report also suggests—greenhouse gas emissions aside—that we may well screw some things up. Considering all the commercial interests, supervisory agencies, land-use policies, and other local social forces, Jim Titus, EPA’s lead author and sea level rise project manager, warns that “our systems were designed with a stable sea level, and in some instances they thwart our ability to respond to sea-level rise.” Case in point: the Federal Emergency Management Agency’s maps for insurance rates don’t incorporate rising sea level, yet requiring a sea wall or other barrier before approving a project now may impede the ability of the coastline and communities to adapt.
The report therefore tries “to educate the reader, the public, and the policymaker,” says Jeff Williams, coastal marine geologist at the U.S. Geological Survey, that “we ought to be taking sea-level rise seriously and making plans not just for the next 5 years, but the next 50 or 100 years.”
But between consensus and uncertainty, policy and science diverge. “Uncertainty” within the scientific community refers to levels of statistical confidence associated with the results of a study. Does the statistical analysis show that the results are 99 percent likely to indicate a genuine, or statistically significant, relationship? Or 95 percent? Or 80 percent? Statistical samples can never yield 100 percent certainty, but this practical impossibility does not mean that the results are somehow flawed or useless.
In scientific literature, as more and more studies conducted by a multitude of researchers add more test cases that reach similar conclusions, the overall “certainty” of these results increases. For instance, although there is not a 100-percent consensus on what the impacts of global warming will be on hurricane intensity, more and more studies now show that rising sea surface temperatures will result in stronger storms. This is how the scientific community develops its knowledge base.
That epistemological process also means that results which are 94 percent certain or less may never appear in scientific publications, which tend to require 95 percent confidence or higher. Nonetheless, even 80-percent-certain information could interest someone who has nothing better available for making decisions about, say, climate policy.
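A rough illustration of how the confidence threshold shapes what gets reported: the same data yield a narrower interval at 80 percent confidence than at 95 or 99 percent, so a trend that clears one bar may fail a stricter one. The sketch below (in Python, using made-up tide-gauge numbers and a simple normal approximation, not any data from the report) shows the trade-off.

```python
from statistics import NormalDist, mean, stdev

# Hypothetical annual sea-level trend measurements (mm/yr) at one tide gauge.
# These values are illustrative only.
readings = [3.1, 2.7, 3.4, 2.9, 3.6, 2.5, 3.2, 3.0, 2.8, 3.3]

def confidence_interval(data, confidence):
    """Normal-approximation confidence interval for the sample mean."""
    m = mean(data)
    se = stdev(data) / len(data) ** 0.5          # standard error of the mean
    z = NormalDist().inv_cdf(0.5 + confidence / 2)  # two-sided critical value
    return m - z * se, m + z * se

for conf in (0.80, 0.95, 0.99):
    lo, hi = confidence_interval(readings, conf)
    print(f"{conf:.0%} CI for trend: {lo:.2f} to {hi:.2f} mm/yr")
```

Demanding 99 percent confidence widens the interval; demanding only 80 percent narrows it, which is exactly why a decision-maker with nothing better to go on might accept the looser standard.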
After all, the word “uncertainty” in the rest of the world has a slightly different connotation than in statistics and can be used to feign ignorance or imply fault with scientific analyses. Indeed, “uncertainty” has been used as a policy weapon to delay action on global warming, as if we were paralyzed because we could not model exactly what might happen in Los Angeles on any given Sunday in 2050. One remedy might put whatever we do know into the public domain, with proper caveats about its relative uncertainty and mutability, so that legislators and others can act rather than continually ask scientists to refine and assess what we know before doing anything. That approach would mean scientific reviewers were not the only gatekeepers for information; different people could decide which standards of statistical certainty they could tolerate for different circumstances.
But science retains a key role in convincing people to act. As Titus said in a conference call, some people might conclude that “this report was a detour we had to take that maybe kept us from doing important stuff, but we had to do” because leaders inside and outside of government still needed persuading. Hence, in a contrasting approach, since uncertainty can fuel the disinformation campaigns of climate change naysayers, we might wait until we can circulate data that inspire 95 percent confidence, both to withstand attempts to warp the interpretation and to preclude charges of flip-flopping as better data become available.
But high confidence at the local level is hard to come by when it comes to rising sea level. This quest to balance urgency with certainty threads its way through “Coastal Sensitivity to Sea-Level Rise.” Indeed, the original title changed from “Coastal Elevations and Sensitivity to Sea-Level Rise” because local data about coastal elevations varied in precision and were largely dropped from the 790-page compendium. Meeting minutes of the Coastal Elevations and Sea Level Rise Advisory Committee (CESLAC) note with irony and disappointment that the first task of the report was to decide which areas were low enough to be inundated by rising seas. CESLAC concluded that it:
Understands the rationale that led the report authors to excise most of the spatially explicit material from the final document, and the committee believes that the decision underscores one of the principal governmental challenges in dealing with climate change. The fact that there is no comprehensive, highly resolved, and well-vetted inventory of coastal elevations means analyses of lands at risk suffers from variable resolutions and uncertainties. This kind of information can be problematic for agency accountability when it is the basis for published analyses. The default is to avoid publication of analyses that might be challenged. Unfortunately, this means less information and motivation for public decision making. In the case of SLR [sea level rise], risks are not static and indecision is an undesirable response. We believe there is a need for government to develop a tolerance for uncertainty in matters like this.
Various agencies are already acquiring more precise elevation data and hope to establish a national clearinghouse for their combined efforts. But two of them, FEMA and the Army Corps of Engineers, did not co-author the report, formally known as Synthesis and Assessment Product (SAP) 4.1.
Michael MacCracken of the Climate Institute directed the U.S. Global Change Research Program under President Clinton from 1993 to 1997 and oversaw the National Assessment Synthesis Team that produced the 2000 report on the impact of climate change on the United States. He said over email that the report’s main “success is that the report came out, indeed, given the Bush Administration’s aversion to doing anything on impacts. Basically, the report did a serious update of what is understood and what can be done with the available information,” rather than dwelling on missing information or statistical uncertainty and using either to defer action.
In contrast, the report’s greatest flaw was “how the effort was organized,” he added. Asking government agencies to write scientific reports excluded other stakeholders, falsely implied that the scientific community has an official opinion, and presented knowledge as static even though much changes during a protracted reporting process. Indeed, melting ice from Greenland and Antarctica already puts the scenarios in “Coastal Sensitivity” at the low end of newer estimates.
How to improve the process, especially now that both the public and private sectors will have to consider the effects of rising sea level? MacCracken said that assessments should be frequent and “the government and scientific community have to be very careful about making an evaluation of whether the results are well enough known or too uncertain to be of use to stakeholders.” For some stakeholders in certain cases, maybe only 95 percent confidence or better will do; for others, maybe 75 percent is good enough to start. The report does not emphasize that different stakeholders have different expertise, different needs, and different thresholds for acceptable uncertainty. Thus, these stakeholders must be included more than they have been by the Climate Change Science Program so that they can express their needs and understand the full range of available data. And they must also understand what different definitions of, and choices about, uncertainty mean.
Reed, another veteran of the U.S. GCRP National Assessment, agrees that CCSP involved too few people, especially from academia, and should have clarified its audience earlier. Members of CESLAC questioned the audience and purpose of the report as late as March 2008. But CCSP, a small office with one person coordinating much of the work on 21 reports, was probably hard-pressed to advise closely any single Synthesis and Assessment Product. Peter Schultz, director of the CCSP office, is already “thinking about ways to enhance stakeholder engagement,” but the resources and timeline he receives to do so will be decided by the Obama administration.
His office, meanwhile, is ushering highlights from the 21 SAPs, gathered in the Unified Synthesis Product “Global Climate Change in the United States,” through public review. This report, though it prolongs the process, will address the Government Accountability Office’s criticism that 21 separate reports “may be difficult for Congress and others to use.”
Similarly, Titus, Williams, Reed, and MacCracken emphasize that scientists and policymakers should always be communicating, and not just for or through reports. “Let’s put what we know to work, which doesn’t mean giving someone what you know and letting them run with it. It’s not a hand-off,” Reed says. “Instead, we need to cultivate, educate, and appropriately reward academics and people” who forge “new ways of linking science, policy, and research.”
Being certain about each other’s needs and tolerances for uncertainty is one link to forge—not least because big private-sector industries and federal, state, and local officials will soon be demanding some kind of consensus.
Mark Meier, a former teacher and environmental consultant, writes about science and society from Charlottesville, Virginia.
Coastal Sensitivity to Sea-Level Rise: A Focus on the Mid-Atlantic Region, p. 19. Available at http://www.climatescience.gov/Library/sap/sap4-1/final-report/default.htm (last accessed February 3, 2009). Relative sea-level rise incorporates global sea-level rise from melting ice and the expansion of water as it heats plus the effects of subsidence (land sinking, which occurs from extracting oil or water or other causes) or uplift (from plate tectonics). According to Jeff Williams of USGS, areas of Alaska and the Pacific Northwest may actually see decreases in relative sea level, as the land continues to rise, while much of the rest of the country will experience relative rises in sea level.
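The decomposition in this footnote reduces to simple arithmetic: relative rise equals global rise plus land subsidence minus land uplift. A minimal sketch, using illustrative numbers rather than any measurements from the report:

```python
def relative_sea_level_rise(global_rise_mm, subsidence_mm, uplift_mm):
    """Relative (local) sea-level change per year:
    global rise + land sinking - land rising."""
    return global_rise_mm + subsidence_mm - uplift_mm

# A subsiding coast: relative rise outpaces the global average.
print(relative_sea_level_rise(3.0, 1.5, 0.0))   # 4.5 mm/yr

# A rapidly uplifting coast (as in parts of Alaska): relative sea level can fall.
print(relative_sea_level_rise(3.0, 0.0, 10.0))  # -7.0 mm/yr
```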
 Another SAP released January 16, 2009, SAP 5.2 Best Practice Approaches for Characterizing, Communicating and Incorporating Scientific Uncertainty in Climate Decision Making, is a rather philosophical and linguistic treatise on the problem of what different people mean by probabilistic statements.
 CESLAC was the federal advisory committee convened to review drafts of the report. Members, mostly from state or federal agencies and universities, met or had conference calls six times from January 2007 through October 2008. The draft documents it reviewed, its comments on the SAP, and meeting minutes can be found at http://www.environmentalinformation.net/CESLAC.
 Report of the Coastal Elevations and Sea Level Rise Advisory Committee, p.6, available at http://www.climatescience.gov/Library/sap/sap4-1/default.php (last accessed February 1, 2009).
 GAO, Climate Change Assessment: Administration Did Not Meet Reporting Deadline, p. 4, available at www.gao.gov/new.items/d05338r.pdf (last accessed February 3, 2009).