Welcome

Nearly all environmental problems faced today entail some element of uncertainty. Estimates of emissions rates, fate and transport of pollutants, human exposure to pollutants, health effects, costs and benefits of regulatory options, and other attributes of environmental analyses are often fraught with uncertainty. Typically, only limited data are available to support an analysis, if in fact any data are available at all. Yet regulatory decisions must be made by the U.S. Environmental Protection Agency (EPA) and other agencies based on such information.

These decisions should be founded on as complete an assessment of scientific and technical uncertainty as possible, both to permit the identification of strategies that are robust even in the face of uncertainty and to identify priorities for further research. In developing estimates of the values of key quantities in environmental problems, a common approach is to assume a “best guess” point value based on some combination of data and technical judgment. These judgments may be intended to represent neither undue optimism nor pessimism, or they may be intended to incorporate a degree of conservatism. However, the basis for many assumptions, and the scope of thought that went into them, are often not explicitly documented in policy studies. Thus, the degree of confidence that a decision-maker should place in the estimates when evaluating regulatory alternatives is often not rigorously considered.

The most common approach to handling uncertainties is either to ignore them or to use simple “sensitivity” analysis. In sensitivity analysis, the values of one or a few model input parameters are varied, usually from “low” to “high” values, while all other model parameters are held at their “nominal” values, and the effect on a model output is observed. In practical problems with many uncertain input variables, the combinatorial explosion of possible sensitivity scenarios (e.g., one variable “high,” another “low,” and so on) becomes unmanageable. Furthermore, sensitivity analysis provides no insight into the likelihood of obtaining any particular result. Thus, while sensitivity results indicate that a range of possible values may be obtained, they do not provide any explicit indication of how a decision-maker should weigh each possible outcome.
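
As a concrete illustration, the sketch below implements this one-at-a-time approach for a hypothetical exposure model. The model form, parameter names, and low/nominal/high values are illustrative assumptions, not figures from any EPA analysis.

```python
# A minimal sketch of one-at-a-time sensitivity analysis for a
# hypothetical exposure model. The model form, parameter names, and
# ranges below are illustrative assumptions only.

def exposure(concentration, intake_rate, body_weight):
    """Hypothetical average daily dose (mg/kg-day)."""
    return concentration * intake_rate / body_weight

nominal = {"concentration": 0.5, "intake_rate": 2.0, "body_weight": 70.0}
ranges = {
    "concentration": (0.1, 1.5),
    "intake_rate": (1.0, 3.0),
    "body_weight": (50.0, 90.0),
}

print(f"nominal output: {exposure(**nominal):.4f}")

# Vary one input at a time between its "low" and "high" values while
# holding all other inputs at their nominal values.
for name, (low, high) in ranges.items():
    for label, value in (("low", low), ("high", high)):
        inputs = dict(nominal, **{name: value})
        print(f"{name:>13} {label:>4}: output = {exposure(**inputs):.4f}")
```

Even with only three inputs, covering the combined scenarios would require 2^3 = 8 model runs, and nothing in the resulting table indicates how likely any particular outcome is.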

A quantitative approach to uncertainty analysis is proposed as the most appropriate way to deal with uncertainty. Deterministic estimates, based on “best guess” point estimates, are often wrong or misleading in several ways:

  1. they are often biased away from the mean values of the uncertainties they represent;
  2. they provide no indication to decision makers regarding the magnitude of underlying uncertainties;
  3. they provide no insight into the key sources of uncertainty.

Deterministic estimates are not based on complete and quantitative consideration of interactions among multiple, simultaneously uncertain variables, which are especially dangerous to overlook in the case of skewed uncertainties and/or complex nonlinear models. Ignoring or suppressing uncertainty in environmental risk assessment often results in a misleading sense of confidence about numbers. In contrast, quantitative estimates of uncertainty more properly characterize the state of knowledge affecting a regulatory decision, and more properly convey to decision makers the magnitude of the uncertainties in key quantities of interest (e.g., chemical concentrations, exposure levels, emission rates, risks, etc.).
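
To see how a point estimate can mislead when uncertainties are skewed and the model is nonlinear, the sketch below compares a deterministic estimate computed from “best guess” (median) inputs against a Monte Carlo characterization of the output. The lognormal input distributions and the simple concentration model are illustrative assumptions only.

```python
# A minimal sketch contrasting a deterministic "best guess" estimate
# with Monte Carlo propagation of uncertainty. The lognormal inputs
# and the simple nonlinear model are illustrative assumptions only.

import random
import statistics

random.seed(1)

def concentration(emission_rate, dilution):
    # Hypothetical nonlinear model: ambient concentration at a receptor.
    return emission_rate / dilution

# Skewed (lognormal) uncertainty about each input.
def sample_emission():
    return random.lognormvariate(0.0, 0.5)   # median 1.0

def sample_dilution():
    return random.lognormvariate(1.0, 0.8)   # median e, about 2.72

# Deterministic estimate: run the model once at the "best guess"
# (median) value of each input.
point_estimate = concentration(1.0, 2.72)

# Monte Carlo: sample both uncertain inputs simultaneously.
outputs = sorted(concentration(sample_emission(), sample_dilution())
                 for _ in range(20000))

print(f"deterministic point estimate: {point_estimate:.3f}")
print(f"mean of Monte Carlo outputs : {statistics.mean(outputs):.3f}")
print(f"5th to 95th percentile      : "
      f"{outputs[1000]:.3f} to {outputs[19000]:.3f}")
```

Because the inputs are skewed and the model is nonlinear in the dilution term, the mean of the propagated outputs is larger than the estimate obtained by running the model once at the median inputs, and the 5th to 95th percentile range spans more than an order of magnitude of possible outcomes.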

Furthermore, through simple extensions of traditional Monte Carlo techniques, it is possible both to identify the key sources of uncertainty that merit further research and to identify uncertain factors that are unimportant to a given decision. The importance of the latter may be unappreciated unless one considers the number of often useless arguments that fixate on minutiae, especially in emotionally charged policy debates. In the process of identifying the factors that really matter, quantitative uncertainty analysis can lead to more informed and focused policy debates.
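
One simple extension of this kind, sketched below under purely illustrative assumptions (a hypothetical three-input model in which one input is deliberately made unimportant), is to compute rank correlations between each input's Monte Carlo samples and the corresponding model outputs. Inputs whose correlations are near zero are candidates to set aside, while those with large correlations merit further data collection or research.

```python
# A minimal sketch of using Monte Carlo samples to rank uncertain
# inputs by their contribution to output uncertainty, via Spearman
# (rank) correlation. The three-input model and its distributions are
# illustrative assumptions; input "c" is deliberately unimportant.

import random

random.seed(2)

def model(a, b, c):
    return a * b + 0.01 * c

def ranks(values):
    order = sorted(range(len(values)), key=values.__getitem__)
    r = [0] * len(values)
    for rank, index in enumerate(order):
        r[index] = rank
    return r

def pearson(x, y):
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sx = sum((xi - mean_x) ** 2 for xi in x) ** 0.5
    sy = sum((yi - mean_y) ** 2 for yi in y) ** 0.5
    return cov / (sx * sy)

# Sample all inputs simultaneously and record inputs and outputs.
samples = {"a": [], "b": [], "c": []}
outputs = []
for _ in range(5000):
    a = random.lognormvariate(0.0, 0.6)
    b = random.lognormvariate(0.0, 0.3)
    c = random.uniform(0.0, 10.0)
    samples["a"].append(a)
    samples["b"].append(b)
    samples["c"].append(c)
    outputs.append(model(a, b, c))

# Rank correlation between each input and the output indicates which
# uncertain inputs drive output uncertainty and which do not.
output_ranks = ranks(outputs)
for name, values in samples.items():
    print(f"{name}: rank correlation = {pearson(ranks(values), output_ranks):+.2f}")
```

In this constructed example the correlation for input "c" should come out near zero, flagging it as a factor on which further debate or research would be wasted effort.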

——————————————————————————————————————

This fellowship program is administered by AAAS under EPA sponsorship. My assignment was in the Exposure Assessment Group (EAG) of the Office of Health and Environmental Assessment, which is part of the Office of Research and Development (ORD) of EPA in Washington, DC. While in EAG, I worked closely with Paul White and enjoyed many long and thorough discussions with him of key concepts in uncertainty analysis. I benefited from interactions with many other individuals at EPA headquarters in ORD, the Office of Policy, Planning, and Evaluation, the Office of Air and Radiation, and the Office of Solid Waste. Thanks also to Max Henrion of Lumina Decision Systems for his helpful comments. I would like to thank the staff at AAAS for facilitating my stay at EPA and for providing a stimulating orientation and seminar program. In particular, thanks to Chris McPhaul, Claudia Sturges, and Patricia Curlin of AAAS, and to Karen Morehouse of EPA. Finally, while I am grateful for the input of many people at EPA and elsewhere, all of the text that follows is my sole responsibility.

——————————————————————————————————————

by H. Christopher Frey, Ph.D.

AAAS/EPA Environmental Science and Engineering Fellow, Summer 1992

and

Research Associate
Center for Energy and Environmental Studies
Department of Engineering and Public Policy
Carnegie Mellon University
Pittsburgh, PA 15213
