R7 PM Uncertainty Analysis in Risk Assessment: Influences on Decision-making
Thursday, 17 November 2005: 1:50 PM - 5:30 PM in 327-329
727 (FER-1117-839772) The tradeoff between measurement precision and sample size: should we get more or better data?
Start time: 1:50 PM
Ferson, S.1, Kreinovich, V.2, 1 Applied Biomathematics; 2 University of Texas at El Paso
One intuitively expects a tradeoff between the precision and the sample size of measurements. For instance, a unit of additional resources devoted to measurement might be spent either to increase the number of samples or to improve the precision of the individual samples. Many practitioners apparently believe, however, that the tradeoff always favors increasing the number of samples over improving their precision. This belief is understandable given that most of the statistical literature of the last century has focused on assessing and projecting sampling uncertainty, and has in comparison neglected the problem of assessing and projecting measurement uncertainty. The belief is nevertheless mistaken, as straightforward numerical examples easily show. Consider, for example, the problem of conservatively estimating an exposure point concentration (EPC) from sparse and imprecise data. We might use an upper confidence limit on the mean to account for the sampling uncertainty associated with having made only a few measurements. This value is affected by the sample size, but, if the calculation also accounts for the imprecision of the values in a reasonable way, it is affected by the measurement precision as well. Using recent algorithms for computing basic statistics on interval data sets, we consider the EPC and describe a nonlinear tradeoff between precision and sample size. This nonlinearity means that the optimal division of empirical resources between increasing sampling and improving precision depends on the quantitative details of the problem. We describe how an analyst can plan an optimal empirical design.
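The role measurement imprecision plays in such a calculation can be sketched as follows, treating each imprecise measurement as an interval [lo, hi] and bracketing a t-based 95% upper confidence limit (UCL) on the mean by applying it to the endpoint data sets. This endpoint substitution is only an illustrative simplification, not the interval-statistics algorithms the authors cite (exact bounds on the variance of interval data require more care); all numbers are hypothetical.

```python
import math
from statistics import mean, stdev

def ucl95(xs):
    """One-sided 95% upper confidence limit on the mean (t approximation)."""
    n = len(xs)
    # A few one-sided 95% t critical values by df, for the demo only
    t = {4: 2.132, 9: 1.833, 19: 1.729, 29: 1.699}.get(n - 1, 1.645)
    return mean(xs) + t * stdev(xs) / math.sqrt(n)

def interval_ucl95(samples):
    """Bracket the UCL for interval-valued data by applying the same
    formula to the lower- and upper-endpoint data sets."""
    lows = [lo for lo, _ in samples]
    highs = [hi for _, hi in samples]
    return ucl95(lows), ucl95(highs)

# Ten imprecise concentration measurements; interval width reflects
# measurement imprecision, sample count reflects sampling effort
data = [(3.1, 3.9), (4.0, 4.8), (2.2, 3.0), (5.1, 5.9), (3.6, 4.4),
        (2.9, 3.7), (4.4, 5.2), (3.3, 4.1), (2.7, 3.5), (4.8, 5.6)]
lo_ucl, hi_ucl = interval_ucl95(data)
print(lo_ucl, hi_ucl)
```

Tightening the intervals (better precision) narrows the bracket on the UCL, while adding samples shrinks the t-multiplier and standard-error term, which is the tradeoff the abstract describes.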
728 (FRE-1117-841475) Probabilistic Consequence Analysis for Dispersant Use on Oil Spills.
Start time: 2:10 PM
French-McCay, D1, Whittier, N1, Rowe, J1, Aurand, D2, 1 Applied Science Associates, Inc., Narragansett, RI, USA; 2 Ecosystem Management & Associates, Inc., Lusby, MD, USA
The implications of chemical dispersant use were evaluated in an objective manner using modeling to inform decision-makers involved in oil spill response planning. There are many possible spill scenarios that could be modeled, as well as an essentially infinite number of potential spill sites where releases could occur. Thus, to evaluate the likely consequences of hypothetical spills, modeling was performed in probabilistic mode, i.e., by randomly varying the spill date and time, and thus the environmental conditions during and after the release, among the range of conditions that could occur. Spills of commonly shipped crude oils at five representative spill locations in shipping lanes of major US ports (Delaware Bay, Florida Straits, Galveston Bay, San Francisco Bay and Prince William Sound) were modeled with alternative spill response strategies (combinations of mechanical response and dispersant use) to examine potential impacts. The model results were analyzed to estimate the mean, standard deviation (SD), and 5th, 50th and 95th percentile results for surface water and shoreline oiling, water column and sediment contamination, and biological impacts. The modeling shows larger decreases in wildlife and shoreline impacts with dispersant use than increases in water column effects, supporting the contention that there are more opportunities to save wildlife, shorelines, and near-shore sensitive habitats with dispersant use than there are risks of impacting water column biota. If the areas that would be impacted by surface oil are those where wildlife are concentrated, as they typically are near shore, and the water column impacts resulting from dispersant use would occur offshore where water column biota are lower in abundance, the trade-off is weighted still more heavily toward dispersant use before oil comes near the shoreline.
The results quantify the trade-offs between water-column resource and wildlife/shoreline impacts resulting from dispersant use that may be used in decision making, contingency planning and ecological risk assessments.
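The probabilistic-mode structure described above can be sketched as a Monte Carlo loop: draw a random spill date and the environmental conditions that go with it, run the (here, drastically simplified) fate model, and summarize the runs by mean, SD, and percentiles. The `shoreline_oiling` function and its parameters are hypothetical stand-ins for the authors' detailed fate model; only the sampling-and-summary structure is illustrated.

```python
import random
import statistics

def shoreline_oiling(rng):
    """One toy model run: a random spill date and wind condition yield a
    shoreline-oiling outcome (km). Purely illustrative numbers."""
    day = rng.randrange(365)                   # random spill date
    wind_toward_shore = rng.random()           # random wind condition
    seasonal = 1.3 if 150 < day < 250 else 1.0 # hypothetical seasonal factor
    return 40.0 * wind_toward_shore * seasonal

rng = random.Random(1)
runs = sorted(shoreline_oiling(rng) for _ in range(1000))
avg = statistics.mean(runs)
sd = statistics.stdev(runs)
p5, p50, p95 = (runs[int(q * len(runs))] for q in (0.05, 0.50, 0.95))
print(f"mean={avg:.1f}  sd={sd:.1f}  p5={p5:.1f}  p50={p50:.1f}  p95={p95:.1f}")
```

Running the same loop under each response strategy (mechanical only, mechanical plus dispersant) and comparing the resulting percentile tables is what supports the trade-off comparison in the abstract.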
729 (TOL-1117-851660) Deriving Localized Bioaccumulation Models for Setting Site-Specific Water Selenium Concentration Benchmarks for Pollution Control Decisions.
Start time: 2:30 PM
Toll, John1, Brix, K.2, DeForest, D.3, Tear, L.4, Adams, W.5, 1 Toll Environmental, Seattle, WA, USA; 2 EcoTox, Coral Gables, FL, USA; 3 Parametrix, Bellevue, WA, USA; 4 , Seattle, WA, USA; 5 Rio Tinto, Murray, UT, USA
We have developed a procedure for determining site-specific water quality benchmarks for substances regulated based on tissue residues. The method uses a multi-site regression model to solve for the conditional prior probability density function on water concentration, given that tissue concentration equals a tissue-based water quality criterion. It then uses site-specific water and tissue concentration data to calibrate the probabilistic model and identify the water concentration that, if met at the site, would provide a desired level of confidence of meeting the tissue-based criterion. This allows for derivation of a site-specific water quality benchmark. The procedure is fully reproducible, statistically rigorous and easily implemented. We have applied it to selenium. The current draft aquatic life criteria for selenium would set the freshwater chronic criterion based on a whole body fish tissue concentration. If the measured tissue concentration data indicate that the use is not protected, back-calculation of a water concentration benchmark could be triggered to support pollution control decisions. Our procedure provides a way to do that.
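The back-calculation step can be illustrated with a simple log-log bioaccumulation regression. Assuming a fitted model ln(tissue) = a + b·ln(water) + e with e ~ N(0, sigma²), choosing the water concentration so that the predicted tissue concentration stays below the criterion with roughly 95% confidence (z = 1.645) gives ln(water) = (ln(Ct) − a − z·sigma) / b. The coefficients below are hypothetical placeholders, not the authors' calibrated selenium model.

```python
import math

def water_benchmark(a, b, sigma, tissue_criterion, z=1.645):
    """Back-calculate a site-specific water benchmark from a log-log
    bioaccumulation regression so the tissue-based criterion is met with
    the confidence implied by z (1.645 ~ 95%, one-sided)."""
    return math.exp((math.log(tissue_criterion) - a - z * sigma) / b)

# Hypothetical site-calibrated fit: intercept, slope, residual SD;
# tissue criterion in ug/g whole-body fish tissue
bench = water_benchmark(a=1.2, b=0.8, sigma=0.35, tissue_criterion=8.0)
print(round(bench, 2))
```

A larger residual SD (more scatter between water and tissue at the site) pushes the benchmark lower, which is the sense in which the procedure converts a desired level of confidence into a water concentration.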
Start time: 3:10 PM
731 (ROS-1117-728177) Assessing uncertainty when modeling fate, exposure and toxic effects of thousands of chemicals.
Start time: 3:50 PM
Rosenbaum, R1, Pennington, D2, Jolliet, O1, 1 Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland; 2 European Commission - Joint Research Centre, Ispra, Italy
When assessing thousands of chemicals, the scarcity of uncertainty distributions for substance properties as model input parameters demands practical solutions for assessing the uncertainty of toxicological effect indicators in (comparative) environmental risk assessment. An approach implemented in the IMPACT 2002 multimedia model provides uncertainty information related to impact characterization factors for aquatic ecosystems and human health. For fate and human exposure, it relies on a practical method of adopting certainty values for the intake fraction per cluster of chemicals as a function of emission medium, dominating exposure pathway, and model robustness towards the respective chemical. Multiplying and dividing the geometric-mean point estimate by such a value provides estimates of the upper and lower bounds of the 95% confidence interval. These certainty values can be based on insights from techniques such as Monte Carlo simulation for representative chemicals, avoiding the need to repeat time-consuming Monte Carlo calculations with every model run. Using expert judgment, the uncertainty of the human health effect factor is similarly defined and readily combined, through addition, with that of the intake fraction. The uncertainty for ecotoxicological effects is currently related to the number of aquatic species tested in the freshwater column: the more species test results are available, the more robust the estimate of the ecotoxicological factor is assumed to be. The presented approach proved to be transparent, robust while reflecting the current level of knowledge, quick to use, and easily applied in practice, for example to combine the uncertainties of an emission inventory with those of the impact assessment in a comparative assessment study.
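The certainty-value arithmetic can be sketched in a few lines. The interval [gm/k, gm·k] follows directly from the abstract; the `combine_factors` rule, adding log-variances as for independent lognormal factors, is a plausible reading of the "combined through addition" step and should be treated as an assumption. The cluster value k = 30 is hypothetical.

```python
import math

def ci_from_certainty_factor(gm, k):
    """95% confidence interval implied by a certainty value k applied to a
    geometric-mean point estimate gm: [gm / k, gm * k]."""
    return gm / k, gm * k

def combine_factors(k1, k2):
    """Combine two independent (lognormal) certainty values by adding
    their log-variances: ln(k)^2 = ln(k1)^2 + ln(k2)^2. Assumed reading
    of the abstract's 'combined through addition'."""
    return math.exp(math.sqrt(math.log(k1) ** 2 + math.log(k2) ** 2))

# Hypothetical cluster (emission to air, inhalation-dominated): k = 30
lo, hi = ci_from_certainty_factor(gm=1e-6, k=30)
# Combine with a hypothetical effect-factor certainty value of 5
k_total = combine_factors(30, 5)
print(lo, hi, k_total)
```

The appeal of the scheme is that the expensive Monte Carlo runs are done once per cluster of chemicals; every subsequent model run only multiplies and divides.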
732 (VON-1117-724195) Admitting to Uncertainty Undermines the Decision and other Uncertain Myths.
Start time: 4:10 PM
von Stackelberg, KE1, 1 Menzie-Cura & Associates, Inc., Winchester, MA, USA
The goal of uncertainty analysis is to make the risk assessment process more transparent by acknowledging and, to the extent possible, quantifying the inherent uncertainties. Unfortunately, it sometimes has just the opposite effect. Uncertainty analyses, particularly (multi-dimensional) probabilistic analyses, can be difficult to communicate and consequently to understand. It can appear as though every parameter is more uncertain than not, making it difficult to justify a decision and defend it publicly, and obscuring the real message. However, not acknowledging the uncertainties inherent in any analysis is at best foolish and at worst dangerous, because it provides a false sense of confidence that this is "the number". Identifying the sources and magnitudes of uncertainty lets decision-makers determine whether additional information should be obtained before making a decision, and provides a quantitative context for individual results. This paper presents the advantages of uncertainty analysis, offers strategies for overcoming "uncertainty analysis paralysis," and identifies specific questions managers and decision-makers should ask regarding any analysis.
733 (ROS-1117-709605) Probabilistic risk assessment of urban wet-weather discharges.
Start time: 4:30 PM
Krejci, V.1, Fankhauser, R.1, Chèvre, N.1, 1 Swiss Federal Institute of Aquatic Science and Technology, Duebendorf, Switzerland
Urban wet-weather discharges, such as combined sewer overflows (CSOs) and stormwater discharges, can impair the receiving water in different ways: acute chemical pollution, contamination of sediments, etc. The main problems in assessing the risk of these discharges are linked to the random character of the cause (i.e., the rainfall) and to the dynamic behavior of pollutants during rain events. Furthermore, many uncertainties are associated with the different pollution sources and transport processes in the sewer system and in the receiving waters. In this study, we propose a stochastic and probabilistic approach that takes the variability and the uncertainty of urban wet-weather discharges into account to estimate the concentrations of pollutants resulting from urban discharges. These concentrations are compared with ecotoxicologically based environmental quality criteria to avoid acute (ammonia toxicity, effects of turbidity and adsorbed compounds, dissolved oxygen) and chronic (accumulation of contaminated sediments from urban wet-weather discharges) risks for the aquatic ecosystem. The approach has been implemented in dedicated software, REBEKA II, to illustrate this new way of handling probabilities in risk assessment. The model parameters are specified through probability distributions, and a Monte Carlo procedure is used with 10 years of rain data for each run. The results are expressed as a probability of fulfilling the different acute and chronic criteria. Discussions with authorities and engineers show a surprisingly good acceptance of this new approach: although the first presentation of the concept generated heavy criticism from practitioners, regular publication in specialized journals, discussions, workshops and courses contributed to the acceptance of the method. In the near future, we expect that all new infrastructure for wet-weather control in Switzerland will be planned on the basis of this stochastic-probabilistic approach to risk assessment.
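The output format the abstract describes, a probability of fulfilling a criterion, can be sketched as follows. The event model, distributions, and the 0.4 mg/L acute ammonia criterion are illustrative placeholders, not REBEKA II's actual parameterization; only the Monte Carlo structure (uncertain inputs per event, comparison to a criterion, frequency of compliance) is shown.

```python
import random

ACUTE_NH4_CRITERION = 0.4  # mg/L; hypothetical acute quality criterion

def cso_event_ok(rng):
    """One simulated overflow event: draw uncertain inputs, compute an
    in-stream ammonia concentration, test it against the criterion."""
    runoff = rng.lognormvariate(0.0, 0.5)      # relative event runoff volume
    conc_sewer = rng.lognormvariate(1.5, 0.4)  # NH4 in overflow (mg/L)
    dilution = rng.uniform(5, 50)              # river dilution at the outfall
    c_stream = conc_sewer * runoff / dilution
    return c_stream < ACUTE_NH4_CRITERION

rng = random.Random(7)
n = 5000
p_ok = sum(cso_event_ok(rng) for _ in range(n)) / n
print(f"probability of fulfilling the acute criterion: {p_ok:.2f}")
```

Expressing results this way lets planners compare infrastructure alternatives by how often each would meet the acute and chronic criteria, rather than by a single worst-case concentration.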
734 (GID-1117-850289) Uncertainty Analysis in Pesticide Registration - Is It Working?
Start time: 4:50 PM
Giddings, Jeffrey1, Warren-Hicks, William2, 1 Parametrix, Rochester, MA, USA; 2 EcoStat, Inc., Mebane, NC, USA
Classical and Bayesian probabilistic techniques are being used in many countries as tools for uncertainty analysis in the ecological risk assessment of pesticides. Both the US EPA and the European Commission are developing guidance on the tiered use of probabilistic methods for pesticide risk assessment. But are these approaches working? Are probabilistic outputs contributing significantly to registration decisions? By what criteria can we judge the regulatory usefulness and scientific acceptability of probabilistic methods? We will explore these questions by examining recent successes and failures through interviews with regulators, pesticide registrants, and academic scientists. Case studies illustrating the use of probabilistic methods in pesticide registration will support the discussion.
735 (BRI-1117-724889) Managing Uncertainty to Make Risk-Informed Decisions About Contaminated Sediments.
Start time: 5:10 PM
Bridges, T1, von Stackelberg, K2, Vorhees, D2, Butler, C2, Cura, J2, Greges, M3, Reiss, M4, 1 United States Army Engineer Research and Development Center, Vicksburg, MS, USA; 2 Menzie-Cura & Associates, Inc., Winchester, MA, USA; 3 US Army Corps of Engineers, NY District, New York, NY, USA; 4 US EPA, Region 2, New York, NY, USA
This paper will describe how the regulatory community concerned with placing sediments at the Historic Area Remediation Site (HARS) in the NY Bight is incorporating site-specific information and uncertainty into its decision-making approach. Limited and uncertain data often challenge regulators in reaching credible conclusions about the extent, magnitude and management of risks. One response to this challenge has been to invest in additional data collection and research. This investment provides additional insight into the processes relevant to understanding risk, but still leaves unresolved uncertainties; managing and addressing this residual uncertainty represents a significant challenge to decision-making. At the HARS, past regulatory evaluations for sediment placement consisted of comparing bioaccumulation test results to deterministically derived tissue-based guidelines. More recently, regulators have considered risks to fish and humans from contaminant bioaccumulation and trophic transfer. Recognizing important uncertainties in these evaluations, regulators commissioned two site-specific studies: a fish tagging study to learn more about how fish behavior may affect their exposures, and a creel survey to refine fish ingestion rates for the angler population fishing near the HARS. Refined and updated estimates of these and other inputs were used to develop a remodeled approach for evaluating sediments based on estimating risks from cumulative contaminant effects. This approach factors uncertainty and variability in input parameters into bioaccumulation and risk modeling using two-dimensional Monte Carlo analysis. Model outputs are included in a decision-support tool that allows decision-makers to explore and factor uncertainty into their conclusions about risks and their decisions as to how to manage those risks.
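The two-dimensional Monte Carlo structure can be sketched as nested loops: the outer loop samples *uncertain* quantities (e.g., the mean angler ingestion rate that a creel survey would inform), the inner loop samples *variability* across individual anglers. Every distribution and constant below is a hypothetical placeholder, not the HARS model; the point is how the two dimensions are kept separate so that uncertainty about the exceedance fraction can itself be reported.

```python
import random
import statistics

def two_d_monte_carlo(rng, n_outer=200, n_inner=500):
    """Outer loop: uncertainty (parameters we could pin down with more
    study). Inner loop: variability (real differences among anglers).
    Returns, per outer draw, the fraction of anglers whose dose exceeds
    a hypothetical reference dose."""
    fractions = []
    for _ in range(n_outer):
        mu = rng.gauss(20.0, 4.0)               # uncertain mean intake (g/day)
        tissue = rng.lognormvariate(-1.0, 0.3)  # uncertain tissue conc (mg/kg)
        exceed = 0
        for _ in range(n_inner):
            ir = max(0.0, rng.gauss(mu, 10.0))  # variable individual intake
            dose = ir * 1e-3 * tissue / 70.0    # mg/kg-day, 70-kg angler
            exceed += dose > 1e-4               # hypothetical reference dose
        fractions.append(exceed / n_inner)
    return fractions

rng = random.Random(42)
fr = sorted(two_d_monte_carlo(rng))
print(f"exceedance fraction: median={statistics.median(fr):.2f}, "
      f"95th pct of uncertainty={fr[int(0.95 * len(fr))]:.2f}")
```

The spread across outer draws is what a decision-support tool can surface: it tells decision-makers how much of the apparent risk is reducible by further study (e.g., a better creel survey) versus irreducible variability among anglers.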