Fundamentals of Risk Analysis and Risk Management

Edited by
Vlasta Molak
President, GAIA UNLIMITED, Inc.
Cincinnati, Ohio

LEWIS PUBLISHERS
Boca Raton   New York   London   Tokyo
Publisher: Joel Stein
Project Editor: Carole Sweatman
Marketing Manager: Greg Daurelle
Direct Marketing Manager: Arline Massey
Cover Design: Denise Craig
PrePress: Carlos Esser
Manufacturing: Sheri Schwartz
Library of Congress Cataloging-in-Publication Data

Molak, Vlasta.
  Fundamentals of risk analysis and risk management / Vlasta Molak.
    p. cm.
  Includes bibliographical references and index.
  ISBN 1-56670-130-9 (alk. paper)
  1. Technology—Risk assessment. I. Title.
  T174.5.M64 1996
  363.1—dc20        96-19681
                    CIP
This book contains information obtained from authentic and highly regarded sources. Reprinted material is quoted with permission, and sources are indicated. A wide variety of references is listed. Reasonable efforts have been made to publish reliable data and information, but the authors, editor, and the publisher cannot assume responsibility for the validity of all materials or for the consequences of their use.

Neither this book nor any part may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, microfilming, and recording, or by any information storage or retrieval system, without prior permission in writing from the publisher.

All rights reserved. Authorization to photocopy items for internal or personal use, or the personal or internal use of specific clients, may be granted by CRC Press, Inc., provided that $.50 per page photocopied is paid directly to Copyright Clearance Center, 27 Congress Street, Salem, MA 01970 USA. The fee code for users of the Transactional Reporting Service is ISBN 1-56670-130-9/97/$0.00+$.50. The fee is subject to change without notice. For organizations that have been granted a photocopy license by the CCC, a separate system of payment has been arranged.

CRC Press, Inc.'s consent does not extend to copying for general distribution, for promotion, for creating new works, or for resale. Specific permission must be obtained from CRC Press for such copying. Direct all inquiries to CRC Press, Inc., 2000 Corporate Blvd., N.W., Boca Raton, Florida 33431.

© 1997 by CRC Press, Inc.
Lewis Publishers is an imprint of CRC Press

No claim to original U.S. Government works
International Standard Book Number 1-56670-130-9
Library of Congress Card Number 96-19681
Printed in the United States of America 1 2 3 4 5 6 7 8 9 0
Printed on acid-free paper
Foreword

My Uncle Steve, who worked on one of the government's first computers, had his own mathematical system wherein he calculated the probability of a horse winning a race. Sometimes Uncle Steve won money on the horses. Sometimes he lost money on the horses. All of his winning and losing was done very scientifically: studying The Daily Racing Digest, calculating the odds according to dependent variables (such as the track records of the stable, the trainer, the jockey, the horse, and the length of the race), and assigning proper weight to intervening variables (such as the condition of the track and the weather at the time of the race). He did well.

My Aunt Betty, who also did well at the track, used the time-honored "Hunch System of Equine Competition," also known as intuition. "I've just got a feeling that this horse is due," she would say to me during our frequent summer visits to Thistledown. All this risk taking with money, whether through science or intuition, can best be summed up by the immortal tout who once said: "Ya places yer bets and ya takes yer chances." And then there was Betty and Steve's younger brother, Frank (my father), who never bet on the horses because he believed all horse races were fixed.

Risk analysis and risk management are, for most people, much more lofty and consequential than the outcome of a horse race. Nevertheless, Uncle Steve and Aunt Betty's track assessment styles came to my mind when a nuclear scientist testifying before our Ohio Senate Energy and Environment committee claimed a planned multistate radioactive waste dump would be of little risk to Ohio. I thought of Uncle Steve and how he would have demanded the industry's past track record in containing nuclear waste. I thought of Aunt Betty and what her instincts would have told her about whether it was the right time to bet on a long shot named Glows in the Dark. I thought of my father and his wariness about the fix being in. Thus I came to vote against Senate Bill 19.

Informed opinions by the highly educated and much lettered are available to support nearly every point of view. Human decision-making is a terribly complicated matter. We all want to make the best decision. We would hope that the best decision is made on the basis of the best available information. Often it is. Sometimes it is not. In the chain reaction of real-world decision-making, science collides with economics, which collides with politics, and the decision rests with whichever body of knowledge is (accidentally) left standing.

Vlasta Molak has gathered together the works of some of the most impressive authors of papers on risk analysis and risk management in the world. Her writings and her compilation of the work of so many leading scientists in one complete volume are a public service, in that they enable both novice and expert to ponder the many and diverse factors that are at work in assessing, analyzing, and managing risk. This book will be useful to legislators (local, state, and federal) and their staffs in devising better laws to protect the public, encourage responsible business development, and increase profits – rather than using risk analysis to promote the status quo or reduce environmental safeguards. Several chapters that deal with economics and risk analysis have convinced me that being PRO-working average person and PRO-environmental protection is NOT
being ANTI-business. On the contrary, responsible and effective business organizations profit from a loyal, well-trained work force and from reasonable, smart environmental regulations that encourage efficiency and nonpollution. Numerous studies cited in this book demonstrate that application of the most enlightened environmental management increases profits (since pollution is equivalent to wasted resources), and thus that fiscal conservatism and emphasis on private property rights also mean increased environmental protection. Only in an unenlightened society are environmental safeguards mistakenly considered as opposed to business interests and free markets. Better business with a cleaner environment is the paradigm for the 21st century. The old paradigm "business vs. environment" needs to be retired. Fundamentals of Risk Analysis and Risk Management will help raise this awareness and finally bury the old, nonproductive paradigm, which has been one of the major sources of controversy in our legislative process.

I would recommend this book to my colleagues, who are often involved in designing very complex environmental and occupational protection laws, as a reference and as a useful book to increase their analytical skills in dealing with the complexity of legislation, regulations, risk-benefit analysis, and risk management. Also, the wealth of references provided in this book can help us better understand how our laws affect our environmental and occupational safety and health, and ultimately our quality of life.

Senator Dennis Kucinich
Ohio State Senator
Preface

The idea for this book started as a consequence of my directing and teaching a one-day course on "Fundamentals of Risk Analysis" at the annual meetings of the Society for Risk Analysis (1991, 1992, and 1994). Also, teaching a course at the United Nations Division for Sustainable Development, New York, on "Use of Risk Analysis in Sustainable Development", and teaching a course on "Environmental Risk Assessment and Management" at the University of São Paulo and the University of Mato Grosso, Cuiabá, Brazil, made me aware of the need for a reference that would give students a comprehensive overview of the field and lead them to valuable references if they wanted to increase their knowledge in specific aspects of risk analysis. Moreover, my position as Secretary of the Society for Risk Analysis (from 1989 to 1994) convinced me that there is a great need for integrating the rapidly expanding field of risk analysis and risk management, and for providing a common language for all the practitioners and members of this varied interdisciplinary professional group. The last few years have witnessed the concepts of risk analysis and risk management permeating public discussion, often confusing decision makers and the public. When Lewis Publishers called me in 1995, after having seen the title of the course I taught at the SRA Annual Meeting in December 1994, and asked me to write a book on the subject of risk analysis and risk management, I decided that the need for such a book was overwhelming, and that providing such a book would be a worthwhile project.

Since no single person could accomplish the monumental task of integrating the diverse fields of risk analysis and risk management, I asked my colleagues to help me write the chapters on topics in which they were recognized experts in their particular practice of risk analysis and risk management. Most of them graciously agreed, or gave up under my incessant prodding. Some of them cancelled at the last moment, but I was fortunate to find new authors who were not intimidated by the task. With the miracle of the Internet, I was able to bring in several authors from different parts of the world to help expand our understanding of how risk analysis is practiced around the world. After almost two years of work, we have completed the task of producing this book of 26 chapters, in which we cover the fundamentals of what is known as risk analysis and risk management in the contemporary western world. Most chapters also provide a summary and questions, with answers, to be used as tools in teaching courses in risk analysis. The glossary should also be helpful both to students and practitioners of risk analysis. Finally, the index should make it easier to focus on a particular area of the reader's interest. The addresses of the co-authors are given for easy access by those readers and students of risk analysis who may have questions. The e-mail addresses of some of the authors should be particularly useful for further communication.

I want to thank all of the 20 co-authors who have graciously accepted the task of making their chapters understandable to an educated general reader, while at the same time providing references and in-depth discussion for those who want a more detailed understanding. My work and discussions with them were very enlightening and fun. They have done an excellent job in educating me about aspects of risk
analysis of which I was not aware, and in helping to deepen my understanding of different applications of risk analysis. Also, I want to thank Brian Lewis, who asked me to do this book before selling his company, Lewis Publishers, to CRC Press. My thanks go to the professionals at CRC Press, who have been very helpful in explaining the "nuts and bolts" of publishing and who have been encouraging in finishing this work. Finally, I want to thank my daughter, Yelena, and Ohio State Senator Dennis Kucinich for their review of some of my chapters and for useful discussions and suggestions. They brought to my attention the broader implications of the topics in this book for real life and for political functioning, in which risk analysis and risk management have become household words, frequently used without ever being properly defined and understood.

Any mistakes found in this book are mine and unintentional, and I would appreciate it if readers would bring them to my attention. We hope that this book will be a useful guide to all who want to improve their knowledge in confronting the dangers of living, and particularly to those who make decisions that affect public safety and the general safety of this planet. The increased awareness and application of risk analysis and risk management can improve our understanding of the dangers that we face on our life journey and help us make better choices.

Vlasta Molak
The Editor

Dr. Vlasta Molak is the International Coordinator and former Secretary of the Society for Risk Analysis (SRA). In 1989 she convened an international communication network to promote uses of risk analysis in solving some of the environmental problems resulting from misuse of technology. On her several trips to Eastern Europe and the former Soviet Union, Dr. Molak initiated activities to start chapters of the SRA in Prague (Czech Republic), Zagreb (Croatia), Osijek (Croatia), Warsaw (Poland), Budapest (Hungary), Moscow (Russia), and Kharkov (Ukraine) with interested scientists, engineers, and policy makers in those countries. Dr. Molak represented the U.S. at a four-day workshop on "How to improve environmental awareness of local decision makers in Eastern Europe," sponsored by the European Commission.

Dr. Molak taught in a training program in Brazil, organized by the Tufts University Environmental Management Program, at the University of Cuiabá and the University of São Paulo. The subject was "Environmental Risk Assessment and Risk Management" for professionals involved in Brazilian environmental management. She also taught a course at the United Nations headquarters (New York) on "The Use of Risk Analysis in Sustainable Development."

Dr. Molak is the founder and president of the Biotechnology Forum, Inc. in Cincinnati and chairs the Subcommittee for Technical Interpretation of the Local Emergency Planning Committee for Hamilton County, Ohio. Under her leadership, the Biotechnology Forum has organized a series of lectures and workshops. One of the workshops, "The Alaska Story: In the Context of Oil Spill Problems in the Marine Environments," with special emphasis on the biological cleanup efforts, resulted in proceedings edited by Dr. Molak. As chair of the Subcommittee for Technical Interpretation, Dr. Molak initiated the efforts for hazard analysis in Hamilton County, Ohio, and formulated the strategy for hazard analysis. She was a member of the Planning Committee for Comparative Risk Analysis for Hamilton County (Cincinnati, Ohio) and a member of the Quality of Life Committee of the Ohio Comparative Risk Analysis Project. She is presently coordinating efforts to deal with more complex aspects of chemical safety: process safety in manufacturing, transportation of hazardous materials, and adverse effects of routine chronic releases of toxic chemicals.

Dr. Molak has worked at the U.S. Environmental Protection Agency and the National Institute for Occupational Safety and Health (NIOSH) on developing methodologies for risk analysis of toxic chemicals. These methodologies are used to derive various environmental and occupational criteria. Dr. Molak also worked for a private environmental consulting company and now is the founder and president of GAIA UNLIMITED, Inc., her own consulting company dealing with environmental and occupational risk assessment, risk management, and general
environmental problems, including strategies for pollution prevention. She teaches various courses on risk analysis (including courses for local and state governments). She is also developing the AGENDA 21 PROGRAM as a dean at Athena University, which is based entirely on the Internet; it is intended to be a fully accredited program promoting ideas and operational skills necessary for sustainable development. Her training is interdisciplinary: she has a B.S. in physical engineering, an M.S. in chemistry, a Ph.D. in biochemistry, and postdoctoral training in molecular genetics. Dr. Molak is a Diplomate of the American Board of Toxicology (DABT).
Contributors

Joseph Alvarez, Ph.D., Auxier & Associates, Parker, Colorado 80134. E-mail: [email protected]
Vicki M. Bier, Ph.D., Department of Industrial Engineering and Department of Nuclear Engineering and Engineering Physics, University of Wisconsin–Madison, Madison, Wisconsin 53706. E-mail: [email protected]
William E. Dean, Ph.D., Private Consultant, Sacramento, California 95814. E-mail: [email protected]
Paul F. Deisler, Ph.D., Private Consultant, Austin, Texas 78703
Jeffrey H. Driver, Ph.D., Technology Sciences Group, Inc., Washington, D.C. 20036. E-mail: [email protected]
Paul K. Freeman, J.D., The ERIC Group, Inc., Englewood, Colorado 80112
B. John Garrick, Ph.D., PLG, Inc., Newport Beach, California 92660. E-mail: [email protected]
Herman J. Gibb, Ph.D., National Center for Environmental Assessment, U.S. Environmental Protection Agency, Washington, D.C. 20460. E-mail: [email protected]
P. J. (Bert) Hakkinen, Ph.D., Department of Risk, Policy, and Regulatory Sciences, The Procter and Gamble Company, Ivorydale Technical Center, Cincinnati, Ohio 45224. E-mail: [email protected]
Barbara Harper, Ph.D., DABT, Department of Health Risk, Pacific Northwest Laboratory, Richland, Washington 99352. E-mail: [email protected]
Peter Barton Hutt, LL.M., Covington & Burling, Washington, D.C. 20044
Howard Kunreuther, Ph.D., Center for Risk Management and Decision Processing, Wharton School, University of Pennsylvania, Philadelphia, Pennsylvania 19104. E-mail: [email protected]
Robert T. Lackey, Ph.D., Environmental Research Laboratory, U.S. Environmental Protection Agency, Corvallis, Oregon 97333. E-mail: [email protected]
Howard Latin, J.D., John J. Francis Scholar, Rutgers University School of Law at Newark, Newark, New Jersey 07102. E-mail: [email protected]
Terence L. Lustig, Ph.D., Environmental Management Pty. Ltd., Kensington, NSW, Australia. E-mail: [email protected]
Stuart C. MacDiarmid, Ph.D., Department of Regulatory Authority, Ministry of Agriculture, Wellington, New Zealand. E-mail: [email protected]
Vlasta Molak, Ph.D., GAIA UNLIMITED, Inc., Cincinnati, Ohio 45231. E-mail: [email protected]
Alexander Shlyakhter, Ph.D., Department of Physics, Harvard Center for Risk Analysis, Harvard University, Cambridge, Massachusetts 02138. E-mail: [email protected]
Paul Slovic, Ph.D., Decision Research, Eugene, Oregon 97401. E-mail: [email protected]
James A. Swaney, Ph.D., Department of Economics, Wright State University, Dayton, Ohio 45435. E-mail: [email protected]
David Vose, M.Sc., Risk Analysis Services, Wincanton, Somerset, United Kingdom BA9 9AP. E-mail: [email protected]
Gary K. Whitmyre, M.A., Technology Sciences Group, Inc., Washington, D.C. 20036. E-mail: [email protected]
Richard Wilson, Ph.D., Department of Physics, Harvard Center for Risk Analysis, Harvard University, Cambridge, Massachusetts 02138. E-mail: [email protected]
Rae Zimmerman, Ph.D., Robert F. Wagner Graduate School of Public Service, New York University, New York, New York 10003. E-mail: [email protected]
Dedication
This book is dedicated to my dear husband, Peter, and our children, Yelena, Ina, and Allen, and to my friends who have helped expand my view of the universe and of the impending dangers we all must confront to make our world a better place in which to live. Special gratitude is extended to Yelena and to my friend, Dennis, whose help came when it was most needed.
Contents

Foreword by Ohio State Senator Dennis Kucinich
Preface
The Editor
Contributors
Dedication

Introduction and Overview
Vlasta Molak

I. THEORETICAL BACKGROUND OF RISK ANALYSIS

Chapter I.1 Toxic Chemicals Noncancer Risk Analysis and U.S. Institutional Approaches to Risk Analysis
Vlasta Molak

Chapter I.2 Epidemiology and Cancer Risk Assessment
Herman J. Gibb

Chapter I.3 Uncertainty and Variability of Risk Analysis
Richard Wilson and Alexander Shlyakhter

Chapter I.4 Monte Carlo Risk Analysis Modeling
David Vose

Chapter I.5 An Overview of Probabilistic Risk Analysis for Complex Engineered Systems
Vicki M. Bier

Chapter I.6 Ecological Risk Analysis
Robert T. Lackey

Chapter I.7 The Basic Economics of Risk Analysis
James A. Swaney

II. APPLICATIONS OF RISK ANALYSIS

Chapter II.1 Assessment of Residential Exposures to Chemicals
Gary K. Whitmyre, Jeffrey H. Driver, and P. J. (Bert) Hakkinen

Chapter II.2 Pesticide Regulation and Human Health: The Role of Risk Assessment
Jeffrey H. Driver and Gary K. Whitmyre

Chapter II.3 Ionizing Radiation Risk Assessment
Joseph L. Alvarez

Chapter II.4 Use of Risk Analysis in Pollution Prevention
Vlasta Molak

Chapter II.5 Integrated Risk Analysis of Global Climate Change
Alexander Shlyakhter and Richard Wilson

Chapter II.6 Computer Software Programs, Databases, and the Use of the Internet, World Wide Web, and Other Online Systems
P. J. (Bert) Hakkinen

III. RISK PERCEPTION, LAW, POLITICS, AND RISK COMMUNICATION

Chapter III.1 Risk Perception and Trust
Paul Slovic

Chapter III.2 The Insurability of Risks
Howard Kunreuther and Paul K. Freeman

Chapter III.3 Setting Environmental Priorities Based on Risk
Paul F. Deisler, Jr.

Chapter III.4 Comparative Risk Analysis: A Panacea or Risky Business?
Vlasta Molak

Chapter III.5 Environmental Justice
Rae Zimmerman

Chapter III.6 Law and Risk Assessment in the United States
Peter Barton Hutt

Chapter III.7 Science, Regulation, and Toxic Risk Assessment
Howard Latin

IV. RISK MANAGEMENT

Chapter IV.1 Risk Management of the Nuclear Power Industry
B. John Garrick

Chapter IV.2 Seismic Risk and Management in California
William E. Dean

Chapter IV.3 Sustainable Management of Natural Disasters in Developing Countries
Terence L. Lustig

Chapter IV.4 Risk Analysis, International Trade, and Animal Health
Stuart C. MacDiarmid

Chapter IV.5 Incorporating Tribal Cultural Interests and Treaty-Reserved Rights in Risk Management
Barbara L. Harper

Chapter IV.6 Global Use of Risk Analysis for Sustainable Development
Vlasta Molak

Conclusion
Vlasta Molak

Answers to Questions
Glossary
Introduction and Overview

Vlasta Molak
We are all more or less successful risk assessors and managers if we are still alive. Life is intrinsically filled with dangers, real or perceived. Planes may explode and go down either because of terrorist activities or safety rule violations, a nuclear power plant may blow up (Chernobyl) or release radioactive clouds (Three Mile Island), a chemical plant may release toxic gas (Bhopal), or a natural disaster (hurricane, flood, tornado, volcano, landslide) can strike the area in which we live. We may get acute food poisoning from either bacterial or chemical contamination, or we may suffer from chronic diseases that are in part caused by the food choices we make. Whether we are crossing the street, making investments, deciding what to eat, how to get from one place to another, choosing our profession, or getting married, we are making our decisions based on evaluating the risks and benefits that a particular activity or its avoidance would bring us. The purpose of this book is to improve our analytical techniques for evaluating dangers and to develop skills in confronting them.
1. DEFINITION OF RISK ANALYSIS

We can define risk analysis as a body of knowledge (methodology) that evaluates and derives the probability of an adverse effect of an agent (chemical, physical, or other), industrial process, technology, or natural process. The definition of an "adverse effect" is a value judgment. It could be defined as death or disease (in most cases of human health risk analysis); it could be the failure of a nuclear power plant, a chemical plant accident, or a loss of invested money. In some recent cases of risk analysis, even vaguely defined terms such as "quality of life" or "sense of community" have been evaluated using risk analysis. Traditionally, most risk assessments (risk analysis applied in a particular situation) deal with health effects or, more recently, with ecological health or economic well-being (in the case of business risk
analysis). Although there are many types of risk analysis, some common elements are necessary to qualify the process as risk analysis, particularly when dealing with the potential health effects of toxic chemicals. Those elements are (NAS 1983):

1. Hazard (agent) identification
2. Dose-response relationship (how the quantity, intensity, or concentration of a hazard is related to the adverse effect)
3. Exposure analysis (who is exposed? to what and how much? for how long? what other exposures?)
4. Risk characterization (reviews all of the previous items and makes calculations based on data, with all the assumptions clearly stated; often the conclusion is that more data and/or improvement in methodology is needed and that no numerical risk number can be derived to express accurately the magnitude of risk)
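To make these four elements concrete, here is a minimal sketch, in Python, of a noncancer screening calculation of the kind discussed in Chapter I.1. The chemical, the reference dose, and the exposure figures are hypothetical values chosen only for illustration; they are not data from this book.

```python
# Minimal illustration of the four risk analysis elements (NAS 1983)
# for a noncancer hazard. All numbers are hypothetical.

# 1. Hazard identification: the agent of concern.
chemical = "hypothetical solvent XYZ"

# 2. Dose-response: a reference dose (RfD), the daily intake (mg per kg
#    body weight per day) believed to be without appreciable risk.
rfd_mg_per_kg_day = 0.005

# 3. Exposure analysis: estimate the chronic daily intake from drinking water.
water_conc_mg_per_L = 0.02   # concentration in drinking water
intake_L_per_day = 2.0       # default adult water consumption
body_weight_kg = 70.0        # default adult body weight
cdi = water_conc_mg_per_L * intake_L_per_day / body_weight_kg

# 4. Risk characterization: compare the exposure with the dose-response
#    benchmark. A hazard quotient above 1 flags a potential concern.
hazard_quotient = cdi / rfd_mg_per_kg_day
print(f"{chemical}: CDI = {cdi:.5f} mg/kg/day, hazard quotient = {hazard_quotient:.2f}")
```

A quotient below 1 in such a screening calculation does not prove safety; as later sections of this chapter note, the uncertainty built into a reference dose already spans an order of magnitude or more.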
Deciding WHAT is an adverse effect (and to some extent hazard identification) is a value judgment that can be made by well-informed citizens. The consideration of the other components of risk analysis is a complex process which, in order to be properly conducted, requires extensive training. Just as one would not want to have surgery performed by an untrained layman, risk analysis may be a risky business if performed by untrained people. Because of its interdisciplinary nature and complexity, risk analysis requires an appropriate amount of time to evaluate all pertinent data, even when one deals with problems of lesser complexity. We are constantly performing risk analysis and risk management in everyday situations, such as observing traffic when planning to cross the street or driving. However, in more complex situations where we may be exposed to toxic substances, radiation, or the possibility of a nuclear power plant disaster, formal risk analysis may be necessary in order to derive reasonable (and sometimes optimal) recommendations for the most appropriate risk management.
2. PURPOSE OF THIS BOOK

This book provides a comprehensive overview of risk analysis and its applications to a broad range of human activities. The editor and co-authors seek to bridge the gap between theory and application and to create a common basic language of risk analysis. They hope that the material in this book will provide a common knowledge base for risk analysts, which can be expanded according to their specific interests and fields of study by using the references provided in each chapter. The co-authors are experienced and recognized practitioners in the various types of risk analysis and risk management.

The intended readers are scientists, engineers, lawyers, sociologists, politicians, and anyone interested in gaining an overview of risk analysis, wanting to become proficient in speaking the basic language of risk analysis, and understanding its applications in difficult risk management decisions. This book can be used as a textbook and reference for undergraduate, graduate, and other training courses in risk analysis. Also, the editor hopes that it will be used by legislators and their aides
(local, state, and federal) to devise better laws to protect the public and to encourage responsible business development and profit increases, rather than using risk analysis to promote the status quo or reduce environmental safeguards. Several chapters demonstrate that application of the most enlightened environmental management increases profits (since pollution is equivalent to wasted resources). Thus, fiscal conservatism and emphasis on private property rights also mean increased environmental protection. Only in an unenlightened society are environmental safeguards mistakenly considered as being opposed to business interests and free markets. Better business with a cleaner environment is the paradigm for the 21st century. The old paradigm "business vs. environment" needs to be retired.

The book is divided into four sections. Section I, Theoretical Background of Risk Analysis, consists of chapters demonstrating the scientific basis of risk analysis, types of risk analysis, and basic concepts. Chapters in this section discuss toxic chemicals risk analysis, epidemiological risk analysis, uncertainty and variability of risk analysis, Monte Carlo risk analysis modeling, probabilistic risk analysis of complex technological systems, ecological risk analysis, and the basic economics of risk analysis. Section II, Applications of Risk Analysis, demonstrates applications of risk analysis to real-life situations. Examples come from agriculture (application of pesticides), indoor exposures, promoting pollution prevention, global climate change, etc. A chapter on computer software programs and use of the Internet in risk analysis is also included. Section III, Risk Perception, Law, Politics, and Risk Communication, deals with differences between public perception of risks, scientific risk analysis and its legal applications, and how to communicate risks to those who may be affected. This section also has two chapters dealing with setting environmental priorities and comparative risk analysis, and a chapter on environmental justice. A chapter on the insurability of risks deals with societal response to various risks of living. Section IV, Risk Management, illustrates the use of risk analysis in devising better risk management in handling technologies (e.g., nuclear power plants) or general everyday environmental problems. Other chapters deal with the management of natural risks such as earthquakes and floods and with the cleanup of radioactive hazardous waste sites on an Indian reservation. The final chapter integrates a worldview as seen by a risk analyst (Vlasta Molak, the editor). The conclusion summarizes the topics elaborated in the chapters and suggests how the practice of risk analysis affects social management of environmental problems in view of the recent controversies over risk-benefit analysis applications in legislative proposals and regulations in the U.S.
3. HISTORICAL OVERVIEW OF RISK ANALYSIS

A historical perspective on risk analysis applications in society was given by Covello and Mumpower (1985). Around 3200 B.C. in the Tigris-Euphrates valley, a group called the Asipu served as risk analysis consultants for people making risky, uncertain, or difficult decisions. Greeks and Romans observed causal relationships between exposure and disease: Hippocrates (4th century B.C.) correlated occurrence of diseases with environmental
exposures; Vitruvius (1st century B.C.) noticed lead toxicity; and Agricola (16th century A.D.) noticed the correlation between occupational exposure to mining and health. Modern risk analysis has roots in probability theory and the development of scientific methods for identifying causal links between adverse health effects and different types of hazardous activities: Blaise Pascal introduced probability theory in 1657; Edmond Halley proposed life-expectancy tables in 1693; and in 1792, Pierre Simon de Laplace developed a true prototype of modern quantitative risk analysis with his calculations of the probability of death with and without smallpox vaccination. With the rise of capitalism, money use, and interest rates, there was an increased use of mathematical methods dealing with probabilities and risks. For example, the risk of dying was calculated for insurance purposes (life-expectancy tables). Later physicians and observers also correlated exposures to chemicals or agents with health: John Evelyn (1620–1706) noticed that smoke in London caused respiratory problems, and Percivall Pott (1775) correlated scrotal cancer in chimney sweeps with their occupational exposure to soot.
4. RISK MANAGEMENT

Insurance, which started 3900 years ago in Mesopotamia, is one of the oldest strategies for dealing with risks. In 1950 B.C., the Code of Hammurabi formalized bottomry contracts containing a risk premium for the chance of loss of ships and cargo. By 750 B.C., the Greeks also practiced bottomry. In 1583, the first life insurance policy was issued in England. In contemporary society, insurance has developed to deal with a wide variety of phenomena associated with adverse effects, from health insurance to mortgage insurance. Actuaries (people who calculate insurance premiums based on historical losses and estimates of future income from premiums and losses) are probably the best risk assessors, since failure to make accurate predictions about losses and premium income can result in the loss of the business. Companies with bad actuaries go bankrupt (see Chapter III.2).

Government interventions to deal with natural or manmade hazards are recorded in all great civilizations. In order to manage air pollution from burning coal in London, King Edward (1285) issued an order forbidding the use of soft coal in kilns, after an unsuccessful attempt to reduce its use voluntarily. Perhaps we can learn from this historical example that "voluntary" reduction in risks from pollution, and from technological risks in general, is best achieved by designing and enforcing intelligent environmental and occupational laws. Carrots and sticks may be more effective in dealing with environmental and occupational risks (accidents or pollution) than either sticks or carrots alone! Thus, while we may choose to believe that industries and individuals sincerely have the public good in mind when dealing with industrial production, pollution, and waste management, it is helpful to have laws and regulations to ensure responsible behavior in cases where promises are not kept because budgetary constraints have pushed environmental considerations out of the picture. The irony is that in most cases improvement in environmental management also improves the bottom line in the long run and often in the short run. Thus, budgetary
constraints should encourage environmental protection and pollution prevention, since they save money for the company and save on public health and litigation costs! However, as the great physicist Max Planck said, "The new ideas do not win by the strength of their logic, but because their opponents eventually die!" Hopefully, the idea of pollution prevention and safe environmental management, as one of the most obvious ways to improve profits, will prevail before all of its opponents die!

Water and garbage sanitation in the 19th and 20th centuries were extremely successful in decreasing the risk of mortality and morbidity, as were building and fire codes; boiler testing and inspection; and safety engineering on steamboats, railroads, and cars. A whole field of risk management was developed based on common-sense risk analysis, which increased longevity and generally improved the quality of life for most citizens in the developed world.
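The actuarial logic described earlier in this section, setting a premium from historical loss frequency and severity, can be sketched in a few lines. The loss probability, claim size, and loading factor below are invented for illustration only.

```python
# Toy actuarial premium calculation. All inputs are hypothetical.
annual_loss_probability = 0.002   # chance that a policyholder suffers a loss in a year
average_claim = 150_000.0         # average cost of a claim, in dollars
expense_and_risk_loading = 0.35   # markup for expenses, profit, and uncertainty

expected_annual_loss = annual_loss_probability * average_claim
premium = expected_annual_loss * (1.0 + expense_and_risk_loading)
print(f"Expected annual loss:     ${expected_annual_loss:,.2f}")
print(f"Indicated annual premium: ${premium:,.2f}")
```

An insurer that systematically underestimates either the loss probability or the claim severity collects premiums that cannot cover its losses, which is the bankruptcy risk the text attributes to companies with bad actuaries.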
5. MODERN RISK ANALYSIS

Conceptual development of risk analysis in the United States and other industrially developed countries (referred to by the United Nations as "North") started from two directions: (1) with the development of nuclear power plants and concerns about their safety (this problem led to the development of classical probabilistic risk analysis) and (2) with the establishment of the U.S. Environmental Protection Agency (EPA), Occupational Safety and Health Administration (OSHA), National Institute for Occupational Safety and Health (NIOSH), and equivalent governmental agencies in other developed countries. These organizations developed in response to rapid environmental degradation caused by indiscriminate use of pesticides and industrial pollution, and to a public outcry triggered by the publication of Rachel Carson's book, Silent Spring.

Modern industrial society underwent changes that must be factored into risk analysis and management associated with industrial development. However, one should keep in mind that in the underdeveloped countries (referred to as "South" by the United Nations) one still deals with infectious diseases, malnutrition, and other diseases of preindustrial society, in addition to environmental degradation due to either overpopulation or rapid, unregulated industrial development. In the North, the following applies for modern risks:

1. A shift in the nature of risks from infectious diseases to degenerative diseases
2. New risks, such as nuclear plant accidents, radioactive waste, releases of pesticides and other chemicals, oil spills, chemical plant accidents, ozone depletion, acid rain generation, and global warming
3. Increased ability of scientists to measure contamination
4. Increased number of formal risk analysis procedures capable of predicting risks a priori
5. Increased role of governments in assessing and managing risks
6. Increased participation of special interest groups in societal risk management (industry, workers, environmentalists, and scientific organizations), which increases the necessity for public information
7. Increased citizen concern and demand for protection
Risk analysis can help manage technology in a more rational way, promote the sustainability of conditions desirable for societies, and help eliminate conditions detrimental to the well-being of humans and ecosystems. However, in each particular case of risk assessment, the assumptions and uncertainties have to be clearly spelled out. All the models used in performing risk analysis have to indicate the assumptions and uncertainties in their conclusions. Formal risk analysis can be organized into the following types (Figure 1):

1. Noncancer chemicals risk analysis
2. Carcinogen risk analysis
3. Epidemiological risk analysis (which could include both cancer and noncancer chemicals or other nonchemical hazards, such as accidents, electromagnetic radiation, nutrition, etc.)
4. Probabilistic risk analysis associated with nuclear power plant safety and chemical plant safety
5. A posteriori risk analysis, which is applied in actuarial science to predict future losses, either from natural phenomena, investments, or technology
6. Nonquantitative risk analysis, or "common sense" risk analysis, which can give only vague patterns of possible risks
Chapters in Section I of this book will deal with these types of risk analyses and their limitations.
Figure 1. Schematic representation of the types of risk analysis.
For noncarcinogenic chemicals, it is assumed that an adverse effect occurs only if exposure to the chemical exceeds a threshold. Risk analysis is used both for establishing criteria and standards for chemicals in the environmental media and for evaluating risks in particular cases of exposures to toxic chemicals (such as contaminated water, soil, or air in the vicinity of a pollution source or evaluations of Superfund sites). It is assumed that there is no probability of harm if the exposure is below such a threshold. Criteria are based mostly on animal studies, and risk analysis methods deal with extrapolations from animal to human, from short-range to long-range exposures, and with similar scientific issues that require expert judgments and cannot be neatly put into a formula. Uncertainty in the derived criteria and standards is usually one to two orders of magnitude.

Risk analysis for establishing criteria for toxic substances is probabilistic only in the case of carcinogens. The probability of developing cancer or a cancer potency slope as a result of exposure to a particular level or concentration of a chemical is derived by modeling from animal data. Depending on the model applied, a variety of results may be obtained. Probabilities of developing cancer or other diseases can also be obtained from epidemiological research correlating exposures to toxic substances with the development of cancer or other types of diseases.

Epidemiological risk analysis deals with establishing correlations or causal relationships between exposures to chemicals or physical agents and diseases. Most frequently, retrospective, cohort, and mortality studies of occupational groups are used for assessing cancer risk. Standard morbidity or mortality ratios can be regarded as an increase in probability of a health risk with exposure. However, because of the large uncertainty in estimating exposure, the results of the epidemiological studies are combined with studies in animals, in order to confirm the causal relationship between exposures to an agent (carcinogen) and cancer.

Probabilistic risk analysis is applied to industrial process safety and nuclear plant safety (fault-tree and failure-tree analysis). The probability of an adverse outcome (failure of a component or a system) of a series of interconnected events is obtained by evaluating probabilities of failures of individual components. These probabilities are obtained either based on historical data or on assumptions of failure. Once a probability of failure of a chemical process is established, one can apply chemical risk analysis to establish the severity of consequences of a release of a particular toxic substance. This type of probabilistic risk analysis was the beginning of the modern discipline of risk analysis, when atomic energy promised a new way of tapping into an almost limitless energy resource. Until Chernobyl (see Chapter IV.1), the risk analysis numbers were very clear indicators of its safety. Chernobyl and the problems with disposing of radioactive waste from nuclear reactors demonstrated again that the technology that initially promised to be a panacea may not be all that was promised. Thus, it may be wise to be cautious when promoting technological fixes.

Based on historical data, one can establish probabilities of adverse effects from natural phenomena (earthquakes, floods, etc.) or types of human activities (transportation accident rates, number of smokers with lung cancer, acute pesticide poisonings, etc.).
This type of risk analysis is used extensively by insurance industries
to establish insurance rates. Economic risk analysis could also be regarded as belonging to this category, because adverse economic effects are obtained from known prices of wasted chemicals and other costs associated with pollution (cost of cleanup of hazardous waste sites, legal costs, medical costs to society, etc.). Some recent phenomena are not yet quantifiable. For example, risks from acid rain are not yet easily amenable to numerical analysis, nor are the risks from global warming. Therefore, one can only establish qualitative risks until more data are obtained to perform quantitative risk analysis. However, one should keep in mind that in the study of such complex phenomena we may never have sufficient data for accurate predictions, and therefore we must base our risk management decisions on prudence.
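As a toy illustration of the probabilistic risk analysis just described, the sketch below combines hypothetical component failure probabilities through simple AND/OR logic. The two-pump configuration and the numbers are invented for illustration and are not taken from any chapter of this book.

```python
# Toy fault-tree style calculation with hypothetical failure probabilities.
# The system fails if the power supply fails OR both redundant pumps fail.
p_power_failure = 1e-3   # probability that the power supply fails (per demand)
p_pump_failure = 5e-2    # probability that a single pump fails (per demand)

# AND gate: both pumps must fail (failures assumed independent).
p_both_pumps_fail = p_pump_failure ** 2

# OR gate: union of the two ways the system can fail.
p_system_failure = 1 - (1 - p_power_failure) * (1 - p_both_pumps_fail)

print(f"P(both pumps fail) = {p_both_pumps_fail:.2e}")
print(f"P(system failure)  = {p_system_failure:.2e}")
```

Real probabilistic risk analyses, as Chapter I.5 discusses, must also account for dependencies between subsystems (common-cause failures), which is exactly where the independence assumption in this sketch breaks down.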
6. LIMITATIONS OF RISK ANALYSIS

Each chapter in this book elaborates on topics in which the definition of risk analysis may vary, depending on the application. The reader will notice the wide diversity of definitions and the controversy, which indicate that, unlike in the physical sciences, there is much uncertainty associated with any risk analysis (assessment). While risk analysis may be a useful tool to evaluate relatively simple risks (such as health risks from toxic substances in a particular exposure scenario) and to compare them with alternative risks if different human actions were taken (e.g., replacement of particular chemicals or industrial processes and technology), it may be dangerous to apply it to more complex phenomena in order to derive definitive risk rankings or risk management plans. Thus, risk analysis should be applied with caution to real-life problems, keeping in mind its limitations. Caution may be even more critical in risk-benefit analysis, where calculations of benefits may be even more uncertain and dependent on various underlying assumptions (see Chapter I.7).

The Nobel Laureate economist Dr. Friedrich Hayek expressed the dangers of applying science that deals with "essentially complex phenomena" (such as risk analysis or economics) to sweeping policy decisions (Hayek 1991). His assessment of economics could be translated into a cautionary note on risk analysis:

There is as much reason to be apprehensive about long-run dangers created in a much wider field, by the uncritical acceptance of assertions which have the appearance of being scientific. There are definite limits to what we can expect science to achieve. This means that to entrust to science — or to deliberate control according to scientific principles — more than scientific method can achieve may have deplorable effects. This insight will be especially resisted by all who have hoped that our increasing power of prediction and control, generally regarded as the characteristic result of scientific advance, applied to the process of society, would soon enable us to mold it entirely to our liking. Yet the confidence in the unlimited power of science is only too often based on a false belief that the scientific method consists in the application of a ready-made technique, or in imitating the form rather than the substance of scientific procedure, as if one needed only to follow some cooking recipes to solve all social problems.
The current controversy among industry, government, and environmentalists about the use of risk analysis follows the previous reasoning. Many environmentalists regard risk analysis as a devious tool used by industry to maintain the status quo ("proving" that something is NOT dangerous) and totally deny its usefulness, while industry and governmental agencies in increasing numbers want to base all decisions on the results of risk analysis. While it is true that risk analysis may be used by both sides in an issue to justify their actions, often based on some rather questionable numerical values, risk analysis could be useful to point out the dangers of pursuing one or another course of action. The most important thing is to always make risk assessment transparent to the public, with all the assumptions and parameters clearly stated. The thought process that goes into evaluating a particular hazard is more important than the application of some sophisticated mathematical technique or formula, which often may be based on erroneous assumptions or models of the world. The controversy about the requirement for risk-benefit analysis before any law is enacted may lead legislators into total regulatory deadlock, which may leave the public unprotected, even in obvious cases of environmental abuse.

Risk analysis can, under some circumstances, make general predictions about the outcome of our decisions; sometimes we can only obtain a very rough feeling about the possible outcomes. While in the physical sciences the predictions are usually very accurate, in risk analysis our predictions could have a range of several orders of magnitude. If we were to build a bridge based on an assumed average value for the weight put on it when, in reality, the weight may vary by one to two orders of magnitude, we would soon experience a collapse if we did not allow ample margin for uncertainty and caution. The best we can hope for in applying risk analysis to the complex problems that we face today (such as environmental exposure to chemicals and radiation, the ozone hole, resource depletion, soil loss, global warming, etc.) is to ascertain patterns that could be useful for risk management. The numbers derived by risk analysis are at best crude and often misleading if the uncertainty associated with them is not clearly spelled out. We could compare the risks of different cleanup methods at hazardous waste sites, or the risks of using different types of energy or transportation, with more certainty than we could predict global warming phenomena. Risk analysis can help us predict general economic, ecological, and human health impacts of certain decisions (e.g., whether to use public transport modes or personal cars, nuclear energy, coal-powered plants, or conservation) (see Chapter IV.6), which then could help create more livable, equitable, and sustainable societies.

Compared with the accurate predictions we can get in the physical sciences, this sort of mere pattern prediction is not satisfying. However, to pretend that we possess the knowledge and power to shape the processes of society entirely to our liking, knowledge which in the real world we do NOT possess, is likely to make us do a great deal of harm. As Dr. Hayek pointed out:

The recognition of the insuperable limits to his knowledge ought indeed to teach the student of society a lesson in humility which should guard him against becoming an
accomplice in man's fatal striving to control society — a striving which makes him not only a tyrant over his fellows, but may make him destroy a civilization which no brain has designed but which has grown from the FREE efforts of millions of individuals.
REFERENCES

Carson, R. L. 1987. Silent Spring. Boston, MA: Houghton Mifflin. 1–448.
Covello, V. T. and Mumpower, J. 1985. Risk analysis and risk management: a historical perspective. Risk Analysis 5(2):103–120.
Hayek, F. A. 1991. Economic Freedom. Oxford, UK, and Cambridge, MA: Basil Blackwell Ltd. p. 287.
NAS. 1983. Risk Assessment in the Federal Government: Managing the Process. Washington, DC: National Academy Press.
Answers

CHAPTER I.1

1. There are two types of noncancer chemical risk analysis uses: to derive criteria and standards for various environmental media, and to characterize the risks posed by a specific exposure scenario, e.g., at a Superfund site, by drinking contaminated water, by consuming contaminated food, by performing some manufacturing operations, by accidental or deliberate spill or release of chemicals, etc. Usually such exposure scenarios are complex and vary with each individual case; thus, methods in risk analysis must be modified to account for all possible exposures in a given situation.

2. Chemical risk analysis used for criteria development generally does not determine the probability of an adverse effect. Rather, it establishes concentrations of chemicals that could be tolerated by most people in their food, water, or air without experiencing adverse health effects in either short-term or long-term exposures (depending on the type of derived criterion). These levels (either concentrations of chemicals in environmental media or total intake of a chemical by one or all routes of exposure) are derived using point estimates of average consumption of food and drink, and of body parameters such as weight, skin surface, metabolic rate, etc.

3. Permissible Exposure Limits (PELs) are regulated by OSHA.

4. Exposure assessment is the determination of the fate of the chemical in the environment and its intake by humans. Ideally, by analyzing the environmental fate and transport of chemicals, and by evaluating food intakes, inhalation, and possible dermal contacts, one can assess the total quantities of toxic chemicals in an exposed individual or population which may cause adverse health effects. In criteria derivation, one uses either the worst-case exposure scenario or the most probable exposure scenario, and point values for various human parameters.

5. The reference dose (RfD), previously known as the acceptable daily intake (ADI), is defined as the total daily dose of a chemical (in mg/kg of body weight) that would be unlikely to cause adverse health effects even after a lifetime of exposure. An RfD for a chemical is "an estimate (with uncertainty spanning perhaps an order of magnitude) of a daily or continuous exposure to the human population (including sensitive subgroups) that is likely to be without appreciable health risk." RfDs are established from all available toxicological data for several hundred chemicals, particularly those associated with the Toxics Release Inventory (TRI). The RfDs and the risk assessment methodologies used for their derivation are available on-line in the Integrated Risk Information System (IRIS).
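To show how the uncertainty factors discussed in the next answers enter the arithmetic, here is a minimal sketch of an RfD-style derivation. The NOAEL and the particular factors of 10 are hypothetical and are not values taken from the book.

```python
# Hypothetical derivation of a reference dose (RfD) from an animal study.
noael_mg_per_kg_day = 5.0   # no-observed-adverse-effect level (hypothetical)

# Uncertainty factors, each usually a multiple of 10:
uf_animal_to_human = 10     # interspecies extrapolation
uf_human_variability = 10   # sensitive subgroups within the human population
uf_database_gaps = 10       # incomplete toxicological database

rfd = noael_mg_per_kg_day / (uf_animal_to_human * uf_human_variability * uf_database_gaps)
print(f"RfD = {rfd:.3f} mg/kg/day")   # 5 / 1000 = 0.005 mg/kg/day
```

The stacked factors of 10 are what give derived criteria the one-to-two order-of-magnitude uncertainty mentioned in the Introduction.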
6. An uncertainty factor is a number (usually a multiple of 10) that is used as a multiplier or divisor in order to derive more protective (conservative) criteria. Its function is to ensure safety when criteria are derived from incomplete sets of data, or when there is uncertainty in choosing the model for that particular risk assessment.

7. Criteria are calculated by using the best toxicological studies on a particular toxic substance and models of average exposures in daily situations. Appropriate uncertainty factors are also used.

8. Since the contamination of the groundwater is only half of the 10-day health advisory level, the water should not be recommended for regular drinking use unless the chemical can evaporate (by boiling or letting the water stand). Without knowledge of other properties of the given chemical XYZ and knowledge of its fate and transport in soil, as well as its RfD, we cannot derive further recommendations about this chemical, except that caution should be practiced in handling the contaminated soil.

9. If the bioaccumulation factor is 20, then the concentration of the chemical in a mature fish would be 100 mg/kg of fat in the fish. Since the RfD is only 2 mg/day, it would be unwise to eat the fish for an extended period of time even in very small quantities, since even 20 g of fish (fat) would contain 2 mg. Especially dangerous is the fact that the chemical is likely to bioaccumulate in human tissues as well.
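The arithmetic behind answer 9 can be written out explicitly. The source concentration of 5 mg/kg is inferred from the stated bioaccumulation factor of 20 and the 100 mg/kg fish result; the 20 g of fish fat per day is the quantity used in the answer.

```python
# Worked arithmetic for the bioaccumulation example in answer 9.
bioaccumulation_factor = 20.0
source_concentration = 5.0       # mg/kg, inferred from 100 mg/kg divided by 20
fish_fat_concentration = bioaccumulation_factor * source_concentration  # 100 mg/kg fat

rfd_mg_per_day = 2.0             # reference dose stated in the answer
fat_eaten_kg_per_day = 0.020     # 20 g of fish fat per day
daily_dose = fish_fat_concentration * fat_eaten_kg_per_day

print(f"Fish fat concentration: {fish_fat_concentration:.0f} mg/kg")
print(f"Daily dose from 20 g of fish fat: {daily_dose:.1f} mg "
      f"({daily_dose / rfd_mg_per_day:.0%} of the RfD)")
```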
CHAPTER I.2

1. National Research Council (NRC), Risk Assessment in the Federal Government: Managing the Process, National Academy Press, Washington, D.C., 1983.
   National Research Council (NRC), Science and Judgment in Risk Assessment, National Academy Press, Washington, D.C., 1994.
   Office of Science and Technology Policy, Chemical carcinogens: review of the science and its associated principles, 1985, Federal Register 50:10372–10442.
   U.S. Environmental Protection Agency, The risk assessment guidelines of 1986, Federal Register 51:33992–34005.
   U.S. Environmental Protection Agency, Proposed guidelines for carcinogen risk assessment, 1996, Federal Register 61(79):17960–18011.

2. Analytical and Descriptive. Analytical studies consider individual exposure. The two approaches to analytical studies are cohort and case-control methods. The cohort method identifies groups of exposed and nonexposed individuals and studies the difference in disease occurrence between the two groups. The case-control method compares persons with the disease and persons without the disease for differences in exposure and other factors. Descriptive studies are analyses of disease rates in groups of exposed and nonexposed persons. The primary difference between analytical and descriptive studies is that analytical studies consider individual exposure while descriptive studies consider measures of exposure for a whole group. An example of a descriptive study might be a correlation of esophageal cancer mortality rates among countries with the per capita alcohol consumption of those countries. Such a study might find a positive correlation, but it is unknown whether those who developed esophageal cancer actually consumed alcohol.

3. Meta-analysis is the comparing and synthesizing of studies dealing with similar health effects and risk factors. Its utility is that it can be used to formally examine sources of heterogeneity, clarify the relationship between environmental exposures
and health effects, and generate information beyond that provided by individual studies or a narrative review.

4. Biomarkers are generally considered to include: (1) biomarkers of effect (biologic evidence that damage has occurred); (2) biomarkers of susceptibility (biological evidence that the individual may have heightened disease susceptibility, which could be inherited or acquired); (3) biomarkers of internal dose (e.g., tissue level of a carcinogen); and (4) biomarkers of biologically effective dose (e.g., DNA adducts).

5. Questions include: Could the study have detected an increase in cancer risk, i.e., was the sample size large enough? Could the results of the study have been due to chance, bias, or confounding? Was cancer latency addressed? How was exposure determined? In a cohort study, was follow-up of cohort members adequate?

6. Temporality — the disease has to occur within a biologically reasonable time after initial exposure. Consistency — the same result occurs in multiple studies. Magnitude of the association — the risk is large and precise. Biological gradient — the risk is found to increase as the exposure increases. Specificity of the association — the likelihood of a causal interpretation is increased if a particular form of cancer is related to exposure in several studies (e.g., asbestos exposure and mesothelioma, cigarette smoking and lung cancer). Biological plausibility — the association makes sense with respect to metabolism, pharmacokinetics, etc. Coherence — the cause and effect are in logical agreement with everything known about the agent, exposure to the agent, and the disease. (Note: None of the criteria are considered conclusive by themselves, and the only criterion that is essential is the temporal relationship.)
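As a small numerical companion to answer 2's cohort versus case-control distinction, the sketch below computes a relative risk and an odds ratio from a hypothetical 2 × 2 table; the counts are invented for illustration.

```python
# Hypothetical 2 x 2 table of exposure versus disease.
#                                diseased  healthy
exposed_cases, exposed_noncases = 30, 970
unexposed_cases, unexposed_noncases = 10, 990

# Cohort-style measure: relative risk (ratio of incidence proportions).
risk_exposed = exposed_cases / (exposed_cases + exposed_noncases)
risk_unexposed = unexposed_cases / (unexposed_cases + unexposed_noncases)
relative_risk = risk_exposed / risk_unexposed

# Case-control-style measure: odds ratio (cross-product of the table).
odds_ratio = (exposed_cases * unexposed_noncases) / (unexposed_cases * exposed_noncases)

print(f"Relative risk: {relative_risk:.2f}")   # 3.00
print(f"Odds ratio:    {odds_ratio:.2f}")      # about 3.06
```

With an uncommon disease, as here, the odds ratio closely approximates the relative risk, which is one reason case-control studies can stand in for cohort studies when the disease is rare.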
CHAPTER I.3 No exact answers; questions are creative thoughts.
CHAPTER I.4 1. The development of a risk analysis model offers several benefits, including: a greater awareness of uncertainty and risk for all those involved in the development of the model; the creation of a more blameless environment for the discussion and management of risk; the identification of “opportunities”, i.e., events that may or may not occur but would accrue benefit to the group should they occur; and finally, the development of more informed and balanced decisions and risk reduction strategies. 2. Monte Carlo risk analysis modeling is superior to more traditional single point (deterministic) modeling as it can incorporate all identified uncertainties and risks and thus facilitate more informed decisions. The great advantages of Monte Carlo modeling over other risk analysis modeling techniques (e.g., algebraic solutions
and method of moments) are that building up a Monte Carlo model is very intuitive and this type of model allows for the inclusion of complex stochastic relationships between the model's variables with a minimum of effort.
3. 81.5%, zero, 97.5%.
4. Bounded distributions do not allow scenarios that extend a variable beyond the minimum to maximum range defined by the expert. Thus, if the expert underestimates the extent to which a value may vary (as very often happens) the model will have an unrealistically narrow range and fail to convey the extremes that the future might hold.
5. BetaPERT distributions have a smaller spread than triangle distributions. The distribution of a total project cost will therefore also have a smaller spread if BetaPERT distributions are used instead of triangles. Thus, the risk contingency will be smaller since the difference between the mean and 85th percentile will be smaller.
6. No specific answer; examples and correlations will vary according to workplace.
7. Mean: £, mode: £, median: £, standard deviation: £, variance: £², skewness: unitless; kurtosis: unitless.
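The contrast drawn in answer 5 between triangular and BetaPERT inputs can be checked directly by simulation. The following sketch samples both distributions with the same minimum, most likely, and maximum values and compares the mean-to-85th-percentile contingency; the cost figures are invented, and the BetaPERT is built here from the standard PERT shape parameters.

import random
import statistics

random.seed(1)
low, mode, high = 80.0, 100.0, 150.0   # invented cost estimate, in thousands of pounds
n = 100_000

tri = [random.triangular(low, high, mode) for _ in range(n)]

# BetaPERT: a Beta(a, b) variate rescaled to [low, high], with the usual PERT shape parameters
a = 1 + 4 * (mode - low) / (high - low)
b = 1 + 4 * (high - mode) / (high - low)
pert = [low + (high - low) * random.betavariate(a, b) for _ in range(n)]

for name, sample in (("triangular", tri), ("BetaPERT", pert)):
    sample = sorted(sample)
    mean = statistics.fmean(sample)
    p85 = sample[int(0.85 * n)]
    print(f"{name:10s} mean={mean:6.1f} sd={statistics.stdev(sample):5.1f} "
          f"85th percentile={p85:6.1f} contingency={p85 - mean:5.1f}")

Run as written, the BetaPERT column shows the smaller standard deviation and the smaller contingency, which is the point made in answer 5.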
CHAPTER I.5 1. Probabilistic risk analysis was developed to facilitate the quantification of risks associated with complex engineered systems. It is particularly well suited to analyzing the frequencies of extremely rare events, such as core melts in nuclear reactors or chemical plant accidents, for which little if any accident data will be available. 2. To simplify the difficult and complicated task of system design, system design is generally done by specifying the boundary conditions under which each subsystem is expected to operate (e.g., sources of electric power, cooling water, etc.), and performing detailed engineering design of each subsystem individually. Thus, the dominant sources of risk often arise from interactions between subsystems (e.g., situations in which one subsystem fails, and thus changes the environment faced by other subsystems), since such interactions may be overlooked in the ordinary engineering design process. 3. Quantitative estimates of accident frequencies or probabilities provide a more rigorous basis for evaluating the cost-effectiveness of alternative risk reduction actions, and for determining the relative importance of different risk contributors. 4. Without information on risk contributors, the only decisions available after the PRA is completed will be either to accept the status quo and continue operating, or to shut down the system (typically at great cost). Information on risk contributors can help facility owners and operators make good decisions about system design modifications, operations, and maintenance. 5. (1) What can go wrong?; (2) How likely is it to go wrong?; and (3) What will be the consequences if it does? 6. Hierarchical models provide a way of structuring the vast quantities of information that go into a risk analysis. 7. Event trees are well suited for displaying the order of events, displaying dependencies between events (e.g., the fact that the failure probability of subsystem B may depend on the status of subsystem A), and facilitating communication about
the assumptions made in the risk model (e.g., presenting a risk model to plant staff for review and discussion). However, because combinations of subsystem successes and failures are explicitly shown, event tree models can rapidly become extremely large, including literally billions of sequences. Fault trees provide a more compact way of representing large numbers of events, but can obscure dependencies and the chronological order of events.
8. In the large fault-tree approach, the sequences that are important to risk can be easily remembered and understood, thereby facilitating communication. By contrast, in the large event-tree approach, the individual split fraction models are relatively simple, and the failure probability of a particular subsystem will generally not depend on the specific causes of other subsystem failures or top events earlier in the event tree.
9. Grouping redundant components or trains under a single top event helps to ensure that the various top events in the event tree will be conditionally independent of each other, and makes it possible to model common-cause failure within an individual top event, rather than between different top events. Placing causally dependent events to the right of the events that influence them also helps to minimize complex dependencies between the various top events. Placing more severe events toward the left side of the event tree often makes it possible to prune the event tree by eliminating unnecessary branches, which can greatly reduce the number of accident sequences that must be represented in the event tree. Finally, putting the top events in chronological order can help to facilitate communication about the assumptions made in the risk model.
10. Data are needed on initiating event frequencies, component failure rates, common-cause failure rates, component maintenance frequencies and durations, component fragilities, and human error rates. Success or exposure data are also needed; for example, the number of hours of operation or experience over which the observed failures occurred, and the number of demands experienced by standby or by cycling components (such as thermostatically controlled heating or cooling systems). Possible data sources include maintenance requests, corrective action reports, significant event reports, anomaly reports, plant or mission logs, test results, and case histories, in addition to expert opinion and published or computerized databases.
11. The analyst must specify the level of detail of the analysis, the components of interest, the database study period, the relevant failure modes for each component, and the appropriate units for each failure mode (e.g., hours vs. demands). The analyst must also decide whether to pool information for similar components (which can increase the total amount of information that is available for the analysis, but can lead to misleading results if the components are not sufficiently similar), whether to use test data on partial failures, and how to account for corrective actions.
12. The application of PRA has been successful in risk management because PRA results, combined with engineering judgment, frequently make it possible to identify relatively inexpensive risk reduction options.
13. Every plant is unique, even nominally "identical" units on the same site. The influence of operating and maintenance practices can far outweigh the inherent design reliability of the equipment, so that even plants that start out as sister units can have very different risk and reliability profiles.
CHAPTER I.6 1. Ecological risk assessment is defined as “the process that evaluates the likelihood that adverse ecological effects are occurring, or may occur, as a result of exposure to one or more stressors.” In contrast, ecological risk management is the process for making decisions or selecting options to manage the risk. Ecological risk assessment is one of several inputs into risk management. 2. Human health risk assessment defines adverse consequences in terms of effects on individual, classes, or groups of humans. Ecological risk assessment defines “adverse” in a variety of ways, but usually as the consequences to a nonhuman biological feature of the environment. For example, the main risk to converting native prairie to farmland may be the consequences on soil biological diversity. 3. Among the major reasons offered in support of ecological risk assessment are: it is an organized, systematic way to prioritize ecological policy problems; it allows policy makers to allocate scarce resources to solving the most important ecological problems; it clearly separates science from policy making; and it really only formalizes how decisions are actually made. Among the common objections to ecological risk assessment are: it is too easy for practitioners (technocrats) to insert their personal values and distort the results; in order to make ecological problems technically tractable for risk assessment, the problems must be simplified, which distorts their relevance to the policy problem; and the formulation of the risk problem defines the result and, in practice, technical experts often define the risk problem rather than the public. 4. In an ideal world personal values and priorities should be separated from science and the assessment process. However, in conducting ecological risk assessments this is nearly impossible because there are no clearly accepted values and priorities for ecological policy questions, so scientists and analysts must make many assumptions in order to carry out the analysis. The assumptions (values) are often the most important issues in public policy; unfortunately, they are often not explicitly stated and conveyed to risk managers. 5. The most common alternative to ecological risk assessment is to avoid defining adverse events (risk) initially by evaluating the ecological consequences and let policy makers or the public decide which of the alternatives are most desirable. Ecological consequences (or ecological change) can only be defined as adverse when a human value or criterion is applied. Another approach is to expand benefit/cost analysis to cover nonmonetary consequences (and costs). All of these approaches are subject to similar criticisms (and misuse) as is ecological risk assessment. 6. The most commonly held view is that ecological risk assessment needs to be closely linked to ecological risk management, but clearly separated. Risk assessment is an analytical process that provides input to decision making (management). It is separate and distinct, but should be a directly relevant policy question being addressed by risk managers. 7. The issue of which ecological changes or consequences are defined as adverse is one of the most difficult in ecological risk assessment. To label an ecological change as adverse requires the application of a human value or priority, which means that it is not a scientific or analytical choice. 
Societal involvement is required and this may be obtained through legislation, policy directives of elected or appointed officials, or direct stakeholder input.
8. Scientists and other technical experts play the dominant role in ecological risk assessment. In contrast, their role in risk management is one of providing input and technical counsel.
CHAPTER I.7 1. Opportunity Cost. At the least, a “free lunch” takes your time, and often the provider of this “free” meal wants more of your time or other resources. 2. While not necessarily “good” reasons, there are at least a couple of explanations for why businesses fail to undertake activities that will improve their profitability: (a) inefficient capital markets; and (b) habit. A small business with cash-flow problems may have difficulty obtaining a loan to purchase the needed equipment even though the equipment would improve cash flow and profitability. Bankers are not famous for their impartiality and objectivity. Just like the rest of us, business people are creatures of habit. Phrases like, “If it ain’t broke, don’t fix it,” contain a rationale for inaction as well as folk wisdom. 3. Over time, technological advances may very well reduce (social) costs. Lunches might not be free, but some lunches might be much less expensive. 4. Eliminating the risks associated with automobile transportation means doing without it. Since much of America is built around automobile transportation, this is not a feasible goal for very many Americans. As individuals, we can reduce auto transportation safety risks by reducing commuting distance, making fewer unnecessary trips, and purchasing safer autos. Socially, we can reduce safety- and pollution-related risks by investing in alternative transportation, supporting policies to make highways safer and to make autos safer, longer-lasting, cleaner-burning, and more fuel efficient. We can also support investments in alternative transportation, long-range planning to reduce dependence on autos, and systems of penalties (taxes) and rewards (tax credits) to bring market prices closer to total social costs. 5. “Better economists” means behaving more like the “rational economic man” of neoclassical economics, where each person’s actions are motivated only by his or her own narrow pecuniary self-interest. Many organizations, both religious and secular, organize and act on behalf of what they believe is “the right thing to do.” In most work environments, some degree of cooperation and teamwork is necessary. Even basic civility requires some restraint on self-interest. If the “what’s in it for me?” approach to life is becoming more widespread, then people are becoming “better economists” and one of the consequences will be poorer economic performance. 6. Yes, because in pure competition good information and an industry-wide standardized product combine with free entry and exit of sellers as well as buyers to ensure that profits earned have actually added value to the economy. If there is no market failure or opportunities for cost-shifting, competitive forces will “reward” profits in proportion to actual value added to the economy. 7. The “good economist” is unconcerned with others’ rights, and is more likely to take advantage of an opportunity to shift costs. 8. The “right-to-know” provision of the 1986 Superfund reauthorization required companies to report all releases of toxic chemicals. This simple, low-cost statute forced companies to look at their own releases and contemplate the likely reactions of citizens and customers. Better and more complete information usually produces better decisions and better resource allocations. If the government reported eco-
nomic data with adjustments for increased risk or environmental degradation, many producers and consumers would begin to "rethink" their activities, gradually leading to much improved resource allocations.
9. Wider acceptance of the "intrinsic value" perspective would make us more cautious when it comes to irreversibilities. If value exists apart from what humans want, irreversible biological and ecosystem effects involve serious costs even where humans are not directly impacted. If all life is intrinsically valuable, many "value-creating" economic activities should be "trumped" to prevent irreversible species loss and ecosystem degradation.
10. In a democracy, every citizen's vote should carry approximately equal weight in the political process, which defines community needs and develops processes for meeting those needs. Increasing global interdependencies suggest that, if we value democracy highly, we need to develop international institutions that can "hear" the voices of all the world's inhabitants.
11. The point of this exercise is to stimulate thinking about the social consequences of what are (primarily) private choices. A decision to stop smoking has consequences for family and for the country's health care resources. Most private risk reduction decisions involve some social benefits or costs.
12. This question is intended to encourage the reader to approach risks from an economic perspective.
13. If "fairness" is defined to include consideration of differences in income, race, ethnicity, gender, and generation (assuming future generations are represented), and "most economical" is defined to take account of all costs, adjusting for market failures and factoring in irreversibilities, society might nevertheless be justified in "trumping" an economic risk reduction strategy on the grounds of community need.
CHAPTER II.1 1. A. dermal exposure during use of an antimicrobial soap B. VOC inhalation following offgassing from new furniture C. consumption of pesticide residues in agricultural commodities D. ingestion of microbial-contaminated water 2. Track contaminant into residence from outdoor soil/dust; exposure during resuspension of contaminated soil/dust. 3. Biological variation in body weight, skin surface area, and inhalation rate; body weight basis; for example, children would have a higher exposure (mg/kg) than adults. 4. Air exchange rate and temperature; open windows increase air exchange rate and reduce inhalation exposures to air-borne chemicals.
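Answer 3 turns on normalizing intake by body weight. A small illustration of that calculation, using invented concentration, inhalation-rate, and body-weight values rather than the chapter's exposure factors:

def daily_dose(conc_mg_m3, inhalation_m3_per_hr, hours_per_day, body_weight_kg):
    """Average daily inhalation dose in mg per kg body weight per day."""
    return conc_mg_m3 * inhalation_m3_per_hr * hours_per_day / body_weight_kg

concentration = 0.05   # mg/m3 of an airborne chemical indoors (invented)
for person, inhalation_rate, weight in (("adult", 0.8, 70.0), ("child", 0.5, 15.0)):
    print(f"{person}: {daily_dose(concentration, inhalation_rate, 20, weight):.4f} mg/kg-day")

Even though the child inhales less air in absolute terms, the dose per kilogram of body weight comes out several times higher, which is the point of the answer.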
CHAPTER II.2 1. FIFRA requires that any pesticide registered in the U.S. must perform its intended function without causing unreasonable adverse effects on the environment. This statute also requires that evaluations of potential risks to man or the environment must also take into account the economic, social, and environmental costs and benefits of the use of a given chemical. While the use of “unreasonable risk”
suggests that some risks will be tolerated under FIFRA, it is clearly expected that the anticipated benefits will outweigh the potential risks when a pesticide is used according to commonly recognized, good agricultural practices.
2. Scientific issues involved in the evaluation of potential dietary exposures and human health risks associated with pesticide residues in food include: the scientific and regulatory paradox created by the Delaney Clause in the FFDCA and analysis of uncertainty. The Delaney Clause specifically prohibits the presence of residues of materials found to "induce cancer in man or animal." This absolute standard is inconsistent with the "risk-benefit" statutes of FIFRA. Further, this creates a regulatory dilemma in that while residues of "carcinogenic" pesticides are allowed in RACs under Section 408 of FFDCA, they are not allowed under Section 409. The second issue, uncertainty analysis, underscores a fundamental issue for the practice of risk analysis in general. Quantitative evaluations of uncertainty, for example, the use of Monte Carlo simulation to develop distributions of dietary exposures (and risks), provide the most scientifically defensible approach for estimating potential exposures to pesticides. The NAS has recently recommended the use of distributions of consumption and pesticide residues in food rather than single-point data to characterize dietary exposures and risks. Uncertainty analysis, where appropriate data can support its use, provides much more information to risk management decision-makers.
3. Two examples of methods that can be used for monitoring potential occupational exposures to pesticides are (1) the use of dosimetry clothing or patches on workers for measuring dermal exposures and (2) the use of personal air sampling devices to measure breathing zone inhalation exposures.
4. Three exposure pathways that may be relevant to potential residential exposures to chemicals include: incidental ingestion of dislodgeable residues from treated surfaces following hand-to-mouth behavior in children; dermal exposure to dislodgeable residues from treated surfaces; and inhalation of air-borne chemicals during and post-application of spray (e.g., hand-held aerosols, total release foggers) products.
5. Benefits that result from the international harmonization of testing guidelines and protocols for studies for pesticide registration include: establishment of a uniform approach to data requirements and interpretation, minimization of regulatory staff resource duplication regarding study initiation and review, conservation of economic resources and prevention of trade barriers.
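Answer 2 mentions Monte Carlo simulation of dietary exposure distributions. A minimal sketch of that idea follows; the consumption, residue, and body-weight distributions are invented (they are not the NAS or EPA defaults), and only the structure, exposure = consumption × residue / body weight, matters here.

import random

random.seed(0)
n = 50_000
exposures = []
for _ in range(n):
    consumption_g_day = random.lognormvariate(3.0, 0.6)   # commodity intake, g/day
    residue_mg_kg = random.lognormvariate(-3.0, 0.8)      # pesticide residue, mg/kg
    body_weight_kg = random.lognormvariate(4.25, 0.2)     # roughly centered near 70 kg
    exposures.append(consumption_g_day / 1000.0 * residue_mg_kg / body_weight_kg)

exposures.sort()
for label, q in (("median", 0.50), ("95th percentile", 0.95), ("99.9th percentile", 0.999)):
    print(f"{label:17s} {exposures[int(q * n)]:.2e} mg/kg-day")

Reporting the whole distribution, rather than a single worst-case point estimate, is what gives the risk manager the extra information the answer refers to.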
CHAPTER II.3 1. dose
2. International Commission on Radiological Protection
3. U.S. Environmental Protection Agency
4. The doses are considered acceptable in relation to background doses, but are as low as are technically feasible for that industry.
5. Whole-body dose is the dose received from uniformly irradiating the body with an external source of radiation.
6. Effective dose is the equivalent in risk to a unit of whole-body dose. If the body is not uniformly irradiated by an external source, the effective dose is the dose (usually lower) that, if received uniformly by the whole body, would carry the same risk.
7. For the simple situation of whole-body dose from photons, energy deposited equals dose. For other types of radiation or other energies of photons, the dose may be higher or lower than the energy deposited. 8. 0.48 Sv. The doses are added without regard to source, age, or time interval between doses. 9. The dose is 0.04 Gy for two hours of exposure, and the dose equivalent is 1.5 × 0.04 = 0.06 Sv, with the factor of 1.5 taken from the graph in Figure 1. 10. Table 1 shows the weighting factor for the lung to be 0.12. The effective dose is 0.24 Sv.
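Answer 10 applies a tissue weighting factor to an equivalent dose. The sketch below reproduces that arithmetic on the assumption that the lung received an equivalent dose of 2 Sv, which is consistent with the 0.24 Sv result but is not stated in the answer itself; the weighting factors listed are illustrative placeholders, not the chapter's Table 1.

# Effective dose = sum over tissues of (weighting factor w_T x equivalent dose H_T, in Sv).
tissue_weighting = {"lung": 0.12, "stomach": 0.12, "red bone marrow": 0.12, "gonads": 0.20}
equivalent_dose_sv = {"lung": 2.0}   # only the lung is irradiated in this example (assumed)

effective_dose = sum(tissue_weighting[t] * h for t, h in equivalent_dose_sv.items())
print(f"effective dose = {effective_dose:.2f} Sv")   # 0.12 x 2.0 = 0.24 Sv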
CHAPTER II.4 1. Industrial pollution can be defined as the presence of toxic substances in air, water, or soil, often resulting from inefficiencies in production processes. 2. Pollution can also be regarded as resources distributed in the wrong places. Therefore, pollution prevention at the source can be regarded as saving on resources. 3. Pollution prevention is a process of decreasing pollution at the source. 4. Waste minimization is a decrease of the amount of waste that must be shipped offsite. It is not necessarily due to pollution prevention at the source. 5. Industrial pollution may present a health risk to humans or ecological systems as well as a risk to economic well-being. These risks can be estimated and compared using risk analysis methods. Increased new risks may come from nuclear plant accidents, radioactive waste, releases of pesticides and other chemicals, oil spills, chemical plant accidents, ozone depletion, acid rain, and global warming. 6. Mining, basic industry, chemical industry, other manufacturing, energy production, transportation, waste management, wastewater treatment. 7. Air pollution, water contamination, soil/land contamination, hazardous waste generation, radioactive waste generation, acid rain, ozone depletion, global warming. 8. Risk analysis can serve to establish a priority of pollution problems based on the magnitude of risk that they pose either to human health or ecological systems. Pollution can also be regarded as resources distributed in the wrong places. Therefore, pollution prevention at the source can be regarded as saving on resources. Since economic risk analysis can indicate economic losses resulting from pollution, it can be used to encourage pollution prevention at the source as a means of improving the bottom line.
CHAPTER II.5 1. Risk assessment: evaluate undesirable outcomes and assign probabilities to their chance of occurrence (e.g., climate change and climate impact assessment). Risk management: involves political decisions concerning what can be done to control societal risks, e.g., response strategies. 2. Possible answers: • Concentration of CO2 depends critically upon environmental sinks. • Atmospheric CO2 concentrations do not match the total CO2 emissions; roughly half the CO2 is absorbed by terrestrial plants and the ocean.
• Scenarios of future emissions critically depend on the rates of population growth, energy consumption per capita, and the rate of penetration of the nonfossil energy sources (renewables and nuclear).
CHAPTER II.6 1. Risk Analysis, Toxicology, Food and Chemical Toxicology, Journal of the American Medical Association, and Toxicology Modeling. 2. One CD-ROM contains the equivalent of several hundred computer floppy disks and thousands of printed pages. The information can be searched and accessed very quickly. CD-ROM versions of printed documents often contain “extras” such as spoken text and other sounds, photographs, interactive maps, “movies,” and animation. 3. A key drawback to CD-ROM databases is that they cannot be completely up-todate, whereas online databases can be updated frequently and accessed as soon as they are updated. However, it is likely that the rather new online ability to update and revise the content of CD-ROMs via information downloaded to the user’s computer hard drive will be used for at least some CD-ROMs of particular value to exposure assessors and risk assessors. 4. Created by the U.S. Department of Defense in the late 1960s, the Internet has evolved into a worldwide collection of computer networks used for many purposes. This includes sending electronic mail messages, accessing various databases, and as a way to share, publish, and distribute textbooks, journals, newsletters, and other sets of information. 5. A key evolving part of the Internet, the WWW includes a collection of documents (text, graphic, video, and audio files) that users can navigate through via use of browser software programs and “hypertext” links. Hypertext enables users to highlight certain pictures, words, or phrases, starting with information displayed on “Home Pages,” and to then move to linked pictures or pages of information. 6. Risk assessors and risk managers can communicate via Internet e-mail messages and via information shared on WWW sites. There can also be use by risk assessors and risk managers of Usenet newsgroups and mailing lists to collaborate and to ask about or share information (e.g., research plans and results). The future is likely to see increased use of the Internet for teaching, group discussions, and scientific meetings held at “virtual facilities.” Data can be shared and discussed, and one can even take a “walk” through the virtual facility being used. 7. Intelligent Agent (or Smart Agent, Software Agent, Assistant Agent, Internet Agent, Good Virus, or Web Robot): Software programs that are told or essentially learn what information the user likes to see, and which will then search through electronic mail, databases, networks, World Wide Web home pages, and Internet “Usenet” newsgroups on an ongoing basis to retrieve that information for the user. These types of programs can also be used to deliver data and messages to other users/systems. In a portable device, Intelligent Agents are used by “Intelligent Assistants” to sort incoming messages based on what the user has looked at first in the past. Intelligent Assistant (or Personal Intelligent Communicator or PIC): Small, portable electronic devices that can send and receive electronic mail and faxes, sort incoming messages, and handle other functions such as listings of addresses and phone and fax numbers.
Personal Assistant: Pocket-sized electronic devices that contain the contents of books or other information. The information can be searched using key words, with results displayed on the screen of the Personal Assistant.
CHAPTER III.1 1. Extensive news coverage of catastrophic accidents. Increased number of risk-assessment studies. Loss of trust in risk management. 2. Experts see risk as determined primarily by expected mortality, whereas laypeople's perceptions include qualitative characteristics such as dread, control, catastrophic potential, etc. 3. The finding that white males see the world as much less risky than everyone else sees it. 4. Because of the public's greater sensitivity to adverse events amplified by a social and political system that highlights those events and keeps them in the "public eye."
CHAPTER III.2 1. Fire risks are normally independent of each other while earthquake risks are highly correlated. If insurers are risk averse they are likely to charge higher premiums for risks that are more highly correlated due to the increased variance in the losses. 2. The most important step would be to require a medical exam as a condition for insurance so the insurer is fully informed about the patient’s health status before issuing a policy. 3. Catastrophic losses may cause insolvency of the insurer. The federal government has unlimited borrowing power so a severe flood will not create financial problems. In addition to the fear of insolvency, insurers are concerned with problems of adverse selection when insuring against risks, such as floods. Unless they inspect each house individually they may not be clear how safe it is in relation to the hazard. 4. The insurer would determine whether such a risk is insurable by utilizing epidemiological data to estimate the probability of a person contracting asbestosis when exposed to a certain number of particles of asbestos fibers in the air. The insurer also has the option to refuse to write coverage or cancel an existing policy if the number of asbestos fibers in the air exceeds a certain level.
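Answer 1's point about correlated losses can be made concrete with the variance of a sum: for n identical policies with pairwise correlation rho, Var(total) = n*sigma^2 + n*(n-1)*rho*sigma^2. The sketch below compares an independent, fire-like book with a correlated, earthquake-like one; the policy count and loss figures are invented, and premium loadings themselves are not modeled.

import math

n = 10_000        # policies in the book of business
sigma = 5_000.0   # standard deviation of a single policy's annual loss, dollars (invented)

for label, rho in (("independent, fire-like", 0.0), ("correlated, earthquake-like", 0.3)):
    variance = n * sigma ** 2 + n * (n - 1) * rho * sigma ** 2
    print(f"{label:28s} sd of total loss = ${math.sqrt(variance):,.0f}")

Even modest positive correlation inflates the aggregate standard deviation by orders of magnitude, which is why a risk-averse insurer charges more for the correlated book.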
CHAPTER III.3 1. Risk assessment is a process whereby the nature and size of a risk are assessed and characterized; it is one of two main parts of risk analysis. 2. Risk management, the second main part of risk analysis, is a process whereby the ways in which a risk may be abated or eliminated, or its consequences mitigated, are developed, and appropriate ways are chosen and implemented. 3. In comparative risk assessment, risks are characterized by being compared, qualitatively and/or quantitatively, to others; often risks are characterized comparatively by ranking them against each other, ordinally or categorically. The individual risks
associated with different problems or issues are not, and usually cannot be, calculated or characterized separately as in other forms of risk assessment.
4. Because data, theory, and calculations are usually not sufficient to characterize the risk, qualitative factors and judgments, and creative ideas, assumptions, and models, must be brought into play, as in any art.
5. Associated with an issue within the framework of a risk study, risk reduction methods may or may not be in place and functioning. Whatever risk still exists with such methods operating, or whatever risk exists if there are no methods operating, is the residual risk that is to be assessed before implementing any new, future risk reduction methods. Residual risk is, therefore, the risk that is to be characterized and, if necessary, abated or eliminated through further risk management efforts. Misinterpretation often occurs when participants in a comparative risk project forget that there are already risk management measures in place, when they are in place, and/or when they consider possible risk abatement strategies as though they have already been implemented.
6. Policy makers should understand residual risk and that it has been used in a comparative risk assessment because, if they do not, they might prioritize specific issues for action incorrectly; a low residual risk may be the result of the risk being low even though no abatement measures are in place or, alternatively, it may be low because risk management programs are in place and operating well. In the latter case, priority needs to be given to maintaining such risk management measures.
7. Criteria, such as probability of exposure to stressors, extent of adverse effects within a population or among ecosystems, and others, clearly defined and consistently used in the course of a comparative risk study, make for clarity and ease of communication and better understanding among study participants and, thus, make for a better quality of comparative risk assessment. "Uncertainty" is sometimes suggested as a criterion for characterizing a risk; it is better thought of as a characterization of the uncertainty regarding the characterization of risk. Uncertainty, in this sense, arises from uncertainty and gaps in the data, in the theories used, and in judgments applied. It is useful to know, when using the results of a risk assessment, what the uncertainty in the assessment and characterization may be when planning risk management measures, for example.
8. A ranking cleanly and clearly based on comparative risk is one valuable input to a priority ranking. Other valuable and necessary inputs to a priority ranking are such considerations as, for example: available means for, and technical feasibility of, risk reduction; the costs of risk reduction; the benefits that might accrue from risk reduction; and social and political factors. A risk ranking and a priority ranking are, therefore, two very different things.
9. A single individual can carry out a comparative risk assessment, but the results would be highly suspect because no one person has the sum total of knowledge, skill, experience, and perceptions needed to produce well-rounded results of high quality, utility, and credibility. Although risk is mathematically defined in principle, in practice, especially with the broad range of risks a comparative risk study must encompass, the information is not available for full, mathematically correct risk calculations to be made. As to public participation, among the factors that must enter into a comparative risk study are the perceptions and values of the risk takers and their views on consequences, such as impacts on their communities and quality of life, for example. By involving members of the public, and keeping them as well informed as possible
about the available scientific information, a better idea of the risks perceived by the public can be entered into a comparative risk assessment.
10. Inevitably, whether a comparative risk study is carried out only by technically trained individuals or by participants who are not necessarily scientists or scholars, individual perceptions and values must enter into the process, if only through judgments required to fill gaps in the available data. Mathematical definitions of risk deal with objective risk; risk as characterized even in a well-done comparative study is always largely perceived risk, informed by science and by risk principles.
11. A comparative risk ranking such as discussed in this chapter may be described as a risk ranking if it is carefully based on residual risk and if a nonmathematical definition of risk, one including perceptions, is accepted.
12. Regional or national comparative risk rankings deal in perceived "averages" of residual risks and do not usually highlight specific, localized risks. Thus, risks associated with abandoned waste sites were not ranked highly in Unfinished Business although in a local community such a site may be the major concern. Where known, such local risks need to be mentioned specifically in the commentary that accompanies any regional or national comparative risk ranking.
13. Despite the enormous and growing base of sound scientific knowledge relevant to environmental risks, the base is still in no way sufficient to make comparative risk assessment possible by itself. Much as some scientists may not wish to speculate and make judgments in their fields, it is necessary to do so to help bridge the gaps caused by the absence of specifically applicable knowledge if comparative risk studies are to be done. Potential participants in a study must understand this need and commit themselves to making judgments if they are to become actual participants. The many studies that have been carried out with the help of distinguished scientists make it clear that there are individuals willing to participate in such important endeavors despite their possible misgivings.
14. Personal commitment of individuals to promoting a sound environment, the understanding that, since resources are not endless, priorities must be set and that risk is an important consideration in setting priorities, and the possibility of making a significant personal contribution to an important means for ensuring the continuation of a sound environment.
15. One would not expect there to be gross differences, but there would almost inevitably be differences in detail, such as the inversion of the rankings of two closely ranked issues, since two separate groups would not be likely to bring identical perceptions of the different risks to the table.
16. Comparative risk ranking involves many uncertainties. Examples of this are lack of pertinent information, uncertainties in the information and data that are available, honest differences of opinion on how to interpret available data or what judgments to use where data are not available, different views on how to weigh risks that differ in nature from each other, differences in perception among participants in a comparative risk assessment, and even differences in the personalities and the abilities of participants to express their views during a consensus ranking process. For these kinds of reasons, expressing a final ranking by assigning issues to a limited number of risk categories more correctly brings the total uncertainty into the picture than expressing a final ranking in ordinal form — even though developing an ordinal ranking may be a useful step toward achieving a final ranking. In setting priorities for action or in developing policies, risk managers and policy makers need to take the uncertainty in a risk assessment into account, and they must therefore be informed as to what the risk assessors think the uncertainties
are. How they do this is determined on a case-by-case basis and it can depend on how conservatively protective they may wish to be, or feel they must be, among other things. Thus, whether to consider an issue of more uncertain risk characterization to be of lower priority for action because it is not as certain in its ranking as another issue, or whether to give it a higher priority of action because, with its uncertainty, it could pose a much higher risk than its ranking would indicate, is a decision that must be made on a case-by-case basis. 17. There are alternatives to using risk, including comparative risk, and many have been used and are still used. Some alternatives are to prioritize on the basis of factors other than risk, such as feasibility of reduction once a risk is known or believed to exist, whatever its size, or, not to prioritize at all but to address risks as they become apparent or as they become politically evident (“squeaky wheel” prioritization). Although such approaches do reduce risk, they do not make for the best use of risk reduction resources because they do not, in principle, strive for maximizing the total reduction of risk with those resources. The author does not know of approaches that are better, in allocating resources, than those that make use of risk as an input to prioritization.
CHAPTER III.4 1. Comparative Risk Analysis (CRA) is a term used to describe a rapidly growing number of projects performed around the U.S. by state and local groups as a promised new cure for "irrational" environmental management. 2. CRA is supposed to combine the "science of risk analysis" with "community values" to derive an environmental problems priority list, which then could be used to rationally apply resources for risk management based on the magnitude of "real" risk rather than perception. 3. Conflict of interest of the participants, insufficient information for valid risk analysis, insufficient time and expertise to perform risk analysis in a competent manner by the participants. 4. Open question with no right answer! Good ideas should be sent to the author! 5. Peace of mind — safety, happiness, and health
Mobility — ease of getting from one place to another
Aesthetics — visibility, noise, odors, and any visual impacts
Future generations — impact on our children, availability of alternatives, reversibility of effects
Sense of community — neighborhoods and personal growth
Economic impacts — maintaining a comfortable standard of living, achieving personal goals, costs
Fairness — sense of equity, respect of individuals' or property owners' rights, number of affected persons, severity of effects on different groups
Recreation — access to and quality of recreational lands, opportunities for solitude
6. Such a definition of risk totally defies the stated purpose of CRA, which is to combine the science of risk analysis and community values, rather than basing risk management on public perceptions of risk (as has often been the case in the past). Moreover, such a definition may miss some real risks which the general public may not be aware of and thus outrage would be nonexistent.
CHAPTER III.5 1. Environmental equity refers to how equally environmental risks and procedures to mitigate those risks are distributed across different sectors of the population. Environmental justice refers to a policy of affording subpopulations equal environmental and health protection. Environmental racism connotes “disproportionate environmental risks in racial minority communities.” 2. The environmental justice movement originated as a convergence of the civil rights movement and grass roots environmental movements. It was believed to be precipitated by hazardous waste cases in Warren County, North Carolina, in the early 1980s, which signaled racial disparities in the location of waste sites. 3. Requirements to address environmental justice issues are currently embodied in Executive Order 12898, which requires each federal agency to consider environmental justice in its activities. 4. It is difficult to quantify health risks associated with hazardous waste sites for use in an environmental justice analysis, since those risks vary according to many factors. For example, the toxicity of individual contaminants varies, making an aggregate risk estimate difficult, and risk varies temporally as the presence and toxicity of those contaminants change. 5. Minority groups are often difficult to define, since people use different criteria in classifying themselves into groups and persons in more than one category (e.g., children of mixed marriages) find it difficult to classify themselves. 6. Spatial units used for data collection in environmental justice analyses include those defined by the U.S. Bureau of the Census, such as Blocks, Block Group, Tracts, Zip Codes, and Counties. 7. Circles of varying radii around specific sites have been used as one approach to aggregating data in the spatial units selected. One way to aggregate the data is to include all Census units within a certain radius in their entirety (actually the centroids of those data units are used to determine whether or not the Census unit falls within a certain distance of the site). Another way is to use Geographic Information Systems to intersect data units so that a desired geographic area, such as a circle, is obtained. 8. Once the data are extracted and aggregated, various methods and criteria are used to determine whether or not an environmental justice issue exists. A comparison area or areas are selected against which the population characteristics of a given area of interest are compared. One basis for concluding a justice issue may exist is if the interest area, having a potential environmental problem, has a greater proportion of minorities than the comparison area. Various numerical techniques are available for conducting the comparison. 9. Examples of areas of subjectiveness in environmental analyses include what particular threshold and/or difference between the area of interest and the comparison area is used to establish a disparity and what comparison area is selected.
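Answers 6 through 8 describe selecting Census units whose centroids fall within a radius of a site and comparing minority shares with a comparison area. A toy version of that calculation, with invented block-group centroids and populations standing in for real Census data:

import math

# (centroid x km, centroid y km, total population, minority population) -- all invented
block_groups = [
    (0.5, 0.8, 1200, 540), (2.1, 0.4, 900, 180), (1.0, 2.5, 1500, 600),
    (4.0, 3.8, 2000, 220), (5.5, 1.2, 1100, 130), (3.2, 4.9, 800, 90),
]
site = (1.0, 1.0)
radius_km = 2.0   # the buffer radius is itself one of the subjective choices noted in answer 9

def minority_share(groups):
    total = sum(g[2] for g in groups)
    return sum(g[3] for g in groups) / total if total else float("nan")

inside = [g for g in block_groups if math.hypot(g[0] - site[0], g[1] - site[1]) <= radius_km]
outside = [g for g in block_groups if g not in inside]

print(f"minority share within {radius_km} km of the site: {minority_share(inside):.1%}")
print(f"minority share in the comparison area: {minority_share(outside):.1%}")

A full GIS analysis would intersect polygons rather than test centroids, but the centroid test above is one of the aggregation approaches the answer describes.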
CHAPTER III.6 There were no questions in this chapter.
CHAPTER III.7 There were no questions in this chapter.
CHAPTER IV.1 1. The distinguishing threats of nuclear power are radiation and the heat given off by the radiation decay process, which decreases with time, but requires cooling provisions for some time following reactor shutdown. 2. The evidence is strong that nuclear power is among the safest of the developed energy alternatives. This is in spite of the two serious accidents involving Three Mile Island and Chernobyl. No member of the public or the operating staff has been killed or injured from a nuclear power plant accident in the United States. This is extremely impressive considering that there are 109 operating nuclear plants in the U.S. 3. There have been two major accidents involving nuclear power plants: Three Mile Island (U.S.) and Chernobyl (Ukraine). Although the Three Mile Island accident did not result in any injuries or deaths, the Chernobyl accident did result in 30 fatalities from acute doses of radiation and the treatment of 300 people for radiation and burn injuries. The latent effects of the Chernobyl accident have yet to be quantified. Nuclear power suffered a severe setback from these two accidents, especially the Chernobyl accident. It is expected that it will take decades of safe operation of nuclear power plants to rebuild public confidence in spite of its many advantages over other energy alternatives. 4. Among the principal elements of managing nuclear power plant safety are an effective regulatory process, risk and safety assessment practices by industry that clearly reveal the safety performance of the plant with time, and the adoption of a quantitative risk assessment and management process based on the use of proven risk-based technologies. 5. There is tangible progress in moving toward risk-based regulation, but to most that progress is considered slow. Some of the reasons for the slow progress are institutional inertia in the government, concerns by industry that the cost may be too high, the lack of stability in the methods of analyses to support risk-based regulation, and continuing questions on how to control the quality of the supporting analyses. 6. The distinguishing feature of PRA is that it quantifies the uncertainty of how likely an event or a series of events is. Most other risk assessment methods deal only with questions concerning the occurrence of events and their consequences; they do not attempt to quantify the uncertainty in the results of the assessment. PRA addresses all three of the fundamental questions of risk: what can go wrong, what are the consequences, and how likely is it, including the uncertainties involved.
CHAPTER IV.2 1. Earthquakes also cause damage to buildings and their contents, as well as damage to lifelines (highways, power lines, gas distribution network, etc.). These forms of
damage cause homelessness, business interruption, unemployment, and other economic consequences. Furthermore, the earthquake causes a disaster by making all the damage occur simultaneously in one region. So the community's capacity to respond is diminished at the same time that it is called upon for support. All of this is too much to squeeze into a single chapter, so the chapter focuses on the most extreme consequence of an earthquake: the potential for death.
2. The buildings destroyed in the Northridge earthquake had withstood previous quakes, but that did not mean that they were immune. Earthquake damage depends on so many factors, such as frequency spectrum of ground motion, constructive or destructive interference of waves reflected under the earth, etc., that a vulnerable building can "luck out" most of the time.
3. Where there are many URM buildings, there are many owners. They get together and become a potent political force in opposition to a mandatory ordinance.
4. As long as the state has bond money set aside for this purpose, go for it now! However, if it takes a fight to get the money in either year, consider the differences between the two options. In terms of lives saved, the difference is that option 1 saves lives during the first 10 years and option 2 does not. (Both save lives after the first 10 years.) In terms of cost, option 1 costs full price now and option 2 costs half price 10 years from now. With a 7% discount rate, it follows that the present discounted value of the cost of option 2 is one-fourth the cost of option 1. So the cost of saving lives for the first 10 years is three-fourths the cost of retrofit now. To compare the value of 10 years of life saving with 30 years of life saving, calculate [1 – exp(–0.07 × 10)]/0.07 = 7.2 and [1 – exp(–0.07 × 30)]/0.07 = 12.5. The ratio is 0.57. The cost per life saved is thus (3/4)/0.57 × $0.6 million = $0.8 million for a "typical" URM bearing wall building.
5. If the owner is willing to retrofit the building rather than demolish it, then the cost of life saving for a building lifetime of T years is the cost for 30 years divided by [1 – exp(–0.07 × T)]/0.07/12.5, as discussed in the answer to Question 4.
6. The answer depends on which quantity relating to retrofit you set equal to which quantity relating to WTP for risk reduction. If you set typical cost to $3 million, you have to come down a factor of 9.6/3.0 = 3.2, from $25 to $8/ft2. If you look at a reasonable upper bound for a typical building (cost × uncertainty factor of 3) and equate it with $7 million, you have to come down a factor of 3 × 9.6/7.0 = 4.1, from $25 to $6/ft2. If you look at median cost for a high-cost building (cost × variability factor of 7) and set it to $7 million, you have to come down a factor of 9.6, from $25 to $3/ft2.
7. Neither earthquakes nor buildings are distributed in a random geographic fashion. A GIS enables the analyst to model realistic distributions of earthquake probabilities, soil types, etc., for buildings in a given location. Earthquake effects, and thus the cost of saving a life, do not really conform to the default lognormal distributions assumed in this chapter. A GIS is the best way to disaggregate the location-dependent features.
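The arithmetic in answers 4 and 5 is easy to reproduce. The sketch below uses the chapter's 7% discount rate and the $0.6 million per-life cost quoted in answer 4; everything else follows from the present-value factor used in those answers.

import math

def pv_factor(rate, years):
    """Present value of a continuous stream of one unit per year for the given number of years."""
    return (1 - math.exp(-rate * years)) / rate

r = 0.07
f10, f30 = pv_factor(r, 10), pv_factor(r, 30)
print(f"10-year factor = {f10:.1f}, 30-year factor = {f30:.1f}, ratio = {f10 / f30:.2f}")

# Option 2 defers retrofit ten years at half price, so its discounted cost is about one-fourth
# of retrofitting now; the first ten years of life saving therefore cost about three-fourths.
deferred = 0.5 * math.exp(-r * 10)
cost_per_life_million = (1 - deferred) / (f10 / f30) * 0.6   # $0.6 million figure from answer 4
print(f"deferred cost fraction = {deferred:.2f}; cost per life saved ~ ${cost_per_life_million:.1f} million")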
CHAPTER IV.3 1. Because of our need to feel in control of our lives, we tend to deny hazards which threaten our sense of control. Thus, unless we have suffered from a hazardous event, we will tend to deny that it will recur even though others might assure us
it will. Sometimes, even if we have lived through such an event, we will rationalize that it won't happen again. Immediately after the event, almost everyone will recognize that it can happen. However, as people die, move out, or rationalize, fewer and fewer people will. The newcomers will not accept the hazard and so will not prepare, and the overall preparedness of the community will decline.
2. A disaster-management system requires input from many cooperating agencies. If one agency does not perform as required, the effectiveness of the whole system is threatened. Unless the staff of each agency has experience working with the other agencies, there will tend to be tasks that fall "between the cracks" and are left undone. That is, each agency may think that the other is responsible. Moreover, key experienced staff in each agency will tend to be transferred to other positions or they may leave. This turnover may be every 4 to 5 years on average. Since disasters may not happen more than once in twenty years or so, there will be plenty of time for the overall effectiveness of the disaster-management system to decline.
3. First, there is the tendency of communal preparedness to decline. Second, disaster-management systems will tend to become less effective as the time since the last event increases. Finally, developing (and developed) countries frequently run their disaster-management systems on hierarchical lines, so that potential victims will be led to assume that disaster management is the responsibility of the government and not their problem.
4. The most effective component of any disaster-mitigation system is the members of the public. Most people get their warnings from family and friends, and it is often members of the public who provide vital information to the disaster-management team. People can also substantially reduce their vulnerability to the next disaster by preparing for it. Finally, it is only by constant communal pressure that funding for disaster-management systems is sustained.
5. There is no unique answer to this. However, a key matter for attention is sustaining the preparedness of the community. Without this, funding for disaster management will tend to dry up. A preparedness campaign using ideas set out in Appendix A may be a good start.
6. Disaster-management systems can be categorized as falling into three groups:
• controlling the event (e.g., levees or dykes to protect against floods),
• avoiding the event (e.g., by planning regulations or keeping developments out of the way of the hazard),
• mitigating the effects of the event (e.g., insurance, relief, etc.).
The best mix of strategies could be assessed by using economic analysis, taking account of both monetary and nonmonetary risks.
7. Factors of advantage in many developed countries might be:
• good communications,
• good transport facilities,
• high technology,
• a tradition of individual initiative.
Factors of advantage in many developing countries might be:
• a tradition of communal cooperation,
• a tradition of striving for consensus,
• a tradition of avoiding assigning blame.
8. Perhaps one strategy might be to call a meeting to discuss how to maintain the safety of the village during the wet season, and steer the discussion toward matters of individual responsibility.
CHAPTER IV.4 1. Under the GATT, measures restricting trade in animals or animal products may be imposed to protect animal or human health in the importing country. 2. When considering importation of animals or animal products, risk assessment is the process of identifying all the potential diseases that could be associated with the particular commodity and then estimating the probability of their being introduced through imports. 3. Risk assessments on imports of animals and animal products are sometimes controversial and open to challenge because many of the assumptions relating to disease prevalence in the source population, survival of pathogens in the commodity and exposure of local livestock to the pathogens have to be made on the basis of few hard data. 4. The effects of risk management measures can usually be quantified more objectively because more data are available. 5. When only those animals that fail a specific diagnostic test are excluded from a group intended for import, the risk of introducing the disease in question increases as the size of the group increases. 6. When a single animal reacting to the specific diagnostic test excludes the entire group, risk decreases with increasing group size. 7. The main weakness of deterministic models is that they do not give the decision maker any estimate of the uncertainty of the risk estimate. 8. With most diseases, one cannot be totally confident that embryo transfer is risk-free because insufficient studies have been conducted with most pathogens. 9. Quantitative risk analysis assists in obtaining consistency in decision making and also permits a comparison of the effects of different risk reduction measures. 10. Nonquantitative risk analysis methods are still useful, especially in the routine regulation of imports of animal products, because they can be objective, repeatable and transparent, and are always quicker, thus cheaper, than quantitative methods.
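Answers 5 and 6 make opposite claims about group size, depending on whether a positive test excludes only the reacting animal or the whole consignment. The sketch below illustrates both policies with an invented prevalence and test sensitivity; for consignment-sized groups, the two risks move in the directions the answers describe.

prevalence, sensitivity = 0.05, 0.90   # invented disease prevalence and test sensitivity

for n in (10, 50, 250, 1000):
    # Policy A: only reacting animals are excluded; each infected animal slips through
    # with probability prevalence * (1 - sensitivity).
    risk_exclude_reactors = 1 - (1 - prevalence * (1 - sensitivity)) ** n
    # Policy B: one reactor rejects the whole group; disease enters only if the group
    # contains infected animals and none of them reacts.
    risk_reject_group = (1 - prevalence * sensitivity) ** n - (1 - prevalence) ** n
    print(f"group size {n:4d}: exclude-reactors risk = {risk_exclude_reactors:6.1%}, "
          f"whole-group-rejection risk = {risk_reject_group:8.4%}")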
CHAPTER IV.5

1. Human health, environmental impacts, and quality of life.
2. “Precision” refers to the level of detail in the measurement of parameters, generally numerical in nature, with a mean (or best estimate) and a range (e.g., standard deviation). Precision can also refer to qualitative data, such as elicitation of expert judgment, which can be either numerical or narrative in nature. In the latter case, precision refers to the level of resolution that an answer provides, and derives from the correctness of the question and how focused the answer is. “Accuracy” refers to whether the results capture the truth somewhere within the numerical bounds, regardless of the size of the error bars. Qualitative data can be accurate, even if it is not very precise.
3. Western medicine has largely focused on the disease rather than the patient, and defines health more in terms of clinically observable symptoms that can be measured by diagnostic tools. The public health disciplines tend to have a broader definition that includes whether a person’s function or mobility is impaired or her/his activity is restricted. Only recently has western medicine begun to focus on the whole person, but it still does not quite reach the level of holism that indigenous cultures have always followed. In indigenous cultures, some illnesses are seen as purely physical, and, at the other extreme, some illnesses are seen as being the manifestation of spiritual illness. An indigenous health care facility would include both a spiritual health care facility and a medical care facility. Some clinics for Native Americans are now including traditional spiritual healers as well as medical practitioners, with better health outcomes for the patients.
4. If community well-being is in place (e.g., the ability to follow traditional activities and healing practices, psycho-spiritual well-being, and so on), then contamination would not only affect the person by virtue of direct exposure to contaminants, but would also affect his/her health through the degradation of resources, loss of ceremonial resources, loss of community integrity through reduced social interactions and trade, and so on. A person’s health can be adversely affected even if she/he is not directly exposed, but this must be measured as a degree of lost access or use, rather than by personal exposure or symptoms. Prospectively, impacts to community-wide health and personal health can be predicted from seeing adverse environmental or ecological effects. Similarly, the health of a community is, in some respects, a reflection of the degree of ecological health. Conventional risk assessment has yet to recognize this.
5. First and foremost is the long-term perspective that rejects short-term fixes or partial solutions that either postpone the final remedy by imposing it on future generations, or even prevent final cleanup by choosing interim states that preclude more cleanup later. Second, the total environmental contamination burden would be managed in addition to individual hotspots. Third, endstate management goals would tend to be expressed in positive language (such as “achieve holistic environmental stewardship”) rather than in negative language (such as “avoid major adverse impacts”). Fourth, risk management decisions would abandon forced choices between reducing human exposure at the expense of habitat and ecocultural resources through excavation, and would move to decisions about how to reduce contamination while protecting ecocultural resources by choosing less intrusive remediation technology.
6. Restricting access to traditional use areas and traditional cultural properties may violate treaty-reserved rights; it may result in lost community knowledge if access to specific sites is required for teaching; it may harm the spiritual well-being of the community if sacred or ceremonial sites are degraded or if access is denied altogether; it may result in language impacts (place-names, place-specific activity names); it may impair the gathering of specific foods and medicines, the loss of which could cause a nutritional or medical decrement; and it may impose detrimental replacement costs on a community that already lacks sufficient funds for adequate health care and nutrition.
7. There are five aspects to this answer: direct food exposures, increased exposure due to food collecting practices (including hunting, fishing, and gathering), indirect exposures to materials that are used for household and cultural items (such as food storage baskets, cooking pots, etc.), unique cultural practices such as the use of
the sweat lodge (increased inhalation exposure if the water used to produce the steam is contaminated), and the wider exposure of the trade network (total community contaminant burden). Thus, a traditional subsistence exposure scenario would have to include more than just an individual’s increased consumption of plants and animals that could potentially come from contaminated areas. It will need to reflect the many ways that people interact with the environment, and also the recognition that tribal communities are not exposed just one person at a time, but as whole extended families or communities at a time. It is also important to recognize that when persistent contaminants are present, exposure might extend for more than just one generation, thus resulting in another type of increased community exposure.
8. Breaking the human exposure pathway through the use of institutional controls reduces direct human exposure, but may not result in actual cleanup, and thus would not reduce environmental risk (exposure of biota or ecosystems). If the institutional controls result in lost access to traditional areas or specific ceremonial sites, the ability of the tribal community to exercise their culture and religion is diminished. If the institutional control limits the number of visits or types of activities in order to reduce exposure, this means that the people are being asked to accept exposure in return for being allowed access to their ancestral lands and resources. Using the narrow definition of risk (risk = probability of symptoms if excessive exposure occurs), reducing risk would be measured solely by the level of human exposure, and success would be defined as preventing exposure in excess of regulatory standards or conventionally accepted risk levels. Similarly, the loss of habitat or cultural resources during remediation would not be valued as highly as human exposure, and those resources might be irretrievably lost if the remedial technology was not chosen specifically to be as minimally intrusive as possible.
9. The risk to one person would be assessed using an exposure scenario representing the maximum reasonable exposure for the lifestyle that we wish to protect. For tribal members this would be a subsistence lifestyle that includes comprehensive consideration of major pollutant sources to which she/he might be exposed. This might also include a child’s exposure scenario, gender-specific activities, co-risk factors such as possible underlying health and nutritional deficits, and so on. The risk to the current population would include cumulative exposures, with cancer risk summed over everyone exposed, an evaluation of the number of people exposed to additive hazards from noncarcinogens, and specific evaluation of target organ toxicity (such as neurotoxicity if neurotoxins are present). The risk to future generations would estimate the concentrations of the contaminant over time (10 half-lives, for instance) and evaluate how much cancer or noncancer risk this would result in. This would be expressed using the analogy of how much exposure a person would receive if he lived 1000 or 10,000 years. The number of people exposed at various exposure levels would also be evaluated. The determination of whether this cumulative risk is acceptable can only be made through a negotiation process involving the people whose future members would be impacted. (A sketch of the multigenerational decay calculation appears after answer 12 below.)
10. The elicitation of information from experts is an established procedure for developing technical information.
Such a process can also be used with tribal elders to develop information about what are the appropriate risk measures to be evaluated, and whether there has been any adverse impact to them. There is no reason to think that this data is any less accurate than the information elicited from other experts, since it is just as verifiable as typical numerical data elicited from technical
experts. This data is no more “anecdotal” than best professional judgment is, and should be regarded as just as accurate.
11. The degree to which these are different will vary from situation to situation. There are some instances where multiple contaminants may be present, each slightly below its regulatory standard. This risk would be allowed under regulations even if the cumulative risk were above levels typically allowed during Superfund cleanups. This is both because economic considerations can be part of the basis for developing the regulatory standard, and because contaminants are regulated individually. Regulation-based cleanup and risk-based cleanup might therefore result in different cleanup levels. In other situations, the question is posed as how much risk reduction we can afford. The people whose children are being exposed, the polluters, and the people who must pay (the general taxpayer) may all be different, in which case there will be questions about whether society at large has a moral obligation to help protect someone else’s children, and whether the federal government has a legal obligation as natural resource trustee (under NRDA) and guarantor of tribal health and safety (under the treaties). There may also be a disproportionate distribution of benefits versus risks, such as local communities receiving the benefit of jobs in an industry that causes the contamination of resources belonging to people who seldom receive any employment benefit. Thus the question of affordable risk may pit local jobs against environmental cleanup, and can only be resolved through negotiation and education about respective rights and concerns.
12. This is a critical data gap that has received relatively little attention due to the presumption that if the concentration of a contaminant is low enough to be acceptable for one generation, it should be acceptable no matter how long it persists. For small confined gene pools, such as occur in many tribes, the cumulative dose to the total DNA contained in the gene pool might be an appropriate unit of analysis. The accumulation of nonlethal detrimental mutations over time could be estimated, and perhaps verified by the examination of genetic polymorphisms. Any such research, however, must be carefully designed, since the small numbers of people may preclude statistical significance. The ethics of such research must also be carefully considered, as has been seen with the Human Genome Project’s attempts to sequence the DNA of indigenous populations.
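As promised in answer 9, here is a minimal sketch of the multigenerational exposure idea: estimating contaminant concentration over several half-lives and summing the intake a hypothetical, continuously exposed receptor would accumulate. The half-life, starting concentration, intake rate, and 70-year generation length are invented for illustration and are not values from the chapter.

# Concentration after t years of first-order decay, and the cumulative intake of
# a hypothetical receptor exposed over many generations. All numbers are invented.

HALF_LIFE_YEARS = 30.0        # roughly the order of Cs-137 or Sr-90, for scale
C0 = 100.0                    # starting concentration, arbitrary units per liter
INTAKE_L_PER_DAY = 2.0        # drinking-water intake
GENERATION_YEARS = 70

def concentration(t_years):
    return C0 * 0.5 ** (t_years / HALF_LIFE_YEARS)

def cumulative_intake(total_years):
    # Simple yearly summation of concentration x daily intake; fine for a sketch.
    return sum(concentration(t) * INTAKE_L_PER_DAY * 365.0
               for t in range(int(total_years)))

for generations in (1, 3, 10):
    years = generations * GENERATION_YEARS
    print(f"{generations:2d} generation(s) ({years:4d} y): "
          f"final concentration {concentration(years):8.3f}, "
          f"cumulative intake {cumulative_intake(years):12.0f} unit-liters")

The point of the exercise is visible in the output: even as the concentration decays toward zero, the cumulative community intake keeps growing, which is the kind of multigenerational burden the answer argues should be negotiated with the affected people rather than assumed away.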
CHAPTER IV.6

1. Sustainable development is defined as “integrated strategies that would halt and reverse the negative impact of human behavior on physical environment and allow for livable conditions for future generations on Earth” (UNCED 1992).
2. Agenda 21, developed at the Earth Summit in Rio de Janeiro in 1992, presents a blueprint for the development of humanity in the 21st century, agreed upon by a majority of countries on Earth.
3. The majority of human societies (countries) are not sustainable, since they are highly dependent on fossil fuels and nonrenewable resources, which are being rapidly depleted.
4. Environmental Impact (EI) = P × C × T × E (this is qualitative and not an exact mathematical expression). The major factors are P = population, C = consumption, T = technology, E = energy consumption. (A toy illustration of how the factors combine appears after answer 11 below.)
5. North is the term used to denote the developed countries of North America, Europe, and East Asia, while South refers to developing countries located mostly in the
southern hemispheres (Latin America, Africa, South Asia). The importance of each factor in any locality varies. Western Europe and North America contribute mostly to consumption and energy use (C and E), while in developing countries that are attempting to industrialize, the major factors are population (P) and polluting technology (T). Also, energy efficiency is often very poor (China and the former USSR), even though useful energy consumption per person may be small. Countries of the former USSR and Eastern Europe have stable and limited populations, and consumption is also low; however, their outdated technology and poor energy efficiency (T and E) are major contributors to the tremendous environmental degradation of Eastern Europe uncovered after the fall of communism.
6. Compartmentalization can be overcome by an interdisciplinary and long-term approach to a problem.
7. Public transportation, bicycling, and walking increase sustainability, since the energy expended per person-mile traveled is many times smaller than with private vehicles. Bicycles are the most efficient means of transportation per mile traveled, using about 60 times less energy than cars for the same distance.
8. Urban planning is one of the determinants of the need to commute to a workplace or to daily activities, and thus has an impact on energy expenditure or conservation.
9. A more efficient process leaves less waste; thus pollution prevention is equivalent to more efficient manufacturing.
10. (a) Food choices have a great impact on health and particularly on the occurrence of degenerative diseases, such as heart disease, cancer, arteriosclerosis, diabetes, high blood pressure, and others, both in the etiology of those diseases and in their management. Long-term consumption of meat and milk products is associated with an increased incidence of degenerative disease, as compared with consumption of grains, vegetables, and beans. (b) Since each pound of meat takes 12–18 lbs of grain to produce, food choices have a direct impact on agriculture, which is one of the major factors in the Earth’s carrying capacity; in addition, meat and milk consumption are associated with an increase in the water needed for raising animals.
11. All the advances in environmental protection (clean air, water, and soil, easier commuting, energy efficiency, etc.) will be forgone if the population keeps growing. The finiteness of the Earth’s resources is a given fact, and therefore Earth systems can support only a limited number of people. Therefore, a prudent policy would be to encourage limits on the number of children per family, rather than encouraging large families.
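As noted in answer 4, EI = P × C × T × E is a qualitative identity rather than an exact formula, so the sketch below is only a toy comparison of relative environmental impact between two hypothetical regions; every number is invented, and the factors are treated as unitless indices.

# Toy relative-impact comparison using the qualitative identity EI = P x C x T x E.
# Factors are unitless indices relative to a reference region; numbers are invented.

regions = {
    #                        P (population), C (consumption), T (technology burden), E (energy use)
    "North (hypothetical)": (1.0, 5.0, 1.0, 4.0),
    "South (hypothetical)": (4.0, 1.0, 3.0, 1.0),
}

def relative_impact(p, c, t, e):
    return p * c * t * e

for name, factors in regions.items():
    print(f"{name}: relative EI = {relative_impact(*factors):.1f}")

Even this crude arithmetic shows the point of answers 5 and 6: regions can arrive at a comparable overall impact through very different combinations of population, consumption, technology, and energy use.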
Section I

Theoretical Background of Risk Analysis
CHAPTER I.1

Toxic Chemicals Noncancer Risk Analysis and U.S. Institutional Approaches to Risk Analysis

Vlasta Molak
SUMMARY

Most environmental problems that concern the public deal with exposures to toxic chemicals (by inhaling air, by ingestion of water or food, or by dermal exposure) originating from chemical or other industries, power plants, road vehicles, agriculture, etc. Noncancer chemical risk analysis has two types of uses: (1) to derive criteria and standards for various environmental media and (2) to characterize risks posed by a specific exposure scenario (e.g., at a Superfund site, by drinking contaminated water, by consuming contaminated food, by performing some manufacturing operations, by an accidental or deliberate spill or release of chemicals, etc.). Usually such exposure scenarios are complex and vary with each individual case; thus, methods in risk analysis must be modified to account for all possible exposures in a given situation. Chemical risk analysis used for criteria development generally does not determine the probability of an adverse effect. Rather, it establishes concentrations of chemicals that could be tolerated by most people in our food, water, or air without experiencing adverse health effects in either short-term or long-term exposures (depending on the type of criterion derived). These levels (either concentrations of chemicals in environmental media or total intake of a chemical by one or all routes of exposure) are derived by using point estimates of the average consumption of food and drink and of body parameters such as weight, skin surface, metabolic rate, etc. Risk analysis is then applied to derive “criteria” for particular pollutants,
which are then modified by risk management considerations to derive standards. There are numerous criteria and standards established for various chemicals by the U.S. Environmental Protection Agency (EPA), the U.S. Food and Drug Administration (FDA), the National Institute for Occupational Safety and Health (NIOSH), and the Occupational Safety and Health Administration (OSHA). Since many of them were established before formal risk analysis techniques became available, they are undergoing revision based on better risk analysis methods. For a particular pollution situation, one can measure or estimate exposures to a contaminant and compare them to the previously established criteria and/or standards. The likelihood of harm increases if the exposure levels exceed the derived “safe” levels. The exposure assessments could follow a deterministic model by assuming average parameter values (air, water, and food consumption, dermal intake, etc.) or could follow the Monte Carlo method, which uses real-world distribution data on various exposures, thus potentially giving more accurate and informative estimates of risk.

Key Words: toxic, chemicals, hazard, exposure, standard, criteria, dose response, acute, chronic, pollution
1. INTRODUCTION

Chemical risk analysis is generally divided into four parts (NAS 1983):

1. Hazard identification — identifying potentially toxic chemicals.
2. Dose–response relationships — determining toxic effects depending on the amounts ingested, inhaled, or otherwise entering the human organism. These are usually determined from animal studies. Different “end points” of toxicity are observed, depending on the target organ of a chemical. The severity of a particular effect is a function of dose.
3. Exposure assessment — determining the fate of the chemical in the environment and its intake by humans. Ideally, by modeling the environmental fate and transport of chemicals, and by evaluating food intake, inhalation, and possible dermal contact, one can assess the total quantities of toxic chemicals in an exposed individual or population that may cause adverse health effects. In criteria derivation, one uses either a worst-case or a most-probable exposure scenario and point values for various human parameters. Monte Carlo modeling uses real-world distribution data for those parameters.
4. Risk characterization — evaluating and combining the data in Items 2 and 3. For establishing criteria and standards, assumptions are made about “average exposures,” and the criteria are set at the concentration at which it is believed that no harm would occur. For example, the reference dose (RfD) and health advisories (for 1-day, 10-day, and subchronic exposures) are derived for many chemicals with the use of safety (uncertainty) factors to protect most individuals. If an actual exposure to an environmental pollutant (or pollutants) exceeds limits set by the criteria, efforts should be made to decrease the concentrations of the pollutant. The magnitude of risk can be estimated by comparing the particular exposure to derived criteria or reference doses (a minimal numerical sketch follows this list).
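A minimal sketch of the comparison described in Item 4, contrasting a deterministic point estimate with a Monte Carlo estimate of a drinking-water dose measured against a reference dose (RfD). The contaminant, its RfD, and the distributions of concentration, water intake, and body weight are hypothetical placeholders, not values taken from this chapter.

import random

# Hypothetical inputs (illustrative only)
RFD_MG_PER_KG_DAY = 0.005      # reference dose for a hypothetical contaminant
CONC_MEAN_MG_L = 0.05          # mean concentration in drinking water
INTAKE_MEAN_L = 2.0            # mean daily water intake
BODY_WEIGHT_MEAN_KG = 70.0

def dose(conc, intake, bw):
    """Average daily dose (mg/kg-day) from drinking water."""
    return conc * intake / bw

# Deterministic (point-estimate) screening
point_hq = dose(CONC_MEAN_MG_L, INTAKE_MEAN_L, BODY_WEIGHT_MEAN_KG) / RFD_MG_PER_KG_DAY
print(f"point-estimate hazard quotient: {point_hq:.3f}")

# Monte Carlo: sample plausible variability instead of using single averages
random.seed(1)
hqs = []
for _ in range(10_000):
    conc = random.lognormvariate(mu=-3.2, sigma=0.6)   # median ~ exp(-3.2) ~ 0.04 mg/L
    intake = max(0.2, random.gauss(2.0, 0.6))           # L/day
    bw = max(30.0, random.gauss(70.0, 15.0))            # kg
    hqs.append(dose(conc, intake, bw) / RFD_MG_PER_KG_DAY)

hqs.sort()
print(f"Monte Carlo median HQ: {hqs[len(hqs) // 2]:.3f}, "
      f"95th percentile HQ: {hqs[int(0.95 * len(hqs))]:.3f}, "
      f"fraction exceeding 1.0: {sum(h > 1 for h in hqs) / len(hqs):.3f}")

The point estimate gives a single hazard quotient, while the Monte Carlo run shows how much of the exposed population might exceed the “safe” level, which is the extra information the Monte Carlo method is said to provide in the Summary above.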
2. TOXICOLOGICAL BASES OF TOXIC SUBSTANCES RISK ANALYSIS

Over 110,000 chemicals are used in U.S. commerce. The Registry of Toxic Effects of Chemical Substances (RTECS) database, maintained by NIOSH, contains updated information on the toxicity of those chemicals (RTECS 1995). Since the number of chemicals potentially appearing in the environment is large, and their toxicological effects are very complex and differ depending on the chemical and the conditions of exposure, it is sometimes difficult to determine how toxic is toxic. Risk analysis helps determine which chemicals are dangerous and under what circumstances. It can also help establish relative risks from various chemicals (ranking risks). If, for example, in a particular industrial setting the derived health risk from pollutant A is higher than from pollutant B, that may indicate that action should first be taken to decrease the pollution by A. In order to be able to use information on such a large number of substances, toxicologists have developed classifications of chemicals by their acute, subacute, and chronic toxicity (Cassarett and Doull 1986).

2.1 Acute Toxicity
Acute toxicity is the most obvious and easiest to measure and is generally defined by the LD50 (lethal dose 50%). This is the dose, expressed in milligrams per kilogram of body weight, that causes death within 24 hours in 50% of exposed individuals after a single treatment, either oral or dermal. The LD50 is usually derived from animal studies (mice and rats). The measure of acute toxicity for gases is the LC50 (the lethal concentration of a chemical in air that causes death in 50% of animals if inhaled for a specified duration of time, usually 4 hours). Based on these definitions, chemicals are divided into toxicity ratings of practically nontoxic, slightly toxic, moderately toxic, very toxic, extremely toxic, and supertoxic (Table 1).
Table 1  Toxicity Ratings of Chemicals

Toxicity rating         Probable lethal oral dose for humans (per kg body weight)
Practically nontoxic    >15 g
Slightly toxic          5–15 g
Moderately toxic        0.5–5 g
Very toxic              50–500 mg
Extremely toxic         5–50 mg
Supertoxic