INTERNATIONAL SEMINAR ON
NUCLEAR WAR AND PLANETARY EMERGENCIES 42nd Session: World Energy Crisis - Energy and Pollution: Essential Technologies for Managing the Coupled Challenges of Climate Change and Energy Security; Energy, Water, Climate, Pollution and Limits of Development in Asian Countries; Global Monitoring of the Planet - Sensitivity of Climate to Additional CO2 as Indicated by Water Cycle Feedback Issues, Climate Uncertainties Addressed by Satellites, The Basic Mathematics Needed for All Models; Pollution and Medicine - The Revolution in the Environmental Health Sciences and the Emergence of Green Chemistry; Information Security - Cyber Conflict and Cyber Stability: Finding a Path to Cyber Peace; Cultural Pollution - The Erice Science for Peace Award Scientific Session
THE SCIENCE AND CULTURE SERIES Nuclear Strategy and Peace Technology Series Editor: Antonino Zichichi
1981
International Seminar on Nuclear War - 1st Session: The World-wide Implications of Nuclear War
1982
International Seminar on Nuclear War - 2nd Session: How to Avoid a Nuclear War
1983
International Seminar on Nuclear War - 3rd Session: The Technical Basis for Peace
1984
International Seminar on Nuclear War - 4th Session: The Nuclear Winter and the New Defence Systems: Problems and Perspectives
1985
International Seminar on Nuclear War - 5th Session: SDI, Computer Simulation, New Proposals to Stop the Arms Race
1986
International Seminar on Nuclear War - 6th Session: International Cooperation: The Alternatives
1987
International Seminar on Nuclear War - 7th Session: The Great Projects for Scientific Collaboration East-West-North-South
1988 -
International Seminar on Nuclear War - 8th Session: The New Threats: Space and Chemical Weapons - What Can be Done with the Retired I.N.F. Missiles-Laser Technology
1989
International Seminar on Nuclear War - 9th Session: The New Emergencies
1990
International Seminar on Nuclear War - 10th Session: The New Role of Science
1991
International Seminar on Nuclear War - 11th Session: Planetary Emergencies
1991
International Seminar on Nuclear War - 12th Session: Science Confronted with War (unpublished)
1991
International Seminar on Nuclear War and Planetary Emergencies - 13th Session: Satellite Monitoring of the Global Environment (unpublished)
1992
International Seminar on Nuclear War and Planetary Emergencies - 14th Session: Innovative Technologies for Cleaning the Environment
1992
International Seminar on Nuclear War and Planetary Emergencies - 15th Session (1st Seminar after Rio): Science and Technology to Save the Earth (unpublished)
1992
International Seminar on Nuclear War and Planetary Emergencies - 16th Session (2nd Seminar after Rio): Proliferation of Weapons for Mass Destruction and Cooperation on Defence Systems
1993
International Seminar on Planetary Emergencies - 17th Workshop: The Collision of an Asteroid or Comet with the Earth (unpublished)
1993
International Seminar on Nuclear War and Planetary Emergencies - 18th Session (4th Seminar after Rio): Global Stability Through Disarmament
1994
International Seminar on Nuclear War and Planetary Emergencies - 19th Session (5th Seminar after Rio): Science after the Cold War
1995
International Seminar on Nuclear War and Planetary Emergencies - 20th Session (6th Seminar after Rio): The Role of Science in the Third Millennium
1996
International Seminar on Nuclear War and Planetary Emergencies - 21st Session (7th Seminar after Rio): New Epidemics, Second Cold War, Decommissioning, Terrorism and Proliferation
THE SCIENCE AND CULTURE SERIES Nuclear Strategy and Peace Technology
INTERNATIONAL SEMINAR ON
NUCLEAR WAR AND PLANETARY EMERGENCIES 42nd Session: World Energy Crisis - Energy and Pollution: Essential Technologies for Managing the Coupled Challenges of Climate Change and Energy Security; Energy, Water, Climate, Pollution and Limits of Development in Asian Countries; Global Monitoring of the Planet - Sensitivity of Climate to Additional CO2 as Indicated by Water Cycle Feedback Issues, Climate Uncertainties Addressed by Satellites, The Basic Mathematics Needed for All Models; Pollution and Medicine - The Revolution in the Environmental Health Sciences and the Emergence of Green Chemistry; Information Security - Cyber Conflict and Cyber Stability: Finding a Path to Cyber Peace; Cultural Pollution - The Erice Science for Peace Award Scientific Session
"E. Majorana" Centre for Scientific Culture, Erice, Italy, 19-24 August 2009
Series Editor and Chairman: A. Zichichi
Edited by R. Ragaini
World Scientific
NEW JERSEY · LONDON · SINGAPORE · BEIJING · SHANGHAI · HONG KONG · TAIPEI · CHENNAI
Published by World Scientific Publishing Co. Pte. Ltd. 5 Toh Tuck Link, Singapore 596224
USA office: 27 Warren Street, Suite 401-402, Hackensack, NJ 07601 UK office: 57 Shelton Street, Covent Garden, London WC2H 9HE
British Library Cataloguing-in-Publication Data A catalogue record for this book is available from the British Library.
INTERNATIONAL SEMINAR ON PLANETARY EMERGENCIES - 42ND SESSION: WORLD ENERGY CRISIS - ENERGY & POLLUTION: ESSENTIAL TECHNOLOGIES FOR MANAGING THE COUPLED CHALLENGES OF CLIMATE CHANGE AND ENERGY SECURITY, ENERGY, WATER, CLIMATE, POLLUTION & LIMITS OF DEVELOPMENT IN ASIAN COUNTRIES; GLOBAL MONITORING OF THE PLANET - SENSITIVITY OF CLIMATE TO ADDITIONAL CO2 AS INDICATED BY WATER CYCLE FEEDBACK ISSUES, CLIMATE UNCERTAINTIES ADDRESSED BY SATELLITES, THE BASIC MATHEMATICS NEEDED FOR ALL MODELS; POLLUTION AND MEDICINE - THE REVOLUTION IN THE ENVIRONMENTAL HEALTH SCIENCES AND THE EMERGENCE OF GREEN CHEMISTRY; INFORMATION SECURITY - CYBER CONFLICT AND CYBER STABILITY: FINDING A PATH TO CYBER PEACE; CULTURAL POLLUTION - THE ERICE SCIENCE FOR PEACE AWARD SCIENTIFIC SESSION

Copyright © 2010 by World Scientific Publishing Co. Pte. Ltd.
All rights reserved. This book, or parts thereof, may not be reproduced in any form or by any means, electronic or mechanical, including photocopying, recording or any information storage and retrieval system now known or to be invented, without written permission from the Publisher.
For photocopying of material in this volume, please pay a copying fee through the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, USA. In this case permission to photocopy is not required from the publisher.
ISBN-13 978-981-4327-19-0
ISBN-10 981-4327-19-0
Printed in Singapore by B & Jo Enterprise Pte Ltd
1997 -
International Seminar on Nuclear War and Planetary Emergencies - 22nd Session (8th Seminar after Rio): Nuclear Submarine Decontamination, Chemical Stockpiled Weapons, New Epidemics, Cloning of Genes, New Military Threats, Global Planetary Changes, Cosmic Objects & Energy
1998 -
International Seminar on Nuclear War and Planetary Emergencies - 23rd Session (9th Seminar after Rio): Medicine & Biotechnologies, Proliferation & Weapons of Mass Destruction, Climatology & EI Nino, Desertification, Defence Against Cosmic Objects, Water & Pollution, Food, Energy, Limits of Development, The Role of Permanent Monitoring Panels
1999 -
International Seminar on Nuclear War and Planetary Emergencies - 24th Session: HIV/AIDS Vaccine Needs, Biotechnology, Neuropathologies, Development Sustainability - Focus Africa, Climate and Weather Predictions, Energy, Water, Weapons of Mass Destruction, The Role of Permanent Monitoring Panels, HIV Think Tank Workshop, Fertility Problems Workshop
2000 -
International Seminar on Nuclear War and Planetary Emergencies - 25th Session: Water - Pollution, Biotechnology - Transgenic Plant Vaccine, Energy, Black Sea Pollution, Aids - Mother-Infant HIV Transmission, Transmissible Spongiform Encephalopathy, Limits of Development - Megacities, Missile Proliferation and Defense, Information Security, Cosmic Objects, Desertification, Carbon Sequestration and Sustainability, Climatic Changes, Global Monitoring of Planet, Mathematics and Democracy, Science and Journalism, Permanent Monitoring Panel Reports, Water for Megacities Workshop, Black Sea Workshop, Transgenic Plants Workshop, Research Resources Workshop, Mother-Infant HIV Transmission Workshop, Sequestration and Desertification Workshop, Focus Africa Workshop
2001 -
International Seminar on Nuclear War and Planetary Emergencies - 26th Session: AIDS and Infectious Diseases - Medication or Vaccination for Developing Countries; Missile Proliferation and Defense; Tchernobyl - Mathematics and Democracy; Transmissible Spongiform Encephalopathy; Floods and Extreme Weather Events Coastal Zone Problems; Science and Technology for Developing Countries; Water Transboundary Water Conflicts; Climatic Changes - Global Monitoring of the Planet; Information Security; Pollution in the Caspian Sea; Permanent Monitoring Panels Reports; Transmissible Spongiform Encephalopathy Workshop; AIDS and Infectious Diseases Workshop; Pollution Workshop
2002 -
International Seminar on Nuclear War and Planetary Emergencies - 27th Session: Society and Structures: Historical Perspectives - Culture and Ideology; National and Regional Geopolitical Issues; Globalization - Economy and Culture; Human Rights - Freedom and Democracy Debate; Confrontations and Countermeasures: Present and Future Confrontations; Psychology of Terrorism; Defensive Countermeasures; Preventive Countermeasures; General Debate; Science and Technology: Emergencies; Pollution, Climate - Greenhouse Effect; Desertification, Water Pollution, Algal Bloom; Brain and Behaviour Diseases; The Cultural Emergency: General Debate and Conclusions; Permanent Monitoring Panel Reports; Information Security Workshop; Kangaroo Mother's Care Workshop; Brain and Behaviour Diseases Workshop
2003 -
International Seminar on Nuclear War and Planetary Emergencies - 29th Session: Society and Structures: Culture and Ideology - Equity - Territorial and Economics - Psychology - Tools and Countermeasures - Worldwide Stability - Risk Analysis for Terrorism - The Asymmetric Threat - America's New "Exceptionalism" - Militant Islamist Groups Motives and Mindsets - Analysing the New Approach - The Psychology of Crowds - Cultural Relativism - Economic and Socio-economic Causes and Consequences - The Problems of American Foreign Policy - Understanding Biological Risk - Chemical Threats and Responses - Bioterrorism - Nuclear Survival Criticalities - Responding to the Threats - National Security and Scientific Openness - Working Groups Reports and Recommendations
2003 -
International Seminar on Nuclear War and Planetary Emergencies - 30th Session: Anniversary Celebrations: The Pontifical Academy of Sciences 400th - The 'Ettore Majorana' Foundation and Centre for Scientific Culture 40th - H.H. John Paul II Apostolate 25th - Climate/Global Warming: The Cosmic Ray Effect; Effects on Species and Biodiversity; Human Effects; Paleoclimate Implications; Evidence for Global Warming - Pollution: Endocrine Disrupting Chemicals; Hazardous Material; Legacy Wastes and Radioactive Waste Management in USA, Europe, Southeast Asia and Japan - The Cultural Planetary Emergency: Role of the Media; Intolerance; Terrorism; Iraqi Perspective; Open Forum Debate - AIDS and Infectious Diseases: Ethics in Medicine; AIDS Vaccine Strategies - Water: Water Conflicts in the Middle East - Energy: Developing Countries; Mitigation of Greenhouse Warming - Permanent Monitoring Panels Reports - Workshops: Long-Term Stewardship of Hazardous Material; AIDS Vaccine Strategies and Ethics
2004 -
International Seminar on Nuclear War and Planetary Emergencies - 31st Session: Multidisciplinary Global Approach of Governments and International Structures: Societal Response - Scientific Contributions to Policy - Economics - Human Rights Communication - Conflict Resolution - Cross-Disciplinary Responses to CBRN Threats: Chemical and Biological Terrorism - Co-Operation Between Russia and the West - Asymmetrical Conflicts - CBW Impact - Cross-Disciplinary Challenges to Emergency Management, Media Information and Communication: Role of Media in Global Emergencies - Emergency Responders - Working Groups' Reports and Recommendations
2004 -
International Seminar on Nuclear War and Planetary Emergencies - 32nd Session: Limits of Development: Migration and Cyberspace; in Europe; Synoptic European Overview; From and Within Asia; Globalization - Climate: Global Warming; a Chronology; Simple Climate Models; Energy and Electricity Considerations - T.S.E.: CJD and Blood Transfusion; BSE in North America; Gerstmann-Straussler-Scheinker Disease - The Cultural Emergency: Innovations in Communications and IT - Cosmic Objects: Impact Hazard; Close Approaches; Asteroid Deflection; Risk Assessment and Hazard Reduction; Hayabusa and Follow Up - Aids and Infectious Diseases: Ethics in Medicine; International Co-operation; Laboratory Biosecurity Guidelines; Georgian Legislation; Biosecurity Norms and International Organizations, Legal Measures Against Biocrimes - Water and Pollution: Cycle Overview; Beyond Cost and Price; Requirements in Rural Iran; Isotope Techniques; Clean and Reliable Water for the 21st Century - Permanent Monitoring Panels Reports - Workshops: Global Biosecurity; Cosmic Objects
2005 -
International Seminar on Nuclear War and Planetary Emergencies - 34th Session: Energy: Nuclear and Renewable Energy; Energy Technologies for the 21st Century; Repositories Development; Nuclear Power in Europe and in Asia; The Future of Nuclear Fusion - Climate: Global Warming; Celestial Climate Driver; Natural and Anthropogenic Contributions; Climate Data and Comparison with Models; Understanding Common Climate Claims - AIDS and Infectious Diseases: New Threats from Infectious Agents - SARS Epidemic; Vaccines Development; Transmissible Spongiform Encephalopathies Update - Limits of Development: International Points of View on Migration - Pollution: Science and Technology; Subsurface Laser Drilling - Desertification: A Global Perspective; Integrated Approaches - Disarmament and Cultural Emergencies: A WFS Achievement in China; Non-Proliferation - Permanent Monitoring Panel Reports - Workshops: Energy; Information Security; Building Resilience Associated with the Third Meeting on Terrorism
2006 -
International Seminar on Nuclear War and Planetary Emergencies - 36th Session: Energy: Global Nuclear Power Future; Global Monitoring of the Planet - Proliferation: Nuclear Weapons; AIDS and Infectious Diseases: Avian Flu - Global Health; Climatology: Global Warming/Aerosols and Satellites; Pollution: Plastic Contaminants in Water; Information Security: Relevance of Cyber Security; Limits of Development: Development of Sustainability; Defence Against Cosmic Objects; WFS General Meeting: Cultural Emergency - Focus: Terrorism; Permanent Monitoring Panel Reports; Limits of Development Permanent Monitoring Panel Meeting; World Energy Monitoring Workshop.
2007 -
International Seminar on Nuclear War and Planetary Emergencies - 38th Session: World Energy Crisis; Managing Climate Change; Mitigation of Greenhouse Gases; Geoengineering & Adaptation; Theoretical Alternatives to Climate Modelling; US Missile Defence Shield; Global Monitoring of the Planet; Life Cycle Nuclear Energy Environmental Issues; The Epidemic of Alzheimer; Infectious Agents and Cancer
2008 -
Energy: Nuclear Power Present and Future; Sustainability of Biofuels; Resolving the Nuclear Waste - Climatology Model and Statistics; Ozone and Climate Change Interaction; Spatio-Temporal Field of Atmospheric CO2; Forest Policies - Medicine: Vector-Borne Diseases; Screening Technology - Pollution: Air-Borne Particulates - Global Monitoring of the Planet: Disarmament and Non-Proliferation Regime; The Crisis in Internet Security; The Northern Sea Route - The Erice Science for Peace Award Scientific Session
2009 -
World Energy Crisis - Energy & Pollution: Essential Technologies for Managing the Coupled Challenges of Climate Change and Energy Security, Energy, Water, Climate, Pollution & Limits of Development in Asian Countries; Global Monitoring of the Planet - Sensitivity of Climate to Additional CO2 as Indicated by Water Cycle Feedback Issues, Climate Uncertainties Addressed by Satellites, The Basic Mathematics Needed for all Models; Pollution and Medicine - The Revolution in the Environmental Health Sciences and the Emergence of Green Chemistry; Information Security - Cyber Conflict and Cyber Stability: Finding a Path to Cyber Peace; Cultural Pollution - The Erice Science for Peace Award Scientific Session
CONTENTS
1. OPENING SESSION Antonino Zichichi Why Science is Needed for the Culture of the Third Millennium-The Motor for Progress
3
Nicholas P. Samios Acceptance Remarks on Receiving the 2009 Gian Carlo Wick Gold Medal Award
37
Honglie Sun Glacial Retreat and Its Impact in Tibetan Plateau Under Global Warming
39
Yuri Antonovitch Izrael Climate Stabilization on the Basis of Geo-Engineering Technologies
49
Herman H. Shugart Modeling Forest Ecosystems, Their Response to and Interaction with Global Climate Change
57
Jan Szyszko Forest Policies, Carbon Sequestration and Biodiversity Protection
67
Henning Wegener and William Barletta Avoiding Disaster: Book Presentation
81
2. INFORMATION SECURITY FOCUS: CYBER CONFLICTS AND CYBER STABILITY - FINDING A PATH TO CYBER PEACE Henning Wegener Cyber Conflict vs. Cyber Stability: Finding a Path to Cyber Peace
85
Hamadoun I. Toure Advancing the Global Cybersecurity Agenda and Promoting Cyberstability Globally
87
Mohd Noor Amin Bridging the Global Gaps in Cyber Security
91
Jody R. Westby Cyber War vs. Cyber Stability
97
John G. Grimes Cyber Conflict vs. Cyber Security: Finding a Path to Peace
105
Rick Wesson Information Security, Ensembles of Experts
109
Jacques Bus Cyber Conflict vs. Cyber Stability: EU and Multi-National Collaboration
115
Jody Westby and William Barletta Erice Declaration on Principles for Cyber Stability and Cyber Peace
119
3. POLLUTION FOCUS: INTEGRATING ENVIRONMENTAL HEALTH RESEARCH AND CHEMICAL INNOVATION
John Peterson Myers Fomenting New Opportunities to Protect Human Health
123
John C. Warner Green Chemistry: A Necessary Step to a Sustainable Future
129
Jerrold J. Heindel Health Impact of Environmental Chemicals: Need for Green Chemistry
135
Terry Collins Moving the Chemical Enterprise Toward Sustainability: Key Issues
143
4. ENERGY & CLIMATE FOCUS: ESSENTIAL TECHNOLOGIES FOR MODERATING CLIMATE CHANGE AND IMPROVING ENERGY SECURITY Carl O. Bauer Balancing Perspectives on Energy Supply, Economics, and the Environment
151
Edward S. Rubin The Outlook for Power Plant CO2 Capture
157
Wolfgang Eichhammer Making Rapid Transition to an Energy System Centered on Energy Efficiency and Renewables Possible
175
Giorgio Simbolotti Beyond Emerging Low-Carbon Technologies to Face Climate Change?
197
Lee Lane, W. David Montgomery and Anne E. Smith Institutions for Developing New Climate Solutions
205
Michael C. MacCracken Moderating Climate Change by Limiting Emissions of Both Short- and Long-Lived Greenhouse Gases
225
Masao Tamada Current Status of Technology for Collection of Uranium from Seawater
243
Roger W. Bentley An Explanation of Oil Peaking
253
Peter Jackson The Future of Global Oil Supply: Understanding the Building Blocks
271
Rodney F. Nelson The Importance of Technology-The Constant Wild Card
283
Maw-Kuen Wu Recent Scientific Development in Taiwan in Response to Global Climate Change
305

5. CLIMATE FOCUS: GLOBAL WARMING AND GREENHOUSE GASES
Mikhail I. Antonovsky Exponential Analysis in the Problem of the Assessment of the Contribution of Greenhouse Gases in Global Warming
313
6. ENERGY, CLIMATE, POLLUTION AND LIMITS OF DEVELOPMENT FOCUS: ADVANCED TECHNOLOGIES AND STRATEGIES IN CHINA FOR MEETING THE ENERGY, ENVIRONMENT AND ECONOMY PREDICAMENT IN A GREENHOUSE CONSTRAINED SOCIETY Mark D. Levine Myths and Realities about Energy and Energy-Related CO2 Emissions in China
329
Zhang Xiliang Technologies and Policies for the Transition to Low Carbon Energy System in China
335
Mingyuan Li Assessment of CO2 Storage Potential in Oil/Gas-Bearing Reservoirs in Songliao Basin of China
357
Yuan Daoxian Carbon Cycle in Karst Processes
369
Jie Zhuang and Gui-Rui Yu Bioenergy in China: A Grand Challenge for Economic and Environmental Sustainability
387
Jun Xia Screening for Climate Change Adaptation: Water Problem, Impact and Challenges in China
397
7. CLIMATE & DATA FOCUS: SIGNIFICANT CLIMATE UNCERTAINTIES ADDRESSED BY SATELLITES John A. Haynes NASA Satellite Observations for Climate Research and Applications for Public Health
407
Judit M. Pap Climate Insights from Monitoring Solar Energy Output
415
8. CLIMATE & CLOUDS FOCUS: SENSITIVITY OF CLIMATE TO ADDITIONAL CO2 AS INDICATED BY WATER CYCLE FEEDBACK ISSUES William Kininmonth A Natural Limit to Anthropogenic Global Warming
431
Richard S. Lindzen and Yong-Sang Choi On the Observational Determination of Climate Sensitivity and Its Implications
445
Garth W. Paltridge Two Basic Problems of Simulating Climate Feedbacks
463
9. CLIMATE WITHOUT COMPUTER SIMULATION FOCUS: MATHEMATICS, PHYSICS, AND CLIMATE Kyle L. Swanson What is the Climate Change Signal?

471

Christopher Essex A Key Open Question of Climate Forecasting
481
10. CLIMATE AND HEALTH FOCUS: WINDBLOWN DUST Mark B. Lyles Medical Geology: Dust Exposure and Potential Health Risks in the Middle East
497
Dale Griffin Climate Change and Climate Systems Influence and Control the Atmospheric Dispersion of Desert Dust: Implications for Human Health
503
11. SCIENCE & TECHNOLOGY FOCUS: WMD PROLIFERATION - ENERGY OF THE FUTURE - MATHEMATICS & DEMOCRACY Gregory Canavan Remote Detection with Particle Beams
511
Lowell Wood Exploring the Italian Navigator's New World: Toward Economic, Full-Scale, Low-Carbon, Conveniently-Available, Proliferation-Robust, Renewable Energy Resources
523
K. C. Sivaramakrishnan The Mathematics of Democracy in South Asia
543
12. WFS GENERAL MEETING PMP REPORTS-DEBATE AND CONCLUSIONS Lord John Alderdice Permanent Monitoring Panel on Motivations for Terrorism
551
Franco M. Buonaguro AIDS and Infectious Diseases PMP
555
Nathalie Charpak Mother and Child PMP
559
Christopher D. Ellis Permanent Monitoring Panel on Limits of Development
573
Lorne Everett Pollution Permanent Monitoring Panel: Annual Report
579
Charles McCombie Multinational Repositories: Recent Developments and 2010 Session and Workshop Proposals
583
William Fulkerson, Carmen Difiglio, Bruce Stram and Mark Levine Energy PMP Report
589
Sally Leivesley Report of the Permanent Monitoring Panel for the Mitigation of Terrorist Acts: PMP-MTA
599
William A. Sprigg Permanent Monitoring Panel on Climate Activity Report
605
Henning Wegener and Jody R. Westby Permanent Monitoring Panel on Information Security Report from the Co-Chairs
609
13. INFORMATION SECURITY PANEL MEETING World Federation of Scientists: Permanent Monitoring Panel on Information Security Erice Declaration on Principles for Cyber Stability and Cyber Peace
613
World Federation of Scientists: Permanent Monitoring Panel on Information Security Top Cyber Security Problems that need Resolution to Address Communications
615
World Federation of Scientists: Permanent Monitoring Panel on Information Security Quest for Cyber Peace
621
14. LIMITS OF DEVELOPMENT PANEL MEETING Juan Manuel Borthagaray and Andres Borthagaray About Questions to be Discussed on Occasion of the 2009 Erice Meeting of the PMP Limits of Development: The Situation in Argentina
627
Alberto Gonzalez-Pozo Sustainable Development in Mexico: Facing the Multi-Headed Hydra
639
15. MITIGATION OF TERRORIST ATTACKS MEETING Richard Wilson Permanent Monitoring Panel - Mitigation of Terrorist Acts (PMP-MTA) Workshop Agenda
647
Friedrich Steinhäusler Development of CBRN Event Mitigation
649
Annette L. Sobel One Science for CBRN Mitigation
657
Richard Wilson The Need for a Corps of Radiation Workers for Immediate Assignment
661
Ramamurti Rajaraman India's Response to the Prospect of WMD Terrorism
669
Vasily Krivokhizha Politization in the Process of International Cooperation to Mitigate Nuclear Terrorism: Some Dubious Results
677
Robert V. Duncan Immediate Communications in the CBRN Environment
691
Richard L. Garwin Immediate Evaluation of Radiological and Nuclear Attacks
693
Richard Wilson Establishment of a Scientifically-Informed Rapid Response System
705
16. ENERGY PANEL MEETING Akira Miyahara Status of ITER Broader Approach Activities
711
Akira Miyahara Topics of Energy Research in Japan
713
Hisham Khatib Impact of the Financial Crisis of 2008 on World Energy
715
17. GREEN CHEMISTRY WORKSHOP Evan S. Beach and Paul T. Anastas Plastics Additives and Green Chemistry
721
Nicolas Olea Plastics, Plasticizers and Consumer Products
729
Bruce Blumberg, Felix Grün and Severine Kirchner Organotins are Potent Inducers of Vertebrate Adipogenesis: The Case for Obesogens
737
Wim Thielemans Bio-Based Polymers: A Green Chemistry Perspective
747
Karen Peabody O'Brien Revolutionary Sciences: Green Chemistry and Environmental Health
757
Frederick S. vom Saal, Julia A. Taylor, Paola Palanza and Stefano Parmigiani The High-Volume Hormonally Active Chemical Bisphenol A: Human Exposure, Health Hazards and Need to Find Alternatives
763
18. AIDS AND INFECTIOUS DISEASES Franco M. Buonaguro 2009 Progress Report of the MCD-217 Project and 2010 Research Project, East-Africa AIDS Research Center at the Uganda Virus Research Institute (Uvri), Entebbe, Uganda
775
19. SEMINAR PARTICIPANTS Seminar Participants
783
20. ETTORE MAJORANA ERICE SCIENCE FOR PEACE PRIZE - SCIENTIFIC SESSION Why Science is Needed for the Culture of the Third Millennium Antonio M. Battro The Impact of Digital Technologies Among Children of Developing Countries
797
Richard Wilson The Crucial Role of Science (and Scientists) in Public Affairs: A Suggestion for Coping with Terrorism
799
Christopher Essex When Scientific Technicalities Matter
807
Anastasios Tsonis The Use and Misuse of Science-An Example
813
Robert Huber Innovation Cannot Be Planned
817
Henning Wegener Why Science is Needed for the Culture of the Third Millennium
821
Albert Arking Global Warming and the Energy Crisis: How Science Can Solve Both Problems
825
Carmen Difiglio Co-Benefits of Climate Policies: The Role of Science
835
Zenonas Rokus Rudzikas Why Science is Needed for the Culture of the Third Millennium: Historical Experience of a Small Country (Lithuania)
839
Maw-Kuen Wu Means to Propagate our Ideas in Scientific and Decision-Making Circles
845
Bruno Maraviglia The Human Brain Function Investigated by New Physical Methods
855
Jan Szyszko Quality of Life-How to use Ecological Science for Sustained Development
861
M. J. Tannenbaum Fundamental Science and Improvement of the Quality of Life - Space Quantization to MRI
865
Frank L. Parker Improving the Chances for Peace by Providing Almost Limitless Energy
877
Lord John Alderdice A Science of the Irrational Can Help Protect Science from Irrational Attacks
889
SESSION 1 OPENING SESSION
THE INTERNATIONAL SEMINARS ON PLANETARY EMERGENCIES AND ASSOCIATED MEETINGS - 42nd SESSION

PROFESSOR ANTONINO ZICHICHI
CERN, Geneva, Switzerland; University of Bologna, Italy; and Centro Enrico Fermi, Italy

OPENING SESSION: WHY SCIENCE IS NEEDED FOR THE CULTURE OF THE THIRD MILLENNIUM - THE MOTOR FOR PROGRESS

Dear Colleagues, Ladies and Gentlemen, I welcome you all to this 42nd Session of the International Seminars on Nuclear War and Planetary Emergencies and declare the Session to be open. Why is this distinguished group of Interdisciplinary Scientists at the Ettore Majorana Foundation and Centre for Scientific Culture (EMFCSC) in Erice? Because we care about the consequences of Environmental and Cultural Pollution on the future of the human race. We want to overcome the danger of an Environmental Holocaust: spending enormous resources, billions of Dollars/Euros, for the solution of problems whose origin is believed to be known but is not. In our action we have the support of distinguished members of the Italian Government:
• The Minister for Foreign Affairs, On. Franco Frattini
• The Minister of Culture, On. Sandro Bondi
• The Minister of Science and University, On. Maria Stella Gelmini
and of the Parliament:
• The President of the Senate, On. Renato Schifani
• The President of the Senate Environment Commission, On. Antonio D'Alì
• The President of the Government, On. Silvio Berlusconi and his deputy Dr. Gianni Letta
As you probably know, Dr. Gianni Letta has long been engaged in modern culture. He had the courage to create in Italy a scientific page in the newspaper he was directing, "Il Tempo" (The Times). The success of this initiative induced the most popular newspapers to open their doors to scientific culture. I have been working with him for two decades. Let me go back to the Environmental and Cultural Pollution. We are confronted with a very difficult task since, at the origin of the Environmental and Cultural Pollution, there is a lack of Knowledge, which in our days means a lack of Scientific Culture. This is why we have to convince the great public that Science is needed in the Culture of the third millennium.
We scientists cannot remain silent when the great public shows a vivid interest in topics such as:
• Global warming
• The energy crisis
• Information security
• The environment
• Intelligent Design
• Evolution
We have to convince the great public that the solution to all these problems requires clarity and rigour.

EXAMPLES OF INTERVENTIONS IN SOME TOPICS OF VIVID INTEREST TO THE GREAT PUBLIC
• Rigorous Logic in the Theory of Evolution, A. Zichichi, Pontificia Academia Scientiarum, Plenary Session on "Scientific Insights into the Evolution of the Universe and of Life", Vatican City (2008).
• Big Bangs and Galilean Science, A. Zichichi, Il Nuovo Cimento, Vol. 124 B, N. 2, Italian Physical Society (January 2009).
• Why Science is Needed for the Culture of the Third Millennium: The Motor for Progress, A. Zichichi, published in Public Service Review European Union, 18, UK (2009).
• Lettere agli inglesi dall'Italia (The Cradle of Democracy and the Truth about Italy), A. Zichichi, Il Giornale (29 July 2009), translated by Barbara Zichichi.
• Language, Logic and Science, A. Zichichi, Proceedings of the 27th Session of the International Seminar on Nuclear War and Planetary Emergencies - 2002, The Science and Culture Series, World Scientific (2003).
• The Logic of Nature and Complexity, A. Zichichi, in Proceedings of the International Conference on "Quantum [Un]speakables" in Commemoration of John S. Bell, International Erwin Schrödinger Institut (ESI), Universität Wien (Austria), 10-14 November 2000.
• Complexity at the Fundamental Level, A. Zichichi, DESY, Hamburg, November 2005.
• Complexity and Planetary Emergencies, A. Zichichi, in Proceedings of the 36th Session of the International Seminars on Planetary Emergencies, Erice (Italy), August 2006.
• Complexity Exists at the Fundamental Level, A. Zichichi, in Proceedings of the 2004 Erice Subnuclear Physics School "How and Where to go Beyond the Standard Model", The Subnuclear Series Vol. 42, page 251, World Scientific (2007).
• Science and Society, A. Zichichi, MIUR, Rome 2003.
These interventions are all in the direction of convincing people that the best way to study a problem, with clarity and rigour, is through Science.

EXAMPLES OF RESULTS OBTAINED

1. A few months ago, a jewel of world physics, CERN in Geneva, ran the risk of losing the support of some countries. The Italian Government, thanks to Berlusconi and Frattini, took immediate steps to avoid a negative phase that would have affected the greatest laboratory of high energy subnuclear physics existing in the world.
2. One of our flags is Science without secrets or frontiers. Berlusconi has proposed Erice for the peace negotiations between Palestinians and Israelis.

In the past, during the Cold War, we contributed to overcoming the danger of a Nuclear Holocaust in the USA-USSR confrontation. How? With a great alliance between Science and Cultural-Political Leaders such as the most beloved President of the Italian Republic, Sandro Pertini, and the most beloved Pope in the History of the Catholic Church, John Paul II.
The President of the Italian Republic, Sandro Pertini, a strong supporter of the Ettore Majorana Foundation and Centre for Scientific Culture, receiving the Erice Statement.
30 March 1979.
Our community of interdisciplinary scientists has been able to contribute to overcoming the danger of the Nuclear Holocaust, whose worldwide known symbol is the fall of the Berlin Wall. Here comes the sequence of scientific leaders to whom we have dedicated our buildings:
• Patrick M.S. Blackett, who is the starting point (my youth).
• Isidor I. Rabi, a decisive step towards the creation of the Interdisciplinary Scientific Community.
• Eugene P. Wigner, a witness of the crucial steps towards the fall of the Berlin Wall.
• Victor F. Weisskopf, whose support was decisive when this Institution was created at CERN, Geneva.
What we have been able to do in the past decades are our credentials.
ETTORE MAJORANA FOUNDATION AND CENTRE FOR SCIENTIFIC CULTURE
DATA ON ACTIVITIES SINCE 1963: 123 SCHOOLS, 1,497 COURSES, 103,484 PARTICIPANTS (124 OF WHICH NOBEL LAUREATES) COMING FROM 932 UNIVERSITIES AND LABORATORIES OF 140 NATIONS.
And now the 42nd Session of the International Seminars on Nuclear War and Planetary Emergencies. Clarity and Rigour are needed to fight cultural pollution. As said before, the best source of Clarity and Rigour is Science.
THIS IS WHY SCIENCE IS NEEDED FOR THE CULTURE OF THE THIRD MILLENNIUM. MOREOVER, SCIENCE IS THE MOTOR FOR PROGRESS.
[Slide: SCIENCE & POLITICAL VIOLENCE; the Unification of all Forces of Nature; the 15 Classes of Planetary Emergencies, total number: 63.]
I am pleased to let you know that our actions have given interesting results, as testified by the letter from the President of the Italian Senate Environment Commission, Sen. Antonio d'Alì.
[Letter on the letterhead of the Senato della Repubblica, Sen. Antonio d'Alì, President of the XIII Commission (Territory, Environment, Environmental Assets), Rome, 20 August 200…; the letter transmits a summary to the competent international bodies.]
[The following pages reproduce the diagrams of the 15 classes of Planetary Emergencies (63 in total). The legible entries include:
• Water (total: 4): protection of water sources, savings of natural resources, desalinization.
• Soil (total: 3): protection against drought and desertification, pollutants, defence from catastrophic events.
• Food (total: 5): production (agriculture, marine), processing, storage, needs and waste.
• Pollution (total: 6): CO2 and the greenhouse effect, ozone depletion, oil and chemical spills, urban and domestic pollution.
• Class VII (total: 1).
• Class IX (total: 3): new military threats in the multipolar world, danger of proliferation of WMD.
• Class X (total: 1): science and technology for developing countries to avoid a North-South environmental holocaust.
• Class XI: the problem of organ substitution: natural organs, artificial organs, engineering, the understanding of "minimal life", from inert to living matter.
• Mathematics and Democracy (total: 2).
• Class XIV, common defense against cosmic objects: detection of and defense against meteorites and comets (the dinosaur extinction of 65 million years ago, ~10^8 MT).
• Class XV, the huge military investments: dismantling of the 60,000 nuclear warheads, the new arms race for high-precision weapons, reconversion of resources towards projects of peace, reconversion of war industries.]
The Planetary Emergencies have been at the centre of our attention since 1986 and here we are again, one hundred and eighteen scientists from 31 countries and 111 institutes and laboratories, gathered in Erice to analyse a series of crucial
multidisciplinary scientific issues. They are all part of the 63 Planetary Emergencies identified by the World Federation of Scientists 23 years ago. The main topics of this 42nd Session are:
• Cultural Pollution, which is of great importance to us scientists, as mentioned before.
• The World Energy Crisis, with a focus on Energy and Pollution, that is, the Essential Technologies for Managing the Coupled Challenges of Climate Change and Energy Security. These problems are strictly correlated with Energy, Water, Climate, Pollution and the Limits of Development the world over, including the Asian countries.
• Global Monitoring of the Planet, with a focus on: firstly, Climate Uncertainties Addressed by Satellites; secondly, the Sensitivity of Climate to Additional CO2 as Indicated by Water Cycle Feedback Issues; thirdly, the Basic Mathematics Needed for All Models, which will be about physics, mathematics and climate; and finally, in conjunction with medicine for the problem of windblown dust, Climate and Health.
• Pollution and Medicine: integrating environmental health research and chemical innovation, we will have a session on The Revolution in the Environmental Health Sciences and the Emergence of Green Chemistry.
• Information Security: Cyber Conflict and Cyber Stability, Finding a Path to Cyber Peace.
And now let me give you some good news. I have received a very interesting letter from the Mayor of San Vito Lo Capo. This letter is an example of the attention the Erice Seminars have received from many people in Italy. The letter recalls a very interesting episode when, during the Cold War, fellows with very high responsibilities in the two superpowers, such as Professor Teller (USA) and Professor Velikhov (USSR), were discussing hot topics in the beautiful transparent waters of San Vito Lo Capo.
The episode is at the origin of the famous statement by the father of the time-reversal invariance theorem, the great Professor Eugene Wigner, who said: "The Berlin Wall started to fall down in the San Vito waters, much before it fell down in Berlin". Mr. Matteo Rizzo was a young fellow at the time; now, being the Mayor of San Vito, he would like to establish, in a very beautiful part of San Vito, a Museum dedicated to all possible records that we could collect before it is too late. The Mayor is here with us and I would like to give the floor to Mr. Matteo Rizzo, in order for him to read out what he says in his letter.
[Letter on the letterhead of the Comune di San Vito Lo Capo, Provincia di Trapani, Ufficio del Sindaco.]
Fig. 3:
Relationship between the age of the stand and the thickness of the litter in cm (from Szyszko et al. 2003).
Fig. 4: Relationship between the thickness of the litter in cm and the weight of carbon in grams per 1 m² (from Szyszko et al. 2003).
BIODIVERSITY WITH MAN (PART II)
It turns out that man can and has to play a similar role. There are two reasons for that. Firstly, "ecological catastrophes" are currently mainly an anthropogenic factor (caused by man) and occur mainly in those places where they destroy the effects of man's economic activities and pose a danger to human safety. Hence, man needs to try to eliminate them. Secondly, the lack of "ecological catastrophes" entails no opportunity for species of the early succession stages and landscape species to occur. It is the "ecological catastrophes" destroying the carbon reserves collected in the advanced stages of the succession in forests that made it possible for the sticky bun (Suillus luteus), the woodlark (Lullula arborea) and the nightjar (Caprimulgus europaeus) to appear, and that also provided hunting possibilities for the majority of birds of prey. Protecting the advanced succession stages from destruction by "ecological catastrophes", man himself has to replace the forces of nature and play their "destructive role". Let us discuss the problem on the example of forests. If it were not for the rough interference with succession processes due to clear-cuts (Figure 5), and the reduction of the carbon content to a few dozen tons due to clear-cuts (Figure 6), forests would not contain the nightjar (Caprimulgus europaeus) or the woodlark (Lullula arborea), and the majority of birds of prey nesting on old trees would have had no place to hunt. It would not be possible for the MIB figure to attain the level of 50 mg; hence, there would be no species of such Carabid beetles as Carabus nitens, Bembidion nigricorne, Pterostichus lepidus, Calathus erratus, Masoreus wetterhali and Harpalus rufitarsis. Finally, the mass appearance of fungi species such as sticky buns (Suillus luteus), sulfur tufts (Hypholoma fasciculare) or, slightly later, chanterelles (Cantharellus cibarius) and porcinis (Boletus edulis) would not be possible (Figure 6).
Fig. 5:
Clear-cut forest.
Fig. 6: The occurrence of characteristic species of birds, Carabid beetles and fungi, as well as the structure of the carbon content in tons per 1 ha in the forest stand, litter and mineral soil down to a depth of 10 cm, in a young pine plantation with an MIB of the Carabidae of about 50 mg, created after the clear-cut of a timber pine stand (more than 100 years old), compared with the annual accumulation of carbon in that young stand; the value of the total carbon content (stand + litter + mineral soil), expressed in carbon dioxide at the euro prices of the European emission trading system on 15.08.2008, is 2071.2 € (from Szyszko, 2007). Full explanation in the text.
Fig. 7: The occurrence of characteristic species of birds, Carabid beetles and fungi, as well as the structure of the carbon content in tons per 1 ha in the forest stand, litter and mineral soil down to a depth of 10 cm, in a ca. sixty years old pine stand with an MIB figure of about 250 mg, compared with the annual accumulation of carbon in that stand; the value of the total carbon content (stand + litter + mineral soil), expressed in carbon dioxide at the euro prices of the European emission trading system on 15.08.2008, is 10525.6 € (price of 1 t of carbon dioxide on 15.08.2008: 23.15 euro, www.pointcarbon.com) (from Szyszko, 2007). Full explanation in the text.
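The euro figures in Figs. 6-8 follow from a simple conversion: tonnes of carbon are turned into tonnes of CO2 with the molar mass ratio 44/12 and multiplied by the allowance price of 23.15 € per tonne of CO2 (15.08.2008, www.pointcarbon.com). A minimal sketch; the carbon stock of 124 t per hectare is an assumption taken from the timber-stand figure quoted in Fig. 9, not a value stated in this caption:

```python
# Value of a stored carbon stock expressed as CO2 at the EU allowance price.
# Assumption: a stock of ~124 t C/ha (the timber-stand figure from Fig. 9);
# price of 23.15 EUR per tonne of CO2 (www.pointcarbon.com, 15.08.2008).

CO2_PER_C = 44.0 / 12.0          # molar mass ratio CO2 : C
PRICE_EUR_PER_T_CO2 = 23.15      # allowance price on 15.08.2008

def carbon_value_eur(tonnes_carbon: float) -> float:
    """Euro value of a carbon stock, expressed in CO2 allowances."""
    return tonnes_carbon * CO2_PER_C * PRICE_EUR_PER_T_CO2

print(round(carbon_value_eur(124), 1))   # ~10525.5 EUR, close to the 10525.6 EUR of Fig. 7
```

The result for 124 t of carbon reproduces, up to rounding, the 10525.6 € shown for the sixty years old pine stand.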
The destruction of the natural environment is also a chance for the regeneration of those environmental resources with time, i.e., for an increase in the carbon content and a modification of the occurring species, which is reflected in an increase in the MIB value as the synthetic measure relating to epigeic Carabids (Szyszko, 1990; Szyszko et al. 2000). In a ca. sixty years old pine stand (Figure 7), the carbon content and the MIB value have increased in comparison with a young pine stand. Completely different most numerous species of birds, Carabids and fungi can be observed there. Birds characteristic of that stage of the succession include the chaffinch (Fringilla coelebs), great tit (Parus major) and coal tit (Parus ater). Carabids characteristic of that stage of the succession include Carabus arcensis, C. nemoralis and Pterostichus niger and, as far as fungi are concerned, the sickener (Russula emetica), brown roll-rim (Paxillus involutus), false morel (Gyromitra esculenta) and the cauliflower mushroom (Sparassis crispa).
Fig. 8: The occurrence of characteristic species of birds, Carabid beetles and fungi, as well as the structure of the carbon content in tons per 1 ha in the forest stand, litter and mineral soil down to a depth of 10 cm, in a ca. eighty years old beech stand, created from the undergrowth after the removal of the pine stand, with an MIB figure of about 350 mg, compared with the annual accumulation of carbon in that stand; the value of the total carbon content (stand + litter + mineral soil) is expressed in carbon dioxide at the euro prices of the European emission trading system on 15.08.2008 (from Szyszko, 2007). Full explanation in the text.
The planting of the beech as an undergrowth in ca. sixty years old pine stands, and then the removal of the pine stand after 40 years with only the beeches left, resulted in the creation of a beech stand about eighty years old (Rylke and Szyszko, 2002; Figure 8). When compared with a sixty years old pine stand, a higher carbon content per hectare can be observed in that beech stand, and the MIB value exceeds 350 mg. Of course, the characteristic species of birds, Carabids and fungi occurring there are also different from those in the succession stages (forest stands) presented previously. Characteristic birds are the black woodpecker (Dryocopus martius), stock pigeon (Columba oenas) and chaffinch (Fringilla coelebs); characteristic Carabids: Carabus coriaceus, C. hortensis and C. intricatus; and characteristic fungi: the dotted stem bolete (Boletus erythropus), fleecy milk-cap (Lactarius vellereus) and the death cap (Amanita phalloides). The data presented above suggest that a greater differentiation of the carbon content in space within living environmental systems, i.e., a greater differentiation of succession stages measured with the MIB value, entails greater biodiversity (Figure 9) (Szyszko, 2002).
Fig. 9: Heterogeneous landscape. Top left: a natural forest with a carbon content of 325 tons per 1 ha, with Carabus coriaceus. Top right: arable land with a carbon content of 20 tons per 1 ha, with Cicindela campestris. In the middle: a peat bog with a very high carbon content per 1 ha, with Panagaeus bipustulatus. Bottom left: a clear-cut with a carbon content of ca. 90 tons, with Harpalus rufitarsis. Bottom right: a timber stand with a carbon content of 124 tons, with Carabus nemoralis (from Szyszko 2007). Full explanation in the text.
Hence, the MIB figure can be adopted as a measure of assessment of the landscape value (Rylke and Szyszko, 2001). However, the evaluation can be complete only if we also take into account the occurrence of those species that use varied succession stages defined by Szyszko (2002a) as landscape species (Figure 10).
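As a reminder of how the MIB figure itself is obtained: in Szyszko's work it denotes the Mean Individual Biomass of the epigeic Carabidae caught in a sample, i.e., the total biomass divided by the number of individuals. A minimal sketch of that definition; the sample values below are invented for illustration only:

```python
# Mean Individual Biomass (MIB) of epigeic Carabidae: total biomass of all
# individuals caught divided by their number. Values in mg; invented sample.

def mib(biomasses_mg):
    """Mean individual biomass of a carabid sample, in mg."""
    return sum(biomasses_mg) / len(biomasses_mg)

# Hypothetical catch dominated by large Carabus species: a high MIB,
# indicating an advanced succession stage.
sample = [850, 620, 410, 35, 22, 18, 15]
print(round(mib(sample), 1))   # -> 281.4 mg
```

A catch dominated by small, mobile species of open ground would push the same statistic towards the 50 mg level quoted above for young plantations.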
Fig. 10: Landscape species in a functional ecological landscape. The lesser spotted eagle (Aquila pomarina), nesting on very old trees in natural and cultivated forests and hunting in the abandoned arable land. The crane (Grus grus), nesting in peat bogs and hunting in the abandoned land. The kestrel (Falco tinnunculus), nesting on old trees in natural and cultivated forests and hunting in clear fellings. The black stork (Ciconia nigra), nesting on old trees in natural and cultivated forests and hunting in the peat bogs (from Szyszko, 2007). Full explanation in the text.
More succession stages and a higher diversity of the carbon content in environmental space entailed higher biodiversity (Figure 2). On the one hand, various succession stages existing simultaneously side by side guarantee a great richness of species and, on the other hand, the possibility of the occurrence of landscape species, for whom the living space needs to include the close proximity of various stages of the succession, i.e., meadows, fields, young forest plantations, natural forests and peat bogs (Figure 3). Species that require such areas are, for example, the already mentioned lesser spotted eagle (Aquila pomarina) and the crane (Grus grus) (Szyszko, 2005). The former prefers nesting on old trees in natural forests, finding good hunting grounds on meadows and arable fields. In turn, the latter selects wet open areas, preferably peat bogs, for nesting and likes to feed on arable fields and meadows. Afforestation, or the natural development of the succession, unavoidably causes the disappearance of species characteristic of meadows and fields, as well as the disappearance of landscape species. A similar effect could be obtained for the lesser spotted eagle (Aquila pomarina) with the felling of old trees in a natural forest and, for the crane (Grus grus), with the drainage of wet areas.
In short, one can cause the disappearance of species characteristic of individual stages of the succession, as well as of landscape species, through the strict protection of succession change processes. For example, by protecting forests from fires and ceasing all economic activities, we would with time cause the disappearance of all species characteristic of open areas and of most landscape species. Hence, the economic activity of man, carried out on the basis of forest policy, not only can but also has to guarantee biodiversity protection.
FOREST POLICY, BIODIVERSITY AND MAN
Forest policy, however, needs to take place in line with the sustained development concept, with native biodiversity, i.e., the complete content of native species of plants, fungi and animals, as the measure of such development. Where a full range of native species exists, sustained development only has to correspond with the control of their occurrence, while in those regions where we have caused the extinction of native species through our economic activity, sustained development has to be measured by the return of these species (Szyszko, 2008). The UN Climate Change Convention (1992) with its appendix, the Kyoto Protocol (1997), and the UN Convention on Biological Diversity (1992) provide an excellent instrument and opportunity in that area. Such an opportunity comes from the absorption of atmospheric carbon dioxide thanks to the afforestation of degraded arable lands, and from sustained forest management focused on an increase in that absorption for wood production and biodiversity protection (Szyszko, 2004). The Polish rural areas contain over 2 million hectares of poor soils that do not guarantee profitable farming. According to experts, each hectare of such soils is able to absorb 10-14 tons of CO2 annually for 100 years after being afforested. One ton of the absorbed carbon dioxide is a specific amount of money that can currently be defined according to the prices of the European emission trading system, where the price for one ton can amount to a few dozen euro.
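To put rough numbers on this, here is an illustrative back-of-the-envelope calculation. It uses the 10-14 t CO2/ha/yr absorption range quoted above; the area of 10 ha and the price of 23.15 €/t CO2 (the 15.08.2008 figure from Fig. 7) are assumptions carried over from elsewhere in the text, and actual allowance prices vary:

```python
# Hypothetical annual revenue from afforested poor soils, if the absorbed
# CO2 is valued at emission-trading prices. All inputs as quoted in the text;
# the price is the 15.08.2008 value (www.pointcarbon.com) and will vary.

ABSORPTION_T_CO2_PER_HA = (10, 14)   # t CO2 per ha per year, range from the text
PRICE_EUR_PER_T_CO2 = 23.15          # assumed allowance price (15.08.2008)
AREA_HA = 10                         # the family-sized forest cultivation

low = AREA_HA * ABSORPTION_T_CO2_PER_HA[0] * PRICE_EUR_PER_T_CO2
high = AREA_HA * ABSORPTION_T_CO2_PER_HA[1] * PRICE_EUR_PER_T_CO2
print(f"{low:.0f}-{high:.0f} EUR per year")   # 2315-3241 EUR per year
```

At these assumed prices, a 10 ha cultivation would yield on the order of two to three thousand euro per year, consistent with the claim that it could significantly support one family.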
It is estimated that a forest cultivation of 10 ha could significantly support one family and guarantee a subsistence that not only gives a chance to survive but also to develop further. Hence, forest planting on poor arable lands would create jobs, thus reducing unemployment (Kubacz, 2008; Wójcik, 2008), and would protect and further improve the quality of our environmental resources, at the same time multiplying renewable energy sources in the form of wood (Stasiak, 2008). The UN Climate Change Convention (1992) and the UN Convention on Biological Diversity (1992) provide an opportunity to implement the sustained development concept, entailing the rational use of environmental resources for human needs by way of an appropriate management of carbon in environmental space (landscape), where forest policy has to play the main role (Szyszko et al., in print).
REFERENCES
1. Grum, L. (1962) "Horizontal distribution of larvae and imagines of some species of Carabidae." Ekol. Pol. 10: 73-84.
2. Grum, L. (1971) "Spatial differentiation of the Carabus L. (Carabidae, Coleoptera) mobility." Ekol. Pol. 19: 1-34.
3. Jędrzejewska, B., Jędrzejewski, W. (2001) Ekologia zwierząt drapieżnych Puszczy Białowieskiej. Warsaw, PWN Press.
4. Kruszewicz, A.G. (2007) Ptaki Polski. Encyklopedia ilustrowana. Multico Oficyna Wydawnicza Press, Warsaw. 312 pp.
5. Kubacz, B. (2008) Konwencja Klimatyczna, Protokół z Kioto i lasy, szansą zrównoważonego rozwoju Gminy Czerwonka. Typescript of the master thesis. SGGW, Laboratory of Evaluation and Assessment of Natural Resources, Warsaw Agricultural University. 82 pp.
6. Naumow, N. (1961) Ekologia zwierząt. PWRiL.
7. Rijnsdorp, A.D. (1980) "Pattern of movement and dispersal from Dutch forest of Carabus problematicus Hbst. (Coleoptera, Carabidae)." Oecologia 45: 274-281.
8. Rylke, J., Szyszko, J. (2001) "Evaluation of landscape value." Ann. Warsaw Agricult. Univ.-SGGW, Horticult. Landsc. Architect. 22: 89-100.
9. Rylke, J., Szyszko, J. (2002) Didactic trails for field classes on evaluation and assessment of natural resources. Rylke, J. and Szyszko, J. (eds.). Warsaw Agricultural University Press. 166 pp.
10. Skrok, A. (2003) Occurrence of some selected species of bumblebees (Bombus Latr.) in the research object "Krzywda". In: Szyszko, J., Abs, M. (eds.) Landscape architecture and spatial planning as the basic element in the protection of native species: modeling of succession stages. Warsaw Agricultural University Press: 116-124.
11. Stasiak, P. (2008) Program zrównoważonego rozwoju gminy Tuczno w oparciu o wykorzystanie odnawialnych źródeł energii. Typescript of the master thesis. SGGW, Laboratory of Evaluation and Assessment of Natural Resources, Warsaw Agricultural University. 96 pp.
12. Szyszko, J. (1990) Planning of prophylaxis in threatened pine forest biocenoses based on an analysis of the fauna of epigeic Carabidae. Warsaw Agricultural University Press, Warsaw. 96 pp.
13. Szyszko, J. (2002a) Determinants of the occurrence of chosen animal species. In: Szyszko, J. (ed.) Landscape architecture as the basic element in the protection of native species. Fundacja Rozwój SGGW Press: 28-37.
14. Szyszko, J. (2002b) Zarys stanu środowiska naturalnego (przyczyny, perspektywy, szanse i trudności). W: Ocena i Wycena Zasobów Przyrodniczych. Wydawnictwo SGGW. 338 pp.
15. Szyszko, J., Płatek, K., Dyjak, R., Michalski, A., Sałek, P. (2003) Określenie modelowego projektu w dziedzinie wzrostu pochłaniania gazów cieplarnianych przez zalesienie nizinnych terenów nieleśnych na obszarze kraju. SGGW, Laboratory of Evaluation and Assessment of Natural Resources, Warsaw Agricultural University. Manuscript. 48 pp.
16. Szyszko, J. (2004) Foundations of Poland's cultural landscape protection: conservation policy. In: Dieterich, M., van der Straaten, J. (eds.) Cultural landscapes and land use. Kluwer Academic Publishers, The Netherlands: 95-110.
17. Szyszko, J. (2007) Combating climate change: Land use and biodiversity. Poland's point of view. In: Ragaini, R. (ed.) International Seminar on Nuclear War and Planetary Emergencies, 38th Session: 5-12.
18. Szyszko, K. (2003) Characteristic of occurrence of diurnal butterflies (Rhopalocera) on the research object "Krzywda". In: Szyszko, J., Abs, M. (eds.) Landscape architecture and spatial planning as the basic element in the protection of native species: modeling of succession stages. Warsaw Agricultural University Press: 125-132.
19. Szyszko, J., Płatek, K., Dyjak, R., Michalski, A., Sałek, P. (2003) Określenie modelowego projektu w dziedzinie pochłaniania gazów cieplarnianych przez zalesienie nizinnych terenów nieleśnych na obszarze kraju. Typescript. Independent Studio for the Valuation and Estimation of Environmental Resources, SGGW, Warszawa-Tuczno.
20. Szyszko, J., Schwerk, A. (in print): Zwierz…
Cumulative installed wind power, end of 2008:

Country          MW        %
USA              25,170    20.8
Germany          23,903    19.8
Spain            16,754    13.9
China            12,210    10.1
India             9,645     8.0
Italy             3,736     3.1
France            3,404     2.8
UK                3,241     2.7
Denmark           3,180     2.6
Portugal          2,862     2.4
Rest of world    16,693    13.8
Total top 10    104,104    86.2
World total     120,798   100.0

New installed wind power, 2008:

Country          MW        %
USA               8,358    30.9
China             6,300    23.3
India             1,800     6.7
Germany           1,665     6.2
Spain             1,609     5.9
Italy             1,010     3.7
France              950     3.5
UK                  836     3.1
Portugal            712     2.6
Canada              526     1.9
Rest of world     3,285    12.2
Total top 10     23,766    87.8
World total      27,051   100.0
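The percentage shares in Fig. 8 follow directly from the MW figures; a quick consistency check (country values as read from GWEC (2008)):

```python
# Cross-check of the cumulative wind power shares in Fig. 8 (MW, end of 2008).
cumulative_mw = {
    "USA": 25170, "Germany": 23903, "Spain": 16754, "China": 12210,
    "India": 9645, "Italy": 3736, "France": 3404, "UK": 3241,
    "Denmark": 3180, "Portugal": 2862, "Rest of world": 16693,
}
world_total = sum(cumulative_mw.values())
shares = {c: 100 * mw / world_total for c, mw in cumulative_mw.items()}

print(world_total)                 # 120798 MW, the world total of Fig. 8
print(round(shares["USA"], 1))     # 20.8 %
```

The recomputed world total and country shares match the figure to within the rounding of the published values.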
Fig. 8: Cumulative and new installed wind power 2008 (MW). What drives the development? [Source: GWEC (2008)]
It is clear from the country comparisons (e.g., France and Germany) that the installed capacities have not so much to do with the available potentials or with the true costs of the technologies, but rather with the perceived costs. While one country recognises that renewable energy promotion costs some money, but that it does not exceed 10-20 euro per household and per year (a figure derived from the evaluation of the German feed-in law), which can be considered an ordinary insurance against climate change (contributing at the same time to industrial development), another country may only see that the promotion costs several billion euro per annum, due to the expensive promotion of PV. This very different perception of the entrepreneurial chances in renewables and energy efficiency leads to strikingly different results for the different types of renewables. So why is wind technology now taking off? It started with a very early pioneer (Denmark) and two decided champions (Spain and Germany), who were able to buy the cost of the technology down very substantially for the more hesitant countries. In the developing world, India had an early start, showing that this is not a technology for rich countries only. In more or less recent times, support from high fossil fuel prices came in as an additional factor, persuading the really big players like the USA and China to
add tremendous capacities in the past two years, contributing to a further cost decrease. By the end of 2009, the overall installed wind capacity will be close to 150 GW.
The example of concentrating solar power: why this has not (yet) worked
An example which is at present at the opposite end from wind energy is concentrating solar power (Figure 9).
Fig. 9: Concentrating solar power technologies.
Although this technology had a good start in the eighties, with 350 MW installed rapidly in California, and was then on an equal footing with wind energy, up to now 1 GW has not yet been reached, although rapid growth is expected in the next years up to 2015, mainly in Spain and California (Figure 10). Why does this technology have more than a hundred times less installed capacity than wind energy at present?
[Chart: concentrating solar power capacity (MW) by country: USA, Spain, Morocco and other countries.]
Fig. 10: Concentrating solar power technologies. [Source: Fraunhofer ISI (2010)]
There is a bundle of reasons, rather simple to present (but also to overcome):
• First, there was a lack of champions on the side of industrial countries that bought the technology costs down. That is why the costs of CSP today are still substantially above fossil generation costs, which wind energy is already approaching or has already reached (Figure 11). California had initially assumed this role, but then the support for the technology faded away for 15 years, until Spain and again California emerged as the second-generation champions that have put sustained promotion schemes in place.
• Second, countries that would have had an interest in promoting the technology, like Germany, were unable to introduce a demand-driven policy, because CSP does not work very well at the latitude of Germany. Hence those countries were left to promote R&D, which effectively happened. This was nevertheless a cornerstone in keeping the technology alive and the scientists and engineers in the field.
• Third, the technology unfortunately comes "in big lumps", that is, rather in units of 30-100 MW and not of 1 kW like, for example, PV. So a concentrating solar plant cannot be paid for out of everyone's pocket, unlike PV, which is considerably more expensive than CSP but has already reached a much larger spread, because 1 kW can easily be installed by individuals. In addition, PV again had champions (Japan, Germany) that promoted the technology with demand-driven policies, which CSP did not have. That has led to a fast learning curve for PV. So at present CSP is like a train that is late: everyone thinks that this train can even take a bit more time as it is already late.
[Chart: cost of electricity (LRMC, payback time: 15 years) in €/MWh, on a scale from 0 to 220, for geothermal electricity, biowaste (solid), biomass (solid), biomass co-firing and biogas, among other renewables.]
Fig. 11: (Current) cost levels of electricity generation from renewables; last 15 years for concentrating solar power. [Source: Fraunhofer ISI based on various sources]
• Other countries, like developing countries in North Africa, that would have enormous potentials both for their own consumption and for exports, only perceived the comparatively high costs, but not the chances of enhanced supply security, of balancing the increasing amounts of wind energy in the supply system, and of regional employment and championship.
• An essential actor in the promotion of CSP was the World Bank together with the GEF. They promoted several CSP plants, in particular in North Africa. However, their error was to consider grants as the only funding mechanism, while this may not reach the capacities necessary to bring the costs down. In another attempt, the World Bank is investigating how to bring more sustainable financing sources together for this technology (Fraunhofer ISI, 2009), including new financing mechanisms as foreseen in the EU Directive on Renewables, which allow renewable electricity to be exported to Europe and financed via, for example, a feed-in tariff.
IF ENERGY EFFICIENCY COULD MAKE AS MUCH WIND AS WIND ENERGY ...
Energy efficiency has contributed tremendously to reducing our dependency on fossil fuel consumption, but it does not really have a sexy image. It is much more heterogeneous, much more difficult to understand and hence also very difficult to sell, especially on the policy side, especially as there is nothing to see or to show, except for nice events which document that, even after so many years of energy savings, it is still possible to save 91% of energy compared to a conventional plant or appliance (Figure 12).
Energy efficiency measures:
• Waste heat recovery from work machines.
• Optimised heat distribution.
• Use of a heat pump with a coefficient of performance greater than 4.
• Displacement ventilation via source air outlets.
• Optimal dimensioning of the piping.
• Use of heating and cooling water pumps with energy efficiency class A.
• Use of energy-saving EC fans.
• Use of a 153-kWp photovoltaic system.
Fig. 12: 1st Award: dena Energy Efficiency Award 2009. ebm-papst Mulfingen GmbH & Co. KG: construction of a new energy-efficient production plant in Hollenbach. (Source: dena Energy Efficiency Award 2009⁹)
Numerous studies have shown that most energy efficiency options, unlike renewables, already pay off now at fairly low fossil fuel prices (Figure 13). Increasingly, we also start to realise that energy efficiency options follow experience curves similar to those of renewables, and that it is therefore important to stimulate the market demand for those technologies to help them get over the first market hurdles (see the example of efficient windows in Figure 14; many more examples of this type can be shown).
⁹ http://www.industrie-energieeffizienz.de/energy-efficiency-award/energy-efficiency-award-2009.html
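The experience-curve claim can be made concrete: with a learning rate LR, each doubling of cumulative production multiplies the unit cost by the progress ratio (1 - LR). A minimal sketch with an illustrative 20% learning rate; the parameter values are assumptions for illustration, not figures from the text:

```python
import math

def unit_cost(cumulative, c0=1.0, x0=1.0, learning_rate=0.20):
    """Experience curve: unit cost falls by `learning_rate` per doubling of
    cumulative production. c0 is the cost at cumulative production x0.
    The 20% learning rate is an illustrative assumption."""
    b = math.log2(1.0 - learning_rate)       # elasticity (negative)
    return c0 * (cumulative / x0) ** b

print(round(unit_cost(2.0), 3))    # 0.8   -> one doubling: cost down 20%
print(round(unit_cost(8.0), 3))    # 0.512 -> three doublings: 0.8**3
```

This is why early champion markets matter: the doublings they finance lower the cost for everyone who follows.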
[Chart: "Potentiel théorique d'économie d'énergie en Belgique en 2030" (theoretical energy-saving potential in Belgium in 2030): profitability thresholds as a function of the crude oil price, with markers for the oil price today and the oil price yesterday and tomorrow; savings potential on the horizontal axis. Source: McKinsey.]
Fig. 13: McKinsey study for an energy-efficient Belgium.

Table 4: Cost of window manufacturing in 1970 and in 2000, nominal and real (U-value 1970: approx. 2.5-3.0 W/m²K; 2000: approx. 1.3 W/m²K), expressed in CHF/m² standard window.

              Glass   Coating   Window          Assembly          Calculated            Total
                                manufacturing   incl. transport   contribution margin
1970 nominal  150     70        120             60                80                    480
1970 real¹    202²    94²       135³            80³               90³                   601
2000          100     100       80              80                90                    450

¹Real 2000 prices; ²adjusted with the Swiss producer price index for the manufacturing industry; ³adjusted with the average price index for the construction of residential buildings. Source: [3], data obtained from an interview with a representative of SZFF (Schweizerische Zentralstelle für Fenster- und Fassadenbau), Dietikon ZH.
Fig. 14: Do efficient windows cost more? [Source: Jakob (2007)]
It would not be fair to say that nothing has been done so far to improve energy efficiency. Numerous instruments have been introduced and tested, some with very large success: just to mention the energy labelling of electric appliances, buildings and cars, regulation for energy-efficient buildings, or incentive systems like the French Bonus-Malus system for energy-efficient cars.¹⁰ We have seen further up the importance of
¹⁰ On European energy efficiency measures much can be learned from the MURE database on energy efficiency measures in Europe (www.mure2.com).
the electric system from the supply side part. However, there is also the other face of the coin, which represents electricity savings. The EU, for example, has initiated the ecodesign process, which has already spurred mandatory standards for 10 products and will spur another 20 or more mandatory standards in the next years (Figure 15).
The EU Ecodesign Directive: a comprehensive regulatory approach to electricity efficiency
Fig. 15: Ecodesign directive for energy-efficient electric appliances and usages. [Source: ECEEE (2009)]
Yet, the contribution of energy efficiency must be much larger in the course of the next 40 years than it is at present. Amory Lovins once compared our way of using energy with taking a bath in a bathtub with a plug that has the shape of Figure 16. Instead of looking for another bath plug, we simply open the tap further to get more water into the tub. Figure 16 points to the main paths to reduce energy consumption, but also electricity consumption, by the required 25% discussed above, and potentially by much more. These options include: demand-driven policies to bring existing technologies to the market, R&D for very advanced energy efficiency technologies (Figure 17 and Figure 18), and material efficiency options (Figure 19).
[Figure 16: 'Amory Lovins' bathroom plug', illustrating the current efficiency of energy use.]

[Chart for Fig. 6: cumulative UK '2P' oil discovery and production, 1965-2025. © Energyfiles Ltd.]
Fig. 6: UK '2P' oil discovery and production, displayed on a cumulative basis. Source: Energyfiles Ltd. Also shown are four estimates for the UK's conventional oil 'ultimate'. The UK Department of Energy's estimate ('DoE') is from 1974; the others are more recent. The Campbell/Uppsala and USGS year-2000 estimates exclude NGLs (these add ~4.5 Gb); the USGS also excludes UK West of Shetlands basins.
Now we come to an important point. We have indicated that the 1999 peak is resource-limited, and clearly this is the case based on the oil already discovered (see Figure 4). But how do we know this will remain true in future? Perhaps the UK has big new plays waiting in the wings that in time will yield much greater quantities of oil, enough to surpass the 1999 peak. As has been mentioned, the situation often occurs where historical discovery data (the 'creaming curve' vs. time) indicate an apparent asymptote, but where this increases as a new play enters the scene. So what was known to indicate that the UK's 1999 peak was indeed resource-limited, unlike the 1984 peak? Knowledge of peak cannot be based solely on discovery data; it must also include geological appraisal. The latter will always be a judgement, and can never be known with absolute certainty. But a great deal of geological knowledge now exists for much of the world's likely oil plays. In the UK's case there are still several significant potential future sources of oil. There may be quite large quantities of oil undiscovered in subtle stratigraphic traps; there is new potential in the deeper Atlantic; and there are certainly large amounts of oil in-place currently deemed unrecoverable. But geological and reservoir knowledge says it is virtually certain that none of this oil, if it exists, can be developed rapidly enough to push UK production back up past the 1999 peak. The subtle traps, if they hold significant amounts of oil, will need highly calibrated seismic to find, so will not be found rapidly; the deeper Atlantic will offer surprises but is not thought especially prospective due to poor source rock and traps; while the many routes to
improved recovery in existing fields have already seen much trial and analysis. Overall, combining the UK's 2P discovery data with geological knowledge indicates that the country's conventional oil peak in 1999 was indeed resource-limited. Figure 6 brings out this point by including four estimates of the UK's ultimately recoverable resource ('ultimate'). The earliest is a UK government DoE 'Brown Book' estimate made back in 1974, and the more recent are from Campbell, the USGS, and Energyfiles. These 'ultimates' are in close agreement with each other, and with the asymptote of the '2P' discovery creaming curve. (As already indicated, the reason that the UK Department of Energy estimate made in 1974 for the UK 'ultimate' could be so accurate, before UK offshore production had even started, was that by 1974 most of the big fields had already been discovered.)

An important question, therefore, is why did the 1999 peak, and perhaps more so the very steep subsequent decline in production, come as such a surprise to the UK government? It should not have done so. Using the 1974 estimate of ultimate, and plotting a simple 'mid-point' isosceles triangle based on the initial production trend, certainly finds peak at around the right date; a fact reported at the time (and see below, Figures 7b and 7c). But 'mid-point' peaking got forgotten (and not just in the UK, as we shall see), and a deep myth developed based on the behaviour of proved reserves.

Table 1: UK Data on Reserves

A: PROVED RESERVES ('1P')

Year   Gb     Year   Gb
1975   16.0   1991   4.0
1976   16.8   1992   4.1
1977   19.0   1993   4.6
1978   16.0   1994   4.5
1979   15.4   1995   4.3
1980   14.8   1996   4.5
1981   14.8   1997   5.0
1982   13.9   1998   5.2
1983   13.2   1999   5.2
1984   13.6   2000   5.0
1985   13.0   2001   4.9
1986   5.3    2002   4.7
1987   5.2    2003   4.5
1988   4.3    2004   4.5
1989   3.8    2005   4.0
1990   3.8    2006   3.6
              2007   3.6

(Source: BP Statistical Review, various dates.)

B: PROVED PLUS PROBABLE RESERVES ('2P')

Estimate   Year   Gb
USGS       1996   9.7
C/U        2005   9.3

Note: C/U = Campbell / University of Uppsala
As the table shows, the UK's proved reserves from 1975 to 1985 were in the region of 15 Gb, but then dropped in 1986 to about 5 Gb, and stayed close to this figure until very recently. Of course, all that changed in 1986 was the basis of reporting. Proved plus probable (2P) reserves are currently about twice the proved value. (The full reason that the UK's proved reserves have been so much below the 2P reserves still needs elucidating. It almost certainly reflects, in part, reserves reporting by oil companies under U.S. Securities & Exchange Commission rules; but probably also the non-inclusion of reserves of discovered fields until sanctioned for development.) The long period of static values for UK proved reserves, staying at the equivalent of roughly 5 years' supply, would not matter except that it fooled many analysts into thinking that something special was going on. Year after year oil was being produced, but the proved reserves were not falling. This replacement of the reserves was thus very widely ascribed, including within the oil industry, the UK government and the IEA, as being primarily due to improvements in technology, with horizontal drilling and 4-D seismic being frequently cited. The real explanation was that as the proved reserves were produced, reserves in the probable category became classed as proved. But why did analysts not see this for what it was? The reason lies in the usual definition of proved reserves: "...those quantities that geological and engineering information indicate with reasonable certainty can be recovered in future under existing economic and operating conditions." Most analysts then, and still today, treat proved reserves as a fairly accurate measure of the amount of oil likely to be available. The simple reality, that the quantities of oil likely to be recovered under existing economic and operating conditions are generally much larger than the proved reserves, was not recognised; and all too often is still not recognised today.
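The 'mid-point' peaking rule invoked above can be sketched numerically: for a symmetric, triangular production profile, half the ultimate has been produced by the peak, so the peak date follows from the ultimate and the initial build-up trend alone. The inputs below are purely illustrative, not the actual 1974 Brown Book figures:

```python
import math

def midpoint_peak_year(start_year, ultimate_gb, buildup_gb_per_yr2):
    """Peak year for an isosceles-triangle production profile.

    Production ramps up linearly (buildup_gb_per_yr2 = Gb/yr of extra
    output added each year) and declines symmetrically, so half the
    ultimate is produced by peak: 0.5 * t * (slope * t) = ultimate / 2.
    """
    t_peak = math.sqrt(ultimate_gb / buildup_gb_per_yr2)
    return start_year + t_peak

# Illustrative: a 25 Gb ultimate with output growing by 0.05 Gb/yr
# each year from 1975 puts the mid-point peak in the late 1990s.
print(round(midpoint_peak_year(1975, 25.0, 0.05)))  # -> 1997
```

The striking feature of this construction is how little it needs: an ultimate and an early production trend, which is why a 1976 forecast could place the UK peak near 2000.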
Figure 7a, though a little complex to read, sets all this out plainly. It shows UK data on cumulative production, 1P and 2P reserves; and, importantly, shows the estimates made at the time of the total amount of oil likely to be recovered in UK waters (the 'ultimate'). The data are taken from issues of the UK's Brown Book for the years indicated.
[Chart for Fig. 7a: UK cumulative production, reserves and 'ultimates' from Brown Book data; legend includes Cum. Prod., 2P and Ult. (Avg.), with individual field start-ups marked.]

[Graphs for Fig. 8: reserves growth of UK large and small fields over the 25 years from first declaration.]
Fig. 8: Reserves growth for UK oil fields; '2P' data. Data from R. Miller of BP. Upper graph: UK large fields, showing the change in industry data for 'proved plus probable' (2P) reserves with time after first declaration. The Beryl field seems to be anomalous between years 18 and 22, but the trend of the data is clear: after 25 years, reserves for large fields had grown by some 50% on average. Lower graph: UK small fields. The data are probably statistically unreliable by 25 years, as few small fields have yet operated so long. Interestingly there is no significant change in industry data for declared 2P reserves for 9 years, but then a steady growth sets in, reaching 25% after 25 years altogether. This might suggest a very good initial estimate of field size, with only statistical fluctuation of the mean. After some 10 years, further exploration effort (driven by approaching exhaustion?) has discovered a suite of satellite fields, stacked reservoirs and other deposits entirely excluded from the initial estimates. Miller noted that "It would be interesting to know whether the large fields (> 500 mmbbl recoverable) grew from the discovery of new pools."
As Figure 8 shows, field growth is very variable between fields, but averaged over time, the large fields grew by about 50%, and the smaller fields by about 25%. These are significant increases, and should not be ignored in the modelling. But these values are less than one-tenth the U.S. and Canadian '1P reserves becoming 2P' reserves growth values of 600% to 900% reported above. And even with 2P reserves, a caution is needed. Campbell, with long experience in industry of field discovery, and of watching how the size of fields is reported over time, identifies a 'U-shaped reporting curve'. This starts with an original 'geological' value, kept internal to the company, which is based on an estimate of oil in-place, factored by an initial estimate of overall recovery factor. This is followed by the first published value, based on conservative engineering evaluation of the infrastructure likely to be initially committed. Then there is a slow reported growth in field size as subsequent investments are made in the field; with this growth often taking the field size back to close to the original 'geological' estimate. The evolution of the reported size of Prudhoe Bay, for example, has shown just this process, as confirmed by BP's Gilbert.

The main conclusions from this section on UK data are:

• The simple model of Section 2 captures much of what happens in reality, at least for the UK.
• The UK government forecast made in 1976, that the UK production peak would occur shortly before the year 2000, is easy to understand on the basis of the estimate for the UK's 'ultimate' and the 'mid-point peaking' rule.
• It was a pity that this comprehension of the mechanism of peaking got eroded over time, with the widespread myth of very high levels of technology-driven reserves replacement becoming the favoured explanation of why the UK's 5 years' worth of proved reserves had lasted for over 20 years without diminution.
• Moderate levels of reserves growth do occur in 2P data, however, at least as reported in industry datasets, and need to be accounted for.
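The average growth factors above (roughly 50% over 25 years for large UK fields, roughly 25% with a nine-year lag for small ones) can be folded into a supply model as a time-dependent multiplier on first-declared 2P reserves. A minimal sketch; the linear ramps are an illustrative assumption, not a fit to Miller's data:

```python
def grown_2p_reserves(declared_2p_gb, years_since_declaration, large_field):
    """Scale first-declared 2P reserves by the average UK growth
    factors read off Figure 8: ~50% over 25 years for large fields,
    ~25% for small fields, with growth only starting after ~9 years.
    The linear ramps are an assumption for illustration only.
    """
    t = min(years_since_declaration, 25)
    if large_field:
        growth = 0.50 * t / 25
    else:
        growth = 0.25 * max(0, t - 9) / (25 - 9)
    return declared_2p_gb * (1.0 + growth)

print(grown_2p_reserves(1.0, 25, True))             # -> 1.5
print(grown_2p_reserves(0.1, 5, False))             # -> 0.1 (no growth yet)
print(round(grown_2p_reserves(0.1, 25, False), 3))  # -> 0.125
```

Note how modest these factors are next to the 600-900% '1P becoming 2P' growth reported for the U.S. and Canada, which is the section's central caution.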
THE FUTURE OF GLOBAL OIL SUPPLY: UNDERSTANDING THE BUILDING BLOCKS

DR. PETER JACKSON
Senior Director, Cambridge Energy Research Associates, Cambridge, Massachusetts, USA
KEY IMPLICATIONS

The controversy surrounding future oil supply can be divided into two components: a determination of the factors that will drive the much-debated "peak" and, more importantly, a consideration of the consequences and actions required when oil supply no longer meets demand. IHS CERA sees a number of critical observations at the core of the analysis:

• Supply evolution through 2030 is not a question of reserve/resource availability.
• IHS CERA projects growth of productive capacity through 2030, with no peak evident.
• There are no unique answers: we are dealing with a complex, multicomponent system.
• Above-ground drivers (economics, costs, service sector capability, geopolitics, and investment) are crucial to future supply availability.
• Market dynamics will remain highly volatile.
• The upstream oil industry faces some major challenges.
CONTEXT

Fears about "running out" of oil coincide with periods of high prices and tight supply-demand balance. The latest such period of "peak oil" concerns became very visible from 2004, when strong oil demand ran up against capacity constraints. IHS CERA's reference case for global liquid productive capacity shows steady growth through 2030 to around 115 million barrels per day (mbd), and there is no evidence of a peak in supply appearing before that time. Hydrocarbon liquids (crude oil, condensate, extra heavy oil, and natural gas liquids) are a finite resource, but based on recent trends in exploration and appraisal activity there should be more than an adequate inventory of resources available to increase supply to meet anticipated levels of demand in this time frame. Post-2030 supply may well struggle to meet demand, but an undulating plateau rather than a dramatic peak will unfold.

In the short term, the industry is at another crossroads following the precipitous fall in demand in 2008-09 in response to the onset of the recession. The oil price has halved from its peak of $147 per barrel in July 2008, OPEC has cut nearly 4.2 mbd of production, OPEC spare capacity has nearly tripled to 6.5 mbd, and the industry has slowed its pace of expansion. Early in 2009, IHS CERA predicted that as much as 7.5 mbd of new productive capacity could be at risk by 2014 if costs remained high and oil
prices hovered just below the cost of the marginal barrel for two years.¹ Since then the oil price has recovered strongly to around $70 per barrel and some confidence has returned. Even in these unpredictable times the industry has continued to invest and to build new productive capacity; indeed, Saudi Arabia recently brought onstream the giant Khurais field, which at plateau is expected to produce 1.2 mbd. With sustained investment, a healthy cushion of spare capacity, and slow to moderate post-recession economic growth, supply should not present major problems in the short term. Of course, looking further ahead, it is important to recognize that oil is a finite resource and that, at some stage, supply will fail to meet demand on a consistent basis. It is impossible to be precise about the timing of this event, but given the pace at which demand has increased in the past decade, a pivot point may well be reached before the middle of this century. Much depends on key factors such as global economic growth, the capability of the upstream industry, costs, government policies on access and taxation, the evolution of renewable and alternative energy sources, and the effect of climate change issues on policies and regulations concerning the use of fossil fuels. However, there is time to prepare and to make rational decisions to avoid being forced into short-term approaches that may not resolve longer-term problems. Many studies of future oil supply examine subsurface issues and focus in particular on the scale of the resource while giving limited consideration to technology, economics, and geopolitics (Deffeyes, 2005). Though belowground factors are critical, it is aboveground factors that will dictate the ultimate shape of the supply curve. This IHS CERA report summarizes our current productive capacity outlook to 2030 and discusses the architecture of future liquids supply.
In addition, the methodology and foundations of the outlook are reviewed and the results of supporting studies on decline rates and giant fields are summarized. Though a peak of global oil production is not imminent, there are some major hurdles to negotiate.

METHODOLOGY

Productive capacity is defined as the maximum sustainable level at which liquids can be produced and delivered to market. Productive capacity estimates account for routine maintenance and general operational inefficiency but not for dramatic swings in political or economic factors or temporary interruptions such as weather or labor strikes. For example, a field may have a productive capacity of 140,000 barrels per day (bd) but in reality produce 130,000 bd on average over a year because of unforeseen maintenance issues, regulatory inspections, rig movements, and tie-ins.

At the core of IHS CERA's methodology is recent production history, which is considered the most reliable data available on which to base a forecast. We can measure the barrels arriving at the surface. Future production trends are extrapolated using a comprehensive framework of decline rates and knowledge of operational plans for individual projects and fields. Remaining reserve data are an important constraint on the
¹ The marginal barrel is the most expensive oil to find and produce globally; currently the oil sands in Canada are regarded as representing the marginal barrel.
future supply profiles but, given the uncertainties in reserves estimation, can be used only as a broad guideline of future supply. Four key components of supply are included in the outlook: fields in production (FIP), fields under development (FUD), fields under appraisal (FUA), and yet-to-find (YTF) resources. IHS CERA has fully incorporated the data from the IHS International Field and Well Data database so that there are approximately 24,000 fields and discoveries underpinning the outlook. In addition, we have conducted detailed analysis of field production characteristics, especially decline rates, which have been incorporated at the field and project levels (see the IHS CERA Private Reports Giant Fields: Providing the Foundation for Oil Supply Now and in the Future? and Finding the Critical Numbers: What Are the Real Decline Rates for Global Oil Production?). A detailed database of approximately 450 OPEC and non-OPEC FUD provides a clear insight into the immediate plans of the industry to execute new projects ranging individually up to 1.2 mbd at production plateau. YTF resources are estimated by extrapolating historical activity and success rate data and making assumptions about future levels of activity in key countries. We have recently compiled historical exploration data from the IHS International Field and Well Data database on well count, success rate, and discovery sizes for each country, which has improved the YTF analysis. In this activity-based model we take account of project efficiency, costs, timing, hardware availability, and our detailed oil price outlook. We adopt a holistic portfolio perspective to evaluate global productive capacity.
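The layering of these building blocks can be sketched as a toy aggregation: FIP capacity declining at the aggregate rate while gross additions from FUD, FUA, YTF and "Others" are stacked on top. All numbers below are illustrative assumptions, and the sketch deliberately ignores the build-up and subsequent decline of the new capacity itself:

```python
def project_capacity(fip_mbd, additions_mbd_by_year, aggregate_decline=0.045):
    """Toy projection of total productive capacity (mbd).

    fip_mbd: capacity of fields already in production, declining at
             the ~4.5%/yr aggregate rate quoted in the text.
    additions_mbd_by_year: gross new capacity (FUD/FUA/YTF/'Others')
             assumed, unrealistically, to stay flat once onstream.
    """
    capacity = {}
    base, cumulative_new = fip_mbd, 0.0
    for year in sorted(additions_mbd_by_year):
        base *= 1.0 - aggregate_decline      # FIP shrinks each year
        cumulative_new += additions_mbd_by_year[year]
        capacity[year] = base + cumulative_new
    return capacity

# ~92 mbd producing in 2009 plus an assumed 6 mbd of gross additions
# per year keeps total capacity growing through 2030.
outlook = project_capacity(92.0, {y: 6.0 for y in range(2010, 2031)})
```

The point of the sketch is structural, not predictive: whether total capacity grows depends on the race between the aggregate decline of FIP and the rate at which the other building blocks are brought onstream.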
Although it is clear that some giant fields such as Cantarell are now strongly in decline after a very successful secondary production program, and many countries are past their "peak," the sum of the parts as we currently see them shows that productive capacity should be able to grow for the next two decades.

WHY SO MUCH VARIATION BETWEEN PUBLISHED OUTLOOKS?

The long and complex debate about the future of global oil supply is characterized by two overriding characteristics: the very large range of potential outcomes projected and sustained disagreement about "the answer" (e.g., Mills, 2008). Production volumes are closely related to reserves, rock physics, and investment. Publicly available data tend to be limited and of variable quality. A wide range of different methodologies have been applied to the problem, from those encompassing systematic analysis and careful assumptions (International Energy Agency [IEA], 2008) to less robust techniques such as Hubbert's method (Deffeyes, 2005; Al-Bisharah et al., 2009), which can provide a good approximation in certain circumstances. Additionally, different studies are based on variable views on reserves/resources, field production performance, future exploration, technology, and commercial issues. Few have attempted to incorporate the impact of aboveground factors such as demand and geopolitics. Some models are based on a very pessimistic view of the future, which is not borne out by scrutiny of recent trends in exploration and production. For example, claims that half of global oil reserves have been produced, global reserves are not being replaced on an annual basis, and deepwater exploration is essentially exhausted (e.g., Leggett, 2006) are questionable. The recent discoveries of 10 giant oil fields below a thick salt layer in the Santos Basin, Brazil, may have boosted global resources by at least 25 billion
barrels. Further claims that giant oil fields are past their prime have been refuted in a recent detailed study of 548 giant oil fields in the IHS CERA Private Report Giant Fields: Providing the Foundation for Oil Supply Now and in the Future?, which demonstrates their continuing strong contribution to global supply and that some 76 giant fields, representing 84 billion barrels, remain undeveloped. Fields in general and giant fields in particular still show considerable potential for reserves upgrades, as illustrated in many studies (Klett and Gautier, 2005).

IHS CERA'S 2009 SUPPLY OUTLOOK: "PAUSING FOR BREATH"

In our most recent reference case outlook, global productive capacity is expected to average approximately 92 mbd in 2009 and to rise to 115 mbd by 2030. This is a lower rate of growth than we have projected in the past and reflects the reaction of the oil industry to recent changing market forces. This is just one version of many possible outcomes, and we use it in this report to illustrate the architecture of supply and the nature and scale of the problem. This reference case provides a view of the building blocks of future supply in terms of FIP, FUD, FUA, and YTF as well as "Others," which include extra heavy oil, biofuels, coal-to-liquids/gas-to-liquids, and natural gas liquids. With aggregate decline rates of around 4.5 percent per year, FIP provide a diminishing proportion of the total future capacity. But in terms of the conventional oil asset life cycle, exploration replenishes the appraisal project inventory, which feeds into sanctioned development projects and ultimately producing fields. Figure 1 is a snapshot of a very dynamic system.

[Figure 1: Global Liquids Productive Capacity Outlook, 2005-2030, in million barrels per day. Source: IHS Cambridge Energy Research Associates.]
This summary does not show evidence of a peak in oil productive capacity before 2030. However, it does emphasize the importance of future exploration and the role of unconventional liquids in generating growth in the future. IHS CERA believes that unconventional liquids already contribute around 14 percent of total global capacity, and we expect this share to grow to 23 percent by 2030. The contribution of exploration is emerging as one of the key uncertainties and is the subject of current IHS CERA research. This model assumes that:

• The oil price stays above the cost of the marginal barrel for most of the period to 2030.
• Adequate existing and future resources exist to support these sustained volumes of higher capacity.
• The industry can build the hardware and develop the technical capability to implement investment programs.
WHAT ARE THE CHALLENGES TO PRODUCING A ROBUST OUTLOOK?

Predicting future productive capacity hinges on an in-depth understanding of a complex multicomponent system, which is driven by the interplay of both aboveground and belowground factors. It is not realistic to treat the global oil endowment as if it were simply in a tank being emptied. IHS CERA's experience of evaluating productive capacity over two decades suggests that there are no unique answers, a point reinforced by the wide variety of published outlooks noted above. As part of our ongoing research program IHS CERA has concentrated on a number of factors that will strongly influence future supply:

• Data. The IHS CERA reference case outlook is based largely on the IHS International Field and Well Data database, which is arguably the most comprehensive commercially available upstream data set. A reliable and comprehensive database is critical to any credible forecast, but the complexity of the analysis requires some bold assumptions to be made. Even a perfect data set would generate a wide range of possible outcomes in modeling such a complicated system. The debate about future supply and data has tended to focus on subsurface technical data, especially reserves data. But there is a wide range of sources related to aboveground drivers that is also crucial in assessing country-specific economic data and projections (which drive supply) as well as rig count, yard space, and service sector capability.

• Reserves. To date the analytical core of this debate appears to have hinged on knowledge of field and global reserves (Mills, 2008). Oil and gas reserves are defined as the volumes that will be commercially recovered in the future. Hydrocarbons are trapped in reservoirs underground and can't be physically audited or inspected, so estimates are based on the evaluation of data that provide evidence of the scale of the reserve base. The Society of Petroleum Engineers (SPE) has produced a detailed set of six categories of
reserves and contingent resources and three categories of undiscovered prospective resources (ref: SPE website http://www.spe.org/speapp/spelindex.jsp). These reserve estimates entail large degrees of uncertainty, and a lot of experience and judgment are required in performing the calculations. Given the complexity of the calculations there are no unique answers at the individual field or global levels, and we still don't know exactly how much has been discovered or what remains to be found, despite any claims to the contrary. Current estimates can only be considered as order-of-magnitude figures. The questionable use of resource estimates is well illustrated by Hubbert's (1982) approach, which suggests that a peak of production occurs when half of the global inventory of supply has been produced. This seems plausible, given that some 1.1 trillion barrels of oil has been produced to date and there are apparently some 1.2 trillion barrels remaining to be produced. What this approach does not acknowledge is that this analysis is based on proven plus probable conventional reserves alone, which amount to 2.3 trillion barrels. It ignores all the remaining categories of conventional and unconventional reserves and resources (including possible, contingent, and prospective reserves), defined by the SPE, which could ultimately contribute at least as much again. IHS CERA estimates that global resources could be approximately 4.8 trillion barrels, including just over 1.1 trillion barrels of cumulative production to date (see the IHS CERA Private Report Why the Peak Oil Theory Falls Down: Myths, Legends, and the Future of Oil Resources). It is clear that we are dealing with a finite resource, but more consistency in reserves reporting and further systematic studies are needed, such as the United States Geological Survey (2000) study of global YTF resources, to improve the quality of the numbers. Remaining reserves data are an important constraint on the future supply analysis, but given the uncertainties they can be used only as a broad guideline.

• Decline rates and field performance. At the core of IHS CERA's productive capacity model is an extrapolation of historical production data into the future. We have completed a study of over 1,000 fields to understand the characteristics of field production through the buildup, plateau, and decline phases. Central to this analysis was an attempt to estimate typical decline rates for a range of field sizes and types in different geological and geographic environments. Information from relatively mature, data-rich areas such as the North Sea and Norway suggested that decline rates were well above an alarming 10 percent on an individual field basis, so it was important to complete this study to develop a more accurate and representative picture around the world. All oil fields start to deplete the day production start-up occurs, but not all fields are in decline. From our 1,000-field study database only 40 percent of production comes from fields in decline, suggesting, perhaps surprisingly, that a significant proportion of all production comes from fields building up or on plateau. This study showed that the average decline rate for fields was 7.5
percent, but this number falls to 6.1 percent when the numbers are production weighted. The numbers were subsequently corroborated by the IEA (2008). Importantly, the aggregate decline rate of all fields currently in production (which includes fields building up and on plateau) works out to be around 4.5 percent. It is anticipated that aggregate decline rates might increase slowly with time, and also that ultimate recovery will continue to increase in the medium term. Giant fields are still the cornerstone of global production. Some 548 giant oil fields contribute 61 percent of global production; and although production from the giants has risen, that proportion has remained steady in recent years. Recent IHS CERA research on giant oil fields shows that collectively the giant fields are not in decline and some 60 percent of their recoverable oil remains to be produced. The number of giant field discoveries has declined in recent years, but their contribution seems unlikely to plummet in the near term.

• Costs and capability. The IHS CERA Upstream Capital Costs Index (UCCI) is a set of indices used to monitor the current state of the global upstream cost environment. Set at 100 in 2000, it more than doubled by the end of 2008 (230). This means that oil companies were essentially spending twice as much to undertake the same amount of work as in 2000. Recently the UCCI has declined by 8.5 percent, putting costs back to early 2008 levels; and although oil prices recently fell back to 2004 levels, costs are projected to drop only an additional 10 percent over the next six months, bringing them back to 2007 levels by third quarter 2009. Some service sectors, such as the deepwater rig market, will sustain a high pricing structure because of sustained demand; others, such as jackup rig markets, have softened and may continue to do so. Current upstream sector demographics are such that a large proportion of experienced professionals will retire in the next ten years.
The industry has acknowledged this for a number of years and has taken steps to hire and train a new generation of experts, but this may be too little too late. In the current downturn the industry is again in danger of further erosion of its skills base. The service sector in particular is under pressure from operating companies to reduce costs, and this means rationalizations of staff, which will seriously restrict the capability of the service sector in future.
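The UCCI arithmetic quoted earlier (index set to 100 in 2000, 230 by end-2008, then an 8.5 percent decline and a projected further 10 percent cut) can be checked with a short Python sketch; the index values are those given in the text, and the mapping of index levels to calendar periods is our reading of it:

```python
# Sketch of the IHS CERA Upstream Capital Costs Index (UCCI) arithmetic
# quoted in the text. Index values are from the text; interpretation is ours.

base_2000 = 100.0          # UCCI set to 100 in the year 2000
end_2008 = 230.0           # "more than doubled by the end of 2008"

# Spending multiple versus 2000 for the same amount of work
multiple = end_2008 / base_2000
assert multiple > 2.0      # i.e., costs more than doubled

# "Recently the UCCI has declined by 8.5 percent..."
after_decline = end_2008 * (1 - 0.085)   # ~210, back to early-2008 levels

# "...projected to drop only an additional 10 percent"
projected = after_decline * (1 - 0.10)   # ~189, roughly 2007 levels

print(f"multiple vs 2000: {multiple:.2f}")
print(f"after 8.5% decline: {after_decline:.1f}")
print(f"after further 10% cut: {projected:.1f}")
```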
Any outlook can present only one potential version of the future. IHS CERA uses a reference case productive capacity outlook to generate three scenarios for future production (Asian Phoenix, Break Point, and Global Fissures), which enable an understanding of the range of possible drivers of future supply and describe three feasible outcomes (see the IHS CERA Multiclient Study Dawn of a New Age: Global Energy Scenarios for Strategic Decision Making: The Energy Future to 2030). Recent oil price volatility has further reinforced the point that the future is highly uncertain and a range of outcomes should be considered.
THE BIG PICTURE

It would be easy to interpret the following recent market and oil price events in isolation to support the belief that a peak in global supply has passed or is imminent:

• Oil price spike to $147 per barrel in July 2008
• Tight supply-demand balance of around 2.5 mbd through mid-2008
• Considerable decline in global production to around 83 mbd
However, these events are linked to an array of economic and political factors; they do not herald the onset of a peak, and at the simplest level they illustrate that the market continues to act as the shock absorber of major volatility. Supply continues to respond to prices (conditioned by expectations of future demand), and simultaneously demand responds to prices. Improved data availability and transparency could help to produce more accurate outlooks for future capacity, but even this will not provide unique, reliable answers. Subsurface data on reserve levels and decline rates are only a part of the story. Any prediction of the date of the peak based on subsurface data alone would be unreliable. Once it is possible to accurately model the following building blocks, and only then, will a truly reliable and useable range of outcomes become available:

• Future course of the global economy
• Balance and impact of the complex web of geopolitics
• Future course of oil prices
• Course of government policies that focus on controlling demand
• Development of renewable energy sources and climate change issues
Many projections, including those based on the methodology of Hubbert, fail to account for the impact of economics, technology, or geopolitics (Deffeyes, 2005), while others concentrate on conventional oil alone (Bentley et al., 2007) and fail to account for the growing proportion of unconventional oil being developed and produced. IHS CERA tackled this issue by developing a possible range of outcomes through plausible scenarios for the future of global energy (see the IHS CERA Multiclient Study Dawn of a New Age, cited above). Even this comprehensive study does not present a unique base case projection, but rather develops the three scenarios noted above (Asian Phoenix, Global Fissures, and Break Point), extending to 2030. In Asian Phoenix the center of economic and political gravity shifts to Asia. Strong growth in China and India puts them on a path to eventually challenge the United States for global economic preeminence. In Global Fissures, a widespread political backlash against free trade and globalization, combined with global trade and political disputes, lowers economic growth and weakens energy prices. One of the triggers is a hard landing of the United States economy. Global Fissures reflects the current global climate most closely.
In only one scenario, Break Point, do we envisage a period of very tight supply, where supply difficulties would limit production growth, but with no imminent peak in sight. In 2006 we anticipated that oil prices would reach $150 per barrel. In this scenario fear of peak oil encourages programs to enhance energy efficiency and accelerate growth of alternative fuels, and oil loses its monopoly on transportation.

Looking ahead, the upstream industry faces many challenges. There is little doubt that the existing and possible future resource base can support growth in capacity through 2030. There is no shortage of new projects or exploration potential to replenish the hopper. Exploration and field upgrades have tended to replace global production in recent years. It has been said that the "easy oil" has all been discovered, but this statement reflects access and commercial challenges rather than fundamental exploration potential or operational issues in every environment. Exploration is not yet in terminal decline, and while recently some 12 billion barrels of oil has been discovered annually, the five-year moving average is actually growing (see Figure 2).

[Figure 2: World Liquids Resource Discovery and Production, 1930 to 2007. Legend: OPEC liquids discovery; non-OPEC liquids discovery; liquids production; historical discoveries in all current OPEC countries (excludes Gabon and Indonesia). Y axis: billion barrels. Source: IHS Cambridge Energy Research Associates.]
The longer-term problem lies not below ground, but in obtaining the investment and resources that the industry will need to grow supply significantly from current levels. Both OPEC and non-OPEC countries have a strong current inventory of some 450 projects under development. The recent fall in oil prices has precipitated a slowdown in the rate at which projects are being sanctioned and developed, but this temporary situation will ease when the global economy starts to recover. The projected medium-term slowdown in the rate of supply growth is a simple function of economics rather than evidence of an imminent peak.
However, not everything is working according to plan. Non-OPEC growth has been worryingly anemic for five years, driven largely by slowing growth of productive capacity in Russia. Non-OPEC may well struggle to regain the annual growth levels greatly exceeding 500,000 bd that were common before 2004. OPEC countries will be a key element of future growth, but prolonged periods of low oil prices (below $60 per barrel) and abundant spare capacity of around 6.5 mbd might well start to inhibit long-term supply growth. But just over the horizon a period of strong economic growth could quickly reverse this trend. However, structural changes currently occurring in the service sector in response to falling costs will pose a threat to future supply expansion. After nearly a decade of strong growth in response to increasing demand, some service sector companies are downsizing, and this will affect the ability of the service sector to bring on new supply at an appropriate pace.

While the current economic situation has driven a reduction in E&P investment, it has also coincidentally provided a supply cushion that will take some time to work its way back into the system. Companies continue to build new productive capacity, albeit at a slower rate than one year ago. Collectively this will provide a short-term cushion until the global economy starts to pick up again from 2010 onwards; and so the current recession has effectively postponed any imminent peak.

There are many areas of overlap between IHS CERA's view of future oil supply and other outlooks. Oil is a finite resource, and at some stage supply will begin to fall short of meeting demand on a consistent basis. The basic differences in opinion appear to center on when this will happen, but what happens after the inflection point is also crucial (e.g., Campbell, 2009; IEA, 2008; and Hirsch et al., 2005).
The view that oil supply will plummet after the inflection point and oil will run out, like the gasoline in an automobile, is misleading for the layperson. IHS CERA believes that this inflection point will herald the beginning of an undulating plateau of supply which will last for perhaps two decades before a long, slow decline sets in (see Figure 3). It marks the start of a transition period when traditional market forces and government policy will be unable to adjust supply to meet growing demand and limits are reached. Peak demand is an equally important concept and may well be viewed in hindsight as the main driver of peak supply.
[Figure 3: Undulating Plateau versus Peak Oil: Schematic. Shows historical production (about 1.1 trillion barrels cumulative) against the IHS CERA 2009 reference case liquids capacity (about 2.4 trillion barrels post-2010) and conventional crude capacity (about 1.9 trillion barrels post-2010), with an undulating plateau extending from roughly 2010 to 2070. Y axis: million barrels per day. Source: Cambridge Energy Research Associates.]
REFERENCES

1. Bentley R.W., Mannan S.A., and Wheeler S.J. (2007), "Assessing the date of the global oil peak: The need to use 2P reserves," Energy Policy, Elsevier, vol. 35, pp. 6364-6382, December 2007.
2. Campbell C.J. ed. (2002), The Essence of Oil & Gas Depletion: Collected Papers and Excerpts, Multi-Science Publishing Company Ltd., 341 pages.
3. Deffeyes K.S. (2005), Beyond Oil: The View from Hubbert's Peak, Princeton University Press.
4. Hirsch R.L., Bezdek R., and Wendling R. (2005), "Mitigating a long term shortfall of world oil production," World Oil, May 2005, pp. 47-53.
5. Hubbert M.K. (1982), Techniques of Prediction as Applied to the Production of Oil and Gas, U.S. Department of Commerce, NBS Special Publication 631, May 1982.
6. Klett T.R. and Gautier D.L. (2005), "Reserve growth in oil fields of the North Sea," Petroleum Geoscience, vol. 11, no. 2, pp. 179-190, May 2005, Geological Society of London.
7. Leggett J. (2006), Half Gone: Oil, Gas, Hot Air and the Global Energy Crisis, Portobello Books Limited.
8. Mills R.M. (2008), The Myth of the Oil Crisis: Overcoming the Challenges of Depletion, Geopolitics, and Global Warming, Westport, Conn.: Praeger.
9. Al-Bisharah M., Al Fattah S., and Nashawi I.S. (2009), Forecasting OPEC Crude Oil Supply, Society of Petroleum Engineers Paper Number 120350-MS.
10. U.S. Geological Survey World Petroleum Assessment 2000: Description and Results, USGS World Energy Assessment Team, 2000.
11. World Energy Outlook 2008, International Energy Agency, 2008.
THE IMPORTANCE OF TECHNOLOGY-THE CONSTANT WILD CARD RODNEY F. NELSON Senior Vice President for Technology and Strategy, Schlumberger, Ltd. Houston, USA
The modern oil industry is approaching 100 years of age. During that time we have moved from collecting oil at ancient surface seeps to imaging the subsurface with startling precision below 10,000 feet of water and 5,000 feet of salt. The constants throughout this period are the steady progression of the technology deployed and the amazing complexity of the earth which is slowly revealed to us. We have certainly learned that the hydrocarbon endowment we have been given is much larger than anyone imagined even a few years ago.
INDUSTRY MACRO
Many of you may have seen versions of this chart before, or at least the data in another form. It is the history of the last 40 years of the oil business. From the oil shocks of the 1970s, through the huge over-capacity of the 1980s, the first Gulf war, the sudden increase in non-OECD demand in 2004, and more recently the current recession, it's all here. For the future, the estimates for the next few years have been updated to include a scenario based on the latest IEA Medium-Term Oil Market Report, and you can see the increased current spare supply capacity that has resulted from a combination of lower demand and new supply. As a result of this combination, prices have fallen from last summer's highs and investment levels have dropped.

Here, however, I'd like to add a word of caution. Even though investment in exploration and production almost tripled from 2000 to 2008, the industry didn't add very much additional oil production capacity. As industry observers have pointed out, some of this investment was consumed by inflation across the supply chain, but even so production capacity outside a handful of OPEC producers hardly changed, and within non-OPEC producers it either levelled off or began to fall. As long-term global energy demand remains little changed, I, for one, remain concerned that the inevitable higher finding and development costs of new supply, coupled with lower oil and gas prices and more restrictive credit markets, are stifling investment flows. This situation, if it persists, could lead to inadequate supply when demand growth returns.
Increasing demand and natural production decline create growing need for significant new production capacity
While we are seeing lower demand, we are also seeing lower overall supply capacity. Two-and-a-half million barrels per day of expected additional production capacity have already been lost, and the forecasts do not show much change in that number over the near term. Much of the expected capacity lost comes from the delay or cancellation of projects associated with the more-difficult-to-produce heavy oils, which are uneconomic at lower product prices. But that is only part of the story. As fields progress through their natural life cycle they begin to decline. This creates the constant need to invest in new capacity, which must offset this decline to keep production constant or grow it. As this slide indicates, even with conservative estimates of production decline, the new capacity required grows to dramatically large numbers. Obviously this wedge of liquid demand will be met by some combination of OPEC, non-OPEC and unconventional fuels.
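The size of this "wedge" can be illustrated with a minimal sketch. The 4.5 percent aggregate decline rate is the figure quoted earlier in this volume; the 85 mbd starting base and the ten-year horizon are illustrative assumptions, not forecasts:

```python
# Illustrative sketch of why field decline creates a large wedge of required
# new capacity. Decline rate (4.5%) is the aggregate figure quoted in the
# text; the 85 mbd base and 10-year horizon are assumptions for illustration.

base_production = 85.0   # mbd, assumed starting point
decline_rate = 0.045     # aggregate decline of fields in production
years = 10

remaining = base_production
new_capacity_needed = 0.0
for _ in range(years):
    lost = remaining * decline_rate     # production lost to decline this year
    remaining -= lost
    new_capacity_needed += lost         # capacity that must be added just to
                                        # hold total production flat

print(f"base still flowing after {years} years: {remaining:.1f} mbd")
print(f"new capacity needed to stay flat: {new_capacity_needed:.1f} mbd")
```

Even before any demand growth, roughly a third of today's production would have to be replaced within a decade under these assumptions.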
Proven Reserves versus Production
As everyone here knows, the earth's endowment of oil is not distributed evenly by country. OPEC member countries hold the majority of the remaining conventional oil reserves. This, of course, creates geopolitical concerns across the globe.
[Figure: Drilling Intensity in the United States, 1954-2007.]
The distribution of remaining reserves is partly due to natural endowment and partly due to production history. The United States, for example, has drilled more wells than any other country. And as you can see here, the footage drilled per year normalized to production has varied dramatically over the past 50 years.
[Figure: Drilling Intensity in Libya, 1999-2007, plotted with oil production.]
To give you a comparison, here is a similar plot for Libya. The corresponding drilling footage intensity is orders of magnitude less. One conclusion from an analysis like this is that there is considerably more upside potential for significant discoveries in Libya than in the U.S., for example.
Most analysis tends to focus on conventional oil. But, as you know, not all oil is created equal. This plot, taken from an IEA report, is, I think, particularly instructive. On the X axis are the estimated resources in billions of barrels. On the Y axis is the estimated production cost of those reserves as of 2008, in 2008 dollars. The rectangles estimate the size and the cost range of different classes of oil.

Under these circumstances, the challenges to which new technology must respond are two-fold. First, operating costs in environments where new resources exist are high. Even in today's recessionary climate, deepwater project expenditures have been little reduced, with the cost of operating in up to two kilometers of water to drill and develop reserves below two kilometers of salt remaining elevated. Technology that can mitigate risk in such drilling and completion operations therefore continues to be in demand. Second, the complexities of such potential reservoirs require sophisticated measurement and modeling before and during exploration and development to ensure that the right well is drilled and the right information collected. More and more this is becoming a matter of integration across previously somewhat discrete technologies, with the goal being to improve reservoir performance.
For natural gas, supply and demand present a similar story, with the difference that supply is changing in a different way as the commodity rapidly becomes a global business. After four decades of nearly uninterrupted growth, worldwide demand for gas is still expected to increase at an average rate of 1.8% per year over the 25-year period from 2006 to 2030. This is nearly double the average increase in oil demand over the same period. By 2030, natural gas demand will represent 22% of total energy demand, while for the next two decades the power generation sector will account for nearly 60% of the growth.
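The compound growth quoted above is easily checked; a 1.8% average annual rate over 2006-2030 implies roughly a 50 percent rise in total demand:

```python
# Quick check of the compound growth quoted in the text: 1.8 percent per year
# over the 24 years from 2006 to 2030.

rate = 0.018
years = 2030 - 2006
growth_factor = (1 + rate) ** years

print(f"demand multiple, 2030 vs 2006: {growth_factor:.2f}")
```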
• Significant untapped resources
• Developing global LNG market
• Geographically / politically dispersed
• "Cleaner"
• Lead time for power generation: nuclear, 10 years; coal, 5 years; gas, 1-2 years
The largest relative growth in demand will come from Asia and the Middle East, driven not only by increasing use for power generation, but also by housing needs and by use as feedstock for the petrochemical industry. By 2030, these two regions will account for a combined 30% share of global gas demand, up from 19% today. And while natural gas is expanding outside traditional consuming countries, a significant share of the projected production increase will come from the Middle East, with most of the remainder coming from the Former Soviet Union countries and Africa. Such changing patterns are leading to a global change in inter-regional gas trading, which is expected to more than double over the period to 2030, fueled by liquefied natural gas supply and transportation. In the medium to longer term, therefore, significant efforts will be needed to find and produce considerably more gas than is available today. But just as in the case of oil, we must also look at where future supply will lie, as this will guide a number of the needed technology development efforts; for just as the age of easy oil is over, so perhaps is the age of easy gas.
[Slide: Worldwide gas in place, by resource type: conventional, about 14,000 Tcf; shale, about 15,000 Tcf; coal-bed; tight; hydrates; worldwide gas consumed to date.]
We know that considerable natural gas resources exist from estimations such as those in the 2009 BP Statistical Review, which puts even conventional resources at over 185 trillion cubic meters, a figure more than double the corresponding 1980 estimate. But you can see that the majority of today's known resources are non-conventional, coming from tight sand, shale and coal-bed methane accumulations. Nowhere is this trend more evident than in North America, where non-conventional gas now represents more than 40% of U.S. domestic production, a figure that has been made possible by some exciting new technologies that maximize the contact between the shale formation and the well bore completion, such as horizontal wells drilled and fractured hydraulically in multiple stages to enhance well productivity. Worldwide, however, non-conventional gas resources represent only 10% of total production, with commodity prices and project costs dictating whether, where and when their development will expand. That said, major coal-bed methane projects already exist in China and in Australia. Yet even if we were to limit ourselves to conventional resources, much remains to be done, and this will require considerable new technology.
Summary: Energy is a Long-Term Industry

• World energy demand forecast to increase by about 45% by 2030
• Fossil fuels will supply 80% of this as alternative energies lack scale and investment within this timeframe
• Oil demand growth strongest in the developing economies and weakest in the OECD; natural gas a global issue
• OPEC production increases as non-OPEC production peaks, but increasing production hinges on adequate and timely investment and new technology
• Non-conventional resources will play a greater role in the energy mix
• Energy and the environment are inextricably linked, with global energy-related CO2 emissions increasing 45% by 2030
Technology Challenges

• Deepwater exploration will need a changing technology mix. The priority will be on mitigating risk and on service execution
• Enhancing production from existing fields will require improved workflows, faster well construction, improved completions and better efficiency. The focus will be on increasing performance
• Technologies for unconventional hydrocarbon production will become more important. Service intensity will increase
• Technology development is a long-term commitment that must be maintained through the cycle
• Environmental footprint is increasingly important and must be reduced
One can divide the E&P challenges into at least four major categories:

• Cost-effective production of known reserves.
• Reducing risk in exploration of new reserves.
• Expanding our capabilities in deeper and harsher environments such as deepwater and the arctic.
• And, finally, technology specifically developed to unlock unconventional hydrocarbons.
[Figure: R&E Spend versus Hydrocarbon Resources: percentage breakdown of R&E spend across known reserves, yet-to-be-explored, deep + harsh, unconventional and CO2 categories, 2007 versus 2008.]
And just to show you that we, and I am sure others, are putting our money where our mouths are, this plot shows you the breakdown by category of our R&D spend and how it has evolved from 2007 to 2008. I can assure you that the 2009 spend is even more heavily shifted towards unconventional resources and CO2 sequestration.
[Figure: The Value of Technology Integration: pre-drill prognosis, data acquisition while drilling, and integrated drilling update, supported by a Drilling Operations Support Center.]
Drilling involves two distinct cycles. The first is planning, which is long-term, while the second is execution, which is short-term. Seismic-guided drilling changes this by integrating the two in a real-time model which is continuously updated with new information as the well is drilled.

Seismic-guided drilling uses an earth model, updated in near real-time. In this picture the left-hand seismic section is representative of the model as it stands when we start drilling a given well. It incorporates all our prior understanding of the sub-surface. When we add real-time seismic logging-while-drilling data, we can update the model as we drill. In the field, we use our own InterACT transmission technology to send the information to a dedicated support team at a Drilling Operations Support Center. At the Center, the team analyzing the data consists of experts from various disciplines, including geophysics, petrophysics and drilling engineering. The data are compared to the earth model and any deviations are analyzed. When these reach certain thresholds, a complete re-imaging of the sub-surface is undertaken. This incorporates all available information as well as the new information gathered while drilling. With conventional technology even this advanced re-imaging can take several months, a time period far too long to have any bearing on the drilling of the well. Using a unique combination of software, process and computer technology, however, we have developed a solution that significantly reduces the time required for re-imaging to match the timeframe available during drilling, so that characterization measurements can almost immediately influence the operation.

On the right-hand side you can see the result in this example case. As you can see, the top of structure was not found where our pre-drill prognosis forecast it to be. The availability of the while-drilling seismic data allowed us to update the model and
sidetrack the well, shown in blue, to be able to drill into the top of the potential reservoir.
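The decision loop just described (compare while-drilling data to the earth model, and trigger a full re-imaging only when deviations cross a threshold) can be sketched roughly as follows. The deviation metric, threshold value and data structures are purely illustrative assumptions, not the actual InterACT or support-center workflow:

```python
# Rough sketch of a threshold-triggered model-update loop of the kind
# described in the text. All numbers and functions here are illustrative
# assumptions, not the actual Schlumberger implementation.

REIMAGING_THRESHOLD = 0.15   # assumed: max tolerable model-vs-data mismatch

def deviation(model_prediction, measured):
    """Relative mismatch between the earth model and while-drilling data."""
    return abs(model_prediction - measured) / max(abs(measured), 1e-9)

def drill_with_updates(model_predictions, measurements):
    """Return the depth indices at which a full re-imaging would trigger."""
    reimaging_points = []
    for depth, (pred, meas) in enumerate(zip(model_predictions, measurements)):
        if deviation(pred, meas) > REIMAGING_THRESHOLD:
            # Deviation too large: re-image the subsurface with all data
            # gathered so far, then continue drilling on the updated model.
            reimaging_points.append(depth)
    return reimaging_points

# Toy example: the model tracks the data until a surprise at index 3
preds = [2.0, 2.1, 2.2, 2.3, 2.4]
meas  = [2.0, 2.1, 2.2, 3.0, 2.5]
print(drill_with_updates(preds, meas))
```

The point of the real system, as the text notes, is that the expensive re-imaging step has been made fast enough to fit inside the drilling timeframe.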
Technology Deployed: Productive Drilling
Three years ago we introduced a new innovation in logging-while-drilling services under the PeriScope brand name. This technology uses electromagnetic measurements to determine the position of the bottom-hole assembly with respect to nearby formation boundaries, including reservoir tops, bottoms and fluid contacts. Here is an example where the red path traces the trajectory of an extended-reach well. The scale is vastly distorted: the yellow reservoir is about 15 feet thick, and the section shown extends just over 2,000 feet horizontally. Through measurement and modeling, PeriScope tools yield the information needed for rotary-steerable systems to keep the well within the most productive part of the reservoir.

Since its introduction, the technology has been deployed in more than 20 countries and has drilled well over one million feet in reservoirs ranging from coal-bed methane to heavy oil, and from complex sands to carbonates. It has allowed horizontal wells to be optimally placed in complex geologies where previous attempts were unsuccessful; it has improved production rates by increasing the length of the well bore placed within the pay zone; and it has improved recovery by eliminating early breakthrough of gas or water.

So how do you make a good thing better? Well, you make it read deeper. PeriScope technology can see up to some 15 feet around the tool. This is fine for well placement in thin reservoirs, but not quite good enough for thicker beds or for helping steer into the reservoir in the first place, and this leads us to the next generation: PeriScope UltraDeep.
Waterflood and Recovery Monitoring
Downhole permanent pressure gauge, electrical array and production logging tool continually measure pressure, resistivity and injection rate.
The same type of integration you have just seen in drilling also applies to production, where we see downhole valves integrated with downhole instrumentation and modeling to yield higher recovery rates and ultimately leave less oil behind.
Technology Development: In-Situ Downhole Fluid Analysis

[Slide: optical fluid analysis for fluid identification, oil/water ratio, fluid color and gas detection, with a development timeline from 1992 onwards.]
[Figures: oil consumption (total = 330.9); pH values and number of experimental sites; variation in hydrochemical parameters downstream from a karst spring.]
INTERNATIONAL BACKGROUND (IGCP379)
IGCP379 (1994-1999): Karst Processes and the Carbon Cycle. Objectives:
1. Global uptake of atmospheric CO2 by karst processes.
2. Global deep-source CO2 outgassing in karst regions.
3. Karst records of environmental change.
NSFC key project: Karst processes and carbon cycle in typical karst regions of China (40231008), 2003-2006.
THE FINAL PRODUCT OF IGCP379
[Images: the IGCP299 and IGCP379 books.]

Limestone dissolution rate measurements:

Location          Rainfall (mm/a)   Dissolution Rate (mm/ka)
Malaysia          5000              180
Madagascar        1800              135
French Alps       850               20
Serbia            3400              31
Shanxi, China     400               10.7
Yichang, China    1200              84.9
Guilin, China     1900              40 (subaerial), 80 (subsoil)
The Carbonate Rock Dissolution Rate of the World (M. Pulina)

[Map: world carbonate rock dissolution rates, with legend classes in mm/ka (roughly 10-20, 20-30, 30-40, 40-60, 60-100, and more than 100).]
The estimation of the CO2 sink in the surface karst dynamic system of the world:
6.08×10^8/a (Yuan, China, 1997)
2.2×10^8/a (Kazuhisa, Japan, 1996)
3.02×10^8/a (Philippe Gombert, France, 1999)
That makes about 20-40% of the world CO2 "missing sink".
Plates of the Earth (red points: tufa deposits)

[Figure 69: Major lithospheric plates of the Earth, with regions of generation and spreading of crust (mid-ocean ridges) and regions of destruction of crust (subduction zones).]
There is little disagreement that additional CO2 in the atmosphere will enhance the greenhouse effect. However, these seemingly plausible statements are either demonstrably false or not verified by rigorous theory or observation. The relationship between radiative forcing and surface temperature response does not have theoretical underpinning, and the sensitivity factor λ can only be estimated from computer models. The value of λ given by different computer models varies over a relatively broad range; there is no way of assessing whether λ should have a low value or a high value. The IPCC, without rigorous scientific analysis, suggests that the average of all models is the most realistic estimate that should be used. Faced with such uncertainty it is reasonable to re-examine the scientific premises. It comes as little surprise that our understanding of the climate system has advanced since the premises were first formulated more than two decades ago. It is surprising that the IPCC has not incorporated new knowledge into its description of the climate system and its evaluation of computer model performance outlined in the most recent 2007 assessment report.

CARBON DIOXIDE AND RADIATION TO SPACE

CO2 absorbs and emits radiation within selected bands of the infrared spectrum. That is, within these bands the CO2 molecules absorb radiation that has been emitted from the earth's surface; the intensity of that emission is characteristic of the local surface temperature. Also, within these bands the CO2 molecules emit radiation in all directions, but with intensity that is dependent on the prevailing gas temperature and its emissivity. Treating the atmosphere as a layer, we find the emission to space is of much less intensity than the radiation emitted from the surface. This is because the earth's surface is much warmer than the cold high layer of the atmosphere from which the radiation to space originates.
However, the lowest warm layer of the atmosphere is also emitting radiation back to earth. What is of importance in this discussion is the change in radiation intensity as the concentration of CO2 varies. Figure 1 illustrates how the changing concentration of CO2 affects the radiation intensity, both the emission from the atmosphere to space and the downward emission from the atmosphere to earth. These calculations have been performed using the MODTRAN 3 radiation transfer model, based on the U.S. Standard Atmosphere under clear sky conditions. As the CO2 concentration of the atmosphere increases, the infrared radiation in the CO2 wavelengths emanates from a higher, colder altitude and the intensity decreases. At the surface, the downward infrared radiation emanates from a lower, warmer altitude as the CO2 concentration increases.

[Footnotes: The Intergovernmental Panel on Climate Change (IPCC) refers to the 'radiation forcing' as the reduction in upward directed infrared at the tropopause due to the increase in CO2 concentration. MODTRAN is a medium resolution radiation transfer model and is accessible through the University of Chicago at http://geosci.uchicago.edu/archer/cgimodels/radiation.html.]
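The qualitative point, that emission from a higher, colder level is weaker than emission from the warm surface, follows directly from the Stefan-Boltzmann law. A minimal black-body sketch, using illustrative round-number temperatures rather than MODTRAN output, shows the size of the contrast:

```python
# Minimal black-body sketch of why emission from a high, cold atmospheric
# layer is much weaker than emission from the warm surface. The temperatures
# are illustrative round numbers, not MODTRAN output.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(T):
    """Total black-body emission (W/m^2) at temperature T (kelvin)."""
    return SIGMA * T ** 4

surface = blackbody_flux(288.0)       # ~15 C surface temperature
high_layer = blackbody_flux(220.0)    # cold upper-tropospheric emission level

print(f"surface emission:    {surface:.0f} W/m^2")     # ~390 W/m^2
print(f"high-layer emission: {high_layer:.0f} W/m^2")  # ~133 W/m^2
```

Raising the effective emission altitude lowers the emission temperature and hence the intensity; this is the effect that Figure 1 quantifies with a full radiative transfer calculation.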
Fig. I,' Changes in upward infrared emission to space, downward emission at the surface (both LH scale), and net radiation loss from the atmosphere (RH scale) for changing concentrations of CO 2. (Computed from MODTRANS for the U.S. Standard Atmosphere and clear sky). Two points of Figure I are of interest: 1. As the concentration of CO 2 increases the reduction in intensity of the emission to space is similar in magnitude to the corresponding increase in intensity of downward radiation at the surface. As a consequence, as CO2 concentration increases there is only a small increase in net radiation loss from the atmospheric layer. 2. Figure 1 does not give support to the notion that, as the atmospheric CO 2 concentration increases, there is more absorption of infrared radiation by the atmospheric layer, leading to warming of the atmosphere. There is an equal or greater loss of energy to the surface as downward emission increases with increasing CO 2 concentration. The notion of radiation forcing is further weakened when the variation with latitude of net radiation at the top of the atmosphere (solar absorption less infrared emission) is considered. Figure 2 clearly shows a surplus of solar radiation over tropical latitudes and excess emission to space over polar latitudes. Nowhere are surface temperatures determined by local radiation balance. In order to achieve overall global
radiation balance, large quantities of energy are transported from the tropics to polar regions by the ocean and (principally) the atmospheric circulations. As a consequence of the poleward transport of energy, the polar temperatures are warmer than they would be under local radiation equilibrium. Moreover, the polar temperatures (and the ice mass magnitude - glaciation) will vary as the poleward energy transport varies. The ocean and atmosphere are two interacting fluids and it is to be expected that the partitioning of the poleward energy transport between them will vary over a range of timescales. Indeed, there is every reason to believe that the partitioning will fluctuate with time such that polar temperatures fluctuate on similar timescales.
Fig. 2: Zonal mean variation with latitude of net radiation (solar absorption minus infrared emission to space) at the top of the atmosphere (TOA). (Trenberth and Caron, 2001).
The message of Figure 2 is that the ocean and atmospheric circulations are continually acting to bring about overall global radiation balance at the top of the atmosphere. There is no unvarying steady state. At times the climate system is accumulating energy and at other times there is a net loss of radiant energy, depending on the changing ice mass, the changing energy storage of the respective fluids and the thermodynamics of the fluid flows. This is evident because the earth's annual climate cycle is not exactly repeated. In addition, known ocean-atmosphere phenomena such as El Nino and various multi-decadal oscillations reflect major variations of the climate cycle.

The anthropogenic global warming hypothesis is critically dependent on the assumption that a reduction of infrared radiation to space in the CO2 wavelength bands will cause the earth to warm and increase the intensity of emissions across the rest of the radiation spectrum. This assumption does not take cognisance of the fact that, at least for tropical and subtropical latitudes, the main variation in infrared radiation emission to space is brought about through variations in cloud and water vapour distribution.

Trenberth, K.E. and J.M. Caron, (2001) "Estimates of meridional ocean and atmospheric heat transports." J. of Clim. 14:3433-3443.
[Figure 3 plot: NCEP/NCAR Reanalysis OLR (W/m^2) climatology, 1968-1996, Jan to Dec.]
Fig. 3: Spatial variations of climatological infrared radiation to space (OLR). Radiation to space is reduced in the regions of deep tropical convection because the emission largely emanates from the high cold cloud tops. Radiation is highest in the regions of dry descending air where the emission emanates from warm layers near the surface. Radiation is also reduced over the cold polar regions.

The dominant control of cloud and water vapour distribution can be readily seen in Figure 3. In regions of recurring deep convective clouds with tops in the high cold troposphere, such as over the Congo and Amazon Basins and the warm equatorial oceans extending from the Indian Ocean to the western Pacific Ocean, the radiation to space is reduced. In contrast, over much of the subtropics and other regions of dry subsiding air the radiation to space emanates from much lower in the atmosphere where temperatures are warmer. Variations in infrared radiation emission to space can be more than 80 Wm-2 from cloud to cloud-free regions. In addition, these spatial patterns are not fixed in time but vary on hourly, daily, weekly and longer scales, including the annual cycle and from year to year. There are major disruptions to the cloud and outgoing infrared radiation patterns during El Nino events when the deep convective clouds form over the central and eastern equatorial Pacific Ocean. The changing cloud and moisture patterns during El Nino
events significantly change the magnitude of poleward energy transport and the pattern of infrared radiation emission to space.

Interactions between the ocean and atmospheric fluids regulate internal variability of the climate system, especially the changing poleward transport of energy and the changing cloud and moisture patterns. These internal processes have a dominant control over the magnitude and pattern of infrared radiation to space. It is not plausible that the only response to a change in CO2 concentration, and its small reduction of infrared radiation to space, will be an increase of surface temperature. The small decrease in infrared radiation to space resulting from CO2 increase will be overwhelmed by the magnitude of the ever-changing patterns resulting from the atmospheric circulation and associated cloud and moisture distribution. There is no sound theoretical basis to expect a reduction in infrared radiation to space in the relatively narrow CO2 wavelength bands to be directly and unequivocally linked to an increase in surface temperature.

CARBON DIOXIDE AND SURFACE ENERGY EXCHANGE

In contrast to the situation in the upper atmosphere and the ever-changing infrared radiation to space, any change in CO2 concentration and downward infrared radiation will directly affect the surface energy balance and surface temperature. An increase in the concentration of atmospheric CO2 will increase the downward infrared radiation and tend to warm the surface. The magnitude of the actual surface temperature rise will be regulated by the response of the other surface energy exchange processes to the CO2 radiation forcing.

At the surface, the energy inputs are solar radiation and the back radiation from the atmosphere (the emissions of infrared radiation from the greenhouse gases, principally water vapour and CO2, and clouds).
The surface energy losses are primarily by way of direct heat exchange between the surface and the atmosphere, latent energy exchange between the surface and the atmosphere due to evaporation of water, and the emission of infrared radiation from the surface. There is also a loss or gain of energy to surface storage (in the land surface or ocean surface layer) if the surface temperature is warming or cooling, but this is small compared to the energy exchange processes and is neglected here.

The increase in downward radiation, ΔF_CO2, due to increased CO2 concentration will vary the magnitudes of the surface energy exchange processes and cause an increase in surface temperature, ΔTs, given by:

ΔF_CO2 = [dFu/dT + dLH/dT + dH/dT - dS/dT - dFd/dT] * ΔTs    (1)

Here: dS/dT is the rate of change of solar radiation absorbed at the surface with temperature; dFd/dT is the rate of change of back radiation with temperature; dFu/dT is the rate of change of surface emission with temperature; dH/dT is the rate of change of direct surface heat exchange with temperature; and dLH/dT is the rate of change of latent energy exchange with temperature.
The magnitude of solar radiation at the surface will vary with cloudiness changes but not directly with variation of CO2 concentration. Cloudiness may change with the surface temperature of the earth, but a priori we do not know the direction or magnitude of any potential change. In the first instance solar radiation is treated as a constant that does not change with temperature.

The downward infrared radiation at the surface varies directly with greenhouse gas concentration and the temperature of the air near the ground. The main greenhouse gases are water vapour and CO2; water vapour concentration varies with temperature and CO2 concentration varies with fossil fuel usage. In the context of anthropogenic global warming, CO2 is the forcing process; atmospheric temperature and water vapour concentration are response processes. The back radiation at the surface will increase as the concentration of either CO2 or water vapour increases.

The direct exchange of heat between the surface and atmosphere varies with the vertical gradient of air temperature at the surface. However, the atmosphere has a relatively low thermal capacity and the temperature of the air near the ground increases as the surface temperature increases. Consequently, the rate of heat exchange between the surface and atmosphere does not vary appreciably as the surface temperature changes; it is ignored in this discussion.

The infrared emission from the surface varies with emissivity and temperature according to the Stefan-Boltzmann Law. The emissivity varies with the nature of the surface (land, vegetation or ocean) but not with temperature.

The evaporation of water that exchanges latent energy between the surface and the atmosphere varies with the wetness of the surface (water body, moist soil, evapotranspiration from plants, etc.) and the vapour pressure gradient near the surface. The IPCC suggests that the relative humidity near the surface does not vary with temperature.
More than 70 percent of the Earth's surface is water and ice, and there is no a priori information on how the wetness and vegetation of land surfaces may vary with temperature. It is assumed that the rate of evaporation and latent energy exchange vary according to the Clausius-Clapeyron relationship (the rate of change of saturation vapour pressure with temperature).

Recognising that solar absorption and direct heat exchange vary little with temperature, equation 1 can be reduced to:

ΔF_CO2 = [dFu/dT + dLH/dT - dFd/dT] * ΔTs    (2)

and rearranged to:

ΔTs = ΔT_CO2 / (1 - r)    (3)

where

ΔT_CO2 = ΔF_CO2 / [dFu/dT + dLH/dT]    (4)

and

r = dFd/dT / [dFu/dT + dLH/dT]    (5)
Here ΔT_CO2 is the direct surface temperature response resulting from CO2 forcing and 1/(1 - r) is the feedback amplification due to atmospheric temperature and water vapour increase. It is important to note that the rate of change of surface energy loss with temperature, given by [dFu/dT + dLH/dT], constrains both the direct surface temperature response to radiation forcing and the magnitude of the feedback amplification.
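The arithmetic of equations (3) to (5) can be sketched numerically. This is an indicative calculation only, using the values quoted elsewhere in this paper (dFu/dT = 5.4 and dFd/dT = 4.8 Wm-2 °C-1 at 15°C, a 78 Wm-2 latent flux growing at 7% °C-1, and a back radiation increase of about 4 Wm-2 for doubled CO2); it is not a substitute for the MODTRANS computations.

```python
# Indicative sketch of equations (3)-(5) using the values quoted in the text.
dFu_dT = 5.4           # surface emission change, W m^-2 per deg C (Stefan-Boltzmann at 15 C)
dFd_dT = 4.8           # back-radiation change, W m^-2 per deg C (MODTRANS estimate)
dLH_dT = 78.0 * 0.07   # latent energy exchange: 7% per deg C of the 78 W m^-2 global average
dF_co2 = 4.0           # back-radiation increase for a doubling of CO2, W m^-2

dT_co2 = dF_co2 / (dFu_dT + dLH_dT)   # eq. (4): direct response
r = dFd_dT / (dFu_dT + dLH_dT)        # eq. (5): feedback ratio
dTs = dT_co2 / (1.0 - r)              # eq. (3): amplified surface response

print(round(dT_co2, 2), round(r, 2), round(dTs, 2))
```

The result, roughly 0.6-0.7°C for a doubling of CO2, is consistent with the figure derived later in the text; algebraically the amplified form (3) is identical to dividing the forcing by the net loss rate [dFu/dT + dLH/dT - dFd/dT].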
Fig. 4: Changing magnitudes of the major surface energy exchange processes over the range of typical temperatures of the Earth's surface. (The back radiation is computed for the U.S. Standard Atmosphere under clear sky conditions using the MODTRANS model.)

At Figure 4 are plotted the magnitudes of the major surface energy exchange processes across a range of temperatures typical of the Earth's surface. The surface emission is according to the Stefan-Boltzmann Law (emissivity = 1) while the back radiation is computed using the MODTRANS radiation transfer model for the U.S. Standard Atmosphere (approximately average global temperature and moisture) under clear sky conditions and constant relative humidity. Latent energy exchange is according to the Clausius-Clapeyron relationship (7 percent change with each degree Celsius variation: 7% °C-1), scaled to the global average exchange of 78 Wm-2 at 15°C.

What is clear from Figure 4 is that the magnitudes of surface emission and back radiation increase in near parallel, as is to be expected because the temperatures of the surface and near-surface atmosphere also increase in near parallel. As a consequence, there is little change in the magnitude of net infrared radiation loss from the surface across the temperature range. It is the latent energy exchange, approximately doubling in magnitude with every 10°C temperature rise, which dominates the changing surface
energy loss with temperature. The importance of evaporation for limiting surface temperature has previously been discussed by Priestley (1966).
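The Clausius-Clapeyron scaling assumed above can be written as a one-line function. This is a sketch of the 7% °C-1 growth from the 78 Wm-2 global average at 15°C stated in the text, not an evaporation model; the function name and defaults are illustrative.

```python
# Latent energy exchange scaled to the global average of ~78 W/m^2 at 15 C,
# growing ~7% per deg C (Clausius-Clapeyron), as assumed in the text.
def latent_heat_flux(T_celsius, base=78.0, rate=0.07, T_ref=15.0):
    """Indicative latent energy exchange (W m^-2) at surface temperature T."""
    return base * (1.0 + rate) ** (T_celsius - T_ref)

print(latent_heat_flux(15.0))                              # the 78 W m^-2 anchor
print(latent_heat_flux(25.0) / latent_heat_flux(15.0))     # ~1.97: near doubling per 10 C
```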
[Figure 5 plot: net surface energy loss (W m-2, roughly 190-220) against temperature (°C, roughly 10-20), with a constant line for solar and other unvarying inputs; annotation: CO2 increases back radiation and reduces net surface energy loss.]
Fig. 5: The magnitude of the net surface energy loss with the solar absorption and other processes that do not vary with temperature, scaled to be in steady state at the Earth's mean temperature of 15°C. As CO2 concentration increases the back radiation also increases, thus reducing the net surface energy loss. The surface temperature rises to a new steady state for energy balance with the near-constant energy processes.

When the magnitude of the net surface energy loss (net infrared radiation plus latent energy) is plotted against temperature and scaled for steady state at the average temperature of the Earth, as in Figure 5, it is found that the surface temperature is relatively stable. A small change in surface temperature, either to a lower or a higher value, causes the surface energy loss to be out of balance with the steady energy input and there is a strong tendency to return to the steady state temperature.

A change in the atmospheric CO2 concentration will also cause a shift to a new steady state surface temperature. For example, a doubling of the CO2 concentration from prevailing values will increase the back radiation by about 4 Wm-2. As a consequence, the net surface energy loss will be reduced by an equal magnitude and the surface energy
Priestley, C.H.B. (1966) "The limitation of temperature by evaporation in hot climates." Agr. Meteorol., 3:241-246.
processes are out of balance. A new steady state is achieved by an increase in surface temperature of about 0.6°C, as shown in Figure 5. It should be noted that this adjustment to surface temperature is independent of changes that might be wrought by changing atmospheric circulation and distributions of cloud and moisture patterns.

The changing CO2 concentration will directly affect the local surface temperature because of the impact that CO2 concentration has on back radiation and the ensuing surface energy balance. Unlike the tenuous connection between CO2-forced change to the infrared radiation to space and surface temperature, the change in back radiation has a direct impact on surface temperature and the effect is mathematically tractable. Moreover, because of the rapid increase of latent energy exchange with temperature, the surface temperature rise is constrained to a relatively small response.

THE EXAGGERATED RESPONSE OF COMPUTER MODELS

There is nearly an order of magnitude difference between the relatively small surface temperature response of 0.6°C to a doubling of CO2 concentration calculated above and the projected responses quoted by the IPCC. The latter are based on computer models, with individual estimates ranging from 1.1°C to about 6.4°C. The key to the difference can be found in the formulation of the changing rate of latent energy exchange with temperature. Over a water surface with constant relative humidity the rate of increase in evaporation (and latent energy exchange) with temperature will equate to the Clausius-Clapeyron relationship of 7% per degree Celsius, all other factors not varying. Held and Soden (2006) have identified that for the computer models used in the IPCC fourth assessment, on average the rate of increase of evaporation with temperature rise was only about one-third this value. This low value in computer models was confirmed by Wentz et al.
(2007), who identified a range of 1-3% K-1 for the global average evaporation increase across the models.

The anomalous reduction in the rate of evaporation increase with temperature, as specified in computer models, has significant consequences for the magnitude of temperature projection under CO2 forcing. The tendency to return to the steady state temperature is weakened. The slope of the curve of Figure 5 is reduced and surface temperature must rise by a larger magnitude to recover from the same radiative forcing of a doubling of CO2. More importantly, if the rate of increase of evaporation with temperature is significantly less than the Clausius-Clapeyron relationship then the surface temperature response becomes very sensitive to CO2 forcing. The reduction in latent heat exchange with temperature means that the offsetting energy loss necessary to arrive at a new steady state from back radiation forcing must come from additional infrared radiation emission. That is, the new steady state energy exchange will be at a higher surface temperature than if the evaporation was following the Clausius-Clapeyron relationship.
Held, I.M. and B.J. Soden, (2006) "Robust responses of the hydrological cycle to global warming." J. of Clim. 19:5686-5699.
Wentz, F.J., L. Ricciardulli, K. Hilburn and C. Mears, (2007) "How much more rain will global warming bring?" Science Express, 31 May 2007.
The changing sensitivity of surface temperature to radiative forcing under different evaporation rate assumptions can be readily assessed by way of equation 3 above. At the average temperature of the Earth (15°C) the rate of increase of surface infrared emission with temperature change is given by the Stefan-Boltzmann Law as 5.4 Wm-2 °C-1. The equivalent rate of increase of back radiation with temperature can be assessed, for example, using the MODTRANS radiation transfer model. With the assumptions that the U.S. Standard Atmosphere approximates the mean profile of the atmosphere, that relative humidity is constant (that is, the atmospheric water vapour increases with temperature in accordance with the Clausius-Clapeyron relationship) and ignoring clouds, it is found that the natural rate of increase in back radiation at the surface is about 4.8 Wm-2 °C-1.

Table 1 sets out indicative values for the sensitivity of surface temperature to radiative forcing for a range of rates of latent energy exchange with temperature. The value of 6% °C-1 is the global average estimate by Wentz et al. (2007) based on satellite estimates of changing precipitation during global warming of recent decades. It is less than the Clausius-Clapeyron relationship but this is not unexpected given the magnitude of arid and semi-arid land areas. The other values are typical for computer models (GCM) used in the IPCC fourth assessment of 2007.

Table 1: Indicative values of surface temperature increase from a doubling of CO2 concentration and with a range of rates of increase of evaporation with surface temperature. The rates of surface latent energy exchange, dLH/dT, correspond to global values assessed from satellite analysis, and to values corresponding to computer models (GCM) used in the 2007 IPCC fourth assessment.

dLH/dT                 | ΔTs/ΔF_CO2   | ΔTs (2 x CO2)
6% °C-1 (satellites)   | 0.16°C/Wm-2  | 0.6°C
2% °C-1 (average GCM)  | 0.45°C/Wm-2  | 1.7°C
1% °C-1 (low-end GCM)  | 0.83°C/Wm-2  | 3.1°C
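The pattern of Table 1 can be sketched by rearranging equation 2 into a sensitivity, 1/[dFu/dT + dLH/dT - dFd/dT], and sweeping the assumed evaporation rate. The fluxes are the indicative values from the text; the resulting numbers reproduce the table only approximately, so this is a sketch of the dependence, not a reconstruction of the published figures.

```python
# Sketch of the Table 1 calculation: surface temperature sensitivity under
# different assumed rates of evaporation increase with temperature.
# Base fluxes follow the text (dFu/dT = 5.4, dFd/dT = 4.8 W m^-2 per deg C,
# global latent flux 78 W m^-2); results are indicative only.
def sensitivity(evap_rate_per_degC, base_LH=78.0, dFu=5.4, dFd=4.8):
    """Surface temperature rise per W m^-2 of forcing, eq. (2) rearranged."""
    dLH = base_LH * evap_rate_per_degC
    return 1.0 / (dFu + dLH - dFd)

for rate, label in [(0.06, "satellite estimate"),
                    (0.02, "average GCM"),
                    (0.01, "low-end GCM")]:
    s = sensitivity(rate)
    print(f"{label}: {s:.2f} C/(W m^-2) -> {3.7 * s:.1f} C for 2xCO2")
```

The weaker the assumed evaporation response, the larger the temperature rise needed to restore balance, which is the point the table is making.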
It is very clear from Table 1 that the surface temperature response to CO2 forcing is very sensitive to the specification of the rate of increase of evaporation, and hence latent energy exchange, with temperature increase. The analysis of Table 1 clearly points to a high likelihood that the computer models used as the basis for the IPCC estimates of anthropogenic global warming are significantly exaggerating the projected global temperature response. If we accept that the rate of surface evaporation will increase at near the Clausius-Clapeyron relationship, then a doubling of CO2 concentration, from current levels to near 800 ppm by the end of the 21st century, is not likely to cause a global temperature rise exceeding 1°C. Such a rise is well within the range of natural variability and should not be construed as dangerous.
ISSUES WITH SURFACE EVAPORATION

Surface evaporation, and the associated latent heat exchange, is a very difficult process to quantify. Over extensive water bodies the thermal capacity of the mixed surface layer is often a sufficient source of energy and the primary regulating factors on evaporation are wind speed, atmospheric stability and the vertical vapour pressure gradient. The relationship between the factors is not linear and evaporation can vary significantly in space and time. Over land the surface and vegetation have only a limited thermal capacity and
evaporation additionally responds to solar insolation, plant moisture availability and surface wetness.

The estimation of surface evaporation is further complicated in computer models because the regulating factors often vary over a wide range within the scale area of computation. Simple averaging is inadequate because of the non-linear relationships involved. The magnitude of global precipitation provides a suitable closure condition for estimating global evaporation, but this does not assist in formulating methodologies for estimating spatial and temporal variation across the ocean and land surfaces.

The highlighted difficulties of estimating evaporation are compounded in the estimation of the rate of change of evaporation with surface temperature. It is clear, however, that it is the rate of change of evaporation (and latent heat exchange - see equations 3, 4 and 5) with surface temperature change that is fundamentally important for estimating the magnitude of the global surface temperature response to greenhouse gas forcing.

CONCLUSION

Carbon dioxide is a greenhouse gas and interacts with the Earth's infrared radiation, both the emission to space and the back radiation at the surface. Contrary to popular explanations, it is not the reduction in radiation to space across the CO2 bands that is important for enhancing the greenhouse effect; it is the increase in back radiation at the surface that is important, because it directly leads to an adjustment of the surface temperature. An increase in the concentration of CO2 will enhance the greenhouse effect but the magnitude remains controversial.
Water vapour is important in regulating the magnitude of the enhanced greenhouse effect in two ways: increased water vapour in the atmosphere has an amplifying effect on the CO2 forcing because it further increases the back radiation as temperature rises; and, more importantly, any increased evaporation and latent heat exchange between the surface and atmosphere constrains the surface temperature rise. It is the evaporation that is dominant because 1) the Earth's surface is more than 70 percent ocean and much of the remainder is covered by transpiring vegetation; and 2) the rate of increase of evaporation with temperature approximately follows the Clausius-Clapeyron relationship, nearly doubling with each 10°C temperature rise. A doubling of CO2 concentration by the end of the century from current levels is expected to cause a modest global temperature rise not exceeding 1°C.

The computer models on which the IPCC based its fourth assessment projections have been shown to significantly underestimate the rate of increase of evaporation with temperature. The indicative analysis presented here suggests that projections of global temperature made by these contemporary computer models are nearly an order of magnitude too large. As a consequence, a better representation of evaporation and surface latent heat exchange in computer models, particularly the important response to surface temperature, is a primary requirement if the uncertainty about anthropogenic global warming is to be reduced. Without this improvement the projected temperature response to anthropogenic forcing will continue to be exaggerated.

It is also evident that suggestions of the Earth passing a 'tipping point' temperature, and even going into a phase of 'runaway global warming', are an outcome of the flawed computer models and do not represent a realistic future scenario. The extensive oceans
and the hydrological cycle are a natural constraint on global temperature and dangerous anthropogenic global warming is not a feasible outcome.
ON THE OBSERVATIONAL DETERMINATION OF CLIMATE SENSITIVITY AND ITS IMPLICATIONS

RICHARD S. LINDZEN AND YONG-SANG CHOI
Earth, Atmospheric and Planetary Sciences, Massachusetts Institute of Technology, Cambridge, Massachusetts, USA

PREFATORY REMARKS

The following paper is, indeed, significantly different from what was presented in Erice. Both the original version of this paper as well as the published paper that the present work is an expansion on have been widely circulated, and there has been a very helpful response that has led us to examine all aspects of the original paper more carefully. The result has led to some changes in analysis, some expanded explanations, and, finally, some changes in results, though the last has not been very great. The new analysis is, we think, much more robust and comprehensible. Among the items addressed are the compensation for the 36-day precession period of the ERBE satellite; the original ignoring of the fact that, in the observations, it was necessary to distinguish radiation changes that resulted from surface temperature changes from both noise and those radiation changes that forced the temperature changes; the use of a more reasonable zero-feedback flux; and the undue smoothing of the time series for shortwave outgoing radiation. We have also added arguments concerning the concentration of feedbacks in the tropics.

INTRODUCTION

It is usually claimed that the heart of the global warming issue is so-called greenhouse warming. This simply refers to the fact that the earth balances the heat received from the sun (mostly in the visible spectrum) by radiating in the infrared portion of the spectrum back to space. Gases that are relatively transparent to visible light but strongly absorbent in the infrared (greenhouse gases) will interfere with the cooling of the planet, thus forcing it to become warmer in order to emit sufficient infrared radiation to balance the net incoming sunlight.
By the net incoming sunlight, we mean that portion of the sun's radiation that is not reflected back to space by clouds and the earth's surface. The issue then focuses on a particular greenhouse gas, carbon dioxide. Although carbon dioxide is a relatively minor greenhouse gas, it has increased significantly since the beginning of the industrial age from about 280 ppmv to about 390 ppmv, and it is widely accepted that this increase is primarily due to man's emissions. However, it is also widely accepted that the warming from a doubling of carbon dioxide would only be about 1°C (based on simple Planck black body calculations; it is also the case that a doubling of any concentration in ppmv produces the same warming because of the logarithmic dependence of carbon dioxide's absorption on the amount of carbon dioxide). This amount of warming is not considered catastrophic and, more importantly, is much less than current climate models suggest the warming from a doubling of carbon dioxide will be. The usual claim from the models is that a doubling of carbon dioxide will lead to warming of from 1.5°C to 5°C and even more. What then is really
fundamental to 'alarming' predictions? It is the 'feedback' within models from the much more important greenhouse substances, water vapor and clouds. Within all current climate models, water vapor increases with increasing temperature so as to further inhibit infrared cooling. Clouds also change so that their net effect, resulting from both their infrared absorptivity and their visible reflectivity, is to further reduce the net cooling of the earth. These feedbacks are still acknowledged to be highly uncertain, but the fact that these feedbacks are strongly positive in most models is considered to be a significant indication that the result has to be basically correct. Methodologically, this is a most peculiar approach to such an important issue. In normal science, one would seek an observational test of the issue. As it turns out, it may be possible to test the issue with existing data from satellites, and there has recently been a paper (Lindzen and Choi, 2009) that has attempted this, though, as we will show in this paper, the details of that paper were, in important ways, incorrect. The present paper attempts to correct the approach and arrives at similar conclusions.

FEEDBACK FORMALISM

A little bit of simple theory shows how one can go about doing this. In the absence of feedbacks, the behavior of the climate system can be described by the following illustration.
[Figure 1 schematic: ΔQ passes through G0 to give ΔT0.]

Fig. 1: A schematic for the behavior of the climate system in the absence of feedbacks. ΔQ is the radiative forcing, G0 is the zero-feedback response function of the climate system, and ΔT0 is the response of the climate system in the absence of feedbacks. The checkered circle is a node.

Figure 1 symbolizes the temperature increment, ΔT0, that a forcing increment, ΔQ, would produce with no feedback,

ΔT0 = G0 ΔQ    (1)

It is generally accepted (Hartmann, 1994) that without feedback, a doubling of carbon dioxide will cause a forcing of ΔQ ≈ 3.7 Wm-2 (due to the black body response), and will increase the temperature by ΔT0 ≈ 1.1°C (Schwartz, 2007). We therefore take the zero-feedback response function of (1) to be G0 ≈ 0.3 (= 1.1/3.7) °C W-1 m2 for the earth as a whole. With feedback, Figure 1 is modified to
Fig. 2. A schematic for the behavior of the climate system in the presence of feedbacks.
The response is now

ΔT = G0 (ΔQ + F ΔT)    (2)

Here F is a feedback function that represents all changes in the climate system (for example, changes in cloud cover or humidity) that act to increase or decrease feedback-free effects. Thus, F should not include the response to ΔT that is already incorporated into G0. The choice of zero for the tropics in Lindzen and Choi (2009) is certainly incorrect in this respect. At present, the best choice seems to remain 1/G0 (3.3 Wm-2 °C-1) (Colman, 2003; Schwartz, 2007), though a lower value than this might be appropriate due to the high opacity of greenhouse gases. Solving (2) for the temperature increment ΔT we find

ΔT = ΔT0 / (1 - f)    (3)
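A minimal sketch of equation (3): the zero-feedback warming of about 1.1°C for doubled CO2 is amplified (or damped) by 1/(1 - f), where f is the dimensionless feedback fraction, and the response diverges as f approaches 1. The function name is illustrative.

```python
# Equation (3): equilibrium warming amplified by the feedback fraction f.
def equilibrium_warming(f, dT0=1.1):
    """dT = dT0 / (1 - f); only meaningful for f < 1 (f >= 1 is runaway)."""
    if f >= 1.0:
        raise ValueError("f >= 1 implies an unstable (runaway) system")
    return dT0 / (1.0 - f)

for f in (-0.5, 0.0, 0.5, 0.75):
    # negative f damps the response; f = 0 recovers dT0; positive f amplifies
    print(f, round(equilibrium_warming(f), 2))
```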
The dimensionless feedback fraction is f = F G0. From Figure 2, the relation of the change in flux, ΔFlux, to the change in temperature is given by

ΔFlux - ZFB = -(f/G0) ΔT    (4)

The quantities on the left side of the equation indicate the amount by which feedbacks supplement the zero-feedback response to ΔQ (ZFB). At this point, it is crucial to recognize that our equations, thus far, are predicated on the assumption that the ΔT to which the feedbacks are responding is that produced by ΔQ. Physically, however, any fluctuation in ΔT should elicit the same flux regardless of the origin of ΔT. When looking at the observations, we emphasize this by rewriting (4) as

f = -G0 (ΔFlux - ZFB) / ΔT    (5)
When restricting ourselves to tropical feedbacks, equation (5) is replaced by

f ≈ -(G0/2) [(ΔFlux - ZFB) / ΔSST]_tropics    (6)

where the factor 2 results from the sharing of the tropical feedbacks over the globe following the methodology of Lindzen, Chou and Hou (2001) (see Appendix 2 for more explanation). The longwave (LW) and shortwave (SW) contributions to f are given by

f_LW = -(G0/2) [(ΔOLR - ZFB) / ΔSST]_tropics    (7a)

f_SW = -(G0/2) [ΔSWR / ΔSST]_tropics    (7b)
Here we can identify ΔFlux with the change in outgoing longwave radiation (OLR) and shortwave radiation (SWR) measured by satellites associated with the measured ΔSST, the change of the sea-surface temperature. Since we know the value of G0, the experimentally determined slope allows us to evaluate the magnitude and sign of the feedback factor f, provided that we also know the value of the zero-feedback flux. Note that the natural forcing, ΔSST, that can be observed, is different from the equilibrium response temperature ΔT in Eq. (3). The latter cannot be observed since, for the short intervals considered, the system cannot be in equilibrium, and over the longer periods needed for equilibration of the whole climate system, ΔFlux at the top of the atmosphere is restored to zero. Indeed, as explained in Lindzen and Choi (2009), it is, in fact, essential that the time intervals considered be short compared to the time it takes for the system to equilibrate, while long compared to the time scale on which the feedback processes operate (which are essentially the time scales associated with cumulonimbus convection). The latter is on the order of days, while the former depends on the climate sensitivity, and ranges from years for sensitivities of 0.5°C for a doubling of CO2 to many decades for higher sensitivities (Lindzen and Giannitsis, 1998). Finally, for observed variations, there is the fact that changes in radiation (as for example associated with volcanoes) can cause changes in SST as well as respond to changes in SST, and there is a need to distinguish these two possibilities. This is not an issue with model results from the AMIP program where observed variations in SST are specified. Of course, there is always the problem of noise arising from the fact that clouds depend on factors other than surface temperature.
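The measurement idea behind equations (5) and (6) amounts to regressing the flux change on the SST change and converting the slope into a feedback fraction. The sketch below uses synthetic stand-in data (not ERBE or CERES values), folds ZFB into the regressed slope for simplicity, and takes an arbitrary illustrative true slope of -4 Wm-2 °C-1; it shows the procedure, not a result.

```python
# Sketch of estimating the feedback fraction from a flux-vs-SST regression,
# per eq. (6). All data here are synthetic stand-ins, not satellite values.
import random

G0 = 0.3  # zero-feedback response, deg C per W m^-2 (from the text)

random.seed(0)
n = 100
dSST = [random.gauss(0.0, 0.2) for _ in range(n)]          # synthetic SST fluctuations (C)
slope_true = -4.0                                           # illustrative (dFlux - ZFB)/dSST
dFlux = [slope_true * x + random.gauss(0.0, 0.3) for x in dSST]  # with cloud "noise"

# Least-squares slope: cov(dSST, dFlux) / var(dSST)
mx = sum(dSST) / n
my = sum(dFlux) / n
slope_est = (sum((x - mx) * (y - my) for x, y in zip(dSST, dFlux))
             / sum((x - mx) ** 2 for x in dSST))
f_est = -(G0 / 2.0) * slope_est   # eq. (6): tropical feedback shared over the globe

print(round(slope_est, 2), round(f_est, 2))
```

A negative regressed slope (outgoing flux falling as SST rises) maps to a positive feedback fraction, which is the sign convention of equations (4) to (6).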
THE DATA AND ITS PROBLEMS

[Figure 3: plot of 36-day averaged and monthly SST anomalies, 1986-2008; plot data not recoverable from the scan.]
Fig. 3: Tropical mean (20°S to 20°N latitude) 36-day averaged and monthly sea surface temperature anomalies with the centered 3-point smoothing; the anomalies are referenced to the monthly means for the period 1985 through 1989. The SST anomaly was scaled by a factor of 0.78 (the area fraction of ocean in the tropics) to relate it to the flux. Red and blue colors indicate the major temperature fluctuations exceeding 0.1°C.

Now, it turns out that sea surface temperature is measured (Kanamitsu et al. 2002), and is always fluctuating, as we see from Figure 3. High-frequency fluctuations, however, make it difficult to objectively identify the beginning and end of warming and cooling intervals (Trenberth et al. 2010). This ambiguity is eliminated with a 3-point centered smoother. (A two-point lagged smoother works as well.) In addition, the net outgoing radiative flux from the earth has been monitored since 1985 by the ERBE satellite, and since 2000 by the CERES instrument aboard the Terra satellite (Wielicki et al. 1998). The results for both longwave (infrared) radiation and shortwave (visible) radiation are shown in Figure 4. The sum is the net flux. With ERBE data there is, however, the problem of satellite precession, with a period of 36 days. In Lindzen and Choi (2009), which used ERBE data, we attempted to avoid this problem (which is primarily of concern for the shortwave radiation) by smoothing the data over 7 months. It has been suggested (Takmeng Wong, personal communication) that this is excessive smoothing. In the present paper, we start by taking 36-day means rather than monthly means. The CERES instrument is flown on a sun-synchronous satellite, for which there is no problem with precession. Thus for the CERES instrument we use conventional months. However, here too we examine the effect of modest smoothing. Both ERBE and CERES data are best for the tropics. The ERBE field-of-view is between 60°S and 60°N.
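The 3-point centered smoother mentioned above can be sketched as follows. This is our own minimal illustration; the paper does not specify how the endpoints are handled, so copying them unchanged is an assumption here.

```python
def smooth3(x):
    """Centered 3-point running mean.

    A minimal sketch of the smoothing described in the text; endpoint
    handling (endpoints copied unchanged) is our assumption.
    """
    y = list(x)
    for i in range(1, len(x) - 1):
        y[i] = (x[i - 1] + x[i] + x[i + 1]) / 3.0
    return y

# Example: a noisy spike in an SST anomaly series is damped, making the
# beginning and end of warming/cooling intervals easier to delimit.
series = [0.0, 0.3, 0.0, 0.4, 0.1]
smoothed = smooth3(series)   # endpoints unchanged, interior averaged
```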
For latitudes 40° to 60°, 72 days are required instead of 36 days to reduce the precession effect (Wong et al. 2006). Both data sets have no or negligible shortwave radiation in the winter-hemisphere high latitudes, which would compromise our analysis. Moreover, our analysis involves relating changes in outgoing flux to changes in SST. This is appropriate for regions that are mostly ocean covered, like the tropics or the
southern hemisphere, but distinctly inappropriate for the northern extratropics. However, as we will argue in an appendix, the water vapor feedback is almost certainly restricted primarily to the tropics, and there are reasons to suppose that this is also the case for cloud feedbacks. The methodology developed in Lindzen, Chou, and Hou (2001) permits the easy extension of the tropical processes to global values. Finally, there will be a serious issue concerning distinguishing atmospheric phenomena involving changes in outgoing radiation that result from processes other than feedbacks (the Pinatubo eruption, for example), and which cause changes in sea surface temperature, from those that are caused by changes in sea surface temperature (namely the feedbacks we wish to evaluate). Our admittedly crude approach to this is to examine the effect of considering fluxes with time lags and leads relative to temperature changes. The lags examined are from one to five months. The discussion is in Section 4.
[Fig. 4: ERBE/ERBS NS (36-day average) and CERES/Terra (monthly) anomalies of outgoing longwave and reflected shortwave radiation (W/m²), 1986-2008; plot data not recoverable from the scan. The panels of Figure 5, spanning the same years, also appear here.]
Fig. 5: Comparison of outgoing longwave radiation from AMIP models (black) and the observations (red) as found in Figure 4.
Fig. 6: Comparison of reflected shortwave radiation from AMIP models (black) and the observations (blue) shown in Figure 4.
CALCULATIONS

With all the above readily available, it is now possible to directly test the ability of models to adequately simulate the sensitivity of climate. The procedure is simply to identify intervals of change for ΔSST in Figure 3 (for reasons we will discuss at the end, it is advisable, but not essential, to restrict oneself to changes greater than 0.1°C), and for each such interval, to find the change in flux. Let us define i1, i2, ..., im as selected time steps that correspond to the starting and ending points of intervals. ΔFlux/ΔSST is basically obtained as Flux(i1) - Flux(i2) divided by SST(i1) - SST(i2). As there are many intervals, ΔFlux/ΔSST is the regression slope of the plots (ΔSST, ΔFlux) for a linear regression model. Here we use a zero y-intercept model (y = ax), because the y-intercept is associated with noise other than feedbacks; thus, a zero-intercept model may be more appropriate for the purpose of our feedback analysis. However, the choice of regression model turns out to be minor. As already noted, the data need to be smoothed to minimize noise, and it is also crucial to distinguish ΔSST that force changes in ΔFlux from those that are responses to ΔFlux. Otherwise, ΔFlux/ΔSST can vary (Trenberth et al. 2010) and/or may not represent the feedbacks that we wish to determine. As an attempt to avoid such problems, though imperfectly, we consider smoothing (i.e., use of Flux'(i) and SST'(i), where the prime designates the smoothed value) and lag-lead methods (e.g., use of Flux'(i+lag) and SST'(i)) for the ERBE 36-day and CERES monthly data. For a stable estimate of ΔFlux/ΔSST, the time steps i should also be selected at the maxima and minima of the smoothed SST'. As shown in Figure 3, this study selected intervals for which SST'(i1) - SST'(i2) exceeds 0.1 K. The impact of the threshold for ΔSST on the statistics of the results is minor. Figure 7 shows the impact of smoothing and of leads and lags on the determination of the slope, as well as on the correlation, R, of the linear regression.
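The interval-selection and regression procedure just described can be sketched as follows. This is our own illustration under stated assumptions: the series are taken to be equally spaced and already smoothed, intervals run between successive local extrema of the smoothed SST, and all function and variable names are ours, not the paper's.

```python
# Sketch (ours) of the interval/regression procedure described above.

def turning_points(sst):
    """Indices of the local maxima/minima of the smoothed SST series,
    plus the two series endpoints."""
    idx = [0]
    for i in range(1, len(sst) - 1):
        if (sst[i] - sst[i - 1]) * (sst[i + 1] - sst[i]) < 0:
            idx.append(i)
    idx.append(len(sst) - 1)
    return idx

def slope_zero_intercept(sst, flux, threshold=0.1, lag=0):
    """Regression slope a of dFlux = a * dSST through the origin
    (y = ax), using only intervals with |dSST| > threshold (K).
    `lag` shifts the flux series relative to SST, as in the
    lag/lead tests discussed in the text."""
    pts = turning_points(sst)
    num = den = 0.0
    for i1, i2 in zip(pts, pts[1:]):
        dsst = sst[i2] - sst[i1]
        if abs(dsst) <= threshold:
            continue
        j1, j2 = i1 + lag, i2 + lag
        if j1 < 0 or j2 >= len(flux):
            continue
        dflux = flux[j2] - flux[j1]
        num += dflux * dsst      # least squares for y = ax:
        den += dsst * dsst       #   a = sum(x*y) / sum(x^2)
    return num / den if den else None
```

For a zero-intercept fit the least-squares slope reduces to sum(xy)/sum(x²), which is why no intercept term appears above.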
[Fig. 7: impact of smoothing and of leads and lags on the regression slope and correlation R, for LW and SW fluxes; plot data not recoverable from the scan.]
>9.999 mg/m³ readings recorded during peak dust storms

Count (total number of suspended PM10 particles/m³), size range 0.5 µm to 10 µm:
= 1,314,906 (Navy Lab, Great Lakes, IL; indoors)
= 12,290,917 (Camp Virginia Clinic, Kuwait; indoors)
= 107,261,167 (highest average hourly maximum @1300) (SD = 54,959,015)
= 588,633,693 (highest daily maximum, 18 June @1300)
= 127,643,273 (highest average hourly daily maximum, 13 June) (SD = 34,311,341)
NOTE: Readings recorded during peak dust storms were >706,193,134 particles/m³.

Size range 5.0 µm to 10 µm:
= 36,515 (Navy Lab, Great Lakes, IL; indoors)
= 507,824 (Camp Virginia Clinic, Kuwait; indoors)
= 6,884,417 (highest average hourly maximum @1300) (SD = 4,142,586)
= 44,571,347 (highest hourly maximum, 18 June @1300)
= 5,244,651 (highest average daily maximum, 13 June) (SD = 3,632,501)

Fig. 1: Table of dust particle exposure at Camp Buehring, Kuwait, over a 12-day period.
• At PM10 (particles with aerodynamic diameter of 10 microns), the highest hourly average each day was 2.469 mg/m³, which occurred at 0800. Maximum exposures during dust storms exceeded 10.000 mg/m³ (Figure 1).
• The daily daytime PM10 average for 12 consecutive hours, 0700-1900, was ~0.900 mg/m³ (n=12).
• At peak exposures, particle counts (0.5 to 10 µm range) exceeded 7×10⁸ particles/m³.
Fig. 2: Bioavailable Elements in Dust Particles from Camp Buehring, Kuwait.
• A total of 54 elements were screened for, with 37 different elements identified, of which 15 are bioactive metals, including uranium. Of these, the ones of greatest concern are: Arsenic (10 ppm), Chromium (52 ppm), Lead (138 ppm), Nickel (564 ppm), Cobalt (10 ppm), Strontium (2700 ppm), Tin (8 ppm), Vanadium (49 ppm), Zinc (206 ppm), Manganese (352 ppm), Barium (463 ppm), and Aluminum (7521 ppm) (Figure 2).
• The ratio of Chromium III to Chromium VI is unknown (40-120 ppm = 0.04-0.12 µg/m³ per every mg/m³ of TSP mass at PM10). The U.S. Maximum Exposure Guideline (MEG) for Cr(III) is 12 µg/m³, and 0.068 µg/m³ for Cr(VI).
• Microbiological analysis of these same samples identified 147+ different microbial isolates (six different genera by 16S DNA analysis). Of these, ~30% are human pathogens, 13 are alpha- and/or beta-hemolytic species, and several were found to have antibiotic resistance (Figure 3).
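The ppm-to-airborne-concentration scaling quoted above (40-120 ppm of chromium in the dust corresponding to 0.04-0.12 µg/m³ per mg/m³ of suspended PM10 mass) follows from simple unit bookkeeping, sketched below. The function name is ours; the element abundances are from Figure 2.

```python
def airborne_ug_per_m3(ppm_in_dust, dust_mg_per_m3):
    """Airborne metal concentration (ug/m^3) from its abundance in the
    dust (ppm by mass) and the suspended dust mass loading (mg/m^3).

    1 ppm = 1 ug of metal per g of dust = 1e-3 ug per mg of dust,
    so ug/m^3 = ppm * 1e-3 * (mg of dust per m^3).
    """
    return ppm_in_dust * 1e-3 * dust_mg_per_m3

# Example: 52 ppm chromium at 1 mg/m^3 of PM10 gives 0.052 ug/m^3,
# within the 0.04-0.12 ug/m^3 range quoted above; well under the
# 12 ug/m^3 Cr(III) MEG, but near the 0.068 ug/m^3 Cr(VI) MEG if a
# large fraction of the chromium were hexavalent.
cr_air = airborne_ug_per_m3(52, 1.0)
```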
Best ID thus far / Comment:
- Neisseria meningitidis: meningitis
- Staphylococcus aureus: cystic fibrosis
- Bacillus circulans: gastro-enteritis
- Pantoea agglomerans: septic arthritis
- Pseudomonas [name illegible] / Ralstonia paucula: opportunist; septicemia, peritonitis, abscesses
- Staphylococcus pasteuri: various infections
- Arthrobacter crystallopoietes / Pseudomonas balearica: cystic fibrosis
- Paenibacillus thiaminolyticus: bacteremia
- Bacillus vedderi: obligate alkaliphile
- Bacillus subtilis / Pantoea agglomerans: [comment illegible]
- Pseudomonas pseudoalcaligenes: strains reported to carry metallo-β-lactamase
- Cryptococcus albidus: septicemia and meningitis
- Bacillus clausii: oral bacteriotherapy
- Kurthia gibsonii: diarrhea
- Bacillus firmus: alkaliphile; bread spoilage
- Staphylococcus kloosii / Bacillus mojavensis: [no comment recorded]
[Fig. 3: microbial counts (log scale) versus distance (km, 0-2.5); plot data not recoverable from the scan.]
EXPLORING THE ITALIAN NAVIGATOR'S NEW WORLD: TOWARD FULL-SCALE, LOW-CARBON, CONVENIENTLY-ECONOMIC, AVAILABLE, PROLIFERATION-ROBUST, RENEWABLE ENERGY RESOURCES

LOWELL WOOD
Hoover Institution, Stanford University, Stanford, California, USA

THE "FACTS ON THE GROUND"

Two-thirds of a century after "the Italian Navigator has landed in the New World," nuclear fission-based electricity is ...
• ... manifestly 'economic': higher capital cost trades off vs. lower O&M costs.
AIDS AND INFECTIOUS DISEASES PMP

PAST ACTIVITIES (1)
Since its establishment in 1988, the AIDS and Infectious Diseases PMP has organized several PMP Meetings and Plenary Sessions focused on epidemiological and molecular aspects, prevention, and vaccine development for HIV, with several colleagues, as reported in the Proceedings of the Erice Seminars. The HIV sessions have alternated with sessions on other infectious epidemics with major public health impact:
1. BSE, Bovine Spongiform Encephalopathy;
2. Avian flu;
3. Vector-borne diseases, such as the session held last year;
4. Other emerging diseases.
AIDS AND INFECTIOUS DISEASES PMP PAST ACTIVITIES (2)
• Established the Infectious Agents and Cancer (IAC) online journal, directed by F.M. Buonaguro, with several senior colleagues on the editorial board (including Bob Gallo, Harald zur Hausen, Guy de Thé, Peter Biberfeld, etc.);
• Contributed, along with the Inter-Academy Society, to support the Medical School at Gulu University in Northern Uganda;
• Established the Google Infectious Agents and Cancer group to foster discussions and update PMP participants;
• Started an Infectious Agents and Cancer blog for visibility.
AIDS AND INFECTIOUS DISEASES PMP 2009 ACTIVITIES
This year the AIDS and Infectious Diseases PMP contributed to the Climate PMP session held on August 19th on health-related issues which can be studied with, and possibly prevented by, satellite monitoring of:
1. Soil, precipitation, vegetation, etc., which can clearly define vector habitats: i.e., mosquitoes for malaria, rodents for plague, etc.
2. Dust storms, to prevent exposure to particles with or without pathogenic microorganisms, which can cause specific organ diseases (including respiratory diseases) besides overall immunosuppression.

AIDS AND INFECTIOUS DISEASES PMP PLANNING FOR NEXT YEAR
For next year we are proposing the organization of a Plenary Session during the 2010 Erice Meeting focused on vaccine development strategy, whose presentations will be articulated on:
• Vaccine strategies;
• Adjuvant development;
• Pandemic infections and vaccine preparedness;
• Emerging diseases and vaccine approaches;
• Vaccines and Developing Countries.
Furthermore, we are proposing:
• The establishment of an African network with colleagues of two scientific societies [African Society of Human Genetics (AfSHG) and Journal of Infection in Developing Countries (JIDC)] in order to confirm the clinical relevance of Remote Epidemiology and the possibility of developing a health sentinel system for infectious diseases;
• The organization of a joint PMP session with the Climate PMP, to be held during the 2010 Erice Planetary Emergencies Meeting.
MOTHER AND CHILD PMP
NATHALIE CHARPAK
Kangaroo Foundation, Bogotá, Colombia

MANIFESTO (ERICE 2002): MATERNAL AND CHILD MORTALITY IS A PLANETARY EMERGENCY

MISSION
1. Recognize new, and monitor existing, tools that decrease maternal and infant mortality and morbidity.
2. Highlight the impact of the other planetary emergencies on maternal and infant mortality and morbidity, and thereby contribute to enhancing the quality of life of present and future generations.
Kangaroo Mother Care is our specialty: a technique to decrease the mortality and improve the quality of survival of premature and low birth weight infants at all levels of care. 18 million candidates per year.

Kangaroo position: how to carry your premature baby in the neonatal unit.
Kangaroo nutrition: how to breastfeed your premature infant.

Kangaroo discharge policy: how to go home sooner with your premature infant.
For 20 years we have been working on the scientific evaluation, dissemination and implementation of KMC (and we will never forget that you were the first in supporting this KMC adventure).
• Supporting the creation of Kangaroo Foundations in other countries: Philippines, Vietnam (in process)
• Establishing KMC centers all around the world
• Making available guidelines and material on KMC
• Pursuing good research and being present at big international events: the 2009 national neonatal congress in Italy, the American Academy of Pediatrics in Washington in October 2009, and the neonatology update of Cornell, Columbia and NY universities
Kangaroo Mother Care in Europe
Marina Cuttini, Hospital Bambino Gesù, Rome; European Science Foundation Network...
[Chart: survey of European neonatal units (N=283) on parental presence (mother/father); response categories include "Only if they ask for it" and "Usually not"; data not recoverable from the scan.]

[Map legend: Kangaroo Foundation pilot center; trained KMC centers (big public hospitals); centers willing to be trained.]
In Colombia, 12% of all deliveries are LBW, which means that of 850,000 deliveries a year, 100,000 are LBW infants. We just signed (August 2009) an agreement with the health ministry for editing the Colombian KMC rules and tools for quality evaluation of KMC.
MONITORING THE NEW TOOLS TO DECREASE THE MORTALITY AND MORBIDITY OF MOTHER AND INFANT
Goal: update the KMC evidence-based guidelines 2006-2009 (70 papers). Access to the guidelines is free on the Internet, and they are downloaded every day by professionals from all around the world.

Dr. Socorro Mendoza:
• Pain
• Physiology and thermal stability
• Growth
• Neurodevelopment

Dr. Nathalie Charpak:
• Perception and acceptability of KMC by mothers, parents and health workers
• Resistance from health workers and family
• Implementation of the full KMC intervention
• Diffusion and implementation of KMC

Dr. Juan Gabriel Ruiz:
• Mortality before and after stability
• Morbidity
• Breastfeeding

IMPACT OF TERRORISM ON MOTHER AND CHILD HEALTH
We discussed the shameful phenomenon of child participation in armed conflicts worldwide.
The Child Soldier: a Terrifying Dimension of Global Terrorism. Understanding and increasing awareness to further encourage scientific collaboration towards mitigation of terrorism. A case study: the Colombian internal conflict, by Dr. J.G. Ruiz, Dr. S. De Leon-Mendoza, Dr. N. Charpak.

How incomprehensible it is to see the body of a 15-year-old suicide bomber who killed innocent people believing he was right, or a 14-year-old "sicario" (hired killer) killing for money and buying a refrigerator for his mother with the earnings of the murder. The questions are: why and how do we reach such a situation? What can we do to stop it?

PLANETARY PROBLEM
Since 1990, war has been responsible for:
• 2 million dead children
• 6 million injured children
• 10 million psychologically traumatized children
• 22 million children displaced from their homes
• 300,000 child soldiers all over the world, active in at least 30 countries
Armed conflict traumatizes children, strips them of their innocence, and denies them the protection needed to develop physically, intellectually, spiritually, and socially.
Where?
Countries with Child Soldiers:
Peru (O); Iran (G,O); Turkey (O); Iraq (G,O); Israel and Occupied Territories (G,O); Angola (G,O); Russian Federation (O); Lebanon (O); Tajikistan (O); Papua New Guinea (O); Uzbekistan (O); Nepal (O); Pakistan (O); Philippines (O); Solomon Islands (O); Burundi (G,O); Republic of Congo (G,O); Dem. Rep. of the Congo (G,O); Rwanda (G,O); Uganda (G,O); Myanmar (G,O); Sri Lanka (O); Chad (G); Eritrea (G); Ethiopia (G); Colombia (P,O); Mexico (P,O); Yugoslavia (former Rep. of) (P,O); Algeria (P,O); India (P,O); Indonesia (P,O); East Timor
• VOOILA110N: \lllnt~ 31:1Gllf>i1Iabl!! _. __..• ....J ~"I! 'W~, S\liI(~1iY witn certifilld luml:!er h~lfl5 windows because it IS alltlCIpated tim a Significant llllluber of inctividuals, who should rem,'\in safely sheltered. will begin to request populauon monitoring to confinn that they have not been exposed to ractiation. 9. U,e of contaminated vehicles (e .g .. personal or ma,s transit) for evacuation should not be discomaged in the initial days following a nuclear detomtlon: however, ,inlple imtructiom for nnsing or W3Shmg ,'eludes once decontarllinatioll can be aclueved wlthom Impeding e\'aCUaUOll should be proVided. iO. nlere is no lmiversally accepted threillold of radioactivity (external or internal) above WlliCh a person I> conSidered contarrunated and below which a persolll> considered unc ontamina red. 1L State and local agencies should establish sumvor regmry and locator databases a,
early as pos>ible.. Imtially. the most basic and cntical11lformation to collect from each person is IllS or her name, address. telephone number, and contact information. 12 Plalll1ers should Identify radiation protecllon professionals III their COllll1ltUlity alld
encourage them to ,'oluuteer and register in ally one of the Citizen Corp'> or sImilar programs in theircOIlllUuuity.
Clearly this HSC "guidance" is only the beginning of the needed Federal involvement in defining and supporting the reaction to a nuclear detonation.
SUMMARY OF SOME COMMENTS MADE IN RESPONSE TO THIS PRESENTATION:
Michael MacCracken: 1. There may also be an overt threat of a nuclear detonation, which in the United States would call for the deployment of "NEST" (Nuclear Emergency Search Team) capabilities. The question then is how best to communicate with the public in response to such a threat. 2. California cities prepare for earthquake damage, and other locations in many countries face routine tsunami hazard. Can the medical requirements for CBRN mitigation be related to these natural threats?
Carl Bauer: In many cases the "first responder" may be the engineer or supervisor in charge of the particular building or institution.
Friedrich Steinhäusler: There may be organizational or bureaucratic impediments to helicopter-borne or drone-aircraft radiation surveys of a city, post attack.
John Alderdice: Since the creation of public fear or terror is the purpose of terrorism, anything that can diminish that fear can help to reduce the likelihood of terrorist attack.
ESTABLISHMENT OF A SCIENTIFICALLY-INFORMED RAPID RESPONSE SYSTEM
RICHARD WILSON
Department of Physics, Harvard University, Cambridge, Massachusetts, USA

PREAMBLE
In this discussion I will focus on two very different types of possible terrorist attacks to illustrate the problem and to show how a Rapid Response system could be effective. The two would be the explosion of a "dirty bomb", a Radioactivity Dispersal Device (RDD), in a crowded area of a city, which I will describe as Wall Street, and the wide dispersal of an infectious agent. In each of these it has been argued, and I believe correctly, that the most important action to prepare for terrorism is to be prepared for a natural occurrence: an accident involving radioactivity, or a natural outbreak of a disease such as SARS or H1N1 influenza. Sally Leivesley has cogently argued that there are some very important decisions that must be made very soon after the accident or event: in the first 10 minutes, and maybe an hour or so later. These decisions will not only affect the immediate course an emergency situation will take, but will establish a precedent which may adversely affect the effective recovery from the emergency that we all so fervently desire. Although the most important people to be informed of the situation are the "natural" first responders such as the fire brigades, the general public also wants to be informed. Two factors seem important. Firstly, in an emergency the ordinary channels of communication will be overwhelmed; and secondly, the public, and maybe even the first responders, may not know which source of advice and information to trust. Each of these problems can be mitigated by advance preparation. There are many examples of problems in previous situations. Fifteen minutes after the San Francisco earthquake of 20 or so years ago, the telephone system into the Bay Area was clogged, as more and more of the public became aware of the disaster.
More importantly, in New York City after the two airplanes flew into the World Trade Center, cell phone access was almost impossible.

MY PERSONAL ACTIVITY AS A "PUBLIC EXPLAINER"
I here outline a problem as I have seen it over the last 30 years, to illustrate that it is not easy and demands dedication at crucial times. Over the years, I have been aware of the major public misconceptions, and consequent counter-productive public actions, in matters of radiation. In an emergency, the people who have the duty to take charge usually know nothing about radiation and its effects and have no contact with people who do. In the USA, history shows that the press do not help. After Three Mile Island (TMI), not one major newspaper got the units straight, confusing DOSE and DOSE RATE. Not even the Associated Press quoted the accurate press releases of the NRC. It was a bit better after Chernobyl, but there were numerous nonsense stories, and to my certain knowledge they refused to publish an accurate account from the Pravda correspondent in
Kuwait who filed while on vacation in Kiev after a visit to the power plant. Even the Japanese criticality incident was badly described. The NY Times quoted the site boundary dose rate in R/hr rather than mR per hour, thereby changing a nuisance into a disaster. Fortunately, National Public Radio saw the NY Times story and called me. I had, during the night, called the head of the Japanese Industrial Forum, who had told me all he knew, including the correct number, and the NY Times error was nipped in the bud. At TMI, my ability to help was aided by two facts: (i) my continued friendship with Dr. Robert Budnitz, a former graduate student, then Director of Research at NRC, and (ii) my friendship with Dr. Leo Beranek, then running Channel 5 TV in Boston. The one provided me with accurate information and the means to get more (for example, the telephone number of the TMI control room), and the other provided me with a half-hour news broadcast with no advertisements. It was probably just after TMI that I was asked by an informal group in NY City, the "Scientists Institute for Public Information", to be on a list of scientists who could be called at any hour of day or night to answer questions about radiation. I took this very seriously, and for a month after Chernobyl my phone was constantly ringing. I took it off the hook to sleep. I returned calls from call boxes. After that first month I was on the lecture circuit. I gave approximately 100 lectures, all but one unpaid, and mostly paying my own travel expenses, over the next 6 months. SIPI seems to have vanished, but in my view it needs resurrection and expansion.

THE REQUIREMENTS
• The need for a body of people first responders will trust.
• The need for a body of people the public will trust.
• The need for a reliable set of recommendations.
• The need for a communication network that will not be overloaded or compromised.

A COMMUNICATION NETWORK
I consider three existing communication networks that can be used in an emergency:
• The U.S. military.
• The CERN (Conseil Européen pour la Recherche Nucléaire) and U.S. DOE (Department of Energy) elementary particle physics network.
• Google.
Leigh Moore has noted that the WHO lists a number of websites in the USA for information about possible pandemics; 80% are military sites. But Leigh goes on to note that while these may be trusted by fire brigades and other first responders, they are unlikely to be trusted by the general public. I note that the CERN-DOE network was very active as early as 1970, with dedicated telephone lines. The transatlantic link was originally military: the Advanced Research Projects Agency network (ARPAnet). In 1975 I remember sending to AERE Harwell by
British Airways, on New Year's Day, several magnetic tapes containing the previous week's data from FERMILAB, and on January 2nd sending a brief message "mount tape xxx and run the program yyy". The analyzed data was printed 2 hours later on the computer at the Harvard computer center. My research fellow Dr. Lynn Verhey was able to make a small software modification increasing the speed by a factor of 5, thereby illustrating the importance of having a system, designed for military emergencies, which is regularly exercised. The major DOE laboratories and CERN have expanded the system, and it is used by elementary particle physicists worldwide. Indeed, the World Wide Web system was invented in the late 1980s at CERN. It is no longer necessary to send data by air; data whizzes across the Atlantic all the time. As I write this, my son at FERMILAB informs me that there is a major program between CERN and FERMILAB to handle the vastly increased quantities of data anticipated from the Large Hadron Collider (LHC) at CERN. While the public use of the web has expanded and is now greater than the physics use, the CERN-DOE network is still a major player, largely using dedicated lines. In an international emergency, the research use might be temporarily suspended, allowing the whole system to be instantly available. In addition, as noted in the next paragraph, the CERN-DOE network is used and exercised by a number of dedicated scientists who might be willing, as I was after TMI and Chernobyl, to drop what they are doing and help. For each of these dedicated scientists, of course, advance preparation would be necessary. In 1979, I was not only knowledgeable about radiation and its effects but had studied nuclear reactor safety. The Google network is indeed worldwide and in several languages. It is used worldwide. Whether or not it is trusted is more doubtful.
But unless careful advance planning is made, it will be overwhelmed in an emergency.

PROPOSAL
I make the following general proposal. That the World Federation of Scientists, based at CERN in Geneva, organize scientists to address this issue. That the CERN management, ably led by the Director General Dr. Rolf Heuer, express their willingness to put their facilities at the disposal of the world in an emergency. This would be supported by the United Nations, and hopefully this would be followed by the U.S. DOE and Fermilab Director Dr. Pier Oddone and other national entities. For emergencies involving radiation and nuclear matters, most CERN scientists are already partially prepared. Almost all will have qualified as "radiation workers", know how to measure radiation, and certainly know the distinction between DOSE and DOSE RATE. They spend their lives understanding the difference between Rems and milliRems. A few volunteers could be recruited in each country to act as "explainers" to the press or advisors to first responders. Although CERN might identify such people, I suggest that the WFS might act as a filter to select those who would be useful to recommend to the press and other interested public persons or groups. For other emergencies such as potential pandemics, the average elementary particle physicist is less well informed. But the proximity of WHO to CERN suggests that a collaboration would be appropriate. CERN could make advance arrangements to make a website available. Experience with the H1N1 virus (the 2009 swine flu) suggests that both
WHO and the Centers for Disease Control (CDC) in the United States are widely trusted. Leigh Moore has proposed orally an important first step. He has volunteered to organize a small group of students in Huntsville who are looking for (non-secret) projects to plan in some detail various aspects of this proposal. This would be under the auspices of the World Federation of Scientists. It would need a small amount of funding (less than $10,000), which could no doubt be acquired if WFS sponsorship was assured. I therefore propose that the Permanent Monitoring Panel on Mitigation of Terrorist Actions (PMPMTA) make this recommendation to the management of the WFS.
This is still very confused and elementary. But far less confusing than what happens in an emergency!
SESSION 16 ENERGY PANEL MEETING
STATUS OF ITER BROADER APPROACH ACTIVITIES
AKIRA MIYAHARA
Professor Emeritus, National Institute for Fusion Science, Tokyo, Japan

The ITER Broader Approach Activities comprise three projects, namely: 1) Engineering Validation and Engineering Design Activities for the International Fusion Materials Irradiation Facility (IFMIF/EVEDA); 2) the International Fusion Energy Research Centre (IFERC); and 3) the Satellite Tokamak Programme. Details of each are described below.

1) IFMIF consists of two accelerators of D+ beams at 40 MeV × 125 mA (CW), a lithium target, and small-specimen test facilities. Comprehensive engineering design of IFMIF is in progress through the efforts of the Project Team at Rokkasho, Japan, while the tasks of design, fabrication, and component testing of the accelerators are being carried out under EU responsibility; the jobs were shared according to past experience. Japan is also responsible, with the Italian team, for the RF quadrupole accelerator. For the design of the lithium target assembly and the specimen test facilities, the contribution of the Japanese team is important.

2) The International Fusion Energy Research Centre (IFERC) consists of three sub-centres, namely: 2.1) the DEMO Design R&D Coordination Centre, 2.2) the Computational Simulation Centre, and 2.3) the ITER Remote Experimentation Centre.

2.1) The DEMO Design R&D Coordination Centre covers the following two tasks. The first task is the design work for DEMO. The activity is performed through workshops and/or meetings, including the coordinators' meeting, and the discussions were focused on two topics concerning design drivers and constraints for the DEMO design. On the physics side, plasma shaping, magnetic structure, and the position stability of the elongated plasma are the main concerns, while on the technology and engineering side they are the assessment of the superconducting magnet and current-sustainment system, analysis of electromagnetic forces, and torus configuration and maintenance issues.
For the system design, a feasibility assessment of a pulsed DEMO and a sensitivity study of design parameters were also performed. The second task is to identify R&D areas to be carried out during the BA activities, namely R&D on SiC/SiC composites (IFERC-R-T1), tritium technology (IFERC-R-T2), materials engineering for the DEMO blanket (IFERC-R-T3), advanced neutron multipliers for the DEMO blanket (IFERC-R-T4), and advanced tritium breeders for the DEMO blanket (IFERC-R-T5). In addition to the above items, Prof. Ogawa insisted on the need for R&D on a Li-6 isotope separator for fuel preparation. The major activities concentrated on designing, evaluating, and discussing the equipment, devices, and facilities to be installed at the Rokkasho site in the near future.
2.2) Computational Simulation Centre: The mission and scope of the centre are to establish a Centre of Excellence (COE) for the simulation and modelling of ITER, of the advanced superconducting tokamak, and of other fusion experiments, and for the design of future fusion power plants, in particular DEMO. The computer resources shall be externally accessible, with a transmission rate to Europe (including the ITER site) sufficient to allow efficient remote use of the facilities. Activities in 2008 included the selection of high-level benchmark codes (gyrokinetic codes, fluid/fluid-kinetic codes, and materials science codes), discussion of the procurement process, and so on.
2.3) An
ITER Remote Experimentation Centre: The preparation schedule depends on the ITER schedule; at the moment, computer installation is scheduled to begin in 2012, and operation will begin in 2015. I hope that a facility of this kind will also be built in the U.S., so that ITER can be accessed around the clock, since the time difference between the facilities is 8 hours.
3) Satellite Tokamak Programme: The mission of this programme is important because, during the EDA of ITER, experimental results and experience from JET and JT-60 were transferred to the design activities, while those of the Satellite Tokamak will serve both ITER support and ITER complementation towards DEMO. This year a remarkable re-baselining of JT-60SA (the Satellite Tokamak) by the integrated Project Team, consisting of the Project Team and the EU/JA Home Teams, was successfully completed with the approval of the Steering Committee in December 2008. Through this improvement, the physics and engineering aspects are much better optimized, at the cost of a schedule change that moves first plasma from March 2015 to March 2016.
SUMMARY
• Three projects have been launched for the Broader Approach Activities between the EU and Japan: IFMIF/EVEDA, IFERC, and the Satellite Tokamak Programme.
• Signature of the BA Agreement was completed on 5 February 2007, and the Agreement entered into force on 1 June.
• A new site is being prepared in Rokkasho, Aomori prefecture.
• The ITER Broader Approach Activities, including site preparation, are progressing smoothly, and up to now each project is on schedule.
• The BA Activities are open to the other ITER Parties, whose participation is most welcome.
TOPICS OF ENERGY RESEARCH IN JAPAN
AKIRA MIYAHARA
Professor Emeritus, National Institute for Fusion Science, Tokyo, Japan

In this manuscript I introduce some topics of energy research in Japan besides ITER and the ITER Broader Approach Activities: nuclear fusion studies at the National Institute for Fusion Science, related problems of uranium recovery from seawater, remarks on the disposal of low-level nuclear waste, recent activities on film-type amorphous solar modules, and precautions against earthquakes for nuclear power stations.
Recent Results from the Large Helical Device at the National Institute for Fusion Science:
1. The Large Helical Device (LHD) is achieving high-performance plasma parameters: Ti(0) = 5.6 keV at ne(0) = 1.6 × 10^19 m^-3, with a triple product nτT = 5 × 10^19 m^-3·s·keV.
2. The discovery of the super-dense-core regime (the achieved density was 1.2 × 10^21 m^-3 at B = 2.5 T) has attractive potential for an innovative reactor operation scenario with ignition at an ion temperature of 6-7 keV, instead of 20 keV as in the ITER case.
3. An impurity hole develops as the ion temperature increases, achieved with the new perpendicular NBI; this opens the window to adopting SiC and W divertor plate materials.
4. Effective use of the facility for bilateral benefit: since 1998, more than 90,000 plasma discharges have served cooperative researchers, both established researchers and students preparing for next-step fusion studies.
Related Problems of Uranium Recovery from Seawater:
1. Historically, no country has devoted itself to the development of more proliferation-resistant U-235 enrichment technologies, except for the work in 1971 by Prof. Kakihana, an IAEA DDG in the 1970s.
2. A project to develop a chemical method for U-235 enrichment (ACEP) started this spring under the guidance of Prof. Fujii.
3. Spent-fuel (U, Pu, MA) conditioning by pyro-processing is an inherently proliferation-resistant recycling of spent fuel, because in the electro-refiner U, Pu and MA are always co-deposited together.
4. The remarks above came from Dr. Tokiwai of Nuclear Solution Access and Communication (NuSAC Inc.).
Remarks on Low-Level Nuclear Waste Disposal:
1. In addition to the nuclear HLW issues, care with LLW is necessary for public acceptance.
2. A new scope for the effective utilization of low-level radioactive waste, instead of disposal, was proposed by Tanabe (Kyushu Univ.) and Yoshida (Nagoya Univ.).
3. Enhanced gamma-ray energy conversion in a water vessel coexisting with Al2O3 has opened the way to effective utilization of LLW, for example for hydrogen production and electricity generation.
Recent Activities on Film-Type Amorphous Solar Modules:
1. Advanced photovoltaic modules using amorphous Si and microcrystalline Si have been developed in Japan by Fuji Electric and Mitsubishi Heavy Industries, and in the U.S. by the Unisolar Company.
2. The advantages of flexible modules are: light weight; thin and flexible construction; higher productivity (with a roll-to-roll process); and high-voltage specifications with no external wiring required.
3. The advantages of amorphous modules are: more annual energy output than crystalline modules of the same rated capacity; superior temperature characteristics (less efficiency reduction at high temperatures); power generation even with a small amount of light; less silicon required (1/200 of that of a crystalline cell); and less CO2 emitted during production (50% of that of crystalline modules).
4. Because good sunshine is usually available in Japan, the government has recommended installing solar panels on roofs. However, the price is still high; a wider market is required to reduce costs, and in this respect flexible modules have a good future.
Precautions Against Earthquakes for Nuclear Power Stations:
1. After the earthquake that struck the Kashiwazaki-Kariwa nuclear power station, a new seismic criterion was introduced: every reactor building must be guaranteed against 1000 Gal of acceleration.
2. Intense discussions have been held on earthquake forecasting, site selection, and aseismic reactor buildings. Seismic isolation structures for reactor buildings are now being seriously considered.
3. At present, reasonably accurate earthquake forecasting is possible only for earthquakes caused by plate movement, while the prediction of earthquakes caused by active faults is far less reliable; in the latter case, prediction relies on geological and historical (archaeological) knowledge.
4. Earthquakes are not common in western Europe and the eastern U.S., but their frequency is dominant along the Pacific coast, where new nuclear power stations are expected to be built. We have to learn from past experience through international conversations.
5. On August 11, 2009, an earthquake of moment magnitude 6.5 shook a wide area of Shizuoka prefecture; at the Hamaoka nuclear power station, the No. 4 and No. 5 reactors were automatically shut down. The drive system of a control rod for the No. 5 reactor was slightly damaged.
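The 1000 Gal criterion in item 1 is easier to judge when converted to multiples of g. A minimal sketch (the unit definitions are standard; the function name is ours):

```python
# The Gal (galileo) is a CGS unit of acceleration: 1 Gal = 1 cm/s^2.
G0 = 9.80665  # standard gravity in m/s^2

def gal_to_g(a_gal: float) -> float:
    """Convert an acceleration in Gal to multiples of standard gravity."""
    return (a_gal / 100.0) / G0  # Gal -> m/s^2 -> multiples of g

# The 1000 Gal design criterion quoted above is therefore about 1 g.
print(round(gal_to_g(1000), 2))  # → 1.02
```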
IMPACT OF THE FINANCIAL CRISIS OF 2008 ON WORLD ENERGY
DR. HISHAM KHATIB
World Energy Council, Amman, Jordan

The financial crisis of 2008-2009 was sudden and unanticipated. It greatly reduced economic growth in practically every region of the world, increased unemployment, and reduced investment. Its financial and economic details are given in Table 1 below.

Table 1. Impact of the Financial Crisis (2007-2010)

                            2007   2008   2009   2010
World Output %               5.3    3.1   -1.4    2.5
Advanced Countries %         2.7    0.8   -3.8    0.6
Emerging & DCs %             8.3    6.0    1.5    4.7
  Of which China %          13.0    9.0    7.5    8.5
World Trade %                7.2    2.9  -12.2    1.0
Mid-Year Oil Prices ($/b)     71     97     60     74

Source: IMF statistics.
The year 2008 was a tectonic year for energy, with the following features:
• For the first time in history, non-OECD commercial energy consumption (51.2%) was larger than OECD energy consumption.
• Electrical power generation in the OECD fell.
• China's power generation became bigger than EU power generation.
• Carbon emissions from China became larger than those from the U.S.
• Coal became the world's fastest-growing energy fuel; in 2008 it grew 3.1%.
• Global primary energy growth was only 1.4%.
• Oil consumption in the United States fell 1.3 Mbpd (or 6.4%), while China's increased by 0.26 Mbpd.
From the above statistics it is clear that the impact of the financial crisis on the global energy sector was significant. The reduced prices of oil and other forms of energy, however, helped to soften the impact of the recession on the world economy, even as the recession itself decreased the demand for oil. The recession also reduced the funds available for investment in oil and gas development. The long-term negative effect of this will be significant, because it means delay in the development of energy sources, particularly oil, which needs 5-7 years for resource development. The recession likewise reduced investment and interest in developing renewables, not only because funds were unavailable, but also because reduced fossil energy prices diminished the need for developing alternatives. The year 2008 also witnessed tremendous swings in energy prices. Oil prices peaked in mid-year at $147 per barrel (b), but fell to less than $40/b by year end. Similarly, coal prices peaked at $219/ton and then plummeted to almost $58/ton by year end.
With regard to global energy security, the ratio of proved oil reserves to annual production has held steady at roughly 40:1 for more than 20 years, but the remaining reserves are increasingly concentrated in more politically and technically challenging terrain. As oil prices neared their peak in mid-2008, consumption by industrialized countries fell by about 1 percent from one year before. Economic turmoil dragged demand still lower later in the year, and average OECD consumption for 2008 was 47.5 million barrels per day (Mbpd), 3.5 percent below the 2007 level, with even sharper declines in the first half of 2009. In contrast, developing-world demand increased by 1.4 Mbpd to 38.7 Mbpd, driven by rising transportation energy needs and government fuel subsidies that softened the pain of higher prices. This growth offset much of the industrial-country decline, and global oil consumption ended only 0.3-0.6 percent lower than in 2007. The Worldwatch Institute recently drew attention to the fact that, for six years running, coal has led the growth in fossil fuel production. In 2000, it provided just 28 percent of the world's fossil fuel energy production, compared with 45 percent for oil. But by 2008, coal production reached 9.1 Mtoe per day, representing a third of fossil energy production and a 0.7 percent increase over 2007. The growth in China's coal consumption since 2000 dwarfs that of all other countries combined. India, second in growth, added less than an eighth as much coal consumption as China during that period. Globally, the largest share of coal production is for electricity generation. Larger capacities and better materials have led to higher efficiencies at coal-fired power plants, particularly in China. China aims to reduce the energy intensity of its economy by 20 percent during the 2006-10 planning period, in part by improving power-plant efficiency by 4 percent. Industry data suggest that this goal was already surpassed in 2007.
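The reserves-to-production ratio quoted above is a simple quotient of two published figures. A sketch with illustrative round numbers (hypothetical values chosen to match a 40:1 ratio, not data from the text):

```python
def reserves_to_production_years(proved_reserves_bbl: float,
                                 annual_production_bbl: float) -> float:
    """Years of supply at the current production rate (static R/P ratio)."""
    return proved_reserves_bbl / annual_production_bbl

# Hypothetical illustration: ~1.24 trillion barrels of proved reserves and
# ~31 billion barrels/year of production give an R/P ratio of about 40:1.
rp = reserves_to_production_years(1.24e12, 31e9)
print(round(rp))  # → 40
```

Note that the static R/P ratio assumes constant production and no new discoveries, which is why it has stayed near 40:1 even as both reserves and output grew.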
In the United States, the construction of new coal-fired power plants has been discouraged by expectations of greenhouse gas regulations, as well as by factors such as materials costs and public opposition. Fossil fuels, which constitute more than 80% of world primary energy consumption, will continue to dominate global energy markets well into 2050, and in my humble view well beyond that. It is not the financial crisis that will shape the future of energy; rather, it is environmental awareness and world emission-limitation agreements. Still, carbon emissions and CO2 concentrations will likely continue to grow for decades to come, and there is a mounting need for mitigation and adaptation. The following two figures show fossil fuel production in 1981-2008 and the growing coal consumption by region in 2000, 2007 and 2008.
REFERENCES
1. IMF semi-annual world economic surveys, 2008-2009.
2. BP Statistical Review of World Energy, June 2009.
3. Worldwatch Institute, "Fossil Fuel Production Up Despite Recession," by James Russell, 2009.
SESSION 17 GREEN CHEMISTRY WORKSHOP
PLASTICS ADDITIVES AND GREEN CHEMISTRY EVAN S. BEACH AND PAUL T. ANASTAS* Center for Green Chemistry and Green Engineering Yale University, New Haven, Connecticut, USA ABSTRACT The plastics enterprise currently depends on a small number of commodity polymers to perform in a diversity of applications, putting a burden on additives to enhance the properties of various materials. The toxic effects and environmental persistence of certain commercial additives impact the sustainability of the plastics industry. Green chemistry has been (and will be) applied to find solutions. This paper will focus on alternatives to phthalate plasticizers and halogenated flame retardants, which together account for a significant portion of the global additives market and the global dispersion of endocrine disrupting chemicals. Small molecule alternatives that exist in various stages of research and commercialization will be reviewed, with emphasis on the use of renewable resources. The rise of biorefineries and new bio-based monomers may help overcome existing economic barriers. Increasing the molecular weight of additives or covalently linking them to polymer backbones are two promising strategies for reducing both mobility and toxicity, but are beyond the scope of this extended abstract. It should be noted that none of the chemicals put forward as "green" replacements have received the same level of scrutiny as dioctyl phthalate (DOP, aka DEHP) or polybrominated diphenyl ethers (PBDEs). Cooperation between chemists, engineers, and the health and safety community will be critical to ensure the adoption of safe and sustainable technologies. INTRODUCTION Global plastic resin consumption in 2007 was 210 million tonnes. The corresponding demand for additives was 11 million tonnes, or about 5% by weight of all the plastic products manufactured in a year. 
The environmental and human health impacts of phthalate plasticizers and PBDE flame retardants have been reported in depth and will not be summarized here. Plasticizers, mostly used in poly(vinyl chloride) (PVC), accounted for 54% of additives (by mass) in 2007, and flame retardants were reported to be one of the fastest-growing sectors.2 In Europe, these two categories together accounted for just over 75% of the additive market (by mass).3 Replacing PVC with alternative polymers would have a significant effect on the global dispersion and health impacts of additives. In Europe in 2007, PVC accounted for 80% of plasticizer use, and that market continues to be dominated by phthalates (75-85%).3,4 The numbers are not surprising considering the high levels of phthalates used in flexible PVC: whereas pipes may contain >95% PVC by weight, in some applications such as fishing lures the proportion can drop as low as 14%, and the polymer is effectively a gelling agent for the liquid plasticizer.5 Abated use of PVC is not expected in the short term, however. It is forecast that, due to growth in Asia and developing markets, production will more than double from 1992 to 2012, from 22 million to 50 million
tonnes/yr; as of 2007, PVC accounted for 35.3 million tonnes, or about 17% of all polymer resin sold.6 Even if PVC production diminishes, the plastics that fill the gap will demand additives as well. Global production of bioplastics is expected to quintuple from 2007 to 2011, and in a future where biorefineries are the top source of chemical feedstocks, the demand will be even higher. Poly(lactic acid) (PLA) is currently the most widely used bioplastic and has been the focus of most additives-for-bioplastics research to date. PLA depends on a variety of additives, including plasticizers, if it is to perform in a wide range of applications.7 Cellulose-, starch-, and wheat gluten-based polymers consume plasticizers as well.8 The inherent flammability of most polymers means that flame retardant additives are critical for almost any plastic used in electronics, textiles, foam padding, and other applications where accidental fires cost lives. Global flame retardant demand is expected to increase 4.7% annually, to 2.2 million tonnes by 2011; growth is expected in halogen-free materials as well as in brominated flame retardants.9
SOLUTIONS: SMALL MOLECULE PLASTICIZERS
A survey of the literature shows there are abundant alternatives to DOP (the most high-profile endocrine-disrupting phthalate), as well as alternatives to the phthalate class of molecules altogether. It must be stressed that absence of the phthalate moiety (as a sole criterion) does not assure "greenness" in any way. The discussion here will be limited to plasticizers derived from bio-based resources, as there are a variety of simple carbohydrates and lipids that are generally expected to be safe. It is well known, however, that nature abounds with toxic chemicals, and thus bio-based chemicals should not be excluded from full toxicity testing.
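Formulation levels like the PVC weight percentages quoted earlier are often expressed in the industry's phr (parts per hundred resin) convention. A rough conversion sketch, assuming for illustration that the entire non-PVC balance of the formulation is plasticizer (real formulations also contain stabilizers, fillers, and other additives):

```python
def plasticizer_phr(pvc_weight_fraction: float) -> float:
    """Parts of plasticizer per hundred parts of resin (phr), assuming the
    non-PVC balance of the formulation is entirely plasticizer."""
    plasticizer_fraction = 1.0 - pvc_weight_fraction
    return 100.0 * plasticizer_fraction / pvc_weight_fraction

# A rigid pipe at ~95% PVC carries only ~5 phr of additives, while a soft
# article at 14% PVC corresponds to roughly 600 phr of plasticizer.
print(round(plasticizer_phr(0.95)))  # → 5
print(round(plasticizer_phr(0.14)))  # → 614
```

The two-orders-of-magnitude spread is why plasticizer migration matters far more for soft goods than for rigid PVC.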
One non-phthalate alternative wholly derived from petroleum should be highlighted: BASF's Hexamoll® DINCH (Figure 1) is perhaps the most rigorously tested drop-in replacement for DOP. Prior to commercialization, DINCH passed a battery of eco-toxicity and genotoxicity tests covering a variety of species, from bacteria and daphnids to zebrafish, earthworms, rats, rabbits, and guinea pigs.10 Production capacity recently increased to 100 million kg/yr.4
[Fig. 1: the di(isononyl) ester plasticizer DINCH.]
One class of plasticizers entirely based on renewable resources is based on isosorbide, a dehydration product of glucose-derived sorbitol (Figure 2). The performance can be tuned by selecting various alkanoic acids. The isosorbide di-(n-octanoic acid) ester (Figure 3) has capabilities similar to DOP. Isosorbide esters are fully
biodegradable and have passed tests for acute toxicity, sensitization, mutagenicity, and estrogenicity.11,12
[Fig. 2: dehydration (-2 H2O) of sorbitol to isosorbide. Fig. 3: the ester of isosorbide with renewable n-octanoic acid.]
Citrate esters are well-known plasticizers for PVC and PLA. Tributyl citrate (TBC, Figure 4), acetyl tributyl citrate (ATBC), acetyl trihexyl citrate, and butyryl trihexyl citrate are all available commercially (e.g., Citroflex®), and the toxicological literature shows that this family of compounds is generally nontoxic by most assays. However, some studies have found that ATBC has cytotoxic effects,13-15 suggesting citrates should be regarded with some caution. Epoxidized soybean oil is another well-known commercial plasticizer. It has tested negative for harmful effects in a range of tests (estrogenicity, mutagenicity, carcinogenicity, and embryotoxicity), except that some grades are noted to have affected organs in rats.12,16,17 Danisco GRINDSTED® SOFT-N-SAFE [consisting primarily of the castor oil derivative shown in Figure 5] has lower volatility than DOP and high resistance to extraction.18 The patent literature suggests that SOFT-N-SAFE is finding applications in PLA resins as well.19 Several dibenzoate esters of bio-based diols show excellent performance in comparison to conventional plasticizers, but biodegradation and estrogenicity are concerns. Di(ethylene glycol) dibenzoate and di(propylene glycol) dibenzoate were shown to form toxic, stable metabolites when treated with yeast.20 The related chemical 1,5-pentanediol dibenzoate shows improved biodegradability.21 The outlook is promising, but a technical-grade plasticizer containing predominantly di(propylene glycol) dibenzoate was reported to show estrogenic properties,12 so more thorough testing is needed.
[Fig. 4: tributyl citrate (TBC). Fig. 5: the castor oil derivative in SOFT-N-SAFE.]
The use of waste products, particularly from agricultural processes, will promote low-cost, environmentally friendly plasticizers. Tributyl aconitate (TBA, Figure 6), made from aconitic acid, a waste product of sugar cane processing, shows some advantages over citrates. TBA imparted better flexibility to PVC than di(isononyl) phthalate or TBC and had better migration properties than TBC.22 According to
TOXNET, TBA has an LD50 > 500 mg/kg (mouse), indicating relatively low toxicity, but further study is needed to confirm the safety of this plasticizer. Another low-value agricultural product is the unrefined "biodiesel coproduct stream" (BCS, consisting of glycerol, free fatty acids, and fatty acid methyl esters). BCS has been shown to be an effective plasticizer for gelatin. The thermoplastic gelatin produced may be used in extrusion, injection molding, or foam applications.23 The use of BCS in plastics may raise the value of this biorefinery product and expand the range of applications for gelatin and other biopolymers.
[Fig. 6: tributyl aconitate (TBA).]
Ionic liquids have emerged as a new class of plasticizers. Low volatility, low migration compared to DEHP, and reduced flammability hazard are all expected benefits,24 though toxicity will be a concern for many structural classes.25 To date, the ionic liquids reported to have plasticizer effects have all been derived from petroleum, but the development of bio-based ionic liquids may offer new opportunities for environmentally benign innovations in the plastics field.26
SOLUTIONS: SMALL MOLECULE FLAME RETARDANTS
Non-halogenated flame retardants are the focus of a thriving research field. A sign of the growing interest is a report from the 2007 AddCon conference noting that there were no submissions on halogenated flame retardants, even though presentations on flame retardancy formed one of the largest groups of papers.27 Numerous reviews of PBDE alternatives have been conducted by scientists, government, and industry. The EPA has published a study on the expected environmental effects of various phosphorus-based flame retardants.28 Industry groups like HDPUG have made similar efforts.29 Flame retardant manufacturers have created a website, http://www.nonhalogenated-flameretardants.com, compiling performance and environmental data for a variety of applications.30 The US EPA considers environmentally positive attributes of flame retardants to include ready biodegradation or safe incineration, very large diameters (>10 Å) or high molecular weights (>1000 Da), the ability to chemically bind to the substrate, and low toxicity.28 A few particularly interesting commercial technologies based on small molecules are highlighted here. For polycarbonate plastics, it has been known for decades that certain metal sulfonates impart flame resistance at spectacularly low levels, in the range of 0.05-0.1% loading. Of the commercial sulfonates, one is non-halogenated (potassium diphenyl sulfone sulfonate).
The sulfonate technology is just one example of the benefits that can be gained by taking advantage of unique flame retardant mechanisms.31 In the polyester industry, it is estimated that 40% of resins are flame-retarded, usually with
halogen-based agents. Melamine polyphosphate (e.g., DSM Melapur® 200) is among the commercial non-halogenated alternatives.32 Melamine polyphosphate thermal decomposition reactions are endothermic, and its combustion generates N2, contributes to char, enhances char properties, and shows synergy with other flame retardant additives.33 A better understanding of chemical mechanisms, in particular of synergies between materials (for example, systems containing aluminum, phosphate, and nitrogen that achieve very high flammability standards), will help inform the design of new materials.34 A new product called Molecular Heat Eater® (MHE) is available in various formulations based on carbonate and phosphate salts and benign organic acids (such as citric, glutaric, succinic, oxalic, formic, acetic, and stearic acids). Many of these components are available as agricultural waste products. MHE is typically dispersed in a polymer matrix as micron-sized particles, which require a strongly endothermic reaction to decompose, resulting in the flame retardant effect. The performance of MHE in thermogravimetric analysis testing is reportedly similar to that of PBDEs, and in cone calorimeter tests MHE exceeded ISO standards.35 MHE is one of the rare halogen-free technologies that makes extensive use of bio-based materials. Further development of flame-resistant materials from biologically familiar chemicals should be highly encouraged.
DESIGNING LESS HAZARDOUS CHEMICALS
Very few (if any) of the alternative additives discussed in this review have received the same level of scrutiny as DEHP and PBDEs. They have been highlighted mainly to demonstrate that functional alternatives are abundant, and that progress has been made in the adoption of green chemistry principles. The use of renewable resources (particularly renewable resources that are widely recognized as safe) is to be encouraged, but ideally all green chemistry principles must be met.
Comprehensive assessments of hazards at all stages of the chemical lifecycle need to be completed for many promising technologies. The criteria considered by the United States EPA Design for the Environment team in its assessments of flame retardant materials28 are an excellent set of properties that should be determined for any chemical destined for mass markets:
• Acute toxicity
• Subchronic & chronic toxicity
• Reproductive toxicity
• Developmental toxicity
• Carcinogenicity
• Neurotoxicity
• Immunotoxicity
• Genotoxicity
• Bioconcentration
• Degradation & transport
• Aquatic toxicity
• Terrestrial organism toxicity
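Screens against a fixed endpoint list like the one above lend themselves to simple bookkeeping tools. A hypothetical sketch follows: the endpoint names mirror the EPA list, but the record structure, the low/mid/high scoring scheme, and the example compound are invented for illustration:

```python
from dataclasses import dataclass, field

# Hazard endpoints from the EPA Design for the Environment list above.
ENDPOINTS = [
    "acute_toxicity", "subchronic_chronic_toxicity", "reproductive_toxicity",
    "developmental_toxicity", "carcinogenicity", "neurotoxicity",
    "immunotoxicity", "genotoxicity", "bioconcentration",
    "degradation_transport", "aquatic_toxicity", "terrestrial_toxicity",
]

@dataclass
class CandidateAdditive:
    """Hypothetical screening record: each endpoint scored low/mid/high."""
    name: str
    scores: dict = field(default_factory=dict)  # endpoint -> "low"|"mid"|"high"

    def data_gaps(self):
        """Endpoints with no data yet; these need testing, not assumptions."""
        return [e for e in ENDPOINTS if e not in self.scores]

    def flagged(self):
        """Endpoints scored as high hazard."""
        return [e for e in self.scores if self.scores[e] == "high"]

candidate = CandidateAdditive("hypothetical plasticizer X",
                              {"acute_toxicity": "low", "genotoxicity": "low"})
print(len(candidate.data_gaps()))  # → 10 (of 12 endpoints still untested)
```

The point of such a structure is the `data_gaps` view: for most "green" alternatives discussed in this review, the gap list, not the flag list, is the longest column.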
The hazard screening process for new additive technologies will ideally be aided by computational methods, and eventually by simple molecular design rules, to help chemists and engineers select which polymer additives are worthy of comprehensive study. A hierarchy of design information for designing safer chemicals has been proposed (in order of increasing utility):36,37
• Molecular modifications that decrease bioavailability
• Molecular modifications affecting absorption, distribution, metabolism, and excretion parameters
• Quantitative structure-activity relationships that predict safe or problematic structural classes
• Knowledge of the precise mechanism of action
Some progress has been made in articulating guidelines that can be easily adopted by chemists and other molecular designers, for example in predicting biodegradability.38 Designing for minimal harm to humans (particularly with regard to emerging issues such as endocrine disruption and epigenetic effects) remains a tremendous challenge. Shape Signatures, a computational approach that relies on molecular geometry and polarity information, has been used to identify novel estrogen antagonists39 and may prove useful in screening new polymer additives. As research efforts continue to reveal new links between molecular structure and harmful effects, one productive application of the results will be the screening of libraries of chemicals that can be simply produced from biorefinery products (by esterification, hydrogenation, or other green processes). The bio-based chemical platforms of the future will begin to supplant the petroleum platform of the past, and new molecular structures will appear in the commodity chemical markets. It is in this development that transformative advances in the green chemistry of polymer additives will be made.
REFERENCES
1. Babinsky, R.; Gastrock, F. "BRICs, foundation for strategic growth." AddCon 2008, Barcelona, Spain, Paper 1.
2. "New study highlights trends in additives." Plastics Additives & Compounding 2008, 10 (September/October), 12.
3. Müller, S. "Plastic additives: the European market in a global environment." AddCon 2007, Frankfurt, Germany, Paper 1.
4. Markarian, J. (2007) "PVC additives: what lies ahead?" Plastics Additives & Compounding, 9 (November/December), 22-25.
5. Wickson, E.J. In Handbook of PVC Formulating; Wickson, E.J., Ed.; John Wiley & Sons, Inc.: New York, 1993, 1-13.
6. "Global PVC markets: threats and opportunities." Plastics Additives & Compounding 2008, 10 (November/December), 28-30.
7. Markarian, J. (2008) "Biopolymers present new market opportunities for additives in packaging." Plastics Additives & Compounding, 10 (May/June), 22-25.
8. Rahman, M.; Brazel, C.S. (2004) "The plasticizer market: an assessment of traditional plasticizers and research trends to meet new challenges." Progress in Polymer Science, 29 (12), 1223-1248.
9. (2008) "Flame retardant demand to rise." Plastics Additives & Compounding, 10 (January/February), 8.
10. Wadey, B.L. (2003) "An innovative plasticizer for sensitive applications." Journal of Vinyl & Additive Technology, 9 (4), 172-176.
11. van Haveren, J.; Oostveen, E.A.; Micciche, F.; Weijnen, J.G.J. In Feedstocks for the Future; Bozell, J.J., Patel, M.K., Eds.; American Chemical Society: Washington, DC, 2006, 99-115.
12. Ter Veld, M.G.R.; Schouten, B.; Louisse, J.; Van Es, D.S.; Van der Saag, P.T.; Rietjens, I.M.C.M.; Murk, A.J. (2006) "Estrogenic potency of food-packaging-associated plasticizers and antioxidants as detected in ERα and ERβ reporter gene cell lines." Journal of Agricultural and Food Chemistry, 54 (12), 4407-4416.
13. Meyers, D.B.; Autian, J.; Guess, W.L. (1964) "Toxicity of plastics used in medical practice. II. Toxicity of citric acid esters used as plasticizers." Journal of Pharmaceutical Sciences, 53 (7), 774-777.
14. Ekwall, B.; Nordensten, C.; Albanus, L. (1982) "Toxicity of 29 plasticizers to HeLa cells in the MIT-24 system." Toxicology, 24 (3-4), 199-210.
15. Mochida, K.; Gomyoda, M.; Fujita, T. (1996) "Acetyl tributyl citrate and dibutyl sebacate inhibit the growth of cultured mammalian cells." Bulletin of Environmental Contamination and Toxicology, 56 (4), 635-637.
16. "Epoxidised soya bean oil." http://www.bibra-information.co.uk/profile-126.html, accessed June 15, 2008.
17. Seek Rhee, G.; Hee Kim, S.; Sun Kim, S.; Hee Sohn, K.; Jun Kwack, S.; Kim, B.H.; Lea Park, K. (2002) "Comparison of embryotoxicity of ESBO and phthalate esters using an in vitro battery system." Toxicology in Vitro, 16 (4), 443-448.
18. Kristoffersen, B.L. (2005) "Ud med phthalaterne? (Out with phthalates?)." Dansk Kemi, 86 (3), 22-23.
19. Hamaguchi, T.; Mori, A. (Kao Corporation, Japan). Plasticizer for biodegradable resin. United States Patent Application 2006/0276575 A1.
20. Gartshore, J.; Cooper, D.G.; Nicell, J.A. (2003) "Biodegradation of plasticizers by Rhodotorula rubra." Environmental Toxicology and Chemistry, 22 (6), 1244-1251.
21. Firlotte, N.; Cooper, D.G.; Marić, M.; Nicell, J.A. (2009) "Characterization of 1,5-pentanediol dibenzoate as a potential 'green' plasticizer for poly(vinyl chloride)." Journal of Vinyl and Additive Technology, 15 (2), 99-107.
22. Gil, N.; Saska, M.; Negulescu, I. (2006) "Evaluation of the effects of biobased plasticizers on the thermal and mechanical properties of poly(vinyl chloride)." Journal of Applied Polymer Science, 102 (2), 1366-1373.
23. Stevens, E.S.; Ashby, R.D.; Solaiman, D.K.Y. (2009) "Gelatin plasticized with a biodiesel coproduct stream." Journal of Biobased Materials and Bioenergy, 3 (1), 57-61.
24. Rahman, M.; Brazel, C.S. (2006) "Ionic liquids: new generation stable plasticizers for poly(vinyl chloride)." Polymer Degradation and Stability, 91 (2), 3371-3382.
25. Scammells, P.J.; Scott, J.L.; Singer, R.D. (2005) "Ionic liquids: the neglected issues." Australian Journal of Chemistry, 58 (3), 155-169.
26. Zhao, D.; Liao, Y.; Zhang, Z. (2007) "Toxicity of ionic liquids." Clean: Soil, Air, Water, 35 (1), 42-48.
27. Pritchard, G. (2007) "Technical progress, but an uphill struggle for Western Europe." Plastics Additives & Compounding, 9 (November/December), 36-39.
728 28. 29. 30.
31. 32. 33. 34.
35. 36.
37.
38. 39.
Furniture flame retardancy partnership: Environmental profiles of chemical flame-retardant alternatives for low-density polyurethane foam. accessed. HDP halogen free guideline. http://www.hdpug.org/contentJpublications-O, accessed 611412009. Halogen-free flame retardants in E&E applications: A growing toolbox of materials is becoming available. http://www.halogenfree-f1ameretardants.com! HFFR-300.pdf, accessed 6114/2009. Levchik, S.V.; Weil, E.D. (2005) "Overview of recent developments in the flame retardancy of polycarbonates." Polymer International, 54 (7), 981-998. Scheirs, J. (2003) In Modern polyesters: Chemistry and technology of polyesters and copolyesters; Scheirs, J., Long, T. E., Eds. 495-540. Murphy, J. (2001) "Flame retardants: Trends and new developments." Plastics Additives & Compounding, 3 (April), 16-20. Braun, U.; Schartel, B. (2007) "Flame retardancy mechanisms of aluminium phosphinate in combination with melamine cyan urate in glass-fibre-reinforced poly(1,4-butylene terephthalate)." Macromolecular Materials and Engineering. 293 (3), 206-217. Additives from the natural world. Plastics Additives & Compounding 2008, 10 (November/December),42-43. DeVito, S.e. (1996) In Designing safer chemicals: Green chemistry for pollution prevention; DeVito, S.C., Garrett, R.L., Eds.; American Chemical Society: Washington, DC, 16-59. Anastas, N.D.; Warner, J.e. (2005) "The incorporation of hazard reduction as a chemical design criterion in green chemistry." Chemical Health & Safety, 12 (2), 9-13. Boethling, R. S.; Sommer, E.; DiFiore, D. (2007) "Designing small molecules for biodegradability." Chemical Reviews, 107 (6), 2207-2227. Wang, e.y.; Ai, N.; Arora, S.; Erenrich, E.; Nagarajan, K.; Zauhar, R.; Young, D.; Welsh, W.J. (2006) "Identification of previously unrecognized antiestrogenic chemicals using a novel virtual screening approach ." Chemical Research in Toxicology. 19 (12),1595-1601.
PLASTIC, PLASTICIZERS AND CONSUMER PRODUCTS

NICOLAS OLEA
Laboratorio Investigaciones Medicas, Hospital Universitario San Cecilio, Granada, Spain

INTRODUCTION

Knowledge about human exposure to endocrine disrupters (ED) is expanding at a time when we are discovering new chemical compounds that can alter the hormonal balance. As the list of new EDs lengthens, we are also identifying exposure pathways and how these substances enter the human organism. This is the case for some plastics and plasticizers found in consumer products, such as bisphenols and phthalates. Bisphenols are a group of chemical compounds that were initially designed as synthetic estrogenic hormones and now form part of epoxy resins and polycarbonates. Phthalates are used in the manufacture, stabilization, modification, and performance of plastic polymers. The estrogenicity of bisphenols was first documented in 1936, when they were already being used in the formation of synthetic polymers; bisphenol-F, for example, was a base monomer in Bakelite. Although bisphenols and phthalates have been in use for almost 100 years, human exposure and the potential consequent health risks have only recently been taken into account. It can be affirmed that: i) "bisphenols" is a broad term that includes various compounds that are structurally similar to bisphenol-A (BPA) and are widely used in the chemical industry; ii) human exposure to bisphenols and phthalates is a significant, demonstrated and increasing phenomenon; iii) the biological effects of bisphenols and phthalates are well documented, fundamentally with respect to their estrogenicity. The causal relationship between endocrine disruption by bisphenols and phthalates and human disease remains elusive, and these uncertainties allow different conclusions to be drawn. Nevertheless, it is clear that these chemicals are hormonally active, interfere in the homeostasis of the hormonal system, and may thus disrupt the endocrine system.
BISPHENOLS AND BISPHENOL-A (BPA)

Bisphenols is a broad term that includes many substances sharing a common chemical structure: two phenolic rings joined through a bridging group. In BPA (2,2-bis[4-hydroxyphenyl]propane) the bridging group is isopropylidene, in bisphenol S it is a sulfonyl group, and in bisphenol AF it is hexafluoroisopropylidene. BPA is synthesised from two molecules of phenol and one of acetone. Following the same approach, bisphenol F comes from formaldehyde, bisphenol B from butanone, bisphenol H from cyclohexane, bisphenol C from o-cresol, and bisphenol G from o-isopropylphenol. BPA is one of the 2,000 high-production-volume chemicals manufactured world-wide. In Europe, four companies produce more than 700,000 tonnes/year of BPA at six production sites, with one factory in Southern Spain producing around 250,000 tonnes/year. This massive production implies the continuous emission of BPA into the environment, both from its manufacture and from the use of products containing this
compound (Vandenberg et al. 2007). Nevertheless, BPA has not been subjected to any environmental legislative control. Bisphenols have been extensively used as intermediates in the production of polycarbonate, epoxy, and corrosion-resistant unsaturated polyester resins. Epoxy resins are the fundamental components of high-quality commercial polymer materials. They are versatile materials used in a wide range of essential applications, from electronics to food protection. They are used as a component in the manufacture of barrier coatings for the inner surfaces of food and beverage cans, where they play a vital role in preventing corrosion of the metal or migration of its ions, which would lead to tainting or spoiling of the can contents. They are also used as additives in a variety of other plastic materials such as vinyl and acrylic resins and natural and synthetic rubber. As biomaterials they have multiple uses in human health, for instance in dental composites and sealants and as bioactive bone cements. Polycarbonate is used in a wide array of plastic products, with novel applications continuously being developed. It is used in the automotive, aircraft, optical, photographic, electrical and electronic markets, and is also employed in the packaging, storing, and preparation of a myriad of foods and beverages, including baby foods and juice containers. Phenolic resins are produced by the copolymerisation of simple phenols or bisphenols with formaldehyde. They are used in inks, coatings, varnishes and abrasive binders. Phenoxy resins are thermoplastic copolymers of bisphenol A and epichlorohydrin. These resins have good resistance to extreme temperatures and corrosion, which makes them suitable for use in pipes and ventilating ducts.

BIOLOGICAL ACTIVITY OF BISPHENOLS: ESTROGENICITY

The estrogenicity of bisphenols was reported for the first time in 1936 by Dodds and Lawson, who were looking for synthetic estrogens devoid of the phenanthrene nucleus.
These authors classified stilbenes and bisphenols by their ability to mimic 17β-estradiol (E2) in increasing the uterine weight of ovariectomized rats. Stilbenes were found to be much more potent than bisphenols, and one of them, diethylstilbestrol (DES), was selected for pharmaceutical use; bisphenols were subsequently discarded for pharmaceutical purposes. In 1944, Reid and Wilson again studied the relationship between the structure of some bisphenols and their estrogenic activity in vivo, in comparison with stilbene derivatives. Interestingly, some bisphenols were already used at this time in the plastics industry; for instance, bisphenol-F was part of Bakelite, invented in 1909. The early reported estrogenicity of bisphenols was not considered a toxicological problem, and new bisphenols were synthesized for use in many industrial applications. Fifty years later, Gilbert et al. applied correspondence factor analysis to the structure-activity relationships of bisphenols, testing the effect of these compounds on the proliferation of MCF-7 human breast cancer cells and their binding specificity to the estrogen receptor. They proposed that no single structural feature defines estrogenic activity, and that hydrophobic volume, together with hydroxyl groups and conjugation with basic groups in the bisphenol structure, is involved in the triggering of cell proliferation.
In 1998, we studied the estrogenic potency of some diphenylalkanes with a bisphenol structure (Figure 1) and demonstrated their ability to stimulate MCF-7 cell proliferation in vitro and to induce specific E2-responsive proteins (Perez et al., 1998). We proposed that both the length and the nature of the substituent groups at the bridging carbon of BPA analogues affect the estrogenic potency of these compounds. A good correlation was found between the relative binding affinity and the proliferative potency of each compound, suggesting that the proliferative effects of bisphenols are mediated through binding to the estrogen receptor. Further, in 2002, we investigated whether several events triggered by E2 in MCF-7 cells were also observed in response to various bisphenols (Rivas et al. 2002). We explored the proliferative effect of these agents; the expression of estrogen-controlled genes, by measuring the mRNA of the pS2 protein and the related protein released into the culture medium; the induction of the progesterone receptor (PgR); and the expression of a luciferase reporter gene transfected in MVLN cells (MCF-7 cells stably transfected with a pVit-tk-Luc reporter-containing plasmid).
[Figure 1. Chemical structures of the diphenylalkanes with bisphenol structure whose estrogenic potency was tested; structures not legible in this reproduction.]
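The comparison of relative binding affinity with proliferative potency described above can be sketched numerically. This is a minimal illustration only: the EC50 and relative binding affinity (RBA) values below are hypothetical placeholders, not data from Perez et al. (1998), and the helper names (`relative_potency`, `pearson`) are invented for this sketch.

```python
# Hedged sketch: how a relative proliferative potency (RPP) and its
# correlation with estrogen-receptor binding affinity might be computed.
# All numeric values are hypothetical placeholders, not measured data.
import math

E2_EC50 = 1e-11  # hypothetical EC50 of 17beta-estradiol (molar)

# Hypothetical (EC50 in M, relative binding affinity in %) per compound.
compounds = {
    "BPA":         (1e-7, 0.01),
    "bisphenol-F": (5e-7, 0.005),
    "bisphenol-B": (5e-8, 0.05),
}

def relative_potency(ec50: float) -> float:
    """RPP (%) = 100 * EC50(E2) / EC50(compound)."""
    return 100.0 * E2_EC50 / ec50

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Potencies span orders of magnitude, so correlate on log scales.
rpp = [math.log10(relative_potency(ec50)) for ec50, _ in compounds.values()]
rba = [math.log10(r) for _, r in compounds.values()]
print(f"Pearson r (log RPP vs log RBA) = {pearson(rpp, rba):.2f}")
```

A strongly positive r on such log-log data would be consistent with the interpretation in the text that proliferation is mediated through receptor binding, though the actual study's statistical treatment may differ.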
Furthermore, an international PhD program in immunology has been established at the University of Milan, also available to foreign students and supported by a tuition fellowship. Finally, a collaborative exchange program has been organized with Chinese research institutions, which has been approved and funded by the EU within the context of the EFBIC RED Ribbon program. Within this framework, two virologists (Professors Bin Gao and Wenlin Huang) from the Chinese Academy of Sciences visited the National Cancer Institute in Naples to describe their scientific projects and explore possible collaborations.
Bin Gao, Ph.D., Professor
Institute of Microbiology, Chinese Academy of Sciences
Datun Road, Chaoyang District, Beijing 100101
Tel/Fax: 86-10-64807599
E-mail: [email protected]

Education
MSc, The Chinese Academy of Military Medicine, China, 1987
PhD, The University of London, 1990-1993
Post-doc, Oxford University, 1993-1996

Career
Research leader, Peptide Therapeutics plc, Cambridge, 1996-1997
Research Fellow, Institute of Molecular Medicine, Oxford, 1997-2001
Lecturer, University College London, 2001-2005
Currently, Director of the Center for Molecular Immunology, IMCAS

Research Interests
The immune system works by recognising the presence of an invading organism. To distinguish between normal cells and invaded cells, immune cells, including cytotoxic T lymphocytes (CTL) and natural killer (NK) cells, keep checking an identity marker, the major histocompatibility complex (MHC) class I molecule, on the surface of all nucleated cells. If cells are invaded by viruses, bacteria, or parasites, a piece of material from the invader will be loaded onto the MHC complex; the T cell will recognize this change and kill the host cells carrying pathogens.
Wenlin Huang, Ph.D., Professor
Institute of Microbiology, Chinese Academy of Sciences
Tel: 86-10-64807808
E-mail:

Research Area
The group's