Developments in Marine Technology, 12
Science-Technology Synergy for Research in the Marine Environment:
Challenges for the XXI Century
DEVELOPMENTS IN MARINE TECHNOLOGY

Vol. 1  Marine and Offshore Safety (P.A. Frieze, R.C. McGregor and I.E. Winkle, Editors)
Vol. 2  Behaviour of Offshore Structures (J.A. Battjes, Editor)
Vol. 3  Steel in Marine Structures (C. Noordhoek and J. de Back, Editors)
Vol. 4  Floating Structures and Offshore Operations (G. van Oortmerssen, Editor)
Vol. 5  Nonlinear Methods in Offshore Engineering (S.K. Chakrabarti)
Vol. 6  CFD and CAD in Ship Design (G. van Oortmerssen, Editor)
Vol. 7  Dynamics of Marine Vehicles and Structures in Waves (W.G. Price, P. Temarel and A.J. Keane, Editors)
Vol. 8  A Course in Ocean Engineering (S. Gran)
Vol. 9  MARIN Jubilee 1992: Special Jubilee Volume (MARIN, Editor)
Vol. 10 Hydrodynamics: Computation, Model Tests and Reality (H.J.J. van den Boom, Editor)
Vol. 11 Practical Design of Ships and Mobile Units (M.W.C. Oosterveld and S.G. Tan, Editors)
Vol. 12 Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century (L. Beranzoli, P. Favali, G. Smriglio, Editors)
Developments in Marine Technology, 12
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century
Edited by

Laura Beranzoli, Paolo Favali and Giuseppe Smriglio †

Istituto Nazionale di Geofisica e Vulcanologia, Roma, Italy
ELSEVIER SCIENCE B.V.
Sara Burgerhartstraat 25
P.O. Box 211, 1000 AE Amsterdam, The Netherlands
© 2002 Elsevier Science B.V. All rights reserved.
This work is protected under copyright by Elsevier Science, and the following terms and conditions apply to its use:

Photocopying
Single photocopies of single chapters may be made for personal use as allowed by national copyright laws. Permission of the Publisher and payment of a fee is required for all other photocopying, including multiple or systematic copying, copying for advertising or promotional purposes, resale, and all forms of document delivery. Special rates are available for educational institutions that wish to make photocopies for non-profit educational classroom use.

Permissions may be sought directly from Elsevier Science Global Rights Department, PO Box 800, Oxford OX5 1DX, UK; phone: (+44) 1865 843830, fax: (+44) 1865 853333, e-mail: [email protected]. You may also contact Global Rights directly through Elsevier's home page (http://www.elsevier.com), by selecting 'Obtaining Permissions'.

In the USA, users may clear permissions and make payments through the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, USA; phone: (+1) (978) 7508400, fax: (+1) (978) 7504744, and in the UK through the Copyright Licensing Agency Rapid Clearance Service (CLARCS), 90 Tottenham Court Road, London W1P 0LP, UK; phone: (+44) 207 631 5555; fax: (+44) 207 631 5500. Other countries may have a local reprographic rights agency for payments.

Derivative Works
Tables of contents may be reproduced for internal circulation, but permission of Elsevier Science is required for external resale or distribution of such material. Permission of the Publisher is required for all other derivative works, including compilations and translations.

Electronic Storage or Usage
Permission of the Publisher is required to store or use electronically any material contained in this work, including any chapter or part of a chapter. Except as outlined above, no part of this work may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without prior written permission of the Publisher. Address permissions requests to: Elsevier Science Global Rights Department, at the mail, fax and e-mail addresses noted above.

Notice
No responsibility is assumed by the Publisher for any injury and/or damage to persons or property as a matter of products liability, negligence or otherwise, or from any use or operation of any methods, products, instructions or ideas contained in the material herein. Because of rapid advances in the medical sciences, in particular, independent verification of diagnoses and drug dosages should be made.
First edition 2002

Library of Congress Cataloging in Publication Data
A catalog record from the Library of Congress has been applied for.
ISBN: 0-444-50591-1

∞ The paper used in this publication meets the requirements of ANSI/NISO Z39.48-1992 (Permanence of Paper).

Printed in The Netherlands.
In memory of Giuseppe Smriglio

To our friend Giuseppe, who passed away suddenly shortly before the publication of this volume. In conceiving GEOSTAR, Giuseppe left an indelible mark on the scientists and technologists involved in deep-sea exploration. His intuition lives on, guiding us as we continue along the path he traced.

Laura Beranzoli - Paolo Favali
Table of Contents

Acknowledgements
L. Beranzoli, P. Favali, G. Smriglio    xi

Preface
J. Boissonnas    xiii

Part I - Why deep-sea observatories?    1

Perspectives and challenges in marine research
G. Ollier, P. Favali, G. Smriglio and F. Gasparoni    3

Research for the protection of the Deep Sea
H. Thiel    11

Deep physical oceanography experimentation and benefits from bottom observatories
C. Millot    19

Why global geomagnetism needs ocean-bottom observatories
F. J. Lowes    27

Ocean-bottom seismology in the Third Millennium
J. Bialas, E. R. Flueh, J. P. Morgan, K. Schleisiek and G. Neuhaeuser    37

Part II - Seafloor observatories: state of the art and ongoing experiments    45

Development of seismic real-time monitoring systems at subduction zones around Japanese islands using decommissioned submarine cables
J. Kasahara    47

Geophysical Ocean Bottom Observatories or Temporary Portable Networks?
J.-P. Montagner, J.-F. Karczewski, E. Stutzmann, G. Roult, W. Crawford, P. Lognonné, L. Béguery, S. Cacho, G. Coste, J.-C. Koenig, J. Savary, B. Romanowicz and D. Stakes    59

H2O: The Hawaii-2 Observatory
A. D. Chave, F. K. Duennebier, R. Butler, R. A. Petitt, Jr., F. B. Wooding, D. Harris, J. W. Bailey, E. Hobart, J. Jolly, A. D. Bowen and D. R. Yoerger    83

The MBARI Margin Seismology Experiment: A Prototype Seafloor Observatory
D. S. Stakes, B. Romanowicz, M. L. Begnaud, K. C. McNally, J. P. Montagner, E. Stutzmann and M. Pasyanos    93

Towards a permanent deep sea observatory: the GEOSTAR European experiment
P. Favali, G. Smriglio, L. Beranzoli, T. Braun, M. Calcara, G. D'Anna, A. De Santis, D. Di Mauro, G. Etiope, F. Frugoni, V. Iafolla, S. Monna, C. Montuori, S. Nozzoli, P. Palangio, G. Romeo    111

NEMO: a project for a Km3-scale neutrino telescope in the Mediterranean Sea near the south Italy coasts
A. Capone    121

Part III - Marine technologies for deep-sea observatories    131

Deep Sea Challenges of Marine Technology and Oceanographic Engineering
G. Clauss and S. Hoog    133

From Abel to Geostar: development of the first European deep-sea scientific observatory
F. Gasparoni, D. Calore and R. Campaci    143

Design and realization of Communication Systems for the GEOSTAR project
J. Marvaldi, Y. Aoustin, G. Ayela, D. Barbot, J. Blandin, J. M. Coudeville, D. Fellmann, G. Loaec, Ch. Podeur, A. Priou    161

Gravimeter for Deep Sea Measurements
V. Iafolla and S. Nozzoli    183

Part IV - Marine environmental and risk assessment    199

The Deep-sea as an Area for Geotechnical Intervention
H. U. Oebius and H. W. Gerber    201

Offshore hydrocarbon leakage: hazards and monitoring
G. Etiope, P. Carnevale, F. Gasparoni, M. Calcara, P. Favali and G. Smriglio    217

The use of a Coastal HF Radar system for determining the vector field of surface currents
G. Budillon, G. Dallaporta and A. Mazzoldi    229

The Italian Tsunami Warning System: state of the art
A. Maramai, A. Piscini, G. D'Anna and L. Graziani    247

Advanced technologies: equipments for environmental monitoring in coastal areas
G. Zappalà    261
Acknowledgements

This volume is one of the most significant results of the conference "Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century", held in Erice and Ustica (Italy) in September 1999. The volume aims to present the state of the art of developments in technology and scientific research applied to deep-sea long-term observatories. The scientific rationale for those observatories in Earth sciences and environmental studies is presented, together with some of the most recent long-term monitoring experiments worldwide (sessions I and II). Session III is devoted to the description of new technologies enabling the deployment/recovery and operation of deep-sea long-term observatories. Session IV is a miscellany of contributions mainly addressing risk assessment in the marine environment, from the deep sea to coastal regions, in order to give information on systems that are complementary to monitoring through deep-sea long-term observatories.

The Editors wish to thank the Institutions that supported the conference and this volume: the Istituto Nazionale di Geofisica e Vulcanologia and the President of the institute, Prof. Enzo Boschi; the European Commission (MAST Programme), the former Director of MAST-III, Dr. Jean Boissonnas, and the Scientific Officer, Dr. Gilles Ollier; the Natural Marine Reserve of Ustica and its Director, Dr. Roberto Sequi; the Commune of Ustica and its Mayor, Dr. Attilio Licciardi; the Ettore Majorana Centre for Scientific Culture and its Director, Prof. Antonino Zichichi.

The Editors also wish to thank the referees, for their competence and care in reviewing the papers, and the authors for their precious contributions. Special thanks are due to Mrs. Sabina Vallati for her careful and patient work as editorial assistant and in the arrangement of the contributions within the volume. Many thanks also to Mrs. Francesca Di Stefano for helping Mrs. Vallati in the elaboration of some of the images. The Editors are also very grateful to Dr. Caterina Montuori for her help in the preparation of the final version of the volume.

Finally, the Editors wish to remember the scientists and friends recently deceased who, in different ways, contributed to the conference and the volume: Dr. Alberto Gabriele, Dr. Vladimir Rukol, Dr. Gianfranco Dalla Porta and Dr. Luc Floury.
Laura Beranzoli, Paolo Favali, Giuseppe Smriglio
Preface

Pensioners are often assumed to have plenty of spare time for reading. When I undertook to write this Preface at the kind insistence of the editors, I began by refreshing my memory about the history of underwater exploration. It seems that fishermen, even well into the 18th century, believed the sea in the Gulf of Lions to be bottomless. I discovered, or re-discovered, the legend of Alexander the Great's dive in the Arabian Sea; the account given by the Scriptures (Acts XXVII) of St. Paul caught in a storm at sea, his crew trying to measure the sea depth to avoid shipwreck; the first attempts, dating from the Renaissance, to design primitive submersibles and diving bells; and the efforts by increasing numbers of inventors throughout the 17th, 18th and early 19th centuries to overcome the constraints of underwater exploration. In those days, the main objectives of dives, made with primitive submersibles and equipment, were treasure retrieval and warfare. In parallel, but especially after 1800, information on bathymetry began to accumulate with the multiplication of soundings, and some indications of the possible presence of deep-sea life began to appear.

The mid 19th century is a crucial period. The emergent cable industry needed a reasonably accurate knowledge of bathymetry and seafloor topography; measurements had of course to be carried out from surface ships. Notable advances also began at that time in underwater navigation, and by 1870, when he imagined the Nautilus, Jules Verne was already able to draw upon a number of recent developments. However, modern oceanography in the accepted sense was not yet born: as we all know, the turning point was the Challenger expedition of 1872-1876.

Jules Verne's visions of the sea floor reveal both the power of his imagination and the extent of the misconceptions that still prevailed in his day. For a long time, progress in scientific understanding was rather slow. As a student in geology, I learned about the existence of a mid-Atlantic ridge (was it perhaps a belt of folded sediments with volcanoes all along?) and of deep oceanic trenches. The deep sea floor was deemed to be covered with red clay ("l'argile rouge des grands fonds"). We had not yet heard about plate tectonics. I do not need to recall here the extraordinary advances that began in the 1960s and led to the now generally accepted concept of the deep sea floor as a dynamic environment, a fundamental component of the Earth System. Greatly improved means of accessing ocean depths, with bathyscaphes and later with submersibles, were a determining factor of progress.

By the end of the 1980s, manned submersibles had brought in a wealth of information. Much insistence was still placed on the necessity of direct observation by man. Characteristically enough, a Special Issue of the Marine Technology Society Journal devoted to deep ocean research, in June 1990, is entitled "A deepest ocean presence" and specifies: "...Touching bottom is one thing; establishing an effective, meaningful presence is quite another. The latter implies extended durations at depths...in order to conduct experiments, observations, and work at will. It should further imply efficient and versatile operations, with favourable cost, time and logistics elements". Meanwhile, an important shift of approach was under way. Although the objectives set out in the quotation from the MTS Journal remained valid, we began to hear about fixed sea floor stations.

Today, monitoring the marine environment, including sea floor processes, has become a key objective of scientific activity. New requirements can best be met by automated and modular observatories, to be networked together. These observatories must be multidisciplinary; ideally, they should encompass multiple observing sites on a spatial scale appropriate to the processes under study. I am pleased to stress that Europe has been quite active in this field. The MAST programme supported, from 1991 to 1994, a feasibility study for an abyssal benthic laboratory. ECOPS (a former joint EC/ESF Committee devoted to Ocean and Polar Science) in 1994 mentioned the possible role of such stations in implementing its so-called Grand Challenge on the Deep Sea Floor as a Changing Environment. The GEOSTAR 1 proposal followed in 1995, benefiting from the earlier feasibility study, and as I write this preface, the GEOSTAR 2 project is in full activity. In the context of another discipline, I should also mention extensive European collaboration on ANTARES, a project to deploy a neutrino observatory off the coast of SE France.

So, the scene is now set for a new era of investigations on the ocean floor, relying on the most advanced technologies. What can be the place of imagination and excitement in this context; I mean, the sort of feelings conveyed by Jules Verne, that can still be experienced during any deep dive of a manned submersible? Only 50 years ago, Rachel Carson, in "The sea around us", could write of man, whose distant ancestors of geological times came from the ocean: "Over the centuries, with all the skill and ingenuity and reasoning powers of his mind, he has sought to explore and investigate even its most remote parts, so that he might re-enter it mentally and imaginatively.... He found ways to probe its depths, he let down nets to capture its life, he invented mechanical eyes and ears that could re-create for his senses a world long lost.... And yet he has returned to his mother sea only on her own terms..." It may be that man is now substituting his "own terms" for those of the sea. Should this worry us? Perhaps not. Every researcher will claim that, with the powerful means now at his disposal, opportunities for exciting discovery are far from over!

In organising the Erice conference, the editors of this volume have not only provided a striking venue in a very beautiful landscape; their objective was, above all, to stimulate the scientific community, particularly in Europe, to plan frontier research on the sea floor. They should be thanked for their efforts.
Jean BOISSONNAS
Former head of the EU MArine Science and Technology (MAST) research programme
List of the Authors
Y. Aoustin, IFREMER-Centre de Brest, B.P. 70, 29280 Plouzané, France
G. Ayela, ORCA-Instrumentation, 5, rue Pierre Rivoalon, 29200 Brest, France
J. W. Bailey, Department of Applied Ocean Physics and Engineering, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
D. Barbot, ORCA-Instrumentation, 5, rue Pierre Rivoalon, 29200 Brest, France
M. L. Begnaud, MBARI, 7700 Sandholdt Road, Moss Landing, CA 95039-0628, U.S.A. (now at Los Alamos National Laboratory, Los Alamos, NM 87545, U.S.A.)
L. Beranzoli, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
L. Béguery, Seismological Laboratory, URA, CNRS 195, Institut de Physique du Globe, Paris, France
J. Bialas, GEOMAR, Research Center for Marine Geosciences, Christian Albrechts University Kiel, Wischhofstr. 1-3, D-24148 Kiel, Germany
J. Blandin, IFREMER-Centre de Brest, B.P. 70, 29280 Plouzané, France
J. Boissonnas, European Commission, Rue de la Loi 200, 1049 Brussels, Belgium
A. D. Bowen, Department of Applied Ocean Physics and Engineering, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
G. M. Bozzo, Tecnomare S.p.A., San Marco 3584, 30124 Venezia, Italy
T. Braun, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy, and INGV-Osservatorio Sismologico Centralizzato Arezzo, Via Uguccione della Faggiuola 3, 52100 Arezzo, Italy
G. Budillon, Istituto Universitario Navale, Istituto di Meteorologia e Oceanografia, Via Acton 38, 80133 Napoli, Italy
R. Butler, Incorporated Research Institutions for Seismology, 1200 New York Ave. NW, Suite 800, Washington, DC 20005, U.S.A.
S. Cacho, Seismological Laboratory, URA, CNRS 195, Institut de Physique du Globe, Paris, France
M. Calcara, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
D. Calore, Tecnomare S.p.A., San Marco 3584, 30124 Venezia, Italy
R. Campaci, Tecnomare S.p.A., San Marco 3584, 30124 Venezia, Italy
A. Capone, Dipartimento di Fisica, Università di Roma "La Sapienza", P.le Aldo Moro 2, 00185 Roma, Italy
P. Carnevale, Tecnomare S.p.A., San Marco 3584, 30124 Venezia, Italy
A. D. Chave, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
G. Clauss, Institute of Naval Architecture and Ocean Engineering, Technische Universität Berlin, Germany
W. Crawford, Seismological Laboratory, URA, CNRS 195, Institut de Physique du Globe, Paris, France
J. M. Coudeville, ORCA-Instrumentation, 5, rue Pierre Rivoalon, 29200 Brest, France
G. Dallaporta, C.N.R., Istituto Studio Dinamica Grandi Masse, S. Polo 1364, 30125 Venezia, Italy
G. D'Anna, Istituto Nazionale di Geofisica e Vulcanologia, Osservatorio Geofisico di Gibilmanna, 90015 Cefalù, Palermo, Italy
A. De Santis, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
D. Di Mauro, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
F. K. Duennebier, School of Ocean and Earth Science and Technology, University of Hawaii at Manoa, Honolulu, HI 96822, U.S.A.
G. Etiope, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
P. Favali, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
D. Fellmann, ORCA-Instrumentation, 5, rue Pierre Rivoalon, 29200 Brest, France
E. Flueh, GEOMAR, Wischhofstr. 1-3, 24148 Kiel, Germany
F. Frugoni, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
F. Gasparoni, Tecnomare S.p.A., San Marco 3584, 30124 Venezia, Italy
H. W. Gerber, TFH Berlin, Luxemburger Str. 10, 13553 Berlin, Germany
L. Graziani, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
D. Harris, School of Ocean and Earth Science and Technology, University of Hawaii at Manoa, Honolulu, HI 96822, U.S.A.
E. Hobart, Department of Applied Ocean Physics and Engineering, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
S. Hoog, Institute of Naval Architecture and Ocean Engineering, Technische Universität Berlin, Salzufer 17-19, 10587 Berlin, Germany
V. Iafolla, Istituto di Fisica dello Spazio Interplanetario (IFSI) CNR, Via del Fosso del Cavaliere 100, 00133 Rome, Italy
J. Jolly, School of Ocean and Earth Science and Technology, University of Hawaii at Manoa, Honolulu, HI 96822, U.S.A.
J.-Y. Jourdain, Thomson Marconi Sonar S.A.S., 525 Route des Dolines, BP 157, 06903 Sophia Antipolis Cedex, France
J.-F. Karczewski, Seismological Laboratory, URA, CNRS 195, Institut de Physique du Globe, Paris, France
J. Kasahara, Earthquake Research Institute, University of Tokyo, 1-1-1, Bunkyo-Ku, Tokyo 113, Japan
J.-C. Koenig, Seismological Laboratory, URA, CNRS 195, Institut de Physique du Globe, Paris, France
G. Loaec, IFREMER-Centre de Brest, B.P. 70, 29280 Plouzané, France
F. J. Lowes, University of Newcastle upon Tyne, NE1 7RU, Newcastle upon Tyne, United Kingdom
P. Lognonné, Seismological Laboratory, URA, CNRS 195, Institut de Physique du Globe, Paris, France
A. Maramai, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Roma, Italy
J. Marvaldi, IFREMER, Centre de Brest, BP 70, Plouzané 29280, France
K. C. McNally, UC Santa Cruz, Dept. Earth Sciences, Santa Cruz, CA 95064, U.S.A.
A. Mazzoldi, C.N.R., Istituto Studio Dinamica Grandi Masse, S. Polo 1364, 30125 Venezia, Italy
C. Millot, Antenne LOB-CNRS, BP 330, La Seyne/mer 83507, France
J.-P. Montagner, IPG, Place Jussieu 4, 75252 Paris CEDEX 05, France
S. Monna, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
C. Montuori, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
G. Neuhaeuser, Send GmbH, Signal-Elektronik und Netz-Dienste, Stubbenhuk 10, D-20459 Hamburg, Germany
S. Nozzoli, Istituto di Fisica dello Spazio Interplanetario (IFSI) CNR, Via del Fosso del Cavaliere 100, 00133 Rome, Italy
H. U. Oebius, Technische Universitaet Berlin, FB 6/ITU-WUM, Sekr. VWS, Mueller-Breslau-Str. (Schleuseninsel), 10623 Berlin, Germany
G. Ollier, European Commission, Rue de la Loi 200, 1049 Brussels, Belgium
P. Palangio, Istituto Nazionale di Geofisica e Vulcanologia, Osservatorio Geomagnetico, Castello Cinquecentesco, 67100 L'Aquila, Italy
M. Pasyanos, UCB Seismographic Station, Berkeley, CA, U.S.A. (now at Lawrence Livermore National Laboratory, Livermore, CA 94551, U.S.A.)
R. A. Petitt, Jr., Department of Applied Ocean Physics and Engineering, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
J. Phipps Morgan, GEOMAR, Research Center for Marine Geosciences, Christian Albrechts University Kiel, Wischhofstr. 1-3, D-24148 Kiel, Germany
A. Piscini, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
Ch. Podeur, IFREMER-Centre de Brest, B.P. 70, 29280 Plouzané, France
A. Priou, ORCA-Instrumentation, 5, rue Pierre Rivoalon, 29200 Brest, France
B. Romanowicz, UCB Seismographic Station, Berkeley, CA, U.S.A.
G. Romeo, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
G. Roult, Seismological Laboratory, URA, CNRS 195, Institut de Physique du Globe, Paris, France
H. Roe, University of Southampton, Waterfront Campus, European Way, Southampton SO14 3ZH, U.K.
J. Savary, Seismological Laboratory, URA, CNRS 195, Institut de Physique du Globe, Paris, France
K. Schleisiek, Send GmbH, Signal-Elektronik und Netz-Dienste, Stubbenhuk 10, D-20459 Hamburg, Germany
G. Smriglio †, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
D. S. Stakes, MBARI, 7700 Sandholdt Road, Moss Landing, CA 95039-0628, U.S.A.
E. Stutzmann, IPG, Dept. Seismology, Case 89, Paris 75252, France
H. Thiel, Poppenbuettler Markt 8a, 22399 Hamburg, Germany
F. B. Wooding, Department of Geology and Geophysics, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
D. R. Yoerger, Department of Applied Ocean Physics and Engineering, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
G. Zappalà, Istituto Talassografico-CNR, Spianata San Ranieri 86, 98122 Messina, Italy
Part I Why deep-sea observatories?
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century
L. Beranzoli, P. Favali and G. Smriglio, editors.
© 2002 Elsevier Science B.V. All rights reserved.
Perspectives and challenges in marine research

G. Ollier (a), Chairman, P. Favali (b) and G. Smriglio (b), Co-Chairmen, F. Gasparoni (c), Rapporteur

(a) European Commission, Rue de la Loi 200, 1049 Brussels (Belgium)
(b) Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Roma (Italy)
(c) Tecnomare SpA, San Marco 3584, 30124 Venezia (Italy)
INTRODUCTION

From 8 to 13 September 1999, the Ettore Majorana Centre for Scientific Culture (Erice, Sicily) hosted the 16th Course of the International School of Geophysics, devoted to "Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century". Over 70 representatives of European and international scientific institutions and industries attended. The Conference focused on the most recent developments in marine technologies and scientific research, with particular emphasis on environmental studies and on the monitoring of major natural risks. The Course concluded with a Round Table* specifically devoted to discussing the main topics raised during the conference and, more generally, perspectives and challenges in marine research.

The Conference had a strong focus on benthic observatories, and in particular those aiming at long-term monitoring were reviewed. In the domain of geophysical research, where activities relating to deep-sea observatories are already somewhat ahead of the other domains, the objective is either to gain better access to the parameters of earthquake sources or to investigate the deep structure of the Earth. Biological research is also a good candidate for benthic laboratories, in particular nowadays to help assess deep marine biodiversity. Physical oceanographers likewise call for long-term monitoring of the deep-water masses because of their impact on global processes, and in particular on climate; the GOOS/EUROGOOS/MEDGOOS community (Global Ocean Observing System) is especially interested in such devices. Industry is nowadays moving into deeper waters, which also implies more technology development relating to long-term deep-sea observatories: operations in deeper waters carry higher risk and necessitate complete monitoring of the environmental parameters (currents, sediment stability, impact of the exploration/exploitation activities). Even astronomers are now seeking the benefits of deep-water research, giving high priority to deep-water neutrino telescopes that will enable them to receive information from regions of the universe more remote than those accessible to optical telescopes.

An interpretation rather well accepted by the audience of all these requests from the scientific community is that an optimised baseline approach is needed in order to tackle the common technological problems, which are basically: (1) communications, (2) energy supply, (3) availability of deep-sea components and (4) servicing of the deep-sea stations. It is therefore quite clear that collaboration across disciplines in the domain of deep-sea observatories will remain a key approach and will even need to be strengthened. The technology needed for such developments is quite demanding, whereas the available resources are limited, in the context of several possible approaches to long-term monitoring of the deep-sea environment.

The Round Table extended the conference through discussions of the various technical possibilities and frameworks for moving deeper, complementing what was said during the sessions of the conference itself. In fact, the central question put forward during the Round Table was how to move into the deep sea in technological and economic terms. The discussion was structured around four axes: (1) the need for marine instrumentation in the near future, (2) the role and framework of science-industry interactions, (3) deep-sea observatories: progress achieved so far and further needs, and (4) key components for marine observatories.

* Report of the Round Table on "Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century", Ustica, Italy, 14 September 1999.
PERSPECTIVES IN MARINE INSTRUMENTATION
by Howard Roe (Southampton Oceanography Centre, U.K.)

Understanding the processes at work in the marine environment requires, above all, effective data collection. Technology is the tool to achieve this objective, especially in the deep-sea environment, where extreme robustness, minimum maintenance and minimal calibration needs are mandatory requirements for instrumented packages. At the same time, the need to observe the variability of phenomena in time and space obliges us to collect data at the right scale; this means that technology developments must be oriented towards the design of what is effectively needed by users. Scientific benefit and cheaper approaches should therefore be considered the leading factors influencing future technological developments in marine instrumentation. Multidisciplinary approaches, such as those characterising the new benthic observatories under development in Europe and the U.S.A., represent a solution to the problem.

Needs for the future may be summarised as follows:
- Extend the number and type of platforms, with particular reference to landing systems, benthic observatories and Remotely Operated Vehicles; at the same time, the more traditional stations (buoys, subsurface moorings, etc.) need to be kept in operation to continue time series already started.
- Collect data in real time; increase the number and type of instrumented packages that may be remotely operated continuously (like plankton recording samplers).
- Develop systems compatible with the logistics and requirements of different organisations; this will widen the number of disciplines that may benefit from the development of new fixed stations and observatories.

A promising perspective is the linking of fixed stations (observatories, moorings) with AUVs, which have the potential to be more cost-effective than survey ships.

Possible proposals for increasing the role and presence of the European community are:
- make an inventory of large-scale facilities;
- develop new platforms at European level;
- stimulate technology transfer from fields like space and medical science;
- promote and run interactive programmes (for schools, education, etc.), taking example from the U.S.A.
SCIENCE-TECHNOLOGY INTERACTION: THE INDUSTRY POINT OF VIEW

by G. M. Bozzo (Tecnomare, Italy)

Three main issues justify the need for a more fruitful relationship between science and technology:
- improving the understanding and knowledge of the sea;
- identifying better tools and procedures for the management and safeguard of the sea and marine resources;
- securing the sustainable development of marine resources.

These issues have important and direct implications for significant aspects such as quality of life, understanding of the global phenomena of the Earth, socio-economic development, etc. For this reason it is becoming more and more important to be able to explain to the general public and to society the importance of the sea: why it is important to invest in improving our understanding of it, why to go deeper, and so on. In this regard, better communication between the marine community and society about the sea and related problems is necessary, with the involvement of all the different actors and with the commitment to provide qualified information on fundamental issues such as water quality and the status of the marine ecosystem. This will also help avoid the emotional approach that often characterises the debate on marine issues.

As regards the relationship between the scientific and technological communities, it has been considered more a "client-vendor" relationship than a real co-operation in which both parties have a complementary role to play. The difficulty of mutual acceptance between science and technology, which characterised the discussion during the first years of implementation of the European MAST Programme, is now largely over. Nobody now questions the importance of co-operation; the question is how to improve and increase it. Two good examples proving the need for better synergy are relevant to deep-sea activities.

Deep-sea research, for a better understanding of the role of the deep ocean in global phenomena, calls for the availability of suitable enabling technologies as well as robust, economic and reliable instruments. Some of these technologies are already available to the offshore industry. On the other hand, the hydrocarbon industry is moving towards new areas for the exploration and exploitation of deep-sea reservoirs in several parts of the world (Gulf of Mexico, offshore Brazil, Mediterranean Sea, North Sea, Caspian Sea, Far East). To proceed in this development, industry has to face the basic problem of achieving a better understanding of the physical and biological phenomena affecting the deep sea, not yet well known, in order to properly design production systems. In this case science may provide the solution to the problem.

One important concept needs to be understood: marine technology is not a commodity; the time necessary to develop new marine technology, suitable for operation in very severe environmental conditions, is often more than 6 to 8 years. In addition, proper training of the persons involved is necessary to secure a successful result. In this regard the MAST Programme was the most important tool for fostering the development of a modern marine science and technology community and for creating effective co-operation at European level. Marine Reserves can play an important role in the support of Pilot Projects for new technological developments.
DEEP SEA OBSERVATORIES
by Barbara Romanowicz (University of California, Berkeley, U.S.A.)

The role and importance of deep-sea observatories in marine and Earth sciences is now recognised worldwide. Seismologists, together with geomagnetists and geodesists, were the first communities to launch the idea of a global coverage of the Earth's surface with permanent observatories. For obvious logistic reasons, monitoring stations relevant to these disciplines have traditionally been located on continents and islands, but it soon became evident that, to achieve a uniform distribution of the observation networks and, consequently, to improve our understanding of the Earth's interior, underwater stations have to be developed and operated on the deep sea bed. In 1993 an international organisation (International Ocean Network) was established to co-ordinate activities in this specific field, and a workshop (Multidisciplinary Observatories on the Deep Sea Floor, Marseille, France, 1995) was dedicated to discussion, between scientists and technologists, of the various aspects of the problem. Deep-sea observatories represent the technological solution for reaching the target of global observations.

As regards the initiatives in progress within the European Union, GEOSTAR is the most significant, focusing on multidisciplinarity, communications and simplified deployment. Standardisation is one of the key issues for globality; modularity and common installation and recovery tools could provide easier maintenance procedures and lower costs. In this regard the European Union should promote participation in global programmes and not limit its attention to local problems. Real-time data recovery from ocean-bottom observatories is another crucial aspect; for some of the possible applications, real-time data connection is an essential requirement that cannot be met by the presently available solutions. Experimentation with different approaches should be encouraged to ensure a fruitful evolution of these systems.
Observatories need to be designed taking into account the requirements of multidisciplinarity, so as to enable scientists to deploy different instruments and share infrastructures in which observations of several phenomena are carried out simultaneously and over a time scale of one year or more. In conclusion, deep-sea observatories call for a strong relationship between science and technology: scientific requirements are a stimulus to develop new solutions, but at the same time only new technological developments can enable the achievement of the most challenging goals of deep-sea observatories (such as autonomy, real-time data communication, etc.).
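As a purely illustrative aside, the modularity and standardisation argument above can be made concrete in a few lines of code. The sketch below, in Python, shows one possible shape for a standardised sensor-module contract on a multidisciplinary platform; every name and interface in it (Record, SensorModule, acquisition_cycle, and so on) is a hypothetical assumption made for illustration, not the actual GEOSTAR or International Ocean Network design.

```python
# Minimal sketch (assumed names, not a real observatory API): any instrument
# that satisfies the SensorModule contract can share the platform's
# acquisition core and common time base.
from dataclasses import dataclass
from typing import Protocol


@dataclass
class Record:
    sensor_id: str      # which module produced the sample
    timestamp_s: float  # common observatory time base, in seconds
    values: tuple       # calibrated readings


class SensorModule(Protocol):
    """Contract a module must meet to join the shared infrastructure."""
    sensor_id: str
    def sample(self, timestamp_s: float) -> Record: ...


@dataclass
class Seismometer:
    sensor_id: str = "seis-3c"
    def sample(self, timestamp_s: float) -> Record:
        # placeholder values; a real module would return three-component
        # ground-motion readings
        return Record(self.sensor_id, timestamp_s, (0.0, 0.0, 0.0))


@dataclass
class Magnetometer:
    sensor_id: str = "mag-3c"
    def sample(self, timestamp_s: float) -> Record:
        return Record(self.sensor_id, timestamp_s, (0.0, 0.0, 0.0))


def acquisition_cycle(modules: list[SensorModule], timestamp_s: float) -> list[Record]:
    """One synchronous scan: simultaneous multi-parameter observation."""
    return [m.sample(timestamp_s) for m in modules]


if __name__ == "__main__":
    for record in acquisition_cycle([Seismometer(), Magnetometer()], timestamp_s=0.0):
        print(record)
```

The practical content of the standardisation argument is visible in the design: a new discipline's instrument joins the platform by meeting the common contract, not by modifying the acquisition core, which is what allows infrastructures, installation tools and maintenance procedures to be shared.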
KEY TECHNOLOGIES FOR MARINE OBSERVATORIES
by Jean-Yves Jourdain (Thomson Marconi Sonar, France)

The advantages of co-operation between scientists and industry have been widely underlined during the conference. It is important for European industry to be involved in ambitious scientific projects from their very beginning, and to face these challenges together with scientists. For industry there is the problem of finding synergy between many different marine applications because, except for oil exploration and exploitation, no single application can constitute a market. One of the most critical features influencing industrial strategy is the perspective of a clear market. In this regard the scientific market is not always obvious from the industry point of view, while the oil industry represents a good example of a clear market.

With particular reference to underwater acoustics, several topics can be mentioned where significant progress has been made: low-frequency and wide-band sources, imaging (thanks to synthetic aperture processing) and positioning. Acoustic links could now be efficient whatever the application to be addressed; yet marine systems at present never use such links for transmitting commands or images, even though they often carry an acoustic link as a safety fallback when all other possibilities fail: a paradoxical situation. Finally, developments in autocalibration techniques, particularly those exploiting the correlation of scattered signals, are of paramount importance for future systems.
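To make the command-link point concrete, the following minimal sketch shows how a short command might be framed for a noisy, low-bandwidth acoustic channel, with a checksum so that a corrupted frame is rejected rather than executed. The frame layout, sync word and CRC choice here are assumptions made for illustration only; no specific modem's protocol is implied.

```python
# Illustrative command framing for an acoustic uplink (hypothetical layout:
# 2-byte sync word + 4-byte payload + 4-byte CRC-32 = 10 bytes per frame).
import binascii
import struct

SYNC = b"\x2A\x55"  # arbitrary two-byte sync word (assumption)


def frame_command(station_id: int, opcode: int, arg: int) -> bytes:
    """Pack a command and append a CRC-32 so the seafloor station can
    reject frames corrupted by the noisy acoustic channel."""
    payload = struct.pack(">BBH", station_id, opcode, arg)
    crc = binascii.crc32(payload) & 0xFFFFFFFF
    return SYNC + payload + struct.pack(">I", crc)


def parse_command(frame: bytes):
    """Return (station_id, opcode, arg), or None if any check fails."""
    if len(frame) != 10 or not frame.startswith(SYNC):
        return None
    payload, (crc,) = frame[2:6], struct.unpack(">I", frame[6:10])
    if binascii.crc32(payload) & 0xFFFFFFFF != crc:
        return None  # corrupted in transit: request retransmission instead
    return struct.unpack(">BBH", payload)


if __name__ == "__main__":
    f = frame_command(station_id=7, opcode=1, arg=3600)  # e.g. "sample for 3600 s"
    print(parse_command(f))  # -> (7, 1, 3600)
```

A design of this kind trades bandwidth for robustness: even on an acoustic channel of only a few hundred bits per second, a ten-byte verified frame still goes through in well under a second, which is why command telemetry over such links is technically plausible despite the paradox noted above.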
SUMMARY OF DISCUSSION TOPICS

The conference was mainly oriented towards marine research in the deep sea and represented an important opportunity for discussion between the scientists and technologists involved in this field. The most significant contributions concerned unmanned deep-sea observatories, which are providing a suitable solution as a technique complementary to more traditional systems (deep-sea ROVs, manned submersibles, buoy networks, etc.) and enabling the development of permanent monitoring networks. For this reason they are the subject of growing interest from many scientific missions and disciplines, and at the same time represent one of the most challenging goals of technical development for the next century. In this regard, the conference gave the opportunity to present in detail the main experiments being carried out worldwide.

Another aspect that distinguished the conference was the presence of a wide range of scientific disciplines, all interested in working in the deep sea and all potential beneficiaries of common approaches and infrastructures. Most of the interventions made by the participants pointed out the importance of a multidisciplinary approach, both for the advantage in terms of costs (sharing common facilities) and for the added value of carrying out synoptic observations with different instruments. This trend seems to be well advanced in the geophysical sciences, less so in the more traditional marine sciences, such as physical oceanography. Other disciplines need to be involved, in particular biologists and chemical oceanographers. At the same time it should be kept clearly in mind that certain disciplines have very specific needs, so multidisciplinarity is not to be treated as a "dogma", but must be adapted, case by case, to find the best solution. In some cases the absence of a discipline (like biology) from deep-sea observatories is simply because the sensing techniques are not yet ready. In this regard it is essential to take advantage of know-how available in other fields, such as space technology and laboratory analytical techniques.

Two significant examples of the potential of technology transfer were presented at the conference, both concerning instruments originally developed for space applications that are now being adapted to deep-sea use in the framework of the GEOSTAR project. One is a gravity meter developed by the Istituto di Fisica dello Spazio Interplanetario of the Italian Research Council; the other is a three-axial broad-band seismometer developed by the French Institut de Physique du Globe. In other cases there is the problem of achieving the active involvement of specific disciplines and obtaining a clear definition of their requirements and objectives. It has been pointed out that in past cases the response of the scientific community to initiatives designed to investigate their expectations and goals was too limited (see for example the questionnaire distributed in the MAST-2 Abyssal Benthic Laboratory feasibility study: the percentage of replies was only 20%).

After the conclusion of the MAST programme, marine science and technology must now be spread across the whole European framework programme. The marine community now faces two problems:
- to be able to communicate to public opinion the reasons for, and advantages of, investing in marine research;
- to link what is achieved during the pre-competitive phase (typical of MAST projects) to fully operational devices.

Community funding covers a small percentage of the total research budget; the larger part derives from national funds, which are however not easily available for initiatives that carry an intrinsic risk. There is a difficult gap to fill, and sharing of resources is therefore a necessity. The development of large-scale infrastructures like deep-sea geophysical observatories may represent a good opportunity for use by other communities. In the Fifth Framework Programme there are budgetary lines for large-scale infrastructures. Also, collaboration between the EU and other countries is welcome where funding is otherwise not possible. Besides the availability of infrastructures, the use of ships at European level represents another problem, especially for deep-sea activities, which are characterised by high costs.

Many interventions underlined how the U.S.A. seems to be well ahead of Europe in the field of education in the marine sciences. Initiatives like exhibitions (as already organised by astrophysical institutes) should be promoted at European level to inform and involve the public.
CONCLUSIONS

The Round Table was concluded by Attilio Licciardi, Mayor of Ustica, who welcomed the participants and underlined the importance of the event for the whole Sicilian community. Ustica, the island that hosts the first Italian Marine Reserve, intends to be a meeting place for marine scientists; in this regard the co-operation agreement signed between the Ustica Marine Reserve and the Istituto Nazionale di Geofisica for the GEOSTAR project represents a first important achievement.
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century
L. Beranzoli, P. Favali and G. Smriglio, editors.
© 2002 Elsevier Science B.V. All rights reserved.
Research for the Protection of the Deep-Sea

Hjalmar Thiel

Poppenbüttler Markt 8a, 22399 Hamburg, Germany
The exploitation of deep-sea resources began about 50 years ago and is expected to increase in the future. In these activities, impact assessments must be conducted to avoid any misuse of the environment. Short accounts are presented of the potential activities in waste deposition, ore mining and hydrocarbon extraction; in each case the nature of the required environmental research is characterised. Finally, technical requirements and the current international legal framework governing activities in the deep sea are briefly summarised.
1. BACKGROUND
Marine science has seen substantial developments during the second half of the 20th century, with a number of remarkable discoveries. These have included the final proof of the existence of plate tectonics and the many related processes: for example, the occurrence of warm and cold water venting from the seafloor, the predominance of chemoautotrophically fuelled animals at these localities, and the specialised communities thriving around them [1-5]. However, these amazing discoveries were founded on basic research by scientists curious about the oceans' functional relationships and processes, which gradually developed into more general insights about the oceanic and atmospheric systems. In the 1950s, for example, the El Niño and La Niña situations were believed to be rather localised effects; today they are recognised as important phenomena within a world-wide weather oscillation system [6].

Basic science is of the utmost importance for an understanding of ocean processes. However, the last decade has seen an increasing shift towards applied science in many countries, including those of the European Community, directed by political decision makers. In many fields this may be justified by the needs of modern human society. However, one eminent human need is the proper functioning of the Earth's ecosystems, the human environment. Ecology must therefore be regarded as an applied science; it is wealth creating, not in the sense of financial resources, but in the sense of human health and quality of life.

Most publications on the marine ecosystem deal with relatively shallow waters, and much less is known about the remote deep ocean. However, human society already uses the deep sea and is in the process of preparing further large-scale intrusions into this inner space. Oceanographers from all disciplines must consider the need for precautionary investigations into the potential effects of this intrusion, to enable governments and industry to avoid environmentally harmful activities and to undertake appropriate a posteriori studies. Thinking ahead, and considering potential impacts of current developments, will guide mankind into wise uses of our environment and will help remove the mysteries of the deep ocean and bring it fully into the human ambit [7].
2. USE OF DEEP-SEA RESOURCES AND REQUIREMENTS FOR THE ENVIRONMENTAL RESEARCH
The terrestrial parts of the Earth are already severely impacted by man [8,9]. Consequently, the potential use of the deep oceans, both for waste disposal and as a source of potentially valuable minerals, is frequently referred to. Indeed, the deep sea has already been used for the storage of wastes, and the possible extraction of metalliferous resources is being actively explored. There is no valid ethical argument against the use of the deep ocean for the permanent disposal of waste material when the available land areas are exhausted. But the available data on the deep sea are not yet sufficient to allow adequate discussion of the best practicable environmental options for permanent waste disposal into this environment or the extraction of resources from it. As part of the process of correcting this deficiency, wastes already disposed of in the deep ocean should be viewed as experiments, albeit unintentional ones, and should be studied accordingly. Examples of such waste disposal activities that might be studied in this way, and experiments that might be conducted to investigate the possible impact of future disposals and resource extraction, are summarised below. For more details, the reader is referred to Thiel et al. [10,11].

Munitions have been dumped in the sea for many years, in both shallow and deep waters [12]. Much of this material was dumped following World War II, but munitions dumping has also occurred subsequently. Despite the existence of the London Convention (the former London Dumping Convention) there has been no control over such dumping, and there are no publicly available data on its extent since, under the sovereignty of states, naval authorities have been exempted from the convention. As a scientist, I cannot understand why a particular section of society should be exempted in this way from rules which apply to all other sections. The natural environment does not recognise such distinctions, and impacts on it are the same wherever they originate. In fact, munitions dumping may not be particularly harmful to the wider environment. But any exploitation of the environment, including dumping munitions in the deep sea, demands study of the potential effects, and this has never been done. The last munitions dumping in European waters probably occurred in 1994, when Portugal scuttled a ship containing 2200 metric tonnes of various types of munitions close to the outer boundary of its Exclusive Economic Zone [13]. However, no investigation of the short-term effects on the environment has been conducted. This represents a missed opportunity to obtain valuable information, though it is still not too late to study the longer-term effects.

Large structures include those such as the oil storage platform Brent Spar, which was intended to be dumped in the deep sea in 1995. As a result of public pressure following the intervention of Greenpeace, this plan was abandoned in favour of onshore re-use and disposal [14]. Scientists would have been in a better position to advise on the likely environmental consequences of the original Brent Spar disposal plan if the impacts of other large structures, already on the seafloor, had been studied before. There are many such structures lying on or buried within deep-sea sediments, including the munitions carrier mentioned above and hundreds of sunken commercial vessels and warships. A study of their near- and far-field environments would provide valuable data for impact predictions for other large structures as and when these are required.

Radioactive wastes were dumped in the sea from the end of World War II until 1982 when, on the advice of the International Atomic Energy Agency, the London Convention banned this mode of final storage. Most of the dumped material was disposed of in the NE Atlantic between 1949 and 1982 [15,16]. Low-level radioactively contaminated material from hospitals and industry was packed in barrels with concrete or bitumen as a matrix and dumped mainly in water depths between 4500 and 5300 metres. Environmental investigations were conducted in the general areas of the waste and no contamination above background levels was discovered. But, except for a few which were trawled, the barrels were never thoroughly inspected, and the near-field environment, at the centimetre to metre scales, was not studied. Fine-scale sediment coring surveys and analysis of near-field sediment and animal samples would provide useful information for the evaluation of past and potential future waste disposals. The technology for such studies around barrels of varying in situ times and degrees of deterioration already exists: manned submersibles and remotely operated vehicles.

Oil and gas have been extracted in shelf and upper slope regions from rock formations deep below the seafloor for many years, and the industry is progressively penetrating deeper down the continental slope. The extraction process itself does not seriously impact the deep sea unless the riser connection fails and oil leaks into the ocean. But the impacts of drilling muds and cuttings piles around drilling rigs are known from shallow-water studies [17]; with appropriate modifications, these results should be applicable to deep-water situations.

Gas hydrates were discovered only recently, but they have a wide distribution in continental slope regions. They constitute a solid material with a high energy content and may replace other resources in the future [18]. But the effects of such exploitation remain speculative. Extraction of the hydrates may lead to instabilities in the continental slopes and may cause sediment slides with far-reaching environmental consequences. Precautionary environmental studies such as large-scale experiments cannot be contemplated, but thorough site surveys would be an absolute prerequisite for any decision on exploitation, since sedimentary slope structures vary and each potential extraction region would require special consideration of the possible environmental risks.

Sewage sludge and dredge spoils from harbours and rivers, when not contaminated with heavy metals, organochlorines or oil residues, might be recycled to agricultural areas, but contaminated material requires permanent storage. Potential disposal sites are becoming increasingly rare on land and interest in ocean dumping may recur. This is not to promote deep-sea disposal, but to urge oceanographers to be aware of potential future attacks on the deep ocean floor and the overlying water column. At present, they are not prepared for a best practicable environmental option comparison with terrestrial storage.

Municipal waste dumping has occurred in the deep western Atlantic at the surface, 106 nautical miles off New Jersey in over 2500 metres of water [19]. This notorious dump site was used between 1986 and 1992 and the environmental effects were studied carefully. Further disposal activities are not being considered at the moment, but various technologies have been developed for dumping such waste at mid-ocean depths [20], suggesting that this option is still being considered. Surface release of wastes is certainly not advisable. At the very least, they should be delivered to the seafloor either in some form of riser or packed in geotextile container bags in order to confine any impact to a restricted area. In either case, more information than is currently available will be needed to evaluate the impact of any proposed future deep-ocean waste dumping.
As in the case of deep-sea mining, only a large-scale experiment could produce this information. It is simply not possible to extrapolate from data gathered by traditional oceanographic techniques to the large-scale intrusions expected from any commercial disposal activity. Commercial-scale disposal might involve the order of 1 million tonnes per year, but a feasible experiment should approach this scale of dumping through a series of steps with intermittent monitoring activities. For example, an initial dumping of 100,000 tonnes might be envisaged. If the results from monitoring the effects of this quantity were encouraging, it could be followed by a single disposal of around 1 million tonnes. Again, if the results were encouraging, a final stage could involve monitoring the effects of dumping 1 million tonnes each year for ten years. Such a nested experiment would need to allow sufficient time for any environmental impacts to occur and for the analysis and evaluation of the monitoring results [10].

Carbon dioxide (CO2) sequestration into the ocean has already been discussed for some years, aiming at a reduction of further increases in the concentration of greenhouse gases in the atmosphere [21,22]. The ocean surface takes up CO2 in natural processes, but the deep ocean has a much higher storage capacity. Eventually, any CO2 naturally or artificially sequestered in the deep ocean will return to the atmosphere, but the time-lag of several hundreds of years should be sufficiently long to allow other methods of CO2 sequestration to be found in the meantime. However, CO2 is poisonous at high concentrations because it changes the pH of the water, so the possible effect of deep-sea storage on the marine fauna would require careful study (a rough numerical illustration of this pH effect is sketched at the end of this section). Again, it is difficult to extrapolate from laboratory data, and a step-wise approach should therefore be employed. Such an approach might begin with laboratory-scale experiments using aquaria or mesocosms, move on to an intermediate-scale experiment in, for example, a relatively isolated deep, cold fjord, and, finally, to a major experiment in the deep ocean. As in the case of large-volume waste disposal, progression to each successive stage would depend upon the acceptability of the impact of the previous stage.

Deep-sea ores have been known for about 130 years. However, the widespread occurrence of polymetallic nodules on the sediment surface at depths between about 4000 and 6000 m, and their potential as an industrial resource, has become apparent only in recent decades with the development of modern oceanographic techniques and particularly of deep-sea photography [23]. In fact, it was the presence of these curious concretions which prompted the most severe controversy between developing and industrialised nations during the negotiations of the United Nations Convention on the Law of the Sea (UNCLOS), delaying the treaty's entry into force. In the meantime, other potentially exploitable ores have been discovered, including polymetallic crusts, metalliferous muds and massive sulphides, in addition to phosphoritic nodules with potential for the production of fertilisers. Most efforts, both in exploration and in exploitation technology development, have been concentrated on polymetallic nodules and metalliferous muds, and technical mining tests were successfully conducted in the late 1970s. Environmental problems were foreseen from the beginning, and impact assessments accompanied the commercial sea-going activities [24,25].
The most stimulating insight was gained in the early 1980s during evaluation of the environmental studies for metalliferous mud mining in the Red Sea [26]. Data collected by traditional methods proved to be unsuitable for the prediction of environmental mining impacts, since it was impossible to extrapolate from small-scale impacts to assess the likely effects of much larger commercial activities.
As a result, the first large-scale disturbance experiment was developed (by Germany) [27] as a new approach to observe the re-establishment of the deep-sea benthic community after a severe disturbance of several square kilometres of seafloor in a polymetallic nodule area [28]. Other large-scale re-sedimentation and faunal recovery experiments followed, conducted by the United States, Japan, India and the Interoceanmetal Joint Organisation headquartered in Poland [29,30,31,32,33]. Further experiments, on a variety of scales, must be conducted to evaluate the impacts of nodule and metalliferous mud mining, and the potential impacts of polymetallic crust and massive sulphide mining must also be considered. An additional essential step will be the monitoring of early mining tests lasting several months. All of these activities will be large-scale undertakings which should be financed through international co-operation, with the resulting knowledge shared. Little is known about plans for the exploitation of massive sulphides. The environmental problems are certainly different from those involved in polymetallic nodule mining, since the ore body is not, in this case, confined to the sediment surface, and mining may take place close to hydrothermal vents with their associated specialised communities. This situation may have similarities to the mining of metalliferous muds in the Red Sea, where the basic resource is relatively similar to massive sulphides. Moreover, the environmental studies conducted during the exploration of the Atlantis II Deep in the Red Sea were unable to exclude the possible existence of hydrothermal vent communities on the upsloping outskirts of the Deep.
3. REQUIRED TECHNIQUES

Much of the applied deep-sea research envisaged for the future can be achieved with standard oceanographic equipment available on most research vessels. However, more specialised technology will be essential to answer many of the current questions. As noted above, barrels containing radioactive waste should be inspected closely, and this can be achieved only with remotely operated vehicles (ROVs) or with manned submersibles. The same is true for the collection of samples close to dumped munitions and large structures, both because of the risk of loss or damage to conventionally towed sampling devices and because these cannot be positioned sufficiently accurately to obtain the relevant samples. Even monitoring large-scale dumpsites, such as those for sewage sludge or dredge spoils, may require fine-scale sampling achievable only with ROVs or submersibles. Long-term measurements and observations may also be needed. These can be achieved with deployed abyssal laboratory-type installations [34] such as GEOSTAR (see this volume), with the equipment carried tailored specifically to the task in hand. Studies of hydrothermal vents, for example, would require a broad spectrum of equipment including cameras, corers, water bottles, current and turbidity meters and a range of sensors. Similarly, at sewage sludge or dredge spoil dumpsites the equipment should detect particle sedimentation and erosion, sediment and contaminant transport, life and bioturbation.
4. LEGAL CONDITIONS FOR THE PROTECTION OF THE DEEP SEA
The United Nations Convention on the Law of the Sea [35], which has been ratified by 132 States Parties (131 States and the European Union), contains many paragraphs relating to the deep seabed environment. Responsibility for environmental protection of the
deep-sea floor, referred to as the Area in the UNCLOS document, is assigned to the International Seabed Authority (ISA). A code for the exploration for polymetallic nodule mining is under negotiation within the ISA, and environmental guidelines for this activity have already been drafted [36]. No other use of the Area has been discussed thoroughly since the Law of the Sea entered into force in November 1994 and the ISA was established. A general statement on the use of other deep-sea resources might be a useful position document before such discussions take place. However, any ISA environmental provisions have no legal force outside the Area, that is, in the Exclusive Economic Zones (EEZs) of individual states. UNCLOS agreed on a 200-nautical-mile-wide EEZ for coastal states (with exceptions), and Article 193 states that "States have the sovereign right to exploit their natural resources pursuant to their environmental policies and in accordance with their duty to protect and preserve the marine environment". But legal provisions to protect the environment may be weakly developed in many countries, and in most countries the marine environment, particularly the offshore parts of the EEZ, may not be included within the legal framework. Thus, the missing legal provisions for EEZs leave the potential for severe disturbance of deep-sea regions against which no international legal regulations may be effective. The legal basis for the protection of the sea should be developed further, and this should include clear formulations of the need for precautionary research, environmental impact assessments, conservation of species and habitats [37,38], and monitoring. It should also insist that, in order to develop the best practicable environmental options, the best available and appropriate techniques be employed.

ACKNOWLEDGEMENTS
I am most grateful to Prof. Dr. Paolo Favali and Prof. Dr. Giuseppe Smriglio for inviting me to participate in the course Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century. I thank Dr. Anthony L. Rice for the language correction of my manuscript.
REFERENCES
1. V. Tunnicliffe, Oceanogr. Mar. Biol., Ann. Rev., 29 (1991) 319.
2. L.M. Parson, C.L. Walker and D. Dixon (eds.), Hydrothermal Vents and Processes, Spec. Publ., The Geological Society, London, 1995.
3. Ch.R. Fisher, Biosystem. Ecol. Ser., 11 (1996) 313.
4. C.R. German, L.M. Parson and R.A. Mills, in: C.P. Summerhayes and S.A. Thorpe (eds.), Oceanography: An Illustrated Guide, Manson Publishing Ltd., London, 1996.
5. A.V. Gebruk, S.V. Galkin, A.L. Vereshchaka, L.I. Moskalev and A.J. Southward, Advanc. Mar. Biol., 32 (1997) 94.
6. R.W. Schmitt and S.E. Wijffels, Geophys. Monogr., 75 (1993) 77.
7. K.J. Hsü and J. Thiede (eds.), Use and Misuse of the Seafloor, John Wiley & Sons Ltd, Chichester, 1992.
8. WBGU (German Advisory Council on Global Change), World in Transition: The Threat to Soils, Economica, Bonn, 1995.
9. WBGU (German Advisory Council on Global Change), World in Transition: Ways Towards Sustainable Management of Freshwater Resources, Springer, Berlin, 1999.
10. H. Thiel, M.V. Angel, E.J. Foell, A.L. Rice and G. Schriever, Environmental Risks from Large-Scale Ecological Research in the Deep Sea: A Desk Study, Contract No. MAS2-CT94-0086, Commission of the European Communities, Directorate General for Science, Research and Development, 1998.
11. H. Thiel, Anthropogenic impacts on the deep sea, in: P.A. Tyler (ed.), Ecosystems of the World: The Deep Sea, Elsevier Science, Amsterdam, in press.
12. ACOPS, ACOPS Yearbook 1986/87, Advisory Committee on Pollution of the Sea, London, 1988.
13. SEBA (Working Group on Sea-Bed Activities), Oslo and Paris Conventions for the Prevention of Marine Pollution, Summary Record 95/8/1-E (1995).
14. T. Rice and P. Owen, Decommissioning the Brent Spar, E. & F.N. Spon, London and New York, 1999.
15. F.G.T. Holliday, Report of the Independent Review of Disposal of Radioactive Wastes in the Northeast Atlantic, HMSO, London, 1984.
16. IAEA (International Atomic Energy Agency), Inventory of Radioactive Material Entering the Marine Environment: Sea Disposal of Radioactive Waste, IAEA TECDOC-588, Vienna, 1991.
17. F. Olsgard and J.S. Gray, Mar. Ecol. Progr. Ser., 122 (1995) 277.
18. M. Kelland, Mar. Poll. Bull., 29 (1994) 307.
19. M.H. Bothner, H. Takada, I.T. Knight, R.T. Hill, B. Butman, J.W. Farrington, R.R. Colwell and J.F. Grassle, Mar. Environm. Res., 38 (1994) 43.
20. P.J. Valent and D.K. Young, Abyssal Seafloor Waste Isolation: Environmental Report, Naval Research Laboratory, Stennis Space Center, NRL/MR/7401-95-7576, 1995.
21. T. Ohsumi, Mar. Techn. Soc. Journ., 29 (1996) 58.
22. B. Ormerod and M.V. Angel, Ocean Storage of Carbon Dioxide: Workshop 2, Environmental Impact, IEA Greenhouse Gas R&D Programme, Cheltenham, 1996.
23. J.L. Mero, The Mineral Resources of the Sea, Elsevier Oceanogr. Ser. 1, Elsevier, New York, 1965.
24. E. Ozturgut, J. Lavelle and R.E. Burns, in: R.A. Geyer (ed.), Marine Environmental Pollution, Dumping and Mining, Elsevier Oceanogr. Ser. 27B, Elsevier Science Publishers, Amsterdam, 1981.
25. H. Thiel, E.J. Foell and G. Schriever, Ber. Zentr. Meeres- u. Klimaforsch. Univ. Hamburg, 26 (1991) 1.
26. H. Thiel, H. Weikert and L. Karbe, Ambio, 15 (1986) 34.
27. H. Thiel, Mar. Mining, 10 (1991) 369.
28. H. Thiel and G. Schriever, Ambio, 19 (1990) 245.
29. T. Kaneko (Sato), K. Ogawa and T. Fukushima, Proc. 1st Ocean Mining Symp., Tsukuba, Japan (1995) 181.
30. T. Radziejewska, Proc. Intern. Symp. Environm. Stud. in Deep-Sea Mining, Tokyo (1997) 223.
31. D.D. Trueblood and E. Ozturgut, Proc. 7th Intern. Offshore Polar Engin. Conf., Honolulu (1997) 481.
32. R. Sharma, Proc. 3rd Ocean Mining Symp., Goa (1999) 118.
33. H. Thiel and Forschungsverbund Tiefsee-Umweltschutz, Deep-Sea Res. II (in press).
34. H. Thiel, K.-O. Kirstein, C. Luth, U. Luth, G. Luther, L.-A. Meyer-Reil, O. Pfannkuche and M. Weydert, Jour. Mar. Syst., 4 (1994) 421.
35. United Nations, The Law of the Sea, Division of Ocean Affairs and the Law of the Sea, Office of Legal Affairs, United Nations, New York, 1997.
36. ISA (International Seabed Authority), Deep-Seabed Polymetallic Nodule Exploration: Development of Environmental Guidelines, International Seabed Authority, Office of Resources and Environmental Monitoring, Kingston, 1999.
37. H. Thiel and T.J.A. Koslow, Ocean Challenge, 9 (1999) 14.
38. J.A. Koslow, G.W. Boehlert, J.D.M. Gordon, R.L. Haedrich, P. Lorance and N. Parin, ICES J. Mar. Sci., 57 (in press).
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century
L. Beranzoli, P. Favali and G. Smriglio, editors.
© 2002 Elsevier Science B.V. All rights reserved.
Deep physical oceanography: experimentation and benefits from bottom observatories
Claude Millot
Antenne LOB-COM-CNRS, BP 330, F-83507 La Seyne/mer, France
Physical oceanography is probably the marine-science discipline which has benefited the most from in situ autonomous sampling. The major reason is that it deals with state and dynamical equations relying on parameters (temperature, salinity, pressure, current) that can be measured fairly easily, at least locally (current can also be sampled remotely). Simple instruments and more sophisticated devices have thus been deployed for a long time, either in the bottom layer (on which this paper focuses) or in the water column. Even though they obviously have limits, bottom observatories appear to be potentially valuable devices in physical oceanography.
1. INTRODUCTION

This paper is a review article. As emphasized by the referees, it should therefore have provided the reader with a set of references on up-to-date technologies and figures of general interest. Nevertheless, because of the tremendous and rapid technological progress, the available references are relatively old and not of great interest, while space is too limited here to show an exhaustive set of figures. Therefore, instead of classical references and figures, we have thought it more valuable to provide the reader with some www addresses that will serve as starting points for more specific, up-to-date and well-illustrated information. Sampling the water column is usually done with instruments which are either ship-handled, left drifting, or set in place on moored supports. The first group generally allows a rather manageable description of the phenomena of interest, since the sampling strategy can be adjusted and modified conveniently. But ship mobility prevents one from obtaining fully synoptic data sets, and shiptime is expensive. Moreover, ships do not allow us to acquire long time series, and thus to monitor a specific area. This can be partly done with drifting (i.e. lagrangian [1,2]) instruments, as long as they remain in an area where they can be located. This area is the world ocean for instruments drifting at the surface (located from space), but it reduces in practice to a few hundred up to a few thousand kilometres for instruments drifting at intermediate depths (the sole location technique being acoustics). Such instruments are generally aimed at drifting within the same water mass, and are thus especially efficient in describing its circulation, not in specifying its temporal variations (the recorded variations being both spatial and temporal). In order to get sufficiently long time series directly related to equations written in an easily "legible" (i.e. eulerian [3]) form and suitable for analysis with the available techniques (statistics, correlations, spectral analysis, etc.), data sets must be recorded at fixed locations. Oceanographic instruments can be set in series on a line moored on the bottom (reaching the surface or not), or directly set on the bottom on more or less
sophisticated observatories. Integrating sound travel times (acoustic tomography) over relatively small or large distances provides information on average parameters, current or temperature, respectively. For global experiments (e.g. WOCE and TOGA [4,5]) as well as for regional ones (e.g. ELISA [6]), the in situ sampling strategy ideally combines both eulerian and lagrangian instruments. Some of these instruments have to be built by the research teams for their own use, but most of them are marketed and available everywhere. Most of the largest companies specializing in physical oceanography instrumentation present their products at well-known exhibitions (e.g. the Oceanology International Exhibition [7]) and are thus indexed by the organizing committees (information available upon request). Now, physical oceanography deals with (i) a state equation involving three parameters (temperature, salinity, pressure) which can be measured locally fairly easily, and the density (which is computed from them), and (ii) dynamical equations involving the density, the 3-D current (which can be measured both locally and remotely, over a few hundred metres on the vertical), and the external forcings. Space variables can either be specified easily with the required accuracy, especially for bottom-mounted observatories, or constitute the major sampling problem, especially for drifting devices. Owing to the relatively large time scales of the major oceanographic phenomena, the time variable is generally measured easily with sufficient accuracy; measuring time becomes a problem only with tomography, that is, when measuring slight differences in sound travel time over long distances. The major characteristics of the physical parameters, as well as the performances and specific problems of the physical sensors which could be operated with bottom observatories, are first presented in Section 2. Benefits and limits of bottom observatories, mainly with respect to simple mooring lines, are then analysed in Section 3. Finally, possible developments of bottom observatories are discussed in Section 4.
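To make the role of the state equation concrete, here is a minimal sketch of how density is derived from the measured parameters, assuming a linearized equation of state; the reference values and expansion coefficients are typical mid-range numbers chosen for illustration only, not the full operational formula.

```python
# Minimal sketch: linearized equation of state for seawater (illustration only;
# operational work uses the full UNESCO/EOS-80 polynomial, and pressure effects
# are ignored here). All coefficient values are assumptions.
RHO0, T0, S0 = 1027.0, 10.0, 35.0  # reference density (kg/m^3), temperature (degC), salinity (psu)
ALPHA = 1.7e-4                     # thermal expansion coefficient (1/degC), assumed
BETA = 7.6e-4                      # haline contraction coefficient (1/psu), assumed

def density(temperature, salinity):
    """Density from locally measured temperature and salinity."""
    return RHO0 * (1.0 - ALPHA * (temperature - T0) + BETA * (salinity - S0))

print(density(13.0, 38.4))  # Mediterranean-like deep water: ~1029 kg/m^3
```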
2. THE PHYSICAL OCEANOGRAPHY PARAMETERS AND SENSORS

2.1. Temperature
Being basically measured with a thermistor, temperature is certainly the state parameter that is the most easily sampled, at any place and for any duration. The major difficulty is to reach a sufficient accuracy, since the temperature gradients, which are relatively large on the vertical in the surface layers, can be very small and with no specific direction in the deep ocean. Indeed, the surface layers warmed up by the sun, either temporarily (in spring and summer at mid-latitudes) or permanently in some zones (the tropical ocean), become lighter and thus remain at the surface. Even though they are mixed over a few tens of metres by the surface agitation (mainly the waves), they come to be separated from the deeper layers by a thermocline where the vertical gradients are of the order of a few tenths of a °C per metre (and even more). Down to a few tens (respectively, hundreds) of metres, an accuracy/resolution of ~0.1 (respectively, 0.01) °C is thus generally sufficient and can be achieved without any sophisticated calibration. In some other places (the polar zones of the Atlantic Ocean, the northern parts of semi-enclosed seas such as the Mediterranean), and during severe meteorological conditions (wintertime cold and strong winds), surface waters are cooled and evaporated. They thus become denser, tend to sink, locally homogenize the whole water column, and then spread horizontally in the whole deep ocean/sea, driven by the horizontal pressure gradients and the Coriolis effect. Being no longer in contact with the atmosphere, dense water masses keep, for
years, decades and even centuries, the characteristics they acquired in their formation zones, which are basically different from zone to zone. In this way, they can be recognized more or less everywhere (i.e. far away from their formation zone) and classified. Note that, because of pressure, the temperature of a sinking homogeneous water mass increases (by a few hundredths to a few tenths of a °C per km, depending on salinity and pressure), so that physical oceanographers do not deal with in situ, but with potential (i.e. referred to zero pressure) temperatures. Now, while propagating, these dense waters adjust themselves on the vertical according to their relative densities. They also mix, even if only slightly, with the surrounding waters. Therefore, and except in the vicinity of topographic features such as a trench between two basins or a continental slope, gradients on both the vertical and the horizontal can be extremely low, and sensors must be able to give evidence of variations of a few 0.001 °C. Such sensors are sophisticated and must be adequately calibrated before and after experiments lasting several months or years (e.g. WOCE [4]). Acoustic tomography is the only technique able to measure slight temperature variations (related, for instance, to some general global warming) at a global scale and at intermediate layers; it has recently been used and validated at the scale of the Western Mediterranean Sea during the Thetis-2 MAST project. The temperature of the deepest layers can be as low as −2 °C (in the southern Atlantic Ocean) while it is ~13 °C in the Mediterranean Sea.
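As a minimal illustration of the in situ vs. potential temperature distinction just described, the sketch below removes the compressional warming using a constant adiabatic lapse rate; the rate is an assumed typical value (it actually varies with temperature, salinity and pressure, and exact conversions use dedicated oceanographic formulae).

```python
# Minimal sketch: in situ vs. potential temperature with a constant adiabatic
# lapse rate (assumed value; real conversions use the full formulae).
GAMMA = 0.00012  # degC per metre, i.e. ~0.12 degC/km (assumption)

def potential_temperature(t_insitu, depth_m):
    """Subtract the warming accumulated by compression over depth_m metres."""
    return t_insitu - GAMMA * depth_m

# A Mediterranean-like water mass measured at 4000 m depth:
print(potential_temperature(13.0, 4000.0))  # ~12.5 degC
```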
2.2. Salinity

Salinity is not measured directly but via a related parameter, the one universally used being conductivity. In a cylindrical cell, a primary circuit induces an electromagnetic field which, in turn, induces a current (proportional to the conductivity) in a secondary circuit. The salinity-conductivity relationship strongly depends on the temperature, which must therefore be measured simultaneously in a convenient way. Problems can arise in relatively heterogeneous conditions, due to the different response times of the temperature and conductivity sensors swept by moving waters. They are solved by setting the sensors in a tube where water is flushed at a constant rate by a pump during the measurements. Such problems are not expected in deep, relatively homogeneous and slowly moving waters. Another problem with conductivity measurements comes from turbidity. Indeed, organic particles produced in the whole water column, or inorganic ones originating from the coastal zones, tend to sink and accumulate in the deep basins, generally forming more or less compact muds. Mainly due to the current, and partly also to any topographic obstacle, these particles are more or less permanently resuspended. They thus form a bottom nepheloid layer a few metres up to a few tens of metres thick. These particles obviously penetrate into the conductivity cell and tend to settle on its inner sides, thus temporarily or continuously modifying the sensing volume and hence the measured conductivity. Such a cell must be cleaned, which can probably be achieved efficiently with the pump already mentioned. From an oceanographic point of view, the surface salinity is modified at a global scale by air-sea interactions, through the evaporation/precipitation processes. More locally, it depends on the proximity of inputs from rivers and melting ice sheets. Dense waters formed in the Mediterranean Sea thus have much higher salinities (more than 38 g of salt per litre, or practical salinity units, psu) than those formed in the Labrador or Weddell Seas (less than 35 psu). As for temperature, an accuracy/resolution of a few tenths down to hundredths of a psu is sufficient in the surface and intermediate layers (and easily achieved as long as the conductivity sensor cleaning is efficient). In the deeper layers (see WOCE [4]), significant information requires reaching a few thousandths of a psu, and dedicated calibrations.
2.3. Pressure
Pressure is measured with piezoelectric quartz sensors, which have an accuracy proportional to their measurement range. Setting aside the effect of waves (complex and limited to a depth of about half their wavelength; note that wave characteristics can be inferred from pressure sensors immersed close to the surface), the pressure is said to be hydrostatic, since it only depends on the mean height and density of the water column above the sensor. Pressure is needed to compute, together with in situ temperature and salinity, the in situ density of the water mass in which these parameters have been measured (i.e. locally); for this purpose, adequate basic sensors provide a sufficient accuracy. But pressure can also be considered as a remote parameter with which to try to specify the water height (and thus the sea level) and the mean density; sophisticated pressure tide gauges [8,9] reaching relative accuracies/resolutions of 10⁻⁸ (i.e. 0.1 mm at 10,000 m!) are now available and very easily managed. Separating height from density variations can be impossible. For instance, the seasonal warming of the surface layers in the Mediterranean Sea (by ~10 °C over a few tens of metres) decreases their density (by 1-2 kg/m³) and increases the sea level (by 10-20 cm), without leading to any pressure variation. Generally speaking, height and density effects on pressure variations could be separated according to the time scales of the variations and a knowledge of the environmental background. In practice, since mean (over a large depth interval) density variations are relatively small, pressure variations will be directly related to sea-level variations, especially at mesoscale, that is, at periods of hours to days and weeks. Phenomena of specific interest which could be monitored with sampling intervals of the order of 1 hour are tides, wind effects pushing waters against (respectively, away from) a coast and thus increasing (respectively, decreasing) the sea level, long waves (associated with phenomena such as El Niño), and eddies (such as the Gulf Stream rings); higher sampling rates could allow monitoring of phenomena such as seiches (natural oscillations of oceanic basins) and tsunamis.
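The Mediterranean example above can be checked with a line of hydrostatic arithmetic: if a surface layer of thickness H loses Δρ in density while the sea level rises by Δh, the bottom pressure is unchanged when ρΔh = ΔρH. The sketch below assumes illustrative values for H and Δρ within the ranges quoted in the text.

```python
# Minimal sketch: steric sea-level rise leaving bottom pressure unchanged.
# Hydrostatic balance: g * rho * dh = g * d_rho * H  =>  dh = d_rho * H / rho.
RHO = 1028.0   # kg/m^3, typical Mediterranean density (assumption)
H = 75.0       # m, thickness of the seasonally warmed layer (assumption)
D_RHO = 2.0    # kg/m^3, seasonal density decrease (from the range in the text)

dh = D_RHO * H / RHO
print(f"steric sea-level rise: {100 * dh:.0f} cm")  # ~15 cm, within the 10-20 cm quoted
```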
2.4. Currents

The largest amplitudes (a few m/s) are encountered in the open ocean within the core of the major currents (e.g. the Gulf Stream), within straits (e.g. Gibraltar), or around capes and between islands/rocks (there generally due to tides). In the abyssal domain, currents are relatively low (of the order of cm/s) but can be large under specific conditions (50 cm/s have been measured at ~2000 m in the Mediterranean Sea). Currents close to the bottom are almost horizontal, and can thus easily be measured (with an accuracy of a few cm/s) with any basic current meter [10,11,12]. Such instruments use mechanical sensors (for instance, a rotor for the speed and a vane for the direction), or involve acoustic elements (measuring travel times over pathways of a few decimetres), or measure electrical quantities (since saline water moving across a magnetic field induces potential differences in a perpendicular direction). Note that, because of friction, the current is zero just at the bottom, while it is sheared over a height of a few tens of metres (within what is called the Ekman bottom layer). Now, currents can also be measured remotely with relatively sophisticated instruments generally called Acoustic Doppler Current Profilers (ADCPs [13,14]). Basically, an ADCP measures the Doppler shift between an emitted frequency and the one echoed from suspended particles entrained by the current in a water layer; the shift is proportional to the radial velocity of the current. The analysis of the echoes at different time lags after the emission gives the radial velocity in cells at different distances from the source. Because of the overall horizontal stratification of the water column, and especially close to the bottom, the current is
expected to be quite homogeneous horizontally at a specific location, while it can be markedly different on the vertical. Therefore, orienting an antenna horizontally gives radial velocities that are significant but similar for all cells, while orienting it vertically gives zero radial velocities (even if they correspond to different currents in the various cells). A satisfactory compromise is found with an orientation of 20-30° with respect to the vertical and at least 3 (in practice 4, to allow redundancy) antennae regularly distributed around the vertical. The 3-D current profile classically associated with the x-y-z orthogonal axes can thus be computed from the radial velocities in a series of cells. The ADCP performances (accuracy, range) basically depend on the working frequency. They also depend on the various specifications which can be modified, such as the cell size, the number of pings over which averages are computed and the emission intensity, as well as on the density of adequate suspended particles. The layer thickness that can be sampled ranges from 400-500 m for a 75 kHz frequency to a few tens of metres for a ~600 kHz one. The cell size may have to be increased up to several metres (with low frequencies) but can be as small as a few decimetres (with high frequencies), while, in general, several tens up to a few hundreds of pings are averaged to give relatively accurate measurements. Accuracy/resolution is similar for all components (while, for instance, the vertical velocity cannot be measured with mechanical sensors). Enlarged battery packs generally allow, for instance, 1-h measurements during ~1 year for a long-range (i.e. 75 kHz) ADCP. The current field can also be defined with a more or less fine resolution in a box domain, either in the mid-ocean or within a channel, using acoustic tomography.
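A minimal sketch of the two computations just described follows, assuming an idealized four-beam "Janus" geometry with opposing beam pairs in the x-z and y-z planes; real instruments additionally correct for tilt, heading and the local sound speed, so this is illustrative only and all numerical values are assumptions.

```python
import numpy as np

# Minimal sketch of ADCP processing (idealized 4-beam "Janus" geometry).
C = 1500.0                 # m/s, nominal sound speed (assumption)
F0 = 75e3                  # Hz, emitted frequency (75 kHz long-range ADCP)
THETA = np.radians(25.0)   # beam tilt from the vertical, within the 20-30 deg compromise

def radial_velocity(doppler_shift_hz):
    # Two-way Doppler for backscatter: df = 2 * F0 * v_r / C  =>  v_r = C * df / (2 * F0)
    return C * doppler_shift_hz / (2.0 * F0)

def beams_to_xyz(b1, b2, b3, b4):
    """Radial (beam) velocities of one cell -> u, v, w; beams 1/2 oppose along x, 3/4 along y."""
    u = (b1 - b2) / (2.0 * np.sin(THETA))
    v = (b3 - b4) / (2.0 * np.sin(THETA))
    w = (b1 + b2 + b3 + b4) / (4.0 * np.cos(THETA))
    return u, v, w

print(radial_velocity(25.0))          # a 25 Hz shift -> 0.25 m/s along the beam
b = 0.5 * np.sin(THETA)               # beam signature of a 0.5 m/s current along x
print(beams_to_xyz(b, -b, 0.0, 0.0))  # -> (0.5, 0.0, 0.0)
```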
3. BOTTOM OBSERVATORIES: BENEFITS AND LIMITS

3.1. The hydrological sensors

Conductivity, temperature and pressure sensors are commonly set on basic instruments available on the market and known as CTDs (D standing for depth, even though depth cannot be measured directly) [15,16]. When operated from a ship, either yo-yoed (at ~1 m/s, ship stopped) or tow-yoed (yo-yoed at ~1 m/s while towed at a few m/s [17]), they have to sample more or less stratified layers, so that the various sensors must be flushed by a pump (especially when yo-yoed) while the response times must be correctly taken into account. Now, as long as CTDs can be operated with an electric cable, energy and memory are not limited and data can be recorded at high sampling rates (several tens of Hz). Also, sensors can be sophisticated (i.e. sensitive even if fragile), easily calibrated, and changed if necessary. There is no doubt that such CTDs give the most trustworthy data sets, even though they do not allow collecting time series at a fixed location. Long (weeks to months) time series can be collected with fully autonomous CTDs, also available on the market, set on moorings at any depth (in line between a float and a weight). But, since energy and memory are limited, they generally do not acquire data at sampling rates higher than a few times per hour. Sensors must be relatively reliable and robust, which generally means relatively coarse. In any case, calibration before and after deployment, and even in situ comparisons with ship-handled measurements, will allow possible drifts to be identified. Note that moorings, especially away from the bottom, are not stable enough supports to allow pressure measurements from which sea level and mean density variations could be inferred. Therefore, a bottom observatory offers several advantages. Essentially, energy and memory can be less restricted than for commercial autonomous CTDs. In order to clean the
conductivity cell, a relatively powerful pump (or another cleaning device) can be operated. Such a pump can also be operated more frequently to allow a higher sampling rate, as long as more memory is available (even though this might not be necessary, due to the overall homogeneity of the deep ocean). Since the observatory is accurately located and close to the bottom, in situ comparisons with ship-handled instruments are relatively easy and efficient for estimating sensor drifts. To be emphasized is the specific interest of comparing and combining the pressure measurements from deep bottom observatories with sea-level heights from satellite altimeters, even though the former have a maximum accuracy/resolution much better than the latter (a few hundredths of a cm vs. a few cm). For instance, the sea level being directly given by an altimeter will allow a pressure sensor to define the mean density (and mean temperature, in the case of global warming) of the world ocean, or even the thickness of a surface layer whose density could be estimated by other means (as in the case of El Niño, for instance). In this respect, specifying the tides in the middle of the ocean is relatively easy (series of periodic signals of constant amplitudes and phases); it will allow refining tidal models and thus will markedly improve the corrections currently applied to altimetric data sets, which is of major global interest. Except for the fact that bottom observatories only allow, for the time being, measurements at a few dm/m from the bottom, a limitation could arise from the turbulence they create in the current field, which will necessarily increase the resuspension of sedimented particles. This will occur whatever the location of the sensors (within the observatory and with respect to the current direction), and will mainly contribute to making the conductivity cell dirtier.

3.2. The dynamical sensors

Currents close to the bottom are surely disturbed, more or less, by any observatory; the larger the observatory, the stronger the perturbation of the current field. When considering also the shear in the Ekman bottom layer, it is clear that any purely physical study of bottom-layer dynamics will need a specific support. Even though such a support can be more than a single mooring line, and thus assimilated in some sense to a bottom observatory, it will necessarily have to disturb the current field as little as possible, which prevents the deployment of other instruments or devices there. Local current measurements from an observatory can thus only be aimed either at roughly defining the environmental characteristics in the study area, or at accurately specifying the small-scale current (i.e. a noise source) close to some sensitive sensor, such as a seismometer, for instance. Obviously, ADCPs can be set advantageously on bottom observatories. Apart from the few metres of the bottom layer where the current is more or less disturbed by the observatory, they give data sets of good quality. But ADCPs are equipped with complementary sensors, such as those measuring the tilt and the orientation of the instrument, so that good quality is also expected from instruments set on moorings. Ideally, both long-range (relatively coarse resolution) and short-range (sharp resolution) ADCPs should be mounted on a bottom observatory, which can also easily be done on a mooring.
The major advantage provided by an observatory is thus the additional amount of both energy and memory, which allows more energetic and/or more frequent pings and thus more accurate and/or longer data sets.
3.3. Relationships with other sensors and devices

In addition to the increased amount of energy and memory, an obvious advantage provided by a bottom station is the complementarity with sensors dedicated to other disciplines (as already mentioned in this and other papers in this publication). Also, the possibility of checking the functioning of the various instruments, through messengers or more classical acoustic systems, is certainly an interesting opportunity. Finally, the possibility of modifying the sampling rate of the physical sensors, based on events detected by other sensors and in order to analyse specific phenomena of non-physical-oceanography origin more finely, is an interesting option offered by a bottom observatory. Nevertheless, the complexity of acoustic propagation close to the bottom (due to both reflection and refraction) does not make bottom observatories very efficient for acoustic tomography.
4. POSSIBLE DEVELOPMENTS OF BOTTOM OBSERVATORIES
Obviously, more or less complicated (thus expensive and hardly realizable!) devices can be imagined on the basis of presently or soon-available technology. Local hydrological and dynamical sensors are probably already used at the maximum of their potential performance. But the technology of remote sensors such as the ADCP is progressing markedly, so that, in the forthcoming years, the range and accuracy/resolution of current measurements will certainly increase; in particular, ranges of a few thousand metres will probably be possible soon. A major limitation of present bottom observatories might be their inability to sample scalar parameters (hydrological, or similar ones such as oxygen, nutrients or suspended matter) over relatively large depth ranges. One could imagine that messengers, instead of being only carriers of information, could also be equipped with their own sensors (acting as CTDs) and thus obtain profiles over the whole water column while rising to the surface. Also, using at least 3 acoustic sources deployed in the observatory's surroundings for the whole experiment would permit locating the rising messengers, and thus obtaining vertical profiles of the horizontal current over the whole depth. Whatever the location of such observatories (i.e. in the coastal zone or in the mid-ocean) and whatever the sampling rate (in terms of profiles per day, week or month), it is clear that the technical capacity of satellite systems allows transmission of the relevant data sets in real time, and thus an efficient monitoring of a specific area at a relatively low cost (compared with ship and crew). Autonomous profiling devices equipped with hydrological sensors are commonly operated. Some are based on a string [10] of sensors (mainly temperature and conductivity), connected or not to a single recorder, and distributed along a line a few tens to a few hundreds of metres in length kept vertical by floats. Others are based on a single set of sensors, typically a CTD sometimes equipped with a current meter, yo-yoing along a cable which can be as long as a few thousand metres. Others are free-falling expendable probes (XBTs, XCTDs [18]) connected to a recorder on a ship through a very thin cable which breaks at some nominal depth. Obviously, the accuracy and sampling rates of these devices are variable, and can be more or less adequate according to the hydrological characteristics of the water masses in which they are operated. Nevertheless, such profiling devices (including inverted expendable ones, which could easily be developed) could also be operated from a bottom station, taking advantage of the benefits already mentioned (energy, memory, monitoring, checking), since their deployment from the observatory after the observatory is set, as well as their release from the observatory before the observatory is recovered (for both a string and a
yo-yo, in order to simplify the observatory manipulations) do not, in theory, raise major problems. To conclude, and even though most physical oceanography sensors are presently operated in a rather convenient way from simple moorings, it is clear that present bottom observatories offer interesting possibilities, mainly in terms of increasing the capacities of standard equipment, adapting the sampling rates to better describe phenomena (possibly of non-physical-oceanography origin), checking instrument functioning and sending messages. In the future, such observatories could be the least costly solution for conveniently monitoring the global ocean through a new generation of profiling sensors.
ACKNOWLEDGMENTS

I would like to warmly thank J.-L. Fuda for fruitful discussions about this manuscript and his overall valuable involvement in the GEOSTAR project.
REFERENCES
1. http://www.rsmas.miami.edu/groups/ldmg.html
2. http://rafos.gso.uri.edu/
3. http://stommel.tamu.edu/~baum/reid/book1/book/node5.html
4. http://www-ocean.tamu.edu/WOCE/uswoce.html
5. http://www.ncdc.noaa.gov/coare/
6. http://www.com.univ-mrs.fr/LOB/ELISA
7. http://www.spearhead.co.uk/
8. http://www.nbi.ac.uk/psmsl/sea_level.html
9. http://www.paroscientific.com/site_map.htm
10. http://www.aanderaa.com/
11. http://www.interoceansystems.com/index.htm
12. http://www.falmouth.com/
13. http://www.rdinstruments.com/
14. http://www.sontek.com/
15. http://www.seabird.com/
16. http://www.business1.com/genocean/
17. http://matisse.whoi.edu/seasoarintro.html
18. http://www.sippican.com/
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century
L. Beranzoli, P. Favali and G. Smriglio, editors.
© 2002 Elsevier Science B.V. All rights reserved.
Why global geomagnetism needs ocean-bottom observatories

Frank J. Lowes
Physics Department, University of Newcastle, Newcastle upon Tyne, NE3 3QE, United Kingdom

The main geomagnetic field is mostly large scale, coming from electric currents induced by motions in the fluid conducting core. But there is also a significant, mostly small-scale, contribution from the magnetization of near-surface crustal rocks, which acts as noise for surface observations. The main field is usually modelled by a spherical harmonic expansion, truncated at about degree n=10-13. However, there is a significant change of the core field with time, and this so-called secular variation is very poorly known and difficult to predict. Because of the crustal noise, the accurate measurement of the local secular variation needs long-term measurements at a fixed site, a magnetic observatory. Because of the large gaps in the distribution of surface geomagnetic observatories, the secular variation can at present be modelled, at best, to about degree n=8. Some of the gaps in the distribution could be filled by observatories at island sites, but there is a need for at least 8 sites in mid-ocean; in practice this means ocean-bottom observatories. Although satellite surveys will in future give good models of the field at a particular epoch, there will still be a need for a global network of surface stations to act as a baseline, and to better determine the instantaneous secular variation.
1. THE MAIN GEOMAGNETIC FIELD
Most people have come across the Earth's magnetic field simply as the source of the mysterious torque that makes a compass needle point north. In fact this torque comes only from the horizontal part of a magnetic field; the field is actually a three-dimensional vector which also points upwards or downwards, as well as roughly northwards. And the field is not confined to the Earth's surface, but extends inwards and outwards. To a crude approximation, the field is dipolar, like that of a bar magnet near the centre of the Earth (though at present tilted about 10° to the geographic axis) (Fig. 1). At the Earth's surface, near the equator the field is roughly horizontal, and of magnitude about 30,000 nT, while at the north and south magnetic poles it is vertical, and of magnitude about 60,000 nT; I will use 45,000 nT as a typical global average. However, the dipole nature of the field is only very approximate; if we remove the global best-fitting dipole (a vector subtraction), then at the surface we are still left with a field of magnitude about 15,000 nT, the so-called non-dipole field. This non-dipole part of the main field has a length scale of thousands of kilometres.
Figure 1. Schematic picture of the dipole field.
Figure 2. The main field (about 45,000 nT vector rms over the surface) is produced by electric currents in the core, while the crustal field (about 200 nT rms) comes from near-surface rocks (the schematic shows the crustal magnetization, the solid mantle, and the dynamo currents in the liquid conducting core).
This main field has its origin in motions in the fluid core, 3000 km below us (Fig. 2). Because of our large distance from the source, it is not surprising that what we see is of mainly long length scale. Note that, as far as the main field is concerned, the continents and oceans do not exist; the field does not "see" the Earth's surface, or the details on it; such details are added to diagrams only to help the reader locate features. In the early days it was sufficient just to produce a chart showing how the field varied over the Earth's surface. But now we are also interested in the field above and below the surface, and it is standard practice to use a mathematical model of the field. For convenience the modelling is in terms not of the magnetic field vector B itself, but of the corresponding magnetostatic potential, V, from which B is obtained as B = -grad V. It is conventional to express V as a sum of spherical harmonics, so that at any one time (and slightly simplifying)

$$V(r,\theta,\lambda) \;=\; a \sum_{n}\sum_{m} g_n^m \left(\frac{a}{r}\right)^{n+1} S_n^m(\theta,\lambda).$$
Here (r, θ, λ) are the usual spherical polar geocentric coordinates, a is the Earth's radius, and the g_n^m are numerical coefficients having the dimensions of magnetic field. The series of surface harmonics S_n^m(θ, λ) is the equivalent of a two-dimensional Fourier series, but one appropriate to a spherical surface. A relevant parameter is the degree n of the harmonic; this is essentially a spatial wavenumber, or spatial "frequency", with the corresponding minimum wavelength 2πa/n = 40,000/n km. Theoretically the series should go to n = ∞, but in practice we put an upper limit on n, corresponding to a lower limit on the length scale. This limit is not arbitrary, but arises because at shorter wavelengths there is a significant source of "noise" which we cannot avoid. Although most of the internal field comes from the core, there is also a small contribution from the top 20 km or so of the rocks of the continents and ocean floor (Fig. 2). This thin shell is weakly magnetic because of the ferromagnetic minerals in the rocks. Deeper than about 20 km the temperature is too high for the minerals to be ferromagnetic. In some cases these minerals were magnetized by the geomagnetic main field existing when they were formed, and in all cases they have induced magnetization because they are in the present main field. The overall contribution at the surface is a field of magnitude typically 200 nT. Most of this crustal field is on quite a small length scale, of the order of metres and kilometres; however, there is some contribution at long wavelengths. If we have measurements of the field only on or above the surface, there is no way we can separate this crustal field from the core field. However, if we look, as in Figure 3, at the spatial power spectrum of the field observed at the Earth's surface (e.g. [1]), we see a steeply falling part at low frequency/long wavelength, and a much flatter part at high frequency/short wavelength. The shallow slope of the high-frequency part shows that this part almost certainly comes from a very local source, and it is consistent with that source being in the crust; it would need sources of completely unrealistic magnitude if it were to come from the core. On the other hand, the steep slope of the low-frequency part is consistent with an origin in the core. Although we cannot rigorously prove this separation, it is reasonable to assign almost all of the long-wavelength part to electric currents in the core, and almost all of the short-wavelength part to the magnetization of crustal rocks. If we fit the spectrum by two straight lines, they intersect at about n=13, corresponding to a wavelength of about 3000 km. So, those of us interested in the main field coming from the core restrict our interest to spherical harmonic degrees n = 1-12 or 13; this means that to specify the field at a given time we need to determine 168 or 195 numerical coefficients.
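As a quick check of these coefficient counts and of the wavelength rule of thumb, here is a minimal bookkeeping sketch; the radius is the standard geomagnetic reference value (an assumption), everything else follows directly from the text.

```python
import math

# Minimal sketch: coefficient counting for a truncated spherical harmonic model.
# A degree-n harmonic carries 2n+1 coefficients (g_n^0..g_n^n and h_n^1..h_n^n),
# so truncating at degree N gives N*(N+2) coefficients in total.
A = 6371.2  # km, standard geomagnetic reference radius (assumption)

def n_coefficients(N):
    return sum(2 * n + 1 for n in range(1, N + 1))  # equals N*(N+2)

def min_wavelength_km(n):
    return 2 * math.pi * A / n  # the ~40,000/n km rule quoted in the text

print(n_coefficients(12), n_coefficients(13))  # -> 168 195, as in the text
print(round(min_wavelength_km(13)))            # -> ~3079 km, the n = 13 "knee"
```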
Of course, one man's noise can be another man's signal! For someone interested in the structure of continental plates and the history of their motion, or in mineral exploration, it is the crustal magnetic field which is the signal, and the core field which is the noise.

Figure 3. Spatial power spectrum: the mean square field at the surface, $R_n = (n+1)\sum_m \left[(g_n^m)^2 + (h_n^m)^2\right]$, coming from all the harmonics of degree n. (The plot shows a steep "Core" branch, about 45,000 nT in total, and a flat "Crust" branch, about 200 nT in total, against degree n = 1-20.)
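The two-straight-line fit mentioned above can also be sketched numerically. The amplitudes below are assumed from memory of the Langel & Estes fit [1] and are for illustration only; the point is simply that the crossover of the two branches lands near n = 13.

```python
import numpy as np

# Minimal sketch: where the steep "core" branch of the spectrum meets the flat
# "crust" branch. Both are modelled as geometric series in n, with amplitudes
# assumed (roughly the Langel & Estes [1] fit); illustration only.
n = np.arange(1, 24)
core = 1.349e9 * 0.270 ** n   # nT^2, falls steeply with degree
crust = 37.1 * 0.974 ** n     # nT^2, nearly flat

crossover = n[np.argmin(np.abs(np.log10(core) - np.log10(crust)))]
print(crossover)  # -> 14, i.e. the n = 13-14 "knee" quoted in the text
```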
2. TIME VARIATION OF THE FIELD
So much for the geomagnetic field at a given time. However, unlike almost all other geophysical phenomena of internal origin, the geomagnetic field is changing quite rapidly in time. This ∂B/∂t (itself a three-dimensional vector) has a magnitude of typically 70 nT/yr, corresponding to more than 1% in 10 years! Fig. 4 gives an example. And this ∂B/∂t is difficult to predict, as it is itself changing with time. So if we are going to produce an accurate model or map of the present field, we can only use recent observations. An observation made at a particular site 100 years ago does not add very much to our knowledge of what the field there is at present! This rate of change of the main field is called the secular variation ("secular" meaning slow, by comparison with the daily variations). It is lack of knowledge of this secular variation that is probably the greatest problem in global geomagnetism, both from the point of view of specifying the present field, and for theoreticians interested in processes in the fluid core.
Figure 4. Time variation of three orthogonal components of the field at a typical observatory (Tashkent, 1940-1980; Z about 45,000 nT, X about 25,000 nT, Y about 2,000 nT).

Figure 5 (below) shows schematically how the (long-wavelength) core field varies with position at Time 1. At the later Time 2, the secular variation (SV, the time variation of the core field) will have moved this line to the higher position. But superimposed on this is the (effectively constant, short-wavelength) crustal field, represented by the dotted line. So, if the observation point (the large dot in the figure) is moved between the two times, there could be a large error in the deduced secular variation.

Unfortunately, because the crustal magnetization produces a field which can vary very rapidly with position, for field measurements at non-identical sites this magnetization effectively introduces random noise, equivalent in magnitude to a few years' secular variation; see Fig. 5. Thus, it is very difficult to determine the secular variation from measurements which are scattered in space as well as in time. We could certainly estimate an average secular variation over several decades, but then we are smearing out the way in which the secular variation is changing in time. To measure the secular variation we need to have our magnetometers accurately fixed in the same position for a long time; that is, we need a magnetic observatory. Ideally this observatory needs instruments having stable baseline and calibration, and known temperature coefficients, mounted on physically stable pillars and sheltered by a non-magnetic building, in a location where man-made magnetic disturbance is, and is expected to remain, small, and it should be maintained for a hundred years!
Figure 5. How the crustal field confuses measurement of secular variation.
Figure 6. Present distribution of magnetic observatories; not all are fully operational (Hammer equal-area projection).
If we are going to determine the secular variation down to wavelengths of about 3000 km (corresponding to fitting harmonics up to n=13), we would need a network of at least 180 observatories, uniformly spaced over the surface. We do in fact have about 180 observatories, but they are anything but uniformly spaced (Fig. 6). We see that, for this purpose, they are too close together in the developed countries, very sparse in the developing countries, and, not surprisingly, essentially absent in the oceans!
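A back-of-envelope check of the 180-station figure, under the (assumed) Nyquist-style rule that resolving a 3000 km wavelength needs roughly one station per half-wavelength square of surface:

```python
import math

# Minimal sketch: how many uniformly spaced stations resolve a ~3000 km minimum
# wavelength? Assume one station per half-wavelength square; illustration only.
A = 6371.2                          # km, Earth radius
surface = 4 * math.pi * A ** 2      # ~5.1e8 km^2
spacing = 3000.0 / 2                # km, half the minimum wavelength (assumption)

print(round(surface / spacing ** 2))  # -> ~227, the same order as the ~180 quoted
```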
3. THE NEED FOR OCEANIC OBSERVATORIES

The lack of observatories in oceanic areas means that there are quite large areas of the surface where we cannot measure the secular variation. If you are navigating a boat in the South Pacific, you would be wise not to put too much trust in the statement of magnetic declination (the horizontal angle between magnetic north and true north), and of rate of change of declination, given on the chart! Such a lack of data does not significantly affect our knowledge of the field in, say, Europe, just as a lack of ocean current measurements in the Pacific does not affect our knowledge of the currents in the Mediterranean. But there are users (chart makers, if nobody else!) who do need information about the field in the Pacific. And there are users of our models of the main geomagnetic field who are concerned not only with the Earth's surface; theoreticians want to know the secular variation at the surface of the core, and space scientists want to know it in the ionosphere, 100 km above the surface, and out to the magnetosphere at 10 Earth radii. It is particularly in this context of downward and upward continuation that these large data gaps become important, and lead to significant degradation of our global models. Fig. 7, from a simulation by Langel et al. [2], shows quantitatively how using good data (±10 nT/yr) from the existing observatory network still leads to errors of magnitude 100 nT/yr in the South Pacific; if the 180 observatories had been uniformly distributed over the surface we would have expected errors of magnitude only about 8 nT/yr, with a maximum error of about 20 nT/yr; there would be only one contour on the chart! This large amplification of the errors shows up one major problem we have. If we limit our modelling to low degrees/long wavelengths, then our model effectively interpolates between our observation sites, and gaps are less important; for these long wavelengths the numerical coefficients are less affected by data gaps. But this limitation to long wavelengths means that we are ignoring a significant amount of field at shorter wavelengths. It is when we try to estimate the coefficients for increasingly shorter wavelength contributions that a data gap has more and more effect in increasing the error of these coefficients; it was in attempting to determine the n=10-13 coefficients that the very large errors of prediction were introduced in the South Pacific region. For these comparatively short wavelengths the other data points could not adequately constrain the values in the South Pacific. In fact, at present we do not try to model the instantaneous secular variation beyond n=8. Thus, it is clear that we desperately need to fill in the gaps in the current distribution of observatories. Langel et al. [2] found that, even when all possible island sites had been included, there was still a need for at least 8 ocean-bottom observatories, as shown in Fig. 8. Of course no ocean-bottom station is going to survive for 100 years, so some compromise is needed; but with a good, stable magnetometer, even a year or two will add another value of
Figure 7. Magnitude of the expected error in the vertical component of an n = 13 model of the secular variation, using existing observatories assumed to have ±10 nT/yr uncertainty. Contour interval 10 nT/yr (reproduced by permission from Langel et al., 1995).
Figure 8. The locations of existing observatories are shown as squares. The locations of possible new island observatories are shown as open circles, and the locations of the 8 ocean-bottom observatories which would still be needed to give a reasonably uniform coverage are shown by large filled circles (after Langel et al., 1995).
secular variation in a region where it will have a significant effect on improving our model at that time. Obviously, the longer the occupation, the more valuable the data will be. So far I have ignored the use of data obtained from satellite surveys. Back in 1980 we had 6 months of MAGSAT data, so that for the first time we had truly global coverage, and could feel reasonably confident that we had a good knowledge of the main geomagnetic field. But ever since then our knowledge has been getting worse again; the recent surface data are nowhere near adequate in themselves, and we could not update the 1980 model because we did not know the secular variation sufficiently accurately and in sufficient detail. We now have the Oersted satellite in orbit and, after some initial problems, it is now producing good data. So we should soon again have a good knowledge of the main field. Also, there are plans for various other magnetometer missions. However, these missions will be comparatively short-lived, so it will be quite difficult to determine an instantaneous secular variation from any one satellite. In addition there is the ever-present problem with satellite measurements that they come from an instrument which is moving quite fast in space. There are rapid time variations of the field due to time-varying sources in the ionosphere and magnetosphere, and it is quite difficult to separate the variation of the field with position from the variation of the field with time. We are not yet able to do this separation uniquely, and there is no doubt that measuring the time variations at a sufficient number of fixed ground stations is going to be necessary for the foreseeable future. So I can only emphasize that we in global geomagnetism desperately need ocean-bottom observatories, and I hope that GEOSTAR will be one step on the way to achieving this.
ACKNOWLEDGEMENT

Some of the diagrams in this paper were prepared using the Generic Mapping Tools produced by Wessel & Smith [3].

REFERENCES
1. R.A. Langel and R.H. Estes, Geophys. Res. Lett., 9 (1982) 250.
2. R.A. Langel, R.T. Baldwin and A.W. Green, J. Geomag. Geoelectr., 47 (1995) 475.
3. P. Wessel and W.H.F. Smith, EOS Trans. Amer. Geophys. Union, 79 (1998) 579.
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century
L. Beranzoli, P. Favali and G. Smriglio, editors.
© 2002 Elsevier Science B.V. All rights reserved.
Ocean-Bottom Seismology in the Third Millennium

J. Bialas a, E.R. Flueh a, J. Phipps Morgan a, K. Schleisiek b and G. Neuhaeuser b

a GEOMAR, Research Center for Marine Geosciences, Christian Albrechts University Kiel, Wischhofstr. 1-3, D-24148 Kiel, Germany
b SEND GmbH, Signal-Elektronik und Netz-Dienste, Stubbenhuk 10, D-20459 Hamburg, Germany
Marine seismic wide-angle data acquisition and earthquake seismology observations are on the verge of a quantum leap in data quality and density. Advances in micro-electronic technology facilitate the construction of instruments that enable large data volumes to be collected and that are small and cheap enough for large numbers to be built and operated economically. The main improvements are a dramatic decrease in power consumption (< 250 mW) and an increase in clock stability (< 0.05 ppm). Several scenarios for future experiments are discussed in this contribution.
1. INTRODUCTION
Seismic studies in the marine environment have a long tradition. During the last decades, acquisition of multichannel near-vertical reflection data has been developed to near-perfection by the hydrocarbon exploration industry. We will soon see specialized vessels towing up to 100 km of streamers for 3-D seismic investigations. The collection of supplementary seismic wide-angle data has historically been promoted by academia, and recent advances in recording technology have led to a major development of new instruments during the last decade. We are on the verge of experiments in which up to 100 ocean bottom instruments will be deployed simultaneously. Using airguns as sources, the data volume collected in marine work is generally much higher than in land work, at a much lower cost per data record. In contrast to active source seismology, marine earthquake seismology is not nearly as developed as land-based observation; it is comparable to what was done on land about 20 to 30 years ago. Several factors contribute to this situation, but it is primarily the lack of suitable technology that has led to the imbalance. Deployments of at least a few months are needed to study local seismicity. As convincingly shown by Hasegawa et al. [1], reliable epicenter locations in subduction zones can only be obtained by incorporating marine stations. At reasonable sampling rates, this requires instruments capable of storing a few GBytes of data. Recently, this class of marine instrument has become available and has been successfully used for local marine seismic networks [2,3]. Networks recording teleseismic events for local tomographic studies require deployment times ranging from at least six months to possibly years, which is currently beyond the
capacity of most existing instruments. Perhaps the best example of this type of experiment was the MELT study of the structure beneath the southern East Pacific Rise [4]. The scientific need for even longer-duration seafloor observatories has been formulated at several recent workshops. Seismic stations that meet this role are still at the edge of what is technologically possible. So far, only a few trial deployments have been made to develop and test seafloor observatory technology [5,6]. Besides power supply and data capacity, the coupling of the seismic sensor to the ground remains the most prohibitive obstacle to advances in seismic recording capability. ROVs or submersibles offer the optimum solution, though the costs involved are high. Alternatively, free-falling and gimballed instruments are easier to deploy, but the collected data are typically noisier and less broadband. The U.S. Ocean Seismic Network Pilot Experiment (OSNPE [7]), carried out in 1998 at ODP site 843B, used a simple but heavy mechanical frame to push the seismometer into the seabed, which had a profound effect on data quality. In this contribution, we summarize the scientific aims of future marine seismological work, present a new data logger that meets most of the requirements, and speculate about possible scenarios that may become operational in the first decade of the new millennium.
2. CURRENT TECHNOLOGY
For more than a decade the need to extend land-based broadband instrument arrays to the sea has been recognized [8], and in 1993 the International Ocean Network (ION) committee was formed to coordinate the effort towards broadband marine observatories. Several DSDP/ODP drillholes were used for test deployments [9,6] or drilled on purpose (site 396B; site 794; OSN1 site 843B; Leg 179; Leg 186). These experiments all showed that the ocean floor is a suitable environment for broadband observations, and that the technological challenges can be met. Data storage capacity has meanwhile become a relative non-problem due to advances in portable computer recording technology, but installation of sensors and data retrieval at a reasonable cost are still limited by technological issues. Currently, sensors can only be installed in the seafloor using a drill ship, a submersible or a ROV. Data retrieval is somewhat simplified when obsolete telecommunication cables can be used, as for the H2O observatory [10] or off Japan [11]. However, these cables are limited in number and are not at the optimum locations. New installations of purpose-designed fibre-optic cables are only practical close to the shore [12,13]. Free-falling instruments have been developed which, based on 1 Hz seismometers with electronic feedback, extend into the semi-broadband (50 sec) range [14]. These instruments, as well as those used for active seismic experiments, all use gimballed seismometers that rest on the seafloor. Their easy handling comes at the expense of poor coupling and a significantly increased noise level compared to buried sensors, as shown by many detailed tests [15]. Often they are complemented by an additional hydrophone or differential pressure gauge (DPG), which always provide excellent coupling but retain little information on particle motion. At GEOMAR we have created a pool of 40 ocean bottom hydrophones (OBH) and seismometers (OBS), mainly used for active seismic experiments including high-resolution work (up to 4 kHz, e.g. [16]). These low-power (1.5 W), high-data-capacity (4 GByte) instruments have also been used for local seismicity studies at the Chilean margin [3], and recently we have started to extend the application range into the broadband spectrum required for future teleseismic and regional seismicity studies. In a first step, a new recorder was developed, which is described below. We are now in the process of testing several sensor packages. The GEOMAR instruments [17,18] have a modular design, so they can be adapted
to the needs of a particular experiment. They cover the frequency range from 120 sec to 4 kHz, with different sensors available.
3. FUTURE INSTRUMENT NEEDS IN OCEAN BOTTOM SEISMOLOGY
As on land, marine seismic packages would be deployed for different purposes that have somewhat different instrument needs. One use is as four-component stations (a three-component seismometer plus a hydrophone) that "fill spatial gaps" caused by the uneven global distribution of land and island stations, which would improve the data coverage of global seismic studies. Stations used for this purpose would ideally have very broadband sensors (1-0.003 Hz) and continuous recording. In this case, they would be a seafloor "analogue" to instruments in the current land-based networks of digital seismic stations. For earthquake location studies, quasi-permanent stations are desirable. However, for global tomographic studies, 1-2 year deployments of mobile local arrays (discussed further in the next paragraph) may actually be preferable, since they can currently obtain significantly more information at a much lower cost than a permanently maintained seafloor or drillhole seismic station. In addition, tomographic studies do not require real-time data return, which allows the use of recording packages that are collected after a single 1-2 year deployment, greatly reducing the station costs since no telemetry is necessary. Since most of the accessible mantle and core ray paths to a given station location can be "illuminated" by a few years of natural seismicity, a longer deployment at a single site will typically gather less new information about the Earth's internal seismic structure than a redeployment at a different seafloor site. In addition, the use of mobile arrays in place of a single permanent station would allow beam-forming and stacking analyses that could improve resolution. Another use of marine stations is earthquake and nuclear test monitoring. For this purpose, real-time (or near-real-time) data recovery is necessary, which will require the development of either data telemetry or a method to quickly recover data from a field instrument, e.g. through messengers [5]. Stations are also envisaged as a component of seafloor observatories to monitor tectonic or volcanic activity in "natural laboratories". Here local seismicity would be a key target, ideally requiring an array of four-component seismic stations that can record microseismicity, which contains much higher frequency (~10 Hz) information than teleseismic arrivals. Telemetry from at least a few stations would be extremely desirable for monitoring purposes. Finally, we anticipate that a major future use will be PASSCAL-like arrays of marine seismometers used in portable 1-2 year deployments to study the regional crustal, lithospheric and upper mantle structure beneath specific regions, e.g. beneath an island arc or oceanic hotspot, or to study 50-200 km scale variations in upper mantle and asthenosphere structure associated with upwelling and melting at mid-ocean spreading centers and mantle plumes, plume-spreading center interaction, island arcs and back-arc spreading and volcanism. Many sites will meet both regional and global seismic objectives with 1-2 years of passive recording coupled with an active seismic experiment to constrain crustal and shallow lithospheric seismic structure. For these experiments telemetry is not necessary, so the required instrument can be much simpler, with lower deployment and recovery costs.
Figure 1. Photograph showing the complete GEOLON-MLS recorder housing all electronic parts including digitizer and clock.
At GEOMAR, we will focus on year-long deployments in array-seismic studies. We believe that this instrument is both feasible and relatively affordable with current microprocessor and sensor technology. A similar approach has recently been initiated in the U.S. to create the U.S. National Ocean Bottom Seismic Instrumentation Pool (OBSIP [19]).
4. GEOLON-MLS: THE LOW-POWER DATA LOGGER
In 1998 we decided to develop a new data logger that can easily be used for long-term deployments. This means that high data capacity and low power consumption were the two main requirements. In addition, we wanted to keep the modular design of our instrument fleet, so the physical dimensions of the new recorder were to be close to those of the METHUSALEM MBS recorder used for active source profiling [18]. The design of the new system (Fig. 1), which is optimized for long-term seismological investigations on the ocean bottom, led to some important modifications. Because of the smaller range of planned applications, flexibility and functional extent could be reduced for significant cost savings. On the other hand, the need for long-term stand-alone operation required the following optimizations:
• space and weight constraints for the batteries needed for 12-month deployments require much lower power consumption than that of the OBH recorder;
• the necessary clock stability and precision demand a better clock; the Seascan oscillator was selected, which limits the deviation to 1.5 sec max. over 1 year (< 50 ppb; a quick numerical check follows this list);
• to realize a data storage capacity of 6-12 GBytes, the new instrument was equipped with 12 PCMCIA slots; the availability of already announced PCMCIA elements with larger capacities will extend the overall capacity correspondingly in the near future;
• the new instrument should fit into the same pressure cylinder of 15 cm inner diameter as used for the METHUSALEM-MBS; it is thus mechanically compatible with the existing OBH/OBS equipment;
• it should be easy to use the recorder with different types of seismometers, hydrophones and pressure sensors; for this reason the analogue front-end was designed as an exchangeable, plug-replaceable unit that adapts to the particular interface requirements of the sensors used.
The key features of the GEOLON-MLS are summarized in Table 1.
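As a sanity check of the quoted clock specification (our own arithmetic, not a figure from the text), 50 ppb accumulated over a year is indeed of the order of the stated 1.5 s bound:

```python
# Worst-case clock drift implied by the quoted oscillator stability.
SECONDS_PER_YEAR = 365.25 * 86400   # about 3.16e7 s
stability = 0.05e-6                 # 0.05 ppm = 50 ppb
print(f"max drift: {stability * SECONDS_PER_YEAR:.2f} s/year")  # ~1.6 s/year
```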
5. FUNCTIONAL DESCRIPTION OF THE DATA LOGGER
5.1. Hardware
The analogue front-end has four input channels. Three channels are prepared for a three-component seismometer. The fourth channel is prepared for hydrophones or pressure sensors based on strain gauges. The input sensitivity of its low-noise preamplifier can be set via switches.

Table 1
Key features of the GEOLON-MLS
Analogue inputs:
  Seismometer: 3 channels, input sensitivity custom configurable
  Hydrophone (or pressure sensor): 1 channel with low-noise preamplifier (gain switch-selectable)
Time synchronization: DCF77-coded signals or single pulse
Internal time base drift: < 0.05 ppm, < 1.5 sec/year (from 0 °C to +30 °C)
Power supply:
  external: from 6.2 V to 16.5 V
  internal: 3 AA alkaline cells to ease handling during preparation
Power consumption: recording 230 mW to 250 mW; low-battery standby 100 mW
Storage medium: PCMCIA flash-disk / hard disk
Storage capacity: 12 PCMCIA slots type II or 6 PCMCIA slots type III (at present 12 GBytes)
Weight: 1.5 kg without batteries and PCMCIA storage modules
Table 2
Sample rates and resolution
Samples per second   f(-3 dB) (Hz)   Resolution (bits)   Signal-to-noise ratio (dB)
  1                      0.3             21                  120
  2                      0.7             21                  120
  5                      1.7             21                  120
 10                      3.3             20                  114
 20                      6.7             20                  110
 30                     10.0             19                  106
 50                     16.7             18                  100
100 *                   33               17                   96
200 *                   67               14                   78
* optional
5.2. Preparation for the measuring campaign
The instrument can be parameterized using an ASCII terminal via its RS232 interface. The high-precision oscillator is synchronized using DCF77-compatible pulses.
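A terminal session of this kind can also be scripted. The sketch below uses the pyserial library; the port name, baud rate and command strings are placeholders, since the actual GEOLON-MLS command syntax is not documented in this text.

```python
# Hypothetical parameterization session over the RS232 interface using
# pyserial. Port, baud rate and the ASCII commands are placeholders.
import serial

with serial.Serial("/dev/ttyS0", baudrate=9600, timeout=2) as port:
    port.write(b"SET SAMPLERATE 50\r\n")   # placeholder command
    port.write(b"SET CHANNELS 4\r\n")      # placeholder command
    reply = port.readline()                # read any acknowledgement
    print(reply.decode("ascii", errors="replace"))
```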
5.3. Data recording
After low-pass filtering, the signals of the four input channels are digitized using sigma-delta A/D converters. A final decimating sharp digital low-pass filter is realized in software by a digital signal processor. The effective signal resolution depends on the sample rate, as shown in Table 2. Finally, the samples are permanently stored on PCMCIA flash-disk or hard-disk memory modules. The first file on each PCMCIA card is a special system file containing all control, status and identification information for the current experiment and the particular PCMCIA card. The synchronization time from the beginning of the experiment and the skew time at the end, giving the time deviation over the experiment duration, are also documented in this file, as are all filter coefficients used and the filter delay. The remaining storage capacity is available for data files, each identified by an automatically generated name. A Huffman-coded delta modulation is used for data compression, resulting in 22-bit signals. Proper recording operation and correct parameter settings may be checked by visualizing the currently acquired signal on a selected channel via the RS232 interface. This is even possible on the ocean bottom in the deep sea, provided a modem and a suitable data link are available.
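The compression step can be illustrated with a minimal first-difference (delta) coder. The recorder's actual Huffman code tables are not described here, so the sketch omits the entropy-coding stage and only shows why the deltas of a seismic trace are cheap to encode:

```python
# Minimal sketch of delta modulation, the first stage of the recorder's
# Huffman-coded compression. Consecutive seismic samples differ little, so
# the deltas need far fewer bits than the raw samples; the Huffman stage
# (omitted here) then assigns short codes to the common small deltas.
import numpy as np

def delta_encode(samples: np.ndarray) -> np.ndarray:
    deltas = np.empty_like(samples)
    deltas[0] = samples[0]          # keep first sample as the reference
    deltas[1:] = np.diff(samples)
    return deltas

def delta_decode(deltas: np.ndarray) -> np.ndarray:
    return np.cumsum(deltas)        # exact, lossless reconstruction

rng = np.random.default_rng(0)
trace = np.cumsum(rng.integers(-200, 201, size=10_000)).astype(np.int64)
deltas = delta_encode(trace)
assert np.array_equal(delta_decode(deltas), trace)
print("bits needed for largest delta:", int(np.abs(deltas[1:]).max()).bit_length())
```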
5.4. Data analysis
The recorded data are retrieved by plugging the PCMCIA storage modules into a PC. Available standard software supports data transfer and decompression into 32-bit signals. The PASSCAL format is used as the standard data transfer format, allowing generation of the SEG-Y format as a basis for different individual analysis methods. An example of data collected during the maiden deployment of the GEOLON-MLS recorder in October 1999 offshore Costa Rica is shown in Fig. 2.
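Once SEG-Y files have been generated, any SEG-Y-capable toolkit can read them. The snippet below uses the later community package ObsPy (not the vendor software mentioned above), and the file name is a placeholder:

```python
# Reading and inspecting a SEG-Y file produced from the PASSCAL transfer
# format. ObsPy postdates this paper; "geolon_shots.segy" is a placeholder.
from obspy import read

stream = read("geolon_shots.segy", format="SEGY")     # one Trace per channel
print(stream)                                         # summary of the traces
stream.filter("bandpass", freqmin=3.0, freqmax=20.0)  # typical airgun band
stream.plot()
```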
6. FUTURE ISSUES
The cost of a large research vessel is now the largest cost component in future passive marine experiments. As ship costs are often "hidden" as part of the operating budget, there is often little concern over ship costs in a particular experiment, but instead a large problem with ship scheduling. Deploying a buried instrument-sensor will require a specialized marine vessel for the foreseeable future (to operate a ROV or a more specialized deployment tool), but instrument recovery should be made as simple as possible so that any ship of opportunity (e.g. a fishing boat where available) can potentially be used. Free-fall instruments should also be suitable for deployment by ships of opportunity; they should therefore be easy to prepare, relatively lightweight and easy to move. The development of a relatively portable "installation tool", or the ability to use a standard simple ROV for installation, would allow instrument deployment from a large fraction of the marine research fleet. Acoustic data telemetry to a moored surface transponder is now technologically possible. However, routine real-time data telemetry requires considerable power, beyond the capabilities of current battery technology. Perhaps fuel-cell technology will significantly improve the power-to-cost and power-to-weight ratio of high-power seafloor installations. The power could also come from the moored surface transponder (i.e. fuel cell, wave action, solar power, or all of the above). This type of surface-plus-seafloor deployment is obviously much more expensive than the deployment of simple stand-alone recording packages. At present this type of real-time seismic monitoring seems best suited to military applications (e.g. nuclear bomb test monitoring), where cost is a lower priority.
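To see why power, rather than storage, dominates the long-deployment budget, a back-of-envelope calculation suffices; the cell capacity used here is our own rough assumption, not a figure from the text:

```python
# Back-of-envelope battery budget for a 12-month deployment at the recording
# power quoted in Table 1. The D-cell capacity is an assumed round number.
RECORDING_POWER_W = 0.25
SECONDS_PER_YEAR = 365.25 * 86400
energy_wh = RECORDING_POWER_W * SECONDS_PER_YEAR / 3600.0   # ~2200 Wh
D_CELL_WH = 18.0                                            # assumed capacity
print(f"{energy_wh:.0f} Wh -> roughly {energy_wh / D_CELL_WH:.0f} alkaline D cells")
```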
Figure 2. Data example of marine airgun shots recorded during the first deployment of the GEOLON-MLS data logger.
ACKNOWLEDGEMENTS
The data shown in Figure 2 were collected during cruise SO144-1 off Costa Rica in October 1999. This cruise was funded by the German Ministry for Science and Technology (BMBF), and we especially thank master H. Papenhagen and his crew for their expert assistance during the cruise.
REFERENCES
1. A. Hasegawa, S. Horiuchi and N. Umino, J. Geophys. Res., 99-B11 (1994) 22295.
2. J. Lilvall and T. Francis, Geophys. J. RAS, 54 (1978) 721.
3. S. Husen, E. Kissling, E. Flueh and G. Asch, Geophys. J. Int., 138 (1999) 687.
4. MELT Science Team, Science, 1998.
5. L. Beranzoli, A. De Santis, G. Etiope, P. Favali, F. Frugoni, G. Smriglio, F. Gasparoni and A. Marigo, Phys. Earth Planet. Inter., 108 (1998) 175.
6. J.P. Montagner, J.F. Karczewski, B. Romanowicz, S. Bouaricha, P. Lognonne, G. Roult, E. Stutzmann, J.L. Thirot, J. Brion, B. Dole, D. Fouassier, J.C. Koenig, J. Savary, L. Floury, J. Dupond, A. Echerdour and H. Floch, Phys. Earth Planet. Inter., 84 (1994) 321.
7. http://msg.whoi.edu/osn/EOS/EOS_paperadditional_jul_23_99.html
8. A. Dziewonski and G.M. Purdy (eds.), Proc. Workshop on Broad-Band Downhole Seismometers in the Deep Ocean, Woods Hole Oceanogr. Inst., 1988.
9. K. Suyehiro, T. Kanazawa, N. Hirata, M. Shinohara and H. Kinoshita, Proc. ODP Sci. Results, 127/128 (1992) 1061.
10. R. Butler, A.D. Chave, F.K. Duennebier, D.R. Yoerger, R. Petitt, D. Harris, F.B. Wooding, A.D. Bowen, J. Bailey, J. Jolly, E. Hobart, J.A. Hildebrand and A.H. Dodeman, EOS, in press.
11. K. Hirata, M. Aoyagi, S. Morita, I. Fujisawa, H. Mikada, Y. Kaiho, R. Iwase, K. Kawaguchi, Y. Sugioka, K. Suyehiro and H. Kinoshita, EOS Trans. AGU, 80 (1999) 46.
12. F.K. Duennebier, D.N. Harris, J. Jolly and J. Babinec, EOS Trans. AGU, 80 (1999) 46.
13. www.neptune.washington.edu
14. R.S. Jacobson, L.M. Dorman, G.M. Purdy, A. Schultz and S. Solomon, EOS Trans. AGU, 72 (1991) 506.
15. T.W. Barash, C.G. Doll, J.A. Collins, G.H. Sutton and S.C. Solomon, Mar. Geophys. Res., 16 (1994) 347.
16. J. Mienert and P. Bryn, EOS, 78 (1997) 567.
17. E.R. Flueh and J. Bialas, Int. Underwater Systems Design, 18 (1996) 18.
18. J. Bialas and E.R. Flueh, Sea Technology, 40 (1999) 41.
19. http://www.obsip.org
Part II
Seafloor observatories: state of the art and ongoing experiments
Development of seismic real-time monitoring systems at subduction zones around the Japanese islands using decommissioned submarine cables
J. Kasahara
Earthquake Research Institute, University of Tokyo, 1-1-1 Yayoi, Bunkyo, Tokyo 113-0032, Japan
Two submarine OBS systems were developed using decommissioned submarine cables: the IZU OBS, using the GeO-TOC cable at the Izu-Bonin Trench, and the VENUS multidisciplinary station, using the GOGC cable at the Ryukyu Trench. The former system uses digital data acquisition but borrows much from traditional submarine cable technology. The latter system is more sophisticated than GeO-TOC and was deployed by manned and unmanned submersibles, deep-tow equipment, and cable ships. Many new technologies were developed for both OBSs. The IZU OBS has been operated for three and a half years and has contributed to improving hypocenter determination accuracy along the Izu-Bonin subduction zone. The VENUS system proved the usefulness of decommissioned submarine cables for multi-disciplinary measurements on the deep-sea floor.
1. INTRODUCTION
Several subduction zones surround the Japanese archipelago: the Kuril Trench, the Japan Trench, the Izu-Bonin Trench, the Nankai Trough, and the Ryukyu (Nansei-shoto) Trench. The Pacific plate and the Philippine Sea plate subduct beneath the Japanese archipelago at these sites. The subducting motion of the plates generates destructive earthquakes along the plate boundaries and in the forearc slopes. To minimize fatalities and damage to buildings caused by large earthquakes, it is important to study the nature of natural earthquakes and the seismic structure of subduction zones. Since 1965, we have developed two major kinds of Ocean Bottom Seismometer (OBS): the mobile OBS and the submarine cable OBS. The first system [1] has been used extensively for seismic surveys during the past several years. Real-time monitoring of seismicity, however, is also very important for understanding ongoing movement along the subducting plate boundary. To perform real-time monitoring of the subduction zones, we have developed two submarine cable OBS systems using decommissioned submarine cables. In this article, the author describes only the cable systems.
2. USE OF DECOMMISSIONED SUBMARINE CABLES FOR REAL-TIME SEISMIC MONITORING
Real-time seismic observation on the deep-sea floor is sought from the viewpoint of earthquake hazard. One of the best technologies for achieving real-time observations is the use of submarine cables, which have a long history of technological development and a proven field record in telecommunication. Although fiber-optic submarine cables employ very advanced and reliable technology, laying new fiber-optic submarine cables is extremely costly. Another kind of submarine cable is the coaxial cable, which can provide electrical power and real-time telemetry similar to fiber-optic cables. Due to the extremely rapid growth of fiber-optic technology and the huge demand for global telecommunication, a number of fiber-optic submarine cables with Giga- to Tera-bit capacity have been deployed. Although many submarine cable OBSs have been deployed around the Japanese coast during the past two decades [2], each system required a huge investment; constructing a similar system far from shore would therefore not be easy. For example, a submarine cable OBS at the Izu-Bonin arc would require much longer cables, and a new fiber-optic system is not practical. With the installation of fiber-optic systems, the rather old TPC-1 and TPC-2 coaxial submarine cables ended their long commercial service lives in 1990 and 1994, respectively. TPC-1 was the first Japan-US submarine cable, constructed in 1964, and TPC-2 was the second, constructed in 1976. By reusing such resources, real-time geophysical observatories on the deep-sea floor can be realized with high reliability and at reasonable cost [3,4]. The section of TPC-1 between Ninomiya, Japan and Guam Island was donated to the ERI (Earthquake Research Institute) and the Incorporated Research Institutions for Seismology (IRIS) in 1990, and the section of TPC-2 between Okinawa Island, Japan and Guam Island was donated to the ERI in 1996. Both cables cross geophysically important regions (Fig. 1). TPC-1 runs from Guam along the Mariana-Izu-Bonin Trenches to Sagami Bay. TPC-2 runs from the Mariana Trough across the mid-Philippine Sea plate to the Ryukyu Trench. These areas contain seismically very active subduction zones and a rifting backarc basin. There are two major projects for the scientific reuse of decommissioned submarine cables in Japan. The Geophysical and Oceanographical Trans-Ocean Cable (hereafter GeO-TOC) project uses the TPC-1 cable. The other is the Versatile Eco-Monitoring Network of the Undersea-Cable System (VENUS) project, which uses the TPC-2 cable (the Guam-Okinawa Geophysical Cable, hereafter GOGC). In the VENUS project, a multi-disciplinary observatory was installed at the Ryukyu Trench.
3. GeO-TOC SYSTEM
The length of the GeO-TOC submarine cable is 2659 km (Fig. 1). The GeO-TOC system has 74 repeaters [5], which contain dual vacuum-tube amplifiers to compensate for the gain loss caused by the submarine cables. The submarine cable is a 1-inch diameter coaxial cable. The power supply allowance of the TPC-1 system is 1940 V and 370 mA for instruments. If each station uses 30 W, more than 20 stations can be installed along the cable. TPC-1 had 138 voice channels during commercial use; if data telemetry uses several voice channels, there remains ample capacity for scientific use. A question arose over the remaining service life of the cable system, because the official life of the system was 25 years. However, because the design service life of submarine electronics is roughly more than 50 years and the
Figure 1. Cable routes of GeO-TOC and GOGC, and locations of the IZU and the VENUS OBS.
estimated lives of submarine cables are longer than the lives of the electronics, the remaining service life of the GeO-TOC system should be sufficient for scientific observations [6]. The IZU OBS [6,7] was designed around well-developed submarine cable technology, because this was the first trial in the world in which an old coaxial cable system was converted to scientific use. For this reason, the OBS package resembles a submarine cable repeater. The basic consideration was to reuse the old telephone equipment to minimize construction cost without losing data accuracy. To simplify the system, 4 kHz was adopted as the frequency width for data transmission, corresponding to the traditional analog telephone bandwidth. The electronics of the IZU OBS comprises sensors, A/D converters, control CPUs, a data telemetry unit, PSFs (Power Separation Filters), and DC-DC converters (Fig. 2). The PSFs separate the high-frequency signals from the DC high voltage. The DC-DC converters derive the DC voltages needed by the electronics from the high DC voltage. The OBS contains three orthogonal accelerometers (JAE-5V-IIIA, manufactured by Japan Aeronautic Electronics Co. Ltd.), a hydrophone (ITC-1010, manufactured by International Transducer Corp. Ltd.), an external quartz thermometer-pressure gauge (PaloScience 8B7000-15), and six thermometers to monitor the temperature of the electronics. Two 16-bit CPUs (Hitachi H-8) are used for data acquisition, data formatting, level changes of the modems, and rotation of the gimbals. Three delta-sigma 24-bit A/D converters (Crystal Semiconductor CS5322/5323) with a 125/62.5 Hz sampling rate are used for digitizing the accelerometer outputs. The resolution of the accelerometers is estimated to be 0.1 mgal over 0.1-62.5 Hz. The frequency response is flat over DC-62.5 Hz. The resolution of the quartz thermometer is 0.001 °C. Data from each accelerometer are transmitted to shore at 9600 bps through one phone channel by FDM (Frequency Division Multiplexing). Slow data such as temperature and pressure are multiplexed and transmitted to shore through another phone channel. In total, the IZU OBS uses only five phone channels for data (uplink) and two phone channels for downlinking system control commands. The hydrophone signals, however, are transmitted in analog form to avoid losing all data in case of a malfunction of the digital system. On January 13, 1997, the IZU OBS was deployed on the forearc slope of the Izu-Bonin Trench (31°N, 140°E; water depth 2,708 m), approximately 400 km south of Tokyo, using the cable ship M/V KDD Ocean Link (Fig. 1) [6]. The operation was carried out as follows. First, the cable midway between two repeaters of the GeO-TOC cable was hooked by a set of grapnels (a kind of anchor with three or four flukes) and lifted to the shipboard. The main cable was cut on board and the OBS unit was spliced into the main cable using a traditional cable repair technique. After verifying communication between the OBS and the shore through the actual submarine cable, the OBS was lowered into the sea through the drum cable engine and the stern chute of the ship. DC power of +4170 V and 370 mA has been fed to the GeO-TOC cable from the Guam station, using the same DC power supply as before; the voltage is 90 V higher than in the previous configuration. The other end of the submarine cable lands at Ninomiya. The FDM-modulated signals are separated into individual channels at Ninomiya and transferred to the ERI, Tokyo, over a commercial data line at 64 kbps.
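A quick budget (our own arithmetic; framing and timing overheads are not specified in the text) shows why one phone channel per accelerometer suffices at the lower sampling rate:

```python
# Uplink budget for one accelerometer channel of the IZU OBS: 24-bit samples
# at 62.5 Hz against the 9600 bps capacity of one FDM phone channel.
# Framing/timing overhead is not specified in the text and is ignored here.
bits_per_sample = 24
sample_rate_hz = 62.5
payload_bps = bits_per_sample * sample_rate_hz   # 1500 bps of raw samples
channel_bps = 9600
print(f"payload: {payload_bps:.0f} bps "
      f"({payload_bps / channel_bps:.0%} of one 9600 bps channel)")
```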
Figure 2. Block diagram of the IZU OBS sensors and electronics (PSF: Power Separation Filter; ARC: Acoustic Receiving Circuit; HB: hybrid circuit; FCC: Frequency Converter Circuit; MOD: modulator/demodulator).
The hydrophone data are digitized by a 16-bit A/D converter at Ninomiya. A GPS clock at Ninomiya provides timing for the data. Continuous data have been stored on a 3.5-inch MO disk every five days since the installation in January 1997. The typical size of one hour of data is approximately 4.5 MB when the ground is quiet. The IZU data have been processed regularly since March 1998 by the University Micro-Earthquake Network to determine hypocenters in Japan. The IZU OBS has provided seismic data to the University Micro-Earthquake Network as its southernmost station, and has contributed to improving hypocenter accuracy, especially in depth, for earthquakes occurring along the Izu-Bonin subduction zone [7]. Generally speaking, earthquake hypocenters along the Izu-Bonin Trench and its forearc slope are determined several tens of km shallower and 15 km further east when the IZU station is added to the land stations, because the nearest land seismic stations to IZU are at Aogashima (32°N, 139°E) and Iwo-jima (24°N, 141°E), and the distance between these two stations is of the order of 1000 km. The data have also been made available to IRIS and the Tokyo Metropolitan Seismic Network. In the three and a half years since deployment, the OBS unit, the main cables and the repeaters have experienced no major problems. There were some minor problems with shifting of the center frequency on the receiving unit, which were solved by adjusting the center frequency. Several earthquakes along the Japan Trench and Izu-Bonin Trench subduction zones are observed every day by the IZU OBS. One waveform example is shown in Fig. 3, a seismic record of a deep-focus event in the Izu-Bonin subducting slab.
Figure 3. Records of the deep-focus earthquake (at 11:16, January 15, 1999; 560 km depth). EW, NS, UD and hydrophone components from top to bottom. Record length is 2 minutes.
The noise spectrum is always of interest for OBS performance. The noise power spectra of the IZU OBS show high peaks at periods of 0.2-0.3 sec on the horizontal accelerometers, and similar power spectra for the vertical accelerometer and the hydrophone (Fig. 4). At periods shorter than 0.1 sec the spectra are similar to the HNM (High Noise Model), but the level of the 0.2-0.3 sec peak is a little lower than the HNM [8]. The noise level at 0.2-0.3 sec tends to be strongly influenced by the sea state: the IZU station is located in the middle of the typhoon corridor and also below the main stream of the Kuroshio Current, and these may affect the noise level even though the OBS is at a depth of 2700 m.
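Comparisons of this kind are commonly made by estimating a Welch PSD and overlaying Peterson's noise models. A minimal sketch follows, using SciPy and ObsPy (community tools, not the software used by the authors); the file name is a placeholder, and the instrument-response removal needed for an absolute comparison with the HNM is omitted.

```python
# Sketch: estimate the PSD of a recorded trace and fetch Peterson's high-noise
# model for comparison. "izu_example.mseed" is a placeholder file; converting
# counts to (m/s^2)^2/Hz (instrument response removal) is omitted, so only the
# spectral shape, not the absolute level, is comparable with the HNM.
import numpy as np
from scipy.signal import welch
from obspy import read
from obspy.signal.spectral_estimation import get_nhnm

trace = read("izu_example.mseed")[0]
freqs, pxx = welch(trace.data.astype(float),
                   fs=trace.stats.sampling_rate, nperseg=4096)
peak_period = 1.0 / freqs[1:][np.argmax(pxx[1:])]
print(f"PSD peak near {peak_period:.2f} s period")

hnm_periods, hnm_db = get_nhnm()   # periods (s), PSD in dB re (m/s^2)^2/Hz
```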
4. VENUS-GOGC SYSTEM
In contrast to the GeO-TOC OBS, the VENUS project (1995-1999) set out to develop new technologies for using decommissioned submarine cables for environmental measurements on the ocean bottom. A multi-disciplinary observatory using the GOGC cable [7,9] was installed at a depth of 2200 m on the forearc slope of the Ryukyu Trench, 50 km from the main island of Okinawa, in the fall of 1999 (Fig. 1). The objective of the VENUS project [9] is to develop a multi-disciplinary station to study deep-sea environmental changes due to subduction of the Philippine Sea plate at the Ryukyu Trench.
Figure 4. Noise power spectral density for the IZU OBS (PSD in (m/s²)²/Hz for the accelerometers and Pa²/Hz for the hydrophone). (a) EW component of the accelerometers. (b) Vertical component of the accelerometers. (c) Hydrophone channel. The absolute noise level of the hydrophone below 4 Hz may be underestimated due to the response of the hydrophone and the analog data transmission.
Nine Japanese institutions jointly worked on this project. The cable length of the GOGC is 2400 km. The system uses 1.5-inch diameter coaxial cables. The former TPC-2 system had 845 voice channels. Although +1080 V DC from Okinawa and -1080 V DC from Guam were supplied to the cables at a constant current during commercial use [10], the electric power supply in the VENUS project was modified to a single supply from Okinawa. The bottom system comprises seven bottom sensor units, a bottom telemetry system, and the main coaxial cables. The land system comprises a shore station and a data center. The total power dissipation of the bottom units is approximately 53.5 W. To minimize corrosion during the long observation period, all pressure cases and the major frame parts of the bottom units were made of titanium and plastics; any interfaces between titanium and stainless steel were protected by plastic insulators. The sensor units comprise ocean bottom broadband seismometers (OBBS), a tsunami pressure sensor, a hydrophone array, a multi-sensor unit, geodetic instruments, geoelectric-geomagnetic instruments, and a mobile unit (Fig. 5) [9,11]. Seismic measurement uses Guralp CMG-1T triaxial broadband seismometers with gimbals; the seismometers respond to periods between 300 and 0.05 sec. The OBBS outputs are digitized at 24 bits and 100 Hz. The tsunami gauge uses a quartz pressure sensor, and its resolution for sea-level change is 0.5 mm. The multi-sensor unit comprises short-period seismometers, a hydrophone, a digital still camera, a CTD, a current meter, a nephelometer, and sub-bottom temperature probes. The hydrophone array is composed of five hydrophones with a 700 m spacing, due to some limitations on the budget [12]; sixteen-bit data are transmitted to shore. Geodetic changes are determined acoustically by precise baseline measurements between two transponders. Three units are placed in a triangular formation, and the distance between two units is approximately 1 km. The estimated accuracy of the geodetic measurements will be a few centimeters per year, which may be smaller than the expected precursory crustal deformation near the trench if an M7-class earthquake occurs just beneath the site [13]. The geoelectric-geomagnetic unit comprises a proton magnetometer, flux-gate magnetometers, and orthogonal geo-potentiometers; the geo-potential measurement line is 10 m long. The mobile unit consists of an acoustic communication unit and a remote instrument. A separate report describes the details of each sensor [11]. The bottom telemetry system comprises a data-coupling unit, a data-telemetry unit and a junction box. The data-coupling and data-telemetry units generate 24 V DC from the 3,000 V supply, separating the high-voltage DC component from the high-frequency carriers and re-mixing the high-frequency carrier with the DC component. The data-telemetry unit multiplexes the data and sends them to shore using a 240-kHz carrier bandwidth. The transmission rate for the multiplexed data is 96 kbps; each instrument, however, uses a particular transmission rate, e.g. 19.2 kbps for the OBBS. The hydrophone data use another 240-kHz bandwidth. If an instrument does not operate correctly, users can shut it down remotely from the land base. The junction box has nine so-called ROV (remotely operated vehicle) undersea-mateable connectors, which allow units to be plugged in and unplugged on the ocean floor with the assistance of a manned submersible or an ROV.
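A quick consistency check of the quoted rates (our own arithmetic; frame overhead is not specified in the text):

```python
# Does the broadband seismometer fit its quoted 19.2 kbps slot of the 96 kbps
# multiplex? Three components, 24-bit samples, 100 Hz; framing overhead is
# not specified in the text and is ignored here.
obbs_payload_bps = 3 * 24 * 100   # 7200 bps
slot_bps = 19_200
print(f"OBBS payload: {obbs_payload_bps} bps "
      f"({obbs_payload_bps / slot_bps:.0%} of its 19.2 kbps slot)")
```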
The route of the GOGC cable on the ocean floor, with a thin sediment cover, was identified by the deep-tow camera of the R/V Yokosuka in February 1998. In March 1998, the submersible Shinkai 6500 (Dive #411/YK98-02-Leg3) cut the GOGC at 25°N and 128°E at a water depth of 2,200 m using a newly developed cable cutter. Three major legs were carried out by the M/S Kuroshio-Maru, the R/V Kaiyo, and the R/V Kairei/ROV Kaiko for deployment of the telemetry system, deployment of instruments, and extension-connection of cables, respectively. Instruments were confined within an approximately 1-km radius around the junction box (Fig. 5). The data-coupling unit was spliced into the main cable on the deck of the M/S Kuroshio-Maru in August 1999. The M/S Kuroshio-Maru installed the bottom telemetry system with the tsunami sensor and the hydrophone array on the ocean bottom. The location of the telemetry system is 25°N and 128°E at a water depth of 2157 m. The deep-tow equipment of the R/V Kaiyo installed five instrument units on the ocean bottom in September and October 1999. The ROV Kaiko connected nine ROV connectors of the cable end to connectors on the junction box on the ocean bottom in October 1999. The OBBS was placed 80 m north of the bottom telemetry system at a water depth of 2154 m. Figure 6 shows the bottom telemetry system at 2200 m depth. The OBBS was not buried in the sediment, as no appropriate equipment was available. Instead, 80-kg weights were placed on the OBBS frame, but this seems to be of no help in reducing the low-frequency noise [11].
Figure 5. Instruments and telemetry system configuration at the VENUS multi-disciplinary observatory.
A shore station is located in Okinawa. Some shore equipment was inherited from the previous station used by the TPC-2 system. The shore receiving unit demodulates the signals and sends them to Yokosuka, Japan, using two 64-kbps lines. It also supplies 3100 V to the cable. Data from the ocean bottom instruments are stored at the Japan Marine Science and Technology Center (JAMSTEC). Scientists can retrieve their own data and also communicate with a particular instrument using a data terminal at JAMSTEC. A number of earthquakes have been observed, including two large events: the southern California earthquake (Ms = 7.3) of October 16 and the Taiwan earthquake (Ms = 6.1) of November 1, 1999 [11]. However, the noise level of the broadband seismometers appears extremely large (about 5 µm/s at 300 sec for the horizontal components) due to strong influence from infra-gravity waves and ocean currents, because the OBBS was not buried in the ocean-bottom sediments [8].
5. CONCLUSIONS AND FUTURE PLANS
The two projects (GeO-TOC and VENUS) use decommissioned submarine cables for real-time geophysical monitoring on the deep-sea floor. Data from the ocean bottom can be monitored in the laboratory. Three and a half years of operation of the GeO-TOC IZU OBS confirm the usefulness and reliability of the system. During this operation, the system has served to improve the hypocenter accuracy of the University Micro-Earthquake Network as its southernmost station. The OBBS records of VENUS show the necessity of installing seismometers in sediments to reduce noise [e.g. 8]. The submarine cable between Okinawa and Ninomiya (the Okinawa Cable) will be used to monitor future large earthquakes along the Nankai Trough.
Figure 6. Photograph of the bottom telemetry system at 2200 m depth. Deep-sea ROV connectors can be seen.
ACKNOWLEDGEMENTS
The Ministry of Education, Science, Culture and Sports, Japan, supports the GeO-TOC project. The Science and Technology Agency, Japan, supports the VENUS project. The author thanks JAMSTEC for its great assistance in operating research boats and submersible vehicles. The author also thanks ERI for permission to use the GeO-TOC and GOGC cables for seismological research.
REFERENCES
1. J. Kasahara, T. Matsubara, T. Sato and K. Mochizuki, J. Mar. Acous. Soc. Jpn., 24 (1997) 39.
2. H. Kinoshita, in: Proceedings of the International Workshop on Scientific Use of Submarine Cables, Okinawa, 1997, 119.
3. S. Nagumo and D.A. Walker, EOS Trans. AGU, 70 (1989) 673.
4. J. Kasahara, H. Utada and H. Kinoshita, J. Phys. Earth, 43 (1995) 619.
5. KDD Technical Jour., 42 (1964) 1.
6. J. Kasahara, H. Utada, T. Sato and H. Kinoshita, Phys. Earth Planet. Inter., 108 (1998) 113.
7. J. Kasahara, T. Sato, H. Momma and Y. Sirasaki, Earth Planets Space, 50 (1998) 913.
8. S.C. Webb, Rev. Geophys., 36 (1998) 105.
9. J. Kasahara, Y. Shirasaki and H. Momma, IEEE J. Ocean Engin., 25 (2000) 111.
10. KDD Technical Jour., 88 (1976) 3.
11. J. Kasahara (ed.), Final report on the VENUS project, ERI, 2000.
12. K. Watanabe, in: J. Kasahara (ed.), Final report on the VENUS project, ERI, 17 (2000).
13. Y. Nagaya, in: J. Kasahara (ed.), Final report on the VENUS project, ERI, 148 (2000).
Geophysical Ocean Bottom Observatories or Temporary Portable Networks?
J-P. Montagner a, J-F. Karczewski a, E. Stutzmann a, G. Roult a, W. Crawford a, P. Lognonné a, L. Béguery a, S. Cacho a, J-C. Koenig a, J. Savary a, B. Romanowicz b and D. Stakes c
a Seismological Laboratory, URA CNRS 195, Institut de Physique du Globe, Paris, France
b Seismological Laboratory, U.C. Berkeley, Berkeley, California, U.S.A.
c MBARI, Moss Landing, CA 95039-0628, California, U.S.A.
Most scientific issues are multiscale, in space as well as in time. There is a general consensus that it is necessary to install geophysical stations in the oceans, which are presently instrumental deserts. In this paper, we review the general constraints on a network of ocean bottom stations that should enable us to address these scientific issues by covering the different spatial scales, from global down to local. The recent progress made by Japanese, French, German, Italian and U.S. groups shows that the technical challenge of installing long-term, or even better permanent, geophysical ocean bottom observatories (coined GOBOs) is not out of reach. Different technological developments are presently being explored and prefigure the future geophysical ocean bottom stations and networks. They integrate the concept of the multiparameter station, which is demonstrated to have great scientific interest. The installation of different kinds of sensors at the same place in a seismic station enhances the signal-to-noise ratio and opens wide the possibility of new discoveries; the finding of the excitation of normal modes in the absence of large earthquakes is emblematic in that respect. Such a multiparameter oceanic observatory includes at least broad band seismometers, microbarometers or pressure gauges, microthermometers, and possibly other sensors (e.g. electromagnetic sensors, strain meters, GPS). The design of the complete acquisition chain, from the sensor to the distribution of data, will imply the integration of all the technical progress made in micromechanics, electronics, computer science, space science and telecommunication systems. There is also a real need for developing seismological, and more generally geophysical, arrays enabling us to address scientific issues at regional and local scales, particularly for understanding active processes in seismic and volcanic areas. The networks at all scales must be coordinated in order to constitute a hierarchical, multiscale network, which will be the basic tool for addressing scientific issues in the geosciences. However, the strategy for developing reference GOBOs and temporary stations is not necessarily the same. The technological developments depend largely on the period of operation of the station: a
long-term geophysical observatory is much more difficult to install and maintain than a time-limited station, due to the problems of power and data transmission. For example, one solution for ensuring long-term operation is the installation of cables. The cables can be either laid down on the seafloor between the station and the shore, or installed vertically in connection with a surface buoy, to ensure the link between sea floor and sea surface, where data can be teletransmitted and power can be provided by solar panels, diesel generators or windmills. Such a "heavy" observatory necessitates the use of a manned submersible, ROV or AUV. On the other hand, the installation of typically one hundred temporary "light" stations must be performed by ordinary oceanographic vessels using simple dropping procedures. The design of both kinds of stations will be detailed. Whereas a GOBO should follow the multiparameter concept, a temporary station must, for practical reasons, be dedicated to a single parameter measurement. All the efforts necessary to achieve and maintain such a multiscale, multiparameter network represent a formidable technological challenge for the next decade.
1. INTRODUCTION
The last twenty years have seen the explosion of a new kind of seismology: broad band seismology. The Federation of Digital Seismographic Networks (FDSN) played a key role in promoting this new seismology and in coordinating the projects of several countries, by proposing standards for the sensors and a data distribution format, and by avoiding the duplication of national efforts [1]. The same philosophy is followed by the geomagnetism community, which has launched the InterMagnet program. However, in spite of these international efforts, the global coverage of the Earth by digital seismic stations of global networks such as GEOSCOPE, IRIS/GSN and GEOFON, and regional networks (e.g. MedNet, CSN, CNSN, POSEIDON), is still very uneven. Most of the stations are located on continents, primarily in the Northern Hemisphere. Therefore, a large part of the oceanic areas (2/3 of the surface of the Earth) is devoid of seismic and geophysical instrumentation. At the same time, the broad band revolution has also reached local networks and portable instrument arrays (e.g. PASSCAL in the U.S.A., SKIPPY in Australia, GEOFON in Germany, RLBM in France). Such portable networks make it possible to investigate geodynamic processes at regional scales (around 100 km), such as continental roots, tectonic processes, and the deep origin of plumes. The lateral resolution of tomographic models, and of detailed studies of active processes (e.g. earthquakes, volcanic activity of plumes and ridges, landslides), is primarily limited by the poor station coverage of the oceanic areas. Investigating such processes with the present coverage is like looking at an object from only one side, or being blind in one eye. Consequently, a recommendation of recent prospective workshops (see for example the proceedings of the recent ION/ODP workshop [2]) was to promote the installation of long-term oceanic seismographic stations or, even better, geophysical stations and observatories. However, this is a very difficult task, due firstly to the hostile environmental conditions prevailing at the bottom of the ocean, and secondly to the difficulty of correctly installing stations
on the sea floor, maintaining stable long-term observations, retrieving data in real time, and supplying power for a long period of time. Due to the high cost of such observatories, the geosciences community has realized that they have to be multidisciplinary. These multiparameter geophysical ocean bottom observatories (hereafter referred to as GOBOs) are interesting not only from a financial point of view but also from a scientific point of view. Beauduin et al. [3] and Roult and Crawford [4] demonstrated that the co-location of broad band seismometers and microbarometers makes it possible to improve the signal-to-noise ratio at land stations, and the concept of the multiparameter station is valid for land stations as well as for ocean bottom observatories. The international organization International Ocean Network (ION) was launched in 1993 to coordinate international efforts in the design, site selection and installation of ocean bottom observatories [5]. The same effort must be made for portable ocean bottom stations. In this paper, we review recent developments regarding instrumentation (sensors, pilot experiments on the sea floor), multiparameter stations, and data teletransmission. The relationship between technological developments and the duration of operation (short-term experiments, semi-permanent and permanent stations) will be discussed. We highlight the French contribution to these international efforts.
2. TOWARDS AN INTERNATIONAL OCEAN NETWORK
A uniform coverage of the Earth with geophysical observatories at different scales is particularly important for understanding the Earth's dynamics. Different spatial scales can be considered: global scale (characteristic station spacing around 2,000 km), regional scale (typical dimensions 1,000 km, spacing of the order of 100 km), and local scale (dimensions smaller than 100 km). For the global scale, most emerged lands are now covered by broad band stations (Fig. 1). The site locations are coordinated through the FDSN [1]. Its initial goal was to obtain the most uniform coverage of the Earth possible, with a station spacing of around 2,000 km, corresponding to about 100 stations. This goal has largely been surpassed, and more than 200 stations are now part of the Federation network. This network includes all stations of the global networks (GEOSCOPE, GEOFON, IRIS/GSN) and selected stations of regional networks such as the Chinese Seismograph Network (CSN), the Canadian National Seismograph Network (CNSN), the Mediterranean Network (MedNet, Italy), POSEIDON (Japan), and the Australian National Seismograph Network (ANSN). Though most continents and emerged lands are adequately covered, the station coverage is still very uneven and dramatically biased towards the continents, particularly in the northern hemisphere. The same problem exists for geomagnetic stations. A large part of the oceanic areas, particularly in the southern hemisphere, is devoid of instruments. From the scientific point of view, this means strong aliasing of tomographic models and the impossibility of correctly investigating active processes occurring in these areas.
Figure 1. Broad band digital stations.
With the present station coverage, the lateral resolution of global tomographic models is limited to about 1000 km. For source studies, the azimuthal coverage of seismic sources is also very uneven. The different scientific issues regarding global studies and active processes were extensively discussed at the ION/ODP workshop [2] held in Marseilles in January 1995. Even if all islands were instrumented, large parts of the oceans would remain unsampled, particularly in the Pacific Ocean and in the Indian Ocean.
2.1. Pilot experiments on the ocean floor
The installation of a network of GOBOs represents a "formidable" technological challenge, and several pilot experiments have been carried out to unravel the different technical issues. Several groups in Japan, France and the U.S.A. have performed preliminary experiments focussed on the goal of installing permanent seismic stations. In March 1991, a downhole set of CMG3 broad band seismometers was successfully placed in ODP hole 843B in the Japan Sea but not recovered [6]; teleseismic events were recorded and broad band seismic noise spectra (0.03 s - 200 s) were obtained [7].
Figure 2. Sketch of the OFM/SISMOBS experiment [8].
2.1.1. SISMOBS/OFM
In May 1992, the French pilot experiment OFM/SISMOBS was successfully conducted: two sets of CMG3 broad band seismometers were installed, operated for more than one week, and recovered [8]. The experiment (Fig. 2) took place in the North Atlantic Ocean at 23°N and 43.3°W, at the location of DSDP hole 396B. A first set of CMG3 seismometers (called OFM) was installed on the sea floor 20 m from the hole and was semi-buried within the sediments. A second set of CMG3 seismometers (named OFP) was installed in the hole, 296 m below the ocean bottom. After the installation of both sets of seismometers, seismic signals were recorded continuously for 8 days for OFM and 5 days for OFP, at a sampling rate of 5 samples per second. The instrumentation tools were designed by a team of the Technical Division of INSU, now integrated into IPG-Saint-Maur. The experiment required the simultaneous use of the oceanographic vessel NADIR, the submersible NAUTILE, and the re-entry logging system NADIA. All logistical support was provided by IFREMER. From a technological point of view, this experiment was a complete success. The most important scientific results are the following [9]: the seismic noise is smaller in the period range 4-30 s for both OFM (sea floor seismometers) and OFP (downhole seismometers) than at a typical broad band continental station such as SSB (France, GEOSCOPE); more importantly, the noise is smaller than the noise at SSB up to 600 s for
OFM. This low level of seismic noise implies that the detection threshold for earthquakes is very low; it was possible to correctly record teleseismic earthquakes of magnitude as small as 5.2 at an epicentral distance of 105° (Fig. 3). Another important qualitative result is that the noise level tends to decrease with time for both OFM and OFP [10]. The amplitude of the noise decreases systematically and rapidly for OFP at long periods (T > 50 s). For OFM, there is some tendency for the noise to decrease with time, but its variations are more erratic and could correspond to the normal noise variations at a seismic station. The noise level decrease for OFP can be approximated by an exponential, but the asymptotic level corresponding to t → ∞ is still larger for OFP than for OFM. However, the duration of operation for OFP (5 days) was too short to obtain an accurate estimate of the exponential decay; the equilibrium stage was not yet attained by the end of the experiment. Therefore, the key issue of whether it is important to install seismometers down boreholes or on the sea floor remained unresolved. This experiment demonstrated that a broad band seismometer carefully installed on the sea floor and semi-buried can present an excellent signal-to-noise ratio and provide useful seismic data.
Figure 3. An example of an earthquake recorded on both OFM and OFP vertical components (Hokkaido, Japan, May 7, 1992, 06h23'56.1", 41.175°N, 144.7°E, depth 15 km, mb = 5.8; bandpass filter 10-50 mHz).
2.1.2. MOISE: Monterey Bay Ocean bottom International Seismic Experiment
Following the international ION workshop in Marseilles [2], a cooperative multiparameter project between IPG (Paris), UBO (Brest), MBARI (Monterey, California) and U.C. Berkeley was launched to test the feasibility of installing, operating and recovering different geophysical sensors (primarily broad band seismometers and electromagnetometers) on the sea floor for three months, in order to investigate the covariations of the different signals recorded by these sensors. This experiment, named MOISE, was conducted from June to September 1997 off the California coast in Monterey Bay, on seafloor sediments at a depth of 1015 m. The geophysical instruments were deployed using the remotely operated vehicle (ROV) Ventana of the Monterey Bay Aquarium Research Institute (MBARI). The experiment (Fig. 4) is described in [11]. Preliminary results are presented in [12], demonstrating the strong correlation between seismic noise and deep water currents. A systematic study of the seismic noise level variations is presented in [13]. It is shown that the seismic noise level was stable throughout the experiment. It is comparable to terrestrial station noise below 15 s, and displays strong diurnal variations at long periods. These diurnal variations can be removed from the vertical component by subtracting the effect of the horizontal components, decreasing the vertical noise by up to 40 dB (a sketch of this correction is given below). Coherence between long-period seismic, electromagnetic and environmental data was investigated. The coherence between the different signals is maximum near 12 hours, a consequence of tidal effects. The coherence between the vertical seismic signal, pressure and current velocity is high throughout the experiment. There is no significant high coherence with the vertical magnetic field.
Figure 4. Monterey Bay Ocean bottom International Seismic Experiment. The ROV Ventana of MBARI is holding the seismic broad band package [11].
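The correction just described, removing the part of the vertical channel that is linearly predictable from the horizontals, can be sketched as a least-squares projection. In practice frequency-dependent transfer functions are used, so this broadband time-domain version is only illustrative:

```python
# Toy version of the vertical-channel correction described above: subtract
# the component of the vertical record that is linearly predictable from the
# co-located horizontal channels. Real processing uses frequency-dependent
# transfer functions; a single broadband coefficient set is used here.
import numpy as np

def remove_coherent_part(vertical, horizontals):
    """vertical: (n,) array; horizontals: (n, k) array of co-located channels."""
    coeffs, *_ = np.linalg.lstsq(horizontals, vertical, rcond=None)
    return vertical - horizontals @ coeffs

# Synthetic demonstration: a vertical trace contaminated by tilt noise that
# also appears on the two horizontal channels.
rng = np.random.default_rng(0)
tilt = rng.normal(size=(5000, 2))
vertical = 0.3 * tilt[:, 0] - 0.2 * tilt[:, 1] + rng.normal(scale=0.05, size=5000)
cleaned = remove_coherent_part(vertical, tilt)
print(f"rms before: {vertical.std():.3f}, after: {cleaned.std():.3f}")
```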
2.1.3. OSN-1: February-June 1998
Plans in the U.S. called for a three-phase approach [14]. In phase 1 (completed), pilot experiments addressed the fundamental problems of sensor coupling in holes and noise, devising solutions for power, data retrieval and reliability on a multi-year time scale. In phase 2 (in progress), a small number of prototype observatories will be installed, immediately contributing data to the seismological community. In phase 3, the complete Ocean Seismic Network (OSN) of 20-25 stations will be installed and will complement the IRIS/GSN. An important experiment was carried out at the OSN-1 drill site (ODP hole 843B), 225 km south-west of Oahu, Hawaii, in water 4407 m deep. The noise level in the Pacific Ocean is known to be larger than in the Atlantic Ocean (see [15] for a review). A complete description of this experiment can be found in Stephens et al. [16]. Three broad band seismic systems were tested: a seismometer (Guralp CMG-3T) resting on the seafloor, a seismometer (CMG-3T) buried within 1 m of the seafloor, and a seismometer (Teledyne KS5400) clamped at 248 m beneath the seafloor in the hard rock basement. The instruments were deployed in early February and recovered in early June 1998. The results of the experiment confirm the previous results of the Japanese and French SISMOBS/OFM experiments. In the microseism and short-period band, the borehole sensor had the quietest ambient noise levels, particularly on the horizontal components. But at periods longer than 10 s (the noise notch and infragravity wave band, following the classification of Webb [15]), the buried sensor was as quiet as or quieter than the borehole sensor. The good news again was that broad band seafloor seismic installations can yield data of quality comparable to land stations, not only in the Atlantic Ocean but in the Pacific Ocean as well. The debate regarding borehole versus buried sensors is now closed. The borehole system can be used for permanent observatory sites provided that the sensor is cemented in place, which should reduce the noise level above 10 s.
Figure 4. Monterey Bay Ocean bottom International Seismic Experiment. The ROV Ventana of MBARI is holding the seismic broad band package [11].
2.2. Reference Geophysical Ocean Bottom Observatories on a Global Scale
The same three-phase scheme is also followed by other members of the ION community, and the phase which requires installing a small number of prototype observatories is ongoing.
These sites were detailed in the ION proposal [5]. In May 1998, a hole was drilled in the middle of the Indian Ocean on the Ninetyeast Ridge. The installation should have been carried out in the framework of a French-Japanese cooperative project, but both partners are still waiting for funding from their respective agencies. In November 1998, U.S. groups installed a junction box, including broad band seismometers, on the cable linking the Hawaiian archipelago and the California coast, about half-way along it [17]. An unfortunate failure delayed the operation of this first ocean bottom observatory, named H2O. It was repaired during Fall 1999, and its instruments are providing seismic data to the IRIS DMC in Seattle. H2O is a good example of how to re-use retired telecommunication cables. The main advantage of a cable is that it enables us to provide power to the instruments and to transmit the data. Unfortunately, most of the ION selected sites are located far away from available cables, particularly in the Southern hemisphere, and the installation of cables several thousand kilometers long would be far too expensive to constitute an alternative to completely autonomous GOBOs. In June 1999, during ODP Leg 186, two sites off the northeast coast of Japan were successfully drilled, in order to monitor seismic and aseismic crustal deformation associated with the subduction of the Pacific plate beneath Japan. Borehole strainmeters, tiltmeters and broad band seismometers (Guralp CMG-1) were installed at the bottom of the drilled holes. At least one set of instruments turns out to be operating correctly (Suyehiro, personal communication).
3. SHORT-TERM STATIONS, REGIONAL AND LOCAL SCALE ARRAYS
Long-term GOBOs will enable us to investigate global scale geodynamics and will serve as reference stations. However, in order to study active processes, the global ION network must be complemented by regional and local scale networks. Permanent regional networks are devoted to the study of structure at spatial wavelengths from tens of kilometers to a maximum of 1000 km, and to the localization and study of regional earthquakes. The scales between 100 km and 1000 km in most oceanic areas are naturally out of reach so far. As for local networks (scales between a few kilometers and hundreds of kilometers), the number of permanent networks is very limited. They are located either where active processes occur (seismic or volcanic areas) or in quiet areas for nuclear test monitoring. There is also a real need for high resolution experiments (scales smaller than 1 km or 100 m) able to map interesting three-dimensional geological objects (e.g. fault systems, magma chambers). So far, regional and local networks cannot provide a uniform sampling of intermediate spatial wavelengths (between 10 km and 1000 km) and mini-scales (smaller than 1 km) for the large variety of geological environments. Only a limited number of tectonically active areas are covered, and our knowledge of active processes occurring at plate tectonic boundaries below the oceans (ridges, passive or active margins), or in the middle of plates (intraplate volcanism, plumes), is still very limited. The short wavelength structure below the crust, in the deep mantle, in the D"-layer and in the core is very poorly known. The present coverage of stations does not enable us to address these issues with the
available networks. The development of broad band portable networks, such as SKIPPY in Australia [18] or PASSCAL in the U.S.A. among many other initiatives in different European and Asian countries, has demonstrated that hot scientific issues on continents can be addressed by short-term (usually less than one year long) experiments. The new American initiative, USArray, consists of developing a network of 1000 broad band (BB) instruments and recording systems. However, such a portable broad band network is still missing in the oceans. During the MELT experiment [19], it was possible to record long-period surface waves and body waves by using short-period instruments with broadened bandwidth, but the quality of the data was rather poor due to the inappropriate sensors. Nevertheless, this pilot experiment demonstrated the value of such data for understanding the behaviour of ridges and dynamic processes.
3.1. GEODIS: Geophysical Diving Saucer
To be efficient and achievable, a portable broad band (BB) seismic network must be easy to install and to recover, must have low power consumption, and must be reliable. For a network of at least 100 instruments, the main limitation will be its financial cost and the associated expenses in terms of sea campaigns and maintenance. Different institutions around the world are working on the design of such a network. We detail here the GEOphysical Diving Saucer (GEODIS) project of the IPG of Paris, which is developing an instrument fulfilling these requirements. On a platform of small diameter (0.93 m), different boxes housing three-component broad band seismometers, power, the recording system and, optionally, environmental sensors (e.g. pressure, temperature, currentmeter) are interconnected (Fig. 5). The height is 43 cm, and the architecture is designed to look like a saucer in order to minimize the influence of bottom currents. The weight in water will be 140 kg (216 kg in air), and the round flat frame at its base ensures good coupling with the sea floor. Power will be provided by lithium batteries; consumption should be smaller than 1 watt, in order to ensure operation for at least one year. All components of GEODIS are designed to minimize their power consumption. The masterpiece of the instrument is the seismic sensor, which is derived from space technology (see next section). Such an instrument can be connected to other platforms through a cable. A first experiment will take place in autumn 2000 off Ustica Island in the Tyrrhenian Sea (Gasparoni et al., this volume). GEODIS will be connected to the GEOSTAR platform [20]. This experiment is intended to test the feasibility of the GEODIS concept and to make a quantitative comparison of seismic noise with the CMG-3T seismic sensor installed on the GEOSTAR platform by the ING team. GEODIS can be installed in two different ways: either as an autonomous station dropped over the side, or by using an installer. In the first case, it is necessary to add a weight acting as ballast, connected to the bottom of GEODIS through an acoustic release. To obtain a positive buoyancy balance, 6-8 Benthos spheres can be assembled as a toroid. The spheres can be used to store enough Li batteries to
operate for at least one year. A solid-state mass memory similar to the ones designed for a small station on planet Mars (8 Gbytes) can be used inside the cylinder. In the second case, a "universal" installer must be designed. It is connected either to a classical cable or to a mixed cable with optical fiber and copper wire. The extremity of the cable is connected to the installer, which includes standard submarine lights and a video camera in order to select a suitable site for installation and to recover GEODIS at the end of the experiment. A simple device enabling communication between the installer and the station will be implemented, to check from the vessel the quality of the installation and the health of GEODIS. The same device can be used during recovery to correctly stop all cycles and to lock the moving parts of the seismometer. The choice between these two options is dictated only by cost. The second option is more expensive but should ensure high reliability of the installation. The installer can even be improved by adding a digger to bury or semi-bury the sphere of the seismometer.
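A back-of-the-envelope budget shows that the quoted figures are mutually consistent. In the sketch below, only the 1 W bound, the one-year target and the 8 Gbyte memory come from the text; the battery energy density and the seismic sampling parameters are our assumptions.

```python
# Rough budget check for a one-year GEODIS deployment.
SECONDS_PER_YEAR = 365 * 86400

power_w = 1.0                              # upper bound from the text
energy_wh = power_w * SECONDS_PER_YEAR / 3600.0
print(f"energy for one year: {energy_wh:.0f} Wh")          # ~8760 Wh

lithium_wh_per_kg = 300.0                  # assumed primary-cell density
print(f"battery mass: ~{energy_wh / lithium_wh_per_kg:.0f} kg")  # ~29 kg

# Storage: 3 broad band components, assumed 20 samples/s at 4 bytes each.
rate_bytes = 3 * 20 * 4
volume_gb = rate_bytes * SECONDS_PER_YEAR / 1e9
print(f"raw data volume: {volume_gb:.1f} GB/year")  # fits the 8 Gbyte memory
```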
Figure 5. GEODIS: GEOphysical Diving Saucer designed by the Technical Team of OFM.
3.2. The sensor
Present technology does not make it possible to use a very broad band seismometer for the automatic or robotic installation required to deploy seismometers in hostile environments: the ocean bottom, boreholes and comparably hostile hot environments, cold locations such as Antarctica, or the surface of the other telluric planets of the solar system. Less sensitive seismometers, like the STS-2 or CMG-3, are then used, even where the ambient noise is comparable to or lower than the levels recorded in seismic vaults. An improvement of the global network VBB instruments, in sensitivity, miniaturisation and automatic installation capability, is therefore highly desirable. This improvement will be reached through the development in France of a new VBB 3-axis instrument, derived from the prototype presently designed for future planetary missions. This seismometer is the second generation of a space-qualified seismometer after the OPTIMISM seismometer, which was onboard the two Small Surface Stations of the Russian MARS96 mission, launched in November 1996, which unfortunately failed and was swallowed up by the South Pacific Ocean. Both sensors were designed to survive a high g-load (up to 350 g during 10 ms, where g is the acceleration of gravity). The seismometer which will equip GEODIS is the simplified terrestrial version of the SEIS-NL seismometer developed to monitor Martian seismic activity within the framework of the NETLANDER mission [21], planned for 2005. It is under development through a CNES R&T Program, and its terrestrial version could be developed by the SODERN company in cooperation with IPG in Paris (Fig. 6). The external part of the package is efficiently thermally insulated. Optional equipment, composed of a thermal/current shield and an inclinometer, will be available to reduce installation cost. Cost issues will, however, forbid the use of such ultra-sensitive sensors for very dense portable networks. Moreover, even if field tests have proved the possibility of reaching very low noise levels for surface installations [23], the micro-seismic noise of seismometers deployed on the surface or in small holes is generally one order of magnitude greater, and this allows, without serious loss, the use of noisier but much cheaper, miniaturised seismometers. Such instruments will probably allow the development of seismic stations based either on a "bury and forget" or a "drop and forget" philosophy. The "bury and forget" philosophy may be used after a major earthquake in order to rapidly deploy thousands of sensors on a regional scale for an operation time of a few weeks. The "drop and forget" strategy makes it possible to deploy seismic stations from the sea surface by mini-penetrators, developed by the Geko-Prakla company for their internal use, able to penetrate the ground for better seismic coupling.
Figure 6. Very broad band (VBB) seismometer, designed for future Martian missions [24].
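The quoted shock specification can be translated into impact kinematics with elementary formulas; the numbers below use only the 350 g amplitude and 10 ms duration from the text, assuming a constant deceleration.

```python
# Kinematics implied by surviving 350 g for 10 ms (constant deceleration).
g = 9.81                 # m/s^2
a = 350.0 * g            # peak deceleration from the text
dt = 10e-3               # duration from the text

dv = a * dt              # velocity change absorbed during the shock
dx = 0.5 * a * dt ** 2   # stopping distance at constant deceleration
print(f"velocity change: {dv:.0f} m/s")          # ~34 m/s
print(f"stopping distance: {dx * 100:.0f} cm")   # ~17 cm
```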
3.3. Scientific targets: plumes, seismogenic zones
The scientific targets of such broad band ocean bottom portable arrays are numerous. Most of the active processes in the Earth take place below the oceans or at the boundary between oceans and continents. The manifestations of these active processes can induce catastrophic events, in terms of strong earthquakes or volcanic eruptions. Generally, they are related to large scale material flow in the mantle, such as upwellings, downwellings, or even differential flows on the two sides of a transform fault (the North Anatolian fault or the San Andreas fault, for example). In subduction zones, the installation of seismic stations on the oceanic side might enable us to understand the interaction of the subducting plate with the overlying plate, the role of fluids, and the processes involved in the digestion of sediments and oceanic crust. The most advanced projects are those of the Japanese, where a few ocean bottom stations connected to the seashore by cables are already operating. Such a project also exists in France, devoted to the investigation of the structure of the Antilles Arc, in order to understand the relationship between the seismogenic zone and the volcanic belt. The recent earthquake in Turkey (Aug. 17, 1999) draws attention to the part of the North Anatolian fault located below the Marmara Sea, just south of Istanbul, where the next damaging seismic event might occur within the next 20 years [25]. As for the upwellings, their surface manifestations, mid-ocean ridges and plumes, are usually less dramatic, but their structure is not well understood. Plumes are probably the most mysterious geological objects. Their origin at depth (asthenosphere, transition zone between 400 and 1000 km, D"-layer) is still debated. Their geodynamic role in the opening of ridges, in plate reorganization, and in biological crises is probably the most exciting scientific issue for the next decade. Several ongoing initiatives in Europe propose to address these issues. A project named MAGIA (Multiscale Approach of Geohazards Investigation in Azores) is going to be proposed to the European Community. The proposal, coordinated by CGUL (Lisbon, Portugal) with the involvement of French, German, Italian and Spanish groups, adopts a multiscale approach in order to find the origin at depth of the Azores plume, to understand its interaction with the Mid-Atlantic Ridge, and to assess the associated geohazards. Figure 7 presents the area of investigation, where a regional broadband network on the seafloor and on islands will be complemented by dense, classical OBS deployments along lines. These two scales of study should enable us to relate regional scale heterogeneities (larger than 50 km) to short scale heterogeneities (smaller than 50 km). At an even smaller scale, seismic reflection experiments should shed some light on the crustal structure between the Azores and the Mid-Atlantic Ridge.
Figure 7. The MAGIA experiment (Multiscale Approach of Geohazards Investigation in Azores) is a European project involving CGUL (Lisbon, Portugal) and several institutions in France, Germany, Italy and Spain. (Map legend: BB marine network, VBB network, island network.)
All these projects will be coordinated under a larger umbrella, the Monitoring of the Mid-Atlantic Ridge (MOMAR) project. Another ambitious program, devoted solely to the investigation of oceanic plumes, was launched in 2000 under the initiative of GEOMAR (Kiel, Germany), IPG (Paris, France) and the University of Cambridge (UK). To meet its scientific objectives, the Plume Oceanic Project plans to develop a network of 100 broad band ocean bottom stations. A first target might be the La Reunion plume in the Indian Ocean.
4. DISCUSSION: LONG-TERM OBSERVATORIES VERSUS TEMPORARY STATIONS
The technological issues regarding long-term observatories and temporary stations are quite similar. However, the technological developments depend strongly on the period of operation of the station: a long-term geophysical observatory is much more difficult to maintain than a temporary ocean bottom station, due to the problems of power supply, failures, and data retrieval and transmission. The scientific purposes are different as well, though complementary. Long-term observatories can be used as reference stations and as nodes for short-term experiments with a dense coverage of stations. The best solution for ensuring long-term operation is the installation of cables, and the type of installation depends on the distance from the shore. For a short distance from the coast (less than 200 km), the cables can be laid on the seafloor between the station and the shore. For larger distances (more than 200 km), the solution consists of installing a moored buoy connected to the ocean floor observatory through a vertical cable, which links the buoy at the sea surface to the sea floor. This is the only way to operate on a continuous, long-term basis, collecting and teletransmitting data in real time. Power can be provided by solar panels, diesel generators or windmills.
4.1. General strategy
In spite of the many advances achieved by American, Japanese and French groups, many technical problems are still pending. Due to the high cost of GOBOs, the ongoing plan consists of trying to associate the different geophysical communities interested in long-term ocean bottom observations. Such a coherent strategy should enable scientists from different fields to develop a synergy by coordinating their efforts. A GOBO must be able to address several scientific issues at a global scale and fulfill different scientific constraints; it will need several sensors, which must be independent so that each constitutes a scientific module. The concept of a modular observatory gives the different scientific groups a large degree of freedom in the design of their specific sensors and should make maintenance of the system easier. The notion of standardization is intimately associated with the concept of independent modules: the different scientific modules have to be connected to a central module by standard connectors.
The central common module is the brain and the heart of the GOBO, and the whole observatory is organized around it. It is composed of several units with different functions, providing different facilities: power supply, data storage, sending commands to each dedicated module, communication with the outer world through a cable connected to a buoy at the surface, and teletransmission to data centers. The cable between the ocean floor and the surface makes it possible to provide data on a timely basis and, if necessary, to send commands to the different modules from a vessel. From a technical point of view, the observatory shares common modules (power, dataloggers, recording systems, data transmission) which should be very versatile and are almost independent of any particular scientific need. A first step should be achieved in the GEOSTAR-2 project, where messengers carrying Gbytes of data can be automatically released from the bottom station up to the sea surface and recovered on a regular basis. The design of a GOBO must therefore follow the basic philosophy presented previously of multidisciplinarity and modularity. We detail in the next section what might be the scientific interest of multiparameter stations. Such a philosophy is not necessarily the best one for temporary deployments. The multiparameter concept tends to make the design of the station, its installation (use of manned submersible, ROV or AUV) and its recovery during sea campaigns more complex; thus, it tends to multiply the causes of failure. In order to reduce the cost of a large number of non-permanent stations, we can accept a lower degree of reliability than for permanent observatories. Therefore, it is likely that for portable ocean bottom arrays each station must be dedicated to one sensor, seismic or electromagnetic. However, the recording of environmental parameters such as pressure, temperature and currents is highly desirable.
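The modular concept can be summarized in a few lines of pseudo-structure. The sketch below is purely conceptual; the class names, the power figures and the behaviour are illustrative assumptions, not the specification of any existing GOBO. A central module keeps a registry of independent science modules behind standard connectors and enforces a shared power budget.

```python
# Conceptual sketch of the modular GOBO idea (all names and numbers assumed).
from dataclasses import dataclass, field

@dataclass
class ScienceModule:
    """An independent sensor package behind a standard connector."""
    name: str
    power_w: float
    powered: bool = False

@dataclass
class CentralModule:
    """Common core: power distribution, commands, data routing."""
    budget_w: float
    modules: dict = field(default_factory=dict)

    def connect(self, module: ScienceModule) -> None:
        self.modules[module.name] = module        # standard connector

    def power_on(self, name: str) -> bool:
        used = sum(m.power_w for m in self.modules.values() if m.powered)
        if used + self.modules[name].power_w > self.budget_w:
            return False                          # refuse: budget exceeded
        self.modules[name].powered = True
        return True

hub = CentralModule(budget_w=100.0)               # assumed budget
hub.connect(ScienceModule("broadband_seismometer", 1.0))
hub.connect(ScienceModule("magnetometer", 2.5))
print(hub.power_on("broadband_seismometer"), hub.power_on("magnetometer"))
```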
4.2. Multiparameter stations
Since the beginning of the nineties, the concept of multiparameter stations has been emerging. Firstly, it enables us to improve the signal-to-noise ratio or, equivalently, to separate the signal due to seismic waves from the signal due to the fluid envelopes of the Earth (ocean and atmosphere). Secondly, the purpose is, in a sense, more empirical and regards the investigation of correlations between independent physical parameters relevant to a complex scientific problem (earthquake prediction and, more generally, active processes). For example, multiparameter observations around the Gulf of Corinth show a nice correlation on land between seismic anisotropy and magnetic anisotropy [26]. Thirdly, the purpose is to realize economies of scale by allowing the sensors required by different disciplines to use the same power source, recording and data transmission systems, and to simplify operations and the organization of maintenance. For the first kind of application, some progress has been made and preliminary results were obtained at the GEOSCOPE stations SSB (Saint-Sauveur-en-Badole, France) and TAM (Tamanrasset, Algeria). Since 1989, microbarometers have been installed at SSB, where two sets of STS-1 seismometers [27] were present. The complete design of the experiment is presented in Beauduin et al. [3]. It is observed that the microbarometric pressure is correlated with the horizontal acceleration components of seismic noise. The vertical component of seismic noise is only correlated with pressure at long periods (larger than 500 s).
Figure 8. Excitation of normal modes in the absence of earthquakes. Top: before pressure correction. Bottom: after pressure correction [4].
After selecting a seismic signal without earthquakes, it is possible to calculate the transfer function between pressure and seismic noise (amplitude and phase). It is then possible to remove the effect of pressure from the seismic signal. Montagner et al. [28] present an example of such a calculation, where two earthquakes hidden in the noise are clearly observed after correction. However, Beauduin et al. [3] show that the correlation between pressure and seismic signal is not systematic and depends largely on the quality of the installation of the sensors. This means that in a conventional land-based station, where there is easy access to the sensors, the simultaneous recording of pressure and seismic signal can only be an indicator of a problem in the installation. The same kind of approach was followed by Roult and Crawford [4], who showed that it is possible to detect free oscillations of the Earth more clearly in the absence of earthquakes by subtracting the effect of atmospheric pressure (Fig. 8). Multicomponent seafloor instruments have also proven useful for improving the seismic signal and for determining crustal structure independently of seismic sources. Crawford et al. [29] use autonomous seafloor packages containing a broad band seismometer and a differential pressure gauge to measure the seafloor deformation under pressure forcing by ocean waves (Fig. 9). They invert the deformation/pressure transfer function to determine crustal structure and, in particular, to detect and quantify low shear velocity regions such as magma chambers and regions of high hydrothermal circulation. The compliance signal is generally between 0.002 and 0.05 Hz, requiring instruments sensitive to lower frequencies than typical OBS seismometers and hydrophones. They use LaCoste-Romberg vertical gravimeters or Streckeisen STS-2 broad band seismometers as the acceleration sensor, and differential pressure gauges [30] to measure the pressure variations. These instruments have also been used to demonstrate that low-frequency seafloor seismic noise can be reduced by subtracting this "compliance" signal [31] and that, in the case of an unstable emplacement, a significant low frequency tilt noise signal can be removed from the vertical component either by precisely leveling the seismometer or by subtracting coherent horizontal noise from the vertical signal [29]. In the future, the compliance sensors will be constructed using broad band seismometers (either the STS-2 or a Guralp CMG-3) and a combined differential pressure gauge/hydrophone sampled at 50 Hz, to create a truly broad band instrument sensitive to high frequency processes such as airgun shots and microearthquakes as well as to low-frequency information such as seafloor compliance and teleseismic earthquake arrivals. These results demonstrate the utility of such simultaneous recordings for stations installed in hostile environments (planet Mars, the ocean bottom, drill holes), where it is almost impossible to check and modify the installation of seismometers on a regular basis.
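The pressure correction described above amounts to estimating a cross-spectral transfer function and subtracting the predicted pressure contribution. The sketch below applies this recipe to synthetic data (the coupling coefficient, record length and seed are assumptions, not values from the cited studies); the same scheme underlies the compliance and tilt-noise subtractions.

```python
# Synthetic demonstration of transfer-function noise subtraction.
import numpy as np
from numpy.fft import rfft, irfft, rfftfreq
from scipy.signal import csd, welch

rng = np.random.default_rng(1)
n, fs = 2 ** 16, 1.0
pressure = rng.standard_normal(n)
seismic = 0.8 * pressure + 0.2 * rng.standard_normal(n)  # coupled + ground noise

nseg = 2 ** 12
f, s_pp = welch(pressure, fs=fs, nperseg=nseg)
_, s_ps = csd(pressure, seismic, fs=fs, nperseg=nseg)
h = s_ps / s_pp                      # transfer function: amplitude and phase

# Interpolate H onto the FFT grid and subtract the predicted contribution.
freqs = rfftfreq(n, d=1.0 / fs)
h_full = np.interp(freqs, f, h.real) + 1j * np.interp(freqs, f, h.imag)
corrected = seismic - irfft(h_full * rfft(pressure), n)
print(f"noise variance reduced by x{np.var(seismic) / np.var(corrected):.0f}")
```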
The next question is: what should an ideal standard multiparameter station be, and can this concept be adapted to portable stations? A multiparameter station classically includes broad band seismometers, high frequency seismometers, a microbarometer, a microthermometer, electromagnetic sensors and a GPS receiver. According to local requirements, it should be possible to add strainmeters, gravimeters, and any kind of environmental modules and geochemical sensors. Such a philosophy was followed in the design of the GEOSTAR platform [20], but the limitations of such a station are its cost, its weight and its inherent complexity.
Figure 9. Compliance sensor: a) sketch of the experiment (ocean waves of 2-40 km wavelength force seafloor deformations of the order of 1-10 μm); b) transfer function between the acceleration and the sea surface displacement.
It is likely that a reasonable approach might be to co-locate different dedicated stations, following the procedure used during the MOISE experiment [11,13].
4.3. Temporal scale
According to the previous discussion, the main difference between GOBOs and portable stations concerns the period of operation. An important application of long-term observatories derives from the long time series of signals: observatories enable us to investigate time-dependent Earth processes. With the recent, controversial discovery of differential rotation of the Earth's inner core [33,34], a time-dependent seismology is emerging. The compilation of past magnetic observations since the 18th century has made it possible to investigate the secular variation of the magnetic field and to gain insight into the flow in the outer core at the core-mantle boundary. Since most sensors record present physical fields (magnetic, seismic, gravity), they only provide information on the instantaneous Earth by comparison with the geological time scale. If we want to study the time evolution of phenomena, the upper bound of the temporal scale is the human lifetime or, in the best case, the history of mankind. Therefore, for time scales larger than thousands of years, the paleosciences (e.g. paleomagnetism, geology, geochemistry) will still be necessary to provide invaluable information on the evolution of the Earth system. The future networks will only be able to record data on time scales of around a century. However, even at this time scale there are very interesting geophysical phenomena, such as earthquake or volcanic cycles. These examples demonstrate the necessity of installing a global network of ocean bottom observatories which will have to work on a long-term basis (several decades).
4.4. Multiscale approach
As advocated previously, scientific issues involve many spatial scales. The consequence of this statement is that it is necessary to investigate and relate these different scales and, therefore, to have geophysical networks at different scales. The basis of such a network, built on the concept of a hierarchical or multiscale network, was presented in Montagner et al. [28]. The goal of this network is to provide observations at all spatial scales, with a uniform distribution of sub-networks. It can easily be understood that such a perfect network leads to an exponential increase in the number of stations and is rather unrealistic. In order to solve this problem, the selection of nodes can be dictated by scientific needs and interests. The community investigating active processes will propose to install local and mini-scale networks where active processes can provide the most valuable observations, i.e. in tectonic areas (along plate boundaries, or for investigating intraplate volcanism or oceanic plumes). As for the instruments in the stations, it is unlikely that there will be only one kind of sensor for the whole network. Since there is a direct relationship between wavelength λ and frequency f, it is obvious that, as the scale decreases, sensors must be sensitive to higher frequencies. However, due to technological progress, the number of instrument types might be quite limited (two or three types of sensors). For example, in seismology, broad band seismometers can cover the frequency range 0.001-80 Hz and industrial short-period
seismometers can span higher frequencies. So far, there are networks at all scales, but the important point regards the coordination of the different networks, which have to share the same philosophy regarding the standardization of sensors, data format (such as the SEED format) and data distribution. The complete coverage of the global scale is presently in progress with the extension of the global network of the FDSN towards the oceans, thanks to ION. This global network should enable us to investigate the structure of the whole Earth down to scales of around 1000 km. The scales from 10 km up to 1000 km can be investigated by portable broad band seismic networks, which already exist on land and are under development for oceanic areas.
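The scale-frequency link invoked here follows from λ = v/f. The short sketch below tabulates it for an assumed representative wave speed of 4 km/s (our choice; actual speeds depend on wave type and depth).

```python
# Map target structural scales to the frequencies needed to resolve them.
V_KM_S = 4.0    # assumed representative seismic wave speed

for scale_km in (1000.0, 100.0, 10.0, 1.0, 0.1):
    f_hz = V_KM_S / scale_km            # lambda = v / f
    print(f"{scale_km:7.1f} km structure -> ~{f_hz:8.3f} Hz")
# 1000 km maps to mHz (broad band sensors); sub-kilometer scales need
# tens of Hz, i.e. short-period sensors; two or three types suffice.
```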
5. CONCLUSION
The networks which are presently under development must be considered as the instrument of the whole geoscience community. The design of the future network will have to provide an even distribution of stations at all scales and to make the whole dataset available on a timely basis. It includes not only the sensors but the complete chain, from acquisition to data storage and distribution. This whole chain has to be designed in a coherent manner, in order to save time and money. If the geoscience community agrees on this concept of a hierarchical (or, equivalently, multiscale) network, a global strategy must be defined, and the networks at all scales have to be coordinated. The part of the world where most of the scales are poorly investigated is the ocean. The priority in the coming years might be to achieve global scale coverage of the Earth and the development of portable ocean-bottom arrays with at least 100 stations. The efforts necessary to achieve such a multiscale network represent a formidable technological challenge for the next decade.
ACKNOWLEDGMENTS
This work was partly supported by INSU-OFM (1999), Seismological Laboratory, URA/CNRS 7580, Institut de Physique du Globe, Paris.
REFERENCES
1. B. Romanowicz and A.M. Dziewonski, EOS, 67 (1986) 541.
2. J.-P. Montagner and Y. Lancelot (eds.), Multidisciplinary observatories on the deep sea floor, I.O.N. workshop, Marseilles, 1995.
3. R. Beauduin, P. Lognonné, J.P. Montagner, S. Cacho, J.F. Karczewski and M. Morand, Bull. Seism. Soc. Am., 86 (1996a) 1760.
4. G. Roult and W. Crawford, Phys. Earth Planet. Int., 121 (2000) 325.
5. K. Suyehiro, in: J.-P. Montagner and Y. Lancelot (eds.), Multidisciplinary observatories on the deep sea floor, I.O.N. workshop, Marseilles, 1995.
6. K. Suyehiro, T. Kanazawa, N. Hirata, M. Shinohara and H. Kinoshita, Proc. ODP, Scientific Results, 127/128 (1992).
7. T. Kanazawa, K. Suyehiro, N. Hirata and M. Shinohara, Proc. ODP, Scientific Results, 127/128 (1992).
8. J.P. Montagner, B. Romanowicz and J.F. Karczewski, EOS, Trans. AGU, 75 (1994a) 150.
9. J.P. Montagner, J.F. Karczewski, B. Romanowicz, S. Bouaricha, P. Lognonné, G. Roult, E. Stutzmann, J.L. Thirot, D. Fouassier, J.C. Koenig, J. Savary, L. Floury, J. Dupond, A. Echardour and H. Floc'h, Phys. Earth Planet. Int., 84 (1994b) 321.
10. R. Beauduin, J.P. Montagner and J.F. Karczewski, Geophys. Res. Lett., 24 (1996b) 493.
11. D. Stakes, B. Romanowicz, J.P. Montagner and P. Tarits, EOS, Trans. AGU, 79 (1998) 301.
12. B. Romanowicz, D. Stakes, J.P. Montagner, P. Tarits, R. Uhrhammer, M. Begnaud, E. Stutzmann, M. Pasyanos, J.F. Karczewski, S. Etchemendy and D. Neuhauser, Earth Planet. Sci., 50 (1998) 927.
13. E. Stutzmann, W. Crawford, J.L. Thirot, J.P. Montagner, P. Tarits, D. Stakes, B. Romanowicz, A. Sebai, J.F. Karczewski, D. Neuhauser and S. Etchemendy, Bull. Seism. Soc. Am., (2000) submitted.
14. G.M. Purdy and J.A. Orcutt (eds.), Broad band Seismology in the Oceans - Towards a Five-year Plan, Ocean Seismic Network/Joint Oceanographic Institutions Inc., Washington, D.C., 1995.
15. S.C. Webb, Rev. Geophys. Space Phys., 36 (1997) 105.
16. R. Stephens, J.A. Collins, J.A. Hildebrand, J.A. Orcutt, K.R. Peal, F.N. Spiess and F.L. Vernon, EOS, Trans. AGU, 1999.
17. R. Butler, A.D. Chave, F.K. Duennebier, D.R. Yoerger, R. Petitt, D. Harris, F.B. Wooding, A.D. Bowen, J. Bailey, J. Jolly, E. Hobart, J.A. Hildebrand and A.H. Dodeman, EOS, Trans. AGU, 81 (2000) 157.
18. R.D. Van der Hilst, B.L.N. Kennett, D. Christie and J. Grant, EOS, 75 (1994) 177.
19. D.W. Forsyth and the MELT Seismic Team, Science, 280 (1998) 1215.
20. L. Beranzoli, A. De Santis, G. Etiope, P. Favali, F. Frugoni, G. Smriglio, F. Gasparoni and A. Marigo, Phys. Earth Planet. Int., 108 (1998) 175.
21. P. Lognonné and the NETLANDER team, Planet. Space Sci., 48 (2000) 1289.
22. Coste
23. P. Lognonné, J. Gagnepain-Beyneix, W.B. Banerdt, S. Cacho, J.F. Karczewski and M. Morand, Planet. Space Sci., 44 (1996) 1237.
24. S. Cacho, Étude et réalisation d'un prototype de sismomètre très large bande, 3 axes, qualifié spatial, Thèse, Université Paris VII, 1996.
25. A. Hubert-Ferrari, G.C.P. King, A. Barka, E. Jacques, S. Nalbant, B. Meyer, R. Armijo, P. Tapponnier and G. King, Nature, 404 (2000) 269.
26. P. Bernard, G. Chouliaras, A. Tzanis, P. Briole, M.-P. Bouin, J. Tellez, G. Stavrakakis and K. Makropoulos, Geophys. Res. Lett., 24 (1997) 2227.
27. E. Wielandt and G. Streckeisen, Bull. Seism. Soc. Am., 72 (1982) 2349.
28. J.P. Montagner, P. Lognonné, R. Beauduin, G. Roult, J.F. Karczewski and E. Stutzmann, Phys. Earth Planet. Int., 108 (1998) 155.
29. W.C. Crawford, S.C. Webb and J.A. Hildebrand, J. Geophys. Res., 104 (1999) 2923.
30. C.S. Cox, T. Deaton and S.C. Webb, J. Atmos. Oceanic Technol., 1 (1984) 237.
31. S.C. Webb and W.C. Crawford, Bull. Seism. Soc. Am., 89 (1999) 1535.
32. W.C. Crawford and S.C. Webb, Bull. Seism. Soc. Am., 90 (2000) 952.
33. X.D. Song and P.G. Richards, Nature, 382 (1996) 221.
34. Dziewonski
H2O: The Hawaii-2 Observatory

A.D. Chave a, F.K. Duennebier b, R. Butler c, R.A. Petitt, Jr. a, F.B. Wooding d, D. Harris b, J.W. Bailey a, E. Hobart a, J. Jolly b, A.D. Bowen a and D.R. Yoerger a

aDepartment of Applied Ocean Physics and Engineering, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
bSchool of Ocean and Earth Science and Technology, University of Hawaii at Manoa, Honolulu, HI 96822, U.S.A.
cIncorporated Research Institutions for Seismology, 1200 New York Ave. NW, Suite 800, Washington, DC 20005, U.S.A.
dDepartment of Geology and Geophysics, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
A permanent deep ocean scientific research facility, the Hawaii-2 Observatory or H2O, was installed on the retired HAW-2 commercial submarine telephone cable in mid-1998. H2O consists of a seafloor submarine cable termination and junction box in 5000 m of water, located halfway between Hawaii and California. The H2O infrastructure was installed from a large research vessel using the Jason ROV and standard over-the-side gear. The junction box provides two-way digital communication at variable data rates of up to 115 kbit/s using the RS-422 protocol, and a total of 400 watts of power for both junction box systems and user equipment. Instruments may be connected by an ROV to the junction box at 8 wet-mateable connectors. The H2O junction box is a "smart" design which incorporates redundancy to protect against failure, with full control of instrument functionality from shore. Initial instrumentation at the H2O site includes broad band seismometer and hydrophone packages.
1. INTRODUCTION
The Hawaii-2 (HAW-2) submarine telephone cable was laid in 1964 between San Luis Obispo, California and Makaha, Oahu, Hawaii. It is a second generation, vacuum tube repeater (AT&T SD series) analog system [1] which continued in service until 1989, when a cable break off California led to its retirement from commercial service. In 1996, the entire HAW-2 wet plant was acquired from AT&T by the Incorporated Research Institutions for Seismology (IRIS) on behalf of the US scientific community. H2O was installed close to the midpoint between two repeaters (which are spaced 20 nm apart) near 28°N, 142°W, at about 5000 m water depth (Fig. 1). The lithosphere west of 140°W in this area was formed between the Pacific and Farallon plates under normal spreading conditions, but at a fast half-rate of 7 cm/year [2,3]. The crustal age based on magnetic lineations is about 45 Ma (isochron 20), or mid-Eocene. The regional physiography is one of
abyssal hills with a nominal but variable 50-100 m cover of terrigenous clay sediment. The local relief around the H2O junction box is quite subdued; a deep-towed survey for 1 km around that point reveals no rock outcrops and very gentle relief of a few tens of meters on a smoothly sedimented bottom. The science drivers behind the installation of H2O are primarily in global geophysics, especially seismology. The H2O site is located at a point on Earth's surface where there is no land for about 2000 km in any direction. For this reason, it is a high priority site for the Ocean Seismic Network (OSN) component of the Global Seismic Network (GSN) [4], and serves as the first operational OSN station. While the present H2O seismometer utilizes a buried broad band sensor, the Ocean Drilling Program (ODP) is scheduled to drill a re-entry borehole close to the H2O site at the end of 2001 for subsequent installation of a downhole seismometer. The H2O site is also one of eight seafloor locations identified by the geomagnetic community where permanent observatories are required [5]. Finally, H2O is located at a logistically convenient place for testing permanent seafloor instrumentation and observatory concepts in the deep ocean.
Figure 1. The Hawaii-2 Observatory is sited 1750 km east-northeast of Honolulu. The path of the Hawaii-2 cable runs between Oahu and Kauai, then heads northeast toward California.
2. SHORE INSTALLATION
Originally, HAW-2 was powered from both the California and Hawaii ends of the system, using opposite polarity power supplies and a telecommunications-standard seawater return. In 1992, the California end of HAW-2 from the shore station out to deep water was removed by AT&T, and hence H2O was designed to be a single-end system powered from Hawaii with a local sea ground. Because the shore plant had been removed from the Makaha cable station, this also required the reinstallation of the high voltage power supply and the high (radio) frequency line equipment. In the original HAW-2 installation, Makaha was a B terminal with a negative power supply operating at a nominal 5000 V, and low band (150-550 kHz) receive/high band (650-1050 kHz) transmit, yielding a total of 138 analog telephone channels, each occupying 3 kHz using frequency domain multiplexing. The original performance characteristics of the system were maintained at re-installation by using actual SD shore equipment previously salvaged from a commercial installation on Guam. The power supply was reduced from quad to dual redundancy and the high frequency line was modernized by replacing vacuum tubes with solid state devices, but all passive components remain the same, and hence the equalization properties of the system have not been altered. In fact, system tests after shore plant reinstallation showed that repeater performance was not substantially different from the initial values measured in 1964. In addition to the analog high frequency line, it was necessary to install modulation/demodulation equipment and associated computing hardware at Makaha to provide a mirror image of the H2O junction box electronics, as described in the next section. This provides a series of digital data streams which are transmitted over a frame relay from Makaha to the University of Hawaii, and then onto the Internet. One of the design goals of H2O was minimization of the need for local data archiving and instrument control by utilizing the Internet. In effect, instrument owners can control their bottom packages from anywhere on Earth.
3. THE H2O JUNCTION BOX
From the outset, it was deemed necessary to design a seafloor installation that minimizes the cost of instrument connection and provides a simple mechanical and electronic interface for scientific users. The former precludes the connection of dedicated in-line instruments using standard industry cable handling practices, both because of the high initial expense and because future instrument failures or upgrades would require a second costly recovery and re-installation. In addition, this approach deters future usage of the cable by the broad community due to the necessity for sophisticated and expensive installation tools. Instead, an approach was used which concentrates the electronic complexity in a seafloor junction box into which scientific users can simply plug their instruments using standard deep submergence assets, and which provides a comparatively simple digital communications interface as well as DC power. Finally, it was also required that the junction box be installed using a conventional, large oceanographic ship assisted by a remotely operated vehicle (ROV) rather than a specialized cable ship. The H2O junction box was designed with a number of goals intended to maximize reliability and flexibility.
Figure 2. Cartoon showing the H2O site. The termination frame is shown at the left, while the junction box is on the right. The sea ground is off to the right of the picture.
The mechanical aspects include (1) minimizing corrosion concerns, (2) maintaining compatibility with a generic ROV so that a specialized vehicle will not be necessary for future instrument installations, (3) simplifying the deployment, and (4) ensuring in situ stability. Figure 2 shows a cartoon of the mechanical layout of a seafloor installation that satisfies these criteria. The first condition is met by constructing the junction box of titanium alloys and plastic, thus avoiding the usual corrosion problems encountered with aluminum. The SD cable is terminated at a gimbal recovered from an SD repeater and attached to a titanium frame containing a wet-mateable underwater connector. Use of this termination frame allowed the SD cable to be lowered to the seafloor during installation without the complications which would ensue from attachment to the main H2O junction box, which is deployed separately. It also allows the junction box to be easily retrieved and serviced, either for upgrades or in the event of a failure. The SD cable is connected to the main junction box by a short (~30 m) oil-filled, underwater-mateable umbilical. The power conditioning pressure case on the junction box contains a shunt regulator to extract power from the constant current SD system, and is terminated by a sea ground which is deployed far enough from the junction box to eliminate corrosion concerns. The junction box electronics pressure case contains all of the systems necessary to control the power to, and communicate digitally with, instruments, multiplex the digital data they produce, and transmit it to Makaha on the submarine cable. It also contains the control systems necessary to adjust the communications systems and control power to individual instruments. The electronics pressure case is connected to an oil-filled connector manifold which provides eight ROV-compatible, wet-mateable 8-pin connectors, with four connectors on either side of the manifold. These provide an RS-422 communications interface with external instruments as well as 48-volt power. The connector manifold also houses two additional 4-pin connectors to which the termination frame and sea ground are attached. The connector manifold is designed to provide space for ROV access, and the connectors are
specifically intended to be compatible with a standard ROV manipulator, being based on the Ocean Design Nautilus family. The entire junction box sits on a broad weighted base and frame that protects vulnerable pressure cases and associated connectors, yet places the connector manifold well clear of the seafloor. The electronic systems for the H2O junction box were also designed to meet a set of criteria, including (1) making use of the available bandwidth on HAW-2 in an efficient manner, (2) accommodating both low and high data rate users, (3) making the main interface between plug-in instruments and the junction box digital, to simplify future design work and minimize noise, (4) protecting the cable system from interference or damage by users, (5) providing a down-link capability to control junction box and user instrument functions, (6) making use of commercial hardware whenever possible to minimize engineering costs, and (7) using high reliability design principles with failsafes and fallbacks in case of partial failures. The first two criteria preclude simply maintaining the existing frequency division multiplex (FDM) SD architecture and assigning channel space directly to individual instruments, as this makes inefficient use of the available bandwidth for low data rate instruments and may be inadequate for higher rate ones.
Figure 3. Block diagram of the H2O junction box electronics. See text for details.
The DC and RF components arriving from Hawaii on the SD cable are separated at a power separation filter (PSF) located in the oil-filled manifold. The DC component is sent to the power supply pressure case. Because the SD system operates in a constant current mode at 370 mA, it is necessary to produce a voltage drop across a stack of shunt regulators to extract power from the system. The shunt regulators must extract a fixed amount of power from the cable, dissipating as heat whatever is not used by the junction box or instruments. There are eight shunt regulator stacks in series, each providing a nominal 50 watts, so that about 400 watts is available to power both the junction box electronics and user instruments. When the cable is powered on, two stacks come up automatically to support the system bus, and additional stacks can be brought on line as needed to power the user bus. Both the system and user bus power are generated by DC-DC converters to give electrical isolation of the cable and junction box systems. Standard levels of 48 volts are sent to the main electronics pressure case. The power conversion electronics can be monitored and adjusted using a control computer which is completely independent of the remainder of the junction box, providing a degree of failsafe operation. The RF component is sent to the main electronics pressure case, where it is fed into the SD channel analog interface (Figure 3). The main electronics pressure case contains the remainder of the junction box systems for communications and control. The SD channel analog interface is a passive filter and summing network which separates the uplink (to H2O) highband and downlink (from H2O) lowband signals, and further combines or splits the FDM spectrum in each band while matching the impedance of the SD cable. Control of most junction box functions is provided through a system controller, which is a dual tone multiple frequency (DTMF) interface to the junction box and user power systems, and to the more fundamental electronic functions of the junction box. The system controller provides the ability to turn junction box subsystems on and off, select backup systems in the event of failure, halt and reboot junction box computer systems individually or collectively, and control power distribution to users. A separate system telemeters junction box electronic parameters, such as system and user bus voltages and currents or subsystem status information, back to shore on a dedicated channel. It also provides a ground fault detection capability to protect connectors and pressure cases from damage in the event that a component failure allows electric current to pass through the pressure case wall. Overcurrent at each user connector is prevented by a foldback current limiter, preventing a single user from drawing more than about 50 watts. The heart of the junction box is a set of five communications blocks, each consisting of an FDM interface, V.34 protocol modems, and a PC104 computer. Each of the five blocks provides bidirectional communications at up to 80 kbps using the standard RS-422 protocol. In addition to the five blocks, there are seven individual modem channels for low data rate users. A multiplexer or crosspoint switch allows a given block or modem to be connected to any of the six connectors under command of the system controller. The PC104 is a compact implementation of the PC bus which satisfies reduced space requirements and power constraints for embedded control applications.
These computers serve to package and time stamp the data arriving from instruments and to distribute the packets among four modems. Precise (about 1 ms) timing is provided by an IRIG board in each stack. The reverse function is performed for data arriving from shore. Each modem occupies a 10-kHz dual sideband channel which is frequency-domain multiplexed onto a section of the downlink low or uplink high band. Each communications block occupies about 40 kHz of spectrum including guard bands, so the five blocks plus seven modem channels require about 270 kHz of the available 400 kHz. The remaining approximately 130 kHz of bandwidth is held in reserve to accommodate future expansion.
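The power and spectrum budgets quoted in this section can be tallied directly. All of the numbers in the sketch below are taken from the text; only the variable names are ours.

```python
# Tally the H2O junction box power and bandwidth budgets.
I_CABLE = 0.370                  # constant cable current, A
N_STACKS, P_STACK = 8, 50.0      # shunt regulator stacks, nominal W each

p_total = N_STACKS * P_STACK     # ~400 W for junction box plus users
v_drop = p_total / I_CABLE       # voltage the stacks must drop in series
print(f"total power {p_total:.0f} W -> ~{v_drop:.0f} V across the stacks")

BLOCK_KHZ, N_BLOCKS = 40, 5      # communications blocks incl. guard bands
MODEM_KHZ, N_MODEMS = 10, 7      # individual low-rate modem channels
used = N_BLOCKS * BLOCK_KHZ + N_MODEMS * MODEM_KHZ
print(f"spectrum used: {used} kHz of 400 kHz ({400 - used} kHz in reserve)")
```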
A mirror image of the junction box's set of communications blocks and modems is connected to the high frequency line at Makaha. A computer system at Makaha provides full remote control of all multiplexing, monitoring, and stack control functions through a graphical user interface. It also separates the control and user data streams to enhance security.
4. INSTALLATION
The H2O junction box was installed in September 1998 using a large US oceanographic research vessel (R/V Thomas G. Thompson) and the Woods Hole Oceanographic Institution Jason ROV. The installation site was selected, based on prior site survey data, to be approximately halfway between the lay positions of two repeaters, to minimize the possibility of damage while handling the SD cable. The cable was first located visually using the ROV and found to be about 3/4 nm south of the nominal lay position (which is based on 1964 navigation capability). The ROV then transited along the cable for 5 km (one water depth) toward California. The cable was cut at this point using a hydraulic cable cutter on the vehicle. The Hawaii end of the cut cable was recovered using a flatfish grapnel attached to the oceanographic 9/16" trawl wire on Thompson. The recovery point was 5 km toward Hawaii from the cut point, so that an equal amount of cable would hang from either side of the grapnel to avoid slippage and cable loss during retrieval. The cable recovery took about 24 hours, with trawl wire tension continuously near the operating limit of 24,000 lb. The vessel was maneuvered using its dynamic positioning system to minimize cable tension and maintain a nearly vertical cable throughout the operation. The cable was recovered to the fantail of the research vessel through a stern chute and secured to bitts on deck. It was then cut at the lift point so that the Hawaii end could be identified. The remaining 5 km cable stub was discarded. The cable termination depicted in Figure 2, which had an SD pigtail already attached to its gimbal, was then spliced into the main cable using standard industry methods. The junction box next underwent a thorough alignment and final checkout on deck after the cable was repowered from the Makaha end. This required about 4 days of closely coordinated effort between shipboard and Makaha engineers, during which the cable was hung from the Thompson's stern A-frame and the ship maintained station using dynamic positioning. Once testing was complete, the junction box was unplugged from the cable termination and the latter was maneuvered over the stern to be lowered to the seafloor on the end of the trawl wire through a set of acoustic releases. During this operation, the 1/2" chain holding the load on the acoustic releases failed, sending the cable and termination plunging to the seafloor. The resulting pile of cable was surveyed using a towed camera system, and the termination frame was found to be intact, upright, on top of the cable pile, and within 2 m of open seafloor. The system was tested by installing a shorting plug on the termination frame with Jason and powering it up. Functionality was found to be normal, although the final location of the termination frame was about 2 km west of the intended installation site. The junction box was lowered to the open seafloor north of the termination frame on an acoustically-navigated trawl wire through acoustic releases.
Jason proceeded to hook up the umbilical cables linking the junction box to the termination frame and the sea ground, respectively. The system was tested and found to function correctly, but a total failure
occurred about 12 hours later, which necessitated recovery of the junction box. This was done by hooking a lift line dropped from the ROV depressor weight to the top of the junction box with Jason and recovering all three components (depressor, junction box, and ROV). The problem was quickly traced to contaminated oil in the manifold and repaired. The junction box was then re-installed. The primary sensor deployed at the H2O site is a ULF seismic system consisting of two packages. The main acoustic sensor package (ASP) houses most of the electronics along with both absolute and differential pressure sensors, a hydrophone, a temperature sensor, and a two-component current meter. This package is connected to the H2O junction box with a short tether to provide communications and power. The ground motion sensor package (GMSP) is pulled away from the ASP by Jason and lowered into a caisson that the ROV had previously buried in the sediments using a hydraulic pump. The GMSP sensors include a Guralp CMG-3 broadband triaxial seismometer along with tiltmeters, a temperature sensor, and a leveling system. The sensor stage also
Figure 4. Earthquake recorded by the H2O seismic system on May 4, 2000. This magnitude 7.3 quake occurred at 04:21:33 Z at 1.41°S, 123.6°E, at a distance of 90° from the H2O observatory. The X and Y traces show movement in the horizontal direction, while the Z trace shows movement in the vertical direction. The hydrophone shows changes in pressure. The data have been filtered to display signals at frequencies below 0.12 Hz.
contains a mechanical shaker, consisting of an offset cam on a motor, that can be used to vibrate the package at frequencies from 5 to 60 Hz for calibration and coupling tests. Further calibration can be performed on command by sending specified inputs to the Guralp feedback coils. The Guralp sensor is digitized over both high and low gain ranges, effectively yielding 24-bit resolution, as is standard Global Seismic Network practice. Figure 4 shows a sample seismogram from the broad band sensor package. About 2 months after installation in 1998, the seismic package failed due to catastrophic flooding of the current meter. In September 1999, the H2O site was visited to repair this package and install some upgrades to the junction box to improve performance. A separate high frequency hydrophone was also added to the instrument suite. The entire system has performed well since this time. Seismic data from H2O are archived at the IRIS Data Management Center in Seattle, from which they are freely available to scientists around the world. Further plans for instrument installations at H2O include a benthic biology experiment and a seafloor geomagnetic observatory. A borehole seismometer will be installed in 2003 or 2004. It is anticipated that H2O will serve the scientific community for many years.
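Dual-gain digitization of the kind mentioned above can be merged into a single wide-range trace by falling back to the low-gain channel wherever the high-gain channel clips. The sketch below is illustrative only; the gain ratio, full-scale value and function names are assumptions, not the H2O firmware.

```python
# Illustrative merge of dual-gain recordings into one wide-range trace.
import numpy as np

GAIN_RATIO = 128.0          # assumed high/low gain ratio
FULL_SCALE = 2 ** 15 - 1    # assumed 16-bit digitizer full scale, counts

def merge(high_counts, low_counts):
    """Return one trace in low-gain units, preferring high-gain samples."""
    high = np.asarray(high_counts, dtype=float)
    low = np.asarray(low_counts, dtype=float)
    merged = high / GAIN_RATIO               # high-gain data, low-gain units
    clipped = np.abs(high) >= FULL_SCALE
    merged[clipped] = low[clipped]           # substitute where clipped
    return merged

truth = np.array([10.0, 200.0, 5000.0])      # ground motion, low-gain units
high = np.clip(truth * GAIN_RATIO, -FULL_SCALE, FULL_SCALE)
print(merge(high, truth))                    # recovers [10., 200., 5000.]
```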
The MBARI Margin Seismology Experiment: A Prototype Seafloor Observatory

Debra S. Stakes a, Barbara Romanowicz b, Michael L. Begnaud c, Karen C. McNally d, Jean-Paul Montagner e, Eleonore Stutzmann e and Mike Pasyanos f

aMBARI, 7700 Sandholdt Road, Moss Landing CA 95039-0628, U.S.A.
bUCB Seismographic Station, Berkeley, CA, U.S.A.
cMBARI (now at Los Alamos National Laboratory, Los Alamos, NM 87545, U.S.A.)
dUC Santa Cruz, Dept. Earth Sciences, Santa Cruz, CA 95064, U.S.A.
eIPG, Dept. Seismology, Case 89, Paris 75252, France
fUCB (now at Lawrence Livermore National Laboratory, Livermore, CA 94551, U.S.A.)

The MBARI Margin Seismology Experiment, conducted from 1996 through 1999, had both technical and scientific goals. The technical goals were to develop new sensors and methods for the development of long-term seafloor geophysical observatories. The scientific goals of the project were to constrain the seismicity of the major faults that crosscut the continental margin of Central California. The 1997 component of this project was MOISE (Monterey Bay Ocean Bottom International Seismic Experiment), an international cooperative pilot experiment that successfully deployed a suite of geophysical and oceanographic instrument packages on the ocean floor using MBARI's ROV Ventana, a tethered Remotely Operated Vehicle (ROV). The goal of MOISE was to advance the global Seafloor Observatory effort through the development and installation of a prototype suite of instruments placed on the western side of the San Andreas fault system offshore of Central California. The MOISE instrument suite was a digital broad band seismometer package partially buried within the sediment-covered floor of Monterey Bay. Several regional earthquakes of magnitude 3.5 and larger, as well as several large teleseisms, were well recorded during the three-month deployment. The seismic data from MOISE suggest that burial of the broad band sensor package in the continental margin sediments adequately reduces the noise from bottom currents, such that both regional and teleseismic events can be usefully recorded, at least in the "low-noise notch" (a period band between 5 and 50 sec). Both conventional and well-coupled, ROV-installed, short-period instruments were deployed in conjunction with the MOISE experiment. During 1998, an offshore network of five ROV-installed instruments was continuously deployed for 8 months. These MBARI "corehole" seismometers include short-period geophone packages mounted in an underwater housing that can be inserted into a 2.5" diameter borehole to provide improved mechanical coupling with the seafloor. The results of this field program demonstrate that placing instruments in offshore sites reduces azimuthal
gap, horizontal and vertical location errors, and provides more robust focal mechanism solutions to constrain seismicity on nearshore faults.
1. NEED FOR LONG-TERM SEAFLOOR OBSERVATORIES
The limited distribution of continents and islands around the world precludes adequate coverage by land-based geophysical observatories to address many important scientific issues related to plate tectonics and the deep structure and dynamics of Earth (e.g. [1]). Long-term seismic observatories on the ocean floor are necessary both for global dynamics studies of deep Earth structure and for regional active-process studies that focus on the seismicity, tectonics and hydrothermal volcanic activity of Earth's crust (e.g. [2]). Continuous measurements from seafloor instruments also provide the ability to characterize episodic events, such as undersea volcanic eruptions and avalanches, as well as unrecognized linkages with biogeochemical processes which may not be noted with traditional expeditionary oceanographic approaches. Such long-term stations should be as low-noise, multidisciplinary and broad band as possible, with deployment periods of at least 5 years (e.g. [3, 4]). Arrays of multidisciplinary sensors that include both short and long-period ocean bottom seismometers placed in accessible near-shore sites can be used to constrain critical regional processes that may have significant impact on heavily populated coastal areas.

In northern California, existing public broad band and short-period stations are predominantly located on the eastern side of the North America/Pacific plate boundary, and the seismic activity on the offshore fault system related to this plate boundary is very poorly documented. The complex of active faults that crosscut the continental margin is considered part of the San Andreas system or relicts of the pre-San Andreas Oligocene plate reorganization. The San Gregorio (SGF) and Monterey Bay fault zones (MBFZ) are the major offshore faults in central California. Seismic activity along these structures has been correlated with the distribution of benthic cold seep communities [5, 6] and submarine mass wasting events [7]. There is some chance that they could pose an earthquake hazard for the adjacent populations from Monterey Peninsula to Santa Cruz [8]. The estimated location, mechanism and size of moderate to large events associated with the SGF and the MBFZ are biased by the uneven distribution of seismograph stations. The historical catalogue of seismic events suggests that earthquakes of magnitude 4.0 have not been uncommon. For example, a magnitude 6.2 earthquake doublet was recorded for Monterey Bay in 1926. In addition, a recent analysis by the U.S. Geological Survey suggests that the northern San Gregorio is capable of a magnitude 7.0 event [8]. The sparse distribution of seismograph stations near the Monterey Bay, combined with the absence of stations on the west side of the faults, created large errors in the determination of hypocenters and focal mechanisms for the characteristic moderate to small events.

Gas hydrates are stable in deep (>300 m) marine environments. Deep ocean drilling discovered hydrates along both the eastern and western American continental shelf (Peru, Costa Rica, Guatemala, Mexico, United States, Canada), offshore Japan, New Zealand, Norway, and the Black and Caspian Seas [36]. The quantity of methane hydrates in many oceanic environments is substantial [39,40]. Globally, current estimates are around 10^19 g of methane carbon, a factor of 2 larger than the carbon present in all known fossil fuel (coal, oil, and natural gas) deposits [35].
Hydrates also represent an important hazard when they decompose or melt, releasing free gas, potentially becoming a source of "greenhouse" gas and altering the engineering properties of the seabed. Ascending gas plumes and bubble columns may sometimes originate from hydrate-bearing seafloor [41], leading to significant output of CH4 into the seawater and the atmosphere. Hydrates generally occur within the upper few hundred meters of seafloor sediments, and often overlie deeper zones containing bubbles of free gas [39,42]. Possible relationships between hydrates, methane fluxes and pockmarks have recently been discovered [10,21,43], suggesting a quite complex picture of seafloor gas phenomenology. In particular, the amounts and temporal variations of gas stored and emitted, as well as the evolution of seabed structures and sediment engineering properties, remain poorly understood.
3. IMPLICATIONS FOR THE OFFSHORE INDUSTRY

3.1. Hydrocarbon exploration

Detecting hydrocarbons offshore has been a standard part of exploration programmes for some time. Hydrocarbon seeps at the seafloor provide an indirect view of deep petroleum production, migration and accumulation processes. These seeps have in fact been used in petroleum prospecting and have led to the discovery of major oil and gas fields. Hydrocarbon microseepages can create anomalies in the near-bottom seawater chemistry, which outline hydrocarbon pools and can therefore be used as a viable exploration tool. Pockmarks and other geomorphic structures can be important indicators of subsurface petroleum deposits. Also, biogenic production of methane by methanogenesis in sediments below the zone of sulfate reduction can generate sufficient methane to produce gas blisters and pockmarks. However, even if at the global scale the volumes leaking offshore are significantly greater than those detected onshore, many of the offshore fields now being discovered do not necessarily have visible seeping hydrocarbons. On the other hand, offshore gas leakages are not always an indication of productive reservoirs, but may be biogenic. Many wells drilled within areas of gas leakage were unproductive, with consequent substantial economic losses for the oil companies. Large sums of money were lost just because the gas seeps were not submitted to detailed investigations in terms of fluid origin (biogenic vs. thermogenic), mixing processes and relation with brittle tectonics.

It is important to be able to differentiate methane produced by shallow biological processes from that produced by thermogenic processes at depth. This can be done by two geochemical methods, based on the determination of the isotopic composition of the methane and/or the presence or absence of ethane and heavier hydrocarbons. Only then, through continuous temporal monitoring, can important information on the subsurface fluid potential, such as the amount of gas discharged, its origin, migration episodicity and relation with seismic activity, be acquired. A possible optimal use of temporal monitoring of gas leakages is in field definition after a wildcat discovery and in the extension of older producing fields. Basically, a programme of gas monitoring can help to minimise the percentage of failures and maximise the probability of economic success.
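As an illustration of the two discrimination criteria just described, the sketch below screens a gas sample using the molecular C1/(C2+C3) ratio together with the carbon isotopic composition of methane. The thresholds are conventional rule-of-thumb values from the gas-geochemistry literature (a Bernard-type diagram), not values given in this chapter.

```python
def classify_gas_origin(c1, c2, c3, delta13c_ch4):
    """Rough biogenic/thermogenic screening of a seep gas sample.

    c1, c2, c3   : methane, ethane, propane concentrations (same units)
    delta13c_ch4 : carbon isotopic composition of methane (permil vs PDB)

    Thresholds are conventional rule-of-thumb values; real interpretation
    must also consider mixing and oxidation effects.
    """
    ratio = c1 / max(c2 + c3, 1e-12)  # "Bernard" molecular ratio
    if ratio > 1000 and delta13c_ch4 < -60:
        return "likely biogenic (microbial)"
    if ratio < 100 and delta13c_ch4 > -50:
        return "likely thermogenic"
    return "mixed or ambiguous - needs further analysis"

# Example: a dry, isotopically light gas points to a microbial origin.
print(classify_gas_origin(c1=0.98, c2=0.0005, c3=0.0001, delta13c_ch4=-68))
```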
3.2. Offshore engineering and exploitation activity

The presence of free gas within shallow sediments may have important consequences for the offshore industry [44]. In particular, high gas concentrations and pressures introduce the risk of potentially catastrophic "blow-outs" during drilling operations, so that shallow over-pressured gas zones may cause significant delays in the exploitation schedule, and could even result in field abandonment [1]. Moreover, the presence of shallow gas could affect the performance of foundations of offshore structures in at least three different ways:

1. the presence of undissolved gas could alter the engineering properties of sediments, such as stiffness and strength, thus influencing possible displacement and ultimate load capacity of a foundation;
2. the interface between foundation and sediments could act as a preferential migration pathway for gas, reducing the skin friction of piles and skirts and causing a reduction in the load capacity of a foundation;
3. gas flux from the seabed could affect foundation performance: a build-up of gas inside the skirts of a gravity platform could cause instability (as well as potential hazards from fire and explosion), and sudden emission of large volumes of gas into the sea would reduce the buoyancy of a structure and result in the formation of pockmarks threatening the stability of a foundation [45,46].

Basically, free gas in the seafloor increases the compressibility and reduces the undrained shear strength and speed of sound in sediments [44]. These parameters can evolve and change with time, depending on the variation of gas concentration in the sediments or on episodic venting. Seismic activity, in particular, can be responsible for sudden gas outputs perturbing the sediment performance, analogous to sand liquefaction on land. In the presence of gas hydrates, operations that alter the in situ temperature and pressure conditions on the seafloor or within the sediments may cause hydrates to decompose, releasing methane and water to the sediment [47,48]. Potential problems include: sediment instability; seabed erosion; corrosion and dissolution of materials; and the formation of seafloor biological communities that subsist on methane. Furthermore, these problems may not arise immediately, but may take years to develop as fluid flow from the producing formation warms the surrounding sediments, heating ambient gas hydrates and causing their breakdown (Figs. 3 and 4).

Shallow gas in the seabed may lead to mud volcanoes, which are recognised to be the main threat to mobile offshore drilling units (MODU), pipelines and fixed platforms [49]. In some locations, such as the Caspian Sea, mud volcanoes are capable of erupting without warning, blowing mud, water, fire, boulders, and explosive gas into the air and devastating any structure in the area. There is a large amount of methane gas involved in these explosions, along with formation water and unconsolidated rocks, including boulders. The threats of gas seepage, seafloor fracturing, and shallow gas pockets are very real hazards for drillers operating from bottom-mounted rigs. Falling mud can endanger the foundation of a rig. Extruding mud creates serious cracking, disrupting the surface with faults or reactivating pre-existing faults. Operators need to consider the presence of mud volcanoes when placing rigs on the bottom. The weight transfer involved in jacking up a rig could disturb the seafloor, or punch through in an area with an unstable surface. As the mud is compressed, it rises like a fluid, being less dense than the surrounding rock. As the mud, with petroleum gases suspended under pressure in its pore water, reaches the surface, volume expansion occurs, releasing this gas violently under explosive pressures. The over-pressured slurry can escape at very high rates, producing not only flowing mud, but a great amount of gas. The mixture moves so quickly that it can spontaneously ignite at the surface. There can also be a localization of gas around the foundation of fixed platforms or MODUs, or contained in pockets below the seafloor. The gas in these pockets is under high pressure and is released quickly if the pocket is disturbed by drilling or ground breakage from a drilling mat.
Another problem is the spontaneous emission of gas which has collected under a rig. Pipelines are also subject to the flow of mud off the slopes of these volcanoes. The mud flows can sweep a pipeline out of position, or buckle or crush it.
Figure 3. Schematic diagram illustrating the potential effects of decomposing gas hydrate on a subsea production system. At initial conditions, a subsea wellhead produces hydrocarbons from a reservoir. At later conditions, producing hydrocarbons from formations below causes gradual heating of the sediments surrounding the well bore. Any gas hydrates within the sediment break down and release methane and water. Methane can thus move up toward the seafloor, creating a new methane source and/or re-forming as gas hydrate. Methane, in contact with sulphate, causes H2S to form, creating a corrosive environment. Furthermore, both methane and H2S at the seafloor can attract chemosynthetic biota and stimulate biological community growth. With gas hydrate at the seafloor, rafts of hydrate and sediment have the potential to lift off the bottom and eventually create excavations, exposing additional infrastructure to the corrosive action of H2S-charged waters (from [47] with permission of Offshore Technology Conference).
Figure 4. Schematic diagram showing gas hydrate dissociation causing sediment instability. At initial conditions, a pipeline is buried in deep-water sediments containing gas hydrates. Over time, heat from hydrocarbons flowing in the pipeline warms the surrounding sediment, changing ambient temperature conditions. Heating causes gas hydrates to dissociate, releasing methane and water into the pore spaces of the sediment. Adding these fluids causes sediment instability and slumping on the seafloor. The pipeline ruptures when the sediment founders and shifts (from [47] with permission of Offshore Technology Conference).
In areas hosting production wells, the occurrence of gas in the seafloor and its emission into the sea may vary over time as a result of changes in the subsurface pressure regimes. During drilling and exploitation activity, sedimentary rocks and fluid reservoirs are subjected to mechanical stresses and to variations in loading, pressure and temperature, all leading to changes in the gas migration and/or oil recovery potential. It is evident therefore that simultaneously monitoring gas occurrence and seismic activity in correspondence with offshore structures would represent an important surveillance action. At the same time, in the case of production wells, monitoring provides a useful approach for controlling the fluid extraction rate and acquiring information on how fluid extraction can influence the circulation of fluids in the surrounding sediments, possibly inducing seismicity, with negative impacts on the submarine structures and systems.
4. OFFSHORE GAS DETECTION

4.1. Background
The search for and detection of offshore gas and related structures can be based on geochemical and geophysical techniques. In geochemical surveys it is assumed that the hydrocarbons leaking from the ocean floor form a plume that is either directly above the accumulation, in the absence of currents, or is shifted down-current. In some cases, a hydrocarbon anomaly is not detected in the water unless sampling is done close to the seabed. Since the 1960s, offshore hydrocarbon leakages have been investigated through areal surveys, based on grab sampling and analyses of water, but rarely repeated over time. These surveys are commonly labelled the "sniffer" technique. The sniffer consists of a metal "fish", containing an electric pump and a sonar system, which is towed behind a ship. Water is continuously pumped to the surface, degassed and analysed for hydrocarbon gases by gas chromatography. This approach, useful for reconnaissance areal surveys, has two main limitations:

- the technique is limited to water depths of approximately 300-400 m due to drag on the cable supporting the "fish", making depth control difficult;
- the results, deriving from discrete sampling, do not allow a detailed examination of the nature and time variation of the gas leakage.

Geophysical surveys are based on high-resolution reflection profiles, side-scan sonar, and echosounder recording [9,12,13,19]. Gas is recognised either as enhanced reflectors or by acoustically obscured zones. A particular feature is that of dome-shaped reflectors, low-relief intrasedimentary doming formed by high-pressure gas build-up [7]. Pockmarks are observed as conical and dish-shaped incisions in the seismic records. The activity of pockmarks, i.e., a continuous supply of gas, can be recognised by the existence of a columnar, acoustically turbid zone underneath the pockmarks themselves [13]. The integration of these geochemical and geophysical methods is always the most effective approach for exploring and defining offshore gas occurrence in the space domain; but they cannot provide fundamental information in the time domain.

4.2. A new approach for long-term monitoring
The key elements related to the subsurface fluid potential and seafloor evolution, such as the amount of gas discharged, migration episodicity, and relation to seismic activity,
can only be acquired through continuous and long-term data recording. Autonomous benthic stations are the most recent technological development for this purpose. The main scientific and technological requirements for a benthic station capable of performing gas leakage monitoring include:

- versatile and cost-effective procedures of station deployment and recovery in shallow and deep sea sites;
- continuous and long-term multiparametric measurements;
- precision positioning of the sensors within a stable platform;
- good performance of the sensors for long-term (several months) measurements;
- real-time or near real-time data communication;
- management of instrumentation by means of an intelligent unit (data acquisition and control system), allowing periodic and automatic adjustments of the sensors to maintain data quality.
These requirements are presently fulfilled by the GEOSTAR station (Geophysical and Oceanographic Station for Abyssal Research), the first observatory available for monitoring geophysical, geochemical and oceanographic parameters from coastal areas to the deep seafloor [50,51]. GEOSTAR can host a series of instruments, including seismometers, magnetometers, current meters, conductivity/salinity and temperature sensors, water samplers and chemical probes. In particular, seafloor degassing can be monitored through a set of specific sensors. In this respect, GEOSTAR is presently equipped with a time series water sampler, a CTD (sensor for water conductivity, temperature, pressure, and derived parameters such as salinity and density), a 0.25-m pathlength transmissometer for water turbidity determination, a chemical package for pH and H2S measurement, and an ADCP (Acoustic Doppler Current Profiler) for monitoring water current direction and velocity. The water sampling unit can be used over 18 months, with 48 collection events of 500 ml each. Water can be sampled at specific points and heights above the seabed by sampling tubes fixed onto the station frame. After the GEOSTAR recovery, water samples can be analysed in an onshore laboratory for gases (e.g., hydrocarbons, carbon dioxide, oxygen), chemical constituents, biological products and isotopes. The pH and H2S electrode system includes a self-calibration device and has recently been tested successfully in a hyperbaric chamber under 350 bars pressure. Upgrades and the addition of new sensors (e.g., for NH4+ and total iron by colorimetry) are planned. Thanks to the availability of additional interfaces, new instruments for further chemical parameters of seawater and indirect tracers of hydrocarbon leakage (e.g., CO2, CH4, Eh) can also be added to GEOSTAR.

All sensors are managed by an intelligent unit (DACS, Data Acquisition and Control System) able to perform a first level of data quality control. A high-accuracy clock, included in the seismometer, acts as a time reference for the whole station, allowing precise comparison in the time domain among all the scientific measurements to evaluate possible correlations: the simultaneous collection of different data will allow possible causal links among natural phenomena to be studied, and hence lead to a better understanding of the phenomena observed at the sea bottom. The accurate positioning of the station with respect to seepage manifestations or gas hydrate occurrences is a further advantage of the system.
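A minimal sketch of what such a first-level quality-control step in a DACS-like unit might look like is given below; the parameter names, plausibility ranges and record layout are hypothetical, not the actual GEOSTAR implementation.

```python
import time

# Hypothetical plausibility ranges for first-level quality control;
# the actual GEOSTAR limits are not given in the text.
QC_RANGES = {
    "temperature_C": (-2.0, 40.0),
    "salinity_psu": (30.0, 42.0),
    "pH": (6.5, 9.0),
    "turbidity_ntu": (0.0, 100.0),
}

def acquire_record(sensors, clock=time.time):
    """Collect one multiparametric record, stamped with the shared clock.

    `sensors` maps parameter name -> zero-argument read function.
    Out-of-range values are flagged rather than discarded, so an
    onshore analyst can decide what to do with them later.
    """
    record = {"timestamp": clock()}
    for name, read in sensors.items():
        value = read()
        lo, hi = QC_RANGES.get(name, (float("-inf"), float("inf")))
        record[name] = {"value": value, "qc_ok": lo <= value <= hi}
    return record

# Usage with stub sensors standing in for the real instruments:
sensors = {"temperature_C": lambda: 13.2, "pH": lambda: 8.05}
print(acquire_record(sensors))
```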
The GEOSTAR Mobile Docker, the deployment/recovery vehicle, can transport the station to the seafloor and perform accurate landings thanks to specific devices including thrusters, telemetry, video, lighting, sonar, and an altimeter. The solution adopted overcomes typical limitations of other systems, such as the need for neutral buoyancy in water, free-falling deployment (very imprecise and rough), short autonomy and limited payload. Besides the environmental monitoring described above, the station can also host instrumented packages devoted to the acquisition of measurements related to cathodic protection design parameters and geotechnical parameters. With reference to geotechnical surveys, the benthic station should be upgraded in order to investigate the seabed with good resolution in the top layers, enabling a proper design of a future gravity structure to be installed in the surrounding area. In particular, the integrated station shall be able to provide an exhaustive set of soil mechanics data useful for mud mat and skirt design, pipeline design and laying, trenching operations and anchoring of vessels. The geotechnical instrumentation necessary to perform the requested task should typically entail a penetrometer, a shear strength meter, a plate pressure meter and, possibly, a core sampler. This instrumentation could be remotely controlled from the surface by means of the GEOSTAR Mobile Docker telemetry line (see also Oebius and Gerber, in this volume). The possibility of managing long-term collection of seismological data represents a further important feature, as discussed in Sections 2.1 and 3.2. A significant example in this regard is the PROCAP-2000 programme launched by the Brazilian oil company Petrobras, aimed at characterising the Brazilian continental shelf in the Campos Basin area (2000 m water depth). It should include a specific activity devoted to seismic monitoring, for which the use of instrumented benthic stations is foreseen.
5. CONCLUSIONS

Natural leakage of gaseous hydrocarbons at the seafloor (gas bubbles in the sediments, leaking hydrates, vents in pockmarks) is a widespread and significant phenomenon in hydrocarbon-prone basins, as recent literature increasingly recognises. Still, offshore gas leakage has not been adequately explored, even though it is an important hazard for offshore oil exploration, exploitation, and the construction of related facilities. The possibility of studying and monitoring gas occurrence and evolution at the seafloor is now offered by autonomous benthic observatories, which can be positioned with great precision on the seafloor and can collect long-term, continuous and multiparametric (geophysical, geochemical, oceanographic and geotechnical) data. Benthic observatories, like the existing GEOSTAR, can be used for reliable characterisation and monitoring of a specific site where, for example, an offshore structure or subsea system is to be installed. The advantages would include the evaluation of parameters related to the subsurface fluid potential and seafloor evolution. Such parameters, e.g. the amount of gas discharged, gas output episodes, and seismic activity, are essential for the management of both exploration and construction activities. The data will also aid in finding the best solutions to engineering problems associated with hydrocarbon production and gas occurrence in seafloor sediments. This approach would enable the evaluation of the best commercial hydrocarbon prospects. As occurred in the realisation of GEOSTAR, a close collaboration between experts of different disciplines, coming from scientific and industrial groups (engineers, geochemists, geophysicists), is now demanded for the effective development, testing and operation of advanced systems for seafloor monitoring.
ACKNOWLEDGEMENTS
Thanks are due to R.W. Klusman and T. Francis for their helpful reviews and comments. The GEOSTAR project has been funded by the European Commission (EC contracts MAS3-CT95-0007 and MAS3-CT98-0183). The GEOSTAR partnership includes Istituto Nazionale di Geofisica, Tecnomare S.p.A., Technische Universität Berlin, Technische Fachhochschule Berlin, IFREMER, ORCA Instrumentation, Laboratoire d'Océanographie et de Biogéochimie of Marseille, and Institut de Physique du Globe de Paris. The geochemical package was developed by Systea s.r.l., with the collaboration of Tecnomare and Istituto Nazionale di Geofisica.
REFERENCES
1. M.E. Smith, M.K. Rieken, D.J. Streu, M.T. Gaona and J. Sousa, Offshore Technology Conference, Houston, OTC 8893, 1998.
2. G.B.J. Fader, Cont. Shelf Res., 11 (1991) 1123.
3. E. Suess, in: Weydert et al. (eds.), 2nd MAST days and Euromar Market, 1 (1995) 303.
4. E. Uchupi, S.A. Swift and D.A. Ross, Marine Geology, 129 (1996) 237.
5. J.C. Moore, AAPG Bulletin, 83 (1999) 681.
6. N.-A. Mörner and G. Etiope, Global Planet. Change, (2001) in press.
7. M. Hovland and A.G. Judd, Seabed Pockmarks and Seepages: Impact on Geology, Biology and the Marine Environment, Graham and Trotman, London, 1988.
8. L.H. King and B. MacLean, Geol. Soc. Am. Bull., 81 (1970) 3141.
9. J.T. Kelley, S.M. Dickson, D.F. Belknap, Barnhardt and M. Henderson, Geology, 22 (1994) 59.
10. Gas-related sea-floor depressions, in: Glaciated Continental Margins, Chapman and Hall, London, 1997.
11. L. Dimitrov and V. Dontcheva, Bull. Geol. Soc. Denmark, 41 (1994) 1.
12. A. Garcia-Garcia, F. Vilas and S. Garcia-Gil, Environm. Geol., 38 (1999) 296.
13. T. Hasiotis, N. Papatheodorou, N. Kastanos and G. Ferentinos, Marine Geology, 130 (1996) 333.
14. M. Hovland, Marine Petrol. Geol., 8 (3) (1991) 311.
15. M. Hovland, Cont. Shelf Res., 12 (10) (1992) 1111.
16. M.D. Max, R. Schreiber and N.Z. Cherkis, Marine Geophys. Res., 14 (1) (1992) 77.
17. P. Söderberg and T. Flodén, Cont. Shelf Res., 12 (10) (1992) 1157.
18. P. Söderberg, Ph.D. thesis, Geol. Geochem., Stockholm Univ., 1993.
19. S. Soter, Tectonophysics, 308 (1999) 275.
20. J.P. Ellis and W.T. McGuinness, Proc. of Oceanology Intern. Conf., Brighton, March 1986, p. 353.
21. P.R. Vogt, K. Crane, E. Sundvor, M.D. Max and S.L. Pfirman, Geology, 22 (1994) 255.
22. S.J. Wheeler, Marine Geotechnol., 9 (1990) 113.
23. M. Hovland, A.G. Judd and R.A. Burke, Chemosphere, 26 (1993) 559.
24. S.M. Karisiddaiah, M. Veerayya, K.H. Vora and B.G. Wagle, Marine Geology, 110 (1993) 143.
25. T. Whelan, J.M. Coleman, J.N. Suhayda and H.H. Roberts, Marine Geotechnol., 2 (1977) 147.
26. G.B.J. Fader, BIO Review '85, Bedford Institute of Oceanography, 1985, p. 16.
27. A.G. Judd, J. Davies, J. Wilson, R. Holmes, G. Baron and I. Bryden, Marine Geology, 137 (1997) 165.
28. P.V. Curzi and A. Veggiani, Acta Naturalia de l'"Ateneo Parmense", 21 (1985) 79.
29. J. Platt, Offshore Eng., August (1977) 45.
30. G. Etiope, J. Environm. Radioactiv., 40 (1998) 11.
31. G. Etiope and S. Lombardi, Environm. Geol., 27 (1996) 226.
32. L.C. Price, A critical overview and proposed working model of surface geochemical exploration, in: Unconventional Methods in Exploration for Petroleum and Natural Gas IV, Southern Methodist Univ. Press, Dallas, 1986.
33. A. Brown, AAPG Bulletin, 84 (11) (2000) 1775.
34. M.E. Field and A.E. Jennings, Marine Geology, 77 (1987) 39.
35. K.A. Kvenvolden, Chem. Geol., 71 (1988) 41.
36. K.A. Kvenvolden, Rev. Geophysics, 31 (1993) 173.
37. E.D. Sloan, Clathrate Hydrates of Natural Gases, Marcel Dekker, New York, 1990.
38. E. Suess, G. Bohrmann and E. Lausch, Sci. Am., 281 (5) (1999) 76.
39. G.R. Dickens, C.K. Paull, P. Wallace and ODP Leg 164 Scientific Party, Nature, 385 (1997) 426.
40. G.D. Ginsburg and V.A. Soloviev, EOS, Trans. AGU, 76 (1995) S164.
41. C.K. Paull, F.N. Spiess, W. Ussler III and W.A. Borowski, Geology, 23 (1995) 89.
42. N.L.B. Bangs, D.S. Sawyer and X. Golovchenko, Geology, 21 (1993) 299.
43. W.S. Borowski, C.K. Paull and W. Ussler III, Geology, 24 (1996) 655.
44. G.C. Sills and S.J. Wheeler, Cont. Shelf Res., 12 (1992) 1239.
45. M. Hovland, Quart. J. Eng. Geol., 22 (2) (1989) 131.
46. K. Wang, E.E. Davis and G. van der Kamp, J. Geophys. Res., 103 (6) (1998) 12339.
47. W.S. Borowski and C.K. Paull, Offshore Technology Conference, Houston, OTC 8297, 1997.
48. R.P. LaBelle, Offshore Technology Conference, Houston, OTC 8708, 1998.
49. W. Furlow, Oil & Gas Journal, 59 (3) (1999).
50. L. Beranzoli, A. De Santis, G. Etiope, P. Favali, F. Frugoni, G. Smriglio, F. Gasparoni and A. Marigo, Phys. Earth Planet. Int., 108 (1998) 175.
51. L. Beranzoli, T. Braun, M. Calcara, D. Calore, R. Campaci, J.M. Coudeville, A. De Santis, G. Etiope, P. Favali, F. Frugoni, J.L. Fuda, F. Gamberi, F. Gasparoni, H. Gerber, M. Marani, J. Marvaldi, C. Millot, C. Montuori, P. Palangio, G. Romeo and G. Smriglio, EOS, 81 (5) (2000) 45.
The use of a Coastal HF Radar system for determining the vector field of surface currents

G. Budillon a, G. Dallaporta b and A. Mazzoldi b

aIstituto Universitario Navale, Istituto di Meteorologia e Oceanografia, Via Acton 38, 80133 Naples, Italy
bC.N.R., Istituto Studio Dinamica Grandi Masse, S. Polo 1364, 30125 Venice, Italy
The HF coastal radar, using the backscatter from sea waves of the electromagnetic waves emitted from land-based coastal stations, allows the mapping of sea surface currents along the coast with a range of several kilometers. This paper describes the use of the HF radar system installed within the framework of the MURST PRISMA 2 Project to measure the surface current field in the nearshore area in front of the city of Ancona (Northern Adriatic Sea). The system operated continuously from September 1997 to spring 2000. Its value lies in the possibility of monitoring a wide coastal area that would otherwise be difficult to cover, demonstrating its capability as a useful supplement to standard oceanographic field measurements. Preliminary results show the well-known southeastward (i.e. alongshore) coastal jet that extends its influence as far as 10-15 km from the coast; further offshore a cyclonic eddy, with a diameter of 10-20 km, typically occurs. This circulation can occasionally be reversed, showing an anticyclonic eddy that forces a northward coastal current with a far less energetic signature. The spectral analysis reveals a propagation of the semidiurnal frequencies from NW to SE, in accordance with Kelvin wave propagation theory; conversely, an exponential increase from the coast to offshore has been identified for the near-inertial frequencies. An analysis of the mean kinetic energy during a Bora wind episode describes the typical dynamics of a surface layer driven by an impulsive atmospheric force.
1. INTRODUCTION

Currents within 1 m of the sea surface are highly variable, being driven by geostrophic forces and tides, and are strongly influenced by the local wind and wave fields at different time and spatial scales. Near-surface current patterns, and how they respond locally to the relevant prevailing forces, are a crucial element for the effective management of operations in coastal waters, and an increasingly important input for global resource monitoring and weather predictions. Nearly all available techniques for measuring this thin surface layer are Lagrangian, meaning that they measure the trajectory of a parcel of water near the surface, thus obtaining one or more current streamlines as a function of time. The method used in our
case is an Eulerian one. It measures surface currents at points spaced approximately 1.5 km from each other on a fixed grid. The HF Radar experiment has been carried out in the framework of the national project "PRISMA-2" (acronym of the project for the safeguard of the Adriatic Sea, second phase), financed by the Ministry of University and Scientific Research (MURST), under the coordination of the National Research Council (CNR). The aims of the experiment were:
1. to obtain a map of coastal surface currents in order to give oceanographers a tool for defining and following frontal lines (e.g., the line dividing the coastal fresh water from offshore salt water);
2. to experiment with this recent technology in order to investigate its potential for other applications in the future.
2. HF RADAR
The mechanism which permits an HF radar system (wavelength λ ≈ 6-30 m) to measure sea surface currents involves the physics of reflection and dispersion of electromagnetic waves by sea waves. Under a given wind field, sea waves of different lengths are generated; of this multiplicity of waves, those travelling radially to the receiving antenna reflect back a large signal when they have a wavelength L = λ/2, λ being the HF wavelength. Observing sea waves from the coast with the HF radar and analysing the spectrum of received echoes, one can point out the presence of two overlapping peaks, symmetrically spaced around the transmission frequency (Fig. 1). They are the HF signals given by sea wave trains moving radially toward and away from the radar (Bragg effect), having a spatial period precisely one half of the transmitted radar wavelength. The spectrum of the continuous-wave transmitted signal is a narrow peak at the carrier frequency f0 = c/λ (taken as zero reference value), where c is the speed of light in vacuum. In the absence of current, the received first-order sea echo appears as two symmetrically spaced peaks about the carrier (Doppler shifts), corresponding to the sea wavelength L = λ/2. These peaks are shifted by a small amount with respect to their original positions in the presence of a surface current. This shift arises because the "target" seen by the radar is in motion, and the signal given out by the transmitter comes back at a frequency different from that transmitted because of the radial velocity of the "target". This velocity is the result of the linear combination of two components. The first is the gravitational phase velocity Vph = (gλ/4π)^1/2 (g = 9.81 m s^-2, gravity acceleration) of the surface waves; the corresponding Doppler shift is fph = 2Vph/λ [2]. The second component is due to the surface current, so that the two peaks produced by the waves are shifted by a further amount proportional to the radial component of the current velocity. This quantity is fcr = 2Vcr/λ, where Vcr is the real radial component of the current in the radar direction. The total Doppler shift is thus Δf = fph + fcr, and the velocity is obtained from the total shift using the equation Vcr = λΔf/2 - Vph. It has been demonstrated that the thickness of surface water that influences the Doppler shift, when a surface current is present, is about 1/25 of the radar wavelength [3]; the relation is approximately d = λ/8π, where d is the surface water thickness. Because a single radar station is capable of measuring only the component of flow along a radial beam emanating from the site, radial currents from two (or more) sites must be combined to form the surface current flow, which is two-dimensional (Figs. 2 and 3).
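The chain of relations above can be condensed into a few lines of code. The sketch below assumes a 25 MHz carrier, as used in this experiment, with sign conventions simplified for illustration.

```python
import math

C = 2.998e8   # speed of light (m/s)
G = 9.81      # gravity acceleration (m/s^2)

def radial_current(f0_hz, doppler_shift_hz):
    """Radial surface current from the measured first-order Doppler shift.

    Implements the relations given in the text:
      lambda = c / f0                  (radar wavelength)
      V_ph   = sqrt(g * lambda / 4pi)  (phase speed of the Bragg wave, L = lambda/2)
      V_cr   = lambda * df / 2 - V_ph  (radial current component)
    """
    lam = C / f0_hz
    v_ph = math.sqrt(G * lam / (4 * math.pi))
    return lam * doppler_shift_hz / 2 - v_ph

# For a 25 MHz radar, lambda ≈ 12 m, so the Bragg waves are ~6 m long and
# V_ph ≈ 3.06 m/s (Doppler shift ≈ 0.51 Hz with no current). A measured
# shift of 0.52 Hz would then imply a radial current of about 6 cm/s.
print(radial_current(25e6, 0.52))
```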
Figure 1. Generation mechanism of the Doppler shift [1]: spectra of the transmitted signal and of the received sea echo signal strength, showing the first-order sea echo (advancing and receding wave echoes) with no current, and with an advancing current producing the additional shift Δf = 2Vcr/λ.
Figure 2. Radial currents map from the remote sites.
The working range of HF radar depends on the attenuation of electromagnetic waves travelling from the transmitter to the "target" and back, on the energy back-scattered by sea wave roughness, on atmospheric noise and on noise from radio interference. The attenuation of HF waves is directly proportional to the working frequency and inversely proportional to conductivity, which is in turn a function of salinity and temperature. HF radar systems are constructed in a frequency range between 5 MHz and 50 MHz. The maximum absolute range at each frequency depends on radar parameters such as the power, the bandwidth, the antenna model, etc. Range resolution is limited by disturbances from radio interference, which are minimized by transmitting signals with narrow bandwidths. This problem is acute for HF radars working at low frequencies, near those of radio transmissions, while there is no interference from long-range radio sources at VHF frequencies higher than 50 MHz. In the absence of very near radio sources it is possible to employ broad bandwidths and obtain high spatial resolution: using a 150 MHz system one can obtain a resolution below 100 m. It is necessary, then, to find a compromise between working range and spatial resolution; for example, working between 20-30 MHz, the working range is 40-60 km and the spatial resolution is 0.3-1.5 km, well adapted to many coastal studies.
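The bandwidth/resolution compromise follows the standard radar relation ΔR = c/(2B), which is not stated explicitly in the text but underlies the figures quoted; a minimal sketch:

```python
C = 2.998e8  # speed of light (m/s)

def range_resolution_m(bandwidth_hz):
    """Radar range-cell size from the transmitted signal bandwidth.

    delta_R = c / (2 * B): a wider bandwidth gives finer range cells,
    but also more exposure to radio interference.
    """
    return C / (2.0 * bandwidth_hz)

# A ~100 kHz sweep gives ~1.5 km cells, as used in this experiment;
# ~300 kHz would be needed for the 500 m resolution mentioned below.
for bw in (100e3, 300e3):
    print(f"{bw/1e3:.0f} kHz -> {range_resolution_m(bw):.0f} m")
```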
3. RADAR STATIONS
Within the framework of the described MURST PRISMA 2 Project, the CODAR SeaSonde (by COS Ltd.) was installed by the Institute I.S.D.G.M. of C.N.R.
The SeaSonde is a Coastal HF Radar that radiates less than 80 watts average power in the frequency band near 25 MHz. Two radars operate at sites 25 km apart, each measuring radial currents to a distance of about 45 km. Each radar station produces radial velocities vs. range and bearing. The range cell spacing, which depends on the signal bandwidth, has been set to 1.5 km, but this resolution may be increased to 500 m if necessary; radial speeds are produced for each of 32 range cells at 5° bearing increments over the annular sector falling over the sea. At the remote sites the radar unit runs continually, acquiring 512 samples in 256 seconds. The extracted data are averaged over 15 complete acquisitions and, every hour, files of these radial speeds, in polar coordinates, are processed and stored at each remote site and/or sent by modem to the central site, in Venice.

The choice of remote sites is dictated by the sea coverage required. It is necessary to observe some important conditions for the installation of the antennas (principally the distance between the antennas and the water, the distance between the buildings housing transmitter and receiver, and the absence of obstacles around the antennas) to obtain the very best sea surface coverage. In our case, the chosen sites are a building on the sea front of Senigallia and the north wharf of Ancona's harbour. The total coverage area is approximately 1000-1200 km². The instrumentation at each remote site comprises:

- one transmitting antenna with an omnidirectional pattern;
- one receiving antenna, capable of distinguishing the direction of incoming echoes;
- a transmitter and receiver;
- computing hardware, modem, power supply and telephone line.

At the central site a Power PC computer is connected to the remote sites by modem and telephone lines.
4. MAP OF SURFACE CURRENT

Radial speed maps from each radar site alone are not a complete depiction of the surface current flow, which is two-dimensional. This is why at least two radar systems observing the same spot on the sea from different directions are used to construct a total vector from each site's radial components. Many programs have been developed to process the radial files [4]. A simple program, developed in the MatLab language, gives the results shown in Figs. 3 and 4. Figure 3 is the map of total current vectors over a grid spanning the overlapped remote site coverage areas. One notices that the radial vectors in the coverage areas of the two remote sites are not always associated with the same point. Therefore, each cell into which the total sea area is divided is assigned a vector combination of radial vectors, i.e. the mean of many vectors, depending on the size of the area over which means are computed. Then, a map of errors can be produced using vector values whose least-square mean is higher than a fixed given value. To show the errors better, in Fig. 4 the scale of the vectors is 2 cm/s instead of the 50 cm/s of Fig. 3. The vector map of Figure 4 gives a representation of errors that can be read as follows: (a) the velocity vector error is an absolute value, in both direction and intensity; (b) only velocity errors over 1 cm/s (or other thresholds) are considered; (c) the direction error is represented by a deviation with respect to North, in degrees; (d) the intensity error is proportional to the length of the error vector.
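A minimal sketch of this radial-to-total combination, solving a small least-squares problem per grid cell, is given below; the bearing convention and site geometry are illustrative, not the actual CODAR processing.

```python
import numpy as np

def total_vector(bearings_deg, radial_speeds):
    """Least-squares combination of radial currents into a total vector.

    bearings_deg  : bearing of each radar beam at the grid cell (degrees
                    clockwise from North, from any number of sites)
    radial_speeds : measured radial components (positive away from radar)

    Solves A @ [u, v] = r, where each row of A is the unit vector of one
    radial beam in east/north (u, v) coordinates.
    """
    theta = np.deg2rad(np.asarray(bearings_deg, dtype=float))
    # Unit vectors: east component = sin(bearing), north = cos(bearing)
    A = np.column_stack([np.sin(theta), np.cos(theta)])
    r = np.asarray(radial_speeds, dtype=float)
    uv, *_ = np.linalg.lstsq(A, r, rcond=None)
    return uv  # (u_east, v_north) in the units of radial_speeds

# Two sites seeing the same cell from bearings 45° and 135°:
u, v = total_vector([45.0, 135.0], [10.0, 10.0])
print(u, v)  # ~14.1 cm/s eastward, ~0 northward
```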
Figure 3. Map of surface current vectors.
Figure 4. Map of errors greater than 1 cm/s, relative to the vector map in Fig. 3.
What is shown is just one portion of the total error in any HF radar measurement: the portion proportional to the geometric sampling grid [5]. Generally, the central area is the best investigated by the two radars, with an error below 1 cm/s. Errors over 1 cm/s are systematic at the extremes of the coverage area, where the number of available radial vectors per unit area is lower, while the direction error seems to increase moving from south to north. This is compatible with an asymmetrical coverage by the two radars. With increasing distance of the observation point from the antennas, the measurement reliability decreases because of distortion errors; additional errors can be due to too narrow a view angle. Lipa and Barrick [4] suggest, as a consequence of the above, taking into consideration only data within a 20 x 20 km matrix for a setting analogous to ours. Keeping in mind that this footprint was based on the performance of an older and less powerful system, after a first analysis aimed at assessing the consistency of data outside this core area with those inside it, we decided to take into account measurements spanning a 30 x 30 km area [6]. Besides distortion, another problem which may arise in radar data is the presence of spatial and/or temporal gaps; this can be due to obvious hardware reasons but also to problems with the direction-finding algorithms or with fluctuations of the signal's maximum range [7].
5. FIRST RESULTS
In the adopted configuration, the system provides data averaged over 1-hour periods. In order to filter out semi-diurnal and diurnal tides as well as inertial oscillations, the data were low-pass filtered using a Chebyshev filter with a 50-hour cutoff. Spatial/temporal gaps were filled by means of linear interpolation with neighbouring data (in space and time). In this work we analyze the hourly data acquired during the months of August and September 1997. The whole dataset starts in July 1997 and ends in March 2000. It must be stressed that in this study, which is a preliminary contribution to the analysis of this important dataset, we have used the HF radar velocity data as processed by the manufacturer.
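A minimal sketch of this preprocessing, assuming hourly sampling and illustrative filter order and ripple (the authors' actual settings are not given), might look as follows:

```python
import numpy as np
from scipy.signal import cheby1, filtfilt

def lowpass_50h(u_hourly):
    """Low-pass filter hourly current data with a 50-hour cutoff.

    Gaps (NaNs) are first filled by linear interpolation with
    neighbouring samples, as in the text; then a Chebyshev type-I
    filter is applied forward and backward (zero phase shift).
    """
    u = np.asarray(u_hourly, dtype=float)
    t = np.arange(u.size)
    good = np.isfinite(u)
    u = np.interp(t, t[good], u[good])       # fill temporal gaps

    nyquist = 0.5                            # cycles/hour for hourly data
    wn = (1.0 / 50.0) / nyquist              # 50-hour cutoff, normalized
    b, a = cheby1(4, 0.1, wn, btype="low")   # order 4, 0.1 dB ripple
    return filtfilt(b, a, u)

# Example: tidal and inertial oscillations are removed, the mean is kept.
hours = np.arange(2000)
u = 10 + 15*np.sin(2*np.pi*hours/12.4) + 5*np.sin(2*np.pi*hours/17.3)
print(lowpass_50h(u).mean())  # ≈ 10
```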
236
Figure 5. Weekly averaged circulation (weeks I-IV) in September 1997 (adapted from [9]).
Figure 6. Spectral analysis of the longshore component measured by the CODAR in a nearshore (upper right panel) and offshore (bottom right panel) location in August 1997. See the left panel for the position of the analyzed points.
The energy spectra (Fig. 6) of the longshore velocity component estimated at two points clearly show the tidal periods (frequencies of approximately 0.04 c/h and 0.08 c/h for the diurnal and semidiurnal components, respectively), with the latter losing energy from the coast to offshore. This agrees with what is known about the tide in the northern Adriatic, i.e. the semidiurnal frequency is composed of a Kelvin wave that decreases exponentially towards offshore, and the diurnal frequency is composed of shallow water waves that propagate along the axis of the Adriatic basin. Moreover, the inertial frequency, which at this latitude corresponds to a period of around 17 hours, is identified by the energy peak at about 0.058 c/h in the offshore location (Fig. 6, bottom right panel). This frequency disappears close to the coast (Fig. 6, upper right panel), consistent with a greater influence of the bottom stress in the shallow water area. The observation that near-inertial currents decay toward the coast is not new, but it is very interesting to describe the scale over which the decay occurs in this area. The maximum of the inertial frequency has been calculated, from the coast seaward, along three different transects perpendicular to the coastline, and the results are shown in Fig. 7. For all the transects there is an exponential increase of the energy associated with the inertial currents moving from the coast to offshore (from 2 to 6 times greater over 30 km). Moreover, observing the exponential fitting line, a general decrease of the energy is observed moving from the northern (dashed line) to the southern section (solid line).
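The inertial period quoted above, and an exponential fit of the kind shown in Fig. 7, can be reproduced as follows; the energy values used for the fit are hypothetical stand-ins with roughly the 2-6x offshore growth reported, not the measured maxima.

```python
import numpy as np

OMEGA = 7.2921e-5  # Earth's rotation rate (rad/s)

def inertial_period_hours(lat_deg):
    """Local inertial period T = 2*pi / (2*Omega*sin(lat))."""
    f = 2 * OMEGA * np.sin(np.deg2rad(lat_deg))
    return 2 * np.pi / f / 3600.0

print(inertial_period_hours(43.8))  # ≈ 17.3 h, matching the text

# Exponential growth of inertial energy offshore, E(x) = E0 * exp(x/L):
# fit log(E) against distance with a straight line (illustrative data).
x_km = np.array([2.0, 8.0, 15.0, 22.0, 30.0])
E = np.array([1.0e6, 1.4e6, 2.1e6, 3.2e6, 4.8e6])  # hypothetical maxima
slope, intercept = np.polyfit(x_km, np.log(E), 1)
print(f"e-folding scale ≈ {1/slope:.0f} km")
```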
Figure 7. Plot of the maximum inertial energy versus the offshore distance along transects perpendicular to the coast: squares for the northern one (Senigallia, dashed line), dots for the southern one (Ancona, solid line), and crosses for the central one (between Senigallia and Ancona, dash-dotted line).
Figure 8. Mean kinetic energy of the surface layer over the area covered by the CODAR during the Bora forcing that occurred in September 1997: (a) hourly data and (b) low-pass filtered data (cutoff = 50 hours).

Like all closed or semi-enclosed seas, the Adriatic Sea is perturbed by free oscillations or seiches (mainly induced by atmospheric interactions) having characteristic modes that depend on the geometry of the basin. Two main seiches are known for the Adriatic Sea, with periods of about 22 and 11 hours [11]. The first represents the fundamental longitudinal free oscillation of the basin, with a nodal line at the southern boundary (Otranto Channel). The 11-hour oscillation might be interpreted as the bimodal seiche. Both seiches are present in the spectrum obtained for the offshore location (Fig. 6, bottom right panel) at 0.045 and 0.09 c/h. Besides these main oscillations, other shorter seiches have been observed and described by numerical models. Treating the Adriatic Sea as a non-rotating one-dimensional barotropic basin, Bajc [12] also calculated a period of about 12 hours, supporting Defant's hypothesis of the so-called "closed Adriatic". Therefore, in order to investigate the response of the studied area to atmospheric forcing, an isolated meteorological event of strong Bora wind has been analyzed. The Bora occurred after a long period of breeze (low wind stress) on September 14, 1997, and disappeared 2 days later. Figure 8 shows the mean kinetic energy (m.k.e. = 0.5·[u² + v²]) obtained by averaging the hourly velocity data (upper panel) and the low-pass filtered data with a 50-hour cutoff (bottom panel). The m.k.e. oscillates with values less than 500 cm²/s² until the afternoon of September 14, 1997, then quickly increases at night to a value of 2400 cm²/s². Over the five successive days, the m.k.e. gradually decreases until resuming its previous magnitude as the weather forcing relaxes.
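A minimal sketch of the m.k.e. computation used here, averaging 0.5·(u² + v²) over the valid grid points of one hourly map:

```python
import numpy as np

def mean_kinetic_energy(u, v):
    """Mean kinetic energy per unit mass, m.k.e. = 0.5*(u^2 + v^2),
    averaged over all valid grid points of one hourly map (cm^2/s^2
    if u, v are in cm/s)."""
    mke = 0.5 * (np.asarray(u)**2 + np.asarray(v)**2)
    return np.nanmean(mke)

# A uniform 20 cm/s coastal jet gives m.k.e. = 200 cm^2/s^2; the Bora
# peak of ~2400 cm^2/s^2 corresponds to speeds of about 70 cm/s.
print(mean_kinetic_energy(np.full((10, 10), 20.0), np.zeros((10, 10))))
```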
This is the classic behavior of a coastal transient under the influence of an impulsive wind stress forcing; it can therefore be asserted that the CODAR correctly captured the energetic input in this coastal area. The spectral analysis shown in Fig. 9 has been obtained using the offshore and longshore hourly data of Fig. 8, in order to investigate the origin of the large oscillation of the m.k.e. observed during the Bora forcing. The absence of any peak at the inertial or near-inertial frequency excludes the presence of inertial currents forced by the wind stress produced by the Bora blowing over the Northern Adriatic (it is also unusual to see inertial oscillations in the energy signal, because the velocity vector rotates with the inertial period, but the energy does not present inertial variations unless only one component is analyzed). On the other hand, the presence of two energy peaks of similar magnitude at the semi-diurnal frequencies (centered at periods of 12.5 and 12.0 hours) for the alongshore component supports the results of Bajc and Defant's hypothesis of the existence of a seiche with an oscillation period of about 12 hours (determined by supposing the Adriatic basin closed at the Otranto Channel). Due to the coincidence with the semidiurnal constituents of the tide, the effect of this particular seiche is to amplify (with decreasing amplitude) the tidal oscillation at the semidiurnal period. The decay time of this seiche (about 5 days) is not very long in comparison with the other main seiches of the Adriatic Sea (about 10-15 days).

In the bottom panel of Fig. 9 the rotary coefficient (CR) is shown. The rotary coefficient gives the partition of total energy at the different frequencies: its magnitude is one for pure rotary motion and zero for unidirectional motion; its sign is related to the polarization of the ellipses: positive values for clockwise rotation and negative for anticlockwise [13]. As expected, the maximum positive peaks (anticyclonic rotation) of the rotary coefficient appear at the diurnal and inertial frequencies. At the semidiurnal periodicity CR shows different values in the two corresponding major peaks: a unidirectional motion is detected at approximately 12.5 hours, while the oscillation with a period of 12.0 hours clearly has an anticyclonic behavior. The most important peaks corresponding to anticlockwise (cyclonic) motion are detected at frequency ranges where the total energy of the surface current is essentially negligible. This indicates that the mean velocity measured by the CODAR at a fixed point in space rotates, in the course of time, along a clockwise path, whereas the spatial picture of the averaged velocity field is directed anti-clockwise (cyclonic gyre).

In order to better visualize the rotary nature of the semidiurnal currents, these are presented in terms of their elliptic properties, specifically the semi-major and semi-minor axes and the orientation. The semi-diurnal tidal ellipses show a smooth, even sweep of the currents, with maximum amplitudes of the semi-major axis in a band extending 10-15 km offshore, well correlated with the presence of the coastal jet (Fig. 10). The semi-diurnal ellipse orientation of the tidal currents shows a preferential behavior parallel to the coast. In the central sector the direction veers clockwise from the coast to offshore. In the remainder, the ellipse directions appear inclined toward the coast in shallow water and turn anticlockwise further offshore (Fig. 11).
The semi-minor axes show a generally constant behavior from the coast to offshore; therefore the ellipse eccentricities are essentially controlled by the semi-major axis. Accordingly, the eccentricity decreases from the coast to offshore. It must be stressed, once more, that these results can be biased because they were obtained considering a limited data set.
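As a hedged illustration of the rotary decomposition used above (our own sketch, following the sign convention of [13] as stated in the text, not code from the study), the rotary coefficient can be computed from the FFT of the complex velocity w = u + iv:

```python
# Minimal rotary-coefficient sketch: clockwise energy lives at negative
# frequencies of w = u + iv, anticlockwise at positive frequencies.
import numpy as np

def rotary_coefficient(u, v, dt=1.0):
    """CR = (S_cw - S_acw)/(S_cw + S_acw): +1 pure clockwise rotation,
    -1 pure anticlockwise, 0 unidirectional (rectilinear) motion."""
    w = (u - u.mean()) + 1j * (v - v.mean())
    n = len(w)
    W = np.fft.fft(w) / n
    k = np.arange(1, n // 2)            # skip the mean (k = 0) and the Nyquist bin
    s_acw = np.abs(W[k]) ** 2           # positive frequencies: anticlockwise energy
    s_cw = np.abs(W[n - k]) ** 2        # negative frequencies: clockwise energy
    freqs = k / (n * dt)                # cycles per hour for hourly data (dt = 1 h)
    return freqs, (s_cw - s_acw) / (s_cw + s_acw)
```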
Figure 9. Spectrum analysis of the hourly time series of Fig. 8 (top panel); note the different magnitude for the alongshore and offshore spectra. In the bottom panel the rotary coefficient (bar) and the normalised total energy (line) are reported.
Figure 10. Semi-major axis versus offshore distance, September 1997.
Figure 11. Orientation of the semi-diurnal tidal ellipses, September 1997 (longshore distance vs. offshore distance, km).
6. CONCLUSIONS

Sea surface currents play an important role in coastal processes and in the exchange between coastal and offshore waters [14]. In particular, surface currents strongly affect the transport of planktonic larvae and their retention in areas suitable for hatching. This is fundamental for the reproduction dynamics of marine populations and, therefore, for the evolution of a coastal system. Also very important is the observation and study of the surface current flow, whose description is difficult using conventional Eulerian instruments (e.g., ADCPs, moorings), while it is easier using Lagrangian systems (e.g., drifters), even if their spatial resolution is lower. Experimentation with the radar technique began about 25 years ago, but only in recent years have the available systems become reliable. With respect to other standard techniques, the radar provides good spatial resolution and long time series of data, on an hourly basis, permitting the analysis of current fields over many tidal cycles and the identification of possible seasonal evolutions of the local coastal circulation over a wide area. The SeaSonde, installed in the Northern Adriatic Sea in April 1997, tested for 4 months and operating since August 1997, has proven to be a reliable system for defining the coastal circulation in this area at different temporal scales. The surface circulation offshore Ancona described by the radar has shown some interesting features of this coastal area of the northern Adriatic Sea: (1) the presence of a strong coastal jet extending its influence 10-15 km seaward; (2) the decrease of the tidal energy moving offshore (as for a Kelvin wave); (3) the exponential decrease of the near-inertial current approaching shallower depths; (4) the behaviour of the kinetic energy during a strong atmospheric forcing, used in this paper to demonstrate the adequacy of the system; in particular, the presence of a minor seiche (with a period of about 12 hours) has been described on the basis of the current oscillation and decay time; (5) a predominant clockwise rotation of the current at the most energetic frequencies, shown by the rotary analysis; (6) an eccentricity at the semi-diurnal tide frequencies generally controlled by the semi-major axis amplitude, which decreases from the coast to offshore. We conclude that this analysis has provided a useful preliminary examination of the spatial and temporal structure in the data; similar analyses could of course be carried out with different approaches and techniques. For this reason this work must be viewed as exploratory and our interpretations should be tested in additional studies.
ACKNOWLEDGEMENTS This work was carried out in the framework of the activities of Task 1 of PRISMA-2 Project, and of Cofin2000 funded by the Italian Ministry for Scientific Research. We wish to thank Prof. M. Moretti for helpful discussions. The constructive comments and suggestions of the two anonymous referees also greatly improved the manuscript.
APPENDIX: Specifications of the SeaSonde

Briefly, we describe how the SeaSonde operates to accomplish the measurements.
(a) Range of the target: in any radar, the distance of the patch of scatterers depends on the time delay of the scattered signal after transmission. The SeaSonde employs a unique method of determining the range from this time delay. By modulating the transmitted signal with a swept-frequency signal and demodulating it properly in the receiver, the time delay is converted to a large-scale frequency shift in the echo signal. Therefore, the first digital analysis of the signal extracts the range, or distance, to the sea-surface scatterers, and sorts it into 32 range "bins".

(b) Velocity from Doppler of the target: information about the velocity of the scattering sea waves (which includes speed contributions due to both current and wave motions) is obtained by a second spectral processing of the signal from each range bin, giving the Doppler frequency shifts due to these motions. The length of the time series used for this spectral processing dictates the velocity resolution; at 25 MHz, a 256 s time series corresponds to a velocity resolution of about 2.5 cm/s (the velocity accuracy can be better or worse depending on environmental factors).

(c) Bearing of the target: after the range to the scatterers and their radial speeds have been determined by the two spectral processing steps outlined above, the final step involves extraction of the bearing angle to the patch of scatterers. This is done for the echo at each spectral point (range and speed) by using simultaneous data collected from three directional receive antennas. The complex voltages from these three antennas are put through a "direction finding" algorithm to get the bearing [15]. At the end of these three signal-processing algorithms, surface current speed maps for each remote site are available in polar coordinates.
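As a back-of-the-envelope check (our own sketch, using the standard radar Doppler relation v = f_d·λ/2 with the constants quoted in (b)), the stated velocity resolution can be reproduced:

```python
# Velocity resolution implied by a 256 s Doppler time series at 25 MHz.
c = 3.0e8                      # speed of light, m/s
f0 = 25.0e6                    # SeaSonde operating frequency, Hz
T = 256.0                      # length of the Doppler time series, s

wavelength = c / f0            # ~12 m radar wavelength
df = 1.0 / T                   # spectral (Doppler) resolution, Hz
dv = df * wavelength / 2.0     # radial velocity resolution, m/s

print(f"{dv * 100:.1f} cm/s")  # ~2.3 cm/s, consistent with the ~2.5 cm/s quoted
```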
REFERENCES
1. D.E. Barrick, M.W. Evans and B.L. Weber, Science, 198 (1977) 138.
2. D.E. Barrick, Journal of Oceanic Engineering, OE-11 (1986) 286.
3. K.W. Gurgel, Oceanology (1998) 54.
4. B.J. Lipa and D.E. Barrick, Journal of Oceanic Engineering, OE-8 (1983) 226.
5. R.D. Chapman, L.K. Shay, H.C. Graber, J.B. Edson, A. Karachintsev, C.L. Trump and D.B. Ross, J. Geophys. Res., 102 (1997) 18873.
6. G. Budillon, E. Paschini, A. Russo and A. Simioli, Proceedings of the XIII Congresso dell'Associazione Italiana di Oceanografia e Limnologia (in Italian, 2000a) 23.
7. J.D. Paduan and L.K. Rosenfeld, J. Geophys. Res., 101 (1996) 20669.
8. G. Budillon, M. Grotti, P. Rivaro, G. Spezie and S. Tucci, in: F.M. Faranda, L. Guglielmo and G. Spezie (Eds.), Structures and Processes in the Mediterranean Ecosystems, Springer-Verlag, 2001a, p. 9.
9. G. Budillon, E. Paschini, A. Simioli and E. Zambianchi, in: F.M. Faranda, L. Guglielmo and G. Spezie (Eds.), Structures and Processes in the Mediterranean Ecosystems, Springer-Verlag, 2001b, p. 19.
10. G. Budillon, E. Sansone and G. Spezie, IAME Proceedings, 2000b.
11. P. Franco, L. Jeftic, P. Malanotte-Rizzoli, A. Michelato and M. Orlic, Oceanologica Acta, 5 (1982) 379.
12. C. Bajc, Boll. Geofis. Teor. Appl., 15 (1972) 57.
13. J. Gonella, Deep-Sea Res., 19 (1972) 833.
14. E. Bjorkstedt and J. Roughgarden, Oceanography, 10 (1997) 64.
15. D.E. Barrick and B. Lipa, Oceanography, 10 (1997) 72.
16. R.O. Schmidt, IEEE Trans. Antennas Propag., AP-34 (1986) 276.
The Italian Tsunami Warning System: state of the art

A. Maramai, A. Piscini, G. D'Anna and L. Graziani
Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
From June 1996 to December 1998 a European Union project called GITEC-TWO (Genesis and Impact of Tsunamis on the European Coasts - Tsunami Warning and Observations) was carried out. One of its main goals was the design and realization of the pilot station of the first Italian Tsunami Warning System (TWS), to be installed on the eastern Sicily coast, which experienced large tsunamis in the past. The Italian TWS is conceived for coastlines potentially affected by tsunamis generated by local earthquakes, and it is designed for immediate detection and early warning, taking into account the characteristics of the tsunamis occurring in this area. At present, the pilot station has been installed and some preliminary analyses of the recorded data have been carried out.
1. INTRODUCTION

Over the years, tsunami research in the world has evolved by steps, alternating periods of great interest with others of moderate or even no attention, and it is worth noting that each phase of renewed interest has generally been triggered by the occurrence of big events. For example, the extraordinary 1956 Aegean Sea tsunami, which occurred at the Santorini island, prompted tsunami research in Europe, and after the '60s the following period of peak interest in tsunamis in the Mediterranean region flourished around 1980-85, when new tsunami catalogues [1,2] came to light. Although it was well known that tsunamis constitute an important hazard factor for the European coastlines, especially for Greece and Italy, interest in this problem diminished again until 1992, when a first EU project called GITEC (Genesis and Impact of Tsunamis on the European Coasts) started, with the objectives of studying tsunami generation mechanisms and of evaluating tsunami hazard in European seas, in order to reduce tsunami risk in Europe. One of the most relevant results of this project was the production of the first unified catalogue of European tsunamis, in database form [3]. The realization of this catalogue was the starting point for a second EU project, named GITEC-TWO (Tsunami Warning and Observations), which is, in a sense, the continuation and enlargement of the previous project, but mainly focused on the problems of tsunami modeling and warning. In this project the Istituto Nazionale di Geofisica (ING) was involved in the task of realizing the pilot station of the first Italian tsunami warning system (TWS), in collaboration with the University of Bologna. In fact, the conception of the Italian TWS is the result of this co-operation, and some papers have been produced in recent years [4,5]. A detailed knowledge of the tsunamigenic potential of the coastal areas is a fundamental tool for the realization of an efficient TWS, and the availability of a complete and reliable
catalogue is the basis for the identification of the most suitable coastal site for the system installation. In this context, the realization of the new Italian tsunami catalogue [6], an extraction of the European catalogue as far as it concerns Italy, highlighted the coastal regions particularly exposed to tsunami attacks. It should be underlined that along the Italian coasts a relevant number of destructive tsunami waves occurred in historical times as well as in the present century. Also, a number of minor tsunamis are well documented in the literature. The analysis of the catalogue showed that Calabria and Sicily are the Italian regions where the most relevant tsunamigenic earthquakes took place in the past. These regions are characterized by a high seismicity level with a consequent high seismic risk; the first relevant group of seismic events is located in the Messina Straits, the most active and dangerous tsunamigenic source, where the catastrophic December 1908 event occurred, followed by a huge tsunami with waves exceeding 10 meters, which caused severe damage in many places along the eastern Sicily coasts and a large number of victims. The western Ionian Sea is another interesting and dangerous tsunamigenic source, and a group of relevant seismic events is located in the Augusta-Siracusa area, where the destructive January 1693 earthquake took place, affecting especially Augusta, Catania and Messina (Fig. 1). In this area the last significant seismic event occurred on December 13, 1990, located offshore just a few kilometers in front of Augusta; it caused some anomalous sea behaviour and some measured changes in the bathymetry. Taking into account the historical information on the characteristics of the Italian tsunamis, the relevant social, economic and environmental value of the exposed coastlines and also the facilities for the TWS station installation and maintenance, the coastal area selected for the pilot station installation is the coastal segment facing the Ionian Sea and, in particular, the area of Augusta. Since this area was chosen as the most suitable place for the installation, in the frame of GITEC-TWO some additional research has been done, especially on the historical tsunamigenic events that occurred in this part of the coast. In this context, Piatanesi et al. [7,8] made a study of the 1693 event to define the position of the tsunamigenic fault consistent with the known earthquake and tsunami data. Taking into account the bathymetry, the authors performed some numerical simulations of the tsunami, considering the initial sea withdrawal in the different localities involved. They concluded that the fault named S5 in Fig. 1 is the most probable candidate as responsible for the effects. It can be underlined that, in spite of the limitations of the method, the offshore fault system located in front of Augusta on the megastructure known as the Hyblean-Maltese escarpment is responsible for generating the earthquake. Only by analyzing the tsunami data, and in particular the first wave polarity, was it possible to draw this kind of conclusion.
2. THE ITALIAN TSUNAMI WARNING SYSTEM

In the Mediterranean region tsunamis are usually generated by local earthquakes with quite short travel times from the source to the coast, generally not exceeding 20 minutes. This behaviour is of course typical also of the Italian tsunamis, where the waves attack the coast within 10-15 minutes after the shock, as is the case, in particular, for the tsunamis which occurred on the eastern Sicily coasts. Historical information points out that they are generated by earthquakes with focal regions very close to the coast, with epicenters either at sea or on land. In addition, for all the tsunamis in the area of Augusta the first movement of the sea was a withdrawal, followed, within a few minutes, by a flooding. Considering that, usually, for strong shocks the
Figure 1. Isoseismal lines (dashed) for the January 11, 1693 event. The + sign indicates strong tsunami effects.
focal mechanisms are the same for many cycles, it is reasonable to hypothesize that a possible future earthquake in this area would cause the same kind of sea behaviour, with an initial withdrawal. The hypothesis of a delay between the tsunami generation and the wave attack is fundamental for a suitable utilization of the Italian TWS. In fact, the system differs from those at present operating all over the world, because those are efficient for tsunamis generated by distant earthquakes, being based on a real-time seismic network detecting and locating the shock: if the shock is at sea with a relevant magnitude, a message is issued to the tide-gauge networks for a possible tsunami verification. This kind of system cannot work for local earthquakes, because the time elapsing between the tsunami generation and the wave attack is quite short. On the contrary, the Italian pilot TWS is a very local system conceived for coastlines attacked by the waves just a few minutes after the shock. In such cases what is needed is an immediate detection of the generated tsunami. This is why our system is an integrated system designed for the simultaneous acquisition of both the seismic signal and the sea-surface displacement, for processing both signals, for detecting dangerous conditions and for issuing an alarm message to specific local units of the Civil Protection service within 4-5 minutes after the shock. Two cases must essentially be distinguished, according to the first sea movement: if the first movement is positive (a flooding of the coast), even the above-mentioned 4-5 minute delay will be too long and the alarm may not be effective. If, instead, the first movement is a withdrawal (the usual behaviour in the Augusta area), the inundation will occur about 10-15 minutes later, so the system will be able to send a suitable alarm message and proper actions can be taken.

2.1. Architecture and Realization

As already mentioned, the integration of the system, with the installation of a tide gauge station and the use of seismic data, is viewed as the right strategy to detect anomalous sea movements in a timely way and to send a suitable alarm. The pilot TWS station was installed in July 1998 on a suitable wharf, managed by the EniChem Oil Company, just outside the Augusta harbour. The wharf (see Fig. 2) has all the necessary characteristics for a good installation: (a) it is located a few km in front of the seismogenic structure that generated the 1693 earthquake; (b) it is a safe place, controlled 24 hours a day; (c) it is only rarely used by the oil company; (d) it extends about 1 km into the sea, with the water depth ranging from 2.5 m (at the beach) to more than 10 m; (e) it has some platforms with basic facilities for the station installation and maintenance. As regards the monitoring of the sea conditions, a water-level station, including a pressure gauge and a barometric unit, is used. The sensor is a pressure gauge, with a 0-10 m range and an accuracy of 1 cm, which has been installed in the sea at a depth of about 4 m. The chosen sample rate is 1 Hz, averaged over a 15 sec time window. We have to emphasize that the choice of this rate is particularly relevant, because the sea level data available up to now in Italy, both from national and local networks, are low-frequency data, only useful to study long-period oscillations. On the contrary, our choice allows us to obtain information on the events of interest to us and to have a much larger data set.
The pressure gauge is connected to an automatic station (an SM3840 by SIAP), located on the wharf, able to carry out some diagnostics and to collect and process the sea level data from the sensor and the atmospheric pressure data from the barometric unit included in the station itself. The station, fed by a battery backed up by a photovoltaic panel to be used in emergency conditions, is also able to process many other parameters such as temperature, wind velocity and direction, solar radiation, etc. The atmospheric pressure is an additional collected parameter,
sampled every 15 sec, to be compared with the sea level data in case of anomalous oscillations. Both the sea level and the atmospheric pressure data are sent, via a dedicated radio link, from the automatic station on the wharf to the TWS control center, located inland about 4 km from the coast. Concerning the seismic data, in this experimental stage the already available three-component seismic station (S-13 seismometers) of Augusta is used, which is an operating station of the Italian seismic network. During the GITEC-TWO project a study of the background seismic noise was carried out, analyzing several months of continuous recordings in both the time and frequency domains. Fig. 3 shows the Power Spectral Density (PSD) acceleration diagrams for the vertical and horizontal components. The main contribution to the noise is given by human activity (frequencies greater than 1 Hz). By filtering out all frequencies greater than 2 Hz we obtain the diagram of Fig. 4.
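A minimal sketch of this noise-characterisation step follows; it is our assumption about how Figs. 3-4 could be produced (the sampling rate, window length and the 4th-order Butterworth low-pass are not from the paper):

```python
# Estimate the PSD of a seismic trace and suppress the anthropogenic band
# above 2 Hz, as described in the text. fs is an assumed sampling rate.
import numpy as np
from scipy.signal import welch, butter, filtfilt

def noise_psd_and_filter(trace, fs=50.0, corner_hz=2.0):
    f, psd = welch(trace, fs=fs, nperseg=4096)    # background-noise PSD (cf. Fig. 3)
    b, a = butter(4, corner_hz / (fs / 2.0))      # low-pass, 2 Hz corner frequency
    filtered = filtfilt(b, a, trace)              # filtered recording (cf. Fig. 4)
    return f, psd, filtered
```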
Figure 2. General view of the wharf.
Figure 3. PSD diagrams (acceleration vs frequency) for the AU9Z, AU9N and AU9E components, respectively.
Figure 4. Filtered recording for the three-component seismic station (velocity vs time).
As mentioned before, both the sea level and the seismic data are sent to a dedicated TWS control center, which collects and processes the data in real time and is able to detect the occurrence of a significant seismic event and to identify anomalous sea level oscillations. If both signals show an anomalous behaviour suggesting the possible occurrence of a tsunamigenic event, an alarm message is issued to local units of the Civil Protection service. The control center is equipped with two PC units, in a LAN network: one for the seismic signal acquisition and processing and the second for the sea level data processing and for the alarm message procedures. The PC responsible for the seismic signal acquisition, programmed in the LabView environment, contains specific routines for the detection of the earthquake; after the detection, the seismic data are sent to the second PC unit, which also receives the sea level data, continuously transmitted through the radio link. This PC is equipped with the CM5000 software, which has some main tasks: (a) automatic acquisition of the sea level data; (b) real-time data validation; (c) data collection in a relational database (DBMS), SQL standard; (d) graphic display of the data; (e) station activity monitoring; (f) configuration of new stations. In the PC unit equipped with the CM5000 software there also resides, for routine use, a predictive model (developed by means of a neural network) to define the tide gauge trend, continuously compared with the real data, in order to detect anomalous sea level oscillations and to single out an alarm condition. The LAN at the control center is connected to the ING center in Rome by means of a dedicated telephone line, with PPP/TCP access on the SCO UNIX PC, in order to perform some remote tasks (i.e. data downloading, system remote control, etc.). At present only the PC with the CM5000 software is installed at the control center, dedicated to the acquisition and processing of the sea level and pressure data. However, before the end of this year the second PC unit, in the LabView environment, will be installed, also containing the algorithm for the alarm message transmission. In Fig. 5 the general scheme of the complete system is shown, and Fig. 6 shows a general view of the automatic station installed on the wharf (including the protection box, the antenna and the photovoltaic panel).
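Purely as an illustrative sketch of this decision logic (not the actual LabView/CM5000 code; the threshold value and all names are hypothetical), the warning condition could be expressed as:

```python
# An alarm is issued only when a seismic trigger and an anomalous sea-level
# oscillation are detected together; requiring both keeps false alarms rare.
def check_alarm(seismic_triggered: bool,
                sea_level_obs: float,
                sea_level_predicted: float,
                threshold_m: float = 0.20) -> bool:
    """Return True when both independent anomaly conditions hold."""
    # deviation of the measured level from the neural-network tide prediction
    sea_anomaly = abs(sea_level_obs - sea_level_predicted) > threshold_m
    return seismic_triggered and sea_anomaly
```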
Figure 5. General scheme of the Italian pilot TWS.
Figure 6. View of the automatic station installed on the wharf.
2.2. Data collection and analysis

Since in the Mediterranean basin it is more probable to record weak tsunami waves than strong events, it becomes very important to know the characteristics of the noise that can interfere with and mask the real phenomena. So, in the case of this project, in order to determine the main characteristics of the local noise and its dependence on the harbour geometry, we performed a systematic collection and analysis of tide-gauge records, also considering the different existing atmospheric conditions. Since installation, the pilot system has been working, although with a discontinuous activity due to some problems both in the software and in the technical apparatus, allowing the collection of quite a large amount of data. At present, the availability of different sets of sea level data for different periods of time, sampled every second, allows us to make some preliminary fundamental analyses. Starting from the longest continuous available dataset, first of all we observed the tidal trend by applying a low-pass filtering procedure with a 0.01 Hz corner frequency, to cut the oscillations with periods shorter than 100 sec. The pure tide trace is clearly visible only by filtering with a 0.00027 Hz corner frequency, in order to cut oscillations with periods shorter than 1 hour. In Fig. 7 an example of a 6-day registration is shown, and the peaks of the tidal waves, diurnal and semidiurnal, can be observed.
Figure 7. Filtered (0.01 Hz) sea level data from June 13 to 18, 1999 (displacement in meters vs. time in seconds).
For a better understanding of the phenomenon we carried out the spectral analysis of the signal, obtaining some Power Spectral Density (PSD) diagrams. In Fig. 8 the PSD diagrams for the data of Fig. 7 and for the July data are shown. For the period from June 13 to 18, 1999 (black line), four peaks, around 1250, 800, 500 and 5 sec, are evident. The same kind of behavior was observed by analyzing the data in different time intervals, for example for the period July 11-16, 1999 (grey line). Taking into account this information, we can hypothesize that the Augusta harbour, and in particular the basin where the tide gauge is located, is characterized, from a spectral point of view, by the oscillation periods mentioned above, chiefly 1250 sec (oscillations with amplitudes one order of magnitude lower than the tidal waves). In order to support this hypothesis, we have considered the available studies describing the relations between the oscillation periods and the geometrical characteristics of basins in "shallow water" conditions, to compute the dimensions of the Augusta basin. The approximate (Merian) formula from [9] was used:

Tn = 2L / [n (g h)^1/2]    (a)

where L is the dimension of the basin along the wave propagation axis, g is the gravity acceleration and h the sea depth in the basin. The parameter n is inversely proportional to the wavelength: the oscillation wavelength is 2L (if n = 1), L (if n = 2), 2L/3 (if n = 3), etc. The period with n = 1, T1, is the main oscillation period:

T1 = 2L / (g h)^1/2    (b)
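As a minimal numerical check of formula (b) (our own sketch, with g = 9.81 m/s² and the period and depth discussed just below), inverting for L gives:

```python
# Invert the Merian formula T1 = 2L / sqrt(g*h) for the basin length L,
# using the observed 1250 s period and a 6 m average depth.
import math

g, h, T1 = 9.81, 6.0, 1250.0          # gravity (m/s^2), depth (m), period (s)
L = T1 * math.sqrt(g * h) / 2.0
print(f"L = {L:.0f} m")               # ~4795 m, i.e. the ~4800 m quoted below
```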
Substituting the experimental period found, 1250 sec, into formula (b), and considering an average depth of 6 meters, we obtain an L value of about 4800 m, which is of the same order of magnitude as the Augusta harbour. This simple model (considering only the mean length and depth of the basin) is therefore able to explain the 1250 sec peak; to explain the other observed oscillation frequencies, more complex numerical models, taking into account the real geometry of the basin, are needed. The comparison between the sea level data and the atmospheric pressure spectra (Fig. 9) does not indicate any correlation in the range of frequencies studied. In fact, the atmospheric phenomena can contribute to changes in the wave amplitudes, but not in their oscillation periods. During the period of activity of the system no significant event occurred that would generate an alarm condition. At the Augusta seismic station the largest seismic events which occurred in the Mediterranean area, respectively in Turkey and Greece (Fig. 10), have been recorded but, obviously, no alarm message was issued because of the absence of anomalous conditions in the sea level data. On the other hand, the Italian coasts have never experienced tsunamis generated by distant sources. Likewise, in the case of anomalies in the sea level data, if the system does not detect the trigger of a possible seismic event, no alarm message will be sent. The necessity of having simultaneous anomalies in both the seismic and sea level signals in order to identify a warning condition strongly reduces the possibility of false alarms.
Figure 8. Power Spectral Density diagrams (displacement vs. frequency). The black line shows unfiltered data from June 13 to 18, 1999; the grey line, the data for the period July 11 to 16, 1999.
Figure 9. Comparison between the PSD (amplitude vs. frequency) of the sea level data (black line) and the atmospheric pressure data (grey dashed line) for July 1999.
Fig. 10. The August 17, 1999 Turkish event (upper trace) and the September 9, 1999 Greek event (lower trace) recorded at the three-component S-13 Augusta seismic station (AU9). Only the vertical component is shown for both events.
3. CONCLUSIONS

A TWS for the Calabrian-Sicilian tsunamis has been realized and installed close to the Augusta harbour. This system works for local tsunamis, taking advantage of the delay between the tsunami generation and the wave attack. In spite of the limitations due to the very short time (10-15 min) available for all the operations, this is the only kind of system suitable for the characteristics of the Italian tsunamis, and in particular for those occurring on the eastern Sicily coast. Although some technical problems occurred during the operating period, preliminary analyses of both the seismic and sea level data have been carried out. In the future the installation of a strong-motion seismometer is foreseen, to support the S-13 seismometer, in order to reduce the possibility of saturation and improve the quality of the available data.
ACKNOWLEDGMENTS
The authors wish to thank Prof. Stefano Tinti of the University of Bologna for his constant contribution to the realization of the TWS. This work was supported by the GITEC-TWO project, contract ENV4-CT96-0297.
REFERENCES
1. J. Antonopoulos, Ann. Geofis., 33 (1980) 141.
2. G.A. Papadopoulos and B.J. Chalkis, Mar. Geol., 56 (1984) 309.
3. S. Tinti, M.A. Baptista, C.B. Harbitz and A. Maramai, Proceedings of the International Conference on Tsunamis, Paris, 1998, p. 84.
4. A. Maramai and S. Tinti, Phys. Chem. Earth, 21 (1996) 39.
5. A. Piscini, A. Maramai and S. Tinti, International Conference on Tsunamis, Paris, 1998, p. 137.
6. S. Tinti and A. Maramai, Annali di Geofisica, 34 (1996) 1253.
7. A. Piatanesi, S. Tinti, A. Maramai and E. Bortolucci, Atti XV GNGTS, Rome, 1996, p. 337.
8. A. Piatanesi and S. Tinti, J. Geophys. Res., 103 (1998) 2749.
9. F. Mosetti, Oceanografia, Del Bianco Editore, Udine, 1964.
Advanced technologies: equipment for environmental monitoring in coastal areas

G. Zappalà
Istituto Sperimentale Talassografico CNR, Spianata San Ranieri 86, 98122 Messina, Italy
Traditional equipment and techniques for offshore oceanography hardly meet the requirements of coastal monitoring (committed to the acquisition of long-term time series of characterizing parameters), creating the need for special-purpose instrumentation able to work safely and in all weathers. The lack of such instrumentation in the 1980s and the ecological interest in environmental preservation led to the organization in 1988 of the CNR "Strategical Project for Automatic Monitoring of Marine Pollution in Southern Italy Seas". During the ten years the Project lasted, new instruments and systems for the automatic, real-time monitoring of marine coastal waters were developed for the first time in Italy. The availability of new microprocessors and semiconductors made it possible to design and build a whole new generation of data acquisition and transmission equipment, combining low cost with good performance in terms of power consumption, size and reliability. The know-how obtained was also used in the "Venice Lagoon System" programme and in the PRISMA 2 - SubProject 5 "Remote Sensing and Monitoring" research of the Italian Ministry for University and Scientific and Technological Research (MURST), aimed at the study and safeguard of the Adriatic Sea. Systems for the real-time automatic monitoring of water characterizing parameters (e.g. temperature, salinity, dissolved oxygen, nutrient content) were installed in Augusta (SR), Messina and Venice. The paper presents the equipment developed and some examples of its use.
1. MAIN DESIGN CONSIDERATIONS
The main restrictions the systems must comply with concern the size, weight and power consumption of the equipment, which must be minimized so as to fit also on small buoys. The systems are usually powered by rechargeable batteries backed up by solar cells or wind generators. This rules out the instrumentation that would fit in a laboratory or on a ship, and imposes the need to ensure the proper supply voltage in all battery conditions, avoiding wastage of limited and precious energy. The ordinary maintenance required by a station (sensor cleaning and reagent refilling) should be reduced to a minimum; its interval usually varies, according to the trophic conditions, between 2 and 4 weeks. The measuring systems vary according to the kind of monitoring required: physical parameters (e.g. water temperature and salinity) can be easily measured and also used as tracking parameters (e.g. to estimate the dispersion of pollutants carried by fresh water bodies),
while chemical parameter measurements can be performed using sensors (e.g. dissolved oxygen and pH) or automatic instruments (e.g. nitric nitrogen, phosphates). The communication system (redundant if necessary) must be chosen according to the site conditions: VHF and UHF private radio links are usually the most reliable solutions, but require base stations for the antenna equipment. The use of cellular phones is simpler, but (more for GSM than for E-TACS) their applicability and reliability depend on the telecom company; satellite communications are usually too expensive for long-term monitoring. The last, but not least, requirement to be met is cost: building up a network of n monitoring stations involves multiplying the cost of a single station by a factor of at least n+1 (the +1 taking into account the need for a spare system). The number of stations depends on the characteristics of the site and on the expected results; a preliminary study of the hydrodynamic conditions and of the presence of sewers or river mouths is necessary to define the locations and number of the monitoring stations.
2. THE FIRST SYSTEMS: THE FLEXIBLE MONITORING NETWORK

The first systems were obtained with the cooperation of some firms taking their first steps in this new business. Experiences in environmental monitoring performed in Germany [1,2] and France [3] encouraged us to experiment with the realization of a network in Augusta Bay (SR), of which a very short description follows.
2.1. The buoys

Five small buoys were installed in Augusta Bay for about 3 years in the early 1990s. The mooring depth was around 11 m. The measuring subsystem was based on an industrial-grade 8088 PC-compatible computer, equipped with 12-bit A/D converters with serial output (one per sensor). All the buoys communicated on the same frequency, using 10 W UHF RTXs with analog modems; carrier detection was used to switch the measuring equipment on and off; the operations were controlled by an automatic base station (hosted by the harbour authority), sequentially activating the data acquisition and transmission from each buoy, which had no local data storage. Remote control software was used to manage the network and the data transfer from the Institute in Messina. The measurements performed included water temperature, salinity, dissolved oxygen, turbidity and fluorescence [4].

2.2. The continuous measuring system

The network was supplemented by a continuous measuring system hosted on a little boat cruising on a prefixed course. Sea water was continuously pumped into a little tank housing a multiparametric CTD probe (with sensors for temperature, conductivity, dissolved oxygen and pH), a turbidometer and a fluorimeter for chlorophyll a. The tank also supplied water to a continuous-flow analyzer, able to measure the nitrates in the water with a response time of about 300 sec. The acquired data, integrated with the GPS fix, were displayed in real time and stored in a database for subsequent processing.
3. THE NEW SYSTEMS

The experience obtained with the first systems showed their limits and prompted the custom design of a new series of equipment, all developed and built in Messina.

3.1. The data acquisition and transmission system

The core of the new generation of devices is the data acquisition and transmission system, whose general architecture is shown in Fig. 1. It offers great modularity and is conceptually similar to that described in 1999 by Lessing et al. [5] for the American NDBC. It is able to control almost any instrument, communicating via standard serial or parallel digital interfaces, or reading measurements presented as an analogue voltage or current.

3.1.1. The computer subsystem

The basic architecture of the computer subsystem is an IBM PC-like one, adapted over the years to the different needs and to the availability of new components. An x86 CPU is the heart of the system, dialoguing with external devices through standard RS232 serial communication ports and I/O-mapped bidirectional ports controlled on a single-bit basis; a 12-bit resolution, 16-input A/D converter and solid-state storage devices complete the system. Two different kinds of boards were adopted, both with satisfactory results. The first system, built in 1996 to equip the "Advanced Technology Coastal Monitoring Platform", used a 486 CPU with industrial-grade boards and had no local data storage. Housed in an IP65 box, it was the natural development of a 286 system designed in the early 1990s in the framework of the "Venice Lagoon System" Strategical Project.
Figure 1. The basic architecture of the data acquisition and transmission system.
The computer subsystem was switched on only during the measurements and then switched off again by sending commands to the packet switching modem. A new implementation, always switched on and with local data storage, using a 386 CPU and PC104 standard boards, was developed in 1997 in the framework of the "PRISMA 2" Project and installed on the "Acqua Alta" platform in the Venice Lagoon.

3.1.2. The communication subsystem

The first communication subsystem used in Messina basically consisted of a 2 W UHF radio transceiver, a commutable 10 W RF amplifier-antenna preamplifier and a 2400 baud packet switching radio modem, equipped with 4 remotely controlled digital I/O ports and having the capability to act as a transponder for stations not directly reachable from the base. The system in Venice has been equipped with a Nokia 2110 GSM cellular phone and a 9600 baud Nokia Cellular Data Card PCMCIA modem. A new GSM communication module, expressly designed for this kind of application, has recently been integrated with new PC104 electronics to be used in the Messina system and in the new generation of buoys.

3.1.3. The software

On the remote systems, resident firmware controls all the functions, implementing a set of macrocommands designed to best perform all the measuring and maintenance operations; it also has the capability to perform, if requested, as a stand-alone monitor, and to act as a "repeater" for other stations, so making a buoy the "master" of a series, performing like an "intermediate base station". At the main base station, software runs that is able to perform timed data acquisition with real-time transmission and/or timed data transfer. The operating mode can be chosen according to the communication device and the scientific needs: usually the UHF link is chosen for real-time data acquisition and transmission, while the GSM link is preferred for delayed data transmission.

3.2. The Advanced Technology Coastal Monitoring Platform

The platform (about 8 m²) has the shape of an equilateral triangle, composed of four little buoys (three in the corners and one in the center). A steel grid lies over the structure and is used as a floor for maintenance operators; pyramid-shaped towers are bolted over the corner buoys, while a higher, triangular prism-shaped tower carrying the solar panels overhangs the center one. The platform (Fig. 2, left) was equipped with: the data acquisition and transmission system previously described (Fig. 2, middle), a CTDO probe, a meteo station, colorimetric analyzers for ammonia and phosphates, and a remotely controlled water multi-sampler for bacteriological analysis (Fig. 2, right). The water sampler system features eight 250-ml bottles, preconditioned with 10 cc of formalin for sample preservation, cyclically filled by a peristaltic pump via a circular water distributor with position control, operated by a stepping motor. The bottle frame is a circular slice of Teflon.
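Purely as an illustrative sketch of the sampling sequence just described (the driver objects, step counts and fill time below are invented for illustration and are not from the actual controller):

```python
# Hypothetical control sequence for the eight-bottle water sampler.
N_BOTTLES = 8
STEPS_PER_OUTLET = 200                 # assumed stepper steps between outlets

def take_sample(motor, pump, bottle_index, fill_seconds=60):
    """Align the circular distributor on one bottle and fill it."""
    assert 0 <= bottle_index < N_BOTTLES
    motor.rotate(bottle_index * STEPS_PER_OUTLET)   # position control via stepping motor
    pump.run(fill_seconds)                          # peristaltic pump fills the 250-ml bottle
    # each bottle is pre-conditioned with 10 cc of formalin, so the sample is
    # fixed for later bacteriological analysis in the laboratory
```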
Figure 2. The Platform moored near Messina (left), the Computer subsystem (1996 version) (middle) and a close-up of the water sampler (right).
4. SOME APPLICATIONS

4.1. Upwelling detection in the Strait of Messina

The tidal movements cause the alternate presence in the Strait of Messina of Tyrrhenian (warmer, nutrient-poor) and Ionian (colder, nutrient-rich) waters. A complete substitution does not occur, but springs of Ionian waters rise up among the Tyrrhenian ones. A study performed in the mid-90s, using a little boat equipped with the continuous measuring system, cruising on a fixed course and measuring some tracking parameters (mainly temperature and salinity) in the few hours during which the tidal movements develop their highest and lowest intensity, enabled us to find and map the most important upwelling areas, also near the southern Sicilian and Calabrian coasts, and to define the zones affected by the dispersion of urban waste waters in the various hydrodynamic conditions [4].

4.2. Physical, chemical and bacteriological measurements using the "Advanced Technology Coastal Monitoring Platform" in Messina

The platform was moored in September 1996 at a 20-m depth, near the mouth of a torrent used as an urban sewer, in a site where periodic tidal streams produce an alternate and variable mixing of sea and waste waters [6]. The experience demonstrated the practicability of automatic water sampling and in situ measurement of nutrients. A statistical elaboration of the temperature (T) and salinity (S) values measured hourly in two sample weeks, in September 1996 and March 1997, is reported in Table 1. The lower salinity values in March, having a high coefficient of variation (C.V. = Standard Deviation / Average), confirm the strong influence of the rain waters on the sewer flux and a preponderance of the plume over the tidal movements, while the lower flux in September (confirmed also by the higher salinity levels) shows the tidal variations; the behaviour of the r correlation coefficient (Pearson) is probably caused by the less diluted sewer flux arriving with the Tyrrhenian waters. The statistical elaborations on ortho-phosphate (12 October, 1996) and ammonia (19 October, 1996) concentrations, measured every 2 hours, and their r correlation with temperature, are reported in Table 2 and confirm the tidal influence on the dilution of the sewer.
Table 1

                    T (°C)                        S (psu)
                    min     max     Aver.  C.V.   min     max     Aver.  C.V.     r
10-16 Sept. '96     19.63   24.03   22.39  0.85   34.33   37.09   36.50  0.17    0.59
21-27 Mar. '97      15.28   16.66   15.96  0.04   32.56   36.57   35.00  0.60   -0.21

Table 2

        PO4 (12 Oct. '96)                   NH4+ (19 Oct. '96)
  min     max     Av.     C.V.    r     min     max     Av.     C.V.    r
  0.56    5.38    2.98    2.92   0.11   2.20    4.89    3.14    0.71   0.35
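The statistics reported in Tables 1-2 (C.V. and the Pearson r) can be reproduced with a few lines; this is our own sketch, under the C.V. definition given above, with hypothetical array names:

```python
import numpy as np

def row_stats(x, y):
    """Summary statistics as in Tables 1-2: min, max, average, C.V., Pearson r."""
    cv = np.std(x) / np.mean(x)          # C.V. = standard deviation / average
    r = np.corrcoef(x, y)[0, 1]          # Pearson correlation coefficient
    return x.min(), x.max(), x.mean(), cv, r

# e.g. row_stats(salinity_hourly, temperature_hourly) for one sample week
```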
The positive Pearson coefficient shows that the measured values increase with temperature, the warmer Tyrrhenian waters bringing the plume closer to the platform. The values of r depend on the different dispersion of ortho-phosphates and ammonia in the skin and subsurface layers. Water samples automatically collected and fixed were analysed in the laboratory to detect the presence of Escherichia coli, a micro-organism indicator of faecal pollution. The data obtained always show high concentrations of Escherichia coli cells, ranging from 1.36x10^3 to 3.78x10^4 cells/100 ml (Fig. 3). All the measurements showed the existence of a base level of chemical and bacteriological pollution, with increases related to the presence of warmer Tyrrhenian waters drawing the waste waters closer to the platform.
4.3. Physical and chemical measurements in the Venice Lagoon

The system was installed in 1997 in a laboratory on the second deck of the "Acqua Alta" platform in the Venice Lagoon. It is equipped with a CTDO probe, a modified Barnes PRT-5 TIR radiometer and a µMAC ammonia colorimetric analyzer. The combined use of the radiometer and the submersed CTD probe allows us to observe the variability (due to solar radiation and wind) of the surface temperature with respect to the water mass underneath, and the presence of "anomalies" (like the peak in the hourly measurements shown in Fig. 4) that can be related to the presence of materials and fluxes affecting only the very skin layer [7]. The observation of the water temperature and ammonia content in Fig. 5 displays the arrival of warmer offshore waters, having a lower ammonia content than the continental ones; by measuring some physical tracking parameters it is possible to monitor the arrival of fresh waters and hypothesize on the sea's capability of diluting pollutants of anthropic origin [8].
Figure 3. Escherichia coli and temperature values at the Messina platform from March 20 to 27, 1997.
Figure 4. Temperatures measured by the radiometer and the submersed probe from September 10 to 29, 1997.
Figure 5. Temperature and ammonia values measured at the Venice platform from February 23 to 24, 1999.
5. CONCLUSIONS

The performance of the systems is well worth their cost. Thanks to the modular structure of the acquisition systems, new instruments and sensors can easily be added. The systems have proven to be a useful, reliable basis for building networks for automatic environmental monitoring, and they are still open to further development.
REFERENCES
1. K. Grisard, Proc. Oceans '94 (I) (1994) 38.
2. D. Knauth, F. Schroeder, R. Menzel, S. Thurow, S. Marx, E. Gebhart, D. Kohnke and F. Holzkamm, Proc. Oceanology International '96 (III) (1996) 21.
3. A.H. Carof, D. Sauzade and Y. Henocque, Proc. Oceans '94 (III) (1994) 298.
4. E. Crisafi, F. Azzaro, G. Zappalà and G. Magazzù, Proc. Oceans '94 (I) (1994) 455.
5. P. Lessing, D. Henderson and B. Edwards, Proc. Oceans '99 (II) (1999) 785.
6. G. Zappalà, E. Crisafi, G. Caruso, F. Azzaro and G. Magazzù, Proc. Oceanology International '98 (I) (1998) 69.
7. G. Zappalà, L. Alberotanza and E. Crisafi, Proc. Ocean Community Conference '98 (I) (1998) 585.
8. G. Zappalà, L. Alberotanza and E. Crisafi, Proc. Oceans '99 (II) (1999) 796.