Ethics and War in the 21st Century
Ethics and War in the 21st Century explores the ethical implications of war in the contemporary world. The author, a leading theorist of warfare, explains why it is of crucial importance that Western countries should continue to apply traditional ethical rules and practices in war, even when engaging with international terrorist groups. The book uses the work of the late American philosopher Richard Rorty to explain the need to make ethical rules central to the conduct of military operations. Arguing that the question of ethics was re-opened by the ‘War on Terror’, the book then examines America’s post-9/11 redefinition of its own prevailing discourse of war. It ends with a discussion of other key challenges to the ethics of war, such as the rise of private security companies and the use of robots in war. In exploring these issues, this book seeks to place ethics at the centre of debates about the conduct of future warfare.

This book will be of great interest to all students of military ethics, war studies, military history and strategic studies in general, and to military colleges in particular.

Christopher Coker is Professor of International Relations at the LSE and Visiting Professor at the Staff College, Oslo. He is the author of many books on war, most recently The Warrior Ethos (Routledge, 2007).
LSE International Studies
Series Editors: John Kent, Christopher Coker, Fred Halliday, Dominic Lieven and Karen Smith

The International Studies series is based on the LSE’s oldest research centre and, like the LSE itself, was established to promote interdisciplinary studies. The CIS facilitates research into many different aspects of the international community and produces interdisciplinary research into the international system as it experiences the forces of globalisation. As the capacity of domestic change to produce global consequences increases, so does the need to explore areas which cannot be confined within a single discipline or area of study. The series hopes to focus on the impact of cultural changes on foreign relations, the role of strategy and foreign policy and the impact of international law and human rights on global politics. It is intended to cover all aspects of foreign policy, including the historical and contemporary forces of empire and imperialism, the importance of domestic links to the international roles of states and non-state actors, particularly in Europe, and the relationship between development studies, international political economy and regional actors on a comparative basis, but is happy to include any aspect of the international with an interdisciplinary aspect.

American Policy Toward Israel
The power and limits of beliefs
Michael Thomas

The Warrior Ethos
Military culture and the war on terror
Christopher Coker

The New American Way of War
Military culture and the political utility of force
Benjamin Buley

Ethics and War in the 21st Century
Christopher Coker
Ethics and War in the 21st Century
Christopher Coker
First published 2008 by Routledge
2 Park Square, Milton Park, Abingdon, Oxon, OX14 4RN

Simultaneously published in the USA and Canada
by Routledge
270 Madison Avenue, New York, NY 10016

This edition published in the Taylor & Francis e-Library, 2008.

Routledge is an imprint of the Taylor & Francis Group

© 2008 Christopher Coker

All rights reserved. No part of this book may be reprinted or reproduced or utilised in any form or by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying and recording, or in any information storage or retrieval system, without permission in writing from the publishers.

British Library Cataloguing in Publication Data
A catalogue record for this book is available from the British Library

Library of Congress Cataloging-in-Publication Data
Coker, Christopher.
Ethics and war in the 21st century / Christopher Coker
p. cm.
1. War (Philosophy) 2. War–Moral and ethical aspects. 3. War on Terrorism, 2001– I. Title. II. Title: Ethics and war in the 21st century.
U21.2.C6396 2008
172’.42–dc22
2007034547

ISBN 978-0-415-45280-9 (hbk)
ISBN 978-0-415-45282-3 (pbk)
ISBN 978-0-203-93089-2 (ebk)
These moral laws
Of nature and of nations

(William Shakespeare, Troilus and Cressida)
Contents

Preface  ix

1  The war on terror  1
   A new discourse on war?  1
   Richard Rorty and the ethics of war  7
   Conclusion  15

2  Etiquettes of atrocity  17
   Etiquettes of atrocity  17
   Why we take prisoners of war  24
   Discourses on war  30
   Keeping to the discourse  35
   The United States and Vietnam  36
   Carl Schmitt and the Theory of the Partisan  44
   Conclusion  52

3  Changing the discourse  54
   Germany and the Eastern Front 1941–5  54
   Algeria and the guerre révolutionnaire  61
   Israel and the intifada  67
   Conclusion  74

4  A new discourse  77
   Excluding from the discourse on war: from Guantanamo to Abu Ghraib  77
   Is it a war?  79
   A new paradigm?  87
   New wars, new paradigms  95
   Conclusion  112

5  Grammars of killing  114
   Grammars of killing  115
   Respecting our enemies  124
   Non-lethal weapons  127
   Conclusion  131

6  The unconditional imperative  132
   Micro-management of the battlefield  134
   Corporate warriors?  138
   Asimov’s children  143
   Conclusion  152

7  Back to the Greeks  155
   Back to the Greeks?  155
   Simone Weil and The Iliad  157
   Reading Thucydides  160
   What is Hecuba to him?  165

8  The heuristics of fear  169
   The ambiguity of peace  170
   Towards the future  174

Notes  176
Bibliography  189
Index  193
Preface
    Dear Mom and Dad: The war that has taken my life, and many thousands of others before me. It is amoral, unlawful, and an atrocity …1

These were the opening words of a letter left at home by an American soldier, to be opened if he did not return. They were the first words his parents read after they were told he had died in the field.

Every war, we are told, is an atrocity. Even to suggest that it can be moral – or teach moral lessons – seems quite perverse to many people. In The Things They Carried, Tim O’Brien, who served in Vietnam, tells us that ‘a true war story is never moral. It does not instruct, nor encourage virtue, nor suggest models of proper human behaviour, nor restrain men from doing things men have always done’. If it seems to contain a message or convey a lesson, don’t believe it.2

A moment’s reflection, however, will reveal that atrocities are actually part of the landscape of war. All killing is atrocious, but that in itself does not delegitimise war. To understand war, writes Dave Grossman in his seminal book On Killing, we must understand the protocols of atrocity. When we go to war we try to reduce atrocities to a minimum. We even have etiquettes prescribing what we can and cannot do, in order to distinguish the act of killing from the act of murder as sharply as possible.3 We are permitted to target some people, but not others. When we kill, we are expected to do so proportionately and within reason. We are not allowed to kill indiscriminately. As for those we take prisoner, we are rule-bound to keep them alive until they can be repatriated at the end of hostilities. Such protocols are as old as recorded history. They can be traced back to the ancient Egyptians. They cross the barriers of culture as well as time.
The War on Terror proclaimed by President Bush in 2001 has produced a series of responses and practices that have re-opened the question of ethics and, in the process, divided the US and Europe in much the same way that Israel’s response to the second intifada has divided it from its allies. I discovered this at first hand when I attended a conference in Norfolk, Virginia in 2003. Its theme was ‘On the Cusp operations’, a term of art which never really caught on. However, the term was applied to those hybrid operations which, in the case of Iraq and Afghanistan, the
US and UK have found themselves fighting (and for which they discovered that they were largely unprepared).4 I was invited to give a lecture on the ethics of counter-insurgency warfare. Following my talk, a senior British general came up to me. ‘I liked your talk,’ he remarked. ‘You were spot on, but I’m afraid they don’t get it, do they?’

I was not surprised by his observation. British and American strategic cultures are very different. The two armies have operated in separate theatres of operation in Iraq, which has mitigated the clash between two very different operational cultures and their equally different ideas about the practice of war. There are many British officers who feel that the US military is in the grip of a dogged ‘Dunkirkism’ – a refusal to admit to reality that can only result in further defeats and defeatism.

The general was right to highlight one of the central differences between the two societies. Since the 1970s and Northern Ireland (not, I would hazard, necessarily before), the British Army has tended to put ethics at the centre of its operational doctrines. Perhaps this was largely because Northern Ireland was not a foreign country, but part of the UK. Some of the harsher policies the British had pursued in their colonial campaigns – from Malaya to Kenya – could hardly be repeated in the UK itself. (Not that the army did not, from time to time, find itself accused of criminal behaviour.)

But ethical practices are not confined to any one military culture; they inhere in the nature of war itself. It is unwise for any army to see ethics as an ‘add-on’, a secondary theme that is not really central to war or the conduct of military operations. Far from being an add-on, ethics is at the absolute core of what defines the military profession, and of what makes the soldier’s life a continuous set of ethical challenges. In recent years ethical practices have been given added urgency in counter-insurgency operations (which are probably the most demanding of all).
Arguably, it is also the only type of warfare that Western forces will be committed to in future. Ethics today is important because the dynamics of the wars we now fight have a consequentialist dimension. In our television-networked world, the rise of ‘news guerrillas’, like the arrival of bloggers within the Armed Forces, has created a networked environment which is very different from the much-discussed ‘CNN Effect’ of the 1990s.

It is important to recognise that with the passing of the George W. Bush administration we are likely to place the War on Terror in perspective. The idea of an indefinite and unlimited war is unlikely to remain at the centre of US foreign policy. The attacks of 9/11, 7/7 and others may even come to be seen as ‘lucky shots’; mass terrorism may not remain a defining threat of strategic proportions. The next administration is also unlikely to continue insisting that combating the spread of global terrorism is at the root of the Middle East’s problems. It may even acknowledge that terrorism is not fuelled solely by a virulent and anti-democratic ideology that, as a ‘cult’, sells a pseudo-religious vision to impressionable young men and women. This may well hold for those who draw their inspiration from Al-Qaeda, but terrorism has other causes which will eventually have to be addressed by the international community. Finally, some of the practices that we associate with an incompetent administration, including the
indefinite imprisonment of alleged combatants without judicial oversight and extra-territorial prisons, are likely to be repudiated. But terrorism will be with us for the duration.

It is true that the terrorists do not pose the same existential threat as the Soviet Union, which could have annihilated us in 30 minutes, but this misses the point. States may be more secure from each other than they have been for 150 years, but our fellow citizens are more vulnerable than ever before – at least in their own imagination. States can sign peace treaties, enter into armistices with their enemies or cease hostilities for a limited period. There are no armistices with non-state actors who believe they are permanently at war with the rest of us.

The actors we now find ourselves fighting are often beyond our understanding. They call themselves ‘warriors’, but they defy the traditional understanding of the term. Warriors are not supposed to kill indiscriminately, let alone deliberately target civilians. Our own warriors are bound by codes of conduct that are centuries old. ‘The Warrior’s Honour’ requires them to observe the rules of proportionality even when fighting in the field, where there are no judges and no law courts to which they are immediately accountable. And though the West did come across suicide bombers in 1944–5, when the Japanese in desperation launched waves of kamikaze attacks, the veterans of that campaign are aghast that they should be tarred with the same brush. The few who have survived into the twenty-first century insist, for example, that they wore uniform, targeted only Allied soldiers or sailors and were answerable to the state. The samurai code may seem strange to the Western mind of today (and certainly of the 1940s). The Americans even commissioned Ruth Benedict, one of the leading anthropologists of the day, to make sense of the kamikaze phenomenon.
Yet even their behaviour can be explained far more easily than that of the young men and women who sacrifice themselves for Islam. Why do they do it? What is it that induces young people to take their lives and those of others? All that anthropologists can tell us is that the resort to terrorism apparently has multiple causes. One cause is the personal and cultural alienation experienced by some young men. A violent, global movement like Al-Qaeda seems to offer them an answer of sorts. Unfortunately, anthropology can only explain so much; it does not allow us to label our enemies more precisely. We do not even have the perverse satisfaction of asking ‘Who are these alien people?’ and replying ‘They are what we have made them’. For some studies of suicide bombers show that there is not always a cause that we would recognise as rational. Poverty or social marginalisation may motivate some, but many terrorists are what they themselves have chosen to become.

Inevitably, the next question we ask ourselves is whether these ‘alien people’ deserve to be treated as we treated the Chinese in Korea or the Vietcong in Vietnam. Do they merit the protection of the Geneva Conventions, or should we label them criminals? These ‘alien people’ are not much different (though more murderous, perhaps) from the feral adolescent gangs in some of our own inner cities who demand respect at the point of a knife or a gun because they cannot earn or win it any other way.

11 September 2001 was a shock for most and a turning
point in world history for some. The intoxicating moral certainty of the hour even persuaded a few in Washington that only the most ruthless response would save the day. A new age apparently demanded new measures. The gloves came off. A few years earlier, the historian John Keegan had regretted that the West no longer displayed the cultural ruthlessness to preserve what was left of its nineteenth-century imprint on history. After 9/11 the Bush administration responded in kind to the attacks. It did not quite resort to the language of the terrorists, let alone their currency of war – terror – but it chose to act in defiance of America’s own liberal tradition. Guantanamo Bay, Abu Ghraib and extraordinary rendition entered the popular consciousness for the first time.

Every war has its own unique language. The War on Terror is no exception, but with one significant difference: the new vernacular largely defined the moral landscape against which military operations are conducted, both on and off stage. The damage this has done to America’s reputation is incalculable. Since 9/11 American power has diminished as the plan to democratise the Middle East has foundered in the killing grounds of Iraq. In turn, the legitimacy of American power has been compromised by its decision to ride roughshod over international law and the United Nations; the US chose to go to war without a Security Council mandate. Most regrettable of all, its status as a nation has been much diminished. Much of the non-Western world has come to distrust its intentions, and even its allies have come to question its judgement. What is the price of cultural ruthlessness when it has so markedly contributed to the contraction of American power, legitimacy and prestige?

What the Bush administration has forgotten is that we have rules because war demands it. It is possible to be ruthless and observe those rules at the same time.
In fact, it is not only possible, it is necessary if we are to be successful. Others have written eloquently about the laws of war and the Just War tradition. I am not attempting to replicate their work in this study. I shall look at ethics from a purely pragmatic perspective – which is likely to have greater appeal to those of my readers who, by inclination, are impatient with academics who invoke arguments from the classic theological or philosophical texts which still form the mainstay of most ethics of war courses. Without philosophy, however, one cannot grasp the phenomenon of war. This book is philosophical to a degree, but the philosophy is there for a reason: to prompt us to ask the right questions. The study of military history will provide the answers. If historical lessons are to be drawn (which is the only reason for studying history), they show that societies which understand the need for restraint, proportionality, discrimination and legitimacy tend to have an advantage over those which do not. Simply put, we have rules not because we are nice, but because we are sensible. History is there to ensure that we know more and, with luck, know better. The chapters that follow explore the ethical implications of war in the contemporary world. The study begins with the War on Terror, but is not limited to it. Other chapters discuss the ethical implications of non-lethal weapons (NLW), the micro-management of the battlefield and the rise of private security companies (PSC). I also discuss the ethical implications of using robots. The
latter may be the most important factor of all, for if there are many different values rooted in the different possibilities of human life, those possibilities have been extended through technology. None of these developments absolves us of responsibility for our actions. Indeed, in a networked world, actions have perhaps greater consequences than ever before.

To conclude, this is a very practical book. It is grounded mainly (though not exclusively) in Anglo-American pragmatism. The ethics it promotes is at one with consequence management – the demand of the hour in the risk societies we have become. We are encouraged all the time to weigh the consequences of our own actions. It is a lesson that needs to be driven home at every level, for the days when we could consign ethics to staff colleges have long since passed. ‘High rank can be very dangerous,’ said Pangloss, ‘all the philosophers say so’. If that was true when Voltaire wrote his satirical masterpiece Candide, it is equally true for today’s low-ranking soldiers, as the recent Abu Ghraib incident showed.

Ethical considerations are becoming more demanding even in battle. We expect small tactical units to be agile and adept at many competing tasks in a chaotic environment full of neutrals, civilians and non-governmental organisations (NGOs), in a grey area between war and peace. A corporal, or even a private, can compromise an entire mission. An ethical disposition is one of the qualities demanded of military life, as are fortitude, self-restraint and personal loyalty to one’s friends, the men and women one stands by in the line of fire. The highest service of the military to the state may well lie in the moral sphere – and that goes for every rank, not just the highest ones.
1
The war on terror
A new discourse on war?

On 3 April 2006, the British Defence Secretary, John Reid, gave a talk before a largely military audience about the rules of twenty-first-century warfare. For centuries, he told his audience, conflict between tribes, cities and states had been unbridled and savage. Only gradually had mankind developed a range of conventions that could be applied to constrain and moderate what was, in essence, a brutal activity. Eventually those agreements had become rules, which over time had become laws. Much had been achieved, he added, in the current legal frameworks which went by the name of international humanitarian law. However, warfare continues to evolve, and in its moral dimensions we now have to cope with the deliberate regression towards barbaric terrorism:

    Today, against the background of the uneven nature of the modern battlefield and the unconstrained enemy ranged against us, we must be swifter to support, and slower to condemn, our armed forces. I am not in any way suggesting that British forces should operate outside the law. The legal constraints upon us, set against an enemy which adheres to none whatsoever but is swift to insist that we do, make life very difficult for the forces of democracy ... We owe it to ourselves, to our people, to our forces and to the cause of international order, to constantly reappraise and update the relationship between our underlying values, the legal instruments which apply them to the world of conflict, and the historical circumstances in which they are to be applied, including the nature of that conflict.1

John Reid concluded by remarking: ‘I believe we need now to consider whether we – the international community in its wider sense – need to re-examine these conventions. If we do not, we risk continuing to fight twenty-first century conflicts with twentieth century rules.’2 When he became Home Secretary a few months later, Reid took this new remit with him.
He even went so far as to suggest that the law be changed so that torture could be justified in circumstances where the state considered it essential to the security of everyone else.
John Reid was not alone in his thoughts. Many other, better-informed observers have complained that while the nature of war has changed dramatically, the rules designed to govern the actions of soldiers engaging in battle have remained, for the most part, static. In the context of the War on Terror this inflexibility is said to have created a dilemma in which ‘the rules of war’ clash with ‘the situational demands of military necessity’. Donald Rumsfeld, the former US Secretary of Defense, famously insisted that the world had changed and that business as usual would not do. If this is indeed the case, then we should ask ourselves as a matter of urgency whether it is not time to think the unthinkable, including the use of torture.

Let me list three critical respects in which war seems to have changed, on which there is some agreement (there are many others, but from the point of view of the ethics of war those I shall cite seem to me by far the most significant).

First, the post-Westphalian codes we have in our repertoire of political controls presuppose a political battlefield: two states battling it out for a recognised political end. In such conflicts there is room for negotiation and compromise, the foundations of any enduring peace. It is this political space that seems to have been hollowed out since 2001. One of the reasons why John Reid could be forgiven for thinking that the laws of war need to be revised is obvious: the enemies we now face are (largely) non-state actors. Because many of them are not governed by the same prudential rules as (most) states, it is easy to fall into the trap of seeing them as distinctly malign. It was especially foolish of the Bush administration, however, to claim that the perpetrators of 9/11 were ‘evil’ by nature, not just by design.
At the funeral oration at the National Cathedral in Washington three days after the World Trade Center attack, Bush declared that Americans did not yet have the distance of history – yet he also insisted that ‘their responsibility to history’ was already clear: to rid the world of evil. It was Michael Gerson, his speech writer and a strong Evangelical Christian, who transformed the original phrase ‘axis of hatred’ into ‘axis of evil’ a few months later. This was translated into a combat doctrine once the War on Terror began in earnest.3

We have often demonised our enemies in the past, of course. In the heat of battle it is difficult to be dispassionate. Yet we have tried to draw a distinction between the states with which we have found ourselves at war and the citizens they have sent into battle. In World War I this distinction was gradually obscured as the conflict deepened; in World War II it was maintained until the very end. Although Nazi Germany retains its exceptionalism as, indisputably, the most evil regime of the modern era, both Britain and the United States insisted that they were not at war with the German people.

What makes the present conflict different in the eyes of many is that the enemies we face seem to be so intractable. Some are suicidal; others seem eager not only to kill but to kill the largest number of people possible; some are apocalyptic in their dreams, others more apparently rational. Whatever the issues, the US now believes itself to be engaged in a life or death struggle against evil:
‘Wherever there is evil, we will fight it’, declared the deputy commander of Europe Command in 2005.4 This view was expressed much earlier in a book published by David Frum and Richard Perle in the immediate aftermath of the Iraq War (2003). It was provocatively entitled An End to Evil. Its first chapter ends with the following conclusion:

    For us, terrorism remains the great evil of our time and the war against evil, our generation’s greatest cause. We do not believe that Americans are fighting this evil to minimise it, or manage it. We believe they are fighting to win – to end this evil before it kills again and on a genocidal scale. There is no middle way for Americans: it is victory or holocaust.5

Rarely has a case been presented with such clarity of vision. But is the vision well informed? There are a number of objections that can be raised to framing the issue so tendentiously.

The first objection is that it feeds our obsession with what Carl Sagan called ‘our demon-haunted world’. Our world seems to be full of demons, including terrorists, muggers, drug lords, loggers in the Amazon and people with diseases. Conspiracy theories abound, so much so that the 1997 edition of the Oxford English Dictionary felt it necessary to define the term ‘conspiracy theory’ for the first time. As a people, Americans in particular have a tendency to see universal conspiracies where none exist, and to identify demons when we encounter only flawed human beings who are much more fallible than we think, and therefore easier to deal with.

A second objection to invoking ‘evil’ in the abstract is that it depoliticises politics. We do indeed find ourselves at war with a number of evils. We have even declared war on many of them, which is why we speak of the War on AIDS, the War on Drugs, the War on Crime and, though less often than before, the War on Poverty.
In terrorism we have another abstract enemy, but unlike those above it is an old-fashioned political one, even if its methods are anything but old-fashioned. Terrorists have agendas. If you identify terrorism as ‘evil’ rather than ‘an evil’, and if you refuse to manage, contain or limit it, then you are in danger of depoliticising it at the same time. There are some who would have us believe that terrorism is meta-political (that it is beyond politics, that it is something more transcendent and other-worldly).6 There are others who insist it is dispersed, disarticulated and without political expression; they contend that it is simply apocalyptic and therefore not open to interrogation or serious analysis. One must be careful here. ‘Evil can only be dealt with by means of evil,’ warned Jean Baudrillard some years ago.7 To fight evil is not without its own irony, or its own propensity to evil, for it encourages us to fight terror with terror at the same time. Evils, by contrast, can indeed be managed, which is the business of politics and politicians. In managing evils we are enjoined to act in accordance with our own ethical codes, in order to be better placed to win the political discussion.

Second, some claim that the War on Terror differs from every other conflict because of the question of religion. Our moral codes have traditionally been
derived from religious injunctions, but in the last three hundred years the West has largely derived the laws of war from what the philosophers have had to say, not the theologians. Beginning with Suarez, Grotius and Pufendorf and going on to Kant, it is philosophy which has carried the burden of the ethical debate. What are we to make of a world in which religion has returned, sacralising violence and transforming the terrorist into a martyr?

The War on Terror is an inherently ethical struggle. Whether we identify it, as President Bush does, as ‘the first war of the twenty-first century’, or treat it as part of a much larger conflict – a ‘clash of civilisations’, ‘the West v the Rest’ (the formulations differ, although they hold a tenacious purchase on our imagination, whether we agree with them or not) – the central reality of conflict, at the time of writing, is religion, not inter-state rivalry over resources. Nor is it a war of the ‘haves’ against the ‘have-nots’ prompted by global inequality, or an ideological clash between the forces of globalisation and the movements that oppose it. What I think lies at the root of this conviction is the belief that this conflict is quite different from any other because there can be no compromise on Truth. It is especially daunting, for example, to be faced with an adversary who believes that faith itself is a virtue, even when it flies in the face of reason. If one is dealing with an irrational enemy who prefers death to life, can the old rules be upheld? As Voltaire once famously remarked, ‘Those who make you believe absurdities can make you commit atrocities.’8 Is this what war has now become – atrocity management?

At the root of our difficulties with religious or sacramental violence lies the fundamental debate about the role of human reason in political affairs. We find it difficult to understand so much of Islamist terrorism in pragmatic or politically rational terms.
What is being demanded of us by some is the surrender of the bedrock of the Western conception of itself, of everything (to use the word advisedly) that Western liberals themselves regard as ‘sacred’ (i.e. non-negotiable). At the same time, in this very real ideological struggle the West is being challenged to separate the faith from its violent exponents, with the cooperation of the Muslim majority. That means it has to act rationally (or reasonably). It cannot fight evil with evil.

I do not suggest that we have exaggerated the threat. It is precisely because it is very real that we need to treat ethics seriously. It is because we are engaged in a war against those who consider and proclaim themselves to be at war with us that we must understand why, over time, we have developed the ethical codes that we ask our soldiers to observe even in the most challenging of circumstances.

Finally, our ethical and legal codes have tended to adopt the Weberian understanding that violence is legitimate only when exercised by the state. It was this understanding which allowed Clausewitz, long before Max Weber put pen to paper, to conceive of war as a duel between states or moral equals. Duels, after all, are the practice of members of a freemasonry who hold certain values in common and are prepared to defend their honour, at times at great risk to themselves. In a duel there is always a winner and a loser, unless one of the parties (or both) agrees to suspend the exchange, confident that honour has been satisfied. States, insofar as they are legally constituted, are by definition legitimate actors. The
only exception in three hundred years of European history occurred in 1815 when the Congress of Vienna declared Napoleon an international outlaw. Today we face a series of non-state actors instead. In the case of Iraq these include warlords, foreign volunteers (terrorists), gangs, militias, separate political parties and even the police and the army (or at least factions within them). Iraq indeed has entered a stage beyond civil war – instead of breaking apart (like Yugoslavia) it is breaking down. In Yugoslavia, at least, the separate armies were on subcontract to factional leaders or warlords with whom we could cut a deal. No-one in Iraq can. The violence is self-generating for that reason. It has its own rationale. The former New York Times reporter Chris Hedges calls it a Hobbesian playground; the columnist Tom Friedman calls it 'Hobbes' jungle' (with no Leviathan in the wings).9 By definition, Iraq is a country from which the 'political' seems to be largely absent, or so hollowed out that it is difficult to imagine a brokered agreement, a Dayton Accord, that could concentrate everyone's minds on the future.

All three factors, I would suggest, explain why our politicians have begun to question whether our traditional ethical codes can be applied to the present conflict. Wars, it would seem, are no longer symmetrical, even when it comes to traditional ideas of what is right or wrong, or proper or improper behaviour. We know that our enemies (those who think, or claim, to be at war with us) scorn our own values, and would not sign up to the Geneva Conventions even if asked. In short, we are dealing with movements or leaders who are not willing to abide by those etiquettes of war which most states, with only a few exceptions, have chosen to recognise for centuries – if sometimes only in the breach, not the observance. Let me take one example of this – the value we attach to historic and cultural sites.
One of the hallmarks of conflict in the twentieth century was its wanton cultural vandalism. Back in 1947 the literary critic Edmund Wilson wrote a book, Europe without a Baedeker, which chronicled in embarrassing detail how a society had not only surrendered its power but had also almost forfeited its soul.10 Some of the great cultural achievements had gone up in smoke: in the last air raid on Berlin, some 435 irreplaceable works by Titian, Caravaggio and Veronese disappeared in a single night's bombing. Likewise much of the Continent's architectural heart disappeared: much of the fabric of pre-eighteenth century London was destroyed in the Blitz; baroque Dresden was burnt out in the final months of the war; the centre of renaissance Rotterdam was incinerated at its outset.

Today, we congratulate ourselves on being more civilised. One of the milestones in what the German philosopher Jürgen Habermas calls the 'juridification' (as opposed to the moralisation) of war is an article to be found in the 1954 Hague Convention which my own country, Britain, only ratified as recently as 2005. International law no longer allows states to target the hundred cultural sites nominated by governments as being at the core of the national understanding of itself. In Britain they include The National Gallery and Tate Modern, though not, interestingly, St Paul's Cathedral. Ironically, for those who believe the devil is in the detail, the first (silent) movie to depict a terrorist attack on a major city was made over a century ago.11 In a 1906 film, two biplanes
piloted by anarchists (the first suicide bombers) are depicted flying into the dome of St Paul's Cathedral.12 Unfortunately for the rest of us, the Geneva Conventions have not been ratified by terrorist movements.

Let me return to John Reid's speech. What the Defence Secretary was addressing was not the morality of this or that particular law, but the ethical codes by which war is practised. Irrespective of our justification for employing violence against other states, when we do go to war we have rules for a reason. Alasdair MacIntyre insists on calling the discussion of moral problems in all the major professions, including the military, 'applied ethics'.13 Many moral philosophers accept this definition and it is now widely used on academic courses, though it is not entirely unchallenged. This study does not ask whether war itself is a moral activity or, for that matter, whether any particular war is just or unjust in the eyes of its protagonists. It is concerned entirely with the applied, not pure, side of ethics. It is probably true to say, however, that the Western world today has a greater moral problem with war than perhaps any other society because it finds the means so unappealing. We require, among other things, greater legal justification for going to war, greater humanity in targeting and greater proportionality in response. Our ethical practices reflect or take account of the demands of the hour, but it is the chief contention of this study that many are centuries old. And this is for one seminal reason – ethics is essential to what Clausewitz called the nature of war. Indeed, in contrast to John Reid, I shall contend that the ethical practices of the past have not been rendered irrelevant by the War on Terror, that the demands of the hour do not require a radical rewriting of the jus in bello. The novelty of the present struggle notwithstanding, we should still continue to subscribe to the conventions we have applied for the past three hundred years.
What the War on Terror does challenge us to do is to rethink why we have these conventions and why we should continue to apply them. I shall explain why armies that have observed the codes have been rewarded. I shall explain the very practical benefits that good behaviour usually brings. I shall also contend that we cannot escape the dictates of rationality even if (indeed, precisely because) war is becoming more complex. The ethics of war actually inheres in complexity itself: it stems from the consequences of acting badly. Such an understanding has been implicit in military writing from the very beginning. Indeed, Thucydides quotes Pericles' remark, 'I fear our own mistakes more than anything the enemy may devise'. We still read Thucydides, not because of what he has to tell us about warfare two thousand or so years ago, but because the lessons that he drew are rooted in human nature. We have rules so as to be better placed to moderate our own behaviour. As Thucydides warned us, more often than not it is we ourselves who pose the greatest danger to ourselves. In our attempt to defeat terrorism, we need to grasp this more than ever.

This study is rooted in an empirical approach which pays scant regard to just war theories or metaphysical principles, but a great deal more to human psychology, as did Thucydides himself. It is written for a practical audience, including soldiers themselves. Bernard Williams once remarked that he saw his role as a philosopher as reminding 'moral philosophers of truths about human life which are very well
known to virtually all adult human beings except moral philosophers.'14 I would remind the generals of ethical truths that are very well known to everyone except (some) generals – and the politicians who order them into battle.
Richard Rorty and the ethics of war

It is usual to talk about the ethics of war, but outside academia it is even more usual to talk about morality when discussing a subject such as war with its attendant risks and dangers. Both words have equal force. In this study I shall distinguish the words 'morals' and 'ethics' in the fashion outlined below. By 'moral' I mean first principles, in particular what we consider to be justice, and the claims that others can make against us. Implicitly and explicitly such morality assumes an idea of the good – a set of values to which we can aspire. Everyone has a conception of justice but most of us have a different understanding of what is 'just' in practice. The value of justice is not in question; what is in question are the norms we invent to make a more just world for ourselves.

In the case of war there are three moral traditions. The first is the idea that war can never be justified, an idea that has appealed to few (if any) governments since the state first came into existence. The second tradition, that all wars can be justified, has been increasingly challenged since the late nineteenth century. It is now difficult for states, especially in the Western world, to fight for the national interest, a geo-political advantage or even the balance of power. The third tradition has come into its own: Just War theories have been with us since St Augustine (even earlier if we go back to the Greeks). These days wars can only be held to be just if they secure a humanitarian end or, in the case of the War on Terror, help to prevent further civilian casualties. To be 'just' a war must meet certain moral criteria such as right authority, right intention, last resort and reasonable hope of success.15

By 'ethics' I mean the application of the principles which would allow us to realise those values (insofar as we ever can) in a particular historical, social or professional context.
It is usually not soldiers who decide whether a war is just or not – they leave this to the politicians even though, as with the British generals in the run-up to the Iraq war (2003), they may seek legal advice as to a particular conflict's legality. Soldiers are usually held to account for their own actions and practices in the field. It is at this point that we must make a further observation. Whereas ethics can only be codified, morality can only be argued. Which of the two should be privileged, whether morality should be seen as the product of ethics (that is, as a distillation of practice) or whether judgements of morality should take precedence as a precondition for the ethical life, is a question I leave to the philosophers. I am certainly not qualified to answer it.

Let me also say something about laws. Moral thought is deeply concerned with the systematic examination of the relations of human beings to each other. Ethics is concerned with how we should conduct relations with others; when its principles are codified by the state they become laws. International humanitarian law – also known as the law of armed conflict or the Laws of War – has two branches. One is the 'law of Geneva' which is designed to safeguard military personnel who by
virtue of surrendering no longer take part in the fighting, as well as those not actively involved in hostilities, i.e. civilians. The other is the 'law of the Hague' which establishes the rights and obligations of belligerents in the conduct of military operations and limits the means of harming the enemy. The two branches of international law draw their names from the cities in which each was initially codified. With the adoption of the Additional Protocols of 1977, which combined both branches, that distinction is now merely of historical interest.

It is important to have such laws – they bind us to good practices. They hold us accountable if we depart from our principles (the conventions or codes we have applied over the centuries). However, it is also important to recognise that laws are only observed because they legitimate existing practices. We have laws because of our practices and we have practices because we have learnt that without them we are a threat to ourselves. In other words, laws derive their usefulness, force and even legitimacy from the insight into what is equally good for all parties. Above all, laws acquire a binding force only within the practices of a particular cultural form of life. Every notion of law (virtue), MacIntyre reminds us, as well as every notion of justice, is embedded in a social tradition as a set of practices specific to a particular community. MacIntyre goes so far as to argue that moral philosophy characteristically presupposes a sociology. In other words, particular values, ethical conceptions and the laws which prescribe ethical practices presuppose a social content and a social context.16

We can see this in operation in the case of the West, which has finessed the laws of war over time. When James Boswell visited the Arsenal in Venice and contemplated 'the great storehouse of mortal engines' he was particularly struck by the fact that men could so calmly construct 'the instruments of destruction of their own species'.
Still, he was able to find some consolation in the thought that if civilisation had failed to abolish war, it had, at least, 'refined its savage rudeness'.17 We have made immense strides since Boswell's day. We now legislate for war as never before – we have codified and enshrined it in legal conventions. We have a wide body of humanitarian law policed and enforced by international courts before which soldiers can be brought to account for their actions. In legitimising war the laws of war are now more important than ever before.

My own focus in this book is the ethics of war and what follows is a reading grounded very much in the philosophy of the late Richard Rorty, one of America's most important contemporary philosophers and, since the death of John Rawls, probably its most quoted. There are a number of reasons why I feel Rorty is a central figure in the ethics of war even though (to my knowledge) he never addressed the issue as such in any of his many writings. Let me mention three.

The first reason concerns the idea that we fight 'just wars' because justice involves an absolute truth about the human condition. This idea is problematic at best. It is our justice, no-one else's. Rorty spent his life trying to convey this message. Since Plato, he tells us, philosophy has embarked on an epistemological quest to find a truth out there that might serve as a model, or an objective standard for the way that we should live our lives. His whole project was to persuade us to drop this quest and, in a pragmatic manner, to do away with what he calls 'the dilemma faced by
human reason in its relation to philosophical or "metaphysical" questions', an approach associated with the work of one of his main targets, Immanuel Kant:

Human reason has this particular fate that in one species of its knowledge it is burdened by questions which, as prescribed by the very nature of reason itself, it is not able to ignore, but which, as transcending all its powers, it is also not able to answer.18

By contrast, Rorty thinks that although metaphysical questions will continue to fascinate us, there is little real point in trying to answer them. The whole quest for a truth 'out there' has not helped us in any significant way to gain greater knowledge about the real world. The reason for this, Rorty postulates, is that philosophy does not correspond in any way to reality itself. Instead, he urges us to scrap much of the rhetoric closely associated with many of traditional philosophy's most cherished notions, such as 'reality', 'truth', 'rationality' and even 'morality'. What makes him so provocatively interesting as a writer is that he maintains we can do this without losing our grip on effective standards of enquiry or useful norms of behaviour. We can still make genuine distinctions between what is better or worse, or what is right or wrong, but such distinctions must be made – indeed can only be made – on the basis of pragmatic criteria. One kind of behaviour, for example, can be judged better than another not because it conforms to some metaphysical idea of 'right', but because it more effectively fulfils a practical purpose. Philosophy, Rorty insists, should not concern itself with the quest for truth – its task is more modest as it is only a tool for interpreting reality.

Our beliefs are purely a matter of historical contingency. We stand where we do by virtue of our own history. Liberals hold their beliefs not because they are grounded in some external authority, such as God or metaphysics, but because of their history.
Crudely put, liberals are what they are not because they alone have grasped the truth but because it so happens that they are what they are.19 We may, it follows, think the wars we fight are just, but when we do so we have in mind our own concept of justice, no-one else’s. Most societies, in fact, fight what for them are ‘just wars’. Not surprisingly, Rorty has to defend himself against the charge of relativism. But as a liberal he too believes that liberalism is true, that the Western societies who practise it are ‘something more than an epiphenomenon of recent socio-economic history’.20 As an American he has a deep commitment to human rights. But he is also cognisant of the fact that the rightness of liberalism must follow from what liberal societies have achieved, not from such universal principles as a Kantian Categorical Imperative – which liberals have somehow miraculously ‘discovered’ operates at the supra-human level of human reason. Reason, indeed, is vitally important: it helps us understand why some practices are prudential and others are not. But reason must be grounded in the real, empirically verifiable world, not in a leap of faith. The only source of morality is our own actions, Rorty insists, which is why, echoing William James, he urges us to grasp the pragmatic ‘cash value’ of the question: ‘why should we act in a moral fashion?’21
Dispensing with something called 'the Truth' allows him to take an ironic approach towards the moral vocabulary we use. As liberals, he insists, we must continue to believe in the vocabulary of human rights and democracy. Ignoring Plato, who proclaimed that philosophy should not tell complicated tales, Rorty insists that we should all tell stories to make sense of the world around us. We should tell stories in order to find what we do historic.

Why is the word 'historic' important? It is important for two reasons. Like his chief source of inspiration, John Dewey, Rorty believes that philosophy is not a form of knowledge but 'a social hope reduced to a working programme of action'.22 From Dewey's point of view the history of philosophy had been a series of efforts, some more successful than others, to help people understand what mattered most. For epistemological purposes humanity could be seen as a species in quest of what was most useful for its existence and continued development. We aim to do what is 'just' and what is 'right', but as liberals we must follow John Stuart Mill and define both words in terms of what is most useful. The terms 'just' and 'right' gain their meaning from their use in helping us to evaluate what is 'good' for us as well as good for others since we are social animals and live in an interconnected world. Philosophers are better advised to aim for small, fleeting successes, just as, Thomas Kuhn suggested, scientists are usually engaged in solving puzzles rather than discovering the true nature of things. The stories we tell therefore should not set their eyes on the wider horizon but on the here and now where philosophy, especially moral philosophy, actually matters. Like Hume, he exhorts us to stop talking about unconditional obligations and do what is sensible. He urges us to accept that there is no difference between what is prudent and what is ethical.23 What we should want to find historic are those events that improve our own life.
Another reason for trying to ensure that our actions are 'historic' is that unless we know whether they are historic or not, we do not really know what and what not to find important. To find something historic is to accord it value; to accord it value is to exercise judgement. Without judgement there is no independence of mind or action. Without judgement you cannot tell a story – and that is important if Rorty is right, and what philosophers do is to tell stories. It is through narrative that events take on significance or become meaningful. While actions that are lived are meaningful for those who live them, only actions that are narrated have value or convey lessons for those who have to rely on history as a guide. Judgement, in the end, involves risk and most risks are not taken. Judgement involves stepping out of line – 'brushing history against the grain', to quote Walter Benjamin – for only judgement makes societies willing to take the risks involved in political life, including the greatest risk of all: going to war.24

In short, Rorty tells us that politicians tell stories and the stories they tell have to be coherent; they have to make sense. The story that the West has been telling itself since the Enlightenment is that life gets better. Rorty is not averse to telling this story as long as we know the story we are trying to tell. He stands adamantly against 'metaphysical narratives' that tell us that history is teleological, that it shows 'intelligent design', that history is the 'search for truth' or the 'realisation
of human freedom' (Hegel's grand narrative in his Lectures on the Philosophy of History). Neither does Rorty believe that history shows that progress is at work. What is important for Rorty are the 'ordinary narratives', the stories of how individual men and women, through their sacrifices and labours, have made life a little better than it might be otherwise. Our lives are merely episodes in such narratives but they make a difference for history's victims. It is on the basis of this understanding that we have tried to fight wars more humanely. For that reason we may even judge our morality to be superior to anyone else's, even though there is no objective standard for assessing this.

The obvious question Rorty raises is how it would be possible to reach such an assessment. In trying to answer it, he employs Judith Shklar's definition that 'liberals are people who think that cruelty is the worst thing we can do'.25 What Shklar is arguing is that to put cruelty first is to disregard the idea of sin as it is understood by revealed religion. When liberals claim that cruelty to other human beings is the supreme evil they judge it so in and of itself and not because it signifies a denial of God or any other higher norm. It is a judgement made within a world in which cruelty occurs as part of our normal private life and our daily public practices. What keeps liberalism going, in Rorty's view, is not a belief in the capacity of the liberal vocabulary to pinpoint the truth better than any other vocabulary, but the fact that our world – at this juncture in history – evinces a contempt for cruelty, a contempt that is consistent with the vocabulary we use. It is in relation to this concept that other vocabularies are to be judged, and usually judged adversely. It thus follows that in any war the West pursues, the way we fight it must be governed by the need to limit cruelty. Let me develop this a little further.
One of the main stories that liberals might tell at this stage of history is that having seen off one totalitarian challenge, we face another. In his autobiography, R.G. Collingwood claimed that the chief business of twentieth-century philosophy was to reckon with twentieth-century history.26 It could also be argued that the significant challenge of twenty-first century philosophy is to carry on this mission with regard to the chief cruelties that have outlived the century we have just survived. Of these, when it comes to the War on Terror, the chief business stems from the concept of objective criminality.27 This is the belief that a person (or fellow human being) can be seen as a criminal not because s/he has actually committed a crime – or is indeed subjectively responsible for one – but because s/he is a member of a group deemed criminal by the state. In the course of the last century millions of people were dispatched to the Gulag or death camps for being members of a 'criminal' class, race or ethnic group. All of them were regarded as being sub-human, quasi-human or not quite human. As Elie Wiesel warned after 9/11, the same concept has reappeared in history.

No state now officially adheres to the doctrine of objective criminality (Pol Pot's Cambodia was probably the last). Today no state ideologically discriminates against segments of its own citizens defined by who they are – with regard to race, religion, sexual orientation or ethnicity – but some non-state actors do, and many more may similarly discriminate in the course of this century. For some
years now Israeli citizens have been targeted for being citizens of a criminal or 'Zionist' state. Al-Qaeda's fatwa of 1998 was issued not against the United States, but the American people. The point was re-stated unambiguously by Bin Laden in November 2002:

By electing these leaders, the American people consent to the incarceration of the Palestinian people, the demolition of Palestinian homes and the slaughter of the children of Iraq ... This is why the American people are not innocent. The American people are active members in all these crimes.

To kill an American is, he insists, 'a duty for every Muslim ... God willing, America's end is near.'28 These days, it is possible to be considered objectively criminal for being a citizen of a nation, quite independently of whether those targeted have ever endorsed the policies of their government with regard, for example, to the conduct of the War on Terror.

The problem with objective criminality is just the same as it was in the twentieth century: it seeks to essentialise people in terms of abstract categories of class, race or, in this case, a nation. Once 'they' were excluded from the human race for their colour or racial origins; now they are excluded for being members of a criminal nation. In Sartrean terms Al-Qaeda puts 'essence' before 'existence'. Sartre insisted that there was no conception of humanity independent of existence – we are not produced by a master builder, we are not manufactured according to a perfect blueprint. We are born – we only 'become' in the encounter between ourselves and others. In that encounter we recognise each other's humanity in the biological reality we inhabit – especially the fact that we all feel pain and that there are times when we all suffer humiliation. 'Species-being' is what endows us with the right to life, a right that overrules everything else we might 'become' – Israelis, Christians or Muslims.
For liberals this was the great battlefield of the previous century, and it is still the field we occupy today. The point I am trying to make is that we all tell stories; the stories we tell will prove persuasive or not according to whether we believe them ourselves. We are condemned to being consistent. We should do nothing that in our own eyes, let alone the eyes of others, would devalue the stories we tell.

The second reason for invoking Rorty is that if the object of any liberal practice, as he insists, is to minimise, reduce or even manage cruelty to others, then Western societies have a duty to prosecute war in accordance with the same ethical demand. If they are to be internally consistent they have an obligation to accept (pace Peter Berger) that 'the most pressing moral imperative in policy making ... is the precise calculus of pain'. Pain and suffering are inescapable parts of political life. We have to live with pain, embrace it and even impose it. 'Human beings have the right to live in a meaningful world,' Berger adds. 'An assessment of the costs of policy must also include a calculus of meaning.'29 Pain is at the core of our moral life. It follows that if we must inflict it we must do so in a manner that is consistent with our moral selves. If ethics is 'applied', then in the case of war it is about how, and with what aim, we apply pain to other people.
The jus in bello is the currency in which war is conducted. It is in one sense a language addressed to those we engage, a concept which accords very much with Rorty's own engagement with John Dewey's views of language. Language does not help us picture reality 'as it is'. Instead, language is part of our behaviour, part of what we human beings do in order to make sense of reality. However, language does not correspond to anything that could be considered objectively 'true'. Rorty favours Dewey's idea of language as a tool rather than a picture, but a tool that cannot be separated from its user. We are captives of the language we speak because we cannot imagine anything outside it. 'It is impossible to step outside our skins – the traditions, linguistic and other, within which we do our thinking and self-criticism – and compare ourselves with something absolute.' Truth, in other words, cannot exist outside language. It lies in sentences within it. Language is the creator of truth.30

The language we employ to practise war is the creator of truth as well, at least for liberals like ourselves. For a liberal, to be cruel is to deny one's own credentials. Indeed, if there is no objective reality out there (if reality is what we perceive it to be), then we must have very strong views about the reality of human pain. We cannot excuse applying it by invoking any metaphysical reality larger than the biological pain our bodies suffer, and the cultural pain – or shame – we endure when humiliated by others. We must not, to employ a Sartrean term, show 'bad faith'. To adopt a language (or discourse) independent of our first principles, to attempt to step outside our skin, would be disastrous. It is entirely irrelevant – in this respect – whether those we fight abide by the same rules or not. 'We' do what we have to; 'they' do what they must.
Their value system may even sanction extreme cruelty – in more traditional societies one's moral commitment is often measured by the self-belief which allows one to inflict pain on others. Cruelty (as a 'grammar of killing') may take many forms; it tends to vary from culture to culture, and within cultures it evolves over time. We should always remember that even those who murder, rape, kill, decapitate or mutilate others spend the rest of their lives soliciting their fellows' approval, respect and praise. But, if our intention is to influence other cultures (insofar as we can), let alone to 'set an example' (if we must), then we must remain true to our own first principles – not because those principles are universally true but because they are true for us. We must not be seen to be hypocrites, preaching one thing and practising another. There can be no exceptions to this rule – no great 'emergencies' (as Michael Walzer calls them), no exceptional call to arms.

Rortean ethics, in other words, is grounded in a post-metaphysical age. What would a full-blooded post-metaphysical liberal society look like, he asks. And he provides this answer. It would be:

a culture ... in which no-one – or at least no intellectual – believes that we have, deep down inside us, a criterion for telling whether we are in touch with reality or not, when we are in the Truth. This would be a culture in which neither the priests, nor the physicists, nor the poets nor the Party were thought of as more 'rational' or more 'scientific' or 'deeper' than one another ... In such
a culture heroes ... would not be those who knew a Secret, who had won through to the Truth, but simply people who were good at being human.31
If we are now fighting for humanity against those who would deny the universality of the human condition, our own soldiers have to be heroic in this way too. They have to be good at being human beings, which for them means keeping faith with their own liberal creed. At this stage in our historical development liberalism is grounded on protecting its citizens and, where possible, others from evils universally feared: torture, violence, arbitrary power and humiliation. This, according to Bernard Williams, is 'the least ambitious and most convincing justification of liberalism ... namely the liberalism of fear'.32 It follows that if the value we and others accord to liberalism should be measured against the evils it resists, we can hardly in the name of liberalism commit evil ourselves. We cannot fight humanitarian wars inhumanely. We cannot counter terror with terror or, for that matter, meet 'evil with evil'. None of these things is something that, as liberals, we can do, irrespective of whether at certain times it might seem good practice to do them.

My third reason for taking Rorty's reading of ethics as a point of departure for this study is that it is peculiarly utilitarian and therefore likely to appeal to those who actually implement policy, whether they are in government or the military. Rorty's ethical prescriptions are both consequentialist and prudentialist at the same time. We have rules, he tells us, because it is prudent to do so. Our responsibility is to remain always alive to the consequences of our own actions. It follows that our ethical codes in war inhere not in metaphysical absolutes, or Categorical Imperatives; they inhere not in epistemology but in heuristics.
As Rorty claims with regard to the pursuit of the Truth, the Kantian vocabulary of morals, like the Cartesian philosophy of the mind, has not paid off any more than the scientific positivist culture of the Enlightenment.33 We embrace pragmatism, or would be well advised to, because we find the reasons for acting well in the practices we ourselves pursue. As I will argue, the ethics of war inheres in war itself, not in absolute moral injunctions or imperatives. If we correctly understand the dynamic of war, we must do everything we can to avoid the worst mistakes we have committed in the past. This is especially important for the practitioners, those who are at the cutting edge: soldiers – some of whom I hope will read this book. This is why, traditionally, ethics has been grounded in an ethos, not in laws and conventions entered into by states. Indeed for much of history it was grounded in 'the warrior's honour'. As Hans-Georg Gadamer writes: Aristotle ... became the founder of ethics as a discipline independent of metaphysics. Criticising the Platonic idea of the good as an empty generality, he asks instead the question of what is humanly good, what is good in terms of human action ... He demonstrates that the equation of virtue and knowledge, aretē and logos ... is an exaggeration. The basis of moral knowledge in man is orexis, striving ... the very name 'ethics' indicates that Aristotle bases aretē on practice and 'ethos'.34
In other words, the truth, with its ethical implications, is not something fixed that we can find outside us. It is rather a certain type of procedure concerning the way we approach moral problems and implicitly the manner in which we let ourselves be changed by our interaction with 'otherness'. Orexis is that inward speech, the discourse we conduct incessantly with ourselves. We liberals have let ourselves be changed very much in the past fifty years. Our ethical codes continue to evolve for that reason. We are always trying – in Berger's sense of the term – to exercise a more precise calculus of pain, employing (as we would expect of ourselves) technological solutions to achieve this, from smart bombs to non-lethal weapons. Indeed, if we see Liberalism itself as a work in progress then Berger's injunction is what morality has become: applied ethics. In this work, I shall argue that we face two challenges to this Rortean ideal. The first is that in depoliticising the language of war, in talking of 'Evil' rather than the 'evils' we face every day, we risk abandoning a prudentialist course. We risk ignoring the consequences of our own actions. The other challenge is that in increasingly instrumentalising the world – in hollowing out the warrior's honour, or threatening to replace warriors themselves with machines – we risk ignoring Aristotle's insight that ethics is in the striving. In the striving we discover much about ourselves and otherness. If we are not allowed to strive, what will there be left to discover?
Conclusion I make no apology for introducing philosophy into what is essentially designed to be a very practical work. Theory and practice cannot be divorced because one informs the other. If we understand the prevailing philosophical ideas of the day we are more likely to understand why we act as we do. What Alasdair MacIntyre has argued with reference to modern political thought and practice applies equally well to the present war with terrorism in which we seem to be engaged. Let me quote him: There is a history yet to be written in which the Medici princes, Henry VIII and Thomas Cromwell, Frederick the Great and Napoleon, Walpole and Wilberforce, Jefferson and Robespierre are understood as expressing in their actions, often partially and in a variety of different ways, the very same conceptual changes which at the level of philosophical theory are articulated by Machiavelli and Hobbes, by Diderot and Condorcet, by Hume and Adam Smith and Kant. There ought not to be two histories, one of political and moral action and one of political and moral theorising, because there were not two pasts, one populated only by actions, the other only by theories. Every action is the bearer and expression of more or less theory-laden beliefs and concepts; every piece of theorising and every expression of belief is a political and moral action.35 With this methodological premise in mind, the rest of this book, using Rorty’s ideas to frame some of the issues, tries to explain why, as liberal societies, we
must abide by our liberal beliefs. In this changed world – and the world has changed profoundly since 9/11 – we should indeed ask whether the traditional ethical codes we have applied in war also apply to terrorism. John Reid was right that governments must ask tough questions. Countries that do not adapt to change tend to perish. It will become manifestly clear in the course of reading this book that I am neither a knee-jerk critic of the Bush administration, nor do I question that we will find ourselves, if not in a ‘long war’, at least in a long struggle. The cities of the Western world are under the most prolonged siege since World War II. Car bombs, suicide bombers and the bombing of rail commuters on their way to and from work are blasting away at the moral and physical shell of our cities, producing the most significant mutation in city form and urban lifestyles. The anxiety of our citizens is real. In response, governments, not always deliberately, are hollowing out some of our hard-won civil liberties. The Cold War never really intruded into people’s lives in a way that the surveillance society does today. The Cold War may have produced shadowy ‘security states’ in which the security services certainly monitored political radicals and public service employees were often vetted, but these and other measures did not challenge free speech, the presumption of innocence, asylum or the free movement of people across the country. The global war on terrorism does challenge such civil liberties. This really is a threat to the Western way of life and it is not going to get any better. We are in for the long haul. It is precisely because there is a struggle to be won, however, that we have to understand why we have rules in war. The Western world was undoubtedly ruthless in the way it prosecuted war in the past. Indeed, many of its practices such as the mass bombing of cities in World War II are now questioned by many historians. 
Different societies have different rules which reflect the values of the day. We should not judge others retrospectively, from our own unique moral vantage point, but even at the height of World War II the Allies chose to apply the framework of humanitarian law; even in the Cold War the Americans were scrupulous in formally adhering to the Geneva Conventions in respect to the enemy both in Korea and Vietnam. So far, I would argue, no-one has come up with an intellectually challenging or coherent reason for abandoning the rules we have applied in the past. What follows is an attempt to illustrate why ethical rules help societies win wars and why societies abandon the tried and trusted methods at their peril.
2
Etiquettes of atrocity
Traditional cultures clearly understood war as a form of discourse between the human and ‘the other’. As a result they accepted that the conversation must conform to certain syntactical rules and limits if the communication was to be effective ... The natural environment and the human enemy must in some degree be protected and preserved. (Susan Mansfield, The Gestalt of War, 1982)
The business of the armed forces in wartime, writes the historian Geoffrey Parker, is to kill people and break things, and we have been cataloguing both activities with gusto. Military history now outsells most other historical works. The bookshelves devoted to the writings of military historians and the contemporary accounts of conflicts in Afghanistan and Iraq continue to expand. This possibly shows a change of sensibility in our post-heroic world. As fewer and fewer citizens volunteer for military service (as conscription disappears altogether from the Western world), so we are more fixated on the past, which has largely been the history of war. Parker was invited in April 1991 to participate at Yale in a course of lectures on 'the laws of war'. He took as his point of departure a recent article by one of the world's major strategic thinkers, Martin van Creveld, who claimed to have noted a major change in the laws of war, both in the jus ad bellum (the rules concerning the legitimacy of war) and the jus in bello (the rules concerning its conduct). In the past, force had been resorted to mostly to change state boundaries. Now this was outlawed by the United Nations Charter. By contrast, the first Gulf War (1990–1) had been fought to evict Iraqi soldiers from Kuwait and thus to reinforce the rule of international law. In the past states had also routinely engaged in practices that the West, in particular, now condemned – such as the targeting of civilians and hostage taking. Van Creveld concluded from this that what we consider acceptable or unacceptable behaviour in war is historically determined. Parker took issue with both conclusions. The disagreement between two leading authorities brings to mind the distinction that Clausewitz drew between the nature of war and its character. Clausewitz was ready enough to concede that war changed over time – it was a reflection, after all, of social circumstances. A fifth-century
city state such as Athens waged war differently from an early-nineteenth-century state such as his own. The two societies even thought about war very differently. This may, perhaps, strike one as fairly obvious today but it was not so clear-cut at the time that he was writing. The most popular book on war in his own day remained Machiavelli’s treatise The Art of War (the only one of the author’s books to have been published in his own lifetime). In it Machiavelli had famously claimed that technology, for example, had made little difference in the way war was fought, notoriously insisting that the Roman legions which he admired so much could have seen off the gunpowder armies of his own day just as effectively as they had seen off Hannibal’s elephants. The nature of war, however, Clausewitz maintained was very different. It never changed. All wars, for example, are unpredictable: for that reason war is an art, not a science. All wars are bedevilled by lack of information about the enemy’s intentions and capabilities (the famous ‘fog of war’). And a conflict fought with little regard for ethical considerations is likely soon enough to escape political control. Parker too had no doubt that over time societies have been faced with different ethical demands in war. Medieval knights did not enslave their enemies as the ancient Greeks did, but they were allowed to put citizens to the sword if a town did not surrender when called upon to do so. By the seventeenth century even this practice had been largely abandoned although the laws of war allowed Cromwell to sack Drogheda and Wexford from which his reputation, in Ireland at least, has never recovered. Bringing the debate up to date, many of Parker’s contemporaries often regretted that in a counter-insurgency war the distinction between civilian and combatant that Clausewitz had taken for granted no longer seemed to apply. But Parker was making a very different point. 
Most of the laws of war, such as the taking of prisoners and the rights granted to a defeated people, had been observed over the centuries; they had been applied by different peoples at different times with remarkable consistency. And the explanation was simple: they inhered in the nature of war itself. If Clausewitz was right to insist that wars are won not when one side prevails but when it prevails upon its enemy to concede defeat, then the manner of effecting that defeat is likely to be crucial. An enemy is more likely to abide by the result when defeat can be swallowed. Defeat is more likely to be accepted when a defeated society no longer feels it has to keep faith with those who have died in its name; a soldier is more likely to abide by the result on the battlefield if he can look his children in the eye and recount his defeat without any sense of shame. How we treat our enemies determines how they react to defeat – it determines (to use today's vernacular) whether they can draw a line under events and move on. In his essay Parker also took issue with the idea that we have become more sensitive to suffering. If civilians are targeted less often by the West today – 'We can't do Dresden', declared a US general on the eve of Operation Desert Storm – that is largely because technology has provided other options. Yet technology has created a paradoxical situation in most postmodern conflicts. While in general only those who kill or maim innocent civilians face-to-face risk being held accountable by a war crimes tribunal, those who act from a distance – although they often inflict far more damage – are rarely put on trial.1
What Parker found when looking at the laws of war, at least in the past five hundred years of European history, was a remarkable consistency of practice. The laws of war relied on a mixture of natural law, military law, common custom and self-interest, and not much has changed in the interim. All that happened was that in the course of the modern era natural law as part of international law had gone 'positivist' for the first time: the customs of war (the precedents created by the conduct of war itself) had remained much the same, but had been gradually embodied in legal conventions negotiated by states. This should not be allowed, he insisted, to obscure the fact that the conventions that were transformed into laws such as honouring surrenders, sparing the wounded or respecting flags of truce – the social conventions that had reduced the danger and chaos of conflict for all combatants – had been observed for centuries. They could be seen, he added, as 'contractual etiquettes' which provided each party with a vital framework of expectations concerning the conduct of others.2 As Parker worked on his presentation over the summer, the Yugoslav civil war broke out. The destruction that followed at times reminded him of the devastating Thirty Years War that had traumatised mid-seventeenth-century Europe. For historians it remains the definitive disaster of European history even though in the popular imagination it has been displaced by World War I.
When the German general Helmuth von Moltke wanted to convey what lay ahead as the Great War loomed he predicted (uncannily) that it would be like the Thirty Years War, crammed into four years.3 If anything, however, the earlier conflict was far more disastrous for his own country where the loss of life was proportionally greater than in World War II; the displacement of people and material devastation was almost as great and the cultural and economic dislocation persisted for substantially longer.4 The court painter Rubens recounts one incident in the Thirty Years War which was all too common judging from the pamphlets and broadsheets of the day: Wallenstein's troops had prepared axes, hammers and cudgels for their plundering and they used them as butchers use instruments when slaughtering cattle. They split open people's heads, threw them to the ground and crushed them with their feet so that the blood spurted forth not only from open wounds but also from necks, ears and noses. It is generally agreed that of the hundreds of citizens the troops encountered not one escaped unharmed.5 And then there were collective acts of barbarism such as the destruction of an entire city – in May 1631 Magdeburg was sacked by the Imperial Army over several days. Only the cathedral and some seventy houses in its immediate neighbourhood were spared. The rest of the city was reduced to a wasteland of blackened timber. For two weeks dreary trains of wagons carried the charred corpses of 24,000 citizens to be dumped in the river. Europe would not see another city deliberately put to the flame until the Allied bombing of Germany began in earnest in 1942. Four years later Aldous Huxley compared the atom-bombed city of Hiroshima with Magdeburg and believed both atrocities had awakened two very different societies to the need for even tighter ethical codes in
warfare. ‘Assuming ... that we are capable of learning as much from Hiroshima as our forefathers learned from Magdeburg we may look forward to a period, not indeed of peace, but of limited and only partially ruinous warfare.’6 And yet Europe survived the Thirty Years War as it survived Hitler. There was a popular saying at the time: ‘Every soldier needs three peasants – one to give up his lodgings, one to provide his wife, and one to take his place in hell.’7 Although depredations against the peasantry were particularly brutal, there were many occasions in which soldiers who oppressed civilians were severely punished, and even executed. And the reason was simple: common sense. An army cannot afford to alienate those who supply labour, guides and intelligence of the enemy, not to mention food and quarters. Such realities prompted Parker to reiterate his central thesis: the rules of war followed by most European societies have displayed a remarkable continuity since the sixteenth century, and indeed even long before. As a historian of early modern Europe, Parker is the first to admit that serious thought was given for the first time to codifying the rules of war in legal conventions. Even so, such rules had existed from the beginning of time. Soldiers had been taken prisoner, and prisoners returned to their homes at the end of hostilities. All such practices are grounded on a very practical maxim: do as you would be done by. If it is true that theoretical restrictions on the jus in bello have multiplied in recent times, most of the actions today outlawed by the Geneva Conventions have been condemned in the West for the past four hundred years. Only the degree and extent of enforcement have changed over time as the international system gradually moved from natural law to the law of nations and, most recently, to positivist international law. The process is described especially well in Stephen Neff’s War and the Law of Nations. 
The first writer to identify what Parker nicely calls 'etiquettes of atrocity' as 'laws of war' was Hugo Grotius. In The Law of War and Peace (1625) he attempted to find the 'natural laws' which governed human behaviour and which by deduction conferred on human beings certain 'natural rights'. What is 'modern' about his account is that he had little interest in those attributes of society which his contemporaries thought made life civil, such as benevolence. Benevolence is all very well but it is not at the heart of social existence. There are only two things that people have a natural right to exercise in the company of their fellows: the expectation that they will not be subject to unprovoked attack and the freedom to defend themselves if they are. Grotius' system did not outlaw war; what it did instead was insist that states should hold each other accountable for their respective actions as well as those carried out in their name. The fault lay with themselves, not with the mercenaries they contracted. Indeed, state control of soldiers – especially against neutrals, enemy non-combatants and prisoners of war – was the true mark of a civilised polity. War required restraint to be 'sociable'. Grotius is famous for helping to codify the idea of distinguishing between civilians and soldiers in battle and introducing the idea of proportionality (that harm done in war should be in proportion to the good that is to come from victory). But even he was only stating what had long been observed in the field, including the requirement that soldiers who have surrendered have done so on the strict understanding that they
will be looked after by the side that has taken them prisoner. Whatever the fortunes of war, both sides owe each other certain obligations. Grotius, in short, asked his readers to locate the rules of war in necessity – which is to say wise, as opposed to unwise, behaviour. In that sense and in keeping with much of seventeenth-century philosophy, his laws were largely mechanical – they were derived from a mechanistic understanding of how human beings behaved towards each other. Like Hobbes, Grotius offered a 'physics of society' or, in this case, a 'physics of war' – a consequentialist code which was grounded on patterns of behaviour historians had observed over time. We have evolved rules for a reason: they work, they help limit war. As a species we are rule-bound; rules do not constrain our freedom, they make it possible to be free. Laws make life simpler and can be liberating for that reason. Kant realised this when he claimed: 'Man is free if he needs to obey no person but solely the laws.' The fact that so many of these laws had been flouted in the course of the Thirty Years War prompted a desire to avoid any such struggle again by turning practical limitations into specific laws. Those laws were to be flouted again in World War I and only just observed in World War II. Laws are no guarantee that rules will be observed any more than unwritten conventions. Even Kant called international lawyers 'sorry comforters' for, their best efforts notwithstanding, international law had done remarkably little to relieve human suffering.8 Grotius himself was preoccupied mainly with natural law – the divinely ordained rules which are proper to the social existence of Man. By the end of the century, other writers had begun to think in terms of international law between states.
Thus began the move from etiquettes of atrocity owed to others by virtue of common humanity (we are all created in God's image) to laws of war entered into by agreement between states. We have never looked back. Principles such as sparing women and children and respecting the rights of neutrals were embodied three centuries later in the Hague Conventions (1899/1907). Looking back we can see that the Treaty of Westphalia confirmed a belief that we can trace back to Machiavelli: a distinction between natural and moral virtues. Political values are not merely different from, but may in principle be incompatible with, Christian ethics. But that too builds on a much earlier recognition which we can trace back to the Greeks, namely that there is an important distinction between natural law and human law. In each case this does not absolve us of ethical behaviour. On the contrary, it requires us to recognise that ethics is built into politics itself. The ethics of war, likewise, is built into its practice. In so far as war is 'a continuation of politics by other means' the laws of war are also grounded on political – not Christian – ethics. Natural law constitutes a system of ethics that binds all individuals. It calls upon them to recognise and respect its rules by means of such natural faculties as the capacity for reason. Parker's 'etiquettes of atrocity' are derived not from natural law, human rights, first principles or categorical imperatives. They are arrived at, not deductively, but inductively. The great Enlightenment Italian thinker Giambattista Vico, for example, accepted that God ran the world through laws but insisted that these laws were immanent, not transcendent. They were not
available through revelation but emerged in human institutions and could be deduced. He was more radical still in urging that 'laws' do not stem from rational social contracts between two parties. Instead, he argued, they had been learnt through the instinctive realm of custom or what he called 'common wisdom' – they had been learnt through trial and error.9 Such was Hume's claim that 'mankind is much the same in all times and in all places' (Essays Moral and Political). Hume himself argued that human co-operation (such as two men rowing a boat) results not from rational promises or contracts, but trial and error. All human beings have needs – to be released from undue fear, to survive, to be sheltered or clothed – and these needs can be guaranteed not only in peace but also in war by practices that span cultures and historical eras. Parker's etiquettes of atrocity rest on a trans-cultural consensus. Because they are useful they have been translated into values from which we derive norms of behaviour such as a duty to look after prisoners of war and to alleviate human suffering wherever possible. A similar idea can be found in the work of Adam Smith, both in The Theory of Moral Sentiments and his more famous book, The Wealth of Nations. Smith considered the greatest achievement of modern philosophy to be the rejection by Berkeley, followed by Hume, of any hard and fast distinction between objective reality and human imagination. Building on this foundation he discarded the old idea of timeless moral standards sanctioned by reason or religion and replaced it with the notion of an 'impartial spectator'. The impartial spectator was a creature of the imagination, a kind of mental monitor built up from our involuntary sympathies with the passions of others. Though imaginary, it was real enough to make us adhere to acceptable standards of conduct most of the time; it secured beneficence, in short, without requiring benevolence.
When Smith later turned to economics and tried to account for national wealth, he came up with a similar mechanism: he refused to invoke the transcendent principles of productive propriety since ‘the haggling and bargaining of the market’, like the impartial spectator, provided a perfectly adequate discipline for economic activity.10 Smith insisted that not only are there laws of the market but that these laws emerge from the push and pull of trade; they are not imposed from outside. Two opposing forces (competition and self-interest) generate a self-regulating, steady hand. In Smith’s view, ‘good’ (meaning a fair, competitive market) could emerge from selfishness. To invoke the more contemporary voice of the twentieth-century British philosopher Michael Oakeshott, morality is not a system of general principles or a code of rules so much as a vernacular language. We may indeed be able to elicit general principles and even rules from it, but what gives it its vitality is not a theorem (such as ‘good conduct is acting fairly’), much less a rule (such as ‘always tell the truth’). What makes it alive (like any language) is that it can be spoken intelligently. It is a guide to how to act or what to think. It is inherently practical. Rorty adds that what Oakeshott is claiming is that moral principles like Kant’s famous Second Categorical Imperative (to treat human beings as an end, not as a means to an end) are only compelling insofar as they
are 'political' (that is, there is a practical point behind them). We must strip morality of metaphysics. There is no divine voice in history, only our voice, or rather the voice of the community into which we are born. Morality then is a common language. We act well – or try to – not because of some inner voice telling us that our actions are 'wrong', but because of our own experience or that of our ancestors. We know that some actions are wrong because we know from the study of history or from observation that they do not work and are counter-productive. Ethics, concludes Rorty, speaks the language it does not because it approximates to the will of God or the nature of man, but because in the past it was tried, tested and vindicated. Its language, therefore, is the language of experience. It is historical for that reason. If we have never ceased to learn this lesson it is because as fallible human beings we always forget. But then we are moral beings only because we learn from our mistakes. We are not better for it; we are better informed.11 Indeed, what is distinctive about societies such as our own is that they tend to congratulate themselves that their superior morality is grounded in superior knowledge. In Kantian terms, we contrast the piety of pre-modern peoples with the wisdom of our own age. Instead of submitting to a law that originates elsewhere (a law inscribed in the order of the cosmos or revealed 'through revelation'), we insist that we submit ourselves to human-made laws. Modern humanity, writes Tzvetan Todorov, sees no moral merit in merely submitting to the law. Merit begins with freedom and can only be earned by action which involves the exercise of free will.12 Reason is a tool that serves reasonable behaviour. Ethics derives from prudence, or what Robert Kaplan nicely calls 'anxious foresight'.13 If this conclusion smacks of consequentialism, so be it.
It is at one with another concept: the relational aspect of pragmatism. The great American philosopher John Dewey saw the word ‘prudence’ as a member of the same family as ‘habit’ and ‘custom’. All three words describe ways in which human beings adjust and adapt to non-human and human environments alike. For Dewey there was no distinction between what is useful and what is ‘right’: Right is only an abstract name for the multitude of concrete demands in action which others impress upon us, and of which we are obliged, if we would live, to take some account.14 This is far from the universality of Kant’s Second Categorical Imperative. Conscious of the Kantian critique of pragmatism – that adapting to the needs of others offers a cheap series of trade-offs far removed from the supposed magisterial authority of an absolute moral law – Dewey had this reply: to argue that there is an absolute moral law is to contend that the ideal precedes or antedates custom, or that in evolving from custom it is an accidental by-product. Dewey took issue with the critique by invoking language. Language grows out of unintelligent babbling and instinctive motions called gestures. But once called into existence we recognise the communication as language and that it operates as such. The point
about Dewey’s analogy between language and morality is that there is no decisive moment at which language stopped being a series of reactions to stimuli provided by the behaviour of other human beings and started to be an instrument for expressing certain beliefs. Similarly, there is no point at which practical reasoning stopped being prudential and became specifically moral; there is no point at which it stopped being merely useful and started being authoritative.15 On Dewey’s understanding, philosophers who have sharply distinguished reason from experience or morality from prudence have tried to turn an important difference of degree into a difference of a metaphysical kind. They have therefore constructed problems for themselves which are as insoluble as they are artificial. In the end, Pragmatism as a philosophy offers a major break with previous thinking because it is not just one more theory; it represents a change of view which puts all theorising on a par with all other practical activities. For pragmatists such as Rorty, moral obligation does not have a nature or a source different from tradition, habit or custom. Indeed, he concludes, ‘our sense that prudence is unheroic and morality heroic is merely the recognition that testing out the relatively untried is more dangerous, more risky than doing what comes naturally’.16 The problem in the twentieth century was that we kept invoking absolute moral standards which we suspended soon enough when fighting ‘uncivilised’ people or barbarians, whether they were sub-human Russians (in the eyes of the Nazis) or ‘verminous’ Chinese (in the eyes of Imperial Japan). Both Axis powers abandoned the understanding that we have rules in the first place because it is wise to do so. Such an understanding did not fit easily with the pseudo-metaphysical theorising of Fascism or the pseudo-Zen Buddhist musings of Imperial Japan. 
For a pragmatist, a practical man like Clausewitz can convey the texture of ethics just as well as (or perhaps better than) many a moralist or philosopher.
Why we take prisoners of war

Practical common sense is what comes out particularly clearly in Clausewitz's discussion of why we take prisoners, rather than kill them. Clausewitz wrote one of those rare books – Vico's New Science is another – whose greatness is to be measured not so much by the conclusions they reach as by the tasks their authors set themselves. In both cases the tasks were so original that the authors seemed unaware of what they had given birth to.17 Vico, for example, seems to have regarded his New Science as a contribution to jurisprudence rather than to the philosophy of history. Clausewitz also set out merely to write a manual for professionals rather than a phenomenology of war. His key insight, when translated into a discourse on war, was that ethics (of which he writes little) is structured into war. Ethics is what prevents war from becoming absolute, and therefore escaping political control. It is what makes war possible for humanity. When Clausewitz wrote of absolute war he was not referring to the wars of the mid-seventeenth century or the total conflicts of the century that followed the publication of his life's work. Likewise, he was not writing about the nuclear
peace in whose shadow we all lived until recently. What he saw acutely in the Napoleonic Wars in which he fought (on both sides of the conflict) was that when morality is suspended, we can expect the worst. 'Bounds which existed only in a consciousness of what is possible, when once thrown down, are not easily built up again ... whenever great interests are in dispute, mutual hostility will discharge itself as we have seen it do in our times.'18 This was even more true in World War II. As even Goebbels bemoaned in July 1943, 'We wage too much war and do not engage in enough politics'.19 Unfortunately, the idea that politics could be transcended was one of Hitler's principal beliefs. His was a world in which a politics of pure heroism could succeed – a redemptive, transforming politics which transcended the 'political', especially the compromises which are at the heart of political life and the ethics which those compromises demand. In common with this vision went another disastrous misconception: that victory would go to the side that killed the largest number in battle. This skewed view may have stemmed from an innate fear of the teeming masses flooding into the major cities. Crowds dominated the imagination of twentieth-century Europe in the popular cinematic idiom of Modern Times and Metropolis, as well as in classics such as Spengler's Decline of the West, which described the masses as parasitical city dwellers who were both traditionless and religionless (and thus all the more ready for a Messiah or master builder to shape them for his own purposes). The urban masses, in turn, proved a tempting target once airpower arrived with full force.
For the novelist Henry Green, describing a crowd milling in the concourse of Victoria Station, the masses were less protagonists in their own story than potential victims unknowingly awaiting the bombers overhead.20 The greatest twentieth-century interpreter of crowds was Elias Canetti who recognised from the first that their importance lay in their numbers. For the man in the crowd, he argued, belongs in wartime to two very different crowds. From the perspective of his own side, he belongs to the crowd of living fighters; to the enemy, he belongs to the potential crowd of the dead. Men join up to avoid death, surprising though this may sound. What a man needs is the experience of a postponement of death, and what stay of execution is more potent than the comforting thought that death awaits not one’s own side, but someone else’s? ‘Death to the English’ and ‘Death to the French’ were more than merely conventional catchphrases; they also constituted a remission of one’s own sentence.21 Canetti was especially intrigued to learn that the survivors of the first atomic bomb in Hiroshima, many of whom died days after the attack, had drawn comfort from the rumour that their own side had hit back and devastated several large American cities. The patients in the hospitals had once again become a crowd, believing that they were safe from death by this ‘diversion of death’ to others.22 In short, it was important in terms of crowd psychology for the enemy’s ranks to be thinned. Indeed, it was widely assumed that victory in war would go to the country that killed the largest number. ‘Each side wants to constitute the larger crowd of living fighters,’ Canetti added, ‘and it wants the opposing side
to constitute the largest heap of the dead.'23 Hitler merely took this obsession to its logical conclusion. He himself, of course, was a man whose face was first glimpsed by history in a photograph of a mob demonstrating its enthusiasm for war in the Odeonsplatz in Munich in August 1914. The crowd was his decisive experience and ultimate inspiration. 'Without this feeling he cannot be understood,' insisted Canetti, 'nor can his beginning, nor can his power, nor what he meant to do with his power, nor what his undertaking has led to.'24 Hitler's adamant refusal to allow the army to retreat, beginning with the winter campaign before Moscow in 1941, may have been the sublimation of his own wish to fight to the last man. He was not alone. Heinrich Böll recalled how, even a few months before Hitler's death, his fellow countrymen continued to derive comfort from radio broadcasts which referred not to the army's defeats, but to its 'lost victories'. In the end, the remarkable fortitude of the German people could only be explained by their own wish to keep the crowd in being for as long as possible, even if this meant thinning it out.25 All this would have puzzled Clausewitz, who knew from his close study of military history that the 'net body count' was not the key to victory, as he made clear enough in the fourth book of his treatise, On War.

[Now it] is known by experience that the losses in physical forces in the course of a battle seldom present a great difference between victor and vanquished respectively, often none at all, sometimes even one bearing an inverse relation to the result, that the most decisive losses on the side of the vanquished commence only with the retreat.

It is in the retreat, he pointed out, that soldiers 'lose their way and fall defenceless into the enemy's hands, and thus victory mostly gains bodily substance after it is already decided'.
The loss in physical force is not the only one which the two sides suffer in the course of the combat; the moral forces also are shaken, broken and go to ruin. It is not only the loss in horses, men and guns, but in order, courage, confidence, cohesion and plan, which come into consideration when it is a question of whether the fight can still be continued or not. It is principally the moral forces which decide here, and in all cases in which the conqueror has lost as heavily as the conquered, it is these alone ... In the combat the loss of moral force is the chief cause of the decision.

Hence it follows that:

The losses in a battle consist more in killed and wounded; those after the battle, more in artillery taken and prisoners. The first the conqueror shares with the conquered, more or less, but the second not: and for that reason, they usually only take place on one side of the conflict, at least they are considerably in excess on one side.
[Captured] artillery and prisoners are therefore at all times regarded as the true trophies of victory, as well as its measure, because through these things its extent is declared beyond a doubt. Even the degree of moral superiority may be better judged of by them than by any other relation, especially if the number of killed and wounded is compared therewith ... If prisoners and captured guns are those things by which the victory principally gains substance, its true crystallisation, then the plan of the battle should have those things especially in view; the destruction of the enemy by death and wounds appears here merely as the means to an end.26

A logical inference from this was that enemy soldiers should be encouraged to surrender, or at least not be discouraged from doing so. The end of war, Clausewitz argued, is to persuade the enemy that he has been defeated. Death is not necessarily crucial. Indeed, as Niall Ferguson comments, if death in battle had been the measure of success, the Allies would have lost both World Wars, for they suffered far more casualties than the opposing side (especially when we take into account the casualties incurred by the Red Army). Even in World War I, the Allies lost far more men than the Central Powers. But what mattered in both conflicts was that the overwhelming majority of POWs were not Allied soldiers. An army loses when it loses its cohesion – when men desert or surrender in large numbers, when demoralisation sets in. Anything that dissuades an enemy from surrendering – such as humiliating those who do, or being so indiscriminate in targeting that revenge becomes the only consolation for defeat in the field – will rebound on the offending country. It is for that reason that we must always consider the consequences of our own actions.
What concerned Clausewitz about the Napoleonic Wars was that there were a number of times, especially in Spain and Russia, when war had threatened to become 'total'. The potential of war to escalate was contained, after all, in one of Napoleon's more chilling remarks: 'I grew up on the battlefield. A man like me doesn't give a shit about the lives of a million dead.'27 This was – as it turned out – an unwittingly accurate estimate of what Napoleon and the revolution cost France in terms of manpower. Fortunately, the merciful treatment of prisoners of war was widely regarded as a hallmark of a civilised nation. It was another of Napoleon's maxims that 'Prisoners of War do not belong to the power for which they have fought; they are under the safeguard of the honour and generosity of the nation that has disarmed them'.28 Napoleon's maxim reflected a major change from the Peace of Westphalia onwards, when the codification and acceptance of legal rules and restraints came to supersede the older appeal to moral authority. The legitimacy of the prince replaced the authority of the pope. The French Revolution saw another change: it replaced the framework of a common morality with the framework of a common civilisation which was sustained by international law and common codes of conduct in war. With the optimism demonstrated by such Enlightenment thinkers as Hume and Smith, Clausewitz had no doubt that the Europeans had finally learnt an important lesson from their own history.
If ... civilised nations do not put their prisoners to death, or devastate towns and the countryside, it is because cleverness plays a larger part in their conduct of war and has taught them more effective ways of using force than such a crude expression of instinct.29
Indeed, Clausewitz expected that cleverness in political life would continue to increase as societies developed and learnt even more – so did most other contemporary writers. A few years earlier, Adam Ferguson had contrasted the 'civilised or polished' nations such as his own with the 'barbarous or rude' nations of the pre-modern era. What made the former more civil was their discretionary use of violence. 'We have mingled politeness with the use of the sword. We have learned to make war under the stipulations of treaties and cartels and trust to the faith of an enemy whose ruin we contemplate.'30 Clausewitz was expressing two ideas here, both of them very modern. The first was that better behaviour could be expected of a people that knew what was in its own interests. We do not become better people, but history and its study do make us better informed. We are not more selfless, or even less selfish, but we can cultivate a better understanding of what is in our true interests. From a reading of history, we know what consequences will follow from acting badly. 'Savage' people, by contrast, are historyless – in both senses of the word. Not only do they not make their own history (because it is made for them by others), but they do not make it because they have no written records, no great historians to instruct them in the ways of the world. But something else was involved. Clausewitz has often been dismissed as the progenitor of the total wars of the twentieth century (the English strategist Basil Liddell Hart famously called him 'the Mahdi of mass destruction'). But his conceptual framework of war, if it cannot be said to be ethical, is at least an enabling device for the ethical construction of the use of armed force.31 The idea that we do not need to separate ethics from politics (in this case, war) stems from the understanding that ethics inheres in the practice of war itself – it is an idea that comes from the political framework of Clausewitz's age.
It emerges in a famous paragraph from The Philosophy of Right in which Hegel tells us that the origin of the (modern) state as a 'self-organising rational and ethical organisation' can be traced back to the dissolution of religious unity in the West.32 Ethical constraints no longer came from without (from God) but from within (from politics). Hegel gave no theoretical explanation for this; others have since done so. The first and most important precondition of the modern concept of the state, we are told, is the formation of a distinctive language of politics. In other words, for modern politics to be made possible it was necessary to accept the idea of the state which presupposed that political society was held together solely for political purposes.33 To claim that the 'political' was merely another name for the 'temporal' (the Middle Ages) or the 'royal' (the age of Absolutism) would be facile. We associate the emergence of the 'political' with the emergence of the 'citizen' (as distinct from the 'Christian' or the 'subject'). The political sphere is modern and ethical at the same time. Ethics is inherent to the political practice of
forging and keeping together a state divided by class and conflicting interests. The aim, claimed Marx, was to create an illusorische Gemeinschaft – an illusory society held together by certain foundational myths.34 The aim in war, Clausewitz believed, was to keep together an illusory international society distinguished by different and diverse laws, conventions and treaties. In a sense, this did not remove the religious foundations from which the state had emerged. It was Max Weber who wrote that military organisations and actions are made possible by a social organisation that is very similar in form to religious devotion. One might add that the early-nineteenth-century belief in the 'civil society' of Europe was illusory as well, but it worked for a time, at least until 1914. Unfortunately, both world wars tested not only the effectiveness of armies in the field, but also the viability of entire societies and their way of life. What occurred was what Clausewitz had predicted would happen if combat were no longer guided by 'the will of a leading intelligence', if war took the place of policy, becoming in the process 'its own independent will', 'a complete, untrammelled, absolute manifestation of violence'. The deadly result can be seen at Thiepval, the last word in World War I memorials (in this case one devoted to the Missing of the Somme) or rather the first word; the very idea of commemorating not a victory but a monstrous sacrifice was novel, with its elementary democracy of simply carving 73,357 names into the forty-eight internal wall panels. Not least of the ironies is that Thiepval stands in rural Picardy, not far from the triumphant and 'glorious' battlefields of Crécy and Agincourt. However, there were to be even greater sacrifices demanded of the World War II generation, especially in Nazi Germany, a country that, as even Goebbels confessed, did too much war and too little politics.
The point I am trying to make is that however the etiquettes of atrocity evolved, the protocols of violence are almost as old as war itself. Clausewitz's work bears out Parker's main contention that we have 'laws' because of experience – from learning the hard way that it is dangerous not to. We act well not because we should (by invocation of a higher authority such as God or the Law) but because it is imprudent not to. If God counsels us not to transgress His law, wisdom counsels us to show imagination. That we can imagine the pain of others is what distinguishes us as a species even if (or precisely because) we are one of the few species that makes war on itself. The scale of the carnage of both World Wars I and II, which had been presaged by the death toll and destruction of the Thirty Years War three centuries earlier, shows us why we ignore ethical practices at our peril. Wisdom perceives that in war there is a tendency to over-reach, or to act wilfully or perversely; it is in our nature (God-given or not). Where law counsels us not to transgress, wisdom tells us that we are at risk if we do. Wisdom concurs with the law that offences against others are likely to incur punishment by God, as the pious hold, or more simply the punishment of hubris, as the Greeks were acutely aware. Sometimes we may even suffer from the morally corrosive effect of acting unjustly to others – we may be infected by our own hatred.
The wise, Rorty contends, differ from the pious in one critical respect: punishment is not certain. The tragic truth about life is its capricious cruelty. Virtue is not always its own reward. Where nature may punish the guilty it shows no tendency to reward the virtuous. Retribution may come, as it did for Germany in 1945. Acting honourably, however, is no guarantee of success, for the honourable are often defeated. In the end, the main reason to be virtuous is not to incur the risks of vice. As always, the injunction to act morally is really addressed to ourselves.
Discourses on war

One of the people who helped Geoffrey Parker with his essay on the 'Etiquettes of Atrocity' was the military historian John Lynn. Lynn later wrote a book, Battle, in which he developed a thesis that every society shares a discourse on war which includes its preconceptions about the practice, the value it attaches to it, as well as some fundamental ideas about its nature. This seems to me to be a very valuable tool of analysis. Discourse is an ill-defined term which has many uses in the social sciences, and almost as many meanings. The widely held view, however, is that a discourse is a set of ideas, beliefs and practices that provide ways of representing knowledge. A discourse enables the presentation of certain forms of knowledge while precluding the construction of others. Lynn uses the term to refer to cultural conceptions of war that tend to structure our perceptions of its essence, its purpose and the problems it presents. A discourse helps to make sense of the world 'out there' and thus to reduce it to manageable proportions. It makes it more open to reason, although the decisions we take on the basis of what we perceive to be real are not necessarily rational at all. Lynn is no social constructivist. Indeed, he is very critical of the rise of fashionable social science approaches to military history: '"Deconstruction" means one thing to our "cutting edge" colleagues; to us [military historians] it just means something like carpet bombing.' Irony aside, Lynn's discourses are cultural lenses through which particular groups in the military at different times choose to perceive reality. Different groups will contend with each other but only one discourse will prevail at any given time. As he acknowledges, '... a culture has no single discourse on war.
Rather, a number of distinct discourses encompass the values, expectations etc of varied groups which harbour potentially very different and at times opposing interests and points of view'.35 Different social groups may well share different discourses specific to themselves, to a particular class, or even a particular gender. It follows that there are discourses of power – or empowerment – and that it is generally true to say that the dominant group in society will produce a dominant discourse. The Italian sociologist Vilfredo Pareto called ideologies 'elite derivations', and the same term, it seems to me, might be usefully applied as well to the dominant discourse on war. In distinguishing between what he terms the discourse and the reality of war, Lynn adds:
The fundamental assertion of cultural history is that human communities impose cultural constructions upon reality, that they make the actual fit the conceptual. Cultural historians sometimes insist that reality is simply what is perceived, and thus culturally constructed. Such an attitude in war is fatal, in the literal sense of the word. But avoiding foolish intellectual excess, this principle applies to the cultural history of war within limits set by the objective facts of armed conflict.36

His concept of the reality of war is compatible with another definition of reality by John Vasquez: 'The word "reality" refers to the resistance of the world to conform to every imaginable conception humans create.'37 As Steven Lukes notes on a related point:

All cultures ... are, apart from everything else that they are, a setting within which their members, individually and collectively, engage in the cognitive enterprise of reasoning and face the common human predicament of getting the world right; of understanding, predicting and controlling their environment ...38

If this is true of all forms of cultural conception, then it has particularly urgent implications for the military. Reality, in Vasquez's sense of the word, confronts every military culture with a particularly harsh truth: success or failure in battle owes a lot to its 'cognitive enterprise', its dialogue with the real world. A military discourse is difficult to sustain in the face of persistent failure in the field. Indeed, it may even prove to be severely dysfunctional if it does not reflect reality as it is, rather than as one would like it to be. In other words, a military discourse is formed of a set of basic assumptions about warfare that will tend to be mutually reinforcing on their own terms. At the level of an individual discourse, a culture will tend to be internally coherent, but there may be multiple discourses in any society involving contradictory sets of basic assumptions.
It is the plurality of discourse that gives rise to strategic incoherence, especially when a country is facing defeat. This plurality of discourse is a reflection of the fact that different cultures try to change or control reality to fit their conceptions, while reality, in turn, tends to modify the cultural discourse to better match the objective facts of combat. In other words, there exists a 'feedback loop' between the discourse and the reality of war. A military culture as a conceptual medium may be deeply embedded, yet as a cognitive enterprise engaged in understanding, predicting and controlling the military environment, it always encounters the resistance imposed by a reality that is independent of its own cultural construction.

Social realities

To illustrate this claim, let me take one social group: the aristocracy, who determined the practice of war in Europe until 1914. Paul Fussell makes an important
point when he comments that in the days when war was entrusted to a specialised military caste – the aristocracy – there was a wide gap between the actuality of war and the language. Until the twentieth century there is not much mention in individual accounts by soldiers themselves of mutiny (collective indiscipline), casualties or combat fatigue (madness). They are part of the moral landscape of every war but they were airbrushed out of most accounts prior to World War I. Thus, even the world's first 'embedded journalist', war correspondent William Howard Russell, whose graphic reports to The Times awakened the public to the incompetence of the high command during the Crimean War, could still write up the charge of the Light Brigade (1854) – one of the great calamities of British military history – as if it were a glorious (albeit ill-fated) episode in an otherwise inglorious conflict: 'They swept proudly past glittering in the morning sun in all the pride and splendour of war.'39 Glory died on the Western Front after 1915 and has never been rekindled. In this respect, writes Alain Finkielkraut, no memorial did more justice to World War I than the Tomb of the Unknown Soldier. Making war and making a name for oneself had been tied together since Homer; the hero traditionally dared all so that he would be remembered. The Trojan War introduced the Western world to the names of the first individuals we know of: Achilles, Hector, Odysseus, Ajax. Warriors, indeed, tended to be individualised even more than saints. Everything changed with the mass slaughter on the industrialised battlefield. The cult of the Unknown Soldier can be seen as the joining of two antinomic modes of being that had been defined in opposition to one another for most of history: glory and obscurity. The virtue of the Unknown Soldier, by contrast, was his very anonymity, his fate that of a worker on an assembly line.
For every soldier killed on the Western Front there was always another in reserve.40 What has dominated the Western discourse of war ever since is a highly utilitarian ethos which has its origins, ironically enough, in the mid-nineteenth century, when Russell was sending his reports back from the Crimea. It is highly consequentialist. Actions are judged on the grounds of their consequences, not whether in and of themselves they glorify war or even the men called upon to fight it. Utilitarianism as a philosophy has little use for concepts such as 'honour' or 'respect', and little place for individual loyalties or tribal affiliations with their rituals and traditions. The utilitarian mind is entirely given over to instrumental reason. War in the Western world has no higher 'good' than its utility. War does not exist to satisfy any aesthetic or metaphysical ends and it certainly does not exist to satisfy the social needs of an aristocracy. In such a world there is little value attached to something as immaterial as thymos (the soul). Kant believed that by banishing passion from the field, war could be eradicated as a human profession. Glory, when acknowledged at all, was to be channelled into business or commercial activity. In England, Jeremy Bentham had little belief in the warrior calling. Indeed, he lamented the fact that civilised societies were still willing to go to war simply to provide the aristocracy with a pastime:
Thus we see in the ages of barbarism that the upper classes have divided their whole life between war; the chase which is the image of war; the animal function and long repasts at which drunkenness was the greatest attraction. Such is the whole history of a great proprietor, a great feudal lord of the Middle Ages. The privileges of this noble warrior, or this noble hunter, extended into the midst of a more civilised society the occupations and the character of the savage.41

Bentham claimed that the aristocratic classes of his own day welcomed war as a godsend, for it gave them a purpose and, more importantly, an opportunity to carry out what they had been trained to do since childhood: ride horses and fire guns. His critique was wholly consistent, of course, with his general contempt for honour or reputation as 'springs to action' in a man's life, whether or not soldiering was his profession. He had even less time for the concept of sacrifice as it was propagated in national myths and the great works of military history. Rather like religious faith, he considered sacrifice to be a superstition, the festering remnant of an earlier, more barbaric age. The US is an in-between culture. As de Tocqueville claimed as early as the 1830s, it was the first society to have renounced what he called 'the will to grandeur'.42 The US had chosen to pursue business with a vengeance and even turn a profit from making war on others. And yet the American military still invokes the concept of sacrifice in a way that Bentham would have heartily disapproved of. It even retains a 'warrior ethos' that still celebrates combat as the supreme expression of military life. Let me mention one other dimension of the American discourse on war which has changed in recent years: manliness. The admission of women into the armed forces in the early 1970s has transformed the discourse once again.
It has seen a marked decline in the emphasis on what the Greeks called andreia, which meant both manliness and courage. The sociologist Cas Wouters has noted the striking readiness of American pilots preparing for action in the Gulf War (1990–1) to admit to journalists that they were anxious, even frightened about the challenge to come.43 Clearly they did not feel (as their predecessors probably would have) that this acknowledgement was either inappropriate or unmanly. Using this example, a discourse can be seen as the context within which the military conceives itself, including its position as a role model for people in other professions or walks of life. The point is not (in the case of the aristocratic principle of glory) that war has become inglorious; only that the public context of glory seeking has changed, as have the policies based on it. Glory is now a job well done; it is a collective aspiration, not an individual one. This is also the case with fear. In 1990 American pilots were no more fearful than, say, their predecessors in the Korean War. But the Americans were members of a military culture in which it had become more acceptable to talk about, and thus acknowledge, their fears. One can accept as a given that most warriors have been fearful on the eve of battle; fearfulness was simply never publicly acknowledged or reported, any more than, before 1914, were incidents that were seen to detract from the 'glory' of a particular campaign.
In recent years the Western discourse on war has changed again. For a start, it would seem to be more humanitarian, at least in its ambition. The US Army is under pressure to scrap 'Warrior Ethos', a military creed that was introduced comparatively recently, in November 2003, after General Peter Schoomaker, the US Army Chief of Staff, expressed alarm that many soldiers in Iraq considered themselves to be support troops – cooks, mechanics and supply staff – rather than fighters. Schoomaker insisted that no matter what their job, every soldier should consider himself a rifleman first. Soldiers are now instructed to live by a creed which urges them to demonstrate their fighting spirit by destroying the country's enemies at close quarters. Admirable though this may be in the heat of battle, the emphasis on annihilating the enemy is clearly inimical to the type of patient, confidence-building counter-insurgency warfare in which the US is now engaged in the Middle East.44 What this discussion has shown is that a discourse on war is always evolving, in line with what Clausewitz called the 'character' of war. In the case of the West it has become increasingly instrumental over time. Our ethical practices or etiquettes of atrocity are largely prudential, or utilitarian, and they are increasingly determined by technological breakthroughs and distinctive cultural preferences. Some societies, unable to face technological change, retreat into an idealised version of war. For much of European history combat was dependent on the valour of horsemen clashing with their counterparts on the opposing side. The strength of the state, wrote a Spanish Muslim writer in the central Middle Ages, 'varies directly as the number of its paid troops'.45 But he also observed that in Christian Europe battles were won by 'a few renowned knights' and that the army which could claim 'even one more famous warrior than its enemy must ... win'.
His report was revealing for it underlined the extraordinary mixture of military insight and knightly showmanship – gamesmanship, even – which governed medieval warfare in this period. Medieval warfare was ruthlessly destructive of life and livelihood. Siege and ambush were its leitmotifs. But at the same time Western knights engaged in ritualistic tournaments which allowed them to acquire a glamour they could rarely attain in the field. Chivalry was an elaborate device, the code of a class that allowed it to disguise from itself – and others – the brutal nature of its calling.46 Even when the introduction of gunpowder made chivalry increasingly meaningless, it was difficult to dislodge from the aristocratic imagination. As late as the sixteenth century, chivalrous literature was at its most prolific, but it ceased to stress the spiritual goal of knightly contests. We will still find chivalrous gestures (including the challenge to single combat, release without ransom of a particularly gallant foe and, particularly, efforts to save women from the horrors of a siege), but most of these episodes are best seen as the codes of a freemasonry. The idealisation of chivalry may still have been pervasive, but the chief reality of war was being shaped not by values so much as technology. ‘Warfare was now an intellectual problem not an athletic exercise,’ writes Thomas Arnold.47 It is especially ironic, therefore, that Western societies in particular were so pre-occupied with seizing a technological advantage that they often deployed technology way before its time. Take the longbow, for example, which gave the
English their great set-piece victories against the French in the Hundred Years War. It was abandoned not because it was obsolete, but because it had to yield to the onward march of ‘progress’. The question is: was it replaced too soon? The Duke of Wellington once enquired whether there were enough archers in England to raise a corps of longbowmen, for in any clash between medieval longbows and nineteenth-century muskets the longbow, even outnumbered two to one, would have won every time. Not only did it continue to outrange the musket, it had a much faster rate of fire. At the beginning of the Napoleonic Wars the Prussian Army tested the muskets’ accuracy against a target 100ft (30m) wide and 6ft high. At 75 yards a trained battalion scored 60 per cent hits; at 150 yards this went down to 40 per cent. A yew bow used at Agincourt (1415) with a draw weight of more than 100lb was far more accurate and lethal. Against a man-sized target 100 yards away, an archer could release 17 arrows in a minute. If the Redcoats had been armed with longbows at Waterloo, it would have been an easy ‘away’ win, rather than a ‘damned close-run thing’.48 In the end, of course, firearms had one distinct advantage: a longbowman was the product of a distinct lifestyle and years of practice, whereas an effective drill sergeant could transform a mass of peasants into reasonably proficient musket men in a few weeks. As always in the West, instrumental reason won the day. Today, the prevailing class in the Western military is the middle class, and the prevailing discourse on war is a liberal one. Glory, status and honour have now been purged from the account, even if they are still invoked in training. Indeed, the Great War can be seen as the last conflict in which both sides genuinely believed that national honour was at stake, despite the faceless slaughter in the trenches and the very real chance of an anonymous death.
World War II, by contrast, was regarded even at the time as a struggle – not between equals in honour but between two sides which were distinctly unequal in morality, pitting Western liberalism against the illiberal face of fascism. The Cold War that followed was only an extension of the same conflict with communism replacing fascism as the enemy, despite the fact that the Soviet Union had played, perhaps, the major role in Nazi Germany’s defeat. For good or ill, the Western discourse on war at this juncture of history is a liberal one. And technology is the most important single factor in delivering the means by which liberal societies can fight with a good conscience. Technology makes it possible, as we shall see when discussing the use of non-lethal force, to keep faith with traditional ethical practices much more than in the past. It also raises, of course, new ethical problems: technology has side effects, like everything else in the risk age. Liberal societies still find the world stubbornly resistant to their own idea of war.
Keeping to the discourse

There are times when the reality/discourse divide is so fundamental that adjustment is not possible. Such an impasse can lead to a rejection of a particular conflict as a true war and even a tendency to redefine the struggle as distinct from
the traditional understanding of warfare. In turn, this can justify a more extreme reality of conflict. It can even drive a country to abandon its traditional discourse on war in an attempt to adjust to the ‘real world’. This new discourse can then justify fighting war more ruthlessly as the ‘laws of war’ begin to lose their appeal. Often, the enemy is deemed to have placed himself outside the traditional etiquettes of atrocity. And nothing is more likely to change the reality than when different societies hold different discourses about the value of human life, the acceptability of surrender and the fate of prisoners of war or non-combatants.49 Let me return to Lynn’s claim about what happens when a discourse comes up against a reality it cannot explain, as often occurs when two different societies find themselves fighting by different rules of engagement. Nothing is more sobering than to find oneself fighting a society which has a different etiquette of atrocity from one’s own. Take one comparatively recent case: the Vietnam War. The US chose to keep faith with the prevailing discourse on war rather than exclude the Vietcong (VC) from it on the grounds that they were ‘unlawful combatants’, a term which has become all too familiar in the War on Terror.
The United States and Vietnam

‘Vietnam, was that a war or what?’ asks Sgt Benson, a character in a story by Richard Ford. It is not that she does not know about Vietnam; it is that she does not want to know. She is talking to a Vietnam veteran on a train: ‘You were probably on a boat that patrolled the rivers blindly in the jungle day and night, and you don’t want to discuss it now because of your nightmares, right?’ Who wants yesterday’s papers, the Rolling Stones used to ask; for many that is also true of Vietnam, a war the US lost. Except that Vietnam should be remembered. It may be history, but it is important in terms of the argument I am putting forward. ‘I wanted to be intimate with the war,’ writes one of its chief witnesses, Michael Herr, the author of Dispatches, one of the seminal testaments to what he saw in combat. Later he helped to script two of the most memorable films to come out of the experience, Apocalypse Now and, ten years later, Full Metal Jacket. ‘I also wanted to maintain control as everyone does in the matter of intimacy but I couldn’t. Circumstances arose.’50 What is remarkable is that despite every provocation, despite changing circumstances for which many American soldiers had not been trained very well, the US chose to abide by its traditional discourse on war. The circumstances might have changed, but they did not persuade the US to change the script. The circumstances were different from anything the US had experienced or, more to the point perhaps, could recall. Thus at the head of one of the chapters of what is by far the best of the Vietnam War books (A Rumour of War, Philip Caputo’s troubled, soul-searching meditation on the nature of war, ‘about the things men do in war and the things war does to them’) we find a passage from Thomas Hobbes’ Leviathan, his description of ‘the war of all against all’.
Caputo and his comrades-in-arms found themselves fighting in the cruelest of conflicts, a people’s war waged in an unforgiving environment:
A war in which the enemy soldier fights for his own life and the lives of the men beside him, not caring who he killed in that personal cause, or how many, or in what manner, and feeling only contempt for those who sought to impose on his savage struggle the mincing distinctions of civilised warfare – that code of battlefield ethics that attempts to humanise essentially inhuman war.51

What seemed to distinguish the war from most other conflicts was its amorphous, anonymous nature. ‘It was the dawn of creation in the Indo-China bush. Out there, lacking restraints, sanctioned to kill, confronted by a hostile country and a relentless enemy, we sank into a brutish state.’52

It was the land that resisted us, the land, the jungle, the sun. Everything rotted and corroded quickly over there; bodies, boot leather, metal, morals. In the field, the humanity of the soldiers rubbed off them as the protection bluing rubbed off the barrels of their rifles.53

There is another fine passage about the unrelenting Vietnamese countryside in Tim O’Brien’s novel, Going After Cacciato: ‘When it was not raining a low mist moved across the paddies blending the elements into a single grey element, and the war was cold, and pasty and rotten.’54 This is a familiar trope, the Hobbesian state of nature. Caputo takes it as a warning of what happens to even the most modern of us when confronted with intolerable circumstances. When a man is stripped of all his socially acquired appetites and left in a moral vacuum, he is in danger of returning to his primeval state. And this can be the fate even of the most courageous soldier. Courage may enable a person to dominate his fear on the battlefield, but it cannot eliminate the stress that lives on. To survive battle is to have to live with oneself and the memory of one’s actions for years afterwards.
Many of the ‘collateral casualties’ of war, such as the victims of Post-Traumatic Stress Disorder (PTSD), never quite recover from the ordeal. When Caputo first considered putting his thoughts down on paper in 1976, he intended to write a novel. However, he soon realised that he was using non-fictional material from his own life, even when changing the names of the characters and places. In his prologue he summarised his intentions: ‘This book does not pretend to be history ... In a general sense it is simply a story about war, about the things men do in war and the things war does to them.’55 Since The Iliad, literature has had the job of mediating the realities of war for the rest of us, but for soldiers it is very different. Caputo soon discovered, as Ernest Hemingway once noted, that ‘it is difficult to write imaginatively when you know too much about something’.56 Instead of a novel, Caputo produced a memoir that reads with the passion of great fiction. Indeed, it is not so much an essay on Vietnam as a series of arresting and sometimes lyrical meditations on war, shot through with vivid reconstructions of his own deeds and experiences. Caputo is an erudite man; there is an epigraph at the head of each chapter, often a literary quotation. Only Thucydides is missing, which is surprising since
history’s first serious historian concluded that war is a violent teacher. War deprives people of the ability to satisfy their needs and brings them down to the level of their circumstances. We have rules of war so that our soldiers are not asked to commit acts that will haunt them in peacetime. We respect our enemies so that we do not become them. The ‘Warrior’s honour’ helps a warrior to keep faith with himself, his unit and his profession. The painful paradox, writes the psychiatrist Jonathan Shay, ‘is that fighting for one’s country can render one unfit to be its citizen’.57 We have rules so that our citizen-soldiers do not cross the line. For on the battlefield there are neither policemen nor courts of law; there is only the warrior’s honour to keep a soldier in check. Only his honour is likely to save him from himself; to ensure that he is not robbed of his humanity, which is the name we give not only to our species but to the qualities it is deemed to embody. In this particular study, of course, this is not the central subject of concern – although it is a powerful argument in favour of ethics, and one that I will address in Chapter 6 when discussing the growing micro-management of the battlefield. My main interest, however, is not with individual men at arms, but with the states that send them into battle. What interests me most is why states have continued to observe rules: they help to craft military objectives and to achieve results more efficiently and effectively. Nevertheless, the human factor is uniquely important. In Vietnam it was especially difficult for soldiers to maintain their moral compass. The difficulty was compounded by the fact that it was the first ‘postmodern’ conflict, one in which instrumental rationality was exacting in its demands.
It was the first computer war; for the first time computers promised total oversight, exacting standards of control and technical, rational solutions to a myriad of complex problems. In an attempt to master this new reality the Americans embedded the conflict in what Paul Edwards calls a ‘closed world’ discourse, ‘an inescapably self-referential space where every thought, word and action was ultimately directed towards a central struggle’.58 It was a discourse in which cybernetic science became a primary source of meaning. Formal and mechanical modelling provided the categories and techniques by which the Americans understood themselves and the enemy. As a discourse it was distinguished by three central features:

● Techniques were drawn from engineering and mathematics to model aspects of the world.
● Technologies, especially the computer, made systems analysis and central control practical on a very large scale.
● A language of systems, gaming and abstract communication and information devalued experiential and situated knowledge.
The closed world was based on an understanding that war was finite, manageable and, above all, computable.59 The Pentagon was captured by a generation of systems analysts who brought with them an obsessive love of numbers, equations and calculations along with a certain arrogance that their calculations
could reveal the truth. They transformed not only the vocabulary of war but also the prevailing philosophy of force. It was a philosophy which triumphed when the new US Secretary of Defense, Robert McNamara, proceeded to apply systems analysis across the military more thoroughly than ever before. McNamara had first risen to prominence during World War II, when he distinguished himself as one of the most brilliant analysts in the Statistical Control Office, where he conducted operations research for the Air Force using IBM counting machines. During the strategic bombing campaign against Japan he recommended a switch to fire bombing and lower altitude missions, both of which were adopted with devastating effect. After the war, he left the armed forces to join the Ford Motor Company, where he applied the same principles of scientific management before accepting the role of Secretary of Defense. Surrounding himself with a team of ex-RAND analysts who shared his own world view, he set out to extend these principles to all branches of the military. A controversial figure, particularly unpopular with certain sections of the military over which he asserted previously unseen levels of control, McNamara was once referred to as ‘a human IBM machine’ who cared more for computerised statistical logic than for human judgement.60 Because of the scientific and mathematical methodology upon which this new discourse relied, analysts gave much greater priority not to the ethical or unquantifiable aspects of warfare but to those that could be fed into a computer. To quote Martin van Creveld:

With computing as the stimulus, the theory of war was assimilated into that of micro-economics ... Instead of evaluating military operations by their product – that is, victory – calculations were cast in terms of input, output and cost effectiveness.
Since intuition was replaced by calculation, and since the latter was to be carried out with the aid of computers, it was necessary that all the phenomena of war be reduced to quantitative form. Consequently everything that could be quantified was. While everything that could not be tended to be thrown onto the garbage heap.61

Inevitably, perhaps, this informational approach to war and the increasing reliance on computers ensured that those elements which could be quantitatively measured were privileged over those which could not. Intuition, courage and will power – attributes that had been considered central to war for centuries – were widely devalued. The new discourse on war involved what Gibson calls ‘an un-poetic poetic’, a mindset best illustrated by McNamara’s response to a White House aide’s assertion that, based on his reading of military history, it was clear that the war could not be won. ‘Where’s your data?’ he fired back. ‘Give me something I can put in the computer. Don’t give me your poetry.’62 Besides masking the reality of conflict on the ground, the informational demands of this approach eventually overwhelmed the military infrastructure. Van Creveld, again:
Extreme specialisation of personnel and units, coupled with adherence to the traditional triangular chain of command, meant that headquarters was piled upon headquarters and that co-ordination between them could only be achieved, if at all, by means of inordinate information flow. A tendency towards centralisation, the pooling of resources, and the running of the war by remote control – especially evident in the field of logistics and in the air war against North Vietnam – further augmented the demand for information. Though the signals network for the US Army established in South Vietnam was the most extensive, expensive and sophisticated in history, it proved in the end incapable of dealing with this ‘bottomless pit’, as General Abrams once put it.63
And so the bombing of North Vietnamese cities proceeded to no particular end on the new ‘GIGO’ principle – Garbage in, Garbage out. What is especially striking about this new discourse on war is that it enabled the military to remain curiously detached from the conflict. The war produced a peculiarly unsettling jargon, including ‘frontier sealing’ and ‘census grievance’. One term that is particularly difficult to forget, once encountered, is a US military spokesman’s description of a bombing raid north of the Demilitarized Zone as having obtained ‘a 100% mortality response’. The military continually talked of ‘crossover points’ and ‘body counts’, dealing in the currency of mathematical formulae. Robert Thompson, the man who helped the British army win one of the few successful counter-insurgency operations in history, had a nice phrase for it; in Vietnam, he claimed, the Americans were guilty of ‘squaring the error’, an apposite mathematical trope which could be applied especially to the air campaign. The nightly bombing of North Vietnamese cities was expected to bring the war to a ‘crossover’ point at which the enemy’s will would finally be broken. ‘I can’t believe that a fourth rate power like Vietnam doesn’t have a breaking point,’ Henry Kissinger told his staff.64 The breaking point was never found, but 600,000 North Vietnamese citizens died in the attempt to discover it, or 60 civilians for every American soldier on the ground. The atrocities we still recall, of course, were in the field. Take the regular burning of South Vietnamese villages suspected of harbouring Vietcong, for example; here the Zippo lighter was as destructive a force as the B52. Americans still tend to dwell on the scenes from the ground war: a group of children, set on fire, running down the road, and a South Vietnamese police chief executing a VC officer at the height of the Tet Offensive (1968).
In both photographs, the viewers were intimately involved: they were witness to an execution; the crying children were running towards them. The Americans tended to ignore (or even deny) the atrocities perpetrated by the other side: 3,000 graves of those executed in cold blood were discovered at Hue after the 1968 Tet Offensive; the greatest slaughter of civilians arose from indiscriminate North Vietnamese and Vietcong artillery/guerrilla attacks in the south. But one of the reasons we still tend to dwell on American behaviour is that the language of the ‘closed world’ discourse on war was deeply troubling to the liberal conscience. In the ‘closed world’ of the American military, Vietnam spawned its own euphemisms
which soon became part of the ‘white noise’ of the war. Such terms included the ‘discreet burst of fire’ which tore bodies to pieces, ‘friendly fire’ for soldiers killed accidentally by their own side and ‘meeting an engagement’ for confronting an ambush. All of these terms emanated from the military headquarters in Saigon. They were part of headquarters’ attempt to spin the war for the press and the public back home. Nothing that happened in the field was so horrible that it was beyond language: ‘they talked as though killing a man was nothing more than depriving him of his vigour’.65 The real problem was that the Americans were looking for a quantitative measurement in a war that was qualitative. Moral science is rooted not in statistical analyses or data crunching, but in humanity. Data tends to dehumanise the enemy. Body counts were just one example of a specific style that made the Vietnam War unique. The practice still has its defenders, who deny that it led to a more cold-blooded approach to the war. Instead, they claim, the system of body counts produced a higher proportion of prisoners of war, resulted in fewer civilian casualties and enabled the ‘pacification’ of the country to advance more quickly.66 Still, the main argument against it is that it desensitised American soldiers to the consequences of their own actions, especially in two respects. First, it substituted a mathematical measure of military prowess for actual combat, encouraging individual units to grossly exaggerate their scores in a bid to obtain specific rewards, such as extended furloughs. Second, ironically, it reduced the military’s sense of responsibility by detaching soldiers from their own actions. Thus the 25th Infantry Division left ‘visiting cards’, torn-off shoulder pads displaying the Division’s emblem, on the Vietcong they killed. One helicopter gunship commander dropped visiting cards reading, ‘Congratulations!
You have been killed through courtesy of the 361st’. All this led Paul Fussell to conclude that Vietnam could be described as the first ‘post-modern’ conflict in history, breeding in the ranks of the senior command a certain scepticism and subversiveness that bordered on nihilism.67 In the end, it should come as no real surprise that one of the most famous memoirs of the war was Body Count. Its author was William Calley, a 26-year-old lieutenant known as ‘Rusty’ to his friends and the man responsible for the most infamous atrocity of the war: the My Lai massacre. Calley’s fifteen minutes of fame owed everything to the system. His unit had failed to meet its body count quota for the month. He had been warned by his commanding officer a few days earlier: ‘You’d better start doing your job, Lieutenant, or I’ll find someone who can.’68 In that sense, if in that sense alone, Calley could claim that he was acting under orders. Unlike the Abu Ghraib scandal, My Lai was investigated by the Pentagon immediately after the news leaked. Even in the Nixon years (with a beleaguered President in office) there was no attempt to whitewash. The commission empowered to look into the affair concluded that given the competitive nature of command assignments and the general tendency to evaluate command performance on the basis of ‘tangible results’, Calley’s unit probably viewed the operation as a chance to redeem past failings. His own response was one of incredulity that he should have been blamed for the massacre at all. If the US
public objected to the slaughter of women and children, the mathematical metric applied by the military made it difficult to draw a distinction: ‘The old men, the women, the children – the babies – were all VC or would be VC in about three years,’ Calley insisted. ‘And inside of a VC woman I guess there are a thousand little VC waiting to get out.’69 The men responsible for My Lai were not monsters; they were ordinary soldiers who found themselves in an extraordinary situation, fighting a war that made no sense to them or almost anyone else in the field. They were not untypical of many other soldiers of the time. Insofar as statistics can measure what is typical, most were young (around 20 years old); some were enlisted, others had been drafted; some were black, others white. All were new to Vietnam and, more importantly, new to the army. Indeed, the massacre reflected a general failure either to enforce or to police existing regulations. A survey of 108 US Army general officers who served in theatre revealed that fewer than one-fifth believed that the rules of engagement had been ‘carefully adhered to throughout the chain of command’. Fifteen per cent reported that the rules had not been especially ‘considered’ in the day-to-day conduct of the war.70 What should have shocked the military leadership most was the commission’s final conclusion. Calley himself was criticised for his stupidity, his insecurity and his inability to inspire confidence in his own men. Nevertheless, the real blame ran higher up the chain of command:

If on the day before the ... operation any one of the leaders of platoon, company, task force, or brigade level had foreseen and voiced an objection to the prospect of killing non-combatants, or had mentioned the problem of noncombatants in their pre-operational orders and instructions ... the tragedy might have been averted altogether or have been substantially limited and the operation brought under control.
71

The problem, in short, arose not from a change in the discourse on war but from a failure to enforce the prevailing one. Although republished every six months, the rules of engagement were not always widely distributed. Many general officers claimed that prior to My Lai they were ‘frequently misunderstood’. Calley’s men had received ‘accelerated training’, which meant minimal instruction in the correct treatment of non-combatants. Indeed, only two months prior to their embarkation they had been assigned 50 replacement personnel who had received so little training that they were bound to find themselves at a loss in a difficult operational environment. The US Army’s ethical codes were imperfectly understood. All of which can lead to only one conclusion: ethical leadership is not a matter of leading troops in battle in ethically appropriate ways; it involves the conceptually much harder task of preparing and structuring forces so that when they are employed they will be prepared for the environment in which they operate. My Lai was not the only atrocity, nor was it necessarily the worst. However, it was the most extensively reported. Most atrocities went unreported; most American journalists (at least until 1967) saw them as part of the immutable landscape of war.
Why My Lai is interesting is that it showed how America’s ‘closed world’ discourse on war did not lead the US to break faith with its own ethical rules. Even the Vietcong were not excluded on the grounds that they were ‘unlawful combatants’, the term the Bush administration has applied to the foot soldiers of the War on Terror. This forbearance is all the more remarkable given that a case could have been made for exclusion had the administration chosen to make it. North Vietnam, after all, publicly rejected the laws of war as ‘bourgeois’ conventions. VC foot soldiers did not wear uniforms (they wore their famous ‘black pyjamas’ instead); they did not have a standard command structure (they were organised in groups of three to nine men); they did not carry arms openly. Nevertheless, according to the Criteria for Classification and Disposition of Detainees drawn up at the height of the conflict in 1967, the VC were entitled to humane treatment under the Geneva Convention. There were many occasions when the US Army would not even transfer VC prisoners to the South Vietnamese (from whom they could expect much harsher treatment). Usually, when VC members were taken prisoner they could expect to be treated fairly. Ironically, perhaps, one of the principal repercussions of My Lai was a revolution in how soldiers were trained.72 Their training subsequently included clear, specific instructions in the Geneva Conventions concerning the treatment of prisoners of war and enemy combatants.73 There was always a chance, of course, that things might have gone the other way. Common among many American soldiers in Vietnam were such stock phrases as ‘any living gook is a VC suspect’ and ‘any dead gook is a VC’.74 Derogatory remarks about the enemy are common enough in all wars. All soldiers distance themselves from those they kill or target. In the twentieth century the epithets tended to be racial.
A good example of the way in which the VC were portrayed in a particularly disparaging light can be found in the novels of Larry Heinemann. In writing Close Quarters and Paco’s Story the author drew upon his own tour of duty as a 22-year-old sergeant with the 25th Infantry Division. Heinemann’s work is a pretty authentic echo of the times – its language is vulgar, racist and sexist; the reader is not spared. The common contempt for the Vietnamese people is manifest throughout:

You could roll [Vietnam] into one tight-ass ball and none of it is worth the powder to blow it away with. There ain’t one, not one, square inch of muck within 5,000 or 6,000 miles of here that I would fight anybody for, except what I’m standing on.75

It is this widely shared contempt for the Vietnamese as a people – ‘I never met a squint-eye that I would call anything but gook’ – compounded by the military emphasis on ‘body counts’, which explains why in real life so many atrocities against civilians could be committed. Yet there is something else worth adding. My Lai itself was part of an area in the Quang Ngai province known as Pinkville. It is true that hundreds of villagers – mostly women and children – were killed by American soldiers who had seen their friends die weeks earlier, often at the hands of a faceless enemy:
a young girl carrying a basket of fruit down a trail, concealing a bomb; a farmer in the field who could not be trusted. In essence, they were seeking revenge for the lives of friends lost days or weeks before. Even Calley was trying to restore his tarnished reputation in the eyes of his own men. What is important is that the villagers were not targeted because they were Vietnamese. One intimation of this can be found in the nickname of the area; it was called not ‘Gookville’ but Pinkville, a name used to denote an area considered sympathetic to the Vietcong.76 Perhaps it was because race could not be factored into a computer that it never took on the significance it might have. This surprised many contemporary observers. Writing 12 months after the Soweto uprising, which began the slow countdown to the end of Apartheid in South Africa, Hugh Tinker argued that race had become part of the world’s ‘total experience’. It was the one issue which transcended the ideological conflict between East and West. He even likened it to the role of religion in seventeenth-century Europe:

... Religion formed the total experience transcending everything, dynastic struggles, political debate, artistic and literary ferment, the rise of capitalism, the challenge of science and rationalism ...77

He predicted that in the 1970s the confrontation between the races would transform the face of war. This transformation did not occur. National Liberation Wars, even in black Africa, never degenerated into race wars. Marxist liberation movements remained true to their own first principles, and race was dismissed as a form of ‘false consciousness’. The Cold War remained a secular conflict involving two secular blocs driven by very different ideological convictions. One could always, of course, attribute America’s reluctance to change the discourse on war to the self-confidence of a still inexperienced global power unaccustomed to defeat.
But that would be to misunderstand the significance of its commitment to a discourse of war that traced its origins back to the birth pangs of the republic. The US kept faith with its discourse at least as far as ethics was concerned because it kept faith with its own revolutionary past. The Vietnam War could be understood in the context of the Americans’ own history: it could be seen as a war of liberation – albeit one inspired by the ideas of Marx, not Locke. Like other national liberation struggles across the globe during the Cold War – whether in Southern Africa or Central America – the Vietnam War could be grasped as part of the historic cycle which had given birth to the US itself.
Carl Schmitt and the Theory of the Partisan

Central to this point of view is what Carl Schmitt – in a major 1962 essay which has only recently been translated into English – calls 'the Theory of the Partisan'.78 The point of departure for his reflections was the guerrilla war which the Spanish people conducted in the years 1808–13 against the army of a
foreign conqueror. For the first time in the modern age a pre-industrial and pre-bourgeois society clashed with a modern army. The Spanish war was waged for five years alongside the regular battles conducted between the French and the Duke of Wellington's English armies in what the English still call 'the Peninsular War'. It was not a directed guerrilla war; there was no central direction and no unifying ideology. Instead it involved almost 200 separate regional conflicts in Aragon, Catalonia, Navarre, Castile and elsewhere, under the leadership of countless combatants whose names are woven into myth and legend (including Juan Martín Díez, known as 'El Empecinado', who rendered the road from Madrid to Saragossa particularly unsafe).

In due course the spark flew north from Spain. It may not have kindled the same flame in Germany in 1813 as Clausewitz and others hoped but it gave the guerrilla war its world-historical significance. It produced a new theory of war, as well as a new reality. Since the partisans had no hope of engaging the enemy in pitched battle they chose to engage in sneak attacks on civilians or small military detachments, after which they immediately melted back into the population at large. They relied on surprise and shock, and generally retreated when they met serious resistance. The French army retreated into a few strong points, leaving the rest of the country thinly occupied and effectively out of its control. Sometimes the battles were bloody. In the course of the fighting the French wiped entire towns off the map, as at Saragossa, which saw the worst urban fighting before the twentieth century. Saragossa was Spain's 'Fallujah'. In this terrible conflict the Enlightenment met up with an earlier age.
Napoleon had once boasted that 'nations with lots of friars are easy to subjugate', but what confirmed the guerrilla in his absolute enmity towards the French was religion. For many this was the first religious war since 1648; the next would not be seen until Iraq in the twenty-first century.

Schmitt acknowledged that irregular war has been a feature of conflict for centuries, especially in times of general dissolution such as the Thirty Years War. It also remained a feature of colonial conflicts between indigenous peoples and Western armies throughout the period of European expansion. But irregular war came into its own in the course of the twentieth century (notwithstanding World Wars I and II). Indeed, it was by far the greatest challenge that the Great Powers encountered. Whereas the conventional wars in Korea and Vietnam were fought by the US to restore the status quo ante bellum, partisan wars transformed the political landscape from Indo-China to Southern Africa. Even for the US, war has become increasingly indecisive as a political instrument; for partisans, war has made a significant difference.

What distinguishes the modern partisan, Schmitt claimed, is the force and significance of his 'irregularity'. The distinction between 'regular' and 'irregular' is only really applicable in the modern age, given the modern form of social organisation (the nation state) and the modern organisation of armies. It was the regularity of the state and the military in Napoleonic France which gave the guerrilla war in Spain its distinctiveness. The innumerable Indian wars conducted against Native American tribes or the civil war in the Vendée (1793–6) still belong
to the pre-Napoleonic era. Napoleon quite self-consciously recognised that the war in Spain was different. He complained that, 'it has cost us dearly to return ... to the principles that characterized the barbarism of the early ages of nations, but we have been constrained ... to deploy against the common enemy the arms he has used against us.' The old saying attributed to General Lefebvre (one of his commanders in Spain) remains valid: 'You have to fight like a partisan whenever there are partisans'.79

In partisan warfare the regular army is always confronted with the type of warfare it least wants to fight: a high-manpower, drawn-out struggle with indeterminate ends in which high-end technology is not dominant and military actions are not decisive. This demands a lot of any force, which is why it is often in danger of 'regressing' (in its own imagination). It is from this tension that many of today's ethical dilemmas still arise. This is especially challenging for a nation that would like to abide by the rules of the game, for none of the constraints which constitute ethical practices in war are observed by the partisan.

As in most guerrilla wars, the partisan fights for a cause: the liberation of his own country (even if the real inspiration is invariably local). State armies fight outside their own territory; the partisan fights inside his own. The partisan is all the more persistent and ruthless when defending his own 'being'. He expects neither justice nor mercy from his enemies and rarely grants it in return. More alarming still, he trades what Schmitt calls 'the conventional enmity of a containable war' for a real enmity that arises from the circumstances of the occupation. Inevitably, terror and counter-terror become the currency of war.80 Yet Schmitt was quick to remind us that partisan warfare is rooted in the sphere of politics. It is his intense political commitment which sets the partisan apart from other combatants.
It is politics which distinguishes him from the common thief and criminal whose motives are personal enrichment. The pirate is possessed of what jurisprudence knows as animus furandi (felonious intent). The partisan, by contrast, fights on a political front and it is precisely the political character of his actions that throws into stark relief the original sense of the word we apply to comprehend him. In German 'partisan' means a party adherent (someone who adheres to a party). What this means concretely is very different at different times, both in regard to the party he supports and to the extent of his collaboration and cooperation, and even his possible capture. There are warring parties as well as judicial parties, parties of parliamentary democracy, parties of opinion and parties of action. In the Romance languages the word is employed both as a substantive and as an adjective: the French speak of a partisan of whatever opinion; in English to be partisan is to take sides. It is this changing set of meanings which makes it such a politically charged word.

Clausewitz, interestingly, was the first military theorist to absorb the partisan into the Western discourse on war. It was the partisan's political character that he grasped from the first. In a letter to the philosopher Fichte in 1809, he advised that Machiavelli's doctrine of war was outmoded in part because he had not witnessed the vitality of individual forces in war. The courage of the individual soldier in the end is most decisive in close combat, especially 'in the most stunning of all wars, conducted by a people at home on behalf of their own freedom
and independence'.81 The young Clausewitz knew the partisan from the Prussian insurrection plans of 1813. Two years earlier at the Military Academy in Berlin he had presented lectures on low-intensity warfare. He may have been sorely disappointed that everything he had expected from the Prussian Landsturm edict of April 1813 fell through (the edict itself was rescinded three months later). However, he continued to argue that the first 'popular insurrectionary war' in Spain had changed the character of modern warfare.

The potential of the partisan is discussed in his classic work On War, especially in Book 6 and chapter 6 of Book 8 ('War as an Instrument of Politics'). One also finds astonishingly telling remarks throughout the work, such as the one about the civil war in the Vendée which says that sometimes a few isolated partisans in the field might even lay claim to being an 'army'.82 In the end, Clausewitz never developed these ideas systematically. He died before he could complete his master work. But this is not the explanation for his failure. He remained the reform-minded regular officer of his age, unable to germinate the seed he had sown, or to exploit its full potential. The strong national impulse evident in 1813 ended in isolated acts of rebellion. It was soon channelled into regular warfare on the battlefields of Bautzen, Dresden and finally Leipzig, the decisive encounter that forced Napoleon back to the frontiers of France.

Clausewitz himself, adds Schmitt, still thought in classical categories, as when talking about the Trinitarian nature of war and attributing 'the blind natural impulse' of hate to the people; courage and genius to the commander and his army; and the purely rational management of war to the state. Partisan warfare encapsulates all three. Clausewitz never grasped this point, perhaps because it offended his sense of order.
In Vietnam, nonetheless, the US kept faith with one of Clausewitz's key insights: that the partisan is usually a political agent. The US military never lost sight of the fact that in the Vietcong, as in North Vietnam, it was dealing with a political enemy. The war in this regard can be seen as a guerre de forme – an armed clash between political agents who recognised each other's political status. The US chose to apply the Geneva Conventions to the Vietcong, though not because it had to (although since 1949 the conventions have increasingly encompassed civilians engaged in military acts). It accepted that the conventions we sign are there not to humanise war but, in Schmitt's words, 'to contain' it. We act honourably towards our enemies because it is sensible to do so, because they are generically different from criminals.

The US may have gone on to lose the war but it did not forfeit the chance of victory because it applied the Geneva Conventions. It lost the war because it had no game plan to win it. Possibly, it was fighting an enemy that could not have been defeated in the circumstances of the time. The fact that it kept the war political at least ensured that, aside from the many atrocities – and there were far more than reported – the war itself never became an atrocity. It remained a political struggle whose eventual outcome was accepted by both sides.

Let me return to Schmitt's contribution to understanding modern war because it is far more profound than we think. His thought largely revolves around the
problem of distinguishing what war always is (the simple ability to impose one's will on another) from what it can become (an absolute, unrestricted struggle). The trick is to contain it through laws restraining force or, in Schmitt's words, channelling the lack of restraint of war (its nature) into a juridical form (its character). Ethics, similarly, can perform that role too, either through personal conduct or through the legal codes which embody its principles.

'War has its own reality,' writes Tim O'Brien. 'War kills and maims and rips up the land and makes orphans or widows. These are the things of war. Any war.'83 Vietnam is the villain of his novel Going After Cacciato. But, as he adds, every war is villainous for the same reasons, even though some wars are 'contained' and others are not. And what contains them is politics in the form of law or ethics.

As Schmitt wrote in his seminal work The Nomos of the Earth, international law implicitly accepts that in the anarchical state of nature that constitutes international politics states can live as personae morales (moral persons). The anarchical society is not lawless. What prevents states from engaging in a Hobbesian war of all against all is not just the conventions of international law, 'the thin thread of treaties that [bind] these Leviathans together ... but ... strong traditional ties – religion, social and economic [which] endure much longer'.84

Schmitt is not popular with liberals, and not only because he was not of their persuasion. He turned his back on the twentieth-century liberal project of constructing a New World Order, first through the League of Nations and then through the UN. He focussed, instead, on those more durable forces that spanned the ages: the attitudes and practices rooted in the past that Geoffrey Parker, as a historian, identified as most likely to contain war's passions.
Schmitt's work tells us less about specific institutions such as international regimes or organisations than about the fundamental practices which they are deemed to institutionalise – the rules, if you like, that we have learnt over the centuries from experience or through example. For Schmitt the challenge of international politics was not to outlaw war but to transform it into an institution. He offended his liberal readership by taking issue with liberal governments for trying to legislate away war in international agreements or conventions; he contended that it is our willingness to fight for our beliefs and take the ultimate responsibility for our values which defines our humanity. If it is true that 'a liberal cannot take his own side in an argument' then there is little to be said for liberalism. What do principles represent if in the last resort they are not worth fighting for?

In insisting that international politics could never be made safe Schmitt broke with the liberal orthodoxy. His sense of an embattled world was probably made more intense by his involuntary withdrawal from academia following World War II. As he became progressively more isolated his internal vision grew even more intense, which suggests that his vision may have had roots deeper than those of his intellectual enquiry. Schmitt must be seen, perhaps, as an orphan of an age which did not agree with his arguments in favour of defending one's ground by recognising one's enemies. Precisely because he did not underwrite fashionable opinions he was called a reactionary, the taunt which the world throws at those who do not follow the
conventional wisdom. He did not help his own position by insisting that it was rooted not in morality, but in a quasi-Hobbesian understanding of human nature. Great writers who claim that their work is not ethically grounded usually turn out to be quite moralistic when you examine their beliefs more deeply. The same is probably true of Schmitt. Leo Strauss always claimed that in his constant search for meaning in life Schmitt was a highly moral writer, for the 'political' is the realm in which morality is to be found. What might be claimed, more controversially, is that his morality was highly pragmatic. In asking us to identify our enemies he was asking us to understand who 'we' are so that we know what is in our true interest. Schmitt did indeed want us to identify enemies, but real ones – not generic or universal ones – so that we could better defend our own positions. It is the way in which we defend those positions which may determine whether they have universal applicability or not, though Schmitt himself would never have made such a claim.

When surveying the contemporary international landscape he was appalled by the decline of the Western state system as the liberal world tried to abolish war by subordinating states to international institutions. He would have had no truck with the International Criminal Court to which most Western countries (excluding the US) have signed up. His analysis is so radical because he traced the decline of the state not to the forces of globalisation or the omnipresence of market forces, but to the decline in its understanding of the new currents of political intensity that were transforming the international order. Enmity in international politics, he warned, was transforming itself from the real to the absolute. As always in Schmitt's writing, the evolution of the 'political' shapes the evolution of war.
Writing in the 1920s, he predicted that the partisans (or parties) who captured the state (as in Soviet Russia and later in Nazi Germany) would be more ruthless than their liberal opponents at mobilising their societies.85

What would he have made of our post-9/11 world? Would he have seen a clash of civilisations, not between states but between states and non-state actors? Would he have advised us to ignore the suicide bombings, the televised beheadings, Sharia law, honour killings, political oppression and religious intolerance including virulent anti-Semitism? Would he have dismissed these, as many do, as 'fictive ethnicities' belonging to the obsolete ideology of American imperialism? Although he was an implacable critic of America's liberal project, he would, I suggest, have counselled us to take terrorism very seriously, precisely because its ideology is rooted in the idea of an absolute enemy with whom there can be no peace, not even an armistice. Would he have warned his fellow Europeans that in Europe itself the old state system is dissolving into a weak transnational community (rather as Christianity has dissolved into 'a faith community') that has difficulty inspiring the faithful? In the US might he have seen a different problem: a country attempting to unite its citizens in a war against an absolute force of Evil, a country intent on transforming war into an eternal policing operation? Schmitt's fear was that only the partisan would take war seriously. In this person we would see an actor engaged in trying to subvert the existing international order and doing so successfully by having a better grasp of war and politics.
Schmitt's illiberal views may make for grim reading and 'we liberals' (whether Rorteans or not) need not accept his analysis uncritically. But what we should take seriously is the seriousness of war, as well as the intent of our enemies. There are indeed enemies out there, not all of whom are merely the results of our stereotyping 'the other' on the basis of artificially constructed norms of religion or ethnicity. Our enemies are those Schmitt feared most: partisans who, following Lenin's example, have married nationalism and ideology, partisans who have turned to terror not as a tactic but as a strategy.

Nearer in time to Schmitt's writing on the partisan was the anti-colonial struggle against the French in Indo-China. Here was a particularly horrendous example of a party that had deliberately provoked counter-terror from the French by terrorising the local population not in an attempt to restore the social order (as in Spain), but to subvert it. Later, in the conflict against the US, the North Vietnamese state mobilised the entire population. In doing so it was able to marry nationalism and ideology, in this case to overlay a quasi-Marxist message of historical struggle – the advance to a perfect society – onto a centuries-old Vietnamese tradition. The struggle was called 'dau tranh', a phrase which consecrated the war as a personal duty and in the process made sense of personal suffering. The individual suffered so that the nation might live.

But every Marxist movement also owed a large debt to Christianity in secularising the doctrine of providence, and in the process converting the belief in salvation into a metaphysical historicism, a substitute religion that became the faith of those whose scepticism was not vigorous enough to dispense with religion altogether. Communism was distinctly Christian in its themes – in its opposition between Free Will (liberty) and Predestination (history); not even God was entirely absent from the text.
History assumed His functions, if not His face, with one critical difference: unlike the Christian God, history did not assume a human form.

It is worth noting that although successful in the field, the young revolutionaries who fought the Americans were betrayed by the revolutionary war. Take the young Bao Ninh who was 16 years old when he was first called to serve with the 27th Youth Brigade. By the time Saigon fell ten years later he found that he was only one of ten survivors from the 500 who had joined up with him. His award-winning novel The Sorrow of War is one of the outstanding accounts we have of the fighting. His is one of the few voices we hear from the other side, yet it is a despairing work because of its final conclusion that history had played them false. He and his comrades had started out with so many hopes, but hope was soon extinguished.

Our history-making efforts for the great generations have been to no avail ... from the horizon of the distant past an immense sad wind like an enormous sorrow gusts and blows through the cities, through the villages and through my life.86

A socialist utopia had not been forged. Humanity had not been redeemed. For North Vietnam the glorious socialist future had not lasted very long.
He had discovered he was happier when looking into the past; his path in life which he had once assumed would be towards a beautiful future had done a U-turn and taken him backwards into the murky darkness of the hard times his homeland had experienced. Happiness seemed to lie in the past. The older he grew, the rosier the past looked to him.87

Schmitt had no doubt that the Indo-China war would not be the last in which the revolutionary partisan embraced terror. He was particularly prescient in making the following prediction:

You have only to follow the logic of terror and counter-terror to its natural conclusion, and then apply it to every sort of civil war in order to understand the shattering of social structures at work today. A few terrorists suffice to put large masses under pressure. To the narrower space of open terror are added further the spaces of insecurity, anxiety and common mistrust ... All the peoples of the European continent ... have experienced this personally as a new reality.88

What would make the future more alarming still was the continuing evolution of technology. The old-style partisan whom the Prussian Landsturm edict of 1813 had planned to mobilise would have used pitchforks and muskets. Modern partisans fought with machine guns, bombs and grenades. They were becoming wired into a worldwide communications network.89 Even in 1962 Schmitt glimpsed our networked world. And even more striking, he warned that the ultimate means of destruction, including nuclear weapons, might soon fall into their hands. Most frightening of all, the partisan was becoming radicalised.

All his life Schmitt maintained that the 'political' involved the distinction between the friend and the enemy. In this distinction can be found his key insight: if one accepts that war can never be outlawed successfully, and that conflict (violent or non-violent) is at the heart of political life, the best we can aim for is to contain it.
It is the containment of war that for Schmitt represented the highest form of order within the scope of human ambition. It is the duties we recognise we owe our enemies that make us human. We escape the state of nature when we recognise that our enemies are fallible human beings like ourselves. It is from human fallibility that we deduce our ethical obligations. As long as the partisan lived in the political world his enmity too would be real, not abstract. What struck him even about revolutionary guerrilla leaders like Che Guevara was the unconditional nature of their political engagement.

The partisan without rights seeks justice for himself in enmity. In war he finds the meaning of justice once he feels himself excluded from the protection of the laws he once obeyed, or once the system of norms from which he expected justice has been shattered. The partisan fights to be recognised as a human being by very real enemies who stand in his way. But such enemies are to be defeated, or possibly 're-educated', or redeemed. They do not merit annihilation. What happens,
however, when the partisan sees you – and you see him in turn – as an 'enemy of mankind'? What if a real enemy is transformed into an absolute one in the eyes of both parties? Clausewitz warned that, if unchecked, war has a predisposition to take an absolute form. He imagined that the state system could prevent this. He did not predict, Schmitt noted, that the state might one day become the instrument of a party, or that a party might capture the state, as happened in the case of Nazi Germany.90 If such a party waged war, Schmitt predicted, then the conflict would be especially intense and inhuman because states would depoliticise their enemies as a 'sub-moral or even a sub-categorical monster'. Both parties would push each other 'into the abyss of total devaluation'. But then Schmitt himself did not foresee that non-state actors prepared to kill in the name of Truth or God might declare war on the state system.91
Conclusion

In the following chapters I look at three cases of what happened when three separate discourses on war were revised to take account of an absolute enmity. In the case of the first, Nazi Germany, a state captured by a partisan party, the Russians were designated sub-moral (or in this case sub-human). We then look at two liberal societies that also felt obliged to change their doctrines when fighting partisans that targeted them as sub-moral. In all three cases the changes made little strategic sense. In the case of the last two they posed an additional challenge, forcing both to ask: what price their own liberal principles? What makes all three cases relevant today is that they illustrate the problems every society faces when fighting partisans.

Even a regime as ruthless as Nazi Germany might have met its match had it not been defeated on the conventional battlefield. The historian John Keegan once claimed that if Germany had won World War II it would have had no difficulty holding down its new empire in Europe. Martin van Creveld is not convinced. Like every other occupation army, he argues, the Wehrmacht would have found most of its most powerful weapons (nearly) useless. Like every occupation army that followed it, it would have experienced great difficulty in gaining intelligence as its enemies, fighting without uniform and often without an orderly chain of command, emerged from, and melted away into, the civilian population. As its operations in Yugoslavia illustrated, the Germans enjoyed no great advantage over their opponents. Europe, after all, was a continent where the occupied population had all the skills necessary to manufacture light weapons of every kind, and where, owing to the system of general conscription, the number of men who had received some kind of military training during their lives ran into the millions.
It is true that the Nazis were ruthless – although not so ruthless as to treat men and women alike; female members of Western resistance movements, instead of being executed, often ended up at Ravensbrück concentration camp. Arguably the Nazis were not necessarily worse than some other would-be counterinsurgents who, without acknowledging the fact, followed in their footsteps. The
French in Algeria killed about a hundred people for every one they lost. They also destroyed entire villages by bombing them from the air. They re-introduced torture. One of their most notorious units, the Foreign Legion, not only had in its ranks many Wehrmacht veterans but was proud to call itself 'the White SS'.92

If the struggle for liberation cost countries dearly, there is no reason to think that, if called upon, they would not have paid the cost of their attempted liberation. Yugoslavia lost two and a half per cent of its pre-war population. Most of the casualties were inflicted not by the German occupation forces but by internecine fighting – Serbs against Croats, Communists against Chetniks. In today's Iraq the main casualties are incurred in a Shia-Sunni civil war. Cruel as it sounds, history also shows that even a tenth of the population dying in a protracted struggle is not necessarily too high a price – in 1941–5 the USSR suffered considerably more.

Above all, van Creveld insists, Keegan is wrong because he fails to recognise the most important factor of all. From beginning to end World War II only lasted six years, but insurgencies last much longer. Although some resistance movements are quick off the mark – for example, the Americans entering Baghdad in 2003 were granted only a few days' respite – experience shows that most take quite some time to get up to speed. It took 40 years in Palestine for the intifada to start, but there is no end to the struggle. The critical fact in all of this is precisely the one that Keegan overlooks: time. Wars between states are short; counter-insurgency campaigns are endless. The longer they last, the more likely it is that the weak will become stronger. The strong tend to undermine their own legitimacy. A single crime, once it has been committed, may be forgotten, or at least forgiven.
A long series of crimes carried out over a long time span will invariably lead to the counter-insurgent force undermining its own credibility, moral authority and political position. As an Israeli, van Creveld was pained to witness the decline of the Israeli Defence Force in the first intifada. Once considered one of the world's foremost fighting machines, it was caught off balance by a situation it had failed to foresee. It reacted by lashing out incoherently. Over the following months it killed hundreds, arrested thousands, blew up the houses of many suspected terrorists and, on the specific orders of the Ministry of Defence, used batons right and left to break the arms and legs of young Palestinians.

The lesson van Creveld drew from that particular conflict was that any counter-insurgency campaign must be grounded in an ethical response: it must be backed by solid professionalism and iron discipline. If it does not recognise the nature of the problem, it will fail. As always, the choice is ours.
3
Changing the discourse
Germany and the Eastern Front 1941–5

One of Schmitt's central claims was that the post-Westphalian settlement had established a jus publicum Europaeum, a space in which war had been 'bracketed' or kept within ethical and legal bounds. After 1648 the European powers were able to seal themselves off from war's true nature: nihilistic hatred and mutual destruction. Only this allowed Clausewitz to consider war a 'duel' between states who recognised each other as equals. Only this allowed him to consider that the battlefield could decide the outcome, and to treat war between states as the norm when historically the norm had been more ubiquitous violence. It was the state system, Schmitt added, that made possible those distinctions which reside in the ethical realm. The state was not only a source of power, but an ethical entity. By mediating war the state permitted clear and unambiguous distinctions: war and peace; military and civilian; neutral and non-neutral. It was the inter-state system that, for Immanuel Kant, constituted 'the republic of Europe' and, for Hegel, the European 'civil society'.

Yet this system was placed under great strain after 1800 with the rise of the nation-state. It was clear that the rise of nationalism, and the threat that war might unleash passions like hatred, could transform the face of war. Both challenges were addressed in different ways by Clausewitz and Hegel. Both writers gave the French Revolution and the rise of Napoleon a seminal role in the genesis and articulation of their respective phenomenologies of political life. Napoleon's very possibility, thought Hegel, taken in its broadest theoretical and empirical sense, marked the fulfilment of humanity's historical destiny. Famously, he glimpsed Napoleon under his window at Jena at the precise moment at which his preface to the Phenomenology of Spirit was being completed. It was an iconic coincidence.
Philosophy had achieved its end at the moment in which history drew to an end. The problem for those who read Hegel is: what does this finality signify? Does it signify the nation-state, as Hegel himself came to believe; Stalinism, as his principal twentieth-century interpreter Alexandre Kojève believed; or the triumph of liberalism, as Francis Fukuyama believed? The problem for both Clausewitz and Hegel was to marry ethics in war to two new forces: the nation-state and the advent of industrialised modern war.
Hegel never really addressed that challenge head on. War, he tells us, 'educates for freedom'. It is the supreme 'ethical moment' in a nation's life, consistent with the larger search for freedom, which he claimed to have recognised as the main narrative of history. Europe, he claimed, had reached a stage when its citizens no longer fought for the security of life or the property of the individual, but were willing to hazard both for a greater end. In the modern age, societies were willing to fight wars for freedom or for a cause. It was the readiness of the citizen to sacrifice his life in service of the state that, for Hegel, constituted the last phase of history, or what he somewhat portentously called 'the integration with the Universal'.1

Hegel's ideas were grounded in his own experience. He had seen the force of ideas when the French revolutionary slogan 'Liberty, Equality and Fraternity' was carried by force of arms across Europe. He had seen the Geist of freedom and the Volksgeist of the German nation forged in war against Napoleon. Both enabled him as a philosopher, or so he believed, to step outside history as it unfolded, understand its course and even glimpse its potential fulfilment. From this perspective, for example, he ingeniously, if implausibly, was able to deduce the invention of firearms. Instead of fighting face to face in hand-to-hand combat, modern soldiers were now prepared to die anonymous deaths, turning personal valour into a more abstract form.2 His bold attempt to make war an agent of transcendent history suggested that both the Geist and the Volksgeist could ultimately be reconciled through the medium of the state, and that war between states could be mediated through international law. In that sense war could indeed remain an intensely ethical activity.3 Any action that threatened the possibility of peace (e.g. the killing of civilians) would constitute a breach of Parker's etiquettes of atrocity.
He believed that (in war) relations between states are determined by 'national customs'. He was therefore able to draw comfort from the fact that the nations of Europe formed a family, a unique 'civil society'.4 Recognising the rights associated with the customs of war, he expected that parties to a conflict would respect certain rules of engagement. Only through finding the true end of their existence in their relationship to that 'civil society' could Europe's respective states achieve 'a life of self-conscious freedom'. Their discourse of war was grounded in reason, not passion. There could be no place for hatred or revenge (two of the existential emotions that make wars so difficult to control when they are given unlimited expression). In short, Hegel took his cue from Machiavelli's view that there are two means of fighting: by law and by force. The first belongs to men, the second to animals. What distinguished law from force, Hegel added, was that the former retains the possibility of peace. Even in war the peace of family and private life is inviolate.5 And although Hegel was dismissive of Kant's idea of a Perpetual Peace, he seems to have accepted its preliminary article that 'No state at war with another shall permit such acts of hostility as would make mutual confidence impossible during a future time of peace'.6

What of Clausewitz? There is no evidence that he attended Hegel's lectures in Berlin or read any of the philosopher's works (most of which were published posthumously by his students from his lecture notes). His interest was more modest.
For Hegel, war could become an instrument of human transcendence in the eyes of Reason. For Clausewitz, war could become an instrument of the preservation of the state system, provided its logic was understood. In other words, while Hegel tried to force the logic of the state to come to terms with the phenomenon of war, Clausewitz tried to make war conform to the needs of the state.7 This seems to have been the fundamental meaning of Clausewitz's most famous aphorism: that war is, or should be, 'a continuation of politics by other means'. It was assumed that the political requirements of strategy would keep war within limits. In short, ethical practices were implicit in both writers' thinking. Neither thinker foresaw the coming of a war without rules. However, that is precisely what came to pass in the two World Wars that shattered peace in the first half of the twentieth century.

To take just one example, towards the end of July 1943 leaflets began falling from the sky above the city of Hamburg. Children playing in the streets whispered that the leaflets were poisoned and ran home for fire tongs to pick them up. Reading them was forbidden but almost everybody sneaked a look. The leaflets instructed the citizens of the city to leave immediately because it was about to be attacked by Allied bombers. Nobody took any notice. It was only propaganda after all, and the war seemed far away. So they carried on as before in the glorious summer weather, visiting the circus, listening to Hungarian orchestras or watching the latest romances at the cinema. And then on 24 July the bombers came. Crowding into cellars while a firestorm raged overhead – 20,000 citizens died in under two hours, 50,000 in all in one week – did not call for the transcendent heroism of the nation. Instead, it represented the triumph of pure contingency. The triumph of the contingent in both World Wars owed much to Germany.
This ironic fact would no doubt have been unappreciated by Hegel and Clausewitz. It was the Germans who were the first to shoot civilians (5,500 in all during the 1914 invasion of Belgium); the first to bomb cities; the first to use poison gas on the battlefield; and the first to sink ships carrying civilians. All of these activities are the products of historical contingency – in this case sheer fear that verged on paranoia – and of what that fear, in turn, gave rise to. Fear remained, wrote Frederic Manning in The Middle Parts of Fortune, 'an implacable and restless fear, but that too, seemed to have been beaten and forged into a point of exquisite sensibility and had become indistinguishable from hate'.8

Nowhere was this more evident than in Hitler's war against Soviet Russia. From the first, Hitler intended to fight the war by new rules. 'The struggle will be very different from that in the West', he told 250 senior officers a few months before the invasion, and '... the leaders must make the sacrifice of overcoming their scruples'. The Guidelines for the conduct of troops in Russia issued on 19 May 1941 called for ruthless and vigorous measures against guerrillas, saboteurs and Jews. If the etiquette of atrocity is a value, the scruples we develop are the way in which we translate a value into a norm; they are the means by which we instrumentalise the values we embrace. The war in the East marked a deliberate departure from anything that Clausewitz had known or would have recognised as war. Ironically, the German Army's behaviour might have reminded him of the
Cossack units which had served the Russian Army in 1812. The Cossacks had not hesitated to strip the uniforms off the backs of French soldiers caught in the rear, leaving them to perish in sub-zero Arctic conditions. Some Cossacks had even sold French soldiers to the serfs.

The Germans began by excluding from their discourse on war certain parties whom they considered guilty of fighting outside the rules or, to be more precise, parties who had previously shown little interest in the conventions of war. This included the Red Army – not entirely surprisingly perhaps, since the Soviet government openly insisted that all international law was a bourgeois anachronism. The Soviet Union had not ratified the 1929 Geneva Conventions on POWs; nor had it expressly acknowledged the 1899/1907 Hague Laws of Land Warfare. Throughout the war it also refused to allow the Red Cross to operate behind the lines. The German government certainly considered it was under no obligation to treat the Soviet Union as it treated Britain or France, though it did (unsuccessfully) attempt to frustrate Moscow's efforts to formally subscribe to the Hague Conventions in July 1941. Even though the Soviet Union signed the conventions at the eleventh hour, the Germans acted as if nothing had changed. The notorious 'Commissar Order' of 6 June 1941 required every captured commissar to be shot on the grounds that 'hate-inspired, cruel and inhumane treatment of prisoners [could] be expected on their part'.9 The Jews were another category of potential agitators, or what were called 'bolshevist driving forces', from whom little good could be expected. Again, a case of sorts could have been made to justify such behaviour had the Nazis chosen to make it.
In their own treatment of 'class enemies' the Soviets themselves excluded groups from political life for being 'objectively criminal' – such as the bourgeoisie, the Kulaks or the 15,000 Poles killed out of hand at Katyn in 1940 (a massacre which was not known to the world until it was uncovered by the Germans in 1943). They had been killed not on the grounds that they were 'subjectively' guilty of any particular crime, but because they were considered 'objectively' criminal by virtue of being members of a criminal social group. But the Germans went far beyond excluding Bolsheviks from their discourse on war. They always insisted that war in the East involved a struggle between two races, not merely two states. Racial differences, alas, may not be real but we seem genetically predisposed to take them seriously. This was not a war in the traditional sense; it was not even a class war. It was a racial war. By a process of ideological rationalisation the Geneva Conventions were treated not as universal laws, but as laws that applied only to a particular racial group. Soviet soldiers caught behind enemy lines (even those who had no time to surrender, as was the case for many in the early months of the campaign) were deemed to be potential partisans or saboteurs. No provision was made for their welfare. As the campaign dragged on into the early winter they were left to perish of cold, malnutrition or both. The death rate (the highest in history) tells its own story. The Russians lost seven million soldiers in the course of the war. A great many were simply shot on sight, even when they surrendered. In his book The War of the World, Ferguson tells the story of one German soldier – an eyewitness to what happened.
We were mad with harassment and exhaustion ... we were forbidden to take prisoners ... We knew that the Russians didn't take any ... [that] it was either them or us, which is why my friend Hals and I threw grenades ... at some Russians who were trying to wave a white flag. [Later] we began to grasp what had happened ... We suddenly felt gripped by something horrible which made our skins crawl ... For me, these memories produced a loss of physical sensation, almost as if my personality had split ... because I knew that such things don't happen to young men who have led normal lives ... We really were shits to kill those Popovs ... [Hals said]. He was clearly desperately troubled by the same things that troubled me ... '[That's] how it is, and all there is,' I answered ... something hideous had entered our spirits to remain and haunt us forever.10
Even at the time, many soldiers complicit in these crimes recognised that the Germans were taking a terrible gamble. If they prevailed, history would probably exonerate them; if they failed, there would be a terrible reckoning. Quite apart from its illegality, the policy of shooting Russian soldiers in the act of surrendering made no sense. Seeing their comrades shot out of hand, the majority had no option but to continue the fight, however helpless their condition. Officers of the 18th Panzer Division came to the same conclusion: 'Red Army officers ... were more afraid of falling prisoner than of the possibility of dying on the battlefield.'11

The Italian writer Curzio Malaparte, whose reports from the front soon brought him into conflict with the German authorities, personally witnessed the fate of a group of Soviet POWs who were given an impromptu 'reading test' by the Germans guarding them. Several old copies of Izvestia and Pravda were distributed to groups of five men who were told that if they were literate they would be given jobs as clerks; the rest would work in the docks. Those who failed the test were sent to the left; those who passed were sent to the right, where they were marched to a wall and shot. Why? Because peasants who could read and write were too dangerous to live; by definition they were 'communists'.12

Even when Russians were taken prisoner they suffered a grim fate. Of 5.7 million Russian POWs, nearly 60 per cent died in captivity (compared with 5 per cent of Russian POWs in World War I). Two things became clear to Soviet citizens within weeks of the struggle: any fate (even the gulag) was better than surrendering to an enemy who evidently enjoyed torturing and killing Soviet citizens regardless of gender or age; and, however ambivalent many Red Army soldiers may have been towards the regime, at least it protected them and their families from something far worse.
The main reason why the Nazi discourse on war failed the regime it was intended to serve was that it was intensely un-modern. Most ethical considerations in the modern era revolve around the central question: the relationship between politics and morality. Unlike the Greeks, modern societies ask themselves the
question that defines the historic moment in which we all live: how can politics serve a moral vision of society? It was not accidental that at the very time that the French republic published the Rights of Man, the French state abolished torture. The historical association is particularly significant. It illustrates how central the ethical imperative is to the self-understanding of a modern project which sought to find the connection between politics (the domain of both the state and the citizen – the Declaration of the Rights of Man is actually the Declaration of the Rights of Man and the Citizen) and a universal moral code which was the basis of an ethically grounded political and social order. Unlike the Greeks, modern society has grounded ethical practices not on customary norms specific to individual countries, but on values recognised by all.13 The German departure from those values (from the Geneva Convention and even the unwritten codes that have governed the behaviour of armies for centuries) can be seen as an attempt to turn back the clock, to revert to a more atavistic era. It is a terrible warning from history of what happens when one loses purchase on reality. In the end, by excluding all ethical considerations from their new discourse on war the Germans allowed war to escape all limits. The war in the East, which claimed 27 million lives, is as near as it gets to Clausewitz's idea of 'absolute war'. We have laws of war – 'those mincing distinctions of polite society', as Caputo called them – because we need to keep war under control.

The Nazi discourse on war was adopted in the expectation of a quick victory. Most of the generals had expected the Soviet edifice to collapse within weeks under the 'shock and awe' of the German invasion.
When the Red Army proved more resilient than anticipated – worse, when its material strength began to dictate the terms of engagement – the Wehrmacht was only able to continue the struggle by de-modernising war on the Eastern Front, and thereby removing politics from the equation. At that point war was transformed into what Clausewitz most feared – an absolute, existential struggle. After 1943, the German Army reverted to the infantry tactics of the Great War: digging in, fighting for every square foot of ground and refusing to admit defeat. In its tactics, it showed the same grim determination that had been displayed by the soldiers on the Hindenburg Line (1917). Possibly it showed much more, for German soldiers frequently fought in conditions of physical exhaustion much grimmer than those their forebears had sustained in the last phase of World War I. And, unlike the latter, they did not collapse. On the Eastern Front war became a condition of life, a Darwinian struggle that offered only one stark choice: killing or being killed. This, in turn, made the rank and file even more susceptible to ideological indoctrination. As the historian Omer Bartov observes, the Germans substituted 'a ruthless, fantastic, amoral view of war for material strength and rational planning'.14 With its celebration of death, Germany's new discourse on war combined a growing contempt for traditional values with a powerful, nihilistic urge for self-destruction. Bartov quotes at length from the diaries and journals of German soldiers in the field: 'Man becomes an animal. He must destroy in order to live. There is nothing heroic on this battlefield ... The battle returns here to its most primeval, animal-like form.'
Another soldier wrote: 'Here war is pursued in its "pure form" [Reinkultur]. Any sign of humanity seems to have disappeared from deeds, hearts and minds.'15 Confronted with a battlefield reality that was starkly new, everything was permitted that might prevent the extinction of the individual soldier and, by extension, that of his comrades, his unit, his country and his race. As the material strength of the Soviet Union began to prevail and the German Army faced the utter hopelessness of its situation, battle became a condition that was glorified as the real, supreme essence of 'being'. War ceased to be the continuation of politics by other means. It became its own justification. German soldiers now accepted the Nazi version of war as the only one applicable to their situation. It was at this point, adds Bartov, that the Wehrmacht finally became 'Hitler's Army'.

Can the ethical blindness of the German military be entirely attributed to political indoctrination? Was that tendency not reinforced when the military surrendered itself to Nazi 'programming' (particularly after the officers took a personal oath of allegiance to Hitler in 1938)? Forty-five per cent of Wehrmacht officer candidates between 1939 and 1942 were party members. Most of the younger recruits arrived in the ranks with ideological baggage acquired in school or the Hitler Youth.16 Whether the army's faith in Nazism survived the reverses of 1942–3 is a moot question, but it continued to act barbarically – probably because the consequences of defeat were too appalling to contemplate. Faced with the evidence of their own crimes, German soldiers could not afford to contemplate defeat; instead they embraced the Social Darwinist argument that losers deserve their fate. The only way that Germany could prove that it had fought all along for a just cause was by winning, no matter what the cost.
Confronting the prospect of retribution for their crimes, many Germans might well have taken counsel from the graffiti artist who, in March 1945, scrawled on a Berlin wall the grim advice: 'Enjoy the war – the peace is going to be terrible.'

I have argued so far that the Nazi discourse on war was the product of what historians call 'a reactionary modernism', and that the Germans could only sustain themselves on the Eastern Front after 1943 by going further and de-modernising the battlefield. Yet the problem was further compounded by the reactionary insistence of many army officers that it was possible to distance themselves from the regime in what they called 'inner emigration' – that it was possible to retreat into the honour code of the Wehrmacht and thus escape complicity with the state and its crimes. The attempt was bound to fail even if the senior officers had really been innocent of crimes, which many were not. The nonsense of 'inner emigration' is vividly brought out in Hermann Broch's great trilogy, The Sleepwalkers, which offers a sweeping analysis of one of the dilemmas of the modern age: the fate of human beings facing the process of disintegrating values at home. Milan Kundera discusses one of the protagonists, Joachim von Pasenow, whose brother dies in a duel. His father claims that he died for honour, but Joachim's friend, Bertrand, is amazed that the old man could ever think so. How is it possible in an age of trains and factories that two men can stiffly stand face to face, arms extended, revolvers
in hand? Like his brother, Joachim is attached to sentiments that have no place in the new world. With uncanny prescience, Broch anticipated that the military would provide a refuge for those holding such archaic sentiments. A man who cannot live without the old values (fidelity to country, family or honour) can button himself up in a uniform as if, adds Kundera, 'that uniform were the last shred of the transcendence that could protect him against the cold of a future in which there will be nothing left to respect'.17

What the Nazi episode revealed was that it is quite impossible for an institution to keep itself clean by distancing itself from the state. No institution, even in a democracy, is ever autonomous. In 1944, George Orwell wrote an article in which he dissected this dangerous fallacy, one that was especially widespread in totalitarian societies:

The fallacy is to believe that under a dictatorial government you can be free inside ... the greatest mistake is to imagine that the human being is an autonomous individual ... Your thoughts are never entirely your own.18

It is only by relating to other people that we remain moral beings. If we choose not to relate we will no doubt act immorally, whether we consciously elect to do so or not. There is nothing inside us, no common moral core, no built-in human solidarity, which can serve as a moral reference point. What is common to all of us is the ability to feel pain; the rest is socialised into us by education. We are only moral beings in conversation with other people.
Algeria and the guerre révolutionnaire

There's an osmosis in war, call it what you will, but the victors always tend to assume ... the trappings of the loser.
(Norman Mailer, The Naked and the Dead, 1948)

One reason why Western militaries are being urged to change the way they have traditionally thought about war is its growing complexity. Perhaps the first writer to spot this was the greatest of all Clausewitz's twentieth-century interpreters, the French sociologist and political philosopher Raymond Aron. In discussing the war in Algeria in the 1950s he employed the new term 'polymorphous' to describe the extent to which the conflict kept mutating from a war of independence into a revolutionary struggle, or from a terrorist campaign into one which involved a large element of criminality. Aron was attempting to describe a military conflict which could only be understood by encompassing its many and very different dimensions. The Algerian war started with 60 indiscriminate terrorist attacks perpetrated during one night. It rapidly escalated, changing form all the time. It was not, contrary to popular belief at the time, just a national liberation war or revolutionary bid for power.

Today, were Aron to use the same term, 'polymorphous', he might choose to use it in a sense not available to him at the time. He might choose to invoke the
Table 3.1 The Algerian War (1954–62)

World War II Paradigm | Algerian War Paradigm
Specific moment and place; encounter on a battlefield | Enlargement of the spatial dimension; geographical indeterminacy of the theatre of operations
Sharply etched sequential timeframe; recognisable beginning and end of engagement | Transformation of the temporal element; simultaneous multiplicity of points of interaction; concurrent acceleration and deceleration of engagement
Well-defined actors; soldiers (as state agents), civilians | Mutation of the belligerents' identity; obliteration of combatant/civilian categories
Armies attacking armies; military targets, siege warfare, proportionality | Expansion of the nature of targets; increasing blending of civilian and military targets
Traditional weaponry; targeted use of kinetic force | Systemisation of asymmetrical warfare; amplification of the platform of combat: weaponisation of civilian assets
language of the computer age, for in Afghanistan and Iraq war shows a tendency to morph into different struggles in different theatres of operation. It is the protean nature of the conflict which caught out the Coalition forces when the insurgency began a few weeks after the fall of Baghdad – it has taken a long time for them to come to grips with its nature.19

If Aron provides the best description of the kind of struggle the US military is currently facing in Iraq, and presumably will continue to face in the future, he also provides an example of what happens when a country changes its discourse on war to fit the realities on the ground (as the French found in Algeria). The French are an intellectual people given to theorising. They are also heirs to a revolutionary tradition, and Algeria evoked a revolutionary response: a new discourse on war cast in a revolutionary language. They fought a 'guerre révolutionnaire', a phrase first coined by the Bureau of Psychological Warfare. In recognising that the guerrilla war was revolutionary, the French high command tried to compete with it by conducting a 'revolutionary war' of its own.20

In Algeria the French had to contend with everything the Americans have had to confront in recent years except suicide bombers (a tactic pioneered by Hizbollah in the early 1980s). Thus the majority of those targeted by the insurgents were not French soldiers, or even white settlers, but ordinary Algerians (especially those deemed to have 'collaborated' with the colonial regime by joining the police or the military). The FLN deliberately targeted non-belligerents by placing bombs in restaurants, dance halls, coffee shops, sports stadiums – sites frequented by young Europeans. The bloodshed was sickening, as many contemporary accounts reveal:
I can still see that beautiful young girl of eighteen with both legs blown off, lying unconscious, her blonde hair stained with blood ... fragments of feet visible in shoes lying aimlessly in the rubble.21

Such attacks were intended to drive home the unconditional nature of the fighting. Often the houses of Muslim non-commissioned officers on active service were put to the torch. Such atrocities were intended to drive the Muslim population into the extremist camp and to elicit an equally brutal response from the French.

The key words of the guerre révolutionnaire were 'protection', 'commitment' and 'supervision'. To protect the people against insurgents, groups were herded into cantonments. Algeria had its own euphemisms for these – such as 'camps d'hébergement' (shelter camps) or 'camps de regroupement' (transit camps) – in which up to two million Algerians were resettled. The 'zones interdites' (forbidden zones), equivalent to the 'free-fire' zones applied by the United States in Vietnam, were also in common use. Protection amounted to forced internment for many Algerians, who were often subject to particularly harsh treatment. Commitment required the population to be 'protected' so that they would continue to remain onside. Supervision required those 'protected' to be monitored regularly. Of the three terms, commitment was the cruellest, as it required information to be extracted regularly, even by torture. As part of the new discourse, torture was no longer punished in the courts in the few cases where soldiers were brought before the authorities. Before the practice was finally abandoned in 1958 under public and international protest, it had become institutionalised as a revolutionary response to a revolutionary reality. Far from being merely expedient or, in the language of the time, 'necessary', it became central to winning the war. Torture began in Algeria gradually, as it usually does.
Unethical practices are rarely introduced de novo. A society regresses into barbarism slowly, not overnight. In this respect, the French experience is much more representative of contemporary democratic practices than the German experience. A pluralistic society has to balance competing claims from different agencies, and what is considered ‘necessary’ (if regrettable) is usually a judgement not of the government, but individual agencies at different times.22 Torture, in short, is like cancer in that it metastasises. But there is always a primary tumour. In this case, it was the police who began to randomly torture prisoners during their interrogations. Within two years, as the emergency worsened, police powers were formally signed over to the military. In time the judicial atmosphere became more permissive and the use of torture came to be widely accepted within French government circles. A government report called for ‘safe and controlled’ interrogation techniques which included the use of electric shocks and the so-called ‘water technique’ (holding a person’s head under water until he nearly drowned). Despite military protestations that the use of ‘hardball’ interrogation techniques would be limited to ‘exceptional cases’, it is estimated that 40 per cent of the adult male Muslim population of Algiers (approximately 55,000) was tortured or threatened with torture between 1956 and 1957.
True to their civilising mission, the French authorities stopped short. Because the main effects of such techniques were considered to be psychological rather than physical, they were also considered to fall short of torture as it is traditionally understood.23 One of the principal commanders, General Massu, even claimed that, when practised by his men, torture left no permanent damage. Indeed, on seeing one of his former victims interviewed on the steps of the Palais de Justice in 1970, he exclaimed to reporters: 'Do the torments which he suffered count for much alongside the cutting off of the nose or of the lips, when it was not the penis, which had become the ritual present of the fellaghas to their recalcitrant "brothers"?'24 His remarks have a contemporary resonance when we read of foreign contractors being beheaded in Iraq today. Yet they should also bring us up short, for what he was describing would (in today's jargon) be called 'torture-lite' – punishment without long-term consequences. In principle this may sound fine; in practice there are always consequences, especially mental ones, for those who suffer the torments others impose on them.

For the authorities torture had two purposes. Instrumentally, it was used in interrogations to elicit information that might save lives or prevent future killing. At the time Massu's chaplain Delarue called it 'the lesser of two evils'.25 It was also expressive: it was designed to send a message, to deter the local population from supporting or giving information to the FLN. The normative context changed. Even this was defended by Delarue: 'Here, it is no longer a matter of waging war but of annihilating an enterprise of generalised, organised murder.'26 The object was not winning hearts and minds so much as frightening the Algerians into accommodation. Was either form especially effective? In the short term, the first probably set back the insurgent campaign a few months.
This was achieved by a variety of tough measures directed at seizing control of the kasbah in Algiers itself. Thousands of Arab youths were taken away for questioning. Of these, about 3,000 never re-emerged. Every form of degradation, including rape, was practised. Electric-shock treatment was especially severe. Before his death in 2002, however, Massu himself admitted that the campaign had achieved little. ‘We could have done things differently,’ he conceded.27 To be fair to the French, they believed they had a duty to face reality with an unremitting and courageous determination. Every discourse on war, they argued, should be empirically rooted in experience. In this case terror was to be matched with terror. But the notion that it is possible to supervise limited torture effectively during a war is absurd. In the course of the crackdown in Algiers, 3,000 suspects ‘disappeared’. The city may have been ‘purged’ of rebels but, as in Fallujah in November 2004, the revolutionary leaders themselves had more than enough time to escape capture. Ultimately, the French response was never codified, and never really thought through. It represented what Robert Lifton calls a ‘grotesque improvisation’. It was more an improvised script than a revised, intellectually coherent discourse.28 The problem ran much deeper, however, than mere operational effectiveness. ‘One does not fight a revolutionary war with a Napoleonic code,’ insisted Colonel Lacheroy, a leading proponent of Massu’s revolutionary warfare.29 But that was
precisely the point – in the end it was dangerous for the French to turn their back on their own revolutionary tradition. In the words of Jean-Paul Sartre, it was to express ‘bad faith’ in their own historical tradition. It was to be inauthentic. The vocabulary of Sartrean existentialism, the popular philosophy of the 1950s, was inherently ethical for it deepened the traditional moral injunction to ‘respect others’. We respect others in war when we accord them the status of POWs; when we renounce degrading or humiliating punishments; when we allow even the defeated dignity from which, in turn, they can continue to draw a measure of self-respect. The question for existentialists has never been whether respect is a basic moral value. Rather the question has been how to provide an existential account of the value of respect.30 Morality derives from the sense that selfhood is a deeply social affair and that we confirm our social identities only through recognition of others. Most of us would rather be respected than feared. However, in some societies fear is the optimum value. It is the basis for how those societies are run. But in the West, stemming from a very strong sense of individualism and professionalism, esteem is largely derived from the respect one earns or ‘wins’. Sartre called this authenticity – the high premium placed on self-awareness and understanding. Existential virtue, in other words, presupposes a high degree of self-knowledge. To be an authentic person is to be a person who acts in good faith, respecting others in order to respect oneself even more. Sartre viewed the self as having two dimensions. It is objective in that our social selves (or what he called our ‘being-for-others’) consist of the qualities others see in us. Being seen by others has a deep effect on our personality – at times it can be uplifting, at others, not.
The subjective dimension involves agency as we must remain agents of our own acts. We are in trouble when we identify wholly with ourselves as a subject and thus deny our objectivity (living through other people), or when we identify wholly with ourselves as an object and thus deny our subjectivity (our responsibility for the consequences of our own actions). For Sartre, both involved bad faith. In the first case bad faith involved the denial of one’s subjectivity, i.e. the recognition that our actions have consequences for others as well as ourselves. A society which acts without recognising that its actions have consequences for others shows bad faith. Such a society restricts the implications of its actions to the present, thereby refusing to accept that the future is also important. More to the point perhaps, such a society fails to accept that the present is important only insofar as it is the place where the future is being forged. The second type of bad faith is the denial of one’s objectivity. One must remain true to the past – as well as to the future – and to one’s own nature if one is to act in good faith.31 As Rorty would contend, to be a liberal one must act like one. Failure to do so brought France bad press in the rest of the world, particularly in the US where the young John F. Kennedy made his reputation as a senator by denouncing French colonialism. The French paid a penalty for not keeping faith with the past. Let us take the case of Pierre-Henri Simon, a reserve officer who spent five years in a German
prisoner-of-war camp and later became literary critic of Le Monde. After a particularly brutal FLN massacre he admitted it was natural to feel indignation when French soldiers were killed, but the ‘intimate suffering’ felt at the spectacle of tortures carried out by Frenchmen themselves was even worse. The former sentiment is the expression of a sociological and visceral patriotism which is awakened when France suffers ... the latter is the expression of a patriotism of values and responsibility that is awakened when France sins.32 One of the first acts of the revolution was to abolish torture in 1789. When torture was reintroduced in Algeria, it represented a break, or rupture, with France’s revolutionary heritage. It was an act of bad faith, for the state was a creation of the revolution of 1789 whose central document was the Declaration of the Rights of Man and the Citizen. Nationality and citizenship came into the world at the same time. The two have reinforced each other ever since. In the words of one historian, because France, especially the France of 1789, incarnates universal values, to be a patriot in France is to be a patriot for peoples and nations everywhere. ‘Far from being incompatible with universalistic human sympathies,’ he continues, ‘it is – for a Frenchman, though not for a German – the very condition of their realisation’.33 In other words, the rise of human rights was a response to the rise of nationalism, just as nationalism was a response to the Enlightenment’s preoccupation with human rights. It was deeply embarrassing to be reminded of what had transpired in France itself during the recent German occupation. Reviewing Henri Alleg’s famous exposé of the Algerian issue, Sartre wrote that in the Gestapo’s Paris headquarters, screams of pain were often heard. No-one knew at the time what would be the outcome of the war. ‘Only one thing seemed completely impossible – that men would ever be made to scream like that in the name of France.
But the impossible can always happen’. From which we can conclude, in the manner of a classical moralist, that we never learn from other people’s mistakes.34 Camus was one of the first to make the connection between the Algerian War and the German occupation of France. ‘The facts are these, as obvious and ugly as reality: In Algeria we are doing what we blamed the Germans for doing to us.’ What Edward Said wrote of Palestine 40 years later might have been said of the Algerians at the time: they had become ‘the victim’s victims’.35 The special problem with Algeria was indeed that it linked the French back to the German occupying forces. General Paris de Bollardière, Grand Officer of the Legion of Honour, recalled his own experience against the Nazis and forbade torture in the sector under his control. While in civilian clothes, he once overheard officers boasting about their efficiency: ‘In Algiers now, there’s nothing but genuine chaps, the [Foreign] Legion, fine big blond fellows, stalwarts, not sentimentalists.’ Bollardière replied, ‘Doesn’t that remind you of anything,
les grands gars blonds, pas sentimentaux (meaning Aryan-like guys who act without pity)?’36 Like many of his fellow officers he knew that the counter-terror tactics he was asked to apply were not militarily necessary and that suspending the laws of war, even in Algiers, was morally impermissible. It was an act of ‘bad faith’. Bollardière went on to lodge the only formal protest against torture by a general officer. Receiving no satisfaction, he requested to be relieved of his command. Charged with violating military discipline, he served 60 days under arrest, after which he retired from the service. Bollardière once protested to Massu: ‘If the leadership yielded on the absolute principle of respect for human beings, enemy or not, it meant unleashing deplorable instincts which no longer knew any limits and which could always find means of justifying itself.’ 37 Looking back at the whole sorry episode, one can see that one of the most profound observations about it was penned by Romain Gary in the immediate aftermath of World War II: ‘When a war is won, it is the losers, not the winners, who are liberated.’ This was certainly the case with the German people. In 1945 they were liberated from the illusion that they were the master race, the ‘indispensable’ nation. They were liberated from the idea that their actions, however morally questionable, were sanctioned by history. Many may not have been reconciled to defeat immediately, but in time most welcomed their return to Kant’s ‘Republic of Europe’. The French, in contrast with the Germans, forgot that it is not only the bad who commit atrocities or war crimes, even if the crimes of the good are usually on a lesser scale, largely because they are publicly debated. The lesson was learned nevertheless.
Within two years of its introduction the guerre révolutionnaire was discredited, even though it would be another forty years before its main architects were prepared to admit that it had been imperfectly conceived. France remained true to its liberal values. What liberalism tells us is that history is ironic; defeats are never final. France’s defeat meant victory for the rest of us. This is what Camus meant when he wrote that the irony of history is that the defeated are sometimes liberated by their defeat.
Israel and the intifada

Inevitably, this brings us to the most recent example of a country changing its discourse on war to meet what it feels to be a demand of the times. Until recently academics have tended to disregard the significance of cultural ways of warfare – not only because of the danger of portraying other societies in an unfair light but also because the very notion of military culture is notoriously difficult to pin down. This is unfortunate because incorporating the cultural dimension into their studies would have provided scholars with an additional understanding of the factors that they do focus on (political systems, military capacities, the skills of particular leaders, etc.). In short, there is nothing deterministic about culture, but it must be considered, among other factors, when exploring why a society chooses to adopt a particular military discourse. Israel, for its part, has significantly refashioned its former discourse on war in an attempt to come to terms with a new
reality. This has meant abandoning the Western discourse which it embraced on independence in 1948. ‘Our general ideas about the conduct of battle,’ claims the historian Victor Davis Hanson, ‘have not changed much ... from those practised by our Greek ancestors.’38 In other words, it is not that there has been no change in the political, economic or indeed military environments since the fifth century BC, but rather that throughout history Western societies have displayed certain attributes in warfare which, originating in ancient Greece, collectively constitute what can be considered a unique ‘Western way of war’. Among the most important of its features is strong citizen participation in the military. The kind of ‘civic militarism’ developed by the city-states of ancient Greece was unique in that it provided its soldiers with a sense of obligation unknown to their non-Greek contemporaries. The city was both governed and defended by its citizens. Military service was the main source of civic virtue.39 Undoubtedly, the ideas of democracy and the rule of law have played crucial roles in this context. Not only is citizenship central to the Western way of warfare, but soldiers joining the army as equal citizens have been conscious of their rights and responsibilities. This combination of civic militarism and democracy involves another distinctly Western feature – namely the high value attributed to self-critique and dissent. Traditionally, citizen soldiery has gone hand in hand with civilian control of the military. Indeed, no armies outside the West have been subject to such a high degree of audit or inspection, from the impromptu committees set up by members of the Ten Thousand who marched with Xenophon (which we can read about in the Anabasis) to the mass anti-war protests witnessed in Washington in 1969, in Tel Aviv in 1982 and in London in 2003 in the run-up to the Iraq War. Hanson’s thesis has met with criticism from his fellow historians, including John Lynn.
In his colleagues’ eyes his thesis tends to reify Western culture. They also see it as obscuring the fact that some of the features considered typically Western can be found in non-Western militaries at different times, and that others are not even found in the West except at certain periods of its own history. It is not necessary to buy into the thesis, however, to substantiate the point I am trying to make. In some shape or form since the late nineteenth century and their major encounter with the non-Western world, Western armies have tended to subscribe to something very similar to Hanson’s thesis. The thesis has especial attraction in the US for those who believe that the countdown to the ‘clash of civilisations’ has already begun. It also has particular appeal in one non-Western society: Israel. Some Israelis date the clash to 16 April 1993, when a 22-year-old Palestinian packed a white Mitsubishi van with an improvised bomb and blew himself up, perpetrating the first suicide bombing in the decades-old Israeli–Palestinian conflict. In addition to its own unique social organisation, the Israeli military has put a premium on a distinctive approach to battle tactics. Like most Western societies it has shown a strong preference for direct, face-to-face shock battle. Again, this approach was pioneered by ancient Greek soldiers for whom quick and decisive
battle simply ensured a speedy return to their place of work. Harassing, checking or temporarily discouraging the enemy from attacking was never entirely satisfactory. It was preferable for adversaries to be met with full force at close range. The ruthlessness with which the Greeks engaged in such decisive battles genuinely shocked much of the ancient world. Similarly, Western armies have preferred to overpower their enemies enthused by a Clausewitzian ethos – the idea of war as a duel between equals. Israel has imbibed much of this thinking. It is perhaps ironic that Theodor Herzl, the founder of modern Zionism, could not have been further from the truth when he declared that: The Jewish state is conceived as a neutral one. It will therefore require only a professional army, equipped, of course, with every requisite of modern warfare, to preserve order internally and externally.40 Needless to say, Israel has never been a neutral state. Instead, it has found itself involved in war from its very birth, yet the Israeli Defence Force (IDF) has never been a fully professional army. Its Permanent Service Corps represents the smallest of its three components. Its second component is compulsory conscript service into which all Jewish citizens – female and male – are drafted at the age of 18 for a period of two or three years, respectively. Among Israel’s non-Jewish population, Druze and Circassian males also serve, while some Bedouin and Christian Arabs volunteer. But by far the largest component of the Israeli Army is made up of reservists. By law, all eligible male Israelis continue to serve in the IDF for several weeks per year until they are well into their forties.
More than anything else, it is this combination of universal conscription and a rigorous system of reserve service which makes the IDF the citizen army it still is.41 Unlike in other societies in the Middle East, the boundaries between civil and military life are vague. This means that neither soldiers nor their officers are easily identifiable as a special ‘caste’ different from the rest of society. If Hanson argues that one of the most striking features of ancient Greek armies was the fact that ‘philosophers ... marched in file alongside cutthroats’,42 the same is also true of present-day Israel where citizens from all parts of society serve side by side (including young Israelis with minor criminal records). In this sense, the IDF probably provides the clearest example of a citizen army left in the world today. The IDF’s commitment to a functioning democratic society similarly reflects a specifically Western cultural heritage. The prominence given in Western military cultures to democratic structures gives rise to a related phenomenon: the acceptance, and sometimes even encouragement, of dissent and debate. Specifically, the media has become highly critical of the army over the past decades, shattering the ‘conspiracy of silence’ which existed prior to the loss of faith in the IDF’s capabilities following the Yom Kippur War (1973). The events of the Lebanon War of 1982 produced the largest street demonstrations in the country’s history.43 Subsequent actions – especially towards the Palestinians – throughout the 1980s
and 1990s occasionally provoked massive public discontent. During the Al Aqsa intifada a reluctance to serve by some senior reserve officers in the Occupied Palestinian Territories captured media attention. The IDF’s distinct tactical preference for decisive battle can be traced back to a geographical lack of strategic depth. Israel must, of necessity, rely on offensive force at the operational and tactical levels. The country is so small that it simply cannot allow a war to take place on its territory. One of the main tenets of its security doctrine is thus to fight beyond its borders. This may require pre-emptive and preventative strikes (as were undertaken in both the 1956 and 1967 wars). More importantly, the realities of small territory, limited manpower and (especially) the dependence on reserve forces create a significant need for short wars, decisive victories and a quick return to peacetime. It is no surprise that the IDF exhibits a strong preference for offensive manoeuvre warfare. All this has changed and, from Israel’s perspective, not for the better. From the moment the Israeli military found itself bogged down (as it did in the Lebanon in 1982), it began to face quite new and distinctive ethical conundrums. Since 1982 the environment in which it has found itself operating has introduced its own moral quandaries – the most immediate of which was siege warfare. In the case of street fighting in Beirut the IDF found itself faced with the dilemma of targeting non-combatants. It seemed to many of the soldiers stationed on the heights around Beirut that the IDF’s self-image as a defensive force was being demolished along with whole neighbourhoods of Beirut.44 The line between firing at terrorists and firing at civilians became alarmingly narrow. The sight of high-rise buildings collapsing under intense shelling fuelled a bitter moral debate.
The IDF took great pains to prepare a gigantic map on which every one of Beirut’s 25,000 buildings was numbered so that bombing could be ‘precise’. Even so, artillery barrages did not spare civilians. The army tried to encourage the civilian population to leave the city. But even in such heavily and systematically shelled quarters as Bourj el-Barajneh and Sabra, tens of thousands of civilians chose to stay in improvised shelters rather than leave their homes behind. Back in 1982, in the case of artillery and air strikes, it was not possible to achieve the ten-metre accuracy that can be achieved today against a Global Positioning System (GPS) coordinate. They also found – as is still the case today – that significant numbers of weapons go astray, that modern sensors cannot tell the difference between many types and uses of military and civilian vehicles, and that a civilian often looks exactly like an insurgent or terrorist. Even with individual buildings numbered, mapping all potential targets is next to impossible, as is the real-time location of civilians. They also discovered that civilians are the natural equivalent of armour in asymmetric warfare. Since then they have had to get used to the fact that their opponents have steadily improved their ability to use civilians quite cynically in order to hide their own front line forces (as they did
especially successfully in 2006). By that time Hizbollah was much better armed, as well as arguably better motivated, than the PLO. In 2006 they found that their Merkava tanks, the steel monsters built for confrontation with Arab armed forces rather than guerrillas, were defenceless against anti-tank missiles; guerrilla units using fibre-optic communications were much more difficult to disrupt. Given this new reality it should not be so surprising that the Israelis have begun to rethink their discourse on war. In recent studies, scholars have pointed out that the defining characteristic of civil–military relations in Israel (the citizen army) is slowly fading away. Israel’s ‘nation-in-arms’ – one of the primary pillars of its Western character – has been replaced by a leaner and meaner force which is gradually becoming a professional army. The recruitment of IDF conscripts has become progressively more selective and discriminating. In the process an increasing number of young Israelis has been granted exemption from military service. Beginning in 2008, reservists will only be called for training in an emergency, not for operational activities. Also, non-combat soldiers will not serve in the reserve army at all after being released from compulsory service. Furthermore, reserve duty is to last no more than two weeks a year and will end at the age of 40. This age limit is likely to be further lowered in the future.45 Other changes reveal the extent to which the Israeli military has moved from a purely instrumental to a more expressive use of force – at least since the violence that erupted in September 2000, which became known as the Al Aqsa intifada. Strikingly, the army has specifically turned a large number of its operations in recent years – mostly, but not exclusively, against Palestinian targets – into retaliation strikes for previous terrorist attacks.
The large-scale destruction of Palestinian Authority offices has gone beyond the fulfilment of immediate military goals. Such operations appear to have served the additional, more symbolic or expressive purpose of sending a message to Palestinians – militants and ordinary citizens alike – that violence against Israeli citizens will be met with a ruthless response. Furthermore, the violence of previous years has corresponded with a trend within the IDF to step up its use of covert military operations. Since the 1980s, the IDF has boosted the allocation of resources for undercover Mista’arvim (Arab-Masquerader) units whose purpose is to infiltrate terrorist movements and kill their leaders. The IDF makes use of troops who are fluent in Arabic, a large proportion of whom are recruited from Israel’s Druze and Bedouin minorities.46 The high number of terror attacks which are thwarted by the IDF before they can be carried out makes plain just how crucial these covert operations have been in counter-terrorist efforts over the past years. Perhaps more striking still, the IDF seems to have adopted the maxim that successful armies need to instil ‘terror in the hearts of their enemies’. This maxim can be seen in a number of policies which have become fundamental to countering terrorist operations during the intifada. The best known of these is what the Israeli government calls ‘targeted attacks’ or ‘extra-judicial executions’. These have taken many forms, ranging from the detonation of terrorist leaders’ mobile phones to helicopter strikes in densely populated areas. Crucially, the IDF has
opted not only to strike at purely military ringleaders, but also to take out a number of individuals whose function is partly or wholly political. The assassinations in 2004 of Sheikh Ahmed Yassin, the spiritual leader of Hamas, and his successor, Abdel-Aziz al-Rantissi, are cases in point. While these operations drew widespread condemnation in the outside world, Israel maintains that they significantly weakened the infrastructure of the movement. Further IDF policies also seem to be at least partially motivated by a desire to harass the enemy population as opposed to achieving directly observable military goals. Between September 2000 and February 2005, 675 Palestinian houses belonging to the families of recent suicide bombers or terrorists were deliberately bulldozed (a practice officially halted at the end of this period). The IDF also periodically issued what it termed ‘assigned residence’ orders to forcibly transfer the close relatives of terrorists from the West Bank to the Gaza Strip. These operations are clearly expressive; they are intended to send a message about the consequences of further terrorist attacks. Finally, Israel’s Supreme Court decided that in view of the Jews’ long history of suffering it was legitimate to employ coercive measures to protect Israeli citizens in extreme circumstances. Established by the Israeli Government in 1987, the Landau Commission advocated ‘the lesser evil’ doctrine which holds that in moments of emergency the state can commit immoral acts for the greater public good. The Commission went on to reject the idea that torture be practised ‘outside the law’, preferring legal authority instead. The different types of torture now permitted still exclude: any harm to the person’s honour or dignity; lasting physical or psychological harm; and disproportionate measures.
In other words, torture is prescribed by certain rules or procedures which have become the basis for some public defences of torture in the West – both consciously, as in the case of the Harvard Law professor Alan Dershowitz, and unconsciously, as in the case of some of his supporters.47 Inevitably, the Israeli reaction to the changing reality of war has aroused intense international criticism. Some bizarre comparisons have been drawn. Perhaps the key event, if only for its symbolic importance, was the Israeli operation in the West Bank city of Jenin in the spring of 2002. In many respects, the house clearance operation in search of Palestinian terrorists was an exemplary case of house-to-house fighting in which collateral damage, while great, was not as great as portrayed in the Western media where it was made out to be another siege of Grozny – the city the Russian Army ruthlessly flattened in the 1995 war. That association would have been unfair enough, but even worse was the association of the names ‘Jenin’ and ‘Auschwitz’. One author came across 2,890 references to the two events while searching on the web, and a further 8,100 references to ‘Jenin and Nazi’.48 Little sympathy was expressed for the suicide bombing blitz that Israel was forced to endure in 2003–4. It demonstrated, on the part of the bombers, what Paul Berman calls the ‘lowest rung of nihilistic despair’.49 To employ an appropriate biblical metaphor, have the Israelis reaped what they have sown? Bin Laden’s deputy Ayman al-Zawahiri continues to claim that the repressive campaign waged against the second intifada in the autumn of 2000
provided Al-Qaeda’s opportunity, as the corpses of dead children piled up. Here was a rallying call that could unify the Muslim world. Bin Laden himself has claimed that it was the Israeli bombing of Beirut that first radicalised him. The sight of city tower blocks collapsing may even have inspired him to attack the World Trade Center 20 years later. All oppression – whether perceived or real – is alienating. In his book The Looming Tower, Lawrence Wright is especially perceptive on the role that torture in Egyptian prisons played in transforming both the principal Islamic ideologue Sayyid Qutb and his star pupil al-Zawahiri into ‘violent and implacable extremists’.50 Actions have consequences even if the consequences are not always thought through. Israel is an excellent example of a society that has changed its discourse of war in a Darwinian attempt to adapt to a changing external environment. Even so, it is important to add that retaliation strikes, for example, have been employed for a long time in response to terrorist attacks. In 1953 Special Forces reacted to the killing of three Israeli civilians by entering the West Bank village of Qibya. The operation aroused much criticism at the time because 69 civilians were killed. Similarly, targeted assassinations can be traced back to the beginning of anti-terrorist operations and have formed a continuous part of Israel’s military policy for decades. Then again, some of the trends identified – such as the house demolition policy – have been short in duration and terminated because of criticism, particularly within Israel itself.51 It could also be argued that the IDF’s traditional Western character has been determined by inescapable strategic necessities. Thus the strong preference for offensive manoeuvre warfare is a direct product of geographical necessity.
Lacking the kind of impenetrable borders of Switzerland, let alone the territorial strategic depth of Russia, Israel simply has had no choice but to engage in decisive shock battles. Similarly, its critical lack of manpower has required it to adopt the model of a nation-in-arms. All that I am arguing is that the strategic environment in which Israel has found itself since 1982 has required a systemisation of new methods which, in turn, may reflect two radical challenges which Israel has had to confront. The first is technological. The new conflict environment has robbed Israel of the benefit of the technological advantage it has (and still retains) over neighbouring armies. Merkava tanks offer no defence against Hizbollah units. Second, the country confronts new demographic changes within its own borders. One may be tempted to conclude that much of the growing ruthlessness is due to the growing role and prominence within Israeli society, and particularly its military elite, of Sephardi Jews originating from Arab countries. Shaul Mofaz, the former Defence Minister, was the first Sephardi Chief of Staff during his tenure in the late 1990s, and Israel’s former President Moshe Katzav was also the first Sephardi to hold this position. Israel is no less a tribal society than its neighbours. It has a cultural mix of secular Ashkenazim from Eastern Europe, Orthodox Jews, recent immigrants from Russia and Arab-Israelis, in addition to the Sephardim. In the past 15 years Israel’s strategic current has been flowing not in the direction of the designer’s dream of a greater Israel, but in the opposite
direction, ever deeper into the exploration of inner space and its own identity. In confronting the reality of war it has been forced to confront its inner reality as well. Israel is certainly not the state that its first president knew at the time of independence in 1948. Yet the military confronts a disturbing reality. After the re-invention of Hebrew as a spoken language, the Army is by far the most effective element of a common Israeli identity. However, this triumphantly efficient body of armed citizens with (until recently) an unbroken record of military success has only served to transform the once-despised Palestinians, fleeing from the land in their tens of thousands in 1948, into thousands of militant ‘young lions’. Israel’s recent history confirms what we have known from earlier years: its policy towards its neighbours is driven by geography and the history of the Jewish people. Its will to survive has led to repeated success in warfare but to failure in war; it has failed to wage warfare of a kind that would lead to a lasting peace. One reason for this is ethics. It has alienated the Palestinian population more than ever and in doing so it has lost its strategic touch.
Conclusion

All the examples cited above exemplify a simple lesson: if a society is to wage war successfully it must first understand how peace is made. In this respect all the strategies I have discussed were intensely self-defeating. None of the parties involved heeded the Kantian injunction contained in the second formulation of the Categorical Imperative to treat others as an end, never merely as a means to an end. Indeed, the Imperative tells us that we should always act in such a way as to treat humanity in ourselves (as well as in other persons) as an end. It is this double injunction that is so often missed. What Kant is arguing is that it is our capacity for reason which confers value on us as a species – whether it is our capacity to act for principles, our capacity to set ends or our very predisposition to act morally. When we suspend the Imperative we have to find reasons to exclude others as ‘subhuman’, culturally ‘inferior’ or just plain ‘evil’. We have to deny ‘them’ the ‘reason’ we identify in ourselves and in those who, in our eyes, merit being treated as fellow human beings. We have to find reasons to deny others the dignity which we claim as a species and which allows us to make claims on others; above all the claim that we should be treated with respect. It is always easy, of course, to find a justification for not respecting others – especially when our enemies are seen to forfeit that respect by their own actions. What respect can we accord those who, like the Vietcong, openly despised the Geneva Conventions, or others who, like Hizbollah or Hamas fighters, are openly dedicated to the elimination of Israel? But Kant insists that there are reasons to treat human beings with respect even when they are not worthy of it. When we disrespect a particular enemy we are in danger of extending the practice to others. No doubt the world expects more of liberal societies than it does of others. If this is a case of applying double standards,
we should always remember that they are our standards. Rorty would contend that we stand or fall by the principles which we ourselves try to export. To export values is to court world opinion. It is because of our belief that what is true for us should be true for others that we open ourselves to condemnation when we act, or are seen to act, ‘out of character’. But there is also a more pragmatic reason to subscribe to Kant’s Imperative. To deny respect to others undermines our chances of reforming them or persuading them to rethink their position. Whether they deserve to be treated well is not the point. In the long term we have rules to resolve conflicts, not amplify them. We have rules to ensure that we do not act in such a way that they feel boxed in or so resentful that the only avenue to self-respect is to fight on. A side that can preserve something of its dignity is more likely to accept defeat. More important still, by respecting the humanity in ourselves we are less likely to demonise our enemies or remove them from the human race, thereby locking us all in an endless cycle of violence. In denying them respect we diminish the chance of bringing the conflict to an early conclusion. We are in danger, in short, of ignoring one of Clausewitz’s principal insights: wars are only won when the losing side is prevailed upon to accept defeat. Rorty had little time for Kantian Categorical Imperatives. He saw them as metaphysical devices or myths which muddy the waters and obscure what really matters. But, in this respect at least, it is likely he would have stood in the ranks with Kant even if he could never be accused of being a Kantian. Like Rorty, I have little belief in categorical imperatives that are grounded in metaphysical speculations. But it is well worth invoking another of Kant’s imperatives that is founded on scientific reason.
He called it the Hypothetical Imperative: ‘You must behave in this way to achieve that result.’ The Hypothetical Imperative is a favourite with scientists because to achieve any result in science you must adhere to certain ethical practices. You must adopt an ethic for science that makes knowledge possible. As one scientist has written:

The very activity of trying to refine an enhanced knowledge – of discovering ‘what is’ – imposes on us certain norms of conduct. The prime condition for its success is a scrupulous rectitude of behaviour based on a set of values like truth, trust, dignity, dissent and so on ... In such societies where these values did not exist, science has had to create them to make the practice of science possible.52

All scientific endeavours are highly ethical for that reason. So too is war. Unlike laboratory technicians, however, the military is not bound by the same self-enforcing code: the imperative of precision and perfection in performance, underwritten by the knowledge that inauthentic results or corrupted data will end in technical failure. Tactically, unethical behaviour on the battlefield can lead to temporary success, but history is replete with examples of how tactical success can end in a strategic dead end. Military history illustrates especially vividly how even a string of military successes in the field can lead all the way to strategic ruin.
The Germans paid the price for departing from their traditional discourse. The example I have taken is that of the mistreatment of POWs, which made no sense militarily; Soviet soldiers merely fought all the more desperately to escape falling into their hands. For their part, the Germans, recognising that they had committed criminal actions, were so fearful of retribution that they had no recourse but to fight on when all was clearly lost, at enormous cost to themselves. The statistics tell the story: almost twice as many German soldiers died after the July plot of 1944 as before, when it was still possible to entertain some hope of ultimate victory. In the case of France, we find a country that failed to think through the guerre révolutionnaire. It was improvised gradually and was indeed a ‘grotesque improvisation’. It did not constitute a well-thought-out strategic paradigm for a new kind of war. It was not instrumental, but largely expressive – designed to send a message, to communicate intent. But this is not the Western way of war, and certainly not a French one. For a brief moment the French army failed to keep faith with its own revolutionary tradition. Bad faith usually leads to a bad conscience because it mocks that which is most important to you. In the case of Algeria it was an affront to the national principle, an infringement of the Rights of Man. In 1958 the French retreated and chose to remain faithful to the ideas of the Revolution. They chose to put ‘greater France’ at risk in order to remain standard-bearers of Western values. It was a tough choice. It is small wonder that, at first, the Fourth Republic’s nerve failed. Yet in keeping faith with the nation’s own revolutionary principles France ultimately chose to remain in touch with its ethical first principles. As for Israel, it is the most timely case of the three. It is contemporaneous with the War on Terror. Indeed, some writers would trace the beginning of the conflict to 1982–3.
These years saw the first major terrorist campaign of the modern era. Inspired by a revolutionary change of government in Iran, it was grounded in part in the appearance of a new foot soldier: the suicide bomber. What Israel would claim, of course, is that the logic of this war is quite different from what the French experienced in North Africa or what their American allies experienced in Indo-China 20 years later. War is now networked. It involves a different kind of violence or, to be more precise, a different grammar of killing. Does it therefore demand a different etiquette of atrocity? It is to this question (which is not to be dismissed out of hand) that we must now turn.
4
A new discourse
Excluding from the discourse on war: from Guantanamo to Abu Ghraib

One day in the autumn of 2003 General Mattis was walking out of a mess hall in Al-Asad in western Iraq when he saw a group of soldiers intently watching a cable news show. ‘What’s going on?’ he asked. He soon learnt it was the first revelation about Abu Ghraib. A 19-year-old lance corporal glanced up from the television and told him, ‘Some assholes have just lost the war for us’.1 Mattis was a tough soldier of the old school. ‘Be polite, be professional, but have a plan to kill everyone you meet,’ was one of the rules to live by, he instructed his young officers. Later, he would get into trouble for saying that he found it fun to shoot some people. ‘Actually it’s a lot of fun to fight ... I like brawling.’ 2 Broadcast on television, his unguarded remark threatened to compromise his career. The programme did not report his other remarks that day: that you should never underestimate your enemy, or it will be difficult to break his will. Moreover, he insisted, killing in the field was not the equivalent of humiliating people in prison. To humiliate the man you fight is to shame the society from which he comes. It is contrary to every prudent military rule. In counter-insurgency it is particularly dangerous, as even a 19-year-old soldier could recognise. It is the way to lose a war.3 The trouble began in Abu Ghraib in October 2003 when the 372nd Military Police Company, an Army Reserve unit, took over the prison. Abuse had been going on there for some time. The ratio of military police to inmates was very low: at Guantanamo it was one to one; at Abu Ghraib, one to 75. Even if there had been no ‘bad apples’, abuse would have been inevitable. As it was, demoralised at being kept on in Iraq for longer than expected, some guards, in the words of one, took abuse ‘to a new level’.
They made the mistake of taking pictures of their acts, such as stripping prisoners naked, draping women’s underwear over prisoners’ heads, tying some to dog leashes and urinating on others. One detainee even died on their watch. Many of these abuses violated Common Article 3 of the 1949 Geneva Conventions, which requires the humane treatment of prisoners, without ‘outrages upon personal dignity, in particular humiliating and degrading treatment’. Whether the practices were
officially sanctioned, or merely tolerated by the higher authorities, is not the point. As a prison Abu Ghraib was incompetently run, overcrowded, starved of resources and unable to defend itself against frequent mortar attacks or prevent prisoners from escaping. It was all a reflection of the impending failure of a mission which can be traced back to the mistaken belief that the US would be met by joyous Iraqis grateful for being liberated from Saddam Hussein. It is worth adding that the US Army has only encountered joyous crowds once since liberating Paris in 1944, and that was in tiny Grenada in 1983. Abu Ghraib proved to be one of the worst setbacks in US military history. Although the enemy gained not a single inch of ground, the Joint Force Commander Ricardo Sanchez said that the outcome was ‘clearly a defeat’ for his forces.4 Few dispute that the events at Abu Ghraib inflicted operational damage as much as any combat-imposed loss. The highly publicised reports of the prison abuse scandal energised the Iraqi insurgency and eroded domestic and coalition support. Most damaging of all was the negative reaction of ordinary Iraqis. A 2004 poll found that 54 per cent believed all Americans behaved like those alleged to have taken part in the abuse.5 It is all the more surprising, therefore, that only the lowest-ranking soldiers were punished – despite the military maxim that the best way to prevent abuses is the knowledge that commanders will be held responsible for any breach of honour. Senior officers are more likely to show interest in the behaviour of the soldiers under their command if they know they will be held accountable. All in all, concludes Thomas Ricks, the Washington Post’s senior Pentagon correspondent, this constituted a tragic moment for the US military which, until then, had a proud heritage of treating POWs better than most.
Even in the Revolutionary War General George Washington constantly reminded his men that they were an army of liberty fighting for the rights of humanity, which should be extended even to their enemy (the British). Unfortunately, Ricks concludes, in Iraq the US Army was a long way from home.6

New times, new mores?

If President Bush could honestly claim that he ‘didn’t know’ about Abu Ghraib, was it only because he chose not to ask? Or was his ignorance (before or after the fact) a sign of a new paradigm or new thinking about war? Do new realities call for a new discourse on war and, possibly, new etiquettes of atrocity? The events of 9/11, contended White House Counsel Alberto Gonzales in a memo to the President in January 2002, required a new paradigm now that the terrorists themselves had rendered the 1949 Geneva Convention on the treatment of POWs obsolete. Since the terrorist attacks of 11 September, many commentators have argued that the conventional response is inadequate. Some, like Ralph Peters, contend that concerns over non-combatant casualties which constrain military action are misguided ‘diplomatic table manners’. The intentional targeting of non-combatants, he adds, may be necessary to wage the war successfully. Failure to do so merely betrays a misunderstanding of the nature of the enemy, as well as an allegiance to outdated conventions.7 Peters is certainly not alone in expressing this view. In a war, why take half-measures? Yet it remains disturbing that the same people who labelled the Geneva Conventions ‘outdated’
have made no concrete proposals as to which provisions of International Humanitarian Law should be amended, and with what new wordings. One of the problems of the War on Terror, as we shall see, is that there is remarkably little new thinking behind many of the acts of commission and omission committed in its name.
Is it a war?

Inevitably, therefore, we have to begin by asking whether the War on Terror is actually a war. Let me begin by referring to two films, one a box-office success for the Spielberg studios, the other less well known. In the latter, The Last Supper, one of the characters asks a redneck ex-US Marine, ‘what have you been doing recently?’ He replies, ‘I fought in the war’. ‘Which war would that be?’, asks one of the students. ‘The war’, is the reply. ‘Really? Korea? Vietnam? World War II?’ the student persists. ‘No – the Gulf War. You got a problem with that?’ ‘Was that a war?’ the student concludes archly. ‘I thought it was a Republican Party commercial.’ In a different vein, the same scepticism can be found in Jean Baudrillard’s notorious book The Gulf War Did Not Take Place.8 So much for the first Gulf War, what of the second? The aforementioned successful film is Steven Spielberg’s The War of the Worlds, which is loosely based on H.G. Wells’ novel of the same name. Early in the film Tom Cruise rushes home to take his children out of New York, a city which has just been invaded by distinctly hostile Martians. As he is bundled into the car, his son asks the same question as the priest in Wells’ novel: ‘What are these creatures? Why are they allowed?’ In the film the boy asks, ‘What are they, dad? Terrorists?’ ‘No, son, they come from another place’. ‘You mean they’re from Europe?’, replies the incredulous adolescent. Yes, they do things differently in Europe. For Hollywood, Europeans have been doing so since The Third Man, a classic film set in post-World War II Vienna, in the ruins of a continent that had produced a new breed of barbarians. Americans always believe Europe is out to get them. Spielberg’s movie is, of course, a commentary on the War on Terror, and he (perhaps unconsciously) asks why Americans think they are at war while the Europeans do not. In Wells’ novel the Martians are so alien that they are quite beyond us.
Malevolence really has nothing to do with it. All they can do for the human race, apart from wiping it out, is to provide a pause for humanity to consider what makes it human. 9/11 too seemed to constitute one of those seminal moments in history in which the civilised world (rather than humanity) had to ask itself what made it ‘civilised’. Now many Americans find the West no longer at ease with itself, no longer able to agree what constitutes civilised values, let alone how best to defend them. In a speech before the release of the 2006 Quadrennial Defense Review (QDR) Donald Rumsfeld claimed that ‘they will either succeed in changing our way of life, or we will succeed in changing theirs’.9 The US went into the War on Terror – or the Long War, as the QDR redefined it – as a country with a universal mission. The US is still dreaming for
the rest of us. Yet terrorism – even in the worst case – is unlikely to change our life unless we change it for ourselves through excessive fear. As a term the ‘War on Terror’ has attracted much derision. Writers sceptical about the Bush administration from the first would probably have had an even greater field day with the original name for the invasion of Iraq: Operation Iraqi Liberation. The original name was scratched just in time by an eagle-eyed functionary who spotted that it spelled ‘OIL’.10 The term the US government originally preferred was ‘Global War on Terrorism’ (GWOT) which makes for an unsatisfactory acronym. The public needed something snappier with fewer syllables. So ‘War on Terror’ was officially adopted on 20 September 2001. The war on international terror had actually been declared much earlier by Reagan’s Secretary of State, Alexander Haig. Given that a third of all victims of air hijacking in the 1980s were its own citizens, the US has always taken terrorism more seriously than most other governments. Now, of course, we have another term: the Long War. It was introduced in 2006 and is a no less unfortunate phrase. Only historians, after all, are really in a position to be able to tell us whether a war was long or not – whether it lasted a hundred years (in the case of England and France in the Middle Ages) or thirty years (in the case of seventeenth-century Europe, which was bitterly divided into two religious camps). Unfortunately, the producers of the first talkie version of The Three Musketeers in the early 1930s allowed a line to slip into the script. Cardinal Richelieu is working in his study one day when another, this time less eagle-eyed, functionary interrupts his deliberations. Popping his head round the door, the young man asks, ‘Your Eminence, have you heard the news? The Thirty Years War has just broken out.’ Inside the Washington beltway strategists call the War on Terror ‘the war against the Global Islamic Insurgency’. 
Others talk about an Islamic civil war in which the West will have to take sides. The War on Terror has been quietly dropped. The world needs to choose its words with care but must also have the courage to talk honestly about what is happening. ‘War’ itself is often a contested concept. Was the Cold War a ‘war’? It was not a war as the word is traditionally understood, but it was real enough for its victims, including the 50 million who died on various battlefields across the world. In its early days the War on Terror was often compared to the Cold War. It is a long conflict with smaller campaigns in Afghanistan and Iraq and therefore a kind of punctuated equilibrium. Of course, the critics will not be silenced. Can a state declare war on a tactic (terrorism)? Yet this most popular critique of Bush’s policy seems to me to be ill informed. The attacks of 11 September were not the first aerial attacks against the US. Al-Qaeda began targeting American civilians in the embassy bombings in East Africa in 1998. It has been engaged since then in a campaign, not a crime wave. What distinguishes Al-Qaeda’s acts of violence from those of, say, a criminal cartel, is that its motivation is entirely political. It was on this understanding that the United Nations effectively proscribed the movement in 2001. War or not, the conflict is desperately serious. Since 11 September more than 3,000 Al-Qaeda operatives have reportedly been captured in more than 100 countries. Two major campaigns have been fought in Afghanistan and Iraq. The US is
hunkering down for the duration. Before leaving office Rumsfeld designated Special Operations Command a ‘global synchroniser’ for all US military combat commands. He assigned it responsibility not only for designing a new counter-terrorist campaign plan, but for conducting preparatory reconnaissance missions against terrorist organisations around the world. The scale of the violence is also much greater than many realise. The great majority of major terrorist attacks are aborted or intercepted. Few, if any, of these failures appear on the database for terrorism. Open sources provide little information. Some occasionally do enter the public domain, such as the Dutch intelligence coup in December 2001 which thwarted an Al-Qaeda attack on the US Embassy in Paris; and the FBI prevention of an attack on the tallest building in Los Angeles in early 2002. The great majority of interceptions remain unknown. This is a long and bitter war of attrition. Take one month, April 2007. Nine Iraqi soldiers were found dead after a suicide bomber rammed a checkpoint. Two days earlier nine US soldiers were killed by a pair of suicide bombers driving a garbage truck packed with explosives. A few days earlier another attack killed nearly 200 people in Baghdad. And it is not just Iraq. In March a suicide bomber blew up an Internet café in Morocco. A pair of suicide bombers killed a further 24 people in Algiers. In Quetta a judge and at least 14 other people were killed in an attack on a courtroom (the sixth bombing in Pakistan that month). In Saudi Arabia police arrested 172 suspects who were planning a suicide attack on the country’s oil fields. If we do not always know how successful we are at tackling this campaign of violence, we are not helped to understand its scale by the statistics that are bandied around, as most are unreliable. We are told that we have eliminated 75 per cent of Al-Qaeda’s upper and middle management, and that only 500 key operatives remain alive.
Who comes up with these figures? According to a leaked dossier from 2005, the number of British Muslims actively engaged in terrorist activity is estimated at less than 1 per cent. That would make 16,000 – a significant number.11 The year before, another leaked document had put the number of active British supporters of Al-Qaeda at ‘up to 10,000’. No clear definition of ‘active’ was offered. ‘It is like the old game of space invaders,’ a senior British anti-terrorist official remarked. ‘When you clear one screen of potential attackers another simply appears on the screen.’12 Unfortunately, this is a highly subjective war to which the old actuarial criteria no longer apply. Outdated criteria include counting the number of enemy divisions and multiplying by three (as the military was instructed that the attacking side needed a 3:1 advantage in numbers to have any real confidence of success). To paraphrase Dorothy Parker, considering how dangerous everything is, we are not nearly frightened enough. At the same time, it is very difficult to know just how frightened we should be. The situation, it should be added, is also not helped by many of my fellow academics who tend to hype up the threat posed by terrorism in the absence of any useful, solid data. Over 20,000 articles have been published on terrorism since 9/11. Only seven are based on rigorous statistical analysis, according to the National Centre for
the Study of Terrorism at the University of Maryland.13 Many of the self-proclaimed experts on terrorism have little more than a passing acquaintance with the subject. It all reminds one of John Stuart Mill’s less famous father who wrote a history of India that was several hundred pages long. The fact that he had never visited the country, did not personally know any of its inhabitants and did not speak any of the subcontinent’s many languages did not prompt him to question whether he was quite the expert he thought himself to be. Terrorist studies, in a word, are rather like junk food – they make you feel good at the time, but they have little long-term nutritional value. All that can be salvaged from this phenomenon is the knowledge that more than half of all published journal articles in the social sciences are never quoted which suggests, in turn, that more than half the information produced by research is never read by anyone except the anonymous peer reviewers and copy editors of the journals themselves.14 The ignorance of Islam in much of the Western world (especially America) must be added to all of this. According to the 9/11 Commission’s report, only 17 students graduated in Arabic from American universities in 2002. This statistic may explain, in part, some of the failings of American foreign policy since then. The country is led by a president who, long after invading Iraq, was still unaware that there is a difference between Sunni and Shia. Even more problematic, the US Army started its war against terror with only 108 Arabic-speaking interrogators. So let me repeat the question: is it a war? It is understandable that, like compassion fatigue, we are suffering from war fatigue. After all, we are told that we are engaged in a ‘war on AIDS’ and a ‘war on crime’, though these days we hear less often about the ‘war on poverty’ that was once so popular. 
Indeed, there has been a tendency in recent history to widen the war metaphor, beginning with Marx’s Class War in the nineteenth century and Darwin’s evolutionary principle of ‘the survival of the fittest’ (which is misrepresented by Social Darwinists). Some of us still live in the shadow of World War I. We still talk about ‘going over the top’, ‘fighting for our principles’ or ‘winning’ respect (as opposed to earning it). The language of war has alienated much of the world, among them many Muslims who believe that the West has been waging a war against Islam for decades. Such vocabulary reinforces the propaganda of Al-Qaeda which claims to be fighting a global jihad to defend Islam against ‘Crusaders and Jews’. Language matters, as Bush discovered in 2001 when his one mention of a ‘crusade’ against terrorism seemed to confirm the fears of many Muslims. Nevertheless, I would contend that a good case can be made for using the term ‘war’ when dealing with Al-Qaeda; this is a movement that is engaged not in terrorism so much as a global insurgency, the first networked war. Like the world economy, it has no headquarters, no over-arching ideology and no hierarchy, yet it generates profits – the publicity that terror attacks bring. So does the insurgency in Iraq. The former RAND Corporation analyst Bruce Hoffman was one of the first writers to redefine terrorism in this way. What the US faced, he claimed, was a unique insurgency with no centre of gravity, no common leadership and no hardwired organisational structures, only a series of cells linked through the Internet.15 The Internet, in fact, has created two global challenges to the management of the
present international order: the first is the anti-globalisation movement and the second is Al-Qaeda. As Manuel Castells reminds us, both are Internet-based phenomena. The Internet is what the factory was for the working classes in the nineteenth century: a place where peasants came together and morphed into a working class; a venue in which to learn socialism, and protest against their own living conditions by withdrawing their labour for revolutionary or non-revolutionary ends. The global insurgency is the same: it is a de-territorialised, Internet-based movement which uses terrorism as a tactic to subvert the international system.16 We may debate whether we are at ‘war’ with terrorism, but the terrorists themselves have no such doubts about being at war with us. We can trace the first article on this theme to an Al-Qaeda website in February 2002. The article invoked what the Americans call ‘Fourth Generation Warfare’: a model developed by American military thinkers, including the strategist William Lind, to include terrorism by nonstate actors as an act of war. The time had come, the website proposed, for Islamic movements to ‘internalise the rules of Fourth-Generation warfare’. Equally significant was the work of the Al-Qaeda strategist Mustafa Setmariam Nasar. From a secret hideout in South Asia he published thousands of pages of Internet tracts on how small teams of extremists could wage a decentralised global war against the US and its allies. With their base in Afghanistan lost after 2002, he argued that radicals would need to change their approach and work primarily on their own (with guidance from roving operatives acting on behalf of the broader movement). Nasar was not a bomb-maker or an operational planner. He was one of the movement’s prime theorists for the post-9/11 world. The world’s counter-terrorist agencies saw his theories in action in Casablanca (2003), Madrid (2004) and London (2005).
In each case the perpetrators organised themselves into local, self-sustaining cells that acted largely on their own. Nasar’s masterwork was a 1,600-page paper entitled ‘The Call for a Global Islamic Resistance’. It circulated on the web 18 months prior to his eventual capture.17 It was the first document of its kind to spell out a doctrine for a decentralised global jihad. In its pages the 47-year-old jihadist outlined a strategy for a truly global conflict on several fronts at once. The strategy sought to overthrow a world the West has created: a global civil society made up of working coalitions of Western governments and NGOs, that ‘empire’ of civil society that many see as the latest (and possibly last) manifestation of Empire.18 This is not a geographical war defined by flanks or centres of gravity. It is a de-territorialised challenge, and it has not been seen before. It relies heavily on second- and third-generation immigrants in Europe for its recruitment. Many of the young Muslims now at war against a world order which they believe permitted the invasion of Iraq (now the key inspiration) are themselves inspired by the war on terrorism. In a recent ICM poll, seven out of ten British Muslims agreed with the following statement: ‘The War on Terrorism is in reality a war against Islam.’ 19 This critical difference explains why Europe cannot admit that it is at war, while the US can. This is one of the main points of contrast, and it has produced something of a crisis in Atlantic relations.
The crisis in the Western Alliance

When thinking about a crisis it is useful to consider the original definition of the term. Whereas in popular usage ‘crisis’ is often associated with ‘disaster’ or ‘catastrophe’, it used to mean a time to make decisions. Etymologically, writes Zygmunt Bauman, the word comes much closer to the word ‘criterion’, i.e. the principle we apply to make the right decisions.20 There is no real crisis engulfing the Western world if we mean a ‘disaster’ or ‘catastrophe’. The US will even survive defeat in Iraq as it survived defeat in Vietnam. The real ‘crisis’ which faces the alliance is a dilemma regarding which criteria should be applied to grasp the terrorist threat we are all facing. For us, Bauman also claims, a crisis is contingent upon a theory or paradigm. A crisis arises when the old theory crumbles; when things fall out of joint; when randomness appears when regularity should rule; and when events are no longer predictable. In other words we call a ‘crisis’ a situation in which events defy whatever has passed for normality and routine actions no longer bring the results we became used to in the past.21 Invoking Bauman’s definition, it can be said that we still face a crisis. Our theories no longer fit reality. The conceptual framework by which we used to measure security is no longer widely accepted. The tools we once used now feel wrong, and so we need to find out what the conditions were that once made them effective. What is to be done to restore those conditions or change the tools? This question, of course, includes the ethical practices we once took for granted, which were part of the value system that defined the Western world in its confrontation with Soviet communism. There is little we can do to recreate the framework of the Cold War. Both the US and Europe face the problem of terrorism (global, networked, religious in inspiration) but their diagnosis of the threat differs substantially.
First, the Americans believe they are involved in a war; the Europeans are trying to prevent one. The war they are trying to prevent is not in the Middle East or Central Asia; it is at home. President Bush gave a revealing speech in November 2005 in which he told the American people: 'Our troops are fighting [the] terrorists in Iraq so you will not have to face them here at home.' Echoing this view, Colonel Sean MacFarland, the commander of the First Brigade Combat Team, told a British embedded journalist in Iraq in late 2006: 'Here's my thing. I want to go on holiday with my family in the US and not have to worry about terrorists blowing themselves up in America. I'd rather be over here killing terrorists than have them killing us in the US.'22 Contrary to what many Europeans would like to imagine, there are many American soldiers in Iraq who do not want to leave until the job is done. The Europeans, by contrast, have a very different understanding of what is entailed in prevailing against terrorism. As Francis Fukuyama reminds his erstwhile neo-conservative colleagues in his latest book
(a no-holds-barred critique of neo-conservatism), if Olivier Roy is right in his analysis that universal jihadism is a global, disembodied and de-territorialised challenge that transcends differences of culture, then the real battle lines of the future will run through Berlin and Bradford, Amsterdam and Marseilles.23 The Europeans understood the need to overturn in Afghanistan the world's first terrorist-sponsored state. But the paradigm of a global war on terror, let alone a war on a Global Islamic Insurgency, makes them distinctly uncomfortable. Unlike the US they tend to think of terrorism as a crime. This is partly because the terrorists are among them, not fighting far away in the mountains of Afghanistan or the suburbs of Basra or Baghdad. The Madrid bombing and 7/7 in London were warnings that the Europeans had produced their own suicide bombers. The British, in particular, have every reason to be concerned: three of the four terrorists in July 2005 were born and raised in the United Kingdom. To deal with this new challenge the British government needs to use domestic law (not cruise missiles or the SAS). All Europeans face the same dilemma. The US (at the time of writing) produces no suicide bombers of its own, although it has its own share of home-grown terrorists, from eco-warriors to Tim McVeigh. But these terrorists are not on the cast list of the Long War. What worries Americans like Samuel Huntington – when they are not preoccupied with the impending 'clash of civilisations' – is the future of the US itself: the 'ethnic stranger', who is mostly Spanish-speaking with dubious Third World work habits. For all the popular anxiety in Europe about asylum-seekers and immigrants, ethnicity is not the main challenge Europe faces. It is not the ethnic stranger but the ethical stranger who is the true danger.
The danger stems from the fact that the ethical tie between citizens (the tie that makes the civic order possible) is far more important than ethnicity. But the state is in trouble when the ethnic stranger becomes an ethical stranger (a stranger to the civic order of which he is a member) and, worse still, when he celebrates his own estrangement. The state has to confront what Rousseau called 'foreigners among citizens' – those opposed to the social contract that sustains civil society. The ethical stranger, whatever his provenance, is an anti-social citizen. In Europe he has the advantage of living in a multicultural society which grants him rights without insisting that he discharge his duties. Unfortunately, if duties without rights make slaves, rights without duties make strangers.24 The purpose of 'policing' such outsiders is to reform, rehabilitate or, at the very worst, lock them up. When terrorists hold British passports it is quite impossible for a prime minister to advocate their total destruction. Thus the Long War is driving a wedge into the alliance and it is difficult to see how things will get better. Of course, many other issues, such as capital punishment and the failure of the US to sign up to the Kyoto Accord, divide Europe and the US. Such divisions make it increasingly difficult to speak of a Western alliance at all. One group of commentators implies that if the West survives, the US will no longer be part of it – its pro-active military policy and social life already make it unique. The other group implies that if there is a West, only the US is willing to fight for Western values with the ruthlessness that the cause demands.25 But often these divisions are
overdrawn. On a number of substantive issues (including global warming) there is a transatlantic viewpoint. Take, for example, the effort of six New England states, plus New York, Delaware and New Jersey, to reduce carbon emissions, or the 128 mayors of American cities who have signed up to the Kyoto Accords. But the War on Terror is quite different. An alliance cannot function if one of its members thinks it is at war and the others do not. During the Cold War, it is true, the Europeans often did not buy into America's preoccupation with Soviet capabilities; they preferred to discuss Soviet intentions. Frequently, the allies clashed over burden-sharing, out-of-area issues and even the funding of defence. But the one thing on which they never disagreed was the identity of the common enemy. As Robert Cooper argues in his book The Breaking of Nations, unless there is a strategic consensus on the present threat, differences and disagreements among allies are likely to become much more serious.26

Second, the US is fighting a war; Europe is policing its own neighbourhood. Both confront risks, not the threats of old: we have moved from a world of threats to a world of risks. The alliance, of course, faced dangers in the Cold War, but those were regional, not global. Today dangers are cast in a new language and we have none of the old 'stabilising factors' to reassure us (such as summit meetings and Confidence-Building Measures [CBMs]). We now deal with risks and ask ourselves how best to manage them. Risk management has become the focus of transatlantic relations. Why is the transatlantic management of differences so much more difficult now? Because a risk is intrinsically different from a threat, a danger or a hazard. It is not that a risk is intrinsically more dangerous or hazardous than a threat. The problem with a risk is that in addressing it you may find yourself at greater risk than ever.
If acting in the Cold War was dangerous, not acting in the Long War can be very hazardous indeed. But so can pro-active behaviour. The problem is that while we know what we do not want to see, we do not know how best to avoid it. Sociologists call it 'the risk trap'. And whereas the Cold War threats were very concrete and objectively real (the Soviet divisions in Central Europe could not be conjured away), it is impossible to know how many suicide bombers are out there today. We can only estimate, and intelligence estimates are notoriously unreliable. The risks we perceive tell us a lot about our own fears and anxieties. Our individual national risk thresholds are very different. We all adjust differently to degrees of risk, and those adjustments are culturally determined. This is illustrated by Rumsfeld's infamous remark that all we can do on the basis of intelligence-gathering is to say: 'Well, that's basically what we see is the situation: that is really only the known knowns and the known unknowns.' What Rumsfeld left out is what Žižek calls 'the unknown knowns' (the beliefs that one unconsciously adheres to). For different cultures know different things.27 They apply different lenses to analyse the dilemmas they face. We all have to make choices. Social life demands an organisation of bias – some of us are biased to take some risks (smoking) and some of us to avoid others (hang-gliding). It is that bias which, Robert Jervis tells us, is 'wired' into us by culture, which is why the perception of risk is a social process. We all have different 'world
views'.28 And the world view of an American president is likely to be very different from that of a German chancellor or, for that matter, a British prime minister. Which is why Manuel Castells describes British foreign policy as 'American foreign policy with a human face'.29 It is more ethical, not because the British are nicer, but because they feel more at risk from not being seen to act well. Hence the fairly widespread European perception that the war in Iraq (which, of course, they mostly opposed) is already lost and that the longer the Americans try to win it, the more likely they are to fail. What the Europeans see in Iraq is a training ground for young terrorists (including many of their own citizens). They see a strategic vortex into which the Americans are continuing to send men and money. They face a war which is fuelling even more anti-Western sentiment (especially within their own Muslim communities, whose young men are increasingly listening to radicalised imams). Instead of the management of risk they see its proliferation. Social reality is a feedback process. We expect to find what we do, and what we find feeds back into our expectations and risk thresholds. In this case, the Europeans feel they are increasingly being put at risk by American actions. So, in the end, whether or not the fight against terrorism is a war, it is a conflict which is intensely ethical in its language and its implications – especially for the 'ethical strangers' in our midst. In this book I argue that the old rules do apply and that they apply for a reason. We have rules not because of the beliefs we hold, or the religious faiths we profess, or the cultural norms we seek to apply. We have them because history tells us that it is dangerous not to. And although war is indeed evolving all the time (as are the technologies that enable us to pursue it with a good, or merely indifferent, conscience), we should remember why ethics has been central to war for centuries.
(And we should also remember why, contrary to what some in the military believe, ethics is not merely an 'add-on', an inconvenience which gets in the way of tackling the enemy head-on.)
A new paradigm?

Bush's Presidential Memorandum of 7 February 2002 opened with the statement that 'new thinking on the law of war' was required.30 Yet such thinking as there has been does not yet constitute a new discourse. All the administration has done is make a case for excluding 'unlawful combatants' from the prevailing discourse. Paradigms, Kuhn tells us, are important because they help us make sense of the world. When we can no longer make sense of it we have to rethink our basic assumptions. In effect, how analysts think about change and continuity shapes what they look for, and what they look for affects what they find. A paradigm does not provide answers so much as hold the promise of answers. So far those answers have not been forthcoming, which is why the US has acted so ineptly. In suggesting that we need a different paradigm but failing to make the case for one, the US faces what gestalt psychologists call a 'creative point of indifference': the state of a patient who can no longer practise the old ways which worked in the past and yet does not know how to go forward successfully.
This is especially evident in the use of language, one of the legitimising factors in war. Instead of defining the 'war' precisely, the US has engaged in what Steven Poole calls 'Unspeak': a mode of speech that persuades by stealth. Unspeak differs from two other linguistic deformations with which we are more familiar: euphemism and doublespeak.31 A euphemism affirms what is known to be untrue. It is a moral violation which takes on the guise of a literary form. It is particularly pernicious because it becomes detached from moral judgement. It is an idiom that tends to reduce psychological insight to a collection of standardised observations which provide those who employ them with ways of evading the truth. Even as early as 1940, George Orwell was warning that language was being distorted to disguise the reality of war off the battlefield. Thus the term 'transfer of population' was used to describe the forcible removal of peasants from their farms; another term, 'rural pacification', masked their deliberate and systematic killing.32 The Vietnam War was particularly replete with euphemisms such as 'protective reaction', 'free fire zones', 'air support', 'friendly fire' and 'collateral damage'. Euphemisms, Orwell added in his seminal essay Politics and the English Language (1946), defend the indefensible. They are by definition insincere, allowing those who can decode them to distinguish a state's real aims from its declared ones.33 Doublespeak was not coined by Orwell, but it is associated with him and with his novel 1984. It is a mixture of the novel's Newspeak and Doublethink (the ability to believe in two opposing ideas at the same time). It describes the phenomenon of saying one thing while meaning another. Doublespeak is really a euphemism itself – for lying. Unspeak, by contrast, does not say one thing while meaning another. It says one thing and really means it.
In default of a new discourse on war, Unspeak allows the state to exclude certain actors from the prevailing one. Thus the term 'unlawful combatant' is used to denote people who are deemed not to merit POW status. (Not that an argument could not have been made for redefining the status of those who carried out the attacks of 9/11.) The International Committee of the Red Cross was wrong in its early – and later retracted – denial of the very essence of such a category. What was worrying about the US position was not the existence of such a category, but the conclusion that was drawn from it: the refusal to accept that any legal framework formally applied to the detainees in the War on Terror.34 In case of doubt as to whether persons who have committed a belligerent act are combatants, Article 5(2) of Geneva Convention III insists they must be treated as prisoners of war 'until such time as their status has been determined by a competent tribunal'. The US established such tribunals in the Vietnam War and the 1991 Gulf War; it has not established them in the present conflict. Instead, in the case of those detained in Guantanamo, it has merely insisted that their status is not in doubt. The trouble is that the US has been led astray by its own rhetoric, especially by the use of the biblical language of 'evil'. Even these 'unlawful combatants' – whatever their moral condition as members of the human race – have a right to 'humane
treatment', as defined by international humanitarian law. It is what makes us 'civilised' in our own eyes, and hopefully those of others. For the world's most powerful liberal state, the US has already taken several steps back from the civilised values it trumpets. In his State of the Union address in 2002 Bush claimed that human dignity was 'non-negotiable'. Yet the US's own actions would suggest otherwise. In this respect it is guilty of Sartrean 'bad faith'. Take, for example, the word 'abuse', which covers cruel, inhuman and degrading treatment 'which fails to rise to the level of torture'.35 One particular form of abuse has produced a euphemism of its own: 'repetitive administration', the use of force sometimes resulting in fatal injuries. One case in point was the deaths of two prisoners at the US base in Bagram, Afghanistan in December 2002. They died after suffering injuries to their legs so extensive that the coroner described them as equivalent to being run over by a bus. They were the result of the peroneal strike, a deliberate, disabling blow to the side of the leg just above the knee which targets the peroneal nerve.36 The term 'repetitive administration' immediately brings to mind the 'administration' of justice or the 'administration' of a vaccine to a child. Even if, in a darker frame of mind, one thinks of the 'administration' of punishment, that comes only after 'due process'. The point about the prisoners in question in Bagram was that they had never been put on trial. As Poole adds, the very word 'abuse' is a catch-all. To abuse a prisoner in the US is to insult or demean him, usually verbally. The US Defense Secretary, Donald Rumsfeld, refused to call any of the acts at Abu Ghraib 'torture', preferring the term 'abuse' instead.
As with abuse by prison guards in the US, this suggested random, capricious acts which were 'out of character' and certainly neither ordered nor condoned by the prison authorities. But if we mean torture, then it was encouraged as soon as the US argued that it was not bound by the UN Convention Against Torture (1987). Instead, it entered a 'reservation' to take account of the new circumstances, permitting it to carry out actions hitherto not permitted, such as forcing a prisoner to kneel while kicking him in the stomach. This is no longer deemed to constitute what the UN Convention calls 'severe pain or suffering'. Another abuse involves 'stress positions' – shackling a prisoner's wrists behind his back and then suspending him from a ceiling. One prisoner in Abu Ghraib died after being suspended for half an hour. Even excluding that case (which was not authorised), the Assistant Attorney-General insisted in a memo that 'for purely mental pain or suffering to amount to torture ... it must result in significant psychological harm or significant duration (e.g. lasting for months or even years)'.37 A dubious distinction in itself, it brings to mind General Massu's insistence that his own torture victims had suffered no permanent bodily damage, unlike the victims of the men he was fighting. What Unspeak does is conceal the implications of the administration's own actions. In the case of waterboarding, Bush himself apparently does not believe that strapping someone to a board, tipping him upside down and pouring water repeatedly over his cellophane-wrapped face constitutes severe suffering. Nor does he
appear to find the act of forcing prisoners to stand for lengthy periods especially troubling. At first the detainees in CIA custody were required to be restrained for a maximum of four hours without rest. Then a memo from Donald Rumsfeld came down the chain of command: 'I stand for eight to ten hours a day. Why is standing limited to four hours?' Why indeed? It certainly sounds mild enough. Except that it is not. A hooded person forced to stand on a box for hours will quickly lose his sense of equilibrium. Lower back pain will develop from the strain of remaining upright for so long; pain in the legs will develop as blood pools there. It is certainly creative, of course, even if it is not officially sanctioned. Unfortunately, it is a twist on the old maxim of George Patton: 'Never tell people how to do things. Tell them what to do and they will surprise you with their ingenuity.' Ingenious or not, here is a description of what it actually means in plain English:

There is the method of simply compelling a prisoner to stand there. This can be arranged so that the accused stands only while being interrogated – because that, too, exhausts and breaks a person down. It can be set up in another way – so that the prisoner sits down during interrogation but is forced to stand up between interrogations (a watch is set over him and the guards see to it that he does not lean against the wall, and if he goes to sleep and falls over he is given a kick and straightened up). Sometimes even one day of standing is enough to deprive a person of all his strength and to force him to testify to anything at all.

Ironically for those of us who believed that we occupied the moral high ground in the Cold War, the author is Alexander Solzhenitsyn, who documented 'long time standing' as a method used in the old Soviet Union. Sleep deprivation also sounds mild enough, but it is not. What follows is another vivid account of what the punishment actually entails. A prisoner '...
is wearied to death, his legs are unsteady, and his one sole desire is to sleep, to sleep just a little, not to get up, to lie, to forget ... Anyone who has experienced the desire knows that not even hunger or thirst are comparable with it.' Again, ironically, the author of this account is Menachem Begin, the former Israeli Prime Minister and a man whom the British authorities in Palestine considered a terrorist. Begin was describing the methods used by the Soviets in Siberia, where he was imprisoned in 1939. We know that one person in Guantanamo Bay was forced to go without sleep for 48 out of 55 consecutive days and nights. He was also manacled naked to a chair in a cell that was air-conditioned to around 50°F and had cold water poured on him repeatedly until hypothermia set in.38 It is the reality of such acts that makes Unspeak so pernicious and, in the end, so self-defeating in a struggle that requires the West to hold the moral high ground if it is to have any hope of success. While it may indeed be prudent to adopt a new discourse on war – or a new 'paradigm' as the administration prefers to call it – it is definitely foolish to avoid intellectually engaging with
that challenge, and to use Unspeak to cover up the absence of any sustained thought. If you call torture 'abuse' and other forms of physical abuse 'repetitive administration', are you not still forced to ask yourself whether such practices actually work? Are they efficient, or are they likely to alienate the very community whose hearts and minds we are trying to win? The worst part of Unspeak is that it encourages ethical blindness. It absolves us of ethical duties and demeans us by denying responsibility for our own actions. Language can distance us from the realm of moral empathy. And this can be accomplished by rewriting the story. In some cases, such as 'collateral damage', the victims are airbrushed out of the account; in others, such as 'extraordinary rendition', we avoid confronting ourselves. Our actions become exempt from public discussion, or so we hope. To admit to torturing our fellow human beings in a liberal society, after all, is to invite immediate criticism. It requires justification in what will be a lengthy and contentious public debate, the central topic of which will be whether we should keep faith with our liberal values. We liberals, Rorty would have argued, diminish our own authenticity by resorting to a language that allows us to be morally evasive. As liberals we have unconditional responsibilities to others and the belief that what works for us works for other people.

Guantanamo Bay and extraordinary rendition

The point is that most Western discourses on war have been largely instrumental. Practices have to be seen to be effective if they are to be justified. This has simply not been the case with the decision to exclude 'unlawful combatants' from the prevailing discourse on war.
Let me take two of the most controversial decisions by the administration: first, to detain suspects at a special base in Guantanamo Bay; second, to countenance the torture of prisoners not directly, but on sub-contract, by means of 'extraordinary rendition'. The Guantanamo zone is an eight-mile-wide, five-mile-deep enclave leased by the US from Cuba. When the first detainees were brought there, they were held in Camp X-Ray. The prisoners in orange jumpsuits and goggles, shackled to gurneys as they were moved between wire cages, soon became the image of Guantanamo around the world. Later, they were moved to Camp Delta, where it was hoped they would fall outside US jurisdiction. Guantanamo was always going to be controversial, but it was made more controversial still by the identity of the prisoners brought back from Afghanistan following the collapse of the Taliban regime. When, in January 2002, Rumsfeld called the detainees 'among the most dangerous, best-trained, vicious killers on the face of the earth', the immediate question raised was on what evidence they were detained. And anyway, if they really were among the worst of the worst, why imprison someone as unimportant as Bin Laden's driver (one of the few to be indicted when the indictments were finally drawn up)? One study culled from the US government's own data found that only eight per cent of the camp's prisoners were actually members of Al-Qaeda.39 The US had incarcerated 773 men and boys in the camp; as of 2007, 385 remained (of whom
four-fifths were held in solitary confinement, often at grave risk to their mental health). The situation reached its low point when three inmates, who had been held for years without trial, hanged themselves. The hapless American Deputy Assistant Secretary for Public Diplomacy called it 'a good PR move'. She was not alone in sounding callous. The commander of the base grumbled that by taking their own lives his three charges had committed an act of 'asymmetrical warfare'.40 Plenty more tin ears and sharp tongues belonged to bigger heads higher up in the administration. (The Deputy Assistant Secretary may nonetheless have been right to imply that their suicide was a political act rather than an individual expression of despair.) The point, though, is that much of the war against terrorism is a contest between values – in short, a public relations war – which America should be winning hands down. As The Economist observed at the time, 'a brand that stands for liberty and life [should be] an easier sell than one that stands for beheading unbelievers and reviving the Middle Ages'.41 It seems likely that many of the detainees were simply in the wrong place at the wrong time. Some were sold into captivity by locals greedy for the bounty offered by the US. Amnesty International has found that this 'rewards programme' resulted in a frenetic market in abductees – some of whom were seized not in Afghanistan at all, but in Pakistan, where a $5,000 bounty makes a significant difference in a country whose per capita income is $720. As for those still left in the camp, it seems that moral cowardice is the main reason for their continued incarceration. No one wants, or is able, to take responsibility for signing the release order. Indefinite imprisonment in solitary confinement seems to be their lot, at least until there is a new incumbent in the White House. In this global PR campaign America's reputation for fairness has been put at risk.
This is largely because of the inherently unfair nature of the Guantanamo tribunals and the principle of indefinite detention without charge. In the case of the former, the tribunals allowed witnesses to give evidence against prisoners in unsworn statements instead of testimony. The prisoners were not necessarily allowed to know the evidence against them if it was deemed to be classified. There was no appeal outside the military, and the courts could impose the death penalty (although the administration has not sought it for any of those charged so far). There is no justification for this in a liberal society. A collateral consequence of relocating the detainees to the base was that it highlighted the absence of judicial oversight and public scrutiny. By introducing a system that did not allow for judicial review, the administration precluded the judiciary from performing its traditional function as a check on the misuse of executive power. If the US had adequate evidence against the prisoners it should have tried them in conventional courts, or sent them back to their home countries for trial. No European court has been willing to convict suspected terrorists on the evidence extracted at Guantanamo Bay, and all such cases have collapsed. Critics of the camp – including (on fundamental points) the Pentagon's own defence lawyers – always argued that the evidence obtained from interrogations would never have held up in a criminal court in the US itself.
Indeed, publicly available transcripts of Combatant Status Review Tribunal proceedings make grim reading, for they reveal a highly flawed process and strongly suggest that much of the evidence supporting the executive's claims for indefinite detention was extracted by coercive interrogations that, in some instances, may amount to torture. In the case of indefinite detention, the US is making out a special case that it has the right (which it undoubtedly does under international law) to remove an enemy from the battlefield for the duration of a conflict. But the War on Terror has no defined end. Condoleezza Rice is right to argue that 'captured terrorists of the twenty-first century do not fit easily in traditional systems of criminal or military justice'.42 She identifies a real dilemma and a perennial question: are terrorists soldiers or criminals? But even if the US chooses to stick by the first designation it faces another dilemma: such soldiers would be stateless, pledging allegiance only to a movement, or perhaps only a cause. And if this war has no end – or none in sight – how long can a country continue to hold prisoners of war?

Extraordinary rendition is yet another example of how it would have been much better to think seriously about formulating a completely new discourse than to exclude from the existing one actors deemed to be 'outside the law'. To be fair to the administration, we must not foreshorten our perspective. Rendition began well before George W. Bush came to the White House. There were over 70 renditions before 9/11, according to former CIA Director George Tenet's testimony to the 9/11 Commission.43 These abductions can be dated back even earlier, to 1987, when a terrorist called Fawaz Yunis, who had been implicated in two skyjackings, was lured to Cyprus from his home in Beirut and arrested by the FBI.
He is now serving a 30-year term in Leavenworth. The case set the precedent for a series of abductions originally authorised by an executive order of President Reagan. Because the Supreme Court has traditionally shown no interest in how defendants reach the dock, the federal authorities have exercised wide latitude in taking suspects into custody. In the famous example of Manuel Noriega, the US invaded Panama to bring him to court in Florida to face drug charges. When extraordinary rendition first began it specifically targeted people named in foreign arrest warrants. After 9/11 it was expanded to include anyone the administration considered to be an 'unlawful combatant'. One such example is that of the senior Al-Qaeda operative Ibn Al-Sheikh al-Libi, who was taken to Egypt for questioning. Another is the case of two Yemenis, taken up by Amnesty International, who were interrogated and tortured by Jordanian security forces for four days during an 18-month period of detention by American forces. Once again, the problem with the process is that it effectively puts suspected terrorists outside any legal system and does not allow for due process of law. It is inherently illiberal. In the end, we are confronted by the spectacle of a government that has been not only caught up, but also caught out, in its defence of some dubious practices. Uncompromising in its adherence to the rules, it has been agile in its interpretation of them.
It says it abides by the UN Convention against Torture, but its own lawyers have argued that a ban on 'cruel, inhumane and degrading treatment' does not apply to the interrogation of prisoners outside the US. It claims it does not render suspects 'to countries for the purpose of interrogation using torture'. Indeed, where appropriate, it seeks 'assurances' they will not be tortured, but former officials claim that governments often give such assurances casually.
These are some of the administration’s elliptical public statements and motivations. Behind every stark statement we encounter layers of ambiguity. Such ambiguity is deeply counter-productive in a war that requires the West to adopt an unambiguous moral stance. It is important, of course, that we think the unthinkable; it is important that we ask whether torture might have instrumental value. No-one, if they are honest, can deny that torture may work in certain circumstances. The problem is that in most cases the confessions and the information extracted may not be real. Tortured detainees will probably tell you what you want to hear. The scholar Darius Rejali has trawled through the archives of the brutal Algerian War of Independence and found no evidence that torture did anything to aid the French in delaying defeat, let alone prevailing.44 Even General Massu, before he died in 2002, admitted it had achieved nothing. The danger is that what begins as coercive interrogation is likely to degenerate into torture. No equivocating over permanent or short-term damage – physical or psychological – should be allowed to obscure the dynamic at play.

Even in World War II Robin Stephens, the commander of the Wartime Interrogation Centre (codenamed Camp 020) in Britain, never used torture – even though he personally interrogated over 500 suspects and broke all but a few. In a recently declassified report that is now in the Public Record Office, he listed the tactics needed to break down a suspect:

A breaker is born and not made ... Pressure is attained by personality, tone, and rapidity of questions, a driving attack in the nature of a blast which will scare a man out of his wits.

Many spies were turned and became double agents. This could never have been achieved through physical abuse or humiliation.
The double agent system – in which Stephens played a vital role – was probably the greatest espionage coup of the war, culminating in the strategic deception of the D-Day landings, when the Germans were successfully deceived into believing the Allies would land not in Normandy, but the Pas de Calais.45 Even if Al-Qaeda suspects are not Germans – even if they are more fanatical and therefore less likely to crack because of their religious belief – there is another reason why torture should be avoided and it has been made by Senator John McCain. If we are engaged in ‘a war of ideas, a struggle to advance freedom
in the face of terror in places where oppressive rule has bred the malevolence that creates terrorists’, then we should surely conduct it by being as little like those who are our enemies as we possibly can be – even if torture works.46 Yet there is no evidence that it does. As the Pentagon’s own lawyers have argued, the evidence obtained at Guantanamo could have been obtained by conventional means. But that is not the point: torture is pointedly expressive; it is designed to send a message, not to extract information. Expressive violence is not the Western way.

To quote the former Head of the State Department’s Counter Terrorism Center, ‘there was a before 9/11, and there was after 9/11 ... after 9/11, the gloves come off’. A recent Washington Post article quotes a government official supervising the transfer of original Camp X-Ray detainees as saying, ‘If you don’t violate someone’s human rights, some of the time, you are probably doing your job’. This, adds Joanna Bourke, may win support back home, but it does little to win a war where remaining true not only to one’s own beliefs, but also to one’s own best practices, is essential.47

In the modern age (and, it could be argued, before) Western societies have always practised a highly instrumental way of war. We judge the justice of our actions on results, on strictly utilitarian criteria. Our ethical codes, I have contended, are inherently pragmatic and are intended to promote good practice. We should not act badly in order to send a message, still less to communicate our resolve to win. It may well be that we need a better theoretical understanding of the post-9/11 world; a better political grasp of the causes of terrorism. But none of this requires a different etiquette of atrocity. The term ‘unlawful combatant’ is especially unhelpful given the polymorphous nature of the conflicts in which we find ourselves engaged.
As the International Crisis Group pointed out in a seminal report in February 2006, the US and its allies might not be able to establish a monopoly over the use of force in Iraq, but they could and should do so over the legitimate use of violence.48
New wars, new paradigms

‘In the future it is likely that, to an increasing extent, war will be waged by suggestion – by words and not by weapons, propaganda replacing the projectile.’
(Basil Liddell Hart, 1932)

We must be careful not to condemn the Bush administration out of hand; it is attempting as best it can to respond to the new reality of war. This reality was glimpsed very early, in an essay written shortly after the first Gulf War by the semiotician Umberto Eco. Eco wrote of a new world of flows, goods, services and information in which old patterns of military conflict had begun to count for much less. His remarks are even more prescient now than they were then. Power, he wrote, was no longer monolithic or monocephalous. It was diffused and packeted between a multiplicity of powers which competed with one another. War had become a parallel, not a serial system.49 A serial intelligence system used, for example, to build machines capable of translating or drawing inferences from some given information, is instructed by its programmer so that, on the basis of a finite number of rules, it can make subsequent
decisions, each of which depends on an assessment of the preceding decision. Old-fashioned military strategies were based on the understanding that the enemy’s rules were its rules too, and each party made one decision at a time, rather like a game of chess. A parallel system, by contrast, requires the kind of interaction we find in a neural network. War has gone neural: it takes a path which is independent of the will of the two parties in contention. It has become networked. In other words, it has a logic of its own.50

‘Whenever we look at life, we look at networks,’ writes Fritjof Capra, one of the chief complexity theorists.51 The network has replaced the machine as the dominant metaphor of the modern era – even if most soldiers (who, when not fighting the last war, are usually thinking about it) still tend to use the term ‘machine’ to understand the reality they have to face every day. ‘The men we have been fighting,’ the reporter Evan Wright was told by a US Marine in Iraq, ‘probably came here for the same reason we did, to test themselves, to feel what war is like. In my view it doesn’t matter if you oppose or support war. The machine goes on.’52 The importance of machines, of course, is that they are governed by the laws of cause and effect. If you cut the fuel pipeline of a plane, the plane will crash every time. The importance of a network is its non-linearity. Michel Foucault, contending some years ago that a world of causes was giving way to one of effects, offered this advice: ‘the uniform, simple notion of assigning causality’ should be overturned. Political scientists should analyse the whole play of dependencies.
By ‘eliminating the prerogative of the endlessly accompanying cause’, they could ‘bring out the bundle of polymorphous correlations’ between a host of polymorphous actors operating in horizontal networks quite different from the vertical worlds of the past.53 In the case of the War on Terror these actors tend to proliferate in ways that often defy analysis by intelligence experts. They are the product of many milieus: the mosque, the prison and the neighbourhood being the three most common.

None of this should really be surprising. Churchill grasped the importance of this as long ago as 1948: ‘In the wars of old, decisions arose from events rather than tendencies; in modern war tendencies are infinitely more important than events’.54 The truth of Churchill’s claim was borne out not only in war but in the marketplace in the 1990s, during the financial collapse which began in south-east Asia and then spread to Russia and from there to Brazil. The Internet boom and subsequent slump were fuelled by computer-driven buying and selling patterns that exaggerated both upward and downward moves in share prices. Thomas Friedman has devised a graphic phrase to describe the participants in the global market: the ‘electronic herd’. What is worrying about herds is that they tend to stampede when frightened. If we prefer a different analogy, then look at a recent bestseller. Malcolm Gladwell’s The Tipping Point claims that changes in social trends – from fashion to crime – are best understood through the model of epidemiology: the process by which relatively isolated cases of disease can soon become an epidemic. The more interconnected we become through, for example, the Internet, the more likely it is that small disruptions will have large effects, partly through contagion, partly through the mathematics of geometric progression (the successive multiplications of change).
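Eco’s earlier contrast between serial and parallel systems can be sketched in a few lines of code. The sketch below is an illustrative toy of my own, not a model drawn from Eco: a serial system applies one rule at a time, each decision conditioned on the last, while in a parallel system every node updates simultaneously from its neighbours and the outcome belongs to the network as a whole.

```python
# Serial system: a finite list of rules, applied one at a time,
# each decision conditioned on the outcome of the preceding one (chess-like).
def serial(rules, state):
    for rule in rules:
        state = rule(state)
    return state

print(serial([lambda s: s + 1, lambda s: s * 2], 1))  # (1 + 1) * 2 = 4

# Parallel system: every node on a ring updates simultaneously from its
# neighbours, as in a neural network. No single rule dictates the result;
# an initial disturbance diffuses across the whole network.
def parallel_step(values):
    n = len(values)
    return [(values[i - 1] + values[i] + values[(i + 1) % n]) / 3 for i in range(n)]

state = [0.0, 0.0, 9.0, 0.0, 0.0]
for _ in range(5):
    state = parallel_step(state)
print(state)  # the spike at node 2 has spread through the ring
```

Note that the total across the ring is conserved while the spike disperses: the system’s behaviour is a property of the interactions, not of any one node.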
Networks can also be seen as paradigms. In The Information Age Manuel Castells did much to establish the network in the popular imagination. His later work has further developed this idea. Twenty-first century social movements, which he defines as purposeful collective actions aimed at the transformation of the values and institutions of society, manifest themselves particularly through the biggest network of all: the Internet. ‘Cyberspace has become a global electronic agora where the diversity of human disaffection explodes in a cacophony of accents.’55 It is a rather poetic way of stating the obvious: the Internet has become the medium through which different and diverse social groups, many of which protest against the international system, not only communicate with one another, but also coordinate their actions.

Castells sees the anti-globalisation movement – in all its extraordinary diversity – as the historical equivalent of the disorderly working-class movement that emerged before it formed itself into the labour movement, and then initiated the process of debate, negotiation and institutionalisation of conflict which gave birth to modern industrial society. The labour movements which emerged in the nineteenth century through the factory were able to mobilise effectively for the first time. The factory gave them a working-class consciousness, as well as a political voice. Instead of occupying the margins of society, the workers found themselves at its heart. No longer condemned to taking part in marginal urban revolts or food riots, they had a new weapon: the withdrawal of labour. The pre-industrial crowds may have won a few fleeting victories, including wrecking toll gates and machinery, and imposing short-lived price controls in food riots. Only when the sans-culotte and small freeholder gave way, however, to the factory worker did the machine wrecker and urban rioter give way to the trade unionist and labour militant.
The latter could use their power democratically or in a revolutionary cause. Indeed, Sorel famously referred to the regenerative role of violence. What he identified was a feedback process: an industrial labour force could gain a better consciousness of itself through violent acts.

Just as the factory was the organisational basis of the labour movement, the network is the characteristic feature of many of today’s terrorist movements (especially Al-Qaeda). The communication of value and mobilisation around meaning has become fundamental to Al-Qaeda’s identity and self-belief. The movement is wired into global communication systems – not only the Internet, but also the global media. Both are media through which terrorist movements can reach out to those who share their values. A movement like Al-Qaeda, adds Castells, aims to seize the power of the mind, rather than the power of the state.56 Its influence comes from its ability to raise issues and force a debate without entering into negotiation with the state. Because no-one can negotiate on belief, Al-Qaeda is difficult to defeat:

It is a pure movement, not the precursor of new institutions ... and because the Internet is its home, it cannot be disorganised or captured. It swims like fish in the net.57
And because the Internet has no centre of gravity it is almost impossible to disrupt. Like Nicholas of Cusa’s universe, the Internet’s centre is everywhere and its circumference nowhere. The movements that rely on the Net reach out to groups across the world through symbolic actions, from the suburbs of Bradford to the alienated margins of society in the banlieues of Paris. They need the support provided by local groups (particularly in Europe), yet they act globally at the level where it really matters today; they draw their support from information that is broadcast across the world. Paraphrasing Marshall McLuhan, General Rupert Smith remarks, ‘this is not so much a global village as a global theatre of war with audience participation’.58 ‘Call it public diplomacy, or public affairs or psychological war, or – if you really want to be blunt – propaganda,’ added Richard Holbrooke, the former US ambassador to the UN, shortly after 9/11. ‘But whatever it’s called, defining what this war is really about in the minds of the one billion Muslims in the world will be of decisive and historical importance.’59

As Rorty would add, one of the principal audiences we must address is our own. If we liberals tell stories, our actions impact upon the stories we wish to tell. How can we get others to believe us if we lack self-belief? Even if we can live with contradiction, others require a less complicated message. Some will demand the certainty that comes from seeing the West act in the spirit of its own first principles.

The networked battlespace

The West is now engaged in a global networked war in which the way it treats its opponents, or is seen to, is likely to have immediate consequences. The contradictions in its behaviour still abound, to the detriment of conveying the central message: that ‘we’ are better than our opponents.
It may be true that the US military tends to abide by the Geneva Conventions towards civilians in war zones while pretty much ignoring the rules concerning the treatment of detainees. Unfortunately, detainees get as much, if not more, coverage in the news. The audience the US is trying to win over is largely influenced by what it sees, and what it sees is what the media chooses to show it. There is a competitive market of ideas out there. At the time of the 9/11 attacks one source had identified over twelve Jihadist websites. Four years later the figure had risen to 4,500 sites, representing a ‘virtual community of belief’ or, as one writer put it, ‘one big madrassa’.60 This networked world is highly complex. It produces what John Urry calls three basic network typologies:

● Line or chain networks, with many nodes spread out in more or less linear fashion.
● Star or hub networks, where relationships move through a central hub.
● All-channel networks, in which communications proceed in more or less all directions simultaneously.61
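Urry’s three typologies can be rendered as simple adjacency structures. The sketch below is illustrative only (the function names are mine, not Urry’s); it shows how the all-channel form multiplies connections, and hence resilience, far beyond the other two.

```python
# Urry's three network typologies as adjacency lists (illustrative sketch).

def chain(n):
    """Line/chain: each node linked only to its immediate neighbours."""
    return {i: [j for j in (i - 1, i + 1) if 0 <= j < n] for i in range(n)}

def star(n):
    """Star/hub: every relationship passes through node 0, the hub."""
    return {0: list(range(1, n)), **{i: [0] for i in range(1, n)}}

def all_channel(n):
    """All-channel: every node communicates with every other directly."""
    return {i: [j for j in range(n) if j != i] for i in range(n)}

def links(g):
    """Count undirected links (each appears twice in the adjacency lists)."""
    return sum(len(v) for v in g.values()) // 2

n = 6
print(links(chain(n)), links(star(n)), links(all_channel(n)))  # 5 5 15
```

Removing the hub disconnects a star network entirely, while an all-channel network survives the loss of any node: the structural point behind the claim that such networks are hard to break.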
The importance of all three is that they can generate massive non-linear increases in output. Networks dramatically amplify small inputs through long-term ‘increasing returns’. Such non-linear outcomes are generated by a system moving beyond what is called the ‘tipping point’. Tipping points, in turn, involve three notions:

● Events or phenomena are contagious.
● Little causes can have huge effects.
● Changes can happen not in a gradual linear way, but dramatically at the moment when the system switches.
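The three notions can be captured in a toy contagion model of my own devising (not drawn from Urry or Gladwell): each ‘case’ triggers r new cases per step, and the system switches abruptly at r = 1, the tipping point between a disturbance that dies out and one that grows without limit.

```python
# Toy contagion sketch: each case triggers r new cases per step.
def spread(r, seed=1, steps=10):
    """Total cases produced by an initial seed after a number of steps."""
    cases, total = seed, seed
    for _ in range(steps):
        cases = cases * r      # geometric progression: successive multiplications
        total += cases
    return total

# The same small seed, either side of the tipping point r = 1:
print(spread(0.9))   # below the threshold the disturbance fizzles out
print(spread(1.1))   # above it, the identical cause keeps compounding
```

The change is not gradual: for any r below 1 the total converges to a fixed ceiling however long the process runs, while for any r above 1 it grows without bound, which is the ‘switch’ the third notion describes.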
A particularly vivid example of contagion is the 2005 riots in Paris, during which the suburbs erupted into spontaneous violence that soon became self-sustaining. At the height of the riots 700 cars were being torched every night. They began as a series of protests by Muslim immigrants against their living conditions. Looking back, the picture is rather different. Only 36 per cent of rioters were North Africans; an even greater percentage (39 per cent) were native Caucasians who came on to the streets to copy what they had seen on television.62

Changes, as Urry adds, can happen quickly when the system ‘switches’. A good example is the point at which every major office in the world decided it had to have a fax machine. A fax machine as a means of communication is only really useful if everyone else has one. When they do, new networked connections can be formed. This development was not linear in the sense that it was intended; it just happened. We live in a world of an increasingly large number of unintended consequences. Karl Marx was the first writer to discuss a concept that was disturbingly new for the time: unintended harm. A new technological development in Britain could destroy entire livelihoods in China or India within a year. This chain of events was not intended by anyone. Marx’s unique insight was that the growth of the global market exposed an increasing percentage of the world population to long-distance harm. A double revolution, in effect, was taking place. Old types of deliberate harm caused by warring states and expanding empires were gradually being replaced by diffused forms of harm transmitted across frontiers by the forces of global capitalism. And the market, not only the state, was beginning to reshape the world in ways that impacted more than ever upon people’s lives.63

Urry introduces two other key concepts. First is ‘increasing returns’.
These are to be distinguished from traditional ‘economies of scale’, which are found within single organisations, increasing output over time and reducing the average cost of production. Economies of scale are linear. Increasing returns, by contrast, are non-linear. They are often derived from coordination between machines (not people), like fax machines or the Internet, and involve a process of organisational learning across the network. In a network people learn faster from their mistakes.64 Accordingly, organisations can reinvent or rebrand themselves overnight. Terrorist groups are amongst the most protean of all. Once a territorially based movement, Al-Qaeda has become a venture capitalist organisation, and now sponsors different attacks in different countries in the hope that
one or two out of ten may succeed. Its metamorphosis is truly impressive. What company could have survived the capture or death of 75 per cent of its upper to middle management, and still be operationally effective? The second concept Urry discusses is ‘path dependence’. Contingent events can set in motion institutional patterns that have long-term deterministic properties. Small events can produce significant outcomes; they can get ‘locked in’ over long periods of time, producing a long-term pattern of behaviour which is difficult to reverse.65 Like the other forces he analyses, path dependence can produce networks that are not necessarily derived from human actions or intentions. Indeed, in many walks of life we are increasingly networking with machines, ideas and texts in a material way, rather than in an old-fashioned social whirl. We regularly communicate with people we will never meet. We increasingly live in virtual communities. What makes Islam such a potent force is that through the network it has become deterritorialised, detribalised and diffused. The network society has arrived with a vengeance and it is beginning to shape the War on Terror. Insurgent networks in Iraq, for example, like computer viruses, tend to proliferate at an alarming rate. The computer virus is indeed a useful analogy since it is largely a product of the network – if not in origin then in its potential for harm. Viruses tend to proliferate. Some leap from user to user, others lie dormant for months. Some evade immediate detection, others bury themselves only to pop up later on. Some are even designed by their programmers to undergo spontaneous mutations. The importance of viruses is that they have no function apart from their own reproduction. Most important of all, to employ a Darwinian term, there is no necessary connection between their own reproduction and the good of anything. 
In other words, there is no connection between their fitness, from their own point of view, and their contribution to the general fitness of the host. This definition of selfishness, of course, is behavioural, not subjective. When we say that insurgent groups are ‘selfish’ (like genes) we are not concerned with the psychology of their members’ motives. For a ‘virus’ the idea of the ‘good’ constitutes its own chances of survival. Like computer viruses, polymorphous actors tend to proliferate faster than they did in the traditional guerrilla wars of the past. And unlike the guerrilla movements of old, many of the polymorphous actors are self-contained, and indeed solipsistic. The reality is so kaleidoscopic that it often confuses local commanders on the ground. ‘I don’t have a particular name affixed to what I’m going up against,’ remarked the puzzled commander of Combined Joint Task Force 180 before going into Afghanistan in 2002.66 Al-Qaeda was merely one actor with which the Americans had to contend. Taliban was another: a nationalist, religious movement that maintained power through somewhat tenuous alignments with regional warlords (most of whom chose to jump ship when the Americans arrived). Since the fall of Taliban, however, the country’s different factions have proliferated. Up to 13 different warlords and 4,000 separate armed groups have been identified. Many of the latter number no more than 20 members; others number in the thousands. Taliban has fragmented into several local or territorial
groups. Meanwhile, Al-Qaeda has relinquished its operational profile and assumed an enabling role for groups who share similar operational aims. In Iraq, the situation is even more confusing. There are now some 20,000 insurgents in the country working under the banner of 60 identifiable groups. The shape of the insurgency is quite different from anything the US has encountered in the past. The insurgency has no centre of gravity and no clear leadership. It has no ambition to seize and actually hold territory, though some insurgent groups are not averse to forcing the coalition to fight for it (as in the battles of Fallujah and Najaf). More to the point, the insurgents do not seem to have a unifying ideology. In short, the insurgency is taking place in an ambiguous and constantly shifting environment in which constellations of cells gravitate to one another from time to time to carry out armed attacks, trade weapons and engage in joint training. The cells then disperse, and often never cooperate again. Chris Hedges, the former war correspondent for the New York Times, calls such places ‘Hobbesian playgrounds’.67

In such a kaleidoscopic environment it is difficult to identify what we would see as genuine political actors. What of private militias, such as the force associated with Muqtada al-Sadr (which is splintering into rival groups)? Or the foreign volunteers? Or the criminal gangs? In so far as the criminal gangs are subcontracted to others, they are part of the political environment to be coerced, defeated, contained or simply bought off. Each of these political actors is a striking example of what Hobbes, with reference to the rise of private armies and armed political factions in his own day, called ‘worms in the intestines of the state’, gnawing away at what is left of civil society.68 Instead of killing off the host body, they gradually drain it of its vitality as its lifeblood haemorrhages out.
Even earlier in Bosnia, private armies (83 irregular units in all) did most of the damage. Some were made up of former prisoners released from jail. Arkan’s notorious ‘Tigers’ were recruited from Belgrade’s Red Star soccer team. One of the Bosnian bands was led by a convicted rapist and another was led by a former mob boss and racketeer. What passed for ethnic conflict seems to have been something far more banal: the rise of communities of criminal violence who saw war as a mode of capital accumulation. The anthropologist Clifford Geertz called them ‘homeboy subcontractors’ who were willing to kill on contract to, and in the name of, their respective leaders: Milosevic, Karadzic, Tudjman and Izetbegovic, the contending warlords who emerged from the detritus of Tito’s Yugoslavia.69

What makes Iraq different is the 3,000 or more foreign volunteers. It is reported that over 700 Somali fighters with combat experience went to Lebanon to fight the Israelis alongside Hizbollah in mid-July 2006. This is part of the new reality to which we try to adapt our discourses on war. The old counterinsurgency doctrines presupposed that the population from which the insurgents were likely to draw their recruits was concentrated in a single territory. (If they did not recruit, they tried at least to win the hearts and minds of the target population.) The population is now dispersed; it is even global. No foreign volunteers came to Malaya to fight for the communist guerrillas. Now global Jihadists are a reality
and many are drawn from the 63 countries (including Britain) in which Muslims represent a significant minority of the population.

Contrary, however, to received opinion, war is not escaping state control. It still has a political character. It is still the work of well-armed, well-organised collective groups whose aim is to capture political power, if only to promote what many of us consider non-political ends. In the case of Al-Qaeda, the US is dealing with a politically motivated revolutionary movement. In Iraq it faces sectarian and nationalist insurgents fighting under different flags. Assassination, ambush, sabotage and kidnapping may all be criminal acts, but when systematised they form part of a larger picture that can be folded perfectly well into our traditional discourses on war. This is not to deny that the networked movements we find ourselves fighting are incredibly difficult to defeat. Take out one node and another arises, and the network rearranges itself accordingly. Nodes increase their importance by absorbing more information and processing it more efficiently. Insurgents learn faster than ever before. The logic of insurgency is not based on command structures – the insurgency in Iraq has no centre of gravity. The strength of a network is that it is self-organised and often short-term. Above all, it is non-linear.70 Non-linearity posits that a ‘system’s effect’ (for example, an insurgency) is not the result of adding together the individual components (in the case of Iraq, the 60 or more individual groups). Something else called ‘emergence’ is involved (i.e. regularities of behaviour emerge and transcend their own ingredients).71 The components of a system, through their own interaction, spontaneously generate a collective property which is not (necessarily) implicit in each group or component.
Or, put another way, to quote Urry, ‘such large-scale patterns or properties emerge from, but are not reducible to, the micro-dynamics of the phenomenon in question’. What we find in Iraq is a network characterised by ebb and flow. It is not surprising, Castells writes, that non-linear warfare, in eliminating the notion of a front, allows small, autonomous units possessing high firepower, real-time information and rapid mobility to swarm around isolated coalition units in Iraq in hit and run attacks.72

Swarming

Swarms are networks of distributed intelligence; the same kind of intelligence enables bees, ants and termites to evolve complex forms of collective behaviour on the basis of very simple rules of interaction between their individual members. What is remarkable about swarms is their resilience and flexibility as amorphous ensembles in which no single individual is critical to the continued existence of a successful operation. There is something about the way individuals interact in swarms that produces a kind of coherent group behaviour. Emergent properties enable the whole to be more than the sum of its parts. Kevin Kelly describes the swarm model as ‘systems ordered – a patchwork of parallel operations where in the absence of a chain of command what emerges from the collective is not a series of critical individual actions but a multitude of simultaneous actions whose collective pattern is more important’.73 The fluidity of swarms allows
these forms of social organisation to adapt more rapidly and effectively to the unforeseen – at least much more rapidly than the hierarchical, top–down organisation of Western militaries. Discussing ant colonies, Nicolis and Prigogine observe that:

[a] permanent structure in an unpredictable environment may well compromise the adaptability of the colony and bring it to a sub-optimal regime. The possible reaction towards such an environment is to maintain a high rate of exploration and the ability to rapidly develop temporary structures suitable for taking advantage of any favourable occasion that might arise. In other words it would seem that randomness presents an adaptive value in the organisation of society.74

What enables ants to identify the optimal state of fitness in a given environment is the process of distributed computing. Despite their limited perceptive apparatus and intelligence, a colony can identify the shortest route across a rugged landscape fairly rapidly. This is achieved by the secretion of pheromones, which enables its members to communicate with each other. In that sense, pheromones can be thought of as packets of information broadcast or communicated within a particular ant colony. In other words, tiny pieces of local knowledge acquired by individual ants combine to constitute the emergent larger knowledge that characterises the colony as a whole. As Kelly puts it, ‘this calculation perfectly mirrors the evolutionary search: dumb, blind, simultaneous agents trying to optimise a path on a computationally rugged landscape. Ants are a parallel processing machine.’75 It is this distributed form of computing which invests complex-adaptive systems with their ability to change and come up with creative solutions to new problems. When translated into the field of insurgency we can see how swarms are the natural corollary of the constitution of networked forces; they are an emergent phenomenon produced by information sharing.
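The pheromone mechanism can be sketched as a toy version of the biologists’ ‘double bridge’ experiment (a minimal illustration of my own, not drawn from Nicolis and Prigogine): two routes of different length, ants choosing in proportion to pheromone, and shorter trips depositing pheromone at a higher rate, so that positive feedback lets the colony as a whole ‘compute’ the shorter path without any ant knowing it.

```python
import random

# Toy double-bridge sketch: each dumb, blind agent follows only local
# pheromone concentrations, yet the colony converges on the short route.
random.seed(1)
lengths = {"short": 1.0, "long": 2.0}
pheromone = {"short": 1.0, "long": 1.0}

for _ in range(2000):
    total = sum(pheromone.values())
    # choose a route in proportion to its current pheromone level
    route = "short" if random.random() < pheromone["short"] / total else "long"
    # deposit inversely proportional to trip length (shorter trips reinforce faster)
    pheromone[route] += 1.0 / lengths[route]
    # a little evaporation keeps the system able to explore
    for r in pheromone:
        pheromone[r] *= 0.999

print(pheromone["short"] > pheromone["long"])  # True: the colony has 'computed' the shorter path
```

No individual compares the routes; the comparison is performed by the distribution of pheromone itself, which is the sense in which the computation is distributed across the colony.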
In the future we are likely to witness insurgent operations evolving even faster than they have already. Indeed, we may well see small groups or dynamically reconfigurable packs working together, rather like the cells in our own bodies. ‘As such, they will be able to be far more discriminating and precise in the effects they cause. They will become less mechanical and more organic, less engineered and more “grown”.’ In military terms, this means that the old waves of the past – the mass linear attacks that characterised the attritional warfare of World War I or the mass attacks by thousands of Chinese ‘volunteers’ thrown against Allied positions in Korea – will be replaced by swarms. Rather than throwing themselves onto the enemy, they will converge from all directions for offensive bursts, often short in duration, thereby maximising the shock effect. While Linda Beckerman does not employ the term ‘swarming’, this is really what she describes as the ‘mass and disperse’ tactics that were deployed by Somali fighters during the famous Battle of Mogadishu in October 1993 (the theme of Ridley Scott’s film, Black Hawk Down).76
Swarming is achieved when the dispersed nodes of a network of small (and also perhaps some large) forces can converge on an enemy from multiple directions, through either fire or manoeuvre. The overall aim should be sustainable pulsing – small networks must be able to coalesce rapidly and stealthily on a target ... a small network should have little to no mass as a rule (except perhaps during a pulse), but it should have a high energy potential – like a swarm of bees that can fell a mighty beast, or a network of antibodies that can attack a spreading virus.77
Dispersing after an attack renders insurgents far less vulnerable to counter-attack. Swarming is not a new tactic. It can be traced back to the hordes of Genghis Khan and the German U-boat packs during the battle of the Atlantic. Similar tactics have also been employed in a non-military context by civil rights activists in the 1960s and anti-globalisation protestors today. What makes swarms so difficult to break – and break into – is that they are not hierarchical. They are shape-shifting in form, cohering briefly around opportunities, then changing direction and even targets. They have no fixed trajectories. Swarms are not teams; they have no centre of gravity because they are not organised around a familiar division of labour. To invoke Durkheim, they have only ‘mechanical solidarity’: they replicate the behaviour of others and thus appear to have strategies, but they have no recognisable strategic purpose. Terrorism of this kind, in Clausewitzian terms, is an unpolitical force because it does not represent a continuation of policy by other means. In a classic swarm there is no ‘direction’ (as we understand the term when looking at team efforts which involve specialists, managers and members with creative ideas, but perhaps without the managerial or entrepreneurial skills to realise them). There are no cells to penetrate for that reason; there is no leadership to subvert. Swarms accordingly have no history and no future – instead they occupy a perpetual present. Yet they are networked and what the network provides is a fluid, self-reinforcing process in which any small initial change is amplified by positive feedback. It is also highly contagious. An event seen on television can stir others into joining a conspiracy. In combating such groups states find themselves in a difficult position. Ethics has become networked. There is now a collective, global awareness of unethical acts. 
In the words of one US Air Force officer, '[the Internet] brings a global connectivity to the kill chain'. He was not talking about ethics; he was referring to network-centric warfare which is the ability of the US to network and thus transform the face of battle by integrating all its forces for the first time. But it strikes me that his remark could also be applied to 'the calculus of pain' that constitutes war as we practise it today. The kill chain is now transparent as it has never been before.

Networked morality

In a bitter satire, 'How We Lost the Hi-Tech War of the Future', Charles Dunlap writes in the role of a revolutionary leader from a Middle Eastern group or nation which has successfully turned modern morality into a weapon against the US:
Our strategy was to make warfare so psychologically costly that the Americans lost their will to win. To do so, we freed ourselves from the decadent West's notions of legal and moral restraint ... We would rather be feared than respected.78

The West has had to confront this moral dilemma for some time. It came across this dilemma in Somalia when human shields were used by the warlords to defend their soldiers, and a few years later in Iraq when Saddam Hussein decided to ring his palaces with civilians prior to an impending American air attack. Also, during the Kosovo war the citizens of Belgrade were encouraged to wear 'target patches' and line the main bridges outside the city to deter Western air strikes. None of these events deterred the West from intervening, but they did make it more fearful of losing the propaganda war. The point is that none of these conflicts were as networked as conflicts are today. The problem is that dubious ethical practices, especially when exposed by the media, increasingly alienate what we call global public opinion – and the global is becoming increasingly performative. Once images get established in the public mind, they are difficult to dislodge. The performative must be understood not as an act, as such, but rather as 'the reiterative and citational practice by which discourse produces the effects that it names'.79 All of which is just a social scientist's way of saying that the feedback mechanism is all important. Scandals such as Abu Ghraib feed back into international opinion. By structuring and framing the discourse, the media creates the reality that international public opinion perceives. And the US has found to its detriment that bad practices have adverse consequences. In his book Political Scandal, John Thompson discusses four processes that disembody criminal actions (or actions deemed to be criminal) and invest them with global reach.
The same model applies to acts that are deemed unethical by global public opinion.80 The first is that damage occurs when actions are seen to violate particular norms, particularly when they conflict with the behaviour we expect from the society in question. And though we do not always like to admit it, most of us tend to expect higher standards from some societies (like the US) than others. As Osama Siblani, the publisher of an Arab/American news service, wrote one month after the World Trade Center attack, the US has lost the public relations war in the Muslim world: 'they could have the prophet Mohammed doing PR and it wouldn't help'.81 The cost of disappointment is much greater than it would be for a society such as China, from which we expect much less. No-one protests about the continuing illegal occupation of Tibet or the vandalising of its culture; everyone complains about Western indifference to the plight of the Palestinians. Disappointment matters when the principal dangers we face are not embodied in arms and weapons, but in individuals and intentions. The trouble arises when a state is seen to act 'out of character' by behaving badly. Sometimes the situation can be deeply ironic. Not even Al-Qaeda operatives expect to be treated well if captured by the Jordanians or the Egyptians. An Al-Qaeda manual found in the Al-Farook training camp in 2004 sets out these double standards. Nothing much was
to be done, its readers were instructed, if they fell into Arab hands. Most would not live to tell others about the experience. But if captured by the Americans, they could at least provoke their interrogators into hurting them so that they could show their bruises and broken bones to the Red Cross, and thus win some points in public diplomacy. If all else failed they could inflict damage on themselves by ramming their heads against prison walls or beating each other up. The American reluctance to inflict bodily punishment was taken as further confirmation that they were not 'warriors'. Indeed, by some standards they are not worthy enemies at all precisely because of their reluctance to inflict pain, which in some traditional cultures is often taken to be evidence of belief in the cause.82 This first process illustrates very well what Rorty calls the irony of liberalism; we liberals should indeed find our views ironic because we are condemned to doing what we liberals must do. We cannot escape the liberal trap; we can only hope that we are on the side of history. What is particularly ironic is the fact that others should judge us as we do not always judge ourselves; others expect more from us than we expect from ourselves. Like all societies, we compromise and find ourselves making difficult moral choices. As liberals, of course, we can demand nothing less. The second process Thompson discusses concerns the violation of trust. This is important for the US (the greatest liberal power of all). The demand-side model applies here. Much of the legitimacy of America's role as a global manager derives from the fact that it provides a service we all need: it meets a global demand. It is this that invests the use of force on occasion with the aura of what John Ikenberry calls 'a semi-constitutional right of restraint' over countries or actors who break international law. No other country could have removed Iraq from Kuwait or, for that matter, Saddam Hussein from Iraq.
In the former it had public support; in the latter it forfeited public support. Nevertheless, the US is the only power able to enforce international law if others break it, and conversely the US is one of the few powers that can defy international law – but at a cost, of course, to its international reputation. The third process is public exposure. The exercise of power has been increasingly transparent since the invention of photography. As Susan Sontag wrote in her final book, Regarding the Pain of Others, photography is the only major art in which professional training and years of experience do not confer a decisive advantage over the untrained or inexperienced.83 This is for many reasons, among them the larger role that chance plays (finding oneself in the right place at the right time) and a bias towards the spontaneous. What the technology of photosensitivity introduced is the definition of photographic time as no longer passing. It was time that 'gets exposed', time that breaks to the surface, the time of the sudden 'take'. What distinguishes photography from all other art forms is that it can capture the telling moment. Searching for a literary language that could encapsulate the reality of the Great Depression, James Agee referred to the camera as a metaphor for the truth in his seminal book Let Us Now Praise Famous Men.
All of consciousness has shifted from the imagined ... to the effort to perceive simply the cruel radiance of what it is ... This is why the camera, it seems to me, next to unassisted and weapon-less consciousness, is the central instrument of our time.84

Indeed, throughout the 1920s camera-seeing was exalted as a particularly cruel form of vision because it caught the real, unadulterated by any attempt at explanation or rationalisation. André Breton called it a 'savage eye' for that reason. Like Goya graphically illustrating the atrocities of the Napoleonic War in Spain, what an artist shows is synthetic – things like this happen. What a photograph shows is evidence that things did happen. It does not evoke, it shows; it illustrates as well as corroborates. Sontag cited three colour pictures by Tyler Hicks which appeared in The New York Times on a page devoted to the War on Terror. The triptych depicted the fate of a wounded Taliban soldier who had been caught by Northern Alliance soldiers in their advance towards Kabul. The first panel showed him being dragged on his back by two of his captors. The second showed him gazing up in terror as he was pulled to his feet. The third showed him at the moment of death, supine, with arms outstretched and knees bent. Pity and disgust were amplified precisely because the viewer could imagine what other cruelties were not being shown on film.85 Much has changed in recent years. The media itself can now make transparent what governments would rather conceal. One reason why the military has to be even more true to its own ethical codes of behaviour than ever before is that the war zone is no longer a military-dominated space. It is shared with a multiplicity of actors, including NGOs, private security companies, the media and a variety of allies (in coalitions of the [un]willing).
It is no longer a purely military space as it was in the past, but a networked one in which the military is only one element (and by no means necessarily the major element). The military's actions, however, are far more newsworthy than those of any of the other actors in legitimising and de-legitimising a particular mission. Even in the hands of a soldier committed to the cause, one mobile phone camera, or a military blog connected to other websites, can amplify an abuse of power with incalculable results (most of which are usually harmful for the cause or the mission). Blogging has made everyone his own reporter. The way in which a campaign is reported and generally perceived has an immediate impact. Soldiers operate in an environment which is much less permissive than ever before. Any sense of grievance or injustice can be invigorated – or reinvigorated – by news stories. Even when successful, military operations can inspire people to join a Jihad or global campaign. It is impossible for the military to control the flow of information. The new matrix of real-time recording and uplink technology means that even the most remote, hostile and (in theory) operationally secure locations are transparent. This point has not eluded policymakers. As Donald Rumsfeld told an audience in New York, the US government still functions as 'a five and dime store in an eBay world'.86 Al-Qaeda, on the other hand, has effectively exploited the information revolution to wage the world's first cyber-jihad. In testimony to the House
Armed Services Committee on the treatment of Iraqi prisoners Rumsfeld also bemoaned the fact that US Central Command (CENTCOM) had begun looking into abuse at Abu Ghraib four months before the first pictures appeared on television. Photographs unfortunately matter; they are the lifeblood of a connected world. CENTCOM’s mistake had been to see the issue of abuse as a criminal investigation. In reality, twenty-first-century digital technology simply short-circuited twentieth-century organisational practice. The point is that the news environment has changed significantly since 9/11. First, the arrival of multiple 24-hour news broadcasting channels has created an entirely new dynamic. Stories and images are now always in demand 24 hours a day. This is very different from the news bulletins broadcast by the main channels only four times a day. This increase of reporting has broadened the coverage, but paradoxically narrowed it at the same time. There is a continual demand for more images and stories while the main story is often lost sight of, or the bigger picture is obscured. Military commanders, writes one BBC veteran, must now accept: an ever-sharper political and military vulnerability which forces the machinery of government to be reactive and provide political accountability to a public which swiftly sees and hears unfiltered and possibly distorted versions of events that are not channelled through the public information processes of the military or government.87 Second, ‘digitalisation’ has democratised news gathering. Images can be caught on digital cameras, laptop cameras and 3G mobile phones. News is no longer defined by what a broadcaster chooses to film or a producer finds particularly newsworthy. Stories become news when they are received; pictures (and even stories) can be received at any time of the day, from anywhere in the world. 
We are all potential broadcasters, and the point is that broadcasters will take photos from any source simply to generate news. A new force of 'citizen journalists' has opened the world of news reporting to anyone with a keyboard and an Internet connection. Each blogger is open to bias (as much as any reporter), but as a group bloggers offer an audience that is searching for the truth boundless material to digest. Blogs and bloggers are changing the landscape of reporting for they have created a new class of experts. They are neither journalists nor defence correspondents who have spent years in the field; nor are they scholars sequestered in their own academic world of conference papers. Bloggers are interested observers who often know as much as journalists, and can write just as well. Above all, they can communicate news much faster. A blogger can aim for a much narrower segment of the reading public than a newspaper or a television channel could possibly target. Not only are there millions of blogs, but readers can post comments that augment the blogs, and the information in these comments zips around blogland at the speed of electronic transmission. Blogs, as a result, are making governments more transparent than ever before. They are a feature of the 'exposure culture' which can make and unmake reputations overnight.
Also, the cost of news reporting is falling as much as the cost of terrorism. On 11 September Al-Qaeda was able to turn two aircraft into guided missiles for $160,000. In March 2004 the cost of destroying commuter trains in Madrid came to only a tenth of that. The cost of media exposure of abuses is lower still. In information terms, writes the BBC World broadcaster Nic Gowing, a $40 flash card and a $300 digital camera can have a similar impact. Third, digital archiving means that even the images that are not broadcast immediately can be restored and retrieved later. They can be stashed away for future use. Many archives can be found on the Internet – anyone looking for information has never been better equipped. News aggregation sites like Google News draw together sources from around the world. The website of Britain's Guardian newspaper now has nearly half as many readers in America as it does at home. The point is that the shocking never dissipates. One recalls André Malraux's description of a photographic gallery as 'a museum without walls'. The photograph gets exposed, but its exposure can be delayed. It is often all the more shocking when revealed later, especially if it confirms us in what we already know, or merely suspect. Photography aspires to be revelatory. It comes with the built-in claim that it conveys a truth that could never have been revealed otherwise. Take the picture of Saddam Hussein's botched execution, for example. The chants and taunts of his Shia guards were broadcast across the Internet. More than 2,500 copies were made in the months that followed; the most viewed version was seen more than 16 million times, according to Google Video.88 All of this can confront the viewer or newspaper reader with information overload. As Ignacio Ramonet claims, in the last 30 years more information has been produced than during the previous 5,000 years.
Ramonet illustrates this point with a telling example: a single copy of the Sunday edition of The New York Times probably contains more information than a cultivated person of the eighteenth century could expect to consume during a lifetime.89 Information overload is the product of digital media, including the dedicated news channels that claim to update the news hourly. What gets lost in the process is context and understanding; what is not lost, unfortunately, is the credibility of the stations that broadcast the news. In Milan Kundera's novel Slowness (1995), one of the key characters is a Czech scientist who sits in a hotel room reflecting on the sheer speed of news broadcasting. He mulls over the fact that fragments give way to new fragments without any particular order or over-arching theme. He eventually realises that it has become patently impossible to weave those snippets of news into a larger story, or the narrative we used to call history. And all of this has taken place at a critical time when broadcasters themselves have been sealed off from the larger military picture. Even many of the embedded journalists who accompanied the Marines into Fallujah in November 2004 had little sustained contact with the armed forces. Therefore, they probably had very little real sympathy or understanding for working conditions, operating restrictions or the pressures which the Marines faced from all sides. If the purpose of television should be to inform the public about what is happening 'out there', it is questionable whether the medium is fulfilling that purpose. Indeed, it
might be fulfilling a quite different purpose altogether. All of which seems to exemplify Henry David Thoreau's observation: 'Our inventions are wont to be pretty toys which distract our attention from serious things. They are improved means to unimproved ends.' 90 Thompson's fourth and final point is that the media enhances the visibility of power; it draws governments into the ambit of visibility. And it is bodies that are made especially visible; it is the human body that speaks to the world 'intimately' and 'close up'. One especially striking example occurred during the Israeli invasion of Lebanon in 1982 when the PLO put out a picture of a girl who had lost both arms in the attack. American public opinion was so alienated that President Reagan prevailed upon the Israeli government to call off the bombardment of West Beirut for a time. Later it was discovered that the girl had lost her limbs not due to Israeli actions, but to Arab crossfire. But that is not the point. What the story illustrates is how a single image can define an entire campaign. Television deals with specifics, not abstractions or the big picture. And specifics which engage the audience with the frailty of the human body have a particular power to move opinion. Urry calls this 'performative biopower'. The point about abuses such as Guantanamo Bay or Abu Ghraib is that they are all small (7,000 people crowded into a small prison), but bodies tell a story. The small tends to snowball as 'cascading effects' produce real damage. Images have the power to diminish reputations by exposing illegitimate power or the illegitimate use of power. Images can dis-empower the powerful. And in a competitive age where every broadcaster is at war with another, images are sold and resold abroad, and stories traded, analysed and discussed. On the global media networks, what we perceive is real. And the media address different audiences: the nation, the wider Arab world, an Iraqi audience or insurgent groups.
Some groups are extremely adept at tailoring their deeds as propaganda. Such images encourage people to mimic the actions of others, and inspire some, in turn, to join insurgent groups. Sometimes this can work to the military's advantage. In April 2004, US Marines were accused of attacking mosques in the first battle for Fallujah. In November, they returned with embedded journalists who later broadcast to the world (not that the world was listening) that the insurgents were sniping at the Marines from the minarets of the city's mosques. Usually, however, it works out differently. The first Arab television network, Al-Jazeera, was set up in 1996, coinciding with Bin Laden's first call for a holy war against the US. Since then, Arabic satellite channels and Jihadist websites have proliferated, sensitising Muslim opinion to the oppression of their co-religionists in Kashmir, Palestine, the Balkans and elsewhere. Undoubtedly, these grievances have fuelled the spread of Al-Qaeda's ideology, just as they underpinned the rage of the 9/11 hijackers. Iraq has clearly acted as a magnet and an inspiration for foreign fighters, many of whom appear to be young Arab men from neighbouring countries, and easily rallied to the call of Jihad. But there is no sign that for the past couple of years they have been under Bin Laden's personal direction, as opposed to being inspired by his message. The same goes for the 7/7 bombers in London whose
links with Pakistan at first suggested contact with Al-Qaeda militants who had woven themselves through their country. Later we found that they had been inspired by the great pool of disaffected young men and teenagers who were Pakistani by nationality and often Pashtun by ethnicity. These young men swill around Pakistan's western cities and see 9/11 as the defining inspiration of their adolescence. For decades such young men, both Arab and Pakistani, have found justification for their anger in the Israeli-Palestinian struggle. Their countries' population booms, combined with the sudden spread of television, have made their anger combustible, sparked by a shocking image on the overhead television of a street café.

Terrorist memes

Perhaps we should see the rise of Islamism, as mediated by the Internet and global media networks, as an example of what Richard Dawkins famously calls a 'meme'. Dawkins made his name by writing about the 'selfish gene'. Perhaps we should entertain the alarming possibility that terrorist ideas are self-serving; that they owe their success to their peculiar ability to turn men's minds to serving the ends of terrorist ideology rather than human beings themselves.91 Roughly, this is what Dawkins meant when he coined the term 'meme': an element of culture that works in the mind something like a virus, changing the behaviour of the individual it infects in such a way as to help spread itself from one mind to another, even though the carrier may derive no personal benefit, and may actually be harmed. The 'selfish gene' is dedicated only to its own replication. So is the 'selfish meme', as Ferguson explains in The War of the World; he uses the term 'a virus of the mind' to suggest that in the twentieth century racism spread between peoples not because it benefited them, but because it benefited racism.92 This suggestion is not an outlandish one, though it has been criticised by many.
Daniel Dennett, for example, has persuasively argued that just such an analysis might explain certain aspects of religious belief. A meme can be any non-genetic material transmitted from person to person: a word, a song, an attitude or, indeed, a religion. Whether they exist or not, memetic imitation is a good way of explaining how ideas spread.93 It is possible to object that memes do not have strategies for insinuating themselves in one's brain, but neither do genes. Yet geneticists write about genes 'replicating' themselves, and 'competing' for space in the gene pool. The justification for this, as a metaphorical shorthand, is that natural selection preserves those genes that happen to act as if they were pursuing a strategy. Likewise, the ideas that win out are those that evolve, or compete, more successfully than others. The critical question is: why do some ideas survive when they are not necessarily good for us? Dawkins and those who agree with him are not claiming that memes are necessarily close analogues of genes, only that the more like genes they are, the better meme theory works. For example, memes that are good at getting copied become more numerous at the expense of those that are not. Racism, as we saw in chapter two, could have become copied globally in the second half of the twentieth century; national liberation wars could very easily have become race wars. The racism meme was, in fact, copied in the cases of ethnic cleansing in the Balkans and genocide in Rwanda, but both were isolated instances. Terrorism is proving to be much more contagious, especially when combined with religious belief. What meme theory tells us is that some memes survive not only because of their direct appeal. They also flourish in the presence of other memes as part of a memeplex.94 One of the most tenacious memes is honour, and when honour gets mixed up with religion we have a potent mix. In many Islamic societies honour has an especially tenacious hold over the collective imagination. Their honour codes approximate to our own of 300 years ago in that they are inseparable from religion. The West is engaged with an adversary that is the product of one of the world's great unreconstructed and unreformed honour cultures at a time when the fortunes of the West's own honour culture are at a low ebb. When people feel dishonoured by Western actions that appear to show insufficient respect for the faith or the people who embrace it, the results can be lethal. When added to an age-old belief that the West has weakened the Muslim world through colonialism, the mixture can become particularly toxic. Arab and Pashtun ideas of honour are an important aspect of militant Islam, and are too often neglected by politicians when committing armed forces to ambitious projects of post-war reconstruction.95 Given what we know, I think that memetic natural selection of some kind offers a plausible account of the evolution of Islamic terrorism. It certainly accounts for the emulation of terrorist movements independent of their own actions. The Al-Qaeda leadership is unlikely to have been especially provoked by Abu Ghraib or Guantanamo Bay, but their supporters, or those who derive inspiration from bin Laden, are a different matter.
This does not rule out the additional role of manipulation by terrorist leaders. Terrorism, at least in part, is ‘intelligently designed’, just as fashions are in art. But memes tend to evolve and what the West does – or is seen to do – can influence their evolution.
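Dawkins's logic of differential copying can be made concrete with a minimal replicator-dynamics sketch (an illustration of my own, not drawn from the text; the `replicator_step` function and the fitness values are hypothetical). The only thing 'fitness' measures here is how readily an idea gets copied, not whether it benefits its carriers, which is precisely the point of the selfish-meme argument:

```python
def replicator_step(shares, fitness):
    """One generation of replicator dynamics: each variant's share of
    the population grows in proportion to how far its copying fitness
    exceeds the population average. Shares always sum to one."""
    avg = sum(s * f for s, f in zip(shares, fitness))  # mean fitness
    return [s * f / avg for s, f in zip(shares, fitness)]

# Three competing ideas, initially equally common. The middle one is
# slightly better at getting itself copied; the third slightly worse.
shares = [1 / 3, 1 / 3, 1 / 3]
fitness = [1.0, 1.2, 0.9]
for _ in range(50):
    shares = replicator_step(shares, fitness)
# After many generations the most copyable idea dominates the pool,
# regardless of what it does for the people who carry it.
```

A small edge in copyability compounds generation after generation, which is why, in the book's terms, a meme that flourishes inside a wider memeplex (honour plus religion, say) can crowd out its rivals so quickly.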
Conclusion

In our networked world, in short, ethics has become more important than ever. A blog site, a camera in a mobile telephone or a lone bystander uploading a video on YouTube can make the exercise of power by a country such as the US even more pregnant with moral consequence. The blogosphere has made the exercise of power even more transparent. A country's reputation can be lost early on (and this was the real tragedy of Abu Ghraib and Guantanamo). Once this happens its digital imprint can never be erased. It is likely to be preserved online. The historian Fritz Stern once claimed that in history countries get few second chances. True or not, the persistence of memory in electronic form makes second chances even more elusive. Once lost, reputations are not won back easily. But let us not end on too pessimistic a note. The blogosphere can also create opportunities: how you engage others, how you treat POWs and how you keep to your discourse on war are uniquely your own choices. This allows a state to differentiate itself from others – especially from terrorists whose own discourse on war
is so one-dimensional and immoral, so self-serving and remorseless, so impoverished in terms of moral empathy for its victims that it can become self-defeating very quickly. In the conduct of war there is still immense variation; where a broad spectrum of variation exists, opportunities can, and do, arise. How a country acts can influence whether others accept the legitimacy of its cause or calling.
5
Grammars of killing
The atrocities perpetrated in Iraq happen on an almost daily basis in what has become a civil war; it is difficult to contemplate these events in a detached way. It is true that life somehow continues for the majority, but it is that very use of 'somehow' which tells us that mere existence has become the aim of political life – what has disappeared is that something we call civilised living. That disappearance owes much to the attacks by foreign volunteers, Shia militias and Sunni insurgents, whatever responsibility the Americans and British must assume for allowing the situation to get out of control. Whenever societies find themselves confronting a different reality they inevitably question whether the old ethical codes are applicable. Can they be applied in a context of car bombers, suicide attacks on regular forces and the decapitation of prisoners played out on television? Can they be applied to groups or individuals who employ a grammar of killing that is radically different from our own? I use the last term advisedly. There is a sentence in Nietzsche's work which is more than usually arresting: 'I fear we are not getting rid of God because we still believe in grammar.' A few years ago it was as fashionable to declare 'the death of language' as it had been in Nietzsche's time to declare 'the death of God'. The printed word, we were told, was about to be replaced by the digital image. Language is not dead. Variations of syntactical constraint still determine the nature of what is said and done. Is a suicide bomber a martyr or a terrorist? The adoption of a certain vocabulary, the use of a certain term or even the choice of a transitive or intransitive verb can shape our awareness of a phenomenon as it can shape the experience and the understanding of a cause for the suicide bomber himself. Language is never neutral. Every institution claims its own; every cause has its own grammar of action, including killing.
And these days instead of God’s death it is his reappearance which authenticates the killer. The role of grammar, of course, is to make conscious the ways in which language unites meaning and function. Grammar is particularly difficult for those who already know a language because function and meaning are fused and seem indivisible. The function of much of the violence we find in the greater Middle East (the ‘arc of extremism’, as Tony Blair used to call it) is revenge. And the meaning is what we in the outside world make of it. Are we intimidated, coerced or persuaded to desist from intervening?
In this instance I take the grammar of killing to mean the articulation of an act: how we perceive it, how we reflect on why others do what they do and how we tend to experience it once done. George Steiner provides a useful definition in his book Grammars of Creation: grammar is ‘the nerve structure of consciousness when it communicates with itself and with others’.1 In terrorist acts killing is all about communication. This is very much the case in the Middle East where killing usually takes an expressive form. Killing is not only an action, but also a speech act. It involves signs, gestures and expressions. We should know this from everyday life. We talk of ‘body language’. We claim we can tell what someone is really ‘saying’ by their expression. We ‘talk’ in our expressions. Evolutionary psychologists think we arrived with the gift of language hard-wired into us. They talk of the ‘language instinct’ for that reason. Through culture we found the methods and opportunities to turn this instrument into words. In war words are turned back into instincts, basic expressions or a body language, if you like, aimed at other bodies – in the case of the War on Terror, our own.
In those places where it happens, the survivors, the people nearby who are injured, sometimes, months later, they develop bumps, for lack of a better term, and it turns out this is caused by small fragments, tiny fragments of the suicide bomber’s body. The bomber is blown to bits, literally bits and pieces, and fragments of flesh and bone come flying outward with such force and velocity that they get wedged, they get fixed in the body of anyone who’s in striking range ... They call this organic shrapnel. (Don DeLillo, Falling Man, 2007)
Every society has its own discourse on war. Moral concepts are rooted in time and place, writes Alasdair MacIntyre. They change as social life changes. In his book A Short History of Ethics MacIntyre adds that he deliberately did not write that ‘because social life changes’, for such a claim might have suggested to his readers that social life is one thing, and morality another. And that, he adds, would be wrong because moral concepts are embodied in, and are partially constitutive of, particular forms of social life.2 A court of appeal between different cultures would work no better than a court of appeal between Marxists and Liberals. Indeed, Marxists contended that moralising could no longer play a genuine role in the settling of social differences. Moralising could only invoke an authority which masked social coercion (which for them constituted the class war itself). As MacIntyre adds, speaking from within our own moral language, we will find ourselves bound by the criteria embodied in it, which we will share with others who speak the same moral language.3 Postmodern societies such as our own speak a more humanistic language than in the past. Our language is largely directed at the needs of the body. Our language is no longer directed at the needs of the soul (as it was for much of the
twentieth century). It is a language in which the ‘sacred’ inheres in the secular social imagination. All of us have to adopt the moral vocabulary of the time if we are to have a social relationship including, in the case of war, a distinct grammar of killing. Without the cultivation of virtues we would not be able to share ends with anyone else. We would not be able to work together or engage in collective projects. Like Rorty, MacIntyre argues that each culture has its own view of human nature. ‘The choice of a form of life [culture] and the choice of a view of a human nature go together.’4 From the first, I found Arthur Adkins’s seminal study of Greek ethics, Merit and Responsibility (1960), especially impressive. In it he reminded his readers that their own moral codes differed radically from those of the Greeks. For a Westerner, for example – especially one writing in the mid-twentieth century – concepts of duty and responsibility were central concepts of ethics. In this respect, we are all Kantians. We all ask ourselves, or are encouraged to ask: ‘what is our duty in these circumstances?’ It is a question we ask in any matter which requires a moral decision. And since we tell ourselves that an ‘ought’ implies a ‘can’, anyone who has to pass judgement on any action must first enquire whether the agent did or did not do his/her duty; whether s/he could or could not have acted otherwise. On that basis we also ask whether an agent should be held responsible for his/her actions. The Greek world, by contrast, was quite different from our own. It was so different that ‘duty’ cannot be translated in the Kantian sense into an ethical terminology at all. For us this is quite remarkable. More remarkable still, there is no Greek word for ‘duty’. There is neither an equivalent connotation nor an equivalent emotive power.
Once we grasp why the concept of moral responsibility was so unimportant to the Greeks we can understand the difference between our moral systems and those of some of the enemies we find ourselves fighting.5 In other words, every historical era has its own understanding of the rights and rites of killing; from the killing grounds of ancient Greece in which Hoplite met Hoplite, to the hand-to-hand combat zones of the Middle Ages, to the anonymous killing fields of World War I, which were essentially industrial slaughterhouses through which soldiers were processed in assembly-line destruction. It is likely that none of these situations would have surprised the Greeks, but they would have been surprised by the nuclear bomb – a weapon which would have negated the whole purpose of war from their point of view. A nuclear exchange between the Superpowers would not have sent a message or communicated a purpose. It would not have helped either power to win the argument. A nuclear war would have ended any conversation. Let us return to our own literature on war. Killing is embedded in a cumulative past and a manifold present, and is conducted to coerce others. Killing is rhetorical. It aspires to be heard and to persuade, which is why it is grammatical. The semantic leads into semiotic, into making and communicating sense. In our society, the taking of life has been refined. We have made it – or we aspire to make it – more discriminating. What is new is our wish to minimise suffering where we can, especially through the use of non-lethal weapons. Non-lethality does not
spell the end of the war or in a teleological sense the end to which it has been heading. It is merely encoded in the social practice and life of a particular culture: our own. It is socially embedded. The problem is that in a networked world, we meet with resistance all the time. In a globalised world cultural differences are magnified, which is why we have become even more aware than we once were that there are many different grammars of killing. In any grammar of killing what is important is the ability to inflict pain. What distinguishes us from all other animals, however, is that we also humiliate each other. Expressive violence, in other words, builds on our dual nature as pain-suffering and shame-feeling beings. It is in part biological, but it is also cultural. What makes us different from other species is our ability to feel shame. We are shamed by different things at different times. But shame is what terrorism is all about. Expressive violence is all about the symbolic importance of the act. As the Dutch anthropologist Anton Blok writes, what is important is the meaning expressive violence has, both for the victim (fear, humiliation, personal loss) and the offender (position, status, prestige and, above all, reputation). Expressive violence is performative; it conveys a message to several different audiences at the same time – from Western public opinion to the men and women on the Arab street. All expressive violence is also highly ritualistic – from a duel to a public execution; from a blood feud to a terrorist act. It involves, adds Blok, protocols or etiquettes from ‘specifications of time and space, the presence of special persons and special outfits, the use of a special vocabulary – in short, they have a formalised, theatrical character’.6 There is not, alas, much inter-subjective experience here.
In a ritual beheading broadcast on Al-Jazeera or a suicide bombing reported on a night-time news channel, the enemy is very much ‘the other’. He is robbed of his humanity, which serves the function not only of placing distance between the perpetrators and the victim, but also sets the latter apart from the perpetrator’s ethical community. It is also intended to be a mirror-image of our own grammar of killing – killing from both an emotional and, where possible, a physical distance. A society that wants to democratise the Middle East is one that is usually once removed from the means. A society which is fighting in the name of God, a cause or even an ethnic group (as in the Balkans) is one that may wish to sacralise its mission through sacrifice. The videotaped beheadings of foreign construction workers or the bombings of people at prayer are involuntary ‘sacrifices’ – the counterpart of the suicide bomber. In both cases it is important that their actions should be visceral and intimate. Blok encourages us to recognise that violence and its use are contingent on time and place. They vary with historical circumstances and depend on the perspective of those involved – both of the offenders and the victims, as well as of the spectators, the bystanders looking on. But if the use of violence can be primarily understood in terms of symbolic action, such is not to imply that we are dealing only with symbolic violence. What it does imply is that the effective use of physical force depends very much on its symbolic form.
Take dismemberment or decapitation, for example. Most societies across the centuries have attached great importance to the integrated body and a proper burial. The decaying corpse is a sign of corruption (metaphorical and real). What is important about the staged television killings of kidnapped victims in Baghdad is not that human beings are killed, but that they are decapitated. What is involved is a certain ‘protocol’ (Blok’s word). It is taking ‘blood for blood’ – it is a ritual.7 It is also a sign of victory. ‘Whilst severed heads always speak, they say different things in different cultures ... Cross-culturally taking and displaying an enemy’s head is one of the most widely distributed signs of victory.’8 The popularity of decapitation across cultures, including the guillotining of victims in the French Revolution, is probably explained by the fact that we tend to think of the head as the most individual part of the body. The violation of reputation (i.e. humiliation) is also expressed through the medium of the human body. Such humiliation moves across a spectrum from the shaven heads of the women who collaborated with the Nazis in France to branding, mutilation and, ultimately, the decapitation of investigative journalists such as Daniel Pearl in Pakistan. As Blok adds, terrorism is a form of ritual sacrifice, and the point about sacrifice is that the victim must be innocent. The innocence of the victims is implied in the randomness of their death. The indiscriminate nature of terrorist attacks is meant to inspire fear. Things are ‘said’ as much as ‘done’. One of the overriding aims of expressive violence involves (the winning back or retaining of) honour. This imposes sacrifices of its own, as the sociologist Georg Simmel wrote years ago about the aristocratic honour codes that were still alive in the Germany of his own day.
To maintain honour is so much a duty that one derives from it the most terrible sacrifices – not only self-inflicted ones, but also sacrifices imposed on others.9
Those who issued a challenge deemed the duel to ‘say’ something about their own honour, reputation and status. It is significant that in certain circles duelling was still practised in Simmel’s Germany. For we tend to forget that the instrumentalisation of violence which we take for granted today is very recent. ‘Nothing plagueth England but the many breaches and ever unsure, never faithful friendship of the nobles,’ complained a writer in 1565.10 He was only describing a fact: that the great families of sixteenth-century England were often apt to indulge in violent demonstrations of their power and status, relying on the authority of the sword (which Hobbes, writing a century later, took to be the essential sanction of all power). When analysing this problem of the Tudor era in a trail-blazing book of the 1960s, the Princeton historian Lawrence Stone recounted one episode in the 1580s when two prominent and influential courtiers came to blows over the fact that the son of one had fathered an illegitimate child by the daughter of the other. Thanks to the studied neutrality of the state, two great families were allowed to commit murder after murder with complete impunity through hired killers (or ‘cutters’) in acts of sporadic violence in London and occasional pitched battles in the countryside. Both in the brutality of their tactics and their immunity from the law, Stone could find no parallel other than that of the Chicago gangsters of the 1920s.11
Many forces, Stone reminds us, converged in the Western world to remove the centuries-old danger of expressive violence. Some were economic, some legal, others psychological. With the dominance in the field by infantry pikemen, and as technical services such as pioneers and ordnance increased in importance, war ceased to be a game for high-spirited young men from aristocratic families. Strength, courage and even skill in horsemanship were no protection against small arms fire. But among the most important explanations for the decline of expressive violence was a change in concepts of aristocratic honour, or a social revolution in concepts of duty. Duelling gave way to litigation in the law courts. Honour took on an instrumental hue that involved public service.12 As another British historian, John Hale, remarks, while armies were still captained largely by the same old military caste, the motives that led them to fight had become ‘de-classed’ (i.e. they had become much like anyone else’s).13 Stone characterises this as a failure of political nerve (it is part of his general thesis on the crisis of the aristocracy in late-sixteenth-century England which led to changes in aspirations, behaviour patterns and economic circumstances, and which as a result paved the way for the aristocracy’s return to power a century later, until well within the memory of Stone’s own generation). Instrumental reason became the tipping point: violence was legitimated only when it was used as an instrument of the state. The corollary of this was the view (which is still widely shared) that the ‘unauthorised’ use of violence is illegitimate. What is illegitimate, in turn, should be considered anomalous, irrational and disruptive (the antithesis of civilised behaviour, something that has to be brought under control). A by-product of this was the assumption which we now take for granted: that the Western nation-state is the logical end of ‘modernisation’.
The problem with this view is that it ignores the function violence serves. It may reinforce self-esteem and social status. As such, the resort to violence may be a perfectly rational decision when the costs are outweighed by the benefits: In this calculation dominant cultural attitudes about violence may play a significant role. For example, cultures which judge violence as a powerful and definite response to ‘insult’ and as a good way of restoring ‘honour’ will support individual decisions to use violence.14 In a sense, this is what terrorism involves. In claiming that terrorism is expressive, I am not attempting to rule out an instrumental bias. As one writer contends, the problem of assuming that expressive violence is irrational is that it tends to ignore the function violence serves for the perpetrator. In focusing solely upon the goals of the offender it overlooks the instrumentality of the aggressor. We should not deny rational choice even in the case of suicide terrorism. All social violence fulfils a function. Invariably it involves a weighed, rational decision in which the costs of violence are outweighed by the benefits that follow from it. Many suicide bombers feel compelled to resort to the tactic to achieve what they perceive as justice for their society, their family or themselves.
The suicide terrorism phenomenon is more widespread than we imagine. There are at least ten religious and secular terrorist groups today that have used suicide terrorism as a tactic against either their own government or others. They include the Islamic Resistance Movement (Hamas), Hizbollah, the Armed Islamic Group of Algeria (GIA), the Liberation Tigers of Tamil Eelam in Sri Lanka and the Al-Qaeda network. There were also four pro-Syrian, Lebanese and Syrian political parties engaged in suicide terrorism in the 1980s, although they are currently inactive on the terrorist front. Between them they staged about 25 suicide attacks in Lebanon during this period. Of all these groups, the most successful has been Hizbollah. A Shi’ite Muslim group supported by Iran, the movement was characterised in the 1980s by its invocation of divine sanction. As a group it was notable for the number of Shi’ite clerics that were linked to, or prominent in, its activities. Most of them shared a common formative experience in the Shi’ite shrine city of Najaf. Najaf’s role is vital if we are to understand the organisation because it was there that, from the late 1950s, the faithful gathered to re-assess the state of Islamic values, which they felt at the time were under threat. Najaf was also where Ayatollah Khomeini spent 13 years of his exile from Iran, delivered his landmark lecture on Islamic Government and called upon Muslims everywhere to lay exclusive claim to political rule.15 The method of attack most prized by Hizbollah (the suicide bomb) was formally endorsed by Sayyed Fadlallah, a leading Shi’ite cleric. Initially adopting a position of ambivalence towards the attacks, he increasingly came out in support of them and the individuals involved. What is interesting is that the justification relied on a highly instrumental understanding of their effectiveness.
If an oppressed people does not have the means to confront the United States and Israel with the weapons in which they are superior, then they possess unfamiliar weapons ... Oppression makes the oppressed discover new weapons and new strengths every day ... When a conflict breaks out between oppressed nations and imperialism or between two hostile governments, the parties to the conflict seek ways to complete the elements of their power and to neutralise the weapons used by the other side ... These initiatives must be placed in their context. [The aim of such a combatant] is to have political impact on an enemy whom it is impossible to fight using conventional means, though his sacrifice can be part of a Jihad, a religious war. Such an undertaking differs little from that of a soldier who fights and knows that in the end he will be killed. The two situations lead to death; except one fits into the conventional procedures of war, and the other does not ... Muslims believe that you struggle by transforming yourself into a living bomb as you struggle with a gun in your hand. There is no difference dying with a gun in your hand than exploding yourself ... The death of such persons is not a tragedy, nor does it indicate an ‘agitated mental state’. Such a death is calculated; far from being a death of despair, it is a purposeful death in the service of a living cause.16
Fadlallah articulated the complex logic by which suicide operations in Lebanon could be, and still are, not only politically justified, but also morally endorsed. This mechanism of moral disengagement was simplified for the ranks of Hizbollah by lesser clerics who used terms like ‘the attainment of paradise for the souls of the shaheed (martyrs)’. Fadlallah, in contrast, never referred to the souls of the individuals who took part in these operations. In late 1985, he added: We believe that suicide operations should only be carried out if they can bring about a political or military change in proportion to the passions that incite a person to make of his body an explosive bomb ... Present circumstances (however) do not favour such operations any more, and attacks that only inflict limited casualties and destroy one building should not be encouraged, if the price is the death of the person who carries them out.17 Such reservations were ruled out by other leaders who were intent on targeting Israeli forces in South Lebanon, especially in 1988. It is clear that the legitimacy of the method rests largely upon its effectiveness, not on its theological implications for the bombers themselves. What makes this ‘grammar of killing’ so disturbing for Western societies is that our own etiquettes of atrocity are increasingly associated with the fate of the body. Ethics has become visceral, embodied and rooted in suffering. It is only as embodied beings, vulnerable and mortal, that ethical demands arise, and it is only as embodied beings that ethical demands can be met. Values are increasingly lived as affects (e.g. strongly felt emotions or gut reactions). We experience a sense of outrage, a profound feeling of injustice. Values such as compassion and justice are increasingly linked by embodied subjectivities prior to any rationalisation.18 One writer who spent much time charting this change in the ethical climate was Michel Foucault.
One invokes him with some hesitation, for there is in his writing something of the showman, a man claimed very early by the particularly bracing climate of French intellectual life in which the paramount aim is briller – to shine. But his work is always rewarding even when he is wrong, and in this particular case he has something important to contribute to our discussion. He reminded his readers that blood was once central, too, to the mechanisms of power in the Western world. Foucault was obsessed with power and how it pervaded every walk of life and every institution (from the family to the prison). We must be careful in reading his work not to become equally obsessive. But there are a few paragraphs in the first volume of The History of Sexuality which repay close attention. For much of European history, he maintained, blood was an essential element, as it is in the Middle East today. It owed its high value to its instrumental role (the right to shed blood was claimed exclusively by the state as the modern age dawned). Blood was also central to a society that well into the nineteenth century was still divided into different social orders or ‘blood lines’. It also functioned in terms of value; to be prepared to risk one’s blood was the mark of an aristocrat, as opposed to a merchant. In much of modern Europe power spoke
through blood. Blood myths, after all, were central to Fascism, and provoked World War II, the greatest bloodbath in European history.19 Writing in the 1970s, Foucault claimed that blood had been superseded by sexuality. Our views of the body have not simply evolved, but have been shaped within culture. Religion, for example, has shown an abiding interest in what we do with our bodies and how we think about our sexuality. The sexual codes, and even identities, that we have taken for granted as natural have social origins. Sex has a history – and Foucault set out to write it. But Foucault was not alone. Indeed, he drew on a wider set of sources that maintained that sexuality has been subject to socio-cultural moulding to a degree surpassed by few other forms of human behaviour. Our sexuality is neither fixed nor stable. It has been shaped by history. One of the features of our own age is that the focus is changing once again: the body – its fitness, health and well-being – is displacing our preoccupation with sex and sexuality as the main focus of social concern. For reasons which go very deep psychologically, we now focus on some of the sentiments relative to what Pareto called ‘the integrity of the individual’ and ‘the inviolability of the body’.20 The occasions of bodily exposure and body contact are carefully regulated in all societies, and very much in our own. We address the body (its nurturing); life (its extension); and health (its vitality or stamina). We conceive of the body, writes Bryan Turner while developing an insight by the neurologist Oliver Sacks, ‘as a potentiality which is elaborated by culture and developed in social relations’.21 This is a statement of universal validity – it applies to all cultures. But what distinguishes our own culture is that both the elaboration and the development of the body have taken a new turn.
We live in an era in which personal health is more important to us than ever before, and in which the body can be artificially enhanced, both cosmetically and neurologically. We are obsessed with its status and ultimate fate. Our lives (and their usefulness, richness and fulfilment) are defined not in terms of the soul or the spirit, but health and life choices, and the lifestyles that impact upon both. The body, adds Zygmunt Bauman, has become ‘autotelic’, a value in its own right.22 In a Google search of the Internet conducted in July 2004 Bauman found over 300,000 websites containing information about dieting books and a further 700,000 dedicated to the art of slimming. Thirty-two million websites discussed the issue of ‘fat’.23 Instead of engineering the human soul we are intent on reengineering the human body. The body, worshipped, displayed and, in some cases, rebranded, has become a consumer object in its own right. When we turn to societies living outside the postmodern experience we find a very different world. Do societies that attach more importance to the soul than the body put more emphasis on the sacredness of the struggle? The words ‘sacrifice’ and ‘sacred’ are etymologically related. And it seems that societies willing to make more sacrifices are those who are deemed to be less modern. In the US Civil War, for example, the Confederacy fought with a great deal more enthusiasm than the Union. There were no draft riots in Atlanta or Richmond as there were in Chicago and New York. In World War II, German soldiers, especially on the Eastern Front, fought with greater passion than the Allies. Perhaps sacredness
is defined not by what is right or wrong, but by the level of a society’s sacrifice.24 The soldier who gives his life for his country is promised instant redemption. This was once true, of course, of the West. Maurice Barrès coined a resonant term when he visited Verdun in April 1916 (the scene of the bloodiest battle ever fought in Western Europe) and referred to the road by which he reached the battlefield as the Rue sacrée (the Sacred Road). Through repetition the route became the Voie sacrée (Sacred Way). In this latter form it struck a chord in the public imagination, for it seemed to evoke the Via Dolorosa (the Way of Sorrows) taken by Christ to the Cross. It also compared the sacrifice of French soldiers who fought in the battle to Christ’s crucifixion. More than 50 years of living in a secular republic may have diminished traditional modes of thought and feeling that Catholics might have been taught, but it did not expunge them entirely. The language of the poilus was shot through with religious symbolism. And though it was not openly compared to Calvary, Verdun was still referred to as an ‘altar’ or a ‘martyrdom’.25 Such sacrifices possibly transcend moral evaluation. We must not lose sight of the fact that ‘sacrifices do indeed matter; they are true’. And unless we accept the ignominy of unspeakable insult, we should recognise a similar truth for those who in the Middle East today are prepared to bear testament to their version of the truth in the currency of their own bodies. There is a terrible logic here, of course. The word ‘sacred’ is derived from the Latin word sacer, which is usually translated as ‘sacred’, but also sometimes as ‘accursed’. Many of the soldiers who fell at Verdun certainly felt that they were cursed. At the height of the battle, the words chemin de l’abattoir were chalked on a wall in the suburbs.
Those who fell in the battle were called ‘sacrifices’ or, in the original French, holocaustes (burnt offerings).26 Today we are a long way in time from Verdun. When we look at the phenomenon of the suicide bomber we find an anthropological gulf between ourselves and much of the world. We too were once prepared to sacrifice ourselves for the nation, the people or the cause. Some went to their deaths without complaint in the name of History, in the upper case. We are no longer willing to do so. What has changed, writes Luc Ferry, is our ethical landscape. The concern for ‘otherness’ in contemporary philosophy takes the form of a ‘religion of the Other’ (as opposed to the 1916 religion of nationalism). Today no Frenchman would willingly die for his country, still less for the European Union or the European idea. Our revolutionary impulses have faded.27 If we are prepared to sacrifice ourselves at all, we tell ourselves it has to be for other human beings. Ferry sees in the loosening of communal ties – whether religious, ethnic, national or familial – a move away from a religious concept of sacrifice to the idea that sacrifice is required only by and for humanity itself. The new transcendence (sacred = sacrifice) is no less imposing than the old, but it is imposing in a different way. Previously, the ‘sacred’ came before ethics, which it claimed to ground; today the sacred comes from ethics. The ethical demand that we treat others as we treat ourselves requires us to sacrifice ourselves only for other people. ‘If we are willing to see in sacrifice one of the measures of the sacred, as the very etymology of the word invites us to do, we have to complement
the language of religion and of ethics with that of the representation of human feeling.’ 28 The sacred is now incarnated in humanity itself. Ferry concludes that the ‘sacred’ in our world is no longer rooted in a tradition whose legitimacy is linked to a religious revelation that precedes conscience but is situated very much in the human heart. What he calls ‘transcendental humanism’ is a concept that gives us access to a genuine spirituality rooted in human beings instead of the divine. This is, of course, far removed from the Middle East where the divine is all important, and where men (and women) are still willing to die and kill in the name of God.
Respecting our enemies
These are bold claims that some would contest. But there is enough evidence to suggest that transcendental humanism is not only the ethical demand of the hour but has also become part of what one writer calls ‘the culturally constructed pacifism of the West’.29 Perhaps that looks at it too cynically. Ethics, like war, evolves. For some time we have been attempting to make war more humane, not only for ourselves, but for those we fight. As an injunction, ‘killing them softly’ may strike a hollow note, but it is where the US military has been going for some time. We may indeed debate the moral worth of this trend, but one thing cannot be challenged: to attempt to reverse the trend would be difficult. Modern morality is embedded in treating others as we would wish to be treated ourselves. Modern morality is embodied at this moment in our history in inter-subjectivity. As MacIntyre might add, it is now socially embedded in the ‘grammar of killing’ we have adopted. In recent years, psychologists have begun to direct particular attention to inter-subjectivity – especially in infancy, when it is first learnt. Some now posit an innate human propensity for mutual engagement and responsiveness which is part cognitive, part intellectual. It is also emotional.30 But in any case, human character and experience exist only in and through people’s relations with each other. Anthropologists have also come to recognise that knowledge is not to be found in individuals; it is social. The implications of this idea have only recently begun to be explored in the social sciences. Without inter-subjectivity we simply would not know the world. This is why, at the beginning of the information age, we have begun to use the term ‘distributed processing’ to speak of human thought. In invoking such terms we draw an analogy with the technology that has become the everyday medium through which we communicate with others.
Indeed, it would seem that in terms of networking and feedback loops (the basis of cybernetics), the human brain is similar to the Internet in our computer-run societies. What each of us learns and uses is much more than each of us has in his or her head. As Maurice Godelier writes, humans produce culture in order to live. That phrase ‘in order to live’ bears a fuller sense than just a material claim that we depend on others for our very existence (the division of labour). We live by means of relationships both intellectually and emotionally. The speech we learn only makes sense in respect
of the others from whom we learn it and to whom we direct it. The values that determine our behaviour are sensible only in the perception of others or in our imagination of others' perspectives.31 Inter-subjectivity has been at the centre of modern morality since the Holocaust. Of all the philosophers who have written about ethics in the light of this central event, the most important was Emmanuel Levinas. He survived the Holocaust, as did his wife and daughter (who were hidden first in Paris and then in a convent in southern France). The rest of his family was not so fortunate. Largely from his own experience of World War II, Levinas concluded that we live in a social world, and that our subjectivity arises from our ethical responsibilities in it. He also claimed that ethics is not the ethics of absolute freedom, for we can never be free from the obligations and responsibilities we have to others. The central theme of his life's work as a philosopher was that no ethical system can treat another person as so radically and memorably 'other' as to place them beyond the power of sympathy or fellow feeling.32 One objection that can be made to this bold assertion is that it transfers an absolute, theologically derived imperative to the context of human inter-personal relations in a way that is metaphysical rather than prudential. One must tread carefully, of course, in view of Levinas' moral authority as a Jewish thinker. His reflections on the Holocaust explain his understandable scepticism about human behaviour, his questioning of all secular values and (in the case of this study) his objection to the notion that enlightened self-interest can ever leave room for a measure of 'other'-regarding sentiments. But for writers such as Rorty, Levinas' metaphysics was the problem. He attempted to fuse the theological and philosophical discourses into one as, prior to Kant, they usually had been.
In doing so he grounded ethical behaviour on impossibly high criteria. All ethical practices, I would hazard, must be based on modest performance: their claims have to be humanly sustainable and firmly grounded in dispositions that can readily become second nature to us. As David Wood argues in a recent study of Levinas' thought, the main problem with his ethical injunctions is that they leave all of us in an impossible position. Either we must be prepared to adopt such a critical view of humanity that we will tend to see human nature as irredeemable – a view which is far too despairing even for those who study war as a profession – or we are in danger of being left with an extreme Levinasian position which requires us to negate the self and accept unconditional responsibility to the 'other' beyond any first-person-singular sense of shared ethical concerns. And that would demand too much of human nature (after all, we are all fallible human beings, products of the Fall). As Wood suggests, a better response would perhaps be to eschew metaphysics altogether and claim instead that our obligations to others stem from the very practical insights we have gathered over the centuries into human suffering. Otherwise, to admit to an 'infinite obligation' to the 'absolute alterity' of the 'other' could easily translate into viewing the latter as so completely alien to 'us' that they cannot lay claim to humane or civilised treatment by the standards of our own culture.33 It is surely through a knowledge of history that we can best grasp the importance of treating enemies (as Schmitt might argue) not as absolute entities at all,
but ‘real’ flesh and blood people living in ‘real time’. The ancient Stoic philosopher Epictetus has a nice phrase: ‘When we become men, God delivers us to the guardianship of an implanted conscience.’ 34 This implanted conscience should be the main source of moral sanctions. We are obliged to be moral for purely prudential reasons. When that moral sense is absent from war, disaster is likely to follow. Our moral sense can be reinforced by legal codes and moral covenants, but ultimately it does not need them. The principles of duty (or etiquettes) that soldiers have even to the enemy they engage in battle derive their force from prudence, experience and memory. In short, our ethical codes stem from the inter-subjective relationship we have with other people. Soldiers, after all, live in the same community of fate as those they are asked to fight. Yet if Peter Berger is right that ‘the most pressing moral imperative in policy making is a calculus of pain’, then it is only right that we ask how we can act morally when the sole purpose of war is to persuade the enemy to surrender through a very precise calculus of pain. One response is that if we must be prepared to impose pain on others for a goal we consider important (or do so by default because we have been left with no choice), then we must do so in a manner that allows those we coerce to retain (if they survive) some self-respect. And this also must include, of course, the society from which they come and to which they will return when they re-enter civilian life. There are many synonyms in sociology for respect which delineate its many different aspects. These aspects include ‘status’, ‘prestige’ and ‘dignity’. The social vocabulary of respect is especially brought to life in war. Not every defeated side forfeits the respect of its enemies; not every victorious soldier always wins prestige in the eyes of others, even his own side.35 Respect does not derive from status or personal accomplishment. 
It derives from our common humanity, and especially and most immediately from the dignity we accord to the human body. In the past dignity was simply derived from an act of creation: we were all made in God's image, a reality that transcended all man-made distinctions or social codes of honour. The modern world found a secular equivalent for human dignity. Richard Sennett traces it back to Cesare Beccaria, who was one of the first jurists to argue that torture detracted from human dignity. It was an argument taken up by other eighteenth-century writers such as Fichte and Thomas Jefferson.36 The act of respecting the pain of another is what confers on human beings a secular dignity. What etiquettes of atrocity acknowledge in the modern era (since the mid-seventeenth century, in Geoffrey Parker's view) is a modern 'moral sensibility'. In literature it is expressed most strongly in George Orwell's novel 1984. One of the citations from The Theory and Practice of Oligarchical Collectivism (a book which, in the novel, O'Brien claims to have co-authored) quite openly explains that 'the object of torture is torture'. We do not still read Orwell because of his vision of the future or his artistry (by some lights he was not even an especially accomplished novelist); we read him, explains Lionel Trilling, because of a certain 'quality of mind'. 'This quality may be described as a sort of moral centrality, directness of relation to moral and practical facts.'37 One very
plain moral fact is that it is wrong to torture another human being because it outrages our liberal sensibilities. It is at odds with the chief characteristic of early-twenty-first-century liberalism: our dislike of cruelty. Orwell's central insight is that torture is never used to extract information – the usual defence for its re-introduction. Its real purpose is to humiliate the person being tortured. The purpose of the North Vietnamese torture of US prisoners of war, like Senator John McCain, was to get them to sign confessions of guilt or indictments of the country that had sent them into battle. This was not done to embarrass the US in the court of world opinion. The intention was to embarrass the tortured – to strip them of their own beliefs, the social language which constituted their identity. As Elaine Scarry writes, the real agony of torture is not pain, which passes soon enough. The real agony lasts a lifetime. The purpose of the torturer is to get a prisoner to do or say things. If possible the torturer even tries to get the prisoner to believe them, which will always be a reproach to the person concerned. Failing that, the next best thing is to get him to betray his beliefs, his country and, especially, his friends. Betrayal is the worst of all sins and once committed it can never be expiated.38 In Orwell's novel Winston betrays Julia, the person he loves most, when he screams out that he would rather she was tortured than he. At that point, he is lost. He will never be able to deny the memory of that betrayal. It will haunt and psychologically scar him for the rest of his life. In his own eyes he will become a diminished human being. As liberals we will also become diminished selves if we deny what Kant calls the humanity in ourselves (which is central to the imperative to treat ourselves and others as an end).
Non-lethal weapons

Alone, at last, in our human world, we shall have to look at ourselves in the mirror of historical reality. And we may not like the vision.
(Manuel Castells, The Rise of the Network Society, 2000)

One of the greatest challenges that suicide bombers pose, writes Madeleine Bunting in The Guardian, is that they challenge our view of how war should be fought. The West can now only kill from a distance, preferably from several thousand feet up in the air, or several hundred kilometres away on an aircraft carrier. It is the very proximity of these suicide missions which is so shocking. This kind of intimate killing is a reversion to pre-industrial warfare – the kind of brutality seen in the Thirty Years War, for example. Suicide bombers are a new permutation of old traditions; they have no monopoly on the horrors they reveal of the human psyche and its capacity to destroy life.39 The mechanism of psychological 'distancing' made it possible for one American soldier to fire canisters of buckshot at jeering children in Mosul while explaining, 'It's not good to do dude; it could be fatal, but you've got to do it'.40 Far from resigning itself to this reality, the US is trying to change the
face of battle. Non-lethal weapons (NLW) are, possibly, the future. One day it may be possible, in the words of one of the characters in Michael Crichton's novel The Lost World, to design 'a non-lethal area neutraliser' that will not kill anyone at all. Even in the Western grammar of killing, the virtue of actually killing people has begun to be questioned as never before. This is very recent. In the early 1990s a number of articles referring to 'soft kills' (attacks that disabled without destroying) and 'mission kills' (attacks which rendered a system inoperable without necessarily killing its operators) began to appear in the press. Later, experts began talking about 'a kinder, gentler kind of war'. Several articles appeared with provocative titles such as 'Killing them Softly' or 'Bang – You're Alive'. The term 'non-lethality' has since entered general use.41 For some time the US has been trying to limit the fall-out of war: the death count or tally. We may be a long way off from deploying non-lethal weapons in large numbers, but the Americans are not alone in going down this route. The technological prospects are endless. One involves high-powered microwave or 'directed energy' weapons that could cause disorientating pain, with apparently no lasting damage, by playing with the nerve endings in a person's skin. A military affairs analyst elaborates:

Microwave weapons work by producing an intense surge of energy, like a lightning bolt, that short-circuits electrical connections, interferes with computer motherboards, destroys memory chips and damages other electrical components. They send a narrow beam of energy that penetrates about one-tenth of an inch into the skin, to where the nerves that cause pain are located.

One officer who has experienced these weapons adds, 'all the glossy slide presentations cannot prepare you for what to expect when you step in the beam'.
The weapon, apparently, is now in an advanced stage of development.42 One of the main justifications for NLW is the military's commitment to greater humanity in warfare. It has been a long time coming, but the prospect was actually glimpsed long before the end of the Cold War. In his paper 'Non-lethal and Non-destructive Combat in Cities Overseas' (1970), Joseph Coates defined the term 'non-lethal' for the first time. What is remarkable about the piece is the rationale he put forward for thinking about war in radically different ways:

Future conflict generally may have vague, uncertain, or shifting objectives. Consequently, a more intimate interplay of military and political goals, tactics and implications than has been customary may prevail. There will be both more intermingling of aggressors and civilians and a greater blurring of the distinction between the two ... These points all argue for less profligate killing and less wanton destruction of property.43

Most of the issues Coates raised were ahead of their time. They were 'rediscovered' when the US intervened in Somalia in the early 1990s. The US Marines deployed NLW for the first time in a combat zone when they returned to evacuate
the remaining UN personnel in February 1995. Challenged to defend their usefulness, Colonel Mike Stanton remarked at a press conference that they showed that the US was willing 'to go that extra mile' to keep from having to kill anyone.44 He insisted that they were not at odds with the warrior ethos that the Marines are so assiduous in cultivating. Initial concerns that NLW would make troops 'soft' proved unfounded. The US Marines did not seem any less ready to use lethal force when required to do so. The head of the Marine Corps later wrote to Senator Robert Smith that NLW offered 'tremendous flexibility ... to warriors in the field of battle'.45 All technology is problem-solving. So what is the problem that the US is attempting to address? The main demand of war today is that the West retains its legitimacy in the use of force. Even if one supplies a service for which there is a demand, one must ensure the service is consistent with the demand. You cannot fight terrorism with weapons that terrorise other people. You cannot fight humanitarian wars if your own methods are deemed to be inhumane. The British were the first to deploy NLW extensively, in Northern Ireland in the 1970s, when they pioneered the use of plastic bullets in riots. However, it soon became clear that these were only non-lethal at a certain range. When fired at close quarters, death could result, and often did. Other ideas were considered at the time only to be rejected. Various acoustic weapons were tested but found to be useless in the narrow streets of Belfast and Londonderry. Large nets in which to enmesh rioters were rejected because of their impracticality.46 The result was that the best non-lethal weapon available to the British was restraint. It has become the guiding principle of the British Army elsewhere, including Iraq. The issue of legitimacy, however, is not going to go away. It is one of the factors that make war more complex than ever.
One of the original advocates of non-lethal weapons was John Alexander, an ex-Green Beret who served in Vietnam, participated in a landmark study by the Council on Foreign Relations, and chaired the first major conference on the topic. He identified one of the key needs of war: to stop defeated peoples from holding grudges and seeking revenge.47 While using maximum force may win battles, it may spur the defeated side to seek revenge. Even if the defeated are reconciled to defeat, or too traumatised or exhausted to retaliate at a later date, a new generation, angered by stories of past atrocities, may be fully prepared to take up the cause as its own. The greatest problem any military faces is ending the cycle of violence which usually follows when atrocities are committed. 'Killing them softly' may allow a society to recover from violence more quickly and, where possible, to draw a line under events and move on. Alexander was particularly concerned that much of the 'cycle of violence' in Somalia, Haiti and Bosnia in the 1990s constituted a post-modern Zeitgeist which was all too markedly reflected in contemporary Hollywood films. Retribution (usually for some gratuitous slight against family members or friends) was the common theme of the day. As a motive for violence, revenge was even considered legitimate – however indiscriminate the response, or however brutal the act. Mel Gibson in Lethal Weapon, Wesley Snipes in Passenger 57, Steven Seagal in Under Siege and
Sylvester Stallone in the Rambo series (the most brutal of all) seemed to express the warrior values of the time. The problem, Alexander noted, was that while the movie audience sought (and found) resolution at the end of the film, violence in real life is rarely cathartic. The cycle of violence usually continues.48 Yet non-lethal force is not necessarily ethical. NLW, especially in combating terrorism, can be used for many practices, including torture. Amnesty International produced a report in 2003 which revealed that tasers were being used by US soldiers to inflict pain on detainees in Iraq. In December 2001, under a Freedom of Information request, the American Civil Liberties Union (ACLU) obtained a memo which detailed abuses by US Special Forces at a temporary holding facility for prisoners near Baghdad Airport. The following February, another report, from the Associated Press, revealed the existence of abusive videotapes from the 'Immediate Response Forces' at Guantanamo Bay. The videotapes showed a platoon leader spraying a detainee with pepper spray before letting the reaction team enter the cell.49 Indeed, non-lethal weapons have been raising serious ethical concerns for some time now. An editorial in New Scientist in March 2005 detailed some morally dubious research in the US on developing a Pulsed Energy Projectile (PEP):

There is something chilling about turning research intended to ease suffering into a weapon that can be used to hurt people. Nociceptors, nerve cells that convey pain in the body, have been studied by researchers trying to relieve chronic pain. It emerged this week that a group working for the Pentagon is using that knowledge to turn the tables; to maximise the pain caused by a non-lethal weapon called the Pulsed Energy Projectile (PEP).50

It is no wonder that pain researchers have expressed their concerns about the plan. Some of the ethical issues raised by the proposed new NLW have not even begun to be addressed.
They include lasers (low-powered to dazzle, higher-powered to heat the skin), anti-personnel weapons, microwave systems (skin heating), chemicals (incapacitating agents), acoustic technology (psychological and physical effects), kinetic munitions (blunt impact), vortex generators (shock waves) and others. Two technologies in particular have established ethical markers for the future. The first is microbial agents (enzymes, bacteria), which technically should be banned under the Biological Weapons Convention. The second involves chemical weapons that can reshape the nervous system: calmatives, or equilibrium agents. (Technically these should also be outlawed by the Chemical Weapons Convention.) Nevertheless, independent of the legal implications of their deployment, it is clear that NLW are here to stay. Directed-energy weapons (in which the US is investing most of all) promise speed-of-light effects at ranges of up to ten kilometres, rapid targeting, precision engagement, controlled effects from 'deny to destroy' (i.e. non-lethal to lethal), low cost per shot and almost unlimited ammunition. These advantages are far too useful to ignore for various protection tasks, including facility security and crowd control. They are particularly relevant given the
three-block war, in which soldiers have to move from intense combat to peacekeeping to humanitarian assistance, and may need alternatives to lethal force.
Conclusion

In essence, there is no real reason why the use of NLW should be unethical. Indeed, their very conception seems to me to be reshaping the face of war in revolutionary ways, because they challenge the metaphysical first principles on which we have grounded most of our ethical practices in the past. The prefix 'meta' (from the Greek) owes its philosophical currency to the Metaphysics, the work that followed the Physics in the collected works of Aristotle. 'Meta' at that time simply meant 'after'. But it also came to mean something more. If change is the defining feature of the natural world, metaphysics is about things that do not change. The Metaphysics of Aristotle dealt with the ultimate and underlying causes of events, the so-called higher knowledge which could include God, the first principles of motion, or the laws governing all forms of causality. In time the word came to mean the study of the ultimate grounds of reality, the first principles that underlie our study of the world. Aristotle identified the subject matter of first philosophy as 'being as such'. Defining the connection between the two is a vexed question. Perhaps this is the answer: the unchanging first causes have nothing but 'being' in common with the mutable things they cause. Like us and the objects of our experience, they are, and there the resemblance ceases. Is metaphysics, then, the science of being-as-such? This would be something of a limited, but still important, definition. If we seize upon it then we have a chance to return to the original meaning of the word. In the case of the ethics of war we have the chance not to ground 'right' behaviour on something 'higher' than the practical world in which we live (which is supposedly needed to inspire us to act well or behave better). We have the chance to make the human body the sole focus of our values and beliefs.
We have the duty, then, to obey the injunction that comes from our capacity to feel pain and our liberal need to avoid inflicting unnecessary cruelty, even in war. In another of his works, George Steiner adds that the ability to hope for a better world, to work towards it despite setbacks and misfortunes, to imagine the world other than it is and to map out an alternative future is what gives us hope. ‘Hope is a grammar too.’ The same might even be said of war as long as we have to practise it. Perhaps war can also be invested with what Steiner calls ‘the grammatology of emancipation’.51
6
The unconditional imperative
‘Being reveals itself in war’ is Emmanuel Levinas’ reformulation of Heraclitus’ famous saying, ‘War is the father of everything’.1 War furnishes us with the metaphors we use, the images we invoke and even our thought patterns, such as Aristotle’s dialectical thinking, Marx’s concept of the class struggle or Darwin’s struggle for survival. We also often feel conflicted or in a perpetual state of war with ourselves. Apparently, there is no escaping it. Levinas also adds that the ‘state of war suspends morality ... [it] renders morality derisory’.2 It is clear why he comes to this conclusion. For him the real tragedy of war is not death, it is betrayal: it persuades us to betray our own substance – in this case our common humanity. There is a passage in Primo Levi’s book If This Is a Man which brings this out very well. A fellow inmate in the lager, already fluent in French and German, asks Levi to teach him Italian. A canto from Dante pops into his head in their first (and maybe last) lesson. ‘Think of your breed; for brutish ignorance/Your mettle was not made; you were made men/to follow after knowledge and excellence.’ To those who are condemned to die a humiliating death in the camps, Dante’s humanism seemed an era away. Yet war – though it may seem paradoxical to claim this – can create a common community of fate in which it is possible, often for the first time, to see that the traditional differences of tribe, religion, race or custom are unimportant compared with similarities all human beings share (pain and humiliation, for example). In Levinas’ case it enabled him to think of his German captors not as people wildly different from himself but as people who he could include in the range of ‘us’. Levinas was writing of the human ‘substance’ in two senses of the term: what is unique to ourselves; and what we share with others, including our enemies. 
When he was a POW in a German camp in 1940 he saw this at first hand: The other men, the ones who were called free, who passed by or gave us work, or orders, or even a smile – and the children and women who also passed by and occasionally looked at us, they all stripped us of our humanity ... With the strength and misery of the persecuted, a small inner voice, in spite of it all, recalled our fundamental essence as thinking human beings. But we were no longer part of the world.
Then one day a dog entered the camp. The prisoners adopted him, and he adopted them. He used to greet them with a happy bark when they lined up in the morning and when they returned from work at night. 'For him – without question – we were men,' Levinas adds sardonically. For the inmates the dog was 'the last Kantian in Nazi Germany'.3 War denies that common humanity when it suppresses the 'other' within ourselves, and also when it destroys those feelings, emotions and sentiments that make us humane. We derive our humanity, he added, from the responsibility we have to others, even to the dead (i.e. our duty to honour them). It is this responsibility which makes us human. Warriors have a code precisely to enforce this understanding, and to remind them of what happens when they act badly. In betraying their code, they are in danger of betraying each other.4 A similar sentiment, cast in a rather different way, can be found in the work of another philosopher, Karl Jaspers. We betray our humanity, he writes, when we cut ourselves off from contact with others. True inter-subjectivity, he adds, demands an unconditional commitment to oneself, one's unit, one's country and one's enemies as well. That is the true substance of the human condition. What is evil by comparison is the immediate, unrestrained surrender to passion (bloodlust generated in the heat of battle) or the urge to seek revenge. Evil lies in the sphere of the contingent. Good inheres in the man who subjects his will to moral laws that extend beyond the moment. The good man is one who subjects his life to what is 'absolute' (or universal). Instead of succumbing to unconditional passions he pursues unconditional ends. The ethical realm, Jaspers adds, also involves the authenticity of our motives. The bad man is the man for whom life is conditional. The good man is one who acts because it is good to do so. Good actions confer their own source of legitimacy.
The man who kills because he is ordered to do so (because he fears punishment if he refuses to obey an order, however contrary the killing may be to the law or spirit of his own profession) is an inauthentic human being because he is not true to his own humanity. The authentic life is to be found in an unconditional commitment to good. Jaspers advances a final argument. Evil is the ‘will to evil’ – the urge to inflict torture, cruelty or annihilation on another. Evil is the nihilistic wish to ruin everything that has value for others. Good, by contrast, is unconditional – it involves love of others. Compassion is rooted in the imagination (we act well, or try to, because we can imagine the pain of others). Although Jaspers’ final argument is couched in a metaphysical language, like Rorty I feel that an appeal to metaphysics is neither necessary nor particularly helpful. Instead we can build compassion on ‘the body in pain’. It is precisely because we are all embodied creatures that we can experience the pain of others through the imagination.5 In all three respects, what Jaspers advances as an ethical demand is the unconditional nature of good behaviour. Only the unconditional character of good invests moral duty with content, an ethical motive with authenticity and a metaphysical will with the will to live rather than a will to cause destruction. Jaspers called this, echoing Kant, ‘the Unconditional Imperative’. It is highly
demanding for it requires us to live unconditionally, to subordinate our lives to something greater than ourselves. He came up with a striking summary of it: 'When we obey the unconditional imperative, our empirical existence becomes in a sense the raw material of the idea of love, of loyalty.' It is only the unconditional which can explain a soldier's willingness to go 'beyond the call of duty' in laying down his life for others. Being true to oneself means being bound to one's duty; it is the only way by which one identifies oneself as a person. It is the unconditional which determines whether our lives (and even our deaths, in certain circumstances) are significant for others or meaningless in our own eyes. It is usually most evident in extreme situations when we have to make a choice. This is why ethics today is concerned with the imperatives of self-confrontation. And yet the unconditioned life, and with it the honour code of our soldiers, is now under threat from three new developments in war. The first is the micro-management of the battlefield, a true echo of our increasingly risk-averse age. The second is the employment of private security companies (PSCs), which have come into their own in the war in Iraq. The third, not as distant as it may seem, is the rise of robotics, which has very significant implications for whether soldiers can continue to regard themselves as moral agents. Camus put the case very well when he wrote: 'We work not on matter but on machines, and we kill and are killed by proxy. We gain in cleanliness but lose in understanding.'6 The moment we surrender responsibility to those behind the lines, contract out our responsibilities to the market or abrogate responsibility to machines, we are in danger of distancing ourselves from the consequences of our own actions. This is the critical insight that Camus offers us.
Micro-management of the battlefield

It is the individual soldier who is asked by the society he serves to be most unconditional in exercising moral restraint. What marks a soldier as a professional is that he is expected to make moral choices and even to learn from his mistakes. In our culture we tend to reward people who make mistakes provided they learn from them, and learn quickly. Only by making mistakes, after all, can we really progress. It is precisely this capacity for the exercise of moral responsibility which has been the distinctive mark of the Western soldier. More than any other agent, writes Michael Ignatieff, it is the soldier who has the central responsibility for introducing the moral component into the conduct of war. 'The decisive restraint on inhuman practice on the battlefield lies within the warrior himself – in his conception of what is honourable or dishonourable for a man to do with weapons.'7 Remove the soldier from the battlefield, train him to fight by proxy or translate his vocation into a financial contract, and you may not reduce what Camus called his 'cleanliness', but you will reduce his understanding. Sometimes this can be taken to extremes through the 'structural micro-management of morality'. Morality can be micro-managed by privileging legal conventions over regimental honour, especially in regard to the rules of engagement, which tell us when it is lawful to fire or not fire on a target, or in
some cases even to fire back. In recent operations we have witnessed the presence of legal advisors further and further down the chain of military command. The most remarkable case was in the Balkans, where it is rumoured that NATO foot patrols were occasionally accompanied by a military lawyer who determined what they could shoot at and what they could not.8

Let me concentrate on my own country’s experience in Basra, where the British Army is operating under a much more restrictive set of rules of engagement than its American allies. Only the separation of the main body of British troops from US troops in Iraq, perhaps, prevented a confrontation between two very different military cultures. ‘We do war differently from the Americans,’ the British Army Chief of Staff told a Parliamentary Committee. These differences came to light when The Guardian ran a front-page story about a British staff officer which dramatically highlighted the tensions produced by this collision of cultures.9 To put it reductively, in counter-insurgency operations the British are wedded to winning hearts and minds. The Americans are much more confrontational, preferring raids, cordons and sweeps with direct, visible engagements with insurgents, such as those which formed part of Operation Steel Curtain in November 2005. American military culture is still quite attritional. General Westmoreland’s dictum – ‘Send the bullet, not the man’ – re-emerged shortly after the fall of Baghdad, as did the ‘body counts’ of the Vietnam War. The American military is also sustained by an intense religious self-belief. When the US Marines entered Fallujah in April 2004 an army chaplain told them that they were ‘tools of God’s mercy’.10 No British padre is ever likely to be found making a similar claim about the men he serves, and would invite disbelief if he did. Without such self-belief, however, the experience of battle can be daunting.
One of the many challenges they now face is that British officers are about to lose their historic power to charge their own men with serious offences, including abuses of human rights. The decision to deny them this power in 2008 comes after an unprecedented case a few years ago, when the commanding officer of a tank regiment was overruled after he judged that one of his soldiers had not committed a criminal offence in shooting dead an Iraqi civilian. On the evidence available, he came to the conclusion that the soldier had acted within the rules of engagement. The case was later referred to the Army Prosecuting Authority, but was dropped by the Director of Public Prosecutions when it came to the Central Criminal Court. The case highlighted the complexity of prosecuting soldiers in war zones when they have to make split-second life-and-death decisions. It is yet another example of how the military is coming under increasing pressure to conform to civic norms rather than its own. Put another way, it is another instance of warriors no longer being allowed to exercise what Emerson thought was central to heroism: what he called the exercise of ‘self-trust’.11

Another problem that British forces have faced in Iraq, according to a leaked report in The Sunday Times, is the stress experienced by soldiers in the field, especially on peace support missions. The greatest source of stress, claimed two senior doctors in the Royal Army Medical Corps, came not from the insurgents or roadside bombs, but from the Special Investigation Branch of the Royal Military
Police.12 Between the invasion in 2003 and November 2005, when changes in the law were first announced to bring military practices more in line with civilian ones, the Police and Prosecution Service investigated 184 complaints against British soldiers. Of these, 164 were subsequently dismissed for lack of evidence. It is no wonder that British soldiers are increasingly concerned about being investigated every time they are involved in an incident which requires them to discharge their weapons. Many of these cases are not investigated quickly; some last for months, during which time a soldier never knows whether he will keep his job, be indicted or suffer the ignominy of being dishonourably discharged. This hazard is faced by US soldiers too, but it is more pronounced in the British military because British troops are operating under a far more restrictive set of rules of engagement. The US Marines, for example, are encouraged to defeat enemies who must be killed if there is to be peace. They are always looking to use the most force permissible. The British, by comparison, are encouraged to use the least force possible. This is not to say that they are prohibited from taking risks which may place civilian lives in danger, but they are not permitted knowingly to act in a way in which civilians may be seriously injured or killed.

This brings me back to Manuel Castells’ acute appraisal of British policy: that it is American policy with a human face. Like their fellow Europeans, the British have gone quite far in recent years in criminalising war. It follows that they can hardly practise it themselves; instead, they find themselves engaged in policing. It could be argued, of course, that this would enable them to relax some of the restraints on the use of force – one of the consequences, perhaps, of abandoning the Clausewitzian concept of war as a duel between (moral) equals. But even if they wished to relax restraints they would not be allowed to venture down this road.
The British have signed up to the International Criminal Court, which encourages many of its members to follow the logic to its end: to see war not as a continuation of politics by other means so much as a continuation of international law. This is not a view of themselves that the Europeans are likely to share, but they should ask themselves whether their own discourse on war is intellectually compelling. The domestic analogy of policing is especially open to question. At home the police’s role is defined by the rule of law. The police rarely employ force. Armies may increasingly look like police forces (with the SOF ‘snatch squads’ in the Balkans, which have so far brought 15 war criminals back to The Hague to stand trial) and the police may increasingly resemble armies (with their armoured cars and SWAT teams), but their operational environments are very different. We police our streets on the basis of overwhelming consent. Even in the most violent areas, such as the banlieues of Paris, we are expected to keep the peace, not make it. In Iraq the British Army is expected to do both, which is why it finds itself in an impossible situation. As my colleague Chris Brown concludes with his usual clear-headedness, far from being morally superior, European rules are effectively inoperable in many situations. The attempt to apply them in impossible situations represents the worst kind of idealism, ‘an attempt to mandate courses [of action] that could only be appropriate if the world were other than it is’.13
Although I have chosen to highlight the problems faced by the British military, the situation is likely to be just as problematic for the US military in the long term. To begin with, the US military is much further advanced in the introduction of networking technologies (especially ‘Blue Force Tracker’) which enable senior commanders to see, and meddle in, matters on the ground. The decentralisation of command initiative and authority is also much less advanced in the US Army than it is in the British Army. But I have highlighted the British case to make a more general point: micro-management in any field does not increase understanding, the only real basis on which moral judgements can be made. Instead, it undermines individual judgement and responsibility. As Paul Cornish rightly concludes, the ethical realm of war deeply touches upon a soldier’s individual responsibility: how (and how broadly) that responsibility is defined; how it is manifested; and how it is put to use. Such traditional ideas as discrimination, proportionality and justice in armed conflict actually presuppose that there are soldiers willing and able to act morally on the battlefield. Excessive micro-management allows politicians to police the exercise of individual responsibility, challenging the self-trust which is at the very heart of the warrior ethos (of which more anon).

Cornish also contends that the micro-management of morality can be pursued in another guise: through technology. As weapons become both ‘smarter’ and more lethal (i.e. as precision-guided munitions promise even less collateral damage in the future), there will be an expectation on the part of many politicians that machines can ‘clean up’ the battlefield. They will expect this ‘clean up’ in both senses of the phrase: in the popular sense of ‘taking out’ the opposition, and in the literal sense of sanitising the battlefield.
As soldiers become ever more distant from their actions in an increasingly interactive relationship with machines, their role as moral agents will be hollowed out even more. What makes technology particularly attractive is that it removes some of the major risks that soldiers run. Indeed, micro-management derives its principal force from what war has become, or is fast becoming: risk management. Risk management brings what Ulrich Beck calls ‘a technological moralisation’ which no longer requires moral or ethical imperatives directly. ‘In this sense ... the calculus of risk exemplifies the type of ethics without morality, the mathematical ethics of the technological age.’14 The Categorical Imperative is one example he offers: Kant’s central ethical demand has been replaced by technical data, such as the mortality rates involved in certain conditions of air pollution. Whatever puts the body at risk is now considered unethical. To invoke the language of risk is to invoke either an amoral language or a moral language with a pronounced technological bias.

Risk is very different from a threat, a hazard or a danger. We often use these words interchangeably, as synonyms of each other. In fact, they are very different. What makes a risk different is not that it is intrinsically more dangerous or hazardous than a threat, whatever its provenance; what makes it different is that we think about it differently, actively trying to avoid it. Strange as it may seem, we used to talk of soldiers ‘hazarding’ their lives in battle – though soldiers were much less prone to use the term themselves. As a
French soldier from the Napoleonic era remarked, the best battle in which he had taken part (Bautzen, in 1813) was one of the last, because he had watched it safely from a distance.15 Nevertheless, what he found remarkable was that his fellow soldiers would hazard their lives if asked. In the end, this is what made them soldiers, not civilians. One of the most powerful moments in Leo Tolstoy’s novel War and Peace comes when Pierre, watching the battle of Borodino unfold beneath him, descends to join a Russian unit and almost immediately finds himself part of a historical drama. In the thick of battle he is awakened to his patriotic affinity with the common soldiers he finds himself fighting alongside. At that point he finds himself taking part in a larger human story.

Soldiers also used to assume that danger was ever-present on the battlefield. It was real, regardless of the decisions we took or did not take. It was dangerous to find oneself in a battle zone, whether you were fighting in the front line or merely in close proximity to an enemy. A danger or hazard was something that occurred externally, an event over which people had remarkably little control. When we talk about taking risks, however, we incorporate internal factors: our own decisions. It is the decisions of our generals which put soldiers at risk in the first place, and soldiers ‘at risk’ may often feel victimised. A risk presupposes a large element of choice that can be mitigated, minimised or avoided. For Niklas Luhmann it involves an element of ‘attribution’.16 What is significant about war is that we are transforming more and more dangers into risks that can be calculated quantifiably. Risks can be attributed to commanders or politicians, who can be held accountable. The deterministic rationality of hazarding one’s life for one’s friends (the ultimate moral choice that a soldier takes on his own account, not his commander’s) is being replaced by the ‘problematistic rationality’ of calculation.
Risks, of course, vary from society to society, as they do over time. Ideas and values change over generations, influencing the way in which risks are perceived. But it is precisely at this point that the language we use becomes interesting. The very meaning of risk today is shaped by the way our society regards its ability to manage change and deal with the future. Until recently we used to talk of a ‘good’ as well as a ‘bad’ risk. Samuel Johnson used to talk of risking ‘the certainty of little for the chance of much’.17 Surely that is the SAS credo: ‘He who dares, wins.’ But no-one today talks of ‘good risks’. In contemporary life, risk-taking is not highly regarded. A risk is seen less as an opportunity than as a problem. As risks become more equated with danger there will be a tendency to adopt strategies that self-consciously encourage risk avoidance. What better way to reduce risks than by subcontracting some activities to others, especially PSCs?
Corporate warriors?

Long before 9/11, contracting and contractors had become part of the American way of war. PSCs such as DynCorp, MPRI and Vinnell first emerged in the 1990s. The war in Iraq has amplified their role. ‘The US Army cannot go to war without contractors,’ claimed the Dean of the US Army War College the year before the World Trade Center attacks.18 The Afghan and Iraq campaigns have given them a
particular boost. Translation, intelligence and protective security services – all vital tasks – have been increasingly subcontracted. It was the poor planning of post-war reconstruction in Iraq, and the sudden need for additional security, that created a ‘bubble’ that probably will not last. Even so, PSCs are not going to go away. What makes them important, of course, is that unlike every other private sector business to which governments now subcontract certain services, they are in a position to kill – it is not their purpose, but it may be demanded of them. From time to time they may engage in ‘immediate tactical harm’ (one of many euphemisms for killing).

The market size of PSCs has grown dramatically, from $55 billion in 1990 to a projected $200 billion by the end of this decade. Throughout the 1990s the value of PSCs with publicly traded stocks grew at twice the rate of the Dow Jones average. What are the main drivers of this expansion? One is the demand for expeditionary warfare. The need for logistics and concurrent low-intensity operations has made it expedient to contract out certain tasks. There is also the demand side: our forces are over-stretched. Only one tenth of one per cent of Americans aspire to join the infantry. Consequently, the US Army is being forced to recruit men aged forty and to relax standards of physical fitness and even previous requirements for IQ. Military manpower has never been lower in real terms, which cannot be said of the many tasks the military is being asked to perform.19

At the moment, PSCs are involved in only two of the three spectrums of war: combat support and service support. The former covers those who directly help the military engage the enemy, such as engineers or intelligence; the latter encompasses those areas, such as logistics, which enable the military to deploy for long periods of time in the field.
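The market figures quoted above can be sanity-checked with a little arithmetic. The sketch below is mine, not the author’s: it assumes the $55 billion figure refers to 1990 and the $200 billion projection to 2010 (‘the end of this decade’), a twenty-year span, and compares the implied growth rate with the 7.8 per cent annual rate cited elsewhere in this chapter.

```python
# Back-of-envelope check on the cited PSC market figures.
# Assumption (mine): the span runs from 1990 ($55bn) to 2010 ($200bn).
start, end, years = 55e9, 200e9, 20

# Compound annual growth rate implied by the two endpoints
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 6.7% a year

# The separately cited 7.8% annual rate, compounded over the same
# span, would imply a somewhat larger end-point market size.
projected = start * 1.078 ** years
print(f"$55bn at 7.8% over {years} years: ${projected / 1e9:.0f}bn")
```

On these assumptions the two figures are broadly, though not exactly, consistent: the endpoints imply growth of just under 7 per cent a year, while 7.8 per cent compounded over the same period would overshoot the $200 billion projection by a fifth or so.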
Actual combat is confined purely to national military forces. It is still the cutting edge of military life; it is what soldiers do, and few PSCs would wish it otherwise. Only one PSC, Blackwater, has floated the idea of marketing a private army for low-intensity conflicts, and claims it is ready to equip a brigade-sized force if asked. So far, however, the call has not come. If it did, it would be a giant step for any state to take as states still fiercely guard their right to the monopoly of military violence. Yet it is precisely the retreat of the state, together with the reduction in its authority, that has created a vacuum which PSCs have entered with great speed. Not only did the state maintain a monopoly on violence, it also retained the authority that went with it. Legitimacy aside, the state also once possessed a monopoly of information. Clausewitz wrote about the ‘fog of war’, which is a metaphor that has established itself in the imagination to describe the limited intelligence even the most powerful of states enjoy about their enemies’ dispositions, or intentions. But he had no doubt that states had greater insight into both than any non-state actor. This is no longer necessarily the case. Local PSCs on the ground often know a great deal more. Their employees circulate more freely than military personnel. Intelligence-gathering is now one of the tasks they are hired to undertake. The new public management revolution also encourages us all to place greater emphasis on the private sector in many state-funded activities, from schools to hospitals. Outsourcing, we are told, enhances morale, cohesion and
combat effectiveness. Armies are also being encouraged to employ the language of commerce, and even to emulate some of its practices. Business consultants increasingly shape military thinking and influence its culture. Inevitably, the language of public service is being eroded by the imperatives of the competitive market environment. The private sector is beginning to shape the way the military looks at the world in ways that are not always acknowledged or understood. Finally, a soldier can expect to serve some of his career on the ‘other side’ of the private/public divide. So can the rest of us. All the professions, including my own, have a foot in the private sector – perhaps particularly my own. The academic world attracts privately sponsored scholars and post-doctoral fellowships. Many academic conferences are sponsored by companies. If academics have any relevant expertise, they can expect to garner business consultancies. Academics have become experts rather than ‘guardians of knowledge’.

In short, authority which was once invested almost solely in the state and those who served it (e.g. priests, professors and the professions in general, including the military) is now being redistributed within states, increasingly in favour of the private sector. This explains the fast growth of the PSC market. Whether we like it or not (and many of us are in culture shock), the industry is growing at the rate of 7.8 per cent a year. It is here to stay, because the range of actors who use it now includes not only the military but also the United Nations and NGOs. It is intensely demand-driven. Indeed, PSCs are moving increasingly into NGO areas of competence. They are building schools and hospitals, as well as, in some cases, distributing vaccines. And they are also increasing their involvement in the security sector by training the police, the military and government officers of host states.
As a result, more and more international aid workers are targeted by insurgents, or choose to leave rather than work in high-risk environments, as was the case with the Red Cross in Iraq. Some are concerned about the security of their employees, others about being too closely identified with the War on Terror.

Unfortunately, one of the problems with the private security sector is that most companies are not accountable internationally. Markets often reduce transparency; it is difficult to monitor the activities of companies. In any case, PSCs attract less media attention. Few embedded journalists are attached to them in the field. And governments like working with them, in part because most of their contracts with them are not subject to Parliamentary approval or oversight. Governments find this attractive, of course, because the political costs of their own intervention are thereby significantly reduced. The international conventions that regulate PSC activity are also of questionable value. Most PSCs are beyond their reach. The UN Special Rapporteur on Mercenaries has been replaced by a working committee, but neither was nor is especially effective. Governments are also reluctant to regulate PSC activities too closely, for fear of putting them at a competitive disadvantage in the global market.

Governments, of course, are under an obligation to ensure that PSCs observe international humanitarian law. They are required, when contracting out the interrogation of prisoners to the private sector (as the US did in Afghanistan in 2002–3) or the management of POW camps (as was the case in Iraq), to ensure that they
observe the Third Geneva Convention. The responsibility of the state for violations committed by private companies stands alongside the responsibility of the companies themselves – even when, as with many American PSCs in Iraq, they may be granted immunity from local process (i.e. the jurisdiction of local courts).

There is no doubt that PSCs perform a function that can be performed by no-one else. But there is a price to be paid, and it is largely ethical. The central dilemma of subcontracting to them is that they are not political actors in the true sense of the term, even if they carry out political tasks. Their status in international law is ambiguous because they are essentially unaccountable. The ethical problem derives from their ambivalent relationship with the governments that use them. In Iraq, anecdotal evidence suggests that they have been involved in very few cases of abuse (although Abu Ghraib was one). But in four years there has not been a single case in which a civilian contractor has been prosecuted for overstepping the mark, unlike the soldiers who have been brought to book.

This should not surprise us. Institutions like the army have traditionally centred on an ethos, an honour code. Since the nineteenth century it has been grounded in service to the community. Previously, it usually represented service to the state (the prince) or, in the case of the Japanese samurai, to the head of a family (someone not entirely unlike a Mafia boss today). In each case, however, the way the community customarily expressed its thanks for this service was in the form of public esteem, a currency that could not be cashed out. The rise of contractualism has changed all public services, including my own.
Just as citizens have been transformed into ‘consumers’ and civil servants into ‘welfare managers’, professors are told they are now ‘service providers’ or ‘line managers’, and that their students are ‘customers’ with whom they have a contract. What can one expect when employees are now ‘human resources’, no longer employees? Resources, after all, are there to be used. Even in these days of sustainable development, they are frequently used up. Contractualism has gradually hollowed out the concept of duty. In place of an ethos, it has brought plans and targets geared to bonuses and performance. The very idea that one might be motivated by a sense of duty has become unimaginable, as has the idea that ‘recognition’ might be more important than financial reward.

It would be wrong, of course, to pretend that PSCs have not made services previously undertaken by the military considerably more efficient. But this goes only so far for services that are price-led or market-driven. In the end, there is a symbiotic relationship between institutions like the military and the community they serve, one which embraces such unfashionable concepts as shared pride. After all, an army represents us; we are all implicated in its acts. If soldiers misbehave, we hold them to account for that reason. Ultimately, we cannot dissociate ourselves from their actions. In that sense, each actor, state or institution is inextricably involved in its respective standing. But when a state devolves the role to the private sector the dynamic is very different; the relationship is now purely contractual. If a PSC worker misbehaves he is in breach of contract, nothing more. There are no international criminal courts before which private contractors can be held to account.
Another problem with subcontracting to PSCs is one I have identified throughout this study: consequence management. Most corporate warriors are hired to perform a single mission (often the protection of key personnel). They are not asked to pacify a country. Their responsibilities are narrowly defined. If they run an ordinary Iraqi off the road, return fire indiscriminately or generally treat the locals as expendable, they are still doing their job. And their job does not require the training or self-discipline we expect of soldiers. Steroids, guns and tension, adds Tom Ricks, are not a good mix in any situation; PSC employees tend to abuse two and experience the third.

In Iraq, the major problems were created not by those who were working on subcontract to the US military for logistics or support facilities; they were created by some of the 20,000 ‘shooters’ who were hired as bodyguards. The great majority were local Iraqis, but 6,000 represented other nations in the Coalition (by far the greatest percentage were American). As Ricks graphically puts it, between 2003 and 2006 PSCs fielded about as many combat forces as the total non-US contingent in the Coalition, or the equivalent of an entire US Army division. They also suffered higher casualties (more than any single US Army division, and more than the rest of the Coalition combined) – and that was just the American citizens.20 We do not know the numbers for other personnel. But often the stress they experienced made them indifferent to the wider consequences of their own actions. Many alienated ordinary Iraqis (who, alas, were rarely able, or even inclined, to distinguish private American workers from public servants). Tensions between the two flared up quite often.
Ricks cites one example from May 2005, when a US Marine Corps unit accused a security detail from Zapata Engineering, a company which had been awarded a contract to dispose of unexploded weapons, of shooting wildly at both Iraqi and American troops while driving west from Baghdad towards Fallujah. The US treated the 16 American contractors and their 3 Iraqi translators as regular security detainees. After disarming them, they were incarcerated and then shipped out of the country three days later.21 The episode shows the dangers of subcontracting responsibility for outcomes to those who work for commercial profit only. The problem of shooting innocent civilians at roadside checkpoints became increasingly contentious in the years immediately after the end of formal operations. In 2005 the US military documented an average of seven deaths per week at checkpoints where US soldiers had fired on civilians mistaken for suicide bombers. Recognising its moral duty to bring the numbers down, the US command did so.

In recent years Western militaries have become increasingly preoccupied with what are called Effects-Based Operations (EBO). The American military has adopted an effects-based approach, which is more a way of thinking than a theory. The British military has articulated an effects-based approach within operational-level doctrine. Whether it is called an approach or an operational doctrine, the military is merely responding to a change in context. In a globalised strategic environment, effects in one area of the world (Abu Ghraib) can have a snowballing effect in another.
The military now shares the battlespace with other actors. The problem is that PSCs are rarely required to think through the consequences of their acts. They rarely ask themselves: what are the features of the situation? What effects do they want to have on it? What will it look like when operations have come to an end? A military unit entering a town and finding itself shot at by a sniper in a mosque can, of course, order an air-strike or a tank to take out the minaret. These days, however, soldiers are encouraged to take out the sniper instead of creating religious tension by destroying the mosque. In denying himself recourse to overwhelming force, a soldier is only being true to ‘the warrior ethos’. And what is an ethos but the tone, character and quality of a soldier’s life, his moral style and mood, as well as his underlying attitude towards his own profession?

Private companies have their own honour codes, of course. At best they discharge their contractual responsibilities and hold themselves to account for the services they provide, or fail to provide. But that is the point: they enter into a contract with their customers, not a covenant. Their duties are narrowly prescribed. Few of their employees are ever asked to go ‘beyond the call of duty’. The warrior ethos that has developed more fully in the West than anywhere else reminds soldiers that their actions have consequences. When subcontracting to others we are in danger of forgetting that our warriors want to be valued by the rest of us for the work they do, not merely admired for their courage.
Asimov’s children

War is nothing else than a huge conflict of opposite engines, worked by men, who are themselves as machines directed by a few.
(James Boswell, ‘On War’)

It is really only since the 1960s that we have begun to ask how technology has shaped the horizons of human experience. It is only recently that we have begun to ask, as we adapt to each new instrument and device, and learn to reinterpret the world accordingly, whether we are losing touch with our traditional modes of understanding (as Camus surmised). Are we allowing the world to be increasingly mediated through the technologies we design? As technology encroaches upon or mediates our experience, is it beginning to transform our perceptions of the world? One day, perhaps quite soon, will we be asked to see the world from the machine’s point of view?

We have been interfacing with reality through technology since the invention of the telescope, which revealed for the first time that things seen were not necessarily as they appeared. The telescope ironically allowed us to see the ‘larger’ picture contained in the small frame, just as the microscope allowed us to see the cosmos concealed in ‘a grain of sand’. But neither technology fundamentally changed the truth. They merely enriched our understanding of it. Neither technology, for that matter, changed our fundamental ontology. The message remained much the same. Everything has changed since the invention of the
computer. As it begins to do the seeing for us, things may change ontologically – the ethical content of the life-world, for one, may be hollowed out. It is a fear glimpsed in Kurt Vonnegut’s novel Player Piano. While the first industrial revolution devalued muscle work, and the second devalued routine mental work, the third, through undue human reliance on machines, threatens to devalue human thinking. The question Vonnegut asks is this: will we internalise, and even subconsciously adopt, the outlook of the computer? The point at which this becomes true would mark what Ray Kurzweil calls a ‘singularity’: a period when the pace of technological change becomes so rapid, and its impact so deep, that our lives will be dramatically transformed, as will some of the concepts that we rely on to give meaning to our lives, from business models to the cycle of human life.22 This will be a world still definitively human, but one that takes us beyond the humanity we take for granted.

Let us go back to the beginning. In his book War in the Age of Intelligent Machines, one of the most original works of its kind, Manuel de Landa recounts the story of the transfer of cognitive structures from humans to machines. Much more than merely a history of warfare, de Landa’s book provides a philosophical reflection on the changes and forms through which human bodies and materials have been combined, organised, deployed and made more effective over time. At one point de Landa invites us to imagine a history of war written by a ‘robot historian’. He invites us further to accept that such a history (if ever written) would be quite different from any history by a human author.
While a human historian would try to understand the way people assembled clockworks and motors in the past, a robot historian would probably place a stronger emphasis on the way these machines have influenced human evolution.23 When the clockwork mechanism, for example, represented the dominant technology, people imagined the world around them as a similar system of cogs and wheels. The solar system, for example, was pictured until well into the nineteenth century as a clockwork mechanism or motorless system animated by God from the outside. Later, when motors came along, people used the analogy of the motor even for natural systems which were seen to run on an external reservoir of resources and to exploit the labour performed by circulating flows of energy and matter. Similarly, if our robot historian turned his attention to war and to the evolution of armies in tracing the history of its own development, it would probably see human beings as no more than pieces of a larger military industrial or war machine. The assembling of these machines would have been influenced (from its point of view) by certain machinic paradigms. Thus, for example, the armies of Frederick the Great could be pictured as one gigantic clockwork mechanism in which soldiers were merely cogs in a wheel. Napoleon’s armies, by contrast, could be pictured as a motor running on a reservoir of manpower and nationalistic fervour.24 And if our robot historian turned its attention to some of its own machinic ancestors, like the conoidal bullet which was invented in the nineteenth century, it would probably explain how they resisted human control for over a century (which was the time it took for human societies to integrate rifle firepower into an explicit tactical doctrine). Indeed, it is worth pausing for a moment to contemplate
the conoidal bullet. Napoleon’s success as a commander owed much to his understanding of the true importance of firepower. One of his favourite maxims was: ‘It is by fire, not by shock, that battles are decided today.’25 But even he was limited by the means at his disposal. Although cartridges and howitzers firing explosive shells were employed with increasing frequency, the triumvirate of solid shot, smooth bores and muzzle-loaders still held sway on the battlefield. The conoidal bullet only came into its own when the bullet and propellant were housed in a single cartridge. In the Martini-Henry each round had to be hand-locked into the breech, which was prone to jamming, but the weapon was lethal in the hands of disciplined troops – as the unfortunate Zulu warriors discovered when the flower of their manhood was destroyed in an afternoon at the Battle of Rorke’s Drift (1879).26 It was not until the introduction of the bolt-action, magazine-fed Lee-Enfield, which became standard issue in the British Army by 1914 and remained so until the 1960s, that trained soldiers could fire 15 rounds a minute. The best could double that rate of fire. Technology gave ordinary infantrymen more killing power than ever before in the history of warfare. Yet it did not determine who they fired at or why. In that sense, technological developments cannot be said to possess their own dynamic. The dominant approach to technology studies – the sociology of technology – rightly rejects technological determinism. Technology is largely guided by human design and needs. Human beings hardly wanted war to become butchery, as it did after 1914, but once they were on a technological treadmill they could not get off. As the example of the conoidal bullet illustrates, a given technology may even force us to redefine our needs.
Thus the accuracy of projectiles forced commanders to give up the need to exert total control over their own men by making them fight in tight formations (like the clockwork armies of Frederick the Great). Instead, they were forced to put an emphasis on more flexible, small teams (or platoons). When our robot historian switched its attention from weapons to computers and the new metaphor which replaced the motor – information – it would emphasise the role of non-human factors in their evolution too. The logical structure of computer hardware was originally hard-wired into the human body in the form of empirical problem-solving recipes known collectively as heuristics (i.e. rules of thumb, mental shortcuts and tricks of the trade discovered through trial and error and experience). Some of these insights embodied in heuristic know-how were captured in general principles, or infallible problem-solving recipes known as algorithms. At which point we may say that the logical structures migrated from the human body into the rules that make up logical notation. And from there they have migrated, in turn, to electromagnetic switches and circuits. From our robot historian’s point of view, what is important is precisely this migration, and not the people involved in effecting it. De Landa’s point is that we need not imagine fully fledged human-like robots replacing soldiers on the battlefield (the nightmare vision of the future in films such as Terminator). Nor are we likely to see robotic commanders (or computers) replacing human judgement in the planning and conduct of military operations.
What we must recognise instead is the faster and continuing evolution of a coherent ‘higher-level machine’ that, for some time now, has been integrating men/tools/weapons as if they were no more than the components of a single machine. Our robot historian would see armies as evolving machines beginning with clockwork mechanisms and ending with the networks of today. It would be interested in recording the different forms in which intelligence has become incarnated in computers, and the process through which artificial forms of perception (vision) have come to be synthesised and embodied in computers, too. The ethical implications of all of this are not part of de Landa’s argument, but they are vitally important nonetheless. All our ethical rules, or etiquettes of atrocity, largely derive from our physical embodiment, from which we derive our sense of agency and responsibility for our actions. Until recently, ethics has inhered in the soldier, not the system. But what sense of responsibility can we expect from our disembodied selves, our inventions? And what sense of agency can an individual soldier find himself experiencing as part of a functioning network? Ethics, I have argued, actually inheres in networks which have their own rules. The same experiential factors that have led us to formulate our etiquettes of atrocity do not require them to be rewritten, only re-applied in different circumstances. This was even true of the ‘migration’ of responsibilities from individual soldiers, or commanders, to the system. What the system has involved (as the history of firepower in the last 200 years vividly illustrates) is a dramatic increase in the productivity per unit of human labour. Stated rather reductively, we can say that as less and less has been demanded of the individual operator (e.g. the machine gunner in World War I), more and more has been demanded of society as a whole in terms of the human casualties of war. 
Even so, the system was no less ethical in World War II or in the Cold War, when we debated the ethics of aerial bombing or nuclear deterrence. The problem with robotics is that it demands even less of the individual soldiers it is threatening to replace. In increasing the distance between ourselves and our enemies, robotics is also demanding less and less of the societies that send them out to battle. This brings me back to the question of risk. Risk is built into cost. In the course of the twentieth century war became increasingly a matter of cost–benefit analysis. There is a bias in such calculations towards measuring everything in terms of cost. Perhaps a good illustration of this phenomenon is the little-studied Third Afghan War in 1919, which the British won. Unlike the earlier wars when the technology deployed by the two sides had been much the same and even social factors had been not too dissimilar (the average life expectancy of a British soldier in 1878 was not much greater than that of an Afghan warrior as both lived in Third World conditions, even if Britain was technically a First World society at the time), the last Anglo-Afghan war was very different. Technology determined the outcome. The heavy machine gun was now an adjunct to the artillery, capable of laying down a dense barrage of fire at ranges of up to 2,500 yards. Armoured cars and lorries completely changed the face of logistics. Aeroplanes were used for reconnaissance and bombing villages, and the three bombs dropped on Kabul finally persuaded the Emir to concede defeat.
Cost was all-important. ‘I consider that when an enemy confronts us who has all the advantages of the savage backed by rifles as good as our own,’ wrote a British commander at the time, ‘we should turn to science to compensate in every possible way for the disadvantage at which our more civilised conditions and manners of war place our troops.’27 Even such an inveterate romantic as Winston Churchill, who had cut his teeth on the frontier as a young man, had a very instrumentalist view that the modern soldier was becoming too expensive to put at risk needlessly. The same thinking can be found much earlier in Kipling’s poem ‘Arithmetic on the Frontier’:

A scrimmage in a border station
A canter down some dark defile
Two thousand pounds of education
Drops to a ten-rupee jezail ...
Strike hard who cares, shoot straight who can
The odds are on the cheaper man ...28

In fact, Churchill was dismayed that the British public would not use its ace card: poison gas. ‘Is it fair,’ he asked, ‘for an Afghan soldier to shoot down a British soldier behind a rock and cut him to pieces as he lies wounded ... Why is it not fair for a British artilleryman to fire a shell which makes the said natives sneeze?’29 Gas might conflict with traditional notions of military chivalry and its use might even be considered ‘unsporting’ against an enemy who did not possess it, but there was not much interest in chivalry in the immediate aftermath of World War I. Nor was there to be much chivalry in British operations in Iraq the following year, when aerial power was employed to discipline the Iraqi insurgents.
‘We can wipe out a third of the inhabitants of a village in 45 minutes,’ remarked one pilot, ‘killed by four or five machines which offer them no real target, no real opportunity [to be] glorious warriors.’30 In fact, it also offered little opportunity for RAF pilots to embrace the existential dimension of air war, even though they had seen themselves as knights of the air only a few years before in dogfights over the Western Front. There was little glory in bombing unarmed civilians. The increasing instrumentalisation of war has devalued glory as a currency. Cost has become everything. Today’s jezail is the Improvised Explosive Device (IED). And the Afghans are still good marksmen. During Operation Anaconda in 2002 (an intense fire-fight with Al-Qaeda operatives still at large in the country), the Americans found themselves fighting an enemy that proved highly skilled in knocking out one helicopter and disabling five more, not with shoulder-fired surface-to-air missiles, but with assault rifles and a grenade launcher. These were exactly the same armaments that the Somalis had used to knock out the famous Black Hawk helicopters in Somalia almost a decade before – an event which figures in Ridley Scott’s blockbuster film Black Hawk Down. Today, the question of cost versus risk has become more important than ever. It is increasingly urgent to get the ‘moral arithmetic on the frontier’ right in time. It costs some $44,000 to train a US soldier, and it therefore makes very little sense
to put him or her at risk unnecessarily. Where these risks cannot be removed today, robotics offer a solution tomorrow – and not that far into the future at that. We want unmanned soldiers to go where we do not want to risk our own troops, claims Thomas Killion, the US Army’s Deputy Assistant Secretary for Research and Technology. Robots should take over many of the ‘dull, dirty and dangerous tasks’ from humans in war, he told a conference of unmanned-system contractors in Washington in February 2006. Despite doubts about their cost and effectiveness, the US Defense Department envisages that 45 per cent of the Air Force’s future long-range bombers will be unmanned, although no specific date has been given. One-third of the US Army’s combat ground vehicles will probably be unmanned by 2015. The US Navy is under orders to acquire a pilotless plane that can take off and land on an aircraft carrier and refuel in mid-air without human intervention.31 The Pentagon has doubled the number of Predators and Global Hawks, the unmanned surveillance aircraft that have been prowling the skies over Iraq since the insurgency began. These vehicles will soon be joined by a host of small robotic machines and military drones equipped with cameras that can be launched by hand or catapult from a rooftop or a moving truck. ‘Three years ago I had to beg people to try a robot,’ Marine Colonel Griffin claims. ‘We don’t have to beg any more. Robots are here to stay.’32 Indeed, they are. They represent the future of war or, to be more correct, one of its futures. The ultimate question, of course, is: when will they be capable of taking decisions on their own? ‘I have been asked what happens if the robot destroys a school bus rather than a tank parked nearby,’ remarked Gordon Johnson of the Joint Forces Command. ‘The lawyers tell me there are no prohibitions against robots making life-or-death decisions.
We will not entrust a robot with that decision until we’re confident they can make it.’33 When that day will arrive is, of course, much debated, but the US Joint Forces Command Project Alpha (which is responsible for accelerating transformative ideas throughout the armed forces) envisages that within 35 years the US will be able to deploy Tactical Autonomous Combatants (TACs). TACs will enjoy ‘some level of autonomy – adjustable autonomy or supervised autonomy, or full autonomy within ... mission bounds’. By then TACs are expected to be available in a wide range of sizes, from nano-robots to UAVs to automated systems that can even walk through complex terrain.34 It is clear why the military is enthusiastic about robotics. War is expensive and bloody – it produces casualties. Even if the military did not want to cut down on losses (and Western armies are smaller than at any time in the past hundred years), robots, unlike men and women, tend to perform consistently at peak levels. Robots do not get tired, suffer post-traumatic stress or need to be fed or counselled. And whereas one human being may master one skill (fighting, logistics or analysis), the fixed architecture of our brains prevents us from having either the capacity or the time to develop or utilise the highest level of skill in every (increasingly specialised) area. It follows, of course, that if robots are to be optimally employed (and optimal performance is a key reason for deploying them), they must be allowed to make their own decisions. For this to happen, of course, they will need Artificial Intelligence (AI).
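The phrase ‘adjustable autonomy or supervised autonomy, or full autonomy within mission bounds’ is, in software terms, a gating rule on when a platform may act without a human in the loop. A minimal sketch follows; the class names, the three levels and the gating logic are my own illustrative assumptions, not anything drawn from the Project Alpha specification quoted above:

```python
from enum import Enum

# Illustrative sketch of 'adjustable autonomy'. All names and logic here
# are assumptions made for illustration, not the actual TAC specification.

class Autonomy(Enum):
    SUPERVISED = 1   # every engagement needs explicit human approval
    ADJUSTABLE = 2   # the machine acts, but a human retains a veto
    FULL = 3         # the machine acts alone, within mission bounds only

def may_engage(level: Autonomy, human_approved: bool, within_mission_bounds: bool) -> bool:
    """Gate an engagement decision by autonomy level.

    Mission bounds constrain every level; only SUPERVISED mode also
    requires a human sign-off before acting.
    """
    if not within_mission_bounds:
        return False
    if level is Autonomy.SUPERVISED:
        return human_approved
    return True

# A supervised platform with no human sign-off may not engage:
print(may_engage(Autonomy.SUPERVISED, human_approved=False, within_mission_bounds=True))  # → False
```

The design point is that ‘full autonomy’ is never unbounded even in this toy model: the mission-bounds check applies at every level, which is precisely what makes the autonomy ‘adjustable’ rather than absolute.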
In many ways, we already have AI. It is around us all the time, contrary to arguments that AI withered in the 1980s. We have AI systems in cars, governing the parameters of their fuel-injection systems. When planes land, the disembarkation gates are chosen by AI scheduling systems. Every time we use Microsoft software, an AI system tries to figure out what we are doing (writing a letter, for example, or a newspaper article). The real breakthrough which would enable machines to take decisions on their own probably lies in the future, but we had better start thinking through the implications while we still have time. At the Georgia Institute of Technology, Ronald Arkin is already developing a set of rules of engagement for battlefield robots to ensure that they use lethal force in a way that follows the rules of ethics. What he is trying to create is not an artificial intelligence, but an artificial conscience. Arkin insists this should reassure us, not horrify us, because robots have the potential to act more humanely than people. Stress does not affect their judgement in the same way that it does ours. Hatred of the enemy is not a factor. Robots, unless programmed otherwise, will not demonise their targets as ‘erratic primitives’ or ‘barbarians’. The US military is intent on creating what it calls a ‘multidimensional mathematical decision space of possible behaviour actions’. Based on inputs from radar data, current mission status and intelligence feeds, the system would divide the set of all possible actions into those that are ethical and those that are not.
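A decision space of this kind amounts, in software terms, to filtering candidate actions through hard ethical constraints before any effectiveness ranking is applied. The following is only a toy sketch of that idea; the field names, the forbidden-target list and the collateral threshold are invented for illustration, not drawn from Arkin’s actual architecture:

```python
from dataclasses import dataclass

# Toy sketch of an ethical 'decision space' filter. All constants and
# field names are illustrative assumptions, not Arkin's actual system.

@dataclass
class Action:
    target: str                 # e.g. 'tank', 'school_bus'
    expected_collateral: float  # estimated probability of harming non-combatants

FORBIDDEN_TARGETS = {"school_bus", "hospital", "surrendering_troops"}
MAX_COLLATERAL = 0.05  # illustrative hard limit

def is_ethical(action: Action) -> bool:
    """An action is permissible only if it names no forbidden target
    and stays under the collateral-damage threshold."""
    if action.target in FORBIDDEN_TARGETS:
        return False
    return action.expected_collateral <= MAX_COLLATERAL

def permissible(actions: list) -> list:
    """Divide the set of candidate actions into ethical and unethical,
    returning only the ethical subset for the planner to rank."""
    return [a for a in actions if is_ethical(a)]

candidates = [Action("tank", 0.01), Action("school_bus", 0.0), Action("bunker", 0.2)]
print([a.target for a in permissible(candidates)])  # → ['tank']
```

Note that such a filter is only as good as the categories and thresholds its programmers supply: the hard work of ethical judgement is done upstream, when the lists and limits are written.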
Had the system been operational in November 2001, when two missiles fired from a remote-controlled Predator killed the Al-Qaeda chief of military operations on the outskirts of Kabul, the drone would have held fire if it had sensed that the car in which he was travelling was about to overtake a school bus.35 There are comparisons to be made between Arkin’s work and the famous laws of robotics drawn up by Isaac Asimov to govern robot behaviour. But whereas Asimov’s laws (integral to the Will Smith film I, Robot) were intended to prevent robots from being programmed to kill human beings in any circumstances, Arkin’s are supposed to ensure that human beings are not killed in an unethical fashion. How seriously should we take Arkin’s research? We are told, for example, that we already deploy robots that are sentient; robots can ‘feel’ things and ‘recall’ them, or ‘see’ things and identify them. We are also told there are robots that can analyse chess strategies. In principle, feeling, seeing and analysing need not involve sentience at all, even though it does in our species. Nor does sentience necessarily need to involve self-awareness. But this is the problem. One of the most dangerous features of the computer age is that it has impaired the use of language. Information is employed as if it were synonymous with knowledge. Thus a computer is said to ‘imagine’ visual patterns when it has no imagination. All it can do is reproduce images in a mechanical, even banal, manner. Computers may change the face of warfare in terms of information processing, but they will not be able to exercise ethical judgements unless programmed to do so. It is the programmers who need monitoring. Even if machines were to develop some of the feelings that we associate with sentient beings, they would not embody the qualities we still identify with warriors: camaraderie, allegiance to the flag and the willingness, in extreme
circumstances, to sacrifice oneself for a friend. This is actually what still gives war the moral context within which we can use the words ‘right’ and ‘wrong’. Without it we would find ourselves living in a world without meaning, in which taking the life of another person would be no more wrong than unplugging a computer for good. A robot can inflict suffering, and even affront human dignity, but it cannot be altruistic. It cannot sacrifice itself even for another robot. It lives, in that sense, in what we would see as a meaningless world. Meaning is essential to all of us. ‘Why’ is the first question we ask when we find ourselves in extreme situations. ‘Why am I here?’ asked Primo Levi of a German guard in Auschwitz. ‘There is no why here,’ he was told. Even the guard did not know why he was there; he certainly had no greater insight than his prisoners into the rationale – or even the larger mechanics – of the Final Solution. It was the meaninglessness of the Holocaust that haunted the survivors, like Levi, for the remainder of their lives. A soldier also asks why, from the moment he finds himself on the battlefield. He sometimes attributes blame to the rear-echelon officers (the other enemy behind the lines) who have put him and his friends in a position to be killed. The question everyone asks is: ‘Why is this happening to me?’ The other question a soldier asks was first posed by Socrates: ‘How should we live?’ It is a question which calls for a deep and transforming reflection on us as individuals. Anthropologists tend to ask a related question (‘How can we live together?’) which sets out a different array of problems. The problem is not ‘who am I?’, but ‘who are we?’. Every soldier has to ask himself two questions: first, how he should conduct war as an individual soldier; and second, how he is to live in the same community of fate as those he is sent out to disable or kill. This is why the ethics of war is inherently so important.
It is a great test of human fallibility. The ethics of war quickly reveals ambition, wickedness, courage, hatred and compassion, within an intensely emotional and human framework. Soldiers live with their enemies in the same community of fate. They also have to live with themselves and their actions long after the battle is over. The answers to the two questions (Who am I? Who are we?) both involve ethics and, like the philosophical and anthropological questions from which they arise, they are interrelated. Socrates argued that we should reflect on ourselves; he famously claimed the unexamined life was not worth living. Anthropologists remind us that we are social animals. This lies at the very core of our being; it is what makes us human. War is an intensely social activity. And it is also intensely individual because it impacts on our lives. War ruins many people and maims others, scarring them emotionally and psychologically for life. And, of course, it paradoxically makes others discover their true selves. The natural coward can become a hero in the heat of battle; the natural hero may discover that he is not as heroic as he thought before the battle began. War heightens the self-esteem of some people and it robs others of their self-esteem. As Saul Bellow’s Ravelstein remarks, the examined life may not be worth living either if we cannot derive from it any sense of moral self-worth.
In the use of robots we can see the shape of war to come. Similarly, we can see in the continuing war on terrorism the moral ambiguities of an age – our own – when war is still human, all too human. Looking at the post-human future that some Pentagon planners envisage, one has an unmistakable feeling that they have little real understanding of the imaginative and psychological price we may all pay for what they would like war to become. The ultimate challenge of the robotic age is that the robots will inhabit a world of means largely divorced from ends. Computers have no goals, only programmes. They cannot negotiate because they cannot identify. They cannot be traumatised because they do not feel remorse. They cannot be kept in the frame because they do not see the bigger picture. As Clausewitz says of war, no one in his senses engages in it without being clear in his mind what he intends to achieve by it. Ends, however, are not part of a computer’s programming. Programming sets a series of problems which the computer is expected to solve, usually in isolation. There is no larger picture, and therefore no role for the moral self. On no occasion does the subject confront the totality of the world. All technology, in that sense, claims Jacques Ellul, is defined as ‘the complete separation of the goal from the mechanism, the limitation of the problem to the means, and the refusal to interfere in any way with efficiency.’36 Accordingly, it is difficult to hold machines or their programmers morally responsible because they are not responsive to the world around them. This has been true, of course, of many soldiers who have increasingly seen themselves as cogs in a wheel, part of a larger machine whose purposes they do not fully comprehend. Yet even soldiers have to live with the repercussions of their own individual actions. Even if they do not see them they can at least imagine the harm they do.
Even when we cannot immediately see the effects of our acts, we can at least imagine the repercussions. We are the only species, in fact, that can assume responsibility for the consequences of our actions. The fact that we can assume responsibility means that we have to. The ability itself brings a moral obligation with it. Responsibility is complementary to freedom – it is the burden of freedom we all carry.37 This is why Socrates is the first major philosopher of freedom, and why his method of reasoning (the Socratic method, which is still the dominant methodology in the law schools of the US) encourages us to challenge our first principles and beliefs. The point of the challenge is not to undermine first principles and beliefs, but to uncover our deepest desires and wishes so that we better understand ourselves. In the end, Socrates insists, we all wish to be happy. His message was that while bad (or immoral) actions may bring immediate satisfaction or gain, over time they may make us very unhappy indeed. We need to be at peace with ourselves, and we cannot find peace if we are prey to inner conflict, emotions or a bad conscience. Only an ethical code which promotes good conduct can discipline our desires or act as a check on our impulses. What Socrates realised with rare psychological insight was that people are unhappy because they are bad. ‘Inner conflict’ produces the highest psychological costs of all. We should be wary of the use of robots for this reason. We claim, of course, that we design them, but with what end result we do not really know. Designed by
us, robots could become an inhuman force divesting us, their inventors, of the burden of responsibility for our own actions. They could morally disarm us. Bauman calls this moral ‘de-skilling’, a nice phrase which he employs to describe those people who are largely insensitive to the real moral challenges around them.38 We enter a new century knowing all too well that our ethical imagination is still failing to catch up with the fast expanding realm of our ethical responsibilities. Robots are taking us even further away from the responsibilities we owe our fellow human beings. Robots are, by definition, conditional beings that cannot be expected to go beyond the call of duty or, in this case, their programme. Their world is entirely self-referential – they cannot refer back to anything except their programmes. They will slavishly obey what their programme instructs them to do. Ultimately it is this unquestioning obedience that makes them so attractive to the micro-managers. They cannot even be court-martialled or hauled before a criminal court for obeying an illegal order. All that can be done in such circumstances is to switch them off. As for sacrifice, why bother? In Douglas Adams’ novel Dirk Gently’s Holistic Detective Agency, we meet the robotic Electric Monk, a labour-saving device that you can buy ‘to do your believing for you’.39 It would be only fitting that a society that does not believe in anything enough to ask its citizens to make material or personal sacrifices would design robots to do the believing for it. ‘I hope many more computer chips will die for their country,’ remarked one US general. It is cheaper and more utilitarian that way. It makes sense. As Jaspers would say, it involves a conditional commitment by a society that is no longer willing to make an unconditional commitment, even for what it believes in most.
Conclusion

The present is the instant in which the future crumbles into the past.
(Robert Browning)

One of the problems with the new etiquettes I have sketched out is that they self-consciously proclaim themselves to be value-free. They are grounded, after all, in a world view which is hobbled, writes Frank Furedi, by an inflated sense of risk, one which suffers from a deep sense of uncertainty. Not for us the profound moral certainties of the past. ‘It is precisely the very absence of certainty that underwrites the message of caution. In turn, the message of caution justifies itself through the continuous inflation of risks.’40 What Furedi grasps is that the new etiquettes of caution – which encourage excessive micro-management of the battlefield or the need to distribute risks with PSCs – eagerly invoke the technical language of risk management. In turn the etiquettes distance us from explicit moral judgements. The core values of the new etiquettes, such as caution, self-restraint and, especially, ‘responsible’ behaviour (i.e. not putting other people at risk through one’s own behaviour) are rarely advocated through an explicit ethical discourse. They are associated instead with the
calculus of risk. Micro-management undermines a soldier’s self-belief, enfolding him and his actions into a discourse of risk management. Risk-taking is presented, not as morally wrong, but as inappropriate. Micro-managing the battlefield is not ethical; it represents a moralising impulse which detracts from individual responsibility and demeans ‘agency’, which makes us the authors of our own acts. It undermines the warrior’s own self-belief, or what Emerson, in his essay ‘Courage’, called ‘being scornful of being scorned’. Soldiers were once scornful of being scorned for the honour codes that made them so distinctive. Now soldiers are answerable to legal conventions and codes of practice drawn up by civil society. Likewise, states now distribute risks with PSCs to avoid being held morally accountable for their actions. They employ the language of efficiency and optimisation; they subcontract to others because it makes sense to the taxpayer, not because it is ethically desirable. And in the case of robots, risk figures prominently too. Robots are favoured because they minimise risk to persons. Whether they absolve us of responsibility for the enemy in the community of fate that constitutes the battlefield is not, it would seem, terribly important. In the calculus of risk all that seems to matter is the morality of caution. Let me return to this chapter’s point of departure. The ethics of war inheres in the unconditional nature of our actions: in the warrior’s honour (the willingness to go beyond the call of duty); in the warrior’s imagination (his ability to imagine the pain of others); and in the warrior’s striving (what the Greeks called orexis, the quest to come to terms with oneself). It inheres above all in what was Hans Jonas’ principal philosophical concern: the exercise of responsibility over time, a responsibility that spans generations. Jonas distinguished between two kinds of responsibility: formal and substantive.
Both are central to the ethics of war. The first we have just discussed – it is the responsibility we must assume for our actions. As I have explained, accountability in terms of the responsibilities or duties that soldiers have traditionally owed each other (and for which they have held each other accountable) is now being undermined by the micro-management of the battlefield. The second is responsibility for the ‘other’. It is an expression of power and extends with the extent of the power we exercise. It is also non-reciprocal. Underlying substantive responsibility in the late modern era, Jonas tells us, is the responsibility to ensure the survivability of mankind. In war, this has taken on a renewed urgency since the invention of the atomic bomb.41 A larger substantive responsibility flows from the fact that we are encouraged to take responsibility even for our enemies (POWs) in order to secure a lasting peace. In going to war, we are responsible for the peace which ensues. In that sense, soldiers are governed by both kinds of responsibility: the formal, which requires they take responsibility for their own actions as agents, and the substantive, which demands that they act in such a way that the peace they win is worthwhile and hopefully long-lasting. These days, when wars are usually settled in the post-conventional phase of operations instead of on the battlefield, this responsibility is all the greater.
In the next chapter I will discuss substantive responsibility at greater length through the reading of three authors: the greatest of the Greek poets, Homer; the greatest historian, Thucydides; and Euripides, the third and, for us, the most contemporaneous of the great tragedians. What all three tell us (if we are willing to heed the lessons they impart) is that fallible human beings are always being challenged to confront their own humanity. The Greeks would have been especially appalled by the presence of robots on the battlefield, for robots suggest that we would be better off banishing the human element from war altogether if only we could. What would have shocked Euripides is that we should ask so little of ourselves.
7
Back to the Greeks
Back to the Greeks? In his book The Future of the Classical, Salvatore Settis, the director of the Scuola Normale Superiore in Pisa, writes that the ‘marginalisation of classical studies in our educational systems and our culture at large is a profound cultural shift that would be hard to ignore’.1 At the same time he asks, what place is there for the ancients in a world characterised by the blending of peoples and cultures, the condemnation of imperialism and the bold assertion of local traditions and ethnic and national identities? ‘Why seek out common roots if everyone is intent on distinguishing their own from those of their neighbour?’ His points are well made even if the implications are somewhat overdrawn. After all, one characteristic of the Roman world was precisely this blending of peoples and cultures, as eastern gods and goddesses were introduced to Rome and worshipped by a mix of different peoples. And the US is no stranger to the heritage of empire; indeed many see it as the New Rome. Yet if classical studies are indeed in decline, there still remains something of the sense that the West evolved in the form we know it today from the experiences of Greece and Rome. Military historians have more than a vague notion as to the exact nature of our debt. Some historians, like John Keegan and Victor Davis Hanson, have even drawn our attention to a ‘western way of warfare’ with deep roots in the ancient world.2 It was natural that the young Anthony Swofford, off to war in the Gulf in 1990, should have taken The Iliad with him, enthused by the thought that he was only once removed from the battles between Achilles and Hector.3 Although this was omitted from Jarhead, Sam Mendes’ film of the memoirs of his time in the field, even Hollywood has rediscovered the Greeks after a long absence – despite the banalities of Wolfgang Petersen’s Troy (2004) and Zack Snyder’s 300 (based on Frank Miller’s equally bad book on the battle of Thermopylae). 
We may indeed be further from understanding classical culture than earlier generations, whose education was dominated by classical texts. Nevertheless the classics continue to matter in a way that no other period in Western history still does. If you travelled across the Greek world at any time in its history you would have traversed a landscape marked by war. Conflict was its dominant theme,
whether in the assembly or the battlefield – though over the centuries we have not always been happy to acknowledge it. The modern view of an aesthetically pleasing Hellenistic statue, such as the Nike of Samothrace, encourages us to forget that the statue originally adorned a victory monument. And ancient warfare could take unpleasantly visible forms. If we are to believe a set of inscriptions from the mid-fourth century BC, broken bodies were a familiar sight: ‘Euhippus bore a spear head in his jaw for six years; Georgis of Herakleia ... was wounded in the lung by an arrow in some battle; Artikrates of Knidos ... became blind and carried around the spear head (that had blinded him) inside his face.’ The Greeks never stopped praying for peace and they never stopped fighting. They were just as incapable as we are of ‘living their lives in the right way, by observing the sufferings of others’ (to quote one of the last Hellenistic historians).4 This is one of the reasons we should turn to the Greeks. What makes them so remarkable is their ability to look reality in the face. Other peoples have made no apology for avoiding reality. Confronting it is one of the most difficult and unnerving tasks that faces us all. Some societies have constricted their field of vision, or blotted it out entirely. Others have been willing to contemplate reality at a distance. The Greeks, by contrast, treated abstractions, such as aggression, courage, moderation and fanatical enthusiasm, as if they were actors in one of the dramas they invented. They were, above all, a dramatic people who still force us to confront the realities of power, war and life. In other words, the Greeks taught us to see. They did so by using a language that has one of the largest of all vocabularies and a literature which, though little has survived, still ranges widely over the subject they found of most interest: themselves. 
Even in the wars we now wage against terrorism the Greeks have a distinctive voice. Indeed, it should not be surprising that many writers should urge us to go back to the Greeks and engage again with their strongly held view that war is terrible but innate to civilisation. One such writer is Robert Kaplan, a contemporary journalist close to the US military. An embedded reporter during Operation Iraqi Freedom, he was also present when the Marines stormed Fallujah the following year. Again and again he has cautioned his readers not to allow themselves to be unduly constrained by Judeo-Christian morality: ‘Progress often comes from breaking heads.’ If the West is to prevail, it must return unashamedly and wholeheartedly to its pagan roots.5 The call for us to go back to the Greeks is a call for us to ‘get real’, to be serious, to acknowledge the old German proverb: ‘Nimm die Welt wie sie ist, nicht wie sie sein sollte’ (‘Take the world as it is, not as it ought to be’). Those who urge us to do so in the present conflict urge us to see the reality of war for what it is. Yet in going back to our pagan roots we must be careful to heed the lessons that the Greeks teach us. In this chapter I have chosen three writers who warn us that the greatest threat comes from ourselves. This is not the lesson that Kaplan and others have in mind, but it may yet be the most important. Homer tells us what happens when we succumb to ‘force’; Thucydides tells us what happens when we succumb to hubris; Euripides instructs us, in timely fashion, on what happens when we act so badly that the defeated inevitably seek to avenge their own defeat.
When fighting ‘evil’ we should always remember Coleridge’s profoundly dialectical intuition that extremes meet. Dissociated principles such as goodness and evil cannot come to grips without some ground common to both. It is on this common ground that ambivalence is to be found. This is what the Greeks warn us of, and we ignore their warning at great risk to ourselves. They were fascinated by the way apparently good causes exhibit a moment of profound disorder that can be their undoing. For liberal societies such as our own the campaign against terrorism may be one of them.
Simone Weil and The Iliad First, let us turn to Homer. In The Last Chronicle of Barset by the great Victorian novelist Anthony Trollope, the character Johnny Eames, unhappy in love, proposes to steal something from his disappointment by translating The Iliad. He then sleeps on the matter and decides to take up the sanitary condition of the London poor instead. The quintessential moral improver, Eames had plenty of counterparts in real life. In the mid-nineteenth century twelve complete verse translations of Homer’s epic were published in just twelve years. The translators included the Earl of Derby, who also found time to be prime minister. Practical men like Trollope (an untiring civil servant when he was not writing novels) or Derby, like the young Johnny Eames himself, turned to the Greeks for a reason: to instruct themselves on how life should be led. Not unlike one of the last of the Victorians, William James, they were fascinated by how theoretical positions and great ideas could be cashed out in practice. What the translators found was a preoccupation with death in many forms. As James wrote in his essay ‘The Moral Equivalent of War’, The Iliad is ‘one long recital of how Diomedes and Ajax, Sarpedon and Hector killed. No detail of the wounds they made is spared us, and the Greek mind fed upon the story.’ Greek history, he added, was a panorama of ‘war for war’s sake’. For him and many other writers it makes horrible reading. ‘We inherit the warlike type; and for most of the capacities of heroism that the human race is full of we have to thank this cruel history.’ Who knows how many societies have left no trace behind? Dead men tell no tales, James reminded his readers, and if there had been any tribes that had forsworn war they left no survivors. Those who had survived had bred pugnacity into our bone and marrow; there was little reason to hope that in the future it would be bred out.6 Contrary, however, to James’ view, the Greeks also desired peace. 
At the beginning of The Laws, Plato’s final dialogue, one of the speakers declares that what most people call peace is nothing but a word, and that most city-states are in a state of undeclared war with each other. But it would be wrong to think that the Greeks as a whole bought into this view. Proverbial wisdom illustrates that peace was considered the most desirable order of affairs: ‘In war, the sleeping are woken with trumpets; in peace by the birds’ (Polybius); ‘In peace, sons bury their fathers; in war fathers bury their sons’ (Herodotus).7 Even Homer is a more complex figure than he is often made out to be. Heroes, Fénelon once remarked, look bigger the further away you stand from them in
time. The Iliad did indeed impress generations of Greeks who sought to emulate Achilles and his peers, but the Greeks had a robustly sophisticated view of Homer. The poet ‘beautified the heroes’, added the Certamen Homeri, because he too was not close to them. He was writing 500 years after Troy had fallen. It had been a dark time indeed, in which everything seemed lost, including the great cities of the Mycenaean world whose ruins survived it. Skills decayed, trade routes were broken and even writing, for a time, was lost. Homer may not have known any of this but he knew the reality of war well enough. Take the fourth line where we see the bodies of the fallen being eaten by dogs or Book 11’s reference to ‘charioteers, lying on the ground more beloved of vultures than by their wives’. James himself had the great good fortune to be writing before World War I; since 1915 all of us have known the face of battle, even if we are always rediscovering it for ourselves. Perhaps it is part of human nature that we like to be shocked into the truth. The unwritten task of Hollywood films such as The Thin Red Line seems to be to ‘put us right’. I suspect that Homer himself would have echoed T.S. Eliot’s reply when he was congratulated for having caught ‘the disillusionment of a generation’ in The Waste Land: ‘perhaps, it expressed the illusion of being disillusioned’. This brings us to one of the greatest of twentieth-century interpreters of Homer, Simone Weil. If William James offers one reading of The Iliad, Weil offers another. Weil herself experienced war at first hand when she was attached to a group of Catalan anarchists in the Spanish Civil War. With her usual enthusiasm, she desperately wanted to work behind enemy lines when Europe itself went to war a few years later. Instead, she found herself working for the Free French in London in 1942. Possibly it was this rejection, provoked largely by her lack of physical fitness, which led her to despair. 
Her spirit weakened and eventually broke. She died of self-induced malnutrition in 1943. In ‘The Power of Words’, an essay Weil wrote within a year of her return from Spain, she developed a sense of what she called ‘force’. Force took a very special form when war was fought without any thought for prudence. Force triumphs when a conflict’s importance can only be measured by the sacrifice it demands from everyone, belligerents and non-belligerents alike. Weil’s essay was also her first reference to Homer’s epic. The war’s ostensible cause was the ‘abduction’ of Helen, but her abduction was no more than a symbol of what was really at stake. The real issue was never defined by anyone, nor could it be because it did not exist. For the same reason it could not be calculated. Its importance was simply measured in terms of the deaths incurred and the massacres to come. Hector foresaw the end of Troy; Achilles foresaw his own death. Yet neither felt the cost was so great that the conflict should be abandoned. In the end, both men succumbed to ‘force’ in pursuit of a literal non-entity whose only value was in the price paid for it.8 What Weil was arguing was that only the limited use of force enables us to escape becoming enmeshed in its own machinery. She acknowledged that moderation was not without its perils since the prestige from which force derives at least three-quarters of its strength rests principally upon the ‘marvellous indifference which the strong feel towards the weak’. Excess is rarely ‘politically’ inspired.
Excess is grounded in ‘an irresistible temptation’ to defy the voice of reason. In war that voice is only occasionally heard. We hear it briefly in The Iliad when Thersites, ‘the bitch-wolf’s son’, is allowed to speak. In a profoundly hierarchical society it is doubtful whether the few who could read would have found purgative value in his objections to aristocratic warmongering. But what of the speeches of the angry Achilles when he broods on his loss, sulks in his tent and refuses to come out to fight? Unfortunately, the words of reason in Homer’s poem too often drop into a void. If they come from an inferior (Thersites), he is punished and soon silenced; if they come from a chief, his own actions betray them. Achilles eventually ventures out to meet his destiny on the field of battle. In the end everyone rejects the very idea of wanting to escape the role destiny has allotted them. Such, Weil concluded, was the nature of force: employed by man, it ultimately enslaves him.9 ‘To define force – it is that X that turns anybody who is subjected to it into a thing.’ Quite literally this happens when one kills another man, when a warrior becomes a dead body (or what we prefer to call a casualty of war these days). But force is also as pitiless for the man who possesses it as it is for its victims. The first it intoxicates, the second it crushes. But even the intoxicated are not made happy. Achilles himself – the ultimate undefeated warrior – is shown at the very outset of the poem weeping with humiliation and helpless grief. And not a single hero (including Achilles himself) is spared the shame of experiencing fear, even if they rise above it in the hour of battle. Surely, Weil asked, this was the supreme insight of Homer’s poem: that a man enslaved to the emotion of war (both the possessor of force who is, in turn, possessed by it) does not understand his relationship to the weak. 
He fails to recognise that the strong are not absolutely strong any more than the weak are absolutely weak. The strong are often undone by the unpredicted (or unpredictable) consequences of their own actions; the weak are occasionally re-empowered when the consequences hit home. The man who possesses force knows no pause between impulse and the act, or ‘the tiny interval that is reflection’. And Weil warned that where there is no room for reflection, there is no room for justice or prudence.10 Homer invites us to see the heroes often in a distinctly unflattering light. Even the noblest warriors act ignobly: the sword thrust into the body of an enemy who is already dead; the triumph of the hero who tells a dying man what outrages will be inflicted on his body. To commit an outrage or atrocity is to ignore the consequences of one’s acts. These are the actions of men who do not see that force, to be controlled, must be limited; that even the bravest warrior must stop short. These are the actions of men who do not experience ‘that interval of hesitation wherein lies all our consideration for our brothers in humanity’.11 As a result, they exceed the measure of force at their disposal. Inevitably, they succumb to nemesis, one of the central themes of Greek tragedy. From this springs the idea of a destiny before which executioner and victim stand equal, in which conquered and conquerors are brothers in their own distress. Weil concluded her essay by reminding her readers that force at its most violent and unmediated was usually expressed at the height of battle, when it reduces
warriors either to the level of inert matter which is pure passivity, or to the level of blind force which is pure momentum.12 What rescues The Iliad from being a tale of unremitting force is that it never loses sight of justice. It broods over the battle. Occasionally – if all too infrequently – it even invades the hero’s heart. Even the toughest soldiers experience (in brief moments of reflection) the awful knowledge that violence may consume them, or transform them into animals, not men. This is why, in war, even the bravest warriors are in constant fear of what they may become. Weil concluded that nothing in literature has ever surpassed The Iliad in communicating this message. Perhaps, she added, this is because one is hardly aware that Homer himself, the great spinner of tales on behalf of the winning side, is a Greek, not a Trojan.13 To remain ‘ethical’, war requires one to see it through the eyes of the enemy. One does not have to sympathise or identify with the other side, but one must never lose sight of the larger picture. Peace can only be obtained when the enemy finally admits to defeat. ‘Qui vincit non est victor, nisi victus fatetur’ (‘He who conquers is not the victor, unless the loser considers himself beaten’) (Ennius, Fragments 31:493). Victory can only be won in spite of the defeated if, like the Greeks before Troy, there is only one end possible: total destruction. Perhaps, Weil added wistfully, we will one day rediscover the epic genius of Homer when we learn not to admire force, hate the enemy or scorn the unfortunate, but act in such a way that our enemies can accept their defeat and find it in their heart to live with us in peace.
Reading Thucydides It can be argued that – even if we allow for the fact that time has sped up with the endless march of technology – the difference between Homer’s world and Thucydides’ is nearly as great as the difference between the present day and the Middle Ages. Our world is actually closer to Thucydides than it is to Homer. They are similar not in actual years but in the things we tend to take for granted, such as literacy, education, democracy and political philosophy. At the same time, it seems that Thucydides himself might have said the opposite, and that we would both be right. ‘Thucydides is not a colleague,’ claimed Nicole Loraux, the distinguished French ancient historian. This is worth repeating, for Thucydides does so often sound as if he might be working in a Contemporary History or Politics department in one of the more distinguished universities. With his fierce concentration on politics and military history, and his preoccupation with ‘realist’ power politics, it is no wonder that his work is found on many syllabuses in my own field, International Relations. Any author would take pride in knowing that they would still be considered relevant 2,000 years after their death, but Thucydides, one imagines, would be particularly pleased. As he informs us early in his History of the Peloponnesian War, he was writing ‘a possession for ever’, and so it has proved. This is why a war fought by two city-states two millennia ago still remains the most instructive conflict in history. Neither the
Thirty Years War, nor the Napoleonic Wars, nor even the World Wars in the last century produced a historian – on any side – who was able to capture the reality of war as well as Thucydides. Thucydides died before he finished his history. It breaks off in 411 BC, not with Athens’ defeat seven years later to which it looks forward. He claimed to have kept as close as possible to the gist of what was said when he presented his readers with the speeches of his contemporaries. He never claimed word for word accuracy. Instead, what he gives us is a new ‘realism’ that was the mark of his age, or its style. The great set-piece speech at Melos (which I shall discuss below) is frequently cited by realists today as the authentic voice of realism down the ages. If Homer teaches us one lesson (what happens when we lose sight of the fact that we should act prudently if only to redeem ourselves), Thucydides teaches us another: that we should never allow ourselves to be seduced by our own power (i.e. a ‘can’ does not imply an ‘ought’). Regrettably, Thucydides records, this lesson was soon lost sight of during the Peloponnesian War when his native city, Athens, acted as if it could do what it liked. So, of course, in some measure did almost every other city involved. What made the conflict so disastrous was that the timeless norms of war were violated at every turn. The dead were left unburied on the battlefield – for the Greeks, this was the ultimate sin. Thucydides was especially outraged that after the battle of Delium (424 BC) the Athenian dead were left unburied for 17 days. He was appalled that the enemy had violated the ancestral protocols of making a truce. Euripides was equally horrified, as he subsequently made clear in his play The Suppliants. By leaving the soldiers unburied, the Thebans had forfeited their ‘Greekness’. For a people so obsessed with binary opposites (e.g. 
the difference between men and women, humanity and animals, barbarians and Greeks), only one conclusion could be reached: the Thebans had become indistinguishable from barbarians.14 Not only were the dead abused, but captives were butchered, civilians became the only targets, sanctuaries such as Delium were violated, and those who surrendered were either slaughtered or mutilated. So horrific was the slaughter that for the first and only time in the Western experience before the 1920s, the principal poets began to ask themselves whether there was something wrong with war itself. Wartime plays (like Euripides’ Hecuba or The Trojan Women, and Aristophanes’ comedies Peace and Lysistrata), while certainly betraying no love of the Spartans, challenged the very institution of war itself. The women of Lysistrata, like the captured Trojans in Hecuba, reveal an entire society that had been traumatised by the fighting. War crossed the boundaries of the ‘possible’ and became ‘impossible’ (or irrational) as a result. For many Greeks, such a conflict could not properly be called war (polemos); instead it was something far nastier, strife (stasis).15 It was more like a plague or a famine than a heroic encounter between competing armies, let alone a Clausewitzian duel between moral equals, fighting what for each side was the right or just cause. The central story which illustrates the horror at first hand is the fate of Melos, a colony of Sparta that had chosen to remain neutral in the early part of the war. In the sixteenth year of the conflict, the Athenians sent an expeditionary force to the
island to bring it into their own alliance (by force, if necessary). When the Melians refused, they were challenged to ‘see reason’ by means of a rational debate. Challenged to a fair debate, the Melians made the mistake of thinking that the subject of the debate would be fairness itself. The Athenians, by contrast, insisted on the right of the strong – what Thomas Carlyle centuries later was to call ‘the mights, not the rights of man’. The superior power defined the terms of the debate, a fact which no amount of argument could undo, and from which logically it had to begin. Justice, the Athenians claimed, could only come about when both sides were equal (or, in a court of law, willing to submit to the higher authority of the law itself). So, the Melians moved on to ground of the Athenians’ own choosing, and they were quite persuasive. They employed consequentialist arguments. It would not be in Athens’ interest to attack them. War might alarm other neutrals, thus multiplying her enemies and threatening her security. This was an argument which, unfortunately for the Melians, the Athenian envoys went on to reject. As Thucydides informs us, the Athenians rejected it to their cost. Not because, in this case, other neutral islands, fearing for their own future, did indeed join the Spartans, but because Athens ignored the main principle of war: we do, not what we can, but what we should. And what we should not do is run unnecessary risks. The hatred of other Greek cities which, Pericles had once boasted, was the price that Athens had to pay for greatness, was now considered not only unavoidable, but even desirable. As Christian Meier writes, the injustice in the world, which Euripides had taken as an argument against the existence of the gods, was taken further by the Athenian envoys. 
The Athenians made the gods witnesses to, and accomplices of, their own unjust acts.16 No trace was left of the awareness that power – if it is to last – needs boundaries, as Aeschylus had warned his fellow citizens in one of his few surviving plays, Prometheus Bound. In the modern age the Athenian envoys have not excited much sympathy, but they did get good press rather late in the day from the greatest philosopher of the late nineteenth century, Nietzsche: Do you suppose perhaps that these little Greek free cities which from rage and envy would have liked to devour each other were guided by philanthropic and righteous principles? Does one reproach Thucydides for the words he put into the mouth of the Athenian ambassadors when they negotiated with the Melians on the question of destruction or submission? Only complete Tartuffes could possibly have talked of virtue in the midst of this terrible tension – or men living apart, hermits, refugees and emigrants from reality – people who negated in order to be able to live themselves.17 Nietzsche was voicing his own personal philosophy more than that of the Greeks. Indeed, he was not only critical of Socrates’ optimism (which held that not only can the world be known, but it can also be corrected), but he also broke with Aristotle’s claim that philosophy begins in wonder at the fact that things are as they are. For Nietzsche, philosophy should begin with a very different sensation:
horror. It should begin with horror because human existence is both horrible and absurd. For Nietzsche, the Greeks, at least in the Homeric age, had lived life not because they acknowledged the good, or tried to embrace it. Instead, recognising that life was cruel, like nature, they had tried to transcend their fate through art. In making this claim Nietzsche can be said to have been one of the major influences in twentieth-century existentialist thought. What Nietzsche was arguing is that once we recognise our cosmic insignificance, we know that there is no ultimate purpose to human existence. The fact of death brings this home to each of us clearly and forcefully – or so it should. We cannot, therefore, ground ethics – or the principles by which we choose to live – on some higher moral realm or meaning such as a Kantian Categorical Imperative or, in the days of the Greeks, the justice of the gods. His endorsement of Greek theodicy was: ‘Thus do the gods justify the life of men: they themselves live it.’ There was nothing in the myths of the Immortals to suggest that they lived a moral life. What makes Nietzsche important for us still is that he was writing at the very moment that it became difficult to ground ethical practices on moral law; this difficulty came to pass with the death of God. Whether people believed in God was not the point. God had lost his purchase on the human imagination – at least in the conduct of life itself. This is why Jonathan Glover begins Humanity: A Moral History of the Twentieth Century with Nietzsche’s challenge: that you cannot have a moral code in the absence of a divinity. Nietzsche therefore forecast that morality would perish in the next century.18 What Nietzsche found in the Melian dialogue was a world very similar to the world that was about to dawn in his own day. For the Greeks there could be no appeal to a higher good, only the assertion of the ‘will to power’ (the right of a dominant people to dominate others). 
For Nietzsche, the invocation to become what you are was the highest form of good, even if it was at the expense of other people. Unfortunately, his idea that by renouncing morality grounded in a universal law man could create a new set of values was itself over-determined by metaphysical considerations. The ‘will to power’ – his attempt to enrich our conceptions of the possibilities of human existence – remains a challenging idea, but his attempt at liberation betrayed a naïve understanding of politics, as witnessed, alas, by the horrors that soon followed his death in 1900. As Glover writes, after discussing the Melian Dialogue, our own history demands that we build moral ‘breakwaters’ against bad behaviour. This, he argues, is best done not by applying universal criteria or codes, but by showing how morality actually serves our own selfish interests and needs. The challenge, he adds, is to keep ethics afloat without external (divine or metaphysical) support. Morality needs to be rooted in human needs and values.19 We must return again to the dialogue (for the account Thucydides gives us is not quite as straightforward as it is usually read). On the one hand you can see the Athenians equating what is morally permissible with what the strong have the power to do, should they so wish.
Of the gods we believe, and of men we know, that by a law of their nature wherever they can rule, they will. This law was not made by us, and we are not the first to have acted upon it; we did but inherit it and shall bequeath it to all time, and we know that you and all mankind, if you were as strong as we, would do as we do.20
This, of course, is the old defence of injustice. The weak preach morality because they have no other recourse; the meek attempt to bind the strong. ‘If you were me, you would do as I do.’ In the end, it is the old story: the strong do what they will, the weak what they must. Remember that this is not Thucydides’ position, even though all we have are the words that he puts into the mouths of the Athenian envoys. We do not know whether they were ever uttered, or whether Thucydides paraphrased the actual speech or even invented it. But once we recognise this, we can try, at least, to decode the speech’s meaning. The key terms in the speech are ‘law’, ‘nature’, ‘all time’ and ‘strength’. Of course, the Athenian envoys stressed the role of the strong as a fact of human nature, and therefore as much a fact of inter-state relations as relations between two individuals in ordinary life. But this is not simply the law of the jungle. Reason, rationality and prudent calculation are not excluded by the logic of power. What Thucydides invites his readers to see is a dialectic between Nomos (custom) and Phusis (nature). This was the peculiar gift of the Sophists to Greek philosophy. There is, in the thought of Thucydides, a clear dialectic between might is right (nature) and the prudential use of power. The Athenians were right to tell the Melians that, by virtue of being stronger, they could do what they liked. Plato tells us in the Gorgias that ‘nature demonstrates that it is right that the better man should prevail over the worse and the stronger over the weaker’. Even Thucydides adds that ‘it is a general rule of human nature that people despise those who treat them well and respect those who make no concessions’ (History 3.39). But we also know – from his own account – that he considered that the Athenians had acted imprudently at Melos. It is in our nature that we seek to limit the damage of our own actions even in war. 
We are social animals and prefer to be respected rather than feared. In destroying Melos, the Athenians were not true to their own nature (i.e. they did not follow the dictates of reason by which they set so much store). And even if as a people they were cruel by nature (which Thucydides denies), custom is there to help us to rise above nature. Even if it is in our nature to act badly, education and history make it possible for us to learn from the example of others; both education and history instruct us in the wisdom of acting well. As Simonides remarked, ‘the polis teaches a man’. The citizen always needs good instructors, like Pericles himself.21 One of the reasons why Enlightenment thinkers such as Vico invoked the Greeks was that they had argued that God ran the world through laws, but that these laws were immanent (not transcendent). They were not available through revelation, but emerged in human institutions and could therefore be rationally deduced. Vico was radical in his own day, arguing that laws do not stem from rational contracts between two parties. Instead, he argued that they are assembled
through the instinctive realm of custom, or what he called 'common wisdom'. They are learnt through trial and error, which is why we read history. And there is another important point about Nomos: it emerges when the Athenian envoys insist that we should call a spade a spade, or strip away the veneer of fair names (kala onomata). The Melians, they insist, should not go 'into denial' by insisting that there is a universal law of morality which gives priests the right to legislate for us through codes of morality. By putting these words into their mouth, Thucydides came as near as he dared to arguing that it really does not matter whether the gods exist or not – even gods are governed by the universal laws of nature. If the Melians have any hope it is that the Athenians will know what is good for them and act justly out of self-interest. If not, they too will be 'in denial', and pay the price accordingly. They will eventually be undone by their own hubris.22 It is instructive to read the Melian Dialogue in the light of our own anthropological understanding of life. What made the Greeks so disputatious a people (so profoundly agonistic) was their binomial thinking. They thought always in binary terms (e.g. men and women, free men and slaves). They were not unique in this respect. Hunter-gatherer tribes also cut themselves off from the rest of humanity by calling themselves 'the People', 'the Good Ones' or 'the Fully Complete Ones'. For them, custom is sovereign. What was different about the Greeks is that they recognised custom as custom. They were bound by the customary law they learnt in the city. Customary law did not prevent them from enslaving their fellow Greeks, but it was always a political choice. For the most part, the logic of hoplite warfare – the need to keep it brief so that men could go back to their trades, and carry on their lives – required the 'rules' to be observed.
Indeed, some historians speculate that Greekness rose from war and its rituals (the right each side accorded the other of burying their dead at the end of a battle, and thus releasing their souls to find peace in the afterlife). Custom, Thucydides tells us, did not outlaw what the Athenians chose to do at Melos, but did not demand it either. Custom required only one thing: that they act reasonably, which does not mean nicely, but in accord with their rational self-interest. Instead, the Athenian commanders were motivated by revenge; they acted unwisely, out of passion. At this point their Greekness threatened to disappear into nature. Thucydides wanted to impart a lesson. He wished to show that Athens prospered when it had statesmen who, like Pericles, possessed wisdom, and when the assembly found their arguments persuasive, and was willing to act upon them. When Athens followed this logic it won, even in a pitiless, amoral universe. When Pericles left the political scene and Athens succumbed to the influence of lesser politicians, it perished through its own arrogance.
What is Hecuba to him?

What's Hecuba to him, or he to Hecuba,
That he should weep for her?
(Shakespeare, Hamlet II, ii, 584)
What Thucydides tells us is that the massacre at Melos was not premeditated. It was, in many respects, merely an act of revenge for the obduracy with which the Melians had not only put up a fight (the siege lasted four to five months), but had also ignored the logic of their own position. They had resisted when resistance was folly. The decision to kill all men of military age (between 18 and 45) and enslave the women and children may even have been taken by the Athenian commander, Philocrates, without the sanction of the Athenian Assembly. It may have been taken purely to revenge the loss of his own men in an unnecessary siege. The fact that Thucydides provides the name of the general and his patronymic, son of Demeas (or son of democracy), perhaps betrays his own repugnance at this act of private anger.23 Thucydides attached enormous importance to intelligence. The word gnome ('understanding' or 'judgement') appears more than 300 times in the book. Also, intelligent men are singled out for praise, especially Pericles. Intelligent men recognise that revenge has consequences not only for the defeated, but also for the victors. An intelligent man is one who masters his passions. He is above all a political animal who, in pursuit of the general good, is willing to suppress his own emotions. Here we go back to Machiavelli's insight that politics breeds its own ethical codes. By contrast, as Robin Lane Fox observes, there is no detailed concern with politics in the Bible. The Song of Deborah, like Aeschylus' play The Persians, may offer an example of the consequences of defeat for those left alive (especially the enemy's royal women), but the Hebrew scriptures are essentially victory odes. They gloat over the change of circumstances which war brings in its wake. 'So let all thine enemies perish, O Lord'.
In contrast, Aeschylus shows particular sympathy for women (war’s unsung but principal victims) and enjoins the Greeks to treat them with due respect. It is at one with the central message of Greek tragedy: that happiness derives from harmony. In other words, it is consistent with behaving reasonably. Happiness involves control over the passions.24 Thucydides is a tragic historian and the Peloponnesian War can be read as a giant historical tragedy within which Euripides’ own plays were staged. The tragedy of Athens’ eventual fall is actually prefigured in a much better argument that the Melians make: it would be in Athens’ interest to preserve the principle of just dealings since they themselves might be defeated one day. Those who know the end of the story (as Thucydides’ readers would) doubtless find this deeply ironic. This ironic note sounds most stark in Hecuba, one of Euripides’ ‘Trojan’ plays. It was said after the disastrous Sicilian expedition that the captured Athenian soldiers won their freedom by singing Euripidean choruses. It is nice to think that the etiquette of atrocity kept the Athenian soldiers alive, and even nicer to imagine that their captors could be moved by their enemy’s poetry. This is a world away from the banning of great German classical composers in Britain during World War I, or of Wagner’s music in Israel today. We might well ask, of course, whether the Syracusans would have understood the message that Euripides’ tragedies attempted to convey: the plight of war’s victims (which is portrayed so vividly that Aristotle called him ‘the most tragic poet’). It would
probably not have been understood even if, to quote Dewey, 'the moral prophets of humanity have always been poets, though they have spoken in free verse, or by parable'.25 Hecuba shows us the plight of both heroes and entire societies locked in an endless cycle of revenge. It is natural to avenge a wrong, but a wrong redressed can often prompt the defeated to avenge the shame visited upon them. The most typical example of this is the vendetta, or blood feud, which is potentially endless in scope. It leads to crime breeding counter-crime over generations. In the case of a vendetta, it is the Philoi (the kinsmen) who must avenge the victims – including Philoi in the wider sense of one's countrymen.26 To flourish, a society, however traumatised by war, must be permitted closure. If one limits the killing to the minimum, a defeated society is more likely to accept its defeat. One of the messages of Euripides' play is that even the victors find difficulty moving on. He reminds us that wars do not always end even when a city is taken and its women and children sold into slavery. His audience would have known that Agamemnon would go home in triumph, only to die at his wife's hands. So would his new mistress, Cassandra, the daughter of the Trojan king. The killing would continue long after the respective armies had gone home. War often claims the lives not only of the vanquished, but of the victors as well. Hecuba is not one of Euripides' better plays, but it has attracted renewed attention in recent years because of its underlying message that unless war is mediated ethically, there is no hope of closure. In Hecuba the Trojan queen lures the murderer of her son Polydorus into a trap. As a woman, she is seen to pose no danger. Women were excluded from the battlefield; off it, however, they come into their own. They are particularly adept as guerrilla fighters and suicide bombers, as our own experience attests.
In Euripides' play, Hecuba kills the two sons of her son's killer, Polymestor, and then blinds him. It is one of the most shocking scenes in the history of theatre and we know, from original sources, that it shocked the Athenian audience on the first night. It may even have prompted the more imaginative to have recalled the suffering the city had recently imposed on the citizens of Melos. It is easy to conclude, as later generations have done, that Hecuba's cruelty is a form of post-traumatic stress. Is it not true that war 'unsexes' women so that they end up killing children and cheating the maternal impulse? But there is another lesson, perhaps one even more terrible, if we but acknowledge it. At war with her own maternal instincts, Hecuba realises that she can only truly 'move on' with her life by murdering her enemy's children as well. Reason has become a fighting tool, but that is Euripides' point. Hecuba is rational, even 'reasonable'. Reason can always find reasons for cruelty. She is too intelligent not to find the most effective revenge, and too strong not to carry it out. And in her own right, she is no more irrational than the Black Widows of Chechnya or the female suicide bombers in Sri Lanka or the Middle East. Indeed, quite the contrary. Far from being deranged, many of these women turn to murder as a way of keeping their reason. However disproportionate the act, murder is part of the logic of an unmediated war in which the rules do not apply and common humanity is banished.
One of the main obstacles in the path of any reconciliation, writes Michael Ignatieff, is the desire for revenge. And revenge is moral in its own right, for it is born of the desire to keep faith with the dead and honour their memory, taking up the cause where they left off. Revenge keeps faith between generations. The violence it engenders is a ritual form of respect for those who are no longer here. 'Political terror is tenacious,' he writes, 'because it is an ethical practice. It is a cult of the dead, a dire and absolute expression of respect.'27 Forgiveness does not mean forgetting, but it can allow us to move into the future and out of the shadow of the past. What, then, is peace, but forgiveness and the chance to move on? The point is that we all want closure in the end – even Hecuba wanted closure on the murder of her own children. It is an existential choice, a highly personal one. It is not the instrumental closure of our culture which, far from allowing the privatisation of violence, prefers to pursue it through compensatory suits in the law courts. What made Euripides different from his two predecessors is that he took the same mythical figures as they had done, but asked himself – and invited his audience to ask themselves in turn – how they might appear in a more or less contemporary light. The Trojan plays must have been shocking for their audience for that reason. And for those watching as the war raged on outside the city walls, the main lesson to be derived from reading Hecuba is that what we all have most to fear is fear itself.
8
The heuristics of fear
Many will recognise the sentiment expressed by Franklin D. Roosevelt in his inaugural address in 1933: 'Let me assert my firm belief that the only thing we have to fear is fear itself.' Of the four freedoms he promised the American people when taking them to war some years later, freedom from fear was foremost. The President took his phrase from Thoreau, who said: 'Nothing is so much to be feared as fear.' The full quotation has even more merit than the one sentence that is so often quoted: 'Let me assert my firm belief that the only thing we have to fear is fear itself – nameless, unreasoning, unjustified terror which paralyses needed efforts to convert retreat into advance.'1 But then again, Roosevelt was representative of his age. It was an age imbued with hope, despite the anxieties generated by the Great Depression. It was an age of utopian thinking – some of it murderous, some of it life affirming. It was an era when the American people were promised a New World Order. Today, all we are promised is more successful management of the Global World Disorder, which we are told is likely to be with us for the duration.

Fear takes root in our motives and purposes, settles in our actions and saturates our daily routines; if it hardly needs any further stimuli from outside, it is because the actions it prompts day in, day out, supply all the motivation, all the justification, and all the energy required to keep it alive, branching out and blossoming. Among the mechanisms that claim to follow the dream of perpetual motion, the self-reproduction of the tangle of fear and fear-inspired actions seems to hold pride of place.2

Modernity promised to be the period in human history when the fears that had pervaded social life in the past could be left behind. Yet at the dawn of the twenty-first century the fear of indiscriminate terrorism has reduced us to a constant state of anxiety.
Fear is the name we give to our uncertainty, to our ignorance of what the threat is and to our incapacity to determine what can and cannot be done to counter it. The measures taken by governments to counter the threats seem calculated to further deepen the mood of emergency. Fear is programmed into the War on Terror. Whether or not we abandon the name, the conflict will continue to amplify our fears. Fear is ingrained in the
departure of a more generous vision of a New World Order and the acceptance of a permanent world disorder, which can only be finessed. Governments tend to make us more fearful (to preserve their own prerogatives) while offering us less protection than ever before. The problem is that cruelty is the child of fear. Honour, fear and interest are Thucydides' three causes of war. But when he wrote of fear he did not see it in moral terms as a vice, but only as part of the human condition. If we return again to Hecuba, one of its most shocking scenes is when Polymestor kills Hecuba's last son – even though he himself had raised the boy in his father's absence. He is quick to explain his actions: it was a precautionary measure. 'My primary motive was fear, fear that if this boy, your enemy, survived, he might someday found a second and insurgent Troy' (Hecuba, 1136–39). In another play, The Trojan Women, the Greeks hurl Hector's little boy, Astyanax, from the city wall. Hecuba cries as she cradles the broken body of the child in her arms: 'Now when this city is fallen and the Phrygians slain, this baby terrified you? I despise the fear which is pure terror in a mind unreasoning.' Fear is deeply corrupting for that reason. Fear is far more corrupting than the pursuit of honour or profit in war; there is no honour in ignoble behaviour, and little profit from destruction for its own sake.3 The principal reason why we should try to act 'reasonably' is to escape the fear that will follow if we do not – the fear that we will pay, sooner or later, for our own actions. We are taught to act with due regard for others because we have learnt what happens if we do not. And given that collective learning is the engine of history, we have really no excuse for acting badly. When the Germans and Japanese did so in World War II, they implicitly rejected the dialectic of war: that the only reason for fighting it is to obtain a better kind of peace.
Ethics inheres in that dialectic.
The ambiguity of peace
The ethical codes we ask our soldiers to respect on the battlefield and expect the state to uphold throughout a war exist for one reason only: to ensure that peace is possible. We liberals claim that we go to war for peace – in this case to make the world safe for democracy. This is our principal war cry, one grounded in the belief (for which there is some empirical evidence) that democratic societies do not go to war against each other. A democratic world, by definition, would be one at peace with itself. But how real is this vision? In Tim O'Brien's novel Going After Cacciato, the hero, Paul Berlin, meditates on war and peace during a long, wakeful night on watch at a seaside post in the province of Quang Ngai. A fellow soldier, Cacciato, has previously run away in the mountains, and Berlin imagines him being pursued by his squad all the way across Asia to Paris. At the end of this impossible journey he finds that peace is elusive: 'He looked for meanings. Peace was shy. That was one lesson. Peace never bragged. If you didn't look for it, it wasn't there.' By bringing to the stark face of war the subtle style of peace,
with all its layers of ambiguity, adds John Updike, O'Brien succeeded in penning one of the most trenchant war novels.4 Sometimes this has been difficult to grasp because of the strong contradiction in Western thinking between two apparently opposing states of being. In The Politics Aristotle divided life as a whole into opposites: 'Action and leisure; war and peace; and ... we may further distinguish acts which are merely necessary ... from acts which are good in themselves' (Book 7).5 Like most of his contemporaries – indeed like most thinkers until the eighteenth century – he took war as an immutable reality. War, like peace, remains under-theorised. Of the two, war has been treated as a given. Aristotle, who wrote about practically everything else, did not produce a systematic discussion of war, but what he did tell his readers has been in line with most other philosophical writing. There is only one end for which war is fought: peace. War, in other words, is merely a necessity, not a good in itself; it is a means to an end. This thinking continued into the Christian world. Take, for example, St Augustine, who lived with the prospect of perpetual conflict as the Western Roman Empire lurched to its end. He died in Hippo, the city of his bishopric, during its siege by the Vandals. In the earthly city, he claimed, 'there is one war after another, havoc everywhere, tremendous slaughterings of men'. Peace was not only the promise of the City of God. It was not confined to the next world; it could be realised in this one too. For the only 'end' for which war was fought by his fellow men was peace. Peace, we might say, is programmed into our nature. It is a universal human aspiration. 'It is not that they [the violators of peace] love peace less, but they love their kind of peace more.' For St Augustine, natural life is ordered by God. Peace was part of the natural order, the true condition of society. War represented its disruption.
Without order, nature would cease to exist, as would humanity.6 One of the implications (which is expressly spelt out in St Augustine’s work) is not only that war should be waged for peace, but that we should do nothing that would make it harder to attain once war has broken out. ‘Be careful, therefore, in warring,’ St Augustine entreated, ‘so that you may vanquish those whom you war against, and bring them to the prosperity of peace.’ In other words, if a society destroys what makes life worthwhile, then it will have undermined any chance of a just (i.e. genuine, long-lasting) peace. St Augustine came to conclude that the true evil in war was not death (as we all have to die sometime), but ‘the desire for harming, the cruelty of avenging, an unruly and implacable animosity, the rage of rebellion, the lust of domination and the like – these are the things to be blamed in war’.7 For St Augustine, the limiting principle in war is what makes peace possible. And this, in turn, must be traced back to intentions. What is most sinful is to succumb to one’s baser instincts and be provoked by anger or led by emotions (including hatred) to lust for domination. Where we moderns differ from those of St Augustine’s age is that we no longer derive the laws of war from individual
accountability after the fact to God. We derive them before the fact, from conventions, treaties, laws and customs which prescribe what we are and are not allowed to do. Since the fifteenth century, the direction of the 'moral space' in war has moved away from the injunctions of Christian humanism to the law of man. This requires a priori rather than a posteriori rules, and injunctions beginning with the discriminating principle of whom we can and cannot target legitimately. Like St Augustine, we still have to accept that the only peace within our grasp is a 'better kind' than that which prevails at the moment. Peace in that sense is inherently aporetic. What is good for us is not necessarily good for others. As liberals we tell ourselves that we fight for universal values which are the bedrock of an eventual world order. This is all the more reason why we should be as attentive to the jus in bello rules as we are to jus ad bellum issues. As the critics of liberalism argue – and I paraphrase Alasdair MacIntyre, one of the most compelling critics – when we identify what are in fact our own partial and particular causes too easily and too completely with the cause of some universal principle, we usually behave worse than we would otherwise do.8 As liberals we have to be even more on guard, and perhaps always have the unwelcome expectation that history may one day reveal that we are not fighting for universal values at all. History will or will not vindicate us – by which time most of us will no longer be alive. It is our task, in pursuit of peace, to act in such a way that we can at least keep a dialogue open with others, so that we can sustain the conversation. The overriding message of this book is that if we can only fight war for a better kind of peace, we have to fight morally. The permanence of peace is encoded in the codes by which we fight it.
This imperative has been reinforced in our age, more than in the past, by the knowledge that all actions have greater consequences than before. All our previous injunctions to behave well stem from long-established creeds: the Biblical 'love thy neighbour as thyself' and 'do unto others as you would be done by', and the Kantian injunction to treat your fellow man not as a means, but as an end. All these injunctions were phrased in the present – not the future – tense. They were addressed to the living. They were also limited to the horizon of place within which the agent and the other met as friend or foe. Today, we have begun to acknowledge responsibilities to those who are far from us – not only spatially, but also temporally. There are new dimensions of responsibility, writes Hans Jonas, which encompass nature (the 'greening' of morality). We have somewhat belatedly discovered a responsibility to the non-anthropomorphic 'other': the biosphere, as well as the planet.9 As Jonas reminds us, the concept of responsibility does not play a central role in the moral systems of the past or most philosophical theories of ethics. Likewise, the feeling of responsibility does not appear as the affective moment in the formation of the moral will. Quite different feelings – such as love or reverence – were assigned this office. The explanation for this is that responsibility is a function of power and knowledge, which were both limited until recently. 'Right' action was restricted to the here and now. Today, by contrast, we have immense power and greater knowledge (though not, alas, greater collective wisdom). We always have to relearn the lessons of history, as we are doing in Afghanistan and Iraq.
Jonas and the new ethicists also maintain that, unlike traditional ethics which reckoned only with non-cumulative behaviour, we have to deal with uncertainties for which there is no historical precedent. We have to deal with the unexpected, the unanticipated consequences of our own actions. In our networked world we pile up cumulative effects faster than ever before. Consequences can snowball. Our risk societies deal with probabilities, not certainties. They are always estimating, measuring and anticipating the consequences of their actions to manage them as best they can. We live in a world of perceptions, predictions and scenarios. Our actions are based on assumptions, projections and statistical probabilities beyond the real. This is the shape of our ethical universe at this stage in history. We no longer live in the stable world of the past. Ethics was once associated with continuity. Its main concern was that the state should survive – hence the importance of prudence (not exceeding one's grasp), as well as moderation (not demanding a Carthaginian peace which could stoke up resentment followed by a wish for revenge). Our world, by contrast, is dynamic. Nothing is stable. Everything is in flux. We are future-oriented for this reason, and that future stretches beyond the immediate to the world of generations not yet born. Responsibility, insists Jonas, is a correlate of power, and the scope and the degree of power we enjoy must determine the scope of our responsibility.10 What morality restores to an increasingly uncertain world is the idea of responsibility: that what we do individually and collectively makes a difference; that the future lies in our hands. Jonas spent much of his life trying to infuse morality with what he called 'the heuristics of fear' – the fact that what should frighten us most is the consequences of our own actions.
His departure point was Hiroshima (1945), for the survivors of the first atomic bomb were also victims (as many died of radiation sickness years later). Many children in the womb came into the world with genetic defects. Morality can no longer be time-bound or place-bound in the human imagination. Technology has invested humanity with the power to threaten not only the present, but the future. It was this challenge which led Jonas to formulate a Third Categorical Imperative: 'Act so that the effects of your actions are compatible with the permanence of genuine human life.'11 My misgivings about Jonas's Categorical Imperative do not arise from his injunction to act 'so that the effects of our actions are in accordance with the permanence of human life'. My misgivings arise from his account of how such moral standards are to be justified. Jonas anchors his ethical prescriptions in metaphysics, and one can see why. It is notoriously difficult to anticipate the consequences of our actions even when considering the permanence of human life, or the well-being of the planet before the damage has been done. Risk societies deal in probabilities, not certainties. The probabilities, predictions and scenarios we are always drawing up are a bit like Plato's Forms in that they offer a world beyond the 'real', the immediate or what can be seen.12 In the end, however, there is no need to invoke higher principles or authorities. As Rorty tells us, we should locate ethics not in epistemology, but in heuristics – in our day-to-day practices, not in reference to some authority beyond the world in which we live.
Ethical demands should be rooted in human experience. There is a good reason why, over the centuries, we have come to recognise that ethics inheres in the practice of war, and why its practitioners have formulated tried and trusted rules for its successful conduct. It is through experience that we have come to recognise that we have a responsibility for what we do, for the consequences of our own actions. The postmodern condition is one we all experience in a mode that is more than ever defined by risk. The cluster of risks, insecurities and control problems has played a crucial role in shaping our changing response to the world. Concern about consequence management is no longer a peripheral matter as it is built into the environment, culture and the everyday routines that guide our lives. In this sense we live in a 'risk age'. Risk has become a way of thinking about one's moment in history. Risk is not only inherent in the moment itself. It predisposes us to excessive caution and illiberal actions in defence of the causes for which we are ready to fight. Necessity, wrote Ovid, is the mother of invention. For some in the US, desperation is proving to be the father of despair.
Towards the future
To exercise that responsibility requires us to act not only in good faith, but without excessive fearfulness. In their book Ethical Realism, Anatol Lieven and John Hulsman (one a declared 'radical centrist', the other an old-fashioned conservative) counsel us to take the threat of terrorism very seriously indeed. They are right to do so. Our concerns are not exaggerated. But to be concerned is not necessarily to be fearful. As both authors warn, the War on Terror presents us with a double threat: if terrorism threatens us directly, it can also make us so fearful that we destroy the liberal values we believe in.13 Our recent history has been so thoroughly shaped by those values that we cannot imagine the state of society without them. Let me end, therefore, where I began: with Richard Rorty. Philosophy, he insisted, is not an academic discipline which confronts permanent metaphysical issues; it is only 'a voice in the conversation of mankind' (to employ Michael Oakeshott's phrase). It is a voice which centres on one topic rather than another at any given time not by dialectical necessity, but as a result of various things happening elsewhere in the conversation. This is very much a definition we associate with American pragmatism. Unfortunately, it has incurred a good deal of philosophical hostility because it appears to subordinate truth to human interests. And this is considered tantamount by many to subverting the hallowed aim of 'objective intellectual inquiry' – inquiry that supposedly yields unbiased insight into things 'as they are', not as how the majority of human beings at any particular time might like them to be, or would benefit from them being.14 What is happening in the conversation that constitutes International Relations today is the return of religion in the discourse of international politics. This has yet to work itself out, and may well result in a more pluralistic dialogue between peoples.
It has also given rise, however, to fundamentalist violence. Given this challenge, the liberal world must remain true to its own values not in the belief that they are universally true, but with the understanding that they are true for us. We live in a historically contingent world. At this stage in history the liberal experiment can be said to have succeeded. Despite serious reverses, we liberals are happier, better off and more free than we have ever been. The nineteenth-century experiment has worked fairly well for us. In making this claim Rorty acknowledged that he was being unashamedly ethnocentric. Unlike America's neo-conservatives he did not especially wish to export democracy to the Middle East (or anywhere else outside the Western world) because he recognised that even what he understood by his world being 'changed for the better' was conditioned by his own liberal beliefs. What most of us in the West mean by 'progress' is a world in which individuals are free to express their thoughts and passions, and to develop their talents. But this is assuredly not what a better world would mean to many in the non-Western world, including much of the Middle East. The point is that all our judgements, including our ethical ones, have to begin somewhere, and our own judgements more often than not derive from liberalism. But because liberalism is a project, we must keep to the script. We cannot afford to deviate from it in the name of some emergency or urgent demand of the times. This is true for every liberal state, but it is especially true for the US in its declared and historic aspiration to venture abroad 'in search of monsters' to slay. The monsters are certainly out there. Whether the world can be made 'safe for democracy' or whether liberal values can be exported is, in the end, not the point. What matters most is the fact that a society that departs from its own first principles is likely to be imperfectly placed to fight for the rest of us. Liberalism enjoins us, as liberals, to recognise that humanity is not only the name of a species, but also a quality.
Justifying inhumane acts, especially in time of war, is an implicit admission of defeatism. It was John Dewey, Rorty's great source of inspiration, who once described philosophy as 'a working programme for the future'. We conceptualise and reflect on events in order to forge a worthwhile future that is different from the present. If we want liberalism to be central to the political life of the twenty-first century then we had better keep to the script.
Notes
Preface
1 P. Fussell, The Bloody Game: An Anthology of Modern War, New York: Scribners, 1991, p. 653.
2 T. O'Brien, The Things They Carried: How To Tell a True War Story, New York: Scribners, 1990, p. 103.
3 D. Grossman, On Killing: The Psychological Cost of Learning To Kill in War and Society, New York: Little, Brown, 1995, p. 195.
4 See the conference report: Mark Joyce (ed.), Transformation of Military Operations on the Cusp, RUSI Whitehall Paper 106, London: Royal United Services Institute, 2006, p. 190.

Chapter 1
1 J. Reid, '21st Century Warfare, 20th Century Rules', RUSI Journal, June 2006, p. 12.
2 Ibid. p. 15.
3 B. Tertrais, War Without End, New York: Maine Press, 2005, pp. 25–6.
4 Cited M. Klare, 'Imperial Reach', The Nation, 25 April 2005, pp. 1–2.
5 Cited R. Silverstone, The Media and Morality, Cambridge: Polity, 2007, p. 76.
6 Cited P. Hassner, 'Beyond War and Totalitarianism: The New Dynamics of Violence' in G. Prins and H. Tromp (eds), The Future of War, The Hague: Kluwer Law, 2000, p. 204.
7 Silverstone, The Media and Morality, op. cit., p. 83.
8 Cited R. Dawkins, The God Delusion, London: Bantam, 2006, p. 103.
9 C. Hedges, War is a Force That Gives Us Meaning, New York: Public Affairs, 2002; T. Friedman, 'Hobbes' Jungle', International Herald Tribune, 30 November 2006.
10 M. Green, Children of the Sun, London: Constable, 1977, pp. 409–10.
11 The Times, 7 September 2005.
12 M. Paris, Warrior Nation: Images of War in Popular British Culture, 1850–2000, London: Reaktion Books, 2000, p. 151.
13 S. Toulmin, Return to Reason, Cambridge: Harvard University Press, 2001, pp. 131–3.
14 B. Williams, 'The Liberalism of Fear', in G. Hawthorn (ed.), In The Beginning Was The Deed: Realism and Moralism in Political Argument, Princeton, NJ: Princeton University Press, 2005, p. 52.
15 N. Rengger, 'On the Just War Tradition in the 21st Century', International Affairs, 78 (2002), pp. 354, 358; A.J. Bellamy, Just Wars: From Cicero to Iraq, Cambridge: Polity, 2006.
16 Alasdair MacIntyre, After Virtue: A Study of Moral Theory, London: Duckworth, 2002, p. 23.
17 J. Boswell, 'On War' in John Gross (ed.), Oxford Book of Essays, Oxford: Oxford University Press, 1991, p. 102.
18 D. West, An Introduction to Continental Philosophy, Cambridge: Polity, 1996, p. 17.
19 R. Rorty, Contingency, Irony and Solidarity, Cambridge: Cambridge University Press, 1989, p. 9.
20 R. Rorty, Philosophy as Cultural Politics, Cambridge: Cambridge University Press, 2007, pp. 193–4.
21 Ibid. p. 198.
22 Ibid. p. ix.
23 Ibid. p. 187.
24 Ibid. p. 67.
25 J. Shklar, Ordinary Vices, Cambridge: Harvard University Press, 1984, p. 44.
26 R.G. Collingwood, An Autobiography, Oxford: Clarendon Press, 1982, p. 79.
27 A. Heller, A Theory of Modernity, Oxford: Blackwell, 1999.
28 Cited M. Sassoli, Transnational Armed Groups and International Humanitarian Law, Program on Humanitarian Policy and Conflict Research, Occasional Paper 6, Winter 2006, Cambridge: Harvard University, p. 25.
29 Cited J.A. Amato, Victims and Values: A History and Theory of Suffering, New York: Praeger, 1990, p. 1.
30 R. Rorty, Consequences of Pragmatism, Minneapolis: University of Minnesota Press, 1982, p. xix.
31 Ibid. p. xxxix.
32 B. Williams, Truth and Truthfulness, Princeton: Princeton University Press, 2002, p. 208.
33 Rorty, Consequences of Pragmatism, op. cit., p. xxxvii.
34 H.G. Gadamer, Truth and Method, New York: Continuum, 1996, p. 312.
35 MacIntyre, After Virtue, op. cit., p. 61.

Chapter 2
1 G. Parker, Empire, War and Faith in Early Modern Europe, London: Penguin, 2002, p. 145.
2 Ibid.
3 Cited T. Rabb, The Last Days of the Renaissance, New York: Basic Books, 2002, p. 162.
4 G. Parker, The Thirty Years War, London: Routledge, 1997, p. 193.
5 Rabb, The Last Days of the Renaissance, op. cit., p. 167.
6 A. Huxley, Brave New World, London: Harper Collins, 1998, p. xiii.
7 G. Parker (ed.), The Thirty Years War, op. cit., p. 179.
8 Cited M. Walzer, Arguing About War, New Haven: Yale University Press, 2004, p. 81.
9 P. Watson, Ideas, London: Phoenix, 2006, p. 688.
10 J. Rhee, 'Adam Smith's Hard Labour', Prospect, June 2006, p. 70.
11 Rorty, Contingency, Irony and Solidarity, op. cit., p. 61.
12 T. Todorov, Hope and Memory: Reflections on the Twentieth Century, London: Atlantic, 2003, p. 138.
13 R. Kaplan, Warrior Politics: Why Leadership Demands a Pagan Ethos, New York: Vintage, 2002.
14 Rorty, Contingency, Irony and Solidarity, op. cit., p. 73.
15 Ibid. p. 74.
16 Ibid. p. 76.
17 W.B. Gallie, Understanding War, London: Routledge, 1991, p. 39.
18 Ibid. p. 40.
19 Cited H. Magenheimer, Hitler's War, London: Cassell, 1998, p. 288.
20 Cited P. Fussell, Abroad: British Literary Travelling Between the Wars, Oxford: Oxford University Press, 1980, p. 217.
21 E. Canetti, Crowds and Power, London: Penguin, 1973, p. 82.
22 E. Canetti, The Conscience of Words, London: Andre Deutsch, 1986, p. 27.
23 Ibid. p. 108.
24 Ibid. p. 74.
25 H. Böll, 'Letter to My Son: War's End' in Das Ende: Autoren aus neun Ländern erinnern sich an die letzten Tage des Zweiten Weltkriegs, Cologne: Verlag Kiepenheuer, 1983.
26 Cited N. Ferguson, 'Prisoner Taking and Prisoner Killing', in G. Kassimeris (ed.), The Barbarisation of Warfare, London: Hurst & Company, 2006, pp. 127–9.
27 Cited P. Johnson, Napoleon, London: Weidenfeld & Nicolson, 2002, p. 129. Napoleon did break the rules, however, as he himself once admitted: 'It has cost us dearly to return ... to the principles that characterised the barbarism of the early ages of nations but we have been constrained ... to deploy against the common enemy the arms he has used against us.' (Cited D.A. Bell, The First Total War: Napoleon's Empire and the Birth of Modern Europe, London: Bloomsbury, 2007, p. 3.) What is especially regrettable is that the names of some of the worst 'war criminals', such as General Jean-Marie Doranne, who as governor of Burgos developed a ghastly reputation for torture, still appear on the Arc de Triomphe in Paris (Ibid. p. 309).
28 Cited Ferguson, 'Prisoner Taking', op. cit., p. 129.
29 Cited B. Heuser, Reading Clausewitz, London: Pimlico, 2002, p. 51.
30 A. Ferguson, An Essay on the History of Civil Society, 4th edition 1773, reprinted Farnborough: Farmer Press, 1969, p. 267.
31 P. Cornish, Cry Havoc and Let Slip the Managers of War: The Strategic Hazards of Micro-Managed Warfare, No. 51, Camberley: Strategic and Combat Studies Institute, 2006, p. 16.
32 Cited M. Gillespie, Hegel, Heidegger and the Ground of History, Chicago: University of Chicago Press, 1984, p. 59.
33 A. Pizzorno, 'Politics Unbound', in J. Faubion (ed.), Rethinking the Subject: An Anthology of Contemporary European Social Thought, Colorado: Westview, 1995, p. 77.
34 Ibid. p. 185.
35 John Lynn, Battle: A History of Combat and Culture, Boulder, CO: Westview, 2003.
36 Ibid. p. 365.
37 J.A. Vasquez, 'The Post-Positivist Debate: Reconstructing Scientific Enquiry – an International Relations Theory after Enlightenment's Fall', in K. Booth and S. Smith (eds), International Relations Theory Today, Cambridge: Polity Press, 1995, p. 225.
38 S. Lukes, Liberals and Cannibals: The Implications of Diversity, London: Verso, 2003, p. 59.
39 Fussell, The Bloody Game, op. cit., p. 23. Significantly, Roger Fenton, the first war photographer, produced a series of works on the Crimean War. Despatched by Prince Albert with strict instructions from the War Office not to photograph the dead, maimed or ill, Fenton went about rendering the war as 'a dignified male outing' (S. Sontag, Regarding the Pain of Others, London: Hamish Hamilton, 2003, p. 44). His attempt to disguise the horror of the war either failed or fell short. His most famous piece, 'Valley of the Shadow of Death', follows the constraints set upon him, being devoid of dead bodies. But what it equally shows is a haunting, barren, desolate landscape, the battlefield of Balaclava, littered with ammunition and imprinted by the tracks of military artillery leading into a void.
40 A. Finkielkraut, In the Name of Humanity: Reflections on the 20th Century, London: Pimlico, 2001, p. 66.
41 Cited Amato, Victims and Values, op. cit., pp. 77–9.
42 J. Lukacs, At the End of an Age, New Haven: Yale University Press, 2002, p. 33.
43 P.N. Stearns, American Fear: The Causes and Consequences of High Anxiety, London: Routledge, 2006, p. 13.
44 The Sunday Times, 24 September 2006.
45 Cited C. Brooke, The Central Middle Ages, London: Longman, 1966, p. 133.
46 Ibid.
47 T. Arnold, The Renaissance at War, London: Cassell, 2001, p. 87.
48 The Times, 29 January 2005.
49 Lynn, Battle, op. cit., p. 332.
50 Cited J. Updike, Hugging the Shore: Essays and Criticism, London: Penguin, 1983, p. 561.
51 P. Caputo, A Rumour of War, London: Macmillan, 1978, p. 27.
52 Ibid.
53 Ibid.
54 T. Herzog, Vietnam War Stories: Innocence Lost, London: Routledge, 1992, p. 73.
55 Caputo, A Rumour of War, op. cit., p. xii.
56 Cited Updike, Hugging the Shore, op. cit., p. 563.
57 J. Shay, Achilles in Vietnam: Combat Trauma and the Undoing of Character, New York: Simon & Schuster, 2004, p. 79.
58 P.N. Edwards, The Closed World: Computers and the Politics of Discourse in Cold War America, Cambridge: MIT Press, 1996, p. 12.
59 http://www.defenselink.mil/specials/secdef-histories/bios/mcnamara.htm
60 M. van Creveld, Command in War, Cambridge: Harvard University Press, 2003, p. 258.
61 Ibid. p. 259.
62 Edwards, The Closed World, op. cit., pp. 127–8.
63 van Creveld, Command in War, op. cit., p. 259.
64 Cited G. Dyer, War: The Lethal Custom, New York: Carroll & Graf, 2007, p. 376.
65 M. Herr, Despatches, London: Picador, 1979, p. 42.
66 J. Bourke, An Intimate History of Killing: Face to Face Killing in 20th Century Warfare, London: Granta, 1999, p. 233.
67 Fussell, The Bloody Game, op. cit., p. 654.
68 Ibid. p. 655.
69 Bourke, An Intimate History of Killing, op. cit., p. 200.
70 Ibid. p. 175.
71 Ibid. p. 180.
72 Ibid. p. 197.
73 'Contemporary Practice of the United States Relating to International Law', compiled C. Berans and J. Silber, The American Journal of International Law, 62:3, July 1968, p. 766ff.
74 Herzog, Vietnam War Stories, op. cit., p. 102.
75 Ibid. p. 107.
76 L. Tritle, From Melos to My Lai: War and Survival, London: Routledge, 2000, p. 123.
77 H. Tinker, Race, Conflict and the International Order, London: Macmillan, 1977, p. 179.
78 C. Schmitt, The Theory of the Partisan: A Commentary/Remark on the Concept of the Political (1962), trans. A.C. Goodson, 2003, available at http://msupress.msu.edu/journals/cr/schmitt.pdf
79 Ibid. p. 9.
80 Ibid. p. 7.
81 Ibid. p. 31.
82 Ibid. p. 32.
83 Cited Updike, Hugging the Shore, op. cit., p. 564.
84 C. Schmitt, The Nomos of the Earth in the International Law of the Jus Publicum Europaeum, New York: Telos Press, 2003, pp. 147–8.
85 C. Schmitt, The Concept of the Political, trans. George Schwab, Chicago: University of Chicago Press, 1996, p. 89.
86 Bao Ninh, The Sorrow of War, London: Pimlico, 1995, p. 89.
87 Ibid.
88 Schmitt, The Theory of the Partisan, op. cit., p. 52.
89 Ibid. p. 56.
90 Ibid. p. 66.
91 Ibid. p. 67.
92 M. van Creveld, The Changing Face of War: Lessons from the Marne to Iraq, New York: Ballantine Books, 2006, pp. 214–16.
Chapter 3
1 P. Franco, Hegel's Philosophy of Freedom, New Haven, Conn: Yale University Press, 1999, p. 336.
2 Ibid. pp. 333–4.
3 P. Windsor, 'War and the State' in M. Berdal (ed.), Studies in International Relations: Essays by Philip Windsor, Brighton: Sussex University Press, 2002, pp. 70–1.
4 Franco, Hegel's Philosophy of Freedom, op. cit., p. 337.
5 E. Wyschogrod, Spirit in Ashes: Hegel, Heidegger and Man-Made Death, New Haven: Yale University Press, 1985, p. 130.
6 Franco, Hegel's Philosophy of Freedom, op. cit., p. 337.
7 Windsor, 'War and the State', op. cit., p. 70.
8 N. Ferguson, The War of the World: History's Age of Hatred, London: Allen Lane, 2006, p. 124.
9 Ibid. p. 442.
10 Ferguson, 'Prisoner Taking', op. cit., p. 148.
11 Ferguson, The War of the World, op. cit., p. 542.
12 Cited M. Burleigh, The Third Reich: A New History, London: Macmillan, 2000, p. 515.
13 Todorov, Hope and Memory, op. cit., p. 66.
14 O. Bartov, Hitler's Army: Soldiers, Nazis and War in the Third Reich, Oxford: Oxford University Press, 1991, pp. 13–28.
15 Ibid.
16 D. Porch, Hitler's Mediterranean Gamble, London: Weidenfeld & Nicolson, 2004, p. 213.
17 M. Kundera, The Art of the Novel, London: Faber & Faber, 1988, pp. 50–1.
18 Rorty, Contingency, Irony and Solidarity, op. cit., p. 176.
19 Raymond Aron, Peace and War between Nations, Paris: Gallimard, 1962, p. 245.
20 P. Johnson, Modern Times: A History of the Modern World, London: Weidenfeld & Nicolson, 1983, p. 501.
21 Cited A. Horne, A Savage War of Peace: Algeria 1954–1962, New York: Viking Press, 1977, pp. 208–10.
22 C. Neil MacMaster, 'The Torch Controversy: Towards a "New" History of the Algerian War', Modern & Contemporary France, 10:4, 2002, pp. 449–59.
23 A. Bellamy, 'No Pain, No Gain? Torture and Ethics in the War on Terror', International Affairs, 82:1, 2006, p. 128; Lou DiMarco, 'Losing the Moral Compass: Torture and Guerre Revolutionnaire in the Algerian War', Parameters, 2006, pp. 72–3.
24 Cited Johnson, Modern Times, op. cit., p. 503.
25 Cited P. Vidal-Naquet, Torture: Cancer of Democracy, London: Penguin, 1963, p. 51.
26 Cited The Times, 10 December 2005.
27 Ibid.
28 Cited G. Wasserstrom, 'China's Boxer Crisis' in A. Reid (ed.), Taming Terrorism: It's Been Done Before, London: Policy Exchange, 2005, p. 39.
29 Cited A.R. Derradji, The Algerian Guerrilla Campaign, Queenstown: Edwin Mellen Press, 1997, p. 173.
30 T. Storm Heter, Sartre's Ethics of Engagement: Authenticity and Civic Virtue, London: Continuum, 2006, p. 155.
31 Ibid.
32 D. Schalk, War and the Ivory Tower, Lincoln: University of Nebraska Press, 2005, p. 50.
33 Cited J. Mayall, 'Globalisation and the Future of Nationalism' in P. Windsor (ed.), The End of the Century: The Future in the Past, Tokyo: Kodansha International, 1995, p. 443.
34 Todorov, Hope and Memory, op. cit., p. 237.
35 Cited Heter, Sartre's Ethics, op. cit., p. 107.
36 One of the French units, the Foreign Legion, not only had many ex-Wehrmacht veterans in its ranks, but was proud to call itself 'the White SS'. (M. van Creveld, The Changing Face of War: Lessons of Combat from the Marne to Iraq, New York: Ballantine Books, 2007, p. 218.)
37 Schalk, War and the Ivory Tower, op. cit., p. 59.
38 V. Davis Hanson, Why the West Has Won, London: Faber & Faber, 2001, p. 17.
39 D. Dawson, Origin of the Western Way of Warfare, Boulder, CO: Westview, 1998, p. 4.
40 T. Herzl, The Jewish State, New York: Scribners, 1988, p. 44.
41 R. Gal, 'The IDF Structural Model', in D. Ashkenazy (ed.), The Military in the Service of Society and Democracy: The Challenge of the Dual-Role Military, Westport, Connecticut: Greenwood, 1994, p. 21; G. Ben-Dor, 'Israel's National Security Doctrine Under Strain: The Crisis of the Reserve Army', Armed Forces and Society, 28:2, Winter 2002, p. 234.
42 Hanson, Why the West Has Won, op. cit., p. 3.
43 Z. Schiff, 'Fifty Years of Israeli Security: The Central Role of the Defence System', Middle East Journal, 53:3, Summer 1999, p. 441.
44 Z. Schiff, Israel's Lebanon War, London: George Allen, 1985.
45 G. Alon, 'Government Approves Reforms to IDF's Reserve System', Ha'aretz English edition, 13 March 2005.
46 D. Rodman, 'Israel's National Security Doctrine', Middle East Review of International Affairs, 5:3, September 2001, p. 223.
47 A. Bellamy, 'No Pain, No Gain? Torture and Ethics in the War on Terror', International Affairs, 82:1, 2006, p. 135; W.B. O'Brien, Law and Morality in Israel's War with the PLO, London: Routledge, 1991; Alan Dershowitz, Why Terrorism Works: Understanding the Threat, Responding to the Challenge, New Haven: Yale University Press, 2002.
48 P. Berman, Terror and Liberalism, New York: Norton & Co, 2003, p. 139.
49 Ibid.
50 L. Wright, The Looming Tower: Al-Qaeda's Road to 9/11, London: Allen Lane, 2006, p. 39.
51 Schiff, 'Fifty Years of Israeli Security', op. cit., p. 437.
52 M. Holub, The Dimension of the Present Moment and Other Essays, London: Faber & Faber, 1990, p. 102.

Chapter 4
1 T. Ricks, Fiasco: The American Military Adventure in Iraq, London: Allen Lane, 2006, p. 290.
2 Ibid. p. 409.
3 Ibid. p. 291.
4 T. Brokaw, 'General Sanchez: Abu Ghraib: Clearly a Defeat', available at http://msnbc.com/id/5333895/ (last modified 30 June 2004).
5 John Solomon, 'Poll of Iraqis reveals anger towards US', 15 June 2004, Top News from the Associated Press, available at http://apnews.myway.com/articles/2040616/d837qra00.html.
6 Ricks, Fiasco, op. cit., p. 297.
7 Cited T. Pfaff, 'Non-Combatant Immunity and the War on Terrorism', http://www.usaf.af.mil/jscope/JSCOPE03/pfaff03.html.
8 For a critical discussion of Baudrillard's work see C. Norris, Uncritical Theory: Postmodernism, Intellectuals and the Gulf War, London: Lawrence & Wishart, 1992.
9 Washington Post, 3 February 2006.
10 S. Poole, Unspeak, London: Little, Brown, 2006.
11 'The UK Terrorism Threat in Context', RUSI Newsbrief, 26:9, September 2006, p. 102.
12 Ibid. For the official see Jane's Terrorism and Security Monitor, November 2006, p. 7.
13 See conference proceedings, D. Hansen and M. Ranstorp (eds), Co-operating against Terrorism: EU–US Relations Post September 11, Stockholm: Swedish National Defence College, 2007.
14 Z. Bauman, Consuming Life, Cambridge: Polity, 2007, p. 39.
15 J. Mackinlay, Defeating Complex Insurgency: Beyond Iraq and Afghanistan, RUSI Whitehall Paper, London: Royal United Services Institute, 2005, p. 81.
16 See also B. Hoffman, 'Plan of Attack', Atlantic Monthly, July–August 2004. John Robb, a security analyst with a background in counter-insurgency, has developed the concept of a global guerrilla war. See his blog at http://globalguerillas.typepad.com/globalguerillas/. Guerrillas and insurgents engage in hit-and-run attacks through dispersed, usually small units. Both seek the long-term exhaustion of the conventional forces they are opposing. In other words, they win by not losing.
17 M. Castells, The Internet Galaxy, Oxford: Oxford University Press, 2001, p. 139.
18 One of the striking features of this global insurgency is that the insurgents have no doubt that they are at war with the West, even if we are reluctant to think we are at war with them. The Middle East Media Research Institute traced an article on an Al-Qaeda website which invoked what is known in the US military as Fourth Generation Warfare (a model developed by William Lind to include terrorism against states). The time had come, the website proposed, for Islamic movements to 'internalise the rules of Fourth Generation Warfare'. (Gary Hart, The Shield and the Cloak: The Security of the Commons, Oxford: Oxford University Press, 2006, pp. 69–70.)
19 The Times, 10 February 2007.
20 Z. Bauman, In Search of Politics, Cambridge: Polity, 2000, p. 141.
21 Ibid.
22 The Times, 22 November 2006 (Times 2, 'Soldiers' Stories', pp. 4–5).
23 F. Fukuyama, After the Neocons: America at the Crossroads, London: Profile, 2006. In a Weekly Standard 'Letter from Londonistan', Irwin Stelzer observed that 'Brits were horrified to learn that they had been attacked by fellow citizens. Americans know it is "us" against "them", whereas Brits know that "they" are also "us".' Stelzer declared: 'When it comes to issues such as immigration, extradition and the application of power of the state at home, he [Blair] is torn between humanitarianism and civil rights principles, and the need to wage war against Britain's domestic enemies.' (Weekly Standard, 1 August 2005.)
24 D. Selbourne, The Principle of Duty: An Essay on the Foundations of the Civic Order, London: Sinclair-Stevenson, 1994, pp. 122–3.
25 J. Clark, 'Is There Still a West?', Orbis, 48:4, 2004, p. 591.
26 R. Cooper, The Breaking of Nations: Order and Chaos in the 21st Century, London: Atlantic, 2003, p. 159.
27 S. Zizek, Organs Without Bodies: On Deleuze and Consequences, London: Routledge, 2004, p. 26.
28 M. Douglas, Risk and Culture: An Essay on the Selection of Technological and Environmental Dangers, Berkeley: University of California Press, 1982, p. 3.
29 Conversations with Manuel Castells, Cambridge: Polity, 2003, p. 131.
30 Poole, Unspeak, op. cit., p. 3.
31 Ibid.
32 See Rorty's essay on Orwell in Contingency, Irony and Solidarity, op. cit., pp. 179–84.
33 For a full discussion of euphemism and war, see my War and the Twentieth Century: A Study of War and Modern Consciousness, London: Brassey's, 1994, pp. 50–5.
34 A. Roberts, 'September 11: 5 Years On', World Today, August/September 2006, p. 6.
35 Poole, Unspeak, op. cit., p. 168.
36 Ibid. p. 172.
37 Ibid.
38 A. Sullivan, 'Torture By Another Name', Sunday Times, 24 September 2006.
39 Time, 10 July 2006, p. 38.
40 'Mind Your Language', The Economist, 17 June 2006, p. 15.
41 Ibid.
42 The Times, 6 December 2005.
43 Washington Post, 26 December 2002.
44 The Sunday Times, 25 March 2006.
45 The Times, 10 February 2006.
46 Washington Post, 6 December 2006.
47 J. Bourke, 'Barbarisation and Civilisation in Time of War' in Kassimeris (ed.), The Barbarisation of Warfare, op. cit., p. 214.
48 Financial Times, 17 February 2006.
49 U. Eco, Five Moral Pieces, London: Secker and Warburg, 2001, pp. 11–13.
50 Ibid.
51 J. Urry, Global Complexity, Cambridge: Polity, 2003, p. 7.
52 Cited Evan Wright, Generation Kill: Living Dangerously on the Road to Baghdad with the Ultra Violent Marines of Bravo Company, New York: Bantam, 2004, p. 31.
53 Cited G. Mills, The Security Intersection: The Paradox of Power in an Age of Terror, Johannesburg: Witwatersrand University Press, 2004, p. 57.
54 Cited P. Virilio, Negative Horizon, New York: Continuum, 2005, p. 90.
55 Castells, The Internet Galaxy, op. cit., p. 138.
56 Ibid. p. 141.
57 Ibid.
58 R. Smith, The Utility of Force: The Art of War in the Modern World, London: Allen Lane, 2006, p. 289.
59 Washington Post, 13 September 2001.
60 L. Freedman, The Transformation of Strategic Affairs, Adelphi Paper 379, London: International Institute for Strategic Studies, 2006, p. 77.
61 Urry, Global Complexity, op. cit., p. 53.
62 The Times, 8 March 2007.
63 A. Linklater, The Problem of Harm in World Politics, Martin Wight Lecture, 20 November 2001, p. 49.
64 Urry, Global Complexity, op. cit., p. 53.
65 Ibid. p. 54.
66 Cited P. Ford, 'Where is Osama and Should We Care?', Christian Science Monitor, 27 June 2002.
67 Hedges, War is a Force That Gives Us Meaning, op. cit., p. 112.
68 T. Hobbes, The Leviathan, ed. C.B. Macpherson, London: Pelican, 1972, p. 375.
69 C. Geertz, Available Light: Anthropological Reflections on Philosophical Topics, Princeton, NJ: Princeton University Press, 2000, p. 173; J. Mueller, The Remnants of War, Ithaca: Cornell University Press, 2004, p. 91.
70 Urry, Global Complexity, op. cit., p. 25.
71 Ibid.
72 Ibid.
73 K. Kelly, Out of Control: The New Biology of Machines, Reading: Cox & Wyman, 1994, p. 27.
74 D. Alberts and R.E. Hayes, Power to the Edge: Command and Control in the Information Age, Washington, DC: Department of Defense CCRP, 2003, p. 169.
75 Kelly, Out of Control, op. cit., p. 395.
76 L.P. Beckerman, The Non-Linear Dynamics of War, http://www.belisarius.com/modern-business-strategy/beckerman/non-linear.htm.
77 J. Arquilla and D. Ronfeldt, 'Looking Ahead: Preparing for Information-Age Conflict' in J. Arquilla and D. Ronfeldt (eds), In Athena's Camp: Preparing for Conflict in the Information Age, Santa Monica, CA: RAND Corporation, 1997, p. 465.
78 Cited J. Adams, The Next World War, London: Hutchinson, 1998, p. 291.
79 Urry, Global Complexity, op. cit., p. 99.
80 J. Thompson, Political Scandal: Power and Visibility in the Media Age, Cambridge: Polity, 2000.
81 Cited Z. Bauman, Liquid Fear, Cambridge: Polity, 2003, p. 149.
82 C. Mackey and G. Miller, The Interrogator's War: Inside the Secret War against Al-Qaeda, London: John Murray, 2005, p. 312.
83 S. Sontag, Regarding the Pain of Others, London: Hamish Hamilton, 2003, p. 25.
84 Cited V. Williams and Liz Heron (eds), Illuminations: Women Writing on Photography from the 1850s to the Present, London: Tauris, 1996, p. 457.
85 Sontag, Regarding the Pain of Others, op. cit., p. 12.
86 Nic Gowing, 'Real Time Crises: New Real Time Information Tensions', in Joyce, Transformation of Military Operations on the Cusp, op. cit., pp. 16–20.
87 P. Edde, 'Window into the Dark Heart', The World Today, 63:3, March 2007, p. 9.
88 T.H. Eriksen, Tyranny of the Moment: Fast and Slow Time in the Information Age, London: Pluto Press, 2001, p. 71.
89 Cited D. Schmidtchen, Eyes Wide Open: Stability, Change and Network-Enabling Technology, Working Paper 129, Canberra: Land Warfare Studies Centre, 2006, p. 33.
90 Dawkins, The God Delusion, op. cit., p. 203.
91 N. Humphrey, 'History and Human Nature', Prospect, September 2006, pp. 66–7.
92 Ibid.
93 See also S. Shennan, Genes, Memes and Human History, London: Thames & Hudson, 2002; R. Aunger, The Electric Meme: A New Theory of How We Think, New York: Free Press, 2002; Kate Distin, The Selfish Meme: A Critical Reassessment, Cambridge: Cambridge University Press, 2005; R. Brodie, Virus of the Mind: The New Science of the Meme, Seattle: Integral Press, 1996; and D. Dennett, Darwin's Dangerous Idea, New York: Simon & Schuster, 1995.
94 Ibid.
95 The theme of humiliation is an extremely important one in Al-Qaeda's world view. Thus, Bin Laden has written that for 80 years Islam has been 'tasting ... humiliation and contempt ... its sons killed, its blood ... shed; its holy places attacked'. The reference to 80 years has little resonance in the West, but it is a clear reference to the break-up of the Ottoman Empire in 1918 and the passing of the territories of the old Caliphate into Western colonies and League of Nations mandates. (T. Barkawi, 'On the Pedagogy of "Small Wars"', International Affairs, 80:1, 2004, p. 22.)

Chapter 5
1 G. Steiner, Grammars of Creation, London: Faber & Faber, 2004, p. 5.
2 A. MacIntyre, A Short History of Ethics, London: Routledge, 1998, p. 1.
3 Ibid. p. 259.
4 Ibid.
5 A. Adkins, Merit and Responsibility: A Study in Greek Values, Oxford: Oxford University Press, 1960, p. 2.
6 A. Blok, Honour and Violence, Oxford: Blackwell, 2001, p. 101.
7 Ibid. p. 109.
8 Ibid. p. 113.
9 Ibid. p. 112.
10 L. Stone, The Crisis of the Aristocracy 1558–1641, Oxford: Oxford University Press, 1967, p. 96.
11 Ibid. p. 113.
12 R. van Krieken, Norbert Elias, London: Routledge, 1998, p. 102.
13 J. Hale, War and Society in Renaissance Europe, 1450–1620, London: Fontana, 1983, p. 99.
14 D. Indermaur, 'Perceptions of Violence', Psychology and The Law, 3:2, 1996, p. 10.
15 M. Kramer, 'The Moral Logic of the Hizballah' in Walter Reich (ed.), Origins of Terrorism: Psychologies, Ideologies, Theologies, States of Mind, Cambridge: Cambridge University Press, 1990, p. 135.
16 Ibid.
17 Ibid.
18 Sonia Kruks, Retrieving Experience, p. 145.
19 Paul Rabinow (ed.), The Foucault Reader, London: Penguin, 1991, pp. 268–9.
20 R. Coward, Patriarchal Precedents: Sexuality and Social Relations, London: Routledge, 1983.
21 Bauman, Liquid Love, op. cit., p. 90.
22 Ibid. p. 91.
23 Ibid.
24 M. Gelven, War and Existence: A Philosophical Enquiry, University Park, Penn: Pennsylvania State University Press, 1993, pp. 74–5.
25 I. Ousby, The Road to Verdun: France, Nationalism and The First World War, London: Jonathan Cape, 2002, p. 7.
26 Ibid. p. 249.
27 L. Ferry, Man Made God, Chicago: University of Chicago Press, 2002, p. 71.
28 Ibid. p. 139.
29 H. Jonas, Mortality and Morality: A Search for the Good after Auschwitz, Evanston, IL: Northwestern University Press, 1999, p. 181.
30 M. Carrithers, Why Humans Have Cultures: Explaining Anthropology and Social Diversity, Oxford: Oxford University Press, 1992, p. 57.
31 Ibid.
32 D. Wood, A Step Back: Ethics and Politics After Deconstruction, New York: State University of New York Press, 2006, p. 160.
33 Ibid.
34 Cited Selbourne, The Principle of Duty, op. cit., p. 161.
35 R. Sennett, Respect: The Formation of Character in an Age of Inequality, London: Penguin, 2004, p. 57.
36 Ibid.
37 See R. Rorty, 'Orwell on Cruelty', in Contingency, Irony and Solidarity, op. cit., p. 172.
38 Ibid.
39 The Guardian, 14 May 2005.
40 C. Conetta, 'Vicious Circle: The Dynamics of Occupation and Resistance in Iraq', Project on Defense Alternatives, Research Monograph 10, 18 May 2005.
41 J. Alexander, Future War: Non-Lethal Weapons in 21st Century Warfare, New York: St Martin's Press, 1999, p. 17.
42 New York Times, 16 August 2005.
43 Cited T. Feakin, Non-Lethal Weapons: Technology for Lowering Casualties?, unpublished Master's thesis, University of Bradford, 2005, pp. 19–20.
44 N. Lewer and S. Schofield, Non-Lethal Weapons: A Fatal Attraction?, London: Zed Books, 1997, p. 129.
45 Ibid. p. 71.
46 D. Shukman, The Sorcerer's Challenge: Fears and Hopes for the Weapons of the Next Millennium, London: Coronet, 1995, p. 210.
47 Alexander, Future War, op. cit., p. 15.
186
Notes
48 Ibid.
49 N. Davison and N. Lewer, Bradford Non-Lethal Weapons Research Project, Research Report 7, 7 May 2005, Bradford University: Centre for Conflict Resolution, 2005, p. 2.
50 Ibid. p. 5.
51 G. Steiner, Errata: An Examined Life, London: Weidenfeld and Nicholson, 1997, p. 85.

Chapter 6

1 J. Hillman, A Terrible Love of War, London: Penguin, 2004, p. 28.
2 Ibid. p. 21.
3 Finkielkraut, In the Name of Humanity, op. cit., pp. 3–4.
4 Ibid. p. 4.
5 K. Jaspers, The Way to Wisdom, New Haven: Yale University Press, 1954, pp. 59–60.
6 Cited in P. Cornish, 'Clausewitz and the Ethics of Armed Force: Five Propositions', Journal of Military Ethics 2/3, 2003, p. 30.
7 M. Ignatieff, The Warrior's Honour: Ethnic War and the Modern Conscience, London: Chatto & Windus, 1998, p. 118.
8 Cornish, 'Cry Havoc and Let Slip the Managers of War', op. cit., p. 16.
9 The Guardian, 12 January 2006. See N. Aylwin-Foster, 'Counter Insurgency Operations', Military Review, November–December 2005.
10 R. Kaplan, The Atlantic, April 2005.
11 The importance of covenant is seen from the formal adoption by the British Army of Soldiering: The Military Covenant, Army Doctrine Publications, Vol. 5, para 0103, GD&D/18/34/71, Army Code No. 71642, February 2000, http://www.army.mod.uk/servingsoldiers/usefulinfo/valuesgeneral/adp5milcov/ss-hrpers-values-adp5–1-w.html/milcov. 'Soldiers will be called upon to make personal sacrifices – including the ultimate sacrifice – in the service of the nation. In putting the needs of the nation and the army before their own, they forego some of the rights enjoyed by those outside the armed forces. In return, British soldiers must always be able to expect fair treatment, to be valued and respected as individuals, and that they (and their families) will be sustained and rewarded by commensurate terms and conditions of service ... This mutual obligation forms the Military Covenant between the nation, the army and each individual soldier; an unbreakable common bond of identity, loyalty and responsibility which has sustained the Army and its soldiers throughout its history.'
12 C. Brown, 'Conception of a Rule Governed International Order: Europe versus America?', International Relations 20:3, 2006, p. 311.
13 Ibid.
14 U. Beck, World Risk Society, Cambridge: Polity, 1999, p. 51.
15 P.H. Waite, Die Hard: Famous Napoleonic Battles, London: Cassell, 1996, p. 10.
16 N. Luhmann, Observations on Modernity, Stanford: Stanford University Press, 1998, p. 54.
17 F. Furedi, The Culture of Fear, London: Cassell, 1998, p. 17.
18 D. Donald, After the Bubble: British Private Security Companies After Iraq, Whitehall Paper 66, London: Royal United Services Institute, 2006, p. 9.
19 New York Times, 26 August 2004.
20 Ricks, Fiasco, op. cit., p. 371.
21 Ibid. p. 372.
22 R. Kurzweil, Singularity is Near: When Humans Transcend Biology, New York: Viking, 2005.
23 M. de Landa, War in the Age of Intelligent Machines, Cambridge: Massachusetts Institute of Technology, 1994, p. 3.
24 Ibid. p. 10.
25 R. O'Connell, Of Arms and Men, Oxford: Oxford University Press, 1989, p. 179.
26 D. Hanson, Why the West Has Won, op. cit., p. 120.
27 B. Robson, Crisis on the Frontier: The Third Afghan War, Staplehurst: Spellmount, 2004, pp. 257–8.
28 Cited A. Lieven and J. Hulsman, Ethical Realism, New York: Pantheon, 2006, p. 81.
29 S. McMichael, Stumbling Bear: Soviet Military Performance in Afghanistan, London: Brassey's, 1991, p. 121.
30 Cited Sontag, Regarding the Pain of Others, op. cit., p. 60.
31 The Miami Herald, 14 February 2006.
32 Ibid.
33 http://www.irb.co.uk/v28/no1/wein01/html
34 Kurzweil, Singularity is Near, op. cit., p. 333.
35 'Robot Wars', The Economist, 17 April 2007, p. 99.
36 Z. Bauman, Post-Modern Ethics, Oxford: Blackwell, 2004, p. 190.
37 Ibid. p. 198.
38 Ibid. p. 89.
39 Dawkins, The God Delusion, op. cit., p. 104.
40 F. Furedi, The Culture of Fear Revisited, London: Continuum, 2006, p. 156.
41 Jonas, Mortality and Morality, op. cit., p. 67.

Chapter 7

1 Cited A. Massie, 'Return of the Roman', Prospect, November 2006, p. 42.
2 D. Hanson, Why The West Has Won; J. Keegan, A History of Warfare, London: Pimlico, 1996.
3 A. Swofford, Jarhead, New York: Scribners, 2003, p. 154.
4 A. Chaniotis, War in the Hellenistic World, Oxford: Blackwell, 2005, p. 123.
5 R. Kaplan, Warrior Politics: Why Leadership Demands a Pagan Ethos, New York: Random House, 2002, p. 12.
6 William James, 'The Moral Equivalent of War', in B. Wilshire (ed.), William James: The Essential Writings, New York: State University of New York Press, 1984, p. 350.
7 H. van Wees, 'War and Peace in Ancient Greece', in A. Hartmann and B. Heuser, War, Peace and World Order in European History, London: Routledge, 2001, p. 39.
8 T. Cahill, Sailing the Wine-Dark Sea, New York: Doubleday, 2003, p. 81.
9 Simone Weil: An Anthology, London: Penguin, 2000, p. 240.
10 Ibid. p. 199.
11 Ibid. p. 194.
12 Ibid. p. 205.
13 Ibid. p. 215.
14 V. Davis Hanson, A War Like No Other, London: Methuen, 2005, p. 308.
15 Ibid.
16 C. Meier, Athens: A Portrait of the City in its Golden Age, London: John Murray, 1993, p. 524.
17 Cited J. Glover, Humanity: A Moral History of the Twentieth Century, New Haven: Yale University Press, 2001, p. 29.
18 Ibid.
19 Ibid. p. 406.
20 Cited MacIntyre, A Short History of Ethics, op. cit., p. 12.
21 P. Cartledge, The Greeks: A Portrait of Self and Others, Oxford: Oxford University Press, 2002, p. 116.
22 Ibid. p. 188.
23 Tritle, From Melos to My Lai, op. cit., p. 121.
24 Cited Watson, Ideas, op. cit., p. 214.
25 Rorty, Contingency, Irony and Solidarity, op. cit., p. 69.
26 R.P. Winnington-Ingram, Sophocles: An Interpretation, Cambridge: Cambridge University Press, 1990, p. 312.
27 Cited J. Sachs, The Dignity of Difference, London: Continuum, 2002, p. 187.
Chapter 8

1 Stearns, American Fear, op. cit., p. 223.
2 Z. Bauman, Liquid Fear, Cambridge: Polity, 2006, p. 137.
3 Shklar, Ordinary Vices, op. cit., p. 240.
4 Updike, Hugging the Shore, op. cit., p. 464.
5 Aristotle, Politics, Chapter 15, in T.A. Smith, Works of Aristotle, Volume X: Politics, Oxford: Oxford University Press, 1966.
6 Cited J. Johnson, Morality in Contemporary Warfare, New Haven: Yale University Press, 1999, p. 157.
7 Ibid. p. 211.
8 MacIntyre, After Virtue, op. cit., p. 221.
9 Jonas, Mortality and Morality, op. cit., p. 5.
10 Ibid. p. 7.
11 Ibid.
12 Bauman, Post-Modern Ethics, op. cit., p. 53.
13 Lieven and Hulsman, Ethical Realism, op. cit., p. 123.
14 A. Malachowski, Richard Rorty, Princeton, NJ: Princeton University Press, 2002, p. 60.
Bibliography
Adkins, Arthur, Merit and Responsibility: A Study in Greek Values, Oxford: Oxford University Press, 1960.
Alberts, David, and R.E. Hayes, Power to the Edge: Command and Control in the Information Age, Washington, DC: Department of Defense, CCRP, 2003.
Alexander, John, Future War: Non-Lethal Weapons in 21st Century Warfare, New York: St Martin's Press, 1999.
Amato, Joseph, Victims and Values: A History and Theory of Suffering, New York: Praeger, 1990.
Arnold, Thomas, The Renaissance at War, London: Cassell, 2001.
Arquilla, John, and D. Ronfeldt (eds), In Athena's Camp: Preparing for Conflict in the Information Age, Santa Monica, CA: RAND Corporation, 1997.
Aron, Raymond, Peace and War Between Nations, Paris: Gallimard, 1962.
Ashkenazy, Daniella (ed.), The Military in the Service of Society and Democracy: The Challenge of the Dual Role Military, Westport, CT: Greenwood, 1994.
Bartov, Omer, Hitler's Army: Soldiers, Nazis and War in the Third Reich, Oxford: Oxford University Press, 1991.
Bauman, Zygmunt, Post-Modern Ethics, Oxford: Blackwell, 2004.
Bauman, Zygmunt, Liquid Love, Cambridge: Polity, 2003.
Bauman, Zygmunt, Consuming Life, Cambridge: Polity, 2007.
Berdal, Mats, Studies in International Relations: Essays by Philip Windsor, Brighton: Sussex University Press, 2002.
Bourke, Joanna, An Intimate History of Killing: Face to Face Killing in 20th Century Warfare, London: Granta, 1999.
Burleigh, Michael, The Third Reich: A New History, London: Macmillan, 2000.
Caputo, Philip, A Rumour of War, London: Macmillan, 1978.
Castells, Manuel, The Internet Galaxy, Oxford: Oxford University Press, 2001.
Chaniotis, Angelos, War in the Hellenistic World, Oxford: Blackwell, 2005.
Cooper, Robert, The Breaking of Nations: Order and Chaos in the 21st Century, London: Atlantic, 2003.
Cornish, Paul, Cry Havoc and Let Slip the Managers of War: The Strategic Hazards of Micro-Managed Warfare, 51, Camberley: Strategic and Combat Studies Institute, 2006.
De Landa, Manuel, War in the Age of Intelligent Machines, Cambridge: Massachusetts Institute of Technology, 1994.
Derradji, Abder, The Algerian Guerrilla Campaign, Queenstown: Edwin Mellen Press, 1997.
Donald, Dominick, After the Bubble: British Private Security Companies After Iraq, Whitehall Paper 66, London: Royal United Services Institute, 2006.
Dyer, Gwynne, War: The Lethal Custom, New York: Carroll & Graf, 2007.
Edwards, Paul, The Closed World: Computers and the Politics of Discourse in Cold War America, Cambridge: MIT Press, 1996.
Feakin, Tobias, Non-Lethal Weapons: Technology for Lowering Casualties?, Master's thesis, University of Bradford, 2005.
Ferguson, Niall, The War of the World: History's Age of Hatred, London: Allen Lane, 2006.
Ferry, Luc, Man Made God, Chicago: University of Chicago Press, 2002.
Finkielkraut, Alain, In the Name of Humanity: Reflections on the 20th Century, London: Pimlico, 2001.
Freedman, Lawrence, The Transformation of Strategic Affairs, Adelphi Paper 379, London: International Institute for Strategic Studies, 2006.
Furedi, Frank, The Culture of Fear, London: Cassell, 1998.
Furedi, Frank, The Culture of Fear Revisited, London: Continuum, 2006.
Fussell, Paul, The Bloody Game: An Anthology of Modern War, New York: Scribners, 1991.
Gallie, W.B., Understanding War, London: Routledge, 1991.
Gelven, Michael, War and Existence: A Philosophical Enquiry, University Park, PA: Pennsylvania State University Press, 1993.
Glover, Jonathan, Humanity: A Moral History of the Twentieth Century, New Haven: Yale University Press, 2001.
Grossman, David, On Killing: The Psychological Cost of Learning to Kill in War and Society, New York: Little, Brown, 1995.
Hale, John, War and Society in Renaissance Europe, 1450–1620, London: Fontana, 1983.
Hart, Gary, The Shield and the Cloak: Security in the Commons, Oxford: Oxford University Press, 2006.
Hedges, Christopher, War is a Force That Gives Us Meaning, New York: Public Affairs, 2002.
Herr, Michael, Despatches, London: Picador, 1979.
Herzog, Tobey, Vietnam War Stories: Innocence Lost, London: Routledge, 1992.
Heuser, Beatrice, War, Peace and World Order in European History, London: Routledge, 2001.
Heuser, Beatrice, Reading Clausewitz, London: Pimlico, 2002.
Hillman, James, A Terrible Love of War, London: Penguin, 2004.
Horne, Alistair, A Savage War of Peace: Algeria 1954–1962, New York: Viking Press, 1977.
Ignatieff, Michael, The Warrior's Honour: Ethnic War and the Modern Conscience, London: Chatto & Windus, 1998.
Jaspers, Karl, The Way to Wisdom, New Haven: Yale University Press, 1954.
Johnson, James, Morality in Contemporary Warfare, New Haven: Yale University Press, 1999.
Jonas, Hans, The Imperative of Responsibility, Chicago: University of Chicago Press, 1984.
Jonas, Hans, Mortality and Morality: A Search for the Good After Auschwitz, Evanston, IL: Northwestern University Press, 1999.
Joyce, Mark, Transformation of Military Operations on the Cusp, RUSI Whitehall Paper 106, London: Royal United Services Institute, 2006.
Kaplan, Robert, Warrior Politics: Why Leadership Demands a Pagan Ethos, New York: Vintage, 2002.
Kassimeris, George, The Barbarisation of Warfare, London: Hurst & Co., 2006.
Lukacs, John, At the End of an Age, New Haven: Yale University Press, 2002.
Lynn, John, Battle: A History of Combat and Culture, Boulder, CO: Westview, 2003.
MacIntyre, Alasdair, A Short History of Ethics, London: Routledge, 1998.
MacIntyre, Alasdair, After Virtue: A Study of Moral Theory, London: Duckworth, 2002.
Mackinlay, John, Defeating Complex Insurgencies: Beyond Iraq and Afghanistan, RUSI Whitehall Paper, London: Royal United Services Institute, 2005.
Magenheimer, Heinz, Hitler's War, London: Cassell, 1998.
Malachowski, Alan, Richard Rorty, Princeton, NJ: Princeton University Press, 2002.
Mills, Greg, The Security Intersection: The Paradox of Power in an Age of Terror, Johannesburg: Witwatersrand University Press, 2004.
Ninh, Bao, The Sorrows of War, London: Pimlico, 1995.
O'Brien, Tim, The Things They Carried: How to Tell a True War Story, New York: Scribners, 1990.
O'Connell, Robert, Of Arms and Men, Oxford: Oxford University Press, 1989.
Paris, Michael, Warrior Nation: Images of War in Popular British Culture 1850–2000, London: Reaktion Books, 2000.
Parker, Geoffrey (ed.), The Thirty Years War, London: Routledge, 1997.
Parker, Geoffrey, Empire, War and Faith in Early Modern Europe, London: Penguin, 2002.
Poole, Steven, Unspeak, London: Little, Brown, 2006.
Prins, Gwyn, and Tromp, Hylke, The Future of War, The Hague: Kluwer Law, 2000.
Ricks, Thomas, Fiasco: The American Military Adventure in Iraq, London: Allen Lane, 2006.
Rorty, Richard, Consequences of Pragmatism, Minneapolis: University of Minnesota Press, 1982.
Rorty, Richard, Contingency, Irony and Solidarity, Cambridge: Cambridge University Press, 1989.
Rorty, Richard, Philosophy as Cultural Politics, Cambridge: Cambridge University Press, 2007.
Sachs, Jonathan, The Dignity of Difference, London: Continuum, 2002.
Schalk, David, War and the Ivory Tower, Lincoln: University of Nebraska Press, 2005.
Schiff, Ze'ev, and E. Ya'ari, Israel's Lebanon War, London: George Allen, 1985.
Schmidtchen, David, Eyes Wide Open: Stability, Change and Network-Enabling Technology, Working Paper 129, Canberra: Land Warfare Centre, 2006.
Schmitt, Carl, The Theory of the Partisan: A Commentary/Remark on the Concept of the Political, 1962. Available at http://msuperess.msu.edu/journals/cr/schmitt.pdf
Schmitt, Carl, The Concept of the Political, Chicago: University of Chicago Press, 1996.
Schmitt, Carl, The Nomos of the Earth in the International Law of the Jus Publicum Europaeum, New York: Telos Press, 2003.
Sennett, Richard, Respect: The Formation of Character in an Age of Equality, London: Penguin, 2004.
Shay, Jonathan, Achilles in Vietnam: Combat Trauma and the Undoing of Character, New York: Simon & Schuster, 2004.
Shukman, David, The Sorcerer's Challenge: Fears and Hopes for the Weapons of the Next Millennium, London: Coronet, 1995.
Silverstone, Roger, The Media and Morality, Cambridge: Polity, 2007.
Smith, Rupert, The Utility of Force: The Art of War in the Modern World, London: Allen Lane, 2006.
Sontag, Susan, Regarding the Pain of Others, London: Hamish Hamilton, 2003.
Stearns, Peter, American Fear: The Causes and Consequences of High Anxiety, London: Routledge, 2006.
Steiner, George, Grammars of Creation, London: Faber & Faber, 2004.
Tertrais, B., War Without End, New York: The New Press, 2005.
Thompson, John, Political Scandal: Power and Visibility in the Media Age, Cambridge: Polity, 2000.
Todorov, Tzvetan, Hope and Memory: Reflections on the 20th Century, London: Atlantic, 2003.
Toulmin, Stephen, Return to Reason, Cambridge: Harvard University Press, 2001.
Tritle, Lawrence, From Melos to My Lai: War and Survival, London: Routledge, 2000.
Urry, John, Global Complexity, Cambridge: Polity, 2003.
Van Creveld, Martin, Command in War, Cambridge: Harvard University Press, 2003.
Van Creveld, Martin, The Changing Face of War: Lessons from the Marne to Iraq, New York: Ballantine Books, 2006.
Virilio, Paul, Negative Horizon, New York: Continuum, 2005.
Walzer, Michael, Arguing About War, New Haven: Yale University Press, 2004.
Watson, Peter, Ideas, London: Phoenix, 2006.
Williams, Bernard, Truth and Truthfulness, Princeton: Princeton University Press, 2002.
Woods, David, A Step Back: Ethics and Politics After Deconstruction, New York: State University of New York Press, 2006.
Wright, Evan, Generation Kill: Living Dangerously on the Road to Baghdad with the Ultra Violent Marines of Bravo Company, New York: Bantam, 2004.
Wyschogrod, Edith, Spirit in Ashes: Hegel, Heidegger and Man-Made Death, New Haven: Yale University Press, 1985.
Index
9/11 terrorist attacks: a new discourse on war after 2–6, 16, 78, 87–95; objective criminality 11–12; public relations war in the Muslim world after 105; terrorist studies after 81–2
300 155
1984 88, 126, 127
absolute war 52, 59
Abu Ghraib 77–8, 89, 105, 108, 141
abuse 89–91
Adams, Douglas 152
Additional Protocols (1977) 8
Adkins, Arthur 116
Afghanistan 62, 85, 107; Bagram 89; Operation Anaconda 147; polymorphous actors in 100–1; PSCs in 138–9, 140; Third Afghan War 1919 146–7
Age of the Intelligent Machine, The 144
Agee, James 106–7
agency 146, 153
Al-Jazeera 110
Al-Qaeda 73, 80, 81; advice to members on being captured 105–6; fatwa against the American people 12; global media systems and 82–3, 97–8, 107; in Guantanamo Bay 91, 94–5
Alexander, John 129–30
Algerian War 53, 61–7, 76, 94
Algiers 81
Amnesty International 92, 93, 130
Anabasis 68
anonymity in war 32, 35, 55
'applied ethics' 6, 15
Al Aqsa intifada 70–6
aristocracy 32–5, 118–19, 121
Aristophanes 161
Aristotle 15, 131, 171
Arithmetic on the Frontier, The 147
Arkin, Ronald 149
Arnold, Thomas 34
Aron, Raymond 61–2
Art of War, The 18
artificial conscience 149
Artificial Intelligence (AI) 148–9
Asimov, Isaac 149
Athenians 161–5, 166
authenticity 65, 133
'axis of evil' 2–3
bad faith 65–6, 76, 89
Bagram 89
Balkans 112, 135, 136
Barres, Maurice 123
Bartov, Omer 59–60
Battle 30
Battle of Mogadishu 103
Battle of Waterloo 35
battlespace: actors in the 107, 143; micro-management of the battlefield 134–8, 153; networked 76, 96, 98–102
Baudrillard, Jean 3, 79
Bauman, Zygmunt 84, 122
Beccaria, Cesare 126
Beck, Ulrich 137
Beckerman, Linda 103
Begin, Menachem 90
Benjamin, Walter 10
Bentham, Jeremy 32–3
Berger, Peter 12, 15, 126
betrayal of humanity 132–3
Bible 166, 172
Bin Laden, Osama 12, 73
blogs 107, 108, 112
Blok, Anton 117, 118
blood, power through 121–2
Blue Force Tracker 137
bodies 110, 115, 118, 121, 122, 137
Body Count 41
body counts: victory and 25–7; in the Vietnam War 40, 41
Boll, Heinrich 26
Bosnia 101, 129
Boswell, James 8
Bourke, Joanna 95
Breaking of Nations, The 86
British Royal Military Police 135–6
Broch, Hermann 60–1
Brown, Chris 136
Bunting, Madeleine 127
Bush, George W. 4, 78, 87, 89
Call for a Global Islamic Resistance, The 83
Calley, William 41, 44
Cambodia 11
Camus, Albert 66, 67, 134
Canetti, Elias 25–6
Capra, Fritjof 96
Caputo, Philip 36–8, 59
Castells, Manuel 87, 97, 127, 136
Categorical Imperatives 9, 22, 23, 74, 75, 137, 173
Charge of the Light Brigade 32
Che Guevara 51
chemical weapons 130
chivalry 34, 147
Christian ethics 21, 28, 29
Christianity and Marxism 50
Churchill, Winston 96, 147
citizen armies 69, 70
citizenship 66, 68
'civic militarism' 68
civic norms applied to the military 135–6
civil liberties 16
civil society 54, 55, 85
civilians: in battle 20, 70–1, 105; distinguishing between the state and 2
classical studies 155
Clausewitz 4, 6, 17–18, 24, 26–8, 29, 34, 45, 46–7, 52, 54, 55–6, 59, 75, 151
Close Quarters 43
'closed world' discourse 38–43
Coates, Joseph 128
codes of war, ethical 4–6, 14–15, 170; transparency of 107, 112
Cold War 16, 35, 44, 80, 86
Collingwood, R.G. 11
Combatant Status Review Tribunals 93
Commissar Order 57
Communism 35, 50
complexity theory 96
computer war, first 38–9
Congress of Vienna 5
conoidal bullets 144–5
conspiracy theories 3
Cooper, Robert 86
Cornish, Paul 137
Cossacks 56–7
costs of war 109, 146–8
counter-insurgencies 34, 53, 101, 135
Crimean War 32
criminalisation of war 136
crowds 25–6
cruelty 11, 12, 13, 127, 170
cultural sites 5
culture of warfare 67–8
cycles of violence 129–30
Davis-Hanson, Victor 68
Dawkins, Richard 111
de Landa, Manuel 144–6
de-territorialisation 83, 85
de Tocqueville 33
decapitations 64, 117, 118
Declaration of the Rights of Man 59, 66, 76
Decline of the West 25
defeat 18, 27, 67, 75, 129, 160, 166
DeLillo, Don 115
democracy 18, 68
demons 3
Dennett, Daniel 111
Dershowitz, Alan 72
Despatches 36
Dewey, John 10, 13, 23–4, 167, 175
digital archiving 109
dignity 126
directed-energy weapons 130–1
Dirk Gently's Holistic Detective Agency 152
discourses on war 30–5; after 9/11 2–6, 16, 78, 87–95; in the Algerian War 53, 61–7, 76, 94; 'closed world' discourse in Vietnam 38–43; in the future 95–112; Israeli 67–74, 76; Nazis on the Eastern Front 56–61, 76; partisans in the Western 46; reality/discourse divide 35–6; terrorist 112–13
distancing warfare 43, 127
double agents 94
duels 4, 117, 118, 119
duty 116
Eastern Front 1941-5 56–61
Eco, Umberto 95
Edwards, Paul 38
Effects-Based Operations (EBO) 142
Egypt 73, 105
Ellul, Jacques 151
Emerson, Ralph Waldo 135, 153
An End to Evil 3
enemies: absolute 49, 51–2, 59; in defeat 18, 27, 67, 75, 129, 160, 166; distancing 43, 127; distinction between a state and its citizens 2; identifying 49; as the 'other' 50, 117; respect for 38, 47, 65, 74, 124–7; in the War on Terror 2–3, 5, 50; see also 'unlawful combatants'
Epictetus 126
Essay on Courage 153
Essays Moral and Political 22
ethical leadership 42
Ethical Realism 174
ethical strangers 85, 87
ethics of war 7–15
etiquettes of atrocity 17–24
Euripides 161, 166–8
Europe 79, 84–7
Europe without a Baedeker 5
evil 2–3, 15, 49, 133, 157
existentialism 12, 65
expressive violence 95, 115, 117–19
'extraordinary rendition' 91, 93–4
Fadallah, Sayyed 120–1
Falling Man 114
Fallujah 64, 101, 109, 110, 135
Fascism 24, 35, 122
fear 33, 51, 56, 65, 160, 168, 169–75; of terrorism 80, 169–70, 174–5
Ferguson, Adam 28
Ferguson, Niall 27, 57–8, 111
Ferry, Luc 123–4
Fichte, Johann Gottlieb 126
Finkielkraut, Alain 32
firearms 35, 55, 129, 144–5
force 39, 158–60
Foreign Legion 53
foreign volunteers 101–2, 110, 114
Foucault, Michel 96, 121–2
Fourth Generation Warfare 83
France: Algerian War 61–7, 76; French Revolution 27, 54, 55, 66, 118; parallels between the German occupation and the Algerian War 66–7; Paris riots 2005 99; Vendée civil war 45–6, 47
French Indo-China War 50
Friedman, Tom 5, 96
Frum, David 3
Fukuyama, Francis 84
Furedi, Frank 152
Fussell, Paul 31–2, 41
Future of the Classical, The 155
future wars 95–112, 151; non-lethal weapons 127–31; robots in 145–52
Gadamer, Hans-Georg 14
Geertz, Clifford 101
Geneva Conventions 7–8, 16, 20; 'outdated' 78; PSCs observance of 141; ratification by the Soviet Union 57; terrorist movement and 5, 6; in the Vietnam War 43, 47; violations at Abu Ghraib 77–8
Germany 45, 47, 52; bombing of Hamburg 56; distinction between the Nazi state and its citizens 2; Eastern Front 1941-5 56–61; on losing World War II 67; suspension of morality in World War II 24, 25, 29; Wehrmacht Army 60
Gerson, Michael 2
Gestalt of War, The 17
Gladwell, Malcolm 96
global insurgency 82–3, 85
global jihadists 101–2
global market 96, 99
global public opinion 105
glory of war 32, 33, 35, 147
Glover, Jonathan 163
God and ethics 28, 29, 163
Godelier, Maurice 124
Goebbels, Joseph 25
Going after Cacciato 37, 48, 170–1
Gonzales, Alberto 78
Gorgias 164
Gowing, Nic 109
Grammars of Creation 115
grammars of killing 76, 114–24
Greeks 116, 154, 155–7; Hecuba 166–8, 170; hoplite warfare 116, 165; The Iliad 157–60; influence on Western warfare 21, 68–9, 155–7; massacre at Melos 161–5, 166; moral codes 116
Green, Henry 25
Grenada 78
Grotius, Hugo 20–1
Guantanamo Bay 77–8, 90, 91–5
Guardian, The 109, 127, 135
guerre revolutionaire 62, 63, 67, 76
guerrilla war 44–5
Guidelines for the Conduct of Troops In Russia 56
Gulf War (1990-1) 17, 33, 79, 88, 106
Gulf War Did Not Take Place, The 79
gunpowder 34
Habermas, Jurgen 5
Hague Conventions 5, 8, 21, 57
Hague Laws of Land Warfare 57
Haig, Alexander 80
Haiti 129
Hale, John 119
Hamburg, bombing of 56
Hecuba 166–8, 170
Hedges, Chris 5, 101
Hegel, Georg W. F. 11, 28, 54–5, 56
Heinemann, Larry 43
Herr, Michael 36
Herzl, Theodor 69
Hicks, Tyler 107
Hiroshima 19–20, 25, 173
'historic' actions 10
historic sites 5
historical contingency 56
History of Ethics, The 115
History of Sexuality, The 121
History of the Peloponnesian War 160–5, 166
Hitler, Adolf 25, 26, 56
Hizbollah 62, 71, 101, 120
Hobbes, Thomas 21, 37, 101, 118
Hoffman, Bruce 82
Holbrooke, Richard 98
Holocaust 125, 150
Homer 157–60, 161
honour 38, 112, 118, 119
hoplite warfare 116, 165
How we Lost the Hi-Tech War of the Future 104–5
Hulsman, John 174
human rights 66
human shields 105
humane wars 11, 34, 124, 128
humanity, betrayal of 132–3
Hume, David 10, 22, 27
Hundred Years War 35
Huntington, Samuel 85
Hussein, Saddam 105, 109
Huxley, Aldous 19
Hypothetical Imperative 75
Ibn al-Sheikh al-Libi 93
If this is a Man 132
Ignatieff, Michael 134, 168
Iliad, The 157–60
Improvised Explosive Device (IED) 147
'increasing returns' 99–100
Indian wars 45–6
Indo-China War, French 50
Information Age, The 97
information, control of 107–8
information overload 109
'inner immigration' 60
insurgencies 53, 62, 82, 100, 101–2; counter-insurgencies 34, 53, 101, 135; global 82–3, 85; swarms in 103–4
inter-subjectivity 117, 124, 126, 133
international aid workers 140
International Criminal Court 49, 136
International Crisis Group 95
international humanitarian law 1, 7, 21, 79, 89; PSCs observance of 140–1
international law 48, 55, 136; 'positivist' 19, 20; US position on 106
internet 96, 97, 98, 109
interrogation without torture 94–5
Iraq 5, 45, 53, 64, 95, 106, 114; Abu Ghraib 77–8, 89, 105, 108, 141; American soldiers attitudes in 34, 84; British policy in 129, 135, 136; checkpoints in 81, 142; differences in American and British military culture in 135; European perceptions of 87; execution of Saddam Hussein 109; Fallujah 64, 101, 109, 110, 135; foreign fighters in 101, 110, 114; human shields in 105; insurgency in 53, 62, 82, 100, 101–2; legality of war in 7; Najaf 101, 120; PSCs in 138–9, 140, 141, 142; suicide bombers in 81; use of non-lethal weapons in 130; use of robotics in 148, 149
'irregular war' 45
Israel 12, 53, 67–74, 76
Israeli Defence Force (IDF) 69–73
James, William 157, 158
Japan 24
Jarhead 155
Jaspers, Karl 133–4, 152, 153
Jefferson, Thomas 126
Jenin 72
jihads 85, 98, 101–2, 107, 110–11
Johnson, Gordon 148
Jonas, Hans 153, 172–3
Jordan 93, 105
'juridification' of war 5
jus ad bellum 17, 172
jus in bello 6, 13, 17, 20, 172
jus publicum Europaeum 54
'just wars' 7, 8–9
Kant, Immanuel 9, 21, 22, 23, 32, 54, 55, 74, 75, 137
Kaplan, Robert 23, 156
Katyn massacre 57
Katzav, Moshe 73
Keegan, John 52
Kelly, Kevin 102, 103
Kennedy, John F. 65
killing, grammars of 76, 114–24
Killion, Thomas 148
Kipling, Rudyard 147
Kissinger, Henry 40
knights 34
Kosovo war 105
Kuhn, Thomas 10, 87
Kundera, Milan 109
Kurzweil, Ray 144
Kyoto Accord 85, 86
labour movements 97
Lacheroy, Colonel Charles 64–5
Landau Commission 72
Landsturm edict 47
Lane-Fox, Robin 166
language 13, 82; doublespeak 88; of ethics 23; euphemisms 88; of killing 114–15; of morality 10, 22–3; unspeak 88–91; of the Vietnam War 38, 40–1
Last Supper, The 79
laws of war 4, 7, 48, 59; etiquettes of atrocity 17–24
Laws of War and Peace, The 20
Laws, The 157
leadership, ethical 42
Lebanon: 1982 War 69–70, 110; 2006 War 71, 101; suicide bombers in 120–1
Lefevre, General 46
legal advisors 135
Let us Now Praise Famous Men 106–7
Lethal Weapon 129
Levi, Primo 132, 150
Levinas, Emmanuel 125, 132
liberalism 9–16, 35, 48, 91, 98, 127; dangers of terrorism 51, 80, 174–5; going to war for peace 170; irony of 106
Liddell-Hart, Basil 28
Lieven, Anatol 174
Lifton, Robert 64
Lind, William 83
London bombings 7/7 85, 110
Long War 79, 80
longbows 34–5
Looming Tower, The 73
Luhmann, Niklas 138
Lukes, Stephen 31
Lynn, John 30–1, 68
Lysistrata 161
MacFarlane, Colonel Sean 84
Machiavelli, Niccolo 18, 46, 55, 166
machines 96, 144–52, 153
MacIntyre, Alasdair 6, 8, 15, 115, 116, 124, 172
Madrid bombings 85, 109
Magdeburg 19–20
Malaparte, Curzio 58
Manning, Frederick 56
Mansfield, Susan 17
Marx, Karl 29, 99
Marxism 44, 50, 115
Massu, General Jacques 64, 67
Mattis, General James 77
McCain, John 94–5, 127
McNamara, Robert 39
media 107–10
medieval warfare 34
Meier, Christian 162
Melos massacre 161–5, 166
meme theory 111–12
Merit and Responsibility 116
metaphysics 9, 10, 13–14, 174
Metropolis 25
micro-management of the battlefield 134–8, 153
microbial agents 130
microwave weapons 128
Middle Parts of Fortune, The 56
military culture 33, 67–8, 135
military discourses 31
Mill, John Stuart 10
Modern Times 25
Mofaz, Shaul 73
Moral Equivalent of War, The 157
Moral History of the 20th Century 163
moral law, absolute 23
moral vacuums 37, 38
morality 9–10, 15, 24, 49, 121, 172; in the absence of a divinity 163; defining morals 7; Greek 116; language of 10, 22–3; micro-management of 134–7, 153; modern 124–5; networked 104–11; relationship with politics 58–9; respect and 65; responsibility
and 134, 151–2, 173; states as personae morales 48; superior 11, 23; suspension of 24, 25, 29, 126; and 'the heuristics of fear' 173
Morocco 81
muskets 35
Muslims: British 81, 83; disaffected young men 110–11
My Lai massacres 41–4
Najaf 101, 120
Napoleon Bonaparte 27, 45, 46, 47, 54, 145
Napoleonic Wars 25, 27, 35
Nasar, Mustafa Setmariam 83
National Liberation Wars 44
nationalism 50, 54, 66
natural law 19, 20, 21–2
Neff, Stephen 20
networking technologies 51, 137
networks: global networked war 76, 96, 98–102; memes 111–12; networked morality 104–11; as paradigms 97; swarms 102–4; terrorist 82–3, 97–8, 107
New Science 24
New World Order 48
New York Times, The 107, 109
news environment 108–10
Nicolis, Grégoire 103
Nietzsche 162–3
Ninh, Bao 50
Nomos of the Earth, The 48
Non-lethal and Non-destructive Combat in Cities Overseas 128
non-lethal weapons 127–31
non-state actors 2, 5, 11–12, 83
Northern Ireland 129
Oakeshott, Michael 22–3
objective criminality 11–12, 57
O'Brien, Tim 37, 48, 170–1
Orwell, George 61, 88, 126, 127
Paco's Story 43
pain 12, 29, 106, 126, 133
Pakistan 11, 81
Palestine 53, 66, 68, 76, 111; Al Aqsa intifada 70–3
Palestinian Liberation Organization (PLO) 71
Paris de Bollardière, General Jacques 66–7
Paris riots 2005 99
Parker, Geoffrey 17–19, 29, 30, 48
partisans 44–52
Passenger 57 129
'path dependence' 100
Pearl, Daniel 118
peace 170–4
peace missions 134–5
Peloponnesian War 160–5, 166
Peninsular War 44–5
performative biopower 110
Pericles 165, 166
Perle, Richard 3
Peters, Ralph 78
Phenomenology of Spirit 54
philosophy 8–15, 22, 174–5
Philosophy of Right, The 28
photography 106–7, 108, 109, 110
'physics of war' 21
plastic bullets 129
Plato 8, 10, 157, 164
Player Piano 144
Poland 57
policing wars 136
Political Scandal 105
politics and ethics 21, 28–9, 58–9, 166
Politics in the English Language 88
Politics, The 171
polymorphous conflict 61–2, 96, 100–1
Poole, Steven 88
'post-modern' conflict 38, 41
post-traumatic stress disorder (PTSD) 37
power 95, 161, 162, 164; increasing transparency of 106–7, 112; media enhancement of the use of 110; through blood 121–2; 'will to power' 163
Power of Words, The 158
pragmatism 23, 24
Prigogine, Ilya 103
prisoners of war (POWs) 24–30; at Abu Ghraib 77–8, 89, 105, 108, 141; in both World Wars 27; dilemma of terrorists' status 93; on the Eastern Front 57, 58, 76; in Guantanamo Bay 88, 91–5; Levinas' experiences in a German camp 132–3; management of camps by PSCs 140; respect for 65; in Vietnam 43, 127
private sector 139–40; Private Security Companies (PSCs) 138–43, 153
proportionality 20
Prussia 47
public executions 117, 118
public exposure 106–10, 112
public opinion, global 105
Pulsed Energy Projectile (PEP) 130
Quadrennial Defense Review (QDR) 2006 79
Qutb, Sayyid 73
race and war 43–4, 57, 111–12
Rambo 130
Ramonet, Ignacio 109
al Rantissi, Abdel-Aziz 72
Reagan, Ronald 93, 110
reality of war 30–1, 48, 95, 156; reality/discourse divide 35–6
Regarding the Pain of Others 106
'regular war' 45
Reid, John 1, 2, 6
Rejali, Darius 94
relativism 9
religion 3–4, 44, 45, 112, 174–5; and ethics 21, 28, 29, 163; and sacrifice 123; and sexuality 122
'repetitive administration' 89, 91
reputations, damaging 105–6, 112
respect 38, 47, 65, 74, 124–7
responsibility 133, 134, 151–2, 153, 172, 173
revenge 129, 165, 167, 168
Rice, Condoleezza 93
Ricks, Thomas 78, 142
Rise of the Network Society, The 127
risk 86–7, 152; cost v. 146–8; management of war 137–8; and PSCs 153; societies 173–4
robot historians 144
robots 145–52, 153
Roosevelt, Franklin D. 169
Rorty, Richard 8–15, 22–3, 30, 74, 173, 174
Roy, Olivier 85
Rubens, Peter 19
rules of war 2, 38, 134–5, 146; for robots 149
Rumour of War, A 36–7
Rumsfeld, Donald 2, 79, 81, 86, 89, 90, 107–8
Russell, William Howard 32
Russia 56–61; Russian Army (1812) 56–7
Rwanda 112
Sachs, Oliver 122
sacredness 122–4
sacrifices 122–4, 152
Sagan, Carl 3
Said, Edward 66
Sanchez, Ricardo 78
Saragossa 45
Sartre, Jean-Paul 12, 13, 65, 66
Saudi Arabia 81
Scarry, Elaine 127
Schmitt, Carl 44–52, 54
Schoomaker, General Peter 34
Scott, Ridley 103
self-trust 135, 137, 153
‘selfish gene’ 111
Sennett, Richard 126
Settis, Salvatore 155
sexuality 122
Shay, Jonathan 38
Shklar, Judith 11
Siblani, Osama 105
Simmel, Georg 118
Simon, Pierre-Henri 65–6
Six Day War 70
Sleepwalkers, The 60–1
Slowness 109
Smith, Adam 22, 27
Smith, General Rupert 98
Socrates 150, 151
soldiers: and applying civilian norms to the military 134–5; ethical questions for 150; ‘hazarding’ their lives 137–8; honour 38; moral responsibility of Western 134, 137; in a moral vacuum 37, 38; rules of war 2, 38, 134–5, 146; self-trust 135, 137, 153; see also warrior ethos
Somalia 103, 105, 128–9, 147
Sontag, Susan 106, 107
Sorrows of War, The 50
souls 115–16, 122
Soviet Union 35, 56–61, 86
Spain: Madrid bombings 85, 109; Spanish war (1808–13) 44–5
Spengler, Oswald 25
Spielberg, Steven 79
St Augustine 171–2
Stanton, Mike 129
state of nature 37
states: as instruments of political parties 52; legitimizing violence 4–5, 28, 95, 119, 129; origins of modern 28; as personae morales 48; rise of nation 54–5; subordination to international institutions 49; use of the private sector 139–40, 153
Steiner, George 115, 131
Stephens, Robert 94
Stern, Fritz 112
Stone, Lawrence 118–19
Strauss, Leo 49
subcontracted services 138–43, 153
Suez War 70
suicide bombers 62, 81, 86, 115, 119–21, 123; and how war should be fought 127; in the Israeli-Palestinian conflict 68, 72, 76
Suppliants, The 161
surrender 27
swarming 102–4
Tactical Autonomous Combatants (TACs) 148
Taliban 100, 107
technology 18, 34–5, 51, 143–52; ethical practices and 35, 137; the first computer war 38–9
technology studies 145
telling stories 10–12, 98
Tenet, George 93
Terminator 145
terrorism: Al Aqsa intifada 70–3; differing European and US approaches to 84–7; discourse on war 112–13; expressive violence of 115, 119–20; falling cost of 109; fear of 51, 80, 169–70, 174–5; as a global insurgency 82–3, 85; networks 82–3, 97–8, 107; status of captured terrorists 93; thwarting attacks 81; see also War on Terror
terrorist memes 111–12
terrorist studies 81–2
Theory of Moral Sentiments, The 22
Theory of the Partisan 44–52
Thiepval 29
Thin Red Line, The 158
Third Categorical Imperative 173
Thirty Years’ War 19, 20, 21, 29, 45
Thompson, John 105–6, 110
Thomson, Robert 40, 106
Thoreau, Henry 110, 169
Three Musketeers, The 80
Thucydides 6, 160–6, 170
Times, The 32
Tinker, Hugh 44
Tipping Point, The 96
tipping points 99
Todorov, Tzvetan 23
Tolstoy, Leo 138
Tomb of the Unknown Soldier 32
torture: in the Algerian War 63–4, 66; in Egyptian prisons 73; under ‘extraordinary rendition’ 91, 93–4; humiliation of 126, 127; information obtained under 94, 127; Landau Commission on 72; making a case for 1–2; by US 77–8, 89–91, 94–5, 108, 130; use of non-lethal weapons 130; in Vietnam 127
transcendental humanism 124
Treaty of Westphalia 21, 27
Trilling, Lionel 126
Trojan War 32
Trojan Women, The 170
Troy 155
trust: self-trust 135, 137, 153; violation of 106
truth, quest for 8–10, 14, 15
Turner, Bryan 122
Unconditional Imperative 133–4
Under Siege 129
unintended harm 99
United Kingdom: 7/7 bombings 110; applying civilian norms to the army 135–6; differences from US in military culture 135; Effects-Based Operations (EBO) 142; military policy 129, 135, 136; policy of restraint 129; using NLWs in Northern Ireland 129
United Nations Charter 17
United Nations Convention Against Torture (1987) 89, 94
United States: abuse and torture of prisoners of war 77–8, 89–91, 94–5, 108, 130; Al-Qaeda's fatwa against the American people 12; army recruitment 139; attitudes to Europe 79; and comparisons with Europe on approach to terrorism 84–7; development of non-lethal weapons 127–31; differences from UK in military culture 135; Effects-Based Operations (EBO) 142; exporting of liberalism 175; ignorance of Islam in 82; Indian Wars 45–6; micro-management of the battlefield 137; PSCs in the military 138–9, 141, 142; public relations war in the Muslim world after 9/11 105; reputation of 105–6, 112; towards more humane wars 34, 124, 128; use of robotics in war 148–9; warrior ethos in the military 33, 34, 129
‘unlawful combatants’ 36, 43, 87, 88, 95; extraordinary rendition of 93
unmanned surveillance aircraft 148
unspeak 88–91
Urry, John 98–100, 102, 110
US Central Command (CENTCOM) 108
utilitarianism 32
values 34, 121
van Creveld, Martin 17, 39–40, 52, 53
Vasquez, John 31
Vendée civil war 45–6, 47
Verdun 123
Vico, Giambattista 21–2, 24, 164–5
victory 167; body count and 25–7
Vietnam War 36–44, 47, 50, 88, 127
violence: cycles of 129–30; expressive 95, 115, 117–19; fundamentalist 174–5; legitimate 4–5, 28, 95, 119, 129; protocols of 29; warriors’ fear of being consumed by 160
Voltaire 4
von Moltke, Helmuth 19
Vonnegut, Kurt 144
Walzer, Michael 13
War, On 26–7, 47
War and Peace 138
War and the Law of Nations 20
war crimes tribunals 18
war fatigue 82
war memorials 29, 32
War of the World, The 57–8, 111
War of the Worlds, The 79
War on Terror 1–16, 76, 79–83; challenge to civil liberties 16; crisis in the Western Alliance 84–7; fear in 51, 80, 169–70, 174–5; a new paradigm 87–95; proliferation of actors in 96; see also terrorism
warrior ethos 33, 34, 129, 133, 143; and fear of being consumed by violence 160; and self-trust 135, 137, 153
‘Warrior’s honour’ 38
Washington, George 78
Wealth of Nations, The 22
Weber, Max 29
Wehrmacht 60
Weil, Simone 158–60
Wells, H.G. 79
Western Alliance crisis 84–7
Wiesel, Elie 11
Williams, Bernard 6–7, 14
Wilson, Edmund 5
women: off the battlefield 167; in the US military 33
Wood, David 125
World War I 2, 19, 21, 27, 56, 59; anonymity in 32, 35; memorials 29, 32; Verdun 123
World War II 2, 27, 35, 52, 56; cultural vandalism 5; on the Eastern Front 1941–5 56–61; German people in defeat 67; Hiroshima 19–20, 25, 173; Holocaust 125, 150; mass bombing campaigns 5, 16, 39, 56; paradigm 62; Robert McNamara in 39; suspension of morality in Germany 25, 29; Wartime Interrogation Centre 94
Wouters, Cas 33
Wright, Evan 96
Wright, Lawrence 73
Yassin, Sheikh Ahmed 72
Yugoslavia 19, 52, 53, 101
Yunis, Fawaz 93
al Zawahiri, Ayman 72–3