Codes and Codings in Crisis: Signification, Performativity and Excess
Adrian Mackenzie and Theo Vurdubakis
Abstract The connections between forms of code and coding and the many crises that currently afflict the contemporary world run deep. Code and crisis in our time mutually define, and seemingly prolong, each other in ‘infinite branching graphs’ of decision problems. There is a growing academic literature that investigates digital code and software from a wide range of perspectives – power, subjectivity, governmentality, urban life, surveillance and control, biopolitics or neoliberal capitalism. The various strands in this literature are reflected in the papers that comprise this special issue. They address topics ranging from social networks, mass media, financial markets and academic plagiarism to highway engineering in relation to the dynamics and diversity of crises. Against this backdrop, the purpose of this essay is to highlight and explore some of the underlying themes connecting codes and codings and the production and apprehension of ‘crisis’. We analyse how the ever-increasing intermediation of contemporary life by codes of various kinds has been closely shadowed by a proliferation of crises. We discuss three related aspects of the coupling of code and crisis (signification, performativity and excess) running across these seemingly diverse topics. We and the other contributors in this special issue seek to go beyond the restricted (and often restricting) understanding of code as the language of machines. Rather, we view code qua programs and algorithms as epitomizing a much broader phenomenon. The codes that we live, and that we live by, also tell us about the ways in which the ‘will to power’ and the ‘will to knowledge’ tend to be enacted in the contemporary world.

Key words: code | conduct | crisis | performativity | software
Theory, Culture & Society 2011 (SAGE, Los Angeles, London, New Delhi, and Singapore), Vol. 28(6): 3–23. DOI: 10.1177/0263276411424761
Downloaded from tcs.sagepub.com at UNIV OF ROCHESTER LIBRARY on December 30, 2011
Theory, Culture & Society 28(6)
The concept of crisis, which once had the power to pose unavoidable, harsh and non-negotiable alternatives, has been transformed to fit the uncertainties of whatever might be favored at a given moment. (Koselleck, 2006: 306)
CODE AND CRISIS are made for (and from) each other. The contemporary inflation of digital ‘code’ into an obligatory passage point for all participation in contemporary life and the ongoing intrusion of crises – financial, ecological, climatic, geopolitical, cultural, psychological – into everyday lives are inextricably entwined. Code is in a certain sense the terrain on which decisions concerning chance, pattern, order, values, time, otherness, nature and culture are enacted. ‘Crisis’ designates a mode of apprehension in which problems of chance, pattern, order, values, time, otherness, nature and culture are posed as dilemmas requiring ‘informed’ decisions. It is common to view discussions of codes and encodings as rather esoteric and arcane: code is technical; it concerns things rather than people. It is also understandable to feel that so much has already been said about the various crises that currently grip the contemporary world that crisis talk begins to sound like a hangover from certain German philosophies of history. Yet their mutually constitutive relationship remains restlessly active. Code and crisis are rather like the North Atlantic and Great Pacific Garbage Patches: their gyrations attract much of our detritus. In this special issue we seek both to acknowledge the persistent resonance of the understanding of code as the language of machines and to move beyond it. We therefore view code qua programs and algorithms as epitomizing a much broader contemporary phenomenon. Code is understood here not only in terms of software but also in terms of cultural, moral, ethical and legal codes of conduct. So many situations today become tractable and manageable (and also in-tractable and un-manageable) by virtue of their code-ability. Accordingly, the codes that we live, and that we live by, reveal the ways in which the ‘will to power’ and the ‘will to knowledge’ tend to be enacted in the contemporary world.
Against this backdrop we can note a number of common themes which recur in discussions of codes and their crises. How codings render objects, events and relations into communicable signs, how in making them know-able and available they re-make them, and how such re-makings inevitably generate excess: these principal themes run through much contemporary work on code, coding and codeability. They can be summed up as the overlapping themes of signification/knowledge, performativity and excess. While all of these are important, it is perhaps performativity that figures most prominently in crisis, and in much of the discussion below it will be primary.
Mackenzie and Vurdubakis – Introduction to ‘Codes and Codings in Crisis’
A General Theory of Code Decodes Itself

‘Codification’, argues Bateson (Bateson and Ruesch, 1951), refers to the ways in which a universe of objects, relations and events is rendered into (communicable) symbols. Codes, in other words, (inter)mediate events and signs, worlds and cultures. In modernity (whether ‘high’, ‘late’ or ‘post’) ‘code’ has become the ubiquitous manifestation of the presumption that in principle all things are cognizable (Bryan, 2010). Code thus prodigally (re)appears in various forms and guises. In various contemporary renderings of the world – as political system, as climatic system, as economic system, as media formation – code has come to be seen as the revelation of being. As Haraway (1991) has noted, contemporary forms of knowing, from computer and communication sciences to modern biology, involve a common move: the translation of the world into a problem of coding. Differences in kind – the language of life vs. the language of machines – become differences in degree. Knowledge of human and animal organisms has come to be seen as a problem of genetic coding and read-out (1991: 164). It is not our objective in this special issue to urge a ‘general theory of code’. One problem which besets the quest for any such ‘general theory’ is that code names a range of seemingly diverse processes and objects. Thus in everyday language ‘code’ might equally refer to a written system of laws, to communication codes – from language to ciphers – to DNA ‘instructions’, or to the written or unwritten rules of conduct pertaining to a wide range of social situations, from ‘codes of honour’ to professional codes, ‘health and safety codes’, fashion codes and etiquette. At the same time ‘code’ is more often identified with programming and software. This overarching identification of code with the language of machines might be understood as the heritage of post-Second World War cybernetic cultures.
Throughout its various scientific incarnations – for instance in molecular biology, in climatology, or in the growth of various forms of systems thinking (operations research, artificial intelligence and machine learning, etc.) – the identification of code as program, as the execution of a sequence of pre-scripted operations, has been central (Beniger, 1997). Similarly, as the notion of algorithm has been mathematically formalized in terms of computational costs, and algorithms intensively developed and catalogued in various domains (Cormen, 1990; Wirth, 1976), the identification of code with program and algorithm has grown, sometimes at the expense of its other properties (to signify, for instance). Its other senses – as a body of laws, as a collection of regulations on a subject, as a way of communicating, whether openly (Bernstein, 1973) or secretly (as in military codes and ciphers) – still carry weight, but appear to have been overtaken and subsumed by program and algorithm. This subsumption has been theoretically supported by various formalizations of code in information theory, in theoretical computer science and in algorithmic complexity theory (Fortnow and Homer, 2003). Practically, it has been supported by the rampant growth of
techniques and practices concerned with generating, writing, designing, propagating, circulating and commodifying code (for instance, the highly energized debates around open source software from the late 1990s onwards). The now apparently irreversible identification of code with program and program with algorithm is perhaps inevitable in a society that labels itself an ‘information’ and a ‘network’ society (e.g. Castells, 1996, 1997, 2000). This identification has gone hand-in-hand with an emphasis on the performativity of code, its apparent ability to ‘make things happen’:

Code has become arguably as important as natural language because it causes things to happen, which requires that it be executed as commands the machine can run. Code that runs on a machine is performative in a much stronger sense than that attributed to language. (Hayles, 2005: 49–50)
There has already been a protracted scholarly discussion of the performativity of code as software (Galloway, 2006; Hayles, 2005; Mackenzie, 2006). The growth in notions of code as performativity spans work on law (Lessig, 1999), literature (Marino, 2006; Hayles, 2009), art (Cox et al., 2002; Stocker and Schöpf, 2003; Cramer, 2002), and the state (Levy, 2002). Underlying much of this literature, two key issues are being debated: where code stands in relation to language and speech, and what norms govern the force of code. Whilst the entanglements of code and language are still being debated (for instance in critical code studies), execution is the distinguishing focus of much of this work. As Galloway (2004) puts it in his oft-cited definition (e.g. Hayles, 2005): ‘code is the only language that is executable’. We can define such ‘executability’ as the ability ‘to perform indicated tasks according to encoded instructions’ (Merriam-Webster’s Online Dictionary). Thus, we might say that secret codes ‘work’ by (re-)distributing the ability to recover what has been encrypted in specific ways. Accordingly, one empirical question which inevitably shadows questions of code and coding is: where does this execute-ability reside? (See for instance the articles by Chun and by Harvey and Knox in this issue.) Any assertion of the distinctiveness of code execution would be open to deconstruction along the lines already drawn by Jacques Derrida in his reading of J.L. Austin’s account of speech acts in ‘Signature, Event, Context’ (Derrida, 1982), or by Judith Butler in her analysis of language, bodies and politics in Excitable Speech (Butler, 1997). (It would also be possible to derive a similar line of argument out of Gilles Deleuze and Félix Guattari’s discussion in Anti-Oedipus (1983) of the second synthesis of desiring-production, coding as recording.)
While our aim here is not to critique existing work on code (but see Introna, this issue), it is worth pointing out that both Derrida’s and Butler’s accounts of performativity strongly suggest that any hard-and-fast opposition between language and force, between saying and doing, cannot be sustained. Therefore, any account of coding that starts or ends by opposing code to language, by
separating meaning and force, or communicating and acting, is bound for trouble. The difficulty in opposing or even distinguishing language and code is that the iterability on which performativity relies – the repetition that allows it to become conventional – is itself coded. Something has to be ritualized in order for repetition to occur. Code works by being coded, and this code is encoded; that is, it is less than obvious or immediately legible. As Butler puts it, ‘a performative ‘‘works’’ to the extent that it draws on and covers over the constitutive conventions by which it is mobilized’ (1997: 51). Without the ‘coding of code’, code has no force and it does not ‘execute’. Hence, the conduct of code, we might say, its execution, is a fraught event, and analysis should be brought to bear on the conditions and practices that allow code to, as it were, access conduct. While the knowledge or form that lies at the heart of the code promises completeness and decidability, the execution of code is often mired in ambiguity, undecidability and incompleteness. This raises many concrete problems in relation to designing code-based interactions. At core, the problematic instability or slippage associated with code concerns the non-coincidence between knowing and doing (or conduct) represented by code. One series of questions here concerns how to understand this unstable non-coincidence. One way of highlighting the contingent association of code and conduct is via the concept of performativity. This performativity of code constantly enhances and widens the possibility of variations and deviations that are difficult to contain or control.

Code Worlds

To code desire – and the fear, the anguish of decoded flows – is the business of the socius. (Deleuze and Guattari, 1983: 139)
Perhaps in no domain other than computer software and hardware itself has there been a more profound re-mediation of both theory and practice than in the re-conceptualization of the genetic elements of life as code-bearing molecules, which began in the 1950s with the work of Crick and Watson. The central dogma of molecular biology has become that DNA is the program or code for the construction of proteins that catalytically and structurally ‘execute’ life. As many scholars have observed, the thorough-going re-framing of biology over the last half-century in terms of code, program, information and communication has radically altered key understandings of reproduction, growth, evolution, and vitality (Virilio and Lotringer, 2002; Hayles, 1999). The now very familiar process of putting code into life (and life into code) shows no signs of abating. In the wake of the much-discussed large-scale scientific projects to exhaustively list the Code of Life in the form of ‘the’ human genome, many further forms of codeability have been elaborated (see Moody, 2004, and Thacker, 2005, for
description of some of these). This has given rise to new sub-disciplines such as bioinformatics, specializing in the organization, integration and analysis of sequence data, as well as to many instruments and data-processing arrangements practically constructed and managed as code objects. Yet contemporary high-throughput genomic sciences, coordinating thousands of machines, large databases and global consortia of institutions and infrastructures, doggedly pursue subtle and minute variations in sequence patterns in human and non-human genomes, all under the rubric of code. This ever-growing facility of codes to intermediate between different orders of being enables the ongoing transformation of their objects of knowledge. By means of such moves established categories of existence are seemingly dissolved and new ones willed into being. The natural and social worlds both become Heideggerian ‘standing reserves’ amenable to ‘disassembly, reassembly, investment and exchange’ (Haraway, 1991). Even where code is not the primary trope or technical term, other quasi-universals tend to reinforce its primacy. Code, whether in the guise of information technology, genomics or some other form, commonly appears as the means through which any problem can be solved. We saw this in the exuberance that greeted the arrival of the world-wide web in the late 1990s (Woolgar, 2002), when commentators as far apart as Rupert Murdoch (1994), Bill Gates (1995) and Hardt and Negri (2004: 68–90) could join in the celebration of the internet as the engine of liberation and the empowerment of the multitudes. Understood in these terms, codes increasingly appear as ways of worldmaking (Goodman, 1995), as ways of positing and instrumenting distinctively (post?)modern worlds. Perhaps this is most ‘literally true’ in the apparently endless proliferation of intentionally immersive environments.
Code-generated worlds, Baudrillardian copies without ‘originals’ which thus take the name of reality in vain, are often said to offer valid substitutes or alternatives to ‘real life’ (‘RL’) and to constitute valid ways of escaping its dangers and contingencies. Code here acts as a form of denial, escape, and fantasy (e.g. Rheingold, 2000). You can fight a battle and walk away from it. New (real life and death) situations can be, and are, quite readily assimilated into these ‘codeworlds’ (e.g. Halter, 2006; Der Derian, 2009). Thus video games and the interfaces which direct ‘surgical strikes’ in Iraq, Libya or Afghanistan bear more than a passing resemblance to one another. As in the case of real versus simulated life, the boundaries between real and simulated death also become fuzzy. Inevitably, anxieties and apprehension as well as great expectations come to cluster around codeworlds. Beck’s (1992a, 1992b) ‘risk societies’ and Deleuze’s (1995) ‘societies of control’ (figures of both lack and excess), in spite of their various differences, represent pathologies of the codings to which the natural and social worlds are being subjected. ‘Societies of control’, it will be recalled, refer inter alia to what happens when populations meet code in the era of informational capitalism. ‘Population’ comes to be understood statistically as the set of entities whose properties or behaviours
are observed (sampled) and estimated by various means, and whose dynamics are supported by the stochastic or chance processes generated by the combination of many events. In biological, geophysical, financial, military and media settings, code serves as a principal axis in the control of stochastic processes on large scales, distributed in various ways. Deleuze (1995) thus speaks of contemporary societies as disassembled and reassembled by means of ‘numerical language[s] of control’. Code here becomes the concealed hand which directs participation, whether witting or unwitting, in the corporate marketplace – a theme explored in John Cheney-Lippold’s contribution to this issue. Beck (1992a, 1992b) focuses on the in-ability of our techno-scientific systems to fulfill their promise of delivering us from risk and has highlighted their complicity in the production of widespread in-security. Indeed discussions of the nuclear (industry and weaponry) often name code ‘itself’ as a perilous, excessive object. Nuclear codes appear plagued by millennial bugs (Knights et al., 2008) or as the merciless post-human guardians of the doctrine of Mutually Assured Destruction (MAD by design). Similarly, Virilio expresses alarm at the notion of bodies, minds and behaviours made reducible to information codes (standing reserves) and subjected to computer-facilitated disassembly and recombination. For him, genetic engineering represents the culmination of the unfolding ‘information revolution’: the ‘industrializing [of] the living organism itself’ (Virilio and Lotringer, 2002: 103). ‘Evolution’ in this sense no longer refers to natural selection but to an informational selection from the databases that now constitute the human. We should therefore speak not of ‘experiments on humans’ but rather of ‘human-experiments’: the newly acquired ‘freedom ... to produce human beings, to create them, no longer to procreate them’ (2002: 117).
For Virilio, then, the new genomics transform biology into a teratology: it is no longer the sleep of reason that breeds monsters. Whether codes work well or badly, it seems, their successes (Deleuze, Virilio) as well as their failures (Beck) come at a price. Codeworlds, it appears, are haunted by spectres of crisis, as well as salvation, by anticipations of their own destruction. Code is the stuff nightmares, as well as dreams, are made of.

Crises and their Codes: On Nuclear Codes and Critical Masses

The disruption, in the last instance, of the authority of the code as a finite system of rules; the radical destruction, by the same token, of every context as a protocol of a code. (Derrida, 1982: 316)
Let us now begin to unravel a little what it means to know crisis through code, by way of the case of the global nuclear assemblage. In the wake of a half-century of ‘normal’ accidents, nuclear weapons and nuclear energy have come to epitomize crisis-prone things. As the emblematic technologies
of Beck’s (1992a, 1992b) ‘risk society’, they embody crisis in the sense that they call for decision and action split between starkly opposed alternatives: to have them or not to have them. Moreover, they carry within them the temporal-historical distension of crisis – they prolong the time of decision. The regular recurrence of ‘nuclear crises’ – Three Mile Island, Chernobyl, Iran, North Korea, Iran, Fukushima, to go nuclear or to dismantle it – seems to flow from something intrinsic to the nuclear devices (reactors, bombs) themselves. They are things made to be critical or dilemmatic. They materialize in crisis, and they have something critical in them in the form of ‘critical mass’: a flux of particle reactions whose balance must be constantly checked and regulated, so that it becomes neither inert fuel nor explosive decay. In more philosophical terms, nuclear devices epitomize the crisis of scientific knowledge: the way in which formulae and code can substitute, as Edmund Husserl puts it in The Crisis of European Sciences and Transcendental Phenomenology, a ‘mathematically substructed world of idealities for ... our everyday life world’ (Husserl, 1970: 49). ‘Code’ is centrally implicated in the becoming of such intrinsically crisis-making substitutions of idealities into life worlds. Husserl highlighted this substitution in the 1930s at the same time as the mathematicians Alonzo Church and Alan Turing were, somewhat unwittingly, formalizing the foundations of a general architecture of substitution in their papers on decidability and computability (Church, 1936; Turing, 1936; Mackenzie, 1996). Much of the violent history of computer code, with its enduring debts to Second World War cryptography, ballistics, atomic weapons and military operations research, follows in the wake of the formalizations of computability found in Turing’s and Church’s work. The compressed criticality of nuclear weapons and devices relies on code.
The techniques through which code permitted a technical knowledge of the ‘critical’ offer a useful allegory for the role of code in ‘knowing’ crises today. In the late 1940s, physicists at Los Alamos working on nuclear weapon design began to devise a new kind of computer simulation. Conjured up by Stanislaw Ulam in discussion with John von Neumann, so-called ‘Monte Carlo simulations’ allowed a certain class of difficult mathematical problems to be solved (Metropolis, 1949). The problem was this: things in the world are, from the standpoint of 20th-century physics at least, largely engaged in mass random behaviour, and this randomness means that we can only view their collective behaviour in terms of statistical distributions. These distributions may have a complex shape since they reflect the collective behaviour of many different things. In nuclear physics, particles such as electrons, neutrons, protons and photons flux en masse. For weapons and reactor designers, the technical problem is to steer the overall flux of their interactions in the right direction – towards criticality, where the reactive flux is self-sustaining, perhaps explosively so. Any experimental determination of rates of flux in nuclear reactions is a highly risky undertaking since reactions in a given mass might become unexpectedly super-critical. (The Manhattan Project did have its ‘criticality incidents’ in 1945 when a
mass of fissile material went supercritical. Many other critical incidents have occurred in reactors, including possibly and most recently the ‘presence of transient criticalities’ at Fukushima [Harrell, 2011].) In nuclear reactors, it is sometimes difficult to know whether a critical incident has occurred (in nuclear weapons, of course, criticality is rather obvious). Faced with the enormous experimental risk of criticality, physicists could only make design decisions through calculation, verified for a decade or so through atomic tests. The problem with calculations of critical mass is that they couldn’t, and still can’t, be carried out analytically. That is, the solutions to the equations describing the fluxes of fission reactions could only be approximated, not solved. The only possible reckoning of criticality was by way of chance. Here computer code becomes crucial as a way of knowing what large numbers of things are doing. One of the things a computer could be used for was to generate large quantities of (pseudo-)random numbers. Large supplies of randomness are actually very useful since they can be used to stand in for large numbers of things in the world. If enough random numbers are generated by a computer, then the statistical distribution of those numbers can stand in for what happens in the world. If large numbers of numbers can stand in for large numbers of things, then the only question is how to shape the distribution of random numbers generated by the computer in such a way as to match the collective behaviour of the things in question. In nuclear weapons design, Monte Carlo simulations marshal large supplies of uniformly distributed random numbers to model ‘infinite branching graphs’ (Metropolis, 1949: 341) physically occurring as chain reactions, reactions whose chain embodies the relations and attributes posited by the equations of particle physics.
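The logic described above – uniformly distributed random draws standing in for particles, branching chains that either die out or run away – can be conveyed in a few lines of present-day code. The following toy simulation is our own illustration, not a reconstruction of the Los Alamos ‘codes’; the function name and all parameter values are invented for the example. It follows single neutron chains through successive generations and reports how many survive:

```python
import random

def chain_survival(p_fission, offspring=2, generations=12,
                   trials=5_000, seed=1):
    """Fraction of neutron chains still alive after `generations`.

    Each neutron either induces a fission (probability `p_fission`),
    releasing `offspring` new neutrons, or is absorbed or escapes.
    The expected multiplication factor is k = p_fission * offspring.
    """
    rng = random.Random(seed)
    alive = 0
    for _ in range(trials):
        neutrons = 1
        for _ in range(generations):
            # Each neutron independently 'rolls the dice'.
            neutrons = sum(offspring for _ in range(neutrons)
                           if rng.random() < p_fission)
            if neutrons == 0:
                break  # this chain has died out
        if neutrons > 0:
            alive += 1
    return alive / trials

# Sub-critical (k = 0.8): chains almost always die out.
print(chain_survival(p_fission=0.4))
# Super-critical (k = 1.6): a substantial fraction branch without limit.
print(chain_survival(p_fission=0.8))
```

With a multiplication factor k below 1 almost every chain dies out; above 1 a large share of chains branch indefinitely. It is this divide – located by calculation rather than by experiment – that the historical Monte Carlo ‘codes’ were built to reckon.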
In this setting, ‘codes’ – as the simulations were called – were crucial forms of dead reckoning, needed to navigate between sub-critical and critical states.

Code Makes an Infinite Branching Graph: Performativity

Since code performativity results from coding, the question becomes: what specific norms or conventions does such execution pertain to? Starting out as a technique to aid nuclear weapon design, Monte Carlo simulations set off many ‘chain reactions’ in other domains and have thereby had a long half-life. Aside from deeply affecting supercomputer architectures from the 1960s to the 1980s (Mackenzie, 1991), their branching graph of influence has continued to spread as they were applied in settings where chains of events occur. For instance, in recent years, they have become important in personal financial planning as well as in modelling the performance of financial derivatives. We could also track them through the design of materials such as spacecraft heatshields, in ‘global illumination’ software used to produce photo-realistic images in computer games and cinema, or in the ensemble models used by climatologists and meteorologists. They have themselves been covered over and recombined in subsequent codings,
thereby percolating into an even wider variety of modelling and predictive applications: for instance, Bayesian computational statistics have begun to change how prediction and inference are managed in many empirical settings – spam detection, clinical trial design, DNA forensics, etc. In Bayesian statistics, Monte Carlo simulations are overlaid with processes of statistical inference. All of these cases are of greater or lesser interest to media studies, science studies, critical political economy, or sociology, but the key point is that the critical thing – the nuclear weapon, the rising mean global surface temperature, the personal pension plan, the unstably leveraged credit default swap – is not only known in code, but made in code. These codes are themselves the site of a constructive engagement with chance, with many rolls of the dice, with iterations whose chancy outcomes act as support functions for convergence on existing norms and divergence into new norms. The criticality of code comes from the infinite branching graph of decisions that code opens up and curtails. Code as we most often talk about it today – something that forms a part of the technical infrastructures of the media, communication, control and production systems – is made to be in crisis, and crises are in code. The forms of code explored in this special issue are largely drawn in one way or another to code as software. In financial markets, in social media, or in visual media, crisis is inscribed in code. That is, the very structure of the code, its composition, its ordering, and its mode of existence are chronically crisis-affected. To identify any one particular aspect of code as the anchoring point of the crisis mode of temporality and duration would probably be to miss the processes of coding that buoy up code.
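The overlaying of Monte Carlo simulation on statistical inference can itself be conveyed by a deliberately minimal sketch (our own, not drawn from any of the software actually used in spam detection or clinical trials; the function name and figures are invented for the example). To infer the bias of a coin from observed tosses, draw candidate biases from a prior, simulate tosses under each candidate, and keep only those candidates whose simulated data reproduce the observation – the kept candidates are Monte Carlo draws from the posterior:

```python
import random

def coin_posterior(heads, tosses, draws=100_000, seed=7):
    """Rejection sampling: Monte Carlo draws from the posterior of a
    coin's bias, given `heads` observed in `tosses` flips and a
    uniform prior on the bias."""
    rng = random.Random(seed)
    kept = []
    for _ in range(draws):
        theta = rng.random()  # candidate bias from the uniform prior
        simulated = sum(rng.random() < theta for _ in range(tosses))
        if simulated == heads:  # keep candidates that reproduce the data
            kept.append(theta)
    return kept

samples = coin_posterior(heads=8, tosses=10)
print(sum(samples) / len(samples))  # posterior mean of the bias
```

Each kept sample costs many discarded draws – the ‘large supplies of randomness’ of the Los Alamos tradition repurposed for inference. Modern Markov chain Monte Carlo methods are more efficient elaborations of the same wager on chance.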
The control structures, the dynamics of code revision, the restless expansion of code across various substrates, and the insatiable appetite for code execution as a way of detecting, predicting, connecting and assembling multiples together – these attributes of code have been fairly well analysed by now. However, perhaps what is still missing – and some of the papers to follow address this issue – are good accounts of the infinite branching graphs of decision that are lived through code. Clearly, it would be a mistake to see code (merely) in terms of the reassertion of deterministic control over chance processes such as weather, disease and fashion. Such an engagement could be seen as taming chance, as an attempt to order and regulate events by laying down sequences of steps, by sorting and archiving events, attributes and operations so that the flow of things is more regular. Matters are definitely more complex than this. In a sense, code has become the source of new forms of chance. Rather than events simply becoming predictable through code, predictability itself has changed in important ways. Prediction has long been most closely associated with belief and with coded beliefs (such as religious texts and authorities). With the rise of statistical and empirical knowledges since the 18th century, prediction gradually adopted mathematical forms (Hacking, 1975, 1986; Kruger et al., 1990; Knights and Vurdubakis, 1993). In the 20th century, models and then simulations became the primary forms of predictive
power, and such pre-dictive power allowed new things to come into the world, things whose very existence was predicated on coding prediction into models – nuclear weapons or global climate. Modernity, we might say, is coterminous with the demand for ever more efficacious instruments and technologies of anti-cipation (from the Latin ante [‘before’] + capere [‘take’]). Without predictive models of chain reactions, nuclear weapons cannot be designed. Without predictive models of geophysical systems of energy circulation, global climate change cannot be predicted or indeed recognized (Edwards, 2010). In both cases, the computer models or ‘codes’ (this term is specifically used to refer to simulations of energy transfer in nuclear reactions) make possible things whose effects cannot be easily predicted or controlled. Rather than successfully taming chance, or bringing events into order, code makes things less stable. It opens up worlds of uncertainty.

On Codes and Their Crises

There is a certain ambiguity which runs through contemporary conceptions of crisis and which is reflected in the entanglements of crisis narratives with the various forms of ‘code’ and ‘coded’ conduct. Our notions of what constitutes ‘crisis’, as cultural historians have shown, have undergone considerable inflation and no longer simply refer to the specific moments of judgement (from the Greek κρίσις/krisis) in tragedy and in medicine, moments where the fate of the hero or the patient is determined (Koselleck, 2006). Whether in media discourse or social scientific enquiry, ‘crisis’ has become, according to Luhmann (1984: 68), a vehicle for contemporary society’s self-understanding – ‘a semantic predisposition’ always in search of new referential content. In the ‘risk society’, we might say, ‘crisis’ has been transformed from an event into a condition. At the same time the assumption persists that ‘crises’ represent failures of control demanding speedy resolution (Holton, 1987).
Against this backdrop, the generation of ‘software solutions’ as well as of new ‘codes of conduct’ has become an automatic response to the infinite branching graph of demands for new scientifically-validated forms of crisis resolution. ‘Code’ has thus come to echo this inflation of the ‘crisis’ metaphor: ‘code’ comprises one of the basic structures or infrastructures of ‘crisis management’. It is hardly surprising that code systems (software) and coded conducts figure prominently in the crisis narratives that pervade contemporary life. Climate change, epidemics, financial crises, terrorist threats, urban uprisings, as well as time, space, media and publics, all have in various ways mutually constitutive entanglements with code and coding processes. Sometimes codes regulate lives and events in the name of safety: passenger check-in and information systems, password and personal identification interfaces, biometric identification. Sometimes, events and situations seem to become manage-able or control-able by virtue of their code-ability. ‘Code’ is often assigned the role of bringing the nature, magnitude and
ramifications of such crises into view (e.g. climatological models, epidemiological models, etc.), and thus subjecting them to disciplined knowing. For instance, our knowledge of the causes, extent and consequences of global warming cannot be disentangled from the performance of the various climatological models and simulations which render it into a know-able object of study and policy. In turn, those models and simulations become key sites where such knowledge is contested (consider Michael Mann’s ‘hockey stick’ graph or the ‘Climategate’ controversy surrounding the University of East Anglia’s Climatic Research Unit; e.g. Russell, 2010; Booker, 2009; Gore, 1991; Black, 2011). Such debates and controversies have increasingly drawn popular attention to aspects of the performativity of code. In the case of the still unfolding financial crisis, for example, much opprobrium has been directed at the performances of particular algorithms and codings (see Mackenzie, 2011). Indeed, the ‘codes’ and algorithms that have underpinned the techniques and practices of financial innovation of the last four decades are now widely blamed for both triggering and deepening the current financial crisis – and therefore as being ultimately responsible for the present economic instability. Contemporary practices of ‘financial engineering’ – perhaps best exemplified by Li’s ‘Gaussian copula’ function, widely used in the creation of Collateralised Debt Obligations (CDOs) – have arguably (e.g. Jones, 2009) set the stage for large-scale systemic breakdowns, ‘normal accidents’ in Perrow’s (1985) sense. More broadly, financial markets and financial instruments are increasingly viewed as excessive, as bringing into being hyperreal, simulational worlds that come to destabilize and dominate the ‘real-world’ activities to which they ostensibly refer (e.g. Kroker and Kroker, 2008; Engelen et al., 2008). The financial tail wags the economic dog.
Codes and codings, to paraphrase Donald Mackenzie (2006), increasingly come to be seen as the engines of financial and economic crises as well as a means for their resolution. We can detect the mutually inflationary relationship of code–crisis at work in the perennial injunction which accompanies instances of ‘crisis’: ‘we’/‘they’ (those whose job it is to make the world go round) should have pre-dicted such events and acted to bring them under control. ‘[H]ow do we thwart a terrorist who has not yet been identified?’ asks Michael Chertoff (2006), former US Secretary of Homeland Security:

we need to be better at connecting the dots of terrorist-related information. After Sept. 11, we used credit card and telephone records to identify those linked with the hijackers. But wouldn’t it be better to identify such connections before a hijacker boards a plane? (emphasis added)
The solution to this problem, as Louise Amoore describes in her contribution to this issue, is sought in the development of new software tools and data analytics designed to extract pre-dictive patterns of relations from the apparently amorphous masses of ‘data’ thrown up by the digital technologies through which contemporary life is lived. She notes the telling resemblances
of such acts of prediction to the logic of financial derivatives. Thus, while framed in terms of the ongoing ‘War on Terror’, Chertoff’s questions nevertheless instantiate and articulate a much broader demand for ever more effective instruments and architectures of anti-cipation: codings constitute attempts to ensure that not anything might happen. ‘Limits to information’ (Brown and Duguid, 2000) and to anticipation, whether attributable to the underdetermined nature of human action or the un-knowable contingencies of the future, are increasingly experienced as illegitimate barriers (tyche) destined to be overcome through improved algorithms and the dedicated application of technology. The desire and ability to sift through the massive amounts of transactional data generated by contemporary life, and to identify patterns of relations between seemingly unconnected data items, promises to transform what at first appears as little more than electronic noise into actionable knowledge (see Amoore, 2006; Andrejevic, 2010). Interrogated by the new devices and techniques, note Achelpöhler and Niehaus (2004: 407), ‘[e]ven originally irrelevant data gains unexpected significance due to the possibilities of automatic data processing and its capacity of handling and connecting items. . . In this respect there is no ‘‘insignificant’’ data left in times of automatic data processing’. A key feature of such computer-mediated acts of knowing is that they seek to solve problems of reference by increasingly ‘eschewing the traditional quest for causality (depth), in favour of correlation and pattern recognition (surface)’ (Knox et al., 2010). For some, such developments point to a coming crisis of sociology (cf. Savage and Burrows, 2007). As Anderson (2008) put it in his hyperbolic but nevertheless oft-quoted account, Google becomes the new paradigm of knowledge:

massive amounts of data and applied mathematics [can] replace every other tool that might be brought to bear. Out with every theory of human behavior, from linguistics to sociology. Forget taxonomy, ontology, and psychology. Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves.
The triumph, then, of Searle’s (1990) Chinese Room over alternative forms of knowing? Theory finally made redundant by code? Cheney-Lippold’s contribution to this issue, for example, shows this mode of knowing in action. His focus is on the sophisticated algorithms that enable marketing and web analytics companies to sift through and categorize the masses of data routinely generated by people’s on-line behaviour. Such devices allow companies to infer consumer identities and through them their bearers’ future consumption activities. The resemblances between Cheney-Lippold’s consumer and Amoore’s terrorist in the making are clearly not accidental. Thus, whilst security might be a domain where this newly automated labour of pattern
recognition is most visibly performed, clearly similar processes are at work in all areas of modern life, from production, to consumption, to finance. Indeed (as Amoore indicates), the case of the latter is perhaps particularly instructive as a site where the complex techniques and models which underpin contemporary forms of anti-cipation were first widely deployed. Could it be that this new ‘paradigm’ of knowledge, while still in the process of colonizing other domains, is already in the throes of its own ‘crises’? As Marc Lenglet (this issue) shows, such ‘crises’ unfold in an increasingly ‘posthuman’ (Hayles, 1999) landscape. Financial markets are increasingly performed through the use of algorithms designed to speed up trading decisions and execution while at the same time concealing intentions and trading strategies. Ever since the October 1987 (Black Monday) stock market crash, computerized ‘black box’ trading has been accused of destabilizing the markets by increasing stock volatility. In the UK, for instance, the latest such claim has come from the former Financial Services Minister Lord Myners in the wake of the extreme volatility that has characterized the current phase of the financial crisis (e.g. Reuters, 14 Aug. 2011). Be that as it may, it is clear that the speeding up of these processes to milliseconds means that their operation easily exceeds human supervision. As a result, such devices become the loci of tension and mutual interference between different codes: the trading algorithms encoded in the devices and the trading rules and codes of conduct operated by each market’s regulatory overseers. Lenglet illustrates these tensions in his analysis of a London Stock Exchange (LSE) investigation triggered by the placing of a ‘Large Erroneous Order’ that caused a ‘long’ (13 second) spike in the price of the stock concerned – prior to the order being cancelled.
The resulting (re)allocation of responsibilities for the error between human trader and algorithm is a reminder of how encodings typically entail negotiations over ontological categories. Indeed, a common thread running through the articles that comprise this special issue is the concern with how codes and codings ‘work’ by endowing their objects with being: Cheney-Lippold’s ‘inferred’ consumer, for instance; Amoore’s ‘yet to be terrorist’ invisible in the electronic ‘noise’ of social life; Introna’s plagiarist, hiding in the interstices of intertextual exchanges. As Introna (this issue) shows, plagiarism detection software such as Turnitin (another hybrid progeny of computer code and of social codes of conduct) does more than simply make instances of plagiarism ‘known’ and finger their authors. Rather than merely facilitating its discovery, such devices are constitutive of the phenomenon itself. They work to provide ‘plagiarism’ and its practitioners with a definitive ontological structure so that they can be rendered manage-able. There is nevertheless, Introna argues, a price to be paid for this convenience. Such seemingly straightforward procedures of identification also function as the means of foreclosing far more troublesome questions concerning language, encoding and the nature of ‘originality’.
Questions of finitude and of the recognition (and de-recognition) of finitude constitute another common thread that connects a number of the articles in this collection. Penny Harvey and Hannah Knox, for example, discuss the (‘health and safety’) codes of conduct that aim to ‘anticipate harm’ during road construction in the Andean Highlands. Despite the corporate motto of ‘zero accidents’, deaths occur. The social reactions that these deaths elicit show that the enactment of these codes, and of the modern(ist) ontologies which underpin them, remains persistently troubled by the spectre of their ‘others’ – that is to say, by the co-presence of alternative ways of construing the world (and of what might or might not cause ‘harm’) that cannot be exorcised. They therefore illustrate the emergence of an ambivalent (from ambi [‘both ways’] + valentia [‘strength’]) hybrid in which seemingly incompatible codings appear to (at least temporarily) inhabit, as well as destabilize, one another. The two final papers in the collection explore finitude or limits interior to code. In both of their papers, as in their work more generally, Anna Munster and Wendy Hui Kyong Chun bring finitude and death in code to the foreground of the development of the internet and new media. They take somewhat related paths in doing this. Drawing on Derrida and Butler, Chun’s paper focuses on the temporality of crisis, and highlights a crisis-prone temporality inherently associated with code-based media in general. This crisis plays out in several different registers: first, in relation to law and power, code brings with it the desire for perfect coincidence between law and action; second, in relation to memory, code promises fully predictable remembering. In both registers, phantasmatic sovereign subject formations can be found. Both aspects of code come together to induce formations of pleni-potent, all-knowing subjectivity.
The investment in ‘source code’ – the written text of software – in many political, commercial and epistemic contests over the last two decades can be understood in these terms as a somewhat retrospective attempt to compel coincidence between law and execution and thereby to short-circuit the undecidability of decision. Code, in consequence, goes deep into memory. To remember or not to remember, to save or not to save: these alternatives hardly seem the stuff of crises, yet as Chun argues, the impossibility of deciding whether to remember or not is intensified by code. In keeping everything, as we know, much is lost. Memory is commemorative; it requires regeneration, and regeneration is always beset by problems of finitude. The ‘finitude’ of code stems from the fact that it can only resource memory, even the memory of real-time media (the media of contemporary crises), by running or execution. The execution of code supports both the originary sovereign subject whose action is law and the subject beset by the need to re-process, to re-iterate, to run again, in order to remember. In this impossible situation, crisis after crisis arises. The question then is how to respond to the exhausting yet compulsive momentum of this trajectory. One answer to this question, at least in the context of contemporary network media, comes from an analysis of disconnection or what
might be called ‘digital death’. Munster’s paper analyses a mood of finitude marking a shift from the internet of the 1990s to contemporary social network media, computer games and military training simulations. In this shift, an ebullient positivity animated by code’s capacity to assemble relations or create lifted-out spaces becomes a sober awareness of, and sometimes a preoccupation with, irreversible digital death. Like the exhaustion that Chun describes, the ethos of digital death suggests that the generative possibilities of code, whilst increasingly felt to be somewhat suffocating or overwhelming, are open to productive differentiations. The implications of her analysis are wide. Whilst affirming the differentiating power of code, Munster outlines the way in which a certain will-to-life has accompanied the growth of digital code more generally. Code’s will-to-life – evident in genomics and artificial life, to give two examples – complicates any simple affirmation of code, since it feeds directly into biopolitical and post-biopolitical power relations. What is hard, then, is the task of wresting something away from code, of finding uncoded spaces or locating bifurcations within code itself that resist immediate recuperation by anticipatory cognitive captures. Such bifurcations, if they exist, will be transient segmentations of social network media’s biopolitical massification. When Reinhart Koselleck writes that ‘the concept of crisis, which once had the power to pose unavoidable, harsh and non-negotiable alternatives, has been transformed to fit the uncertainties of whatever might be favored at a given moment’ (Koselleck, 2006: 306), he might have been referring to the percolation of code through the interstices of all alternatives. The themes of code as signification/knowledge, performativity and excess unfold in the chiasmus of code and crisis.
The account of code in crisis and crisis in code we have been sketching here seeks to track down how code and crisis define and feed one another. Knowing, performing and excess (the overflows of performativity) constantly flow into each other in crisis. The processes of coding and forms of criticality we have described are reciprocally driven by these flows. From time to time, in halting steps or massive slippages, bifurcations that are not just part of the ‘infinite branching graph’ of performativity harshly and non-negotiably split things open. These events, on whatever scale, can expose something against which codings of criticality shatter. If code and crisis are made for and in each other, then work on such bifurcations is vital.
References
Achelpöhler, W. and H. Niehaus (2004) ‘Data Screening as a Means of Preventing Islamist Terrorist Attacks on Germany’, German Law Journal 5(5): 495–513.
Amoore, L. (2006) ‘Biometric Borders: Governing Mobilities in the War on Terror’, Political Geography 25: 336–51.
Amoore, L. (2011) ‘Data Derivatives: On the Emergence of a Security Risk Calculus for Our Times’, Theory, Culture & Society 28(6): 24–43.
Anderson, C. (2008) ‘The End of Theory’, Wired 16(7). Available at: http://www.wired.com/science/discoveries/magazine/16-07/pb_theory.
Andrejevic, M. (2009) ‘Control over Personal Information in the Database Era’, Surveillance & Society 6(3): 322–26.
Bateson, G. and J. Ruesch (1951) Communication. New York: Norton.
Beck, U. (1992a) The Risk Society. London: SAGE.
Beck, U. (1992b) ‘From Industrial Society to the Risk Society: Questions of Survival, Social Structure and Ecological Enlightenment’, Theory, Culture & Society 9(1): 97–123.
Beniger, J.R. (1997) The Control Revolution: Technological and Economic Origins of the Information Society. Cambridge, MA: Harvard University Press.
Bernstein, B. (1973) Class, Codes and Control. St Albans: Paladin Press.
Black, R. (2011) ‘Journal Editor Resigns over ‘‘Problematic’’ Climate Paper’, BBC News, 2 September. Available at: http://www.bbc.co.uk/news/science-environment-14768574.
Booker, C. (2009) The Real Global Warming Disaster. London: Continuum.
Brown, J.S. and P. Duguid (2000) The Social Life of Information. Cambridge, MA: Harvard University Press.
Bryan, B. (2010) ‘Code and the Technical Provenance of Nihilism’, CTheory. Available at: http://www.ctheory.net/articles.aspx?id=643.
Butler, J. (1997) Excitable Speech: A Politics of the Performative. London/New York: Routledge.
Castells, M. (1996, 1997, 2000) The Information Age: Economy, Society and Culture, 3 vols. Oxford: Blackwell.
Cheney-Lippold, J. (2011) ‘A New Algorithmic Identity: Soft Biopolitics and the Modulation of Control’, Theory, Culture & Society 28(6): 164–181.
Chertoff, M. (2006) ‘A Tool We Need to Stop the Next Airliner Plot’, The Washington Post, 29 August. Available at: http://www.washingtonpost.com/wp-dyn/content/article/2006/08/28/AR2006082800849.html.
Chun, W.H.K. (2011) ‘Crisis, Crisis, Crisis, or Sovereignty and Networks’, Theory, Culture & Society 28(6): 91–112.
Church, A. (1936) ‘A Note on the Entscheidungsproblem’, Journal of Symbolic Logic 1(1): 40–41.
Cormen, T.H. (1990) Introduction to Algorithms. Cambridge, MA: MIT Press.
Cox, G., A. McLean and A. Ward (2002) ‘The Aesthetics of Generative Code’. Available at: http://www.generative.net/papers/aesthetics/index.html.
Cramer, F. (2002) ‘Software and Concept Notations. Software in the Arts’. Available at: http://www.macros-center.ru/read_me/teb2e.htm.
Deleuze, G. (1995) ‘Postscript on Control Societies’, in Negotiations 1972–1990 (trans. M. Joughin). New York: Columbia University Press.
Deleuze, G. and F. Guattari (1983) Anti-Oedipus: Capitalism and Schizophrenia (trans. R. Hurley, M. Seem and H.R. Lane). Minneapolis: University of Minnesota Press.
Der Derian, J. (2009) Virtuous War. London: Routledge.
Derrida, J. (1982) Margins of Philosophy. Brighton: Harvester Press.
Engelen, E., I. Erturk, J. Froud et al. (2008) ‘Financial Innovation: Frame, Conjuncture and Bricolage’. CRESC Working Paper No. 59.
Fortnow, L. and S. Homer (2003) ‘A Short History of Computational Complexity’, Bulletin of the EATCS 80: 95–133.
Galloway, A.R. (2004) Protocol: How Control Exists After Decentralization. Cambridge, MA: MIT Press.
Galloway, A.R. (2006) ‘Language Wants To Be Overlooked: On Software and Ideology’, Journal of Visual Culture 5(3): 315–331.
Gates, B. (1995) The Road Ahead. New York: Viking.
Goodman, N. (1995) Ways of Worldmaking. Indianapolis: Hackett.
Gore, A. (1991) Earth in the Balance. London: Earthscan.
Hacking, I. (1975) The Emergence of Probability. Cambridge: Cambridge University Press.
Hacking, I. (1986) ‘Biopower and the Avalanche of Printed Numbers’, Humanities in Society 5: 279–95.
Halter, E. (2006) From Sun Tzu to Xbox: War and Video Games. New York: Thunder’s Mouth Press.
Hardt, M. and A. Negri (2004) Multitude: War and Democracy in the Age of Empire. London: Penguin.
Haraway, D. (1991) ‘A Cyborg Manifesto: Science, Technology and Socialist Feminism in the Late Twentieth Century’, in Simians, Cyborgs and Women: The Reinvention of Nature. London: Routledge.
Harrell, E. (2011) ‘Has Fukushima’s Reactor No. 1 Gone Critical?’, Ecocentric – TIME.com (30 March). Available at: http://ecocentric.blogs.time.com/2011/03/30/has-fukushimas-reactor-no-1-gone-critical/.
Hayles, K. (1999) How We Became Posthuman: Virtual Bodies in Cybernetics, Literature and Informatics. Chicago: University of Chicago Press.
Hayles, K. (2005) My Mother Was a Computer: Digital Subjects and Literary Texts. Chicago: University of Chicago Press.
Hayles, K. (2009) ‘RFID: Human Agency and Meaning in Information-Intensive Environments’, Theory, Culture & Society 26(2/3): 1–24.
Heidegger, M. (1977) The Question Concerning Technology and Other Essays (trans. W. Lovitt). New York: Harper and Row.
Holton, R.J. (1987) ‘The Idea of Crisis in Modern Society’, British Journal of Sociology 38(4): 502–20.
Husserl, E. (1970) Crisis of European Sciences and Transcendental Phenomenology. Evanston: Northwestern University Press.
Introna, L. (2011) ‘The Enframing of Code: Agency, Originality and the Plagiarist’, Theory, Culture & Society 28(6): 113–141.
Jones, S. (2009) ‘The Formula that Felled Wall St.’, Financial Times, 24 April. Available at: http://www.ft.com/cms/s/2/912d85e8-2d75-11de-9eba-00144feabdc0.html#axzz1RNfUfWNe.
Knights, D. and T. Vurdubakis (1993) ‘Calculations of Risk: Towards an Understanding of Insurance as a Moral and Political Technology’, Accounting, Organizations and Society 18(7/8): 729–64.
Knights, D., T. Vurdubakis and H. Willmott (2008) ‘The Night of the Bug: Technology, Risk and (Dis)organization at the Fin de Siècle’, Management & Organizational History 3(3): 289–309.
Knox, H. and P. Harvey (2011) ‘Anticipating Harm: Regulation and Irregularity on a Road Construction Project in the Peruvian Andes’, Theory, Culture & Society 28(6): 142–163.
Knox, H., D. O’Doherty, T. Vurdubakis and C. Westrup (2010) ‘The Devil and Customer Relationship Management: Informational Capitalism and the Performativity of the Sign’, Journal of Cultural Economy 3(3): 339–59.
Koselleck, R. and M. Richter (2006) ‘Crisis’, Journal of the History of Ideas 67(2): 357–400.
Kroker, A. and M. Kroker (2008) ‘City of Transformation: Paul Virilio in Obama’s America’, CTheory. Available at: http://www.ctheory.net/printer.aspx?id=597.
Krüger, L., L. Daston and M. Heidelberger (1990) The Probabilistic Revolution, Vol. I. Cambridge, MA: MIT Press.
Lenglet, M. (2011) ‘Conflicting Codes and Codings: How Algorithmic Trading Is Reshaping Financial Regulation’, Theory, Culture & Society 28(6): 44–66.
Lessig, L. (1999) Code and Other Laws of Cyberspace. New York: Basic Books.
Levy, S. (2002) Crypto: Secrecy and Privacy in the New Code War. London: Penguin.
Li, D.X. (2000) ‘On Default Correlation: A Copula Function Approach’, Journal of Fixed Income 9(4): 43–54.
Luhmann, N. (1984) ‘The Self-Description of Society: Crisis Fashion and Sociological Theory’, International Journal of Comparative Sociology 25(1/2): 59–72.
Mackenzie, A. (1996) ‘Undecidability: The History and Time of the Universal Turing Machine’, Configurations 4(3): 359–379.
Mackenzie, A. (2006) Cutting Code: Software and Sociality (Digital Formations). New York: Peter Lang.
Mackenzie, D. (1991) ‘The Influence of the Los Alamos and Livermore National Laboratories on the Development of Supercomputing’, IEEE Annals of the History of Computing: 179–201.
Mackenzie, D. (2006) An Engine, Not a Camera: How Financial Models Shape Markets. Cambridge, MA: MIT Press.
Mackenzie, D. (2010) ‘The Credit Crisis as a Problem in the Sociology of Knowledge’. Working Paper, University of Edinburgh. Available at: http://www.sps.ed.ac.uk/__data/assets/pdf_file/0019/36082/CrisisRevised.pdf.
Mackenzie, D. (2011) ‘Knowledge Production in Financial Markets: Credit Default Swaps, the ABX and the Subprime Crisis’. Working Paper, University of Edinburgh. Available at: http://www.sps.ed.ac.uk/__data/assets/pdf_file/0010/55936/ABX13.pdf.
Marino, M.C. (2006) ‘Critical Code Studies’, Electronic Book Review. Available at: http://www.electronicbookreview.com/thread/electropoetics/codology.
Merriam-Webster’s Online Dictionary, ‘Executable’.
Metropolis, N. (1949) ‘The Monte Carlo Method’, Journal of the American Statistical Association 44: 335–341.
Munster, A. (2011) ‘From a Biopolitical ‘‘Will to Life’’ to a Noopolitical Ethos of Death in the Aesthetics of Digital Code’, Theory, Culture & Society 28(6): 67–90.
Murdoch, R. (1994) The Century of Networking. St. Leonards: Centre for Independent Studies.
Perrow, C. (1985) Normal Accidents: Living with High-Risk Technologies. New York: Basic Books.
Reuters (2011) ‘Paul Myners Calls for Black Box Trading Probe’, 14 August. Available at: http://uk.reuters.com/article/2011/08/14/uk-myners-idUKTRE77D19520110814.
Rheingold, H. (2000) The Virtual Community: Homesteading on the Electronic Frontier. Cambridge, MA: MIT Press.
Russell, M. (2010) The Independent Climate Change Emails Review. Available at: http://www.cce-review.org/pdf/FINAL%20REPORT.pdf.
Savage, M. and R. Burrows (2007) ‘The Coming Crisis of Empirical Sociology’, Sociology 41(5): 885–899.
Searle, J.R. (1990) ‘Minds, Brains and Programs’, in M. Boden (ed.) The Philosophy of Artificial Intelligence. Oxford: Oxford University Press.
Stocker, G. and C. Schöpf (2003) Code, the Language of Our Time (Ars Electronica 2003). Ostfildern-Ruit: Hatje Cantz Verlag.
Thacker, E. (2005) The Global Genome: Biotechnology, Politics, and Culture. Cambridge, MA: MIT Press.
Turing, A.M. (1936) ‘On Computable Numbers, with an Application to the Entscheidungsproblem’, in The Essential Turing: Seminal Writings in Computing, Logic, Philosophy, Artificial Intelligence, and Artificial Life, Plus the Secrets of Enigma. New York: Oxford University Press.
Virilio, P. and S. Lotringer (2002) Crepuscular Dawn (trans. M. Taormina). New York: Semiotext(e).
Wirth, N. (1976) Algorithms + Data Structures = Programs. New York: Prentice Hall.
Woolgar, S. (ed.) (2002) Virtual Society? Technology, Cyberbole, Reality. Oxford: Oxford University Press.
Adrian Mackenzie (Centre for Social and Economic Aspects of Genomics, Lancaster University) researches in the area of technology, science and culture. He co-directs the Centre for Science Studies, Lancaster University. He has published books on technology, including Transductions: Bodies and Machines at Speed (2002/6), Cutting Code: Software and Sociality (2006), and Wirelessness: Radical Empiricism in Network Cultures (2010), as well as articles on media, science and culture. He is currently working on the circulation of data intensive methods across science, government, and business in network media. [email:
[email protected]]
Theo Vurdubakis is Professor of Organisation and Technology in the Lancaster University Management School, where he is currently Director of the Centre for the Study of Technology and Organization (CSTO). A graduate of Athens University, he gained his doctorate from the University of Manchester Institute of Science and Technology where he taught for 15 years. His research focuses on the role of technologies in social organization. [email:
[email protected]]
Data Derivatives: On the Emergence of a Security Risk Calculus for Our Times
Louise Amoore
Abstract In a quiet London office, a software designer muses on the algorithms that will make possible the risk flags to be visualized on the screens of border guards from Heathrow to St Pancras International. There is, he says, ‘real time decision making’ – to detain, to deport, to secondarily question or search – but there is also the ‘offline team who run the analytics and work out the best set of rules’. Writing the code that will decide the association rules between items of data, prosaic and mundane – flight route, payment type, passport – the analysts derive a novel preemptive security measure. This paper proposes the analytic of the data derivative – a visualized risk flag or score drawn from an amalgam of disaggregated fragments of data, inferred from across the gaps between data and projected onto an array of uncertain futures. In contrast to disciplinary and enclosed techniques of collecting data to govern population, the data derivative functions via ‘differential curves of normality’, imagining a range of potential futures through the association rule, thus ‘opening up to let things happen’ (Foucault, 2007). In some senses akin to the risk orientation of the financial derivative, itself indifferent to actual underlying people, places or events by virtue of modulated norms, the contemporary security derivative is not centred on who we are, nor even on what our data say about us, but on what can be imagined and inferred about who we might be – on our very proclivities and potentialities.

Key words: borderzones, risk, security, technology
Theory, Culture & Society 2011 (SAGE, Los Angeles, London, New Delhi, and Singapore), Vol. 28(6): 24–43 DOI: 10.1177/0263276411417430
There is real time decision making, and then the offline team who run the analytics and work out the best set of rules. (Border security software designer, 2009)

The rockets are distributing about London just as the equation in the textbooks predicts. As the data keep coming in, he looks more and more like a prophet. It’s not precognition, he wants to say, all I’m doing is plugging numbers into a well known equation, you can look it up in the book and do it yourself. (Thomas Pynchon, 1973: 63)
Introduction: The Data and the Prophets

In his novel Gravity’s Rainbow, Thomas Pynchon – the novelist himself a former technical writer for Boeing – depicts the coincidence of two sets of otherwise apparently random data. On a carefully drawn map, an ‘ink ghost of London’, the statistician Roger Mexico has marked out 576 squares, the strikes of German V-2 rockets represented by red circles. ‘Can’t you tell from your map here’, demands his colleague Ned Pointsman, ‘which places would be safest to go into, safest from attack?’ ‘No’, replies Mexico, ‘every square is as likely to get hit again . . . the odds remain the same as they always were. Each hit is independent of all others’ (1973: 65). In the rendering of his map of the data of individual rocket strikes, Mexico claims no predictive insight, no pre-emptive decision or precognition; ‘plugging numbers into a well known equation’, he simply follows an already calculated formula – ‘you can do it yourself’. Meanwhile, the American Lieutenant Tyrone Slothrop creates his own London map – the private data of his sexual encounters in the city, ‘a cluster near Tower Hill, a violet density about Covent Garden, a nebular streaming on into Mayfair, Soho’, each paper star corresponding precisely with a V-2 rocket strike yet to come (Pynchon, 1973: 22). For the spies and the military strategists, Slothrop’s apparent pre-emptive instincts are ‘the perfect mechanism’. ‘He’s out there’, reflects Pointsman, ‘he can feel them coming, days in advance. It’s a reflex. A reflex to something that’s in the air right now . . . a sensory cue we just aren’t paying attention to, something we could be looking at but no one is’ (1973: 57). In a quiet London office in 2009, the ambition of Pynchon’s Pointsman – to make it possible to know how to be safe from attack, to render the calculable data of Mexico’s map but to do it via the pre-emptive logics of Slothrop’s scattered data points – is apparently in the process of realization.
Theory, Culture & Society 28(6)

Amoore – Data Derivatives

Downloaded from tcs.sagepub.com at UNIV OF ROCHESTER LIBRARY on December 30, 2011

Figure 1 Raytheon: 'our solutions let her know what to look for'. At Raytheon, we're proud to support the hardworking men and women who keep our borders, skies and coastlines safe. Our border security technology and services are based on decades of experience developing and deploying solutions that work under the most challenging conditions. We are determined to provide the best support available. So whatever situation you're facing, you'll have Raytheon by your side. www.raytheon.co.uk. © 2004 Raytheon Company. All rights reserved. 'Customer Success Is Our Mission' is a registered trademark of Raytheon Company.

A group of mathematicians, software designers and computer scientists 'work out the best set of rules' governing the links between otherwise scattered items of data. Their algorithmic models supply one element of the UK's e-Borders programme: a risk-based system that deploys processes of data mining and analytics in order to derive a risk score or flag for individuals entering or exiting the UK. As the commercial advertisements for the initial lead contractor for e-Borders – the US arms manufacturer Raytheon – illustrate, though the border guard 'scrutinizes to keep us safe', her sovereign attentiveness is governed by data 'solutions that let her know what to look for' (see Figure 1).1 In the world after 9/11, the question 'how do we thwart a terrorist who has not yet been identified?' comes to dominate the horizon of security, with the answer coming in the form of data and the 'joining of dots that should have been connected before 9/11' (Department of Homeland Security, 2006). Thus, for our times, before a plane can land, a border crossed or a port entered, the most prosaic and apparently scattered of data (from past travel bookings and credit card transactions to visa applications and in-flight meal choices) appears to hold out a new promise – that if only it can be integrated, the 'gauge set' and the algorithm 'refined', the various programmes of UK
e-Borders, EU PNR, USVISIT and ATS might render pre-emptive security action possible.2 The sovereign decisions of the border guard – who to stop, question, search, detain – are in effect deferred into, as Mexico notes, 'a well known equation'. And yet this screened calculation is only made possible by the multiple other judgements, dispositions and reflexes that 'set the gauge', joining the scattered items together to produce the 'best set of rules'.3 What is the logic of this joining of dots and setting of the gauge? It is an ontology of association, and it works according to association rules (Amoore, 2009a). Importantly, just as Pynchon's rocket strikes and sexual encounters are associated, related but not causal – 'all talk of cause and effect is secular history' (1973: 167) – contemporary risk calculus does not seek a causal relationship between items of data, but works instead on and through the relation itself. The ontology of association does have a mathematical means of calculating uncertainty, an equation: if *** and ***, in association with ***, then ***. In the decisions as to the association rules governing border security analytics, the equation may read: if past travel to Pakistan and duration of stay over three months, in association with flight paid by a third party, then risk flag, detain; if paid ticket in cash and this meal choice, in association with this flight route, then secondary searches; if two tickets paid on one credit card and not seated together, then these questions. Understood in this way, it is not strictly collected data that become an actionable security intervention, but a different kind of abstraction that is based precisely on an absence, on what is not known, on the very basis of uncertainty.
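The if–then structure of such association rules can be rendered as a schematic sketch. What follows is purely illustrative – the rule conditions, field names and actions are hypothetical stand-ins echoing the examples in the text, not the rules of any actual e-Borders or analogous system:

```python
from dataclasses import dataclass

@dataclass
class AssociationRule:
    """'if X and Y, in association with Z, then action' -- correlative, not causal."""
    name: str
    conditions: list  # predicates that must all hold for the rule to fire
    action: str       # the derivative output: a flag, score or course of action

    def fires(self, record: dict) -> bool:
        return all(cond(record) for cond in self.conditions)

# Illustrative rules echoing the text's examples; entirely hypothetical.
rules = [
    AssociationRule(
        name="third-party-payment",
        conditions=[
            lambda r: "Pakistan" in r.get("past_travel", []),
            lambda r: r.get("stay_months", 0) > 3,
            lambda r: r.get("paid_by_third_party", False),
        ],
        action="risk flag: detain",
    ),
    AssociationRule(
        name="cash-and-route",
        conditions=[
            lambda r: r.get("payment") == "cash",
            lambda r: r.get("route") == "XYZ",
        ],
        action="secondary search",
    ),
]

def derive(record: dict) -> list:
    """Return the derivative actions triggered by a record. The underlying
    biography is irrelevant; only the associations between items count."""
    return [rule.action for rule in rules if rule.fires(record)]

passenger = {"past_travel": ["Pakistan"], "stay_months": 4,
             "paid_by_third_party": True, "payment": "card", "route": "ABC"}
print(derive(passenger))  # -> ['risk flag: detain']
```

The point of the sketch is the indifference to causation: the rule never asks why these items co-occur, only whether the stipulated association holds.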
Coalescing the imaginative mapping of Slothrop's pre-emptive sensory cues with the protocols of Mexico's numbers plugged into a 'well known equation', the processes of data integration, mining and analytics draw into association an amalgam of disaggregated data, inferring across the gaps to derive a lively and alert new form of data derivative – a flag, map or score that will go on to live and act in the world. In this paper I elucidate the data derivative as a specific form of abstraction that is deployed in contemporary risk-based security calculations, acting on and through people, populations and objects in novel ways that are little acknowledged or understood either in the natural or social sciences (but see Daston and Galison (2007: 309) on intuitive thinking and scientific imaging; Lane and Whatmore (forthcoming) on flood risk mapping). To be clear, the form of data derivative emerging in contemporary security risk management is not a 'more advanced' form of abstraction, but rather a specific form of abstraction that distinctively correlates more conventional state collection of data with emergent and unfolding futures. The data derivative comes into being from an amalgam of disaggregated data – reaggregated via mobile algorithm-based association rules and visualized in 'real time' as risk map, score or colour-coded flag – such that it is not of the same order of being as what we might call modernist disciplinary data.4 It is not that derivative forms supersede disciplinary data modes, and indeed among the reaggregated data elements are conventionally collected visa and passport data, but rather that the relation between the
elements is itself changed. As Michel Foucault put the problem in his 1978 lectures, 'mechanisms of security do not replace disciplinary mechanisms' as a succession of techniques, but instead 'what above all changes is the dominant characteristic, or more exactly, the system of correlation between juridico-legal mechanisms, disciplinary mechanisms, and mechanisms of security' (2007: 8). It is precisely the emergence of novel forms of correlation that is distinctive to the data derivative form in the domain of security, though of course these are familiar qualities in the financial derivative. Derivative instruments in finance are novel forms of risk management in which the relationship between the instrument and an assumed underlying value becomes fleeting, uncertain and loose. 'The central characteristic of derivatives', write Dick Bryan and Michael Rafferty, 'is their capacity to dismantle or unbundle any asset into constituent attributes and trade those attributes without trading the asset itself' (2006: 44). By 'slicing and dicing' and reaggregating underlying values, the financial instrument of the derivative thus sheds any encumbering causal relation to the underlying asset. Indeed, as Duncan Wigan suggests in his analysis of financial derivatives, the instruments 'abstract from any linear relationship to underlying processes of real wealth creation', rendering them 'indifferent' to the components of individual stocks or bonds, for example, on which they draw (2009: 159). Thus, for example, the so-called 'sub-prime crisis' sliced and bundled together credit default swaps and collateralized debt obligations indifferent to underlying values such as house prices or capacities to repay mortgage debt (Langley, 2009).
One might intuitively suppose that the state of emergency precipitated by the sub-prime crisis has marked the limit point of the derivative, and yet the response to the financial crisis has itself ushered in the demand for ever more precise and finite degrees of risk disaggregation. The data derivatives I observe emerging in contemporary security practice are similarly inferred from underlying fragmented elements of data toward which they are for the most part indifferent. It is precisely this capacity to move, to be shared, traded or exchanged indifferent to, and in isolation from, underlying data components that is at the heart of derivative risk flags, maps and scores. Indifferent to the contingent biographies that actually make up the underlying data in fields such as PNR, the data derivative is not centred on who we are, nor even on what our data says about us, but on what can be imagined and inferred about who we might be – on our very proclivities and potentialities. The questions I will pose here are: what is the specific form of abstraction embodied by the data derivative? What are the conditions of its emergence and how is it produced? In what ways does it come to life, travel across domains, and how is it authorized to act? What kinds of subjects and populations does it imagine and bring into being? I will focus on four elements of the lived, lively and material being of the data derivative: temporalities; norm/anomaly; the virtual and the visual; and mobilities.
Temporalities: 'Making Present the Future Consequences'

The specific deployment of data as derivative acts on and through a population whose dynamics are as yet unknown, a population yet to come. It is not strictly the case that systems of automated targeting, integrated databases and software-coded calculations witness some form of acceleration in terms of temporality, though of course this has been a central theme in discussions of 'netwars', 'virtuous war' and so on (see Der Derian, 2001; Martin, 2003). Indeed, by contrast the data derivative allows for a certain quality of suspended time – a 'stilling' of the frenetic crossings of the global political economy in advance of arrival. The significance of the temporal register, then, lies not in a speeding up, but in an algorithmic 'framing of time and space' (Mackenzie, 2007: 97) that has a distinctive orientation to the unknown future. As Brian Massumi has put it, 'pre-emption brings the future into the present. It makes present the future consequences of an eventuality that may or may not occur, indifferent to its actual occurrence' (2005: 7–8, emphasis added). The pre-emptive deployment of a data derivative does not seek to predict the future, as in systems of pattern recognition that track forward from past data, for example, because it is precisely indifferent to whether a particular event occurs or not. What matters instead is the capacity to act in the face of uncertainty, to render data actionable. The data derivative, then, embodies a specific temporal orientation that differentiates it from the temporalities of the data abstractions of survey and census as techniques for encoding population (Bowker and Star, 1999). Of course, survey and census data forms also operate through abstraction, but they have a specific spatial and temporal location – akin perhaps to the abstraction of a photograph or snapshot in terms of decisions about angle, framing, sampling, and replicability in the future.
The temporality of the security data derivative, though it draws on some conventional elements of survey data such as immigration data, is no longer that of the survey, but is better understood as a projection. Projections are produced from fragments of data, from isolated elements that are selected, differentiated and reintegrated to give the appearance of a whole. The multiple decisions about what to select, how to isolate, what should be joined to what, fall away in the appearance of a projected whole – a complete map, flag or score. In her discussion of filmic projection, Anne Friedberg suggests that 'for motion to be reconstituted, its virtual reach relies on a missing element, a perceptual darkness between the frames' (2007: 92). Like filmic projection, the gaps between underlying data items are precisely what makes the projected futures of the data derivative possible. As one major IT consulting company supplying borders programmes explains:

Having different types of data sets allows you to do searches across those pieces of data, to be as certain as you can be that you understand who's coming into the country and why they're coming and whether or not you should take action.5
For the suppliers of software and risk management solutions for border controls, the emphasis is on what can be conducted 'across' items of data, on and through their very relation. There can be no certainty about the association between data on a flight route, a method of payment, a ticket type, or a past 'no show', for their relation is not causal but correlative. What matters instead is the capacity to make inferences across the data, such that derivatives can be recognized, shared, and actioned. As the US Inspector General concluded in his review of data mining in US border security programmes: 'association does not imply a direct causal connection', rather it 'uncovers, interprets, displays relationships between persons, places and events' (2006: 10). What matters, then, is that some form of correlation can be drawn in the relationships, a correlation that is nonetheless indifferent to the specificity of persons, places and events. Rather as Donald MacKenzie notes in the 'base correlations' that make financial derivatives actionable as instruments (2009: 14), it is the very relation itself that renders a calculable risk score, a flag on a border guard's screen, an already encoded course of action. The data derivative's ontology of association is indeed indifferent to the occurrence of specific events, on the condition that there remains the possibility for action – in the domain of finance, the derivative can be exchanged and traded for as long as the correlation is sustained; in the domain of security the data derivative circulates for as long as the association rules are sustained. In the process of making actionable, of course, as Alain Badiou signals in his engagement with the mathematics of set theory, 'the event is decided as such in the retroaction of an intervention' (2005: 17).
Where the association rules of a piece of software code infer 'who's coming into the country' and 'why they're coming', they release into the world a data derivative that intervenes retroactively in order to have already decided the event.

Norm/Anomaly: 'On This Particular Day, at This Particular Time, in This Moment'

The emergence of the data derivative demands something of a renewed critical thinking about the specific form of the life of a population as a terrain of governing. In Foucault's 1976 lectures 'Society Must Be Defended' he delineates, from 'the power of sovereignty' to 'take life and let live', his analytic of biopower: the 'technology of power over the population' in a form that 'consists in making live and letting die' (2003: 247). As Stephen Collier (2009) has argued compellingly, this stark annexation of biopolitics from sovereignty contrasts with a second, later mood in Foucault's work on specific forms of biopower.6 In Foucault's late reflections, the idea of governing as government of population 'makes the problem of the foundation of sovereignty even more acute and it makes the need to develop the disciplines even more acute' (2007: 107). In contrast to an epochal move from sovereignty to discipline to security, then, Foucault depicts their co-presence: 'a
triangle: sovereignty, discipline and governmental management, which has population as its main target and apparatuses of security as its essential mechanism' (2007: 108). In the initial formulations population arose as a series of traits and behaviours which, once accurately categorized and calculated, can be acted upon as norm and anomaly. It is in this way that health data become vaccination programme, education data render possible curriculum, assessment and audit, knowledge of the urban citizen becomes a sanitation programme and so on (Foucault, 2003: 244). Yet, as the formulation of biopolitics moves to consider the sovereign as dealing with a complex and aleatory milieu of the human species, the relation of data to norm appears altered. No longer pursuing a clear delineation of norm from anomaly, the data derivative functions through a mobile norm. Where, as Foucault proposes, 'normalization posits a model and tries to get people, movements and actions to conform to this model [the norm]', in the security apparatus he observes 'the plotting of differential curves of normality' (2007: 57–63). To be clear, far from a world in which biopolitics eclipses sovereign and disciplinary power, we see a security apparatus that mobilizes specific techniques for deploying the norm to govern uncertain and unfolding populations. It is precisely such differential normalities that circulate in the associative writing of the data derivative. As the deputy director of the UK e-Borders programme explained in an interview, the risk-flagged anomaly 'would never be self-evident. Only self-evident on this particular day, at this particular time, in this moment'.
When the software designers and mathematicians emphasize the importance of 'setting the gauge' – the refining of the algorithm governing the rules between items of data, so that data can be 'flushed' or 'washed' through – this does not depict a filter that catches those mobile bodies, monies or objects that deviate from a known norm. Rather, the data derivative works with a mobile norm, a norm that is itself modulated and aleatory, governed not by normalcy and deviations but by differential curves of normality. The e-Borders official thus describes a customs officer who can visually scan pages of data on multi-transit-point sea routes, identifying an apparent anomaly (multi-leg journey associated with sea ports, associated with cash paid) as normal in specific circumstances, where the behaviour is linked to a contract seaman returning home. 'That is what we are trying to do', he reports, 'to automate that kind of intuition . . . encode it'. The making of the association rules that automate such judgements is an iterative process described as a 'rapid fire Q & A' between the software designers, immigration, customs and counterterror officers, policing authorities and front-line border personnel in the 'offline team' – for example, 'what should the age range be for the drug mules route?', 'should we associate seating patterns with this one-way ticket?', 'is the 50%+ risk score useful as an indicator at that time of day?'7 Though the data derivative is indifferent to the underlying data, it does, as Brian Massumi suggests, 'assert its own normality, of crisis: the
anytime, anywhere emergence of the abnormal' (2009: 155) – living and circulating through multiple decisions about potential threats. Among the designers of border security software, much is made of the fact that the data derivative does not live on in the conventional sense of being retained as personal data attached to an individual. Understood as a mode of data distinct from, and indifferent to, underlying values, however, the data derivative persists in the building and refining of the mobile boundaries between normal and risky travel behaviours at a given moment. 'Our data serves two purposes', reports a European Commissioner, 'they serve the real time risk assessment – do you fit the risk level? If not then they will serve a secondary purpose which is they will give an indication of how normal people travel compared to other kinds of people'.8 In the iterative and oscillating setting of the risk gauge, then, the oft-cited '99.9% of people going about their business journey or holiday in a perfectly normal manner' are folded back into the ability to calculate risk against a modular norm. A 'low risk' derivative persists in the offline modelling of future rules, such that the fleeting encounter between frequent flier, iris scanner and automated gate manifests at some future date in other forms of encounter with other subjects. Indeed, even the apparent 'false positives' we might highlight as symptomatic of the excesses and slippages of the risk-based techniques are successes on the register of refining the mobile norm. The false hits of multiple security interventions that prove negative can never be errors in the terms of the derivative, for they too are folded back into association. The apparent risk flag produced by the association of a ticket paid by a third party, one way, less than five days before travel, for example, produced clusters of false positives around ski routes where insurance companies repatriating injured travellers replicate this pattern.
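The folding of false positives back into the rule set – the re-setting of the gauge – can be caricatured in a few lines. This is a hypothetical illustration only: the rule, the route codes and the record fields are invented for the purpose, not drawn from any actual analytics. It shows a rule flagging third-party, one-way, late-booked tickets being refined with a route exception once the insurance-repatriation cluster is discovered:

```python
# Hypothetical illustration of refining an association rule rather than
# treating false positives as errors: the ski-route cluster is folded back
# in as a further association, re-setting the risk gauge.

def base_rule(r: dict) -> bool:
    """Initial rule: paid by a third party, one way, booked < 5 days out."""
    return bool(r.get("paid_by_third_party")
                and r.get("one_way")
                and r.get("days_before_travel", 99) < 5)

SKI_ROUTES = {"GVA", "GRE", "INN"}  # illustrative route codes

def refined_rule(r: dict) -> bool:
    """Refined rule: the same associations, now in association with route --
    insurance repatriations on ski routes no longer raise the flag."""
    return base_rule(r) and r.get("route") not in SKI_ROUTES

repatriated_skier = {"paid_by_third_party": True, "one_way": True,
                     "days_before_travel": 2, "route": "GVA"}

print(base_rule(repatriated_skier))     # True: the original rule fires, a false positive
print(refined_rule(repatriated_skier))  # False: the refined rule does not
```

Note that nothing in the refinement corrects an 'error' in the underlying data; the false hit simply becomes one more association in the differential curve of normality.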
The addition of an association with particular flight routes to re-set the risk derivative in those circumstances illustrates the indifference to error in, or alternative reading of, the underlying data. The data derivative not only loosens the disciplinary relation of data to norm, it also embodies an indifference to conventional Galilean scientific notions of evidence and accuracy (Stengers, 2000). As Lorraine Daston has argued powerfully in relation to the histories of scientific experiments and instruments, the history of science reveals not strictly a desire for ever greater degrees of accuracy and objectivity, but in fact a quite distinct emphasis on precision, for an 'intelligibility of concepts' which 'by itself, stipulates nothing about whether and how these concepts match the world' (1995: 8). To put the matter simply, it is of lesser consequence whether data accurately captures a set of circumstances in the world than whether the models can be refined for precision. If the governing by norm we associate with census and survey required the 'large number' collection of data in order empirically to identify patterns, validate and calculate – such as, for example, in the prudentialism of actuarial models and insurance calculation (Ewald, 2002) – the mobile norms of data derivatives are oriented not to
the conventional archive and collection but to discarding. As one software designer put the problem:

It's about throwing away an item of data if it doesn't help you make the right decision . . . We want the right amount of data to make a good enough decision, to take the right action at the border, to stop that person, refer to the police. It's giving them enough information for them to take the right action, with the right type of risk, enough information to make a judgement.9
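The discarding described here – deriving the flag and throwing the underlying 'straw' away – can be rendered as a minimal sketch. The scoring weights and field names below are invented assumptions for illustration; the point is only that the derivative can persist after the underlying data items are gone:

```python
# Schematic: derive a risk flag, then discard the underlying data items,
# retaining only the derivative and a trace of which associations fired.
# Scoring weights and field names are invented for illustration.

def derive_flag(record: dict) -> dict:
    fired = []
    if record.get("payment") == "cash":
        fired.append("cash-payment")
    if record.get("one_way"):
        fired.append("one-way")
    score = 50 * len(fired)  # toy scoring, not any real analytic
    return {"risk_score": score, "associations": fired}

record = {"name": "...", "payment": "cash", "one_way": True, "meal": "..."}
derivative = derive_flag(record)
del record  # the underlying items are 'thrown away'

# The derivative lives on without the data from which it was derived.
print(derivative)  # -> {'risk_score': 100, 'associations': ['cash-payment', 'one-way']}
```

Restricting what may be collected or archived, in this caricature, touches only `record`; the flag in `derivative` circulates regardless.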
When what matters is the derivative risk flag itself, and when this lives on in the refining of rules, in the absence of archiving in the conventional sense, the underlying data items can be all but instantaneously discarded, leaving only the trace of their association. Arguably, then, when the controversies surrounding the extradition of European PNR data to the US resulted in the reduction of underlying data fields (from 34 to 19), or when the Canadian privacy commissioner ruled that in-flight meal choices as 'cultural indicators' be filtered out of PNR data, these actions failed to adequately apprehend the life of the data derivative. The derived risk flags and scores are not dependent on collected or archived data – 'it's not about collecting more data', argues the UK Borders Agency, 'that just makes the stack bigger. We are throwing the straw away'. Because the data derivative is produced via the screening out of data, the political space of response that says 'protect', 'limit', 'make private' is problematized. In a sense it no longer matters precisely what the authorities are permitted to collect or how it is stored or protected. Akin to the financial derivatives that allow for exposure to the risk-reward characteristic of underlying assets without having to possess them (MacKenzie, 2009), the data derivative is exposed to the underlying data without collecting them, created across the gaps and absences, in the interstitial spaces of inference and expansion.

The Visual and the Virtual

The very idea of the derivative has long been considered to have a 'virtual quality', such that financial derivatives are said to have a 'strangely imaginary or virtual character' (Arnoldi, 2004: 23), even where the emphasis is placed on 'how virtuality is produced' (MacKenzie, 2009: 80; see also Knorr Cetina and Bruegger, 2002). In the encoded and digital systems that are the milieu of the derivative it is most often an abstracted and virtual world that is depicted.
Yet, as Brian Massumi reminds us, the knotted and folded potentialities of the virtual are not to be understood on the same plane as the calculated rationality of possibility that we find in the digital world. 'Nothing is more destructive for the thinking and imaging of the virtual', argues Massumi, 'than equating it with the digital' (2002: 137). In the digital processes of programming, and the writing of code, Massumi locates a 'numeric way of arraying alternatives so that they can be
sequenced', a means of rendering calculable possible futures, 'step after ploddingly programmed step' (2002: 138). Understood in this way, the ontology of association does not aspire to virtuality at all, but to actuality and the actualization of an array of possibilities. In this sense, returning to Pynchon's novel, the data derivative shares more in common with Roger Mexico's statistically inferred map than with Tyrone Slothrop's affective world of pre-emptive sensation. When Ned Pointsman seeks to fold Slothrop's sensate map back into the capacity to foresee and intervene, or indeed when the intuitions and aesthetic sensibilities of software designers become an association rule, this does indeed become a numeric way of arraying alternatives, an already encoded systematization. Perhaps the seductiveness of the idea of the virtual as the dwelling place of the derivative lies in part in the apparent dominance of the visual in its representation. Whether it is the screenic domain of the derivatives trader (Knorr Cetina, 2006), the vigilant visualities of border security (Amoore, 2007) or the scopic regimes of the flood risk map writer (Lane et al., 2010), the colour-coded flags, screened scores and red and blue maps appear acutely visual. Yet the appeal to the 'sovereign sense' of the visual further establishes the data derivative as an already encoded set of possibilities, this apparently 'most reliable' of senses underwriting the rationality of the association (Mitchell, 2005: 265; Bal, 2003: 13). 'Let's say I have one thousand border staff', explains one security software consultant, 'the offline analysis is complex, but it must be fed back in a decisive format – that is what the system is about, displaying it on their screen'.10 What appears as the virtual realm of algorithm and screen is more precisely understood as a visual economy – a means of dividing, separating, and acting upon arrays of possible futures.
As the art historian Jonathan Crary has argued, echoing Massumi's sense of the digital as the systematization of the possible, the visual is 'not primarily concerned with looking at images but rather with the construction of conditions that individuate, immobilize and separate subjects, even in a world in which mobility and circulation are ubiquitous' (1999: 74). At the point that the data derivative becomes visualized on the border guard's screen, 'technology', as Barbara Maria Stafford has argued, 'screens out what supposedly does not matter', directing and delimiting attention (2009: 289). The virtual knots and slippages that were present in the room where a mathematician's intuition met a software designer's aesthetic sensibility fall away in the appearance of an already decided course of events. In the desire for precision over accuracy, the visualization 'automates the response', as the designers explain. Pynchon's Mexico was perhaps correct: 'it is plugging numbers into an equation, you could do it'. Understood not strictly as a world of abstracted virtualities or screenic vision, but more precisely as a domain of the arraying of possibilities by association, the data derivative cannot be seen as it is so often portrayed – as the irrefutable 'electronic footprint' of the data subject, left behind in the
residue of a digital world. Rather, the data derivative has become a means of dividing, separating, particularizing subjects (Deleuze, 1992), literally bringing them to attention and making them subjects of interest. Thus, for example, when the bombings in London in July 2005 coincided with IBM's trialling of the UK's e-Borders programme – Project Semaphore – the British Prime Minister Tony Blair requested that Semaphore be extended to the Pakistan–London flight routes, and a new person of interest was written into the association rule and brought to attention: 'the British citizen of Pakistani origin'. The screening of PNR data, then, visualizes subjects of interest through the surface data but also, as Kaja Silverman (1996) has observed of the screen, 'through the illusion of depth' – a deep reach into databases and analytics. As commercial players have reported, it is not the surface items that provide the 'complete picture of a person' that is sought – 'that only tells us who someone is, we want to know why they are here'. My surface data may say that I am a British citizen, but the analytics will ask why I am here. What are the implications of visualizing subjects in this way? Alexandra Hall and I have elsewhere considered this to be a form of 'digitized dissection' – an anatomical disaggregation of a person into degrees of risk (Amoore and Hall, 2009: 450). How does a person of interest come to be seen? How are they targeted? How can they respond? Could they ever reply 'no, that is not me', or 'it is me but that is not why I am here, that is not my intention'? When the writers of risk algorithms for the e-Borders system account for their disinterest in the underlying personal data, in a sense this is the case – for it is the capacity to abstract and intervene that counts:

When you're doing analysis to refine rules, that's completely anonymous. You may know everything else about them except their name – they're unknown.
Except, well they’re not, we know lots about them, we know there is a person who has been doing this and this and this. We just don’t know their name yet and we can stop them before we know it.11
Such is the extent of the indifference to the singularity of underlying data in the making of the derivative that the slippages and excesses of the systems centre precisely upon those underlying elements that perhaps ought not to be missed. For example, as the derivatives of the sub-prime market continued to proliferate in 2006, the underlying foreclosure rates on mortgages (foreclosure itself a disciplinary technique of punishing the debtor) arguably should have signalled an increasing problem. Similarly, the failed apparent transatlantic bombing attempt on 25 December was said to have evaded security because of the ‘failure to join up the dots’ in PNR analytics and notably not because of the failure to simply identify a name that already existed on a watchlist (New York Times, 2009). So overwhelming is the pursuit of the as yet unknown future threat, that the
sounding alert of the watchlist – buried deep below the visible surface of contemporary emergency governing – is rarely heard.

Mobility at the Service of Security

When Michel Foucault mapped the tentative contours of a security apparatus, for him it was oriented not to the concern to prevent events from happening, to 'let nothing escape', but rather to 'open up and let things happen' (2007: 44–5). Associating disciplinary techniques of governing with the will to prevent and stop things happening, 'to prevent it and ensure it does not take place' (p. 31), for him the 'space of security refers to a series of possible events', a 'different sort of problem' that must 'allow circulations to take place' (p. 65). To clarify again at this point, it is not that the security apparatus supersedes the disciplines, but precisely that the techniques of the security dispositif occupy what Collier calls a different 'problem space' (2009: 80), occluding those of disciplinary governing at specific moments, in particular places. While disciplinary techniques position mobility and security in a fraught relation, one where prevention is achieved precisely by stopping, halting, prohibiting, the security apparatus places mobility at the service of security. The technology for placing mobility at the service of security is risk itself – a set of practices that flag 'differential risks, zones of higher risk and zones of lesser or lower risk' (Foucault, 2007: 61). Pynchon's question of 'where do we go to be safe?' is replaced by modulating zones of differential risks. In the security apparatus there is no further need for capture, permissions and prohibitions, for what is governed is circulation itself. In the world of the financial derivative, mobility at the service of security is writ large – what matters is not whether a particular underlying value rises or falls, but only that this volatility or 'implied volatility' has itself been rendered tradeable (MacKenzie, 2008: 251).
In effect, movement in any direction can be secured so long as it is possible to correlate the mobility to some future amalgam of possible outcomes. To clarify at this point, the data derivative places mobility at the service of security precisely by trading on and through fluctuations. Unlike the 'prudentialism' of settled-out categories in risk modes that rely upon statistical probabilities (Ewald, 2002) – for example, this profession, that salary, this neighbourhood, that health history – the data derivative embodies a risk mode that modulates via mobility itself. Like its commercial origins in retail data mining (see Amoore, 2009b), the data derivative does not seek out the settled categories of 'this customer', 'this traveller', 'this migrant' or 'that visa applicant', but instead wants to recognize bodies and objects in movement, in and through their very transaction. To give an example, one group of software designers supply the UK government with social network analysis tools for security applications, these tools having been originally developed for 'customer intelligence'. The writing of association rules for customer intelligence raises questions of the movements
across and relations between people and places – 'do these customers influence each other?', and 'is this link significant?' The data left in daily transactions and mined for risk criteria is not strictly a body of evidence of what a mobile subject did or did not do, but a set of relations from which the derivative can be written. Perhaps the very heart of the ambitions for writing security via mobility is the so-called 'automated gate', a pre-screening, biometrically identifying, fluid border that can, as its designers see it, adapt in real time. The automated gate does in effect replace the border guard at the airport security checkpoint or on an international rail terminal plaza – the derivative risk flag or score automating the decision as to the opening and closing of the gate. It does indeed appear to, in Foucault's terms, 'open up to let things happen', but it does so via risk-based 'rules and algorithms' that modulate, 'change daily and evolve'. As one UK software consultant described the place of the automated gate in the London Olympics, for e-Borders the 'Olympics of 2012 represents a showcase for short, sharp pre-arrival and pre-departure screening, tied to biometric ID'. It is through the daily patterns of mobility itself – PNR data into the UK; credit card transactions for the purchase of stadium tickets; Oystercard data for London transit – that the 'constant iteration between the setting of criteria and checking the matches' takes place, so that 'technology opens the border rather than closing it'.12 The checking of matches pushes at the limits of the criteria, such that there are no singular and definitive criteria for admission or refusal. The rhythms of the 'Q & A' by the mathematicians and software designers resonate across the writing of the derivative, into the city streets and stadia – into the security intervention, and flow back into the refining of rules.

Conclusion: Data and Decision

Why is your equation only for angels, Roger?
Why can’t we do something, down here? Couldn’t there be an equation for us too, something to help us find a safer place? (Pynchon, 1973: 63)
The opening citations of this paper suggested the dilemmas and difficulties of deploying data, preemptively and by risk calculation, to render a safer place – a world of security. The statistician Roger Mexico has 'tried to explain' the V-bomb statistics to his lover Jessica, insisting on 'the difference between distribution, in angel's eye view, over the map of England' and the contingent uncertainty 'of their own chances, as seen from down here' (1973: 62). There is a gap, he reminds us, between the visual and political economies of arraying data on map or screen and the difficulties confronted by any future promise of a 'safer place'. Jessica struggles to sustain both pictures on a visible register: the statistical probabilities as calculated in Mexico's distributions, and the more properly virtual potentialities of
their own life chances in the city. 'She couldn't keep them both in sight', writes Pynchon, 'pieces keep slipping in and out' (1973: 63). The slippages in the relations between data and decision are present also in the contemporary data derivative as it moves back and forth between 'angel's eye view' (the offline analytics) and 'a safer place as seen from down here' (the real-time decisions of the border guard or security official). In one sense it would be accurate to concur with Jacques Derrida that, where decision is 'simply the application of a body of knowledge of a rule or norm', in fact no decision meaningfully takes place (1994: 37). Because the many 'decision trees'13 of the data derivative effectively automate the responses of border security staff, the data derivative annuls the possibility of actual decision. 'The decision, if there is to be one', proposes Derrida, 'must advance towards a future which is not known, which cannot be anticipated' (1994: 38). The 'real time decision', then, is simply read off from the derivative – replacing the agonism and radical uncertainty of decision and placing responsibility in the realm of response. A responsible decision would have 'to decide without it, independently from knowledge', acknowledging the absolute contingency and uncertainty of all relations, all associations. Indeed, real decision does haunt the room where the systems of automated rules-based targeting are built. In the numerous judgements – how to refine the algorithm, what should this item infer in conjunction with that one, what level of secondary intervention would be warranted for this level of association – the potentialities and emergent virtualities are present at the edges. As Donald MacKenzie notes in his observations of the making of financial derivatives, 'there is an element of judgement' in the momentary decisions made by the brokers of what to 'display on the screens' (2009: 80).
Surely this must be a primary task for critical enquiry – to uncover and probe the moments that come together in the making of a calculation that will automate all future decisions. To be clear, I am not proposing some form of humanist project of proper ethical judgement, but rather calling for attention to be paid to the specific temporalities and norms of algorithmic techniques that rule out, render invisible, other potential futures. One might signal, for example, the failure of critical attention to the expansive role of intuition and inference in the making of financial derivatives such as credit default swaps and mortgage-backed securities. Similarly, in the security domain, because the entire array of judgements made – their prejudices, intuitions, sensibilities and dispositions – is concealed in the glossy technoscientific gleam of the risk-based solution, there is a place for critical thought to retrieve this array and arrange it differently. Indeed, the slippages and excesses of risk-based security – for so long a critical resource for enquiry, pointing to the excess and the lack – are the very terrain of the data derivative. Derivatives occupy that very space of excess – the volatilities, the uncertain and indifferent relationalities of missing elements that can be inferred and projected. To point to the excess is no longer a
sufficient route into critical thinking about techniques of enumerating and governing. If the data derivative is making a security architecture that, as Friedrich Kittler (1997: 30) has it, 'covers the noise of war' – the data stream 'masking the unliveable outside', the very impossibility of absolute security – it also colonizes precisely what is liveable about a life: a life of associations and relations that is not amenable to calculation. The data derivative is drawn, as Pynchon's character Jessica realizes, and as Friedrich Kittler suggests, by 'scanning lines and dots of a situation that forgets us' (1997: 30). The derivative risk form acts through us and the prosaic, intimate, banal traces of our lives, and yet it forgets us. As necessarily incomplete, complex, undecided people, it forgets us. The very potentiality of life presupposes something of an unknown future. There are, of course, threats and dangers – it is against these that the data derivative is posed – but to live in association with others, to have relations, to be a life, as Gilles Deleuze put it, a life indefinite, with potentiality, 'defined not by moments but between times' (2001: 29), this is also part of the promise of what is liveable and never amenable to calculation. It is also what is ethical about political life – that it is difficult, that to decide is difficult, and that, whether mathematician 'deciding the best set of rules', software analyst 'refining the algorithm', or border guard 'engaging action', a decision that is not simply the reading off of a risk flag or protocol can only proceed without recourse to what is projected in the flags, maps and scores of the data derivative.

Acknowledgements

This paper was prepared for the 'Premediation, Anticipation, Speculation' conference, University of Amsterdam and PRIO, January 2010, and presented also at the 'Mobile Methods' workshop, Open University, May 2010.
My thanks to Marieke de Goede, Nina Boy, Evelyn Ruppert, Mike Savage, Roger Burrows and John Law, and to Adrian Mackenzie and Theo Vurdubakis for their comments. The paper has benefited greatly from the contributions of the anonymous reviewers. The research from which this paper is drawn is funded by the bilateral ESRC-NWO project 'Data Wars: New Spaces of Governing in the War on Terror', RES-062-230594 (with Marieke de Goede), acknowledging also the work of Alex Hall and Mara Wesseling.
Notes

1. At the time of writing, the UK government has terminated its contract with prime contractor Raytheon, citing 'just cause' of delays in delivery of elements of the programme. The sub-contracted suppliers of software and analytics – including a data analytics organization recently purchased by BAE Systems – continue their delivery of the e-Borders system, and the National Border Targeting Centre is now the UK Border Agency's central hub for assessing border security risk.

2. There are two forms of data submitted by airlines on their passengers and crew. The advance passenger information data (APIS) is a limited data field, including passport number, name and flight details. The passenger name record data (PNR), available from the moment a flight is booked, contains up to 40 data items, though with some of these data filtered out to comply with privacy laws in a particular country. It is this PNR data that is run through the analytics in order to derive a risk flag against a passenger. Interview with border security software designer, London, May 2009.

3. There are substantial literatures on how systems of number were deployed to govern populations from the 19th century (Rose, 1991). What Ian Hacking has called 'the making up of a population' via 'the enumeration of people and their habits' (1986: 46) witnessed the proliferation of disciplinary data collection in the forms of survey and census to identify, register, map, order and administrate. It is in relation to these specific forms of data abstraction that modern concepts of rights, liberties and protection have formed, for example rights to privacy, data protection and freedom of information.

4. Interview with IT consultants supplying data analytics to border control agencies, Brussels, August 2009.

5. Collier reads Foucault's analytic of biopower in The History of Sexuality and Society Must Be Defended as early forms that are elaborated and more nuanced in Security, Territory, Population and The Birth of Biopolitics. Indeed, both Gilles Deleuze (1988) and Giorgio Agamben have signalled a 'crisis in Foucault's work' after the first volume of The History of Sexuality, a crisis centring on the question of life within 'the field of biopolitics' (Agamben, 1999: 221).

6. Interview with UK e-Borders officials, London Heathrow, March 2009.

7. The iterative process of refining rules by questions and answers in 'real time' analysis refers to the process of piloting the e-Borders concept on designated flight routes in Project Semaphore, delivered by IBM.

8. Interview with European Commissioners on use of Passenger Information Units, Brussels, July 2009.

9. Interview with border security software designers, London, August 2009.

10. Interview with security software consultant, London, September 2009.

11. Interview with border security software designers, London, August 2009.

12. Interview with software consultant, 2 June 2009.

13. Algorithmic models are often referred to as 'decision trees', depicting the 'branches' of associative calculations (for example, automated systems for diagnosing health problems will ask questions at each branch – if fever is present follow this branch if associated with this symptom, another branch if it is absent, and so on, until an intervention is flagged: seek urgent medical attention, or treat with analgesics).
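The branching logic described in the note on 'decision trees' can be rendered as a minimal sketch. The symptoms and flagged interventions below follow the note's own fever example and are illustrative only, not a real diagnostic system.

```python
# Minimal decision tree following the note's fever example.
# Branches and flagged interventions are illustrative only.

def triage(fever: bool, associated_symptom: bool) -> str:
    """Walk the branches until an intervention is flagged."""
    if fever:
        if associated_symptom:  # fever associated with a further symptom
            return "seek urgent medical attention"
        return "treat with analgesics"  # fever on its own
    return "no intervention flagged"  # the fever branch is absent

print(triage(True, True))    # seek urgent medical attention
print(triage(True, False))   # treat with analgesics
print(triage(False, False))  # no intervention flagged
```

Each question answered selects a branch; the 'decision' is exhausted once the tree has been written, which is precisely the point made about automated responses in the main text.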
References

Agamben, G. (1999) Potentialities: Collected Essays in Philosophy. Stanford: Stanford University Press.
Amoore, L. (2007) 'Vigilant Visualities: On the Watchful Politics of the War on Terror', Security Dialogue 38(2): 215–232.
Amoore, L. (2009a) 'Lines of Sight: On the Visualization of Uncertain Futures', Citizenship Studies 13(1): 13–27.
Amoore, L. (2009b) 'Algorithmic War: Everyday Geographies of the War on Terror', Antipode 41(1): 49–69.
Amoore, L. and A. Hall (2009) 'Taking People Apart: Digitized Dissection and the Body at the Border', Environment and Planning D: Society and Space 27(3): 444–464.
Arnoldi, J. (2004) 'Derivatives: Virtual Values and Real Risks', Theory, Culture & Society 21(6): 23–42.
Badiou, A. (2005) Being and Event. London: Continuum.
Bal, M. (2003) 'Visual Essentialism and the Object of Visual Culture', Journal of Visual Culture 2(1): 5–32.
Bowker, G. and S.L. Star (1999) Sorting Things Out: Classification and its Consequences. Cambridge, MA: MIT Press.
Bryan, R. and M. Rafferty (2006) Capitalism with Derivatives. Basingstoke: Palgrave.
Collier, S.J. (2009) 'Topologies of Power: Foucault's Analysis of Political Power Beyond "Governmentality"', Theory, Culture & Society 26(6): 78–108.
Crary, J. (1999) Suspensions of Perception: Attention, Spectacle and Modern Culture. Cambridge, MA: MIT Press.
Daston, L. (1995) 'The Moral Economy of Science', Osiris 10: 2–24.
Daston, L. and P. Galison (2007) Objectivity. New York: Zone Books.
Deleuze, G. (1988) Foucault. Minneapolis: University of Minnesota Press.
Deleuze, G. (1992) 'Postscript on the Societies of Control', October 59 (Winter): 3–7.
Deleuze, G. (2001) Pure Immanence: Essays on a Life. New York: Zone Books.
Department of Homeland Security (2006) Survey of DHS Data Mining Activities. Washington, DC: Office of the Inspector General.
Der Derian, J. (2001) Virtuous War. New York: Routledge.
Derrida, J. (1994) 'Nietzsche and the Machine (Derrida in Conversation with Richard Beardsworth)', Journal of Nietzsche Studies 7: 7–65.
Derrida, J. (1995) The Gift of Death. Chicago: University of Chicago Press.
Ewald, F. (2002) 'The Return of Descartes's Malicious Demon: An Outline of a Philosophy of Precaution', in T. Baker and J. Simon (eds) Embracing Risk. Chicago: University of Chicago Press.
Foucault, M. (2003) Society Must Be Defended. London: Penguin.
Foucault, M. (2007) Security, Territory, Population: Lectures at the Collège de France 1977–1978. Basingstoke: Palgrave Macmillan.
Friedberg, A. (2007) The Virtual Window: From Alberti to Microsoft. Cambridge, MA: MIT Press.
Hacking, I. (1986) 'Making Up People', in T. Heller et al. (eds) Reconstructing Individualism. Stanford: Stanford University Press.
Kittler, F. (1997) Literature, Media, Information Systems. Amsterdam: G&B Arts.
Knorr Cetina, K. and U. Bruegger (2002) 'Traders' Engagement with Markets: A Postsocial Relationship', Theory, Culture & Society 19(5/6): 161–185.
Lane, S. and S. Whatmore (forthcoming) 'Virtual Engineering: Producing Flood Risk Knowledge for a Market', Social Studies of Science.
Lane, S.N., N. Odoni, S. Whatmore and N. Ward (2011) 'Doing Flood Risk Science Differently: An Experiment in Radical Scientific Method', Transactions of the Institute of British Geographers 36(1): 15–36.
Langley, P. (2009) The Everyday Life of Global Finance: Saving and Borrowing in Anglo-America. Oxford: Oxford University Press.
Mackenzie, A. (2007) 'Protocols and the Irreducible Traces of Embodiment: The Viterbi Algorithm and the Mosaic of Machine Time', in R. Hassan and R. Purser (eds) 24/7: Time and Temporality in the Network Society. Stanford: Stanford University Press.
MacKenzie, D. (2008) An Engine, Not a Camera: How Financial Models Shape Markets. Cambridge, MA: MIT Press.
MacKenzie, D. (2009) Material Markets: How Economic Agents Are Constructed. Oxford: Oxford University Press.
Martin, R. (2003) The Organizational Complex: Architecture, Media, and Corporate Space. Cambridge, MA: MIT Press.
Massumi, B. (2002) Parables for the Virtual: Movement, Affect, Sensation. Durham, NC: Duke University Press.
Massumi, B. (2005) 'The Future Birth of the Affective Fact', in Conference Proceedings: Genealogies of Biopolitics. Available at: http://browse.reticular.info/text/collected/massumi.pdf (consulted July 2011).
Massumi, B. (2007) 'Potential Politics and the Primacy of Preemption', Theory & Event 10(2).
Massumi, B. (2009) 'National Enterprise Emergency: Steps toward an Ecology of Powers', Theory, Culture & Society 26(6): 153–185.
Mitchell, T. (2005) 'There Are No Visual Media', Journal of Visual Culture 4(2): 257–266.
New York Times (2009) 'New Teams Connect Dots of Terror Plots', 9 December.
Pynchon, T. (1973) Gravity's Rainbow. London: Jonathan Cape.
Rose, N. (1991) 'Governing by Numbers: Figuring out Democracy', Accounting, Organizations and Society 16(7): 673–692.
Silverman, K. (1996) The Threshold of the Visible World. New York: Routledge.
Stafford, B.M. (2009) 'Thoughts Not Our Own: Whatever Happened to Selective Attention?', Theory, Culture & Society 26(2–3): 275–293.
Stengers, I. (2000) The Invention of Modern Science. Minneapolis: University of Minnesota Press.
US Inspector General (2006) Survey of DHS Data Mining Activities. Washington, DC: Department of Homeland Security.
Wigan, D. (2009) 'Financialisation and Derivatives: Constructing an Artifice of Indifference', Competition & Change 13(2): 157–172.
Louise Amoore is Professor of Political Geography in the Department of Geography at Durham University, UK. Her research focuses on the techniques through which practices of security and economy come to coalesce and recombine in novel ways. She is currently completing a book manuscript, The Politics of Possibility, in which she explores the capacity of emerging risk techniques to act on the basis of possibility and not strict probability. [email: [email protected]]
Conflicting Codes and Codings
How Algorithmic Trading Is Reshaping Financial Regulation
Marc Lenglet
Abstract

Contemporary financial markets have recently witnessed a sea change with the 'algorithmic revolution', as trading automats are used to ease execution sequences and reduce market impact. Being constantly monitored, they take an active part in the shaping of markets, and sometimes generate crises when 'they mess up' or when they entail situations where traders cannot go backwards. Algorithms are software codes coding practices in an IT-significant 'textual' device, designed to replicate trading patterns. To be accepted, however, they need to comply with regulatory texts, which are nothing else but codes of conduct coding accepted practices in the markets. In this article, I draw on ethnographic fieldwork in order to open these black boxes, while trying to describe their existence as devices encapsulating several points of view. I address the question of a possible misalignment between those visions and, more specifically, try to draw out the consequences raised by such discrepancies for the future of financial regulation.

Key words: algorithmic trading, codes of conduct, codings, financial markets, practices, regulation

Two hundred fifty milliseconds are hardly noticeable while talking, but it's long enough for a crowd to get ahead of you in the market. (Leinweber, 2009: 72)
Theory, Culture & Society 2011 (SAGE, Los Angeles, London, New Delhi, and Singapore), Vol. 28(6): 44–66 DOI: 10.1177/0263276411417444
Introduction

Struggling for Codes: The New Cold War in Trading
Contemporary financial markets are witnessing a sea change in the way they operate. Since the mid-2000s, modifications related both to technological innovation and to regulatory homogenization have been reconfiguring the market landscape. The importance of these changes has recently been underlined by the development of the Aleynikov case, which in the summer of 2009 triggered talk of a new 'Cold War' and a related 'global arms race' (Alloway, 2009). Aleynikov, a former Goldman Sachs VP, was charged with the alleged theft of 32 megabytes of proprietary trading source code just before leaving the company. The code, designed to be used within a computer platform allowing 'sophisticated, high-speed, and high-volume trades on various stock and commodities markets', was part of the high-frequency trading programmes that 'generate many millions of dollars of profits per year' for Goldman Sachs (Southern District of New York, 2009: 3). The case was followed by similar thefts at UBS, and by revelations about security measures intended to protect 'coded secrets' at Citadel Investment Group, LLC (Berenson, 2009).1 These stories gave algorithmic and automated trading major exposure in the industry, as firms fight for supremacy in the ability to design and protect their proprietary trading codes. Another sign that 'the hot area now is high frequency trading' (Bookstaber, 2009), the cases mark a new step in the automation of order flows in financial markets. After the initial shift from open outcry to electronic markets in the mid-1980s, a second revolution is well under way in marketplaces, where algorithms are playing a much bigger role in the trading process than ever before. Not only are they used in order to 'find' prices and match buy-and-sell orders (Domowitz and Wang, 1994; Lee, 1998; Muniesa, 2003), thereby materializing exchanges, but now they are also able to 'decide' when and how to send orders without direct human intervention.
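The delegated 'when and how' of order-sending can be caricatured by a toy time-slicing execution algorithm of the kind colloquially called TWAP (time-weighted average price). This is a hedged sketch of the general pattern only: no real venue, order type or broker API is assumed, and actual execution algorithms react to order-book state, venue rules and risk limits in ways this sketch ignores.

```python
# Toy TWAP-style slicer: split a parent order into near-equal child
# orders spread over the trading window, reducing visible market impact.

def twap_slices(total_qty: int, n_slices: int) -> list:
    """Divide total_qty into n_slices child orders, spreading the remainder first."""
    base, rem = divmod(total_qty, n_slices)
    return [base + (1 if i < rem else 0) for i in range(n_slices)]

children = twap_slices(10_000, 8)
print(children)  # [1250, 1250, 1250, 1250, 1250, 1250, 1250, 1250]
assert sum(children) == 10_000  # nothing lost in the slicing
```

The point of the caricature is that the decision of when and how much to send is fully written into the code before any market event occurs; the trader delegates, the automat executes.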
The growing development of algorithms therefore overflows the market frames: moving massively from exchanges to market participants, they reconfigure the nature of the agencies making markets, allowing institutions that have both the ability to develop machines and the financial resources to deploy such systems to make healthy profits; hence the Cold War espionage tone used by journalists and bloggers to describe these cases. This shift has clearly been recognized as a major one by regulators, as is revealed by the publication, by the Federal Reserve Board of Washington, of a report entitled 'Rise of the Machines' in the US (September 2009), or by the similar need for the French Autorité des Marchés Financiers to issue a press backgrounder on 'Key issues arising from the emergence of dark pools and crossing networks' (October 2009), with reference to venues where algorithms are heavily used. With liquidity issues often serving as a landscape for debates on algorithmic trading (Does algorithmic trading significantly change the quality of the available liquidity? How does it weigh
on the order book microstructure?2), the Fed report also mentions the most recent improvements in technology, allowing some algorithms to 'automatically read and interpret economic data releases, generating trading orders before economists have begun to read the first line' (Chaboud et al., 2009: 1; see also Leinweber, 2009: 56 and 84).

Providing 'Political' Descriptions of Market Devices
If a cohort of social scientists, ranging from Abolafia (1996) to Lee (1998) and Fligstein (2001), has already unfolded markets and their underlying mechanisms, interest in the description of devices has developed more recently (MacKenzie, 2006; Callon et al., 2007); the financial objects populating markets have now been recognized as valid objects of inquiry. Devices at stake in activities as diverse as arbitrage (Beunza and Stark, 2004; Beunza et al., 2006), mergers and acquisitions (Beunza and Muniesa, 2005) or the hedge fund industry (MacKenzie, 2005) have already been examined, together with the technologies once used to send orders (Preda, 2006) and receive them (Muniesa, 2008). However, paying attention to devices while describing objects and the contexts in which they occur has generated a number of criticisms among economic sociologists and political economists alike.3 The main area of dispute is the alleged lack of attention to institutions and politics in the fine-grained ethnographies proposed by proponents of the Social Studies of Finance research agenda. I do not abide by this critique: I think that market institutions and their embedded political controversies are in fact best made visible through the description of devices providing a support to the expression of multiple points of view. Despite all their merits, however, the aforementioned studies have not yet addressed an issue I deem important: namely, the regulation of practices involving those devices and the representations thereof. When mentioned in these works, regulation appears only in passing; it has not been the subject of many studies.4 The notion itself is polysemic and refers to a wide range of realities, from economics to philosophy to sociology. In this article, it will be understood as a normative activity, the purpose of which is to frame practices developing both within and around the space of the market.
Control functions such as compliance officers, permanent controllers, market supervisors and regulators all contribute in their own way to the framing of market practices, whether acting a posteriori (e.g. through the careful checking that procedures have been followed, either internally or externally) or a priori (e.g. by issuing advice to market operators just before they engage in a trade). But how do these employees get a grip on market practices when these are coded and encapsulated in a specific device – namely, the algorithm? Indeed, if trading algorithms are so widely disseminated and used by market participants in their daily duties, if these tools are really changing the face of the financial world as the examples tend to suggest, then it is high time to question their being from a regulatory standpoint.
Nowadays traders are heavily equipped with tools that allow them to delegate a portion of the practice of trading to automats ('robots') framed with computational logics and complex binary languages, thereby leaving a space for the development of new kinds of market actors. By the end of August 2009, large banks acknowledged that approximately 80 per cent of their total equity trading flows in some markets was being processed by algorithms (Jeffs, 2009). In fact, the dissemination of trading algorithms appears to be an eminently political subject, for it raises questions about the meaning of practices having an impact on the collective once these have been delegated to machines. My point is not to judge whether technology is good or not, and I do not wish to put the blame on the 'machinist' type of finance currently developing in markets. Rather, the question lies in the uses that are made of the technology: it is through the resituating of practices that the politics of the market – the power relations and institutions framing the culture of the field, together with the voice mechanisms allowing for the expression of different views – can emerge. My goal is to show, through the unfolding of the different views aggregated in the object, the kind of outcomes they produce on their regulatory environment. Like the vast majority of financial devices, trading algorithms are the material expression of converging and diverging points of view. Built as a result of conversations defining needs between users (clients and traders) and designers (engineers and regulators), algorithms differentiate themselves from other processing devices in use in trading rooms (such as screens, keyboards, phones and fibre-optic connections) in that they receive a compendious form of practice.
They do not just come as partial prostheses intended to cater to specific needs, like microphones (making sure that everybody within the trading room gets the information), or keyboards and screens (which provide a way to ask for prices) do. Algorithms are entities in their own right, places where extensive financial practices are encapsulated: as codings, they amount to a specific kind of text describing the market’s materiality. As a text, the algorithm is a definitional device that makes the financial world different each time it ‘decides’ to fire an order into the market. When describing the trading pattern it follows and making it fit into the market, algorithms get involved in the shaping of markets: not only because they belong to and co-constitute the marketplace, but also because, in so doing, they open and close possibilities to render the market adequate (or inadequate) to the patterns of action they embody. This article questions the reconfiguration of regulatory spaces entailed by the growing use of algorithms. Thus, I intend to shed light on the problem of misalignment between different codings: on the one hand, the coding of practices necessary for the algorithm to replicate a trading pattern; on the other, the coding of practices necessary for the regulators securing market functioning. Indeed, the fact that the increasing use of algorithms is modifying the market ecology may well call for a complete renewal of the ways regulation is made when the automation of market practices is at stake. How can we understand the fashion in which algorithms (software codes coding practices in an IT-significant ‘textual’ device) impact the performance of regulation (codes of conduct coding the accepted practices in markets)? Algorithms embody a controversial (thereby political) space that we have to describe if we are to understand something of what they are and, more importantly, what they produce.5

Regulation, Algorithmic Innovation and Calculation

While regulation is often said to weigh on innovation, recent regulatory changes in Europe have in fact favoured the development of algorithmic trading. Shared calculative spaces between different stakeholders began to emerge, articulated around the tool.

Adopting MiFID, Widening Algorithmic Horizons
Technological innovation in European financial markets has recently been enhanced as a result of the implementation of the Markets in Financial Instruments Directive (MiFID) in November 2007. This text, designed to remedy the gaps left by the former 1993 Investment Services Directive (ISD), has often been described as a ‘sea change’ in regulation (Casey and Lanoo, 2009: 26). Now structuring the landscape of European finance, the text hinges on two ideas. First, that competition between execution venues needs to be enhanced – thereby removing the old ‘concentration rule’ still effective in some European countries, which previously made it mandatory to transact financial instruments in centralized markets. Second, that customers need to be adequately protected from the ‘natural’ dangers they may encounter in markets. Within the context of 21st-century finance, competition materializes in the design and development of sophisticated tools enabling investment firms to compete in markets while fulfilling their clients’ instructions. ‘Best execution’, once a mere principle in the regulatory codex, is now centre stage: investment firms are bound to produce a document (the best execution policy) describing how they will manage to execute their clients’ orders to the best of their abilities, according to a mix of categories such as speed, volume, price, etc. (European Union, 2004, art. 21.2 sq. and 2006, art. 44 and 46). On the technological level, this new obligation borne by intermediaries required them to be in a position to route orders to different venues competing to offer the best liquidity. Being able to build instant comparisons between venues soon became a major concern at brokerage firms, as these needed to know where they should execute their clients’ instructions. ‘Smart Order Routers’ have therefore been designed and developed in order to make those decisions within milliseconds, a timescale on which no human being can compete.
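In its simplest form, the decision a Smart Order Router automates is a venue comparison: given the quotes published by competing venues, split the order across them, best price first. The following sketch is purely illustrative – the venue names, data structure and price-then-size ranking are my own assumptions, not a description of any production router, which would also weigh fees, latency and fill probability, as the best execution policy’s ‘mix of categories’ requires:

```python
from dataclasses import dataclass

@dataclass
class Quote:
    venue: str      # execution venue publishing the quote
    ask: float      # best ask price (we are buying)
    size: int       # shares available at that price

def route_buy_order(quantity: int, quotes: list[Quote]) -> list[tuple[str, int]]:
    """Split a buy order across venues, cheapest ask first."""
    plan = []
    remaining = quantity
    for q in sorted(quotes, key=lambda q: q.ask):
        if remaining <= 0:
            break
        take = min(remaining, q.size)   # take what the venue displays
        plan.append((q.venue, take))
        remaining -= take
    return plan

quotes = [
    Quote("VenueA", ask=10.02, size=500),
    Quote("VenueB", ask=10.01, size=300),
    Quote("VenueC", ask=10.03, size=1000),
]
print(route_buy_order(1000, quotes))
# → [('VenueB', 300), ('VenueA', 500), ('VenueC', 200)]
```

Even this toy version makes the regulatory point visible: the ‘best execution’ obligation is no longer a principle applied by a person but a ranking function executed in code.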
Algorithmic innovation, resulting from the adoption of the European regulation, thereby began to modify the roles assigned to market participants. Brokers, once valued for their ability to provide insightful recommendations on financial instruments to their clients, now chiefly need to prove their ability to swiftly process orders in the markets; and this proceeds through the development of algo trading.

‘Can You Beat the VWAP?’ Looking for Famous Algorithms, and Other Variants
Before we try to delineate the controversy articulated around algorithms, and if we want to open the black boxes they embody, we first need to give a picture of what they are. For non-users, algorithms are visible (but not fully available) on the websites of data vendors, IT consulting firms and brokerage houses. Besides these, professional journals such as Automated Trader or The Trade News: Working for the Buy Side provide detailed information about the different classes of algorithms, what they are intended to do, how to use them, etc. The basic strategies are usually displayed in grids and lists reproduced in marketing brochures, thus making it easier for the salesperson to sell the product. These brochures contain a series of ‘fact sheets’ detailing strategies, the set of parameters that are either mandatory or optional for the algorithm to work, and the underlying quantitative model, lending a taste of scientific knowledge. Among the classic suite of algorithms that have been used for several years now, one can find the ‘Volume Weighted Average Price’ – or ‘VWAP’ – algorithm. Here, the IT code attempts to meet the VWAP, which is calculated as the total value of transactions in a given instrument divided by the number of instruments effectively traded in a predefined period of time. It gives an average price that is linked both to time and to the volumes transacted in the market; it therefore serves as a benchmark often used by traders when they want to assess the quality of an execution (has the trade been executed at a better price than the price available on average over the dedicated period?). The parameters usually available for this specific algorithm are the time period (when will the algorithm begin to ‘work’ the order, and when will it stop?) and additional price constraints such as the maximum participation rate (in order to limit market impact) or price limits (levels above or below which the algorithm will stop working).
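The benchmark itself, and the slicing logic it induces, can be sketched in a few lines. This is a deliberately simplified illustration – the trade data, the volume curve and the participation cap are invented for the example, not drawn from the article’s field material:

```python
def vwap(trades: list[tuple[float, int]]) -> float:
    """Volume Weighted Average Price: total traded value over total volume."""
    value = sum(price * qty for price, qty in trades)
    volume = sum(qty for _, qty in trades)
    return value / volume

# Three trades (price, quantity) over the benchmark period.
trades = [(10.00, 200), (10.10, 300), (10.05, 500)]
print(round(vwap(trades), 4))  # → 10.055

def slice_order(total_qty: int, volume_curve: list[float],
                max_participation: float, expected_volume: int) -> list[int]:
    """Spread a parent order over time buckets in proportion to the
    historical intraday volume curve, capped by a participation rate."""
    slices = []
    for share in volume_curve:
        target = round(total_qty * share)                      # follow the curve
        cap = int(expected_volume * share * max_participation)  # limit market impact
        slices.append(min(target, cap))
    return slices

# Historically, 10% of the day's volume trades in bucket 1, 30% in bucket 2, etc.
curve = [0.1, 0.3, 0.4, 0.2]
print(slice_order(1000, curve, max_participation=0.2, expected_volume=50_000))
# → [100, 300, 400, 200]
```

The sketch shows why the VWAP algorithm is described below as ‘perusing’ the intraday volume curve: the curve is the input that determines when, and in what size, the child orders are fired.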
Once the parameters are implemented in the machine, the algorithm ‘slices’ the initial order into small parts, with reference to historical data specific to each financial instrument, with a view to sending a series of instructions into the order book. The information perused by the VWAP code is the curve describing the intraday volumes available; thus, the trader using the algorithm can be reasonably assured of matching or beating the VWAP by the end of his trade. While resting on a rather old notion (the average price), the VWAP has become a de facto performance metric and is often used to compare the abilities of traders to beat the market: in order to achieve this, the trader will try to anticipate the volumes available on the market and set the tool so that it participates accordingly. In a way, algorithms are both material and immaterial, and they sometimes possess this strange ability to escape the trader’s perception, despite the fact that they are acting in front of him. Although some algorithms act
in the open, some, on the contrary, act in the dark; hence the rather scary names they are given: ‘Dagger’, ‘Guerrilla’, ‘Stealth’, ‘Shadow’ or ‘Sniper’ are all metaphors describing a specific trading pattern. Because it is not easy, even for specialists, to get a good representation of what algorithms are and what they do, marketing and sales people use these kinds of metaphors to encapsulate the meaning inherent in an algorithmic strategy. With such names in mind, it is far easier to sell the idea that the automat will wait until an opportunity arises before ‘firing’ an order. We said that those evasive objects were intervening in the market in a fashion that would make their appearance more of a non-event, as they would not always be easily noticed. In this respect, some exchanges provide participants with a stealth capacity, and the best example of this is probably the widespread use of the ‘Iceberg’ order, the purpose of which is to allow the trader to submit a large-volume order while publicly disclosing only a small portion of it. The algorithm slices the order into several bits and only shows the ‘tip’ of the iceberg (a small quantity), while the remaining mass is kept secret, waiting under the displayed liquidity. Iceberg orders, when used carefully, are said both to reduce price movement and to smooth the execution phases. Indeed, once the first slice is executed, the algorithm automatically places a second slice, until the bottom of the iceberg has completely melted down. These examples provide a first insight into the algorithmic world. The brochures describing trading patterns, grounded in quantitative models made explicit, account for the representation of algorithmic behaviour, which otherwise remains difficult to access.

Building a Controversy, Opening a Space for Calculation
With algorithms, we face a genuine financial object: not only because they do have a materiality that can be described, and which occupies the minds of traders, quantitative researchers or IT developers, but also because they make those participants act and react. Used as a means to trade, they can be seen as agencies in the sense given to the term by Latour (1987): ‘centres of calculation’, that is to say ‘collective hybrids’ where ‘calculation is distributed among humans and non-humans’ (Callon and Muniesa, 2005: 1236). This is precisely what the algorithm proposes: a coding within which possible actions and reactions can be found in the form of logical suites describing different states of the world: ‘Calculation starts by establishing distinctions between things or states of the world, and by imagining and estimating courses of action associated with those things or with those states as well as their consequences’ (Callon and Muniesa, 2005: 1231). Keeping this definition in mind, I will try to build the controversy articulated around algorithms, those calculative objects describing contemporary marketplaces while being differently viewed by the collective to whose coalescence they contribute. Like many other devices, algorithms do imply the confrontation of different professions or actors, and actants,
trying to organize and connect with and through the technology. Among these views, and if we want to draw a list of algo stakeholders, we may find: tools (not only the code, but also the trading station and the cables that allow information to flow from the station to the marketplace servers); the trader using the algorithm, or the client to whom the tool is delivered; the salesperson who manages to place the algorithmic suite with some clients; and the marketing people who have described what the algo could produce. Let us not forget to mention the technicians, belonging to different IT departments, as well as the quants team, who frame the algorithm with a model that is to be translated into a series of ‘0’s and ‘1’s. Control functions evaluating the quality and acceptability of the code, either internally (compliance officers and permanent controllers) or externally (auditors and market surveillance, for example), should also be part of the picture. All these participants contribute, more or less, to the building, implementing and using of trading algorithms. They all have different concerns, and try to read or represent the algorithm according to their situation and their specific function; however, the scripted code represents a common interest among them. It is in this sense that I think the algorithm stands as a political object, aggregating and fixing the views that contribute to its enactment as a genuine financial device. This manifold concern lies at the heart of the black box I would like to open here, by trying to express the rationales these views account for.

What Is an Algorithm Filled with? Describing the Encapsulation of Points of View

As a technical object, the algorithm opens a space allowing for the expression of conflicting views.
Designers (IT developers) and users (sales, traders and marketing experts) generally try to disseminate the product, either by extolling its qualities and the related value it can add for customers, or by effectively using the tool in the marketplace. On the other hand, control functions (compliance officers, market surveillance and regulators) try to corset this dissemination: not because they do not want to see algorithms in markets, but because they recognize the need for a strict monitoring of the algorithm entailing full compliance with market regulations. From time to time, conflicts relating to the nature of algorithms may arise in markets as a result of script errors or misuses of instructions, which generate ‘funny trades’ in the market. These impair the ‘natural’ formation of prices, thereby crystallizing a state of crisis giving rise to a confrontation between the views expressed by all the actors attached to and through the algorithm. The difficulty we encounter here is deeply rooted in the fact that trading algorithms are not easy to represent. They have no unique display and, although they must be located ‘somewhere’ (usually on a server), it seems they do not have a single physical location, but rather a multi-local expression duplicating the market. If we want to trace the algorithmic character, should we assume it resides in the coding describing a trading pattern and
the related actions to take depending on the market’s context? Or should we say the algorithm is displayed on the trader’s screen, in the window allowing him to set the parameters which shape the algorithm? Or is it the trendy image, sold to clients, of a technology they need if they are to beat the market? Reflecting on the algorithm’s existence, I take it to be subject to the developing views expressed by those who are in close association with it: it is these views that I detail hereafter.

A Technical Product that Requires a Specific Argument (The Marketing and Sales View)
Among the most visible representations of the algorithm, we find the formulations produced by marketers, which generally have a ‘design’ approach to them. Marketers need to produce brochures explaining what the algo does, and frame these ‘actions’ in groundbreaking slogans. They need to find the right tone and the vocabulary that will speak to market participants (sales, traders and clients). Most of the textual representations they engender are grounded in assumptions drawn from economics and finance theories. Ideas such as liquidity, efficiency and the perfection of markets are often used to extol the product’s abilities in advertisements directed at potential customers (institutional firms):

Low latency, high throughput, efficient liquidity management for personalized best execution handling. (Millennium advertisement, 2009)

ITG DARK ALGORITHM®. Unique, aggregated access to POSIT®, ATSs & ECNs, total transparency of executions, anti-gaming logic. (ITG advertisement, 2008)

Sophisticated algorithms that dynamically adjust to minimize slippage and market impact. (JPMorgan advertisement, 2008)

Our sophisticated EMS, SpreadHawk™, allows clients to trade through shrink-wrapped algorithms for single stocks, pairs and portfolios. (AlgoTrader advertisement, 2009)
All these sentences, selected from a wide range of similar ones, make it abundantly clear that the machine enhances its users’ capacity to trade better: that is to say, to make money in milliseconds, with the ability to avoid market pitfalls, all the while disclosing neither the intentions nor the trading strategies followed. The underlying idea, which serves as a basis for algorithmic design, is that of the clinical precision of a fast machine sophisticated enough to solve problems (Beunza and Stark, 2004). These representations, initially intended to sell the product, duplicate the mathematical and economic assumptions they are built on: the informational efficiency hypothesis, transparency, and the necessary smoothing of
executions in order not to make ‘noise’ or waves in the market’s liquidity, for instance. These arguments are generally supported and used by salespeople who take advantage of the formalization offered by marketing brochures in order to place the product with their customers. Once the Alternative Execution Services (AES) sales representative has solved connectivity and settlement issues for a client, he may decide to go further and ask the client if he would like to be offered access to the in-house suite of algorithms. He will subsequently send a short email detailing which strategies could be made available to the prospect:

I would like you to use my algos on Europe as well ... VWAP, % of volume, TWAP, In line, Arrival price, Iceberg, ... You can place larger orders into them to be worked over a period of time = better execution, less risk, increased productivity. What do you think?
Being convinced that ‘the client will get better opportunities than the others’ helps in the selling process. In the end, the AES salesperson resorts to similar arguments to sell the code, emphasizing the results it may produce in the market. Technology is contributing to the industrialization of profits because it allows either faster trades (‘low latency’, allowing users to capitalize in milliseconds) or more accurate ones (‘less risk, increased productivity’).

An Object Framed in a Dedicated Language (The IT Expert View)
For the IT developer, working hand-in-hand with the quant analyst, algorithms consist in the translation of a mathematical model into a series of scripts framed in a dedicated language: C++, .NET, FIXatdl, Java, etc. However, they are not merely another piece of software useful to their internal clients (the traders); sometimes, the IT developers describe their algorithms as living entities, which develop and reconfigure themselves according to market-shaping events.

Me: So, how would you describe the algorithms you write?
Kyle: They are tools, mmmh ... something in which we put some intelligence ... there is some intelligence in the automat.
Me: What kind of intelligence? What do you mean?
Kyle: Some of them have the ability, when they are living, to use other algorithms, they decide how they should act according to a set of events ... we work on this with the quants team ... we call them meta algos.
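The ‘meta algos’ Kyle describes – codes that do not trade themselves but decide which other algorithm should act, according to observed events – can be caricatured in a few lines. The event names and dispatch rules here are invented purely for illustration; actual meta-algorithms are proprietary and far more elaborate:

```python
# A toy 'meta algo': it selects which child strategy should handle the
# order, given an observed market event (hypothetical event names).
CHILD_ALGOS = {
    "calm": "VWAP",            # quiet market: track the average price
    "volume_spike": "POV",     # sudden volume: participate proportionally
    "wide_spread": "Iceberg",  # thin book: hide the order's true size
}

def meta_algo(event: str) -> str:
    """Return the name of the child algorithm delegated to act."""
    return CHILD_ALGOS.get(event, "VWAP")  # default to the benchmark tracker

print(meta_algo("volume_spike"))  # → POV
```

Even this caricature conveys why developers speak of the automat as ‘living’: once the dispatch table is in place, the decision of which practice to enact has itself been delegated to the code.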
During the conversation, Kyle explicitly states that his job consists in finding the correct language, the right code allowing the best set-up between different sources of information.6 He describes his work with metaphors mixing technological and human existence, somehow completing the formulation suggested by Lash (2001: 107): ‘making sense of the world
through technological systems’. This idea that the code is a being may well be shared by users, as shown by the following message, sent by a trader to the IT department in charge of the development of algorithms and entitled ‘End of life of an algo’:

When client puts an end-time including the fixing (say, 5:35:00), it’s obvious he wants to take part to the fixing. Today, POV [the ‘% of Volume’ algorithm] till 5:35:00, auction @ 5:35:16, the algo was cancelled ... we are long 3.5 million. I remember asking the algo being kept alive till the real fixing time.
The living item, behaving erroneously because of an improper input in its set of parameters, meant that the broker finished the day with a risk position in his book. The terms employed here depict the proximity between humans and machines, which matches the observations previously made by Beunza and Stark (2004: 396): ‘The robots, as the traders say, are partly “alive” – they evolve. That is, they mutate as they are maintained, re-tooled, and re-fitted to changes in the market’. The codes populating the marketplaces therefore do play an active role in the interactions mediating transactions between traders and other market participants. Achieving a kind of autonomy, they need to be ‘kept separated to reduce the possibility that their evolution will converge (thereby resulting in a loss of diversity in the room)’.

A (Not So?) Helpful Tool for Daily Duties (The Trader’s View)
When he opens his workstation, the trader usually gets access to an ‘algo box’, which resembles a toolbox (Rosen, 2009): rather than facing a black box, he is presented with different strategies displayed in the form of a list, requiring at least a mental (re)construction before the algorithms are launched into the marketplace. According to his client’s instructions and the market’s context, the trader will choose, among different algorithms and different parameters, the tool with which he will have the trade executed. This is not always as easy a task as it seems. Not only because, despite their being fine-tuned to traders’ needs, algorithms entail a certain path dependency (once you have set the parameters in the machine, it is not always possible to change your mind and revert to the initial situation, as trade slices might already have been sent to the market for execution), but also because traders sometimes recognize that ‘it’s no fun having the machine play for you’. Some of the old-timers, those who have known the Golden Age of the open outcry, explain that they do not feel comfortable with having their position built by the algorithm:

My job is not to spend the day looking at executions I have not decided. My job is about doing my best to get a good price for my client, and this I can’t do if I’m not able to decide when and how to place the trade ... and
you know better than me that algos can just go mad and fuck up a nice execution. (a senior trader)
If the trader does not build the position – and sometimes ‘spiel’ with it a little bit – he may feel rejected or expelled from his own practice by the technical object intended to support him. What is at stake here is the materialization of a shift in the making of markets, towards a greater rigidifying of practices: older traders often criticize this shift, as they have the feeling of entering an age where financial markets resemble an electronic game they can no longer access. And this may not just be the reaction of ageing staff unable to adapt to a better way of processing trades. The algorithmic revolution is a real one, according to experts in the negotiation of financial instruments:

There is a lot of anecdotal evidence regarding new behaviours as a result of particular algos. An example is the effect of volume participation algos chasing a volume spike. The piling in of several algos simultaneously to chase a spike can cause temporary supply disruption and cause the price to sharply spike before reverting to the mean. (Rosen, 2009: 104)
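The feedback Rosen describes – participation algorithms chasing a volume spike that their own trading amplifies – can be made concrete with a toy simulation. It is purely illustrative: the rates and volumes are arbitrary, and real order-book dynamics are vastly more complex:

```python
def simulate_pov_feedback(base_volume: int, rates: list[float], steps: int) -> list[int]:
    """Each step, every %-of-volume algo trades rate * last observed volume;
    those child trades are then added back into the volume the algos observe."""
    observed = base_volume
    history = [observed]
    for _ in range(steps):
        algo_volume = sum(int(rate * observed) for rate in rates)
        observed = base_volume + algo_volume  # the algos' own prints inflate the tape
        history.append(observed)
    return history

# Two algos each targeting 20% of observed volume, after a spike to 10,000.
print(simulate_pov_feedback(10_000, [0.2, 0.2], steps=4))
# → [10000, 14000, 15600, 16240, 16496]
```

The observed volume climbs step after step even though the underlying ‘real’ flow never changes: each algorithm is partly chasing the prints of the other, which is precisely the ‘piling in’ effect the quotation points to.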
Therefore, we see that algorithms do weigh on the practices of traders, either because of the changes they bring to the market or because of what traders think they may produce on the ecology of the order book. But automating traders’ decisions also brings new challenges as regards the framing and the control of practices, for at least two reasons. First, because it is far easier for those who are in charge of regulation to interact with a human being than with a machine (and besides, they do not always know exactly where the machine is). Second, and perhaps more importantly, because the kind of mediation generated by the algorithm, which takes place between the trader and the market, does not reinforce the trader’s ability to feel responsible for what is done on his behalf in the market. It is these two points that I will now probe.

Adding the Regulatory View, Furthering the Controversy

To the different views already expressed (marketing, sales, IT and trading), others should indeed be added if we are to understand the controversy generated by algorithms in contemporary financial markets. Once algorithms – which I have defined as software codes coding practices in an IT-significant ‘textual’ device – are sent into the cables leading to market servers, they may come up against a series of resistances resulting from the framing of marketplaces with regulatory texts: the codes of conduct which code accepted practices. While the front office staff’s duty is to develop, promote, sell or use the algorithm (all actions that have previously been classified under the idea of a dissemination of the code performing markets), the view from a regulatory stance seems quite distinct, if not entirely different.
Control functions do indeed work on algorithms, doing their best to monitor the suggested dissemination: if there is a ‘rise of the machines’, then it has to be framed and monitored. How does this function?

An Existence to be Described and Disclosed (The Market’s Point of View)
One of the main concerns expressed by market structures is that of the proliferation of systems plugged into their platforms and weighing on levels of natural liquidity. It is, however, very interesting to note that some markets are very strict on the use of algorithms, while others do not seem to bother too much with them. A broker willing to use algorithms to trade on the Irish Stock Exchange, on Deutsche Börse or on the Swiss market will need to go through a validation process requiring the filing of forms detailing with precision what the algorithm is intended for, how it works, the different levels of controls or monitoring systems that can stop it at will (‘panic buttons’), etc. The name of the person responsible for using the algorithm also needs to be disclosed, in order for market surveillance to be in a position to identify the trader and the related compliance desk to contact, if the need arises. Other markets, such as Euronext or the London Stock Exchange, do not ask for details about the systems through which transactions are submitted to the market. Rather, their rules and regulations emphasize that investment firms and traders using algorithms are solely accountable for the orders they input into or withdraw from the market. Among the markets which impose a disclosure obligation, some propose their own definition of what the algorithm is, thereby trying to set a formalized representation in accepted categories:

Automatic order entry systems, in particular Quote Machines, Electronic Eyes and Algorithmic Trading Engines as well as combinations thereof, are computer programs of a company for automatic generation of orders and are part of the Participant Trading system. Such orders are generated and transferred into the electronic trading system on the basis of order book information and additional parameters determined by the company. (Deutsche Börse, 2009: 31)
Here, it is through the alignment of mandatory conditions that the algo gains access to a recognized status: a definite physical location within the company, together with its registration under the name of a trader, who will be responsible for managing the coded tool. Thus, the exchange seems to admit that algorithms produce a part of their own reality when shaping the flows and moves that can be observed in order books, and which influence participants’ activities. The definition provided here is intended to allow the market to sanction misbehaviour that would result from an inappropriate use of algorithms – or other hybrid systems.
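The disclosure regime just described – a registered responsible trader, declared limits, and controls that can halt the automat ‘at will’ – suggests code of roughly the following shape. This is an invented sketch of what a ‘panic button’ wrapper might look like, not an actual exchange-mandated implementation; the trader identifier and limit are hypothetical:

```python
import threading

class ControlledAlgo:
    """Wraps a trading algorithm with the controls exchanges ask to be
    disclosed: a registered responsible trader, a declared order-size
    limit, and a panic button that stops the automat at will."""

    def __init__(self, trader_id: str, max_order_size: int):
        self.trader_id = trader_id          # disclosed to market surveillance
        self.max_order_size = max_order_size
        self._halted = threading.Event()    # the 'panic button'

    def panic(self) -> None:
        """Compliance or market surveillance halts the algo."""
        self._halted.set()

    def send_order(self, size: int) -> str:
        if self._halted.is_set():
            return "rejected: algo halted"
        if size > self.max_order_size:
            return "rejected: exceeds disclosed limit"
        return f"sent {size} (responsible trader: {self.trader_id})"

algo = ControlledAlgo(trader_id="TRD042", max_order_size=5_000)
print(algo.send_order(1_000))   # → sent 1000 (responsible trader: TRD042)
algo.panic()
print(algo.send_order(1_000))   # → rejected: algo halted
```

The point of the sketch is that the regulatory framing is itself coded: the trader’s name and the kill switch are not external paperwork but attributes and methods of the object sent into the market.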
An Object Framed with a Set of Predetermined Rules (The Compliance Officer’s Point of View)
Between markets and traders, interface functions such as compliance officers make the link between texts (codes of conduct) and market contexts (in which, as we now know, IT codes are playing a role). They try to corset the practices where these are made, and to find solutions when the rulebooks setting out the principles of accepted market behaviour do not address the issues faced by operators. Compounding texts and contexts, compliance officers are also in charge of relationships with market surveillance: they process the paperwork accompanying the deployment of algorithms, if any, and manage issues that may arise as a result of the use of the algorithm. Being in charge of processing the paperwork, the compliance officer needs to get an insight into what exactly the algorithm does, and how it duplicates practices and trading patterns, before he can translate and document these into written explanations – ‘which both the IT developer and the trader usually do not like at all’ (a compliance officer). Providing an account of what exactly the coding’s purpose is, or how it will behave in the market, requires the gathering of explanations from the departments that participated in the development of the algorithm (the quant who modelled the behaviour, the IT developer who translated it into a script, and the head trader who initiated the request). Based on these investigations, the compliance officer will construct an argument and compare it with regulatory requirements (the codes of conduct), which can be either overdeveloped or fairly scarce in their expression of what the exchange allows, or not, to its participants. This part of the description can prove a rather difficult task, as the rules to be followed may not always be as self-explanatory as they should be:

You see the Swiss ... they used to maintain two markets, SWX and Virt-X, for their big caps ... now they have merged ...
but when you phone to get a confirmation about the rules you should apply, it is not so clear: the rulebooks of both structures are still available on their website, and the reporting mechanisms used to be different. ... In the end you don’t know if you need to give a single trader ID to a dedicated algorithm or if you can address the issue with a single trader ID to the algo box. (a compliance officer, March 2009)
Besides these hermeneutic issues, the compliance officer needs to keep up to date with the changes implemented by market structures: for example, the restructuring of the German markets in 2009 required ‘heavy paperwork, in order to keep [the formal administrative existence of codes] up-to-date’. This is all part of the compliance officer’s work: enacting acceptable practices through dedicated devices (rulebooks, procedures and codes of conduct). However, we see that access to algorithmic behaviour is already a complex construction, requiring discussions, interpretations and translations that blend different points of view, in order to try to make these fit
Downloaded from tcs.sagepub.com at UNIV OF ROCHESTER LIBRARY on December 30, 2011
Theory, Culture & Society 28(6)
within a dedicated frame, whereas at the same time some markets do not ask for such a formalization. Moreover, it is the compliance officer's duty to make sure that the algorithmic device will indeed be fully framed with limits defining a set of impossibilities, so that when launched into the market the device does not impair price formation processes or have too great an impact on the quality of liquidity. This point is rather difficult to document, as the relationship between IT specialists and compliance experts often revolves around the different languages – sometimes mutually exclusive – that they resort to. The level of technicity, implied both by the very coded nature of the algorithm and by the complex codification of market behaviours in regulatory texts, does not help reconcile the expression of a common reality. Discrepancies may therefore arise between those two sets of expressions held together within the algorithm; our descriptions have shown how fragile the existence of a shared apprehension of market reality is. Gaps between what the algorithm performs and the description provided to market supervisors, on the one hand, or conflicts between underlying trading intentions and effective trading results in the markets, on the other, constantly entangle these different views. The algorithmic scene is thus set for the description of the controversy generated by the meeting of potentially conflicting codings: on one side, the IT code replicating trading practices; at the other end of the spectrum, the codes of conduct intended to frame said practices. Misalignments between these two orders result in trading issues, revealing how modes of regulation are challenged in contemporary markets where algorithms proliferate.

Caught in the Controversy: Codes and Codings in Crisis

What happens when codings (algorithms) conflict with codes of conduct (the accepted body of rules) in the marketplace?
And what do we learn about the regulation of financial practices?

The Case: A Simple 'Algo Issue'
On Wednesday, 4 November 2008, a stamped envelope is delivered to the attention of the compliance department of Global Execution Services (GES), a Paris-based brokerage house. It contains a letter signed by a member of the London Stock Exchange market surveillance team, requiring written explanations about a 'Large Erroneous Order' electronically sent to the market in late October of the same year. The letter, though very courteous, firmly expresses the need for GES to better monitor its order management system and the related filters that should prevent erroneous orders from running into the public order book:

I am writing with regard to an incident that occurred on October 29th, when your firm submitted an extremely large order onto the buy side of AXT
Lenglet – Conflicting Codes and Codings
order book during the closing auction. This order was for a size of 1,184,966 shares at a price of 40.4p, with the previous automated trade in the stock being 37.5p. The order remained on the book for a period of 13 seconds and lifted the price significantly, then being deleted moments before the scheduled uncrossing. After contact with our Market Supervision team it was established that this order was in fact erroneously submitted by an algorithm.
The compliance officer in charge is then asked to provide details about the reasons that led to these orders being placed in the book, together with an explanation of the underlying strategy pursued by the client. Indeed, the orders had been submitted 'very close to the running of the uncrossing algorithm [the market's algorithm calculating the closing price] and due to the price and size, had a strong impact upon the indicative uncrossing price disseminated by the Exchange'. Finally, an explanation of the controls in force is also required, along with the measures to be introduced in order to prevent such behaviour in the future. The issue, as exposed by the LSE market surveillance team, relates to the allegedly uncontrolled activity of an algorithmic trader, who sent instructions to the market and then deleted them after a rather 'long' period of time (13 seconds), thereby modifying the natural ecology of the order book and displaying false and misleading information to other market participants during a critical phase of the day (the 'fixing', which serves as a reference, for instance for pricing managed funds). Whether this had been done purposefully or not was the reason the stock exchange requested an explanation. For the compliance officer, this meant looking into those 'bloody algo issues' for the third time that month.

Tackling the Issue: The Compliance Officer's Answer to the Market
The compliance officer gets up and goes to the Algorithmic Trading Team sitting at a nearby desk. Once the trader responsible for the supervision of the algo has been identified ('ah . . . yes, it must have messed up . . . sorry we didn't see this before . . . it's been a nightmare today with so many trades to monitor that I'm surprised it actually happened so late'), explanations are soon formalized in a letter, allowing the compliance officer to reply to the market:

Further to your letter mentioned above, we are pleased to provide you with the following information. You will find hereafter a chronology of what happened, based on the information gathered from our Algo team:

At 16:34:08: An algorithmic trader enters a buy order for 500,000 AXT shares.

At 16:34:50: Order sent to the algo by the trader, who entered 5,000,000 shares in the automat instead of 500,000 by mistake.

At 16:34:51: The algorithm automatically split the order into 4 orders (3 orders of 1,271,678 shares and 1 order of 1,184,966 shares) with a limit of
41.2p. As the last traded price was 37.5p, the maximum price for the order, calculated by the Automat, was 41.2p (10% max deviation filter).

At 16:34:59: The trader realised his mistake and immediately cancelled the order (9 seconds after the sending of the order). As the market was closing, 3 of the 4 orders have been cancelled (respectively at 16:34:59:893; 16:34:59:933; 16:34:59:936), but the last one had been executed for 1,184,966 shares at 40.4p on the close.
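As an aside for readers unfamiliar with such filters, the two controls at work in this chronology – the splitting of the parent order and the 10% maximum deviation cap on the limit price – amount to a few lines of arithmetic. The following sketch is ours, not the broker's: the actual GES code is proprietary (and its real split was uneven), so the function names, the equal-sized child orders and the tick-truncation rule are assumptions; only the 10% cap and the prices come from the letter.

```python
# Illustrative sketch of the two controls described in the chronology:
# a four-way order split and a 10% maximum price deviation filter.
# Function names, equal-sized children and the 0.1p tick truncation
# are assumptions for illustration only.

def price_cap(last_traded: float, max_deviation: float = 0.10,
              tick: float = 0.1) -> float:
    """Highest limit price the deviation filter allows, truncated to the tick."""
    raw = last_traded * (1 + max_deviation)
    n_ticks = int(raw / tick + 1e-9)   # truncate; epsilon guards float noise
    return round(n_ticks * tick, 1)    # one decimal, since the tick is 0.1p

def split_order(quantity: int, n_children: int = 4) -> list[int]:
    """Split a parent order into n child orders (remainder goes to the last)."""
    base = quantity // n_children
    children = [base] * n_children
    children[-1] += quantity - base * n_children
    return children

last_price = 37.5                  # pence: previous automated trade in AXT
print(price_cap(last_price))       # 41.2 -- the limit quoted in the letter
print(split_order(5_000_000))      # four child orders summing to 5,000,000
```

Note that the cap reproduces the 41.2p limit reported in the letter: 37.5p × 1.10 = 41.25p, truncated to the 0.1p tick.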
After having provided additional details on the controls in place, the compliance officer ends his letter by reiterating GES's commitment to better monitor algorithms in the future. Internally, he decides to send a strong reminder to the traders and to ask the Head of Execution to review the filters in place in order to prevent other 'fat fingers', knowing that a new algo issue could lead to a reprimand or even a fine (and, in the worst case, to GES's access to the market being withdrawn). In this case, even though the mistake originates with the trader's 'fat finger', the algorithm plays a decisive role, in that it does not leave the trader any time to cancel and modify the instruction: within a second, the order is managed by the automat, thereby closing the space for the interpretation and correction of the course of action. Such a situation would probably not have occurred in a different setting, where the trader would have been working the order himself. Mediating the relation between the trader and the market, the algorithm exemplifies the fixation of (mis)behaviours. The IT coding has been part of a conflict with the rules and regulations of the LSE, which immediately questioned the underlying intention indirectly expressed by such an instruction (in the letter, the market surveillance team noted the possibility of an abusive market practice impacting the prices of the closing auction, one of the strategic moments of the trading day).

Some Consequences for the Market's Ecology
The case reveals a series of interesting elements regarding the framing of machine-based practices and their consequences for regulation. The algorithm used by the trader lies at the heart of the controversy, implicating a market structure (the London Stock Exchange) with its own algorithms calculating the closing auction price (referred to as the 'uncrossing') to be displayed in the order book, together with its market surveillance team and systems, and an intermediary (GES) with its own devices – including servers, wires allowing the processing of orders, embedded filters and screens used by the support teams to monitor the trading flows. The link between the two institutions (the intermediary and the market), and their resulting roles within the market space, is embodied in the algorithmic behaviour. At least half a dozen people interact directly in this issue, concatenated with the IT code acting in the marketplace and the codes of conduct ruling over market behaviours. The empirical material provided here stresses the difficulty arising from the growing use of algorithmic instructions in markets. Framing the
practice once it is delegated to the code is not a simple task: it demands constant attention (the algorithmic trading team spend their whole day monitoring the executions on their screens) and caution (when setting the parameters and 'giving life' to the algo). Despite these efforts, errors do occur, and they are almost inevitably not easy to reverse – either because the algorithmic machines are too fast (acting within milliseconds), or because they somehow 'fix' the actions according to a frame that is not so flexible. This may well provide us with an indication regarding our question about the impact of algorithmic equipment on the regulation of financial markets. If algorithms are thought of as tools intended to help traders when they want to engage in markets, then the question of the ability of such systems to be acted upon becomes a crucial one. It is not simply a question of what can be done in terms of IT coding; it is rather a question of how compliance officers, market supervisors and market regulators can make users manage more efficiently their algorithms, which contribute to the shaping of market materiality. The issues faced in day-to-day practices, which we can read through the conversations between clients, intermediaries and markets, show that traders need to remain adaptable in ever-changing market contexts, especially in times of crisis. Sometimes this may not be possible once the actions are framed within the algorithm. Traders who were accustomed to searching for a good price for their clients, working the order throughout the day, would now most likely use a VWAP algo to achieve the same task. In so doing, they would be in a position to provide their client with an average price to four decimal places. But they would also, as a consequence, lose the adaptability necessary to manage tricky situations in times of market distress.
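The VWAP benchmark invoked here is simply the volume-weighted average price of the day's fills, and the four-decimal precision falls straight out of the arithmetic. A minimal sketch – the function and the (price, quantity) trade representation are ours, for illustration only, not the broker's:

```python
# Volume-weighted average price (VWAP): the benchmark that a VWAP
# algorithm tries to match by spreading a parent order over the day.
# The (price, quantity) trade representation is an assumption.

def vwap(trades: list[tuple[float, int]]) -> float:
    """VWAP over (price, quantity) fills, reported to four decimal places."""
    notional = sum(price * qty for price, qty in trades)
    volume = sum(qty for _, qty in trades)
    return round(notional / volume, 4)

# e.g. three fills (in pence) executed over the trading day
print(vwap([(37.5, 1_000), (37.8, 500), (37.2, 1_500)]))   # 37.4
```

The point of the passage stands out clearly in the code: the client gets a precise average, but the order is worked to a formula rather than to the trader's judgement.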
Concluding Remarks: Making Algorithmic Codings Act 'by the Code'?

Crises resulting from the confrontation of codings (algorithms) and codes of conduct (market rules and regulations) allow us to question recent developments in marketplaces. Our article has suggested that contemporary financial markets are experiencing an important change with respect to the nature of their regulation. Our purpose here has been to contribute to the identification of such a change, best expressed in the misalignment between codings and codes of conduct. What is at stake with such a misalignment? At least three points, which I can summarize as follows:

1. The rapid dissemination of algorithms questions the ability of regulatory functions to keep a grip on the new spaces enacted by those devices. Ethnographic fieldwork has allowed us to provide an insight into the possible representations of trading algorithms, and to delineate one of the multiple controversies they generate. The change at stake here is the rigidifying of practices
through their encapsulation in a coded script intended to increase the efficiency of transactions in financial markets. Fixing states of market reality in predetermined patterns raises questions about the monitoring of market behaviours. The case presented shows that even simple errors such as 'fat fingers' cannot easily be identified, as the algorithm somehow 'forces' the hand of the trader and contributes to the development of the 'fat finger', while denying his ability to exercise his judgement. At the same time, the machine contributes to the blurring of representations in the marketplace. When time and space are reduced to dimensions in which it is materially impossible to (inter)act, then the regulation of actions may be problematic.

2. This point is linked to a second one: if trading algorithms add a strong mediation between traders and markets, they also express a mix of different views that further impedes the attribution of accountability when issues arise. While market regulations generally require that intermediaries appoint someone to supervise the uses of trading algorithms, they do not address the problem from a practical point of view. What happens in front of the trading station? Indeed, the mediation created by the algorithm changes the relation that traders have to their delegated actions in the markets (see the thoughts of a senior trader quoted above). If this new relation between the human trader and the trading machine does not favour the accountability of the human for what his machine does (partly as a result of the blurring of representations that I have been describing), then how do regulations and sanctions get applied? This makes algorithms, and the controversies they aggregate, a political issue for the regulation of markets.

3.
Finally, the conflict between codings (algorithms) and codes of conduct (rules and regulations), once expressed as a construction involving different points of view and keeping these together, emerges as the closing of a space rather than the opening of hermeneutic possibilities. What I mean here is that even before the action takes place in the market, the space of possible interpretations is closed within the tool. This point only emerges derivatively from the material presented here, but it certainly raises challenging issues for regulators, especially in situations where the hermeneutics of regulatory principles are of major importance when assessing the nature of a trading practice. In a sense, it is the distinction between means and ends that is at stake here, with an emphasis on the role of technologies in causing dislocations in order to readjust (Latour, 2002: 258).
When professionals recognize that the 'shift from phone to low-touch trading will not be an effortless transition' (Simon and Morgan, 2007: 10), they mean that practices are grounded in a certain kind of mediation that still generates defiance. The question is not that of the possibly increased danger of using algorithms, but rather that of the efforts to be deployed by institutions and market participants, if they are to describe the detours and reconfigurations they are facing with the advent of the algo revolution. Therefore, regulators should seriously ponder the idea that 'equipment matters: it changes the nature of the economic agent, of
economic action, and of markets' (MacKenzie, 2009: 13). To us, it is in this respect that algorithms raise a political issue, best expressed through the blurring of means and ends for the participants, thereby acknowledging the calibration problem identified by Beunza and Stark (2004). If the rise of the machines has obviously begun, the homo algorithmicus is not a grown adult yet.

Notes

1. 'In order to protect the confidential nature of Citadel's high frequency work, Citadel has instituted a number of physical and electronic security measures. The physical security measures include: limited ID-card access to the building and certain floors, security cameras, and a dedicated Citadel security team. . . . The electronic security includes password protection and encryption of relevant computer systems, computer-access limitations, and a second level of encryption for some source code. Access to high frequency source code is given out on a need-to-know basis only and is strictly limited . . .' (see Circuit Court of Illinois, 2009: 14).

2. An order book can be described as a device representing the market for a dedicated financial instrument. Although markets have their own specificities, it usually displays the following information for each side (bid/ask) of the market: the quantity of instruments available, the price, and the number of orders available at each price.

3. Of the numerous controversies already generated as a reaction to the development of social studies of finance, one can refer to paradigmatic cases such as Mirowski vs Callon (and MacKenzie) on performativity in 2004 or, more recently, Williams vs Beunza on the goals of such studies of financial markets, in 2010 (cf. Beunza, 2010).

4. One exception to this statement is to be found in Millo (2007).

5.
Materials used here were gathered during a long-lasting participant observation at a pan-European brokerage house based in Paris (October 2006–September 2009), which I have renamed Global Execution Services for the sake of anonymity. Recognizing with Charles Smith that 'to access and grasp a market as a definitional practice . . . it is necessary to become immersed within the market as a true participant observer' (Smith, 2007: 34), I served as an Equities Compliance Officer for three years, during which I observed sales, analysts, traders and support functions every day, chatting with them and recording conversations, taking part in the formation of practices in the field. I also introduced artefacts into the research design, such as brochures, emails, and conversation transcripts originating from structured interviews, semi-structured interviews or informal discussions.

6. See for example Simon and Morgan (2007: 12ff.): 'Certain algorithms have also been attracting recent attention due to their ability to consider publicly disseminated information that has a tendency to move a stock. Traders like the idea of algorithms that take into consideration breaking news stories or press releases before making a decision. . . . Algorithms will continue to play a role similar to the evil villain that comes back to haunt the hero in any theatrical movie. Algorithms are breaking apart orders and making markets more efficient, but at the same time creating an environment where traders are having a significant
amount of difficulty in finding their required liquidity. . . . With the use of smart order routing technology, the industry is on the heels of artificial intelligence-like abilities. More and more, bulge-bracket brokers are offering their strategies to mimic the actions conducted by traders and replace intuition with a machine. The next generation of algorithms will automatically interpret what is moving the market, when to trade and make the most optimal decision.'
Acknowledgements

An earlier version of this article was first presented at the SSFA 'Reembedding Finance' Workshop in Paris (20–21 May 2010). I would like to thank the participants for their suggestions, and especially Yuval Millo for his insightful comments on a subsequent version of the text.
References

Abolafia, M. (1996) Making Markets: Opportunism and Restraint on Wall Street. Cambridge, MA: Harvard University Press.

Alloway, T. (2009) 'The Cold War in High Frequency Trading', 8 July, http://ftalphaville.ft.com/blog/2009/07/08/60761/the-cold-war-in-high-frequency-trading/ (accessed 5 March 2010).

Autorité des Marchés Financiers [AMF] (2009) Key Issues Arising from the Emergence of Dark Pools and Crossing Networks: Press Backgrounders, 20 October. Paris: AMF.

Berenson, A. (2009) 'Arrest Over Software Illuminates Wall St. Secret', New York Times, 23 August.

Beunza, D. (2010) 'Political Economists Denounce Social Studies of Finance for Overlooking the Political', blog post dated 21 May 2010. Available at: http://socfinance.wordpress.com.

Beunza, D. and D. Stark (2004) 'Tools of the Trade: The Socio-technology of Arbitrage in a Wall Street Trading Room', Industrial and Corporate Change 13(2): 369–400.

Beunza, D. and F. Muniesa (2005) 'Listening to the Spread Plot', in B. Latour and P. Weibel (eds) Making Things Public: Atmospheres of Democracy. Cambridge, MA: MIT Press.

Beunza, D., I. Hardie and D. MacKenzie (2006) 'A Price Is a Social Thing: Towards a Material Sociology of Arbitrage', Organization Studies 27(5): 721–745.

Bookstaber, R. (2009) 'The Arms Race in High Frequency Trading', 21 April, http://rick.bookstaber.com/2009/04/arms-race-in-high-frequency-trading.html (accessed 5 March 2010).

Callon, M. and F. Muniesa (2005) 'Economic Markets as Calculative Collective Devices', Organization Studies 26(8): 1229–1250.

Callon, M., Y. Millo and F. Muniesa (2007) Market Devices. London: Blackwell.

Casey, J.-P. and K. Lanoo (2009) The MiFID Revolution. Cambridge: Cambridge University Press.
Chaboud, A., B. Chiquoine, E. Hjalmarsson and C. Vega (2009) 'Rise of the Machines: Algorithmic Trading in the Foreign Exchange Market', International Finance Discussion Papers, No. 980, October. Washington, DC: Board of Governors of the Federal Reserve System.

Circuit Court of Illinois (2009) Citadel Investment Group LLC vs Teza Technologies, LLC, et al., Memorandum Opinion and Order, No 09 CH 22478, 16 October.

Deutsche Börse (2009) Exchange Rules for the Frankfurter Wertpapierbörse, FWB1 updated 3 August, http://deutsche-boerse.com (accessed 25 September 2009).

Domowitz, I. and J. Wang (1994) 'Auctions as Algorithms: Computerized Trade Execution and Price Discovery', Journal of Economic Dynamics and Control 18: 29–60.

European Union (2004) Directive 2004/39/EC of the European Parliament and of the Council of 21 April 2004 on markets in financial instruments amending Council Directives 85/611/EEC and 93/6/EEC and Directive 2000/12/EC of the European Parliament and of the Council and repealing Council Directive 93/22/EEC. Brussels: European Commission.

European Union (2006) Commission Directive 2006/73/EC of 10 August 2006 implementing Directive 2004/39/EC of the European Parliament and of the Council as regards organisational requirements and operating conditions for investment firms and defined terms for the purposes of that Directive. Brussels: European Commission.

Fligstein, N. (2001) The Architecture of Markets: An Economic Sociology of Twenty-First Century Capitalist Societies. Princeton: Princeton University Press.

Jeffs, L. (2009) 'Algorithmic Trading Comes of Age', Financial News, 24 August.

Lash, S. (2001) 'Technological Forms of Life', Theory, Culture & Society 18(1): 105–120.

Latour, B. (1987) Science in Action: How to Follow Scientists and Engineers through Society. Cambridge, MA: Harvard University Press.

Latour, B. (2002) 'Morality and Technology: The End of Means', Theory, Culture & Society 19(5/6): 247–260.

Lee, R. (1998) What is an Exchange?
The Automation, Management, and Regulation of Financial Markets. Oxford: Oxford University Press.

Leinweber, D. (2009) Nerds on Wall Street: Math, Machines and Wired Markets. Hoboken, NJ: John Wiley & Sons.

MacKenzie, D. (2005) 'How a Super-portfolio Emerges: Long-Term Capital Management and the Sociology of Arbitrage', in K. Knorr Cetina and A. Preda (eds) The Sociology of Financial Markets. Oxford: Oxford University Press.

MacKenzie, D. (2006) An Engine, Not a Camera: How Financial Models Shape Markets. Cambridge, MA: MIT Press.

MacKenzie, D. (2009) Material Markets: How Economic Agents Are Constructed. Oxford: Oxford University Press.
Millo, Y. (2007) 'Making Things Deliverable: The Origins of Index-based Derivatives', in M. Callon, Y. Millo and F. Muniesa (eds) Market Devices. London: Blackwell.

Muniesa, F. (2003) Des marchés comme algorithmes: sociologie de la cotation électronique à la Bourse de Paris (unpublished PhD thesis). Paris: École des Mines de Paris.

Muniesa, F. (2008) 'Trading Room Telephones and the Identification of Counterparts', in T. Pinch and R. Swedberg (eds) Living in a Material World: Economic Sociology Meets Science and Technology. Cambridge, MA: MIT Press.

Preda, A. (2006) 'Socio-technical Agency in Financial Markets: The Case of the Stock Ticker', Social Studies of Science 36(5): 753–782.

Rosen, M. (2009) 'Black Box or Tool Box: Changing Technologies for Evolving Marketplaces', in B. Bruce (ed.) Liquidity II: A Guide to Global Liquidity. New York: Institutional Investor.

Simon, M. and C. Morgan (2007) 'The Algorithmic Revolution – What It Is, Who's in It and Where Is It Heading?', in J. Lee (ed.) Algorithmic Trading: A Buy-side Handbook. London: The Trade.

Smith, C. (2007) 'Markets as Definitional Practices', Canadian Journal of Sociology 32(1): 1–39.

Southern District of New York (2009) United States of America vs Sergey Aleynikov, Violations of 18 U.S.C. §§ 1832(a)(2), 2314, & 2, 4 July.
Marc Lenglet is Assistant Professor at the European Business School Paris, Management & Strategy Department. He received a PhD in Management from Paris-Dauphine University (2008), a Master in Philosophy from Paris-Sorbonne University (2003), and graduated from ESSEC Business School (2004). With interests in phenomenology and anthropology, his research focuses on the dissemination of norms within practices, financial objects (such as trading algorithms) and financial ‘things’ (such as liquidity). [email:
[email protected]]
From a Biopolitical ‘Will to Life’ to a Noopolitical Ethos of Death in the Aesthetics of Digital Code Anna Munster
Abstract In a range of digital creative productions and digital culture, questions of how to deal with finitude are on the rise. On the one hand, sectors of the digital entertainment industry – specifically computer games developers – are concerned with the question of how to manage 'death' digitally. On the other hand, death and suicide have become the impetus for humorous artistic expression. This article tracks the emergence of a digital ethos that is cognizant of consequence, finitude and even death. Rather than pit a 1990s 'will to life' against an emerging 'death drive', I argue that the shift to an ethos in which dark consequences ensue from digital actions must be understood by working through digital code's technicity and unfolding this relation of technics to both ethics and politics. Although Bernard Stiegler's analysis of technicity goes some way toward unfolding a political analysis of the aesthetics of digital code, his articulation of noopolitics fails to provide us with a way to conduct ourselves digitally in an era of cognitive capitalism. I look to critical software practices and their provisional networked publics, with potential lines of flight for contemporary technoculture via novel digital 'codings'.

Key words: art, biopolitics, digitality, noopolitics, technics

IN THE latter part of the 20th century digital code was deeply imbricated in an aesthetics and ethics of life or, as Nikolas Rose has put it, 'life, itself' (2001). This seemingly implacable entanglement of life and code hit its stride in 2000 when the then US President, Bill Clinton,
Theory, Culture & Society 2011 (SAGE, Los Angeles, London, New Delhi, and Singapore), Vol. 28(6): 67–90 DOI: 10.1177/0263276411417458
and UK Prime Minister Tony Blair and leaders of the Human Genome Project (HGP) announced a 'working draft' of the sequencing of the human genome. Life was declared virtually 'decoded'. There was, of course, a long march toward the digital decoding of life that began somewhere midcentury. The formal acknowledgement in 1948 by Norbert Wiener that feedback, drawn from information theory, could be measured across both engineered and biological systems is one catalysing event (1965: vii). The period from 1953 to 1961, in which the combined efforts of mathematicians, biophysicists, cryptographers and other researchers led to 'cracking the genetic code' of DNA, is another crucible for entwining code with the living organism (Kay, 2000: 8). But while these and a number of other 20th-century periods contributed considerably to rendering life digitally, it is in the 1990s and early 2000s that the digital literally came to be seen as the 'natural' medium for the living organism. In 2002, for example, Craig Venter, CEO of Celera Genomics, the company that first announced in 2000 a draft sequence of the entire human genome, declared that: 'Within 10 years, before a baby leaves the hospital, their parents will have the essence of their genetic code on a CD' (Venter, 2002). During the same period that scientists, genetic companies and politicians declared that human life had been sequenced and therefore decoded, digital artists were busy populating the emerging arena of new media art. Here digital art was spawning experiments in 'artificial life' and 'hybrid nature', including animations of strange cyborgian plants and robotically watered 'gardens': computer simulations with forms that claimed to replicate and hence be 'alive'. Even electronic musicians were generating tunes using genetic algorithms.
The 'will to life' coursing through these more experimental digital art projects entered the broader sphere of digital culture by way of virtual pets such as 'norns' that could be installed on computer desktops to live, 'breed' and evolve over generations.1 Lately, though, this 'will to life' seems to have slowed, in terms of aesthetic production at any rate. Indeed, a rather more sombre mood has crept into new media aesthetic production, evidenced by the emergence of a preoccupation with digital death. This 'death drive' materializes in games such as Sony PlayStation 3's Heavy Rain (Cage, 2010), where the four main characters can die and cannot be 'reloaded'. Instead, the game narrative changes consequentially, rearranging itself around these digital deaths. A preoccupation with the relation between the digital and death materializes elsewhere. In the aftermath of the increased use of online communications by the US military during the war in Iraq, a veritable industry has grown up to safeguard and manage 'digital estates' (personal online access to content in emails, blogs and social media) in the event of physical death. Could these varied examples hint at a change in mood, a decline in the fanatical quest for growth and 'life' that coloured so much artistic production, drove the quest to unravel the secret code of the organic in genomic research, and, indeed, informed the expansionist logic of online networking
as digital lifestyle during the late 1990s and early 2000s? The most brazen example of a shifting ethos could be found in the Web 2.0 Suicide Machine, a ‘piece of socio-political net-art’ (McNamara, 2010) created by moddr, a new media laboratory of artists, designers and programmers from Rotterdam.2 The Web 2.0 Suicide Machine is a ‘python script’ (Python being a programming language frequently used for web applications) running on moddr’s web server, which launches a browser session and automates the process of disconnecting from social networks. Participants submit their login details for social media sites such as Facebook, MySpace, Twitter and LinkedIn, and the session erases online traces of the participants’ content and contacts in their online social networks. Although it recruited only around 900 ‘suiciders’ in its first iteration between November 2009 and January 2010, the site had such an impact that Facebook brought a ‘cease and desist’ order against it.3 Legal wrangles with Facebook briefly halted the ‘machine’ while its coders routed around its potential breaches of users’ online ‘rights’ to network life. The beauty of the ‘script machine’ is that rather than simply deleting an account (which in some portals such as Facebook is actually impossible, as accounts are stored in perpetuity on separate servers), the code functions to delete content and cut off online ‘friends’. It thus takes aim at the very heart of social media’s expansionism and its substance; that is, to add more and more, to ‘grow’ one’s personal network. Eric Kluitenberg has suggested that a network society – that is, a culture in which ceaseless connectivity is the imperative – must also provide a collective mode, a ‘right’, to disconnect from participatory technologies (2008: 272–76). 
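The moddr source itself is not reproduced here, but the pattern just described – authenticate with user-supplied credentials, then delete content and contacts one by one rather than deleting the account – can be sketched in a few lines of Python. Everything in this sketch is hypothetical: `MockNetworkSession` and its method names merely stand in for a real browser-automation session against a social network.

```python
# Illustrative sketch only; the actual Web 2.0 Suicide Machine code differs.
# The point is the logic: empty the account rather than close it.

class MockNetworkSession:
    """Hypothetical stand-in for an authenticated browser-automation session."""
    def __init__(self, posts, friends):
        self.posts = list(posts)
        self.friends = list(friends)

    def delete_post(self, post):
        self.posts.remove(post)

    def unfriend(self, friend):
        self.friends.remove(friend)

def erase_traces(session):
    """Delete all content and cut off all contacts, leaving an empty but
    still-existing account (accounts themselves often cannot be deleted)."""
    for post in list(session.posts):       # copy: we mutate while iterating
        session.delete_post(post)
    for friend in list(session.friends):
        session.unfriend(friend)
    return session

session = MockNetworkSession(posts=["status 1", "photo"], friends=["ann", "bo"])
erase_traces(session)
print(len(session.posts), len(session.friends))  # → 0 0
```

The design choice the article highlights is visible in `erase_traces`: the account object survives, but its ‘network life’ – content and friends – is removed piece by piece.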
Rather than ‘dropping out’, critical software projects such as Web 2.0 Suicide Machine perhaps fit the category of what Kluitenberg calls ‘mindful disconnection’, in which a moment and place for digital ‘death’ is carefully thought through (2008: 272). In this article, I want to track the rise of this emerging digital ethos that is cognizant of finitude, consequence and even death – the spread, perhaps, of a sobering digital mood. In a range of digital creative productions and digital life, questions of how to deal with finitude are on the rise. On the one hand, sectors of the digital entertainment industry – specifically, computer games developers – and new start-up industries are concerned with the question of how to manage the relations between death on a physical and digital level as each level begins to have impact upon the other. On the other hand, death and suicide are also becoming the impetus for humorous creative expression. Geert Lovink has pointed to a nihilistic impulse in contemporary online culture, where ‘meaning’ is erased from media via incessant blogging about ‘nothing’ (Lovink, 2008: 1–38). But the incorporation of a death ethos into digital culture is not at all about erasure and does not necessarily cluster around a homogeneous set of images or ideas in the way that the digital ‘will to life’ consistently clustered around the biosciences. There are a number of heterogeneous flows that intersect with but also diverge from each other amid this sobering mood and sombre ethos. 
As I suggest later in this article, simulation training environments for US military combat, developed by the ‘serious games’ branch of the digital entertainment industry, have had to respond to the real situation of extended Middle East warfare and troop deployment. A new kind of simulation and gaming environment has been recently developed that emphasizes the ‘non-kinetic’ aspects of military combat such as: . . . mentoring host country security forces, intelligence collection, information operations, providing essential services, increasing local employment, capacity-building, respecting local religious, ethnic, and other sensitivities, and so forth. (Brynen, 2009)
What kinds of software operations are being deployed to run such a simulation? Furthermore, what kind of embodied dynamic does this circumscribe for the user of this simulation with the digital technics of such an environment? Is digital code’s aesthetics searching for a different mode of conduct that fits life lived under conditions of perpetual war and political and financial crisis – an ethos that acknowledges mortality, finitude, consequence and local variability rather than one that treats the digital as merely an opportunity to inconsequentially ‘reload’ and refire? Rather than pit a 1990s ‘will to life’ against an emerging ‘death drive’, I argue that the shift to an ethos in which dark consequences ensue from digital actions must be understood by working through digital code’s technicity and unfolding this technicity’s relation to both ethics and politics. Technicity and technics, rather than technologies, give us a conceptual horizon for the unfolding of relations between technologies and aesthetic production, and of the relation of both to the singular unfolding of a particular ethos or, even more specifically, an ‘ethopolitics’.4 Technicity and/or technics are terms that have circulated in the analysis of digital media in part via the influence of Bernard Stiegler.5 I want to suggest that Stiegler’s analysis of contemporary technicity goes some way toward unfolding a political analysis of the relations between ‘life’ and ‘death’ in the recent and current aesthetics of digital code. Specifically, his more recent work is concerned with the over-reaching of biopower into what he terms ‘psychopower’, in which contemporary technicity systematically captures and modulates not simply bodies but also our entire spectrum of attention: ‘. . . the solicitation of attention has become the fundamental function of the economic system as a whole, meaning that biopower has become a psychopower’ (Stiegler, 2010: 103). 
This, according to Stiegler, is particularly the case for contemporary ‘youth’, who across an entire generation have lost their attentional capacities. We must move rapidly, according to Stiegler, from elaborating a biopolitics in relation to biopower to inventing a ‘noopolitics’ that critiques the destitution of attention by psychopower and restores an ethics and practice of care for mind.
Munster – Biopolitical ‘Will to Life’ to a Noopolitical Ethos of Death
But I will also argue that this articulation of noopolitics fails to provide us with a way to conduct ourselves digitally in the light of the spread of technologies and cultures of cognitive capitalism. Indeed, Stiegler at times tends toward a homogenization of computational culture, railing against its ‘program’: ‘computational psychotechnology always aims at substituting for attention, theorizing and modeling attention and its institutions, destroying them by seeming not even to imagine an attention beyond vigilance’ (2010: 102). Stiegler’s strength lies in the ways he draws us toward the shift in biopower that begins to account for why ‘the living’ or ‘life itself’ may no longer be the sole object of biopolitical capture. But I hope to show that if we are now seeing a shift in the apparatus of biopower toward a more concentrated capture of the cognitive by computational technics, we should remain aware of the differentiating mechanisms and flows that constitute this ‘neural’ turn. The emergence of digital death as trope and ethos across a range of aesthetic practices in computational culture signals that noopolitics can be recuperated by an economy of cognitive capitalism. I will consider this recuperation in the light of a deliberate shift in US military strategies toward the elaboration of ‘noopolitik’ as a new strategy of soft power in the light of failures in recent US initiatives in the Middle East. This shift has been played out as much within the military’s alignment with the gaming branch of the entertainment industry as it has in actual military manoeuvres. Even so, other spheres of aesthetic production point toward the productive and differentiating potential of practices of digital ‘coding’. The Web 2.0 Suicide Machine, for instance, declares that one can exit digital ‘life’ humorously and that such disconnecting practices might have a role of differentiation vis-à-vis generations. 
As I will suggest toward the end of this article, we need to understand how digital publics and crowds – rather than ‘generations’ – form in conjunction with such practices. For such indication of the differential conduct of digital coding marks transformations – lines of flight – for contemporary technoculture.

Digital Life Itself

The apotheosis of life’s codification was reached when geneticists, flailing in media hype, referred to the sequencing of the human genome as ‘the book of life’. Not long after that millennial announcement, genetic code became more intricately tied to digital encoding, when the final draft of the genome sequencing was described as not only – as Venter had suggested – storable on, but similar to, a ‘classic CD’ (Noble, 2003). But there are many other instances, especially in the burgeoning field of digital art in the 1990s, of code’s will to life – of its enumeration as ‘life-like’ and of life as ultimately encodable. Projects such as Thomas Ray’s Tierra, a computer simulation of ecologies of artificial life, which was ‘born’ on a PC in 1990; Christa Sommerer and Laurent Mignonneau’s 1992 Interactive Plant Growing, which interactively connected the physical stimulation of live
plants by participants to the ‘growth’ of 3D representations of artificial plants on a computer monitor; and Ken Rinaldo’s robotic network of arms wrapped in organic grapevines, Autopoiesis (2000), carve out a field of aesthetic practice and research caught in the thrall of a will to life. Looking back at the period, it seems no coincidence that this digital bioaesthetics aligns culturally and socially with the extensive thickening of biopower: that is, the extension of regimes of political and economic control to every area of ‘the living’, including the choices made in how one lives, managing one’s life, urban planning, the management of population movement and so on (Lazzarato, 2002: 102). Research into the rise and dominance of the molecular within the life sciences and the development of techniques for managing the vast flows of subsequent data (Rabinow, 1992: 241), and into the increase in techniques for managing the life of the self (Rose, 2001), especially during the 1990s and early 2000s, has articulated this biopolitical, bioethical hold and spread and its imbrication with digital technologies of information. Digital aesthetics, too, seemed in the grip of a fascination with the molecular level of programming and generating code – witness the rise of evolutionary and genetic algorithms in artistic practice. Equally, it was obsessed with interactivity as a defining feature of digital media and art (Popper, 2005) and with audience/users exercising a kind of management of choice, via interaction (Edmonds et al., 2004: 113–114). There is, then, within digital art a bioaesthetics at work during much of its formative period; a bioaesthetics that sees the digital as a code to be extended over all scales of life, from choice through to programming. Nikolas Rose has named the form of conduct particular to contemporary biopower by proposing the concept of ‘ethopolitics’ (2001: 2). 
Ethopolitics signals a shift from dominant forms of 19th- and 20th-century biopower that focused upon the management of whole populations via the policies and implementation of, for example, public health. Instead, he argues, biopower in the late 20th century became increasingly molecular, with high-risk individuals being targeted and managed instead of solely the larger aggregate of populations. In turn those individuals, such as people with HIV-positive status, must become managers of their own ‘bios’ (Rose, 2001: 6–7). Thus ‘ethopolitics’ names an entanglement of force and technique – the macro management of life’s flows with the molecular management of the high-risk self. Although I do not wish to stretch the application of ethopolitics too far, it is this macro-micro entanglement peculiar to the biopolitics of the late 20th century that interests me. As I have suggested above, a strong trajectory within digital aesthetics has been its will to life. Following this, we might call digital art’s primary code of conduct during its emergence as practice and form – occurring over the same time span as Rose’s conception of ethopolitics develops – an ‘ethoaesthetics’. This ethoaesthetics names the twin obsession with digital life as generative, propelled by a will to life, and with digital code as something to be manipulated (managed, perhaps) via the individual user’s interactions with computational code.6
One of the exemplary ‘genetic’ art works of the 1990s, in which life and interaction conjoin in such a manner, is Christa Sommerer and Laurent Mignonneau’s 1994 interactive installation A-volve.7 The interface for A-volve is a large, horizontally set, customized touch screen, which appears to be a floating ‘pool’ of blue water. At another touch screen set to one side of this, users draw out a creature’s form two-dimensionally onto its surface. This is then computationally assigned three-dimensionality and animation, generatively appearing in the ‘pool’ environment. Users thus ‘give life’ to a virtual creature. The parameters of the user’s drawn form (their creative ‘expression’) determine the behaviour of the creature in the pool/monitor below. Here the user’s creative ‘choices’ produce a form, which is then used to map onto and select algorithms that the artists have programmed and are stored in the installation’s database. Depending upon the choice of algorithmic combinations conjugated by the drawn form, the ensuing creature gains ‘fitness’ in the virtual environment. It must then adapt and survive via random interactions with other drawn creatures in the artificial environment. Users can interact together with others and their virtual creatures, moving the creatures toward each other to see if they will ‘mate’ or kill the other one off. As Oliver Grau writes of this work: Each artificial life form, each ‘phenotype’ has a ‘genome’, with ninety variable parameters, so no individual creature is alike. Life, as understood in bioinformatics, is understood as information. Here too, the images of life are based on a type of code, and only through its reiteration, the reproduction of texts, can the creatures reproduce. . . . All the creatures owe their ‘existence’ to the visitors’ interaction and to the random interactions between themselves. (Grau, 2009: 172)
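The evolutionary mechanics Grau describes – a parameterized ‘genome’, fitness in a shared environment, selection and reproduction – follow the standard shape of a genetic algorithm. The toy Python sketch below is purely illustrative: A-volve’s actual code is not public, the ninety-parameter genome is shortened here to nine, and the fitness function is an arbitrary stand-in.

```python
# Toy genetic algorithm in the spirit of an artificial-life 'pool':
# creatures with parameterized genomes compete, the fit survive and 'mate'.
import random

random.seed(1)  # reproducible toy run

GENOME_LENGTH = 9  # stand-in for A-volve's reported ninety parameters

def random_genome():
    return [random.random() for _ in range(GENOME_LENGTH)]

def fitness(genome):
    # Hypothetical fitness: how well the 'drawn form' adapts to the pool.
    return sum(genome)

def mate(a, b):
    # Uniform crossover: each parameter inherited from either parent.
    return [random.choice(pair) for pair in zip(a, b)]

def evolve(population, generations=20):
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: len(population) // 2]   # weaker creatures 'die'
        offspring = [mate(random.choice(survivors), random.choice(survivors))
                     for _ in range(len(population) - len(survivors))]
        population = survivors + offspring
    return population

pool = [random_genome() for _ in range(10)]
initial_best = max(fitness(g) for g in pool)
pool = evolve(pool)
print(max(fitness(g) for g in pool) >= initial_best)  # → True
```

Because the fittest creatures always survive each round, the best score never decreases; in the installation itself, of course, 'fitness' is also shaped by users' gestures rather than by selection pressure alone.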
The logic of interaction and life in A-volve appears to mimic a ‘natural’ evolutionary movement in which creatures are born (creatively expressed) and then live (are algorithmically supported) according to whether they adapt to the already ‘living’ (already existing algorithmic variations of virtual forms) virtual environment. And yet Sommerer and Mignoneau also promote a suite of interactive actions for the users of the installation once the creatures enter the shared ‘pool’ environment. So, Sommerer (1994) explains, the user can ‘learn’ gestures such as placing their outstretched hand over the top of the creature, which ‘protects’ their creation against a predator. It is not simply a question, then, of art imitating life but rather of a complex co-determination between the user’s interactive gestures and the code that gives and maintains the virtual creatures’ life. The installation therefore depends upon an evolving relationship between embodied action and interaction and digital technologies being produced, grasped and deployed. After the user has creatively generated life, s/he ‘learns’ to manage the life of his/her creature by learning how to interact with it. At the heart of digital life itself lies an assemblage: the management of the human body’s interaction with generative digital code. We cannot
understand the aesthetics or ethics of an interactive artwork such as A-volve unless we deploy a productive and constitutive conception of technics and technicity.

From Biopower and Technologies of Life to Psychopower and Technologies of Mind

In the work of Bernard Stiegler, technics is ‘organised inorganic matter’ (1998: 17). And although ontologically distinct from either unorganized matter (in the sense of unsystematized phenomena such as minerals in their natural state) or organized organic being such as the human, technics is nonetheless caught up, ontogenetically, with the being of humans. The anthropogenesis of the human is technical and the technogenesis of technical objects is anthropological. Technics produces the human as temporal being; the mnemonic, technical system of language, for example, allows for memory. There is no origin of the technical outside the human or of the human without technics – they are co-evolutionary and co-constitutive. Stiegler means this not simply at the level of the production of the human as social but, importantly, as a meeting of matters that drives this evolving dynamic: The who is nothing without the what, and conversely. . . . The passage is a mirage: the passage of the cortex into flint, like a mirror proto-stage. This proto-stage is the paradoxical and aporetic beginning of ‘exteriorization’. (1998: 141)
At the same time, this relation is socio-technical. The organization of inorganic matter does not simply result in the production of aggregates of technical objects but has its own mode and historicity. Both milieu (cultural specificity) and technical materialities impose form upon the technical (1998: 44). Technicity, following on from this, is a way of understanding an historical or contemporary systematization, at both the level of milieu and that of material, of techno-anthropogenesis. The generalized concept of technicity is not simply a way of thinking through a particular era’s form of organizing technical objects. Contemporary technicity, understood within Stiegler’s Technics and Time project, is the historically specific systematizing of technical objects whose organizing principle resides in the acceleration of time (Stiegler, 1998: 23). It is not simply that things are speeding up. Rather, temporality itself is commandeered by this techno-logic such that future thought, life and tendencies must be ‘programmed’, calculated and invested in. In short, the future – anticipation as an existential territory – is territorialized: Anticipation, at the most global level, is essentially commanded by investment calculation – collective decision-making, temporalisation – in short, destiny is submitted to the techno-economic imperatives regulating this
calculation. This is as well the domination of a certain understanding of time. (Stiegler, 1998: 42)
Tele-technologies, through which online digital games, for example, are increasingly played, are graspable examples of how this dynamic of human-technics simultaneously engages and generates a temporality and a dysfunctional ethico-aesthetic horizon. For Stiegler, tele-technologies, which include television, all forms of digitization and the internet, increasingly capture the full spectrum of human attention and in so doing map a shift to what he calls ‘psychopower’ (Stiegler, 2008). We have shifted, then, from biopower understood as simply the control and modulation of life itself or as the power to produce and modify a given population’s bodies. Increasingly we are dealing with technologies that also capture, control and modulate the neuro-informational circuits of human behaviour, dominant especially in the spheres of marketing and education but increasingly inhabiting and imperializing thought conceived as a broad cultural activity. It is worth quoting Stiegler at length here as he details the sweep of these technologies: The technological, industrial, systematic and constant capture of attention that has been called cultural capitalism has been made possible by the emergence of psychotechnologies, and it corresponds to what I call psychopower. 
The techniques of management and marketing relying on story telling, the quite recent rediscovery, with the publication in French of his book Propaganda, of Edward Bernays, the place of soft power in American geopolitics, the psychological cast, at once affective and drive-based, of the recent French electoral campaign – peoplisation, as the word is now used in Franglais, being nothing but a psychopolitics of symbolic misery –, the incredible dynamism of the new media, the immense effects of the techno-industrial integration through digitalization, Google attempting now to gain control over the cell-phone network and industry, that is, over human motricity, all of that, and so many other phenomena (I could continue the list all night) are concretizations of this psychopower whose paradoxical effect is that the device of the construction of attention is now seriously threatened: its industrial exploitation leads to its destruction. (Stiegler, 2008)
The ethico-aesthetic horizon of psychopower is destitute according to Stiegler’s critique because it erases the possibility for care (especially between generations), or for ‘attending to’ outside the reach of the technoindustrial spectrum. Hence for Stiegler we must summon a politics that would counter this contemporary capture of the ‘noosphere’; we must begin to articulate a noopolitics that describes a critical limit to the destitution of attention by psychopower. Although Stiegler’s generalized techno-logic underlies the temporal unfolding and acceleration of our current system of technical objects, he also suggests an indeterminacy that lies at the genesis of human-technology
dynamics (1998: 43; 204–238). The genesis of the human is for him a technogenesis occurring temporally rather than spatially (1998: 17, 18). Technics is what produces the human’s relation to time; technical systems such as writing and archiving facilitate memory (a past) and then allow for futures to be invented (Beardsworth, 1998: 78). Yet the supplementary logic of the relation of humans to technics – there is no human being without its technical prostheses, that is, without its supplementarity – allows a possibility of indeterminacy with respect to techno-logics. The originary deferred relation between human and technics provides the possibility for humans and technical nonhumans to co-define new temporal horizons and, additionally, events in time.8 The oscillation between a historically unfolding techno-logic and technical indeterminacy extends to Stiegler’s analysis of noopolitics. The latter necessarily unfolds as the limit condition of an exhausted biopower yet must necessarily be cultivated as a potential way out of contemporary culture’s etho-political destitution. The noopolitical is played out in time – that is, in the historical unfolding of biopower’s logic, which has extended to modulate every sphere of life including the life of the mind – precisely because all possible spheres of human cognition are now saturated by psychotechnologies. Hence the very resource being drawn upon – attention – is used up. But it must also be played ‘out of time’, or outside the horizon of the now, by cultivating a different ethopolitics, a noopolitics, through which attention, the affective underpinning of human cognition, is created anew. Perhaps, then, the recent emergence of consequentialism in computer games, the acknowledgement of the limits of digital connectivity and the incorporation of finitude into digital narratives are results of the exhaustion of ‘life’ and ‘growth’ within techno-aesthetics. 
Stiegler’s general theory of technicity and his specific analysis of contemporary biopower, as it shifts from the production of life to the construction of consumer markets based upon the mining of cognitive resources such as attention, provide us with one way to chart the political continuity of a techno-aesthetics from life to death. They may also allow us to propose that a new ethos is in fact being cultivated by some recent aesthetic forays in digital coding that aim, as the Web 2.0 Suicide Machine does, to reconstitute our attention: Tired of your Social Network? Liberate your newbie friends with a Web2.0 suicide! This machine lets you delete all your energy sucking social-networking profiles, kill your fake virtual friends, and completely do away with your Web2.0 alterego. (moddr, 2009)
However, this ethico-aesthetic ‘program’ (perhaps less a program than a deployment of tactical media) remains resolutely encoded by the very tele-technologies – that is, processes of digitization – which underpin psychopower. Although Stiegler declares that he is not against contemporary technical societies per se (2008), his program for movement
away from the reach of cognitive capitalism is via a somewhat conservative call to rethink educational institutions as the milieu in which to re-establish intergenerationalism.9 Noopolitics for Stiegler, then, resides in a resort to an ethics that is wrested from, not productively constituted by, contemporary technics. In the remainder of this article I want to propose that we rethink a conception of the noopolitical in the light of some of the contemporary trends in digital entertainment, culture and aesthetic creation I listed in my opening paragraphs. What is important about Stiegler’s call for a noopolitics is that his analysis of the relations between biopower and psychopower indicates how a supplementary continuity is at work in their techno-logics. This gives us an understanding of how the decline of aesthetic projects oriented toward ‘life’ and growth, and the ascent of digital ‘death’, might be less a rupture than a bifurcation of digital aesthetics within a broader regime of cognitive capitalism. But it fails to take account of, first, the extent to which digital aesthetics recuperates noopolitics for psychopower under the aegis of the military-entertainment complex, which has a strong grip over elements of the computer gaming industry; and, second, the potential of the contemporary dynamic of humans–technics to produce digital lines of flight toward a more radical noopolitics. As such, we need to be on the lookout for where the shift to sobriety in digital aesthetics signals a contestation of the biopolitical ethos of digital life or where it supports the extension of soft power and the ongoing normalization of life. In order to take account of both these movements we need to consider not simply the digital’s ‘code of conduct’, its ethopolitics, but also how code conducts itself in the contemporary aesthetic/cultural sphere.
From Noopolitik to Noopolitics and Back Again

In 1999 John Arquilla and David Ronfeldt, better known for their work on information warfare, had already identified an emerging trend in international relations and military strategy that they termed ‘noopolitik’ (1999: 29).10 They saw it as necessary for the US government and military to fully embrace this emerging mode of conducting strategic and international relations, a result of the growth of the noosphere. This was their preferred term for the then more popular ‘cybersphere’, which, they argued, captured both the latter’s sense of virtual relations and communications and the ideational, ‘knowledge-based’ arena in which information had become a field of power and economic relations and contestations. Noopolitik was a strategic move, as opposed to noopolitics, which describes more the widespread exercise, ubiquity and operation of information as a field of power relations at all levels of society: Noopolitik is an approach to statecraft, to be undertaken as much by nonstate as by state actors, that emphasizes the role of soft power in expressing
ideas, values, norms, and ethics through all manner of media. (Arquilla and Ronfeldt, 1999: 29)
According to Arquilla and Ronfeldt (1999: 35), the last few decades of the 20th century had brought about significant change in the fabric of international relations at the level of players, media and technological infrastructure. The global interconnectivity facilitated by information technologies that allow for the flow of financial, political and media information conjoins with the rise of a ‘new’ civil society in which nonstate actors – NGOs – play a significant role. NGOs operate less like previous liberal corporate players, with a top-down approach to international relations, and more like nodes in networks, monitoring, reporting upon and disseminating information about government, military and corporate agencies via their networks. The ‘noosphere’, then, is this active sensing mesh of information networks that collectively exercise a noopolitics. Like it or not, state actors such as the US military and government must learn to adjust to this global information field and shift away from a strategy of ‘realpolitik’, the exercise of hard-nosed sovereign power governed by the motto ‘might makes right’. A decade and more on from Arquilla and Ronfeldt’s ‘advice’ to the US government and military, it seems the active adoption of ‘noopolitik’ as a US state-sanctioned strategy has only just caught on. In the aftermath of the fall of Baghdad in April 2003 during the US invasion of Iraq, military personnel on the ground found themselves untrained for the protracted occupation that ensued. Computer simulation training has long been an element of US military training, with the use of computational flight simulators beginning in the 1960s and becoming widespread by the 1980s. 
However, the first widespread use of computer gaming – initially as a US military public relations exercise and later as a training simulation for military personnel – came with America’s Army (AA), released in 2002.11 Interestingly, this suggests that the game was perhaps developed in response to the noopolitical climate of the late 1990s – the need to use information-era means such as gaming to make the norms of military life attractive to a younger population. Legend has it that Colonel Casey Wardynski, a US West Point-based instructor, dreamt up the idea of a recruitment game targeted at American youth at a cocktail party (Davis, 2003: 269). However, the game environment and play is anything but ‘noopolitical’ insofar as this is described by Arquilla and Ronfeldt as relying upon soft power: the appeal of values to set an agenda and persuasively win over opposition, promoting shared values and belief systems and so forth (Arquilla and Ronfeldt, 1999: 40–41). Rather, AA, although not dwelling on blood and gore in quite the same way as, say, the commercially developed games Doom, Quake or World of Warcraft might, is nonetheless a first-person ‘shoot ’em up’ game based around team play in combat-simulated situations. Noopolitik, then, is a somewhat fraught approach that has not yet completely diverged from
Munster – Biopolitical ‘Will to Life’ to a Noopolitical Ethos of Death
the ‘realpolitik’ of asserting military superiority and will through sheer might. Nonetheless, as Stephen Graham has shown, military theorists and strategists increasingly understand digital media and technologies as synonymous with the contemporary space of battle (2009: 278–87). A new extended and extensive ‘fourth generation’ of warfare is unfolding, in the eyes of US military strategists, in which everyday urban sites become a space for attack by, for example, urban terrorists. Hence city spaces, such as transport infrastructure, come to be understood as ‘soft targets’, and a military response must be galvanized to architecturally reconsider the city as requiring soft military control. On the one hand, war as a noopolitik of soft control extends to capture all urban space as battlespace. On the other hand, actual spaces of military occupation, such as Iraq and Afghanistan, are proving impossible to control. In the last two or three years, the US army and government have changed tack with respect to the design and development of military computer simulation environments – no doubt a result of the state of perpetual war in which they find themselves. Although AA is still an active combat – or what is now called ‘kinetic combat’ – training and recruitment environment, a new phase of computer simulations is now active and continues in development. One of the main gaming/simulation environments now used by the US military is UrbanSim, developed by the Institute for Creative Technologies (ICT) at the University of Southern California and funded largely by the US military (ICT, 2010). UrbanSim, with its echoes of the SimCity series developed by Electronic Arts, deals with the ‘non-kinetic’ aspects of military situations:

Bosack [project manager at the Institute for Creative Technologies] and his team then built the game’s characters as autonomous agents that react not just to specific actions, but to the climate created by a player’s overall strategy.
Members of a tribe, for instance, want jobs, but they won’t work if they don’t feel safe. Instead, they might join the insurgents. Patrolling neighborhoods, meeting with tribal elders, and creating more economic opportunities – tactics straight from counterinsurgency manuals – can reduce the likelihood of that outcome in the game. (Mockenhaupt, 2010)
What is of interest to me is not so much the switch to counterinsurgency tactics within simulation environments as a result of the failure of US military ‘kinetic’ operations or ‘realpolitik’ strategies in Iraq and Afghanistan. Rather, I am interested in the implications this has for digital code’s conduct within this simulation/gaming environment. We need to ask this at the specific level of the relation between human and technics in these environments: that is, what routines or operations must the code/game engine adopt as a result of this new direction and how, accordingly, does a user’s corporeality and cognition become reconfigured?
UrbanSim uses what are known in software development as ‘under-the-hood’ components. Quite literally, the simulation is a cover-over or umbrella for different information architecture components; in the case of UrbanSim, this component architecture uses several Artificial Intelligence (AI) technologies. In fact, the environment borrows a previously modelled educational AI tool for simulating social interactions in multi-user environments called PsychSim.12 This component allowed a belief system for the non-player agents or population to be modelled and stored in the game’s engine by creating a reusable library of ‘behaviour entities’ that could be called up and combined variably (McAlinden et al., 2009: 4). The other AI component technology, working in tandem with the PsychSim tool, is UrbanSim’s story engine, which the developers state was created in order to interject narrative elements into the simulation environment, where events are drawn from a case library of the real-world experiences of military commanders:

The use of a data-driven Story Engine allows UrbanSim to generate events in the user experience that would be difficult to model in rule-based simulations, and provides a framework for quickly modifying UrbanSim content to reflect changes in the contemporary operating environment of the U.S. Army. (McAlinden et al., 2009: 4)
The two AI technologies, then, amount to exactly the concretizations of the techniques of psychopower that Stiegler elaborates. First, the normalization of a population (in this case a normative ‘Arab’ population) by assigning it behaviours from a ‘library’. Second, the conjunction of this normalized population with story-telling elements, which create scenarios for the playing out and then the management of these behaviours. To quote Stiegler again: ‘The techniques of management and marketing rely on story telling’ (2008). However, what is slightly different about this concretization is that it is a simulation, an environment designed to train the US army – a sub-section of another population – in a noopolitik of non-kinetic combat. A complex level of recursive or iterative behaviour and operations unfolds. A counterinsurgent (in this case Arabic) population’s behaviour is modelled for the gaming market of the US military to ‘consume’. But the ‘consumption’ takes place in a training environment whose aim is to produce a preparedness or readiness in this market: in effect, a biopolitical modulation of the US military market consuming the product. In other words, UrbanSim aims to produce a consumer population of soldiers ‘in readiness’ by creating a set of rehearsed story-telling scenarios that combine with the construction of a normative simulated ‘Arabic’ population. The developers of UrbanSim importantly describe the environment as a ‘learning package’ that will allow military commanders to instruct their staff in complex and dynamic military situations after the offensive and defensive operations of initial combat have concluded (McAlinden et al., 2009: 2). Prior to executing simulated scenarios in the UrbanSim practice
environment, the ‘player’ – or ‘learner’, as s/he is at this stage – undertakes the first part of the training package: eight tutorials labelled the ‘UrbanSim Primer’ (McAlinden et al., 2009: 3). The developers call this ‘cognitive preparation’ since, via practice examples and recorded interviews with military commanders, the player/learner gains the conceptual knowledge for the complex tasks in the simulation environment proper. There is also an AI component incorporated here, as the player’s practice examples are evaluated and s/he is given feedback. The primer sets the stage for the simulation as ‘educational’, literally separating cognition as learning from the ‘action’ of the game. This does not mean that UrbanSim gameplay is disembodied. As Alex Galloway (2006: 83–4) has argued, gaming is an active medium in which the player is constantly doing: physically moving, pushing the game console interface and inputting data into the game’s engine. But what is different about UrbanSim is that the gamer’s action and body are first ‘primed’, made ‘ready’ for the game play. Far from being disembodied and merely cognitive, the primer provides an environment which trains the player in ‘readiness’, in being alert – both a corporeal and cognitive state – to the playing out of scenarios and to their modulation in the light of their play. Jordan Crandall has described ‘readiness’ as a state that has become a contemporary political and military ideal insofar as it shapes the body’s tendency to act and its willingness to organize action in a more structured way:

Readiness shapes tendency, structures disposition. Again, it is always en route, always emerging. Yet it is not only internal: it works laterally across bodies and environments. As provisional as they might be, its objects are group constructions, hybrid compositions: identifiable within the formula, yet interchangeable. We might say that readiness is the lived, embodied dimension of vigilance. (Crandall, 2006)
A number of analyses of biopower have also noted the ways in which global military and public health strategies – such as bioweapons stockpiling and pandemic planning produced out of speculation – pre-empt the future. Biopower has a securitizing arm not simply over life, bodies and population but over temporality, calculating and anticipating its threat potential (and investing in coordinated programs) in order to shore up the present as readiness for the future (Cooper, 2008: 74–5; Shukin, 2009: 184–9). What UrbanSim realizes is a new kind of embodied dynamic for interacting with computational environments, in which one does not ‘manage’ the population of virtual actors or creatures via interactive gesture, as was the case in earlier bioaesthetic interactive art installations such as A-volve. Rather, the dynamic is one which deploys an affective potentiality, which preps for future action. The ‘under-the-hood’ task of the UrbanSim package is to embody the current political and military ideal of perpetual vigilance.
Toward a Transversal Noopolitics?

What the development of a serious gaming package such as UrbanSim signals – in its relays between the production and consumption of markets and populations, between embodied readiness for, and cognitive analysis of, non-kinetic operations – is that bio- and psychopower operate in tandem. There is no succession of biopower by psychopower, in other words, as if these followed some implicit evolutionary schema. As Gilles Deleuze and Félix Guattari remind us, ‘There is no biosphere or noosphere, but everywhere the same Mechanosphere’ (1988: 69). We must, then, focus again upon the question of technicity and decide whether this can lead us toward a different conception of the noopolitical that is less regulatory and more transversal. This will involve paying attention to how technics are taken up within an overall organizing schema of technicity. This is not to suggest that technics can simply be used outside the social, but rather that we can recover the potential of indeterminacy at the level of the human–technics dynamic. Stiegler’s critique of Foucauldian biopower lies in his reflection that Foucault concentrated on the productive rather than the consumptive or market-driven aspects of the biopolitical: ‘But today the question of biopower is less one of ‘‘utilizing the population’’ for production than of establishing markets for consumption’ (2010: 128). Stiegler’s contribution to an understanding of the logic of contemporary technical systems is to demonstrate just how insidious a control society, operating at the level of cognitive capture, has become in the latter part of the 20th century, as the forces of psychotechnologies and consumer-oriented strategies such as user profiling and neuromarketing combine. But, as I have suggested, regimes of attentional capture might also require a certain production of embodiment, which combines with models or simulations – effectively normalizations – of populations.
Both of these are realized biopolitically and psycho- or noopolitically. But what of the possibility of escape, or of the deterritorialization of attention from such capturing regimes? One of the problems with critiques of contemporary technicity’s capture of attention – not only Stiegler’s but also others’ such as Katherine Hayles’ – is that they are too hegemonic. Hayles, for example, identifies a generational cognitive change happening now in 8- to 18-year-olds, which she terms the shift from deep to hyper attention (2007: 187). Deep attention is characterized by focusing solely upon one media object – for example, reading a book – whereas hyper attention typically involves shifting one’s attention around multiple media: listening to the iPod while doing email while submitting a homework assignment and texting one’s friends. Somehow, according to Hayles – who bases her evidence for such a cognitive shift on both anecdotal observations from older academic professors and the Kaiser Family Foundation’s report Generation M: Media in the Lives of 8–18 Year-Olds – this is both typical of the ‘younger’ generation and intricately connected to learning and cognitive disorders such as Attention Deficit Disorder and Attention Deficit Hyperactivity Disorder (2007: 188ff.).
To be fair to Hayles, her tone is not a moralizing or nostalgic one, bemoaning the complete loss of attention in the younger generation or the evils of modern media. She certainly makes a case for the possibility that, for example, computer games also engage their users in long periods of ‘deep’ attention and absorption. Ultimately, she argues, digital media and culture can and often do exemplify interactions between both modes of cognitive attention (2007: 198). And yet sweeping characterizations of generations or populations are implicit in both Stiegler’s and Hayles’ critiques of contemporary psychopolitical regimes and systems of technical objects or media. What is problematic about these characterizations is that they are out of step with the actual logic of the media themselves. And we should remember that this logic is not simply technical but also economic – one in which, as Stiegler suggests, the market and marketing play a key structuring role. But contemporary digital techno-logic does not just function to create ‘markets’ and hence consumers; rather, it multiplies ‘the’ market excessively, so that it extends throughout the entire spectrum of life from the corporeal to the cognitive. This produces a constant segregating effect upon populations of consumers, dividing ‘the’ market into ever-increasing pockets or niches of customization. Chris Anderson identified this as the emerging business model of networked market economies, calling it ‘the long tail’:

This is the difference between push and pull, between broadcast and personalized taste. Long Tail business can treat consumers as individuals, offering mass customization as an alternative to mass-market fare. (Anderson, 2004)
Although much has been made of the economics of the long tail and of the networked technics that co-constitute it, less has been said about its subjectivations. That is to say, if markets are becoming more ubiquitous and yet more differentiated, might not this differentiating tendency also have heterogenizing effects on their populace? Tiziana Terranova has argued that the segregating, dissociative logic and affect of tele-technologies, which perform and facilitate action at a distance, may also be generative of new kinds of publics who could resist such attention-grabbing manoeuvres (2007: 140–1). Drawing on the notion of the public articulated in the work of Maurizio Lazzarato, Terranova considers the formation of publics as ‘events’ (2007: 142). This is quite a different characterization of contemporary crowds than the somewhat homogenizing concept of population found in, for example, Rose’s analysis of biopower, or the analysis of noo/psychopower affecting ‘generations’ in Stiegler and Hayles. In the aftermath of Anderson’s analysis of networked media as long-tail or ‘pull’ media, Terranova notes that in actuality the internet produces phenomena of both massification and segregation: certain blogs or YouTube videos garner exponential ‘hits’ while other sites produce differentiated and segregated audiences. This ‘push/pull’ combination is in fact much more typical of online media publics, which form and disperse
temporarily around networked phenomena. For Terranova, this suggests the potential for a differentiating line of flight away from the normalizing forces of mass communications and toward what she names a ‘creolization’ of networked subjectivities:

Rather than being a limit of the internet, it could be argued that this dispersion allows for a process of further segmentation and proliferation of publics, which could potentially operate as a site of further creolization of subjectivities – subjectivities whose relation to the whole does not involve the neutralization of a singularity into an identity or a norm. (2007: 140)
The noopolitical capture of attention and affect by tele-technologies can, on the one hand, create publics that might appear more like the ‘masses’ of broadcast media. But, on the other hand, these dispersed, forming and reforming publics can also capture media. This emerging noopolitics charts a differentiating will to power that is immanent to the techno-logic of contemporary systems of technical objects. As Terranova has suggested, networked publics might just as well temporarily constitute new kinds of ‘collectivities’ or, as she puts it, ‘counter-weapons’ of resistance to the implicitly militarized noopolitik of digital communications packages such as UrbanSim (2007: 142). We see this kind of provisional collectivity forming around a public who wish to exit their online social media networks by using the Web 2.0 Suicide Machine, and around the programmers, artists and designers who create such phenomena. Moddr, the media lab behind the Machine, approach technical objects as reusable and repurposable: ‘a large part of our practice involves the modification (modding) and re-creation of already existing technology’ (2010). Rather than assign this kind of messing with code the moniker of ‘remix’, projects such as the Web 2.0 Suicide Machine can be seen more in terms of what Matthew Fuller has termed ‘critical software’ (2003: 23). Critical software exposes the normative forces that operate across software’s multiple scales, as, for example, the Web 2.0 Suicide Machine does by exposing social media’s biopolitical grip on networking at a molar level and on personal data at a molecular level. But it also gestures, as Fuller has argued, toward new modes of collective aesthetic creation. Critical software, then, suggests an inventive or transformative noopolitics that is all the while digital. It undoes the synchronization of digital technologies with a digital ‘program’ that subtends Stiegler’s noopolitics.
What I want to suggest is that such critical software practices escape the normalizing processes of software not simply because of what the code itself writes and unwrites. Early examples of critical software practices in digital aesthetics worked to deconstruct the core of a games engine or interface only to remain largely confined to the domain of experimental art. This can be seen in the work of the online duo jodi.org, for example, arguably the most aesthetically experimental form of early software art, in works such as www.jodi.org from 1993. The home page of this website
displayed a garbled screen of nonsensical green code. Those in the know about the HTML source code view available in web browsers could choose the ‘Page Source’ option in the ‘View’ menu and see, revealed within the HTML script, an embedded ASCII diagram of a hydrogen bomb. The implication was that the origin or ‘source’ of the scripts that allow us to visually display networked information was in fact post-Second World War military weapons research. This early ‘codework’, which aesthetically mined the relations between code and visualization in data aesthetics, was extremely important, but it has remained just that: aesthetic experimentation encoded for an audience that already knew what it was looking for. Such practices become productive of new modes of collective aesthetic creation only if they also transversally assemble, and assemble with, provisional publics. This may simply be a question of the right timing, but it may also be because networked media have reached both mass proportions and have maintained their segmenting logic. In the case of the Web 2.0 Suicide Machine, the code assembles with a small – in comparison to usual web ‘hit’ statistics, for example – ‘public’ of around 4500 social network ‘suiciders’. Yet recently the site featured in an episode of the global cult animation series South Park, in which the character Stan Marsh joins Facebook but then uses the Web 2.0 Suicide Machine to ‘unfriend’ himself.13 Following the nonlinear logic of publics and media that comprise networked assemblages, the relatively small provisional public of suicide machine users transversally assembles, via a relatively ‘small’ Python script, with a different and much larger public assembled around a television series. This assembling may very well hint at the transformative possibilities of networked and broadcast media assemblages. This public then blogs, tweets and chats about the episode and the ‘suicide machine’, sending other publics to the website.
New collectivities form that meld push-and-pull media, hackers and fans, artists and audiences. It is in this transversality of contemporary digital media that the differentiating aesthetic potential for the noopolitical lies. Rather than a readiness that embodies a technics oriented toward vigilance, the ‘suicide machine’ literally embodies a letting go – a surrender to the code ‘machine’ and, consequently, a walking away from the molecular and molar management of data, life and bodies now invested in by online corporations such as Facebook and Google. More than simply surrendering one’s online ‘life’, though, such projects beckon toward a different constitution of digital life inflected by digital ‘death’, disconnection and finitude – an aesthesia that finds humour and joy in the unlikely assemblage of provisional networked events, media and publics.

Notes

1. ‘Norns’ were first released in 1996 by the entertainment company Mindscape through the game Creatures. The game had a large following and an online community that supported development of unique ‘norns’. Creatures was then taken over by the Cyberlife company in 1998 and had several iterations until 2003, when development on the game finished.

2. Moddr is a media laboratory that runs out of WORM, a Rotterdam artist collective (see their website at: http://moddr.net/moddr_/). As alumni of the renowned Media Design Masters degree at the Piet Zwart Institute, Rotterdam, their express aim is to modify and re-engineer existing media and technologies. Their aesthetic thus comes from a DIY hacker perspective but with an express desire to reformulate it within the context of Web 2.0 logics and technologies.

3. On 6 January 2010, Facebook’s lawyers sent moddr a ‘cease and desist’ letter, claiming that the Web 2.0 Suicide Machine’s operations violated the social media site’s ‘Statement of Rights and Responsibilities’ for its members (see the link to a downloadable PDF version of this letter on the Machine’s website at http://suicidemachine.org/). Moddr responded by ‘ceasing’ to claim that Facebook had partnered with them on its site, but argued that it did not violate other member rights. The Machine site was out of action for a short period while moddr modified aspects of the code’s operations to route around any seeming violations. For comments by Gordon Savivic from moddr about the ‘cease and desist’ letter, see McNamara (2010).

4. I borrow the term ‘ethopolitics’ from Nikolas Rose (2001: 2), but I borrow it aware that he uses it to describe the risk politics of managing the self that becomes the hallmark of contemporary biopower. The point I want to develop throughout this article is that biopower, thought in terms of both the ‘life’ management of population and self, misses the question of death as both a ‘limit’ to be managed and as potentially generative of a counter-weapon to Rose’s articulation of biopower.

5. See, for example, the journal issue of Transformations devoted to Stiegler and technics (2009).

6. It should be noted that others have deployed the concept of technicity in relation to the analysis of digital media, such as Adrian Mackenzie (2006: 22), who draws on Gilbert Simondon’s conception of a system of technical objects. I am restricting myself to the influence of Stiegler here because of his particular analysis of the shift from the biopolitical to the noopolitical in relation to the supplemental logic of technicity, which I elaborate later in this article.

7. Lev Manovich named this obsession with interactive interfaces part of the ‘myth of interactivity’ that accompanied 1990s analyses of digital artwork (see Manovich, 2001: 55–57).

8. A-volve was awarded the highest prize, the Golden Nica, in the category of ‘interactive art’ at Ars Electronica: Intelligent Design in Linz, Austria, in 1994.

9. For an application of Stiegler that is decidedly radical with respect to the possibilities of electronic or digital technologies coupling with the human in inventive rather than destructive ways, see Andrew Murphie (2003). Murphie argues that networked electronic music constitutes a proliferating event and world in which humans and technical nonhumans open up relations of intensive differentiations that challenge the production of both music and listeners in market-driven terms. However, Stiegler has also been criticized for closing the indeterminacy between humans and technics in his analysis of the overarching reach of, especially, contemporary technicity. Andres Vaccari (2009), for example, argues that the encroachment of the human by an overarching industrialized ‘Technology’ leaves little room except to retreat into an ethics of humanism. This tendency certainly comes to the fore in Stiegler’s recent homogenization of ‘psychotechnologies’, which must be countered by re-education. The full articulation of the role of education in delivering a collective re-individuation via techniques of care (as opposed to technics of psychopower) takes place in the recently translated text by Stiegler: Taking Care of Youth and the Generations (2010).

10. For their better-known work on netwars, see John Arquilla and David Ronfeldt (2001). Arquilla and Ronfeldt publish solely with RAND, in research funded and prepared by the Office of the Secretary of Defense, National Defense Research Institute, USA.

11. See comments from both developers and US officers about the development of America’s Army from a recruitment tool into a high-level immersive environment for combat training in Grace Jean (2006). The America’s Army website remains active; from it the game can be launched for online play or downloaded.

12. PsychSim was developed in 2003/4 by the Center for Advanced Research in Technology for Education, also at the University of Southern California – no doubt a budget- and labour-saving coup for the Institute for Creative Technologies!

13. This occurs in ‘You have 0 friends’, episode 1404 of the South Park series, originally aired in the US on 7 April 2010. For further information about this episode, see the South Park Studios website.
References

America’s Army (n.d.) Available at: http://www.americasarmy.com/ (consulted 1 March 2010).
Anderson, C. (2004) ‘The Long Tail’, Wired. Available at: http://www.wired.com/wired/archive/12.10/tail.html?pg=4&topic=tail&topic_set= (consulted 31 May 2010).
Arquilla, J. and D. Ronfeldt (1999) The Emergence of Noopolitik: Toward an American Information Strategy. Santa Monica: RAND Corporation.
Arquilla, J. and D. Ronfeldt (eds) (2001) Networks and Netwars: The Future of Terror, Crime and Militancy. Santa Monica: RAND Corporation.
Beardsworth, R. (1998) ‘Thinking Technicity’, Journal of Cultural Research 2(1): 70–86.
Brynen, R. (2009) ‘SimCity Meets Urban COIN Operations’, PaxSims weblog. Available at: http://paxsims.wordpress.com/2009/09/10/simcity-meets-urban-coin-operations/ (consulted 5 May 2010).
Cage, D. (2010) Heavy Rain [DISC] PlayStation 3. Paris: Quantic Dream.
Cooper, M. (2008) Life as Surplus: Biotechnology and Capitalism in the Neoliberal Era. Seattle: University of Washington Press.
Crandall, J. (2006) Ready for Action (Protections exhibition catalogue). Graz: Kunsthaus Graz. Available at: http://jordancrandall.com/main/writings/READYAUG10.html (consulted 1 June 2010).
Davis, M. (2003) ‘Researching America’s Army’, pp. 268–275 in Design Research: Methods and Perspectives, ed. B. Laurel. Cambridge, MA: MIT Press.
Deleuze, G. and F. Guattari (1988) A Thousand Plateaus. London: Athlone Press.
Edmonds, E.A., D. Everitt, M. Macaulay and G. Turner (2004) ‘On Physiological Computing with an Application in Interactive Art’, Interacting with Computers 16: 897–915.
Fuller, M. (2003) Behind the Blip: Essays on the Culture of Software. New York: Autonomedia.
Galloway, A. (2006) Gaming: Essays on Algorithmic Culture. Minneapolis: University of Minnesota Press.
Graham, S. (2009) ‘The Urban ‘‘Battlespace’’’, Theory, Culture & Society 26(7–8): 278–287.
Grau, O. (2009) ‘Living Environments – Immersive Strategies’, pp. 170–190 in Interactive Art Research, ed. C. Sommerer and L. Mignonneau. Vienna and New York: Springer.
Hayles, N.K. (2007) ‘Hyper and Deep Attention: The Generational Divide in Cognitive Modes’, Profession 1: 187–199.
Institute for Creative Technologies [ICT] (2010 and ongoing) UrbanSim [software application] PC. Playa Vista: Institute for Creative Technologies.
Jean, G. (2006) ‘Game Branches Out Into Real Combat Training’, National Defense Magazine. Available at: http://www.nationaldefensemagazine.org/archive/2006/February/Pages/games_brance3042.aspx (consulted 24 May 2010).
Kay, L.E. (2000) Who Wrote the Book of Life? A History of the Genetic Code. Stanford: Stanford University Press.
Kluitenberg, E. (2008) Delusive Spaces: Essays on Culture, Media and Technology. Amsterdam: NAi.
Lazzarato, M.
(2002) ‘From Biopower to Biopolitics’, Pli: The Warwick Journal of Philosophy 13: 99–113.
Lovink, G. (2008) Zero Comments: Blogging and Critical Internet Culture. New York: Routledge.
McAlinden, R. et al. (2009) ‘UrbanSim: A Game-based Simulation for Counterinsurgency and Stability-focused Operations.’ Workshop on Intelligent Educational Games, 14th International Conference on Artificial Intelligence in Education, Brighton, England, 7 July. Available at: http://people.ict.usc.edu/~gordon/sble.html (consulted 30 May 2010).
Mackenzie, A. (2006) Transductions: Bodies and Machines at Speed. London: Continuum.
McNamara, P. (2010) ‘Facebook Blocks Web 2.0 Suicide Machine’, Buzzblog. Available at: http://www.networkworld.com/community/node/49470 (consulted May 2010).
McNamara, P. (2010) ‘Facebook Unleashes Lawyers on Web 2.0 Suicide Machine’, Buzzblog. Available at: http://www.networkworld.com/community/node/53078 (consulted May 2010).
Manovich, L. (2001) The Language of New Media. Cambridge, MA: MIT Press.
Mockenhaupt, B. (2010) ‘SimCity Baghdad’, The Atlantic (January/February). Available at: http://www.theatlantic.com/magazine/archive/2010/01/simcity-baghdad/7830/ (consulted 1 May 2010).
Moddr (2009) ‘Web 2.0 Suicide Machine Promotion’, online video. Available at: http://vimeo.com/8223187 (consulted 28 May 2010).
Moddr (2010) ‘Moddr?’ Available at: http://moddr.net/moddr/ (consulted 4 June 2010).
Murphie, A. (2003) ‘Electronicas: Differential Media and Proliferating, Transient Worlds.’ Paper delivered at Digital Arts and Culture Conference, Melbourne, November. Available at: http://www.andrewmurphie.org/docs/electronicas.pdf (consulted 30 May 2010).
Noble, I. (2003) ‘Human Genome Finally Complete’, BBC News Online, 14 April. Available at: http://news.bbc.co.uk/2/hi/health/2940601.stm (consulted April 2010).
Popper, F. (2005) From Technological to Virtual Art. Cambridge, MA: MIT Press.
Rabinow, P. (1992) ‘Artificiality and Enlightenment: From Sociobiology to Biosociality’, in Incorporations, ed. J. Crary and S. Kwinter. New York: Zone Books.
Rose, N. (2001) ‘The Politics of Life Itself’, Theory, Culture & Society 18(3): 1–30.
Shukin, N. (2009) Animal Capital: Rendering Life in Biopolitical Times. Minneapolis: University of Minnesota Press.
Sommerer, C. (1994) Prix Ars Electronica. ORF documentary. Austrian Broadcasting Company and Ars Electronica, Linz, Austria.
South Park Studios (n.d.) Available at: http://www.southparkstudios.com/guide/1404 (consulted 4 June 2010).
Stiegler, B. (1998) Technics and Time, 1: The Fault of Epimetheus.
Stanford: Stanford University Press. Stiegler, B. (2008) ‘Biopower, Psychopower and the Logic of the Scapegoat.’ Paper delivered at ‘The Philosophy of Technology: A Colloquium with Bernard Stiegler’, Manchester Metropolitan University, 8 March. Also available at: http:// www.arsindustrialis.org/node/2924 (consulted May 2010). Stiegler, B. (2010) Taking Care of Youth and the Generations, trans. S. Barker. Stanford: Stanford University Press. Terranova, T. (2007) ‘Futurepublic: On Information Warfare, Bio-racism and Hegemony as Noopolitics’, Theory, Culture & Society 24(3): 125^145.
Downloaded from tcs.sagepub.com at UNIV OF ROCHESTER LIBRARY on December 30, 2011
90
Theory, Culture & Society 28(6)
Transformations (2009) Special issue on Stiegler and technics. Available at: http:// www.transformationsjournal.org/journal/issue_17/editorial.shtml (consulted 30 April 2010). Vaccari, A. (2009) ‘Unweaving the Program: Stiegler and the Hegemony of Technics’, Transformations 17. Available at: http://www.transformationsjournal.org/journal/issue_17/article_08.shtml. Venter, C. (2002) ‘Gene Pioneer’s Next Goal’, BBC News Science/Nature. Available at: http://news.bbc.co.uk/2/hi/science/nature/2289991.stm (consulted January 2011). Wiener, N. (1965 [1948]) Cybernetics: Or, Control and Communication in the Animal and the Machine. Cambridge, MA: MIT.
Anna Munster is a writer, artist and Associate Professor at the School of Art History and Art Education, University of New South Wales, Australia. She has published Materializing New Media: Embodiment in Information Aesthetics (2006) and is currently writing a book (forthcoming 2012) on networks and experience, using the work of William James to re-examine the technics, culture and aesthetics of connectivity and conjunction. [email: [email protected]]
Crisis, Crisis, Crisis, or Sovereignty and Networks
Wendy Hui Kyong Chun
Abstract This article addresses the seemingly paradoxical proliferation of coded systems designed to guarantee our safety and crises that endanger us. These two phenomena, it argues, are not opposites but rather complements; crises are not accidental to a culture focused on safety, they are its raison d'être. Mapping out the temporality of networks, it argues that crises are new media's critical difference: its exception and its norm. Although crises promise to disrupt memory – to disturb the usual programmability of our machines by indexing 'real time' – they reinforce codes and coded logic: both codes and crises are central to the production of mythical and mystical sovereign subjects who weld together norm with reality, word with action. Codes and states of exception are complementary functions, which render information and ourselves undead. Against this fantasy and against the exhaustion that crisis as norm produces, the article ends by arguing that we need a means to exhaust exhaustion, to recover the undecidable potential of our decisions and our information through a practice of constant care.
Key words: code, crisis, critical, exhaustion, networks, new media, software studies, sovereignty, states of exception
Theory, Culture & Society 2011 (SAGE, Los Angeles, London, New Delhi and Singapore), Vol. 28(6): 91–112. DOI: 10.1177/0263276411418490

Introduction
How are codes and safety related? How can we understand the current proliferation of codes designed to guarantee our safety and of crises that endanger it? Codes, historically linked to rules and laws, seek to exempt us from hurt or injury by establishing norms, which order the present and render calculable the future. As Adrian Mackenzie and Theo Vurdubakis note, 'code systems and codes of conduct pervade many registers of "safe living" . . . many situations today become manageable or tractable by virtue of their codeability' (2007). Although codes encompass more than software – they are also 'cultural, moral, ethical' – computational codes are increasingly
privileged as the means to guarantee 'safe living' because they seem to enforce automatically what they prescribe. If 'voluntary' actions once grounded certain norms, technically enforced settings and algorithms now do, from software keys designed to prevent unauthorized copying to iPhone updates that disable unlocked phones, from GPS tracking devices for children to proxies used in China to restrict search engine results. Tellingly, trusted computer systems are systems secure from user interventions and understanding. Moreover, software codes not only save the future by restricting user action, they also do so by drawing on saved data and analysis. They are, after all, programmed. They thus seek to free us from danger by reducing the future to the past, or, more precisely, to a past anticipation of the future. Remarkably, though, computer systems have been linked to user empowerment and agency as much as they have been condemned as new forms of control. Still more remarkably, software codes have not simply reduced crises, they have also proliferated them. From financial crises linked to complex software programs to super-computer-dependent diagnoses and predictions of global climate change, from undetected computer viruses to bombings at securitized airports, we are increasingly called on both to trust coded systems and to prepare for events that elude them. This article responds to this apparent paradox by arguing that crises are not accidental to a culture focused on safety, they are its raison d'être. In such a society, each crisis is the motor and the end of control systems; each initially singular emergency is carefully saved, analyzed and codified. More profoundly and less obviously, crises and codes are complementary because they are both central to the emergence of what appears to be the antithesis of both automation and codes: user agency.
Codes and crises together produce (the illusion of) mythical and mystical sovereign subjects who weld together norm with reality, word with action. Exceptional crises justify states of exception that undo the traditional democratic separation of executive and legislative branches (see Agamben, 2005). Correspondingly, as I've argued in my recent book, Programmed Visions: Software and Memory, software emerged as a thing – as an iterable textual program – through a process of commercialization and commodification that has made code logos: code as source, code as conflated with, and substituting for, action.1 This article revisits code as logos in order to outline the fundamental role crises play in new media networks. Starting from an analysis of rhetorical and theoretical constructions of the internet as critical, it contends that crisis is new media's critical difference: its norm and its exception. Crises cut through the constant stream of information, differentiating the temporally valuable from the mundane, offering users a taste of real-time responsibility and empowerment. They also threaten to undermine this experience, however, by catching and exhausting us in an endlessly repeating series of responses. Therefore, to battle this twinning of crisis and codes, we need a means to exhaust exhaustion, to recover the undead potential of our decisions and our information through a practice of constant care.
Internet Critical
The internet, in many ways, has been theorized, sold, and sometimes experienced as a 'critical' machine. In the mid-to-late 1990s, when the internet first emerged as a mass personalized medium through its privatization, both its detractors and supporters promoted it as a 'turning point, an important or decisive state' (OED) in civilization, democracy, capitalism, and globalization. Bill Gates called the internet a medium for 'friction-free capitalism' (1995). John Perry Barlow infamously declared cyberspace an ideal space outside physical coercion, writing: 'governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather' (1996). 'We in cyberspace', he continues, are 'creating a world that all may enter without privilege or prejudice accorded by race, economic power, military force, or station of birth. We are creating a world where anyone, anywhere may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity' (1996). Blatantly disregarding then-current internet demographics, corporations similarly touted the internet as the great racial and global equalizer: MCI advertised the internet as a race-free utopia; Cisco Systems similarly ran television advertisements featuring people from around the world, allegedly already online, who accosted the viewers with 'Are you ready? We are.' The phrase 'we are' made clear the threat behind these seeming celebrations: get online, because these people already are (see Chun, 2006). The internet was also framed as quite literally enabling the critical – understood as enlightened, rational debate – to emerge.
Al Gore argued that the Global Information Infrastructure finally realized the Athenian public sphere; the US Supreme Court explained that the internet proved the validity of the US judicial concept of a marketplace of ideas.2 The internet, that is, finally instantiated the enlightenment and its critical dream by allowing us – as Kant prescribed – to break free from tutelage and to express our ideas as writers before the scholarly world. Suddenly we could all be Martin Luthers or town criers, speaking the truth to power and proclaiming how not to be governed like that.3 It also remarkably instantiated critiques of this enlightenment dream: many theorists portrayed it as Barthes's, Derrida's and Foucault's theories come true.4 The internet was critical because it fulfilled various theoretical dreams. This rhetoric of the internet as critical, which helped transform the internet from a mainly academic and military communications network into a global medium, is still with us today, even though the daily experience of using the internet has not lived up to the early hype. From so-called 'twitter revolutions' – a name that erases the specificity of local political issues in favor of an internet application – to Wikileaks' steady flow of information to Facebook's alleged role in the 2011 protests in Tunisia and Egypt, internet technologies are still viewed as inherently linked to freedom. As the
controversy over Wikileaks makes clear, this criticality is also framed as a crisis, as calling the critical – and our safety/security – into crisis. This crisis is not new or belated: the first attempt by the US government to regulate the content of the internet coincided with its deregulation. The same US government promoting the Information Superhighway also condemned it as threatening the sanctity and safety of the home by putting a porn shop in our children's bedroom.5 Similarly, Mike Godwin formulated his law that 'as an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1' in the 1990s (1994). So, at the very same time as the internet (as Usenet) was trumpeted as the ideal marketplace of ideas, it was also portrayed as degenerating public debate into a string of nasty accusations. Further, the same corporations celebrating the internet as the great racial equalizer also funded roundtables on the digital divide.6 More recently, the internet has been linked to cyberbullying and has been formulated as the exact opposite of Barlow's dream: a nationalist machine that spreads rumors and lies. Joshua Kurlantzick, an adjunct fellow at the Pacific Council on International Policy in the US, told The Korea Times in response to the 2008 South Korean beef protests: 'the Internet has fostered the spread of nationalism because it allows people to pick up historical trends, and talk about them, with little verification' (Kang, 2008). Likewise, critics have postulated the internet as the end of critical theory, not because it literalizes critical theory but rather because it makes criticism impossible.
As theorists McKenzie Wark and Geert Lovink have insightfully argued, the sheer speed of telecommunications undermines the time needed for scholarly contemplation.7 Scholarship, Wark argues, 'assumes a certain kind of time within which the scholarly enterprise can unfold', a time denied by global media events that happen and disappear at the speed of light (2005: 265). Theory's temporality is traditionally belated. Theory stems from the Greek theoria, a term that described a group of officials whose formal witnessing of an event ensured its official recognition. To follow Wark's and Lovink's logic, theory is impossible because we have no time to register events, and we lack a credible authority to legitimate the past as past. In response, Lovink has argued for a 'running theory' and Wark has argued that theory itself must travel along the same vectors as the media event. I am, as I've stated elsewhere, sympathetic to these calls (see Chun, 2008). However, I also think we need to theorize this narrative of theory in crisis, which resonates both with the general proliferation of crises discussed above and with much recent hand-wringing over the alleged death of theory. Moreover, we need to theorize this narrative in relation to its corollary: an ever-increasing desire for crises, or more properly for updates that demand response and yet to which it is impossible to respond completely, from ever-updating twitter feeds to exploding inboxes. (That is, if, as Ursula Frohne theorized in response to the spread of webcams, 'to be is to be seen', it would now seem that 'to be is to be updated' [2002: 252]. Automatically recognized changes of status have moved from
surveillance to news and evidence of one's ongoing existence.) The lack of time to respond – brought about by the inhumanly clocked time of our computers, which render the new old and, as I contend later, the old new – coupled with the demand for response, I want to suggest, makes the internet compelling. Crises structure new media temporality.

Crisis, New Media's Critical Difference
Crisis is new media’s critical difference. In new media, crisis has found its medium, and in crisis new media has found its value ^ its punctuating device. Crises have been central to making the internet a mass medium to end mass media: a personalized mass device. The aforementioned crises answered the early questions: why go online? And how can the internet ^ an asynchronous medium of communication ^ provide compelling events for users? Further, crises are central to experiences of new media agency, to information as power: crises ^ moments that demand real time response ^ make new media valuable and empowering by tying certain information to a decision, personal or political (in this sense, new media also personalizes crises). Crises mark the difference between ‘using’ and other modes of media spectatorship/viewing, in particular ‘watching’ television, which has been theorized in terms of liveness and catastrophe. Comprehending the difference between new media crises and televisual catastrophes is central to understanding the promise and threat of new media. Television has most frequently been theorized in terms of liveness: a constant flowing connection. As Jane Feuer has influentially argued, regardless of the fact that much television programming is taped, television is promoted as essentially live, as offering a direct connection to an unfolding reality ‘out there’ (see Feuer, 1983: 12^22). As Mary Ann Doane has further developed in her canonical ‘Information, Crisis, Catastrophe’, this feeling of direct connection is greatly enhanced in moments of catastrophe: during them, we stop simply watching the steady stream of information on the television set and sit, trans¢xed, before it. 
Distinguishing between television's three different modes of apprehending the event – information (the steady stream of regular news), crisis (a condensation of time that demands a decision: for this reason it is usually intertwined with political events), and catastrophe (immediate 'subjectless' events about death and the failure of technology) – Doane argues that commercial television privileges catastrophe because catastrophe 'corroborates television's access to the momentary, the discontinuous, the real' (1990: 222). Catastrophe, that is, underscores television's greatest technological power: 'its ability to be there – both on the scene and in your living room . . . the death associated with catastrophe ensures that television is felt as an immediate collision with the real in all its intractability – bodies in crisis, technology gone awry' (1990: 222). Rather than a series of decisions (or significations), televisual catastrophe presents us with a series of events that promise reference: a possibility of touching the real. However, as in Feuer's critique of liveness,
Doane points out that television's relation to catastrophe is ideological rather than essential. Televisual catastrophe is central to commercial television programming because it makes television programming, and the necessary selling of viewers' time, seem accidental rather than central to televisual time. 'Catastrophe', she writes, 'produces the illusion that the spectator is in direct contact with the anchorperson, who interrupts regular programming to demonstrate that it can indeed be done when the referent is at stake' (1990: 222). Thus television renders economic crises, which threaten to reveal the capitalist structure central to commercial television's survival, into catastrophes: apolitical events that simply happen. Televisual catastrophe is thus 'characterized by everything which it is said not to be – it is expected, predictable, its presence crucial to television's operation . . . catastrophe functions as both the exception and the norm of a television practice which continually holds out to its spectator the lure of a referentiality perpetually deferred' (1990: 238). In contrast, new media is a crisis machine: the difference between the empowered user and the couch potato, the difference between crisis and catastrophe. From the endless text messages that have replaced the simple act of making a dinner date to the familiar genre of 'email forwarding accidents', crises promise to move us from the banal to the crucial by offering the experience of something like responsibility, something like the consequences and joys of 'being in touch'. Crisis promises to take us out of normal time, not by referencing the real but rather by indexing real time, by touching a time that touches a real, different time: a time of real decision, a time of our lives. It touches duration; it compresses time. It points to a time that seems to prove that our machines are interruptible, that programs always run short of the programmability they threaten.
Further, crises, like televisual catastrophes, punctuate the constant stream of information, so that some information, however briefly, becomes (in)valuable. This value is not necessarily inherent to the material itself – this information could at other moments be incidental and is generally far less important than the contents of The New York Times. Its value stems from its relevance to an ongoing decision, to a sense of computers as facilitating 'real time' action. Real time has been central to the makeover of computers from work machines to cool media devices that mix work and leisure. Real-time operating systems transformed the computer from a pre-programmed machine run by human operators in batch mode into an 'alive' personal machine, which responds to users' commands. Real-time content – stock quotes, breaking news and streaming video – similarly transforms personal computers into personal media machines. What is real is what unfolds in real time (see Levin, 2002: 578–93). If before, visual indexicality guaranteed authenticity (a photograph was real because it indexed something out there), now real time does so, for real time points elsewhere – to 'real world' events, to the user's captured actions. That is, real time introduces indexicality to this non-indexical medium, an indexicality felt most acutely in moments of
crisis, which enable connection and demand response. Crises amplify what Tara McPherson has called 'volitional mobility': dynamic changes to web pages in real time, seemingly at the behest of the user's desires or inputs, that create a sense of liveness on demand. Volitional mobility, like televisual liveness, produces a continuity, a fluid path over discontinuity (see McPherson, 2002: 458–70; Galloway, 2004). It is a simulated mobility that expands to fill all time but, at the same time, promises that we are not wasting time, that indeed, through real time, we touch real time. The decisions we make, however, seem to prolong crises rather than end them, trapping us in a never advancing present. Consider, for instance, 'viral' email warnings about viruses. Years after computer security programs had effectively inoculated systems against a 2005 trojan attached to a message claiming that Osama bin Laden had been captured, messages about the virus – many of which exaggerated its power – still circulated.8 These messages spread more effectively than the viruses they warn of: out of good will, we disseminate these warnings to our address book, and then forward warnings about these warnings, etc., etc. (Early on, trolls took advantage of this temporality, with their initial volleys unleashing a firestorm of warnings against feeding the troll.) These messages, in other words, act as 'retroviruses'. Retroviruses, such as HIV, are composed of RNA strands that use a cell's copying mechanisms to insert DNA versions of themselves into a cell's genome. Similarly, these fleeting messages survive by our copying and saving them, by our active incorporation of them into our ever repeating archive. Through our efforts to foster safety, we spread these messages retrovirally and defeat our computers' usual anti-viral systems. This voluntary yet never-ending spread of information seemingly belies the myth of the internet as a 'small world'. As computer scientists D. Liben-Nowell and J.
Kleinberg have shown in their analysis of the spread of chain letters, such letters trace out a long thin tree rather than a short fat one (Liben-Nowell and Kleinberg, 2008: 4633–38). This diagram seems counter-intuitive: if everyone on the internet really were within six degrees of everyone else, information on the internet should spread quickly and then die. Liben-Nowell and Kleinberg pinpoint asynchrony and replying preferences as the cause: because everyone does not forward the same message at once or to the same number of people, messages circulate at different paces and never seem to reach an end. This temporality – this long, thin chain of transmission – seems to describe more than just the spread of chain letters. Consider, for instance, the ways in which a simple search can lead to semi-volitional wandering: hours of tangential surfing. Microsoft playfully called this temporality 'search engine overload syndrome' in the initial advertisements for its 'decision engine', Bing. In these commercials, characters respond to a simple question such as 'we really need to find a new place to go for breakfast' with a long stream of unproductive associations, such as statistics about 'the breakfast club'. These characters are unable to respond to a question – to make a decision – because
each word provokes a long thin chain of references due to the inscription of information into 'memory'. This repetition of stored information reveals that the value of information no longer coincides with its initial 'discovery'. If once Walter Benjamin, comparing the time of the story and the news, could declare: 'the value of information does not survive the moment in which it was new. It lives only at that moment; it has to surrender to it completely and explain itself to it without losing any time', now newness alone does not determine value (1968: 90). In 2010, for instance, The New York Times charged online for its archive rather than its current news (although as of 2011 it charges users once they have read more than 20 articles); similarly, popular radio shows such as This American Life offered only this week's podcast for free. We pay for information we miss (if we do), either because we want to see it again or because we missed it the first time, our missing registered by the many references to it (consider, in this light, all the YouTube videos referencing Two Girls, One Cup after that video was removed). Repetition produces value, and memory, which once promised to save us from time, makes us out of time by making us respond constantly to information we have already responded to, to things that will not disappear. As the Bing commercials reveal, the sheer amount of saved information seems to defer the future it once promised. Memory, which was initially posited as a way to save us by catching what we lose in real time – by making the ephemeral endure and by fulfilling that impossible promise of continuous history to catch everything into the present – threatens to make us insane: that is, only if we expect search engines and information to make our decisions for us, only if we expect our programs to (dis)solve our crises.
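The 'long thin tree' that Liben-Nowell and Kleinberg observe can be caricatured in a few lines of code. The sketch below is a toy simulation under assumptions of my own (the branching factors, drop probability and node cap are all invented parameters), not a reconstruction of their model of actual chain-letter data: when every recipient forwards to many others, the tree is short and fat and exhausts itself within a few hops; when each forwards to only one other person, the tree is literally a path, as deep as it is large.

```python
import random

def cascade(branch, drop_prob, max_nodes=5000, seed=1):
    """Grow a forwarding tree breadth-first: every recipient forwards the
    letter to `branch` new people, each of whom drops it (never forwards)
    with probability `drop_prob`.
    Returns (number of recipients, depth of the deepest recipient)."""
    rng = random.Random(seed)
    frontier = [0]                        # depths of recipients yet to forward
    size, max_depth = 1, 0
    while frontier and size < max_nodes:
        depth = frontier.pop(0)           # breadth-first: hop by hop
        for _ in range(branch):
            if rng.random() < drop_prob:  # this recipient drops the letter
                continue
            size += 1
            max_depth = max(max_depth, depth + 1)
            frontier.append(depth + 1)
    return size, max_depth

# Short fat tree: wide forwarding reaches thousands of people in ~6 hops.
broad = cascade(branch=5, drop_prob=0.0)
# Long thin tree: one-at-a-time forwarding yields a chain as deep as it is long.
thin = cascade(branch=1, drop_prob=0.02)
```

With `branch=1` the 'tree' is a path, so its depth always equals its size minus one; with `branch=5` thousands of recipients sit within half a dozen hops of the source, which is why the deep, narrow trees actually observed confound intuitions trained on 'six degrees'.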
Bing’s solution ^ the exhausting of decisions altogether through a ‘decision engine’ (which resonates with calls for states of emergency to exhaust crises) ^ after all is hardly empowering. Bing’s promised automation, however, does perhaps inadvertently reveal that, if real time new media do enable user agency, they do so in ways that mimic, rather than belie, automation and machines. Machinic real time and crises are both decisionmaking processes. According to the OED, real time is ‘the actual time during which a process or event occurs, especially one analyzed by a computer, in contrast to time subsequent to it when computer processing may be done, a recording replayed, or the like’. Crucially, hard and soft realtime systems are subject to a ‘real-time constraint’. That is, they need to respond, in a forced duration, to actions predefined as events. The measure of real time, in computer systems, is its reaction to the live, its liveness ^ its quick acknowledgment of and response to our action. They are ‘feedback machines’, based on control mechanisms that automate decision-making. As the definition of real time makes clear, real time refers to the time of computer processing, not to the user’s time. Real time is never real time ^ it is deferred and mediated. The emphasis on crisis in terms of user agency can thus be seen as a screen for the ever increasing automation of our decisions. While users struggle to respond to ‘what’s on your mind?’, Downloaded from tcs.sagepub.com at UNIV OF ROCHESTER LIBRARY on December 30, 2011
their machines quietly disseminate their activity. What we experience is arguably not a real decision but rather one already decided in a perhaps unforeseen manner: increasingly, our decisions are like actions in a video game. They are immediately felt, affective, and based on our actions, and yet at the same time programmed. Furthermore, crises do not arguably interrupt programming, for crises – exceptions that demand a suspension, or at the very least an interruption, of rules or the creation of new norms – are intriguingly linked to technical codes or programs.
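The 'real-time constraint' invoked above can be made concrete in a short sketch. This is a minimal illustration under assumptions of my own – the 50 ms deadline, the event names and the handler are all invented, not drawn from any particular system: a soft real-time system does not deliberate over an answer, it must acknowledge a predefined event within a fixed duration, and its 'decision' is the response wired to that event in advance.

```python
import time

DEADLINE = 0.05  # an assumed soft real-time constraint: respond within 50 ms

RESPONSES = {            # decisions made in advance, wired to predefined events
    "status_update": "acknowledge",
    "panic_button": "alert",
}

def handle(event):
    """React to a predefined event and report whether the deadline was met.
    The 'decision' is a lookup: it was programmed before the event occurred."""
    started = time.monotonic()
    response = RESPONSES.get(event, "ignore")
    elapsed = time.monotonic() - started
    return response, elapsed <= DEADLINE
```

Whatever the event, the response is retrieved rather than deliberated: the system's liveness is its quickness in returning an answer decided before the event ever arrived.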
Logos as State of Exception
Importantly, crises – and the decisions they demand – do not simply lead to the experience of responsibility; as the term 'panic button' nicely highlights, they also induce moments of fear and terror from which we want to be saved via corporate, governmental, or technological intermediaries. States of exception are now common reactions to events that call for extraordinary responses, to moments of undecidability. As Jacques Derrida has argued, the undecidable calls for a response that, 'though foreign and heterogeneous to the order of the calculable and the rule, must . . . nonetheless . . . deliver itself over to the impossible decision while taking account of law and rules' (Derrida, 2002: 252). States of emergency respond to the undecidable by closing the gap between rules and decision through the construction of a sovereign subject who knits together force and law (or, more properly, force and suspended law); this sovereign subject, through his actions, makes the spirit of the law live. Although these states would seem to be the opposite of codes and programs, I want to link them together – and to the experience of crises discussed earlier – through questions of agency or, more properly, as I explain later, authority. Giorgio Agamben has most influentially theorized states of exception. He notes that one of the essential characteristics of the state of exception is 'the provisional abolition of the distinction among legislative, executive, and judicial powers' (2005: 7). This provisional granting of 'full powers' to the executive suspends a norm such as the constitution in order better to apply it. The state of exception is:

the opening of a space in which application and norm reveal their separation and a pure force-of-law realizes (that is, applies by ceasing to apply . . .) a norm whose application has been suspended.
In this way, the impossible task of welding norm and reality together, and thereby constituting the normal sphere, is carried out in the form of the exception, that is to say, by presupposing their nexus. This means that in order to apply a norm it is ultimately necessary to suspend its application, to produce an exception. In every case, the state of exception marks a threshold at which logic and praxis blur with each other and a pure violence without logos claims to realize an enunciation without any real reference. (2005: 40)
The state of exception thus reveals that norm and reality are usually separate – it responds to the moment of their greatest separation. In order to bring them together, force without law/logos – a living sovereign – authorizes a norm 'without any reference to reality' (2005: 36).9 That is, if the relationship between law and justice – a judicial decision – usually refers to an actual case (it is an instance of parole, an actual speaking), a norm in a state of exception is langue in its pure state: an abstract and mystical signifier. It is a moment of pure violence without logos (2005: 40). At one level, states of exception would seem the opposite of programming. Programs do not suspend anything, but rather ensure the banal running of something 'in memory'. Programs reduce the living world to dead writing; they condense everything to 'source code' written in advance, hence the adjective 'source'. This privileging of code is evident in understandings of programming ranging from common sense to theory, from claims made by free software advocates that free source code is freedom to those made by new media theorists that new media studies is, or should be, software studies. Programmers, computer scientists, and critical theorists have all reduced software – once evocatively described by historian Michael Mahoney as 'elusively intangible, the behavior of the machines when running' and described by theorist Adrian Mackenzie as a 'neighbourhood of relations' – to a recipe, a set of instructions, substituting space/text for time/process (Mahoney, 1988: 121; Mackenzie, 2006: 169). Consider, for instance, the common-sense computer science definition of software as a 'set of instructions that direct a computer to do a specific task' and the OED definition of software as 'the programs and procedures required to enable a computer to perform a specific task, as opposed to the physical components of the system'. Software, according to these definitions, drives computation.
Chun – Crisis, Crisis, Crisis, or Sovereignty and Networks

These definitions, which treat programs and procedures interchangeably, erase the difference between human-readable code, its machine-readable interpretation, and its execution. The implication is thus: execution does not matter – as in conceptual art, it is a perfunctory affair; what really matters is the source code. Relatedly, several new media theorists have theorized code as essentially and rigorously 'executable'. Alexander Galloway, for instance, has powerfully argued that 'code draws a line between what is material and what is active, in essence saying that writing (hardware) cannot do anything, but must be transformed into code (software) to be effective. . . . Code is a language, but a very special kind of language. Code is the only language that is executable . . . code is the first language that actually does what it says' (2004: 165–6; emphasis in original).10 This view of software as 'actually doing what it says' assumes no difference between source code and execution, instruction and result. Here the 'says' is not accidental – although perhaps surprising coming from a theorist who argues in an article called 'Language Wants to Be Overlooked' that 'to see code as subjectively performative or enunciative is to anthropomorphize it, to project it onto the rubric of psychology, rather than to understand it through its own logic of
"calculation" or "command"' (2006: 321). The phrase 'code is the first language that does what it says' reveals that code has surprisingly – because of machinic, dead repetition – become logos. Like the King's speech in Plato's Phaedrus, it does not pronounce knowledge or demonstrate it – it transparently pronounces itself.11 The hidden signified – meaning the father's intentions – shines through and transforms itself into action. Like Faust's translation of logos with 'deed' ('The spirit speaks! I see how it must read / And boldly write: "In the beginning was the Deed!"'), software is word become action – a replacement of process with inscription that makes writing a live power by conflating force and law. Not surprisingly, this notion of source code as source coincides with the introduction of alphanumeric languages. With them, human-written, non-executable code becomes source code and the compiled code becomes the object code. Source code thus is arguably symptomatic of human language's tendency to attribute a sovereign source to an action, a subject to a verb. By converting action into language, source code emerges. Thus Galloway's statement – 'to see code as subjectively performative or enunciative is to anthropomorphize it, to project it onto the rubric of psychology, rather than to understand it through its own logic of "calculation" or "command"' – overlooks the fact that to use higher-level alphanumeric languages is already to anthropomorphize the machine and to reduce all machinic actions to the commands that supposedly drive them. In other words, the fact that 'code is law' – something Lawrence Lessig emphasizes with great aplomb – is at one level hardly profound (see Lessig, 2000). Code, after all, is 'a systematic collection or digest of the laws of a country, or of those relating to a particular subject' (OED).
What is surprising is the fact that software is code, that code is – has been made to be – executable, and that this executability makes code not law but rather every lawyer's dream of what law should be: automatically enabling and disabling certain actions and functioning at the level of everyday practice. Code as law is code as police. Insightfully, Derrida argues that modern technologies push the 'sphere of the police to absolute ubiquity' (2002: 279). The police weld together norm with reality; they 'are present or represented everywhere there is force of law . . . they are present, sometimes invisible but always effective, wherever there is preservation of the social order' (2002: 278). Code as law as police, like the state of exception, makes executive, legislative and juridical powers coincide. Code as law as police erases the gap between force and writing, langue and parole, in a complementary fashion to the state of exception. It makes language abstract, erases the importance of enunciation, not by suspending law but rather by making logos everything. Code is executable because it embodies the power of the executive. More generally, the dream of executive power as source lies at the heart of Austinian-inspired understandings of performative utterances as simply doing what they say. As Judith Butler has argued in Excitable Speech, this theorization posits the speaker as 'the judge or some other representative of the law' (1997: 48). It resuscitates fantasies of sovereign – again
executive – structures of power. It embodies 'a wish to return to a simpler and more reassuring map of power, one in which the assumption of sovereignty remains secure' (1997: 78). Not accidentally, programming in a higher-level language has been compared to entering a magical world – a world of logos, in which one's code faithfully represents one's intentions, albeit through its blind repetition rather than its 'living' status.12 As Joseph Weizenbaum, MIT professor, creator of ELIZA and member of the famed MIT AI lab, has argued:

The computer programmer . . . is a creator of universes for which he alone is the lawgiver. So, of course, is the designer of any game. But universes of virtually unlimited complexity can be created in the form of computer programs. Moreover, and this is a crucial point, systems so formulated and elaborated act out their programmed scripts. They compliantly obey their laws and vividly exhibit their obedient behavior. No playwright, no stage director, no emperor, however powerful, has ever exercised such absolute authority to arrange a stage or a field of battle and to command such unswervingly dutiful actors or troops. (1976: 115)
Weizenbaum's description underscores the mystical power at the base of programming: a power both to found and to enforce. Automatic compliance welds together script and force – again, code as law as police, or as the end of democracy. As Derrida has underscored, the police is the name for:

the degeneration of democratic power . . . Why? In absolute monarchy, legislative and executive powers are united. In it violence is therefore normal, conforming to its essence, its idea, its spirit. In democracy, on the contrary, violence is no longer accorded nor granted to the spirit of the police. Because of the presumed separation of powers, it is exercised illegitimately, especially when instead of enforcing the law, it makes the law. (2002: 281)
Code as logos and states of exception both signify a decay of the decay that is democracy. Tellingly, this machinic execution of law is linked to the emergence of a sovereign user. Celebrations of an all-powerful user/agent – 'you' as the network, 'you' as producer – counteract concerns over code as law as police by positing 'you' as the sovereign subject, 'you' as the decider. An agent, however, is one who does the actual labor; hence an agent is one who acts on behalf of another. On networks, the agent would seem to be technology rather than the users or programmers who authorize actions through their commands and clicks. Programmers and users are not creators of languages, nor the actual executors, but rather living sources who take credit for the action. Similarly, states of exception rely on auctoritas. The auctor is one who, like a father who 'naturally' embodies authority, authorizes a state of emergency (Agamben, 2005: 82). An auctor is 'the person who augments, increases or perfects the act – or the legal situation – of someone else' (Agamben, 2005: 76). The subject that arises, then, is the opposite of the
democratic agent, whose power stems from potestas. Hence the state of exception, Agamben argues, revives the auctoritas as father, as living law:

The state of exception . . . is founded on the essential fiction according to which anomie (in the form of auctoritas, living law, or the force of law) is still related to the juridical order and the power to suspend the norm as an immediate hold on life. As long as the two elements remain correlated yet conceptually, temporally, and subjectively distinct (as in republican Rome's contrast between the Senate and the people, or in medieval Europe's contrast between spiritual and temporal powers) their dialectic – though founded on a fiction – can nevertheless function in some way. But when they tend to coincide in a single person, when the state of exception, in which they are bound and blurred together, becomes the rule, then the juridico-political system transforms itself into a killing machine. (2005: 86)
The reference here to killing machines is not accidental. States of exception make possible a living authority based on an unliving (or, as my spell check keeps insisting, an unloving) execution. This insistence on life also makes it clear why all those discussions of code anthropomorphize it, using terms such as 'says' or 'wants'. It is, after all, as a living power that code can authorize. It is the father behind logos that shines through the code. To summarize, we are witnessing an odd dovetailing of the force of law without law with writing as logos, which perverts the perversion that writing was supposed to be (writing as the bastard 'mere repetition' was defined in contrast to, and as inherently endangering, logos). They are both language at its most abstract and mystical, albeit for seemingly diametrically opposed reasons: one is allegedly language without writing; the other, writing without language. This convergence, which is really a complementary pairing, since they come to the same point from different ends, puts in place an originary sovereign subject. This originary sovereign subject, however, as much as he may seem to authorize and begin the state of exception, is created belatedly by it. Derrida calls sovereign violence the naming of oneself as sovereign – the sovereign 'names itself. Sovereign is the violent power of this originary appellation', an appellation that is also an iteration (2002: 293). Judith Butler similarly argues that it is through iterability that the performative utterance creates the person who declares it. Further, the effect of this utterance does not originate with the speaker, but rather with the community s/he joins through speaking (1997: 39). The programmer/user is produced through the act of programming. Code as logos depends on many circumstances, which also undermine the authority of those who would write.
Sources, After the Fact

Source code as source – as logos – is a highly constructed and rather dubious notion, not least because, as Friedrich Kittler has most
infamously argued, 'there is no software', for everything, in the end, reduces to voltage differences (1995). Similarly (and earlier), physicist Rolf Landauer has argued that 'there is really no software, in the strict sense of disembodied information, but only inactive and relatively static hardware. Thus, the handling of information is inevitably tied to the physical universe, its contest and its laws' (1987: 35). This construction of source code as logos depends on many historical and theoretical, as well as physical, erasures. Source code, after all, cannot be run unless it is compiled or interpreted, which is why early programmers called source code pseudo-code.13 Execution – that is, a whole series of executions – belatedly makes some piece of code a source. Source code only becomes a source after the fact; it is more accurately a re-source rather than a source. Source code becomes the source of an action only after it expands to include software libraries, after it merges with code burned into silicon chips, and after all these signals are carefully monitored, timed and rectified. It becomes a source after it is rendered into an executable: source code becomes a source only through its destruction, through its simultaneous non-presence and presence.14 Even executable code is no simple source: it may be executable, but even when run, not all lines are executed, for commands are read in as necessary. The difference between executable and source code brings out the ways in which code does not simply do what it says – or, more precisely, does so in a technical (crafty) manner.15 Even Weizenbaum, as he posits the programmer as all-powerful, also describes him as ignorant, because code as law as police is a fiction.
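The claim that source code cannot act until it has been translated can be given a minimal sketch (in Python, chosen here purely for illustration; the names are this sketch's own): the 'source' is inert text until a separate compilation step produces an executable object, and only execution makes it a source of anything.

```python
# 'Source code' as mere text: this string does nothing by itself.
source = "result = 10 + 20"

# A separate translation step turns the text into a code object.
code_obj = compile(source, "<illustration>", "exec")

# Only execution, after translation, makes the 'source' act.
namespace = {}
exec(code_obj, namespace)
print(namespace["result"])  # prints 30
```

The point of the sketch is the order of operations: the name 'source' is conferred retroactively, once compilation and execution have already intervened.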
The execution of a program more properly resembles a judicial process:

a large program is, to use an analogy of which Minsky is also fond, an intricately connected network of courts of law, that is, of subroutines, to which evidence is transmitted by other subroutines. These courts weigh (evaluate) the data given to them and then transmit their judgments to still other courts. The verdicts rendered by these courts may, indeed, often do, involve decisions about what court has 'jurisdiction' over the intermediate results then being manipulated. The programmer thus cannot even know the path of decision-making within his own program, let alone what intermediate or final results it will produce. Program formulation is thus rather more like the creation of a bureaucracy than like the construction of a machine of the kind Lord Kelvin may have understood. (1976: 234)
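Weizenbaum's analogy can be rendered as a toy sketch (in Python; all names here are hypothetical illustrations, not his code): each 'court' weighs the evidence it receives and its judgment includes a decision about which court has jurisdiction next, so the path of decision-making is fixed only at run time, by the data.

```python
# Each subroutine is a 'court' that weighs evidence and decides
# which court receives the intermediate result next.
def lower_court(evidence):
    # the judgment includes a ruling on jurisdiction
    return appeals_court if evidence > 10 else small_claims

def small_claims(evidence):
    return ("settled", evidence)

def appeals_court(evidence):
    return ("escalated", evidence * 2)

def run_program(evidence):
    court = lower_court(evidence)  # jurisdiction decided at run time
    return court(evidence)

print(run_program(3))   # prints ('settled', 3)
print(run_program(42))  # prints ('escalated', 84)
```

Even in this tiny bureaucracy the author of the code cannot read the path of execution off the source alone; it depends on the evidence transmitted.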
This complex structure belies the conceit of source code as conflating word and action. The translation from source code to executable is arguably as involved as the execution of any command. Compilation carries with it the possibility of deviousness: our belief that compilers simply expand higher-level commands – rather than alter them or insert other behaviors – is simply that, a belief, one of the many that sustain computing as such. It is also a belief challenged by the presence and actions of viruses, which – as Jussi
Parikka has argued – challenge the presumed relationship between invisible code and visible actions, and the sovereignty of the user (see Parikka, 2007). Source code as source is also the history of structured programming, which sought to rein in 'go-to crazy' programmers and self-modifying code. A response to the much-discussed 'software crisis' of the late 1960s, its goal was to move programming from a craft to a standardized industrial practice by creating disciplined programmers who dealt with abstractions rather than numerical processes. This dealing with abstractions also meant increasingly separating the programmer from the machine. As Kittler (1995) has infamously argued, we no longer even write. With 'data-driven programming' – in which solutions are generated rather than produced in advance – it seems we no longer even program. Code as logos would seem language at its most abstract because, like the state of exception, it is language in its pure state. It is language without parole, or, to be more precise, language that hides – that makes unknowable – parole. To be clear, I am not valorizing hardware over software, as if hardware naturally escapes this drive to make space signify time. Hardware too is carefully disciplined and timed in order to operate 'logically' – as logos. As Philip Agre has emphasized, the digital abstraction erases the fact that gates have 'directionality in both space (listening to its inputs, driving its outputs) and in time (always moving toward a logically consistent relation between these inputs and outputs)' (1997: 92).16 This movement in time and space was highlighted nicely in early forms of 'regenerative' memory, such as the Williams tube. The Williams tube used televisual CRT technology not for display but for memory: when a beam of electrons hits the phosphor surface, it produces a charge that persists for 0.2 seconds before it leaks away.
Therefore, if a charge can be regenerated at least five times per second, it can be detected by a parallel collector plate. Key here – and in current forms of volatile memory involved in execution – is erasability. Less immediately needed data does not need to regenerate, and von Neumann intriguingly included within the rubric of 'memory' almost all forms of data, referring to stored data and all forms of input and output as 'dead' memory. Hence now, in computer speak, one reverses common language and stores something in memory. This odd reversal and the conflation of memory and storage gloss over the impermanence and volatility of computer memory. Without this volatility, however, there would be no memory.17 This repetition of signals both within and outside the machine makes clear the necessity of responsibility – of constant decisions – to something like safety (or saving), which is always precarious. It thus belies the overarching belief in, and desire for, the digital as simply there – anything that is not regenerated will become unreadable – by also emphasizing the importance of human agency, a human act to constantly save that is in concert with technology. Saving is something that technology alone cannot do – the battle to save is a crisis in the strongest sense of the word. This necessary repetition makes us realize that this desire for safety as simple securing, as ensured by code, actually puts us at risk of losing what is valuable, from
data stored on old floppy drives to CDs storing our digital images, because, at a fundamental level, the digital is an event rather than a thing.18 It also forces us to engage with the fact that if something stays in place, it is not because things are unchanging and unchangeable, but rather because they are constantly implemented and enforced. From regenerative mercury delay line tubes to the content of digital media, what remains is not what is static, but rather that which is constantly repeated. This movement does not mean that there are no things that can be later identified as sources, but rather that constant motion and care recalls things in memory. Further, acknowledging this necessary repetition moves us away from wanting an end (because what ends will end) and towards actively engaging and taking responsibility for everything we want to endure. It underscores the importance of access, another reason for the valorization of digitization as a means of preservation. To access is to preserve. By way of conclusion, I want to suggest that this notion of constant care can exhaust the kind of exhaustion encapsulated in 'search overload syndrome'. The experience of the undecidable – with both its reliance on and difference from rules – highlights the fact that any responsibility worthy of its name depends on a decision that must be made precisely when we know not what to do. As Thomas Keenan eloquently explains, 'the only responsibility worthy of the name comes with the removal of grounds, the withdrawal of the rules or the knowledge on which we might rely to make our decisions for us. No grounds means no alibis, no elsewhere to which to refer the instance of our decision' (1997: 1). Derrida similarly argues that 'a decision that would not go through the test and ordeal of the undecidable would not be a free decision; it would only be the programmable application or the continuous unfolding of a calculable process' (2002: 252).
The undecidable is thus freedom in the more rigorous sense of the word – a freedom that comes not from safety but rather from risk. It is a moment of pause that interrupts our retroviral dissemination and induces the madness that, as Kierkegaard has argued, accompanies any moment of decision. The madness of a decision, though, differs from the madness described by Microsoft, which stems from the constant deferral of a decision. This deferral of decision, stemming from a belief in information as decision, catches us in a deluge of minor-seeming decisions that defer our engagement with crisis – or renders everything, and thus nothing, a crisis. To exhaust exhaustion, we need to exhaust too the desire for an end, for a moment in which things can just stand still. We need to learn to rest while moving. To exhaust exhaustion we must also deal with – and emphasize – the precariousness of programs and their predictions. That is, if they are to help us save the future – to help us fight the exhaustion of planetary reserves, etc. – they can only do so if we use the gap between their future predictions and the future not to dismiss them, but rather to frame their predictions as calls for responsibility. That is, 'trusting' a program does not mean letting it decide the future or even framing its future predictions as simply true, but instead acknowledging the
impossibility of knowing its truth in advance while nonetheless responding to it. This is perhaps made most clear through the example of global climate models, which attempt to convince people that something they can't yet experience, something simulated, is true (this difficulty is amplified by the fact that we experience weather, not climate – like capital, climate, which is itself the product of modern computation, is hard to grasp). Trusted models of global mean temperature by organizations such as the Geophysical Fluid Dynamics Laboratory (GFDL) 'chart' changes in mean temperature from 1970 to 2100.19 Although the older temperatures are based on historical data, and thus verifiable, the future temperatures are not. This suturing of the difference between past and future is not, however, the oddest thing about these models and their relation to the future, although it is certainly the basis from which they are most often attacked. The weirdest and most important thing about their temporality is their hopefully effective deferral of the future: these predictive models are produced so that, if they are persuasive and thus convince us to cut back on our carbon emissions, what they predict will not come about. Their predictions will not be true or verifiable. This relationship is necessary because by the time we know whether their predictions are true or not, it will be too late (this is perhaps why the Bush administration supported global climate change research: by investigating the problem, building better models, they bought more time for polluters). I stress this temporality not because I'm a climate change denier – the fact that carbon dioxide raises temperature has been known for over a century – but because, by engaging this temporality in terms of responsibility, we can best respond to critics who focus on the fallibility of algorithms and data, as if the gap between the future and future predictions were reason for dismissal rather than hope.
(Surprisingly, these critics often accept other models with this same temporality – such as economic models – without question.) This mode of deferring a future for another future is an engagement with the undead of information. The undead of information haunts the past and the future; it is itself a haunting. As Derrida explains, 'the undecidable remains caught, lodged, as a ghost . . . in every decision, in every event of decision. Its ghostliness . . . deconstructs from within all assurance of presence, all certainty or all alleged criteriology assuring us of the justice of a decision, in truth of the very event of a decision' (2002: 253). This undeadness means that a decision is never decisive, that it can always be revisited and reworked. Repetition is not simply exhaustion, not simply repetition of the same that uses up its object or subject. What can emerge positively from the linking of crisis to networks – what must emerge from it if we are not to exhaust ourselves and our resources – are constant ethical encounters between self and other. These moments can call forth a new future, a way to exhaust exhaustion, even as they complicate the deconstructive promise of responsibility.
Notes

1. As Barbara Johnson notes in her explanation of Jacques Derrida's critique of logocentrism, logos is the 'image of perfectly self-present meaning . . . the underlying ideal of Western culture. Derrida has termed this belief in the self-presentation of meaning, "Logocentrism," for the Greek word Logos (meaning speech, logic, reason, the Word of God)' (Johnson, 1981: ix).
2. See Gore (1994) and US Supreme Court Decision Reno versus ACLU No. 96–511 (1997).
3. For more on enlightenment as a stance of how not to be governed like that, see Foucault (1996: 382–98).
4. For examples see Landow (1992), Turkle (1997) and Women and Performance issue 17.
5. Senator Daniel R. Coats argued during congressional debate over the Communications Decency Act: 'perfunctory onscreen warnings which inform minors they are on their honor not to look at this [are] like taking a porn shop and putting it in the bedroom of your children and then saying "Do not look"' (as quoted in the Department of Justice Brief Filed with the Supreme Court 21 in 1997).
6. For more on this see Chun (2006).
7. See Wark (2005) and Lovink (2000). Lovink elsewhere contends: 'because of the speed of events, there is a real danger that an online phenomenon will already have disappeared before a critical discourse reflecting on it has had the time to mature and establish itself as institutionally recognized knowledge' (Lovink, 2003: 12).
8. See 'Osama Bin Laden Virus Emails' (http://www.hoax-slayer.com/bin-ladencaptured.html; accessed 7 July 2010).
9. According to Agamben: 'The state of exception is an anomic space in which what is at stake is a force of law without law (which should therefore be written: force-of-law). Such a "force-of-law," in which potentiality and act are radically separated, is certainly something like a mystical element, or rather a fictio by means of which law seeks to annex anomie itself' (2005: 39).
10. Given that the adjective executable applies to anything that 'can be executed, performed, or carried out' (the first example of 'executable' given by the OED is from 1796), this is a strange statement.
11. See Derrida's analysis of the Phaedrus in 'Plato's Pharmacy' (1981: 134).
12. Fred Brooks, while responding to the disaster that was OS/360, also emphasized the magical powers of programming. Describing the joys of the craft, Brooks writes:

Why is programming fun? What delights may its practitioner expect as his reward? First is the sheer joy of making things . . . Second is the pleasure of making things that are useful to other people . . .
Third is the fascination of fashioning complex puzzle-like objects of interlocking moving parts and watching them work in subtle cycles, playing out the consequences of principles built in from the beginning . . . Fourth is the joy of always learning, which springs from the nonrepeating nature of the task . . . Finally there is the delight of working in such a tractable medium. The programmer, like the poet, works only slightly removed from thought-stuff. He builds his castles in the air, from air, creating by exertion of the imagination. . . . Yet the program construct, unlike the poet's words, is real in the sense that it moves and works, producing visible outputs separate from the construct itself. It prints results, draws pictures, produces sounds, moves arms. The magic of myth and legend has come true in our time. One types the correct incantation on a keyboard, and a display screen comes to life, showing things that never were nor could be. (1995: 7–8)
13. For instance, The A-2 Compiler System Operations Manual (1953) explains that a pseudo-code drives its compiler, just as 'C-10 Code tells UNIVAC how to proceed. This pseudo-code is a new language which is much easier to learn and much shorter and quicker to write. Logical errors are more easily found in information than in UNIVAC coding because of the smaller volume' (p. 1).
14. Jacques Derrida stresses the disappearance of the origin that writing represents: 'To repeat: the disappearance of the good-father-capital-sun is thus the precondition of discourse, taken this time as a moment and not as a principle of generalized writing. . . . The disappearance of truth as presence, the withdrawal of the present origin of presence, is the condition of all (manifestation of) truth. Nontruth is the truth. Nonpresence is presence. Differance, the disappearance of any originary presence, is at once the condition of possibility and the condition of impossibility of truth. At once' (1981: 168, emphasis in original).
15. Compilation creates a logical – a crafty – relation rather than a numerical one – one that cannot be compared to the difference between decimal and binary numbers, or numerically equivalent equations, for it involves instruction explosion and the translation of symbolic into real addresses. For example, consider the instructions needed for adding two numbers in PowerPC assembly language:
li    r3,10           *load register 3 with the number 10
li    r4,20           *load register 4 with the number 20
add   r5,r4,r3        *add r3 to r4 and store the result in r5
stw   r5,sum(rtoc)    *store the contents of r5 (i.e. 30)
                      *into the memory location called 'sum'
blr                   *end of this piece of code
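The same 'instruction explosion' can be observed from the other direction: a single high-level statement expands, under translation, into several lower-level operations. A sketch (in Python rather than assembly, purely for illustration) uses the standard dis module to list the bytecode that one line of source is compiled into:

```python
import dis

def add_numbers():
    total = 10 + 20  # one line of source code
    return total

# dis.dis prints the several bytecode instructions into which
# this single source line has been translated.
dis.dis(add_numbers)
print(add_numbers())  # prints 30
```

The listing makes visible the gap the note describes: what the programmer writes and what the machine executes are related by translation, not identity.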
16. When a value suddenly changes, there is a brief period in which a gate will give a false value. In addition, because signals propagate in time over space, they produce a magnetic field that can corrupt other signals nearby ('crosstalk'). This schematic erases all these various time- and distance-based effects by rendering space blank, empty, and banal.
17. Memory is not static, but rather an active process. A memory must be held in order to keep it from moving or fading. Memory does not equal storage: although one can conceivably store a memory, storage usually refers to something material or substantial, as well as to its physical location: a store is both what and where it is stored. According to the OED, to store is to furnish, to build stock. Storage or stocks always look towards the future. Memory stems from the same Sanskrit root as martyr. Memory calls for an act of commemoration or renewal of what is stored. Memory is not a source but an act, and by focusing on either memory or real time as sources, we miss the importance of this and other actions, such as the transformation of information into knowledge, of code into vision. Since the coded 'source' of digital media can only operate by being constantly refreshed, degenerated, and regenerated, the critical difficulty of digital media thus stems less from its speed or source than from the ways in which it runs.
18. Wolfgang Ernst thus argues that new media is a time-based medium. See Ernst (2006: 105–23).
19. See the Geophysical Fluid Dynamics Laboratory (GFDL) 'chart' here: http://www.gfdl.noaa.gov/video/gfdlglobe_tref_d4h2x1_1970_2100_30f_720x480.mov

References

Agamben, G. (2005) State of Exception. Chicago: University of Chicago Press.
Agre, P.E. (1997) Computation and Human Experience. Cambridge: Cambridge University Press.
Barlow, J.P. (1996) 'A Declaration of the Independence of Cyberspace', 9 February.
http://w2.eff.org/Censorship/Internet_censorship_bills/barlow_0296.declaration (accessed 1 January 2010). Benjamin, W. (1968) ‘The Story Teller’, pp. 83^110 in Illuminations. New York: Harcourt Brace Jovanovich. Brooks, F.P. (1995) The Mythical Man-Month: Essays on Software Engineering. Reading, MA: Addison-Wesley. Butler, J. (1997) Excitable Speech: A Politics of the Performative. New York: Routledge. Chun, W. (2006) Control and Freedom: Power and Paranoia in the Age of Fiber Optics. Cambridge, MA: MIT Press. Chun, W. (2008) ‘The Enduring Ephemeral, or the Future is a Memory’, Critical Inquiry 35(1): 148^171. Chun, W. (2011) Programmed Visions: Software and Memory. Cambridge, MA: MIT Press. Department of Justice Brief Filed with the Supreme Court 21 (1997) No. 96 -511. http://groups.csail.mit.edu/mac/classes/6.805/articles/cda/reno-v-aclu-appeal.html (accessed 2 January 2011). Downloaded from tcs.sagepub.com at UNIV OF ROCHESTER LIBRARY on December 30, 2011
Chun ^ Crisis, Crisis, Crisis, or Sovereignty and Networks 111 Derrida, J. (1981) ‘Plato’s Pharmacy’, pp. 61^172 in Dissemination. Chicago: University of Chicago Press. Derrida, J. (2002) ‘Force of Law: The Mystical Foundation of Authority’, pp. 228^ 298 in G. Anidjar (ed.) Acts of Religion. New York: Routledge. Doane, M.A. (1990) ‘Information, Crisis, Catastrophe’, pp. 222^239 in P. Mellencamp (ed.) Logics of Television: Essays in Critical Criticism. Bloomington and Indianapolis: Indiana University Press. Ernst, W. (2006) ‘Dis/continuities: Does the Archive Become Metaphorical in Multi-Media Space?’, pp. 105^123 in W. Chun and T. Keenan (eds) New Media, Old Media: A History and Theory Reader. New York: Routledge. Feuer, J. (1983) ‘The Concept of Live Television: Ontology as Ideology’, pp. 12^22 in E.A. Kaplan (ed.) Regarding Television: Critical Approaches. Washington, DC: University Press of America. Foucault, M. (1996) ‘What is Critique?’, pp. 382^398 in J. Schmidt (ed.) What is Enlightenment? Berkeley: University of California Press. Frohne, U. (2002) ‘Screen Tests: Media, Narcissism, Theatricality, and the Internalized Observer’, pp. 252^277 in T. Levin et al. (eds) CTRL [SPACE]: Rhetorics of Surveillance from Bentham to Big Brother. Cambridge, MA: MIT Press. Galloway, A. (2005) Protocol: How Control Exists After Decentralization. Cambridge, MA: MIT Press. Galloway, A. (2006) ‘Language Wants to Be Overlooked: Software and Ideology’, Journal of Visual Culture 5(3): 315^331. Gates, B. (1995) The Road Ahead. New York: Viking. Geophysical Fluid Dynamics Laboratory (GFDL) (n.d.) http://www.gfdl.noaa.gov/ video/gfdlglobe_tref_d4h2x1_1970_2100_30f_720x480.mov (accessed 15 October 2010). Godwin, M. (1994) ‘Meme, Counter-Meme’, Wired 2.10. http://www.wired.com/ wired/archive/2.10/godwin.if.html (accessed 7 July 2010). Gore,A.(1994)‘ForgingtheNewAthenianAgeofDemocracy’,Intermedia22(2): 14^16. Hopper, G.M., R.K. Ridgway and M.H. 
Harper (1953) The A-2 Compiler System Operations Manual. Philadelphia: Remington Rand. Johnson, B. (1981) ‘Translator’s Introduction’, in J. Derrida (ed.) Dissemination. Chicago: University of Chicago Press. Kang, H. (2008) ‘Cell Phones Create Youth Nationalism’, The Korea Times 12 May. http://koreatimes.co.kr/www/news/special/2008/06/180_24035.html (accessed 1 January 2010). Keenan, T. (1997) Fables of Responsibility: Aberrations and Predicaments in Ethics and Politics. Stanford, CA: Stanford University Press. Kittler, F. (1995) ‘There Is No Software’, ctheory.net. http://www.ctheory.net/text_ file.asp?pick=74 (accessed 2 January 2011). Landauer, R. (1987) ‘Computation: A Fundamental Physical View’, Physica Sripta 35: 88^95. Landow, G. (1992) Hypertext: The Convergence of Contemporary Critical Theory and Technology. Baltimore, MD: Johns Hopkins University Press. Lessig, L. (2000) Code and Other Laws of Cyberspace. New York: Basic Books.
Levin, T. (2002) 'Rhetoric of the Temporal Index: Surveillant Narration and the Cinema of Real Time', pp. 578–593 in U. Frohne and P. Weibel (eds) CTRL [SPACE]: Rhetorics of Surveillance from Bentham to Big Brother. Cambridge, MA: MIT Press.
Liben-Nowell, D. and J. Kleinberg (2008) 'Tracing Information Flow on a Global Scale Using Internet Chain-Letter Data', Proceedings of the National Academy of Sciences 105(12): 4633–4638.
Lovink, G. (2000) 'Enemy of Nostalgia, Victim of the Present, Critic of the Future: Interview with Peter Lunenfeld'. http://www.nettime.org/Lists-Archives/nettime-l0008/msg00008.html (accessed 2 January 2007).
Lovink, G. (2003) My First Recession. Rotterdam: V2_/NAI.
Mackenzie, A. (2006) Cutting Code: Software and Sociality. New York: Peter Lang.
Mackenzie, A. and T. Vurdubakis (2007) 'Workshop 2: Codes and Conduct'. http://www.lancs.ac.uk/ias/annualprogramme/protection/workshop2/index.htm (accessed 7 July 2010).
Mahoney, M. (1988) 'The History of Computing in the History of Technology', Annals of the History of Computing 10: 113–125.
McPherson, T. (2002) 'Reload: Liveness, Mobility and the Web', pp. 458–470 in N. Mirzoeff (ed.) The Visual Culture Reader, 2nd edn. New York: Routledge.
'Osama Bin Laden Virus Emails' (n.d.) http://www.hoax-slayer.com/bin-ladencaptured.html (accessed 7 July 2010).
Parikka, J. (2007) Digital Contagions: A Media Archaeology of Computer Viruses. New York: Peter Lang.
Turkle, S. (1997) Life on the Screen: Identity in the Age of the Internet. New York: Simon & Schuster.
US Supreme Court Decision (1997) Reno versus ACLU No. 96-511. http://caselaw.lp.findlaw.com/cgi-bin/getcase.pl?court=us&navby=case&vol=521&invol=844 (accessed 7 July 2010).
Wark, M. (2005) 'The Weird Global Media Event and the Tactical Intellectual', pp. 265–276 in W. Chun and T. Keenan (eds) New Media, Old Media: A History and Theory Reader. New York: Routledge.
Weizenbaum, J. (1976) Computer Power and Human Reason: From Judgment to Calculation. San Francisco: W.H. Freeman & Co.
Wendy Hui Kyong Chun is Professor of Modern Culture and Media at Brown University. She has studied both Systems Design Engineering and English Literature, which she combines and mutates in her current work on digital media. She is author of Control and Freedom: Power and Paranoia in the Age of Fiber Optics (MIT, 2006) and Programmed Visions: Software and Memory (MIT, 2011), and she is co-editor of New Media, Old Media: A History and Theory Reader (Routledge, 2006) and of a special issue of Camera Obscura entitled Race and/as Technology. She is currently working on a manuscript entitled Imagined Networks, Glocal Connections. [email: [email protected]]
The Enframing of Code Agency, Originality and the Plagiarist
Lucas D. Introna
Abstract This paper is about the phenomenon of encoding, more specifically about the encoded extension of agency. The question of code most often emerges from contemporary concerns about the way digital encoding is seen to be transforming our lives in fundamental ways, yet seems to operate 'under the surface' as it were. In this essay I suggest that the performative outcomes of digital encoding are best understood within a more general horizon of the phenomenon of encoding – that is to say, as norm- or rule-governed material enactments accepted (or taken for granted) as the necessary conditions for becoming. Encoded material enactments translate/extend agency, but never exactly. I argue that such encoded extensions are insecure, come at a cost and are performative. To illustrate this I present a brief discussion of some specific historical transitions in the encoding of human agency: from speech to writing, to mechanical writing, and finally to electronic writing. In each of these translations I aim to show that agency is translated/extended in ways that have many unexpected performative outcomes. Specifically, through a discussion of the digital encoding of writing, as reuse, I want to suggest the proposition that all agency is always borrowed (or 'plagiarized') – i.e. it is never originally human. As encoded beings we are never authors; we are rather more or less skilful reusers. To extend agency we have to submit to the demands of encoding and kidnap that encoding simultaneously – enabling constraints in Butler's language. Our originality, if there is any, is in our skill at kidnapping the code and turning it into an extension of our agency, that is to say, our skill at resignification – to be original we need to be skilful 'parasites', as suggested by Serres.
Key words agency, code, originality, performativity, plagiarius, post-human
Theory, Culture & Society 2011 (SAGE, Los Angeles, London, New Delhi, and Singapore), Vol. 28(6): 113^141 DOI: 10.1177/0263276411418131
Introduction

It could be claimed that in contemporary society digital encoding is becoming the dominant way of being and doing. Digital encoding increasingly mediates, or more precisely enacts, a vast array of human endeavour. In the digitally wired world it is becoming the way we work, the way we play, the way we conduct war, and so forth. It is becoming subsumed into every aspect of our human and physical geography. It exerts control over our elevators, our cars, our shopping, our writing, our access, our entertainment, our pleasure and much more (Thrift and French, 2002). In our continual pursuit of convenience and efficiency we 'delegate' to digitally encoded actors the most intimate details of our lives, and, in doing so, we conveniently forget and lose track of these encodings. Under the surface of our lives an increasingly complex geography of encoding is evolving with its own emergent performative outcomes – a performativity in which human agency is but a faint echo, silently shaping our present and future possibilities for becoming. As the media theorist Friedrich Kittler (1995) suggests (with reference to electronic writing): 'As a consequence, far reaching chains of self-similarities in the sense defined by fractal theory organize the software as well as the hardware of every writing. What remains a problem is only the realization of these layers which, just as modern media technologies in general, have been explicitly contrived in order to evade all perception. We simply do not know what our writing does' (emphasis added). Likewise, Jacques Derrida (2005: 23), in talking about his somewhat reluctant transition from writing with mechanical writing tools to electronic writing, makes this same point:
With pens and typewriters, you think you know how it works, how 'it responds.' Whereas with computers, even if people know how to use them up to a point, they rarely know, intuitively and without thinking – at any rate, I don't know – how the internal demon of the apparatus operates. What rules it obeys. . . . We know how to use them and what they are for, without knowing what goes on with them, in them, on their side; and this might give us plenty to think about with regard to our relationship with technology today – to the historical newness of this experience.
In and through the minutiae of circuit board switches and binary object code a continuous stream of ones and zeros (on and off, true and false) maps out this geography of the past, the present and the possible futures of our becoming. Design decisions, encoded and encapsulated in complex nests of logical statements – rules within rules within rules – enact our supposed agency based on complex relational conditions, which after many iterations of 'bug fixing' and 'tweaking' even the programmers
no longer understand. As Ullman observes:

The longer the system has been running, the greater the number of programmers who have worked on it, the less any one person understands it. As years pass and untold numbers of programmers and analysts come and go, the system takes on a life of its own. It runs. That is its claim to existence: it does useful work. However badly, however buggy, however obsolete – it runs. And no one individual completely understands how. (1997a: 116–17, emphasis added)
Once encoded, these design decisions (or rather the outcomes of the initial hacking and tweaking) embedded in these multifarious encoding entanglements withdraw into the background and are hardly ever revisited – even if they break down, patching and workarounds normally suffice. Yet these encoded geographies (Graham, 2005) seem to configure and circumscribe us and our lives in more or less significant ways, defining what is relevant and what is not, what needs attending to and what not – legitimating particular ways of being whilst simultaneously delegitimizing (or rendering more or less obscure) equally valid alternatives. Or as Lessig (2006: 79) argues: 'As the world is now, code writers are increasingly lawmakers'. In and through these encoding practices of programmers and system designers an encoded 'technological unconscious' is emerging which sustains a 'presence which we cannot access but which clearly has effects, a technical substrate of unconscious meaning and activity' (Thrift and French, 2002: 312). If Thrift and French and others are correct, then it is in and through these encoded landscapes that many of the ontological questions of our future will be determined (even if this determination is contingent and emergent). As such, this paper will attempt to render visible some of the contours of the phenomenon of encoding, i.e. present a short preliminary sketch as it were. In doing this it will, however, not only focus on digital code, important as this may be. It will rather suggest that a broader understanding of the phenomenon of encoding may render visible some of the concerns and contradictions apparent in contemporary discourse with regard to digital encoding. The preliminary sketch will be presented in three steps. First, I provide an outline sketch of some of the central notions of the phenomenon of encoding, as I see it – drawing on some of the work of Heidegger, Derrida, McLuhan, Butler, Latour and Kittler, amongst others.
Second, I provide a brief discussion of the encoding and extension of agency from speech to writing, to mechanical writing, and finally, to electronic writing. Third, I take a small detour to look at electronic writing in the context of academic writing and plagiarism detection practices to reveal how multiple and intersecting encoded agencies imbricate with many unexpected performative outcomes. Most significant in this detour is the more general question of the original (and by implication its opposite, the
plagiarized copy) in all encoded enactments of agency. Finally, I provide some concluding thoughts.

On the Phenomenon of Code/Encoding

The central claim of this essay is that all encoding frames and enframes. In framing it allows for the extension of agency; in enframing it performatively produces that which such agency assumes, and much more besides. However, such extension, and the performativity it affords, is never secure. There is inevitably a host of parasites or kidnappers ready to take such encoded agency hostage and turn it into the extension of their own agency. In the encoded mangle of agency (Pickering, 1995) anything can happen, but not exactly. What seems most remarkable about the ongoing becoming of the world is that although every event in the present is unprecedented and singular, in a significant way, the already there becoming of the world itself – that which renders possible the birth of this event – is remarkably familiar. In a very real sense today is similar to yesterday and we have good reason to anticipate that it will be similar tomorrow, again in a very significant way – so much so that we find no need to attend explicitly to the vast majority of it as we pursue our projects in the unfolding present. This seemingly contradictory simultaneity of the singular unprecedented event and its apparent repetition, in the unfolding presence of everyday life, is, in my view, a good place to start when considering the essential becoming of encoding. When referring to the notion of encoding I have in mind a vast array of normatively governed material enactments such as: software code, logical gates on circuit boards, legal codes, writing scripts, grammar, social norms, moral codes, protocols, technological scripts, social practices, habits, etiquette, and so forth.
Some of these encodings are seemingly quite rigid/explicit and may be the outcome of more or less explicit design intentions and decisions, and others are more malleable/implicit and emerge as a more or less implicit outcome of ongoing sociomaterial ordering practices. I would suggest that encodings are norm- or rule-governed material enactments accepted (or taken for granted) as the necessary conditions for beings to become what they are supposed to be. Two aspects of this demarcation need to be emphasized – that is, besides the obvious rule- or norm-governed basis of course. First, that encodings are enacted precisely because they are taken as the necessary conditions of becoming. By this I mean that all extension of agency (becoming) is necessarily encoded. Second, that all encodings are material enactments – even assumed social codes, such as moral codes or etiquette, become encodings precisely in their ongoing material enactments 'as codes'. Encodings do not have some original agency, in and of themselves, that can compel, or force, actors to act in certain ways rather than in others. Rather, encodings are exactly already constituted as 'codes' because they are accepted, or enacted, as the necessary conditions for beings to
become what they are supposed to be. As such, they render some forms of action/agency, if not impossible, then highly improbable, and others, if not inevitable, then exceedingly likely. Thus, being encoded embodies a certain ontological necessity for the beings so encoded – or, differently stated, the beings are the sort of beings that they are because they are always already encoded as such. Although encodings are the ontological conditions for becoming, they are not inevitable as such. In a very concrete sense they are unfounded – they may be more or less made up on the spot as a bricolage of what happened to be available in the moment (Derrida, 1990). Yet once they become taken for granted as codes they tend to become more or less intractable and irreversible – exactly because they continue to be taken as the ontologically necessary constitutive conditions for becoming. One might ask what it is that makes being encoded so compelling, or in some sense what 'forces' agents to take them as ontologically necessary. Encoding translates agency (becoming) from one event to another, thereby extending the agency/becoming of actors beyond the boundaries of the singular local event – but never in any precise manner (Latour, 1988, 1993, 2008; McLuhan, 1964; Donald, 1991).1 Differently stated, encoding allows for the repetition of the past (or the elsewhere) to be actualized in the present (the here) or in an anticipated future (the not-yet), not as a simple copy but rather as a trace. Software code can enact the intentions of designers wherever and whenever it runs, but not exactly. Encoding extends and translates agency but not necessarily its assumed intentionality (which was itself, of course, encoded in the first instance) – every translation is always also simultaneously a transformation (Latour, 1988, 2005).
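The claim that software 'can enact the intentions of designers wherever and whenever it runs, but not exactly' has a mundane analogue familiar to any programmer. As a purely illustrative aside (this example is mine, not drawn from the text), the intention 'add one tenth to two tenths' is transformed by the binary encoding through which it must pass:

```python
# The designer's intention: one tenth plus two tenths should equal three tenths.
# The binary floating-point encoding translates that intention, but not exactly:
# 0.1, 0.2 and 0.3 have no exact binary representation, so a trace of the
# encoding remains in the result.
result = 0.1 + 0.2
print(result)          # 0.30000000000000004
print(result == 0.3)   # False: the translation is also a transformation
```

The translation succeeds well enough for most purposes, which is precisely the point: the agency is extended, but on the encoding's terms.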
The present is always a singular event in which the past is more or less repeated as a trace rather than a simple copy – a trace, nevertheless, that necessitates some form of minimal repetition (or sameness) for becoming to be becoming rather than merely being a coincidental random event. It could be claimed, very cryptically, that without the material enactments of encoding, that is to say without encoded extension, there will be no being, no becoming and no history (Stiegler, 1998). Every encoding requires, as a necessary condition, other codes for the ongoing extension of agency. This is because any encoding can only translate within a circumscribed set of constitutive conditions – or, as Foucault (2007 [1969]) would say, a statement is only an enunciation within a specific discursive formation.2 Thus, the ongoing extension of agency is achieved through the ongoing (and mostly imperfect) encoding of agency from one code to another in a seemingly infinite regress – interlocking lines of codes which are always anchored in, and emerging from, the very materiality of being. Or as Kittler (2010: 31) suggests (in following McLuhan): 'the content of a medium is always another medium'. One might suggest infinite lines of interlocking code but it would probably be more accurate to describe the relationship between these codes as being 'nested'3 – codes within codes, within codes, and so forth. Thus, every attempt at encoding/translation is itself already encoded at a higher/prior level (which is a
necessary condition for translation to be possible in the first place). One might call every higher, or prior, level of encoding – in following Foucault (2007 [1969]) – an encoding formation. In this encoded nesting codes encapsulate and become encapsulated. Encapsulation hides complexity by covering over or rendering invisible supposedly unnecessary detail – what is referred to as 'black boxing' in Latour's terminology. In and through these encapsulated nests of codes encoding makes the traces of agency endure, but not in any straightforward way. Every translation of agency is always also simultaneously a transformation (Latour, 1988, 2005) – and a betrayal. Transformation in two interrelated senses: (1) in performative outcomes, to be discussed below, and (2) in resignification opportunities. In the multiplicity of encoded events there are always multiple points or possibilities for the otherness of the event to assert itself. In other words, there are multiple opportunities for resignification, as Butler (1990) might say (such as interpretive flexibility, affordance malleability, reappropriation, redundancy, etc.), multiple points of breakdown (such as accidents, misinterpretations, misuse, etc.), and all sorts of other perturbations (such as noise, coincidences, etc.). Differently stated, in every encoded event there is not just sameness (the enduring trace of the code) but also difference (the trace of the other).4 That is exactly what makes the event singular rather than a mere repetition. In every encoded enactment there is always a more or less essential otherness (surplus, incompleteness, etc.) at stake, which is in a sense its double – elements of the enactment which do not conform to the sameness and repetition which such encoding demands. In this otherness there is also the possibility for the encoding itself to become a stake in the event.
But never entirely, as any attempt to re-encode must acknowledge (cite) or somehow incorporate the already there legacy archive (with patches, workarounds, interim procedures, legacy systems, and so forth). However, this otherness does not mean that 'anything goes' – quite the contrary. The encoding produces a remarkable continuity which suggests at least a sufficient level of sameness (or citationality) to endure from one event to the next – this is exactly what makes language work, what enables computer programs to run, routines to persist, and so forth. Indeed, this is precisely what gives encoding its power as 'code'. Its power to translate agency is only secured in its ongoing enactment as code, that is to say, in its assumed or taken for granted ontological necessity. Thus, once established there is a remarkable incentive – that of agency extension – to maintain it as the code it has become, but this is not secure. It is always possible to be otherwise – as revolutions, abductions, mishaps and mistakes often remind us. All encodings are fundamentally performative, as was suggested above (Butler, 1996b). In short, by performative I mean that in its ongoing enactment encoding produces what it assumes. Encoding achieves its performativity through its assumed ontological necessity and its ongoing enactment (or extension) through repetition, or more accurately iteration (Derrida, 1977). Clearly some degree of repetition (or sameness) is the constitutive
condition for the translation of agency to endure, from one event to another, and therefore of extension. However, iterative repetition also immediately constrains – that is to say, it becomes the necessary condition for that which is repeated to be exactly a repetition. Thus, the extension of agency is only achieved if translation conforms to the encoding that will be taken (or accepted) in the event as a repetition (i.e. to conform to the norms of the encoding formation in Foucault's terms). As such, encoding is always already a condition for any agency whatsoever, or as Butler expresses it so well: 'Agency begins where [assumed] sovereignty wanes. The one who acts . . . acts precisely to the extent that he or she is constituted as an actor and, hence, operating within a field of enabling constraints [or encodings] from the outset' (Butler, 1997: 16, emphasis added). As always already encoded, the agency of actors becomes more or less ordered, one might say regular or repeatable. An encoded iteration 'is the vehicle through which ontological effects are established' (Butler, 1996b: 112). As was mentioned above, we must, however, be careful not to suggest that the agency of the actors somehow exists in any way separate from, or prior to, its ongoing encoded extension. There is no agency – and therefore no actor – which is prior to encoding/extension. In the words of Nietzsche: 'there is no ''being'' behind doing, acting, becoming; ''the doer'' is merely a fiction imposed on the doing – the doing [encoding] itself is everything' (Nietzsche, 1996 [1887]: 29). Actors are the performative outcomes of encoded material enactments; they are essentially intra-relational (Barad, 2003) – that is, relations without preexisting relata. To be sure, there is always a multiplicity of encoded agencies entangled in the enactment of any particular code, excessively flowing in all directions and sometimes in unexpected or unintended ways – i.e.
at any time the world could have been different. Although there are always infinite possibilities for the future to be otherwise, it is nevertheless, in a certain and important sense, remarkably the same. This, one could claim, is exactly due to the performative power of encoding, as code. In their ongoing enactment these multiplicities of entangled encoded agencies intra-act (Barad, 2003) or transduct (Mackenzie, 2002, 2006) – not only to condition action, but also to 'constitute as an effect the very subject it appears to express' (Butler, 1996a: 380). The encoded extension of agency obviously comes at a cost – it has a double structure, as Ihde (1990) suggested. As encoded, the otherness of the other – that which is singular – becomes more or less domesticated through the necessary sameness of encoding (Lévinas, 1985). Time, as duration, becomes domesticated through the encoding of the clock; emotions and feelings become reduced to available vocabulary in speech; the application of the law domesticates all its objects through its categories; and so forth. But this is the cost actors have to bear if they want the reward of extension (McLuhan, 1964). Undoubtedly, all actors at some level accept that in the silent tyranny of code there is the reward of agency/becoming – and with it a greater or lesser degree of continuity and order – which is itself a necessary condition for such agency to have any meaning in the event
it is enacted. In other words, codes are first and foremost productive, not only restrictive, or, more precisely, productive because they are already restrictive – 'enabling constraints', in Butler's words. Thus, we agents tend to heed its call in spite of the restrictive burden it places on us. Before making some more general comments with regard to the phenomenon of encoding it might be pertinent to ask about the sameness/otherness that this category of encoding covers (or not). At the start of this essay the discussion was focused very much on software code. In my discussion, however, I have now significantly broadened the scope of the phenomenon of encoding. For example, one might suggest that there is surely a difference between the encoding of language and that of software code, or between software code and the material script of a tool, such as a hammer. Galloway (2004) argues, for example, that software code is very different to ordinary language as '[software] code is the only language that is executable'. In a similar manner Ullman (1997b) suggests that: 'We can use English to invent poetry. . . . In programming you really can't . . . a computer program has only one meaning: what it does. . . . Its entire meaning is its function'. Hayles (2005: 50), in her essay 'Speech, Writing, Code', tends to agree with these claims. She suggests that 'code that runs on a machine is performative in a much stronger sense than that attributed to language' since, she argues, 'the performative force of language is . . . tied to the external changes through complex chains of mediation'. Whilst one might agree with the general point, one could equally argue that these distinctions, between software code and ordinary language, for example, are distinctions of degree rather than distinctions of kind. I would suggest that all code, to be code, must be 'executable' – otherwise it would not translate agency.
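The point that marks only count as 'executable' within the right constitutive formation can be sketched concretely. In this illustrative example (mine, not Introna's), one and the same string of characters is a valid enactment for the Python interpreter and mere noise for a JSON parser:

```python
import json

# One and the same sequence of marks...
text = "print('a move in chess is only a move within the rules of chess')"

# ...executes under the Python 'encoding formation'...
exec(text)

# ...but fails to count as an 'execution' under a different formation (JSON):
try:
    json.loads(text)
except json.JSONDecodeError:
    print("for the JSON parser the same marks are noise")
```

Nothing in the marks themselves determines their executability; it is the encoding formation within which they are enacted that does.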
What is different, however, is the nature of the necessary constitutive conditions for such execution – in particular the necessary degree of conformance or agreement to the encoding formation for the enactment to constitute an 'execution' rather than noise. Indeed, Wittgenstein (2001) would suggest to Ullman that the meaning of all language, not just software, 'is its function'. To enact – that is, to encode/translate agency – all codes must necessarily conform to their constitutive formations (as suggested by Foucault): a move in chess is only 'a move' if it conforms to the rules of chess, a hammer can only 'hammer' if it is used in the appropriate manner, Java will only be a 'programming language' if it conforms to the syntax, and so forth. Chun (2008: 299) makes this same argument when arguing against the supposed 'executable' nature of software code: 'source code is never simply the source of any action; rather, source code is only source code after the fact: its effectiveness depends on a whole imagined network of machines and humans'. I would suggest that one can always in principle enumerate all these necessary constitutive conditions for any encoding (such as Java, logical switches, English, moral codes, etiquette, hammers, and so forth) to be a valid encoding, or more precisely to be executable or enactable. In some cases the necessary constitutive conditions can be said to be more unambiguous and formal and in some cases they are more flexible and informal. One might
Introna ^ The Enframing of Code 121
further suggest that the translation of agency can be made more perfect by making codes more unambiguous (as in software code), but that may in turn reduce the sort of agency that can be translated. Likewise, codes can be made more £exible (as in everyday language) but with the resulting risk of a loss in translation (that is to say, more opportunities for resigni¢cation). Every encoding is already encapsulated within, and already encapsulates, other codes which are taken as necessary for its own becoming. Thus, in its recursive performativity code always already frames and enframes: ‘Enframing is the gathering together that belongs to that settingupon which sets upon man, i.e. challenges him forth, to reveal the real, in the mode of ordering, as standing reserve’ (Heidegger, 1977: 20). I would suggest that the enframing of encoding does not only translate and reveal the world as ‘standing reserve’ (which might be the performative outcome particular to modern technology, if one would accept Heidegger’s analysis) but more generally also reveals beings as always already encoded. As Heidegger suggests, every epoch has its dominant code, the code that orders and encapsulates other codes, and thereby translates and performatively constitutes the becoming of beings, in that epoch, as being essentially this or that sort of becoming ^ for example as ‘resources’ in the case of modern technology, according to Heidegger. Therefore, what we ¢nd is that as we humans and non-humans implicitly or explicitly take up and draw on the agency and order which encoding renders possible it also performatively constitutes us (and reveals the world to us) as already encoded, in a particular manner of being. In short: as extended beings we are always already enframed. 
Moreover, what we find, in tracing the code ‘all the way down (or up)’, is that it is not we humans who are the original source of the code; rather, our being is already what it is because it is always and already encoded in a particular way rather than another (Heidegger, 1977). Encoding, in its ongoing unfolding, has (or maybe always had) a teleology which is more original than (or at least co-original with) our human agency (Mackenzie, 2002; Stiegler, 1998). One might speculate, with Bergson (2003 [1911]) for example, that an essential élan vital is the more original source of this incessant logic of extension/becoming. In other words, the encoded human is already an extension of a more original agency within which it is already encapsulated. As such, the phenomenon of encoding transcends the supposed agency of human or non-human actors, even if they are necessarily implicated in its ongoing enactment. This was, I would suggest, Heidegger’s (1971: 146) point when he said it is language that speaks: ‘Man acts as though he were the shaper and master of language, while in fact language remains the master of man’. Or as Kittler (1996a: 738) suggests with reference to media:
However, I don’t believe in the old thesis that thus the media are prostheses of the body, which amounts to saying, in the beginning was the body, then came the glasses, then suddenly television, and from the television, the computer. . . . Rather, I think, it’s a reasonable hypothesis to say that the media, including books and the written word, develop independently from the body.
Kittler is not suggesting that there is no body as such. Rather, he is suggesting that the encoded human body – and human agency more generally – is not itself an original source, as is mostly presumed. Or as Stiegler (1998: 141) proposes: ‘The tool invents the human, or the human invents himself by inventing the tool, through techno-logical exteriorisation. But this exteriorisation is in fact the co-constitution of interior and exterior, according to a technological maieutic.’5 I have now presented a brief and preliminary sketch of the phenomenon of encoding. I appreciate that this sketch is neither complete nor sufficiently justified. In many respects it is an initial attempt to weave together a number of ideas taken from the work of Heidegger, Derrida, McLuhan, Butler, Latour and Kittler, without trying to reveal all the connections and possible contradictions – which would be beyond the scope of this essay. In the following section I will attempt to show how agency becomes encoded in a number of translations: from speech to writing, to mechanical writing, to electronic writing. This is not presented or supposed to be a historical analysis as such; rather, it focuses on a number of historical episodes of relevance in order to show how agency becomes encapsulated in different encodings and how such encoding has unanticipated performative outcomes – how it frames and enframes. Of particular interest is the question of originality (and, by implication, its opposite: the plagiarized copy).
The Encoding of Human Agency: From Speech to Electronic Writing

The Encoding of the Illiterate Mind/Agency: Iterated Mnemonic Patterns
It is a self-evident but non-trivial fact that when humans find themselves they are already speaking beings, already dwelling in language, as Heidegger would say.6 Language, it could be argued, is one of the greatest encoding achievements – one which has the human as one of its performative outcomes (Heidegger, 1971). Without the encoding of language there is no world, no thought and no extension of agency (Lafont, 2000) – the embodied subject would remain trapped in the immediacy of the present and the encoded materiality of the body. Through speaking we not only express (and impress) but indeed enact a world, a self, others, and much more besides – the encoding of language is, like all codes, performative. As Heidegger asserts:
Language is not a mere tool, one of the many which man possesses; on the contrary, it is only language that affords the very possibility of standing in the openness of the existent. Only where there is language, is there world . . . [and] Only where world predominates, is there history . . . [because of language] man can exist historically. (1988: 76, emphasis added)
In primary oral cultures this claim of Heidegger’s is much more evident than in our chirographic and typographic culture. In such oral cultures it is self-evident that language is first and foremost a mode of action – to speak is to act, and to enact a commonly shared world (Donald, 1991; Ong, 2002). However, ‘there is no way to stop sound and have sound’ (Ong, 2002: 32) – that is to say, sound as sound leaves no public trace. The extension of agency that voice/sound offers is limited unless, of course, it becomes repeated. But how can such repetition be secured? For agency to endure in an oral culture, speech needs to be encoded in mnemonic patterns – in repetitions, rhyme, rhythmic patterns, and a variety of formulary expressions (such as chiasmus and the epithetical form). In oral cultures ‘serious thought [and agency] is intertwined with memory systems’ (Ong, 2002: 32). These of course condition ‘the kind of thinking [and acting] that can be done, the way experience is intellectually organized. In an oral culture, experience is intellectualised mnemonically’ (Ong, 2002: 36; Whitman, 1958). The extension of agency thus becomes encoded as a narrative-centred, mnemonically encoded agency, mostly in the hands of a select few (the poets). It is generally accepted that the transition from a primary oral culture to a literate (chirographic) culture happened over a very long period of time, with a variety of intermediate stages (Deacon, 1997; Donald, 1991; Havelock, 1988; Ong, 2002).
The development and use of these ancient chirographic codes or scripts were mostly limited to an elite group of scribes in the service of the powerful.7 It should also be noted that the major reason for the development of these material mnemonic technologies was economic transaction and administration, especially in connection with increased trade and urbanization: in other words, to extend/enact agency at a distance, in situations where agency needed to reach beyond the local oral community and where accuracy in repetition was most important – in matters of power and wealth. The final step in the development of the western writing code was the remarkable encoding of the phonetic alphabet – which was invented only once, by nomadic Semitic people, in approximately the middle of the 19th century BCE (Goldwasser, 2010; Logan, 2004).8 It is interesting to note that this early phonetic alphabet was developed by illiterate workers in Egypt by idiosyncratically ‘cutting and pasting’ (using the acrophonic principle9) from Egyptian hieroglyphs, without regard for their function or value in Egyptian – thus allowing a basic form of literacy to emerge outside the elite circles of the scribes. As mentioned above, encoding is often a bricolage of what is available in the moment, i.e. it is essentially unfounded. The Greeks made the final step (in about the first half of the 8th century BCE) in the development of the phonetic alphabet by adding vowel sounds (required to encode Greek words), thereby completing the encoding of speech (sound) into the written word (sight) and producing a fully phonetic representation of all possible speech (Ong, 2002). McLuhan (1964: 84) argues that the easy-to-learn and flexible phonetic alphabet releases individual agency from the collective ‘tribal web’ of the oral encoding, but much more besides.

The Encoding of the Literate Mind/Agency: Authorship and New Modes of Cognition
When the Greek alphabet encoded speech into alphabetic writing (in a more or less precise manner), this seemingly simple encoded translation reconstituted the actors and their agency in fundamental ways – every encoded translation of agency is also simultaneously a transformation; it has performative outcomes. Ong (2002: 77) asserts that: ‘More than any other single invention, writing has transformed human consciousness’. Without the necessary mnemonic baggage of the oral culture, the encoding of language as writing can become more direct and concise. Thoughts can be written down and endlessly rehearsed, revised and corrected to render them more or less precise – without the eventual reader’s knowledge (Donald, 1991; Logan, 2004). In the material encoding of the manuscript, the narrative structure of the oral culture, centred on persons and events (as a mnemonic code), can be replaced with more general prose centred on themes and ideas (Havelock, 1988: 115). From this encoded translation a new and radically different form of cognition becomes possible (McLuhan, 1964) – what Donald (1991) refers to as a ‘theoretic culture’. For example, it is well known that formal logic only emerged in Greek culture after it had internalized the encoding of alphabetic writing (Donald, 1991; Logan, 2004: 113; McLuhan, 1962: 59; Ong, 2002: 52). Moreover, for the literate Greeks (and for most of modernity ever since) sight, as opposed to sound, becomes established as the original and true source of cognition. In this regard Aristotle (1998: 4, n. 980a) argued in The Metaphysics that ‘sight is the sense that especially produces cognition in us and reveals many distinguishing features of things’.10 It is a view also proposed by McLuhan (1960: report 5): ‘the phonetic alphabet alone, of all forms of writing, translates the audible and the tactile into the visible and the abstract.
Letters, the language of civilization, have this power of translating all of our senses into visual and pictorial space’. Sequential writing also has other performative outcomes, as argued by Flusser (2002). He suggests that this encoding of our senses into sequential texts – which demands from the reader and writer the ongoing synchronization of a diachronic object – constituted the very possibility for a historical consciousness to emerge. Not because texts enable us to reconstruct the past, but because the world becomes understood as an unfolding process – that is, historically, as successive symbols which continuously refer back to something prior for their meaning. He suggests that in encountering the world through sequentially encoded text, literate humans began to experience, understand, and evaluate the world as a successive ‘becoming’. According to him, such an existential attitude was not possible in the world prior to the text (i.e. in prehistory). He summarizes it as follows:
If one wants to decipher (‘read’) a text, one must let the eye glide along the line. Not until the end of the line does one receive the message, and then one must attempt to bring it together, to synthesize it. Linear codes demand a synchronization of their diachronicity. They demand progressive reception. And the result is a new experience of time, that is, linear time, a stream of unstoppable progress, of dramatic unrepeatability, of framing: in short, history. With the invention of writing, history begins, not because writing keeps a firm hold on processes, but because it transforms scenes into processes: it generates historical consciousness. (Flusser, 2002: 39)
This linear spatialization of the literate mind privileges an abstract, spatially-oriented mode of cognition, whereas speech/sound privileges a more narrative, time/duration-oriented mode of cognition (Donald, 1991; McLuhan, 1964). Thus, through the interiorization of the phonetic alphabet by Greek culture, western thought (and agency) becomes encoded as fundamentally abstract, logical, spatial and linear – encoded in this manner, the world emerges as an extended thing, res extensa.11 Furthermore, it has also been argued that this abstract logical form of cognition made it difficult for the Greeks (focused on abstract geometry) to invent the concepts necessary for the development of algebra (such as zero and infinity) – a feat that was instead achieved by the Hindu mathematicians (Logan, 2004). Even though there are many debates about the significance of the alphabetic encoding as such (Grosswiler, 2004), there is nevertheless general agreement that the sequential encoding of writing has had many very significant performative outcomes – that is, it produced the manner of beings it was supposed to express – which are now taken for granted as the way the world is (Olson, 1994). Through the encoding of the written text emerges not only historical consciousness, the world as progressive moments of becoming, but also the self as an increasingly extended subject that authors its own becoming – thus, not only an author but also, simultaneously, a life that is itself continuously being authored. In the encoded performativity of writing the individual ‘author’ discovers the material conditions of her own supposed agency. In the encoded performativity of the written text we human beings become the very beings that we take ourselves to be – the authors of ourselves and the world. As Kittler (2010: 34) argues: ‘we knew nothing about our senses [and our agency] until media [codes] provided models and metaphors’.
The Encoding of the Post-literate Mind/Agency: Mechanical and Electronic Writing
As literacy becomes more pervasive,12 the practice of writing by hand encodes a domain of action (and becoming) that is fundamentally textual, as opposed to verbal. In this textually encoded world the act of writing – and the surface of writing – is itself the place from which thinking (and action) seems to emerge. In the Blue and Brown Books, Wittgenstein (1958: 6–7) remarks: ‘We may say that thinking is essentially the activity of operating with signs. This activity is performed by the hand, when we think by writing . . . we may legitimately employ the expression[s] . . . ‘‘we think with a pencil on a piece of paper’’’, and further on he suggests: ‘if again we talk about the locality where thinking takes place we have a right to say that this locality is the paper on which we write’. The practice of writing, and more specifically the tools of writing, encode ‘thought’ and agency (its modality, its location, etc.) in a particular manner, with particular performative outcomes. Indeed, if we attend more carefully to the multiplicity of encoded agencies implicated in the textually encoded chirographic writing practice, it becomes evident that agency (and perhaps originality) is never neatly located in one place – is it in the head, in the hand, in the tool, on the writing surface, or in all/none of these? Most certainly agency is in some sense encoded in and through the tools of writing themselves, as Roland Barthes suggests:
I have an almost obsessive relation to writing instruments. I often switch from one pen to another just for the pleasure of it. In short, I’ve tried everything . . . except Bics, with which I feel absolutely no affinity. I would even say, a bit nastily, that there is a ‘Bic style’, which is really just for churning out copy, writing that merely transcribes thoughts. In the end, I always return to fine fountain pens. The essential thing is that they can produce that soft, smooth writing I absolutely require. (1991: 177, emphasis added)
Thus, writing with a ‘Bic pen’ produces a sort of Bic writing and thinking, which is just copying, whereas the fountain pen produces elegant writing and thinking, which is ‘soft and smooth’. But what happens to writing/thinking (and the agency of the writer) when writing becomes further encoded through mechanical devices such as the typewriter? In 1882 Nietzsche bought the recently patented Malling-Hansen Writing Ball typewriter.13 Nietzsche sent some rhymes he produced on his typewriter to a friend, a composer. In his reply his friend commented on the terseness of the language: ‘Perhaps you will through this instrument even take to a new idiom’, adding: ‘with me at any rate this could happen; I do not deny that my ‘‘thoughts’’ in music and language often depend on the quality of pen and paper’. To which Nietzsche replied: ‘You are right – our writing equipment takes part in the forming of our thoughts’. Through the encoding of the machine, writes Kittler (1999: 203), Nietzsche’s prose ‘changed from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style’. For Kittler (1999: 211), the history of the typewriter designates not simply the invention of a writing machine but rather ‘the turning point at which communications technologies can no longer be related back to humans. Instead, the former have formed the latter.’
In the mechanically encoded world of typescript the hand, and perhaps the supposed original agency of the author, seem to disappear – a fact commented on by Heidegger (with reference to the typewriter) and echoed by Derrida (1987: 178–9) in his essay ‘Heidegger’s Hand’:
The typewriter tends to destroy the word: the typewriter ‘tears (entreisst) writing from the essential domain of the hand, that is, of the word’, of speech. The ‘typed’ word is only a copy (Abschrift) . . . The machine ‘degrades (degradiert)’ the word or the speech it reduces to a simple means of transport (Verkehrsmittel), to the instrument of commerce and communication. Furthermore, the machine offers the advantage, for those who wish for this degradation, of dissimulating manuscripted writing and ‘character’. ‘In typewriting, all men resemble one another’.
One might suggest that what Heidegger was taking note of here (as expressed by Derrida) was in many respects the successive encapsulated encodings of the word (each previous code encapsulated in the next): from the spoken word, to the handwritten word, to the machine-typed word – as Kittler (1996b) suggests: ‘New media do not make old media obsolete; they assign them other places in the system’. In each of these iterations there is still the agency of the extended/translated speech act, but each encoded occasion performatively reproduces the agent and the action in more or less fundamental ways – of course, Heidegger is also posing some more fundamental questions. Very importantly, he also appears to hint at the fact that in machine writing the inscription of the body in the word (or of the word in the body) seems to be ‘erased’ – the machine tears writing from its essential domain as a hand (craft) work. In mechanical writing hands do not carefully manuscript thought/agency but rather function to depress keys. Thinking in a sense becomes once again a matter of rhythmic repetition – not of words and phrases, but of keystrokes. As the science-fiction writer Philip K. Dick (who could apparently type 120 words per minute) once remarked to his wife: ‘The words come out of [the keystrokes of] my hands, not my brain, I write with my hands’ (Sutin, 1994: 107). Perhaps one might suggest that, once thought/agency becomes encoded in the repetitive operations of the machine – in some sense torn free from the skills of the hand – the question of authorship can again be posed in an entirely different manner. The question of what the hand might have been thinking when scripting the text now seems (when faced with the typescript of the machine) less relevant than the question of what the typescript text itself is saying when it is read. The reader comes into focus as the relevant question for which the text is the answer.
In his influential essay ‘The Death of the Author’, Roland Barthes argues:
the modern scriptor is born at the same time as his text; he is not furnished with a being which precedes or exceeds his writing, he is not the subject of which his book would be the predicate; there is no time other than that of the speech-act, and every text is written eternally here and now. . . . for him [the author], on the contrary, his hand, detached from any voice, borne by a pure gesture of inscription (and not of expression), traces a field without origin – or at least with no origin but [the code of] language itself. (1989: 52, emphasis added)
In writing encoded as mechanical movements it seems the death of the author is rendered complete. The biographical code of the hand (the manuscript) becomes encapsulated and transformed into the typographical code of the typescript. Human agency becomes encoded as already being in the code of the machine. Texts increasingly become intertexts in which the author increasingly becomes ‘like a spider that comes to dissolve itself into its own web’, to use Barthes’ metaphor. With electronic writing, intertextuality (or intratextuality) becomes encoded into the writing practice as such – for example, in and through the seemingly simple operation of ‘cutting and pasting’. Cutting and pasting frees the composition of text from its linear encoding – as required by hand- and typewriting. It is possible that contemporary native electronic writers no longer appreciate what this means. James Fallows (1982) – a journalist for the Atlantic Magazine – writes of his first encounter, in 1979, with this simple operation: ‘When I first saw the [word processing] system in the back room at Optek, I was so dazzled by the instantaneous deletion of sentences and movement of paragraphs that I thought I could never want anything more.’ In reflecting on his writing practice as he moved from typewriter to word processor, he remarks: ‘The process works this way. When I sit down to write a letter or start the first draft of an article, I simply type on the keyboard and the words [and ideas] appear on the screen.’ Any idea, phrase or sentence need not be thought out in advance; it can simply be typed (or pasted from elsewhere) because it can always be deleted, amended or moved – nothing is final and everything is subject to potential revision.
Encoded as electronic writing, the practice of writing becomes constituted as a patchwork of fragments that can be ‘cut and pasted’ in a more or less ‘thoughtless’ manner – in other words, the electronic text becomes constituted as never being thought as such (Heim, 1999). Or, as Kittler suggests: ‘the written word develop[s] independently from the body’ (1996a: 738). Freed from the constraints of the physical paper, the paragraph, the page or the book, the boundaries of the electronic text become diffused, ill-defined, permeable and plastic. ‘The end of linear writing is indeed the end of the book’, as suggested by Derrida (1976: 86). In a sense one might say that in the electronic writing code all texts become hyper(inter)text – even if the final composition might mimic the traditional linear form. As the universe of available digital text fragments explodes, writing – and the agency it implies – becomes encoded as a more or less skilful performative pastiche of fragments, cut and pasted from elsewhere. It seems that in the electronic encoding of writing the radical intertextuality of all texts (encapsulated in the chirographic code) is rendered visible again. As Roland Barthes explains:
the intertext is a general field of anonymous formulae whose origin can scarcely ever be located; of unconscious or automatic quotations, given without quotation-marks . . . the current theory of the text turns away from the text as veil and tries to perceive the fabric in its texture, in the interlacing of codes, formulae and signifiers, in the midst of which the subject places himself and is undone, like a spider that comes to dissolve itself into its own web. (1981: 39, emphasis added)
In the electronic intertext everything (all writing and the agency it implies) is constituted as more or less reuse – the central encoding of the electronic intertext (and, Barthes would argue, of all texts) is reuse. In the non-linear ‘cutting and pasting’ (as reuse) something more fundamental is also happening to our sense of temporality. Instead of thinking of the text as a linear succession of words (and meaning) – words which add up to sentences, which add up to paragraphs, which add up to chapters, etc. – we instead have text fragments ‘cut and pasted’ as pre-given thought (or meaning): text fragments which are reused and woven together ‘out of context’, as it were. In the electronic intertext the text fragment does not stand in for the author; it is not a medium, it is immediate. The temporality of the reused text fragment is one of immediacy: the message is given first – it is immediately apparent, on the surface as it were – and then it is reappropriated for whatever purpose. What is paramount in the encoding of electronic writing is an elegant epigraphical phrase or fragment that says what it says immediately and apparently – that is to say, one that can ‘travel’ (be cut and pasted) wherever and for whatever it is needed. In the formation of electronic writing the writer/reuser is perhaps best described as a skilful hostage-taker or kidnapper (plagium) of the fragment – a fragment ‘whose origin can scarcely ever be located’. Of course, the electronic encoding of the act of writing not only reconstitutes the practice of writing, it also reconstitutes the act of reading (and, performatively, the ‘reader’). In the electronic code reading becomes encoded as a non-linear act of ‘finding’ – not finding through scanning and skimming, but finding as an act of random access afforded by the materiality of digital storage and access technologies (Kirschenbaum, 2004).
Search algorithms locate relevant entry points (perhaps based on algorithmically generated keywords, or on traces left by previous readers/travellers) from which the text is then recursively explored – a sort of recursive intertextual meandering, not only within a text but also between very disparate texts, thereby unravelling the supposed link between text and context, as well as authority. This reading as finding, as random access, is of course conditioned by the very materiality of the archive, as Derrida suggests: ‘the technical structure of the archiving archive also determines the structure of the archivable content even in its coming into existence and in its relationship to the future. The archivization produces as much as it records’ (1995: 17, emphasis added). For example, the algorithms of search tools (such as Google) condition in a significant way what can be found, where, and under what criteria (Introna and Nissenbaum, 2000) – as such, performatively producing visibility, legitimacy and much more. As the assumed agency of the author (and reader) becomes dispersed and dissipates, one might conclude with Kittler (1999) that in this electronic ‘universal Turing machine’ all ‘that remains of people is what media [encoding] can store and communicate’.14 In some respects such a claim seems absurd. It is also possible that one might not find a vantage point from which to establish whether it is or not. Nevertheless, what we do know is that in the vast encoded geography of the sociomaterial world (the vast intratext) agency is never simple to locate, and that the performative outcomes of these encoded agencies are often, if not mostly, unexpected – ontological transformations rather than simple transportations (Latour, 2005). Indeed, at any point they could have been otherwise, subverting the very agency supposedly enacted. In the following section I want to take what might seem a little detour by looking at how different encodings of writing encounter each other in an academic context. This is done to show how this intra-action – itself encoded in the electronic code of plagiarism detection systems – has performative outcomes which open up not only the question of originality/plagiarism in writing but also the question of the originality/plagiarism of all encoded enactments of agency, in a more general sense.

Electronic Writing, Originality and the Plagiarist

Plagiarius: one who abducts the child or slave of another. (OED)
In the humanities and the social sciences the essay has for a long time been seen as the standard bearer of the quality of thought, wit and learning of the student ^ established in elite schools and universities as an important gate-keeping mechanism (Heath, 1993). In its early incarnation the writing of the academic essay was most probably encoded in the ‘classical episteme of imitation’ (Pigman, 1980; White, 1965).15 However, under the sway of the ‘possessive individualism’ of the Romantic age, and particularly the development of copyright law, the author becomes established as the original source (and owner) of the text ( Jaszi, 1991; McFarland, 1974). W|th authorship encoded in this way the idea of plagiarism shifts from its classical encoding as a transgression of attribution (not composition) to being understood as a crime of deception ^ which is practiced by copying the ideas and expressions of the original ‘author’ (Lindey, 1952; Terry, 2007). Thus, in the contemporary age of intellectual property, and within the encoding of copyright law, plagiarism is mostly presented as the copying of another’s words (exact expressions) and presenting them as one’s own. In the Downloaded from tcs.sagepub.com at UNIV OF ROCHESTER LIBRARY on December 30, 2011
Introna – The Enframing of Code 131
educational context plagiarism is most often seen as an ‘institutional judgment which creates its own object as an expression of the limits of tolerance with respect to norms such as propriety, originality, and authenticity’ (Randall, 1991: 535). In practice there are probably as many views on what constitutes a plagiaristic writing practice as there are tutors. What is not disputed is the particularly dominant view that electronic writing has made it ‘easy’ for students to plagiarize (copy) and that this is considered a major problem – hence the proliferation of plagiarism policies, honour codes, etc., admonishing students to submit only ‘original’ work. As Hertz (1982) suggests: ‘The recurrent touting of originality . . . is no doubt a sign of the same uneasiness that produces the ritual condemnation of student plagiarists. . . . And, in one of those nicely economical turns that characterize powerful fantasies, the delinquent member is himself made to unwillingly represent an emblem of integrity, of the binding of the self and its signs.’ In other words, the production of the delinquent (the mere copier) is a violent but necessary part of creating certainty and of conferring on the institution its opposite – originality. How is this encoded production of the delinquent, the plagiarius, and of its opposite, the original author, achieved? In most cases it is encoded in the algorithm of Turnitin.16 For the Turnitin algorithm, similarity of a text with a model (in its database) is equal to plagiarism (or at least to non-originality). The algorithm detects similarity when a sufficiently long string of consecutive characters from the original is retained in the copied version.
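The matching principle just described can be illustrated with a small sketch. This is emphatically not Turnitin's actual algorithm (which is proprietary); the character-window fingerprinting, the window size and the whitespace normalization are all assumptions chosen purely for illustration of how "sufficiently long runs of consecutive characters" can be turned into a similarity score:

```python
# Toy illustration of run-of-consecutive-characters matching.
# Not Turnitin's algorithm; window size W and normalization are
# assumptions made for the sake of the example.

def shingles(text, w):
    """All overlapping character windows ('shingles') of length w."""
    text = " ".join(text.lower().split())  # normalize case and whitespace
    return {text[i:i + w] for i in range(len(text) - w + 1)}

def similarity(submission, source, w=20):
    """Fraction of the submission's windows also present in the source."""
    sub = shingles(submission, w)
    if not sub:  # submission shorter than one window
        return 0.0
    return len(sub & shingles(source, w)) / len(sub)

source = "The archivization produces as much as it records."
copied = "The archivization produces as much as it records."
edited = "Archiving, one might say, generates as much as it keeps."

# A verbatim copy shares every window; a thorough rewrite shares
# no 20-character run and so would count as 'original' on this logic.
assert similarity(copied, source) == 1.0
assert similarity(edited, source) < 0.1
```

On this toy logic, originality is exactly what the article argues it becomes under such an encoding: not the absence of copying, but the absence of a detectable run of copied characters.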
The location, within the fragment, of the consecutive string is important due to the sampling window.17 In some cases a small amount of change in the right way (or place) will make a copied fragment undetectable, while in other cases a large amount of editing will still leave it detectable.18 Similarity in expression is a concept encoded in copyright and intellectual property law, as was suggested above. In the software code of Turnitin plagiarism becomes encoded as detectable sequential character similarity – or, importantly for us, its corollary: originality becomes encoded as undetected fragments or copies. This encoding of plagiarism imbricates with electronic writing practices – as reuse or ‘patch-writing’ (Howard, 1995; Rice, 2003) – to performatively produce the plagiarist, the delinquent. In this encoding the plagiarist is the one who keeps fragments (which happen to be in Turnitin’s database) sufficiently similar for a match to be possible, and the original essay is the one that is sufficiently edited to remain undetected – i.e. certified as ‘original’ by Turnitin. Thus, students now claim originality (and authorship) when they get a ‘clean’ Turnitin report – and disciplinary committees are happy to confirm this status if a Turnitin match cannot be produced as evidence to the contrary. With textual checking and matching encoded in the Turnitin algorithm, certainty – and integrity – can be achieved, whilst transforming the question of plagiarism and originality into the difference between detected and undetected fragments. This performative production of the original and the plagiarist (in the encoding of Turnitin) is of course
unevenly distributed. It is often students with less sophisticated electronic writing practices, as well as limited linguistic skills, who are configured by this encoding as plagiarists – often non-native speakers, or those on the periphery of the community of practice, who tend to keep reused fragments intact in situations of uncertainty.19 Thus, the assumed neutrality and fairness of the code (as opposed to the human errors of the tutors) now gets transformed into a more or less arbitrary judgement of the algorithm (with its assumed certainty). Moreover, the blanket submission of all assessed work to Turnitin becomes the enactment of the ‘seriousness with which the institution deals with plagiarism’ – its emblem of integrity and originality. The performative outcome of the plagiarist as the detected copier – and, by implication, of its opposite, the undetected copier, as the ‘original’ author – does not end there. Since the electronic text retains no marks of its emergent history – it is not a manuscript, it has no specific past – it can also gain new commodity value as a Turnitin-‘certifiable’ original work. The signing away of any intellectual property rights when essays are submitted (required by most universities) and the confirmation of ‘originality’ by Turnitin reconstitute the student as the producer and owner of valuable intellectual property. In this encoded constitutive nexus students emerge as producers of valuable commodities when they write an essay for a course – commodities which may also have a market value. Hence, once academic writing (and the originality it supposes) is encoded in this manner, students see it as a normatively legitimate action to sell their original work on the internet (for example on eBay). This makes particular sense in an age where education is increasingly encoded as a market transaction in which commodity exchange takes place (Saltmarsh, 2004; Vojak, 2006).
Moreover, once it is encoded as a commodity, it seems entirely appropriate to ‘outsource’ the act of electronic writing to ghost writers who can produce work that is guaranteed to be original (i.e. unlikely to be detected by Turnitin). What does this little detour reveal about the encoding of writing, and about the becoming of encoding more generally? The traditional academic essay is encoded in the formation of writing as the manuscript, in which the original author is a performative outcome (how do I become an author? I produce an original manuscript). In contrast, electronic writing is encoded in the formation of writing as reuse, in which the performative outcome is the original author as an undetected skilful reuser. This writing as reuse is a skilfulness encoded differently to that of ‘authorship’, as Pennycook (2007: 589) suggests: ‘to repeat a text in another context is an inexorable act of recontextualization and it is only a particular ideology of textual originality that renders such a view invisible’. Indeed, to check for its ‘original source’ is to assume an encoding which misunderstands the type of agency such an encoding implies. The agency of reuse is a borrowed agency, an agency of kidnapping, as it were. The skilful reuser is a plagiarius par excellence – the one who abducts the fragment skilfully in order to graft onto it her own agency: more specifically, abducting the encoding that some other skilful
reuser has already abducted (which is why it is already enslaved). In copyright law, too, there are those who suggest that the encoding of reuse unravels the assumptions of traditional copyright law. For example, legal theorists such as Ginsburg (2005: 381–2) argue that, in the age of electronic writing (and the use of ghost-writers), ‘authorship’ is a matter of a trademark, where the ‘author’ is the person who presents herself as such, who succeeds in persuading the public that her personality pervades the work, even if someone else wrote it. The extension of the tutors’ agency (encoded as electronic plagiarism detection) to check the assumed originality of the texts submitted seems to have had many more or less unexpected performative outcomes, such as a particular encoding of the plagiarist, the original author, academic integrity, producers of intellectual property, ghost writers, etc. Most significant for us is the encoded production of the skilful reuser (or kidnapper) as original, thus revealing the essential question of the role of kidnapping (the plagiarius) in the encoding of all agency – that is to say, in any encoding, who is it that is speaking/acting?

The Enframing of Code: Some Concluding Thoughts

What is most remarkable about the ongoing becoming of the world is that, although it is, and could be, completely otherwise in each and every moment, it is rather extraordinarily similar and familiar. This extraordinary continuity, we argue, can be accounted for by the fact that normatively encoded material enactments are the necessary condition for the ongoing extension of agency, of becoming. However, such encoded extension is not just a translation but also, simultaneously, a transformation, with many unexpected performative outcomes and opportunities for resignification. As we have seen above, the encapsulation of machine writing into a digital encoding (as electronic writing) radically reconstitutes the agency so extended, and much more.
Temporality for the writer and the reader is no longer sequential and linear. Texts become constituted through, and as, reusable fragments, more or less skilfully woven together. Authorship, and its associated notions of originality, authority and ownership, dissipates. In the electronic intertext the extension of agency becomes increasingly fragmented and insecure – as the proliferation and failure of digital rights management systems testify. But, most significantly, the digital encoding makes it more apparent that agency becomes exactly my agency through the skilful kidnapping of an encoding that has its sources elsewhere – it is always already plagiaristic (a ‘crime’ of plagium, one might say). Our detour into the imbrication of electronic writing and academic writing (and the problem of plagiarism detection) has unexpectedly opened up the question of kidnapping that underlies all encoded enactments of agency – it has revealed its necessarily plagiaristic nature. As always already encoded beings we are never authors; we are instead all skilful reusers. To extend agency we have to submit to the existing encoding and kidnap that encoding simultaneously – enabling constraints, in Butler’s language. Our
originality, if there is any, lies in our skill at kidnapping the code and turning it into an extension of our agency, that is to say, in our skill at resignification. One might suggest with Serres (2007) that to be original we need to be skilful ‘parasites’. Nevertheless, one needs to bear in mind that any attempt at resignification will itself be subject to kidnapping, either directly or through the encoding formation at a higher level. As Serres (2007: 13) suggests, in this logic ‘[t]he parasited one parasites the parasites . . . But the one in the last position wins this game’. However, in our parasitic cutting and pasting we must necessarily conform to the essential elements of the encoding for it to continue to function as a code – otherwise it will not extend any agency. The parasite cannot destroy the host without losing its parasitic advantage. As our sociomaterial world becomes more complex, agency becomes increasingly encapsulated – nested as codes within codes within codes. Some of the performative outcomes disappear from view and become taken for granted as the way the world is (i.e. it could not be otherwise). This is especially true for encodings that are increasingly subsumed in nested codes (i.e. not available for kidnapping, except by those with specialist expertise, as the complex financial instruments of the recent crisis revealed). Perhaps we are seeing the emergence of a new elite of scribes – again in the service of the powerful. As agency becomes encoded in increasingly imperceptible encodings – on a digital level or at the nano-scale, for example – the ability of humans to take it hostage diminishes. In a sense these encodings are becoming more and more individual (more and more concrete, as Simondon would say). As such, the traces of the human other may eventually disappear altogether – perhaps an entirely new kidnapper/parasite will emerge. We may speculate that perhaps another encoded agency has already kidnapped us humans from the start.
As Kittler (1999: 1–2) suggests:

And once optical fibre networks turn formerly distinct data flows into a standardized series of digitized numbers, any medium can be translated into any other. With numbers, everything goes. Modulation, transformation, synchronization; delay, storage, transposition; scrambling, scanning, mapping – a total media link on a digital base will erase the very concept of medium. Instead of wiring people and technologies, absolute knowledge will run as an endless loop. . . . But there still are media [encodings].
It seems that the performative outcome of the massive extension of agency in the digital – and also in the chemical, the nano, the genetic, etc. – is the dissolution of the supposed human as the origin of such agency (Leroi-Gourhan, 1993). In a sense the encoding of the digital reveals the radical proposition that we have never been originally human, or at least not the original human we supposed. Clearly, this paper is in many respects relatively speculative and tentative in its analysis. It is undoubtedly possible to interpret the theoretical
work and the historical events upon which it is based in many different ways. I want to be careful not to claim too much in the name of this category of encoding, but also not too little. I believe I have shown that encoding is indeed a useful and productive ontological category. It does some useful work, but it also unmistakably enframes our understanding in significant ways (many of which we have yet to articulate or understand). It may also have many unexpected performative outcomes, but that is always the cost of extension.

Acknowledgements
Thanks to Theo Vurdubakis for pushing me to go even further, and to the anonymous reviewers for their insightful and generous comments on previous drafts. The resulting work has benefited greatly from their critical engagement with it. I am also indebted to Dimitra Petrakaki for pointing out errors of grammar.
Notes
1. McLuhan (1964: 56) uses the Greek notion of ‘metaphor’ rather than the Latin of ‘translate’. Metaphora is ‘a transfer’, especially in the sense of transfer from one word to a different word. It literally means ‘a carrying over’.
2. For Foucault (2007), the making of statements – within, for example, a disciplinary discourse – is governed by a vast set of rules about, for instance, who is allowed to speak, what can be spoken about, how claims can be made and justified, and so forth. These implicitly and explicitly understood rules govern the way the discourse develops. They are both restrictive and productive. All statements, to count as ‘statements’, are already governed by a discursive formation, which is the necessary condition for them to be taken as meaningful and legitimate.
3. In computer programming, a nested function (or nested procedure/subroutine) is a function which is lexically encapsulated within another function.
4. This essential otherness in all encoding has been accounted for in a variety of disciplines: for example, in Gödel’s incompleteness theorems in mathematics, in Paul Ricoeur’s argument on the essential surplus of meaning in a text, and in the claims of interpretive flexibility inherent in artefacts made by the social construction of technology tradition, to name but a few.
5. See also Mackenzie’s (2002) very lucid discussion of this co-originality.
6. For two contrasting (and in some ways complementary) views of the development of language refer to Deacon (1997) and Leroi-Gourhan (1993).
7. These ancient scripts (which emerged simultaneously in a variety of geographic locations) mostly used some combination of iconic symbols to more or less encode spoken language, such as pictographs, ideographs and rebuses. Pictographs are symbols where pictures represent things more or less as they are. Ideographs are more abstract symbols used to represent things.
Rebuses are combinations of pictures that represent the sounds that make up the word.
8. There are of course many competing theories of the precise history of the development of the phonetic alphabet. One of the dominant theories, which is followed here, is that the phonetic alphabet emerged as Semitic adaptations of Egyptian hieroglyphics.
9. The acrophonic principle is where the consonants of a word to be inscribed are represented by pictures of objects whose names begin with those consonants. This is a system of acronyms that works smoothly for Semitic languages such as proto-Sinaitic, whose words always begin with a consonant.
10. See also Heidegger (1962: 215) and McLuhan (1962, 1964) for similar arguments.
11. For similar arguments also refer to Logan (2004) and Rotman (2002).
12. Mostly due to the printing press, which unfortunately cannot be covered here – see McLuhan (1962).
13. This was necessary because his vision was slowly deteriorating. This loss of sight forced him to severely curtail his reading and writing activity. However, with the typewriter in hand he was able to continue to write – even if it did not last very long.
14. For further discussion refer to the work of authors such as Baudrillard (1994), Hayles (1999), Virilio (2005) and, of course, Kittler (1999) – all of whom have very different interpretations of the performative outcomes of the digital code.
15. White (1965: 75) summarizes the principles governing this ‘classical episteme of imitation’ as follows: ‘The writer should take only what he finds usable in his predecessors, should add to it whatever changes or improvements later ages, including his own, have developed, and should transform and supplement all he has gathered by the operation of his own literary genius.’
16. Turnitin is the current market leader in plagiarism detection systems. They claim that their system is used by 5000 institutions in 80 countries worldwide (covering 12 million students and educators) and that 50,000 papers are submitted to their system every day. They also claim that their crawler, ‘Turnitinbot’, has downloaded over 9.5 billion internet pages to their detection database and that it updates itself at a rate of 60 million pages per day (Turnitin website).
17. For example, experiments with Turnitin showed that if one changed one word in a sentence in the right place – often between the seventh and fourteenth word of the sentence – then Turnitin did not recognize it, even if all the rest of the sentence remained exactly the same (Hayes and Introna, 2005).
18. It is also possible that nothing is detected because there is no copy of the submitted text in Turnitin’s database to compare it with. This is entirely possible because Turnitin’s database only covers electronic sources (and only what it can index on the world wide web – i.e. publicly available documents in the right format).
19. Roig (2001) has shown in his study that even experienced academics tend to keep significant fragments intact when confronted with difficult material.

References
Aristotle (1998) The Metaphysics (trans. H. Lawson-Tancred). London: Penguin Books.
Barad, K. (2003) ‘Posthumanist Performativity: Toward an Understanding of How Matter Comes to Matter’, Signs 28(3): 801–831.
Barthes, R. (1981) ‘Theory of the Text’, pp. 31–47 in R. Young (ed.) Untying the Text: A Post-structuralist Reader. London: Routledge.
Barthes, R. (1989) The Rustle of Language (trans. R. Howard). Berkeley: University of California Press.
Barthes, R. (1991) The Grain of the Voice: Interviews 1962–1980 (trans. L. Coverdale). Berkeley: University of California Press.
Baudrillard, J. (1994) Simulacra and Simulation. Ann Arbor: University of Michigan Press.
Bergson, H. (2003 [1911]) Creative Evolution (trans. A. Mitchell). Whitefish, MT: Kessinger Publishing.
Butler, J. (1990) Gender Trouble: Feminism and the Subversion of Identity. New York: Routledge.
Butler, J. (1996a) ‘Gender as Performance’, pp. 109–126 in P. Osborne (ed.) A Critical Sense: Interviews with Intellectuals. New York: Routledge.
Butler, J. (1996b) ‘Imitation and Gender Insubordination’, pp. 371–387 in A. Garry and M. Pearsall (eds) Women, Knowledge, and Reality: Explorations in Feminist Philosophy. New York: Routledge.
Butler, J. (1997) Excitable Speech: A Politics of the Performative. London: Routledge.
Chun, W.H.K. (2008) ‘On ‘‘Sourcery,’’ or Code as Fetish’, Configurations 16(3): 299–324.
Deacon, T.W. (1997) The Symbolic Species: The Co-evolution of Language and the Human Brain. London: Penguin Press.
Derrida, J. (1976) Of Grammatology. Baltimore, MD: Johns Hopkins University Press.
Derrida, J. (1977) Limited Inc. Evanston, IL: Northwestern University Press.
Derrida, J. (1987) Deconstruction and Philosophy: The Texts of Jacques Derrida (ed. J. Sallis). Chicago: University of Chicago Press.
Derrida, J. (1990) ‘Force of Law: The Mystical Foundation of Authority’, Cardozo Law Review 11(5/6): 919–1045.
Derrida, J. (1995) Archive Fever: A Freudian Impression. Chicago: University of Chicago Press.
Derrida, J. (2005) Paper Machine. Stanford, CA: Stanford University Press.
Donald, M. (1991) Origins of the Modern Mind: Three Stages in the Evolution of Culture and Cognition. Cambridge, MA: Harvard University Press.
Fallows, J. (1982) ‘Living with a Computer’, The Atlantic Monthly 250(July): 84–91.
Flusser, V. (2002) Writings (trans. E. Eisel). Minneapolis, MN: University of Minnesota Press.
Foucault, M. (2007 [1969]) Archaeology of Knowledge (trans. A.M. Sheridan-Smith). London: Routledge.
Galloway, A.R. (2004) Protocol: How Control Exists After Decentralization. London: MIT Press.
Ginsburg, J.C. (2005) ‘Author’s Name as a Trademark: A Perverse Perspective on the Moral Right of Paternity’, Cardozo Arts & Entertainment Law Journal 23(2): 379–389.
Goldwasser, O. (2010) ‘How the Alphabet Was Born from Hieroglyphs’, Biblical Archaeology Review 36(2). Available at: http://www.bib-arch.org/bar/article.asp?PubID=BSBA&Volume=36&Issue=2&ArticleID=6 (consulted May 2010).
Graham, S.D.N. (2005) ‘Software-sorted Geographies’, Progress in Human Geography 29(5): 562–580.
Grosswiler, P. (2004) ‘Dispelling the Alphabet Effect’, Canadian Journal of Communication 29(2). Available at: http://cjc-online.ca/index.php/journal/article/view/1432 (consulted May 2010).
Havelock, E.A. (1988) The Muse Learns to Write: Reflections on Orality and Literacy from Antiquity to the Present. New Haven, CT: Yale University Press.
Hayes, N. and L.D. Introna (2005) ‘Cultural Values, Plagiarism, and Fairness: When Plagiarism Gets in the Way of Learning’, Ethics & Behavior 15(3): 213–231.
Hayles, N.K. (1999) How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: University of Chicago Press.
Hayles, N.K. (2005) My Mother Was a Computer: Digital Subjects and Literary Texts. Chicago: University of Chicago Press.
Heath, S.B. (1993) ‘Rethinking the Sense of the Past: The Essay as Legacy of the Epigram’, in L. Odell (ed.) Theory and Practice in the Teaching of Writing: Rethinking the Discipline. Carbondale: Southern Illinois University Press.
Heidegger, M. (1962 [1927]) Being and Time. Oxford: Wiley-Blackwell.
Heidegger, M. (1971) Poetry, Language, Thought (trans. A. Hofstadter). New York: Harper & Row.
Heidegger, M. (1977) The Question Concerning Technology, and Other Essays (trans. W. Lovitt). New York: Harper & Row.
Heidegger, M. (1988) Existence and Being (trans. W. Brock). Washington, DC: Regnery Gateway.
Heim, M. (1999) Electric Language: A Philosophical Study of Word Processing. New Haven, CT: Yale University Press.
Hertz, N. (1982) ‘Two Extravagant Teachings’, Yale French Studies 63: 59–71.
Howard, R.M. (1995) ‘Plagiarisms, Authorships, and the Academic Death Penalty’, College English 57(7): 788–806.
Ihde, D. (1990) Technology and the Lifeworld: From Garden to Earth. Bloomington: Indiana University Press.
Introna, L.D. and H. Nissenbaum (2000) ‘Shaping the Web: Why the Politics of Search Engines Matters’, The Information Society 16(3): 169–185.
Jaszi, P. (1991) ‘Toward a Theory of Copyright: The Metamorphoses of Authorship’, Duke Law Journal 2: 455–502.
Kirschenbaum, M. (2004) ‘Extreme Inscription: Towards a Grammatology of the Hard Drive’, TEXT Technology 13(2): 91–125.
Kittler, F.A. (1995) ‘There Is No Software’, CTheory. Available at: http://www.ctheory.net/articles.aspx?id=74 (consulted May 2010).
Kittler, F.A. (1996a) ‘Technologies of Writing: Interview with Friedrich A. Kittler’ (trans. M. Griffin and S. Herrmann), New Literary History 27(4): 731–742.
Kittler, F.A. (1996b) ‘The History of Communication Media’, CTheory. Available at: www.ctheory.net/articles.aspx?id=45 (consulted May 2010).
Kittler, F.A. (1999) Gramophone, Film, Typewriter (trans. G. Winthrop-Young and M. Wutz). Stanford, CA: Stanford University Press.
Kittler, F.A. (2010) Optical Media (trans. A. Enns). Cambridge: Polity.
Lafont, C. (2000) Heidegger, Language, and World-disclosure. Cambridge: Cambridge University Press.
Latour, B. (1988) The Pasteurization of France. Cambridge, MA: Harvard University Press.
Latour, B. (1993) We Have Never Been Modern. Cambridge, MA: Harvard University Press.
Latour, B. (2005) Reassembling the Social: An Introduction to Actor-network-theory. Oxford: Oxford University Press.
Leroi-Gourhan, A. (1993 [1964]) Gesture and Speech. Cambridge, MA: MIT Press.
Lessig, L. (2006) Code. New York: Lawrence Lessig.
Lévinas, E. (1985) Ethics and Infinity (trans. P. Nemo). Pittsburgh: Duquesne University Press.
Lindey, A. (1952) Plagiarism and Originality. Westport, CT: Greenwood Press.
Logan, R.K. (2004) The Alphabet Effect: A Media Ecology Understanding of the Making of Western Civilization. Cresskill, NJ: Hampton Press.
Mackenzie, A. (2002) Transductions: Bodies and Machines at Speed. London: Continuum.
Mackenzie, A. (2006) Cutting Code: Software and Sociality. New York: Peter Lang.
McFarland, T. (1974) ‘The Originality Paradox’, New Literary History 5(3): 447–476.
McLuhan, M. (1960) Report on Project in Understanding New Media. Washington, DC: National Association of Educational Broadcasters.
McLuhan, M. (1962) The Gutenberg Galaxy: The Making of Typographic Man. Toronto: University of Toronto Press.
McLuhan, M. (1964) Understanding Media: The Extensions of Man. New York: McGraw-Hill.
Nietzsche, F.W. (1996 [1887]) On the Genealogy of Morals: A Polemic: By Way of Clarification and Supplement to My Last Book, Beyond Good and Evil (trans. D.D. Smith). Oxford: Oxford University Press.
Olson, D.R. (1994) The World on Paper: The Conceptual and Cognitive Implications of Writing and Reading. Cambridge: Cambridge University Press.
Ong, W.J. (2002) Orality and Literacy: The Technologizing of the Word, 2nd edn. New York: Routledge.
Pennycook, A. (2007) ‘‘‘The Rotation Gets Thick. The Constraints Get Thin’’: Creativity, Recontextualization, and Difference’, Applied Linguistics 28(4): 579–596.
Pickering, A. (1995) The Mangle of Practice: Time, Agency, and Science. Chicago: University of Chicago Press.
Pigman, III, G.W. (1980) ‘Versions of Imitation in the Renaissance’, Renaissance Quarterly 33(1): 1–32.
Randall, M. (1991) ‘Appropriate(d) Discourse: Plagiarism and Decolonization’, New Literary History 22(3): 525–541.
Rice, J. (2003) ‘The 1963 Hip-Hop Machine: Hip-Hop Pedagogy as Composition’, College Composition and Communication 54(3): 453–471.
Roig, M. (2001) ‘Plagiarism and Paraphrasing Criteria of College and University Professors’, Ethics & Behavior 11(3): 307–323.
Rotman, B. (2002) ‘The Alphabetic Body’, Parallax 8(1): 92–104.
Saltmarsh, S. (2004) ‘Graduating Tactics: Theorizing Plagiarism as Consumptive Practice’, Journal of Further and Higher Education 28(4): 445–454.
Serres, M. (2007) The Parasite (trans. L. Schehr). Minneapolis, MN: University of Minnesota Press.
Stiegler, B. (1998) Technics and Time: The Fault of Epimetheus. Stanford, CA: Stanford University Press.
Sutin, L. (1994) Divine Invasions: A Life of Philip K. Dick. London: HarperCollins.
Terry, R. (2007) ‘‘‘Plagiarism’’: A Literary Concept in England to 1775’, English 56(214): 1–16.
Thrift, N. and S. French (2002) ‘The Automatic Production of Space’, Transactions of the Institute of British Geographers 27(3): 309–335.
Ullman, E. (1997a) Close to the Machine: Technophilia and Its Discontents. San Francisco: City Lights Books.
Ullman, E. (1997b) ‘Elegance and Entropy: Interview by Scott Rosenberg’, Salon 21. Available at: http://www.salon.com/21st/feature/1997/10/09interview.html (consulted January 2011).
Virilio, P. (2005) The Information Bomb (trans. C. Turner). London: Verso.
Vojak, C. (2006) ‘What Market Culture Teaches Students About Ethical Behavior’, Ethics and Education 1(2): 177–195.
White, H.O. (1965) Plagiarism and Imitation during the English Renaissance. New York: Octagon Books.
Whitman, C.H. (1958) Homer and the Heroic Tradition. Cambridge, MA: Harvard University Press.
Wittgenstein, L. (1958) The Blue and Brown Books. New York: Harper & Row.
Wittgenstein, L. (2001 [1953]) Philosophical Investigations: The German Text, with a Revised English Translation (trans. G.E.M. Anscombe). Oxford: Wiley-Blackwell.
Lucas D. Introna is Professor of Organization, Technology and Ethics at Lancaster University and is co-editor of Ethics and Information Technology. His research interest is in the social study of technology. He is particularly interested in the ethics and politics of technology (and materiality more generally). For more information refer to: http://www.lums.lancs.ac.uk/owt/profiles/lucas-introna/ [email: [email protected]]
Anticipating Harm: Regulation and Irregularity on a Road Construction Project in the Peruvian Andes
Hannah Knox and Penny Harvey
Abstract This article draws on an ethnography of road construction in the Peruvian Andes to explore how engineering projects operate as sites of contemporary governance. Focusing on the way in which engineering projects entail a confrontation with dangers of various kinds, we explore how people caught up in road construction processes become preoccupied with the problem of anticipated harm. Drawing on the notion of ‘codes of conduct’, we suggest that the governmental effects of practices which attempt to deal with the uncertainty of the future might be analysed as a tension between the enactment of two different kinds of codification. Building on the notion of coding as a situated material practice, we investigate the appearance of two different ways of encoding a relationship with an uncertain future, which we term ‘machinic’ and ‘emergent’. The article builds on a description of these two ways of encoding uncertainty to explore how formal mechanisms of dealing with anticipated harm, such as the regulations of health and safety, are both unsettled and reinvigorated by more affective and relational dimensions of practice.
Key words anthropology | control | difference | knowledge | risk | safety
Introduction

It was not long into our research that we heard about the first death. We were three months into ethnographic fieldwork with a consortium of engineers who were constructing a 700 km 'interoceanic' highway
Theory, Culture & Society 2011 (SAGE, Los Angeles, London, New Delhi, and Singapore), Vol. 28(6): 142–163 DOI: 10.1177/0263276411420889
between the southern Andean highlands of Peru and the border with Brazil in the Amazonian rainforest when news came that one of the construction workers on the project had been killed. News travelled fast along the road and by the evening of the day of the accident, the small Andean town where we were staying (several hours' drive away from the accident site) was buzzing with news of the fatality. No one knew quite what had happened, but everyone wanted to talk about it. A few days later, when we were travelling with two engineers from the consortium, we drove past the spot where it had occurred. Two wooden crosses were hammered into the ground – one provided by the construction company and one by the relatives of the deceased man. When the engineers got out of the car, their driver told us what he knew of the incident. A man had been working behind a bulldozer and it seemed he had tripped or somehow been distracted and the machine had driven right over him and crushed him.

Road construction, like all construction projects, is a process which has to grapple with the problem of anticipated harm.1 Whilst construction deaths are an obvious danger, we might argue that the problem of harm has an even more fundamental relationship to road building, forming an inherent part of the impetus which drives road construction in the first place. Building on a generalized politics of fear (Gourevich, 2010) of the social and economic effects of exclusion from international flows of capital and resources, road building offers a tangible form of political action, protecting the population by facilitating their connection to global processes.
In Peru, as in many developing nations, road building has long been seen as a way of mitigating some of the dangers of underdevelopment, economic isolation and political volatility.2 Road building projects are promoted by external, multilateral agencies looking to create new markets and to stimulate commodity flows whilst simultaneously providing the necessary degree of infrastructural support in fields of health, education, and institutional process to ensure the political stability necessary for economic development.3 Supported by the World Bank, the Peruvian government is actively engaged in several large-scale road construction programmes in what could be seen as investments in 'safe living' for the nation and its citizens, who would otherwise be at risk of suffering the dangers of isolation, disconnection and economic and political disenfranchisement.

As well as responding to a generalized politics of fear, new roads also offer a solution to more mundane and day-to-day dangers. Good roads promote a sense of enhanced security and appear to materialize a kind of liberal social ordering. In the challenging terrain of the Peruvian Andes, modern road construction promises to straighten out the sharp bends around which drivers of large buses have learnt to skilfully manoeuvre; to place safety barriers between cars and the deep precipices over which they frequently fell; to shore up rocky cliffs and blast away precarious outcrops
which used to turn into rock falls and landslides in bad weather or when earth tremors struck; and most importantly perhaps, to flatten bumpy, dusty, steep, narrow roadways into smooth, level, two-way tarmac strips that can increase the speed of travel, cutting journey times between towns from days or weeks down to a matter of hours. Through material interventions, roads are claimed to produce positive social effects, increasing mobility whilst also improving safety.

Road building might in this respect be seen as an ideal example of the exercise of governmentality through the implementation of measures which are oriented towards the freedom of the population (Foucault, 2008; Joyce, 2003). In lectures transcribed in The Birth of Biopolitics (2008), Foucault argues that the emergence of a governmental concern with the problem of danger and security came during the 19th century out of liberal principles which were concerned with how to manufacture the freedom of both the individual and the collective. The greatest risk to freedom came from the encroachment of alternative interests. For the collective, it became imperative to protect against the danger of the encroachment of individual interests, whilst the individual simultaneously needed to be protected from threats produced by the collective. Foucault argued:

If on the one hand . . . liberalism is the art of government that fundamentally deals with interests, it cannot do this . . . without at the same time managing the dangers and mechanisms of security/freedom, the interplay of security/freedom which must ensure that individuals or the community have the least exposure to danger. (Foucault, 2008: 66)
The management of danger and the fear of disconnection through technologies like roads can thus be linked to a historically specific form of governmentality which simultaneously promoted the freedom of the individual whilst putting in place the means through which the interests of the collective could also be protected. In the case of road building, we might argue that a generalized fear of the effects of disconnection leads to interventions which govern by reinforcing a principle of freedom through the implementation of a logic of security. How this occurs is the central interest of this paper.

Drawing on our ethnographic work on road building in Peru, we suggest that attention to the situated practices of construction engineers provides a powerful means of engaging with the contemporary manner in which governance is enacted through an engagement with the problem of security. Engineering is a discipline which is oriented towards navigating and responding to danger, and producing solutions to the immanent threat of harm. Whilst engineering is rarely acknowledged in social theory as a central player in forms of governmentality, we suggest that engineers play a key role in formulating and enacting the contemporary means through which the state comes to organize and manage populations. On the one
hand, the construction process itself is regulated by norms of engineering, by state law, and by the corporate enactment of a responsibility to protect against the potential harm that might be caused to the social and material environments by their material interventions. On the other, engineering consortia have to translate these normative devices through ongoing engagements with people and materials in order to produce a code of corporate care which works across all dimensions of their projects. In this respect the project of building a road is itself a process that aims to produce a systematic stabilization from an unstable social and material world. The code of corporate care in road construction is a regime of transformation and of protection that responds to the anticipation of harm of many different kinds.

Deleuze (1992) has extended Foucault's characterization of liberal governmentality into the contemporary era by suggesting that the disciplinary techniques of modern institutions are being superseded by coding capacities most obviously embedded in computing technologies. For Deleuze, codification is supplementing and replacing older forms of administration and enumeration. As it does so, it reveals new forms of subjectivity based less on the establishment of an opposition between the individual and the collective and more on the temporary appearance of unstable subjects out of iterative combinations of 'dividual' matter (Deleuze, 1992). Whilst Deleuze's thesis on societies of control offers a helpful extension to Foucault's reading of 19th-century forms of governmentality, our aim in this paper is less to evaluate the relative merit of these theories than to use them as a starting point for exploring ethnographically the interplay between different kinds of enumeration and classification that engineers employ in the anticipation and management of danger.
In this respect our approach is more akin to that of Douglas (1992), who sees the variety of responses to the anticipation of harm as the outcome of morally charged fields of social relations oriented towards the production of particular kinds of community. Rather than characterizing a contemporary approach to risk as entailing a particular kind of coding, we are interested in the way in which the anticipation of harm in engineering practice entails different and contradictory ways of managing uncertain relations.

Drawing on the concerns of the special issue regarding the ways in which regulatory practices might be understood as 'codes of conduct', we use our ethnographic material to consider the usefulness of a theory of codes for understanding the governmental practices of engineering. Introducing the terminology of machinic and emergent codes, we explore to what extent the governmental dimensions of engineering, often characterized as a modernist science par excellence, emerge out of a tension between different kinds of codification. In this respect we aim to contribute to understanding the ways in which practices of coding work to produce the conditions through which danger is made manifest in specific times and places.
Accident Prevention and the Machinic Code

A few days after the accident, we were invited to attend a health and safety briefing given by the road construction company to new workers at one of the engineering camps. When the room was full, the health and safety officer, Engineer Revaleta, came in and flicked off the lights. The space was lit by the glow of the PowerPoint presentation projected onto the screen at the front. Two minders patrolled the room, hushing the new recruits and clapping their hands over the heads of anyone who appeared to be losing attention or drifting off to sleep in the soporific quiet of the darkened classroom.

The talk was all about how to ensure the avoidance of accidents in the process of road construction. Halfway through the talk a picture flashed up on the screen of a man lying dead on the ground with blood coming out of his head. Somewhat shockingly, it turned out this was the man who had died in the accident earlier in the week. The purpose of the talk was to explain in great detail to new recruits how they could prevent this kind of thing happening to them or their fellow workers. According to Revaleta, accidents need never happen. The construction company operated with a goal of 'zero accidents', which in spite of the death earlier in the week was still being voiced as the key objective driving Revaleta in his safety mission. It was, indeed, Revaleta's job to make sure that people were signed up to this common goal. Not only was safety important, but for Revaleta and the company alike it was an absolute priority. As Revaleta himself put it, to become a member of the 'family' of the construction company, all workers were expected to have security as their principal objective and to see the following of rules and norms as a key responsibility.

Revaleta's description of the causes of an accident disaggregated it into a series of discrete parts. First of all there are dangers (peligros).
These are things that could potentially cause accidents. Second there are risks (riesgos). Risk is the latent state of an accident, which occurs when a person is exposed to a potential danger. Revaleta showed the following equation to explain this further: R = P + C, where R = Risk, P = Probability and C = Consequence. An accident (accidente) is any form of personal injury, material harm or interruption in the process of construction when a risk materializes into an event of this kind. An incident (incidente), on the other hand, is when risk materializes into any other kind of event which does not involve personal injury, material harm or the interruption of the construction process. He gave the example of dropping a tool from a height and just missing a fellow worker as the case of an incident.

With the seemingly unpredictable now disaggregated into these different components, Revaleta was then able to show that it was possible to produce an adequate response to each of these different parts, thus ensuring that harm could be mitigated. Dangers were dealt with by introducing a series of norms, such as bans on drinking, requirements to use safety equipment in particular spaces, procedures for clear communication between foremen and workers, and stipulations about hours of work which
would prevent exposure to potential dangers and thus reduce risk. The purpose of these norms was precisely to ensure that potential dangers would not be transformed, in the terms of his formula, into risk (cf. Beck, 1992). In the health and safety briefing the problem of harm appeared to be explicitly confronted through a process of codification.

Literatures on varieties of coding practices tend to stress the ways in which codification entails a process of formal abstraction. In the field of descriptive linguistics, for example, scholars have long recognized that the identification of core elements and their possible patternings is a descriptive practice oriented to the production of generic relations through which particular, but arbitrary, symbolic forms are organized into sequences that convey and generate meaning. Computer languages operate in the same way, as do the informational codes of DNA, or the equations of mathematical thought. Here the value of the code lies precisely in its reductive formalism. The purpose of this reductive formalism is to describe a system of relations with generative capacity. This move from description to generative action alerts us to the ways in which formal coding is oriented towards a process of mechanization, whereby actions occur autopoietically, without the need for ongoing management or persuasion. In the health and safety briefing, the disaggregation of an event into dangers, risks and accidents was intended to set in play a series of commonsensical responses in what we might term a 'machinic' manner.
Just as experiments in artificial life use machinic codings to run sequences in order to produce virtual context-free worlds in which pure principles of generativity and sociality can be observed in action (Hayles, 1999), health and safety procedures were expected to stimulate common sense into action, thus producing normative transformations in behaviour that would alter the conditions which had produced the need for a code in the first place. As a form of what we might call 'machinic' codification, the health and safety briefing revolved around the production of a description of a very specific kind, which was designed to eliminate extraneous information in order to capture the fundamentals of generative processes (in this case, a rational response to objective conditions) for productive effect (zero accidents).

The Role of the Assistant

Whilst the health and safety briefing laid out the logic of the machinic code, it was in the day-to-day practices of road construction that the process of enacting the code took place. Here we found that the discursive logic of the machinic code was materialized through engagement with all manner of human and non-human 'assistants' (Beckmann, 2004). Road signs and hard hats provided a major assistive role, as did a manual of health and safety rules, sections of which were read out to work teams on a daily basis. Driving too fast was dealt with by speed limits or traffic humps; people were protected against falling objects by their hard hats and steel toe-capped boots; unpredictable vehicle movements on the road were to be
avoided by proper training in adequate signalling and appropriate norms of manoeuvre. The improper use of unpaved bits of mountainside as shortcuts was prevented by the presence of Revaleta himself. He was constantly on the move up and down the stretch of road that he was responsible for, looking out for bad behaviour, warning his fellow workers to beware of his eagle eye; 'remember, even the rocks are watching you!' he warned.

Beckmann (2004) uses the idea of the 'assistant' to describe those entities that enable the anticipation of harm to be displaced in order for action to occur. In his work, he uses the notion of the assistant specifically to describe car safety mechanisms as a way of rethinking the relationship between the driver, the car, the safety mechanisms, the webs of expertise, and the affective relations that produce the relationship of trust between the car driver and the car. Our interest in these 'assistants' is in the ways in which they are deployed to manage the affective dimensions of regulatory procedures. Building on the work of those like Latour (1992), who have stressed a need to pay greater attention to the place of objects in our descriptions of social worlds, we suggest that the capacities attributed to the assistant – such as trustworthiness, reliability and utility – are traits which rely not on the intrinsic properties of the person or object through which it is hoped that harm will be mitigated, but on the practices of codification that allow entities to appear as assistants in the first place. Rules such as speed limits, clothing regulations and directives as to appropriate behaviour establish material arrangements like road signs, hard hats, fluorescent vests and the health and safety manual as objects which have the capacity to provide protection against a particular kind of uncertain future.
Whilst on the one hand the assistants work to reproduce the machinic effect of the code of corporate care, on the other the set of relations through which these assistants come to generate their normative effects introduces a domain of ambiguity into the smooth operation of the machinic code. The different practices which health and safety officers are engaged in involve a constant negotiation over who or what can legitimately function as such an 'assistant' in any particular situation. When assistants are taken out of the situations through which they have gained their assistant status and expected to operate as assistants in other settings, or, alternatively, when the conditions within which the assistants originally received their assistant status are removed, the consequences are frustrating and often dire. Unsurprisingly, as we will see in the following section, it is often the assistants themselves who bear the brunt of the frustration, as people direct anger, humour or irreverence at the incapacity of these objects or persons to secure against uncertainties of different kinds, including those for which they were never intended.

Whilst the practice of establishing and operationalizing a machinic code produces the various assistants which we describe above, the process of road construction is, as we have already mentioned, characterized by a multiplicity of dangers and thus by a proliferating array of other kinds of
assistants whose presence in the social relations of road construction works to disrupt the smooth operation of the machinic code. For example, protection against harm in relation to road building in the Andes involves assistants as diverse as a hard hat, a manager, a spirit, a kin relation and a coca leaf. By using the terminology of assistance, we are alerted to the diverse specific things and persons that are drawn into practices of safe living, in a way that echoes Isabelle Stengers' call for an attention to practice as a way of refraining 'from using general judgemental criteria to legitimate their elimination, and to refrain from dreaming about a clean world with no cause to wonder and alarm' (Stengers, 2007: 15). Here, Stengers is arguing against approaches which 'eliminate' some practices or things as irrelevant because they appear to be concerned with different matters to those which are 'normally' or 'habitually' posed (or coded) as relevant.

We use the notion of the assistant to go beyond the realm of health and safety and the strictures it imposes through the machinic code in order to draw attention to other ways in which people deal with the dangers produced by construction projects like road building. By following the way in which entities of apparently different orders become folded into the notion of the assistant, our attention is drawn to the ways in which the machinic code operates alongside other ways of coding. In the following section we turn our attention to the extended effects of the machinic code as it becomes operationalized in practice. We then move on to discuss the appearance of all manner of other assistants which fall outside the remit of the machinic code but which nonetheless play an important role in dealing with the anticipation of harm. It is here we find a range of practices which unsettle the theoretical leverage of the machinic code as a metaphor for contemporary forms of governance.
Responsibility, Control and the Machinic Code

The main circumstance within which danger reappeared in the logic of the machinic code was the circumvention of the rules which Revaleta set out, through the avoidance or ignoring of the assistants which had been put in place to protect against possible dangers. The reasons for the flouting of these rules were also formally systematized in the induction talk. They included lack of knowledge, inadequate psychology, problematic attitudes, inadequate leadership, and a deficiency in the training of engineers. Ultimately, the potential of infinite regress in the codification of life was stalled by the assertion that, for these codes of protection to work, individual workers had to take personal responsibility both for their own safety and for the safety of others. Asked to participate as the kind of archetypal modern subjects which Foucault shows to be produced by forms of governmentality based on the management of security, the workers found themselves not simply subject to protection but also constituted in the same moment as autonomous actors with direct responsibility for their own actions.
Within the system of codification that we see outlined here, the constant identification of a direct and code-able relationship between cause and effect meant that in order to become responsible, workers were required to submit themselves to the systemic or machinic coding of the company for which they were working. Each worker was given a uniform and a hard hat with a colour that corresponded to their particular role and the environments through which they were allowed to pass. The hard hat itself took on a symbolic importance as an indicator of incorporation into, or exclusion from, the construction process: those allowed inside the construction camp were issued with hard hats, whilst hopeful potential workers who hung around outside the gates were conspicuous for their lack of uniform, safety vests, or hat. Once allowed into the company and issued with the requisite safety equipment, being a responsible worker became a matter of being a worker who also acknowledged the causal relations that the machinic coding sets up.

As we suggested earlier, what distinguishes the machinic code is not only its capacity to describe but also its potential to produce predictable generative effects. To refuse to participate in the machinic code was literally to put a spanner in the works, that is, to stop the generative effect of the system. This had dire and serious effects both for the system and for people's personal employment prospects. Aware of the risk to the functioning of the machinic code, Revaleta stressed throughout the induction talk how infringement of the rules would lead to immediate and absolute dismissal.
Revaleta was widely feared within the company precisely because of his reputation for carrying out this threat, sending people home without pay for as minor an offence as having forgotten to eat breakfast – a failure, in terms of the code, to understand the need for proper sustenance to ensure vigilance, strength and a capacity to react appropriately in the course of their work.

In spite of these attempts at codification – the production of a code that would have generative effects once put in place – we found that in practice the unruly constantly reared its head to threaten the capacity of the machinic code to achieve its generative potential, that is, to prevent harm through the clarification of generic causal relations. Implicit in Revaleta's evangelical selling of the health and safety code, and in spite of the list of means of ensuring it was followed, was the recognition that ultimately these codes were going to be circumvented. Revaleta knew that his drivers would take short-cuts down the mountainside, cutting off the curves in the tortuous bends by veering down crumbling gullies. He expected that people would turn up to work drunk, that they would ignore the rules about wearing their hard hat in their vehicles, break the speed limits that he had imposed in the name of health and safety, steal the road markers with their reflective strips, and forget to wear their face masks or fluorescent vests. Whilst this was a frustration for Revaleta, it was perhaps more interestingly a cause for the production of all kinds of other uncertainties and
responses to those uncertainties, as we will explore in the next section. As the assistants that are put in place as a key feature of the machinic code are displaced by choice or by circumstance, we see not only the limits of the code but also the reappearance of an unpredictable and unsystematized relation to the anticipation of harm. When the engineering camp, normally a space of danger, became transformed into a space of celebration – the site of a party celebrating Peruvian independence day – the safety rules which people were used to being told to abide by momentarily disappeared. Workers were told that they could wear civilian clothing that day, drink beer, and leave their hard hats at home. Without the coding to hold them in place, the assistants mobilized to militate against the anticipation of harm suddenly lost their power to prevent accidents according to the strictures of the coding which produced them, producing confusion and bafflement. It is here that we can begin to attend to the hidden dimensions of the apparently 'transparent' code, as we are forced to recognize that the relationship of assistant to code is highly ambiguous – evidenced in this case by the fact that many still turned up at the party in their full safety attire.

The seemingly arbitrary way in which rules can be applied and removed, and the ways in which this creates spaces in which people make mistakes about how to relate to assistants at different moments in time, often makes it even more difficult to enforce these rules when people decide for themselves that the circumstances they are in do not warrant the use of assistants, or at least not in the ways intended. Signallers on the road remove rubber face masks intended to protect them from the dust because the masks are hot and sticky, and these same people have spent all their lives travelling along dusty roads without face masks and seem to have survived.
However, even when they are removed from their causal link with the machinic code, assistants like these face masks do not necessarily lose the residue of their effectiveness in relation to an unpredictable future. For example, it may not be their capacity to protect against dust, but rather the capacity they hold when worn to protect against dismissal or accusations of a lack of complicity, that allows them to operate as assistants in the mitigation of harm. Here we are suggesting that the ambiguity of the relationship between the machinic code and its assistants produces the space in which people then take responsibility for themselves in ways that exceed the terms of the code which, in signing their contracts with the company, they have agreed to abide by.

Whilst the practice of instituting a machinic code is ostensibly all about making causal relations explicit, the social practice of codification harbours a certain ambiguity. The attempt to make explicit the orderings which produce generative relations requires the coder to move away from the empirically observable or the intuitive enactment and to render transparent the motivations, structures and forces which produce particular kinds of actions. Through this process of explication,4 the resulting codes themselves become socially restricted to those who are versed in the techniques
of observation, detection and inscription that make explication possible. Machinic codes thus often become associated with expert practice. Ironically, however, the very techniques through which codes explicate relations and render them transparent simultaneously produce descriptions whose formalization renders them inaccessible to others. The code becomes detached from the business of everyday living and is rendered opaque, or even secret. The explanatory force of the 'secret code' lies not in its internal logic but in its promise to unlock meaning and explicate causality by proving relevance at a different scale from that at which it was produced.

This scale shifting alerts us to the leap required to bridge the gap between the processes of abstraction from which the machinic code is derived and the entangled web of shifting relations which the code is subsequently expected to describe and control. The process requires a reversal whereby the abstraction is produced as prior to that from which it was derived. The pursuit of DNA as the ultimate information system driving human beings engages this doubled explication, as public interest drives subsequent investment to decipher it in a way that seems to gloss over the gaping chasm between the explication of a genetic relation and the explication of human social behaviour. Likewise, the production of a machinic code in the area of health and safety aims to produce relevance by altering the specificities of behaviour according to a universal logic of rational action. The machinic codes that the company devised and attempted to impose on its own workforce exemplify the ways in which the management of risk involves both the direct control of workers' bodies and sensibilities and the privatization of risk, as workers become individually responsible for safety on the construction site.
Yet the management understand that the code is not, in practice, generative of the norms of individual and collective responsibility upon which its efficacy relies. The insistence that risk can be managed through the imposition of regulations and codes of safe practice, while self-evidently foundational to company procedures, finds itself in constant confrontation with other notions of environmental and social hazard which it cannot contain. It is to the ways in which these other forms of anticipated harm make an appearance, and to the assistants mobilized in dealing with them, that we now turn.

The Deflection of Harm in the Emergent Code

In this section we introduce into our analysis a discussion of the ways in which the uncertainty surrounding incidences of harm might be coded in ways that exceed their description as either located inside the machinic code, and thus systematic and logical, or outside it, and therefore chaotic, uncoded and incomprehensible. We introduce an alternative in the form of what we call the ‘emergent code’. By opening up the notion of coding beyond a process of formal abstraction, we attempt to understand in greater detail the complex ways in which lines of authority and control are established and negotiated around questions of security.

Knox and Harvey – Anticipating Harm 153

Awareness of the scale shifting that codification practices entail with respect to human behaviour has led the social sciences to articulate a rather different notion of coding from that outlined above, captured in the less formal notions of habit or norm. These codes of conduct are produced iteratively, with a view to disambiguation rather than causal determination. Case law, which produces precedents out of which legal doctrines are derived, is a good example of these kinds of codes, which orient action but do not determine generative relations in the same way as machinic code. Indeed, such codes are brought into being through the processes of their enactment, and it is in this sense that we use the term ‘emergent code’. The emergent code still has a powerful and restrictive hold on the shaping of the relational possibilities available to those who come under its rubric. Far from being free to change at will, these emergent codes, like machinic codes, also appear to entail a process of encoding, but this process operates with different techniques and produces different kinds of expertise and alternative dynamics as far as transparency and secrecy are concerned. In particular we are interested in how different forms of anticipatory coding co-exist and work on each other, looking more explicitly at how emergent codes embrace the relations and orientations which machinic codes, by contrast, seek to transcend. When people heard that a worker had died, many spoke of the inevitability of the accident. This was, they said, the first in what was bound to be a series of fatalities. The accident happened in a landscape imbued with the presence of powerful spiritual forces, and the death was seen by many as retribution for the harm being done to the land as it was gouged out, displaced and removed in the process of road construction.
There was little surprise that the man was run over – no doubt, some said, the earth grabbed hold of his legs and held him fixed in place as the machine rolled over him and crushed him. There was a recognition that these kinds of occurrences happen all the time. Since the accident there had been reports of other events. There were rumours that some of the dead man’s co-workers had begun to be haunted by his spirit, whilst others had fallen sick inexplicably. Another person told how a bulldozer had mysteriously crossed in front of one of the other night drivers, almost causing another accident. We were told that this was just the first in what was to be a long run of deaths resulting from the construction project. What, then, were the means of dealing with these kinds of occurrences? How did people approach the inevitability and yet unpredictability of harm that would occur through this project, which was meant to be producing a more secure future for them? One means of mitigating the potential anger of the earth forces that were considered to be at play in this incident was to ensure that proper ritual payments were made to these spiritual forces before the earth was tampered with. The people who described the accident as the outcome of unpredictable forces of retribution were used to making such payments at the beginning of the agricultural cycle in the hope of an abundant harvest, at the start of any construction project or business venture, even when setting out on a journey (Harvey, 2001), and there was a recognition that a similar ritual would be appropriate in relation to the process of road building. Aware of this ritual practice in the region the road was to pass through, the community relations team at the engineering company had in fact performed a ceremony before the construction began, making ritual offerings to the earth in the form of the Andean ‘despacho’. In the Andes people make such offerings to mountain and earth forces in an attempt to draw these beings into relationships of exchange for mutual benefit. Huge care is put into the details of such offerings, which have to be performed accurately in order to avoid antagonizing the powers which animate the Andean landscape. These powers have the capacity to ruin harvests, kill livestock, and send illness and death. They also have a capacity for benevolence and can bring good fortune to a lucky few. Forging a relationship with earth and mountain forces in this way is, however, explicitly without guarantee: humans can appeal, but they do not control.5 In this respect, ritual payments involve a rather different sense of anticipation from that mobilized by the machinic code. As Amoore and de Goede (2008) and Anderson (2010) have shown, contemporary forms of governance involve a range of different responses to the anticipation of an uncertain future, variously described as practices of pre-emption, prediction and preparedness (see also Massumi, 2005). Revaleta’s health and safety code can be understood in these terms. It is a form of preventing harm which incorporates, in particular, aspects of pre-emption (accidents will happen and we must act to stop them before they do) and preparedness (health insurance for all workers and indemnification insurance on the part of the company).
In contrast, the Andean despacho does not fit easily within these characterizations of contemporary governance, and as an anticipatory practice it might better be characterized through the idea of ‘petitioning’ for a safer future. Given the different basis of anticipation in the health and safety procedures as opposed to the ritual payment to the earth, the performance of an Andean despacho by the engineering company was highly ambiguous. The authenticity of the practice was dismissed by some observers as impossibly naïve or deeply cynical, carried out as it was by a firm whose managers had no personal investment in the land and no belief in the importance of maintaining good relations with the land as a powerful and sentient life force. The managers did, however, have an interest in calming the workforce. Thus we also found that the engineering consortium decided to hold a Catholic mass for the worker who had been killed. The death had led to a proliferation of hauntings and ghostly threats, and workers were refusing to come to the site where the man had died. Anxious to put an end to the contagious effects of the first death, the company recognized that there was a need to engage with the apprehensions of their workers, even if their primary motivation was to calm the workforce rather than the earth itself. Whilst, in the mode of the machinic code, the company had to find ways of exercising a duty of care whose generative effects were transparent and reproducible, here they were drawn into engagements which required a sensitivity and attention to the emergent code. The duty of care was produced here not as a systematizing logic but rather as a relational capacity. This nonetheless operated within powerful, if not fixed, parameters of its own. The company policy was certainly to be respectful of local beliefs but, as in so many other post-colonial and development ‘encounters’, such culturalist accounts of difference, which suggest that different people simply ‘believe’ different things, mask other, potentially more volatile, differentiating practices, such as the capacity to attribute value, claim ownership, and express concern in ways that are recognized as valid (Cruikshank, 2005; Povinelli, 1995). One of the striking things when listening to Revaleta’s description of the machinic qualities of the health and safety code as a means of protecting against harm was just how different it seemed from the codes which worked to organize the anticipation of harm in manual construction projects that lay outside the remit of the company. Most of the workers in the induction class came from the local area, owing to a legal stipulation imposed on the engineering company that it must, whenever possible, employ labour from the locality through which the road was passing. As such, these workers were accustomed to the Andean mode of collaborative labour through which construction projects in the communities from which they hailed were usually organized.
Just that week we had been in a community near the engineering camp, watching a group of people take down and re-assemble two adobe houses that were in the way of a small track that they wanted to straighten and widen. The community work party was organized precisely around a kind of intense sociality that Revaleta had explicitly outlawed as amounting almost to conspiracy to murder, under the rubric of a machinic code in which personal responsibility was the lubricant that made the production of a safe future realizable. Revaleta had outlawed drinking, joking, smoking, and even laughing as potential distractions which could transform danger into risk and ultimately accident; yet these were utterly obligatory activities in Andean work parties like those that we observed. In these parties, the source of potential harm lay not in the equation that Revaleta outlined between danger, risk and accident, but rather in the danger of inhabiting a world without sufficient or proper relations. The code which militated against this occurrence was an emergent code, a code through which relations must be built and rebuilt according to circumstances which thrust themselves forward with meaning and force as misfortunes of different kinds. For the work party, ensuring safety meant ensuring adequate sociality. This required activities like drinking and smoking, activities which generated the energy and comradeship needed to prevent the collaboration from falling apart. Far from treating control as the primary means of mitigating harm, the emergent code which these work parties produced was one oriented towards the production of energy as a way of engaging an uncertain future. In some contexts, passing out from excess alcohol is taken as a sign of trust and of a job properly celebrated (Harvey, 1994). Here, the practices oriented to ensuring the ongoing efficacy of the emergent code required their own assistants – alcohol, coca leaves, ritual payments to telluric forces, ties of compadrazgo (godparenthood) and of kinship, and the acknowledgement of particular skills and capacities that marked particular men as ‘specialists’ (maestros) who would direct proceedings through direct involvement, taking the lead, performing the most difficult tasks, and showing others by example. Emergent codes are thus distinguished from machinic codes not by the existence of hierarchy or by the deployment of assistants, but rather by the assumption that all orderings are experiments or possibilities concerned with social relations that are not, in the final analysis, determinable by reason.

Emergence in the Machinic Code

For many people, the risks and dangers produced by large-scale road construction projects are multiple. However strongly invested people are in the roads, the process is fraught with anxiety and preoccupation. These anxieties point to a fundamental uncertainty as to whether or not the road will become a public good. This raises the question of exactly who it is ‘good’ for: local towns and villages, regional or national economies, foreign investors, or those with the wit or the luck to take advantage of opportunities? Questions of how benefit will accrue, and to whom, swirl around these projects, as rumours of corruption, hidden deals and personal gain cling to the infrastructures of future possibility – their materials, their jobs, their promise of transformation. Construction companies do little to engage these anxieties.
Drawing on the machinic codes of national and international statute, they limit their responsibility to the legal obligation to ensure that the construction process itself does not endanger life or engender negative environmental impacts that could have been avoided. Indeed, as far as they possibly can, they seek to produce explicit ‘procedures’ to frame all their activities as machinic codes, precisely as a mode of self-protection from the vortex of potential disappointment. Responsibility for the generation of benefit rests elsewhere, not least with the people who are constantly reminded that they are being provided with this road and that it is up to them to make something of it. In this respect machinic codes appear as devices that offer protection in and of themselves. Machinic codes appear here as assistants that enable the construction companies to detach the technical from the social, and to proceed on that basis. Within this framing, the degree to which the companies attend to social issues becomes an added benefit, a gesture of goodwill, an agreement that enhances a bid and adds value to the core expertise of a company. And yet, as we suggested above, machinic codes are notoriously unstable with respect to projects of social ordering, precisely because those relations and affective forces that are carefully put to one side do not go away. We were fascinated to observe how those responsible for health and safety in the construction process tackled this dilemma of the destabilizing gap between machinic and emergent codes. Their response was to embed the machinic codes in fields of passionate relations. Of all the many people we spent time with on the road construction project, none were as obsessive or as overtly passionate about their work as the health and safety officers. This attitude was one which they openly talked about and took pride in. They loved their work and they were fascinated by their potential to draw others into practices of safe living. Education was central, but education worked through allure, passion and affect. We saw in the previous example of the health and safety induction how the engineer in charge worked on his audience with a deep sense of theatricality – evoking the insecurities that seem to require machinic codes for safe living. Others worked via different relational possibilities. We were particularly captivated by a health and safety engineer who taught inductees by drawing his students into dramatic scenarios where they had to learn to help each other to survive. He was fascinated by his job and wanted to teach by contagious enthusiasm. The challenge was to conjure scenarios that his students could invest in emotionally, so that they would be motivated to learn the skills they needed to be able to work safely. The skills were not machinic codes but principles of interdependence and a capacity to improvise. For example, he taught them how to make a stretcher from poles and a shirt, and stressed the importance of being able to respond to accidents in remote sites. We were particularly impressed by his passion for road signage. He wanted his road to have the best signage of any in Peru.
Such signs are, of necessity, standard forms – themselves codes designed to disambiguate potential hazards such as bends and steep inclines, or to offer clear instructions as to speed limits. Nevertheless, his versions of these signs were hand-drawn, lovingly painted, and carefully placed. He was particularly happy about the reflectors that he had fixed to bridges and curves to alert drivers to dangerous drops. These small devices had succeeded in capturing local interest to the extent that people were said to be taking special trips to see ‘the lights’, sometimes travelling up to 13 km for the pleasure. Their popularity was such that they were also being stolen, which, while disappointing in some respects, was also clear evidence that they ‘worked’: they were noticed, and they were mobilizing people. In their enactment through material relations of assistance, all the machinic codes we have referred to in this paper also participate in the affective fields of the emergent codes that they are designed to side-step. For in the social world all abstractions are unstable, as they continually re-enter material dynamics of exchange and circulation that render them unpredictable. In these circumstances the assistants in play multiply. At the same time, and by extension, the assistants themselves provoke a flouting of the machinic code. Thus, while Revaleta positions himself as an assistant to the project of health and safety by extending his panoptic gaze in order to engender a principle of self-discipline in his workers, he also invites a response. Driving down ‘illegal’ short-cuts (cortes), the drivers of the consortium’s 4x4s are sceptical of Revaleta’s capacity to see them, and are simultaneously able to transfer their trust in Revaleta’s rules and regulations into a trust in the capacities of the vehicle itself, with its built-in health and safety measures. At the same time, there is a frisson of excitement about breaking Revaleta’s rules, and possibly even wonder at the ability of the vehicle to take them to the limits of acceptable practice. Bored, away from home for long periods of time, with little by way of habitual pastimes beyond the television and a football pitch, the workers find in the construction site and its surroundings environments made for practical jokes, seductions and dangerous games of hide-and-seek with Revaleta and those who became his eyes and ears. These brief examples are intended to demonstrate the importance of looking at the specificity of the relationships in which machinic codes come into being, the risks they make visible, and the limitations or slippages that are produced when these codes are imposed, however strictly they are applied. Such codes can only become relevant to the extent that they engage with existing modes of safe living and the established expectations of others. Their abstract causal logics are destabilized by the lived experience and self-evidence of other options.
While all the security personnel we came across in the company had a sense that they knew better, and saw local people as in need of education, the means by which they enacted and produced the machinic code necessarily required an openness to other possibilities or codes of relationality, which we have demarcated here as emergent. It is interesting, for example, that the skilled urban workforce were more likely to follow health and safety regulations with regard to eating in ‘approved’ cafes than to obey the rules on speed or on having fun in the workplace. By paying attention to practices of codification of different kinds, we find that experts do not rule through the authoritarian imposition of the health and safety code. Rather, we find that coding practices are imbued with affect such that they operate to ensure that workers engage specific principles and mechanisms of safe living. The manager or expert can remain utterly committed to ensuring safe living in the fields of relations for which they have responsibility and at the same time deflect responsibility for anything that happens onto individual workers – instantiating the specific ways in which notions of security and control are combined. Our approach has thus shown the ways in which the power of governmentality through security emerges as a precarious and relational effect. Much as Ahmed (2010) has argued in relation to ‘happiness’, ‘security’ does not reside in objects but emerges relationally, and circulates in what are often subtle subversions of any particular purpose or orientation. The intentions that generate specific regulations and bring particular assistants into play are not necessarily the means by which a sense of security is engendered. Revaleta knew as well as anybody that security was not ensured merely by the existence of regulation or by the presence of the assistants deployed by him and his colleagues. The thrill, or even the mundane instrumentality, of alternative practices imbues the landscape of security and control every bit as much as the following of machinic codes.

Conclusion

In this article we have explored the ways in which people confront the anticipation of harm in relation to road construction, in terms of a tension between different kinds of coding practices. Building on the notion of coding as a social practice which entails the participation of assistants, we have investigated the appearance of two different tendencies which, we suggest, emerge as responses to conditions of uncertainty. The first is the more familiar version of coding, involving practices oriented to the production of what we call ‘machinic’ codes, which attempt to render the world predictable and which set up the conditions through which causality can be specified.6 The second involves practices which conventionally fall outside the language of coding but which, we suggest, offer an important additional dimension to understanding the ways in which the practice of coding is more complex and less determined than the machinic code allows for. These practices encode a relational response to potential future occurrences, producing what we call ‘emergent’ codes oriented to a future that is inherently unpredictable in the detail of its actualization. In some respects the differentiation between machinic and emergent codes echoes the distinctions that Massumi identifies between codification and coding, regulation and regularity, rule and habit (Massumi, 2002: 81–88). However, our focus in this article has been on how attention to coding as a socially situated and morally charged social practice reveals the many ways in which machinic and emergent codes are never fully differentiated.
We have been particularly interested in the role that the emergent code plays in enabling machinic codes to assume the semblance of autonomy on which their social force depends. By focusing on the way in which assistants are an integral part of coding practices, we have drawn attention to the centrality of affect, emotion and desire in the dynamics of coding. Attention to the divergent ways in which machinic and emergent codes respond to the uncertainties of an unfolding future holds the potential to reveal how different forms of anticipatory coding effectively entrench and destabilize lines of social and cultural difference. Approaching the material interventions of engineers as forms of governance, we have suggested that engineers participate in the reproduction of contemporary forms of governmentality not through the imposition or enactment of a singular logic but through the negotiation of a tension between abstractions which code the world according to machinic principles and normative social codes which are concerned with the emergence of social relations. We suggest that machinic and emergent codes are not just interdependent, but that the relationship between the two is what sustains the ways in which the desire for greater security simultaneously produces anxieties about unwelcome control. We have used ethnographic material to explore the social practices of coding which we found people mobilizing as ways of dealing with the anticipation of harm under conditions of transformation. We do not suggest that the codes we have identified are the only ones that matter; there are many kinds of habitual practice that could be drawn on to exemplify the distinctions we are discussing here. Indeed, it is arguable that what we are calling emergent codes are not really codes at all. Alternatively, we could see them as relational potentialities, or forms of association which are produced as the affective culmination of history, biography and interactions with material environments. Nonetheless, our primary aim has been to question the hegemony and exclusivity of the machinic code as the only form of coding by introducing the notion of emergent codes into the discussion. Our purpose in making this analytical distinction has been to help us identify with more precision how these different means of containing and relating to people and things orient practice in environments which seem to become increasingly uncertain in the face of state-led infrastructural development projects. We have also been concerned to find new ways of analysing the contemporary politics of difference as articulated and experienced under conditions of development and state intervention. In this respect we hope to have shown that an extension of our characterization of codes, through attention to their relational imbrications, provides a fruitful analytical approach.
It is clear that coding practices are far from neutral interventions in social situations and themselves work as ‘assistants’ in the configuration and mitigation of anticipated harm. The variety of codes which course through a multiplicity of relational practices produces actions of different kinds. Some of these could be seen as simply cynical responses to cultural incommensurability, such as the enactment of the offerings to the land by the engineering company. Instead, however, we have suggested that by attuning ourselves to the co-presence of the machinic and the emergent code, we have been able to describe these events not as a clash between closed worlds that imitate or impinge on one another but rather as contingent and constraining codings and patternings which can operate as explanations for dynamics of contradiction and confusion as well as of conduct and control. In this respect we have extended the work of Douglas (1992), who was likewise interested in how to produce an account of risk and blame which bridged the apparent gulf between anthropological accounts of divination, witchcraft and sorcery as responses to uncertainty, on the one hand, and the scientific enumeration of risk, on the other. In invoking the language of machinic and emergent codes to confront the practices out of which these separations occur, we suggest we have come a little closer to evoking the politics through which ‘cultural’ difference can be understood to be constituted. Here we have focused on how difference is made through engagements with danger as produced by projects of state control. In this sense we have tried to move away from the evocation of a contemporary neo-liberal system producing overarching dangers of one particular kind, with the implication that alternative systems produce alternative dangers that are radically disconnected from the neo-liberal project. Instead we have shifted our attention to the co-presence of diverse practices as they emerge in relation to shifting matters of concern that are provoked by contemporary forms of environmental intervention and development. This has begun to allow us to incorporate various dimensions of the problem of safe living – from Revaleta’s calls for personal responsibility to the ritual payments to the earth, from the evocation of wonder in road signs to the trust placed in mechanical vehicles – into a single description, without collapsing the differences that continue to provoke and engender both suspicion and wonder about the dynamics of change and the possibilities of the future brought about by dangerous but desired technologies like roads.

Acknowledgements

We would like to thank Theo Vurdubakis, Adrian Mackenzie and the anonymous reviewers for their helpful comments on earlier drafts of this paper. The research upon which this article is based was generously funded by the Economic and Social Research Council (Grant No. RES-000-22-1418).
Notes

1. See for example Merriman (2004) and Moran (2009) for a description of some of the risks and unforeseen effects of road construction projects in the UK, and Schivelbusch (1986), who writes about the fears associated with the introduction of the railroads in the 19th century.
2. In this respect, roads might be seen as an instance of what Beck (1992) terms the ‘risk society’, in that they produce both the material and epistemological conditions in which society comes to be organized around the problem of risk.
3. See for example Mann’s (1993) discussion of infrastructural power, and some of its more specific consequences for Latin American states in Salvatore (2006).
4. We take the term from Peter Sloterdijk, who writes about the acceleration in ‘explication’ over the 20th century, referring to ‘the revealing-inclusion of the background givens underlying manifest operations’ (Sloterdijk, 2009: 9).
5. See Allen (1988); Gose (1986, 1994); Harris (1982); Sallnow (1987). Such orientation to powerful forces includes the ways in which people imagine and engage the state (Poole, 2004).
6. There are interesting overlaps between the notion of ‘machinic codes’ and the ‘engine sciences’ described by Carroll as the ‘cultural integration – in practice – of natural philosophy, mathematics, and engineering in the second half of the seventeenth century’ (Carroll, 2006: 7). Carroll’s interest in the processes involved in rendering diverse practices and material entities amenable to both statecraft and experimental science is clearly still at work in the codifications that fascinate 21st-century science and statecraft.
References
Ahmed, S. (2010) The Promise of Happiness. Durham: Duke University Press.
Allen, C. (1988) The Hold Life Has: Coca and Cultural Identity in an Andean Community. Washington, DC: Smithsonian Institution Press.
Amoore, L. and M. de Goede (2008) 'Transactions after 9/11: The Banal Face of the Pre-emptive Strike', Transactions of the Institute of British Geographers 33: 173–185.
Anderson, B. (2010) 'Pre-emption, Precaution, Preparedness: Anticipatory Action and Future Geographies', Progress in Human Geography 34: 777–798.
Beck, U. (1992) Risk Society: Towards a New Modernity. London: SAGE.
Beckmann, J. (2004) 'Mobility and Safety', Theory, Culture & Society 21(4–5): 81–100.
Carroll, P. (2006) Science, Culture, and Modern State Formation. Berkeley: University of California Press.
Cruikshank, J. (2005) Do Glaciers Listen? Local Knowledge, Colonial Encounters, and Social Imagination. Seattle: University of Washington Press.
Deleuze, G. (1992) 'Postscript on the Societies of Control', October 59: 3–7.
Douglas, M. (1992) Risk and Blame: Essays in Cultural Theory. London: Routledge.
Foucault, M. (2008) The Birth of Biopolitics: Lectures at the Collège de France, 1978–1979. Basingstoke: Palgrave Macmillan.
Gose, P. (1986) 'Sacrifice and the Commodity Form in the Andes', Man 21(2): 296–310.
Gose, P. (1994) Deathly Waters and Hungry Mountains: Agrarian Ritual and Class Formation in an Andean Town. Toronto: University of Toronto Press.
Gourevitch, A. (2010) 'Environmentalism: Long Live the Politics of Fear', Public Culture 22(3): 411–424.
Harris, O. (1982) 'The Dead and the Devils among the Bolivian Laymi', pp. 45–73 in M. Bloch and J.P. Parry (eds) Death and the Regeneration of Life. Cambridge: Cambridge University Press.
Harvey, P. (1994) 'Gender, Community and Confrontation: Power Relations and Drunkenness in Ocongate', in M. McDonald (ed.) Gender, Drink and Drugs. Oxford: Berg.
Harvey, P. (2001) 'Landscape and Commerce: Creating Contexts for the Exercise of Power', pp. 197–210 in B. Bender and M. Winer (eds) Contested Landscapes: Movement, Exile and Place. Oxford: Berg.
Hayles, N.K. (1999) How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: University of Chicago Press.
Joyce, P. (2003) The Rule of Freedom: Liberalism and the Modern City. New York: Verso.
Latour, B. (1992) 'Where Are the Missing Masses? The Sociology of a Few Mundane Artefacts', in W. Bijker and J. Law (eds) Shaping Technology/Building Society. Cambridge, MA: MIT Press.
Knox and Harvey – Anticipating Harm
Mann, M. (1993) The Sources of Social Power: The Rise of Classes and Nation-States 1760–1914. Cambridge: Cambridge University Press.
Massumi, B. (2002) Parables for the Virtual: Movement, Affect, Sensation. Durham: Duke University Press.
Massumi, B. (2005) 'The Future Birth of the Affective Fact', in Conference Proceedings: Genealogies of Biopolitics. Available at: http://browse.reticular.info/text/collected/massumi.pdf (consulted July 2011).
Merriman, P. (2004) 'Driving Places: Marc Augé, Non-places, and the Geographies of England's M1 Motorway', Theory, Culture & Society 21(4–5): 145–167.
Moran, J. (2009) On Roads: A Hidden History. London: Profile Books.
Poole, D. (2004) 'Between Threat and Guarantee: Justice and Community on the Margins of the Peruvian State', pp. 35–66 in V. Das and D. Poole (eds) Anthropology in the Margins of the State. Santa Fe: School of American Research Press.
Povinelli, E. (1995) 'Do Rocks Listen? The Cultural Politics of Apprehending Australian Aboriginal Law', American Anthropologist 97(3): 505–518.
Sallnow, M.J. (1987) Pilgrims of the Andes: Regional Cults in Cusco. Washington, DC: Smithsonian Institution Press.
Salvatore, R.D. (2006) 'Imperial Mechanics: South America's Hemispheric Integration in the Machine Age', American Quarterly 58: 662–692.
Schivelbusch, W. (1986) The Railway Journey: The Industrialization of Time and Space in the Nineteenth Century. Berkeley: University of California Press.
Sloterdijk, P. (2009) Terror from the Air. Los Angeles: Semiotext(e).
Stengers, I. (2007) 'Diderot's Egg: Divorcing Materialism from Eliminativism', Radical Philosophy 144 (July/August).
Hannah Knox is a Research Fellow in the ESRC Centre for Research on Socio-Cultural Change (CRESC) at the University of Manchester. Her research, conducted in the UK and Peru, explores ethnographically the sociality and materiality of contemporary knowledge practices. She is currently doing research in the UK on the cultural politics of climate change. [email: [email protected]]

Penny Harvey is Professor of Social Anthropology at the University of Manchester and co-Director of CRESC. She works ethnographically in Peru, in the UK and in Spain and has published widely on engineering practice, state formation, information technologies and the politics of communication. She is currently carrying out a collaborative ethnography of the Peruvian regional state focusing on technical expertise, regulatory practice and decentralization. [email: [email protected]]
A New Algorithmic Identity Soft Biopolitics and the Modulation of Control
John Cheney-Lippold
Abstract
Marketing and web analytic companies have implemented sophisticated algorithms to observe, analyze, and identify users through large surveillance networks online. These computer algorithms have the capacity to infer categories of identity upon users based largely on their web-surfing habits. In this article I will first discuss the conceptual and theoretical work around code, outlining its use in an analysis of online categorization practices. The article will then approach the function of code at the level of the category, arguing that an analysis of coded computer algorithms enables a supplement to Foucauldian thinking around biopolitics and biopower, of what I call soft biopower and soft biopolitics. These new conceptual devices allow us to better understand the workings of biopower at the level of the category, of using computer code, statistics and surveillance to construct categories within populations according to users' surveilled internet history. Finally, the article will think through the nuanced ways that algorithmic inference works as a mode of control, of processes of identification that structure and regulate our lives online within the context of online marketing and algorithmic categorization.

Key words: algorithm, biopolitics, biopower, code, control, Deleuze, Foucault, internet
Theory, Culture & Society 2011 (SAGE, Los Angeles, London, New Delhi, and Singapore), Vol. 28(6): 164–181. DOI: 10.1177/0263276411424420
Introduction

Let me begin this article with a hypothetical. You open up a new computer and fire up a web browser. You go to washingtonpost.com, visit a couple of blogs at Wordpress and Tumblr, and go on the business social networking site linkedin.com. Maybe you take a break from the internet, go grab a cup of coffee, but return to watch some videos on Hulu, check US gossip at tmz.com, and look at the weather at wunderground.com. At this point you decide it might be best to go to work so you close your computer, get dressed, and go outside. While you may proceed with your day as if nothing has happened, something has changed about who you are online. You have been identified. Your IP address has been logged; you have a cookie file installed on your computer. And somewhere, in a database far, far away, you very well may have a gender, class, and race. This small tour of web sites provides us with a mundane example of what I call in the title of this article a 'new algorithmic identity'. The networked infrastructure of the internet, with its technological capacity to track user movements across different web sites and servers, has given rise to an industry of web analytics firms that are actively amassing information on individuals and fine-tuning computer algorithms to make sense of that data. The product of many of these firms is a 'new algorithmic identity', an identity formation that works through mathematical algorithms to infer categories of identity on otherwise anonymous beings. It uses statistical commonality models to determine one's gender, class, or race in an automatic manner at the same time as it defines the actual meaning of gender, class, or race themselves. Ultimately, it moves the practice of identification into an entirely digital, and thus measurable, plane. This article will examine the consequence of some of those practices aimed at understanding exactly what kind of user is visiting web sites, purchasing products, or consuming media.
Online a category like gender is not determined by one's genitalia or even physical appearance. Nor is it entirely self-selected. Rather, categories of identity are being inferred upon individuals based on their web use. Code and algorithm are the engines behind such inference and are the axis from which I will think through this new construction of identity and category online. We are entering an online world where our identifications are largely made for us. A 'new algorithmic identity' is situated at a distance from traditional liberal politics, removed from civil discourse via the proprietary nature of many algorithms while simultaneously enjoying an unprecedented ubiquity in its reach to surveil and record data about users. In this article I will first discuss the conceptual and theoretical work around code, outlining its use in an analysis of online categorization practices. The article will then approach the function of code at the level of the category, arguing that an analysis of coded computer algorithms enables a supplement to Foucauldian thinking around biopolitics and biopower, of what I call soft biopower and soft biopolitics. These new conceptual devices
will allow us to better understand the workings of biopower at the level of the category, of using computer code, statistics and surveillance to construct categories within populations according to users' surveilled internet history. Finally, the article will think through the nuanced ways that algorithmic inference works as a mode of control, of processes of identification that structure and regulate our lives online within the context of online marketing and algorithmic categorization.

Code and Categorization

Even spared of its technological complexity, code is still a tricky subject to wrap one's head around. To begin with, code as a concept is used quite indiscriminately. Some theorists see code as an architecture, of the digital walls and foundations that delimit our experience online. This thinking is centered around the work of Lawrence Lessig, for whom code constructs the space we have to use and access information. Code, in Lessig's mind, 'is law' (Lessig, 2006: 1). Rules are both intentionally and unintentionally written into the hardware and software of internet technologies that then create the infrastructure that determines how we as users can act in cyberspace. But most contemporary uses of code as a concept require a more dynamic definition. Nigel Thrift (2005: 173) offers code not as law but as 'new adaptive standards of conduct which can work in all kinds of situations at all kinds of scales'. And Alexander Galloway (2004) argues that this type of code, or his term 'protocol', also encompasses the standards for internet communication (TCP/IP and DNS). The structural ways that protocols enable and disable particular kinds of internet traffic become viable as they ensure standardization, an architectural metric system of sorts through which control follows from 'a set of technical procedures for defining, managing, modulating, and distributing information through a flexible yet robust delivery infrastructure' (Thacker, 2004: xv).
Code as architecture works to structure the boundaries, as well as regulate the flows, of internet traffic. A key part of this regulation is in the actual work that code does, of creating, ordering and ultimately giving meaning to digital artifacts. Code as a concept must also force us to think about the mundane processes of data differentiation online, specifically of determining how a variable like X or Y can come to be defined, what X and Y actually mean and, most importantly, how users experience X and Y within the architecture that code creates. Stephen Graham echoes this sentiment with his work on 'software-sorting', of 'directly, automatically and continuously allocating social or geographical access to all sorts of critical goods, services, life chances or mobility opportunities to certain social groups or geographical areas' (2005: 3). Code is part of a dynamic relationship to the real world, one that can 'automatically and continuously' affect life chances offered to users based on a pre-configured but also reflexive programmed logic. But rather than look at code as just an automated and adaptive gatekeeper of critical
information, we can explore its function much more constitutionally. An analysis that centers on code allows us to look at a list of lines of a computer language and see how it uses certain representations of the world (variables and data) to produce new value. A new value like X = male can then be used to suggest sorting exercises, like targeted content and advertisements, based entirely on assumptions around that value. Yet code can also construct meaning. While it can name X as male, it can also develop what 'male' may come to be defined as online. But this new value is not corralled within an entirely digital realm. When a user changes her profile on Facebook to state her 'sex', there's both a delimiting action (of the set drop-down menu choices available to describe one's 'sex' of male or female) as well as a cultural discourse (what does being female online mean?) through which code speaks (Fuller, 2008). The nascent field of software studies looks at cultural practices that exceed the material world of code, of incorporating an analysis that takes into account human-machine interactions as well as the technology of machines themselves (Cramer, 2005). The study of code and software in general tries to go 'beyond the blip' to understand the implicit politics of computer code, 'to make visible the dynamics, structures, regimes, and drives' of the wide variety of programmed scripts that are littered across the internet (Fuller, 2003: 32). With this view we can see computer code not as a literal, rhetorical text from which we derive meaning but a complex set of relationships that tie together the coded systems of definition and organization that constitute our experience online. Codes are cultural objects embedded and integrated within a social system whose logic, rules, and explicit functioning work to determine the new conditions of possibilities of users' lives.
How a variable like X comes to be defined, then, is not the result of objective fact but is rather a technologically-mediated and culturally-situated consequence of statistics and computer science.

Categorization

My argument will focus on the process of defining X, of the identification of particular groups within populations largely through the marketing logic of consumption. Within marketing, considerable research goes to identify the composition of a consumer audience. In the past, consumers were discriminated based on census-data-laden geographies: rich people lived in a certain ZIP code, poor people lived in another, and businesses could provide coupons for class-differentiated products and services accordingly (Gandy, 1993). A move from demographic to psychographic categorizations followed, where clusters of consumer types were created to better understand the non-essentialist character of demographic-based consumption patterns (not all white people are interested in timeshares in Boca Raton, but those who are older and with money are) (Yankelovich and Meer, 2006; Arvidsson, 2006). An important shift in marketing happened as marketers went online and were able to use data from search queries to create behavioral
understandings atop these clusters (Turow, 2006). Purchase and other behavioral data were then lumped into how marketers see and interact with individuals through the construction of 'databases of intentions', where user information and search queries are aggregated to understand intentions – such as purchase intent – as well as general trends in social wants and needs (Battelle, 2005). Mathematical algorithms allowed marketers to make sense out of these data to better understand how to more effectively target services, advertisements, and content. Through algorithms, commonalities between data can be parsed and patterns within data then identified and labeled. And with the capacities of computers to surveil and capture user activity across the internet, consumer information was removed from the shackles of time-bound, decadal census data and began to float atop a constant stream of real-time web use that can be matched against existing behavior and identity models – like gender. Such a shift focuses not on essential notions of identity but instead on pliable behavioral models, undergirded by algorithms, that allow for the creation of a cybernetic relationship to identification. New worlds online (of content or advertisements) are created according to newly identified user data and interactions. Continually updated new content can then be suggested to users based on the history of a user's interactions with the system, which are compiled into a particular identity for that user (like X = male). As the capacity of computers to aggregate user data increases and algorithms are improved upon to make disparate data more intelligible and useful, the ability for real-time cybernetic modeling to monitor, compute, and act becomes more efficient.
So as more data is received about a certain user's behavior online, new coded computations can be done to change who the user is believed to be and what content that user might desire (the gender of the same user might change from male to female if enough user data, such as the addition of certain web sites that user visited, are presented to statistically identify that user with a different gender). In this constant feedback loop we encounter a form of control. But this is not a control that we can explain through references to code as law. It does not present us with static norms of conduct or of unsurpassable rules that prevent particular uses. Code also regulates the innards of the internet. It plays a constitutive role in the mundane activities – of what book I'm recommended at a particular moment or what kind of shampoo I'm advertised when I'm visiting a particular web page. This thinking falls into a variety of models of cybernetics, from Vincent Mosco's (1996) cybernetic commodity that details information about consumer transactions and creates a database of user behavior to Nikolas Rose's (1999) cybernetics of control that explores how normative assumptions of daily life become embedded within the circuits of our education, employment, and consumption practices. But I argue we can even more effectively understand this process as modulation, of a Deleuzeian approach to control that relies on practices characterized by the terms of the societies of control (Deleuze, 1992). This move offers us a way of seeing how many mechanisms of control
have shifted to practices external to the body, where discipline has often become more or less unnecessary if control can be enacted through a series of guiding, determining, and persuasive mechanisms of power. Instead of constructing subjectivity exclusively through disciplinary power, with normative discourses prefiguring how we engage with and even talk about the world, a regulative approach allows for a distance between power and subject (Foucault, 2008). It configures life by tailoring its conditions of possibility. Regulation predicts our lives as users by tethering the potential for alternative futures to our previous actions as users based on consumption and research for consumption. Control in this formation becomes best represented by the concept of modulation, ‘a self-deforming cast that will continuously change from one moment to the other, or like a sieve whose mesh will transmute from point to point’ that sees perpetual training replacing serial encounters with singular, static institutions (Deleuze, 1992: 4). And modulation marks a continuous control over society that speaks to individuals in a sort of coded language, of creating not individuals but endlessly sub-dividable ‘dividuals’ (Deleuze, 1992). These dividuals become the axiom of control, the recipients through which power flows as subjectivity takes a deconstructed dive into the digital era. When we situate this process within questions around digital identity, dividuals can be seen as those data that are aggregated to form unified subjects, of connecting dividual parts through arbitrary closures at the moment of the compilation of a computer program or at the result of a database query. Dividual pieces, onto which I conceptually map the raw data obtained by internet marketing surveillance networks, are made intelligible and thus constitute the digital subject through code and computer algorithms. 
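The way such dividual fragments might be 'made intelligible' through code can be pictured in a toy sketch. To be clear, this is not Quantcast's (or any firm's) actual method: the sites, likelihood figures and neutral default below are all invented for illustration.

```python
# Toy sketch of statistical identity inference from "dividual" fragments.
# All likelihood figures are invented; real analytics models are proprietary
# and far more elaborate.

# Hypothetical per-site likelihoods: P(visit | male), P(visit | female).
SITE_LIKELIHOODS = {
    "cnn.com": (0.60, 0.40),
    "tmz.com": (0.35, 0.65),
    "wunderground.com": (0.50, 0.50),
}

def infer_gender(visited_sites, prior_male=0.5):
    """Naive-Bayes-style update: every visit revises the running estimate."""
    p_male, p_female = prior_male, 1.0 - prior_male
    for site in visited_sites:
        lm, lf = SITE_LIKELIHOODS.get(site, (0.5, 0.5))  # unknown sites are neutral
        p_male, p_female = p_male * lm, p_female * lf
        total = p_male + p_female  # renormalize after each fragment
        p_male, p_female = p_male / total, p_female / total
    return p_male

# The same user is re-scored as new fragments arrive: the inferred identity
# is a revisable confidence measure, not a fixed attribute.
confidence = infer_gender(["cnn.com", "cnn.com", "tmz.com"])
label = "male" if confidence > 0.5 else "female"
print(label, round(confidence, 3))  # prints: male 0.548
```

What the sketch makes concrete is the claim above: the 'identity' here is nothing but a running computation over surveilled fragments, and a single additional visit can tip the label the other way.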
The algorithmic process that I will focus on in this article looks at the web analytics firm Quantcast as it infers certain categorical identifications from seemingly meaningless web data.1 As new information gets created about an individual through the tracking of her web presence, a user's identity becomes more defined within the system in accord with the logic of Quantcast's identification algorithms. If I were to surf on Quantcast's member sites for several hours, my initially unknown gender would become more concrete in the face of Quantcast's algorithms, as exemplified in the hypothetical that begins this article. Dividual fragments flow across seemingly open and frictionless networks and into rigid database fields as part of the subsumption implicit in data mining (the practice of finding patterns within the chaos of raw data). As a user travels across these networks, algorithms can topologically striate her surfing data, allocating certain web artifacts into particular, algorithmically-defined categories like gender. The fact that user X visits the web site CNN.com might suggest that X could be categorized as male. And additional data could then buttress or resignify how X is categorized. As X visits more sites like CNN.com, X's maleness is statistically reinforced, adding confidence to the measure that X may be male. As X visits more sites that are unlike CNN.com, X's maleness
might be put into question or potentially resignified to another gender identity. In this example, web data's apparent chaos and undefinability, of the limitless possibilities that any data set initially offers, is allocated into specific categories. The implicit disorder of data collected about an individual is organized, defined, and made valuable by algorithmically assigning meaning to user behavior – and in turn limiting the potential excess of meanings that raw data offer. By using these dividual fragments, a subject's identity can be articulated according to the programmed rationale of the algorithm. Ultimately, and after certain statistical checks of confidence, this articulation can result in a unified and static status: X is male. As described in a Quantcast white paper outlining the company's inference methodology, these static user identities help determine the gendered make-up of a site (Quantcast, 2008). By tracking users and algorithmically assigning categories according to their behavior, Quantcast may define the gender composition of a site on its network as 73 percent male and 27 percent female. But as more inferred-as-female users begin to visit the site, the site's gender identity has the chance to change in accord (for example to 72 percent male and 28 percent female). The indirect consequence of this dynamism comes as a site's altered gendered make-up then can play a constitutive part in reconfiguring the statistical groupings that Quantcast uses to infer gender upon its users. New, not-yet-gendered individuals that visit this site may still be identified as male but now with a statistical confidence potentially less than previously calculated.2 In requisite cybernetic fashion a user's ascribed gender can and may change as new user information arrives into the cybernetic system. This type of user identity does not approach identity formation from an essentialist framework.
Rather, algorithms allow a shift to a more flexible and functional definition of the category, one that de-essentializes gender from its corporeal and societal forms and determinations while it also re-essentializes gender as a statistically-related, largely market research-driven category. Gender becomes a vector, a completely digital and math-based association that defines the meaning of maleness, femaleness, or whatever other gender (or category) a marketer requires. These categorizations are very much tied to the groupings of what a company like Quantcast finds useful, namely those that are derived from the world of market research (gender, age, income). And while offline data can be used to supplement algorithmic inference measures, the move toward a vector-based category marks a separation of the category from its normative domain and a reassociation of that category according to statistical correlation. Of course real world preconceptions around the meanings of categories feed into the foundational definitions that we have about certain genders, races, and classes online (Nakamura, 2002). In most cases, predetermined biases about the nature of group identity will be maintained online. But the capacity for cybernetic categorization to regulate certain categories' meaning according
to algorithm marks a move away from offline stereotypes and into a form of statistical stereotyping. Rather than maintaining a particular and concrete relationship to maleness in the offline world, a cybernetic conception of gender would position maleness as a modulating category, based on a vector and composed of statistical assessments that define male group membership. New information can come to light within this algorithmic system and an algorithm can subsequently move to change both a category's meaning and the once-essential ideals ascribed to that category. Gender as a category still holds the capacity for discipline – of serving targeted advertisements and targeted content according to the inferred digital identity of a user – but it also is embodied by a new and flexible cybernetic categorization system. Instead of standards of maleness defining and disciplining bodies according to an ideal type of maleness, standards of maleness can be suggested to users based on one's presumed digital identity, from which the success of identification can be measured according to ad click-through rates, page views, and other assorted feedback mechanisms. The regulation of gender as a category then becomes wholly embedded within the logic of consumption, where categorical behaviors are statistically defined through a cybernetics of purchasing and research that marketers have deemed valuable for identification and categorization. As the specifics of Quantcast's algorithms remain proprietary, we can use other examples of 'machine learning' to help us explicate this process in the public domain. Put simply, machine learning is the 'technical basis ... used to extract information from the raw data in databases – information that is [then] expressed in a comprehensible form and can be used for a variety of purposes' such as prediction and inference (Witten and Frank, 2005: xxiii).
Machines are programmed to learn patterns in data and use those correlative patterns to analyze and make assessments about new data. Yahoo researchers, for example, have described the capacity for an algorithm to infer demographic categories of identity (race, gender). Based on search query logs, user profile data, and US census data, these researchers found they can subsequently tailor results according to user categorizations. If a particular user issues the query 'wagner', an algorithm can offer differentiated results as determined by that user's gender categorization. Based on the observed web habits of 'typical' women and men, search results for 'wagner' can include sites about the composer 'Richard Wagner' for women while at the same time provide sites on 'Wagner USA' paint supplies for men (Weber and Castillo, 2010). Similarly, researchers in Israel have created an algorithm that predicts an author's gender based on lexical and syntactic features. Upon separating the written works of women and men, statistical correlations were run to create a vector-based categorization of the two genders' written features, through which the algorithm was able to then identify the 'correct' gender of a new, non-categorized work with a success rate of 80 percent (Koppel et al., 2002). But undergirding all of this is the theoretical possibility that these statistical models can change
given new data. The chance that men, in the Yahoo example, stop clicking on paint supplies and start clicking on music sites could statistically suggest that Yahoo's definition of 'men' may not be appropriate. And the possibility that an inferred-as-male writer may not write like every man before him can make an algorithm statistically second-guess its initial vector-based categorization. Categories always have the capacity for change, off and online, but the shift I want to bring to the world of identity is the continuous, data-centered manner that modulates both user experience (what content they see) as well as the categorical foundation of those identifications.

Soft Biopower and Soft Biopolitics

I argue that with cybernetic categorization comes a new analytical axis of power: the digital construction of categories of identity. A recent call to rethink biopolitics, to understand the historical context from which contemporary forms of control over populations interact, does well to push us away from more modernist conceptions of biopower and biopolitics and into a realm of contemporary, localized biopolitical analysis (Revel, 2009). My own analysis will focus on biopower as it works in the construction of identity, of the 'power over life' through the management of subjects at the level of the population (Foucault, 2003a: 135). The regulatory work of biopolitical technologies does much to help us understand the shift Foucault articulated from power of sovereignty to power over life (Foucault, 2008). And it allows us to assess the impact of somewhat ethically-ambiguous technologies of biopower. In what Foucault calls security mechanisms we can see the technologies installed to regulate the inherent randomness in any population of human beings so as to optimize a particular state of life (Foucault, 2003a: 246). I address these mechanisms through computer algorithm-based categorization that works to biopolitically manage and control groups of people.
Biopolitics' origin in 18th-century society worked to supplement existing modes of disciplinary power, of introducing forms of regulatory power that found a new type of control of populations at the level of the man-as-species (Foucault, 2003a). Using statistics, demographic assessments, and through an analysis of birth and death rates, government was able to situate itself in a relationship with subjects not only vis-à-vis individual bodies but vis-à-vis the population and sub-populations. Statistics, forecasting, and measurement became the preferred modes of regulation, of governmental intervention at a level of 'generality' – the level where parts of life are determined and through which a temporary homeostasis could be achieved (Foucault, 2007, 2008). Biopolitics in traditional Foucauldian terms has been centered in the practices of achieving this equilibrium. I situate my intervention with the different processes of categorization that allow us to come to this equilibrium through 'a technology which brings together the mass effects characteristic of a population, which tries to control the series of random events that can occur in a living mass, a technology
which tries to predict the probability of those events (by modifying it, if necessary) or at least compensate for their effects' (Foucault, 2003a: 249). With the construction of categories we can see how biopolitical work controls those random events, of creating what Foucault calls 'caesuras', or breaks in the biological continuum of characteristics of life (Foucault, 2003a; Macey, 2009). Biopower works through a series of dividing practices, a process of objectification outlined in Louise Amoore's (2006: 339) analysis of border technology biometrics that act on populations through the categorizations of 'student', 'Muslim', 'woman', 'alien', 'immigrant', and 'illegal'. But the move I want to make is to analyze the role of what I term soft biopolitics, of how biopolitical processes give meaning to the categorizations that make up populations and through which we can look at the variable indirect processes of regulation. To explicate soft biopolitics is to better understand not just how populations are managed but the changing relationship that categories have to populations and, most importantly, what categories actually come to mean and be redefined as. We can also take a grounded example with a biopolitical tinge to explore this process. Say you are categorized as a male through your use patterns online. Medical services and health-related advertisements could be served to you based on that gendered categorization. This means that those who are not categorized as male may not experience the same advertisement and opportunities, enacting the consequences of what was already referred to as 'software-sorting' (Graham, 2005). But while the initial biopolitical relationship of subject and power would stop at this point, the introduction of the notion of cybernetic categorization can provide another perspective of Foucauldian power.
Users are not categorized according to one-off census survey data but through a process of continual interaction with, and modification of, the categories through which biopolitics works. In marketing, these categorizations are constructed within the logic of consumer capitalism, where biopower can govern subjects at a distance, guarding their apparent autonomy while ‘[optimizing] systems of difference, in which the field is left open to fluctuating processes... [and] in which there is an environmental type of intervention instead of the internal subjugation of individuals’ (Foucault, 2008: 259–60). Category construction has itself become a biopolitical endeavor, where Foucauldian security dilemmas are resolved not just by the enactment or change of policy but also through alterations in how existing policy reaches subjects. Maleness can be constantly evaluated against feedback data, through which the definition of maleness can shift (to include or exclude transgendered folk, for just one example) according to the logic of the algorithm. Policies that rely on gendered categorizations can then be automatically and continuously reoriented to address new populations. These algorithmic definitions supplement existing discursive understandings of a category like gender – most fundamentally through the language of gender itself – with statistical conclusions about what an idea like maleness actually is. The exclusivity of gender’s meaning then becomes
held within the logic of the categorical system as defined by a particular algorithm. The cybernetic definition and redefinition of gender then lends a form of elasticity to power. At this categorical level there exists a theoretical modulation that restages the relationship that biopower has to subjects. In our above example, medical advertisements targeted to men represent such power on two fronts. First, the traditional biopolitical action of medical services helps manage life. The state or a corporate power can biopolitically aim to reduce STD transmission among the male population by advertising condoms or safe-sex behaviors. And second, the normalizing force of what it means to be categorized as male becomes a biopolitical activity. The category of gender in our example is a result of statistical analyses that regulate what Foucault calls the ‘conduct of conduct’ – or, more precisely, that discursively position maleness so as to indirectly, and largely unintentionally, control the conditions of possibility afforded to users who are targeted as male (Foucault, 1982). It is this regulation of the randomness in gendered populations that optimizes, through soft biopower, a particular state of life as defined by algorithm. The demands put on soft biopower – updatable categories that can adapt to the dynamism of populations – require malleable, modulating categorical groupings in order to effectively define and redefine categories’ meanings. The truth of what is gendered male requires us to dig beneath its potential self-evidence (scientific statistical probability models) and into a larger schema of power relations, of understanding the ‘fundamental codes of a culture – those governing its language, its schemas of perception, its exchanges, its techniques, its values... [that] establish for every man... the empirical orders with which he will be dealing and within which he will be at home’ (Foucault, 1973: xx).
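What it might mean for a category’s meaning to modulate can be pictured with a minimal, hypothetical sketch: here ‘male’ is nothing but a boundary re-estimated from each new batch of behavioral data, so the category’s extension shifts even when no individual user changes at all. The behavior scores and the midpoint rule are invented for illustration only.

```python
# A sketch of a modulating category: "male" is not a fixed definition but
# a decision boundary re-estimated from each batch of observed behavior.
# The scores and the midpoint re-fitting rule are invented for illustration.

class ModulatingCategory:
    def __init__(self, boundary):
        self.boundary = boundary  # behavior score above which a user counts as "male"

    def contains(self, score):
        return score > self.boundary

    def refit(self, male_scores, other_scores):
        """Redefine the category as the midpoint between the two groups' means."""
        self.boundary = (sum(male_scores) / len(male_scores) +
                         sum(other_scores) / len(other_scores)) / 2

cat = ModulatingCategory(boundary=0.5)
was_male = cat.contains(0.55)        # True under the initial definition

# New population data arrives; the category is re-fitted and its meaning
# shifts, even though this particular user's behavior never changed.
cat.refit(male_scores=[0.8, 0.9], other_scores=[0.4, 0.5])
is_male_now = cat.contains(0.55)     # the same user now falls outside it
```

The point of the sketch is that the same user, with the same score, can be inside the category one day and outside it the next: it is the category, not the subject, that has moved.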
But Foucault denotes a middle space between these coded limitations of our culture, on one hand, and philosophical and scientific interpretations of order on the other (for example, that God created and classified the world). This middle space is where soft biopower lives, where the modulation of categorical identity changes according to a ‘culture, imperceptibly deviating from the empirical orders prescribed for it by its primary codes . . . [that] frees itself sufficiently to discover that these orders are perhaps not the only possible ones or the best ones’ (Foucault, 1973: xx). By stepping through the concept of cybernetic categorization we can follow this important move etched by Foucault in his understanding of the social construction of categories. First, ontologies are embedded within a set of power relations. Second, the categories that are bred from those ontologies exercise a profound impact on how we as subjects encounter our world. And third, changes in our categorizing schematics are indebted to this fundamental coding of a culture, in which I find a strong parallel to the technological organization of subject identities through code and algorithm. The knowledge formation implicit in algorithmic work ‘creates and causes to emerge new objects of knowledge and accumulates new bodies of information’ that ultimately give rise to new categorization practices, in the
case of the shift from offline to online marketing, but also to redefined categorical meanings in the case of cybernetic categorization (Foucault, 1980: 52). This redefinition is part of what others have argued is a shift toward a ‘topological’ approach to genealogy, one that identifies ‘patterns of correlations’ that lead to the formation of particular dispositions of unified heterogeneous elements (Collier, 2009). While Collier’s treatment of a ‘topology of power’ takes these elements as a wide array of different and recombined techniques and technologies that create a unified state of power relations, the same logic can be applied to the algorithm. Patterns of correlation can be found in technologies of algorithmic categorization, which recombine and unify heterogeneous elements of data that have no inner necessity or coherence. The softer versions of biopower and biopolitics supplement the discursive production of categories’ meanings, as it is also through data and statistical analysis – not just through discourse and its subsequent naturalization – that conceptions of gender change (Foucault, 1973).3 In order to understand the regulatory process of biopolitics we need to pay additional attention to the biopolitical construction of these categorizations. Here we can better define soft biopower and soft biopolitics. The former signifies the changing nature of categories that on their own regulate and manage life. Unlike conceptions of hard biopower that regulate life through the use of categorizations, soft biopower regulates how those categories themselves are determined to define life. And if we describe biopolitics as Foucault does, as ‘the endeavor...
to rationalize the problems presented to governmental practice by the phenomena characteristic of a group of living human beings constituted as a population’, soft biopolitics constitutes the ways that biopower defines what a population is and determines how that population is discursively situated and developed (Foucault, 2003b: 73).

Control

I argue that these defining practices of soft biopower and soft biopolitics act as mechanisms of regulatory control. For the remainder of this article I will understand control as ‘operating through conditional access to circuits of consumption and civility’, interpreting control’s mark on subjects as a guiding mechanism that opens and closes the particular conditions of possibility that users can encounter (Rose, 2000: 326). The feedback mechanism required in this guiding process is suggestion. I define suggestion as the opening (and consequent closing) of conditional access as determined by how the user is categorized online.4 As categorizations are constructed by firms like Quantcast, refined by algorithms that process online behavior, and continually improved upon as more and more time passes and more and more web artifacts are inputted into a database, advertisements and content are then suggested to users according to their perceived identities. In cybernetic categorization these groupings are always changeable, following the user and suggesting new
artifacts to be visited, seen, or consumed. Importantly, the cybernetic feedback loop exists only in so far as it productively aids in the maintenance of the system. The process of suggestion works to the extent that algorithmic inference adeptly categorizes.5 If a particular suggestion is misguided, algorithms are programmed to interpret the mis-categorization and reassign a category of identity based on that user’s newly observed behavior. Historically, thinking about control in the field of surveillance has focused on a direct relationship to the subject. David Lyon describes surveillance as ‘the routine ways [that] focused attention is paid to personal details by organizations that want to influence, manage, or control certain persons or population groups’ (Lyon, 2003a: 5). Lyon’s explicit mention of control in a longer list that includes ‘influence’ and ‘manage’ presents a largely direct and causal description of control. Organizations surveil because they ‘want to’ determine discriminatory pricing (commercial surveillance), discourage crime (CCTV), or apprehend criminals (NSA surveillance programs).6 I propose we approach Lyon’s definition through a more subtle and indirect understanding of control. Through what I term cybernetic categorization, categories’ meanings can be realigned according to the code and algorithmic models built to target content to particular consumers. The process of identification, at least in the online world, becomes mediated by what I term soft biopolitics, as user identities become tethered to a set of movable, statistically defined categorizations that can then influence biopolitical decisions by states and corporations.
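The feedback loop described here – suggest, observe, reassign – can be reduced to a toy illustration. The category names, the suggested items, and the use of click-through as the feedback signal are all hypothetical simplifications invented for this sketch; real inference systems are vastly more elaborate.

```python
# A caricature of the suggestion feedback loop: content is suggested
# according to the user's current category; suggestions the user wholly
# ignores are read as mis-categorization and trigger reassignment.
# Categories, items, and the click-through signal are invented here.

SUGGESTIONS = {
    "male": ["sports article", "razor ad"],
    "not-male": ["generic article", "generic ad"],
}

def suggest(category):
    """Open the conditions of possibility attached to a category."""
    return SUGGESTIONS[category]

def update_category(category, clicked):
    """If the user ignores every suggestion, infer mis-categorization."""
    if clicked:
        return category  # categorization deemed 'adept'; keep it
    # reassign to the other available grouping in this two-category toy
    return next(c for c in SUGGESTIONS if c != category)

category = "male"
offered = suggest(category)
category = update_category(category, clicked=[])  # user clicked nothing
```

The loop never needs to ask who the user ‘really’ is: it only needs the categorization to fit observed behavior well enough for suggestions to keep working.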
This control resembles Nikolas Rose’s (1999: 49–50) concept of ‘government at a distance’, of regulating how identity can be governed ‘through the decisions and endeavours of non-political modes of authority [that are] distanced spatially, in that these technologies of government link a multitude of experts in distant sites to the calculation of those at a center’. Control then works at levels far beyond the purview of liberal individualism, situating subjects within networks of power that govern indirectly and without proximity. The individual user is incapable of really experiencing the effect that algorithms have in determining one’s life, as algorithms rarely, if ever, speak to the individual. Rather, individuals are seen by algorithms and surveillance networks as members of categories. So instead of positioning the individual as the locus of research, attending to the level of the category can enable us to better understand the structuring of our lives under surveillance, as well as the myriad ways control can work on both individuals and categories. And it is here that the potential for discourse around identity becomes problematic. The identifications that make us as subjects online are becoming more opaque and buried, away from our individual vantage points and removed from most forms of critical participation. They are increasingly finding mediation outside the realm of traditional political intervention and inside the black boxes of search engines and algorithmic inference systems (Becker and Stalder, 2009). This system of regulative control breaks with the theoretical lineage around control and technology begotten in the world of cyberpunk, as we travel away from the liberal focus on the individual and
toward the structuring forms of suggestion at work when users who are gendered (or assigned any other categorical identity) visit a web page.7 Such ubiquity demonstrates a capacity, and the increasing use of that capacity, to employ surveillance technologies to gather data about individual users and about populations in general. But this ubiquity also needs to be tempered by a perspective that resists technological determinism and allows for technological failure. Surveillance technologies do not always work as designed, and we have seen with the example of CCTV that technological capacity does not ensure a human capacity to adeptly monitor and analyze what the technology records.8 I agree with Wendy Hui Kyong Chun (2006) when she argues that the internet is neither a tool for freedom nor a tool for total control. Control is never complete, and neither is our freedom. At the present moment, web analytics firms have made it nearly impossible not to be incorporated into surveillance networks. We are always vulnerable, to some degree, to surveillance’s gaze. But vulnerability, the historical point of departure when we define freedom in its most classical liberal sense, should not then be thought of as the revocation of autonomy or as one’s suffocation under control (Chun, 2006). Control has become a modulating exercise, much as Deleuze predicted, which constitutes an integral part of contemporary life as it works not just on the body, nor just on the population, but on how we define ourselves and others (Deleuze, 1992). Deleuze is vital for exploring these processes for the very fact that he understood the shift from the ‘enclosed’ environments of disciplinary society to the open terrain of the societies of control. Enclosure offers the idea of walls, of barriers to databases and surveillance technologies. Openness describes a freedom of action that is at the same time vulnerable to surveillance and manipulation.
And these open mechanisms of control – the automated categorization practices and the advertisements and content targeted to those categorizations – effectively situate and define how we create and manage our own identities. Surveillance practices have increasingly moved from a set of inflexible disciplinary practices that operate at the level of the individual to the statistical regulation of categorical groupings: ‘it is not the personal identity of the embodied individual but rather the actuarial or categorical profile of the collective which is of foremost concern’ to new, unenclosed surveillance networks (Hier, 2003: 402). Cybernetic categorization provides an elastic relationship to power, one that uses the capacity of suggestion to softly persuade users towards models of normalized behavior and identity through the constant redefinition of categories of identity. If a certain set of categories ceases to effectively regulate, another set can quickly be reassigned to a user, providing a seemingly seamless experience online that still exerts a force over who that user is. This force is not entirely benign but is instead something that tells us who we are, what we want, and who we should be. It is removed from traditional mechanisms of resistance and ultimately requires us to conceive of freedom, in whatever form, far differently than previously
thought. We are effectively losing control in defining who we are online, or more specifically we are losing ownership over the meaning of the categories that constitute our identities. Algorithm ultimately exercises control over us by harnessing these forces through the creation of relationships between real-world surveillance data and machines capable of making statistically relevant inferences about what that data can mean. And the processes of soft biopower work in a similar fashion, allowing for a modularity of meaning that is always productive ^ in that it constantly creates new information ^ and always following and surveilling its subjects to ensure that its user data are effective. New cybernetic category constructions are the consequence of this modularity and ultimately allow for a ‘free’, but constantly conditioned, user.
Notes

1. Quantcast (www.quantcast.com) is a web analytics company that operates as a free service whose member sites include HTML snippets of code in each HTML page on a web server. These snippets ‘phone home’ to Quantcast databases every time a user visits a site. This recording of user visits is then aggregated over time to create a history of a user’s web use across various web sites.

2. Quantcast uses more than just web-use data to infer categories upon its users – ‘An increasing variety of inputs can be used to continually validate and refine Quantcast inference models’ – but the impact and extent of these data inputs are unknown. I will thus focus on web use exclusively (Quantcast Corporation, 2008: 1).

3. Soft as a prefix can be elaborated by linking biopower to two similar concepts from different literatures. In the more proximate field of computer science, soft computing developed as programmers moved from a world of perfect computation into a fuzzier variety of close-enough, inexact solutions to problems (Zadeh, 1994). This mimics the shift from essential modes of identity online toward fuzzy conceptions of statistical belief – a user is likely male based on probability. And in a theoretical parallel to diplomatic soft power, soft biopower works as a more flexible form of power that uses attraction and influence to achieve its goals. Where the brute force of biopolitical action (population control) is mediated through the definition and redefinition of the targeted category, the brute force of hard state power (war) is mediated through diplomatic mechanisms and arguments around mutual benefit (Nye, 2002). ‘Hard’ biopower acts by dividing populations and enacting policy to control subjects through those divisions; soft biopower acts by taking those divisions and modulating categories’ meanings so as to best serve the rationale of hard biopower.

4.
One example of how one’s conditions of possibility may be affected through targeting is what Cass Sunstein (2007) calls ‘The Daily Me’: an individualized array of information that pertains to the perceived or provided interests of a user, theoretically decreasing the chance that the user will encounter news or information that may contradict his existing views about the world.
5. ‘Adeptly categorizes’ does not mean ‘correctly categorizes’. The re-essentializing move of this categorization process relies on the fact that a categorization fits the behavior of a user, not that the user himself embodies that category.

6. For work on commercial surveillance see Lyon (2003b); for work on CCTV see Lomell (2004); for work on the NSA see Bamford (2001).

7. This liberal notion is most famously articulated in the work of Gibson (1984) and Dick (1968), both of whom defined cyberpunk’s insistence on the individual’s direct relationship with power.

8. The extent to which CCTV works is arguable, though analysis by Webster (2009) has offered a strong case that CCTV does little to prevent crime. The number of cameras in London alone, which has elicited a constant cry from privacy advocates, makes human observation of the technology nearly impossible.
References

Amoore, L. (2006) ‘Biometric Borders: Governing Mobilities in the War on Terror’, Political Geography 25: 336–351.
Arvidsson, A. (2006) Brands: Meaning and Value in Media Culture. New York: Routledge.
Bamford, J. (2001) Body of Secrets. New York: Random House.
Battelle, J. (2005) The Search: How Google and its Rivals Rewrote the Rules of Business and Transformed Our Culture. New York: Portfolio.
Becker, K. (2009) ‘The Power of Classification: Culture, Context, Command, Control, Communications, Computing’, in K. Becker and F. Stalder (eds) Deep Search: The Politics of Search Beyond Google. Munich: Studienverlag & Transaction.
Becker, K. and F. Stalder (2009) ‘Introduction’, in K. Becker and F. Stalder (eds) Deep Search: The Politics of Search Beyond Google. Munich: Studienverlag & Transaction.
Chun, W.H.K. (2006) Control and Freedom: Power and Paranoia in the Age of Fiber Optics. Cambridge, MA: MIT Press.
Collier, S. (2009) ‘Topologies of Power: Foucault’s Analysis of Political Government Beyond ‘‘Governmentality’’’, Theory, Culture & Society 26(6): 79–109.
Cramer, F. (2005) ‘Words Made Flesh: Code, Culture, Imagination’. Available at: http://pzwart.wdka.hro.nl/mdr/research/fcramer/wordsmadeflesh (consulted January 2010).
Deleuze, G. (1992) ‘Postscript on the Societies of Control’, October 59: 3–7.
Dick, P. (1968) Do Androids Dream of Electric Sheep? New York: Random House.
Foucault, M. (1973) The Order of Things: An Archaeology of the Human Sciences. New York: Vintage Books.
Foucault, M. (1979) Discipline and Punish: The Birth of the Prison. New York: Vintage Books.
Foucault, M. (1980) ‘Two Lectures’, in C. Gordon (ed.) Power/Knowledge: Selected Interviews and Other Writings, 1972–1977. New York: Pantheon Books.
Foucault, M. (1982) ‘Afterword: The Subject and Power’, in H. Dreyfus and P. Rabinow (eds) Michel Foucault: Beyond Structuralism and Hermeneutics. Chicago: University of Chicago Press.
Foucault, M. (2003a) Society Must Be Defended: Lectures at the Collège de France, 1975–1976. New York: Picador.
Foucault, M. (2003b) Ethics: Subjectivity and Truth. New York: New Press.
Foucault, M. (2007) Security, Territory, Population: Lectures at the Collège de France, 1977–1978. New York: Palgrave Macmillan.
Foucault, M. (2008) The Birth of Biopolitics: Lectures at the Collège de France, 1978–1979. New York: Palgrave Macmillan.
Fuller, M. (2003) Behind the Blip: Essays on the Culture of Software. Brooklyn: Autonomedia.
Fuller, M. (ed.) (2008) Software Studies: A Lexicon. Cambridge, MA: MIT Press.
Galloway, A. (2004) Protocol: How Control Exists After Decentralization. Cambridge, MA: MIT Press.
Gandy, O. (1993) The Panoptic Sort: A Political Economy of Personal Information. Boulder: Westview Press.
Gibson, W. (1984) Neuromancer. New York: Penguin.
Graham, S. (2005) ‘Software-Sorted Geographies’, Progress in Human Geography 29(5): 1–19.
Hier, S. (2003) ‘Probing the Surveillant Assemblage: On the Dialectics of Surveillance Practices as Processes of Social Control’, Surveillance & Society 1(3): 399–411.
Koppel, M., S. Argamon and A. Shimoni (2002) ‘Automatically Categorizing Written Texts by Author Gender’, Literary and Linguistic Computing 17(4): 401–412.
Lessig, L. (2006) Code: Version 2.0. New York: Basic Books.
Lomell, H.M. (2004) ‘Targeting the Unwanted: Video Surveillance and Categorical Exclusion in Oslo, Norway’, Surveillance & Society 2(2/3).
Lyon, D. (2003a) Surveillance After September 11. Cambridge: Polity.
Lyon, D. (2003b) Surveillance as Social Sorting: Privacy, Risk and Digital Discrimination. New York: Routledge.
Macey, D. (2009) ‘Rethinking Biopolitics, Race and Power in the Wake of Foucault’, Theory, Culture & Society 26(6): 186–205.
McNay, L. (2009) ‘Self as Enterprise: Dilemmas of Control and Resistance in Foucault’s The Birth of Biopolitics’, Theory, Culture & Society 26(6): 55–77.
Mosco, V. (1996) The Political Economy of Communication. London: SAGE.
Nakamura, L. (2002) Cybertypes: Race, Ethnicity, and Identity on the Internet. New York: Routledge.
Nye, J. (2002) ‘Soft Power’, Foreign Policy 80: 153–171.
Quantcast Corporation (2008) ‘Quantcast Methodology Overview: Delivering an Actionable Audience Service’. Available at: http://www.quantcast.com/whitepapers/quantcast-methodology.pdf (consulted February 2010).
Revel, J. (2009) ‘Identity, Nature, Life: Three Biopolitical Deconstructions’, Theory, Culture & Society 26(6): 45–54.
Rose, N. (1996) Inventing Ourselves: Psychology, Power and Personhood. Cambridge: Cambridge University Press.
Rose, N. (1999) Powers of Freedom: Reframing Political Thought. Cambridge: Cambridge University Press.
Rose, N. (2000) ‘Government and Control’, The British Journal of Criminology 40: 321–339.
Sunstein, C. (2007) Republic.com 2.0. Princeton: Princeton University Press.
Thacker, E. (2004) ‘Foreword: Protocol is as Protocol Does’, in A. Galloway, Protocol: How Control Exists After Decentralization. Cambridge, MA: MIT Press.
Thrift, N. (2005) Knowing Capitalism. Thousand Oaks, CA: SAGE.
Turow, J. (2006) Niche Envy: Marketing Discrimination in the Digital Age. Cambridge, MA: MIT Press.
Weber, I. and C. Castillo (2010) ‘The Demographics of Web Search’, in Proceedings of the 2010 ACM SIGIR Conference on Research and Development in Information Retrieval.
Webster, W. (2009) ‘CCTV Policy in the UK: Reconsidering the Evidence Base’, Surveillance & Society 6(1): 10–22.
Witten, I. and E. Frank (2005) Data Mining: Practical Machine Learning Tools and Techniques. San Francisco: Morgan Kaufmann.
Yankelovich, D. and D. Meer (2006) ‘Rediscovering Market Segmentation’, Harvard Business Review 84(2).
Zadeh, L. (1994) ‘What Is BISC?’. Available at: http://www.cs.berkeley.edu/projects/Bisc/bisc.welcome.html (consulted September 2010).
John Cheney-Lippold is Assistant Professor of American Culture at the University of Michigan. [email: [email protected]]