The DIGITAL HAND
JAMES W. CORTADA
The DIGITAL HAND
Volume III
How Computers Changed the Work of American Public Sector Industries
2008

Oxford University Press, Inc., publishes works that further Oxford University’s objective of excellence in research, scholarship, and education. Oxford New York Auckland Cape Town Dar es Salaam Hong Kong Karachi Kuala Lumpur Madrid Melbourne Mexico City Nairobi New Delhi Shanghai Taipei Toronto With offices in Argentina Austria Brazil Chile Czech Republic France Greece Guatemala Hungary Italy Japan Poland Portugal Singapore South Korea Switzerland Thailand Turkey Ukraine Vietnam
Copyright © 2008 by Oxford University Press, Inc. Published by Oxford University Press, Inc. 198 Madison Avenue, New York, New York 10016 www.oup.com Oxford is a registered trademark of Oxford University Press All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior permission of Oxford University Press. Library of Congress Cataloging-in-Publication Data Cortada, James W. The digital hand. Volume 3, How computers changed the work of American public sector industries / James W. Cortada. p. cm. Includes bibliographical references and index. ISBN 978-0-19-516586-9 1. Information technology—Economic aspects—United States—Case studies. 2. Technological innovations— Economic aspects—United States—Case studies. 3. Business—Data processing—Case studies. I. Title: How computers changed the work of American public sector industries. II. Title. HC110.I55C67 2005 338'.064'0973—dc22 2004030363
9 8 7 6 5 4 3 2 1 Printed in the United States on acid-free paper
To three visionaries who have done so much to help young historians study the history of information technology: Erwin Tomash Arthur L. Norberg William Aspray
PREFACE

The object of government is the welfare of the people.
—Theodore Roosevelt, 1910
The literature on how organizations in the private sector go about their day-to-day work is almost always silent about what workers employed by government agencies do. Business and economic historians, professors of management, and consultants who comment on government treat the public sector of the economy as “different”; hence they often bypass it in their studies of how work is done. Government officials and their employees contribute to this situation by reinforcing the notion that the public sector does things differently and plays a unique role in society. Agencies are not in the business of making profits but, rather, of facilitating and protecting the welfare of the nation. But as a result of these attitudes, they all ignore some basic realities, not the least of which is that the daily work of public officials is often just the same as in the private sector. Because this simple truth is so often overlooked, we have a paucity of research on how the operational practices of day-to-day work in governments and other public institutions compare to those simultaneously in use in the private sector. We also face the problem that the number of studies about work practices in the public sector is far smaller than for the private sector. In addition, government agencies prefer to report more frequently and thoroughly on the activities of other industries, and of the economy as a whole, than about themselves. Yet as the first chapter of this book demonstrates, the public sector is large and, depending on how public workers are counted or their budgets tabulated, the largest within the American economy. Does the paucity of studies about the role of the public sector in modern society mean that we are left with the interesting possibility that a massive portion of the American economy functions differently from the private sector? The question is an intriguing one that we cannot fully answer in this book, but by looking at the day-to-day activities of a variety of government agencies,
we can feed the dialogue with a considerable amount of new detail. Why do we care, one might ask? Simply put, the public sector today consumes between 19 and 21 percent of the Gross Domestic Product (GDP), and there are probably only a handful of citizens who believe they are getting as much value from their taxes as a similar amount might yield in the private sector. However, by examining the role of computing and telecommunications in the daily operations of various government agencies and public institutions, we can specifically identify how the work of officials and public employees plays out in the economy. We will see that many of the operational practices, and uses of computing, were for the same applications and reasons as in the private sector.

As I did with so many other industries reviewed in the first two volumes of The Digital Hand, in this book I identify patterns of operations in the public sector that are remarkably consistent with the experiences of the private sector. For the truth is, public officials sought to use the new technologies of the second half of the twentieth century for many of the same reasons as managers in the private sector: to reduce operating costs, to increase throughput, to perform new work, and to provide new services. To be sure, their missions varied; they set policies and implemented them, regulated parts of the economy and specific industries, and played a vital role in nurturing the development of new technologies in support of the national economy, such as development of the computer itself. In the case of the computer, no government in the world did as much to support the rapid development of this technology over such a long period of time as did the combined collection of American government agencies. But our story is going to be about government as a user of the technology, not its inventor. We look at how public institutions acquired technologies and for what purposes. I describe the applications to which these were put and the expectations officials held for them. When possible, results and consequences are explored. To be quite formal about the scope: the story begins in about 1950, when computers were coming to the attention of public officials, and concludes with discussions of trends and events occurring in 2007.

There are some fundamental findings to report. First, while we can quibble about when a public institution adopted a specific use of computing, or how that compared to when the same occurred in the private sector, as a general statement public institutions embraced specific uses of computers at roughly the same time as did the private sector. Of course, there were exceptions, and those are pointed out quite explicitly. But I consider the general finding proof positive that management in the public sector learned about the continually evolving technology, and of its possible uses, at approximately the same time as the private sector. Furthermore, at approximately the same time, and through a similar process, they, too, made decisions to acquire, use, and replace technologies. In short, they were as plugged into the flow of information about the technology as any other part of the economy and were just as enamored with the managerial practices of their day as their counterparts in corporations and small companies.

Patterns of uses of computing paralleled those evident in other industries. Four in particular are significant. First, initial uses involved shrinking or
containing costs of labor and served largely to manage accounting functions more efficiently. As new uses of computing became possible, similar applications appeared in governments, schools, and health facilities: data entry (1960s), word processing (1970s), database management and bar codes (1980s), and the Internet to reach the public (1990s and beyond).

Second, as in private sector industries, applications specific to a mission of a public industry also stimulated use of this technology. Just as ATMs were created for the specific use of the Banking Industry, specific computer devices were developed for the public sector, such as command-and-control software and hardware embedded in weapons for the military and distance learning for education. Because of the massive purchasing power of public institutions, they could impose technological standards that filtered into the private sector and, in time, became dominant and ubiquitous. As this book goes to press in late 2007, the current example is the deployment of RFID technology in business supply chains, because the U.S. Department of Defense mandated application of the technology early in the new century. This is no different from what General Motors did to its thousands of suppliers in the 1970s and 1980s, when it set standards for electronic sharing of information, or from Wal-Mart’s insistence in the 1980s and 1990s that its suppliers conform to specific technical standards and use applications in support of the large firm’s mission.

Third, public institutions proved as aggressive as private industries in embracing digital technologies and the constantly arriving new tools. To be sure, there were exceptions, where adoption proved slow, or slower than in other industries; but even here it was for the same reasons: a specific technology or application either did not yet function well enough or was simply not cost effective, or the budgeting process moved more slowly than in a corporation. In short, I found no intrinsic feature of governmental operations that constrained the desire to use the technology. Budgets always served as a major influence in setting the pace of adoption and change, but as a general statement, that has always been the case in the private sector as well. Because public institutions can have such a pervasive effect on many industries (if for no other reason than size), these agencies could and did push ahead of the private sector in the use of communications and digital technologies, sometimes leading the way. For example, the early versions of what eventually became known as the Internet were used almost exclusively by universities and government agencies for over a decade before the private sector began to integrate this new form of communications into the fabric of its work and business strategies.

Fourth, the extent of use of digital and communications technologies proved substantial. We know a great deal about rates of adoption. Often these equaled or exceeded those of many industries in the private sector. When I began working on this book, I hypothesized that public institutions were fundamentally different from private ones, and to be sure there were and continue to be important differences between them. However, the evidence led me to conclude that when it came to using information technologies (IT) over the past half century, they were more similar than different. I address the reasons for that, and
the implications and lessons derived, more thoroughly in the final chapter. But one practical managerial lesson can immediately be called out: private sector industries can and should learn from the public sector how to innovate in their use of information technologies. Conversely, public sector managers can continue to do the same. I say can continue because the historic pattern is that they have been more willing to learn from the private sector how best to use computing than company officials have been to learn from their government cohorts. Finally, and perhaps most important, the continuous, unrelenting, and ubiquitous adoption of computing proved so massive that by the end of the century how people did their work in public sector industries had fundamentally changed, in both its look and feel and in what they actually did. Old services and functions were performed differently while new activities took place and, in the process, led to a new style of doing things. In the two earlier volumes of this study, I describe in detail the features of the new digital style, but essentially my argument has been that the changes were as substantial as when, for instance, manufacturing companies went from making things with craftsmen to mass production, in part facilitated by the availability of electricity and electric motors. Today, professors in many fields cannot do their research and writing without the use of computers; Internet crime could not be committed until the wide availability of this new communications tool; terrorists could not coordinate activities without benefit of cell phones and laptops; audits of tax returns for larger percentages of today’s population would not be possible without use of computers. In the Gulf War of 1991, the United States’ military headquarters was not in Basra, Iraq, or even in Kuwait; it was in Florida, again made possible by advances in communications and computers not available decades earlier. Quietly, in an evolutionary way, technologies seeped into every corner of the public sector of the economy; and while the changes were incremental, they had such a cumulative effect that by the end of the century we could understand why so much hyperbole about “computer revolutions” could be articulated by observers of the American scene.

The findings are derived from the study of some, but not all, public institutions. So, we need to understand the scope of the book you are holding. Slices of like-missioned portions of the public sector act very much as if they were independent, well-defined industries, much as companies in similar markets relate to each other within an industry. Because of that common behavior, it became quite easy and practical to discuss similar uses of computers and mutually reinforcing behaviors in the public sector as we might in the private sector. For that reason, half the chapters in this book are organized much like those in the prior two volumes of The Digital Hand. Thus, there is a chapter on higher education, since colleges and universities identify with each other and have a shared set of uses and experiences, including with computing and communications. The same applies to kindergarten through high school (K–12) and their school districts. Local governments (towns, cities, counties, and states) identify with each other and, like K–12 and higher education, have their own industry publications, conventions, associations, and so forth. So a chapter is devoted to local governments.
Other chapters, however, are organized around common activities that transcend local, state, and federal organizations; the individuals and agencies working within these lines of shared activity cooperate so closely that it was more important to discuss the uses of computing they shared. Law enforcement, the military, and tax collection are examples and thus are treated individually. In the case of law enforcement, that means discussing together the work of city police, county sheriffs, state police, and the Federal Bureau of Investigation (FBI), rather than providing individual histories of law enforcement in specific agencies. As with any industry, there is a common identity among law enforcement professionals or tax collectors, complete with their own industry-centric conventions, publications, and so forth. Finally, there is the proverbial “everything else,” which I have chosen to ignore, because those agencies either do not add significantly to the story I am telling or their uses of computing are more commonly evident across all agencies. Legislatures, for example, are today extensive users of e-mail and online applications that access data about legislation in process or files on constituents. So, I include a chapter that begins to document the cross-agency types of applications, because as a group they illustrate the extent of deployment of computing and communications technologies in the public sector.

My approach is a happy compromise between my desire to write histories of such organizations as the U.S. Department of State or the State of California and the need to describe enough of the public sector landscape so that, at the end of this book, you and I can be confident we have a reliable enough understanding of the patterns of deployment and the effects of computing on the daily work of public officials to draw some conclusions about such matters as patterns of adoption of various applications and their effects on organizations. This hybrid approach stands in sharp contrast to how government officials look at themselves. Go to any agency report, Web site, or history, and the first thing you will notice is that all accounts are organized as studies or histories of a specific agency or department. This would be tantamount in the private sector to writing accounts of a specific company rather than of an industry, or of a division within a large corporation. So, for some readers in the public sector, my approach may seem unnatural—and it is—but perhaps also enlightening, particularly in light of the fact that as this book goes to press, government agencies are outsourcing their work to other agencies and to the private sector, which, in the process, forces one to view the work of a public sector organization differently, more in line with the supply chain- or value chain-centric perspectives so widespread in the private sector. The Defense Department (DoD) has outsourced a vast amount of its nonmilitary work and IT; the U.S. Postal Service (USPS) formed a partnership with the private sector delivery service UPS in 2004; other cases could be cited, such as private firms doing trash pickup or snow plowing for a city. The point is that practices so evident in the private sector are seeping into the public sector, a key finding regarding the use of computing as well. In earlier volumes, I noted that practices in the public sector came into the private sector; in this book, we occasionally see the movement of ideas and practices from private to public sector.
Finally, I had to decide what to do about healthcare. That entire arena of economic activity consists of government regulators (e.g., the Food and Drug Administration), providers of services (e.g., state hospitals and U.S. Department of Veterans Affairs hospitals), and payers (e.g., Medicare, the Social Security Administration, and even the U.S. Treasury Department, which handles the flows of payments, such as checks). At the same time, private health insurance companies, doctors, clinics, hospitals, and benefits departments of corporations are involved. Universities (most of which are public institutions, though others are private) also do research and staff clinics and hospitals. In short, healthcare is a messy, ill-defined space in the American economy, one that flows across the private and public sectors. After carefully weighing the criteria for what was “in” and “out” of scope for this volume, I concluded that healthcare in America straddled many of the industries and agencies already described in the three volumes, so there would be little gained in devoting a full chapter to the subject at this time.

The federal, state, and local agencies selected for inclusion in this book were portions of governments that were extensive users of IT, or otherwise influential in the use of computing by other public and private sector organizations. Thus, there is much discussion about military uses of computing, the role of the FBI, and so forth, but near silence about many parts of the U.S. government, such as the intelligence community (e.g., CIA and NSA) and the Department of the Interior. At the state level, examples and trends are drawn from many states, but not all states are discussed. The same applies to schools, colleges, towns, and cities. Any one of these other entities could easily be the subject of its own article, chapter, or book, and I would encourage others to do research on the role of technology in specific departments and agencies so as to enrich our understanding of the role of the digital hand in the public sector. So as not to write a 3,000-page book, I chose to highlight agencies and examples that illustrated patterns rather than to strive for completeness. Nobody would read, let alone publish, a book of the size the story would otherwise require.

I chose the title The Digital Hand primarily to call attention to the insufficiently recognized influence of computing on how organizations, companies, industries, and national economies changed the way they worked. Changes came about as a result of this new class of technologies that entered the lives of people during the second half of the twentieth century. I will be the first to admit that many factors influenced the way work changed and organizations evolved in this period; the full story is not just about the consequences of computing or telecommunications. However, to draw our attention to the important role of technology, I have chosen an overt title. The title is also a tip of the hat to the work of the distinguished historian Alfred D. Chandler, Jr., whose analysis of the rise of the managerial class in business—his Visible Hand—so expertly described the rise of corporations and the new nature of work since the mid-nineteenth century. My work is not intended to replace his—Chandler worked on a larger canvas than I—but rather to build on his suggested lines of thinking by considering only one facet: the role of IT in the professionalization of work and management, and the evolution of strategy, structures, and scope of organizations. Once we
understand this source of important influence on the work of industries, future historians can place IT into a more proportional context that takes into account such other important influences as monetary policies, regulatory practices, politics, war, and environmental effects. I point this out because, although I noted this intention in the two prior volumes of The Digital Hand, it is easy to forget that other influences were at work; I simply do not have the space in this book to discuss them and still explore the role of IT at the level of detail required for a realistic understanding of its influence. So I will brave the risk of a critic arguing that I am too narrowly focused.

In the spirit of true candor, I should note that Professor Chandler’s ideas enormously influenced my views on business history; we also were friends and collaborated in examining how computing and other information technologies came into American society over the course of more than two centuries and influenced modern business. In the small world of business historians, scholars take sides either in support of his views or critical of them. So there is no doubt about my worldview: I am a Chandlerian. Also in the spirit of candor, I did not ask him to critique or bless this project; this is a work of my own doing, and the design of its organization and content is a result of what my own research suggested made sense. While this book was in production, the sad news came that Professor Chandler had died at the age of eighty-eight. His voice has been silenced, but I hope that his influence will continue to energize historians and economists.

For the same reasons of controlling scope as when I looked at over thirty industries in the private sector, I focused only on the United States, clearly the earliest and most extensive user of computers in the world (a situation rapidly beginning to change as computing spreads around the globe). This historical example thus has much to teach those who ponder the role of computing in Asia, the greater Europe (not just the European Union), Latin America, and parts of Africa. For the research methodology, I direct the reader to the first appendix of volume one, which covered manufacturing and retail industries; the same methodology is used in this book.

I want to explore in this book some of the same issues analyzed in the two prior volumes. I want to know when computing came to the public sector and for what uses. How did that happen and why? We know, for example, that for the entire period federal agencies spent more on computing than many private sector industries. Why? How? To what extent did industry-centric applications influence the types and rate of adoption of computing from one industry to another? How much influence did practices in one industry have on another, for example, computing in banking on government, or the military on manufacturing firms? In manufacturing, we found that over time computing and telecommunications linked suppliers to large manufacturers in tight codependent relationships that today would be impossible to imagine breaking. Has that been happening in any of the public communities studied in this third book? A series of questions can also be asked regarding how computing affected economic activities in the public sector as compared to influences in the manufacturing and services sectors. The immediate answer is that the digital affected public sector industries in similar
ways to manufacturing, retailing, and services, and simultaneously in different ways. Finally, what are the implications, particularly for management in the fast-growing parts of the public sector, for the look and feel of contemporary American society?

The endnotes are rich in the detail required to document my sources. Where the literature on a subject is not so well known, as in the case of the Internet, I have added material to help those who wish to explore further the themes discussed in the book. The subject of this book is vast, and there is a large and growing body of contemporary material on the subject that can be examined. I hope that others can flesh out details I could not address in the trilogy. As with the previous volumes, I have had to be almost arbitrary in selecting materials and portions of the public sector to examine, and to limit the length of the discussion to keep it within the confines of one book, because I have had to write about a very broad set of issues. In many instances, I have had to generalize without fully developing explanations. For this I ask the reader to forgive me, or at least to understand. The views I express in this book, and the weaknesses you encounter, are not those of the good people who helped me, my employer (IBM), or my publisher.

This book would not have been possible to write without the help of many people. Nancy Mulhern, an expert on government publications and librarian at the Wisconsin Historical Society and University of Wisconsin, did more to shape my research agenda, and the research strategy underpinning the project, than I can ever fully describe. Most historians stand in terror in front of hundreds of shelves of obscure government publications; she made those reports my friends and allies. James Howard, an archivist for the U.S. Air Force, introduced me to the study of the history of the modern military establishment, while historian Alfred Goldberg, of the Department of Defense, proved helpful in teaching me how the department had evolved. At the Bureau of the Census, I received help from David M. Pemberton and William Maury, while at the Social Security Administration Larry DeWitt did the same. Megaera (Meg) M. Ausam and James Golden, both at the United States Postal Service, shared information and insights and then critiqued my discussion of the USPS, all to the betterment of the book. At the IRS, Terry Lutes was of extraordinary help in ensuring that I got my facts right, and a highly experienced private sector tax preparer, Linda Horton, advised me on the role of computing in tax collection and preparation. Dr. William J. Reese, at the University of Wisconsin-Madison, provided advice on my work on education in America. Dr. Richard N. Katz, vice president of EDUCAUSE, was very helpful in advising me on how to deal with the very large issue of computing in higher education. I received constant help regarding the role of the Internet from Professor Shane Greenstein at the Kellogg School at Northwestern University, while historian Walter Friedman at the Harvard Business School gave me various opportunities to articulate my thoughts and findings, valuable tests before solidifying them in this book. Paul Lasewicz, IBM’s archivist, and his staff kept sending me materials for years that were of enormous value and unavailable elsewhere. The staff at the Charles Babbage Institute, the world’s largest research center and archive
devoted to the history of computing, did the same. To put things in perspective, each archive provided me with several thousand pages of archival material and had tens of thousands of pages of additional records I simply could not find time to read. Many agencies have archival materials relevant to the kind of study I did that are not catalogued but that they are willing to let historians study; to so many of them, my thanks.

Finally, I want to publicly thank the editorial and production team at Oxford University Press. You know you are working with a world-class organization when it can handle such a large project as if it were a routine event and yet make the author feel, as I did, that he or she was its sole contributor. This project represented an enormous investment of time, staff, and budget on the part of the press; it bet that a business manager could write a large and useful history consistent with the press’s standards and mission. And like my wife Dora, who has had to tolerate my discussions about the digital hand for over fifteen years, the editors and production staff quietly kept me grounded and focused on the project. Despite the help of so many people in various forms, this book may still have weaknesses and failings, and they remain my responsibility.

Finally, I want to say something about the dedication. Erwin Tomash was an executive in the world of information processing in its formative years and the first individual to recognize that the history of computing needed to be preserved and told professionally. He established both the Charles Babbage Foundation (CBF) and the Charles Babbage Institute (CBI) at the University of Minnesota, which has helped dozens of historians learn about the history of computing and equipped them with the skills to conduct research on the subject. Professor Arthur L. Norberg, of the University of Minnesota, devoted his career to the study of the history of computing, modeling the way for many of us, and ran CBI brilliantly for many years. Finally, I want to publicly recognize Professor William Aspray, of Indiana University, who is also a distinguished historian of computing but who has perhaps done more behind the scenes than anyone in the world to mentor young historians, advise senior ones, and, in the process, fundamentally shape the research agenda of a generation of historians of computing.
CONTENTS

1. Presence of the Public Sector in the American Economy
2. Digital Applications in Tax and Financial Operations
3. Digital Applications in Defense of the Nation
4. Digital Applications in Law Enforcement
5. Digital Applications in the Federal Government: The Social Security Administration, the Bureau of the Census, and the U.S. Postal Service
6. Role, Presence, and Trends in the Use of Information Technology by the Federal Government
7. Digital Applications in State, County, and Local Governments
8. Digital Applications in Schools
9. Digital Applications in Higher Education
10. Conclusions: Patterns, Practices, and Implications

Notes
Bibliographic Essay
Index
1
Presence of the Public Sector in the American Economy

No single technological advance has contributed more to efficiency and economy in government operations than the development of automatic data processing equipment.
—Howard D. Taylor, 1965

The very fact that the services of general government are not sold means that there is no market valuation in the conventional sense and no prices whereby the estimated value of output might be deflated.
—John W. Kendrick, 1961
John W. Kendrick published one of the earliest comprehensive studies of productivity in the American economy in 1961. Like so many economists who have studied the role of government and other sectors, he faced difficulties in measuring its performance. The quote above reflected a specific problem he faced, yet one that remains with us today. He recognized the importance of understanding the public sector by using techniques similar to those deployed by economists to observe the private sector. However, Kendrick also concluded that alternative means of understanding the economic impact of the public sector were possible; his message permeates this book, beginning with this chapter.1 The importance was reinforced by Howard D. Taylor in the quote above: computing has been seen by public officials as an important tool for improving productivity in the public sector since the 1950s, and continues to be so
viewed at the dawn of the new millennium. At the time of his statement (1965), Taylor was the Regional Director of the U.S. Internal Revenue Service in New York and thus was one of those public officials. He further noted that “today’s largest—and first—user of ADP for business management purposes” was the federal government.2

At the dawn of the twenty-first century, the town, city, state, and federal governmental portions of the public sector together accounted for approximately 19 percent of the nation’s GDP (state and local about 12 percent, federal another 7 percent). Over the decades, the percentage rose or fell a point or two, often depending on whether the nation was at war. These elements of the economy generally employed about 16 percent of the nation’s total workforce, making the public sector the second largest component of the economy after services, and larger than manufacturing. While the public sector had grown during the second half of the twentieth century, even in 1950 it already was a major component of the economy, the result of the New Deal agencies of the 1930s expanding in size, along with the growth of the federal government during World War II and in the early days of the Cold War. The GI Bill also pumped millions of dollars into higher education, beginning in the late 1940s. That infusion of funds was augmented by additional billions in support of research conducted by higher education from the 1950s through the 1980s, largely in direct response to Cold War requirements. State, county, city, and town governments also expanded during the entire period for myriad reasons. The volume of taxes collected grew over the entire period as a direct result of an expanding, generally prosperous economy and in response to the need for funding in support of government’s increasing activities, most notably the Cold War and later the “War on Terrorism.” The percent of the total economy dedicated to taxes (called receipts in government parlance) did not vary wildly across the half century, but expenditures did, as various governments and administrations expanded or shrank their use of deficit spending.

Over the half century, public officials wove every major information technology into the creation and operation of the functions of government and into the expenditure of funds. Early on (1940s–1950s), deployment of IT focused on building out the use of calculators and adding machines—the existing “high tech” devices of the day—for applications identified as useful before World War II. Governments and educational institutions were already extensive users of these kinds of information-handling equipment. In the case of IBM and its predecessor companies, for example, the U.S. government had been their largest customer since the 1890s.3 State and local governments started using adding machines and typewriters before World War I. For most suppliers of “office appliances,” and later of computers, software, and services, the same held: public sector organizations were typically their largest customers or, at least, one of their biggest market segments. The larger an institution was, the more that proved to be the case. So, federal agencies remained the most important, while individual schools and their districts were the least significant, as measured by volume of sales.
This pattern of size affecting the extent of deployment of IT mimicked what occurred across the economy at the same time, particularly in large organizations and companies. For while I have demonstrated elsewhere that use of IT permeated organizations of all sizes, the fact remains that the largest enterprises and government agencies tended to be the earliest and most extensive users of all forms of IT equipment throughout the twentieth century.4 Since the public sector had some of the largest organizations in the economy, it should be no surprise that it would absorb a substantial quantity of every new form of IT to come along during the second half of the century and, in particular, computers and telecommunications. Their large size simultaneously created economic incentives to look for ways to control costs while providing additional services, and even new ones, to the expanding population of the United States. Operating within the context of a growing, prosperous economy added other influences as well.5

Thus, to begin understanding the kinds of uses to which governments and other public institutions put IT and telecommunications, we need to appreciate how big this sector was and how its various components made up the grand total. To simplify the exercise, one can look at the number of employees in the sector, tax revenues, and expenditures over time. Those data points collectively give us a quick snapshot of some macro trends that over time proved influential in the adoption of IT in this sector. Subsequent chapters describe specific uses of various technologies, the rationale for their adoption, and, where possible, the extent of deployment. One trend that each chapter spells out is how uses of computing came in incremental waves, one washing over the other as new generations of technology appeared on the market or the consequences of prior uses manifested themselves, contributing to the gradual evolution of many work functions over time. Eventually, the presence of computing technology embedded in the activities of various agencies contributed to the extent of growth or shrinkage of an organization, to how it was organized, to what its budgets could be, and to what services it might provide. So, we will see agencies initially install computers and other IT to drive down costs and improve efficiencies, and eventually arrive at a point where how government organized itself was driven in important ways by the capabilities and effects of the technology. It is a process still under way, unfolding as this book was being written.

One quick example will have to suffice to illustrate the point. When the 9/11 Commission studied the causes of the terrorist attacks of September 2001 and made its recommendations about what to do going forward, it focused largely on the failure of various agencies to share information. One of its most important recommendations to the president was to consolidate various intelligence and law enforcement agencies so that they could share information and gain access to each other’s databases. The commission’s final report read as much like an IT organization strategy as it did a blueprint for responding to terrorists. Several quotes from the report’s key findings make the point:

The U.S. government has access to a vast amount of information. When databases not usually thought of as “intelligence,” such as customs or
immigration information, are included, the storehouse is immense. But the U.S. government has a weak system for processing and using what it has.

Recommendation: Information procedures should provide incentives for sharing to restore a better balance between security and shared knowledge.

We propose that information be shared horizontally, across new networks that transcend individual agencies. A decentralized network model, the concept behind much of the information revolution, shares data horizontally too. Agencies would still have their own databases, but those databases would be searchable across agency lines.6
How public officials could reach the juncture of being so dependent on IT is a story that has its origins decades earlier, when computers first became available. But before we tell that story, we need to know what made up the public sector.
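The commission's decentralized model is, in effect, a federated query: each agency keeps its own database, and a search fans out horizontally across them, with results labeled by source rather than pooled in one central store. The sketch below illustrates the idea in Python; the agency names, records, and field names are hypothetical illustrations, not drawn from the report:

# A minimal sketch of the decentralized model the 9/11 Commission
# describes: agencies keep their own records, and a query is shared
# horizontally across them. All names and records are hypothetical.
from dataclasses import dataclass, field


@dataclass
class AgencyDatabase:
    name: str
    records: list[dict] = field(default_factory=list)

    def search(self, term: str) -> list[dict]:
        # Each agency answers over its own data; nothing is centralized.
        term = term.lower()
        return [r for r in self.records
                if any(term in str(v).lower() for v in r.values())]


def federated_search(agencies: list[AgencyDatabase], term: str) -> dict[str, list[dict]]:
    # Fan the query out; keep results labeled by the agency that holds them.
    return {a.name: hits for a in agencies if (hits := a.search(term))}


if __name__ == "__main__":
    customs = AgencyDatabase("Customs", [{"traveler": "J. Doe", "port": "Miami"}])
    immigration = AgencyDatabase("Immigration", [{"visa": "B-2", "holder": "J. Doe"}])
    print(federated_search([customs, immigration], "doe"))

Run as a script, the query "doe" returns one hit from each agency, each still held and labeled by its owner; that is the horizontal sharing the report contrasts with a single central database.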
The Federal Presence

The federal presence in the American economy grew all through the period. From 1950 through the early 1990s, the number of civilian and uniformed military employees grew, as illustrated in table 1.1. During the entire period, however, the number of federal employees per 1,000 employees in all sectors of the American economy actually declined. In the early 1960s, for example, there were about 13 federal employees for every 1,000 employees across the economy; the figure peaked at 14 in 1970, declined to between 12.4 and 12 in the 1980s, dropped to 10 in the 1990s, and in the new century ranged from 9.1 to 9.4. Yet if we add in state and local government personnel, who constituted roughly half the employment base for all governments, the number per thousand workers employed essentially doubled. For most of the period, military personnel made up about half of the federal government's total. The reason for so many military employees grew largely out of a combination of Cold War requirements, the needs of the Vietnam War, and later conflicts in the Middle East and Afghanistan.
Table 1.1 Federal Government Employees, 1950–2005 (thousands)

Year        1950   1960   1970   1980   1990   2000   2003   2005
Employees   3,421  4,875  6,085  4,965  5,234  4,129  4,210  4,196

Note: All annual figures include uniformed and civilian employees. All tables in this chapter include data for 2005, the latest year for which information was available.
Source: U.S. Department of Commerce, Bureau of the Census, Historical Statistics of the United States: Colonial Times to 1970 (Washington, D.C.: U.S. Government Printing Office, 1975): Part 2, pp. 1102, 1141; U.S. Government, Fiscal Year 2005 Historical Tables of the U.S. Government (Washington, D.C.: U.S. Government Printing Office, 2004): 301; “U.S. Employment and Labor Force,” http://www.dod.mil/comptroller/defbudgt/fy2005/fy2005_greenbook.pdf (last accessed 3/06/2006).
There were more people in uniform in the 1960s (over 3 million) than in the 1980s (averaging about 2.1 million); in the 1990s that number dropped (1.4–1.5 million).7 While analyzing population data can be a tricky proposition, the general trend is clear: the proportion of the total American workforce in the employment of the U.S. government remained relatively constant, although the absolute numbers were high, ranging from over 3 million to just in excess of 6 million people.

A second way to understand the scope of the federal government’s presence is by looking at taxes and other receipts over the period. Table 1.2 summarizes the totals over time. First, what becomes quickly obvious is that the absolute number of dollars coming into the federal coffers from taxes, fees, and fines is enormous and grew in volume over time. Second, and more telling, as a percent of Gross Domestic Product (GDP), these receipts increased by some 50 percent, from a low of 14.4 percent of the economy’s output to nearly 21 percent, then declined as tax reductions initiated at the turn of the century took hold. Later in this chapter, we review similar kinds of statistics for state and local government, which also show a growing percent of GDP, from just over 5 percent of the total to an average of nearly 11 percent late in the century. In short, as measured by income, the federal government again appears as a major player in the economy.8

If we move to the other side of the balance sheet—to expenditures, which include deficit financing and hence can exceed income—we gain another view of how large a role this government played. The key data are displayed in table 1.3. As with income, expenditures were large. From the 1950s into the early 1980s, expenditures roughly matched incomes; then expenditures exceeded receipts by a few percent to 20 percent until the years of the Clinton administration, when expenditures actually declined to slightly lower volumes than receipts. The situation then reverted to the pattern of the 1980s in the early years of the new century. So, budgetary outlays throughout the half century ranged from just over 14 percent of GDP to more than 20 percent at the dawn of the new century.9

Table 1.2 Federal Receipts, 1950–2003

Fiscal Year   Total Government Receipts ($ Billion)   As Percentages of GDP
1950             56.6                                  14.4
1960            131.3                                  17.8
1970            288.9                                  19.0
1980            776.1                                  19.0
1990          1,032.0                                  18.0
2000          3,084.9                                  20.9
2003          2,927.6                                  16.5
Source: U.S. Government, Fiscal Year 2005 Historical Tables of the U.S. Government (Washington, D.C.: U.S. Government Printing Office, 2004): 288.
Table 1.3 Federal Expenditures, 1950–2003 (billions of dollars)

1950      42.6
1960      92.2
1970     195.6
1980     590.9
1990   1,253.2
2000   1,788.8
2003   2,157.8
Source: U.S. Government, Fiscal Year 2005 Historical Tables of the U.S. Government (Washington, D.C.: U.S. Government Printing Office, 2004): 289.
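The “up to 20 percent” figure above can be checked directly against the 1990 rows of tables 1.2 and 1.3; a one-line computation in Python, using only the printed figures:

# 1990: receipts from table 1.2, outlays from table 1.3 (both $ billion).
receipts_1990, outlays_1990 = 1032.0, 1253.2
print(f"outlays exceeded receipts by {outlays_1990 / receipts_1990 - 1:.1%}")  # 21.4%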
As impressive as the raw statistics on numbers of employees and dollars taken in and spent are, they do not tell the whole story, a tale that we will not fully discuss in this book. One should keep in mind, though, that in some instances over the years public funds went toward stimulating the development of new technologies and science. That computing advanced so rapidly in its technical evolution from the 1940s through the 1960s, for example, can be attributed to the millions of dollars spent by the U.S. government in support of early computing projects and to outfit the military branches with computers and microelectronics embedded in weapons systems and avionics.10 That story has been told by many, so it is enough here to acknowledge the first- and second-order effects of expenditures on computing in America. Almost every computer development project from the 1940s through the mid-1950s was funded almost entirely by the U.S. government. Many of the early components that comprised the Internet, before the availability of Web browsers, were also the result of federal largesse.11
State and Local Governments

State and local governments also held a prominent position in the economy during the second half of the twentieth century. As table 1.4 demonstrates, the number of public employees at the state and local level, from governors to teachers, police and park personnel, city workers and county game wardens, and all other government workers, remained high during the entire period. Furthermore, they substantially outnumbered federal employees, increasingly so as time passed (compare table 1.1). While there were many reasons for this trend, not the least of which was the conscious effort of the federal government to transfer responsibilities for various programs to the states (which in turn often moved duties and obligations to local government), the numbers were impressive. Now let us examine revenues and expenditures to round out our view of the size of the state and local government presence in the economy. As table 1.5 indicates, the volume of tax receipts taken in during the period proved enormous. As a percent of GDP, income doubled over the half century. Why did the share of GDP double while dollar flows grew by nearly two orders of magnitude? All through the half century, the U.S. economy also grew enormously, as measured
Table 1.4 State and Local Government Employees, 1950–2005 (thousands)

Year        1950   1960   1970   1980    1990    2000    2003    2005
Employees   4,098  6,083  9,822  13,375  15,219  17,925  18,745  19,000

Note: The data also include school employees.
Source: U.S. Department of Commerce, Bureau of the Census, Historical Statistics of the United States: Colonial Times to 1970 (Washington, D.C.: U.S. Government Printing Office, 1975): Part 2, p. 1104; U.S. Government, Fiscal Year 2005 Historical Tables of the U.S. Government (Washington, D.C.: U.S. Government Printing Office, 2004): 301; “U.S. Employment and Labor Force,” http://www.dod.mil/comptroller/defbudgt/fy2005/fy2005_greenbook.pdf (last accessed 3/6/2006).
Table 1.5 State and Local Government Tax Receipts, 1950–2003

Fiscal Year   Total Government Receipts ($ Billion)   As Percentages of GDP
1950             17.1                                   6.3
1960             38.8                                   7.5
1970             96.1                                   9.5
1980            259.0                                   9.5
1990            615.0                                  10.7
2000          1,059.7                                  10.9
2003          1,145.3                                  10.6
Source: U.S. Government, Fiscal Year 2005 Historical Tables of the U.S. Government (Washington, D.C.: U.S. Government Printing Office, 2004): 288.
by Gross National Product (GNP), from $284.8 billion in 1950 to $503.7 billion in 1960 (in 1958 dollars), during the years of early and limited adoption of computing in the public sector of the economy.12 Next, the economy as a whole expanded all through the years in which extensive use of computing occurred. If we switch to the preferable measure of GDP, in 1970 that totaled just over $1 trillion, grew to $2.8 trillion in 1980, then to $5.8 trillion in 1990, and in 2000 reached nearly $9.9 trillion. Even in the recession year of 2001, GDP climbed to $10.2 trillion. In short, the economy was large, expansive, and could comfortably afford to support a growing public sector.13
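The answer can be made concrete with the first and last rows of table 1.5: receipts grew about 67-fold, while the implied GDP (receipts divided by their share of GDP) grew about 40-fold, so the share itself rose by roughly the ratio of the two, about 1.7, close to the doubling just described. A minimal worked check in Python, using only the table's own figures:

# Worked check of the share arithmetic above, using only the 1950 and
# 2003 rows of table 1.5; GDP is implied as receipts / share.
receipts_1950, share_1950 = 17.1, 0.063      # $ billion, fraction of GDP
receipts_2003, share_2003 = 1145.3, 0.106

gdp_1950 = receipts_1950 / share_1950        # ~271 ($ billion)
gdp_2003 = receipts_2003 / share_2003        # ~10,805 ($ billion)

print(f"receipts grew {receipts_2003 / receipts_1950:.0f}-fold")  # 67-fold
print(f"implied GDP grew {gdp_2003 / gdp_1950:.0f}-fold")         # 40-fold
print(f"share of GDP rose {share_2003 / share_1950:.2f}x")        # 1.68x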
How much did state and local governments spend, that is to say, how much did they inject back into the economy, such as for the purchase of computers, software, and salaries, in addition to their myriad other obligations? Table 1.6 indicates the volumes and trends.

Table 1.6 State and Local Expenditures, 1950–2003 (billions of dollars)

1950      16.0
1960      34.6
1970      88.0
1980     250.2
1990     605.3
2000   1,004.8
2003   1,156.8
Source: U.S. Government, Fiscal Year 2005 Historical Tables of the U.S. Government (Washington, D.C.: U.S. Government Printing Office, 2004): 289.
Table 1.7 Number of Governmental Units by Type, 1952–2002

Type of Government    1952    1962    1972    1982    1992    2002
Federal                   1       1       1       1       1       1
State                    48      50      50      50      50      50
County                3,052   3,043   3,044   3,041   3,043   3,034
Municipal            16,807  18,000  18,517  19,076  19,279  19,431
Towns & townships    17,202  17,142  16,991  16,734  16,656  16,506
Source: U.S. Department of Commerce, Statistical Abstract of the United States: 2002: The National Data Book (Washington, D.C.: U.S. Government Printing Office, 2002): 260.
For the most part, state and local governments spent what they received, often more as a by-product of laws and state constitutional requirements that they live within their means than because of any particularly disciplined approach to fiscal management. It is a good point to keep in mind: state and local governments did not have as much flexibility as the national government to spend more, and thus felt a continuous, even greater pressure than federal agencies to find ways to slow or contain expenses. To that end, they turned frequently to IT for assistance.

Since budgets for such things as computers and communications networks reside in departments and agencies, rather than in the hands of individual employees, it is useful to understand how many such organizations existed, just as it is useful to know how many companies there are in an industry. The data in table 1.7 were prepared by a U.S. government agency and document how many such units there were at the local level. The information can mislead, however, when it comes to state and federal agencies, since there are literally scores of agencies in every state and hundreds in the federal government. Furthermore, the data leave out such other governmental units as the commonwealth government of Puerto Rico and other possessions of the United States in the Pacific. In subsequent chapters, more granular information is provided (such as the number of law enforcement agencies by type). That said, note the large number of county, town, and municipal entities listed in table 1.7; over time they all became users of computers and telecommunications. Add in the various state government agencies, and one begins to see that the user base for computing throughout the entire period grew enormous. If we add in school districts and
Table 1.8 Total Number of Local Governmental Units, 1952–2002

Year    1952      1962     1972     1982     1992     2002
Total   116,756   91,186   78,218   81,780   84,955   87,849
Source: U.S. Department of Commerce, Statistical Abstract of the United States: 2002: The National Data Book (Washington, D.C.: U.S. Government Printing Office, 2002): 260.
other specialized districts and entities, the numbers remain quite high for the entire period. Table 1.8 documents that pattern. Note also that a combination of consolidations and closures of agencies occurred during the period; one can reasonably assume that agencies that absorbed the responsibilities of a closed organization grew in size over time, putting them in a better position to want and afford digital tools. School districts consolidated the most during the half century into ever larger governmental units, while the number of “special districts” increased. This latter category included such organizations as public housing authorities and others that managed irrigation and power, often generating their revenues from rents and fees.
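The consolidation-versus-growth pattern can be read directly off the tables: subtracting the county, municipal, and town units of table 1.7 and the school districts of table 1.10 from the local totals of table 1.8 leaves, as a residual, the special districts just mentioned. A short check in Python, on the assumption (suggested by its title) that table 1.8 counts only local units:

# Residual "special districts" implied by tables 1.7, 1.8, and 1.10.
local_1952, local_2002 = 116756, 87849        # table 1.8 totals
general_1952 = 3052 + 16807 + 17202           # table 1.7: county, municipal, towns
general_2002 = 3034 + 19431 + 16506
schools_1952, schools_2002 = 67355, 13522     # table 1.10

print(local_1952 - general_1952 - schools_1952)   # 12,340 special districts (1952)
print(local_2002 - general_2002 - schools_2002)   # 35,356 special districts (2002)

The residual nearly triples even as school districts fall by roughly four-fifths, which is the divergence the paragraph above describes.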
K–12 and Higher Education

The majority of schools in the United States during the second half of the twentieth century were publicly funded and run. Because computers were not used in any substantial quantity in schools until the early 1980s, when personal computers became widely available, it is more important to understand how many schools there were in the last two decades of the century than in prior decades. More important still are the school districts, which, by virtue of being responsible for managing multiple schools, had the size and economic wherewithal to spend money on computers to handle such accounting and administrative functions as payroll, class assignments, and grades. Table 1.9 catalogs the number of elementary and secondary schools. The number became especially important in the 1990s, when the Clinton administration launched its initiative to connect every school in the country (over 80,000 of them) to the Internet, along with their school districts, which added another 14,000 organizations into the mix (see table 1.10). Note that over time school districts consolidated into larger organizational units, even as the number of students rose from over 48 million in 1980 (as PCs were just starting to spread across the economy) to over 52 million in the 1990s.14 The number of teachers grew from just over 2.2 million in 1980 to nearly 2.8 million in 2000.15

Higher education experienced an extraordinary period of growth and prosperity throughout the half century. Its institutions became extensive users of all types of computing across administrative and research functions, and to a limited
Table 1.9 Number of Elementary and Secondary Public and Private Schools in the United States, 1950–2002

Year   Elementary   Secondary
1950   138,600      27,873
1960   105,427      29,845
1970    80,172      29,122
1980    61,069      24,362
1990    61,340      23,460
2000    64,131      22,365
2002    70,516      27,468
Source: For 1950–1970, U.S. Department of Commerce, Bureau of the Census, Historical Statistics of the United States: Colonial Times to 1970 (Washington, D.C.: U.S. Government Printing Office, 1975): Part 1, p. 368; for 2000, U.S. Department of Commerce, Statistical Abstract of the United States: 2002: The National Data Book (Washington, D.C.: U.S. Government Printing Office, 2002): 147; “Digest of Education Statistics, 2003,” http://nces.ed.gov/programs/digest/d03/tables/dt085.asp (last accessed 3/06/2006).
Table 1.10 Number of School Districts in the United States, 1952–2002

Year      Number of Districts
1952      67,355
1962      34,678
1972      15,781
1982      14,851
1992      14,422
2002      13,522

Source: U.S. Department of Commerce, Statistical Abstract of the United States: 2002: The National Data Book (Washington, D.C.: U.S. Government Printing Office, 2002): 260.
Table 1.11 documents the number of public and private chartered institutions and includes junior and four-year colleges and universities. I include public and private universities because even private schools were the beneficiaries of extensive funding by the public sector, particularly for research and even library applications. In 1950, there were approximately 247,000 faculty members. In 1970, that population had swelled to 729,000 and, by the end of the century, to over a million.16 The army of students they taught also grew throughout the period, from 2.2 million in 1950 to 8.5 million in 1970. During the 1970s, enrollments rose massively, largely due to the baby boomers going to college, with enrollments exceeding 12 million in 1980 and rising to over 13.8 million in 1990 as their children also enrolled in postsecondary education programs. The surge in enrollments extended right to the end of the century, with the number of students approaching 14.8 million in 1999/2000.17
Table 1.11 Number of Institutions of Higher Education, 1950–1999

Year      Institutions
1950      1,863
1960      1,959
1970      2,556
1980      3,231
1990      3,559
1999      4,084

Source: For pre-1970, U.S. Department of Commerce, Bureau of the Census, Historical Statistics of the United States: Colonial Times to 1970 (Washington, D.C.: U.S. Government Printing Office, 1975): Part 1, p. 382; for post-1970, U.S. Department of Commerce, Statistical Abstract of the United States: 2002: The National Data Book (Washington, D.C.: U.S. Government Printing Office, 2002): 165. Note that these statistics are for public and private nonprofit institutions. In chapter 9, data is presented that suggests that in the early 2000s there were over 9,000 institutions, a count that includes corporate and for-profit postsecondary institutions.
If we total K–12 and higher education, we begin to understand how large the Education Industry was in the United States. Taking 1980 as our starting point— by which time computers were in wide use in higher education, were being broadly deployed in school districts, and were just starting to enter elementary and secondary schools—there were 60 million people going to school, taught by several million other individuals. By the end of the 1990s, approximately 65.8 million were in school, while an additional 3.8 million were teaching.
Summary

By collecting together the summary data for the number of public sector employees and comparing them to the total workforce, we can further clarify the size of government in the economy. Table 1.12 does just that, using employment figures from the mid-1960s forward, when uses of computing became substantial in the sector. Earlier, we had already established the contribution to GDP made by the sector in total. This table only counts employees working for a governmental institution and includes, for example, teachers and professors at public schools and universities. By calculating what percentage of the total American workforce (in the public and private sectors) these employees represented, we see that the proportion is roughly 15–17 percent, a fairly constant amount.

To be sure, the data has all kinds of problems. For example, we know that governments outsourced a great deal of work in the 1990s, particularly the federal government, which otherwise would have had more workers, resulting in a larger proportion of the total workforce being in the public sector. Totals from one calculation or one agency to another also do not always match. However, as noted earlier, the trends are consistent enough to make the data useable. This applies to all three categories of information: employment, revenues (taxes), and expenditures.

The data raises intriguing questions tied to Kendrick's original interest in measuring productivity and related to the use of computers and telecommunications. For one thing, we would expect that as the amount of computing agencies deployed increased, their productivity would also improve.
Table 1.12 Comparison of Federal, State, and Local Employees to Total U.S. Workforce, 1965–1995 (thousands)

Year    Total Federal    Total State & Local    All Government    Total U.S. Workforce    % of Total U.S. Workforce
1965    5,215            7,696                  12,911            77,178                  16.7
1975    5,061            11,937                 16,998            94,793                  17.8
1985    5,256            13,519                 18,775            115,461                 17.6
1995    4,475            16,484                 20,959            132,304                 15.0

Note: U.S. workforce only includes civilian employees. Source: For pre-1970, U.S. Department of Commerce, Bureau of the Census, Historical Statistics of the United States: Colonial Times to 1970 (Washington, D.C.: U.S. Government Printing Office, 1975): Part 1, p. 127; for post-1970, U.S. Department of Commerce, Statistical Abstract of the United States: 2002: The National Data Book (Washington, D.C.: U.S. Government Printing Office, 2002): 367; and for 1975 from ibid., 1974 edition, 356; and U.S. Government, Fiscal Year 2005 Historical Tables of the U.S. Government (Washington, D.C.: U.S. Government Printing Office, 2004): 301.
However, since almost all industries were also installing computers for similar applications and at approximately the same time, it should become difficult to differentiate public from private sector productivity directly attributable to computing, applying the old saw that "rising waters raise all boats." We will have other occasions to discuss the "productivity paradox" in this book, but suffice it to say that the issue is more than just about productivity. The deployment of computing in public institutions proved so extensive that fundamental ways of doing work were altered to the point that prior methods were no longer possible to use (e.g., tracking aircraft flying toward airports without use of computers). Furthermore, there now were new and different services that both the public at large and officials in agencies had come to see as part of what they wanted and did (e.g., communicating with citizens and each other via e-mail). That change in the style in which this sector went about its business is the more important story.

That said, however, we need to acknowledge that productivity measures in government have long intrigued observers and economists.18 In recent years, economists have engaged in an extensive debate about what kinds of data should be collected by the U.S. government regarding productivity in the services sector, because it has become increasingly clear that current data collections are not adequate.19 There is the additional problem that such data sets do not take into account the short- and long-term structural changes in productivity that grew directly out of the extensive use of IT.20 To make matters worse, we are left with woefully insufficient data about productivity in the public sector, but with an abundance of such information about manufacturing, and a slowly growing body of data regarding services.
By looking at the fundamental "street level" uses of computing in the public sector, the whole discussion of productivity and effectiveness in government can be informed, suggesting possible avenues for documenting efficiencies in the public sector. Moving from a discussion of comparative statistics—a favorite and useful exercise of economists—to the less abstract historical narrative that describes how this important collection of technologies was used moves the whole discussion to a different plane. This holds out the hope that behind "the numbers" there is a more three-dimensional human understanding of how public institutions functioned in recent decades, one that contributes to our appreciation of how they go about their work today. For public officials, this exercise moves insights from statistical abstracts to managerial "best practices" and identifies patterns of adoption and deployment that can be applied to future decisions on acquiring and using IT.

The road ahead is an exercise in business history, with an injection of economic data (but not necessarily theory), and a fairly disciplined avoidance of the technical history of computing and telecommunications. The chapters describe when computing came into use, why, and for what applications and, where evidence is available, document results and, certainly, expectations. This will not be a debate about what policies were implemented or why, unless they directly concerned use of IT. Rather, our main concerns will be managerial, and in the process we will see that many practices evident in the public sector were quite similar to what existed simultaneously in the private sector, providing yet another line of evidence that the economy of the United States evolved broadly into new forms across so many industries—a basic theme of the trilogy of The Digital Hand.

Because taxation, defense, and law enforcement transcend all manner of local, state, and federal agencies in one form or another, we will look at those first. Next we move away from heavily federally centered themes to deeply local ones, by examining state, county, and municipal governments and education. But it always seems that one cannot begin any discussion about government without raising the issue of taxes. For that reason, the next chapter is devoted to this topic and to the extensive use of computers deployed in their collection and expenditure.
2

Digital Applications in Tax and Financial Operations

It is not an understatement to say that every aspect of the IRS's mission has been and will continue to be affected by the technology revolution. For the IRS, it is impossible to look at even our recent past without seeing the enormous impact of information technology (IT).
—Tim Brown, 1990
The most widely recognized agency within the U.S. government lies inside the Department of the Treasury: the Internal Revenue Service (IRS). It was also one of the nation's largest users of computers throughout the second half of the twentieth century. Tim Brown (quoted above), the assistant commissioner for collections at the IRS, had witnessed for years the continuous growth in the use of computing in this large corner of the U.S. government. He commented to a case writer for the Harvard Business School that "from the way tax returns are processed to the way employees communicate and use office equipment, IT has continually changed the way we do business."1 His comment could just as easily have been made about the entire federal government, about most states' financial and tax collection agencies as the century progressed, and even about local governments.

One can envision a virtual wave of technology sweeping across the financial and tax collection agencies of government, beginning earliest and most extensively with the largest organizations, such as the IRS and the states of New York and California, followed by smaller states, cities, counties, and towns. By the time midsized states were automating steps in their accounting and tax applications with computers, large corporate tax departments already had, and large independent accounting firms were busily moving into the world of IT.
Next, in the 1980s, individual taxpayers began using software on personal computers to prepare their returns. By the end of the century, all the major participants in public accounting and in tax preparation, payment, and collection were extensive users of information technology. In the very early years of the new century, over half of individual taxpayers were filing online; a higher percentage of companies and corporations did the same. In short, all manner of tax accounting had been highly digitized.2

Reasons for this wave of automation splashing across the economy from large taxing agencies to individual taxpayers are not hard to find. Howard D. Taylor, the regional IRS commissioner in New York in the early 1960s, stated categorically that computers "contributed more to efficiency and economy in government operations" than any other practice or tool.3 Computers from the 1950s to the present could store and move large quantities of data, and this technology could also calculate and compare information faster and cheaper than people or prior information processing equipment. These twin reasons—efficiency and economy—remained the primary sources of interest in computing, both in accounting departments at large organizations and in all areas of tax accounting, for over a half century. The same held true for state and local agencies and for tax accountants and taxpayers. Over time, however, in addition to supporting the normal work of preparing, collecting, and accounting for taxes, computing made other uses equally attractive. IT projects in 2004 at the IRS, for example, were motivated by "improving service or performance," "increasing tax compliance," "detecting fraud, waste, and abuse," and "detecting criminal activities or patterns," all classic examples of data mining that only became possible after vast quantities of tax information resided in computers and software that could analyze large bodies of data became available.4 Many of these applications first appeared in the 1960s, particularly those related to compliance. In short, the technology lent itself to the same twin reasons that had driven the adoption of every new form of information technology in accounting and tax activities over the past century.

Implicit, perhaps obvious, was the fact that tax activities were enormously paper intensive throughout the century. Individual tax returns in the 1930s and 1940s were already two to four pages in length, while your author's combined federal and state tax returns for 2005 exceeded ten forms, each at least two pages long, not including backup files and hundreds of receipts and canceled paper checks. Corporate and small business returns were always more voluminous, and even today there are a number of corporations whose tax returns, if filed on paper, would exceed 50,000 pages. In short, tax preparation and management were some of the most paper-intensive activities in the economy.

The lion's share of this chapter reviews the introduction and use of computing in tax applications, with a tip of the hat to accounting and financial applications in general, since all three activities essentially called for the same use of computers: to collect data, calculate taxes and receipts, report results, and perform analysis of large bodies of information to identify patterns worthy of audits, criminal investigations, and so forth. Accounting applications remained some of the largest and most pervasive uses of computers by all governments across the American economy.
While the fundamental missions and tasks did not change over time, how the work of accounting, financial, and tax departments was accomplished did. How that happened is the story of this chapter.
U.S. Internal Revenue Service (IRS)

It would be impossible to survey all the accounting and financial applications in use by the federal government in this book and still look at the broad array of other uses across the public sector at large. Observing the experience of the IRS, however, does give us the opportunity to understand uses and patterns of adoption of computing by the federal government and to begin understanding the challenges and consequences of adopting IT, particularly for large digital applications. The agency was always large and interacted with every business, government agency, and taxpaying resident in the nation. In short, its activities were pervasive, massive, and often emblematic of general patterns of use and deployment across the economy.

Its role as the government's major fund raiser cannot be overemphasized. In the nineteenth century, the majority of federal income came from customs fees charged for goods coming into the nation; in the twentieth century, the lion's share came from individual, corporate, excise, and social insurance receipts. By the late years of the century, customs (excise) taxes accounted for only 4 percent of total federal income, while individual income taxes reached 46 percent in 1999, for instance. That same year social insurance receipts accounted for 34 percent of total tax revenues. Corporate taxes provided an additional 11 percent.5 The IRS relied on computer-based applications designed to reflect these realities.

A second influence on applications was the physical location of the IRS. While the number of offices varied over the years, its organization remained remarkably constant. The IRS maintained its headquarters in Washington, D.C., many hundreds of offices all over the nation (although shrinking in number in the 1990s), seven regional offices, approximately sixty district offices, and seven to ten service centers (depending on which year we look at). In 1994—the year just before the massive increase in use of the Internet across the United States—the IRS employed 115,000 people.

In 1955, the IRS took its first step into the world of computing when it established a service center in Kansas City to begin consolidated processing of tax returns. Over the next few years, IRS officials converted manual, and partially automated, accounting functions to computer-based applications. These included the same applications migrating to computers in the private sector: payroll, general accounting, and some tax accounting. By the late 1960s, the IRS had over 1,000 data-processing personnel and some of the most advanced and largest systems installed in any industry and enterprise within the American economy. Early tax-specific applications implemented in the late 1950s and early 1960s included systems for maintaining taxpayers' records, audit of returns, and notices of tax due or refunds.6
Figure 2.1 First computer installed at the IRS computer center, 1964. (Courtesy IBM Archives)
Deployment occurred in waves, beginning with projects in Kansas City and in New York City, followed by establishment of the first fully functional service center to be enabled with major automation at Chamblee, Georgia, in 1962 to process business tax returns from seven southern states. The IRS located its first—and only—National Computer Center in Martinsburg, West Virginia, in 1961 with five employees. By the late 1960s, it had hardware and software. By then, the overarching collection of applications had been defined, and the IRS was on its way to implementing them. These included a master file of all business taxpayers, later extended to others, such as individuals (called the individual master file, or IMF); assignment of a permanent tax identification number to all taxpaying entities so that in the future data could be extracted from systems, using much the same method as inventory control or product numbers deployed in the private sector; and increased centralization of computing to take advantage of the rapidly expanding capabilities of mainframes and economies of scale. Taxpayers submitted returns, the IRS deposited tax receipts, and returns were sent to regional service centers for processing. An IRS official described what happened next:

In the regional service center the tax returns are sorted, batched, edited and placed under control for processing. The taxpayer's name and identifying number, as well as the line items of data from which the tax due is computed, are transcribed to punched cards exactly as they appear on the return forms. The punched cards are then converted to magnetic tape and the arithmetic is verified by the computer.7
After errors were corrected, the data transcribed from returns and loaded on magnetic tape were shipped to the National Computer Center, where they were sorted and edited. Staff updated master records and prepared tapes to generate refunds and other accounting records. During the mid-1960s, the IRS slowly implemented this system across the nation; the New York office began, for example, in 1965 with limited batch input for key forms filed by taxpayers, including Form 940 (Employers' Annual Federal Unemployment Tax Returns), Form 1120 (U.S. Corporation Income Tax Returns), and other related documents, and in time additional forms.8 However, the real experimentation, and later production work, took place at the Chamblee computer center and later at the Philadelphia office, both in the late 1960s. These applications became the core uses of computing at the IRS for many years. They reflected the capabilities of the technology of the early 1960s: tape-based, batch, and run on large mainframe systems.

By the late 1960s, advances in technology made it clear that it was time to start thinking about replacing these earliest systems to leverage new technological advances and to handle growing demands for capacity and function. All through the 1970s, the IRS and other government agencies (including congressional committees and the General Accounting Office) worked on this issue. It proved to be a very slow process, however. Not until the mid-1970s did the IRS finish formulating a concept for a replacement, called the Tax Administration System, touted at the time as the largest data-processing project that would be undertaken by the federal government, with a proposed budget of $1 billion. That, however, did not happen, for various administrative and political reasons. Specifically, administrative concerns centered on costs and benefits, while political issues related to fears that the IRS was inspecting the public too intrusively. Recall that this was the period immediately following Watergate, when public concerns about the ethics of government were at a heightened state of alert. In fact, public mistrust of "big brother" watching the public slowed a number of other data-processing projects at the CIA and FBI as well. Instead, all through the 1970s and most of the 1980s, incremental changes were made to existing batch systems that had previously not been connected to each other. Meanwhile, service levels declined, as did responsiveness to taxpayers.

In 1982, the IRS tried again to get funding for a fundamental change, this time to launch the Tax System Redesign Project. It called on private industry to design the new system, have a private firm lead the project, and upgrade the master taxpayer file using IRS employees. According to the GAO, the IRS failed to accomplish this task because of constant changes in management, insufficient clarity in defining the roles and responsibilities of key officials, and lack of sufficient technical expertise at the agency. Auditors at the GAO observed that technicians in the IRS drove the project, rather than general line management. Without appropriate senior management pushing the initiative forward, the systems installed in the 1960s remained in use.9 The press and members of Congress criticized the IRS for its antiquated computing, in sharp contrast to the progressive image it had enjoyed in the early 1960s.
In 1985, the entire system went into crisis. The IRS had installed a new system with inadequate capacity and staff, with the result that by April 5, with 60 million returns already received and the tax filing season not yet over, it had processed only 60 percent of the returns in-house. All manner of work was delayed, such as the issuing of refunds; inaccurate dunning notices related to filed returns were mailed to taxpayers; and other errors occurred. Reports circulated that IRS employees were simply throwing away tax returns that they did not have time to process. Disciplinary actions were taken against a number of employees and managers as a result; a service center director and a regional commissioner retired shortly after the incident, and the careers of some senior officials were damaged in the wake of the crisis. The agency, which was so profoundly reliant on computers, had reached a point where it could not manage its own systems.10

However, the crisis had been brewing for years. An audit report in 1976 had expressed concern "about potential software problems because data base management software needed for the proposed system is not commercially available," while hardware and other software would have to be tailored. All these factors would increase the expense, complexity, and time required to implement some of the largest systems in existence.11 A string of GAO reports issued throughout the 1970s and 1980s reviewed similar issues, expressing concerns about the IRS's unrealistic optimism, lack of sufficient skills, and rising costs for implementing new systems. Piecemeal improvements intended to increase capacity had been the only major changes made to the systems during the 1970s and 1980s.

Yet the IRS soldiered on with its Tax System Modernization (TSM) project, one that it estimated would cost $21 billion by the end of the century to implement. The project grew in scope, and by 1990 the IRS had expended $120 million on it since 1986. The story of bad project management, lack of focus and controls, and so forth, became the subject of numerous audits and investigations in the 1980s and 1990s and is a story that could itself fill a book.12 Furthermore, tax fraud became increasingly possible, both manually and electronically, with audits in the early 1990s suggesting that losses just from electronic fraud were reaching $5 billion per year.13 The issue of fraud was one faced by all agencies and companies moving to digital transactions at the time, an issue that would be partially alleviated in the late 1990s by the availability of new software tools that could minimize fraudulent activities. The IRS, like other users of the digital, realized that preventing fraudulent behavior required tactics different from those long in use with paper-based transactions. Meanwhile, massive amounts of resources were still deployed to maintain the older systems at the same time that tax returns were increasing in volume. These applications kept changing and becoming more complex as Congress made intermittent and sometimes substantive changes to tax laws every year.14 Each new administration, however, kept investing in the development of TSM right through the 1990s, although with ever smaller budgets and with loud declarations of impatience with the leadership at the IRS. This project had become one of the noisiest and most publicly negative IT implementations in the history of the nation.
Yet Congress and multiple administrations saw enough value in its implementation to continue funding it over the years. One observer described the circumstance the IRS faced in the late 1990s:

The IRS struggles with the dilemmas of document processing and electronic filing, causing them to enter a spiral of innovation as computerized collection methods engender computerized tax fraud, which requires yet more sophisticated tax collection methods. The day-to-day operations of the IRS are still jeopardized by the crumbling systems in place. Meanwhile Tax Systems Modernization has brought conflict to the organization of the Internal Revenue Service, between the technical and business sides of the organization, exacerbated with every year that TSM failed to be implemented, causing additional problems for any projects in the future.15
In his memoirs written after being the IRS Commissioner from 1997 to 2002, Charles O. Rossotti described the situation he faced in that same period: “While there was little doubt in any quarter that the ever growing tax system could not continue to depend on computer systems developed in the 1960s and 1970s, there was also increasing concern about the IRS’s ability to manage its technology program.”16 Rossotti conducted a survey of the extent of the problem, mainly to understand what had to be accomplished to get past the Y2K problem. A few statistics suggested the massive size of the problem faced by the IRS: We found no fewer than 130 separate computer systems essential for the functioning of the tax system, running on 1,500 mainframe and midrange computers from twenty-seven vendors and comprising about eighteen thousand vendor-supplied software products and fifty million lines of custom computer code. These were connected through three wide-area communications networks, many stand-alone dedicated circuits, and 1,182 local-area networks. Although the IRS employed about 120,000 people at peak season, the agency had in its inventory over 200,000 end-user computers, partly because many users needed more than one computer to access the numerous incompatible systems and databases.17
In public, officials at the IRS noted in their defense that despite these difficulties, they were doing the work of collecting taxes—in fact over $1 trillion each year during the Clinton administration—processing over 200 million returns and some 85 million refunds. Michael P. Dolan, a deputy commissioner at the IRS, reported to Congress in March 1997 that "over the past few years, we have been trying to shift taxpayers, and the IRS, from some paper transactions. We have made more and more information available via the telephone, computer, fax services, and CD-ROM."18 In the mid-1990s, the IRS began making available over 700 forms and tax publications over the Internet, and in 1996, over 100 million visits to its Web site took place. It had started accepting filings electronically, about which more is said later in this chapter. In the late 1990s, it implemented an online Integrated Collection System (ICS), which provided online access to current account data, while other online systems were upgraded, such as the Electronic Return Filing System, to reduce errors in operations.
A set of applications that did not receive much attention in the press involved all the normal backroom accounting work that goes on in any organization. At the IRS, these uses of computers had been created over the years to handle personnel matters, payroll, accounting, and so forth, and many of them were stand-alone applications. Following practices evident in corporate systems in the early 1990s, the agency consolidated or integrated many of these, merging, for example, the applications used for budgeting and accounting from eight major systems into one financial control system, called the Automated Financial System.19

Before discussing the impact of the Internet, e-filing, and other applications that emerged late in the century, understanding the status of IT at the IRS at that time places in context the nature of its dependence on computing and the effectiveness of the agency's operations. Using mid-1997 as a point of reference is useful because in that year a major review of the IRS generated considerable data on its performance. The audit report declared TSM a failure "because the IRS did not have a consistent long-term strategic vision to guide the project."20 A national commission reporting on the IRS noted that "one of the most significant problems with TSM was the failure of the IRS to tie technology objectives directly to business objectives, and to assess success based on those objectives."21 The failed TSM initiative had, in conjunction with other operational challenges, blemished the image of the IRS.

A few statistics give us a broader measure of performance. In that year, the IRS processed 205 million tax returns, of which 120 million came from individuals. Approximately 10 percent of all filings had processing errors caused by the IRS's people and systems, while another 10 percent of all returns had mistakes in calculations and data created by tax filers. Roughly one half of all tax returns were created on PCs and other computers by citizens and companies, which filers printed out and mailed to the IRS; IRS staff then reentered these into its IT systems.22 Even with this effort, only 40 percent of the data from paper returns was entered into IRS systems that year; the rest of the data was processed manually or only partially through use of computing. Table 2.1 documents briefly how large a proportion of the agency's total budget went to IT. Computing was consuming approximately 19 percent of its budget by 1997. The various data make it clear how important computer-based applications had become to the agency and the nation at large. As we see next with e-filing, the IRS's systems had also become quite public. These were clearly the most visible of all government uses of digital tools in the entire sector.

President Clinton's IRS Commissioner years later commented on the origins of the predicament his agency found itself in, arguing first that "sheer size" was a major issue, because it became "more difficult to design, implement, and operate" the agency.23 Second, he blamed Congress for constantly changing the tax code, which created additional levels of complexity and work for the IRS. The agency reacted by adding more hardware and lines of software code to existing programs as a quick fix. However, unlike the GAO and Congress, Rossotti did not criticize the management at the IRS.
Table 2.1 IRS Budgets for Fiscal 1996 and 1997 (billions of dollars)

Appropriation by Project                    1996     1997
Processing, assistance, & management        1.724    1.780
Tax-law enforcement                         4.097    4.104
Information systems                         1.527    1.323
Totals                                      7.348    7.206

Source: "Statement of Lynda D. Willis, Director, Tax Policy and Administration Issues, Testimony Before the Subcommittee on Oversight of the House Committee on Ways and Means, March 18, 1997," http://waysandmeans.house.gov/legacy/oversite/105cong/3-18-97/3-18will.htm (last accessed 1/2/2005).
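A rough check, worked only from the figures in this table (all in billions of dollars), shows the share of the agency's appropriations devoted to information systems; the approximations are mine, not the source's:

    FY 1996: 1.527 / 7.348 ≈ 0.208, or about 21 percent
    FY 1997: 1.323 / 7.206 ≈ 0.184, or about 18 percent

These proportions bracket the roughly 19 percent of total budget that the text attributes to computing by 1997; the exact share depends on which IT-related expenses outside the "information systems" appropriation are counted.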
E-filing emerged as a major new use of IT at the IRS and for the public at large in the 1990s. In time, this application made it possible for an individual or firm to submit tax returns electronically, directly to the IRS, from a PC or company mainframe over telephone lines and later over the Internet. In turn, the IRS received these returns as digital data into its tax-processing systems, with the promise of faster processing, greater accuracy, lower costs, and faster turnaround on refunds. That was the promise and the hope, and like so many large applications at the IRS, it took many years to become a reality. In the beginning, individuals filed electronically only by using a professional tax return preparer, a situation that did not change until the mid-1990s, when the software needed had stabilized sufficiently to allow the public at large to use it. E-filing thus evolved into its current form, rather than emerging from some massive, complex project done in one gesture, such as management had wanted TSM to be. Yet part of TSM had envisioned adoption of e-filing.

The first step, and a part of TSM, was to make more existing information available online to taxpayers and their tax preparers, such as tax forms and instructions for filings. In 1990, an early pilot for electronic filing of this application attracted four million taxpayers. In a report that year, the IRS noted that "electronic filing is a new way of filing certain tax returns with the Internal Revenue Service. The return is not filed on paper; instead, it is transmitted to the IRS by modem over telephone lines." Interest in such an application at that time was driven by "the growing number of tax preparation firms that use computers to prepare individual returns." The IRS concluded that such an approach would reduce its own costs for processing returns and tracking compliance, while facilitating enforcement of the tax codes. It thus embraced the application. Initial implementation of e-filing had begun for individual returns as far back as 1986, for 1985's taxes, and for businesses in 1988. Over the years, additional types of existing tax forms were added that could be submitted electronically.24 Early in 1993, the IRS reported that five of its ten processing centers were handling electronic filings. It had also introduced its first major PC-based form, called the 1040PC, a digital alternative to a traditional tax form that could be submitted electronically to the IRS.25
Table 2.2 Individual Federal Income Tax Returns Received in Paper and Electronic Formats, 1996 and 1997 (millions)

Type of Filing              As of March 7, 1996    As of March 7, 1997    % Change
Traditional paper           31.98                  28.10                  -12.3
1040PC*                     2.37                   2.75                   15.7
Total paper                 34.35                  30.80                  -10.3
Traditional electronic**    9.27                   10.92                  17.8
TeleFile                    2.28                   3.50                   53.0
Total electronic            11.56                  14.42                  24.7
Total all types             45.91                  45.22                  -1.5

*PC software was used to produce paper returns. Only lines where a taxpayer made an entry are included on this form.
**Returns filed through third parties, such as tax return preparers.
Note: In both years, this data reports on filings up to five weeks before the end of the filing season, so the final number of filings was actually larger for each year.
Source: "Statement of Lynda D. Willis, Director, Tax Policy and Administration Issues, Testimony Before the Subcommittee on Oversight of the House Committee on Ways and Means, March 18, 1997," http://waysandmeans.house.gov/legacy/oversite/105cong/3-18-97/3-18will.htm (last accessed 1/2/2005).
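The table's internal arithmetic can be verified from its own totals, which is also how the "Total electronic" row can be recovered (total electronic equals total all types minus total paper):

    1996: 45.91 - 34.35 = 11.56 million electronic returns
    1997: 45.22 - 30.80 = 14.42 million electronic returns
    Growth: (14.42 - 11.56) / 11.56 ≈ 0.247, or 24.7 percent

That growth figure matches the nearly 25 percent surge in electronic filings discussed below.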
Three years later (in 1996), the IRS reported continued growth in the popularity of this application, with 685,000 taxpayers having used the 1040 via telephone and 41,000 businesses having made tax deposits via electronic funds transfer, a major enhancement to earlier online applications available to taxpayers.26 Usage grew as the decade passed, with nearly a 25 percent surge in 1997 over 1996, for example. By viewing table 2.2, one can see the relative proportion of electronic versus paper submissions; note how significant the volume of digital forms had become. What is particularly interesting is the evidence the table presents that this application was evolving from paper to electronic, with both types in use; even the 1040PC (a return generated using software that, when printed, contained only the data needed for processing, not a full tax return) still involved a combination of paper and software, since it was the actual printed document that was submitted.27

Because we are discussing an emerging and important application of IT, it is useful to see how the agency was spending its IT budget in 1997, to give a sense of how funds were distributed for computing. That distribution is reflected in table 2.3. Note that expenditures for TSM development continued, along with upgrades of telecommunications (both telephone systems for taxpayer inquiries and other communications infrastructure common to large organizations). Congressional committees, some staff at the White House, and the IRS focused their attention on expanding online receipt and processing of tax returns during the middle years of the decade. The IRS added additional forms that could be filed online and even experimented with helping state governments launch their own programs.28
Continuing into the early years of the new century, additional functions went online, such as filing for extensions over automated telephone systems (2001) and combined state/federal filing (2002). All the while, volumes increased. In 2003, for example, over 150,000 tax preparers were authorized to file electronically on behalf of their clients. Taxpayers using PCs at home filed 11.6 million returns, of which 2.4 million qualified to do so at no cost to them. Some 6.7 million businesses also filed electronically that year. Adding in 4 million returns filed using TeleFile (which used software to handle the transactions over the telephone) brought the grand total to some 52.9 million electronic returns, of which 22.8 million were joint federal and state filings. Electronic funds transfer and credit cards were also used extensively. In 2004, the IRS reported that over 63 million returns had been filed electronically, of which 43 million had been submitted by tax preparers.29 Also in 2004, almost 2.4 million individuals made electronic payments totaling almost $3.3 billion.30

To be sure, the growth in e-filing was not simply the result of technology making it easier. Crucial to the situation was that filing taxes had become increasingly complex during the last two decades of the twentieth century, motivating taxpayers and preparers to use software tools to help them accomplish the task. While the federal government had launched a paper reduction campaign during the Clinton administration, the volume of paper actually increased in the mid-1990s, and one internal audit cited the IRS for the lion's share of the increase: "Nearly 90 percent of the governmentwide increase during fiscal year 1999 was attributable to increases at IRS, which IRS said was primarily the result of new and existing statutory requirements."31 The government estimated each year how many hours of work it burdened the public with to comply with its demands, such as the filing of tax returns.

Table 2.3 IT Budget for the Internal Revenue Service, Fiscal Year 1997 (millions of dollars)

Project or Activity                        Budget
Legacy systems                             758.4
TSM operational systems                    206.2
TSM development and deployment             130.1
Program infrastructure                     83.4
"Stay-in-business" projects                62.1
Staff downsizing                           61.0
Telecommunications network conversion      21.9

Source: "Statement of Lynda D. Willis, Director, Tax Policy and Administration Issues, Testimony Before the Subcommittee on Oversight of the House Committee on Ways and Means, March 18, 1997," http://waysandmeans.house.gov/legacy/oversite/105cong/3-18-97/3-18will.htm (last accessed 1/2/2005).
About 80 percent of the time people spent complying with government requests involved tax returns and other tax-related reporting and filings.32 Trying to control this paper growth, coupled with ongoing efforts to improve efficiencies and services through the use of information technology, remained a central activity at the agency in the early years of the new century.

As this chapter was being written in 2007, the IRS was still implementing TSM, now called Business Systems Modernization (BSM), along with a broad array of more specialized applications. These included data mining to improve the service and performance of accounting applications, detection of abusive corporate tax shelters, other applications to detect various forms of tax evasion and fraud, and a specialized system dedicated to detecting electronic fraud and other criminal activities.33 Additional automation of various corporate tax returns was planned but not started, such as a scoring system for partnerships submitting returns. Projects totaled over $200 million a year, a figure that did not include the costs of running existing systems, many implemented in the 1960s.34

Beginning in the 1990s, moving applications online and creating various avenues of access for the public to all government agencies via the Internet had also become major initiatives, part of the e-government effort started by the Clinton administration and reinforced with the passage of the E-Government Act of 2002. The IRS participated extensively in this government-wide initiative. In the case of tax filing, however, the public's move away from paper-based to electronic forms progressed slowly for various reasons. Use of tax planning programs, required in order to prepare and file digitally, remained the purview of high-income taxpayers already familiar with PCs, or of those who used professional tax preparers. Filers of the 1040EZ form, which was aimed at those with simple returns, were not the group using software tools the most. Nonetheless, the Bush administration attempted to promote electronic filing by allowing people to submit tax returns electronically at no cost, and over twenty-five states had established free Web-enabled filing by 2004 as well. Use of software by those so inclined did not simplify filing, either for the IRS or for the public at large. The rate of e-filing remained slower than the IRS desired, and, as discussed below, the speed of adoption was driven less by practices at the IRS than by those of tax accountants and software companies selling tax planning products.35

Complexity throughout the tax-processing effort—from filing by taxpayers through to the myriad activities of the IRS—motivated various parties to use accounting tools. But there were other features of the process as well that motivated the IRS to use computers. However, as we saw with online systems in the Banking Industry, developers of new systems always seemed to anticipate faster adoption than proved to be the case. The IRS was no exception to a pattern evident in many industries at the time.

Perhaps the most remarkable feature of the IRS throughout its history is the massive nature of its activities, which included extensive use of IT. While there have been some small IT projects at the IRS, those that historically have been needed to keep it current, "modern," have been some of the largest available in the American economy.
This circumstance was as true in 2007 as it was in 1957. Recent examples follow historic patterns. A project to provide the IRS with integration, engineering, and telecommunications services had a budget of $900 million to be spent over five years, beginning in 1998. An even larger initiative, called the IRS Integration Services, represented a fifteen-year, minimum $8 billion effort.36

If we look at projects from the perspective of the number of transactions, we see scale writ large across all these systems. For example, there was the use of scanning technologies, such as optical character recognition (OCR), in a system called at the IRS the Service Center Recognition Processing System (SCRIPS), launched in 1995 at half of the IRS's service centers. In any given week, it was used to scan 2.5 million documents, for a total of over 249 million forms between 1995 and early 1998.37 It was the large volume of transactions that led the IRS to move early and quickly into the Information Age. These quantities, however, made for complex systems. In turn, the IRS became one of those organizations in the American economy that pushed the mythical "envelope" of a technology's capabilities and, perhaps most important, began illustrating the managerial lessons that would have to be learned regarding any implementation and operation of large IT systems.38

Auditors at the GAO frequently studied IT projects at the IRS and routinely found much to criticize. Their evidence was specific and detailed, and the IRS rarely mounted any vigorous defenses against these charges. The IRS found itself most frequently in a situation where the massive size of its applications made the agency a prisoner of systems that increasingly became unattractive for various reasons. Turnover in senior leadership, bad project management, technical complexity, aging technologies, Congress's changing tax laws, massive volumes of transactions, changes in budgets, and the requirement of a decade or more to achieve significant changes in applications and systems all converged to make the situation at the IRS difficult, if not impossible, to overcome.39

Insights can be gleaned from the historical record. Old applications can still work despite the fact that management wants to change them. The IRS was incrementally able to change and upgrade myriad hardware, software, and processes, despite the constant churn in its leadership. All the GAO surveys of the IRS pointed out that management was too optimistic about what it could accomplish, underestimated costs, overestimated benefits, and underestimated the time required to make changes. In fairness to the IRS, the GAO often found much to criticize in other federal agencies and departments in how they, too, managed development of new systems. From the late 1960s to the present, the patterns the GAO documented seem to have been common, prevailing year after year, and made more troublesome by the fact that they received considerable political and media attention. The IRS also operated in alien territory when compared to the smaller applications that emerged all over the private sector across dozens of major industries. Other federal agencies engaged in massively large applications, such as the U.S. Department of Defense (DoD), and when we examine that department's experiences in the next chapter, we will see similar effects of complexity at work.

Understanding the need for fundamental change, and not simply for upgrading computer systems, the Clinton administration reorganized the functions of the IRS so substantially that the effort could be compared to the work done in 1952 to create the IRS in the form it would take for the second half of the century.
While the history of that reorganization cannot be adequately described here, it is important to point out that it was the use of IT that compelled the agency to change the nature of its work. Read what the IRS Commissioner who directed the transformation later wrote of the effort: "For the IRS to be successful in modernizing its business operations and technology, the agency would first need to reorganize itself into fewer units, which could thus manage operations more consistently across the whole country. The overall IRS reorganization, implemented in October 2000, was therefore a critical step for the Business Systems Modernization Program as well."40 Rossotti also had the agency document all its basic processes, such as how it handled returns, provided taxpayer services, and carried out compliance activities (audits and collections), so as to reduce confusion for employees and taxpayers alike and "to take advantage of new technology."41 By the early years of the new century, the IRS employed a third fewer people than a half century earlier, partially as a result of needing fewer employees to process paper returns, a change brought about by electronic filing. As one student of the new application repeatedly noted, one would expect that as the public used electronic services, increasing amounts of productivity would accrue back to the IRS; the cumulative results on employment over a half century are part of that record.42
State Tax and Financial Applications

Uses of computers to process state taxes paralleled those of the IRS, yet they also differed. The basic nature of taxation was similar in that states taxed their citizens' income and thus required taxpayers to submit returns each year, which the taxing agency would either accept or contest. A state would either collect receipts or pay out refunds. Both federal and state governments also collected fees for services that had to be accounted for. In the federal government, such collections were done across many departments and then deposited in the U.S. Treasury Department. Both federal and state agencies had to play essentially the same role in collecting the income taxes of business enterprises.

However, there also were, and continue to be, some fundamental differences. For one thing, states imposed sales taxes, which the federal government did not, and they had to collect these from merchants. For another, because state governments were smaller than the federal government (they had a smaller load of transactions than the IRS, with fewer tax papers to work with), often their equivalents to the IRS were accounting and financial departments that had a tax collection suborganization within them. Thus, tax applications were often more intimately connected to the rest of the accounting systems, running, for example, on the same computer as other financial applications. Finally, we should acknowledge that for decades prior to the arrival of the computer, taxing authorities in state government had relied on all available accounting and office appliances of the day, such as adding and calculating machinery and tabulating punched-card equipment.
As at the IRS, the work flows needed to process taxes had held together, in an organized manner, the combined managerial, operational, and technological activities of the day, with the work process dominating and all equipment subservient to it. With computers, software tools (applications) increasingly became the thread running through the processes and managerial practices that characterized the cadence of the tasks to be performed and, over time, the nature of that work. This transformation, while subtle, was well recognized in the 1950s and 1960s, and by the 1970s it had become the subject of much discussion in data processing and academic circles.43 Since all states collected taxes, their experience in doing so with computers is instructive about how states in general also worked with a digital hand.

States became interested in using computers in the late 1950s for the same reasons as had the federal government a half decade earlier. The first known survey of deployment of such technology at the state level appeared in 1960, and while it perhaps overstated the numbers, the pattern was unmistakable. It reported that nearly 100 computer systems were installed in state governments and an equal number were on order (or planned) for future installation. Five years later, both numbers had increased, leading one census taker to report that 163 systems were in use by state governments.44 Nearly half the systems were used by highway departments (also called public works) to perform engineering calculations and, later, project management. By 1964, forty-eight out of fifty states had a computer. The other major collection of applications installed between the late 1950s and the end of the 1960s served administrative purposes, including tax work. Table 2.4 presents data from the mid-1960s, cataloguing the various uses of computers in states.

Table 2.4 Major Uses of Computers by U.S. State Governments, circa 1964–1965

Application                              Number of Systems
Public works                             55
Revenue administration                   29
Employment security                      19
Motor vehicle and driver licensing       16
Public welfare                           9
Education                                7
Public health                            3
Other uses*                              25
Total                                    163

*Often these were also used for accounting, tax, and administrative work.
Source: Harry H. Fite, The Computer Challenge to Urban Planners and State Administrators (Washington, D.C.: Spartan Books, 1965): 4–6.
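As a quick arithmetic check, the individual application counts sum to the table's total:

    55 + 29 + 19 + 16 + 9 + 7 + 3 + 25 = 163

which matches the 163 systems that the 1965 census taker, cited above, reported in use by state governments.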
While most computers were used for multiple purposes, the table documents how many systems had at least one major application identified with them. Normally, where there was a second or third computer, public works, again mainly for engineering applications, provided the main reason for their use. Note in the table that "revenue administration," which included tax applications, accounted for the second largest use after public works. According to the individual who conducted the survey, Harry H. Fite, after public works the most extensive use of computers in state governments "lay in the functional field of finance, payroll, billing, accounting or tax work."45 To be more precise, he reported that "29 state governments have adopted computer methods in the tax work," such that "electronic data processing has become an accepted technique for revenue administration at the state level also." He described the work done with computers: "Such applications as revenue accounting and reporting, taxpayer assessment and accounting, revenue refunds, collections, deposits and reporting, statistics-gathering and reporting, tax history records, descriptions and cross references, etc., have become as familiar as the proverbial 'old shoe.'"46 Another survey, done in 1963, reported a similar trend, with sixteen out of forty-three responding states reporting use of computers for corporate, individual, and sales tax accounting. This census also noted that the most widely deployed applications were for highway computation and accounting (thirty-eight out of forty-three states).47 In short, from the earliest days of computer usage at the state level, tax applications were deployed and linked to the larger accounting and financial work.

Administration, finance, and tax departments early on acquired control over their own computer systems, much as did accounting departments in large corporations, and, like them, retained managerial and operational control over such systems until the 1980s, when more centralized cross-agency IT operations increasingly became the norm in state government. But almost from the earliest days of computing, each department wanted its own system, if it could cost justify it. One Louisiana state official, after conducting a census of existing systems in his state in 1965, reported that "our survey revealed that computer centers were cropping up throughout the State agencies like lilies after a rain,"48 driving up the cost of administration enough to alarm officials in this state into centralizing computing, beginning in the 1960s. Other states did the same.

Like the IRS, state governments developed their own data-processing applications in support of tax work in the 1960s, and these systems essentially remained in use over the next several decades. To be sure, they were moved to newer computers and evolved from purely batch systems to online versions, particularly taking advantage of terminals to provide employees with access to ever larger disk-based files, as states moved away from punched-card and tape systems in the 1970s and 1980s. Their systems were not as massive as those at the IRS, which made it easier to keep up with technological innovations, although they, too, evolved more slowly than what one saw in the private sector. As each new innovation came along in hardware, software, and telecommunications, public officials examined the merits of converting tax applications to some new form. For example, as early as 1965, when the data processing world was discussing the merits of a "total systems approach" to integrated computer-based applications, state officials participated in the debate. At the heart of the issue was the case for going beyond simply automating precomputer financial processes. As one commentator in the mid-1960s explained, "it involves rethinking the work of a Revenue Department and designing a new system which takes full advantage of the capabilities of electronic devices on the one hand, and serves the total needs of managing the revenue function—not merely processing its paper work—on the other." These new procedures had to be "developed as part of a total integrated system in which all elements or parts are consciously interrelated with another so that no unnecessary or duplicate work is performed, so that the output of one state or steps becomes input for another and so that machine work is maximized and human intervention is minimized."49 States had not heard such suggestions since the 1910s and 1920s, when the first systems approaches to work had facilitated the wide adoption of tabulating, calculating, and adding machines.50

The call for redesign of applications did not fall on deaf ears. In small evolutionary steps during the 1960s, 1970s, and 1980s, more functions moved from manual or precomputer technologies over to computers, and along the way officials optimized processes and work streams. Like the IRS in the 1970s, however, tax authorities found their systems often straining under the workload of increasing numbers of taxpayers and changes in tax law.51 Also, as at the IRS, tax evasion remained a chronic problem that public officials expected computers to help solve.52 Many non-IT actions had been taken, such as implementation of amnesty programs, whereby individuals could come forward and pay back taxes without penalty or punishment; more comparative analysis of IRS data files with those of various state agencies; and, of course, a continuous stream of changes in tax laws. For many states, the major new initiative of the late 1970s, one that continued throughout the 1980s, was finding ways to use computers to enforce compliance with tax laws by using software to compare various government agency records to identify individuals to pursue. In addition, they used many non-IT approaches to increase compliance, such as publications, TV and radio spots, and expanded phone services that taxpayers could use to learn more about their state's tax laws and requirements.

However, to help in auditing compliance, states used computers extensively. As one student of the process reported in 1988: "Computer technology suits perfectly well the work of tax agencies. Programs and hardware greatly enhance their capability to process, store, and retrieve vast amounts of information. A major use of the technology is to check income tax return information with data from other resources."53 These "other resources" included IRS income tax tapes (all states did this), IRS 1099 forms (about two-thirds of all states), and other IRS files. Sometimes states also used the computer files of other states, records of Blue Cross/Blue Shield, partnership returns, employer withholding statements, and corporate and sales tax files. Comparing data files, or performing the newly emerging data mining searches made possible in the 1980s, allowed officials to identify anomalies that required investigation and enforcement, strengthening the hand of tax auditors.
processing world was discussing the merits of a “total systems approach” to integrated computer-based applications, state officials participated in the debate. At the heart of the issue was the case for going beyond simply automating precomputer financial processes. As one commentator in the mid-1960s explained, “it involves rethinking the work of a Revenue Department and designing a new system which takes full advantage of the capabilities of electronic devices on the one hand, and serves the total needs of managing the revenue function—not merely processing its paper work—on the other.” These new procedures had to be “developed as part of a total integrated system in which all elements or parts are consciously interrelated with one another so that no unnecessary or duplicate work is performed, so that the output of one stage or step becomes input for another and so that machine work is maximized and human intervention is minimized.”49 States had not heard such suggestions since the 1910s and 1920s, when the first systems approaches to work had facilitated the wide adoption of tabulating, calculating, and adding machines.50

The call for redesign of applications did not fall on deaf ears. In small evolutionary steps during the 1960s, 1970s, and 1980s, more functions moved from manual or precomputer technologies over to computers, and along the way, officials optimized processes and work streams. Like the IRS in the 1970s, however, tax authorities found their systems often straining under the workload of increasing numbers of taxpayers and changes in tax law.51 Also, as at the IRS, tax evasion remained a chronic problem that public officials expected computers to help solve.52 Many non-IT actions had been taken, such as implementation of amnesty programs, whereby individuals could come forward and pay back taxes without penalty or punishment, more comparative analysis of IRS data files with those of various state agencies, and, of course, a continuous stream of changes in tax laws.

For many states, the major new initiative of the late 1970s, one that continued throughout the 1980s, was finding ways to use computers to enforce compliance with tax laws by using software to compare various government agency records to identify individuals to go after. In addition, they used many non-IT approaches to increase compliance, such as publications, TV and radio spots, and expanded phone services that taxpayers could use to learn more about their state’s tax laws and requirements. To help in auditing compliance, however, states used computers extensively. As one student of the process reported in 1988: “Computer technology suits perfectly well the work of tax agencies. Programs and hardware greatly enhance their capability to process, store, and retrieve vast amounts of information. A major use of the technology is to check income tax return information with data from other resources.”53 These “other resources” included IRS income tax tapes (all states did this), IRS 1099 forms (about two-thirds of all states), and other IRS files. Sometimes they also used computer files of other states, records of Blue Cross/Blue Shield, partnership returns, employer withholding statements, and corporate and sales tax files. Comparing data files, or performing the newly emerging data mining searches made possible in the 1980s, allowed officials to identify anomalies that required investigation and enforcement, strengthening the hand of tax auditors.
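The matching logic behind such compliance checks is simple enough to sketch. The following example is purely illustrative and assumes invented record layouts, identifiers, and a tolerance threshold; the actual systems of the period ran as batch jobs against tape files rather than in a modern scripting language:

```python
# Illustrative sketch of cross-matching tax returns against third-party
# records (e.g., IRS 1099 data). All names, fields, and thresholds here
# are hypothetical, invented for the example.

state_returns = {
    "111-22-3333": {"reported_income": 42_000},
    "444-55-6666": {"reported_income": 18_500},
}

irs_1099_records = [
    {"ssn": "111-22-3333", "amount": 41_800},
    {"ssn": "444-55-6666", "amount": 26_200},  # exceeds reported income
    {"ssn": "777-88-9999", "amount": 9_300},   # no matching state return
]

TOLERANCE = 500  # ignore small discrepancies

def flag_anomalies(returns, third_party):
    """Return taxpayer IDs whose third-party income exceeds what was
    reported, or who appear in third-party data but never filed."""
    totals = {}
    for rec in third_party:
        totals[rec["ssn"]] = totals.get(rec["ssn"], 0) + rec["amount"]

    flagged = []
    for ssn, total in totals.items():
        if ssn not in returns:
            flagged.append((ssn, "no return filed"))
        elif total - returns[ssn]["reported_income"] > TOLERANCE:
            flagged.append((ssn, "under-reported income"))
    return flagged

for ssn, reason in flag_anomalies(state_returns, irs_1099_records):
    print(ssn, "->", reason)
```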
One survey from the late 1980s pointed out that thirty-six out of fifty states had extensive computerized tax audit capabilities to perform such functions as checking for failure to file in earlier years, verifying mathematical accuracy, calculating taxes owed and refunds due, identifying prior-year delinquencies, presenting comprehensive views of a taxpayer’s files, matching returns with other data files, classifying which returns to pursue, and creating various management reports. Almost all states used computers as well in support of their collection processes, such as to operate automated phone collection systems and to track accounts receivable. Management also equipped investigators with PCs to gain access to data.54 Thus, over time, states had moved from simply collecting data and income tax returns in machine-readable form (1950s–1970s) to also using computers to help in enforcing laws (compliance) and supporting collections (1970s–1980s). During this period, states began using computers as well to inform tax preparers and payers about tax laws. By the end of the 1980s, states were also able to use tax data to enforce other laws. For example, in 1987 Massachusetts established the Child Support Enforcement Division within the Department of Revenue, using its tax systems to intercept “federal income tax refunds of individuals that could then be used to pay for past-due child support debts.”55 The effects generally were positive from the perspective of tax collectors in that they increasingly were able to enforce laws and discourage evasion. Processing of returns and receipts tightened.

One method that spread during the 1980s and 1990s was the use of electronic funds transfer (EFT), whereby taxpayers could transfer funds electronically from their bank accounts to their state tax department, while tax refunds could be moved electronically into a citizen’s bank account. The same applied for business enterprises. The state of Florida, for example, began deploying such a process in 1990, but only for businesses paying taxes. Faster collection of taxes made it possible for the state to earn additional interest income, $21 million in that first year, for instance. As of 1987, only a few states used EFT; by the end of 1993, the number had grown to 45.56

As use of online systems expanded and the Internet gained wide acceptance as a tool for communicating and conducting business, agencies in federal, state, and local governments began embracing the concept of e-government. The idea was that citizens and organizations could conduct their business with governments via the Internet or through other telecommunications networks, such as highly automated voice response telephone systems. Most state governments began posting information about their services and requirements on the Internet only in 1995 and 1996 and, like the private sector, did not begin conducting transactions over the Net until data security software became available later in the decade.57 State governments and the Clinton administration embraced the notion that use of the Internet would drive down operating costs, make services to the public more responsive and quicker to deliver, and create a positive image of being progressive. Some officials even attempted to leverage these themes for political and other reasons. For instance, the governor of Pennsylvania, Tom Ridge, suggested that his state
should no longer be nicknamed the Keystone State but rather the Keyboard State.58 While a major push to online systems focused on such areas as education at the K–12 levels, e-mail within agencies, and the establishment of Web sites for most agencies to communicate with the public, tax agencies participated as well. As citizens became increasingly comfortable using the Internet, they called on their local and state governments to do the same. In one survey on this theme (conducted in 2000), a third of the respondents wanted the capability of filing tax returns over the Internet, and 27 percent wanted to pay their taxes using credit cards or e-checks.59

As tax filing went online, however, a new question arose, namely, who should pay for e-filing? Even though automating tax work helped eliminate some of the most labor-intensive activities conducted by any government, providing online filing, refunds, and remittance added IT costs that, at least initially, had to be borne by governments. There was considerable debate about whether to charge citizens and businesses for this service, in the belief that it made things easier for filers and allowed them to get refunds more quickly. States chose initially to charge for the service. Thus, for example, in 2000 New Jersey taxpayers could pay their taxes using an online system but had to pay a service fee calculated as a percentage of their bill. All over the United States, the public resisted using online systems of this type so long as there were fees. In the case of New Jersey, in 2001 the government made it possible for taxpayers to remit their taxes through EFT at no cost. Using credit cards posed a tangible problem since credit card processors charged a fee for such transactions that someone had to absorb, and the dollar volumes involved were significant. Over time, most state governments opted to make taxpayers pick up those transaction fees. In the early 2000s, state governments experimented with credit cards, EFT approaches, and other ways of using the Internet to collect taxes and drive down costs for filing and paying.60 But the clear pattern was that most states were making it possible for citizens and businesses to file and pay online.61

By late 2002, 42 states had made it possible for individuals to download tax forms, 38 states provided tax advice online, and 35 offered online tax filing in conjunction with mail filing. Just over half (29) had launched complete online filing applications for returns where refunds were expected, and 23 of those also handled returns where payments by citizens were required. The same survey also noted that approximately 16 percent of all taxpayers had used online tax submission applications.62 Meanwhile, backroom automation continued in many tax agencies. In fact, by 2002, ten states had some 95 percent of their tax records stored in digital form, while a third had over half their files in electronic formats. One result of having so many digitized files was that in thirty-seven states citizens could now view the status of their tax filings using online systems.63 In 2004, e-filing continued to grow at the state level, as with federal tax forms. In fact, nearly 45 million people and businesses filed their state tax forms online. Individuals who filed their federal taxes electronically were inclined to do the same for state returns, with state filings actually increasing more rapidly than federal filings.64 So, we can conclude from these various statistics that tax agencies were active in
the e-government movement of the 1990s and early 2000s, reflecting the expanding use of the Internet as this new digital tool spread across society.65 We can also be confident that a fundamental change in how citizens and businesses were filing and paying taxes had been under way for some time. These activities were becoming highly digitized.

There is one additional tax issue that surfaced at the dawn of the new century that has yet to be resolved but has become an important concern to state and local governments. It involves whether or not they can charge sales tax for purchases made over the Internet. In a nondigital transaction, a person making a purchase in a store pays a sales tax on that transaction, the amount determined by the state or community in which that store is physically located. With an Internet-based purchase, the order for a product could come from anywhere in the world, including, of course, from another state. Most Internet merchants never collected sales taxes, and so long as the volumes of transactions were minor—the situation in the late 1990s—states were not too concerned. The U.S. Supreme Court had ruled in 1992 that no state could force a company that did not have a presence in that state to collect sales taxes for Internet sales. During the Clinton administration, in an effort to encourage citizens to use the Internet for e-commerce, Congress exempted Internet-based purchases from sales taxes. But as states and communities saw their traditional sales tax bases shrinking at a time when they were desperate for additional revenues, they became quite concerned and in the early years of the new century renewed their lobbying for the ability to tax such transactions. The numbers involved were substantial and make the renewed efforts understandable. For example, one survey provided evidence that state and local governments had lost $18.9 billion in sales taxes and predicted that this number would nearly double the following year. In fact, that happened as online sales continued to grow year after year in the new century. As of this writing (2007), the U.S. Congress had yet to lift the moratorium, while online retailers kept complaining that collecting sales taxes was too complicated a process given the number of communities and state governments in the United States.66
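The mechanics of the dispute are easy to see in miniature. The sketch below is illustrative only: the rates are invented, and a real rate table would span thousands of overlapping state and local jurisdictions, which is precisely the complexity online retailers cited:

```python
# Illustrative sketch of the nexus rule for sales tax on remote sales.
# States and rates are invented for the example.

SALES_TAX_RATES = {"NJ": 0.06, "PA": 0.06, "FL": 0.065}  # hypothetical

def sales_tax(order_total, buyer_state, seller_nexus_states):
    """Under the 1992 rule, a seller collects tax only if it has a
    physical presence (nexus) in the buyer's state."""
    if buyer_state in seller_nexus_states:
        return round(order_total * SALES_TAX_RATES.get(buyer_state, 0.0), 2)
    return 0.0  # the seller collects nothing on the remote sale

# A store-front sale: buyer and seller in the same state, tax collected.
print(sales_tax(100.00, "NJ", {"NJ"}))   # 6.0

# An Internet sale from a seller with no presence in the buyer's state.
print(sales_tax(100.00, "NJ", {"WA"}))   # 0.0 -> lost local revenue
```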
Local Government Tax Applications

County, city, and town governments—collectively called local governments in the United States—shared many common financial, accounting, and managerial practices that affected how and when they used digital technologies and telecommunications. The most obvious difference between these governmental entities and state or federal agencies is that they served fewer citizens and were themselves proportionately smaller public organizations. They had smaller accounting and financial departments, where the same employees did a number of tasks that in state or federal agencies would be the work of specialized departments. Thus, for example, a county or town accounting department might write payroll checks, pay bills, and collect taxes. In some communities, towns and
counties pooled their resources, such that one of the entities might bill citizens for property taxes on behalf of both governmental units. Finally, we should acknowledge that while large cities, and sometimes counties, levied income taxes, the bulk of the local tax base consisted of property taxes. These taxes were based on an assessment of the market value of someone’s home or business by the local government or a private contractor working on behalf of the tax authorities, followed by the process of billing the taxpayer. Collection processes, however, were similar to what state and federal agencies practiced.

As we will see with all agencies and all applications of the digital hand, scale and size mattered. The smaller the local government, the later it embraced computing, and the reverse also held true. As computing arrived at the local level beginning in the 1960s, deployment continued expanding as the cost of this technology dropped in the 1970s and 1980s. Exceptions proved the rule: cities like New York, Chicago, and Los Angeles were some of the first municipalities to set up large data centers, because they had the volume of work to justify these expenditures. The earliest inventories of the deployment of computers did not list local installations; not until around 1960 did such surveys begin identifying uses. Even then the accuracy of such data is partially questionable because some communities first sent their data processing work to a service bureau before acquiring their own system. Nonetheless, by 1965 there were about sixty-five cities and counties in the United States that had their own computer systems. Since schools often represented the largest budget item for a town or county (up to 80 percent in some instances), we should acknowledge their use of computing as well. In 1965, a rough count suggested that an additional twenty-two school districts also had their own systems (discussed in chapter 8). As at the state and federal levels, major applications supported accounting and financial work and were installed first, particularly in the early years.67

Table 2.5 catalogs some of the digital applications and other forms of predigital information-handling technologies, such as tabulating equipment, used by cities in 1965. The purpose of showing this table is to reinforce the message that the bigger the entity, the more officials used computing. We can make a second observation as well: a large body of office-appliance applications already existed prior to the arrival of the computer, giving public officials experience with IT and making them dependent on the mechanization of accounting and tax collection. The very largest counties experienced a similar pattern of use and adoption. The investigator who conducted the survey reflected in table 2.5 put things in perspective. He reported that all state and local computers totaled some 250 systems, while there were approximately 10,000 installed across the entire U.S. economy in 1963. So, while the presence of state and local government in the economy was proportionately higher, their use of computers was not, a situation that changed by the end of the 1980s.68 Because of the multiple roles local accounting and financial departments played, accounting and tax functions were highly integrated, if for no other reason than that the same individuals performed all manner of accounting work.
Table 2.5
Uses of Data Processing Equipment by U.S. Cities, circa 1965

Use                    Cities over   Cities with    Cities with   Cities with   Cities with
                       250,000       100–250,000    50–100,000    25–50,000     <25,000
Utility bills              22            22             28            20            23
Utility accounting         19            13             15             8            12
Appropriation acctg.       17            11             13            10            11
Cost accounting            16             9             11             4             6
Tax billing                20            16             17            12            14
License records            14            11              8             5             4
Payroll                    29            30             27            12            12
Mgmt. reports              21            23             19            10             4
Personnel                  15            30             10             3             2
Police                     25            14              4             2             0

Source: Adapted from Harry H. Fite, The Computer Challenge to Urban Planners and State Administrators (Washington, D.C.: Spartan Books, 1965): 6–7.
They used a variety of accounting equipment, evolving upward in sophistication as new technologies came on stream. The case of one county illustrates the process. Its data processing director noted that for many years prior to the use of computers, “county tax rolls and tax bills were produced by a combined operation using billing and addressing machines.” In 1956, as the volume of work kept increasing, the county started using small punched-card equipment “combined with addressing machines.” Yet the work remained labor-intensive until 1960, when a computer was first used to automate some of the tax processing. The data processing manager is worth quoting at length on the effect it had on the work of tax collecting:

This computer permitted more complex operations and speed of preparation, but even more speed was required to provide needed service for the county taxpayers. An (even more) modern computer was delivered to the county in 1961 to print the 1961 tax roll and tax bill. . . . The use of the new computer permitted the production of the 1961 tax roll and tax bills with a computer breakdown of county tax, flood control tax, and with the total tax computed into five discount amounts.69
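The computation the manager describes is simple arithmetic repeated for every parcel on the roll, which is exactly why it suited early computers. A minimal sketch, with invented millage rates and a hypothetical graduated early-payment discount schedule standing in for the county’s actual figures:

```python
# Illustrative tax-roll computation of the kind described above.
# The rates and the discount schedule are hypothetical.

COUNTY_RATE = 0.012         # county tax per dollar of assessed value
FLOOD_CONTROL_RATE = 0.003  # flood control district tax

# Graduated early-payment discounts, e.g., 4% if paid in the first period.
DISCOUNTS = [0.04, 0.03, 0.02, 0.01, 0.00]

def tax_bill(assessed_value):
    county = assessed_value * COUNTY_RATE
    flood = assessed_value * FLOOD_CONTROL_RATE
    total = county + flood
    # The five amounts printed on the bill, one per payment period.
    amounts = [round(total * (1 - d), 2) for d in DISCOUNTS]
    return county, flood, amounts

county, flood, amounts = tax_bill(25_000)
print(f"county ${county:.2f}, flood control ${flood:.2f}")
print("amounts due by period:", amounts)
```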
This pattern of ever-evolving capabilities occurred in almost all towns, cities, and counties throughout the second half of the twentieth century. In the years immediately following World War II, accounting departments added to their inventory of accounting equipment, moving from adding and calculating machines to more complex billing and tabulating equipment to handle the growing volume and complexity of tax work. They faced the challenge of calculating taxes, billing citizens in a timely and accurate manner, collecting and
accounting for the funds received (or not received), and posting entries. Often these processes were a combination of intense manual labor supported at various steps by machinery. Communities acquired equipment to speed up work, minimize the number of additional employees who had to be hired, and sustain accuracy. Billing was of particular interest because many communities also invoiced citizens for various utilities, such as water usage.70 So billing was a high-volume transaction activity perfectly suited to office equipment and, later, computers. All the cases reported on regarding use of IT in the 1950s emphasized speed, accuracy, and increased capacity. There was hardly a local government entity in the nation that did not use some form of office equipment to handle tax collections and billing.71

The scores of documented cases make it quite clear that in the 1950s local governments all over the nation evolved their tax collection processes into more formal, mechanized forms, moving from simple accounting equipment to more complex machines. Even large cities that had resisted various forms of automation before now fell in line. For example, the city of Philadelphia—the third largest city in the nation—which, for many political and managerial reasons, had barely used accounting equipment across its tax processing functions, finally did so in 1958, when it mechanized its 540,000 assessment records using punched-card equipment.72 So, by the time communities began turning to computers, they had accumulated a body of local experience in partially automating and mechanizing tax accounting. A few communities began using computers to do this kind of work at the end of the 1950s. Some small communities rented time on a computer, such as Patchogue, New York (population 8,200 with 3,200 tax bills), which used a Univac 120 and 10 percent of an accountant’s time to manage its tax process.73 Other cities, such as New York and St. Louis, acquired their own computers by the end of the decade.74 By the early 1960s, some communities were integrating various tax and accounting functions in more automated ways than before, particularly as they began using service bureau computers and then their own systems. Whereas in the 1950s individual steps in the process were either automated or assisted through mechanized actions, in the 1960s steps were linked together, made possible either by the use of more sophisticated accounting machines that updated various records from the same data entered into a system or through increasing use of computers as time passed.75

In the 1970s, deployment of computers to process tax billing, receipts, and accounting spread across all fair-sized and large cities, many counties, and smaller communities, either through direct use of such technology in-house or via a service bureau. Applications were normally batch, although some online query capabilities became available by the end of the decade. Pressures on local governments to keep down their costs led officials to continue automating all kinds of accounting work, not just tax processing, with the result that during that decade and into the 1980s, their data processing budgets kept growing. Service bureau work came in-house in ever-increasing amounts in the late 1970s and early 1980s as the cost of having one’s own system dropped and as internal skills in IT increased.
In a major survey on local government use of computers done in 1983, 82 percent of the 743 respondents reported having internal data processing
operations. The pattern of wide adoption was evident all over the nation and not restricted to one region or size of community. The largest communities continued to be the most extensive users of IT (88 percent of large cities, towns, and counties; 63 percent of townships). They used all the major computer products offered by IBM, Burroughs, NCR, Hewlett-Packard, Data General, Digital, Sperry-Univac, Wang, Honeywell, and others, all solid proof that local governments used everything from large IBM mainframes to minicomputers from Digital, down to word processing and accounting equipment. If one combined in-house and service bureau uses of computing, then 63.4 percent of townships now used computers; municipalities reported 87.8 percent, while cities with populations of over 50,000 but less than 150,000 were at 97 percent; all larger cities used computers.76

What were local officials doing with all these computers, and how important was tax accounting in this mix of applications? By the early 1980s, the major categories of applications resident on computers were accounting, assessment, budgeting and management, voter registration, public safety, treasury and collections, utilities, purchasing and inventory control, planning and zoning, and sanitation management. Within treasury and collection applications, property tax records and billing done with computers ranged from 64.9 percent of all small communities to just over 50 percent of large cities. Similar proportions of use were evident for special assessments and for maintaining tax records, and slightly lower proportions for property tax assessment processes. In short, within a quarter of a century, local government had gone from no use of computers to over 50 percent in aggregate, and if we look at large communities of over 10,000 residents, then deployment had reached 90 percent or more.77

Because most local governments came to computing later than the IRS or state governments and also had to contend with smaller volumes, their use of more modern equipment and software allowed them to have more contemporary applications than either of those earlier adopters. That meant, for example, that local governments could upgrade systems more quickly to more cost-effective ones. The fact that major vendors included minicomputer manufacturers provided clear evidence of the greater use of more modern systems than could be found at the IRS. For example, in 1983, nearly 6 percent of all users of computers used H-P equipment, another 5.1 percent Data General’s, and yet a further group of 4.7 percent Digital’s. So, just in the area of minicomputers, nearly 16 percent deployed this third generation of technology to do accounting work.78

One area of computing that grew all through the 1960s, and has continued down to the present with direct bearing on taxation, was the emergence of mapping applications, usually called GIS (Geographical Information Systems). While this application is discussed in some detail in chapter 6, suffice it to point out here that one of the reasons for using computers to track land uses and to digitize maps of properties was in support of tax assessments. To be sure, such systems were also used to do planning and to build and maintain water systems and highways. But by the 1980s, these systems were also emerging as extensions of the tax assessment process. Geographical Information Systems software creates maps of the land and is used to track ownerships, splits, and changes in property borders and to calculate property values, often using satellite
photography and, in earlier times, CAD/CAM-like software tools. In effect, digitized maps made it possible for tax assessors to do their work with fewer staff needed to visit properties and talk to owners.79 Meanwhile, systems went online in the 1980s and early 1990s so that accounting personnel could respond quickly to taxpayer requests for information regarding assessments, tax bills, and the like, by accessing their files through CRTs. On the eve of the arrival of the Internet, online access to tax files was fairly widespread in large and small communities, since they had upgraded their batch systems over the years in an incremental fashion, moving data from the cards and tape of the 1960s and 1970s to direct-access disk drives in the 1980s and 1990s.80

Migration to Internet-based tax applications, however, proved slower to accomplish. Local governments fretted over what to make available over the Internet and how to protect the privacy of such data. However, use of the Internet by local governments took place in many other areas, such as in providing information about various services of specific agencies and departments, data on how to apply for licenses, and information about community events.

Like the federal and state governments, local governments were being affected by the growing use of IT across the American economy. Historically, local taxes were based on physical property (such as land, buildings, and machinery), sales taxes on goods and services transacted within their borders, and a few other miscellaneous sources of taxation. But with the increasing shift of the nation’s production and consumption from products to services, many of which were taking place anywhere in the nation and not necessarily where the transaction was initiated, debate about the fundamentals of tax sourcing began in the 1990s. As Internet sales increased in the late 1990s, the denial of sales taxes to local communities (mentioned earlier in this chapter) posed a potential problem for sustaining tax revenues. A third conundrum just looming on the horizon was how to tax the growing creation of intangible assets as people made fewer things but did more knowledge-based work that had economic value and thus, in theory, could be subject to taxation. To these issues add the rivalry for tax revenue that grew all through the 1980s and 1990s among local jurisdictions, involving towns, cities, counties, and states competing, for example, for sales and use taxes. At the dawn of the new century, these issues were of greater interest to large communities and to state and federal governments than to small municipalities. But they were nonetheless emerging as part of a broader discussion about the fundamentals of taxation in this nation, a debate that was only just beginning as this book went to press in 2007.81 In short, the digital hand was beginning to affect local government in ways officials could not have imagined even a decade earlier.
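Before turning to the preparers, the GIS capability described above can be made concrete. The basic geometric operation such systems perform is deciding whether a parcel lies within a taxing district’s boundary; the following is a toy version of the standard ray-casting point-in-polygon test, with invented coordinates:

```python
# Toy point-in-polygon (ray casting) test, the geometric primitive GIS
# packages use to decide which taxing district a parcel centroid falls in.
# Coordinates are invented for the example.

def point_in_polygon(x, y, polygon):
    """Return True if (x, y) lies inside the polygon, given as a list of
    (x, y) vertices. Casts a ray to the right and counts edge crossings."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        crosses = (y1 > y) != (y2 > y)
        if crosses and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

flood_district = [(0, 0), (10, 0), (10, 10), (0, 10)]  # a square district
print(point_in_polygon(3, 4, flood_district))   # True: parcel is taxed
print(point_in_polygon(12, 4, flood_district))  # False: outside district
```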
Tax Preparers and Payers We now move away from the work of government agencies that process tax returns and collect revenues to the two communities they deal with the most: accountants who prepare tax returns on behalf of clients and the taxpayers
themselves. Both the preparers and payers increasingly adopted software tools to facilitate preparation of tax returns, beginning largely in the early 1980s and continuously expanding their use of IT over time. In fact, their deployment of such digital aids was part of the reason that the IRS was able to step up its plans to offer e-filing at the dawn of the new century. While in the 1950s no accounting firm used software to prepare returns, by the early years of the 2000s, over half of all returns were prepared using computers. The transformation from pencil-and-paper practices began with accountants and, with the arrival of the PC, spread to individual taxpayers.

We should recognize a couple of other realities. First, there were various types of accountants and tax preparers: the high-end, well-trained employees at Deloitte, PriceWaterhouse, and other elite accounting firms; small bookkeeping CPA firms and corporate entities such as H&R Block; and mom-and-pop firms or individuals with little formal accounting or even tax preparation training. We should also recognize that this conglomerate of preparers as a whole had little appetite for the IRS to digitize their work, and many resisted and fought the proposed transition. With those facts in mind, we can begin looking at the experience of the accountants.

There are essentially three applications of IT used by this community. The first is tax software to collect filing information, to fill out digital versions of federal and state tax return forms, and to do the necessary mathematics. That is not the same as traditional accounting software, which is used all year to perform normal accounting and bookkeeping. There are three attractions of such software packages: they are relatively easy and quick to use, once the user has learned the package; they drive down the cost of offering a client the service, which is important in what has always been a competitive market; and they are accurate, avoiding many of the mathematical or judgmental errors that always plagued manual approaches. The second type of software used by such accountants consists of packages for performing tax research, such as finding out what the tax laws call for. Much as the legal profession did with similar tools, tax preparers increasingly came to rely on such software in the 1980s. The third group comprised a myriad of digital tools, including Web sites by which accountants could inform clients of their offerings and individuals could communicate with accountants, e-mail, and e-filing software and telecommunications links to state and federal tax agencies. The first two appeared initially in the 1970s as emerging applications of computing. They became reasonably widespread in the 1980s among large accounting firms (such as H&R Block) and in the 1990s spread to smaller firms. Online communications with individual tax clients is a story of the 1990s and beyond, although communications with businesses began in the 1980s.

Tax preparation on behalf of clients had existed all through the twentieth century, and accountants were important customers for all manner of accounting equipment of the day, from such firms as Burroughs, NCR, Monroe, and Felt & Tarrant, and typewriters from an equally prestigious list of suppliers, not the least of which were Remington, Underwood, and IBM. But as with tax collecting agencies, these two classes of equipment—adding machines and calculators on the
one hand and typewriters on the other—were used to assist what remained essentially a manual process: typing the necessary data onto forms and augmenting hand calculation of totals with office appliances. Up until the years following World War II, accounting firms did all manner of accounting, not just taxes (many still do). Then some began specializing in one or a few aspects of accountancy. H&R Block, one of the nation’s largest tax accounting firms, was typical. Founded in 1946, it did all manner of bookkeeping for small firms but in 1955 decided to concentrate solely on tax preparation. The focus proved a winning strategy. Within one decade the firm annually prepared over a million returns and, in 1975, generated in excess of $100 million in revenues for this kind of work. In short, tax preparation in the years when tax law became more complex and the number of taxpayers grew proved to be good business.82

While lawyers, individual accountants, small accounting firms, and, increasingly over time, large tax preparation firms did tax filing preparation, the largest seemed to embrace IT the earliest. H&R Block—the nation’s largest tax preparation firm in the last quarter of the century—began extensively using software tools in the 1970s and, in 1980, even acquired CompuServe, an early online consumer service. In 1986, H&R Block filed 22,000 electronic returns on behalf of its clients, relying on software preparation tools to get the job done. Exactly ten years later (in 1996), it prepared one out of every nine tax returns filed in the United States, giving it access to many taxpayers who could possibly be convinced to start allowing the firm to file electronically on their behalf.83 Tools for tax preparation services were often products from small software firms in the 1970s and early 1980s. One of the most successful of these was Intuit, which began life in 1983 with Quicken, a financial software product aimed at the home market, and over time added products, such as QuickBooks, for small businesses and professional accountants.84

Online tax preparation took off in the 1990s. To put that statement in perspective, consider that the total number of returns filed in that decade grew by 13.7 percent, while the use of tax accountants grew by 26.4 percent. And as noted earlier in this chapter, the percent of all tax returns filed electronically by individuals and accountants also grew substantially during the decade. Even short forms, such as the 1040EZ, were often prepared by accountants by the end of the 1990s. This work also was an example of new opportunities for generating income made possible by the digital hand. For example, some of the 1040EZ business for tax professionals was tied to Refund Anticipation Loan (RAL) products, which gave the taxpayer a loan for a portion of their refund; it became a whole new source of revenue for tax accountants. In short, tax preparation had become a mainstream digital application. In tax preparation year 2001, 21.1 percent of all such forms submitted to the IRS were signed by tax preparers.85 While hard data on the number of tax returns submitted by preparers electronically in earlier years are difficult to come by, we do know that for 2000 the share exceeded 57 percent.86 Complexity of the tax code and the time required to fill out forms had nurtured expansion of tax preparation services, augmented by an ever-increasing number of reliable software
tools, forming a virtuous circle of opportunity and adoption. By the end of the century, the IRS had initiated a variety of programs to encourage even further electronic filing and aimed some of its promotional programs at the professional accounting and tax preparation firms. Ironically, and perhaps not surprisingly, in the early years of the new century, as software tools that individual filers could easily use appeared on the market, the number of clients for these firms actually dipped slightly. The dip may be explained by the fact that the economy as a whole was in a recession, and as a result the total number of tax returns filed had declined slightly (both paper and electronic). Nonetheless, the number of electronic filings these firms did as a percent of their total work remained high (71.9 percent of all filings done in 2001, 67.1 percent in 2003).87

In the 1980s, many software firms rushed into the market to provide professional tax preparers with software products, but, by the end of the 1990s, the number had dwindled to fewer than a couple dozen enjoying broad market appeal. Many other niche tools also were on the market. A survey of tax preparers in the state of New York done by the CPA Journal in late 2002 pointed out that, on the whole, accountants were pleased with the performance of these tools. Their biggest complaint concerned the consolidation of software firms, which meant products they were familiar with went out of use and they had to learn how to use new ones. The same held true for tax research software, even though there were fewer options in what one could use. Two-thirds of all the firms surveyed were extensive users of all manner of IT, from preparation software to Web sites.88 The reasons why remained the same as in the 1980s and 1990s. One report in 2003 summed up the rationale quite clearly: “Tax practice has changed significantly over the last decade as computer technology has enabled local and regional practitioners to adopt sophisticated tax compliance and research software to better service their clients. The relatively low cost and high availability of Internet resources have also influenced the way accounting professionals communicate with existing and potential clients.”89 At the same time, users were complaining more about the constant changes in their software tools. The firms noted above were surveyed in 2002; in the following year, they increased their use of software tools to over 80 percent, so the complaints did not slow their adoption of the digital. Their lingering concern now shifted to how secure the Internet was (or was not) as a tool for accepting and transmitting sensitive financial data back and forth to clients and to tax agencies.90

As this chapter was being written, the market for software-driven tax preparation continued growing but was increasingly being served by larger firms using fewer software tools. On the tool side, as of late 2004, there were only sixteen software firms that had products that could handle all the federal tax forms and calculate taxes for all the states on individual income tax returns. That number represented a decline of some 20 percent over the prior decade. The trend forced about 13 percent of preparers to switch packages, always a tension-filled activity as they learned new ways of doing their core tasks.91 Table 2.6 lists some of the most widely deployed products used by tax preparers for individuals and businesses.
Table 2.6
Tax Preparation Software Widely Used by Tax Preparation Firms, 2004

Vendor                                              Tax Program
Drake Software                                      Drake Software
Research Institute of America                       GoSystem Tax RS
Intuit, Inc.                                        Lacerte
ATX, Inc.                                           Max Plus
CCH Tax and Accounting, a Wolters Kluwer company    ProSystem fx Tax
Universal Tax Systems Inc.                          TaxWise
TaxWorks by Laser Systems                           TaxWorks
Thomson                                             UltraTax

Source: Stanley Zarowin, “Users Size Up Tax Software,” Journal of Accountancy (American Institute of Certified Public Accountants), October 2004, http://www.aicpa.org/pubs/jofa/oct2004/zarowin.htm (last accessed 1/8/2005).
The percentages of usage are based on a survey done by the CPA Journal in 2004.

As happened in so many industries, as digital tools spread from suppliers of goods and services to their customers, new possibilities turned into waves of innovative services made possible only by the growing infrastructure of IT capabilities. One clear example of this phenomenon was just beginning to appear with tax preparers. Beginning in late 2004 and expanding in 2005, tax preparers targeted their services at teenagers and very young adults, often offering free or inexpensive services online in hopes of enticing them to become clients in subsequent years. H&R Block, for example, provided free online federal tax preparation for those under the age of eighteen earning less than $10,000. Intuit, Inc., which sells TurboTax, a software package that individuals can use to prepare their own returns, went after the same demographic with a Web site called RockYourRefund.com, displaying images of shirtless fun-loving beachgoers enjoying their tax refunds, all for a $5.95 tax filing fee. The site also offered discounts on purchases of electronics or travel. In both instances, the companies recognized that the youngest taxpayers often did not file returns, even though they had refunds due them, or had their parents fill out the forms. In either case, these people represented a new segment in an otherwise slow but growing market; they were already on the Internet and thus could easily add financial transactions to their activities.92 It was no surprise that competing firms would pursue them.

While professional preparers were changing their practices to incorporate extensive use of the digital, individuals filing returns were also busily embracing software tools, although not to the same extent. This did not happen overnight either. Several preconditions had to be met before individuals and small businesses could use software to prepare and file electronically. The most
important of these was either owning or having access to a personal computer. While such devices first came on the market in the late 1970s, it was not until the 1980s that they spread widely across the American scene, and even more so in the 1990s, often with Internet access by the end of that decade.93 The second prerequisite turned out to be an appetite for spreadsheets and personal financial planning and management software. These could be used to manage household finances, plan investments, and pay bills. By the early 1980s, there were dozens of software products on the market to handle these kinds of transactions. In December 1984, Intuit shipped the first version of Quicken, which eventually became one of the most popular home financial planning products on the market. Over time, new releases added functions, while other vendors passed into history or saw their market shares shrink. By the early 1990s, Intuit had several million users. Then in 1993, Intuit acquired ChipSoft, maker of TurboTax, the tool that would be used by so many individuals to prepare their first taxes with the aid of software.94 By the early years of the new century, over 20 percent of all households used some sort of financial planning software product.95 So, long before online or even digitally based tax returns were prepared, many Americans had already become familiar with PC-based software for financial management. Initial users of tax preparation software tended to be individuals with higher-than-average incomes who were often well educated and held an assortment of mutual funds, retirement accounts, and stocks. The majority of those using tax preparation software had prior experience with other digital financial planning tools, making it easier to prepare returns or to avoid the greater expense of having a tax preparer do the work.96

As the Internet became a viable means for filing electronically by the end of the century, e-filing began a slow but continuous deployment, as discussed earlier in this chapter. By 2005, the IRS had an aggressive campaign under way to promote such filing and even announced that it would be retiring use of telephone systems as a paperless filing method (TeleFile). When it announced that taxpayers could file free of charge, using software tools developed in the private sector, taxpayers found that they had fifteen different software packages they could access.97 Historians will someday recognize that this agreement to allow citizens to file this way was truly a landmark event in the history of American tax collecting. Professional tax preparation firms had fought mightily against the IRS’s making it possible for individuals to file electronically. The preparers lobbied Congress, which frequently reacted behind the scenes, threatening the IRS with budget cuts if it continued to move forward on building its own Internet filing site. Nonetheless, the agreement went into effect. Tax preparers accepted the changing nature of things. For example, Intuit promoted its product, TurboTax, both on the Internet and through retail channels, emphasizing that it would “cut down on errors and save time,” while providing verification from the IRS that it had received one’s returns. Refunds also came in more quickly. In 2004, for example, e-filers who had refunds deposited directly into their bank accounts (via EFT) received them, on average, in just over two weeks, while paper filers waited anywhere from two to six weeks, a point emphasized by tax software vendors.98
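Part of the appeal of tools like TurboTax is visible even in a toy version of the core computation: graduated-bracket arithmetic is tedious and error-prone by hand but trivial for software. The brackets and rates below are invented for illustration and correspond to no actual year’s schedule:

```python
# Toy graduated income tax computation of the sort tax software automates.
# Brackets and rates are invented, not any year's actual schedule.

BRACKETS = [            # (upper bound of bracket, marginal rate)
    (10_000, 0.10),
    (40_000, 0.15),
    (float("inf"), 0.25),
]

def income_tax(taxable_income):
    tax, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if taxable_income <= lower:
            break
        tax += (min(taxable_income, upper) - lower) * rate
        lower = upper
    return round(tax, 2)

# 10,000 * 0.10 + 30,000 * 0.15 + 15,000 * 0.25 = 9,250.00
print(income_tax(55_000))
```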
Both large and small businesses also either used these same tools or wrote their own and, like individual tax filers, had long experience with digital financial tools. Since the 1960s, the Fortune 1000 companies—the largest firms in the United States—had used software to assist in the preparation of their corporate returns. By the late 1980s, some 60 percent of those firms had computerized over half the work done by their tax departments; one survey suggested that a third had automated 70 percent or more of their work. Personal computers were the favorite digital tool in both large and small firms. Larger firms used these in conjunction with mainframes that housed the large financial accounting systems they needed to access in order to do their tax work. In short, like individual preparers, big and small firms had moved to software tools in the 1980s. Down to the present, the application has remained virtually ubiquitous, particularly in large corporations, and less so in small firms, whose deployment has tended to track more closely what we witnessed with individual filers.99 The IRS operated in tandem with these developments. The first truly modern e-filing application from the IRS was for the 1120 (used for corporate returns) and the 990 (used by tax-exempt organizations) as part of its Modernized E-File Project. Later, the IRS mandated electronic filing by all large corporations and tax-exempt entities. During the early 2000s, the IRS began modernizing its digital 1040 software to sit within the same system.
Conclusions

The deployment of the digital hand in the world of tax filing, collections, and compliance followed a pattern evident across many industries and applications. The work was paper-intensive for all parties concerned. The subject was fraught with complexity and frequent errors in mathematical calculations and data entry. The volume of time, people, and transactions was always massive, to say the least. In short, from nearly the earliest days of digital computing, tax work proved an ideal use of computers. The benefits all sought were speed, ease of use, accuracy, and lower operating costs. Both tax agencies and tax filers in general were able to reduce paperwork, speed up processing, increase accuracy, and sometimes even ease the complexity of the work. For individuals filing, costs as measured either in time spent on the process or in paying tax preparers did not go down, although self-preparation using software packages proved so convenient that the cost of the software was more than offset by the time people saved in preparing returns.

The larger the tax agency, the more complex its digital applications became in support of the collection and processing of returns. The IRS became the epitome of complexity and largeness, even when compared to private corporations. Conversely, small towns and counties were able to use computers in support of tax work relatively easily and as part of their general accounting and financial activities. The experience of the IRS teaches us that the use of computing involved many nondigital aspects, such as the role of institutional focus,
management attention, and having the right project management, programming, organization, and leadership skills. This discussion is not about incompetence at the IRS—which most GAO audits implied or stated—but rather about the realities of complex digital applications and organizational operations. Smaller tax agencies also faced these kinds of issues, but again, the smaller the tax department, the easier it was to manage the adoption and use of software in support of tax work.

The experience of people in the United States with tax applications highlights the symbiotic and iterative evolution of adoption. First, we saw the IRS and large states begin using software to do tax work, followed by ever increasing waves of midsized states and large cities, then by even smaller states, cities, counties, and towns. As technology either became less expensive and modular or easier to use, adoption spread across all tax and financial departments in the public sector. Right on the heels of the large tax agencies were the financial and accounting departments of sizeable corporations, also embracing use of the computer to do their tax work. We saw as well a similar pattern of ever smaller companies using computers over time. Almost simultaneous with the adoption of computing by smaller firms were the professional preparers, who often were the organizations that introduced computing to smaller companies as part of their accounting services. Finally, with the arrival of the PC, online access (even before the Internet), and financial planning and tax preparation software, individuals began using the digital in support of their tax work. As each constituency embraced the computer, it caused other participants in the process to alter their use of computing. Thus, as companies, some tax preparers, and individuals began pressuring the IRS and state agencies to allow e-filing, tax collectors had to acquiesce. Congress, for example, in the late 1990s passed legislation directing the IRS to step up its support of e-filing and provided tools to do so, such as authority to use paid advertising to promote e-filing.

Use of computing evolved over time as a function of changes in technology and as a growing base of experience and stock of installed applications became integral to daily work. In the early 1960s, the IRS accumulated massive card and tape files, and processing occurred in batch mode, that is to say, not real time or online. As technology evolved, online systems were added to the stock of digital applications, often building on these by providing windows into databases of information that could be examined or updated in real time by a person at a terminal. As telecommunications improved in various forms in the 1960s and 1970s and were later reinforced with secure transactions over the Internet, moving information from one point to another became possible and desirable (e.g., e-filing). Every tax agency I studied in preparation for writing this chapter added applications and changed earlier ones to provide new functions or simply to improve operations over earlier digitally supported processes. It was a constant evolution, iterative, incremental, and cumulative. Thus, a tax department of 1950, cluttered with adding machines, tabulators, and calculators, would have been an alien, unfamiliar landscape compared with what such an organization looked like a half century later, when mainframes and PCs were the nerve centers. To be sure, much paper still remained,
because not all filers had gone online and taxes had become more complex, often requiring more forms or other documentation. The IRS still desired to replace older systems with newer ones, and its conversion projects were massive in vision and complexity. But when the IRS changed applications in an incremental fashion, it enjoyed successes for most of the half century; when it attempted large wholesale replacements, it either failed or took decades to accomplish them. Technology changed faster than the IRS could, which simply complicated matters. The earliest users of large systems tended to be saddled with large problems of inertia and complexity in their attempts to move to more modern systems, a lesson from the IRS. Those who embraced computing later, even with large systems, tended to have an easier time of it because newer digital tools lent themselves to incremental upgrades and changes more than the earliest systems did. Changing a large system, such as those of the states of New York, California, and Pennsylvania, or the IRS, was tantamount to trying to change a flat tire on a moving vehicle. One could not stop collecting taxes for several years to enhance older systems. So tax agencies did what the private sector also did: they changed systems incrementally, and that is why the history of tax applications is not a story of revolutions in computing but rather a long tale of evolutionary transformation from no computers to extensive reliance on the digital hand. The IRS commissioner who was most able to change the agency in the second half of the century concluded from his experience “that it is wrong to assume that a big, entrenched institution that gets into deep trouble cannot be changed for the better. The crisis can be turned into an opportunity. If it is important enough to do, it can be done.”100

In the next chapter, we face two sets of issues. The first is the adoption of applications that are complex, some indeed more so than those at the IRS; the second is a portfolio of uses far more varied than the focused work of a tax agency. The military services and the U.S. Department of Defense were arguably the largest users of computers in the world during the twentieth century. Uses ranged from the mundane, such as doing payroll, though sometimes for several million “employees,” to supporting logistics and complex supply chains that required almost every conceivable product available in the American economy to be ordered, transported, and consumed, sometimes under combat conditions. That is the story to which we turn next.
3

Digital Applications in Defense of the Nation

Information systems have become essential ingredients to the success of combat operations on today’s battlefield.
—General Colin Powell, 1992
The U.S. Department of Defense (DoD) is one of the largest users of computers and telecommunications in the world and the largest within the federal government. Its uses of these two bodies of technology have historically been some of the most advanced and complex as well. The department used all manner of technologies, but none proved so central to the way it conducted its affairs during the second half of the twentieth century as these two. Computing and telecommunications spread across all the uniformed services, and civilian employees used the digital hand to assist in such activities as accounting, financial reporting, procurement, and logistics. What President Dwight D. Eisenhower called the military-industrial complex included the wider community of companies that the DoD often called upon to develop new uses of computing or to apply the digital hand in the creation of new weapons systems. Throughout the second half of the twentieth century, the DoD often supported development of more advanced uses of computers, pushing forward the state of the art of digital and telecommunications technologies during the Cold War. The development of the network we later came to know as the Internet was one of many important examples. In short, the combination of stimulating R&D in technology, then using the results in practical ways linked to the core missions of the department, presents us with a very large case
study of how computers were broadly used across the American economy and, more narrowly, in the public sector. A great deal of the story of how the DoD promoted the development of new computing technologies from the 1940s through the 1990s has been studied by historians.1 While the broad lines of those studies will be summarized below, this chapter focuses on the use of computers and telecommunications, a story that has not been told in any comprehensive manner. By doing that, we can demonstrate the extent to which this department relied on computing to do its work and the degree to which its uses of the technology changed how DoD evolved over time. We care about that story for all the obvious reasons, but also for one other: DoD was more often than not a department equal in size to some important American industries.2 In times of war, for example, it employed as many people as the banking or construction industries. It was also a world of its own, acting very much like an industry. It had its own values, language, organizations, methods for doing things, allies, constituents, suppliers, and value to deliver. It always had its own publications, conferences, training programs, and so forth. In these ways, DoD was no different from other established industries. It may seem an odd perspective on the department, but it is nonetheless a useful one for understanding this world, because, as in so many industries, actions taken in one part of DoD affected the work of other agencies, uniformed services, and vendors. Each learned from the others and influenced the others' thinking and actions. We will not review in detail the role of the broader community outside of DoD that made up part of the military-industrial complex, such as manufacturers of military aircraft; I described their uses of computing in the first volume of The Digital Hand. What is added below are comments about how that community interacted with DoD in developing or deploying IT. To be sure, my review covers research and development, but also logistics, ordnance, weapons systems, training, combat operations, Information Age warfare, and various noncombat applications. A brief discussion about deployment closes this review of DoD's activities to provide a more comprehensive picture than we have had before of how IT came into DoD and to what extent. I do not discuss intelligence activities—a major function at DoD—because sufficient information about computing's use in them is not yet publicly available.
Makeup of the Department of Defense

In order to appreciate the role of the digital hand at DoD, it is important to understand how that department organized itself and its work. The National Security Act of 1947 established the National Military Establishment, led by the secretary of defense, and included three military departments—Air Force, Army, and Navy—along with a variety of other agencies. Legislation in 1949 changed
its name to the Department of Defense and reaffirmed its status as an executive department, with the secretary reporting to the president. To a large extent, the organizational history of the DoD is about the growing power of the secretary and the Joint Chiefs of Staff and about rivalries among the uniformed services over influence, funding, and the scope of their missions. Others have addressed the history of those rivalries, a story largely outside the scope of our interest in understanding the role of computers.3 However, it is important to understand that this department combined the uniformed military services—Army, Navy, Air Force, Marine Corps, and, in wartime, the Coast Guard—and a variety of civilian agencies in support roles, employing a mix of civilian and military personnel. The size of DoD proved so influential on the role of computing, both in the department and across the American economy, that the demographics and size of its budgets need to be understood. In 2005, for example, the defense budget consumed 21 percent of the entire federal government's $2.5 trillion budget. In other words, it was over $500 billion, a figure that does not include other supplementary allocations.4 During periods of war, that percentage of the total budget climbed higher still. Table 3.1 catalogs totals for various DoD expenditures for the period since 1950. The growth in expenditures derived largely from the expanded duties of the department throughout the Cold War and from waging the Korean Conflict, Vietnam War, Gulf War, and Iraq War, not to mention carrying out various smaller missions, such as the hundreds of occasions of helping people in natural disasters or rescuing people at sea, and funding smaller military engagements. Table 3.2 documents the number of people employed by DoD, including military personnel, civilian employees, and others outside of DoD doing work for the department, in short, President Eisenhower's "military-industrial complex." A quick glance at the employment figures reveals, first, that the number of civilian employees as a percentage of all DoD employment was quite high throughout the period, requiring that any survey of the role of computing in DoD take into account their use of technology. Second, outside contractors, whether in the development and manufacture of weapons systems or in running cafeterias, also constituted an important contingent. Combined, the data in tables 3.1 and 3.2 demonstrate that for the entire period, DoD was a large component of the government and, by extension, of the American economy at large, in times of both peace and war.
Table 3.1 U.S. Department of Defense Spending, 1950–2005

Year        1950   1960   1970    1980    1990      2000      2005
$ Million   42.6   92.2   195.6   591.0   1,253.2   1,788.8   2,340.0

Source: Office of the Under Secretary of Defense, National Defense Budget Estimates for FY 1998 (Washington, D.C.: United States Government Printing Office, March 1997): 160–161; ibid., for 2005 (Washington, D.C.: United States Government Printing Office, 2005): 205.
Table 3.2 Total Defense-Related Manpower, 1950–2005 (thousands)

Year   Active Duty   Civilian   Total DoD   Defense-Related
1950   1,459         710        2,169       2,883
1960   2,475         1,195      3,670       6,131
1970   3,066         1,264      4,330       6,729
1980   2,063         990        3,053       5,043
1990   2,144         1,073      3,216       6,332
2000   1,449         698        2,147       4,572
2005   1,455         688        2,143       5,618

Source: Adapted from various tables in Roger R. Trask and Alfred Goldberg, The Department of Defense, 1947–1997: Organization and Leaders (Washington, D.C.: Historical Office, Office of the Secretary of Defense, 1997): 171–176; U.S. Census Bureau, Statistical Abstract of the United States: 2002 (Washington, D.C.: United States Government Printing Office, 2002): 329; ibid., for 2005 (Washington, D.C.: United States Government Printing Office, 2005): 213. Statistics varied from one source to another for any given year; however, when compared, the differences were slight.
Patterns of Research and Development

Historians agree that the investments made by DoD in the development of computers in the 1940s and 1950s made it possible for this class of technology to reach a level of effectiveness such that it could be used by the private sector. Along with economists and other researchers, they have documented how this enormous investment in R&D gave the entire United States economy a leap forward toward the Information Age ahead of all other nations.5 To a large extent, the motivation for this enormous investment was the need of the American government to respond to the military threats posed by the Cold War, dangers that ranged from the development of nuclear weapons to complex guidance systems for missiles, to advanced avionics for aircraft and command and control systems for large naval fleets.6 Early and intensively, officials in both the uniformed services and on the civilian side of the War Department, and later the Department of Defense, saw the possibility of using computing to perform complex calculations, to coordinate rapid air defense and combat command decision making, to support the development of increasingly lethal weapons (later "smart" weapons and ordnance), and to support vast logistical processes. It was a faith in the potential benefits of using computing that grew more intense over time, beginning with hints of possibilities worth investing in during the 1940s and morphing into widespread endorsement of the technology by the late 1960s. Paul Edwards has argued in his study of the military's use of computing that the technology became a major source of the techno-world view the military embraced throughout most of the era of the Cold War.7 To accomplish the task of developing new
technologies, and then new uses for them, required a variety of strategies that ranged from in-house development at facilities run by the uniformed services to national laboratories, delegation to other departments, universities, and think tanks, and, of course, the private sector. It was a complex ecosystem that spun off vast amounts of new tools and uses. Management of the R&D process also varied, with the uniformed services, multiple civilian agencies within DoD (most notably DARPA), and others, such as the National Science Foundation (NSF) and the National Security Agency (NSA), funding and directing research agendas. Edwards summarized the focus of the research activities over the course of the last six decades of the twentieth century: "First, air defenses, then strategic early warning and nuclear response, and later the sophisticated tactical systems of the electronic battlefield grew from the control and communication capacities of information machines."8 Almost every major weapon system from the late 1950s forward involved the use of digital technologies either for its development (such as design of aircraft) or its operation (for example, ballistic missiles and smart bombs). Defensive systems aimed at providing early warning of enemy attack also involved use of computers, from SAGE in the 1950s through the early 1980s to the "Star Wars" Strategic Defense Initiative (SDI) from the 1980s onward, an effort still under way during the early years of the new century. Command (military jargon for leadership) and control (military term for management) systems, used to coordinate complex air, naval, and later land battles, were also the subject of R&D. The DoD devoted considerable attention to the creation of complex and sophisticated logistics systems (what the private sector often refers to as inventory control or supply chains). These clusters of work called for the use of operations research (OR), artificial intelligence (called expert systems in the private sector), and, most recently, exploitation of RFID technology (early 2000s).9 In short, for a wide range of activities of interest to the Pentagon, development of new computer technologies and applications became an ongoing effort that began in the early 1940s and has continued to the present.

The early research projects of the 1930s and 1940s have been well documented, so they need not detain us here. However, it is important to understand how they were managed by the military. An initial, popular approach was to seek out the help of American universities, institutions that already had collections of scientists and electrical engineers who could immediately be put to work on projects, such as the development of the ENIAC at the University of Pennsylvania during World War II,10 work at the Radio Research Laboratory at Harvard University, and work at MIT's Radiation Laboratory. The latter ran the largest research projects on behalf of the military by the late 1940s, employing some four thousand people across sixty-nine colleges and universities.11 The results of these early projects were impressive: the ENIAC for compiling ballistics firing tables for antiaircraft weapons and army artillery, radar from Harvard and MIT, and the largest of all the early computer projects, SAGE, led by MIT. All these institutions included subcontractors from the private sector in these various
projects. The combination of university and private sector R&D emerged as the favorite research strategy used by DoD for decades.12 R&D increased at the Pentagon in the late 1940s and early 1950s, running into the hundreds of millions of dollars each year and often comprising 80 to 90 percent of all federal funding for all manner of military R&D.13 Various agencies emerged in the 1940s to direct the research. American scientist Vannevar Bush, builder of early analog devices at MIT in the 1920s and 1930s, created the Office of Scientific Research and Development (OSRD), which coordinated much R&D during World War II. The Ballistic Research Laboratory (BRL) was also an early investor in various ballistics projects, beginning with the ENIAC, which was moved to its facilities in Maryland in 1947. Additionally, the Navy established the Office of Naval Research (ONR) in 1946. As the Cold War heated up, along with the attendant nuclear arms race and later the space race, both the Navy and Air Force took aggressive steps to ensure funding and management would be in place to sustain the development of new digital tools.14 The ONR became the leading military agency funding projects in the late 1940s and 1950s, with the Air Force playing a similarly important role.15 Edwards reported that by 1948 the ONR alone was funding 40 percent of all basic research (not just on computers) being conducted in the United States and that two years later it had 1,200 projects scattered across 200 universities.16 Major computer projects of the day included Whirlwind (later part of SAGE) at MIT, Hurricane at Raytheon, and the Mark III at Harvard.17 As the 1950s progressed, the Navy and Air Force added specific development contracts to the list of R&D projects, which they awarded to major electronics and other private firms. For example, all of IBM's early computers (circa 1950s) were wholly or partially funded by the military.18 Key participants also included Northrop Corporation (for example, with the Snark missile); Bell Labs (Nike missile); Burroughs and NCR, with various ballistics and avionics applications; Engineering Research Associates (ERA), which worked initially on cryptographic computing and later built the ATLAS, usually dubbed the second electronic stored-program computer in the United States when it went "live" in 1950; and Univac, for its systems.19 By the early 1950s, while computing R&D was beginning in the private sector, the federal government still funded about 75 percent of the costs of all major projects, with the lion's share focused on military requirements.20 Even though this massive injection of funding made it possible for the U.S. computer industry to come into existence and lead world production in the 1950s, hence launching the capability in the private sector for companies to start doing their own R&D in the field, the Pentagon and various civilian government departments continued to support military projects in the 1960s. The military extensively supported defense research on miniaturization of electronics, and particularly development of semiconductors, right through the 1970s. Even Texas Instruments' famous introduction of the integrated circuit (IC) at the dawn of the 1960s was done with Air Force funding. ICs were critical for all missile guidance systems from the 1960s to the present and also for avionics in military aircraft, beginning in the 1970s.
An important organizational change came in response to the Soviet Union's launch of the Sputnik satellite in 1957: the establishment of the Advanced Research Projects Agency (ARPA) in 1958, renamed DARPA (adding Defense to its title) in 1972. Over time, this agency funded many projects involving telecommunications networks and such advanced forms of computing as artificial intelligence, graphics, intelligent sensors, software, semiconductors, SDI, time sharing, and advanced computer architectures. Its greatest claim to fame was the funding and development of the early versions of what came to be known as the Internet, a digital packet-switched telecommunications network used initially by academics, government officials, and companies working on military projects.21 DARPA remained one of the largest supporters of basic research on computing, and its military uses, in the world right into the early twenty-first century.22

Over the past half century, the government as a whole devoted considerable attention to the organization and management of clusters of research facilities. Combinations of universities, private firms, national laboratories, and other facilities were collectively called Federally Funded Research and Development Centers (FFRDCs). These were distinct centers that were sometimes housed on university campuses, within government agencies, or in the private and nonprofit sectors; each participant in this community might also be a subcontractor to an FFRDC. They emerged over time in line with how technology evolved, on the one hand, and, on the other, in response to the changing interests of the military community. They began with work on operations research during World War II; by the early 1960s, others focused on systems analysis and systems engineering in basic scientific work, along with centers devoted specifically to IT. Between World War II and the mid-1990s, 150 such organizations became FFRDCs, and of this total, 70 were controlled by DoD. The earliest were devoted to military projects; not until the mid-1970s did civilian research centers enter the program in growing numbers; by then 40 FFRDCs existed, 10 of which DoD controlled.23 From the beginning, the military services were active. The Navy set up, for example, the Operations and Evaluation Group (OEG), while the Air Force created RAND to meet its needs. Specialized organizations sometimes spun off into private ones, such as the RAND Corporation in 1948, the Systems Development Corporation (SDC) in 1957, and MITRE, which spun off from MIT in 1958. Many of these centers were located at universities, such as Lincoln Laboratory at MIT and the Software Engineering Institute at Carnegie-Mellon University, funded by DoD and working on projects for the military. While FFRDCs were established to handle all manner of R&D for the federal government, it is very telling how many were devoted to military research, the majority of which involved computing topics. Table 3.3 documents for select years the number of military-centric centers.24 Almost all of the early centers were established in direct response to threats posed by the Cold War, a focus that remained unchanged until the early 1990s. Although outside the scope of our discussion of the DoD's role, other government agencies also contributed funding, management, and staffing for R&D on military projects from the 1950s right into the next century.
Table 3.3 Number of Federal and DoD Research Centers, Select Years, 1956–1995

Fiscal Year   Total Federal   Total DoD   Shared DoD with Other Agencies
1956          46              27          18
1961          66              43          20
1966          47              23          19
1971          68              13          21
1976          37              8           20
1981          35              6           21
1986          36              10          20
1991          41              11          22
1995          39              10          19

Note: Part of the reason for the decline in the number of DoD centers in the 1970s was the surge of R&D taken up by the private sector, which resulted in commercially useful technologies directly applicable to the military, such as general purpose computers.
Source: U.S. Congress, Office of Technology Assessment, A History of the Department of Defense Federally Funded Research and Development Centers, OTA-BP-ISS-157 (Washington, D.C.: U.S. Government Printing Office, June 1995): 51–52.
These included the Atomic Energy Commission (AEC), which played an important early role in the development of nuclear weapons and their supporting systems; the National Science Foundation (NSF), which funded projects much as DARPA did but also R&D on computing for civilian projects; the National Institutes of Health (NIH); the National Security Agency (NSA); and the EPA.25 While it was not obvious in the late 1940s that computers could be useful to the military, by the early 1950s that doubt had been dispelled, and leveraging academic, private, and national laboratories became the way DoD added to its store of knowledge about digital technology and developed many applications. Over time, it became increasingly obvious that computers could perform complex calculations quickly, operated reliably enough (with the prospect of improving performance), and, certainly by the early 1950s, were shrinking in size while adding capacity. By the mid-1960s, enormous advances had been made along each of these dimensions. Research and deployment projects tracked along the lines of what the technology proved capable of doing. For example, in the 1940s, what was seen as a scientific device was deployed to perform calculations that the equipment could do faster and more accurately than human calculators.26 Next, in the 1950s, R&D led to the development of ways to control weapons, such as missiles, and to guide aircraft.27 By the mid-1960s, the Pentagon had become very interested in a wide variety of projects that could simulate battle conditions and also engineering problems that might be faced by emerging weapons systems.
Table 3.4 DARPA-Sponsored Categories of Digital Military Research Projects, 1999–2005 (funding in millions of dollars)

Project Type                                     1999    2001    2003    2005
Defense research sciences                        57.4    90.4    94.4    96.1
Next-generation Internet                         42.0    15.0    0.0     0.0
Computer systems & communications technology     309.1   376.6   355.4   364.3
Extensible info systems                          0.0     69.3    90.0    95.0
Tactical technology                              159.0   121.1   151.1   174.3
Integrated command & control technology          38.3    31.8    0.0     0.0
Materials & electronics                          268.6   249.8   215.3   230.6
Advanced aerospace systems                       0.0     26.8    40.0    44.0

Source: U.S. Department of Defense, Unclassified Department of Defense FY 2001 Budget Estimates, February 2000, vol. 1, Defense Advanced Research Projects Agency (Washington, D.C.: U.S. Department of Defense, 2000): unpaginated, available at http://www.defenselink.mil/comptroller/defbudget/fy2001/budget_justification (last accessed 6/1/05).
A great deal of interest developed in the 1960s in how to automate battle decisions, using computers to rapidly acquire information and take action. This focus has continued to the present and often is labeled the "electronic battlefield" or "Information Age Warfare."28 As with private sector applications, each new use was met with considerable skepticism and was applied only incrementally, in an evolutionary manner, as each improvement in the technology demonstrated the capability of the digital hand to do something better or newer than previous methods. The Pentagon nonetheless remained a continuous and major supplier of funding and management for projects scattered across the American economy. Table 3.4 lists important categories of defense-related IT projects of the late 1990s and early 2000s, all funded by DARPA. The list would have looked quite similar in the 1970s and 1980s as well. The projects covered a wide range of research, much of which had been under way in various forms since the 1960s: work on the Internet, intelligent systems and software, information survivability, asymmetric military threats, network-centric warfare, software for autonomous operation of equipment, and embedded systems. They also included a host of projects directly related to specific weapons: naval warfare, land systems, and a series of technologies for targeting, tactical support, aeronautics, and logistics. Not included in the table is a new category of research begun in the 1990s, defense against biological warfare, which, while not digital, could become so in time. By the early 2000s, DoD funded research on this subject at far greater levels than R&D for defense research sciences or tactical technologies.
While many projects were supported by DoD, a couple of examples provide a sense of what was being done. Both the Navy and Air Force were early and consistent supporters of research on computing for weapons systems, although officers of all ranks would not support the proposed deployment of any system that had not proven its worth. The Army came later to using computers for weapons, although, as we shall see, it was just as eager to deploy computers for accounting, logistics, and inventory control. Historians have so extensively documented SAGE, the air defense network, that we can look to other R&D initiatives for insights.29 Turning to the development of a family of weapons systems is a fruitful exercise for identifying patterns of relationships between R&D and computing. The creation of ballistic missiles demonstrates that without the helping hand of the digital, this class of weapons could not have been created or deployed in the forms it took. Its development illustrates several behaviors. First, the weapon could not work without lightweight avionics to guide missiles to their targets, a role played by a combination of onboard and ground-based computers. Second, as with most weapons systems, all types of R&D occurred over a long period of time, in this case from the end of World War II to the present, with many projects leading to the creation of a system and then many additional ones incrementally improving its efficiency and performance. Third, these were expensive and sophisticated, indeed very complex, projects that simultaneously strained the entire body of knowledge in various fields, ranging from computing to electrical engineering to ballistics.
Figure 3.1 Use of computers in development of missiles was a major application in the Navy, circa mid-1950s. (Courtesy IBM Archives)
While ballistic missiles were developed in the 1950s and 1960s, in the early years of the twenty-first century that body of research was still being extended as the DoD continued work on the SDI project, which is intended to provide a SAGE-like shield, this time not just against enemy aircraft but, more importantly, against enemy missiles. These patterns of behavior regarding R&D in computing apply to the vast majority of military projects because of their complexity and the magnitude of their deployment. Much as the IRS found, nothing the military did proved to be small, inexpensive, or easy to complete. Paul Edwards was one of the first civilians to document the complexity of all these projects. In addition to size and scientific and engineering complexity, the military faced the normal day-to-day IT problems that the private sector also faced, with systems that worked or did not, magnified because computing had been so extensively deployed across all processes and most complex weapons systems. Always in the forefront for military leaders was concern about whether complex systems and weapons would work in the heat of battle. Proliferation of programming languages also plagued the DoD, forcing it in 1983 to standardize on one, called Ada, intended to extend standardization of IT across the entire department, simplify maintenance, and facilitate connectivity of systems.30 Robustness of systems—both physical and technical—remained an intense concern from the earliest days, an issue that has not yet gone away.31 Edwards documented, for example, a chronic collection of problems with the avionics of the F-15 fighter jet, a workhorse for the Air Force.32

Growing out of the experience of early German missiles fired at London during World War II, the American military community recognized the future potential of missiles and set about creating its own in the late 1940s, extending their development to the present, with the U.S. Air Force (USAF) largely responsible for their evolution and deployment. For a variety of reasons, ranging from interservice rivalries to normal start-up efforts, work on the first generation of USAF long-range, strategic ballistic missiles (Atlas, Titan, and Thor) did not really progress until the end of the Korean War. The original concept of a pilotless aircraft had not changed for decades. Budget constraints in the late 1940s and early 1950s often dictated the pace of development, gating the involvement of American aircraft manufacturers and other firms that provided components and subassemblies. Problems to overcome with missiles involved reentry, range, guidance systems, efficient and effective motors, and fuels. Always on everyone's mind was what platform could best be used to deliver nuclear warheads and, secondarily, other explosives. Nuclear warheads became lighter and more powerful starting in the early 1950s, a development that influenced the configuration of rockets for long-range delivery. USAF coordination of development work became increasingly centralized by the mid-1950s, raising hopes of reducing costs and improving efficiency. At the same time, a strategy of parallel development of subsystems emerged that has been applied in one manner or another to the present. This approach allowed multiple projects to thrive simultaneously, encouraged development of interchangeable components for various rocket programs, and spread new knowledge of rocketry among companies supplying the DoD. For a typical missile system, including the three earliest ICBMs, subcontractors were recruited to develop the airframe, propulsion, guidance,
nose cones, and computer systems. Development involved multiple generations of missiles: as one came online, the next generation was in development. Development of guidance software and hardware, and of other digital and analog subsystems, spread to several companies. For the Atlas and Titan, General Electric and Bell Telephone were recruited, respectively. The USAF named Burroughs the prime vendor for made-to-order computers for the Atlas and gave the same chore for the Titan to Remington Rand.33 During the early months of the Kennedy administration, these systems came online, with thirteen ICBM squadrons of Atlas missiles and six of Titans ultimately established. The feared "missile gap" with the Soviets had, for all intents and purposes, vanished, but newer systems continued under development. The success of the three early weapons systems gave confidence to both the military and private sector contractors that ever more sophisticated missiles could be developed, particularly as digital technologies shrank in size (largely due to the introduction of the integrated circuit in the 1960s) and larger software programs became possible to write and run. Older missiles were constantly updated with new hardware and software in the 1960s and 1970s, establishing a pattern of continuous replacement of subassemblies that ran all during the second half of the twentieth century.34 To help put the size of the R&D effort in context, in the late 1950s and early 1960s approximately 2,000 contractors worked on missiles. Every major computer manufacturer of the time participated.35

A similar tale could be told about how development of avionics for Air Force and Navy aircraft, and later of similar systems for ships, was managed. Each of the uniformed services had its own R&D operations for basic and applied research and for deployment. Highly specialized analog and digital control systems were developed for each, designed to increase coordination of aircraft and ships in more complex, faster-flowing combat operations. Computers, particularly their integrated circuits, made it possible for both services to develop highly maneuverable planes in the 1970s, most notably fighter aircraft, and, in the 1980s, stealth combat aircraft. Conventional fighter aircraft reached their pinnacle in the 1970s and early 1980s and were then followed by the stealth class of "black" aircraft, which made detection by conventional radar nearly impossible. This class of aircraft heralded what many have called a new age of aircraft, comparable in importance to the development of jet craft in the 1940s.36 Developers used computers to assist in the design of this new class of aircraft and deployed avionics systems that enhanced flight stability while offloading from pilots a growing number of mundane operational activities.37 Because these complex weapons platforms, such as ballistic missiles and fighter aircraft, often took two decades or more to develop, firms and government agencies built up specialized knowledge that permitted them to participate in the development of ever newer systems for the military. This practice applies to current work under way in developing SDI and stealth aircraft.38

How did the Pentagon manage basic research in computing in such areas as artificial intelligence and supercomputing, applications that have yet to emerge fully in weapons systems, particularly in the 1980s and 1990s? Research focused
not only on weapons but also on the wider issue of command and control systems, which involved larger, more complex computing projects ranging from work on artificial intelligence to more advanced computer chips. DARPA had lead responsibility for charting the R&D effort and distributing work among companies and universities. The major set of initiatives of the 1980s and 1990s was called the Strategic Computing Program and grew largely out of the Reagan administration's Strategic Defense Initiative. Research projects also involved intelligent systems, robotics, automation, visual programming, survivable networks, chip developments, integrated packet nets, distributed sensor nets, machine architectures, and a variety of operational applications. In short, a wide variety of work on basic and applied computer science once again became the subject of great interest at the Pentagon. As the 1980s turned into the 1990s, officials invested increasingly in machine intelligence projects, initiatives that had not yet resulted in significant new applications by the end of the decade. In the 1990s and early 2000s, projects also included work on nanotechnologies, security systems, and telecommunications.39 Officials envisioned various applications becoming possible from this research, an expectation dating back two decades. All through the 1980s and 1990s, uniformed and civilian officials became enamored of the idea of computer-driven information battlefields. Their thinking included development of precision weapons, significantly advanced forms of intelligence gathering to warn of problems and to set targets, and a myriad of software tools to enhance, in an orchestrated manner, command and control of a wide variety of activities, people, weapons, and vehicles. DARPA officials envisioned a combination of commercially available IT, such as civilian networks, working with specialized software and hardware to create the information battlefield of the future. Shifting decisions from people to equipment increasingly became a theme for discussion and research. As had been true for many decades, uniformed personnel were reluctant to turn over command and control to machines unless these proved effective, while civilian and engineering officials were eager to explore new possibilities. All through the 1990s and beyond, many mid-career and senior military officers and DoD civilian personnel debated the issue while work quietly went on in developing new systems. For example, at the start of the new century, researchers building neural networks for intelligent flight control for NASA began considering trials involving the F-15 and later the F-18 flight simulators.40 This recent effort demonstrates several patterns of application development at DoD. First, the time from the conception of an idea to its deployment on ships, planes, or battlefields often ran over two decades; in the case of the F-18 flight simulators, the ideas had been the subject of testing and development since the 1980s. Second, these efforts often involved creating new computer science, not simply new applications, which partially accounted for the enormous time and costs involved. SDI, conceived in the 1980s, was still not a reality in the early years of the new century, although an enormous amount of work had gone into creating new uses of computing in missiles and air defense systems, just
as had occurred in the 1950s with the creation of SAGE, the nation’s first computer-assisted air defense system. Third, these projects were clusters of R&D initiatives farmed out to national laboratories, universities, and corporations in waves over the years. This pattern of sourcing, outsourcing, and in-sourcing R&D led to the broad dissemination of leading-edge R&D in IT and telecommunications across wide swaths of the American economy over some six decades.
Inventory Control and Logistics

Generations of military personnel have heard, and believed, the old quip that "nothing happens until something moves." They meant people and materiel to support troops, sailors, and airmen. Some of the earliest deployments of computers by the military were not in combat or in air defense systems. Rather, uses mimicked many of the applications that first surfaced in the private sector: inventory control, cost accounting, purchasing, and logistics, often originally run on punched-card tabulating equipment. Because these uses involved the majority of civilian and military employees of DoD for over six decades, and because for many they were their initial introduction to data processing and to IT in general, it is important to understand such early uses of computing before moving to what one might normally expect to discuss, such as combat operations, simulations in training, or information age warfare. To understand the role of computers, keep in mind that there were three interrelated sets of processes that interacted with each other over time and exchanged data among themselves. These consisted of purchasing (acquisition of materiel or services), inventory control (receiving, storing, and tracking of goods), and logistics (movement of goods and services to where they were needed).41 The histories of these three processes within DoD have yet to be written, but we know several facts about them. First, they were massive in size, complexity, volume of dollars expended, number of digital systems involved, staffs dedicated to them, and number of people, industries, and companies they affected. Second, they often operated independently of each other but over time became increasingly integrated. Third, they mirrored patterns of behavior in the private sector.42 A Defense Logistics Agency mission statement from 1991 clearly summarized the purpose of these processes: "to provide effective and efficient logistics support to the Military Departments and other organizations. The Agency's vision is to continually improve the combat readiness of America's fighting forces by providing America's soldiers, sailors, airmen, and marines the best value in supplies and services, when and where needed."43 This statement could have been made in any decade since World War II.

Every uniformed service had inventory and logistics functions dating back to the Revolutionary War, a circumstance that has continued to the present. However, in an ongoing attempt to improve efficiencies and to reduce expenses, various organizational and operational reforms were implemented to centralize and better coordinate these functions. For example, in 1952, the Army, Navy,
and Air Force established a center to control the identification of supply items, a first for the military. As in manufacturing industries, migrating to common inventory nomenclatures, such as item numbers, made it easier to track available inventory and what was on order, and to share that information across the services. Additional reforms in the mid to late 1950s increased coordination across the services, making it easier to track vast amounts of inventory by using computers and leading decision makers to find newer technology more attractive than older manual or partially automated processes, many of which originated during World War II. In 1961, Secretary of Defense Robert McNamara established the first DoD-wide organization to handle the entire spectrum of these functions, the Defense Supply Agency (DSA). The agency spent the 1960s and 1970s consolidating processes and systems, changing its name in 1977 to its current form, the Defense Logistics Agency (DLA).44

Beginning in the early 1950s, each of the uniformed services launched studies on the feasibility of using computers to handle inventory control, purchasing, and logistics, with the result that by the mid-1950s DoD began installing computers in support of these processes. One interesting and early application was the use of an IBM 705 by the U.S. Air Force to manage personnel as if they were inventory. Going live in July 1956, the system was an upgrade from an old punched-card system; it tracked the availability of officers and enlisted personnel and their movement, producing various personnel reports and tracking transfers of some 30,000 to 35,000 people per month.45 At approximately the same time, the Navy installed a similar computer to manage parts for ships, monitoring inventory of 181,000 items worth $550 million. The purposes of the system were to lower the costs of inventory, ensure adequate supplies of parts, and reduce the number of people required to handle the process—all the same reasons the private sector used to justify such systems.46 The Army installed its first IBM 705 in 1956 in support of inventory control for its Signal Supply Agency, which managed 162,000 items worth $1.4 billion in support of military installations around the world. With the help of the 705, the Army was able for the first time to marry inventory management and requisition processing, reducing the man-hours needed for both and carrying out better and more accurate inventory control.47 Other systems appeared at various bases across all the services during the second half of the 1950s and early 1960s. While the number of systems installed is not absolutely clear, extant evidence suggests that several hundred computer systems were implemented.48 For the day, the volumes of items tracked and their cost were massive and, as one contemporary study after another made clear, could not be handled with manual systems. One report on the Air Force's Air Materiel Command argued that this agency managed more assets, purchasing more in 1960 ($36 billion), "than General Motors, United States Steel, American Telephone and Telegraph, Metropolitan Life, and Western Electric combined."49 At the time, this command used a dozen IBM 705 computers, nine IBM 650 tape RAMAC systems, two 305 RAMACs, 26 IBM 650 computers, and over 3,000 pieces of unit record equipment, such as punched-card readers, printers, and tape drives, to process 1.6 million items.50 By
the early 1960s, half of all computers in DoD were used by the Air Force (400 of the 800 in all of DoD), and of the 400 in the USAF, 125 were dedicated to supporting its inventory management around the world, processing supplies and parts worth $12 billion. Like the private sector, the Air Force wanted to minimize volumes and also the number of people deployed in inventory management. The USAF reported that using this first generation of computers reduced the number of personnel required to perform this work from 212,000 in 1956 to 146,000 in 1961.51 Yet by any measure, military inventory and logistics systems remained massive, even by today's standards, let alone by those of the late 1950s and early 1960s.

All of the services included in their inventory control and logistics processes a set of activities, and uses of computing, to handle procurement. Precomputer uses of information technology in this area reflected the same pattern adopted for inventory control: punched-card tabulating equipment and smaller office appliances. By the 1960s, purchasing and inventory control were integrated activities, extending from planning for what had to be ordered, through contract award and inspection of what was received, to accounting reports used to pay vendors. Over a dozen reports typical of purchasing applications, such as bidding requests and contracts let, were routinely produced in the 1960s, a practice that continued to the present. The move to computers in the early 1960s made it possible to expand the variety of analysis and data tracking beyond old punched-card applications, mimicking what the private sector did in the same period.52 By the early 1960s, it had become impossible to manage these processes without using computers; all during the 1960s, systems were upgraded, new applications added, and their use extended to all corners of the military establishment. In 1965, for example, the Pentagon launched what it intended to be a ten-year project to upgrade to a second generation of computerized inventory management. This initiative led to the development of the Logistics Information System, an effort that took twelve years to accomplish at a cost of $61 million. Thus, like the IRS, DoD had implemented systems in the late 1950s and throughout the 1960s that were, by the standards of the day, modern, made possible by migrating from old punched-card systems to computers. These required some redesign of processes and the writing of software that took advantage of the technology.53 Like the IRS, the military remained dependent on these early systems for many years. During the 1970s, it did essentially the same as the IRS, applying computers to preexisting work practices. Assistant Secretary of Defense for Installations and Logistics Barry J. Shillito in 1973 told those interested in military logistics that "the challenge for the next four years, and the remainder of this decade, is to continue this progress."54 Beginning in the late 1960s and extending through the 1970s, each of the uniformed services continued to control and enhance its own logistics and inventory control systems. Complaints from auditors and blue-ribbon panels about redundancies began to appear. One such report, dated 1970, argued that "many of the modules of these systems perform almost identical functions, such as warehousing, shipping and
receiving, inventory control, etc." and that "software programming for each of these is costly and each independent modernization step taken on the many separate programs involves unnecessary duplication and appears to lock in more tightly the incompatibilities of the various systems."55 Nonetheless, these systems remained, with hardware often upgraded as new machines became available, although many of the applications were essentially the same as those developed in the late 1950s and 1960s. In fairness to the military, however, new technologies were bolted onto these processes much as occurred in the private sector. For example, after bar code systems became available in the late 1970s, the Pentagon did not hesitate to integrate them into existing inventory and logistics systems, much as it did with RFID technology in the early 2000s. One of the first uses of bar codes involved tracking maintenance data for airplane parts, an upgrade from earlier punched-card tracking systems.56 Bar coding spread across all the uniformed services for all manner of inventory and logistical processes during the 1980s and 1990s. Annual reports of the Defense Logistics Agency from the late 1980s and early 1990s pointed out that the DLA focused less on upgrading logistics systems already in place than on lowering operating costs, "building an effective relationship with industry," and improving performance in general.57 When e-commerce practices came into their own at the turn of the century, DoD began using this new form of IT as well.58 Pentagon logistics managers began thinking of their subject area in much the same manner as the private sector, viewing logistics as supply chains, with the basic strategy in wartime of deploying preexisting stockpiles of weapons, supplies, and food accumulated in peacetime for quick response while ordering backfills and additional quantities from suppliers as needed.

The planning and execution of the war in Iraq at the start of the 1990s strained the logistical capabilities of the Pentagon. The logistics strategy involved situating massive quantities of materiel in the Middle East prior to the war. Existing logistics systems proved capable of ordering, moving, and tracking the massive quantities of supplies needed for that initiative. Following the war, strains on the logistics processes subsided until the start of the second war with Iraq (Operation Iraqi Freedom). Beginning in 2001, the Pentagon planned for this war and started deploying troops in the theater in late 2002; DoD launched major combat operations in March 2003. In support of this new military campaign, the logistics community in DoD had to move in excess of 2 million short tons of cargo to the Persian Gulf, ranging from equipment and spare parts to food and batteries. It would be difficult to exaggerate how large an effort that was; from October 2002 through September 2004, for instance, expenditures for supplies and operating support reached $51.7 billion, and officials spent an additional $10.7 billion just for transportation of people and supplies.59 By any measure, all existing systems were being strained. The GAO pointed out after the early stages of the second Iraq war that existing supply chain management practices were problematic, not because of the quality of the IT systems, but due to high levels of inventory and a lack of the "right" inventory needed in Iraq.
One major IT flaw it identified was the poor quality of computerized models used to forecast item
demand, particularly by the Army. All forecasting was computer based, and in the case of the Army, the model was programmed to simulate peacetime demand; it reacted insufficiently to wartime circumstances (a dynamic illustrated in the sketch at the end of this section). In addition, the GAO criticized the military in general (civilian and uniformed services) for logistics IT systems that were unable to "support the requisition and shipment of supplies into and through Iraq."60 For instance, the use of RFID tags on pallets had been mandated prior to the war, but as of 2004 only a third of all pallets and containers coming into the theater had RFID tags, for reasons ranging from a lack of personnel trained in the use of the technology to disagreements over who had jurisdictional control of inventory.61 Spot shortages in this war became highly visible to the American public and created a significant political problem for DoD. The GAO's auditors blamed the Pentagon for the problem: "the logistics systems used to order, track, and account for supplies were not well integrated; moreover, they could not provide the essential information necessary to effectively manage theater distribution."62 In short, the criticisms mirrored those faced by senior management at the IRS. Furthermore, the GAO reported similar root causes: disconnected systems preventing visibility across the entire supply chain; communications problems, particularly with field forces over radio; new address codes for forces in the field not matching those in computer systems; and a lack of sufficient training in the use of these IT systems.63

The reason for dwelling on these IT problems is to demonstrate how important these applications of the digital hand had become over time for the military. Failures in IT logistical systems in the 1950s and early 1960s would have been irritating but not crucial; today, the opposite is the case. GAO auditors said that risks to troops in future wars would be great if these problems were not solved: "Troops will continue to face reduced operational capabilities and unnecessary risks unless DOD's supply chain can distribute the right supplies to the right places when warfighters need them."64 Like the IRS, the DLA committed to another round of modernizing its software, largely by using commercially available off-the-shelf packages, integrating data from various files and systems, and maintaining accurate, up-to-date inventory status (on order, in storage, location, and so forth).65
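The GAO's point about forecasting models bears a brief illustration. The sketch below is purely hypothetical: its smoothing constant and demand figures are invented, and it is not drawn from any actual Army system. It shows in miniature how a forecast tuned for stable peacetime consumption lags a sudden wartime surge, producing exactly the kind of shortfalls the auditors described.

```python
# Hypothetical illustration of the GAO's criticism: a demand forecast tuned
# for stable peacetime consumption adjusts too slowly to a wartime surge.
# The smoothing constant and demand figures are invented for illustration.

def smoothed_forecasts(demand_history, alpha):
    """Exponential smoothing: each forecast blends the latest observation
    with the prior forecast; a small alpha means slow adjustment."""
    forecast = demand_history[0]
    forecasts = []
    for observed in demand_history:
        forecasts.append(forecast)
        forecast = alpha * observed + (1 - alpha) * forecast
    return forecasts

# Five periods of stable peacetime demand for a spare part, then a surge.
demand = [100, 102, 98, 101, 99, 400, 420, 410, 430, 425]

forecasts = smoothed_forecasts(demand, alpha=0.1)  # peacetime-tuned alpha
for period, (obs, fc) in enumerate(zip(demand, forecasts), start=1):
    shortfall = max(0, obs - round(fc))
    print(f"period {period:2d}: demand {obs:3d}, forecast {round(fc):3d}, shortfall {shortfall:3d}")
```

With a peacetime-tuned smoothing constant, the forecast remains under half the new demand level for several periods after the surge begins; only a model retuned or redesigned for wartime conditions closes that gap quickly.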
Weapons Systems and Ordnance

The first image that probably comes to mind when thinking about the role of computers in ordnance (such as bombs and missiles), or in the more nebulous notion of weapons systems, is the "smart bombs" that the United States used in the Iraq war of 1991. Digital guidance embedded in bombs communicated back and forth with pilots and others, leading to the precision bombing of targets. The availability of smart ordnance, however, is a relatively new development, although the practice of using computation to direct "dumb bombs" or artillery shells pre-dates the arrival of digital computers. The idea of using computers to help coordinate use of weapons as part of a system of activities dates back to the 1940s. The notion of embedding intelligence in weapons systems lies
at the heart of how the military wanted to use computers to coordinate defensive and offensive actions in warfare, what in the parlance of the military is normally referred to as command, control, and communications. Use of computing in weapons spread over the decades in three waves. The first involved analog devices in the 1930s and early digital computers in the 1940s to develop firing tables for Army and Marine Corps artillery, for Air Force bombers, and for the Navy's large guns. The second wave concerned the use of computers to guide missiles to their targets, discussed earlier in this chapter. The third wave involved deployment of smart bombs, first developed in the 1980s, a process still unfolding as a new generation of technology.66 Underlying these three waves of ever increasing use of computers in weapons was the integration of the weapons themselves with the "platforms" that carried them, such as ships and airplanes, all within the larger framework of a battlefield command and control process supported extensively by computers. These systems provide commanders intelligence on the action under way and give them the ability to send orders down the entire chain of command, increasingly right to the battlefield. Wave after wave of digital introductions increased the use and influence of computing in battle over the decades, leading to a situation where today no battle can be fought without extensive use of computing in one form or another.67

As early as 1932, the Army had looked at the possibility of using Professor Vannevar Bush's differential analyzer at MIT—an analog calculator—to improve the accuracy of artillery and bombing tables, subsequently building up a decade of experience in thinking about the application. In 1942, after the start of World War II, the Ordnance Department contracted with the University of Pennsylvania to build a large processor to prepare such tables, the system that came to be known as the Electronic Numerical Integrator and Computer, or simply the ENIAC. While the story of the ENIAC and its successor machines has been told many times,68 for our purposes it is important to recall that this and other military-sponsored computers built in the 1940s and 1950s were used to create firing tables.
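To give a sense of what a firing-table computation involved, the sketch below numerically integrates a simple point-mass trajectory with air drag and tabulates range against elevation, the family of calculation the ENIAC performed thousands of times per table. It is a purely illustrative reconstruction: the constants and the drag model are simplified assumptions, not values from any actual table.

```python
# Illustrative sketch of a firing-table calculation: integrate a point-mass
# trajectory with velocity-dependent drag, then tabulate range by elevation.
# All constants are simplified assumptions, not values from any real table.
import math

G = 9.81         # gravity, m/s^2
DRAG = 0.00008   # lumped drag coefficient per meter (assumed)

def trajectory_range(muzzle_velocity, elevation_deg, dt=0.01):
    """Return the horizontal range in meters for one firing solution."""
    vx = muzzle_velocity * math.cos(math.radians(elevation_deg))
    vy = muzzle_velocity * math.sin(math.radians(elevation_deg))
    x = y = 0.0
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        vx -= DRAG * speed * vx * dt          # drag opposes horizontal motion
        vy -= (G + DRAG * speed * vy) * dt    # gravity plus vertical drag
        x += vx * dt
        y += vy * dt
    return x

# One miniature "page" of a firing table: range for each elevation setting.
for elevation in range(15, 76, 15):
    print(f"elevation {elevation:2d} deg -> range {trajectory_range(500, elevation):7.0f} m")
```

Contemporaries reported that a human computer needed many hours to work a single trajectory by hand; machines like the ENIAC repeated this kind of integration, with far more elaborate ballistics, in seconds.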
ordnance and missiles, trajectories, more firing tables, and, for the first time, guidance control data for weapons, particularly guided missiles. Other "firsts" that became standard fare included mathematical evaluations of antiaircraft and antimissile performance, calculations for early war game problems, and a variety of studies on the probabilities of lethality of various mines.69 In the 1950s, the Army also put into operation yet another system, shared with all the uniformed services, called the Ordnance Variable Automatic Computer (ORDVAC), which was assigned the calculations needed for the development and use of missiles.70 Field artillery in the 1950s received considerable help from computers. Fire control problems continuously arose as new shells and cannons were developed. The major development of the 1950s, which set the pattern for future artillery systems, was the Field Artillery Fire Control System M35. It used an electromechanical computer to prepare its directions, and while the system had a variety of weaknesses (accuracy declined with the distance a shell was fired), it proved especially useful for close-in shelling, such as that done with 105-mm and 155-mm howitzers. In the 1950s, the Army contracted for the construction of a new system called the Field Artillery Digital Automatic Computer (FADAC), a solid-state electronic digital computer, weighing about 200 pounds, that could operate in the field—a first for digital computers—and that required minimal training of artillery personnel. Because its mode of operation became the standard form late in the Vietnam War and beyond, a contemporary description is worth quoting: "Input consists of a manual keyboard and various arrangements of paper tape or another FADAC. When all the data, such as target location, powder temperature, gun location, and meteorological data, are entered, depression of a button initiates computation. Gun orders, comprising deflection, quadrant elevation, fuze [sic] time, and charge are displayed in decimal form."71 The same system was used in the operation of the Pershing, Sergeant, Lacrosse, and Nike-Hercules weapons systems. In the 1960s and 1970s, artillery became more mobile—even mistaken by the uninformed for tanks—with onboard computing increasingly available for artillery and for tanks as computers became smaller, hence more portable. By the 1980s, the digital hand was evident at work in all these systems, in rear positions guiding shelling operations, and increasingly linked to headquarters, where battlefield commanders could understand what was being fired and at what targets.
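The heart of what FADAC and the ENIAC-era firing-table work computed was ballistic: given a target range and firing conditions, find the gun settings that put a shell there. The sketch below suggests the idea with a crude point-mass trajectory and a search for quadrant elevation; every constant and the simple drag model are illustrative assumptions, not actual ballistic data or FADAC's algorithm.

```python
import math

def shell_range(elevation_deg, muzzle_velocity=472.0, drag=5e-5, dt=0.01):
    """Integrate a point-mass trajectory with crude quadratic air drag.
    Returns the horizontal distance (meters) at which the shell returns
    to launch altitude. All constants are illustrative assumptions."""
    theta = math.radians(elevation_deg)
    vx = muzzle_velocity * math.cos(theta)
    vy = muzzle_velocity * math.sin(theta)
    x = y = 0.0
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        vx -= drag * speed * vx * dt            # drag opposes motion
        vy -= (9.81 + drag * speed * vy) * dt   # gravity plus drag
        x += vx * dt
        y += vy * dt
    return x

def quadrant_elevation(target_range_m):
    """Bisect for the low-angle elevation that reaches the target range --
    one entry of the sort a firing table would pre-tabulate."""
    lo, hi = 1.0, 45.0
    for _ in range(40):
        mid = (lo + hi) / 2.0
        if shell_range(mid) < target_range_m:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

print(f"elevation for 8,000 m: {quadrant_elevation(8000.0):.2f} degrees")
```

A real firing table also folded in powder temperature, projectile weight, and meteorological corrections, precisely the inputs the FADAC description above enumerates.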
The Navy had a similar experience. It needed firing tables for its onboard ship artillery, and because, like the Air Force, it began experimenting with missiles in the 1950s, it, too, became extensively involved in using digital tools in support of this larger ordnance.72 By the late 1960s, the Navy was installing computer-assisted targeting systems on its ships for use with missiles.73 As one student described the effects of these new systems, by the mid-1960s they were "starting to become the 'glue' that bonded together an increasing number of shipboard weapon systems and sensors."74 The key to Navy artillery was linking its work with other radar and computational systems used to detect enemy threats to ships, and to coordinate various moving components in battle (ships, submarines, aircraft, missiles) in an integrated command and control battle management process.75 As systems became more advanced, less expensive, and smaller, their role on ships grew throughout the 1970s and 1980s.76 Old analog devices and systems were replaced with more versatile digital computers. The Naval Tactical Data System (NTDS) was the name given to a collection of applications that first came into existence in the 1960s as the onboard command and control process. These applications spread in the 1960s (installed on three ships in 1961, twelve in 1965, and thirty in 1969) and beyond, with senior naval officers gaining confidence in the effectiveness of computers in battle.77 These applications were installed in waves, on new ships and then retrofitted on existing vessels; earlier versions were later upgraded with new functions, a pattern that began in the 1960s and has continued to the present. Navy gunfire control systems became completely digital in the 1970s.78 In fact, by the late 1970s, minicomputers were ubiquitous on naval ships for all manner of computing, not just fire control. As the uniformed services became more comfortable with and reliant on computing to direct the firing of weapons, and nearly simultaneously on command and control systems to direct battle operations, the notion of "systems" as a way of integrating various activities spread through the services.79 Simultaneously, computing proved more useful as integrated circuits became more powerful, less expensive, and more reliable, and as software improved. New applications became possible. If one could build a computer-guided missile, why not also a computer-guided bomb? The case for having such weapons was obvious. If one could drop a smart bomb on a desired target, the mission would require only one bomb, one airplane, and one pilot to accomplish. Without that weapon, a commander needed to send many aircraft, drop many bombs, and support many maintenance crews, and even then was not guaranteed to have fulfilled the mission. A smart bomb lowered operating costs, promised improved performance and less collateral damage, and reduced the risk of crews being wounded or killed. So what are "smart bombs"? They are munitions guided to specific targets by a variety of means (such as a pilot) and carrying onboard computing that can communicate back and forth with the fire control software and command to stay on or alter course and to transmit images of the flight. Guidance systems are thus a two-part installation: computing on the ground (or in an aircraft) and in the bomb itself. The Army had attempted to develop radio-controlled bombs as far back as World War I, but without success; it is thus a concept that has been explored for decades. By the early 1960s, some bombs were equipped with television cameras that aided the bomb director on an aircraft, who could send signals to steerable fins on the weapon; it was a popular application late in the Vietnam War, and one that remained in use as late as the 1990s. Both the Army and Air Force experimented with laser-guided systems as early as the 1960s. Not until microchips became available in the 1970s did these prove practical, to be used most spectacularly first by the British during the Falklands War in 1982 and next by the Americans in 1991 in Operation Desert Storm.
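The two-part division of labor just described (a director comparing the weapon's reported position with the target and uplinking corrections to its steerable fins) can be suggested with a toy pursuit loop. The gains, cycle counts, and first-order steering model below are invented for illustration and bear no relation to any fielded guidance law.

```python
def director_correction(weapon_pos, target_pos):
    """Director side: compute the error between the weapon's reported
    position and the target; this is the correction sent back up."""
    return (target_pos[0] - weapon_pos[0], target_pos[1] - weapon_pos[1])

def glide_to_target(release_point, target, steer_gain=0.2, cycles=40):
    """Weapon side: each guidance cycle, report position, receive the
    director's correction, and let the fins close a fraction of the
    error -- a first-order toy model, not a real autopilot."""
    x, y = release_point
    for _ in range(cycles):
        cx, cy = director_correction((x, y), target)
        x += steer_gain * cx
        y += steer_gain * cy
    return (x, y)

# Released 2,000 m short of the target and 3,000 m up:
print(glide_to_target(release_point=(-2000.0, 3000.0), target=(0.0, 0.0)))
```

After forty cycles the remaining miss distance in this toy model is a fraction of a meter; real systems must of course also contend with dynamics, winds, and sensor error.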
Satellite-guided weapons relying on GPS networks to find specific targets became the latest iteration of this class of weapon.80 With each wave of smart bombs came increased use of computing, more advanced systems, and the need for constant improvement to ensure accuracy and reliability. Precisely because they are the newest iteration of weapons, smart bombs are not perfect. In the Iraq war in 2003, there were instances in which smart bombs did not hit their intended targets; but they had advanced so far that, while still used alongside "dumb bombs," they had clearly become the wave of the future for bombing. By the mid-1980s, all the major components of modern digital applications in weapons systems had been designed and implemented, to one degree or another, across the major organizations and activities that made up modern warfare. Computer-enhanced weapons were systems, not merely bombs and artillery configured with digital features. The work of airplanes, ships, and tanks, for example, needed to be coordinated for the weapons systems to work, much as machine guns firing straight through propeller blades in World War I had to be synchronized so as not to shoot off the propellers. One Pentagon training manual, published in the 1980s, described the modern military aircraft and the role of computers, illustrating the interdependencies involved: The Korean War vintage cockpit, packed full of instruments, has been replaced with a cockpit containing just a few instruments and controls and display screens. Any information required by the pilot is simply displayed on the screens at a push of a button, anything from attitude indications to the status of weapons. They are all under the control of computers. Furthermore, there are now new subsystems that would not be possible without digital systems: diagnostic systems that can display the health of all the major subsystems and "expert systems" that provide the pilot with information on the various options available during a particular mission.81
The same report went on to say "that computers and software have introduced a whole new dimension to our weapon systems," thereby expanding the capability of warriors at all levels of command to have greater flexibility, speed, and effectiveness in the use of their weapons.82 Table 3.5 lists major software applications common to the operation of Air Force and Navy combat aircraft by the late 1980s. For aircraft to deliver ordnance, the services needed a variety of software tools, some of them used on the ground, such as those for maintenance, training, and mission preparation. These tools also had to operate in real time and be fault-free, with far fewer failures than a business or accounting application. Clipping through a quick discussion of the injection of computing into weapons systems could lead one to conclude that these systems worked well and that problems were minimal. Nothing could be further from the truth. What these weapons all had in common was that they were new, unlike accounting or logistical applications.
Table 3.5 U.S. Military Aircraft Weapon System Digital Applications, circa 1988
Flight management       Battle management
Data reduction          Crew training
Maintenance training    Automatic testing
Scenario/analysis       Mission preparation
Source: Adapted from Defense Systems Management College, Mission Critical Computer Resources Management Guide (Washington, D.C.: U.S. Government Printing Office, 1988): 2–6.
While they built on prior experiences, the requirements were more stringent: they had to be more rugged than a computer system in an air-conditioned data center; they had to be smaller and lighter; they had to be more durable and reliable, because people's lives depended on them; and they had to be effective. It always took years of R&D, experimentation, and innovation, and then persuasion of the military to adopt them. The same patterns were evident in command and control systems, discussed more fully below. Firing tables developed in the 1940s and 1950s were subject to errors and had to be improved; software problems meant that not all missiles flew flawlessly in the 1950s and 1960s; even the vaunted SAGE system had a series of false alarms that created near-war crises with the Soviet Union;83 and software "glitches" plagued every new innovation, a pattern that persists, just as is often the case with complex civilian systems. On a technical level, new concepts often had to be developed. In the case of firing tables in the 1940s, as one mathematician noted, since firing data had to be automatically calculated, "it became necessary to develop a quantitative theory of information in order to solve the associated engineering problems."84 Pentagon officials worried about the growing complexity of systems. One wrote in 1975: As the role of digital computers has increased, so has their criticality in terms of performance and cost. Computers perform aircraft, ship, missile and spacecraft navigation, guidance and control, weapon control, target detection and tracking, combat direction, communication distribution and processing, automatic testing and other critical functions that affect the success or failure of strategic and tactical missions. Computer technology within Department of Defense (DOD) weapon systems is a relatively new field.85
He was blunt when it came to reliability: "Although hardware reliability has improved substantially, the corresponding gains in system reliability have not been realized."86 In the 1960s and 1970s, Pentagon officials complained more about the poor quality of systems development and discipline than about hardware and software. But complexity was a major factor. The USAF FB-111 aircraft of
1966 had an on-board computer with a 60,000-word memory, while the 1988 version of the B-1B bomber had a system that could hold 2.5 million words. As hardware capacity and reliability increased between the 1970s (when microprocessors became widely available) and the end of the century, the majority of problems and concerns moved from hardware to software for all manner of computing, not just weapons systems, and to "uneven application of standards."87 Such systems had to be fault-free and operate in real time, which added to their complexity. As decades passed, the growing relative importance of software over hardware mirrored patterns of dependence on computing evident in the private sector and in other government agencies. The biggest changes came in the 1970s, when integrated circuits radically transformed the capacity, reliability, cost, and functionality of hardware, making it possible to store in a small machine a massively increased amount of data and complex software rich in functions. Military weapons designers realized very quickly that software was far more flexible than hardware or electromechanical systems, which is why the volume of software used in aircraft, for example, expanded all through the 1970s, 1980s, and 1990s. As hardware shrank in size and software became more useful, the latter replaced functions performed in earlier years by machinery, cables, and levers. Of course, designers had to overcome the problems posed by the lack of standards commonly applied across all weapons systems, such as the myriad programming languages used in the 1960s and 1970s, a problem that led to the adoption of Ada as the one language for all the services to use. Costs also suggest the change. In 1960, 80 percent of a weapons system's cost went for hardware and only 20 percent for software; by 1980, the ratio had flipped, with software accounting for 80 percent of a digital system's expense.88 The discussion later in this chapter on Information Age warfare describes how computers in weapons systems and their delivery platforms (such as airplanes and ships) were integrated into battle command systems to provide commanders with highly computer-dependent ways of managing warfare late in the twentieth century, building on the incremental improvements made in software.
Training
As military technologies emerged throughout the twentieth century, all the uniformed services embraced every training method that came along. They adopted innovations that depended on technology, such as computer-assisted instruction (CAI), simulation tools, and video games. They embedded the concept of technology-enhanced training deeply into the culture of all the services, just as they had embedded military tactics in the nineteenth century. Beginning in the early stages of World War I, the uniformed services became extensive users of all manner of technologies, not just computers or telecommunications, with the result that by the end of the 1950s, the U.S. military establishment was the most technologically equipped in the world. It was also the
most technologically advanced. Both aspects of this situation—a great deal of technology in use and the continuous incorporation of new forms—meant that there was an enormous need to train personnel, not only new recruits but also "lifers" all through their careers. Thus, training functions in DoD became massive and sophisticated, and they evolved continuously. Training ranged from how to use or maintain a particular tool or weapon system to war gaming and leadership exercises in which simulated combat or other situations were role-played, using electromechanical simulators in the 1930s and 1940s (for example, to train pilots before putting them into expensive aircraft), analog simulators in the 1940s and 1950s, and increasingly thereafter digital computers. By the end of the century, even video games, the most current form of simulation in training, had been widely deployed. In fact, DoD relied earlier and more extensively on every technologically based innovation in training that came along in the twentieth century than did elementary schools, high schools, or universities, often extending knowledge about the pedagogical effects of the use of the digital hand. Why was DoD able to do this, to take a leadership role in the use of various forms of training in large quantities? Part of the answer lay in the need to train a continuous influx of new employees (civilian and uniformed), who were constantly changing jobs and using ever evolving technologies. A second reason involved scale and scope. Unlike any school or university in the country, DoD was so big, and had so many people to train, that it could afford to develop expensive new teaching aids to displace older techniques. It could spread the costs and benefits across large masses of people. As one training program manager at DoD remarked in 1974, "the armed forces alone spent $6 billion annually on training of which more than one-third is devoted to technical specialty areas. This effort accounts for 120,000 man-years of trainee time each year and involves hundreds of thousands of trainees."89 In 1981, the annual budget for training all military personnel had risen to $8.8 billion, 74 percent of which went to the development of newly recruited personnel.90 By 2001, those responsible for training in the military were serving 2.4 million personnel in uniform and roughly another million civilians, helping both communities be productive in over 150 professions and at levels from private to four-star general. That year, DoD budgeted roughly $18 billion for training.91 No school district or university had student populations even remotely comparable in size and scope, with the exception of those in the very largest cities in the country. To control costs and be effective, DoD training organizations had no alternative but to explore every conceivable form of training; that included using computers to assist in the process. Since the cost of developing digital tools was often far beyond the financial capabilities of schools or higher education, but not of DoD, one can understand why this federal department could be the first to create and deploy many such tools. Also, it had a workforce comfortable with all manner of technologies. Whenever innovations emerged outside of DoD—such as PLATO—the department did not hesitate to incorporate them into its training programs as well, just as it did not shrink from using off-the-shelf
software tools to manage inventory control, supply chains, and logistics, especially after they became widely available in the 1980s. In the 1950s, training consisted of a combination of classroom instruction with chalk and blackboards, practicing skills using reel-to-reel tape (for example, to learn a foreign language), on-the-job training, and field exercises in which mock simulations of combat, for instance, took place. One other form of training involved the use of electromechanical simulators to train airplane pilots before putting them into an expensive airplane to fly. That situation remained relatively unchanged until the late 1970s, when CAI methods were added to existing training programs. The uniformed services began limited experimentation with CAI systems as far back as the 1960s, but in the early 1970s, DARPA began funding CAI development to address administrative and training issues. An official at DARPA explained the thinking: Training is a highly labor-intense activity in terms of teacher as well as student time. It takes place at many training schools and at many diverse operational sites. The problem is to reduce the teaching and administrative burden and the trainee time, while maintaining the quality of the training received, in a way that permits instruction to be economically distributed to widely dispersed military installations. Computer technology, together with communications and student terminal capability, shows promise for solving the problem.92
CAI held out the promise of supporting drill-and-practice teaching, administering and grading tests, and aggregating student scores, all the while allowing students to progress at speeds consistent with their capabilities. The earliest CAI system used by the military appears to have been PLATO, a system developed at the University of Illinois. It consisted of a mainframe with distributed CRTs that students could use interactively to access and work with training courses. Various experiments throughout the 1970s led to the development of military training modules, which were widely deployed during the 1980s.93 Table 3.6 lists the kinds of early CAI-based training provided by DoD in the late 1970s and early 1980s. What becomes obvious is the broad range of subjects. In each of these instances, courses cut training time by 31 to 89 percent compared with prior methods. The courses listed in table 3.6 translated into savings in the millions of dollars per course, since often thousands of students took them.94 In short, CAI was quickly proving its worth. All through the 1980s and 1990s, CAI-based training spread across almost every function that required training and was used alongside other methods of instruction, such as simulation tools.
Table 3.6 Early Computer-Assisted Training within DoD, circa Late 1970s–Early 1980s
Electronics             Tactical coordination
Electricity             Fire control technician
Machinists              Navy aviation familiarization
A/C panel operator      Navy aviation mechanics fundamentals
Medical assistant       Inventory management
Vehicle repair          Material facilities
Weather                 Precision measurement equipment
Weapons mechanic
Source: Jesse Orlansky and Joseph String, "Computer-Based Instruction for Military Training," Defense Management Journal 17, no. 2 (Second Quarter 1981): 47–48.
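The pedagogy behind the time savings in table 3.6 (drill and practice, automatic grading, and pacing tied to each student's own performance) reduces to a simple loop. The sketch below illustrates that loop under an invented mastery rule, advancing after three consecutive correct answers; it is schematic only, not a reconstruction of PLATO or any DoD courseware.

```python
def run_drill(items, ask, mastery_streak=3):
    """Present each drill item until the student answers it correctly
    `mastery_streak` times in a row, so faster students advance sooner.
    Returns attempts per item, the raw material for automatic grading."""
    scores = {}
    for prompt, answer in items:
        streak = attempts = 0
        while streak < mastery_streak:
            attempts += 1
            streak = streak + 1 if ask(prompt) == answer else 0
        scores[prompt] = attempts  # aggregated for the instructor
    return scores

# Toy electronics drill; input() stands in for a CRT terminal session.
drill = [("Ohm's law: V = ?", "IR"), ("Power: P = ?", "IV")]
print(run_drill(drill, ask=lambda prompt: input(prompt + " ")))
```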
IT in support of simulations came early and spread intensely across the department. Simulations were always of two kinds: those that helped one understand how something might work, such as a proposed new airplane or a new way of doing things (in military jargon, a doctrine); and those that taught before allowing someone to do something for real, such as instructing pilots on how to fly specific models of aircraft before putting them in the cockpits of an actual airplane, or simulating battle conditions to teach tactics, strategy, and command and control practices (the heart of many video games today). The earliest applications of simulation using mechanical means date back to the late 1920s, with the development of flight simulators to teach pilots how to fly. During World War II, military pilots were routinely taught flight proficiency using simulators, which during the late 1940s increasingly incorporated advances in electronics to improve the breadth of the learning experience. The first major digitally based simulator, called the Universal Digital Operational Flight Trainer Tool, was created by the University of Pennsylvania for the Navy and Air Force in the 1950s. All through the 1950s, 1960s, and 1970s, new trainers were developed, incorporating more complex maneuvers and, of course, the operation of increasingly computerized avionics and weapons systems. The Navy did the same not only for pilots but, beginning in the 1960s, for submarine crews as well. By the start of the 1980s, simulators were used to train all submarine crews. In summary, from the late 1950s to the present, the Navy and Air Force proved to be the most enthusiastic deployers of simulation systems, training people who operated all manner of equipment and did maintenance, and supporting a growing list of backroom office applications, such as inventory management.95 Normally, one reads about the use of digital simulation tools by the Navy and Air Force, since they had the largest, most complex equipment, but the Army also became an active user of this training application. While the Navy and Air Force began that process in the 1950s, the Army began in the 1960s on a limited basis and over time expanded this application to all manner of skills, ranging from modeling new and more complex weapons (e.g., tanks), to training crews in their operation (tanks, helicopters), to simulating battlefield activities. Tank simulators for both development and training were some of the earliest uses of simulation. By the 1980s, tank training simulating battlefield conditions was used to train individual tank crews and also the multiple crews required to operate
in a coordinated fashion on the battlefield. By the 1990s, training of tank crews could be conducted simultaneously at multiple bases, involving virtual forces at war rather than actual deployments of tanks. The heart of this capability was the Simulation Network (SIMNET), which began as a Tank Team Gunnery Trainer in the early 1980s and, over time, evolved into a networked tool using local area networks (LANs) and supercomputers. By the 1990s, it had expanded into whole battlefield simulators.96 A second form of simulation involved soldiers in live training exercises with a new tool: in the 1990s, the Army added simulated bullets (as opposed to real bullets), which were lasers emitted from a weapon, while blank ammunition simulated the sound of firing.97 From the 1950s through most of the 1980s, simulation in training remained fragmented, because the various services and agencies within the department, often independently of each other, developed tools for specific uses or discrete activities, such as the operation of tanks or airplanes. But alongside these training applications was a constant curiosity about, and healthy skepticism toward, the use of computers in what initially was called the "electronic battlefield" and what, by the 1990s, was also variously labeled "Digital War" and "Information Age Warfare." But there is a difference between the two. The origin of the phrase "electronic battlefield" is unclear, although by the 1950s and 1960s the term was in wide use, often referring to reliance on radio and radar to communicate and track activities on the ground, in the air, and at sea. Armies, air forces, and navies around the world also became increasingly mobile in these years, diverse in the types of equipment they deployed and in the sheer volume of hardware and people moving around. As a result, more electronics were introduced into the process of warfare, with the result that, in the arcane language of the engineer, "the density of signals has grown to the point that the common descriptor of signal density—number of pulses per second—is all but meaningless."98 During the Vietnam War, a variety of electronic sensors were deployed, for example, on the Ho Chi Minh Trail, to track the enemy's movement by broadcasting electronic signals to the U.S. military. These and other proliferations of electronics on the battlefield in the second half of the twentieth century led to a situation in which commanders had to process and understand thousands of signals quickly and simultaneously. Increasingly over time, beginning with radar systems in the 1950s but extending to every complex weapons and communication system by the end of the 1970s, computers were used to digest, analyze, and present the messages generated on the electronic battlefield.99 How many signals they received, assessed, and digested was often a function of the capacity and capability of the technology. By the 1980s, commanders were beginning to realize that the pace at which effective battlefield command and control could be exercised was becoming highly dependent on IT to receive, digest, and present the volume of incoming data. By the 1990s, they knew that the pace at which action could occur on the ground was being influenced profoundly by IT and telecommunications.100 One crosses over to the concept of digital warfare when computers are used to make battlefield decisions based on such data; to direct weapons systems (such
as smart bombs); or to react to electronic and digital countermeasures by the enemy, such as jamming or blocking communications, or "hacking" into combat operational software programs. Electronic warfare had been a reality for the military since World War I; computer-based warfare, by contrast, appeared first as a glimmer of a possibility in the 1950s and became a realistic option to use and train for by the 1960s.101 The earliest use of simulators in this area was in war-gaming scenarios run against possible Soviet land wars in Eastern Europe during the Cold War, in which rivals of the United States had also made significant investments in electronic warfare and, more specifically, in military computing. Each of the American uniformed services had initiatives in this area to train its commanders, even before using computers. The Navy implemented war games at its Naval Postgraduate School in Monterey, California, in the late 1950s, building on a tradition of manual war gaming dating back to the late 1880s at the Naval War College; use of computers to enhance the gaming process began in 1960.102 That same year, the Marine Corps introduced war games at its Marine Corps School in Quantico, Virginia, and later in the decade used computing in support of war gaming.103 The Air Force first applied operations research (OR) techniques during World War II and brought them to gaming and training in the 1950s. Its acceptance of the method was profoundly influenced by that service's long familiarity with flight simulators, with the result that it became the first military branch to apply the digital hand to war gaming, in the 1950s, when it simulated air-ground combat.104 RAND Corporation also developed nuclear war games for the Air Force in the 1950s and 1960s, keeping the Air Force at the forefront of computer-based gaming.105 SAGE in the 1960s and 1970s provided additional gaming opportunities for training and study (for example, in advancing knowledge about optimum targeting strategies). As at the other services, even senior officials participated early on; for example, in 1960 the Air Force Chief of Staff took part in a game to understand the possibilities of Soviet first-strike nuclear warfare.106 The Army embraced gaming in the 1950s and used a Navy computer game called CARMONETTE I for the first time in 1959. Subsequent releases of the software included infantry and armed helicopters, communications, and night vision exercises. These systems allowed commanders to move military units around (increasing in size to battalions by the late 1960s) and to decide whether or not to fire.107 By the early 1960s, the Joint Chiefs of Staff and even NATO commanders were beginning to use digital simulation tools to learn, to train, and to formulate tactics and strategies. Upon the backs of these early games were built those that ran on more sophisticated hardware and software all through the next three decades. In those years (1970s–1990s), as Cold War and post–Cold War realities set in, games were modified to train for anticipated new battlefield and logistical issues. By the end of the 1980s, battalion-level commanders (e.g., lieutenant colonels) could train using computer-based tools, sitting in front of terminals connected in real time or in batch mode to mainframes.108 As computing power in the late 1980s and early 1990s increased, new generations of war games appeared. In addition, as one government report written in 1995 noted,
as the costs of computer imaging declined, "the compromise on resolution is being reversed: newer simulations are using increasingly higher resolution graphics to serve other uses besides training," in effect preparing the military for the widespread adoption of video games by DoD late in the decade.109
Figure 3.2 Marines in training class, Cherry Point, NC, 2006. (Courtesy U.S. Marine Corps)
By the mid-1990s, a cultural transformation had occurred within the military that encouraged the further use of computers in training. Although a whole generation of commanders and enlisted "lifers" had been exposed at one time or another, and often frequently, to computer-based training and simulation in the 1970s and 1980s, it was the arrival in the 1990s of a new generation of soldiers and civilians who had grown up with television, GameBoy, and video games that stimulated further use of digitally based training. One observer of the phenomenon, writing at the dawn of the new century, noted that senior officials were now "very mindful that the people that they're trying to bring into the military—the 18 year olds—are probably the first generation that grew up with computers, who get 'bored real easy' with traditional classroom instruction. They keep this in mind when designing all their recruiting strategies and training programs."110 The same observer drew the connection with actual combat operations: In addition to cost and motivation, add relevance. Because modern warfare increasingly takes place on airplane, tank or submarine computer screens without the operator ever seeing the enemy except as a symbol or avatar, simulations can be surprisingly close to the real thing. In addition, since war is a highly competitive situation, with rules (or at least constraints), goals, winners and losers, competitive games are a great way to train.111
Table 3.7 Sample Computer-Based Military Training Tools, circa 2000
Service     Projects
Army        Saving Sergeant Pabletti (for team skills) used with 80,000 soldiers; Taskforce 2010 PC (wargaming over the Web); Spearhead II (tank game); Flight Simulator (Microsoft's game modified for Army aviation)
Navy        SubSkillsNet (laptop-based submarine training modules); Fire-control training games; Electro Adventure (ship repair, problem solving)
Air Force   All initial flight training now done through simulators; JVID and Finflash (target recognition); Falcon 4.0 (commercial flight game)
Marines     Marine Doom (fire team training, based on commercial version); Quake (squad fire training); Battle Site Zero (squad simulator)
Source: Derived from Marc Prensky, Digital Games-Based Learning (New York: McGraw-Hill, 2001): 303–312.
By the end of the century, battle simulation games in the military could be played at a single site or across remote locations, involve thousands of participants, and include all sizes and many types of armies and combat conditions. DoD had created Joint War Fighting Centers to bring in various levels of commanders from all the services to learn how to collaborate and fight in modern wars. Even gaming across the Atlantic with NATO armies in Europe became part of the process. Table 3.7 lists some of the training projects under way at the turn of the century, suggesting the evolution in how the digital was used in comparison to the courses listed in table 3.6. What is also important to point out about the information in table 3.7 is that all these simulation games worked on a new IT platform that came into wide use in the 1980s and 1990s—video games—once again demonstrating that DoD was willing to integrate new IT developments into its training programs. As early as 1982, commentators noted that "the advent of home computer games has clearly demonstrated the teaching value and appeal of simulated engagements at all age levels," an observation not lost on the military.112 While adoption of the new technology took time, by 1997 the Pentagon had a variety of development initiatives under way with both the Video Games Industry and other entertainment industries for the creation of simulators to train individuals and groups and to evaluate new weapons systems, doctrines, and tactics, all cost-effectively.113 The Marine Corps may have been the first to adopt video games when, in 1996, the Commandant, General Charles C. Krulak, directed that the
Corps consider deploying such technology.114 That led to the adoption of the Doom game cited in table 3.7. The other services added games, including Real War, intended to teach enlisted personnel how to think like a higher-ranking commander. Simultaneously, the military continued replacing Cold War–based strategy training games with video games suited to post–Cold War realities. As early as 1999, DoD set aside over $90 million for this purpose.115 One other form of computer-based training came into its own just after the dawn of the twenty-first century: embedding training modules in modern weapons systems themselves, taking advantage of the availability of laptops and ever smaller electronics. These were small video training programs embedded in missile launchers and tanks, which could be accessed alongside the equipment being trained on, whenever people had the time or the need to know. Training developers considered this class of tools relatively new, the next evolution from traditional computer-based simulation training programs. They replicated what could be done with simulators but also provided storytelling while teaching skills. As this book went to press, the services and other parts of DoD were merging old and new games together (for example, an old training video game on support strikes from F-16 jets with a new module on how to run truck convoys in Iraq).116 As with so many other digital systems, use of simulation software spread so widely that by the 1990s complaints and confusion surfaced similar to those that occurred in other large federal departments, as we saw, for example, at the IRS. A RAND study from 2003 summarized the issues at DoD: Currently, the development, governance, financing, and use of simulation is a complex web, with multiple agencies responsible for defining and implementing modeling and simulation (M&S) policy. Furthermore, beyond the basic phase of training, training requirements for ships are only minimally articulated. The vagueness and inconsistency of training requirements and standards for assessing readiness further complicate the problem of determining how simulation might best be used.117
As the GAO had done at the IRS, RAND called for more coordinated deployment of standards and practices, a useful suggestion, since by the 1990s all branches of the uniformed services were required to operate in an integrated manner in combat, a requirement first widely reflected in the Iraq war of 1991. The issues remained in the new century, most recently with the development of a whole new generation of training modules based on video games.
Combat Applications
The heart of combat applications of any technology centers on support of command, control, and communications. Command involves planning, assessing the capabilities of one's own forces and the enemy's, allocating resources, and committing to action. Control is about such things as the management of weapons,
for example, how computers are used to direct artillery. While much data can exist, only a small amount is needed to direct this work and to assess results. Surveillance is considered part of control. Finally, communications is, as one would expect, the task of communicating up and down the chain of command and horizontally on the battlefield about what is going on, transmitting orders, and collecting data on results. The earliest uses of computers were aimed at control activities. For instance, SAGE was designed in the 1950s to provide an umbrella of monitors to detect incoming Soviet aircraft and missiles and was used into the 1980s, when it was replaced with more sophisticated systems.118 Early uses were also applied to artillery and missile programs. As computers became more capable by the early 1960s, simulation tools could be used in support of planning functions—the command part of the 3 C's of military leadership. Communications represented some of the most important applications of computing over the past half century, and also an area where military leaders were as frustrated with the lack of progress at the end of the century as they had been in the 1950s. Use of computers in support of combat evolved slowly over time. In the 1950s, the major initiatives related to combat were driven largely by the Air Force, first through the massive SAGE project but also in other, smaller ways. For example, the Air Force began using small computers (COMAR-1) to analyze bombing reconnaissance data in the mid-1950s, perhaps the first use of the digital hand in direct support of combat-related activities.119 But the major investment in these early years was in using computers in support of communications. The Strategic Air Command (SAC), which had responsibility for responding to any aerial attack on North America, installed a series of mainframes in the late 1950s and early 1960s in its underground headquarters in Omaha, Nebraska. There, three IBM 704s and two IBM 7090s were used to conduct planning, control, and evaluation applications; in short, they helped plan missions, dispatch them, and assess results.120 In the early 1960s, the Air Force installed its first system equivalent to e-mail, using teleprocessing terminals to communicate information regarding aircraft, missiles, personnel, and logistics.121 All the services and DoD had various communications networks in the 1950s and 1960s, all analog, most provided by AT&T or based on civilian telephone technologies. However, ensuring that these communications systems never went down was of constant concern, and the military did not hesitate to use computers made by AT&T and commercial IT vendors to track problems and warn of failures as early as 1962–1964.122 Earlier, we discussed the evolution of missiles; they, too, required complex communications, implemented at the same time, to track trajectories and perform related tasks. By the mid-1960s, this network included twenty computer systems scattered across missile firing ranges in the Pacific. They monitored range safety, predicted where missiles would land, provided orbital vehicle control, tracked flights, supported radar acquisition and handover functions, performed orbital computations, handled data, and conducted pre- and postlaunch checks.123 However, all of the applications just listed were ad hoc, little-coordinated activities in support of combat operations. A couple of observers close to these
81
82
The DIGITAL HAND, Volume III
activities reported in 1962: “At the present time, there are few commands that depend extensively on computer assistance for command information processing. The majority of the commands to which automation may be applied are using interim data processing capabilities or are operating manually and planning to obtain automated capability in the near future.”124 Earlier in this chapter, we noted that operational support for navigation and functioning of aircraft and ships in the 1950s and 1960s was augmented by the installation of on-board computers, a process of incremental adoption and upgrades that occurred from that period to the present involving all aircraft and ships. By the 1980s, all major aircraft and ships used computers in support of their combat missions: command, control, and communications. Communications applications were some of the most complex and crucial for the military, and it would be difficult to overstate the importance for combat operations. Each of the armed services created a command (in civilian language, an agency or office) to ensure that they had effective communications systems and that these were in line with DoD’s overall requirements. These networks operated around the world and increasingly included computers to run them. While they began as analog networks, by the end of the century they were largely digital. DARPA spent vast sums of money to constantly improve these networks so as to ensure they were failsafe (a key design feature of the Internet developed by DoD). The various services developed independently of each other many of their own communications systems from simple field radio communications to networks. By the end of the 1960s, complaints surfaced about incompatibilities, and there were calls for standards and interoperability of systems; however, since in those days each of the services operated in combat more independently of each other and would not be explicitly required to integrate combat missions and operations until the 1980s, it is understandable why separate systems emerged. Some were quite massive. For example, at the end of 1968, the U.S. Army’s communications operations in Vietnam consisted of twenty-two signal battalions with 23,000 men, and that was only at the start of U.S. expansion of forces in the war. Enormous progress had been made in the 1960s in improving communications using computers. A deputy assistant secretary of defense, Paul H. Riley, summarized the progress. At the start of the decade: We had only Army, Navy and Air Force communications. If you will visualize with me hundreds of communications circuits in the form of water hoses, all interconnecting hundreds of houses in a suburban development, you will start to have the basic picture. This situation, with all of its criss-crossing of hoses, resembled the proverbial bucket of worms and unfortunately resembled our communications system.125
He complimented the Air Force for consolidating all its “hoses” while he announced that the Army was just beginning to do the same. He excused the Navy from his drill because it had a different situation, caused by ships in constant movement all over the world, although years later it, too, would have to bow to the call for integration and conformance to standards for the entire DoD.
Defense of the Nation
But like the other services, even in the 1990s it faced problems communicating with, for example, aircraft hovering over ships manned by pilots from the Air Force or Army. But that lay more in the future because in 1970–1973 all of the services were figuring out how to use computers in support of communications, because hardware was beginning to shrink in size and cost, while satellites were now making it possible to transmit sixty times more data than a traditional telephone line. Computers were beginning to route information in large quantities across various commands, an application that spread widely in the 1970s, known as Automatic Digital Network (AUTODIN).126 However, the world of combat was changing, requiring the Pentagon to improve its communications infrastructure in the 1970s. As early as 1969, Riley commented on why the need for change: The first reason was the change in the nature of warfare itself. Time had become the most fundamental and critical factor. The need to respond in seconds or minutes began to depend on communications more than at any time in the history of warfare. A second factor was the need for compatibility of communications between Army, Navy, and Air Force. We had to be sure that we could send our Department of Defense, or national command authority [president] instructions . . . a third factor, compelling change, was the sharp increase in the price of hoses and the fact that all users were pressing for more and longer hoses.127
The reason for the needed longer “hoses” can be illustrated by comments made at the same time by Major General Joseph J. Cody, Jr., of the Air Force, who was responsible for providing communications to his service, due to a growing appetite for information: “The missile age, which called for increased emphasis on command and control, also caused a revolution in management and the end of traditional management methods.”128 What he set forth to implement were improved communications systems that could “collect, transmit, process and display information.”129 At the time, he used computers linked to radar systems (such as SAGE), a second system called the Ballistic Missile Early Warning System (BMEWS) that fed data into SAGE, and the Back-Up Interceptor Control (BUIC) system that did the same thing. He also launched development of an airborne command and control system that could hover over a battlefield called the Airborne Weather Reconnaissance Systems (AWARS).130 By the early 1970s, all the services were using AUTODIN (launched 1966) and were ordered to start consolidating their various communications networks. That work went forward slowly through the 1970s and 1980s.131 As late as 1990, when DoD filed its annual report of achievements, it still commented on the need for integration and continued improvements in communications.132 Various assessments made in the 1980s and 1990s complained that little progress was being made not only in integrating communications but also in other digitally based weapons systems and processes. By now there were many of the same problems evident at the IRS: expensive legacy systems; newer ones that were incompatible with older networks, applications, and equipment; budgetary
83
84
The DIGITAL HAND, Volume III
constraints; the need to keep the networks operating while upgrading; complex procurement practices; changing nature of military operations—in short, a long list of problems. Following each war or mission, “after action” reviews invariably commented on the issues. The invasion of Granada in 1983 required extensive collaboration among the uniformed services and was plagued with so many problems that led to congressional action and changes in policies, procedures, and practices within DoD. However, progress proved illusive.133 By then, computing and communications had become central to all discussions about the effectiveness of the military. The issues were not resolved. Former Secretary of Defense Les Aspin and former Congressman William Dickinson reported after the Persian Gulf War that “Operation Desert Storm demonstrated that tactical communications are still plagued by incompatibilities and technical limitations.”134 The operation in Somalia in the early 1990s also illustrated the problem as did events in Iraq in the late 1990s, when the United States monitored the air space over the country. What is most interesting about DoD’s problems with communications in support of combat operations is how little the technology itself affected the adoption of new digital tools. It seemed the proverbial “everyone” wanted the most advanced telecommunications in support of combat and other functions. Computer and telecommunications technologies improved enormously in the 1980s and 1990s. The constraints on adoption were driven largely by nontechnical influences. An Air Force lieutenant colonel studying the problem at the dawn of the new century reported that “combinations of factors contribute to the persistent shortcomings in interoperability, including the military acquisition culture, dwindling budgets, rapidly changing technology, the changing nature of operations, competing priorities, insufficient oversight, and unrealistic training and exercises.”135 Another study done at the time by the National Research Council (NRC) drew similar conclusions. However, it also commented about the effect of legacy IT systems worth understanding, because the same problem existed in other federal government departments that had been early adopters of computers: “The military services have tended to retain legacy information systems that were developed in response to ‘stand-alone’ requirements, were not regarded as subject to connection with other systems and, therefore, are not operationally friendly with their increasingly interdependent companion systems. The legacy systems issue is one of the greatest challenges faced by the DOD today.”136 The problem was compounded by both military and congressional mandates that the services operate as a united force and by the growing collaboration with military services of other countries in joint operations in the 1990s. The cultural problems were subtle but significant for weapons systems and communications of all kinds. An Air Force colonel in 2000 explained the situation: “The problem is that while the Department of Defense assigns warfighter responsibilities to unified commands, each individual service is responsible for developing its own command and control systems. . . . This creates some big, ugly seams for joint commanders.”137 The debate and actions in support of reforms continued into the new century.
Defense of the Nation
The war with Iraq in 2003 once again demonstrated to the world on television the importance of communications applications in the deployment and performance of combat forces that combined operations from all the uniformed services and even those of a few other nations. As of this writing (2007), few details were just beginning to come out regarding the role of IT. Drone aircraft equipped with digital avionics surveyed the terrain and delivered small propelled munitions to targets; vehicles were equipped with laptops that could download information and use GPS technology, while satellites provided commanders with additional information. The fragmentary evidence and early after action reports indicated that from division level on up communications systems, working with computers provided significant amounts of information, but, below that organizational level out in the field, communications malfunctioned frequently, with many junior army and marine officers complaining at the lack of timely and accurate information about what was happening on the ground. Often troops had to rely on cell phones and old fashioned e-mail to communicate up and down their chain of command.138 In Afghanistan, units of special operations communicated horizontally with each other and even had their own web pages. That approach proved more effective, suggesting a different model for organizing battlefield communications. As an expert in unconventional warfare at the Naval Postgraduate School assessed the experience: “Some of the problems in Iraq grew out of an attempt to take this cascade of information provided by advanced information technology and try to jam it through the existing stovepipes of the hierarchical structure, whereas in Afghanistan we had a more fluid approach. This is war by minutes, and networking technology allows us to wage war by minutes with a great probability of success.”139 The other observation one can make is that digital systems
had continued to be deployed late in the century down to the individual soldier, much as civilian uses of IT had migrated over the years from large mainframes down to individual productivity tools, such as digital watches, laptops, cell phones, and PDAs.140
Figure 3.3 Soldier uses a computer, Iraq, 2005. (Courtesy U.S. Army)
Information Age Warfare
Ever since officers and military analysts could envision using computers in support of combat operations, they had speculated on the future of warfare and the important roles the digital hand would play in it. By the end of the 1980s, a whole generation of officers and enlisted career soldiers, sailors, and airmen had been exposed to myriad computer systems in training, logistics, communications, and their daily work. As Professor Edwards asserted in his study of computing in the Cold War, a whole generation of warriors had embraced computing as contributing to the next major revolution in warfare.141 All the uniformed services speculated about the future of warfare, of course, and by then the digital profoundly influenced their thinking, because there had been sufficient experience and success with the technology. Most accepted the idea that it was now a major component of all command, control, and communications activities. Most military commentators on strategy and tactics agree that the Gulf War represented a turning point away from the non-IT-centered warfare of Cold War vintage and toward a new era in how combat would occur. The quote at the start of this chapter from General Powell, chairman of the Joint Chiefs of Staff, who was very familiar with the uses of IT described in this chapter, was made after that war. What changed in the 1990s was the ratcheting up of the debate on how to use the digital in warfare, building on prior thinking and experience. Many definitions of Information Age warfare emerged in the decade, along with a large body of publications inside and outside of DoD debating its elements and consequences. While the phrase information warfare dated back to 1976, it was not until the 1990s that it came into wide use.142 The key event in the Gulf War in increasing interest in the subject was the collection of technologies that underlay the successes shown on everyone's evening news: "These outcomes were enabled by battlefield tracking and targeting systems that allowed American forces to identify and attack targets beyond the line of sight, by advanced aerial reconnaissance from airborne warning and control systems (AWACS) and from joint surveillance and target attack radar systems (JSTARS), and by space-based satellite sensors," making it possible "to apply destructive force when and where we want to."143 Ideas about what constituted Information Age warfare, however, remained in flux. Yet an Army major provided a useful explanation of the difference between conventional and IT-centric combat: "Comparing digital and nondigital forces is the proverbial apples and oranges discussion. The two simply cannot fight in the same manner. Digital units have command-and-control equipment that tells the
entire force, down to each vehicle, the friendly and enemy situation in relation to the surrounding terrain. A nondigital unit must advance physically to acquire the same information.”144 Understanding what those differences implied for doctrine, strategy, and tactics dominated discussions about the future of war. One student of the issue pointed out that “the new meanings of power and information . . . favor the argument that wars and other conflicts in the information age will revolve as much around organizational as technological factors.”145 The development of so many new weapons and ordnance in the 1980s, and their subsequent successful use in Iraq, led some observers to start thinking in terms of a new revolution in military practices. Proponents of the new way argued that wars in the future would be struggles over the control of data and knowledge and less over territory; that industrial-age stovepipe organizations would be replaced with flatter horizontal units; and that weapons would increasingly herald a new age of precision warfare, delivered to the correct targets thanks to “information superiority,” with scores kept by myriad sensing devices, such as drones. More extreme visionaries spoke of successful nations dominating or destroying the information of their enemies, even to the point of using propaganda and media to reshape enemy public opinion, often without shedding blood.146 Senior officers at the Pentagon stepped up their formulation of doctrine and strategy in the course of their routine planning in the 1990s. Publication of the Joint Chiefs of Staff’s Joint Vision 2010 set the direction: “We must have information superiority: the capability to collect, process, and disseminate an uninterrupted flow of information while exploiting or denying an adversary’s ability to do the same.”147 Thinking derived from that statement led to enriched articulation of how IT would influence command, control, and communications activities and practices, including the then-favored expression command, control, communications, computers, intelligence, surveillance, and reconnaissance, or C4ISR, to describe the now normal realm of warfare. DoD now gave information warfare (IW) a formal definition: “information operations (IO) conducted during time of crisis or conflict to achieve or promote specific objectives over a specific adversary or adversaries.”148 This could be carried out on the battlefield or as psychological operations (PSYOP) directed at a population or nation, with “Force XXI” thinking envisioning IT as a major force multiplier. On the extreme edge beyond official thinking were those who argued that a “cyberwar” would come in which warfare was strictly digital, with attacks on networks, databases, local area networks, and so forth. To those operating in a battlefield environment, however, digital tools potentially offered three advantages. An infantry colonel explained: First, the unit with situational awareness can better maintain its bearings in urban warrens, congested jungles, and convoluted mountain outcroppings, not to mention under the cover of darkness, all the preferred fighting arenas of U.S. light contingents. The second benefit relates to the first and again gains time, widely acknowledged to be the most valuable currency of combat.
Finally, a digitized light unit can more quickly recover from the confusion that usually accompanies its inaugural events: an airborne, air assault, or amphibious forced-entry operation.149
However, he also cautioned that “these advantages accrue to a trained, disciplined force that adds them to well-honed close-combat skills. They are not magic wands that guarantee victory when wielded by any guy dragged in off the street.”150 In the late 1990s and early 2000s, the language of information-based warfare increasingly concentrated on the notion of network-centric warfare (NCW), creating a whole body of thinking that once again emphasized enhancing communications to get precise information to forces and facilitating collaboration on the battlefield among various units. In DoD’s latest formal pronouncement at the dawn of the new century, entitled Joint Vision 2020, senior officials spoke of more complete pictures of the battlefield for commanders at all levels.151 Increasingly, merging video game technology with simulation software and communications vehicles, such as the Internet and LANs, created new combat systems to track and attack targets, deploy people and materiel, and assess results. The experience in Afghanistan, discussed above, was viewed by a growing number of officials as proof positive of the value of what many in the military were now calling network-centric operations (NCO), although commanders complained about slow network response times and insufficient bandwidth to keep up with what was actually happening on the ground. The closer one moved toward the frontlines, toward “the sharp end of the spear,” the louder the complaints about insufficient bandwidth, so theory and practice had yet to marry up.152 However, the notion of NCO had gained enough adherents to start influencing another round of developments in communications and strategy. The digital-centric warfare so widely embraced in the 1990s and beyond had its critics almost from the beginning, as had all emerging IT applications in past decades. False alarms with SAGE and SAC provided early sources of criticism whenever a technology failed to perform as intended. Programs such as SDI were often oversold and overpromised, while confidence in high-tech weapons came only after they were deployed in combat.153 The poor performance of some rockets in the first Iraq war gave ammunition to critics, many of whom were outside the military establishment. However, there were also critics who grew up in the military and who raised objections after leaving active service. Each published pronouncement by the Joint Chiefs of Staff prompted criticism. The case of Joint Vision 2020 (JV2020), for example, proved to be no exception. One recently retired Army colonel, John A. Gentry, described four fallacies in it, any one of which could prove deadly to the military. He complained that the thinking promised to address only a narrow band of activities performed by the military; that it relied on highly vulnerable infrastructures; that easy countermeasures could be launched against the military’s systems; and that institutional impediments to change, baked into the culture of the military, posed yet another threat.154 The debate over the increased use of IT in fighting future wars became subsumed in a larger discussion held within the Pentagon regarding the general
use of technology. The debate moved from theoretical to practical considerations as the George W. Bush administration prepared for war with Iraq, flush with the victory against relatively primitive Taliban warriors in Afghanistan. Over many years, Secretary of Defense Donald H. Rumsfeld had become a supporter of arguments that stressed the importance of air power, computer-centered communications, and the deployment of small, well-trained forces. He wanted to transform the department into a leaner, more technologically dependent fighting organization and had been running into enormous resistance from many senior officers and some members of Congress prior to the start of the Iraq war. Senior officers believed that future wars would continue to require combat on the ground with many troops and a great deal of firepower (a view derived from the Powell Doctrine of 1990–1991), a strategy that the experience in Iraq subsequently suggested would have been the correct one. For the war, General Tommy Franks wanted 250,000 troops; the secretary insisted on a strategy that relied on fewer troops and massive use of air power, smart bombs, and other technologies. By the start of the second week of the war, critics began complaining that there were not enough troops on the ground, and the chronic problems in controlling security after the formal war ended simply added evidence in support of those who thought wars could not be so extensively dependent on technology. In short, technology had its limits.155 As this book was going to press in 2007, the outcome in Iraq had yet to be determined, so we do not yet know whether what some have called the Rumsfeld Doctrine proved effective. But what is very clear is that war was now profoundly influenced, in a direct and open manner, by computing. While IT had been important in combat operations as early as the late 1970s and 1980s, it was not until the 1990s that it was front and center, perhaps more so than for any other nation’s military. Meanwhile, the idea of Information Age warfare continued to unfold.
Non-Combat Applications

While the breadth of uses of computing in DoD rivals that of any other government organization or company, several further applications suggest the extent to which this department deployed the technology. The most important of these are the accounting and financial reports prepared by DoD. Each of the uniformed services had been users of precomputer information technologies dating back to World War I. Accounting and financial reports on budgets, expenditures, and payroll had long been assisted by punched-card tabulating equipment from IBM and a myriad of office appliances from other American vendors, such as Burroughs, NCR, Remington Rand, and Felt & Tarrant. Their equipment was used at all bases and ports, not just in offices located in the Washington, D.C., area.156 When computers became available, the uniformed services and civilian agencies migrated their preexisting processes over to the new technologies at the same time and for the same reasons as the private sector did, because
they were faster, could handle larger volumes of transactions, and reduced the amount of manpower required to get the work completed. Minimal alterations to existing work processes took place because the intent was to automate existing activities. Only in the late 1950s and early 1960s did officials alter work processes as they began understanding the possibilities of even further streamlined or new activities using computers. One additional early pattern, which has continued to the present, involved an organization-wide preoccupation with the formalities of procurement, with the creation and defense of budgets, and with control over expenses. Every annual report of the secretary of defense, for example, discussed the issue. Law required that budgets and financial reporting adhere to standards set by the U.S. Congress, and oversight organizations, such as the GAO and the Office of Management and Budget, monitored conformity. Examples illustrate how quickly the military embraced computing in support of accounting and financial matters. The Navy used IBM 705s to handle payroll functions as early as 1958 at its various naval centers around the country. The Army, which had the largest number of personnel, began installing similar IBM equipment at its bases all over the world in the late 1950s to handle personnel statistics, depot supply accounting, financial reporting, and also such smaller applications as theater supply and housing control and graves registration; but the lion’s share of the computing power went to accounting and financial systems. In the language of the uniformed services, accounting consisted of a collection of processes called comptroller activities. For the majority of the second half of the twentieth century, these activities consisted of tracking and reporting on general accounting, cost accounting, civilian payroll and leave accounting, civilian personnel, command management, budget and manpower reporting, station property reporting, and station supply accounting. “Station” referred to the physical site, such as an Army base. Reports and data collection processes relied heavily on computers by the end of the 1950s, and, in fact, many of these accounting applications were not updated to reflect the capabilities of newer hardware and software until the 1970s.157 Accounting was so important that in wartime the accountants were often among the first units to arrive in a combat theater. A Marine officer working in accounting and data processing at the time of the Vietnam War preserved a case study of the process. The first data processing platoon to arrive in Vietnam came on March 23, 1965, from the Marine Corps to serve the 3d Marine Division. As in the Army, Navy, and Air Force, each Marine base and large unit had its own data processing operation to support such applications as payroll and budget accounting and to track personnel and inventory. All these processes migrated to computers in the 1950s and early 1960s. Beginning in 1958, the Corps leased six computer systems to support accounting and personnel applications, three from NCR and another three from Univac. In 1962, these systems were replaced with IBM 1401s, and subsequently with even newer Univac IIIs by the start of the Vietnam War. The initial accountants to come to Vietnam consisted of an officer (the data processing [DP] manager) assigned to the local commander, a second-in-command DP officer, and enlisted personnel who could do the work,
run the machines, and perform necessary maintenance. The majority of the work they did involved budgeting, inventory control, and personnel accounting.158 It is impossible to discuss accounting and finance in the military without reviewing personnel accounting systems because, unlike in the private sector, larger quantities of information about personnel were often kept in machine-readable form and as part of accounting applications. Major Henry W. Tubbs, Jr., who led some of the first Marine DP personnel into Vietnam, described what was involved: “Locally, the origin of this information is his unit and the individual’s service record book. Certain items used by the command and which comprise the input to personnel data processing programs are also kept on punched cards at the mobile dp installation [in this case in Vietnam]. New items will eventually end up in his master record on tape at the West Coast Computer Center at Camp Pendleton, Calif.”159 Work processed through his system included listings of personnel rotation tour data, attrition, officer assignments by unit, myriad casualty reports, combat awards applied for and distributed, locator listings for personnel in Vietnam, and the location of personnel by military occupational specialty (MOS). Additionally, a complex set of applications generating seven reports kept track of casualties for payroll, unit commanders, and the Marine Corps’ medical community. His data center consisted of two air-conditioned trucks that moved with the troops; often punched-card files were “located in tents,” which accumulated “dust, sand particles, and grime while being manually processed.”160 Similar personnel systems for use within the United States and around the world were implemented by all the services.161 It is easy to overlook how many people were tracked and accounted for in this way, but the numbers were large. One example will have to suffice to make the point. In the late 1950s, the Air Force employed 900,000 people in uniform and an additional 400,000 civilians for a total of 1.3 million people, more than the combined populations of Delaware, Nevada, Wyoming, and New Hampshire at that time. Beginning in 1956, the USAF used an IBM 705 to support the work of servicing 300 major bases and 3,000 locations around the world, operations that spent $17 billion that year. Of the 300 bases, about 200 had some prior punched-card systems, and in the late 1950s all began migrating to computers. The applications, and the types of data collected and managed, mirrored the experience of the Marine major quoted above. Subsystems were developed to monitor personnel distribution, maintain an officer inventory control process, perform personnel turnover analysis, and do payroll and budget for people, buildings, and inventory. All the services reported that using computers improved the quality and accuracy of their data and the timeliness with which it was collected, analyzed, and reported.162 By the early 1960s, basic accounting and personnel systems were widespread across DoD. Modifications over the years included collecting data on changing procurement practices and a growing collection of applications for analyzing patterns of expenditures and for modeling options for purchases and how best to spend budgets. Accounting and financial systems were built all during the second half of the century that met the specific needs of branches and agencies but that also
addressed specific classes of transactions, so that DoD had, in effect, many automated systems that did not link up end-to-end with integrated flows of data. Attempts to address that problem did not begin in earnest until the 1990s. Over the years, these disconnected applications led to inherent problems described clearly by one audit: “The relationships among feeder, accounting and financial systems are still ‘detached’ from the perspective of data standardization, transactional standardization, and system compatibility. This detachment causes much re-entry of data, ‘crosswalking’ or matching of data through elaborate edit processes and conversion tables, creating timing delays—all of which contribute to an inability to determine the status of financial information on a routine basis.”163 The description applied to many systems. At the end of the century, there were ninety-one critical applications feeding data to sixty-one accounting systems, few of which were even written in the same programming language, let alone technologically compatible with one another such that information could move from one system to another or changes made to one program could be implemented in another. Some of the affected classes of applications included payment transactions for civilian pay, debt management, military pay, contract and vendor payments, disbursing, and payments for transportation and travel, all crucial systems. Feeder applications that collected data on transactions covered acquisitions, personnel, cost management, property management, and inventory management. Other programs had been written to move data from one system to another, in effect translating data so that it could be understood by upstream applications. Yet even at the end of the century, DoD still did not have an integrated balance sheet for the entire department and so still generated much manual work for accountants. Thus, the department was saddled with redundant systems, with incompatible data and systems, and with the challenge of integrating $500+ billion of expenditures.164
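The “crosswalking” the auditors complained about amounted to translating every transaction from one system’s record layout and codes into another’s by way of conversion tables. A minimal sketch of the idea, written in Python with entirely invented field names and codes (the actual systems were written in many languages and ran on many platforms), suggests why ninety-one feeder applications and sixty-one accounting systems generated so much translation work:

    # Hypothetical illustration of "crosswalking": translating a transaction
    # from a feeder system's layout and codes into an accounting system's.
    # All field names, codes, and values here are invented for the example.

    feeder_record = {"doc_no": "A-1098", "pay_cd": "07", "amt_cents": 1250000}

    # Conversion table mapping the feeder system's payment codes to the
    # accounting system's transaction types.
    PAY_CODE_CROSSWALK = {
        "07": "CONTRACT_PAYMENT",
        "12": "CIVILIAN_PAY",
        "31": "TRAVEL",
    }

    def crosswalk(record):
        """Translate one feeder record into the accounting system's layout."""
        return {
            "document_id": record["doc_no"],
            "transaction_type": PAY_CODE_CROSSWALK[record["pay_cd"]],
            "amount_dollars": record["amt_cents"] / 100,  # even units differed
        }

    print(crosswalk(feeder_record))
    # {'document_id': 'A-1098', 'transaction_type': 'CONTRACT_PAYMENT',
    #  'amount_dollars': 12500.0}

Multiply such mappings across dozens of system pairs, each needing its own tables and edit checks, and the audit’s complaints about re-entered data, timing delays, and unreliable status reporting become easy to understand.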
Figure 3.4 The USAF used mainframes to do routine accounting and processing at the Pentagon. Notice that even in 1978, when this photo was taken, punched cards were still in use. (Courtesy IBM Archives)
It became a fact of life that audits of DoD accounting always turned up problems, just as in many other federal agencies. A lesson we can derive from DoD’s experience with accounting and financial applications is that just because one used computers in support of a stream of work did not mean that the activity would be done well. As generations of IT experts always pointed out, computers were tools: use them effectively and you will do well; use them poorly and you will do poorly. The variety of systems implemented over the years, from various suppliers and of varying ages, made consolidation of data increasingly difficult. As late as 2001, after a half century of accounting using computers, one audit team reported that the “Defense Department’s accounting systems do not provide the information needed to relate financial inputs to policy outputs” and failed to generate accurate information that told “managers the costs of forces or activities that they manage and the relationship of funding levels to output, capability or performance of those forces or activities.”165 Until 1990, DoD was primarily responsible for auditing its own applications, with the occasional intrusion of auditors from other parts of the government. That year, however, Congress passed the Chief Financial Officers Act, requiring all departments to pass an audit demonstrating that expenditures matched appropriations. As of 2001, DoD had failed to pass any of its annual audits, a problem that plagued DoD accountants right into the new century.166 Periodically through the entire period, audits of inventory expenditures called out problems with overpriced items or overpayments on contracts and with insufficient oversight. As recently as 2003 and 2004, similar problems again surfaced concerning payments to civilian subcontractors working in Iraq and the hundreds of systems tied to inventory control. Pointing to these problems is perhaps less a statement about incompetence and more a comment about how large organizations like DoD, the IRS, and General Motors faced enormous challenges in managing enterprise-wide applications that involved a combination of legacy and new systems and a constant requirement for changes, such as for new forms of data and reports. Nonetheless, because of the large size of the Defense Department, the GAO, OMB, and various congressional committees constantly monitored its accounting, inventory control, and personnel practices.167 A second class of application provides a more positive case study of a very contemporary use of computing: video games to help recruit new personnel, an application of IT that came into its own only in the late 1990s. Just as military trainers had recognized that young recruits had grown up with television, PCs, GameBoys, and video games, so, too, did recruiters appreciate the fact that this generation had to be reached in new ways. War games in video formats had been around since the late 1970s, a story described in volume 2 of The Digital Hand.168 However, on July 4, 2002, the Army released a free game to the public called America’s Army, intended to illustrate life in the Army, its values, and its way of life.
Aimed at older teenage boys and young men, it could operate on various platforms, such as PCs, Macs, and Xboxes, and over the Internet. It became very popular; within months it had become one of the top ten video games in the country. Between initial release and the summer of 2005, it went through fifteen editions, attracting controversy, with some accusing the Army of having prepared a propaganda piece, while recruiters claimed that it had significantly helped them in their recruitment efforts. Various surveys of enlisted personnel, students at West Point, and officers indicated that tens of thousands had played the game. It was a perfect match of a technology with a target audience and perhaps the most public success story of the use of IT by the military after smart bombs and missiles.169 In addition, it reduced the cost of recruiting. Because it was free to download, the military avoided the distribution costs of a commercial CD-based game, a savings amounting to some $13 million. Furthermore, the game was effective. Because it reflected Army values, enlisted personnel were encouraged to play it as well, as a form of training. One internal study estimated that in its first year of release, 1.2 million people had played the game, and that players from many countries around the world had played over 174 million times; in other words, people spent 17 million hours with this Army video game.170 Three years after its initial release, long after most video games would have been forgotten and replaced by others, this one remained popular. The Pentagon invested some $16 million to keep it current, and over 4.5 million people had played the game by mid-2005, with 100,000 new players added every month.171 Its success was due to (a) the fact that it was free off the Internet or could be picked up at any recruiter’s office and (b) the Army’s careful attention to keeping the game authentic in every detail, from the weapons and vehicles used to the circumstances in which players found themselves. Players went through boot camp in the game, were encouraged to work in teams, and went to a virtual Fort Leavenworth military prison if they violated rules of engagement.172 Finally, one could ask, of all the non-combat-related uses of IT not covered in this chapter, were there others of significance? The most important one relates back to communications. DARPA helped fund the creation of the Internet for use by researchers working on behalf of the military. We saw earlier that primitive uses of telecommunications to transmit data over telephone lines began in the 1950s and extended all through the 1960s. By the 1970s, each of the uniformed services had begun implementing commercially available forms of fax machines and, more important, e-mail. One early available tool of this kind from the 1970s and 1980s was the Professional Office System (PROFS) from IBM. It consisted of terminals connected to mainframes that switched messages from one user to another. Multiple mainframes could be linked together so that users on other computers could communicate as well. It was useful for transmitting notes and letter-length messages, for creating online calendars, and for sending attached files, the sorts of functions that Lotus Notes provided in a more user-friendly format in the 1990s before the arrival of Internet e-mail. Early systems were designed to provide communications within a military base, initially often within an organization, such as a logistics operation on an Air Force base. It quickly also became the
basis for communicating budget information as graphical packages became available and could be used, for example, by procurement and accounting.173 By the end of the century, e-mail had become ubiquitous in DoD and was even available on laptops mounted in armored vehicles in combat in Iraq in 2003–2007.
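The architecture just described, terminals attached to a mainframe that switched messages among its own users, with mainframes linked so that users homed on different machines could correspond, can be suggested in a minimal store-and-forward sketch. This is an illustrative model only, written in Python with invented names; it is not PROFS code, which ran on IBM mainframe operating systems of the period:

    # Illustrative store-and-forward model of a PROFS-style e-mail network:
    # each "mainframe" holds mailboxes for its own users and relays messages
    # addressed to users homed on a linked machine. All names are invented.

    class Mainframe:
        def __init__(self, name):
            self.name = name
            self.mailboxes = {}   # local user -> list of (sender, text)
            self.links = {}       # remote user -> neighboring Mainframe

        def add_user(self, user):
            self.mailboxes[user] = []

        def link(self, other):
            # Each machine learns which users the other serves.
            for user in other.mailboxes:
                self.links[user] = other
            for user in self.mailboxes:
                other.links[user] = self

        def send(self, sender, recipient, text):
            if recipient in self.mailboxes:        # local delivery
                self.mailboxes[recipient].append((sender, text))
            elif recipient in self.links:          # forward to linked machine
                self.links[recipient].send(sender, recipient, text)
            else:
                raise ValueError("unknown user: " + recipient)

    base_a, base_b = Mainframe("Base A"), Mainframe("Base B")
    base_a.add_user("logistics1")
    base_b.add_user("comptroller1")
    base_a.link(base_b)
    base_a.send("logistics1", "comptroller1", "FY budget figures attached")
    print(base_b.mailboxes["comptroller1"])
    # [('logistics1', 'FY budget figures attached')]

The essential point the sketch captures is that the mainframes, not the terminals, held the mail and did the switching, which is why such systems could serve a whole base from a single machine and then grow by linking machines together.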
Patterns of IT Deployment

Even before the invention of digital computers, the military community committed to their development and, for decades afterward, to their further improvement. But as this chapter makes clear, actual deployment and use of computers by the DoD far transcended its R&D activities in terms of what happened on a day-to-day basis within the department. Every command and agency used computers extensively, and indeed had done so as far back as the late 1950s, some even earlier. From the earliest days, DoD had formal processes for how any command or agency could acquire computers, processes that involved practices similar to those evident in the private sector and in other public agencies. These included identifying specifications and applications, doing a cost-justification analysis, soliciting equipment and software from vendors, and going through formal steps for evaluating to whom to award contracts. Those processes varied over the years, but more in detail than in concept.174 Despite attempts by various secretaries of defense to create standards and uniform practices, variations existed. For example, as far back as the 1950s and early 1960s, one Army official wrote: “Why so many different kinds? The reason is simply that before the Army reorganization, each Technical Service proceeded individually to acquire computers as they felt best suited their needs.”175 The CIO for the U.S. Air Force in the late 1990s was just as candid about DoD’s experiences decades later: We’ve tended historically to bring on new information technology on a catch as catch can basis. We bought thousands of computers, without thinking that the time would come when those boxes might be used to form interactive networks of computers. Then we created computer networks without thinking about optimizing the interfaces between networks of computers. And, worst of all, perhaps, we did all that without thinking too much about the requirements of the individuals using the computers and networks.176
While one can quibble over whether he overstated the problem, he also noted that in Desert Storm (1991) military units in the field complained about insufficient bandwidth, the same complaint heard in 2003–2006 in Iraq.177 Many things thus stayed the same, not the least of which were procurement practices. In 2003, an Army lieutenant colonel wrote in his War College thesis that “DOD’s Planning, Programming, and Budgeting System . . . has changed little over the past decades.” He complained that this circumstance “inhibits innovation and fails to adequately react to environmental changes.” He noted what so many other commentators had observed about the acquisition process: “Defense analysts aim most of their modeling efforts and statistical analysis at program/budget
requirements for successive six-year windows,” a practice that “severely limits objectivity by perpetuating the status quo,” perhaps providing one explanation for why core non-combat uses of computing often persisted hardly modified from one decade to another.178 Two patterns of behavior become visible from my analysis of DoD’s use of computers and telecommunications. First, and despite continuous skepticism about new applications, particularly in weapons systems, the entire DoD had a healthy appetite for IT, with critics often complaining that this interest was too great. Every agency and every branch of each service used computers. Second, despite initiatives to the contrary, acquisition and deployment of IT remained highly decentralized, with every command, branch, and agency busily going about the work of acquiring and deploying systems that it found supported its missions. As their assignments changed, they dutifully acquired new applications, although more often than not retaining old ones. This pattern of acquisition of IT mirrored what happened with aircraft and weapons systems as well. This practice of acquisition and coexistence of new and old applications played out in a relatively clear manner. New weapons systems typically sported newer, more advanced technologies than nonmilitary applications. Put conversely, support applications, such as payroll, personnel systems, inventory control, and logistics in general, changed less frequently. However, give the Air Force a new airplane, and it would acquire the latest avionics. DoD attempted to implement standards, for telecommunications as far back as the mid-1960s, for example, in order to simplify acquisitions, make systems more compatible and therefore able to support cross-branch integration, and reduce acquisition and operating costs. However, much of the slow progress in upgrading old backroom applications and imposing consistent technologies mirrored the experiences at the IRS and at other agencies described in subsequent chapters. In the 1950s and 1960s, before the number of computers far exceeded the capacity of members of the Computer Industry to keep lists of installations, all publicly available inventories showed that the military bought every kind of IT that came along, often acquiring 25 percent or more of the entire inventory in the United States.179 By the mid-1960s, DoD was spending over $700 million per year and had installed nearly 2,000 computer systems. By the end of the decade, it had over half the systems installed in the U.S. government, down from a dominating 70 percent of the inventory in 1965.180 Yet even in the mid-1970s, of the some 8,600 computer systems installed across the federal government, 4,245 belonged to DoD.181 By the end of the 1970s, it had become easier to observe the percentage of the Pentagon’s budget allocated to IT, or the dollars spent, than to track every computer system. Government accountants began reporting such data in the early 1980s, continuing to the present. As table 3.8 illustrates, in the 1980s and mid-1990s, DoD’s expenditures on IT ran into the billions of dollars per year. Expenditures grew all through the 1960s, 1970s, and 1980s and stabilized in the 1990s as the result of three circumstances: sharply declining costs of computing equipment and software, extensive budget cutbacks for all of DoD, not just for IT, and normal operational improvements from streamlining
Table 3.8
Department of Defense Expenditures on IT, Select Years, 1982–1996 (billions of dollars)

Year      1982   1987   1992   1996
Amount     4.2    8.2    9.9    9.0
Source: U.S. Office of Management and Budget, Information Resources Management Plan of the Federal Government (Washington, D.C.: U.S. Government Printing Office, August 1996): 3.
and consolidating systems. However, those budget numbers mask a massive inventory of installed computers. In 1996, for example, DoD had 2.1 million computers (from supercomputers to PCs), operating on some 10,000 local area networks and 100 national networks.182 What is also clear from the historical evidence, and underpinning the data in table 3.8, is the fact that DoD single-handedly spent nearly 45 percent of all the funds the federal government expended on IT across the period from the 1950s through the 1980s; and in the 1990s, it still claimed over 25 percent of all federal expenditures on IT. In short, as every computer sales executive knew, DoD was the world’s largest consumer of IT in the second half of the twentieth century.183 For certain classes of IT, DoD’s presence was impressively large. For example, in 1989, the federal government had seventy-one supercomputer systems. Twenty were located in DoD, while another thirty-three were in the Department of Energy, where all the national laboratories were housed, and which did a considerable amount of work for the military, such as developing nuclear weapons. NASA had ten, and even some of their work was in support of the military, leaving, in effect, eight systems that probably did little, if any, work for the military. These were located in the Department of Commerce, the Environmental Protection Agency, and the Department of Health and Human Services. Not included in this inventory, of course, were the numerous supercomputers used by various intelligence agencies, such as the National Security Agency, which housed some of its systems at military installations.184 The pattern of being a large consumer of all manner of IT extended into the new century, by which time DoD spent vast sums on IT outsourcing and other services, like so many corporations and other government agencies. In 2000, for example, DoD expended $53.1 billion for all manner of services, not just technical or IT, and of that amount, approximately $5 billion went to IT services.185
Conclusions

It would be difficult to overstate the effect that the massive size and complexity of DoD have had on its internal workings and the effects it has had on wars, world
peace, and the operation of the national government at large during the second half of the twentieth century. When the position of secretary of defense was established in 1947, Congress authorized that official to hire a handful of employees to work in his office; fifty years later, that secretary had over 2,000 employees and managed 15 percent of the federal budget, while providing direction to over 2 million employees.186 No discussion about the DoD can ignore its sheer bigness. The history of the digital hand at this department ultimately reflects the consequences of that size and of the complex and varied missions it was required to carry out. IT unquestionably and consistently played a supportive role in helping all organizations within DoD carry out their missions over the years. In every niche and corner of the Pentagon, at hundreds of bases around the world, and across every command and agency, employees used IT in support of their activities. If the technology did not exist to get the job done, DoD contracted out its invention. DoD shone best and most effectively in its use of IT in two fundamental ways. First, over many decades, DoD godfathered a vast amount of computer science and technology that benefited both the department and the nation at large. The value of the Internet is merely today’s most obvious example of benefit to the nation; but we must not forget others: various programming languages, computer memories, online systems, computer operating systems, artificial intelligence, avionics; the list is long. We live in the Information Age thanks to the R&D and technology transfer of the Department of Defense over so many decades. The second way DoD shone was in the application of digital technology to weapons and transportation. Ballistic missiles, atomic bombs, smart bombs, modern aircraft, naval ships, and even wheeled vehicles made the United States the most powerful military force in the world and, to no insignificant degree, led to its winning the Cold War. If DoD had to “get right” one use of IT, it was better that it be in military applications than in accounting and financial systems. DoD’s experience with IT, however, was not uniformly and consistently outstanding. As we saw with its accounting and financial systems, IT reflected the limitations of how the organization operated and its values. As historians of the department pointed out, a war over control of assets and rivalries among the uniformed services occurred within the DoD, one that has yet to end. For those outside the department, it is telling how extensive this problem has been: its historians, themselves employees of DoD whose studies were published with the department’s endorsement, considered it a central theme of their analysis of how the office of secretary of defense evolved.187 IT aligned on all sides, supporting battles among organizations functioning as relatively independent silos, which goes far to explain why every major audit of the department’s IT and accounting systems found that the biggest problem remained the lack of integration of information about the operations of this part of the American government. The situation extended across all manner of applications, from weapons and combat to training and logistics and, finally, to accounting and finance. Looking at other large agencies and major American corporations, one can spot similar patterns of IT usage. However, nowhere is the fragmentation
of systems as great as at DoD, which can largely attribute that phenomenon to the sheer size of the place. Anybody who has worked for a very large public institution or major corporation understands instinctively how difficult it is to work outside one’s own silo, to comprehend and to work with others within the larger enterprise without getting lost, compromising one’s management’s priorities, or risking loss of financial or career incentives. Given the substantial deployment of all manner of IT, and the vast resources devoted to it over the decades, it is remarkable how long it took for many systems to go from conception to being up and running. Development of weapons systems and new aircraft and ships routinely took decades to accomplish. More mundane systems, such as a new generation of avionics, also took ten, fifteen, or more years to develop before being ready for implementation. At the same time, transformation of relatively simple systems, such as accounting and personnel applications, barely took place. It was not uncommon, for example, to see logistics systems in the 1980s that had been designed in the 1960s still operating all over DoD. Ten-year-old (or older) hardware likewise existed all over the department. To be sure, the causes of this technological atrophy paralleled the experience of the IRS. Both had complex procurement issues; each complained about the lack of sufficient budget and resources; auditors pointed to poor leadership and alternative priorities; and each took over a decade to change any major application. Somehow, however, both survived. To be sure, the IRS came very close to not carrying out its mission of collecting taxes in the late 1980s, and we have yet to see DoD lose a war because of poor IT systems, although, as noted earlier in our discussion of Information Age warfare, some military analysts wonder if that may not yet happen. One is tempted to argue that because of the complexity and size of this department, it experienced every type of application of the digital hand used in the American economy and enjoyed (and suffered) an equally comprehensive set of consequences. But that might not reflect reality. For example, DoD took the lead in innovative uses of IT in education that the rest of the nation, particularly higher education, has yet to catch up on. On the other hand, modern research techniques and the research agendas of universities and companies have been influenced profoundly by the Pentagon; indeed, for the period from the 1940s through the 1960s, it dominated them. While students of the IT industry are quick to point out that commercial R&D took off by the 1970s and created a situation whereby the Pentagon had to play catch-up with the private sector, beginning in the 1980s with the emergence of the PC, the fact remains that DARPA still influences many research agendas. To return to our point about the influence of DoD, it is useful to think of its experiences with IT as reflective of what happened across the economy. President Eisenhower’s notion of the “military-industrial complex” is a useful paradigm to apply here, because DoD, in combination with its key suppliers, did constitute an economic and technology ecology that made DoD an economic “force multiplier,” to use a good military term. The DoD affected how whole industries functioned and used IT, such as airplane manufacturing, transportation, and weapons. By using the military
notion of “force multiplier,” I mean that DoD’s expenditures and the movement of its alumni into industry enhanced the power and influence of the department, creating waves of effects, much like what happens when a pebble is thrown into a calm body of water. In that way, it affected its suppliers and consumers in a manner not so dissimilar to the influence that General Motors had on its suppliers and business partners in the 1960s and 1970s and that Wal-Mart had on its own from the 1980s to the present. In each instance, large size and control over the amount of business parceled out to suppliers of research, equipment, other IT products, and services created technological and economic ecologies that included unique practices, local cultures and values (even language), and economic bazaars. No other public institution in the American economy had this degree of influence. Ultimately, however, we must ask: despite size, complexity of culture, conflicting internal objectives, and the characteristics and use of the technology itself, did IT change the nature of the work of this department over the past half century? The answer is clearly yes. A soldier in 1950 rarely if ever saw a computer; in 2005, every employee and service member used computing at one time or another to do their work, and had for nearly twenty years. In the case of the Air Force, it was more like thirty years. The advanced weapons of the United States that served it so well in the Cold War were profoundly influenced by computing. To be sure, in low-technology wars of insurgency like Vietnam and those waged in the Middle East, American forces consistently punished their enemies far more than the converse. The percentage of casualties suffered by American military forces, both wounded and dead, remained lower than for other nations that did not rely so extensively on IT and other related technologies. That balance of pain may change as widespread availability of IT extends to less developed economies and societies, as evidenced by the effective use of cell phones, garage door openers, and washing machine timers as bomb triggers, which caused thousands of U.S. soldiers to be wounded or killed in Iraq after 2003. But in aggregate over the past six decades, IT provided a better shield protecting American troops, sailors, and airmen than it did for enemies of the United States. Ultimately, the ability to fight and win wars, and to do so with minimal casualties and other costs to the nation, is the objective of the Department of Defense. There is no evidence to suggest that the DoD intends to pull back in its reliance on IT to carry out its reason for existing. Finally, we need to acknowledge an issue that is increasingly getting attention: whether, in the case of DoD, the nation has come to rely too much on technology. It is a question that does not come up with any other part of the public sector, with the possible exception of critics of the use of computing as a teaching tool in classrooms. Secretary McNamara realized in the late 1960s that his reliance on numerical targets, statistics, computerized modeling of events, and extensive aerial bombardment was not leading to victory in Vietnam. The authors of a recent history of the Iraq war have questioned whether Secretary Rumsfeld was suffering from an even worse case of overreliance on technology.188 Meanwhile, an historian of the role of technology in American
foreign policy across all of U.S. history concluded, in regard to the latest war, that “techno-hubris goes far to explain the miscalculations of the civilian planners in the Pentagon who were the main architects of the 2003 invasion of Iraq.”189 Paul Edwards, in his study of the Pentagon during the Cold War, brilliantly documented how both civilian and military leaders viewed the world almost through the ideas and intellectual typologies suggested by the technological architectures of computing and telecommunications.190 In short, did the Pentagon suffer from too much reliance on the digital hand and other forms of technology, that is to say, was it influenced in its worldview by a culture and zeitgeist of information technology? The accumulating historical evidence would suggest that it is a risk senior officials can be exposed to while embracing what otherwise are practical uses of the digital hand.191 If we turn our attention to the flip side of national security, that is to say, away from threats aimed at the United States originating in other countries, and look internally to crime, law enforcement, and judicial uses of the digital hand, what can we learn? It should be of no surprise that police officers, detectives, district attorneys, jailers, judges, and, yes, criminals all used IT. How they did that is the subject of the next chapter.
4
Digital Applications in Law Enforcement

The need to develop, test, and field new law enforcement tools remains as compelling as ever, given the rapidly increasing technological capabilities of criminals.
—William Schwabe, 1999
Crime is big business in America. If law enforcement were measured like an industry, it would be listed as larger than many in the private sector. It has also expanded over time. Expenditures on law enforcement accounted for about 1.1 percent of GDP in 1982 and grew to 1.66 percent in 2001, or by roughly 50 percent in two decades. The growth is even more pronounced than these percentages suggest because the economy itself was larger in 2001 than in 1982. At the start of the new century, nearly 2.3 million people worked in this corner of American society, making it one of the largest employers in the economy.1 In one of the first attempts to quantify the cost of crime to the national economy, the U.S. Department of Justice reported that in 1995, personal crimes cost $105 billion, and, if one included pain and suffering expenses, $450 billion. Violent crime caused 3 percent of all medical expenses and 14 percent of all injury-related medical spending. The department further estimated that crime generated 10 to 20 percent of all mental health costs.2 It is, then, a sector of the economy that cannot be ignored. We can learn much about the deployment of the digital hand in this part of society, because it is an example of how many small organizations around the country acquired IT and what they did with it. This case study also illustrates how initiatives by the federal government affected state and local
governments. In comparison to the IRS or the Department of Defense, most law enforcement agencies were always relatively small organizations. Since in many industries the speed with which computing permeated the day-to-day work of companies and agencies was influenced by their size (scale) and available budgets (scope), how did computing come into law enforcement, where most agencies were small? What effects did the technology have on their operating culture? These are important questions to ask because, by the end of the century, law enforcement agencies had become extensive users of information technology. Computers and all manner of telecommunications were in wide use in police departments, courts, and prisons; even criminals had become dependent on computing and telecommunications. In the process, criminals’ use of the technology created a whole new type of crime, often called cybercrime or simply computer crime. This chapter documents the waves of applications that washed over the law enforcement community over a half century. It is a sector of society that had long relied on vast quantities of data with which to do its work, and so the arrival of the computer, with its ability to handle ever larger volumes of information, was a development very suitable to this “industry.” Specifically, we will look at the use of computing by policing agencies, courts, and corrections, with a brief introduction to the early history of computer crime, since it represents a new class of criminal activity made possible by the existence of the digital hand. There is insufficient space in this book to discuss in detail the role of computers in law firms or in ancillary professions, such as private investigation and IT security, but they are subjects worthy of study by others.
Structure of the Law Enforcement Community

This community consists of various police forces; systems of courts (judges and staffs); myriad local, state, and federal jails and prisons (also called correctional facilities) filled with prisoners; and other individuals, whether those charged with crimes and awaiting decisions by courts or criminals simply loose in society. Sometimes the communities of lawyers and those who work for them (such as private investigators) in the law enforcement world are also described as part of the criminal justice ecosystem. We should also include in this milieu those who are victims of crime to complete the picture of a law enforcement ecosystem.3 The heart of law enforcement consists of a complex patchwork of local, county, state, and federal organizations that provide police protection, trials, corrections, and rehabilitation. Each has varying scopes of geographic and legal responsibilities and authorities; some even overlap. Towns, cities, counties, and states each have police agencies, courts, and prisons. The federal government has specialized law enforcement agencies, most notably the Federal Bureau of Investigation (FBI), military police in each of the uniformed services, and others that concentrate on firearms and tobacco or protect specific places (such as the White House). The Secret Service in the Department of the Treasury is responsible for enforcing laws against counterfeiting American money and for protecting
the personal security of the president and vice president.4 Criminals and victims of crime exist across society at all social levels and in each community, irrespective of conventional policing and jurisdictional boundaries. The total collection of various individuals and agencies presents a picture of highly fragmented sets of players, a feature of this community that had a profound influence on how it deployed computing. A few statistics illustrate the fragmented nature of this ecosystem. At the end of the century, there existed over 40,000 policing jurisdictions (police, sheriffs, state, and other specialized agencies), most of which were small and employed fewer than fifty officers; obviously, the more populous a community, the larger its force. All told, these agencies employed roughly 450,000 uniformed officers. Each incorporated town, city, county, and state also had courts, nearly 16,000 in all. If we add courts and corrections to policing, roughly 2 percent of the American labor force worked in some capacity in law enforcement and justice, accounting for some 2.3 million individuals and a payroll of $8.1 billion in 2001. Nearly 60 percent of all these individuals worked for local governments. In the case of police, eight out of ten worked for local police departments, a circumstance that remained relatively constant across the second half of the twentieth century. Local communities and states also had the largest numbers of courts and correctional agencies. In the case of justice system employees, again using 2001 data, the total reached 488,143, of which nearly 55 percent worked in local systems, another 33 percent at the state level, and the remaining 12 percent at the federal level. In corrections, which employed a total of 747,000, just over 32 percent were local employees, another 63.4 percent worked in state government, and only 4.4 percent were in the federal prison system. From a funding perspective (critical to our story of the computer), nearly half of budgets came from local taxes, another 35 percent from state sources, and the rest from federal funds. It was not uncommon for an agency to spend over 90 percent of its budget on salaries and employee benefits, particularly at the local and state levels. Percentages were frequently lower at the federal level because large amounts of funding went through these agencies to local and state authorities to acquire various technologies (including computers) and for other purposes, such as training. The number of prisoners reached an estimated 1.9 million in 2001, of which 631,000 resided in local jails. About 4 million additional people were on probation and thus interacted with corrections, courts, and police agencies. Overall, over the half century, the number of police, judges, lawyers, corrections personnel, and prisoners grew. For the period 1982 through 2001, for which we have excellent statistics, we can see that expenditures for law enforcement grew by 355 percent, with annual increases of 8 percent fairly typical. As federal involvement in law enforcement increased over the period, its expenditures grew on average by 11 percent per year. Ironically, police departments experienced the smallest growth rates. For the entire period, however, and for all of the various agencies comprising the law enforcement ecosystem, expenditures grew faster than inflation (the consumer price index), which is why, as a percentage of GDP, they expanded from 1.1 percent in 1982 to 1.66 percent in 2001.5
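Those growth figures are mutually consistent, as a rough compounding check (my own arithmetic, not a figure from the sources) shows. Treating 1982 through 2001 as nineteen annual increments,

    1.08^19 ≈ 4.3 and 1.083^19 ≈ 4.55,

so steady annual increases of roughly 8 percent compound to total growth on the order of 330 to 355 percent over the period.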
At the risk of droning on too much with statistics, it is important to look at what drove growth in law enforcement. Here again, a few statistics show the magnitude of the activity, although not the variety of its forms. If we combine crimes against property (such as burglary and auto theft) and violent crime (murder and robbery), we see that the total amount of crime grew over time in all decades except the 1990s, even as the population of the nation expanded. Using absolute numbers of offenses known to police, crimes totaled 2 million in 1960, grew to 5.5 million in 1970, expanded to 13.4 million in 1980, and reached 14.5 million in 1990. Then crime rates declined, dropping to 11.6 million in 2000. However, these data do not include unreported crimes; some surveys suggested that the absolute number of crimes was actually much higher.6 But the trend is obvious: crime occurred in sufficient amounts to drive the large increases in resources devoted to fighting it. The number of victims proved high as well. Even in the 1990s, when the volume of reported crimes dropped, the number of residents touched by personal and property crimes still ran into the tens of millions, ranging from nearly 40 million in 1995 to almost 26 million in 2000.7 Those numbers go far to explain why the public maintained constant pressure on their local, state, and federal policing agencies, courts, and legislatures to “get tough on crime” throughout the period, pressure that contributed in particular to the various federal funding initiatives that made possible expanded use of computing in law enforcement, beginning in the 1970s and continuing to the present.
Adoption of Computing by Local and State Law Enforcement Agencies

Police work is very data intensive, often more so than most local or state governmental activities. Much work occurs in documenting such events as automobile accidents and crimes and in maintaining very large files on wanted persons and on others being processed through the legal system from arrest to trial through imprisonment and parole. Using large files on known criminals as research material for ongoing investigations constitutes another important activity. The most extensive users of data are police on the street inquiring about the backgrounds of individuals they are dealing with and about potentially stolen property, such as vehicles they have pulled over for a traffic violation. Collecting data, sorting it, and doing look-ups are major activities in the lives of police officers and their back office colleagues. Thus there are police-specific uses of data, such as parking and traffic citations (and later lists of these) and assignments of officers to their beats, but also others focused on the mundane data collection and reporting typical of any employer: hours spent on the job, payroll, personnel records, vehicle maintenance records, inventories (even of guns issued to officers). Pools of data used constantly in daily work include jail bookings; investigative reports for crimes and accidents; fingerprint files, often quite massive in large cities before the advent of computing; lists and reports regarding stolen or pawned property, such as jewelry; and automobile
theft and recovery reports and data. Kent W. Colton, a student of policing practices in the 1970s, observed that “police departments process large amounts of information. A great number of events transpire under the jurisdiction of the police, and detailed reports must be prepared on many of them.”8 That reality remains as true today as in earlier decades. The most important circumstance driving the need for better data processing tools in policing was the continuous increase, over many decades, in both the number of crimes reported and the other activities police became involved in, a workload that expanded faster than the budgets and number of employees available to deal with it. Hence the constant hunt for productivity tools, and nowhere is this more evident than in information processing. Every major report on why police used computers linked back to the problem of growing workloads. The biggest gating factor in the adoption of computers was the availability of funding, or, put conversely, the lack of sufficient budgets. Many individuals in the law enforcement ecosystem understood why (or why not) to use computers, and many had realistic views of the capabilities of this technology, particularly by the end of the 1970s.9 Knowledge of computing’s potential was not what determined what and when someone adopted an application. It often boiled down to funding and the availability of people to implement a use. As occurred in so many industries, the largest organizations tended to be the first to adopt computers or advanced wireless communications, because they had the volume of work and the budgets both to justify and to afford them. Over time, ever smaller organizations did too, as the cost of computing dropped or various departments could share systems, such as those developed in the second half of the century by the U.S. Department of Justice, more specifically the Federal Bureau of Investigation (FBI), and by many state governments.10 Communities first installed computers for police work in the 1960s, but extensive deployment did not occur until the 1970s. However, long before the arrival of computers, police departments all over the nation had organized vast files of local paper-based data on criminals and events; and as precomputer information processing tools came along, they used them, most notably the typewriter, telephone, radio, and punched-card equipment. In the 1950s and early 1960s, police departments continued to expand their use of punched-card equipment for accounting, personnel record keeping, and tracking and collecting unpaid parking tickets. Large cities such as New York, St. Louis, and Chicago, which became early adopters of digital computers, also used punched cards to monitor criminal complaint, arrest, and traffic accident statistics.11 This occurred even though first- and second-generation computers, when they became available, often were unaffordable (in hardware and staffing). Also, IT skills within law enforcement agencies were sparse in these early years. Yet police departments slowly became aware of computers, beginning in the mid- to late 1960s, in part through their normal channels of information, such as law enforcement conventions and publications, but also due to an aggressive effort on the part of the U.S. Department of Justice to take an early lead in supporting use of computers.
Figure 4.1
New York City police examining punched-card criminal records, 1961. (Courtesy IBM Archives)
One observer of the American police scene between the 1930s and the early 1970s commented at the end of the 1960s on what was becoming evident to police departments all over the nation: “It is now readily apparent that electronic data processing will have far-reaching consequences in the American police field. The smaller departments in this country stand to be among the greatest beneficiaries of these new developments.”12 The promise of enhanced crime fighting proved central to the case for computers. Our same observer wrote: “The fantastic ability of EDP and its brainchild, the computer, to store enormous amounts of data with split-second retrieval, has prompted police administrators to extend their vision concerning the use of this equipment in law enforcement operations.”13 Police organizations had long recognized the need to collaborate and share information, a practice that could be improved by using computers. On September 18, 1963, New York City’s police commissioner, Michael J. Murphy, commented at a police chiefs’ conference that “criminals move fast and do not restrict their operations to any particular local geographical unit,” and so, “we recognize that the decentralized character of local law enforcement imposes obvious hardships upon us in our struggle against organized crime. But we have the will to establish regular channels for exchange of information and intelligence and to devise procedural machinery for concerted action.”14 Three features of computing proved particularly attractive to police departments. First, when coupled to radio communications, a police officer on the street could call a dispatcher to look up information about whom he or she was dealing with, such as the driver of a car pulled over for speeding: Was the car stolen? Were there outstanding arrest warrants on the driver? Did the driver have a prior
record of being violent? Second, records could be used in the investigation of a crime; fingerprint and mug shot records, for example, could lead to the identification of a criminal and hence, possibly, to solving a crime. Third, there was the requirement of the legal system to provide both documentation and tracking of people being processed through the law enforcement system.15 First uses of computers in policing came in the early 1960s. These focused on automating simple, existing processes and sets of data. The most widely deployed application involved traffic and parking citations. Police and municipal authorities wanted to track and manage tickets to increase the rate at which fines for unpaid citations were collected by identifying and pursuing owed amounts. Collections increased anywhere from 10–15 percent to over 30 percent in many communities in the 1960s and 1970s, generating hundreds of thousands of dollars of incremental income in large cities.16 These dollars represented additional revenue that otherwise would probably never have been collected, because an officer or other official might not have known that an individual owed multiple fines, or because lists of amounts due were not appearing on reports to help officials collect them. With computers, police, courts, and municipalities could start dunning individuals using the same techniques as a normal billing process at a company. All during the 1960s, large city police departments, and regional pools of smaller police departments, led the way in making available computer-based files of locally kept criminal records and others on stolen property (most notably vehicles). These could either be queried by a policeman in a car calling over the radio to a dispatcher (who could quickly access a system) or be viewed at a terminal back at the police station by a police officer. Subsequently (late 1980s), police began using terminals installed in their patrol cars to access directly the data they wanted, further speeding up the process while gaining access to various large collections of online data files. Speed became the critical advantage because a police officer could be told in minutes (or seconds) whether someone he had pulled over for a traffic violation had any outstanding warrants, as opposed to the process in earlier years, where as much as a half hour or so passed while someone searched paper records.17 Major cities like New York, Boston, Chicago, and St. Louis installed computers devoted to such applications.18 Query systems generated considerable use. For example, in Chicago, police routinely probed computer-based files 2,500 times a day by 1967, while police supervisors began creating computer-generated reports on trends based on data in such files to allocate resources around the city and to maintain their automotive fleet.19 The police department in Kansas City, Missouri, is often credited with having the largest number of digital applications in the early years of computing, becoming a role model for many departments. In fact, its police chief, Clarence E. Kelley, acquired a very visible national profile in part because of his extensive use of computing, a reputation that contributed to his appointment as director of the FBI in 1973. He credited the better sets of data available to his officers with helping reduce crime in Kansas City. As one report on his city at the time called out: “Each Kansas City police officer has access to a wide range of necessary information. He can instantly learn if the vehicle he is
following is wanted in connection with a crime, has been stolen, is linked to a known criminal. He can determine if the person he is questioning is wanted for traffic or other offenses, if he uses aliases, if he has been convicted of a serious crime, if he has been known to attack officers.”20 These files were shared with forty-five other criminal justice agencies in Kansas. Quickly checking cars, licenses, people, and criminals became the earliest and most widely deployed uses of computing by police on the street in the 1960s. Often, a number of police departments would collaborate in creating shared files, as happened in such metropolitan areas as Kansas City, San Francisco, St. Louis, Chicago, and New York, often with help and funding from state governments.21 Many of these systems were batch; that is to say, a dispatcher or clerk had to consult tub files of cards or printouts of digital records. However, in 1964, St. Louis became the first city in the United States to deploy a real-time system by which a dispatcher could look up information via a terminal, launching a new era in query applications that spread slowly but assuredly across other police departments in the late 1960s and throughout the 1970s and 1980s. All during the 1960s, St. Louis expanded its online files, storing them on an IBM System/360 Model 40 in the late 1960s. This system processed over 11,000 inquiries per month for such items as stolen vehicles, wanted persons, and aliases, and it maintained rosters of habitual criminals. The police department added new data to its system twenty-four hours per day from thirty-five different locations across the city.22 One of the first major surveys on deployment of computing in law enforcement, conducted in 1971 with hundreds of departments, revealed that just over 100 used computers, while an additional forty or so used punched-card records (using precomputer era equipment). By then, the range of applications had become quite substantial, as documented in table 4.1, expanding well beyond such early uses as queries on traffic accidents and violations, although as late as 1967, these still constituted nearly half the uses of computing. Kent W. Colton reported that “a shift in focus began in the middle 1960’s. Police departments continued to install traffic and crime related files, but the development of real time computer systems to provide rapid feedback of information based on the inquiry of patrolmen in the field became popular,” such as the system deployed by the department in St. Louis.23 Colton noted that by 1971, 20 percent of all police applications were online and focused on outstanding warrants, stolen property, or vehicle registration. Meanwhile, management began receiving various reports dealing with statistics about types and quantities of crimes and the deployment of their resources.24 The stimulus for increased deployment grew out of events that occurred earlier in Washington, D.C. The year 1967 proved to be a pivotal one in the history of police computing. That year, the President’s Commission on Law Enforcement and Administration of Justice published a major report on all manner of practices across the entire justice ecosystem, making various recommendations about how to modernize policing.25 It included suggestions on how best to use computing. The report proved important because federal officials accepted many of its recommendations.
Table 4.1
Early Police Computer Query-Based Applications, 1960s–1970s (digital files accessed)
Police patrol and inquiry (warrants, stolen property, vehicle registration)
Traffic (accidents, citations, parking violations)
Administration (personnel, budget analysis and forecasting, inventory control, vehicle fleet maintenance, payroll)
Crime statistics (criminal offenses, arrests, juvenile activity)
Resource allocation (police patrols and distribution, police service analysis, traffic patrol allocation and distribution)
Criminal investigations (automated field integration reports, modus operandi files, automated fingerprints)
Command and control/computer-aided dispatch (CAD assignments, geographic locations)
Miscellaneous (intelligence compilations, jail arrests)
Source: From two tables prepared by Kent W. Colton, “Computers and Police: Patterns of Success and Failure,” Sloan Management Review 14, no. 2 (winter 1972–73): 78 (see also his narrative description of these, pp. 77–79), and “The Experience of Police Departments in Using Computer Technology,” in Kent W. Colton, Police Computer Technology (Lexington, Mass.: Lexington Books, 1978): 28.
Most notable was its call for expanded funding for the development of IT systems that could be shared with local and state policing authorities. The commission also urged the national government to provide financial resources to pay for other types of non-IT equipment and training, such as more modern radio systems and weapons, a funding process actually begun as early as 1965. Implementation of these recommendations resulted in extensive deployment of IT in the 1970s and 1980s. Indeed, we can conclude that until the early 1970s, police departments barely used computing and remained prisoners of massive paper and punched-card files, and that they were able to change that circumstance only thanks to significant federal funding and leadership in supporting the greater use of computing in law enforcement. A central feature of the new wave of initiatives was the establishment within the FBI of the National Crime Information Center (NCIC), which the FBI equipped with computers “to gather in and squirrel away whole mountains of facts winnowed by thousands of ‘Sgt. Joe Fridays’ and their partners everywhere.”26 The key development involved the FBI’s creating national databases in support of local and federal law enforcement, with police departments all over the country asked to contribute data voluntarily to the system on a continuous basis per standards established by the NCIC. The wisdom of the time across most industries in the IT world was that large centralized systems represented the most effective use of computing; the FBI’s strategy was thus very much a reflection of the norms of the day. Although this conventional strategy evolved into decentralized approaches later in the century, as also occurred in many industries, companies, and agencies, by the end of the millennium the FBI’s databases had become a
massive source of information crucial to the functioning of law enforcement across the nation and in collaboration with police forces in other nations. The commission’s report, and the subsequent availability of new sources of funding for policing and the creation of the NCIC, had been driven less by the increased capabilities of computers that were emerging in fairly dramatic forms at that time (remember, this was when the S/360 and its competitors were being deployed across the economy) than by increased criminal activity. In 1965—the last year before the commission’s report for which current data existed—nearly 2.8 million crimes had been committed, up 5 percent from the prior year, with no end in sight. Of these, nearly 1.2 million consisted of burglaries. Forcible rape had increased 8 percent over the prior year. What proved most disturbing was the fact that the crime rate had doubled since 1940 and, in the first five years of the 1960s, expanded five times faster than the growth in the nation’s population. In short, many politicians and law enforcement agencies had concluded that the country had a major crisis on its hands that required extraordinary initiatives by the national government. The FBI decided that data housed at the NCIC had to be made instantly accessible to all law enforcement agencies, along with assistance of other types. To illustrate its sense of urgency, the FBI started planning creation of the NCIC even before the report was completed. Before the end of January 1967, the FBI had it up and running.27 Law enforcement agencies around the country began implementing the applications listed in table 4.1 all during the 1970s. Big cities automated their files and fed data into the FBI’s. Regional and state-wide collaboration projects cropped up all over the nation.28 The NCIC had been a pilot project in the 1960s with sixteen law enforcement agencies accessing online files for wanted persons, stolen property, and criminal events. By the start of 1970, over 2,000 law enforcement agencies out of a total of 40,000 had access to these files, which by now contained 1.7 million active records on wanted persons, vehicles, boats, license plates, guns, and even stocks and bonds.29 State governments also built local databases with additional information on criminal activities in their region. A study done in the early 1970s reported that half the surveyed police departments would not have been able to deploy computing if not for the expenditures on such programs by the federal government.30 By the mid-1970s, law enforcement agencies were reporting increased effectiveness in fighting crime and in collecting fines. A major new development in the 1970s was computer-aided dispatching, by which calls for assistance came to operators equipped with online data on where police patrol cars, fire engines, and ambulances were located, so that they could be directed in a speedy and productive manner to the scene of a crime or incident.31 By the end of the decade, almost all of the 212 largest metropolitan police departments had various assortments of computer-based police information systems, with over half relying on online access to information, some data systems maintained by state governments and others by the FBI, such as those at the NCIC.32 Looked at from the perspective of what cities were investing in regarding computing, law enforcement now ranked second only to financial applications such as payroll, accounting, and tax collections.
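At bottom, the query applications just described were fast keyed lookups against centrally maintained hot files. The following minimal sketch (our illustration in Python; the plate numbers, record layout, and file contents are invented, not drawn from any actual system) shows the shape of the transaction: a dispatcher keys in a license plate and gets back any stolen-vehicle or warrant hit in seconds, rather than after the half hour a manual search of paper files could take.

from dataclasses import dataclass
from typing import Optional

@dataclass
class HotFileRecord:
    plate: str    # vehicle license plate, the lookup key
    status: str   # e.g., "stolen" or "warrant outstanding"
    detail: str   # free-text remarks read back to the officer

# A dictionary stands in for the indexed disk files a mainframe kept;
# these entries are fabricated for illustration.
HOT_FILE = {
    "ABC123": HotFileRecord("ABC123", "stolen", "reported stolen 5/2/1968"),
    "XYZ789": HotFileRecord("XYZ789", "warrant outstanding", "registered owner wanted"),
}

def query_plate(plate: str) -> Optional[HotFileRecord]:
    """Normalize the plate as keyed in by a dispatcher, then look it up."""
    return HOT_FILE.get(plate.replace(" ", "").upper())

if __name__ == "__main__":
    hit = query_plate("abc 123")
    print(hit.status if hit else "no record")  # prints "stolen"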
Query systems for files dominated the applications of the day, with computer-aided dispatching a close second in use for both local and state police. During the decade of the 1980s, local policing systems expanded to smaller municipal, county, and state law enforcement agencies, while large municipalities added new applications or upgraded existing ones to include more data or to take advantage of newer computing and communications equipment and software. Besides more policing agencies using computers to access either their own files or those of state and national databases, agencies increasingly integrated systems. These included linking personnel and patrol car location files to assign work or to support computer-aided dispatching, and linking files related to thefts, criminals, and missing persons to other records in support of police investigations.33 Law enforcement wanted to link together the four basic sets of records used in daily work: systems used for services and complaints, incident and offense reports, name indices, and, of course, arrests. Although integration became possible, affordable, and desirable, by the end of the 1980s many departments still had not achieved this objective and remained burdened with many paper files, particularly sheriffs’ departments and those in smaller communities. One report in 1989 described the continuing set of circumstances:

Some of the most common problems are use of valuable office space to store large stacks of old records, high labor cost of manual file manipulation, the disappearance of documents due to misfiling, the inability to keep pace with records management operations due to limited manpower, unavailability of vital information because controls such as check-out cards simply are not used, and postponement of trials because key witness investigative reports are misplaced or lost.34
Yet at the same time, when computers were used, departments achieved productivity gains. In a rigorous study of the role of computers in detectives’ work in forty departments, two professors well versed in the use of computers by local governments found that two-thirds of the detectives made extensive use of both batch and online systems in their work most of the time. A third of the detectives reported that they could not have successfully completed their work without using computer-based data, such as in making arrests and clearing cases. However, the researchers concluded that “the computer revolution has not touched all the detectives . . . nor has it touched the detectives evenly.”35 In large cities, which continued to be the most extensive users of computers, all manner of IT and communications made their way into policing functions in the 1980s. In surveys done in the 1980s and early 1990s, these cities reported an increase in the number of IT support personnel on their payrolls, and, of course, they spent more on computing per capita than smaller departments. Large cities also acknowledged having fewer police per capita, in part due to productivity increases made possible by computing speeding up and improving their work. One study of 188 police departments conducted in 1993 reported that “urban police agencies had become highly computerized,” extensive users of mainframes
and personal computers largely devoted to management and administration, processing reports and data, and supporting the collection and use of crime evidence.36 New applications also spread in the 1980s, most notably digital crime maps. Police had been using paper versions since around 1900: maps of a city or town with different colored pins noting where various types of crimes (or incidents, such as automobile accidents) occurred, which helped management determine when and where to deploy resources. As these maps were updated, information about prior incidents was lost, so they remained static snapshots; furthermore, they took up much wall space. In the 1970s and 1980s, mainframe computer mapping came into its own. By the early 1990s, one could access such maps on a PC. Data entry proved to be a labor-intensive operation, so beginning in the mid-1980s, departments began feeding data to map software from other reports and used color printers to eliminate hand-drawn maps. One could begin looking at data in real time, start tracking and assessing historical patterns, and later even model scenarios for deploying and responding to events. Often, these maps were by-products of a community-wide use of digital mapping called Geographic Information Systems (GIS), used to plan maintenance of roads, water pipes, sewers, and so forth, about which more is presented in chapter 7.37
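The kernel of these digital crime maps was simple: replace colored pins with incident counts binned into grid cells over a city, so that the map could be reprinted as data accumulated instead of losing history each time pins were moved. A minimal sketch of the idea (our illustration; the coordinates and cell size are invented):

import math
from collections import Counter

def bin_incidents(incidents, cell_size=0.01):
    """Count incidents per grid cell.

    incidents: iterable of (latitude, longitude) pairs
    cell_size: grid cell edge in degrees (about 1 km of latitude)
    """
    grid = Counter()
    for lat, lon in incidents:
        cell = (math.floor(lat / cell_size), math.floor(lon / cell_size))
        grid[cell] += 1
    return grid

# Fabricated incident locations, for illustration only.
incidents = [(33.749, -84.388), (33.750, -84.389), (33.760, -84.400)]
for cell, count in sorted(bin_incidents(incidents).items()):
    print(f"cell {cell}: {count} incident(s)")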
Figure 4.2
Early computer-based maps in use, Atlanta Police Department, late 1970s. (Courtesy IBM Archives)
But how did more traditional crime fighting data evolve? Fingerprinting is a unique data file for police, courts, and corrections. A brief history of the evolution of this type of data suggests patterns of use and effects brought about by the digital hand. For just over a century, police departments all over the world had used a fingerprint identification system called the Henry Classification, named after the Scotland Yard official who developed it in the 1890s. It recognized that no two humans had the same fingerprints; that taking a copy of a person’s prints on a card took only minutes and proved simple to do; and that people left fingerprints on everything they touched. “Lifting” prints at a crime scene and comparing them to a set of prints on a fingerprint card provided credible evidence about whether someone participated in a crime. Every American police department had sets of fingerprints of people they arrested; state and federal governments did too, always posted on a standard-sized card that one could find by name or type of fingerprint. Yet manual searches of fingerprints were laborious and slow, and for decades everyone had to maintain massive files. When police began understanding what computers could do, it was almost inevitable that someone would find a way to automate the process of fingerprint identification. As one criminologist of the 1950s and 1960s noted, the arrival of the computer “qualifies on a scale of importance for a position almost equal to the original introduction to police service of the fingerprint identification system.”38 The reason is simple to explain: computers could hunt for the right card in seconds rather than in minutes or hours and could compare all the loops and arches in fingerprint images to those found at a crime scene, also in seconds, and often more accurately than humans. In the 1960s, the potential for improved productivity and accuracy was obvious and stunning. Prior to the computer, an examination of one card took an average of twelve minutes; by the late 1970s, computers had cut the time by two-thirds, and by the end of the century much further. The Henry Classification system remained in use, but increasingly in an automated form. One of the first digital fingerprint systems went into operation in March 1976 in Arizona, using a Sperry system with software developed for the purpose by the firm. Initially, 200,000 cards were converted into machine-readable form, using optical scanning. After several months of use, an analysis of performance showed that in 86 percent of the cases, when a search of the files took place, the system correctly matched the data requested, a rate far higher than manual approaches achieved.39 The FBI had the largest collection of fingerprint cards in the country, serving as a clearinghouse of information for local law enforcement agencies; local departments contributed fingerprint files as they created them. Reliance on these rose as well, indeed, so much so that the FBI had to build a complex conveyor belt system in its headquarters to move cards from storage to analysts, who pored over them. During the 1980s, the FBI moved to a hybrid manual/automated system for cataloguing and searching files. All during that decade, and into the 1990s, it moved records to computers in an attempt to keep up with the growing supply of prints and requests for identification of prints collected at crime scenes. Meanwhile, states and large cities also began automating their files, displacing ink-and-paper fingerprinting in the 1990s by scanning fingerprints from people right into digital files. By then, the application had its own name, automated fingerprint identification systems (AFIS), and vendors were selling software and scanners for these. Over time, technical standards were crafted, largely by the FBI. By the end of the century, there were over 32 million digitized fingerprint files. To put a fine point on the volume of data involved, each file
consisted of prints of all ten fingers, and even after compression, each occupied 750 kilobytes of information.40 In the 1990s and early 2000s, fingerprinting led next to facial images being collected in digital form, which also could be compared and analyzed to identify people, for example by measuring distances between facial features. Most recently (early 2000s), retina scans could also be digitally collected, stored, and analyzed, a growing application used by border management organizations (such techniques are often called biometrics).41 The general application of digital fingerprinting proved enormously successful. In addition to playing a major role in the percentage of cases ending in arrests, these files allowed the law enforcement community to link fingerprint records to other criminal files, right through to trials, incarcerations, and postconfinement activities. Every large state, and many others as well, had various systems by the end of the 1980s; they all also had access to the FBI’s. The federal agency made it a point to ensure that police departments all over the nation understood how the technology worked and shared their files. The volumes involved proved staggering. In 1990, for example, by which time digital fingerprint files were in wide use, the FBI received 17,900 local files each day into its database, and by 2000, that number had grown to 24,000 per day.42 States that had long led in the deployment of computing, such as New York, California, and Missouri, also made creation of their own local systems a priority, usually built by state governments and housed within their state police agencies. All reported faster and more accurate identification of criminals, many of whom might otherwise have gone undetected. On average, such systems got a “hit,” that is to say, made an accurate match of person to prints, in about 70 percent of the cases the first time a request was entered into a terminal, versus one percent with a manual search.43 The U.S. Justice Department reported that by the late 1980s, thirty-nine states had their own digital fingerprinting applications. By the end of the century, almost every state had access to such applications.44 In the 1990s, police departments in large cities began merging fingerprint data with photographs of individuals and other files, such as criminal records, evidence of crimes, and 911 system audio and video recordings. However, most police departments in the 1990s were not as fully aware of the possibilities of merging images into records as the technology of the day permitted, nor did they have the necessary financial resources to invest in them. So, as with so many other applications, large cities, states, or the federal government pioneered the next wave of applications. One of the first to do so was Cook County, Illinois, home to Chicago, which piloted its first integrated system in the mid-1990s.45 Over the next decade, the application spread across many large American cities and states. By the end of the century, law enforcement agencies had noticed a significant improvement in their ability to do their work because of such digital systems. They all reported that these systems provided faster searches and faster filing, required less storage space for files, displayed higher-quality fingerprints (no smudged images were allowed to get into a system), and proved cost effective.46
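To give a sense of the storage such figures imply (this is our back-of-the-envelope arithmetic, not a number taken from the sources), 32 million files at 750 kilobytes each works out to

\[ 32{,}000{,}000 \times 750\ \mathrm{KB} = 2.4 \times 10^{13}\ \mathrm{bytes} \approx 24\ \mathrm{terabytes}, \]

an enormous collection by the standards of 1990s disk storage.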
Similar tales could be told about other key applications of the 1980s and 1990s. Computer-aided dispatch expanded, often integrating data, records of where patrol cars were located, and online maps, and later linking to other emergency response agencies, such as fire and medical services, so that a dispatcher could get a call, look up where everyone was, and respond.47 Or a supervisor could determine when and where to assign fire, police, and medical resources. These were important uses of computing because they automated a great deal of the work associated with receiving and responding to calls from the public and provided control and management over staffs moving about a community.48 A similar story of innovation and improved access could also be told about moving information-handling equipment closer to its users, most notably laptops placed in patrol cars, fire engines, and other vehicles in the 1990s to provide mobile computing. Enormous investments were made in such applications in the 1990s. Between 1994 and the end of the century, the federal government alone gave state and local governments over $330 million for mobile computing.49 As law enforcement agencies expanded their use of such systems and telecommunications, they built up complex IT infrastructures that mimicked what companies were installing across the private sector.50 The one type of technology police departments struggled with more than any other involved telecommunications. Radios in patrol cars had been in use since the 1930s and remained essentially the same right into the 1980s. As communities began integrating the work of police, fire, medical, county, and state agencies, incompatibility in radio systems had to be addressed—a problem that existed even as late as 2001 in the city of New York at the time of the 9/11 terrorist attack.51 In addition, as data became increasingly digital in the 1980s and especially the 1990s, the desire of police departments to transmit images and files to a patrol car grew. New technologies, even the Internet, additionally made such applications possible and desirable. Systems for 911 emergency calls were an important part of the communications ecology. First introduced by AT&T in 1968 and initially implemented by the end of 1971, such systems, which the public used to report crimes, emergencies, and other crises, spread across the nation over the next three decades. These were expensive, complex systems that in time merged digital and telecommunications functions. All during the 1990s, every form of communications technology that appeared in the market was used with 911 and other policing applications: laptops, local area networks, digital PBX systems, and the Internet.52 Analog communications networks, which were not secure (i.e., a criminal could listen in on them too), were slowly converted to digital versions. One survey conducted in 1998 reported that 13 percent of law enforcement agencies had made the transition to digital voice communications, while another 55 percent said they were planning to do so soon.53 The same survey reported that when wireless communications were used, 70 percent of all users did so to access NCIC files, roughly half to do data queries, send e-mail, and do their reporting, while much smaller percentages were transmitting pictures of wanted criminals (law enforcement professionals call these “mug shots”)
(19 percent), maps (12 percent), and using GPS (10 percent).54 All these percentages rose in the early years of the new century. Students of the problem blamed a lack of sufficient funding to migrate to more integrated systems, rapidly changing technologies, and the age and incompatibility of preexisting equipment.55 The portfolio of computing applications in wide use by the early 1990s spanned most functions in law enforcement. Table 4.2 lists those that law enforcement agencies used, although to varying degrees across the country, with the largest agencies the most digitized. Yet all had access to various regional, state, or federal systems, primarily to search for information.56 A new initiative in the early 1990s began to integrate files on criminal activities across the nation even further. For years, agencies had used a reporting system called the Uniform Crime Report (UCR), which provided statistical report cards on the performance of law enforcement agencies across the country. Results reported in the UCR encouraged federal and local officials to apply technology to the war on crime by promoting a crime reporting mentality, providing management with data they could use to fight crime. These data included statistics on response times and arrest and crime rates, causing a shift by management from making decisions based on anecdotal evidence (or experience) to those based more on quantitative data.57 Working together, various agencies developed a concept that would better leverage computing’s capabilities, called the National Incident-Based Reporting System (NIBRS).
Table 4.2
Sample Police Uses of Computing, circa 1990
Animal licensing
Arrest/crime records
Automated vehicle location
Case disposition reports
Case management
Citation control
Computer-aided dispatch
Computerized sketching
Crime analysis
Crime lab operations
Criminal associates lists
Criminal history
DWI/DUI
Evidence management
Fingerprint processing
Firearms registration
Fleet management
Fraud offenses
Gang activity
Intelligence gathering
Inventory control
Juvenile records
Law violations
Licenses/registration
Mapping/geocoding
Missing persons
Name indices
Narcotics control
NCIC data entry
NIBRS
Organized crime
Parking tickets
Pawned articles
Report writing
Stolen property
Summons management
Traffic accident reports
Traffic case processing
Traffic tickets
Transport of prisoners
UCR
Vehicle inspection
Source: U.S. Department of Justice Programs, Bureau of Justice Statistics, Directory of Automated Criminal Justice Information Systems 1993, vol. 1, Law Enforcement (Washington, D.C.: U.S. Government Printing Office, 1993): 809–831.
Digital data on forty-six classes of crime would be collected, with a half dozen specific types of information for each, organized in a standard format that agencies could access all over the country. Many cities already had stand-alone crime data management systems that would have to be converted to this one so that more organizations could share these data. During the 1990s, the FBI funded various pilot projects to start developing such a system.58 Meanwhile, crime-mapping applications continued to spread, and data from those systems also were distributed to patrol cars, not just to supervisors.59 Technological innovations and improved cost performance, when coupled to federal and state funding for systems, facilitated the waves of adoption that had taken place. Yet one other innovation in policing also influenced adoption of computing. Beginning in the 1980s and extending right through into the new century, law enforcement agencies, and particularly urban police departments, changed tactics for how they provided protection to communities. They moved from simply responding to requests for aid to more community-based policing, in which citizens, other agencies, and the police worked together to reduce and prevent crime. That shift in tactics generally required moving from localized policing applications of computing to different ones that supported this new approach. One police official in Philadelphia left us an explanation of the problem so many senior police managers now faced: “This tendency toward sporadic, highly specified, and hurried technology acquisition has created a maze of stovepipe systems of varying technological architectures that can be efficiently completing their tasks, but cannot, as a system, provide decision makers with a decision support concept that encourages strategic thinking and decision making in the organization.”60 Yet increasingly, that is exactly what computing had to help with as roles changed. Emergency 911 systems were merely one set of examples of this process at work. Arming the cop on the beat with information now became even more crucial. A report late in the century described the interrelationship between this kind of policing and technology: “Information technology has migrated from centralized mainframe/dumb-terminal architecture towards distributed, client-server designs linked in local- and wide-area networks. This evolution has dovetailed with law enforcement’s widespread adoption of community-oriented policing strategies, which tend to decentralize authority and decision-making down to the precinct.”61 In a culture in which policing had always been quasi-military in form, and in which supervisors back at the police station made all the key decisions, responsibility for decision making was moving outward, supported by officers in the field having data that in prior times would not necessarily have been available to them. This change in the culture of policing paralleled, in the 1980s and 1990s, what happened in the private sector as increasing amounts of information and access to data shifted from management and headquarters locations to workers on the factory floor, in call centers, and in many other functions.62 It is in this context of policing and IT that we can begin to understand how the Internet became part of policing’s repertoire of tools. Use of the Internet
by law enforcement agencies (primarily local and state police and, to a lesser extent, sheriffs’ departments) in its early stages can best be understood within the context of community policing. As police departments reached out to communities for help in preventing crime or in assisting police in other ways, the Internet became a tool both could use, and its adoption began spreading slowly in the mid-1990s. Police departments around the country set up Web sites with information about their organizations and initiatives, then posted requests for help, along with wanted posters (what historically U.S. Post Offices displayed on their bulletin boards) and information to help victims of domestic violence. As in the past, funding from the federal government provided much of the financial wherewithal for departments to make the initial investment in constructing Web sites. Meanwhile, the FBI integrated this technology into its revamped databases in the NCIC, which by then had over 43 million digitized fingerprints and some 30 million criminal histories.63 By late 1996, some 500 police Web sites existed in the United States and Canada.64 By the early years of the new century, most police departments had either their own Web sites or shared space on their local governments’ sites. This tool proved to be an important channel of communications between citizens and police on such basic matters as requesting tips about crimes, informing communities of various policing initiatives, and providing contact information, including whom to e-mail in a police department.65 Finally, police quickly began using the Internet in their investigations of crimes. As the amount of data available on the Internet increased, particularly after 1998, it became an important research tool for investigators and, by the early 2000s, an essential component of a police department’s operations. In support of police departments, various police associations, federal law enforcement agencies, and other institutions all began posting material useful to each other on the Web. One of the earliest inventories of such Web sites (circa 2001–2002) listed hundreds of them.66 Deployment of the Internet mimicked patterns evident in many other government agencies, most companies, and industries. The first Web sites provided information and contact data (addresses, telephone numbers). Every agency that installed a Web site went through this phase, and so in any year after 1995 there were always some organizations at this stage of maturity. Second-generation Web sites began appearing in the late 1990s, still offering information (mission statements, messages from the local chief of police, and other data). By the end of the century, a third generation appeared, distinguishable from prior sites in that these were interactive, now having the facility for police to post requests for help in solving crimes, frequently presenting statistics and other information about the evolving crime situation in a community, all the while giving citizens the ability to communicate back and forth with police over the Internet. As one would thus expect, the amount of data being presented over the Net increased, such that by about 2004, these sites had become fairly standard, even ubiquitous across the nation.67 In the final analysis, policing agencies had made quite a transformation over the previous thirty to forty years. From not using computers until the early 1960s
to the end of the century, much had changed, a great deal of it in the 1990s alone. Over that decade, for example, the share of police departments in large cities (those with over 250,000 residents) using computers in a wide variety of activities grew from 90 percent to 100 percent.68 Even the mundane use of in-field computers reached over 90 percent in the same decade. The proverbial “everyone” used automated fingerprint and computer-aided dispatch systems. Movies and novels often characterized sheriffs as either gun-toting John Waynes or semi-illiterate, power-hungry Southern power-brokers. Nothing could be farther from the truth. Sheriffs encountered computers much as urban police did, adopting them for the same reasons and similarly slowed in acquiring more, due largely to budgetary constraints. Almost every sheriff’s office in the United States used computers for one purpose or another by the end of the century. Large counties relied extensively on computing to handle most administrative functions; nearly 90 percent used personal computers. Like police, they stored data on criminal activities and accessed the same FBI and state criminal records. Just over half used computer-aided dispatch systems, and nearly 40 percent of smaller, less funded counties did too. In short, like town and city police, sheriffs had become highly dependent on computers to do their daily work.69
Role of the Federal Government in Law Enforcement Computing

While various federal agencies, such as the Secret Service and the Department of Justice, performed law enforcement and policing work, the center of national activities rested in the FBI, itself one of many bureaus within the DoJ. Both the department and the FBI were responsible for enforcing federal laws, so in that capacity they did many of the same things as local law enforcement: policing, arresting, prosecuting, and jailing. Thus, many of their uses of computers mirrored those of local police, courts, and prisons. Because of the scale of its operations, which involved law enforcement across the nation and much activity in other countries (such as in embassies), the department became an early user of every form of information technology to appear in the twentieth century, not just computers. The department also played a support role in helping local law enforcement with computers, as discussed earlier in this chapter. Beyond funding local initiatives and investments in equipment, the FBI assisted in investigations through its forensics capabilities and research into its massive fingerprint and other criminal files. The department also had responsibility for tracking crime across the nation, reporting results to other government agencies and to the public at large. The NCIC within the FBI became the centerpiece of much law enforcement activity, of course, and an extensive data processing center for the nation’s law enforcement community. Its earliest use of computers in the 1960s involved collecting and maintaining batch files, which in the 1970s it made accessible online with terminals physically placed in police departments and state police agencies.70 Over the next several decades, the FBI enhanced its files and upgraded
and updated its technologies at a far more effective pace than either the IRS or DoD. Important new files were created along the way. For example, in 1971, the NCIC launched its Computerized Criminal History (CCH) database. CCH collected information about individuals and fingerprints of people arrested for committing major crimes, and it acquired additional data on what crimes were committed and on their disposition. By the 1980s, the NCIC managed fourteen major databases (see table 4.3). These files were rich deposits of information, which the NCIC expanded in number and in volume of detail per person or crime over time. This repository provided a massive body of information that the FBI and DoJ could use to help solve crimes and to perform data mining so as to better understand patterns of crime in the United States. While these files were originally housed on twin IBM System/360s in the 1960s, demand for computing kept rising, making the FBI an important user of the largest available computers. By the end of the 1980s, for example, the center used several of IBM’s very large 3033 computer systems to support some 60,000 local, state, and federal agencies that needed access to its data. By 1990, terminals and personal computers were intermixed in the network so that data could be queried, updated, and added to in various ways.71 The FBI spent the second half of the 1960s building its various databases, piloting systems, and so forth. The Computerized Criminal History (CCH) system went “live” on November 30, 1971, and over time increasing numbers of states began contributing data to the system. From the beginning, these databases were used by law enforcement.
Table 4.3
Major Databases, National Crime Information Center, 1980s
Wanted persons
Missing and unidentified persons
Criminal history and fingerprint classification
Stolen and felony vehicles
Recovered vehicles
Stolen and recovered firearms
Stolen and recovered heavy equipment
Stolen and recovered boats and marine equipment
Stolen license plates
Stolen and recovered securities
Stolen and recovered identifiable articles
Canadian warrants
U.S. Secret Service protective file
Originating agency identifier file
Source: J. Van Duyn, Automated Crime Information Systems (Blue Ridge, Penn.: TAB Professional and Reference Books, 1991): 5–16.
Use rose all through the last three decades of the century, although slowly in the beginning. The development of CCH, along with the activities of the NCIC in the period from the late 1960s to the late 1970s, marked one of the more active times in the FBI’s deployment of IT (see table 4.4). Yet one analysis from the late 1970s reported the FBI’s “lack of enthusiasm for continued participation in the CCH system” due to “lack of state participation, underestimation of costs and effort which would be required to establish, collect, and maintain data for the more elaborate CCH record format.”72 Insufficiently disciplined data entry and a lack of adequate technical capabilities at the local level hampered early deployment of the FBI’s databases, problems, however, that were solved slowly over time. As of 1978, eight states were contributing data to the system, but participation grew steadily over the years. At the time, however, twenty-six states were accessing the system for data.73 So local officials were more willing, or able, to access existing files than to contribute to them. Funding, skills, and resources served as gating factors, as they did at DoD and at the IRS. However, the Department of Justice knew what had to be done. In the late 1960s, it had established the Law Enforcement Assistance Administration (LEAA) to help the states. In 1969, LEAA launched Project SEARCH, a consortium of state governments formed to build and demonstrate the feasibility of a computerized network that would allow states to exchange data on criminal histories. The FBI would manage the network.
Table 4.4
Key Computing Activities, NCIC and CCH, 1967–1977
1967: Commission on Law Enforcement and the Administration of Criminal Justice recommended deployment of decentralized systems
1969: Project SEARCH was created to develop a state-level network to exchange criminal history information
1970: FBI given control over Project SEARCH criminal history index
1971: FBI announced it added nation-wide criminal history data bank to NCIC
1973: Major discussions, problems faced regarding standards, security, cost, and use of CCH with states
1974: FBI authorized to serve as telecommunications switch for NCIC-related messages
1975: Justice Department began publishing standards and regulations regarding dissemination of criminal records and histories
1976: LEAA issued regulations regarding dissemination and sharing of computer systems
1977: FBI requested that it terminate its participation in CCH
Source: Office of Technology Assessment, A Preliminary Assessment of the National Crime Information Center and the Computerized Criminal History System (Washington, D.C.: U.S. Government Printing Office, December 1978): 77–80.
In 1972, LEAA began funding work to encourage states to develop criminal justice systems at the local level.74 In the 1980s, the FBI shut down CCH and replaced it with a decentralized national criminal history record system, an approach preferred by the states as easier and less expensive to support. States had enhanced their technical capabilities all through the 1970s, making decentralized systems possible in the 1980s. By the late 1990s, over forty states had local criminal databases, contributed to the national files maintained by the FBI, and had networked to systems maintained by other states.75 These systems were popular and grew in size. At the end of the century, they held collectively more than 59 million records of individual offenders in criminal records files, up from 30.3 million records in 1984 and 42.4 million in 1989. Put another way, the states doubled the size of their digitized records between 1984 and 1999. In addition, at the federal level, there were some 43 million files on individuals by the end of the century. With financial support from the federal government, the number of states accessing such files increased. In 1999, forty states reported that more than 75 percent of their criminal record histories were now automated. A substantial increase in deployment of digitized records had taken place in the 1990s; in fact, twenty-six states had these kinds of records by 1992, and forty did at the end of the century.76 If we add into the mix the growth in state and federal fingerprint files that had been digitized, one can conclude that the states had caught up with the federal government by the late 1990s. Equally important, by now states routinely contributed information to the FBI’s files, while they had also become dependent on their own systems for the daily work of law enforcement. Of course, things were never perfect. Many inside and outside law enforcement expressed concerns about the accuracy of information. They debated who should have access to it and worked on other issues related to privacy.77 The federal government worked through many of these issues all through the 1980s and 1990s, a process that continued into the new century. One brief example involved the development of a standard “rap sheet” that all law enforcement agencies could use to document an individual’s criminal record (much like a job resume), an initiative launched in the mid-1990s. In addition to dealing with issues of accuracy and privacy, those who drafted the proposed standard settled early on the use of Internet-based technologies.78 Like the states, the FBI and other federal law enforcement agencies began to view use of the Internet in a favorable light late in the 1990s, primarily because effective data and transaction security and encryption systems were now available. They reacted much the same way as the financial and retail industries did to the evolving technology at the same time.79 Meanwhile, the FBI continued modernizing its internal systems. An important initiative of the period, called NCIC 2000, improved the NCIC’s aging telecommunications system, hardware, and software, eliminating entirely the exchange of paper-based records, for example, and adding new capabilities made possible by technological innovations, such as graphical data, including mug shots, pictures of tattoos, and signatures. Improved data mining and search functions were added, initiating an early but important application of artificial intelligence methods that could work with seventeen databases, not just the
original fourteen. NCIC 2000 went live on July 11, 1999. Within a year, it was processing over 2.3 million transactions each day.80 Implementation of the Brady Act, passed in 1993, then became the next major IT issue for the DoJ because this law called for criminal background checks on people wanting to buy handguns. It further required the U.S. Attorney General to build a computerized system for that purpose within five years. That digital tool, called the National Instant Criminal Background Check System (NICS), became operational on November 30, 1998, making data available within thirty seconds of an inquiry by accessing the FBI's preexisting criminal databases. By 2001, over 30,000 inquiries were being made every day.81 In short, in addition to its normal law enforcement duties, the FBI and its parent department continued serving as the nation's central hub for major new law enforcement IT initiatives. In each of its annual reports, the Department of Justice discussed its digital initiatives as being as important as its traditional law enforcement and prosecutorial responsibilities. In any year in the 1980s and 1990s, it had a half dozen or more major IT systems under development in support of its missions, those of the states, and even those of other federal agencies responsible for law enforcement.82
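To make the mechanics of such an instant check concrete, the sketch below shows, in modern Python, how a NICS-style inquiry might fan out across several preexisting criminal-history files and return one of the three responses the system gave: proceed, denied, or delayed. This is a minimal illustration, not the FBI's actual design; the file names, record layout, and the handling of the thirty-second budget are all assumptions.

```python
# Illustrative sketch of a NICS-style instant check; file contents,
# field names, and the 30-second budget are assumptions for demonstration.
import time

# Stand-ins for the preexisting criminal-history files the check consults.
FELONY_CONVICTIONS = {"123-45-6789"}   # disqualifying records
OPEN_WARRANTS = {"987-65-4321"}
PENDING_CASES = {"555-12-3456"}        # ambiguous records needing examiner review

def instant_check(subject_id: str, budget_seconds: float = 30.0) -> str:
    """Return 'PROCEED', 'DENIED', or 'DELAYED' for a firearms inquiry."""
    start = time.monotonic()
    # A hit in a disqualifying file denies the sale outright.
    for disqualifying in (FELONY_CONVICTIONS, OPEN_WARRANTS):
        if subject_id in disqualifying:
            return "DENIED"
    # An ambiguous hit defers the decision to a human examiner.
    if subject_id in PENDING_CASES:
        return "DELAYED"
    # If all files answered within the time budget, the sale may proceed.
    elapsed = time.monotonic() - start
    return "PROCEED" if elapsed <= budget_seconds else "DELAYED"

if __name__ == "__main__":
    for ssn in ("111-22-3333", "123-45-6789", "555-12-3456"):
        print(ssn, "->", instant_check(ssn))
```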
Computers and the Courts

Courts were among the last users of computers in the law enforcement ecosystem, embracing the technology in tentative ways in the 1960s and 1970s, and not substantially until the 1980s and 1990s. When courts finally did use computers, they did so for the same reasons as others in the world of law enforcement and the legal profession: to gain control over mounting loads of cases and their attendant paper-based records. Like police systems, theirs began as standalone applications, which courts integrated over time across a state or federal legal system. In time, they integrated with police systems in criminal cases as well. As technology became less expensive and easier to use, and as application software arrived on the market, courts could afford the time and effort to start weaving computers into the fabric of their daily work. The most fundamental force driving the need for information handling tools proved to be the rising volume of work faced by all court systems across the United States during the second half of the twentieth century. The volume of cases rose faster than the number of new courts and judges available to deal with them at the local, state, and federal levels. For instance, between 1950 and the end of the century, the number of federal cases alone expanded from nearly 100,000 to over 375,000.83 Both civil and criminal cases grew in volume. Civil cases tended to be more data intensive than criminal cases, and they increased in volume by 86 percent between 1990 and 1995, a period that saw extensive deployment of computers by courts that tried both civil and criminal cases. In that same decade, filings for bankruptcy doubled, reaching 1.3 million in 1999 and forcing courts to handle vast quantities of paper.84
We could tell a similar tale of increasing workloads about local and state judicial systems. Throughout the second half of the century, over 80 percent of all judges and courts were state and local. To provide a comparison with the federal level, again using 1999 to illustrate the order of magnitude of the work: 91.8 million cases were filed in state courts that year, ranging from traffic to civil cases, along with criminal, domestic, and juvenile cases. A massive 54.4 million of these involved cases in traffic courts, far surpassing all other categories combined.85 Some 14,000 trial courts, staffed with 18,000 judicial officers, were the lowest level courts in the nation, where most cases began their judicial odyssey. These courts represented 85 percent of all judicial bodies in this country. These lower courts handled 67 million matters annually by 1999 out of a grand total of 91.8 million cases filed in all state courts.86 Even writers of textbooks, who normally do not editorialize, characterized the work of these lower courts as "staggering."87 Like law enforcement agencies, the nation's courts comprised a patchwork of local, state, and federal courts, clerks' offices, and other ancillary support functions, most staffed with a few employees. Into this world came police, criminals, other litigants, witnesses, prosecutors, various court officials, jurors by the millions, the press, interested bystanders, and, of course, lawyers representing all sides. To a large extent, orchestrating the comings and goings of all direct participants was a major activity for all courts. Making sure that evidence, that is, data, kept up with all the key players in this never-ending flow of activity, always within an overworked, normally understaffed, and underfunded legal system, remained a constant challenge. The same complaints about large volumes occurred in every decade. In the broad landscape of life in America, one observer noted, "soaring crime rates and an increase in both the number and complexity of civil cases have turned America into a Litigant society." However, "at the very time when more Americans want or need a day in court, the machinery to give it to them on a fair and timely basis is breaking down."88 Courts had adopted precomputer tools on a limited basis, such as typewriters, telephones, and in larger city and state courts, some punched-card tabulating equipment. The only major information processing hardware developed specifically for courts (and newspaper reporters) in the precomputer era was, of course, the device court stenographers used to document what was said in courtrooms, called the stenograph machine.89 At the high end of information processing, involving tabulating equipment, the most sophisticated hardware was used to schedule court cases and appearances of jurors for duty because of the large numbers involved.90 Then, as other law enforcement agencies began using computers to help with the management of workloads and information, judges slowly started using the same technologies, initially driven by local and state governments with computers, which made them available to courts. Later, courts acquired their own applications (with software) and even their own dedicated systems. The process of adoption proved slow. As of the mid-1960s, barely a half dozen computers were in use for court administration; by the end of 1971, some
255 courts used them. That still left thousands of courts operating without computers. The earliest applications involved selection of jurors and efforts to centralize or partially automate collection and reporting of statistics required by many state governments. Next came such uses as scheduling and tracking juvenile traffic cases, and then others to schedule court cases in general. In the early 1970s, interest in integrating various systems existed among vendors and other experts on computing, but courts were slow to change. Speaking at a conference of judges in 1971, President Richard Nixon, himself a lawyer, suggested that they "take advantage of many technical advances, such as electronic information retrieval, to expedite the result in both new and traditional areas of the law."91 The most important early uses involved managing court dockets in large urban centers that had the greatest volume of cases. Leveraging computers to automate calendaring functions called for the redesign of long-standing internal procedures. For example, in Philadelphia in 1969, an IBM mainframe began scanning documents for all cases being presented before a local court on a weekly basis to determine their status and to schedule upcoming court actions, while informing all parties of the schedule.92 By the early 1970s, published reports began documenting how computers were helping speed up the movement of cases through courts.93 Beginning in the mid-1970s, independent applications started transforming tentatively into more integrated systems that tracked cases from beginning to end, from initial arrests (criminal) or filings (civil) to their dispositions, a morphing process that continued into the 1980s and 1990s. In the same period, online access to information and databases also spread widely across the nation in all types of courts, from local traffic to federal.94 Clerks could plan their activities and schedule judges and cases more efficiently and accurately, while now providing online information, often in the courtroom itself. Database tools for this market appeared. For example, IBM, which had initially developed software to manage the over one billion pages of documents associated with its antitrust suits with private sector rivals and later the federal government in the 1970s, made it available as a product for lawyers and courts.95 Meanwhile, tools developed for lawyers, such as LEXIS and WESTLAW, were also used to help judges prepare for cases.96 In the 1980s, courts were extensively integrating case tracking from beginning to conclusion in fairly comprehensive ways, making it possible for the offices of state attorneys general, police departments, and court officials to share data. That circumstance reduced the volume of paper handling and other manual tasks per case that were often tedious, not merely time-consuming. Meanwhile, the volume of transactions a court could process increased right along with the workload of new cases.97 For example, in Maryland's traffic courts, with the help of the state's Judicial Information System in the mid-1980s, judges processed 700,000 traffic citations and an additional 160,000 other cases per year. The traffic side of the system helped generate $30 million in fines, primarily by handling larger volumes and not missing cases that otherwise would have gone unattended.
Figure 4.3
A law clerk researches online a prior case, circa late 1970s. (Courtesy IBM Archives)
Prior to the use of a computer, no court would have known how many traffic citations there were, let alone whether some went unpaid. In addition, the system balanced court appearances of state police against the cases scheduled before a court and judge. The case file was the heart of such a system: the record that everyone involved in a proceeding could look at, update if authorized to do so, and use to spin off reports.98
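The case file just described can be pictured as a shared record with role-based update rights. The following sketch is a deliberately simplified illustration of that idea; the roles, fields, and authorization rule are invented for demonstration and do not reproduce Maryland's actual Judicial Information System.

```python
# A minimal, hypothetical model of a court case file: everyone involved can
# read it, but only certain roles may update its status.
from dataclasses import dataclass, field

UPDATE_ROLES = {"clerk", "judge"}   # assumed authorization rule

@dataclass
class CaseFile:
    case_number: str
    charge: str
    status: str = "filed"
    history: list = field(default_factory=list)

    def read(self) -> dict:
        # Any participant in the proceeding may look at the case.
        return {"case": self.case_number, "charge": self.charge, "status": self.status}

    def update_status(self, role: str, new_status: str) -> None:
        # Updates are permitted only to authorized roles.
        if role not in UPDATE_ROLES:
            raise PermissionError(f"role '{role}' may not update case files")
        self.history.append((self.status, new_status))
        self.status = new_status

if __name__ == "__main__":
    case = CaseFile("86-12345", "unpaid traffic citation")
    case.update_status("clerk", "scheduled")
    case.update_status("judge", "disposed: fine paid")
    print(case.read(), case.history)
```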
By the early 1990s, many local and state courts had similar systems in operation. Many were state-wide, that is to say, developed by the state for common use by all its courts.99 Yet, for all this activity in the 1980s and early 1990s, courts still deployed computing less than law firms or police departments did. Compared to other government institutions, odd omissions existed. For example, while many government agencies in the early 1990s accepted credit cards as a way to pay for licenses, courts normally still did not accept them for fees and fines. Many courts did not use computers to track individuals who failed to pay fees, yet even small towns had well-established accounts receivable systems to pursue overdue property taxes. In 1992, one student of court applications commented on the reasons for the lag: "Courts haven't gotten a lot of attention in the technology arena until now because they didn't have the big payrolls and the big budgets as did the executive and legislative branches."100 Judicial culture, however, that is to say, the power of inertia and tradition, probably played an equally strong role. So mounting paperwork remained a constant problem for courts in the 1990s, although use of OCR scanners spread, and initial case records increasingly originated in police departments whose systems could be extended to provide courts with data collected, for example, when someone was arrested.101 By the end of the century, however, all federal courts did have automated systems that interested parties could search to retrieve information on specific cases, using personal computers at courthouses and over a dial-up system called PACER (Public Access to Court Electronic Records). Many courts also hosted home pages on the Web. A similar extent of deployment of such applications occurred at the state court level and in many large cities. E-mail had also spread rapidly through court systems all over the United States.102 Availability of the Internet marked a new period of innovation in court applications, though adoption generally came slowly. By the mid-1990s, organizations serving the justice community, such as the National Institute of Justice (NIJ), began exploring the potential value of using the Internet. In March 1996, the NIJ reported to the justice community that the Net promised speed of access to information and, of even greater value, the ability to allow various participants in the criminal justice community to engage in dialogues and share data. The NIJ had begun aggressively using the Internet in 1995 to disseminate information to its community, providing access to its files and a forum for discussion, while offering training programs on how to use the Net.103 By the end of the century, many judges were familiar with computing, but many others still were not, while most lawyers pleading cases before a court had some working knowledge of computer-based tools, such as those used to do legal research online. Assessing the situation in 2000, Judge Edwin L. Nelson, a U.S. district court judge in Alabama, commented that "in a very few years, the portable computer will be as ubiquitous as long yellow legal pads, number two pencils, dictating equipment, and law books were 10 years ago."104 The same judge reflected back on the 1990s: The last 10 years have seen an enormous positive shift in the acceptance of IT at all levels of the Judiciary. For example, when I became a district judge, the only e-mail system available connected my secretary, two law clerks, and me over a homemade network that I, more or less, devised and installed with the help of our clerk. There was no DCN (Data Communications Network). We had no nationwide e-mail system, and many judges were skeptical of the value of such a system. Today, as we move to the full nationwide implementation of and migration to Lotus Domino/Notes, many judges and courts believe that a
reliable, robust, and secure e-mail system is essential to the performance of our mission-critical functions.105
Clerks and judges could use Internet-based versions of WESTLAW and LEXIS to do research, along with another system called the Virtual Law Library. Hard data were beginning to confirm this judge's comments. All fifty states provided some online access to court decisions, including opinions. Thirty-nine states recognized digital signatures as legally binding for some transactions; later, federal law made that universal across the land. Over a dozen states either had or were implementing digital systems that accepted pleadings, motions, and filings of briefs. Most important, forty-four states had already made some or most of their integrated criminal justice/law enforcement information systems available to judges and other court officials. In 2000–2001, the states with the most extensive automation of court/law enforcement systems included Colorado, Delaware, Illinois, New Jersey, Ohio, Georgia, Maryland, and Pennsylvania, and, of course, the federal courts.106 The tale of computing in courts in the years that followed became one of further deployment of integrated case management systems, rapid deployment of research over the Internet, and use of e-mail.107 Before moving to a discussion about computing and jails, we need to survey briefly how lawyers used computers because, while the majority of their work did not necessarily result in court trials (such as preparing deeds and wills), as the statistics cited in this chapter demonstrate, many legal matters did end up in court. So it is part of the story of computing in courts. There are three major classes of IT-related issues involving lawyers and uses of computing: applications related to the operation of a legal practice (such as tracking billable time, calendars, payroll), substantive law related to IT and its industries (such as the role of digital signatures, privacy), and research for cases using this technology. It is this third area of use that in particular fills in details regarding the role of computers and telecommunications with courts. Large law firms had long struggled with mounting loads of paper, just like courts and judges, along with the accumulating volume of legal literature and judicial decisions and cases that they had to study in preparing their own cases for trial. In the early 1960s, one could begin reading articles making the case for computerized legal research, with some attempts to convert files into machine-readable form.108 The legal community first focused its attention on how to use technology to do research, making this application the central adoption of computing in the 1970s. The major early event came when the Ohio Bar Association built the first widely available computer-assisted legal research system (CALR) in the mid-1970s. It proved successful and spun off as a private enterprise called LEXIS, owned by Mead Data Central. That software provided full text of all its files, making it an attractive tool. A competitor called WESTLAW evolved from just an index-and-abstracts-based system to one that provided full text as well. Since the 1970s, these two systems have served as the primary (although not only) software tools used by lawyers, legal researchers, and clerks to find related cases, statutes, and other information, searching through key words specifically sought
in a document or as concepts.109 Additional tools also existed, such as IBM's STAIRS (Storage and Information Retrieval System), and still others created by federal and state agencies over the years.110 Use of such research tools through dial-up telecommunications became widespread among large and medium-sized law firms by the mid-1980s. The quality of the searches that could be done also improved. Lawyers enjoyed two immediate benefits: first, it cost less to do online searches because these could be done more quickly, and second, searches became increasingly complete and thorough.111 Meanwhile, in the same decade, as law firms became comfortable using these applications, they found other uses for computing, such as support for tracking legislative bills through state and federal legislatures, commercially available databases for collecting personal and financial data (particularly useful in bankruptcy cases), and business and medical databases.112 In addition, within a firm, computers began doing work seen in other industries, such as billing, payroll, and accounting, with some large firms in major cities acquiring their own data centers by the 1980s. Smaller ones outsourced their accounting work. The legal literature of the day began routinely publishing on the role of IT; journals doing so included Legal Administrator, Legal Economics, and the National Law Journal. The American Bar Association established a standing committee to advise its members on matters related to computing and telecommunications.113 Like other industries, companies, and public agencies, law firms began looking into the value of integrating their various stand-alone systems in the late 1980s, a process undertaken during that period and continuing throughout the 1990s.114 Arrival of the Internet enhanced the research capabilities of all law firms, driving down the costs of research on the one hand while, on the other, making vast quantities of legal and nonlegal information readily accessible. During the 1980s and 1990s, enormous bodies of case material and statutes were digitized in what can only be regarded as a remarkably fast transformation from paper-based libraries to digital ones. By the 1990s, it was difficult to find even a junior law clerk not familiar with online search tools and word processing. By the end of the decade, even judges were becoming familiar with these applications, which they learned about as part of their prior work as lawyers or in pursuit of private interests and hobbies at home.
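The key-word searching that made LEXIS and WESTLAW so useful rests on a simple idea: an index mapping every word to the documents containing it. The sketch below shows that technique in miniature, using invented case excerpts; it is an illustration of the general method, not the vendors' actual software.

```python
# A toy inverted index of the kind underlying full-text legal research tools.
from collections import defaultdict

cases = {  # invented excerpts standing in for a case-law database
    "Smith v. Jones": "negligence standard of care breached by defendant",
    "Doe v. Roe": "contract breach and damages awarded to plaintiff",
    "State v. Black": "criminal fraud conviction upheld on appeal",
}

# Build the index: word -> set of case names containing that word.
index = defaultdict(set)
for name, text in cases.items():
    for word in text.lower().split():
        index[word].add(name)

def search(*keywords: str) -> set:
    """Return the cases containing every keyword (an AND query)."""
    results = [index.get(k.lower(), set()) for k in keywords]
    return set.intersection(*results) if results else set()

print(search("breach"))            # finds 'Doe v. Roe'
print(search("fraud", "appeal"))   # finds 'State v. Black'
```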
Computers and Corrections

By the end of 2001, over 5.6 million residents in the United States had been incarcerated in either a state or federal prison. Like the rest of the law enforcement ecosystem, that population had grown over time. For example, from 1974 to 2001, the percentage of all adults who had been in jail expanded from 1.9 to 6.6 percent.115 Much like the rest of law enforcement, the nation had a patchwork of federal, state, county, city, and town prisons and jails of varying sizes. In the 1950s and 1960s, use of computers to manage back office applications in prisons essentially did not exist. Accounting, tracking of prisoners, and
performance of administrative functions constituted an intensely paper-driven collection of processes, although use of precomputer information processing equipment, such as typewriters and adding machines, was widespread. Telecommunications included analog radio and internal telephone networks. In the late 1960s and early 1970s, some efforts were made to teach prisoners about computing, primarily programming, as part of rehabilitation initiatives, but these were few and far between.116 In large communities, deployment of computing to support operation of prisons began in the late 1960s or very early 1970s. For example, Los Angeles County, California, operated the largest county jail system in the United States at the start of the 1970s. It had to track 11,000 inmates, and on any given day, 1,250 were being moved about within the system or to and from court hearings. In 1971, the county implemented an online system that included automated booking, a booking information file, and an inmate database, all operating on an IBM S/360 Model 50 (a very large computer for its day), with terminals linked to this application scattered across the entire Los Angeles law enforcement community, not just to jails.117 During the 1970s, computing came into its own in large federal, state, and urban prison systems. New Mexico implemented applications to track accounting activities and inmates, to produce statistics, and to link to other applications that tracked criminal activity across the state.118 Corrections officials in Baltimore, Maryland, focused on tracking inmates and the availability of jail space to house them. Because prisoners could be in jail for short periods of time, paper-based record keeping often did not keep up with their comings and goings. A digital system made keeping up possible, while maintaining a current jail census and a list of who visited each prisoner (information often requested by courts). Because many jails adopted similar applications in the 1970s and 1980s, table 4.5 lists the kinds of information maintained by this early online prison system. The kinds of information collected in each category illustrate the variety of data required to run a prison.119
Table 4.5 Online Jail Information Used by Baltimore, Maryland, circa 1979
Inmate (name, address, aliases, identification number)
Location (cell assignment, medical appointments, visitors)
Court (case number, arrest number, next pending court action and date, charges)
Classification (medical conditions, drug addiction, psychiatric problems)
Appointments (lists of inmates scheduled to appear in courts, hospitals, elsewhere)
Occupancy (data used to assign housing and process returning inmates, expected arrivals, expiration of sentences)
Transportation (data from which appointment reports are produced)
Visitors (lists of visitors to all inmates, dates of visits)
Cell assignment history (data on all transfers)
Source: IBM Corporation, Jail Online Inmate Control System Baltimore, Maryland (White Plains, N.Y.: IBM Corporation, 1979), Box 246, Folder 7, IBM Archives, Somers, N.Y.
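Table 4.5's categories amount to a record layout. The sketch that follows renders a few of them as a modern data structure; the field names are paraphrased from the table, while everything else is an illustrative assumption rather than IBM's or Baltimore's actual design.

```python
# Hypothetical inmate record mirroring several categories in table 4.5.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class InmateRecord:
    # Inmate: name, aliases, identification number
    name: str
    inmate_id: str
    aliases: List[str] = field(default_factory=list)
    # Location and cell assignment history (all transfers)
    cell: Optional[str] = None
    cell_history: List[str] = field(default_factory=list)
    # Court: case number, next pending court action and date
    case_number: Optional[str] = None
    next_court_date: Optional[str] = None
    # Appointments and visitors
    appointments: List[str] = field(default_factory=list)
    visitors: List[str] = field(default_factory=list)

    def transfer(self, new_cell: str) -> None:
        # Record the move so the cell assignment history stays complete.
        if self.cell:
            self.cell_history.append(self.cell)
        self.cell = new_cell

if __name__ == "__main__":
    rec = InmateRecord("J. Doe", "B-04517", case_number="79-0221")
    rec.transfer("C-12")
    rec.visitors.append("M. Doe, 1979-03-14")
    print(rec)
```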
Similar systems were installed by other states in the late 1970s and the 1980s. These included routine budgetary, accounting, and inmate census applications as well. Normally, they were online systems that could be accessed both by the staff at a prison and by other law enforcement officials who had permission to access the data, such as police, sheriffs', and state law enforcement agencies. The largest states became the earliest users of such systems, led the way in enhancing them over the years, and constantly upgraded the equipment and software involved. The state of Texas, which always had a large prison population, became a model for innovative systems during the last two decades of the twentieth century. But it was not alone; other states implemented integrated applications as well.120 Reasons for implementing such systems mimicked what happened elsewhere in the law enforcement community. These systems improved scheduling of personnel, reduced the cost and time required to maintain accurate records on prisoners, made it possible to analyze patterns of costs and populations, improved the quality of decision making (for example, in deciding about paroles), and minimized the growth in staff. Courts and corrections changed their strategies over the years to find new ways of dealing with overcrowding of jails, moving toward rehabilitative strategies other than just incarceration. So, computing needs changed, particularly in the 1980s, in support of reforms making their way through the law enforcement world. One important alternative to imprisonment was the idea of house arrest, in which a prisoner, parolee, or probationer lived at home but wore an electronic monitor around the ankle, an idea that law enforcement first began considering as far back as the early 1960s but that did not become popular until overcrowding of jails in the 1980s made it an attractive alternative. Not until the introduction of computer chips into these monitors in the 1990s did such systems begin acquiring digital features. However, in the early 1980s, those wearing electronic monitors received computerized phone calls requiring them to respond via automated voice verification as a way of ensuring they were adhering to their confinement. If not, software would notify correctional personnel of a potential problem.121
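The verification loop just described, an automated call whose answer either confirms confinement or triggers an alert, can be expressed in a few lines. The sketch below is a hypothetical rendering of that logic; the voice-match test and the notification are stubs invented for illustration, not any vendor's product.

```python
# Hypothetical curfew-check logic for early electronic monitoring:
# place an automated call, verify the voice, alert staff on failure.
import random

def place_automated_call(offender_id: str) -> str:
    """Stub: returns the outcome of a voice-verification phone call."""
    return random.choice(["voiceprint-match", "no-answer", "voiceprint-mismatch"])

def notify_corrections(offender_id: str, reason: str) -> None:
    print(f"ALERT: {offender_id} flagged for review ({reason})")

def curfew_check(offender_id: str) -> bool:
    response = place_automated_call(offender_id)
    if response != "voiceprint-match":
        # Failure to verify confinement notifies correctional personnel.
        notify_corrections(offender_id, response)
        return False
    return True

if __name__ == "__main__":
    random.seed(1)
    for oid in ("P-1001", "P-1002", "P-1003"):
        curfew_check(oid)
```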
During the 1980s, corrections facilities also had as much of a need for telecommunications as did police departments. Inmates needed to make telephone calls, and corrections officials had to communicate with each other within a facility through wired and wireless (radio) systems. Prior to the 1980s, internal communications systems were often "home grown," that is to say, put together locally or based on old AT&T analog systems. Beginning in the early 1980s, as telecommunication switches (primarily digital) began to appear in the market, prisons had new options. All during the 1980s, new and old prisons modernized their telecommunications infrastructures, mimicking what occurred in other industries and across public safety agencies. Since 80 percent of a prison's budget could go to payroll and other staff expenses, any tool that came along that could hold down such costs proved attractive. Modern PBX systems often required fewer telephone operators than before, so, just as companies and telephone service providers moved to the new technology to lower personnel costs, so, too, did large prison systems.122
By the late 1990s, federal, state, and large urban prisons had largely digitized their inmate record systems and routinely used computers to track and control budgets and to assign personnel to shifts and duties. One national study conducted in 1998 reported that computerization of some 207 types of data had been accomplished by just over half of the largest prison systems in the country, while over 85 percent had implemented systems that collected and used a large core of that set of data, the kind listed, for example, in table 4.5.123 Use of the Internet by prisoners and their jailors has hardly been studied. However, by the early years of the new century, files on prisoners and their locations began moving to state, county, and municipal intranet and Internet sites. In the case of Internet sites (those that can be accessed by citizens), people could inquire about the location of inmates, for example, without having to call law enforcement agencies or prison officials. In some states, such as Minnesota and Washington, residents who saw a neighbor arrested could query the Internet to find the charges filed against that individual.124
Origins and Early History of Computer Crime

The early history of crime involving computers is shrouded in hyperbole, anecdotal stories, and what extant evidence would suggest were few reported cases. Then, as now, even the definition of what constituted computer crime remained unclear. However, what is certain is that from the earliest days of companies using computers, accountants and their auditors expressed concern about the possibility of accounting and financial fraud occurring through the use of computers.125 Their concern grew out of the general lack of knowledge in accounting circles during the 1950s and 1960s about how computers operated, the uncertainty caused by having to rely on programmers whose performance could not necessarily be understood by line management, and the lack of best practices and auditing tools for computer systems. To one extent or another, firms and public agencies worried about these issues throughout the second half of the twentieth century. However, a look at computer crime from the perspective of law enforcement and courts, two communities that had no experience dealing with this new form of criminal activity in the 1950s and 1960s, presents a less passionate or idiosyncratic picture of computer crime waves in America. As late as 1978, one author of a book on computer crime confessed that "there is no widely accepted definition of computer crime. Some authorities define it as making use of the computer to steal large sums of money. Others include theft of services within this definition, as well as invasions of privacy. Some . . . [view] it as the use of a computer to perpetrate any scheme to defraud others of funds, services, and property."126 The leading expert on computer-based crime of the 1960s–1980s, Donn B. Parker, made the same point in 1976, adding that "knowledge about the incidence of crime is small, and the data available are inaccurate," and that comment came from him after he had compiled the most complete census of
computer-related crimes in the world!127 In 1983, he again made a similar point: "there is no general agreement on the meaning of computing abuse or computer crime."128 So his categories of abuse became for many the de facto early definition of computer crime. His categories included vandalism of hardware and software; theft of information, hardware, or software; financial fraud or theft committed by altering software and files; and unauthorized use or sale of services involving computers.129 His inventory of cases dated from the first one he could document (1958) through October 1975 and came from around the world. The industries most frequently subjected to this kind of problem were those that were some of the most extensive users of computers in those years, and which often did a relatively poor job in managing their digital assets: banking, education, all levels of government from local to federal, manufacturing, insurance, and computer services. Of the 372 cases he documented for that period, banking experienced 70, education 66, governments 61, manufacturing 46, insurance 28, and computer services 24. Another dozen industries often had fewer than six incidents each.130 A task force organized by the American Bar Association in the early 1970s characterized all cases as less important than violent crimes, while the American public viewed these as white-collar crimes, which in those years were routinely punished less harshly than others.131 One can conclude that the number of documented incidents prior to the late 1970s was quite small when compared to crimes law enforcement had faced long before the arrival of the computer. In fact, Parker said so in 1983: "computer crime is a relatively rare occurrence when compared with noncomputer crime."132 There was much speculation about the magnitude of these crimes. Every study of the issue done in the United States prior to 1990 speculated that losses ranged from an average of $400,000 to tens of millions of dollars. The ABA thought they were enormous, reporting that a fourth of its survey respondents believed they had been victims of the problem and that their total losses ranged from $145 to $730 million; dividing these totals by the number of respondents led the ABA, by simple math, to a range of $2 to $10 million per enterprise.133 At about the same time, another study suggested that the total cost to the American economy was closer to $15–27 million.134 The disparity in the data suggests that nobody really knew the true numbers. Part of the data problem that all commentators had until the 1980s was that many enterprises and public institutions simply lacked the capability to identify computer-assisted crimes. Those that could identify them proved reluctant to report them for fear that such information might have an adverse effect on their business. Observers also noted how management and the public at large had great faith in data coming out of computers, assuming that printouts of information were actual documents of record rather than by-products of data in a computer that had been organized and then presented in the form of a report. Against this backdrop, prosecutors faced very few cases. Between the 1950s and the end of the 1970s, district attorneys reported handling 244 computer-related cases, what they called computer crime, applying existing laws that did not specifically address the use of this new technology. Of the 244 cases,
they prosecuted 199 and were able to get 157 convictions by plea and an additional 10 by trial. They handled a further 311 cases of fraud that involved use of computer-generated information. Of these, they prosecuted 215, plea bargained convictions on 158, and won convictions by trial in an additional 20 cases. The largest number of cases occurred in large metropolitan areas, such as New York, Chicago, and Baltimore.135 The FBI ran its first courses on how to detect computer crimes in the mid-1970s, a period that also saw publication of what became, for many years, the bible used by police to investigate these kinds of crimes.136 The entire law enforcement community used existing statutes, such as federal mail and wire fraud legislation, because much fraudulent data moved across state lines, and the courts had been liberal in their interpretation of all manner of fraud under those laws.137 In the 1980s, as the number of computers in the American economy continued to increase, the kinds of crimes documented by Parker in the 1960s and 1970s continued to occur. With software products now widely available, these were either pirated or used to commit more traditional crimes, such as fraud and embezzlement. The earliest worms and viruses also began appearing in the 1980s, and it was in this period that the term "hacker" entered widespread use.138 In that same decade, state and federal legislatures began passing laws that specifically targeted computer-based crimes, such as arson (a major early problem with computing), hacking, and more traditional crimes related to fraud, embezzlement, theft, improper use of software, copyrights, and patents.139 Hacking became a highly publicized subject of increasing concern to law enforcement after the arrival of the PC and widespread use of telecommunications in the 1980s and 1990s.140 By the 1970s, however, the law enforcement community was already beginning to form opinions about these activities, which it began viewing as legal, illegal, or simply mischievous. But in general, police and prosecutors considered hackers digital trespassers and thieves, while the press tended to treat them more like "a modern-day joy rider, roaming the electronic highways."141 One lawyer familiar with computer crimes commented in the late 1980s that "nobody knows how hacking got started," although he knew of instances dating back to the 1960s involving college students; but these kinds of users had not become a major issue until the 1980s, with the arrival of PCs and widespread computer literacy, particularly among young people.142 However, precomputer hacking had occurred in the 1970s with "phone phreaks," people who developed "blue boxes" to access long-distance telephone lines, often for fun or to avoid paying for long-distance calls. John Draper (aka Cap'n Crunch) became famous as an early phone phreak. He was arrested, convicted, and sent to prison in Pennsylvania in 1976. The other famous case from this early period involved the Massachusetts Institute of Technology's (MIT) Cookie Monster, a program that would destroy files if a user seeing the term "cookie" on the screen did not "feed" the monster by typing in "cookie."143 As with fraud and more conventional crimes committed using computers, police, prosecutors, and judges had a difficult time understanding what was legal and illegal in these early years when it came to hacking.144
By the 1980s, a more serious problem began spreading that proved disturbing and remained a chronic issue to the present, namely, the role of organized crime. Like good business managers elsewhere in the economy, these criminals used computers to track their operations, to improve skimming and communications with clients and colleagues, and to track profits and income. Key areas of applications involved gambling (online bookmaking); providing such services and goods as prostitution, pornography, and drugs; fencing (buying and selling of illegally obtained goods); pilfering; money laundering; loan sharking; and use of fraudulent credit and, later, debit cards. All of these applications were in use by criminals by the early 1980s. In addition to these applications of the digital hand, they also stole and sold computer chips, computers, peripherals, and software.145 In the 1980s, the number of computer-based crimes began increasing to the point that large metropolitan police departments in particular, state-level law enforcement, state and federal prosecutors, and the FBI and Secret Service (the latter because of its mission to protect U.S. currency) had to increase their expertise in investigating and prosecuting these classes of crimes. By the early 1990s, such knowledge, coupled with a growing body of new laws, had begun making its way through the law enforcement ecosystem.146 The first step was the passage of laws. In 1978, the Florida legislature passed the first high-technology crime bill in the nation in response to a recent case in which computer files at a racetrack were manipulated to show wins for horses that had lost, resulting in the payout of millions of dollars on losing bets. Subsequently, over the next quarter century, states either passed new laws or modified old ones to account for emerging applications of computers in criminal activity. By the end of the century, all states had done so. At the federal level, Congress passed over a dozen laws specifically designed to deal with computing and telecommunications. By the 1990s, the FBI had acquired a considerable body of expertise in fighting digital crime. As noted earlier in this chapter, the FBI had also become an extensive user of the technology, which in turn served as a training vehicle for its investigators. It also collaborated with other nations in fighting international digital crime and ran training programs for law enforcement organizations from around the world. In May 1999, the FBI opened the Internet Fraud Council as a center to which consumers could bring complaints about digitally based crimes. The FBI had already established the Computer Investigations and Infrastructure Threat Assessment Center (CITAC) and the National Infrastructure Protection Center (NIPC).147 Meanwhile, state, city, and federal collaboration and funding of initiatives had finally entered the mainstream of law enforcement in the 1990s.148 The wide deployment of the Internet and other dial-up networks that began in the 1980s opened a whole new era in digital crime, a subject worthy of its own chapter. However, a brief view of the situation at the start of the new century suggests how far this new form of crime had come in barely two decades. In a small survey conducted in 2001, the Department of Justice learned that 74 percent of businesses surveyed reported being the victim of some sort of Internet-based crime, now fashionably called cybercrime. Two-thirds had
experienced one or more viruses, and one-quarter a "denial-of-service" attack (for example, slowed response time due to vast quantities of data pouring into a firm's computer). A nearly similar percentage had been subjected to vandalism or sabotage of their computer systems. While the sampling was small (198 private companies) and hardly statistically representative, the study hints at the ubiquity of the problem and led the Department of Justice to announce publicly that it would begin collecting this kind of data much as it did on other types of crimes. Perhaps also interesting was why several hundred companies declined to participate in this study, for reasons that resembled those given to Parker and others in earlier decades regarding computer crime: 17 percent were concerned about the confidentiality of their survey input, 14 percent did not know how often they had been victimized, and 82 percent stated simply that they did not participate in voluntary surveys. Buried in the details of this early cybercrime survey was information about the kinds of problems faced. Ranked by number of incidents, at the top were computer viruses, followed in descending order of frequency by denial of service, vandalism or sabotage, and fraud or embezzlement. Given that by the 1990s much security software had been installed to protect accounting and financial systems, and much attention had been paid to physical security for data centers and other locations with high concentrations of personal computers, the order of frequency should come as no surprise. The survey reported that the number of incidents in 2001 was higher than in 2000, and that 10 percent of the respondents had insurance policies to cover losses from cybercrime.149 In 2005, the Department of Justice reported that Internet-specific crimes were drawing considerable attention from the law enforcement community. These classes of crime included "cyberstalking," "cyberbullying," child pornography, Internet fraud, and identity theft. All were perverse digital innovations of the 1990s.150
Conclusions

Law enforcement is a community made up of a patchwork of agencies varying both in size and mission, from small-town police departments with ten employees to the large court systems of Los Angeles, New York, and Chicago. Yet it is a community that shares many common values and ways of conducting its work, an observation that many who have studied this sector have made over the years. The integration of the digital into its daily work streams reflects those shared values and practices. Much as in so many industries and across other public sector agencies, digital applications came in waves. Large agencies could afford computers first, and so they naturally led the way. As their experiences demonstrated successes and failures, other agencies learned from them and either adopted implementations, waited, or ignored them until they made sense. The availability of funding proved a most critical factor in the decision to embrace and deploy a particular use of computing. Enthusiasts of computing always complained that agencies adopted these tools too slowly. Kent W. Colton typified students of the subject, lamenting
that adoptions were slower than desired, while noting that early uses embraced the digital to perform repetitive operations.151 Another expert, Scott Hebert, also complained that there was much resistance to new ways, largely due to cultural factors but often also to the inability to demonstrate improvements in policing operations.152 Those applications that gave police on the street information in a rapid fashion were consistently the most widely adopted and supported uses across the entire half century, largely because they so directly fortified the work of police. By the 1980s, integrated systems became fashionable as police, courts, and corrections began leveraging technology to collect and move information along at a speed that kept pace with the daily activities of the law enforcement community, while nearly spectacular improvements in digital fingerprinting and records retrieval helped enormously. Yet overall, the effects of computing on the operational cultures of various institutions were less pronounced than in other government agencies. While the ability to get one's traditional work done faster, better, and less expensively using computers and telecommunications improved, consequences of a cultural nature emerged only slowly. The most dramatic consequence that cross-functional, integrated justice systems should have produced was a willingness of agencies to share information with each other. To be sure, there were dramatic examples of this, such as the pooling of data related to wanted persons, stolen vehicles, and so forth by multiple communities in California, St. Louis, and elsewhere.153 Clearly, the work of the FBI in establishing national databases evolved into a crucial contribution to this process. But as late as 2004, complaints of hoarding data and of turf battles continued to appear. One article in Governing, a major journal in the field of public administration, reported in 2004 that "there are still a lot of obstacles to getting good information-sharing systems in place. For many jurisdictions, the idea of pooling data resources . . . remains a little frightening. Individual agencies are nervous about allowing others free access to their information, even when there's a valid public purpose involved."154 This important review of the problem cited many reasons (lack of technical and operational standards, legal impediments, issues of security and privacy), but nonetheless, just as the 9/11 Commission pointed out a couple of years earlier, sharing did not take place sufficiently, with the result that terrorists, or in the case of Governing's report, criminals, could slip through the law enforcement system. Even so, sharing and pooling data had made remarkable progress because agencies all over the country could use the computer's capacity to house large volumes of data, provide rapid and accurate search and retrieval of this information, and do so virtually anywhere at any time. By the dawn of the new century, expert criticisms of law enforcement often had an IT tone. For example, GAO's examination of the FBI's critical functions focused on technical issues, such as IT standards and modernization strategies.155 Laws and many court cases had evolved in response to the use of the new technology. Indeed, a sea change was occurring. In the 1960s and 1970s, most judges and prosecutors saw computer crime as traditional forms of criminal
behavior simply done with computers rather than with some other tool, such as a gun. That attitude began changing in the 1970s and was reflected in legislation all over the land in the next quarter century. As security expert Donn Parker recollected in 2003, "having specific computer crime statutes was a way to establish a social agreement that these were real, serious crimes. The other purpose of getting specific statutes was to help law enforcement agencies get the budgets they needed to develop the capability to ultimately be able to deal with investigation and prosecution of computer crime."156 On balance, one would have to conclude that adoption of computing and upgrading of communications had changed much of how the core institutions of the law enforcement world went about their work. Efficiency and effectiveness improved incrementally all through the period, making today's use of computing, however ubiquitous, seem normal. The vast majority of today's workforce in this industry has only lived in a time when computing was widely available in their organizations and across society, so few members of this ecosystem can recall a time when computing was not part of how they did their work. Even judges, the last within the law enforcement community to embrace the technology, followed much the same pattern as CEOs in companies. As they came to understand the value of the technology, they brought it into their operations; it was not a pattern explained away simply by arguing that judges were older than most police, criminals, or lawyers who came before them. Furthermore, as they learned how the technology worked and affected law, it influenced their conclusions; and as they had to try computer-related crimes, they learned what had changed from prior criminal behavior and yet what remained consistent from one era to the next. The picture that the last four chapters begin to paint is that of whole communities within American society adopting computing and communications in waves, at various speeds but in common ways. By looking at additional uses of computing and communications in the federal government, we can see that process more clearly at work. For that reason, the next chapter is devoted to describing digital activities in other corners of the national government.
5

Digital Applications in the Federal Government: The Social Security Administration, the Bureau of the Census, and the U.S. Postal Service

The Postal Service may be nearing the end of an era.
—Bernard L. Ungar, General Accounting Office, 1999
The Social Security Administration (SSA), the Bureau of the Census, and the U.S. Postal Service (USPS) are storied agencies in the American government, touching directly the lives of all residents in the United States. In the case of the SSA, it is the largest insurance and old-age pension organization in the country, and the vast majority of residents are registered with it by way of their social security number. Over 50 million people receive pension, disability, or some other form of financial support from the SSA. For the poorest in the land, it was (and is) often their only financial safety net in old age or disability. The Bureau of the Census is responsible for tracking the population of the United States and for conducting other censuses of the economy; it is the federal government's largest statistical agency. Its census count determines how many representatives any state can have in Congress and serves as the basis for the proportional distribution of federal funds to states, localities, even schools. Both the Census Bureau and the SSA have been extensive users of information technologies. In the case of the SSA, its early use of IBM punched-card equipment in
the 1930s has taken on a near-mythic quality in the histories of early computing and of IBM. As for the Census Bureau, it was one of its employees, Herman Hollerith, who developed modern punched-card and tabulating equipment in the 1880s; the company he established to rent the equipment ultimately became IBM. Use of information processing at the bureau, beginning with the census of 1890, started a century-long dependence on technology that is both storied and important. The Post Office was America's first nationwide public information highway, established in the late 1700s. It remained the backbone of much communication right into the twenty-first century. Six days a week, this agency visits nearly every residence, business, or other organization in America, delivering paper-based communications, publications, and packages. All three agencies are highly visible public institutions, remain constantly in the news, and are discussed in every civics class taught at American high schools. They have all long been extensive users of IT. The importance of these agencies and of their use of IT and telecommunications is so great that no history of computing in the public sector would be complete without discussing that use and the effects of these technologies on their operations. All three agencies operated more as self-contained organizations than many other federal offices. They derived their specific roles and missions from federal laws. They often were more influenced by the actions of other federal agencies, the White House, and the Congress than by other potential participants in their ecosystem. I have suggested in this book that many public sector organizations were members of some larger ecosystem (such as law enforcement), and that their activities within such extended communities were increasingly facilitated by the digital hand. However, while this is also the case with these three organizations, it is sufficiently less so than with the IRS, for example, to allow us to examine them as discrete entities. Also important to keep in mind is the massive quantity of information, files, and papers these organizations handled in the twentieth century. The statistics cited below on their volumes of transactions and files are huge by any standard and thus serve as extreme examples of the value of computers to them. By looking at each, we can see how technology affected the work of three very important components of the federal government, while containing the discussion within highly defined organizational boundaries. We can see how the three became familiar with technologies of all kinds within the context of highly defined institutional cultures and how they confronted and deployed computers. The importance of such a discussion is reinforced when we keep in mind that none could do their work without extensive reliance on information technology. That was so by the end of the census of 1890 in the case of the Bureau of the Census, using precomputer IT, and, in the case of the SSA, within a year or two of its founding in 1935. The Post Office experienced severe competition and important operational changes made possible by the new technology. In all three instances, their ability to use IT effectively had a direct bearing on their capacity to
carry on their missions. The SSA and the Census Bureau were regarded as sophisticated users of computers and other forms of information technologies from the 1950s through the 1970s, but subsequently, like the IRS and the DoD to varying extents, they came to be seen as lagging and ineffectual in their use of computing, or faced new operational challenges. The SSA, too, faced a serious and real risk of not being able to get its work done on time in the 1980s, while the Census Bureau became enmeshed in other operational issues, such as the debate in the 1990s and in the 2000 census over the value of scientifically estimating how many people there were as opposed to counting all "noses." In the case of the USPS, its use of computing was a longer time coming, and its effects became evident only late in the century. How IT influenced the work of all three organizations is the subject of this chapter.
Social Security Administration

Established in 1935 during the Great Depression, the SSA is responsible for administering a collection of national social insurance programs in which employees and employers contribute to a pool of funds used to provide old-age pensions and other monthly payments, such as to disabled workers, widows, and orphaned children. It became the largest social insurance program in the nation supporting the aged. Over many decades and in various ways, Congress extended its mission to assist financially those less able to take care of themselves. Each time Congress added or changed a program, the agency had to modify existing applications or add new ones to its digital infrastructure in support of the new responsibilities. The majority of discussions conducted by historians about the use of IT by the SSA have focused on the initial deployment of IBM equipment designed for the SSA in the 1930s, which represented for the then-small IBM its largest sale to date. For the SSA, the hardware provided the ability to carry out its mission, since its leaders quickly realized that paper-based record keeping and payments could not be scaled up to the volumes needed.1 By the time computers were just becoming available in the federal government during the 1940s, the SSA had over a decade of experience using various kinds of sophisticated data processing equipment, employed a highly motivated and expert workforce, and pursued a noble mission that valued quality service to the American public, characterized by accurate record keeping and timely payments of pensions.2 So it should be no surprise that early on the agency recognized the potential of computers and took the time to understand, then to embrace, the new data processing technology.3 Before discussing computing, it is helpful to appreciate the magnitude of the mission of this agency, keeping in mind that every transaction, piece of information, and individual it served required some form of data handling. Over time, these were activities and pieces of information that were either partially or completely collected, preserved, and managed by the digital hand. From its earliest days, the numbers were large at the SSA. In 1937, during its first full year of
carrying out the requirements of its charter, the SSA issued 26 million social security numbers to American workers and assigned 3.5 million employer identification numbers as well. It simultaneously tracked incomes and payments received from both groups, using a centralized approach. These activities led to the creation of the Visible Index, listing every worker covered by the insurance system. That year, the records alone occupied 24,000 square feet of space, and that was only the beginning, because the nation's population kept growing and an increasing number of people applied for social security numbers.4 By the time the SSA first began using computers in the mid-1950s, it maintained some 150 million accounts. By 1960, nearly 90 percent of all employed workers were either covered by social security or eligible for some form of coverage, for a total of approximately 75 million workers. The rest of the population it served were survivors and others eligible for support. Nearly 75 percent of all individuals over the age of sixty-five were either receiving or eligible for benefits managed by the SSA.5 Those numbers, again for 1960, translated into 3.5 million newly established accounts, 2.3 million changes and updates to existing accounts, and quarterly calculations of incomes for the 75 million workers. Some 25,000 SSA employees, scattered across the nation, carried out this work, with central offices (and files and data processing equipment) located in and around Baltimore, Maryland.6 Jump ahead twenty years, and the volumes
Figure 5.1
How SSA stored records on individuals prior to 1962. This is only a tiny corner of the vast collection, circa 1960. (Courtesy Social Security Administration)
Figure 5.2
This is SSA’s Visible Index to all the files illustrated in the previous photo; note the vast quantity, circa 1960. (Courtesy Social Security Administration)
simply grew. For example, in 1980 over 140 million workers were covered and, at the dawn of the new century, in excess of 187 million; these numbers did not include underage children, widows, and others also helped by the SSA.7 The SSA concurrently managed funds contributed toward benefits. In the period 1980 to 2001, those numbers also proved daunting, suggestive of the complexity and volumes involved: old-age funds amounted to some $105 billion in 1980 and, when combined with various other assets, over $1 trillion in 2001.8 Meanwhile, the number of employees, though it grew (then shrank) over the second half of the century, did not increase as rapidly as the number of people served. Yet as of 2005, for instance, the SSA employed 65,000 people deployed in ten regional offices, six processing centers, and over 1,300 field offices.9 In short, by any measure, all the volumes involved were large. The SSA was a perfect candidate for computers from the inception of the technology.

Early Deployment of Computers, 1950s–1960s

Officials had to define what files they would keep and update and implement formal processes for handling them. Their core data processing consisted of creating a file on every worker and eligible individual registering for a social security number (SSN), following a similar process for every employer, and then performing
the necessary reporting and recording of wages. By the time the first computer went "live" at the SSA in 1956, the agency had established approximately 120 million accounts, of which 80 million had updates made during the course of the year. Employers provided updated tax information to the Treasury Department each quarter, and Treasury passed that data (which recorded what each employee was paid) over to the SSA.10 That was essentially the process used for decades. Other updates involved requests for benefits filed by individuals, followed by an assessment by SSA employees validating what payments should be made and then authorizing them. Thus, the essential requirements of IT for many decades consisted of:
• Collecting large quantities of similar data in bulk form
• Storing massive amounts of like information
• Accepting similar updated information on a regular basis
• Producing trend reports
• Providing ad hoc printed reports of individual accounts, and in the 1990s, printing and mailing account information to each owner of a social security number.11
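The sources do not describe the SSA's programs at this level of detail, but the first three requirements map onto the classic tape-era batch update: a master file and a transaction file, both sorted on the same key, are merged in a single sequential pass. The following sketch in Python illustrates the pattern under that assumption; the record layouts, field names, and the update routine itself are hypothetical, not the SSA's actual code.

```python
# A minimal sketch of a tape-era sequential master-file update. Both files
# are sorted by account key and merged in one front-to-back pass, the access
# pattern that made cheap serial storage (cards, then tape) sufficient for
# a workload like the SSA's. All record layouts here are illustrative.

def update_master(master, transactions):
    """Post sorted wage items against a sorted master file.

    master       -- list of dicts like {"ssn": "001-01-0001", "earnings": 100.0}
    transactions -- list of dicts like {"ssn": "001-01-0001", "wages": 50.0}
    Returns (new_master, rejects); rejected items went to clerks for review.
    """
    new_master, rejects = [], []
    i = 0
    for txn in transactions:
        # Copy forward master records whose key precedes the transaction key.
        while i < len(master) and master[i]["ssn"] < txn["ssn"]:
            new_master.append(master[i])
            i += 1
        if i < len(master) and master[i]["ssn"] == txn["ssn"]:
            # Matching account: post the wage item to it.
            master[i] = dict(master[i],
                             earnings=master[i]["earnings"] + txn["wages"])
        else:
            # No such account: kick the item out for human correction.
            rejects.append(txn)
    new_master.extend(master[i:])  # flush the remainder of the old master
    return new_master, rejects

old = [{"ssn": "001", "earnings": 100.0}, {"ssn": "002", "earnings": 0.0}]
txns = [{"ssn": "001", "wages": 50.0}, {"ssn": "003", "wages": 25.0}]
new, bad = update_master(old, txns)
assert new[0]["earnings"] == 150.0 and bad[0]["ssn"] == "003"
```

Because each file is read exactly once, front to back, the pattern required very little memory or calculating power, which is part of why cheap bulk storage mattered more to the SSA than arithmetic speed.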
While this is a highly simplified description of the data processing involved, these functions mimicked many of those in the Insurance Industry.12 More important, early computers could read punched cards and long tape files, features well matched to the needs of the SSA: a great deal of cheap storage capacity and a lesser requirement for complex or extensive calculations. Prior to 1950, computations were done by people in the district offices; the following year, two IBM 604 Electronic Calculating Machines could perform some 100 computations per minute. The modest capacity of these machines by today's standards demonstrates that data collection and storage were larger parts of the workload than the calculations themselves.13 Over the decades, storage continued to improve as the agency went from cards to tape in the 1950s and 1960s, then from tape to disk in the 1960s and 1970s, and from batch systems to a mixture of batch and online applications in the 1960s and 1970s. Beginning in volume in the late 1970s, and increasingly through the 1980s, the agency went online with distributed processing to its large number of offices, a shift that extended to the end of the century. The SSA's initial foray into computing involved a series of careful analyses to determine how best computers might work for the agency. As early as 1946, with the availability of the ENIAC, SSA officials began talking to computer designers and engineers. They tracked major developments in computing in the 1940s, both with vendors and with other potential early users in government, such as the Census Bureau and the military, a dialogue that continued into the 1950s. Technological improvements evolved to the point that a task force within the SSA in September 1954 recommended that the agency place an order for a system for the purpose of performing statistical work. It also concluded, however, that the technology had not yet progressed sufficiently, nor become cost-effective enough, to warrant using digital computers to perform the massive record-keeping functions of the agency.
Most specifically, sorting data with a computer remained too expensive when compared to performing the same functions on tabulating equipment.14 But for statistical work, in-house experts recommended using an IBM 702 system, concluding it could handle the data-crunching functions, more typical of scientific computing, to which the 702 was suited. The SSA could also use this system to experiment with other applications.15 Then Congress passed legislation calling for changes in the process by which the SSA used earnings to calculate benefits, straining the capabilities of existing tabulating equipment while increasing the need for more file cabinets, cards, and employees. Switching to tape, and away from punched cards, would save space and time. But it was the change in law that became the primary reason for the agency's finally deciding to acquire its first computer, requesting bids from eleven vendors in June 1955. The SSA concluded that "the possibility of recording summary data in magnetic tape appears to be a solution which will give immediate relief to the problem of punch card storage. It also will permit the introduction of electronic processing in the statistical operation and will afford opportunities for extending and improving the recordkeeping system in general."16 These same considerations (space, the handling of large volumes of data, and processing and calculating) remained core applications and concerns affecting IT deployments over the next half century.17 Space and speed were not trivial considerations. As one observer noted years after the move to tape: "The information on 60,000 summary cards could be stored on a single reel of magnetic tape 10 inches in diameter. Also during the period 1945–55, the internal speeds of the computer had increased 100,000 times, and storage capacity and reliability had improved 100-fold. The technology the agency needed was in place."18 The SSA ordered the system; it came in during 1955 and went live in the fall of 1956. Between order time and installation, the agency began converting data from cards to tape, while training personnel to operate the equipment and to write software. With workloads increasing in the late 1950s, workers displaced by the computer system were assigned to other departments within the SSA. The amount of floor space needed to house cards shrank. Accuracy in handling and manipulating data began to improve, largely because the 702, and subsequent computers, could check the validity of every number and item as it was handled, spitting out potential errors for humans to assess and correct. In 1958, employees began posting updates directly to magnetic tape, a process performed much faster than with older data processing equipment.19 The agency replaced its 702 with larger twin IBM 705s in 1957 to perform quarterly updating of earnings records, to search accounts for information, to prepare coverage and earnings statements, and to conduct repetitive error-checking operations. In short, once installed, computers were quickly assigned data handling operations central to the work of the SSA. By the end of the decade, additional incremental applications had been developed, such as the preparation and addressing of correspondence to employers. Meanwhile, old accounting machinery began leaving the agency, displaced by computers. Faster systems came into
Figure 5.3
The data center where the SSA began storing its records on magnetic tape, circa late 1950s. By the late 1970s, it had more than 500,000 reels of tape. (Courtesy Social Security Administration)
the SSA in the early 1960s and 1970s, reducing the amount of time needed to perform specific transactions, searches, and sorts, while increasing the capacity to handle greater volumes.20 The agency was benefiting from the rapidly improving capabilities of a technology suited to its needs. As workloads expanded in the 1960s, new systems came and went, but the agency proved able to handle its duties and workloads. GAO, the Congress, and SSA's senior leadership were pleased with the results.21 Table 5.1 shows the incremental acquisition of applications that directly supported the SSA's mission and made it possible for the data processing staff to do their own internal work. By the mid-1960s, the SSA had implemented various applications in support of eight basic operations: establishment of accounts, maintenance of earnings records, processing of earnings incorrectly or incompletely reported to the SSA, certification of earnings records, maintenance of master beneficiary rolls, certification of benefit payments, preparation of benefit payee records, and maintenance of beneficiary payee rolls.22 An internal assessment of the role of computers painted a positive picture: "The over-all effect of automation on the Social Security Administration has been excellent. The impact of new legislation and growing workloads on Administration personnel requirements has been offset . . . by the use of EDP and . . . the Administration is giving better service."23 The SSA also had to strengthen its telecommunications infrastructure. It had long used telephone services that included voice and, later, teletype transmission of data. But in 1960, the SSA began upgrading its network to handle more electronic transfers of information.
Table 5.1 Introduction of Key SSA Applications, 1957–1969
1957 Preparation of earnings statements and quarters of coverage statements; quarterly up-dating of summary records; summary file searching for claims and earnings statements; statistical operations
1958 Preparing and addressing employer correspondence
1960 File searching for miscellaneous requests; earnings report processing
1961 Establishing and correcting employee records; claims control
1962 Telecommunications processing; suspended accounts processing; correspondence control
1963 SSA payroll and leave system; Bureau of Disability insurance claims control
1965 Bureau of District Office Operations management control; personnel records system
1966 Selected Claims in Process (SCIP) System; preparation of tapes for quarterly up-dating
1967 Claims and award processing; Bureau of Disability insurance folder control
1969 Magnetic tape library control
Source: “History of Data Processing in BDPA,” internal report, undated, circa 1969, p. 30, Social Security Administration Archives, Baltimore, Md.
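Table 5.1 applications such as earnings report processing and the correction of employee records rested on the record-by-record validity checking described earlier, in which the machine spit out suspect items for clerks to assess. A minimal sketch of such an edit pass follows; the specific rules and the record layout are invented for illustration and are not the SSA's actual edit criteria.

```python
# A sketch of a batch "edit" pass: every incoming earnings item is checked
# field by field, and anything suspect is written to an exception list for
# clerical review. Rules and field names are hypothetical.
import re

SSN_PATTERN = re.compile(r"^\d{3}-\d{2}-\d{4}$")

def edit_earnings_item(item):
    """Return a list of edit failures for one reported wage item."""
    failures = []
    if not SSN_PATTERN.match(item.get("ssn", "")):
        failures.append("malformed social security number")
    if not item.get("employer_id"):
        failures.append("missing employer identification number")
    wages = item.get("wages")
    if not isinstance(wages, (int, float)) or wages < 0:
        failures.append("wage amount missing or negative")
    if item.get("quarter") not in (1, 2, 3, 4):
        failures.append("quarter code out of range")
    return failures

def edit_pass(items):
    """Split a batch into clean records and an exception file."""
    clean, exceptions = [], []
    for item in items:
        failures = edit_earnings_item(item)
        if failures:
            exceptions.append({"item": item, "reasons": failures})
        else:
            clean.append(item)
    return clean, exceptions
```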
This led to its reliance on a network built by AT&T that remained in use until 1966. At that time, the SSA's next-generation network, called the Advanced Records System, went live, making it possible to transmit ten words per minute over telephone lines to a central message center, where employees moved the data to magnetic tape and in turn shipped the tape to the main computer center at the SSA's headquarters in Baltimore. In the 1980s, the SSA replaced this system with an online network (called SSADARS) with which employees could directly access files at the main computer center.24

Crises of the 1970s and 1980s

But as had occurred at the IRS, not all remained well over time. One government report about the SSA's computing problems of the 1970s characterized the period as "a decade of deepening problems," which "became intractable."25 The most probable root of the problem is that between 1972 and
1981, Congress passed fifteen new laws related to social security, which called for the agency to perform new and complex services that in turn required, of course, either changing existing data processing applications or adding new ones, all with no additional employees.26 Compounding the problem, Congress over time gave the SSA less and less time between a law's passage and the date by which it had to provide the new or different service. The inevitable backlogs climbed all through the period as a severe imbalance developed among technical, operational, and human resources in the face of new duties. As at the IRS, rapid turnover in senior leadership, inadequate planning, and insufficient dialogue with Congress and various administrations about what it would take to do new things compounded the problems. Passage of the Supplemental Security Income (SSI) Program in early 1972 became the proverbial straw that broke the camel's back, a crisis still discussed by veteran employees at the agency in the 1980s and 1990s. The programs in this law required extensive dialogue between employees and beneficiaries, while the volume of traffic on computer systems choked networks, brought down systems, and created service delays, all damaging to the SSA's reputation as a well-run agency. The complexity of SSI was compounded because the SSA was asked to accept 3 million beneficiaries onto its rolls from some 1,300 state and local agencies, in effect federalizing many welfare programs originally the direct responsibility of the states, with the inevitable variety of programs, data formats, and so forth one could expect from such a mix of agencies. The SSA simply was not prepared to do this well, "not up to the task," as one student of the crisis called the situation.27 Software to support the new service could not be written and deployed in time, while the telecommunications network in the agency could not handle the massive surge in traffic. The Office of Technology Assessment (OTA) carefully examined the situation and did not simply blame "new" or "old" computers, concluding instead "that the workload had become too large, too complex, and too dependent on automated processing to be handled by SSA's existing workforce with existing technology. In this situation, every addition to the workload became a potential crisis."28 The GAO also looked at the situation and reached similar conclusions, adding that the agency had sufficient computing power during the early stages of the crisis: "Before January 1977, Social Security was processing most of the workload for its major programs on 17 large-scale computer systems," and the "systems were capable of supporting more than twice the largest identifiable workload processed."29 Added to that one program were other laws that called for such actions as providing citizens with automatic cost-of-living adjustments, the Reagan debt collection initiative of 1981, and the Omnibus Reconciliation Act of the same year. One can begin to understand the crisis enveloping the SSA. Every law required changes to existing practices, policies, and software.30 An underlying managerial practice also contributed to the problem: the rapid conversion of hundreds of programs on the fly, with little or no documentation of changes, so that as data processing personnel turned over within the SSA, what the digital systems looked like became less clear to the remaining technical staff. In turn, that circumstance made
it more difficult to fix software problems or to know what to change, and how, as new laws went into effect. OTA criticized SSA's management for not planning changes and for not having sound development plans for IT.31 The net result was a collective near-meltdown of the agency in the late 1970s and early 1980s. The SSA had seventy-six major computer applications, but all seemed to be degrading with age and with changing technologies and requirements. Here is how OTA described the consequences: "During the 1960s and 1970s, SSA was progressively less able to respond to congressional mandates without Herculean efforts, resulting in large backlogs, high error rates, deteriorating cost-effectiveness, and worsening workplace conditions."32 The agency responded in 1982, however, with a strategy for fixing its problems, called the Systems Modernization Plan (SMP). The SSA proposed to create new software tools in support of its mission and staff; databases would be integrated, and both telecommunications and computers would be upgraded to handle new functions and larger volumes of transactions. The SSA wanted to do this over five years at a cost of some $479 million. Meanwhile, OMB proposed that in the process the SSA reduce its workforce by 17,000 employees, to capture the increase in worker productivity expected from the new systems.33 For all intents and purposes, the SSA was able to implement its plan, despite much criticism and skepticism, while moving many systems from batch to online and converting additional paper files to digital form. It also installed toll-free 800 numbers for the public and better telecommunications for staff.34 By the early 1990s, one could see results. For example, an individual could get a social security card in ten days, unlike the six weeks it took in 1982. The issuance of social security numbers, while it may seem a small action, was actually complex and important, so speeding it up proved crucial to improving services. One could not accrue benefits or receive payments without such a number, and over the years increasing percentages of all age groups acquired one. By the mid-1980s, the SSA had issued nearly 292 million numbers and had a potential 700 million left in its "inventory" that it could issue. From the 1960s through the 1990s, it was not uncommon for the agency to issue between roughly four and seven million numbers per year. By the mid-1980s, the process had essentially become fully automated.35 Emergency payments could be made in five days rather than in fifteen, as had been the case in 1982. In 1992, it took only twenty-four hours to process an annual cost-of-living increase, while ten years earlier it had required six weeks.36 SMP cost over $4 billion to implement, while all systems cost the SSA $400 million per year to operate.37 Within the agency, much was done to improve digital support of the organization. A careful reading of the agency's own internal newsletter documents many IT activities in the late 1980s and early 1990s designed to rescue the agency. These included modernizing and rewriting applications, installing terminals and networks, and replacing old hardware. Files moved from tape to disk, minicomputers moved into field offices, and massive numbers of terminals spread across the agency. The effort mimicked in scope what the SSA
and even the IRS had done to implement their original uses of the digital hand in the 1950s and 1960s.38 Telecommunications expanded to put most employees on a modern network, costing an additional $122 million to operate in 1992. The network was handling some 11 million transactions each day. One can conclude, then, that by the early 1990s, after more than a decade of extensive investment of effort, resources, and time, the crisis of the 1970s had been overcome. The mountain the SSA's management had to climb was made higher by the fact that they had to continue providing services to the public in conformance with law. A few numbers suggest what they faced in 1982. That year the SSA maintained 240 million individual records of active accounts, which included every holder of a social security number working or receiving benefits. They had to pay out $170 billion in benefits, making monthly payments to over 50 million individuals. They issued some 10 million new social security numbers and posted 380 million wage items reported by employers. They received and had to process 7.5 million new claims applications and processed an additional 19 million postadjudicative transactions, which included recalculating 2.5 million existing cases. Finally, they had to process some 120 million bills and inquiries from a myriad of health insurance intermediaries, providers, and so forth, all the while managing a workforce of some 86,000 scattered across nearly 1,400 offices.39 Just reading this paragraph is exhausting. The price of relative success was high as well. Employee morale sank as staff saw that new systems were cost justified by the elimination of their jobs, which they resisted. All the while the number of residents of the United States needing the SSA's services increased, and Congress expanded offerings to the public. The professional leadership and long years of experience of senior staff prior to the 1970s gave way to the kind of turnover in leadership evident at DoD and at the IRS, with a continuing decline in expertise concerning computing at the most senior levels of the agency. It seemed each new head of the SSA wanted their own reorganization, although one positive trend was the SSA's tendency to evolve into functional structures rather than remain organized along programmatic lines. That trend made it possible to align the public's needs with the evolving capabilities of computers (such as integrated databases making possible case management approaches).40 Even outsourcing work to vendors proved problematic. Space does not permit a review of the story of how Paradyne Corporation and the SSA became involved in one of the largest IT upgrades in civilian government up to the early 1980s, but the saga resulted in failed systems, lawsuits, and what even the normally sympathetic employees at OTA characterized as "a management disaster."41 The rescue of the SSA and its systems should not be discounted; it was a massive effort. Just looking at the technical side, one senses the magnitude involved. OTA documented what had to be changed on the software and application side of the agency: There were in 1982 some 12 million lines of poorly written and undocumented program code. There were about 6,000 COBOL programs,
1,500 assembly language code programs, and another 1,000 miscellaneous programs. Over the years SSA had translated old manual procedures into software using now outdated programming languages, and then, converted them line by line to COBOL, preserving the inefficiencies of the older technology. The old code is being cleaned up and rewritten as it is needed, according to SSA.42
In the mid-1980s, OTA was not blind, however, to the work yet to be done, such as designing a comprehensive modern database management architecture and then implementing it.43 It did not help that SSA technicians had decided years earlier, when moving files from tape to disk (in the 1960s and 1970s), to write their own software access method (Master Data Access Method, or MADAM) instead of using an off-the-shelf product that would have been maintained by a vendor rather than by the SSA. When not well cared for, MADAM put the agency at risk of having systems incompatible with newer software and hardware.44

IT in a Networked Age, Late 1980s–2007

In the 1980s, the number of citizens continued to increase, while the size of the staff at the agency actually shrank, due to various federal cost-cutting measures. Increasingly, the SSA's management looked to technology to bridge the gap, as it had in earlier times. But this continued to be a Herculean struggle that extended right into the last decade of the century. In 1994, one assessment of the agency described the situation as nearly precarious: "An ever-increasing workload, combined with staff reductions, threatens SSA's ability to meet congressional and public expectations for service delivery. The agency's toll-free 800 telephone numbers are severely overloaded during peak periods, for example, and its Disability Insurance benefits program is in serious distress with a large backlog and long processing delays."45 As to the importance of the digital hand at this time, the same report opined that "information technology is essential to SSA in carrying out its mission. Indeed, SSA would literally collapse without the use of computers and telecommunications."46 In no other GAO or OTA report of the past half century had any auditor or expert made so emphatic a statement about the importance of IT to a federal agency. All the other agencies and departments assumed that somehow they could do "workarounds," throwing additional manual labor at a crisis, as happened on a few occasions at the IRS and repeatedly in the military with logistics, usually during wartime. During the 1980s, the SSA deployed the SMP strategy, which led to dramatic reductions in the time required to serve the public and to handle growing volumes of work per employee. But for all its success, the problems of volume and staffing remained, as did congressional changes in benefits programs. Several data points suggest the immediate problem: in 1984, the SSA had nearly 80,000 employees, a decade later only 63,000. The Disability Insurance program, which was the most labor-intensive for the SSA to manage, grew in
number of claims from 1.7 million in 1990 to 2.5 million in 1993.47 In 1991, management had begun developing a new strategy for using IT, called the IWS/LAN Technology Program, which stood for "intelligent work station (IWS) and local area network (LAN)."48 It called for decentralizing work by beefing up the SSA's telecommunications infrastructure and deploying applications out to employees in field and regional offices, using over 40,000 intelligent terminals. To a large extent, this strategy reflected what was occurring in the late 1980s and early 1990s all across the American economy as companies and public institutions shifted digital applications to where people worked and away from large centralized data centers. They redesigned mainframe systems and rewrote older application software. New applications also included further conversion of thousands of pages of regulations from paper manuals to online sources, to reduce the "sea of paper" employees worked with across the nation.49 Enhancing client files online would also make it easier for employees to work with individuals requesting benefits, quickly bringing to an SSA official the information needed to adjudicate a request. Word processing would improve communications while integrating files that had remained stand-alone, despite work done to address that issue in the 1970s and 1980s. All were elements of the new plan. As the SSA implemented it in the 1990s and early 2000s, paper in hundreds of offices shrank in volume, more ergonomic workspaces were created, and dumb terminals were replaced, first by intelligent CRTs, then by personal computers, increasingly linked into telecommunications networks both locally and nationally. While the SSA narrowly saw this plan as an infrastructure project, it did result in changes in the way work was done as well. For example, the SSA continued to expand its use of toll-free 800 telephone numbers in the 1980s and 1990s. The SSA implemented electronic data interchange for use by businesses to file earnings reports, and it introduced direct electronic deposit of benefit payments. Throughout the 1990s, volumes of transactions going through all three applications grew steadily.50 The three key benefits programs managed by the SSA underwent additional automation during the 1980s and 1990s. These included the oldest program, Old Age and Survivors Insurance, followed by the highly complex set of benefits known as Disability Insurance51 and Supplemental Security Income. Of the three benefits programs, pensions were the most automated, while the processes underpinning disability benefits remained the least automated as late as the mid-1990s. At the time, however, core applications for all processes remained highly centralized on older systems, although that continued to change throughout the 1990s and early 2000s. Basic claims-taking processes and individual files had been largely automated (particularly for the pension programs, though not yet for disability), but integration remained partial, within benefits programs rather than across them. The SSA relied largely on the improved and modernized systems implemented in the 1980s, now accessible through intelligent terminals and an increasing number of PCs. The SSA became one of the first federal agencies to launch a Web site to serve the general public. On May 17, 1994, it launched SSA Online on the World Wide Web at http://www.ssa.gov. In 1996, the SSA awarded Unisys Corporation
the agency's first contract to implement the IWS/LAN plan. Meanwhile, the SSA continued to add information to its Web site and experimented with various applications, such as providing online personal earnings and benefit estimate statements, an application that went up and down on the Web site over the next several years as the agency dealt with concerns regarding privacy and data security over the Net. In 1998, the SSA launched a pilot service to make it possible for people to apply for retirement or survivors benefits over its 800 numbers, offering an alternative to the earlier processes of filing forms or visiting an SSA office. Direct deposit payments continued to rise, reaching roughly 75 percent of all payments made by early 1999. This number was not higher because not all beneficiaries had bank accounts.52 By the middle of that year, the SSA had installed 75,000 new workstations and had created 1,742 LANs in either its offices or those of state agencies with which it worked. At the same time, the SSA's Web site began receiving positive reviews from IT industry publications, no doubt encouraging management to rely increasingly on this channel of communications with its clients, a pleasant trend that continued right into the new century.53 Table 5.2 lists key applications added to its Web site in the 1990s and early 2000s. The agency's intranet made available to employees a substantial amount of information to help them with their daily work. It also provided e-mail and a growing body of electronic versions of the SSA's forms, which employees, and later the public (via the Internet), could use to file and change claims. Like other
Table 5.2 Key SSA Applications on the Internet and Intranet, 1994–2003
1994 First Internet site launched (http://www.ssa.gov) to provide public information
1995 SSA adds information about programs to its Web site
1996 SSA launches Online Personal Earnings and Benefit Estimate Statement (PEBES)
1997 SSA temporarily suspends online PEBES due to privacy and security concerns
1998 SSA launches Digital Library for staff use; SSA first government agency to be Y2K compliant
1999 SSA completes first phase of IWS/LAN project with over 75,000 workstations installed and 1,742 LANs used by SSA and state DDS facilities
2000 SSA launches Electronic Newsletter (eNews) for public to get SSA news; SSA launches electronic Retirement Planner; SSA makes it possible for Medicare beneficiaries to apply for replacement cards
2002 SSA places Forms Repository on its Intranet site
2003 SSA introduces eVital to provide immediate online verification of birth and death data to speed up processing of claims and other services; one millionth response to Internet-received inquiries achieved since 1996
Source: Social Security Administration, “History of SSA 1993–2000,” undated, http://www.ssa.gov/ history/ssa/ssa2000exhibit1-1.html (last accessed 9/10/2005); Larry DeWitt, Historian, U.S. Social Security Administration.
agencies during the Clinton Administration, the SSA was required to reduce paperwork; in response it moved many of its existing paper files to online versions, altering work processes in subsequent years to integrate the online variants more fully. Applications internal to the SSA were also automated and enhanced, such as an integrated human resources system that used off-the-shelf software to process various personnel actions. More mundane activities were supported by specialized automated systems: ordering supplies, processing purchase orders, and supporting various electronic commerce activities (for example, printing forms, creating labels, and maintaining bidders' mailing lists).54 The SSA's budgetary commitments to IT had always been high, but the innovations of the 1990s and early 2000s continued to drive up expenditures, making this agency a major consumer of IT. In 2001, for example, the SSA spent over $740 million on IT, supporting 548 operational systems and projects (70 percent of expenditures), plus an additional 265 systems it acquired or developed that year (30 percent of expenditures).55 Since many of the projects under way in 2001 would become major, widely deployed systems long after this chapter was published, it is worth cataloguing them (see table 5.3).56 In addition, the SSA employed over 5,000 workers directly in its IT operations.57 The SSA entered the new century with Congress, OMB, and the GAO more pleased with its performance than had been the case a decade earlier. It continued to worry about the management of its disability programs, since the American population was aging and thus would file more claims in future years, and it wrestled with a variety of planning and research initiatives, all quite benign when compared to the crises of the 1970s and 1980s. The SSA continued to enhance its disability software with an electronic claims intake process for use by its field offices in the early years of the new century. It also began implementing
Table 5.3 Major IT Projects at the Social Security Administration, circa 2001
Financial Accounting System (FACTS)
Managerial Cost Accounting System (MCAS)
New national 800 number call center applications
Talking and listening to customers
Title II system redesign
Electronic Service Delivery (ESD)
Internet customer services
Paperless program service centers
Electronic wage reporting system
Security infrastructure and operations support
Source: General Accounting Office, Information Technology Management: Social Security Administration Practices Can Be Improved, GAO-01-961 (Washington, D.C.: U.S. Government Printing Office, August 2001): 11.
a related Internet-based process for people and health providers to send medical information to the SSA, while making it possible for retirees to apply for benefits over the Internet.58 GAO, famous for being critical of the operations of federal government agencies, nevertheless expressed favorable opinions of the SSA in the early 2000s. In one assessment of its operations, reported in January 2003, GAO auditors stated: "Our evaluation of SSA's information technology policies, procedures, and practices in five key areas—investment management, enterprise architecture, software development and acquisition, human capital, and information security—found that the agency has many important information technology management policies and procedures in place."59 To be sure, GAO worried about identity theft over the Internet, since social security numbers "are widely found in public records" and increasingly on the Web.60 So, as with all other federal, state, and local agencies using computers, data confidentiality remained an important issue for the SSA in the late 1990s and early 2000s. As of 2007, when this chapter was completed, the SSA managed a variety of applications and processes that were relatively stable and effective when compared to those of prior decades. Its technical infrastructure was reasonably modern and better managed than it had been since the 1960s. In late 2005, the SSA operated over 1,500 offices, while in excess of 158 million individuals were paying social security taxes and 48 million individuals received payments exceeding $490 billion. It employed 65,000 people and worked with an additional 16,000 state employees. The SSA was finally creating an online Electronic Disability (eDib) process that promised to liberate the agency from vast quantities of paper folders. Its human resources systems continued to expand, while management began using new reports on productivity and allocation of resources. In 2003, the SSA launched a newly designed accounting system called Social Security Online Accounting and Reporting Systems (SSOARS), now the agency's accounting system of record; with additional modules added in 2004 and 2005, it had become a major tool for management by 2006.61 The SSA had one of the most visited Web sites in the federal government in each of the early years of the new century, and 2005 proved no exception, well on its way to exceeding the 35 million visits of 2004. As it reported to its constituents and employees in 2005: Today, people can apply for retirement, disability and spouse's benefits over the Internet, and use Social Security's benefit planners to help determine what benefits they and their families would be eligible for. Services for current beneficiaries include: change of address, direct deposit, replacement Medicare card, replacement 1099, and proof of income letter request. Cumulatively, these and all other online services handled over 1.2 million transactions in FY 2004, a 225 percent increase over FY 2002.62
A similar tale could be told about internal administrative reporting and operations, many now handled through the SSA's intranet, including such services as the retirement and disability claims processes. The SSA's
newest database (eDib) worked along with such other systems as the new Case Processing and Management System (CPMS), which could access eDib's database but also produced a variety of operational and tracking reports.63 Technological innovations also enabled new functions just appearing in federal agencies in the early years of the new century. These included online (electronic) signatures, enabled by legislation signed by President Clinton; in 2004, the SSA implemented such a signature proxy for claimants filing in person, online, or by telephone. In short, many of the agency's IT enhancements focused on expanding use of the Internet by the public (citizens and employers, each of whom used different applications) and of its intranet site by employees and other public officials.64 One IT application became a hugely popular service, one based largely on old-fashioned batch processing, printing, and mailing in paper form. Called the Social Security Statement, it is a printed document mailed to over 140 million workers each year that explains the SSA's offerings, tells individuals how much they have paid into the system, and estimates what they might expect to receive in benefits upon retirement. The SSA uses the occasion to provide other information to these workers. As with all processes at the SSA, this one too has high volumes. In the several years the process had been functioning (to mid-2005), the SSA mailed out over 780 million automated statements, roughly 500,000 each day, or approximately 125 million per year. The process is essentially completely automated.
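The agency's reports do not spell out the job's internals, but the arithmetic above (approximately 125 million statements a year works out to roughly 500,000 per mailing day) suggests a plain batch pipeline: walk the account file, render one statement per worker, and chunk the output into daily print runs. A hypothetical sketch follows; the benefit-estimate formula is a placeholder, not the SSA's actual computation.

```python
# A sketch of an annual statement mail run in the spirit described above.
# The rendering format and the estimate formula are invented placeholders.

DAILY_RUN_SIZE = 500_000  # roughly the daily volume cited in the text

def render_statement(account):
    """Format one worker's paper Social Security Statement."""
    estimate = 0.4 * account["avg_annual_earnings"] / 12  # placeholder only
    return (
        "SOCIAL SECURITY STATEMENT\n"
        f"Worker: {account['name']}\n"
        f"Total taxes paid to date: ${account['taxes_paid']:,.2f}\n"
        f"Estimated monthly retirement benefit: ${estimate:,.2f}\n"
    )

def mail_runs(accounts):
    """Yield successive print batches of at most DAILY_RUN_SIZE statements."""
    batch = []
    for account in accounts:
        batch.append(render_statement(account))
        if len(batch) == DAILY_RUN_SIZE:
            yield batch
            batch = []
    if batch:
        yield batch
```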
Bureau of the Census

The federal government currently has some 100 agencies whose mission is to collect data on a vast array of issues, ranging from economic activity and population to agricultural productivity and health. Many of these agencies are large and critical to the functioning of various other governmental bodies. These statistical agencies include the Bureau of Justice Statistics, the Bureau of Labor Statistics, and the Centers for Disease Control and Prevention.65 All federal departments also collect statistics. But the most senior and best known of these agencies is the Bureau of the Census, whose mission the founders of the American government embedded in the Constitution in the eighteenth century. In the twentieth century, statistical agencies as a group advanced the field of statistics, were major supporters of the use of information technologies of all kinds, even participated in the early development of punched-card tabulating equipment and later computers (most notably at the Census Bureau), and shared innovations among themselves. In short, they represent a subculture within the government that, while highly fragmented, interacted extensively and that, as a group, has hardly been studied by historians and others interested in public administration.66 Throughout the twentieth century, the Bureau of the Census was responsible for conducting various censuses and disseminating the results of that work. These collections of data included population counts every decade and intermittently
within decades and others related to economic and social issues; the bureau published its findings and data in a variety of ways: printed reports and, later, large digital files accessible to researchers, public officials, and, ultimately, citizens. So there was a constant need for tools to collect, analyze, and publish data, and always to do so both cost effectively and in a timely fashion. It was thus no accident that when computers were first being secretly developed during World War II, officials from the Bureau of the Census became involved in exploring their possibilities. As historian Margo J. Anderson observed, "statisticians saw that the savings from such technological innovations would enable the bureau to process more data, at a lower cost, and with more accuracy." But also, and key to understanding how work would change over time, she pointed out that "the savings could also be plowed back into further improvements in the statistics—to improve survey methods, to analyze errors, to try experimental new techniques."67 Between the end of the census of 1940 and that of 1950, the bureau first encountered computers. As early as October 1944, John Mauchly (of the John Presper Eckert and John Mauchly team that later built the first UNIVAC computer) met with William Madow of the bureau to discuss the potential uses of computing to speed up the sorting and tabulating of census data, discussions that continued over the next couple of years. These conversations influenced Mauchly and others in his firm in the design of their new system, focusing particularly on how to handle massive quantities of data that had to be processed in a repeatable fashion. In the late 1940s, the bureau partially funded R&D on the new system, and in 1948, the government placed an order for one UNIVAC on behalf of the bureau. On March 30, 1951, the bureau took delivery of the first UNIVAC I, Serial Number 1. The immediate importance of this event lies more in its effect on computing in general than on the bureau, since the machine was used for only a few calculations involving the census of 1950. More important, the construction and successful operation of the system allowed Eckert and Mauchly to demonstrate the viability of computers and helped launch the commercial computer market. The UNIVAC I became the poster child of the new technology during the early 1950s.68 Because of the ongoing conversations with its developers dating back over half a decade, officials at the bureau had become quite familiar with computing technology, often serving as experts to other agencies in the late 1940s and early 1950s. As for the census of 1950, the system's construction was completed too late for it to play a central role. That census took place in essentially the same manner as prior ones, with data collected, edited, coded, punched, and tabulated using some 6,000 older machines, ranging from card punches to tabulators. For that census, the bureau created nearly 200 million card records; just punching holes in the cards required about 114,000 man-days.69 Its own internal history of that census recorded that while the UNIVAC "was the most far-reaching innovation in automatic tabulating equipment" of the day, it "did not become available until late in the tabulating program and was used for only a small part of the population and housing tabulations," by moving data from cards to tape that could then be sorted using software. Initial use proved effective in reducing the number of
times the data had to be read to be sorted into various categories of information. Officials concluded that "with machines like the Univac, future censuses should be processed with considerably greater speed."70 The key technological innovation that caught the attention of statisticians was the machine's ability to work with large quantities of data on magnetic tape. While the UNIVAC has received much attention in the historical literature, in the run-up to the 1950 census the bureau and its suppliers of IT (most notably IBM) also continued to improve existing precomputer technologies, as they had in preparation for prior censuses.71 The biggest data processing problem the bureau faced concerned data collection. It was too slow, too expensive, and required too many people; management had to automate the process as much as possible. Conversations in the 1940s and 1950s with the National Bureau of Standards, Mauchly, IBM, and others dominated much effort at the bureau. By the early 1950s, however, the possibilities of optical sensing and microfilm offered a way out and led to the development of Film Optical Sensing Devices for Input to Computers, better known as FOSDIC. Completed questionnaires were microfilmed; a beam of light directed at the film picked out the darkened spots, which were read by a photoelectric cell connected to a device that "wrote" the responses as pulses onto magnetic tape. That tape then served as input into the computer. Early versions of FOSDIC could handle 2,000 spots per second, and models of the 1970s up to ten times as many, all very accurately. This new way of collecting data first went into use at the bureau in 1954, and over the next two decades improved models and approaches were developed, such as FOSDIC II, which went into service in 1957, FOSDIC III in the 1960s, and later FOSDIC IV.
Figure 5.4
First FOSDIC system at the Census Bureau at Jeffersonville, Indiana, circa 1960. (Courtesy Census Bureau)
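FOSDIC itself was special-purpose hardware, not software, and the bureau's questionnaire layouts are not reproduced here, but the decoding logic it mechanized can be suggested in a few lines. The sketch below assumes an invented two-question layout and treats each photoelectric reading as a reflectance value to be thresholded into mark or no-mark.

```python
# A sketch of the decoding step FOSDIC mechanized: each microfilmed frame is
# a grid of index positions, a light reading at each position is thresholded
# into mark / no-mark, and marked positions map to answer codes written out
# for the computer. The layout and codes are invented for illustration.

MARK_THRESHOLD = 0.5  # readings below this count as a filled spot

# Hypothetical layout: question -> list of (row, col, answer_code) positions.
LAYOUT = {
    "sex":    [(0, 0, "M"), (0, 1, "F")],
    "tenure": [(1, 0, "OWNED"), (1, 1, "RENTED")],
}

def decode_frame(readings):
    """Turn one frame's reflectance readings into an answer record.

    readings -- dict mapping (row, col) to a 0.0-1.0 reflectance value.
    Returns (record, problems); frames with missing or multiple marks were
    the cases clerks had to resolve by hand.
    """
    record, problems = {}, []
    for question, positions in LAYOUT.items():
        marked = [code for row, col, code in positions
                  if readings.get((row, col), 1.0) < MARK_THRESHOLD]
        if len(marked) == 1:
            record[question] = marked[0]
        else:
            problems.append((question, marked))
    return record, problems

# Example: a frame with "F" and "RENTED" filled in.
frame = {(0, 1): 0.1, (1, 1): 0.2}
assert decode_frame(frame) == ({"sex": "F", "tenure": "RENTED"}, [])
```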
These classes of machines were used for the 1960, 1970, 1980, and 1990 censuses. For decades, therefore, the combination of automatic microfilming and digital computing was used to pull ever increasing amounts of data into computers at ever faster speeds.72 By the time of the census of 1960, the bureau had made strides in data entry (FOSDIC) and, of course, now had in-house experience with digital computers.73 The number of people living in the United States had increased to 180 million in 60 million homes. Employees of the bureau, called enumerators, called on these homes to collect the necessary data. Improved processes for handling the data, and now greater reliance on computing to tabulate information, meant that initial reports to the nation on the size of its population were published six months faster than in 1950, and final reports of the census more than a year earlier than in the past. Enumerators used machine-readable forms and the bureau's four FOSDIC systems. The data were then tabulated using computers. Much of the work was centralized for the nation at a facility in Jeffersonville, Indiana, where clerks checked the work of the enumerators for accuracy and so forth. Two Univac 1105 systems had replaced earlier machines, and magnetic tape now served as input into the systems and as output that could feed printers to produce final reports; other 1105s were located at the University of North Carolina at Chapel Hill and at the Armour Research Foundation at the Illinois Institute of Technology in Chicago.74 During the 1960s came the huge surge of advancements in computing, symbolized by the IBM S/360. Disk memory became widely available as an alternative to cards, microfilm, and magnetic tape. Machines became faster and capable of handling larger volumes of transactions and data, but the bureau did not fully exploit these new capabilities, other than to handle larger volumes of data and perform some new data mining functions. Many of the systems needed for the next census had been put in place as early as 1962–1965, just before the arrival of the major technical innovations of the decade, too late for these to be fully integrated into the next great count. As with earlier censuses, the population had grown, and the volume of information to be collected on each person had as well, for a total of over 4 billion pieces of information. The bureau wanted to make a great deal of this raw data available in digital form to officials and scholars. Innovations in software and programming languages also made possible new capabilities that the bureau was able to take advantage of with existing hardware. One reporter wrote in 1970 that "the improved utility of the census is due mainly to development of new software which can reformat and re-aggregate census statistics in ways specifically desired by the end user."75 Just as important, users could relate census data to random lists of people, which allowed marketing managers, for example, to estimate the incomes of people in specific locations. Four Univac 1107s and two 1108s stood ready to process that year's data, relying on six newer models of the FOSDIC input systems, which reduced the nation's entire census to some 8,000 tapes. In 1970, for the first time, most of the questionnaires were mailed to households rather than relying so heavily on human enumerators to collect all the data. The forms were
machine-readable, and because more data was collected, there were more machine-readable statistics. In 1960, only a few digital statistical files had been made available to the public, but with 1970, data on all questions asked were in machine-readable form. This represented a massive increase in statistical data on the nation. In response to this capability, some sixty firms came into existence to analyze the information for clients such as marketing departments and insurance companies. Census officials also expanded their software protections for data confidentiality (required by law), which had become a matter of much concern beginning in the 1960s with the growing dependence on computers.76 For 1970, the bureau added questions and conducted sample surveys on employment and inventories, and another on residential finances, all part of the growing trend among statistical agencies of collecting increasing amounts of data about the nation. The bureau also planned to produce a much larger array of demographic maps, a process initiated with computing in the 1960 count, although at that time on a far more tentative basis. Final reports totaled some 200,000 pages in paper and microfilm, in some 117 printed reports. The data processing applications built on those deployed in the early 1960s and accounted for the experience gained with mail-out/mail-back census returns, for the requirement to present more information, and for the expanded publication of maps. Calculations and publications were computerized. Univac 1100s worked alongside two now-aging IBM 1401s. The bureau had yet to take full advantage of disk storage, and it continued to rely on FOSDIC-based processing of input. Yet once again, the bureau had deployed the largest single use of computing to collect data in the history of the nation.77 While special interest groups and politicians debated whether or not the Census Bureau had undercounted the population, particularly minorities, in the 1960s and 1970s, deployment of computing for the 1980 count progressed. As the internal record of that census reported, "the 1980 program resembled the one for 1970 in scope, but with far greater emphasis on disseminating data on computer tapes and microfiche."78 The volume of machine-readable reports published increased fivefold over 1970's.79 Because the bureau began planning the next census while tabulating the last one (a pattern of dual operations dating back to at least the early 1960s), the 1990 census became a major project during the 1980s. Continuing questions about undercounting had led the bureau to consider various techniques for statistically estimating populations beginning in the 1960s, an issue that became urgent by the early 1970s. The ability to apply continuously emerging statistical techniques, and ever more powerful computers, to increasing amounts of digitized census data influenced the technical discussions among experts and the bureau.80 Driving the work of the bureau from one census to another was the growing appetite of public officials and the private sector for more, better, and more accurate data, an appetite whetted by the succession of additional capabilities the bureau acquired over the years through computers.81 The workload the bureau experienced in the 1980 count proved higher than originally planned; indeed, it counted over 5 million more people than
anticipated, and it actually ran out of budget, which Congress fixed with a supplemental appropriation. In the internal postcensus dialogue, bureau employees spent considerable time discussing the still labor-intensive data capture processes, particularly the manual field data collection tasks. One internal report, written in 1984 about these activities during the census taking of 1980, noted that "55,000 clerks checked-in, checked-over, and hand-sorted into batches all of the 88 million decennial census documents." In addition, "this was a time-consuming, expensive, non-uniform, and uncontrollable operation," made worse by the fact that the census "employed over a quarter of a million people."82 To avoid that and other problems from the prior census, for the 1990 count the bureau chose to implement a strategy similar to the SSA's, namely, to increase the automation of its field operations. In January 1986, the bureau decided to acquire roughly 555 minicomputers to beef up its processing power. These were to be used to automate the check-in process by which questionnaires were received and checked for completeness, obvious errors, and so forth, much as the IRS began doing with electronically submitted returns later in the century. The machines would also be used to prepare maps for enumerators and to further automate other steps in the preparation of various economic censuses. Bid proposal conflicts, however, delayed procurement of these machines, putting the bureau at risk of not being able to test software and processes in its dress rehearsals for the 1990 census. But a smaller number were installed and used. In addition, more digitized data were published, made available on CD-ROMs.83 Preparation and publication of statistical tables were significantly automated over prior years, while the bureau's earlier software for tabulating all results was updated to reflect the current questions asked of the population at large and in support of various statistical studies under way. Finally, the bureau expanded a service, begun in the early 1980s, of making some of its published reports available online through time-sharing services. This service proved highly successful, growing from a few thousand users in the early 1980s to some 50,000 downloading 396,000 files (largely onto PCs) in 1992.84 Census maps had long been popular, and in the 1980s and 1990s the bureau expended much effort in automating their creation and publication. For 1990, the bureau wanted to create "a single, nationwide, digital geographic and cartographic data base from which to produce all the required geographic products and with which to perform the geographic services of geocoding," an impressive project.85 Working with the U.S. Geological Survey (USGS), the two created the Topologically Integrated Geographic Encoding and Referencing (TIGER) System, which was used as part of the censuses of 1990 and 2000. All maps could be generated by computer, using data provided by the USGS. TIGER covered the entire nation, making it one of the largest GIS applications implemented in the United States. The system could produce over 146,000 different maps, containing data on roads, railroads, hydrography, and various transportation features, all integrated into specific local, regional, and national maps.86 In addition to this application, the bureau developed a new address control file for over 100 million housing units in the United States, built to support the tracking and control of data collected during the 1990 and future censuses.
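The geocoding service named in that statement of goals rested on a simple, powerful data structure: each street segment in a TIGER-style file carries a range of house numbers and the census geography on each side of the street, so a mailing address can be matched to a census block. The sketch below illustrates the lookup with invented segment data and field names; the real TIGER files were far richer, carrying coordinates, feature classes, and more.

```python
# A sketch of TIGER-style address-range geocoding: find the street segment
# whose house-number range contains the address, then pick the census block
# on the appropriate side of the street. Data and fields are invented.

SEGMENTS = [
    {"street": "MAIN ST", "low": 100, "high": 198,
     "left_block": "1001", "right_block": "1002"},
    {"street": "MAIN ST", "low": 200, "high": 298,
     "left_block": "1003", "right_block": "1004"},
]

def geocode(house_number, street):
    """Match an address to a census block via segment address ranges.

    Uses the common convention that even house numbers fall on one side of
    the street and odd numbers on the other.
    """
    for seg in SEGMENTS:
        if seg["street"] == street and seg["low"] <= house_number <= seg["high"]:
            side = "right_block" if house_number % 2 == 0 else "left_block"
            return seg[side]
    return None  # unmatched addresses fell out for clerical resolution

assert geocode(123, "MAIN ST") == "1001"
assert geocode(250, "MAIN ST") == "1004"
```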
support tracking and controlling data collected during the 1990 and future censuses. It included bar coding of census questionnaires, which tied the data in those documents to the new system.87

For conducting the census of 2000, the bureau used a combination of preexisting and new digital applications conjoined for that year’s data collection. The heart of the IT infrastructure consisted of ten major systems, listed in table 5.4. These did not include the internal operational systems required by the federal government for budget, payroll, and human resources.
Figure 5.5
By the end of the century, all data entry was online, circa 2000, Jeffersonville, Indiana. (Courtesy Census Bureau)

Table 5.4 Key Digital Census Applications, 2001

Headquarters Processing
Data Capture System
Geographical Support System
Operations Control System
Pre-Appointment Management System/Automated Decennial Administrative Management System
Telephone Questionnaire Assistance and Coverage Edit Follow-Up
Internet Data Collection/Internet Questionnaire Assistance
Accuracy and Coverage Evaluation System
Management Information System
Data Access and Dissemination System

Source: U.S. General Accounting Office, 2000 Census: Headquarters Processing System Status and Risks (Washington, D.C.: U.S. General Accounting Office, October 2000): 16–17.
The key observation to draw from table 5.4 is that all these systems had to operate interactively with one another to allow the bureau to count the nation. Each system also had subapplications; for example, the headquarters processing system had forty-eight applications that did such things as update address files, create files of census responses, and prepare data for tabulation and dissemination.88 The GAO observed that, as in prior censuses, there were issues concerning confidentiality of data, accuracy of information, and insufficient staff to handle complex systems, concerns it had with other government agencies as well. In 2005, when the bureau was already gearing up for the census of 2010, 10 percent of its permanent workforce worked in IT (1,100 employees out of 12,000). In addition, it already had 500 consultants working on IT projects.89 In preparation for the 2010 census and other studies, it upgraded existing applications and invested in new systems. New applications of the digital included the American Community Survey, a new topologically integrated master address file, an automated export trade statistics system, data access and dissemination systems, demographic statistics IT support systems, economic census support tools, e-filing of data, field support, and additional software for its geographic support system.90 In short, IT remained a major focus item for this bureau as it entered the new century.
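Underlying all of these systems was the same core computing task the bureau had performed since 1960: reducing millions of uniform, machine-readable person records to cross-tabulated counts. The sketch below illustrates that task in miniature, in Python; it is a hypothetical illustration, not bureau software, and the field names and sample records are invented.

```python
# Minimal sketch of census tabulation: uniform machine-readable records
# are reduced to cross-tabulated counts, the core of every published table.
from collections import Counter

# Hypothetical person records, as might be decoded from FOSDIC-read forms.
records = [
    {"state": "WI", "age": 34, "housing": "owner"},
    {"state": "WI", "age": 71, "housing": "renter"},
    {"state": "IN", "age": 8,  "housing": "owner"},
    {"state": "IN", "age": 42, "housing": "owner"},
]

def age_bracket(age: int) -> str:
    """Collapse a raw age into the bracket used by a published table."""
    return "under 18" if age < 18 else ("18-64" if age < 65 else "65+")

# Cross-tabulate: count persons by (state, age bracket, housing tenure).
table = Counter(
    (r["state"], age_bracket(r["age"]), r["housing"]) for r in records
)

for cell, count in sorted(table.items()):
    print(cell, count)
```

What made the work computationally demanding was never the logic, which is as simple as this sketch, but the volume: hundreds of millions of records, re-tabulated for thousands of geographic areas.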
United States Postal Service

Today this organization is called the U.S. Postal Service (USPS), a government agency structured as a quasi-independent organization since 1971, governed by a board of governors appointed by the president, with its mission articulated by various federal laws. Between the 1790s and the 1820s, it was called the General Post Office, and afterward, until 1970, the Post Office Department (POD); it became a cabinet-level department in 1872. While those who have written about the postal system differentiate the subject as pre- and postreorganization, when viewed from the perspective of how IT affected the work of the POD or the USPS, the differences are less obvious, because the adoption of technology was a long, slow process that caused the work of both organizations to evolve incrementally over time. A more useful chronological divide, one that accurately reflects the changing nature of postal services, would be either of two other dates. The first could be some time in the 1980s, when competition from the private sector (such as the services of UPS, Federal Express, and others) became legally possible and pronounced, cutting deeply into the USPS’s rapid delivery (e.g., overnight mail) and package delivery business. The second could be a date after the wide adoption of the Internet in the second half of the 1990s, because this new use of IT challenged the need for First Class Mail service, which provided the largest single source of revenue for the USPS. The Internet was also the one technology that finally called into question whether the USPS either needed to be completely privatized or would simply disappear at some time during the full flowering of the Information Age deep in the twenty-first century, a theme hinted at by the
epigraph introducing this chapter. The discussion below does not focus extensively on the increased role of competition, because that had less to do with technology and its effect on the nature of work at the USPS than with regulatory changes making alternative service providers possible. The Internet, however, has to be faced squarely.

From its earliest days, when Benjamin Franklin was its first postmaster general, the Post Office had as its core mission to provide universal access to information through postal delivery of mail and newspapers, and later other publications, financial documents, and packages as well. It was the nation’s first “information highway.” Throughout the twentieth century, Congress added the requirement that the department (and later the USPS) be financially self-sustaining, that is to say, able to finance its operating costs with the fees collected for its services. Congress also established ground rules for how those charges could be set and changed. The POD and the USPS have more often than not been unable to remain financially self-sustaining, for many reasons we cannot explore here; the point is important to note, however, because all through the second half of the twentieth century the organization sought to use mechanization of mail collection, sortation, and delivery as an essential way of lowering operating expenses, and computers to reduce operating costs and to sustain high levels of service. For many postmasters general of the late twentieth century, mechanization and automation were among the very few options available for controlling expenses. Reducing salaries and benefits—the lion’s share of expenses—or the number of employees, an option often chosen by executives in private sector service companies, proved very difficult to exercise, due largely to the prowess of the postal unions, which fastidiously protected jobs and pushed through significant increases in compensation, benefits, and work rules, even launching a national strike when needed, an activist agenda most pronounced after the 1960s. The option of raising the cost of postal services—such as the price of stamps—met with frequent public, White House, and congressional resistance, and when exercised it never provided sufficient revenue to meet operating expenses. Hence the importance of computing and other technologies to the modern postal service, a factor that will probably continue to shape the course of events at the USPS since the other two options remain problematic.91

The story is also a big one because the USPS is one of the largest organizations in the American economy: it is often the second or third largest employer within the U.S. government (depending on whether we measure the total number of workers in peacetime or war), it is routinely ranked as one of the ten largest employers in the nation, and it handles massive quantities of mail. Table 5.5 displays a variety of data to suggest the size of the enterprise. Size is important because it affects the ability of an organization to leverage economies of scale, as happened at the Department of Defense. Size is also a gating factor for how quickly (or slowly) an institution can change, and for how visible it is to its various constituencies and rivals.
Table 5.5 U.S. Postal Service at a Glance, 1950–2005

Year   Pieces of Mail Processed (Billions)   Number of Post Offices*   Revenue** ($ Billions)   Expenses** ($ Billions)   Number of Employees*** (Thousands)
1950   45    41,464   1.7    2.2    501
1960   64    35,238   3.3    3.9    563
1970   85    32,022   6.5    8.0    741
1980   106   30,326   18.8   19.4   667
1990   166   28,969   39.7   40.5   843
1995   181   28,392   54.3   50.7   875
2000   208   27,876   64.5   63.0   901
2005   212   27,385   69.9   68.3   803

*The numbers are lower than actual because they do not include retail outlets called “contract stations and branches” or “community post offices,” which add another 3,000 or more to the totals.
**These are operating revenues and operating expenses.
***The numbers are lower than actual; the USPS also has what it calls “non-career employees,” which in 2005 alone added an additional 98,284 people to the count and in most years of the early 2000s exceeded 100,000, bringing the total employment base back to where it was in the 1990s.
Source: U.S. Postal Service, The United States Postal Service: An American History, 1775–2002 (Washington, D.C.: U.S. Postal Service, September 2003): 53; U.S. Bureau of the Census, Historical Statistics of the United States: Colonial Times to 1970 (Washington, D.C.: U.S. Government Printing Office, 1975): Part 1, p. 806; U.S. Census Bureau, Statistical Abstract of the United States: 2002 (Washington, D.C.: U.S. Government Printing Office, 2001): 692, and annual reports.
In the case of the USPS, for example, the vast majority of its 900,000 employees are unionized and enjoy higher levels of salary and benefits than rivals in the package delivery business, making it difficult for the USPS either to transform the nature of its work or to bring its costs in line with those of its competitors in an age when, in addition, the Internet poses an enormous threat to its income. Table 5.5 includes data on 1995, the last full year before the Internet’s rapidly increasing use by American society was felt across the economy, as well as the latest information available at the time this chapter was being written (2007). It is quite obvious that demand for this organization’s services did not decline with the public’s adoption of the Internet as a crucial source of information and a means of conducting business, or with the success of private sector rivals, as one might otherwise assume should have been the case. The reasons are manifold, but they include the fact that the population of the nation continued to grow and the economy expanded in that same period, both factors creating more business for all. The data also reflect the historic practice of the nation of using concurrently all manner of information platforms, tools, and providers. Americans simultaneously used the Internet in increasing amounts and paper-based mail and package delivery services, just as they used various forms of music (CDs, tapes, and Internet downloads) and publications (newspapers and Internet news sites).
The postal system is unique in one other related way: its employees routinely visit nearly every household and establishment (business, public, and private) in the United States, normally once a day, six days a week. No other government agency does that. In 2005, for example, that meant the USPS routinely delivered mail to over 142 million addresses, while over 7 million customers conducted transactions at its post offices. It had to collect mail from over 280 million places (buildings and blue mail boxes) and operate nearly 200,000 vehicles and dozens of large postal sorting facilities. It delivered over 98 billion pieces of First Class Mail that year and had an operating revenue of $69.9 billion. Its tag line, “we are everywhere you are,” was thus not such an exaggeration, particularly since (by then) the USPS was also accessible over the Internet for information and for transactions, such as the purchase of postage stamps.92 All of these highly visible activities involved extensive use of information technology.

Early Deployment of Mechanization and IT, 1950–1983

In the postal system, two activities took place simultaneously: mechanization of the acceptance, sortation, and delivery of mail, and the use of computing and telecommunications for the operation of the institution. For the majority of the period we examine, these two tracks of activity occurred almost independently of each other. Then, in the 1970s, the two began to intertwine, most noticeably because of ZIP codes, which made it possible for digitally enhanced equipment to sort mail for delivery. Each track presented fascinating examples of how large organizations react to the emergence of new technologies and deploy them. But it is important to understand that frequently the two were separate, a circumstance that was not always clear. A reading of the annual reports of the postmaster general written over the decades can be confusing, for example, because in the early years of the computer postal officials frequently thought of this technology as yet another tool to mechanize (their term) or automate (the word used more frequently by other government agencies) both core activities. The Post Office constantly expressed interest in leveraging machinery. While no comprehensive history of this extensive activity at the Post Office has been written, we can discern many of the major activities by reading the annual reports, which discussed mechanization every year right into the new century. Take 1954’s statement as an example of management’s intent, one repeated in many ways over the next half century: “The mechanization of postal operations to the maximum extent is an integral part of our program to improve service and reduce costs. We are exploring all possible methods of utilizing machinery to simplify or eliminate the major manual operations.”93 In the 1950s, work started on how to use electronic devices to read addresses as a way of speeding up sorting, an initiative that resulted in the creation of ZIP codes in the 1960s. All through the 1950s and 1960s, postal engineers worked with various companies to develop automatic culling, facing, canceling, scanning, and sorting machines. Some of these machines were massive in size and worked as systems
of integrated devices used at regional offices to sort mail for delivery to individual post offices. Then, in the late 1950s, the Post Office began experimenting with optical scanning technology to cancel stamps in the upper right-hand corner of envelopes.94 A key provider of technologies in support of mechanization initiatives was the Burroughs Corporation, also a major vendor of computing equipment in the 1950s and 1960s.95 On November 28, 1962, the Post Office introduced the Zoning Improvement Plan for mail sorting, better known as ZIP Code numbers, with deployment to begin in 1963. In its own words, the Post Office announced that “ZIP Code was designed to help the Department efficiently handle the rapidly mounting mail volume without a correspondingly large increase in manpower—to contain costs and forestall the day higher postage rates might become necessary.”96 Officials were clear on why they wanted the ZIP Code: “when fully implemented, [it] will reduce the number of times a letter must be handled, thus speeding its dispatch and
Figure 5.6
Data processing center in Richmond, Virginia, Regional Office, mid-1960s. (Courtesy USPS)
reducing the number of mis-deliveries that might otherwise be encountered through misread addresses.”97 The Post Office made development of optical scanners to read ZIP codes its highest R&D priority. In the early 1960s, officials envisioned sorters and scanners with memories that could compare data on a letter to where it needed to be sent and link this action to the scheduling of transportation, which in turn meant the deployment of data processing at local post offices as well.98 In 1965, the Post Office began creating digital ZIP code files.99 By the end of the 1960s, use of ZIP codes had been widely accomplished across various classes of mail and packages; by the early 1970s, the code had become nearly ubiquitous on American mail. In the 1970s, wide deployment of bar coding and other devices to physically sort and move mail dominated much of the service’s mechanization initiatives. Some of the capabilities of the equipment proved very impressive. The postmaster general noted in his annual report of 1972–1973 that “envelopes with bar symbols can be sorted at speeds ranging up to 42,000 pieces per hour,” while providing companies that used them the ability to add additional “bits of information,” making possible more refined sorting of mail for delivery to specific departments or offices.100 Despite work done in the 1960s and 1970s to mechanize the handling of mail, Postmaster General Benjamin F. Bailar noted in his annual report of 1974–1975 that prior to the reorganization of the POD into the USPS, “operational innovation was forced to lag behind those of other industries, and mechanization was minimal.”101 Yet development and deployment of mechanization mimicked the rate of innovation evident in other federal agencies, such as the Department of Defense, where it often took over a decade to develop a new tool and yet another decade or more to fully deploy it. Table 5.6, based on data collected by the USPS, provides evidence that mechanical sorting of mail progressed quite rapidly in the early 1970s. By the end of 1980, the USPS was sorting over 70 percent of all mail mechanically. To accomplish that task, it spent nearly a billion dollars on equipment in the 1970s.102 Meanwhile, the ZIP code was expanded by an additional four digits to extend automated sortation down to the postal carrier’s individual route, a process of deployment that began during the 1980s. The earlier five-digit ZIP code was in use on over 97 percent of all mail by the early 1980s.103 Because the USPS chronically could not overcome fiscal deficits, the pressure to increase productivity through mechanization remained intense right into the new century.
Table 5.6 Letters Sorted Mechanically by the U.S. Postal Service, 1971–1977

Year      1971   1972   1973   1974   1975   1976   1977
Percent     25     35     44     52     60     63     64
Source: U.S. Postal Service, Annual Report of the Postmaster General: Fiscal 1976 and Transition Quarter (Washington, D.C.: U.S. Government Printing Office, 1977): 6, Annual Report of the Postmaster General: Fiscal 1977 (Washington, D.C.: U.S. Government Printing Office, 1978): 7.
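The sortation this equipment performed is, at bottom, a simple digital operation: once an address has been reduced to a numeric code, routing a piece of mail becomes a matter of table lookups on prefixes of that code. The sketch below illustrates the idea in Python; it is a hypothetical illustration, not postal software, and its routing table and sample data are invented. The only facts assumed are that the first three digits of a ZIP code designate a regional sortation center (a sectional center facility), the full five digits a delivery post office, and, as noted above, the four-digit extension a finer sort down toward the individual carrier’s route.

```python
# Minimal sketch of ZIP-based sortation: once an address is reduced to a
# numeric code, "sorting" becomes table lookups on prefixes of that code,
# which is what made machine sorting so much faster than reading addresses.

# Hypothetical routing table: 3-digit prefix -> sectional center facility.
SCF_BY_PREFIX = {
    "100": "New York NY SCF",
    "606": "Chicago IL SCF",
    "941": "San Francisco CA SCF",
}

def route_piece(zip_code: str) -> dict:
    """Return successively finer sort destinations for one mail piece."""
    prefix = zip_code[:3]          # sectional center facility
    office = zip_code[:5]          # delivery post office
    # A ZIP+4 extension, when present, narrows the piece toward the
    # carrier's individual route, as the chapter describes.
    segment = zip_code[6:] if len(zip_code) > 5 else None
    return {
        "scf": SCF_BY_PREFIX.get(prefix, "manual sort"),
        "post_office": office,
        "carrier_segment": segment,
    }

# Example: a letter coded 60614-3210 is binned for the Chicago SCF, then
# for post office 60614, then for carrier segment 3210.
print(route_piece("60614-3210"))
```

Each lookup corresponds to one mechanical pass through a sorter; the fewer characters a machine must interpret, the less often a human must handle the piece.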
Yet progress had been made. For example, if we use the USPS’s own simple metric of gross productivity—total volume of mail handled divided by total work years—one sees demonstrable improvements, which the USPS attributed largely to the use of new technologies. In 1973, the metric stood at 128,000 pieces per work year; by the end of 1983, that number had risen to nearly 175,000, an improvement of roughly 37 percent over the decade.104 To be sure, costs had increased substantially too, so the simple measure has to be taken with the proverbial “grain of salt.” But what it suggests is that mechanization was having a direct and positive effect on how much mail employees could handle and on controlling the rate of growth in operating costs.

With regard to the large collection of back office functions, such as accounting, budgeting, cash management, and payroll, the Post Office had a variety of punched-card systems dating largely from the 1920s and 1930s, although its first use of this class of information technology began in 1913. Following World War II, as the department grew in size, it sought to consolidate and optimize operations, leveraging the punched-card technology of the late 1940s and early 1950s. For example, in 1956, the department completed a major reorganization of disbursements and mechanization, moving to a totally punched-card form of check writing to pay its bills and payrolls and shifting these functions from individual postmasters to regional offices. In the case of payroll checks, that meant some 520,000 checks issued every two weeks could now be partially automated and the cost to process them reduced, both for the POD and for the Treasury Department. As one department report noted about the new process, it “has simplified and drastically curtailed paperwork at post offices” and created “valuable by-product data . . . replacing mass accumulation of reports heretofore manually prepared by the operating bureau.”105

This application of predigital technology is important for several reasons. First, the combination of using technology and reorganizing to leverage economies of scale proved to be a positive experience for the department, encouraging it to make other changes involving the combination of technologies and reorganizations. Second, it increased the department’s use of IT for accounting, payroll, and so forth in large enough quantities that when computers came into the organization in the late 1950s, these older applications of technology could take advantage of the large computing capabilities of the digital hand. In short, like so many other large federal agencies and private companies, the Post Office could use computers and initially did so for the same applications. Third, it made it easier to collect data leading to insights into the operations of the agency. In its annual report for 1956, the department noted that “the regionalization program has simplified post office accounting and expedited and improved accuracy of reports. The use of punched card equipment makes available important statistical by-product data for management analysis, planning, control, and decision-making.”106 Fourth, it stimulated the use of such precomputer technologies in other areas, as in converting all money orders from paper documents to punched cards, a massive application involving 360 million money orders produced at a centralized facility in Kansas City, Missouri.107

Building on these applications of punched-card equipment at various regional centers, the Post Office Department began transferring this work to
computers in the late 1950s. Its earliest computers came from Univac (Model 60 and 120 mainframes), initially installed in New York and Chicago.108 Other mainframes went into regional headquarters in Washington, D.C., Dallas, Atlanta, Cincinnati, and Philadelphia. By the end of 1959, eleven of the fifteen regional offices had digital computers, used primarily for handling payroll and by-product reports.109 This commitment of digital equipment reinforced the mission of the relatively new regional headquarters in their support function as accounting centers for the post offices around the country and as a source of information for national headquarters in Washington, D.C. This development set in place the pattern of organization and operations for many years to come.

Older accounting methods were redesigned and ported over to computers. These included, for example, cost record keeping for the motor vehicle fleet (1961); reconciliation, control, and audit of money orders (1962); personnel time and record applications (1962–1963); second-generation payroll systems (1962); early operations management systems and installation of second-generation computers in regional offices (1963–1964); production of summary information on revenue, pieces of mail, weight, pounds per mile, class of mail handled, and transactions (1964); and preparation of budgets (1965). In 1964, the department began consolidating regional offices, with data processing also consolidating, from fourteen centers to six.110 These now reported to a new organization called the Bureau of Finance and Administration, which had previously been two separate departments: the Office of Management Services and the Bureau of Finance. In this reorganization, we see the second-order effects of the digital hand: an organizational transformation.111 The first-order effect had been the reduction in the cost of operations that resulted from consolidating and optimizing functions on computers. Recall that the transformation to the ZIP code process for sorting mail was also under way at the same time, relying on computers as well.

The department continued to add new accounting applications all through the 1960s and into the 1970s, while simultaneously upgrading computing and telecommunications as new machines became available (such as IBM’s System 360 Model 65, installed in 1969) for use at its national headquarters and by regional and very large post offices. One of the more important applications, which remained a major source of data for the department in various forms for decades, was the Origin-Destination Information System (ODIS). The department began collecting data in 1969 to determine volumes of mail by point of origin and destination, using sampling methods and computers. This data was used to schedule workflows and to determine manpower needs, deployment of trucks, and so forth, making ODIS a core managerial tool in the 1970s.112 The telecommunications backbone for this and other systems was the postal source data system (PSDS), deployed across post offices and regional headquarters in the 1970s.113

It is customary for observers of the American postal system to characterize operations in this department as transforming slowly prior to its major reorganization into the USPS in 1971. However, the Post Office Department moved into computing as fast as other government agencies and industries.
Table 5.7 Major IT Applications Implemented at USPS, 1968–1984

1968  Postal Source Data System (PSDS)
1974  Code Directory System
1975  PSDS replaced with Management Operating Data System (MODS)
1977  Transportation Management System (TMS)
1977  Bulk Mail MIS
1977  Start of National Time Sharing, completed in 1979
1978  Master File Data Base
1982  E-Com
1982  Direct Data Entry/Direct Reporting (DDE/DR)
1982  Permit System
1983  City Time and Attendance System (CTAPS)
1984  Corporate Information System (CIS)
Source: James L. Golden, Information Technology: 40 Years of Innovation and Success (Washington, D.C.: U.S. Postal Service, 2005), unpaginated.
In the case of the ZIP code, moreover, the department did not hesitate to embrace a radically new application of IT. Many of the major IT applications and projects of the 1970s were extensions of initiatives already under way in the years prior to reorganization. Emphasis on cutting costs with computers, on sustaining the quality and speed of service delivery with ZIP codes, and on other increasingly digitized applications were points of focus for management before and after reorganization. Table 5.7 catalogs many of these events. Critics, however, noted that the speed of deployment could always have been quicker. Postmaster General Benjamin F. Bailar admitted as much in his annual report of 1975: “Prior to postal reorganization . . . operational innovation was forced to lag behind those of other industries, and mechanization was minimal,”114 a statement, however, not fully supported by the historical record. There is no doubt that the USPS increased its interest in IT in the post-1971 period. It installed more modern equipment earlier than before, for example. Thus, while IBM’s mainframe of the 1960s, introduced in 1964 and first shipped in 1965 (the S/360), did not make it to the Post Office until 1969, the company’s follow-on product, the S/370, went in earlier in its life cycle, presenting the USPS with photo opportunities for its annual reports by mid-decade.115 By the late 1970s, the USPS was also using minicomputers at the same time as these new machines were being adopted in the private sector, because they made it possible to continue automating manual operations, such as the process for forwarding “undeliverable as addressed” mail.116

Deployment of IT before the Internet, 1984–1994

During the 1980s, the twin strategies of using ZIP codes with OCR equipment and bar codes commingled into a pervasive way of handling mail. While manual
collection and sorting never disappeared, and deployment of OCR equipment came slowly in the 1980s, ultimately the deployment of these two technologies was extensive and substantially altered how many tasks were performed, so much so that by the late 1990s one could see a new style of physically handling and tracking mail begin to emerge. In 1986, the USPS was already beginning to deploy a second generation of OCR equipment, designed for its own uses. During the second half of the 1980s, the USPS increased its use of bar coding, reflecting the same level of interest and application evident across many manufacturing, retailing, and distribution firms and industries in the American economy. The USPS did this with an eye toward bar coding virtually all mail by the mid-1990s, because the cost of handling a piece of bar-coded mail was less than half that of a manual operation.117 Between 1987 and 1991, the USPS installed over 2,000 OCR readers, bar code sorters, and other devices, the start of an extensive deployment of new equipment that occurred over the next several years. Postal officials attributed their ability to reduce the number of employees directly to their use of this equipment, in effect articulating the same rationale for embracing technologies of various types evident at such other federal agencies as the Bureau of the Census and, most notably, the Social Security Administration.118 By the early 1990s, the USPS was also encouraging business customers to pre-bar code their mail in exchange for discounted postage.

The USPS also pursued a strategy of replacing older equipment with more modern units that did the same work. It relied increasingly on vendors to evolve the technologies needed rather than conducting its own internal R&D operations as it had from the 1950s through the 1970s. In this way, the USPS seized the opportunity to drive down operating costs while avoiding the risk of installing obsolete equipment. Auditors from the GAO looked at the results in the early 1990s and concluded that since “more than half of the work of the Service is not directly affected by automation, this reduction did not have a perceptible effect on overall postal costs”; in short, despite extensive deployment, more was needed.119 Deployment, while continuous, remained a slow process by the USPS’s own admission. For example, in its annual report for 1994, the USPS acknowledged that it had installed only 40 percent of all the automation equipment it wanted, at the not inconsiderable cost of $2.6 billion since 1987.120 The desire to reduce the number of employees also proved elusive. In the period 1992 through 1995, for example, despite public declarations that it intended to eliminate 30,000 employees, the number actually went up, from roughly 782,000 to 855,000. To be sure, the volume of mail had increased in the same period. The GAO noted that the increase in employment came in jobs that were labor intensive and that had not yet fully felt the effects of automation: clerks, nurses, city carriers, and mail handlers.121

A second broad area that the USPS approached with IT involved retail sales operations. IT began influencing retail functions in a tentative yet visible way in the 1980s. For example, in the mid-1980s the USPS experimented with a service to provide rapid transmittal of faxed letters and documents to over two dozen countries over a dial-up network. The offering, called INTELPOST, became
available in some 145 post offices in twelve cities, a small deployment when one keeps in mind that there were tens of thousands of post offices, but nonetheless an early experiment with telecommunications. A second early project, called Electronic Computer-Originated Mail (E-COM), failed to catch on in the mid-1980s; because of low volumes, it cost more to offer than the USPS collected in revenues, so the USPS shut down the offering on September 3, 1985.122 Meanwhile, in the same decade, post offices began installing point-of-sale (POS) terminals, with major deployment taking place during the second half of the decade. By using POS terminals, the USPS hoped to shorten the time it took a customer to conduct a transaction at a post office, since the terminals housed rate
Figure 5.7
Automated postage stamp dispensing machine, 2002. (Courtesy USPS)
and other mailing information. The USPS wanted to achieve the twin objectives of shortening the time a customer had to wait for and then conduct a transaction, while increasing the number of people a postal employee could serve. By the mid-1990s, deployment of the USPS’s Integrated Retail Terminal was relatively ubiquitous. The reasons for its success were not hard to find. In addition to speeding up service and improving its quality (such as by using electronic scales to determine postage very accurately), the terminal collected accounting data for the USPS. During the 1990s, the USPS added functions to the system, such as printing receipts and producing self-sticking postage labels with bar codes.123

As the body of information that the USPS collected via bar and ZIP codes increased, along with the deployment of scanners throughout its post offices, it became increasingly possible to start tracking packages and mail, a function already performed by its competitors in the private sector. In 1991, the USPS began work on what ultimately became its digital tool for tracking Express Mail, expanding the service nationwide in the early 1990s. During the last decade of the century, the USPS expanded its use of tracking mechanisms for its own purposes, while making that kind of data increasingly available to customers via telephone and, later, the Internet.

The third area in which the USPS used IT concerned internal operations. As in all other agencies, the USPS had a variety of financial and accounting systems running on its computers to track budgets, expenditures, and deployment of manpower, along with project management tools to oversee the construction and maintenance of post offices and the maintenance and deployment of vehicles. These systems had been incrementally deployed all through the half century, often in response to requirements set either by the Treasury Department or by the Office of Management and Budget (OMB) and, after the creation of the USPS, by congressional edicts. In its use of IT for such mundane back office operations, the USPS was essentially functioning much like many other federal agencies. But like all agencies and private sector enterprises, its management also sought to use IT in ways specific to its mission. Perhaps the most important modern innovation began in the 1980s with the installation of online tools that postal management could use to control internal operations. In the 1980s, integrated systems provided management with quantified data on deployment of people, volumes of transactions conducted, and so forth, information necessary to run the business of the USPS, which supplied facts and insights to senior management and ultimately appeared in its annual reports. One major software tool, called the Supervisor’s Workstation, combined operational and planning functions, initially at mail processing facilities and later in other parts of the USPS; another, one of the earliest expert systems at the USPS, helped local post offices optimize delivery routes.124 The GAO noted that in the 1980s and 1990s, during various efforts to restructure the USPS, shrink the number of employees, provide new services, and improve productivity, and amid continuous growth in the volume of mail handled, service to the American public remained at levels very satisfactory to customers, Congress, and employees.125
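The tracking applications described in this section rest on a simple data structure: each bar code scan appends a time-stamped event to a record keyed by the piece’s identifier, and a customer query simply reads those events back. The sketch below illustrates the idea in Python; it is a hypothetical illustration of the general technique, not USPS code, and the event names and identifiers are invented.

```python
# Minimal sketch of scan-event tracking: every facility scan appends an
# event to a log keyed by the piece's bar code; a status query just reads
# the log back. This is the structure behind "where is my package?" lookups.
from collections import defaultdict
from datetime import datetime

scan_log = defaultdict(list)  # bar code -> list of (timestamp, site, event)

def record_scan(barcode: str, site: str, event: str) -> None:
    """Called each time a scanner reads the piece's bar code."""
    scan_log[barcode].append((datetime.now(), site, event))

def track(barcode: str) -> str:
    """Return the most recent known status, as a customer would see it."""
    events = scan_log.get(barcode)
    if not events:
        return "No record of this item."
    when, site, event = events[-1]
    return f"{event} at {site} ({when:%Y-%m-%d %H:%M})"

# Hypothetical lifecycle of one Express Mail piece:
record_scan("EB123456789US", "Washington DC facility", "Accepted")
record_scan("EB123456789US", "Merrifield VA facility", "Processed")
record_scan("EB123456789US", "Arlington VA post office", "Out for delivery")
print(track("EB123456789US"))
```

The engineering burden, as the chapter suggests, lay not in this logic but in the physical plant behind it: hundreds of thousands of scanners feeding databases that in turn served telephone lines and, later, the Web site.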
The Postal Service in the Internet Age, 1995–2007

No public institution, with the possible exception of the military, seemed so directly affected by the arrival of a new information technology as the USPS. Most specifically, the wide use of the Internet to transmit e-mail cut directly into the USPS’s First Class Mail volumes and revenues, to such an extent as to stimulate an ongoing debate about whether or not the USPS would survive the twenty-first century and, as a less draconian option, whether it should be completely privatized to speed up its further transformation into a more competitive mail delivery service.126 As one very pessimistic observer noted in 2000—after the Internet had become widespread and interactive—the economic and functional balance of power was shifting away from the USPS: “Since about 1970, the nominal price of a first-class stamp has quadrupled, growing by about 10 percent in real terms, while the inflation-adjusted price of a long-distance phone call has declined by 88 percent and the price of a unit of computing power has declined by a factor of 10 million. The price of a cellular phone has fallen by 98 percent since 1984.”127 In addition to that harsh reality, the same commentator, Thomas J. Duesterberg, pointed out some internal challenges faced by the USPS, many of which had been well documented by the postal service in its annual reports:

Despite more than $5 billion invested in automation equipment in recent years, the number of full-time postal employees has grown by about 5 percent since 1994. At the same time, the volume of mail delivered has been stagnant, growing at about 1 percent per year in recent years for first-class mail. Each full-time employee . . . moves, on average, about 223,000 pieces of mail per year. By contrast, each America Online (AOL) employee moves more than 13.8 million e-mail and instant messages per year and facilitates about 43 million “hits” on Web pages per year.128
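Duesterberg’s per-employee figure can be roughly cross-checked against table 5.5. Using the table’s figures for 2000 (about 208 billion pieces of mail and roughly 901,000 career employees), the arithmetic runs:

\[
\frac{208 \times 10^{9}\ \text{pieces}}{901{,}000\ \text{employees}} \approx 231{,}000\ \text{pieces per employee per year},
\]

which is broadly consistent with the 223,000 he cites; the small difference plausibly reflects a different base year or the inclusion of non-career employees in his denominator.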
The Pew Foundation documented the continuous growth in e-mail as well, some of which came at the expense of First Class Mail. While one could quibble over how much the USPS’s mail volumes declined as a direct result of e-mail over the Internet, the numbers are stark enough not to ignore.129 The substitution of one technology for another (electronics for paper) was not lost on the USPS or on other interested government agencies. The USPS had experimented with e-mail as far back as the early 1980s with its E-COM offering, at a time when it was already experiencing threats to its First Class Mail service from fax and electronic data interchange (EDI) services, which spread widely across many industries in the 1970s and 1980s. Thus, from the period when telecommunications began affecting the USPS, the problem grew in severity, because all through the second half of the century roughly half the volume of mail and revenues came from First Class Mail. These alternative services cut into, first, its business-to-business volumes (1970s and 1980s) and, next, its person-to-person business (late 1990s). The GAO reported as early as 1994 that “the risk to the Postal Service posed by competition and changing
technology is very real,” leaving the USPS with more expensive, less profitable mail to handle.130 In that year, the GAO reported that e-mail was growing annually at 25 to 30 percent, facsimile (fax) at 20 to 30 percent, and videotext (such as that provided by CompuServe and Prodigy) at 30 to 40 percent. EDI, already a well-established communications medium, particularly in manufacturing and process industries, was also still growing at 30 to 40 percent each year.131 In short, the threat to First Class Mail’s revenue stream predated the Internet and was far more extensive than this one form of electronic communications.

The USPS responded to these rapid technological developments by entering the EDI market, providing e-routing, setting up interactive kiosks at post offices, and deploying an electronic commerce system for federal agencies. It also participated in a European ePost network and, of course, deployed retail systems. It was clear by the mid-1990s that the USPS had a real problem on its hands: how to thrive economically in the face of these new challenges. It had to thrive because Congress required it to provide universal mail service and remain fiscally solvent. The postmaster general in this period, Marvin Runyon, had spent the bulk of his highly successful career in the automotive industry and in other senior government positions and thus knew how to conduct reorganizations and offer new products. He expanded the use of retail terminals and introduced electronic notification services and the acceptance of credit cards for payments.132 He began touting the idea of serving “America’s communication needs” as opposed to the notion of simply delivering mail. He asked Congress to give him more freedom of action to set terms and conditions, to offer new services, and to set prices—all requests that were never granted to his satisfaction.133 Yet during the late 1990s, service levels and opinion surveys of the public demonstrated that the USPS was able to do its work, albeit not as productively as it wanted or as effectively as its niche competitors in the package delivery business. Nonetheless, the USPS introduced additional services based on IT, such as PC postage (1998), a digital stamp that could be purchased and printed using personal computers, and tracking and guaranteed delivery services that relied on digital monitoring of shipments. In 1994, the USPS had launched its first public Internet site and, much like other government agencies, first provided the public with information and forms and, later, the ability to conduct an ever-growing collection of transactions. As the USPS approached the end of the century, it still employed about a third of the entire civilian federal employee population, had higher labor costs than its rivals, faced the challenges of the Internet, and worked through the risks posed by Y2K.134 GAO auditors, and a growing number of observers, remained pessimistic. One GAO report from late 1999 began with the simple yet dramatic statement that “the Postal Service may be nearing the end of an era”; despite heroic efforts, the USPS itself shared with the GAO the view that its core business would continue to decline, largely due to “the growth of the Internet, electronic communications, and electronic commerce,” minimizing the other problem of competition from private sector package delivery services.135
The USPS attempted to replace lost revenue with new sources all through the early 2000s, largely based on electronic delivery of messages and e-commerce, such as Stamps Online, eBillPay, Electronic Postmarks (EPM), Internet change-of-address services, NetPost.Certified, and NetPost Mailing Online.136 In the early 2000s, the USPS emphasized how its Internet-based products enhanced connectivity among people and ease of use. If its annual reports are a true reflection of its main concerns, these, like those of so many other government agencies, devoted considerable attention to the deployment of IT-based services and to continued improvements in internal operations, thanks to the helping hand of digital technology. Yet volumes of large revenue-generating products continued to decline, and fiscal deficits remained a chronic problem. The problem faced by the USPS was not just competition from the Internet or private competitors. One government analyst described the root issues in 2002:
To be sure, the fiscal situation had eroded, with the USPS experiencing shrinking net income from 1995 through the early 2000s, due to its inability to lower operating and infrastructure costs at a rate commensurate with growing challenges posed by the Internet and competitors. First Class Mail, for example, dropped in volume, as measured by millions of pieces, from nearly 103.7 million in 2001 to some 97.6 million in 2006. That decline resulted in the gradual loss of nearly $1 billion in revenue each year. Yet, to put those numbers in perspective, when all sources of income were accounted for, total operating revenues grew from $65.8 billion in 2001 to some $72.8 billion in 2006, albeit almost relatively flat, as did the number of pieces handled.138 The postmaster general in 2005, John E. Potter, acknowledged the challenges faced by the USPS: declining mail volumes, difficulty in controlling costs, and the overly optimistic assumption that increased volumes of First Class Mail might have helped to balance the financial books. He reaffirmed the value of continuing to use the digital hand, as had the prior half dozen postmaster generals. Despite his attempts to present the USPS’s initiatives in a positive light, he presented his dilemma clearly: “as electronic diversion continues to erode First-Class Mail volume, this product will become more pricesensitive than ever. Higher rates will likely increase the pace of change, accelerating the volume decline, resulting in falling revenue and the need, again, to increase rates. It is an economic model that is not sustainable in the long term and could lead to the proverbial death spiral that many have predicted.”139 Several months later, the USPS was authorized to raise the cost of First Class Mail from 37 cents to 39 cents, going into effect in January 2006. In 2007, another authorized increase raised the rate to 41 cents.
As this crisis developed a head of steam in the 1990s and continued into the new century, the USPS sought out new ways to apply the digital hand to its internal operations. The USPS is one of the few cases we have of an agency adhering to a technology implementation plan regardless of which political party was in office. In the case of the USPS, this involved letter sorting and movement automation, which had proven so productive and effective within the tenure of any single postmaster general that incrementally deploying technologies of various sorts, and then incrementally upgrading them as new versions became available, made good sense. In the instance of letter mail automation, for example, the current round of implementation began in 1982, involving the movement of materials that had been bar coded either by business customers or by the USPS. Extending bar coding to do delivery point sequencing (DPS)—the sorting of letters down to the carrier’s route—began in 1993 as an extension of the earlier initiative and was a program that the USPS kept deploying all through the 1990s and into the new century. By the end of 1997, 81 percent of all letters were bar coded, and a similar target was achieved for bar coding down to the carrier level. Overtime costs for carriers dropped as the amount of time they needed to spend hand sorting at their post offices declined throughout the 1990s, providing tangible returns to the USPS, although not at the aggressive levels desired.140 When compared to the results of many other federal technology projects, this one proved very effective. Over the previous twenty years, the USPS extended automation to the processing of flat and parcel mail as well.

The cost savings possible were enormous. One assessment made in 1997 demonstrated that sorting 1,000 pieces of letter mail manually cost the USPS $44.94; sorting them with the aid of mechanical devices cost $27.69; and sorting them through complete automation—made possible by computer-managed bar coding—cost $5.39, roughly one-eighth the manual cost.141

While confirming delivery of mail to customers was seen by the public and competitors as a new and desirable service in the late 1990s, it also came as a response by the USPS to growing competition. Postmasters general saw this service as contributing to revenue. When the USPS began offering a delivery confirmation service for Priority Mail and parcel shipments in 1999, it built on its prior investment in bar coding and automation. Doing so required deploying over 300,000 scanners across the nation to read bar codes, while linking the data gathered to databases feeding an 800 toll-free telephone line and the USPS’s Web site. By the end of that first year, over one million shipments per week made use of this new service.142

In each of its strategic plans developed during the last two decades of the twentieth century, the USPS discussed the role the digital hand had to play in driving down personnel costs, improving and sustaining quality of service, keeping employees motivated, and offering competitive new services, all the while generating new sources of revenue. Use of IT as a strategic tool increased from one plan to the next. No major function at the USPS was immune from computerization by the mid-1990s. Web-based tools for employees, integrated databases and supply chains, and e-services came into
service.143 Initiatives under way in the early 2000s remained consistent from one year to the next, as did the purposes of these activities. With over 60 percent of all households accessing the Internet, the USPS anticipated continued decline of First Class Mail regardless of what actions it took. In its best-case scenario, First Class Mail would decline only slightly between 2005 and 2010; the baseline scenario, relying on historical and economic trends, had the volume dropping by 10 percent; and the worst case, by over 20 percent.144 Businesses were expected to use various emerging and already widely available technologies to reach customers as well, for example, the Internet, cell phones, and even iPods. The USPS intended to increase the amount of information customers could obtain over the Internet regarding the status of their mail delivery and for help. Bar coding would be extended to make mail delivery processes more intelligent and accessible to systems, employees, and customers. As new technologies became practical to use, they would be deployed to track mail, such as radio frequency identification (RFID) devices (already used by the Pentagon to track the whereabouts of its supplies), and scanning and bar coding would improve business reply mail functions. Giving customers the ability to print postage and labels online—called Click-N-Ship—would be further enhanced. Finally, a variety of internal operations would be upgraded with shared accounting services for all parts of the postal system, a more modern communications infrastructure, and increased use of online training tools. In short, plans as of mid-decade were replete with IT projects, far more than had been the case in the 1970s, 1980s, or early 1990s.
Conclusions

The three organizations reviewed in this chapter all became extensive users of information technology, to such an extent, in fact, that it would be difficult to imagine how they could function in the future without it. How each came to that point reflected experiences unique to each agency. Rates of adoption and extent of deployment reflected internal operational and managerial issues more than just the merits of a particular technology, although, as we noticed in other federal agencies and departments, digital tools had to be configured in ways specific to their needs. In the case of the Social Security Administration, we have an instance where the commercially available functions of computers matched the needs of the agency very well: computers could handle large quantities of repetitive tasks and information formatted in the same way. Moving from precomputer technologies to the digital thus did not initially disrupt how tasks were done or disturb the mission and practices of the organization. This was so much the case that only when Congress began altering the historic role of the SSA did officials in the agency begin finding the use of computers both challenging and even more necessary; in short, using computing after the 1970s became far more complex. In the instance of the Census Bureau, computers also evolved in ways that
naturally fit into the daily operations of this organization, since they could handle large, similarly structured files in massive quantities. At the Census Bureau, the technology also encouraged employees to expand their general knowledge of statistics and, later, of digital mapping (GIS systems). The Post Office, however, provided a different model of using IT: aside from normal back office accounting applications and, many years later, retail terminals, it had to invent new equipment to make the digital useful. It pushed vendors in the 1960s and 1970s to improve OCR technologies and invented many specialized devices to sort and move mail, which over time had digital components, such as bar coding, embedded in them. Computing, therefore, became easier to use, and was used earlier, in those agencies where the technology’s functions happened to align nicely with the mission of the organization. But other factors also affected adoption of computing, most notably availability of funding (since these were large capital expenditures), the focus of management, and changing roles mandated by Congress.

What is a remarkable finding from these three cases, however, is how consistent an agency could be in wanting to deploy and use a technology regardless of which political party was in power. In that sense, these agencies mirrored the practices and cadence of deployment evident at the IRS and at DoD. While the GAO constantly criticized government agencies for poor or inconsistent leadership when it came to the use of digital tools, these agencies nonetheless deployed computers and telecommunications that improved their operations. Like the IRS and DoD, however, they, too, were overly optimistic about the results they would achieve using IT, and implementation always took longer than expected. These three agencies mirrored practices in other departments; for example, a major deployment always took over a decade to implement. The one possible exception was the Census Bureau, where incremental improvements were spurred on by the circumstance that it had to conduct a census every ten years, with no delays allowed in carrying out this constitutionally mandated mission.

IT did not threaten the abilities of the SSA and the Census Bureau to carry out their respective missions. In the case of the Post Office, we have a different situation, where much of the debate about the future of the USPS has centered on the effects of the Internet on this organization’s effectiveness. The historical record, however, suggests that too much attention has been paid to the effects of the Internet on revenues from First Class Mail. It appears that management at the USPS performed the due diligence required to understand the potential uses of IT evident in other public agencies and in the private sector, and carried out implementations quickly enough or too slowly (depending on one’s views), but often with a cadence influenced more by the availability of budgets than by any reluctance on the part of management. The real problems at the USPS had less to do with technology and more with issues that had existed (in some cases) for a half century: a pool of employees too expensive when compared to competitors or even other government workers; legal restraints on altering the size of the enterprise to expand or contract to meet the realities of the market (for instance, unprofitable post offices could not be closed down as might a bank branch
office); and the existence of a culture of entitlement that reflected more of a public sector style of operation than what was evident in the private sector—a natural consequence of its heritage and legal structure as a public institution.

How productive did computers make these agencies? In the case of the SSA, one cannot imagine it being able to carry out its fundamental mission without computing; in short, it was always a “high tech” Information Age enterprise. In the case of the Census Bureau, because it relied on a combination of automated and manual operations, one could observe productivity increasing as an evolutionary process over the decades as manual operations were incrementally automated, with wave upon wave of new uses of the digital hand. In the case of the USPS, which seemed to be criticized for ineffectiveness more than the other two agencies, the criticism may have been harsher than warranted. As the data in table 5.5 suggest, IT, when combined with other forms of mechanization and with managerial and operational practices, did make it possible for the USPS to handle four times as much mail at the end of the century with only twice as many employees (its biggest single expense). Put another way, the USPS was able to handle larger quantities of mail with fewer resources per piece of mail over time, while sustaining high levels of satisfaction with its services. To be sure, the cost of everything changed over time too, from salaries to the price of gasoline for trucks. The USPS always likes to point out that its cost for First Class Mail service is lower than in most countries, but the historical record demonstrates that the public and Congress did (and do) not care what stamps cost in other countries. Rather, the USPS has failed to understand fully and/or to communicate effectively the role technology played in keeping costs in check, such as the effects of the OCR technologies that made it possible to read the vast majority of terrible handwriting on envelopes and packages. Imagine the cost to the nation if 200 billion pieces of mail still had to be hand sorted by a highly unionized, well-paid workforce.145

A lesson these three agencies teach us is that deployment of IT is clearly more a function of nontechnical issues: budgets, labor practices, and congressional and constitutional mandates. In each instance, however, these are agencies highly visible to the public and to Congress, with the result that their major digital initiatives are not obscured from public scrutiny, with the possible exception of the USPS’s sorting equipment. In the census of 2000, arcane questions about the mathematics of sampling were debated in the committee rooms of Congress, while the SSA’s ability to provide pensions became a major topic of national debate in the late 1970s and early 1980s, and again in the early 2000s. The point to draw from these realities is that IT, while absolutely crucial to the operations of these organizations, remains subservient to the factors that determine how, and to what extent, it is used. That reality was more important than the fact that by the late years of the twentieth century, all three were very much information-based organizations, massively dependent on IT to do their work.

The American government is one of the most intensive users of IT in the world, perhaps more so than any other in either the public or private sector. It
The American government is one of the most intensive users of IT in the world, perhaps more than any other in either the public or private sector. It certainly embraced digital computing and telecommunications earlier than other national governments, and often quite massively. It was no accident that the Clinton administration made IT policies a major focus item, leading to programs to get children and the nation at large onto the “Information Highway,” while simultaneously “Re-Inventing Government,” reducing paperwork, and fostering deployment of IT and telecommunications across the economy through its regulatory practices. All of this grew out of decades of growing dependence on and understanding of the technology. For that reason, the next chapter is devoted to providing an overall view of the role of the digital hand in the federal government, allowing us to put into a larger context the case studies of specific agencies discussed in the first several chapters of this book.
6

Role, Presence, and Trends in the Use of Information Technology by the Federal Government

At present, there is no function of Federal Government—administrative, scientific, or military—that is not dependent on the smooth functioning of computer hardware and software.
—Grace Commission, 1983
The federal government was one of the earliest and largest users of computers in the twentieth century. In the early decades of computing—the 1950s through the 1970s—it was the most extensive user of computers in the American economy, and at the dawn of the twenty-first century it relied more on IT and communications than it did in those earlier years. The primary reason it is no longer the most massive user of computing in the world is that the American private sector—which is roughly four times larger than the federal government—caught up with and deployed more computers in the last twenty-five years of the century, as did industries in many other countries. Various government agencies and departments embraced computing in its nascent form, funded its rapid evolution in the 1950s and 1960s, and used every new form of the technology that emerged in subsequent decades in nearly all corners of government. The process of using computers and becoming reliant on them across the entire government followed the pattern described in the past several chapters. Each agency made its own decisions about how and when to deploy the
technology, responding at various speeds to massive increases in dependence on IT, budgetary issues, complexity, turnover in management, and the availability of new forms of technology. Attempts by various administrations and the Congress to rationalize the management of adoption and use of the technology remained unfulfilled, although by the end of the Clinton administration important progress had been made. To put the story into a simple numeric context, it was not uncommon for the entire federal government to spend between 1 and 3 percent of its total budget on computers, software, and staff to run them, and those numbers do not include the additional cost of telecommunications. In short, billions of dollars were spent each year on IT and telecommunications; their use proved as crucial to the operations of the entire government as it did to the agencies and departments discussed in earlier chapters. Yet, large agencies and departments had more difficulty in wringing effective use out of computers (DoD and IRS, for instance) than did smaller agencies (FBI and Census Bureau), despite the fact that the opportunities for greater budgetary savings, increased productivity, or improved service often lay with the largest agencies, bureaus, and departments. Issues related to size and complexity mixed with political and institutional crosscurrents to affect the role of the digital hand. By looking at adoption and use of computing across the entire federal government and generalizing about common patterns, we are able to develop a clearer picture of how one large segment of the American economy used this technology and changed the nature of work within it. However, there is a caveat: because adoption of computing occurred in a highly decentralized fashion, with each agency often making its own decisions about IT in considerable isolation from what was occurring elsewhere in the government, it is still impossible to paint a complete picture of the use of computing. In some cases, it was a matter of national security that led officials to purposefully mask data on this subject, particularly during the Cold War involving such extensive users of IT as the National Security Agency (NSA), DoD, and the Central Intelligence Agency (CIA). In other instances, issues turned more on poor accounting practices (as the GAO frequently reported was the case at DoD and at other departments) and, as so many presidential and congressional study groups noted, the lack of an effective centralized authority to track expenditures and establish technical standards. As a consequence, all the data presented in this chapter understate the extent of use of computing, both in terms of expenditures and applications. Nonetheless, many commissions and agencies collected substantial quantities of information about the uses of IT in the government, sufficient to create an initial catalog of federal uses of computing over the past half century. Because so many industries followed many of the IT practices of various agencies during the early decades of computing, it is also important to understand those uses and practices. So, as with earlier chapters and prior volumes of The Digital Hand, treating the federal government as if it were one de facto industry allows us to tease out of the historical record many broad patterns of adoption.
Use and Deployment of Information Technology, 1950–1980

Early on, management all over the government began to appreciate the potential usefulness of the new technology. As early as 1958, Joseph Campbell, Comptroller General of the United States, sent to the Congress one of the first surveys done on federal use of computing, in which he described characteristics of the technology and its use that were to be repeated by other studies for decades to come. These uses proved attractive in two general situations: (1) when “processing large volumes of data, where the emphasis is on reduction of per-unit cost of transactions processed, and where there are relatively few management implications involved”; and (2) when “processing large volumes of data, where important management-control implications are involved.”1 The first involved large but usually similar types of data handling activities, as took place in processing payrolls by the Treasury Department, wage data by the SSA, and population counts by the Census Bureau. These tasks did not call for generating data that would allow one agency to control another. In the second instance, use of IT involved analyzing data in order to decide how to expend and control funds and other resources, and to track results. These included large supply and logistical systems in DoD necessary to manage the process for procuring and deploying materiel. The comptroller general noted that use of computers had grown rapidly in the 1950s, “due to advances in technology, the population increase,” and to other demands of work in the government. But largely in the beginning, it was about reducing the cost and effort of collecting and using some 25 billion pieces of paper (recall the excitement at the SSA when it could use computers to reduce paper volumes).2 Every major agency of the government explored possible uses of computers by the early 1950s and began installing these systems by the end of the 1950s. All the surveys and inventories of the 1950s documented the surge in deployment that began by the mid-1950s, grew in the late 1950s, and extended into the early 1960s.3 One inventory alone suggests the scale of the surge. Between 1951 and 1958, approximately 237 systems (comprising computers, peripheral equipment, and software) went into the federal government. That number nearly doubled by mid-1960, to nearly 540 computers.4 The expert on computing who compiled this information, James D. Gallagher, also explained what government agencies were using all these computers for: Inventory control; aircraft engine management; electronic-equipment failure analysis; availability-and-demand history computation; inventory review-and-availability editing; cataloguing; payroll; stock-requirements; supply management; aircraft-configuration accounting; property accounting; earnings and claims data processing; cost accounting; workload forecasting; price-support analysis; road and bridge design; actuarial work; population statistics; war games; mobilization planning; economic census; air-traffic management; fiscal and budgetary control.5
The only major initial uses he failed to mention were early weather forecasting, scientific research, and air traffic control applications. Otherwise, the list was quite representative of the first generation of uses.
The variety of applications is important to call out for a number of reasons. It is customary for historians of computing to speak of first uses of computers as concentrating on accounting, scientific, or military applications. To be sure, this and other listings of early uses document the especially wide deployment of computers to perform cost accounting, payroll, budgeting, and other accounting and fiduciary work, for the simple reason that these involved labor-intensive, repetitive, and tedious tasks. But Gallagher’s list also reminds us that many nonaccounting applications were deployed in the earliest years of the computer. Put another way, Paul Armer at the RAND Corporation observed in 1966, “the first phase of utilization by government has involved the mechanization of existing manual systems or of punched-card procedures,” which was quickly evolving into a second wave of applications in the early 1960s, implemented simultaneously with the first group. This second wave involved integrating various data processing functions into more comprehensive systems comprised of what, in future years, would have been called “stand alone” applications, picking up momentum “at such a pace that some organizations will undoubtedly bypass the first,” which is largely what happened, for example, at the FBI.6 As of the mid-1960s, about 10 percent of all computers installed in the United States were nestled in federal government agencies, with the majority housed at DoD, NASA, and the Atomic Energy Commission (AEC).7 Thus, we can conclude that while computers were seen in the late 1940s and early 1950s as useful predominantly for scientific and engineering applications, and next for accounting, government officials, like users in the private sector, saw computers early and broadly as useful for more applications by the mid-1950s than stereotypical descriptions would suggest. To be sure, scientific and engineering applications actually grew in importance in new fields. For example, by the mid-1950s, the U.S. Weather Bureau had installed IBM computers to help it do its central task—actual calculations with which to predict the weather—and not simply to process mundane accounting transactions.8 A second important area of interest involved aircraft safety. The Federal Aviation Agency (FAA) began using computers in the 1950s and became a massive user in the 1960s with national networks and other safety applications. With tens of thousands of airplanes flying in U.S. airspace by the late 1950s, air traffic control became especially important and increasingly complex to manage. IBM’s 650 system became an early workhorse for this application, helping the agency coordinate the work of radar systems all over the country. In fact, many of the applications installed in the early 1960s remained essentially the same until late in the century.9 Automatic data processing (ADP), as many frequently called IT in the late 1950s and early 1960s, quickly became a major line item in the government’s budget and the subject of much attention by the White House through its Bureau of the Budget. The interest that this office maintained in its various permutations continued right into the new century because of the cost of technology and its potential for offsetting other government expenditures through increased labor productivity. In 1959, of 40 nonmilitary agencies tracked by the Bureau of the
Budget, 24 had already installed one or more systems, while another 12 still used punched-card systems from earlier times. In each of the subsequent several years (1960–1963), the number of agencies installing computers increased at the rate of roughly a handful each year, such that by the end of fiscal 1963, 32 of the 40 agencies had computers. By the end of 1963, these systems had translated into annual expenditures of roughly $688 million, outlays that grew over the years. By 1967, all but 3 agencies used computers. By the end of the decade, the proverbial “everyone” was using computing in support of their agencies’ work.10 None of this data included military or other “classified” applications of the digital hand. Table 6.1 shows the number of computers by year during the first three decades of the government’s adoption of the digital hand. Because of the limited processing capabilities of these early systems, it should come as no surprise that the majority were dedicated to one use only, rather than to multiprocessing applications, which became common by the end of the 1960s. For example, of the 1,006 systems reported in use in 1962, dedicated applications included scientific work (274), administrative or accounting (265), and program work, such as military inventory control or the Treasury’s disbursing operations (224), while only 220 were used for two or more applications. The remaining 23 systems were largely deployed in classified uses.11 Whichever survey data one consults, all share the finding that by the mid-1960s, approximately a third of all computers in use in the United States were federally owned, leased, or being used under contract to a government agency.12 One by-product of all this activity was growing interest on the part of Congress, the Executive Branch, and various data processing experts concerning how agencies were going about acquiring this technology. This interest came at a time when it appeared to public officials that data processing was consuming almost 2 percent of the national budget, an amount that could not be verified at the time but, in hindsight, seems reasonable when compared to patterns of expenditures occurring in the private sector.

Table 6.1 Number of Computers in the Federal Government, Select Years, 1950–1979

1950        2
1955       45
1956       90
1958      250
1960      531
1962    1,030
1963    1,326
1965    2,412
1967    3,692
1971    5,934
1973    7,149
1975    8,649
1979   12,190

Source: Chart 6, Bureau of the Budget, Inventory of Automatic Data Processing (ADP) Equipment in the Federal Government, Including Costs, Categories of Use, and Personnel Utilization (Washington, D.C.: U.S. Bureau of the Budget, August 1962): 13; ibid., 1965 edition, 11; ibid., 1966 edition, 7; National Bureau of Standards, Computers in the Federal Government: A Compilation of Statistics, NBS Special Publication 500–7 (Washington, D.C.: U.S. Government Printing Office, June 1977): viii, 3.
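A back-of-the-envelope reading of table 6.1, sketched below in Python, shows how sharply the pace of adoption shifted: explosive growth in the 1950s gave way to a steadier, though still rapid, expansion thereafter. The endpoints chosen here are illustrative, not the only ones possible.

    # Compound annual growth rates implied by selected endpoints in table 6.1.
    inventory = {1950: 2, 1955: 45, 1960: 531, 1965: 2412, 1971: 5934, 1979: 12190}

    def cagr(v0, v1, years):
        """Compound annual growth rate between two inventory counts."""
        return (v1 / v0) ** (1.0 / years) - 1.0

    print(f"1950-1960: {cagr(inventory[1950], inventory[1960], 10):.0%} per year")  # ~75%
    print(f"1960-1979: {cagr(inventory[1960], inventory[1979], 19):.0%} per year")  # ~18%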
There was also growing concern that open bidding for procurement of computers was not happening, thereby denying the government the economic benefits of a competitive Computer Industry. The story of this issue is complicated and deserves further study by historians; what is important to realize is that in 1963–1965, senior public officials had begun to take actions to improve the efficiency and effectiveness with which agencies justified, acquired, and managed computing. They initiated a string of managerial activities that increased in volume and importance over the next four decades, even becoming a major component of the domestic policies and politics of the Clinton administration in the 1990s.13 The result of the early concern was that Congress passed the Brooks Act, which went into effect in 1966. It defined more clearly than prior legislation and regulations how computer equipment would be acquired. The Office of Management and Budget (OMB) was required to set overall policies for all federal agencies, while Congress charged the General Services Administration (GSA) with responsibility for overseeing implementation of these guidelines. The National Bureau of Standards (NBS), nestled within the Department of Commerce, assumed the task of setting technical standards for information processing. For the next three decades, these agencies retained the same responsibilities. They did not always operate as envisioned by either Congress or the White House, for myriad reasons, resulting in a string of presidential and congressionally mandated commissions, audits, and task forces to study the acquisition and deployment of IT over the years. The epigraph at the start of this chapter came from one of the more important of these commissions. Acquisition of new computer systems continued all through the 1960s and 1970s, as illustrated by the case studies in prior chapters. Acquisition remained a highly decentralized function, with each agency and department making its own decisions on what technology to deploy, although increasingly bowing to guidelines set by OMB, carried out by GSA, and watched over by the GAO and Congress. The legislative branch criticized the effectiveness of the watchdog functions established by the Brooks Act. Then in 1980, Congress passed the Paperwork Reduction Act, the first of several laws enacted over the next quarter century to reduce bureaucracy, streamline work in the government, and shrink the time and effort required of the private sector to fulfill legal obligations, such as reporting salaries paid and filing tax returns. This law called for the appointment of an information resource manager (IRM) within every executive agency to improve the management of IT activities. The law also contained provisions for better collection of information about the use of digital tools.14 Government computer-counters were delivering messages in the 1970s that the federal government’s use of computing was aging and slowing; their own data provide useful insights about the government’s adoption of computing and help set these acquisitions in the context of the American economy at large. In that crucial decade from the mid-1960s to the mid-1970s, when computing spread rapidly across the American economy, it did so, too, within government. Beginning with roughly 10 percent of all computers in the United States, a decade later the federal government’s share of installed systems had dropped to 4.5 percent.15 This shift was less a statement of the government
slowing down adoption, which table 6.1 clearly demonstrates was not the case, and more a reflection of the rapid expansion in the deployment of computers across the entire American economy, as suggested earlier in this chapter. In the same decade within the federal government, DoD, NASA, and the Energy Research and Development Administration (ERDA) (which replaced the Atomic Energy Commission in 1975) remained the three largest consumers of computers, accounting for some 90 percent of all nonclassified systems in 1966 and 84 percent a decade later.16 As with the general economy, this decline of six percentage points demonstrates that other agencies were installing their first or additional systems in the same period. What effect did these laws and commissions have on the use of IT by the federal government in the 1970s and 1980s? The inventory of aging technologies began to worry public officials. The example of the IRS, quick on its feet and creative in the 1960s and later less so, proved emblematic of what happened at many agencies installing their first- or second-generation IT. Despite a large inventory of installed systems, the 1970s became a mixed story of slowed innovation and faltering deployment, with maintenance costs coming to exceed those evident in the private sector, which weakened the ability of agencies to respond to new congressional mandates. Smaller agencies, such as the FBI, normally proved more effective in deploying modern technologies in support of their work, while the very largest departments and agencies were increasingly viewed as backward when compared to IT practices in the private sector. The Grace Commission (1983) did not criticize the laws enacted in the 1960s or most recently in 1980 but rather blamed “the inability of the Office of Management and Budget (OMB) and the agency administrators managing ADP and their leadership to effectively introduce, justify and maintain effective ADP systems.”17 The GAO—an audit arm of the Congress—also found fault with many agencies over the same problem but shied away from criticizing the growing number of new programs mandated by the Congress (as happened with the SSA), or from acknowledging that lack of sufficient budget often was a severe impediment either in upgrading hardware and software or in rewriting old applications in the 1970s. Yet, the various assessments of DP activities written in the 1970s concluded that some gains in productivity were achieved, despite rapid increases in expenditures on IT, following the pattern of acquisition set as far back as the late 1950s. Existing applications from the 1960s were incrementally upgraded and changed, while staffs to maintain these systems also expanded.18 With the growing availability of faster and bigger computers in the 1960s and 1970s, the advent of online processing, and the existence of bigger and less expensive digital storage, one could see the same patterns of usage documented in earlier chapters spreading across other government agencies. Because of the largeness of many agencies, it was reasonable to expect that these would develop large, complex, even comprehensive applications in support of their daily work. In fact, that is exactly what happened, particularly in the 1960s and often for the first time. During the 1970s, agencies slowed their development of new systems as they began to use the ones launched in the 1960s and early 1970s. The
pattern of evolution was one of incrementally adding functions to existing applications. Maintenance of installed hardware and software became an increasingly important and expensive set of activities in the 1970s, and because of the difficulty of using and maintaining a growing patchwork of older systems, by the dawn of the 1980s upgrading and changing applications, software, and hardware proved more complex. These trends resulted in many of the systems of the earlier period remaining in use throughout the 1980s, as evidenced by the cases presented in earlier chapters for such agencies as the uniformed military services and the IRS. One report on federal computing of the early 1980s noted that, complexity, bureaucratic inertia, and increasingly complex procurement practices notwithstanding, those agencies whose systems were responsive, modern, and useful (usually smaller organizations) enjoyed “management continuity and attentiveness to ADP concerns,” citing the FBI as one example of the process at work.19 Finally, to round out the picture of IT usage in the federal government during this early period, a brief comment is in order about the role of telecommunications working with computers. Dial-up AT&T services, private networks, and quasi-public/private networks (such as what eventually would be known as the Internet) commenced widely in the 1960s and expanded even more rapidly in the 1970s, with the volume of characters transmitted growing at annual rates ranging from just under 10 percent in some years to 19–25 percent in others. In the period 1977–1981, just before the breakup of AT&T and the subsequent emergence of a rapidly changing Telecommunications Industry, utilization of private and leased data circuits by the government increased at 16 percent each year. Over the two decades, the government’s telecommunications expenses grew at an annual compound rate of 25 percent.20 Reliance on telecommunications expanded further throughout the 1980s.
Use and Deployment of Information Technology, 1981–2007

Use of the digital hand in the 1980s shifted to new issues as well. While the period of the 1960s and 1970s saw the creation of IT systems in support of existing work streams, federal officials in the 1980s and early 1990s incrementally enhanced those systems, largely in response to new duties mandated by the Congress or to augment preexisting data processing.21 While comprehensive hard data on expenditures are difficult to come by, the federal government spent at minimum between $9 and $15 billion annually on IT in these years. Since those figures often did not include all the expenses of maintaining systems, or the systems used in intelligence-gathering agencies and secret military projects, the actual figure was, no doubt, higher, possibly by as much as 50 percent.22 All these expenditures, however, made possible the adoption of new applications. Innovations in technology and declining costs of IT and telecommunications helped too. Major initiatives were also launched to better control and leverage IT and information.
Applications of the digital hand that increasingly became part and parcel of an agency’s IT infrastructure included computer-assisted modeling in agencies’ analytical forecasting and research activities. Decision support tasks were also augmented by use of computers. A third class involved the management of information and paper-based records. All three had been the subject of some data processing in earlier years, but more so in the 1980s. Federal agencies acquired large numbers of microcomputers (PCs) beginning in the mid-1970s, reaching over 100,000 by the mid-1980s and well over 500,000 just five years later. Employees used these systems to do desk-top analysis (using spreadsheet software) and to access larger files on agency mainframes. Spreadsheet software perhaps did more than any other category of IT tools in the early years of the microcomputer to decentralize computing from large data centers to desktops, a shift that began in the federal government in the 1980s and 1990s. In addition to spreadsheet software, federal employees acquired or wrote modeling and decision analytic software tools. These made it possible for individuals to model options for many small or simple decisions.23 At the other extreme of technological developments came significant improvements in supercomputers, making it possible to build very complex models, which quickly appeared in such broad areas as aerodynamics, high-energy physics, and weather forecasting. These were deployed at NASA, DoD, and at the national laboratories managed by the Department of Energy. Modeling at this level of sophistication had begun in the federal government in the 1950s for numerically intensive projects, such as weather forecasting; but during the 1960s this class of applications expanded, along with the capabilities of computers, to model issues relevant to DoD, NASA with its space programs, and the Department of Energy. As the cost of computing dropped and the capacity to handle more complex and higher volumes of data increased in the 1970s, new applications became part of the government’s tool kit in such areas as air pollution, solid waste management, water resources, urban development, and transportation. In the 1980s, 60 percent of 141 agencies surveyed used computer modeling, either with supercomputers or smaller systems, suggesting that computer-based modeling had become an important class of applications.24 Table 6.2 lists major uses as of the mid-1980s, demonstrating that the applications had also become quite diverse. To put this information in some usable context, the data in this table reflect a small sampling of the over 3,600 applications of modeling using computers. More pointedly on the question of how computers were used in decision support and analysis on a daily basis, deployment of this class of applications spread across various agencies from the 1960s to the end of the century. However, in the 1960s and 1970s, such uses of computers began largely within military agencies, elsewhere for some business-oriented work, and, of course, for a large variety of research and development projects. Government employees had long used paper and pencil, slide rules, and calculators, so when computers and, most specifically, microcomputers became available, there was no lack of projects one could envision. What made the computer attractive as a tool was its ability
Table 6.2 U.S. Government Computer Modeling Applications, circa 1985

Economic research (analysis of farm produce, world food supplies, trade policy, forecasting supply and demand)
Natural resource management (timber resource allocation, fire management, road designs, oil and gas lease management)
Military uses (impact of defense spending on U.S. economy, strategic defense, force mobility modeling, war planning)
Health and human services (modeling of social security and welfare programs and options)
Emergency management (mobilization in event of nuclear war, earthquake damage, strategic stockpiling policy development, economic impact of disasters)

Source: Office of Technology Assessment, Federal Government Information Technology: Management, Security, and Congressional Oversight, OTA-CIT-297 (Washington, D.C.: U.S. Government Printing Office, February 1986): 110.
to perform large numbers of calculations accurately, quickly, and repeatedly. The growing availability of commercial software packages in the 1980s simply facilitated further reliance on the digital hand. The first survey conducted by the federal government on this issue (mid-1980s) suggested that over 80 percent of all agencies used spreadsheet software, such as Lotus 1-2-3 or VisiCalc. Nearly 50 percent reported using such quantitative decision analytic techniques as linear programming, queuing analysis, and critical path analysis. Roughly a fourth used forecasting techniques and software packages, such as for regression analysis. A similar number relied on quantitative decision analytic techniques that included judgmental input, such as the use of decision trees and subjective probability methods. (Expert systems were added in the 1990s.) The same survey pointed out that about 10 percent of all agencies intended to adopt these various digital tools in the near future. The surveyors also concluded that their data undoubtedly understated the use of computers to do this kind of work.25 The point of going through this litany of deployment is to suggest that the use of analytical tools based on computing became part of how government agencies did their work by the end of the 1980s. An even larger set of applications of the digital involved the collection, use, and dissemination of information, paper reports, and forms across the federal government. The federal government was reputed to be the largest publisher in the world; certainly its printer, the Government Printing Office (GPO), was the biggest publisher throughout the second half of the twentieth century. Additionally, as discussed in chapter 2, the IRS alone produced massive quantities of documents and forms, costing the American public billions of dollars to read, fill out, and file. The government had collectively recognized the need to constrain the flow of paperwork while making information more readily available to both the public and its own employees. All through this period, public officials saw the potential of computers to assist on all fronts. While various attempts were made to
rectify the situation in the 1950s, 1960s, and 1970s, and the issue became a major point of emphasis of the Clinton administration in the 1990s, it was in the 1980s that a combination of cost-effective computing and the accumulation of prerequisite debate, study, and skills led to tangible efforts addressing the broad issue of managing information and paperwork. Moreover, by the 1970s an increasing amount of the government’s information was beginning to appear in electronic form, not just on paper, raising questions about how employees and the public could gain practical access to it. As the amount of information and paper increased in the 1960s and 1970s, officials became aware of the growing problems they presented. As use of computing spread rapidly (1960s and 1970s), officials and Congress sought to stem escalating costs associated with this class of work. Their efforts resulted, for example, in passage of the Brooks Act in 1966. But on the application side—information and paperwork—one of the most seminal steps taken was passage of the Paperwork Reduction Act in 1980. While this was not the first time Congress had legislated on the use of information—it had done so over the course of nearly a century and a half—legislators designed this law to reduce the burden on both public and private sectors of processing information required by law and by the daily work of agencies. It expanded and invigorated various federal activities focusing on the management of government information. It aimed to reduce the burden of paperwork on businesses, individuals, and state and local governments.26 The details concerning the specific roles of agencies, such as the GSA and OMB, while part of the law, need not detain us. What is important to recognize is that it stimulated a surge of actions across the federal government to conform to the law’s terms that has continued right into the new century. As part of the effort to reduce paperwork and make information more readily available, agencies turned increasingly to computers for help and paid more attention to the management of IT, as we saw with the examples of DoD, the IRS, and the smaller agencies of the FBI, SSA, and Bureau of the Census. One report on the actions of the Reagan administration summarized a whole new class of use for computing that took place in the 1980s: “the Executive Branch is tightening its procedures and policies, letting OMB direct the management of information resources, and expanded the collecting and reporting of information in electronic or magnetic tape form while curtailing publication in paper copy.”27 The Congress saw these actions as a critical collection of initiatives in support of reducing the deficit of the federal government and lowering the operating costs of agencies on a relatively permanent basis. In short, as had been the case in how most computer applications had been justified during prior decades, the hunt for less expensive ways of doing the work of government played a prominent role in public administration during the 1980s. By mid-decade, some 40 percent of all federal agencies had established ways to deliver or make available information to employees and the public in some electronic form, albeit in most cases in modest fashion, far more limited than what became prevalent after the wide integration of the Internet into the daily work of nearly all government agencies in the late 1990s. The first major application in response to the law expanded use of what quickly became known as
e-mail. In one survey of 118 agencies, 47 reported using this application by mid-decade to circulate “press releases, bulletins, notices, and short reports, and the use of computer tapes for distribution of statistical databases and reports.”28 Departments making available information housed in their electronic databases included Agriculture, Commerce, Energy, Health and Human Services, Interior, Justice, Labor, and Transportation.29 By the late 1980s, policies, programs, audits, and studies of progress documented a substantial increase in the use of computing to digitize information and its dissemination.30 Many functions and all agencies relied on computing in one fashion or another by the mid-1980s. In addition to the kinds of uses described above, agencies had to a large extent developed specific software tools tailored to the needs of their organizations; in other words, they had written or purchased software tailored to their needs and not just acquired generic software tools (such as spreadsheets and word processors) that could be used across agencies. Figure 6.1, put together in the mid-1980s by IBM, catalogs the variety of digital applications already available in the federal government. Some of the uses listed are generic in that they appeared in multiple agencies, such as training, but in almost all cases they were highly tailored to the needs of specific organizations. Many were developed by federal employees in IT organizations, while a growing number also came from software vendors, or contractors who wrote software for hire. Budget data from the period provide additional evidence of the extent to which the digital hand played a role in governmental activities. As in earlier years, the data demonstrate that from the early 1980s into the early 1990s (a period in which the federal government focused on cutting expenses on domestic and “backroom” operations while investing in military systems to outspend the Soviets, a contest that ended with the collapse of the Cold War at the end of the 1980s), expenditures on IT continued rising both in dollar amounts and as a proportion of the federal budget. To put a fine point on the matter, in 1982, IT expenditures on nonclassified systems, software, and data processing personnel took up 1.23 percent of the federal budget, much in line with what companies spent in many private sector industries. As a proportion of the budget, this percentage increased to 1.70 by 1993. Total budgetary obligations (the measure that includes discretionary spending, mandatory programs such as Social Security, and net payments on the federal debt) for IT nearly doubled in dollar terms, from $9.1 billion in 1982 to over $16.1 billion in 1991, with projections of $17.2 billion for just two years later. In real dollar terms, between 1982 and 1991, expenditures grew at an annual rate of 6.5 percent, a bit less than in the private sector but nonetheless an impressive rate. If we look at IT as a percent of the operating budget (the total budget of the government excluding, for example, mandatory spending and debt servicing), IT consumed 3.4 percent annually in 1982 and 5.4 percent a decade later (1992). IT budgets grew faster in this period than the overall budget of the federal government.31 Expenditures in the early to mid-1990s showed a nominal increase to roughly $26.5 billion by mid-decade, largely driven by civilian agencies, since the largest user of IT, DoD, actually leveled off its expenditures in this period.32
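The compound-rate arithmetic behind the budget figures just cited can be checked with a few lines of Python. Note that this simple check uses the nominal dollar amounts given in the text, whereas the 6.5 percent figure the text cites is labeled as being in real dollars, so the near agreement should be read as illustrative only.

    # Compound annual growth of federal IT obligations, 1982-1991 (nominal dollars).
    v_1982 = 9.1e9    # dollars, per the text
    v_1991 = 16.1e9   # dollars, per the text
    years = 1991 - 1982
    rate = (v_1991 / v_1982) ** (1.0 / years) - 1.0
    print(f"Compound growth, 1982-1991: {rate:.1%} per year")  # about 6.5%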
Figure 6.1 Federal uses of computers, mid-1980s. (Courtesy IBM Archives)
By the mid-1990s, however, IT expenditures as a percent of the total federal budget amounted to about 1.8 percent, while for the operating budget they had expanded to 6 percent.33 Concurrently, utilization of telecommunications across the federal government continued to expand. GSA had leased equipment to agencies and departments for years, but the costs and complexity of various networks had increased, so in 1988 GSA awarded AT&T and U.S. Sprint Communications contracts for supplying telecommunications to the government. The contract was estimated to be worth some $25 billion over the next decade, making this expenditure one of the largest for communications in the American economy. This more integrated telecommunications network was named the Federal Telecommunications System (FTS 2000).
It provided a combination of voice, data services, and video at higher speeds and capacity than available before, which made possible a national e-mail network using personal computers and two-way video conferencing. This initiative set the federal government on a path to a near state-of-the-art telecommunications network by the end of the decade, one in place by the time the Internet became a major focal point for public officials. It replaced a network that had been in place since 1963 and that provided the government with long-distance telephone service, with voice the primary application. The new system recognized data transmission as an equally voluminous and important application and, of course, transmitted data faster than before. Finally, the new system provided digital transmission and began the process of retiring an all-analog network.34 Before discussing the role of the Internet—a major topic of interest to the Clinton administration in the 1990s—we need to acknowledge that the digital hand played other roles in the government. In short, the history of the 1990s is not just about the Internet in government. For one thing, at the moment when use of the Internet was growing rapidly across the American landscape, the federal government was spending over $27 billion a year on IT; only a tiny portion of that sum went for Internet activities. Earlier concerns about continuing to improve governmental operations also resulted in passage of the Information Technology Management Reform Act in 1996 and repeal of the Brooks Act as part of a multiyear, ongoing process of reforms and transformations. To put this legislation in broad context, these legislative actions were only two out of the hundreds passed by the Congress dealing with information and technology; over 300 such laws were passed between 1977 and 1990 alone.35 Applications of the digital implemented in prior decades continued to be used in the 1990s. The GSA reported that in 1995, the government had over 30,000 computers, not including microcomputers, microprocessors, or high-performance workstations. Every agency used computers of various sizes and capabilities. Stand-alone systems dedicated to specific agency needs or to engineering, using Digital Equipment, Wang, and IBM products, for example, alone accounted for nearly 8,000 systems. Large mainframes of the 1970s and 1980s operated in every cabinet-level department and in many agencies, accounting for another 700 systems. These large systems were most widely used at DoD, NASA, the Department of the Treasury, and at very large agencies, or in those that needed extensive computing, such as the Department of Energy. The first three organizations just mentioned alone accounted for over 560 systems, indicative of where some of the largest and oldest uses of computing applications resided in the government.36 Agencies and departments continued to enhance existing systems. For example, the White House upgraded its e-mail software and added an online resume system, while various agencies enhanced theirs to combat fraud in their welfare programs.37 All during the 1990s and into the early years of the new century, data mining applications became extremely popular new uses of IT, with over half of all government organizations exploiting software tools in support of this expanding use of computing. Widely used data mining included efforts to
improve services or performance; to detect various forms of fraud, waste, abuse, and criminal activities; in support of scientific and other research activities; increasingly for the management of human resources; after 9/11 for analyzing intelligence data for detecting terrorist activities; and, in general, for understanding patterns of behavior, whether of criminals, land use, or taxpayers.38 The GSA continued to develop methods for agencies to acquire, install, and operate IT all through the 1990s, and its reports for these years reflect much of what one expected IT departments in private industry to work on all through the half century.39 All federal agencies and departments focused on becoming Y2K compliant in the late 1990s, a massive effort historians will someday need to document, because it consumed a large amount of resources of all kinds (technological, human, managerial, and budgetary).40 The OMB continued to pressure agencies to lower operating costs by leveraging use of IT—a major theme all through the half century—while the GAO critiqued the effectiveness of existing IT managerial practices. During the Clinton years, agencies were also encouraged, indeed ordered, to improve the quality of their services to citizens and to peers within the government, part of the Reinventing Government initiative of this administration. This latter effort had simultaneously been adopted by many governments in Europe, Asia, and in parts of Latin America, representing a global trend extending far beyond the deployment of e-government (Internet-based services).41 In January 2001, the Comptroller General of the United States, David M. Walker, who headed GAO, sent to the Congress a broad assessment of federal managerial challenges and opportunities. While many of the report’s concerns were swept away in the aftermath of the 9/11 attacks on the nation, his assessment, nonetheless, provided a snapshot of practices and thinking as of the late 1990s. Perhaps the most significant was the shift that had taken place in the 1990s toward implementing the concept of performance-based management, that is to say, the notion that departments should do their work in compliance with targets that measured the effects of their activities on citizens, not merely to conform to budgetary targets or internal priorities. This shift in thinking—which also began to appear in many European governments and soon in the European Union’s own managerial practices—held out the promise that new IT applications would be created in response. At a minimum, the shift called for new reports on performance, relying on data mining and on new accounting systems, many of which had yet to be developed, a process started during the Y2K initiative that led to installation of some new systems. Walker urged the Congress to leverage that work: “It is critical that the momentum generated by the government’s Year 2000 efforts not be lost and that the lessons learned be considered in addressing other pressing IT challenges.”42 Specific problems facing government’s use of IT mirrored those of earlier decades, with the exception that he added “electronic government” to the list:

• Improving the collection, use, and dissemination of government information
• Pursuing opportunities for electronic government
• Constructing sound enterprise architectures
• Fostering mature systems acquisition, development, and operational practices
• Ensuring effective agency IT investment practices
• Developing IT human capital strategies43
During the Bush administration, and particularly in the aftermath of 9/11, existing uses of computing continued as before. The one major new element was an increased emphasis on using computers in support of intelligence gathering, as we saw in the case of the FBI. The creation of the large Department of Homeland Security by consolidating twenty-two agencies did not lead to the kinds of radical changes in digital applications recommended by the 9/11 Commission during the first George W. Bush administration. However, collaboration increased among airlines, the Transportation Security Administration (TSA), and various intelligence and law enforcement agencies. One major process started during the late 1980s continued all through the years of the Clinton administration, extending right into the new century. In 1990, the CFO Act passed, stipulating a variety of accounting practices for all of government, essentially calling for the kind of accounting and financial practices deployed in the private sector: establishment of a CFO function in each department, better accounting systems (many of the existing ones dated to the 1950s and 1960s), and the requirement that all departments pass financial audits, all done to improve financial management. During the last two decades of the century, agencies quietly went about either improving their internal accounting and financial IT applications or running into bluntly written GAO criticisms of failure, as experienced by DoD. The administration of the first President Bush strongly recognized and embraced the importance of this law, resulting in initiatives all over the government. In a rare statement of positive results, GAO commented that “financial management systems and internal control [sic] have been strengthened” by late 2005.44 It reported that eighteen of twenty-four CFOs had recently passed their audits, a threefold increase from 1996, when only six passed. Passing agencies also generated data, called “accountability reports,” that extended beyond just financial issues.45 The GAO proposed, however, that many agencies and departments still needed to introduce a new generation of software applications in support of accounting (particularly for cost accounting), finance, and general accountability. Expenditures for IT in the early 2000s went more for maintenance of existing applications—by then a nearly $60 billion annual expense—than for new uses of computing. Leaving aside for the moment the large new supply of Internet-based uses of computing, discussed below, the larger concern of public officials in the early 2000s lay more with the management of existing tools than with adding new ones, with the exception of the intelligence gathering required for the “war on terror.” Passage of the Information Technology Management Reform Act in 1996 created the role of chief information officer (CIO) for each agency, yet another effort to manage more effectively what had long been a massive expenditure on IT.46 A half dozen agencies dominated these
expenditures. DoD accounted for $28.5 billion in 2004 and increased its expenditures the following year. Upon its creation, the Department of Homeland Security instantly became one of the largest users of IT in the American government and, for that matter, in the nation, with expenditures in 2004 reaching nearly $4.8 billion, a figure that grew by over 15 percent the following year. A few other agencies, in descending order of consumption by fiscal 2004, included the Department of Health and Human Services ($4.6 billion), Department of Treasury ($2.8 billion), Department of Energy ($2.6 billion), and Department of Transportation ($2.5 billion). Combined, these half dozen departments spent over two-thirds of the entire federal IT budget; yet it was not uncommon for many other cabinet-level departments to spend annually between $30 and $400 million. The Social Security Administration in 2004 alone spent $868 million, while the Department of State expended nearly as much ($857 million).47 The conclusion to draw is that, if anything, reliance on IT had actually increased right into the new century.
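As a rough consistency check of the “over two-thirds” claim, the figures cited above can be summed in a few lines of Python. The government-wide total used here is an assumption based on the roughly $60 billion annual figure mentioned earlier in this chapter, so the resulting share is indicative only.

    # Rough check of the six departments' share of federal IT spending, FY2004.
    budgets_billion = {            # $ billions, per the figures cited in the text
        "DoD": 28.5, "DHS": 4.8, "HHS": 4.6,
        "Treasury": 2.8, "Energy": 2.6, "Transportation": 2.5,
    }
    six_total = sum(budgets_billion.values())   # 45.8
    assumed_total = 60.0                        # assumed government-wide total ($B)
    print(f"Six departments: ${six_total:.1f}B, "
          f"about {six_total / assumed_total:.0%} of an assumed ${assumed_total:.0f}B total")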
The Internet and Federal Government since 1993

The period of the Clinton administration (1993–2000) paralleled the explosive growth in the adoption of the Internet by the nation as a whole. Twin developments in the 1990s resulted in the deployment of new applications of IT by the federal government that mimicked in volume and transformative effects the use of computing in the 1960s: the renewed use of IT as a strategic tool to improve the efficiency and effectiveness of government operations, and the rapid and extensive deployment of the Internet by government agencies and departments. The Clinton administration came into office in 1993 with a near missionary zeal to transform government, making it more accountable for its performance, increasingly transparent in reporting its results, and responsive to citizens. Vice President Al Gore was put in charge of the new initiative at a time when Quality Management had already swept through the private sector as a new mantra for modern management, practices soon accompanied by the “process reengineering” initiatives of the early 1990s. An early area of focus was the acceleration of initiatives to reduce the amount of “paperwork” in government, including simplifying regulations and forms. Paperwork reduction acts were passed in 1995 and 1998, while the OMB and the vice president pressed for substantial reforms in how agencies did their work.48 To monitor progress, within months of taking office the new administration announced the National Performance Review (NPR) process, an initiative led by Vice President Gore to review performance of agencies as the government shifted its culture “from complacency and entitlement” to one of “initiative and empowerment.” The NPR would be used to “reinvent government.”49 President Clinton directed his vice president “to redesign, to reinvent, to reinvigorate the entire National Government.”50 Set against that broad policy background was the administration’s interest in stimulating national economic development through the use of IT and in deploying technology to improve the operations of government. Its first, and broadest,
IT policy initiative came in a series of statements, initially made within two months of the new administration taking office, that culminated with the introduction of the U.S. National Information Infrastructure (NII) initiative on September 15, 1993.51 It recognized the convergence of IT, telecommunications, and new ways of using computing, and it became the basis for much policy regarding information and process reengineering, and the inspiration for what R&D and infrastructure to invest in and what applications to deploy.52 Yet it also followed a long-standing practice, dating back to World War II, of the federal government stimulating deployment of new technologies across the economy and within its own walls. Early efforts focused on reforming the Telecommunications Industry, replete with passage of the Telecommunications Act of 1996 and a stream of regulations.53 The NII envisioned use of national networks, following again the tradition of the early development and deployment of the Internet in the 1970s and 1980s and building on that prior work. Part of the administration’s initiative involved using its political, economic, and regulatory power to put the Internet into every classroom in America and to facilitate further use of this telecommunications network in the private sector, all at a time when the nation’s appetite for using this technology was growing. One student of the initiative, Brian Kahin, concluded that “as a whole (NII) has succeeded in focusing public attention on the transformative potential of information technology and networks and the need to develop a deeper understanding of their social, economic, and policy implications.”54 Writing in 1996, he also observed that NII “provided a useful framework for communications among federal agencies with diverse charters and perspectives.”55 The effect on the national economy and its citizens has been discussed elsewhere; less understood is what occurred within the federal government.56 At the heart of what happened in the 1990s was the emergence of a new way of looking at the work of government, called electronic government. Initially, the notion was a fuzzy acceptance of the possible use of telecommunications and computing, but as the decade proceeded, it became the symbol of an opportunity, a path, so to speak, toward a transformed government that could serve its citizens better and more cost effectively. The idea quickly took the form of finding better ways to deliver services to the American public. As early as 1993, one government report put the case forward for using IT to accomplish the objective: Information technology—computers, advanced telecommunications, optical disks, and the like—can be used by the Federal Government to deliver services to citizens. Most Americans, if they think about it, can identify at least a few Federal services that affect their lives. These include the:
• 46 million recipients of social security benefits,
• 27 million recipients of food stamps,
• 31 million Medicaid recipients,
• 14 million recipients of aid to families with dependent children,
• 15,000 scientists who receive National Science Foundation research grants each year,
• 20,000 small businesses that receive business loans,
• 600,000 persons participating in job-training programs,
• people and organizations that annually place about 1.6 million orders for a total of 110 million publications from the U.S. Government Printing Office,
• citizens who annually receive a total of 10 million pamphlets from the Consumer Information Center,
• 30,000 or so academic and business researchers who receive research results and technical information each week from the National Technical Information Service, and
• 170,000 citizens who use Federal depository libraries each week.57
The same report, however, stated calmly that "interest in the electronic delivery of Federal Government services (and related State/local services) has mushroomed," with "electronic service delivery closely linked to the 'reinventing government' and 'service to the citizen' movements that started at the State and local levels and have spread to the Federal Government."58 The Clinton administration launched a plethora of initiatives to implement its national IT policies in the 1990s.59 By late 1995, many IT-centric initiatives within the federal government focused on the use of the Internet, for the same reasons then at work in the private sector and already in state and local government. Public officials saw the Internet as a manageable way of meeting many objectives of the NPR: most immediately, to lower the cost of paper (as required by various paper reduction acts), to make it easier and quicker to deliver information to the public at a time when the number of Americans using the Internet grew every month, and to improve the efficiency of internal operations.60 Looking back from 2001 at these early efforts, two observers concluded that "the federal government's efforts over the past decade to put government information on the web, prompted by the Paperwork Reduction Act of 1995 and other legislative mandates, have been energetic, with volumes of information now available to the public through government agency websites," complaining only that conducting transactions over the Net was still in a primitive stage of development.61 The term "electronic government" became "official" public language when, in 1997, the Clinton administration published a report called Access America: Reengineering through Information Technology.62 At the same time, the government was using the phrase "Information Highway" to describe its vision of linking businesses, schools, people, and governments more closely by way of the Internet.63 But how was the federal government using the information highway? To a large extent, it mimicked patterns evident in many industries. Early in the decade, agencies and departments began "putting up" their first-generation Web sites, populating them with information on how to contact them, along with statements about their roles and missions.64 All through the 1990s, they added information that citizens constantly asked of agencies, as we saw with the IRS's publications in chapter 2. By mid-1997, over forty federal agencies had established some 4,300 World Wide Web (WWW) sites.65 Obviously, it was not
uncommon for an agency to have multiple sites and to be constantly replacing them with new ones. Many were specialized, with information intended for narrow audiences. For example—and a very typical illustration of this pattern across the entire government—in 1997 the Department of Agriculture maintained sites devoted to specific regions of the United States, while the Agricultural Research Service hosted over 100 others devoted to specific lines of inquiry (poultry, livestock, various types of plants); still more were run by the National Agricultural Library, the Animal and Plant Health Inspection Service (over a dozen sites), the Cooperative State Research, Education, and Extension Service, and the Forest Service (over three dozen), to mention just a few.66 But these early uses of the Internet by government agencies did not fundamentally alter their daily work. In fact, as late as 1998, the GAO was still trying to explain to agencies what to use the Internet for and what its benefits were.67 In fairness to fin-de-siècle government officials, however, making it easier for citizens to get information was a major accomplishment, since an historically important role of government had always been to create and disseminate information. In less than a decade, the Internet had become the single most important channel of access to federal information for people coming to government for data, and an increasingly important source of information for federal employees with which to perform their duties. As agencies put information out on thousands of Web sites in the second half of the 1990s, standards for implementation and maintenance improved. Turnover in Web sites continued, however, as agencies moved rapidly from one generation of Web site to another in the late 1990s, either in response to changing technology (such as the arrival of software to protect data security and citizen privacy) or as federal IT organizations learned how better to create, populate, and manage Web sites, learning at essentially the same pace as the private sector.68 By the mid-1990s, forty-two federal agencies were tracking expenditures for Internet- and dial-up-based activities, providing us with specific evidence about the already extensive use of the WWW. Between 1994 and 1996, these agencies spent cumulatively about $349 million: $59 million in fiscal year 1994, approximately $100 million in fiscal year 1995, and roughly $190 million in the following year, a testimonial to the fact that even in 1994 the government was already an extensive user of the Internet and, like so many industries, massively increased its investment in this form of telecommunications after the arrival of tools that made access to and use of the World Wide Web convenient. Of the total $349 million spent, roughly $325 million went to Internet-based activities related to the 4,300 Web sites mentioned earlier and to 215 dial-up accounts. The lion's share of the Internet expenses went to establishing Web sites, providing employees with access to them, and maintaining them.69 One of the earliest applications was e-mail for employees, used by 1.7 million of them by 1997, representing approximately 50 percent of all civilian and military employees. Some 31 percent gained direct access to the WWW. The GAO reported that by 1997, "the Internet has [sic] become a valuable and widely used means of communicating and sharing information."70 E-mail with
colleagues, and increasingly with the public, spread in this period. Employees sought professional, scientific, and technical information on the Web, while making increasing amounts of their own data available to the public as well. As the case studies in prior chapters demonstrated, uses of the Web varied widely and early across the government.71 The percentage of employees within agencies and departments with access to e-mail and to the WWW varied enormously at mid-decade, so the 50 percent cited above for e-mail and 31 percent for access to the Web are a bit misleading. Table 6.3 provides a sampling of data for fifteen organizations to demonstrate the breadth of deployment (hence use) by late 1997. Included are the largest employers in government and, when available, data on organizations discussed in earlier chapters, such as the SSA and the Justice Department. It should come as no surprise that, since the adoption of computing had always been a highly decentralized activity, the deployment and use of the Internet were as well. However, the one conclusion that leaps out from the data, and from the cumulative percentages of 50 and 31, is that as a whole the federal government proved to be an aggressive early adopter of the Internet when compared to most industries in the American economy.
Table 6.3
Percentage of Federal Employees with E-mail and WWW Access, 1997

Federal Organization                        % with E-mail   % with WWW Access
Department of Defense                            49.5             34.3
Department of the Treasury                       54.6              7.7
Department of Veterans Affairs                   26.0              4.0
Department of Transportation                     62.6             16.5
Department of the Interior                       83.4             36.3
Department of Agriculture                        50.8             23.7
Department of Health and Human Services          80.6             67.5
Social Security Administration                   40.0              7.7
Environmental Protection Agency                 100.0             40.8
General Services Administration                  80.7             71.0
Department of Justice                             7.9              8.0
Executive Office of the President               100.0            100.0
Federal Communications Commission               100.0            100.0
Federal Deposit Insurance Corporation           100.0            100.0
Federal Emergency Management Agency             100.0            100.0

Source: General Accounting Office, Internet and Electronic Dial-Up Bulletin Boards: Information Reported by Federal Organizations, GAO/GGD-97–86 (Washington, D.C.: U.S. Government Printing Office, June 1997): 35–36.
Between 1997 and the early years of the new century, government agencies added more data to their Web sites, exchanged more information among themselves through both Internet and intranet sites, and began adding transactions, such as the ability for a citizen to order a publication or, as we saw with the IRS, to file tax returns and other forms and make payments. It was in this later period that public officials at local, state, and federal agencies began to speak about "e-government" and "e-business," which, loosely defined, meant the ability to conduct official business electronically, with minimal or no reliance on paper-based documents. Being able to order supplies online at DoD, for example, represented a profoundly different way of managing the acquisition, record keeping, and consumption of budgets, goods, and services. By the end of the century, federal agencies had already started to serve businesses, citizens, employees, and other governments over the Internet. A survey conducted in 2001 documented over 1,300 uses of the Internet in one of these four ways or a combination of the four. While extant data is limited, it appears that a third of these new applications of IT served citizens, 20 percent other agencies, just about 23 percent employees, and 20 percent businesses. For the majority (just over 50 percent), however, dissemination of information was the largest application. As one student of the process put it at the time, "the move to a full e-government is in the early stages," estimating that only 4 percent of these were transformative in nature.72 Yet 34 percent had implemented transaction-based applications on their sites. Ironically, online forms actually represented a smaller use of the Internet than transactions at this time, a phenomenon described in chapter 2 regarding the IRS. The same survey, however, noted that all agencies and departments were now users of the Internet and that some 40 percent focused on serving citizens. Of the 1,200 sites surveyed, some 800 provided information and forms, often with the enthusiastic support of citizens. At that time, well over 50 percent of all residents in the United States had access to the Web.73 Agencies and departments were also creating intranets for their internal use, not to be confused with Internet sites, which were designed for use by individuals from outside an agency, department, or government. Functional portals were developed to deliver single managerial functions, such as human resources and financial tools. Web applications for a single purpose continued to be very widely used, such as for processing a new hire or for making travel arrangements. "Fat portals," used to house complex, multifunctional, and enterprise-wide functions, were just coming online; these began to make it possible for employees to have personalized work environments. "Thin portals" also existed for the purpose of making available an agency's information and links to other intranet sites.74 The surge in use of the Internet and intranets continued unabated until the attacks of 9/11 in 2001 when, for security reasons, departments and agencies began to question the wisdom of keeping certain types of information on their Web sites, such as organization charts and some names and addresses. These concerns dated back to at least 2000 but now became more urgent. One of the key recommendations from the 9/11 Commission was to integrate various sources of data in support of what became the Department of Homeland Security. While many of its IT activities remained shrouded in secrecy as of this writing (2007), some evidence from the GAO, for example, suggested
that agencies were collaborating more than in the past, including departments such as Agriculture, Treasury, and Health and Human Services.75 Hackers had also been a source of problems since the late 1990s, while concerns about the privacy of information on individual citizens had accompanied the development and use of the Web across the entire economy and were thus not limited to federal uses of the Net.76 As of late 2002, these issues commingled with a large body of existing uses of the Internet accessible to federal employees and the public. All sites provided basic information about an agency or department; over 95 percent posted documents; already half made available downloadable and interactive forms. Roughly a fourth had started to provide multimedia applications as well. Despite all the trade press discussions about e-commerce, just over 10 percent of government Web sites had applications of this type. By that time, the majority (85 percent) provided information on how to apply for employment, and a similar share showcased publications (81 percent). Half posted statistics and information about obtaining contracts. A recent development, evident on two-thirds of federal Web sites, was the ability to contact an agency through e-mail directly from the agency's site, while almost all of them now provided search functions or site maps.77 By then, the premier portal to federal Web sites was FirstGov.gov. It became operational in September 2000 to help citizens access federal information by service rather than by agency. One student of federal Web sites concluded that "FirstGov serves as an efficient, effective gateway into the full range of federal information and services," with some 51 million pages of information on 2,000 government Web sites by late 2002.78 A new generation of FirstGov was implemented in 2007 and renamed USA.gov, six years after the first government-wide portal had gone into use.
Conclusions

By looking at the government as a whole, several patterns of use, deployment, and effects become evident. As earlier chapters illustrated with specific departments and agencies, the federal government demonstrated a continuous appetite for information technologies for over a half century. The motivations for relying on the digital hand came largely out of desires to lower operating costs and the amount of labor required to perform work. Agencies and departments, however, also proved quite reluctant to alter fundamental aspects of their operations as a consequence of using IT, such as their missions, work processes, and measures of accountability for results. Over time, their increased use of IT ultimately did cause incremental changes in how work was done. As we saw with the SSA and IRS, these changes encouraged Congress to change missions and work, because the availability of digital tools made it possible to do things more cheaply, faster, or better, or simply to do something new. Yet the fundamental structures of government proved quite resistant to change, in contrast to what happened in the private sector, despite a clear historical
record of extensively using computing and telecommunications. As recently as 2004, one highly regarded expert on the effects of IT on organizations, Don Tapscott, bluntly stated that "unfortunately, many private sector innovations have yet to be embraced by the public sector."79 He noted, however, that there was growing recognition within governments in developed economies, including the American government, of the need for fundamental transformation. The reasons for that recognition have less to do with the functional value of IT than with other influences: private sector organizations claiming roles historically resident in the public sector (such as private package delivery services); the undermining of government credibility, legitimacy, and even relevance by a myriad of issues ranging from badly managed activities (such as the ineffective support of Hurricane Katrina's victims or ballooning budget deficits during the George W. Bush administration) to an emerging sense of citizen empowerment (such as the civilian groups patrolling the U.S.-Mexican border); and the growing necessity of cooperation among governments in sustaining security.80 Nonetheless, federal agencies found many practical ways to use information technology, particularly in support of the collection, analysis, and dissemination of data in all decades and, since the late 1990s, in conducting transactions with other agencies and suppliers and interacting with citizens over the Internet. For those applications, government's pattern of adoption and use was the same in the era of the Internet as in the late 1950s and early 1960s, when it first installed large mainframe-based systems. The cadence of change in applications and base technologies proved more a function of the availability of budgets and congressionally mandated changes than a result of the attractiveness of some new technology over an earlier one. The one general exception to this pattern was the development of weapons systems across departments and agencies, which, while slowly implemented, nonetheless consistently leveraged innovations in technology, even supporting the R&D to create new tools and weapons (such as through DARPA and the NSF). In that instance, military strategy, tactics, and to a certain extent organizations transformed as a direct consequence of new technological capabilities. The record is quite clear that federal agencies became as aggressive in adopting the Internet as any industry in the private sector. Part of the reason, of course, was the convenience provided by the new technology. But a series of mandates from Congress, and an enthusiastic supporter of the Internet in the form of the Clinton administration, which remained in power for eight years, did more to motivate agencies and departments to take advantage of the Internet as their primary vehicle for reducing paperwork than the convenience of some new technology. The fact that the Internet could be implemented in small incremental phases, unlike the large national, normally centralized (or regionally concentrated) applications of the digital hand, also proved quite attractive to senior public administrators all over the government. Disastrous or faulty implementations of Web sites and new e-based services proved fewer than for some of the major systems of earlier times. However, as we saw with the IRS, reduction of complexity in government was not a significant by-product of
the Internet’s deployment. The government still faced problems with large systems at the dawn of the new century, such as the FBI’s virtual case project and the FAA’s perennial attempt to manage better flight traffic in the United States.81 In contrast, there were fewer negative headlines about failed Internet projects. Selection of uses, deployment, and later changes to reflect new technologies, missions, and budget realities proved to be highly decentralized activities. To be sure, attempts were made to have some central authority responsible for setting standards, publishing guidelines, and so forth throughout the period by Congress, the White House, and specific agencies. In this regard, the GAO, OMB, and the National Bureau of Standards (NBS) played various roles over the decades. It is difficult to conclude that their efforts proved highly effective, but they clearly played support roles because, while agencies and departments remained fundamentally in charge of selecting how to use IT within their organizations, they increasingly bent to government-wide standards and practices regarding the way IT was acquired, used, and accounted for. Decentralization contributed mightily to the concurrent and extensive deployment of IT all over the government, even creating best practices that one agency could learn about from another, as occurred in the early years of the mainframe when the Bureau of the Census served as an advisor to other agencies and decades later organizations with awardwinning Internet sites became advisors to other parts of the government. For over a decade now, there has existed an intense discussion about e-government. Overwhelmingly, participants in the debate have acknowledged that their visions of government agencies relying extensively on the Internet and other forms of IT were collectively states of being yet to be implemented. Furthermore, a reading of annual reports of various agencies and departments would almost lead one to believe that this future state had arrived.82 To be sure, a great deal had been done, particularly through use of the Internet. A group of experts on the role of IT in the federal government declared as much: “Within a brief period of time—largely, the final decade of the twentieth century—the application of new IT to the performance of federal responsibilities, with a view to improving the efficiency and economy of government operations, produced e-government.”83 To be sure, any examination of all the laws and executive mandates from the White House that appeared in the 1990s and early years of the new decade clearly demonstrated that senior officials were perhaps as interested in creating new structures and roles made possible by IT as evident in any earlier decade.84 During the first term of the Bush administration, that is to say, after 2001, signs emerged suggesting that new ways of justifying the acquisition of IT were being implemented, building on initiatives of earlier administrations that, if successful, might provide government agencies with the necessary incentives to transform to the degree already evident in the private sector. Specifically, the Executive Branch encouraged agencies to present formal business cases for adoption of new applications. Thus, for instance, the Department of Homeland Security’s request for funding to consolidate multiple databases and applications into more cohesive ones in support of national security led the White House to
endorse DHS's request for $884 million for this project. Implementation would result in organizational changes as well. Other business cases began to appear that talked about reorganizations, outcomes, and cost avoidance. The overall IT budget for the federal government was projected to increase by some 7 percent in 2006, distributed in a familiar way: 46 percent to DoD, 19 percent to DHS, and the rest to all other departments.85 Whether the government was entering a period of important transformation such as that experienced by the private sector in the 1990s and early 2000s remained to be seen. But ultimately, we have to answer the question, how much of an effect has the Internet had so far on the operations of government? The rhetoric about its successes and failures is loud and voluminous, while attention to non-Internet-based uses of IT remains quiet and in the background. One of the original objectives for embracing the Internet was to make a large body of information available to the public; a second, later objective was to reduce the amount of labor required to conduct business with the public by having citizens seek out their own information and file applications and other forms with minimal involvement of government employees. Regarding the first objective, success was clearly achieved, with many government Web sites and portals among the most extensively visited in the world, not just in the United States. Traffic to federal sites was high early and remained so. More than just being well designed, these Internet sites were implemented in a nation comfortable in its use of all manner of information technology, most recently the PC, and at a time when the percentage of the population using the Internet was high and growing faster than in any other nation in the world. The fact that information, literature, and forms were normally free probably also helped to encourage traffic to Web sites. If there is a surprise in this finding, it is the speed with which citizens began using federal Web sites. As they had done with retailers, citizens pushed for access to Internet sites by voting with their dollars and their log-ons. With regard to the second objective of offloading to the Internet work with citizens that otherwise would have been done by federal employees, here, too, we see substantial growth in use of the Internet, although the evidence of change is less definitive because this is a phenomenon more of the period after 2000 than before. Darrell M. West, who has studied the federal government's use of the Internet extensively, also observed that the Internet provided only "limited" transformation, reaching the same conclusions I have about the incremental adoption of IT in federal, state, and local governments over the past half century, not just regarding use of the Internet. West is absolutely correct in arguing that changes came slowly because that is the way of political thinking, and because most uses of IT tend to reinforce existing circumstances, at least until the technology has been in place for some time.86 His—and for that matter, my—observations, however, are as applicable to pre-Internet uses of technology as to developments that came after arrival of the Web. That is why, for example, we can argue that all the signs point to another technological success slowly in the making. So what we have is the citizenry at least transformed, now in the habit of increasingly using federal Web sites. A similar conclusion can be
reached about the internal use of intranets by federal employees. Extant evidence leads to the conclusion that usage is rising but varies from agency to agency, and not much beyond that fact. American soldiers in Iraq were comfortable sending e-mails home. The SSA saw transaction traffic rise only after the boomer generation began visiting its Web site, although an increasing number of senior citizens are now comfortable using the Internet. With regard to structural and cultural changes in the government, the Internet remains an alternative channel of communication and way of doing work, but not to the extent that we can conclude that the federal government works in a digital style, as we concluded is now the case in so many industries. The first "computer revolution" in government took place with the initial adoption of computing from the late 1950s through the early 1970s. It was a slow and painful process that did not fundamentally change the structure of organizations or institutional cultures, and only partially altered missions and roles. So we are left with the question: since so much of what is now occurring with the Internet in government parallels that prior experience, how much "reengineering of government" can we expect? The question calls for a prediction, and that lies outside the scope of this book. However, evidence from governments in other countries that have used the Internet more extensively to deliver services to their citizens suggests that there will be changes directly attributable to use of the Internet, though these are as likely to be a consequence of a whole generation of employees retiring by 2015 as of the technology itself.87 As exciting as the prospects of transformed federal agencies and departments are, the experience of local, county, and state agencies proved more dramatic and far-reaching, particularly during the last decade of the twentieth century. For that reason, we now shift attention away from the federal government and to local public administration over the next several chapters. The story shares with the federal experience such common traits as the hunt for increased productivity through use of IT and decentralized decision making about the acquisition of computing and telecommunications. It is also a story of smaller orders of magnitude, a circumstance that itself had interesting consequences.
7

Digital Applications in State, County, and Local Governments

We're using information technology to support and enhance the core functions of Michigan government and to position our state as a global economic powerhouse in the 21st century.
—Governor Jennifer M. Granholm, 2004
Governments at the state, county, and municipal level consist of tens of thousands of organizations. There are fifty states, many with nearly 100 counties each, and almost every state has thousands of towns and cities, from little hamlets with hundreds of residents to large cities with populations in the millions. But remarkably, they all share a broad collection of roles, ranging from public safety and law enforcement to providing education, water, and sewer services. Each one is also responsible for economic development, protecting the environment, managing government within the democratic framework spelled out in the U.S. Constitution, and preserving "quality of life" at levels expected by citizens. Collectively, these three sets of governments comprise a major segment of the nation's economy and society. Perhaps it should be of no surprise that, as a collection of governments, they sought to use the same digital tools all other industries did across the American scene. This chapter tells the story of what they did with the digital hand, how, and why. It addresses the same questions put to other agencies earlier in this book. There are issues of timing to address, such as why large cities deployed IT sooner than small towns, and why the cadence of adoption varied from one type of government to another due to the cost performance and technological
evolution of IT hardware and software. It is a big, complicated, and diverse story, but one that reflects many of the patterns of behavior evident in some other public agencies and in so many private industries. The links become obvious and specific as the story unfolds.
State Governments and the Digital Hand

The period immediately following World War II began with forty-eight states and, by the end of the 1950s, the nation had grown to fifty with the addition of Alaska and Hawaii. In addition, there was the Commonwealth of Puerto Rico, which functioned much like a state, since many of its government's activities were the same as those carried out by states. States operated relatively independently of each other, making their own decisions about what uses of computing and telecommunications to implement. The one fundamental exception to this pattern of behavior occurred whenever the federal government injected itself into local activities, either by funding specific projects, such as the use of computing in law enforcement, or by mandating or supporting specific initiatives. Two important examples of the latter included the myriad welfare programs begun in the 1960s, briefly discussed earlier in regard to the role of the Social Security Administration, and the military services, because of the role of the National Guard. States also looked to each other for examples of how to use IT, and their technical staffs often talked, just as whole functions shared information and communicated, most notably officials responsible for state prison systems, sheriffs, state police, and taxing authorities. States varied in the size of their geographic footprint and in the number of citizens they had to serve. Some were highly urbanized, others overwhelmingly rural, and most comprised a mixture of the two. So, generalizing about what states did with computing and telecommunications remains a tenuous proposition; nonetheless, navigating their broad variety of circumstances is both possible and necessary, because states mimicked each other and learned lessons from peer governments. The first issue to discuss concerns what, in general, states used computers for in the performance of their work. Various students of the issue have developed typologies and lists; two conveniently reflect the clusters of applications. To understand them, we should recognize that over time states implemented a wide variety of uses of technology in support of preexisting and, later, changing operational responsibilities, just as had agencies of the federal government. Figure 7.1, put together in the mid-1980s to educate IBM sales personnel calling on state and local governments, suggests the breadth of functional areas in which deployment of computing had already begun. Ignore the dots around the circle indicating the functions for which IBM had software products, because states obtained their software from many vendors, not just IBM, and also developed many of their own software tools, just as did the federal government.
Figure 7.1 State and local uses of computers, mid-1980s. (Courtesy IBM Archives)
Within each major category of activity—such as welfare—one can see that state and local governments had already started to equip nearly a dozen clusters of activities with computers and telecommunications. Many of the tasks performed by cities were similar to those of states. For example, each maintained roads, courts, and law enforcement; did accounting, budgeting, and finance; and had welfare, public safety, and education roles. An IT industry research firm, Gartner, created an equally useful list of applications late in the century that, while less detailed than the first one, was quite similar.
Table 7.1
IT Market Segments within State Governments, circa Late 1990s

Administration and finance      Criminal justice
Transportation                  Natural resources/environment
Public safety                   Public works
Human services                  Others
Health

Source: Gartner, Inc., Trends in U.S. State and Local Governments: Market Trends (Stamford, Conn.: Gartner, Inc., March 19, 2002): 4.
Table 7.1 is a simple list of the clusters of applications that Gartner tracked by the end of the century, which by then were market segments for software and hardware firms competing to sell products into those areas of state government.1 To a large extent, the story of computing in state governments is a tale of the adoption of computing into these various parts of state government over time, coupled with a subsequent, often concurrent move toward state-wide integrated systems and centralized management of IT. As occurred in the federal government, state agencies normally acquired systems independently of each other. As costs for independent IT operations rose, governors and legislatures sought to harvest the benefits of economies of scale by creating state-wide data centers used by multiple agencies and by distributing, or centralizing, computing over time as the ebb and flow of technological innovations suggested new opportunities for cost containment or improved services. However, there were differences between federal and state applications, particularly in the era before wide use of the Internet. Most notably, states, like county and municipal governments, interacted with citizens more frequently than federal agencies and departments, literally face to face, much as occurred in the banking and retail industries. The federal government interacted with citizens more from afar, indirectly or by mail, with less person-to-person contact, reflecting patterns evident in the brokerage and insurance industries. Of course, there were many exceptions to this generalization; even so, it remains true that many of the uses of computing at the state and local level differed substantially from federal applications. One brief example illustrates the point. Millions of citizens interacted with state employees each year to obtain a driver's or fishing license. On the other hand, it was always possible for an adult resident in the United States to live for many years without having to deal with a federal official face to face. It was impossible, however, to avoid that kind of contact at the state level, let alone at the municipal level, although the Internet was beginning to alter that circumstance in the new century. When the Internet became a viable tool for governments to use, states had decades of experience deploying computer-based aids in support of their dealings with the public, and thus, more than the federal government, they relied on this technology in more interactive ways to deal with their citizens. In short, we can consider how states used digital tools to be a variant of the digital style of public administration.
Early Uses of Digital Tools, 1950s–Mid-1990s

The motivations for initially and continuously using computing were, however, no different from those at the federal level. As the nation's population grew along with the role of governments, the desire to avoid adding employees to state payrolls and to lower operating costs in general served as constant managerial priorities throughout the entire period. The same applied to local governments. Larger governments embraced computing sooner than smaller ones for the same reasons as in industry: they could afford the initial high costs of implementation and had more productivity to gain from economies of scale.2 The one major difference from the private sector was that state (and also municipal) governments deployed computing later than either the federal government or many commercial industries. In other words, theirs was a story that began in the 1960s and 1970s, whereas federal agencies and companies began using computers at the start of the 1950s. Most state governments bypassed the first generation of computers (1940s–1950s) and began using the technology when it was in its second or third generation, when the equipment and software had reached levels of price performance states could more readily afford and when functions began to match the needs of various agencies. One proof point illustrates this trend. When IBM introduced its PC in 1981, it attracted a great deal of attention at both the state and local levels, and by the mid-1980s, these machines had been widely deployed across hundreds of state and local agencies. Two decades earlier, by contrast, it had taken well over a decade for second-generation mainframes to become widespread in state government.3 State governments had used precomputer information processing tools for decades. It was not uncommon for them to have adopted IBM's tabulating equipment in the 1920s and 1930s to do accounting, payroll, and other data collection and analysis. They had also become extensive users of smaller "office appliances," such as desktop adding machines and calculators from scores of vendors, in the two decades before World War I.4 Large governments, however, were the first to install computers in the 1950s and 1960s. Some of the earliest uses of computing in the 1950s included California's installation of an IBM 702 in its Department of Employment to perform unemployment insurance accounting. The Illinois Division of Highways acquired a Bendix computer in September 1956 to conduct calculations for highway design, as did the same departments in Georgia and Ohio a few years later. Massachusetts installed a Burroughs system in 1960 to do statistical tax calculations. Work unique to a state also went to computers, such as Alabama's management of docks at the port of Mobile.5 The case of California's employment application of computing illustrated the changes envisioned by public officials at that time. The Department of Employment did everything from helping people find jobs to paying them unemployment insurance, tracking trends, and collecting unemployment taxes from employers. During the 1950s, the state legislature introduced new terms and
conditions for people to qualify for unemployment insurance that required state employees to perform new calculations of the type "if this situation exists, then benefits are xyz," which increased the complexity of determining benefits on a person-by-person basis. Officials had been using IBM tabulating equipment since 1937 and recognized earlier than many of their peers across the nation the potential value of computers to handle ever-growing volumes of employees, employers, beneficiaries, and calculations. They committed to using computers in 1955, installing an IBM 702 in early 1956. By the end of 1957, nearly half of the Department of Employment's old tabulating equipment was gone, and the number of shifts of employees doing back office work had essentially shrunk to one per day. File management evolved from massive quantities of card files to magnetic tape records, much along the lines occurring at both the U.S. Social Security Administration and the U.S. Bureau of the Census. In the case of California, the number of employees needed to do accounting work shrank as management offloaded work to the computer, while stabilizing work streams into predictable collections of activities that could be planned for, and staffed, in an organized manner. Officials realized "substantial savings" in operating costs for the department. They reported that regular, repetitive work running on the IBM 702 outperformed prior systems and practices, and was quicker. Work, however, now had to be more coordinated than in the past to leverage the speed and functionality of the system. The 702 proved far more accurate than earlier labor-based or even small machine-based approaches. In addition, with more data now available in machine-readable form, officials could perform additional analysis of the data more easily and quickly.6 As a result of this early success, other state agencies in California began to adopt computing, such as for vehicle registration, using an IBM 650 system beginning in 1958, and for engineering calculations in road design, one of the most popular early state applications across the nation.7 Illinois, another large state, became an early user of computers at the same time, also extending its use of an IBM 650 to various applications in the Department of Revenue. In fact, Illinois, like California, Michigan, New York, and later Washington, became an extensive user of computing, often deploying new applications of data processing before other states.8 Yet computers were not limited to the upper Midwest or California. Florida became the first state user of an IBM 1401 system in 1961, when it installed two of these systems to do accounting, finance, highway design, tax audits, and payroll processing. Earlier, each of these applications had been done using precomputer systems, including the most advanced electronic calculators of the day, such as an IBM 604 to produce checks.9 But these represented mostly isolated cases of implementation in the mid-1950s. By around 1961, however, almost every state had at least one computer, often used earliest for highway engineering, because in these years the technology was best suited to performing calculations far more rapidly and accurately than preexisting tools. Revenue administration proved slow to computerize, perhaps because even as late as 1958 only seventeen of the thirty-three states that had income taxes even used punched-card equipment for this purpose. In
addition, less than a handful had even started to use computers to assist in welfare management. One study from 1961 characterized the deployment of computers at the state level as "slow," because they were operating in the mode of "experimentation, trial and error and adoption on a piece-meal basis wherever departmental receptivity is greatest."10 The nature of the technology also influenced the types of applications and the rate of adoption in these very early years. One contemporary observer described the situation: "At the time that many states began to consider installing a computer, there was a considerable gap between the most popular small or medium-sized computer and the very large and expensive computers. Therefore, states found themselves facing a situation in which the departmental applications they were considering were a little too large for the smaller ones but much too small for one of the large ones."11 Gaps were filled by the early 1960s with new computers from a handful of vendors, stimulating new demand for computing. This new wave of adoption grew largely out of the shift from large vacuum-tube-based hardware to smaller transistorized systems, which lowered the costs of transitioning from earlier modes of operation, even saving on expenses for air-conditioned, climate-controlled data centers, though not for rewriting application software. Use of computers now began to increase, which is why the story of computing at the state level realistically became important in the 1960s rather than in the previous decade. The first known survey of computing in state governments pegged the number of systems in state agencies at 101 as of 1960.12 A second survey, conducted in 1963, reported that over forty states used 243 systems. By the end of 1964, some 275 were installed. To place the latter statistic in some meaningful context, at the time there were about 22,000 computers installed across all industries in the United States.13 One can conclude that the take-off at the state level had finally started, albeit slowly when compared, for example, to most industries in the private sector or to the federal government. Based on data from the same period, table 7.2 lists some of the key applications and the number of states, out of the forty-three that responded to a survey, using computers in each functional area of public administration. One can immediately see that the priorities were only partially similar to those of the federal government, while many uses were unique to state governments, once again reflecting that systems were deployed in support of existing work flows and long-standing institutional missions. These applications either lent themselves to quick automation, such as payroll, or were funded by the federal government, for instance, employment security and highway design. (Recall that federal support for law enforcement applications did not start until the second half of the 1960s.) All of the applications cited in table 7.2 focused largely on routine record-keeping activities and thus had little effect on the organization of government, and only a modest one on how public employees did their work.
Yet, as occurred in the federal government, state officials would visit peers in other states who had already implemented a new use to find out how to do the same, with the result that change did not vary as much from state to state as in the private sector, where companies within industries varied in objectives, structures, and workflows.14 In short, seeds of future uses were being planted.
Table 7.2
IT Applications in State Governments, circa 1964

Application Area           Applications                                      Number Using
Public works               Highway computation & accounting                       38
Revenue                    Corporate, income, & sales taxes                       26
Finance                    Expenditure & encumbrance accounting, payroll          26
Employment                 Benefits, employer contributions                       20
Motor vehicle              Registration, licensing                                18
Welfare                    Grant computation, check writing                       16
Employee retirement        Contributions, pensions                                15
Health & mental hygiene    Vital statistics, patient billing                      13
Insurance                  Workmen's compensation                                 13
Education                  Scholarships, state aid                                12
Civil service              Exams, eligible lists                                  12
Purchasing                 Inventory, purchase-order writing                      11
Law enforcement            Arrest record keeping                                  10
Conservation               Hunting, fishing, & motorboat licenses                 10
Agriculture                Milk & disease controls                                 7
Equalization               Equalization computation, per capita aid                5
Liquor control             Inventory licensing                                     5

Source: Data in Dennis G. Price and Dennis E. Mulvihill, "The Present and Future Use of Computers in State Government," Public Administration Review 25, no. 2 (June 1965): 144–145.
One final useful statistic is worth pondering: the percentage of state budgets devoted to computing. As of mid-decade, it was approximately 0.4 percent, further evidence of the modest nature of the use of computing at the state level even as many American industries were spending at nearly twice that rate.15 As in the private sector, these early systems were normally housed in accounting, budgeting, or financial centers within state government, which were willing to share excess capacity with other departments, a practice not as evident in the federal government, yet clearly a rapidly growing one in companies. Use of computers in the second half of the decade (and continuing into the next) mimicked the pattern initiated earlier in the decade regarding what to automate, where to host computing, and the rate of deployment. In this period, the federal government increased its demand for information from local and state agencies, which provided an additional incentive to mechanize and automate data collection, analysis, and reporting.16 By the late 1960s, the pattern of who used computers became clearer. One group comprised operational departments that relied on computing to support work concerning motor vehicles, agriculture, water resources, public health, social welfare, and justice. All these agencies came into constant
contact with citizens. A second group focused on using computers in support of administrative tasks, such as those of the state comptroller, general administrative services, finance, and tax boards. A third class of agencies, involved in policy making, formed the latest wave of adopters of computing in the late 1960s. Their applications included support of elected officials (governors and legislators) and policy boards. The latter required rapid summarization of large bodies of information that officials could use in making decisions and in formulating policies and programs. Operational support for management normally required help in planning, allocation of resources (people, assets, and budgets), monitoring, and correction of errors in data. At the level of what computers actually did, specific practices had emerged. In operational applications, officials often used computers to do off-line, repetitive jobs that could be scheduled in advance, such as accounting, updating payroll records, and printing checks, tax bills, and periodic statistical reports. The technology of the day was best suited for these kinds of "batch" applications. A second, increasingly used class of applications involved performing engineering calculations, again because computers could do a great deal of this work quickly, in large volumes, and at scheduled times.17 Applications that officials found attractive in the 1960s spread all through the 1970s and 1980s, first into large states, then increasingly to smaller ones, particularly as the technology dropped in price, came in more granular sizes, and gained a growing supply of software products and tools.18 By the late 1970s, older systems, first installed in the 1960s, had either been expanded or replaced with newer versions, including online query capabilities, after data processing staffs began moving files off magnetic tape and cards and onto disk drives.19 As occurred in private industry and across the federal government, budgets allocated to data processing increased slowly in the 1970s, then sped up in the 1980s as the number of PCs in state government grew. Most states established large central data centers in the 1960s and 1970s to optimize the economies of scale that technologies made possible, and they even centralized the acquisition and deployment of myriad telecommunications, not just voice communications. More effectively than the federal government, legislatures passed laws to improve the professional management of IT assets all through the 1970s and 1980s, an aspiration the Executive Branch of the federal government only ambivalently supported for its own departments. Specifically at the state level, governors and small groups of legislators drove forward the innovations in management, demonstrating a quickness in leadership not as consistently evident in the federal government at the time.20 In short, computing proved more relevant to state public administrations than to many senior federal officials who, one could surmise, had more, or different, issues to contend with, ranging from foreign policy and the Cold War to enormously large organizations, some with hundreds of thousands of employees, encouraging them to delegate reforms to subcabinet officials and directors of agencies. In the 1980s, new applications evident across all industries and the federal government became relatively common in state governments. Already mentioned were the desktop uses of PCs, spreadsheets, and word processors.
These little systems, along with other applications housed in mainframes, led to the wide
deployment of various decision support systems that had become so popular by the early 1980s.21 These were nested within what had become a complex technological ecosystem. A professor observing this development, Sharon L. Caudle, described this world of the 1980s at the end of the decade:

Information Technologies in the past decade increasingly served as powerful tools for government in providing services, regulating, and formulating and evaluating programs and policies. Information was the fuel in government's business. Telecommunications and office systems technologies joined computing as central technologies. Information management and information technology organizations grew in visibility and power. Not surprisingly, elected and career government officials found themselves more and more concerned with the significant dollar investments in information technologies and their applications, compounded by accelerating waves of new computer and communications technologies.22
She reported that nineteen out of fifty states reorganized their IT organizations in the 1980s in order to improve operations, provide new services, and control costs, with a preponderance of states situating their technical staffs largely in administrative and financial departments.23 An immediate consequence, first felt in the late 1980s and extending right through the 1990s, was the emergence within these newly centralized organizations of an appetite to build statewide IT and telecommunications infrastructures. These in turn would have profound effects on the structure and work of state governments by the end of the century. Caudle was one of the first observers of state IT operations to identify the characteristics of this trend. In 1990, she noted that "the infrastructures link individual workers with a multitude of databases, ranging from stand-alone computing to powerful central data centers to contracted information services. Voice, data, and video telecommunications are rapidly developing as central informational technology tools for mission support. The technical means to instantaneously move information to and from local and state headquarters, across programs, and in real time will soon be a practical reality."24 While many would criticize how slow states were in transforming their IT and telecommunications, states did not run into the same magnitude of budgetary and organizational paralysis that plagued so many federal agencies in the 1970s and 1980s. To be sure, governors and their legislatures struggled with budgetary constraints; but, perhaps because they ran smaller, less complex bureaucracies, they were able to do a better job of exploiting technological innovations and of making data processing a more integral part of their operations by the end of the 1980s than federal officials evidently could.25 State legislatures played a particularly active role in the use and deployment of IT in the 1970s and 1980s, exceeding that of the U.S. Congress. One can posit two reasons for this activity, both rooted in a personal familiarity with IT among many state legislators that was not necessarily evident in the profiles of members of Congress. For one thing, many state legislators were not full-time lawmakers but rather had other careers in law and business, which exposed them
For one thing, many state legislators were not full-time lawmakers but rather had other careers in law and business, which exposed them to many uses of computers and telecommunications, a feature of their backgrounds noted by a number of observers of legislators' role with IT.26 All congressmen in both houses were full-time legislators, and since they turned over infrequently (fewer than 10 percent lost their offices in any given election), many were career congressmen. Thus, they had less exposure to the evolving forms of computing than their state counterparts. In short, they faced the kind of technological insulation experienced by judges. So, unless they served on some congressional committee charged with examining the role of IT or telecommunications, they might not be as personally aware of the possibilities of technology, or of the managerial considerations involved, as state legislators and city and town council members. Recall the chronic problem the Social Security Administration and the IRS faced in the 1970s and 1980s as Congress passed laws without giving sufficient consideration to the IT implications affecting the agencies. Unless one wants to attribute to congressmen a mean-spirited lack of consideration of such factors, one is left with no alternative but to conclude that it was the paucity of adequate personal knowledge about technology that led federal legislators to act as they often did in these years. I also have been singularly unimpressed by the ability of agencies to articulate their IT needs to the staffs of congressmen.

A second source of knowledge for state lawmakers came directly from a variety of digital applications used in the legislative and voting processes of their states. While use of computing by legislators and their staffs dated as far back as the mid-1960s,27 it did not become widespread until the late 1960s. Early uses included digitizing state statutes for ready look-up by staff, managing various drafts of legislation, word processing, and quick publication (usually photocomposition). By the early 1970s, half the states used their main computer data centers to support online retrieval of statutes,28 and about fifteen updated drafts of legislation using the digital hand, making the process faster and less tedious and eliminating errors. As draft legislation went digital, soon after so, too, did tracking of potential laws as they made their way from conception to final passage. By 1972, over thirty states used this method for tracking and scheduling legislative activities, much the way court clerks tracked and scheduled cases. As with other state applications of IT, the largest states began first, with legislatures in New York, Washington, and Illinois, for example, experimenting early in the 1960s.29

By the end of the 1970s, thirty-four states wrote, edited, and managed legislative actions via computer, making it possible for staffs to handle larger volumes of legislative work and to respond rapidly to iterative changes. Software products to help legislatures began appearing on the market. In addition, budgetary information derived from state financial and accounting departments became available through batch and online access to computerized files. A few states also experimented with voting via computer at a legislator's desk.30 By the end of the 1970s, an estimated forty-four states used computers for one application or another.31 The introduction of computers had its greatest effect on legislative staffs; it was this community that seized the greatest initiative in persuading legislators to invest in computing.
One survey looking at the period of the 1970s and 1980s made the crucial observation: "They (staff) wanted word processing, automated bill production, legislative status tracking, and legal document retrieval systems to make their work easier. The driving requirement was the usually impossible deadline of incorporating amendments into the budget bill and having it on the floor for debate within a few hours."32 Table 7.3 shows the results of their efforts to integrate digital tools into the work of legislatures. While the extent of deployment may seem small, recall that it represented the cumulative results of about one decade's worth of deployment and involved over half the states.

In turn, use of computing changed the role of staffs, requiring them to learn how to search for material online, to understand how budgets work, and to be able to read and model budgetary options, in addition to doing queries, very much as lawyers and their usually much younger and computer-savvy legal aides did. Clerical staff had to understand how to use word processing, track the status of bills, retrieve statutes, use e-mail, and interact with online calendaring. Increasingly in the 1970s and 1980s, computer skills became a prerequisite for being hired. It was, therefore, not uncommon for legislators and their staffs to use online systems in the 1980s.33 In that decade, distributed processing expanded as legislatures acquired internal networks and sometimes their own computer systems. E-mail and other online delivery of legislative material to staff and legislators became increasingly common. Key applications that built on earlier ones of the 1970s included updated word processing, electronic mail, file management, and telecommunications, usually linked to district offices of legislators. Access to pending legislation and existing laws via terminals became increasingly normal practice across the country.34 By the end of the 1980s, it was not uncommon for legislators to be very familiar and facile with online query systems and to be direct users of their legislature's digital tools, deployed in the state capitol building, in their district offices, and at home via dial-up telephone communications. They also became early adopters of personal computers and, in the 1990s, laptops, carrying the latter into committee hearings to take their own notes or to look up information.
Table 7.3 State Legislative Applications in Use by 1977

Application              Number of States Using
Bill status tracking               32
Fiscal-budgetary                   25
Bill drafting                       4
Statutory retrieval                 3
Bill indexing                       3
Modeling                            2
Source: M. Glenn Newkirk, “Trends in State Legislative Information Technology,” Government Information Quarterly 8, no. 3 (1991): 264.
By the late 1980s, legislators more than their staffs were the individuals requesting new digital tools, such as spreadsheets, local area networks on the floor of the legislatures, and economic and financial modeling capabilities.35 Finally, as pending bills made their way into computers and online access became more common, legislatures began making these files available to other state officials, reporters, and interested parties via telecommunications. Alaska, Illinois, and Virginia had led the way in the 1970s, and by the mid-1980s, nearly a dozen states permitted such access. By 1990, the number had nearly doubled, and by the end of the century such access was ubiquitous.36 In short, as early as the end of the 1980s, computing had been woven intricately into the fabric of the daily work of state legislatures.

Lest one think that the U.S. Congress eschewed computing, nothing could be farther from the truth. Just as staff at the states had pushed for computing in the 1970s and 1980s, a similar process had taken place at the Congress even earlier, in the 1960s. By the 1970s, it had become an extensive user of IT, for the same reasons as at the state level. Patterns of adoption of computing evident at the state level were similar in Washington, D.C., although, as with many other federal organizations, systems began earlier, in the 1960s, and often were only slowly modernized in the 1970s and 1980s. But use proved extensive, given the very large staffs and volume of work characteristic of the Congress and its hundreds of district offices around the nation.37

Switching back to the executive branches of state government, deployment of computing in the later 1980s and early 1990s continued unabated, morphing to account for the use of distributed processing (thanks largely to cheap PCs and communications) and for knowledge gained from using earlier generations of digital tools. Imaging applications became important by the early 1990s to digitize large bodies of records used in daily work, such as land records and birth certificates, and, of course, tax returns and Medicaid documentation. Optical scanning of documents came into the states at the same time as into the Insurance Industry, also a large community of workers inundated by vast quantities of documents. One report on this new application described why it was an attractive use of computing: "By creating electronic images of paper documents for computerized manipulation, storage and retrieval, the technology cuts down on space needed for filing cabinets, ensures accuracy of information, simplifies and speeds up employee access to information and thereby saves time, labor and money."38 It was, however, a primitive technology in the early 1990s, but by the late 1990s it had stabilized into a very cost-effective, relatively easy-to-use application.39

Overall, by the time the Internet became a factor, beginning in the middle of the 1990s, states were collectively spending some $30 billion annually on computing, and nearly every office-bound employee interacted with the digital in one fashion or another.40 Large systems continued to be implemented, some very successfully, while others proved multimillion-dollar disasters, such as California's failed welfare system of the 1990s, which officials had to abandon and replace with a new one.41 Applications ranged from department of motor vehicle systems to others in support of welfare and Medicaid.
Some of these systems were designed in response to federal mandates that states take over functions originally handled by federal agencies. This migration of responsibilities proved difficult to accomplish because all states tended to build the same kinds of systems simultaneously to meet federal deadlines for compliance (thereby absorbing all the consulting knowledge about a particular application available in the United States), and because systems had to be designed to federal guidelines or laws stipulating what services to provide or what rules to enforce. One frustrated state employee commented that "the notion that you can really develop systems from Washington under federal guidelines is akin to playing Mozart with mittens on."42 On the other hand, there were also success stories of new generations of large applications being installed for welfare, Medicaid, and education, to name a few.43

As tax revenues increased in the 1990s during the expansion of the nation's economy, states increased their budgets for IT, modernized old applications, added many new ones (such as GIS), distributed IT processing, and replaced old hardware and software.44 Collectively, state expenditures for IT rose from nearly $35 billion in mid-decade to well over $50 billion annually by the end of the decade. The largest proportions went to administrative and financial functions, but also to human services, transportation, and public safety. Health, criminal justice, environment, and public works invested increasing amounts in IT all through the decade as well.45 By the late 1990s, state governments had been aggressively injecting computing into all manner of work across every department and branch. Areas of greatest use that governors and their staffs focused on in the mid- to late 1990s included higher education, elementary and high school education (K–12), business regulation, taxation, social services, law enforcement, and the courts. Simultaneously, more software to help in decision making at all levels of government kept being installed on every kind of platform, from laptops to mainframes, using open and proprietary software and networks.46

Recent Trends

Much of the activity involving IT since the mid-1990s centered on the deployment of Internet-based uses of computing and communications. Like the federal and local governments, states followed similar patterns of applications and deployment. But, unlike the federal government, state governments went farther in creating applications that increasingly have been labeled e-democracy, by which citizens began to interact more frequently in public policy decision making by way of the Internet. It was a process just becoming evident in the early years of the new century,47 and it was also reigniting a debate about the virtues and problems of direct versus representative democracy. These were issues that had first been discussed by the political leaders of the 1770s and 1780s when they created the structure of the federal and state governments.48
State agencies began creating Web sites as early as the mid-1990s. A decade later, observers noted a pattern discussed throughout this book in which states went through several generations of Web sites rather quickly, in a matter of a few years per generation instead of the decade or more per generation that had been normal with mainframe, mini-, and PC-based computing from the 1950s to the end of the 1980s. One student of the states' process, Darrell M. West, in 2004 identified four stages of evolution: a "billboard stage," a "partial-service-delivery stage," a "portal stage," and an "interactive democracy with public outreach and accountability enhancing features" stage.49 The first involved posting information that described an organization's mission, largely providing addresses and telephone numbers. The second enabled citizens to begin interacting with an agency, such as sending e-mail and filing applications for permits. The third organized menus of services not by department but by type, also providing navigation tools to search out topics independent of any particular agency. The last stage concerned e-democracy, discussed in more detail below, which states were just entering by the middle of the first decade of the new century. Roughly generalizing, states went through the first two stages at various speeds between the mid-1990s and about 1998/99. The third began about 1998 and continued into the new century, while the fourth was just starting as this book went to press in 2007.50

Early uses of the Internet involved linking state and local government agencies together so that they could collaborate on problems, such as public emergencies, and creating bulletin boards to inform citizens. Early examples included UtahNet, Info/Kansas, and HawaiiFYI.51 Pre-Internet communications with the public had included dial-up access via PCs and kiosks; officials now began to migrate to Internet-based platforms.52 Evolution to the second stage involved a large variety of state agencies working in an incremental fashion, adding more content and then, slowly, the ability to communicate and to conduct a very limited number of transactions.53 Within a year, more than half the states were clearly implementing second-stage applications, and officials were largely convinced by then that they could use IT to realize another round of efficiency and cost savings in state operations, thanks to the Internet. As one report on this attitude noted in 1999, "The Web's cheap, friendly, flexible interface is fast becoming another face of government."54

By the late 1990s, over half the households in the United States, and a slightly higher percentage of workers, had access to the Internet, providing an environment that state officials felt encouraged to use as a new way to deliver their information and services to the public at large. Furthermore, residents of the United States were learning in their nongovernmental activities to expect access to businesses and information twenty-four hours a day, seven days a week, and began expecting the same from their federal, state, and local governments. The states in turn saw provision of such services as a way of improving their response to the public's will while containing operating costs. It seemed an attractive "win-win" arrangement for both. With interactive services just starting to become available by the end of the century, citizens in many states could obtain fishing licenses online, register their cars, and so forth.
However, even as late as the dawn of the new century, the list of online, or in the parlance of the day "e-commerce," applications remained short and grew only incrementally, department by department.55
Officials, observers, and citizens increasingly recognized by the end of the century that state and local government operations had entered a period of significant, if ill-defined, transformation. Restructuring of public expenditures was one transformation, as was the way in which citizens would interact with government officials. But it was also becoming clear that traditional organizational structures could change as similar services of various agencies were bundled together or as multiple agencies dealt with a citizen in a more coordinated manner, using case-management techniques already widely deployed by the Insurance Industry and law enforcement and increasingly by welfare agencies.56

As with earlier adoptions of IT, almost from the beginning of the Internet's incursion into state operations, publications and organizations tracked deployment, reported results, and influenced (indeed encouraged) officials to move further in using the Internet, largely from the late 1990s onward.57 By 2001, all fifty states had at least arrived at a second stage of deployment, and over a fourth were deep into the next one, as we saw, for example, with the use of online tax filing.58 Yet deployment of specific uses remained uneven across the nation. Those functions that facilitated the generation of taxes and fees seemed to be the earliest converted into second- or third-generation Internet applications. Table 7.4 lists common examples of this process as of late 2002.

Use of portals made it easier for state employees, private sector managers, and citizens to navigate through myriad state Web sites for information and services. An early example from Michigan illustrates the issues public officials faced concerning the Internet and why they began deploying portals. Essentially similar circumstances existed all over the country. A contemporary report (2001) documented the issues:
Table 7.4 Widely Available State Government Internet Applications and Number of States Offering These Services, 2002

For Tax Preparation and Filing:
• Downloading forms                              42
• Tax advice                                     38
• Filing                                         35
For Registering Vehicles:
• Downloading forms                              11
• Completing registration online                 16
For Obtaining Professional Licenses:
• Downloading forms and obtaining information    50
• Partial online registration                    25
• Totally online registration                     2
Source: Ellen Perlman, “The People Connection,” Governing 15, no. 12 (September 2002): 32.
"The tangle of Web services available until recently grew up in individual agencies with limited resources. It was confusing to users because there was no common navigation, no search function across agencies and no common look and feel. Users had to know the name of the agency to find the service they were looking for. At the new Web portal, which was launched July 11, services are arranged by theme."59

Michigan had long been an extensive and early adopter of digital tools, along with such others as Illinois, New York, and California, so it ran into these sorts of problems earlier than most states. How it resolved them provided guiding practices used by many other public officials and their IT staffs.60 As occurred in the private sector, issues concerning data security and privacy also proved troublesome to both citizens and public officials, beginning in the mid-1990s, and were not fully resolved in the early years of the 2000s.61

State governments had two classes of expenditures in the 1990s and early 2000s that affected their move to the Internet. The first involved hardware, software, and staffs to support, update, and run pre-Internet applications, such as the many back office uses developed in the 1970s and 1980s. A second cluster of expenses involved the creation, operation, and innovation of Web sites. In the late 1990s, state officials were generally able to spend between 1 and 2 percent of their state budgets on IT. Some states, like Michigan, spent more, as did others that enjoyed additional tax revenues and fees during the booming economy of the post-1995 period and before the arrival of the national economic recession of the early 2000s. By about 2003, while the absolute number of dollars spent on both types of IT continued to rise, it was quite clear that states' collective priorities had settled on four sets of initiatives: to improve homeland security, continue expanding use of the Internet in "e-government" initiatives, support agency-specific applications, and outsource some IT work to the private sector.62 Governors had to cut back expenditures on projects to upgrade existing IT infrastructures due to the lingering national recession, which was reducing the inflow of tax revenue to the states.

Did citizens use Internet tools? There is some evidence that helps us answer the question. A survey conducted in 2003 suggests that citizens were extensively using these Web sites. Officials reported that substantial use of their Web sites occurred in over half the states but varied by application, with lower usage for employment and public safety and very high usage for professional licenses; in states where online applications for licenses existed, for example, over 80 percent of all nurses used the Internet for licensing.63 Another survey reported that some 68 million people had accessed government Web sites of all kinds (federal, state, local) by 2003.64 So, extant evidence suggests that the answer is "yes, they did."

The next stage in the evolution of Internet-based computing involved effects the digital hand was just beginning to wield on democracy and the practice of government as this book was going to the publisher (2007).
At the risk of dealing with the issue too briefly, because so many other commentators have dealt with it extensively, the historic debate centers on whether citizens should vote directly, for example, on measures before a legislature (such as on the famous propositions constantly put before voters in California) or allow elected representatives to do that on their behalf.65 The first form of democracy is often characterized as direct, the second as representative. The founding fathers chose the latter form unequivocally, largely in order to prevent the passions of the day from causing ill-conceived laws to be passed and to increase the possibility that well-informed legislators would use sound judgment on behalf of the people's interests. The issue is complicated by the fact that modern democracies are predicated on the availability to citizens of large amounts of information.

Enter electronic or e-democracy. As one student of the subject neatly defined it, "electronic democracy can be understood as the capacity of the new communications environment to enhance the degree and quality of public participation in government," such as allowing citizens to vote using the Internet, or to do real-time polling of public opinions.66 The debate on the advantages and disadvantages of such use of technology is extensive. There is also an emerging new twist, namely, the growing involvement of nonprofit organizations, the media, experts, and others attempting to take part in the governance of the nation and actively engaging in providing specific services to agencies.67 The issue has not led to any consensus and promises to become of greater concern in the years to come as the Internet is further deployed and used in public administration and in support of democratic practices. More narrowly, within the confines of a government agency, there is the concern that technology will affect the way citizens and officials interact and influence each other, a subject not yet clear to anyone at the dawn of the new century. Will chat rooms affect how legislators vote? Will near-instant polls affect decisions made by elected and career officials? What effect will the vastly increased amount of information and access to government have on the level and role of public trust in public administration?68 These issues go to the very heart of how democratic government functions in the United States.

There is one issue related to IT in general, and not simply to the Internet, that has become far more public in recent years than the effects technology will have on democracy: voting. In the United States, state governments have the greatest hands-on responsibility for conducting elections for federal, state, and many local offices, although county and municipal governments do too. The two key applications involving any form of technology concern how to cast and tabulate ballots. Since the 1950s, many states had used punched cards and IBM tabulating equipment to speed up tabulation and to increase the accuracy of the work, all the while providing audit trails. Specialized voting equipment became available as well, and states and counties deployed it widely around the country throughout the 1960s and 1970s. In the 1970s and 1980s, various optical scanning systems began appearing around the country, while older punched-card systems also remained in use.69

Nobody living in the United States during the elections of 2000 could possibly have avoided hearing about the technical problems faced by officials in Florida with punched-card voting that ultimately led to the U.S. Supreme Court declaring George W. Bush the winner of the national election for president over Albert Gore.
During that election, approximately 32 percent of all polling places in the nation still used punched-card ballots, another 27 percent relied on optical scanners, and 18 percent on old-fashioned lever-operated machines. Only 9 percent had deployed touch screen (digital) systems. A few states still had paper ballots. At the time, only twenty-two states had established official guidelines on how to use digitally based voting systems.70 Largely as a result of the problems experienced by Florida, an uptick occurred in the deployment of digital voting systems. By 2002, thirty-one states reported that roughly half their precincts were using digital means to cast and tabulate votes, while eight remained committed to manual, paper ballots.71 Deployment was a function of what local and state governments did independently within a state, and of what states did independently of each other.72 Technical problems and lack of budgets for new systems constrained deployment of digital voting systems. As of 2004, for example, optical scanners predominated; paper and punched-card systems had beaten a hasty retreat, largely supplanted by scanners and some digital systems, the latter mainly in Nevada, southern California, and various parts of the southern United States.73 Only about one third of all votes cast in 2005–2007 were done electronically. Thus, we can conclude that deployment of digital voting tools remained relatively low and that adoption had been slow, incremental, and fragmented.

Closely tied to voting issues, and directly an Internet application, were campaign Web sites, which became popular with campaign organizations in the late 1990s, especially for gubernatorial, state, and federal legislative elections, and somewhat less for county and municipal campaigning. They became standard fare during the early years of the next decade. Sites varied widely in the amount of information they provided and in the ability of visitors to conduct transactions (such as sending e-mail and making monetary contributions).74 During the national elections of 2004, however, the Democratic Party raised hundreds of millions of dollars over the Internet. In that election, roughly one third of the electorate used Internet sites to obtain information about candidates for local, state, and federal positions, up by a third from the off-year campaigns of 1998. By 2002, organizations had begun codifying best practices for candidates under the assumption that no one could now run a campaign without effectively using Web sites.75 The press, political scientists, and others also became extensive users of the Internet for information related to political campaigns and voting.76
County Governments and the Digital Hand

County governments play a special role in American society. On a more local basis, they carry out duties similar to those of a state, such as providing law enforcement, managing voter registration rolls, conducting elections, protecting the environment, and maintaining roads and bridges, and they are often an important employer in the community. They also share jurisdictional responsibilities with municipalities, performing similar duties, such as running school systems and social welfare programs.
They range in size from the very large Los Angeles County, which is larger in population and budget than many states, all the way to very small ones in Delaware and Rhode Island. Some are quite urban, such as those that share the geographic footprint of Atlanta, Nashville, and Chicago, and others quite rural, as is the case across much of Montana, Wyoming, and Alaska. In short, they vary a great deal.

Most studies of computing at the local level invariably merge practices of county and municipal governments into one milieu under the assumption that their roles are more similar than different. However, extant evidence on differences between the two types of governments suggests that, at least insofar as the adoption of IT is concerned, there are sufficient variations that should be recognized. To the extent possible, recognizing them helps to define better the exact role of computing and telecommunications at the local level, across the many thousands of towns, cities, and counties in fifty very different states.

Early Uses of Digital Tools, 1950s–Mid-1990s

As was so evident with states, cities, and companies, the largest counties often were the earliest to use computers, and for the same reasons. They had adequate scale and budgetary and staffing wherewithal, particularly in the beginning, when computers were very expensive and required many other resources to implement. Large counties also had installed earlier information processing equipment, such as IBM punched-card tabulators and billing equipment made by Burroughs. They even used cash registers in support of financial transactions.

One of the largest in the 1950s was Los Angeles County, home to some 5 million people and fifty-five cities, including the city of Los Angeles and such other well-known communities as Pasadena, Glendale, Long Beach, Burbank, and Santa Monica. The county had in excess of 35,000 employees and a payroll bigger than that of forty states. In the early 1950s, it had approximately sixty administrative departments and other organizations and several data centers. So, it should be no surprise that officials in this county became some of the earliest public servants to study the feasibility of using computers, beginning their exploration in 1953 and ending with the installation of a Datamatic 1000 in the fall of 1958. Over the next several years, they moved work done on such older calculators as an IBM 604 over to this system. Applications included assessments and collection of taxes, centralized payroll preparation and accounting, maintenance of voter records and of records on grantors and grantees, preparation of social welfare statistics and claims for the state, accounting for the County Hospital, creation of utility bills and records, and support for the Road Department. An IBM 650 system, housed at the county's Air Pollution Control District, did scientific analysis of wind currents and air pollution.77

Other large counties followed suit, although at a slower pace than Los Angeles County. Counties began with finance, payroll, billing, accounting, and tax applications, automating and speeding up existing processes for these core functions. One survey conducted in the early 1960s reported that there might have been roughly thirty computer systems installed in county governments, although the survey was not precise because data on cities and counties were often mixed together.
However, what is not in question is that counties came to computing later than states.78 By the mid-1960s, all large counties saw computers as the next wave of technology they would soon embrace if they had not yet done so. As the cost of IT dropped during the 1960s, ever smaller counties, with their more diminutive budgets and populations to serve, emulated their larger brethren.79 Because many counties were first-time users of computers in the late 1960s and 1970s, they acquired state-of-the-art equipment, such as IBM System 360s in the second half of the 1960s and IBM's System 370 in the early years of the next decade, and, never having installed second-generation computers, they were free of the major conversions faced by such early adopters as the IRS and some very large state governments.80

By the mid-1970s, approximately 25 percent of all counties with populations of over 10,000 residents were using computers. The largest counties, that is to say, those with populations of over 250,000, were heavily invested in the technology; minimally 97 percent of all such organizations used it. These also had the largest number of well-trained, professional managers and technical staffs who could install and use such technologies, which were far advanced over tabulating and accounting/calculating equipment. Surveys of the period also showed that counties proved slower than cities to adopt computing, statistically behind (as measured by size of population) by roughly five years. That placed the "take-off" in use by the largest counties in the early 1960s, by midsized counties (say, with populations of between 50,000 and 100,000) in mid-decade, and by the rest in the 1970s. Counties of all sizes, however, often ran computing applications at service bureaus or shared resources with local cities or nearby counties, thereby avoiding, or lowering, the costs of having their own systems and staffs. Even counties that had systems of their own outsourced some work, such as payroll or ad hoc jobs, to specialized regional public or private service bureaus.

In the 1960s and 1970s, when counties did use computing, the percentage of budget they allocated to it often exceeded that of cities of comparable size because of the nature of their work. Counties ran large digital applications to prepare voter registration lists and property tax assessments and in support of welfare and healthcare, often in excess of similar work assigned to many municipal computers. Counties also were more geographically dispersed than cities and frequently used computers to provide administrative integration of services, practices, data collection, and so forth. By the late 1970s, counties had on average nearly fifty different uses (applications) running on their computers, largely automation of routine tasks, such as record keeping, calculating, and printing, with primary emphasis on financial and accounting controls and procedures. Close behind these applications were those, of course, for law enforcement of the type discussed in chapter 4. In short, many county officials focused on generating revenue and supporting law enforcement.81 Use of computers, therefore, supported internal, inward-looking applications.
Not until the late 1980s, and more extensively after wider deployment of the Internet, would citizen-centered applications be installed by county governments. About the only way citizens saw evidence of county IT at work in the 1960s and 1970s was through documents printed by a computer and mailed to them, such as utility and tax bills or notifications of voter status.

Surveys of the period demonstrated that just having a computer did not lower operating costs or staffing requirements. To be sure, low-level clerical work sometimes went away, automated by the "giant brains," but at the same time, counties had to hire more expensive, technically competent staffs to run these systems. But there were offsetting benefits. These included the ability to process large, complicated files (such as land records for assessing taxes), to handle uses requiring frequent searching and updating of records (such as wanted-persons or stolen-vehicle files for sheriffs and police), and to deliver information to geographically dispersed offices from centrally controlled records. Officials commented in the 1970s that such functions did make it easier to do more work faster and that IT helped them avoid future budget increases. All of these considerations grew in importance during the second half of the 1960s, when the federal government began requiring growing quantities of data from all local governments concerning crimes, education, welfare, and employment.82

In the late 1970s and early 1980s, increasing numbers of counties acquired computers for the kinds of applications that had been embraced by earlier adopters of digital tools. In the period from the mid-1970s to the early 1980s, the cost of technology continued to drop while the availability of software tools increased. It was also in this period that personal computers became available and managers all over the economy acquired the greatest number of minicomputers. However, the smallest counties that could afford computing did not seem as aggressive in wanting to rely on it as larger counties. The same applied to smaller cities. One survey done in the mid-1980s demonstrated that both types of government had increased their use of computers by only 17 percent over the surveys done in the early to mid-1970s, but that about 53 percent of all counties and cities combined now used computers (their own or through some service bureau or shared arrangement) to do some or a great deal of computing. The data demonstrate that counties with "professional management" were more inclined to use computers than others. More precisely, the 53 percent broke out as 36 percent of all counties and over 67 percent of all cities. Thus, as a group, counties tended to lag behind cities in their use of computing. One of the authors of the survey, Professor Donald F. Norris, opined that the rate of adoption turned on the question of the quality of management.83 However, while that was clearly the case, scale factors also played an enormous role, such as the size of a county, the volume of transactions it needed to perform, and the cost of technology. Less understood, but clearly a factor, was the additional issue of the availability of technical staff conversant in IT. The smaller the county one looked at, the more likely it had little or no staff familiar with computer technology.

By the mid-1980s, the most widespread uses of computers were in support of payroll, accounting, budgeting, utility billing, and tax assessment and billing, and to manage personnel records, support law enforcement, and track inventory and voters.
These remained core applications for both counties and cities for the rest of the century, regardless of the size of the governmental body. Use of computing facilitated the performance of existing work but did not fundamentally change the kinds of activities performed by public employees, nor the structure (organization) of local government in this period. Thus, the effects (so far) were more on operational tasks, where work could be sped up, automated, or done more accurately and less expensively. Computers had yet to affect the management and decision making of county officials, a change professional supervisors and advocates of computing hoped would occur.84

The story of the use of computing in counties between the second half of the 1980s and the mid-1990s is a tale of embracing new technologies, such as the shift from just batch to online systems and the adoption of personal computers in increasing numbers. The number of counties relying on computers also increased as the technology became more affordable and accessible, so that by the mid-1990s one would have been hard pressed to find a county that did not use computers for multiple applications.85

Recent Trends

Like city and state governments, counties upgraded software and hardware throughout the late 1980s and right through the 1990s. Online systems expanded, while remote locations were increasingly linked together through private and public networks. From the perspective of their daily work, counties incrementally expanded use of computing to new areas, such as digital mapping (GIS), work scheduling, economic growth planning, and infrastructure maintenance and development. PCs proliferated, while work that had been done on service bureau computers often came back in-house, run by county employees. Early adopters from the 1960s and 1970s faced Y2K remediation in the late 1990s to get ready for the year 2000, while those who came to computers later faced less of a potential problem, since by the late 1990s a great many software products (and updated microcode in hardware) were already written to reflect the change of date that came on January 1, 2000. While counties were slow to establish a presence on the Web, they did, with their major take-off occurring in the early 2000s.

So where were counties at the end of the century with computing? Survey data from the period suggested that over 80 percent of all departments in county governments had access to computing in one form or another, with almost half of all employees having access to PCs as part of their normal work. Less than 10 percent of all counties reported not using computers. Access to the Internet by employees remained low as of 1999–2000, at just under 10 percent, yet half used internal e-mail. The best evidence suggests that a third of all counties had an intranet site, but only about 13 percent of all county employees had access to the Internet at work. Forty-two percent of all counties reported having a Web site. The earliest services provided over the Internet existed in less than 10 percent of counties. Where they were available, these included processing open records requests, providing copies of vital records, and handling voter registration, all offered mainly by very large counties.
Two percent or fewer had made it possible for citizens to register motor vehicles over the Net or, for that matter, to obtain building permits, make tax payments, pay fees, acquire licenses, or settle fines.86 In short, counties were just getting started on their e-government journey. As one reporter covering local governments wrote in early 2001, "Counties typically haven't been viewed as hotbeds of cutting-edge e-government innovation."87 Large counties, however, were in the process of hiring chief information officers (CIOs) and consolidating their IT operations into IT departments that took advantage of professional management to leverage the technology.

Press commentary in subsequent years increasingly reported that county governments were finally embracing the Internet and transforming how they did daily business. However, the rhetoric seemed to get ahead of the realities cited above. For example, one industry watcher declared in 2003 that "the Digital Counties Survey shows that technology is truly transforming government as we know it at the county level," yet the data in that study offered minimal evidence on the extent of deployment.88 Most of the instances described were from large counties. However, this survey did indicate that by early 2003, over 82 percent of all counties had a Web site and that half of these made it possible for citizens to explore information online, such as job openings in their county. Nearly all also provided e-mail for their employees and elected leaders.89

The same survey was conducted the next year, and the hubris continued. The most important change reported in this survey was less about how many more counties had Web sites (that number kept growing) than about the move to county portals (61 percent). This growing trend proved important because portals made it easier for citizens to conduct business online with their county, and we know from the experiences of state and municipal uses of portals that over time these did effect changes, as governments focused more on the needs and desires of citizens and less on their own departmental, hence internal, priorities. Counties expanded the ability of citizens to reach individual officials by way of e-mail, and Web-casting of governing body meetings had started, building on the prior experience of local public television broadcasting such meetings as early as the 1970s.90

What was happening in smaller counties? The only data we have cover a category of counties with fewer than 150,000 residents, which tells us more about those near the top of that range than about those with, for example, populations of less than 25,000. Yet even in this broad category of counties, by 2004 nearly 90 percent had Internet sites and gave citizens access to staff, management, and elected officials via e-mail. A third implemented IT using a strategic technology plan to guide their activities, and a similar number had created portals, which tells us that there were multiple Web sites in many counties that needed to be integrated. Online transaction processing was still rare (roughly 2 percent), while most made forms available over the Net. In contrast, a third reported having a broad selection of digital tools available to law enforcement, no doubt funded largely by grants from the federal government.91
Observers of the adoption of computing at the local level noticed, however, that in the early 2000s the public was willing to invest tax dollars in creating Internet-based e-government services. Regardless of whether the government was a town, city, or county, it seemed that the proverbial "everyone" was expanding Internet-based services and operations, and early adopters were moving through second, third, or into fourth stages of evolution, from simply posting information to slowly starting to offer transaction processing with the public. However, when compared to services in the private sector, they remained unsophisticated, with most still in very early stages of evolution more typical of the private sector of the mid- to late 1990s.92

Barriers to more rapid and sophisticated adoption and use of the Internet varied from one county to another. However, common issues most faced in the late 1990s and early 2000s included lack of staff knowledgeable about creating and managing Internet sites, insufficient budget to allocate to such initiatives, concerns regarding data security and privacy, and more pressing needs to upgrade earlier installed IT applications.93 In short, the issues had less to do with desire or awareness of the potential benefits of computing and more to do with what had always constrained adoption of digital tools in all governments over the past half century. Furthermore, these concerns turned less on the functionality of technology, or over time even on relative costs, and more on absolute costs and the availability of staff. In other words, even if the price of a computer declined from one generation of hardware to another, it still did not necessarily become attractive if there were insufficient dollars in a budget to pay for the less expensive machinery and software. Earlier adopters also faced problems evident in large federal agencies. As several observers of the scene pointed out in 2003, "For those local governments that have historically adopted technology earlier, they may experience the ambivalence first as they are more likely to report both positive and negative impacts," with the consequence that they "are likely to find that e-government is a mixed blessing and that expanding and enhancing electronic services may be further stretching already stressed IT staff resources."94
Local (Municipal) Governments and the Digital Hand

In the United States, it is commonly accepted that the phrase "local governments" refers to municipalities (towns, villages, and cities), not to unincorporated communities, and frequently to counties as well. Municipal governments are normally run by elected mayors and town or city councils, although any community of size may also employ professional managers who are not elected, such as town managers and CIOs. Urban centers are of extraordinary importance, indeed often far more so than counties, because the majority of the people living in the United States reside and work in towns and cities. Urbanites have outnumbered those living in rural communities since the 1920s, and by 1950 by nearly 2 to 1. The trend of living in urban centers continued right into the new century. In 2000, for instance, 79 percent lived in urban communities.95
If for no other reason than where people lived, the role of towns and cities is of great importance in any discussion of life in modern America. Use of digital tools by these local governments, therefore, takes on greater importance than that by counties, because citizens often interacted with municipalities more frequently than with state officials. One could argue that IT in towns and cities was more visible and had a significant, if not greater, direct effect on the daily life and work of residents of this country than that of county, state, and federal governments.

While local municipalities had issues specific to their circumstances, they shared similar concerns and roles with counties and state governments regarding the management of public institutions. Budgets remained tight for all public officials in periods of economic growth and recession alike. This meant officials found it difficult to free up funding for new ventures, such as deployment of IT, the same problem faced by police departments. The problem proved most acute in small municipalities, where budgets were minimal. As with all governments, including federal agencies, senior public officials knew little or nothing about computers, their value, their cost, or how to manage their use and funding. Scores of observers commenting on the use of computing at the local level considered this problem greater than the lack of appropriate technology or sufficient budget.96 A third commonly shared issue, although not always a problem, concerned the public nature of decisions and actions. Officials needed to demonstrate to the public their wise use of tax dollars; for IT, that often meant improving services for citizens while driving down operating costs. For cities and counties, taxpayers and voters wanted to see improvements in the speed, effectiveness, and accuracy of work done. These criteria proved important in the selection of digital applications. This reality never changed, even once local officials passed through the initial stages of automating such internal operations as accounting, finance, and basic personnel and inventory management functions.

Early Uses of Digital Tools

While public officials' knowledge of computing often diminished the smaller the municipality one looked at, we should not conclude that they lacked interest in using information technology to improve the productivity and effectiveness of their governments. Quite to the contrary, throughout the twentieth century, municipalities of all sizes used typewriters, adding and calculating machinery, billing equipment, telephones, telecommunications, and other devices. In fact, that trend continued unabated after the arrival of the computer. Put another way, officials continued to acquire precomputer information technology equipment as well as computers, the latter, however, only after they became affordable. Obviously, larger communities embraced computers earlier than smaller ones because they could better afford the high initial entry costs into the world of digital computing. But both continued adding mechanical aids to data collection and handling even after computers became available. Reading the professional literature of municipal officials of the 1950s and early 1960s, such as American City, one would think that computers had not been invented.
Billing machines were popular for creating utility (water, sewer, and electricity) bills and both tax assessments and tax bills in communities of all sizes. The largest cities had already integrated various technologies into these processes prior to World War II, while in the postwar period, smaller communities that had not done so often upgraded prior IT tools.97 Their trade press celebrated the further deployment of other accounting applications, such as cash management, payroll, and accounts receivable, touting the increases in speed and accuracy with which existing work streams were conducted.98 Large cities modernized their precomputer systems and added new uses, as New York did with its traffic violations process in the early 1950s.99 As occurred with county and state governments, large cities were extensively committed to using the modern technologies of the day, while midsized and smaller communities did so to the extent they could cost-justify them.100 All, however, installed new equipment and changed their processes for doing work in waves, adding applications of these technologies year over year, to the extent that one writer in 1959 declared, "the general trend toward office automation has been accompanied by widespread and increasing municipal use of electronic and mechanical data processing equipment."101 In fact, communities continued to install additional older technologies until the mid-1960s.102

Large cities took their initial plunge with computers in the second half of the 1950s. New York, with a massive payroll of nearly 200,000 employees, installed a Remington Rand Univac 60 computer in 1957 to improve the speed and volume of work associated with its partially automated earlier payroll system, while the city's Transit Authority installed a Univac 120 the same year to help manage its inventory.103 On the other side of the country, Los Angeles installed an IBM 650 in its Department of Water and Power, initially to handle the payroll of its 11,000 employees, who made up the largest municipally owned water and electric utility in the nation.104 Other cities followed suit. But why did cities begin showing interest in computers? The answer mimicked that of state and federal agencies. As one commentator in 1957 put it, "alarmed by the staggering amounts of paperwork necessary for municipal government operation, city administrators across the nation are looking to electronics to solve their problems."105

The shift to computers by the largest cities came slowly, not becoming a recognizable trend until the 1960s. Officials first needed to learn about the new technology. Next, those who embraced the technology early were asked to explain why. For example, when Boston installed a Univac 60 in 1961, the city auditor, Joseph P. Lally, discussed his city's decision in print. Changes in state tax laws complicated payroll calculations for Boston, making existing technologies too slow and cumbersome to do the work in a timely fashion. Additional accounting applications were now also "speeded 30 to 40%," saving "thousands of dollars in overtime work—about 250 hours in the tabulating department alone, to say nothing of payroll and other departments," while industry publications began educating officials on when and how to acquire computers.106 The complexity of city work became another reason to modify work streams using computers.
A data processing manager at Tulsa, Oklahoma, explained the problem addressed by his city: “As population and services expand, administrators require data systems that
237
238
The DIGITAL HAND, Volume III
provide information to support highly complex decisions on comprehensive land use planning; community and urban renewal; school facilities planning; resource planning for police, fire, and public works departments; and for analysis of services provided by health, building, and sanitation departments.”107

As with state governments, the initial focus rested on accounting and financial applications, and the new systems were installed in those departments, often the centers of whatever deep knowledge municipalities had of prior IT systems, such as tabulators and billing equipment.108 So processing speed and cost avoidance were early attractions for public officials, but by mid-decade, so, too, was the growing capability of computers to handle large volumes of data. By mid-decade, almost half of all cities with populations over 25,000 either had installed a computer, rented time on one at a service bureau, or otherwise were engaged in making decisions to acquire such technology.

Local governments celebrated every installation of a computer in a municipality in the 1960s as a sign of progressive management at work. Scores of public officials rushed into print to announce and laud their new tools. They began reporting how computers were beginning to change their work. For example, village officials at Brookfield, Illinois, reported saving the operating costs of renting precomputer equipment and freeing up labor for other work when they moved billing operations to an IBM system in 1965.109 Officials in New York declared that in addition to saving time by automating purchasing, “the computer enabled the department to start several improvement programs previously not considered possible.”110 Officials in San Jose, California, began using computing to manage the flow of traffic and did not hesitate to laud the benefits of their new system: “The computer has reduced wait time at signal lights by 14 percent, eliminated 50,000 stops a day, and shaved one minute from every 10 previously spent in traffic along the route.”111

To be sure, there were those who criticized the rate of acceptance of computers by municipalities, but they were in the minority. One example, penned by a professor watching the process at work, will have to suffice: “By and large, the nation’s medium sized municipalities have either not faced the computer issue or have been content to blunder into the acquisition of hardware with its use dictated by old wives’ tales.”112 When municipalities did start using their computers, they did so by converting “from existing systems for computer processing.”113

Nonetheless, the variety of uses of computers by both large and medium-sized cities was impressive. Table 7.5 catalogs this growing variety of uses for computing in the 1960s, providing evidence that cities were going beyond using computers for accounting and increasingly employing them in support of work unique to urban centers. On balance, however, during the 1960s cities of all sizes that used computers focused initially on automating or improving various previously semi-automated accounting and financial functions, adding new functions not possible before only in the late 1960s, and more frequently during the 1970s, after migrating practices to computers. Later in this chapter, there is a discussion of the waves of deployment of computers, first to large cities and subsequently into ever smaller communities.
Table 7.5 Municipal Uses of Computers, 1960s

Applications Building on Older Uses of Data Processing:
Payroll
Accounting
Budgeting
Tax billing
Inventory management

Applications New to Computing:
Scheduling traffic light repairs
Road and city engineering
Law enforcement record keeping
Land use planning
Voter services
Planning activities
Traffic control
Online information availability
Source: Various articles in American City, 1960–1967.
But to understand how the nature of work in these cities transformed over time, short histories of several uses of the digital hand critical to city and town governments illustrate the process at work.

Evolution of Accounting and Financial Applications

Myriad accounting activities became the first to be computerized, as one would expect, because these were the uses already most automated and structured before the arrival of the computer. They lent themselves to further structuring because of the characteristics of the new technology: its ability to handle routine, repeatable actions rapidly, accurately, and in large quantity, increasingly over time more affordably than earlier approaches, and using preexisting input and output, such as cards and paper reports. The story mirrors what happened with accounting and financial operations in all industries and other governmental entities. In the 1950s, 1960s, and 1970s, officials processed more, faster, and often less expensively on ever newer, more reliable computers, whether at in-house data centers, through the services of service bureaus, or with data processing systems shared with other governmental agencies. Over time, older, precomputer methods and tools for computation gave way to digital versions. In the 1980s, online systems and the growing use of integrated files made it possible for accounting to be done very quickly across multiple charts of accounts. These systems provided up-to-date transaction data and “to date” running totals through printed reports, or online via a terminal. By the end of the 1980s, such capabilities also were widely available over telephone lines using PCs. Contemporary literature provides overwhelming evidence of this process at work, most notably such publications as American City and academic studies.114

But what was the effect on work? In many ways, use of digital accounting paralleled what happened with telephone usage throughout the twentieth century. Like telephones, people used computerized accounting to speed up or make more efficient existing processes and policies, only modifying them incrementally as
specific characteristics of machines, software, telecommunications, and changing managerial practices and policies warranted. But again like the telephone, the technology generally complemented, rather than displaced, prior practices.115 To a large extent, this pattern can be attributed to the fact that the accounting and financial practices of the nation were governed more by law than by the proclivities of some modern technology. The technology did, however, make possible the incremental integration of what used to be separate and distinct operations, payroll being one instance, receiving fees and taxes another, and so forth, bringing them all together over time. By the end of the 1980s, these had evolved into integrated systems. In turn, that situation made it possible for management to have a better and more current understanding of how funds were flowing in and out of government; they could track transactions in a more automated, current, and accurate manner while driving down the costs of these activities. They were also better able to plan future activities involving budgets and cash flows effectively. The ability to manage integrated flows of transactions, and to plan, represented a fundamental, even quantum, change in fiscal operations by the end of the 1980s over what they had been in earlier decades. This general observation applies as well to counties and state governments, although far less to the large, highly siloed, often poorly integrated federal accounting and financial systems, which even at the dawn of the new century mimicked many of the systems installed in earlier decades and since left behind by state and local governments.

These patterns of use involved all local uses of accounting, including those unique to local communities, such as billing for utilities, special tax assessments for repairs of sidewalks, and budgets needed by fire departments, emergency medical services, police, and even dog catchers. In short, by the end of the 1980s, accounting at the local level was sophisticated, easier to perform than in earlier decades, and iterative. In addition, communities found that they could track expenditures (transactions) more precisely, thereby enhancing their ability to audit “the books” more easily and less expensively as time passed. Accounting systems improved both their coverage of accounting and financial applications and the quality of that work, notably its accuracy. Coming back to the analogy of the telephone: just as the earlier communicating device reinforced personal relationships, that is to say, how people had long interacted with each other, so, too, the computer reinforced the growing dependence of public officials and employees on data and accounting information with which to run their departments and municipalities, a process under way since the dawn of the twentieth century. At both the municipal and county levels, it would have been hard to imagine publicly held meetings of even small town councils or county boards of supervisors without the ubiquitous spreadsheets and budget items on the agenda.

Computing and Public Works

An important function in any municipality, from tiny to big, is the maintenance of various infrastructures. These include water works, streets, public buildings,
sewer lines, traffic lights, street lamps, roads, and sidewalks. A great deal of this work involves engineering, construction, the scheduling of work, and the charging of expenses to municipal budgets. After payroll and education expenses, public works normally represents the third largest collection of expenditures of money, personnel, and management’s time. What constitutes public works also varies from one municipality to another; nonetheless, all share similar roles. Management has to acquire supplies of an enormous variety, from asphalt and pipes to paper and pencils; manage work crews whose skills range from water and sewer engineering to carpentry and garbage collection; and plan expenditures ranging from capital investments (such as for buildings and vehicles) to myriad payrolls and operating expenditures (such as for fuel or subcontractors). In some communities, public works departments also conduct transactions with the public (such as requests for repairs of sidewalks and street lamps). In short, all are functions at the core of what municipalities do in addition to running schools and police departments and maintaining public safety. So it would be no surprise that as cities and towns began using computers, they would attempt to use these new tools in public works.

The earliest applications were accounting oriented, specifically inventory control, purchasing, and payroll. That set of applications began spreading to the largest cities in the late 1950s, next across midsized communities in the 1960s and 1970s, then to almost all the rest in the 1980s and 1990s. Beginning in the 1960s, larger cities also went beyond accounting to create tracking systems for water-meter reading, evolving over time from the punched-card systems of the 1960s and 1970s to handheld units that could turn over machine-readable data to computers in the 1980s and 1990s. Water works in particular were early adopters of computers since they had thousands or millions of customers whose water usage they monitored, usually on a monthly basis, and who were billed, just as electrical utilities and telephone companies billed their customers.116 Scheduling work became a crucial application in the 1970s to optimize the use of labor, beginning with the collection of refuse but later extending to road, sewer, and other maintenance activities, and to monitoring the quality of water and the nature of sewage.117 Scheduling the use and maintenance of equipment and vehicles became a popular candidate for automation in the 1970s and 1980s in a successful bid to control costs and to extend the life of existing vehicles and other equipment.118

As the cost of CAD software and hardware began to drop in the 1970s and 1980s, municipal engineers started to use this technology for the same reasons evident in nearly all manufacturing industries. Various alternative designs for work on roads and buildings, for example, could be modeled and then committed to, and work schedules produced and priced. As high-performance workstations and PCs became available with CAD software in the 1980s, even smaller communities were able to justify use of this application. The “take-off” in the use of this application occurred in the second half of the 1980s.119 By the mid-1980s, about a third of all cities and towns were using computers in support of the maintenance of utilities and police and fire equipment, while parks managers were just beginning to do the same.120 Even the management of groundwater in such wet parts of the United States as Florida was being monitored with
computers. Thus, by the early 1990s, use of computing had spread widely across the nation’s municipalities. However, it should be noted that until the arrival of application software products and less expensive hardware in the 1970s and 1980s, adoption had been largely limited to big cities that could afford to write mainframe-based applications that could be used cost effectively by large staffs. In addition, many public works supervisors remained wary of using computers until the 1980s, having grown up in their professions without digital tools, unlike many of their colleagues in accounting and finance, who had relied on the digital hand for over two decades.121

Computing and Traffic Control

Nothing seems as local as the management of traffic, operating traffic lights, repairing streets, and removing snow. Traffic control using computers began in the 1960s, and by the end of the 1970s it had become highly computerized in many sizeable communities. Bill Gates, cofounder of that icon of late-twentieth-century American business, the software firm Microsoft, even wrote software for this function while in high school. Years later, he recalled that he and his friend Paul Allen (cofounder of Microsoft) figured “out a way to use the little chip to power a machine that could analyze the information counted by traffic monitors on city streets,” the little rubber hoses across streets that were still in use even in the new century. Before computers, vehicles drove across a rubber hose, with an electrified wire in the hose sending an electrical signal to punch-paper tape housed in a little box on the side of the road. Gates and Allen devised a way to have a computer chip read the tape and produce charts and data. This led to the creation of his first company, Traf-O-Data, in the early 1970s; he recalled, “At the time it sounded like poetry.”122

By the end of the 1960s, specialized computer equipment began appearing in large and midsized municipalities all over the country to monitor traffic flows. These systems adjusted traffic lights to improve the movement of vehicles by relying on various types of sensors that cars drove over; the sensors fed data on traffic volumes to software, which in turn sent instructions to the lights to change color. Prior to such systems, employees had to be stationed on the side of a road to observe traffic and manually adjust traffic signals; often these individuals were police officers. Ever smaller communities installed such systems all through the 1970s and 1980s. This use of computers also provided data to officials involved in the much larger mission of planning which roads to repair or expand, and in support of other land use decisions, often linking traffic data to such other digital applications as GIS and budget modeling. County and state highway officials used this type of information for similar purposes; both groups often collaborated on future plans for road construction.123
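The core logic of such volume-actuated signals can be suggested with a small, hypothetical sketch. Nothing below reproduces Traf-O-Data or any actual controller of the era; the counting rates, timing constants, and function names are invented for illustration only:

```python
# A minimal, hypothetical sketch of volume-actuated signal timing.
# Real controllers of the 1960s-1980s were special-purpose hardware;
# all constants and names here are invented for illustration.

MIN_GREEN, MAX_GREEN = 15, 60  # seconds of green time allowed per approach


def green_time(vehicles_per_minute):
    """Give a busier approach more green time, within fixed bounds."""
    # Assume 2 extra seconds of green per vehicle counted per minute.
    return max(MIN_GREEN, min(MAX_GREEN, MIN_GREEN + 2 * vehicles_per_minute))


def split_cycle(main_count, cross_count):
    """Divide one signal cycle between the main road and the cross street."""
    return green_time(main_count), green_time(cross_count)


# Counts as a road-tube counter might report them at rush hour.
print(split_cycle(18, 4))  # -> (51, 23): the main road gets the longer green
```

The same counts, accumulated over months, served the planning uses described above: deciding which roads to repair or expand.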
Geographic Information Systems (GIS)

GIS became one of the most important uses of computers by local and county governments by the 1990s, one used extensively as well by state and federal agencies, and one that also became useful to private sector firms by the end of the century. GIS was poised to become available to citizens over the Internet by the end of the first decade of the new century. Developers of this software added continuously to its functionality while users found myriad applications for it. GIS is also one of those uses of the digital hand that became increasingly affordable as the cost of IT dropped and hardware’s capacity grew to handle larger files and more complex calculations. By the end of the century, nearly as many departments in a municipality or county government of any size could use this digital tool as had long been the case with accounting and budgetary software. We could have discussed GIS earlier, when reviewing uses of IT by state or county governments. However, because large and midsized cities were the earliest to adopt GIS, it made sense to emphasize the central role this application played in city government. Keep in mind one other trend evident by the 1990s, namely, that municipalities and county governments collaborated increasingly on land use strategies and road planning by sharing GIS data and even working with one copy of the software nested in a shared computer.

But first, because of its various applications and forms, we need a working definition of GIS. A useful one defines GIS as software with information that “describes the locations, characteristics, and shapes of features and phenomena on the surface of the earth.”124 For nearly two thousand years, urban officials have used maps as tools for planning the maintenance and development of their communities. Since the early decades of the twentieth century, American communities used paper maps and later clear Mylar sheets with thematic data, such as one sheet showing just buildings and roads and another that could be overlaid on the first to picture the location of water and sewer pipes, and so forth, thereby revealing ever increasing amounts of detail about a particular area. Over the years, communities added the information they collected to these map sets in layers, so to speak, because one paper map could never hold all the desired information or present it in an easily understood form. GIS software automated that prior use of maps, making it possible to add and delete information relatively quickly, to model possible changes to the infrastructure, and to share information either on a laptop in a city truck whose crew is trying to identify the source of a water leak, or with urban planners meeting in a conference room to assess various scenarios for designing the renewal of a blighted community.

The earliest work on the development of GIS tools began in Canada in the 1950s and 1960s but soon also took place at the U.S. Bureau of the Census with its collection of demographic data (TIGER files) and at the National Geodetic Survey. Additional work in the 1960s and 1970s at various American universities and by a few state governments added to the body of software and knowledge about this application of the digital hand. The earliest uses of GIS involved collecting demographic and census data on where people were located (the focus of much federal government work) and documenting the locations of parcels of land to help communities in the sale and tax assessment of properties. By the end of the 1970s, utility companies and local governments were adding facilities and infrastructure to these layers of data, such as the location of underground pipes and
electrical wires. Subsequently, they added information about specific buildings and other above-ground infrastructures. As one could imagine, the amount of data required in digital form far exceeded what one needed to supply a digital application in accounting or word processing. Early GIS systems of the 1970s proved expensive, often costing over $1 million to acquire (hardware and software), and also costly to populate with data. As time passed, however, these systems declined in cost, particularly in the 1980s, when the price of hardware dropped dramatically and the capacity of computers grew sufficient to hold these very large files.125 By the end of the decade, cities and counties all over the United States began equipping their staffs with this new use of the digital hand.126 As with every new digital application, in the late 1980s hype preceded wide deployment: “The trend toward geobased mapping is taking municipalities nationwide by storm, as the equipment becomes more widely available and prices begin to decrease.”127 Trade publications of the late 1980s began documenting the installation and use of GIS, making it clear that large cities and counties were again the first to use this new application, followed by waves of ever smaller communities.128

The earliest users in the 1970s and 1980s concentrated on property recording, tax assessments, environmental management, and regional or local development. One of the first surveys of the deployment of GIS, conducted in the early 1990s, demonstrated that counties and municipalities generally used GIS for similar purposes. In descending order of use from highest to lowest, these included demographic mapping, land parcel mapping, facility management, planning, and zoning. But already in that period, uses that became widespread in the 1990s included land use, natural resources analysis (such as hydrographic mapping), transportation modeling, capital improvement planning, and permit tracking, to mention a few. Over 10 percent of all communities used GIS by the late 1980s;129 by the end of the century, over half did.

By the late 1980s, communities began integrating separate digital maps and data into larger views of their communities. In the 1990s in particular, departments within a local government collaborated more, whether out of the necessity of sharing information or out of desire, since coordinated work could now be done more effectively than in the past. These activities included, for example, coordinating the repaving of a street with the replacement of underground pipes, electrical wires, and TV and telephone cables.130 New users now included law enforcement, fire departments, social welfare agencies, and school districts tracking the ebb and flow of students through communities as they came and went, or grew up.131

Thus, one could see GIS applications evolving through three general phases at the county, city, and even state levels: a first phase of applications that provided inventories of existing structures, infrastructures, and parcels of land; a second phase in which officials used GIS tools to analyze and plan activities (such as renovations of communities and economic and social development using “what if” scenario planning methods); and a third phase, clearly evident by the early 1990s, of using GIS to perform and monitor managerial and operational activities, much like project management software. Investments in
GIS became some of the largest in the public sector, reaching annual expenditures of over $730 million by the end of 1992, a figure that continued to grow through the rest of the decade. All communities of over 100,000 residents used GIS applications fairly extensively by mid-decade.132

Because we will not discuss this application any further, it makes sense to take its history briefly into the early 2000s. Local systems became increasingly linked to county and then to state GIS databases, and new layers of information were added to account for changing needs and the replacement of existing buildings and infrastructures, but coverage also expanded to include rural areas, such as mountains and vegetation. GPS sensing became available in the mid-1980s, but not until later did this system of satellites augment the accuracy of mapping. Satellite photography added more details such that, by the early 2000s, cities even had photographs of the exterior of one’s private home and street tied into their local GIS databases. Reports from the early years of the new century documented the extensive integration of GIS software into a vast array of planning and operational functions of all local governments. When the Internet became a viable tool for local government (discussed below), communities began making GIS data available online, a tool useful for building contractors and other firms doing work for municipalities, such as utility and construction companies.133 By 2005, the most extensive users of GIS in local government were administrative services, code enforcers, those involved in community development, public works, law enforcement, school districts, and regional governments. Uses had spread to a variety of functions such as agriculture, environmental management, health and human services, homeland security, land records, law enforcement, public safety, elections, economic development, library management, mapping of sex offenders, public utilities, transportation, telecommunications, water resources, and water and wastewater management.
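The layered structure just described can be suggested with a deliberately tiny sketch. The layer names, coordinates, and records below are all invented, and real GIS packages store geometry and attributes far more richly; this is only meant to show why stacking thematic layers over shared coordinates made overlay questions easy to ask:

```python
# A minimal, hypothetical illustration of GIS "layers": several thematic
# data sets keyed to the same map coordinates. All records are invented.

layers = {
    "parcels":     [{"id": "P-101", "loc": (3, 4), "owner": "Smith"}],
    "water_mains": [{"id": "W-17", "loc": (3, 4), "installed": 1954}],
    "streets":     [{"id": "Elm St", "loc": (3, 5)}],
}


def features_near(point, radius=1):
    """Overlay query: everything in every layer near one map coordinate."""
    px, py = point
    hits = {}
    for name, features in layers.items():
        near = [f for f in features
                if abs(f["loc"][0] - px) <= radius
                and abs(f["loc"][1] - py) <= radius]
        if near:
            hits[name] = near
    return hits


# A repaving crew asking "what sits under and around this block?"
print(features_near((3, 4)))
```

Stacked this way, a parcel map, a utility map, and a street map answer a single question together, which is the property that made the coordinated repaving work described above practical.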
Deployment of IT in Municipalities, 1950s–Mid-1990s

A brief discussion of the patterns of deployment of the digital hand over a long period of time brings sense and order to the story of how so many thousands of communities of different sizes and personalities came to embrace computing. At the outset, we should acknowledge that many other applications of IT were not discussed above, such as the expanding role of telecommunications and voter systems, because, while important, they do not add substantially to our understanding of how municipalities embraced computing. In the 1950s, the largest cities in America not only could best afford computers but also had the biggest problems to solve. An industry magazine reporter writing in 1957 observed that “alarmed by the staggering amounts of paper-work necessary for municipal government operations, city administrators across the nation are looking to electronics to solve their problems.”134 By the mid-1960s, officials had come to understand that computers could do more than accounting and billing, that “the great advantage of EDP” lay in its “ability to record large quantities of data and to readily provide this information in a form useful to
decision-makers.”135 Then the hunt was on to identify potential uses, cost justify them, and implement them. Surveys of municipalities indicated that by the end of the 1960s, just over half used EDP in some form, whether with their own systems, with systems shared with other agencies, or through a service bureau. Cities with over 500,000 residents spent on average some $1.8 million a year on EDP, while cities with populations of 25,000 to 50,000 residents spent as little as just over $40,000.136 During the 1970s, all large cities used computers, and the story then was about their expanding portfolio of uses. The smallest local governments were in no rush to embrace computing, largely because of the technology’s relatively high cost and their lack of technical staff to implement these new devices and software.

By the early 1980s, the most widely deployed applications, in descending order of use, were payroll, accounting, budgeting, utility billing, tax assessment, tax billing, personnel, law enforcement, inventory, and voter registration, ranging from over 85 percent of municipalities using IT for payroll to 16 percent for voter registration.137 Other surveys in the mid-1980s confirmed this general pattern, although with slightly varying statistics; the trend was clear and obvious. New uses were added, particularly as deployment of PCs occurred in the 1980s. These applications included word processing, financial planning, and even fleet management. As commercially available software packages targeted at local governments became increasingly available in the late 1980s, use of these smaller systems by small communities expanded rapidly.138

A Gartner study done in 1989 looking at the number of workers per workstation (including PCs) provides surrogate statistical evidence of the relative deployment and availability of IT tools to workers. In state government, there were 3.79 workers per workstation, while in county and municipal agencies there were 6.10 employees per machine. Federal workers appeared to be the most automated, as they had 2.56 workers per machine.139 One other contemporary survey reported that “virtually” all cities and counties used computers, and if one added federal usage, this resulted in the deployment of over 450 different applications of IT. As with state and county governments, however, local communities implemented most applications to reinforce existing functions and roles. So no major changes in organizational and managerial power occurred as a result of using computers.140

Recent Trends

Cities and towns continued to embrace new technologies, such as the Internet, integrating them into their daily work. Officials increasingly knew to do this carefully, because successful diffusion of any IT tool required technical expertise to implement and maintain, a lesson many learned again when they went through two to three generations of Web sites in the late 1990s and early 2000s. Many came into the late 1990s with myriad systems, ranging from large mainframe applications to relatively new ones, such as those housed in PCs or GIS systems. In recent times, PCs have played an important role in the lives of public officials. As two experts on IT in the public sector recorded, “the PC revolution came
rapidly and recently. It began less than a decade ago for most cities, and unless in the context of a centralized system, PCs usually came with few of the supports provided with the earlier technology”; instead they came with “general purpose packaged software” hardly tailored to municipal needs and with “minimal support staff.”141 In 1995, a team of academic experts noted that the managerial and political environment officials operated in was not necessarily conducive to further adoption of IT:

A number of major changes have taken place in the United States in the past decade that have heightened the tension inherent in the dilemma: the aftermath of expanded services provision spawned by federally supported programs; the cutbacks in funding sources for urban governments in many locales; a generally skeptical and critical attitude of citizens toward government; and a tendency to push an increasing number of responsibilities that had drifted to the state and federal levels back down to the local level.142
Meanwhile, as officials were still trying to figure out how best to use PCs and such new applications as GIS, industry reporters and citizens were asking municipalities what role they expected the Internet to play, including their participation in the “Super Highway” initiative of the federal government, more elegantly known as the National Information Infrastructure (NII).143 Between roughly 1995 and the start of 2000, municipalities had three sets of issues to deal with concerning IT. The first was the normal deployment of new uses of IT and more current technologies, much along the lines described earlier about their activities in the 1970s through the early 1990s. The second concerned the immediate problem of Y2K, while the third, and most convoluted, involved a host of telecommunications issues, ranging from understanding what they could do after passage of the Telecommunications Act of 1996 to how to respond to the arrival of the Internet.

Y2K can be dispensed with quickly. Local governments were not immune to the issue; they read about it in their industry publications and heard about it at conferences, with interest building in 1997. They succumbed to the hype; for example, an industry “expert,” Peter de Jager, appeared in American City and County predicting “widespread failures,” although in fact that did not happen, since Apple and many Microsoft-based software products were sufficiently upgraded in time by the vendors, as also occurred with much mainframe software.144 Yet industry publications kept reporting that cities were not ready for Y2K, and many communities busily worked to remedy their old systems.145
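The defect those communities were remedying is easy to show in miniature. The sketch below is hypothetical (the function names are invented); it simply illustrates the two-digit-year arithmetic behind the scare, together with the “windowing” style of fix that many remediation projects of the era used:

```python
# A minimal, hypothetical illustration of the Y2K defect. Many older
# systems stored years as two digits to save scarce storage space.

def age_in_years(birth_yy, current_yy):
    """Age calculation exactly as a two-digit-year system performed it."""
    return current_yy - birth_yy

print(age_in_years(55, 99))  # 1955 to 1999 -> 44, as intended
print(age_in_years(55, 0))   # 1955 to 2000 -> -55, the Y2K failure

def age_windowed(birth_yy, current_yy, pivot=30):
    """A common remediation ("windowing"): treat YY values below the
    pivot as 20YY and the rest as 19YY, avoiding full file conversions."""
    def expand(yy):
        return 2000 + yy if yy < pivot else 1900 + yy
    return expand(current_yy) - expand(birth_yy)

print(age_windowed(55, 0))   # -> 45, correct across the century boundary
```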
Simultaneously, officials had to concern themselves with the very complicated Telecommunications Act of 1996, discussed more fully in volume 2 of The Digital Hand.146 Concerns during congressional work on the law focused on the effects it might have on public rights-of-way (ROWs) and on zoning, particularly regarding TV cable systems, rights that the law generally protected. However, beginning in 1996 and extending to the end of the century, officials waited for the Federal Communications Commission (FCC) to publish enabling regulations regarding TV, cable, and telephone, all matters important to large and midsized metropolitan communities.147
In addition, there was the Internet. Communities began looking at how to use this new form of telecommunications in the early 1990s. Some fifty communities had established networks to provide citizens with information prior to the availability of the Internet, but this small body of experience had little influence on the thousands of other municipalities that were just learning about the Net at the same time as their citizens.148 Early adopters concluded that the technology would make it easier to provide citizen-centered services, and their installation of Web sites captured the attention of the industry press,149 spurred on in some cases with federal funding.150 As local governments approached 2000, many turned their attention to Y2K, momentarily putting aside Internet-based projects, such as those that could enhance purchasing or the delivery of information to citizens.151 Nonetheless, large cities and many midsized communities established their presence on the Web by the end of the century, in part because citizens who had learned to buy goods online twenty-four hours a day, seven days a week, wanted to do the same with their governments, such as paying for parking tickets. Thus, pressure on officials came from citizens rather than from the more traditional sources of requests for IT: municipal staffs and industry associations.152

With the Y2K scare behind them, although now saddled with a sagging economy and the clear expectation of shrinking tax revenues, governments began to move into an era of extensive use of the Internet. Surveys indicated that citizens were pleased when their local officials set up Web sites. Those municipalities that did so also found that they had to reengineer processes and add resources to maintain these sites. In a survey of nearly 1,500 responding municipal governments, some 85 percent said they had a Web site; it is difficult to conclude, however, whether the municipalities that had not responded to the survey did too. Nonetheless, the evidence indicates that many had Web sites and that this number had increased by at least 50 percent from an earlier survey conducted in 1997.153 Very few provided any interactive e-government services, largely because they knew so little at this point about e-commerce and had concerns about security and privacy.154 Observers noted, however, that as officials learned about this new use of IT, the technology was energizing local governments into improving and changing old operational practices and creating a more collaborative environment involving firms and citizens, much as happened among departments using GIS.155 That change over earlier practices represented the start of a significant transformation in how local governments worked that should not be minimized or ignored.

Applications paralleled those at county and state agencies and were of three general types: government to citizens applications (G2C), such as making forms available, e-mail, and applying for permits online; government to business (G2B), such as some shopping, corporate tax filing, and acquiring permits and licenses; and business to government (B2G), such as simple purchases.156 But as of 2001–2002, these uses were in their embryonic stage of development. Nonetheless, it was already becoming evident that a paradigm shift was under way similar to what was occurring in state and county governments, away from a largely bureaucratic, inwardly facing perspective for doing work and
increasingly, if slowly, toward a citizen-services-focused view of work. This subtle change was first noticed in medium and large cities, particularly those on the West Coast that had computer-savvy populations and government officials.157 Adoption of this evolving way of working continued to be impeded by a lack of sufficient technical staff and budgets, perhaps by cultural inertia, and by the need to redesign internal processes in order to optimize use of the Internet.158 By 2003, however, it was becoming evident that public officials were overwhelmingly inclined to use the Internet, bolstered by the positive early experiences of others, effusive press accounts of successes, and wide support from citizens who themselves were becoming extensive and enthusiastic users of the Internet. Early adopters were also going through a second round of upgrades of their Web sites, to which they were adding e-business functions, such as the ability to conduct some transactions with local government.159

Two important applications emerged that proved very relevant to municipal governments: procurement and building permits. In one important survey done in 2004, 75 percent of the responding municipal governments reported using electronic forms to speed up the process by which firms and contractors could obtain building permits. This application supported the generation of revenue and, of course, economic development in a community. Purchasing online also proved popular, with 77 percent reporting relying on the Internet in support of this use. Online bill payments, however, although up year over year, remained at about 40 percent, and even then only for limited items, such as parking tickets and parks and recreation passes. Online applications for city jobs also grew, from only 28 percent of cities making that possible in 2001 to 44 percent in 2004. On balance, we can conclude that the “takeoff” in use of the Internet by municipalities occurred sometime after 2001.160 As this chapter was being written in 2006, wireless connections to the Internet for citizens and to municipal Web sites were just beginning, creating a whole new service and source of fees a community could charge. More important, officials saw quickly that having these “wi-fi” connections would make their cities attractive communities for high-tech companies to locate in and attractive places for computer-savvy citizens to live.161
Conclusions

A team of experts who had long observed the workings of state and local governments concluded in 1993 that over half the public sector managers they had interviewed for a survey found “themselves as very dependent upon computing,” and two-thirds concluded that the digital hand was proving important in carrying out their work.162 The dozens of surveys that preceded this one, and others done over the next decade or so, consistently confirmed that public sector managers and users enjoyed the same benefits and faced similar challenges in using the digital hand as those evident in many industries. They also avoided many of the difficulties faced by very large users of IT in the federal government, particularly in the 1970s and 1980s. Unlike so many of their federal colleagues, at all levels
of local government, officials had either newer systems that were not in dire need of replacement or systems small enough that they could find the necessary wherewithal and technical staff to keep them current. In the case of small state, county, and municipal governments, the reason lay elsewhere: they just did not yet have computers integrated into the core work of their departments. When these smaller agencies computerized work, they did it on more modern systems, often using software designed by vendors for specific applications in public administration, or shared services put together by governments more experienced in these matters, such as state agencies or very large counties and cities.

State and local governments shared with federal departments and agencies, and with almost every private sector industry, a technological resurgence, beginning in about 1997–1998, with their adoption of Internet-based services. Perhaps because the costs of Internet applications were often lower, and the risk of failure minimized by the incremental nature with which Web sites (and their services) were created, one senses the kind of enthusiasm and optimism for improved productivity in governmental operations experienced when the first mainframes appeared in the American economy in the 1950s and 1960s. Press hubris was similar in both periods; managers rushed into print to brag about their newly installed systems and how much they expected of them, while citizens were just as enthusiastic and hence politically supportive of “e-government.”

For many local governments, education represented the largest single line item in their budgets, and often also their biggest group of employees. Therefore, to complete our understanding of the role of the digital hand at the local level, we need to turn our attention to students, teachers, and their school districts. Theirs is a different story from the one just told.
8

Digital Applications in Schools

Computers have become one of the expected trappings of today’s classroom, and schools have exhibited an insatiable appetite for hardware; but systemic curricular integration of computers is still more of a promise than a reality.
—William R. Jordan, 1993
No aspect of American society seems to engage so many individuals, industries, and communities as schools. Children, their parents, community leaders, business executives, teachers, public officials, techno-enthusiasts, and naysayers have made the world of schools intensely public and a matter of broad concern. Only wars have drawn greater public attention. The reason for education’s public nature is easy to find: training future participants in how to succeed in America’s democratic society and to thrive in an “advanced” economy is central to the mission of American schools. The proverbial “everyone” has a perspective, and most have personal experience with the American system.1 So it should be no surprise that as American society embraced computing and telecommunications, a major focus for this kind of activity would be the schoolhouse. From preschool through high school, students, teachers, and administrators came face to face with computers, raising questions about how best to use this technology, how to fit it into the fundamental missions of schools, and how it could best support teaching and learning.

While billions of dollars have been spent on IT in education, this sector of the economy (and society) has been the subject of much debate and controversy, with pro-technology enthusiasts touting the benefits of IT and, in the minds of an equally loud community of critics, overselling them. However, the epigraph opening this chapter typified the experiences of many educators, this one from a teacher in Florida who had pioneered use of computing in his school.2 The
debate about the benefits of using such technology is vociferous and unresolved despite what has clearly been a massive investment in IT in education. The debate has also spun off as many articles and books as the literature on all manner of computing across all of the public sector agencies discussed in this book combined. The result is a conflicted, still ambiguous story about the role and value of information technology in education. We can conclude with far less certainty that IT created a new style of operating in education, as it did, in more certain if varying forms, in such areas as the administration of government agencies and the military. It is that lack of conversion to a new style of teaching that makes the story of education somewhat different from the experiences of other public institutions, although much of the story told in this chapter will sound familiar: the introduction of computing, rates of deployment, and applications of the technology. However, we will also see that in this industry, as in all others (public and private), technology is embraced and effective when it supports and aligns with the core work and values of its users. The extent to which IT did or did not do so in education is where the core lessons lie for our understanding of how computing affected teaching and learning. As a distinguished student of the history of education and its use of various technologies recently pointed out: “Without a critical examination of the assumptions of techno-promoters, a return to the historic civil and social mission of schooling in America, and a rebuilding of social capital in our schools, our passion for school-based technology, driven by dreams of increased economic productivity and the demands of the workplace, will remain an expensive, narrowly conceived innovation.”3 In short, Larry Cuban reminds us that the story of computing in education is both about the continuous adoption of IT and about the consequences of such actions, which are not as clear, or even as positive, as in other parts of the public sector—and this forty years after computers were first used by educators.

Understanding how education fits into the broader story of the public sector’s use of computing is thus essential to appreciate, and it is the central focus of this chapter. To accomplish that task, we need to view applications and deployment as tactically as in other chapters, through the organizations that acquired and used these technologies. In the case of education, its world is divided into two fundamental halves: administration and teaching. Administration concerns the managerial staffs in schools, such as principals and the people we, as students, saw working in offices or in support roles within school buildings, along with the staffs employed in school district offices. They hired and fired teachers and staffs; built and maintained school buildings; acquired textbooks; set, spent, and monitored budgets; and collected grades and tracked academic performance.4 Teachers did some of this too, grading and reporting student performance while communicating with their principals and parents. The second half of the educational organization—and the most populous—was the collection of teachers and students most directly involved in the actual learning activities. That community organized into essentially three types of schools: preschool through elementary (kindergarten through roughly the third or fourth grade),
middle school (normally fourth through seventh or eighth grade), and high school. Each dealt with different pedagogical issues, such as children of different ages and social backgrounds, and each varied in size and, correspondingly, in budget. In short, while we will generalize frequently about the deployment of computers across schools, it is important to note that specific uses of computing varied between what a five-year-old could do and what a sixteen-year-old did. It is an important observation to keep in mind, because if this chapter were a full-length book, we would have to discuss in detail how computing affected teaching in each grade and subject, something that space does not permit here. Furthermore, there is a vast literature on this theme of differences.5

To complete the statement of scope for this chapter: it focuses only on public education, that is to say, on schools run by local governments and funded by taxes. These represented the overwhelming majority of schools in America for the entire half century in question, and while private schools were also extensive users of computing, their experiences either mirrored those of public schools or were unique enough that discussing them would take our focus away from what happened generally across the American educational landscape. Finally, keep in mind that the chapter after this one tells the story of computing in higher education, thereby further rounding out our discussion of the digital hand in education.
Computing in the Administration of Education, 1950s–2000s

When one thinks of computing and education, one turns immediately to the use of IT by teachers and students in classrooms. However, the use of IT in the administration of education proved far more influential (so far) than technology in teaching. Deployment in administration mirrored much that took place in the deployment of computing and telecommunications in many other public sector agencies and in the private sector. Indeed, despite an enormous body of literature and debate about computing and teaching, the technology had less of an effect on teachers than one would have been led to believe.6 When looking at all the participants in education—students, parents, teachers, school administrators, and district staffs—the least amount of digital activity occurred in the classroom, although at home teachers used computers much as parents and students did, while school administrators of all stripes relied on the technology to do their daily work. For that reason, we start our discussion with the most important application of information technology in education, on the administrative side of the equation.

It should be no surprise that the largest school districts in the country were often the ones that most used precomputer information technologies to track students, schedule classes, manage payrolls, accounting, budgets, purchases, and inventories (such as furniture and textbooks), account for grades, and assign students to specific schedules, teachers, and classrooms. Paul Serote, in
charge of data processing for the Los Angeles City Schools, the second largest school district in the United States in the 1950s and 1960s, ran an operation that had been an extensive user of precomputer information technologies and, when computers arrived, became an early user of them. He described the case for using IT in support of his 700 schools:

Education is big business, and you realize this when you look at your property tax bills and see that the largest portion goes to education. Behind the scenes in the education of a child is a vast machinery involved with the construction of buildings to house the student; instruction of teachers and the eventual selection of teachers to teach the students; purchasing of supplies, equipment, desks, furniture, laboratory equipment, magazines, periodicals, books, etc.; planting of grass, the development of athletic fields, the maintenance of such fields, and the cleaning of the classroom.7
He went on to describe the various functions of any school and district, arguing that the collection of functions was as complex as that of a large business, which was why his district had become such an extensive user of computers in most functional areas.8 When computers became available, superintendents in particular, but some principals as well, began looking at this new technology to determine what value it might bring to the administration of education.9

While teachers began using computers in measurable quantities to assist their teaching in the 1970s and 1980s, administrators had been relying on all manner of IT since the early decades of the twentieth century in support of office work. As in other public sector agencies, they adopted ever newer forms of office equipment as these became available. Similarly, they also had high expectations that technology could improve their productivity, thereby freeing them to do the core work of educators. Take the very typical example of the high school in Evanston, Illinois, in the late 1950s, which acquired an IBM 402 Accounting Machine—not yet a computer but a calculator—which it used for a variety of accounting applications. Its officials had high hopes for the machine: “Punched card processing equipment handles clerical problems at Evanston Township High School with such speed and efficiency that school administrators have time to give more personal attention to students, especially during scheduling difficulties.”10 This case is also a useful reminder that before computers, applications of information processing equipment had clarified what uses of IT made sense, so that when computers came along, administrators had learned through prior experience with “office appliances” what potential uses there were for the new technology. That circumstance goes far to explain why the largest school districts were the first to enlist the help of the digital hand. In Evanston’s case, its 402 mimicked what computers were used for years later: to schedule classes efficiently, thoroughly, and accurately, to track grades, and to produce statistical reports, all quickly and with fewer employees.11 For many of the reasons experienced by other public sector agencies, however, most schools and school districts could not cost justify use of computers until the 1960s or 1970s, but by the mid-1970s,
almost all large and even midsized school districts used computing for accounting, administrative, and statistical applications. They acquired their earliest computing in one of three ways: through work done at a single school that spread to others, as in Evanston’s case; through school districts that installed computers and implemented standardized administrative functions in their schools (the case of all major urban school districts); or by using service bureaus, often made available by state or county governments. The earliest uses involved payroll, accounting, and financial reporting, just as across most public and private sector organizations. It was also largely a story of the 1960s and early 1970s. The next wave of uses was administrative, involving personnel records, inventory control, class rolls, recording and reporting of grades, and student scheduling. In the 1960s, data entry involved punched cards, but by the 1970s other input devices were used, such as magnetic cards and online terminals, while data storage had moved from cards to magnetic tape and later to disk storage. As computers became easier to use and software packages designed for education administration appeared on the market, larger school districts embraced systems such as IBM’s System 360 and 370. The evolution of technology also allowed school districts to take independent applications and integrate them into accounting or student systems, largely beginning in the 1970s.12

Lest we overstate the use of computing, professors studying K–12, writing in the mid-1960s, observed that “primitive paper and pencil techniques still prevail for most educational information processing,” while use of computing remained “limited” and its application to teaching “constitutes a frontier being explored only at the research and experimental levels.”13 Even as late as 1965–1966, uses involved business accounting, student records, and general administration. In states like California and New York, and in large cities such as Chicago and Philadelphia, these applications became increasingly available during this decade. Chicago’s IBM 305 RAMAC received considerable publicity as a state-of-the-art system; later, other role models emerged in Los Angeles, Atlanta, and elsewhere. One use of computers in very large school systems of particular interest in the 1960s concerned maintaining student census records to project changing demands for classrooms and teachers by grade and neighborhood. That function facilitated planning for class schedules, the types and quantities of teachers needed, budget forecasting, and even the planning of bus routes.14

Inevitably we must ask, how many school districts actually used computers in the 1960s and early 1970s? One survey done in 1972 provides some insight. Of 12,400 respondents to the survey, just over 30 percent reported use of computing for administrative functions, while just over 65 percent had yet to use computers at all. Less than 4 percent acknowledged teachers using computers in instruction.15 Substantially increased use came in the 1970s, again more in administrative applications than in teaching. However, all the studies and surveys from the early 1970s reported that uses begun in the 1960s continued into the 1970s, and that applications increasingly moved from card tub files to direct access memory systems, which were also essential for the use of CRTs.16 With the
With the evolution of digital technologies in the late 1960s and 1970s, school districts endorsed the concept of management information systems (MIS), which called for integrated systems and the management of IT through centralized operations, embracing a trend evident across many industries and government agencies at that time. Information processing managers and the administrators for whom they worked began integrating systems that planned programs in school districts, accounted for monies spent and budgeted, tracked and assigned people (students, teachers, administrators), documented student development, planned maintenance for buildings, and monitored inventories, all while school districts, and many individual schools as well, grew in size and operational complexity in the 1970s and 1980s.17

The result of these various initiatives was that by the late 1980s, a wide range of applications of digital computing existed in school districts around the United States. Table 8.1 lists many of the more widely deployed uses. Note the variety, reflecting both the pattern evident in so many industries of using IT in accounting, personnel, and financial operations, and uses unique to education, such as in the functioning of libraries and schools. Survey data from the earliest days of computing in education suggest the speed with which digital tools were embraced. One study of 1,360 school districts conducted in 1981 looked at when they began deploying this technology. Generally, about 73 percent of school districts with 25,000 or more students began using computers between the late 1950s and 1969. About 59 percent of all districts with 10,000 to 24,900 students embraced computing between 1960 and 1970.
Table 8.1 Common Administrative Uses of Computing in K–12 Education, Late 1980s

Financial systems: Budgeting, accounting, purchasing, salaries
Office applications: Word processing, filing, desktop publishing
Personnel systems: Payroll, personnel records, faculty assignments, health records, tax records, benefits management
Asset applications: Space utilization and room assignments, inventory management, maintenance planning, energy utilization
Research and planning: Budget analysis, bus routing, statistical studies, testing and evaluation, project planning and control, enrollment analysis and forecasting
Student applications: Scheduling, class registration, grade reporting, attendance accounting, student demographics, health records, test scoring, class lists
Library systems: Circulation, catalogs, online database searches, purchasing

Source: Adapted from William C. Bozeman, Stephen M. Raucher, and Dennis W. Spuck, “Application of Computer Technology to Educational Administration in the United States,” Journal of Research on Computing in Education 24, no. 1 (fall 1991): 66–68.
School districts with fewer than 10,000 students began using computers after 1965, with about 58 percent participating for the first time during the 1960s or early 1970s.18 To put these data in context, the nearly 1,400 school districts participating in the survey—all of which used computers by 1981—represented only about 9 percent of the existing 16,000 school districts in the nation, so it is not clear how many others might have used computers. There is enough data here, however, to surmise that districts learned about computing and adopted it as a tool in roughly the same time period: beginning in the 1960s, increasingly so after 1965, and extensively in smaller districts in the 1970s. The survey confirmed use of the applications listed in table 8.1.

A second survey, conducted later in the 1980s, suggested how quickly school districts embraced computing in that decade. In this instance, in a sample of 20 percent of all districts in the United States, nearly 95 percent used computers for such applications as those listed in table 8.1. Sixty-two percent had a computer on their premises, another 16 percent relied on service bureaus, and the rest on systems in other districts. The data also suggest that the types of uses to which administrators put computers remained remarkably consistent with those listed in table 8.1 from the 1960s into the 1990s. The pattern was essentially one of increasing numbers of districts using computers, with the smallest ones coming late to the process, and often only when they could integrate PCs into their daily work.19

No technology so affected education as the personal computer, about which we will have much to say below in discussing its role in teaching. What role did it play in the administration of schools and districts? In large school districts, already accustomed to using mainframes and previously equipped with software tools with which to do their work, administrators resisted switching to PCs or adding new applications onto personal computers when they could use existing computing power in their data centers and avoid the costs of converting existing software tools and processes. Small districts therefore became extensive users of this new technology, while others often did not come to appreciate the importance of these smaller devices for education until school principals, and their teachers, began talking about them in the 1980s.20 By the end of the 1980s, however, districts were integrating this class of technology into their inventory of applications, particularly as they linked networks from schools to district mainframes. Thus, by the mid-1990s, with the availability of the Internet just then spreading, PCs were appearing all across districts: from Apple Computer, which had captured the lion’s share of the teaching market, largely at the elementary school level, in the 1980s, and increasingly now from IBM and other suppliers eager to cash in on the expanding market for first-time users and to replace the older Apples, which did not have the technical capability of attaching to networks, such as the Internet, or sufficient capacity to operate the latest software, much of which had begun using graphics and other presentation formats.21 Since the rest of this chapter is devoted to instructional uses of computing, it makes sense to discuss the story of districts and computing in the Internet period at this point.
In the second half of the 1990s, school districts began creating Web sites to inform parents and communities about schools, their programs, and meetings.
Individual schools did the same, such that by the early 2000s, hardly any school or district existed without a Web site. Principals used these sites to communicate with students and communities, and intranets to transmit information back and forth between districts and their schools. Districts did much the same, following familiar patterns evident in law enforcement and municipal and county administration, moving from early sites that offered only passive information to later versions that provided the ability to communicate back and forth.

In summary, the historical record shows that initial uses of computing were in direct support of existing administrative functions, such as payroll and other accounting and managerial operations. Doing these things became cheaper, faster, and better as technologies evolved, which also meant more accurately and for larger numbers of students, teachers, and staffs. By the 1980s, administrative functions increasingly merged, made possible by software that interconnected processes, such as accounting and record keeping. Like other agencies, management became increasingly able to rely on quantitative data with which to make operational decisions. Principals, school districts, and their boards could model possible operational alternatives, just as legislators were able to do in the same period. Modeling made planning, particularly scenario planning, easier and better in the 1980s and 1990s. In short, the administrative side of education increasingly functioned like other agencies that used computers, integrating them into the fabric of all major functions. At the school level, use of IT even extended to sporting events, where coaches used PCs in the 1990s to analyze the performance of their athletes. Then there was teaching, which presented a far different story of adoption.
Computing Comes to Teaching, 1960s–1980s

So far, in the agencies described in this book, use of computing emerged as a tale of a new technology fitting conveniently into the existing work of organizations and only later altering those functions to account for the capabilities of the new technology. In other words, the story supports nicely our argument that what happened in the late twentieth century was the emergence of a new style of doing work. With K–12, however, and specifically with teachers, we may have an exception to the pattern. Many teachers used computers, particularly PCs, strongly encouraged to do so by parents and administrators. But the benefits of using computers to teach were not always as clear as with other applications, creating a conundrum that, fifty years after the introduction of computing, had not yet been resolved to the satisfaction of administrators, public officials, parents, and especially teachers.

At the risk of doing terrible injustice to the fundamentals of pedagogy, it makes sense to summarize briefly a few basic values and styles of work of teachers that animated their attitudes and actions regarding computing. These have hardly changed in decades, indeed, in well over a century in the United States. Perhaps the most obvious is that for generations teachers embraced the concept
of “teacher-centered instruction,” which, simply put, meant a teacher taught. He or she was the center of attention, standing in front of a room full of students, lecturing and orchestrating instructional activities. Teachers talked (taught) and students listened (learned). Classrooms often looked very similar in 2000 to those of 1900. To be sure, these varied, with younger children often clustered into learning groups within a classroom, or with high school students moving from one room to another for the ubiquitous fifty-five-minute class on a specific subject. Classroom behavior, replete with raising hands, a code of conduct regarding who talked when, how tests were administered, and so forth, remained relatively constant throughout the twentieth century. Teachers also saw it as their role to impart widely embraced social values, such as those of democracy in America; to sort out the bright from the less gifted young scholars; and to provide a modicum of social control. In short, the majority of the action was in the classroom. Most commentators on education, whether critical or supportive of this approach to teaching, acknowledge that teachers share an ethos of conservatism, one that values the delight of teaching and the thrill of a student learning, but normally through a process that has changed little from one decade to another. This culture of teaching passes commonly shared practices from old to young teachers. Their beliefs about how students learn are also essential influences. Given the way teachers are managed, once alone in a classroom, a teacher could essentially do as he or she wanted pedagogically. That situation did not begin to change in any appreciable way until the 1970s, when local, state, and federal officials began mandating that students achieve specific levels of performance as tested by state or national examinations. By the late 1990s, federal aid to schools was linked directly to such results. But for most of the century, teachers had enormous freedom to act as they willed in their classrooms.22

The historical record demonstrates clearly that teachers were not opposed to using innovations in their classrooms. The blackboard is perhaps the most important innovation in nineteenth-century teaching.23 All through the first half of the twentieth century, teachers and their school administrators experimented with other technologies, from radio to television, and, of course, the ubiquitous “film strips” and slides that animated so many lectures from the 1930s until nearly the end of the century. “Teaching machines” have been discussed, built, and used since the 1920s. Automation was always the Holy Grail, holding out the possibility for administrators to teach more students with fewer teachers, an important concern because teaching was a profession constantly projected to have insufficient numbers of members. Teachers saw these devices as possible enhancements to their core work in classroom-centered teaching.24

But what are “teaching machines”? They are devices students can use to learn skills or rote subjects on their own.25 They originated in the 1920s work of Sidney L. Pressey, who developed objective self-scoring methods and standard tests, efforts that next led to his designing mechanical devices for self-instruction. In the 1950s, the behavioral psychologist B. F. Skinner also expressed
interest in learning devices to teach humans much the way he had learned to teach animals. With both Pressey and Skinner, machines made it possible for individuals to control what they learned and the rate at which they learned it, and provided an assessment of what they had mastered. By the 1960s, Skinner’s work had ignited modern interest in using some form of learning technology in teaching, although, as we saw with the military, training devices, often called military knowledge trainers, had been developed by the armed services as far back as World War II. Pressey thought of his devices and techniques as augmentations of human teaching, while Skinner valued students’ providing the correct answer to a question and having that correct activity reinforced. Skinner’s approach influenced the design of teaching machines made, for example, to help children learn mathematics in the 1950s and 1960s.

A less technical approach familiar to all students in any period since the 1940s was the ubiquitous workbook, also called in the education trade the “programmed book.” Students were given material, they answered questions about it, and then, and only then, were they permitted to move on to another topic. This became the basis of much early learning software, beginning in the 1980s. One further development, called “branching programs,” became an additional precursor to the digital teaching tools of the 1980s and beyond. Initially developed in the late 1950s, these built on developments in military training devices of the 1940s and 1950s. A student was given information and required to make a decision or provide an answer, and depending on how the student responded, the machine presented additional material; the new information varied depending on the answer (the sketch below illustrates the pattern). This was a fundamental principle behind video games and much modern educational training software, although in the 1950s and early 1960s it was implemented with electromechanical devices.

Many of these various classes of machines, from Pressey’s in the 1920s to Skinner’s and others’ of the 1950s–1970s, were used in adult education, while educators also experimented with elementary, high school, and college students during the rest of the century.26 While data on how extensively such devices were used in K–12 are hard to come by, extant evidence demonstrates that there were dozens of such devices available in the marketplace. Since many of those units cost between $100 and $500 each in the early 1960s—a great deal of money—we can safely assume that the number of teachers who had access to such tools remained few.27 Yet teachers trained in the late 1950s, 1960s, and 1970s would have known about the existence of such equipment and, of course, about the work of at least B. F. Skinner, if not also of Pressey and others.
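To make the branching idea concrete, here is a minimal sketch in modern Python, offered purely as an illustration: the frames, questions, and answers are invented, and the branching programs of the 1950s and early 1960s were realized in paper workbooks and electromechanical machines rather than software.

# Branching-program sketch: the student's answer selects the next "frame,"
# so remediation is built into the material itself. All content is invented.
FRAMES = {
    "q1": {
        "prompt": "7 x 8 = ?",
        "answer": "56",
        "on_correct": "q2",        # correct: advance to new material
        "on_wrong": "q1_review",   # wrong: branch to remedial material
    },
    "q1_review": {
        "prompt": "7 x 8 means seven groups of eight: 8, 16, 24 ... what is the seventh?",
        "answer": "56",
        "on_correct": "q2",
        "on_wrong": "q1_review",   # repeat remediation until mastered
    },
    "q2": {
        "prompt": "9 x 6 = ?",
        "answer": "54",
        "on_correct": None,        # end of the sequence
        "on_wrong": "q2",
    },
}

def run(start="q1"):
    frame_id = start
    while frame_id is not None:
        frame = FRAMES[frame_id]
        reply = input(frame["prompt"] + " ").strip()
        # The next frame depends on the answer: the essence of branching.
        frame_id = frame["on_correct"] if reply == frame["answer"] else frame["on_wrong"]

if __name__ == "__main__":
    run()

A linear “programmed book,” by contrast, is simply the special case in which every wrong answer repeats the same frame before the student moves on.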
The next chapter in the history of teaching machines involved the use of computers. The story is normally understood to consist of two eras, both defined by hardware: pre- and post-microcomputer. Prior to the existence of personal computers (beginning in the second half of the 1970s), use of IT in the classroom for teaching was almost nonexistent; afterward it began a long and slow expansion to the present, accompanied by much uncertainty about the effectiveness of its use. Taking a hardware-centric view of the history of computing in education, while a useful shorthand for suggesting the extent of deployment, is not sufficient, but it is a place to start in understanding the role of IT. Seymour Papert, one of the earliest students of the use of computing in education, recalled that in the 1960s, “we were a small handful” of researchers studying the use of computers in teaching, a group that increased to a “larger handful” in the early 1970s. He recollected that “the big break came with the advent of the microcomputer in the middle of the decade. By the early 1980s the numbers of people who devoted a significant part of their professional time to computers and education had shot up from a few hundred to tens of thousands,” a group made up largely of researchers in higher education and teachers in K–12.28 His observations point out that, for all intents and purposes, computing was not used by teachers in the 1950s, remaining largely an experimental activity in the 1960s centered at universities using mainframes.

As time-sharing evolved from its experimental forms into stable production systems at the end of the 1950s and early 1960s, various projects emerged, most notably the development of PLATO. Led by Donald Bitzer at the University of Illinois, an interdisciplinary group of educators, psychologists, physicists, and engineers developed software to conduct individualized instruction, beginning in 1959. During the 1960s, it flowered into a usable system applied in teaching adults and children. In 1974, Control Data Corporation (CDC) acquired the rights to sell it as a product.29 Students gained online access to PLATO on a mainframe over a network; beginning in 1977, a student could use microcomputers and a new version of the software called Micro-PLATO. The system remained in use all through the 1980s and 1990s.30 A half dozen other mainframe-based projects for K–12 or higher education also existed in the 1960s–1990s, mostly centered at universities, and on occasion in collaboration with a school district, such as Chicago’s in the 1970s, which experimented with online teaching.31

While deployment in the K–12 environment remained minuscule in the 1960s, interest in the potential uses of computing grew, and many educators debated the issue and anticipated its use. Educators recognized very quickly that teaching applications could include drill-and-practice exercises, instructing students on basic arithmetic, for example. Tutorial applications also seemed appropriate, advising students on the definition of a word, for instance, and examples of its use, followed by testing to confirm that the material had been mastered. Simulations of various experiences could also be created as a form of teaching (for example, what could be done with online games beginning in the 1980s). Educators expected that problem-solving software could be developed to help students think, reason, and solve problems, a variation of which could also be handled using game applications. Computers could also be used as a tool, much like a pencil or calculator, for such things as word processing and mathematics (spreadsheets and calculators). Thousands of articles and hundreds of books waxed eloquent about these various potential uses of computing all through the second half of the twentieth century.32
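The drill-and-practice pattern these educators had in mind is simple enough to sketch. What follows is a minimal illustration in modern Python, offered under stated assumptions: classroom programs of the era were written in contemporary languages such as BASIC or PLATO’s TUTOR, and the exercise details here are invented.

# Drill-and-practice sketch: repeated arithmetic items with immediate
# feedback and a running score. All details are invented for illustration.
import random

def drill(rounds=5, max_operand=12):
    right_first_try = 0
    for _ in range(rounds):
        a = random.randint(2, max_operand)
        b = random.randint(2, max_operand)
        first_try = True
        # Repeat the same item until the student answers correctly,
        # giving immediate feedback -- the core of drill-and-practice.
        while True:
            reply = input(f"{a} x {b} = ? ").strip()
            if reply == str(a * b):
                print("Correct!")
                if first_try:
                    right_first_try += 1
                break
            first_try = False
            print("Try again.")
    print(f"Right on the first try: {right_first_try} of {rounds}")

if __name__ == "__main__":
    drill()

The tutorial applications mentioned above follow the same loop with expository material presented before each question; simulations and problem-solving software go further, replacing fixed questions with a model the student manipulates.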
Looked at more strictly from a pedagogical perspective, expectations always remained high that such technology would eventually be used to impart knowledge of specific subjects, train people to think and act creatively, develop specific skills and techniques, and indirectly instill desirable attitudes (for instance, a love of learning). An early and widely held aspiration was the potential held out by computers to allow children to learn at different speeds and in different ways. Advocates of the use of computing in education believed by the early 1960s that students should progress independently, based on their individual abilities.

Largely beginning in the late 1960s, and extending into the early 1970s, computers began appearing in schools, albeit slowly. Some teachers experimented with the technology, using computers acquired for administrative functions, or as part of some university-based research on IT in teaching. The high cost of mainframes, the lack of educational software, and teachers not versed in the technology combined as a set of factors holding back even experimentation with the new machines. Nonetheless, if one were to read the contemporary literature of the day, one could conclude that a great deal was going on. We read that high school students in Hinsdale, Illinois, used an IBM 1130 system in 1969 to learn how to program, and about a computer used for tutorials in Cleveland, Ohio, in the early 1970s. One sees reports of students learning about computers and how they worked in Memphis, Kansas City, Newark, New Jersey, Poughkeepsie, New York, and Columbus, Ohio.33 Teachers were exposed to computers in various training sessions, even out of a mobile van from IBM in Appalachia in the early 1970s.34 Companies like IBM and CDC, eager to sell into what appeared to them a large emerging market in education, extensively promoted multiple uses of computers.35

Early on, the federal government became an advocate of the new technology as well, essentially cataloguing “war stories” from around the country on the use of IT in classrooms and in school offices.36 One important report reflecting trends of the late 1970s and early 1980s assessed the situation as positive and expanding: “Four factors have more recently revived interest in interactive instruction: 1) the rapidly declining costs of computers and the advent of the desktop computer, 2) escalating labor-intensive costs of traditional schooling, 3) an improved understanding of how to create instructional packages, and 4) the development of alternative delivery mechanisms that link the computer with other technologies, such as video disk and interactive cable.”37 Government agencies funded research to figure out how computers could be used in the classroom or through remote computing, and dutifully reported results in the 1970s and 1980s.38 But the government’s own assessments documented the slow deployment of computing in teaching in the 1970s, a decade that saw massive growth in the use of computers across the American economy and by many government agencies. As late as 1980, roughly one-half of all school districts had one or more terminals available, and of those that did, only a quarter dedicated the terminal(s) to use by students. Large school districts tended to be early (but limited) adopters, while small districts reported not even having any intention of using such technology in the near future.39 Those few devices in schools were used to teach computer literacy (that is to say, how computers worked and some
software programming), followed, in descending volume of use, by some drill-and-practice and simulation exercises. School administrators pointed out that they needed funding for machines, technical support, software designed for teaching, and training on the technology for their teachers.40 All of these factors remained chronic obstacles to the use of computing in education throughout the late twentieth century. Emphasis on one issue or another increased or declined over time, but they all remained important.

Traditional accounts of the use of computing in education could lead us to believe that with the arrival of the microcomputer, deployment in classrooms expanded dramatically, and that educators had begun embracing computing in sharply increasing numbers. To be sure, when compared to the 1960s and 1970s, adoption of computing in the 1980s and 1990s proved numerically massive, and by the end of the century the proverbial “everyone” had computers in their schools. But the problems cited above, and chronic pedagogical issues (discussed later in this chapter), did not go away, remaining right into the new century. Before dealing with those issues, however, a short overview of educational activities after the arrival of the PC sets the context for the extent of the debate.

When microcomputers first came into the economy in the mid-1970s, vendors quickly eyed schools as an attractive market. Early entrants included Apple, Radio Shack, and, by the early 1980s, IBM. Schools first began using these smaller, less expensive forms of computers around 1977, and over the next half dozen years, software for educational purposes came out of universities, schools, and vendors.41 Two sets of activities became widespread in the 1980s: one to find ways to use PCs to teach (or to assist teachers), and a second to educate students about how computers could be used in general, the latter even being called the “computer literacy movement.”42 School administrators, government officials, parents, and politicians often criticized the quality of education in general in the 1970s and 1980s, and their criticism culminated in the publication in 1983 of a report by the National Commission on Excellence in Education, A Nation at Risk.43 The report was a seminal event in the history of modern American education because it galvanized opinions and led to various federal and state programs to “improve” education. One major criticism in the report concerned the fact that children were not being trained to work in a modern global economy characterized by extensive use of technologies of all kinds and by many automated functions. That concern translated quickly into a growing advocacy around the nation for exposing children to computers more extensively, while improving the overall efficiency and effectiveness of education.44 As historian William J. Reese documented, at the same time that computing gained attention and federal reports commented on the quality of education, the entire educational system was in churn: new forms of teaching came into use, schools experimented with such basic elements as the physical construction of classrooms (for instance, the open classroom movement), demographics changed, and the Civil Rights Movement played its part, which included the important aspiration to improve the quality of education of African American students.45
PCs were small and portable, could be put in classrooms, and cost far less than mainframes. Just as they seeped into various corners of American work and home life, so, too, did PCs appear in schools. Counting both terminals attached to mainframes and personal computers, by 1982 there were somewhere between 200,000 and 300,000 machines available to some 45 million students attending classes in 100,000 schools. Deployment of microcomputers had roughly doubled every year since the invention of the machine. If we accepted these statistics as a surrogate measure of their use, we could conclude that deployment had a relatively fast start. Just having the machines did not necessarily mean they were used, let alone effectively. Nonetheless, the process of using them had started at all grade levels. Some teachers found that these devices proved useful in teaching basic mathematics, and early software tools appeared in support of these kinds of drill-and-practice instruction. In these early years, parents, teachers, and administrators wanted children exposed to how computers worked, which served as the most widely embraced reason—and application—for microcomputers.46

Elementary schools proved most aggressive in using the new technology, followed by secondary and high schools. Computer literacy and programming instruction were popular uses; elementary schools used the technology for drill-and-practice applications at nearly twice the rate of secondary schools, while the reverse was true for word processing. Mathematics teachers in secondary schools were some of the earliest users of PCs, but by the early 1980s, teachers were beginning to experiment with the technology in teaching language, science, and social studies, and wrote special instructional programs for exceptional students. The earliest surveys on use of computers began to suggest that students liked using the new technology.47

Impetus for deployment came from various sources: politicians, parents, education administrators, and some teachers. But it also came from vendors, as noted earlier. The most important of these was Apple Computer, which made an early and strategic commitment to this market far in excess of all other suppliers, with the result that by the early 1980s it owned over 56 percent of the public school market and, when private schools are included, roughly half the national education market. Other brands included Atari and Radio Shack; IBM was not yet a major supplier in this market. All that said, it was still a developing market because only one terminal or PC was installed for roughly every 144 students. This meant that very few students had any access to, let alone meaningful experience working with, computers while in school in the late 1970s or early 1980s.48 When computers were used in schools, moreover, teachers and administrators tried to optimize deployment by putting all they had into one or a few rooms, called computer labs, to which students were brought just to work with PCs. In other words, computing had yet to be integrated into the teaching of courses taking place in regular classrooms. Drill-and-practice remained an adjunct function provided by these early systems. It was the oldest, and also easiest, form of computing in education to create. Critics noted that this was no different from the previous page-turning of workbooks and that the technology was used only with slower-learning students who
needed more attention.49 By this time—1982–1984—software for teachers to use with PCs began appearing in forms that were of some use, although vendors began accusing teachers of copying software without paying for it—calling it piracy—which led some software vendors to remain wary about entering the education market. It was an important issue because by the late 1980s, schools and their teachers had shifted their attention away from just getting machines into their buildings, and both were now on the hunt for practical educational software. Interest grew in using software as an integral component of mainstream instruction. By now IBM had entered the market with its PCs and its Writing to Read software, which taught young children reading skills.50 IBM, like so many vendors, aimed its selling efforts at school boards, districts, and principals, as they were the ones with budgets and the ability to acquire multiple machines. This practice set in place a pattern of acquisition that remained right into the new century, frustrating many teachers through their exclusion from decisions about what hardware and software others selected for their use.51

In the mid-1980s, commentators were beginning to question whether all these systems were being used, and whether they were effective. However, defenders noted that “most of the available software, designed in the main to provide for drill and practice in the basic skills, was in fact dreadful, but no worse, it turned out, than much of the print materials schools had been using for the same purposes without complaints for years.”52 That observation should not be so surprising, since this early software was often created from already available printed materials that teachers and publishers had written and used for years.

Meanwhile, all through the 1980s schools continued to install computers. Unfortunately, existing data are not precise about how many computers each school had; the best information is the percentage of schools with at least one machine. From that perspective it would appear that a great deal was done: 18 percent of all schools had at least one machine in 1981, and by 1987–1988, over 95 percent claimed to have at least one.53 One can reasonably assume, however, that there were thousands of large schools with a dozen or more machines for their computer labs, while by now some classes would have started to obtain their own machines. Data on the number of students per microcomputer do exist, demonstrating that in 1983 the ratio was about 125 students per machine; it then dropped dramatically over the rest of the decade, falling to 75 students per machine by 1985. The number of PCs increased such that by 1990, the ratio was closer to 22 students per microcomputer.54

Because advocates of computing in schools normally criticized teachers for resisting use of computers, it is helpful to note that by the late 1980s, survey data provided clear evidence that the charges were not framed well. Table 8.2 provides evidence of a broad set of applications of computing in use by a large group of teachers who were already users of IT, although admittedly not as widely for actual teaching, an issue explored in greater detail later in this chapter. In other words, if a teacher used a computer, this table suggests what he or she did with that system. Consensus among educators at the end of the decade held that while use of computers had expanded in schools, they were “still generally considered as an add-on rather than an integral part of the curriculum and day-to-day instruction.”55
Table 8.2 IT Applications When Teachers Used Computers, circa 1989–1990 (Percentages)

Text processing: 95
Instructional software: 89
Analytical and information: 87
Programming: 84
Games and simulations: 81
Graphics and operating tools: 81
Communications: 49
Multimedia: 8

Source: Karen Sheingold and Martha Hadley, Accomplished Teachers: Integrating Computers into Classroom Practice (New York: Center for Technology in Education, 1990): 8.
The discussion of hardware and software, their deployment, and what they were used for all blend into the larger story of how teachers used computers in the 1970s and 1980s. One student of the process stated bluntly “that the vast majority of U.S. teachers were nonusers of computers in their classrooms,” reporting that only one in four were even casual users, and only 10 percent serious users (using them, say, one or more times per week).56 By standards of use evident in the 1980s in the private sector and in government, even teachers in that second group must be considered casual users. Most surveys on the number of machines installed in schools, and on what they were used for, are highly misleading and confusing, and thus one can suspect that the extent of deployment and utilization was probably lower than reported. Documented cases showed drill-and-practice applications at all levels in K–12, particularly for mathematics, reading, and spelling, and programming largely in high schools.57 Marc S. Tucker, executive director of the Carnegie Forum on Education and the Economy, in 1985 aptly titled an article on the situation of computer use by teachers “Computers in the Schools: What Revolution?” while defending their low use of the technology.58

Given what all the extant evidence demonstrates, one must conclude that right into the early 1990s, use of computing in schools lagged behind what was happening across many other parts of American society. It is an important perspective to keep in mind, because as parents and public officials became more accustomed to using this technology in their work lives and even at home, they increased pressure on teachers, school administrators, school boards, and state legislatures to force teachers to use the technology. This situation stood in sharp contrast to all other sectors of the economy, where it became relatively easier to appreciate the benefits of using computers in earlier
decades. So we should ask: why did a dichotomy exist in what otherwise was a substantial trend of integrating computing into the work lives of scores of millions of American workers? By exploring the answer to this question, we can better appreciate the substantially expanded use of computing in education that began in the 1990s and that was also affected by access to the Internet. For the truth is, it was in the 1990s that teachers began using computers in quantity, partially mimicking patterns of behavior evident in other industries and professions as early as the 1960s and 1970s, although not to the same degree of deployment.
Debate about the Role of Computing in Education

Schools make up the one set of government institutions in American society most controlled at the local level; perhaps only churches are equally managed at the grassroots level. Throughout the history of the United States, local communities and school boards hired teachers and worked to have them reflect the community’s interests and values in what happened in schools. To be sure, as the twentieth century moved along, state boards of public education established various standards, generally with the support of the electorate. Increasingly during the second half of the twentieth century, the federal government as well sought to influence the course of education in the United States through its funding of programs and support, much as it did with law enforcement, for example. Both in theory and in practice, the electorate influenced the behavior of school principals, boards of education, and government agencies, since these were the entities that trained, hired, and fired teachers, and that, in theory, could mandate what happened in the classroom. It is theory to a large extent because teachers simultaneously exercised considerable autonomy in their classrooms, a situation that did not begin changing until the 1980s, as state and, subsequently, federal agencies began mandating standards regarding what they taught and to whom in any particular grade.59 However, relatively local control of schools remained in place for the entire second half of the twentieth century, and that meant the views of local leaders and parents (often one and the same) played far more important and immediate roles than in many other public institutions.

One could argue that this situation held because there were more parents and students in any given community than there were members of niche groups, such as accountants, firemen, Republicans, or Democrats. In other words, schooling was an issue of interest to the largest segment of any community. Furthermore, citizens had long believed that education was important for the development of their children, and for the success of American democracy and the economy. As one student of American education put it, there was “a popular faith in using schools as a lever of social progress,” a theme “reinforced by the rising expectations of the post–World War II era.”60 Americans saw education as the path for personal and community improvement, the way forward toward entrance into the middle class, and access to financial well-being
and happiness. These values, dating as far back as the dawn of the United States, are still strongly held by the public. The same scholar quoted above, William J. Reese, reported that “since the 1980s, countless reform-minded citizens still placed the schools at the center of broad discussions of the good life and the future direction of the nation.”61 These were the same people who came to use computers at work and concluded that their children—and those of the nation as a whole—had to become familiar with computers if they, too, were someday to be successful in what was rapidly being called the Information Age or Information Economy. They were joined by CEOs of corporations, who could easily advocate investing in education for its modernization either in the belief that this position really did make sense, or out of some cynical notion that it would be a popular position to take with their customers and employees. This combination of constituencies in American society joined forces with IT vendors and proponents of the use of computing in schools to demand not only that this technology be made available and familiar to students, but also that IT be integrated into the educational process. Larry Cuban rightly described all these advocates as “a loosely tied national coalition” that became vocal by the early 1980s.62 In short, there was a growing call for use of IT in teaching that built up beginning in the 1970s and has expanded to the present.

To be sure, there were contrarian points of view that challenged widely held beliefs that computers should be part of the curriculum. These voices were equally articulate and expressed their views throughout the second half of the century, but they remained in the minority because of the overwhelming support for the use of computing so widely held by parents, the public at large, legislators, and government officials. Often critics challenged conventional expectations that the digital hand could assist teachers in doing their jobs better and cheaper, even proposing that the quality of education would decline as a result of using this technology. Because of the enormous investment made in bringing the Internet and PCs into classrooms in the 1990s, a small surge of criticism emerged at the end of the century challenging the wisdom of diverting funds from teaching art, music, and physical education in order to fund IT. Critics questioned the cost of the technology and its lack of effectiveness, or simply argued that what was done provided more than was needed to make future workers productive in the economy.63

Within the educational establishment, there were advocates of computing. Principals and administrators had been exposed early to the value of computers in handling budgets and accounting, in scheduling students, teachers, and classrooms, and so forth. So one could reasonably expect them to be inclined to see the value of computers. Elected members of school boards used computers in their private lives and at work as well, and so recognized many of the attributes and limitations of the technology as it evolved over time. Advocates of the technology in the greater society were also writing about its need in schools, from Robert P. Taylor and Seymour Papert within the educational establishment to such broader commentators as Alvin Toffler, the author of the hugely successful
vision of the future, Future Shock (published in 1970), and, more recently, Thomas L. Friedman, author of the also highly influential The World Is Flat.64 So teachers were constantly accused of being slow to embrace computers, dull-witted Luddites holding back education and children from being properly trained to function in the modern information age. Most computers came into schools thanks to decisions about how to use them proposed by vendors and parents, and made by principals and boards of education. Teachers often were not consulted, although there always existed a small minority of them eager to experiment with the technology; typically they were self-taught and frequently used the machines at home the same way other Americans did, particularly after the arrival of the Internet.

The historical record reflects that for over a century, teachers as a community embraced new technologies when they enhanced the way they taught, such as blackboards in the nineteenth century and film strips in the twentieth. Rarely, however, did a new technology lead them to transform how they taught. But why was this so? Recent students of the issue suggest that the answer may lie in the way teaching is fundamentally done. Teachers use classrooms as the center of the teaching/learning experience. Add in what they are expected to do—maintain order and ensure students learn from increasingly standard curriculums—and we have the makings of a situation in which no technology would be used unless it fit into the way teachers worked. As two historians of the role of technology in teaching pointed out: “Teachers have been willing, even eager, to adopt innovations such as chalkboards or overhead projectors that help them do their regular work more efficiently and that are simple, durable, flexible, and responsive to the way they define their tasks. But they have often regarded teaching by machines as extraneous to their central mission.”65

Computers in the 1970s and 1980s did not replace classroom-centered teaching because the technology did not lend itself to that (for instance, compelling software was lacking), or there were not enough machines to make a difference, or teachers did not receive adequate training on how to use the technology. Yet teachers used PCs to collect and report grades, and e-mail to communicate with each other, parents, and school administrators. The argument that they did not have enough machines failed the test of time and change. Just between 1984 and 1992, for example, the nation invested over $1 billion in providing them with digital tools, and as we saw earlier, the ratio of students to machines dropped dramatically, such that in 1993 schools had one machine for every fourteen students, a ratio that continued to shrink all through the 1990s.66 Studies done in the 1990s on IT and education acknowledged that the issue might have deeper roots: the lack of sufficient complementarity between how the technology worked and how teachers did their jobs. One study of the use of computing in Silicon Valley, where one would expect to see the most strident, extensive use of computing in the heart of America’s software business, found the opposite: “The teachers we studied adapted computers to sustain, rather than transform, their philosophy that the whole child develops best when both work and play are cultivated and ‘developmentally appropriate’ tasks and activities are offered.”67
The final conclusion of these researchers was clear: elementary school teachers sought “to conserve traditional civic, academic, and social values rather than turn children into future Net-workers.”68 But did that mean critics were right, that teachers refused to change with the times and were dim-witted, not simply ignorant of the technology? The answer encompasses a complex of issues. For one thing, teachers and administrators agreed that neither had sufficient time in their daily routines to learn about new technologies or to evaluate how they might be used effectively in classrooms, let alone in distance learning or other models of education. Those teachers making the argument were often the same individuals who used computers in their homes and wanted training and time to do the same at work. They also had a history of making incremental changes in how they taught and what materials they used, just as workers did in every other industry studied for the three volumes of The Digital Hand. Teachers resisted radical changes in the classroom-centered, teacher-centered style of education that was nearly universally their practice in the nineteenth and twentieth centuries, just as workers in other industries also objected to fundamental, indeed radical, changes to their work practices.69 In short, as I have argued in all three volumes, transformation in work practices was a massively incremental process, and the experiences of teachers and their administrators reflected that. This broad pattern of behavior remained in force right into the new century. Core teaching practices were simply not yet being changed by digital technologies. Later we come back to why that was the case.

Seymour Papert noticed that in the 1970s and 1980s an additional unintended consequence was at work slowing adoption of computing in teaching. By putting what few computers a school had together in one room—usually called the computer lab—principals believed they could optimize use of the technology, having students come from various classes to use the equipment rather than dispersing the machines to individual classrooms, where they would sit used far less than if concentrated in a lab. Papert noticed that teachers consequently did not attempt to integrate computers into their daily teaching activities; learning about computers naturally became a separate subject, one dealt with in a separate classroom, even with its own name (computer lab), and that someone else taught: “Instead of cutting across and so challenging the very idea of subject boundaries, the computer now defined a new subject; instead of changing the emphasis from impersonal curriculum to excited live exploration by students, the computer was now used to reinforce School’s ways. What had started as a subversive instrument of change was neutralized by the system and converted into an instrument of consolidation.”70 His observation fit perfectly into the paradigm discussed in earlier paragraphs. Meanwhile, schools continued to invest in computing, indeed more so in the 1990s than in earlier decades.

This segregation of computing was unique: I could not find a similar practice in any other industry. Everywhere else, regardless of the location of hardware, computers were integrated into daily work. Otherwise, often the
machines simply stayed in their original shipping containers in some closet or storage area, or were even stacked in hallways in full view of all who passed by. In most instances, equipment and software came into an organization once IT experts (and their consultants), or potential users, had figured out how to integrate the technology into their work and had established expectations, implementation plans, and training schedules. None of these kinds of activities were as evident in schools. This is not to say this industry did not plan with teachers how best to use the technology, rather that the normal way of acquiring and adopting the technology was less in evidence here, which may account for why so many teachers complained over so many decades about the lack of time and training to use IT. That circumstance would also help explain why teachers might have faulted the software they did see as inappropriate for integration into their teaching methods. It is a supposition, but one based on the widely seen dichotomy between what educational software seemed to be offering and what teachers were actually doing.

The issues of concern to teachers and administrators had not changed fundamentally since the dawn of the computer. An early student of the role of computing in education, Anthony G. Oettinger at Harvard University, explored how students, teachers, and administrators reacted to the computer in the 1960s. Like so many other observers, he noted the resistance to the use of computers in education, and the situation he described in the late 1960s could just as easily have been seen in the 1980s:

The prospects for change are also depressing when one looks at schools district by district and sees a tangle of authorities who feel mostly threatened, conservative, and broke. School administrators and teachers are fearful, filled with legitimate concerns for the safety of their jobs and of their persons. These key people are ill-disposed, by both background and training, to innovation, with or without the devices of new technology.71
He forecast accurately that during the 1970s there would be little change in this circumstance, despite expectations that digital tools would improve in the new decade, as indeed they did with the arrival of the microcomputer.72 Part of this state of affairs lay in the highly decentralized, fragmented approach to the adoption of computing. Designing educational software was and is a complicated process, as it is for any major software system, so expecting individual teachers, or even small specialized software firms or school districts, to do it well was probably unrealistic, but that was often the reality faced by educators, particularly in the 1960s–1980s. As late as the early 1990s, even school districts were only just starting to look at the problem and get involved, while survey evidence from that period suggested that teachers often still made the ultimate decisions on when and how to use computers, regardless of who acquired the equipment and software.

The one use of technology that did require coordination by teachers, schools, and often state education departments was distance learning, which brought educational material to students as a complement to existing classroom-centered teaching by way of telecommunications and satellite. This use of
IT had been evident in some schools in the 1960s and 1970s, and it expanded slowly during the 1980s, such that by about 1990–1991 some 1,500 school districts participated in some form of distance learning, with about 25 percent of the nation’s students exposed to it in one form or another. So before the arrival of the Internet, schools had some experience with distance learning, albeit very limited, mostly using local television networks.73

One might ask at this point what teachers were being taught about the general subject of computers in their various undergraduate, graduate, and certification programs. Not until the 1980s does one even begin to see much commentary about the digital hand in any of their basic textbooks. The earliest textbooks devoted solely to the subject began appearing in the mid-1980s, such as those of Paul F. Merrill and his colleagues, and another by James Lockard and Peter D. Abrams.74 By the late 1990s, students in education were routinely exposed to basic information about the Internet and key instructional uses of digital and telecommunications technology.75 Meanwhile, their professors were simultaneously investigating the subject, often conducting basic research on the general theme of computing and education, or learning from consultants.76 It is an indication of how the technology was evolving, and of new uses emerging, that widely used texts often appeared in new editions about every three years. But in each instance, they still had to explain why computers were controversial and to defend their use before explaining their nature and application to education.
Role of IT in Education, 1990s–Early 2000s

The start of the Clinton administration in January 1993 heralded a new era of federal initiatives to get all students onto the “Information Highway,” reflecting the same interest officials entertained in using technology to improve government and the economy, described in earlier chapters. Critical to the federal government’s strategy was the further deployment of IT into the public school systems of the nation, particularly classrooms, building on earlier efforts of the prior Bush administration. In the year before the arrival of the new Democratic administration, the number of microcomputers in K–12 had surpassed 2.5 million, and by the end of the first six months of 1993 it had climbed to some 3.9 million units. The number of computers devoted to instructional purposes in schools continued to increase over the next several years, reaching nearly nine million by the end of 1998 and bringing the ratio of students to machines down from about nineteen per PC in 1989–1990 to six per machine by 1998.77 By the end of the decade, some 99 percent of all schools claimed to have one or more computers, although that figure appears to be an overstatement, and the ratio of students to computers had improved such that there was one system for every fifteen students up through middle school and one for every ten high school students.

Regardless of effectiveness, or the much publicized emphasis of the Clinton administration on educational computing, there had already been much activity in the 1980s and early 1990s. Local area networks were being
implemented to connect libraries and media centers to classrooms, and some software appeared that teachers used to create interactive learning environments, though all of it was still in early stages of development.78 Federal officials expanded their assessments of the status of educational computing, funded various initiatives, and, as the Internet became widely available in the mid-1990s, pushed aggressively to make it accessible in classrooms all over the nation. Reliable data from the period 1994–1995 demonstrate that still only 75 percent of public schools had at least one computer, but already 35 percent had some access to the Internet, although it was not clear how it was used or how frequently. More telling, only 3 percent of all classrooms were “wired” into the Net. Public officials reported that teachers still hardly used computers for instruction, and that they needed a vision of what was possible, training, and support if they were to leverage the technology. The federal government began recommending, promoting, and funding efforts to integrate telecommunications, microcomputers, and curriculums, an effort that remained a constantly supported initiative of President Clinton’s domestic programs throughout his eight years in office.79 This all began with passage in 1994 of the Goals 2000: Educate America Act, which called on the U.S. Department of Education to create a national long-range plan for the use of technology in schools and to implement programs to execute this initiative. Federal officials recognized realistically that the nation’s teachers needed training and effective software, and that their schools faced budget limitations similar to those of police departments, for example. But the emphasis of schools, districts, and state and federal governments had largely concentrated on getting equipment into the schools, and as we can see from the census data on PCs, the number tripled in one decade. In addition, they also focused on establishing connections to the Internet, a complementary initiative under way side by side with the installation of computers in classrooms.

As newer machines gained the capability to present graphical data and a mix of sound, images, and text, they made application of IT more attractive than earlier devices that lacked these functions. When combined with what became available in the way of images, sound, and text on the Internet, the extensive inventory of old machines became a problem. Schools began learning how to replace older generations of equipment in the 1990s, such that by the end of the decade, nearly 2.5 percent of school budgets were routinely going to IT. Schools in economically depressed communities spent less; others in economically advantaged districts spent more. High schools tended to get the newest machines and cascaded their older devices to middle and elementary schools. About half of all computers made it into classrooms in the 1990s, and a nearly similar percentage still went into computer labs. The rest were either in school libraries or in the offices of administrators.80

What software did teachers have access to on all these machines? There were two classes of applications: one a collection of general uses that one might find in any industry, such as word processing and graphics, and another group that was specifically instructional. Table 8.3 provides a catalog of the five most widely deployed general applications available on school machines, expressed as a percentage of computers devoted to instructional purposes.
Table 8.3 Percentage of Instructional Computers with Access to Top Five Most Widely Deployed General Applications Software, circa 1998

Application                            Elementary   Middle   High School   Total
Word processing                            96          97         95          96
Spreadsheet                                79          91         88          83
Database                                   79          87         86          81
Drawing or painting                        81          82         72          80
Desktop publishing and presentations       55          56         57          56

Source: Ronald E. Anderson and Amy Ronnkvist, The Presence of Computers in American Schools (Irvine, Calif.: Center for Research on Information Technology and Organizations, June 1999): 9.
Table 8.4 Percentage of Instructional Computers with Access to Top Seven Most Widely Deployed Teaching Applications Software, circa 1998

Application                Elementary   Middle   High School   Total
Math-specific                  62          39         22          50
Science                        35          29         21          31
English language               56          29         23          45
Social studies                 44          28         18          36
Foreign language               11          13         15          12
Typing                         59          51         30          52
CAD-CAM industrial arts         3           6         10           3

Source: Ronald E. Anderson and Amy Ronnkvist, The Presence of Computers in American Schools (Irvine, Calif.: Center for Research on Information Technology and Organizations, June 1999): 9.
Not listed were others such as image editing, multimedia development, Web development tools, and programming languages, all of which were on less than half the machines. Table 8.4 lists the most widely available instructional software. Elementary and middle schools had the greatest variety of specialized course software. However, given the age of the installed computers, many schools could not yet take advantage of the multimedia software beginning to appear on the market in support of digital instruction. By the late 1990s, enough computers had been installed in schools to begin asking questions of the historical record about patterns of adoption and use since the arrival of personal computers. A team of education researchers at the North Central Regional Education Laboratory identified three phases of use that help situate the history of computing in teaching. The first phase reflected
the behavioral-based pedagogy favored in the 1970s and 1980s for such activities as drill-and-practice to teach specific skills and content. Software was relatively primitive, even throughout most of the 1980s; it often consisted of simply digitized printed texts and was too frequently written by programmers with no educational experience. When teachers used these tools, students were normally sent to computer labs for their drill-and-practice exercises. The software thus reflected the pedagogy of the day: small bits of information were taught, students were rewarded for learning them, and programs were prescriptive in what they asked students to do and in what the answers to specific questions were. As the authors noted, teaching moved from teacher to software, leading many educators to be hostile to technology because they did not control the experience. Nearly a thousand studies of this use of IT suggested that the technology did have some positive effects on learning, as measured by the results of standardized tests. Use of this kind of software proved practical, however, where a teacher's personal knowledge of the subject being taught was low.81 During the second phase, it became possible to use software to introduce learner-centered activities, which made it possible, for example, to have students work in groups. That phase began in the late 1980s but was really characteristic of much of the new instructional practice of the 1990s, and it expanded after machines could be networked. Teachers using this new generation of software focused more on the quality of the learning experienced by students. This new software used hypertext, interactive exercises, sound, and multimedia formats. Word processing, desktop publishing, and access to databases of information became available, providing a large increase in the volume and variety of facts, often available on CD-ROMs (and, late in the decade, over the Internet). It incorporated sound, video, graphics, charts, and animation, all made possible by more powerful and faster machines, but unusable by teachers still working with Phase One hardware of the 1980s. In this second phase, however, teachers and students could interact online among themselves and expand problem solving beyond just the math drill-and-practice teaching of the 1980s. Also, access to large amounts of information made it possible to design exercises in which students conducted research, organized data, and presented it in logical forms, thereby testing hypotheses, for example. Students began reporting that computing at school was motivating and fun. Networking made group thinking and collaboration possible as well. If students had access to the Internet, they could download information quickly, avoiding a time-consuming trip to the school library. Students and teachers reported overall increases in productivity.82 This phase can be characterized as one in which computers facilitated learning across a wide range of subjects and proved capable of performing specific types of teaching: tutoring, exploration, use of tools (e.g., word processors), and communications. However, some educators who observed the emergence of Phase Two noted that schools focused primarily on extending deployment of these new tools "with little or no attention given to using technology to restructure schools or to teach higher-order thinking."83 Another group of observers called out the fact that "many teachers
were unprepared for" the challenge of using software "to help students develop higher-order thinking skills."84 The third phase emphasized more data-driven instructional experiences. This phase began to take form after teachers began using the Internet, a circumstance really of the late 1990s and beyond. Access to vast amounts of data through the World Wide Web meant that short bits of information presented in some drill could no longer dominate use of computing. Multimedia formats were emerging, and all of the new software required networked classrooms and systems within rooms, across the school, and into the Internet, an increasingly expensive proposition for schools and one that required teachers to know more about IT than ever before. Teachers could download information for use in their teaching, while students began doing the same for their class work and homework. Teachers were also acquiring access to the Internet at home, with surveys in the period showing that anywhere from 25 to nearly 60 percent had such access and used it by the late 1990s.85 By the end of the century, it had become more evident than ever before to teachers, school administrators, and state and federal officials that teachers needed instruction on the technology and its uses, and time to learn how to apply it in class. In short, these were the same lessons managers had learned in other industries years earlier. Larry Cuban has pointed to the fact that some teachers used the Internet, in particular, on their own initiative as evidence that they were not so much hostile to digital technology as interested in finding ways to use it in support of their preexisting, teacher-centered classroom practices. Hundreds of surveys and studies done on how teachers used the Internet in the 1990s provided evidence that he was correct in his assessment, at least in regard to how they used the Internet.86 The story mimicked what occurred in other government agencies and across many industries: initial introduction to the Internet in the early to mid-1990s was slow and limited, but use climbed during the second half of the decade as individuals acquired access at work or home, as content increased, and as it became easier to use. By 1999, about a third of all teachers who had access to the Internet in class used it to create instructional materials, to do administrative record keeping, and to communicate with other teachers and colleagues. Less than 10 percent communicated with students, did research, created multimedia presentations, or modified lesson plans, while the youngest teachers tended to use the Internet the most, probably because they had become familiar with the technology at home at an early age. When used in class, the range of applications clearly mirrored what had been done with PCs just before the arrival of the Net: in-class instruction, practice drills, problem solving, research, producing multimedia reports, and conducting simulations. But to put these applications in perspective: for most of them, only a third of teachers reported using the Internet.87 Slow embrace of the Internet, however, was partially a function of how fast the new network became available in schools. Table 8.5 provides a snapshot of the percentage of schools with Internet access in at least one classroom; over time, access increasingly became available in many classrooms within any given school.
By the early years of the new century, the percentage had nearly reached 100. Just as important is to understand how many students had access to the Internet, and for that answer we have the data in table 8.6, which confirms that accessibility expanded rapidly during the second half of the decade, right into the new century. To understand the data correctly, remember that the smaller the ratio, the more access there was; thus for 2003, the data suggested there was a terminal or PC capable of accessing the Internet for roughly every four-plus students, meaning they had twice as much access as their cohorts just four years earlier.

Table 8.5 Percentage of Public School Instructional Rooms with Internet Access, 1994–2003

Year   Percent
1994      3
1996     14
1998     51
2000     77
2002     92
2003     93

Source: E. B. Tab, Internet Access in U.S. Public Schools and Classrooms: 1994–2003 (Washington, D.C.: U.S. Department of Education, February 2005): 4.

Table 8.6 Ratio of Public School Students to Instructional Computers with Internet Access, 1998–2003

Year   Ratio
1998    12.1
1999     9.1
2000     6.6
2001     5.4
2002     4.8
2003     4.4

Source: E. B. Tab, Internet Access in U.S. Public Schools and Classrooms: 1994–2003 (Washington, D.C.: U.S. Department of Education, February 2005): 8.

Results depicted in the tables hinted at the enormous investment made in this technology. Using 1998—the year table 8.5 shows a massive increase in deployment of the technology—expenditures for all manner of IT by public schools hovered at $7.2 billion, or roughly 2.7 percent of all expenditures that year (salaries, maintenance, construction of schools, books, and so forth). However, as impressive as those expenditures were, they amounted to an average of only $113 per student, a pittance compared to what other government agencies and companies had spent on their employees.88 Another way to see how teaching and learning were changing at the dawn of the new century was by asking the children to comment on what they experienced and thought. Extant evidence from 2002 provides an important window into their perspectives. As should be no surprise, some 60 percent of all children under the age of eighteen accessed the Internet. They used the Internet to help them do homework, because it facilitated their working more quickly, to rely on more
up-to-date information, and it allowed them to balance school and extracurricular activities. They used the Internet to do research for writing papers and completing exercises, to correspond with friends, and to share tips on homework and useful sites. They viewed the Internet much like a virtual textbook, reference library, tutor, study group, guidance counselor, and notebook. They also reported that they used the Internet in and out of school in enormously different ways. They attributed this disconnect to several factors: school administrators dictated how the Internet was used at school; teacher policies regarding how the Internet could be used varied widely within the same school (e.g., whether to allow its use for research, or in or outside the classroom); and assignments by teachers tended to be the least engaging uses to which they put the Internet. Students called for more imaginative uses of the technology to improve their attitude toward school and learning. They reported that the biggest barrier to further use of the Internet was quality of access, such as the times of day when they could use it, the physical location of terminals, and the need for permission. Blocking and filtering software also constrained their use of the Internet; although they recognized the need to protect young children from inappropriate sites, they wanted more assignments that relied on the technology. Finally, when asked what schools should do, their recommendations mirrored those of many teachers, officials, and observers: increase the quality of access; teach teachers how to use the technology and provide them with technical support to fix problems; teach students keyboarding and Internet literacy skills; make high-quality information freely available, accessible, and appropriate for each age group; and take seriously the digital divide that limited access to the technology to wealthier students and schools. In short, they were telling schools to give them the same amount of access to the Internet as they had outside of school and to let them use it more as a tool than their teachers currently permitted.89 All through the period discussed in this chapter, we saw teachers and their supporters quite defensive in explaining why they did not use computers more in their teaching, even though at home they certainly used them to the extent evident among Americans of all walks of life. To be sure, teachers and their defenders complained about the limitations of the technology and of their circumstances far more than workers in other industries did. But had they changed their attitudes and concerns at the same time as students were demanding more use of the Internet? In a survey done in 2004 on behalf of the teachers' own professional association, the National Education Association (NEA), we find a few answers. It reported that almost all educators now had access to the Internet both in and out of school and were "making valiant attempts to use educational technology as an instructional tool," although they were still "plagued with numerous problems," such as old hardware, too few computers, and insufficient technical support and training.90 The report made it clear through its language that use and support were quite thin, while the NEA made no mention of teachers trying to fix the problems by working through their schools and districts.
The NEA recommended, however, that schools acquire more computers, provide technical support and training in the use of IT, and allow teachers to participate in decisions made by school administrators on what digital tools to acquire and how
best to manage them. Like the students, the NEA called for bridging digital divides among ethnic and socioeconomic cohorts. But quite telling in terms of the core issue of how teaching could be done with computers, it appeared that very little progress had been made on the pedagogical front: "The NEA strongly urges further research and development on effective technology programs to help inform the debate on the 'value' of technology in education . . . to help document direct links between school technology and student achievement."91 In short, teachers and their association essentially still questioned the value of using IT at all, just as they had in earlier decades, while their students had already started to embrace the new tools and were becoming impatient with their teachers. The U.S. Department of Education noted in its report on the status of education in 2005 that "today's students are very technology-savvy," citing various uses, but importantly, that over 90 percent of those above the age of five used computing in one form or another. Children wanted to have their own personal machines, access to the Internet around the clock, and unfettered use in school, reporting that they remained frustrated with the constraints their schools placed on their use of IT.92 The U.S. Secretary of Education, Rod Paige, blasted the teaching profession and school administrators on this issue: "Education is the only business still debating the usefulness of technology. Schools remain unchanged for the most part, despite numerous reforms and increased investments in computers and networks."93 Social commentators in the early years of the new century were still kibitzing about the changes in education required to produce the workforce of the twenty-first century, among them Alvin Toffler in his latest book, Revolutionary Wealth (2006), and Thomas Friedman in The World Is Flat (2005). Such commentators warned that if things did not change, parents and policy makers would force traditional schools to transform, citing the rise in alternative schools, magnet schools, and home teaching as examples of changes already under way. However, schools were also creating new models for instruction, borrowing, in effect, from emerging practices in higher education. Specifically, in the early 2000s, "virtual schools"—offering what higher education and the private sector called "distance learning"—began appearing around the country. These provide classes taken over the Internet that offer credits toward fulfillment of graduation requirements. By 2005, virtual schooling existed in over fifteen states, and hundreds of thousands of students participated. Some 25 percent of all K–12 public schools took part in such offerings by 2005, with high schools as the largest participants.94 This development could provide a path for leveraging technology and innovating pedagogy that might in turn transform how schools evolved in the years to come.
Conclusions

In the meantime, the process of digital aids to education coming into schools continues, and another round appears under way as the costs of computing devices keep declining, making their way through American society in the form of
ever smaller, more portable digitalia. In 2005, for example, handheld computers (usually called PDAs, or personal digital assistants) and laptops spread widely, with about 28 percent of all school districts making these available to students and teachers. One in four PCs in schools was now a laptop, reflecting the continuing rapid decline in laptop costs over the previous several years and their convenience and capacity. By 2007, students and teachers were equipped extensively with such devices as hand-held calculators (first introduced into schools in the late 1970s) and cell telephones, but neither of these mainstream digital technologies was used as a teaching machine. But the concern about results had not gone away. For example, as late as December 2005, one could hear concerns such as those of Robin Raskin, founder and former editor of FamilyPC magazine: "Despite the fact that we have spent gazillions of dollars in schools on technology, it's still just a leap of faith that kids are better educated because of that."95 Recent experiences with PDAs and laptops suggest a remarkable consistency in the experience of the Education Industry over the past several decades. On the administrative side, there is little to discuss: the experience of administrators mimicked that of the rest of the American economy, because digital technology evolved early and effectively into affordable forms that made sense to install and use. It was on the teaching side that the technology took far longer to evolve and, one might argue, still has far to go before making it possible to "automate" learning and teaching, although that seems to be a process just now getting under way. That slow evolution of the technology can be attributed to several factors. For a computer to provide effective teaching automation (or partial displacement of teachers) requires a far more sophisticated form of computing than was available throughout the twentieth century. One could argue that companies, teachers, and administrators in schools and districts failed to collaborate in creating appropriate technologies, while every other industry did so with suppliers of computers and software. The experience of CDC and PLATO notwithstanding, the criticism would be reasonable to levy against all concerned. But if one is eager to find fault with why schools did not do more with computers, there is plenty of blame to go around. First, districts and principals failed to do what managers in over 100 industries routinely did: buy enough current technology, teach their employees how to use it, provide enough time to master the tools, and then tough-mindedly order them to use it or face dismissal. Those tasks were rarely done. In short, poor management practices were widespread when it came to the implementation of technology. But one could ask, "What about all the computers that were installed, particularly during the Clinton years?" The obsession government agencies and schools had with getting equipment physically into classrooms and networked was reasonable to expect, although disappointing in that this initiative was not accompanied by sound implementation practices. So much equipment was wasted. Furthermore, teachers apparently proved quite intransigent about changing their teaching methods, regardless of whether one is discussing digital tools or not.
That is not to say they failed to embrace new technologies—we saw in this chapter that they did—just not to the extent evident in other walks of life, and in part
because of the way they taught and also due to the primitive state of much teaching software. One could quibble about the "right" and the "wrong" of this reality; nonetheless, the fact remains that at the dawn of the new century, teachers taught in ways very similar to those working at the start of the 1900s. Rather than just fill up schools with computers, one might have hoped that federal and state officials would have spent some of the available funds on teaching teachers about the technology and on creating effective software tools. On the other hand, there is much to compliment all the parties involved. For one thing, students, teachers, and administrators used the technology when it made obvious sense to do so: word processing, e-mail, text messaging, tabulation of grades, budgeting, and so forth. Teachers used this technology in support of their administrative functions and to develop aids to teaching, such as PowerPoint presentations and replacements for old mimeographed teaching aids. So, while some sort of a digital revolution did not occur in education, the digital hand did assist schools. As far back as 1988, one educator predicted that computers would "be a transforming force but one that retains its place as an educational tool rather than as a technology that dictates the patterns of learning and social interaction."96 He got it just right. So educators responded in a responsible manner, suggesting that, more than in most industries, expectations of the computer proved far more exuberant than the realities of its ever expanding capabilities warranted. As the experience described in the next chapter demonstrates, IT in education was a tool, not yet a replacement for instructors, despite the aspirations of some commentators and administrators. But were the parents and the public right in demanding that children learn about computers? They wanted children to understand how to function well in a world rapidly filling up with digital tools. It turned out that the children did, thanks to the noneducational activities of parents and others. Children used computers their parents had at home, acquired digital games, adopted hand calculators and PDAs, and made cell phones integral to their social lives. They used the Internet as a convenience and entertainment vehicle, downloaded music to the consternation of the Recorded Music Industry, and all the while text messaged and gossiped in chat rooms and, most recently, in blogs. In short, and regardless of what teachers and principals did, students became very tech savvy, so much so that they are now the latest to advocate that schools "get with it" and use digital and telecommunications technologies more effectively. Students learned what they had to in order to thrive in the Information Age, leaving unanswered only the question of whether they were learning the other bodies of knowledge and skills necessary to hold down jobs and live in a democracy-centric modern society. The Education Industry, however, represents a set of experiences apart from those of other sectors of the American economy and, more specifically, of the public sector. We do not need to rehash that point again. However, what we can observe is that no industry was immune from the influence of the computer's existence in American society. As in every other case examined in The Digital Hand, influences seeped in from other industries, vendors, and people. Important sources of influence included academic researchers and teachers of
teachers; the experiences of others with training tools (such as the military had); vendors of hardware and software, on and off again, depending on their perception of the attractiveness of the education market; and that small group of administrators and teachers who were early adopters of the technology. Parents and public officials had a remarkably small influence on the use of IT in this industry; corporations that donated equipment and software even less so. To be sure, the Education Industry proved strongly resistant to change—almost as reluctant to evolve as the Higher Education Industry in the way its institutions operated and teaching was done—but it did listen to persuasion when new tools fit into the existing norms for doing things, the same criterion used by people in so many other industries for determining if a new digital tool made sense to use. The conclusion we can reach, in the most general of terms, is that this industry demonstrated some of the limits of the nation's transformation to a digitally rich economy. The technology had to enhance, then transform, how the daily core tasks of an industry were done. When that was not the case, deployment proved limited. The cost of units of technology had to match the size of available budgets. By this, I mean that if an educator could only spend, say, $2,000 per transaction, then IT had to be for sale in increments of about that amount or less. Until the arrival of the PC, few schools and districts could afford computers, even though the cost per transaction had been declining for years. If one could only buy machines costing hundreds of thousands of dollars (or more), then the technology would be out of reach of teachers, principals, and many school boards. The nation collectively spent billions of dollars on computing for this industry, but it expended the vast majority of those funds in small amounts across many transactions, in a long, incremental process. PCs, for example, came into this industry in piecemeal fashion, not literally by the truckload as was the case in so many other public agencies and companies—a form of deployment that proved essential if machines were to be networked or if whole communities of teachers and students were to become dependent on the technology. Only administrators had sufficiently large budgets for their administrative needs to make the acquisition of larger, more expensive systems possible, normally by leveraging the economies of scale of a school district's larger aggregated budget. The effect of highly fragmented budgets on the deployment of IT awaits its historian; nonetheless, the problem existed in both K–12 and higher education to an extent far beyond anything evident in the other industries examined for The Digital Hand. Even when a specific form of software or hardware proved eminently useful for teachers, it might not be adopted, as we saw with the new generation of PCs of the early 1990s, which was far better suited to the needs of schools than the old Apple IIs of the 1980s but which schools acquired slowly. The problem was often budgetary: the lack of sufficient funds, in large enough amounts and in a timely manner, to have the kind of effects on a school or district so evident in other industries. The flow of budgets is a factor that needs further attention; it is also one almost totally ignored by the scholars who have looked at the role of IT in education.
What does higher education have to teach us about the role of the digital hand in education? With some 25 percent of the American public being graduates of such institutions, higher education is a major collection of institutions in modern society that cannot be ignored. It is also an industry that played a profoundly important role both in the development of this technology and in its introduction to all industries, including government agencies. It is for these reasons that we need to understand its pivotal role in the history of modern computing in America.
9

Digital Applications in Higher Education

In the field of scholarship and education there is hardly an area that is not now using digital computing.
—President's Science Advisory Committee, 1967
American universities played the central role in the development of computers in the 1930s and 1940s, doing much of the hard work of designing and experimenting with the first systems and then, in the 1950s and 1960s, in creating the field of computer science. In subsequent decades, they trained tens of thousands of computer scientists, engineers, and future business executives who went on to create what many commentators like to call the Information Age. All through the second half of the century, science and engineering professors improved information technology, even as the locus of most R&D moved substantially from their campuses to the private sector, beginning in the late 1950s. The story of the role of academia in creating the computer is well known and need not detain us here.1 However, other aspects of the story of computing in higher education have not been so well studied. For instance, less understood is the story of how higher education used computers and telecommunications for its own internal operations. As a general observation, two- and four-year colleges, and even more so universities, became voracious users of the digital hand, but, as described below, in many uneven ways. Like businesses, for example, administrators used the digital hand early on in support of administrative and accounting functions. Even more than K–12 teachers, professors proved slow to use computing in their classrooms in direct instructional capacities, but they found the technology very useful in support of such classroom-related activities as e-mailing students,
posting reading materials, as sources of information, and for research. Indeed, they used this technology extensively in research, particularly in the physical sciences, later in the social sciences, and only recently in the humanities. Telling the story of computing in higher education is daunting and complex, compounded by the fact that throughout the second half of the twentieth century, the number and variety of institutions grew from just over 1,800 (1950) to in excess of 9,000 (early 2000s).2 Also, the history of the use of IT has to be set into the context of this industry's enormous expansion within the American economy. In 1960, for example, there were 3 million students and the nation spent $7 billion on them. At the end of the century, over 15 million students were enrolled in postsecondary education at a cost of $237 billion per year. Public colleges and universities absorbed the majority of that growth; in fact, 80 percent of students attended these institutions by the end of the 1990s. Put in more formal economic terms, higher education accounted for 2.6 percent of the nation's gross domestic product (GDP).3 In short, higher education played an important role in American society. Patterns in the adoption and use of digital tools, and the degree to which this was done, can be understood by looking at several types of use throughout the period. This chapter describes applications of IT in administration, teaching, research, and libraries—four major foci for the use of computing in higher education. I do not discuss the training of computer scientists and the rise and evolution of computer science departments.4 However, because of the extraordinary presence of PCs and the Internet in higher education, we also need to understand their role in this industry.
Figure 9.1. In the 1940s and 1950s it was common for many large universities to build their own systems. This is the University of Wisconsin's WISC system, 1957. (Courtesy University of Wisconsin Archives)
presence of PCs and the Internet in higher education, we also need to understand their role in this industry. Higher education functioned within the context of a self-enclosed, highly articulated culture that exhibited patterns of behavior unique unto itself. There has been a massive and extended dialogue about the character of its institutions, and also regarding their resistance to change, and about the need for their uniqueness, reflecting many patterns of discussion and controversies evident in K–12. This colloquy intensified over the course of many years as the number of college graduates increased in society and as the cost of higher education to the nation did too. These events occurred during an era when the role of education and knowledge became increasingly important to the welfare and workings of modern society and as most industries changed profoundly. For our purposes, what is useful to understand, however, is less the merits of one point of view or another in this debate and more about what role information technology played in this industry. Let there be no misunderstanding, however; higher education behaved more like an industry than many other public and private corners of the economy. To understand the story of computing in American higher education, one should keep in mind that colleges and universities were rarely monolithic institutions with strong command-and-control cultures as more evident in private sector firms. Rather, they were, and still are, various quasi-independent communities that shared a campus, but operated in a relatively decentralized fashion, to the extent that one could find contradictions here in roles and purposes all through the history of computing. As one observer of the scene noted in 1962, “Universities are not only customers for large scale computation facilities but are also in the rather unique position of applying and teaching computation techniques developed for research,” yet with few communities on campus collaborating or coordinating activities to the extent seen in general, for example, in the private sector.5 Thus, professors may choose to use (or not) computing, while down the street a group of administrators may be investing heavily in digital technology to streamline their work. Generalizing about IT in higher education thus must be tempered by recognition of the reality that higher education does not have the same cohesive institutional structures evident in other parts of the public sector, and that reality often affected how members of this industry used digital tools.6 The story told below is about the extensive use of IT and telecommunications in many ways, but at the end of this discussion we will be left with the conclusion that despite such use, when compared to corporations and some other public agencies, higher education fundamentally changed less its structure, culture, and role. In fact, the normal results one would have expected—increased productivity, more effectiveness, lower operating costs, fundamental restructuring, and so forth—did not materialize to the extent its critics expected. Observers of the industry noted as recently as 2002 that in the prior two decades, tuition had doubled at twice the nation’s rate of inflation, while enrollments in four-year institutions expanded at one-half of a percent and ended the century with severe
budget crises, while some 500 had gone out of existence. Meanwhile, corporate and for-profit educational institutions prospered and expanded. Some 2,000 corporate universities operated, up some 1,600 since the start of the 1990s.7 While these new forms of education thrived, largely by offering highly innovative services, traditional higher education institutions maintained the same "look-and-feel" as decades earlier. Task forces and industry organizations pleaded constantly for change, recounting similar urgings over the years and warning of dire consequences to come if no organizational and cultural transformations occurred, but all to no avail.8 So, on the one hand, we see that higher education had an insatiable appetite for IT; on the other hand, its use of the digital hand had not yet triggered the transformative effects on this industry so evident in many other parts of the economy. This is a feature shared with many government agencies and departments. Why those effects were less evident in the public sector in general is discussed in the final chapter of this book.
Administrative Uses

Across every commercial industry and corner of the public sector, use of computing in support of administrative operations represented some of the earliest uses of computers, particularly mainframe-based applications. Higher education proved to be no exception, and for the same reasons: the technology lent itself best to simple mathematical calculations and data collection, sorting, and storage. It worked well in large institutions with substantial amounts of clerical and accounting work, where administrators could take advantage of economies of scale, while initially providing incremental new ways of looking at data in support of managerial planning and decision making. As two observers of computing in higher education noted in the early 1970s, "just as computers have proved themselves useful, and sometimes indispensable, in the clerical tasks of business, they have demonstrated their value in the related tasks of educational institutions."9 For the same reasons that managers in other industries used IT throughout the second half of the century, managers in academic administrative offices also applied these tools in their daily operations: to satisfy the need for more formal, fact-based managerial decisions, to handle larger volumes of transactions (due to greater numbers of students), and to respond to growing demands of governments and citizens to account for results. While administrators in the 1950s began using computers for their internal purposes, by the end of the century they were heavily involved in the administration of massive networks in support of all members of an academic community. They managed institution-wide IT budgets that were difficult to contain as demand for IT grew and that came to exceed 5 to 8 percent of their institutions' overall budgets. By the dawn of the new century, administrators were beginning to recognize that community colleges, four-year institutions, and universities were perhaps finally beginning to change far more than in the recent past, and that they were occupied in that process, driven as much by the growing uses of IT as by competitive forces, such as corporate and for-profit universities.10
Table 9.1 lists areas of administrative activity that administrators supported by using computers. As in so many other parts of the economy, they deployed computing incrementally and iteratively over time across the broad areas of operation listed in the table. Initially they used batch applications to collect accounting and financial data, next to collate and sort information, and then to produce reports. Online interactive applications that provided services to the entire academic community came in the second half of the 1960s to some institutions and to most others in the 1970s, while new uses of IT in nonfinancial operations continued to expand. During the last quarter of the century, PCs began to play a far more significant role in higher education than in K–12, much along the lines evident in other industries, although more intensely. Simultaneously, administrations began to network their campuses. Nothing proved so important in the last ten to fifteen years of the century as use of the Internet in combination with PCs and laptops in the work of professors and students, although less so for administrators, who continued to rely mainly on centralized computing; but it was the latter who had to build, maintain, and improve their institutions' networks.

Table 9.1 Collections of Typical Higher Education Administrative Processes That Used Digital and Telecommunications Tools, 1950s–2000s

Financial processes (e.g., cash receipts, purchase orders, financial statements)
Personnel processes (e.g., payroll reports and disbursements, benefits administration, hiring of faculty and staff)
Student processes (e.g., issuance of transcripts, maintaining grades, course enrollments, tuition and fees administration)
Grants management processes (e.g., grants reporting, time and effort reports, budget tracking, proposal management)
Management information and analysis processes (e.g., research management, enrollment management information, workforce analysis, sources and uses of funds)

Source: Judith Borreson Caruso, Good Enough! IT Investment and Business Process Performance in Higher Education (Boulder, Colo.: Educause, June 2005): 1–14.

Administrative staffs of colleges and universities had long used precomputer information-handling tools, such as tabulators, adding and calculating machinery, and typewriters, and had partially integrated these office appliances into their daily work in accounting, financial management, personnel practices, student records, and so forth. When computers became usable as administrative tools by the mid- to late 1950s and began to replace the older punched-card tabulating systems originally installed in the 1930s and 1940s, the very largest universities and colleges were often the first to explore the possibility of automating daily tasks, much as in most industries and large government agencies. Some even had large data processing staffs poised to implement new uses of information technologies. For example, Pennsylvania State University had been using precomputer IT since the 1930s and by the mid-1960s had a staff of
seventy running a combination of older computers (such as an IBM 7074 and a 1460), and soon operating the newer IBM S/360.11 Like so many large schools, this one had first started using computers for administrative purposes in the second half of the 1950s and, as newer computers became available, moved to these devices and to more current releases of software. Accounting functions were automated incrementally first; next came services for students, which, when combined with administrative applications, accounted for about 80 percent of all the work done by the data processing organization by the mid-1960s. Students and faculty used excess computing power for their research and studies.12 Other major universities followed Penn State's example, such as the University of California at Irvine, which moved from batch to online systems in the 1960s. Growing availability of software, computer memory, and data storage made it possible to change systems, moving increasingly from sequential files on cards or tape to direct access on disk drives. That created more uses as the technology evolved, adding applications, for example, in support of early online enrollment systems and libraries.13 By the end of the first decade of administrative computing—approximately the mid-1960s—over 70 percent of all public four-year institutions used computers to one extent or another in direct support of administrative operations, as did 99 percent of all universities with enrollments of over 7,000 students.14 Administrators faced growing volumes of work in the 1960s, which gave them impetus to rely increasingly on this technology, largely to process work much along the lines they had with tabulating equipment in the 1940s and 1950s. They changed slowly as the technology evolved and as it became more obvious that it could be used in new ways, such as searching online for data.15 As two observers at the time wrote, "with swelling enrollments and more complex and expensive facilities and with ever-growing requirements for documentation of everything, the college administrator . . . finds the computer to be the only hope for keeping abreast of his job."16 Already, some effects of computers could be seen on campuses:

Students may complain that the use of computers for admissions, registration, and the maintenance of records is impersonal; but the fact is that without the computer, registration lines would be longer, admissions would be slower, and the student's range of alternatives would be smaller. It is not the computer that makes the system impersonal; it is simply the growing size and complexity of our institutions which tax our ability to devise workable and humane administrative bureaucracies.17
Every major accounting and financial function, and many student record management functions, were ported over to computers by numerous institutions in subsequent years. In the 1960s and early 1970s, most midsized and larger schools—and later, increasingly, community colleges and smaller four-year institutions—began using such applications. Roughly 30 percent of expenditures for digital tools were allocated to administrative support; the rest went to research (40 percent) or instructional purposes (30 percent), even as early as the mid-1960s.18
Introduction of computing was an iterative process, "a history of a gradual coming to terms between an old process and a new method," as one observer described it in the early 1970s.19 As staffs learned what computers could do, they automated records and calculations, and later used terminals to access this data directly to answer questions, but they still produced reports in batches as in the precomputer era. What were the earliest uses and why? The first applications focused on financial and accounting processes because they represented the most orderly and already structured operations, and hence were the easiest to move to computers. Close behind these processes came those of the registrar. In addition to sharing highly structured tasks, both engaged some of the largest numbers of sequential and routine operations in administration. They also involved some of the most paper- and labor-intensive administrative activities at a college or university.20 Administrators were motivated less by quests to reduce costs of operation than by their more urgent need to keep up with growing workloads. As one University of California administrator admitted, "Almost every system we have built is more expensive than the one it has replaced because we collect more data and generate more reports."21 In the 1970s, while the cost of computing equipment dropped, administrators learned from each other how best to use such technology, and vendors became active in selling software packages and equipment, growing increasingly knowledgeable about the needs of academic computing. The use of computers spread to additional functions, most notably to such student-related activities as creation of class rosters within admissions, while batch applications began evolving into online versions, with approximately 15 percent of all uses of computing occurring in this new form by the middle of the decade.22 Table 9.2 documents key application areas of the mid-1970s. The list reflects a substantially larger number of uses than deployed a decade earlier, when systems focused principally on accounting and financial reporting. Not cited, but also increasing in number, were uses of computing to collect, analyze, and report on alumni relations and financial aid administration. By now personnel records management had also become an increasingly important area subject to partial automation.23 As minicomputers began appearing on campuses, some administrative operations moved from large centralized data centers to decentralized facilities. In large universities, some of these applications moved to schools and colleges within an institution, and increasingly administrators converted these from batch to online systems. Case studies of successful deployment of networks, online administrative systems, student records systems, and other applications cited in table 9.2 began appearing in the 1970s, encouraging other schools to follow their lead.24 The physical and spatial distribution of work at a campus also stimulated the creation of early networks, before the arrival of the Internet, to connect various parts of an institution. Distribution of computing to various organizations within an institution also reinforced the preexisting decentralized structure and culture of higher education.
Table 9.2 Administrative Applications in Higher Education, circa 1976 (roughly ranked from most to least deployed)

Financial management
Admissions and records
General administrative services
Planning, managing, and institutional research
Logistical and related functions
Financial aid
Library operations
Physical plant operations
Hospital administration

Source: John W. Hamblen and Carolyn P. Landis, eds., The Fourth Inventory of Computers in Higher Education: An Interpretive Report (Boulder, Colo.: EDUCOM, 1980): 76–79.

In the 1970s, administrators used computers in essentially the same way as their counterparts in the private sector. In addition, however, faculty put
pressure on administrators to provide them with ever increasing amounts of computing capability: to support their research, primarily at large universities; to enhance teaching, including at smaller four-year institutions; and to carry out their administrative duties, such as posting grades and scheduling classes.25 One of the watershed technological events in higher education of the second half of the twentieth century was the arrival of microcomputers on campus, beginning in the late 1970s. They spread rapidly in the 1980s, achieving as close to ubiquitous status as one can imagine by the early 1990s. That trend will be discussed in greater detail later in this chapter, but for purposes of reviewing administrative uses of computing, one needs to recognize that this technology now began to play an important role with administrators as well. For the most part, however, computing systems in the late 1970s and early 1980s were highly centralized; they operated on large mainframes housed in centralized data processing centers. Data centers reported to administrative vice presidents or vice chancellors in most cases. Some administrative systems were also housed in colleges and professional schools, using time sharing on an institution's mainframe, a stand-alone mainframe (for example, in engineering schools), or a minicomputer, often controlled by deans or directors. With the arrival of PCs and their rapid spread across the entire Higher Education Industry, one would expect that beginning in the 1980s administrators too would begin to use this technology. In fact, they did. Like professors and students, they used these machines initially as word processors and to create and use spreadsheets, later to produce graphical representations of data, and, by the end of the 1980s, to do e-mail. It should be noted that a small subset of the academic side of campus had been e-mailing via
what later came to be known as the Internet since the early 1970s, but administrative staffs remained minor users of e-mail until the 1980s, when the Internet and other networks spread across their institutions. Meanwhile, uses of PCs within administrative functions for more than just preparing word documents and spreadsheets began in the 1980s as software packages became available for use in their functional areas. One reliable list of administrative software products published in 1987 listed eighty-seven packages for admissions, accounting, financial contributions, grades, financial needs assessment, generating report cards, and student loan management, among others. That same list catalogued twelve packages for use on minicomputers, and only three recent additions suitable for use with mainframes.26 Administrators shared with faculty and students alike some common experiences with microcomputers. As one observer from the period later recalled, "few people in the campus community (or elsewhere) anticipated the movement to freestanding desktop systems that began in the late 1970s and exploded throughout the early and mid 1980s."27 But once micros came, administrators too began to feel personally more enfranchised to use computing, since micros were easier to use and more accessible than large mainframe systems. Senior management in administrative organizations, however, also saw a rise in demand for campus networks beginning in the mid-1980s. Students, faculty, and their own staffs started asking that these devices be connected together into networks and that they get access to existing digital files resident on mainframes.28 In addition to word processing and graphics, administrative staffs began doing financial modeling, using spreadsheet software to inform their decisions. Economic justification of micros in administration, however, came from displacing pre-microcomputer word processing equipment with inexpensive, easy-to-use PCs, which could in addition be used for financial modeling. Modeling represented a new activity, making it possible for administrative staffs to impose greater order on budget development processes than had been possible previously. They imposed standard budgetary templates that could be used by departments, deans, and the entire institution on an iterative basis. That capability spread to analyzing enrollments and patterns of expenditures and to comparing financial and other performance to that of other schools, in the process changing what data management relied upon to make decisions. That new practice led them to create more precise views of their institution's situation, couched in numerical terms, before forming opinions, making decisions, or taking action.29 Another by-product of these new uses of the digital hand concerned a shift in the status of individual staff members. By the late 1980s, individuals who knew well how to use computing rose in importance and status in their offices, regardless of title or rank; thus an undergraduate working as a part-time employee, a lower-ranking clerk, or an employee providing financial analysis to a vice chancellor responsible for accounting could in some cases wield more influence on decisions than line management. However, it is not yet clear to historians how those newly acquired skills affected careers. While microcomputers spread through administrative offices, existing and new uses of computers largely involved relying on mainframes with an increasing
number of online applications. These were important, particularly for student services, because such uses of computing made it possible for administrative personnel to answer questions on the spot and to provide real-time services and reports to students, faculty, and other administrators without having to ask data processing personnel to do the work overnight. Online systems reduced the need for clerical workers, even more so by the 1990s, when self-service terminals and applications became available. Online systems began to improve productivity. As an example, online student enrollments—a popular new use of computing by the late 1980s that spread rapidly in the 1990s—illustrate how things changed for many applications and their users. In 1980, Tom Edmunds, Vice President for Student Affairs at Central Missouri State University, described the change:

Previously, students met once with advisers to work out their programs, then had to return another day to have the program confirmed; it took that long to determine whether the course selections could be accommodated. If any of the selected courses were full, the student had to start over again. The system keeps track of the number of available seats in each course section, so advisors can see at a terminal if a section is full. This allows advisers to enroll students and confirm schedules during one advisory session.30
During the 1980s, other uses of online computing resulted in similar descriptions of how tasks changed in content, speed, and efficiency, particularly once a software system stabilized and worked as intended, which often took as long to achieve as major application installations in the private sector or in government agencies.31 By the late 1980s, demand for computing across all campuses in the United States had intensified, making delivery of such capability one of the leading managerial issues administrators faced, one that grew in the 1990s with further deployment of PCs and, of course, the Internet. But already in the 1970s and 1980s, administrators were scrambling to add computing capacity onto mainframes, buying PCs, and finding ways to expand networks, because in addition to their own investments in IT, professors and students were acquiring systems that they wanted to attach to academic networks. Decisions regarding acquisition and use of IT had decentralized, sometimes leaving management in a weakened position to control demand, let alone expenses. Deans at various schools were being pressured by their charges to acquire more computing as well.

Administrative computing remained largely centralized in this period (1970s–1980s). At least during the first half of the 1980s, administrators still consumed about half of all computing used at an institution, slightly less at a large research-oriented university with its many science-based researchers, and more—often exceeding 50–60 percent—at community colleges and four-year schools. This had been the case even more so in the 1960s and 1970s, when administrative dominance of computing led many departments to acquire their own computers (minis), and in the late 1970s and 1980s microcomputers, in support of their own research. Those initiatives started the trend of distributed processing that
became such a widespread feature of many colleges and universities by the late 1980s, often resulting in a vast assortment of products from almost every hardware and software vendor, frequently hundreds of brands and models, some old, others new. Normally, they were incompatible with other brands or earlier models; that is to say, data and software on one machine could not be shared or used on another brand of software or hardware, making networking difficult in many cases without wholesale replacement of older devices and software.32 It would be difficult to exaggerate the difficulties this situation posed to IT organizations responsible for creating and maintaining campus-wide networks and providing training and support to individual users. It was still a thorny problem in the early 2000s. Presidents and chancellors responded by creating IT organizations within administrative functions to start supporting the wider academic community with networks, technical standards, institution-wide purchase contracts, help desks, training in the use of IT and networks, and so forth. They also had to increase the percentage of their budgets devoted to IT of all kinds. In the 1960s and 1970s, about 2–3 percent of a school's budget might have gone to digital tools; by the end of the 1980s, these percentages had crept up to 5 percent and, for some technologically oriented institutions, exceeded 10 percent. All of this growth in expenditures occurred within roughly one decade and proved somewhat higher than in the rest of the economy.33

By the early 1990s, administrative functions had widely embraced e-mail and began providing many of their services online, increasingly over the Internet, campus-wide intranets, or LANs. Indeed, by the end of the decade, the vast majority of colleges and universities used Web sites to deliver information and services. Administrative functions that became widely available over the Internet (that is to say, offered by over half of all institutions) included undergraduate applications, course catalogs, listings of program and graduation requirements, class registration, library catalogs, applications for transcripts, and press and media information, along with, just beginning in less than a fourth of institutions, some form of e-commerce (i.e., the ability to buy and pay for goods and services online).34 At the beginning of the decade, administrators at only a third of all campuses had an IT deployment plan for extending their services online or for supporting the needs of their institutions as a whole; by the end of the decade, that percentage exceeded two-thirds, a result of campus-wide pressure for more services and a way to control the rising costs of IT during a decade when most institutions had to cut their overall operating budgets.35

Key trends, however, mimicked what one saw in other industries. For example, large institutions installed ERP systems to put financial operations on a more businesslike footing and to integrate independent tasks that needed to share common data sets. Pressure to produce more reports, particularly about financial matters and on results for benchmarking and justification of budget requests, increased all through the 1980s and 1990s as well, stimulating interest in ERP systems. "Homegrown" software applications were thus slowly replaced with commercially available versions that integrated financial,
student, and human resource functions.36 As in other industries and parts of the public sector, early Internet sites provided information about various services, then evolved into interactive tools with which to conduct transactions, such as enrolling in classes online, albeit slowly in the 1990s. By the end of the century, one could, for example, use credit cards to pay for services online at less than 20 percent of all institutions, while almost all major retail stores offered online purchasing. Online registration for classes could be done at just over half of all institutions as well, although students could identify what classes to take online at nearly all schools.37 The story of online offerings in the early years of the new century is more a history of expanded use; the real change from the old ways of doing business before the Internet had been essentially completed during the 1990s. Due to severe budget cuts in the early 2000s, administrators began converting to self-service online services, whereby individuals would enter their own data, for example, to reduce the need for staff and thereby keep down operating costs.38

Academic analytics became a new use of digital tools, beginning in the 1980s, expanding in the 1990s, and becoming a major activity in the post-2000 period as budgets shrank and demands for accountability for results grew, especially from legislatures, foundations, parents, and students. Using software to collect information from existing databases, administrators used IT to understand student academic performance, to report on graduation rates by gender, race, age, and so forth, and to understand the deployment and demographics of faculty and the expenditure of budgets. Such tools spread from central administrative functions to the offices of deans and department chairs, following the practice begun with the development of budgets using computers as early as the 1960s and common by the end of the 1970s. The most active users of academic analytical tools by the early 2000s worked in finance, admissions, and research. Department chairs (and their staffs) and human resource offices were the lightest users, but they, too, had to rely on such software to perform their work. Not surprisingly, the most advanced uses occurred in budget and financial planning, where half the institutions extracted and reported on some transaction-level activities, while some 20 percent analyzed how budgets were spent or had begun to simulate potential budgetary allocations. Business functions and fundraising followed similar patterns of use.39

In hindsight, such uses of computing in the 1990s seemed obvious. However, administrations faced numerous uncertainties. Richard N. Katz and his colleagues at EDUCAUSE described the reality faced by senior administrators in this period: "The national economy was in the tank, states were drastically reducing budget allocations to their namesake universities, federal research was flagging . . . financial aid budgets were falling short of meeting needs as the economy sapped families' ability to pay high private college and university tuition, and enrollments declined at many institutions across the country."40 Reflecting on this time, one in which he participated as a senior administrator in higher education, Katz observed that "presidents and chief financial and information officers came to believe that if American industry could transform itself into a more efficient
and competitive mode, then so could—indeed, so should—colleges and universities, to achieve efficiencies consistent with a diminished and limited resource base."41 They also needed to replace old systems in response to the potential threats posed by Y2K. He recalled, however, that faculty and other groups on campus resisted many of the proposed changes offered up in response to the economic realities and demands being placed on higher education: "With each newly touted capability of an enterprise system came defenses of locally grown shadow systems for their unique service to a unique clientele," creating "concerns of confidentiality" and "territorial disputes about data ownership and control."42 And there were technical hurdles, not the least of which was that "the 'e' in e-commerce was easier promised than delivered because of incompatibilities across information technologies among vendors and universities," leading additionally to widespread underestimation of the effort needed to transform large applications and institutions, much as occurred at the IRS, at DoD, and at so many other federal and state agencies.43

While more will be said about deployment below, to end this discussion of administrative uses of IT we can ask: what were administrators spending their IT funds on late in the first decade of the new century? Administrative ERP and related information systems remained at the top of their list of investments in applications. Improving IT infrastructure also proved critical to delivering online education, course management tools, campus-wide e-mail, and security and identity management, to mention the most obvious needs. These priorities had remained essentially unchanged since 2000–2001.44
Teaching and Computers

Using computers in teaching brings the discussion of digital applications to a core mission of higher education. In many ways the issues mirrored those evident in K–12: classroom-centered instruction continuing throughout the century, lack of adequate faculty knowledge about computing, insufficient funding for equipment, not enough time to develop IT-based classes, and a paucity of incentives for professors to change their teaching methods. There were also critics of the value of IT in teaching, and other commentators who defended it as a new way of having students interact with the material being taught.45

There were some differences as well, most notably that by the end of the century, traditional higher education institutions were beginning to compete with for-profit universities and corporate training programs that did make extensive use of IT in teaching and in providing remote training known as "distance learning." This development led many observers to begin questioning the long-term viability of existing colleges and universities at the dawn of the new century.46 In addition, both students and professors used computers in support of classroom work, for such things as research, writing, and communications, a combined set of uses of computing (most notably PCs) not as evident in K–12. But as in K–12, computing as a teaching tool came slowly to higher education and for the same reasons.
A U.S. presidential panel in 1967 catalogued a litany of reasons that computers were not used, largely inadequate funding, but reported that use had, nonetheless, started. It noted that as of 1965, less than 5 percent of all students even had access to computing for any purpose, and even those students were concentrated at large elite universities.47 Early uses of computing in teaching occurred in the hard sciences and in engineering to solve complex mathematical problems. Computing to simulate problems and solutions also began in the 1960s and early 1970s, with faculty and students using the technology in much the same way as in research. In those years, there were few large files that could be queried, a situation that changed as machine-readable data sets expanded in size and number all through the 1970s and 1980s, created by various government sources (e.g., census and economic data) and by the physical and natural sciences. Early users of computer-aided instruction (CAI) focused on drilling exercises, much as in K–12, but these, too, were quite limited. Finally, we should acknowledge that students studying computer science were, of course, taught programming languages and about computer architectures and software.48

But use remained limited. One survey from the early 1970s suggested that less than 30 percent of computing budgets went to instructional uses, roughly the same amount as for administrative applications (nearly 28 percent), and less than was spent on research (40 percent).49 Reasons noted for this distribution ranged from the research orientation of faculty at large universities, to a reluctance to learn new skills (computing), to "laziness" and "inherent faculty conservatism."50 Use of computing in teaching remained comparatively sparse in the 1970s and, when it did take place, relied on access to large mainframes normally devoted to administrative and research purposes.51

With the arrival of microcomputers in the late 1970s, the situation began to change, as it had in K–12 and in so many other industries. As one professor at the time observed, "for many students entering university in the 1980s their first acquaintance with computing is through a microcomputer."52 This quote may qualify as the IT understatement of the half century in higher education, because by the dawn of the new century it would be reasonable to state that every college student either owned or used a PC or laptop in one fashion or another as part of their academic work. By the early 1980s, professors and students were relying on these machines to support their classroom, research, and administrative work, often acquiring the devices on their own, and increasingly through organized programs funded or managed by their institutions. A few academics experimented with computers in their teaching in the early 1980s, and by mid-decade, many schools were organizing efforts to extend deployment to ever larger percentages of students, from community colleges to universities. One observer from the period pointed out that professors were "not using software to deliver instruction." Rather, they treated computers as "tools for people—in this case students—to use, not electronic teachers to administer instruction."53 In addition to word processing, spreadsheets, and graphics, students could conduct research for their class projects by accessing large machine-readable databases and could use other software tools as part of laboratory projects. These were the same tools used in
business and government, and in the same way, at any level of academic work. In short, less was done to create instructional software when compared to what was being attempted at the K–12 level.

In that early period when PCs were just coming into higher education, say from the late 1970s to 1985, administrators continued to increase their use of centralized computing to automate functions in admissions and records, financial management, planning management, and in providing services to faculty, students, and staff, many indirectly related to teaching. Online versions grew explosively. One survey suggested that online admissions and records systems and various financial systems doubled between 1980 and 1985, by which time 60 percent of respondents had online versions; indeed, essentially every major area of application that went online doubled in number.54

All through the 1980s, the number of students acquiring PCs increased. Campuses began creating networks that students could use to access online files and to communicate with faculty. Early experiments with chat rooms focused on specific courses first appeared in these years, but expanded exponentially after the arrival of browsers and the more user-friendly Internet of the late 1990s. By the early 1990s, however, surveys were reporting that use of computing had become relatively ubiquitous, often with over 90 percent of students and over 70 percent of faculty at least occasional users of PCs in support of classroom work.55 Table 9.3 lists types of uses reported in a national survey conducted in 1993. Software listed in this table was commercially available as products, such as WordPerfect and Word for word processing, Lotus and Excel for spreadsheets, or Minitab for statistical analysis.
Table 9.3
Uses of Commercially Available Software in Support of Class Work by Students, 1993
(listed as percentages in descending order of use)

Application              Students (%)
Word processing                75
Spreadsheets                   26
Graphics                       22
Database packages              15
Desktop publishing             15
Instructional software         13
Statistical analysis           10
Engineering tools               8
Presentation tools              7
Authoring tools                 6
Source: Extrapolated from Susan H. Russell, Manie H. Collier, and Mary Hancock, 1994 Study of Communications Technology in Higher Education (Menlo Park, Calif.: SRI International, 1994): 28–29.
The same survey demonstrated that roughly 80 percent of faculty and some 60 percent of students had access to computing tools, whether PCs of their own, machines in computer labs, or terminals in laboratories and libraries. If anything, however, this survey probably understated the extent of access everyone had to the technology by then. Impediments to even greater use by students and faculty were similar to those reported by teachers in K–12: expense of the equipment, insufficient software, time required to learn how to use it, and lack of access, with complexity of operation the most common complaint in the early to mid-1990s.56

In this period immediately preceding the arrival of browsers in the mid-1990s, networks in support of all manner of instructional purposes and research were, however, well in place. Survey data from the period suggested that over 75 percent of faculty and over 33 percent of students had access to some form of campus network. Use varied by type of institution, however. Faculty use proved most extensive at universities (greater than 77 percent) and least at community colleges (38 percent), while 65 percent of students at universities used these networked tools, compared with only 25 percent at community colleges.57 Classrooms and laboratories that were hard-wired to a campus network existed at over 90 percent of all institutions, but just as with K–12 networking, this did not mean all classrooms. Less than 20 percent of all classrooms were wired this way, but even that represented a substantial investment. Nearly 85 percent of institutions had connected one or more campus libraries to their school's network.58 But as with computing generally, faculty and students used these networks as they wanted, mainly in support of traditional research and teaching activities, not as interactive distance learning tools.

As occurred all over the nation in all industries, the availability of user-friendly Internet browsers in the mid-1990s led to a surge in the use of networked computing for communications, research, and entertainment. It is in this post-1994 period that renewed intense discussion surfaced about the wisdom and weaknesses of distance learning, generating a body of literature that rivaled that reviewed in the last chapter about K–12. Cases for and against were similar and thus need not detain us here too long.59 However, there also were some differences. Student demographics had changed in the 1980s and 1990s as the number of older students going to school increased. This population did not necessarily go straight from high school to college, spend four to five years full time in school, and then enter the workforce. Rather, on average, these students were several years older, and many worked full or part time while attending classes. Their new requirement was to balance work and school demands on their time; therefore, they became a logical potential pool of users of distance learning. As the 1990s progressed, they demonstrated a willingness to combine distance learning and traditional education. By then, earlier uses of telecommunications and some distance learning classes had demonstrated the efficacy of this approach to teaching, although it remained a minor form of instruction at most institutions.
Internet initially to supplement their traditional face-to-face teaching methods in physical classrooms. But in the process of using the Internet in this supplemental fashion, they and their students became more familiar with the technology, just as their institutions were wiring their campuses. Both professors and students were investing personally in the technology for use in their dormitories, apartments, or homes. Students became important sources of demand that increasing amounts of their education be delivered in ways that allowed them to select when to study. To be more concrete about the volume of traffic involved, surveys of the period suggested that by around 1997, nearly a third of all college courses used e-mail as part of the teaching experience, up nearly a third in volume since 1995.60 By the end of the century, usage had escalated.

Yet, as a method of delivering instruction, distance learning grew quite slowly in the late 1990s, with experimentation going on at various universities, such as the University of Texas, the University of Houston, and the Massachusetts Institute of Technology (MIT). As one report from the period also noted, professors were reluctant to use courses developed by other instructors; therefore, even early distance learning materials spread slowly.61 Nonetheless, most institutions had some offerings of this type by around 1997–1998, largely for part-time students, and one could find increasing numbers of community and four-year colleges offering such classes, though fewer elite research-oriented universities did so. Evidence from the period demonstrated that distance learning proved effective with college-aged students and that often such courses cost students and institutions about half as much as classroom-based teaching.62 As in K–12, students needed self-motivation to make this form of education work, and it could deprive students of the opportunity to develop interpersonal relationships as one normally did in a traditional college setting.63

Deployment of distance learning in the late 1990s is a story of slow expansion, not a tale of technology replacing traditional teaching practices or of creating the revolution in higher education that so many observers had predicted would happen quickly. However, while colleges and universities began dealing with the advantages and disadvantages of distance learning—an effort that extended right into the next century—and how best to fit it into existing programs in evolutionary ways, for-profit providers of higher education were moving more aggressively in the direction of providing distance learning for the reasons students wanted it: flexibility and cost performance, particularly for those who were working while taking classes.64 While the amount of training done this way by corporations and for-profit universities (such as the University of Phoenix) remained low in the 1990s, it was a rapidly expanding form of competition for traditional bricks-and-mortar public and private institutions. Some statistics suggest the scope involved. Between 1988 and 1998, for-profit institutions that offered degrees grew by 59 percent. Put another way, in 1998 these organizations had 28 percent of the two-year college market and 8 percent of the four-year market. Those shares of the student market grew each year into the next century.65

Influencing the role of traditional teaching, and that of the new segment of the market, were factors that went far beyond distance learning and other uses of
computing, but that made distance learning itself an important topic that varied depending on whether it was a school like the University of Phoenix or the University of Wisconsin delivering it. Let two experienced professors/administrators explain the differences:

Traditional colleges and universities tend to focus on inputs such as entering student quality and metrics such as expenditure per student as well as upon process dictated by established student-to-faculty ratios, credit hours, and degree programs. The new for-profit providers focus instead on outputs, on measuring student learning and the competency achieved by particular programs, forms of pedagogy, and faculty. They have set aside the factory model of student credit hours, seat time, and degree programs long preferred by the higher education establishment and are moving, instead, to anytime, anyplace, any length, anyone flexibility, customized to the needs of the learner and verifiable in terms of its effectiveness.66
Distance learning allowed these new enterprises to disaggregate offerings, thereby going after "low-hanging fruit": classes that were in great demand and that could be standardized, presented increasingly in highly media-flavored, modular forms attractive to students. In the process, they challenged the vertically integrated organization of colleges and universities. This unbundling of teaching replicated the trend already under way in many other information-based industries, including the media industries and many in the entertainment sector of the economy.67

Among professors dealing with the issue of distance learning, the potential threat of that kind of teaching coming from outside the academy hardly caught their attention, but it did catch the attention of some administrators and, most emphatically, of observers of higher education.68 By the 1990s, it had become a major example of the destructive gales of technological innovation that economists had spoken of for decades. While government control over who was allowed to grant degrees constrained growth in the number of competitors threatening traditional higher education, the digital hand nonetheless made it possible to offer programs in business, technology, accounting, and so forth that did not have to be taken at a campus or necessarily at set times. Combined with the slow response of higher education in providing more flexible programs at a time when demand for such services was increasing (1980s–1990s), computing's capabilities encouraged new rivals, indeed, over 1,000 around the world with over a million students by the end of the century, in addition to the some 1,600 corporate training facilities in the United States that did the same.69 The emblem of this new movement was the University of Phoenix, which had over 100 centers in thirty-two states and aimed its offerings at young adults aspiring to acquire master's degrees in various professional areas, such as accounting and IT. In 2002, it had over 100,000 students, of whom 18 percent were taking distance learning classes. Its student body grew by about 20 percent annually in the early 2000s.70 So far, the largest response to this rapidly growing new segment of the market was university extension programs that began offering Internet-based instruction,
an initiative driven more by administrators than by professors, one that began in the 1980s and picked up momentum in the 1990s. An example is the National Technological University, established in the 1980s by Colorado State University, which brought together various institutions to offer courses in engineering to students already working in business. As of about 2001, it had awarded some 1,400 M.S. degrees in engineering and management.71

Experts on higher education, conducting an analysis of the industry for the American Council on Education, reported in 2002 that a commodity market for education was emerging, made possible by the capabilities of distance learning. The implications they saw for how professors taught fell in line with the nature of changes in work evident in other industries: "Rather than spending most of his or her time developing content and transmitting it in a classroom environment, a faculty member might instead have to manage a learning process in which students use an educational commodity (e.g., the Microsoft Virtual Biology Course). Clearly, this would require a shift from the skills of intellectual analysis and classroom presentation, to those of motivation, consultation, and inspiration."72 Competition for an American university could come from anywhere in the world, given the availability of the Internet and a growing number of distance learning classes.

Meanwhile, deployment of distance learning (now also called e-learning) continued. By the early years of the new century, distance learning had become a major topic of conversation and initiatives. Faculty interest in participating in the development and use of distance learning grew slowly; those who had experience with it were largely pleased with the results.73 Increasing numbers of institutions had added distance learning to their offerings in the 1990s, such that by the early years of the new century, the majority had at least dabbled with the new approach.74 Yet even as late as 2004—a decade after distance learning had become a major topic of discussion—publications in the Higher Education Industry were still running articles on the virtues of this form of teaching and the potential positive effects it could have on pedagogy. One frustrated advocate declared that year that "the lecture method is literally unchanged from its introduction centuries ago, and many technology innovations remain in limited use."75 Extent of use remained a function of how far faculty and their institutions were willing to move intellectually, financially, and operationally. Faculty and students were extensively equipped with the necessary technology and wired campuses, yet both groups still used these largely in support of class-centered teaching and learning (such as e-mail and research).76 As one senior administrator wrote in 2004: "I'm struck by many faculty members' resistance to the obvious benefits of the maturing technologies," acknowledging problems faced by faculty: "shortage of time, money, and energy." His conclusion: "the problem is that the academic culture and the IT culture simply do not mix together well."77 Yet this optimistic dean proposed that what might work—and certainly did not yet exist—was having "tools that anyone can pick up, that can be customized, that are quick and adaptable, and that are less expensive in money, time, and commitment," all characteristics of many digital tools then in wide use, such as PCs, laptops, word processing software, and cheap Internet access.78
Role of Computing in Academic Research

While this chapter demonstrates that computers affected the tasks (work) done by faculty, students, and administrators, and less so the content and culture of higher education, the role of computing in research proved to be a profoundly different experience. What researchers studied, how they studied it, and the resulting consequences were by any measure revolutionary and transformative for all modern societies. Research was also the one area of academic life most dramatically changed by the use of computers. The results can be counted among American higher education's "finest hours," indeed, possibly the nation's greatest contribution to humankind. One is hard pressed to find observers who would castigate the overall results achieved in research. When criticized, academics were accused of sometimes picking trivial topics, but as an aggregate contribution to society, more new knowledge was created by academics in the United States during the last half century than in all time by any other society. Using the list of Nobel Prize winners as a surrogate measure of results, of a total of 758 people so honored through 2005, 269 were American professors, and in some categories Americans dominated even more. The proportion of American winners during the second half of the twentieth century actually increased, with multiple U.S. scholars recognized routinely each year.79 It is nothing less than a remarkable accomplishment, one that dwarfs any petulant discussion we might have about the role of computing in administration or teaching in higher education.

For many academic fields enjoying this historic achievement of creating new and useful knowledge, the digital hand played a central role. Furthermore, it was a role taken up at the moment computers were first built. The computer was a nearly ideal new tool for scientists and engineers in particular, later for social scientists and economists, and still later for scholars in the humanities. It allowed them to take up myriad idiosyncratic research issues and to augment their existing methods of research, at a perfect time in the evolution of investigation, particularly in the physical sciences, mathematics, and engineering. Since the seventeenth century, mathematics both as a field and as a research methodology had expanded, along with a large complement of mathematical tools, such as slide rules and later mechanical calculators.80 By World War I, mathematics as a discipline had expanded enormously over the prior century, while uses of mechanical aids to calculation had already started to stimulate broader, more data-intensive research agendas in such fields as mathematics, astronomy, physics, and chemistry. The years following World War I saw the arrival of more sophisticated IT equipment useful to research, such as the deployment of punched-card equipment at Columbia University with the support of IBM, and the use of analog calculators built by Vannevar Bush at MIT.81 Thus, as early as the dawn of the twentieth century, scientific research had become highly mathematized and quantitatively oriented, two trends that continued to intensify during the early decades of the new century.

During the late 1930s and all through the 1940s, computers were "invented," largely funded by the federal government in support of new applications needed
for World War II and the Cold War, with the result that in the late 1940s and early 1950s, there existed a community of scientists and engineers who collaborated in the development and use of computers in support of government research and their own.82 For example, the mathematician John von Neumann worked on government computing projects during World War II (advising on the construction of the ENIAC and EDVAC) and subsequently built his own system.83 Other scientists and engineers did the same at several dozen universities in the same period. By the early 1950s, the computer was already a useful scientific instrument. Von Neumann's IAS Computer, for example, was used in the early to mid-1950s to solve scientific and mathematical problems, as well as to learn about computers as they were evolving. One list of scientific problems worked on using the IAS had nearly fifty items, ranging from blast wave calculations for atomic bombs, to calculations in physics and chemistry, to analysis of weather data and X-ray diffractions.84

But why did computers become such an instant attraction to scientists and engineers? Bert F. Green, Jr., a psychologist by training who spent the 1950s and 1960s as a computer scientist at MIT and at the Carnegie Institute of Technology, pointed out that the computer was extremely useful for doing complex mathematics and statistical calculations, and that it could also manipulate large quantities of data. In discussing the effects of computing on behavioral scientists, he could just as easily have been representing the experiences of physicists, chemists, mathematicians, and engineers: "Scientists are often faced with the necessity of performing many tiresome calculations in order to determine the statistical relationships among the variables they are studying. Often the sheer volume of required calculations is so overwhelming as to preclude doing the analysis by hand. Before computers were available for doing the work, many researchers were forced to settle for half a loaf."85 In short, thorough work could not be done. But, as he noted after the arrival of the digital hand, "A factor analysis that would have taken over a month to do by hand can be done in minutes on a computer."86 By the early 1950s, computers could already compute faster than people using electromechanical calculators by a factor of 10,000, and could manipulate far larger volumes of data—a capability that expanded by orders of magnitude throughout the second half of the century, along with similar increases in the speed of calculation.87

But the attraction of the digital hand was not limited to its ability simply to do things faster. The quality of the work itself could be improved, because researchers could more thoroughly investigate an issue and experiment with multiple paths of research rather than bypass options as too time consuming. One chemist who worked with computers in the 1950s explained the weaknesses of scientific work prior to the arrival of the computer, which included

a neglect or a very inexact estimation of secondary correction factors, failure to quote precise limits of reliability, and unjustified partiality to certain items of data. Often published papers are merely summaries of the author's conclusions illustrated with tables and diagrams that are of little value to the researcher
who may wish to check the calculations and measured quantities or to use the data to develop more sophisticated models or theories. Concise, mathematically exact statements of the models tested and the raw data, including all limits of reliability, would enable the reader to quickly ascertain the usefulness of further investigations, and the unnecessary duplication of tedious experiments would be eliminated.88
Even computers of the early 1950s made his desire far more attainable than before. One astronomical example, plotting the movements of five planets, called for mathematical calculations using known techniques that had not previously produced a highly accurate set of answers. It illustrates specifically the kind of challenge one faced in solving problems manually as opposed to using a computer, which did the work easily: "The mathematical problem is that of solving a set of simultaneous non-linear differential equations of the 30th order with an accuracy of fourteen decimals, enough to insure the desired accuracy at the beginning and end of a 400-year interval, over which the accumulation of rounding error results in the loss of five figures."89 In short, problems requiring extensive calculations and the iterative manipulation of growing bodies of data became practical to solve. (A brief sketch below illustrates how such rounding error accumulates.)

Use by scientists spread in the 1950s and 1960s from institutions and departments that were building their own machines, or systems for the government, to other departments and institutions that either shared systems or acquired their own.90 Federal funding helped speed up deployment in the 1940s and 1950s as well, particularly to solve problems in the physical sciences, to help develop the hydrogen bomb, to conduct real-time simulations, and to resolve myriad problems and issues in chemistry, high-energy physics, engineering, and even in biology and geology. During the period 1945 to about 1975, the physical sciences relied more extensively on the digital hand for research than other disciplines. However, close behind the physical sciences were researchers working in the cognitive sciences and in what eventually became known as artificial intelligence and, sometimes, cybernetics. Each involved the merger of experimentation and analysis of how brains and biology worked in systems, using theoretical and mechanical means to explore such concepts as neural feedback and response. Subsequently, in the 1960s and 1970s, biologists became extensive users of computers, particularly in enhancing numerical taxonomy, which could only be done once computers had the ability to store vast quantities of data. The use of computing spread so widely in this discipline that by the 1980s it led to a new name, bioinformatics, for computational biology. Meanwhile, the field of genetics came into its own. Also beginning in the 1950s, but coming into wide use in the 1970s and beyond, was the deployment of computers to assist in a wide range of studies in medicine, from how diseases worked and could be cured, to the management of clinical trials, and even for the administrative operations of a hospital. One student of the history of computing as a scientific instrument described the situation: "by 1975, the computer was deeply entrenched in the physical, biological, medical, and cognitive sciences."91
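The rounding-error problem quoted in the planetary example above is easy to demonstrate on any modern machine. Below is a minimal sketch in Python, assuming only that NumPy is installed; it is not the original planetary computation, merely a toy analogue showing how a long chain of low-precision additions drifts from the true value, which is why the astronomers needed fourteen decimals of working accuracy.

```python
# Toy illustration of accumulating rounding error over many iterative steps;
# not the 1950s planetary integration, just an analogue of the same effect.
import numpy as np

steps = 1_000_000              # stand-in for the many steps of a long integration
increment = np.float32(0.1)    # 0.1 has no exact binary representation

total = np.float32(0.0)
for _ in range(steps):
    total += increment         # every addition rounds the running sum

exact = 0.1 * steps            # 100,000, computed once in double precision
print(f"accumulated: {float(total):,.2f}")  # drifts visibly from 100,000
print(f"exact value: {exact:,.2f}")
print(f"error:       {float(total) - exact:,.2f}")
```

Single precision here plays the role of the short word lengths of early machines: several significant figures disappear over a million additions, just as the five-planet problem lost five figures over its 400-year interval.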
Thus, in a short period of time, scientists had learned how to harness the great calculating and data manipulation features of computers to solve problems that required intense calculations not possible with pencil, paper, or even calculators; to control instruments; to collect data; and to perform myriad data reduction and analysis exercises. The cost of computing continued to drop while capacity and reliability improved, a trend that extended right into the twenty-first century, making it possible for ever increasing numbers of scientists, engineers, and researchers in other fields to gain access to what in the early years was a very expensive class of technology.92

As that availability increased, researchers were able to find more complex uses of computers, and perhaps the most important involved simulations. Unlike complex calculations performed to solve some mathematical or engineering problem, a simulation creates an alternative reality, or system, and that requires a combination of data and mathematical algorithms that cause the data to interact in predictable or even unpredictable ways. That exercise allows people in all fields to study real and hypothetical situations, an activity often also called modeling, although, of course, the two are not quite the same. It would be difficult to overstate the popularity of such a capability among all researchers and even among all users of computers. The "what if" analysis one does on a PC using a spreadsheet software tool is a relatively simple example of computerized simulation. Most serious research in the physical or social sciences today involves some modeling. Almost all forecasting of weather, economics, and military options is a digital exercise in simulation. If one had to pick the single most profound change brought about by the computer in science, engineering, medicine, war, and economics, it would be the ability to use simulations to learn new things and to inform important decisions. The strengths and weaknesses of the approach were understood in the 1950s and 1960s, and the approach remained useful for decades. One group of experts in the 1960s pointed out that while all models were artificial, and while computers were often inflexible and required much work to model anything, at the same time a computer simulation was "completely repeatable," "ideal for the collection and processing of quantitative data," and "free from the physical limitations on the system being studied."93

The technology itself was not, however, enough to get the scientific community to rely so extensively on computers. Early financial support for computing extended to higher education, along with the recruiting of faculty to develop the technology, provided the essential catalysts that jumpstarted use of computers by academics in the United States. While the role of various agencies of the DoD and the federal government in funding research of all kinds has been discussed elsewhere in this book, it is important to note that for many decades, public officials channeled funds to researchers through several primary conduits. The most important was the National Science Foundation (NSF), which established a process by which researchers applied competitively for grants in support of their work, submitting proposals on topics of broad general interest to the NSF.94 The National Institutes of Health (NIH)95 and, at the Pentagon, DARPA represented other sources of funding for research. What is particularly distinctive
about their model is that funds were generally granted to researchers, making them personally responsible for spending the grants on research that they had defined. This approach stood in sharp contrast to the model of awarding contracts in exchange for research requested by the funding organization. The former approach gave researchers an enormous amount of flexibility to select what topics to pursue and, when combined with a peer review assessment of proposals, gave the process enough competition for funds to encourage quality, relevant research. The approach also led scientists to shift from operating alone or in very small groups—as had been the case before World War II—to taking on larger projects, some of which required hundreds of workers, the creation of substantial laboratories, and multiyear financial commitments, leading to what in time became known as "Big Science," a model relatively distinctive to the United States, one that the British used only partially, although the Soviets used it extensively.96 That feature—the ability to scale up—provided a path to new knowledge that researchers could use to acquire access to substantial amounts of computing power, particularly before 1968, and thus proved crucial in the world of American academic research. After that date, federal funding for R&D began a long slow decline as it failed to keep up with national economic output or inflation. Nonetheless, the sums remained quite high, running annually into hundreds of millions of dollars. It was not uncommon, for example, for large research universities to receive over $100 million each from federal sources in the 1980s and 1990s, and for many other institutions to obtain tens of millions of dollars.97 The relative decline came as two other trends were unfolding: first, the cost of computers kept declining, and second, the private sector and foundations increased their contributions to academic research. Tables 9.4 and 9.5 document the total amount of funding from all sources in higher education, including from the federal government, for a half century. By any measure, it was substantial.

Table 9.4
R&D Expenditures by American Higher Education, 1953–2003 (millions of dollars)

Fiscal Year    Total Expenditures    For Basic Research    For Applied Research
1953                      255                   110                    145
1963                    1,081                   814                    267
1973                    2,884                 2,053                    831
1983                    7,882                 5,303                  2,579
1993                   19,951                13,303                  6,648
2003                   40,077                29,981                 10,097
Source: Table 2, pp. 8–9, National Science Foundation statistics, http://www.nsf.gov/statistics/ nsf05320/pdf/tables.pdf (last accessed 9/1/2006).
Table 9.5
Federally Funded R&D at American Higher Education, 1953–2003 (millions of dollars)

Fiscal Year    Total Expenditures    For Basic Research    For Applied Research
1953                      138                    NA                     NA
1963                      760                    NA                     NA
1973                    1,985                 1,454                    531
1983                    4,989                 3,547                  1,442
1993                   11,957                 8,398                  3,559
2003                   24,734                19,500                  5,234
Source: Table 3, pp. 10–11, National Science Foundation statistics, http://www.nsf.gov/statistics/ nsf05320/pdf/tables.pdf (last accessed 9/1/2006).
The enormous role played by research activities and government funding clearly strengthened scientific inquiry at American universities and colleges, more so than, we could argue, computers did. However, it was already evident by the 1960s that use of computers was having a profound effect on the outcomes of research, results that were already making their way through the economy in a broad mix of ways, ranging from healthcare to agriculture, from weapons to aircraft and automotive performance, to the now fundamentally changing nature of how diseases, biology, and genetics were being studied and findings applied. Two academic commentators on higher education argued that the alliance between higher education and government "has made the United States the world's leading source of fundamental scientific knowledge."98 While much of the groundwork for changing the nature of scientific research was laid in place by the mid-1960s, new discoveries and insights kept appearing as the technology transformed. The first change was cultural—the shift to "Big Science"—facilitated in part by the use of computers, other complex scientific instrumentation, and large projects (such as those in support of space travel and missiles for the military). "Big Science" meant scientists working in laboratories rather than by themselves, with teams becoming the norm on ever larger problems and projects. Teams increasingly used data derived from iterative studies, often collaborating across departments and institutions, with nonacademic organizations, and around the world, constantly applying new techniques.

During the last quarter of the twentieth century, use of computing in academic research spread slowly beyond scientific and engineering disciplines, spilling over into business administration (especially in operations research in marketing and actuarial studies), into sociology (relying on large databases created by such federal agencies as the Census Bureau), and into criminology and demographics. The next wave of adoption was, thus, largely among economists and social scientists. In the case of economics, the field had become so mathematized by the early 1960s that one could argue computers simply reinforced this trend, to such an extent that in the early years of the new century, distinguished members of the discipline began arguing that perhaps economists had gone too
far in relying on mathematical approaches.99 But clearly by the late 1970s or early 1980s, social scientists also had begun using computers for the same reasons as their physical science cohorts. Finally, by the 1990s, researchers in the humanities did too, in such fields as languages, fine arts, and history. And like their scientific cohorts of the 1950s and 1960s, historians, for instance, had to deal with the value of the Internet and how best to use digital data.100 By the end of the century, all academic fields had researchers who relied directly on computers to conduct research. While humanists used computers to write their articles and books, some to teach, and others to send e-mail, the amount of research funds allocated to their work, in general, remained quite low, regardless of computers. In fact, one report published in the early years of the new century put the amount of research dollars invested outside the scientific and engineering disciplines at about 3.3 percent of the total.101 In other words, some 96.7 percent went toward the scientific and engineering communities, even though the federal government and foundations were beginning to fund historical research, for example, requiring use of computers.102 One conclusion we can reach is that the whole notion of team-based "Big Science" approaches to research had yet to take wide hold in much of the social sciences, and even less in the humanities, where historians, for example, overwhelmingly researched alone with only part-time help from student assistants.

Meanwhile, technology continued to evolve. Just as the ability of machines to perform mathematics quickly and to collect and analyze vast quantities of data attracted researchers in the early decades, additional capabilities proved useful later. Most obviously, the increased ability of telecommunications to transmit large volumes of data in the 1980s, and especially in the 1990s, through broadband and enhanced Internet functions was of enormous value, making it practical for researchers from multiple institutions to collaborate, a practice that is now widely evident in the social sciences and the humanities. Second, myriad advances were made in the 1980s and 1990s in what computer experts called visualization, that is, giving researchers the ability to present data in graphical form on CRTs, often in motion-like video, and to print out what otherwise could not be seen, such as molecules and subatomic particles or processes at work. In the 1980s, for example, those studying how tornados functioned created full-color visual models that moved like motion pictures, representing physical processes unseen by the human eye and based on actual data collected from tornados, to illustrate their birth and evolution. Biologists studying genes or chemical compounds could also present their data in pictorial form, not just as the charts and graphs they had been able to produce since the late 1950s. In fact, one could argue that today it is impossible to do the kind of research being conducted on DNA and genetics without visualization. As reliance on ever increasing amounts of computing pushed forward, the requirement to fund large data centers did too, much along the lines of what occurred in the late 1940s and 1950s, when federal agencies had to spend extraordinary amounts of money to support earlier computer installations.
During the second half of the twentieth century, the federal government either ran large data centers to which professors were given access or funded such facilities located at major national laboratories and other agencies and organizations, sometimes physically located on a university campus. In the 1980s, the NSF expanded its funding for supercomputing in order to support large scientific and engineering research projects. To these systems the NSF also added telecommunications, using its NSFNet, for example, to link them together. Some of the projects run at these data centers involved modeling of galaxies, weather, and proteins. During the Clinton administration, supercomputing became a national priority, and thus the NSF obtained funding to maintain these civilian data centers around the country.103

As this chapter was being written, computing was once again in the process of bringing about yet another clear revolution in scientific knowledge, this one involving biology and medicine. Increased computing power, more sophisticated databases, and visualization, among other things, are moving research, through the use of simulation, to new levels of sophistication. While others are describing that phenomenon in detail, to summarize: medical assessments are moving rapidly toward greater use of scanners (such as MRI), while the basic science of disease treatment is shifting from the chemically based approaches of the past to biologically based strategies involving, for instance, genetic treatments and programming. The result in the latter case, for example, is that researchers model thousands, indeed even millions, of potential mixtures of compounds to determine what might be an ideal recipe for a medicine (a toy sketch of such screening appears below), while biologists model and visualize genes and changes to them. The image of people pouring liquids from one test tube to another or simply growing something in a petri dish overnight is now largely fiction. Massive databases and extensive use of programmed algorithms have made modern medical and biological research highly computerized, creating in the process whole new fields of study.104
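To make the enumerate-and-score idea concrete, here is a minimal Python sketch. Everything in it is hypothetical: the components, the affinity numbers, and the scoring function merely stand in for what would, in real research, be a biochemical model run against massive databases.

```python
# Toy sketch of in-silico screening: enumerate candidate mixtures and
# score each one in software instead of mixing them at a lab bench.
# All components, values, and the scoring model are hypothetical.
import itertools

affinity = {"A": 0.9, "B": 0.4, "C": 0.7, "D": 0.2, "E": 0.6}

def predicted_efficacy(mixture):
    # Toy model: total affinity minus a fixed cost per added component.
    return sum(affinity[c] for c in mixture) - 0.25 * len(mixture)

candidates = [m for r in range(1, len(affinity) + 1)
              for m in itertools.combinations(affinity, r)]
best = max(candidates, key=predicted_efficacy)

print(len(candidates), "candidate mixtures scored")  # 31 at this toy scale
print("best:", best, round(predicted_efficacy(best), 2))  # ('A', 'B', 'C', 'E')
```

At research scale the candidate space runs into the millions and the scoring function is a genuine physical or biological model, but the computational pattern (generate, score, rank) is the same.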
Digital Hand in the Library

Libraries in the United States have played important and highly visible roles in American society for some two centuries. In American colleges and universities, they have historically served as the center of academic life, the physical and intellectual heart of a campus. In American society, there have long existed different types of libraries with varying purposes: a thick network of public libraries to serve the needs of local communities; corporate libraries to help researchers, marketing specialists, and other employees do their work; government libraries that did the same or functioned as major research centers for scientists and engineers; and libraries in higher education that served students, faculty, and administrators, often maintaining very large collections of books, periodicals, and archival materials in support of scholarship and teaching, and providing a physical space for students to study, socialize, and, yes, even take the occasional nap. Regardless of what kind of library they worked in, librarians have
been organized nationally as a "community of practice"105 for over a century, largely through the auspices of the American Library Association (ALA). They acted in much the same ways as members of any commercial industry's national association. They collectively established points of view on policy issues (such as free speech); set standards of performance for certification; described how materials should be catalogued, managed, acquired, or disposed of; trained members; and published reports and other works.

Contrary to their stereotypical American image as quiet, shy public servants, librarians have long embraced every new form of information technology that came along. When three-by-five cards, file folders, and typewriters appeared in the last quarter of the nineteenth century, librarians quickly realized their value, becoming in the parlance of economics "early adopters." By the very early 1900s, they were already moving on to new generations of cataloguing systems (Dewey, Library of Congress) that depended on these innovative tools. The very largest libraries used punched-card tabulating equipment between the two world wars to handle many backroom processing activities, such as book ordering and inventory control. As a profession and community, they collaborated in many projects over the first half of the century, for example, to create more efficient catalogs and to standardize the information they collected. By the early 1950s, they had a long history of discussing the sorts of issues regarding information that would consume so much of the time of the computer scientists who were creating programming languages, applications, and, ultimately, database management systems by the late 1960s. The largest academic libraries, such as those at major research universities, often were the first to try using some form of IT in a new way. The Library of Congress played much the same role as did other federal agencies in other sectors of society in setting standards and advocating for federal funding, support, and library applications, much as occurred at the FBI and the U.S. Department of Justice for law enforcement, or at the U.S. Department of Education for K–12 schools.

While below I discuss only the role of computing in colleges and universities, many of the applications described also made their way into other types of libraries, such as public libraries, either later, in smaller forms (such as on minis and PCs), or through shared networks. Much of the pioneering work in the use of IT started at some of the largest 100 or more major American research universities, such as at various University of California campuses, MIT, Harvard, and numerous state universities in the Midwest, but quickly seeped into smaller universities and colleges, either through dedicated library systems or as applications housed on a college's or university's central mainframe computer. Because librarians were well organized as a professional community, innovations at one library became widely known to colleagues through conferences, publications, and training programs for future librarians, often taught at state universities. These practices had long been in place. The ALA was founded in 1876, while many state-based schools of library "science" had been in existence since the dawn of the twentieth century. For example, the school at the University of Wisconsin in Madison was
established in 1906, and a century later this professional school was graduating annually scores of librarians trained as much in the use of the Internet as in managing libraries stocked with books and archives, and to standards of performance set by the ALA.106

At many large libraries, precursors of the computer—namely, punched-card equipment—were familiar objects by the 1950s. In fact, their use had been codified to such an extent that one could think of them as widely deployed best practices. As early as 1952, the ALA had published its first guide to such uses.107 Of particular use to libraries was the clerical work these machines could do in support of ordering and acquiring materials; in scheduling and tracking binding; in cataloguing; and in monitoring the movement of publications in and out of libraries, more commonly known as circulation. The latter is of particular interest because, unlike many libraries around the world, American libraries had for many decades loaned materials to readers who could take them out of the library for specified periods of time to use at home or work. Libraries used IT to manage that substantial process since it was a core function of most libraries. For such an application, one could find many examples in use, not just in higher education libraries but also in large public libraries.108 Of all the early punched-card applications, this one probably did more than any other to expose librarians of the 1940s and 1950s to notions of how data could be collected, sorted, and reported on using mechanical means. These were crucial concepts because, without appreciating the power of such early forms of IT, it would have been more difficult for library administrators to understand the potential of computers when these first became accessible to them in the late 1950s.

Before describing the role of computers, we should keep in mind the operating environment of libraries in higher education. The period from the 1960s right to the end of the twentieth century saw the largest increase in the number of students, faculty, colleges, universities, research, and publications in American history. The number of students, for example, more than doubled. To be sure, budgets for all manner of services in higher education expanded, including those of libraries. The nation also went through periods of both economic expansion and recession. As with other parts of the public sector, librarians experienced budgetary expansion in the 1960s followed by contractions in the 1970s and 1980s, renewed expansion in the 1990s, then another period of budgetary constraints. As with all other public institutions, library budgets were heavily committed to salaries and preservation of existing infrastructure and stock of materials, leaving little, and thus highly volatile, discretionary money, essentially the funds for acquiring new materials, such as books, periodicals and journals, and later digital media, such as CDs. Obtaining sufficient funding for the development of IT systems, the acquisition of PCs and other computing, and later subscriptions to databases therefore proved challenging, much as police departments and K–12 schools experienced. In addition to these operational realities, during the last three decades of the century librarians witnessed the revolution currently under way
in the development and use of a vast array of digitally based media and information: PCs, networks, CDs, DVDs, databases (information providers too), MP3s, video, and so forth. In fact, by the end of the century, some observers (including librarians) were pointing out that more information was being recorded in digital forms than on paper. The fact that libraries entered the last three decades of the century operating largely in a paper-based world, yet had to add (not substitute) digital media, presented substantial operational, managerial, and technical challenges that went far beyond budgetary concerns.109 In short, academic libraries represented a microcosm of many of the challenges faced by modern American society as they dealt with increasing numbers of digital tools and data at a time when many of their activities and attitudes were still rooted in a predigital age.

Throughout the second half of the century, library administrators were animated by several needs. To be sure, reducing operating costs through automation and mechanization always remained an objective, but in reality they were less driven by that requirement than was evident anywhere else in the public sector. In fact, librarians paid less attention to cost avoidance and the justification of IT than any other community discussed in this book, with the possible exception of the military in the development of weapons. Rather, library administrators wanted to reduce the amount of clerical work required to order, catalog, and manage their inventories of books and other library materials so that they could handle growing volumes of routine work and improve services. By the late 1960s, for example, global output of printed works was growing at between 8 and 10 percent a year, which translated into roughly 450,000 books, some 200,000 periodicals, and a similar number of technical reports just in that decade.110 Budgets for university libraries were, therefore, growing fast—10 percent per year in the late 1960s was not unusual, a rate that could not be sustained. Often, the process of acquiring materials and getting them catalogued and on shelves could consume as much as a third of a library's budget. While computers had not developed sufficiently by the early 1960s to make acquisition applications as cost effective as librarians wanted, librarians were experimenting with the technology so as to keep up with growing volumes of materials. The benefits of online catalogs were already understood intellectually, but as William N. Locke, director of MIT's libraries at the time, pointed out, "We cannot afford on-line catalogs. And anybody who talks about storing any number of books, even off-line, is off his head."111 Early uses of computers were thus constrained by the costs and capabilities of the machines, which goes far to explain why the great take-off in their everyday use began in the late 1960s, after these problems began to wane.112

Walk into an academic library in 2007 and you will see a building with terminals scattered about on most floors, which patrons use to look at online catalogs. In areas that used to have rows of large reading tables, one now sees cubicles or tables with PCs, devices that are used to find materials located inside the library and around the world. People download full texts to their PCs, which they can print out or store for future use. Libraries are interconnected via the Internet, as are national catalogs that tell a patron where copies of books and journals are located.
At the University of Wisconsin, for example, patrons can
access some 650 online databases from private firms, publishers, commercial and nonprofit information providers, and government agencies. Its collection of electronic media is massive, and librarians will tell you that today students go first to the Internet for information before reaching out to librarians. This university is typical of many academic settings: it has multiple libraries scattered across its campus and millions of books and periodicals, even though floor space is increasingly being devoted to terminals.113 Students, faculty, and really anyone can access the catalog of this and most colleges and universities from the comfort of their own dorm, home, or office, often instantly downloading increasing amounts of material to wherever they are physically working.

Now contrast this scene with the circumstances of the late 1950s. We have the recollections of a director of the New York Public Library, who had worked at or run libraries at the University of Pennsylvania and Harvard, to call up an earlier reality:

A library's stock in trade consisted of books, journals, newspapers, and manuscript materials, and the only means of access was the library's card catalog. The only machines in use were typewriters, Photostat machines, and some microcards and microfilm containing early printed books, newspapers, and doctoral dissertations. Those who wanted access to library materials had to come to the library and either use them in the building or borrow them for home use. If a library did not have what the patron wanted, he either had to send for it on interlibrary loan, which took up to three months, or he had to find out where it was and go to that library. Copying was done by the user in longhand or on a typewriter.114
Obviously, much had changed over the past half century.
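To suggest what the online catalog lookups described above amount to computationally, here is a minimal sketch in Python. The records, titles, and shelf locations are invented for illustration, and a production catalog adds indexing, ranking, circulation status, and networked union-catalog lookups, none of which is shown.

    # Minimal sketch of an online catalog (OPAC) keyword search.
    # All records and locations are invented for illustration.
    catalog = [
        {"title": "Introduction to Library Science", "year": 1971, "location": "Main, 2nd floor"},
        {"title": "Computers in Libraries", "year": 1983, "location": "Main, 3rd floor"},
        {"title": "The Card Catalog and Its Uses", "year": 1958, "location": "Annex (storage)"},
    ]

    def search(keyword):
        """Return every record whose title contains the keyword, case-insensitively."""
        k = keyword.lower()
        return [r for r in catalog if k in r["title"].lower()]

    for rec in search("librar"):
        print(f'{rec["year"]}  {rec["title"]}  ->  {rec["location"]}')

Swap the in-memory list for a database and attach terminals, and one has, in rough outline, the kind of system patrons began using in the 1970s and 1980s.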
Figure 9.2 Typical reading room at a modern U.S. university, in this case the University of Wisconsin, 2006. (Courtesy of the author)
In the 1960s, almost every library at a research university experimented with various forms of mechanization and, later, automation of its work. These projects concentrated on lowering the amount of labor required to acquire materials (since it was often difficult to find enough qualified staff), computerizing the acquisition and management of serials (subscriptions to magazines and journals), creating punched-card and later online catalogs that displayed the same information available in paper-based card catalogs, improving data retrieval techniques, and supporting rapid, less labor-intensive circulation processes.115 These initiatives can largely be characterized as experiments to learn how best to use computers, with digitally supported operations only coming on-stream toward the end of the decade and during the early years of the 1970s. One survey conducted in 1967 reported that 638 libraries out of a total of 24,000 actually used data processing equipment, although another 1,130 reported they would within a couple of years. In short, computers were in limited use in the 1960s in any kind of library, although where used they tended to be in large universities, big public libraries (such as New York City's), and among government librarians (such as at the Library of Congress, national laboratories, and federal departments).116 In that same period, librarians used data processing mostly for circulation (165 libraries), managing serials (209), and creating and maintaining accession lists (170). To be sure, accounting and budgeting were widely used applications (reported by 235 libraries). Acquisitions (102), cataloguing functions (135), and preparation of union lists (133) were also of interest.117

Yet as late as 1970, a librarian at the Library of Congress, home to many early computing projects, still reported slow progress: "Nowadays one can hardly throw a stone at any gathering of librarians without hitting someone who is planning, programming, or operating some kind of computer-based system. To be quite honest, the current odds are probably 100 to 1 that the stone will strike a planner rather than an operator; but, nonetheless, progress has been made."118 The same librarian reminded her readers that librarians had a long heritage of networking socially and professionally, with the result that digital projects involved "cooperation and standardization," so that "the field advances as a whole—that is, major changes are made only when a significant number of librarians are ready to accept them and to deal with them on the basis of the network of libraries rather than in terms of the individual library alone."119 This goes far toward explaining why, from the 1970s forward, groups of libraries created shared networks of publications, cataloguing systems, and other digital applications, nowhere more in the early years than in cataloguing operations, such as the development of standard catalog information, most notably the Library of Congress's Machine-Readable Cataloging (MARC) format.120 In short, collaboration characterized this community far more than some simple desire to leverage economies of scale to dampen expenditures. This standard alone made it possible for libraries and publishers to print standard sets of catalogs for publications, which could be sent to libraries without librarians having to recreate these labor-intensive cards.121
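For readers unfamiliar with MARC, the sketch below, in Python, suggests what such a machine-readable record contains. The field tags follow real MARC conventions (020 for the ISBN, 100 for the main author entry, 245 for the title statement, 260 for the imprint), but the record itself is invented, and the format's leader, directory, and indicators are omitted for brevity.

    # Simplified illustration of a MARC-style bibliographic record.
    # Tags 020/100/245/260 follow real MARC usage; the record is invented,
    # and most of the format's actual structure is omitted.
    record = {
        "020": {"a": "0000000000"},                       # ISBN (invented)
        "100": {"a": "Smith, Jane."},                     # main entry, personal name
        "245": {"a": "A history of libraries /", "c": "by Jane Smith."},
        "260": {"a": "Madison :", "b": "Example Press,", "c": "1976."},
    }

    def format_card(rec):
        """Render the record roughly the way a printed catalog card would read."""
        author = rec["100"]["a"]
        title = " ".join(rec["245"][s] for s in sorted(rec["245"]))
        imprint = " ".join(rec["260"][s] for s in ("a", "b", "c"))
        return f"{author}\n  {title}\n  {imprint}"

    print(format_card(record))

Because every participating library encoded the same fields the same way, records like this one could be exchanged, printed as cards, or loaded into shared networks without rekeying, which is precisely the economy the standard delivered.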
Bibliographic systems appeared as well, most notably the Online Computer Library Center (OCLC). Established in 1967, this nonprofit center provided a group of libraries in Ohio with various computerized services. It became the earliest and most widely used bibliographic service, beginning in 1970 with the publication of bibliographic cards for its members. Libraries contributed citations, and over the years the OCLC expanded its cataloguing services, while the number of libraries of all types from across the United States joining it increased as well.122

During the 1970s, experimental, independent uses of computing gave way to more widely deployed, networked systems involving bibliographic guides, cataloguing, management of serials, interlibrary loan functions, and so forth. These were housed on campus mainframe computers that librarians shared, usually with administrative departments. During the 1970s, libraries in higher education began implementing online versions of their earlier (or first) computer-based systems, such as card catalogs. By the end of the decade, some of these systems were accessible by users of a library's services, with the card catalog again the most widely known. Online retrieval and updating of a library's inventory of publications appeared at most university libraries and at many four-year colleges. MARC formats spread across the American library community, while networked communications for interlibrary loans came into their own by the end of the decade. Circulation control, which involved scanning a book as it was checked out and returned while keying data into a terminal, also proved a popular form of mechanization, one that spread to public libraries as well. As online systems expanded, so, too, did direct access to bibliographic files created originally by the U.S. Library of Congress, various commercial bibliographic services, and librarians managing special collections. New methods and software for conducting searches also made it possible for a library's patrons to start doing their own searches with minimal, or no, help from librarians. The speed of adoption of digital tools continued to be governed by the declining costs of hardware; the availability of software tools; how well librarians understood digital issues and could articulate to programmers what they wanted; and such governance issues as who allocated budgets or managed computer facilities.123 Despite substantive deployment of IT in the 1970s, at the end of the decade two librarians studying how extensively libraries used computers still concluded that "libraries are far more labor intensive than machine intensive. There also seems to be considerable fear among librarians about the increasing use of computers."124

In the 1970s, librarians embraced networks that expanded their use of many systems in the 1980s. Beginning with OCLC, which went online in 1971, by the early 1980s there were also other networks in wide use: the Research Libraries Information Network (RLIN), the Washington Library Network (WLN), the University of Toronto Library Automation System (UTLAS), and regional networks around the country, to mention a few key systems. Of particular interest to higher education were RLIN and the regional networks, but all networks were fed with a continuous flow of data from OCLC, accessed through terminals.
The combination of networks and growing electronic files brought higher education libraries into the world of databases, making this new form of digitized files an important aspect of a library's inventory of information by the end of the 1980s. Online access proved a highly attractive, convenient tool for librarians and patrons, and shared networks meant shared expenses, hence more affordable systems. In fact, by the end of the decade, there was hardly an academic library that did not use some form of networking or that failed to provide access to online databases.125

Libraries embraced PCs in the 1980s in much the same way as the rest of higher education. Librarians used these devices to do word processing and to manage budgets and projects with spreadsheets.126 During the 1980s and 1990s, a great deal of information became available on CDs. Libraries were quick to find ways to add these to their collections as a way to control the costs of paper files while making increasing amounts of material available to patrons. As with the rest of higher education, by the end of the decade librarians were networking their PCs, using them as terminals and as tools for downloading large files to work with offline. One librarian called this practice "a return to local systems," by which one could either access national and regional networks or work "offline."127 But PCs were subsumed into the larger trend of using shared networks. As one library administrator recalled: "The era of localized library automation has effectively come to an end. Experience has shown that it is not economically feasible for any but the very largest libraries to afford the heavy costs of developing, maintaining, and operating complex localized computer-based systems. Many libraries are quietly abandoning this approach in favor of joining networks such as OCLC."128 In effect, libraries were reducing their dependency on in-house IT staffs and computers, with the exception of PCs, their printers, and some on-staff PC wizardry, of course.129

Before discussing the role of the Internet, the next major technological introduction into libraries, it is important to understand the role of digital libraries, because they are so closely linked to use of the Internet. The term "digital libraries" was born in the 1990s. However, its roots date back to the early information retrieval systems of the 1960s, which were later enhanced by the development of hypertext systems in the 1980s and, from the 1970s to the present, by various telecommunications systems. Also known as "electronic libraries," these became substantial sources of digitized information, largely in the 1980s and 1990s, by which time the cost of computing, storage, and networking had been dropping at annual rates approaching 20 percent, while hardware and software tools had been spreading across the entire academic community. As academics, online journals, commercial information providers and publishers, and others created online content, the notion of the digital library gained wide currency. The emergence of the World Wide Web (WWW) in the early 1990s, coupled with a massive expansion of Internet access across all of education in the second half of the decade, made use of rapidly growing digital libraries possible and, indeed, a reality. Users and librarians quickly embraced a widely shared vision of such databases as ubiquitous and openly accessible, with shared information available
anywhere, at any time. Librarians viewed digital libraries as "libraries without walls,"130 as extensions of what they had long done, which was to acquire, organize, and make available information using whatever information technologies were around. In short, digital libraries were the next evolutionary step in what they did for a living. Experiments at Carnegie Mellon University in the late 1980s and early 1990s taught many librarians what would be needed in an academic setting, while engineers and scientists had earlier quietly built smaller digital libraries using the Internet in the 1970s and 1980s to support work they were doing for the U.S. Department of Defense. Thus, by the time the Internet had
Figure 9.3 Libraries used online systems to catalog books; here we see a librarian at the University of Wisconsin, 1983. (Courtesy University of Wisconsin Archives)
become widely deployed, librarians had over two decades of experience with such digital systems.131 During the 1990s, internal online systems in libraries were frequently ported over to intranet and Internet sites, making them accessible both to the academic community and, indeed, to the public at large around the world. It seemed that every library in higher education was "on the Web" by the end of the century. Librarians and professors started to complain that students never used physical libraries, instead turning to the Web for whatever they wanted in the way of information. The costs of creating and making collections available on the Internet kept rising all through the decade, becoming major line items in an academic library's budget. Librarians wondered what role they would play in organizing and making digitally available ever growing bodies of information. The University of California system announced at the end of the century that it would create a tenth, entirely digital library as a way of organizing support and librarianship for this form of information.132 Web-based catalog systems, laptop connectivity, and even online classes had to be interconnected. Often college and university administrators, professors, and students now looked to their librarians for leadership and support, raising questions in the early years of the new century about what the future role of librarians should be.133

One survey conducted in 2002 illustrated the criticality of the situation for higher education. It reported that 60 percent of American faculty were comfortable using online research tools and expected to use them to a greater extent in the future. An even higher percentage (70 percent) valued their school's online catalogs, yet nearly half still needed to use paper-based collections, putting librarians in the situation of having to collect, organize, and make available paper materials (such as books, archives, and journals) while also providing a massively growing body of digital information. Students were even more inclined than faculty to use electronic collections.134

As this chapter was being written in 2006–2007, a new technological issue drew academic librarians into another round of discussion, and this time some controversy, regarding the role of the digital. It involved an initiative by the Internet search company Google, Inc., which wanted to make available over the Internet vast quantities of information about and from copyrighted books, and indeed, whole copies. In December 2004, Google announced that it would scan the entire collection of publications held by Stanford University and the University of Michigan and all publications published before 1900 at Oxford University. It would launch pilot projects to scan collections at Harvard University and at the New York Public Library. Later, the University of California joined the project. Potentially tens of millions of volumes were involved. Google wanted to make available what amounted to a massive online catalog, snippets of materials (if copyrighted), and whole books if not protected by copyright. Publishers were given the opportunity to request that their publications not be scanned; otherwise, they would be added to the digital collection. Librarians from the participating libraries heralded this project as the start of a new age; publishers and authors went to court to block the initiative in 2005, repeating
arguments similar to those put forth by the Recorded Music Industry regarding Napster's activities of a few years earlier. The controversy largely involves what can or cannot be copied from works that are in copyright but that copyright holders have not submitted to Google to scan. To put things in some perspective, about 15 percent of all the books Google intends to scan are already in the public domain; about 65 percent are copyrighted but out of print; and some 20 percent are in copyright and available for purchase from their publishers.135 As of when this chapter was written, the legal issues had yet to be resolved, but meanwhile Google scanned books and tested pilot Web sites in support of its initiative.

What makes this project so critical to the story of libraries is that it represents yet another step in a process under way for several decades: information (publications too) moving from printed to digital forms. There seems to be little doubt that this historic trend will continue during the twenty-first century. Librarians are widely convinced of the continuation of this process. For years, library students have been given extensive training in the creation and use of the various digital tools now widely used by libraries, faculty, students, and the public at large.136 In fact, librarians seem further ahead on this issue of electronic publications than publishers do.137
Personal Use of Computing

So far we have looked at the use of digital tools largely from institutional perspectives, such as what administrators, researchers, and librarians did with the technology. By turning our focus toward key communities within higher education, can we learn more about the changing nature of work in higher education wrought by the computer? Put another way, what can we learn by observing the use of computers by faculty and students? It is a useful view because once they had wide access to the technology, how they did their work changed, most notably first when they acquired PCs and later when they gained access to the Internet.

Prior to the availability of PCs on campus, wide use of computing by both communities remained quite circumscribed. Only the earliest users had extensive access to IT: engineering and science faculty and their students, who were the earliest to use mainframes in their research and who, as we saw earlier in this chapter, hardly used them to conduct teaching in any formal way. Learning occurred "on the job," so to speak. Students studying computer science, of course, had access to programming languages, database tools, and computers as far back as the mid-1940s, but they remained a small population.138 A few students who held part-time jobs within administration were exposed to mainframes and minicomputers in the late 1960s, although they encountered the latter more in engineering programs in the 1970s or via access to ARPANET (the precursor to the Internet).

Then the microcomputer became available during the second half of the 1970s. While it would be difficult to overstate the significance of this technology for how people worked in higher education, the reality is that adoption came
slowly in the 1970s and early 1980s, held back, as in much of society, by such factors as the cost of computers (they could cost as much as $5,000 in the early 1980s), the inadequate availability of software tools (such as application programs), and technical complexity at a time when students and faculty knew little or nothing about the technology. The image of the tech-savvy Net Generation is a product of a much later time, indeed, a characterization that did not begin to form until well over a decade after the first PCs began arriving on campuses in the United States. In fact, observers of the academic scene did not start looking at the role of personal computers on campuses to any serious degree until the second half of the 1980s, although vendors such as Apple and IBM had been actively selling institution-wide contracts for these machines since the early 1980s. Early users understood the potential of such personal computers to help solve problems in the hard sciences and mathematics, for example, building on earlier experiences with terminals attached to mainframes, dating as far back as the early to mid-1960s. So, from nearly the beginning, professors and students used these machines to solve problems, perform calculations, prepare reports, manipulate ever larger digitized data files, and later present results in some graphical format. In short, both communities used the technology in support of their main work (research, communicating, and learning) and not as a substitute for any fundamental way of doing work. One observer writing in the mid-1980s noted that "personal computers are simply making possible the use by many people of standard programs that up to now have been available only to a few."139 These included word processing, database management tools, statistical analysis software, spreadsheets, and graphical display programs.
Figure 9.4 By the 1990s it was not uncommon for academic libraries to have large spaces devoted to students using computers; here we see students at the University of Wisconsin, 1994. (Courtesy University of Wisconsin Archives)
Increasingly as well, such standard tools were customized to address specific problems and intents. One of the largest early surveys of personal computer use in higher education, conducted by a consulting firm in 1986, involved over 200 public and private universities and colleges in the United States. Some of its findings remind us that deployment in higher education tracked more along the lines evident in the private sector than in government or K–12. Less than 20 percent of colleges and universities required or encouraged students to have a PC and, in fact, only 13 percent of students owned one. Where an institution made such machines available, only one in seventeen faculty or students had access to them. When either cohort had access to machines, they used them on average for only three hours per week. However, administrators already understood the latent demand for such devices, as did the vendors selling them, with the result that over 60 percent either had negotiated or were about to negotiate site licenses for hardware and software to keep costs as low as possible. So what software products did students, faculty, and administrators acquire and use? The same survey ranked use from most to least as follows: word processing, spreadsheets, database managers, communications, statistical packages, and graphics tools, paralleling in order of use similar application software used earlier on mainframes and minis. Less than a third of all campuses offered any courseware.140

However, circumstances began to change incrementally between the mid-1980s and the early 1990s, that is to say, before the massive expanded use of the Internet. Relying on survey data from the period, we see that adoption took place slowly. By 1993, 22 percent of students owned a PC, up from about 16 percent in 1990 and 13 percent in 1986. Faculty actually owned more PCs, probably because they could better afford them, received them from their universities and colleges, or funded them out of research and teaching grants.141 The ratio of students to computers had improved by a third (11.2 students per machine) in 1993, with many schools now reporting that they had established computer labs that students could use at will.142 Again faculty outpaced students, with some 45 percent either owning or having access to such technology, actually using it more than the American population at large.143 Libraries had become major centers for student computing, with nearly 87 percent reporting that they had PCs available for accessing catalogs and databases, doing word processing, and reading, viewing, or listening to CDs and DVDs.144 Because campuses were already extensive users of networks, students and faculty became some of the earliest and most extensive users of e-mail in the United States. Over 28 percent of undergraduate students reported using e-mail, as did 31 percent of graduate students, 39.5 percent of faculty, and 49 percent of administrators.145 So even before the spread of the Internet in 1994–1995 and beyond, higher education had already become one of the most enthusiastic users of e-mail in the world.

In addition to the applications already cited, students and faculty added others during the second half of the decade. Students, of course, used laptops, PCs, and campus mainframes to download music off the Internet to such an extent that by 2000 IT management complained often that the students were
crashing their systems by using massive amounts of computing capacity, typically beginning at about 3 p.m., when classes ended for the day on many campuses. Faculty began to use course management tools (discussed earlier) to assign tasks, receive papers, allow online testing, and post bibliographies, course assignments, and schedules.146 Term paper mills appeared in the late 1990s, although there are no reliable statistics on how extensively students used them.147 By the late 1990s, we know that the vast majority of students either had their own PC or laptop or had convenient and frequent access to PCs on campus.
Figure 9.5 Online systems were in wide use in higher education by the 1980s. A researcher uses a system at the University of Wisconsin, 1986. (Courtesy University of Wisconsin Archives)
One study suggested that over 80 percent of students owned a PC of some sort.148 By then the Internet was a fixture at all institutions, and so it should be no surprise that half the students played games online or downloaded them onto their machines, that 95 percent now used e-mail, and that 88 percent copied music from sites on the Internet. Some 96 percent also reported using PCs to do research or access materials at their college or university library.149 Surveys done later reported similar findings.150 Laptops appeared on campus only in small quantities in the early 1990s because they were often two to three times as expensive as PCs; however, by the end of the decade these could be purchased for less than $1,000, and so they spread. When linked to the Internet through wireless connections in the early 2000s, laptops became for many students and faculty the tool of choice for taking notes in class, in support of lectures and demonstrations, and for studying and working outdoors, in cafes, and at other locations on and off campus. Students used these devices along with cell phones, which had become ubiquitous by the early 2000s, and instant messaging joined e-mail as a favored way to communicate.151 In short, the inflection point in the use of computing by students and faculty occurred sometime in the second half of the 1980s, and use simply surged all through the 1990s and beyond. To keep our perspective grounded, we should note what an observer from one campus recorded in the mid-1980s. After reporting that over half his campus used PCs extensively, he observed that "the new PC technology has not made a big difference in the way faculty or students think about or use computers. General attitudes about computing did not change"; the machines were only tools.152
Special Role of the Internet

The Internet has taken on a nearly iconic image in characterizations of the campus of the twenty-first century, displacing the PC, which had become such a visible artifact in higher education by the start of the 1990s. Any commentator on modern American society ignores the Internet at his or her own peril, as if not realizing that society is now in the Information Age. If one had to pick a candidate for ground zero for the Internet, it should be the Higher Education Industry, not simply because the Internet was deployed early and extensively on so many campuses, but because institutions in this industry came to rely on it for so many of their functions, displacing many other networks long in use, most dating back to the 1960s. Put another way, universities first and most extensively, and later four- and two-year colleges, became heavy users of the Internet.

All through the 1970s, research universities tended to be the largest users of telecommunications in higher education, to be sure, but during the 1980s, use of communications networks spread across most universities and colleges. By the early 1990s, survey data suggested that as many as eight out of ten faculty members in the United States had access to various network services, and that over half used them. Nearly half of American students were in a similar situation,
also largely accessing the Internet or BITNET.153 The most widely used application among students, faculty, and administrators was e-mail. Roughly a third of participating institutions also offered some form of distance learning, but it was limited; roughly 7 percent of all professors and instructors had ever taught this way, while distance learning via cable TV was more prevalent at the time.154 Faculty used their networks more widely to access data files than to teach. Table 9.6 summarizes types of uses of the Internet and BITNET in the early 1990s. For faculty and students working off campus, dial-up access was already widely available, with public research universities the most accommodating (93 percent), even small private colleges readily providing this service (80 percent), and public two-year institutions the least likely to do so (46 percent). In short, over 70 percent of all institutions of higher education provided some form of Internet or telecommunications access to their communities.155

With the wider deployment and use of the Internet across American society that followed the arrival of Web browsers, use on campus continued to grow, but in a community that was already the largest consumer of communications in the nation. By late 1998, some 97 percent of all faculty and staff had access to the Internet. E-mail use continued to expand, with nearly 70 percent of faculty using it, largely to communicate with each other and with students for myriad reasons: scholarly communications, interactions with students related to classes, and so forth.156 In the late 1990s and early 2000s, institutions continually upgraded their networks to provide security features, then broadband, and subsequently wireless access. With each innovation, use and the variety of applications increased. Thus, we can read about MIT posting class content and even publications on its network, while the case for every innovation was couched in terms arguing for faster and more services.157
Table 9.6 Percent of Institutions of Higher Education Providing Specific Networked Services, circa 1994

Service                                            Percent
E-mail                                                90
Access to file servers                                88
Access to campus computer                             81
Access to Internet and/or BITNET                      79
Gateway access from network to other locations        55
Bulletin boards                                       48
Electronic conferencing                               34
Interactive student services                          32
Access to UseNet and/or FidoNet                       30
Source: Based on data in Susan H. Russell, Manie H. Collier, and Mary P. Hancock, 1994 Study of Communications Technology in Higher Education (Menlo Park, Calif.: SRI International, April 1995): 77.
Innovations came as fast as campuses were able to fund and install them. By mid-2002, for example, roughly 7 percent of all campuses reported having some form of wireless communications; they expanded deployment steadily in subsequent years to include offices, dorms, classrooms, libraries, and finally open spaces.158

The first major study of student use of the Internet was conducted in 2002. Researchers reminded readers of the obvious: that college students in 2002 had grown up with PCs always around them and that access to the Internet had been available to them for half their lives. Some 20 percent had used computers since they were between the ages of five and eight. When on campus, 86 percent reported having used the Internet. In short, they were early users of the Internet when compared to faculty and the public at large. E-mail remained their favorite application, but they also downloaded music files (60 percent versus 28 percent of the population at large), and 78 percent had browsed the Internet for fun, versus 64 percent of all Internet users in American society. They reported that the Internet helped them with their education and in communicating with their professors, while enhancing their social lives. In some instances, they were able to create jobs and careers; recall that the founders of Yahoo! and Napster were college students. Finally, we should note that about three-quarters of all students used the Internet four or more hours each week.159 The study's researchers suggested that a generation gap existed between students and faculty, since less than half of the latter required use of the Internet, and when faculty did use it, they did so for fewer applications than students, mainly e-mail.160

Given our earlier discussion about distance learning, what role did the Internet play? The television-based delivery of correspondence courses so familiar in the 1980s shifted increasingly to the Internet during the second half of the 1990s. As access to this form of telecommunications increased, along with software for teaching, the number of online degree programs increased also, expanding further at the end of the century. One report on distance learning suggested that there were over 800 institutions providing such services in 1998, while another publication documented 1,500 programs in 1999–2000.161

Why did colleges and universities continue to invest in the Internet—and for that matter, in all kinds of IT—despite chronic complaints of insufficient funding for such projects?162 Kenneth C. Green and his colleagues, who have tracked IT issues in higher education for many years, argued that improving productivity was not the real goal. Rather, as competition to attract students and faculty increased in the 1980s and 1990s, having a relatively modern, rich IT infrastructure was seen by many institutions as critical in attracting talent. As early as 1995, Green was already pointing out that students were arriving on campus comfortable in using computers and networks and expecting their schools to make such tools available. Specifically, he argued that "network and online information resources now drive much of campus computing"—hence what administrators had to invest in to be competitive.163 He also argued that evidence had been growing in support of using the Internet and computing in general to assist teaching. Finally, institutions wanted to make sure their graduates were prepared to function effectively in a work environment
already heavily computerized. He concluded "that content, curriculum, and communications—rather than productivity—are the appropriate focus of—and rationale for—campus investments in information technology."164 Surveys done by others a decade later repeated the same findings: "Providing a high-quality campus network—reliable, secure, adaptable, scalable, and fault tolerant—has become fundamental in higher education."165 By 2005–2007, upgrades focused on expanding portals and on implementing wireless communications.166
Patterns of Deployment

Given the enormous variety in the use and deployment of computing and telecommunications on the one hand, and the diversity of institutions on the other, what were the patterns of adoption of technology in higher education? Higher education is an industry that has been extensively surveyed over the decades about the number of computers it had, what these were used for, their costs, and the operational and managerial issues faced by users and management alike, so we can begin to answer this question.

Early users (1950s–early 1970s) were normally large universities using mainframes. Some of the earliest applications of mainframes in the 1950s included statistics (University of Arkansas, MIT, University of Dayton), research computing (Purdue University, University of California at Berkeley, California Institute of Technology), scientific and engineering uses (University of Denver, University of Utah), and support of American government projects (Georgia Institute of Technology, New Mexico State University, Harvard University).167 A great number of these were funded by various national government agencies. However, between roughly 1958 and 1962, the number of computers installed in colleges and universities grew rapidly, to some 275 systems. Installations were highly eclectic from one campus to another and even within individual institutions. By 1962, however, it was clear that all the largest universities in the country now had mainframes, and MIT led the pack with ten systems, ranging from an IBM 704 and several of the smaller 650s to others from Bendix and other vendors. An inventory prepared in 1962 listed almost every type of first-generation computer installed somewhere in higher education.168 One report, reflecting on patterns of use in the 1950s and 1960s, explained the rapid diffusion of so many systems in so few years less as a result of opportunistic funding by some government agency or another and more as a result of the need to satisfy some operational issue on campus, "each computer assuming a single role such as teaching, research, or university administrative data processing," in disregard of the potential inefficiency in cost and productivity of not acquiring systems for campuswide use.169 But there is no doubt that the federal government financed many application-driven projects. In 1966, for example, it provided over $7 million for seventy-two digital projects, and by the end of 1971, funding had cumulatively reached nearly $57 million for 220 projects.170 The population of systems became quite high: some 1,387 installed or on order as of June 1967, at 897 data centers.
Popular suppliers included Burroughs, CDC, DEC, GE, and IBM. The largest institutions were routinely the earliest or most extensive users of such technology.171 Extant data from the late 1960s suggested that about 30 percent of all expenditures for computing went for instructional uses, another 30 percent for administrative purposes, and the remaining 40 percent for research. The first figure seems high, given what we know about early uses of computing in teaching, and so that statistic should be treated with some circumspection. To summarize the population: between 1962 and the end of 1969, the number of systems expanded from some 200 to over 1,250, while expenditures for computing went from roughly $49 million to some $352 million.172

During the early 1970s, a third generation of computing signaled both another round of upgrades or replacements of older systems and the introduction of less expensive systems going into ever smaller institutions, including many small four- and two-year colleges. Meanwhile, demand for computing had grown beyond the capability of administrations to provide adequate supplies. Over half of American colleges and universities did not have even one system as of about 1969–1970. Slowly, however, funding became available from within school budgets, grants, and federal and state resources to address needs in the early 1970s, at the same time that computers became less expensive.173 By 1977–1978, nearly 70 percent of all institutions had gained access to computing, either by leasing, purchasing, or contracting for timesharing services, with almost all large institutions equipped (well over 90 percent).174 Table 9.7 provides some numerical data to suggest the surge in adoptions that occurred between the mid-1960s and the late 1970s, when so many schools first began using computers. What the table demonstrates is that installation of systems essentially "took off" in the 1970s, even as the number of institutions also increased, leaving more than half with access to digital tools. The story of the 1980s is one of the remaining institutions acquiring systems and of previous users replacing older ones with newer generations, often with multiple copies. Demand and use were driven by the applications discussed earlier in this chapter.
Table 9.7 Estimated Number of Higher Education Institutions with Access to Computers, Select Years, 1964–1977

Year    With Access    Without Access    Total
1964        707            1512          2219
1967        980            1497          2477
1970       1681            1126          2807
1977       2163             973          3136
Source: John W. Hamblen and Thomas B. Baird, Fourth Inventory Computers in Higher Education, 1976–77 (Princeton, N.J.: EDUCOM, 1979): II-05.
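As a rough check on the 1960s expansion just cited (some 200 systems and roughly $49 million in 1962, growing to over 1,250 systems and some $352 million by the end of 1969), the implied compound annual growth rates can be computed directly. The short Python sketch below is our own arithmetic, offered only to convey the pace; the underlying figures are the estimates quoted in the text.

    # Compound annual growth rates implied by the 1962-1969 figures cited above.
    def cagr(start, end, years):
        return (end / start) ** (1 / years) - 1

    print(f"installed systems: {cagr(200, 1250, 7):.1%} per year")    # prints about 29.9%
    print(f"expenditures:      {cagr(49e6, 352e6, 7):.1%} per year")  # prints about 32.5%

In other words, both the installed base and spending grew at roughly 30 percent per year, compounded, for the better part of a decade.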
As systems became less expensive, and as budgets became increasingly the purview of departments and colleges within universities, decisions to acquire computing diffused away from central authority toward more distributed, shared acquisitions all through the 1970s and 1980s. To a considerable degree, this dispersed decision making about what to acquire accounts for the fact that in no other industry in the United States was there such a mixture of compatible and incompatible systems, with the exception of the U.S. Department of Defense, and for exactly the same reasons: dispersed budget control and expenditures, and the idiosyncratic needs of individual departments and constituencies.175 By the early 1990s, academic and administrative computing costs were such ingrained line items in an institution's budget that whenever a college or university either was flush with funds or had to cut budgets, computing expenditures enjoyed or suffered the same fate. Thus, for example, in 1991, when higher education budgets were cut all over the United States, over a third of institutions reduced budgets for desktop computing by 5 percent or more, clear evidence that this technology was no longer something new or small.176 The same happened again in 1992 and 1993, yet computing was understandably still treated as a discretionary expense eligible for cutting.177 Subsequently, demand for Internet services began to motivate administrators to find ways to expand their budgets for computing, mimicking the patterns of expenditure evident in the 1960s and 1970s: multiplicity of funding sources, distributed acquisition decisions and expenditures, and so forth. In short, how colleges and universities acquired and used computing reflected the consensus-driven and highly distributed management and decision-making processes that were hallmarks of the culture of higher education.
Conclusions

After reviewing the role of computers in the Higher Education Industry, it should seem almost pedantic to say that colleges and universities were perched in some interesting locations in modern American society. Besides being a supplier of computer science and technology, the industry trained (educators would say, educated) tens of millions of people, equipping many of them with the values, work practices, and skills that have defined the economy and society of modern America and, indeed, of many individuals and firms around the world. Its use of all manner of technology also reflects patterns of application evident in many parts of the nation's economy, including its use of the digital hand. Because it educates so many workers and influences the values and activities of so many individuals, its use of computing is influential and important. At the dawn of the twenty-first century, about 25 percent of the work force in America had attended classes in some postsecondary educational institution. It is de rigueur to have some college education if one is to enter or stay in the middle or upper classes, or to work in high-tech firms or in large corporations. Given how many individuals working and studying in higher education rely on computing to do their work, it seemed appropriate to explore the role of computing in that
corner of the economy before ending this book. In short, the story of computing in higher education is deep, complex, and massive, just as the whole story of computing across all American industries is.

But higher education is rife with paradoxes as well. Perhaps the most obvious involves the culture of how its institutions function and make decisions. These are hardly command-and-control institutions; they are ecologies made up of departments, colleges, and, most important, individuals all working in their own spheres. Theirs are institutions optimized for teaching and research. Budgets are highly parsed and coordination often a challenge. Yet as IT infrastructures needed to become more integrated, so did the demand for collaboration, whether to leverage the availability of new technology or to respond to a new threat, such as the one nonprofit higher education began to face from for-profit institutions, which often seemed capable of making decisions more quickly and of leveraging technology effectively as well.

But more important, has the use of IT fundamentally changed how higher education functions? It is the same question we have asked about the rest of the public sector. I have to conclude that at the institutional level the answer is no, so far. Administrators still have to build a consensus around an issue, while faculty senates and departments debate issues extensively. What IT has done over the past half century is to allow individuals working and studying in higher education to do what they have always done, only faster, better, perhaps less expensively (although that is not entirely evident), and to collaborate with ever larger communities outside their own institutions. So at the individual level, how work was done did change. Teaching, influenced profoundly by the digital hand, has only just started to transform. Thus, sources of change in institutional culture have to be found elsewhere, not in IT.

What about the effects of networking, specifically the collaborative quality of the Internet—what impact is that having on institutions? It is well understood that the more people who participate in a network, the more valuable the network becomes. Networks also continue to cost more, so much so that administrators are increasingly finding it difficult to run their institutions in the highly decentralized manner of the past. The president of Ferris State University described the effects in 2002: "At one level, information technology acts like an insatiable, hungry monster ready to eat up any available money in the budget. At another level, information technology provides institutions wonderful opportunities, if only the monster can be tamed."178 Channeling the power of technology is increasingly requiring the more centralized management so evident in all other industries. Can institutions in higher education be, as one IT executive put it, "smart and nimble enough to take advantage of" advances?179 It is a question that the deployment of IT has put squarely on the table for all in higher education to see.

Conversely, technology has brought about some changes. Our earlier discussion of new rivals coming into the industry, all focused on highly profitable offerings and equally adept in their use of digital tools, shows that they pose threats and opportunities not unlike those that already occurred in such industries as banking, brokerage, travel, and retailing. When Richard S. Ruch in his survey of
the for-profit university described the challenges faced largely by public institutions and small private colleges, he was articulating experiences that so many other industries had already faced, in some cases nearly two decades before.180 In those industries, several things happened. The most profitable components of an industry often were pulled out of established institutions and situated in new ones, as is happening now in the Recorded Music Industry and is just beginning to occur to movie producers, book publishers, and even, with charter schools, in K–12. But unlike other parts of the public sector, higher education does not have the monopolistic role played in society by such institutions as police, the armed services, courts, legislatures, or tax collectors to protect it from the need to change in some fundamental way. It appears to be more subject to the sorts of competition that some public sector agencies face, most notably the United States Postal Service and, of course, all of the private sector. The roles of individual cohorts in this industry are clarified by looking at their uses of computing in higher education. Administrative leaders adopted computing for the same reasons as their peers in other industries and at roughly the same time. Like peers in other industries and public agencies, as the years went by and their installed inventory of digital applications and telecommunications increased, they had to face growing expenses for IT that often expanded faster than their budgets. However, they also had to deal more intensely with telecommunications than counterparts in many other parts of the public and private sectors, particularly by the 1980s and certainly with the increased use of the Internet in subsequent years. The role of professors is most interesting because they are usually not discussed in the context of how computers are used on campus. Rather, students receive all the attention. But the historical record demonstrates clearly that professors were eager users of IT and, more often than not, the earliest adopters, not students. The reasons are not hard to find: the technology often suited their administrative, research, and communications needs, and they normally had more access to various funding sources to pay for the technology, particularly for PCs, access to mainframes, and research-centric software. But they used IT in support of their existing ways of working, rather than as tools to radically change practices, which accounts for why they might use IT extensively to communicate and conduct research, but not so frequently to design Web-based classes. To be sure, disciplines affected their use. Thus, we find that engineers and scientists were the earliest and often the most extensive users of computing throughout the second half of the century, while professors in the social sciences were later adopters, and those in the humanities the last to integrate computing into their work. Academics in large research universities tended to be the earliest adopters, followed by colleagues at elite universities and colleges, then by others at small private and public colleges, and, at about the same time, at community colleges. Finally, we should note that students did not create online classes. Whenever these kinds of offerings existed, it was because administrators funded development while professors created classes in collaboration with colleagues, IT experts, and students serving as teaching assistants or as part-time IT support staff.
In the years from the 1950s through the 1970s, students came to their colleges and universities essentially ignorant of computing, learning about the technology only after arriving at school, and then only if they studied computer science, engineering, mathematics, or the physical sciences and were led to the technology by their professors. Not until they began using PCs at home or in secondary schools did they arrive on campus as knowledgeable regarding IT as professors, or even more so. That situation did not begin to exist until the early to mid-1980s and was not widespread until the late 1980s. It was only then, for example, that students began to complain that their professors were not creative in their use of computing in course management, a perception that became widespread by the late 1990s. By the late 1990s, however, students also had become a major force to be reckoned with in higher education. They were willing to combine classroom-based courses with distance learning, cobbling together programs from multiple institutions to meet their needs. As the cost of education rose at the same time that they increasingly worked while going to classes, and hence were older, they became more articulate and demanding consumers. As the historical record shows, one of the major nexuses of their consumerism concerned the use of IT in their educational experience. For decades commentators on higher education have been warning that profound changes were needed, indeed, were coming. I have quoted from them all through this chapter, but little happened. Even as recently as 2003, distinguished experts on higher education noted the necessary changes, driven in part by IT, but also by larger forces in a society becoming more dependent on information, knowledge, and, of course, digital tools.181 The changes so many called for have started to occur. Gene I. Maeroff, himself a lifelong member of the Higher Education Industry, has argued that while his world was slow to respond to changes, many became possible because of the extensive use of the digital hand. He has argued that teaching in the future will rely more extensively on the use of IT than it has so far. He also acknowledged that IT in education is today one of the most controversial issues faced by both K–12 and higher education, yet predicted that many institutions would master the technology.182 Based on the prior experiences of dozens of other industries, his predictions seemed reasonable. Finally, we should call out a trend in the industry well under way in the early 2000s that is not unrelated to our whole discussion of the uses of IT in teaching and in containing costs in higher education. Specifically, competition for students rose in the years of the new century to a level not seen before, particularly among schools seeking out the best students in the nation. Part of their sales pitch to those students involved bragging about the state of their IT infrastructure. But, as already noted, students and parents were very concerned about the cost of an education. To be competitive, colleges and universities have had to discount tuition off the retail asking price. They could do this because they had developed large scholarship funds over the previous half century, more than because of improved productivity made possible by computing. Nonetheless, for
all intents and purposes, an outstanding or even good student going to college in the early 2000s could find financial help in the form of scholarships, low-cost loans, work-study programs, or simply a discounted tuition. In fairness to Princeton University, cited earlier for its high fees, it, too, discounted and had many other financial aid programs. As one industry expert noted, discounting was the big story in higher education by 2006. Discounting and the use of other financial grants could effectively keep the growth in the cost of tuition to levels similar to national rates of inflation.183 The case of computing and telecommunications in higher education brings to a conclusion the several dozen industries described in The Digital Hand. It is an industry that is itself the intellectual infrastructure and fountainhead of so much in modern society, of what so many have called the Information Society, Networked Society, or Information Age. Yet it is no more immune today to the waves of change blowing across the American economy as a by-product of the use and evolution of information technology than any other part of modern society. The last chapter is devoted to addressing myriad issues raised in this book that transcend any particular part of the public sector. It also identifies patterns of behavior and extracts lessons, finally situating the role of IT in this sector within the broader experiences of modern American society.
10 Conclusions: Patterns, Practices, and Implications

Outmoded commitments and operations constitute an encumbrance on the future that can erode the capacity of the nation to better align its government with the needs and demands of a changing world and society.
—U.S. Government Accountability Office, 2005
American governments, schools, colleges, and universities used every kind of information technology developed in the twentieth century. The federal government was the largest user of such technologies in the world, and often its applications were also the biggest. We have only to think of the amount of goods and services purchased by the U.S. Department of Defense with the help of computers, or the role of IT in fighting crime at the FBI, or in counting the number of residents by the U.S. Bureau of the Census, or the hundreds of millions of pieces of mail delivered by the U.S. Postal Service, to realize the breadth, size, and complexity of IT in the public sector. This was also the sector that developed the earliest computers and funded their evolution with billions of dollars. When one additionally recalls that the entire public sector made up the largest portion of the U.S. economy during the second half of the century, we are left with the inevitable conclusion that the role of IT in this part of American life was crucial to understand and an essential part of the history of computing and telecommunications. If this book has done nothing else, it should be clear that all manner of governments became extensive users of IT, from large federal agencies to small town offices. Even little schools used computers and, today, the Internet. The
story of computing in public administration is thus both one of massive use and one of important changes in the work of government and education in the twentieth century. This chapter summarizes key trends in the use of IT and the effects such reliance had on these organizations, concluding with a brief discussion of the implications for today’s public administrators. More than simply a history of computing, this book has described important operational transformations in the public sector, made possible by IT, that proved important, indeed essential, to the makeup of modern American society.
Public Sector as a Galaxy of Industries
The notion of looking at American government and education as if they were industries did not originate with The Digital Hand, but rather dates back to the 1950s, to the early work of John W. Kendrick on economic productivity and, most important, to that of Fritz Machlup in his studies of the production and distribution of knowledge.1 Yet since their work, the number of economists, for example, who have studied government the same way they looked at private sector industries has been negligible. Even recent studies of industries avoid the subject.2 Public administration and education are universally treated as if they did not make up any significant portion of the economy, or if they did, simply as a strange beast in the herd. Government economists and data collectors have also done an excellent job of obfuscating descriptions of the public sector, often for political reasons, so as to minimize, for instance, data on the growth of government. They are willing to measure other industries in considerable detail, but not their own organizations to the same degree. One simple example will have to suffice in support of this observation. The U.S. Bureau of the Census publishes annually a handbook on the American economy, providing vast quantities of information on employment, income and expenditures, social data, and so forth about society and individual industries. When it has to list how many government entities there are, federal enumerators treat the U.S. government as one, each state as one, and then list schools and local governments in the thousands.3 But as has been demonstrated in The Digital Hand, in the federal government alone there are many agencies the size of whole companies or industries. The Bureau of the Census alone is equal in size to some Fortune 500 corporations, while the Department of Defense (DoD) is bigger than many American industries. There are two partial exceptions—and only partial ones—K–12 and higher education, but even here the techniques used to measure a private sector industry are hardly used by economists and experts on managerial practices. The result of these various forms of neglect in viewing the entire public sector as a galaxy of industries has denied students of modern American society a useful way of understanding the composition and role of over a third of the American economy. The narrow view taken in this book of looking just at the role of IT in this sector underscores that observation. The case for focusing on the agencies and other institutions that make up the public sector has been discussed throughout this book by type and with
examples, and so can be summarized quickly. As in private sector industries, officials in similar roles collaborate or compete; they know each other and have their own associations, publications, conferences, “corporate culture,” and shared practices. Law enforcement officials make up a tightly knit fraternity; the military as well, even within each uniformed service. Military veterans alone constitute a world unto themselves, dating back to the creation of military associations comprising Union and Confederate soldiers in the last quarter of the nineteenth century. Prison officials talk to each other, as do tax collectors, who rely on each other’s data and forms with which to conduct business. Professors and their employers are also a nearly self-contained ecosystem of their own, while teachers have one of the strongest unions in the country. In short, the list is quite long. But once we recognize that groups of public servants act like members of an industry, the methods we have relied upon for decades to study the role and performance of communities of firms become obvious tools to use. One can begin to examine their levels of productivity, their cost of doing business, how information is shared among them, and the effects they have as regulators, customers, and users of myriad tools and products within the U.S. and global economy. An industry view thus allows students of American economic and managerial practices to decompose a major, insufficiently defined sector of the economy. We need to do this for many reasons, size being only one. For instance, public sector agencies collect and spend vast quantities of money and, therefore, influence or dominate whole private sector industries, particularly in wartime. They are a major employer and educator, and their actions dictate, more than those of any other sector or industry, the standard of living and quality of life of residents of the United States and, in wartime, of people in other countries, as we saw in the early 2000s with Iraq. Additionally, the political structure of the nation is driven more by the actions of government, and by the attitudes taught to children and young adults by the public sector (through schools and many jobs, including employment in the military), than by any other community in society, with the probable exception of parents and extended families. Yet even the influence of families may be diminishing at the dawn of the twenty-first century, as the configuration of the traditional nuclear family undergoes a transformation that has increasingly placed more burdens onto schools, postsecondary education, and government programs to educate students about life’s practices. We are led to the conclusion, then, that one can, and should, use whatever techniques are available to understand and guide the activities of the public sector. We should use the tools of the economist, sociologist, cultural anthropologist, and business school professor, honed in the study of many industries, because they are applicable to the public sector. Yet it is common for public officials, and for those who work in the private sector, to say that “government is different,” especially when speaking to someone working in a corporation. What did the history of the digital hand teach us? There are several aspects of the question to address: institutional culture, operational changes, and so forth. With regard to institutional culture, the use of IT had less effect than was evident in some private sector industries, such as media and entertainment.
The values of the organization were the same—or changed—due to circumstances other than the use of IT. Indeed, public administrators and users relied on computing to reinforce the values and missions of their agencies. The military used computers to build better weapons; teachers refused to change their teacher-centered form of teaching, but used computers to reinforce their pedagogical ethos; law enforcement wanted the same kind of information to carry out their jobs in ways that would have been quite familiar to sheriffs and police officers of a half century earlier. The clear conclusion we can reach is that the digital hand reinforced existing norms of thinking about the roles of various agencies and their employees. When one looks at attitudes regarding how to become more effective in carrying out these missions, there are changes to be seen. As Paul N. Edwards observed regarding the military and technology, how experts on IT viewed digital tools spilled over into the attitudes and language of users. In the case of the military, the notion of dealing with multiple activities as systems that were integrated and that could be substantially automated became an attitude that permeated military language and world views during the Cold War and, as we saw, has continued to the present.4 His observations are not an exception. As in the private sector, the language of systems seeped into the thinking and dialogue of people in the public sector, just as in the 1990s the language of process management became widespread in both the public and private sectors.5 Whether we are discussing the IRS, a centralized IT organization in government, or one in some large city, attitudes toward IT and, conversely, the language and world view derived from digital tools were quite similar in private and public sector organizations. Throughout the 1980s and 1990s, those who commented publicly about the effects of computing on companies and industries spoke about the disaggregation of both types of organizations. All through volumes 1 and 2 of The Digital Hand we saw examples of the cry that it was the end of the old way of organizing business, and yet not one major industry disappeared as a result of using computers. To be sure, borders between industries blurred, as occurred when the Brokerage Industry offered checking accounts to clients and the Insurance Industry offered investment options to its customers. By the end of the century, observers of the public sector scene were predicting that agencies would meld together as a result of the citizen-centric view that public administrators would have to take, one that had to transcend the priorities of any particular organization. Creation of the Department of Homeland Security (DHS) was viewed as a dramatic harbinger of new things to come made possible by the digital hand. But, just as in the private sector, no public sector “industry” or agency went away. The agencies subsumed into DHS are alive and well as distinctive institutions, including the FBI, described in chapter 3. The IRS is also still operating, as are school boards, town governments, and state agencies. In fact, of all the types of public institutions, the only ones that actually shrank in number were school districts, due to consolidations, and that resulted in increased operational productivity. Consolidations of any kind were often less a function of the power of computing than of the influence of politics and economics, such as what happened with school
boards and towns from time to time. The tyranny of a logic holding that IT efficiency drives productivity would have dictated that the states of Delaware and Rhode Island be absorbed by neighboring states as if they were simply a few additional counties. Yet that never happened. The point is, technology did not cause any major component of the public sector to disappear or to emerge as a new industry. DHS may someday become the first example of the opposite, but so far it has not. If industries in both sectors survived the extensive use of IT, did they learn about the technology at different times or in different ways? It is an important question because, in the private sector, learning about IT mirrored the way information on how to run a business flowed through the economy, affecting millions of managerial decisions regarding such myriad issues as how to use IT, business strategies, reorganizations, and so forth. Both the public and private sectors learned about specific uses of technology at roughly the same speed. Their managers and IT personnel read the same literature, attended many of the same conferences, graduated from the same colleges and universities, were certainly called on at the same time by such key vendors as IBM, Cisco, Microsoft, and Apple, and thousands of others; and they collaborated on research and development of new uses. No knowledge gap existed in any decade regarding IT. In fact, because of funding initiatives undertaken by various government agencies, there were instances in which public institutions were ahead of the private sector. Military systems in the 1950s and 1960s are obvious examples; so, too, is the development of various forms of IT by computer science departments in higher education from the 1940s through the 1980s. Did the public sector favor certain types of applications over those being installed in the private sector? In both government and corporations, management implemented systems that were remarkably similar in type. Both first installed systems to handle back office accounting, financial, personnel, and inventory control functions, leveraging the technical capabilities and capacities of the computers of the 1950s–1970s. Both migrated from card files to tape files, and then to direct access records, for the same reasons and at roughly the same time, with the result that both created online systems and used database management software, beginning in the late 1960s and extending right to the end of the century. Both also created systems specifically designed to handle the core functions of their organizations. Thus, the military created smart bombs, while the Census Bureau installed massive data collection software. Automotive plants had software to schedule the production of cars, while banks installed ATMs. So, industry-specific software was always popular across the entire economy, and the public sector was no exception. Indeed, as in the private sector, where industry-specific tools were lacking, or were perceived by end users to be lacking, there was less computing, as we saw, for example, with teachers, for whom good teaching software remained a rare product. Differences existed, too, in the kinds of applications installed. Companies implemented transaction-based software tools earlier than governments. For example, one could buy things over the Internet and pay for them with credit cards in the late 1990s but had to wait until the early 2000s to do the same for
licenses and the payment of taxes. Public sector managers remained concerned with internally focused operations longer than did their private sector counterparts, who perforce had to worry more about the role of customers in their businesses. From that experience we learn what historians of technology have long known, namely, that institutional culture and priorities profoundly and directly affected exactly how technology was used; IT was no exception. In addition, both sectors increasingly had to deal with their customers (citizens) via telecommunications and computing. Historically, agencies were notorious for not collaborating in dealing with citizens; someone on welfare who needed services from multiple agencies, for instance, rarely received a unified, composite set of services. The technology existed to make that possible as early as the 1960s, but even in the early 2000s only some agencies were beginning to integrate their data and tools to do just that. European and some Asian governments were farther ahead than the Americans. In business parlance, the problem was known as “stovepiping,” that is to say, too inward a focus in the development of policies, programs, and uses of IT, as we saw in the extreme case of the defense community. The private sector was also guilty of stovepiping, yet sooner than government or education it began to integrate multiple offerings and services, hence IT systems, to provide customers more integrated service. It did so because there was money to be made in doing so. Banks integrated checking and savings accounts, even making it possible as early as the 1970s to conduct cross-account transactions at an ATM. Brokerage firms made it possible for people to hold cash accounts, to write checks against those accounts, and at the same time to buy and sell stocks against those cash/checking accounts. By the early 2000s, almost all private sector industries had moved, or were moving, to integrated customer service operations, while the public sector had not. That is why, for example, the Clinton administration created portals that one accessed by desired service, rather than by the agency providing the service, to fix a chronic cultural and service delivery problem in how government dealt with the public. Go to a private sector Web site and it probably will not list divisions within the firm, but it will catalog its products and services and allow customers to mix and match them if so desired. Ultimately, with many types of digital transactions, public sector organizations mimicked the style and form of customer/citizen transactions evident in the private sector. Yet there are differences from the private sector as well. For one thing, decision making about what IT to acquire—the procurement process—is far more complex, slow, and onerous (to suppliers in the private sector) than what is routinely the case in companies. Because the public sector’s various agencies were collectively the largest buyers of services and goods in the American economy, its ways of acquisition were often important points of discussion, contention, and interest to all, from a Congress frustrated with what it and the GAO saw as constant waste, to private sector vendors irritated by the effort required to sell. The same scenario played out in many state governments, with agencies debating with their legislatures (and audit functions) about similar issues.
One conclusion we can reach is that the public sector often took longer to get things done than
the private sector took for similar tasks or decisions. As noted throughout this book, citizens often pushed government and education to adopt applications in the forms they had become accustomed to in the private sector, such as online access to government information twenty-four hours a day and the ability to conduct transactions the same way. A second difference between these two components of the economy is that implementations of digital projects also took a long time, often due more to their sheer size and complexity than to some innate way of doing the required work. We have only to think of the complicated applications in military systems, the avionics that made it possible to send men to the moon, the IRS’s massive systems, or even those of the FBI and the Social Security Administration to realize how complex digital systems often were. On the other hand, systems comparable in size and complexity to those in the private sector went into towns and states, for example, just as fast (or slowly), and were just as poorly (or properly) implemented, as in the private sector. A third contrast involving public sector agencies concerned the rate of change. Public sector organizations were more resistant to operational changes than the private sector. This feature of the public sector may also be what so many observers in companies notice as so different about government, though it was not obvious to me that public officials saw that feature in themselves. With exceptions, of course, adoption of new ways of performing work proved to be a slower process, and one often resisted, as we saw with teachers and professors regarding computers in the classroom, with law enforcement until officers got systems that gave them the same kind of information they had long been accustomed to having, and with the military, which proved reluctant to allow a digital system to make command-and-control decisions on its behalf before the system had proven absolutely effective. Some of the resistance was no different from what occurred in the private sector: an ineffectual application or piece of software was simply rejected or delayed in any industry, public or private; if the cost was too high or out of balance with potential benefits, public and private managers reached the same conclusions. Where public and private sector managers did differ was in the justification for using systems. Many agencies, and all of education, faced growing workloads throughout the second half of the century that outpaced increases in the budgets to support them, so much so that doing things in predigital ways was no longer possible. Recall the experience of educators in colleges and universities who implemented systems in the 1960s and 1970s because of the huge surges in the number of students arriving on their campuses. Also, Congress and many state legislatures mandated changes that increased workloads for various agencies without fully funding them, and thus these organizations, too, had to look for new ways of doing things. The Census Bureau constantly had to ask new questions of American residents, while the Social Security Administration had to provide new services. State legislators were also enthusiastic about mandating operational changes without always taking into account the implications for IT organizations. Our book is filled with many
instances of the tension between rising workloads, changing services, and highly constrained budgets. Public officials had little choice but to reach out to any available helping hand to get their work done, and the digital hand helped. That is why the private sector notion of formal cost justification—or cost avoidance—was less of an issue in the public sector. To be sure, cost avoidance was not ignored, but it simply was not always as important a factor as it was for corporations. Some resistance was also due to lack of knowledge about the potential benefits of IT, even though the technology, and information about it, was in abundant supply. Teachers in the 1990s often had many PCs, but insufficient knowledge (or time) to learn how to use them. Police officers represented another group with this problem. In general, public sector organizations did less to train users than private sector industries. Much of that difference could be accounted for by a combination of factors, such as poor management, inexperience, lack of time, or simply inadequate budgets to support sufficient training. In the case of the federal government, common themes repeated in hundreds of reports written by GAO’s auditors included lack of adequate leadership, frequent turnover in management, and poor project management, which led to unrealistic targets, missed deadlines, and often horrendous overruns in planned expenditures. The same occurred in state, county, and local governments, and in many schools, colleges, and universities. Often, the bigger the project, the greater the chance of running into the kinds of problems documented by the GAO since the early 1960s. The private sector, in general, experienced these problems less frequently. The quality of its project management skills tended to be better, and its managers were more directly held accountable for effective implementation, use, and results from reliance on digital tools. On occasion, while examining the role and work habits of specific agencies, one could sense that if they held a monopolistic position in society (to use a business phrase), their urge to change was weaker, and that sense of security and authority could affect decisions regarding the use of IT. We saw examples of that in the 1960s and 1970s with the postal system, throughout the entire period with higher education, and even in public education (K–12) during the 1990s. In each instance, there were managers who dismissed perceived challenges to their exclusive role in society as of minimal consequence. Yet over time, many discovered that they faced forms of competition as serious as any confronted by an industry in the private sector. As that realization became undeniable, management in such an agency turned to the digital hand for help in driving out costs and maintaining, or providing, new competitive offerings. To be sure, this pattern varied by agency. Thus, the Post Office faced it beginning in the 1960s, and that led to its restructuring in the 1970s; K–12 and higher education only began wrestling with these problems in the 1990s, and as this book went to press, the end results were not yet clear. On the other hand, those that were least threatened often proved very reluctant to change, such as the court systems, although they, too, used computers, but not to the extent, or as early, as colleagues in other parts of law enforcement.
Admittedly, generalizing is at best a tenuous if useful exercise, because there are
exceptions to general statements. For example, the Census Bureau—which clearly had a constitutionally mandated monopoly on counting the number of residents in the United States—constantly examined its own operations and sought new scientific, statistical, and technological ways to innovate in carrying out its role.
How the Digital Hand Changed the Work of Government and Education
From the birth of the computer to the widespread deployment of the Internet, public officials were generally extensive users of every new form of technology that came along. Indeed, in many cases, as with the military, NASA, and higher education, they invented it. Over the course of the half century, they essentially automated basic government collections of information, particularly many back office functions, and ended the century starting to digitize many transactions and interactions with citizens. Federal and state agencies are the most advanced in their deployment of various uses of IT, while local governments, though well along, are not as far advanced. The same generalization applies to K–12 and to pockets of higher education, although some communities and functions in higher education, such as research, are highly digitized. When compared to the private sector, most of the public sector was just as automated, computerized, or digitized. In short, daily work in the public sector changed as much as in the private sector. There is hardly any activity that is somehow untouched by the digital hand. This means that employees have become highly dependent on large bodies of data with which to make decisions and to do work. They are also affected in how they collect information and manage relations with individuals and organizations via their systems; what at one time might have been simply forms and policies have been so computerized that the systems themselves enforce great conformity by employees to policies. In other words, computing made it possible for public administrators to reinforce the practices of their agencies, because so many systems automated existing work practices. Additionally, these systems often stayed in place for so many years that it was not until the 1980s that structural changes in work practices and policies could be implemented, as we saw with the IRS and as was still occurring with logistical systems at the Department of Defense in the early 2000s. Big systems went in early in the history of computing, and we know that those initial applications of the digital hand were largely automations of existing practices, with the notable exception of military weapons, which were new creations. Until those large early systems were replaced, and replaced by a generation of public officials who had lived with the first wave of computers, fundamental changes in work practices occurred much more slowly than in the private sector. But by the end of the century, the transition to a new style of work had begun, even if it had not extended as far as in the private sector. Nonetheless, it had started. Use of computing reinforced the organizational silos in local, state, and federal government, in K–12, and in higher education. Systems were built to reflect the mission of
individual institutions, a point made repeatedly in this book. Thus, while early systems reinforced existing work habits, they also did the same for organizational structures and institutional cultures. In the case of the federal government, no major attempt was made to break down those institutional boundaries until the 1980s, when, for instance, secretaries of defense began the process of forcing the uniformed services to collaborate in more unified command approaches to their work—a process still under way. On the civilian side, creation of the Department of Homeland Security had not yet caused a major restructuring of culture and practices as this book went to press in 2007. State and local governments had the same experience. However, sharing information across various government agencies improved substantially in the second half of the century and thus affected work, increasing interagency collaboration, for example. The best illustration of that, of course, is the sharing of information by law enforcement agencies across all organizational boundaries. A second case involves the reliance of state tax collectors on information gathered by the federal government from tax returns. Leaving aside issues and debates regarding privacy and data security, the fact remains that public officials collaborate and share more information today than they did a half century ago. That change in work was a direct result of the capabilities provided by computing and telecommunications. Another change evident in the second half of the century was the further stimulation given to technical and managerial standards of performance. The federal government, for example, could set standards of performance for teachers, state welfare programs, and law enforcement not just because it was willing and able to fund programs, but also because it could provide services and gather information on compliance using the digital hand. In turn, the digital hand made it more attractive for federal and local organizations to standardize forms, the ways information was collected and shared, and expected outcomes. With increased use, the FBI’s databases and the practices surrounding them became largely standardized across the nation. The same held true for welfare and medical programs and, in more recent years, for K–12. This trend toward increased use of managerial and performance standards took place at the same time that technical IT and telecommunications standards increasingly seeped into the work and information infrastructures of the nation, a process that before computing was largely the purview of the Federal Communications Commission with radio, telegraph, and telephone communications and of the Federal Reserve Board with regard to communicating financial transactions among banks. In short, standards became more useful and more critically important as the end of the twentieth century approached. Did computing make it possible for government to become bigger or more effective? There are many ways to ask the question. For example, did computing help the public sector keep up with growing workloads? Did government do more than it had to simply because the technology was available? These are very important questions that impinge on issues of productivity and accountability, and on the role of government in a democracy, indeed, in the world. It is also premature to provide empirically based answers. But we know enough to start laying down markers and
articulating hypotheses that future students of the public sector can test. There is no doubt that government at all levels, and both secondary and postsecondary education as well, expanded enormously in the second half of the twentieth century. We know that the population of the nation grew too, which would have caused a natural growth in various public services and staffs regardless of whether computers and telecommunications were used. An additional observation we can make is that the range of services provided by governments at all levels, and also by education, began expanding in American society in the 1930s in response to the economic and social crisis posed by the Great Depression. World War II stimulated government’s role in society to expand further, and one can turn to the Cold War as yet a third propellant in the growth of government services and size. So the trend of “big government” began nearly a quarter of a century before computers came into wide use in the public sector. In other words, the forces that caused government and education to expand their roles all through the second half of the century were far more important and momentous than the parts played by computers. To be sure, information technology provided a helping hand, serving as a facilitative tool that made it possible for the public sector to carry out its expanding functions. There was no economic or technological determinism at work in the expanded use of computers and telecommunications. The evidence is quite clear on this point: public administrators used IT when it served the existing mission and objectives of their agencies or institutions. To be sure, economic incentives were created that made it easier for a police chief to acquire computers or to use an FBI database, for example, or for a school district to accept federal aid to wire classrooms for access to the Internet. But these officials, and the citizens they served, did not have to install a single machine. Only at the end of the last century did agencies begin requiring by law that people use computers, such as corporations obliged to file their income taxes using a combination of software and telecommunications. The instances where one was forced to use computers to work with government proved far fewer than those we documented in the first two volumes of The Digital Hand. Even internally within agencies, the requirement to use IT in order to do daily work often came later and less forcefully than in the private sector. I recognize that this is a judgment call on my part, but the evidence presented suggests that there existed (and remains) a degree of difference between the two sectors. By the early years of the new century, governments in particular were providing services made possible by the existence of IT systems. In the 1990s, for example, we saw the wide deployment of Web sites for myriad agencies providing information useful to the public and, soon after, the ability to access services by type rather than by agency and to carry out online financial transactions, such as buying a license. Yet even in this circumstance, offerings were presented within traditional bounds. For instance, in Wisconsin one could pay the annual license plate fee for an automobile over the Net; yet that same state had charged a fee for license plates going back to World War I. E-filing of tax returns did not change the fact that citizens had paid income taxes before the arrival of the computer.
Figure 10.1 American field forces in Iraq used laptops to communicate with their commanders, 2006. (Courtesy U.S. Marine Corps)
So much of what government did with computers was in support of existing activities, carried out in more effective or more accessible ways. Computers, however, did not always help public administrators solve problems or provide better services. We have only to recall the difficulties at the IRS in the 1980s, with its inability to process all tax returns in a timely fashion. The Pentagon has never been able to pass a DoD-wide accounting audit, despite having one of the largest collections of administrative and accounting systems in the world. California had its moments when large statewide systems did not work, and in 2006 Wisconsin either had to abandon systems under development and start over (for instance, payroll for the University of Wisconsin system) or face the wrath of a legislative audit team examining troubled IT projects cumulatively costing well over $100 million. Old systems kept too long drove up maintenance costs while not always providing the level of service required, a circumstance endemic late in the century at the FAA, the IRS, the state of New York, and other agencies from time to time. Can we argue that the ever increasing percentage of public sector budgets devoted to IT and telecommunications accounted for these problems? The answer is no, because too many positive and successful uses drove the increases in expenditures on IT. Hundreds of GAO investigations confirm as much, even though whenever there was a major problem, expenditures for repair and, more expensively, for remediation followed. Computers also led to some spectacular results. Unquestionably, the U.S. military was the most powerful in the world during the second half of the century in large part because of its extensive—and successful—development and sometimes use of high-tech weapons and command-and-control systems, beginning with SAGE (technically spectacular for the day, operationally marginally
acceptable) and nuclear weapons, later with avionics and guided-missile systems, and most recently with “smart” ordnance and armed unmanned drones. The American military followed a long tradition, dating back to before the Romans, of relying on technologies more advanced than those of its enemies. A second spectacular effect of computing is what happened with the scientific, medical, and engineering developments that poured out of universities, national laboratories, and private corporations. Avionics systems represent yet a third success story of extraordinary importance and proportion. It would be difficult to imagine as many aircraft flying during most of the second half of the century without digital systems to guide them and coordinate their flight patterns. The centerpiece of this story, of course, is the fact that the U.S. government could send humans to the moon and back, a feat that would have been impossible to accomplish, or even to attempt, without the aid of computers and telecommunications. Cumulatively, these and myriad other uses of IT helped the United States remain a major political and military world power, contributed directly to making it a global economic powerhouse, and facilitated its ability to sustain one of the highest standards of living in the world. Simultaneously, less spectacular results occurred where users of the technology failed to find ways of using it in support of their activities. Teachers and professors still lack software that would lead them to change fundamentally their existing pedagogical paradigms. Doctors and nurses rarely use e-mail to communicate with patients, although the latter would most eagerly welcome such a form of dialogue. Very large systems in the federal and state governments have a history of being problematic. Large departments have frequently not found ways to consolidate multiple redundant systems, such as DoD’s many logistical applications, less because of technological limitations and more due to institutional, silo-centric values and practices. Many accounting systems at the federal and state levels have yet to be managed as well as the law requires of the private sector. In some cases, the limitations were a function of culture, as with the quality of logistical and accounting systems; in others, of the technology itself, as with teaching tools for classroom use; and in still others, the technology was simply still evolving into new forms, such as the graphics required by both video game developers and the military, or the massive supercomputing and memory systems needed for the development of new medications. In those cases where useful features of the digital hand did not exist, work changed less than where IT proved a good fit. What can we conclude about the effects on the workforces of the public sector? We know from data presented in the first chapter that the number of public employees increased all through the second half of the century. We can account for some of that growth by the expansion of the nation’s population and economy.7 Additionally, we know that the services provided by public institutions increased in variety and complexity over the same period as well. It is still difficult, however, to draw a direct correlation between the use of IT and the size of the public payroll, a correlation that, if we could establish it, would provide a crude indicator of labor productivity. The public sector’s labor
productivity remains controversial and unresolved, as we saw as early as the 1960s with the work of John W. Kendrick.8
Patterns in the Adoption of Information Technologies
Managers in all kinds of companies, industries, and government agencies have always concerned themselves with how best to take advantage of technologies of all kinds, not just computers or telecommunications. Implementing new uses of technology well can help ensure that the cost of adopting an application will not exceed what was expected, and that the application will go “live” at an appropriate time and serve the interests of the organization productively. In this book, we have seen examples of implementations that took years, thereby delaying the benefits of such use to an agency. Other applications went in quickly, and thus agencies could take advantage of them sooner, as with some of the databases developed by the FBI. The cost of installing something new can be as great as that of the hardware and software components themselves, once one accounts for writing software, benefits deferred while the new use is being implemented, training, and the expense of often operating both the old approach and a partially implemented new one.9 Once implemented, a new use of technology becomes part of what an organization is and does, and remains for many years, often decades, even past the time of its effective value, as we saw in some cases with the IRS and with logistical systems at DoD. But in a bigger sense, the comings and goings of IT applications, and those of all other kinds of technology, are about the larger issue of how an organization works, the style discussed throughout this book and its two predecessor volumes. Historian David E. Nye described uses of technology “as the expressions of social forces, personal needs, technical limits, markets, and political considerations.”10 In short, a number of factors weigh on the issue of deployment. The rate at which technologies change represents one influence; public administrators obviously could take advantage of a new technology only once it had appeared, not before, despite the fact that the proverbial “everyone” was bombarded with forecasts of emerging new technologies throughout the entire period. The degree to which an institution could maintain consistent leadership, focus on an implementation plan, and retain budgetary commitments represented a second gating factor—the issue that so often drew the ire of the GAO, which complained frequently about the lack of constancy in leadership and purpose in so many federal IT implementations. The capabilities of individuals also affected implementations, as we saw with police chiefs, who lacked knowledge about IT until the U.S. Department of Justice stepped in, or with the deep understanding of computing that employees of the U.S. Bureau of the Census gained, which made it possible for that agency to become an early user of computers. The list of variable influences could be extended, but the point is clear: adopting a new use of computers was a sociomanagerial process influenced by a complex set of forces and issues.
Given those factors, what were the patterns of adoption of IT in the public sector? Did they differ from those evident in the private sector? In fact, the evidence of implementations and changes presented across the three volumes of The Digital Hand allows us to identify several common patterns. First, large organizations across the public sector were often the earliest to implement any, and frequently all, new forms of IT and telecommunications, because they had the need, the budgets, and often the necessary skills. This pattern applied to federal, state, and large urban governments and agencies, and to universities and some colleges, but rarely to school districts, and never to individual schools (which were too small). Second, as technology dropped in cost—as measured both by the relative expense of doing a transaction and by the absolute price tag of an item—waves of new users embraced the technology. This was true across all of the public sector during the entire period. High points included the arrival of inexpensive personal computers at the end of the 1970s and early 1980s and of the reasonably easy-to-use Internet in the 1990s; but examples can be found across all industries and public institutions. Third, as software that did work specific to an industry or agency became increasingly available on more affordable computers, public administrators adopted it. For example, as database tools for law enforcement came online, police embraced them. The converse was also true, as we saw with teachers and professors who resisted using teaching software in classrooms because it was often of poor quality or simply ineffective. A fourth pattern of behavior, also evident in the private sector, was clear: public administrators implemented specific uses of the digital hand only if their cohorts were doing the same. Very few administrators wanted to be the first to implement a new use; on the other hand, they knew not to be the last. As generations of IT salesmen could attest, whenever they presented a new use of computing to a public administrator, they could expect that official to ask, “Well, who else is doing this?” Like their cohorts in private sector industries, public officials behaved much like schools of fish. When administrators with the same mission (for instance, sheriffs or educators) saw a use of computing or telecommunications as viable and affordable, like-sized institutions generally adopted it at about the same time, “about the same time” usually meaning a period of roughly three to five years. A corollary issue to timing was the role played by either the federal or a state government in encouraging adoption by providing funding and other forms of support. I cannot overstate the importance of the financial pump-priming that occurred, because throughout the second half of the century larger pools of budgetary wherewithal resided at the federal level, and also at the state level in the largest states (such as California, New York, Illinois, and Ohio), and thus officials in those organizations had the capability of changing the course of events by using their checkbooks. The smaller an organization, the more of its budget went to salaries, and hence the less was available to invest in IT. Thus, the dynamics of budgeting always affected the pace of adoption, indeed, the types of technologies and uses implemented.
Taking in these features of public sector adoption allows us to think of IT adoption as waves of implementation, linking nicely to the typologies used by
historians of computing when they speak about generations of technology. The point to keep in mind, however, is that the waves of change that occurred in the adoption and use of IT were not limited just to the evolving features of one generation of computing over another; changes were also functions of how officials who did have large pools of budget wanted to use computing inside and outside their own agencies. Action also depended on the degree to which public administrators felt confident that a new use of technology was practical, as we saw with senior military officials who embraced a new application once it proved effective. That last observation applied just as much to the private sector at all times.

We have only touched briefly in this book on the effectiveness of the implementation and use of computing in the public sector, because such a discussion would have taken us far afield from the issues we needed to address. However, successful and poorly implemented uses of IT also affected waves of change, patterns of adoption, and timing of changed uses of computing and telecommunications. A failed implementation of a large system in California or Wisconsin, for example, would set back use of a new application for years, as agencies either had to start all over or were simply soured on the experience. Similarly, at the federal level, a faulty implementation had the same effect or caused the agency involved to take years longer, often even a decade or more, to implement, and often with significant overruns in costs. There were many poorly implemented systems at the federal level, as we know from the hundreds of GAO audits published on the issue since the early 1960s. To be sure, successes outnumbered failures; otherwise, public officials would still be using tabulating machines and desktop adding machines, as is still done by officials in underdeveloped countries.

As the nation’s inventory of public uses of IT increased over the half century, and as the complexity of those systems did too, the issue of effective implementation became more serious, and ever more relevant. Privacy of one’s data began emerging as a critical public concern by the end of the 1980s; by the end of the 1990s and early 2000s, so had the security of data, which was threatened by cybercriminals and enemies of the United States. One GAO report from 2003 documented the growing attention to these issues as the nation became increasingly dependent on the digital hand: “Over the years, various working groups have been formed, special reports written, federal policies issued, and organizations created to address the nation’s critical infrastructure challenges.”11 Similar activities took place at the state and local level, and increasingly in the early years of the new century, in higher education.

Implicit in many critiques of governmental operations was the assumption that the use of IT either helped or detracted from the effectiveness of public institutions. But strong advocates of the use of computing in government saw that the issues often came back to problems in accountability for results and budgetary management. Vice President Al Gore, for example, in the case of the federal government, noted in the mid-1990s that “for a decade, the deficit has run out of control,” and that “below the surface, Americans believe, lies enormous unseen waste.” He cited the examples of DoD owning more than $40 billion in unused materiel and the IRS struggling “to collect billions in unpaid bills.”12 Speaking on behalf of
the administration, he concluded that “we suffer not only a budget deficit but a performance deficit.”13 To be sure, some of his comments were aimed at a public that had a very low opinion of the effectiveness of government during the early to mid-1990s; nonetheless, not all was well with public administration. But the administration implemented far-ranging reforms in the 1990s, often called “reinventing government,” and introduced a formal “National Performance Review” of all federal agencies. Underlying much thinking at the White House was the notion that many agencies were designed for an industrial era—large and hierarchical—when in fact they now needed to be “reengineered” to operate in an information age. Because officials at all levels of government were beginning to reach similar conclusions, particularly governors, it is worth quoting Gore at length once again:

From the 1930s through the 1960s, we built large, top-down, centralized bureaucracies to do the public’s business. They were patterned after the corporate structures of the age: hierarchical bureaucracies in which tasks were broken into simple parts, each the responsibility of a different layer of employees, each defined by specific rules and regulations. With their rigid preoccupation with standard operating procedure, their vertical chains of command, and their standardized services, these bureaucracies were steady—but slow and cumbersome. And in today’s world of rapid change, lightning-quick information technologies, tough global competition, and demanding customers, large, top-down bureaucracies—public or private—don’t work very well.14
The key point to keep in mind was that effectiveness was less a function of the technology than of the purpose for which it was applied. For that reason, any study of the deployment and use of applications of IT is essential to an appreciation of how the American economy worked. As that last quote from Gore also suggests, the effectiveness of an implementation is directly tied to the culture, mission, and capabilities of individual agencies.

Ultimately, we also need to acknowledge a very obvious pattern of deployment: institutions across all of the public sector did implement IT in ways that made sense to them. There were surges of innovation in how agencies did their work and how the digital hand functioned. The early 1960s and again the late 1990s at the federal level represent one example; the 1970s–1980s and the dawn of the new century in state and local governments represent another; the 1980s and again the early 2000s in higher education yet a third. In the early years of the new century, deployment of all manner of uses of IT and telecommunications appears to be going through another round of change and growth, driven by myriad but well-known factors: newly emerging high-end supercomputing, low-end portable consumer devices, increased inventories of new software, expanded use of the Internet, availability of budget (despite complaints to the contrary), increased dependence on existing IT systems and infrastructures, a workforce familiar with IT at all levels, top to bottom, realization that global economic competitiveness is at stake, the War on Terror, and so forth.
Role of Public Sector as Creator of Today’s Economy

Vice President Gore’s comments about how governments reflected institutional values and structures of the period are borne out by the historical record and the findings of all three volumes of The Digital Hand. American public institutions, and especially governments, did not operate in isolation from the society in which they functioned. We saw the simple example of teachers using PCs at home in exactly the same ways as Americans in other walks of life, just not so much in the classroom. Federal agencies installed ERP software to manage supply chains at the same time as large manufacturing firms. Given those realities, we should also recognize that the public sector, which was so large throughout these years, was a proverbial 800-pound gorilla in the American economy. Indeed, it had been so since the days of the New Deal in the 1930s, when the federal government first embraced Keynesian economic policies. This new form of proactive public leadership called for officials both to stimulate the economic health of the nation and to guide its activities. At each level of government, and in higher education as well, public institutions contributed to the economic welfare of the nation. While this is not the book in which to tell that story, as it has been studied by many American economists in the last half century in one fashion or another, it is appropriate to summarize briefly what we have learned regarding the role of the digital hand in public affairs.

In the 1940s and 1950s, the federal government funded development of the computer and, in the process, fueled the emergence of the computer industry. One result of that creation was decades of American exports of technologically advanced products, including computers, around the world. Second, the technology evolved into a major stimulus for the expansion of foreign operations of American corporations.15 Third, by funding R&D in computing, and subsequently buying products from indigenous American computer manufacturers, the federal government as a whole, and most specifically the Department of Defense, created a robust, competitive American computer industry, which, by the late 1970s, had gone global, with 45 percent of its total sales coming from exports and other business transactions conducted outside of the United States.16 In turn, these three sets of events made it possible for the Computer Industry, and later derivative industries, to thrive in a competitive capitalist economy with software, disk drives, computer chips, computers, and, even later, video games. Kenneth Flamm, who studied the early role of government in the creation of the robust computer industry in the United States, got it right when he argued that with or without federal support computers would have eventually been invented and turned into commercial products, because many people around the world were busy at work developing the technology in the 1930s and 1940s. But, without federal support, “the pace of advance would have been far slower,” and by funding early, very expensive R&D, public officials were able to eliminate the investment risks for companies that would otherwise have engaged in the process more slowly.17 Once the industry got going, it could—and did—make
incremental improvements in the technology at a time of decreasing funding by the federal government. The slack was picked up by the private sector as economic opportunities and revenues emerged. Yet right into the next century, the government continued to fund much leading-edge research and to support complex implementations, as described in chapter 3 about DoD.

To put a fine point on the role of the federal government, its practices regarding IT were also implemented in aerospace, semiconductors, medicine, scientific and engineering instrumentation, and other science- and engineering-based fields all through the second half of the century. There were years when 85 percent of all funding for R&D in the United States came from the federal government. When one considers the fact that more R&D was done in the United States than in any other country, the role of the federal government was so profound and such a distinctive feature that any history of the evolution of the American economy in the last half century must recognize it. Today that role is also being played by state governments.18 Federal motivations are simple enough to list: the need for weapons for the Cold War, the desire to stimulate economic development and provide jobs, and so forth. But as a consequence of the actions taken, computing became available first and most intensely in the economy of the United States, with the direct result that American firms were often the first and most extensive beneficiaries of whatever benefits there were to be derived from computing. At the end of the heated economic debate about the productivity paradox in the 1990s, the judgment of most economists was that computers had contributed to the overall growth in American productivity.19 Furthermore, that productivity had built a head of steam, with minimal evidence of results from the 1950s through the mid-1980s, and then picked up right to the end of the century. In short, this set of institutions did a great deal to make the American economy “advanced,” “high-tech,” “info-based,” “information based,” or whatever other term one wants to use. To be sure, all industries participated either as creators of the technology or as its users, but the spark and sustaining flame came originally from federal sources, particularly in the early decades, and it remains an important engine of innovation in the American economy.

Often left out of such grand discussions about federal policies and practices in the continuous modernization of the American economy are state governments. State governments collectively spent a great deal of money, employed as many people as the federal government, played a very aggressive role in stimulating local economic development, and were just as eager as the federal government to modernize their own operations and to help local employers do the same. In fact, the evidence suggests that they did a better job than federal agencies in continuously modernizing their own internal operations with the help of the digital hand. Their projects were often far smaller, but there were more of them. Local implementations of IT had the normal spillover effects of transmitting knowledge about IT into communities, creating new jobs, and fostering development of entrepreneurial enterprises and Silicon Valleys all over the country.20 Because state governments were more pervasive across the physical landscape of America
with their many thousands of offices in countless communities, they not only made computing a visible component of American society but also made it part of society’s tool kit, and nearly as early as happened in the more distant federal agencies. Use of computers by state governments became so pervasive and influential on local economies that no economic history of the United States in the second half of the twentieth century can realistically be written without taking into account the role of state governments, in part because of their use of computing to change the nature of work and productivity.

Can we draw the same conclusions about local government at the county, city, and town levels? Here the effects lend themselves less to generalizations, because work was more narrowly focused and fragmented. Furthermore, local deployment of digital tools never became as extensive as in state and federal agencies. Public use of computing at the local level had the most profound effect in law enforcement; unquestionably, it performed better and more productively at all levels, in large part thanks to the helping hand of IT. The design and management of the construction of public facilities, such as highways, subways, and other transportation infrastructures, probably also benefited enormously in the same ways, although this application was not as visible to the public at large, and certainly has not been examined, except in this book. Backroom operations, such as tax collection and accounting, did too, but because every public and private institution used computing for these purposes at roughly the same time, the relative advantage of one using it over another pretty much canceled out any sense of distinctiveness. The same is true to a lesser extent about online transactions with citizens. Given the fact that these only started to be done in the late 1990s—thanks largely to the availability of the Internet—it is too early to conclude that the effects are the same, although they probably will be.21 Yet, as a whole, the use of computing for such operations across the entire economy contributed to the productivity gains articulated by economists at the end of the century.22

What can we say about higher education and K–12 and the digital hand? Today, computers are widely available and used in much of K–12, and far more so in higher education. K–12, however, does not yet seem to have affected how children use this technology, let alone what they think about it, as the evidence indicates that they learn more about the digital hand outside of school by playing with digital toys, communicating with cell phones, and using PCs at home. Schools and their districts acquire equipment and software just like any other industry, and for the same purposes, mainly for back office operations. Thus, what we can conclude is that K–12 does reflect, in general, society’s appetite for computing, but no more and no less.

Higher education, on the other hand, may play a far more important role than K–12 in today’s economy and society, although the latter is normally a major employer in any community. For one thing, colleges and universities are large institutions that spend a great deal of money buying, maintaining, and using digital technologies of all kinds. These institutions teach students about the digital hand and have woven its use into their work. Junior colleges may actually have done more
recently than four-year institutions in this regard, because when they trained students in a vocation, such as automobile repair, they integrated into their training programs the digital tools already used by potential employers. At four-year institutions, the same can be said of all the physical sciences, most of the social sciences, and parts of the humanities, as well as of much of the rest of one’s academic work, such as researching library holdings, writing papers, or conducting business transactions. Thus, an important contribution higher education makes today is to enhance the digital skills of its students while also being a large consumer of such technologies.

We have already noted the profound contribution made by American higher education to humankind’s inventory of knowledge through research, which today is highly dependent on the use of information technology. That use has resulted in important changes in the American economy that now characterize how whole industries and thousands of companies function. Beginning in the 1950s in both California and Minnesota, computer engineers who trained at local universities or who worked in these states started to set up their own companies to make and sell computers, components, and software in the shadow of these institutions. Silicon Valley in California is near Stanford University, while hundreds of similar firms are located near the University of Minnesota in Minneapolis and the University of Michigan in Ann Arbor. MIT in the Boston area had a similar effect in Massachusetts during the 1960s and 1970s. Examples also exist in dozens of other states. But then whole new fields emerged, made possible by research done with the help of computers. Software is an obvious example but in hindsight may turn out to be a minor one, given what is currently happening with biotechnology. Today, biotech research is being done at hundreds of institutions, and we are already seeing companies being created in the same communities where major research universities exist in support of this new field. Ann Arbor, Michigan, and Madison, Wisconsin, for example, are becoming biotechnology centers, dotted with small companies staffed with academics and graduates from local universities creating new medicines and other products.

Thus, the tradition of higher education spawning new products and industries in the United States has continued, and at such a high level of activity and economic volume that we can conclude it directly affects what important components of the economy look like. Referred to as university-industry technology transfer, it is a unique feature of the American scene, yet one dating back to the late nineteenth century. As one team of scholars looking at the role played by higher education reported, the higher education infrastructure “blended financial autonomy, public funding from state and local sources, federal research support, and substantial scale,” which “provided strong incentives for university faculty and administrators to focus their efforts on research activities with local economic and social benefits.”23 They further noted that the focus of this research was thus aimed largely at “understanding and solving problems of agriculture, public health, and industry.”24 In addition, nowhere are the ties between higher education and high-tech industries more intimate than in the United States. With the decline in federal financial support
for research late in the century, which remained depressed during the Bush administration, the private sector took up the slack relatively quickly, with the result that, as of this writing, there is no indication that American higher education will reduce its contributions to the development of new products and even whole industries in the foreseeable future.

IT also affects the debate about performance. In late 2006, for example, the U.S. Department of Education announced the results of yet another blue-ribbon commission studying perceived problems in the performance of higher education, this time concerning its failure to keep America competitive in the global economy. The department called for a comprehensive database to be created that would collect performance data on colleges and universities, data that students, their parents, and government agencies could use to assess the results of what higher education did, including publishing a ranking of institutions. The heart of the ranking for accountability of performance described in the final report was, in effect, an extensive discussion of an IT application.25
Regulatory and Legislative Roles of the Public Sector in American Society

American governments directly affected the activities of the economy through their extensive and continuously growing twin roles as regulators and legislators; this was nowhere more so than with the federal government. All through the twentieth century, the breadth of regulatory activities expanded as the nation looked to government to ensure the safety of its food and medicines, provide economic and social safety nets, protect the environment, and establish standards to contain the dissemination of socially unacceptable material (such as sexually explicit content on radio, TV, and in movies), while keeping channels of information sharing open. The broad outlines of that story have been explored by many others and thus need not detain us here. However, what should be pointed out is that the expanded use of digital tools and telecommunications in the second half of the twentieth century became a source of new involvements and consequences of regulatory practices, a process that shows no signs of slowing down. It is through its increased regulatory and legislative activities that the public sector came to influence the way Americans worked, in addition to its own internal use of the digital hand.

This happened in several ways. The first involved regulating the flow of business transactions through telecommunications and electronic means throughout most of the century. Government did this by regulating banking transactions, largely through the Federal Reserve banking system. As it became possible to conduct additional financial transactions electronically, regulators determined whether, when, and under what circumstances they would allow these to be done, as with the large variety of brokerage services and some of the crossover of these services into the preserve of the banking community. Those kinds of regulatory activities changed the relationships among financial industries, often blurring borders, as described in considerable detail in volume 2 of The Digital Hand.
To be blunt about it, no firm could introduce any significant digitally enabled offering without the regulators agreeing, and this was particularly the case in telecommunications, media, and finance. That often meant multiple federal and state regulatory agencies became involved. In addition to the financial regulations were a vast number involving telecommunications of all kinds, from telephones to radios, to the use of satellites, television, and cell phones, all within the purview of the Federal Communications Commission (FCC). It proved to be an enormously influential player in the American economy. In addition, laws passed by Congress regulating telecommunications became central events in the economic life of the nation. The availability of multiple forms of digital and analog transmissions of information, conversations, and entertainment led the FCC to say grace over the nation’s telephone services, the creation and regulation of the practices of all manner of television services (even content, to a limited extent), and radio transmissions. It regulated business practices, influenced pricing, and even determined technologically how transmissions would be made and what technical standards one had to adhere to, such as for the digital transmission of television programs. It even set deadlines for when such standards had to be used, as with digital TV.

A second way regulators influenced events in the American economy was through their advice to Congress about what laws to pass regarding the flow of information and telecommunications. That happened along two tracks. The first involved advising Congress on the myriad patent and copyright laws passed during the last quarter of the twentieth century in response to the arrival of such digital items as computer chips, software, and electronic copies of things originally only printed. Regulators and Congress together established the protections and property rights of those engaged in the creation and sale of digitally based products and services that had not existed prior to roughly the mid-1960s. By the dawn of the new century, a workday hardly went by without some pronouncement or judgment by a federal regulatory body concerning some digitally oriented issue. Thus, both through comprehensive legislation and through myriad rulings, government affected the nature of how IT and telecommunications were used in this country, and the pace at which change and adoption occurred. Less frequently, but nonetheless importantly, federal and state courts also heard cases involving computer companies, small and large antitrust suits, challenges to regulations concerning telecommunications, and so forth, thereby incrementally shaping practices in the American economy.

A third way government influenced the work of this nation was through regulations and laws concerning the Internet. Decisions to abstain from regulating traffic over the Internet to the extent it regulated telephonic activities proved significant. Look at what the French or Chinese governments have done regarding the Internet, in contrast to the American government, and one can quickly see the difference. China filters content coming into the country to keep out subversive materials. France has blocked eBay from selling Nazi memorabilia over the Internet. Google faces constraints in various countries. In these various cases, we have examples of governments controlling activities of at least some interest to their
citizens. In the United States, officials resisted pressures to constrain, for example, the flow of pornographic material over the Net, choosing instead to use law enforcement to arrest those violating existing laws rather than to risk crossing swords with the U.S. Constitution, which its authors wrote so as to allow the flow of information and ideas. The role of regulators and Congress with the Internet extended even to its economic practices, barring states from charging sales taxes on goods sold over the Net in order to stimulate use of this digital tool to promote e-commerce. Various federal regulatory agencies deal with the Internet, from how banking can be done over it, to FCC actions concerning its technical standards, to Congress pondering to what extent the national government should influence activities over the Net.

State and local governments have played a far more restrained role in influencing how digital tools could be used through regulatory and legislative pronouncements. Part of the reason for this is that in the United States anything that involves multiple states is subject to some form of federal regulation, such as telecommunications. As we saw in chapter 2, state governments were not able to impose sales taxes on Internet-based transactions, trumped by federal law prohibiting such actions. States regulate the cost of telephone and television services, while local communities make such determinations as which cable company will provide local television services. Thus, so far, state, county, municipal, and other local governments have played a minor role in regulating the digital affairs of their communities. This pattern of minor participation in regulatory practices tracks along the same lines as the role local governments played in the evolution of radio, television, and movies. We have not yet seen one form of local practice emerge with digital media—book censorship—although growing discussions about pornography over the Net available to children have spawned the same kinds of debates that engaged communities regarding books for centuries. So far, the availability of commercial software has provided parents with tools that are relatively effective in controlling the flow of content into their homes, without society having to look to Congress for significant legislative assistance.
Public Administration in the Information Economy

The role and quality of the management of information technology and its uses in the public sector is an important topic that warrants books of its own but cannot be completely ignored in this one. Are there issues and implications we can draw out of public administrators’ experience in working with the digital hand over the past six decades that would be of use to officials in the early years of the twenty-first century? The answer would normally depend on many variables; however, what is remarkable about the history of IT in the public sector is how consistent many of the managerial issues remained from one decade to another. There are some chronic realities that management faces within the federal government and across state and local governments, realities that often extend to higher education and sometimes to K–12.
For one thing, size and complexity matter. The bigger the agency, the larger the systems it needs and the more willing it seems to be to implement them. The federal government, however, generally has a poor record of doing this well. Its agencies tend to overrun their budgets quite extensively, take far longer than planned to implement systems, and get accused constantly by the GAO of being too ill informed about the complexities involved. Frequently, the problems are turnovers in senior leadership, which change an agency’s focus and priorities, inadequately skilled IT implementation staffs, or inadequate resources. We know about these conditions because of the over one thousand GAO audits done of such projects. Despite this litany of problems, intentions were often good, sincere efforts were made to be productive, and ultimately many important systems were installed and used.26

Less publicized managerial challenges were the construction of systems over time that met only the needs of a particular agency and the way an organization’s performance was measured. These realities normally stymied data sharing across agencies, a problem that grew in seriousness in the 1990s as the need to share expanded, even as the Clinton administration was increasing its emphasis on measuring performance; the problem became a major threat to the nation’s ability to combat terrorism after the creation of the Department of Homeland Security. Improvements in the management of complex projects occurred, along with the implementation of initiatives to demonstrate positive outcomes. Changes prescribed during the Clinton years built nicely on earlier initiatives, with the result that federal accountants in various departments began to pass the kinds of accounting audits expected routinely of the private sector. Results of work were increasingly being documented, with the effect that public officials became more outcome oriented in their work and less inwardly focused than in earlier years.27

The GAO challenged all of the federal government to continue this progress, to think long term about the role and integration of work from multiple departments and agencies. It wanted the national government to do the difficult, but necessary, “periodic reexamination of existing portfolios of federal programs” as a way “of weeding out ineffective or outdated programs while strengthening and updating those programs that are retained.”28 Many of those actions will require replacing or modifying existing IT software and work practices. Some will be changed incrementally, while others will be complicated and take a long time to complete. That is a lesson from history. But highly experienced public officials understand this and often look to the private sector for best practices. They also operate some of the largest IT applications in the world, for which there are few analogues. To be sure, IBM and the Pentagon have logistical operations of the same size and volumes and so can learn from each other. But who has a bill-collecting operation the size of the IRS’s e-filing application? In short, public sector managers will have to invent new ways to handle very large projects, building on their six decades of institutional experience in creating massive uses of the digital hand, or suffer the problems of so many of their predecessors.

Information technology exhibits all the patterns of a newly emerging technology that historians are quite familiar with through their study of earlier ones.
IT is not mature by any means; we can’t even decide what functions should be integrated into a cell phone, let alone into large systems. When IBM announced in the 1990s that it had found a way to use copper in chip manufacturing, it was a breakthrough, one that made it possible to miniaturize many digital tools and pack them with considerable functions. That led to yet another round of transformations in consumer electronics that was still being introduced in the early 2000s. No respectable hardware or software engineer or scientist is prepared to say that the discovery and innovation stages of the technology are anywhere close to being completed, let alone stabilized. This means that public administrators, like their cohorts in the private sector, are going to face opportunities and challenges in the foreseeable future. As new digital tools appear, already installed ones will be too expensive to maintain and too difficult to change—the path dependency issue so many historians and managerial experts are discussing today.29 The larger a system is, the harder it is to replace. Public and private sector managers have all had a difficult time dealing with this issue over the past six decades. However, there are paths being taken in the private sector that public officials are beginning to consider using. The most obvious is simply shifting work to another organization (public or private) that has a better or more modern system, rather than writing one’s own. The private sector calls this downsizing, rightsizing, or simply outsourcing.30 When federal agencies have done this in the past, it normally was for things they were not equipped to do (such as designing a new weapons system or aircraft), for work small enough in scope that the distraction could be made to disappear (such as running a cafeteria), or to provide additional resources fast (such as hiring ex-military personnel as security guards during the Iraq War). But they will probably have to do the same with many IT tools and other agency work where tools and tasks can be pulled out in some modular fashion, as we are seeing happen in the private sector, fulfilling the prophecies of the many managerial experts of the 1970s and 1980s who said this would happen eventually.31

In the case of the federal government, and increasingly at state and local levels, we have the unique problem of an aging workforce getting ready to retire in such numbers that whole agencies are threatened with the possibility of not being able to hire and train enough replacements to do their work. That is a real, impending crisis that will probably have to be resolved by applying the same techniques private sector managers used in the 1990s when they reengineered whole work streams. Those projects made it possible to restructure work to be more effective while at the same time reducing the amount of labor required to do it.32 If the transformations can be timed correctly—and there is plenty of experience in the private sector to do this—then it would be possible to bring on new, highly digitized processes just as employees were retiring, so no president or governor would have to face the prospect of voters reading glaring headlines in the New York Times or in the local press reporting that he or she was laying off thousands of workers. It can be done using the incrementalist approaches so favored by public officials.33 Such actions would also begin to address one of GAO’s biggest concerns, namely, that as baby boomers retire, the cost of pensions
and medical programs will so sap federal budgets that there will not be enough money left in the Treasury to carry out the normal work of government.34

In short, changing technologies, geopolitical circumstances, and demographic realities provide, indeed force, opportunities for new ways of doing the work of government. This statement applies just as much to state and local governments as to the federal government. As technology seeps through American society, officials will find themselves revisiting conversations held by the Founding Fathers, facing an issue far bigger than whether or not the IRS needs some new e-filing software or the military a different smart bomb. Should citizens be able to vote on weighty national issues, rather than allow their representatives in Congress to do so, mimicking the spate of propositions citizens in California are routinely asked to judge? It is an old debate that goes back to the eighteenth century and that the writers of the U.S. Constitution thought they had resolved: they wanted representative democracy rather than direct democracy. But as the nation moves toward digital voting, and possibly even someday to voting over the Internet, the step to direct democracy is then a short one. Security is rapidly declining as an issue; in fact, transactions over the Internet are now some of the safest in American business, and technology to protect identities is now available.35 A probable lesson from history is that the issue will appear in more intense form than ever before; it will not go away, but rather become a major point of consideration sometime in our new century.

One of the things that happened in the last half century to public officials at all levels of government was their growing dependency on information with which to do their work and to make decisions. Computers made it very easy to accumulate ever larger amounts of data, to analyze it for people and other computerized systems, to present options and predictions, and so forth. Federal officials were the first to become profoundly dependent on such data, but this is now occurring even at the municipal and town levels. Computer scientists maintain that the ability of systems to do more data gathering and analysis, and even decision making for humans, will increase sharply in the early decades of the new century.36 How are officials to react to this development at all levels of government? To what extent will they allow work and decision making to shift to machines? In volumes 1 and 2, I described how various industries had found it useful to shift responsibilities for many mundane tasks and decisions to machines, beginning with the Automotive Industry but also extending rapidly to parts of the Retail Industry, widely to all of the financial sector (banks, insurance, and brokerage), and to the Telecommunications Industry (most notably, local telephone services and long-distance call providers). To be sure, governments at all levels are beginning to do the same but have far to go to match what has happened so far in the private sector. Given the diversity of tasks we ask of all governments, the opportunities to shift more work and decisions to digital tools will probably increase in number and attractiveness, particularly as various government workforces age and retire, taking with them decades of experience.
Before this book goes out of print, the issue will be faced by officials at all levels of government.

Finally, in all American governments, the historical record demonstrates that institutional cultures always trump technology. It would be difficult to overstate the importance of this reality in the way governments and higher education approached decisions regarding the nature of their use of information technologies. Beyond the silo-centric, inwardly focused perspectives pointed out in this book, indeed even earlier in this chapter, and by so many commentators on these sectors of society, IT deployment provided additional insights into behavior. Perhaps most noticeable was the lesser accountability for results when compared to what occurred in the private sector. That lesser accountability gave public officials more room to experiment—and even to be bold—without as much fear of punishment should they fail, but it also created a situation in which the sense of urgency to get things done, so frequently evident in highly competitive companies and industries, proved weaker, and in which personal accountability was less intense. This comment might seem quite harsh to public officials, but as the evidence presented in this book demonstrates, even members of their own community made this observation, including the GAO quite frequently, line management, military officers, industry associations, and university deans and presidents. With regard to the use of IT, we care about this institutional cultural attribute because it influenced the speed of adoption of an application and its form, that is to say, how it worked.

Accountability as an issue also manifested itself in ways different from what occurred in the private sector. The most important manifestation was the way Congress and state legislatures in particular would mandate activities without adequate funding, without understanding of what they were asking employees to accomplish, and with insufficient ongoing supervision. The experience of the IRS in the 1990s was one example, but these experiences existed all over the United States. For example, a legislature would, for good or even crass political reasons, demand some automated function be implemented and then, when it did not get accomplished well, criticize the agency involved or the administration in power for incompetent behavior. This practice extended far beyond digital issues, but as IT and telecommunications projects increased over time in number, size, scope, and importance, technological issues more frequently became the subject of the day.

In government in particular, there was also the dichotomy between politically appointed management and career civil servants. The former came and went fairly rapidly; it was not so unusual to see turnover among middle- and senior-level political appointees every two to three years for myriad reasons. On the other hand, civil servants frequently spent years, if not their whole careers, in one or a few agencies, developing deep knowledge of their practices, but also encumbered with a reluctance to change how they did their work. Political appointees, however, often pushed to have agencies change their operations for various reasons, and thus ran into workforces either still feeling the sting of changes they were
attempting to implement from their previous management, or resistant because they understood that the proposed transformations would not, indeed could not, work and feared being blamed for the fiasco, or because they simply did not want to change daily operations. In either circumstance, they might wait out the departure of an unpopular political appointee, knowing that it would be difficult, if not impossible, to fire them, demote them, or lower their salary. Teachers did the same with an unpopular school principal. As IT issues increasingly made it onto senior management’s list of priorities, the behavior just described played itself out in decisions about acquiring new IT tools and how best to design and use them. In short, once again, the history of uses of computing in the public sector demonstrated that technology really is as much a social phenomenon as it is ephemera of modern society.

Lest we conclude falsely that public sector culture has atrophied, we must remember that the historical record also shows that public servants were creative and extensive users of all manner of IT and telecommunications. Bursts of innovation and creativity were always possible and occurred in each decade. California was creative at the state and local level for many decades until it ran into severe budgetary problems beginning in the 1980s. The states of Washington and Oregon often led the nation in redefining the nature of public work in the 1980s, while the Clinton administration did the same in many federal agencies. Within higher education, there were always pockets of innovation and creativity within a campus and sometimes across a whole institution. The history of junior colleges in contributing to the skill set of the nation has yet to be told, but when it is, the digital hand will be seen as a critical protagonist, supported by an institutional culture willing to change and accept risks in response to the needs of local communities. Some of the most thoughtful strategic thinkers concerned with the effective use of IT in the public sector are civil servants and bright political appointees. That has been the case in each decade studied in this book.

Managers in the public sector have always shared many of the same values and concerns as their cohorts in the private sector. That is not simply a function of their having attended the same business schools or even worked in each other’s sectors. This commonality goes far to explain how public officials reacted to digital topics, despite their varied operational and cultural working environments. Often the differences in the work environment had a greater effect on how like-minded values and attitudes were implemented. In a survey of senior public and private officials conducted by IBM in 2006, researchers reported that leaders in both sectors strongly agreed on the importance of integrating business and technology (89 percent, to be precise). But only half of the public officials in the survey thought that they were doing a good job of this, while their private sector cohorts reported a much higher level of satisfaction.37 David M. Walker, Comptroller General of the United States, reflected the thinking of many public officials when, in April 2007, he stated, “We need nothing less than a top-to-bottom review of federal programs, policies, and operations,” and thus the continued transformation of government.38
Computing and Telecommunications in the American Public Sector

All industries in the United States were avid users of all manner of technology, inventing it, installing it, and changing internal work practices as a result. That was so for every industry studied for The Digital Hand, and the public sector did not deviate from this pattern. Americans have long had a ravenous appetite for information and effective performance, reinforced by the nature of their government, their capitalist economy, and their generally highly successful economic results over the past century and a half. Public workers have long been enamored with technology, from promoting the use of interchangeable parts in military weapons early in the nation’s history to sending humans to the moon. Computing and telecommunications were thus part of a much larger mosaic of technological infrastructures and materiel in American society. Within the story of the digital hand, the role of the public sector largely mimicked that of the society at large.
NOTES Chapter 1 1. John W. Kendrick, Productivity Trends in the United States (Princeton, N.J.: Princeton University Press, 1961): 612, and also for the quote at the start of the chapter. For a more detailed discussion of his ideas concerning productivity in government, see ibid., 612–621. 2. Both quotes came from Howard D. Taylor, “Automatic Data Processing in the Internal Revenue Service,” Journal of Accountancy 119 (March 1965): 53. 3. Relations with the U.S. government had three dimensions: that as customer, as source of funding for leading-edge high-risk research and development, and as regulator or litigant in antitrust cases. Published histories of IBM describe all three dimensions; see, for example, Emerson W. Pugh, Building IBM: Shaping an Industry and Its Technology (Cambridge, Mass.: MIT Press, 1995). 4. Most recently in The Digital Hand: How Computers Changed the Work of American Manufacturing, Transportation, and Retail Industries (New York: Oxford University Press, 2004). 5. See, for examples, discussions by James W. Cortada, “Economic Preconditions that Made Possible Application of Commercial Computing in the United States,” IEEE Annals of the History of Computing 19, no. 3 (1997): 27–40; Richard R. Nelson, The Sources of Economic Growth (Cambridge, Mass.: Harvard University Press, 1996): 52–83, 233–273; F. M. Scherer, New Perspectives on Economic Growth and Technological Innovation (Washington, D.C.: Brookings Institution Press, 1999); Daniel E. Sichel, The Computer Revolution: An Economic Perspective (Washington, D.C.: Brookings Institution Press, 1997); Benn Steil, David G. Victor, and Richard R. Nelson (eds.), Technological Innovation and Economic Performance (Princeton, N.J.: Princeton University Press, 2002): in particular 23–46, 49–73. 6. National Commission on Terrorist Attacks Upon the United States, The 9/11 Commission Report: Final Report of the National Commission on Terrorist Attacks Upon the United States (New York: W. W. Norton & Company, 2004): 416–417, 418. 7. U.S. Department of Commerce, Bureau of the Census, Historical Statistics of the United States: Colonial Times to 1970 (Washington, D.C.: U.S. Government Printing Office, 1975): Part 2, 1102, 1141; U.S. Government, Fiscal Year 2005 Historical Tables of the U.S. Government (Washington, D.C.: U.S. Government Printing Office, 2004): 301. U.S. government statistics, while quite accurate, are difficult to fully understand because they were published to support policy agendas, such as to demonstrate that the number of employees was actually declining. Thus, for example, the data might be presented as civilian versus military, or federal versus state and local, to show that the national 364
Notes to Pages 7–14 numbers were actually better than thought by critics. The historical tables in the Fiscal Year reports, in particular, have to be read carefully because of this phenomenon. 8. Ibid., p. 288. 9. Ibid., p. 289. 10. All described in David F. Noble, Forces of Production: A Social History of Industrial Automation (New York: Oxford University Press, 1986); Kent C. Redmond and Thomas M. Smith, From Whirlwind to Mitre: The R&D Story of the SAGE Air Defense Computer (Cambridge, Mass.: MIT Press, 2000); David L. Boslaugh, When Computers Went to Sea: The Digitization of the United States Navy (Los Alamos, Calif.: IEEE Computer Society, 1999); Paul Ceruzzi, Beyond the Limits: Flight Enters the Computer Age (Cambridge, Mass.: MIT Press, 1989). 11. The standard historical works are Kenneth Flamm, Targeting the Computer: Government Support and International Competition (Washington, D.C.: Brookings Institution Press, 1987) and his second study, Creating the Computer: Government, Industry, and High Technology (Washington, D.C.: Brookings Institution Press, 1988); Paul N. Edwards, The Closed World: Computers and the Politics of Discourse in Cold War America (Cambridge, Mass.: MIT Press, 1996); Arthur L. Norberg and Judy O’Neill, Transforming Computer Technology: Information Processing for the Pentagon, 1962–1986 (Baltimore, Md.: Johns Hopkins University Press, 1996); Jane Abbate, Inventing the Internet (Cambridge, Mass.: MIT Press, 1999); Alex Roland and Philip Shiman, Strategic Computing: DARPA and the Quest for Machine Intelligence, 1983–1993 (Cambridge, Mass.: MIT Press, 2002). There is also a large body of material written by economists and political scientists concerning the role of the federal government in investing in R&D of all kinds, but I did not rely extensively on that material for this book. 12. U.S. Department of Commerce, Historical Statistics of the United States, Part 1, p. 224. 13. U.S. Department of Commerce, U.S. Census Bureau, Statistical Abstract of the United States: 2002: The National Data Book (Washington, D.C.: U.S. Government Printing Office, 2002): 417. 14. U.S. Department of Commerce, Statistical Abstract of the United States: 2002, 146. The numbers should not be taken as absolutes, rather as approximations because even the same source for this information provides contradictory and confusing data. For example, compare the data on page 146 with that presented on page 149. Yet, the key finding holds, namely, that there were many students. 15. Ibid., 146. 16. U.S. Department of Commerce, Historical Statistics of the United States, Part 1, p.382; U.S. Department of Commerce, Statistical Abstract of the United States: 2002, 165. 17. U.S. Department of Commerce, Historical Statistics of the United States, Part 1, p.383; U.S. Department of Commerce, Statistical Abstract of the United States: 2002, 165. 18. For example, David Osborne and Ted Gaebler, Reinventing Government: How the Entrepreneurial Spirit Is Transforming the Public Sector (Reading, Mass.: Addison-Wesley, 1992): 146–165; David K. Carr, Ian D. Littman, and John K. Condon, Improvement Driven Government: Public Service for the 21st Century (Washington, D.C.: Coopers & Lybrand, 1995): 311–334; John M. Kamensky and Albert Morales (eds.), Managing for Results 2005 (Lanham, Md.: Rowman & Littlefield, 2005): 1–14, passim. 19. We now have an excellent, indeed essential breakthrough in our understanding of the services sector from an economic perspective, with the publication of Jack E. Triplett and Barry P. 
Bosworth, Productivity in the U.S. Services Sector: New Sources of Economic Growth (Washington, D.C.: Brookings Institution Press, 2004); it includes extensive discussions of data sets produced by various U.S. government agencies. 20. The undercounting of computing’s effects on the economy is a major theme of Graham Tanaka, Digital Deflation: The Productivity Revolution and How It Will Ignite the
365
366
Notes to Pages 16–22 Economy (New York: McGraw-Hill, 2004), in which he argues that the entire economy as a whole improved: “As the Digital Revolution percolated throughout the economy, all kinds of things were getting faster, better, and cheaper” (25).
Chapter 2 1. John Chalykoff and Nitin Nohria, The Internal Revenue Service: Automated Collection System, Harvard Business School case study 9–490–042, revised 7/16/90 (Boston: Harvard Business School Press, 1990): 2, and also for lead quote for chapter. 2. H. Thomas Johnson and Robert S. Kaplan, Relevance Lost: The Rise and Fall of Management Accounting (Boston: Harvard Business School Press, 1987): 183–207; Martin Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry (Cambridge, Mass.: MIT Press, 2003): 139, 156–161. 3. Howard D. Taylor, “Automatic Data Processing in the Internal Revenue Service,” The Journal of Accountancy 119 (March 1965): 53. 4. United States General Accounting Office, Data Mining: Federal Efforts Cover a Wide Range of Uses, GAO-04–548 (Washington, D.C.: U.S. General Accounting Office, May 2004): 51–52. 5. Budget of the United States Government, Fiscal Year 1999 (Washington, D.C.: U.S. Government Printing Office, 1998): 10; for how budgets and staff were allocated to accomplish this mission, see comments by the IRS Commissioner during the Clinton administration, Charles O. Rossotti, Many Unhappy Returns: One Man’s Quest to Turn Around the Most Unpopular Organization in America (Boston: Harvard Business School Press, 2005): 301–305. 6. Vico Henriques, “Automatic Data Processing in State and Federal Government,” in Alan D. Meacham (ed.), Data Processing Yearbook (Detroit: American Data Processing, 1962): 154–155. 7. Taylor, “Automatic Data Processing in the Internal Revenue Service,” 54. It should be noted, however, that at the time he made this comment, only about 40 percent of individual paper returns were treated this way. In fact, not until the arrival of electronic returns many years later could 100 percent of the data be captured in a fully automated way. 8. Ibid., 55. 9. This remained a constant theme all through the 1980s and 1990s in GAO’s assessments of the IRS, but see in particular, General Accounting Office, ADP Modernization: IRS’ Tax System Redesign Progress and Plans for the Future, GAO/IMTEC-88–23BR (Washington, D.C.: General Accounting Office, 1988). 10. The crisis is well documented by Helen Margetts, Information Technology in Government: Britain and America (London: Routledge, 1999): 89–91. 11. “A Proposed Automated Tax Administration System for Internal Revenue Service—An Evaluation of Costs and Benefits,” prepared by the Comptroller General as a Report to Congress, November 22, 1976, p. 11, available at http://archive.gao.gov/f0402/ 100046.pdf (last accessed 1/4/2005). 12. The main lines of the story have been described by Margetts, Information Technology in Government, 91–96. 13. Ibid., 94. 14. Jon Bakija and Eugene Steuerle, “Individual Income Taxation Since 1948,” National Tax Journal 44, no. 4, Part 2 (December 1991): 451–475. 15. Ibid., 107. 16. Rossotti, Many Unhappy Returns, 17. 17. Ibid., 196. 18. “Statement of Michael P. Dolan, Deputy Commissioner Internal Revenue Service, Testimony Before the Subcommittee on Oversight of the House Committee on Ways and
Means," March 18, 1997, http://waysandmeans.house.gov/legacy/.oversite/105cong/3–18–97/3–18dola.htm (last accessed 1/2/2005): 1.
19. Ibid. The entire hearings provide much data on IRS IT operations, circa 1994–1997.
20. National Commission on Restructuring the Internal Revenue Service, A Vision for a New IRS (Washington, D.C.: U.S. Government Printing Office, June 25, 1997): 25.
21. Ibid., 27.
22. Ibid., quotes pp. 25, 26.
23. Rossotti, Many Unhappy Returns, 202.
24. Internal Revenue Service, Guide to the Internal Revenue Service for Congressional Staff, Publication 1273 (Washington, D.C.: U.S. Internal Revenue Service, December 1990): 34.
25. Internal Revenue Service, Guide to the Internal Revenue Service for Congressional Staff, Publication 1273 (Washington, D.C.: U.S. Internal Revenue Service, January 1993): 12.
26. Internal Revenue Service, Guide to the Internal Revenue Service for Congressional Staff, Publication 1273 (Washington, D.C.: U.S. Internal Revenue Service, March 1996): 5.
27. The IRS Commissioner of the late 1990s provided a useful brief account of the evolution of e-filing during his years at the agency: Rossotti, Many Unhappy Returns, 139–144.
28. Internal Revenue Service, Guide to the Internal Revenue Service for Congressional Staff, Publication 1273 (Washington, D.C.: U.S. Internal Revenue Service, January 1999): 31–33; National Commission on Restructuring the Internal Revenue Service, A Vision for a New IRS, 29–34.
29. Internal Revenue Service, "History of e-file," http://www.irs.gov/efile/article/0,,id=120353,00.html (last accessed 1/8/2005).
30. IRS to author, March 15, 2006.
31. U.S. General Accounting Office, Paperwork Reduction Act: Burden Increases at IRS and Other Agencies, GAO/T-GGD-00–114 (Washington, D.C.: U.S. General Accounting Office, April 12, 2000): 1.
32. Ibid., 4.
33. Core budgets at the IRS had been around $1.5 billion annually, and this figure included expenses of maintaining old systems and many core IT expenses. The BSM budget was (and still is) a separate budget on top of the $1.5 billion.
34. U.S. General Accounting Office, Data Mining: Federal Efforts Cover a Wide Range of Uses, GAO-04–548 (Washington, D.C.: U.S. General Accounting Office, May 2004): 52; Analytical Perspectives: Budget of the United States Government Fiscal Year 2004 (Washington, D.C.: U.S. Government Printing Office, 2003): 407, 431.
35. Austan Goolsbee, "The TurboTax Revolution: Can Technology Solve Tax Complexity?" in Henry J. Aaron and Joel Slemrod (eds.), The Crisis in Tax Administration (Washington, D.C.: Brookings Institution Press, 2004): 124–147.
36. B. Heather, "Treasury Dept.'s Commercial Steak Dazzles Industry," Newsbytes News Network, May 26, 1998, http://global.factiva.com/en/arch/print_results.asp (last accessed 1/14/2005); Rossotti, Many Unhappy Returns, 195–237.
37. "Scanning Makes IRS Ops Less Taxing," Automatic I.D. News, April 1, 1998, http://global.factiva.com/en/arch/print_results.asp (last accessed 1/14/2005).
38. The interaction between existing systems, new applications, and emerging technologies has been discussed almost from the dawn of computers. The most widely read of the early discussions, which has remained in print and is now a classic in computer science, is Frederick P. Brooks, Jr., The Mythical Man-Month: Essays on Software Engineering, originally published in 1975; the authoritative edition appeared in 1982 (Reading, Mass.: Addison-Wesley, 1982).
39.
For examples of the studies, see General Accounting Office, ADP Modernization: IRS’ Tax System Redesign Progress and Plans for the Future, GAO/IMTEC-88–23BR
(Washington, D.C.: U.S. General Accounting Office, 1988); ADP Modernization: IRS Needs to Assess Design Alternatives for Its Electronic Filing System, GAO/IMTEC-89–33 (Washington, D.C.: U.S. General Accounting Office, 1989); Tax System Modernization: IRS' Challenge for the 21st Century, GAO/IMTEC-90–13 (Washington, D.C.: U.S. General Accounting Office, 1990); Tax System Modernization: Status of IRS' Input Processing Initiative, GAO/IMTEC-91–9 (Washington, D.C.: U.S. General Accounting Office, 1990); Tax System Modernization: Further Testing of IRS' Automated Taxpayer Service Systems Is Needed, GAO/IMTEC-91–42 (Washington, D.C.: U.S. General Accounting Office, 1991); Tax System Modernization: An Assessment of IRS' Design Master Plan, GAO/IMTEC-91–53BR (Washington, D.C.: General Accounting Office, 1991). Similar studies were prepared throughout the 1990s and early 2000s at the rate of at least one per year.
40. Rossotti, Many Unhappy Returns, 210.
41. Ibid.
42. Darrell M. West, Digital Government: Technology and Public Sector Performance (Princeton, N.J.: Princeton University Press, 2005): 82–101.
43. Kenneth C. Laudon, Computers and Bureaucratic Reform (New York: John Wiley & Sons, 1974); Kenneth L. Kraemer and John Leslie King, Computers and Local Government, vol. 2, A Review of the Research (New York: Praeger, 1977); Kenneth L. Kraemer, John Leslie King, Debora E. Dunkle, and Joseph P. Lane, Managing Information Systems: Change and Control in Organizational Computing (San Francisco: Jossey-Bass, 1989). This last book includes a lengthy bibliography on the theme of IT management in the public sector.
44. Harry H. Fite, The Computer Challenge to Urban Planners and State Administrators (Washington, D.C.: Spartan Books, 1965): 4–5, which also summarizes all prior inventories of state use.
45. Ibid., 4.
46. Ibid., 13.
47. Dennis G. Price and Dennis E. Mulvihill, "The Present and Future Use of Computers in State Government," Public Administration Review 25, no. 2 (June 1965): 145.
48. N. P. Himbert, "The State of Louisiana Computer Time-Sharing System," Data Processing Proceedings 1967 (Boston: Data Processing Management Association, 1967): 346–347.
49. Fite, The Computer Challenge, 90–91. On Total Systems, see Joseph I. Weinrebe, "The Information Utility: A Tool for Information Systems," APICS Quarterly Bulletin 6, no. 2 (April 1965): 53–57; H. E. Markley, "Manufacturing by the Numbers: A Total Systems Concept of Manufacturing; Address, September 16, 1970," Vital Speeches 37 (December 15, 1970): 143–145; W. M. A. Brooker, "The Total Systems Myth," Systems and Procedures Journal 16, no. 4 (July–August 1965): 28–32; John Dearden, "Can Management Information Be Automated?" Harvard Business Review 42, no. 2 (March–April 1964): 128–135; Felix Kaufman, "Data Systems That Cross Companies' Boundaries," Harvard Business Review 44, no. 1 (January–February 1966): 141–145.
50. James W. Cortada, Before the Computer: IBM, NCR, Burroughs, and Remington Rand and the Industry They Created, 1865–1956 (Princeton, N.J.: Princeton University Press, 1993).
51. Clara Penniman, State Income Taxation (Baltimore: Johns Hopkins University Press, 1980): 266.
52. Benjamin A. Henzey and Richard A. Roadarmel, "A Comparative Analysis of State Individual Income Tax Enforcement Procedures," National Tax Journal 34 (June 1981): 207–216.
53. Keith Snavely, "Innovations in State Tax Administration," Public Administration Review 48, no. 5 (September–October 1988): 907; see also the entire article, 903–910.
54.
Ibid., 907–908.
55. Ibid., 909.
56. Tod Newcombe, "Finance Goes High Tech: The Electronic Tax Collector," Governing 13, no. 6 (November 1993): 63.
57. West, Digital Government, 12–70.
58. See the special section called "eGoverning," Governing 13, no. 12 (September 2000): 33–64.
59. Ibid., 40.
60. West, Digital Government, 98–99.
61. The IRS faced the same situation but negotiated an agreement with major tax preparers to provide free e-filing. The deal is described by the IRS Commissioner in his memoirs, Rossotti, Many Unhappy Returns, 143–144.
62. Ellen Perlman, "The People Connection," Governing 15, no. 12 (September 2002): 32–42. On who should pay for e-filing, see Christopher Swope, "Fee or Free?" Governing 14, no. 10 (July 2001): 46–47; "A Taxing Time," Governing 15, no. 7 (April 2002): 72.
63. Kent Lassman, The Digital State 2002: How State Governments Use Digital Technologies (Washington, D.C.: The Progress & Freedom Foundation, November 2002): 9–10.
64. "Electronic Income Tax Filing Grows in Importance at the State Level," FTA Bulletin, B-28/04, December 1, 2004.
65. A Gartner survey conducted in 2002 provided evidence that the greatest number of e-government initiatives centered on administrative and financial operations (85 percent of respondents), followed by transportation (75 percent), public safety (50 percent), and human services and criminal justice (both 40 percent). Rishi Sood, Trends in U.S. State and Local Governments: Market Trends (Stamford, Conn.: Gartner, Inc., March 19, 2002): 32.
66. Christopher Swope, "E-conomics Problem," Governing 13, no. 6 (March 2000): 20–21; Penelope Lemov, "The Untaxables," Governing 15, no. 10 (July 2002): 36–37. For a sense of the intensifying debate in its early stages, see Reuven S. Avi-Yonah, "International Taxation of Electronic Commerce," Tax Law Review 52, no. 3 (Spring 1997): 507–555; Karl Frieden, Cybertaxation: The Taxation of E-Commerce (Chicago: CCH Inc., 2000); Walter Hellerstein, "State Taxation of Electronic Commerce," Tax Law Review 52, no. 3 (Spring 1997): 425–505; Frances M. Horner and Jeffrey Owens, "Tax and the Web: New Technology, Old Problems," Bulletin for International Fiscal Documentation 50, no. 11/12 (November–December 1996): 516–523; Michael J. McIntyre, "Taxing Electronic Commerce Fairly and Efficiently," Tax Law Review 52, no. 4 (Summer 1997): 625–654; Charles E. McLure, Jr., "Taxation of Electronic Commerce: Economic Objectives, Technological Constraints, and Tax Law," Tax Law Review 52, no. 3 (Spring 1997): 269–423, and other articles by him: "Electronic Commerce, State Sales Taxation, and Intergovernmental Fiscal Relations," National Tax Journal 50, no. 4 (December 1997): 731–749; "Electronic Commerce and the Tax Assignment Problem: Preserving State Sovereignty in a Digital World," State Tax Notes 14, no. 15 (April 13, 1998): 1169–1181; "Electronic Commerce and the U.S. Sales Tax: A Challenge to American Federalism," International Tax and Public Finance 6, no. 2 (May 1999): 193–224; and "The Taxation of Electronic Commerce: Background and Proposal," in Nicholas Imparato (ed.), Public Policy and the Internet: Privacy, Taxes and Contracts (Stanford, Calif.: Hoover Institution Press, 2000): 49–113.
67. Fite, The Computer Challenge, 5–6.
68. Ibid., 7.
69. Ibid., 12–13.
70. Rob Kling and Kenneth L. Kraemer, "Computing and Urban Services," in James N. Danziger, William H. Dutton, Rob Kling, and Kenneth L.
Kraemer (eds.), Computers and Politics: High Technology in American Local Governments (New York: Columbia University
Press, 1982): 197–200, 213; Lloyd Hackett, "A Small Computer for a Little City," American City 84, no. 6 (June 1969): 114, 116–117; Bert Gibbons, "Why We Went to Computerized Billing," American City 85, no. 1 (January 1970): 67, 118; Donald K. Price, "Small Size Is No Excuse," American City 86, no. 5 (May 1971): 60–62; Daniel J. Giordano, "Daily Control Replaces Cycle Billing," American City 86, no. 5 (May 1971): 109–110.
71. As communities upgraded their accounting processes, officials often reported the changes themselves in the government press. See, for example, J. H. Blackburn, "Mechanized Tax Accounting," American City 65, no. 12 (December 1950): 118–119; E. T. Creagh, "Machine Billing in Niagara Falls," ibid., 66, no. 9 (September 1951): 133; Clarke Gray, "Machines Speed County's 40,000 Tax Bills," ibid., 67, no. 1 (January 1952): 155; Stephen A. Roake and August F. Vangerow, "New Machines Speed Yonkers Tax Work," ibid., 67, no. 3 (March 1952): 108–109; Barney Franklin, "Simple Machine-Accounting Pleases Kerrville, Texas," ibid., 67, no. 5 (May 1952): 124–125; J. W. Hartman, "Los Angeles County Tax Billing," ibid., 68, no. 8 (June 1952): 83–85; Walter H. Lundstrom, "Machine Billing and Receipting of Huntington, N.Y., Taxes," ibid., 68, no. 11 (November 1952): 93; Ben Gentle, "251,917 Tax Bills in 14 Days," ibid., 69, no. 1 (January 1954): 118; Ben Nordberg, "Tax Machines Save Jackson County $25,000 A Year," ibid., 69, no. 3 (March 1954): 135; Rufus E. Deering, "New Tax System Puts County Two Months Ahead," ibid., 69, no. 9 (September 1954): 109–110; E. Glenn Henning, "Orlando's Machines Cut Billing Costs One-Third," ibid., 70, no. 10 (October 1955): 144–145; Glen E. Thompson, "Modern Accounting Helps Prepare 188,000 Tax Bills in 37 Days," ibid., 71, no. 9 (September 1956): 147–148; J. Ward Carter, "Weston, Mass., Bills and Bookkeeps on a Single Machine," ibid., 72, no. 9 (September 1957): 130–131.
72. Harry G. McDowell, "Punch Put to Philadelphia Tax Rolls," ibid., 73, no. 10 (October 1958): 116–117.
73. Salvatore J. Romeo, "Commercial Univac Writes Patchogue's Tax Bills," ibid., 74, no. 5 (May 1959): 213.
74. See, for example, the case of St. Louis: John H. Poelker, "Univac Figures St. Louis Personal Taxes," ibid., 74, no. 8 (August 1959): 97; on New York, see Edwin Brenman, "Data Processing in a Large Municipality," in Data Processing Annual (Detroit: Gille Associates, 1961): 171–174. New York's system maintained, among many uses, real estate assessment records, lists of exempt and partially exempt properties, tax billing, real estate collection and accounting, special assessments, receipts, and tax accounting reports.
75. For an early example of this integrative process, see Leo J. Fetzer, "Instant Tax Billing," American City 77, no. 8 (August 1962): 90–92, but see also Robert E. Develle, "Electronic Data Processing as a Management Tool in Revenue Administration," in Geoffrey Y. Cornog, James B. Kenney, Ellis Scott, and John J. Connelly (eds.), EDP Systems in Public Administration (Chicago: Rand McNally, 1968): 197–202.
76. "Data Processing Survey Shows Use Is Multiplying," American City & County 98, no. 11 (November 1983): 42–44.
77. Ibid., 44.
78. Ibid., 43.
79. A great deal on GIS, and also on cost, is in Toregas and Taly Walsh, "Out with the Old, In with Re-engineering," American City & County 108, no. 5 (May 1993): 49.
80. For an example of the pattern at work, see "Chicago County Goes Interactive with Taxes, Fines," ibid., 110, no.
8 (July 1995): 16.
81. For an overview of the issues, see Robert Tannenwald, "Are State and Local Revenue Systems Becoming Obsolete?" National Tax Journal 55 (September 2002): 467–489.
82. H&R Block, "H&R Block, A History," http://www.H&RBlock.com (last accessed 1/5/2005).
83. Ibid.
84. There is now a history of Intuit: Suzanne Taylor, Kathy Schroeder, and John Doerr, Inside Intuit: How the Makers of Quicken Beat Microsoft and Revolutionized an Entire Industry (Boston: Harvard Business School Press, 2003).
85. Marsha Blumenthal and Charles Christian, "Tax Preparers," in Aaron and Slemrod, The Crisis in Tax Administration, 201.
86. Ibid., 202.
87. Ibid., 212.
88. Susan B. Anders and Carol M. Fischer, "A Hard Look at Tax Software," CPA Journal 72, no. 11 (November 2002), online version, http://global.factiva.com/en/arch/print_results.asp (last accessed 1/14/2005).
89. Susan B. Anders and Carol M. Fischer, "A Hard Look at Tax Software: 2003 Annual Survey of New York Practitioners," ibid., 73, no. 11, online version, http://global.factiva.com/en/arch/print_results.asp (last accessed 1/14/2005).
90. Ibid.; "Electronic Income Tax Filing Grows in Importance at the State Level," FTA Bulletin, B-28/04, December 1, 2004, pp. 1–6.
91. Stanley Zarowin, "Users Size Up Tax Software," AICPA, October 2004, http://www.aicpa.org/org/pubs/jofa/oct2004/zarowin.htm (last accessed 1/8/2005).
92. Matt Sedensky of the Associated Press reported on this growing trend of leveraging the Internet to reach new customers: "Teens Treat IRS Forms Like Hot Potatoes," Wisconsin State Journal, February 28, 2005, p. A3.
93. Lee S. Sproull, "Computers in U.S. Households Since 1977," in Alfred D. Chandler, Jr., and James W. Cortada (eds.), A Nation Transformed by Information: How Information Has Shaped the United States from Colonial Times to the Present (New York: Oxford University Press, 2000): 257–280.
94. Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog, 294–300.
95. Goolsbee, "The TurboTax Revolution," 128.
96. Ibid., 128–129, 134–135.
97. Mary Dalrymple, "Free Tax Programs Available Online," Wisconsin State Journal, January 19, 2005, p. C10.
98. For example, "Why File Electronically?" Intuit's presentation, http://www.turbotax.com/articles/WhyFileElectronically.html (last accessed 1/8/2005).
99. Robert J. Wells and James D. Keene, "Computer Use in Corporate Tax Departments," Tax Executive 41, no. 3 (Spring 1989): 257–262.
100. Rossotti, Many Unhappy Returns, 291.
Chapter 3
1. Key studies include: Kent C. Redmond and Thomas M. Smith, Project Whirlwind: The History of a Pioneer Computer (Bedford, Mass.: Digital Press, 1980) and their sequel, From Whirlwind to MITRE: The R&D Story of the SAGE Air Defense Computer (Cambridge, Mass.: MIT Press, 2000); Arthur L. Norberg and Judy E. O'Neill, Transforming Computer Technology: Information Processing for the Pentagon, 1962–1986 (Baltimore: Johns Hopkins University Press, 1996); Paul N. Edwards, The Closed World: Computers and the Politics of Discourse in Cold War America (Cambridge, Mass.: MIT Press, 1996); James S. Small, The Analogue Alternative: The Electronic Analogue Computer in Britain and the USA, 1930–1975 (London: Routledge, 2001); Alex Roland and Philip Shiman, Strategic Computing: DARPA and the Quest for Machine Intelligence, 1983–1993 (Cambridge, Mass.: MIT Press, 2002); Stephen B. Johnson, The United States Air Force and the Culture of Innovation (Washington,
D.C.: U.S. Government Printing Office, 2002); James A. Tomayko, The Story of Self-Repairing Flight Control Systems (Edwards, Calif.: NASA Dryden Research Center, 2003). Although not written by an historian, a key study on the Pentagon's efforts is by Kenneth Flamm, Creating the Computer: Government, Industry and High Technology (Washington, D.C.: Brookings Institution, 1988).
2. In the period 1975 through 2005, for example, the national defense budget ranged between 3 and 6.1 percent of the U.S. Gross Domestic Product, rising higher in wartime and declining in peacetime. DoD employed between 1.4 and 3.3 percent of the American workforce, varying over time depending on when the nation was at war (1970s), and it also outsourced many civilian functions (1990s). "Defense Shares of Economic and Budgetary Aggregates," drawn from the U.S. Defense Budget, http://www.dod.mil/comptroller/defbudget/fy2005/fy2005_greenbook.pdf (last accessed 6/1/05).
3. Roger R. Trask and Alfred Goldberg, The Department of Defense, 1947–1997: Organization and Leaders (Washington, D.C.: Historical Office, Office of the Secretary of Defense, 1997); Johnson, The United States Air Force and the Culture of Innovation.
4. Office of the Under Secretary of Defense, National Defense Budget Estimates for FY 2005 (Washington, D.C.: United States Government Printing Office, 2005): 205.
5. Martin Campbell-Kelly and William Aspray, Computer: A History of the Information Machine (New York: Basic Books, 1996): 202; Paul E. Ceruzzi, A History of Modern Computing (Cambridge, Mass.: MIT Press, 1998): 91, 112, 258, 289; Alfred D. Chandler, Jr., Inventing the Electronic Century: The Epic Story of the Consumer Electronics and Computer Industries (New York: Free Press, 2001): 83–85; Flamm, Creating the Computer, 29–79; Douglas S. Meade, "Defense Spending in the Context of the U.S. Economy: 1987–2003," INFORUM (April 1998): 1–8; Arthur L. Norberg, "The Shifting Interests of the US Government in the Development and Diffusion of Information Technology Since 1943," in Richard Coopey (ed.), Information Technology Policy: An International History (Oxford: Oxford University Press, 2004): 24–53.
6. Historians are now learning that the U.S. was not alone in its intensive investment in R&D for computing for military reasons. They are discovering, for example, that the U.S.S.R. also had an impressive development program. See Simon Berkovich, "Reminiscences of Superconductive Associative Memory Research in the Former Soviet Union," IEEE Annals of the History of Computing 25, no. 1 (January–March 2003): 72–75; Slava Gerovitch, "Mathematical Machines of the Cold War: Soviet Computing, American Cybernetics and Ideological Disputes in the Early 1950s," Social Studies of Science 31, no. 2 (April 2001): 253–288, and also his From Newspeak to Cyberspeak: A History of Soviet Cybernetics (Cambridge, Mass.: MIT Press, 2002); Loren R. Graham, What Have We Learned about Science and Technology from the Russian Experience? (Stanford, Calif.: Stanford University Press, 1998); Valery Katkalo and David C. Mowery, "Institutional Structure and Innovation in the Emerging Russian Software Industry," in David C. Mowery (ed.), The International Computer Software Industry: A Comparative Study of Industry Evolution and Structure (New York: Oxford University Press, 1996): 240–271; Peter Wolcott, "Soviet Advanced Technology: The Case of High-Performance Computing" (Ph.D. dissertation, University of Arizona, 1993).
On activities in Central Europe, see the IEEE Annals of the History of Computing 21, no. 3 (1999), which published a group of articles on activities in Romania, Slovakia, Hungary, Belarus, and Lithuania.
7. Edwards, The Closed World, 1–42.
8. Ibid., 1.
9. RFID stands for Radio Frequency Identification; the device is normally called an RFID tag. It is a silicon chip with an antenna that can be attached to inventory, for example, so that items can be tracked via a radio signal. One sees this in use today by car and truck drivers who mount a device on their windshields to track and pay highway tolls.
10. Ceruzzi, A History of Modern Computing, 7, 15, 18, 20–21.
11. Edwards, The Closed World, 47.
12. The Lincoln Laboratory's experience is relevant here; see William Z. Lemnios and Alan A. Grometstein, "Overview of the Lincoln Laboratory Ballistic Missile Defense Program," Lincoln Laboratory Journal 13, no. 1 (2002): 9–31.
13. Kenneth Flamm, Targeting the Computer: Government Support and International Competition (Washington, D.C.: Brookings Institution, 1987): 6–8.
14. The Air Force even developed research tools using early computers to help do the job, deploying a Univac 90. Richard Hunt Brown, "The Armed Services Technical Information Agency" (New York: Automation Consulting, 1961): III G15-1–G15-7, CBI 55, "Market Reports," Box 70, Folder 3, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis. But see also David L. Boslaugh, When Computers Went to Sea: The Digitization of the United States Navy (Los Alamitos, Calif.: IEEE Computer Society, 1999) for the Navy, and Johnson, The United States Air Force and the Culture of Innovation, for the USAF.
15. Office of Naval Research, A Survey of Large Scale Digital Computers and Computer Projects (Washington, D.C.: Department of the Navy, 1950); David K. Allison, "The Origins of the Naval Research Laboratory," U.S. Naval Institute Proceedings (July 1979): 62–69; Fred D. Rigby, "Tailored Electronic Data Processing Equipment," in Lowell H. Hattery and George P. Bush, eds., Electronics in Management (Washington, D.C.: University Press of Washington, D.C., 1956): 31–37; Mina Rees, "The Computing Program of ONR, 1946–1953," Annals of the History of Computing 4, no. 2 (April 1982): 102–120, and her "The Computing Program of the Office of Naval Research," Communications of the ACM 30 (1987): 832–848. The Naval Research Laboratory (NRL) was also active in R&D involving computing; for details, see U.S. Naval Research Laboratory, Pushing the Horizon: Seventy-Five Years of High Stakes Science and Technology at the Naval Research Laboratory (Washington, D.C.: U.S. Government Printing Office, 1998).
16. Edwards, The Closed World, 61–62.
17. James W. Cortada, The Computer in the United States: From Laboratory to Market, 1930 to 1960 (Armonk, N.Y.: M. E. Sharpe, 1993): 27–63.
18. Emerson W. Pugh, Building IBM: Shaping an Industry and Its Technology (Cambridge, Mass.: MIT Press, 1995): 167–172, 200–219; "We Build 'Brains' for Defense," Business Machines (July 1, 1954): 3–5, IBM Archives, Somers, New York.
19. Edwards, The Closed World, 61; Arthur L. Norberg, Computers and Commerce: A Study of Technology and Management of the Eckert-Mauchly Computer Company, and Engineering Research Associates, and Remington Rand, 1946–1957 (Cambridge, Mass.: MIT Press, 2005): 154, 231–232; Richard Hunt Brown, "Univac 120 in Statistics Work" (refers to the Army Chemical Corps) (New York: Automation Consulting, 1957): III G7-1–G7-7, CBI 55, "Market Reports," Box 70, Folder 17, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis.
20. Flamm, Targeting the Computer, 6–8; Edwards, The Closed World, 61.
21. The story is well told by Janet Abbate, Inventing the Internet (Cambridge, Mass.: MIT Press, 1999).
22. By the 1980s, the lead on applied R&D had shifted to the private sector, where commercial applications set the pace for innovation in IT.
23. Office of Technology Assessment, Department of Defense Federally Funded Research and Development Centers (Washington, D.C.: U.S. Government Printing Office, June 1995): 10.
24. Office of Technology Assessment, Department of Defense Federally Funded Research and Development Centers, is the most complete source on the topic.
25. Flamm, Targeting the Computer, 42–124.
26. David Alan Grier, When Computers Were Human (Princeton, N.J.: Princeton University Press, 2005): 220–317.
27. Tomayko, The Story of Self-Repairing Flight Control Systems, 1–44; Mark A. Lorell and Hugh P. Levaux, The Cutting Edge: A Half Century of U.S. Fighter Aircraft R&D (Santa Monica, Calif.: RAND Corporation, 1998), available at http://www.rand.org/publications/MR/MR939/index.html (last accessed 3/22/05); Jacob Neufeld, The Development of Ballistic Missiles in the United States Air Force, 1945–1960 (n.p.: University Press of the Pacific, 2004).
28. Edwards, The Closed World, 71.
29. Ibid., 75–112, provides one of the most useful summaries, complete with bibliography.
30. Ibid.; on Ada, see R. M. Graham, "Ada—The Billion-Dollar Language," Abacus 1, no. 2 (1984): 7–21; Herbert R. J. Grosch, "Ada's First Stirring," Annals of the History of Computing 11, no. 1 (1989): 54.
31. David Talbot, "How Tech Failed in Iraq," Technology Review (November 2004): 36–42, 44.
32. Edwards, The Closed World, 286–288.
33. Neufeld, The Development of Ballistic Missiles in the United States Air Force, 132.
34. Ibid., 231.
35. The computer industry depended on military funding until the 1960s, when the technology had matured enough that commercial opportunities made it possible for companies increasingly to fund R&D and absorb the risk of developmental failures with private sector money. Gerald W. Brock, The Second Information Revolution (Cambridge, Mass.: Harvard University Press, 2003): 83–111; Jeffrey Yost, The Computer Industry (Westport, Conn.: Greenwood Press, 2005): 27–51.
36. Lorell and Levaux, The Cutting Edge, 126–127.
37. Ibid., 130.
38. This situation was particularly well demonstrated by the development of stealth aircraft. The nature of the technology determined which vendors were qualified to develop it, given their prior expertise and willingness to move toward this new form of aircraft and avionics, and the effort took decades to flower: work under way since the 1970s produced no new aircraft flying until the late 1980s. This process for stealth aircraft is described by Lorell and Levaux, The Cutting Edge, 129–153.
39. The mindset of Pentagon officials, and the consequent process that has extended into the new century, is described by Roland and Shiman, Strategic Computing, 319–331.
40. Tomayko, The Story of Self-Repairing Flight Control Systems, 45–48.
41. DoD defines logistics as "the science of planning and carrying out the movement and maintenance of forces." There are six aspects of logistics: supply, maintenance, transportation, civil engineering, health services, and other services. For details, see United States Government Accountability Office, Defense Logistics: Actions Needed to Improve the Availability of Critical Items during Current and Future Operations (Washington, D.C.: U.S. Government Accountability Office, April 2005).
42. This set of processes was the central set of applications I described in the first volume of The Digital Hand: How Computers Changed the Work of American Manufacturing, Transportation, and Retail Industries (New York: Oxford University Press, 2004).
43. Office of Management and Budget, Information Resources Management Plan of the Federal Government (Washington, D.C.: U.S. Government Printing Office, 1992): IV-37.
44. Defense Logistics Agency, "History of the Defense Logistics Agency," http://www.dla.mil/history/storyboard.htm (last accessed 5/8/05); A.J. Allott, "ADP in the U.S.
Army Materiel Command,” Data Processing Proceedings 1963 (Detroit: Data Processing Management Association, 1963): 236–253.
45. Automation Consultants, "Personnel Inventory Control on the IBM 705," undated case study (circa 1957), CBI 55, "Market Reports," Box 70, Folder 17, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis.
46. Automation Consultants, "The IBM 705 in Ships and Parts Control," 1957, CBI 55, "Market Reports," Box 70, Folder 3, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis.
47. Automation Consultants, "Inventory Management Using the IBM 705," undated (circa 1957), CBI 55, "Market Reports," Box 70, Folder 3, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis; Allott, "ADP in the U.S. Army Materiel Command," 247–248.
48. Automation Consultants, "Univac 120 in Statistics Work," 1957, CBI 55, "Market Reports," Box 70, Folder 4, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis; "A 650 at Work in the 50th State," ibid., Box 70, Folder 3; "Computer Commands Naval Supplies," undated (circa 1959–1960), ibid.; "Controls World-Wide Supplies," Business Machines (January 1960): 18, IBM Archives, Somers, N.Y.
49. "World's Largest Purchasing Agency," Business Machines (October 1960): 10, IBM Archives, Somers, N.Y.
50. Ibid.
51. Gilbert Burck et al., The Computer Age (New York: Harper & Row, 1965): 9.
52. Allott, "ADP in the U.S. Army Materiel Command," 249.
53. Defense Logistics Information Service, "DLIS History," http://www.dlis.dla.mil/history.asp (last accessed 5/8/05).
54. Barry J. Shillito, "Defense Logistics: Challenge of the 1970s," Defense Management Journal 9, no. 1 (January 1973): 2.
55. Blue Ribbon Defense Panel, Report to the President and the Secretary of Defense on the Department of Defense by the Blue Ribbon Defense Panel, 1 July 1970 (Washington, D.C.: U.S. Government Printing Office, 1970): 152.
56. Roland E. Berg, "Serial Number Tracking Is More Than a Numbers Game," Defense Management Journal 21, no. 2 (Second Quarter 1985): 37–41.
57. Office of Management and Budget, Information Resources Management Plan of the Federal Government (Washington, D.C.: U.S. Government Printing Office, November 1991): IV-37, but see the entire section for IT plans at DLA, IV-37–41, and the 1993 edition, ibid. (Washington, D.C.: U.S. Government Printing Office, December 1993): IV-33–35, also for the Defense Information Systems Agency, ibid., IV-37–38.
58. J. Michael Brower, "The Promise of E-Commerce to Defense: The Road Ahead (to Savings!)," The Public Manager (Spring 2001): 39–40.
59. GAO, Defense Logistics, 1.
60. Ibid., 39.
61. Ibid., 40.
62. Ibid., 44.
63. Ibid., 45–47.
64. Ibid., 51.
65. Defense Logistics Agency, "Information Technology: Information Operations Initiatives," undated, http://www.dla.mil/j-6/initiatives.asp (last accessed 5/8/05); see also the entire issue of Army Alert (March–April 2005).
66. A potential fourth wave, still in its experimental stages, involves the application of smart bomb technologies to ever smaller ordnance, such as intelligent bullets, which have yet to go into general use.
67. Not to be confused with the notion of the Electronic Battlefield, which can include such nondigital electronics as radio communications and analog electronic sensors, neither of which requires computers to function.
68. William T. Moye, "ENIAC: The Army-Sponsored Revolution," January 1996, http://www.ftp.arl.mil/mike/comphist/96summary/index.html (last accessed 3/22/05).
69. Ibid., for details; this is the Ordnance Corps' Web site.
70. Ibid.; K. Kempf, Electronic Computers Within the Ordnance Corps (Aberdeen, Md.: U.S. Army Ordnance, Aberdeen Proving Ground, November 1961); R. E. Meagher and J. P. Nash, "The ORDVAC," Report on AIEE-IRE Computer Conference (February 1952): 37–43; James E. Robertson, "The ORDVAC and ILLIAC," in N. Metropolis et al., eds., A History of Computing in the Twentieth Century (New York: Academic Press, 1980): 347–364.
71. "Electronic Computers Within the Ordnance Corps: Chapter VI—Computers for Solving Gunnery Problems," http://www.ftp.arl.mil/mike/comphist/96summary/index.html (last accessed 3/22/05).
72. "Five Customer Engineers Help the Navy 'Aim' Its Guns," Business Machines (June 15, 1955): 3–5, IBM Archives, Somers, N.Y.
73. Boslaugh, When Computers Went to Sea, 319.
74. Ibid., 335.
75. "Electronic Computer Is Used to 'Sea-Test' the Seawolf," Business Machines (September 5, 1955): 6; Charles N. Barnard, "Star-Spangled Division," Think (January–February 1982): 19–24.
76. Ibid., 340–357.
77. Ibid., 369.
78. Ibid., 364–368.
79. Edwards, The Closed World, 5, 15, 113–114; Johnson, The United States Air Force and the Culture of Innovation, 11, 12–13, 16, 19–21. Both authors point out that during the Kennedy administration, Secretary of Defense McNamara introduced the notion of systems into many aspects of how DoD viewed processes, budgets, and other matters. In addition, see Trask and Goldberg, The Department of Defense, 33–34, 80–81.
80. George W. Bradley, III, "Origins of the Global Positioning System," in Jacob Neufeld, George M. Watson, Jr., and David Chenoweth, Technology and the Air Force: A Retrospective Assessment (Washington, D.C.: Air Force History and Museums Program, 1997): 245–253.
81. Defense Systems Management College, Mission Critical Computer Resources Management Guide (Washington, D.C.: U.S. Government Printing Office, 1988): 2–5.
82. Ibid.
83. On the problems with SAGE, see Alan Borning, "Computer System Reliability and Nuclear War," in David Bellin and Gary Chapman, eds., Computers in Battle: Will They Work? (Boston: Harcourt Brace Jovanovich, 1987): 101–147. All standard accounts of SAGE avoid discussion of false alarms. However, Borning provides evidence that the Pentagon did not.
84. Anatol Rapoport, "War and Peace," The Annals of the American Academy of Political and Social Science 412 (March 1974): 153.
85. Both quotes are from Barry C. De Roze, "An Introspective Analysis of DOD Weapon System Software Management," Defense Management Journal 11, no. 4 (October 1975): 2.
86. Ibid., 3.
87. Defense Systems Management College, Mission Critical Computer Resources Management Guide, 2–7.
88. Ibid., 2–4.
89. O'Sullivan, "Advanced Training Technology," 31.
90. Orlansky and String, "Computer-Based Instruction for Military Training," 46.
91. Marc Prensky, Digital Game-Based Learning (New York: McGraw-Hill, 2004): 304.
92. O'Sullivan, "Advanced Training Technology," 31.
93. Ibid., 31–34; for many examples see Orlansky and String, "Computer-Based Instruction for Military Training," 46–54.
94. Orlansky and String, "Computer-Based Instruction for Military Training," 49–54.
95. Allen Collier, "An Overview of Training-Systems Development," Defense Management Journal 16, no. 4 (Fourth Quarter 1980): 2–5. The same issue carried other articles on training using flight simulators.
96. Office of Technology Assessment, Distributed Interactive Simulation of Combat, OTA-BP-ISS-151 (Washington, D.C.: U.S. Government Printing Office, September 1995): 2–5.
97. Ibid., 10–11.
98. Terry E. Bibbens, "Simulating the Modern Electronic Battlefield," Defense Management Journal 18, no. 3 (Third Quarter 1982): 17.
99. Ibid., 17–18.
100. In the Iraqi war of 2003, soldiers complained that they had to stop their advance on the capital to pull over and download information (e.g., from GPS), or that insufficient bandwidth slowed their receipt of vital information.
101. Paul S. Deems, "War Gaming and Exercises," Air University Quarterly Review 8 (Winter 1956–57): 98–126; John G. Kemeny, "Games of Life and Death," The Nation 192, no. 3 (January 21, 1961): 47–50; Thomas Clayton, "Military Gaming," in Russell Ackoff, ed., Progress in Operations Research, vol. 1 (New York: John Wiley & Sons, 1961): 421–461; Sharon Ghamari, "Simulating the Unthinkable: Gaming Future Wars in the 1950s and 1960s," Social Studies of Science 30, no. 2 (April 2000): 163–223.
102. Peter P. Perla, The Art of Wargaming (Annapolis, Md.: Naval Institute Press, 1990); Francis McHugh, Fundamentals of Wargaming (Newport, R.I.: U.S. Naval War College, 1966); John B. Davis, Jr., and John A. Tiedeman, "The Navy War Games Program," Proceedings of the U.S. Naval Institute (June 1960): 61–67.
103. Murray Greyson, ed., Second War Gaming Symposium Proceedings (Washington, D.C.: Washington Operations Research Council, 1964).
104. Leroy A. Brothers, "Operations Analysis in the United States Air Force," Journal of the Operations Research Society of America 2, no. 1 (February 1954): 1–16.
105. Leon Feldman, "Role of the Contractor and the User in Air Force War Gaming," in Greyson, Second War Gaming Symposium Proceedings, 105–116.
106. E. W. Paxson, War Gaming, RM-3489-PR (Santa Monica, Calif.: RAND Corporation, 1963).
107. Alfred Hausrath, Venture Simulation in War, Business, and Politics (New York: McGraw-Hill, 1971): 65–68; E. A. Adams and R. D. Forrester, Carmonette: A Computer Combat Simulation (Washington, D.C.: Operations Research Office, Johns Hopkins University, 1959). CARMONETTE means Computerized Monte Carlo Simulation.
108. Office of Technology Assessment, Distributed Interactive Simulation of Combat, 18–20.
109. Ibid., 20.
110. Prensky, Digital Game-Based Learning, 5.
111. Ibid., 6.
112. Bibbens, "Simulating the Modern Electronic Battlefield," 20.
113. Tim Lenoir and Henry Lowood, "Theaters of War: The Military-Entertainment Complex" (2002), http://www.standford.edu/class/sts145/Library/Lenoir-Lowood_TheatersOfWar.pdf (last accessed 3/2/05); Stephen Stockwell and Adam Muir, "The Military-Entertainment Complex: A New Facet of Information Warfare," http://journal.fibreculture.org/issue1/issue1_stockwellmuir.html (last accessed 6/16/05).
114. Martin Lister et al., New Media: A Critical Introduction (New York: Routledge, 2003).
115.
Stockwell and Muir, "The Military-Entertainment Complex"; Peter Huck, "Hollywood Goes to War," The Age (September 16, 2002), http://www.theage.com.au/articles/2002/09/14/1031608342634.html (last accessed 6/16/05).
116. Andy Hall, "Video Games Viewed As Aid to Education," Wisconsin State Journal, June 25, 2005, pp. A1, A9.
117. Roland J. Yardley et al., Use of Simulation for Training in the U.S. Navy Surface Force (Santa Monica, Calif.: RAND Corporation, 2003): ix.
118. Johnson, The United States Air Force and the Culture of Innovation, 8–10, 117–172.
119. "Introducing a New Computer—COMAR-1 of the U.S. Air Force," Business Machines (November 7, 1955): 2, IBM Archives, Somers, N.Y.
120. "Command Decisions," Business Machines (August 1961): 20–21.
121. "IBM Terminals for USAF," Business Machines (October 1961): 22.
122. "IBM System Checks Defense Data Network," IBM News (May 5, 1964): 8, IBM Archives, Somers, N.Y.
123. Walter Bauer and Sheldon Simmons, "A Real-Time Data Handling System," Datamation 10, no. 3 (March 1964): 31–35.
124. Lee S. Christie and Marlin G. Kroger, "Information Processing for Military Command," Datamation 8, no. 6 (June 1962): quote, p. 59, but see the entire article, 58–61.
125. Paul H. Riley, "Management and Technology of Defense Communications Today," Defense Management Journal 5, no. 2 (Spring 1969): 21–22.
126. Richard J. Meyer, "World-Wide Communications 'In Seconds,'" Defense Industry Bulletin 1, no. 6 (June 1965): 5, 14; Thomas Matthew Rienzi, Vietnam Studies: Communications-Electronics, 1962–1970 (Washington, D.C.: U.S. Government Printing Office, 1972); John D. Bergen, Military Communications: A Test for Technology (Washington, D.C.: Department of Defense, Center of Military History, 1986).
127. Riley, "Management and Technology of Defense Communications Today," 22.
128. Joseph J. Cody, Jr., "Command, Control, Communications Systems—'Musts' in Modern Weaponry," Defense Industry Bulletin 5, no. 12 (December 1969): 2.
129. Burroughs Corporation, Burroughs Corporation Annual Report 1963 (Detroit: Burroughs Corporation, 1964): 10.
130. Ibid.; Cody, "Command, Control, Communications Systems," 4.
131. For a description of the Army's consolidation and rationale for using advanced communications, see James W. Madden, "Committing Communications to the Computer," Defense Management Journal 8, no. 1 (April 1972): 50–54.
132. Office of Management and Budget, A Five-Year Plan for Meeting the Automatic Data Processing and Telecommunications Needs of the Federal Government (Washington, D.C.: U.S. Government Printing Office, November 1990): III-11–14.
133. Anthony W. Faughn, Interoperability: Is It Achievable? (Cambridge, Mass.: Center for Information Policy Research, Harvard University, 2002): 5–7.
134. Quoted in Frank M. Snyder, Command and Control: The Literature and Commentaries (Washington, D.C.: National Defense University Press, 1993): 71.
135. Faughn, Interoperability, 20.
136. National Research Council, Computer Sciences and Telecommunications Board, Commission on Physical Sciences, Mathematics, and Applications, Committee to Review DOD C4I Plans and Programs, Realizing the Potential of C4I: Fundamental Challenges (Washington, D.C.: National Academy Press, December 1999): 19–20.
137. Quoted in Chuck Paone, "Office Makes All Pieces of the Puzzle Fit Together," Hansconian 44, no. 38 (September 22, 2000): 3.
138. David Talbot, "How Tech Failed in Iraq," Technology Review (November 2004): 36–44.
139. Ibid., 44.
140. The discussion of Iraq over the past several paragraphs was drawn largely from ibid., 36–44, and from informal discussions, held between April and June 2005, with Air Force and Army officers recently returned from duty in Iraq. For more
positive but brief descriptions of the use of computing in this war, see Thomas L. Friedman, The World Is Flat: A Brief History of the Twenty-First Century (New York: Farrar, Straus and Giroux, 2005): 38–39, and "Digital Warfare Adapted for Iraq," Associated Press, January 2, 2004.
141. Edwards, The Closed World, 7–19.
142. James Adams, The Next World War: Computers Are the Weapons and the Front Line Is Everywhere (New York: Simon & Schuster, 1998): 57.
143. Ryan Henry and C. Edwards Peartree, "Military Theory and Information Warfare," Parameters (Autumn 1998): 126; see also Adams, The Next World War, 43–47.
144. Mike Pryor, "Digitization, Simulations, and the Future of the Army National Guard," in Robert L. Bateman, III, ed., Digital War: A View from the Front Lines (Novato, Calif.: Presidio Press, 1999): 85.
145. Adams, The Next World War, 17.
146. Henry and Peartree, "Military Theory and Information Warfare," 121–135; Alvin Toffler and Heidi Toffler, War and Anti-War: Survival at the Dawn of the 21st Century (New York: Warner Books, 1995); Martin C. Libicki, What Is Information Warfare? (Washington, D.C.: National Defense University Press, 1995); John Arquilla and David Ronfeldt, "Cyberwar Is Coming!" Comparative Strategy 12 (April–June 1993): 141–165; Stuart D. Schwartzstein, ed., The Information Revolution and National Security (Washington, D.C.: Center for Strategic and International Studies, 1996); see also Airpower Journal throughout the 1990s for many discussions of the subject.
147. Department of Defense, Joint Chiefs of Staff, Joint Vision 2010 (Washington, D.C.: U.S. Department of Defense, 1996): 16.
148. Department of Defense, Department of Defense Directive 3600.1, "Information Operations" (Washington, D.C.: U.S. Department of Defense, December 9, 1996): 1–1.
149. Daniel P. Bolger, "The Electric Pawn: Prospects for Light Forces on the Digitized Battlefield," in Bateman, Digital War, 122.
150. Ibid.
151. U.S. Department of Defense, Joint Chiefs of Staff, Joint Vision 2020 (Washington, D.C.: U.S. Government Printing Office, June 2000).
152. "The Network Is the Battlefield," BusinessWeek, January 7, 2003, http://www.businesweek.com/technology/content/jan2003/tc2003017_2464.htm (last accessed 1/16/03); "Tomorrow's Smarter, Connected Navy," BusinessWeek, January 7, 2003, http://www.businessweek.com/technology/content/jan2003/tc20030110_3330.htm (last accessed 1/16/03); David S. Alberts, John J. Garstka, and Frederick P. Stein, Network Centric Warfare: Developing and Leveraging Information Superiority, 2nd ed. (Washington, D.C.: U.S. Department of Defense, Command and Control Research Program, 2003): 87–114; David S. Alberts, Information Age Transformation: Getting to a 21st Century Military (Washington, D.C.: U.S. Department of Defense, Command and Control Research Program, 2002); Richard Darilek et al., Measures of Effectiveness for the Information-Age Army (Santa Monica, Calif.: RAND Corporation, 2001): 61–65.
153. For a collection of views critical of the reliance on computing, see David Bellin and Gary Chapman, Computers in Battle: Will They Work? (Boston: Harcourt Brace Jovanovich, 1987).
154. John A. Gentry, "Doomed to Fail: America's Blind Faith in Military Technology," Parameters (Winter 2002–2003): 88–103.
155. "The Doctrine of Digital War," BusinessWeek, April 7, 2003, http://www.businessweek.com/magazine/content/03_14/b3827601.htm (last accessed 7/3/05).
The role of technology in planning and the actions of the secretary have recently been documented by Michael R. Gordon and Bernard E. Trainor, Cobra II: The Inside Story of the Invasion and Occupation of Iraq (New York: Pantheon Books, 2006): see in particular the first three
chapters. A similar concern regarding the Pentagon's overreliance on all manner of technology is the subject of Michael Adas, Dominance by Design: Technological Imperatives and America's Civilizing Mission (Cambridge, Mass.: Harvard University Press, 2006), which includes discussion of Vietnam and the recent conflicts in the Middle East.
156. James W. Cortada, Before the Computer: IBM, NCR, Burroughs, and Remington Rand and the Industry They Created, 1865–1956 (Princeton, N.J.: Princeton University Press, 1993): 80, 92, 201–204.
157. "Two IBM 705's Are Piped Aboard," Business Machines (August 1958): 6–7, IBM Archives, Somers, N.Y.; for the Army and Air Force, see "Pacific Area Office Aids U.S. Armed Forces," Business Machines (August 16, 1954): 9; Allott, "ADP in the U.S. Army Materiel Command," 249–251.
158. Henry W. Tubbs, Jr., "Data Processing Goes to War," Datamation (August 1966): 45–46, 51.
159. Ibid., 46.
160. Ibid.
161. "The Modern Paul Revere," August 1959, CBI 55, "Market Reports," Box 70, Folder 5, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis; "U.S. Marine Corps Personnel Management Data Processing System," circa 1960, CBI 55, "Market Reports," Box 70, Folder 7; "Personnel Inventory Control on the IBM 705," circa 1957, CBI 55, "Market Reports," Box 70, Folder 3.
162. "Personnel Inventory Control on the IBM 705," Folder 3.
163. Transforming Department of Defense Financial Management, 5.
164. Ibid., 3–7.
165. Philip A. Odeen et al., Transforming Defense: National Security in the 21st Century. Transforming Department of Defense Financial Management: A Strategy for Change, Final Report 13, April 2001 (Washington, D.C.: U.S. Government Printing Office, 2001): unpaginated.
166. Franklin C. Spinney, "Written Testimony before House Subcommittee on National Security, Veterans Affairs, and International Relations, Committee on Government Reform," quoted in Wesley L. McClellan, "Deciding on Future Defense Capabilities: Increasing Objectivity for Success," Research Project Course 5999–02, undated (circa 2003), National Defense University, National War College, p. 7.
167. GAO, Best Practices: Taking a Strategic Approach Could Improve DOD's Acquisition of Services (Washington, D.C.: U.S. Government Printing Office, January 2002): 18, which discusses, in part, the nearly $5 billion in IT services contracted for by DoD in 2000; GAO, Data Mining: Federal Efforts Cover a Wide Range of Uses (Washington, D.C.: Government Printing Office, May 2004): 29–36.
168. James W. Cortada, The Digital Hand: How Computers Changed the Work of American Financial, Telecommunications, Media, and Entertainment Industries (New York: Oxford University Press, 2006): 413–430.
169. Tim Lenoir, "Taming a Disruptive Technology," presentation made on September 9, 2003, IBM Corporation, Almaden, Calif.; "The Army's New Killer App," BusinessWeek, May 22, 2002, http://www.businesweek.com/technology/content/may2002/tc20020523_2266.htm (last accessed 1/16/2003).
170. Lenoir, "Taming a Disruptive Technology," unpaginated.
171. This success began to attract the attention of the mainstream American press. See, for example, the article by a Knight Ridder Newspapers reporter, Robert S. Boyd, "Learning to Be a Soldier from a Computer Game," Wisconsin State Journal, January 27, 2006, p. A8, which also appeared in many other newspapers.
172. Lev Grossman, "The Army's Killer App," Time, February 28, 2005, 43–44.
173.
IBM Corporation, Application Brief, Information Center at Hill Air Force Base, GK20-2227-1 (White Plains, N.Y.: IBM Corporation, 1986), Box 252, Folder 18, Archives, IBM Corporation, Somers, N.Y.
174. See, for example, Allott, "ADP in the U.S. Army Materiel Command," 237–243; James D. Pewitt, Richard G. Abbott, and Alan G. Merten, "The Selection of Information Processing Systems to Support Air Force Management," Defense Industry Bulletin 3, no. 11 (December 1967): 1–6; Robert F. Williams, "Evaluating Performance and Cost by Computer," Defense Industry Bulletin 11, no. 3 (July 1975): 9–13.
175. Allott, "ADP in the U.S. Army Materiel Command," 249.
176. Carl G. O'Berry, "Information Systems and Applications," in Jacob Neufeld, George M. Watson, Jr., and David Chenoweth, eds., Technology and the Air Force: A Retrospective Assessment (Washington, D.C.: Air Force History and Museums Program, 1997): 313.
177. Ibid., 314; Talbot, "How Tech Failed in Iraq," 36–44.
178. McClellan, "Deciding on Future Defense Capabilities," 33.
179. See, for example, Montgomery Phister, Jr., Data Processing Technology and Economics (Santa Monica, Calif.: Santa Monica Publishing Company, 1975): 134; Automation Consultants, Inc., "Office Automation Applications," various annual editions, CBI 55, "Market Reports," Box 70, Folder 2, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis.
180. Phister, Data Processing Technology and Economics, 134; Paul H. Riley, "Toward Tighter, Leaner ADP Operations," Cost Reduction Journal 3, no. 2 (Spring 1967): 9–11.
181. National Bureau of Standards, Computers in the Federal Government: A Compilation of Statistics, NBS Special Publication 500-7 (Washington, D.C.: U.S. Government Printing Office, June 1977): 20.
182. Cited from DoD sources in Adams, The Next World War, 199.
183. Office of Management and Budget, Information Resources Management Plan of the Federal Government (Washington, D.C.: U.S. Government Printing Office, August 1996): 3.
184. U.S. Congress, Office of Technology Assessment, High Performance Computing and Networking for Science—Background Paper, OTA-BP-CIT-59 (Washington, D.C.: U.S. Government Printing Office, September 1989): 12.
185. General Accounting Office, Best Practices, 18.
186. Trask and Goldberg, The Department of Defense, 128.
187. See, for example, their discussion of the "Revolt of the Admirals," ibid., 17, 62.
188. Gordon and Trainor, Cobra II, 499–500, passim. They concluded that during the warfighting phase of the Iraqi conflict the use of technology was effective, but hardly so during the subsequent occupation, when the number of troops in the country was far too small to provide security.
189. Adas, Dominance by Design, 412.
190. Edwards, The Closed World.
191. The underlying problem is a very familiar one: that of the carpenter who sees every problem as solvable with a hammer and nail. Air Force officers have often been accused of "overselling" the power of aircraft bombing as the way to win wars. This happened in Vietnam and again in Iraq. Historically, Army commanders have countered by arguing that without "boots on the ground," nobody can win a war. Sources cited in the previous several endnotes discuss this issue. For background on why there were insufficient troops ready for service, see Frederick W. Kagan, "The U.S. Military's Manpower Crisis," Foreign Affairs (July–August 2006), http://www.foreignaffairs.org/20060701faessay85408/frederick-w-kagan.
Chapter 4
1. Lynn Bauer, "Justice Expenditure and Employment in the United States, 2001," Bureau of Justice Statistics Bulletin (May 2004): 3.
2. U.S. Department of Justice, Victim Costs and Consequences: A New Look (Washington, D.C.: U.S. Government Printing Office, 1996).
3. James M. McCarty, The Impact of New Technology and Organizational Stress on Public Safety Decision Making (Lewiston, N.Y.: Edwin Mellen Press, 1999): 6–26.
4. On circumstances in the 1960s and 1970s in policing, see V. L. Folley, American Law Enforcement (Boston: Allyn and Bacon, 1980), and Samuel Walker, The Police in America: An Introduction (New York: McGraw-Hill, 1983); for the 1980s and 1990s, see Henry M. Wrobleski and Karen M. Hess, Introduction to Law Enforcement and Criminal Justice (Belmont, Calif.: Wadsworth, 2003); for courts and corrections, see David W. Neubauer, America's Courts and Criminal Justice System (Belmont, Calif.: Wadsworth, 2002); James Austin and John Irwin, It's About Time: America's Imprisonment Binge, 3rd ed. (Belmont, Calif.: Wadsworth, 2001).
5. Bauer, "Justice Expenditure and Employment in the United States, 2001," 1–10.
6. U.S. Census Bureau, Statistical Abstract of the United States: 2002 (Washington, D.C.: U.S. Government Printing Office, 2001): 183; U.S. Census Bureau, Historical Statistics of the United States: Colonial Times to 1970 (Washington, D.C.: U.S. Government Printing Office, 1975): Part 1, 413. Statistics vary by agency and often are inconsistent. Compare the data cited in the text with those reported by the Bureau of Justice Statistics, U.S. Department of Justice: Devon B. Adams and Lara E. Reynolds, Bureau of Justice Statistics 2002: At a Glance, NCJ 19449 (Washington, D.C.: U.S. Department of Justice, August 2002): 3. However, all sources show the same general trends.
7. U.S. Census Bureau, Statistical Abstract of the United States: 2002, 189.
8. Kent W. Colton, "Routine Computer Technology: Implementation and Impact," in Kent W. Colton, ed., Police Computer Technology (Lexington, Mass.: Lexington Books, 1978): 47.
9. Most of the period literature cited in this chapter's endnotes and read by police management invariably spoke about the "pros" and "cons" of using computers.
10. Edward F. R. Hearle, "Can EDP Be Applied to All Police Agencies?" The Police Chief (February 1962): 10–16; Robert R. J. Gallati, "Identification and Intelligence Systems for Administration of Justice," in Geoffrey Y. Cornog, James B. Kenney, Ellis Scott, and John J. Connelly, eds., EDP Systems in Public Administration (Chicago: Rand McNally, 1968): 161–162; Kent W. Colton, "Computers and Police: Patterns of Success and Failure," Sloan Management Review 14, no. 2 (Winter 1972–1973): 75–98; for a more recent example, see J. Van Duyn, Automated Crime Information Systems (Blue Ridge Summit, Penn.: TAB Professional and Reference Books, 1991): 1–2.
11. "Time Equipment Used to Record N.C. Police Department's Working Hours," Business Machines, October 20, 1954, 2; "The Cards That Catch Criminals," Business Machines, January 5, 1955, 4–6; "Policemen's Talent Scout," Business Machines, August 1, 1957, 10; "More Help for Cop on the Beat," Business Machines, February 1961, 15–16; "RAMAC to Aid Police," Business Machines, August 1961, p. 23, all from IBM Archives, Somers, N.Y.; Norman D. Young, "Mechanized System Doubles Traffic Fines Collected," American City 71, no. 2 (February 1956): 120–121; John Gavigan, "Punched-Card Law Enforcement," American City 77, no. 7 (July 1962): 153–154, both articles from this publication written by police officers; "Office Automation Applications Report G16," III G16 (circa 1960): 1–2, CBI 55, "Market Reports," Box 70, Folder 17, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis.
12. V. A.
Leonard, The Police Records System (Springfield, Ill.: Charles C. Thomas Publisher, 1970): vi. 13. Ibid., 52. 14. Quoted in Basil J. Sollitto, “Impact of Automation on Law Enforcement,” in William H. Hewitt, Police Records Administration (Rochester, N.Y.: Aqueduct Books, 1968): 641. 15. Leonard, The Police Records System, 52–66.
16. “IBM’s Fabulous 1401 System Serves Customers in 41 Lands,” IBM News, February 10, 1964, 4; “Computer Net to Unchoke Police Records,” IBM News, August 25, 1965, 7, both from IBM Archives, Somers, N.Y.; George A. Flaherty, “Computer Upgrades Law Enforcement, Revenue Management,” American City 85, no. 10 (October 1970): 102–103; Colton, “Routine Computer Technology,” 48–50.
17. Kent W. Colton, “A Decade of Experience Since the Crime Commission: Conclusions and Recommendations,” in Colton, Police Computer Technology, 281; Colton, “Routine Computer Technology,” in Colton, Police Computer Technology, 47–54; “Oregon Police Get Fast Assist in Checking Cars, Licenses,” IBM News, September 10, 1968, 7; “‘Help, Police’—It’ll Be Quicker; System/360 on Way in N.Y.C.,” IBM News, January 25, 1968, 5; “Florida Pinpoints Transient Criminals,” IBM News, June 10, 1968, 6, last three citations from IBM Archives, Somers, N.Y. For a description of precomputer inquiry processes, see Leonard, The Police Records System, 54.
18. Scott Hebert, “The Introduction of Sophisticated Allocation Techniques in the Boston Police Department,” in Colton, Police Computer Technology, 97–109; Colton, “Computers and Police,” 75–80.
19. “Chicago’s Police EDP System,” Datamation (July 1967): 52–53.
20. “Computers Aid in Fight Against Crime,” Think (August 1973): 51.
21. “Dispatching Police Cars,” Data Processor 11, no. 2 (April 1968): 25, IBM Archives, Somers, N.Y.; Robert R. J. Gallati, “Identification and Intelligence Systems for Administration of Justice,” in Cornog et al., EDP Systems in Public Administration, 161–169.
22. “A Different Beat,” Data Processor 13, no. 1 (February 1970): 18, 21, IBM Archives, Somers, N.Y.; William M. Shaffer, “Computers Play a Deadly Game: Cops and Robbers,” Think (May 1971): 28–30.
23. Colton, “Computers and Police,” 79.
24. Ibid.
25. President’s Commission on Law Enforcement and Administration of Justice, Task Force Report: Science and Technology (Washington, D.C.: U.S. Government Printing Office, 1967); see also its report, The Challenge of Crime in a Free Society (Washington, D.C.: U.S. Government Printing Office, 1967).
26. Jack Metcalfe, “Electronics Joins the Nation’s War on Crime,” The Sunday News, February 5, 1967, 84. The reference to Sgt. Joe Friday comes from a popular TV program of the day (Dragnet), in which the character Sgt. Joe Friday is a no-nonsense officer in the Los Angeles Police Department who is constantly asking criminals and witnesses “just for the facts,” showing no emotion. The show burned into the nation’s mind an image of how policing took place that has remained largely unchanged to the present.
27. Ibid., 84–85; “The FBI’s Computer Network,” Datamation (June 1970): 146–147, 151.
28. Colton, “Routine Computer Technology,” 48–61; Hebert, “The Introduction of Sophisticated Allocation Techniques in the Boston Police Department,” 97–109; Scott Hebert, “The Use of a Computer-Assisted Patrol Deployment Model in the St. Louis Metropolitan Police Department,” in Colton, Police Computer Technology, 77–95, and his other study, “Command and Control in the Boston Police Department: A Technological Approach to Reform,” ibid., 169–195; Scott Hebert and Kent W. Colton, “The Use of Resource-Allocation Models in the Los Angeles Police Department,” ibid., 111–138; Colton, “The Experience of Police Departments in Using Computer Technology,” ibid., 19–45; V. A. Leonard, The New Police Technology: Impact of the Computer and Automation on Police Staff and Line Performance (Springfield, Ill.: Charles C. Thomas, 1980), 117–143.
29. Colton, “Computers and Police,” 79–80.
30. Ibid., 82.
31. Alfred Blumstein, “Information Systems Applications in the Criminal Justice System,” in Carlos A. Cuadra and Ann W. Luke, eds., Annual Review of Information Science and Technology, Vol. 7, 1972 (Washington, D.C.: American Society for Information Science, 1972): 471–495, who also reports on the first online query systems for checking on stolen cars, created by the California Highway Patrol and implemented in 1965, pp. 481–482; J. Mark Schuster and Kent W. Colton, “SPRINT: Computer-Assisted Dispatch in the New York City Police Department,” in Colton, Police Computer Technology, 197–223; Colton, “The Implementation of a Computer-Aided Dispatch System by the San Diego Police Department,” in Colton, Police Computer Technology, 225–242; IBM Corporation, Computer-Assisted Dispatching System: Hampton, Virginia Police (White Plains, N.Y.: IBM Corporation, 1976), IBM Archives, Somers, N.Y.; but see also other studies prepared by and preserved at the IBM Archives: IBM Corporation, Online Booking and Criminal Justice System: City of Philadelphia (White Plains, N.Y.: IBM Corporation, 1976), Police Information System: Norfolk, Virginia and Surrounding Tidewater Communities (White Plains, N.Y.: IBM Corporation, 1977), Automated Bail Agency Data Base District of Columbia Bail Agency (White Plains, N.Y.: IBM Corporation, 1977), Washington Area Law Enforcement System (White Plains, N.Y.: IBM Corporation, 1978), Civil Process Records System Sheriff’s Department: Hennepin County (White Plains, N.Y.: IBM Corporation, 1978), ALERT II: Kansas City, Missouri Police Department (White Plains, N.Y.: IBM Corporation, 1979), and for a state-level case study, Online Information System: Illinois Department of Law Enforcement (White Plains, N.Y.: IBM Corporation, 1979).
32. Leonard, The New Police Technology, 102–104.
33. The case of Baltimore’s police department is typical and documented: IBM Corporation, Police Reporting System at the Baltimore County Police Department (White Plains, N.Y.: IBM Corporation, 1988), Box 252, Folder 32, and IBM Corporation, IBM Systems Integration Used to Implement Information Network for the Baltimore Police Department (White Plains, N.Y.: IBM Corporation, 1989), Box 253, Folder 2, IBM Archives, Somers, N.Y.; see also an earlier study in the Archives, IBM Corporation, Administrative Communications Network Using Displaywriters at Nebraska State Patrol (White Plains, N.Y.: IBM Corporation, 1984), Box 250, Folder 35.
34. Michael Pennington, “New Software Systems Aid Police Work,” American City and County 104, no. 7 (July 1989): 14.
35. James N. Danziger and Kenneth L. Kraemer, “Computerized Data-Based Systems and Productivity Among Professional Workers: The Case of Detectives,” Public Administration Review 45, no. 1 (January–February 1985): 199.
36. Samuel Nunn, “Police Information Technology: Assessing the Effects of Computerization on Urban Police Functions,” Public Administration Review 61, no. 2 (March–April 2001): 232.
37. For a useful survey of the application, see Keith Harries, Mapping Crime: Principle and Practice (1999), available at http://www.ncjrs.org/html/nij/mapping/ch1_1.html (last accessed 7/11/05); Wrobleski and Hess, Introduction to Law Enforcement and Criminal Justice, 184–185; Cynthia A. Mamalian and Nancy G. LaVigne, “The Use of Computerized Crime Mapping by Law Enforcement: Survey Results,” National Institute of Justice Research Preview (January 1999): unpaginated. This report presents data suggesting that 13 percent of law enforcement departments did some form of digital crime mapping, with large departments doing so the most (36 percent). The most complete discussion of late-1990s vintage mapping systems is National Institute of Justice, Mapping Crime: Principle and Practice (Washington, D.C.: U.S. Department of Justice, 1999).
38. Leonard, The New Police Technology, 149.
39. Ibid., 149–158; for a description of precomputerized fingerprinting techniques just before the emergence of computer-based approaches, see Andre Moenssens, Fingerprint Techniques (New York: Chilton, 1971); for a very early account of the use of computers, Sperry Rand Corporation, Demonstrations of Prototype Fingerprint File and Technical Search System, CR-73-7 (Sudbury, Mass.: Sperry Rand Research Center, undated, circa 1973); R. Stock, “Automatic Fingerprint Reading,” Proceedings, 1972 Carnahan Conference on Electronic Crime Countermeasures (Lexington: University of Kentucky, 1972): 16–28.
40. Jim Chu, Law Enforcement Information Technology: A Managerial, Operational, and Practitioner Guide (Boca Raton, Fla.: CRC Press, 2001): 108.
41. U.S. Congress, Office of Technology Assessment, The FBI Fingerprint Identification Automation Program: Issues and Options—Background Paper, OTA-BP-TCT-84 (Washington, D.C.: U.S. Government Printing Office, November 1991): 1–31; U.S. Department of Justice, Office of Justice Programs, Bureau of Justice Statistics, Use and Management of Criminal History Record Information: A Comprehensive Report (Sacramento, Calif.: SEARCH Group, 1993): 23–28.
42. OTA, The FBI Fingerprint Identification Automation Program, 17.
43. See, for an example, IBM Corporation, An Advanced Fingerprint Matching System Scores “Hits” in Missouri (White Plains, N.Y.: IBM Corporation, 1991), Box 254, Folder 26, IBM Archives, Somers, N.Y.
44. U.S. Department of Justice, Office of Justice Programs, Bureau of Justice Statistics, Use and Management of Criminal History Record Information: A Comprehensive Report, 2001 Update (Washington, D.C.: U.S. Government Printing Office, December 2001): 61. At the time, the Western Identification Network (WIN), established in 1989, was developing a multistate AFIS application for California, Idaho, Montana, Nevada, Oregon, Utah, and Wyoming.
45. Barry Wise, “Catching Crooks with Computers,” American City and County 110, no. 6 (May 1995): 54, 56, 58–62.
46. BJS, Use and Management of Criminal History Record Information, 2001 Update, 62–63; Ellen Perlman, “Fighting Crime: The Technology Bullet,” Governing 9, no. 2 (November 1995): 63–65; Josh Goodman, “Big Brother’s Imprint,” Governing 16, no. 12 (September 2003): 42, 44, 46; Melanie Coffee, “Fingerprinting Causes Little Fuss: How Fingerprint Scanners Work,” Wisconsin State Journal, January 6, 2004, A1, A9.
47. Tom McEwen, Jacqueline Ahn, Steve Pendleton, Barbara Webster, and Gerald Williams, Computer Aided Dispatch in Support of Community Policing, Final Report, No. 204025 (Washington, D.C.: U.S. Department of Justice, February 2004): 7–74, 87–100.
48. Chu, Law Enforcement Information Technology, 119–120; “A New Twist on 911 Capability,” American City and County 112, no. 13 (December 1997): 10.
49. Chu, Law Enforcement Information Technology, 156–164; Peyton Whitely, “New Technology Hits Street with Officers,” The Seattle Times, June 8, 2005, http://seattletimes.nwsource.com/html/snohomishcountynews/2002320338_wireless08n.htm (last accessed 7/11/2005).
50. Ibid., 165–197.
51. National Commission on Terrorist Attacks upon the United States, The 9/11 Commission Report (New York: W. W. Norton, 2004): 397.
52. Chu, Law Enforcement Information Technology, 167–196.
53. Ibid., 214, but also see his bibliography, 225–226; for a large body of detail on the issues, see National Institute of Justice, State and Local Law Enforcement Wireless Communications and Interoperability: A Quantitative Analysis (Washington, D.C.: U.S. Department of Justice, 1998).
54. National Institute of Justice, State and Local Law Enforcement Wireless Communications and Interoperability, 230; these data were based on an important early study of modern telecommunications in law enforcement, Mary J. Taylor, Robert C. Epper, and Thomas K. Tolman, State and Local Law Enforcement Wireless Communications and Interoperability: A Quantitative Analysis, National Institute of Justice, Research Report, NCJ 168961 (Washington, D.C.: U.S. Government Printing Office, January 1998), and a related publication, Mary J. Taylor, Robert C. Epper, and Thomas K. Tolman, “Wireless Communications and Interoperability among State and Local Law Enforcement Agencies,” National Institute of Justice Research in Action (January 1998): 1–12, which provides an excellent statistical discussion of issues, such as problems with communications at a level not discussed by the 9/11 report.
55. Brenna Smith and Tom Tolman, “Can We Talk? Public Safety and the Interoperability Challenge,” National Institute of Justice Journal (April 2000): 17–21.
56. Charles Drescher and Martin Zaworski, The Design of Information Systems for Law Enforcement (Springfield, Ill.: Charles C. Thomas, 2000): 15–36; “Technology Boosts Crime Fighting,” American City and County 113, no. 9 (August 1998): 10.
57. Chu, Law Enforcement Information Technology, 26.
58. Jonathan Walters, “A Better Way to Crunch Crime Data,” Governing 7, no. 11 (August 1994): 56–57; for a continuing case of sharing, see “County, Cities Cooperate on Public Safety,” American City and County 112, no. 11 (October 1997): 8.
59. Thomas F. Rich, “The Use of Computerized Mapping in Crime Control and Prevention Programs,” National Institute of Justice Research in Action (July 1995): 1–11, and by the same author, “The Chicago Police Department’s Information Collection for Automated Mapping (ICAM) Program,” National Institute of Justice Research in Action (July 1996): 1–16.
60. McCarty, The Impact of New Technology and Organizational Stress on Public Safety Decision Making, 28.
61. Ann Lallande, “Computers and the War on Crime,” Governing 10, no. 10 (July 1997): 54.
62. One of the basic findings from nearly 30 industries reported in James W. Cortada, The Digital Hand, 2 vols. (New York: Oxford University Press, 2004–2006). See also Bureau of Justice Assistance, Keynote Presentations: 1999 Symposium on Integrated Justice Information Systems, NCJ 178231 (Washington, D.C.: U.S. Department of Justice, August 1999).
63. Marilyn J. Cohodas, “The Cybercops,” Governing 9, no. 12 (September 1996): 63–64.
64. For examples as of the late 1990s, http://police.sas.ab.ca (CopNet); http://www.fbi.gov (the FBI’s home page); http://www.nlectc.org (JUSTNET from the National Institute of Justice); http://www.MostWanted.com.
65. Wrobleski and Hess, Introduction to Law Enforcement and Criminal Justice, 212. There now exist some reasonably current data (circa 2002–2003) on the extent of deployment of community policing. In a national survey, 77 percent of responding police departments engaged in some form of community policing, and half had modified their IT applications to support this new form of policing. For details, see McEwen et al., Computer Aided Dispatch in Support of Community Policing, 4–6.
66. Tim Dees, Online Resource Guide for Law Enforcement (Upper Saddle River, N.J.: Prentice Hall, 2002): 208–241. This listing of sites does not include those of individual police or sheriff departments.
67. Chu, Law Enforcement Information Technology, 38–48, 193–195.
68. Brian A. Reaves and Matthew J. Hickman, “Police Departments in Large Cities, 1999–2000,” Bureau of Justice Statistics Special Report, NCJ 175703 (Washington, D.C.: U.S. Department of Justice, May 2002): 8–10.
69. Brian A. Reaves and Matthew J. Hickman, “Sheriffs’ Offices 1999,” Bureau of Justice Statistics Special Report, NCJ 186479 (Washington, D.C.: U.S. Department of Justice, May 2001): 13–15. For a discussion of the continuing contrast between urban and rural policing in the early years of the new century, see U.S. Department of Justice, Office of Justice Programs, Law Enforcement Technology—Are Small and Rural Agencies Equipped and Trained? (Washington, D.C.: National Institute of Justice, June 2004), available at http://www.ojp.usdoj.gov/nij.
70. “FBI’s National Crime Information Center Based on Two System/360 Model 40s,” IBM News, June 10, 1966, 2, and “FBI Center Uses Two Model 40s,” IBM News, February 10, 1967, 2, IBM Archives, Somers, N.Y.
71. J. Van Duyn, Automated Crime Information Systems (Blue Ridge Summit, Penn.: TAB Professional and Reference Books, 1991): 3–19; Leonard, The Police Records System, 56–66; Metcalfe, “Electronics Joins the Nation’s War on Crime,” 790–795.
72. Office of Technology Assessment, A Preliminary Assessment of the National Crime Information Center and the Computerized Criminal History System (Washington, D.C.: U.S. Government Printing Office, 1978): 12.
73. Study done by the GAO, “Development of the Computerized Criminal History Information System,” undated, circa March 1974, http://archive.gao.gov/f0302/095985.pdf (last accessed 4/4/2005); OTA, A Preliminary Assessment of the National Crime Information Center and the Computerized Criminal History System, 9–13.
74. BJS, Use and Management of Criminal History Record Information, 2001 Update, 26.
75. Ibid., 74.
76. Ibid., 31.
77. Ibid., 38–42.
78. Ibid., 43–44.
79. Ibid., 70.
80. Ibid., 92.
81. Ibid., 94.
82. See, for example, Information Resources Management Plan of the Federal Government (Washington, D.C.: U.S. Government Printing Office, December 1993): IV-73–IV-77.
83. L. Ralph Mecham, Judicial Business of the United States Courts: 1999 Report of the Director (Washington, D.C.: Administrative Office of the U.S. Courts, 2000).
84. David W. Neubauer, America’s Courts and the Criminal Justice System (Belmont, Calif.: Wadsworth/Thomson Learning, 2002): 79–80.
85. The two other major categories were criminal (14.6 million) and civil (15.4 million).
86. Neubauer, America’s Courts and the Criminal Justice System, 91–92.
87. Ibid., 92.
88. W. Stuart Awbrey, “. . . With Liberty and Justice for All,” Think (May 1971): 25; IBM Corporation, Helping Justice Out (White Plains, N.Y.: IBM Corporation, 1969): 16, IBM Archives, Somers, N.Y.
89. The iconic information technology artifact of the courtroom dated to 1877 with the introduction of the first shorthand machine. In 1963, the first device with magnetic tape, the Stenograph Data Writer, came to market; it could produce computer machine-readable files. Within a few years, new versions used cartridges, then cassettes in 1974. In 1987, the DOS-based Stenograph Smart Writer transferred text typed on the machine into machine-readable form. In the early 1990s, machines such as Stenograph’s Stentura had a display screen. Stenograph, “A History of the Shorthand Writing Machine,” http://www.stenograph.com (last accessed 8/22/2005). The transformation of these tools, and the role of their users, to more computer-based approaches has recently been studied by Greg Downey, “Constructing ‘Computer-Compatible’ Stenographers: The Transition to Real-Time Transcription in Courtroom Reporting,” Technology and Culture 47, no. 1 (January 2006): 1–26, a theme he expanded on in his book, From Court Reporters to Closed Captions (Baltimore, Md.: Johns Hopkins University Press, forthcoming).
90. “IBM Equipment Picks the Jury,” Business Machines (September 1958): 12; Lester C. Goodchild, “The Courts Were Clogged with Paper Work,” American City 87, no. 2 (February 1972): 79–81.
91. Awbrey, “. . . With Liberty and Justice for All,” quote, p. 27, but see entire article, 25–27.
92. “Helping Justice Out,” Data Processor 12, no. 6 (September 1969): 16–18, IBM Archives, Somers, N.Y.
93. “Justice Quick and Fair,” Data Processor 15, no. 2 (May 1972): 13–14; “No More Overcrowding in This Detention Center,” Think, no. 1 (1974): 42–45; Darby Patterson, “State of the Digital State,” Government Technology (June 2001), http://www.govtech.net/magazine/story.php?id=4971&issue=6:2001 (last accessed 4/01/2007); Carolyn Ball and Kenneth Nichols, Integration of Law Enforcement Computer Technology, vol. 1 (Orono: University of Maine, September 2002), http://www.ume.maine.edu/~pubadmin/cj/techrpt/ (last accessed 7/12/2005).
94. IBM conducted a series of detailed studies in this period on how courts were using its systems. These included descriptions of the applications, photographs of the online screen images, and comments by the users of such applications: IBM Corporation, Justice Information and Management System: Harris County, Texas (White Plains, N.Y.: IBM Corporation, 1976), Justice Information System in Milwaukee County, Wisconsin (White Plains, N.Y.: IBM Corporation, 1977), and Oakland County Justice System (White Plains, N.Y.: IBM Corporation, 1977), IBM Archives, Somers, N.Y.
95. Robert Sobel, IBM: Colossus in Transition (New York: Times Books, 1981): 258–262. IBM’s software tool, first developed to help it in its legal battles, was called STAIRS (Storage and Information Retrieval System).
96. Kathleen M. Carrick, LEXIS®: A Legal Research Manual (Dayton, Ohio: Mead Data Central, 1989): 7–9; Judy A. Long, Computer Aided Legal Research (Clifton Park, N.Y.: Delmar Learning, 2003): 53–68, 70–71.
97. IBM Corporation, Case Tracking System in the Texas Attorney General’s Office (White Plains, N.Y.: IBM Corporation, 1984), IBM Archives, Somers, N.Y.
98. IBM Corporation, The Maryland District Courts Judicial Information System (White Plains, N.Y.: IBM Corporation, 1986), but see also IBM Corporation, Integrated Justice System in Saginaw County (White Plains, N.Y.: IBM Corporation, 1987), IBM Archives, Somers, N.Y.; Jonathan Walters, “Cops and Courts Are Turning to High-Tech Tools,” Governing 4, no. 1 (October 1990): 24–26.
99. A series prepared by IBM: IBM Corporation, Colorado’s Self-Service Resource for the Courts (White Plains, N.Y.: IBM Corporation, 1990); Computerized Information System Streamlines Idaho’s Trial Courts (White Plains, N.Y.: IBM Corporation, 1991); 17th Judicial Circuit Court of Broward County, Florida, Managing Increased Caseload with SAA and OfficeVision (White Plains, N.Y.: IBM Corporation, 1991); A Powerful Tool for Juvenile Justice in Arizona (White Plains, N.Y.: IBM Corporation, 1991); Illinois’ 18th Judicial Circuit Court: On Its Way to Becoming a Paperless System (White Plains, N.Y.: IBM Corporation, 1992); Pennsylvania Courts: Ensuring Justice for All with a Statewide Information System (White Plains, N.Y.: IBM Corporation, 1992), IBM Archives, Somers, N.Y.
100. Steve Polilli, “The High-Tech Court of the Future,” Governing 5, no. 12 (September 1992): 18.
101. Ibid., 18–19.
102. American Bar Association, Facts about the American Judicial System (Washington, D.C.: American Bar Association, 1999): 17, 33; Elizabeth Glazer, “Harnessing Information in a Prosecutor’s Office,” NIJ Journal (October 2000): 3–7.
103. G. Martin Lively and Judy A. Reardon, “Justice on the Net: The National Institute of Justice Promotes Internet Services,” National Institute of Justice Research in Action (March 1996): 1–8. This publication lists many legal Web sites and provides screen images of various applications.
104. Quoted in “New Chair Sees IT Advantages for Courts,” The Third Branch, undated, circa 2000–2001, p. 1, http://www.uscourts.gov/ttb/march01ttb/interview.html (last accessed 7/11/2005).
105. Ibid., 2.
106. Thomas M. Lenard, “The Digital State Part 1: Social Services and Law Enforcement and the Courts,” Progress on Point, Release 8.12 (June 2001): 3–4.
107. “Ohio Court System Embraces Technology,” Government Technology, June 9, 2003, http://www.govtech.net/news/news.php?id=55195; Jim McKay, “Connecting the Courts,” Government Technology, September 2003, http://www.govtech.net/magazine/story.php?id=66708&issue=9:2003; Blake Harris, “Resurrecting the Court,” Government Technology, http://www.govtech.net/magazine/story.php?id=89564&issue=3:2004; “Frequently Asked Questions: Jury Service,” Iowa Judicial Branch, July 11, 2005, http://www.judicial.state.ia.us/faq/jury.asp (all last accessed 7/11/2005).
108. William H. B. Thomas, “Project Lawsearch: A Study in Disseminable IR,” Datamation 7, no. 11 (November 1961): 119–120 [IR meant information retrieval]; “Automating the Archives: Computer to Take Over the Lawyer’s Plodding Search through Archives,” Time (December 13, 1962): 82; Roy N. Freed, “Computer Law Searching: Problems for the Layman,” Datamation 13, no. 2 (October 1967): 38–43; “When Computers Do the Digging: Hunt through Case and Statute Books,” BusinessWeek, January 23, 1965, 54ff.
109. Mickie A. Voges, “Information Systems and the Law,” Annual Review of Information Science and Technology 23 (1988): 193–194; Cary Griffith, “Dual-System Research: The Best of Both Worlds,” Legal Times, March 17, 1986, 1, 10, and “Cost Effective Computer-Assisted Legal Research, Or When Two Are Better Than One,” Legal Reference Services Quarterly 7, no. 1 (spring 1987): 3–13; William G. Harrington, “Use of Lexis, and Westlaw, Too, Is Vital to Any Law Practice,” National Law Journal 10, no. 5 (October 12, 1987): 18–20, and another he wrote, “Sophisticated System Aids Litigators,” National Law Journal 10, no. 25 (February 29, 1988): 23. See also “Putting Law Libraries Into the Computer; Mead Data Central,” Business World (January 26, 1974): 36; William G. Harrington, “A Brief History of Computer-Assisted Legal Research,” Law Library Journal 77 (1985): 543.
110. Your author saw what computers could do in a courtroom in the late 1970s. Then an IBM salesman, I visited the federal courtroom in New York City in which the Justice Department’s antitrust suit against IBM was being heard. On one side of the courtroom, the federal attorneys had banks of file cabinets filled with the documentation they needed to support their case. On the other side sat IBM’s lawyers, in front of a terminal and a nearby printer, both attached to a large mainframe back at an IBM data center that stored many of the same records the federal lawyers kept in paper form crowding their side of the room. That day an IBM scientist was being questioned about a memo he had written a number of years earlier, from which the federal lawyer quoted. On cross-examination, the IBM lawyer printed out a full copy of the text and, when questioning the witness, said something to the effect of, “but didn’t you also write in the same letter such and such?” which blunted the point being made by the DOJ lawyer.
111. Voges, “Information Systems and the Law,” 194–196.
112. S. Blair Kauffman, “Electronic Databases in Legal Research: Beyond LEXIS and WESTLAW,” Rutgers Computer and Technology Law Journal 13, no. 1 (1987): 73–104; Adolph J. Levy, “Online Computer Databases: Finding Data on Adverse Witnesses, Defendants, and Defective Products,” Trial 23, no. 1 (January 1987): 18–22; Alice J. Vollaro and Donald T. Hawkins, “End-User Searching in a Large Library Network: A Case Study of Patent Attorneys,” Online 10, no. 4 (July 1986): 67–72.
113. Voges, “Information Systems and the Law,” 199.
114. Allan H. Schneider, “The Benefits of System Integration,” New York Law Journal 197, no. 59 (March 30, 1987): 33; Robert A. Sparks, “Integration of Computers and Conventional Technology,” Legal Economics 13, no. 1 (January–February 1987): 36–41.
115. Thomas P. Bonczar, Prevalence of Imprisonment in the U.S. Population, 1974–2001, Bureau of Justice Statistics Special Report, NCJ 197976 (Washington, D.C.: U.S. Department of Justice, August 2003).
116. Spencer Welch, “A New Way Out,” Think (March 1971): 21–24; “On the Screen—11,000 Stories,” Data Processor 13, no. 5 (December 1970–January 1971): 12–13, IBM Archives, Somers, N.Y.
117. “On the Screen—11,000 Stories,” 12–13.
118. IBM Corporation, Prison Accounting and Inmate Records System, Division of Corrections, State of New Mexico (White Plains, N.Y.: IBM Corporation, 1978), Box 245, Folder 22, IBM Archives, Somers, N.Y.
119. IBM Corporation, Jail Online Inmate Control System, Baltimore, Maryland (White Plains, N.Y.: IBM Corporation, 1979), Box 246, Folder 7, IBM Archives, Somers, N.Y.
120. IBM Corporation, Inmate Accounting Online at the Texas Department of Corrections (White Plains, N.Y.: IBM Corporation, 1983), Box 250, Folder 21, Corrections Management Information System in New Jersey (White Plains, N.Y.: IBM Corporation, 1988), Box 252, Folder 40, The New York State Department of Correctional Services Manages Growth with IBM Systems (White Plains, N.Y.: IBM Corporation, 1990), Box 253, Folder 17, IBM Archives, Somers, N.Y.
121. Office of Technology Assessment, Criminal Justice: New Technologies and the Constitution, Special Report, OTA-CIT-366 (Washington, D.C.: U.S. Government Printing Office, May 1988): 31–37.
122. For an early case study, see “Telecommunications Helps Keep a ‘Model’ Jail Secure,” American City and County 97, no. 10 (October 1982): 23–25.
123. Department of Justice, Bureau of Justice Statistics, State and Federal Corrections Information Systems: An Inventory of Data Elements and an Assessment of Reporting Capabilities (Washington, D.C.: U.S. Department of Justice, August 1998). While a detailed report (over 200 pages), it did not draw conclusions from the data on such issues as why there was, or was not, more or less use of IT and the effects it had. It did demonstrate, however, that with more digitized data, corrections communities could do a more thorough job of collecting and reporting on their prison populations.
124. “Jail Web Sites Publish Inmates’ Arrest Info,” American City and County Magazine 119, no. 6 (June 2004): 24–27.
125. Herbert Arkin, “Computers and the Audit Test,” Journal of Accountancy 120 (October 1965): 44–48; Wayne S. Boutell, “Auditing through the Computer,” Journal of Accountancy 121 (November 1965): 41–47; Goodrich F. Cleaver, “Auditing and Electronic Data Processing,” Journal of Accountancy 106 (November 1958): 48–54; Paul E. Hamman, “The Audit of Machine Records,” Journal of Accountancy 101 (March 1956): 56–61; Institute of Internal Auditors, Internal Audit and Control of Payroll and Accounts Payable (Where Accounting Machines Are Utilized) (New York: Institute of Internal Auditors, 1957); Felix Kaufman and Leo A. Schmidt, “Auditing Electronic Records,” Accounting Review (January 1957): 34–41; Joseph Pelej, “How Will Business Electronics Affect the Auditor’s Work?” Journal of Accountancy 98 (July 1954): 36–44; Arthur B. Toan, Jr., “The Auditor and EDP,” Journal of Accountancy 109 (June 1960): 42–46.
126. August Bequai, Computer Crime (Lexington, Mass.: Lexington Books, 1978): 3.
127. Donn B. Parker, Crime by Computer (New York: Charles Scribner’s Sons, 1976): 23. His original study, which became a minor classic, was written with Susan Nycum and S. Stephen Oura, Computer Abuse (Stanford, Calif.: Stanford Research Institute, November 1973), distributed by the U.S. Department of Commerce, National Technical Information Service. Copy used is from the Law Library, University of Wisconsin, Madison.
128. Donn B. Parker, Fighting Computer Crime (New York: Charles Scribner’s Sons, 1983): 23.
129. Parker, Crime by Computer, 27–34.
130. Ibid., 38.
131. Task Force on Computer Crime, Section of Criminal Justice, American Bar Association, Report on Computer Crime (Washington, D.C.: American Bar Association, June 1984): xii, copy at the Law Library, University of Wisconsin, Madison.
132. Parker, Fighting Computer Crime, x. By then he had an inventory of some 1,000 documented crimes, roughly half committed in the United States.
133. ABA, Report on Computer Crime, xii.
134. Stephen W. Leibholz and Louis D. Wilson, Users’ Guide to Computer Crime: Its Commission, Detection and Prevention (Radnor, Penn.: Chilton Book Company, 1974): 3.
135. Parker, Fighting Computer Crime, 236–237.
136. This early reference work for law enforcement was the U.S. Department of Justice, Law Enforcement Assistance Administration, Criminal Justice Resource Manual on Computer Crime (Washington, D.C.: U.S. Government Printing Office, 1979); Parker, Fighting Computer Crime, 239; “Crash Course in Computer Science Enables FBI to Nab Brainy Crooks,” Crime Control Digest, June 27, 1977, 5.
137. August Bequai, Computer Crime (Lexington, Mass.: Lexington Books, 1978): 37–38.
138. Van Duyn, Automated Crime Information Systems, 79–86.
139. Bequai, Computer Crime, 25–53.
140. There is a large body of material about hackers. For a spectrum of views, see Bernadette H. Schell, John L. Dodge, and Steve S. Moutsatsos, The Hacking of America—Who’s Doing It, Why, and How (Westport, Conn.: Quorum Books, 2002); Katie Hafner and John Markoff, Cyberpunk: Outlaws and Hackers on the Computer Frontier (New York: Touchstone, 1992); The Knightmare, Secrets of a Super Hacker (Port Townsend, Wash.: Loompanics Unlimited, 1994); Steven Levy, Hackers: Heroes of the Computer Revolution (New York: Anchor Press, 1984); Guy L. Steele et al., The Hacker’s Dictionary (New York: Harper and Row, 1983); Bruce Sterling, The Hacker Crackdown: Law and Disorder on the Electronic Frontier (New York: Bantam Books, 1992).
141. August Bequai, Technocrimes (Lexington, Mass.: Lexington Books, 1987): 30.
142. Ibid., 31.
143. Ibid., 32. Cookie refers to a children’s character that appeared in a late twentieth-century American public TV program, called Sesame Street, in which one friendly (but also slightly grouchy) character was the Cookie Monster, who always wanted a cookie. Most children growing up in the U.S. in the 1970s and 1980s who had access to a television set probably saw the program, and hardly any child at the time was unaware of the Cookie Monster.
144. Kovacich and Boni, High Technology-Crime Investigator’s Handbook, 32–33. There is a growing body of literature on crime conducted over the Internet. For examples, see William C. Boni and Gerald L. Kovacich, I-Way Robbery: Crime on the Internet (Boston, Mass.: Butterworth-Heinemann, 1999); Charles Platt, Anarchy Online, Net Crime, Net Sex (New York: Harper, 1996); Julian Dibbell, My Tiny Life: Crime and Passion in a Virtual World (New York: Henry Holt, 1998); Brian McWilliams, Spam Kings: The Real Story Behind the High-Rolling Hucksters Pushing Porn, Pills, and @*#?% Enlargements (Cambridge, Mass.: O’Reilly, 2004); Steven Branigan, High-Tech Crimes Revealed: Cyberwar Stories From the Digital Front (Boston, Mass.: Addison-Wesley, 2005).
145. Bequai, Technocrimes, 61–76.
146. Franklin Clark and Ken Diliberto, Investigating Computer Crime (Boca Raton, Fla.: CRC, 1996); Gerald L. Kovacich and William C. Boni, High Technology-Crime Investigator’s Handbook: Working in the Global Information Environment (Boston, Mass.: Butterworth-Heinemann, 2000); R. L. Bintliff, Complete Manual of White Collar Crime Detection and Prevention (Englewood Cliffs, N.J.: Prentice-Hall, 1993); David Icove, Karl Seger, and William Von Storch, Computer Crime: A Crimefighter’s Handbook (Sebastopol, Calif.: O’Reilly and Associates, 1995); U.S. Department of Justice, Office of Justice Programs, Bureau of Justice Statistics, Organizing for Computer Crime Investigation and Prosecution (Washington, D.C.: U.S. Government Printing Office, 1989).
147. Gerald L. Kovacich and William C. Boni, High Technology-Crime Investigator’s Handbook: Working in the Global Information Environment (Boston: Butterworth-Heinemann, 2000), 34–36.
148. Ibid., 36–40.
149. Ramona R. Rantala, Cybercrime against Businesses, Bureau of Justice Statistics Technical Report 200639 (Washington, D.C.: U.S. Department of Justice, March 2004).
150. National Criminal Justice Reference Service, Internet Safety, undated, http://www.ncjrs.org/internetsafety/index.html (last accessed 7/11/2005).
151. Kent W. Colton, “Computer Technology and the Police: The Expectations and the Results,” in Colton, Police Computer Technology, 7.
152. Scott Hebert, “Impact and Implications for Future Criminal Justice Reform Efforts,” in Colton, Police Computer Technology, 151.
153. Nunn, “Police Information Technology,” 221–234.
154. Melissa Conradi, “Scared to Share,” Governing 17, no. 11 (August 2004): 38.
155. For example, U.S. General Accounting Office, Information Technology: FBI Needs an Enterprise Architecture to Guide Its Modernization Activities (Washington, D.C.: U.S. Government Printing Office, September 2003) and Information Technology: Foundational Steps Being Taken to Make Needed FBI Systems Modernization Management Improvements (Washington, D.C.: U.S. Government Printing Office, September 2004).
156. Based on an oral history conducted by Jeffrey R. Yost, “An Interview with Donn B. Parker,” May 14, 2003, 26, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis.
Chapter 5
1. Robert Sobel, IBM: Colossus in Transition (New York: Times Books, 1981): 86, 127; Arthur L. Norberg, “High-Technology Calculation in the Early 20th Century: Punched Card Machinery in Business and Government,” Technology and Culture 31, no. 4 (1990): 753–779; James W. Cortada, Before the Computer: IBM, NCR, Burroughs, and Remington Rand and the Industry They Created, 1865–1956 (Princeton, N.J.: Princeton University Press, 1993): 146–147, 221, 267; Emerson W. Pugh, Building IBM: Shaping an Industry and Its Technology (Cambridge, Mass.: MIT Press, 1995): 56, 63–64, 89, 317; Martin Campbell-Kelly and William Aspray, Computer: A History of the Information Machine (New York: Basic Books, 1996): 51–52; JoAnne Yates, “Business Use of Information and Technology during the Industrial Age,” in Alfred D. Chandler, Jr., and James W. Cortada, eds., A Nation Transformed by Information: How Information Has Shaped the United States from Colonial Times to the Present (New York: Oxford University Press, 2000): 134.
2. An inventory of IT equipment from December 1946 listed 648 data processing machines just from IBM, renting for $838,229. The SSA also had 19,568 file cabinets of punched-card records. SSA Archives, Baltimore, Md.
3. Charles A. Coffindaffer, “The Conversion of Social Security Recordkeeping Operations to Electronic Data Processing” (M.A. thesis, George Washington University, 1963): 12–20.
4. Larry DeWitt, “Research Note #6: Early Automation Challenges for SSA,” April 2000, http://www.ssa.gov/history/ibm.html (last accessed 9/10/2005).
5. Victor Christgau, “Old-Age, Survivors, and Disability Insurance After Twenty-five Years,” Social Security Bulletin 23, no. 3 (August 1960): 20.
6. Coffindaffer, “The Conversion of Social Security Recordkeeping Operations to Electronic Data Processing,” 6–8.
7. U.S. Census Bureau, Statistical Abstract of the United States: 2002 (Washington, D.C.: U.S. Government Printing Office, 2001): table no. 517, p. 345.
8. Ibid., table no. 518, p. 345.
9. Social Security Administration, “Organizational Structure of the Social Security Administration,” August 25, 2005, http://www.ssa.gov/org/ssaorg.htm (last accessed 9/10/2005).
10. For early descriptions of this process, see Social Security Administration, Your Social Security Record (Baltimore, Md.: U.S. Social Security Administration, December 1955), reflecting how things were done just prior to the arrival of computers in the agency.
11. Coffindaffer, “The Conversion of Social Security Recordkeeping Operations to Electronic Data Processing,” 15–16. For a description of the original and evolving designs of record keeping and digital applications from the 1930s to the mid-1980s, see Michael A. Cronin, “Fifty Years of Operations in the Social Security Administration,” Social Security Bulletin 48, no. 6 (June 1985), available at http://www.ssa.gov/history/cronin.html (last accessed 9/21/2005).
12. James W. Cortada, The Digital Hand: How Computers Changed the Work of American Financial, Telecommunications, Media, and Entertainment Industries (New York: Oxford University Press, 2006): 113–150.
13. U.S. Bureau of Old-Age and Survivors Insurance, “Breakthrough in the Earnings Record Operation,” OASIS (July 1959): 5.
14. Coffindaffer, “The Conversion of Social Security Recordkeeping Operations to Electronic Data Processing,” 21–30; S. D. Hearn and J. H. Cummins, Report on Feasibility of Electronic Computers for Processing Bureau of Old-Age and Survivors Insurance Statistics (Baltimore: The Bureau, 1954): 7.
15. Hearn and Cummins, Report on Feasibility of Electronic Computers for Processing Bureau of Old-Age and Survivors Insurance Statistics, 5–6.
16. Coffindaffer, “The Conversion of Social Security Recordkeeping Operations to Electronic Data Processing,” 59.
17. Prior to the use of the IBM 702, summary earnings records filled 6,000 file cabinets and took up 40,000 square feet of space. After these records had been transferred to tape, the SSA needed only 1,100 square feet of space in its Baltimore office to house the data. Later the National Employee Index was also converted to tape, shrinking the space needed from one city block of floor space to 2,500 square feet; ibid., 75. An early internal history of computing at the agency described the same themes of speed and storage capacity from one generation of computing to another as the driving force in acquiring ever larger and faster systems, “History of Data Processing in BDPA,” undated, circa 1969, SSA Archives, Baltimore, Md. See also Social Security Administration, Electronic Data Processing in The Social Security Administration (Baltimore: Social Security Administration, 1963), and a later study by SSA, History of Installation of Electronic Data Processing in OASDI Payment Centers (Baltimore: Social Security Administration, 1964), both at the SSA Archives, Baltimore, Md. Internal training materials for SSA’s DP staff describe many of the existing systems and management’s approach to their introduction and operation: Joseph L. Fay, Data Processing in the Social Security Administration (Baltimore: Social Security Administration, 1967), SSA Archives, Baltimore, Md.
18. Cronin, “Fifty Years of Operations in the Social Security Administration,” 19. Nothing demonstrates the effects of tape files so dramatically as photographs of SSA storage areas before and after the introduction of magnetic tape. For such a photographic comparison, see the entire issue of SSA’s The Bulletin 23, no. 45 (November 9, 1961), available at http://www.ssa.gov/history/candlerops.html (last accessed 9/21/2005).
19. U.S. Bureau of Old-Age and Survivors Insurance, Division of Accounting Operations, Annual Financial Report—1956 Fiscal Year (Baltimore: U.S. Bureau of Old-Age and Survivors Insurance, 1957): A26.
20. Cronin, “Fifty Years of Operations in the Social Security Administration”; Office Automation Applications Updating Service, “Eighty Million Accounts Receivable,” March 1959, III G2–16, and updated version, April 1961, CBI 55, “Market Reports,” Box 70, Folders 5 and 7, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis; Social Security Administration, “History of SSA during the Johnson Administration, 1963–1968,” undated, http://www.ssa.gov/history/ssa/lbjopere.html (last accessed 9/10/2005).
21. General Accounting Office, SSA Computers: Long-Range Vision Needed to Guide Future Systems Modernization Efforts, GAO/IMTEC-91-44 (Washington, D.C.: U.S. Government Printing Office, 1991): 2; Office of Technology Assessment, Social Security Administration and Information Technology, Special Report OTA-CIT-311 (Washington, D.C.: U.S. Government Printing Office, October 1986): 96, 105; Martha Derthick, Agency under Stress: The Social Security Administration in American Government (Washington, D.C.: Brookings Institution, 1990): 5; “IBM Optical Reader Processes Social Security Data,” IBM News, August 25, 1966, 2; Douglas W. Thomas, “Buzzing Card Reader Sounds Off, Helps Cut Clerical Time at Social Security,” IBM News, June 26, 1967, 2; “Security in Numbers,” Data Processor 10, no. 2 (June 1967): 3–7, all three articles at IBM Archives, Somers, N.Y.
22. SSA, Electronic Data Processing in The Social Security Administration, 6–9.
23. Ibid., 9.
24. Cronin, “Fifty Years of Operations in the Social Security Administration,” 26.
25. Office of Technology Assessment, Social Security Administration and Information Technology, 12.
26. For an inventory of many changes and laws originating in the Congress, see Geoffrey Kollmann, Summary of Major Changes in the Social Security Cash Benefits Program: 1935–1996 (Washington, D.C.: Congressional Research Service, The Library of Congress, December 20, 1996).
27. Helen Margetts, Information Technology in Government: Britain and America (London: Routledge, 1999): 72.
28. Office of Technology Assessment, Social Security Administration and Information Technology, 14. For an internally written (later published) assessment of the crisis from the perspective of the SSA, see Systems Modernization Plan: From Survival to State of the Art (Baltimore: Social Security Administration, February 1982): 1–30.
29. Elmer B. Staats to Representative William R. Armstrong, June 6, 1977, General Accounting Office. An internal SSA assessment of data processing concluded that while the agency had state-of-the-art computers (such as two IBM System/370 Model 168s and two Model 165s), “many of the operational programs today are still processing as they did on the 7080,” Ferdinand Jung, “Memorandum for the Record by Ferdinand Jung,” May 24, 1976, 4, SSA Archives, Baltimore, Md.
30. Margetts, Information Technology in Government: Britain and America, 72–75.
31. Office of Technology Assessment, Social Security Administration and Information Technology, 14.
32. Ibid., 17.
33. For discussions of the SMP, see Margetts, Information Technology in Government: Britain and America, 74–75. For an official SSA description of the plan, see Social Security Administration, Systems Modernization Plan.
34. For a GAO assessment, Warren G. Reed to Senators Lowell Weicker, Jr., William Proxmire, and Lawton Chiles, August 30, 1985, http://archives.gao.gov/d11t3/128022.pdf (last accessed 1/3/2005). For a relatively final version of the plan, see Social Security Administration Office of Systems, Systems Modernization Plan . . . 1987, Field Edition (Baltimore: Social Security Administration, December 1986).
35. For a description of the process, see Cronin, “Fifty Years of Operations in the Social Security Administration,” p. 11 of the Internet version.
36. Office of Technology Assessment, The Social Security Administration’s Decentralized Computer Strategy: Issues and Options, OTA-TCT-591 (Washington, D.C.: U.S. Government Printing Office, April 1994): 1–4.
37. General Accounting Office, SSA Computers, 2.
38. Systems Modernization Program, News Update; see in particular the monthly issues from 1985 through 1990, SSA Archives, Baltimore, Md.
39. Office of Technology Assessment, Social Security Administration and Information Technology, 119.
40. Margetts, Information Technology in Government, 76–78.
41. Office of Technology Assessment, Social Security Administration and Information Technology, 126, and for a fuller account of the affair, 126–131.
42. Ibid., 39.
43. Ibid., 42.
44. Ibid., 43.
45. Office of Technology Assessment, The Social Security Administration’s Decentralized Computer Strategy, 1.
46. Ibid.
47. Ibid., 31.
48. Ibid., 7.
49. Ibid.
50. Ibid., 16.
51. Made complex because each case had to be treated differently and assessed on its own merits, which called for the knowledge, judgment, and time of employees, whereas Old-Age and Survivors Insurance was largely a standard process driven by mathematical calculations that computers could execute.
52. General Accounting Office, Electronic Transfers: Use by Federal Payment Recipients Has Increased but Obstacles to Greater Participation Remain (Washington, D.C.: U.S. Government Printing Office, September 2002): 3.
53. Social Security Administration, “Chronology of SSA Events—1993–2000,” undated, http://www.ssa.gov/history/ssa/ssa2000exhibit1–1.html (last accessed 9/10/2005); Social Security Administration, “Social Security Online Website Honors and Awards,” http://www.ssa.gov/awards/ (last accessed 9/10/2005).
54. Social Security Administration, “SSA History: History of SSA 1993–2000,” chapter 8, “Workforce Investments,” http://www.ssa.gov/history/ssa/ssa2000chapter8.html (last accessed 9/10/2005).
55. General Accounting Office, Information Technology Management: Social Security Administration Practices Can Be Improved (Washington, D.C.: U.S. Government Printing Office, August 2001): 10.
56. Ibid., 11.
57. Ibid., 16.
58. General Accounting Office, Major Management Challenges and Program Risks: Social Security Administration (Washington, D.C.: U.S. Government Printing Office, January 2003): 16–20.
59. Ibid., 21.
60. Ibid., 25.
61. Social Security Administration, Results of the Social Security Administration: Getting It Done (n.p. [Baltimore?]: Social Security Administration, undated [summer 2005]): 1–14.
62. Ibid., 15.
63. Ibid., 18–19.
64. For details, see ibid., 20–25.
65. For a list and description of all the key agencies circa 2005, see “Federal Agencies with Statistical Programs,” http://www.fedstats.gov/agencies/index.html (last accessed 10/1/2005).
66. An excellent, important exception is a history written by two statisticians, Joseph W. Duncan and William C. Shelton, Revolution in United States Government Statistics, 1926–1976 (Washington, D.C.: U.S. Government Printing Office, October 1978). On the history of information-handling technologies at the Census Bureau prior to its use of computers, see Leon Truesdell, The Development of Punch Card Tabulation in the Bureau of the Census, 1890–1940 (Washington, D.C.: U.S. Government Printing Office, 1965), and specifically on the origins of punched cards at the bureau, the most complete account is by Geoffrey D. Austrian, Herman Hollerith: Forgotten Giant of Information Processing (New York: Columbia University Press, 1982): 39–73. The standard history of the census and its bureau is by Margo J. Anderson, The American Census: A Social History (New Haven, Conn.: Yale University Press, 1988), but also consult Margo J. Anderson and Stephen E. Fienberg, Who Counts? The Politics of Census-Taking in Contemporary America (New York: Russell Sage Foundation, 1999). Both contain citations to earlier studies of the census and the bureau.
67. Anderson, The American Census, 197.
68. Arthur L. Norberg, Computers and Commerce: A Study of Technology and Management at Eckert-Mauchly Computer Company, Engineering Research Associates, and Remington Rand, 1946–1957 (Cambridge, Mass.: MIT Press, 2005): 79–82, 97, 108–109, 117, 172–173, 177, 190–191, 217; Duncan and Shelton, Revolution in United States Government Statistics, 126–127. If one had to pick a single event that pushed IBM into the commercial computer business, it was the decision by the bureau to acquire a computer from a firm other than IBM. The bureau had been IBM’s oldest customer and one of its most important. Richard S. Tedlow, The Watson Dynasty: The Fiery Reign and Troubled Legacy of IBM’s Founding Father and Son (New York: Harper Business, 2003): 186–187; Charles J. Bashe, Lyle R. Johnson, John H. Palmer, and Emerson W. Pugh, IBM’s Early Computers (Cambridge, Mass.: MIT Press, 1986): 375–376.
69. Statistical Reports Division, U.S. Department of Commerce, The 1950 Censuses—How They Were Taken, Procedural Studies of the 1950 Censuses, No. 2: Population, Housing, Agriculture, Irrigation, Drainage (Washington, D.C.: U.S. Government Printing Office, 1955): 29–38; Morris H. Hansen and James L. McPherson, “Potentialities and Problems of Electronic Data Processing,” in Lowell H. Hattery and George P. Bush, eds., Electronics in Management (Washington, D.C.: University Press of Washington, D.C., 1956): 53–66; “New Electronic Statistical Machine to Be Used in Census,” Electrical Engineering 69 (February 1950): 147ff.
70. Hattery and Bush, Electronics in Management, 38.
71. Duncan and Shelton, Revolution in United States Government Statistics, 119.
72. Ibid., 129–133.
73. The bureau was also consulted on computing in general by government officials. See, for example, Robert W. Burgess, “Statement [on automation],” in Automation and Technological Change; Hearings, Joint Committee on the Economic Report, Congress of the United States . . . October 14–18, 1955 (Washington, D.C.: U.S. Government Printing Office, 1955): 78–82.
Notes to Pages 160–168 74. Bureau of the Census, United States Censuses of Population and Housing 1960: Processing the Data (Washington, D.C.: U.S. Government Printing Office, 1962): 1–10, 17–28. 75. Phil Hirsch, “The World’s Biggest Data Bank,” Datamation 16, no. 5 (May 1970): 66. 76. Ibid., 66–73. 77. Bureau of the Census, Procedural History: 1970 Census Population and Housing (Washington, D.C.: U.S. Government Printing Office, June 1976): 1-1–1-18, 8-1–8-42, 12-9–12-21, 13-1–13-26. 78. Bureau of the Census, 1980 Census of Population and Housing (Washington, D.C.: U.S. Government Printing Office, June 1989): 8–5, and for the volume of digital reports. 79. Described in ibid., 8-6–8-7, 8-11–8-14. 80. The subject of Anderson and Fienberg, Who Counts? 81. Anderson, The American Census, 203. 82. Paul Friday, “Automation of the 1990 Data Collection and Data Capture Processes,” 4, unpublished paper, January 1984, U.S. Bureau of the Census, courtesy of the Office of the Historian, Census Bureau. 83. Bureau of the Census, 1990 Census of Population and Housing (Washington, D.C.: U.S. Government Printing Office, October 1995): 1-27–1-28. 84. Ibid., 1-10–1-12. 85. Ibid., 1-15. 86. Ibid., 1-15–1-17, 1-28–1-36. TIGER has been the subject of much attention. See G. Boudriault, “Topology in the TIGER File,” Proceedings: Eighth International Symposium on Computer-Assisted Cartography (1987): 258–263; F. R. Broome and L. Godwin, “The Census Bureau’s Publication Map Production System,” Cartography and Geographic Information Systems, Journal of American Congress on Surveying and Mapping 17, no. 1 (January 1990): 79–88; T. F. Trainor, “Fully Automated Cartography: A Major Transition at the Census Bureau,” Journal of American Congress on Surveying and Mapping 17, no. 1 (January 1990): 27–38; K. Bidd, “Unleashing TIGER: A GIS Data Base for the United States,” Professional Surveyor (September/October 1989): 16–17. 87. Bureau of the Census, 100 Years of Data Processing: The Punchcard Century (Washington, D.C.: U.S. Department of Commerce, January 1991): 2. 88. For brief descriptions of the 48 subapplications, see U.S. General Accounting Office, 2000 Census: Headquarters Processing System Status and Risks (Washington, D.C.: U.S. General Accounting Office, October 2000): 43–54. 89. U.S. General Accountability Office, Information Technology Management: Census Bureau Has Implemented Many Key Practices, but Additional Actions Are Needed (Washington, D.C.: U.S. General Accountability Office, June 2005): 20. 90. Ibid., 21–23. 91. The most current strategic plan for the USPS describes in considerable detail the role IT will play in the future of this organization’s work and the rationale for such a continually growing dependency on the technology. USPS, Strategic Transformation Plan 2006–2010 (Washington, D.C.: U.S. Postal Service, September 2005). 92. “Postal Facts,” http://www.usps.com (last accessed 9/22/2005). 93. U.S. Post Office, 1954 Annual Report for the Fiscal Year Ended June 30, 1954 (Washington, D.C.: U.S. Government Printing Office, 1955): 11. 94. U.S. Post Office, The Postmaster General Reports on the Services of the United States Post Office Department, July 1, 1958 to June 30, 1959 (Washington, D.C.: U.S. Government Printing Office, 1959): 16–17. 95. For example, Burroughs received a major order for mail sorting equipment in 1958. Burroughs Corporation, Burroughs Corporation Annual Report 1958 (Detroit: Burroughs Corporation, 1959): unpaginated; Burroughs Corporation, Annual Report 1959
397
398
(Detroit: Burroughs Corporation, 1960): 5, 7; Burroughs Corporation, Record Group CBI 90, Series 39: Exhibits, Photos, and Records, National Postal Forum (Washington, D.C., December 26–27, 1970), Box 7, Folder 11, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis.
96. U.S. Post Office, The Postmaster General Reports on the Services of the United States Post Office Department during Fiscal Year 1963 (Washington, D.C.: U.S. Government Printing Office, 1963): 7–8. The Postmaster General’s report described the structure of the ZIP Code: “The five-digit ZIP number is a structured code in which the first digit identifies one of ten large areas of the Nation, and the second digit indicates a State, a geographic portion of a heavily populated State, or two or more less populated States. The third digit identifies a major destination area within a State, which may be a large city post office or a major mail concentration point (Sectional Center) in a less populated area. Five hundred fifty-three of these Sectional Centers have been designated across the country. The final two digits indicate either a postal delivery unit of a larger city post office, or an individual post office served from a Sectional Center,” ibid., 8. A brief illustrative sketch of this coding scheme follows note 110 below.
97. Ibid., 8.
98. Frank W. Reilly and William S. Bowman, “Data Processing in the United States Postal Service,” Data Processing, vol. 7, Proceedings 1964 (New Orleans: Data Processing Management Association, 1964): 341–353.
99. U.S. Post Office, The Postmaster General Reports on the Services of the United States Post Office Department during Fiscal Year 1965 (Washington, D.C.: U.S. Government Printing Office, 1965): 83.
100. U.S. Post Office, Annual Report of the Postmaster General 1972–1973 (Washington, D.C.: U.S. Government Printing Office, 1973): 8.
101. U.S. Post Office, Annual Report of the Postmaster General 1974–1975 (Washington, D.C.: U.S. Government Printing Office, 1975): 5.
102. U.S. Post Office, Annual Report of the Postmaster General 1980 (Washington, D.C.: U.S. Government Printing Office, 1981): 6.
103. U.S. General Accounting Office, Conversion to Automated Mail Processing and Nine-Digit ZIP Code—A Status Report, GAO/GGD-83–84 (Washington, D.C.: U.S. General Accounting Office, September 28, 1983): 51, but see the entire report on how the ZIP code process functioned.
104. USPS, Annual Report of the Postmaster General 1983 (Washington, D.C.: USPS, 1984): 6.
105. U.S. Post Office Department, The Postmaster General Reports on the Services of the United States Post Office Department during the Fiscal Year 1956 (Washington, D.C.: U.S. Government Printing Office, 1957): 38.
106. Ibid., 29.
107. “Accounting for Money Orders by the Ton,” Business Machines, July 18, 1955, 9, IBM Archives, Somers, N.Y.
108. Automation Consultants, Inc., Office Automation Applications (New York: Automation Consultants, Inc., September 1958): IV-E-38, CBI 55, “Market Reports,” Box 70, Folder 2, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis; U.S. Post Office Department, The Postmaster General Reports on the Services of the United States Post Office Department during Fiscal Year 1958 (Washington, D.C.: U.S. Government Printing Office, 1958): 41.
109. U.S. Post Office Department, The Postmaster General Reports on the Services of the United States Post Office Department during Fiscal Year 1959 (Washington, D.C.: U.S. Government Printing Office, 1959): 46.
110. James Golden, Information Technology: 40 Years of Innovation and Success (Washington, D.C.: U.S. Postal Service, 2005): 1.
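The five-digit scheme quoted in note 96 is, in effect, a small positional code, and its decomposition is easy to illustrate. The following minimal Python sketch is purely illustrative; the function and field names are assumptions of this illustration, not anything drawn from a Postal Service system:

    def decompose_zip(zip_code: str) -> dict:
        # Split a five-digit ZIP Code along the lines quoted in note 96.
        if len(zip_code) != 5 or not zip_code.isdigit():
            raise ValueError("expected a five-digit ZIP Code")
        return {
            "national_area": zip_code[0],     # one of ten large areas of the nation
            "state_area": zip_code[1],        # a state or geographic portion of one
            "sectional_center": zip_code[2],  # major destination area within a state
            "delivery_unit": zip_code[3:],    # delivery unit or post office served
        }

    print(decompose_zip("20260"))  # prints the four components of a sample code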
111. All briefly described in the annual reports of the postmaster general. This pattern of describing IT uses extended right into the new century. See also Golden, Information Technology.
112. U.S. Post Office Department, Annual Report of the Postmaster General United States Post Office Department, July 1, 1969–June 30, 1970 (Washington, D.C.: U.S. Government Printing Office, 1970): 40.
113. U.S. Post Office Department, Annual Report of the Postmaster General United States Post Office Department, July 1, 1968–June 30, 1969 (Washington, D.C.: U.S. Government Printing Office, 1969): 39.
114. USPS, Annual Report of the Postmaster General 1974–1975 (Washington, D.C.: USPS, 1975): 5.
115. Ibid., 39.
116. USPS, Annual Report of the Postmaster General Fiscal 1977 (Washington, D.C.: USPS, 1978): 7–8.
117. USPS, Annual Report of the Postmaster General 1988 (Washington, D.C.: USPS, 1989): 5.
118. USPS, Annual Report of the Postmaster General 1991 (Washington, D.C.: USPS, 1992): 7.
119. Michael E. Motley, Postal Service: Restructuring, Automation, and Ratemaking (Washington, D.C.: U.S. General Accounting Office, March 25, 1993): 6.
120. USPS, Annual Report of the Postmaster General 1994 (Washington, D.C.: USPS, 1995): 7.
121. J. William Gadsby to John M. McHugh, December 15, 1995, p. 3, released by General Accounting Office as publication GAO B-260998. This letter contained the same theme as an earlier study by the GAO, Postal Service: Automation Is Restraining but Not Reducing Costs (Washington, D.C.: U.S. Government Printing Office, May 12, 1992).
122. For insights on E-COM, see USPS, Annual Report of the Postmaster General 1983 (Washington, D.C.: USPS, 1984): 9.
123. USPS, The United States Postal Service: An American History, 1775–2002 (Washington, D.C.: USPS, 2002): 40–41.
124. USPS, Annual Report of the Postmaster General 1989 (Washington, D.C.: USPS, 1985): 10–11.
125. GAO, Postal Service: Restructuring, Automation, and Ratemaking provides details as of the early 1990s.
126. Key sources on the debate include Douglas K. Adie, Monopoly Mail: Privatizing the United States Postal Service (New Brunswick, N.J.: Transaction, 1989); Alan L. Sorkin, The Economics of the Postal System: Alternatives and Reforms (Lexington, Mass.: Lexington Books, 1990); Michael Schuyler, Wrong Delivery: The Postal Service in Competitive Markets (Washington, D.C.: Institute for Research on the Economics of Taxation, 1998); and the most recent important study, a collection of papers reflecting diverse perspectives, Edward L. Hudgins, ed., Mail@the Millennium: Will the Postal Service Go Private? (Washington, D.C.: CATO Institute, 2000).
127. Thomas J. Duesterberg, “The Post Office and the Digital Switch: Observations on an Outmoded Industry,” in Hudgins, Mail@the Millennium, 140.
128. Ibid., 140; his data from Dallas Federal Reserve Board, “Time Well Spent,” undated report, http://www.dallasfed.org (last accessed 9/03/2005) and USPS, http://www.usps.com/history.
129. Pew Foundation statistics show that over half of American adults have had access to the Internet since 2000, that the share reached a consistent 60 percent by the end of 2001, and that it rose incrementally in subsequent years. “Percent of American Adults Online, 1995–2004,” http://www.pewinternet.org/trends/InternetAdoption.jpg
(last accessed 7/5/2005); for an analysis of how Internet use expanded, see a U.S. Department of Commerce report, A Nation Online: How Americans Are Expanding Their Use of the Internet (Washington, D.C.: U.S. Department of Commerce, February 2002); Lee S. Sproull, “Computers in U.S. Households Since 1977,” in Alfred D. Chandler, Jr., and James W. Cortada, eds., A Nation Transformed by Information: How Information Has Shaped the United States from Colonial Times to the Present (New York: Oxford University Press, 2000): 257–280. In addition to the direct loss of First Class Mail due to the convenience and reduced expense of using the Internet, there is another phenomenon that should be kept in mind, called the Turnpiking Effect: as a new way of doing things becomes convenient and quick, it is used more than might otherwise have been the case. That would mean that e-mail traffic grew by more than the corresponding decline in First Class Mail, simply because e-mail was easy and cheap to use.
130. Testimony of Michael E. Motley before the Committee on Post Office and Civil Service, U.S. House of Representatives, May 24, 1994, GAO, Postal Service: Role in a Competitive Communications Environment (Washington, D.C.: U.S. General Accounting Office, 1994): 1.
131. Ibid., 2–3.
132. USPS, Annual Report of the Postmaster General 1995 (Washington, D.C.: USPS, 1996): 14.
133. USPS, Annual Report of the Postmaster General 1996 (Washington, D.C.: USPS, 1997): 5.
134. GAO, Major Management Challenges and Program Risks: U.S. Postal Service (Washington, D.C.: U.S. General Accounting Office, January 1999).
135. GAO, U.S. Postal Service: Challenges to Sustaining Performance Improvements Remain Formidable on the Brink of the 21st Century (Washington, D.C.: U.S. General Accounting Office, October 21, 1999): 1.
136. For an excellent review of these services and their rationale for the USPS, see GAO, U.S. Postal Service: Postal Activities and Laws Related to Electronic Commerce (Washington, D.C.: U.S. General Accounting Office, September 2000); for a briefer discussion, see USPS, Annual Report of the Postmaster General 2001 (Washington, D.C.: USPS, 2001): 46.
137. GAO, U.S. Postal Service: Deteriorating Financial Outlook Increases Need for Transformation (Washington, D.C.: U.S. General Accounting Office, February 2002): 4.
138. USPS, “Operating Statistics,” http://www.usps.com/history/anrp04/opstats_001.htm (last accessed 10/21/2005); http://www.usps.com/financials/_pdf/2006AnnualReportFinal11-15-06withopinion.pdf (last accessed 3/31/2007).
139. “Testimony of John E. Potter, Postmaster General/CEO, Before a Hearing of the Committee on Homeland Security and Governmental Affairs, United States Senate, April 14, 2005,” Press Release, USPS, unpaginated.
140. For an excellent overview of this application of technology and its results, see GAO, U.S. Postal Service: Progress Made in Implementing Automated Letter Sequencing, but Some Issues Remain (Washington, D.C.: U.S. General Accounting Office, April 1998).
141. Harris Mahmud, “Automation and the Future of the United States Postal Service” (M.S. thesis, California State University, Long Beach, August 2000): 18.
142. Ibid., 36; Golden, Information Technology, 3–4.
143. See the plan published in 2003, USPS, Five-Year Strategic Plan, covering fiscal years 2004–2008, and a later plan, published in 2005, Strategic Transformation Plan 2006–2010, both available at http://www.usps.com.
144. USPS, Strategic Transformation Plan, 8.
145. That is not so rhetorical a point. Frequently in its annual reports, the USPS published what it cost to send first class mail in other countries, normally demonstrating that American rates were some of the least expensive in the world. Some of the highest rates were charged by postal systems in the industrialized world that were more labor intensive and less mechanized than the USPS, or that had to subsidize other services not required of the American postal system.
Chapter 6
1. Use of Electronic Data-Processing Equipment, Hearing before the Subcommittee on Census and Government Statistics of the Committee on Post Office and Civil Service, House of Representatives, Eighty-Sixth Congress, First Session, June 5, 1959 (Washington, D.C.: U.S. Government Printing Office, 1959): 44–45.
2. Ibid., 46.
3. A large number of surveys were done in the early years of the computer in the United States. For a listing of the various sources, including organizations collecting data, see Montgomery Phister, Jr., Data Processing Technology and Economics (Santa Monica, Calif.: Santa Monica Publishing Co., 1976): 519–534. The largest collection of this material is housed at the Charles Babbage Institute at the University of Minnesota, Minneapolis. Online bibliographies and other search tools can be found at its Web site, http://www.cbi.umn.edu/archmss.html (last accessed 12/28/2005).
4. James D. Gallagher, Management Information Systems and the Computer (New York: American Management Association, 1961): 31.
5. Ibid., 32.
6. Paul Armer, “Computer Aspects of Technological Change, Automation, and Economic Progress,” in National Commission on Technology, Automation, and Economic Progress, Technology and the American Economy, vol. 1 (Washington, D.C.: U.S. Government Printing Office, February 1966), Appendix, 220, copy at the Charles Babbage Institute, University of Minnesota, Minneapolis.
7. Ibid. Similar observations can be found in Bureau of Labor Statistics, Technological Trends in American Industry (Washington, D.C.: U.S. Government Printing Office, 1966): 255–259; “Computing Power in the Government,” Datamation (September 1961): 42–43.
8. “Weather and Its Latest Forecaster,” Business Machines (April 27, 1956): 2–3, and “Blue Skies or Stormy Weather?” Business Machines (December 1962): 9–10, both in IBM Archives, Somers, N.Y.
9. Automation Consultants, Office Automation Applications (New York: Automation Consultants, Inc., circa 1957): III G9-1–G9-8, CBI 55, “Market Reports,” Box 70, Folder 3, another copy in Folder 5, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis; “Progress in Air Traffic Control,” Business Machines (October 1963): 14–15; “FSD Receives FAA Contract,” IBM News, August 25, 1965, 1–2, and Howard K. Janis, “Air Traffic Control—A Step Closer,” IBM News, February 28, 1968, 4–5, all three at IBM Archives, Somers, N.Y.
10. Bureau of the Budget, Inventory of Automatic Data Processing (ADP) Equipment in the Federal Government, Including Costs, Categories of Use, and Personnel Utilization (Washington, D.C.: U.S. Bureau of the Budget, August 1962): 6; ibid., 1966 edition, 8; John Diebold, Business Decisions and Technological Change (New York: Praeger, 1970): 258–260.
11. Ibid., 14.
12. “The History of Technology Procurement,” Washington Technology 10, no. 22 (February 22, 1996), http://www.washingtontechnology.com (last accessed 12/20/2005); H. G. Berkman, “The Economics of Automatic Data Processing in Public Administration in the U.S.A.,” in A. B. Frielink, ed., Economics of Automatic Data Processing: Papers Presented at the International Symposium Organized by the International Computation Centre-Rome, October 19–22, 1965 (Amsterdam: North-Holland Publishing Company, 1965): 311.
13. In addition to the work of the Bureau of the Budget (cited in two previous endnotes), for early commentary, see Edward J. Mahoney, “Federal EDP,” Datamation (January 1964): 26–27; Paul Armer, “Computer Applications in Government,” in Irene Taviss, The Computer Impact (Englewood Cliffs, N.J.: Prentice-Hall, 1970): 123–127.
14. Peter Hernon and Charles R. McClure, Federal Information Policies in the 1980’s: Conflicts and Issues (Norwood, N.J.: Ablex Publishing Corporation, 1986): 112–163.
15. National Bureau of Standards, Computers in the Federal Government: A Compilation of Statistics, NBS Special Publication 500–7 (Washington, D.C.: U.S. Government Printing Office, June 1977): 8.
16. Ibid., 18.
17. Grace Commission, President’s Private Sector Survey on Cost Control: Report on Automated Data Processing/Office Automation (Washington, D.C.: U.S. Government Printing Office, 1983): 37.
18. Ibid., 38–39.
19. Grace Commission, Report on Automated Data Processing/Office Automation, 40.
20. Ibid., 59.
21. A major development discussed in future chapters, and mentioned in the prior discussion about the Census Bureau, was electronic mapping applications (GIS), resulting in the creation of national geographic databases in the 1970s and 1980s with massive increases in information. One survey reported that measured in billions of characters of storage, such data grew in volume from 48 in 1988 (already a vast number) to over 87 by 1991; see 1988 five-year plan cited in Office of Technology Assessment, Federal Government Information Technology: Management, Security, and Congressional Oversight, OTA-CIT-297 (Washington, D.C.: U.S. Government Printing Office, February 1986), 16.
22. Office of Technology Assessment, Federal Government Information Technology, 28.
23. Office of Technology Assessment, Federal Government Information Technology, 108.
24. Ibid., 108–109.
25. Ibid., 111.
26. Hernon and McClure, Federal Information Policies in the 1980’s, 112–163.
27. Ibid., 227.
28. Office of Technology Assessment, Federal Government Information Technology, 143.
29. Ibid., 143–146; Hernon and McClure, Federal Information Policies in the 1980’s, 252–257. Hernon and McClure concluded, however, that despite the fact that many agencies used telecommunications to deliver information to each other and to the public, the extent of deployment of this use of technology remained quite limited in the mid-1980s; ibid., 253.
30. Office of Technology Assessment, Informing the Nation: Federal Information Dissemination in an Electronic Age, OTA-CIT-396 (Washington, D.C.: U.S. Government Printing Office, October 1988); Office of Technology Assessment, Electronic Delivery of Public Assistance Benefits, OTA-BP-CIT-47 (Washington, D.C.: U.S. Government Printing Office, April 1988); Office of Management and Budget, U.S. General Services Administration, and the U.S. Department of Commerce, A Five-Year Plan for Meeting the Automatic Data Processing and Telecommunications Needs of the Federal Government (Washington, D.C.: U.S. Government Printing Office, October 1988) and A Five-Year Plan for Meeting the Automatic Data Processing and Telecommunications Needs of the Federal Government (Washington, D.C.: U.S. Government Printing Office, November 1990) [this series of annual plans had been published since the early 1980s in direct response to the Paperwork Reduction Act of 1980]; Sharon L. Caudle, “Federal Information Resources Management after the Paperwork Reduction Act,” Public Administration Review 48, no. 4 (July–August 1988): 790–799; Janet A. Weiss, Judith E. Gruber, and Robert H. Carver, “Reflections on Value: Policy Makers Evaluate Federal Information Systems,” Public Administration Review 46, Special Issue (November 1986): 497–505.
31. Office of Management and Budget, U.S. General Services Administration, and U.S. Department of Commerce, Information Resources Management Plan of the Federal Government (Washington, D.C.: U.S. Government Printing Office, November 1992): I-3–I-4.
32. General Accounting Office, Information Technology Investment: A Governmentwide Overview (Washington, D.C.: U.S. Government Printing Office, July 1995): 3.
33. Office of Management and Budget, Information Resources Management Plan of the Federal Government (Washington, D.C.: U.S. Government Printing Office, August 1996): 2.
34. General Accounting Office, FTS 2000: An Overview of the Federal Government’s New Telecommunications System, GAO/IMTEC-90–17FS (Washington, D.C.: U.S. Government Printing Office, February 1990).
35. Robert L. Chartrand, Information Policy and Technology Issues: Public Laws of the 95th through 101st Congresses (Washington, D.C.: Library of Congress, Congressional Research Services, 1991); for analyses of IT managerial problems many of these laws addressed, see John C. Beachboard and Charles R. McClure, “Managing Federal Information Technology: Conflicting Policies and Competing Philosophies,” Government Information Quarterly 13, no. 1 (1996): 15–33; David L. McClure, “Improving Federal Performance in the Information Era: The Information Technology Reform Act of 1996,” Government Information Quarterly 14, no. 3 (1997): 255–269; also Office of Management and Budget, Office of Information and Regulatory Affairs, Information Collection Budget of the United States Government: Fiscal Year 2000 (Washington, D.C.: Office of Management and Budget, undated, circa 1999–2000) at http://www.whitehouse.gov/OMB/ (last accessed 12/1/2003).
36. All statistics drawn from General Services Administration, Automated Data Processing Equipment in the U.S. Government: 1995 Summary (Washington, D.C.: U.S. General Services Administration, 1996).
37. General Accounting Office, Electronic Records: Clinton Administration’s Management of Executive Office of the President’s E-Mail System, GAO-01–446 (Washington, D.C.: U.S. General Accounting Office, April 2001); General Accounting Office, White House: Acquisition of Automated Resume Processing System, GAO/GGD-93–117 (Washington, D.C.: U.S. General Accounting Office, June 1993).
38. General Accounting Office, Data Mining: Federal Efforts Cover a Wide Range of Uses, GAO-04–548 (Washington, D.C.: U.S. General Accounting Office, May 2004): 2–4.
39. Each of the annual reports has a section on IT; see, for instance, U.S. General Services Administration, Annual Report, for each year from 1997 through 2005, which are available at the agency’s home page.
40. For a very early account of the effort started to fix Y2K issues, see Ellen Perlman, “The Crash of 2000,” Governing 9, no. 12 (September 1996): 22–26.
41. PowerPoint presentation, IBM Corporation, “Global Government Business and IT Assessment Study,” August 2002, in possession of the author.
42. General Accounting Office, Major Management Challenges and Program Risks: A Governmentwide Perspective, GAO-01–241 (Washington, D.C.: U.S. Government Printing Office, January 2001): 17.
43. Ibid., 18.
44. Government Accountability Office, CFO Act of 1990: Driving the Transformation of Federal Financial Management, GAO-06–242T (Washington, D.C.: U.S. Government Accountability Office, November 17, 2005): unpaginated first page. It noted, “There has been a clear cultural change in how financial management is viewed and carried out in the agencies and a recognition of the value and need for good financial management throughout government, which was not the case in 1990,” ibid.
45. Ibid.
46. Peter Hernon, Harold C. Relyea, Robert E. Dugan, and Joan F. Cheverie, United States Government Information: Policies and Sources (Westport, Conn.: Libraries Unlimited, 2002): 177.
47. Office of Management and Budget, Office of Information and Regulatory Affairs, “Report on Information Technology (IT) Spending for the Federal Government for Fiscal Years 2004, 2005, and 2006,” April 2005, http://www.whitehouse.gov/OMB (last accessed 5/15/2005).
48. Hernon, Relyea, Dugan, and Cheverie, United States Government Information: Policies and Sources, 177–198, 369–385; Peter Hernon, Charles R. McClure, and Harold C. Relyea, Federal Information Policies in the 1990s: Views and Perspectives (Norwood, N.J.: Ablex Publishing Corporation, 1996): 19–44, 89–109; Office of the Vice President, Creating a Government That Works Better and Costs Less (Washington, D.C.: U.S. Government Printing Office, 1993); on national trends, see Don Tapscott, The Digital Economy (New York: McGraw-Hill, 1995); I also commented on the topic at the time, in James W. Cortada, TQM for Information Systems Management: Quality Practices for Continuous Improvement (New York: McGraw-Hill, 1995).
49. Hernon, McClure, and Relyea, Federal Information Policies in the 1990s, 89.
50. William J. Clinton, “Remarks Announcing the National Performance Review, March 3, 1993,” in Public Papers of the President of the United States: William J. Clinton, Book I (Washington, D.C.: U.S. Government Printing Office, 1994): 233.
51. U.S. Government, National Information Infrastructure: Agenda for Action (Washington, D.C.: U.S. Government Printing Office, 1993).
52. For an excellent overview, see General Accounting Office, Information Superhighway: An Overview of Technology Challenges, GAO/AIMD-95–23 (Washington, D.C.: U.S. Government Printing Office, January 1995).
53. James W. Cortada, The Digital Hand: How Computers Changed the Work of American Financial, Telecommunications, Media, and Entertainment Industries (New York: Oxford University Press, 2006): 201–202, 206, 244–252.
54. Brian Kahin, “The U.S. National Information Infrastructure Initiative: The Market, the Net, and the Virtual Project,” in Brian Kahin and Ernest Wilson, eds., National Information Infrastructure Initiatives: Vision and Policy Design (Cambridge, Mass.: MIT Press, 1997): 182.
55. Ibid.
56. See Erik Brynjolfsson and Brian Kahin, eds., Understanding the Digital Economy: Data, Tools, and Research (Cambridge, Mass.: MIT Press, 2000); Alfred D. Chandler and James W. Cortada, eds., A Nation Transformed by Information: How Information Has Shaped the United States from Colonial Times to the Present (New York: Oxford University Press, 2000); Robert E. Litan and Alice M. Rivlin, eds., The Economic Payoff from the Internet Revolution (Washington, D.C.: Brookings Institution Press, 2001); James W. Cortada, Making the Information Society: Experience, Consequences, and Possibilities (Upper Saddle River, N.J.: Prentice Hall/PTR, 2002); Joseph E. Stiglitz, The Roaring Nineties (New York: W. W. Norton, 2003); Graham Tanaka, Digital Deflation: The Productivity Revolution and How It Will Ignite the Economy (New York: McGraw-Hill, 2004). For a useful governmental statement of Clinton’s Internet policies, see U.S. Government Working Group on Electronic Commerce, First Annual Report (Washington, D.C.: U.S. Government Printing Office, November 1998).
57. Office of Technology Assessment, Making Government Work: Electronic Delivery of Federal Services (Washington, D.C.: U.S. Government Printing Office, September 1993): 1–2.
58. Ibid., 2.
59. Hernon, McClure, and Relyea, Federal Information Policies in the 1990s, 105; Kahin, “U.S. National Information Infrastructure Initiative,” 150–189.
60. Office of Technology Assessment, Making Government Work, had laid out the case as early as 1993. Five years later, the same message was still being delivered: General Accounting Office, Executive Guide: Measuring Performance and Demonstrating Results of Information Technology Investments, GAO/AIMD-98–89 (Washington, D.C.: U.S. Government Printing Office, March 1998).
61. Jane E. Fountain and Carlos A. Osorio-Urzua, “Public Sector: Early Stage of a Deep Transformation,” in Litan and Rivlin, The Economic Payoff from the Internet Revolution, 245; according to the GAO, “The administration envisions the superhighway as a seamless web of communications networks, computers, databases, and consumer electronics—built, owned, and operated principally by the private sector—that will put vast amounts of information at users’ fingertips. It believes that the superhighway, if freed from the constraints imposed by rigid regulatory regimes, can fundamentally change the way we work, learn, get health care and public services, shop, communicate, and entertain ourselves,” GAO, Information Superhighway, 2.
62. Hernon, Relyea, Dugan, and Cheverie, United States Government Information, 369.
63. See, for example, General Accounting Office, Information Superhighway: Issues Affecting Development, GAO/RCED-94–285 (Washington, D.C.: U.S. Government Printing Office, September 1994).
64. Darrell M. West, Digital Government: Technology and Public Sector Performance (Princeton, N.J.: Princeton University Press, 2005): 1–21.
65. General Accounting Office, Internet and Electronic Dial-Up Bulletin Boards: Information Reported by Federal Organizations, GAO/GGD-97–86 (Washington, D.C.: U.S. Government Printing Office, June 1997) and Supplement: World Wide Web Sites Reported by Federal Organizations, GAO/GGD-97–86S (Washington, D.C.: U.S. Government Printing Office, June 1997).
66. GAO, Supplement, 5–15.
67. GAO, Measuring Performance and Demonstrating Results of Information Technology Investments.
68. West, Digital Government, 26–29.
69. General Accounting Office, Internet and Electronic Dial-Up Bulletin Boards: Information Reported by Federal Organizations, GAO/GGD-97–86 (Washington, D.C.: U.S. Government Printing Office, June 1997): 2.
70. For both the statistics and quote, ibid., 3.
71. West, Digital Government, 28–30.
72. Alan P. Balutis, “E-Government 2001, Part I: Understanding the Challenge and Evolving Strategies,” Public Manager 30, no. 1 (Spring 2001): 33.
73. Ibid., 33–37. For another source of similar data, see Hernon, Relyea, Dugan, and Cheverie, United States Government Information, 371–372.
74. Julianne G. Mahler and Priscilla M. Regan, Federal Intranet Work Sites: An Interim Assessment (Washington, D.C.: The PricewaterhouseCoopers Endowment for the Business of Government, June 2002): 9–12, which includes case studies on six organizations (the Departments of Transportation, HUD, Commerce, and Justice, and the EPA and GSA).
75. Government Accountability Office, Information Technology: Major Federal Networks That Support Homeland Security Functions, GAO-04–375 (Washington, D.C.: U.S. Government Printing Office, September 2004).
76. For how the federal government reacted to these issues, see West, Digital Government, which has numerous comments, passim.
77. Genie N. L. Stowers, The State of Federal Websites: The Pursuit of Excellence (Washington, D.C.: The PricewaterhouseCoopers Endowment for the Business of Government, August 2002): 17.
78. Ibid., 21, but see also 21–22 for a general description of the site. For another description of this excellent tool, see Peter Hernon, Robert E. Dugan, and John A. Shuler,
Notes to Pages 207–215 U.S. Government on the Web: Getting the Information You Need (Westport, Conn.: Libraries Unlimited, 2003): 57–58 79. Don Tapscott, E-Government in the 21st Century: Moving from Industrial to Digital Government (Toronto: New Paradigm Learning, 2004): 2. 80. Ibid., 3–4. 81. One of the largest, and most serious, sets of initiatives that were in trouble were those at the Federal Aviation Administration. For details, see Government Accountability Office, National Airspace System: FAA Has Made Progress but Continues to Face Challenges in Acquiring Major Air Traffic Control Systems, GAO-05–331 (Washington, D.C.: U.S. Government Printing Office, June 2005); Leslie Miller, “FAA Projects Years Late, Over Budget,” Associated Press, June 1, 2005, reprinted in Wisconsin State Journal, June 1, 2005. 82. See, for example, even the GAO’s annual report for 2005, 12–14, in which its subtitle portends of a new way of doing things, “Creating a Successful Future at GSA.” 83. Hernon, Dugan, Shuler, U.S. Government on the Web, 396, but see also their entire discussion about the status of e-government, 379–401. There is an excellent bibliography on the subject now available: Tony Carrizales, “E-Government: Recent Publications,” Public Performance and Management Review 28, no. 1 (September 2004): 130–139. 84. Hernon, Dugan, and Shuler, U.S. Government on the Web, for a review of these initiatives, 379–401. 85. Larry Greenemeier and Eric Chabrow, “Bush’s Budget Stresses Return on Investments in I.T.,” Informationweek, February 14, 2005, 18. 86. West, Digital Government, 2, 5–7. 87. For discussions of the role of retiring baby boomers, which is now recognized as the most important impending source of change facing governments around the world, see James W. Cortada, Sally Drayton, Marc Le Noir, and Richard Lomax, When Governments Seek Future Prosperity: Maintaining Economic Strength and High Standards of Living (Somers, N.Y.: IBM Corporation, 2005), which provides a global view of the issue; and more explicitly about the U.S. circumstance, see U.S. General Accounting Office, Report on Federal Employee Retirements (Washington, D.C.: U.S. Government Printing Office, April 2001) and U.S. Office of Personnel Management, “Retirement Statistics: Highlights and Trends,” tables 12–22, http://www.opm.gov/feddata/retire/highlights.asp (last accessed 3/9/2005). West provides a useful discussion about potential future policy implications and options, Digital Government, 180–184.
Chapter 7
1. Gartner, Inc., Trends in U.S. State and Local Governments: Market Trends (Stamford, Conn.: Gartner, Inc., March 19, 2002): 20–29.
2. Edward F. R. Hearle, “Information Systems in State and Local Governments,” in Carlos A. Cuadra and Anne W. Lucas, eds., Annual Review of Information Science and Technology, 5 (Chicago: Encyclopedia Britannica, 1970): 325–349; Sharon L. Caudle et al., Managing Information Technology: New Directions in State Government (Syracuse, N.Y.: School of Information Studies, Syracuse University, 1989).
3. Patricia D. Fletcher and Deborah Otis Foy, “Managing Information Systems in State and Local Government,” Annual Review of Information Science and Technology 29 (1994): 249–250.
4. James W. Cortada, Before the Computer: IBM, NCR, Burroughs, and Remington Rand and the Industry They Created, 1865–1956 (Princeton, N.J.: Princeton University Press, 1993): 128–137, 299–300.
5. Automation Consultants, Inc., Office Automation Applications (New York: Automation Consultants, Inc., undated, circa late 1950s–early 1960s), CBI 55, “Market Reports,” Box 70, Folder 2, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis.
6. “Employment Insurance Accounting on the IBM 702,” CBI 55, “Market Reports,” Box 70, Folder 3, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis.
7. In “California: Government by Automation,” Business Machines 39, no. 14 (August 30, 1956): 1, IBM Archives, Somers, N.Y.
8. “The Advanced States,” Business Machines (Christmas 1958): 6, IBM Archives, Somers, N.Y.; R. Hunt Brown, Office Automation Government (New York: Automation Consultants, Inc., undated, circa early 1960s), unpaginated but see preface and early sections, CBI 55, Market and Product Reports Collection, Box 70, Folder 16, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis.
9. “The Tallahassee Story,” Business Machines (November 1961): 4–10, IBM Archives, Somers, N.Y.
10. Harry H. Fite, “Administrative Evolution in ADP in State Government,” Public Administration Review 21, no. 1 (winter 1961): 2.
11. Ibid., 4.
12. John Diebold and Associates, Inc., Automatic Data Processing Service Newsletter, October 31, 1960; Dennis G. Price and Dennis E. Mulvihill, “The Present and Future Use of Computers in State Government,” Public Administration Review 25, no. 2 (June 1965): 142–143. Diebold was one of the most influential consultants in computing in the 1950s, 1960s, and 1970s. The historical records of his firm, The Diebold Group, covering the period 1957–1990, comprise some 1,000 reports and constitute a major collection on the early uses of computing, housed at the Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis, CBI 178, Client Reports, 28 boxes.
13. Price and Mulvihill, “The Present and Future Use of Computers in State Government,” 144. The data on state computers do not include use of computers by state universities.
14. Ibid., 142–150.
15. Caleb B. Laning, “Forces and Trends in State and Local Government EDP,” Public Administration Review 25, no. 2 (June 1965): 151.
16. H. G. Berkman, “The Economics of Automatic Data Processing in Public Administration in the U.S.A.,” in A. B. Frielink, ed., Economics of Automatic Data Processing (Amsterdam: North-Holland Publishing, 1965): 311–339; Dennis G. Price, “Automation in State and Local Governments,” Datamation (March 1967): 22–25; William E. Greiner, “State and Local Government Systems,” in Data Processing, XII: Proceedings 1967 (Boston: Data Processing Management Association, 1967): 341–345; R. E. Montijo, Jr., “California DMV Goes On-Line,” Datamation (May 1967): 31–34, 36; “Driver Data on Display,” Data Processor 10, no. 3 (September 1967): 27, IBM Archives, Somers, N.Y.; Harry H. Fite, The Computer Challenge to Urban Planners and State Administrators (Washington, D.C.: Spartan Books, 1965): 77–88; “Site Selection for the Show-Me State,” Data Processor 11, no. 4 (August 1968): 20; “Helping the Disadvantaged,” Data Processor 12, no. 2 (April 1969): 26; “What’s New,” Data Processor 12, no. 3 (July 1969): 21; “A Pike’s Peak of Paper,” Data Processor 12, no. 6 (September 1969): 9–10, all issues of this magazine in IBM Archives, Somers, N.Y.
17. Herbert H. Isaacs, “User-Oriented Information Systems for State and Local Government,” in Geoffrey Y. Cornog, James B. Kenney, Ellois Scott, and John J. Connelly, eds., EDP Systems in Public Management (Chicago: Rand McNally & Company, 1968): 51–68; Geoffrey Y.
Cornog, “Change, Management, and Electronic Data Processing in State and Local Government,” ibid., 3–17.
18. By this time software products included applications (such as engineering tools, CAD/CAM, payroll), utilities (such as for managing files and telecommunications), and database management (for file management so that records could be managed independently of application software and be available to multiple applications). Martin Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry (Cambridge, Mass.: MIT Press, 2003): 121–162.
19. W. Ray Pitt, “Public Gaming: Tough New Market for IBM,” Think (May 1972): 32–34; Peter Hillyer, “He’s ‘Mr. IBM’ to the State of Georgia,” Think (September 1972): 40–41; “Moving Ahead with Multiprocessing,” Data Processor 16, no. 2 (May 1973): 4–6; “Assessing Resources,” Data Processor 20, no. 2 (March 1977): 15–16; “Online for the First Time,” Data Processor 21, no. 4 (September 1978): 10–11; “Triple Threat Welfare System,” Data Processor 23, no. 1 (February–March 1980): 10–12, all IBM Archives, Somers, N.Y. For detailed literature surveys, see Edward F. R. Hearle, “Information Systems in State and Local Government,” in Carlos A. Cuadra and Ann W. Luke, eds., Annual Review of Information Science and Technology, vol. 5, 1970 (Chicago: Encyclopedia Britannica, Inc., 1970): 325–349; and Patricia D. Fletcher and Deborah Otis Foy, “Managing Information Systems in State and Local Government,” ibid., vol. 29, 1994 (Medford, N.J.: Learned Information, Inc., 1994): 243–275.
20. Kenneth L. Kraemer, William H. Mitchell, Myron E. Weiner, and O. E. Dial, Integrated Municipal Information Systems: The Use of the Computer in Local Government (New York: Praeger, 1974) focuses on urban experience, but also see Kenneth L. Kraemer, John Leslie King, Debora E. Dunkle, and Joseph P. Lane, Managing Information Systems: Change and Control in Organizational Computing (San Francisco: Jossey-Bass Publishers, 1989): 1–32; Thomas R. Davies and William M. Hale, “Implementing a Policy and Planning Process for Managing State Use of Information Technology Resources,” Public Administration Review 46, Special Issue (November 1986): 516–521; James M. Tien and James A. McClure, “Enhancing the Effectiveness of Computers in Public Organizations through Appropriate Use of Technology,” ibid., 553–562.
21. Robert P. McGowan and Gary A. Lombardo, “Decision Support Systems in State Government: Promises and Pitfalls,” Public Administration Review 46, Special Issue (November 1986): 579–583.
22. Sharon L. Caudle, “Managing Information Resources in State Government,” Public Administration Review 50, no. 5 (September–October 1990): 515.
23. Ibid., 516–517.
24. Ibid., 522–523.
25. Your author has prepared and managed budgets for small organizations and departments but also for very large departments with thousands of employees and budgets of several billion dollars. The larger the budget, the more detailed the chart of accounts; this, along with other elements, such as multiple locations and the requirement to prepare and manage budgets within that of the parent organization as a whole, increases complexity and the amount of time and effort involved. With the size of organizations that often exist in the federal government, planning this year’s and next year’s budgets becomes a twelve-month ongoing exercise in addition to tracking expenditures (accounting). In small state agencies, the latter is not as complicated and is less time consuming, while planning functions are seasonal, although by the 1980s they were usually conducted twice annually.
26. For an explanation of this point, see M. Glenn Newkirk, “Trends in State Legislative Information Technology,” Government Information Quarterly 8, no. 3 (1991): 263–264.
27. “IBM System to Hold, Retrieve 20,000 New York Statutes,” IBM News, August 15, 1964, 3, IBM Archives, Somers, N.Y.
28. “EDP Seeks and Finds Iowa Laws,” Data Processor (March 1967): 21, IBM Archives, Somers, N.Y.
29. Charles N. Barnard, “Computers: A Lift for Lawmakers,” Think (September 1972): 34–37; Peter Hillyer, “An Extra Measure of Satisfaction,” Think (March 1972): 46–47.
30. Signe E. Larson and Martha E. Williams, “Computer Assisted Legal Research,” in Martha E. Williams, ed., ASIS Annual Review of Information Science and Technology 15 (1980): 266–268; Linda Schulte, “A Survey of Computerized Legislative Information Systems,” Law Library Journal 72, no. 1 (winter 1979): 99–129; James S. Elkins, Jr., “The Use of Electronic Data Processing by State Legislatures: An Overview,” October 28, 1975, a paper prepared for the American Society for Information Science; Jean Paul Emard and Jane Bortnick, “An Overview of Computerized Legal Information Systems—An Update,” Law and Computer Technology 10, no. 1 (1977): 2–16; Robert L. Chartrand, “Information Science in the Legislative Process,” in Williams, Annual Review of Information Science and Technology 11 (1976): 299–344, and with Jane Bortnick, “State Legislatures and Information Technology,” Law and Computer Technology 11, no. 2 (1978): 39–59.
31. Newkirk, “Trends in State Legislative Information Technology,” 260; Chartrand and Bortnick, “An Overview of Computerized Legal Information Systems—An Update,” 2–16.
32. Newkirk, “Trends in State Legislative Information Technology,” 263.
33. IBM Corporation, Displaywriters Support Alaska Legislative System (White Plains, N.Y.: IBM Corporation, 1983), in “Application Briefs,” Box 250, Folder 27, IBM Archives, Somers, N.Y.; Robert Lee Chartrand, “Information Technology in the Legislative Process: 1976–1985,” in Williams, Annual Review of Information Science and Technology 21 (1986): 203–239.
34. For an example, see IBM Corporation, IBM Networking Helps Florida’s House of Representatives Meet the Challenge of Government Communications (White Plains, N.Y.: IBM Corporation, 1990), in “Application Briefs,” Box 253, Folder 26, IBM Archives, Somers, N.Y.
35. Newkirk, “Trends in State Legislative Information Technology,” 265–267.
36. Ibid., 269.
37. Larson and Williams, “Computer Assisted Legal Research,” 252–286; Chartrand, “Information Technology in the Legislative Process: 1976–1985,” 203–239, and his earlier study, “Information Science in the Legislative Process,” Williams, Annual Review of Information Science and Technology 11 (1976): 299–344—both are key sources on the bibliography of the period; Thomas G. Meenan and Charles R. Wyman, “Information Systems in the United States Senate: An Overview of Current and Projected Applications,” Government Information Quarterly 8, no. 3 (1991): 273–283; General Accounting Office, Program Evaluation: Improving the Flow of Information to the Congress, GAO/PEMD-95–1 (Washington, D.C.: U.S. Government Printing Office, January 1995).
38. M. J. Richter, “Imaging,” Governing (April 1993): 48.
39. “Desktop Technology in Government,” Governing (October 1996): 81–90.
40. John Cranford, “A Guide to Award-Winning Technology,” Governing (January 1995): 61.
41. “Another California Techno-Flop?” Governing (July 1997): 50; Ellen Perlman, “Technotrouble,” Governing (September 1998): 21–23; Robert Anderson, Tora K. Bikson, Rosalind Lewis, Joy Moini, and Susan Straus, Effective Use of Technology: Lessons about State Governance Structures and Processes (Santa Monica, Calif.: RAND, 2003); and on projects gone bad in that period, Rob Gurwitt, “Overload,” Governing (October 1995): 17–22.
42. Gurwitt, “Overload,” 19.
43. Marilyn J. Cohodas, “A Guide to Award-Winning Technology,” Governing (January 1996): 43, 46, 48, 52.
44. Diane Kittower, “The GIS Toolbox,” Governing (October 1998): 66, 68, and also by Kittower, “In Search of Solutions,” Governing (September 1998): 51–52, 54, 56, 58–59.
45. Governing, Sourcebook 1999, 46; published each year with extensive data on expenditures by type and state.
46. The Progress & Freedom Foundation, The Digital State, 1998, http://www.pff.org/issues-pubs/books/digitalstate1998.pdf, 5, also first edition published in 1997 (last accessed 7/16/2005); Ellen Perlman, “The Electronic Decision-Maker,” Governing (July 2000): 66–69, also available at http://www.Governing.com; M. J. Richter, “Toward Open Systems: One State’s Ambitious Plan,” Governing (June 1992): 81.
47. Gartner, Trends in U.S. State and Local Governments, 20–23; Kent Lassman, The Digital State 2002: How State Governments Use Digital Technologies (Washington, D.C.: Progress and Freedom Foundation, November 2002); M. Jae Moon, From E-Government to M-Government? Emerging Practices in the Use of Mobile Technology by State Governments (Washington, D.C.: IBM Center for the Business of Government, November 2004): 8; Kenneth Kraemer and Jason Dedrick, “The Payoffs from Investment in Information Technology: Findings from Asia-Pacific Countries,” World Development 22, no. 12 (1994): 1921–1931, and their “Computing and Organizations,” Journal of Public Administration Research and Theory 7, no. 1 (1997): 89–112.
48. The literature is vast, but for an introduction to the issue within the digital context, see Anthony G. Wilhelm, Digital Nation: Toward an Inclusive Information Society (Cambridge, Mass.: MIT Press, 2004) for the broader themes in society; Steve Davis, Larry Elin, and Grant Reeher, Click on Democracy: The Internet’s Power to Change Political Apathy into Civic Action (Boulder, Colo.: Westview, 2002); Graeme Browning, Electronic Democracy: Using the Internet to Affect American Politics (Wilton, Conn.: Pemberton Press Books/Online, Inc., 1996); and perhaps the best of these surveys, Lawrence K. Grossman, The Electronic Republic: Reshaping Democracy in the Information Age (New York: Penguin, 1995).
49. Darrell M. West, “E-Government and the Transformation of Service Delivery and Citizen Attitudes,” Public Administration Review 64, no. 1 (January–February 2004): 17.
50. Ibid., 15–27.
51. Cranford, “A Guide to Award-Winning Technology,” 65–66, 68–69.
52. Larry Stevens, “Bringing Government to the People,” Governing (October 1995): 67, 69–70, 72, 75–76; Christopher Swope, “A Guide to Bridging the Data Divide,” Governing (April 1997): 53–57.
53. Marilyn J. Cohodas, “Government and the Web Frontier,” Governing (January 1998): 42, 44–46.
54. Ellen Perlman, “Guide to Web Innovation: The World Wide Workhorse,” Governing (April 1999): 54, 56, 58, 60.
55. Ellen Perlman, “Managing Technology: Leadership’s Challenge,” Governing (June 1999): 64, 66, 68, 70; “Waiting for E-Com,” Governing (April 2000): 51.
56. Jane E. Fountain and Carlos A. Osorio-Urzua, “Public Sector: Early Stages of a Deep Transformation,” in Robert E. Litan and Alice M. Rivlin, eds., The Economic Payoff from the Internet Revolution (Washington, D.C.: Brookings Institution Press, 2001): 235–268; Jane E. Fountain, Building the Virtual State: Information Technology and Institutional Change (Washington, D.C.: Brookings Institution Press, 2001).
57. For examples, The Progress and Freedom Foundation has annually published a survey, The Digital State, beginning in 1997, available at its Web site, http://www.pff.org/issues-pubs/books/digitalstate(Add year desired).pdf (last accessed 7/15/2005), and at the same site, various reports based on these surveys, often written by Kent Lassman and others; “Technology WebWatch,” Governing (April 2002): 72, 99–100, 102, 104, 106, 108, 110–111, published each year since the late 1990s.
58. Darby Patterson, “State of the Digital State: Part II,” August 2001, http://www.govtech.net/magazine/story.php?id=5621&issue=8:2001 (last accessed 7/11/2005).
59. Ellen Perlman, “Thinking Big,” Governing (August 2001): 34.
60. West, “E-Government and the Transformation of Service Delivery and Citizen Attitudes,” 20; Kent Lassman, The Digital State 2002: How State Governments Use Digital Technologies (Washington, D.C.: Progress and Freedom Foundation, November 2002).
61. Multiple articles in Governing (September 2003).
62. “State Technology Spending,” Governing, Sourcebook 2003, 103–110; Eric Chabrow and Marianne Kolbasuk McGee, “Dire States,” InformationWeek, no. 926 (February 10, 2003): 32–34, 38.
63. Center for Digital Government, Digital States Survey: Digital States and the Second Generation of Digital Government, 7, http://media.centerdigitalgov.com/reg2view/2004Digital (last accessed 4/23/2005). The way organizations knew how many people visited their Web sites was made possible by software that automatically tracked the number of times people accessed a Web site, and even which parts of the site they visited. In addition, network managers could track the traffic to various addresses, and organizations began tracking the uses of specific tools and information on their Web sites, again using readily available software. A brief illustrative sketch of this kind of tallying follows note 70 below.
64. Molly Singer, “How Can Managers Integrate Technology Issues?” Public Management (March 2003): 7; Elena Larsen and Lee Rainie, The Rise of the E-Citizen: How People Use Government Agencies’ Web Sites (Washington, D.C.: Pew Internet and American Life Project, April 2002), http://www.pewinternet.org. This project began in the 1990s and has continued right into the new century, generating dozens of surveys on the use of the Internet in the United States.
65. James W. Cortada, Making the Information Society: Experience, Consequences, and Possibilities (Upper Saddle River, N.J.: Financial Times/Prentice Hall, 2002): 340–358; Grossman, The Electronic Republic, 9–142.
66. Andrew Kakabadse, Nada K. Kakabadse, and Alexander Kouzmin, “Reinventing the Democratic Governance Project through Information Technology? A Growing Agenda for Debate,” Public Administration Review 63, no. 1 (January/February 2003): 47.
67. Ibid., 54; see also the bibliography that is part of this article, 57–60.
68. Christopher Swope, “E-Gov’s New Gear,” Governing (March 2004): 40–42; Marc Holzer, James Melitski, Seung-Yong Rho, and Richard Schwester, Restoring Trust in Government: The Potential of Digital Citizen Participation (Washington, D.C.: IBM Center for the Business of Government, August 2004); F. Christopher Arterton, Can Technology Protect Democracy? (Newbury Park, Calif.: SAGE, 1987); K. K. Guthrie and W. H. Dutton, “The Politics of Citizen Access Technology,” Policy Studies Journal 20 (1992): 574–597; William E. Hudson, American Democracy in Peril: Seven Challenges to America’s Future (New York: Chatham House Publishers, 2001).
69. “Texas Election Bureau Speeds Ballot Counting,” Business Machines, October 20, 1954, 7; “How IBM Counted Your Vote,” Business Machines, November 20, 1956, 5–7; “Voting Device Idea, Born over 40 Years, Now a Product,” IBM News, April 20, 1965, 8; “Voting System Uses Punched Card as Ballot,” IBM News, October 25, 1967, 4, all four at IBM Archives, Somers, N.Y.; Automation Consultants, Inc., “Los Angeles Area Counts on Electronic Vote Tallying,” circa 1957, CBI 55, “Market Reports,” Box 70, Folder 3 and also in CBI 55, “Case Studies,” Box 70, Folder 17, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis; Robert L. Patrick and Aubrey Dahl, “Voting Systems,” Datamation 16, no. 5 (May 1970): 81–82.
70. Lara Jakes, “No More Dimpled Chads, Bipartisan Lawmakers Propose,” Hearst Newspapers, January 31, 2001, published in Wisconsin State Journal, January 31, 2001, A3.
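The automatic visit counting described in note 63 amounted to tallying entries in a Web server’s access log. The following minimal Python sketch shows that kind of tally; the log format, names, and sample data here are illustrative assumptions, not the workings of any product cited above:

    from collections import Counter

    def count_page_hits(log_lines):
        # Tally requests per path from simplified log lines such as "GET /index.html".
        hits = Counter()
        for line in log_lines:
            parts = line.split()
            if len(parts) >= 2 and parts[0] == "GET":
                hits[parts[1]] += 1
        return hits

    sample = ["GET /index.html", "GET /permits/form.html", "GET /index.html"]
    print(count_page_hits(sample).most_common())  # most-visited pages first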
71. “Arizona Wins Top Ranking in National 2002 Digital State Survey,” News Release, Progress and Freedom Foundation, November 4, 2002, http://www.pff.org/pr/pr110402DigitalState2002.htm (last accessed 12/23/2002).
72. For an excellent contemporary overview of online voting, see Joseph Butcher et al., Digital Democracy: Voting in the Information Age (Cambridge, Mass.: Center for Information Policy Research, Harvard University, October 2002), http://www.pirp.harvard.edu (last accessed 12/05/2002).
73. Jim Drinkard, “Remember Chads? They’ve Hung Around,” USA Today, July 13, 2004, 1A, 4A–5A; Government Accountability Office, Elections: Electronic Voting Offers Opportunities and Presents Challenges, GAO-04–975T (Washington, D.C.: U.S. Government Printing Office, July 20, 2004); Anya Sostek, “No Soft Touch,” Governing (May 2004): 36–38.
74. Institute for Politics, Democracy and the Internet, “Characteristics of 1998 Campaign Web Sites,” undated (circa 1998), http://www.ipdi.org/sites98html (last accessed 12/10/2002).
75. Institute for Politics, Democracy and the Internet, Online Campaigning 2002: A Primer (Washington, D.C.: Graduate School of Political Management, George Washington University, 2002).
76. Institute for Politics, Democracy and the Internet, The Virtual Trail: Political Journalism on the Internet (Washington, D.C.: Graduate School of Political Management, George Washington University, 2002); various stories in Time, October 18, 2004.
77. “Los Angeles County Orders Datamatic 1000,” CBI 55, “Market Reports,” Box 70, Folder 3 and another copy in Folder 17, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis.
78. Harry H. Fite, The Computer Challenge to Urban Planners and State Administrators (Washington, D.C.: Spartan Books, 1965): 5.
79. “EDP: Catalyst in the County,” Data Processor 9, no. 5 (December 1966): 3–7, IBM Archives, Somers, N.Y.; Gordon Milliman, “Alameda County’s ‘People Information System,’ ” Datamation (March 1967): 28–31; “Civic Computers,” Data Processor 11, no. 5 (November 1968): 28, IBM Archives, Somers, N.Y.
80. “In County Government,” Data Processor 14, no. 5 (December 1971): 22–23 and “Hennepin County,” Data Processor 20, no. 5 (September 1977): 14–18, both in IBM Archives, Somers, N.Y.
81. K. L. Kraemer, J. N. Danziger, and J. L. King, “Local Government and Information Technology in the United States,” in Organization for Economic Co-Operation and Development, Local Government and Information Technology (Paris: Organization for Economic Co-operation and Development [OECD], 1978): 186–237.
82. Ibid., 207, 244; see also Rob Kling and Kenneth L. Kraemer, “Computing and Urban Services,” in James N. Danziger, William H. Dutton, Rob Kling, and Kenneth L. Kraemer, eds., Computers and Politics: High Technology in American Local Governments (New York: Columbia University Press, 1982): 195–275; also see their extensive bibliography; John Leslie King, “Local Government Use of Information Technology: The Next Decade,” Public Administration Review 42, no. 1 (January–February 1982): 25–36.
83. Donald F. Norris, “Computers and Small Local Governments: Uses and Users,” Public Administration Review 44, no. 1 (January–February 1984): 70–78; Donald F. Norris to James W. Cortada, e-mail, January 18, 2006.
84. Kenneth L. Kraemer and John Leslie King, “Computing and Public Organizations,” Public Administration Review, Special Issue (1986): 488–496.
85. John Martin, “The Computer Is an Expert,” Governing (July 1991): 25–27; Costis Toregas and Taly Walsh, “Out with the Old, in with Re-engineering,” American City and County 108, no. 5 (May 1993): 49–50, 52, 54–56; IBM, Iowa County Rightsizes,
Dramatically Increased Efficiencies (White Plains, N.Y.: IBM Corporation, 1993), Box 256, Folder 19, IBM Archives, Somers, N.Y.
86. “County Countdown to the Digital Era,” Governing (September 2000): 56.
87. Ellen Perlman, “The IT Czar of Main Street,” Governing (January 2001): 31–33.
88. “Nation’s First Digital Counties Survey Debuts,” Press Release, June 27, 2003, Government Technology, http://www.govtech.net/news/news.php?id=57813 (last accessed 7/11/2003).
89. Ibid.
90. Center for Digital Government, 2004 Digital Counties Survey (Washington, D.C.: Center for Digital Government, 2005).
91. Ibid., 8.
92. Holden, Norris, and Fletcher, “Electronic Government at the Local Level,” 325–344.
93. Ibid., 339.
94. Ibid., 341.
95. U.S. Bureau of the Census, Historical Statistics of the United States: Colonial Times to 1970 (Washington, D.C.: U.S. Government Printing Office, 1975): Part 1, 11; U.S. Bureau of the Census, Statistical Abstract of the United States: 2002 (Washington, D.C.: U.S. Government Printing Office, 2002): 861.
96. The bibliographic essay at the back of this volume discusses this body of literature.
97. Frequently described by accountants and line management in local government and by industry watchers; see, for example, Robert E. Price, “Light and Water Charges on Post Card Bill,” American City 65, no. 9 (September 1950): 94–95; “Corrective Steps by Industry May Save Costly Municipal Regulation,” American City 65, no. 11 (November 1950): 134; W. A. Woodward, “Putting the Mostest Data on the Leastest Bill,” American City 66, no. 5 (May 1951): 115; Leon B. Miner, “New Billing System Collects More Taxes,” American City 66, no. 6 (June 1951): 91; Howard W. Flesche, “Volume Growth Is No Problem,” American City 66, no. 11 (November 1951): 104–105; J. Omer Laplante, “Twice the Billing in Half the Time,” American City 68, no. 1 (January 1953): 104–105; Stanley C. White, “61,200 Water Bills Punched in Rochester, N.Y.,” American City 69, no. 2 (February 1954): 155; Francis H. Connors, “Utility Billing and General Accounting in Arcade,” American City 70, no. 5 (May 1955): 131, 171; James J. Mima, “Inglewood, Calif., Now Bills Water by Machine,” American City 72, no. 1 (January 1957): 92.
98. John L. Williams, Jr., “Budgetary Control,” American City 65, no. 11 (November 1950): 90–91; R. T. Nichol, “Machine-Written Central Payroll,” American City 66, no. 2 (February 1951): 90–91; Richard A. Lion, “New Departure in Municipal Machine Accounting,” American City 67, no. 2 (June 1952): 112–113; W. B. Avery, “Machine Prepares City’s Payroll in Four Hours Instead of Five Days,” American City 67, no. 7 (July 1952): 112–113; William C. Rover, “Modern Machine Accounting,” American City 68, no. 8 (August 1953): 106, 109; Chester Kowal, “Buffalo Benefits from Mechanized Accounting Procedures,” American City 69, no. 7 (July 1954): 101–102; Cecil L. Marler, “Pomona, Calif., Modernizes Accounting System,” American City 69, no. 8 (August 1954): 112; Carmen T. Foritano, “Machine Accounting in Arlington, Mass.,” American City 70, no. 1 (January 1955): 130; James D. Williams, “Payroll and Budget Mechanized in Portsmouth, Ohio,” American City 71, no. 12 (December 1956): 147.
99. “New York City Budget Bureau Conducts Office Machine Demonstrations,” American City 68, no. 5 (May 1953): 128–129; John M. Murtagh, “New York City Treasury Is Only Beneficiary of Traffic Violations,” American City 68, no. 10 (August 1953): 108–109; “Computer Aids Lighting Repairs,” American City 77, no. 9 (September 1962): 101.
100. James W. Jardine, “Accounting Mechanization of Chicago’s Department of Water and Sewers,” American City 74, no. 5 (May 1959): 126–127.
Notes to Pages 237–239 101. “Automation in the City Hall,” Public Management 41, no. 11 (November 1959): 252. 102. Trade press of the 1950s carried articles on examples in almost every issue extending to about 1964. For examples from the primary publication of local government, see John M. Coughlin, “Punching Control into Worcester Salaries,” American City 72, no. 7 (July 1957): 117–118; “Dallas Punches Better Statistics Four Times Faster,” American City 73, no. 1 (January 1958): 96–97; A. L. Wiener, “Multiple Utility Billing Saves,” American City 74, no. 3 (March 1959): 111–112; Bernard Newman, “Punches Push Norwalk, Conn. Four Months Up to Date,” American City 75, no. 2 (February 1960): 102–103; W. H. Nussbaum, “Business Machines Make Water Works Businesslike,” American City 76, no. 9 (September 1961): 165, 167, 169; William M. Perkins, “Machine-Accounting Benefits,” American City 78, no. 6 (June 1963): 134–135. Sales of noncomputer IT remained strong all through the 1950s across many industries; James W. Cortada, Before the Computer: IBM, NCR, Burroughs, and Remington Rand and the Industry They Created, 1865–1956 (Princeton, N.J.: Princeton University Press, 1993): 251–262. 103. Lawrence E. Gerosa, “New York City’s Payroll Turned Out by Univac,” American City 72, no. 1 ( January 1957): 104–105, and to the best of my ability I believe the first article about computers in American City was “Univac Takes a Daily Inventory,” American City 72, no. 6 ( June 1957): 157–158. 104. Norman S. Jones, “Machines That Capture Original Data,” American City 72, no. 6 (June 1957): 160. 105. “Cities Are Discovering a New Office Technology,” American City 72, no. 6 (June 1957): 162. 106. Joseph P. Lally, “Boston Speeds Accounting with Electronic Computer,” American City 76, no. 5 (May 1961): 104; for examples of the commentary, see A. W. Hatch, “Should City Management Consider Electronic Data Processing?” American City Ibid. 75, no. 7 ( July 1961): 161, 163, 165; Clyde L. Palmer, “The Computer Is a Super Slide Rule,” American City 78, no. 1 ( January 1963): 88–89; Jack L. Guggino, “A Computer That Earns a Profit,” American City 79, no. 3 (March 1964): 102–103; E. F. R. Hearle, “Data Processing’s Role in City Government,” American City 79, no. 5 (May 1964): 140, 143; C. M. Conway, “A Computer That Pays Its Way,” American City 79, no. 9 (September 1964): 102–103. 107. Anthony C. Medin, “Cities Share Information Storage Techniques,” Public Management 47, no. 4 (April 1965): 74. 108. Berkman, “The Economics of Automatic Data Processing in Public Administration in the U.S.A.,” 318. 109. F. F. E. McGuire, “First Computer Center Billing,” American City 80, no. 5 (May 1965): 102–103. 110. G. A. Wechsler, “Computerize Your Purchases and Save,” American City 80, no. 11 (November 1965): 86. 111. “San Jose Cuts Traffic Knot,” Data Processor 9, no. 5 (December 1966): 26, IBM Archives, Somers, N.Y. 112. William H. Mitchel, “Tooling Computers to the Medium-Sized City,” Public Management 49, no. 1 (March 1967): 63. 113. Ibid. 114. For the entire second half of the twentieth century, industry magazines published more articles on accounting applications than for any other use of the digital hand and none more so than American City. Rather than provide citations for so many articles, for general descriptions of applications, see Myron E. 
Weiner, An Integrated Data System for Small-Medium Sized Local Governments (Storrs, Conn.: Institute of Public Service, University of Connecticut and International City Managers' Association, January 1966): 6–9, 226; James M. Banovetz, Managing the Modern City (Washington, D.C.: International City Management Association, 1971): 220–224.
115. On the telephone experience, see Robert D. Putnam, Bowling Alone: The Collapse and Revival of American Community (New York: Simon & Schuster, 2000): 166–169; Claude Fischer, America Calling: A Social History of the Telephone to 1940 (Berkeley: University of California Press, 1992): 82. 116. William B. Harvey, "Don't Let Size Keep Your Waterworks from Using EDP," American City 84, no. 12 (December 1969): 95–96, 114; Dewey Mann, "Data Transmission Network Speeds Customer Service," American City 85, no. 2 (February 1970): 98–99; Robert Berryhill, "600 Meters Per Day," American City 85, no. 9 (September 1970): 107–108; Donald K. Price, "Small Size Is No Excuse," American City 86, no. 5 (May 1971): 60–62. 117. James W. Jardine, "Computer Assures Pinpoint Water Control," American City 86, no. 9 (September 1971): 82–83; G. R. Horstjotte, Jr., and David G. Niles, "Computer Controls Put 'Waste' Water to Good Use," American City 90, no. 10 (October 1975): 55–56; William D. Lam, "Computers Put Efficiency in Refuse Collection," American City 87, no. 9 (September 1972): 151–152; W. S. Van Natta, "Computerized Reports Improve Sewer Maintenance," American City 89, no. 6 (June 1974): 81–83; "Automation Is Making Waves in Wastewater Management," American City 93, no. 5 (May 1979): 79–81; Robert A. Davis and James L. Daugherty, "Computerized Means Cost-Effective for Sewage Treatment," American City 97, no. 3 (March 1982): 39–44; Rice and Anderson, Electronic Data Processing in Local Government, 43–44. 118. Fred MacFarlane and Bob Kingston, "One-Step Computer Card Keeps Car Pool under Control," American City 95, no. 2 (February 1980): 83–84; "Computer System Gets Vermont Fleet under Control," American City 109, no. 1 (January 1994): 30. 119. "Computer Helps Sharpen Engineering Pencil," American City 96, no. 8 (August 1981): 16; Henry E. Mecredy, Jr., "Goodby T-Square; Hello CRT," American City 95, no. 12 (December 1980): 25–27; John Susskind, "Computer-Aided Drafting Solves Revision Problems," American City 98, no. 11 (November 1983): 54, 56; John Whitman and Libby Clapp, "Micros: Computers in Public Works," American City 100, no. 7 (July 1985): 66, 70, 72, 74, 77. 120. Joe Morris, "Using Computers in Public Works," American City 101, no. 10 (October 1986): 44, 46, 48, 50, 54. 121. Many of the articles published in American City mentioned the resistance, including almost all dealing with public works uses in the 1970s. 122. Bill Gates, The Road Ahead (New York: Viking, 1995): 14. 123. "San Jose Cuts Traffic Knot," 26; "A Computerized Traffic-Control System for Any City," American City 81, no. 7 (July 1966): 100–102; Myron L. Bacon, Jr., "A State Computer Analyzes City Traffic," American City 79, no. 12 (December 1974): 84–85; Clinton A. Venable, "Low-Cost Computer Halves Driving Time," American City 85, no. 3 (March 1971): 105–106; Gary M. Chamberlain, "Improve Your Traffic Flow," American City 89, no. 8 (August 1974): 38–40; but see also Charles W. Rice, Jr., and Jeffrey L. Anderson, Electronic Data Processing in Local Government (Moscow: University of Idaho, October 1968): 42–43; "Computer Controlled Traffic System," Computers and Automation (January 1966): 40. 124. Michael Goodchild, "Geographic Information System (GIS)," in Anthony Ralston, Edwin D. Reilly, and David Hemmendinger, eds., Encyclopedia of Computer Science, 4th ed. (London: Nature Publishing Group, 2000): 748. 125. P. Croswell and S.
Clark, "Trends in Geographic Information Systems Hardware," Photogrammetric Engineering and Remote Sensing 54, no. 11 (1988): 1571–1576; Greg Michael Saxe, "Analysis of the Adoption of Geographic Information Systems in the Local Planning Process" (Ph.D. dissertation, University of Arizona, 1996). 126. The history of GIS has received considerable attention when compared to other public sector uses of IT. See the special issue of American Cartographer 15, no. 3 (1988),
entitled "The Development of GIS Technology"; Y. C. Lee and G. Y. Zhang, "Developments in Geographic Information System Technology," ASCE Journal of Surveying Engineering no. 115 (1989): 304–323; Timothy W. Foresman, ed., The History of GIS (Upper Saddle River, N.J.: Prentice Hall PTR, 1998). 127. Joe Morris, "Computer-Mapping the Infrastructure," American City & County 102, no. 4 (April 1987): 50. 128. Ginger Juhl, "GIS Technology Coming of Age," American City & County 104, no. 4 (April 1989): 50–53; Chuck Kindleberger, "Tomorrow's GIS," American City & County 107, no. 4 (April 1992): 38, 40, 42–50; Boyce Thompson, "The Dazzling Benefits (and Hidden Costs) of Computerized Mapping," Governing 3, no. 3 (December 1989): 40–41, 43–46. 129. Lyna L. Wiggins, "Diffusion and Use of Geographic Information Systems in Public Sector Agencies in the United States," in Ian Masser and Harlan J. Onsrud, eds., Diffusion and Use of Geographic Information Technologies (Dordrecht: Kluwer Academic Publishers, 1993): 154. 130. Roger Petzold, "Yielding the Benefits of GIS," American City & County 109, no. 3 (March 1994): 56, 58, 60–61, 63; Timothy McCormick, "GIS Is No Pipe Dream for Stormwater Systems," American City & County 109, no. 6 (May 1994): 59–60; "Metro Nashville-Davidson County Has Desktop, Will Map," American City & County 109, no. 11 (October 1994): 32. 131. Ellen Perlman, "GIS: Everybody's Favorite Tool," Governing 8, no. 12 (September 1995): 59–61. 132. Stephen J. Ventura, "The Use of Geographic Information Systems in Local Government," Public Administration Review 55, no. 5 (September/October 1995): 461–467, includes an extensive bibliography. 133. See a series of articles in American City & County 115, no. 15 (November 2000); Enterprise GIS for Municipal Government (New York: ESRI, July 2003); Peggy Ammerman, "Sharing the Wealth: Taking GIS Data to the Public," American City & County 112, no. 11 (October 1997): 24–26, 28, 30–32, 34–35, 37–38. 134. "Cities Are Discovering a New Office Technology," American City 72, no. 6 (June 1957): 162. 135. "EDP Opens New Vistas to Management," Public Management 47, no. 4 (April 1965): 73. 136. John Scott, "Electronic Data Processing in Urban Government," American City 86, no. 5 (May 1971): 69–72. 137. "Computer Electronics: The Major Tool of Modern Public Management in the '80s," American City & County 97, no. 1 (January 1982): 44–45; Donald F. Norris, "Computers and Small Governments: Uses and Users," Public Administration Review 44, no. 1 (January–February 1984): 70–78. 138. Morris, "Using Computers in Public Works," 44; John Sequerth, "Survey Monitors Computer Use," American City & County 102, no. 7 (July 1987): 46, 48, 50, 53–54, 56; Tim Darnell, "Applying High-Tech to Local Operations," American City & County 103, no. 7 (July 1988): 34, 36, 40. 139. "The March of Computerization," Governing 4, no. 2 (November 1990): 64–65. 140. Alana Northrop, Kenneth L. Kraemer, Debora Dunkle, and John Leslie King, "Payoffs from Computerization over Time," Public Administration Review 50, no. 5 (September–October 1990): 505–513; Kenneth L. Kraemer and Donald F. Norris, "Computers in Local Government, 1993," Yearbook (Washington, D.C.: International City Management Association, 1993). 141. Donald F. Norris and Kenneth L. Kraemer, "Mainframe and PC Computing in American Cities: Myths and Realities," Working Paper #URB-083, undated (circa 1993): 16.
142. Kenneth L. Kraemer, Jason Dedrick, and John Leslie King, "The Impact of Information Technology on City Government in the United States," prepared for the conference, "Urban Governments," June 1995, unpaginated, available at http://www.crito.uci.edu/itr/pu.PDF (last accessed 8/26/2005). See also Donald F. Norris and Kenneth L. Kraemer, "Mainframe and PC Computing in American Cities: Myths and Realities," Public Administration Review 56, no. 6 (November–December 1996): 568–576. 143. Lisa Huffman and Woody Talcove, "Information Infrastructure: Challenge and Opportunity," Public Management 77, no. 5 (May 1995): 8, 10–14. 144. Jeff Green, "Ghost in the Machine: Year 2000 Spooks Nation's Computers," American City & County 112, no. 6 (May 1997): 54. 145. Dianah Neff, "The Year 2000 Challenge: A Real Risk to Your Locality," Public Management 80, no. 8 (August 1998): 4–11; Steve Davis, "Can You Defuse the Y2K Bomb?" American City & County 113, no. 13 (December 1998): 30–32; Jane Ward, "Y2K: You've Still Got Time to Do Something," American City & County 114, no. 15 (May 1999): 3; and a series of articles in the same issue on what various local governments were doing to prepare. 146. James W. Cortada, The Digital Hand: How Computers Changed the Work of American Financial, Telecommunications, Media, and Entertainment Industries (New York: Oxford University Press, 2006): 358, 364. 147. Anthony Crowell, "Local Government and the Telecommunications Act of 1996," Public Management 78, no. 8 (June 1996): 6–7, 10–12; Harold McCombs, "Mixed Signals: How the Telecommunications Act Affects You," American City & County 112, no. 9 (August 1997): 30, 32, 39–40, 42, 44, 46. 148. Herbert Lindsay, "Walking the High Wire," American City & County 109, no. 7 (July 1994): 39–40, 42–44, 46. 149. T. J. Murray, "Virtual Reality Helps San Diego to Compete," Public Management 77, no. 11 (November 1995): 12–15; Brian Moura, "San Carlos Discovers the Internet," Public Management 78, no. 1 (January 1996): 31–37; "Internet Officers Power to the People," American City & County 111, no. 2 (February 1996): 10; "Building Better Highways Via the Internet," American City & County 111, no. 9 (August 1996): 8; Brandi Bowser, "WWW.localgovernment.com: Opening the Window to On-Line Democracy," American City & County 113, no. 1 (January 1998): 32, 34, 36, 40–41, 44–45. 150. Brandi Bowser, "Getting on the Information Country Road," American City & County 113, no. 3 (March 1998): 44–46, 51–52, 54, 56. The role of local education is reviewed in the next chapter. 151. Christina Couret, "Online Shopping Offers Governments 'Net' Gain," American City & County 114, no. 1 (January 1999): 21. 152. "Virtual Government Puts Locals Online," American City & County 114, no. 15 (December 1999): 12; Judy Potwora, "Bringing Internet Connections up to Speed," American City & County 115, no. 5 (April 2000): 48–49, 53. 153. Donald Norris, Patricia Fletcher, and Stephen Holden, "Is Your Local Government Plugged In?" Public Management 83, no. 5 (June 2001): 4–5. 154. Ibid., 8. 155. John B. Horrigan, Thomas M. Leonard, and Stephen McGonegal, Cities Online: Urban Development and the Internet (Washington, D.C.: Progress & Freedom Foundation and PEW Internet and American Life Project, 2001): 5–12. 156. Genie N. L. Stowers, Commerce Comes to Government on the Desktop: E-Commerce Applications in the Public Sector (Washington, D.C.: PricewaterhouseCoopers Endowment for the Business of Government, February 2001): 15. 157.
Alfred Tat-Kei Ho, "Reinventing Local Governments and the E-Government Initiative," Public Administration Review 62, no. 4 (July–August 2002): 434–444.
158. Gartner, Trends in U.S. State and Local Governments, 38; M. Jae Moon, "The Evolution of E-Government among Municipalities: Rhetoric or Reality?" Public Administration Review 62, no. 4 (July–August 2002): 424–433. 159. Holden, Norris, and Fletcher, "Electronic Government at the Local Level," 325–344; Kim A. O'Connell, "Computerizing Government: The Next Generation," American City & County 118, no. 8 (July 2003): 37–38, 42–45. 160. National League of Cities and Center for Digital Government, 2004 Digital Cities Survey (Washington, D.C.: Center for Digital Government, 2005). 161. "The City That Cut the Cord," Time (October 18, 2004): W2–W12; Ellen Perlman, "Plug Me In," Governing 17, no. 10 (July 2004): 29–31; Christopher Swope, "The Big Band Era," Governing 18, no. 4 (January 2005): 20–22, 24–25. 162. Kenneth L. Kraemer, James N. Danziger, Debora Dunkle, and John L. King, "The Usefulness of Computer-Based Information to Public Managers," Working Paper URB019, Center for Research on Information Technology and Organization, University of California, Irvine, February 1993, 25.
Chapter 8
1. The issues of progress and regress and the role of public education in American society over the past century are effectively described by three leading experts on the history of education: David Tyack and Larry Cuban, Tinkering toward Utopia: A Century of Public School Reform (Cambridge, Mass.: Harvard University Press, 1995), especially 12–39; and William J. Reese, America's Public Schools: From the Common School to "No Child Left Behind" (Baltimore: Johns Hopkins University Press, 2005). 2. William R. Jordan, Using Technology to Improve Teaching and Education (Washington, D.C.: U.S. Department of Education, January 1993): 1. 3. Larry Cuban, Oversold and Underused: Computers in the Classroom (Cambridge, Mass.: Harvard University Press, 2001): 196. 4. On the role of administrators, see Edward Mowbray Tuttle, School Board Leadership in America (Chicago: The Author, 1963): 15–158; Larry Cuban, The Managerial Imperative and the Practice of Leadership in Schools (Albany: State University of New York Press, 1988); David Tyack and Elisabeth Hansot, Managers of Virtue: Public School Leadership in America, 1820–1980 (New York: Basic Books, 1982): 213–262. 5. The best source for this vast literature is the online database ERIC (Education Resources Information Center), www.eric.ed.gov. 6. Cuban, Oversold and Underused, makes this a central finding of his work. 7. Both quotes in Paul Serote, "Educational Use of Electronic Data Processing Systems," Data Processing Proceedings, 1964 (New Orleans: DPMA, 1964): 102. 8. Ibid., 101–117. 9. For early examples, see "Modern High School Sets Pace with Records System," Business Machines, September 16, 1954, 4–5, IBM Archives, Somers, N.Y.; Dominick J. Mupo, "Data Processing and the Secondary Student," in Data Processing Proceedings, 1967 (Boston: DPMA, 1967): 291–295; Edward H. Hudson, Sr., "School Fund-Raising and List Maintenance," in Data Processing Proceedings, 1967, 297–306; Robert W. Sims, "Systems Concepts and Practices in Education," in Enoch Haga, ed., Automated Educational Systems (Elmhurst, Ill.: Business Press, 1967): 1–6. 10. "Office Automation Applications Report G17," Office Automation Applications (New York: Automation Consultants, Inc., undated [circa late 1950s]): III G17-1, CBI 55, "Market Reports," Box 70, Folder 3, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis. 11. For a detailed discussion of the school's use of the IBM 402, see ibid.
12. William C. Bozeman, Stephen M. Raucher, and Dennis W. Spuck, "Application of Computer Technology to Educational Administration in the United States," and A. J. Visscher, "School Administrative Computing: A Framework for Analysis," Journal of Research on Computing in Education 24, no. 1 (fall 1991): 62–66 and 1–19, respectively. 13. John I. Goodlad, John F. O'Toole, Jr., and Louise L. Tyler, Computers and Information Systems in Education (New York: Harcourt, Brace & World, 1966): 55. 14. Ibid., 59–63. 15. Charles A. Darby, Jr., Arthur L. Korotkin, and Tania Romashko, The Computer in Secondary Schools: A Survey of Its Instructional and Administrative Usage (New York: Praeger, 1972): 20, 29, 95–101. 16. John F. Vinsonhaler and Robert D. Moon, "Information Systems Applications in Education," in Carlos A. Cuadra and Ann W. Luke, eds., Annual Review of Information Science and Technology, vol. 8 (Washington, D.C.: American Society for Information Science, 1973): 296–297; Harry Silberman and Robert T. Filep, "Information Systems Applications in Education," Annual Review of Information Science and Technology, vol. 3 (Washington, D.C.: American Society for Information Science, 1968): 357–395; M. J. Norman and G. Tracy, "Computerized Route Planning for School Buses: Design and Implementation," Bulletin of the Operations Research Society of America 17, no. 2 (1969): 273. 17. Lee M. Joiner, George J. Vensel, Jay D. Ross, and Burton J. Silverstein, Microcomputers in Education: A Nontechnical Guide to Instructional and School Management Applications (Holmes Beach, Fla.: Learning Publications, 1982): 181–186. Vendors were quick to report on successes in this industry, and IBM was typical. See, for example, a series of publications from IBM, Academic and Administrative Network Serving Oakland Schools Pontiac, Michigan (White Plains, N.Y.: IBM Corporation, 1979), Box 246, Folder 8; Student Records System at ABC Unified School District (White Plains, N.Y.: IBM Corporation, 1980), Box 246, Folder 34; Academic Processing at Huntington Beach Union High School District, California (White Plains, N.Y.: IBM Corporation, 1980), Box 247, Folder 5; Administrative and Academic Computing at Orange County Public Schools (White Plains, N.Y.: IBM Corporation, 1981), Box 247, Folder 31; Blue Hills Regional Schedules Students with Online EPIC: SOCRATES (White Plains, N.Y.: IBM Corporation, 1982), Box 247, Folder 37; Office Automation in Montgomery County Public Schools (White Plains, N.Y.: IBM Corporation, 1982), Box 250, Folder 9; Educational and Administrative Computing at Consolidated High School District No. 230 (White Plains, N.Y.: IBM Corporation, 1983), Box 250, Folder 25, all at IBM Archives, Somers, N.Y. 18. Nancy Protheroe, Deirdre Carroll, and Tracey Zoetis, School District Uses of Computer Technology (Arlington, Va.: Educational Research Services, 1982): 10. 19. Reported in Bozeman, Raucher, and Spuck, "Application of Computer Technology to Educational Administration in the United States," 67–68. 20. For a detailed account of the issues as of the early 1980s, see Joiner, Vensel, Ross, and Silverstein, Microcomputers in Education. 21. Michael T. Sherman, Computers in Education: A Report (Concord, Mass.: Bates Publishing, 1983): 1–21. 22.
A professor who is both a leading expert on education and a former teacher describes the teaching ethos in Larry Cuban, How Teachers Taught: Constancy and Change in American Classrooms, 1890–1980 (New York: Longman, 1984), especially 199–253; on the context of their work, David B. Tyack, The One Best System: A History of American Urban Education (Cambridge, Mass.: Harvard University Press, 1974): 269–291. 23. Tyack and Cuban, Tinkering toward Utopia, 121–122. 24. Wilbur L. Ross, Jr., et al., Teaching Machines: Industry Survey and Buyers' Guide (New York: Center for Programmed Instructions, Inc., 1962): 1–100.
25. Benjamin Fine, Teaching Machines (New York: Sterling Publishing Co., 1962): 27–44; Lawrence M. Stolurow, Teaching by Machine (Washington, D.C.: U.S. Government Printing Office, 1961): 17–50. 26. The last two paragraphs were based on the account by Karl U. Smith and Margaret Foltz Smith, Cybernetic Principles of Learning and Educational Design (New York: Holt, Rinehart and Winston, 1966): 245–265. 27. For descriptions, photographs, and prices of specific products, see Ross et al., Teaching Machines, 104–147. 28. Seymour Papert, The Children's Machine (New York: Basic Books, 1993): 160–161. 29. Robert M. Price, The Eye for Innovation: Recognizing Possibilities and Managing the Creative Enterprise (New Haven, Conn.: Yale University Press, 2005): 29–30, 115–119. 30. Paul F. Merrill et al., Computers in Education (Boston: Allyn and Bacon, 1992): 58–59; S. G. Smith and B. A. Sherwood, "Educational Use of the PLATO Computer System," Science 192 (April 23, 1976): 344–352; and by the developers, Donald L. Bitzer, Paul G. Braunfeld, and W. W. Lichtenberger, "PLATO II: A Multiple-Student, Computer-Controlled, Automatic Teaching Device," in John E. Coulson, ed., Programmed Learning and Computer-Based Instruction (New York: John Wiley and Sons, 1962): 205–216; Donald L. Bitzer, E. T. Lyman, and J. A. Easley, Jr., The Uses of PLATO: A Computer-Controlled Teaching System (Urbana: University of Illinois Coordinated Science Laboratory, 1965); and for a formal history, Elisabeth Van Meer, "PLATO: From Computer-Based Education to Corporate Social Responsibility," Iterations (November 5, 2003): 1–22; and for memoirs, D. Lamont Johnson and Cleborne D. Maddux, eds., Technology in Education: A Twenty-Year Retrospective (Binghamton, N.Y.: Haworth Press, 2003). 31. Merrill et al., Computers in Education, 58–64, and see also their bibliography, 66. 32. The subject is well summarized by Cuban, Oversold and Underused; but see also two classics in the field, Robert P. Taylor, ed., The Computer in the School: Tutor, Tool, Tutee (New York: Teachers College Press, 1980) and Seymour Papert, Mindstorms: Children, Computers, and Powerful Ideas (New York: Basic Books, 1980); and for a very early discussion, Harry F. Silberman, "Characteristics of Some Recent Studies of Instructional Methods," in Coulson, Programmed Learning and Computer-Based Instruction, 13–24. 33. IBM published a series of articles celebrating use of computers in education in the early 1970s, emblematic of what vendors working in the education industry were promoting, including, for example, Edward Hymoff, "Those Computer Kids," Think (January 1971): 46–48; Lawrence Sandek, "Once Upon a Terminal," Think (October 1971): 39–41; "1130 in the Classroom," Data Processor 12, no. 1 (February 1969): 16; "The World of Education," Data Processor 12 (December 1970): 16–17, IBM Archives, Somers, N.Y. 34. Spencer Welch, "Taking a Computer to the Hills," Think (March 1972): 14–15. 35. IBM's Data Processor magazine published regularly on this subject all through the 1970s, IBM Archives, Somers, N.Y. 36. Office of Technology Assessment, Informational Technology and Its Impact on American Education (Washington, D.C.: U.S. Government Printing Office, November 1982). 37. Ibid., 57. 38. Ibid., 63. 39. Ibid., 141–143. 40. Ibid., 143. 41. M. D.
Roblyer, Jack Edwards, and Mary Anne Havriluk, Integrating Educational Technology into Teaching (Upper Saddle River, N.J.: Prentice Hall, 1997): 18–20; for bibliography, Katherine Clay, ed., Microcomputers in Education: A Handbook of Resources (Phoenix: Oryx Press, 1982): 17–22. 42. David Hawkridge, New Information Technology in Education (Baltimore: Johns Hopkins University Press, 1983): 71–140; IBM Corporation, Pekin Elementary School
Students Are Making Strides with IBM's Teaching and Learning with Computers (White Plains, N.Y.: IBM Corporation, 1990), Box 253, Folder 16, IBM Archives, Somers, N.Y. 43. National Commission on Excellence in Education, Nation at Risk: The Imperative for Educational Reform (Washington, D.C.: U.S. Government Printing Office, 1983). 44. Cuban, Oversold and Underused, 1–20, and with David Tyack, Tinkering toward Utopia, 121–126. 45. Reese, America's Public Schools, 219–285; but see also Cuban and Tyack, Tinkering toward Utopia. 46. Marvin N. Tolman and Ruel A. Alfred, The Computer and Education (Washington, D.C.: National Education Association, 1984): 5–7. 47. Ibid., 15–17. 48. Michael T. Sherman, Computers in Education: A Report (Concord, Mass.: Bates Publishing, 1983): 3–6. 49. Ibid., 14. 50. For a description, see John Henry Martin and Ardy Friedberg, Writing to Read: A Parent's Guide to the New, Early Learning Program for Young Children (New York: Warner Books, 1986). 51. One could argue that this situation was no different from how textbooks were chosen for use in K–12, largely done by state departments of education. Gail A. Caissy, Microcomputers and the Classroom Teacher (Bloomington, Ind.: Phi Delta Kappa Educational Foundation, 1987): 29–33; Ed Grimm, "Coming on Fast in the Classroom," Think (May 1988): 17–20; William P. Gorth and Michael Chernoff, "The Computer Goes to School," Supplement, Publishers Weekly 225, no. 6 (February 10, 1984): 150, 152, 154, 156–158, 160; Robert Hawkins, "Technology Goes to School," Think (June 1989): 30–34; Bobbie K. Hentrel, Computers in Education: A Guide for Educators (Ann Arbor: University of Michigan Press, 1985): 37–40. 52. Marc S. Tucker, "Computers in the Schools: What Revolution?" Journal of Communication 35, no. 4 (autumn 1985): 13. 53. Office of Technology Assessment, Power On! New Tools for Teaching and Learning, OTA-SET-379 (Washington, D.C.: U.S. Government Printing Office, September 1988). 54. Quality Education Data, Microcomputer Usage in Schools: A 1989–90 QED Update (Denver, Colo.: Quality Education Data, 1990): 1. 55. Michael B. Eisenberg and Kathleen L. Spitzer, "Information Technology and Services in Schools," Annual Review of Information Science and Technology 26 (Medford, N.J.: American Society for Information Science, 1991): 253. 56. Cuban, Oversold and Underused, 71. 57. Ibid., and for a good discussion of sources, see his endnote 5, 214–215. 58. Tucker, "Computers in the Schools: What Revolution?" 59. Tyack and Cuban, Tinkering toward Utopia, 40–59, 110–133; Reese, America's Public Schools, 215–250. 60. Reese, America's Public Schools, 322. 61. Ibid. 62. Cuban, Oversold and Underused, 12. 63. Recent examples of this perspective include Clifford Stoll, High Tech Heretic: Why Computers Don't Belong in the Classroom and Other Reflections by a Computer Contrarian (New York: Doubleday, 1999); and Alison Armstrong and Charles Casement, The Child and the Machine: How Computers Put Our Children's Education at Risk (Beltsville, Md.: Robins Lane Press, 2000). 64. Alvin Toffler wrote a number of books following Future Shock that continuously updated his thinking about these themes. For his views as of the early 2000s, see Alvin Toffler and Heidi Toffler, Revolutionary Wealth (New York: Knopf, 2006); Thomas L.
Friedman, The World Is Flat: A Brief History of the Twenty-First Century (New York: Farrar, Straus and Giroux, 2005). Highly influential in education circles was Seymour Papert's The Children's Machine: Rethinking School in the Age of the Computer (New York: Basic Books, 1993) and his earlier book, Mindstorms: Children, Computers, and Powerful Ideas (New York: Basic Books, 1980), but see also Robert P. Taylor, The Computer in the School. 65. Both quotes, Tyack and Cuban, Tinkering toward Utopia, 124. 66. See citations in endnote 45, ibid., 173. 67. Cuban, Oversold and Underused, 65–66. 68. Ibid., 66. 69. Cuban made the same point, ibid., 134–135. 70. Papert, The Children's Machine, 39. 71. Both quotes, Anthony G. Oettinger, Run, Computer, Run: The Mythology of Educational Innovation (Cambridge, Mass.: Harvard University Press, 1969): 44. 72. Ibid., 215. 73. Henry Jay Becker, "Top-Down Versus Grass Roots Decision Making about Computer Acquisition and Use in American Schools," paper presented at Annual Conference of American Educational Research Association, San Francisco, April 1992, distributed by U.S. Department of Education; ERIC Development Team, "Trends in Educational Technology 1991," ERIC Digest, ED343617, available at http://www.eric.ed.gov (last accessed 7/3/2005). 74. The first edition of Paul F. Merrill et al., Computers in Education, appeared in 1986, the third edition in 1996. James Lockard and Peter D. Abrams, Computers for Twenty-First Century Educators, fifth edition (New York: Longman, 2001), had a similar history. One of the earliest widely used texts was by Gary G. Bitter and Melissa E. Pierson, Using Technology in the Classroom (Boston: Allyn and Bacon, 1984, and subsequent editions in 1988, 1993, and 1999), which evolved as the technology changed and as new uses spread. 75. For an example of the kinds of materials used by students in this period, see Roblyer, Edwards, and Havriluk, Integrating Educational Technology into Teaching, first published in 1997. 76. See, for example, Barbara Means, William R. Penuel, and Christine Padilla, The Connected School: Technology and Learning in High School (San Francisco: Jossey-Bass, 2001); Janet Ward Schofield and Ann Locke Davidson, Bringing the Internet to School: Lessons from an Urban District (San Francisco: Jossey-Bass, 2001); Cuban, Oversold and Underused; Gene I. Maeroff, A Classroom of One (New York: Palgrave Macmillan, 2003). 77. Ronald E. Anderson and Amy Ronnkvist, The Presence of Computers in American Schools (Irvine, Calif.: Center for Research on Information Technology and Organizations, June 1999): 33. 78. Dianne Rothenberg, "Information Technology in Education," in Martha E. Williams, ed., Annual Review of Information Science and Technology, vol. 29, 1994 (Medford, N.J.: Learned Information, Inc., 1994): 277–302. 79. For a major review of uses and deployment at the dawn of the Clinton administration, see Office of Technology Assessment, Teachers and Technology: Making the Connection, OTA-HER-616 (Washington, D.C.: U.S. Government Printing Office, 1995). 80. Anderson and Ronnkvist, The Presence of Computers in American Schools, entire report for details on volumes and locations of machines. It is the most authoritative study of the subject. 81. Gilbert Valdez et al., Computer-Based Technology and Learning: Evolving Uses and Expectations (Oak Brook, Ill.: North Central Regional Educational Laboratory, May 2000): 5–7. 82. Ibid., 11.
83. D. P. Ely, "Trends and Issues in Educational Technology," in G. J. Anglin, ed., Instructional Technology (Englewood, Colo.: Libraries Unlimited, 1989): 14. 84. Valdez et al., Computer-Based Technology and Learning, 14. 85. Becky Smerdon et al., Teachers' Tools for the 21st Century: A Report on Teachers' Use of Technology (Washington, D.C.: National Center for Education Statistics, September 2000): 16–17. 86. These are cited in the ERIC database. 87. Described by Cassandra Rowand, Teacher Use of Computers and the Internet in Public Schools (Washington, D.C.: National Center for Educational Statistics, April 2000). 88. Ronald E. Anderson and Henry Jay Becker, School Investments in Instructional Technology (Irvine, Calif.: Center for Research on Information Technology and Organizations, University of California, Irvine, and University of Minnesota, July 2001): 5–7. 89. Douglas Levin and Sousan Arafeh, The Digital Disconnect: The Widening Gap Between Internet-Savvy Students and Their Schools (Washington, D.C.: Pew Internet & American Life Project, August 14, 2002); National Center for Educational Statistics, Young Children's Access to Computers in the Home and at School in 1999 and 2000, NCES 2003–036 (Washington, D.C.: U.S. Department of Education, March 2003); National Center for Educational Statistics, Computer and Internet Use by Children and Adolescents in 2001, NCES 2004–014 (Washington, D.C.: U.S. Department of Education, October 2003). 90. National Education Association, Gains and Gaps in Education Technology: An NEA Survey of Educational Technologies in U.S. Schools (Washington, D.C.: National Education Association, 2004): 1. 91. Ibid., 3. 92. U.S. Department of Education, Toward a New Golden Age in American Education: How the Internet, the Law and Today's Students Are Revolutionizing Expectations (Washington, D.C.: U.S. Department of Education): quote at 19; on the IT issue as a whole, 16–23. 93. Ibid., 22. 94. Ibid., 34. 95. Quote and statistics on handhelds and laptops, Garance Burke for Associated Press, "Handhelds the Rage in Schools," Wisconsin State Journal, December 11, 2005, A5. 96. C. A. Bowers, The Cultural Dimensions of Education Computing: Understanding the Non-Neutrality of Technology (New York: Teachers College, Columbia University, 1988): 1.
Chapter 9
1. Paul E. Ceruzzi, A History of Modern Computing (Cambridge, Mass.: MIT Press, 1998): 1–27; Martin Campbell-Kelly and William Aspray, Computer: A History of the Information Machine (New York: Basic Books, 1996): 69–104. 2. James J. Duderstadt, Daniel E. Atkins, and Douglas Van Houweling, Higher Education in the Digital Age: Technology Issues and Strategies for American Colleges and Universities (Westport, Conn.: American Council on Education and Praeger, 2002): 137. 3. Ibid. 4. Ceruzzi, A History of Modern Computing, 101–103, 201–203. 5. Herbert M. Teager, "The University Leadership Abdicated," Datamation (January 1962): 27. 6. However, educators widely believe that they share values, are cohesive, and work relatively well and effectively. 7. Clayton M. Christensen, Sally Aaron, and William Clark, "Disruption in Education," EDUCAUSE Review (January/February 2003): 45. 8. While this literature is massive, for many years there has been a steady string of publications and magazine articles from EDUCAUSE, located in Boulder,
Colo. In addition, The Chronicle of Higher Education has routinely published hundreds of articles on various aspects of these issues over many decades. 9. Roger E. Levien and C. Mosmann, "Administrative and Library Uses of Computers," in Roger E. Levien, ed., The Emerging Technology: Instructional Uses of the Computer in Higher Education (New York: McGraw-Hill, 1972): 33. 10. Ibid., 45–50; Frank Newman and Lara Couturier, The New Competitive Arena: Market Forces Invade the Academy (Providence, R.I.: Futures Project, Brown University, June 2001): 1–11. 11. Regarding Pennsylvania State University, see "Computer on Campus," Data Processor 9, no. 4 (September 1966): 3–8; on the University of California, see William H. Harrison, "BMOC Means 'Big Model 50 on Campus': New University of California Campus Built around 360," IBM News, April 25, 1967, 5, both located at IBM Archives, Somers, N.Y. 12. "Computer on Campus," 4. 13. Harrison, "BMOC Means 'Big Model 50 on Campus,'" 5. 14. Norman F. Kallaus, "The Computer on Campus," Data Processing: Proceedings 1968 (Washington, D.C.: DPMA, 1968): 153. 15. Charles R. Thomas, "Administrative Uses of Computers in Higher Education," in John W. Hamblen and Carolyn P. Landis, eds., The Fourth Inventory of Computers in Higher Education: An Interpretive Report (Boulder, Colo.: EDUCOM, 1980): 75–77. 16. Charles Mosmann and Einar Stefferud, "Campus Computing Management," Datamation 17, no. 5 (March 1, 1971): 21. 17. Ibid., 21. 18. John W. Hamblen, Inventory of Computers in U.S. Higher Education 1966–67: Utilization and Related Degree Programs (Atlanta: National Science Foundation, August 1, 1970): III-13. 19. Charles Mosmann, Academic Computers in Service (San Francisco: Jossey-Bass, 1973): 125. 20. Ibid., 125–126. 21. Mosmann, Academic Computers in Service, 137. 22. Hamblen, Inventory of Computers in U.S. Higher Education 1966–67, 79–81. 23. Ibid., 80–101. 24. IBM Corporation, Virginia College Computer Network (Westchester, N.Y.: IBM Corporation, 1975), Online Administrative Systems at the University of Iowa (White Plains, N.Y.: IBM Corporation, 1975), Online Data Base Administration of Student Record Systems at the University of South Carolina (White Plains, N.Y.: IBM Corporation, 1976), Administrative and Academic Applications Using a System/370 Model 115 at University of Tennessee (White Plains, N.Y.: IBM Corporation, 1978), Need Analysis and Packaging Services for College Financial Aid Directors (Westchester, N.Y.: IBM Corporation, 1979), Personal Computing at the University of Akron (White Plains, N.Y.: IBM Corporation, 1979), IBM Archives, Somers, N.Y. 25. Martin D. Robbins, William S. Dorn, and John E. Skelton, Who Runs the Computer? Strategies for the Management of Computers in Higher Education (Boulder, Colo.: Westview, 1975): 28–29. 26. Sandy Albanese, Directory of Software in Higher Education (Washington, D.C.: Chronicle of Higher Education, October 1987). 27. Kenneth C. Green, "The New Administrative Computing," in Kenneth C. Green and Steven W. Gilbert, eds., Making Computers Work for Administrators (San Francisco: Jossey-Bass, 1988): 6. 28. Ibid., 7; IBM Corporation, Local Area Networks Link Computing Resources across UT-Austin Campus (White Plains, N.Y.: IBM Corporation, 1987), IBM Archives, Somers, N.Y.
29. Green, "The New Administrative Computing," 8–10. 30. Quoted in IBM Corporation, Online Registration at Central Missouri State University (White Plains, N.Y.: IBM Corporation, 1980), unpaginated; see also IBM Corporation, Online Student Registration at the University of Georgia (White Plains, N.Y.: IBM Corporation, 1982), both at IBM Archives, Somers, N.Y. 31. For an example, IBM Corporation, Administrative and Academic Support Systems at Louisiana State University (White Plains, N.Y.: IBM Corporation, 1985), IBM Archives, Somers, N.Y. 32. J. Victor Baldridge, Janine Woodward Roberts, and Terri A. Weiner, The Campus and the Microcomputer Revolution: Practical Advice for Nontechnical Decision Makers (New York: American Council on Education, 1984): 7; Gerald Kissler, "A New Role for Deans in Computing," in Green and Gilbert, Making Computers Work for Administrators, 47–55. 33. Brian L. Hawkins, "Administrative and Organizational Issues in Campus Computing," in Green and Gilbert, Making Computers Work for Administrators, 13–26. 34. Kenneth C. Green, "The 1998 National Survey of Information Technology in Higher Education: Colleges Struggle with IT Planning," Campus Computing Project (November 1998), available at http://www.campuscomputing.net (last accessed 7/1/2006); Green, "The 1999 National Survey of Information Technology in U.S. Higher Education: The Continuing Challenge of Instructional Integration and User Support," Campus Computing Project (October 1999), http://www.campuscomputing.net (last accessed 7/1/2006). 35. Kenneth C. Green, "The 2000 National Survey of Information Technology in U.S. Higher Education," Campus Computing Project (October 2000), http://www.campuscomputing.net (last accessed 7/1/2006). 36. Duderstadt, Atkins, and Van Houweling, Higher Education in the Digital Age, 102–103; David P. Roselle and Ginger Pinholster, "Capturing the First Keystrokes: A Technological Transformation at the University of Delaware," in Diana G. Oblinger and Richard N. Katz, eds., Renewing Administration: Preparing Colleges and Universities for the 21st Century (Boston: Anker Publishing Company, 1999): 5–9; Dave Swartz and Ken Orgill, "Higher Education ERP: Lessons Learned," EDUCAUSE Quarterly, no. 2 (2001): 20–27. 37. Lavon R. Frazier, "An Admissions Process Transformed with Technology: WSU's New System Takes the Frustration out of Matriculation," EDUCAUSE Quarterly, no. 3 (2000): 32–39; Richard Whiteside and George S. Mentz, "Online Admissions and Internet Recruiting: An Anatomy of Search Engine Placement," EDUCAUSE Quarterly, no. 4 (2003): 63–66; Kenneth C. Green, "The 2001 National Survey of Information Technology in U.S. Higher Education: eCommerce Comes Slowly to the Campus," The Campus Computing Project (October 2001), http://www.campuscomputing.net (last accessed 7/1/2006); Green, "The 2002 National Survey of Information Technology in U.S. Higher Education: Campus Portals Make Progress; Technology Budgets Suffer Significant Cuts," Campus Computing Project (October 2002), http://www.campuscomputing.net (last accessed 7/1/2006). See also Martha Beede and Darlene Burnett, Planning for Student Services: Best Practices for the 21st Century (Ann Arbor: Society for College and University Planning, 2000). 38. Center for Digital Education, Digital Community Colleges and the Coming of the Millennials (Folsom, Calif.: Center for Digital Education, 2004); Kenneth C. Green, "The 2003 National Survey of Information Technology in U.S.
Higher Education: Campus Policies Address Copyright Laws; Wireless Networks Show Big Gains," The Campus Computing Project (October 2003), http://www.campuscomputing.net (last accessed 7/1/2006); Green, "The 2004 National Survey of Information Technology in U.S. Higher Education: Tech Budgets Get Some Relief," Campus Computing Project (October 2004), http://www.campuscomputing.net (last accessed 7/1/2006); Leslie Maltz, Peter B. DeBlois, and EDUCAUSE Current Issues Committee, "Trends in Current Issues, Y2K-2005," EDUCAUSE Quarterly, no. 2 (2005):
6–23; Barbara I. Dewey, Peter B. DeBlois, and 2006 EDUCAUSE Current Issues Committee, "Top 10 IT Issues, 2006," EDUCAUSE Quarterly (May/June 2006): 58–79. 39. Philip J. Goldstein, Academic Analytics: The Uses of Management Information and Technology in Higher Education (Boulder, Colo.: EDUCAUSE, 2005). 40. Richard N. Katz and Associates, Web Portals and Higher Education (San Francisco: Jossey-Bass, 2002): 126. 41. Ibid. Confirmed by other studies and surveys of the late 1990s. See, for example, Kenneth C. Green, "The Campus Computing Project: Instructional Integration and User Support Present Continuing Technology Challenges," November 1996, http://www.campuscomputing.net (last accessed 7/1/2006); Donald P. Ely, Trends in Educational Technology 1995 (Syracuse, N.Y.: ERIC Clearinghouse on Information and Technology, May 1996). 42. Ely, Trends in Educational Technology 1995, 128. 43. Ibid., 128. 44. Barbara I. Dewey, Peter DeBlois, and EDUCAUSE Current Issues Committee, "Current IT Issues Survey Report, 2006," EDUCAUSE Quarterly, no. 2 (2006): 22. 45. The literature is vast and endlessly fascinating. For introductions to the various issues with a penchant toward concerns regarding implementation, see Sara Kiesler and Lee Sproull, eds., Computing and Change on Campus (Cambridge: Cambridge University Press, 1987); Kevin Robins and Frank Webster, eds., The Virtual University? Knowledge, Markets, and Management (New York: Oxford University Press, 2002); Glen R. Jones, Cyberschools: An Education Renaissance (Englewood, Colo.: Jones Digital Century, 1996); Les Lloyd, ed., Technology and Teaching (Medford, N.J.: Information Today, 1997); Anne B. Keating and Joseph Hargitai, The Wired Professor: A Guide to Incorporating the World Wide Web in College Instruction (New York: New York University Press, 1999); Parker Rossman, The Emerging Worldwide Electronic University: Information Age Global Higher Education (Westport, Conn.: Praeger, 1993); and one of the most balanced accounts of teaching and digital tools, Gene I. Maeroff, A Classroom of One (New York: Palgrave Macmillan, 2003). 46. Richard N. Katz and Associates, Dancing with the Devil: Information Technology and the New Competition in Higher Education (San Francisco: Jossey-Bass, 1999): 51–72; Duderstadt, Atkins, and Van Houweling, Higher Education in the Digital Age. 47. President's Science Advisory Committee, Computers in Higher Education (Washington, D.C.: U.S. Government Printing Office, 1967): 4, and for examples, 61–73; for a case study, Richard M. McCoy, "The Computer at the University," Data Processing Proceedings 1966 (Chicago: Data Processing Management Association, 1966): 294–302. 48. Mosmann, Academic Computers in Service, 107–123; H. A. Lekan, Index to Computer Assisted Instruction (New York: Harcourt Brace Jovanovich, 1971); W. H. Holtzman, ed., Computer-Assisted Instruction, Testing, and Guidance (New York: Harper and Row, 1970); D. Alpert and D. Bitzer, "Advances in Computer-Based Education," in A. Finerman, ed., University Education in Computing Science (New York: Academic Press, 1968); "Curriculum 68, Recommendations for Academic Programs in Computer Science, a Report of the ACM Curriculum Committee on Computer Science," Communications of the ACM 11 (1968): 151–197; the EDUCOM Bulletin 7, no. 3 (1972) issue is devoted to the teaching of computer science. 49. See a report prepared for the Carnegie Commission on Higher Education, John F. Rockart and Michael S.
Scott Morton, Computers and the Learning Process in Higher Education (New York: McGraw-Hill, 1975): 169. 50. Ibid., 238–239. 51. IBM Corporation, Computer Assisted Instruction at the University of Akron (White Plains, N.Y.: IBM Corporation, 1976), Box 242, Folder 1, IBM Archives, Somers, N.Y.
The same box contains other case studies on the use of IT in higher education in the United States during the 1970s; for one of the largest collections of such accounts, see Shelley A. Harrison and Lawrence M. Stolurow, eds., Improving Instructional Productivity in Higher Education (Englewood Cliffs, N.J.: Educational Technology Publications, 1975). 52. David Hawkridge, New Information Technology in Education (Baltimore: Johns Hopkins University Press, 1983): 119. 53. Marc S. Tucker, "Computers in the Schools: What Revolution?" Journal of Communication 35, no. 4 (autumn 1985): 12–23. Tutorials for professors abounded in this period. For an example with explanations of the problems academics faced, see Barry Heermann, Teaching and Learning with Computers: A Guide for College Faculty and Administrators (San Francisco: Jossey-Bass, 1988). 54. Charles R. Thomas and Dana S. van Hoesen, Administrative Information Systems: The 1985 Profile and Five-Year Trends (Boulder, Colo.: CAUSE Publications, 1986): 148–150. 55. Susan H. Russell, Manie H. Collier, and Mary P. Hancock, 1994 Study of Communications Technology in Higher Education (Menlo Park, Calif.: SRI International, 1994): 22; California Postsecondary Education Commission, Coming of Information Age in California Higher Education, Commission Report 97–1 (Sacramento: State of California, February 1997); Jon T. Rickman and Dean L. Hubbard, The Electronic Campus (Maryville, Mo.: Prescott Publishing, 1992); R. Lewis and E. D. Tagg, eds., Trends in Computer Assisted Education (Oxford: Blackwell Scientific Publications, 1988). 56. Russell, Collier, and Hancock, 1994 Study of Communications Technology in Higher Education, 34, 64. 57. Ibid., 72, 73, 75. 58. Ibid., 81. 59. Jon Rickman and Mike Grudzinski, "Student Expectations of Information Technology Use in the Classroom," EDUCAUSE Quarterly, no. 1 (2000): 24–30; Mary Hanson, "Incentives in IT Yield Success at MIT," EDUCAUSE Quarterly, no. 1 (2001): 40–43; Lucinda Lea, Maria Clayton, Barbara Draude, and Sarah Barlow, "The Impact of Technology on Teaching and Learning," EDUCAUSE Quarterly, no. 2 (2001): 69–71; Carol Wilson, "Faculty Attitudes about Distance Learning," EDUCAUSE Quarterly, no. 2 (2001): 70–71; Judith S. Eaton, Distance Learning: Academic and Political Challenges for Higher Education Accreditation, CHEA Monograph Series 2001, no. 1 (n.p.: Council for Higher Education Accreditation, 2001); Maeroff, A Classroom of One, 1–19, 157–173. 60. Walter S. Baer, Will the Internet Transform Higher Education? RAND/RP-685 (Santa Monica, Calif.: RAND Corporation, 1998): 3; Kenneth C. Green, "1997 Campus Computing Survey," Chronicle of Higher Education, October 17, 1997. 61. Baer, Will the Internet Transform Higher Education? 4. 62. Ibid., 4, 6; Richard E. Clark, "Reconsidering Research on Learning from Media," Review of Educational Research 53, no. 4 (1983): 445–459. 63. Baer, Will the Internet Transform Higher Education? 6. 64. Austan Goolsbee, "Higher Education: Promises for Future Delivery," in Robert E. Litan and Alice M. Rivlin, eds., The Economic Payoff from the Internet Revolution (Washington, D.C.: Brookings Institution Press, 2001): 271. 65. For a discussion of this growing portion of the Higher Education Industry, see Kathleen F. Kelly, Meeting Needs and Making Profits: The Rise of For-Profit Degree-Granting Institutions (Denver, Colo.: Education Commission of the States, 2001). 66. James J. Duderstadt and Farris W.
Womack, The Future of the Public University in America: Beyond the Crossroads (Baltimore: Johns Hopkins University Press, 2003): 82. 67. Ibid., 88. Contemplating the consequences, the two commentators just quoted used the word tsunami to describe what was about to happen to traditional higher education and, most traumatically, to its professors.
68. While the discussion had begun before the wide availability of the Internet (see Parker Rossman, The Emerging Worldwide Electronic University: Information Age Global Higher Education [Westport, Conn.: Praeger, 1993]), it became quite intense and international later in the decade. See, for example, John S. Daniel, Mega-Universities and Knowledge Media (London: Kogan Page, 1996); Jones, Cyberschools; Keith Harry, ed., Higher Education through Open and Distance Learning (London: Routledge, 1999); Richard N. Katz and Diana G. Oblinger, eds., The "E" Is for Everything (San Francisco: Jossey-Bass, 2000); Katz, Dancing with the Devil; Robins and Webster, The Virtual University?; Matthew Serbin Pittinsky, The Wired Tower: Perspectives on the Impact of the Internet on Higher Education (Upper Saddle River, N.J.: Financial Times/Prentice Hall, 2003). 69. Duderstadt, Atkins, and Van Houweling, Higher Education in the Digital Age, 127. 70. Ibid., 127–130. 71. Ibid., 132. 72. Ibid., 141–142. 73. National Education Association, "Faculty Weigh in on Distance Education," EDUCAUSE Quarterly 3 (2000): 45; Douglas L. Heerema and Richard L. Rogers, "Learning from the Past," EDUCAUSE Quarterly, no. 1 (2002): 34–38; Warren Wilson, "Faculty Perceptions and Uses of Instructional Technology," EDUCAUSE Quarterly, no. 2 (2003): 60–62; Sharon Fass McEuen, "How Fluent with Information Technology Are Our Students?" EDUCAUSE Quarterly, no. 4 (2001): 8–17. 74. David W. Cheney, The Application and Implications of Information Technologies in Postsecondary Distance Education: An Initial Bibliography (Washington, D.C.: National Science Foundation, November 2002): 1–3. 75. Claudia A. Perry, "Information Technology and the Curriculum: A Status Report," EDUCAUSE Quarterly, no. 4 (2004): 28–37; Diana Oblinger and Jill Kidwell, "Distance Learning: Are We Being Realistic?" EDUCAUSE Review (May–June 2000): 31–39. 76. Judith Borreson Caruso and Robert B. Kvavik, "Students and Information Technology, 2005: Convenience, Connection, Control, and Learning," EDUCAUSE Center for Applied Research (ECAR), October 2005, http://www.educause.edu/library/pdf/ERO506/ekf0506.pdf (last accessed 7/22/2006), and see also their update of January 9, 2006, same Web site. See also Frank Newman and Jamie Scurry, Higher Education in the Digital Rapids (Providence, R.I.: Brown University, June 2001); How Does Technology Affect Access in Postsecondary Education? What Do We Really Know? (Jessup, Md.: National Postsecondary Education Cooperation, 2004). 77. Edward L. Ayers, "The Academic Culture and the IT Culture: Their Effect on Teaching and Scholarship," EDUCAUSE Review (November–December 2004): 51. 78. Ibid., 62; see also Darrell L. Butler and Martin Sellbom, "Barriers to Adopting Technology for Teaching and Learning," EDUCAUSE Quarterly, no. 2 (2002): 22–28. 79. In addition to the 269 academics, another 18 Americans received the Peace Prize, which makes up part of the total of 758 recipients. Americans garnered 56 prizes in chemistry, another 38 in economics, 78 in physics, 86 in physiology or medicine, and even 11 in literature. In addition, a number of Americans won two Nobel Prizes, including the developers of the transistor; http://www.jinfo.org/US_Nobel_Prizes.html (last accessed 9/9/2006). By the end of 2006, an additional six scholars in the United States had been awarded this prize. 80. Friedrich L. Bauer, Informatik: Führer durch die Ausstellung (Munich: Deutsches Museum, 2004): 18–51, 84–116. 81. Paul E.
Ceruzzi, A History of Modern Computing (Cambridge, Mass.: MIT Press, 1998): 18, 301; James W. Cortada, The Computer in the United States: From Laboratory to Market, 1930 to 1960 (Armonk, N.Y.: M. E. Sharpe, 1993): 28–29, 31–32, 40, 127. 82. Explained by Kenneth Flamm in two books, Targeting the Computer: Government Support and International Competition (Washington, D.C.: Brookings Institution Press,
1987) and Creating the Computer: Government, Industry, and High Technology (Washington, D.C.: Brookings Institution Press, 1988). 83. William Aspray, John von Neumann and the Origins of Modern Computing (Cambridge, Mass.: MIT Press, 1990): 26–94. 84. Ibid., 156–157. 85. Bert F. Green, Jr., Digital Computers in Research: An Introduction for Behavioral and Social Scientists (New York: McGraw-Hill, 1963): 111. 86. Ibid. 87. Aspray, John von Neumann and the Origins of Modern Computing, 130. 88. Paul A. D. de Maine, Digital Computer Programs for Physical Chemistry, vol. 1 (New York: Macmillan, 1963), 1. 89. Morris S. Davis, "The Role of Computers in Astronomy," in Walter F. Freiberger and William Prager, eds., Applications of Digital Computers (Boston: Ginn and Company, 1963): 88. 90. For early first-hand accounts of experiences in establishing and using data centers for researchers in higher education, see Preston C. Hammer, ed., The Computing Laboratory in the University (Madison: University of Wisconsin Press, 1957). 91. Jeffrey R. Yost, A Bibliographic Guide to Resources in Scientific Computing, 1945–1975 (Westport, Conn.: Greenwood Press, 2002): 5. 92. William Aspray and Bernard O. Williams, "Arming American Scientists: NSF and the Provision of Scientific Computing Facilities for Universities, 1950–1973," IEEE Annals of the History of Computing 16, no. 4 (1994): 60–74; John W. Sheldon and L. H. Thomas, "The Use of Large Scale Computing in Physics," Journal of Applied Physics 24 (1953): 235–242. 93. George W. Evans, II, Graham F. Wallace, and Georgia L. Sutherland, Simulation Using Digital Computers (Englewood Cliffs, N.J.: Prentice-Hall, 1967): 14. 94. NSF's story is told by Dian Olson Belanger, Enabling American Innovation: Engineering and the National Science Foundation (West Lafayette, Ind.: Purdue University Press, 1998); Milton Lomask, A Minor Miracle: An Informal History of the National Science Foundation (Washington, D.C.: U.S. Government Printing Office, 1976); George T. Mazuzan, The National Science Foundation: A Brief History (Washington, D.C.: U.S. Government Printing Office, 1988); see also The National Science Board: A History in Highlights, 1950–2000 (Arlington, Va.: National Science Foundation, 2000). 95. William F. Raub, "The Life Sciences Computer Resources Program of the National Institutes of Health," Computers in Biology and Medicine 2, no. 3 (November 1972): 211–220; but see also Stephen P. Strickland, The Story of the NIH Grants Programs (Lanham, Md.: University Press of America, 1989). 96. Duderstadt and Womack, The Future of the Public University in America, 52–55. 97. "National Patterns of R&D Resources: 1994," http://www.nsf.gov/statistics/s2194/conten1.htm (last accessed 9/5/2006). 98. Duderstadt and Womack, The Future of the Public University in America, 53. 99. For an example of a turn away from intense dependency on quantitative analysis, see Nobel Laureate Douglass C. North's recent study, Understanding the Process of Economic Change (Princeton, N.J.: Princeton University Press, 2005). 100. For examples of use and issues, see Dennis A. Trinkle, ed., Writing, Teaching, and Researching History in the Electronic Age (Armonk, N.Y.: M. E. Sharpe, 1998). 101. H. Blustain, S. Braman, R. Katz, and G. Salaway, IT Engagement in Research: A Baseline Study (Boulder, Colo.: EDUCAUSE Center for Applied Research, 2006): 15. 102.
For example, in the 1990s, historians at the Charles Babbage Institute at the University of Minnesota received hundreds of thousands of dollars in federal funding to write histories of the early days of the Internet and in support of various projects regarding the history of the software industry. For details, see the quarterly newsletters and other accounts, all available at http://www.cbi.umn.edu (last accessed 9/6/2006).
103. National Science Foundation Blue-Ribbon Advisory Panel on Cyberinfrastructure, Revolutionizing Science and Engineering through Cyberinfrastructure (Washington, D.C.: National Science Foundation, January 2003): 61–62; The National Science Board: A History in Highlights, 35. 104. For a sense of the changes underway, see Andy Kessler, The End of Medicine: How Silicon Valley (and Naked Mice) Will Reboot Your Doctor (New York: Collins, 2006). 105. It is a notion not frequently discussed by librarians. For a description of this kind of behavior, see Etienne Wenger, "Communities of Practice: The Key to Knowledge Strategy," Knowledge Directions: The Journal of the Institute for Knowledge Management 1 (fall 1999): 48–63. 106. http://polyglot.lss.wisc.edu/slis/ (last accessed 9/5/2006). 107. Ralph H. Parker, Library Applications of Punched Cards: A Description of Mechanical Systems (Chicago: American Library Association, 1952); see also a text published the same year, Howard F. McGaw, Marginal Punched Cards in College and Research Libraries (Washington, D.C.: Scarecrow Press, 1952); six higher education case studies in IBM Corporation, IBM Library Mechanization Symposium, Endicott, New York, May 25, 1964 (Endicott, N.Y.: IBM Corporation, 1964): 77–236; Henry Birnbaum, General Information Manual: IBM Circulation Control at Brooklyn College Library (Endicott, N.Y.: IBM Corporation, 1960); H. Dewey, "Punched Card Catalogs—Theory and Technique," American Documentation 10 (January 1959): 36–50; and for an early bibliography, H. E. Loftus and A. Kent, "Automation in the Library: An Annotated Bibliography," American Documentation 7 (April 1956): 10–26. 108. Madeline M. Berry, "Application of Punched Cards to Library Routines," in Robert S. Casey, James W. Perry, Madeline M. Berry, and Allen Kent, eds., Punched Cards: Their Application to Science and Industry (New York: Reinhold Publishing, 1958): 279–302; Louis A. Schultheiss, Don S. Culbertson, and Edward M. Heiliger, Advanced Data Processing in the University Library (New York: Scarecrow Press, 1962): 4–6. 109. Peter Lyman and Hal R. Varian, "How Much Information," 2003, http://www.sims.berkeley.edu/how-much-info-2003 (last accessed on 9/15/2006). 110. William N. Locke, "Computer Costs for Large Libraries," Datamation (February 1970): 69–74. 111. Ibid., 74. 112. Stanley J. Swihart and Beryl F. Hefley, Computer Systems in the Library: A Handbook for Managers and Designers (Los Angeles: Melville Publishing Co., 1973): 11–15. 113. For details, http://www.library.wisc.edu/collections/ (last accessed 9/01/2006). 114. Richard De Gennaro, Libraries, Technology, and the Information Marketplace: Selected Papers (Boston: G. K. Hall, 1987): 7. 115. Donald V. Black and Earl A. Farley, "Library Automation," in Carlos A. Cuadra, ed., Annual Review of Information Science and Technology, vol. 1 (New York: Interscience Publishers, 1966): 273–303; Swihart and Hefley, Computer Systems in the Library; and the major survey of the period, because of its thoroughness and influence on library administrators, Information Systems Technology Staff, Technology and Libraries (Santa Monica, Calif.: System Development Corporation, 1967). 116. Information Systems Technology Staff, Technology and Libraries, 39. 117. Ibid., 40. 118. Barbara Evans Markuson, "An Overview of Library Systems and Automation," Datamation (February 1970): 61. 119. Ibid., 62. 120. Ibid. 121.
William Saffady, Introduction to Automation for Librarians (Chicago: American Library Association, 1989): 217–227; Gennaro, Libraries, Technology, and the Information Marketplace, 403–404.
122. Saffady, Introduction to Automation for Librarians, 228–238.
123. Susan K. Martin, "Library Automation," in Carlos A. Cuadra and Ann W. Luke, eds., Annual Review of Information Science and Technology, vol. 7 (Washington, D.C.: American Society for Information Science, 1972): 243–277; Kenneth J. Bierman, "Library Automation," Annual Review of Information Science and Technology, vol. 9 (Washington, D.C.: American Society for Information Science, 1974): 123–172; Karl M. Pearson, Jr., "Minicomputers in the Library," Annual Review of Information Science and Technology, vol. 10 (Washington, D.C.: American Society for Information Science, 1975): 139–163; Bruce H. Alper, "Library Automation," Annual Review of Information Science and Technology, vol. 10 (Washington, D.C.: American Society for Information Science, 1975): 199–236; Audrey N. Grosch, "Library Automation," Annual Review of Information Science and Technology, vol. 11 (Washington, D.C.: American Society for Information Science, 1976): 225–265; IBM Corporation, CAPTAIN: Computer Aided Processing and Terminal Access Information Network: Rutgers University Library (White Plains, N.Y.: IBM Corporation, 1976), Box 242, Folder 16, IBM Archives, Somers, N.Y.
124. Mary Jane Pobst Reed and Hugh T. Vrooman, "Library Automation," in Martha E. Williams, ed., Annual Review of Information Science and Technology, vol. 14 (Washington, D.C.: American Society for Information Science, 1979): 193.
125. Glyn T. Evans, "Library Networks," Annual Review of Information Science and Technology, vol. 16 (1981): 211–245; Ward Shaw and Patricia B. Culkin, "Systems That Inform: Emerging Trends in Library Automation and Network Development," Annual Review of Information Science and Technology, vol. 21 (Amsterdam, N.Y.: Elsevier Science Series, 1987): 265–292; Peter Hernon, "Depository Library Collections and Services in an Electronic Age: A Review of the Literature," Government Information Quarterly 4, no. 4 (1987): 383–397.
126. Lawrence A. Woods, "Applications of Microcomputers in Libraries," in Linda C. Smith, ed., New Information Technologies—New Opportunities (Urbana-Champaign: Graduate School of Library and Information Science, University of Illinois, 1982): 28–42.
127. De Gennaro, Libraries, Technology, and the Information Marketplace, 243.
128. Ibid., 335.
129. Ann O'Brien, "Online Catalogs: Enhancements and Developments," in Martha E. Williams, ed., Annual Review of Information Science and Technology, vol. 29 (Medford, N.J.: American Society for Information Science, 1994): 219–224.
130. Edward A. Fox and Shalini R. Urs, "Digital Libraries," in Blaise Cronin, ed., Annual Review of Information Science and Technology, vol. 36 (Medford, N.J.: American Society for Information Science and Technology, 2002): 511 for quote, but see also 503–556.
131. Ibid., 511; John M. Budd, The Changing Academic Library: Operations, Culture, Environments (Chicago: Association of College and Research Libraries, 2005): 203–225.
132. Nancy M. Cline, "Virtual Continuity: The Challenge for Research Libraries Today," EDUCAUSE Review (May–June 2000): 27.
133. Richard J. Bazillion, "Academic Libraries in the Digital Revolution," EDUCAUSE Quarterly, no. 1 (2001): 51–55; Brian L. Hawkins, "Information Access in the Digital Era," EDUCAUSE Review (September–October 2001): 51–57.
134. Bruce Heterick, "Faculty Attitudes toward Electronic Resources," EDUCAUSE Review (July–August 2002): 10–11; Chris Ferguson, Gene Spencer, and Terry Metz, "Greater Than the Sum of Its Parts: The Integrated IT/Library Organization," EDUCAUSE Review (May–June 2004): 39–46; Joan K. Lippincott, "Net Generation Students and Libraries," EDUCAUSE Review (March–April 2005): 56–64; Jerry D. Campbell, "Changing a Cultural Icon: The Academic Library As a Virtual Destination," EDUCAUSE Review (January–February 2006): 16–30; Cathy De Rosa, Joanne Cantrell, Janet Hawk, and Alane Wilson, College Students' Perceptions of Libraries and Information Resources: A Report to the OCLC Membership (Dublin, Ohio: OCLC Online Computer Library Center, 2006).
135. For the calmest and clearest explanation of the project, see Hal R. Varian, "The Google Library Project," February 20, 2006, and for Google's own description, http://books.google.com (last accessed 9/15/2006).
136. For an example of a widely used text, see Ian H. Witten and David Bainbridge, How to Build a Digital Library (San Francisco: Morgan Kaufmann Publishers, 2003); Arlene G. Taylor, The Organization of Information (Westport, Conn.: Libraries Unlimited, 2004).
137. James W. Cortada, The Digital Hand: How Computers Changed the Work of American Financial, Telecommunications, Media, and Entertainment Industries (New York: Oxford University Press, 2006): 293–335.
138. One of the earliest sources on this topic was Peter Freeman and William Aspray, The Supply of Information Technology Workers in the United States (Washington, D.C.: Computing Research Association, 1999).
139. Marc S. Tucker, "Computers in the Schools: What Revolution?" Journal of Communication 35, no. 4 (autumn 1985): 19.
140. Peat Marwick, "Microcomputer Use in Higher Education," Chronicle of Higher Education, May 1987.
141. Kenneth C. Green and Skip Eastman, Campus Computing 1993 (Los Angeles: The Technology, Teaching and Scholarship Project, February 1994): 4.
142. Ibid., 8.
143. Ibid., 9–10.
144. Ibid., 21.
145. Ibid., 26.
146. Glenda Morgan, Faculty Use of Course Management Systems (Boulder, Colo.: ECAR, May 2003).
147. Urban legend would have us believe that all students downloaded music off the Internet in some illegal fashion, an image promoted by the Recorded Music Industry during its campaign to block this behavior. I have discussed this issue elsewhere: James W. Cortada, The Digital Hand: How Computers Changed the Work of American Financial, Telecommunications, Media, and Entertainment Industries (New York: Oxford University Press, 2006): 405–409.
148. Karen Kaminski, Pete Seel, and Kevin Cullen, "Technology Literate Students? Results from a Survey," EDUCAUSE Quarterly, no. 3 (2003): 37.
149. Ibid., 37, based on a survey of one university.
150. For example, Judith Borreson Caruso, ECAR Study of Students and Information Technology, 2004: Convenience, Connection, and Control (Boulder, Colo.: ECAR, September 2004).
151. Paul Davidson, "Gadgets Rule on College Campuses," Money, March 29, 2005, B1–B2; Bill Wolff, "Laptop Use in University Common Spaces," EDUCAUSE Quarterly, no. 1 (2006): 74–76.
152. Sara Kiesler and Lee Sproull, Computing and Change on Campus (Cambridge: Cambridge University Press, 1987): 99.
153. BITNET was a cooperative network established in 1981 by a group of American universities, used largely for e-mail. Use peaked in 1991 with some 500 participating institutions, after which use of the Internet displaced it as a network richer in function.
154. Susan H. Russell, Manie H. Collier, and Mary Hancock, 1994 Study of Communications Technology in Higher Education (Menlo Park, Calif.: SRI International, 1994): 3–5, 71–88.
155. Ibid., 80.
156. Edward C. Warburton, Xianglei Chen, and Ellen M. Bradburn, Teaching with Technology: Use of Telecommunications Technology by Postsecondary Instructional Faculty and Staff in Fall 1998, NCES 2002–161 (Washington, D.C.: U.S. Department of Education, 2002).
157. Scott Thorne, "The Data War," EDUCAUSE Quarterly, no. 3 (2000): 26–30; Clifford Lynch, "Why Broadband Really Matters: Applications and Architectural Challenges," EDUCAUSE Quarterly, no. 2 (2000): 59–62.
158. ECAR, Wireless Networking in Higher Education in the U.S. and Canada (Boulder, Colo.: ECAR, June 2002), http://www.educause.edu/ir/library/pdf/ERS0202/ekf0202.pdf (last accessed 6/22/2006).
159. Steve Jones and Mary Madden, The Internet Goes to College: How Students Are Living in the Future with Today's Technology (Washington, D.C.: Pew Internet and American Life, September 15, 2002).
160. Ibid., 9.
161. Indeed, many guides and anthologies of such programs began appearing. For example, see Pat Criscito, Barron's Guide to Distance Learning (Hauppauge, N.Y.: Barron's Educational Series, 1999); Matthew Helm and April Helm, Get Your Degree Online (New York: McGraw-Hill, 2000).
162. In various surveys conducted in the early 2000s, the number one or two issue in IT in higher education was the lack of sufficient funding. Leslie Maltz, Peter B. DeBlois, and EDUCAUSE Current Issues Committee, "Trends in Current Issues, Y2K-2005," EDUCAUSE Quarterly, no. 2 (2005): 6–23.
163. Kenneth C. Green and Steven W. Gilbert, "Great Expectations: Content, Communications, Productivity, and the Role of Information Technology in Higher Education," Change (March–April 1995): 12.
164. Ibid., 18.
165. Judith A. Pirani and Gail Salaway, Information Technology Networking in Higher Education: Campus Commodity and Competitive Differentiator (Boulder, Colo.: ECAR, February 2005): 1.
166. Richard N. Katz and Associates, Web Portals and Higher Education (San Francisco: Jossey-Bass, 2002); Michael Looney and Peter Lyman, "Portals in Higher Education: What They Are, and What Is Their Potential?" EDUCAUSE Review (July–August 2000): 28–36; Karen Rivedal, "Wireless Is the Way at UW," Wisconsin State Journal, September 7, 2005, B1–B2; Richard N. Katz, "The ICT Infrastructure: A Driver of Change," EDUCAUSE Review (July–August 2002): 50, 52, 54, 56, 58–61.
167. Automation Consultants, Inc., Office Automation Applications (New York: Automation Consultants, Inc., circa 1958–62), CBI 55, "Market Reports," Box 70, Folder 2, Archives of the Charles Babbage Institute, University of Minnesota, Minneapolis.
168. "Computing in the University," Datamation (May 1962): 27–30.
169. President's Science Advisory Committee, Computers in Higher Education, 40.
170. Lawrence P. Grayson and Janet B. Robins, U.S. Office of Education Support of Computer Projects, 1965–1971 (Washington, D.C.: U.S. Government Printing Office, 1972): 7.
171. John W. Hamblen, Inventory of Computers in U.S. Higher Education, 1966–67 (Washington, D.C.: National Science Foundation, 1970): III-2.
172. G. A. Comstock, "National Utilization of Computers," in Roger E. Levien, ed., The Emerging Technology: Instructional Uses of the Computer in Higher Education (New York: McGraw-Hill, 1972): 127.
173. John Fralick Rockart and Michael S. Scott Morton, Computers and the Learning Process in Higher Education (Stanford, Calif.: Carnegie Commission on Higher Education, 1975), 175.
174. Hamblen, Inventory of Computers in U.S. Higher Education, 1966–67, II-4.
175. For discussion of some of these issues, see Carole Cotton, "Understanding the Changing Forces in Information Technology," paper presented at CAUSE92, December 1–4, 1992, Dallas, Texas, http://www.educause.edu/ir/library/text/CNC9247.txt (last accessed 9/1/2006).
176. Kenneth C. Green and Skip Eastman, Campus Computing 1991: The EDUCOM-USC Survey of Desktop Computing in Higher Education (Los Angeles: Center for Scholarly Technology, University of Southern California, 1992): 1.
177. Ibid., 1992 (published 1993) and 1993 (published 1994).
178. William A. Sederburg, "The Net-Enhanced University," EDUCAUSE Review (September–October 2002): 65.
179. Jack McCredie, "Does IT Matter to Higher Education?" EDUCAUSE Review (November–December 2003): 17.
180. Ruch, Higher Education, Inc., 135–159.
181. James J. Duderstadt and Farris W. Womack, The Future of the Public University in America: Beyond the Crossroads (Baltimore: Johns Hopkins University Press, 2003).
182. Maeroff, A Classroom of One, 268–283.
183. Richard N. Katz to author, September 12, 2006.
Chapter 10
1. John W. Kendrick, Productivity Trends in the United States (Princeton, N.J.: Princeton University Press, 1961): 612–621; Fritz Machlup in the first of several books on this theme, The Production and Distribution of Knowledge in the United States (Princeton, N.J.: Princeton University Press, 1962): 6, 31, 325, 343–347, 350, 360, 363, 365–367.
2. For some recent examples of the avoidance of the public sector, see Anita M. McGahan, How Industries Evolve: Principles for Achieving and Sustaining Superior Performance (Boston: Harvard Business School Press, 2004); Dale W. Jorgenson, Productivity, vol. 1, Postwar U.S. Economic Growth (Cambridge, Mass.: MIT Press, 1995). For works that discuss the public sector only in terms of its regulatory or economic stimulation role, and not as a participant in the sense of a private sector industry, see, for example, Graham Tanaka, Digital Deflation: The Productivity Revolution and How It Will Ignite the Economy (New York: McGraw-Hill, 2004); Carl Shapiro and Hal R. Varian, Information Rules: A Strategic Guide to the Network Economy (Boston: Harvard Business School Press, 1999).
3. For example, see U.S. Census Bureau, Statistical Abstract of the United States: 2006 (Washington, D.C.: U.S. Government Printing Office, 2005): table 415, p. 272.
4. Paul N. Edwards, The Closed World: Computers and the Politics of Discourse in Cold War America (Cambridge, Mass.: MIT Press, 1996).
5. Robert E. Cole, Managing Quality Fads: How American Business Learned to Play the Quality Game (New York: Oxford University Press, 1999): 18–45.
6. The subject has stimulated a large body of advocacy literature from vendors suggesting alternative ways to handle procurement. For a recent collection of various perspectives, which also advocates reforms, see Mark A. Abramson and Roland S. Harris III, eds., The Procurement Revolution (Lanham, Md.: Rowman & Littlefield, 2003).
7. Ian D. Wyatt and Daniel E. Hecker, "Occupational Changes during the 20th Century," Monthly Labor Review (March 2006): 35–57.
8. In conducting research for this paragraph, I discussed the matter of changing jobs in the public sector with economists at the U.S. Bureau of Labor Statistics. As a result, the BLS is now conducting research on this issue with the intention of publishing a paper on long-term trends, probably in its journal, Monthly Labor Review; if so, it will be available at www.bls.gov/opub/mlr/mlrhome.htm.
9. Darrell M. West discovered the same dual set of expenses for old and new systems concurrently existing in federal and state governments in the 1990s; Digital Government: Technology and Public Sector Performance (Princeton, N.J.: Princeton University Press, 2005): 171–172.
10. David E. Nye, Technology Matters: Questions to Live With (Cambridge, Mass.: MIT Press, 2006): 49.
11. United States General Accounting Office, Protecting Information Systems Supporting the Federal Government and the Nation's Critical Infrastructures, GAO-03–121 (Washington, D.C.: U.S. Government Printing Office, January 2003): unpaginated first page.
12. Al Gore, "Introduction to the National Performance Review," in James W. Cortada and John A. Woods, eds., The Quality Yearbook 1995 (New York: McGraw-Hill, 1995): 155.
13. Ibid., 156.
14. Ibid., 157–158.
15. For a discussion of this theme, see Richard E. Caves, Multinational Enterprise and Economic Analysis (Cambridge: Cambridge University Press, 1982).
16. Kenneth Flamm, Targeting the Computer: Government Support and International Competition (Washington, D.C.: Brookings Institution, 1987): 11.
17. Kenneth Flamm, Creating the Computer: Government, Industry, and High Technology (Washington, D.C.: Brookings Institution, 1988): 251.
18. In about 2004–2005, stem cell research began receiving considerable state financial and legal support, first in California, then in Wisconsin, and subsequently in other states.
19. Dale W. Jorgenson, Mun S. Ho, and Kevin J. Stiroh, Productivity, vol. 3, Information Technology and the American Growth Resurgence (Cambridge, Mass.: MIT Press, 2005): 91–15; the gains from computers are assessed more minimally by Daniel E. Sichel, The Computer Revolution: An Economic Perspective (Washington, D.C.: Brookings Institution Press, 1997): 77.
20. Recently well studied: AnnaLee Saxenian, Regional Advantage: Culture and Competition in Silicon Valley and Route 128 (Cambridge, Mass.: Harvard University Press, 1994); Margaret Pugh O'Mara, Cities of Knowledge: Cold War Science and the Search for the Next Silicon Valley (Princeton, N.J.: Princeton University Press, 2005); Christophe Lécuyer, Making Silicon Valley: Innovation and the Growth of High Tech, 1930–1970 (Cambridge, Mass.: MIT Press, 2006): 253–294.
21. Donald F. Norris, "E-Government at the American Grassroots: Future Trajectory," Proceedings of the 38th Hawaii International Conference on System Science, 2005, 1–7.
22. While the number of economists who have studied this issue is large, for the clearest nontechnical explanation, I have been quite influenced by Tanaka, Digital Deflation. Do not be misled by the marketing hype of the title; this is a serious, well-written analysis. For more highly academic discussions, see Erik Brynjolfsson and Brian Kahin, Understanding the Digital Economy: Data, Tools, and Research (Cambridge, Mass.: MIT Press, 2000) or F. M. Scherer, New Perspectives on Economic Growth and Technological Innovation (Washington, D.C.: Brookings Institution Press, 1999).
23. David C. Mowery, Richard R. Nelson, Bhaven N. Sampat, and Arvids A. Ziedonis, Ivory Tower and Industrial Innovation: University-Industry Technology Transfer Before and After the Bayh-Dole Act (Stanford, Calif.: Stanford University Press, 2004): 180.
24. Ibid.
25. U.S. Department of Education, A Test of Leadership: Charting the Future of U.S. Higher Education (Washington, D.C.: U.S. Department of Education, September 2006): 20–23; and the Secretary of Education's speech on the matter at the National Press Club, Press Release, September 22, 2006, http://www.ed.gov/news/speeches/2006/09/09262006.html (last accessed 9/27/2006).
26. See, for example, United States Government Accountability Office, DOD Business Transformation: Sustained Leadership Needed to Address Long-standing Financial and Business Management Problems, GAO-05–723T (Washington, D.C.: U.S. Government Printing Office, June 8, 2005) and Information Technology: Improvements Needed to More Accurately Identify and Better Oversee Risky Projects Totaling Billions of Dollars, GAO-06-1099T (Washington, D.C.: U.S. Government Printing Office, September 7, 2006).
27. For case studies and discussion of the trend, see John M. Kamensky and Albert Morales, eds., Managing for Results 2005 (Lanham, Md.: Rowman and Littlefield, 2005).
28. United States Government Accountability Office, 21st Century Challenges: Reexamining the Base of the Federal Government, GAO-05–325SP (Washington, D.C.: U.S. Government Printing Office, February 2005): 77.
29. A key theme, for example, in Alfred D. Chandler, Jr., Inventing the Electronic Century: The Epic Story of the Consumer Electronics and Computer Industries (New York: Free Press, 2001); for a specific discussion of various technologies, see David C. Mowery and Nathan Rosenberg, Paths of Innovation: Technological Change in 20th-Century America (Cambridge: Cambridge University Press, 1998).
30. While focused on the private sector, for a well-informed economic analysis of the subject, see William J. Baumol, Alan S. Blinder, and Edward N. Wolff, Downsizing in America: Reality, Causes, and Consequences (New York: Russell Sage Foundation, 2003).
31. The literature on this theme is vast, and so far nobody has written a history of this managerial discussion.
32. It seemed that every private and public sector manager in the early 1990s was reading about reengineering as postulated by Michael Hammer and James Champy, Reengineering the Corporation (New York: Harper, 1992).
33. For a discussion of the issue, with proposed strategies for dealing with it, see James W. Cortada, Sally Drayton, Marc Le Noir, and Richard Lomax, When Governments Seek Future Prosperity: Maintaining Economic Strength and High Standards of Living (Somers, N.Y.: IBM Corporation, 2005).
34. GAO, 21st Century Challenges.
35. In a large-scale survey conducted in 2006, identity theft due to online transactions amounted to 0.03 percent, while the greatest sources of theft were corrupt employees (15 percent), friends and relatives (15 percent), theft from the mail (9 percent), misuse of data provided in-store, by mail, or by telephone (7 percent), data breaches (8 percent), computer viruses or hackers (5 percent), phishing (3 percent), and trashcans (1 percent); Jaikumar Vijayan, "Data Breaches Yield Few ID Thefts," Computerworld, September 18, 2006, 61.
36. Described by a group of distinguished computer scientists: Peter J. Denning and Robert M. Metcalfe, Beyond Calculation: The Next Fifty Years of Computing (New York: Springer-Verlag, 1997); Peter J. Denning, ed., Talking Back to the Machine: Computers and Human Aspiration (New York: Springer-Verlag, 1999); Peter J. Denning, ed., The Invisible Future: The Seamless Integration of Technology into Everyday Life (New York: McGraw-Hill, 2002).
37. Albert Morales and Jonathan Breul, Market-Based Government through Innovation (Somers, N.Y.: IBM Corporation, 2006): 8–9.
38. David M. Walker, Transforming Government to Meet the Challenges and Capitalize on the Opportunities of the 21st Century, GAO-07-813CG (Washington, D.C.: U.S. Government Printing Office, April 2007), 5.
BIBLIOGRAPHIC ESSAY
This brief bibliographic essay discusses some of the most obvious and useful sources for those interested in exploring in more detail the subject of this book. Citations in the endnotes point to sources and to additional materials used in highly specific ways, for example, Internet addresses. Those are not repeated below. I emphasize books rather than articles because the former cover broader subjects more suitable for the purposes of this essay.
General Sources
A study of any aspect of the role of government at any level often requires understanding basic information about the nation's economy and society, often through the use of statistical data. Equally useful is having data on the size of governments and their budgets, ideally over a long period of time. The standard works of reference include U.S. Bureau of the Census, Historical Statistics of the United States: Colonial Times to 1970, 2 vols. (Washington, D.C.: U.S. Government Printing Office, 1975), U.S. Government, Historical Tables: Budget of the United States Government Fiscal Year 2005 (Washington, D.C.: U.S. Government Printing Office, 2004), and the annual publication (and the most useful of all), U.S. Census Bureau, Statistical Abstract of the United States, published annually for over a hundred years (Washington, D.C.: U.S. Government Printing Office; the editions used for Digital Hand were the 122nd, published in 2001, and the 125th, published in 2005). Also useful as a quick reference is a standard college-level civics textbook, particularly if also rich in data and not just descriptions of how government works. For that purpose, see Cal Jillson, American Government: Political Change and Institutional Development (Fort Worth, Tex.: Harcourt Brace College Publishers, 1999).
Many federal, state, and local agencies also publish brochures, manuals, and book-length descriptions, statistics, and histories of themselves; many are cited in the endnotes to our chapters or are listed in this bibliographic essay. However, particularly in the case of federal agencies, officials often had additional information and internal publications not readily available to the public, which are of use in doing research. In many instances, by reaching out to officials in these agencies, one can obtain access to such materials without having to go through the time-consuming aggravation of forcing them to cooperate through legal means. As a general rule of thumb, agencies that collect statistics on themselves or others tend to be two to five years behind, with the normal exception of budgetary requests, which often also project out several years beyond the present. Some statistical agencies also forecast future events, expenditures, and statistical findings, such as employment trends and economic growth. Federal forecasts from statistical agencies tend to be quite accurate, such as those from the Bureau of Labor Statistics, and are relatively immune to political pressures to present a biased story.
Finally, there is a series of handbooks to the public sector, largely focused on the federal government. Perhaps the most useful books on the subject, which should be consulted before any other publication in this bibliography, are the volumes prepared by Peter Hernon and his colleagues, published over several decades. The latest general guide from him at the time my book was going to press was Peter Hernon, Harold C. Relyea, Robert E. Dugan, and Joan F. Cheverie, United States Government Information: Policies and Sources (Westport, Conn.: Libraries Unlimited, 2002). It discusses finding aids, sources of information, and each major branch of government with material on information, directories, and bibliography. Of particular use to those interested in the early responses of the federal government to the spread of computing in the 1970s and 1980s is Peter Hernon and Charles R. McClure, Federal Information Policies in the 1980's: Conflicts and Issues (Norwood, N.J.: Ablex Publishing, 1987). On the period following wide use of the Internet in the 1990s, there is the invaluable guide by Peter Hernon, Robert E. Dugan, and John A. Shuler, U.S. Government on the Web: Getting the Information You Need (Westport, Conn.: Libraries Unlimited, 3rd ed., 2003; 1st ed., 1999; 2nd ed., 2001).
Tax Management Applications
The literature on the American tax system is massive, consisting of thousands of articles and hundreds of books, but the vast majority address taxes themselves and not the management of the taxation process, let alone the role of computing. The majority of published material consists of articles on local taxation or about the IRS. The most useful material on the IRS consists of the over one dozen major audits done on the agency by the General Accounting Office (GAO) and the hearings of the House Ways and Means Committee. The first major study done of the IRS and its use of computing (covering the 1950s–1970s) was by Helen Margetts, Information Technology in Government: Britain and America (London: Routledge, 1999). She also compares the IRS to Britain's taxing agency and other government bodies, such as the U.S. Social Security Administration. For IT initiatives in the late 1980s forward, consult the brief but useful IRS publication usually published each year, Guide to the Internal Revenue Service for Congressional Staff. There is also Henry J. Aaron and Joel Slemrod, eds., The Crisis in Tax Administration (Washington, D.C.: Brookings Institution Press, 2004), which discusses the IRS of the early years of the new century and computing in detail. Finally, for a well-written, pro-IRS account rich in detail, we have the memoirs of the IRS Commissioner, covering the years 1997–2002, Charles O. Rossotti, Many Unhappy Returns: One Man's Quest to Turn Around the Most Unpopular Organization in America (Boston: Harvard Business School Press, 2005).
A series of books written or edited by Kenneth L. Kraemer and his colleagues provides considerable amounts of information about the role of IT in state and local governments, although not always about taxation. However, they are essential reading. These include, with John Leslie King, Computers and Local Government, 2 vols. (New York: Praeger, 1977); and, with King, Debora E. Dunkle, and Joseph P. Lane, Managing Information Systems: Change and Control in Organizational Computing (San Francisco: Jossey-Bass, 1989), this last including a large bibliography. Now a minor classic on taxation that should be consulted is Clara Penniman, State Income Taxation (Baltimore: Johns Hopkins University Press, 1980). For both state and federal tax issues and, on occasion, computing, see the National Tax Journal and Tax Law Review. Finally, do not overlook Kent Lassman, The Digital State 2002: How State Governments Use Digital Technologies (Washington, D.C.: Progress & Freedom Foundation, 2002) for the most modern period. For local government, the best source remains American City, also sometimes titled American City & County. The issue of sales taxes on Internet transactions is beginning to be covered by public sector magazines and academic journals, with examples cited in the endnotes for chapter 2.
However, there is also Karl Frieden, Cybertaxation: The Taxation of E-Commerce (Chicago: CCH Inc., 2000), an early publication on the topic. Use of computers by tax preparers and taxpayers has not yet been studied in detail. However, begin with Henry J. Aaron and Joel Slemrod, eds., The Crisis in Tax Administration (Washington, D.C.: Brookings Institution Press, 2004), which devotes a chapter to each. Next there is Suzanne Taylor, Kathy Schroeder, and John Doerr, Inside Intuit: How the Makers of Quicken Beat Microsoft and Revolutionized an Entire Industry (Boston: Harvard Business School Press, 2003). The most detailed discussion of the availability of tax preparation software for the public, which takes the story to about 1995, is in one chapter of a book by Martin Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog (Cambridge, Mass.: MIT Press, 2003).
Military Applications
No history of computing is complete without substantial discussion of how the Department of Defense devoted considerable sums for its development during the early years of the technology. However, before studying that story, it is useful to understand the history of the department itself; for that, the standard introductory work is by Roger R. Trask and Alfred Goldberg, The Department of Defense 1947–1997: Organization and Leaders (Washington, D.C.: U.S. Government Printing Office, 1997). The two standard works on how government and the DoD invested in the early development of computers were written by Kenneth Flamm, Targeting the Computer: Government Support and International Competition (Washington, D.C.: Brookings Institution, 1987) and Creating the Computer: Government, Industry, and High Technology (Washington, D.C.: Brookings Institution, 1988).
The military's first experiences with general purpose computers began during World War II with the development of the ENIAC. Many scholarly articles have been written about its development. However, the two standard works are by Nancy Stern, From ENIAC to UNIVAC: An Appraisal of the Eckert-Mauchly Computers (Bedford, Mass.: Digital Press, 1981) and Scott McCartney, ENIAC: The Triumphs and Tragedies of the World's First Computer (New York: Walker and Company, 1999). For a history of the business side of computer developments from the ENIAC through other derivative computers, consult Arthur L. Norberg, Computers and Commerce: A Study of Technology and Management at Eckert-Mauchly Computer Company, Engineering Research Associates, and Remington Rand, 1946–1957 (Cambridge, Mass.: MIT Press, 2005).
To understand the first major Cold War project—SAGE—begin with Paul N. Edwards, The Closed World: Computers and the Politics of Discourse in Cold War America (Cambridge, Mass.: MIT Press, 1996), which places both that project and computing in general into the overall context of how DoD responded to the Cold War. Then for detailed histories of this important and massive initiative, consult Kent C. Redmond and Thomas M. Smith, Project Whirlwind: The History of a Pioneer Computer (Bedford, Mass.: Digital Press, 1980) and their sequel, From Whirlwind to MITRE: The R&D Story of the SAGE Air Defense Computer (Cambridge, Mass.: MIT Press, 2000). Finally, there is a useful memoir, John F. Jacobs, The SAGE Air Defense System: A Personal History (Bedford, Mass.: MITRE Corporation, 1986). Analog computing also played an early, important role. For that story, see James S. Small, The Analogue Alternative: The Electronic Analogue Computer in Britain and the USA, 1930–1975 (London: Routledge, 2001).
The work of DARPA, the agency that did the actual funding and directing of much research, has been the subject of several major studies. Arthur L. Norberg and Judy E. O'Neill, Transforming Computer Technology: Information Processing for the Pentagon, 1962–1986 (Baltimore: Johns Hopkins University Press, 1996) discusses the early years of the agency. Janet Abbate, Inventing the Internet (Cambridge, Mass.: MIT Press, 1999) and Katie Hafner
Bibliographic Essay and Matthew Lyon, Where Wizards Stay Up Late: The Origins of the Internet (New York: Simon & Schuster, 1996) tell the story of how this form of telecommunications emerged. Finally, for projects from the 1980s and 1990s, also based on extensive archival research as were the other studies cited above, see Alex Roland and Philip Shiman, Strategic Computing: DARPA and the Quest for Machine Intelligence, 1983–1993 (Cambridge, Mass.: MIT Press, 2002). The number of publications available about the IT activities of the uniformed services is limited. However, for the Air Force, there is the excellent study on its culture and practices by a professional historian, Stephen B. Johnson, The United States Air Force and the Culture of Innovation, 1945–1965 (Washington, D.C.: U.S. Government Printing Office, 2002). For a beautifully illustrated history of how digital technology was used in aircraft, see Paul E. Ceruzzi, Beyond the Limits: Flight Enters the Computer Age (Cambridge, Mass.: MIT Press, 1989). The key study on the Navy is by David L. Boslaugh, When Computers Went to Sea: The Digitization of the United States Navy (Los Alamitos, Calif.: Computer Society, 1999). There are yet no comparable studies of the Army and Marine Corps. While I did not discuss NASA in this chapter, it did have a military facet to its work. To understand its role, see James E. Tomayko, Computers in Space: Journeys with NASA (Indianapolis, Ind.: Alpha Books, 1994) and also his newer study, Computers Take Flight: A History of NASA’s Pioneering Digital Fly-By-Wire Project (Washington, D.C.: U.S. Government Printing Office, 2000); Howard E. McCurdy, Inside NASA: High Technology and Organizational Change in the U.S. Space Program (Baltimore: Johns Hopkins University Press, 1993); and for a detailed analysis of the role of space in warfare in the early years of the twenty-first century, Zalmay Khalilzad and Jeremy Shapiro, eds., Strategic Appraisal: United States Air and Space Power in the 21st Century (Santa Monica, Calif.: RAND Corporation, 2002). The subject of information age warfare has generated a large body of publications. For descriptions of how warfare was already transforming to a digital format, see a book written by a reporter, James Adams, The Next World War: Computers Are the Weapons and the Front Line Is Everywhere (New York: Simon & Schuster, 1999), but also Martin C. Libicki, What Is Information Warfare? (Washington, D.C.: National Defense University Press, 1995). A group of American Army officers, many with combat experience, provided a different perspective, Robert L. Bateman III, ed., Digital War: A View from the Front Lines (Novato, Calif.: Presidio Press, 1999). For another useful assessment by military experts, see Robert L. Pfaltzgraff, Jr., and Richard H. Shultz, Jr., eds., War in the Information Age: New Challenges for U.S. Security (Washington, D.C.: Brassey’s, 1997). For an introduction to the growing body of material that argues warfare will be as much a civilian event as a military one, replete with hackers and cyberterrorism, see Winn Schwartau, Information Warfare (New York: Thunder’s Mouth Press, 1996). For an introduction to the argument that an over reliance on computers weakens American military strength, consult the various contributed papers in David Bellin and Gary Chaptman, eds., Computers in Battle: Will They Work? (Boston: Harcourt Brace Jovanovich, 1987). Regarding network centric warfare, the most useful introduction is by David S. Alberts, John J. 
Garstka, and Frederick P. Stein, Network Centric Warfare: Developing and Leveraging Information Superiority, 2nd ed. (Washington, D.C.: U.S. Department of Defense, Command and Control Research Program, 2003). On war gaming, see Peter P. Perla, The Art of Wargaming (Annapolis, Md.: Naval Institute Press, 1990), who provides a useful introduction to concepts and early programs. For a more current treatment, see Roland J. Yardley et al., Use of Simulation for Training in the U.S. Navy Surface Force (Santa Monica, Calif.: RAND Corporation, 2003).
Law Enforcement Applications
The earliest study of policing to discuss computers was probably William H. Hewitt, Police Records Administration (Rochester, N.Y.: Aqueduct Books, 1968); but the first important
study to appear on how police used computers was an edited volume of contributed essays looking at the activities of specific urban police departments, Kent W. Colton, ed., Police Computer Technology (Lexington, Mass.: Lexington Books, 1978). The second major study, by V. A. Leonard, added material about state activities and included chapters on functions and applications: The New Police Technology: Impact of the Computer and Automation on Police Staff and Line Performance (Springfield, Ill.: Charles C. Thomas, 1980). For activities taking place in computing in the 1980s, the standard work is by J. Van Duyn, Automated Crime Information Systems (Blue Ridge Summit, Penn.: TAB Professional and Reference Books, 1991); and for the 1990s, see the useful "how to" study by Jim Chu, Law Enforcement Information Technology: A Managerial, Operational, and Practitioner Guide (Boca Raton, Fla.: CRC Press, 2001). Finally, for use by law enforcement management at the dawn of the new century, with absolutely no historical perspective, there is Charles Drescher and Martin Zaworski, The Design of Information Systems for Law Enforcement (Springfield, Ill.: Charles C. Thomas, 2000).
A seminal report, crucial to an understanding of policing in the second half of the twentieth century, is the President's Commission on Law Enforcement and Administration of Justice, Task Force Report: Science and Technology (Washington, D.C.: U.S. Government Printing Office, 1967); see also its companion report, The Challenge of Crime in a Free Society (Washington, D.C.: U.S. Government Printing Office, 1967). For a comprehensive overview of applications late in the century, there is Department of Justice, Office of Justice Programs, Bureau of Justice Statistics, Use and Management of Criminal History Record Information: A Comprehensive Report (Sacramento, Calif.: SEARCH Group, 1993), and an updated report with the same title, 2001, published by the U.S. Government Printing Office. The series of IBM publications describing the uses of computing in specific police departments is a detailed and useful collection of material worth examining.
For fingerprinting, one of the earliest descriptions was by Andre Moenssens, Fingerprint Techniques (New York: Chilton, 1971). For a later description of the application, there is the useful study by the U.S. Office of Technology Assessment, The FBI Fingerprint Identification Automation Program: Issues and Options—Background Paper, OTA-BP-TCT-84 (Washington, D.C.: U.S. Government Printing Office, 1991). While many articles have been published on crime mapping, for a comprehensive explanation of this use of computers, see National Institute of Justice, Mapping Crime: Principle and Practice (Washington, D.C.: U.S. Department of Justice, 1999). For a similar discussion of dispatching, see Tom McEwen, Jacqueline Ahn, Steve Pendleton, Barbara Webster, and Gerald Williams, Computer Aided Dispatch in Support of Community Policing (Washington, D.C.: U.S. Department of Justice, February 2004). The endnotes contain numerous references to articles on each of these applications. On telecommunications after radio went digital and computing wireless, consult the National Institute of Justice, State and Local Law Enforcement Wireless Communications and Interoperability: A Quantitative Analysis (Washington, D.C.: U.S. Department of Justice, 1998). By the end of the 1980s, police began using computers to do research on crimes.
However, the first major descriptions of such applications did not appear until late in the century. Some of the earliest of these included William Schwabe, Crime-Fighting Technology: The Federal Role in Assisting State and Local Law Enforcement (Santa Monica, Calif.: RAND Corporation, 1999) for background information; for two early "how to" guides, see Mark A. Stallo, Using Microsoft Office to Improve Law Enforcement Operations: Crime Analysis, Community Policing, and Investigations (n.p.: privately printed, circa 2000) and Tim Dees, Online Resource Guide for Law Enforcement (Upper Saddle River, N.J.: Pearson, 2002). Police also used the same software tools as lawyers and judges. Standard reference works include Kathleen M. Carrick, LEXIS: A Legal Research Manual (Dayton, Ohio: Mead Data Central, 1989), which was the vendor's "user guide" and which appeared in many editions and versions over the rest of the century; Ralph D. Thomas, How to Investigate by Computer (Austin,
Texas: Thomas Investigative Publications, 1994); the massive guide by Christopher G. Wren and Jill Robinson Wren, Using Computers in Legal Research: A Guide to LEXIS and WESTLAW (Madison, Wis.: Adams & Ambrose Publishing, 1994); Judy A. Long, Computer Aided Legal Research (n.p.: Thomson, 2003). None, however, provide any historical commentary.
There are no comprehensive studies on the role of technology in courts and corrections. However, one early useful study provides an introduction to the issues and applications in corrections: U.S. Department of Justice, Bureau of Justice Statistics, State and Federal Corrections Information Systems: An Inventory of Data Elements and an Assessment of Reporting Capabilities (Washington, D.C.: U.S. Department of Justice, August 1998).
On fighting computer crime, various contemporary monographs, aimed at all participants in the law enforcement ecosystem, shed light on practices. One of the earliest studies is by Stephen W. Leibholz and Louis D. Wilson, Users' Guide to Computer Crime: Its Commission, Detection and Prevention (Radnor, Penn.: Chilton Book Company, 1974), which provides historical accounts of crimes, not just suggestions on how to avoid them. The most important books on computer crime were written by Donn B. Parker. To span the breadth of his work, begin with his Crime by Computer (New York: Charles Scribner's Sons, 1976), which covers the period from the late 1950s to the mid-1970s; then consult his Fighting Computer Crime: A New Framework for Protecting Information (New York: John Wiley & Sons, 1998), which provides much information on activities in the 1980s and 1990s. For many years the standard reference work used by police, written by Parker and colleagues, was U.S. Department of Justice, Law Enforcement Assistance Administration, Criminal Justice Resource Manual on Computer Crime (Washington, D.C.: U.S. Government Printing Office, 1979). Other studies on early computer crimes include August Bequai, Computer Crime (Lexington, Mass.: Lexington Books, 1978), which the author updated and expanded in Technocrimes (Lexington, Mass.: Lexington Books, 1987). See also Franklin Clark and Ken Diliberto, Investigating Computer Crime (Boca Raton, Fla.: CRC, 1996); Gerald L. Kovacich and William C. Boni, High Technology-Crime Investigator's Handbook: Working in the Global Information Environment (Boston: Butterworth-Heinemann, 2000); R. L. Bintliff, Complete Manual of White Collar Crime Detection and Prevention (Englewood Cliffs, N.J.: Prentice-Hall, 1993); David Icove, Karl Seger, and William Von Storch, Computer Crime: A Crimefighter's Handbook (Sebastopol, Calif.: O'Reilly and Associates, 1995); U.S. Department of Justice, Office of Justice Programs, Bureau of Justice Statistics, Organizing for Computer Crime Investigation and Prosecution (Washington, D.C.: U.S. Government Printing Office, 1989).
There is a large body of material about hackers. For a spectrum of views, see Bernadette H. Schell, John L. Dodge, and Steve S. Moutsatsos, The Hacking of America: Who's Doing It, Why, and How (Westport, Conn.: Quorum Books, 2002); Katie Hafner and John Markoff, Cyberpunk: Outlaws and Hackers on the Computer Frontier (New York: Touchstone, 1992); The Knightmare, Secrets of a Super Hacker (Port Townsend, Wash.: Loompanics Unlimited, 1994); Steven Levy, Hackers: Heroes of the Computer Revolution (New York: Anchor Press, 1984); Guy L.
Steele et al., The Hacker's Dictionary (New York: Harper and Row, 1983); Bruce Sterling, The Hacker Crackdown: Law and Disorder on the Electronic Frontier (New York: Bantam Books, 1992). Early studies of crimes committed over the Internet include William C. Boni and Gerald L. Kovacich, I-Way Robbery: Crime on the Internet (Boston: Butterworth-Heinemann, 1999); Charles Platt, Anarchy Online, Net Crime, Net Sex (New York: Harper, 1996); Julian Dibbell, My Tiny Life: Crime and Passion in a Virtual World (New York: Henry Holt, 1998); Brian McWilliams, Spam Kings: The Real Story behind the High-Rolling Hucksters Pushing Porn, Pills, and @*#?% Enlargements (Cambridge, Mass.: O'Reilly, 2004); Steven Branigan, High-Tech Crimes Revealed: Cyberwar Stories from the Digital Front (Boston: Addison-Wesley, 2005). Finally, for ongoing information about developments in the early twenty-first century, consult the Web sites of the various bureaus within the U.S. Department of Justice, many of
which are cited in the endnotes to chapter 4. IBM's archive also holds a substantial collection of case studies of the use of computers across the entire period, from policing through corrections.
Social Security Administration, Bureau of the Census, and U.S. Postal Service Applications
There is no general institutional history of the Social Security Administration; the vast body of material on this agency relates to the national debate on the pension program itself, a dialogue that extended from its founding in the 1930s to the present. Regarding computing, an important early work is essential: Charles A. Coffindaffer, "The Conversion of Social Security Recordkeeping Operations to Electronic Data Processing" (M.S. thesis, George Washington University, 1963), but see also an SSA publication, History of Installation of Electronic Data Processing in OASDI Payment Centers (Baltimore: Social Security Administration, 1964). For the 1980s, see Office of Technology Assessment, Social Security Administration and Information Technology, Special Report OTA-CIT-311 (Washington, D.C.: U.S. Government Printing Office, 1986) and Martha Derthick, Agency under Stress: The Social Security Administration in American Government (Washington, D.C.: Brookings Institution, 1990). For the entire period of the 1960s through the 1980s, there is also an excellent chapter in Helen Margetts, Information Technology in Government: Britain and America (London: Routledge, 1999). On the period of the late 1980s and early 1990s, there is an assessment of IT at the agency by the Office of Technology Assessment, The Social Security Administration's Decentralized Computer Strategy: Issues and Options, OTA-TCT-591 (Washington, D.C.: U.S. Government Printing Office, April 1994). The SSA also has an historian and internal archives.
The Bureau of the Census has a richer body of historical literature. While a number of institutional histories exist of the bureau, the standard work is Margo J. Anderson, The American Census: A Social History (New Haven, Conn.: Yale University Press, 1988). She also co-authored, with Stephen E. Fienberg, a study dealing with census-taking issues leading up to the 2000 count, Who Counts? The Politics of Census-Taking in Contemporary America (New York: Russell Sage Foundation, 1999). On the precomputer-era history of information technology at the bureau, there is Leon Truesdell, The Development of Punch Card Tabulation in the Bureau of the Census, 1890–1940 (Washington, D.C.: U.S. Government Printing Office, 1965). For an understanding of both the work of this bureau and of other statistical agencies, the essential work is Joseph W. Duncan and William C. Shelton, Revolution in United States Government Statistics, 1926–1976 (Washington, D.C.: U.S. Government Printing Office, 1978). However, the most useful sources on the role of IT at the bureau are its "procedural histories," written after each decennial census, in which staff record an account of the process, the challenges and successes encountered, and recommendations for the next census. These are very detailed studies and always contain a considerable amount of detail regarding the role of computers and telecommunications. There are histories for each census, beginning with that of 1940 and extending through that of 2000. They normally appear in print several years after a census. They served as the backbone for the discussion of the role of IT at the bureau in this book, and their various titles can be found in the endnotes for chapter 5. Finally, there is a very brief history of IT at the bureau, U.S. Bureau of the Census, 100 Years of Data Processing: The Punchcard Century (Washington, D.C.: U.S. Department of Commerce, 1991).
The bureau also has resident historians.
Most recently, the USPS published a brief history of the role of computing in its organization, James L. Golden, IT Governance (Washington, D.C.: U.S. Postal Service, 2005). The key sources on computing in the postal system, however, are the annual reports of the postmasters general. The ones published before 1971 are quite detailed, particularly regarding
mechanization of mail handling, while the post-1971 editions read very much like a corporate annual report; nonetheless, collectively they are the most useful sources available in print. The General Accounting Office has published a myriad of studies on postal operations over the past thirty years, many of which comment on the role of IT as well. Not to be overlooked is a short master's thesis by Harris Mahmud, "Automation and the Future of the United States Postal Service" (M.S. thesis, California State University, Long Beach, August 2000). There are a number of key works that discuss the future of the USPS and the impact of the Internet and other technologies and competitive forces on this organization. They include Douglas K. Adie, Monopoly Mail: Privatizing the United States Postal Service (New Brunswick, N.J.: Transaction, 1989); Alan L. Sorkin, The Economics of the Postal System: Alternatives and Reforms (Lexington, Mass.: Lexington Books, 1990); Michael Schuyler, Wrong Delivery: The Postal Service in Competitive Markets (Washington, D.C.: Institute for Research on the Economics of Taxation, 1998); and Edward L. Hudgins, ed., Mail@The Millennium: Will the Postal Service Go Private? (Washington, D.C.: CATO Institute, 2000); see also his earlier work, The Last Monopoly: Privatizing the Postal Service for the Information Age (Washington, D.C.: CATO Institute, 1996). Finally, for an overview of IT circa the early 2000s, see the IT strategic plan of the USPS, Strategic Transformation Plan 2006–2010 (Washington, D.C.: U.S. Postal Service, 2005). The USPS also has a resident historian.
Computing in the Federal Government
There are no formal histories of early computing applications in the federal government; however, in the period 1950s–1970s, various inventories were created. The earliest were often consultants' reports prepared for the emerging Computer Industry. Most were never published; many are housed at the Charles Babbage Institute at the University of Minnesota at Minneapolis. Early government inventories are spotty, but nonetheless useful, and are cited in the endnotes to chapter 6. However, key publications include Bureau of the Budget, Inventory of Automatic Data Processing (ADP) Equipment in the Federal Government (Washington, D.C.: Bureau of the Budget, August 1962) and two subsequent editions published in June 1965 and July 1966. For the period from the mid-1960s to the late 1990s, the General Services Administration published raw inventory data with little or no commentary, normally entitled Automated Data Processing Equipment in the U.S. Government (Washington, D.C.: U.S. General Services Administration, various years). Richer in detail and with written analysis about trends in the acquisition of IT is a set of three publications authored by the National Bureau of Standards, Computers in the Federal Government: A Compilation of Statistics (Washington, D.C.: U.S. Government Printing Office, June 1977, April 1979, and November 1982). For the period of the 1980s and 1990s, there is additionally a series of IT plans for the entire government, which include data on existing IT and planned events and expenditures, written jointly by the Office of Management and Budget, General Services Administration, and the Department of Commerce, Information Resources Management Plan of the Federal Government (Washington, D.C.: U.S. Government Printing Office, November 1990, November 1991, August 1992, November 1992, March 1995, August 1996). Additionally, for the period from the late 1960s down to the present, an essential source of information is the dozens upon dozens of reports of the General Accounting Office on various aspects related to IT, which were extensively used and cited in this book.
With the arrival of the Internet, and particularly the interest of several presidential administrations in leveraging IT, the stream of pronouncements and commission studies grew, particularly after 1995; many are cited in endnotes to chapters 1 through 6. By the mid-1990s, however, a variety of guides began to appear in print describing government
publications that are of immediate relevance to the history of federal computing. Three volumes, essential for any understanding of the topic, provide a detailed analysis of federal computing issues of the 1970s, 1980s, and early 2000s: Peter Hernon and Charles R. McClure, Federal Information Policies in the 1980's (Norwood, N.J.: Ablex Publishing Corporation, 1987); Peter Hernon, Charles R. McClure, and Harold C. Relyea, eds., Federal Information Policies in the 1990s: Views and Perspectives (Norwood, N.J.: Ablex Publishing Corporation, 1996); and Peter Hernon, Harold C. Relyea, Robert E. Dugan, and Joan F. Cheverie, United States Government Information: Policies and Sources (Westport, Conn.: Libraries Unlimited, 2002). For a description of the Clinton administration's "information highway" initiatives, see Brian Kahin and Ernest Wilson, eds., National Information Infrastructure Initiatives: Vision and Policy Design (Cambridge, Mass.: MIT Press, 1997). The best single-volume overview of federal Web sites is Peter Hernon, Robert E. Dugan, and John A. Shuler, U.S. Government on the Web: Getting the Information You Need (Westport, Conn.: Libraries Unlimited, in 3 editions, 1999, 2001, and 2003). Each has different information, so all should be consulted. Also, for almost any year since the 1950s, individual cabinet-level departments published annual reports that contained various amounts of detail about the role of IT and telecommunications within their organizations. Those of the USPS and GSA are particularly useful.
Finally, the best analysis of the role of the Internet at the federal, state, and local levels is Darrell M. West, Digital Government: Technology and Public Sector Performance (Princeton, N.J.: Princeton University Press, 2005). The only caution with this book is that much of its data on the use of the Internet ends with 2001 or 2002, and as demonstrated in The Digital Hand, a great deal has gone on since then. West has maintained a Web site, however, that contains additional material: InsidePolitics.org.
Computing in State Governments
Computing in state governments has been examined as a contemporary management issue by academics and industry magazine reporters. The first book-length discussion of IT at the state and local level was written by Harry H. Fite, The Computer Challenge to Urban Planners and State Administrators (Washington, D.C.: Spartan Books, 1965), followed soon after by a study that also provided descriptions of early uses and managerial considerations, Geoffrey Y. Cornog, James B. Kenney, Ellois Scott, and John J. Connelly, eds., EDP Systems in Public Administration (Chicago: Rand McNally, 1968). Another early study, complete with scenarios, was written by Kenneth C. Laudon, Computers and Bureaucratic Reform (New York: John Wiley & Sons, 1974), discussing county use of computers as well. However, the most prolific and important commentators on the topic at both the state and county/municipal level in the 1970s and 1980s were Kenneth L. Kraemer and John Leslie King who, through a series of articles (many cited in the endnotes to chapter 7) and book-length anthologies, discussed IT managerial issues and conducted surveys on the role of IT. Their two key studies are Computers and Local Government: A Review of the Research, 2 vols. (New York: Praeger, 1977) and, with Debora E. Dunkle and Joseph P. Lane, Managing Information Systems: Change and Control in Organizational Computing (San Francisco: Jossey-Bass, 1989). Also useful for states is another anthology of papers on managerial issues, Sharon L. Caudle et al., Managing Information Technology: New Directions in State Government (Syracuse, N.Y.: School of Information Studies, Syracuse University, 1989). For the period of the 1990s and beyond, see Robert Anderson, Tora K. Bikson, Rosalind Lewis, Joy Moini, and Susan Straus, Effective Uses of Technology: Lessons about State Governance Structures and Processes (Santa Monica, Calif.: RAND, 2003). The most useful materials on uses of the computer, however, are in the form of articles either in trade publications, such as Governing, or academic journals, such as Public Administration Review. In addition, IBM and others published a string of application briefs describing how specific state governments used computers.
The subject of e-democracy often is discussed as it affects all levels of government, but largely state and local, and the literature is now vast and mostly polemical. However, useful introductions to the issues include Lawrence K. Grossman, The Electronic Republic: Reshaping Democracy in the Information Age (New York: Penguin, 1995), which is probably the most useful volume to begin with. Since the Internet stimulated an enormous amount of discussion on the topic, see Graeme Browning, Electronic Democracy: Using the Internet to Affect American Politics (Wilton, Conn.: Pemberton Press Books/Online, Inc., 1996), and, for a more current review of similar themes, Steve Davis, Larry Elin, and Grant Reeher, Click on Democracy: The Internet's Power to Change Political Apathy into Civic Action (Boulder, Colo.: Westview, 2002). On some of the political issues involved, there is Anthony G. Wilhelm, Digital Nation: Toward an Inclusive Information Society (Cambridge, Mass.: MIT Press, 2004). On the role of the Internet, one should also consult Jane E. Fountain's excellent study, Building the Virtual State: Information Technology and Institutional Change (Washington, D.C.: Brookings Institution Press, 2001).
Computing in Local Governments

Most of the literature on counties and municipalities discusses both as if they were one interchangeable topic. In addition to the publications just discussed regarding states, one of the earliest and most important works was written by Kenneth L. Kraemer, William H. Mitchell, Myron E. Weiner, and O. E. Dial, Integrated Municipal Information Systems: The Use of the Computer in Local Government (New York: Praeger, 1974), followed by a later study, James N. Danziger, William H. Dutton, Rob Kling, and Kenneth L. Kraemer, eds., Computers and Politics: High Technology in American Local Government (New York: Columbia University Press, 1982). The essential source for a serious appreciation of computing at the municipal level is American City (later retitled American City & County), published every month throughout the entire half century, which carried hundreds of stories about the use of computers in all types of communities. As is the case with federal, state, and county governments, there are no book-length histories of the role of IT, so the story has to be cobbled together from industry publications. That said, for the very earliest period there is one description of potential applications that lays out what communities could do, much of which turned out to be what many did: Myron E. Weiner, An Integrated Data System for Small-Medium Sized Local Government (Storrs, Conn.: Institute of Public Service, University of Connecticut and International City Managers' Association, 1966). The International City Managers' Association (ICMA) published various surveys, annuals, and other papers dealing with local IT over the years as well. Also on the earliest period of computing, there is Charles W. Rice, Jr., and Jeffrey L. Anderson, Electronic Data Processing in Local Government (Moscow, Idaho: University of Idaho, 1968). For GIS, there is a useful history, complete with bibliography, Timothy W. Foresman, ed., The History of GIS (Upper Saddle River, N.J.: Prentice Hall PTR, 1998). For additional material on the diffusion of this application on a global basis, see Ian Masser and Harlan J. Onsrud, eds., Diffusion and Use of Geographic Information Technologies (Dordrecht: Kluwer Academic Publishers, 1993).
Computing in Education (K–12)

It is essential to understand the culture and history of this industry before attempting to explore the role of IT in it. There are several excellent studies one can turn to, beginning with David Tyack and Larry Cuban, Tinkering toward Utopia: A Century of Public School Reform (Cambridge, Mass.: Harvard University Press, 1995); then read William J. Reese, America's Public Schools: From the Common School to "No Child Left Behind" (Baltimore: Johns Hopkins University Press, 2005), which brings the story right into the twenty-first century. For background on the role of administration in education, see Larry Cuban, The Managerial Imperative and the Practice of Leadership in Schools (Albany: State University of New York Press, 1988) and David Tyack and Elisabeth Hansot, Managers of Virtue: Public School Leadership in America, 1820–1980 (New York: Basic Books, 1982). Because of his importance in education's history, see also Larry Cuban, Oversold and Underused: Computers in the Classroom (Cambridge, Mass.: Harvard University Press, 2001), which, in addition to discussing computers, contains much insight on the culture of teaching. Because the teaching ethos must be understood before examining the role of computers, consult two studies, the first by Larry Cuban, How Teachers Taught: Constancy and Change in American Classrooms, 1890–1980 (New York: Longman, 1984), and David B. Tyack, The One Best System: A History of American Urban Education (Cambridge, Mass.: Harvard University Press, 1974). Finally, for a study specifically addressing the effects of computing on the culture of teaching, see C. A. Bowers, The Cultural Dimensions of Educational Computing: Understanding the Non-Neutrality of Technology (New York: Teachers College, Columbia University Press, 1988). On the early role of computers in education, there is Anthony G. Oettinger, Run, Computer, Run: The Mythology of Educational Innovation (Cambridge, Mass.: Harvard University Press, 1969). An early influential writer who framed many of the teaching issues of the period produced a book of contributed essays, Robert P. Taylor, ed., The Computer in the School: Tutor, Tool, Tutee (New York: Teachers College Press, Columbia University, 1980). Seymour Papert's books are essential reading for understanding the case for using computers: Mindstorms: Children, Computers, and Powerful Ideas (New York: Basic Books, 1980) and The Children's Machine: Rethinking School in the Age of the Computer (New York: Basic Books, 1993). For one of the earliest examinations of digital applications in education, see Charles A. Darby, Jr., Arthur L. Korotkin, and Tania Romashko, The Computer in Secondary Schools: A Survey of Its Instructional and Administrative Usage (New York: Praeger, 1972). A decade later a new account appeared, written by Nancy Protheroe, Deirdre Carroll, and Tracey Zoetis, School District Uses of Computer Technology (Arlington, Va.: Educational Research Services, 1982), but see also Michael T. Sherman, Computers in Education: A Report (Concord, Mass.: Bates Publishing, 1983). For examples of arguments against the use of computers in schools, see Alison Armstrong and Charles Casement, The Child and the Machine: How Computers Put Our Children's Education at Risk (Beltsville, Md.: Robins Lane Press, 2000) and the better-articulated case presented by Clifford Stoll, High Tech Heretic: Why Computers Don't Belong in the Classroom and Other Reflections by a Computer Contrarian (New York: Doubleday, 1999). Three works introduce the concept of teaching machines and reflect contemporaneous equipment and its use. Wilbur L. Ross, Jr., et al., Teaching Machines: Industry Survey and Buyers' Guide (New York: Center for Programmed Instructions, Inc., 1962) is as close to a vendor's guide as can be found for the period just when computers came on the scene; Benjamin Fine, Teaching Machines (New York: Sterling Publishing, 1962) is more of a description of applications, as is Lawrence M. Stolurow, Teaching by Machine (Washington, D.C.: U.S. Government Printing Office, 1961).
The latter is difficult to find, but is available at national government repository libraries. For these machines and the transition to computing, also consult Karl U. Smith and Margaret Foltz Smith, Cybernetic Principles of Learning and Educational Design (New York: Holt, Rinehart and Winston, 1966). For PLATO, one of the first mainframe teaching tools, there is D. Lamont Johnson and Cleborne D. Maddux, eds., Technology in Education: A Twenty-Year Retrospective (Binghamton, N.Y.: Haworth Press, 2003) and Elisabeth Van Meer, “PLATO: From Computer-Based Education to Corporate Social Responsibility,” Iterations (November 5, 2003): 1–22. Note that Iterations is an online journal, published by the Charles Babbage Institute at the University of Minnesota, Minneapolis. Finally, on IBM’s major offering of the 1980s in this field, see John Henry Martin and Ardy Friedberg, Writing to Read: A Parent’s Guide to the New, Early Learning Program for Young Children (New York: Warner Books, 1986).
The availability of microcomputers triggered a flood of studies. Among the earliest were Lee M. Joiner, George J. Vensel, Jay D. Ross, and Burton J. Silverstein, Microcomputers in Education: A Nontechnical Guide to Instructional and School Management Applications (Holmes Beach, Fla.: Learning Publications, 1982); David Hawkridge, New Information Technology in Education (Baltimore: Johns Hopkins University Press, 1983); Gail A. Caissy, Microcomputers and the Classroom Teacher (Bloomington, Ind.: Phi Delta Kappa Educational Foundation, 1987); and Bobbie K. Hentrel, Computers in Education: A Guide for Educators (Ann Arbor: University of Michigan Press, 1985). For the general overview of problems in education that helped stimulate acquisition of much computing, there is the National Commission on Excellence in Education, A Nation at Risk: The Imperative for Educational Reform (Washington, D.C.: U.S. Government Printing Office, 1983). A large number of texts were published in the 1980s and 1990s, often going through multiple editions that reflect the changing uses of personal computers in this period. Good examples of this class of publications include Gary G. Bitter and Melissa E. Pierson, Using Technology in the Classroom (Boston: Allyn and Bacon, first edition 1984, with subsequent editions throughout the next two decades); Paul F. Merrill, working with various co-authors and publishers, also produced a widely consulted text, Computers in Education (Englewood Cliffs, N.J.: Prentice-Hall, 1986, and various subsequent editions). Newer texts also included much discussion of the role of the Internet. Examples include M. D. Roblyer, Jack Edwards, and Mary Anne Havriluk, Integrating Educational Technology into Teaching (Upper Saddle River, N.J.: Prentice-Hall, 1997); Randall James Ryder and Tom Hughes, Internet for Educators (Upper Saddle River, N.J.: Prentice Hall, 1997); James Lockard and Peter D. Abrams, Computers for Twenty-First Century Educators (New York: Longman, various editions, but the most useful that of 2001); Gilbert Valdez et al., Computer-Based Technology and Learning: Evolving Uses and Expectations (Oak Brook, Ill.: North Central Regional Educational Laboratory, May 2000); and Judy Lever-Duffy, Jean B. McDonald, and Al P. Mizell, Teaching and Learning with Technology (Boston: Pearson, 2003). Those are all texts intended to teach about the technology. For more in-depth analyses of the technologies of the 1990s forward and their implications for education, see David N. Perkins, Judah L. Schwartz, Mary Maxwell West, and Martha Stone Wiske, eds., Software Goes to School: Teaching for Understanding with New Technologies (New York: Oxford University Press, 1995); Barbara Means, William R. Penuel, and Christine Padilla, The Connected School: Technology and Learning in High School (San Francisco: Jossey-Bass, 2001); Cuban, Oversold and Underused; Janet Ward Schofield and Ann Locke Davidson, Bringing the Internet to School: Lessons from an Urban District (San Francisco: Jossey-Bass, 2002); and Gene I. Maeroff, A Classroom of One: How Online Learning Is Changing Our Schools and Colleges (New York: Palgrave, 2003), which is probably the best of the lot. Finally, on the role of video games in education, the key study is David Williamson Shaffer, How Computer Games Help Children Learn (New York: Palgrave, 2006). Many government studies have been published on the extent of deployment, national policies, and so forth; these are cited in the endnotes to chapter 8. However, of particular use on contemporary issues are Douglas Levin and Sousan Arafeh, The Digital Disconnect: The Widening Gap between Internet-Savvy Students and Their Schools (Washington, D.C.: Pew Internet & American Life Project, August 14, 2002); National Center for Educational Statistics, Young Children's Access to Computers in the Home and at School in 1999 and 2000, NCES 2003–036 (Washington, D.C.: U.S. Department of Education, March 2003), and also Computer and Internet Use by Children and Adolescents in 2001, NCES 2004–014 (Washington, D.C.: U.S. Department of Education, October 2003); and U.S. Department of Education, Toward a New Golden Age in American Education: How the Internet, the Law and Today's Students Are Revolutionizing Expectations (Washington, D.C.: U.S. Department of Education, 2003). Endnotes contain citations to many articles and more narrowly published monographs. However, consulting ERIC, the U.S. Department of Education's database, remains the best place to begin exploring that body of material. Most of the publications cited above also contain useful bibliographies and pointers to Web sites and other sources.
Computing in Higher Education

As with K–12, there is a large body of literature on higher education in the United States. For purposes of understanding the role of the digital hand in academia, one might begin by reading James J. Duderstadt and Farris W. Womack, The Future of the Public University in America: Beyond the Crossroads (Baltimore: Johns Hopkins University Press, 2003), because it provides historical background and a forward view. Next, examining the changing structure of this industry can be well informed by understanding the rise of for-profit schools, explained very clearly by Richard S. Ruch, Higher Ed, Inc.: The Rise of the For-Profit University (Baltimore: Johns Hopkins University Press, 2001). Then, for a detailed history of technology going back into the nineteenth century, consult Paul Saettler, The Evolution of American Educational Technology (Greenwich, Conn.: Information Age Publishing, 1990; reprinted 2004). Finally, to get a broad view of the challenges facing higher education because of technology, there is the important and thoughtful study by James J. Duderstadt, Daniel E. Atkins, and Douglas Van Houweling, Higher Education in the Digital Age: Technology Issues and Strategies for American Colleges and Universities (Westport, Conn.: American Council on Education and Praeger, 2002). On the use of computers in support of administrative operations, begin with Charles Mosmann, Academic Computers in Service (San Francisco: Jossey-Bass, 1973), which discusses events of the 1960s–1970s, but see also Martin D. Robbins, William S. Dorn, and John E. Skelton, Who Runs the Computer? Strategies for the Management of Computers in Higher Education (Boulder, Colo.: Westview, 1975). An early discussion of administrative systems was written by Charles R. Thomas and Dana S. van Hoesen, Administrative Information Systems: The 1985 Profile and Five-Year Trends (Boulder, Colo.: CAUSE, 1986). One of the most useful studies of computing in the 1980s is Kenneth C. Green and Steven W. Gilbert, eds., Making Computers Work for Administrators (San Francisco: Jossey-Bass, 1988). Two books discuss administrative applications and roles in the 1990s, with examples: Diana G. Oblinger and Richard N. Katz, eds., Renewing Administration: Preparing Colleges and Universities for the 21st Century (Bolton, Mass.: Anker Publishing, 1999) and Diana G. Oblinger and Sean C. Rush, eds., The Future Compatible Campus (Bolton, Mass.: Anker Publishing, 1998). On a wide variety of managerial and operational issues, with cases, see Sara Kiesler and Lee Sproull, Computing and Changes on Campus (Cambridge: Cambridge University Press, 1987) and the more "how to do it" A. W. Bates, Managing Technological Change: Strategies for College and University Leaders (San Francisco: Jossey-Bass, 2000). Two industry journals, both worth consulting, addressed issues of concern to administrators beginning in the 1980s: EDUCAUSE Quarterly and EDUCAUSE Review. On the early role of microcomputers, consult J. Victor Baldrige, Janine Woodward Roberts, and Terri A. Weiner, The Campus and the Microcomputer Revolution: Practical Advice for Nontechnical Decision Makers (New York: American Council on Education, 1984). For some recent new applications in administration, there is Philip J. Goldstein, Academic Analytics: The Uses of Management Information and Technology in Higher Education (Boulder, Colo.: EDUCAUSE, 2005). Using computers in teaching has also generated considerable attention.
For an outstanding introduction to the issues and technologies involved, which includes historical perspectives, begin by reading Gene I. Maeroff, A Classroom of One (New York: Palgrave Macmillan, 2003), and then follow up with an excellent "how to" book by Judith Haymore Sandholtz, Cathy Ringstaff, and David C. Dwyer, Teaching with Technology: Creating Student-Centered Classrooms (New York: Teachers College, Columbia University, 1997). Two very early studies of the issue are H. A. Lekan, Index to Computer Assisted Instruction (New York: Harcourt Brace Jovanovich, 1971) and W. H. Holtzman, ed., Computer-Assisted Instruction, Testing, and Guidance (New York: Harper and Row, 1970). A report prepared for the Carnegie Commission on Higher Education with much useful information is John F. Rockart and Michael S. Scott Morton, Computers and the Learning Process in Higher Education (New York: McGraw-Hill, 1975). At the same time, many examples of use were documented; see the useful collection by Shelley A. Harrison and Lawrence M. Stolurow, eds., Improving Instructional Productivity in Higher Education (Englewood Cliffs, N.J.: Educational Technology Publications, 1975). On activities in the 1980s, see Barry Heermann, Teaching and Learning with Computers: A Guide for College Faculty and Administrators (San Francisco: Jossey-Bass, 1988), and, to complement this work with analysis of trends, there is R. Lewis and E. D. Tagg, eds., Trends in Computer Assisted Education (Oxford: Blackwell Scientific Publications, 1988). For case studies, see Les Lloyd, ed., Technology and Teaching (Medford, N.J.: Information Today, 1997), and for a detailed study set in the period of the Internet, Anne B. Keating and Joseph Hargitai, The Wired Professor: A Guide to Incorporating the World Wide Web in College Instruction (New York: New York University Press, 1999). Finally, there is David W. Cheney, The Application and Implications of Information Technologies in Postsecondary Distance Education: An Initial Bibliography (Washington, D.C.: National Science Foundation, November 2002). Research in higher education is an idiosyncratic subject that has spun off, on the one hand, tens of thousands of publications on research that relied on computing, yet, on the other hand, almost no historical monographs that synthesize the history of this subject, leaving a massive lacuna in the history of modern academic life in America. One of the earliest studies with some historical perspective is Bert F. Green, Jr., Digital Computers in Research: An Introduction for Behavioral and Social Scientists (New York: McGraw-Hill, 1963). Over the next twenty years, others published similar books covering almost all physical science and medical fields. For all that literature, see an annotated bibliography by Jeffrey R. Yost, A Bibliographic Guide to Resources in Scientific Computing, 1945–1975 (Westport, Conn.: Greenwood Press, 2002), which has the added bonus of also listing archival materials. The historians have practitioner guides as well; the best is by Dennis A. Trinkle, ed., Writing, Teaching, and Researching History in the Electronic Age (Armonk, N.Y.: M. E. Sharpe, 1998). Trinkle has published many guides for historians focused on sources available over the Internet as well. A study of hardware and software in science is a collection of papers, Stephen G. Nash, ed., A History of Scientific Computing (New York: ACM Press, 1990). The few histories of the National Science Foundation barely discuss computing, nor do the annual reports of the NSF.
However, for the early history of government support of computing and IT-related research, there are two excellent monographs by Kenneth Flamm: Targeting the Computer: Government Support and International Competition (Washington, D.C.: Brookings Institution Press, 1987) and Creating the Computer: Government, Industry, and High Technology (Washington, D.C.: Brookings Institution Press, 1988). Telecommunications is a crucial issue to understand. One of the earliest collections of papers examining this issue in higher education is Robert J. Seidel and Martin L. Rubin, eds., Computers and Communications: Implications for Education (New York: Academic Press, 1977); for a view from the late 1980s and early 1990s (essentially pre-Internet), see Becky S. Duning, Marvin J. Van Kekerix, and Leon M. Zaborowski, Reaching Learners through Telecommunications (San Francisco: Jossey-Bass, 1993). The Internet led many to discuss its role in higher education as well. One strand of the discussion concerns what to use it for and advocates its further adoption. For an easy read, see Matthew Serbin Pittinsky, ed., The Wired Tower: Perspectives on the Impact of the Internet on Higher Education (Upper Saddle River, N.J.: Financial Times/Prentice Hall, 2003). For an example of hyperbole about how the Internet will change everything, see Glen R. Jones, Cyberschools: An Education Renaissance (Englewood, Colo.: Jones Digital Century, 1996, 1997). Three studies produced
by EDUCAUSE advocating use of the Internet are more sober monographs aimed at administrative management in higher education: Richard N. Katz and Associates, Dancing with the Devil: Information Technology and the New Competition in Higher Education (San Francisco: Jossey-Bass, 1999); Richard N. Katz and Diana G. Oblinger, eds., The "E" Is for Everything: E-Commerce, E-Business, and E-Learning in the Future of Higher Education (San Francisco: Jossey-Bass, 2000); and Richard N. Katz and Associates, Web Portals and Higher Education (San Francisco: Jossey-Bass, 2002). The whole subject of distance learning has become a global conversation. I would begin by looking at Kevin Robins and Frank Webster, eds., The Virtual University? Knowledge, Markets, and Management (New York: Oxford University Press, 2002), then follow it up with another examination of trends by Parker Rossman, The Emerging Worldwide Electronic University: Information Age Global Higher Education (Westport, Conn.: Praeger, 1993), which reflects the 1980s and early 1990s. Because so many interesting projects influencing American higher education are taking place outside the United States, it is worthwhile to become familiar with those developments. Three publications are useful: first, Keith Harry, ed., Higher Education through Open and Distance Learning (London: Routledge, 1999), which has many case studies; second, Reza Hazemi and Stephen Hailes, eds., The Digital University: Building a Learning Community (London: Springer-Verlag, 2002); and third, aimed at management, John S. Daniel, Mega-Universities and Knowledge Media: Technology Strategies for Higher Education (London: Kogan Page, 1996). There are many contemporaneous publications dealing with computing in libraries but, as yet, no formal history. However, beginning in 1965 and continuing right down to the present, the Annual Review of Information Science and Technology has published very lengthy annual reviews of the literature dealing with computation and library practices, describing trends and providing bibliography. For any student of the history of computing in libraries, this is the essential source, cumulatively running to well over a thousand pages of references. For precomputer IT applications, see Ralph H. Parker, Library Applications of Punched Cards: A Description of Mechanical Systems (Chicago: American Library Association, 1952) and Howard F. McGaw, Marginal Punched Cards in College and Research Libraries (Washington, D.C.: Scarecrow Press, 1952). For an early description of how to use computing in libraries, there is Louis A. Schultheiss, Don S. Culbertson, and Edward M. Heiliger, Advanced Data Processing in the University Library (New York: Scarecrow Press, 1962); next, Edward M. Heiliger and Paul B. Henderson, Jr., Library Automation: Experience, Methodology, and Technology of the Library as an Information System (New York: McGraw-Hill, 1971); and for the 1970s, consult Stanley J. Swihart and Beryl F. Hefley, Computer Systems in the Library: A Handbook for Managers and Designers (Los Angeles: Melville Publishing Co., 1973). Although heavily weighted toward Britain's experience, there is also L. A. Tedd, An Introduction to Computer-Based Library Systems (London: Heyden, 1977). Then, moving into the 1980s, see the excellent collection of papers by Richard De Gennaro, Libraries, Technology, and the Information Marketplace: Selected Papers (Boston: G. K. Hall, 1987). Perhaps the best description of core applications for libraries can be found in William Saffady, Introduction to Automation for Librarians (Chicago: American Library Association, 1989). He provides an excellent bibliography, and his explanations are so clear that one should almost begin with this book. For a sense of activities in the early 1980s, there is also Linda C. Smith, New Information Technologies—New Opportunities (Urbana-Champaign: Graduate School of Library and Information Science, University of Illinois, 1982); Hendrik Edelman, Libraries and Information Science in the Electronic Age (Philadelphia: ISI Press, 1986); and, on the role of PCs, Peter Hernon and Charles R. McClure, Microcomputers for Library Decision Making: Issues, Trends, and Applications (Norwood, N.J.: Ablex Publishing, 1986). For the period of the 1990s and beyond, much of the literature describes use of the newer technologies, such as the Internet with browsers. However, one of the most thoughtful descriptions of the modern library, with discussion of technologies, is by John M. Budd, The Changing Academic Library: Operations, Culture, Environments (Chicago: Association of College and Research Libraries, 2005). To understand how librarians are dealing with digital collections, two useful texts are Arlene G. Taylor, The Organization of Information (Westport, Conn.: Libraries Unlimited, 2004) and, with an excellent bibliography, Ian H. Witten and David Bainbridge, How to Build a Digital Library (Amsterdam: Morgan Kaufmann Publishers, 2003). Finally, key sources on inventories of computers exist, particularly for the period prior to 1980, when it was easier to track mainframes than it was later. The key sources for the 1960s and 1970s are the four major studies prepared by John W. Hamblen, most of them cited in the endnotes to chapter 9 of The Digital Hand. However, his findings and interpretations of the data can all be found in an edited volume, John W. Hamblen and Carolyn P. Landis, eds., The Fourth Inventory of Computers in Higher Education: An Interpretive Report (Boulder, Colo.: Westview Press for EDUCOM, 1980). The Annual Review of Information Science and Technology carries citations of most of the surveys done on higher education, from mainframes, to PCs, to other uses of IT.
INDEX

Page numbers in italics indicate figures or pictures; those in bold indicate tables.

Abrams, Peter D., 272
academic analytics, 295
academic research
   ARPANET, 320
   computers and, 303–310, 327–329, 351–352, 429n90
   telecommunications and, 324
   University of Wisconsin, 323
Access America: Reengineering through Information Technology, 202
accounting
   California computer usage, 216
   CFO Act, 199
   chart of accounts, 408n25
   computer crime, 133
   corrections department, 130–131
   and data mining, 198
   DoD non-combat technical applications, 89–95
   e-filing, 27
   federal government computer usage, 185
   government systems, 346, 358
   higher education and administration, 289
   IT use and, 17–18, 98
   library management, 315
   local government, 35–36
   military occupational specialty (MOS), 91
   municipalities' IT deployment, 238–240, 414n114
   public education and computers, 255
   public sector IT deployment, 338
   punched-card records and law enforcement, 106
   QuickBooks, 42
   Social Security Online Accounting and Reporting Systems (SSOARS), 156–157
   tax preparation software, 40–41
   USPS, 170, 175
acquisitions, 293
adding machines/calculators
   academic research, 303
   adoption by government, 32
   corrections facilities, 131
   higher education and administration, 288
   municipalities usage, 236
   state payroll, 215
   state tax collection, 29–30
   tax preparation equipment, 41–42
   use of, 4
administration (education)
   computers and (1950s–2000), 253–258
   higher education, computer usage, 287–296, 328
   K–12 computer use, late 1980s, 256
   processes and telecommunications (1950s–2000), 288
   telecommunications and, 331
   universities and computers, 284
ADP (automatic data processing), 4, 187, 190
adult education, 260
Advanced Records System, 148
Advanced Research Projects Agency (ARPA). See DARPA
AEC (Atomic Energy Commission), 56, 187
aerodynamics, 192
Afghanistan, 85–86, 88–89
AFIS (automated fingerprint identification systems), 114, 385n44
Agricultural Research Service, 203
Air Force
   Air Material Command, 63
Air Force (continued)
   Atlas missile, 60
   ballistic missile development, 59–60
   computer usage, 64
   F-15 jet problems, 59
   IBM 3032, 93
   IBM 705s inventory control, 63
   integrated circuit research, 54
   IT deployment, 95
   laser-guided bombs, 69
   R&D weapons systems, 58
   RAND Corporation, 55
   SAGE Air Defense System, 81
   software applications, 71
   Univac 90, 54
   war-gaming, 77
   workforce, 91
Air Material Command, 63
air pollution, 192, 230
air traffic control, 186–187, 208, 406n81
airborne warning and control systems (AWACS), 86
Airborne Weather Reconnaissance Systems (AWARS), 83
ALA (American Library Association), 311
Alaska, 212, 223
Allen, Paul, 242
alternative schools, 279
alumni relations, 290
America Online (AOL), 175
American Bar Association (ABA), 134
American Cartographer, 415–416n126
American City, 236–237, 239, 247, 414n114
American Council on Education, 302
American economy
   Bureau of the Census handbook, 335
   computer deployment, 190
   education and computers, 281–282, 329
   federal government computer usage, 184–185
   federal presence in, 6–8
   government and, 211
   higher education and computers, 285–287
   implementation of IT, 350
   Internet impact on, 166
   IT budget increases, 224
   law enforcement ecosystem, 102
   productivity of, 3
   public education and, 267–268
   public sector and, 334–335, 339, 351–355, 435n22
   recession, 227
   regulation of IT, 356
   and USPS, 165, 166
American Library Association (ALA), 311
America's Army, 93–94, 380n171
amnesty programs, 32
analytic techniques, 295
Anderson, Margo J., 158, 396n66
Animal and Plant Health Inspection Service, 203
Ann Arbor, Michigan, 354
antitrust suits, 126, 356, 364n3, 388n95, 389n110
AOL (America Online), 175
Appalachia, 262
Apple Computer, 247, 257, 263–264, 282, 321
Armer, Paul, 187
Armour Research Foundation, 160
Armstrong, William R., 394n29
Army
   IBM 705s, 90
   ORDVAC firing tables, 68
   R&D weapons systems, 58
   radio-controlled bombs, 69
   Signal Supply Agency, 63
   Vietnam War communications, 82
   war-gaming, 77
ARPA (Advanced Research Projects Agency). See DARPA
ARPANET, 320
artificial intelligence, 53, 123–124, 305
Aspin, Les, 84
assessments, 39–40, 370n74
astronomy, 305
AT&T
   911 emergency calls, 116
   corrections facilities, 132
   dial-up access, 191
   DoD command and control, 81
   Federal Telecommunications Systems, 196–197
   SSA infrastructure, 147–148
Atari, 264
Atlanta, 171, 255
Atlas missile, 54, 59–60
ATMs, 339
Atomic Energy Commission (AEC), 56, 187
Attorney General (U.S.), 123
audit trails
   Chief Financial Officers Act, 92–93
   computer crime, 133
   DoD, 98
   municipalities' IT deployment, 240
   USPS, 173
   voting, 228
Austrian, Geoffrey D., 396n66
Automated Financial System, 23
automated fingerprint identification systems (AFIS), 114, 385n44
automatic data processing (ADP), 4, 187, 190
Automatic Digital Network (AUTODIN), 83
avionics
   ballistic missiles, 58
   F-15 fighter jet, 59
   military funding for R&D, 54, 60
   technological advancement, 8, 99, 340, 346
AWACS (airborne warning and control systems), 86
B-1B Bomber, 72
Back-Up Interceptor Control (BUIC), 83
Bailar, Benjamin F., 172
Ballistic Missile Early Warning System (BMEWS), 83
ballistic missiles, 58–59, 83
Ballistics Research Laboratory (BRL), 54, 67
Baltimore, Maryland, 131, 135, 143
bandwidth, 88, 95
banking industry, 339, 355–356
bankruptcy, 123, 130
bar coding
   availability of, 65
   census questionnaires, 163
   tracking packages, 175
   and USPS, 169, 172–173, 179–180
batch mode
   detectives and, 112
   higher education and administration, 288–290
   NCIC files, 120
   query systems, 109
   Social Security Statement, 157
   SSA records, 145, 150
Bell Labs, 54
Bell Telephone, 60
Bendix computers, 215, 327
"best practices," 15
bibliographic systems, 315–317
"Big Science," 307–309
billing machines, 236–237
bioinformatics, 305
biological warfare, 57
biometrics, 115
biotechnology, 354
birth certificates, 223
BITNET, 325, 432n153
Bitzer, Donald, 261
"black" aircraft, 60
blackboards, 259, 269
blue boxes, 135
Blue Cross/Blue Shield, 32
bombing reconnaissance, 81
book ordering, 311
bookings, 131
boomer generation, 12, 210, 359–360, 406n87
Boston, 108, 237
Boyd, Robert S., 380n171
Brady Act, 123
branching programs, 260
broadband, 309
brokerage services, 337, 339, 355
Brookfield, Illinois, 238
Brooks, Frederick P., Jr., 367n38
Brooks Act, 189, 194, 197
Brown, Tim, 16
browsers, 299, 325
budgets
   county government, 235
   higher education, computer usage, 287, 289–290, 292, 329, 433n162
   higher education, cuts in, 295
   IT and higher education, 294, 296
   library management, 312–313
   municipalities, 246
   public education and computers, 273, 282
   public sector and IT, 345, 348
   teaching in higher education, 297
building permits, 249
Bureau of Finance and Administration, 171
Bureau of Justice Statistics, 157
Bureau of Labor Statistics, 157
Bureau of the Budget, 187–188
Bureau of the Census
   academic research, 308
   automation of field operations, 162
   computer deployment, 180–182, 340, 342, 347
   computer development, 157–164
   FOSDIC records, 159, 160–161
   GIS databases, 39–40, 402n21
   handbook on American economy, 335
   history of, 396n66
   and IBM computers, 159, 396n68
   IT deployment, 208, 334
   key digital census applications (2001), 163, 164
   magnetic tape record storage, 158–159
   optical scanning, 173
   population count, 186
   precomputer era, 141–142
   public sector IT deployment, 338
   TIGER files, 162, 243
Burroughs Corporation
   Atlas missile, 60
Burroughs Corporation (continued)
   county government precomputer equipment, 230
   DoD non-combat technical applications, 89
   higher education, computer usage, 328
   local government IT, 39
   Massachusetts computer system, 215
   military funding for R&D, 54
   tax preparation equipment, 41
   USPS mechanization, 168, 397–398n95
Bush, George W., 228
Bush, Vannevar, 54, 67, 303
Bush (George H. W.) administration, 272
Bush (George W.) administration, 27, 89, 199, 207, 355
business administration, 308–309
Business Systems Modernization Program, 27, 29
business tax returns, 19, 26
business to government (B2G), 247
Butcher, Joseph, 412n72
C4ISR (command, control, communications, computers, intelligence, surveillance, and reconnaissance), 87
cable TV, 325, 357
CAD/CAM, 241, 408n18
CAI (computer-aided instruction), 72, 74, 75, 297
California
   computers' impact, 345
   Department of Employment, 215–216
   digital fingerprint systems, 115, 138
   public education (1970s), 255
   welfare system, failure, 223, 362
Camp Pendleton, 91
Campbell, Joseph, 186
Campbell-Kelly, Martin, 408n18
Canada, 243
capitalism, 363
Cap'n Crunch, 135
car registrations, 225
card catalogs, 314, 316
CARMONETTE I, 77
Carnegie Forum on Education and the Economy, 266
Carnegie-Mellon University, 55, 318
Carrizales, Tony, 406n83
Case Processing and Management System (CPMS), 157
case-management techniques, 226
cataloguing systems, 311, 313, 315, 318, 319
Caudle, Sharon L., 220
CCH (Computerized Criminal History), 120–123, 122
CDC (Control Data Corporation), 328
CD-ROMs, 162, 275, 317, 322
cell phones, 85, 175, 180, 280, 324
Census Bureau. See Bureau of the Census
Center for Digital Government, 227
Centers for Disease Control and Prevention, 157
Central Intelligence Agency (CIA), 20, 185
centralized computing, 31
CFO Act, 199
Chamblee, Georgia, 19–20
Champy, James, 359
Chapel Hill, 160
Charles Babbage Institute, 401n3, 407n12, 429n102
chat rooms, 228, 298
checking accounts, 337
Chicago
   computer crime, 135
   data centers, 36
   online teaching, 261
   public education computer use (1970s), 255
   punched-card records and law enforcement, 106
   query systems, 108–109
   USPS mainframe installation, 171
Chief Financial Officers Act, 92–93
chief information officers (CIOs), 199, 234
Child Enforcement Division, 33
child pornography, 137
China, 356
Chronicle of Higher Education, 423–424n8
CIA (Central Intelligence Agency), 20, 185
Cincinnati, 171
CIOs (chief information officers), 199, 234
circulation, 312, 315
Civil Rights Movement, 263
civil servants, 362
civilian contractors, 93
class rolls, 255
class scheduling, 255
classroom rosters, 290
Cleveland, Ohio, 262
Click-N-Ship, 180
clinical trials, 305
Clinton administration
   CFO Act, 199
   e-commerce, 35
   e-government, 201–202, 339
   E-Government Act of 2002, 27
   federal government and the Internet, 200, 207
   federal government computer usage, 185, 189, 362
   Goals 2000: Educate America Act, 273
   Internal Revenue Service, 28–29, 366n5
   Internet and government operating costs, 33
   paper reduction campaign, 26–27, 194
   Paperwork Reduction Act, 202
   public education and computers, 272–273
   Re-Inventing Government, 183, 198, 200, 350, 358
   school Internet connectivity, 11
   Social Security Administration, 155, 157
   supercomputing, 310
   tax collection during, 22
   taxes and expenditures, 7–8
Cobra II, 379–380n155, 381n188
Cody, Joseph J., Jr., 83
Cold War
   collapse of Soviet Union, 195
   and computers, 86, 100, 304, 337
   DoD budget, 51, 98
   federal government and, 4, 6
   increase in public services, 344
   military use of computers, 52–53
   National Security Agency (NSA), 185
   R&D funding, 352
Colorado, 129
Colorado State University, 302
Colton, Kent W., 106, 137–138
Columbia University, 303
COMAR-1, 81
combat theater, 80–86, 90
command, control, communication, 67, 81–86
command, control, communications, computers, intelligence, surveillance, and reconnaissance (C4ISR), 87
command and control systems. See also U.S. Department of Defense
   "electronic battlefield," 76
   higher education, computer deployment, 330
   Information Age Warfare, 86–87
   Kennedy administration, 376n79
   military and IT deployment, 340
   Navy, 69
   R&D, 53
   SAGE Air Defense System, 81, 345–346
   Strategic Computing Program, 61
Commonwealth of Puerto Rico, 212
communications, 80–86, 107–108
community colleges, 287, 299, 331, 353–354, 362
community practice, 311, 430n105
community-based policing, 118–119, 386n65
comptroller activities, 90
Compuserve, 42, 177
computer crime, 133–137, 139, 349, 391n140
"Computer in the School Room: What Revolution?," 266
Computer Investigations and Infrastructure Threat Assessment Center (CITAC), 136
computer labs, 264–265, 270, 275, 322
computer literacy movement, 262–263
computer modeling, 192, 193
computer-aided dispatching, 111–112, 116, 120
computer-aided instruction (CAI), 72, 74, 75, 297
computer-assisted legal research system (CALR), 129
Computerized Criminal History (CCH), 120–123, 122
computers. See also academic research; IT
   Brooks Act, 189
   Bureau of the Census, development of, 157–164, 396n66, 396n68
   congressional funding for, 181, 351
   Diebold Group survey, 407n12
   federal government usage, 184
   Gartner survey, 214
   GSA inventory, 197
   higher education and, 354
   intelligence gathering, 199
   law enforcement and, 104, 108
   library management, 315–316
   military funding for R&D, 374n35
   North Central Regional Education Laboratory, 274–275
   numbers of in federal government (1950–1979), 188
   and police work, 106
   productivity, 15, 365–366n20
   role in public education, 267–272
   SSA deployment (1950s–1960s), 144–148
   state legislatures and voting, 221
   student-to-computer ratio, 265, 277
   teacher usage (1960s–1980s), 258–267
   technological advancement, 8
   usage surveys, 186–187, 401n3
   USPS and, 167, 171–172
Congress
   Bureau of the Census supplemental budget, 162
   Campbell computing survey, 186
   census and, 140
   e-filing legislation, 47
   IT deployment, 206–207, 223
   regulation of IT, 356
Congress (continued)
   SSA benefits legislation, 146, 149
   tax code, 23
   technology funding, 181, 351
consumer price index (CPI), 104
Control Data Corporation (CDC), 261–262, 280
Cook County, 115
Cookie Monster, 135
Cooperative State Research Education and Extension Service, 203
copyrights, 356
corporations, 18, 20, 46–47, 344
corrections facilities, 104, 130–133, 336, 390n23
Cortada, James W., 408n25
cost considerations, 347–348
cost-of-living adjustments, 149–150
county government, 230–235
course catalogs, 294
court system
   antitrust suits, 356
   bankruptcy filings, 123
   and computers, 123–130, 127
   and the Internet, 129
   PACER, 128
   state IT deployment, 224, 341
   stenograph machine, 125, 387n89
CPA Journal, 43
CPI (Consumer Price Index), 104
credit cards
   and e-commerce, 338–339
   higher education, computer usage, 295
   motor vehicle licenses, 127
   tax return payment, 26, 34
   and USPS, 177
crime. See also computer crime
   child pornography, 137
   costs of, 102
   digital crime maps, 113, 118, 384n37
   FBI files, 120–123, 122
   growth of, 105, 111
   Universal Crime Report (UCR), 117
criminal background searches, 123
criminal justice ecosystem, 103–105, 111
Criscito, Pat, 433n161
critical path analysis, 193
CRTs, 255
cryptography, 54, 123
Cuban, Larry, 252, 268, 276, 419n22
customs fees, 18
cyberbullying, 137
cybercrime, 136–137, 349
cybernetics, 305
cyberstalking, 137
cyberwar, 87
Dallas, 171
DARPA (Defense Advanced Research Projects Agency)
   academic research, funding, 306
   CAI development, 74
   digital military research (1999–2005), 57
   established after Sputnik, 55
   Internet development, 94, 207
   military communication, 82
   R&D, 53, 99, 351–352
   Strategic Computing Program, 61
data centers
   academic research, 309–310
   higher education, 291, 327–328, 429n90
   Los Angeles County, 230
   municipalities, 36
   school districts computer usage, 257
   state government, 214, 219–221
data confidentiality, 161, 164
Data Communications Network (DCN), 128–129
Data General, 39
data mining
   accounting systems and, 198
   emergence of, 197–198
   IRS, 17, 27
   NCIC files, 121, 123
   tax evasion, 32–33
data processing (DP), 31, 188, 315
data security, 203, 235
databases. See also GIS
   academic use of PCs, 321–322
   FBI uses, 347
   federal government, 195
   NIBRS, 117–118
   NICS, 123
   public sector IT deployment, 338
   standardization of information, 343
Datamatic 1000, 230
DCN (Data Communications Network), 128–129
de Jager, Peter, 247
decentralization, 153, 192, 204, 208
Defense Advanced Research Projects Agency. See DARPA
defense budget, 51
Defense Logistics Agency (DLA), 62–63, 65–66
Defense Supply Agency, 63
deficits, 4
Delaware, 129
delivery confirmation, 179
delivery point sequencing (DPS), 179
Deloitte, 41
democracy, 224–225, 227–228
Democratic Party, 229
demographics, 243, 263
denial-of-service, 137
Department of Agriculture, 195, 203
Department of Commerce, 97, 189, 195
Department of Defense. See DoD
Department of Education, 273, 279, 355
Department of Employment, 215–216
Department of Energy, 97, 192, 195, 197, 200
Department of Health and Human Services, 97, 195, 200
Department of Homeland Security (DHS). See DHS
Department of Interior, 195
Department of Justice
   computer crime, 136–137
   crime costs, 102
   digital fingerprint systems, 115
   digital initiatives, 123
   electronic databases, 195
   law enforcement and computers, 106–107, 347
   LEAA (Law Enforcement Assistance Administration), 122–123
   Western Identification Network (WIN), 385n44
Department of Labor, 195
Department of Revenue, 216
Department of State, 200
Department of Transportation, 195, 200
Department of Treasury, 16, 103, 197, 200
detectives, 112
DHS (Department of Homeland Security)
   creation of, 337, 343
   e-government initiatives, 208–209
   intelligence gathering, 199
   IT deployment, 227
   IT expenditures, 200
   9/11 Commission, 205–206
dial-up access, 191, 203, 325
Dickinson, William, 84
Diebold Group, 407n12
differential analyzer, 67
Digital, 39
digital crime maps, 113, 118, 384n37
Digital Equipment, 197, 328
digital libraries, 317–319
digital mapping, 39–40, 181
digital PBX systems, 116
digital signatures, 129
digital TV, 356
digital voice communications, 116
Digital War, 76
Disability Insurance program, 152–153, 395n51
disk storage, 160–161, 255
distance learning
   guides to, 433n161
   higher education and, 296, 325–326, 332
   rise of, 279, 299–302, 427n67
   software applications, 271–272
distributed processing, 145, 222–223, 293–294
DLA (Defense Logistics Agency), 62–63, 65–66
doctors, 346
DoD (U.S. Department of Defense). See also Cold War; DARPA; military-industrial complex
   academic research, funding, 306–307
   accounting, 89–95
   audit trails, 92–93, 98
   computer and telecommunication usage, 49–50, 185, 190
   defense budget, 51
   digital libraries, 318
   force multiplier, 99–100
   GSA computer inventory, 197
   Internet usage, 205
   IT budget, 200, 209
   IT deployment, 95–96, 97, 334, 342
   Joint Vision 2020, 88
   legacy systems, 84
   logistics systems, 186, 347, 374n41
   mismanagement of, 349–350
   organization of, 50–52
   productivity and IT, 345–346
   R&D, 52–62, 351–352
   supercomputers, 192
   system incompatibility, 329
   training programs, 72–80
Dolan, Michael P., 22
Doom (game), 80
downsizing, 359
DPS (delivery point sequencing), 179
Draper, John, 135
drill-and-practice exercises, 264–266, 275, 297
driver's license, 214
drones, 85
Duesterberg, Thomas J., 175
Duncan, Joseph W., 396n66
dunning, 21
DVDs, 322
early adopters, 311
eBay, 356
e-business, 249
Eckert, John Presper, 158
e-commerce. See also online purchases
   Clinton administration, 35
   federal Web sites, 206
   higher education, 294, 296
   regulation of, 357
   state government and, 225, 338–339
   and USPS, 178
economic development, 15
economics, 308–309, 429n99
economies of scale, 287
e-democracy, 224–225, 228
EDI, 177
EDP (electronic data processing), 107, 245–246
Education Resource Information Center (ERIC), 418n5
EDUCAUSE, 295, 423–424n8
EDVAC (Electronic Discrete Variable Automatic Calculator), 67, 304
Edwards, Paul N.
   Cold War and computers, 86
   command and control systems, 376n79
   military use of computers, 52–53, 337
   R&D weapons systems, 59
   techno-hubris, 101
e-filing
   accountants, 27
   availability of, 22–24
   Bureau of the Census, 164
   congressional legislation, 47
   corporation tax returns, 344
   costs of, 34, 369n61
   1140, 46
   Integrated Collection System (ICS), 22–23
   optical scanning, 168
   tax preparation software, 41
   tax preparers, 26
   tax returns, 25, 205
   Telefile, 26, 45
   usage, 45, 339
   Web sites, 344–345
e-government
   Clinton administration, 201–202
   county government, 234–235
   E-Government Act of 2002, 27
   "E-Government: Recent Publications," 406n83
   federal government, 205
   Internet, 33–35, 247, 369n65
   IT deployment initiatives, 208, 406n83
   municipalities and, 250
   state government initiatives, 227
800 (toll-free) telephone numbers, 153–154, 179
Eisenhower, Dwight D., 49
e-learning. See distance learning
"electronic battlefield," 56–57, 76–77, 375n67
Electronic Computer-Originated Mail (E-COM), 174–175
electronic data exchange (EDS), 175
electronic data processing (EDP), 107, 245–246
electronic deposit, 153–154
Electronic Disability (eDib) process, 156
Electronic Discrete Variable Automatic Calculator (EDVAC), 67, 304
electronic funds transfer (EFT), 33–34, 45
electronic libraries, 317–319
electronic monitor, 132
Electronic Numerical Integrator and Computer (ENIAC), 53–54, 67, 145, 304
Electronic Postmarks (EPM), 178
Electronic Return Filing System, 23
electronic signatures, 157
elementary schools, 264
e-mail
   academic use, 284–285, 322, 324–326
   Air Force, 81
   BITNET, 432n153
   community-based policing, 119
   county government, 233
   court system, 128–129
   and distance learning, 300
   Electronic Computer-Originated Mail (ECOM), 175
   federal employees, 203, 204
   federal government access, 194–195
   Federal Telecommunications System (FTS), 197
   and government, 34
   higher education, 291–292, 294
   Iraq War soldiers' usage of, 210
   law enforcement communications, 116
   medical community, 346
   military communication, 85, 94–95
   municipalities and, 247
   SSA employees, 154–155
   state government and, 229
   state Internet applications, 225
   tax returns, 41
   teacher usage, 269
   use in education, 281
   and USPS, 175–177
   White House, 197
embezzlement, 137
employees. See workforce
Energy Research and Development Administration (ERDA), 190
engineering applications, 216
ENIAC (Electronic Numerical Integrator and Computer), 53–54, 67, 145, 304
Enterprise Resource Planning (ERP), 54, 294, 296, 351
enumerators, 160, 162
environmental management, 244
EPA (Environmental Protection Agency), 56, 97
ePost, 177
ERIC (Education Resource Information Center), 418n5
ERP (Enterprise Resource Planning), 54, 294, 296, 351
European Union, 198
Evanston, Illinois, 254–255
Excel, 168, 298
Express Mail, 175
F-15 fighter jet, 59, 61–62
F-18 fighter jet, 61–62
FAA (Federal Aviation Agency), 187, 208, 345, 406n81
facial digitizing, 115
faculty
   academic research, 306
   distance learning, 325
   higher education, computer usage, 299, 331
   Internet use, 326
   PC use in higher education, 322
FADAC (Field Artillery Digital Automatic Computer), 68
Falklands War, 69
false alarms, 88
fat portals, 205
fax machines, 94, 173–176
FB-111 aircraft, 71–72
FBI (Federal Bureau of Investigation)
   CCH files, 120–123
   computer crime, 135–136
   Computer Investigations and Infrastructure Threat Assessment Center (CITAC), 136
   and DHS, 337
   federal law enforcement, 120
   fingerprints database, 114–115
   Internet Fraud Council, 136
   IT deployment, 191, 334, 340, 347
   law enforcement ecosystem, 103
   National Infrastructure Protection Center (NIPC), 136
   NCIC, 110–111, 119
   NIBRS development, 118
   query systems, 108
   virtual case project, 208
   and Watergate, 20
FCC (Federal Communications Commission), 247, 343, 356
Federal Aviation Agency (FAA), 187, 208, 345, 406n81
Federal Bureau of Investigation. See FBI
Federal Communications Commission (FCC), 247, 343, 356
federal courts, 128
Federal Express, 164
federal government. See also DARPA; individual agencies
   academic research, funding, 303–310
   Bureau of the Census, 157–164
   CFO Act, 199
   computer modeling, 192, 193
   computer technology budget, 185
   computer usage, 184–185
   county government statistics, 232
   deficits, 349–350
   DoD budget, 51
   e-commerce, 206
   education and computers, survey, 262
   e-mail access, 194–195, 204
   ERP software, 351
   expenditures 1950–2003, 7, 8
   FFRDC, 55, 56
   FirstGov.gov, 206
   GAO audits on IT, 341, 349, 358
   higher education, computer usage, 295, 327
   higher education, R&D expenditures (1953–2003), 308
   IBM applications offerings, 196
   increase in public services, 344
   institutional culture and IT deployment, 342–347
   and the Internet, 200–207, 356–357
   IT budget, 209
   IT deployment, 186–200, 206–210, 249–250, 272–273, 342
   IT expenditures, 191, 195–196
   IT use and, 334
   and law enforcement, 104, 110, 120–124
   libraries and IT deployment, 311
   mobile computing funding, 116
   Nation at Risk, 263
   number of computers (1950–1979), 188
   Paperwork Reduction Act, 194
   performance-based management, 198
   productivity and IT, 343–344
   public education and computers, 259
   public education standards, 267
   retirement of workforce, 359, 360, 436n31
federal government (continued)
   state data collection, 218–219
   university-industry technology transfer, 354–355
   workforce, 6, 364–365n7
   workforce per workstation, 246
   Y2K scare, 198, 403n40
Federal Reserve Board, 343, 355
Federal Telecommunications System (FTS), 196–197
Felt & Tarrant, 41, 89
Ferris State University, 330
FFRDCs (Federally Funded Research and Development Centers), 55, 56
Field Artillery Digital Automatic Computer (FADAC), 68
Field Artillery Fire Control System, 68
Film Optical Sensing Devices for Input to Computers (FOSDIC), 159, 160–161
film strips, 259, 269
financial aid, 290
fingerprints
   AFIS (automated fingerprint identification systems), 114, 385n44
   CCH files, 120
   computers and, 108
   data processing and, 105
   digitized criminal records, 123
   evolution of, 113–114
   precomputer era, 384–385n39
   Western Identification Network (WIN), 385n44
fire departments, 244
fire engines, 116
firing tables, 67–69, 71
First Class Mail service, 164, 178, 180–182, 399–400n129, 400–401n145
FirstGov.gov, 206
fishing licenses, 214, 225
Fite, Harry H., 31
Flamm, Kenneth, 351–352, 365n11
flight simulators, 61–62
Florida, 33, 136, 216, 228–229, 241–242
force multiplier, 99–100
Forest Service, 203
Form 940 (Employers' Annual Federal Unemployment Tax Returns), 20
Form 1120 (U.S. Corporation Income Tax Returns), 20, 46
for-profit universities, 287, 296, 300–302, 330–331
Fortune 500, 335
FOSDIC (Film Optical Sensing Devices for Input to Computers), 159, 160–161
Fountain, Jane E., 405n61
France, 356
Franklin, Benjamin, 165
Franks, Tommy, 89
fraud, 137, 198
Friday, Joe, 110, 383n26
Friedman, Thomas L., 269, 279, 378–379n140, 421–422n64
Future Shock, 268–269, 421–422n64
Gallagher, James D., 186–187
GameBoy, 78
GAO (Government Accountability Office)
   audits of IT, 341, 349, 358
   Bureau of the Census report, 164
   computer acquisition, 189
   computer deployment reports, 190, 358
   cost justification for computers, 345
   FBI report, 138
   federal accounting practices, 403n44
   federal agency cooperation, 205–206
   federal employees and the Internet, 203–204
   implementation of computer systems reports, 349
   IRS audits, 47
   IT deployment initiatives, 208
   9/11 Commission report, 198
   performance-based management, 198
   "Public Sector: Early Stages of a Deep Transformation," 405n61
   SSA audits, 149, 152, 156, 394n29
   supply chains in Iraq War, 65–66
   Tax System Redesign Project, 20
   USPS report, 175–176
Gartner, Inc., 213, 214, 246, 369n65
Gates, Bill, 242
gating factors, 347
GDP (gross domestic product)
   DoD percentage of, 372n2
   federal taxes (1950–2003), 7
   higher education and computers, 285
   and law enforcement, 102, 104
   public sector (early twenty-first century), 4
   state and local government, 8–9
General Electric, 60, 328
General Motors, 100
General Post Office, 164
General Services Administration (GSA), 189, 196–198
genetics, 305, 309
Gentry, John A., 88
geocoding, 162
Geographical Information Systems (GIS). See GIS
Geological Survey (USGS), 162
Georgia, 129, 215
GI Bill, 4
GIS (Geographical Information Systems)
   American Cartographer, 415–416n126
   assessments, 39–40
   Bureau of the Census, 402n21
   county government, 233
   digital mapping, 39–40, 181
   law enforcement, 113
   municipalities' IT deployment, 242–245
   TIGER files, 162, 243
globalization, 350, 355
Goals 2000: Educate America Act, 273
Google, Inc., 319–320, 356, 432n135
Gordon, Michael R., 379–380n155, 381n188
Gore, Al, 200, 228, 349–350
Governing, 138
Government Accountability Office (GAO). See GAO
Government Printing Office (GPO), 193
government to citizens applications (G2C), 247
GPS technology, 69–70, 76, 85, 245
Grace Commission, 190
grade reporting, 255, 269, 281
graduation rates, 295
Grenada invasion, 84
graphics, 291, 297, 309, 321–322, 346
Great Depression, 142, 344
Green, Bert F., Jr., 304
Green, Kenneth C., 326–327
GSA (General Services Administration), 189, 196–198
H&R Block, 41–42
hackers, 77, 135, 206, 391n140
Hammer, Michael, 359
Hammer, Preston C., 429n90
Harries, Keith, 384n37
Harrison, Shelley A., 426–427n51
Harvard University, 16, 54, 271, 311, 319
Hawaii, 212, 225
health insurance, 151
healthcare, 231
Hebert, Scott, 138
Heermann, Barry, 427n53
Henry Classification, 113–114
Hernon, Peter, 402n29
Hewlett-Packard, 39
high-energy physics, 192
higher education. See also academic research; for-profit universities; Internet
   academic research and computers, 303–310, 338
   administration and computer deployment, 287–296
   administrative applications (1976), 291
   computer usage, 328, 362, 427n53
   computers and teaching, 296–302
   and distance learning, 296, 299–302, 325–326, 332, 427n67
   e-mail, 284–285, 322, 324–326
   funding for computers, 433n162
   growth in, 11–12
   IBM case studies, 426–427n51
   institutional culture and IT deployment, 336, 361
   IT deployment, 327–330, 332–333, 340–341
   library management and computers, 311–320
   music downloading, 322–323
   networking services, 325, 326–327
   number of institutions (1950–1999), 13
   personal use of computing, 320–324
   R&D expenditures (1953–2003), 307
   software applications (1993), 298, 299
   state IT deployment, 224
   tuition costs, 286–287
   universities and computer development, 284–287, 342
   university-industry technology transfer, 353–355
   value system, 423n6
   WISC computer system, 285
Hinsdale, Illinois, 262
Ho Chi Minh Trail, 76
Hollerith, Herman, 141
home schooling, 279
Honeywell, 39
house arrest, 132
humanities, 331
Hurricane Katrina, 207
Hurricane missile, 54
hydrogen bomb, 305
hydrography, 244
hypertext, 317
IAS Computer, 304
IBM
   antitrust suits, 126, 364n3, 388n95, 389n110
   billing system, 238
   and Bureau of the Census, 140–141, 159
   chip manufacturing, 359
   commercial computer business, 396n68
   computers and education, 321, 420n33
  county government precomputer equipment, 230
  courts and computers, 126, 127, 388n94
  DoD non-combat technical applications, 89
  and the federal government, 4, 364n3
  federal government applications chart, 195, 196
  GSA computer inventory, 197
  higher education, computer usage, 328, 426–427n51
  local government IT, 39
  military funding for R&D, 54
  mobile van teacher education, 262
  PC introduction, 215
  public education and microcomputers, 263–264
  SSA equipment, 142, 392n2
  STAIRS legal research, 130
  state and local uses of computers (mid-1980s), 212, 213
  survey of IT usage, 362
  tabulators, 215
  typewriters, 41
  U.S. Weather Bureau, 187
  Writing to Read, 265
IBM 1130, 262
IBM 1401, 90, 161, 216
IBM 1460, 289
IBM 3032, 93
IBM 3033, 121
IBM 305 RAMAC, 255
IBM 3/360, 121
IBM 402 Accounting Machine, 254–255
IBM 604 Electronic Calculating Machines, 145, 216, 230
IBM 650
  air traffic control, 187
  engineering applications, 216
  inventory control, 63
  Los Angeles, 230, 237
  MIT, 327
IBM 702, 146, 215–216, 393n17
IBM 704, 81, 327
IBM 705, 63, 90–91, 146
IBM 7074, 289
IBM 7090, 81
IBM S/360, 160, 231, 255, 289
IBM S/360 Model 40, 109
IBM S/360 Model 50, 131
IBM S/370, 172, 231, 255, 394n29
IBM tabulators, 228
IC (integrated circuit), 54, 72
ICBM missiles, 59–60
identity theft, 137, 156, 436n35
Illinois, 129, 215, 221, 223
Illinois Institute of Technology, 160
individual master file, 19, 20
inflation, 286–287
Information Age
  DoD R&D, 98
  and the Internet, 324
  public education and computers, 268
  R&D investment and, 52
  universities and computer development, 284, 333
Information Age Warfare, 57, 76, 86–89, 378–379n140
Information Economy, 268
Information Highway, 165, 183, 202
information operations, 87
information resource manager, 189
information retrieval systems, 317
Information Society, 333
Information Superhighway, 247, 272, 405n61
information technology (IT). See IT
Information Technology Management Reform Act, 197, 199
instant-messaging, 324
institutional culture, 336–342, 346
insurance industry, 226
integrated circuit (IC), 54, 72
Integrated Collection System, 22–23
Integrated Retail Terminal, 175
integrated systems, 31, 151
intelligence data, 198–199
intelligent bullets, 375n66
INTELPOST, 173–174
interlibrary loans, 314, 316
Internal Revenue Service (IRS). See IRS
Internet. See also e-commerce; online purchases
  access to, 180
  boomer generation use, 406n87
  citizen access to, 225
  classroom usage, 272
  Clinton administration schools initiative, 11
  community-based policing, 118–119
  computer crime, 136–137
  county government, 231–233
  court system, 128–129
  customer leveraging, 371n92
  DARPA development, 55, 94
  Democratic Party campaign contributions, 229
  digitized criminal records, 123
  and distance learning, 299–302
  DoD R&D, 49–50, 98
  e-filing, 24
  e-government, 33–34, 369n65
  failsafe, 82
  federal government and, 200–207
  First Class Mail service and, 179
  GIS databases, 243, 245
  higher education, computer usage, 288, 298, 324–327, 331
  identity theft, 436n35
  impact on American economy, 166
  Joint Vision 2020, 88
  K–12 usage, 34
  law enforcement communications, 116
  legal research, 130
  and librarians, 312
  and libraries, 317
  library management, 317–320
  license applications, 214
  municipalities and, 246–249
  music downloading, 322–323, 432n147
  National Institute of Justice and, 128
  online voting, 360
  Pew Foundation report, 399–400n129
  Pew Internet and American Life Project, 411n64
  prisoner files, 133
  public education and computers, 268, 281
  public schools with Internet access, 276, 277
  “Public Sector: Early Stages of a Deep Transformation,” 405n61
  regulation of, 356–357
  school districts computer usage, 257–258
  and the SSA, 153–154, 156
  state government usage, 224–229, 226
  tax form availability, 22–23
  teacher usage, 276
  and USPS, 164, 175–180
  West study, 209
Internet Fraud Council, 136
Internet-based technologies, 8, 365n11
intranets
  county government, 233
  federal government, 204–205, 210
  school districts, 258
  SSA employees, 154, 155
Intuit, Inc., 42, 44
inventory control
  computers and, 62–66
  IBM systems, 63
  library management, 311
  public education and computers, 255
  public sector IT deployment, 338
  R&D, 53
iPods, 180
Iraq War (Operation Desert Storm), 51, 65, 69–70, 84, 95
Iraq War (Operation Iraqi Freedom)
  airborne warning and control systems (AWACS), 86
  Cobra II, 379–380n155, 381n188
  “electronic battlefield,” 76
  Information Age Warfare, 378–379n140
  joint surveillance and target attack radar systems (JSTARS), 86
  logistics systems, 65
  military communication, 85
  soldiers’ Internet use, 210
  techno-hubris, 101
IRS (Internal Revenue Service)
  Automated Financial System, 23
  budget (1996 and 1997), 23, 24
  Clinton administration and, 28–29, 366n5
  computer upgrade, 345, 347
  computing technology adoption, 18–29, 342, 345
  DARPA development, 94
  e-filing campaign, 45–46
  Electronic Return Filing System, 23
  first computer installation, 19
  GAO audits, 47
  institutional culture and IT deployment, 337
  and the Internet, 207
  IRS Integration Services, 28
  IT budget, 25, 26, 367n33
  modernization projects, 20–23, 48
  punched-card records, 19, 366n7
  tax collection problems, 349–350
  Taylor as Director, 4
  workforce, 18, 29
IT (information technology). See also Internet
  academic libraries, 313–320
  academic research, 303
  academic use of PCs, 322
  Access America: Reengineering through Information Technology, 202
  and the ALA, 311
  Bureau of the Census, 163, 164
  corrections facilities’ deployment, 133, 390n23
  county government deployment, 229–236
  deployment (1950–1980), 186–191
  deployment (1981–2007), 191–200
  education deployment, 269–271, 282
  FBI deployment, 122
  federal budget (1982), 195
  federal government agencies, 141, 181
  federal government deployment, 206–210
  government information sharing, 343
  higher education, 285–287, 320–324, 327–329, 433n162
  IBM case studies for higher education, 426–427n51
  Information Age Warfare, 86
  Information Technology Management Reform Act, 197, 199
  institutional culture and deployment, 336–342, 361
  Iraq War (Operation Iraqi Freedom), 379–380n155
  IWS/LAN Technology Program, 153
  law enforcement deployment, 104, 112
  legacy systems, 84
  and library management, 312
  logistics system flaws in Iraq War, 65–66
  military deployment, 95–97, 99
  municipalities’ deployment, 235–249
  myths of, 28, 367n38
  National Information Infrastructure (NII), 201
  National Performance Review (NPR), 200, 350
  9/11 Commission recommendation, 5–6
  police query systems deployment, 109–110
  productivity, 14–15, 338
  public education, impact on, 252–253, 418n5
  public education and computers, 261
  public education and computers (1990s–2000s), 272–279
  public sector deployment, 4–5, 182–183
  regulation of, 355–357
  SSA deployment, 142–157
  state, county, local deployment, 211–250
  state government usage (1964), 218
  state legislatures’ deployment, 220–221, 222
  surveys, 401n3
  tax preparation software usage, 41–42
  techno-hubris, 101, 381n191
  Universal Crime Report recommendations, 117
  and USPS, 164, 172, 173–175, 181, 397n91
  weapons systems, 66–72
  before World War II, 4
IWS/LAN Technology Program, 153–154
jails, 105, 131, 132
Jeffersonville, 160, 163
job applications, 249
Joint Chiefs of Staff, 51
joint surveillance and target attack radar systems (JSTARS), 86
Joint Vision 2010, 87
Joint Vision 2020, 88
Joint War Fighting Centers, 79
Jordan, William R., 251
Judicial Information System, 126
junior colleges. See community colleges
jury selection, 126
K–12 education. See also teachers
  computer use (1950–2000), 11, 12, 261, 365n14
  computer use (1970s), 255
  computer use (late 1980s), 256, 257
  drill-and-practice exercises, 266
  family impact, 336
  Internet usage, 34
  IT deployment, 342, 353
  microcomputers in the classroom, 272
  Micro-PLATO, 261
  networking, 299
  public schools with Internet access, 276, 277
  standardization of information, 343
  state IT deployment, 224, 341
Kahin, Brian, 201
Kansas City, Missouri, 19, 108–109, 170, 262
Kansas State, 225
Katz, Richard N., 295–296
Kelley, Clarence E., 108
Kendrick, John W., 3, 13–14, 335, 434n8
Kennedy administration, 60
Keyboard State, 33–34
Keynesian economics, 351
Korean Conflict, 51, 59
Krulak, Charles C., 79–80
Lacrosse missile, 68
Lally, Joseph P., 237
land records, 223
LANs (local area networks)
  classroom computer usage, 272–273
  higher education, computer usage, 294
  Joint Vision 2020, 88
  public education and computers, 275
  SIMNET (Simulation Network), 76
  SSA records, 154
  state legislature usage, 223
laptops
  academic use, 322, 324
  digital libraries, 319
  Marines military communication, 85
  patrol cars, 116
  public education and computers, 280
  state legislatures usage, 222–223
laser-guided bombs, 69
law enforcement. See also corrections facilities; crime; jails
  community-based policing, 118–119
  computer crime, 136
  and corrections, 130–133
  county government, 231–232, 234
  courts and computers, 123–130
  digital crime maps, 384n37
  EDP, 107
  expenditures, 104
  federal mandates to states, 212
  funding as impediment to computer adoption, 106
  GIS databases, 244
  institutional culture and IT deployment, 337
  LEAA (Law Enforcement Assistance Administration), 122–123
  query systems, 108–109, 110, 343
  state IT deployment, 224
  structure of, 103–105
  telecommunications, 116
  use of computing (1990), 117
Law Enforcement Assistance Administration (LEAA), 122–123
law firms, 126, 129–130
LEAA (Law Enforcement Assistance Administration), 122–123
learner-centered activities, 275
legacy systems, 84, 93
Legal Administrator, 130
Legal Economics, 130
legal research, 129–130
letter sorting (1971–1977), 169, 179
LEXIS, 126, 129
libraries
  academic libraries and digital information, 313
  campus networking and, 299
  and digital libraries, 317–319
  higher education, computer usage, 289
  and IT deployment, 320
  Library of Congress, 311, 315–316
  Online Computer Library Center (OCLC), 315–317
  role in academic life, 310–311
  schools and computer usage, 256, 273
  students and PCs, 322
  training programs, 311
  universities and IT deployment, 313–320
  university catalogs, 294
  University of Wisconsin academic libraries, 321
license applications, 344
Lincoln Laboratory, 55
linear programming, 193
local area networks (LANs). See LANs
local government, 36, 37, 40, 46. See also municipalities
Lockard, James, 272
Locke, William N., 313
Logistics Information System, 65–66
logistics systems, 62–66, 186, 374n41
long-distance telephone service, 197
Los Angeles, 36, 131, 229–230, 237, 253–254
Lotus 1-2-3, 193
Lotus Notes, 94, 128–129, 298
Louisiana, 31
machine-readable, 129, 160–161, 297
Machlup, Fritz, 335
MADAM (Master Data Access Method), 151
Madow, William, 158
Maeroff, Gene I., 332
magnet schools, 279
magnetic tape
  Bureau of the Census records, 158–159
  federal government usage, 195
  public education use, 255
  public sector IT deployment, 338
  SSA records, 145–146, 147, 148, 393–394n18
  state government usage, 216, 219
mail sorting, 169
mainframes
  academic research, 320, 327
  federal government usage, 207
  GSA computer inventory, 197
  higher education and, 297
  school districts computer usage, 257
  software for higher education, 292
  tax processing, 20
  urban police agency usage, 112–113
  USPS installation of, 171
management information systems (MIS), 255
MARC (Machine-Readable Cataloging), 315–316
Marine Corps, 77, 78, 79–80, 85, 90
Mark III, 54
Martinsburg, West Virginia, 19
Maryland, 126, 129
Massachusetts, 33, 215
Master Data Access Method (MADAM), 151
Mauchly, John, 158–159
McClure, Charles R., 402n29
McNamara, Robert, 63, 100, 376n79
Mead Data Corporation, 129
media, 294, 356
Medicaid documentation, 223
medical databases, 130
Memphis, Tennessee, 262
mental health, 102
Merrill, Paul F., 272
Michigan, 216, 226–227
microcomputers
  classroom usage, 272
  federal government and, 192–193
  higher education, computer usage, 297
  public education and, 260–261, 263–267
  use in academia, 320–324
microelectronics, 8
microfilm, 159–161
Micro-PLATO, 261
Microsoft, 242, 247, 302
military knowledge trainers, 260
military occupational specialty (MOS), 91
military-industrial complex. See also DARPA; missiles
  computer technology advancement and, 8, 99–100
  DoD budget, 51
  Eisenhower comment, 49
  and IT deployment, 338, 342, 345–346
minicomputers, 162, 172, 290–292
Minitab, 299
Minnesota, 133
MIS (management information systems), 255
missiles
  ballistic missiles, 58–59, 83
  communications, 81
  and computers, 58, 346
  guidance control data, 68
  missile gap, 60
  R&D for guidance control, 54
Missouri, 115
MIT (Massachusetts Institute of Technology)
  computer usage, 327
  Cookie Monster, 135
  and distance learning, 300
  library management and computers, 311
  MITRE, 55
  networking services, 325
  Radiation Laboratory, 53
  university-industry technology transfer, 354
  and Vannevar Bush, 303
MITRE, 55
Mobile, Alabama, 215
mobile computing, 116, 262
modeling and simulation (M&S), 292, 306
Modernized E-File Project, 46
money orders, 170–171
Monroe (corporation), 41
moon landing, 346
motor vehicle licenses, 127, 223, 234
motor vehicles, 171
MRI scanning, 310
mug shots, 108, 116–117, 123
multimedia software, 274–276
municipalities
  vs. county government, IT use, 232
  demographics, 235–236
  e-government, 250
  GIS databases, 242–245
  Internet usage, 246–249
  IT budget, 246
  IT deployment, 342, 353
  IT deployment (1950s–1990s), 245–246
  precomputer equipment, 236–239
  public works, 240–242
  traffic control, 242
  use of computers, 239, 414n114
  use of computers (1964–1965), 238
  use of computers (mid-1980s), 213
  workforce per workstation, 246
Murphy, Michael J., 107
music, 281, 322–323, 326, 432n147
Napster, 326
NASA
  computer deployment, 187, 190
  IT deployment, 97, 342
  moon landing, 346
  supercomputers, 192
Nation at Risk, 263
National Agricultural Library, 203
National Bureau of Standards (NBS), 159, 189, 208
National Commission on Excellence in Education, 263
National Computer Center, 19–20
National Crime Information Center (NCIC). See NCIC
national defense budget, 372n2
National Education Association (NEA), 278–279
National Employee Index, 393n17
National Geodetic Survey, 243
National Guard, 212
National Incident-Based Crime Reporting System (NIBRS), 117–118
National Information Infrastructure (NII), 201, 247
National Infrastructure Protection Center (NIPC), 136
National Instant Criminal Background Check System (NICS), 123
National Institute of Justice (NIJ), 128, 384n37, 385–386n54
National Institutes of Health (NIH), 56, 306
National Law Journal, 130
National Military Establishment, 50–51
National Performance Review (NPR), 200, 350
National Research Council (NRC), 84
National Science Foundation (NSF), 53, 56, 306, 310
national security, 50–51, 185, 208–209. See also NSA
National Technological University, 302
NATO, 79
Naval Tactical Data System (NTDS), 69
Navy
  firing tables, 68–69
  IBM 705s, 90
  Naval War College, 77
  OEG (Operations and Evaluation Group), 55
  ONR (Office of Naval Research), 54
  R&D weapons systems, 58
  software applications, 71
  war-gaming, 77
Nazi memorabilia, 356
NCIC (National Crime Information Center)
  data processing and, 120, 121, 122
  FBI, 110–111, 119
  NCIC 2000, 123–124
  wireless connections, 116
NCR, 39, 41, 54, 89–90
NEA (National Education Association), 278–279
Nelson, Edwin L., 128–129
NetPost Mailing Online, 178
network-centric operations (NCO), 88, 261
networking
  higher education, computer usage, 288, 292, 294, 298, 325, 326–327
  Information Age Warfare, 88
  IT deployment in academia, 330
  K–12 education, 299
  library management, 316–317
  MADAM (Master Data Access Method), 151
  NCO (network-centric operations), 88, 261
  Networked Society, 333
  SIMNET (Simulation Network), 76
New Deal, 4, 351
New Jersey, 34, 129
New Mexico, 131
New York City
  computer crime, 135
  computer deployment, 19, 38
  data centers, 36
  digital fingerprint systems, 115
  New York Public Library, 314, 319
  precomputer equipment, 237
  punched-card records and law enforcement, 106, 107
  query systems, 108–109
  tax processing, 20
  USPS mainframe installation, 171
New York State, 43–44, 216, 221, 255, 345
Newark, New Jersey, 262
NIBRS (National Incident-Based Crime Reporting System), 117–118
NICS (National Instant Criminal Background Check System), 123
NIH (National Institutes of Health), 56, 306
Nike missiles, 54, 68
911 calls, 115, 116, 118
9/11 attack, 116, 198, 205
9/11 Commission, 5–6, 138, 199, 205–206, 385–386n54
NIPC (National Infrastructure Protection Center), 136
Nixon, Richard, 126
Nobel Prize, 303, 428n79
nonprofit organizations, 228
Norberg, Arthur L., 396n68
Norris, Donald F., 232
North, Douglass C., 429n99
North Central Regional Education Laboratory, 274–275
Northrop Corporation, 54
NSA (National Security Agency), 53, 56, 97, 185, 207
NSF (National Science Foundation), 53, 56, 306, 310
NSFNet, 310
NTDS (Naval Tactical Data System), 69
nuclear family, 336
nuclear war games, 77
nuclear warheads, 56, 59, 97, 346
numerical taxonomy, 305
nursing, 227, 346
Nye, David E., 347
OCLC (Online Computer Library Center), 315–317
OEG (Operations and Evaluation Group), 55
Oettinger, Anthony G., 271
Office of Management and Budget (OMB). See OMB
Office of Naval Research (ONR), 54
Office of Scientific Research and Development (OSRD), 54
Office of Technology Assessment (OTA), 149–152
Ohio, 129, 215
Old Age and Survivors Insurance, 153, 395n51
Omaha, Nebraska, 81
OMB (Office of Management and Budget)
  Brooks Act, 189
  Grace Commission, 190
  IT deployment, 208
  IT operating costs, 198, 200
  SSA workforce, 150
Omnibus Reconciliation Act, 149
online applications, 288–289, 292–294
online bookmaking, 136
online classes, 331
Online Computer Library Center (OCLC), 315–317
online degree programs, 326. See also distance learning
online gaming, 324
online job applications, 249
online library catalogs, 313, 314
online purchases, 35, 40, 247, 357, 436n35. See also e-commerce
online testing, 323
online voting, 360, 412n72
ONR (Office of Naval Research), 54
open classrooms movement, 263
Operations and Evaluation Group (OEG), 55
operations research (OR), 53, 77
optical scanning
  academic research, 310
  courts and, 128
  Service Center Recognition Processing System (SCRIPS), 28
  state government record digitization, 223
  tracking packages, 175
  and USPS, 168–169, 172–173, 179, 181–182
  voting, 228–229
ORDVAC (Ordnance Discrete Variable Automatic Computer), 68
organized crime, 136
Origin-Destination Information System (ODIS), 171
Osorio-Urzua, Carlos A., 405n61
OSRD (Office of Scientific Research and Development), 54
outsourcing
  county government and, 231
  private sector, 359, 436n31
  public sector, 13
  SSA records, 151–152
  state government initiatives, 227
Oxford University, 319
PACER (Public Access to Court Electronic Records), 128
paper reduction campaign, 26–27, 155, 193–194, 200
Papert, Seymour, 261, 268, 270, 421–422n64
Paperwork Reduction Act, 189, 194, 202
Parker, Donn B., 133–134, 139
parking tickets, 106
parolee, 132
Patchogue, New York, 38
patents, 356
patrol cars, 116, 118
payroll
  federal government computer usage, 186
  IBM 604 Electronic Calculating Machines, 216
  IBM 705, 90
  IBM tabulators, 215
  Los Angeles, 230, 237
  New York City, 237
  public education and computers, 255, 345
  public sector, 346–347
  USPS, 170
PBX systems, 132
PC postage, 177
PCs
  computer crime, 135
  corporate tax returns and, 46
  county government IT use, 232–233
  higher education, computer usage, 288, 292, 298, 332
  introduction of (1981), 215
  library management, 317
  municipalities and, 239, 241, 246–247
  NCIC files, 121
  PC postage, 177
  public education and computers, 265, 266, 282
  public sector deployment, 348
  school districts computer usage, 257
  SSA records, 153
  state government usage, 219
  state legislatures usage, 222–223
  state pre-Internet communications, 225
  student use, 322–324
  tax enforcement, 33
  tax preparation software usage, 41
  tax returns, 23, 45
  teacher usage, 258, 269
  urban police agency usage, 113
PDAs (personal digital assistants), 280
Peace Prize, 428n79
peer review, 307
Pennsylvania, 33–34, 129
Pennsylvania State University, 288–289
pensions, 153, 359–360
Pentagon
  academic research, funding, 306
  computers’ impact, 345
  “electronic battlefield,” 56–57
  firing table errors, 71
  Logistics Information System, 64
  military communication, 83
  R&D, 54
  RFID tags, 65
  war-gaming, 79
performance-based management, 198
Perlman, Ellen, 403n40
Pershing missile, 68
Persian Gulf War, 65, 84
personnel records, 255, 290, 338
Pew Foundation, 175, 399–400n129
Pew Internet and American Life Project, 411n64
Philadelphia
  community-based policing, 118
  courts and computers, 126
  public education computer use (1970s), 255
  punched-card records, 38
  tax processing, 20
  USPS mainframe installation, 171
Phister, Montgomery, Jr., 401n3
phone phreaks, 135
piracy, 265
platforms, 67
PLATO, 73–74, 261, 280
point-of-sale terminals (POS), 174–175
politics, 179, 181, 209
pornography, 357
portals, 234, 339
Post Office Department (POD). See USPS
postage stamps, 174
postal source data system (PSDS), 171
postal unions, 165
Potter, John E., 178
Powell, Colin, 86, 89
PowerPoint presentations, 281
practice drills, 276
precomputer era, 141–142, 237, 254, 288–289. See also adding machines/calculators
Pressey, Sidney L., 259–260
PriceWaterhouse, 41
Princeton University, 333
Priority Mail, 179
prisoners, 104, 122, 130, 131
privacy
  computer crime, 133
  county government, 235
  digitized criminal records, 123
  lawyers and laws, 129
  local government database protection, 40
  public concern over, 349
  SSA Web site, 154
  turf battles, 138
  Web sites, 203
private sector
  academic research, funding, 307
  accounting practices, 199
  comparison with law enforcement, 102
  and computer usage, 184
  cost justification for computers, 341
  decentralization of decision-making, 118
  downsizing, 359
  and IT deployment, 338, 342
  Paperwork Reduction Act, 189
  vs. public sector, 207
  public sector, influence on, 336
  R&D in the (1980s), 55
  stovepiping, 339
  Web site usage, 203
privatization, 164, 175
probationers, 104
process management, 337
procurement
  IT deployment, 95–96, 99, 191
  municipalities and, 249
  public sector, 339, 434n6
Prodigy, 177
productivity
  computers and, 15, 187–188, 365–366n20
  higher education, computer usage, 293, 332–333
  Internal Revenue Service (IRS), 29
  and IT deployment, 338
  “productivity paradox,” 14
  public sector, 3, 13–14, 335, 346–347, 353
  school districts, 337
  service sector, 365n19
programmed books, 260
programming languages, 59, 72, 131, 297
propaganda, 93–94, 380n171
property taxes
  computers and, 128
  county government IT use, 231–232
  GIS databases, 243–244
  local government tax base, 36, 39
  precomputer equipment, 237
  and public education, 254
psychological operations (PSYOP), 87
Public Access to Court Electronic Records (PACER), 128
public education
  administration and computers (1950s–2000), 253–258, 280–281
  “Computer in the School Room: What Revolution?,” 266
  computer labs, 264–266
  debate on computer role in, 267–272
  drill-and-practice exercises, 264–266
  ERIC (Education Resource Information Center), 418n5
  Goals 2000: Educate America Act, 273
  and the Internet, 276–279
  Nation at Risk, 263
  open classrooms movement, 263
  role of computers, 251–253, 267–272, 281–282, 418n1
  role of IT (1990s–2000s), 272–279
  student-to-computer ratio, 265
  teacher-centered instruction, 259, 269, 419n22
public rights-of-way (ROWs), 247
public sector
  American economy and, 3, 334, 351–355
  cost justification for computers, 341, 345
  and the digital hand, 141
  expenditures on IT (1952–2002), 10
  funding for computers, 348
  as galaxy of industries, 335–342
  GDP (gross domestic product), 4
  and IT deployment, 338, 340, 342
  management of IT, 357–362
  outsourcing of, 13
  patterns in adoption of IT, 347–350
  vs. private sector, 13, 14, 207
  private sector, influence on, 336
  productivity, 3, 13–14, 335, 346–347, 353, 434n8
  “Public Sector: Early Stages of a Deep Transformation,” 405n61
  regulatory and legislative role, 355–357
  taxes and expenditures, 7–8
  telecommunications and computing, 362–363
public works, 30, 31, 240–242, 353
punched-card records
  academic research, 303
  Bureau of the Census, development of, 396n66
  Census Bureau equipment, 140–141
  court system, 125
  DoD accounting, 89
  higher education and administration, 288
  inventory control, 64
  IRS processing centers, 19, 366n7
  IT usage in federal government, 187
  and law enforcement, 106, 107, 109
  library management, 311–312
  local government use, 37
  Marine Corps data processing, 91
  money orders, 170–171
  municipalities’ IT deployment, 241
  Philadelphia, 38
  public education use, 255
  public sector IT deployment, 338
  SSA records, 145
  state income tax and, 30–32, 216
  statistical analysis and, 157–158
  USPS payroll, 170
  voting, 228–229
purchasing, 62–66
Quality Management, 200
quantitative analysis, 429n99
query systems
  computer-aided dispatching, 112
  law enforcement, 108–109, 110, 343
  state government applications, 219, 222
  wireless connections, 116
queuing analysis, 193
Quicken, 45
R&D (research and development)
  academic research, funding, 307, 308
  Bureau of the Census and computers, 158
  DoD budget, 52–62, 98
  federal expenditures for, 365n11
  National Information Infrastructure (NII), 201
  private sector, 55
  Soviet Union, 372n6
  stealth aircraft, 60, 374n38
  universities and computer development, 284, 351–352
  USPS, 169, 173
  weapons systems, 207
Radiation Laboratory, 53
radio frequency identification devices (RFID), 53, 65–66, 180, 372n9
Radio Research Laboratory, 53
Radio Shack, 263–264
RAMACs, 63
RAND Corporation, 55, 77, 80, 187
rap sheet, 123
Raytheon, 54
Reagan administration, 61, 149, 194
Real War, 80
Recorded Music Industry, 281, 320, 331, 432n147
recreational passes, 249
Reese, William J., 263, 268
Refund Anticipation Loan (RAL), 42
registration lists, 289–290, 294–295
Re-Inventing Government. See Clinton administration
Remington Rand, 41, 60, 89, 237
research and development. See R&D
research grants, 289, 295, 306–307, 429n102
Research Libraries Information Network (RLIN), 316
retina scans, 115
revenue administration, 31
Revolutionary Wealth, 279, 421–422n64
RFID (radio frequency identification devices), 53, 65–66, 180, 372n9
Ridge, Tom, 33–34
rightsizing, 359
Riley, Paul H., 82–83
RockYourRefund.com, 44
Rossotti, Charles O., 22–23, 29, 366n5, 369n61
Ruch, Richard S., 330–331
Rumsfeld, Donald H., 89, 100
Runyon, Marvin, 177
SAC (Strategic Air Command), 81, 88
SAGE Air Defense System
  Ballistic Missile Early Warning System (BMEWS), 83
  command and control systems, 81, 345–346
  computer-assisted air defense, 62
  false alarms, 71, 88
  R&D, 53, 58
  war-gaming, 77
sales tax, 29, 32, 35, 40, 357
San Jose, California, 238
satellites, 55, 70, 83, 245, 356
school boards, 268
school districts, 11, 36, 255–256, 337
Schwabe, William, 102
scientific research, 186, 198, 297, 331
Scotland Yard, 113–114
SDC (Systems Development Corporation), 55
SDI (Strategic Defense Initiative), 53, 55, 59, 61, 88
Secret Service, 103, 136
Secretary of Defense, 98
Sedensky, Matt, 371n92
semiconductors, 54
Sergeant missile, 68
Serote, Paul, 253–254
Service Center Recognition Processing System (SCRIPS), 28
service sector, 365n19
Sgt. Joe Friday, 110, 383n26
Shelton, William C., 396n66
sheriffs, 120, 386–387n69
Shillito, Barry J., 64
Signal Supply Agency, 63
Silicon Valley, 269–270, 354
SIMNET (Simulation Network), 76
simulators, 72–80, 88, 306, 310
Singer, Molly, 411n64
Skinner, B. F., 259–260
slide rules, 303
smart bombs, 66–67, 69–70, 338, 375n66
SMP strategy, 151–152
Snark missile, 54
Social Security Administration (SSA). See SSA
social security number (SSN), 143, 150, 156
Social Security Online Accounting and Reporting Systems (SSOARS), 156–157
Social Security Statement, 157
social services, 224
social welfare statistics, 230, 244
sociology, 308–309
software applications. See also GIS
  branching programs, 260
  Bureau of the Census, 162, 338
  county government IT use, 232–233
  educational, 265, 271–273, 274, 281, 338
  higher education, computer usage, 292, 298, 299
  lack of for education, 262
  Master Data Access Method (MADAM), 151
  municipality usage, 242–243
  patents, 356
  PC usage and, 348
  PLATO, 73–74, 261, 280
  spreadsheets, 192
  state legislatures and, 221, 222
  Supervisor’s Workstation, 175
  Systems Modernization Plan (SMP), 150
  teacher input, 265
  weapons systems, 70–72
  Web site visit tracking, 411n63
Software Engineering Institute, 55
solid waste management, 192
Soviet Union, 71, 372n6, 307. See also Cold War
Sperry Rand Corporation, 384–385n39
Sperry-Univac, 39, 114
sporting events, 258
spreadsheets
  federal government and, 192
  higher education, computer usage, 291–292, 297
  library management, 317
  and PCs, 321–322
  state government usage, 219
Sprint (corporation), 196–197
Sputnik, 55
SSA (Social Security Administration)
  Case Processing and Management System (CPMS), 157
  computer crisis (1970s–1980s), 148–152
  computer deployment, 144–148, 180–181, 340
  Disability Insurance program, 152–153, 395n51
  Electronic Disability (eDib) process, 156
  e-mail access, 154–155
  established, 142
  IT deployment, 142–157
  IT expenditures, 155, 200
  IT projects (2001), 155
  key applications (1957–1969), 148
  networking (1980s–2006), 151–158
  networking (1994–2003), 154
  optical scanning, 173
  precomputer era, 141–142
  record storage prior to 1962, 143
  SMP strategy, 151–152
  Social Security Statement, 157
  SSN, 143, 150, 156
  storage of records, 147, 393–394n18
  Visible Index, 143, 144
  wages data, 186
  Web sites, 153–154, 210
SSN (social security number), 143, 150, 156
St. Louis, 38, 106, 108–109, 138, 370n74
Staats, Elmer B., 394n29
STAIRS (Storage and Information Retrieval System), 130
Stamps Online, 178
standardization, 343
Stanford University, 319, 354
state and local government. See also local government; state government
  court system, 125
  criminal justice databases, 111
  digital fingerprint systems, 115
  digitized criminal records, 123
  expenditures, 10
  IT deployment and, 249–250
  law enforcement, adoption of computing, 105–120
  local government units (1952–2002), 11
  tax collection and computer use, 17
  taxes and expenditures, 8–11, 9
  workforce, 8, 9
  World War I technology, 364n3
state government
  boards of education, standards, 267
  CAD/CAM, 408n18
  computer financial applications, 29–35
  data collection, 218–219
  Diebold Group survey, 217, 407n12
  digital technology (1950s–mid-1990s), 215–224
  and e-commerce, 338–339
  federal law enforcement mandates, 212
  Gartner survey, 214
  Internet usage, 224–229, 226
  IT applications (circa 1964), 218
  IT budget, 218, 223–224, 340–341, 361
  IT deployment, 212–224, 342
  legislation and IT deployment, 223
  online purchases, 357
  online voting, 227–228
  outsourcing initiatives, 227
  procurement, 339
  productivity and IT, 344–345
  R&D funding, 352–353, 435n18
  record digitization, 223
  tax collection, 343
  use of computers (1964–1965), 30, 31
  use of computers (mid-1980s), 213
  voting, 228–229
  workforce per workstation, 246
state legislatures, 220–221, 222
statistical analysis, 157–158, 162, 299, 321–322
stealth aircraft, 60, 374n38
stem cell research, 435n18
stenograph machine, 125, 387n89
Storage and Information Retrieval System (STAIRS), 130
stolen property, 105, 108
Stolurow, Lawrence M., 426–427n51
stovepiping, 339, 361
Strategic Air Command (SAC), 81, 88
Strategic Computing Program, 61
Strategic Defense Initiative (SDI), 53, 55, 59, 61, 88
Strategic Transformation Plan 2006–2010, 397n91
students
  distance learning, 325
  higher education, computer usage, 298–299
  higher education enrollment, 292–293
  PC use in higher education, 322, 332
  student-to-computer ratio, 265, 277, 322
supercomputing, 97, 192, 310
Supervisor’s Workstation, 175
Supplemental Security Income (SSI), 149, 153
supply chains, 65–66, 179
Supreme Court, 35, 228
surveillance, 81
Systems Development Corporation (SDC), 55
Systems Modernization Plan (SMP), 150
tabulators, 215–216, 288
Tank Team Gunnery Trainer, 76
Tapscott, Don, 207
Tax Administration System, 20
tax collection, 16–17, 47
tax evasion, 32–33
tax fraud, 21, 27
tax preparation software, 17, 45–46
tax preparers
  computer usage among, 40–47
  CPA Journal survey, 43–44
  e-filing, 24, 26
  Refund Anticipation Loan (RAL), 42
  tax preparation software, 43, 44
tax returns. See also e-filing
  and computers, 46
  corporate online filing, 344
  costs of, 46
  digitization of, 223
  electronic funds transfer (EFT), 33
  individual, 18
  Integrated Collection System (ICS), 22–23
  Internet filing survey, 34
  length of, 17
  online filing, 205, 247, 344
  paper and electronic filing, 25
  refunds, 33, 45
  software usage, 41–42
tax shelters, 27
Tax System Modernization (TSM), 21–23
Tax System Redesign Project, 20
taxes. See also IRS; tax returns
  federal (1950–2003), 7
  Internet applications, 226
  law enforcement and, 104
  Los Angeles County, 230
  percent of economy, 4
  state and local government, 9
  and state government, 29–35, 224, 343
tax-exempt organizations, 46
taxpayers, 17, 23, 27, 41
Taylor, Howard D., 3–4, 17
Taylor, Robert P., 268
teachers
  computer education for, 272, 341
  computer influence on, 330
  computer use (1960s–1980s), 258–267
  computer use (1970s), 254
  and distance learning, 300–302, 427n67
  educational software, 265, 266
  institutional culture, 336
  Internet usage, 276–279
  NEA survey, 278–279
  number of (1980–2000), 11–12
  software applications for, 346, 348
  teacher-centered instruction, 259–260, 296, 337, 419n22
  universities and computers, 284–285
teaching machines, 259–260
techno-hubris, 101, 381n191
telecommunications
  academia and the Internet, 331
  and academic research, 324
  Big Science and, 309
  computer crime, 135
  corrections facilities, 131–132
  digital PBX systems, 116
  and distance learning, 299, 326
  and the federal government, 185, 191, 196–197, 203
  higher education and, 288, 333
  institutional culture and IT deployment, 339
  INTELPOST, 174
  and Internet usage, 47
  law enforcement and, 116, 385–386n54
  municipalities usage, 236
  NCIC 2000, 123–124
  online tax returns, 344
  public sector and, 362–363
  regulation of, 356
  and SSA, 147–149, 151
  state and local government, 213
  state government usage, 219
  state legislatures and, 222
  Telecommunications Act of 1996, 201, 247
  and USPS, 167, 171
TeleFile, 26, 45
telephones, 106, 125, 197, 236, 239–240
teletype, 147
1040EZ form, 27, 42
1040PC, 24–25
1099 forms, 32
term paper mills, 323
Texas, 132
Texas Instruments, 54
textbooks, 272, 421n51
text messaging, 281
Thor missile, 59
305 RAMACs, 63
TIGER (Topologically Integrated Geographic Encoding and Referencing System), 162, 243
time-sharing services, 162, 261, 291
Titan missile, 59–60
Toffler, Alvin, 268–269, 279, 421–422n64
toll-free 800 telephone numbers, 153–154, 179
Topologically Integrated Geographic Encoding and Referencing System (TIGER), 162, 243
“total systems approach,” 31
tracking packages, 175
traffic control, 242
traffic violations, 108, 126, 237, 249
Traf-O-Data, 242
training programs
  CAI (computer-aided instruction), 72, 74, 75
  computer-based, 79
  Joint Vision 2020, 88
  military communication, 84
  public vs. private sector, 341
  simulators, 74–77
Trainor, Bernard E., 379–380n155, 381n188
transcripts, 294
transistors, 217
transportation modeling, 192
Transportation Security Administration (TSA), 199
Treasury Department, 145, 170, 186
Truesdell, Leon, 396n66
tub cards, 109, 255
Tubbs, Henry W., Jr., 91
Tucker, Marc S., 266, 427n53
tuition costs, 332–333
Tulsa, Oklahoma, 237–238
TurboTax, 44–45
turf battles, 138
Turnpiking Effect, 399–400n129
TV cable systems, 247
2000 election, 228
typewriters
  corrections facilities, 131
  court system, 125
  and law enforcement, 106
  municipalities usage, 236
  tax preparation equipment, 41–42
  World War I technology, 4
“undeliverable as addressed” mail, 172
Underwood, 41
unemployment insurance, 215–216
unemployment tax, 20
Ungar, Bernard L., 140
Unisys Corporation, 153–154
Univac, 54, 90, 370n74
Univac I, 158–159
Univac III, 90
Univac 60, 171, 237
Univac 90, 54
Univac 120, 38, 171, 237
Univac 1105, 160
Univac 1107, 160–161
Univac 1108, 160–161
Universal Crime Report (UCR), 117
Universal Digital Operational Flight Trainer Tool, 75
University of California, 289–290, 311, 319
University of Illinois, 74, 261
University of Michigan, 354
University of Minnesota, 354
University of North Carolina, 160
University of Pennsylvania, 53, 67, 75
University of Phoenix, 300–302
University of Texas, 300
University of Toronto Library Automation System (UTLAS), 316
University of Wisconsin
  academic libraries, 321
  academic research, 323
  cataloguing books, 318
  computer payroll, 345
  library training, 311–312
  online catalogs, 313, 314
  WISC computer system, 285
university-industry technology transfer, 354–355
UPS, 164
urban development, 192
U.S. Geological Survey (USGS), 162
U.S. Sprint, 196–197
USAF. See Air Force
USA.gov, 206
USPS (U.S. Postal Service). See also First Class Mail service
  and American economy, 165, 166
  bar coding, 169, 172–173, 179
  Bureau of Finance and Administration, 171
  Click-N-Ship, 180
  data processing, 168
  delivery statistics, 167
  deployment of mechanization and IT (1950–1983), 167–172
  Electronic Computer-Originated Mail (E-COM), 174, 175
  history of, 164
  INTELPOST, 173–174
  Internet Age (1995–2007), 175–180, 399–400n129
  IT applications (1968–1984), 172
  IT before the Internet (1984–1994), 172–175
  IT deployment, 181, 334, 341
  letter sorting (1971–1977), 169, 179
  money orders, 170–171
  NetPost Mailing Online, 178
  optical scanning, 168–169, 172–173, 179, 181–182
  Origin-Destination Information System (ODIS), 171
  overtime costs, 179
  PC postage, 177
  point-of-sale terminals (POS), 174–175
  postage stamp machine, 174
  postal source data system (PSDS), 171
  precomputer era, 141–142
  Strategic Transformation Plan 2006–2010, 397n91
  Supervisor’s Workstation, 175
  toll-free 800 telephone numbers, 179
  tracking packages, 175
  Ungar report, 140
  ZIP codes, 167–169, 171–172, 398n96
  Zoning Improvement Plan (ZIP), 168
Utah, 225
utilities, 38, 230, 232, 237, 243–244
vacuum tubes, 217
Varian, Hal R., 432n135
vehicle registration, 216
veterans, 336
video conferencing, 197
video games
  graphics, 346
  Joint Vision 2020, 88
  military training, 72–73
  recruitment, 93–94, 380n171
  teaching machines, 260
  war-gaming, 77–78, 79, 80
videotext, 177
Vietnam War
  and accounting, 90
  communications, 82, 100
  DoD budget, 51
  “electronic battlefield,” 76
  FADAC usage, 68
  federal government workforce and, 6
  Information Age Warfare, 379–380n155
  Marine Corps data processing, 90
  smart bombs, 69
violent crime, 102
Virginia, 223
Virtual Law Library, 129
virtual schools, 279
viruses, 135, 137
Visible Index, 143–145, 144
VisiCalc, 193
visualization, 309–310
von Neumann, John, 304
voting, 221, 227–228, 231, 233
Walker, David M., 198
Wal-Mart, 100
Wang, 39, 197
wanted posters, 119
War College, 95–96
War on Terror, 4, 199, 350
war-gaming, 77–78, 79
Washington, D.C., 171, 223
Washington Library Network (WLN), 316
Washington State, 133, 216, 221
water resources, 192
Watergate, 20
weapons systems, 53, 66–72, 99, 207
Weather Bureau (U.S.), 187
weather forecasting, 186–187, 192, 306
Web sites
  Center for Digital Government, 227
  county government, 233
  federal government, 203, 206, 344
  hackers, 206
  higher education, 294
  municipalities and, 246–247
  political campaigns, 229
  RockYourRefund.com, 44
  school districts, 257–258
  site visit tracking, 411n63
  state government and, 224–229
welfare programs
  county government IT use, 231, 339
  SSA benefits, 149
  standardization, 343
  and state computer usage, 217
  state mandates, 212
West, Darrell M., 209, 225, 434n9
West Coast Computer Center, 91
Western Identification Network (WIN), 385n44
WESTLAW, 126, 129
Whirlwind, 54
white-collar crimes, 134
White House, 197
wind currents, 230
wireless connections, 116, 132, 249, 325–327, 385–386n54
WISC computer system, 285
Wisconsin, 344, 345
word processors
  higher education, computer usage, 291–292, 297
  library management, 317
  and PCs, 321–322
  SSA records, 153
  state government usage, 219
Word, 298
WordPerfect, 298
workbooks, 260, 264
workforce
  Air Force, 91
  Bureau of the Census, 164
  computer education for, 329
  DoD budget, 51, 52
  federal government (1950–2005), 6, 364–365n7
  law enforcement ecosystem, 102, 104, 112, 139
  Los Angeles County, 230
  online job applications, 249
  public sector employment, 4–5, 346–347
  public vs. private, 13, 14
  retirement of, 359, 360
  Secretary of Defense, 98
  SSA employees, 149–152, 155–156
  USPS, 165, 173, 175, 177
World Is Flat, The (Friedman), 279, 421–422n64
World War I, 4, 69, 70, 72, 303
World War II
  accounting equipment after, 37–38
  and computers, 304
  federal government, growth after, 4
  FFRDCs, 55
  firing tables, 67
  increase in public services, 344
  OSRD, 54
  war-gaming, 77
worms (computer), 135
Writing to Read, 265
Y2K scare
  county government, 233
  federal government, 198, 403n40
  higher education, computer usage, 296
  municipalities and, 247–248
  Rossotti, Charles O., 22
Yahoo!, 326
ZIP codes, 167–169, 171–172, 398n96
zoning, 244, 247
Zoning Improvement Plan (ZIP), 168