Tracking a Transformation: E-commerce and the Terms of Competition in Industries
BRIE-IGCC E-conomy Project
BROOKINGS INSTITUTION PRESS
BRIE-IGCC E-conomy Project
Tracking a Transformation: E-commerce and the Terms of Competition in Industries
Brookings Institution Press, Washington, D.C.
The Brookings Institution is a private nonprofit organization devoted to research, education, and publication on important issues of domestic and foreign policy. Its principal purpose is to bring knowledge to bear on current and emerging policy problems. The Institution maintains a position of neutrality on issues of public policy. Interpretations or conclusions in Brookings publications should be understood to be solely those of the authors. Copyright © 2001
1775 Massachusetts Avenue, N.W., Washington, D.C. 20036 www.brookings.edu All rights reserved Library of Congress Cataloging-in-Publication data Tracking a transformation : e-commerce and the terms of competition in industries / BRIE-IGCC E-conomy Project. p. cm. Includes bibliographical references and index. ISBN 0-8157-0067-9 (pbk. : alk. paper) 1. Electronic commerce—United States—Case studies. 2. Competition—United States—Case studies. 3. Industrial relations—Effect of technological innovations on— United States—Case studies. 4. Electronic commerce—Europe—Case studies. 5. Competition—Europe—Case studies. 6. Industrial relations—Effect of technological innovations on—Europe—Case studies. I. BRIE-IGCC E-conomy Project. II. Berkeley Roundtable on the International Economy. III. University of California Institute on Global Conflict and Cooperation. HF5548.325.U6 T73 2001 381’.1—dc21 2001005817 987654321 The paper used in this publication meets minimum requirements of the American National Standard for Information Sciences—Permanence of Paper for Printed Library Materials: ANSI Z39.48-1992. Typeset in Adobe Garamond Composition by R. Lynn Rivenbark Macon, Georgia Printed by R. R. Donnelley and Sons Harrisonburg, Virginia
Contents
Acknowledgments
Acronyms

The Enablers: Tools and Markets
1. Tools: The Drivers of E-Commerce
Stephen S. Cohen, J. Bradford DeLong, Steven Weber, and John Zysman
2. The Construction of Marketplace Architecture
François Bar

E-Commerce: A View from the Sectors
The Boundary Conditions of Services
3. E-Finance: Recent Developments and Policy Implications
Setsuya Sato, John Hawkins, and Aleksander Berentsen
4. The Future of Retail Financial Services: Transparency, Bypass, and Differential Pricing
Eric K. Clemons, Lorin M. Hitt, and David C. Croson
5. Web Impact on the Air Travel Industry
Stefan Klein and Claudia Loebbecke
6. Confronting the Digital Era: Thoughts on the Music Sector
Jonathan Potter
Standard Modules and Market Flexibility
7. The Internet and the Personal Computer Value Chain
Martin Kenney and James Curry
8. E-volving the Auto Industry: E-Business Effects on Consumer and Supplier Relationships
Susan Helper and John Paul MacDuffie
9. E-Commerce and the Changing Terms of Competition in the Semiconductor Industry
Robert C. Leachman and Chien H. Leachman
10. The Old Economy Listening to the New: E-Commerce in Hearing Instruments
Peter Lotz
Making and Moving Stuff
11. Electronic Systems in the Food Industry: Entropy, Speed, and Sales
Jean Kinsey
12. Lean Information and the Role of the Internet in Food Retailing in the United Kingdom
Jennifer Frances and Elizabeth Garnsey
13. E-Commerce in the Textile and Apparel Industries
Jan Hammond and Kristin Kohler
14. E-Commerce and Competitive Change in the Trucking Industry
Anuradha Nagarajan, Enrique Canessa, Will Mitchell, and C. C. White III

What Comes Next? The Evolving Infrastructure
What Will the Next Generation of Tools, Networks, and Marketplaces Look Like?
15. The Mobile Internet Market: Lessons from Japan's i-Mode System
Jeffrey L. Funk
16. E-Commerce and Network Architecture: New Perspectives
Michael J. Kleeman with David Bach
17. The Political Economy of Open Source Software
Steven Weber
18. The Next-Generation Internet: Promoting Innovation and User-Experimentation
François Bar, Stephen S. Cohen, Peter Cowhey, J. Bradford DeLong, Michael J. Kleeman, and John Zysman

Contributors
Index
Acknowledgments
The BRIE-IGCC E-conomy Project, led by BRIE codirectors Stephen S. Cohen and John Zysman (UC Berkeley) and IGCC director Peter Cowhey (UC San Diego), includes professors François Bar (Stanford), J. Bradford DeLong (UC Berkeley), Martin Kenney (UC Davis), and Steven Weber (UC Berkeley). Special thanks are due the BRIE graduate students who provided significant substantive and editorial contributions to this work: Benjamin Ansell, David Bach, John Cioffi, Gary Fields, Brodi Kemp, John Leslie, and Abe Newman. Newman's work on the media chapter was of distinct importance. David Bach made significant contributions to the overall effort and helped draft important elements. The value of John Cioffi's contributions to this book, and to the several conferences that led up to it, cannot be overstated. He has been an indispensable partner in the effort. Thanks are also due to Patricia Johnson and Susan Jong for substantive contributions and to Michelle Clark and Noriko Katagiri as well; all four provided elements of the complex coordination required in research and production stages. Thanks, too, to the many readers who contributed comments, and in particular to Mary Clare Fitzgerald, Brian Kahin, Elliot Maxwell, Andy Pincus, and Lee Price, who helped organize the conference at which some of these papers were presented. Peter Harter, for his help in the formative stages of the E-conomy Project, also deserves thanks, as does Ann Mine for help with everything from logistics to rewriting.

The generosity of the German Marshall Fund of the United States and the Alfred P. Sloan Foundation supported work for this book. There was also significant support for this project from IGCC. A companion volume, a result of the work of the Brookings Task Force on the Internet, The Economic Payoff from the Internet Revolution, has also been published by Brookings Institution Press.
Acronyms
3G  Third Generation Wireless Technology
ACM  Association for Computing Machinery
AOL-TW  America Online–Time Warner
API  Application Program Interface
APR  Annual Percentage Rate
ARPA  Advanced Research Projects Agency
ARPANET  ARPA Network
ART  Advanced Radio Telecom
ASCAP  American Society of Composers, Authors and Publishers
ASIC  Application-Specific Integrated Circuit
ASP  Application Service Provider
AST  Advanced Systems Technology
ASTA  American Society of Travel Agents
ATM  Automated Teller Machine
B2B  Business-to-Business
B2B2C  Business-to-Business-to-Consumer
B2C  Business-to-Consumer
B2V  Business-to-Vehicle
BIOS  Basic Input Output System
BMI  Broadcast Music, Inc.
BSD  Berkeley Software Distribution
BTE  Behind-the-Ear
BTO  Build to Order
CAD  Computer-Aided Design
CATV  Cable Television
CD-ROM  Compact Disk Read Only Memory
c-HTML  compact HTML
CLEC  Competitive Local Exchange Carrier
CM  Category Management
CMOS  Complementary Metal-Oxide Semiconductor
CNPS  Cross-National Production System
CPFR  Collaborative Planning, Forecasting, and Replenishment
CPI  Consumer Price Index
CRS  Computerized Reservation Systems
CRTC  Canadian Radio-Television and Telecommunications Commission
CSM  Competitive Semiconductor Manufacturing
DC  DaimlerChrysler
DLC  Digital Loop Carrier
DMO  Destination Management Organization
DNS  Domain Name System
DOJ  U.S. Department of Justice
DOS  Disk Operating System
DP  Data Processing
DRAM  Dynamic Random Access Memory
DRI  Defense Research Institute
DSD  Direct Store Delivery
DSL  Digital Subscriber Line
DVD  Digital Versatile Disk
DWDM  Dense Wave Division Multiplexing
EBPP  Electronic Bill Presentment and Payment
ECN  Electronic Communications Network
ECR  Efficient Consumer Response
EDA  Electronic Design Automation
EDI  Electronic Data Interchange
EFS  Electronic Financial Services
EMACS  Editor MACROS
EPOS  Electronic Point of Sale
EU  European Union
FCC  Federal Communications Commission
FinCEN  Financial Crimes Enforcement Network
FSF  Free Software Foundation
FTC  Federal Trade Commission
FTP  File Transfer Protocol
FTTH  Fiber to the Home
G2C  Government-to-Consumer
GCC  GNU Compiler Collection
GDB  GNU Debugger
GDP  Gross Domestic Product
GDS  Global Distribution Systems
GEMA  Gesellschaft für musikalische Aufführungs- und mechanische Vervielfältigungsrechte
GM  General Motors
GNU  GNU Is Not Unix
GPL  General Public License
GPRS  General Packet Radio Service
GPS  Global Positioning System
GSM  Global System for Mobile Communications
GUI  Graphical User Interface
HDD  Hard Disk Drive
HDR  High Data Rate
HDTV  High Definition Television
HHS  U.S. Department of Health and Human Services
HKMA  Hong Kong Monetary Authority
HMO  Health Maintenance Organization
HP  Hewlett Packard
HTML  Hyper Text Markup Language
HTTP  Hyper Text Transfer Protocol
IC3D  Interactive Custom Clothes Company Design
ICT  Information and Communication Technologies
IDE  Integrated Drive Electronics
I-EDI  Internet-Enabled Electronic Data Interchange
ILECS  Incumbent Local Exchange Carriers
IM  Instant Messenger
IOR  Interorganizational Relations
IP  Internet Protocol
IPO  Initial Public Offering
IPv6  Internet Protocol Version 6
ISDN  Integrated Services Digital Network
ISP  Internet Service Provider
IT  Information Technology
ITE  In-the-Ear
ITN  InterTAN
ITS  Incompatible Time Sharing System
ITS  Intelligent Transportation System
JCI  Johnson Controls, Inc.
JCP  J. C. Penney
JIT  Just in Time
LAN  Local Area Network
LMDS  Local Multipoint Distribution Services
LRIC  Long-Run Incremental Cost
LTL  Less Than Truckload
LTO  Local Tourist Organization
M&S  Marks and Spencer
MFN  Most Favored Nation
MIS  Management Information System
MITI  Ministry of International Trade and Industry (Japan)
MMDS  Multichannel Multipoint Distribution Services
MML  Mobile Markup Language
MP3  MPEG (Moving Pictures Experts Group) Audio Layer 3
MPU  Microprocessor Unit
MRO  Maintenance, Repair, General Plant Operations
MSDW  Morgan Stanley Dean Witter
N2K  N2K, Inc., now CDnow
NC  Network Computer
NCTA  National Cable Television Association
NPD  Network Presence Database
NRA  National Regulatory Authority
NTO  National Tourist Organization
NTT  Nippon Telegraph and Telephone
NVH  Noise, Vibration, Harshness
OAC  Open Access Coalition
OECD  Organization for Economic Cooperation and Development
OEM  Original Equipment Manufacturer
OEO  Optical-Electronic-Optical
OFTEL  U.K. Office of Telecommunications
OOO  Optical-Optical-Optical
OSI  Open Source Initiative
OSS  Open Source software
OTC  Over the Counter
OVPN  Optical Virtual Private Network
PARC  Palo Alto Research Center
PC  Personal Computer
PDA  Personal Digital Assistant
PDC  Personal Digital Standard
POS  Point of Sale
PUD  Package Express (PX) Pickup and Delivery Vehicle
PX  Package Express
QRP  Quick Response Partnershipping
QVC  Quality, Value, Convenience
RBOC  Regional Bell Operating Company
RDC  Regional Distribution Center
REIT  Real Estate Investment Trust
ROM-BIOS  Read Only Memory–Basic Input Output System
RTO  Regional Tourist Organization
SACEM  Société des Auteurs Compositeurs Éditeurs de Musique
SAE  Society of Automotive Engineers
SAGE  Semi-Automatic Ground Environment
SBC  Southwestern Bell Corporation
SEMI  Semiconductor Equipment and Materials International
SIAE  Società Italiana degli Autori ed Editori
SIM/USIM  Subscriber Identity Module/Universal Subscriber Identity Module
SKU  Stock Keeping Unit
SMS  Short Messaging Service
STP  Straight through Processing
SUV  Sports Utility Vehicle
TCP/IP  Transmission Control Protocol/Internet Protocol
TDMA  Time Division Multiple Access
TL  Truckload
TSMC  Taiwan Semiconductor Manufacturing Company
UAW  United Auto Workers
UCC  Uniform Code Council
UCCNet  UCC Open Format Internet Platform
UHL  Ultra Long Haul
UMC  United Microelectronics Company
UMTIP  University of Michigan Trucking Industry Program
UNIVAC  Universal Automatic Computer
URL  Universal Resource Locator
VAN  Value-Added Network
VAR  Value-Added Reseller
W3C  World Wide Web Consortium
WAP  Wireless Application Protocol
W-CDMA  Wideband Code Division Multiple Access
WDM  Wavelength Division Multiplexing
WML  Wireless Markup Language
WWW  World Wide Web
XML  eXtensible Markup Language
I
The Enablers: Tools and Markets
1
Stephen S. Cohen, J. Bradford DeLong, Steven Weber, and John Zysman
Tools: The Drivers of E-Commerce
This book is built on the proposition that the late-twentieth-century information technology (IT) revolution marks the beginning of a fundamental economic transformation. IT is producing one of those very rare eras in which advancing technology and changing organizations do not revolutionize just one leading economic sector but transform the entire economy and ultimately the rest of society as well. Information technology builds tools to manipulate, organize, transmit, and store information in digital form. It amplifies brainpower in a way analogous to that in which the nineteenth-century industrial revolution amplified muscle power.

Rapid change by itself is not "revolutionary," at least not as we are using the term. Rapid economic and technological change is normal: it has been a standard part of the economic history of every era since the beginning of the industrial revolution. Productivity explosions happen regularly as invention and innovation remake particular "leading sectors"—like air transport in the 1960s, television in the 1950s, automobiles in the 1920s, organic chemicals in the 1890s, and so on back to the original invention of the steam engine to automate the pumping of water out of coal mines. Each of these innovations massively boosted productivity in its particular slice of the economy. Each had diffusion effects that changed economic processes in many other parts of the economy. Each set off its own "long boom." But information technology may well be different.
Information technology is creating tools for thought. The first generations of these tools have certainly spawned a leading sector that has brought enhanced productivity growth and rapid innovation to a particular slice of the economy. These first generations contributed significantly to the end of the post-1973 era of relative stagnation and to the long boom of the 1990s. (As these tools diffuse, they look very likely to do the same for other advanced and advancing economies in the near future.) But that is not the whole story. The revolutionary potential lies within the tools that information technology provides to all economic sectors. These tools will affect every economic activity in which organization, information processing, or communication is important—in short, everything. They open new possibilities for economic organization across the board. They change what can be done and how it can be done across a wide range of industries. Most important, they may well require changes in ideas about ownership, property, and control—the way in which governments regulate economies in the broadest sense of that term. The dynamic, encompassing nature of this transformation creates pitfalls for traditional research strategies. To understand relationships, specify causal links, and design valid measures of complex economic processes is difficult enough when dealing with limited, specific, and fairly conventional arguments and issues. On the other hand, to weave a broad pattern of relationships without precise measures too often produces not great insight but banal superficiality. The more profound the transformation, the sharper the dilemma becomes. As context itself changes, things that were treated (for better or for worse) as parameters become variables. Research problems cannot be so easily isolated nor causal relationships cleanly specified. The logic of this book is to work with these constraints, not to fight them. Building on the proposition that an information revolution is in the process of fundamentally transforming our entire economy, we take a “bottom-up,” inductive approach. Claims of a discrete “Internet economy” or “information economy” are momentary, transitional characterizations. They will soon seem as meaningless as a claim that there is a “fax economy” or a “telephone economy.” Information technologies fade into the background as they become a set of tools for the economy as a whole. The “e” in “e-commerce” will disappear as all commerce becomes organized and integrated into electronic networks. This process happens unevenly, at different rates, and in different ways across the many sectors of an economy. The core of our research strategy is to track that process by examining what is happening in these different sectors.
The application of these tools for thought creates literally millions and millions of micro changes, which vary from industry to industry but sum to revolutionary potential. Their impact is, therefore, disruptive. The information technology revolution is a story about structural change; it is not primarily a macroeconomic or cyclical phenomenon. There is no promise, and little likelihood, of smooth growth, rising stock prices, and government surpluses stretching out to the horizon, nor of permanently low rates of unemployment, interest, and inflation. A different god governs the macroeconomy, who will decide whether and how these productivity-enhancing changes—changes that provide far more than a trivial amount of growth potential—will be well used or wasted.
The New Economy: A Transformative Era The information technology revolution story has three intertwined themes. The first is technology development. The second is innovations in organization and practice. The third is speed and extent—the rate at which the first two stories are unfolding and the global reach of their implications. The technology theme is most familiar. In the 1960s Intel Corporation cofounder Gordon Moore projected that the density of transistors on a silicon chip would double every eighteen months. What came to be called Moore’s Law has been continually wrong—but on the upside. Computing power has more than doubled, and its price has fallen by more than half in the eighteen-month cycle. Consumers now routinely expect that the $1,000 personal computer they buy in a department store will have the processing power of what, five years ago, was a $20,000 workstation. What was once called supercomputing is now packaged in a run-of-the-mill desktop PC. The past forty years have seen something like a billionfold increase in the world’s installed computing power base. There simply is no historical precedent for a technology whose raw measures of capability progress at anything like this rate. And despite repeated roadblocks in semiconductor manufacturing technologies that seem to threaten an imminent slowdown in the cycle, innovation has (until now, at least) successfully overcome the impediments. There is no compelling reason to believe that we have come anywhere near the endgame for Moore’s Law. It is a safe bet that raw processing power will continue to grow at a rate faster than we can figure out what to do with it.
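The arithmetic behind these claims is easy to check. The sketch below compounds the eighteen-month doubling cycle the text describes over the forty-year span it cites; treating capability growth as pure, uninterrupted doubling is the only assumption added here.

```python
# Illustrative compounding of an 18-month doubling cycle (Moore's Law as
# stated in the text) over the 40-year horizon the chapter mentions.
DOUBLING_PERIOD_YEARS = 1.5   # taken from the text's statement of the law
HORIZON_YEARS = 40            # taken from the text's 40-year claim

doublings = HORIZON_YEARS / DOUBLING_PERIOD_YEARS
growth_factor = 2 ** doublings

print(f"doublings over {HORIZON_YEARS} years: {doublings:.1f}")
print(f"growth factor in raw capability: {growth_factor:,.0f}")
# Roughly 1e8 for a single device; the chapter's "billionfold" figure for
# the world's installed base also reflects the growing number of machines,
# not just per-chip capability.
```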
This points to the more fundamental rate-limiting factor in economic transformation—the human systems of organization and innovation. An enormous increase in raw processing power generated by semiconductors is simply an economic potential. It becomes important only if this potential is utilized. Thus the key question as the semiconductor revolution has proceeded has always been, "what is computer power useful for?" The technological determinants of the answer to that question are changing and will continue to change steadily as the price of computing drops, the size of a computer shrinks, and the possibilities for useful applications expand. The organizational determinants of the answer are harder to theorize about simply because there is no equivalent of Moore's Law for human systems. At each point in the past forty years, the critical step in the transformation of technical potential into economic productivity has been the discovery by IT users of how to employ their ever-greater and ever-cheaper computing power to do the previously impossible. In a real sense the leading-edge users and the innovative applications that they have developed have been the drivers or at least the shapers of technological change, because they are the creators of the meaningful demand for better, faster, and cheaper computers. And it is this core demand created by user-side innovation that has sustained and rewarded technological development.

At first computers were used as powerful calculators to perform complicated and lengthy sets of arithmetic operations. The first leading-edge applications of large-scale electronic computing power were military.1 The burst of innovation during World War II that produced the first hand-tooled electronic computers was funded and driven by the demands of war. The Korean War won IBM its first contract to actually deliver a computer: the million-dollar Defense Calculator. The military demand in the 1950s and the 1960s by projects such as Whirlwind and SAGE—a strategic air defense system—both filled the assembly lines of computer manufacturers and trained a generation of engineers.2

1. Even before then the lead user had been the government. Charles Babbage's difference engine was a British government-funded research and development project. The earliest application of large-scale electronic tabulating technology was by the government, specifically the Census Bureau. The national census of 1880 required 1,500 clerks employed as human computers to analyze the data—and it took them seven years to do so. See Anderson (1988). By 1890 the Census Bureau was a test bed for Herman Hollerith's mechanical calculator.
2. Campbell-Kelly and Aspray (1996) quote from Thomas Watson Jr.'s autobiography: "it was the Cold War that helped IBM make itself king of the computer business." SAGE accounted for one-fifth of IBM's work force at its peak. See Watson and Petre (1990). Relying on Flamm, Campbell-Kelly and Aspray state that 2,000 programmer-years of effort went into the SAGE system in the 1950s and early 1960s. Thus "the chances [were] reasonably high that on a large data-processing job in the 1970s you would find at least one person who had worked with the SAGE system." See Flamm (1987); Flamm (1988).

The first leading-edge civilian economic applications of large computing power came from government agencies and from industries like insurance and finance that performed lengthy sets of calculations as they processed large amounts of paper. The U.S. Census Bureau bought the first UNIVAC computer. The second and third orders came from A. C. Nielsen Market Research and the Prudential Insurance Company. The Census Bureau used computers to replace electromechanical tabulating machines. Businesses originally used computers to do the payroll, report generating, and record analyzing tasks that electromechanical calculators had previously performed. But it soon became clear that the computer was good for much more than performing repetitive calculations at high speed. The computer was much more than a calculator, however large and however fast.

The point is that innovative users—in the course of automating existing processes—began to discover how they could employ the computer in new ways. An early innovation was stuffing information into and pulling information out of large databases. American Airlines used computers to create its SABRE automated reservations system—which cost as much as ten airplanes.3 SABRE made it possible to understand in a much more precise way the fine-grained characteristics of demand for air travel. The insurance industry first automated its traditional processes—its back office applications of sorting and classifying. But insurance companies then began to create customized insurance products using newly accessible databases that could be organized, reorganized, queried, and analyzed for data patterns.4 The user cycle became one of first learning about the capabilities of computers in the course of automating established processes, and then applying that learning to generate innovative applications.5

User-driven innovation is aided by rapidly advancing raw technological capability. The growth of computing power has enabled the development of computer-aided design—from airplanes built without wind tunnels6 to

3. SABRE was the first large-scale real-time information processing system. See McKenny (1995).
4. See Baran (1986).
5. The literature on this topic of the lead role played by users in generating innovation is vast. See Lundvall (1985); Lundvall (1988, pp. 349–69); Nooteboom (1999, pp. 127–50); Slaughter (1993, pp. 81–95); Hatch and Mowery (1998, pp. 1461–77).
6. Boeing's 777 is the best-known example, but computer-assisted engineering, design, and manufacture are transforming the entire aerospace industry—not just a single firm or a single product.
pharmaceuticals designed at the molecular level for particular applications. In this area, the computer's major function is neither as a calculator-tabulator nor a database manager but is instead a "what-if machine." The computer creates models of what-if: what would happen if the airplane, the molecule, the business, or the document were to be built up in a particular way. It thus enables an amount and a degree of experimentation in the virtual world that would be prohibitively expensive in resources and time in the real world. The value of this use as a what-if machine took most computer scientists and computer manufacturers by surprise: before Dan Bricklin programmed VisiCalc, who had any idea of the utility of a spreadsheet program? The invention of the spreadsheet marked the entry of computers into the third domain of utility as a what-if machine—an area that today seems as important as the computer's earlier roles as a manipulator of numbers or a sorter of records.

User-driven innovation has a particularly interesting implication for information technology. "What-if" machines can be used, of course, in a self-reflexive way by turning the tools of experimentation back on themselves. In simpler words, computers have become the key design tools for innovations in computing. Today's complex designs for new semiconductors would be simply impossible without automated design tools. The process has come full circle and will continue to chase itself. Progress in computing depends on Moore's Law; and the progress in semiconductors that makes possible the continued march of Moore's Law depends on progress in computers and software. Systems theorists refer to this kind of process as an autocatalytic process. In early 2000 Bill Joy wrote a compelling piece to describe the potential dangers and downsides of technological autocatalysis.7 What may have been lost or inappropriately deemphasized in the discussion surrounding Joy's manifesto was the incredible upside potential of the same processes.

Autocatalytic change can be extraordinarily fast, and it can self-accelerate in surprising ways. Consider the potential for computing with DNA. This lies in the fact that DNA makes possible base 4 computing, which would be significantly more efficient than base 2 (digital) computers. Right now DNA can be manipulated to process information, but only at a very slow speed. Digital computers helped to unravel the structure and function of biological molecules—indeed, one of the most demanding information processing tasks is to determine how a protein strand will fold in on itself to create a molecule in three dimensions. Faster computers and processing algorithms that grew out of this demand created a next generation that helped us to manipulate the biological molecule; and new generations of computers will likely help reconfigure it in ways that permit much faster processing of information in base 4. Just as DNA pushes silicon forward, silicon will push forward DNA.

The speed and extent to which computing of this magnitude can set off autocatalytic innovation and transform economies and societies is ultimately a function of the degree to which processing power is integrated into economic and social processes. This has been happening recently in two quite critical and mutually reinforcing ways. First, computers have burrowed inside conventional products to become embedded systems. This is the notion of the "smart" car, house, toaster, whatever you choose. Second, computers have connected outside according to a set of open standards to create what we call the World Wide Web: a distributed global database of information all accessible through the single global network.

7. Joy (2000).
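To make the "base 4" point above concrete: a DNA strand can be read as a string over four symbols, so each base carries two bits of information. The sketch below is a toy encoding exercise only; the strand is arbitrary and nothing here models an actual DNA-computing scheme.

```python
# Toy illustration: each DNA base (A, C, G, T) is one base-4 digit, i.e.,
# two bits. This shows information density only; it says nothing about the
# speed of manipulating real molecules, which the text notes is very slow.
BASE_TO_BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}
BITS_TO_BASE = {v: k for k, v in BASE_TO_BITS.items()}

def dna_to_bits(strand: str) -> str:
    return "".join(BASE_TO_BITS[b] for b in strand)

def bits_to_dna(bits: str) -> str:
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

strand = "GATTACA"                 # arbitrary 7-base example
bits = dna_to_bits(strand)         # 14 bits
assert bits_to_dna(bits) == strand
print(f"{strand} -> {bits} ({len(bits)} bits in {len(strand)} bases)")
```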
Pervasive Computing: The Microprocessor Becomes Embedded What does it mean to say that computing is becoming pervasive? The new production and distribution processes that pervasive computing makes possible are visible every day at the checkout counter, at the gas pump, and in the delivery truck. At the checkout counter and the gas station, computers scan, price, inventory, discount, and reorder before the groceries enter the bag or the nozzle is rehung. In the delivery truck, handheld computers determine the next stop and record the paperless “paperwork.” But these are actually quite primitive applications precisely because they remain visible. The most important part of pervasive computing is the computers that we do not see. They become embedded in traditional products and alter the way such products “operate” in the broadest sense of the term. In automobiles, antilock brakes, air bags, and engine self-diagnosis and adjustment are performed by embedded microprocessors that sense, compute, and adjust. The level of automotive performance in systems from brakes to emissions control is vastly greater today than it was a generation ago because of embedded microprocessors.8 Today’s automobile is already quite smart. Tomorrow’s “smart car” will integrate additional functions 8. Microprocessors in cars today control windows, door locks, cruise control, braking systems, fuel mix, emissions control, and more. The number of microprocessors in a typical automobile has passed
that will soon become as invisible as the controllers for antilock brakes. At some point the automobile itself becomes more like an information processing system with an engine attached than a several-ton steel mass that can tell the driver about how fast it is going and how much gas it has left. In toys, embedded intelligence rests on very simple computing products. From cash registers and cell phones to hotel doors, elevators, and pacemakers, embedded microprocessors are transforming our world from the inside by adding features of intelligent behavior to potentially all engineered products.9 As product reliability and the trust level of users improve over the next decade, we are certain to see an explosion of intelligence in medical devices. This is far more dramatic than a “wearable” computer. At some point, what is being “worn” or “implanted” becomes as central to the definition of the person as anything that is biological in origin.
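The "sense, compute, and adjust" pattern these embedded controllers follow can be sketched in a few lines. Everything in the example below is hypothetical: the sensor read, the actuator call, and the slip threshold are invented for illustration rather than drawn from any real braking system.

```python
import time

# Hypothetical sense-compute-adjust loop of the kind an embedded controller
# runs continuously; names and thresholds are illustrative only.
SLIP_THRESHOLD = 0.2  # assumed fraction of wheel slip that triggers action

def read_wheel_slip() -> float:
    """Stand-in for a sensor read; a real controller would sample hardware."""
    return 0.05  # placeholder value

def set_brake_pressure(relief: float) -> None:
    """Stand-in for an actuator command."""
    print(f"easing brake pressure by {relief:.0%}")

def control_step() -> None:
    slip = read_wheel_slip()            # sense
    if slip > SLIP_THRESHOLD:           # compute
        set_brake_pressure(relief=0.5)  # adjust
    # otherwise leave the brakes alone

if __name__ == "__main__":
    for _ in range(3):  # a real loop runs indefinitely, far faster than this
        control_step()
        time.sleep(0.01)
```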
Computers Become Linked: The Spread of Networks As the cost of communications bandwidth dropped, it became not only possible but natural to link together individual sensing, computing, and thirty. The hardware cost of these semiconductors was then some $1,500. The software cost of programming and debugging them was perhaps the same. See Mowery and Rosenberg (1998). Note that $3,000 of today’s computing power would have cost $90,000—more than four times the entire price of the automobile—-at 1990’s levels of semiconductor, computer, and software productivity. See James Carbone, “Safety Features Mean More Chips in Cars,” Purchasing Online, September 18, 1998 (www.manufacturing.net/magazine/purchasing/archives/1998/pur0915.98/092enews.htm [February 2000]); “High Tech Industry Positively Impacts Economies, Globally and Locally,” the Kilby Center, September 9, 1997 (www.ti.com/corp/docs/kilbyctr/hightech.shtml [February 24, 2000]). 9. It is difficult to produce reliable estimates of the scope of the embedded microprocessor business. It is, however, possible to see the imprint and importance of this segment in computing in the decisions made by the producers of microprocessors. For example, IBM is ceasing production of PowerPC microprocessors for mass-market microcomputers in order to concentrate on production for high-end embedded sales in automotive applications, communications devices, consumer electronics, and Internet hardware. See “The PowerPC 440 Core: A High-Performance Superscalar Processor Core for Embedded Applications,” IBM Microelectronics Division, Research Triangle Park, N.C. (www.chips.ibm.com:80/ news/1999/990923/pdf/440_wp.pdf [February 2000]). Motorola continues to produce PowerPC microprocessors for use in Apple mass-market microcomputers but has also worked closely with purchasers who pursue applications unrelated to personal computers: a “PowerPC-based microcontroller for both engine and transmission control of next-generation, electronics-intensive automobiles due in 2000” that can handle “the highly rugged automotive environment,” for example. See Bernard Cole, “Motorola Tunes PowerPC for Auto Applications,” EE Times, April 21, 1998 (www.techweb.com/wire/ story/TWB19980421S0011 [February 2000]). Intel as well has put a considerable share of its mammoth venture capital funding toward enhancing its competitiveness in the market for embedded chips. See Crista Souza, Mark Hachman, and Mark LaPedus, “Intel Weaves Plan to Dominate Embedded Market,” EBN Online (www.ebnonline.com/digest/story/ OEG19990604S0024 [February 2000]).
storage units. The key point is not that rapid transmission has become technically feasible,10 but that the costs of data communication are dropping so far and fast as to make the wide use of the network for data transmission economically feasible for nearly every use we can think of. At the asymptote, the marginal cost of sending a piece of information around the world in real time approaches zero. Sun Microsystems uses the advertising slogan "the network is the computer" to describe the scale and scope of rethinking of traditional processes that this may entail.

Leading-edge users took advantage of early network systems to create new applications in their pursuit of competitive advantage. The origins of today's Internet in the experimental ARPANET, funded and built by the Defense Department's Advanced Research Projects Agency (ARPA), are well known. Networking began primarily in the form of private corporate networks (or, in the case of the French Minitel, a public network with defined and limited services). Business experimentation began. And data communications networks started down a road of exponential expansion as experimenting users found new applications and configurations.11
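The network effect described in the next passage, and attributed in note 12 to Bob Metcalfe, helps explain why this expansion fed on itself: the number of possible pairwise links among n users grows as n(n-1)/2, roughly the square of n, so each new user raises the value of the network to everyone already on it. A toy calculation, with purely illustrative user counts:

```python
# Toy network-effect arithmetic: with n users, the number of possible
# pairwise connections is n * (n - 1) / 2, which grows roughly as n**2
# (the relationship note 12 attributes to Metcalfe). The user counts below
# are illustrative, not data from the chapter.
def possible_links(n: int) -> int:
    return n * (n - 1) // 2

for users in (10, 100, 1_000, 10_000):
    print(f"{users:>6} users -> {possible_links(users):>12,} possible links")
# Doubling the user base roughly quadruples the possible links, which is
# why build-out can feed on itself.
```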
Computers Become Hyper-Linked: The Coming of the Internet But few saw the next iteration of the potential of high-speed data networking until the http protocol and the image-displaying browser—the components of the World Wide Web—revealed the potential benefits of linking networks to networks. Every PC suddenly became a window onto the world’s data store. And as the network grew, it became more and more clear that the value of the network to everyone grew as well. For the more people there are on a network, the greater is the value of a network to each user—a principle that is now well known as Metcalfe’s Law.12 The build-out of the Internet has been extraordinarily rapid in part because of this network effect. It was also so rapid because the Internet was initially run as a set of protocols over the existing voice telecommunications infrastructure. This was not anything like an optimal foundation for 10. It was technically feasible, after all, to send bits across 4,000 miles at lightspeed during the reign of Queen Victoria—by telegraph. But it was very costly. See Standage (1998); Stephenson (1996); and Yates and Benjamin (1991). 11. On the role of users in promoting the trajectory of innovation in the telecommunications and data networking industries, see Borrus and Bar (1994 ); and Bar and others (1999). 12. After Ethernet inventor and 3Com founder Bob Metcalfe, who said that the value of a network is proportional to the square of the number of nodes on the network. See Shapiro and Varian (1999, pp. 173–225).
packet-switched data traffic, but it worked nonetheless. Even before the new technologies designed from the ground up to manage data communications emerged—and they will replace data-over-voice—the global Internet had already established its incredible reach.13 More than 60 million different computers were accessible over the Internet by late 1999, up from less than 1 million in 1993 and less than 10 million in 1996. Figure 1-1 shows the rapid speed of Internet diffusion around the world.

Some elements of the next generation of data networks are already evident. First, for consumers and small business, one dramatic advance will be broadband to the home to create high-bandwidth and low-latency connections. The problem of the "last mile" is being solved by cable, digital subscriber line technology (DSL), satellite, and other technologies that either work with or simply bypass the fact that most houses were built to maximize privacy, not connectivity, and have only a small copper fiber information pipe leading out to the world. To download a data file or follow hyperlinks will take a fraction of the time previously required.14 The acceleration in speed will change the kinds of tasks that can be accomplished over the Internet. The increase in bandwidth and decrease in latency will mean not only a faster Internet; it will also mean a different Internet, with much more sophisticated applications. We can see this process at work in the sudden explosion of demand for products like Napster that allow users to assemble in transitory, ad hoc networks to trade large data files—in this case, music. This development was unexpected, and it poses a huge challenge—and also a huge opportunity—to the recorded entertainment industry.

Second, wireless voice networks will soon be as extensively deployed as the wired phone network. Widely diffused wireless data networks will set off another round of experimentation and learning, a round that is already visible (for example, in Finland) in the form of something called "m-commerce" (the "m" stands for mobile). This round of network deployment already brings new applications, challenges to established equipment and software players, and struggles over standards complicated by the fact that wireless providers do not yet know which wireless applications will prove to be truly useful.

Third, the capacity and cost of the very backbone of the network will evolve dramatically over the next years, bringing new architectures, lower costs, ongoing experimentation, and new applications. Current experiments with Internet 2 suggest that we will soon be searching for applications to fill available bandwidth rather than the other way around. There will likely be a veritable tsunami of new capacity that brings technically advanced applications and dropping costs.15

Figure 1-1. Households with Internet Access, 1995 and 2000 (millions of households), by region: North America, Europe, Asia-Pacific, and others. Source: Jupiter Communications.

13. Ever since 1987, the Internet Software Consortium has run a semiannual survey to count the number of "hosts" on the Internet. By July 2000 their count exceeded 93 million computers, all accessible one to another through the Internet. In July 1999 there were 56 million. In October 1990 there were only 300,000 computers on the Internet. In August of 1981 there were only 213. "Internet Growth (1981–1991)" (www.isc.org/ds/rfc1296.txt [January 2001]).
14. Whether the first generation of high-bandwidth low-latency connections will be cable modem, DSL, or wireless connections will be a matter of market competition heavily influenced by policy choices. But the connections will arrive quickly. And there are subsequent generations of still higher bandwidth connections on the horizon. Kim Maxwell forecasts video-on-demand beginning in 2003 and fiber optic cable to the home starting around 2015. See Maxwell (1999). Note, however, that as of the end of 1999, fewer than 2.5 million people worldwide had broadband connections to the Internet. See "Internet Access Technology Moving to the Masses, Reports Cahners In-Stat Group" (www.instat.com/pr/2000/mm9914bw_pr.htm [January 2000]). For an analysis of the importance of low-latency and high-speed connections in making the Internet useful, see Jakob Nielsen, "Usable Information Technology" (www.useit.com/ [January 2000]).
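The host counts in note 13 give a sense of how steady this build-out has been. A small sketch, using only the figures quoted in that note, computes the overall growth factor and the implied doubling time between the 1981 and 2000 surveys:

```python
import math

# Internet host counts as quoted in note 13 (Internet Software Consortium
# survey); the years are approximate survey dates.
hosts = {1981: 213, 1990: 300_000, 1999: 56_000_000, 2000: 93_000_000}

start_year, end_year = 1981, 2000
factor = hosts[end_year] / hosts[start_year]
years = end_year - start_year
doublings = math.log2(factor)

print(f"growth factor {factor:,.0f}x over {years} years")
print(f"about {doublings:.1f} doublings, i.e. one roughly every "
      f"{years / doublings:.1f} years")
```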
Networks Transform Organizations

But the full story of the information transformation cannot be told just by recounting the sequence of technologies. A focus on the numbers that describe technological advance and diffusion hides much of the real story: how the growth of an information network will transform organizations and the dynamics of competition. It is not just imprecise but fundamentally misleading to measure this transformation through estimates of "e-commerce" or the "Internet economy." One set of numbers places the Internet economy at $300 billion in 1998 and $400 billion in 1999, accounting for 1.2 million jobs.16 Another set of numbers reports an Internet economy only one-third that size.17 Of course, much of the difference springs from where different analysts draw the line between "Internet" and "non-Internet." But to our minds, the major lesson is that it is already becoming impossible to talk about an "Internet economy" per se. There soon will be no slice of the economy that can be carved out of the rest and assigned to the "Internet," if there is such a thing today. Instead, all of the economy will be linked to the Internet. Every business organization and consumer marketplace can make use of the information processing and communications tools that constitute this current wave of technological advance.

The question then becomes: how will the entire economy be linked into information processing and data communications? The short answer is that we do not yet know (although there is a huge industry in business books that try to make the case that we do). There are several broad analytics that are suggestive. Nicholas Negroponte stressed in 1996 that, because the costs of transporting and transforming physical goods can only come down so far while the costs of transporting and transforming information can
15. Vinod Khosla, “The Terabit Tsunami” slide presentation, Kleiner Perkins Caufield & Byers, Baltimore, December 17, 1999 (
[email protected]). 16. These are the results from a series of Cisco-sponsored studies of the “Internet economy.” See the University of Texas’s Center for Research in Electronic Commerce, “Measuring the Internet Economy” (www.Internetindicators.com/indicators.html [February 2000]). 17. See Robert Atkinson and Randolph Court, 1999, “The New Economy Index” (Washington: Progressive Policy Institute) (www.neweconomyindex.org/).
approach zero, there are powerful incentives to convert as much of the economy as possible from “atoms” to “bits.” Graciela Chichilnisky makes a related argument about how knowledge-intensive growth can replace resource-intensive growth.18 But we know that information (or knowledge) is not uniformly communicable. Markets for knowledge are no more self-organizing than are markets for goods. Information does not “want to be free” any more than it “wants” to be anything else—it responds (or more precisely, those who create or control information respond) to incentives that are set in markets and in policy. Some human and economic processes clearly benefit from making a transition away from physical space into information space. For example, it will be much easier to design pharmaceuticals in the realm of information—that is, with advanced knowledge of the human genome at hand, allowing us to “build” custom chemical interventions tailored to the genetic locus of a disorder—than it is to test a random assortment of chemicals in a test tube to see which cells they kill and which they do not. Music does not suffer from transmission in a digital form (so long as it can be “reassembled” perfectly at the other end). But can the same be said of emotion? Of the unique experience of a perfect meal at a fabulous restaurant? Or even of the economist’s stock example of a local service, the simple haircut? The last few years have produced lots of anecdotes and some systematic evidence of a small portion of the kinds of changes we should expect. Traditional businesses that act as intermediaries—like stockbrokers and travel agents—will be irrevocably altered. Traditional products like automobiles will be marketed and serviced in new ways. Stores will not likely disappear, but the mix of stores and what stores do will change. New ways of reaching customers in both time and space will in turn drive new ways of organizing production and delivering goods to consumers. Today we can see a range of strategic experiments, in the form of new companies trying to exploit the web and established companies trying to defend their positions.19 But we simply do not know which of these experiments in corporate information and network strategy will be successful. All business plans are predictions, and all these predictions will be wrong.
18. Negroponte (1996); Chichilnisky (1998). 19. For a brief survey of some of these experiments and their consequences, see Froomkin and DeLong (2000).
The Future: The Emergence of the E-conomy

In the real world, technology uptake and utilization by businesses, governments, and consumers are nearly unpredictable. Uses emerge within a process of search and experimentation—and may well be something that we do not now expect.20 Economic historian Paul David points out that it took nearly half a century for business users to figure out the possibilities for increased efficiency through factory reorganization opened up by the electric motor.21 Finding the most valued uses for the next wave of computer and communications technology may not take quite as long, but it will take time and probably a longer time than many expect.

An era of profound experimentation is a natural and desirable thing. Changes in the powers and capabilities made available by modern information technologies are redefining efficient business practices and sustainable market structures. They are redefining which activities belong inside a firm and which can be purchased from outside. They are changing business models and market structures. Those changes are only beginning. It is anyone's guess and any player's bet what the final outcome will be. From a market ecology perspective, the more broadly we experiment and allow failures to emerge, the faster we will learn. Of course, as in any competitive ecology, there is sure to be significant roadkill along the way.

In the mid-1990s proprietary online information and communication services were said (with great certainty) to be the killer application. In 1998 selling things like pet food over the web to individual consumers was said, with equal certainty, to be "it." In 2000 it was business-to-business (B2B) auctions. At each starting moment there were compelling arguments about why this particular application was the "right" one. A year later there were equally compelling arguments about why it was totally "wrong." Part of this intellectual churn can be put down to media hype and the herd psychology of venture capital. But the more important part stems from a more profound cause. The uncertainty is fundamentally real, not just a function of faulty or hasty thought. What we know for sure is simply that at almost every stage up to today, the killer application of each wave of technological innovation has been a surprise.

20. In part because these elements of economic destiny are not an equilibrium position predictable in advance but are path-dependent. See David (1993); Rosenberg (1996); and Dosi and others (1992).
21. David (1991, pp. 315–47).
The E-conomy Unfolds: Innovations in Organization and Business Practice

Technology and innovations in business organization and practice are yoked together—each pulls the other forward. Just as technology usually advances through experimental trial and error, innovations in business practice evolve out of day-to-day efforts to resolve real problems or take advantage of perceived opportunities. Organizations have their own ecology. Out of the swirl of fads, frustrations, tactics, and strategies—like just-in-time, total quality, downsizings, knowledge management, outsourcings, strategic alliances, mergers, demergers, spin-offs, and start-ups—has emerged a new reality. The ecology as a whole constitutes a rapidly entrepreneurial environment that is able to innovate and commercialize at much faster speeds than before. This rests on at least two important and interrelated changes in business practice: new responses to the "innovation dilemma" and to the "production challenge."
Resolving the Innovation Dilemma It is often the case that large established firms are not very good at fully developing and commercializing technologies that disrupt their existing markets and procedures. The reasons are endemic to large organizations. Parts of a large company, often the biggest and most powerful parts, are not eager to contemplate the risky development of a new technology that could end up cannibalizing their market and destroying their division. Typically, that group will doubt the feasibility, the reliability, and the marketability of the potential technology. New markets are hard to imagine and harder even to assess quantitatively. Ironically, the more effectively a company is tied into its network of customers and suppliers, the more likely it is to sustain a course of innovation that maintains its position within existing markets and technologies. Thus the less likely it will be to undertake radical innovation. This often looks like a winning strategy. After all, substantial enhancements to existing product lines can generate considerable returns. This creates an innovation dilemma. Companies that are responsive to their customers actually risk getting locked into a set of arrangements that precludes them from grasping the competitive advantages of innovation.22 22. See Christensen (1997).
The examples are legion. AT&T asserted that an Internet-style communications system was impractical. Motorola, the leader in analog mobile phones, missed the step in the shift to digital. IBM missed Internet routers. Microsoft came late to the web browser, web server, and web development tools.23 The dilemma is particularly poignant when an established company generates the technology but is unable to capture its value. The creation at Xerox PARC of the functioning Graphical User Interface, the page description language, the Ethernet—and their commercial exploitation by others (Apple and Microsoft, Adobe, 3Com)—is simply one of many examples of breakthrough technology lost inside of excellent established companies.24 This organizational dilemma is a major reason why start-ups and entrepreneurial companies have been the drivers of much of the radical innovation in the transition to an e-conomy. These companies have defined and developed new industries. They are the major source of Schumpeterian competition, which simply bypasses price competition in existing markets to build a business through radical innovation. Entrepreneurial start-up companies, however, face substantial obstacles. They require money, help developing business plans and strategies, supplier contacts, access to clients, legal advice, production and logistics services, and so on. The list of things that start-ups need but cannot generate easily from their own resources is very long. America in the 1980s and 1990s built up a business environment that made it not only possible but in many cases easy and straightforward to establish an entrepreneurial start-up. Early venture money paved the way in making available the funds to start and develop a company. Changes in the prudent-man rule allowed institutional money to enter the venture business and so greatly enlarged its scale.25 The scale of investment changed, and funds were suddenly available for the venture world to move from niche to centerpiece. In a similar fashion, the growth of compensation through stock options that reward success with stunning wealth allowed founders to share a significant portion of the risk and rewards of a new company with like-minded employees. The institution of stock options meant that a cut in pay and a move across country could suddenly represent an opportunity, not a failing—if the reward were a share in value 23. See Ferguson (1999). 24. See Hafner (1996); Hiltzik (1999); and Smith and Alexander (1988). 25. See Lerner (1999).
of a venture start-up. And large established firms followed by seeking ways to encourage and to participate in spin-outs, start-ups, and venture funds.26 These elements make up part of a “Silicon Valley System” (that is no longer geographically limited to Silicon Valley, of course). It is a set of social institutions (such as research universities, venture capitalists, and specialized law firms) and market institutions (such as an extremely flexible labor market, incentive compensation, financial capital, and ultra-high-skilled people from the entire world)—institutions that together make it possible for an entrepreneurial company to bring innovations to market quickly and at scale. This new industrial-economic system has become a critical growth engine for the world and a strong source of comparative advantage for America—and will be until it is successfully imitated elsewhere.27
26. The United States has been particularly successful in facilitating the activities of venture capitalists. In 1999 venture capital investments reached record levels in the United States, amounting to $48.3 billion, which represents a 152 percent increase over the $19.2 billion figure for 1998. Internet-related firms attracted two-thirds of this sum for 1999 with $31.9 billion. Northern California, with $16.9 billion in venture funds, was by far the largest regional recipient of venture largess, almost double the next largest region, the Northeast. See Venture Economics News, February 8, 2000 (www.securitiesdata.com/news/news_ve/1999VEpress/VEpress02_08_00.html [February 2000]); and xent.ics.uci.edu/FoRK-archive/august97/0400.html (February 2000).
27. On the institutional ecology of the Silicon Valley system, see Kenney and von Burg (1999); Saxenian (1994); and Cohen and Fields (1999).

The Production Challenge

Unexpectedly and abruptly in the 1980s, Japanese consumer durable and electronics products surged into American markets. Previous import surges in labor-intensive products such as shoes, apparel, and low-end assembled goods such as toys had forced significant reorganization in American industry. But they did not challenge the sense that American producers and production methods defined advanced manufacturing and advanced industry. The Japanese challenge was fundamentally different. Japanese competitive strength (particularly in autos and electronics) was the result of fundamental innovations in a "lean production system" that simultaneously eliminated inventories and their costs, permitted constant quality improvement, and reduced cost.

The shock of a basic challenge to position in the symbol of the industrial age, the auto, and the symbol of the emerging electronic age, the basic memory chip, was considerable. It forced American and European producers to fundamentally reorganize their production and business practices.28 This was a messy business, made more difficult by a severely overvalued dollar in the mid-1980s. The short-term result was the hollowing-out of large chunks of American manufacturing capacity—and in the process the destruction of a lot of valuable human- and firm-specific capital.29 Nevertheless, in the medium term American companies proved remarkably successful at adopting their own version of "lean production" innovations. The Japanese manufacturers may have taught American producers a painful lesson, but the American producers really learned. By the mid-1990s—with a stronger yen and reconfigured American manufacturing processes—the balance of manufacturing advantage in high-technology industries appeared much more even.

The eclipse of the Japanese challenge came about partly because the leading edge of consumer electronics shifted from broadcast entertainment—TVs, VCRs, radios, and related products—to wireless- and computer-based products where U.S.-based producers had set standards. Partly it came about because companies such as Hewlett Packard (HP) now understood the long-run benefits from learning by doing and how large the benefits were that came from controlling the low end of a market through high-quality volume production, even if cost accountants told top managers that low-end margins were low. With the inkjet printer, HP dominated the market by systematically defending the bottom end of the market as it introduced new low-cost products.30

But a larger part of the change came with a finer division of labor. Producers discovered that they could lower their costs by concentrating on what they did best and contracting to buy the rest from those with a firm-specific advantage in productivity or a nation-specific factor-cost-based comparative advantage. Outsourcing across borders, a cross-national pro-
28. The reorganization of manufacturing techniques and the principles of lean production are described in detail in Womack and others (1990). For a somewhat more critical view of the lean production system, see Kenney and Florida (1988). 29. See Lawrence (1984); “The Hollow Corporation,” Business Week, March 3, 1986, pp. 57–85; Harrison and Bluestone (1982); and Piore and Sabel (1984). 30. HP introduced the inkjet printer to maintain market share in less-expensive printers. Maintaining inkjet market share increased HP’s bargaining position vis-à-vis Canon, the supplier of the laser printing engine itself. (The situation is further complicated by the fact that the inkjet printer was HP-developed technology while the laser printer was not, and HP had a bias toward invented-here technology.) Nevertheless, this strategy contrasts with the classic strategy of defending the high end of the market. See Cohen and Zysman (1987).
duction system, and the emergence of contract manufacturing have been at the heart of the solution of the production dilemma. Better communications have enabled firms to implement this “outsourcing” strategy. The ability to use modern data communications networks to transmit information allows client firms to specify in great detail what, exactly, they want their contractors to do. In a previous generation, with information flow limited to telephone, fax, mail, and air couriers, a lot of tacit knowledge was necessary in order for work to be distributed. This included knowledge, for example, about how the client branch of the organization would use the output and what the client organization’s default operating procedures were. Such tacit knowledge could best be gained through long experience. Hence large multidivisional enterprises that allowed the building within the enterprise of this tacit knowledge were an attractive organizational form. The increase in bandwidth has allowed explicit directions and thick presentation of the overall project to substitute in considerable measure for tacit knowledge and experience. It has allowed for a much finer division of labor and the creation of what we now call contract manufacturing. Because the world’s nations are so highly differentiated in terms of labor skills and labor costs, the greatest benefits to producers from the finer division of labor may well come from the possibility of extending the firm’s division of labor across nations. The development of a truly innovative production system took place in several stages.31 First came the shift from a market dominated by integrated producers to one in which firms located anywhere in the disintegrated value chain can potentially control the evolution of key standards and in that way define the terms of competition—not just of their particular segment, but critically in final product markets as well. Market power shifted from the assemblers (such as Compaq, Gateway, IBM, or Toshiba) to key producers of components (such as Intel); operating systems (such as Microsoft); applications (such as SAP, Adobe); interfaces (such as Netscape); languages (such as Sun with Java); and to pure product definition companies (like Cisco Systems and 3COM). What all of these firms have in common is that, from quite different vantage points in the value chain, they all own key technical specifications that have been accepted as de facto product standards in the market. This was a key signal of how
31. Sturgeon (1999); Sturgeon (1997a); and Sturgeon (1997b).
disruptive start-up companies began to define the direction and fate of the industry.32

Second, companies that had found production a weakness began to outsource both component production and assembly. New highly flexible and adaptable production systems emerged out of this process. Cross-National Production Systems (CNPS) is a convenient label to apply to the consequent disintegration of the industrial value chain into constituent functions that can be contracted out to independent producers wherever those companies are located in the global economy. And such independent producers can locate wherever factor costs and local levels of technological development provide a comparative advantage.33 CNPSs take advantage of an increasingly fine division of labor both between firms and between nations. The networks permit firms to weave together the constituent elements of the value chain into competitively effective new production systems while facilitating diverse points of innovation. They are not principally about lower wages as such, nor about access to markets and natural resources—although these objectives often motivated initial investments. Rather, they are about the emergence of locations that can deliver different mixes of technology and production at different cost-performance points.

Third, and perhaps most important, CNPSs imbued supply chain management with a strategic meaning. This set the stage for companies such as Dell to integrate marketing and production and convert themselves into service businesses tying the design, production, and delivery of the product directly to the customer.34 But there is still a physical product at stake,

32. On the theoretical and historical process of standard setting, see David (1987); and David and Greenstein (1990). David makes a distinction between “standards agreements” that are negotiated and “unsponsored standards” that arise more generally, even spontaneously, in competitive environments. While many of these unsponsored standards may emerge as optimal solutions to specific technological problems, it is sometimes the case that standards result from initial first-mover advantage, that is, from initial specifications of a new technology established by start-up firms. Once established, standards create positive feedbacks, lock-in, and path dependence owing to high switching costs that ensue as standards diffuse. David (1993); see also Shapiro and Varian (1999). On the role of users in standard setting, see Borrus and Zysman (1997).
33. On cross-national production systems and networks of companies, see Stephen S. Cohen and Michael Borrus, “Networks of Companies in Asia,” Berkeley Roundtable on the International Economy, 1996 (brie.berkeley.edu/courses/sc/cp221/pascal.pdf [April 13, 2001]); Gereffi and Korzeniewicz (1994); Reich (1994); and Cohen and Guerrieri (1994).
34. On the reconfiguration of value chains and the new links emerging within firms between production, procurement, and sales, see Kenney and Curry (1999b); “Business to Business E-Commerce” (1999); “The Net Imperative: Business and the Internet,” Economist, June 26, 1999, pp. 5–40; U.S. Department of Commerce (1998), app. 3; and Kenney and Curry (1999a).
and much of the service that a company like Dell provides lies in organizing the production, marketing, delivery, and customization of that product for a particular need. To recognize how manufacturing and production still matter in the world of e-commerce means grasping the evolving place that production has in a system set up to deliver either a product or a service to a consumer.
Tracking the Transformation

The e-commerce transformation represents a series of remarkable opportunities for businesses, governments, and other organizations to remake themselves, re-create what it is that they can do, and reconstruct their relationships with customers, citizens, and constituents. It is also a remarkable opportunity for social scientists. This is not a separate research domain for a small and specialized group of observers interested in business evolution and the politics of technological change. It is not simply a productivity phenomenon (of greater or lesser magnitude). It is a social, economic, organizational, legal, and political phenomenon all at once—and may yet be more than that, extending to a phenomenon of consciousness as well. It is clear that a book like this is only a start in what will and should be a much broader process of understanding the transformation.

Our analytic approach at this stage is middle of the road. Between relatively esoteric debates over the precise macroeconomic measures of productivity and hugely speculative and vague arguments about radical changes in society and consciousness, there lies a more grounded analytic approach based in empirical research about ongoing and foreseeable changes in business practices within sectors. It is possible to extract from that data a set of general themes. It is also possible to set some contours for a research strategy moving forward.

Technological tools are developing much more quickly than are the human and organizational systems that make use of them. Governance policies follow, typically, yet another step behind. Yet expectations about policy as well as real changes in policy set parameters around experimentation with business models. Since business model change is the central organizational driver of the transformation, these policy choices are key to the way in which the revolution unfolds.

Libertarian fantasies of cyberspace as a policy-free zone are (thankfully) a thing of the past. The question now is what kind of governance and from
where? Clearly a set of existing rules is not easily adapted to this new environment. To govern the e-conomy will mean updating old understandings, rules, and bargains all at once. And much of this will have to be done globally, or at least internationally, as well as domestically. It is a tough agenda. We hope the sectoral studies in this book serve to clarify some of the issues that will be dealt with along the way.
References

Anderson, Margo. 1988. The American Census. Yale University Press.
Bar, François, and others. 1999. “Defending the Internet Revolution: When Doing Nothing Is Doing Harm.” Working Paper 12. Berkeley Roundtable on the International Economy (August).
Baran, Barbara E. 1986. “The Technological Transformation of White Collar Work: A Case Study of the Insurance Industry.” Ph.D. dissertation, University of California, Berkeley.
Borrus, Michael, and François Bar. 1994. “The Future of Networking.” Research Paper. Berkeley Roundtable on the International Economy.
Borrus, Michael, and John Zysman. 1997. “Globalization with Borders: The Rise of Wintelism as the Future of Global Competition.” Industry and Innovation 4 (2): 141–66.
“Business to Business E-Commerce.” 1999. Business 2.0 (September): 84–124.
Campbell-Kelly, Martin, and William Aspray. 1996. Computer: A History of the Information Machine. Basic Books.
Chichilnisky, Graciela. 1998. “The Knowledge Revolution.” Journal of Trade and Economic Development 7 (1): 39–54.
Christensen, Clayton M. 1997. The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail. Harvard Business School Press.
Cohen, Stephen S., and Gary Fields. 1999. “Social Capital and Capital Gains in Silicon Valley.” California Management Review 41 (2): 108–30.
Cohen, Stephen S., and Paolo Guerrieri. 1994. “The Variable Geometry of Asian Trade.” Working Paper 70. Berkeley Roundtable on the International Economy.
Cohen, Stephen S., and John Zysman. 1987. Manufacturing Matters: The Myth of the Post-Industrial Society. Basic Books.
David, Paul A. 1987. “Some New Standards for the Economics of Standardization in the Information Age.” In The Economic Theory of Technology Policy, edited by Partha Dasgupta and P. L. Stoneman. Cambridge University Press.
———. 1991. “Computer and Dynamo: The Productivity Paradox in a Not-Too-Distant Mirror.” In Technology and Productivity: The Challenge for Economic Policy, 315–47. Paris: OECD.
———. 1993. “Historical Economics in the Long Run: Some Implications for Path Dependence.” In Historical Analysis in Economics, edited by Graeme Donald Snooks, 29–40. London: Routledge.
David, Paul A., and Shane Greenstein. 1990. “The Economics of Compatibility Standards: An Introduction to Recent Research.” Economic Innovation and New Technology 1 (1): 3–41.
Dosi, Giovanni, and others, eds. 1992. Technology and Enterprise in Historical Perspective. Oxford: Clarendon Press.
Ferguson, Charles. 1999. High Stakes, No Prisoners. Times Books.
Flamm, Kenneth. 1987. Targeting the Computer: Government Support and International Competition. Brookings.
———. 1988. Creating the Computer: Government, Industry, and High Technology. Brookings.
Froomkin, A. Michael, and J. Bradford DeLong. 2000. “Some Speculative Microeconomics for Tomorrow’s Economy.” First Monday 5 (2).
Gereffi, Gary, and Miguel Korzeniewicz, eds. 1994. Commodity Chains and Global Capitalism. London: Praeger.
Hafner, Katie. 1996. Where Wizards Stay up Late: The Origins of the Internet. Simon and Schuster.
Harrison, Bennett, and Barry Bluestone. 1982. The Deindustrialization of America: Plant Closings, Community Abandonment, and the Dismantling of Basic Industry. Basic Books.
Hatch, Nile W., and David C. Mowery. 1998. “Process Innovation and Learning by Doing in Semiconductor Manufacturing,” part 1. Management Science 44 (11): 1461–77.
Hiltzik, Michael. 1999. Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. New York: HarperBusiness.
Joy, Bill. 2000. “Why the Future Doesn’t Need Us.” Wired 8 (4): 238–62.
Kenney, Martin, and James Curry. 1999a. “Beating the Clock: Corporate Responses to Rapid Change in the PC Industry.” California Management Review 42 (1): 8–36.
———. 1999b. “E-Commerce: Implications for Firm Strategy and Industry Configuration.” Working Paper 2. Berkeley Roundtable on the International Economy.
Kenney, Martin, and Richard Florida. 1988. “Beyond Mass Production: Production and the Labor Process in Japan.” Politics and Society 16 (1): 121–58.
Kenney, Martin, and Urs von Burg. 1999. “Technology, Entrepreneurship and Path Dependence: Industrial Clustering in Silicon Valley and Route 128.” Industrial and Corporate Change 8 (1): 67–103.
Lawrence, Robert Z. 1984. Can America Compete? Brookings.
Lerner, Josh. 1999. The Venture Capital Cycle. MIT Press.
Lundvall, B. A. 1985. Product Innovation and User-Producer Interaction. Aalborg University Press.
———. 1988. “Innovation as an Interactive Process: From User-Producer Interaction to the National System of Innovation.” In Technical Change and Economic Theory, edited by Giovanni Dosi and others, 349–69. London: Pinter.
Maxwell, Kim. 1999. Residential Broadband: An Insider’s Guide to the Battle for the Last Mile. John Wiley and Sons.
McKenny, James. 1995. Waves of Change: Business Evolution through Information Technology. Harvard Business School Press.
Mowery, David, and Nathan Rosenberg. 1998. Paths of Innovation. Cambridge University Press.
Negroponte, Nicholas. 1996. Being Digital. Vintage Books.
Nooteboom, Bart. 1999. “Innovation, Learning and Industrial Organization.” Cambridge Journal of Economics 23 (2): 127–50.
Piore, Michael, and Charles Sabel. 1984. The Second Industrial Divide. Basic Books.
Reich, Robert. 1994. The Work of Nations. Basic Books.
Rosenberg, Nathan. 1996. “Uncertainty and Technological Change.” In Technology and Growth, edited by Jeffrey C. Fuhrer and Jane Sneddon Little, 91–110. Federal Reserve Bank of Boston.
Saxenian, AnnaLee. 1994. Regional Advantage: Culture and Competition in Silicon Valley and Route 128. Harvard University Press.
Shapiro, Carl, and Hal Varian. 1999. Information Rules: A Strategic Guide to the Network Economy. Harvard Business School Press.
Slaughter, Sarah. 1993. “Innovation and Learning during Implementation: A Comparison of User and Manufacturer Innovations.” Research Policy 22 (1): 81–95.
Smith, Douglas, and Robert Alexander. 1988. Fumbling the Future. Morrow.
Standage, Tom. 1998. The Victorian Internet. New York: Berkley Books.
Stephenson, Neal. 1996. “Mother Earth, Mother Board.” Wired 4 (12): 98–160.
Sturgeon, Timothy J. 1997a. “Does Manufacturing Still Matter? The Organizational Delinking of Production from Innovation.” Working Paper 92B. Berkeley Roundtable on the International Economy.
———. 1997b. “Turnkey Production Networks: A New American Model of Industrial Organization?” Working Paper 92A. Berkeley Roundtable on the International Economy.
———. 1999. “Turn-Key Production Networks: The Organizational Delinking of Production from Innovation.” In New Product Development and Production Networks: Global Industrial Experience, edited by Ulrich Juergens. Berlin: Springer Verlag.
U.S. Department of Commerce. 1998. The Emerging Digital Economy.
Watson, Thomas Jr., and Peter Petre. 1990. Father and Son and Company. London: Bantam Press.
Womack, James P., and others. 1990. The Machine That Changed the World. Simon and Schuster.
Yates, JoAnne, and Robert I. Benjamin. 1991. “The Past and Present as a Window on the Future.” In The Corporation of the 1990s: Information Technology and Organizational Transformation, edited by Michael S. Scott Morton, 61–92. Oxford University Press.
2
The Construction of Marketplace Architecture
An agent is at hand to bring everything into harmonious cooperation, triumphing over space and time, to subdue prejudice and unite every part of our land in rapid and friendly communication . . . and that great motive agent is steam.
The commercial release of the first-version Netscape browser and Netsite server signaled the transformation of the Internet from an elite network reserved for advanced research and academics into a mass medium. As the number of dot-com sites quickly outpaced the dot-mil and dot-edu pioneers, users began to see the Internet’s tremendous potential for transforming commercial interactions. Sweeping predictions accompanied this transition. The Internet was going to transform economic activity, yielding significant increases in efficiency and productivity. Fueling this confident optimism was the expectation that the Internet would usher in perfect markets that would in turn replace traditional, inefficient corporate hierarchies and supply chains. Internet-based
market processes would yield flatter organizations, disintermediate economic relationships, and give equal power to all market participants. To a large extent, these hopes were rooted in the characteristics of Internet technology and of the innovation process that saw its emergence. The Internet’s development was user-driven and bottom-up. This resulted in a decentralized network, where any node can become part of the internetwork as soon as it speaks “Internet protocol” (IP), the common language. Governance of the Internet was itself decentralized, largely outside the hands of traditional government institutions, and carefully watched by a wide array of private individuals and institutions. Unlike previous communication networks, the Internet seemed largely self-governing and allowed any individual or organization to participate on an equal footing. The hopes were that these characteristics of the technology would simply carry over to the economic processes that make use of the Internet—that decentralized technology would naturally lead to decentralized outcomes in the use of the technology. The Internet’s liberating technology would drive market structure: the low cost of adoption and the endless range of new applications would lower barriers to entry, decentralize economic power, and thereby democratize society and empower individuals.1

With the benefit of hindsight, we know today that previous infrastructures, such as the railroad, fundamentally transformed the structure and efficiency of our economies. Indeed, “steam-commerce,” as the economic transformation brought on by the railroads might have been called, saw profound reorganization of productive activities within integrated multidivisional corporations. It led to sweeping restructuring of supply chains and of markets for raw materials and finished products. It allowed firms to draw on vastly broader labor markets and sources of inputs. This resulted in the reorganization of marketplaces around the possibilities created by the railroad.2

The real dimensions of such change are always difficult to gauge while the transformation is under way. Several years after the onset of the commercial Internet, as some dot-com pioneers stumble, we begin to see how some early predictions missed the mark. The true economic impact of the
1. This tendency to assume that a decentralized technology naturally leads to decentralized and democratic uses is what we have described as “the Jeffersonian Syndrome.” François Bar, John Richards, and Christian Sandvig, “The Jeffersonian Syndrome: The Predictable Misperception of the Internet’s Boon to Commerce, Politics, and Community” (www.stanford.edu/~fbar/Publications/jeffersoniansyndrome.PDF [March 2000]). 2. Chandler (1977).
Internet still remains to be seen, but, with some initial Internet history to draw from, it is worth critically revisiting the early hopes.
Promises and Reality

Optimistically viewing the inherently decentralized and democratic characteristics of early Internet technology, analysts predicted that, applied to commercial endeavors, it would transform marketplace communication and bring us closer to the ideal of a “perfect market”: multiple buyers, multiple sellers, many interchangeable products, all smoothly and swiftly converging toward equilibrium thanks to perfect information. With the Internet, it was argued, market participants can know everything there is to know about the prices, characteristics, and quantities of goods in the market and make instantaneous, perfect, rational decisions. The result was to be “a new world of low-friction low-overhead capitalism, in which market information will be plentiful and transaction costs low.”3 Easy entry and easy exit afforded by cheap, flexible Internet technologies would keep incumbent players constantly on their toes, in the best interest of economic efficiency. “Any product that resembles a commodity—and most do—will be driven down in price by the efficiency of the Internet as a marketplace.”4 At the end of the day, analysts promised that we could now get very close to Adam Smith’s ideal, perfect market. “There is a fundamental shift in power, and it’s shifting to the consumer.”5

Today, however, an economic reality has emerged that diverges substantially from these predictions. Far from a multitude of interchangeable participants, concentration seems the rule in many segments of e-commerce, where only the largest actors seem able to succeed. While the Internet has reduced overhead and transaction costs, e-commerce players have required considerable investment to survive. Far from a “friction-free” environment, e-commerce sites strive to create “stickiness” that will keep their customers from clicking on to their competitors. Overall, while the Internet has had significant impact on commerce, the resulting economic landscape reveals
3. Gates (1995, p. 158). 4. Bill Gates, “Friction-free Capitalism and the Price of the Future,” May 20, 1998 (www.microsoft.com/billgates/columns/1998Essay/5-20col.asp [December 1999]). 5. Ferguson, cited in R. Quick, “The Attack of the Robots: Comparison-Shopping Technology Is Here—Whether Retailers Like It or Not,” Wall Street Journal, December 7, 1998, p. R14.
many unanticipated features. The early expectations rested on three key assumptions: low entry barriers, decreased roles for intermediaries, and lower transaction costs. It is worth examining where they stopped short before moving on to an analysis of the new landscape. First, the Internet was expected to shatter entry barriers. New players, able to marshal virtual resources in place of real ones, were expected to compete on par with incumbents. With the Internet, no need to build stores or hire a sales force—a cleverly designed website would suffice. No need to keep costly inventory—orders would simply be passed on to suppliers for just-in-time delivery. Technology was to abolish entrenched positions as competitive advantage. Small players could be as powerful as large ones. In many sectors, however, the barriers to successful and credible entry remained high. It quickly became apparent that if entering was easy, staying would be more difficult. Anyone could open a bookstore on the Internet. Yet only the best-capitalized bookstores would manage to survive (and even for these, survival still remains an open question). Entrants have discovered that it takes significant resources and skills to maintain an effective web presence, to guarantee that orders received will be filled. Traditional businesses still have the relationships and the marketing expertise that create substantial obstacles to new players’ entry. Entrants had their chance, but over time it turned out that experience, long-standing business relationships, and domain expertise still matter. In fact, concentration seems pervasive throughout the Internet world. Whether you look at portals, ISPs, exchanges, or the makers of the underlying network infrastructure, tremendous economies of scale, scope, and network seem to favor the largest players and reinforce concentration. Because of network externalities and economics (large fixed costs, low marginal costs), we are seeing increasing concentration and large players, rather than a multitude of small players. Second, the Internet was to bring disintermediation.6 In the old economy, intermediaries of all kinds performed important functions as information brokers. They aggregated demand for suppliers, giving them a better sense of what the market wanted, and offered buyers a convenient one-stop picture of supply. In the Internet economy, however, a world
6. Malone, Yates, and Benjamin (1987).
where everyone supposedly has access to complete information, intermediaries simply add cost and delay. They could be eliminated now that technology allowed direct connections between buyers and sellers, without need for brokers, market makers, consolidators, and other middlemen.

In fact, new intermediaries emerged and old intermediaries adapted. Homebuyers and sellers did not simply bypass realtors but began to use the services of online brokers, some new, but mostly traditional realtors with a new web presence.7 Stockbrokers did not disappear: new brokers like E*Trade emerged while traditional brokers adapted. There may even be more intermediaries today than before: business-to-business transactions once directly negotiated between two parties now often take place within online marketplaces. Rather than disintermediation, we are witnessing the transformation of intermediation.8

Third, the Internet economy was to be friction-free. In the old economy, communication activities account for a large portion of the costs of transactions. They reflect the trouble and expense of searching for products, identifying the right buyers and sellers, negotiating contracts, invoicing purchasers, billing and collecting. All create market friction and make it burdensome to switch from established commercial relationships. Hierarchical relationships among commercial partners, embodied in long-term contracts or organizational integration, were the old economy’s way to reduce transaction costs.9 The Internet, offering cheap and efficient ways to set up and execute transactions, was expected to reduce that friction, making markets more perfect. In turn, this would lead to greater reliance on markets than hierarchies for the organization of economic activities.

But if the Internet can reduce friction, the same technology can also be deployed to create more of it. Start-ups discovered that friction (or “stickiness,” as their business plans prefer to call it) often is the key to profits.10 Friction, and the resulting market imperfections, creates seller or buyer advantage as well as arbitrage opportunities for traders. Embedding friction within their web offerings, they were able to create switching costs for their customers, either through standards (for example, incompatible instant messaging) or through particular implementations (for example, web-based

7. Buxmann and Gebauer (1998). 8. Sarkar, Butler, and Steinfield (1995). 9. Williamson (1975). 10. Smith, Bailey, and Brynjolfsson (2000).
e-mail that cannot be forwarded).11 Friction-free may be a macroeconomic ideal but makes less sense from the point of view of individual business players. Indeed, friction is where business opportunities are to be found. Instead of friction-free commerce, what emerged was the design of exchange spaces with differential friction. Through this all, a common theme starts to emerge. At the core of the transition toward e-commerce is the emergence of multiple virtual spaces for exchange. These are not trivial to build and to back up with real-world ability to deliver on the agreements they help to negotiate. As a result, there are real barriers to credible and sustainable entry. Far from disintermediation, they constitute the emergence of new intermediaries or the reinvention of old ones. And far from providing friction-free interaction, they represent the careful arrangement of intentional friction. As firms deploy electronic technology to create commercial advantage for themselves, they are striving to alter the terms and dynamics of competition. They are, in the process, re-creating the marketplace.
Mapping a Way through the Transformation

The Internet-based reinvention of markets comes in varied degrees and flavors. As a result, the single label of “e-commerce” covers a wide variety of ways to organize production and exchange activities. Commercial interactions are organized differently in different sectors, often reflecting the preexisting ways of doing business and the position of incumbents. To discern what is really new and analyze the implications, we need to start with a map (see figure 2-1). Commercial activities, whether conventional or electronic, involve four basic levels.12

Communication infrastructure. Commerce requires communication: buyers and sellers must exchange information about the characteristics of goods and services, about quantities, availability, and prices; firms must coordinate their activities with those of partners and subcontractors. In the most primitive markets, such as farmers’ markets on the central square of medieval towns, communication was interpersonal and unmediated as buyers and sellers negotiated directly with each other. All communication technologies and media have influenced commercial
[Figure 2-1. Mapping E-Commerce. A matrix crossing the four levels of commercial activity (deliverable, marketplace, transaction and payment, infrastructure) with the four categories of commerce (conventional commerce, net-aided commerce, indirect e-commerce, direct e-commerce), indicating for each cell whether that level is handled conventionally (C-good, conventional marketplace, conventional infrastructure) or electronically (E-good, E-delivery, E-payment, electronic marketplace, electronic infrastructure).]
activities. Carrier pigeons and the mails have increased the reach of old marketplaces; telegraph and telephone have helped accelerate the pace of exchange. New communication media will continue to influence the conduct of economic activities as they transform the way in which various economic actors communicate.

Marketplaces. Commercial communications do not take place in a vacuum, but in the context of structured coordination environments within which buyers interact with sellers, negotiate, and agree on the terms of a transaction. These marketplaces come in many shapes and forms, from the vast network of fairs in medieval Europe13 to the modern NASDAQ. They share the fact that they are embedded within communication infrastructures that shape and constrain their mechanisms: the characteristics of what is traded and the process for matching demand and supply depend on the communication infrastructure.
13. Braudel (1979, pp. 63–74).
Transaction and payment mechanisms. These come into play to send, execute, and settle orders (including payments) that have been agreed to in the marketplace. They rely on the features of the communication infrastructure, either to physically transfer a payment or to transmit information about credits and debits to the accounts of buyers and sellers.

Deliverables. Finally, the system’s goal is to supply deliverables, the service or merchandise being exchanged. Here again, the underlying communication infrastructure constrains what goods can be shipped (or transmitted) and what services require proximity or can be performed at a distance.

Different degrees of reliance on electronic technologies at these four levels result in four broad categories of commerce (figure 2-1). In pure conventional commerce, nothing electronic is involved. Buyers and sellers physically meet in a market, communicate face-to-face, conduct transactions directly, and settle them with physical currency. The buyer physically takes delivery of the good or service. Since their inception, however, electronic technologies have assisted in the entire range of commercial activities.

A first level of electronic commerce, network-aided commerce, relies on electronic communication technologies to assist traditional commercial activities. The telephone made it possible to conduct traditional transactions at a distance rather than in person, and electronic data interchange (EDI) allowed companies to automate the exchange of orders and invoices. These, however, do not fundamentally change the commercial process; they simply make existing processes faster, cheaper, and more efficient. When a company lets customers pick a product from a website rather than a printed catalog, or takes their orders over e-mail, it uses the Internet as an aid to existing commerce rather than to transform it, continuing that trend.

The next level, indirect e-commerce,14 corresponds to the creation of an electronic marketplace on the network, within which demand and supply are matched, even though the goods and services traded are ultimately delivered physically to a customer. This matching process often differs significantly from what goes on in the physical marketplaces it replaces (or competes with). Airlines’ computerized reservation systems (CRS), such as American Airlines’ SABRE, effectively created an electronic marketplace for airplane trips and ancillary travel services that functioned quite differently from the network of travel agents interacting with airlines over the
14. Committee on Economic and Monetary Affairs and Industrial Policy (1998).
phone it replaced.15 Similarly, eBay’s electronic marketplace is much more than the automation of newspaper classified ads and provides a new process for pairing up buyers and sellers.

Finally, direct e-commerce is purely electronic, where the goods or services traded are themselves electronic and delivered over a network. This includes the commerce of software in its many forms, from music to computer programs, as well as online stock exchanges. Insurance and services such as aircraft engine maintenance are now traded that way as well. In this purest form, the electronic infrastructure supports the marketplace and transaction and payment mechanisms as well as the transmission of the traded objects themselves.

This mapping of e-commerce makes two important points. First, the broad category of electronic commerce covers in fact a wide diversity of commercial arrangements, with different degrees of “electronic-ness.” As a consequence, the impact can range from a mere enhancement of traditional commercial activities to fundamentally new ways to structure and implement them. The nature and magnitude of the economic implications will be equally diverse. This makes it crucial to look at real cases in individual sectors in order to understand the diverse implications of the transition to electronic commerce.

Second, the map highlights what is the most transformative aspect of this transition: the emergence of electronic marketplaces. While communication networks have always been an important aid to the market and to market activities, the network itself is now increasingly becoming a marketplace, that is, the place where buyers meet sellers, negotiate prices and quantities, agree on delivery terms, and exchange goods and payments. Thus it is useful to distinguish two categories of economic implications of e-commerce, broadly described by the headings of efficiency and structure. The quest for enhanced efficiency within existing commercial practices has important economic benefits but is not fundamentally new. It was the goal in the application of previous communication innovations to commercial practices: the world has seen previous rounds of mail-commerce, telegraph-commerce, phone-commerce. Each entailed substantial economic benefits, making existing market processes faster and cheaper. But this is still network-aided commerce. The second category is less obvious but more fundamental. The implementation of electronic
15. Hopper (1990). See also chapter 5 by Klein and Loebbecke in this volume.
marketplaces within the network infrastructure shapes the structure of economic relationships between companies and the operation of market processes. The next sections look at these two in turn.
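As a purely illustrative sketch, and not something drawn from this chapter, the mapping of figure 2-1 can be read as a simple classification rule: which of the four levels is handled electronically determines whether an exchange counts as conventional commerce, network-aided commerce, indirect e-commerce, or direct e-commerce. The Python below uses hypothetical field names and deliberately simplifies the figure by keying on the most electronic level present.

    from dataclasses import dataclass

    @dataclass
    class Exchange:
        # Which of the four levels rely on electronic technologies (assumed field names).
        electronic_communication: bool  # e-mail or web ordering aids the exchange
        electronic_marketplace: bool    # demand and supply are matched online
        electronic_settlement: bool     # transaction and payment handled over the network
        electronic_delivery: bool       # the deliverable itself is transmitted over the network

    def classify(x: Exchange) -> str:
        # Simplification: the "most electronic" level present determines the category.
        if x.electronic_delivery:
            return "direct e-commerce"       # goods or services delivered over the network
        if x.electronic_marketplace:
            return "indirect e-commerce"     # electronic marketplace, physical delivery
        if x.electronic_communication or x.electronic_settlement:
            return "network-aided commerce"  # electronics assist existing processes
        return "conventional commerce"       # nothing electronic is involved

    # A website order for a physical book, matched in an online marketplace:
    print(classify(Exchange(True, True, True, False)))  # -> indirect e-commerce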
Not So New: Pursuing Efficiency through Network-Aided Commerce

A first dimension of electronic commerce, and the most visible, simply constitutes the continuation of existing trends: it is the application of electronic communication technologies to existing commercial practices and marketplaces. Like the diffusion of previous communication technologies through commercial activities of the past, it does not in itself represent a fundamental transformation but rather an incremental improvement of existing processes.

Communication technologies are key to market processes because market mechanisms are information processing activities. Markets are structured information exchange environments, where actors convey information about the characteristics of goods and services, their prices and availability. Market mechanisms such as negotiation, matching, and agreement similarly are communication processes. Naturally, every time a new communication technology comes along, it is typically applied to the automation of existing market communication, further enhancing the flow of market-related information.16 Thus, before the Internet, couriers, the telegraph, the telephone, and electronic data interchange (EDI) served to enhance market-related communication—to allow faster, better, broader, cheaper matching of buyers and sellers and the settlement of transactions and payments.17 Time and again, the first step in the application of new technologies has led to further automation of existing marketplaces—designed to improve their operation along existing processes rather than transform these processes.

This is also the case with the Internet. Exchanging information about products and services becomes faster and cheaper. The Internet allows sellers to reach new potential customers, increasing their range, and conversely enables buyers to compare the offerings of a greater set of suppliers. As a result, sellers have developed better ways to gauge demand for their products, to adjust prices accordingly, and to relay these adjustments to the
marketplace, leading to more dynamic pricing mechanisms. Internet technologies permit faster and more cost-effective matching of demand and supply, yielding improved market clearing mechanisms. They support better negotiation mechanisms and faster transaction settlement. Sellers have harnessed the features of the network to offer more responsive and more personalized customer service.18 These are all significant improvements. However, similar claims can also be made for any of the previous communication technologies, from the postal network to the telephone. The result, in this round as in the previous rounds, is network-aided commerce, a move toward more efficiency in market processes rather than fundamentally new market processes. Two characteristics of this transition deserve particular notice. First, these improvements need be neither uniform nor symmetrical. In fact, they are typically implemented strategically. Individual market participants hope to get a leg up on their competitors by deploying information systems that give them faster, better market information or that help them close a transaction faster. Sellers try to make it less attractive for their customers to switch to competing suppliers because of the superior service they can provide thanks to improved communication technology. Buyers try to exert greater pressure on their suppliers by deploying network systems that give them a more accurate vision of their alternatives. In all cases, as with earlier information systems, the purpose is precisely to create advantage over competitors, greater leverage over buyers or suppliers.19 As a result, especially in the early stages, communication technology deployment may improve the efficiency of certain market operations, but this does not in itself make the market more perfect. Improvements will, more likely than not, be unevenly distributed and asymmetrical in the benefits they yield. This obviously calls for a response from those who lost out in the first round of technology deployment. As time passes and the technology matures, marketplace improvements will diffuse and technology-based advantages will tend to cancel out, leaving the benefits of greater efficiency to be shared by all market participants.20 However, we should not expect that transition to be instantaneous. Second, the unfolding of these improvements represents the first step in a cyclical, evolutionary pattern. This initial implementation of market 18. Hanson (2000). 19. Clemons (1986). 20. Brynjolfsson and Hitt (1996).
automation technology requires the deployment of a new network infrastructure, sometimes within individual organizations, sometimes between market players. Initially, the motivation for this deployment may be strictly aligned with its automation goals and the investment justified on that basis. However, once this new infrastructure is in place, it allows market participants to experiment with possibilities beyond this initial intent, tinkering with other ways to use this network and its applications. This in turn will suggest and enable deeper transformation of market processes, beyond the strict automation that motivated the technology’s initial deployment.21 History has shown how this process ushers in a virtuous innovation cycle, leading to fundamental economic change, and there is no reason to expect this round to be any different. The next section explores the emergence of new market structures, an important step in that direction.
What, Then, Is Really New? Structuring the Electronic Marketplace

The most fundamental transformation of commercial activities through the application of electronic technologies is not primarily about efficiency; it has to do with market and industry structure. It is about architecture. The architecture of conventional marketplaces, the physical arrangement of their “bricks and mortar,” is never neutral. Ludovic Piette’s depiction of Pontoise’s place du marché in 1876 shows this clearly. The buildings surrounding the square limit the area available for trading, and therefore the number of sellers and buyers who are allowed to take part in market activities. The physical arrangement of stalls constrains the discovery paths buyers can follow, and thus has an impact on what they buy. Sellers occupy different physical positions and display their wares on the ground or in carts, thus affecting their negotiating situation. A physical barrier stops buyers from entering the marketplace until the market is officially open for business. The specific constraints resulting from this architectural arrangement were somewhat arbitrary. Pontoise’s place du marché could have been organized differently, and indeed, one could find many different physical marketplace designs around the world, from the covered markets of Covent
21. Bar, Kane, and Simard (2000).
Ludovic Piette, Le marché de la place de l’Hôtel de Ville, à Pontoise (1876). Musées de Pontoise.
Garden to the souk of Istanbul. However, while different architectural choices could be made, none was neutral. Each entailed physical constraints that structured the market activities harbored in these spaces. Architecture shaped commerce. Set against the old brick and mortar marketplaces, communication technologies promised freedom from physical constraints. Telegraph and telephone began to allow distant buyers and sellers to participate in markets without being physically present. The Internet promised to liberate marketplaces from the constraints of physical space. There would be no limit to how many buyers and sellers could “fit” in the marketplace, since it was no longer physically bound. Sellers could dream up all kinds of ways to display their wares and design imaginative virtual stands within the software of the electronic marketplace. The market would no longer be held in a specific time zone and nothing demanded that it shut down after dark. With the Internet, the network is the marketplace.22 Not simply a lubricant for the wheels of traditional commerce, the Internet becomes the very place where buyers and sellers meet and transact business. The network, or more precisely the combination of network-based applications and network
22. Gordon (1989).
control software, is the environment within which the various stages of commercial exchange unfold. The network determines market access, since only those who are connected can participate in the market process. It supports discovery, as market players use the network to learn what goods and services are available, at what price, with what characteristics. Buyers and sellers also use the network to find out more about each other, from reputation to creditworthiness and service follow-through. Network-based software carries out the matching of demand and supply, connecting buyers with sellers. Once paired, buyers and sellers use the network to negotiate the precise terms of the transaction they wish to enter into. The network then supports the closing of a transaction and transfer of payment. For electronic products, the network also serves as delivery channel, completing the chain.

For this reason, the transition to e-commerce is more profoundly transformative of economic processes than past transitions such as “steam-commerce”: the information infrastructure and the software that determines its configuration become the foundation for a full set of basic market processes. Market mechanisms then become embedded in the network’s software, and network configuration defines market operations. When the network becomes the marketplace, the information communicated over the infrastructure, the market mechanisms, and the functioning of transaction settlement and payments are all embedded in the software infrastructure that supports the network marketplace and determines its operating parameters. As software, they can adapt flexibly to changing market processes and coevolve with changing economic relationships and organizational forms. Furthermore, control over the configuration of digital networks—the software definition of who can communicate with whom, under which conditions, to do what—is separable from ownership of the underlying physical infrastructure.23 As a result, electronic marketplaces can potentially be designed and modified by a variety of parties, ranging from infrastructure owners and providers of marketplace environments to market players—buyers, sellers, or third parties. The ability to control network configuration then becomes the key to defining the marketplace’s architecture. The architecture of the network marketplace and the software that supports it are the domain of the private actors that provide the network marketplaces—whether these are online auction or spot markets, online retail
23. Bar (1990).
sales sites, portals for e-commerce transactions, or business-to-business supply chain transactions. The promise of this newfound freedom led to a pervasive myth about the Internet: No longer bound by real-world constraints, the virtual marketplace would become a “perfect market” where all the inherent biases of the physical world could be overcome. In fact, architectural bias returned with a vengeance. Because the architecture of virtual marketplaces is defined in software, network configuration determines marketplace architecture.24 This means that those who have the power to set network configuration can decide who can participate in the market and what will be the rules of engagement within that marketplace. They can architect a virtual space with open or restricted access, decide to let in select buyers, sellers, or third parties. They can give them equal or differential access to market information. They can decide whether the market will function like an exchange with bid-ask mechanisms, an auction, a brokerage, or simply a catalog. They can structure it so all parties get an equal shot at a transaction or embed preferential treatment for select market players within the software code that governs market clearing.

Control can reside in a variety of places within the network. The “end-to-end” principle,25 which has guided Internet design, argued for implementing software functions, to the extent possible, at the edge of the network (that is, in the computers connected to the network) and in the topmost software layers26 (that is, as independently as possible from the underlying network hardware). According to that model, the communication network is a neutral conduit and all the intelligence resides at the edge, in the servers controlled by the network users. In the pure end-to-end network, control over marketplace configuration therefore belongs to those who control the applications that run on these end devices and the virtual network they constitute. As the Internet matures, it experiences increasing departures from the purest end-to-end principle. Owners of various subnetworks (for example, backbone providers, broadband access providers, or Internet service providers) find it in their competitive interest to embed certain functions (such as security, caching, and mirroring) within the piece of the network they control.27 They do so in part to improve network

24. Just as “Code is Law” (Lessig, 1999a), we might say that “Code is Economics.” 25. Saltzer, Reed, and Clark (1984). 26. See chapter 16 by Michael Kleeman in this volume, describing the network’s layered model. 27. Clark and Blumenthal (2000).
performance, but also because these functions can serve as the building blocks of electronic marketplaces, allowing them to leverage network control into market power. Thus in this emerging network environment, software-based control over network configuration can be found both at the edge and within the network, potentially exercised by multiple parties, jointly or independently.
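To make the point about market clearing concrete, the toy sketch below (our illustration, not a mechanism described in this chapter) contrasts a neutral clearing rule with one that quietly gives "preferred" sellers an edge when offers are compared. The function and participant names are invented for the example.

    def neutral_clearing(bids, asks):
        # Match the highest bid with the lowest ask; no participant is favored.
        buyer = max(bids, key=lambda b: b[1])
        seller = min(asks, key=lambda a: a[1])
        return (buyer[0], seller[0]) if buyer[1] >= seller[1] else None

    def biased_clearing(bids, asks, preferred, edge=0.05):
        # Same mechanism, except preferred sellers get a hidden 5 percent discount
        # when asks are ranked; settlement still happens at the posted price.
        def effective(ask):
            name, price = ask
            return price * (1 - edge) if name in preferred else price
        buyer = max(bids, key=lambda b: b[1])
        seller = min(asks, key=effective)
        return (buyer[0], seller[0]) if buyer[1] >= seller[1] else None

    bids = [("buyer_a", 102.0)]
    asks = [("supplier_x", 100.0), ("supplier_y", 101.0)]
    print(neutral_clearing(bids, asks))                           # supplier_x wins on price
    print(biased_clearing(bids, asks, preferred={"supplier_y"}))  # supplier_y wins despite a higher ask

Because participants only see posted prices, a rule of this kind is effectively invisible once buried in marketplace software, which is precisely the difficulty the chapter describes.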
Modern Marketplace Architecture

The resulting combinations of software-based control open a virtually limitless array of possible arrangements, far greater than what exists in traditional markets. This allows the builders of electronic marketplaces substantial latitude to follow modern architecture’s dictum—“Form follows function”28—now that software building tools and materials are flexible enough to create marketplaces where design can be subordinated to the pursuit of specific market mechanisms and outcomes. This means that while Internet technology can be used to design perfect markets (or near-perfect markets), it can just as well serve to construct biased ones. And because the exact exchange mechanisms are buried deep within the code of complex network and application software, the true characteristics of these marketplaces may be much harder to divine than those of the farmers’ markets of old. A simple glance at Pontoise’s central square was enough to gauge the strength of a particular market position. Modern travelers would find it much harder to assess their relative position in the various possible e-marketplaces for airplane tickets.29 In this new world, network control is the key to market control.

Consider the marketplaces described throughout the case studies in this volume. They show how Internet exchanges can be configured in very different ways to create many kinds of marketplaces, each aiming at distinct competitive outcomes. The vast majority of electronic commerce today occurs in e-marketplaces deployed by their main players. The analyst firm Emarketer estimates that

28. Sullivan (1947). 29. In its October 2000 issue, Consumer Reports compared the main online travel sites and concluded: “The results: The ‘lowest fare’ online rates for the same destination were all over the map—sometimes hundreds of dollars apart. Rates also differed from the baseline prices on Apollo, the computer reservation system used by many brick-and-mortar travel agencies. In many cases online rates were higher; in other cases, lower.”
over 93 percent of business-to-business e-commerce today takes place in what they call “private or proprietary exchanges”—that is, marketplaces controlled by the market’s dominant player.30 These marketplaces, such as those set up by Dell Computer or Wal-Mart, are primarily ways to automate these companies’ existing supply chains.31 While they create competitive pressures among the various nondominant players, for example between competing electronics parts suppliers to Dell, the reverse is not true. They are configured as proprietary exchanges that do not allow these suppliers to compare Dell or Wal-Mart with other potential buyers.

Related to these are marketplaces sponsored by industry consortia rather than a single company. One of the most visible is Covisint, the automotive online exchange created by DaimlerChrysler, Ford Motor Co., and General Motors.32 The presence of competing buyers suggests that this marketplace might be less one-sided than Dell’s supply chain automation. The declared goal of Covisint is to cut the production cost of an average car by 10 percent. Some of these savings are expected from greater efficiency in transactions. Nevertheless, chances are that the architecture of the Covisint marketplace will lend itself to help its owners, the automakers, drive down the cost of components, rather than to help component makers set automakers against one another to bid up the price of their products. Consortia-controlled marketplaces are emerging not only on the buyer side. For example, twenty-eight airlines gathered around United, Delta, Continental, Northwest, and American Airlines have announced the creation of Orbitz, a jointly controlled e-marketplace for travel services.

A large number of marketplaces are controlled and operated by third parties—that is, by entities who do not trade within the exchange. Examples include Chemconnect (a marketplace for chemicals, plastics, and industrial gas) and Freemarkets (online auctions for industrial parts, raw materials, commodities, and services). The economic basis for these is different in the sense that their selling point is precisely to provide an unbiased trading environment for their customers and to charge a fee on the transactions they facilitate. Network control is then put to the service of creating a neutral marketplace architecture for traders.

Third-party control does not guarantee absence of bias, however. In a number of cases, marketplace operators might have economic incentives to

30. Emarketer (2000). 31. For a discussion of the Dell model, see chapter 7 by Martin Kenney and James Curry in this volume. 32. See Fine and Raff (2001).
create particular market slants, even though they do not themselves trade in the marketplace. Examples include placement fees that marketplaces like Amazon.com, eBay, or Yahoo can charge to list some sellers or some goods more prominently than others. These incentives relate to the marketplace owner’s ability to allocate some scarce resource, like screen “real estate” in the case of placement. Screen space is at an even greater premium on the small screens of cell phones and PDAs, suggesting profitable strategies for those who control the order in which choices are displayed in their menus. In Japan, NTT DoCoMo has implemented a particular kind of bias in the marketplace it controls for mobile services, dividing service providers into different classes: those within the “walled garden” and those outside.33 Those inside get not only premium placement but also better access to the infrastructure’s technical resources and to NTT’s marketing might. They also share a greater portion of their revenues with NTT, the operator of the marketplace linking them to their customers.

Other kinds of bias may be buried deep in the network’s code and even harder to discern. For example, the broadband network provider Excite@Home has struck partnerships with certain content and service providers that agree to share their revenues in exchange for strategic caching and replication of their content within the network’s servers. While @Home’s network provides greater bandwidth for all services, it makes access to these privileged partners even better, presumably helping them along.

Yet another kind of software-defined marketplace comes with the deployment of peer-to-peer technologies. There the transmission network purely serves as a neutral conduit, and the market mechanisms—discovery, matching, negotiation, and transaction—are implemented “at the edge.” Napster, and the corresponding marketplace for music built around this technology, represents the most visible deployment of such a marketplace.34 Companies like Kinecta or iSyndicate have implemented a very similar system for the exchange of digital content around a concept inspired by media syndication.35 The resulting marketplaces come perhaps

33. The French Minitel had pioneered a similar idea in the 1980s with the kiosque system, a tiered structure of partnerships with different service providers corresponding to different pricing structures and different levels of collaboration between the carrier and the service providers. 34. Napster’s original intent was nonprofit, to facilitate the exchange of free music, and it may seem strange to describe it as a marketplace. Recent developments, in particular Napster’s alliance with media giant Bertelsmann, indicate how a profitable marketplace can be built on that model. 35. Werbach (2000).
closest to a neutral marketplace. In these models each of the peers (the end-nodes at the edge of the network) publishes a list of what it wishes to sell or acquire, at what price. The network connecting them serves as a neutral conduit for that information. These examples demonstrate that electronic technologies make it possible to design marketplaces with a wide variety of architectures, each serving the interests of different groups of market participants. While these technologies can indeed serve to reduce friction, level the playing field, or give all players equal access to market information, they can just as well be deployed to embed architectural features in the network marketplace in order to create strategic advantage for certain players. Ultimately, it is not technology that dictates marketplace architecture, but those who control how technology is deployed and configured.
Rearchitecting Commerce How will the deployment of these various kinds of marketplaces transform commercial activities? The answer varies across sectors, and the studies in this volume begin to paint a series of pictures. Three themes stand out: marketplace efficiency, structure, and adaptability. First, evidence from the various sector studies indicates that e-commerce yields significant savings in the setup and execution of transactions. These savings, however, must be balanced against new expenses. The development of electronic exchanges has proven more expensive and time-consuming than initially thought. The sector studies also suggest that these savings are captured primarily by those who control the new marketplaces. A second dimension relates to the impact e-marketplaces will have on the distribution of market power. Will they broadly follow existing patterns or challenge them? We have seen that, contrary to early expectations, Internet technologies did not necessarily create level markets but could also serve to design biases within the e-market’s architecture. These biases come in three main categories. First are information asymmetries, where the marketplace is structured so as to give some participants better or earlier market-relevant information. Second are matching asymmetries, where the market-clearing algorithms are programmed to favor some of the participants. Third are access asymmetries, where different market players have differential access to the telecommunication infrastructure.
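The second category is the least visible and the easiest to miss. A minimal sketch of how a market-clearing routine might be tilted toward favored participants appears below; the marketplace, the preferred-seller list, and the 2 percent weighting are all hypothetical, invented for illustration rather than drawn from any exchange discussed in this volume.

```python
# Illustrative sketch of a "matching asymmetry" (hypothetical; not any real exchange).
# Buy orders are matched against sell offers, but preferred sellers receive a small
# scoring discount inside the engine, so they win ties and near-ties that a neutral
# engine would award to their rivals.

from dataclasses import dataclass

@dataclass
class Offer:
    seller: str
    price: float  # posted ask price per unit

PREFERRED_SELLERS = {"AlphaParts"}   # hypothetical insiders
PREFERENCE_DISCOUNT = 0.02           # 2 percent ranking advantage, invisible to buyers

def effective_price(offer: Offer) -> float:
    """Price used for ranking inside the engine, not the price actually paid."""
    if offer.seller in PREFERRED_SELLERS:
        return offer.price * (1 - PREFERENCE_DISCOUNT)
    return offer.price

def clear_order(offers: list[Offer]) -> Offer:
    """Return the winning offer for a one-unit buy order."""
    return min(offers, key=effective_price)

offers = [Offer("AlphaParts", 100.0), Offer("BetaSupply", 99.0)]
winner = clear_order(offers)
print(winner.seller, "wins at posted price", winner.price)
# A neutral engine would pick BetaSupply at 99.0; the biased one picks AlphaParts
# at 100.0, and the buyer pays one dollar more per unit without ever seeing why.
```

Because the tilt lives in the ranking function rather than in the posted prices, neither the buyer nor the losing seller can detect it from outside the system.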
This should not come as a surprise. Indeed, with rare exceptions, markets usually are asymmetric.36 Because these asymmetries reflect the relative market power of the participants and constitute further sources of market advantage, we should expect powerful players to leverage new technologies to further their advantage, to reinforce rather than eliminate these asymmetries. In some cases, however, the technology can create opportunities for traditionally weaker market participants to challenge the dominance of incumbents. The case studies of book or music distribution offer such examples. Perfect markets can exist only for commodities, where a product is fully described by three characteristics: identity, standardized quality (grade), and price. This is where we find the most successful electronic trading networks so far. In these cases, they have been easily justified by cost savings and information optimization in the supply chain. Most real-life transactions between businesses involve much more complex objects. They are not arm’s-length dealings in which competition revolves primarily around price but multifaceted interactions including exchanges of expertise, joint learning, or collaboration in product design. In the emerging production networks, firms favor longer-term relationships with fewer suppliers who become partners in generating shared innovation. Such relationships are better supported by collaborative networks than by auction and electronic trading markets. In addition, Internet commerce appears to be penetrating business processes least where there is the greatest sunk investment in legacy information systems. Within the formal boundaries of the firm, there is resistance to displacing the legacy information systems that effectively manage mission-critical functions, often at low cost per transaction. The same appears to hold for business commerce that crosses firm boundaries where there is sunk investment in legacy systems that already similarly deliver very low transaction costs with broad reach—business-to-business payment systems, for example. This is not to say that Internet commerce will never make headway in these tougher applications, just that further progress awaits innovations that can deliver sufficient benefits to justify the replacement
36. On information asymmetries in particular, Scitovsky notes: “The root cause of the unequal distribution of knowledge between buyers and sellers is the division of labor, which causes everybody to know more than others about their own specialty and less about other people’s specialties than others know about them. The farther the division of labor proceeds, the wider becomes the gulf between the specialist’s knowledge and the nonspecialist’s ignorance of each specialty.” Scitovsky (1990, p. 137). Evolution to the “knowledge economy” exacerbates these asymmetries.
of existing systems. Beyond the obvious cost advantages of value chain trading networks, electronic marketplaces cannot yet support real innovation in areas such as collaborative product development and cooperative cross-firm work processes. They will eventually, as they spread further, but at the moment these network-based innovation processes remain at a very early stage indeed. This raises a third dimension of the impact of e-commerce—relatively unexplored and also more interesting. Because network configuration can be reprogrammed, the corresponding marketplace architecture is adaptable. This process is neither costless nor instantaneous but nonetheless allows faster, cheaper, and more flexible marketplace adaptation than in the preelectronic world. Marketplace architecture can then change to reflect evolving business practices or to take into account new ways to organize work processes within or between firms. In turn, changes in corporate form and new patterns of interfirm collaboration create user-driven experimentation with networking technologies and foster the development of new networking technologies and applications. The resulting process of coevolution may very well be the most significant characteristic of the transition to e-commerce. In the past the relative rigidity of the underlying communication infrastructure impeded rapid reorganization of work processes and interfirm relationships. By contrast, today e-commerce technologies allow for the joint adaptation of communication infrastructure and economic superstructure. In the knowledge-based economy, this becomes a powerful innovation engine.
Conclusion: Policy Implications As if to confirm that Internet technologies do not automatically generate perfect competition, electronic marketplaces—business-to-business marketplaces in particular—have already attracted significant antitrust scrutiny. This concern began before the current wave of Internet exchanges, most notably with the Department of Justice’s 1992 investigation of airline computer reservation systems (CRS), one of the first and largest e-marketplaces.37 As was the case with CRS, the exchanges attracting most 37. Department of Justice (DOJ) Antitrust Division, “United States v. Airline Tariff Publishing Company, et al., Proposed Final Judgment and Competitive Impact Statement,” Federal Register 59, no. 62 (March 31, 1994).
attention tend to be those owned by a few major participants in a given marketplace, and the main concern is collusion.38 So far, policymakers do not view the anticompetitive risks associated with electronic marketplaces (and B2B marketplaces in particular) differently from those related to nonelectronic markets. They identify a number of potential antitrust issues, including information-sharing agreements, joint purchasing, exclusionary practices, and exclusive access. In their view, however, these are not fundamentally new and can be addressed with traditional antitrust analysis.39 The above analysis suggests that there may be more to this story. Competitive biases can be built into the architecture of e-marketplaces in rather subtle ways. The danger is not so much the obvious one that online markets can be blatantly rigged to favor the market’s owner—indeed, an early example of such abuse, CRS, was effectively handled by antitrust. The real danger is that much more subtle manipulations of consumer choice and market outcome become possible and are likely to escape detection because they are embedded in the network’s very architecture.40 This issue of embedded, indirect market manipulation is one with which existing systems of commercial law and policy are ill-prepared to cope effectively.41 This points to a new link between communication policy and competition policy: when network control yields market control, policies for network access have implications that go beyond the strict domain of telecom policy to affect broader economic issues. When network code curtails fair market access, it becomes crucial to guarantee open access to networks, so that competing marketplaces can be created.
38. “A Market for Monopoly?” Economist, June 17, 2000, pp. 59–60. 39. “Entering the 21st Century: Competition Policy in the World of B2B Electronic Marketplaces,” Report by the Federal Trade Commission Staff, October 2000 (www.ftc.gov/os/2000/10/b2breport.pdf). 40. For a description of how such biases can occur in broadband cable networks, see chapter 18 in this volume. 41. Lessig (1999b).
References
Bar, François. 1990. “Configuring the Telecommunications Infrastructure for the Computer Age: The Economics of Network Control.” Ph.D. dissertation, University of California, Berkeley.
Bar, François, N. Kane, and C. Simard. 2000. “Digital Networks and Organizational Change: The Evolutionary Deployment of Corporate Information Infrastructure.” Paper prepared for the International Sunbelt Social Network Conference. Vancouver, British Columbia, April 13–16.
Bar, François, and E. Murase. 1999. “Charting Cyberspace: A U.S.-European-Japanese Blueprint for Electronic Commerce.” In Partners or Competitors? The Prospects for U.S.-European Cooperation on Asian Trade, edited by Richard Steinberg and Bruce Stokes, 39–64. Lanham, Md.: Rowman & Littlefield.
Beniger, J. 1986. The Control Revolution. Harvard University Press.
Braudel, Fernand. 1979. Civilisation Matérielle, Économie et Capitalisme XVe–XVIIIe Siècle: Les Jeux de l’Échange. Paris: Armand Colin.
Brousseau, E. 1994. “EDI and Inter-Firm Relationships: Toward a Standardization of Coordination Processes?” Information Economics and Policy 6 (3–4): 319–47.
Brynjolfsson, Eric, and Lorin Hitt. 1996. “Paradox Lost? Firm-Level Evidence on the Returns to Information Systems Spending.” Management Science 42 (4): 541–58.
Buxmann, P., and J. Gebauer. 1998. “Internet Based Intermediaries: The Case of the Real Estate Market.” CMIT Working Paper 98-WP-1027. Berkeley: University of California.
Chandler, Alfred D., Jr. 1977. The Visible Hand: The Managerial Revolution in American Business. Cambridge: Belknap.
Clark, David, and Marjory Blumenthal. 2000. “Rethinking the Design of the Internet: The End-to-End Arguments vs. the Brave New World.” Paper prepared for TPRC. Alexandria, Va., September.
Clemons, Eric K. 1986. “Information Systems for Sustainable Competitive Advantage.” Information and Management 11: 131–36.
Committee on Economic and Monetary Affairs and Industrial Policy. 1998. Report on the Communication from the Commission to the Council, the European Parliament, the Economic and Social Committee and the Committee of the Regions on a European Initiative in Electronic Commerce. COM(97)0157–C4-0297/97.
Emarketer. 2000. The eCommerce: B2B Report. New York.
Fine, Charles M., and Daniel M. G. Raff. 2001. “Automotive Industry: Innovation and Economic Performance.” In The Economic Payoff from the Internet Revolution, edited by Robert E. Litan and Alice M. Rivlin, 62–86. Internet Policy Institute and Brookings.
Gates, Bill. 1995. The Road Ahead. Viking Penguin.
Gordon, P. 1989. La place du marché. Paris: La Documentation Française.
Hanson, W. 2000. Principles of Internet Marketing. Cincinnati, Ohio: Southwestern College Publishing.
Hopper, M. D. 1990. “Rattling SABRE—New Ways to Compete on Information.” Harvard Business Review (May–June): 118–25.
Lessig, Lawrence. 1999a. Code and Other Laws of Cyberspace. Basic Books.
———. 1999b. “The Law of the Horse: What Cyberlaw Might Teach.” Harvard Law Review 113: 501–46.
Malone, T., J. Yates, and R. Benjamin. 1987. “Electronic Markets and Electronic Hierarchies.” Communications of the ACM 6: 485–97.
Picot, A., and others. 1997. “Organization of Electronic Markets: Contributions from the New Institutional Economics.” Information Society 13: 107–23.
Saltzer, Jerome H., David P. Reed, and David D. Clark. 1984. “End-to-End Arguments in System Design.” ACM Transactions on Computer Systems 2 (4): 277–88.
Sarkar, M. B., B. Butler, and C. Steinfield. 1995. “Intermediaries and Cybermediaries: A Continuing Role for Mediating Players in the Electronic Marketplace.” Journal of Computer-Mediated Communication 1 (3).
Scitovsky, T. 1990. “The Benefits of Asymmetric Markets.” Journal of Economic Perspectives 4 (1): 135–48.
Shapiro, C., and Hal Varian. 1998. Information Rules: A Strategic Guide to the Networked Economy. Harvard Business School Press.
Smith, M., J. Bailey, and Eric Brynjolfsson. 2000. “Understanding Digital Markets: Review and Assessment.” In Understanding the Digital Economy, edited by Eric Brynjolfsson and B. Kahin, 99–136. MIT Press.
Sullivan, Louis. 1947. “The Tall Office Building Artistically Considered.” Reprinted in Kindergarten Chats (Revised 1918) and Other Writings. New York: Wittenborn, Schultz.
Werbach, K. 2000. “Syndication: The Emerging Model for Business in the Internet Era.” Harvard Business Review (May–June): 86–93.
Williamson, Oliver E. 1975. Markets and Hierarchies, Analysis and Antitrust Implications: A Study in the Economics of Internal Organization. Free Press.
II
E-Commerce: A View from the Sectors
The Boundary Conditions of Services
E-commerce is rendering the blurry boundaries between goods and services as useless and frustrating as streaming video through ordinary phone lines. It is chopping the production process into bits and pieces—into discrete tasks like design, purchasing, and logistics—and reassembling them in new ways and in new places, to which old labels correspond poorly. It is, to push things a bit, treating production rather the way packet switching treats a message: breaking it into bits and bytes, routing it separately through many different places, and reassembling it at the final destination. The lion’s share of our economy consists of services, but what precisely is a service? Enumeration is simple, though exhausting: a legal consultation, a crop dusting, a facelift, a rock concert. But the category blurs and flips. The rock concert on a purchased CD or DVD is a good; on TV it is a service. Then again, if the DVD was rented and not purchased, it is a service once again. The economic accounts category “services” has always been a messy, residual category. It exists in implicit distinction to the category “goods.” The construction parallels such interdependent categorizations as supply-demand, male-female, and debit-credit. But under midnight oil at lightning glance it sometimes seems closer to dancer-dance. Services are said to share defining, abstract properties: nontangibility being the most important. This separates them from goods, which are defined as being tangible. But not all nontangibles in our economic accounts are “really”
services: interest payments, for instance. And some services, like crop dusting and product design, take their finality in goods; they are part of the production of tangible things. Nonstorageability is another attribute that is often used to define services. But online data banks do just that: they store intangible information; and so do the keepers of the “files” in bureaucracies, whether by computer or by hand. George Stigler concluded, almost fifty years ago: “There exists no authoritative consensus on either the boundaries or the classification of service industries.”1 Since then, boundaries have softened considerably. The rearticulation of production chains into more finely defined tasks, or service activities, and their extension across more and more companies further complicates classification. The design for a new IBM semiconductor, when produced by IBM in its semiconductor facility, is part of a good; when produced by a semiconductor design house and sold to IBM, it is a service—usually, though its classification may change depending on the modes of delivery and payment. E-commerce is adding further difficulties to the categorization scheme. That rock CD, downloaded and paid for on a microcharge website, is a service; downloaded on Gnutella, it escapes all accounting categories. As they go online, books, magazines, and newspapers, traditionally classified as manufactured goods, become services. And promising new developments in cell therapy will be accompanied by a shift in many pharmaceuticals from goods to services, as bottles of mass-produced pills are replaced by drug treatments that are created and administered a patient at a time according to individual DNA specifications, protein cultures, and financial immunities. The category “services,” bursting with busily employed occupants, is empty of meaning, though thought to be fraught with significance. Our purposes will be better served by analyzing discrete “tasks” or “service activities” such as design, logistics, procurement, assembly, billing, and paying independently of the final product designation of the industry. These activities are getting more separate and more servicelike. They constitute the stepping stones by which e-commerce is introduced, through which efficiency gains are realized and reorganization triggered. This task-by-task approach, which comes straight out of our case studies, permits us better to see how e-commerce, and more broadly information technology, is rearticulating those interdependent activities, redefining many of
1. Stigler (1956, p. 47).
them—and, in many informative cases, failing to do so. In brief, the case studies reveal significant differences among industries and industry segments—differences in particular “service activities” or tasks—that are obscured by sectoral aggregation.
E-Commerce and the Rearticulation of an Industrial Structure An excellent example of e-commerce enabling and accelerating the transformation of an industrial structure into separated activities is provided by the semiconductor industry. The emerging structure of an important segment of that industry is best understood as chains of different “services.” Each such transformation seems to enable yet another. New design tools (software, most often leased online, not bought like wrenches and screwdrivers) are a primary enabler of substantial structural change in the industry. In a large and rapidly growing segment of the industry, “fabless” (no factory) semiconductor firms design chips on these standard tools (as do integrated firms such as Intel and NEC). Specialized foundry firms, such as Taiwan Semiconductor, manufacture the chips. We have now entered the oxymoronic world of manufacturing services. But its linguistic oddity ought not to trivialize its importance. The foundry specifies design parameters dictated by the technical capabilities and cost structures of its manufacturing process. These are expressed by the design tools as design rules understandable to the remote circuit designer not expert in the process technology of the fab. The fabless firm sends its design, verifies it, and follows the progress of its chips online, in real time, through various stages of adjustment and then on through the manufacturing process. (This includes sending the design, through a similar calibrated process via the design tools online to a third-party “mask” manufacturer.) These successes are breeding second- and third-generation changes in the industry’s structures and practices. The leading design tool firm (Cadence) is spinning off a stand-alone service company to do, not enable, detailed design work. New foundries are adopting innovative ownership and organizational forms. As a semiconductor manufacturing facility (a fab) can now cost upwards of $2 billion, for several firms to share a fab is a way for each to reduce its risk resulting from an unsuccessful design. Along with benefits in time saved, this is, of course, a good part of the drive toward foundry services. New ownership and lease arrangements are converting fabs into a kind of time-share condo, where time slots on the
line can be owned, leased, and even traded as business conditions vary, differentially, among the designer firms.
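The design-rule mechanism at the heart of this arrangement can be illustrated with a deliberately simplified sketch. The rule names, the numbers, and the checking logic below are invented for illustration (real foundry rule decks run to thousands of entries and are enforced inside commercial design tools), but the sketch shows how a foundry's process constraints become machine-checkable rules that a remote, fabless designer can verify before a design is ever sent to the fab.

```python
# Toy design-rule check (DRC), loosely modeled on the foundry relationship
# described above. The rule values are hypothetical, not those of any real process.

FOUNDRY_RULES = {
    "min_metal_width_nm": 180,    # narrowest metal line the process can print
    "min_metal_spacing_nm": 200,  # closest two metal lines may sit
}

def check_design(features: list[dict]) -> list[str]:
    """Return a list of rule violations for a set of drawn features."""
    violations = []
    for f in features:
        if f["width_nm"] < FOUNDRY_RULES["min_metal_width_nm"]:
            violations.append(f"{f['name']}: width {f['width_nm']} nm is below the minimum")
        if f["spacing_nm"] < FOUNDRY_RULES["min_metal_spacing_nm"]:
            violations.append(f"{f['name']}: spacing {f['spacing_nm']} nm is below the minimum")
    return violations

# The fabless designer runs the check locally, against rules published by the
# foundry, before uploading the design for manufacture.
design = [
    {"name": "clk_line", "width_nm": 200, "spacing_nm": 210},
    {"name": "data_bus", "width_nm": 150, "spacing_nm": 190},
]
for problem in check_design(design):
    print("DRC violation:", problem)
```

Because the rules travel with the design tools, the foundry never needs to examine the designer's intent, only the design's conformance to the published parameters; that is what allows design and manufacturing to be separated into distinct services in the first place.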
The Enabler: Standardized Formats, Protocols, and Authentication The semiconductor industry’s ability to standardize rules and formats for setting and communicating designs and other transactions is the critical enabler for the extensive implementation of e-commerce in that sector. This standardization is embodied in the design tools and enforced, fundamentally, by the providers of manufacturing services. They are able to do it because their manufacturing processes are optimized to specific technical parameters. Designs that do not adhere to those specifications and protocols encounter real and serious time and price consequences. In this case, real-world constraints at one point of the production chain can enforce design rules and formats that are necessary for web-enabled transfer of design and monitoring of production. Critically, individual design firms, the fabs’ customers, are not losers in this standardization process. The studies in this book show great variation in the ability of different industrial segments to adopt commonly accepted formats. The basis for success among sectors that succeed in adopting common formats also varies. In some cases, such as semiconductors, all designer firms can gain, and none necessarily loses to any others, if the foundry-friendly format is adopted. Other “successful” cases seem to reflect the overweening power of one player (a powerful final assembler or retailer). Others, perhaps a majority, have thus far been disappointing in their advances toward adopting common formats and, therefore, toward realizing major potential gains of e-commerce, largely because of unresolved questions of differential gains and an inability to solve the distributional problem. The forthcoming migration of much of e-commerce to an XML language will attenuate some of the problems with common formats by introducing new dimensions of flexibility and variation. Authentication is a related problem: it is important to know for sure just who is entering an order or a payment. New, sophisticated authentication software is now being introduced. These new technologies will help to propel greater usage and gains, but they will not resolve fundamental distributional conflicts. In some industries, consortia of dominant firms are launching their own e-commerce marketplaces. In autos, for example, GM, Ford, Daimler, and Renault teamed up
to create an industry marketplace—Covisint—to muffled choruses of supplier anxiety. In many such cases, for example bond trading, antitrust questions will likely have to be resolved. Dell, the most admired model of e-commerce, is able to impose formats in its own supply chain—from the final consumer right back through the component suppliers and logistics providers. It is aided in this, of course, by its huge volume of purchases and its rather small but still substantial number of suppliers (about twenty suppliers account for some 75 percent of its purchases, and almost all of them, except Microsoft, have competitors).2 In retail, the overwhelming power of a few giants, such as Wal-Mart, Kmart, Target, and Sears, permits them to snap their own supply chains and impose their own norms and forms. These giant companies have long been leaders in automating logistics. All of them long ago migrated from paper-based procurement systems to electronic data interchange (EDI) systems at considerable gain. They anticipate additional major gains from migrating to web-based systems. Sears predicts that procurement costs will fall from $100 an order with current EDI systems to about $10 an order with web-based systems. As Sears handles about 100 million purchasing orders annually, savings should total roughly $9 billion per year.3 These savings result mostly from improving visibility in the supply chain, which slashes communications and tracking costs, coupled with enhanced flexibility. This estimate includes the considerable win-win savings resulting from what Dell and UPS discovered when they went to Internet-based, real-time tracking: cheap, online tracking inquiries soared, but costly phone calls from customers dried up, resulting in substantial savings on both sides. Multiplying Sears’s projected gains by Wal-Mart, Target, and Kmart generates big numbers. Cautiously halving those numbers, and even halving them a second time, still leaves very big, net-gain numbers. But as the case studies show, in many major sectors—ranging from health care through liquor and automobile sales through government—these kinds of gains are not being achieved. In many cases the obstacle is legislated: as in online sales of wine or even autos, where local dealers, as a result of local legislation, must consummate sales. Most typically, the stumbling block no longer seems to be technology, or managerial awareness, or even the availability of competent people to execute new processes. The key problem reported in these studies centers on formats, standards,
2. See chapter 7 by Kenney and Curry in this volume. 3. See chapter 13 by Hammond and Kohler in this volume.
forms. If e-commerce is to realize its promise, the enabling condition is agreement on a common interface, a format or way to present information. In many giant sectors—retail banking and government, to mention just two—an additional problem stands in the way of realizing the full cost savings of e-commerce. Costly paper check processing and shipping is a perfect candidate for replacement by electronic payment systems. And many major banks are eagerly promoting the shift. But it will be a very long time indeed before all customers are willing to forgo paper checks and shift to electronic systems. In the very long interim, the banking system will be obliged to shoulder the costs of maintaining both payments systems, each with very high fixed costs. The same is true—even truer, if we may—for government. Governments at all levels—federal, state, and local—are experimenting with what is beginning to be called government-to-consumer (G2C) web-based services such as the provision online of tax forms or access to Social Security information and benefit calculation assistance. Almost half the states provide online income tax filings and use outsourced solutions for this service. Many states provide online services to order vital records such as birth, death, and marriage certificates. Governments are only beginning their move into online service provision. Some municipalities are experimenting with online filing for minor construction permits in an effort to eliminate the need to come to city hall and stand in line with arms full of blueprints. The enormous volume of government-to-consumer transactions means that, eventually, substantial cost savings (both monetized and nonmonetized) will be realized. But government must maintain nonelectronic systems for dealing with its “consumer-citizens” whatever the cost. Government does not and cannot obey the simple market logic that drives corporate decisions. Government is also the largest customer in many markets, spending over $500 billion a year on procurement. The complex restrictions placed on government procurement, which have been the stumbling block for preelectronic efforts to “streamline the process,” should give pause to eager expectations of fast and major efficiency gains through electronic procurement systems of the kind used in business.4 Analyses of the health care sector estimate that e-commerce would generate savings of about $30 billion over a ten-year period just in billings and
4. See Fountain (2001).
payments transactions.5 This is a very conservative estimate (discordant when compared with the Sears estimate above). But even this depends on the adoption of common standards for financial and administrative transactions. This estimated gain does not include gains resulting from the electronic handling and communication of medical records, where much larger gains in efficiency, not to mention effectiveness, would be realized. For medical records, substantive problems of airtight privacy remain to be resolved (even though they have not been resolved for incumbent systems). So do significant technical problems: doctors are unlikely to abandon current costly and error-prone hand scribbling of medical records and prescriptions until reliable and convenient voice recognition entry devices are available. Private standards-generating initiatives in the health sector were last year’s “new new thing.” Various efforts, some by outsiders, some by major insurers and health groups, compete. But the critical element remains the lack of a sufficiently powerful and determined player who can legitimately oblige the acceptance of common standards. The obvious candidate is, of course, the federal government, which pays so substantial a part of the national medical bill. Prudence means thoughtful balance, not just watered-down expectations. The economic gains resulting from moving billing and payments online will generate very substantial dollar savings, though they will be achieved only slowly and with difficulty. But they represent what is, ultimately, a relatively minor application. The health sector is one of many realms where information technology (IT) is likely to play out on a grand scale, surging far beyond the substantial gains attributable to only one form of IT, e-commerce. Moving prescriptions and medical records online is an emblematic second-round effect. It will reduce a major cause of medical mistakes—and, therefore, deaths, increased hospitalizations, and human misery—as well as enormous economic cost. It is the third round, however, where the great gains await; they are in the substance, not the administration, of health care. As was stressed in part I of this volume, information technology consists of tools for thought, the most powerful and all-purpose tools ever. For starters, we should consider advanced imaging,
5. U.S. Department of HHS. See the Health Insurance Portability and Accountability Act of 1996 (HIPAA), which calls for HHS to adopt standards for financial and administrative transactions in health care. See Danzon and Furukawa (2001). 6. On the banality and ubiquity of medical mistakes, see Kohn, Corrigan, and Donaldson (2000).
microsurgery, rational drug discovery, and stem cell therapy, all well along in the pipeline, all rather revolutionary, and all just beginning. A review of the service sector case studies highlights the potential paths and problems associated with the transformations at hand.
The Financial Services Sector As Clemons, Hitt, and Croson reveal in chapter 4, “The Future of Retail Financial Services: Transparency, Bypass, and Differential Pricing,” it is nearly impossible to generalize about the impact of the Internet on the American financial services sector. Although most of its subsectors are dominated by large credit agencies, their similarities end there. The dynamics of the market for credit card provision, for instance, differ wildly from those of retail banking, stockbroking, mortgage lending, or term insurance. Not only are these services processed differently, but each focuses on distinct customer segments. Whereas credit card firms target low-income clients likely to pay recurring service fees, stockbrokers scrap to attract high-income clients with lucrative portfolios. Simply put, each subsector in the financial services industry is characterized by significantly different degrees of price transparency, differential pricing, and customer disintermediation. Combined with market-specific government regulations, these differences create diverging growth prospects for financial service firms in the New Economy. Internet-enabled price transparency is throttling corporate profits in markets characterized by simple transactions. Since 1995, Internet brokers such as E*Trade have seized control of over 30 percent of all retail stock trades, offering order fulfillment at rates ten to twenty times below those promised by full-service brokerages like Merrill Lynch. Online comparison shopping is driving down margins in mortgage and insurance provision, just as competition between firms like Amazon.com and Barnes & Noble drove down the cost of many consumer goods. Already, 10 and 20 percent of all customers use the Internet to research their mortgage and insurance needs, respectively. Despite a lowly market penetration of less than 1 percent, Internet-only banks are also placing upward pressure on consumer expectations of what constitutes a “reasonable” rate of return on uninvested capital. Consumer empowerment is only part of the story. Data-mining tools and mass customization allow firms to tailor their services to individual
market niches, wringing profits from previously untapped consumer needs. Just as Capital One Financial revolutionized the credit card industry with its decision to focus on the high-profit, low-income market segment, online transaction records and data-mining tools offer firms increasingly valuable knowledge about their customer base—knowledge that allows firms to extract profits from even the highest-risk segments. As Clemons, Hitt, and Croson take pains to point out, however, the degree of customer disintermediation across various sectors frustrates simple analysis. Despite the downward pressure online brokers have placed on the going price of market orders, the ability of traditional firms to offer vertically differentiated services such as complex financial planning, access to coveted IPOs, and valuable market research provides a bulwark against the loss of their most lucrative customers. Disintermediation poses less of a threat to retail banks. With few exceptions, consumer-to-consumer (C2C) lending, borrowing, and payment systems have failed to materialize. As with credit cards, there appear to be few functional alternatives for most consumers other than continuing to rely on retail bank services. Aiding this intransigence, many financial firms are protected against the encroachment of upstart competitors by their enormous economies of scale. Long-standing investments in innovative systems including automated teller machines (ATMs), centralized telephone call centers, and PC banking tools have institutionalized competitive advantage and created incredibly efficient information channels. Thus, turning their eyes to the growth of Internet banking, Clemons, Hitt, and Croson insightfully suggest that banks are shifting into web-centric systems out of competitive necessity, not out of an internally generated commitment to overhaul their existing business systems. Attempting to synthesize the lessons of their remarkably different case studies, Clemons, Hitt, and Croson propose a theory of “newly vulnerable markets.” They project that financial services that are easy to enter, attractive to attack, and difficult to defend will experience the most rapid upheaval in response to business process innovation and continued technological advances. Government policies and regulations are crucial insofar as they delimit competitive boundaries and restrict firm flexibility. Looking to the future, Clemons, Hitt, and Croson point to significant long-term changes in the structure of the financial systems of the advanced Western economies. Crucially, however, while technological efficiencies may be occurring rapidly, their analysis suggests that business model innovation will be largely pioneered by firms specializing in niche markets at the
margin of profitability and will only diffuse slowly into the broader financial community.
The Airline Industry In chapter 5, “Web Impact on the Air Travel Industry,” Klein and Loebbecke focus on a particular service task of the air travel industry— pricing—but develop the analysis in a way that makes their findings relevant for many other service sectors. Specifically, the authors investigate how ubiquitous digital networks enable the introduction of entirely new pricing models or pricing models previously unknown in the industry. Traditionally, consumers purchased flight tickets from travel agencies that acted as sales intermediaries for the airlines. While airlines moved toward differential pricing even before the advent of the Internet, end consumers were largely passive participants in the price determination process. The spread of the Internet and the availability of new electronic tools for marketing and sales have led to widespread experimentation in the area of pricing within the industry. Far from simply enabling disintermediation and direct sales from airlines to consumers, the Internet has given rise to web-based intermediaries, new roles for existing suppliers and intermediaries, and significantly increased consumer participation in the price determination process. Loebbecke and Klein trace and classify several kinds of innovative pricing models and conclude that the web will give rise to models of negotiated pricing that can go all the way down to the end consumer, a model familiar from bazaars but long absent in many industrial mass markets. The authors classify differential pricing models according to whether pricing is based on customer characteristics, product features, sales volume, or customer utility. In the process, they discuss innovative pricing (and business) models by Lufthansa (sales auctions), TravelBids (reverse auction), Rosenbluth (value-based pricing) and Priceline (demand collection). Each model assigns suppliers, intermediaries, and end customers specific roles in the price negotiation process with asymmetric distribution of influence over price across the chain. Both Lufthansa’s sales auctions and Priceline’s demand collection, for example, give end customers comparatively high influence over the final price. TravelBids’ reverse auctions and Accompany’s demand pooling, by contrast, locate highest influence over price with the supplier.
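The variety of these models can be made concrete with a small sketch. The code below contrasts two of the mechanisms named above in stylized form: a conventional forward auction, in which buyers bid a price up, and a reverse auction, in which sellers bid a price down against a buyer's stated limit. It is a generic illustration under simplified assumptions, not a description of how Lufthansa, TravelBids, or Priceline actually implemented their systems.

```python
# Stylized pricing mechanisms (illustrative only; not any airline's or
# intermediary's actual implementation).

def forward_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Buyers compete for a surplus seat: the highest bid wins."""
    winner = max(bids, key=bids.get)
    return winner, bids[winner]

def reverse_auction(asks: dict[str, float], buyer_limit: float):
    """Sellers compete for a traveler's business: the lowest ask at or below
    the buyer's stated limit wins; otherwise no deal is struck."""
    best = min(asks, key=asks.get)
    if asks[best] <= buyer_limit:
        return best, asks[best]
    return None

# Forward auction: consumers bid up the price of a surplus seat.
print(forward_auction({"traveler_A": 210.0, "traveler_B": 245.0}))
# -> ('traveler_B', 245.0)

# Reverse auction: sellers bid down toward a traveler's limit of $300.
print(reverse_auction({"carrier_X": 320.0, "carrier_Y": 290.0}, buyer_limit=300.0))
# -> ('carrier_Y', 290.0)
```

Who holds the initiative, and therefore the influence over the final price, differs between the two mechanisms; that asymmetry is precisely what the chapter traces across the Lufthansa, TravelBids, Rosenbluth, and Priceline models.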
The survey of innovative web-based pricing models reveals that the Internet and e-commerce have not led to a unidirectional shift in the balance of market power in the sector. Rather, each model empowers actors at various points in the value chain according to its own particular logic. Just as innovative pricing models are diverse, so are the business strategies that have brought them forth. New entrants such as TravelBids, Priceline, and Accompany have developed business models that are qualitatively different in order to break into an existing market and transform it. In contrast, Lufthansa’s monthly auction of surplus capacity is a means to enhance overall ticket sales efficiency. However, only a small number of tickets are auctioned this way, and the airline’s business model, as well as its overall sales model, is virtually unaffected by these small online auctions. In sum, the Internet has changed the air travel industry in that it has enabled models of negotiated pricing that were hitherto impractical. Suppliers, existing intermediaries, new intermediaries, and end consumers each play roles in the price determination process, with the specific task and degree of influence over price varying from model to model. Much experimentation is still under way, and it is too early to assess whether a single pricing model will prevail or how big the overall impact of these models will be for this or other sectors.
References
Danzon, Patricia M., and Michael Furukawa. 2001. “E-Health: Effects of the Internet on Competition and Productivity in Health Care.” In The Economic Payoff from the Internet Revolution, Brookings Task Force on the Internet. Brookings.
Fountain, Jane. 2001. “The Economic Impact of the Internet on the Government Sector.” In The Economic Payoff from the Internet Revolution, Brookings Task Force on the Internet. Brookings.
Kohn, Linda T., Janet Corrigan, and Molla S. Donaldson, eds. 2000. To Err Is Human: Building a Safer Health System. Washington: National Academy Press.
Stigler, George. 1956. Trends in Employment in the Service Industries. Princeton University Press.
3
E-Finance: Recent Developments and Policy Implications
The Internet has begun to profoundly affect how financial services are delivered. On the one hand, the Internet affords convenience, price transparency, broader access to information, and lower cost; on the other hand, financial services are data-intensive and generally require no physical delivery. Combining the two should give the perfect environment for new entrants to build substantial businesses, go up the value chain, and compete on price.1 “E-finance” has been defined in different ways.2 In this chapter, we use the term rather broadly to mean the provision of financial services—banking and deposit-taking, brokerage, payment, mortgage and other lending, insurance and related services—over the Internet or via other open public networks. E-finance is expected to continue growing strongly. However, the fast pace of developments generates considerable uncertainty about the current situation and future implications. This uncertainty is shared by bankers and supervisors and concerns not just how much stress will be placed
1. However, as discussed later, only a small portion of these revenues will be earned in an Internet-only environment. 2. Sometimes the terms “online finance,” “Internet finance,” “virtual finance,” and “cyber finance” are also used interchangeably.
on parts of the financial system but also where such stress will be felt. It is likely that the cross-border, fast, and less regulated nature of e-finance will accentuate familiar challenges and risks, if not create new ones. This chapter provides a global overview of recent developments in e-finance and examines its policy implications. The first section briefly surveys various manifestations of e-finance. The second section presents a conceptual framework for understanding the e-finance structure and reviews changes taking place in individual institutions, exchanges, and trading systems. The third section reviews possible implications for financial stability, and the fourth section reviews implications for monetary stability. The fifth section discusses the role of central banks and supervisors in monitoring and assessing e-finance developments.
Overview of E-Finance Developments The sheer size of traditional financial services markets implies strong growth potential: the global e-finance market could come to exceed a trillion dollars (see table 3-1). One study estimates that e-finance revenues will more than double in three years.3 Although these growth statistics are impressive, they only partially indicate the impact e-finance will ultimately have on the financial services industry. While the Internet may appear to be just another technology wave adding a new delivery channel, its scope and potential impact on financial stability are much larger. Not only is the Internet taking business away from traditional “bricks and mortar” financial institutions, it is also introducing new business models, changing financial structures, and driving industry consolidation. Many believe consumer acceptance of online financial transactions will follow a “hockey stick” path as new technology increases convenience and
3. Morgan Stanley Dean Witter (1999). It should be noted that it is very difficult to distinguish between new revenues (that is, fees and interest income) that result directly from an Internet-based e-business solution and revenues that merely replace what would have been originated through traditional channels. Different e-finance sectors have different revenue models for charging for their services, which makes the measurement even more complicated. 4. It should be noted, though, that much of the computing and communication-enabled transformation in relationships among financial institutions and their wholesale consumers was already occurring before the Internet was commercialized.
Table 3-1. Estimated Future Size of E-Finance Market in the United States and Europe, 2003

                                        United States (MSDW, 2003)         Europe (JP Morgan, 2000–04)
Sector                                  Billions of     Penetration        Billions of     Penetration
                                        U.S. dollars    rate (percent)a    eurob           rate (percent)a
Savings/banking                         235             20                                 15 → 35
Mutual funds                            ...             ...                                2 → 30
Brokerage                               32              38                                 35 → 55
Credit card                             4               30                                 3 → 30
Car insurance                           18              15
Personal loans                          ...             ...
Mortgages                               147             15
Life and pensions                       1               15
Home and other general insurance
Total