For further volumes: http://www.springer.com/series/5175
Silke Konsorski-Lang · Michael Hampe (Eds.)

The Design of Material, Organism, and Minds
Different Understandings of Design
Editors
Dr. Silke Konsorski-Lang
ETH Zurich, Department of Computer Science
Universitätstrasse 6, CNB G 107.2
8092 Zurich, Switzerland
[email protected]

Prof. Dr. Michael Hampe
ETH Zurich, Department of Humanities, Social and Political Sciences
Rämistr. 36, RAC G 16
8092 Zurich, Switzerland
[email protected]

ISBN 978-3-540-68995-9
e-ISBN 978-3-540-69002-3
DOI 10.1007/978-3-540-69002-3
Springer Heidelberg Dordrecht London New York

X.media.publishing ISSN 1612-1449
Library of Congress Control Number: 2009940400

© Springer-Verlag Berlin Heidelberg 2010

This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilm or in any other way, and storage in data banks. Duplication of this publication or parts thereof is permitted only under the provisions of the German Copyright Law of September 9, 1965, in its current version, and permission for use must always be obtained from Springer. Violations are liable to prosecution under the German Copyright Law.

The use of general descriptive names, registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

Cover design: KünkelLopka GmbH, Heidelberg, Germany

Printed on acid-free paper

Springer is part of Springer Science+Business Media (www.springer.com)
Acknowledgements
The motivation for this book comes from our lecture series "What is Design?" and reflects the research carried out within the first phase of the Competence Center for Digital Design and Modeling. In one of our steering committee meetings, I proposed to the committee that we publish a book about the different understandings of design. At the same time that we were thinking about such a book and considering publishers, Mr. Engesser from Springer Verlag contacted Michael Hampe, asking if he would be interested in writing a book. This is how we came to meet Mr. Engesser in Zurich and tell him about our idea. The project began with this pleasant coincidence. We then set about refining our ideas and deciding what we really wanted to present in our book. This book would not have been possible without all the work and effort the authors have put into their contributions. The project was accompanied by very good collaboration, discussions, and input. Our special thanks go to Debbie Bregenzer, not only for proofreading all articles, but also for her valuable comments and questions. We are very thankful to our publisher, Springer, for accepting our proposal, and to everybody involved in bringing this book to life. Special thanks go to Dorothea Glaunsinger and Gabriele Fischer for their patient understanding and support of the project.
Contents
Part I Introduction

1 Why Is Design Important? . . . 3
Silke Konsorski-Lang and Michael Hampe

Part II Design of Objects & Materials

2 Product Design: The Design of the Environment and the Surroundings . . . 21
Fritz Frenkler

3 MINI: Empathetic Design for the Future . . . 29
Gert Hildebrand

4 The Design and Development of Computer Games . . . 39
Markus Gross, Robert W. Sumner, and Nils Thürey

5 Drug Design: Designer Drugs . . . 53
Gerd Folkers, Elvan Kut, and Martin Boyer

6 Making Matters: Materials, Shape and Function . . . 65
Paolo Ermanni

Part III Design of Environments for Living

7 The Theory of Dialogical Design . . . 87
Meinhard von Gerkan

8 ETH Future Cities Simulation Platform . . . 95
Jan Halatsch, Antje Kunze, Remo Burkhard, and Gerhard Schmitt

9 Iterative Landscapes . . . 109
Christophe Girot, James Melsom, and Alexandre Kapellos

Part IV Design of Minds

10 Applied Virtuality . . . 119
Vera Bühlmann

11 Text Design: Design Principles for Texts . . . 131
Wibke Weber

12 Synesthetic Design of Music Visualization Based on Examples from the Sound-Color-Space Project . . . 143
Natalia Sidler

Index . . . 155
About the Editors
Silke Konsorski-Lang has been, since March 1, 2008, Institute and Research Coordinator at the Institute for Visual Computing in the Department of Computer Science at the ETH Zurich. Before holding this position, she was the Managing Director of the Competence Center for Digital Design & Modeling from October 2005 onwards. She received a Diploma in Architecture (Dipl.-Ing. Univ.) from the University of Technology Munich in 2000 and a Diploma in Information Architecture (Dipl. NDS ETHZ) from the ETH Zurich in 2001. From 2001 to 2004, she worked as a research and teaching assistant at the Chair of Computer Aided Architectural Design, ETH Zurich. She earned her PhD degree (Dr. sc. techn.) in 2004 for her work on the investigation of video systems, especially 3D video, in the context of architecture. Her research interests include design as a scientific discipline in general, and information and communication technologies, human–computer interaction, video systems, and enhanced environments in particular. She is a co-author of several scientific publications and a co-organizer of the international eCAADe conference in 2010.

Michael Hampe has been, since October 1, 2003, Full Professor of Philosophy in the Department of Geistes-, Sozial- und Staatswissenschaften at the ETH Zürich. He studied philosophy, psychology, and German literature in Heidelberg and Cambridge. He received his M.A. degree in 1984 from Heidelberg. During 1984–1989, Hampe studied biology with a specialization in neurobiology and genetics and worked as an assistant in Philosophy, both in Heidelberg. He was a Visiting Professor in Philosophy at Trinity College, Dublin. He won the German Research Society's Gerhard Hess Development Prize in 1994, and led an interdisciplinary research project on the concept of law in jurisprudence and the natural and social sciences. He was a Fellow at the Wissenschaftskolleg in Berlin from 1994 to 1995. From 1997 to 1999, Hampe was a Professor of Theoretical Philosophy at the Polytechnic University of Kassel, and he held the Philosophy Chair II at the University of Bamberg between 1999 and 2003. Research projects: philosophy and history of the empirical sciences, critical theory and metaphysics, science and public interests, technologies of self-reflection. Books: Die Wahrnehmungen der Organismen, Göttingen 1990; Gesetz und Distanz, Heidelberg 1996; Alfred North Whitehead, München 1999.
Contributors
Martin Boyer has been a scientific researcher, since 2007, at the Collegium Helveticum, where he is working on a dissertation under the supervision of Prof. Folkers. In the framework of the "Tracking the Human" project, he is analyzing the pharmacological concepts of the human being in relation to drug development procedures. Boyer studied Biochemistry and Neuroscience at the Federal Institute of Technology in Zurich and wrote his master thesis on ant navigation with Prof. Wehner. Thereafter, he studied New Media Art at the Academy of Fine Arts in Prague, focusing mainly on installations and photography. Boyer is interested in Biology and Pharmacology, as well as in epistemological and socio-historical concept formation and its implications for scientific practices and methodologies.

Vera Bühlmann is a researcher in media theory and philosophy at the Chair for Computer-Aided Architectural Design at the ETH Zurich. She holds a PhD in Media Science from Basel University. In her thesis, she worked on the relationships between different concepts such as potentiality, the conditions of integrability, functionality, and different/ciation, forms of intuition and forms of construction, with a focus on how these may help in conceiving a medial architectonics. She has been a researcher and lecturer at different Art and Design Schools in Switzerland since 2003, and co-edited a book on research in design and art in 2008: pre-specifics. Some Comparatistic Investigations on Research in Design and Art. jrp|ringier, Zurich 2008.

Remo Burkhard studied architecture and did his PhD on Knowledge Visualization, both at the ETH Zurich. He was a co-author of the Science City Project and was involved in the Strategic Planning Process 2008–2011 of the ETH Zurich. From 2003 to 2007, he was a project manager at the University of St. Gallen, where he founded and built up the Competence Center Knowledge Visualization at the Institute for Media and Communications Management. There, he was also responsible for the executive training program. Burkhard is a founding partner of vasp datatecture GmbH, a company working in the area of visualizing business contents. He has published various scientific articles, and has initiated and organized many international research workshops.

Paolo Ermanni has been Full Professor of structure technologies at the ETH Zurich since April 1, 2003. He studied Mechanical Engineering at the ETH Zürich and received his Dr. sc. techn. degree at the ETH Zurich in 1990 under the guidance of Prof. Dr. M. Flemming. He spent more than five years at DaimlerChrysler Aerospace Airbus GmbH in Hamburg as a senior engineer and, later on, as a project manager. At Airbus, he mainly dealt with structural and technological issues related to the realization of a second generation of civil supersonic aircraft. In 1997, he took up a new challenge as a manager in the consulting firm A.T. Kearney in Milan. He was appointed associate professor at the ETH Zurich in 1998. His areas of interest are lightweight and adaptive structures made of composite and smart materials. His current research focuses on two areas: one is the design, modeling, and characterization of novel material systems and processes, and the other is computational structural mechanics, including the development of novel optimization algorithms in conjunction with the numerical simulation of structures and manufacturing processes.

Gerd Folkers has been Full Professor of Pharmaceutical Chemistry at the ETH Zurich since 1994, after serving as Associate Professor from 1991. He took the Chair of the Collegium Helveticum on the first of October 2004. Folkers studied Pharmacy at the University of Bonn, where he subsequently earned his doctorate. With his doctoral advisor, he relocated to the University of Tübingen in 1983 and remained there till 1989. He studied new research methods in computer-aided molecular design in Berne, London, and at Texas A&M University in College Station. His research focuses on the molecular interaction between drugs and their binding sites in vivo. Folkers is particularly interested in the strong integration of computer-aided modeling and relevant biochemical/biophysical experiments. Besides research on the molecular mechanism of «conventional» nucleoside therapeutics against virus infection and cancer, his interest has now shifted to immunotherapeutics. He is an author and editor of diverse scientific books, an elected member of the "Schweizerische Akademie der Technischen Wissenschaften", and a member of the boards of many international scientific societies.

Fritz Frenkler graduated with a degree in Industrial Design from the Braunschweig University of Art. He worked as president of frogdesign Asia, led the wiege Wilkhahn Entwicklungsgesellschaft, and was the chief designer of Deutsche Bahn AG. In 2000, Fritz Frenkler and Anette Ponholzer established f/p design deutschland gmbh, followed by the opening of f/p design japan inc. in 2003. Fritz Frenkler is a regional advisor of ICSID, has been chairman of the iF product design awards jury for many years, and is a founding member of universal design e.V. Since 2006, Fritz Frenkler has been University Professor holding the Chair of Industrial Design at the Technische Universität München (TUM).

Meinhard von Gerkan completed his architectural studies in 1964 at the Carolo Wilhelmina Technical University in Braunschweig. Since 1965, he has worked as a freelance architect with his partner Volkwin Marg. Together they have completed more than 260 buildings and won 430 prizes in national and international competitions. Over the years, Meinhard von Gerkan has taught at various universities in his capacity as a Professor or a Visiting Professor, including the Free Academy of Arts in Hamburg and Nihon University in Tokyo, Japan. The establishment of the gmp Foundation in 2007 to promote the training of architects is regarded as one of his most important projects.
Christophe Girot studied environmental planning and management at the University of California and landscape architecture at the University of California, Berkeley. He has been Professor at the Chair of Landscape Architecture at the ETH Zurich since 2001, after having taught there as a visiting professor in 1999 and 2001. In addition to his academic activities, Christophe Girot is a practicing landscape architect and owner of Atelier Girot in Zurich.

Markus Gross is a Professor of computer science at ETH Zurich, head of the Computer Graphics Laboratory, and the director of Disney Research, Zurich. For more than 20 years, Prof. Gross has been pursuing basic and applied research in computer graphics, image generation and display, geometric modeling, and computer animation. His research interests include point-based graphics, physically-based modeling, immersive displays, and 3D video. Prof. Gross received a Master of Science in electrical and computer engineering and a PhD in computer graphics and image analysis, both from Saarland University in Germany. From 1990 to 1994, he was a senior researcher at the Computer Graphics Center in Darmstadt, where he established and directed the Visual Computing Group. He also co-founded Cyfex AG, Novodex AG, LiberoVision AG, and Dybuster AG.

Jan Halatsch graduated from the Technical University of Dresden in 2004. He works with computer graphics for his current postgraduate master's in Computer and Information Science at the University of Konstanz. Since 2006, Jan has held a position as a research and teaching assistant at the Chair for Information Architecture. He is a project manager for the Value Lab at the ETH Zurich. As an information architect, Jan gained working experience in various architecture- and media-related fields. He is also a feature writer for the digital content creation magazine Digital Production. He is currently working, teaching, and publishing on shape-grammar-based modeling, crowd simulation, and design theory related to knowledge architecture and sustainable architecture.

Gert Volker Hildebrand, the Chief Designer of Mini, has had a prolific and influential career in automotive design. After completing university studies in engineering and design in Karlsruhe, Braunschweig, and at the RCA London, Hildebrand joined Opel, where he was part of the Kadett E design team and the Opel JUNIOR show-car team in 1983. Then the Lörrach, Germany-born designer moved to the Volkswagen Design Center, where he worked on the Golf III and IV, in addition to the Sharan face-lift and the Bora. In 1995, Hildebrand was appointed Chief Designer of the Volkswagen-owned Seat Automotive Co. in Martorell, Spain. At Seat, besides designing the 1998 Toledo and Leon models, he also created the new face and the future design direction for Seat. After a successful career at the Volkswagen Group, he ventured to the Mitsubishi European Design Center, where he was Chief Designer. Hildebrand also has experience in the automotive supply industry, having worked in the European subsidiaries of 3M and JCI. Since the beginning of 2001, Hildebrand has led the Mini Design Studio at BMW in Munich.

Alexandre Kapellos studied architecture at the EPF in Lausanne and CAAD CAM technologies at ETH Zurich. He has been teaching CNC topographical modeling at the Chair of Landscape Architecture at ETH Zurich since 2005 and working as a freelance architect since 1999.
Antje Kunze studied architecture and graduated from the Technical University of Dresden in 2004, basing her thesis on Knowledge Architecture. After her degree, she worked as a writer for the digital content creation magazine Digital Production. Antje obtained her master's degree in Computer and Information Science at the University of Konstanz. Since 2006, she has been a project manager for the Value Lab at the ETH Zurich. Furthermore, she is a research and teaching assistant at the Chair for Information Architecture. Her main interests in this field are simulations of sustainable architecture, procedural modeling, and knowledge architecture in the real and virtual worlds.

Elvan Kut has been a scientific researcher at the Collegium Helveticum since 2004 and is now a member of the executive board. Born in Zurich, she studied Pharmaceutical Sciences at the ETH Zurich. In her dissertation, she worked on the emotional modulation of pain perception, focusing on endogenous opioids as neurochemical mediators between pain and emotion. Together with Amrei Wittwer and Nils Schaffner, she received the German «Förderpreis für Schmerzforschung 2007». She is a co-author and co-editor of several scientific publications.

James Melsom studied Architecture at UWA in Australia and completed the MAS in Landscape Architecture at the ETH Zurich. Since 2001, he has collaborated on landscape architecture projects throughout Southeast Asia and Europe. Since 2007, James has taught landscape design and digital methodologies at the Chair of Landscape Architecture at the ETH Zurich.

Gerhard Schmitt has been Professor of Information Architecture at the Department of Architecture since December 2005. Since 1998, he has been Vice President for Planning and Logistics of ETH Zurich. Before that he was Professor of CAAD at ETH Zurich, where his teaching included CAAD, CAAD Programming, CAAD Practice, and postgraduate seminars. His research focuses on the development of intelligent design support systems and the architectural design of the information territory. His main books are Architectura et Machina (Vieweg, 1993), Architektur mit dem Computer (Vieweg, 1996), and Information Architecture (Testo & Immagine, 1998). From 1984 to 1988, he was on the Faculty of Architecture at Carnegie Mellon University, and from 1993 to 1994, he was a Visiting Professor at Harvard University.

Natalia Sidler studied piano at the Zurich Conservatory and Theatre Zurich and at the University of Arts in Berlin with Prof. Georg Sava, obtained her master's degree at the Basle Academy of Music, and completed her studies in classical song accompaniment with Irwin Gage and Wolfram Rieger. Sidler plays the piano, the ondes martenot, the color-light piano, and the moog in different formations, and is the co-founder of the Ensemble Musique Brute and Klingenberg, specializing in free improvisation. Since 1995, she has been an assistant professor at the Berlin Academy of Arts. She directed the music of the «Farboper Violett» (1914) by Wassily Kandinsky. Since 1998, she has been teaching contemporary improvisation and piano at Zurich University of the Arts (ZHdK).

Robert W. Sumner is the associate director of Disney Research Zurich and leads the research group on animation and interactive graphics. He received a B.S. (1998)
in Computer Science from the Georgia Institute of Technology in Atlanta, Georgia, and his M.S. (2001) and PhD (2005) from the Massachusetts Institute of Technology (MIT) in Cambridge, Massachusetts. He spent three years as a senior researcher at ETH Zurich before joining Disney. Robert is an adjunct lecturer at ETH and currently teaches a course called the Game Programming Laboratory, in which students work in small teams to design and implement novel video games.

Nils Thuerey is a post-doctoral researcher at the Computer Graphics Laboratory of ETH Zurich. His main research focus is the development of algorithms to efficiently simulate realistic fluids in real-time and off-line environments. Working together with the research group of AGEIA/NVIDIA, he has developed fast algorithms to simulate three-dimensional fluid effects in computer games. He has been involved in the ETH game programming laboratory since it started in 2007.

Wibke Weber, PhD, is Professor for Information Design at Stuttgart Media University, Germany. Her major subjects are professional writing, text design, and convergent media. She worked as a radio journalist and multimedia editor for several German public-broadcasting stations. Weber is a board member of the International Institute for Information Design (IIID), and editor and co-author of the book Kompendium Informationsdesign (Springer 2007). Her current research interests comprise information graphics and multimedia storytelling.
Part I
Introduction

1 Why Is Design Important? An Introduction
Silke Konsorski-Lang and Michael Hampe
1.1 State-of-the-Art Design Research

Although design is as old as the human race itself, pervades our lives, and is fundamental to many different disciplines, the concept "design" is often vaguely defined, and the way in which it is understood and applied within these various disciplines diverges substantially. Design is a commonly shared key component of many diverse disciplines such as science, engineering, management, and architecture – to name just a few. For example, there is system design in engineering, algorithm design in computer science, process design in management, creative design in architecture, and self-organizational design in biology. However, within these fields the way in which design is understood and utilized differs significantly. Design research, including design science and design methodologies, is a wide and comprehensive field based on both expertise and formulated terminologies that are specific to a discipline. Even though research on design can be traced back to the early 1960s, it still requires extensive research, now more than ever. The design methods developed in the 1960s and research into artificial intelligence in the 1980s provided some advances, but they did not have a significant practical impact. The fundamentals and principles of design remain relatively little understood. Surprisingly little effort has been made to investigate either the fundamental issues or the foundations of design and to formulate specific criteria to establish
it as an extensive scientific concept and discipline in its own right. For instance, in some specific engineering disciplines, such as user interface design, handbooks with detailed design instructions do exist already. However, a holistic understanding of design would enable completely new perspectives of and approaches to diverse disciplines, including those of architecture, engineering, management, and natural science. Until now, the potential of merging knowledge from various disciplines has only rarely been investigated.
1.1.1 What is Design Science?

Design is typically concerned with creating things that people want. As the initial brainwork is normally unseen by those outside the thought process, design is often seen as a procedure related to material things. According to S. A. Gregory, the fundamental idea behind design is building a structure, pattern, or system within a situation (Gregory 1966). And there are many examples of this. Engineering design is goal oriented and concerned with the process of making artifacts and complex systems for expert use. In natural science, design abides mainly by the laws of nature. However, engineers, technologists, and scientists, as well as architects, artists, and poets are all involved in design processes. These processes are more or less creative, but all imply that thinking ahead is a significant component of the process. Many authors and scientists have sought to define the term design. The following list, which is certainly not exhaustive, summarizes some of the statements
about design found in the literature. According to these, design is:
- An art form
- An applied science
- A process with an input and an output
- A goal-directed problem-solving and decision-making activity
- A deliberately intended or produced pattern
- Creativity and imagination
- Satisfying needs
- Drawings, sketches, plans, calculations
- Foresight toward production, assembly, testing, and other processes
- Managing, learning, planning, and optimizing
- Collecting and processing data
- Transferring and transforming knowledge.
Research on design may have originated when, in 1872, Viollet-le-Duc recognized that design problems were becoming so complex that the designer's intuitive grasp was no longer sufficient to solve them (Heath 1984). Design research is concerned with the study, research, and investigation of man-made artifacts and systems. Within the manufacturing industry, design has been formally acknowledged as a separate discipline for the last 150 years. This is particularly true for the field of engineering, where scientific developments, especially those that occurred in the early 1940s, made significant contributions toward solving design problems. Multidisciplinary teams consisting of engineers, industrial designers, psychologists, and statisticians were set up. Initially, the focus of design research was on improving classical design by using systematic design methods. The Design Research Society was founded in London in 1966. In 1970, the Environmental Design Research Association was established. Their research involved evaluative studies of architecture and environmental planning. At the Portsmouth DRS Conference, L. Bruce Archer defined design research as "systematic inquiry whose goal is knowledge of, or in, the embodiment of configuration, composition, structure, purpose, value, and meaning in man-made things and systems." (Archer 1981, pp. 30–47). However, since the 1990s, the focus has shifted to automated design. This novel approach transforms information about design problems into detailed specifications of physical solutions, using computers in an attempt to solve the particular problem. For instance, research in cybernetics has influenced design methodologists and theoreticians such as L. B. Archer and Gordon Pask. They drew similarities between designers' design behavior and organisms' self-control systems (Archer 1965; Pask 1963).
1.1.2 Design Science and its Origins

Design science is a systematic approach that seeks an appropriate design methodology. This design methodology is a pattern of work, which is independent of the discipline and offers a means of solving various problems.
1.1.3 Related Work

Regarding design and science, there are two periods of special interest: the 1920s with their investigations of scientific design products and the 1960s with their research on scientific design processes. It is interesting to note that in the 1920s, Theo van Doesberg and Le Corbusier already had the desire to bring science and design together (van Doesberg 1923; Le Corbusier 1929). Both produced works based on the values of science: objectivity and rationality. The subsequent investigation of innovative design methods had its origins in the problems thrown up by the Second World War. Novel, scientific, and computational methods were then investigated and applied to new and pressing problems. In the 1960s, the disciplines of urban design, graphic and interior design, industrial design, and engineering recognized what nowadays is commonly understood as design, and it became a discipline in its own right. The vast number of initiatives during that decade testifies to this quickly growing awareness: the Conference on Design Method in 1962 (Jones and Thornley 1963), Christopher Alexander's PhD on the use of information theory in design in 1964 (Alexander 1964), the Teaching of Design – Design Methods in Architecture conference in Ulm at the Hochschule für Gestaltung in 1966, and in 1967 the creation of the Design Methods Group at the University of California, Berkeley, the International Conference on
Engineering Design by Hubka, as well as The Design Methods in Architecture Symposium in Portsmouth, which all took place in the same year (Broadbent and Ward 1969). Buckminster Fuller was probably the first to coin the term Design Science. S. A. Gregory adopted it in 1965 at the conference on The Design Method (Gregory 1966). Gregory defined Design Science as the study of design in theory and in practice, in order to gain knowledge about design processes, about design procedures to create material objects, and about the behavior of its creators. According to his rather general description (which applies to digestion as well), design is a process that has an input and an output (Gregory 1966). In 1967, Hubka established the International Conference on Engineering Design where he introduced the scientific approach to engineering design methods as design science for the first time. Design science was described as a system consisting of logically related knowledge. This system was intended to organize the knowledge gained about designing.
1.1.4 Design Methodology

According to Cross (1984), design methodology refers to the study of principles and procedures of design in a broad and general sense. It is concerned with how design is carried out. In doing so, it analyzes how designers work and how they think. The aim is to make rational decisions that adapt to the prevailing values. This is achieved by looking at rational methods of incorporating scientific techniques and knowledge into the design process. Design methodology became important as a research topic in its own right at the Conference on Design Method in 1962 (Hubka and Eder 1996). In 1964, Christopher Alexander published his PhD thesis "Notes on the Synthesis of Form" in design methods (Alexander 1964). His approach to solving problems was to split design problems into small patterns. In doing so, he applied information theory. In 1967, the Design Methods Group at the University of California, Berkeley, was founded. Over the next decades, design methodology gained in importance, especially in engineering and industrial design. During this time, design as a research topic became common in Europe and the US. In 1966, the Teaching of Design – Design Methods in Architecture conference
was held in Ulm at the Hochschule für Gestaltung. The Design Methods in Architecture Symposium was held in 1967 in Portsmouth (Broadbent and Ward 1969). Design methods, together with artificial intelligence, received another impetus in the 1980s. During that decade and also in the early 1990s, a series of books on engineering design methods and methodologies and new journals on design research, theory, and methodology were released. To name just some of them: Design Studies (1979), Design Issues (1984), Journal of Engineering Design (1990), Languages of Design (1993), and Design Journal (1997). The most relevant and important design methodologists during this period were Morris Asimow, John Christopher Jones, Nigel Cross, L. Bruce Archer, T.T. Woodson, Stuart Pugh, and David Ullman. The first design methodologists were scientists and designers, who made their investigations to find rational criteria for decision making with the aim of optimizing decisions. Design methodologies were also used to offer appropriate methods for supporting creativity. Horst Rittel, a second-generation methodologist, proposed problem identification methods that were influenced by the philosopher Karl Popper. His approach differed from earlier attempts by incorporating user involvement in design decisions and the identification of user objectives.
1.2 Knowledge through Contemplation and Action

These developments took place in a context of art and technology, a context that was distinguished sharply from science. It is an ancient idea that those who can design and make things, those who have a "techne," do not possess the "right" or the "real" knowledge or "episteme" about things compared to persons who can talk about things after they have contemplated them and gained insight into their essence (Aristotle 1924, 981a). Thus, a shoemaker who designs and makes a shoe does not, according to Aristotle, necessarily have an insight into the essence of a shoe compared to a philosopher who contemplates what shoes are made for, what their purpose is, and what makes a shoe a good shoe. This Aristotelian view of devaluing practical or technical knowledge and favoring contemplation as
real knowledge came under pressure with Dewey's pragmatism (Dewey 1986). In his sociology of knowledge, Dewey unmasked the epistemic difference between contemplation and doing as one that originated from the attempt to privilege the knowledge of the priesthood and to devalue the knowledge of craftsmen and workers in a society based on slavery. In this way it was possible to secure the privileges of a class of men that did not do any physical labor, but that was in fact, in its material self-preservation, dependent on a class of enslaved people. Dewey thought that the hierarchy of knowledge by contemplation over knowledge by doing survived not only the abolition of slavery, but also the disappearance of essentialism. The distinction between the pure and the applied sciences, between universities and mere polytechnics, still pays tribute to this hierarchy of different forms of knowledge. But what happens if things have no essence to contemplate and if the people who are creating things are no longer socially dependent on those who merely contemplate? Dewey's answer was clear: as soon as this becomes obvious, one sees that all knowledge is in fact gained by doing or designing things. Dewey thought that this insight should also have pedagogical consequences: learning should happen by doing things and not by telling pupils about things. According to Dewey, knowledge, including "pure" theory, is an instrument for action and problem-solving (Dewey 1986). Since then, a great variety of theories of knowledge have developed, which all consider knowledge a product of construction and not of contemplation, as something that is actively crafted by man and not passively conceived by the mind. The varieties of constructivism not only made knowledge into something that human beings design with their minds but also led to a relativization of the distinction between pure and applied science. Especially the concentration on the technological foundations of experimental science, the insight that science is as much about representing as it is about intervening in the world (Hacking 1983), reshaped the view of the relation between theory and technology in the philosophy of science. If human beings are engaged in processes of design when they create theories, experiments, or machines, then in what sense is the knowledge that is necessary for developing a machine or a building subordinate to the one that is needed in order to create a theory? Is the complexity involved in the design of a
machine not much greater than the one involved in creating a “pure theory?” Why should the design of an experiment that is a stage in the design of a theory be considered as pure science, whereas the design of a material, a machine, or a building is “only” applied science according to the cascade model of knowledge that starts with theory on top (Bacon 1990)? Recent investigations into the nature of the relation between science and technology suggest that the design of a technical gadget is very often much more than an application of theoretically prefabricated knowledge and that even theoretical insights that are as “pure” as Einstein’s relativity theory are not gained independently from technical problems (Gallison 2003, Carrier 2006, pp. 15–31). A complex technical problem, such as the one Einstein was facing when he thought about the synchronization of the clocks in the railway system, can lead, if it is seen against the background of the general knowledge of the field (in Einstein’s case against the background of physics of moving bodies), to fundamental theoretical innovations. Thus, trying to solve a concrete technical problem or a problem of design can lead to very general new knowledge.
1.3 Design of Languages and Worlds

Theories are often considered as structures in a language. As long as languages were considered as naturally given, constructing a theory was working in something that was not designed. This need not mean that one does not consider theories as products of design. For a machine is designed in a material, such as metal, that need not itself be designed. But since the mathematization of science, the picture has changed. It was Newton who invented or designed his own mathematics for his physics of accelerated bodies, the infinitesimal calculus. Since then physics has been dominated by the artificial, man-designed languages of mathematics. The "natural" language or ordinary language plays only a pedagogical role in physics. Since the development of computers, the design of languages and machines for solving scientific problems has become even more prominent. The printed circuit copied on a silicon board transforms the representation of a machine into a real machine that can solve problems in a language that is man-made, the
Boolean algebra. Designing a language for programming a computer, designing a computer as a material machine, and solving a theoretical problem of science can become very tightly connected tasks in those areas that use computer simulations. Insofar as we consider systems that solve problems by using symbolic representations as minds, the design of artificial languages and artificial symbolic problem solvers is the design of artificial minds. But the design of minds and languages is not a specialty of the epoch of artificial intelligence. Although every person is born into a "natural language," there is hardly anybody who does not react to this language by deviating from it. In most people this will not lead to intentional design of languages. But for any poet the natural language is a material that is to be changed into something else: a designed language that serves different purposes and shows different things in a different light than the undesigned natural language. At the very beginning of European fiction, in Homer's epics, this design is obvious. As the stories of these epics were conveyed orally for a long time, the language in which they were told had to support the memory of the singer. In order not to mix up events and characters, a verse was produced, as well as phrases to fit this verse that could not be exchanged easily. Thus, Odysseus is always the sly one and Achilles the fast runner. It has been suggested that this design of a verse and mode of description that serves memory also influenced the way in which the people who told and listened to these stories perceived their world (Feyerabend 2009, pp. 107–156). Thus, the design of a language forms the minds of the users of this language as much as the minds of the users (e.g., their capacity to memorize things) form the language they design. If we consider experienced worlds as the result of the way minds shape languages to describe the world and the way languages shape minds to experience the world, then we can say that the process of designing languages and minds is a process of designing worlds. "If a new way of speaking spreads, it will affect the mental life and the perception, and man finds himself in a new environment, perceives new objects, he is living in a new world" (Feyerabend 2009, p. 169). One famous version of constructivism, the one developed by the American philosopher Nelson Goodman, says exactly this: worlds of experience (and these are the only ones we know of) are made,
and making languages is a way of making a world (Goodman 1981; Steinbrenner et al. 2005). Critics of this view may say that the world we experience is the product of a natural development, whereas the worlds man is able to create by designing languages are artificial. This is true under the presupposition of a superficial understanding of the "natural" and the "artificial," an understanding that is challenged by recent developments in design and simulation.
1.4 The Natural and the Artificial

Creating intentionally artificial worlds is an activity in which humans have probably always been engaged. Plays, paintings, and epics are artificial, man-made worlds. The programming of the artificial worlds in computer games is just the latest version of this creative activity. When we look, on the other hand, at the ways human beings thought the world they did not create themselves came about, we have three fundamental models: the world did not originate at all, but was from eternity and will be in eternity – the Aristotelian Model (1); the world developed in a process of evolution that involved elements of chance – the Democritean Model (2); and the world was designed intentionally by a designer – the Platonic Model (3). Today, models (2) and (3) are favored: physical cosmology and biological evolutionary theory develop modern versions of model (2), and Judaism, Christianity, and Islam favor model (3). Therefore, we consider model (2) to be a naturalistic one in which the world came about by a natural process without any intentions involved, whereas the religious views are considered supernaturalistic, because they involve a non-human, divine intention as responsible for the design of the structures of the world we consider to be natural. The fact that the world seems to have an order and that it contains many things useful to man was long – in the so-called argument from design – considered to be an indication (or even a proof) of an intelligent creator who designed the world in such a way that man can live in it (from Thomas Aquinas in the thirteenth century till William Paley in the nineteenth). David Hume criticized this argument in his "Dialogues Concerning Natural Religion" from 1779. Do we not know as many or even more principles of creating order besides intelligent design, like growth or instinct
or generation? Hume asked (1948, p. 49). And if we suppose that a divine mind designed the world by planning it, how did he produce an order of ideas in his intellect? Since Darwin's theory of evolution and the theories of natural self-organization, design has disappeared entirely as a principle for explaining natural order. What happened instead is that evolutionary principles became tools for intentional design. For if we now look at modern design processes and at modern epistemology, the picture of the relation between the natural as the unintentionally created and the artificial as the intentionally created becomes much more complicated. Some modern methods of designing and simulating things use evolutionary algorithms. In these algorithms, the computer applies evolutionary strategies such as reproduction, variation (mutation), recombination, and selection to structures in order to simulate and design things like materials, markets, organisms, pharmaceuticals, biological populations, states, and much more (Ashlock 2006); a minimal code sketch of such a loop is given at the end of this section. These strategies are at the same time believed to be the most fundamental mechanisms behind biological evolution, i.e., behind the "natural production" of organisms. Darwin originally found his theory of evolution by applying observations about the social development of human populations from Malthus and about the methods of breeding on farms onto wild nature (Bowler 2003). His term "natural selection" already indicates that the process of selection was first considered a cultural one: the intentional selection of plants and animals for breeding by the farmer. Selection as a method of design was thus first intentional (on the farm), then discovered to happen also nonintentionally in "wild" nature; by implementing evolutionary algorithms in a computer, design becomes "semi-intentional": the designer, e.g., in search of a material for constructing an airplane, intentionally installs an intentionally developed algorithm that searches unintentionally for the best mix of components for a material in a computer. The autonomy of the transformation processes of algorithms in computers makes computer-aided design a semi-intentional process: natural processes are intentionally simulated in order to optimize design processes. If we now take into account that our view of natural processes is increasingly shaped by the processes of simulation and design in computers, we see that the border between the natural and the artificial becomes increasingly blurred: nature, originally seen as a product of
design (in Model 3) and now seen as a product of evolution, is imitated in its creative potential in evolutionary algorithms that are installed in artificial brains that shape our view of nature. Perhaps the distinction between the natural and the artificial that dominated western thought in a normative way for many centuries will disappear or at least become a superficial one if we develop a deeper understanding of the processes of design and creativity in general. If we take into account that the most advanced computers, those that use evolutionary algorithms, are also able to learn, i.e., to develop their own minds, once they have been set up by man, we get an even more complicated picture. Man’s mind has been developed by biological and cultural evolution in such a way that human beings were able to design artificial minds intentionally. These artificial minds were considered most efficient if they ran evolutionary strategies and were able to undergo developments that are not planned intentionally. The way these artificial minds develop will, the more they are used in science, shape the way man sees the world. Thus, the human mind of the future and the future human view of the world will be developed in part by the artificial minds man designed himself in such a way that they can develop in a quasi natural fashion. This blurring of the distinction between the natural and the artificial has led theoreticians like Bruno Latour to the idea that the whole concept of purely natural and purely artificial things is a fiction to be replaced by the idea of hybrids (Latour 2000). The vision of a man who is using glasses is as much a hybrid as the thinking of a scientist about the world that is aided by a computer.
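To make the evolutionary design loop described above concrete, here is a minimal sketch in Python. The "material mix" representation, the component weights, and the fitness function are all invented for illustration; an actual material-design system would replace the fitness function with a validated simulation of the material's behavior.

```python
# A minimal sketch of an evolutionary design loop: reproduction, variation
# (mutation), recombination, and selection applied to a hypothetical "material mix".
import random

N_COMPONENTS = 4      # hypothetical number of material components
POP_SIZE = 30
GENERATIONS = 50

def random_mix():
    # A candidate design: fractions of each component, normalized to sum to 1.
    raw = [random.random() for _ in range(N_COMPONENTS)]
    total = sum(raw)
    return [x / total for x in raw]

def fitness(mix):
    # Stand-in objective: an arbitrary weighting of the components.
    # In practice this would be a simulation of the material's behavior.
    weights = [0.2, 0.5, 0.1, 0.9]
    return sum(w * x for w, x in zip(weights, mix))

def recombine(a, b):
    # Recombination: average the two parents' component fractions.
    child = [(x + y) / 2 for x, y in zip(a, b)]
    total = sum(child)
    return [x / total for x in child]

def mutate(mix, rate=0.1):
    # Variation (mutation): random perturbation of one component fraction.
    child = mix[:]
    i = random.randrange(len(child))
    child[i] = max(0.0, child[i] + random.gauss(0, rate))
    total = sum(child)
    return [x / total for x in child]

population = [random_mix() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    # Selection: keep the better half of the population.
    population.sort(key=fitness, reverse=True)
    survivors = population[: POP_SIZE // 2]
    # Reproduction: refill the population with recombined, mutated offspring.
    offspring = [mutate(recombine(random.choice(survivors), random.choice(survivors)))
                 for _ in range(POP_SIZE - len(survivors))]
    population = survivors + offspring

best = max(population, key=fitness)
print("Best mix found:", [round(x, 3) for x in best])
```

The designer's role in such a loop is limited to setting up the representation, the objective, and the strategy parameters; the search itself runs without further intention, which is exactly what makes the resulting design process "semi-intentional."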
1.5 Pursuit of Perfection

1.5.1 What is Design?

In English the term design is used as both a noun and a verb. As a noun, a design mostly refers to the final product or the result of a design process. As a verb, to design refers to the process of creating the product. Designing includes the consideration of esthetic and functional aspects. In English and French, the terms for design translate more to Gestaltung and Entwurf, whereas in
Italian disegno and Spanish diseño relate more to an approved activity. Translating design into German results in a broad range of meanings. In the German language, the term design relates to things and is targeted at their formal and artistic aspects, whereas in Anglo-Saxon usage design also involves technical and constructional aspects. Design itself is an iterative, creative, but controlled process. It needs clear definitions and controlled aims. In all disciplines the role of the designer involves specifying the need, describing the vision, and producing the result. So far, within design there has been a strong differentiation between theory and practice. Until now, research on design theory has had little impact on practice. Designers in practice have, therefore, operated free from any design theory. Yet designers do use empirical insights, concepts, logical systems, and their accumulated experience, all of which are applied in practice for making decisions; these are often misinterpreted as intuition. Design theory, however, deals with design on a different level than design in practice. Research on design investigates models to explain and to assemble design experience in practice. The goal is to gain insights that can be used in practice in the future. The proposed theories, however, necessarily have to be generalized and border on the limits of descriptiveness.
1.5.2 Perfection by Design

It is noticeable that, across disciplinary boundaries, designers work towards an improvement and a perfection of their products. Admittedly, the design itself will never be perfect; only the imperfection can be minimized. Since design conforms to constraints, requires choices, and involves compromise, it will never be perfect. Nowadays design is used in most areas to increase user satisfaction, brand identity, and competitiveness in the sense of being better, quicker, cheaper, etc., than others. Armstrong defined design as the essential part of the creative process of engineering that makes it distinct from science (Armstrong 2008). The design process in engineering involves imagination, creativity, knowledge, technical and scientific skills, and the use of materials. Creativity requires the ability to think
laterally, to anticipate the unexpected, to delight in problem solving, and to enjoy the beauties of the mind as well as of the physical world. But what makes a design good? Are there parameters or models that define whether a design is good or poor? In many disciplines evaluation criteria exist to assist in achieving high-quality results. Principles exist that are fundamental to the discipline and that have to be fulfilled. In engineering, for example, it is possible to identify basic principles that can be applied to any other discipline when it comes to the initiation of work or the testing of design decisions. These principles in engineering should not be confused with postulates, definitions, hypotheses, standards, or rules. However, design is related to art; therefore, it is difficult to quantify and model it completely. No checklist of rules or fixed set of questions exists that can be applied or answered to determine that a design is good. Fundamental principles are generally well known to experienced designers, but may not have been clearly formulated. Principles in design are intended to provide assistance in the context of the design. They are not scientific hypothetical principles and are not necessarily rooted in physics and mathematics. But because man has always applied technologies onto himself, processes of design have led to ideals of human perfection as well. Perhaps the concept of perfection is most intimately linked with human self-understanding. Sportsmen have shaped or designed their bodies since ancient times according to certain ideals, and the design of drugs that will enhance our mental capacities and emotional makeup is already under way, i.e., man has started to design himself mentally and physically according to ideals of physical and mental perfection. In the Mosaic religions, man is designed by God according to his image, and he has no right to shape himself according to other images, which would be "unnatural" or a violation of piety in this view. But in pagan Greece and in modern times, when man sees himself as a product of evolution, this is different. Knowing the rules of evolution and considering themselves at the same time as free beings (which some philosophers consider a contradiction), human beings may take every liberty to improve themselves, e.g., their genetic material by genetic engineering or design. Perhaps in the future ideals for the genetic design of man will develop, such as already exist for the so-called lower organisms.
It could well be that the industries that develop a means of designing oneself physically and mentally will also develop ideals of perfection in order to make their products successful on the market. By such means, at least in capitalist societies, the ideals of perfection in design will probably always be connected with marketing strategies.
1.6 Design Parameters

Unlike recognized scientific disciplines, which study what already exists, the field of activity in design, and respectively the discipline of design, can hardly be reduced to a common denominator. Science rests on scientific methods that obtain and test knowledge covering general truths or the operation of general laws (Webster's New Collegiate Dictionary). In science, something can be proven either by observation or by measurement. This means that if design is a science, we have to investigate a model that describes design in terms of a logical representation. If we assume that man-made design is a production process and its objectives are not simply the creation of physical objects but also all sorts of processes, services, interactions, entertainment, and ways of communicating and collaborating, we can recognize that design is one process step to optimize the product, bring it to perfection, and create value. The processes therefore determine the quality of the products. The improvement of the products calls accordingly for the improvement of the processes. Consequently, not only the products have to be redesigned, but so does the way we design. In order to improve our designs we therefore have to understand what we do and how we do it. Designing as a process is more or less creative. This usually includes the intuitive, iterative, recursive, opportunistic, innovative, ingenious, unpredictable, refined, striking, novel, and reflective, and also a search for elegance and beauty (Schoen 1983). The design process could be seen as the management of negotiable and non-negotiable constraints. Design as a process has many different forms depending on the resulting product and the discipline. Each design group has developed a method for solving problems that evolved over time. Depending on the school of thought, different groups look at the problem from different perspectives. The results differ and so do their goals, as well as the
scales of the projects and the methods they use. Even the actions appear to be different. However, looking at different design processes, we notice that general similarities often appear in their approaches. That is to say, every process can be structured with the same few laws. Therefore, fundamental patterns exist within the process they follow. So if design is a process, the design process is the transformation process (method) between an input and an output. Assuming that design is a process, it fulfills the usual process definition. Processes can be defined as the way taken to achieve an end, and accordingly, the individual steps can be described as process skills. If the end is the response to a defined need, it can be called the design process. The design process contains three basic elements: inputs, outputs, and, in between, the method used. This may seem obvious, but identifying these three basic elements within design helps to improve the operation. Furthermore, once these elements are made clear and roles are defined in advance, the probability of success is increased and the risks are reduced. Uncertainties and fears can be narrowed down, and results can easily be improved, repeated, and modified by identifying and fixing broken processes. Therefore, design can be made quantitatively measurable, and it can be evaluated and optimized. Nevertheless, it is important not to restrict creativity.
1.6.1 The Design Process
People often think that the designer has an idea, does some indescribable things (creativity), and suddenly the result appears (see Fig. 1.1). From this assumed course of action the following simplified process can be derived (Fig. 1.2): there is an input and an output, and in between a process, a transformation. This simplification tidies up a complex approach and may suggest an illusion of linearity. A basic abstraction of the design process is shown in Fig. 1.3. First there is an input, a clearly defined need or desire. The output is the response to the need or desire, for example, a product, system, project, product description, or the use of something. To get from the input to the output, we need a method, the design process. The design process transforms the need into the result. The process consists of two
basic parts: activities and resources. The overall process is convergent, but within the process there are periods of deliberate divergence.
Fig. 1.1 Creative design process: some indescribable things suddenly result in a solution
Fig. 1.2 Simplified design process: there is a stated need, and between the need and the result there is the process
Fig. 1.3 Basic abstraction of the design process
1.7 Design and Evaluation
If things are designed because of human needs and if human needs change because humans are confronted with new things or live in changing worlds, then there can be no eternal standards for evaluating designs. A design is good or bad relative to a fixed system of desires in a fixed world. But the constant use of a new design will change the needs of those using it, and the world from which the design originated. For
instance, people's desire to communicate with each other was changed by inventions like the telegraph and the telephone; these led to developments like the Internet and e-mail, which will again change the ways people communicate and the desires they have. The history of design is often described as a progressive process: the telephone is progress in communication over the letter, the Internet progress over the telegraph, e-mail progress over the telephone, and so on. But the mere fact that letters, telephones, and e-mail all exist side by side shows that the development of designed things is not necessarily one of linear improvement. If the design of things is a more or less intentional or unintentional design that is derived from human desires and from experienced worlds, then it is also a design of principles of evaluation.
Fig. 1.4 Evolution of the automobile: From the ox cart to the horse-drawn carriage to the motor carriage to the automobile
Having said this, the evaluation criteria of design emerge and change over time. They depend on the values, desires, needs, possibilities, etc., of the respective society. This is briefly illustrated using the example of automotive design (see Fig. 1.4). The development of the automobile is a good example of how technological evolution is sometimes based upon major design and technology shifts. In 1769, Nicolas Joseph Cugnot built the first recognizable automobile for the transportation of people and used a steam engine to power it. In the 1880s, Benz and Daimler built the first automobiles powered by four-stroke internal combustion engines, the type of engine still used in most modern automobiles. In 1924, when the American automobile market started reaching saturation, General Motors pursued the strategy of annual model-year design changes with the goal of persuading car owners to buy a new replacement each year. This strategy was intended to maintain unit sales. Henry Ford, on the contrary, adhered to notions of simplicity, economies of scale, and design integrity. GM surpassed Ford's sales and became the leading player in the automotive industry in the US. The yearly restyling influenced the design and made further changes necessary. Therefore, the lighter but less flexible monocoque design was changed to a body-on-frame. Another change came in 1935 when designs became driven by consumer expectations rather than by engineering improvements. Automobile design took a new direction after World War II with the introduction of high-compression V8 engines and modern bodies. Throughout the 1950s, engine power, vehicle speed, and design gained in importance. Another shift came in the 1960s with the international competition among the US, Europe, and Japan. This era was affected by the use of independent suspensions, wider application of fuel injection, and an increasing focus on safety. The modern era is characterized by increasing standardization, platform sharing, and computer-aided design. Today, aerodynamics, safety, and above all environmental aspects such as fuel efficiency, engine output, carbon dioxide (CO2) emissions, and gas consumption influence car designs.
Another phenomenon that can be observed is that some car models like the Isetta, the Volkswagen Käfer, the Fiat Cinquecento, and the Citroën 2CV became archetypes of the modern spirit. The Isetta was built for the man on the street at a time when cheap, short-distance transportation was needed after World War II. In 2009, the Mini celebrated its 50th anniversary: in 1959 the British Motor Corporation (BMC) gave Alec Issigonis clear instructions to construct a car with a spacious passenger compartment but short external dimensions, space for four passengers, and amazing handling characteristics. In 1974, the first VW Golf rolled off the assembly line. The VW Golf is one of the most successful cars built in Germany in the last three decades and also stands for a part of cultural history for a whole generation. In the mid-1970s, the VW Golf was considered sporty, even with the smallest available engine capacity. Its design criteria were economical engines and affordability for the masses. But not only cars gain cult status. One of the newest examples of good marketing and promotion is the Crocs clog, now imitated by other manufacturers and sold worldwide. These beach and camping shoes became fashionable as normal street shoes. You can even buy shoe doodles (Jibbitz), charms that fit into the holes of Crocs shoes, to individualize them (Fig. 1.5). So why do people wear plastic shoes in goofy-looking colors with normal clothes?
1.8 Contents of this Book
Since design is so broadly defined, there is no universal or unifying institution spanning all disciplines. Therefore, many differing philosophies of and approaches to design exist. What they all have in common is that they are designing. Their goals, actions, and, therefore, their results differ, but they are also similar in that they all follow processes. Serious research on design demands focusing on the design process.
Fig. 1.5 (a) Crocs Shoes Beach Color Variety (retrieved July 20, 2009 from http://www.sporthaus-ratingen.de/Crocs/crocs.html). (b) Crocs with Jibbitz (retrieved July 20, 2009 from flickr © jespahjoy)
Within this book we present how design is used in different disciplines and point out uniform as well as diverse principles within different design processes. In the first place, we are not asking "What is design?" We are asking "How do you design?" In their articles the authors give answers to the following questions:
1. How is design used in the discipline and what is designed?
2. Why is it designed?
3. Are there uniform as well as diverse principles within the respective design processes?
Essentially, the book is organized to follow three main design classifications (see Fig. 1.6): Design of Objects and Materials, Design of Living Environments, and Design of Minds. The first of these parts acquaints the reader with how objects and materials are designed, as exemplified by Product Design, Automotive Design, Game Design, Drug Design, and Material Design. Fritz Frenkler, in his article "The Design of the Environment and the Surroundings," focuses on product design, which refers not only to products, but also to the configuration of complete environments and surroundings. By means of examples, this generally very diverse field and the accompanying diverse design approaches are described. Design is based on perception of the environment and surroundings as well as on the social developments or trends resulting from an accountable perspective. Design is not the honed application of guidelines and rules, and design criteria are not static rules. Furthermore, they must be regularly examined and adapted to social changes. The goals of good design are to improve the quality of life and to create comfort
and user friendliness. In product design, design is a process that cannot be packed into reusable, general rules or principles. Product designers initially scrutinize the actual task, develop several methods of resolution, compare quality levels, and provide a recommendation. The subsequent article is a disquisition on a specific product: the brand MINI. Here, Gert Hildebrand points out the importance of empathy for design. The principles of empathetic design are explained using the example of automobile design. So that design does not become arbitrary, social and ecological developments have to be taken into account. Nowadays, good design is strongly related to economic success. Hildebrand explains how this factor of success is integrated into the structure of a company. Indeed, in comparison to other factors, design is low cost. At MINI, for example, design accounts for less than 10% of the costs but for 80% of the reason for the purchase. Therefore, the most significant factor for design at MINI is the client. The focus of the designer is not to realize his own dream, but to create a product that fits the brand and the target group. One basic principle of the MINI design is "form follows function," and another is the "human body archetype." A good product is able to address all senses. Away from the automotive area, Markus Gross, Robert Sumner, and Nils Thuerey address the young and evolving field of game design in their article. In a graduate course at ETH Zurich, the Game Programming Laboratory, which concerns the fundamentals of game design, the various stages of the design process are thought through and realized in prototype game developments. The most important and persistent principles in game design are, for example: iteration, peer review, prototyping, evolution, testing and evaluation, consistency,
logical correctness, and simplicity.
Fig. 1.6 Organization of the book
In the first part of the article, the authors give a brief overview of the history of game design before going deeper into its stages: the concept phase, the preproduction phase, the production phase, and quality assurance. Besides formal elements, technology also plays a crucial part in game design. Information technology and computer science, for example, have a significant impact not only on the production costs, but also on the feasibility of the project. Conceptualization, prototyping, and play testing are also major stages in the design of a game. Moving on from the design of computer games, Folkers, Kut, and Boyer show that the design of drugs has changed now that 3D models of molecules can be handled computationally: in computer-assisted drug design (CADD), the machine creates a set of structural proposals for molecules that should have a certain effect in a living
body. They also show the limits of this design method, since it works on the (false) hypothesis of a one-to-one correlation between an artificially created molecule and a target structure in a living body with which it interacts. Despite the enormous amount of structural knowledge about complex molecules, "nobody has been able to predict the most exciting new drugs," as the molecular interaction between proteins is more complex than the computer-assisted drug design method presupposes. They also discuss the "dark side" of drug design: drugs that are similar to pharmacological substances but have effects that cannot be controlled and that were designed for drug abuse. They show that the culture of neurological enhancement, which may lead to the ability to design moods and minds, is possibly the meeting point of the "dark" and the medical sides of drug design. This part ends with an article by Paolo Ermanni on the design of materials and shapes for airplanes, cars, and other
technologies, which shows that modern design is no longer purely a process of intuitions by inventive individuals, but a collective process. Intuitions still play a role, but the further the development of a structure proceeds, the more the freedom to make changes to the structure decreases; the more knowledge about an ideal solution for a certain function is gathered, the less freedom is left for intuitions. There are several criteria one might use to evaluate designed products. The number of alternative solutions developed is as important for an evaluation as the amount of time and costs that have gone into a design process. The competition between alternative designs as solutions for the same problem within a market can, in a certain sense, be simulated by evolutionary algorithms in which a computational search for structural optimization takes place. The second part of the book is devoted to the design of environments for living. It begins with the design of cities. The article by the architect Meinrad von Gerkan presents dialogical design in architecture. In the first part of this article, von Gerkan describes the use of design and summarizes analytical reflections arising from his own work. The architect is an expert on design and architecture as a social commodity. Designing our environment requires dialogue and the ability to react to changing conditions. The key principles, which are simplicity, variety and unity, structural order, and unmistakable individuality, are identified and explained. The second part of the article strengthens the theory presented in the first part, using the city of Lingang as an example. Lingang is a newly planned satellite city close to Shanghai and is being designed and developed from scratch based on the ideals of a traditional European city. In their article "City Design – Designing Process for Planning Future Cities," Halatsch, Kunze, Burkhard, and Schmitt investigate the design process using the example of future cities. They discuss how computer-based technology has changed the way architects and urban planners think, plan, and communicate. They have also developed a framework that allows urban environments to be simulated and evaluated, projects to be managed using GIS information, and collaboration over large distances. This framework contributes to solving urban planning issues and to establishing participatory planning processes. The last article in this part addresses the issue of landscape design. "Interactive Landscapes" by Christophe Girot, James Melsom, and Alexandre Kapellos points
out the influence of new technology on large-scale environmental design. New technologies used as tools within the design process provide new methods of verification and visualization that cannot easily be attained using traditional processes. However, in landscape design it is also essential to work with models. Computer numerically controlled (CNC) machines and CAAD-CAM technologies provide greater flexibility than traditional models, and the information obtained through the traditional modeling process feeds back into the design process, creating a synergy. The third part of the book presents the design of minds, such as Text Design and Synesthetic Design. It begins with a discussion of theory and design. Focusing on the concept of virtuality, Vera Bühlmann investigates, from a philosophical point of view, the conceptual and epistemological consequences of design becoming increasingly important in various sciences and for all relations of humans with the world in general. Starting from the problem of locality, as stated by the French philosopher and historian of science Michel Serres, Bühlmann shows that since antiquity an external stance to human knowledge has blurred conceptual contrasts like the natural and the volitional, the given and the made, the created and the evolved. With digital design and simulation becoming one of the most important methodologies of handling the world, this external stance becomes the standard one. Thus, the idea that science solves naturally given problems by applying adequate rules to them becomes more and more obsolete. The whole idea of a fit between the natural and external and the mental and internal is going to disappear. Bühlmann shows that even "material," "organism," and "mind" are terms that might need recategorization once the distinction between the natural and the artificial is gone. Wibke Weber points to the fact that texts consist of sentences that have been built. Not only can buildings be interpreted as texts, but texts and their sentences also have an architecture that is designed. Her article gives rules for designing good texts. It proposes a technique for visualizing the design of a text by using different colors for the different grammatical constituents. Keeping in mind that texts may be seen as graphical structures, are read as semantic structures, and were originally heard as acoustic phenomena, Weber proposes different criteria in order to design an optically, semantically, and acoustically
good text. By means of the project Sound-Color-Space, Natalia Sidler discusses the synesthetic design of music visualization. The first part of her article gives insights into the phenomena of synesthesia and defines synesthetic design. Inspired by the field of Color-Light-Music and the Color Light Organ, the research project "Sound-Color-Space" emerged. The transfer of synesthetic phenomena and characteristics from neuropsychological research into artistic-esthetic studies provides the basis for the design of the unique Color Light Organ as well as for the various visualization software developed for this instrument. The final section of this article illustrates the design criteria for the development and construction of this new instrument. After the instrument's completion, three visualization programs were written with the goal of translating the sounds generated by the Color Light Organ into two- and three-dimensional geometries, structures, animations, and color arrangements, coupling sound and color.
1.9 Quintessence
At this point we are able to present the ways in which design is used in the fields described in Sect. 1.8 and to point out that both uniform and diverse principles coexist within different design processes. The key findings from looking at all the articles are:
1.9.1 Design and Design Process
- Product Design is based on perception of the environment and the surroundings.
- Product Design is a process that cannot be packed into reusable, general rules or principles.
- In Product Design, social aspects are crucial.
- Product Design is self-sustainable and a serious force.
- Design is the development and creation of industrial products that are produced in series and as such takes into account the following parameters: technology, ergonomics, sociology, and market relevance.
- Design produces a physical (material) object.
- Architectural Design evolves from a dialogue between the existing conditions and the ideals and models of the architect involved.
- Design is an iterative process.
- Design is a creative process, based on knowledge and intuition.
- Landscape Design is a very tangible exercise.
- Text Design begins with an idea to shape words, to form phrases, to build sentences and then paragraphs, etc.
- Technology is a tool within the design process.
- Constructions designed by people are driven by the environment.
- Synesthetic Design coordinates sensory impressions.
- In engineering, the design process has four main phases: (1) planning and clarifying, (2) conceptual design, (3) embodiment design, and (4) detail design.
- In engineering, the design process is motivated by an idea or a need for improvement.
1.9.2 Designing
- Designing moved away from art and became a technical discipline.
- Product Designers are flexible and able to deal with an exceptionally wide range of different themes in a very short time.
- Product Designers recognize and analyze deficits and deficiencies.
- Product Designers scrutinize the actual task, develop several methods of resolution, compare quality levels, and provide a recommendation.
- Product Designers require the ability to think in a conceptual and holistic manner.
- Product Designers must look ahead to the future and create today what they expect to be fashionable in 5 years.
1.9.3 Design Criteria
- Design criteria must be regularly examined, evaluated, and adapted to social changes. They cannot be static rules.
- The constants of design are ergonomic criteria and safety guidelines.
- Form follows function.
- Technology follows function.
- One MINI design principle is the Human Body Archetype Intuition.
- Design principles in game design are: iteration, peer review, prototyping, evolution, testing and evaluation, consistency, logical correctness, and simplicity.
- Key principles in architecture are: simplicity, variety and unity, structural order, and unmistakable individuality.
- Gestalt laws like proximity, similarity, closure, symmetry, and continuity, as well as writing style, can be considered design principles for texts.
- Synesthetic phenomena and characteristics can be transferred into artistic-esthetic studies and works to visualize music.
1.9.4 Design Evaluation
- The design process is measured in terms of time, costs, and quality of the final design.
- Good design makes a considerable improvement in everyone's quality of life.
- Successful design is empathetic design.
- Successful design is self-explanatory.
- Successful design creates a need.
- Fun is crucial in game design.
- Design is not a topic that can be investigated by an axiomatic science that starts from general principles that are universally applicable.
1.9.5 Design Science versus Design Engineering
Regarding design and science, we distinguish among three different areas: Scientific Design, Science of Design, and Design Science. The term Scientific Design goes back to the time when industrial design became more complex and intuitive methods no longer worked. Scientific design merges intuitive and
rational design methods, and is simply an indication of the reality of modern design practice. Herbert Simon defined Science of Design as a body of intellectually thorough, analytic, partly formalizable, partly empirical, teachable doctrines about the design process. In 1969, he also postulated the development of a science of design. Natural science describes existing things according to natural laws. In contrast, design deals with how things ought to be. In our understanding, design is used in devising artifacts to attain defined goals. According to Simon, everybody who changes existing situations into preferred ones designs. In order to improve the understanding of design, the logic that designers use has to be considered. Science of design can be considered the proper study of mankind (Simon 1996). In abstract terms, the science of design is concerned with the study of design with the aim of defining a design methodology. So, as previously described, Design Science is, in contrast to Science of Design, a systematic approach with the aim of defining rules and laws that lead to the design method. In further contrast, design in engineering is a feedback process engaging the following engineering activities: understanding the problem, concept generation, analysis and optimization, testing, and construction. Engineering design, therefore, refers to the chain from research and development, to manufacturing, construction, and on to marketing, and is based on scientific principles, technical information, mathematics, practical experience, and imagination. The focus is on the development of mechanical structures, machines, or structures based on predefined functions with the maximum of economy and efficiency (McCrory 1966, pp. 11–18; Eder 1966, pp. 19–31). Nowadays, engineers increasingly realize technical functions by means of immaterial and software technologies. The outcomes of these developments are the design, the production, and the process. Hubka and Eder (1996) defined the process of designing as the transformation of information derived from the condition of needs, demands, requirements, and constraints into the description of a structure. This structure is capable of fulfilling these demands, which include the wishes of the customer, the stages and requirements of the life cycle, and all the in-between states the products must run through. Petroski (1997) describes engineering as the art of rearranging materials and the forces of nature based on the constraints given by the immutable laws
of nature. Engineering itself is seen as a fundamental human process. To recap, the most important differentiation between science and engineering in this context is that scientists search for understanding: while scientists do not aim at rigidly specified goals, engineers work toward very concrete objectives requiring criteria and specifications. Design can therefore be considered a hybrid. Design is part of fine art, with its esthetic and artistic aspects, and is part of the engineering disciplines as well as of the science disciplines. To a large extent, designers, architects, business managers, engineers, software developers, etc., are unaware of the practices and processes in other disciplines. They are not thinking about overlaps and do not bring together work from different areas. However, it is ultimately people who create things and environments to improve their situation, and the situation in turn alters the world view of those who live within it. This, then, subsequently shapes the persons who are born into this new situation. In this way, people design their worlds, and in so doing they also design future human beings. So can design be a scientific discipline? Or can the combination of Design Science and Design Engineering be seen as applied science? Or is it something else?
References
Alexander C (1964) Notes on the synthesis of form. Harvard University Press, Cambridge, MA
Archer LB (1965) Systematic methods for designers. The Design Council, London
Archer LB (1981) A view of the nature of design research. In: Jacques R, Powell JA (eds) Design: science: method. IPC Business Press, Guildford, Surrey
Aristotle (1924) Metaphysics. Ross WD (transl and ed), 2 vols. Oxford
Armstrong J (2008) Design matters. Springer, London
Ashlock D (2006) Evolutionary computation for modelling and optimization. Springer, Heidelberg
Bacon F (1990) Novum Organum / Neues Organon. Philosophische Bibliothek, Hamburg
Bowler PJ (2003) Evolution: the history of an idea, 3rd edn. Berkeley, Los Angeles, London
Broadbent C, Ward A (eds) (1969) Design methods in architecture. Lund Humphries, London
Carrier M (2006) The challenge of practice: Einstein, technological development and conceptual innovation. In: Ehlers J, Lämmerzahl C (eds) Special relativity: will it survive the next 101 years? Springer, Heidelberg
Cross N (1984) Developments in design methodology. UMI Research Press, New York
Dewey J (1986) Logic: the theory of inquiry. In: Boydston JA (ed) John Dewey – the later works, 1925–1953, vol 12: 1938. Carbondale
Eder WE (1966) Definitions and methodologies. In: Gregory SA (ed) The design method. Butterworths, London
Feyerabend P (2009) Naturphilosophie. Suhrkamp, Frankfurt am Main
Galison P (2003) Einstein's clocks, Poincaré's maps: empires of time. New York
Goodman N (1981) Ways of worldmaking. Indianapolis
Gregory SA (1966) The design method. Butterworths, London
Hacking I (1983) Representing and intervening: introductory topics in the philosophy of natural science. Cambridge University Press, Cambridge
Heath T (1984) Method in architecture. Wiley, New York
Hubka V, Eder WE (1996) Design science. Springer
Hume D (1948) Dialogues concerning natural religion. Indianapolis
Jones JC, Thornley DG (eds) (1963) Conference on design methods. Pergamon Press, Oxford
Latour B (2000) Die Hoffnung der Pandora. Untersuchungen zur Wirklichkeit der Wissenschaft. Suhrkamp, Frankfurt am Main
Le Corbusier (1929) CIAM 2nd Congress, Frankfurt
McCrory RJ (1966) The design method in practice. In: Gregory SA (ed) The design method. Butterworths, London
Pask G (1963) The conception of a shape and the evolution of a design. In: Jones JC, Thornley DG (eds) Conference on design methods. Pergamon Press, Oxford
Petroski H (1997) Invention by design. Harvard University Press, Cambridge, MA
Schoen DA (1983) The reflective practitioner: how professionals think in action. Maurice Temple Smith, New York
Simon HA (1996) The sciences of the artificial. The MIT Press, Cambridge, MA
Steinbrenner J, Scholz OR, Ernst G (2005) Symbole, Systeme, Welten: Studien zur Philosophie Nelson Goodmans. Synchron Wissenschaftsverlag der Autoren, Heidelberg
van Doesburg T (1923) Towards a collective construction. De Stijl
Part II: Design of Objects & Materials

2 Product Design: The Design of the Environment and the Surroundings
Fritz Frenkler
F. Frenkler, Chair of Industrial Design, Technische Universität München, München, Germany

Abstract Professional designers have learned holistic thought processes. They do not work according to predetermined examples or guidelines, but develop a tailored catalogue of requirements for each project through careful observation and close scrutiny. The user forms the main focus of the classic design process. For this reason, designers require social knowledge and expertise. It is not merely products that are designed, but almost the complete environment that surrounds us, from everyday objects, to public spaces, to services.

2.1 Introduction
Knowledge about product technologies and production techniques has spread all over the world as a result of globalization, and this has led to the partial relocation of production. A focal point that relates directly to this can be found in Asia. As a result, the design of products is increasingly the key unique selling point of the different brand names that manufacture and market technical products. The cost pressures caused by growing international competition are, however, making it more difficult to be competitive using technological innovations alone. In general, the development costs of innovative products are too high to be able to compete against less innovative products that are manufactured at low cost and in enormous quantities. As a result, sales prices are drifting far apart and there is now a lack of products in the mid-priced market segment. There is an ever increasing number of cheap products that are not designed particularly well, and, in contrast to them, high-end products
that are manufactured in smaller runs and are therefore more expensive, but are superior in terms of their overall quality and design. In the lower market segment, price dictates the purchase. In the area of quality, the design, the finishing and the associated brand are key. A purchase decision in the high-end market segment is generally influenced by several factors, including user friendliness, the material basis or quality of the product and also by the attributes associated with the brand. The design and the overall concepts linked to it create emotions, which convince the user of the advantages of the particular product.
2.2 What is Product Design?
Design not only functions through the honed application of guidelines or rules, but is also based on the perception of the environment/surroundings and therefore the social developments or trends that result from an accountable perspective. Rather than thinking in terms of conventional patterns, designers should recognize and analyze deficits and deficiencies through observation and self-experiments, and be able to
generate appropriate superior solutions. The ability to identify key sociological contexts and to link them with others requires well-founded general knowledge and a continual dialogue with cultural and social events and public debates. This expertise provides the basis for every creative process. Digital and manual presentation techniques are both necessary tools in the design process for the visualization of new concepts. Styling is a part of a comprehensive process. In its optimal form, it assists the user’s understanding of the product, conveys a product identity, explains the handling of the product and transports information as well as attributes, sometimes prominently, sometimes in a way that is barely noticeable. The field of work occupied by product designers is generally very diverse. As a rule, a designer’s personal profile or specialist area results from the experiences and the associated product references gathered during the course of their training and work on the development of projects. The distinguishing feature of product designers is their flexibility and ability to deal with an exceptionally wide range of differing themes within a very short period of time. Product design not only refers to products, but also to the configuration of complete environments and surroundings: shops, public spaces, the interface between man and machine, packaging, modes of sale and distribution, and the associated products and services. The thinking and design of the elements that surround a product are just as important as the composition of the actual product itself.
2.3 Examples of Comprehensive Design
If all of the previously listed aspects are addressed by a company from the perspective of a higher-ranking design DNA – previously called design language – then one talks of comprehensive design, a 360° design, or a corporate design. Examples of such comprehensive design, which are frequently mentioned, are that of Apple Computers and, until 1995, that of the German consumer electronics company Braun. It is not only the products of the Californian company that are consistently designed in accordance with a recognizable style: the user-friendly software, the packaging, and the marketing are all part of the overall concept; many buyers of the iPod keep the packaging just as
long as they keep the product itself. The sales and marketing take place exclusively via Apple Stores and selected retailers, which are configured in accordance with the Apple corporate design. This means that a holistic application of the overall strategy can be guaranteed. The offer is supplemented by the iTunes Online Store that sells and markets music and videos. Individual songs are sold instead of whole albums, and Apple aims to set standards for the service design of the future. At Apple, customers do not just buy a product, they also gain access to a range of services. Together, all of these components create the brand image of the company and have a direct influence on the market success of the products. MUJI, a Japanese lifestyle chain established in 1980, which sells stationery, office items, household products, clothes, furniture and more recently cosmetics and food, is another example of the skillful development of a brand. Remarkably, MUJI actually stands for the Japanese phrase "Mujirushi Ryohin," which means "not brand products but quality products" when translated into English. The functional, minimalist MUJI products are manufactured in a resource-sparing manner. Their intention is that the customer should be convinced by the products themselves – and not by a brand name or the name of a designer. This principle is consistently pursued: although MUJI products are designed by highly acclaimed designers from all over the world, the designers themselves remain anonymous. However, the "non-brand" of MUJI has advanced to become a brand itself. At the end of the 1950s, Braun's minimalist HiFi system designs caused a big sensation. Market research analysis carried out for Braun in 1954 demonstrated that there was a clear demand for modern radio equipment among large sections of the population. The concept of the compact music system addressed this market and won over consumers thanks to its very simple, clear aesthetic styling, which was in clear contrast to the cumbersome radiograms that were a fixed feature of many German living rooms at that time. This exceptionally successful entry into the market was the cornerstone for a remarkable company history, which became a symbol of German design. By adopting a design philosophy, which corresponded with that of the University of Design in Ulm (HfG Ulm), the manufacturer developed a consistent policy of reducing all of its products to their key features; this gave the company its distinctive identity. The company's
structure, in which the design department was directly subordinate to the executive department, contributed to the fact that it was possible to realize such a strongly consistent product image. However, as a consequence of personnel changes and an increasing orientation towards global sales, the company, with its once unique position characterized by its ideological approach, was transformed into an imageless global player with anonymous and sometimes arbitrary products. In the automotive sector, the manufacturers Audi and Porsche stand out as the most consistent in their adherence to a recognizable and evolutionary style of design. However, using the example of the sports car manufacturer Porsche, it is clear that consistently conforming to strong brand characteristics can also bring problems. The product can appear questionable when social or political changes create new sociological contexts in which the product must be considered. In an era when the finite nature of natural resources and alternative sources of energy are being debated, it is only a matter of time before cars with above-average fuel consumption suffer from declining demand. This development, however, will almost certainly take place as a result of competitors' influence rather than as a consequence of end customers' moral considerations. With ever increasing fuel prices, rival products with lower rates of fuel consumption will invariably become more attractive to many buyers. It is, however, yet to be seen whether the successful sports car manufacturer Porsche can succeed in developing vehicles, or alternative products, that correspond to the new demands of the market while not undermining the strong core brand of Porsche.
2.4 Design Criteria
Design criteria cannot be static rules; they must be regularly examined, evaluated and adapted to social changes in small, clear steps. In the future, demographic change will almost certainly have an enormous influence on design, as the demands that users place upon products will change in relation to the age structures of differing societies. In the future, older users will probably enjoy a decisive degree of purchasing power. They will be critical customers and expect perfect service. This changed demand will strongly
influence the market. Additionally, in view of the enormous price increases in the raw materials market, it is necessary to develop generation-spanning products and to use resources, which are in increasingly limited supply, more economically. With respect to the developments described above, there is enormous potential within the area of services design. With reference to this topic, a comparison of two logistics companies, United Parcel Service (UPS) and DHL, a subsidiary company of the Deutsche Post, is fitting. In 1961, the graphic designer Paul Rand designed the sleek black and white UPS logo with the symbolic representation of a cord-tied parcel above the already familiar shield. This logo was an expression of the complete company philosophy at that time. The distinctive "Pullman brown" had always been the company color of UPS, as it was intended that the delivery vans, which often have to double-park, should not attract any more attention than necessary. Instead, they should blend in harmoniously with the street environment. Accordingly, logos were generally not placed on the rear of the vehicles, and the color of the vehicles also allowed for water-saving measures during cleaning. The overall appearance of the company was straightforward, sleek and functional. The service was good and convincing. In comparison, the opposite can be seen with DHL. The yellow-red delivery vans attract attention, even at a distance, irrespective of the setting in which one sees them. In road traffic, the bright and gaudy delivery vans come across as possible obstacles and almost awaken associations with an intrusive visitor. Both companies' customers are able to follow the progress of their package on the Internet. DHL, however, advertises this in such an aggressive manner that potentially high levels of frustration develop if the service is not appropriately intuitive or the promised service does not work. DHL also suffers from a considerable lack of flexibility in terms of delivery times. For example, if the recipient is not present at the time of delivery, they have to collect their package from the nearest DHL branch themselves. This may be a long way from their home, due to the random scattering of the different branches. For working customers, this presents a considerable problem, as no set delivery times are provided. The image of the "brand" DHL suffers from this poorly structured service, which doesn't particularly take the requirements of the customers into consideration.
2.5 When is Design Good?
Good design can make a considerable improvement to everyone's quality of life. It can create comfort or user friendliness, and the lack of good design can create the opposite. A striking example of what happens without satisfactory design is the so-called "satellite towns": faceless suburban towns without a satisfactory infrastructure. They are a recent occurrence and have come into being as a consequence of unanticipated population increases in mega-cities. Their insufficient infrastructure is aggravated by their lack of urban design. Nobody wants to live in such poorly designed or, in some cases, completely undesigned places unless they are forced to for economic reasons. Dieter Rams, former chief designer of Braun AG, expressed this type of situation as follows:

Designers and the companies who make an effort to create good designs have a single task: the task of changing our world with the goal of making it better everywhere where it is currently ugly, inhuman, upsetting and destructive, energy sapping, oppressive or confusing. The extent to which it is any of those things – from the small, everyday things to the design of our cities – we have driven out of our consciousness. We must become much "quieter" if we want to dilute a little the visual noise and chaos in which we are forced to live. ... Architecture, design and also the design of communication are expressions of the socio-political reality – and can conversely influence and characterize it. For this reason I view the task of design to have an ethical and moral dimension. Good design is a value. This position differs to a great extent from the widespread approach today that only views design as being a type of entertainment. (Translation of a quote; Dieter Rams in Shimokawa 2005)

Thomas Mann stated that it wasn't possible "to solve the unsolvable and to remove the connection which inevitably exists between art and politics and intellect and politics." He also said that "this is where the totality of the human acts, which can't be denied under any circumstances." In the term "humanity," "the aesthetic, the moral and the socio-political come together" (Mann 1990).
Compared to the design processes in the disciplines of engineering and management, sociological aspects play a bigger role in product design. The interests of the user form the primary focus of the design process. While the disciplines listed above look directly for an adequate solution to the specific problem, product designers initially scrutinize the actual task, develop several methods of resolution, compare quality levels
and then provide a recommendation as to which version is preferable in the context of the corresponding company and its operations. Contrary to the widespread opinion that product design only means "styling," in fact conceptual, holistic thinking, which often far exceeds the actual task, is required. The designer becomes the intermediary between the user and the manufacturer, assumes the role of an advisor, and seeks a dialogue with other disciplines to be able to think in terms of what is appropriate in the circumstances. Designers with these skills are also ideally suited for general management positions. In interdisciplinary teams, a different point of view can create key stimuli, raise new questions and lead to intelligent, user friendly and economical solutions. It should be noted, however, that to generate innovative concepts, appropriate "room for manoeuvre" is necessary for a designer. If the framework conditions are too narrow, then the designer's possible degree of effectiveness is also minimized. Since the end of the 1950s, when design was still addressed as a "management issue," marketing departments have had an ever-increasing influence on design decisions. If they make decisions regarding product development and their decisions are merely supported by simple user questionnaires, then the basis for promising, forward-looking design will not exist. This kind of attitude does not help advance companies in the long term. In globally operating companies, design and decision-making processes have now become too complex for design and marketing to be able to undertake without some constructive form of cooperation. It is only with clear values and specifications that designers can make, and implement, their contribution to the strengthening of a brand. With interchangeable specifications, design is all too often correspondingly arbitrary. In his 1980 theory of design, "Design is Invisible," Lucius Burckhardt, sociologist and "founding father" of the Faculty of Design at the University of Weimar, referred to the "design behind the design" (p. 9). With the statement "design is invisible," Burckhardt does not mean that design isn't important; he simply states that it can only be experienced through use and in a corresponding social context. He points to the "institutional and organizational components" (p. 15) of a design through which the complete context, meaning the invisible parts of the system, are taken into consideration. A good example to explain his theory is
the text “the night is man-made”: the night is clearly a natural phenomenon, since it gets dark when the sun goes down. But it is only through “social and legal decisions” (p. 31) made by people that the night has become an institution in which activities come to a halt: rules and regulations, travel timetables, prices and opening times are all designed by people and therefore mould people’s lives (Burckhardt 1981).
2.6 Design Principles and Design Guidelines
Determining general design principles often contradicts the classic creative approach, which should bring about the exact opposite of standardization. Design is a process which cannot be packed into reusable, general rules or principles. In a classic design process, the clear positioning of a company, if this doesn't already exist, is usually defined as part of a dialogue between the designer and the company. This dialogue should identify whether the company wants to place its emphasis on social, ecological and/or technological factors. Plausibility and the implementation of various focal points, which ideally include all areas of the business, are key to this process. If, for example, Deutsche Bank decided to align its image with current social considerations, its logo would have to be revised in terms of both its color and its styling. The cold blue and the rigid, straight lines of the logo would otherwise stand in stark contrast to the "soft" features which the bank wanted to represent and would, therefore, lack credibility. It is often the case that large discrepancies exist between the stated goal of a company, whose mission may be something like "we want to become more modern," and the willingness to actually make that change recognizable. Often companies adhere to the widespread tendency of focusing on their competitors, participating in "me-too" behavior, with generally unproductive results. If a previously existing contractual framework is destroyed and the company suddenly reacts too strongly without making noticeable developments in terms of social changes, an overall negative effect results – the company loses plausibility. Customers have certain expectations of a brand; one cannot simply reinvent it. If this is the case, a company will
unsettle customers and lose them instead of winning over new ones. Only those who are recognizable and re-recognizable are capable of developing a brand and keeping it alive. This not only includes the products, but also their engagement with the user and, today, their environment. Static guidelines are limits. To a certain extent, they are also able to form a foundation. Ergonomic criteria or safety guidelines, for example, form the constants for design. Just as there are style guides in graphic design, companies can also create what might be called a product design DNA, for instance, by defining the formal specifications concerning style and color, quality standards, the materials to be used and the requirements of the production procedure and ecology. In general terms, a brand is a design guideline that is tailored to the company and which applies to the entire organization. This design represents the aspirations of the company. So, if, for example, a company decides to develop an ecological focus as its company philosophy, the room for maneuver is considerably limited. It is not only the products which must then be appropriately ecologically manufactured and recyclable; the packaging, sales techniques and marketing must also be consistent with this. The complete infrastructure of the company and the approach of the employees must correspond with this image so that the products gain long-term acceptance in the marketplace. Apple Computers, for example, predominantly use four materials: white or black plastic, aluminum and magnesium. The style is recognizable, the logo is succinct and internationally understandable and the advertising strategy is clear. Although many manufacturers of MP3 players have since imitated Apple, whenever you see a white earphone cable, you assume it is connected to an iPod in the person's pocket. The unmistakable brand image is so strong that products that aren't made by Apple, but have white cables, are prone to losing their prestige rather than associating themselves with the success of Apple (as was presumably desired by their marketing departments). The use of what one might call "celebrity designers" forms an exception to the processes so far described. In this instance, the name of the designer is often more widely known than that of the manufacturer, and this form of cooperation isn't necessarily conducive to the strengthening of a brand. However, with their corresponding reputation, what I will call
“design issuers” are an exception to this exception, as here the “design issuers” are the manufacturers. In this arrangement, customers approach sellers demanding certain products. As with a brand, the compiled collection makes choosing the products easier for the customer. For this version of cooperation it would actually be better to use the job title of “industrial artist” rather than that of “designer” for the designers themselves, since here the manufacturer takes on the function of a gallery owner. The designer works in a self-appointed, completely free context, and looks for a manufacturer. And the manufacturer, or “issuer,” selects products that they want to include in their collection. It isn’t possible to talk of a services relationship in this context. In the introduction to this book, you will find the sentence: “design aims to manage complexity.” This phrasing, which demonstrates the goal of maximizing the dimensions of the area of design, can only find its application, however, if the overall context has been appropriately gathered and understood. Design is usually at its best, and, therefore, correct, if the user doesn’t think about the designed elements. As a rule, this also means that the designers have spent considerable time thinking about intuitive operability.
Fig. 2.1 Visions for aircraft cabins – development of concepts for the aircraft in-cabin room, by Marvin Bratke, Daniel Jakovetic, Sandro Pfoh, Daniel Tudman
2.7 The Design Process
It is relatively difficult to briefly describe the approach and thinking of a professional designer to a layperson. The design process consists of several successive working steps. On the one hand, the designer must think freely and have a lively imagination. On the other, designers must be able to evaluate, criticize and optimize not only their own ideas, but also those of the people with whom they are collaborating. This does not result in a linear, targeted procedure, but in continually going back and forth, which may sometimes appear frustrating to other cooperating disciplines. At Munich Technical University, students of architecture, mechanical engineering and engineering sciences also receive some supervision in design. These are shared projects, and our experience of these shows that it is unusual for disciplines that are relatively unrelated to design to encourage and advocate ideas only to subsequently discard them completely and start again from the beginning as designers often must do (see Fig. 2.1). Yet through every step of the design process, irrespective of which direction it may be in, the
development process is pushed forward, and the key factors of the theme are carved out more clearly. The two sayings “you learn from your mistakes” and “learning by doing” are particularly applicable. The latter is especially clear in view of the trend to visualize concepts in digital format merely using 3D programs. In today’s design traineeships and apprenticeships, there is often insufficient training in hand craft skills, as it is quicker and easier for students to learn how to operate a software package, and less teaching and supervision are necessary. However, this approach results in a lack of a three-dimensional spatial sense. The resulting lack of knowledge about the style and method of production also presents problems. It is not a coincidence that a thorough training in handcrafts previously provided the basis for undertaking studies in the subject of design. Even if the presentation for the customer is now prepared using 3D software, simulation with models is still an indispensable part of the development process. Certain details can be easily overlooked on the computer screen, whereas they become obvious on a model. From the supervision of these students, it is clear, in terms of their working methods and partly also through their thought processes, that architecture and design are very similar and that the design processes of each discipline share some parallels. Both professions can be classified as lying between the technical/mathematical and artistic/creative disciplines. However, with the exception of urban planning, architects generally grapple less with sociological and fundamental social considerations, and are more focused on the technical construction realization and the design. As with design, a clear division can be seen between “architectural artists” and “service architects.” The former mostly design buildings starting from the exterior and working inwards. The service architects generally start their design in the interior and work outwards. This means that, in an optimal scenario, they consider the requirements of the user on the building. Cooperation between architects and designers isn’t just sensible, but it is increasingly necessary. Both products and architecture can be seen to depend more and more on one another, and are often used in combination with each other. A thorough understanding of the knowledge and expertise of other disciplines provides the basis for the superior design of our environment.
2.8 Conclusion

Designers are not stylists – they are inventors. They know how to generate new approaches through the optimization of what is at hand, the transfer of principles, or through cross-linking. They have also learned the tools of the trade, they know how to visualize their ideas, and they know how to make them accessible to outsiders. People have made use of examples from the plant and animal worlds for the manufacture of "tools" since prehistoric times. The influence of so-called "bionic design," taking nature as an example, stretches across all areas of design and plays just as important a role in the development of new materials and mechanisms as it does in practical design. In light of this, a cross-linking of design with the natural sciences can only be positive. An interdisciplinary approach is always enriching. The interaction with the arts and humanities is also important. As described in the introduction, philosophical factors are incorporated into the sophisticated perception of a designer. A designer also requires the ability to express himself or herself appropriately. Until recently, design research was not an important theme. Scientific approaches previously played virtually no role in the design profession; in recent years, however, a change has taken place in this area. Just 30 years ago, design and marketing departments were at the same level, directly subordinate to executive management. Today, in many companies, design departments are subordinate to marketing or engineering. Since this structure is not particularly conducive to innovation, designers are failing to gain influence in companies. The ethical and moral dimension of design, mentioned in the quote by Dieter Rams, could play a key role should designers assume leadership positions.
References

Burckhardt L (1981) Design ist unsichtbar. Österreich. Inst. für Visuelle Gestaltung, hrsg. von Helmuth Gsöllpointner, Linz 1981
Mann T (1990) Der Künstler und die Gesellschaft. In: Gesammelte Werke in dreizehn Bänden, Bd. 10, Frankfurt, S. 386–399, s. S. 394
Shimokawa M (2005) Katalog Less but better – Weniger aber besser. Die Welt von Dieter Rams
3
MINI: Empathetic Design for the Future
Gert Hildebrand
Abstract What is the intention and purpose of design? Why are we bothered with designing? The answer is obvious: emotion and sensibility. It is the design that animates a product and imparts or lends meaning to form. Design is communication. Therefore, car designers must think long and hard about which form of communication, traffic and transport (which is a form of communication in itself) will shape the future and which form of communication will match the human need and desire structure of the future.
3.1 What Are We Designing For?

It all comes down to empathy. In this case, empathy means opening up to the customer and trying to put oneself in their shoes. Only then do we know what they need and want. Only designers who are prepared and eager to feel what customers feel and to see the product from their perspective can really assess the product's quality and the benefit to the customer. Empathy is a prerequisite for successful design. Successful design is empathetic design (Fig. 3.1). Car designers have already participated significantly in this form of communication. They give the communicators, the motorists and cyclists, a face, a feeling, a way of expressing their personality through their vehicle. By designing the exterior (the outward expression), the interior as well as the color and material of a car (their lodging and living room), they will continue to provide people with their very own, suitable means of expressing themselves within the communication environment of traffic and passenger transportation. A vehicle is a clear and very personal statement. When asked for their reasons for buying MINI, 80%
answered that they made their decision because of its design. This result cannot be explained with rationality alone. It is, in fact, a highly emotional decision and that is exactly why the MINI team of designers has to have such an emotional approach to their task. Preventing design from producing random, commonplace or careless results requires an understanding and keen awareness of what the future will bring. How will society develop? What are the challenges of the future?
3.2 The Challenges of the Future

Current global developments in society and the economy bring about new challenges. Three issues are of particular relevance, both in a general context as well as in the special context of design: the trend towards mega cities, the demographic changes in industrialized nations and, naturally, the subject of sustainability, which has already assumed major significance in the wake of the first tangible signs of climate change and dramatically rising raw material prices. First, we will take a look at the development of mega cities. The global population is growing every day. That in itself is not cause for concern. What aggravates this situation and gives us cause to worry
Fig. 3.1 Design formula – definition of design
Fig. 3.2 Relationship between the volume of cars and the price of gasoline
about the future, however, is the fact that an increasing number of people in developing countries are driven from their homes in the country by poverty and disease, and forced to move into the cities. The development of mega cities with a population of 20 million inhabitants or more is particularly prevalent in these countries. Unfortunately, migration to the city often fails to solve any of the problems of the rural population. Hence, these cities become places where millions are struggling to survive. Shifting the focus on the population structure from global to local reveals that the development described above has the exact opposite effect in industrialized nations: our population is shrinking and our society is showing signs of overaging. Already, not only the European, but also the Chinese and Japanese societies have an above-average percentage of older people. In its wake, the young are faced with a completely new problem as they are left wondering who will look after them when they are old. The apparent paradox in view
of the sharp rise in the global population can be easily resolved. In today's industrialized nations, older people have a higher life expectancy, while the young frequently decide to remain childless or to have only one child. This way, the population cannot be sustained and the so-called inter-generation contract cannot be fulfilled. The future situation is aggravated by a third challenge: the dramatic consequences of raw material shortages (Fig. 3.2). There has been a rising awareness of the problem and some possible solutions – dubbed with the collective term "sustainability" – have already been offered. Unfortunately, sustainability has become something of a buzzword, which is widely popular but increasingly used in a context that is devoid of meaning. This is all the more regrettable as "sustainable behavior" basically means that the existing generation satisfies its needs without compromising the next generation's capacity for satisfying its needs in turn. This translates into a responsible
Fig. 3.3 Using the Gaussian taste distribution, success or failure of design can be specified
treatment of our environment and all that it entails, i.e., the things that surround us and the things that we surround ourselves with. All these changes will also dramatically alter the structure of human desire and needs as well as people’s mobility requirements, which will ultimately affect the design of their transport vehicles. In the course of all these developments, design is more important than many of us can imagine.
3.3 Design as an Economic Factor or the Gaussian Taste Distribution

Design is not only beneficial for the product, but also for the product's user. This is due to the fact that the things we surround ourselves with have a direct effect on our sense of well-being. Humans are attracted to beauty and the esthetic side of things and people, while ugliness can lead to aggression. To adapt Plato: truth is beauty. Truth and beauty belong together; ugliness cannot be truth. Designers ought to take some formative responsibility in our world. Cars should always be objects of beauty that do not clash with their surroundings, whether they are parked in a tiny backwater or in a big city. But what is the relationship between beauty and ugliness? Are they closely related? Is there a clear distinction, and how are they related to economic
success and failure? Obviously, beauty is in the eye of the beholder, and as such it is a very subjective attribute. Nonetheless, it is possible to determine an average taste index, a kind of common denominator, something most people would agree with – like the average value on the Gaussian taste distribution (Fig. 3.3). The central section of this curve contains many providers of one product; they establish a solid middle ground, which you could almost call random or interchangeable. This section also specifies the fluid boundary between beauty and ugliness, between bad and successful design. The greater the distance from this taste average, the more pronounced the characteristics of success or failure of design and the fewer providers of this kind of product you will find. In these sections, much money is lost, or made, because successful design significantly upgrades the substance of a product, just as bad design detracts from it. Above all, successful design creates a need or a desire and ensures that customers fall in love with the product within seconds. Successful design is self-explanatory, i.e., if you need to explain it, it does not work. The average taste section also defines the boundaries between economic success and failure; the distance between "beautiful" and "ugly" is very short. Once a product becomes firmly established in the "good design" section, thus creating a niche for itself and catering to this niche segment, long-term economic success is guaranteed. Hence, good design may not actually be a prerequisite for the profitability
of a product but – leaving the product utility aside – it acts as a logical precursor to its economic success.
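The Gaussian metaphor can be made concrete with a small numerical sketch. The figures below are illustrative assumptions rather than market data: hypothetical products are placed on a normal "taste" axis, and we simply count how many fall near the average and how few sit far from it.

```python
import random
import statistics

# Illustrative sketch of the "Gaussian taste distribution" idea:
# most products cluster around the average taste, few sit far from it.
# All numbers here are assumptions chosen for the example.
random.seed(42)
taste_scores = [random.gauss(mu=0.0, sigma=1.0) for _ in range(1000)]

mean = statistics.mean(taste_scores)
stdev = statistics.stdev(taste_scores)

near_average = sum(1 for s in taste_scores if abs(s - mean) < stdev)
far_from_average = len(taste_scores) - near_average

print(f"products near the taste average: {near_average}")    # the crowded, interchangeable middle
print(f"products far from the average:   {far_from_average}")  # the few that clearly succeed or fail
```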
3.4 The Meaning of Design and its Correlation with Other Disciplines

How highly do companies value this "success factor," and how do they integrate it into the corporate structure? Ideally, companies have an integrative approach to design, thus creating a holistic value-added chain. This approach requires interdisciplinary and interactive cooperation in crucial areas of the company: design, development, production and sales (Fig. 3.4). At present, however, there seem to be only two of these four disciplines engaged in the process of "adding value," leaving vast potential untapped. These examples may serve to illustrate the point: if a company exclusively focuses on development and opts out of design, it does not create added value for the economy. The company only sells the product on paper; there is no marketing or production; it is exporting a blueprint. While this may be an intellectual achievement, it is not necessarily lucrative. A company focusing on design and marketing only will most likely locate its development and production facilities
Fig. 3.4 Integration of design into the corporate structure. Ideally, companies have an interdisciplinary and interactive cooperation in the areas of design, development, production and market
in low-wage countries. This is the typical approach of fashion companies, for example. There are also companies that are exclusively involved in development and production. They develop simple products or turn a successful idea into a product, but do not engage in any major marketing activities and have no specific design: they create the so-called Me Too or generic products with no identity or personality. In Germany, any company exclusively involved in production would most likely go out of business in the long run, as its high production costs would fail to yield any profitability. Ultimately, cooperation across all disciplines is the only sensible approach. A company that creates products needs to design them, develop them and sell them; first and foremost, however, it must train its employees. In the best case scenario, development, design and marketing are provided from one source. However, design must be understood as a self-sustainable and serious force, and not treated as a tool to be used by the development or sales departments as they see fit. Establishing this understanding within the corporation is not an easy task. Every company has a development, production and marketing department, so these are accepted necessities. However, the idea of having a design department in its own right is still unusual.
While the latter may be true for the industry at large, it cannot be applied to MINI. Compared with other corporate segments, design is rather cost-efficient. In automotive engineering, for example, design costs amount to a single-digit percentage of the overall development costs. When asked for their reasons for buying MINI, however, 80% of customers answered that they made their decision because of its design. If design is so important, why do most companies treat it like a poor relation? If we want to take a closer look at the four segments mentioned earlier, we need to remember what these terms describe – a large group of people. And no human is like another, particularly when it comes to their opinions and their approach to problems.
3.5 Lateral Thinking or What Is Longitudinal Thinking?

More often than not, finding the answer to a difficult question is not a linear process, but rather a path with several bends and detours. It takes one or several
people thinking outside of the box in order to find shortcuts as well as creative and effective solutions: this approach requires lateral thinking! If there are people who think laterally, then there must also be people who think longitudinally and others who think in circles (Fig. 3.5). What are the characteristics of these special ways of thinking, and how do they differ? First, let us take a closer look at the species of “circular thinkers.” About 90% of the population thinks in circles; they are just ordinary people. Adequately dubbed, circular thinkers revolve around themselves. Time and time again, they try and try, but do not dare to take a risk that would jeopardize their safety and comfort. Their way of thinking is reiterative; it moves in circles, between the two poles of “imitation” and “optimization.” Whenever they come across something good, they will copy it and optimize it to suit their requirements. This way, they never leave their personal comfort zone. Major developments, personal as well as creative, are rather rare because this would compel them to leave their comfort zone. Having thus struck a balance, they have created a stable equilibrium, which they value highly.
Fig. 3.5 Characteristics and relationships among people who think longitudinally, people who think in circles, troublemakers and lateral thinkers
Longitudinal thinkers are inconspicuous, streamlined and conventional. If required, these types will put their ego first and act unscrupulously, ruthlessly following their own interests – these people act according to the negative stereotype of the “successful” manager. Ultimately they will use the system to get on top of lateral and circular thinkers. They have no opinion on anything, never take any risks, but at the same time, they are very dominant. Like circular thinkers, they attach great importance to their safety and their own comfort zone, from which they never venture. These two types will bring up issues, but they will never inspire a team or act on ideas. They do not create innovation. Yet innovations are our only chance to handle future challenges in an adequate and successful way. We are missing a real “thinker,” somebody who thinks outside of the box, breaks new ground and dares to leave their own comfort zone – in short, we need lateral thinkers. Designers can assume this very role, because they like to challenge the conventional, want to understand, explore and create, but always with a new and fresh take on things. However, the other types often find lateral thinkers very hard to understand. To them, this way of approaching the world can seem unfathomable, elusive. Lateral thinkers are irritating because the others cannot comprehend, cannot or do not want to follow their thoughts. Therefore, lateral thinkers are often mistaken for querulous, cantankerous troublemakers. A querulous person is somebody with a destructive and disloyal attitude, somebody with a chip on his or her shoulder. But lateral thinkers are far from being querulous. They are very positive, creative and important characters. They are indispensable, and nothing would move forward without them. They provide valuable inspiration and incitement, and they will interrupt or
Fig. 3.6 The lateral thinker
stop processes whenever necessary – but only to start an active rethinking process and to elaborate on existing ideas. They are the only ones to leave their comfort zone to move things forward – they are constructive and innovative, and they think across all segments. If we put all these different types in an interactive situation, we would be presented with the following scene (Fig. 3.6): the circular thinker runs in circles in their wheel, while the longitudinal thinker sits atop, benefiting from the efforts of the others, who try to pull them along. But it is the lateral thinker who brings the whole construction forward, straining against the efforts of the querulous troublemaker, who tries to block, obstruct and stop the process. As mentioned above, lateral thinkers are often mistakenly perceived as querulous people who are out to cause trouble. Integrating lateral thinkers is very difficult for companies that need to function and do so with an interactive approach. Related problems are evident, particularly in areas where several departments overlap, as each of the four departments (sales/marketing, development, production and design) have their own rhythm and flow. Marketing, for example, is very focused on the present; it comes with the territory and for obvious reasons. Designers, however, live in the future, because their objective is the development of products that will be launched in years to come. This dissonance creates vast potential for conflict. The marketing department can take a closer look at the present customer and compare them to the past, analyze and evaluate their findings. But they are not able to look into the future, because they do not have the time and because it is quite simply not their job. Designers, however, must look ahead to the future and create today what they expect to be fashionable in 5 years time. There are no relevant data, no statistics. Hence, they have to decide themselves what may
be fashionable in years to come. Working as a designer in a corporate environment, one is faced with a conflicting and slightly uncomfortable situation. We are not hired as mere artists; we are expected to contribute to the corporation's economic success. In this function, designers have a major responsibility: the new product must not devalue the older models. Customers must not feel outdone by a new product that they perceive as superior. After all, for most people buying a car is the second most costly investment after buying an apartment or a house. Therefore, designers must take care not to focus on individual fulfillment alone, but rather aim to create a product that suits the brand and matches the target group (Fig. 3.7). As designers, we are the customer's advocates, as we use the new product for the first time in their stead. We need to anticipate their feelings, show intuition and empathy. For MINI, the brand's customers are the crucial parameter; we are striving not just to meet their requirements, but we also want to set new standards. In order to achieve this, we need to know what MINI stands for and where MINI comes from.
3.6 From Original to Original

MINI cars have been in continuous production since 1959. Over the past 50 years, MINI has covered a lot of ground, starting out as a car for the general public, a model for the masses that economized on gasoline and provided maximum internal space combined with the minimum external size. As time went on, however, MINI veered from its original specification. By the beginning of the 1960s, MINI had evolved into a sports car. Later, it developed
Fig. 3.7 Definition of design
into an icon of the pop and hippie culture, and today, it has become an accessory of the urban generation with their trendy lifestyle, a common sight in many big cities. From a designer's point of view, this car is interesting, because it was admired and used by many different people and because the original MINI for the masses evolved into an object with cult status – after that, it just became timeless. This car has cult status. But there is no recipe for producing cult objects. Attributes such as timelessness and cult status will only be associated with a product if and when people accept it and integrate it into their lives. As important as an emotional form language and product design may be, function is of similar significance, and it is downright essential, as it defines the product's true utility. In my opinion, the term "classic" that is often associated with MINI cars and MINI design describes this very combination of functional and esthetic endurance. Obviously, MINI cars are not the only ones with a classic design – but all such cars have one aspect in common: they were largely designed by engineers and not by designers. All of these cars were designed with a view to technical requirements. Hence, they will never be out of date or become unfashionable. Products that obey the principle "form follows function" have no expiration date. Many creative people love MINI for exactly this reason. The car radiates a certain assurance that conveys legitimacy. MINI cars are "masters of value," i.e., they always retain their value. This proves that customers are not just interested in beauty, but also in a product that stands the test of time. This approach is a pleasant counterpoint to the many short-lived things that we encounter in our day-to-day lives. MINI cars are characterized by their proportions rather than by their size – the overall image is the distinctive feature. MINI adheres to very definite design guidelines – all written in English, obviously – to
Fig. 3.8 MINI design sketch
ensure that this overall image is never jeopardized. The principle of "form follows function" is one of the maxims of "sustainable" design. Sir Alec Issigonis' original characteristic MINI concept is still valid today: the engine is mounted transversely to the direction of travel to provide passengers with maximum internal space in a car with the minimum external size (Fig. 3.8). MINI design follows the principle of the "Human Body Archetype": a MINI car embraces all three archetypes of the human body – a baby's likeable and friendly face, a man's striking and muscular shoulders, and a woman's curving and soft feminine figure – thus unwittingly addressing several different target groups at once. With its positive shapes it exudes a three-dimensional abundance. A very striking characteristic of the MINI appearance is its "stance on the wheels." Like a go-kart, the car has a wheel-at-each-corner look, which is highlighted by a wrap-around black band. The "stance on the wheels" is the main reason that handling a MINI car, especially when navigating corners at speed, is enormous fun. From the side view, MINI presents its "dynamic orientation," a dynamic wedge shape flowing in the direction of travel. This is created by the shoulder line, which rises to the rear, thus implying forward motion. The greenhouse opens up towards the hood, which underlines the wedge shape. The "jewelry
icons" are a particularly distinctive feature, as they represent the MINI brand's claim to premium quality. These high-quality chrome-finished details, which are rather uncommon in this segment, appear in the MINI radiator grille, headlights and door handles. In combination, these principles and elements create the "Emotional Sculpture" that is MINI (Fig. 3.9).
3.7 Sensual Design – Design for All Senses

The emotional appeal of MINI models has a lot to do with sensuality and the experience of feeling. Hence, experiencing a product with all our senses is a very important aspect – a good product stimulates all senses. A product is only successful if the overall experience is right. The product experience resembles a process of granulization: initially, we perceive the rough outlines of a product, and only later do we discover more and more details. The senses follow a clear hierarchy: first we see, then we hear, then we touch, and finally we smell or even taste (Fig. 3.10). Yet it is after the very first optical impression that we decide whether we like a product or not and whether we want to find out more about it or not.
Fig. 3.9 MINI design icons: stance on the wheels, dynamic orientation, and jewelry icons that create the "Emotional Sculpture"
Fig. 3.10 A good product stimulates all senses: first we see, then we hear, then we touch, and finally we smell or even taste
If customers fail to notice the design, they want neither to experience nor to discover any other characteristics of the product, for the simple reason that they are not interested in it. If, however, a product makes a convincing optical impression on customers, they will want to touch it to find out how it feels, whether it is warm or cold, soft or hard, light or heavy – because they want to find out whether the optical properties match the tactile properties. If you enter a car, for example, then what you feel, smell and hear should blend into a harmonious whole. How does the steering wheel or the instrument panel feel? How do the plastic components smell? How does the engine or the closing door sound? If all of these sensations work well together, they will make a likeable and authentic impression. Hence, MINI design goes much further than the mere visually formative process.
3.8 Intuition and Empathy

We do not know what the future will bring, but we have an idea. Every designer has a vision of the future; every architect has dreams and knows which style elements or materials they will use in the future. Everybody derives their idea of the future from the past and their experiences. One possible scenario could entail today's individual transport being gradually replaced by other means of communication. In the long term, this would turn cars into even more hedonistic and highly emotional assets, with less practical benefit to their users. Instead, moving about in private transport and controlling a vehicle in free-flowing traffic will become the much-coveted focus of interest. But what does this
have to do with empathetic design? Good design is empathetic by nature. Knowing what goes on in the minds of others, the customers, and knowing about their existing and future needs is the designer's day-to-day business. In this context, creating future traffic scenarios is particularly exciting. Hence, it is important to open up all senses to future ways of communication, no matter what they may be, to anticipate what future customers may expect, want and need, and to meet them every step of the way. Intuition plays a major role in this process. The more structured the approach, the harder it is for chance to make an
impact. Too much planning will lead to dead ends. Simply going ahead and doing something will achieve results. It takes a touch of genius to use this intuition not just for oneself, but to apply it to others, to find out who the others are, what they do and what they need. To know the future and to meet future challenges with more than just one's eyes wide open – we can do all this if we feel empathy. Take Alec Issigonis, for example. His intuition told him what to do when he designed the MINI. Without the right spirit and intuition, the process will not flow.
4
The Design and Development of Computer Games
Markus Gross, Robert W. Sumner, and Nils Thürey
Abstract The design of modern computer games is as much an art as painting, sculpting, music, or writing. Albeit relatively young and still evolving, game design is highly complex and requires a broad spectrum of artistic and technical skills. The following contribution reviews the process of designing and developing modern computer games. We will acquaint the reader with the most important fundamentals of game design, walk through the various stages of the design process, and highlight the specifics of each stage. We will analyze and elucidate the domain-independent principles underlying game design, such as iteration, evolution, consistency, and others. Our findings are illustrated by three case studies on computer games developed within the scope of the ETH Game Programming Laboratory class.
4.1 Introduction

Over the past 50 years, video games have evolved from simplistic 2D line drawings into highly complex software systems with rich media content, creative artwork, and complex game flow. Today, such video games share a 30-billion-dollar market, and their socioeconomic impact is significant. Individual AAA game projects operate with budgets of tens of millions of dollars, and their interdisciplinary game design teams comprise both artists and engineers. Video games have also been the single most important force driving the hardware development of personal computing, and contemporary game consoles, such as Sony's Playstation 3 or Microsoft's Xbox 360, rival the compute power of supercomputers from days gone by.
The first computer game, called Spacewar, was developed in 1962 at MIT (Graetz 1981, pp. 56–67) and subsequently commercialized as an arcade game in 1971. A further milestone in early game development was Pong, a very simple version of tennis developed in 1972. Pong was later released for home consoles and delighted a generation of video game enthusiasts. The continual increase in graphical complexity was evident in Space Invaders, a 1978 arcade game with colored spaceships displayed on raster screens. One of the most popular video games of the early eighties was Pacman, a game that appealed equally to both male and female demographics. The early eighties also spawned the first version of Nintendo's Donkey Kong, the beginning of a highly successful sequence of jump-and-run games. In the late seventies and early eighties, the first cartridge-based home console systems from Atari (2600 and 5200) and Nintendo (NES) came out, along with the blockbuster games Super Mario Bros. and Zelda. A major milestone of the late eighties was Nintendo's Game Boy, as well as the hugely popular title Tetris. In the late eighties and early nineties, game hardware moved to 16-bit architectures. Dramatically increased
compute and graphics power provided the platform for a series of graphically, artistically, and technically more complex game designs. Prominent examples from this era comprise the Mario and Zelda sequels, Sonic the Hedgehog, and others. The game titles from the early nineties clearly evidence the increased challenge in designing compelling video games with rich assets and graphics content. The mid-nineties brought another quantum leap with Sony's Playstation and Nintendo's N64, the first consoles with 3D graphics acceleration. This time also marks the transition from 2D visuals to full three-dimensional content. The power of three-dimensionality was harnessed by many game genres, such as sports, adventure, action, shooting, and puzzles. Around 2000, additional game hardware was released, including the Playstation 2, Game Cube, Xbox and others. In parallel, personal computers were empowered by sophisticated 3D graphics accelerators. The release cycles of the latest generation game consoles, specifically Playstation 3 and Xbox 360, evidence the dramatically increased complexity and cost of game hardware design. Expressive 3D content, very rich media assets, computationally expensive physics, and online functionality characterize the broad spectrum of titles and genres available at present. The addition of online functionality alone has created its own genre of massive multiplayer online games, such as World of Warcraft. The design of early video games focused mostly on technical and implementation issues, since gameplay was simple and artwork very limited. Compute resources were the limiting factor. Since then, the design of video games has evolved into a sophisticated art of its own, and it is being taught at art schools and universities all over the world. Game design poses great challenges as it encompasses artistic as well as technical elements, which, in a compelling design, cannot be separated from one another. The art of game design is relatively young and still evolving. As such, the literature on game design is limited, but continuously growing. For instance, S. Rabin's book (Rabin 2005) provides a holistic view of the topic, including some of the more formal and abstract concepts, such as fun and flow. For more practical purposes, T. Fullerton et al. give a workshop-like recipe for the design of computer games (Fullerton et al. 2004). The goal of this article is to survey and discuss the basic principles and core components of modern video
game design, development, and implementation. We will describe the process and its iterative nature, focus on the different stages of game development, and abstract generic design principles. While the paper focuses on generic principles applicable to all game genres, our experience draws upon a capstone class taught at the Computer Science Department at ETH Zürich over a period of 3 years (gamelab 2009). Our goals and experiences from this class have been summarized in a recent paper (Sumner et al. 2008), and many examples presented in this chapter are taken from the class projects. The organization of this article follows the main stages of game design. After giving an overview of the process in Sect. 2, we summarize the most important formal and technical elements of game design in Sect. 3. Section 4 focuses on the various aspects of fun and motivation. We discuss conceptualization as the first major stage of game design in Sect. 5, and Sect. 6 addresses preproduction and rapid prototyping. The actual production stage is explained in Sect. 7, while Sect. 8 focuses on evaluation, quality assurance and game testing. Section 9 provides examples from our game development projects of previous years.
4.2 Overview and Stages of Game Design

The notion of design varies significantly among different disciplines within art, science, and engineering, and the ideas inherent in architectural design differ significantly from software design, genetic design, or fashion design. At present, there is no commonly accepted definition of "design," and neither are there commonly agreed-upon rules or axioms to leverage a systematic approach to design. The term Game Design is relatively recent and emphasizes the creative process underlying the conception and implementation of a new computer game. Such creative elements comprise the definition of the game's gameplay, as well as the creation of game assets, the underlying story, the artwork, the media content, and the software. By contrast, game development is focused on implementation, execution, and quality control, and is, as such, centered on engineering. Game design is often conceived as an iterative process (Bates 2004), as illustrated in Fig. 4.1. The core
loop of iterative design entails the following steps. Ideas are generated, formalized, tested, evaluated, revised, and refined until no further improvements are possible. Then, the design transitions to the next stage where the process is repeated. The paradigm of iterative design thus describes the entire process from the initial idea to the playable game. The emphasis of iterative design lies in frequent testing and early prototyping. It is noteworthy that the same paradigm is utilized in many other design fields as well. Most notably, the creation of computer-animated feature films employs
Fig. 4.1 The concept of iterative design in its most generic form
a similar process, where artists and engineers collaborate on a major creative software project. Iterative game design allows one to gain an early and deep understanding of the game's gameplay and foundations, the central elements of a successful design. Iteration, as opposed to a waterfall model of software development, is much more cost effective and helps uncover design flaws early on. As the iterative design spirals down from the first concepts to the final game, initially vague and unstructured ideas coalesce and solidify. The progressive refinement of iterative game design is often broken up into four major phases, as depicted in Fig. 4.2 (Fullerton et al. 2004, p. 197).

1. In the concept phase, brainstorming is used to generate ideas for the game's gameplay and other central concepts. During brainstorming, the team tries to come up with as many ideas as possible. Such ideas can be structured using concept cards, whiteboards, inspiration boards or charts. The goal is to narrow ideas down to the few most promising ones and to write short summaries for each. In professional game development studios, concept art is often pinned up on the walls to inspire designers and developers. Another important paradigm of the concept phase is the creation of a physical prototype using paper, pen, and cardboard.
Fig. 4.2 The four major phases of game design. As the design progresses, structure dominates flexibility (adapted from Fullerton et al. 2004, p. 197)
This prototype serves for iterative playtesting and can be evaluated, revised, played, and tested until the designer is satisfied. Pictures and drawings symbolize main characters, assets, and artwork. First concept documents are written and concept art is created. The project management develops project plans and budgets, and contracts with publishers are finalized. This stage concludes with initial presentations (screenings) of the game, in which the team pitches the core elements of the game, the game's structure, the formal elements (see Sect. 3), and the project plan in front of a peer audience. The paradigm of peer presentation is also used in animated feature films, where peers review the progression of the story on a regular basis during internal storyboard screenings.

2. In the preproduction phase, a first playable game prototype is developed. This software prototype should model the core gameplay, but it need not contain extensive artwork, media, or graphics. Frequently, software tools are employed for rapid prototyping. Again, the gameplay is tested, evaluated, revised, and refined using the paradigm of iterative design. As an option, external game testers can be included. This process also helps to determine the required technology and game assets. The design document constitutes a refined and extended version of the concept document and outlines every aspect of the game. It provides detailed descriptions of the flow, the artwork, and the game's formal elements. At the project management level, plans are refined, and resources must be allocated. This planning helps avoid a major redesign at a later and hence more expensive stage.

3. The production phase is focused on the development of all game assets, the design of levels, the development of required software technology, and the production of alpha code. At this stage, structured design clearly prevails over flexibility. All team members verify the correctness of the design document. Project management challenges increase, software production must be coordinated, and the workload is distributed among team members. Individual strengths and weaknesses have to be considered. At the end of this phase, the first fully functional version of the game, the alpha version, should be available. Iterative design is consistently applied to avoid major redesigns at this stage.
4. In the final, quality assurance (QA) phase, the alpha code is utilized for extensive game testing to discover flaws and glitches, and to fine-tune the game's difficulty level. The gameplay at this stage must be solid, and the major focus is placed on usability and playtesting. The QA phase often involves external game testers. More recently, some professional game development teams have designed very systematic, almost scientific, approaches to game testing, including detailed statistical evaluations and visualizations (Halo 3).

In summary, we can identify the following domain-independent design paradigms:
- Iteration as opposed to waterfall
- Peer review and regular screenings
- Rapid prototyping at all stages
- Evolution of a design document (storyboard)
- Quality assurance by systematic testing and evaluation

The following sections will discuss each of the phases of game design in greater detail and exemplify the above paradigms. As preparation, we will first acquaint the reader with some fundamental considerations about the nature of computer games.
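Read as a process, the iterative loop that runs through all four phases can be sketched in a few lines of Python. This is an illustrative sketch only: the phase names follow the list above, while the callback names and the notion of a "satisfied" exit criterion are assumptions introduced for the example, not an API prescribed by the method.

```python
# A minimal sketch of iterative game design (illustration, not a prescribed API).
# The callbacks test, evaluate, revise and satisfied stand in for playtesting,
# peer review, refinement, and the decision that a phase goal has been met.

PHASES = ["concept", "preproduction", "production", "quality assurance"]

def run_phase(phase, design, test, evaluate, revise, satisfied):
    """Generate-test-evaluate-revise until the phase goal is met."""
    while not satisfied(phase, design):
        prototype = test(phase, design)     # physical, video, or software prototype
        findings = evaluate(prototype)      # playtesting feedback, peer screening
        design = revise(design, findings)   # refine promising ideas, discard weak ones
    return design

def design_game(initial_idea, test, evaluate, revise, satisfied):
    design = initial_idea
    for phase in PHASES:
        design = run_phase(phase, design, test, evaluate, revise, satisfied)
    return design  # the finished, playable game
```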
4.3 Formal and Technical Elements of Computer Games

4.3.1 Game Type and Genre

At the most basic level, a computer game is characterized by the category into which it falls – the game genre. Such genres include action, adventure, shooters, jump-and-run, sports, racing, fighting, role-playing, simulations, strategy, puzzle, music, dance, artificial life, and others. These genres define the basic principles from which the player derives and retains his or her motivation to play. Motivation can be generated from hunting and killing, collecting items, direct competition, general reward for success, being a hero, having metaphysical abilities or superpowers, problem solving, learning control skills, socializing, and other similar principles. A more careful analysis of the aforementioned principles reveals that they
fundamentally relate to the following four psychological abstractions:
- The archaic notion of hunters and gatherers
- Darwin's principle and direct competition as the survival of the strongest
- The desire to be superior and extraordinary
- The desire to manipulate and control other beings
4.3.2 Formal Elements

At the next level of refinement, a game is defined through its formal elements. The formal game elements refer to components that form the structure of a video game. Such components entail players, objective, procedures, rules, resources, conflicts, boundaries, outcome, goals, and others. We will discuss a few of the most significant components here. One of the most important formal game elements is the number of players, their roles, and their interaction patterns. Player interactions can be single player versus game, multiplayer competition, multiplayer
cooperative, team competition, etc. Some important player interaction patterns are illustrated in Fig. 4.3. Of similar importance is the game’s objective. The objective defines what the player is trying to accomplish. It is hence central for retaining the player’s motivation and can include capturing, chasing, racing, rescuing, exploring, searching, and others. The objective is bounded by the rules of the game; it must be challenging, but achievable. Many modern designs utilize artificial intelligence (AI) to adjust difficulty levels to ensure the player always has an achievable goal. The objective further sets the tone of the game. The game procedures define the methods of play and actions to achieve the objective. Such procedures involve actions and states, such as starting an action, progressing an action, or resolving an action. Procedures also have to consider system limitations, such as resolution or latency. The game rules constitute the allowable player actions. Rules can define concepts (chess or poker), restrict actions (soccer), or determine actions (puzzles). The two major design paradigms for the game’s rule base are:
Fig. 4.3 Illustration of game player interaction patterns: single player vs. game, multiple individual players vs. game, player vs. player, unilateral competition, multilateral competition, cooperative play, and team competition (adapted from Fullerton 2004, p. 46)
- Consistency
- Logical correctness

The resources, often called game assets, are used to achieve goals and objectives, but also cater to the above-mentioned principles of collecting and gathering. The subconscious aspect of such assets requires careful consideration during design as it significantly influences the player's motivation. Examples include lives, units, money, health, objects, terrain, and time. Most similar to creative writing, the notion of conflict plays a central role in the conception of a computer game. Such conflict arises when a player is trying to achieve the goals within the game's constraints and boundaries. Conflict is essential to create a challenging game and to uphold motivation. Examples include obstacles, opponents, or dilemmas. Finally, there is the game's outcome. Outcome describes the measurable achievement in the game. In most games, the outcome relates to a "winning" situation; however, some games have different outcomes. An interesting class of exceptions is massive multiplayer online games (MMOGs), such as World of Warcraft, where the concept of a single outcome is dissolved and replaced by different kinds of rewards, partly related to the player's social status within the online community. Such games have no clearly defined outcome, continue forever, and thus have high potential for addictive behavior.
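One way to make these formal elements tangible during early design is to record them as a small data structure at the start of a project. The sketch below is only an illustration: the field names mirror the elements discussed in this section, while the sample values describe an invented game, not any project from the class.

```python
from dataclasses import dataclass
from typing import List

# A sketch of the formal game elements as a design record.
# Field names follow the elements discussed above; the sample values are invented.

@dataclass
class FormalElements:
    players: str            # e.g., "single player vs. game", "team competition"
    objective: str          # what the player is trying to accomplish
    procedures: List[str]   # methods of play and actions
    rules: List[str]        # allowable and restricted actions
    resources: List[str]    # game assets used to reach the objective
    conflict: str           # obstacles, opponents, dilemmas
    boundaries: str         # where the game world ends
    outcome: str            # measurable achievement, if any

example = FormalElements(
    players="single player vs. game",
    objective="escort the caravan to the city before nightfall",
    procedures=["move", "trade", "fight"],
    rules=["no movement at night", "carrying capacity limits trading"],
    resources=["gold", "health", "time"],
    conflict="bandits ambush the caravan on narrow passes",
    boundaries="a fixed overworld map",
    outcome="caravan arrives (win) or is lost (lose)",
)
```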
4.3.3 Technical Elements

In addition to formal game elements, there is a variety of technical elements that define a computer game. Modern professional games address a wide spectrum of problems related to information technology and computer science, and the employed technology has a significant impact on the production cost. As a design principle, we postulate:

TFF: Technology follows function (gameplay)

A detailed discussion of all relevant technical elements for video games is beyond the scope of this paper, and we limit our considerations to a summary of the most relevant ones. We refer the interested reader to technical textbooks on game development, such as Irish (2005), Rabin (2005), and Novak (2007).
First and foremost, there is software engineering. Modern games demand a variety of sophisticated methods and algorithms, making the typical code base of professional game projects rather complex. The code complexity and the involved cost create the need for maximum reusability between different game projects. Such requirements are best addressed by advanced software engineering methods with proper class design, hierarchy, and the definition of interfaces. To the extent possible, standardized APIs and libraries are utilized. Microsoft's XNA (Miles 2008) is a good example of a low-level game API. Higher level APIs are available for physics, such as Nvidia's PhysX (Nvidia PhysX). The design principle underlying game software can be summarized as:

Simplicity: As simple as possible, as complex as required

A second important technical game element relates to 3D graphics. Modern 3D graphics require complex geometry and appearance specification, including the scene structure (scene graph) and organization, import and export of 3D models, their positioning and transformation, advanced pixel shading and lighting, texture mapping specifics, animation, antialiasing, and video replay. The way the player interfaces with the game comprises another important factor. User interaction must be properly specified early on. Among the variety of controllers, we can choose between conventional ones, such as mouse, joystick, buttons, dials, keyboard, cameras, and microphone, or design customized controllers to leverage gameplay. Harmonix's Guitar Hero evidences the great benefit of a proprietary controller design. Special attention must be devoted to the control of viewing and cameras as well as the associated degrees of freedom. In general, camera control in computer games is a highly nontrivial problem and the subject of ongoing research (Oskam et al. 2009). The game's artwork and assets constitute the central elements that determine the game's appeal and esthetics. As such, the creation of the artwork can easily consume a substantial portion of the production budget. 3D models comprise scenes, landscapes, characters, and objects. The proper choice of modeling software is highly important. In addition to 3D models, the design of texture maps makes up a relevant part of the artwork. Texture maps are utilized for a variety of different effects, including general texture maps,
shaders, lighting effects, surface properties, and data for more general computations. Most often, the creative use of real-world digital photos extends the repository of synthetically generated textures. Another salient element is sound and audio. Sound includes both general background or theme music as well as auditory feedback during gameplay. Sound contributes substantially to the feel of the game and draws upon the longstanding experience in cinematography of enhancing emotional response using sound elements. It is crucial to clarify the required sound elements early on, determine why they are needed, and how they contribute to the gameplay. The theme score of Halo exemplifies the sophistication of musical compositions as part of modern games. Game physics is gaining increasing importance for the realistic simulation of rigid bodies, collisions, friction, explosions, crashes, deformations, cloth, and other effects. One of the challenges of game physics, as opposed to scientific computations or special effects, is real-time performance in combination with unconditional stability and robustness. As such, most algorithms for game physics abstract from the actual physical laws and strive instead for visual plausibility. In addition, careful design of data structures is central for performance optimization. Since the design of these physical algorithms (Eberly 2003) is highly nontrivial, game development teams often resort to physics libraries, such as Nvidia's PhysX or Intel's Havok. Lastly, one of the most important technical game elements is the game's artificial intelligence (AI), or the game's ability to adapt to the player and to compute believable behavior of its characters. Game AI encompasses path planning, modeling of agent behavior, modeling of the player, global control of the game's state, and other processes. It is essential to achieve adaptivity in a video game. Most commonly employed algorithms include A* path planning, agent models, state machines, search and prune, and, more recently, statistical learning (Buckland 2004).
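Of the AI techniques listed above, A* path planning is probably the most widely used. The sketch below shows its core idea on a uniform-cost, 4-connected grid with a Manhattan-distance heuristic; production engines add navigation meshes, tie-breaking, and hierarchical planning, so this is a teaching example under simplifying assumptions rather than the algorithm as any particular engine ships it.

```python
import heapq

# Minimal A* path planning on a uniform-cost, 4-connected grid (illustrative sketch).
# grid is a list of equally long strings; '#' marks blocked cells.
# start and goal are (row, column) tuples.

def a_star(grid, start, goal):
    def h(cell):
        # Manhattan distance: an admissible heuristic on a 4-connected grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_heap = [(h(start), start)]   # entries are (f-score, cell)
    came_from = {start: None}
    g = {start: 0}                    # best known cost from start

    while open_heap:
        _, cell = heapq.heappop(open_heap)
        if cell == goal:              # reconstruct the path backwards
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] != '#':
                new_g = g[cell] + 1
                if new_g < g.get(nxt, float('inf')):
                    g[nxt] = new_g
                    came_from[nxt] = cell
                    heapq.heappush(open_heap, (new_g + h(nxt), nxt))
    return None                       # no path exists

# Example: route an agent around a small wall
level = ["....",
         ".##.",
         "...."]
print(a_star(level, (0, 0), (2, 3)))
```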
4.4 Understanding Fun

When asked why someone plays a video game, "fun" will be at the top of the list of reasons. Games are fun. That's why we play them. Thus, the concept of "fun" is crucial in game design, and understanding
“fun” is critical in developing a successful game. While “fun” may seem like a nebulous concept, the theory of “Natural Funativity” (Falstein 2004) connects this concept to our evolutionary roots. Natural Funativity suggests that games and other playful pastimes are thought to be fun and enjoyable not by chance, but by the virtue of evolution. Nearly all games (video and otherwise) involve some form of challenge, such as running fast, throwing a ball or spear, or solving a puzzle. The reason most people describe these activities as fun is because, in our deep ancestral origins, those who happened, by the randomness of genetics, to enjoy such activities in their playtime were better prepared to handle lifethreatening situations that borrowed similar skills. Those who enjoyed running with their friends were more likely to outrun the tiger; those who enjoyed throwing balls as a pastime were able to survive as hunters; those who enjoyed solving puzzles were more likely to outsmart their enemies. Thus, the things we consider fun are not accidental, but rather were specifically favored in evolution as the pastimes that increased chances of survival. The theory extends beyond physical and mental challenges to social aspects of the human experience. Our desire to form groups, interact with colleagues, chat with friends, tell stories, and flirt with others is rooted in our tribal hunter-and-gatherer ancestors, since those who preferred human bonding over going it alone were more likely to survive. With the theory of Natural Funativity in hand, the concept of “fun” in games is much less elusive. Firstperson shooters speak to our origins as hunters. Roleplaying games contain the element of exploration, which we consider fun since, in evolution, those who enjoyed exploring were more likely to find better shelter, better food, and other items necessary for survival. Games like Tetris and many other puzzle games help us hone our logic, perception, problem solving, and pattern recognition skills – all important in the history of our survival. On the social side, the cinematic nature of games appeals to our roots in storytelling. Games that focus on interacting with friends and flirting speak to our tribal roots in their purest form. Finally, it comes as no surprise that many of the most successful games combine several aspects of fun – physical, mental, and social – to resonate more strongly with our hard-wired concept of fun.
4.5 Conceptualization

Although the technical sophistication of modern video games is enormous, with hundreds of thousands of lines of source code comprising countless algorithms and mathematical computations, game design – which ultimately makes the player experience so rewarding – is as much an art as painting, writing, sculpting, or storytelling. In fact, game design draws heavily from all of these artistic endeavors, as well as many more. As is the case with any highly creative process, there is no formula, recipe, or other step-by-step procedure to replicate the creativity necessary for game design. With no surefire approach, many people, especially beginners, find the prospect of designing a game from scratch extremely intimidating. Conceptualization, or, in short, the initial creative spark that leads to the core game idea, causes particular angst. While creativity itself resists formal explication, the creative process can be formalized to help harness the natural creativity present in everybody. Here, we describe formalizations employed by both seasoned game designers and neophytes alike to aid in the conceptualization process when the core game idea is developed.
4.5.1 Write it Down!

Although the first rule may sound mundane, it is extremely important: write things down! Ideas come all of the time – day, night, walking, running, driving, showering, eating, listening to music, etc. You may even wake up in the morning with an idea in your head from the dream you were having. If you do not write it down immediately, this ephemeral thought may disappear. Thus, the first formalization to assist in game design is training yourself to write down each and every idea that you have.
4.5.2 Brainstorming

Brainstorming is a formal way to stimulate creativity and the free flow of ideas. There is a multitude of brainstorming techniques, ranging from list creation, in
which you list everything you can think of about a particular topic, to randomization, in which you select words at random from the dictionary or from a stack of prearranged cards and consider making a game related to the words, to research, in which you learn as much as possible about a particular topic, to mind mapping, where you expand ideas in a radial fashion to represent semantic connections between topics (Fullerton et al. 2004). The goal and focus of brainstorming is always to generate lots of ideas. When working in a team, it is important to maintain a positive attitude and avoid criticism so that ideas flow unfiltered. Although brainstorming offers no guarantee (Gabler et al. 2005), it has the side effect that new ideas may continue to come when least expected. Have pencil and paper ready, and write them down!
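The randomization technique in particular lends itself to a trivial helper script. The sketch below draws a few unrelated words as a game-idea prompt; the word pool is an invented stand-in for a dictionary or a stack of prepared cards.

```python
import random

# A tiny sketch of the "randomization" brainstorming technique:
# draw a few unrelated words and look for a game idea that connects them.
# The word pool is an invented stand-in for a dictionary or card stack.

word_pool = [
    "lighthouse", "swarm", "gravity", "origami", "market", "echo",
    "glacier", "clockwork", "harvest", "mirror", "nomad", "tide",
]

def draw_prompt(n_words=3):
    return random.sample(word_pool, n_words)

for _ in range(3):
    print("Make a game about:", ", ".join(draw_prompt()))
```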
4.5.3 Organization and Refinement
When a few brainstorming sessions are added to the normal flow of ideas, you may find that your list becomes unwieldy. At this stage, search for a way to organize and categorize your ideas. As you do so, look for patterns or structure that might hint at a profitable game direction. When going over your idea pool, select the most promising ones and expand upon them with a new, more directed brainstorming session. Restricting the potential landscape of ideas can often stimulate creativity and lead to clever or new approaches to a topic. As this process is iterated, continue to highlight and expand the best ideas while considering how they could map to the formal and technical elements of game design described in Sect. 3.
4.6 Prototyping
A game prototype represents the first working version of the formal system describing the game. At the prototyping stage, only a rough approximation of the artwork, sound, and features of the game needs to be considered. This allows a designer to focus on the fundamental game mechanics, instead of worrying about production-related issues. The goals of prototyping are twofold: define the gameplay in its purest
form and learn whether the core mechanics hold the player's interest. In addition, a prototype can help to balance the rules and discover emergent behavior. The complex interactions of gameplay elements typically make it difficult to uncover play patterns without a concrete realization in the form of a prototype. As a first step, the core gameplay can be summarized by describing the game's core concept in one or two concise sentences that define the single action a player repeats most often while striving to achieve the game's goal. Over time, the meaning and consequences in the game might change, but the core gameplay will remain the same (Fullerton et al. 2004). A prototype can be realized in a variety of ways. One popular option is to create a physical prototype using paper, cardboard, and a set of counters, since this works very well for testing game mechanics, rules, and procedures. Figure 4.4 shows an example where a team from the game design class at ETH outlines how they balance the units of their strategy game using a paper version. However, it can be difficult to test action-oriented games in this fashion. A second prototyping option is to create a video-based animatic or a storyboard. Such a visual prototype can capture the user experience and communicate the game's ideas to others. One drawback of this form of prototyping is that videos can be difficult to produce. A third option is a software prototype. Tools for rapid software development, such as Flash/Shockwave, Visual Basic, or level editors of existing games, are well suited for this task. Figure 4.5 shows a game prototype created within only a few days using the Director software. It served as the basis for the creation of the game Battle Balls and already contained the game's core mechanics that were later included in the console version. Once the core gameplay has been explored and refined on the prototype, the design process can focus on other areas of the game, such as extending the basic feature set, the game's controls, or its interface. At later stages of the development process, prototypes for specific elements of the game are created, again focusing on the underlying mechanics of the new element. Such prototypes can be used to demonstrate novel control schemes or new visual effects.
Fig. 4.4 A paper prototype for a strategy game is shown on the left. Each unit is represented by a piece of cardboard. The two images to the right depict the game's units in the final software version
Fig. 4.5 On the left, an early software prototype of the game Battle Balls is shown. It is already fully playable and contains all major gameplay elements of the final version, shown on the right hand side
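As a minimal sketch of how little code a first software prototype can require (our own illustration, not taken from any of the class projects), the following Python program runs a Monte Carlo simulation of duels between two hypothetical unit types. It is the kind of throwaway program a designer might write to tune unit balance before any artwork or engine code exists, complementing the paper prototype shown in Fig. 4.4; the unit statistics and combat rule are invented for the example.

```python
import random

def duel(attacker, defender, rng):
    """Simulate one fight; damage per blow is a random roll up to the unit's attack,
    reduced by the opponent's armor but never below 1. Returns True if the attacker wins."""
    a_hp, d_hp = attacker["hp"], defender["hp"]
    while True:
        d_hp -= max(1, rng.randint(1, attacker["atk"]) - defender["armor"])
        if d_hp <= 0:
            return True
        a_hp -= max(1, rng.randint(1, defender["atk"]) - attacker["armor"])
        if a_hp <= 0:
            return False

def win_rate(unit_a, unit_b, trials=10_000, seed=0):
    """Estimate how often unit_a beats unit_b over many simulated duels."""
    rng = random.Random(seed)
    wins = sum(duel(unit_a, unit_b, rng) for _ in range(trials))
    return wins / trials

# Hypothetical unit statistics, to be tuned until the matchup feels fair.
knight = {"hp": 30, "atk": 6, "armor": 2}
archer = {"hp": 20, "atk": 8, "armor": 0}
print(f"knight beats archer in {win_rate(knight, archer):.0%} of duels")
```

Adjusting a single number and re-running the script gives the same kind of rapid feedback on balance that the cardboard counters provide on the tabletop.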
4.7 Playtesting
Testing a game is itself a continuous, iterative process that overlaps all stages of game design. The goal of playtesting is to gain insights into how players experience the game, identify strengths and weaknesses of the design, and learn how to make the game more complete, balanced, and fun to play. Moreover, the results of a playtesting session can provide the necessary evidence to abandon unsuccessful parts of a game
design. Since making significant changes to a game's design is more difficult at the later stages of the production cycle, it is important to start testing very early on in the design process. Although a designer constantly tests his or her design as a game is created, the emotional attachment to one's own ideas makes it crucial to have external testers review the game. Friends and family are good candidates to give feedback during early stages of the development. In later stages, unrelated testers can give more objective feedback and offer a fresh, unbiased viewpoint. Usually, after a selection process, the testers are invited to a testing session individually or in groups. During these sessions, it is vital to observe the testers neutrally. An explanation of the rules or the grand vision of the game might bias a tester's opinion and prevent him or her from giving useful feedback. During the testing sessions, it is important to gather as much information as possible. The testers can be observed while they are playing the game, which might give insights into where players typically get stuck, where they become frustrated, or which obstacles and enemies are too easy to overcome. After a testing session, a tester can be asked to answer questions about formal aspects of the game. Typical questions are whether the objective of the game was clear at all times, whether a winning strategy became apparent, and whether the tester found any loopholes in the rules. Finally, each testing session is followed by an analysis stage to determine which conclusions should be drawn from the testers' feedback and which parts of the game design might have to be changed as a consequence. A very elaborate form of playtesting is performed by Bungie Studios, creators of the Halo franchise on the Xbox (Halo 3 2009). In Bungie's own research facility, testers are invited to play a current version of the game. While they are playing, a database captures all aspects of their performance: locations of deaths, weapons, vehicles, extras, the player's progress over time, as well as a full video of the playing session. For Halo 3, more than 3,000 h of gameplay from about 600 testers were analyzed. This analysis made it possible to fully balance the multi-player game and minimize frustration during the single-player campaign. With this large amount of gathered data, state-of-the-art data mining techniques can be used to extract information. One possibility is to create "heat maps"
that visualize which areas are occupied most often in multi-player matches, or to add color-coded time stamps to the map of a level. The latter can be used to quickly identify regions where a player performs an undesired amount of backtracking and has thus lost focus on the game's objective.
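As a rough illustration of how such telemetry could be turned into a heat map, recorded player positions can simply be binned into a two-dimensional occupancy grid. The sketch below is our own example in Python with NumPy; the data format, a plain list of (x, y) positions in level coordinates, is an assumption and not Bungie's actual pipeline.

```python
import numpy as np

def position_heatmap(positions, level_width, level_height, bins=64):
    """Bin recorded (x, y) player positions into a 2D occupancy grid.

    positions: iterable of (x, y) tuples logged during playtest sessions.
    Returns a bins x bins array whose entries count how often each cell was
    occupied; bright cells reveal choke points, empty cells reveal dead zones.
    """
    xs, ys = zip(*positions)
    grid, _, _ = np.histogram2d(
        xs, ys, bins=bins, range=[[0, level_width], [0, level_height]]
    )
    return grid

# Example with synthetic data standing in for real session logs.
rng = np.random.default_rng(0)
fake_log = rng.uniform([0, 0], [100, 100], size=(5000, 2))
heat = position_heatmap(fake_log, level_width=100, level_height=100)
print("most visited cell count:", int(heat.max()))
```

The resulting grid can be rendered as a color overlay on the level map, which is essentially what the heat maps described above provide to the designers.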
4.8 Case Studies
To illustrate the design principles outlined in the previous sections, we present and discuss three game projects, selected from the game design class at ETH Zurich, that exhibit different design goals and challenges. Titor's Equilibrium is a two-player game based on the physical interactions of objects in an abstract futuristic setting, Sea Blast focuses on multi-player submarine battles, and Toon Dimension features cooperative gameplay in a stylized comic world. Videos of all games developed during the class are available on the course website (gamelab 2009). Titor's Equilibrium was developed by three students of the game class in 2007 and takes place in an abstract futuristic setting inspired by films such as Tron. The game is designed for two simultaneous players, where each player controls a fragile ghost that can survive only for a limited amount of time in mid-air. To survive for a longer time, and to attack the opponent, each player can enter geometric objects that are spread throughout the game's levels. Depending on the type of object, the player has different abilities of attack and defense. Winning is achieved by destroying the other player's object and preventing him or her from entering another one. The different attack styles are balanced in a rock-paper-scissors fashion. The visual design of the game uses abstract geometric forms and sets the different game elements apart by form and color, as shown in Fig. 4.6. This abstract style was deliberately chosen in response to the limited amount of time available for development during the class. Complex shading techniques give the game elements an interesting look. The computation of accurate physical interactions between the objects presented an additional difficulty during the realization of this game. The students chose to implement this part of the game engine themselves, which comprised a large portion of the overall development process.
Fig. 4.6 Concept art and several screenshots at different development states of the game project "Titor's Equilibrium." The painting on the left is an early sketch showing the game elements and prescribing the game's visual style. The two middle images show technology tests for rendering the game objects and smoke trails. The right image shows the first playable version with preliminary graphics
Fig. 4.7 A typical scene from the final game: the green player attacks the orange one in the foreground by dashing forward through several stacked boxes
Fig. 4.8 Three screenshots from the Sea Blast game: from left to right, the title screen, a scene from a four-player battle, and two players fighting against computer-controlled enemies
An example of a player interacting with the environment is shown in Fig. 4.7. Sea Blast was realized during the game class of 2008 (see Fig. 4.8). The game can be played by up to four players, either fighting against each other or against computer-controlled enemies. The setting of the game is an underwater battle between submarines. The game’s basic concept is very simple: players control a small vehicle in the level and can shoot at their opponents with different types of weapons. This simple initial design permitted careful extension of the game’s foundation with balanced gameplay elements. For example, vehicles with different characteristics can be selected at the start of each playing session, allowing for different winning strategies. To set their game apart from similar games, the students chose to implement a physical simulation to calculate the motion of the water filling each level. This results in complex interactions between the water flow and the players and weapons. The physical simulation is a strategic element that can be used to distract or damage the opponents, for example by creating a local vortex that draws objects to its center. The Sea Blast game highlights the importance of an overall balanced game design. While the underlying
concept makes sure the game is intuitive and fun to play, the technological aspects, such as the fluid simulation, set it apart from other games. Great artwork and sound effects round off the game's experience. The importance of all of these elements also means that a team working on such a game should consist of talented people who are able to produce all elements of the game with the desired quality. In this case, the Sea Blast team achieved a very polished final version, and the game is publicly available for purchase via the Xbox online marketplace. As the name suggests, Toon Dimension's (2009) setting and visuals are inspired by cartoons. An evil scientist has split the world's dimensions, and two players in different dimensions must cooperate to win the game. While most of the environment can be seen and modified by all players, certain gameplay elements belong only to a single player's dimension and can be modified exclusively by this player. This setting allows for interesting puzzles in which the players must collaborate to combine pieces from different dimensions. The game's graphics and animations follow a comic style. Cel-shaded three-dimensional models give the game a painted look, and cartoons inspire
Fig. 4.9 The game Toon Dimension features comic-style visuals and complex cooperative gameplay
the exaggerated motions of players and enemies. In Fig. 4.9, several screen shots of the game illustrate its unique visual style. In contrast to the previous two games, Toon Dimension’s gameplay is primarily based on the cooperation of the players. As a result, a great deal of effort was required to design the multiplayer puzzles. The team focused on creating a limited number of high-quality levels. As evident from these three games, multiplayer gameplay is a popular choice in our game classes. In a multi-player game, the actions of other players comprise a significant portion of the game’s mechanics, which simplifies development and makes such a design well suited for small teams with a limited time budget.
4.9 Conclusion
Perhaps more than any other discipline, game design is both an art and a science, a technical endeavor and a form of artistic expression. The most cutting-edge games stress the technical sophistication of modern console hardware and incorporate advanced algorithms for graphics, physics, interaction, artificial intelligence, and data structures, while at the same time presenting never-before-seen visual styles, exploring new forms of storytelling, and inventing new ways for humans to interact with computers and with their friends, families, and colleagues. The technology drives the design by providing ever-increasing computational power and the ability to express more sophisticated visual styles. The design, in turn, drives the technology by pushing the limits of modern-day hardware and software in ways that scientists could never have conceived. In this article, we have described the game design process and how it draws upon both technical and artistic elements. The earliest games – Spacewar, Pong, Space Invaders – seem exceedingly simple by today's standards. Indeed, both the technological and artistic sophistication of video games have increased at
an unabated pace since the first game was made. The iterative process of modern game design stresses an early understanding of a game's core mechanics and visual style so that design elements can be refined and tuned. Ideas are generated, prototypes built, game assets created, and gameplay tested in a spiral loop that zeroes in on a highly polished final product. Formal elements such as players, procedures, rules, objectives, resources, conflicts, and outcome are developed alongside the visual style, user interface, and thematic score. Brainstorming and other formalizations of conceptualization can greatly help with this design process. Prototypes are used to test the design and validate that it lives up to our inherent standards of fun. Playtesting completes the iterative process with a formal method to obtain feedback and improve upon a game's design. The final output is a collection of artwork, 3D models and animated characters, algorithms, procedures, stories, music, and sounds – all unified and coordinated by a single game design to deliver a wondrous experience that can only be found in the world of gaming. Acknowledgements The authors would like to thank all students and development teams participating in the ETH Game Programming Laboratory classes of 2007, 2008, and 2009. For Titor's Equilibrium: Marino Alge, Gioacchino Noris, Alessandro Rigazzi; for Sea Blast: Urs Dönni, Martin Seiler, Julian Tschannen; for Toon Dimension: Peter Bucher, Christian Schulz, Nico Ranieri.
References
Bates B (2004) Game design, 2nd edn. Course Technology PTR. ISBN-10: 1592004938
Buckland M (2004) Programming game AI by example. Wordware Publishing. ISBN: 978-1556220784
Eberly DH (2003) Game physics. Morgan Kaufmann. ISBN-10: 1558607404
Falstein N (2004) Natural funativity. Gamasutra. http://www.gamasutra.com/features/20041110/falstein_01.shtml (retrieved July 2009)
Fullerton T, Swain C, Hoffman S (2004) Game design workshop: designing, prototyping, and playtesting games. CMP Books, USA
Gabler K, Gray K, Kucic M, Shodhan S (2005) How to prototype a game in under 7 days: tips and tricks from four grad students who made over 50 games in one semester. Gamasutra. http://www.gamasutra.com/features/20051026/gabler_01.shtml (retrieved July 2009)
Gamelab (2009) Game programming laboratory. http://graphics.ethz.ch/teaching/gamelab09/home.php (retrieved July 2009)
Graetz JM (1981) The origin of spacewar. Creative Computing. http://www.wheels.org/spacewar/creative/SpacewarOrigin.html (retrieved July 2009)
Halo 3 (2009) How Microsoft labs invented a new science of play. WIRED Magazine, Issue 15.09. http://www.wired.com/gaming/virtualworlds/magazine/15-09/ff_halo (retrieved July 2009)
Irish D (2005) The game producer's handbook, 1st edn. Course Technology PTR. ISBN-10: 1592006175
Miles R (2008) Microsoft XNA Game Studio 2.0: learn programming now! Microsoft Press. ISBN-10: 0735625220
Novak J (2007) Game development essentials: an introduction, 2nd edn. Delmar Cengage Learning. ISBN-10: 1418042080
NVIDIA PhysX. http://www.nvidia.com/object/physx_new.html (retrieved July 2009)
Oskam Th, Sumner RW, Thuerey N, Gross M (2009) Visibility transition planning for dynamic camera control. In: Proceedings of the SIGGRAPH/Eurographics Symposium on Computer Animation, New York
Rabin S (ed) (2005) Introduction to game development. Charles River Media. ISBN-10: 1584503777
Sumner RW, Thuerey N, Gross M (2008) The ETH game programming laboratory: a capstone for computer science and visual computing. Game Development in Computer Science Education (GDCSE), ACM, New York
5 Drug Design: Designer Drugs
Gerd Folkers, Elvan Kut, and Martin Boyer
Abstract This article examines the contemporary understanding of rational drug design in the development of pharmaceutical products with therapeutic and enhancing properties, and its extension to designer drugs on the grey and black markets. The latter not only leads to ethical, and hence societal, questions about standardizing the state of well-being of body and mind, but also effectively sets the boundaries between therapeutic and enhancing drug use. At the same time, molecular design is based on a rigorous model of living beings. Processes of health and disease are assigned to functioning or malfunctioning biochemical structures and systems that can be targeted and pharmacologically modified. In reviewing the history of the rational approach in drug design, it becomes apparent that, contrary to its premises, the specific single-targeted chemical intervention to cure diseases has been insufficiently productive. The overwhelming complexity of the living can only be partially understood with still comparably weak technologies, which makes it difficult to succeed without serendipitous discoveries. Here, advances in systems biology promise improvements in drug development strategies. A straightforward approach is taken in the discovery of new designer drugs. By testing molecular variations of known drugs directly in humans, the official drug test requirements can be bypassed. In doing so, new substances can be produced quickly and cheaply, though at the price of health risks to uninformed test subjects.
"The Water from the tap. Of course. These changes in me had begun the moment I drank it. There was something in it, clearly. Poison? But I'd never heard of any poison that would... Wait a minute! I was, after all, a steady subscriber to all the major scientific publications. In just the last issue of Science Today there had been an article on some new psychotropic agents of the group of so-called benignimizers (the N,N-dimethylpeptocryptomides), which induced states of undirected joy and beatitude. Yes, yes! I could practically see that article now. Hedonidol, Euphoril, Inebrium, Felicitine, Empathan, Ecstasine, Halcyonal and a whole spate of derivatives! Though by replacing an amino group with a hydroxyl you obtained instead Furiol, Antagonil, Rabiditine, Sadistizine, Dementium, Flagellan, Juggernol, and many other polyparanoidal stimulants of the group of so-called phrensobarbs (for these prompted the most vicious behavior, the lashing out at objects animate as well as inanimate, and especially powerful here were the cannibal-cannabinols and manicomimetics)." (Lem 1974)
G. Folkers (*) Collegium Helveticum, ETH Zurich and University of Zurich, Schmelzbergstr. 25, CH-8092 Zürich
5.1 Preface
Design, in its Latin meaning of "drawing something," has been a fundamental part of the essence of chemistry for a very long time. Always guided by pictorial notes, chemistry developed from the early metaphoric paintings of the alchemists into a strictly formal language of "formulae," and now into computer-assisted illustrations of molecular properties and interactions, themselves sometimes metaphorical. Design, as the process of drawing something and designating the outcome with a meaning, is used both as a noun and a verb, often in contrast to randomness. Design requires intentional action, based on knowledge, research, planning and goals. Therefore "designing a drug" encompasses not only the creation of a beneficial bioactive molecule, but also the implementation of the whole process to find and create, test and market a new remedy. There have been remarkable changes in the processes of drug development since the 1950s. Theoretical knowledge about the molecular fundamentals of drug actions in both the healthy and the diseased body permitted a shift from trial-and-error routines in the laboratory to the computational engineering of a drug. Today, an exact virtual modeling of structure and interaction properties precedes the synthesis of a drug. The expression "Drug Design" has also undergone quite a few changes in its use throughout recent decades. The term mutated into "Structure-Based Drug Design," "Ligand Design," "Molecular Design," be it structure-based or not, "Molecular Modeling" and "Rational Drug Design." While all these terms have slightly different meanings, very often depending on the scientific discipline – crystallography, theoretical chemistry, pharmacology, medicine, material sciences – from which they emerged, they are used almost interchangeably in the scientific literature. Amusingly, mirroring "drug design," the expression "designer drug" leads to something akin to the dark side of the world of drug design, as in the tales of Alice (Carroll 1872), where the "mirror milk" was supposed to be poisonous for her cat. "Designer drugs" are created when the molecular structure of governmentally approved drugs is modified. While this experimental technique is often used by pharmaceutical companies to optimize the safety or efficacy of a drug, clandestine laboratories strive for other goals: to enhance the drugs' potential for recreational use and to bypass legal regulations (Christophersen 2000). Without the hindrance of
long and costly clinical trials, such drugs spread easily on the black market, never lacking risk-loving volunteers. Legally, the difference between federally controlled drugs and designer drugs is clearly defined. It is, however, interesting to note that drug developers, while creating either pure and tailored or impure and random substances, sometimes seek very similar drug actions. In the growing field of pharmacological psychotherapy, the border between treating an illness and improving mental well-being is particularly thin. Are anti-depressants "good" drugs because they help people to autonomously manage their daily life, or do they start to be doping for managers who need mood-brighteners for their demanding jobs? Whereas "designer drugs" still somehow have the taste of kitchen chemistry and drug dealing, the more elegant version is "doping by design" and "brain enhancement." Naturally, this convergence of legal and illegal drugs in controversial drug applications is reflected on the Internet. The online library "erowid.org" provides thousands of documents on psychoactive substances, fascinatingly bringing together knowledge from all sorts of academic, medical and "experiential" experts in order to provide access to "reliable, non-judgmental information" (http://www.erowid.org, retrieved 5/2008). This article will give an overview of the basic principles and premises of current drug design and address the convergence of medical and recreational drugs that aim to enhance physical, mental and emotional well-being.
5.2 The Two Worlds
5.2.1 Drug Design: The Bright Side
The first written testimony citing a correlation between molecular worlds and our world of phenomena is given in Lucretius' "De Rerum Natura," where this ancient author speculates about the different viscosities of wine and oil. He explains this behavior by referring to the different size, shape and interaction of the "wine- and oil-atoms" (Lucretius 1957). His conclusion, amazingly, fits our current view of the basically ball-shaped water molecules (which on average make up 87% of wine) and the more stretched-out lipid molecules in oils. About two millennia later, in 1894, the German chemist Emil Fischer proposed a lock and key model
to visualize the interaction between a substrate and a corresponding target structure, e.g., an enzyme, in the body. Having carried out experiments on the specificity of enzymes, he wrote: "To use a metaphor, I would say that enzyme and substrate must fit together like lock and key in order to exert a chemical effect on each other. In any case, this notion becomes more likely and its value for stereo-chemical research increases when the phenomenon itself is transferred from the biological to the chemical realm." (Fischer 1894)
At the time, the propositions that substances of life fit together like lock and key and that biology at the molecular level becomes chemistry were quite unorthodox (Cramer 1995). But they have influenced modern biology ever since. In March 1900, Paul Ehrlich translated Fischer's ideas into medical terminology and, by coining his famous hypothesis "Corpora non agunt nisi fixata," postulated molecular recognition principles and receptor theory as the basis for immune regulation. The fundamental hypotheses of Fischer and Ehrlich, who were each awarded a Nobel prize, gave rise to what we today consider to be drug design based on a "one-target one-disease" world. It was Paul Ehrlich who created the metaphor of what he called "The Magic Bullet" or "Die Zauberkugel": "If we picture an organism as infected by a certain species of bacterium, it will (...) be easy to effect a cure if substances have been discovered which have a specific affinity for these bacteria and act (...) on these alone (...) while they possess no affinity for the normal constituents of the body (...) such substances would then be (...) magic bullets." (Himmelwert 1960)
Currently, the deterministic paradigm of drug design fundamentally comprises the following steps:
- Correlate a disease with a molecular target
- Identify the molecular structure of the target
- Create a, preferably small, molecule that interferes with the target in an inhibitory or agonistic way
- Test the molecule in vitro and then in vivo
5.2.1.1 The Design Process
In pursuit of the Fischer–Ehrlich paradigm, the stereo-electronic complementarity of two interacting molecules became the central theme of drug development.
This is based on the assumption that once the molecular structure of the target, most often a protein, is known, the complementary properties of the interacting drug (ligand) can be tailored. Therefore, the 3D structure of the targeted protein first needs to be determined. The advent of molecular biology, in its co-evolution with computational technology, has given us access to gram amounts of pure protein, which is genetically engineered and produced within micro-organisms. Once the protein has been crystallized, its 3D structure can be elucidated by X-ray crystallography.1 For the computational reconstruction of the protein model, a large collection of heterogeneous biophysical data is needed (Frank and Svetlana 2007). Data from these 3D models are then usually deposited in a freely accessible public database (PDB, http://www.wwpdb.org, retrieved 5/2008). Modern technology can resolve crystallized objects of previously unencountered size at atomic resolution, such as whole viruses, the ribosome or the nucleosome (Davey et al. 2002). Complex structures like the nuclear pore, however, which consists of nearly 500 proteins working collaboratively in the active transport of cargoes through the nuclear membrane, can obviously not be crystallized in one single functional unit. Starting from the data set of the X-ray structure, molecular design procedures on the computer lead to insightful structural models, e.g., showing where and how a ligand can interact with the protein. Collections of molecules, so-called libraries, are presented to the targets in a highly automated (robotic) environment. The collection presented might be a fraction of 50,000 molecules, which share a structural similarity with the ligand designed for complementary interaction with the X-ray structure of the target protein. There might, however, be no correlation and no causal relation between structural similarity and similar bioactivity, for many possible reasons (e.g., entropic contributions, multiple interaction sites, etc.). Hence, with the most advanced methods, the computer will create a set of structural proposals, already based on the assumed complementary interaction of
1 Crystal symmetries provide the necessary coherent spatial orientation of a large set of molecules, whose electron densities deliver a "diffraction pattern" by interaction with a monochromatic X-ray beam. From this pattern the position of carbon atoms can be inferred, and this allows for a precise determination of bond lengths and angles within the nanometer range.
Fig. 5.1 A molecular dynamics model of a section of a cell membrane is displayed with two incorporated rhodopsin receptors (yellow) surrounded by water (light blue), containing overall about 45,000 atoms. The simulation shows that once rhodopsin is liberated from crystal-packing constraints and embedded in a native-like phospholipid bilayer environment, it relaxes to an ensemble of conformations different from the crystal structure. By modeling the dynamics of the molecules, information is obtained about additional internal water molecules and about parts of the protein that are crystallographically unresolved. The absorption of a photon leads to conformational changes in the red molecule (retinal) in between the protein loops, which releases signal transductions inside the cell. (Figure taken from Huber et al. 2004. Reprinted with the permission of the Biophysical Journal)
the ligand with the target protein in the body. Knowledge about small molecule structure has been accumulated in chemistry for more than 200 years and has reached a resolution far beyond the chemical bond. This has been achieved by establishing a rigorous theory involving the interplay of physics with physical and theoretical chemistry. Ever since the fundamental work of Bürgi and Dunitz, structure correlations in chemical reactions have been better understood (Bürgi and Dunitz 1994). Geometrical alterations and conformational adaptation of small molecules to large target proteins are less well known and are an important part of the ongoing research. Indeed, the newest generation of experiments focuses on the dynamics of receptor activation while interacting with a ligand. In Fig. 5.1, the interaction of a ligand with a highly complex target structure, rhodopsin, can be seen. Rhodopsin receptors are located in the retina of the eye and interact with the elementary particles of light, the photons, enabling us to take the very first steps in visual perception. The molecular dynamics model of the protein shows how the interaction of a photon leads to conformational changes of the whole ligand–protein complex. The red molecule shown in the central pocket of the protein loops is a photosensitive region (retinal), which by photon absorption changes
its shape and hence the conformation of the entire protein, resulting in a signaling process inside the cell (see Fig. 5.1). Rhodopsin is one out of almost 1,000 members of the family of G-protein-coupled receptors (GPCR) that can be found in the membranes of all kinds of cells in our body and that translate a plethora of extracellular stimuli into a signal that starts a biochemical response. Drugs that act on this class of receptor proteins represent 60% of all marketed human therapeutics available by prescription and so contribute some 50 billion Swiss Francs to the annual revenues worldwide. Despite extensive efforts, rhodopsin is to date the only member of this pharmacologically very important family that has a known structure at atomic resolution.
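The similarity-based filtering of compound libraries mentioned above can be sketched in a few lines of code. The example below uses the open-source RDKit toolkit, which is our choice for illustration only (the chapter does not prescribe any particular software), and ranks a handful of arbitrary toy molecules, given as SMILES strings, by fingerprint similarity to a reference ligand.

```python
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def rank_by_similarity(reference_smiles, library_smiles, threshold=0.3):
    """Rank library compounds by Tanimoto similarity of Morgan fingerprints
    to a reference ligand; return (similarity, smiles) pairs above threshold."""
    ref = AllChem.GetMorganFingerprintAsBitVect(
        Chem.MolFromSmiles(reference_smiles), 2, nBits=2048)
    hits = []
    for smi in library_smiles:
        mol = Chem.MolFromSmiles(smi)
        if mol is None:          # skip unparsable entries
            continue
        fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
        sim = DataStructs.TanimotoSimilarity(ref, fp)
        if sim >= threshold:
            hits.append((sim, smi))
    return sorted(hits, reverse=True)

# Toy example: caffeine as reference against a tiny "library".
library = ["CC(=O)Oc1ccccc1C(=O)O",          # aspirin
           "Cn1cnc2c1c(=O)n(C)c(=O)n2C",     # caffeine itself
           "CCO"]                            # ethanol
print(rank_by_similarity("Cn1cnc2c1c(=O)n(C)c(=O)n2C", library))
```

Such a two-dimensional fingerprint comparison is, of course, only a crude stand-in for the full stereo-electronic complementarity discussed in the text, which is precisely why structural similarity need not imply similar bioactivity.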
5.2.1.2 Scope and Limitations of Drug Design
A ligand is not a drug. Computer Assisted Drug Design (CADD) and related approaches are ubiquitous in life science research; however, we must recognize that these computational methods lag somewhat behind the reality of drug development. All empiric and computational knowledge seems to stem from a concept of hypothesized one-to-one interaction of ligand
and target. This concept has been automated on a huge scale, able to generate hundreds of thousands of binding experiments between ligand collections and target collections in a single day. However, while the description of an interaction process between a ligand and its protein target may be tremendously precise in terms of physics, the necessary reduction of complexity – omitting the natural environment of the target – raises the question of whether the designed artifact is relevant to the living cell. Hopeful ligand candidates have frequently turned out to be inactive, toxic or just completely insoluble in water. In many laboratories, these facts have led to a much closer integration of molecular design, synthetic chemistry, biology and toxicology in order to gain advantage from the intuition of the other fields. Still, we have unfortunately not been able to establish a valid model of protein–protein interaction from which to formulate a concept of how to replace the peptides and proteins that transmit information in our body with drugs consisting of "nice" small and stable molecules. Nobody has been able to predict the most exciting new drugs; instead they have been found by "serendipity."2 The one-ligand-one-target concept also leaves out another possible outcome once the drug is applied to the living body: a ligand can interact with many different targets. For example, the receptor-tyrosine kinase (TK) inhibitor Sunitinib®, a drug intended to treat renal cell cancer, was developed as a classic one-ligand-one-target compound, but turned out to be active against some hundred TK targets in its water-soluble form. In fact, the multimodal action of a drug may be the rule rather than the exception: "Since specific cell-cell contact probably requires not just one site of interaction, but several, if not many lock-key contacts (lectin–sugar interactions), the geometry of this process must be very complex indeed." (Cramer 1995, pp. 193–203)
Many drugs believed earlier to be specific for certain receptors have been found to be “promiscuous.”
2 The phenomenon by which one accidentally discovers something fortunate, especially while looking for something completely different, is often called "serendipity." Penicillin, for example, was discovered in 1928 by serendipity. More about the term serendipity can be read here: Merton RK & Barber E (2004) The Travels and Adventures of Serendipity: A Study in Sociological Semantics and the Sociology of Science. Princeton University Press, Princeton.
The Sunitinib case exemplifies that if the target of a drug is a receptor within a cascade of biochemical information transfer, it might be useful to interact with more than one receptor within or beside this cascade in order to enhance the drug's activity or even to make it work at all. Small-molecule, multi-targeted drugs of "rich pharmacology" may become the trend for treating complex diseases like cancer. Everyone is now attempting to achieve a "Swiss Army knife" approach to molecular design: the development of multiple acting compounds directed at one target site or at several target sites, dreaming of shutting down whole cascades of biochemical information transduction (Wermuth 2004). However, the assumption that the target proteins within such a biochemical pathway are present at the same time, in the same place and at the same concentration within a given cell is problematic. Systems biology studies reveal that the molecular functions of the genes and their products are coupled in physical-chemical networks (Takahashi et al. 2005; Guffanti 2002). Therefore, it may be that the inhibition of a certain target favors or suppresses the expression or transport of a second target. Hence, it seems useless to start a design process on a multi-targeted bioactive compound, which from the perspective of molecular design is certainly feasible, while leaving out a model of the distribution of the targets in time and space. It seems more promising to start from a drug that is well known (from its side effects) to have multiple targets and to single out the desired one by tailoring its molecular properties. Again, however, there are not many successful examples. What has been achieved recently for a dual-acting cytokine regulator (Hanano et al. 2000) has turned out to be completely impossible for the separation of, e.g., morphine activities and others.
5.2.2 Designer Drugs: The Dark Side
Entering the other side of drug development, which means using drug design intentionally for drugs of abuse, is reminiscent of the situation Alice was in when entering the world behind the mirror: "How would you like to live in a Looking-glass House, Kitty? I wonder if they'd give you milk, there? Perhaps Looking-glass milk isn't good to drink ..." (Carroll 1872)
In fact, if mirror inversion is applied to the constituents of the cat's milk, it will become poisonous for the cat. If the cat undergoes complete molecular inversion too, it might well be possible that in this mirror-imaged world everything will be the same, and the milk will be healthy. The metaphoric use of the quotation in this context is to stress the fact that there are no clear good and bad distinctions in drug design. Designer drugs "are universally understood to belong to a group of clandestinely produced drugs which are structurally and pharmacologically very similar to a controlled substance (Langston and Rosner 1986) but are not themselves 'controlled substances.'"3 However, the very same synthetic compounds might be one step in the development of a new therapeutic in pharmaceutical research. Hence, it is less the synthetic drug itself than its clandestine development, production and commercial use that constitutes the step into illegality. The whole scene of making and distributing "designer drugs" has become professionalized in the last two or three decades. Driven by e-commerce, a huge black or at least grey market appeared for ingredients and equipment, which facilitated the making of drugs at home or in hidden laboratories. Under permanent pressure from the US Drug Enforcement Administration, web pages offering such services are masterpieces of camouflage. Somewhere in between herb extracts and Asian medicine, all kinds of indole, phenylethylamine and tryptamine derivatives can be found, including the solvents and chemicals needed to modify them. Tryptamine, for instance, is the backbone of a huge class of bioactive compounds, which includes neurotransmitters as well as the toxic and psychoactive ingredients of mushrooms. It is therefore not so much chemistry and pharmaceutics that darken the mirror world of drug design, but the fact of irresponsible
3 In October 1987, the United States Government amended the Controlled Substance Act (CsA) in an effort to curtail the illicit introduction of new CsAs. This amendment states that any new drug that is substantially similar to a controlled substance currently listed under the Code of Federal Regulations (CFR), Schedule I or II, and has either pharmacological properties similar to a Schedule I or II substance or is represented as having those properties, shall also be considered a controlled substance and will be placed in Schedule I. The amendment further contains provisions that exempt the legitimate researcher as well as compounds that are already being legally marketed from the provisions of the amendment.
and life-threatening application to an "uninformed consenting"4 user. Designer drugs are not accompanied by a patient information leaflet. They have not been developed with respect to a disease model in animals, ending with costly and tedious studies in healthy volunteers and in patients until, eventually, the market release is finally granted by an official authority after 10–14 years. Designer drugs are quickly developed and highly risky due to low-tech production – potentially highly impure drugs are sold illegally to a user who has neither any knowledge about their quality nor any right of redress. In many European cities, this has led to public health services offering free quality control checks for illegally obtained drugs, especially for the young party scenes, in order to prevent poisoning (or worse) as a result of badly produced batches of amphetamines or related compounds. It is interesting to note that many designer drugs stem from legal drug research programs and vice versa. Legal drugs can be abused by exploiting the side effects produced by overdosing, different administration routes, etc. Heroin is one of the first and surely most prominent examples. Following the idea of improving the properties of natural compounds like salicylic acid and morphine, researchers at Bayer AG modified these compounds by introducing small acetyl residues. Though the modification is small, it is highly effective for both molecules, but for different reasons. While the acetyl moiety in Aspirin is the active ingredient, the heroin molecule is rendered more lipophilic, which makes it target the brain a hundred times faster than morphine. The goal was to create an antitussive. According to contemporary advertisements, it worked fine until finally the tragic "side effects" took over, which is the main business of today. Clearly, modern doping design also falls into the realm of designer drugs. Creating drugs for doping means pharmacologically achieving an action in the human body that, for instance, provides an athlete with much higher endurance while at the same time keeping the drug invisible to investigators. One strategy is to speed up natural biochemical processes in our body by increasing native precursors or hormones that control the body's energy balance. Erythropoietin
4 Informed consent is a legal condition whereby a person can be said to have given consent based upon an appreciation and understanding of the facts and implications of an action.
(EPO) is the body's chemical signal for the production of red blood cells, and a higher than normal concentration will provide lots of oxygen for the muscles. As a native glycopeptide, its surface is covered with sugar antennas, which are its fingerprint and are involved in the recognition of its targets. If EPO is produced recombinantly in micro-organisms, the sugars vary from the native ones. The investigators from sports associations can detect these differences. So researchers who try to fabricate EPO homologous to the human form by technologically adapting the micro-organisms in a proper way serve both kidney patients and crooked athletes.
5.3 Where the Two Worlds Meet
The German language distinguishes between "Medikament" and "Droge," but in English there is a certain ambiguity in the single word "drug." Unlike the German terms, the English term does not eo ipso indicate good or bad.5 A drug can be used for the treatment, cure, prevention or diagnosis of a disease; it can also, however, be used to broaden perception or to enhance physical or mental well-being. For a long time pharmacologists have not only been searching for medicine for the body, but also for medicine for the soul. The proclamation of the twenty-first century as the century of neuroscience has set the tone. The cognitive and emotional conditions of an individual are not predetermined; they can be modulated and optimized towards well-being. It seems that body and mind are becoming more and more part of our self-design and self-realization. We want to decide not only how we look, but also how we feel.6 New insights in neuroscience, such as the plasticity of the brain,7 lead to growing acceptance of pharmacological neuro-modulators, as the brain is considered to be lastingly tunable. The neuronal circuits of the brain can be "rewired" up to an old age
5 The moral discourse and jurisdiction of the nineteenth and twentieth century started to connote "Medikament" positively and "Droge" negatively.
6 Advertisement for the "inner peace" tea of Teekanne: "Decide yourself how you feel."
7 The adult brain is not "hard-wired" with fixed and immutable neuronal circuits. There are many instances of cortical and subcortical rewiring of neuronal circuits in response to training as well as in response to injury. (Neuroplasticity|Wikipedia)
(Rakic 2002). But how fine is the line between the design of new cognitive and emotional brain enhancers and dreaded designer drugs?
5.3.1 Paradise Engineering
Opium is by far one of the most powerful analgesics and has been used against pain and also as a recreational drug for millennia. The first known written reference to the opium poppy appears in a Sumerian text dated around 4000 BC. Not surprisingly, the flower was known as "hul gil," the plant of joy. The Greek physician Galen commended its use against all sorts of illnesses: headaches, epilepsy, coughs, colic, fevers, women's problems – and melancholy. Opium was the main ingredient of the famous concoction theriac and of Paracelsus's universal remedy, laudanum. In 1805, the era of the single substance morphine8 started when an apothecary's assistant, Friedrich Wilhelm Sertürner, isolated the white crystalline substance from opium in a series of experiments performed in his spare time. Much later, in the 1970s, several targets of morphine, the opioid receptors, were found and characterized in animal and human cells (McNally and Akil 2002). These key findings opened the field for "rational drug design" of morphine-like drugs, so-called opioids. Opioid receptors control a wide range of biological functions, ranging from pain to the hedonic impact ("liking") of food, and from anxiety to the feeling of drug reward (Pecina and Berridge 2005). The design of new specific ligands may therefore not only generate non-addictive analgesics, but also mood-brighteners and food regulators. Depressive people often suffer from a dysfunctional opioid system and anhedonia – an incapacity to experience pleasure (Leknes and Tracey 2008). Customized drugs enhancing the opioid system are promising for psychotherapy and are being tested in mice (Todtenkopf et al. 2004). Durable emotional super health through substances such as Hedonidol and Euphoril, as originally described by Stanislaw Lem, is the vision of some people. The "hedonistic imperative" has been claimed, taking analgesics and
8 Sertürner called the isolated substance morphine after Morpheus, one of the Greek gods of dreams.
surgical anesthetics a giant step further. Some contemporaries even propose paradise engineering: "States of sublime well-being are destined to become the genetically preprogrammed norm of mental health." (http://hedonistic-imperative.com, retrieved 5/2008)
LSD (lysergic acid diethylamide) and MDMA (3,4-methylenedioxy-methamphetamine, "ecstasy") are today known as psychoactive drugs of abuse and evoke associations of subcultural behavioral norms. Nevertheless, their therapeutic use is being re-evaluated. In 2007 and 2008, two clinical studies testing LSD and MDMA in patients were approved by Swissmedic, the Swiss agency for therapeutic products. It is expected that MDMA will help patients suffering from post-traumatic stress disorder face up to the very difficult feelings associated with trauma, such as anxiety (http://clinicaltrials.gov/ct2/show/NCT00353938, retrieved 2/2010). 3,4-Methylenedioxy-methamphetamine was first created in 1912 by Merck as a by-product while attempting to synthesize a drug to stop abnormal bleeding, but shortly after it was patented it was forgotten. Its potential to promote honest self-disclosure was tested in clandestine US military research in the 1950s (Jerrard 1990). Its therapeutic value was then popularized by the controversial American chemist Alexander Shulgin in the late 1970s (Shulgin and Shulgin 1997). The drug is said to induce feelings of empathy ("empathogen"), generate a sense of "touching within," a clarity of introspective self-insight ("entactogen") and to catalyze intense spiritual experiences ("entheogen"). It is interesting to note that these effects are probably elicited by an increase of extra-cellular levels of the neurotransmitters serotonin and dopamine. The therapeutic benefits of LSD are also going to be tested for the first time in 35 years in a study of patients suffering from advanced-stage cancer and other terminal illnesses (http://clinicaltrials.gov/ct2/show/NCT00920387, retrieved 2/2010). LSD is one of the most potent hallucinogenic substances. It intensifies and alters the senses, perceptions and moods. Accidentally found by Albert Hofmann in 1938, it quickly became a well-respected tool in psychotherapy (Hofmann 1980). Sandoz stopped the production of the tablets in 1966 after extensive abuse by the hippie movement had led to the banning of the drug (LSD 2006). When asked, at the age of
102 years, about the possible rehabilitation of his "problem child," Albert Hofmann, who died recently, said: "My wish has come true. I didn't think I'd live to find out that LSD had finally taken its place in medicine." (TV SF1 2007)
5.3.2 Mind Design
There were times when it was believed that the abilities and afflictions of the body and the soul were given by God. In the second half of the twentieth century, genes took the place of this all-dominant entity. Nowadays, even the genetic predisposition of an individual can be overcome. For financially sound individuals, esthetic "problem areas" can be optimized, and infertility can often be treated. The enhancement of the body does not stop at the level of the brain. Of course, neurocognitive enhancement already is and will remain an issue, and rational drug design will continue to be on the leading edge of neurotechnology (Barondes 2003). There is relatively little controversy surrounding the use of psychopharmaceuticals, i.e., drugs that target the specific molecular events underlying cognition and emotion, to treat neurological and psychiatric illnesses (Fig. 5.2 illustrates a model of the drug–protein interaction of the antidepressant desipramine). Memory, for example, can be both bliss and torture. While memory enhancement is of interest for older adults and patients with Alzheimer's disease, the erasing of memory is also a therapeutic challenge. The pharmaceutical industry also uses rational drug design to develop drugs that block memories or prevent their consolidation (Hall 2003). The aim is clearly to save trauma victims from developing a post-traumatic stress disorder (PTSD). But what about the enhancement of the psychological functions of healthy individuals? Several brain functions are potential targets for pharmacological enhancement: memory, executive function, mood, libido and sleep (Farah 2003). Most people enhance their normal neurocognitive function every day. Caffeine is a very popular and legal form of mind doping. It stimulates the nervous system by acting on several receptors and channels on the cell surface (Nehlig et al. 1992). Recently, the drugs modafinil and methylphenidate (Ritalin), which are
Fig. 5.2 Depression is one of the most common psychiatric disorders and is, inter alia, related to perturbation of the system of the neurotransmitter serotonin. In order to increase the neurotransmitter concentration in the synapse and hence enhance the signal transduction, the neurotransmitter reuptake process is blocked. One class of drugs inhibiting reuptake transporter proteins is the tricyclic antidepressants (TCAs), such as desipramine. This screenshot from the drug design software LigandScout displays a desipramine molecule lying within the narrow binding pocket (wire frame) of an amino acid transporter protein. Desipramine is stabilized in the pocket by means of hydrophobic interaction, illustrated by the spheres, as well as by ionic forces (green arrows) between the positively charged nitrogen (blue) and water (red asterisks)
prescribed for narcolepsy9 and attention deficit-hyperactivity disorder (ADHD),10 respectively, have drawn public attention. Both drugs are increasingly being abused as study aids by healthy students and others. On some college campuses in the US, up to 16% of the students have used methylphenidate (Babcock 2000), and it appears that this has also become a common practice among Swiss students (Lüthi 2007). Experimental studies have shown that drugs targeting the dopamine and noradrenaline neurotransmitter systems, such as modafinil and methylphenidate, not only improve deficient executive function, but are also able to enhance normal mental performance (Turner et al. 2003; Mehta et al. 2000). Neurocognitive enhancement raises ethical questions, and the field of neuroethics has become very important (Farah 2004). Its sizeable impact on society calls for new social policies. Employers and educators might have to face new challenges in the
management and evaluation of people. Moreover, there will surely be cost barriers confining neurocognitive enhancement to certain segments of the population. Still, the development of pharmacological brain enhancers, so-called “nootropics”11 or simply “smart drugs,” can definitely have great benefits for health and quality of life. Martha J. Farah, together with other leading neuroscientists, has put it this way: “Humanity’s ability to alter its own brain function might well shape history as powerfully as the development of metallurgy in the Iron Age, mechanization in the Industrial Revolution or genetics in the second half of the twentieth century.” (Farah 2004)
5.3.3 Fitter, Happier, More Productive12
The advent of newly designed and versatile psychiatric drugs has revolutionized both the treatment and the
9 Narcolepsy is a disorder characterized by sudden and uncontrollable, though often brief, attacks of deep sleep, sometimes accompanied by paralysis and hallucinations.
10 ADHD is a neurologic disorder that manifests itself as excessive movement, irritability, immaturity, and an inability to concentrate or control impulses.
11 The word nootropic, coined in 1964, derives from the Greek words noos (mind) and tropein (to bend/turn).
12 "Fitter Happier" is a song by Radiohead (1997). Lyrics: "(...) calm, fitter, healthier and more productive, a pig in a cage on antibiotics."
understanding of mental illnesses. Government-approved drugs and designer drugs have closed ranks in targeting emotional and cognitive well-being. Still, they leave us with a certain unease. Tailored neuromodulators interfere with an unmanageably complex system, the human brain. Therefore, Ehrlich's magic bullet is not feasible, at least not in the brain. The innumerable neural interconnections and feedback loops mean that novel, unanticipated side effects cannot be ruled out. Since we hardly understand the underlying mechanisms and entanglement of psychological functions, it is difficult to foresee long-term effects. Does enhanced memory simultaneously enhance learning processes, or is some information, for example, the emotional value, left behind? Obviously, the usefulness of psychiatric drugs in alleviating the mental sorrow of individual patients and caregivers cannot be denied. However, naturally occurring but nonetheless distressing emotions arising from difficulties with oneself, one's partner, family or job also seem to become more and more manageable by the intake of drugs. Nowadays, the delegation – or in newspeak, outsourcing – of responsibility, and therefore of the emotional state, is as rampant for the single individual as it is in the corporate world. It is as if the exaggerated slogan now is: "Don't worry, buy your way out!" This trend is further driven by the omnipresent pathologizing of the human condition, in which a minor melancholy is considered to be a herald of an approaching depression (Horwitz and Wakefield 2007). The debate on the enhancement of human mental and physical abilities spans a wide spectrum. As a speaker at the World Psychedelic Forum, held in March 2008 in Basel, said: "In the near future, substance use might be common in everyday life in order to increase alertness and working efficiency or just to feel 'better than well'. Enthusiastically embraced as a step towards trans-humanism (Bostrom 2005) by some, others fear the loss of authenticity, growing inequalities or political quietism in a world of mind-doping and synthetic happiness." (Bublitz)
And as Stanislaw Lem pointed out, in drug design, utopia and dystopia are indeed incredibly close to each other. “Molecular hedonism” is considered to be one of the possible outcomes of this development, a “brave new world” supplying us with our daily cavalier cocktail of happiness. But can the feelings evoked by external substances, drugs, be really compared to the bliss of listening to music, a jump into cold water on a
sunny day, a Bistecca Fiorentina or a kiss from one’s dearest?
5.4 Conclusion
Since the second half of the twentieth century, we have witnessed an enormous growth of biological knowledge, which has led to a shift in drug discovery from trial and error towards the rational design of drugs. The ongoing decoding of the molecular fundamentals of the healthy and diseased body fuels great expectations concerning the realization of Ehrlich’s vision of the magic bullet. Once the structure of a malfunctioning target is known, it seems practicable to design the molecular structure of the counterpart drug. However, quite a few of the limitations have been largely omitted here, and much of the scope of the molecular design process depends on a rigorous model of the human being. In omitting too many of the context variables, a model will quickly degenerate into an irrelevant image of human reality. Dissection of complex cellular or tissue biochemistry into separate, non-feedback-looped in vitro reactions will underestimate the influence of multiple interactions. Hence, more complex experimental set-ups are needed in combination with computational modeling, as well as a shift away from the linear conception of biochemical reality. These are the promises of systems biology. So, in the coming decades, making a drug for humans, be it for the good or the bad side of the coin, will basically remain a process of trial and error and not so much a question of design. Besides that, the huge body of regulations that must be followed within current health systems may restrict legalized drug development, while the huge economic incentives of the grey market will only encourage dangerous developments on the dark side. Still, much is feasible, especially in the realm of mind-altering medicine. The question of the risk, relevance and, in particular, desirability of such drugs is becoming ever more important, since the boundaries between therapy, enhancement and intoxication are continuously blurring.
Acknowledgments The authors would like to thank Rainer Egloff, Priska Gisler, Vladimir Pliska, Beatrix Rubin and Amrei Wittwer for their careful review of the manuscript.
References
Babcock Q (2000) Student perceptions of methylphenidate abuse at a public liberal arts college. J Am Coll Health 49:143–145
Barondes SH (2003) Better than Prozac: creating the next generation of psychiatric drugs. Oxford University Press, USA
Bostrom N (2005) A history of transhumanist thought. J Evol Tech 14:1
Bublitz J Ch. http://www.psychedelic.info/ (retrieved 5/2008)
Bürgi HB, Dunitz JD (1994) Structure correlation. VCH, Weinheim
Carroll L (1872) Through the looking-glass, and what Alice found there. MacMillan, London
Christophersen AS (2000) Amphetamine designer drugs – an overview and epidemiology. Toxicol Lett 112–113:127–131
Cramer F (1995) Biochemical correctness: Emil Fischer’s lock and key hypothesis, a hundred years after – an essay. Pharmaceutica Acta Helvetiae 4:193–203
Davey CA, Sargent DF et al (2002) Solvent mediated interactions in the structure of the nucleosome core particle at 1.9 Å resolution. J Mol Biol 319:1097–1113
Farah MJ (2003) Emerging ethical issues in neuroscience. Nat Neurosci 5:1123–1129
Farah MJ (2004) Neurocognitive enhancement: what can we do and what should we do. Nat Rev Neurosci 5:421–425
Fischer E (1894) Einfluss der Configuration auf die Wirkung der Enzyme. Ber Dtsch Chem Ges 27:2985–2993
Frank A, Svetlana D (2007) Determining the architectures of macromolecular assemblies. Nature 450:683–694
Guffanti A (2002) Modeling molecular networks: a systems biology approach to gene function. Genome Biol 3:reports 4031
Hall SS (2003) The quest for a smart pill. Sci Am 289:54–65
Hanano T, Adachi K, Aoki Y, Morimoto H, Naka Y et al (2000) Novel phenylpiperazine derivatives as dual cytokine regulators with TNF-alpha suppressing and IL-10 augmenting activity. Bioorg Med Chem Lett 10:875–879
Himmelwert F (1960) The collected papers of Paul Ehrlich. Pergamon, London
Hofmann A (1980) LSD – my problem child. McGraw-Hill, New York
Horwitz AV, Wakefield JC (2007) Loss of sadness: how psychiatry transformed normal sorrow into depressive disorder. Oxford University Press, New York
Huber T, Botelho AV, Beyer K, Brown MF (2004) Membrane model for the G-protein-coupled receptor rhodopsin: hydrophobic interface and dynamical structure. Biophys J 86:2078–2100
Jerrard DA (1990) “Designer drugs” – a current perspective. J Emerg Med 8:733–741
Langston JW, Rosner DJ (1986) The hazards and consequences of the designer drug phenomenon: an initial approach to the problem. In: Church AC, Sapienza FL (eds) Proceedings of the Controlled Substance Analog Leadership Conference
Leknes S, Tracey I (2008) A common neurobiology for pain and pleasure. Nat Rev Neurosci 9:314–320
Lem S (1974) The futurological congress. Harvest Book, Harcourt Inc, Orlando
LSD: cultural revolution and medical advances. Royal Society of Chemistry. http://www.rsc.org/chemistryworld/Issues/2006/January/LSD.asp (retrieved 5/2008)
Lucretius CT (1957) On the nature of things (De Rerum Natura). Translated into English verse by Leonard WE. The Heritage Club, New York
Lüthi T (2007) Schneller, effizienter, besser. NZZ am Sonntag (30.12.2007)
McNally GP, Akil H (2002) Opioid peptides and their receptors. In: Davis K, Charney D, Coyle JT, Nemeroff C (eds) Neuropsychopharmacology: fifth generation of progress. Lippincott, Williams & Wilkins, New York
Mehta MA, Owen AM, Sahakian BJ, Mavaddat N, Pickard JD et al (2000) Methylphenidate enhances working memory by modulating discrete frontal and parietal lobe regions in the human brain. J Neurosci 20(6):RC65
Nehlig A, Daval JL, Debry G (1992) Caffeine and the central nervous system: mechanisms of action, biochemical, metabolic and psychostimulant effects. Brain Res Brain Res Rev 17:139–170
Pecina S, Berridge KC (2005) Hedonic hot spot in nucleus accumbens shell: where do mu-opioids cause increased hedonic impact of sweetness. J Neurosci 25:11777–11778
Rakic P (2002) Neurogenesis in adult primate neocortex: an evaluation of the evidence. Nat Rev Neurosci 3:65–71
Shulgin A, Shulgin A (1997) LSD. In: Shulgin A, Shulgin A (eds) TiHKAL. Transform Press, Berkeley
Takahashi K, Arjunan SNV, Tomita M (2005) Space in systems biology of signaling pathways – towards intracellular molecular crowding in silico. FEBS Lett 579:1783–1788
Todtenkopf MS, Marcus JF, Portoghese PS, Carlezon WA (2004) Effects of kappa-opioid receptor ligands on intracranial self-stimulation in rats. Psychopharmacology 172:463–470
Turner DC, Robbins TW, Clark L, Aron AR, Dowson J et al (2003) Cognitive enhancing effects of modafinil in healthy volunteers. Psychopharmacology 165:260–269
TV SF1, 10 vor 10, LSD feiert Comeback als Medikament, 19.12.2007
Wermuth CG (2004) Multitargeted drugs: the end of the “one-target-one-disease” philosophy? Drug Discov Today 9:826–827
6 Making Matters: Materials, Shape and Function
Paolo Ermanni
Abstract Material, shape and function are features describing both natural and man-made structures; they are intimately related to one another. Any kind of structure obeys the same laws of physics and is constructed to be light and efficient, minimizing material and energy utilization over its entire lifetime. In biological systems, the growth process is driven by the environment and takes advantage of a variety of amazing features that are typical of living systems. Technical structures are the result of a design process: engineers move within a design space that is spanned by all the attributes involved in the design and converge to viable solutions by determining appropriate values for all those attributes. Even though man-made systems are often inspired by nature, their design and performance are limited by the available materials and technologies. In this context, the design of next-generation products will take advantage of novel ceramic, polymer and composite materials with their capability of tailoring and adapting mechanical and physical properties. The ability of modern Computer-Aided Engineering (CAE) tools to simulate and predict the physical behavior of technical systems has improved dramatically in the past decades. CAE tools in conjunction with evolutionary algorithms, which conceptually mimic the natural evolutionary process by implementing the Darwinian principle of survival of the fittest, provide powerful means to cope with the enlarged design space and the complexity of the design process.
6.1 Introduction
The meaning of structure is manifold. The main tasks of a structure are to carry forces and to hold and protect some kind of system. Structure also defines the form and the manner of building or construction. Structures are everywhere: They play a central role in the development of technical products and systems, even though they are not the purpose of the development itself.
P. Ermanni ETH Zurich, Institute of Mechanical Systems, Zürich, Switzerland
Plants, leaves, beans, seeds, skeletal structures, animal shells and so on are typical examples of structures developed by nature (Fig. 6.1). They are constructed to be light and efficient, thus minimizing material and energy waste over their entire lifetime. Ultimately, man-made and biological systems obey the same laws of physics; it is therefore not surprising that this commonality results in structures with similar arrangements and shapes. However, there are still fundamental differences between man-made and biological constructions: The range of materials and physical/functional properties available in nature is enormous (Vincent 2005).
Fig. 6.1 Examples of biological structures. From left to right: Victoria amazonica leaf (www.biologie.uni-osnabrueck.de), Tridacna maxima shell, trabecular structure of a bone (www.biomechanics.ethz.ch)
Biological systems are not the result of a design process. The growth process is driven by the environment. In plants and bones, for instance, it takes place as a response to external mechanical stimuli by adapting morphology (shape, material properties) in order to avoid load concentrations, thus targeting a smooth stress distribution within the whole structure (Mattheck 1993). Technical structures are usually designed to be stiff. Plants and animals possess controlled compliances, which increase system functionality and ensure integrity under external forces in an extremely efficient manner (Ennos 2005). Biological systems are intelligent. They are configured with sensors (nerves) and embedded actuators (muscles) interacting with their intrinsic compliance in order to fulfill diverse functionalities in the presence of varying operational requirements and environmental conditions. Plants and animals are living systems. Thus, they are continuously adjusting morphological properties to optimally cope with changing environmental conditions, both over very short time periods and over thousands of years. All these features provide the basis for a huge variety of adaptation mechanisms that have been developed by biological systems to respond to evolutionary pressure. The design space available to engineers is much more modest. Technical solutions are often inspired by biological systems, but are ultimately limited in their performance and efficiency by the available materials and technologies. Neither the form nor the material properties of technical systems are going to change significantly during service life in order to better fulfill a given requirement. Certain mechanical systems are in fact capable of a limited degree of adaptation. Nevertheless, unlike biological systems, they are composed of groups of mechanisms – i.e., rigid mechanical elements connected to each
other using hinges – that transfer force and energy, and convert different types of motion. Flight is one of the most demanding adaptations (Kulfan 2009). Over hundreds of millions of years, insects and birds have developed a variety of different features and skills, aiming to reduce weight and improve performance (flying altitude, speed and range) and maneuverability. Obviously, animal flight has been inspiring people for thousands of years and has helped them to develop and understand the way flying is accomplished in natural systems. Early attempts to fly, emulating wing-flapping of birds, were nevertheless unsuccessful. Powered flight was achieved by realizing viable aircraft configurations considering the technical limits existing at that time and was ultimately enabled by the concurrent development of knowledge and enabling technologies in all sub-systems related to the flight system, namely aerodynamic systems, stability and control systems, propulsion, structural systems, mechanical systems and flight control systems (Kulfan 2009).
6.2 Materials
From time immemorial, humankind has made use of tools to support the daily activities needed for surviving. To build weapons, clothes and pottery, people used materials directly available in nature, such as wood, stones, bones, animal skins, horn, straw and mud. They learned very soon how to combine shapes and materials in order to achieve the desired function. Material shaping was rudimentary; thus, single elements were often connected to each other using different methods. Experience has always been the major driving force for improvement. Through trial and error procedures, humans continuously developed their technologies, skills and know-how, steadily achieving greater efficiency and performance.
Fig. 6.2 Evolution of engineering materials (Ashby 1992): the relative importance of metals, polymers, composites and ceramics from 10,000 BC to 2020
However, quantum jumps were ultimately always related to the discovery of new materials. Metals have been the dominating engineering materials for many decades (Fig. 6.2). Novel metal alloys based on steel, aluminum or titanium are characterized by a huge flexibility of use; they possess good mechanical properties and moderate densities, thus being suitable for lightweight structural applications. Improvements of the physical properties are limited. Thus, the potential for further enhancement of cost and weight efficiency primarily relies on improved design concepts and process technologies, as well as multi-material approaches. Consequently, during the second half of the past century, new synthetic materials such as ceramics, polymers and composites entered the market, progressively reducing the dominance of traditional metals. In the following, we focus our attention on composite and smart materials because of their significance and potential for present and future structural systems.
6.2.1 Composite Materials
Composites are multi-phase material systems. They result from the combination on a macroscopic level of two or more constituent materials with different properties to form a totally new material (Table 6.1).
The overall characteristics of the composite are different from those achievable from any of the individual components. Composites not only exhibit properties that are better than those of the constituent materials, but they also possess features that are not available in the constituents. Many structural materials designed by nature for plants and animals, such as wood, are composite materials. Thus, unknowingly, humankind has used composite materials for thousands of years. Around 5,000 BC, the original inhabitants of the Swiss midlands used axes made of wood. Amazingly, the strength had been improved by orienting the wood fibers along the load path (Fig. 6.3). Composites were also used by the ancient Israelites, who used straw to reinforce mud bricks and avoid cracks during the drying process. The Egyptians introduced plywood to improve the properties and the environmental stability of their constructions. Most of the composites used for technical applications are based on fiber-reinforced polymer materials. In the case of advanced polymer composites, continuously aligned high-strength fibers are combined with thermoplastic or thermosetting matrix systems. The typical fiber volume content ranges between 40 and 60%. Composites show outstanding specific mechanical properties (Fig. 6.4). They have proven their performance and reliability in a huge number of successful
Table 6.1 Fiber-reinforced polymer materials result from the combination of two or more constituent materials
Constituent materials. Matrix: thermosets, thermoplastics, metals, ceramics. Fiber: glass fibers, carbon fibers, polymer fibers, ceramic fibers
Fiber architecture: short random, short aligned, continuous unaligned, continuous aligned
applications in aerospace, automotive, naval, biomedical and general mechanical engineering sectors. Conventional engineering materials demonstrate homogeneous and isotropic material properties. Composites span a huge design space and in some way copy features of biological materials: they are both heterogeneous and anisotropic. By varying the orientation of the fibers, the physical properties of the material can be adapted locally in order to achieve tailored design solutions for specific applications. Design complexity is further increased by the strong interrelationship between materials, design and manufacturing aspects. The manufacturing route restricts the selection of the polymer matrix and in many cases the fiber orientation. In addition, the final properties of the composite material are defined concurrently with the fabrication of the part and are heavily influenced by the quality of the impregnation and forming process as well as by the chemical and/or physical transformations occurring during the curing or consolidation phase.
Fig. 6.3 Prehistoric axe made of wood. Wood fibers are oriented along the load path (source: Swiss National Museum, Zurich)
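The local tailoring just described can be made concrete with a standard micromechanics estimate. The following sketch is not taken from this chapter; it applies the widely used rule of mixtures to a unidirectional carbon/epoxy ply, with assumed fiber and matrix moduli and a fiber volume content in the 40–60% range mentioned above.

# Rule-of-mixtures estimate for a unidirectional ply (illustrative values only).
def ply_moduli(e_fiber, e_matrix, v_fiber):
    """Return (E1, E2): stiffness along and across the fibers, same unit as the inputs."""
    v_matrix = 1.0 - v_fiber
    e1 = v_fiber * e_fiber + v_matrix * e_matrix          # parallel (Voigt) estimate
    e2 = 1.0 / (v_fiber / e_fiber + v_matrix / e_matrix)  # series (Reuss) estimate
    return e1, e2

# Assumed properties: carbon fiber ~230 GPa, epoxy matrix ~3.5 GPa, 60% fiber volume.
e1, e2 = ply_moduli(e_fiber=230.0, e_matrix=3.5, v_fiber=0.60)
print(f"E1 = {e1:.0f} GPa along the fibers, E2 = {e2:.1f} GPa across the fibers")

The resulting anisotropy, roughly 140 GPa along the fibers against less than 10 GPa across them, is what lets the designer place stiffness only where the load path requires it.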
6.2.2 Intelligent Materials
The common feature of all adaptive materials is the capability to respond to an external stimulus by modifying specific physical properties in a predictable and reproducible manner (Fig. 6.5). A new generation of functional material systems has become available in the last decades for applications in technical structures. They are all characterized by the possibility to convert an external input into mechanical work. Relevant intelligent materials include piezoelectric materials, shape memory alloys and dielectric elastomers: Piezoelectric materials show a charge separation when strained, and strain when an electric voltage is applied (Jaffe et al. 1971). Shape memory alloys (SMA) are activated by thermal energy. The induced phase transformation (martensitic transformation) of the face-centered cubic lattice into a hexagonal close-packed lattice results in a macroscopic, reversible shape transformation. Dielectric elastomers are compliant capacitors. They react to the compression forces generated by an external electric field by changing their dimension
Fig. 6.4 Specific properties of fiber-reinforced polymers compared to other engineering materials
Fig. 6.5 Transduction between energy domains in intelligent materials for application in smart structures (Bergamini 2009)
parallel to the electrical field. By virtue of their incompressibility, an extension in the two transversal directions will be observed.
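The actuation principle of dielectric elastomers can be illustrated with the standard Maxwell-pressure estimate; the sketch below is not from the chapter, and the film thickness, permittivity, voltage and modulus are assumed, illustrative values.

# Maxwell-pressure sketch for a dielectric elastomer film (assumed values).
EPS_0 = 8.854e-12          # vacuum permittivity in F/m
eps_r = 3.0                # assumed relative permittivity of the elastomer
thickness = 50e-6          # assumed film thickness in m
voltage = 3000.0           # assumed applied voltage in V
modulus = 1.0e6            # assumed elastic modulus in Pa

e_field = voltage / thickness                 # electric field across the film
pressure = EPS_0 * eps_r * e_field ** 2       # electrostatic pressure squeezing the film
thickness_strain = -pressure / modulus        # small-strain estimate of the compression

print(f"Maxwell pressure {pressure / 1e3:.0f} kPa, thickness strain {thickness_strain * 100:.1f}%")

With these numbers the film is compressed by roughly ten percent which, by incompressibility, appears as the transverse extension described above.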
6.3 Design Process
Design is a creative process. It mainly takes place in the minds of engineers, who apply their knowledge in various science and engineering fields as well as relying on their intuition and know-how to generate new technical solutions. In early times, one single person was sufficient to master all the knowledge that was necessary to design and fabricate a technical object (Ulman 1997). Industrialization, and consequently the mechanization of society starting in the middle of the nineteenth century, required a more systematic approach to engineering design; the complexity of the products could not be handled any more by single persons. People established principles of construction, derived rules for embodiment and guidelines for material selection, and integrated principles of mechanics into the design process to analyze the behavior of materials and mechanical elements. In other words, design moved away from art and became a technical discipline. Today, the development of new technical products is performed by teams of engineers representing all disciplines involved in the design process. Pahl et al. (2007) divide the design process into four main phases, namely: (1) planning and clarifying the task, (2) conceptual design, (3) embodiment design and (4) detail design.
Function and form are intimately related to one another. The function describes what the purpose of a technical object is; the form or structure describes how this product will do it (Ulman 1997). According to Fig. 6.6, the structure is characterized by a set of attributes defining material, topology, shape and size. The topology defines the distribution of the material properties in a given design domain, the shape describes the geometry or surface, and the size provides the dimensions, e.g., the cross-sectional area or thickness of structural elements. The design process is motivated by an initial idea and/or a need for improvement. Drivers for new product ideas are depicted in Fig. 6.7. To cope with the complexity of the design problem, engineers tend to break down the overall problem into functional sub-systems of reduced complexity. The initial lack of knowledge can subsequently be handled more efficiently following a heuristic design approach (Ulman 1997; Pahl et al. 2007). During the different design phases, the design undergoes a variety of design steps. Each design step coincides with design decisions, which advance the design state, leading to a final construction. The product knowledge increases, while the degree of freedom for change decreases (Fig. 6.8). Concurrently, the abstract (virtual) representation of the technical object, namely the set of information representing the form and the function of the object, is continuously refined in order to reflect the actual design state. Ultimately, design will produce a physical (material) object. At the beginning of the design process, engineers are ideally exploring the design space without any constraint; this space is spanned by all design attributes
Fig. 6.6 Main factors influencing design: material (e.g., steel, aluminum, titanium, plastics, composites), topology (topology optimization), shape (shape optimization) and sizing (e.g., size of the cross section)
Fig. 6.7 Drivers for new product ideas (Steinhilper and Sauer 2006): the general environment (economic events such as energy shortages, substitutions such as mechatronics replacing pure mechanics, ecological requirements such as recycling), the technological environment (new manufacturing methods, new materials, new processes and methods), the market (technological/economic market position, modification of market demands, client feedback, competitive pressure) and the company (need for product improvements, research results, ecological requirements, market differentiation, new manufacturing methods, rationalization requirements, budget demands)
Fig. 6.8 Generic product development (Ledermann 2006)
involved in the design. Design attributes encompass all design variables related to the topology, shape and size of the structure. The engineers' task is basically to identify viable and non-viable design regions and eventually constrain the design space by finding “optimal values” for all those attributes, considering a set of partly contradicting requirements encompassing:
Design objectives: performance and costs
Structural requirements
Functional requirements
Technological aspects
The border dividing topology, shape and size is fluid. Nevertheless, within an ideal design process, one can correlate those design features with the different design phases. Accordingly, variables related to the topology are defined during the conceptual phase. Shape and geometry are defined in the embodiment design phase, while the final dimensions are set during detail design. The design process can be measured in terms of time, costs and quality of the final design. The latter correlates with the number of design alternatives that have been generated and evaluated, and consequently with the number of design decisions that have been taken. As illustrated in Fig. 6.8, costs produced by design errors increase significantly along the different design phases. Moving decisions to earlier concept stages leads to increased freedom for changes in the product design and to lower error costs at the time of the decision-making (Ledermann 2006).
6.4 Design Methods and Tools
The expansion of computer-aided design (CAD) and computer-aided engineering (CAE) tools in the past 20 years has provided totally new possibilities to simulate and evaluate the behavior of technical systems with respect to structural and functional requirements, virtually considering each discipline involved and the multi-disciplinary character of the development of today's products. The ability of modern CAE tools to simulate and predict the physical behavior of a system, their flexibility in terms of object representation and their fast adaptation to design changes can be dramatically enhanced by applying automated optimization methods. In the
following, we dedicate particular attention to evolutionary algorithms (EAs) because they have been demonstrated to be very well suited for handling multi-objective structural optimization problems, where the solution space is non-convex and design variables are both of discrete and continuous nature. EAs conceptually mimic the natural evolutionary process by implementing the Darwinian principle of survival of the fittest in a virtual environment. Bentley states that as long as some individuals generate copies of themselves that inherit their parents' characteristics with some small variation, and as long as some form of selection preferentially chooses some of the individuals to live and reproduce, evolution will occur (Bentley 1999). In nature, a population of a species evolves through the mating of the fittest individuals. Similarly, within a population of individuals, each of them representing a possible design for a structural problem, those individuals performing well under given design objectives and constraints are combined in order to find even better designs. EAs distinguish between a genotype search space and a phenotype solution space. Accordingly, the phenotype, i.e., the virtual structural representation (e.g., CAD model, FEM model) that is used for evaluation purposes, is converted into a defining set of parameters (genotype). Different genetic operators can be applied to the genotype to induce a selective pressure on the evolving population: The selection operator chooses the fitter solutions to be parents of the next generation. Similar to the combination of two parents' genes that form the genes of a child, recombination operators combine two or more parents' genotypes to generate offspring. Mutation operators randomly modify parameters within single genotypes. Replacement controls how offspring are inserted into the new generation. A basic EA scheme for solving structural optimization problems is shown in Fig. 6.9.
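A minimal sketch of such an evolutionary loop, with the four operators named above, is given below. It is written for a generic real-valued genotype; the fitness function and the parameter bounds are placeholders, not the structural models used in the applications that follow.

import random

def evolve(fitness, n_params, pop_size=30, generations=50,
           bounds=(0.0, 1.0), mutation_rate=0.1):
    """Minimize `fitness` over real-valued genotypes with a simple evolutionary loop."""
    lo, hi = bounds

    def tournament(pop):
        # Selection: the fitter of two randomly drawn individuals becomes a parent.
        a, b = random.sample(pop, 2)
        return a if fitness(a) < fitness(b) else b

    population = [[random.uniform(lo, hi) for _ in range(n_params)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        offspring = []
        for _ in range(pop_size):
            p1, p2 = tournament(population), tournament(population)
            # Recombination: uniform crossover mixes the two parents' genes.
            child = [random.choice(pair) for pair in zip(p1, p2)]
            # Mutation: small random perturbations keep the search exploring.
            child = [min(hi, max(lo, g + random.gauss(0.0, 0.05)))
                     if random.random() < mutation_rate else g
                     for g in child]
            offspring.append(child)
        # Replacement: the best pop_size individuals of parents and offspring survive.
        population = sorted(population + offspring, key=fitness)[:pop_size]
    return population[0]

# Toy usage: find the design vector whose entries are all close to 0.7.
best = evolve(lambda x: sum((g - 0.7) ** 2 for g in x), n_params=5)

In a structural setting, each genotype would be decoded into a CAD or FEM model (the phenotype) whose analysis supplies the fitness value.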
6.5 Application Examples
The environment for product development and innovation is changing rapidly as a consequence of the continuously increasing competitive pressure. On the one hand, the globalized market requires better products, faster and cheaper. On the other hand, today more than
ever, economic growth, prosperity and mobility need to be balanced with the limited availability of natural resources and sustainable energy sources.
The design space available to engineers is continuously expanded by the availability of novel materials and material systems. Thanks to their outstanding mechanical properties and functional features, they open up new and somewhat intriguing applications, which in fact increasingly tend to mimic functionalities of biological systems. Concurrently, the difficulty and complexity of the design process are increasing due to the intricate interdependencies of the many design parameters, whose number is increasing dramatically.
Fig. 6.9 Basic evolutionary algorithm scheme for structural optimization (Giger 2007)
Fig. 6.10 Material properties and structural functionality: structural functionality increases from lightweight through compliant and multi-functional to smart (adaptive) structures, while material complexity increases from isotropic through anisotropic to intelligent materials
The relationship between selected classes of engineering materials and structural functionality is depicted in Fig. 6.10. Lightweight structures all provide a basic functionality, namely carrying and transferring loads with minimum material usage while meeting structural requirements, such as a sufficiently high margin of safety against failure, creep, buckling, excessive deformation and fatigue, under all occurring operating conditions.
Table 6.2 Drivers for the utilization of composites for structural applications
Requirement: Structural aspects. Potential advantages: outstanding specific material properties; almost unlimited combination of different materials and fiber architectures; high resistance to fatigue damage; outstanding energy-absorption characteristics
Requirement: Multi-functionality. Potential advantages: design and fabrication of highly integrated monolithic structures (reduction of single parts); virtually no corrosion; good damping properties; manufacturing of highly complex shapes; anisotropic behavior; thermal stability; biocompatibility properties (surface and structural compatibility); transparency to microwave energy; easy integration of doublers, inserts, damping materials and functional elements such as actuators and sensors
Compliant structures create solutions with specified kinematical properties and replace discrete joints with compliant regions. They combine load-carrying properties with functionalities that are specific to mechanisms. Those functionalities are achieved by taking advantage of the elastic properties of the material, thus introducing a controlled flexibility into the structure. Specific applications concern clamping or fixation mechanisms, energy storage and release, transferring or transforming motion, force and energy, as well as shape adaptation. The trend towards improved structural functionality can be supported by using composite materials. They allow engineers to control, orient and tailor the intrinsic mechanical and functional properties of the material for specific applications and thus to design optimized solutions for load-carrying multi-functional structures and components (Table 6.2). The design of next-generation products will take advantage of intelligent materials with their capabilities of controlled changes and adaptation in mechanical and physical properties. Smart structures integrate actuators, sensors, and energy conversion or dissipation elements that interact with the elastic properties of the host structure by means of appropriate control devices. In this context, adaptive material systems open new perspectives in terms of structural efficiency, multi-functionality and adaptation to changing conditions of the operating environment. In particular, intelligent material systems can be applied for:
Active noise and vibration control: smart structures can be used to reduce the amplitude of vibrations drastically, so as to achieve noise reduction and sound insulation, and a prolonged lifetime of parts and components.
Shape control via compliant structures and mechanisms: an optimized distribution of stiffness and compliance in structures combined with embedded active elements can be exploited for novel devices, e.g., deformable and adaptive wings.
Structural health monitoring: embedded sensors provide continuous measurements of parameters related to mechanical integrity and system performance. Besides improving safety and reliability, this information allows maintenance costs to be reduced and system availability to be increased. Possible implementations are, e.g., in structural elements of modern transportation systems.
In the following sections, four applications of increasing complexity are presented. The first example is a racing engine piston pin (König 2004). It illustrates that even highly constrained design tasks involving geometrically simple components can lead to unexpected design solutions going far beyond intuition. The second application example is an adaptive car seat; it illustrates the potential of compliant systems for large-scale applications (Sauter 2008). Demand for weight-efficient structural systems is continuously increasing. Typical applications include transportation systems, such as aircraft, trains and cars, where lightweight structural solutions have already made it possible to increase safety and comfort considerably and to reduce consumption and emissions significantly, thus contributing to a sustainable development of our society. Accordingly, the third example deals with the design of a motorcycle
Fig. 6.11 Left: CAD assembly of the piston. Right: Finite element model of the piston assembly (König 2004)
rim made of carbon fiber reinforced plastics (Giger 2007). The last example finally presents the integration of smart technologies for vibration suppression purposes in the rear wing of a racing car.
6.5.1 Engine Piston Pin
This application has been investigated by König (2004) within the frame of his PhD thesis. Thanks to experience and intuition, engineers infuse knowledge into the design process. Following a heuristic design approach, they can establish one or even more potential design configurations (initial designs) and move the design state quickly and efficiently towards a viable design solution. Intuition impacts the design process in different ways. On the one hand, experience is valuable when the design space is well known, for instance in the case of redesigning existing products and when design requirements and technologies are not changing too much. In other cases, it might unnecessarily reduce the design space exploration, excluding potentially better design solutions. Experience was probably the main driver for the design proposed in Fig. 6.11. The piston pin is a highly loaded mechanical structure, because it transfers the huge pressure generated by the combustion from the piston to the connecting rod and ultimately to the crankshaft. Due to functional requirements, the external shape of the pin is cylindrical. Strength is the main design criterion because loads resulting from the reciprocating movement of the piston induce huge accelerations. Lightness is crucial to reduce reciprocating masses, hence increasing engine performance. The design objective is therefore minimum weight.
parametric design is only concerned with finding the right value for the inner radius of the pin that meets the strength requirements. However, the design space can be enlarged by considering a variation of the inner contour of the pin, which in fact is not subjected to any functional requirement and can therefore be almost freely varied. A possible parameterization of one half of the symmetric pin is shown in Fig. 6.13. The relationship between design parameters describing the shape of the inner contour and the resulting stress distribution is quite complex. Thus, it would be very difficult to find an optimum configuration in a reasonable time using a conventional “trial and error” procedure. As shown in Fig. 6.13, optimization methods have highlighted the potential for weight reduction. In fact for the best individual a weight saving of 11% could be achieved without increasing the maximum stress occurring.
6.5.2 Adaptive Car Seat
Shape adaptation is conceivable in principle for all mechanical systems. Yet adaptive structures will be of particular importance in the future for engineering systems for energy conversion and transportation, where functionality and performance require taking into account the ever-changing fluid/structure interaction effects. Examples hereof are stationary and rotating blades in turbines and compressors (for efficiency enhancement), aerodynamic surfaces in aircraft and helicopters (considering optimum lift/drag and maneuverability), energy harvesting from naturally occurring energy sources (such as wind, ocean waves), and aerodynamic properties of cars and trains (for increasing efficiency and vibration suppression).
Fig. 6.12 Resulting von Mises stresses (König 2004)
Fig. 6.13 Left: Parameterization of one half of the piston pin. Right: resulting von Mises stresses of the best individual (König 2004)
Fig. 6.14 Morphing car concept BMW GINA (source: Automotive Engineering, August 2008)
Figure 6.14 illustrates the concept car BMW GINA (geometry and function in “N” applications). The shape of the car can morph to meet particular driving mode requirements. For instance, the rear spoiler can become larger at higher speed, and the front air intake can widen for extra cooling or shrink to improve
aerodynamics. However, as the photographs reveal, this amazing functionality has not yet been integrated in a viable structural concept. The adaptive car seat was proposed by Sauter (2008). It is a good example to illustrate the potential of compliant systems for large-scale applications. A
car seat represents the biggest contact surface between the driver, the passengers and the car, heavily impacting their position and feeling of comfort and safety. The two main functions of the seat are support and adaptation. In existing products, the adaptation of the seat to the size and weight of different people and an adequate support under different driving conditions are achieved through very complicated constructions and mechanisms. Compliant designs are lighter, since they can be realized as single-piece devices and provide their functionality without any hinges. Furthermore, low manufacturing and maintenance costs are expected due to the small number of parts. Contradicting requirements resulting from adaptation and support have been balanced by designing a rib-like structure that basically imitates the human spine. As illustrated in Fig. 6.15, for stability reasons two columns instead of one keep the single rib structures in position. The design of the single ribs takes into account the complex relationship between the distributed body forces introduced by the driver and the desired target shape (Fig. 6.16). The related design task can be formulated as a topology optimization problem. The fitness formulation is based on the average displacement error at discrete points along the rib contour. Results of the topology optimization using EAs are shown in Fig. 6.16. Stiff and compliant zones are distributed within the structures at different locations (Fig. 6.17).
Fig. 6.16 Initial (outlined) and target shape (dashed) (Sauter 2008)
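A minimal sketch of the fitness measure described above, the average displacement error at discrete points along the rib contour, could look as follows; in the actual optimization the deformed contour would come from a finite element analysis of each candidate topology, which is only stubbed here with dummy coordinates.

def shape_error(deformed_points, target_points):
    """Average Euclidean distance between the deformed rib contour and the target shape."""
    assert len(deformed_points) == len(target_points)
    total = 0.0
    for (x, y), (tx, ty) in zip(deformed_points, target_points):
        total += ((x - tx) ** 2 + (y - ty) ** 2) ** 0.5
    return total / len(deformed_points)

# Dummy contours in metres; an FE solver would supply the deformed points per design.
deformed = [(0.00, 0.000), (0.10, 0.020), (0.20, 0.050)]
target   = [(0.00, 0.000), (0.10, 0.030), (0.20, 0.060)]
print(f"average contour error: {shape_error(deformed, target) * 1000:.1f} mm")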
6.5.3 CFRP Rims for Motorcycle Applications
This application has been investigated by Giger (2007) in the frame of his PhD thesis. Rims significantly influence the overall performance and energy efficiency of many transportation systems. For cars and motorcycles, state-of-the-art technologies include aluminum and magnesium alloys. Rims available on the market are the result of a long evolutionary process. Thus, the
Fig. 6.15 Adaptive car seat concept (Sauter 2008)
maturity level reached by those technologies is not likely to offer great potential for further improvement. The situation is different for composite technologies. The rim is designed to bear loads occurring under different driving conditions; besides, a well-defined structural compliance is required in order to achieve optimum handling properties. Additional design requirements include damage tolerance considerations, optimum introduction and distribution of the loads, geometric interfaces, manufacturing and customer acceptance. Design objectives are the minimization of the mass and of the moments of inertia of the rim. The rim is an unsprung mass and also contributes to the overall mass of the motorcycle. Both mass and moments of inertia heavily influence the handling characteristics of the motorcycle. Especially in the case of composite structures, the concurrent consideration of topology, shape and sizing aspects can enhance the design process, leading to superior solutions. In the following example, the complexity of the design optimization task is reduced by decomposing the overall design task in sub-functions and treating
P. Ermanni
them in a sequential manner. Consequently, in the first design phase, the shape of the rim is defined only considering esthetic, damage tolerance and load introduction requirements. The selected five-spoke design, as shown in Fig. 6.18, is well-suited for maximum damage tolerance in case of impact loading. This design moreover satisfies the common customer preferences. Spokes are hollow for maximum lateral stiffness. The arrangement of spokes seeks a tangential transition from the spokes to the hub, which reduces the critical compressive stresses of the laminate. At this point, no material or other structural properties have been created yet. Topology optimization is concerned with the finding of an optimum material distribution. The use of composite materials implies a much bigger design space. Solution finding has to cope with the complex interdependencies of the many design variables and their intricate relationship to the design objectives. The composite material is a stack of unidirectional and woven layers. The mechanical properties of the rim structures can be locally adapted to match local forces (stress) and global stiffness requirements. For this purpose, the rim geometry is subdivided into four sections. The stacking sequence in each domain is defined by: (1) The material properties of the constituent fiber and matrix materials in each layer and the respective volume contents. In this project a unidirectional carbon laminate and woven carbon laminates are used, whereas the woven fabric is available in two different thicknesses; (2) The number of layers; (3) The orientation of each layer.
Fig. 6.17 Functional prototype of an adaptive car seat (Sauter 2008)
Fig. 6.18 CAD and FE model of the rim (Giger 2007)
The representation of these properties is addressed by using a heterogeneous list of optimization parameters (Fig. 6.19). The optimization model finally
Fig. 6.19 Optimization parameters (Giger 2007)
consists of 61 parameters to be optimized in order to achieve the design goals. The optimization objective is to minimize the rim's mass and moment of inertia. Simultaneously, two constraints have to be fulfilled, i.e., the rim must not fail under maximum loading, and it has to provide a target stiffness for sufficient handling properties. The strength criterion is based on the well-known Tsai-Wu criterion. The stiffness criterion relies on practical stiffness tests with magnesium alloy rims; the CFRP rim should approximately achieve the same stiffness properties. Input for the FE model is the surface model defined by the CAD system. The purposes of the FE model are to analyze the stress distribution in the structure as a result of critical load cases, to prove the mechanical strength of the rim and to estimate its stiffness properties. Figure 6.20 illustrates a convergence plot of the optimization. The plot shows the 42 evaluated generations; further evaluations did not find significantly improved design solutions and are therefore omitted. The final stacking sequence of the CFRP rim leads to a
total mass of approximately 2,220 g, which is decisively lighter than state-of-the-art magnesium rims having a mass of at least 3,000 g (Fig. 6.21).
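The heterogeneous parameter list of Fig. 6.19 can be pictured with a small data structure such as the one below. The section count, admissible materials and angles, and the penalty-style fitness are illustrative assumptions rather than Giger's actual encoding, and the Tsai-Wu and stiffness evaluations, which require the FE model, are stubbed with constants.

import random
from dataclasses import dataclass, field

MATERIALS = ["UD carbon", "woven carbon thin", "woven carbon thick"]
ANGLES = [0, 45, -45, 90]                 # admissible layer orientations in degrees

@dataclass
class SectionLayup:
    """Stacking sequence of one rim section: (material, angle) per layer."""
    layers: list = field(default_factory=list)

def random_genotype(n_sections=4, max_layers=8):
    """One candidate design: an independent stacking sequence for every rim section."""
    return [SectionLayup([(random.choice(MATERIALS), random.choice(ANGLES))
                          for _ in range(random.randint(2, max_layers))])
            for _ in range(n_sections)]

def fitness(genotype):
    """Mass proxy plus penalties for violated strength and stiffness constraints."""
    mass = sum(len(section.layers) for section in genotype)  # more layers, more mass
    tsai_wu_index = 0.8        # placeholder FE result; values >= 1 mean failure
    stiffness_ratio = 0.95     # placeholder: achieved stiffness / target stiffness
    penalty = 0.0
    if tsai_wu_index >= 1.0:
        penalty += 1000.0 * (tsai_wu_index - 1.0)
    penalty += 100.0 * abs(1.0 - stiffness_ratio)
    return mass + penalty

print(fitness(random_genotype()))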
6.5.4 Smart Rear Wing of a Racing Car
Many lightweight structures, such as vehicle bodies, satellite structures and antennae, are designed to fulfill severe dynamic requirements in order to avoid excessively large vibration amplitudes, which can affect the functionality of the whole system and reduce the design life of parts and components. The suppression of structural vibrations by means of piezoelectric ceramic elements and passive electrical networks has been investigated by Belloli on a Formula 1 rear wing structure (Belloli 2007). This technology has proven to be beneficial to both aerodynamic and structural performance by reducing flutter and induced drag, thus improving the car's high-speed performance and stability. Furthermore, reduced displacement magnitudes
Fig. 6.20 Convergence plot (Giger 2007): best and average mass (in grams) versus the number of generations
Fig. 6.21 CFRP rim prototype manufactured in autoclave technology
correspond to lower strains and stresses. Critical parts can therefore be designed with lighter and more slender laminate profiles. Belloli proposes a four-level model to hierarchically represent smart structural systems (Fig. 6.22). Level 1 considers the synthesis and development of active materials. Level 2 is concerned with material systems for actuator and sensor technologies. Structural integration and control issues take place at Level 3, while overall system design is performed at Level 4 (Belloli 2009). In general, material and technology development at Levels 1 and 2 is not directly connected to a specific design task. The highly multidisciplinary
design process takes place at Levels 3 and 4 and needs to take into account the strong interaction among the host structure, the smart devices and the control strategies in order to arrive at an effective, energy-efficient solution, concurrently considering the entire set of structural and functional requirements to be fulfilled. In particular, the fundamental question is how to achieve a satisfactory vibration damping performance while fulfilling structural requirements in terms of global stiffness, weight efficiency, strength and stability. Actuators based on piezoelectric materials are among the best established and most researched solid-state actuators. Their use for vibration suppression purposes has been described for a number of structures, ranging from consumer products such as sporting goods to helicopter rotor blades. Piezo actuators can be easily embedded into the matrix of composite structures, and thanks to their good transduction properties, they find widespread use in the sensing and actuation of smart structures. The transduction of mechanical to electrical energy performed in piezoelectric components is the key to accessing the flexibility of electronic circuitry to control the flow of energy between the environment and a structure. In an active arrangement, an electrical field is applied to the piezoelectric module. The amplitude and frequency of the actuator signal are based on sensor feedback. Active systems require complex associated electronics and bulky amplifiers for driving the
Fig. 6.22 Schematic of a generic smart structure (Belloli 2009). Level 1: active (e.g., piezoelectric) and passive materials; Level 2: smart device (piezoelectric element with shunt circuit control); Level 3: host structure and control algorithm; Level 4: complex smart system composed of active and passive structural parts
actuators. An alternate method of vibration control is referred to as shunt damping. The general method of shunt control damping takes advantage of the relatively strong electromechanical coupling exhibited by piezoelectric ceramics. As the piezoelectric element – bonded on or embedded in a structure – strains, a portion of the mechanical vibration energy is converted into electrical energy, which can be treated in a network of electrical elements connected to the piezoelectric ceramic patch. Niederberger (2005) has shown that monolithic piezoceramic actuators can be successfully placed into structures for optimum vibration suppression using R-L and switching R-L shunts (Fig. 6.23). Shunt damping promises reasonable performance without presenting the main drawbacks of visco-elastic materials and active systems, namely additional mass, sensitive material properties and expensive associated electronics. The vibration suppression performance with regard to functionality and structural efficiency depends on the complex interaction among the following factors:
Structural mechanical response of the host structure
Fig. 6.23 Adaptive R-L shunted piezoelectric patch bonded to a mechanical structure: the piezoelectric element (capacitance Cp) is connected to a resistor-inductor network, in a fixed and in a switched arrangement (Niederberger 2005)
Control strategy
Sensor and actuator configurations, including type, dimensions, shape, placement and proper integration of the smart devices into the matrix of the composite host structure. Different integration configurations are illustrated in Fig. 6.24 (Melnykowycz 2008).
Each of these aspects is related to a large number of design parameters, which have to be determined during the design process. For this application, the design variables related to the number, position, dimensions and shape of the piezoelectric ceramic elements are determined using evolutionary algorithms. The considered design constraints include the maximum added mass, the maximum strain experienced by the ceramic elements and requirements for the electrical components. Experimental validation proved the technology to be effective in suppressing structural vibration. A vibration suppression of the first vibration mode by approximately 21.5 dB (or 76%) can be expected (Fig. 6.25).
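As an illustration of the resonant shunt idea, the sketch below tunes the electrical resonance formed by the piezo capacitance and the shunt inductance to an assumed structural mode; the capacitance and frequency are invented numbers, and the resistance, which sets the damping, depends on the electromechanical coupling of the real wing and is determined by the optimization described in the cited work.

import math

# Resonant (R-L) shunt tuning sketch with assumed values.
C_PIEZO = 100e-9     # assumed piezo patch capacitance: 100 nF
F_MODE = 45.0        # assumed first bending mode of the wing: 45 Hz

# Tune the electrical L-C resonance to the structural mode: f = 1 / (2*pi*sqrt(L*C)).
omega = 2.0 * math.pi * F_MODE
l_shunt = 1.0 / (omega ** 2 * C_PIEZO)
print(f"required shunt inductance: {l_shunt:.0f} H")

The large inductance that results at such low frequencies, on the order of a hundred henry here, is one reason why synthetic inductors or the switching shunts mentioned above are used in practice.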
Fig. 6.24 Left: Optimum placement of two actuators. Middle: Integration configurations for smart devices: (a) inserted, (b) interlaced, (c) cut-out. Right: Embedded piezoelectric actuator for vibration damping purposes (Belloli 2007)
Fig. 6.25 Top: Strain distribution for the 1st vibration mode. Bottom: Magnitude of the transfer function from disturbance to velocity of the endplate tip: open and shunted systems, respectively
References
Ashby MF (1992) Materials selection in mechanical design. Pergamon Press, Oxford
Belloli A (2007) Methods and techniques in composite design for structural vibration suppression via shunted piezoelectric elements. Dissertation ETH No. 18352
Bentley PJ (ed) (1999) Evolutionary design by computers. Morgan Kaufmann, San Francisco, CA
Bergamini A (2009) Electrostatic modification of the bending stiffness of adaptive structures. Dissertation ETH No. 18159
Jaffe B, Cook WR, Jaffe H (1971) Piezoelectric ceramics, vol 3. Non-metallic solids. Academic Press, London
Ennos AR (2005) Compliance in plants. In: Jenkins CHM (ed) Compliant structures in nature and engineering. WIT Press, Southampton
Giger M (2007) Representation concepts in evolutionary algorithm-based structural optimization. Dissertation ETH No. 17017
König O (2004) Evolutionary design optimization: tools and applications. Dissertation ETH No. 15486
Kulfan BM (2009) A paleoaerodynamic exploration of the evolution of nature's flyers and man's aircraft and options for future technology innovations. In: 16th Annual SPIE Smart Structures and Materials/NDE, 8–12 March 2009, Town and Country Resort and Convention
Ledermann C (2006) Parametric associative CAE methods in preliminary aircraft design. Dissertation ETH No. 16778
Mattheck C (1993) Design in der Natur. Der Baum als Lehrmeister. Rombach Verlag, Freiburg
Melnykowycz M (2008) Long term reliability of active fiber composites (AFC). Dissertation ETH No. 17767
Niederberger D (2005) Smart damping materials using shunt control. Dissertation ETH No. 16043
Pahl G, Beitz W, Feldhusen J, Grothe KH (2007) Engineering design: a systematic approach. Springer, Berlin
Sauter M (2008) A graph based optimization method for the design of compliant mechanisms and structures. Dissertation ETH No. 17787
Steinhilper W, Sauer B (2006) Konstruktionselemente des Maschinenbaus 1. Springer, Berlin
Ulman DG (1997) The mechanical design process, vol 2. McGraw-Hill, New York
Vincent JFV (2005) Compliant structures and materials in animals. In: Jenkins CHM (ed) Compliant structures in nature and engineering. WIT Press, Southampton, Boston
Part III Design of Environments for Living
7 The Theory of Dialogical Design
Meinhard von Gerkan
Architecture is not an autonomous art, like painting, sculpture, music and literature. It involves social application and is dependent on the conditions in which it is to be used, its location, the materials and techniques to be employed, and the capital available. Above all, it is conditioned by human needs. Hence, all architecture evolves from a dialog between the existing conditions and the ideals and models of the architects involved. Architects react to their task in different ways. They may see themselves as receiving orders, as holding a complacent monolog or as partners in a discussion. The pluralism of architecture today is partly the result of these different attitudes and reactions.
7.1 Dialogical Design in Architecture
7.1.1 The Conformist Pragmatic Position
Here, architecture is seen as a social commodity; such architects are prepared to adapt to the prevailing conditions without taking a clear position of their own. Synthesis is achieved by following the line of least resistance. The architect taking this position assumes the role of an assistant helping fulfill a project or idea. He is a product designer, giving the required stamp or appearance.
7.1.2 The Complacent Monolog
This tends to be dogmatic; existing conditions are largely ignored, and every proposal is given the stamp of ideology. Many contemporary architects tend to indulge in such monologs. They cite theories they have invented and then made into conditions. The conditions are imaginary and do not represent the real requirements of concrete tasks. Changes in social values create new ideologies and new formal doctrines. Periodicals, exhibitions and theoretical discussions are full of such monologist designs. Graphic visions or ideas that can effectively be demonstrated graphically use architecture as a subject; in doing so, they do not provide information about a concrete architectural intention; they produce illusions with alienations, painterly intensifications and associative embellishments, avoiding deliberate confrontation with existing conditions.
M. von Gerkan gmp Architekten von Gerkan, Marg und Partner, Hamburg, Germany
7.1.3 The Dialog
As I see our profession, the architect should be a partner in a dialog. His use to society derives from the fact that as an expert on the design of the environment, he reacts to conditions according to his subjective system of values. In this view, architecture is an art with social applications, and the result emerges from the interplay of inherent and external
factors. The dialectic lies between the intellectual position of the architect and the opposition of existing conditions. Neither the predetermined regimentation of an intellectual or artistic prejudice, that is, the dictate of a formal concept, nor the totally free play of forces, in which the coincidental and the chaotic can gain the upper hand, should dominate. Only the interplay of free forces and natural elements with the structure of an intellectual and artistic concept will result in their synthesis in a “true” image of human society in architectural form. All major cultural achievements are the result of that dialog.
7.1.4 Design in Dialog
I cannot put forward here a complete and finished theory; these remarks are analytical reflections on my own work. I would like to characterize my position in four key sentences:
Simplicity: Search in your designs for the most obvious and the simplest solution. Try to make the simplest the best.
Variety and Unity: Create unity in variety and variety in unity (see Fig. 7.2).
Structural Order: Give the designs a structural order. Organize functions into clear architectural forms.
Unmistakable Individuality: Give the design an identity using the specific features of the situation, location and task.
These are my general aims and form the inherent “design philosophy” I use to respond to conditions during the design process.
Fig. 7.1 Search for the clearest solution for your design. Strive for the best of simplicity
7.1.4.1 The Need for Simplicity
What I mean by simplicity is what is plausible, self-evident, clear and unadorned, in the sense that a sun umbrella is a simple solution. The most elementary simplicity, I am convinced, is also a guarantee of beauty and durability. But it is extremely difficult to achieve. Simplicity also aims at modesty, reduction and unity of materials. And, above all, simplicity is a maxim in the organization of function (see Fig. 7.1).
Fig. 7.2 Create uniformity within variety. Create variety within uniformity
7.1.4.2 Variety and Unity
The better the balance between variety and unity, the higher the quality of environmental design. The discomfort of our living environment is due either to an excess of uniformity, which we feel is monotonous, or to an excess of variety, which we register as chaos. So I see a balance of unity and variety as a particularly important objective (see Fig. 7.2). However, this problem arises in every design task, with different and often totally opposing conditions. While rows of identical houses can lead to monotonous uniformity, there is always a risk, when designing shopping precincts and corridors, of overloading the design. My objectives result neither from an idealization of variety nor from a dogma of strict order; my aim is to enter into a dialog with the prevailing conditions.
7.1.4.3 Structural Order
The purpose of structural order is to create clear architectural forms, to make the architecture visually legible and to give clear spatial orientation (see Fig. 7.3). Each design must be based on a structural principle that subdivides the building and organizes its functions. The ground plan and elevations must be derived
Fig. 7.3 Give the design a structural order. Organize functions as clear building forms
from that structural principle. For me, the structural principle is the grammar of the design process. My organizational principles are generally simple or composite geometry, sometimes overlaid with free forms.

7.1.4.4 Unmistakable Individuality
Every design concept is derived from the search for a specific identity in its solution (see Fig. 7.4). The concern here is not uniqueness for its own sake or an arbitrary formal act, but the endeavor to develop unmistakable individuality from the specific conditions of the given situation and the task in hand.
7.2 Conclusion
How our environment is shaped is not determined by us as architects alone. We can respond to the demands of the tasks in hand with designs. Whether our answers are accepted and have a chance of becoming part of our architectural environment depends in part on the extent to which we are prepared to listen during the dialog. Finding suitable and acceptable answers and solutions to the problems of designing the environment requires being ready for dialog and adapting one's standpoint to changing conditions. The decision as to what will be built and how affects society with its complex political and economic mechanisms. We architects are obliged to face up to these conditions by entering into a dialog and to participate in the discussion with inner conviction.
Fig. 7.4 Develop an identity of the design from the specific conditions of location and task
7.3 Lingang New City: A Metropolis in the East China Sea

7.3.1 Shanghai
Shanghai is a significant commercial center and at the same time a traffic junction within China. There are plans to develop Shanghai, a historically and culturally important city, into an international commercial, financial and trade metropolis. The rapidly increasing population of Shanghai, today nearly 13 million, will reach 16 million by 2020, 13.6 million of which will live in the city. Shanghai's city center will then have an estimated surface area of 600 km² and approximately 8 million inhabitants. In order to accommodate the huge growth of population and industry in Shanghai, the city-planning department conducted an international competition for the planning of a new harbor city that incorporates the international deep-sea container harbor Yangshan. The first prize of the competition, which took place in
Fig. 7.5 “We propose a central lake, around which the central business area will be built. The form of the lake will be a precise circle. As we are dealing with an artificial feature, there is no need to clone natural forms. Instead, we will use the metaphor of a drop falling into the water and creating concentric ripples”
Fig. 7.6 Masterplan: Lingang New City
several stages, was awarded to the Hamburg architects' office gmp – von Gerkan, Marg and Partners. The newly planned satellite city, Lingang New City (see Figs. 7.5, 7.6), is intended to provide space for 800,000 inhabitants in an area of 74 km² and represents, alongside Chandigarh, Brasilia and Canberra, one of the only cities of this scale to be founded in the past 100 years.
7.3.2 Urban Planning Concept
The concept for Lingang New City (see Fig. 7.7) takes up the ideals of the traditional European city and combines them with a “revolutionary” idea: instead of a high-density center, the focal point will be a circular lake with a diameter of 2.5 km and an 8-km lakeside promenade with a bathing beach à la Copacabana in the heart of the city. Cultural buildings and leisure facilities are located on islands, which can be accessed by boat. The design was inspired by the ancient city of Alexandria, whose lighthouse was one of the Seven Wonders of the World; the quality of
Fig. 7.7 Sketch: Lingang New City; Meinhard von Gerkan, “Each solution has something to do with dialog; the question asked has to be analyzed: first, Shanghai; second, the new harbor, seafaring and trade; and third, the exposed location on the coast – to be in dialog with the sea”
life provided by the close proximity to water draws its references from Hamburg. The whole city structure is based on the metaphor of an image of concentric ripples, formed by a drop falling into water. In line with this allegory, the utility structures are ordered in the form of concentric rings spreading outwards from the central Dishui Lake: from the promenade, through the extremely dense business district and a circular city park, 500 m in width, which incorporates solitary public buildings, to the block-like living quarters for 13,000 inhabitants each (see Fig. 7.8). The city ring between the lakeside promenade and the green belt, the business district, forms the center of city life. A mix of offices, shops, arcades, pedestrian precincts and dense living space is located here. The concentric structure is layered following the principle of a compass rose; the streets and pathways radiate out from the center. These provide the city with
a clear, ordered structure and divide the built-up rings into separate sectors. In this way, an ideal network of access is created, within which the city can also expand above and beyond the planned scale. The countryside penetrates like wedges as far as the second ring. Waterways and small lakes extend into all quarters, underlining the central theme of “waterside living” in a wide variety of forms. The public transport system with light trains at street level functions as a circular railway with adjoining loops.
7.3.3 Realization
The first construction phase of the new city, for 80,000 inhabitants in the present mainland area, should be completed by 2010. The second and third phases of construction will follow by 2020. The area required
Fig. 7.8 Model photo (Photo: Heiner Leiska)
Fig. 7.9 Satellite photo before the land reclamation. (Photo: Shanghai Urban Planning Administration Bureau)
Fig. 7.10 Satellite photo of the land reclamation for Lingang New City, 2005 (Photo: Shanghai Urban Planning Administration Bureau)
for this was reclaimed from the sea by means of an earth embankment (see Fig. 7.9). In the process of obtaining this land, Dishui Lake was formed as the central point of the new harbor city (see Fig. 7.10). At present, gmp is planning, among other projects, the Cloud Needle, the first segment of the so-called New Bund, and Western Island, as well as building the Maritime Museum and the opposite Shanghai Nanhui District Administration Center (see Fig. 7.11).
7.3.4 Ring Development
All three layers of the ring development will be characterized by dense building blocks, broken up by squares in each quarter and small pocket parks. The promenade ring provides a fantastic view across the lake, while the next city ring is traffic-free and characterized by its shopping facilities. Finally, the adjacent park ring provides an attractive location between the city park and the lake. Its northern part can accommodate living quarters. Each of the three layers has its own characteristic face. Differentiation in the height of the blocks, the choice of building materials and the design of the exterior provides exciting diversity. At the same time, they all fit a common city-planning canon, and the individual buildings can be plausibly seen as part of the whole, forming an ensemble. In the selection of the building materials for Lingang New City, it is important to consider the ancient building tradition of China, in particular in the Shanghai region. A successful mixture of cultural tradition
and modern European architectural styles will give Lingang an unmistakable identity. The 14 living quarters, grouped in the third ring around the center, are embedded as defined, identifiable areas in the expanded countryside of Nanhui province. These self-sustaining mini-centers with shops, services, basic medical care, kindergartens and day care centers form self-sufficient communities. While their common size, grid and basic module, as well as a pre-determined canon of materials, show that they are clearly part of a group, the design of the public spaces gives each of them a unique character. The squares and public parks are strongly influenced by international harbor cities, which serve as models of inspiration, thus allowing scope for individuality and an unmistakable identity.
7.3.5 Radial Structure
The radial structure, which divides the city rings into individual segments, has squares of different shapes and sizes along its length. The most striking of the squares is the Main Square on the west-east axis. It serves as the prelude and invitation to a stroll along the lakeside promenade and as an attraction for tourists from all over the world. The radial city canals likewise have an important function in the city's water supply, but equally serve to give structure and identity to the inner city area. Their individual design and profile give each canal its own character and identity. The names of the canals are reminiscent of the great rivers of the world (Mississippi, Ganges, Volga, Yangtze, etc.).
7.3.6 Landmarks
Fig. 7.11 Landmark Building: Maritime Museum (completed 2009)
The landmarks are visible from afar and represent the essence and substance of Lingang. The economic source, origin and culture of the city are manifested in them. They are solitary buildings, filigree and glass in their materiality. Like glowing crystals, they are allegories of the urban community and thus provide the city center with an architectural identity.
8 ETH Future Cities Simulation Platform: A Framework for the Participative Management of Large-Scale Urban Environments
Jan Halatsch, Antje Kunze, Remo Burkhard, and Gerhard Schmitt
Abstract How can architects and urban planners design sustainable future cities? Are urban designers really designing urban systems? And which new computer-based techniques have changed the way they think, plan, and communicate? In this article we will investigate how design thinking can contribute to the planning of sustainable future cities. We will emphasize the benefits of using computer-based techniques for purposes such as analysis, simulation, and visualization of complex urban systems. We will introduce a theoretical framework (Future City Simulation Platform), a new hardware infrastructure (ETH Value Lab) for collaborative design, and novel software-based approaches for typical design tasks, such as the simulation and visualization of scenarios or the collaborative decision-making processes among stakeholders with different backgrounds.
8.1 Introduction
What do Le Corbusier, Frank Lloyd Wright, Herzog and DeMeuron, and Zaha Hadid have in common? They have all worked as architects, urban planners, and at the same time also as designers of their own furniture. However, they rarely called themselves designers. For architects and urban planners, the term design is related to “design as process” (e.g., of a city or a building) rather than to “design as a product,” such as a well-designed car. In this article we will therefore investigate the design process and will discuss new computer-based methods and how they can become the foundation of a new design process for developing sustainable cities and sustainable buildings.
This article is organized as follows: Background and Motivation for our research, and Elements of the Framework as a structural introduction to the Related Work section, where an overview of available related work within each branch of the framework is given. The section Methodologies for Designing Future Cities sets out the main concepts of our framework, and in the section Technologies for the Planning of Large Urban Environments we present the associated implementation in more detail. In Examples, we show examples from real-world scenarios and teaching. The final section outlines our conclusions and our ambitions for future work.
8.1.1 Background and Motivation
Future cities – a synonym for today’s evolving megacities – can be interpreted as networks that encompass dynamic properties and attributes operating on different spatial scales with instant stimuli, which include
dimensions such as social, cultural, and economic measures. Infrastructures and the environment have to be adapted in response to the exponential growth of mega-cities like Shanghai. New sustainable urban development strategies have to be elaborated. The quality of life in mega-cities has to be ensured, especially in the long term. Planners, designers, officials, and even the public find it difficult to understand, to plan, and to communicate the effects of the persistently increasing demands regarding the use and the forms of energy, or the consumption of natural resources, as well as to meet the high expectations for transportation and communication infrastructures. Therefore, for large-scale planning projects, it is crucial to avoid missing links between experts and laymen. Because of their inherent complexity and dynamic nature, it is often very hard to perceive, to represent, and to communicate sophisticated problems or solutions, since the matter involves an interdisciplinary challenge where many different skills may be combined: computer science, environmental studies, sociology, design, communication, urban planning, strategic management, architecture, and aesthetics, and also feedback from the general public.
This article aims to create a shared vision, to map the desires of the involved participants, and to contribute to the challenge of participatory planning processes with the help of the urban planning framework, the Future Cities Simulation platform. The Future Cities Simulation platform will offer an infrastructure for the design, modeling, visualization, and analysis of future cities. It consists of a physical space with state-of-the-art hardware and software components as well as intuitive human-computer interaction devices: the ETH Value Lab. The platform forms the basis for knowledge, discovery, and representation of potential transformations of the urban environment, using time-based scenario planning techniques in order to test the impact of varying parameters on the constitution of cities. This will help researchers and planners combine existing realities with planned propositions, and overcome the multiplicity (GIS, BIM, CAD) and multidimensionality of the data sets representing urban environments. The three research thrusts of this ongoing project are:
1. Interactive urban design and scenario planning: Research methods to support concurrent collaborative urban design over distances and scenario planning based on defined case studies, and provide new means of combining design, simulation, analysis, and evaluation processes in an integrated workflow.
2. Elaboration of codified knowledge: Establish a City Knowledge Model (CKM) governing a city's life cycle. Provide new techniques for data mining and automatic knowledge acquisition from existing data, e.g., through computer-vision methods.
3. Knowledge transfer: Research new ways to synthesize, communicate, and interact with the essential knowledge and findings from an array of disciplines for decision making, education, training, demonstration, and public discussions: the “Simulation Platform” as a forum for dialogue.
The work will finally result in a dynamic platform for building, city, and territorial modeling with a long-term impact on the design of future cities. The project will enable simulation and analysis to be conducted in the interdisciplinary framework of the Future Cities Laboratory, creating decision synergies: a city-planning model to be emulated worldwide.
8.1.2 Framework
We group the main contributions of the presented framework into the following branches:
Environmental Simulation: A simulation and visualization software system for the creation of large urban environments, evaluation of occupant movement, and high-quality real-time visualization of simulation results.
GIS-Based Project Management and Collaborative City Design: A collaborative planning approach that uses techniques from environmental simulation is introduced. Further, a GIS layer-oriented software for project management, which is easy to use with touch gestures and works in large-scale multi-screen setups, is added.
Display and Interaction Systems: A state-of-the-art large-scale multi-screen setup equipped with multi-touch technology is used, connected to online and graphic computing resources situated outside the presentation room. At the same time, all possible configuration data are brought to the user within a few finger taps. In addition to this, a new camera-based gesture recognition system is incorporated.
Information Architecture: An architectural interior design is used that governs the processes and workflows within a collaborative working setup and takes care of the usability of the display and interaction system.
8.1.3 Related Work
Related work is structured as follows.
Environmental Simulation: We adapted the systems described by Aschwanden et al. (2008), Halatsch et al. (2008a), Ulmer et al. (2007), and Müller et al. (2006) for use in the ETH Value Lab. It seems promising to evaluate the resulting geometry of shape grammar computations for urban environments (Halatsch et al., 2008a) with crowd simulation methods (Aschwanden et al., 2008; Burkhard et al., 2008). Wegener (1994) gives an overview of a universal urban system model. It outlines the behaviors that existing urban simulation models attempt to represent in varying degrees of compactness, integrating differing approaches to temporal abstraction and to the detail of simulation agents and locations. It further describes the interdependencies between urban entities – household and business agents located in housing and nonresidential buildings – and the differing time scales of the evolution of buildings, transportation networks, urban form, and travel. The computational tool facilitated by the analysis of the generic city model is based on shape grammars (Stiny, 2006). Shape grammars have been applied in the past to the analysis of several historical examples, such as Palladian villas (Stiny and Mitchell, 1978). The technical characteristics are directly derived from an attributed shape grammar called CGA Shape, which is suited for applications in computer graphics. It was introduced by Müller et al. (2006) and has been extended by Ulmer et al. (2007) and Halatsch et al. (2008a) with urban planning rule sets and landscape patterns, which can be used for pre-visualization, master planning, guided design variation, and digital content creation purposes in the entertainment industries. The rules and patterns are associated with architectural attributes. Generated geometries follow basic architectural norms according to GIS information.
For an extensive overview of crowd animation systems, please refer to Magnenat-Thalmann and Thalmann (2004).
Display and Interaction Systems: Display and interaction systems were recently explored by Gross et al. (2003). Room-sized multi-projector displays such as the CAVE, with its six planar display walls, and tiled display walls such as the PowerWall are often limited in contrast ratio and daylight use. Multi-user experiences and direct on-display surface interaction remain largely unresolved in the systems described above. We discard active and passive 3D screens and projection-based systems in favor of touch interfaces to establish multi-user setups.
Information Architecture: The paradigms and ideas behind the term information architecture (Schmitt, 1999; Engeli, 2001) strongly influenced the final design of the ETH Value Lab. For more information, please refer to Burkhard (2006), Halatsch and Kunze (2007), Burkhard et al. (2009), and Halatsch et al. (2008c, 2009a).
8.2 Methodologies for Designing Future Cities
Future cities, especially mega-cities, have to be understood as dynamic systems – networks that bridge many different scales, such as local, regional, and global scales. Since such a network comprises several dimensions, for example, social, cultural, and economic dimensions, it is necessary to connect active research, project management, and urban planning, as well as communication with the public, for example, to establish a mutual vision or to map the desires of the involved participants. In the following, we present the main components of our framework, and subsequently we discuss newly developing needs and possible solutions.
8.2.1 Environmental Simulation
Although several software-based methods are available to simulate and evaluate certain aspects of urban environments, they (1) are often limited in data exchange and project organization, (2) require expert knowledge to operate and understand the software, (3) lack multi-user interaction, which is necessary for
collaborative planning, and (4) offer only limited real-time exploration of simulation results for evaluation of a design. We seek an answer to these needs with an urban simulation and visualization system that is capable of importing standard CAD and 3D geometry via common exchange standards like Collada and Autodesk FBX. The platform can be run on large multi-screen environments and offers current information to large user groups, such as in workshops. Currently, our system offers the visualization of large-scale urban environments and the simulation of occupant movement, which can be evaluated in real time. In addition, our system is also capable of processing the CO2 calculation of placed buildings and occupant behavior simulation.
8.2.2 GIS-Based Project Management and Collaborative City Design
Common software-based project management tools offer powerful methods to manage resources and durations. Since a typical urban planning project associates all processes with geographic information, it is strictly necessary to keep that geographic information up to date at all planning and project management stages to establish a shareable vision among all stakeholders. This is a gap that currently available project management tools cannot fill. Our solution offers GIS information with sketch-based touch-display interaction and is multi-user and multi-screen capable. Our GIS-based project management software keeps information updated and interactive on up to five large-scale high-resolution screens. This is necessary since typical board meetings in urban planning consist of up to 15 stakeholders. In addition, our collaborative city design tool chain enables stakeholders from different disciplines to work together and, in a participative manner, to set up design requirements as well as create the implementation, visualization, and annotation of urban designs.

8.2.3 Display and Interaction Systems
The accessibility of both our environmental simulation and our GIS-based project management system from one large-scale interactive display space is a major milestone. The explorable screen size and available human-computer interaction methods of today's office and meeting room installations are focused towards fulfilling single-user requirements. A multi-screen setup can drastically enhance collaboration and participatory processes by keeping information present for all attendees, for example, of an urban housing workshop. Therefore, we promote such a configuration and complete it with full multi-touch support for direct on-screen interaction. The corresponding hardware has to be open to standard software while delivering full computing and graphics performance. The system has to be easy to operate for average-skilled computer users.

8.2.4 Information Architecture
The authors developed the ETH Value Lab as a new example of collaboration and interaction in an architectural space. Special components such as large-area multi-touch displays enhance collaboration whether users are present in the ETH Value Lab or are connected to it via network infrastructure from outside. An advanced architectural design is vital for a complete understanding of the workflows within such a special location. The authors analyzed the different states of operation on a hardware layer as well as on an abstracted layer that is visible to the users.
8.3 Technologies for the Planning of Large Urban Environments
In the following, we wish to promote the technologies embedded in our urban planning framework, which consist of (1) environmental simulation, (2) GIS-based project management, (3) display and interaction systems, and (4) information architecture. They can be understood as a first starting point to guide and visualize long-term planning processes while intensifying the focus on the optimization of infrastructures (e.g., transportation, water, and communication systems) and buildings through new concepts, new technologies, and new social behaviors to cut down CO2
emissions, energy consumption, and traffic load, and to increase the quality of life.
8.3.1 Procedural Simulation and Modeling Techniques for Environmental Simulation
In this section, we introduce several ongoing research projects in the branch of environmental simulation that contribute as design tools for the development of future cities in the ETH Value Lab. The following projects explore how shape grammars can be integrated into the urban design process and into CO2 emission calculation models, and how they can be used for urban performance studies as well as for the interactive analysis, visualization, and iteration of urban designs. To this end, we extended a system called CityEngine (Müller et al. 2006), which now can be used to visualize master planning scenarios in high polygonal resolution. To this system we added the ability to describe urban design patterns in hierarchical form. This system can be applied (1) to the field of urban planning as well as to project development for the pre-visualization of large urban areas, especially in cases where common building regulations exist, (2) to evaluate simulation criteria on an urban scale, e.g., in order to achieve lower CO2 emissions within cities, (3) to the field of architectural history and archeology for the reconstruction of building designs and cities (Guidi et al., 2005; Müller et al., 2006), (4) to the field of architectural theory, e.g., for the analysis and synthetic construction of unbuilt city utopias, and (5) to the curriculum in education as a systematic learning tool to understand and encode architecture. This system was initially introduced by Parish and Müller (2001) at the ETH Computer Graphics Laboratory for the production of city-like structures and extended for the production of computer-generated architecture by Müller et al. (2006) at the ETH Computer Vision Laboratory; it is now available
Fig. 8.1 Left: Typical street view of Pungol. Middle: Synthetic reconstruction of Pungol. Right: The resulting master plan
as a commercial product from Procedural Inc. (http://www.procedural.com/cityengine). Both chairs are founding members of the DDM group. We adapted this system into a novel framework for grammar-based planning of urban environments to use inside the ETH Value Lab and for urban planning. For more information on the use of shape grammar tools in urban planning, we refer the reader to Halatsch et al. (2008a, b). In the following, we will illustrate this with representative work that was done by the Chair for Information Architecture and in part together with partners from the DDM group.
8.3.1.1 The Procedural Layout and Generation of City Block Configurations, Pungol, Singapore
As an introductory case study, we researched the possible implementation of urban schemes in our system (Halatsch et al., 2008c). The district of Pungol in Singapore consists of a large, high-density housing area (see Fig. 8.1). This area is intended to answer the rising demand for housing of the fast-growing population. The district's layout and the building shapes are highly regulated by the government. Therefore, there is a need for simulating various scenarios. The repetitive geometrical appearance of the facades and greening was encoded in CGA shape to predict the future appearance and growth of Singapore's coastal area if traditional high-density housing schemes of Singapore were used.
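To make the idea of grammar-based massing concrete, the sketch below shows a toy rule engine in Python. It is a simplification under assumed rules and attribute values, not the CGA shape language or the rule sets used in the projects described here: a lot symbol is rewritten into a building mass, which is then split into floors.

```python
# Minimal, illustrative shape-grammar sketch (hypothetical rules; not CGA shape).
# A shape is a labeled box; rules rewrite one labeled shape into successor shapes.
from dataclasses import dataclass, replace

@dataclass
class Shape:
    label: str
    x: float; y: float; z: float      # origin
    w: float; d: float; h: float      # width, depth, height

def lot_to_building(s: Shape, floors: int = 6, floor_h: float = 3.0):
    # Lot -> Building: extrude the footprint to a building mass.
    return [replace(s, label="Building", h=floors * floor_h)]

def building_to_floors(s: Shape, floor_h: float = 3.0):
    # Building -> Floor*: split the mass into horizontal slabs.
    n = int(s.h // floor_h)
    return [replace(s, label="Floor", z=s.z + i * floor_h, h=floor_h) for i in range(n)]

RULES = {"Lot": lot_to_building, "Building": building_to_floors}

def derive(shapes, steps=2):
    # Apply matching rules in parallel for a fixed number of derivation steps.
    for _ in range(steps):
        nxt = []
        for s in shapes:
            rule = RULES.get(s.label)
            nxt.extend(rule(s) if rule else [s])
        shapes = nxt
    return shapes

if __name__ == "__main__":
    result = derive([Shape("Lot", 0, 0, 0, 20.0, 15.0, 0.0)])
    print(len(result), "terminal shapes; first:", result[0])
```

Real CGA rules additionally carry facade subdivision, texturing, and GIS-driven attributes; the point here is only the mechanism of successive symbol rewriting.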
8.3.1.2 The Procedural Reconstruction of Le Corbusier's “Ville Contemporaine”
This project – carried out together with the Computer Vision Laboratory – evaluates the precise reconstruction of Le Corbusier's “Ville Contemporaine” using grammar-based digital reconstruction techniques.
Fig. 8.2 Left: Resulting master plan from the procedural reconstruction. Right: Detailed grammar-based building model
It features the complete procedural modeling of the urban plan at an intermediary level of detail. The resulting 3D city model incorporates buildings, the street network, open spaces, as well as waterways. In this work, we examined how knowledge from the past can be codified and directly applied in a city evaluation system. It helps to understand, model, and visualize the dependencies of urban systems for the efficient creation of large-scale 3D city layouts. Further, the work gives clues as to how urban design knowledge can be implemented in grammar-based design descriptions in order to create visualizations of urban scenarios efficiently with procedural techniques (see Fig. 8.2). The concepts of modernism made important contributions to the development of today's strategies for urban planning; however, they may also be responsible for many design problems in current mega-cities. The utopias proposed by Garnier and Sant'Elia and the garden city schemes of Howard inspired Le Corbusier to sketch the “Ville Contemporaine” in 1922 as an answer to the misery of the growing number of slums inside the French metropolis of Paris. With Fordism and Taylorism in mind, Le Corbusier sought to achieve a high population density with cost-efficient housing units that consisted of serialized, mass-produced elements. The “Ville Contemporaine” was intended to house 3 million inhabitants inside a complex metropolis with numerous and various functions, e.g., featuring standardized building types, open spaces for relaxation, as well as a standardized interconnectivity of separated traffic networks, which enable private and public transport on a broad scale. High-rise skyscrapers, cellular building blocks, and setback housing with integrated vertical gardens, mixed use, and short distances to public and private transport pose a vision that is difficult to achieve even today. Recent examples – for example, Masdar City – that
propose a new face for future cities are, interestingly, very similar to Le Corbusier's utopias. First, using the 3D reconstruction of the complete urban model, the urban plan was analyzed. Social factors, the zoning, and the distribution of objects were translated into GIS data, which were later used for the geometry production process. Second, the building grammars had to be defined. As inputs, we used the vast number of sketches and plans that Le Corbusier used to describe his urban vision. This step mainly concentrated on describing the proportional reconstruction of the building masses and the principal layout of the open space system in CGA. Finally, the building facades, vegetation, and street furniture were defined in intermediary detail for the procedural production process. The resulting model was used as a test platform for urban benchmark evaluation purposes.
8.3.1.3 The Hellenistic City Model Inspired by Koolhaas: A Test Case for a Generic City Model
In the following project – together with the Georgia Institute of Technology – we developed a generic city description model suited for purposes like semi-automatic city modeling and urban layout evaluation (Halatsch et al., 2009b). The generic city model refers to basic vital functions of a (computable) city. Feature patterns are used to extend the generic city model with global and local characteristics. The Hellenistic cities serve as a platform for a first implementation to test semi-automatic city model generation. As a result, four cities were reconstructed as a first example of our ongoing work: Miletos, Knidos, Priene, and Olynthos (Fig. 8.3). Architect Rem Koolhaas coined the term ‘generic city’ in the late 1990s. According to Koolhaas, the regional identity of the city has become
Fig. 8.3 Reconstruction of Olynthos: The city of Olynthos was generated by our context-sensitive shape grammar-based generic city modeling system
obsolete as a result of the global assimilation of cultural identities. Cities in the West and in the East now consist of interchangeable structural layouts and arbitrary design properties without being criticized by their inhabitants, who accept them as legacy habitats. There is a global common ground among urban landscapes, but they differ dramatically in regional and climatic properties (Heron, 1996). The concept of the “generic city” as a side effect of globalization can be detected in the ancient mass founding of Hellenistic cities all over Anatolia, from Asia Minor to the European Greek peninsula. Through the Hellenization of existing cities by massive transfers of population, colonization by retired colonists, synoecism, or the foundation of new cities, hundreds of instances of the “Greek city” spread within a relatively short period of time (Cohen, 1995). The generic Hellenistic city is characterized by urban planning that follows the Hippodameian system, as implemented in the classical cities of Miletos, Thourioi, Olynthos, and Priene, and by the unexceptional presence of a bouleuterion, a gymnasium, a theater, a stadium, and a temenos (Wycherley, 1962). Regional characteristics of climate and landscape formation gave each city its specific characteristics. Today, the concept of the “generic city” is once again contemporary. Globalization seems to have given a common fate to cities all over the world. Future cities – the evolving mega-cities of today – face, apart from the issue of identity, a wide range of other challenges: dynamic social change, the decline of fossil energy, increased population density, transportation, energy, communication, and water infrastructures, and also the need to provide good access to healthcare, education, public safety, workplaces, and
amenities. The pressing need to reduce CO2 emissions in all domains of life requires the optimization of urban planning processes as well as the validation and evaluation of planning functions. Our generic city model is able to generate understandable visualizations, methods to enhance the design of master plans, and approaches to measure the performance of a city. The system assumes that a city consists of basic functions and interlinked metadata that follow Koolhaas' idea of a generic city. The basic functions, among other things, steer the automatic generation of the urban layout, the growth behavior, and the urban geometries. Feature patterns serve as a specific urban design input and transform the generic model description into a locally applied city model equipped with local design attributes and properties. Metadata serve as guiding inputs for the city creation process and for the analysis of the generated structures affecting all urban scales. The generic city model opens up the possibility of a unified city simulation model, and the resulting output can be used to optimize urban layouts in order to meet high energetic, economic, and ecological performance targets as well as to accommodate “weak” criteria such as urban qualities. Our generic system is a simplified implementation of the model given by Wegener (1994), including behavioral or process models in methods for the procedural generation of urban layouts. This also includes the spatial patterns of access networks and locators for objects such as population, housing, and land use (see Fig. 8.4). The goal of our system is to create a basic setup for the automatic allocation of land use zones, associated roads, and generated buildings while using only shape grammars for the implementation. In previous publications (Halatsch et al., 2008a; Ulmer et al., 2007), the authors proposed methods to generate city layouts that start with the procedural generation of street networks (based on L-systems) and derive street shapes and lot shapes automatically based on the system by Parish and Müller (2001) and Müller et al. (2006). In these cases the grammars are applied after the production process of the urban layout in order to generate three-dimensional geometries like buildings, vegetation schemes, and street furniture. These approaches reduce computation costs efficiently and offer unrivaled flexibility in the production of urban schemes. The inherent boundaries between the production systems limit the practical application of shape grammar sets to a whole city region for
Fig. 8.4 Left: Original system model by Wegener (1994). Its inherent complexity was drastically reduced for the simulation of the Hellenistic city scheme pictured right
implementing urban simulation system models. In our case, we discarded the built-in per-lot polygon creation of the given CityEngine framework by Müller et al. (2006) and implemented our own software framework, which basically consists of the CGA shape grammar scripting language. With its additional Python support, the system is an ideal test bed for academically motivated urban modeling and 3D city model generation, in spite of the current drawbacks of higher memory consumption, increased costs for additional implementations, and limitations regarding landscape modeling (e.g., no topography at the moment). Three-dimensional scenes in this work were visualized with Autodesk Showcase (http://usa.autodesk.com/adsk/servlet/index?id=6848305&siteID=123112) – a real-time 3D scene graph rendering application.
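As a pointer to how the street-network stage mentioned above can work, here is a minimal L-system sketch in Python. The axiom, rewrite rule, branching angle, and step length are illustrative assumptions only; production systems such as the one by Parish and Müller use far richer, context-sensitive rules.

```python
# Illustrative sketch of an L-system street skeleton (simplified; assumed axiom
# and rules chosen for demonstration only, not the CityEngine pipeline).
import math

RULES = {"F": "F[+F]F[-F]"}   # branch left and right at each growth step

def expand(axiom: str, iterations: int) -> str:
    s = axiom
    for _ in range(iterations):
        s = "".join(RULES.get(ch, ch) for ch in s)
    return s

def interpret(program: str, step: float = 50.0, angle: float = 90.0):
    # Turtle interpretation: F draws a street segment, +/- turn, [ ] push/pop state.
    x, y, heading = 0.0, 0.0, 90.0
    stack, segments = [], []
    for ch in program:
        if ch == "F":
            nx = x + step * math.cos(math.radians(heading))
            ny = y + step * math.sin(math.radians(heading))
            segments.append(((x, y), (nx, ny)))
            x, y = nx, ny
        elif ch == "+":
            heading += angle
        elif ch == "-":
            heading -= angle
        elif ch == "[":
            stack.append((x, y, heading))
        elif ch == "]":
            x, y, heading = stack.pop()
    return segments

if __name__ == "__main__":
    streets = interpret(expand("F", 3))
    print(len(streets), "street segments, e.g.", streets[0])
```

The resulting segments would then serve as input for lot subdivision and grammar-based building generation in a fuller pipeline.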
8.3.1.4 Simulating Building Energy Efficiency at the Urban Scale Using Generative 3D Models
With this project we intend to give investors and urban planners a tool to control and optimize their portfolio with regard to CO2 emissions. This development towards an incentive-driven instrument will entice all players and stakeholders to implement such tools; visibility of cause and effect will increase consent and productivity among the actors of urban transformation. It is currently very common to evaluate building energy efficiency and carbon dioxide efficiency before a building is erected. Often, in order to save time and money, this process is forwarded to specialized companies who perform simulations based on the finalized drawings. Thus, energy and CO2 efficiency and, especially, carbon footprint simulations are not usually included in the early design process, when changes could stimulate better designs at comparably low cost. In addition, the simulations typically focus on individual buildings and not on city blocks or urban areas. We suggest that differences between simulated and measured results also emerge from neglecting the urban settings of buildings. In the presented approach, we start by extracting detailed attributes from a Building Information Model (BIM) and assigning them to the rule base of a generative urban model. The reports that result from multiple design iterations are then used as the basis for simulation, and this leads to a logical evaluation of the preceding projects.
We propose using a calculation of CO2 and energy per square meter on a generative urban model. This gives us the advantage of calculating adjusted designs without reconstructing the 3D model from scratch, and the possibility to design towards reduced emissions generated by the built environment. The expansion of the building simulations to an urban scale is important to incorporate these impacts. The cost of design changes is a crucial factor and depends on timing. The early implementation of simulation methods is therefore important to meet the ecological requirements coming from the government, as well as the economic ones from builders. On the volumetric model, we simulate several aspects of energy use, such as heating, cooling, and electricity consumption from artificial lighting. By defining the source of every form of energy and its origin, we transform this raw data into information about the ecological aspects of the building.
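The per-square-meter bookkeeping described here can be illustrated with a small sketch. The building attributes, energy intensities, and emission factor below are placeholder assumptions for demonstration, not coefficients from the actual BIM-driven workflow.

```python
# Sketch of an energy/CO2 estimate per building on a generated urban model.
# All intensity factors below are placeholder assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Building:
    name: str
    footprint_m2: float
    floors: int
    use: str  # "residential" or "office"

# Assumed annual intensities per m2 of floor area (illustrative values only).
ENERGY_KWH_PER_M2 = {"residential": 120.0, "office": 180.0}
CO2_KG_PER_KWH = 0.3   # assumed grid emission factor

def evaluate(buildings):
    rows = []
    for b in buildings:
        floor_area = b.footprint_m2 * b.floors
        energy = floor_area * ENERGY_KWH_PER_M2[b.use]
        co2_t = energy * CO2_KG_PER_KWH / 1000.0
        rows.append((b.name, floor_area, energy, co2_t))
    return rows

if __name__ == "__main__":
    district = [Building("Block A", 800, 12, "residential"),
                Building("Tower B", 500, 30, "office")]
    for name, area, kwh, co2 in evaluate(district):
        print(f"{name}: {area:.0f} m2, {kwh:,.0f} kWh/a, {co2:.1f} t CO2/a")
```

Because the floor areas come straight from the generative model, regenerating the design with adjusted rules immediately updates the estimates, which is the advantage argued for above.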
8.3.1.5 Evaluation of 3D City Models Using Automatically Placed Urban Agents
Motion planning and path finding have been studied extensively for productions in game development and in film and broadcast productions, and they have a strong background in robotics research and in areas covering knowledge mining and artificial intelligence. We apply these insights to an implementation based on the Massive crowd animation software (http://www.massivesoftware.com). Our method aims to analyze synthetic architectural mid-scale urban models and their inherent spatial configurations in a virtual environment with agents who represent typical occupants of the considered area (Aschwanden et al., 2008; Burkhard et al., 2008). This method targets the analysis, prediction, and visualization of occupant behavior in urban planning. For urban planners, the strong correlation between the built environment and its human occupants is evident, but it often seems to be neglected in academic architectural research. Therefore, we simulate and quantify this correlation with respect to building functions, the number of people, and fluctuations in density. Besides that, this method offers added value for the entertainment industry by delivering high-quality imagery output and a
decreased workload. Traditionally, the costs and time needed to produce populated digital urban sets for movie, game, and interactive VR projects are enormous. The occupants' location data and relevant semantic metadata are encoded inside a grammar-based city modeling system. This information is used for the context-dependent automatic placement of occupant locators during the procedural generation process of the urban 3D model. Most of the underlying parameters are interconnected with each other. For example, the number of resulting agents corresponds to the size, function, and location of one specific building. Once a 3D city model has been generated, occupants are represented by agents using (1) a commercial fuzzy logic system and (2) pre-animated 3D avatars. The agents find their way through the city by moving towards points of interest to which they are attracted. Each individual agent draws specific paths while interacting with the urban environment and other agents. Every path describes a set of parameters, for example, speed, space available, and level of exhaustion. The ensuing visual diagrammatic representation shows the resulting agent paths in correlation with the virtual environment. This offers the opportunity to investigate parts of a city and optimize corresponding aspects with minimal interventions on the urban level. The path maps are then used as feedback input into our generic city model. Buildings will adjust themselves to meet the requirements coming from the occupants and vice versa. This iteration loops from occupants to buildings and back, showing their interaction, and leads to an optimum according to the rules of procedural modeling as well as the preferences of the occupants. Important goals are (1) to evaluate the functionality of local networks that can later be used as representations of modeled urban objects – these distributed objects can be urban goals like train stations, pedestrian crossings, corner shops, office buildings, and courtyards with private zones – and (2) to determine the corresponding load that an investigated area will experience. Standard 3D geometry serves as a design input that is evaluated with agents adapted for urban simulation. The outcomes are (1) specific visualization maps showing the movement, residence time, and density of crowds in a certain area over time, (2) 3D animations that can be easily explored in real time, and (3) animated footage for each agent and/or agent group.
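A stripped-down version of the agent idea can be sketched as follows: agents walk toward the nearest point of interest and their visited cells accumulate into a density map. The attraction model, grid, and parameter values are illustrative assumptions, not the fuzzy-logic behaviors of the Massive-based implementation described above.

```python
# Minimal sketch of occupant agents moving toward points of interest and
# accumulating a density map (illustrative; not the Massive-based system).
import math, random
from collections import Counter

POIS = [(20, 20), (80, 30), (50, 90)]       # assumed points of interest
GRID = 10                                    # cell size of the density map

def nearest_poi(pos):
    return min(POIS, key=lambda p: math.dist(pos, p))

def simulate(n_agents=200, steps=150, speed=1.5, seed=0):
    random.seed(seed)
    agents = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(n_agents)]
    density = Counter()
    for _ in range(steps):
        moved = []
        for (x, y) in agents:
            tx, ty = nearest_poi((x, y))
            d = math.dist((x, y), (tx, ty)) or 1.0
            x += speed * (tx - x) / d          # step toward the attractor
            y += speed * (ty - y) / d
            density[(int(x) // GRID, int(y) // GRID)] += 1
            moved.append((x, y))
        agents = moved
    return density

if __name__ == "__main__":
    dens = simulate()
    print("busiest cells:", dens.most_common(3))
```

The resulting cell counts correspond, in spirit, to the crowd density maps used as feedback into the generic city model.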
8.3.2 Collaborative GIS-Based Project Management and Participative Urban Planning
In cooperative planning, it is very challenging to establish a shared understanding among the involved partners. In this case, the coordinative functions of visual representations are a significant help. Theoretical research on the use of visual methods to transfer and create knowledge has been described in Eppler and Burkhard (2005, 2007). A static strategy map or roadmap, for example, can facilitate aligning all partners to one mutual “big picture.” While a static map designed by one author may already offer support, the most valuable effects can be measured when a group designs the map in a moderated process. If a group of people develops a visual representation in collaboration, they will accept it and adhere to it more consistently. In contrast, if they are not actively involved, they will read it and most likely forget it, but certainly not enthusiastically share it with peers. The approaches presented in this section can be understood as an initial starting point to guide and visualize long-term planning processes, while intensifying the focus on the optimization of infrastructures (e.g., transportation, water, and communication systems) and buildings through new concepts, new technologies, and new social behaviors to reduce CO2 emissions, energy consumption, and traffic load, and to increase the quality of life (Halatsch et al., 2009a).
In the following exercise, we explored how shape grammars might be integrated into the urban design process. The participants in an elective course had to work together in a group of ten persons. They had to coordinate design requirements for the given urban area collaboratively. To do so, they used an analog visual structuring technique called Programming, which was introduced by Henn Architects, Munich, in the mid-1990s. From the visual analysis, the students developed the basic principles for the orientation and the layout of the urban plot as well as for the urban components, such as buildings, parks, streets, and so on. They translated their regulations from visual sketches directly into CGA shape text code inside the CityEngine. From there, the generated data can be easily exported from the CityEngine to real-time scene graph applications like Autodesk Showcase. With Showcase, for example, stakeholders
Fig. 8.5 As a result of the collaborative city design student workshops, a new use for an abandoned military airport on the outskirts of Zurich was developed with the collaborative working tools available inside the Value Lab
can easily explore different design variations and compare different 3D models side by side. The application further features an intuitive and touch-screen-capable user interface for navigating through 3D scenes (see Fig. 8.5). The connection between the design definition, its implementation, and its visualization was very tight. In this manner, several iterations of the design could be implemented simply, once the users were accustomed to the CGA grammar syntax. The preceding example shows that the co-creation of a big picture is a proven way to gain a shared vision and mutual understanding. Because no such tool has previously existed, we are currently implementing our own tool to enhance the process of data gathering and participative design. The focus of this project is the integration of GIS functionality along with project management tool behavior and simple user interaction paradigms like design sketching, image annotation, and roadmap drafting. The working title is “Future City Designer” (FCD), and it is customized for the ETH Value Lab (Burkhard et al., 2009). The software allows background images of the planned area to be added and individual elements to be included on different layers. Such elements are: projects, reference images, texts, sketches, and project members as well as tasks. In addition to FCD, promising tools like Prezi (http://www.prezi.com) can be used efficiently to create and run content presentations that rely on the touch-capable zoomable user interface (ZUI) paradigm, which is based on a fractal 2D landscape with an infinite level of detail.
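The layered element model just described can be sketched as a small data structure. The class and field names below are illustrative assumptions, not the actual Future City Designer schema.

```python
# Illustrative data model for layered project elements as described above
# (names and fields are assumptions, not the actual Future City Designer schema).
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Element:
    kind: str                      # "project", "reference image", "text", "sketch", "member", "task"
    label: str
    position: Tuple[float, float]  # placement on the background map

@dataclass
class Layer:
    name: str
    visible: bool = True
    elements: List[Element] = field(default_factory=list)

@dataclass
class PlanningBoard:
    background_image: str
    layers: List[Layer] = field(default_factory=list)

    def add(self, layer_name: str, element: Element):
        # Create the layer on demand and attach the element to it.
        layer = next((l for l in self.layers if l.name == layer_name), None)
        if layer is None:
            layer = Layer(layer_name)
            self.layers.append(layer)
        layer.elements.append(element)

if __name__ == "__main__":
    board = PlanningBoard("airport_site.png")
    board.add("tasks", Element("task", "Survey runway reuse options", (120.0, 80.0)))
    board.add("sketches", Element("sketch", "Green corridor", (300.0, 150.0)))
    print([(l.name, len(l.elements)) for l in board.layers])
```

Keeping every annotation tied to a position on the shared background map is what lets such a tool double as both project management and participatory design surface.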
Fig. 8.6 The Value Lab represents the interface to advanced city simulation techniques and acts as the front end of the ETH Simulation Platform
8.3.2.1 ETH Value Lab Hardware: A Platform for Urban Simulation and Participatory Planning
The ETH Value Lab (approximately 8 m in width, 10 m in length, 6 m in height) consists of one display table equipped with two 65-inch LCD screens. Furthermore, it offers a large interactive display board (see Fig. 8.6) that utilizes three LCD screens, each measuring 82 inches diagonally. Both the display table and the display board feature multi-touch screen overlays to enable users to work with contents directly and interactively. The system is tightly coupled and connected via fiberglass to the online and graphical computing platform in the basement of the building. A special challenge was the stable transmission of USB 2.0 signals to the computing resources, which required running 80 m of cable. Additionally, three Full HD projectors were installed. Two projectors form a concatenated high-resolution projection display with 4-megapixel resolution. This configuration will be used for real-time landscape visualization. The third projector delivers associated views for videoconferencing, presentation, and screen sharing. A sophisticated multi-channel sound system handles the videoconferencing and echo-cancellation demands as well as traditional audio playback needs. The combination of computing resources, display, and interaction system produces a tremendous number of possible configurations, especially in combination with the connected computing resources. Therefore, the system had to fulfill strong criteria for usability and also for ergonomics. The whole system is operated through activities the user selects via a touch panel. They describe common use cases of the ETH Value Lab. Activities can be easily customized since we rely on the AMX touch controller protocol. As a result,
users do not need to care about certain states of the system and single components, such as turning single displays and computers on and off. Most of the system resources, such as computers and operating systems, are abstracted. The user perceives the system through an abstracted layer as a holistic system, which behaves like a standard computer with a standard operating system and applications. The system manages all computing resources, operating systems, displays, inputs, storage, and backup functionality in the background as well as lighting conditions and different ad hoc user modes.
8.3.2.2 Camera-Based Gesture Recognition for Multi-User Interaction
In the following section, we present a novel human-computer interaction system for information caves, which is targeted towards room setups with physically spread sets of screens similar to the Value Lab. The work has been done in collaboration with the DDM member Computer Vision Laboratory. The system consists of a set of video cameras overlooking the room, whose signals are processed in real time to detect and track the participants, their poses, and hand gestures. The goal is to enable multiple users to interact with multiple screens from any location in a room. Preliminary experiments have been conducted in the Value Lab. They focus on the interaction of a single user with multiple layers (points of view) of a large city model displayed on multiple screens. The results are very promising, and work is continuing on the extension of the system to multiple users. We think that traditional human-computer interaction devices, such as mouse, keyboard, etc., are typically not adapted to work well with the multiplicity of participants and the multi-dimensionality and multiplicity of data sets representing projects. Recently, touch screens have also been proposed as more intuitive human-computer interaction devices. Despite their advantages for interaction, including multiple users, particularly in table settings, they remain inadequate for use in rooms with physically spread sets of screens, as they require the users to constantly move from one screen to another.
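One small geometric ingredient of such a setup can be sketched independently of the vision pipeline: once a hand position and pointing direction have been tracked in room coordinates, the addressed screen can be chosen by comparing the pointing ray with the directions to the screen centers. The screen positions and tracked values below are made-up illustrative numbers, not measurements from the Value Lab.

```python
# Sketch: deciding which of several spread-out screens a pointing gesture
# addresses, given a tracked hand position and pointing direction in room
# coordinates (geometry only; the camera tracking is assumed to happen upstream).
import math

SCREENS = {                      # assumed screen center positions in the room (meters)
    "board_left":  (1.0, 0.0, 1.8),
    "board_mid":   (3.0, 0.0, 1.8),
    "board_right": (5.0, 0.0, 1.8),
    "table":       (3.0, 3.0, 0.9),
}

def angle_between(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.acos(max(-1.0, min(1.0, dot / (nu * nv))))

def addressed_screen(hand_pos, pointing_dir):
    # Pick the screen whose direction from the hand is closest to the pointing ray.
    best, best_angle = None, float("inf")
    for name, center in SCREENS.items():
        to_screen = tuple(c - h for c, h in zip(center, hand_pos))
        a = angle_between(pointing_dir, to_screen)
        if a < best_angle:
            best, best_angle = name, a
    return best, math.degrees(best_angle)

if __name__ == "__main__":
    screen, deviation = addressed_screen((3.0, 4.0, 1.5), (0.0, -1.0, 0.1))
    print(f"pointing at {screen} (deviation {deviation:.1f} deg)")
```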
8.3.3 Selected Exemplary Teaching and Research Events in the Value Lab
The following examples represent an excerpt from several research and teaching activities inside the Value Lab. We will focus on urban modeling and simulation.
8.3.3.1 Workshop "New Methods in Urban Simulation," November 2008
In November 2008, about 30 international researchers from the eCAADe community came together to discuss future trends in urban simulation and their effects on research, education, and practical implementation. They used the Value Lab (Fig. 8.7) as a collaborative place for gathering their ideas and concepts. In this meeting, it turned out that for a larger group of participants, unidirectional presentations steered by one or more moderators tend to be ideal, as long as the possibility of commenting on and affecting the workshop immediately is kept alive.
8.3.3.2 Elective Course "CityEngine" – Fall Semester, 2008
During the fall semester of 2008, students experimented with the analysis and configuration of existing city structures. After a visual analysis of a design with Architectural Programming (Pena et al., 1977), they implemented their solution in CGA shape – the language used by the CityEngine to interpret and generate 3D geometry. They used the Value Lab for discussing and presenting their findings (Fig. 8.8).
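To give a flavor of how a handful of grammar rules can generate building mass, the following is a deliberately reduced Python sketch of rule-based derivation. It is not CGA shape syntax and not the students' actual CityEngine rules; shapes are simplified to labeled boxes, and the rule set and dimensions are invented for illustration only.

# Minimal rule-rewriting sketch: a "shape" is a labeled box with a footprint
# and a height. All rules and dimensions below are invented.

def derive(shape, rules, out):
    successors = rules.get(shape["label"])
    if successors is None:            # terminal shape: keep as geometry
        out.append(shape)
    else:
        for next_shape in successors(shape):
            derive(next_shape, rules, out)

def lot_to_building(lot):
    return [dict(lot, label="building", height=18.0)]        # "extrude" the lot

def building_to_floors(building, floor_height=3.0):
    levels = int(building["height"] / floor_height)           # "split" along the height
    return [dict(building, label="floor", z=i * floor_height, height=floor_height)
            for i in range(levels)]

rules = {"lot": lot_to_building, "building": building_to_floors}

geometry = []
derive({"label": "lot", "x": 0.0, "y": 0.0, "width": 20.0, "depth": 30.0},
       rules, geometry)
print(len(geometry), "floor volumes generated")   # 6 floors for this invented lot

A production system like the CityEngine works on the same principle of repeatedly rewriting shapes by rules, except that its rules operate on real geometry and carry architectural semantics.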
8.3.3.3 Elective Course "Collaborative City Design" – Spring Semester, 2009
In the spring semester of 2009, students researched how to establish design proposals in a more collaborative manner – which in this case involved plans for an abandoned Swiss military airport in Duebendorf. They set up an interactive shape grammar model, which was implemented with the CityEngine. Tools like Prezi were used for the concept visualization (Fig. 8.9). A real-time visualization implemented in Showcase was of immense help in understanding the design
Fig. 8.7 New methods in urban simulation workshop, November 2008 (http://www. urbansim.ethz.ch/)
Fig. 8.8 Left: Student analysis of a Tokyo district. Middle: Tokyo district reconstruction done by students. Right: Student group presenting their work inside the Value Lab
Fig. 8.9 Left: Project presentation with Prezi (http://www.prezi.com). Right: Real-time model of Duebendorf
interventions. Both applications are capable of working as touch-screen software.
8.4 Summary and Future Work
In this article we have discussed that architects and urban planners mostly refer to the term design when they talk about the design process, which includes, for example, envisioning, simulating, or visualizing a building or urban scenario. They rarely use the term for a product in the way that, for example, product designers use it. Therefore, we have put an emphasis on the design process and have described how new computer-based methods can contribute to the design or planning of sustainable future cities. We have illustrated with examples the benefits of using computer-based techniques for purposes such as analysis, simulation, and visualization of complex urban systems. More specifically, this article discussed how today's urban design tasks – planning, understanding, and communicating future cities – can be significantly enhanced, especially when different stakeholders are involved. The article described a framework that can help to solve urban planning issues. It introduced new workflow strategies in environmental simulation, which bring together insights from simulation and discussion in order to establish a participatory planning process. Because of the rising demands on multi-user usability and participatory information manipulation, we introduced the ETH Value Lab's display and interaction system, which is strongly embedded in an architecturally designed space with the necessary attributes of ergonomics and architectural quality. We expect that much future research will build on our framework, addressing workflow issues and psychology, the
mapping between simulation results and real-world data, and adding simulation-based city layout optimization as well as object-oriented urban design.
Acknowledgements We would like to thank Andreas Ulmer, Christian Schneider, Martina Maldaner-Jacobi, Frédéric Bosché, Michael Van den Bergh, Gideon Aschwanden, and Martina Sehmi-Luck for their continuing support and for many helpful discussions, as well as our students from the 2008 and 2009 semesters.
References
Aschwanden G, Halatsch J, Schmitt G (2008) Crowd simulation for urban planning. eCAADe 2008, Antwerp
Burkhard R (2006) Learning from Architects: Complementary Concept Mapping Approaches. Inf Vis 5:225–234
Burkhard R, Bischof S, Herzog A (2008) The Potential of Crowd Simulations for Communication Purposes in Architecture. In: Proceedings of 12th International Information Visualisation Conference, London
Burkhard R, Meier M, Schneider C (2009) The ETH Value Lab and Two Software Tools for Knowledge Creation in Teams. 13th International Information Visualisation Conference, Barcelona
Cohen G (1995) The Hellenistic settlements in Europe, the islands and Asia Minor. University of California Press, Berkeley
Engeli M (2001) Bits and spaces: architecture and computing for physical, virtual, hybrid realms. Birkhäuser, Basel
Eppler M, Burkhard R (2005) Knowledge Visualization. In: Schwartz D (ed) Encyclopedia of Knowledge Management. Idea Press, New York
Eppler M, Burkhard R (2007) Visual representations in knowledge management. J Knowl Manag 11(4):112–122
Gross M, Würmlin S, Naef M, Lamboray E, Spagno C, Kunz A, Koller-Meier E, Svoboda T, Van Gool L, Lang S, Strehlke K, Moere AV, Staadt O (2003) Blue-c: a spatially immersive display and 3D video portal for telepresence. In: ACM SIGGRAPH 2003 Papers, San Diego
Guidi G, Frischer B et al (2005) Virtualizing ancient Rome: 3D acquisition and modeling of a large plaster-of-paris model of imperial Rome. In: Beraldin JA, El-Hakim SF, Gruen A, Walton JS (eds) Videometrics VIII, vol 5665. SPIE, pp 119–133
Halatsch J, Kunze A (2007) Value Lab: Collaboration in Space. In: Proceedings of 11th International Information Visualisation Conference, Zurich, pp 376–381
Halatsch J, Kunze A, Schmitt G (2008a) Using Shape Grammars for Master Planning. Third conference on design computing and cognition (DCC08), Atlanta
Halatsch J, Kunze A, Schmitt G (2008b) Sustainable master planning using design grammars. 25th PLEA Conference, Dublin
Halatsch J, Kunze A, Burkhard R, Schmitt G (2008c) ETH Value Lab – A Framework For Managing Large-Scale Urban Projects. 7th China Urban Housing Conference, Faculty of Architecture and Urban Planning, Chongqing University, Chongqing
Halatsch J, Kunze A, Schmitt G (2009a) Value Lab: A collaborative environment for the planning of Future Cities. In: Proceedings of eCAADe 2009, Istanbul
Halatsch J, Mamoli M, Economou A, Schmitt G (2009b) The Hellenistic city model inspired by Koolhaas: A test case for a generic city model. In: Proceedings of eCAADe 2009, Istanbul
Heron K (1996) From Bauhaus to Koolhaas (note: An interview with Rem Koolhaas). In: wired.com, Issue 4.07, http://www.wired.com/wired/archive/4.07/koolhaas_pr.html. Accessed 24 July 2009
Magnenat-Thalmann N, Thalmann D (eds) (2004) Handbook of virtual humans. Wiley, Chichester
Müller P, Wonka P, Haegler S, Ulmer A, Van Gool L (2006) Procedural Modeling of Buildings. In: Proceedings of ACM SIGGRAPH 2006/ACM Transactions on Graphics (TOG), vol 25, No. 3. ACM Press, pp 614–623
Parish Y, Müller P (2001) Procedural modeling of cities. In: Fiume E (ed) Proceedings of ACM SIGGRAPH 2001, ACM Press, pp 301–308
Pena WM, Caudill W, Focke J (1977) Problem seeking: An architectural programming primer. CBI Publishing Company, Boston, MA
Schmitt G (1999) Information architecture; basis and future of CAAD. Birkhäuser, Basel
Stiny G (2006) Shape: talking about seeing and doing. MIT, Cambridge, MA
Stiny G, Mitchell WJ (1978) Counting Palladian plans. Environ Plann B 7:189–198
Ulmer A, Halatsch J, Kunze A, Müller P, Van Gool L (2007) Procedural Design of Urban Open Spaces. In: Proceedings of eCAADe 2007, Frankfurt, pp 351–358
Wegener M (1994) Operational urban models state of the art. J Am Plann Assoc 60(1):17–29
Wycherley RE (1962) How the Greeks built cities, 2nd edn. Macmillan, London
9
Iterative Landscapes
Christophe Girot, James Melsom, and Alexandre Kapellos
Abstract Landscape architecture is now far better equipped technically than ever before to respond to the new challenges posed by the design and spatial modeling of large-scale territorial topologies. Advanced 3D design tools in computer visualization, modeling and simulation have been considerably improved over the past few years, allowing young designers to develop design visualizations and models directly within 3D environments. Some schools of landscape architecture, which are integrated in larger colleges of design and architecture, have had the opportunity to train young students in this particular area of "spatial information." The Design Lab at the Institute of Landscape Architecture, part of the Department of Architecture at the ETH (Eidgenoessische Technische Hochschule) in Zurich, has chosen to organize large-scale landscape studios in coordination with the Rapid Architectural Prototyping Laboratory of the Department of Architecture. Students have been able to construct and visualize 3D landscape topographies and also to mill the physical models of their projects with GIS coordinates through Computer Numerically Controlled technology. Each student project in this article shows how a distinct topological approach can feed back and forth within an iterative design process. Through this method, we see that site-specific characteristics can be modulated and enhanced while dealing with a range of dynamic issues on large territories. Within the studio process, there is a continual interchange between sketches, hand-made study models in sand and computer models, which in turn help generate both sectional and plan developments.
9.1 Introduction
The role of modeling in the education of landscape architecture is a fundamental one. Current technologies have only strengthened the role of the model in landscape design, whether in site appraisal, manipulation or representation. The ILA design studio at the ETH Zurich uses the synergies between landscape
C. Girot (*) Landscape Design LAB of the ILA, ETH Zurich
design and computer numerically controlled (CNC) machines as modeling tools for students in the design process. The focus of the course is to develop proficiency in CAAD-CAM technologies and to familiarize architects with landscape design and the problems of large-scale topographical interventions. These technologies provide an excellent method of verification and visualization not easily attainable with traditional processes. They allow for a continuous exchange between concept and physical three-dimensional output. While many prototyping machines are available to students at the ETH Zurich (laser cutter, flatbed cutter, 3D printer), the three-axis mill allows for the
Fig. 9.1 Initiating – water flow, sediment and barrier performance testing
best translation of idea into model. When employed amidst the design process, it forms a freeze-frame, or "control model" – one manifestation among countless others in an evolving design. The mill works in much the same way as an excavating device: both remove material, at first coarsely and then increasingly finely, leveling and smoothing the surfaces. Over several years of intense integration into the design course, the role of digital modeling and CNC milling has evolved, both in relation to the teaching process and the designing of landscape. Throughout the semester, modeling is a three-phase process running in parallel with the development of the design project. The final result is a series of models or stages of evolution, documenting the project idea as it evolves from the initial concept to the final product through repeated design iterations during each phase. Landscape design remains a very tangible exercise, where working with models is fundamental to the project process. Two experiments from the MIT Media Lab are worth mentioning here: SandScape and Illuminating Clay (SandScape, Illuminating Clay). In both cases, advanced technology is used to generate topographical models, whether physical or virtual, in an effort to better apprehend or control the design process.
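As a rough illustration of this coarse-to-fine removal, the sketch below generates simple back-and-forth raster passes over a height field, first with a wide stepover and some stock left on the surface, then with a fine stepover down to the target heights. It is a schematic Python example with invented terrain and parameters, not the actual CAM setup used for the students' models.

import numpy as np

def raster_passes(heights, cell_size, stepover, stock_to_leave=0.0):
    """Yield (x, y, z) tool positions of a zig-zag raster over a height grid."""
    rows, cols = heights.shape
    step = max(1, int(round(stepover / cell_size)))
    for pass_index, j in enumerate(range(0, rows, step)):
        columns = range(cols) if pass_index % 2 == 0 else range(cols - 1, -1, -1)
        for i in columns:                                  # alternate direction per pass
            yield (i * cell_size, j * cell_size, heights[j, i] + stock_to_leave)

# Invented terrain: a single hill on a 200 x 200 grid of 1 mm cells.
grid = np.indices((200, 200)) - 100
terrain = 40.0 * np.exp(-(grid ** 2).sum(axis=0) / 4000.0)

# Roughing with a 10 mm stepover and 2 mm of stock left, then finishing
# with a 2 mm stepover down to the modeled surface.
roughing = list(raster_passes(terrain, cell_size=1.0, stepover=10.0, stock_to_leave=2.0))
finishing = list(raster_passes(terrain, cell_size=1.0, stepover=2.0))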
9.2 Model and Modeling Process
The teaching techniques described in this article continue on a path presented at the 2006 eCAADe conference (Kapellos et al. 2006), in which technology becomes a tool that inserts itself into the design process and enhances it. The resulting models, fabricated from various media (wood, foam, etc.), encompass this potential, allowing for rich exchanges between the multiple phases in the development of the project, feeding the process with additional "information" obtained through the model and the modeling process.
Fig. 9.2 Initiating – ascertaining barrier performance
9.2.1 Initiating
In the first phase, alongside intuitive sand models, an accurate digital site model is built, which becomes a base and point of reference for further work. While the students are assessing the site context, such a digital base allows them to generate sections and topographical height lines at varying intervals. Initial intuitions regarding the design hypothesis can then be mapped topographically and programmatically on the site in sand and in mixed media models. While a sand model (Fig. 9.1) allows the students to form an analytical "mapping" of the site, the digital and resulting milled model gives an early understanding of scale, relative heights and the existing landform. During an intense introduction to digital modeling and CNC milling, the students create a digital sketch model – a mixture between the existing site and its formal potential. The ability to simulate water and sedimentation scenarios allows the students to familiarize themselves with water dynamics (Fig. 9.2). This basic understanding can then give an indication of large-scale implications or inform small-scale scenarios (Fig. 9.3). From the beginning, the tool is exploited for both its potential accuracy and freedom in form generation. The focused exposure to modeling
software as a “sketching-tool” prepares the students to keep a free hand in the design process. The resulting models (Fig. 9.4) show the capacity to reform existing topography with conceptual interventions from an early stage in the process.
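The digital base model makes it straightforward to derive height lines and sections at whatever interval a phase of the work requires. The following short Python fragment is only an illustration of the principle; the terrain array and the 2 m interval are invented, and matplotlib's contouring is used simply as one convenient way to extract the lines.

import numpy as np
import matplotlib.pyplot as plt

# Invented stand-in for a digital site model: a grid of terrain heights.
y, x = np.mgrid[0:500:1.0, 0:800:1.0]
heights = 30 + 15 * np.sin(x / 120.0) + 10 * np.cos(y / 90.0)

# Topographical height lines at a chosen interval (here every 2 m) ...
levels = np.arange(heights.min(), heights.max(), 2.0)
plt.contour(x, y, heights, levels=levels, colors="black", linewidths=0.4)

# ... and one cross-section through the model.
plt.figure()
plt.plot(y[:, 400], heights[:, 400])
plt.xlabel("distance along section (m)")
plt.ylabel("height (m)")
plt.show()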
9.2.2 Controlling
Fig. 9.3 Initiating – developed topographical design concept
Fig. 9.4 Initiating – developed topographical design concept
Fig. 9.5 Initiating – sand model showing design concept
In the second phase, the students verify their project hypothesis by means of a series of test models in parallel with graphic renderings (plans, sections). In
design terms, each focuses on defining topologies such as edges, slopes, surfaces and paths. At critical points in the design, "hard copy" milled models can be tested for environmental and water performance properties (Fig. 9.5). CNC milling provides greater flexibility in the verification and visualization of the project. Each model, milled on a block of Styrofoam, lacks scale and is abstract. The constant exchange between the project (on paper) and the model is crucial because it allows a relatively precise adaptation of the expressed topology to the site concept. At this stage of design development, repeated test modeling and a process of "versioning" quickly focus the design hypothesis (Fig. 9.6). It is interesting to note that the milled model is often closer to the students' initial topographical concept than the design drawings. The teaching method functions in a cyclical mode of working between topographical lines and section, refining the design with each revision. This iterative mode works in combination with the generation of milled models, which record a specific frame in the design process. At strategic moments in the design process, the current digital landscape model is used to test the scale and impact of design decisions. For the students, such an active mode of reference can quickly bridge between abstract design decisions and working at a recognizable scale. Specific readings of the design, such as potential scenographic processions, or the resulting spatial qualities, can be tested and verified with simple cues of scale and form (Fig. 9.7).
Fig. 9.6 Controlling – milled sketch models demonstrating various stages of design development
9.2.3 Finalizing
In the third phase, the modeling project evolves into a clear design statement, with a shift to a more precise mode of representation. Textures and variations in materials and color are used to define elements such as circulation patterns, edge conditions, types of surfaces and soils, and vegetation (Fig. 9.8). The result is a very precise site model with the topographical precision of a map. During this stage, atmospheric representations of the proposed landscape can be projected, allowing
abstract and poetic visions to sit on an extremely accurate and reworkable topographical base (Fig. 9.9). The move from representation within the design process to the communication of the project to others entails a shift in emphasis from a conceptual to a tangible landscape vision. The implications of this shift traditionally imply a “freezing” of the design process. By working with an “active” digital base, the students are able to continue to design with the benefit of these atmospheric images, utilizing them as further design tools.
Fig. 9.7 Controlling – versioning enables the testing of scale, spatial definition and human perspective
Fig. 9.8 Finalizing – milled models can be further developed to depict the detail and qualities of the site
Fig. 9.9 Finalizing – 3D model as basis for visualizing potential landscape form
9.3 Conclusion
The digital process has merged with our aspirations for design education, particularly in the realm of design criticism and review. The common mode of fabrication facilitates an easily comparable and precise common reference for criticism and discussion between projects. Nevertheless, the technique allows huge variation in possible textures and topologies, and in combination with material choice can enable unique expression of the design and manifestation of the project. The results of the course have been published internationally and contribute to a broader discussion on landscape and design education (Girot 2007a, pp. 70–78; Girot 2007b). Within the course, the implementation of digital processes and iterative production has sharpened the critical and academic review process, raising the level of discussion while engaging visiting guests. The clarity of site problematics and design rationale invites discussion of a high order. Everywhere the course of nature is changing more rapidly than we would like. The challenge for contemporary schools of landscape architecture in this instance is whether they are capable of training new generations of students with the proper methods and tools of design. We need to offer both innovative methods and creative solutions that can be readily accepted and integrated by other professional fields. Such complex large-scale projects can only be achieved when working hand in hand with architects, engineers and environmentalists; we believe that landscape architects can play a more designated role in the initial development and implementation of such large-scale designs. If we fail to train new generations of landscape architects in this particular area of study, then we will see the new large-scale landscape projects
across the world designed only by engineers and environmentalists. We believe that with the proper method and understanding, landscape architecture is capable of branding an entirely new image and use of the environment while respecting the rules set by engineers and environmentalists alike. We are looking towards methods capable of reacting and adapting to important changes in the environment while improving the quality and imageability of landscapes to come. The development of a new method of topological design in landscape architecture linked to specific site problematics such as flood management must be approached within schools at an interdisciplinary level. This is probably the single most important challenge facing schools concerned with environmental design today. The rapid evolution of visualizing, modeling and simulation techniques for architecture, engineering, geodesy and hydrology signifies that we must be able both technically and conceptually to train people at a similar level of computer literacy in landscape architecture. In the present "knowledge economy," computer literacy must remain at the cutting edge. Few schools of landscape architecture outside the ETH have really been able to provide high-end training in computer visualization, modeling and simulation to young students that is up to par with that taught in architecture and civil engineering schools. Advanced computer knowledge coupled with the acquisition of appropriate topological abilities in grading and in drainage is a must if landscape architecture is to remain competitive with other fields in environmental design. And it is precisely through the mastery of methods in visualization and modeling as shown in this essay that landscape designs will deliver compelling results that are sustainable. This is only the beginning of much greater environmental challenges to come. Let us hope that landscape architecture will gain competence and recognition with this
new iterative process in the development of new and more sustainable landscape topologies. Only then will it really acquire the place and recognition that it deserves in the field of advanced environmental design.
References
Girot Ch (2007a) New landscape topology for flood control. Topos 60 – 2007 – challenges. Callway Verlag, Germany
Girot Ch (2007b) The rising tide of landscape architecture, Beijing Forestry University. Landscape-bly, Landscape Architecture journal, China
Illuminating Clay, MIT Tangible Media Group (2002) http://tangible.media.mit.edu/projects/illuminatingclay/ (retrieved July 15, 2009)
Kapellos A, et al. (2006) CNC Morphological Modelling in Landscape Architecture. Proc. of Communication Space(s) – 24th International eCAADe Conference, Volos, Greece
SandScape, MIT Tangible Media Group (2002) http://tangible.media.mit.edu/projects/sandscape/ (retrieved July 15, 2009)
Exhibitions
Wasserlandschaften – Visionen für den Churer Rossboden, Stadtgalerie Chur, 2005
Werdende Wahrzeichen – Architektur und Landschaftprojekte für Graubünden, Gelbes Haus Flims, 2005–06 and ARchENA der ETHZ, 2006
Mostra Internacional Escoles de Paisatge i Arquitectura 2006 – 1st Prize awarded
Part IV
Design of Minds
10
Applied Virtuality
On the Problematics Around Theory and Design
Vera Bühlmann
Abstract This article discusses certain issues that arise around the quest for a theory of design. It is engaged with the conditions and frameworks behind such a theory. Seeking to approximate design and science by a search for a methodology makes it quite obvious that the perhaps most common notion of design, design as problem-solving, falls short of important aspects. Without a respective architectonics, a set of systematically related methods alone could not legitimize normative actions (even if the performed normativity may be a factual one only). It could not do so other than as a specific form of dogmatism. Here, the difficult relation between design and theory becomes relevant: Would design as a science be theorized as one of the so-called natural sciences (in search for "natural design principles")? Or as one of the positive sciences (in search for its "formal ways of operation")? Or as one of the humanities disciplines (engaging within the difficult grounds of "semantic evaluation")? Or even as a kind of politics (striving to integrate divergent intentional configurations)? This article suggests conceiving of design as an integral vector among any of these, and it tries to formulate candidate principles for theorizing it. Departing from a reference to the notion of "a thinking of the outside" as a kind of rationality that is capable of organizing topologies and networks beyond the reign of the Cartesian grid, we identify "design," in addition to the more familiar guides of our thinking, namely "operationalization" and "description," as an interesting third pole. With that aim, the philosophical concepts of the virtual and the differential are introduced as eminently practical figures of thought. The article concludes with the emphasis that design is, constitutively so, a regulated practice of, and negotiation of and for, prognosis.
10.1 Opening Remarks
Thesauri are useful machines, probably readily available on your own computer when you press the respective function key, like Shift-F7 or something
V. Bühlmann ETH Zurich, Chair for Computer-Aided Architectural Design, Zurich, Switzerland
similar, in your word processing software. Looking up the term "design" and its cloudy conceptual vicinity, we find, for instance: plan (arrangement, lay out, proposal, map, proposition, scheme), outline, intent (reason, notion, aim, thought, end, view, intention), meaning, contrive (hatch, model, devise, think, plan, project), mean, sketch. Even this short and almost certainly incomplete list suggests a wide usage and variety of meanings for the term "design." Beyond that, it also emphasizes that design is fundamentally
linked to the question of how we relate – and particularly, how we want to relate – to the world. Any answer to this question determines at least the type of answer we may obtain to questions concerning the role of design, whether within science, for scientific purposes, or outside of science altogether. Design cannot be reduced to functions, problems, natural constraints, forms following or preceding any of those, or anything similar. Language tells us that the field of design applications spans many dimensions, that it is concerned with expectations and prognoses, as well as the other way around: expectations and prognoses increasingly reveal themselves as design issues as well, so much so that design indeed seems to be irreducible. What we conceive to be design becomes self-referential. The latter becomes obvious with the emerging notion of design in relation to technologies that are enhancing not only our capacities towards the outside world, but also those that we consider to be constituents of what it means to be what we are, "human." There, an undeniable ethical component comes into play, starting perhaps with the Human Genome Project, and continuing to be virulent with existing as well as anticipated nanotechnology, brain interfaces, and research on neuro-implants. Although in a certain sense we have always shaped our minds throughout cultural history, it is a fairly recent phenomenon that the design of minds comes within the reach of individuals (and is not an effect on a population scale). The scope of this extends far beyond that of indirect methods such as how to learn (or forget), or certain legal, illegal, or cultivated drugs. How would you design your mind, or the mind of your children, if given the possibility, tools, and methods? Which external expectations or forces would you decide to react to – or would you decide in terms of "reacting" at all? Obviously, we need some kind of theoretical framework that allows us first and foremost to think and to speak about such issues.
10.2 Nul ne vit sur Place
"Nul ne vit sur place," nobody's life is locally determined any more, Michel Serres claims in his book Hominescence (Serres 2001, p. 285). With this claim he implies far more than the simple observation, so obvious to any watcher of the media's evolution and
the resulting upheavals throughout the twentieth century – the recognition that in our everyday life we virtualize, with increasing success, our dependence on local circumstances. What he implies is nothing less than the end of the road for the Cartesian grid as a formal matrix for the organization of thought. The end of the grid as we knew it, and all forms derived from it1, according to Michel Serres' further argument, brings forth what he calls "les nouveaux objets-monde," the new world objects: "Connected to a cluster of points that is equipotential in every place of the world, we communicate at light speed. Who is my neighbour, my fellow man? Virtually, the human population in toto. As to numbers, space, time and speed, these world-objects lead us to live and think unconfined by and beyond location, towards the Universe, which, precisely, knows of no address [...]."2 This manner of thinking seems to be quite strange, as it implies that thought may leave the interiority of the thinking subject. Given its radical stance and its pervasiveness, one could even suspect a new kind of thinking here. It seems to resonate with a kind of thinking that Michel Foucault called "la pensée du dehors" (Foucault 2001, p. 670–697). The thought OF the outside situates itself in an inherent relation to an outside, as well as denoting the thought of the OUTSIDE, a concept that is well aligned with both Wittgenstein's and Putnam's remarks about meaning, who both conceived of meaning as independent from any particular mind "matter" as well as from psychological "states" (Vossenkuhl 1995, p. 252, Putnam 1979). Difficult inconsistencies would certainly arise from conceiving of this relation in terms of a relation to an objective reality to be interiorized; rather, this makes thought
1
We are thinking here of the grid-like formations of human affairs as they are actualized in the claims of the possibility of ideal, linear, or even formalized rationality. Today, after overcoming most of the large state-forming bureaucracies, grids do not appear any more as "rectangular" forms of organization. A particular technical realization of grid-thinking still underlies most of the ways that we conceive and use digital information processing. 2 Serres [1], p. 276. Translation from the original French by Reinhart Fischer: "Connectés à un ensemble de points équipotent à tous les lieux du monde, nous communiquons à la vitesse de la lumière. Qui est mon voisin, mon prochain? Virtuellement, la population humaine entière. En nombres, espace, temps et vitesse, ces objets-monde nous amènent à vivre et à penser hors et au-delà du lieu, vers l'Univers qui, justement, n'a pas d'adresse. [...]."
leave the realm of subjectivity in a much more radical way. Unlike the interiority of traditional philosophical reflection, the thought of the outside is not autonomous, but might perhaps be called "heteronomous" (Kouba 2008, p. 74–96). It becomes obvious that the ending of the grid, as it attends our virtualization of locality, is closely accompanied by the advent of topological, networked structures. This will result in changes not just for things such as thinking, interiority, and subjectivity, but also for what we have learned to call "objects," and along with that, of course, also what we mean by such terms as "function," "operation," "control," etc. So, following on from this, Michel Serres' new objets-monde should be taken in an emblematic sense, as a reminder that in a world beyond that where the grid rules as the only form of construction, everything is very likely to become literally different beyond imagination from what we know and perform so far. The main clue to the depth of these changes lies perhaps in the perceived loss of the neutrality of technology. Etymologically, this presumed neutrality can be traced back to the Greek word tekhnologia, which originally referred to the systematic exercise of an art, craft, or technique, for example, grammar. As such the presumption of technology's neutrality has much in common with the beginning of Western philosophy itself. Since the time of ancient materialism, since the Sophists and later Plato and then Aristotle, the finding of a systematic way of treating, developing, and cultivating our capacity to think was a constitutive vector of philosophy. Such a systematic approach called for the identification of elements, both formal (in geometry) and material (in atomism), and also for the establishment or identification of rules for legitimately transforming those elements (in analysis, and later also in syllogistics). It may well be appropriate to remember here that, following its "genealogy," the concept of technology has for a long time been intimately intertwined with ethics as a foundation for conducting a good life. More narrowly even, technology was fused with an ethics of "affectivity" as that relates to the handling of our experiences, of whatever happens to us, the impressions and imprints left. One could go as far as to see here the question of how to fabricate an immune system ante litteram, so to speak.3
Peter Sloterdijk attempts to conceive of a philosophical notion of "immune systems" in his Sphere Trilogy: Sloterdijk, P. Sphären I–III, Suhrkamp, Frankfurt a/M, 1998–2004.
Today, the presumption of the neutrality of technology itself may perhaps be the most acute threat to the fitness of our real psychosocial as well as physiological and morphological immune systems.4 Historically, that neutrality assumption can be related to the unparalleled role that Euclidean geometry has played with regard to any notion of systematicity for more than 2,000 years. The role of Euclidean geometry, both concerning its axiomatic structure as well as its content, is an important background issue throughout this essay. For now, suffice it to remember that even the most sophisticated technologies build on particular ways of elementarization, which have been fundamental to philosophical thinking from its inception. Similarly, all sciences today, including the natural as well as the so-called positive ones and the humanities, are built on particular ways of "elementarizing" the world – be it chemical elements, electrons, quarks, hadrons, DNA base pairs or whatever; words, narratives, institutions, psychological entities – such as drives, motives or emotions; processes, actions, interactions, time units, money, or you just name it. The novel role that falls on design in respect to science, as this book proposes, challenges technology's neutrality assumption. What seems to be at stake thereby could be characterized as an inversion of that relation between elementarization (by deduction from a comprehending systematicity) and synthesis (adjusted for and fitting in that same systematicity). Today, we are fabricating with elements on micro-scales that cross any categorical boundaries. We can no longer presuppose to know enough about the values and meanings that the applied elements might assume in the newly constructed context. The technologies of information science, which are crucial for this newly gained level of "resolution of the world," may be regarded as belonging to a different dispositive than the mechanical technologies, which can be regarded as direct descendants of the Cartesian grid, operating within figures of thought that are well bounded within the Euclidean geometry (Serres 1992,
There is a proper discourse about the role of technology in fields spanning from communication technologies, warfare industries, genetic engineering to the finance industry, which underlines the inadequateness of the neutrality assumption of technology – regardless of whether the societal outcomes will prove to have been detrimental or beneficial. Technology always transforms the users and their epistemological texture. Cf. Galison P., Daston L., Objectivity, Zone Books, New York, 2007.
pp. 327–342). Descartes' concept is still a successful one. Yet today, it is being regarded as providing the method for merely one particular kind of rationality, if we take that in the etymological sense of ratio, Latin for "the relationship in quantity, amount, or size between two or more things." In particular, it conveys "the rectitude constituent of the method of geometric support,"5 (Serres 2001, p. 274). In other words, the grid is the paradigm for the assumptions of linearity, additivity, and complete determinability. Yet as the history of mathematics of the eighteenth and nineteenth century demonstrates, the mere grid with its mobilized "origin" has turned out not to be the outermost post of achievable abstraction. Today, we know of rather different structures in mathematical thinking, e.g., group theory, the λ-calculus, or the vector field. The vector field is a representation of all possible mappings of a certain kind. In terms of figurative representation, the vector field still fits within the construction form of the grid. As an operative figure of thought, however, it cannot be reduced to a one-dimensional axis; it also adds a verticality of abstraction to it (Serres 1991/1968, pp. 103–151). The effect of such new, non-Euclidean figures of thought is a mobilization and transformability with regard to what we conceive as "thinking." They introduce a new dimension of abstraction. In a certain manner, representations seem to lose their perceivable and ostensive shape; any one particular determination is but an actualization of other potential actualizations. Perception of stochastic phenomena replaces the drawing of form6 (Serres 1992, p. 333). Thinking itself acquires a kind of – somewhat paradoxically for our intuition – architectonic "volatility." This concept introduces the need for another language game7 that is capable of abstracting from the grid logic still underlying the concept of dynamics and
movement itself. The following aims to demonstrate the plausibility of this perspective, that the critical point to be highlighted here relates to speech suitable for developing a discourse, a regulated way of speaking, about this volatility of practiced thought. Historically, both approaches that have been cultivated for reflecting if not directing our thinking, that of operationalization as well as that of description, seem to fall short. The cultural technique of description relies on representation and hermeneutics, on tying words to an independent and external frame of reference. The concept of operationalization delineates, in its tight relationship with the genealogy of rationality, just about the very scope of the grid logic that the new volatility is leaving behind. It seems almost obvious that the volatility itself must be seen as transcending its origins. Hence, it has an evasive character: Strictly speaking, it can neither be represented, nor operated, nor grasped. That is now where “design” in the context of science comes into play, as an interesting third quasi-epistemic approach in addition to operationalization and description. In the following, three suggestions regarding the problematic of how to speak and negotiate about design issues will be discussed. The first suggestion is to speak about the aforementioned volatility in terms of “virtualization,” i.e., a certain view being brought to bear upon an “idealism of problems” as opposed to one of forms, and how to deal with them.8 Second, it will be argued that the philosophical concept of the “differential” can be considered as the operator within this volatile verticality qua abstraction implied by the triad of operationalization, description, and design. Thirdly, both of these suggestions will be related to a specific language game of “design,” which is intended to denote what will be referred to as the “orthoregulation” of situations that are both eventful and structured.
5
i.o. French "la rectitude dans la méthode à support géométrique." 6 Transl. from German: "die Wahrnehmung des Stochastischen ersetzt die Zeichnung der Form." 7 The notion of the "language game" was introduced by Ludwig Wittgenstein. It denotes a conventionalistic approach to how to use concepts in language and how to distinguish different categories of statements that are effective in discourse. According to Wittgenstein, each language game must be determinable by rules that specify it as a whole, its properties, as well as its possible usage. These rules find their legitimacy outside of themselves in an explicit or implicit contract among its "players." Cf. Wittgenstein, L. Philosophische Untersuchungen. § 23. Werkausgabe Bd. 1, Suhrkamp, Frankfurt a/M. 1984.
10.3 References/Preferences: Visual Figure or Abstract Symbol?
While in the paradigm of classical mechanics, to control an instrument meant to operate it strictly schematically, today's computation devices provide
As it has been developed in the writings of Gilles Deleuze, especially in: Difference and Repetition, Continuum Press, London 1994 [1968].
us with many possible schemes to choose from. Computers, in the broad sense of the word, can be taken as analytical abstractions of the geometrically rooted mechanical instruments. If the language game distinguishing between analog and digital can be a meaningful one at all, then it seems to be in this sense. The computational devices as part of the realm of the electronic historically depend on a kind of analysis that operates in a way that cannot be reduced to points: imaginary numbers are two-dimensional. They have long been thought to contradict intuitive access, since √−1 derives directly from something like a negative square. If a negative number can be represented by a position on a vector pointing to the opposite direction, what would be that "opposite" for a square? Here, the geometrical figures so well suited for representing mechanical processes fail to represent what is going on. Instead of elementarizing a process through schematic shapes, such as points, lines, triangles, squares, etc., "elementarization" has now become truly analytically constructive, as operations with symbols (Siegert 2003). In addition, this conflict between the figure and the symbol, the networks for computations as well as the networks through computations, cannot be conceived of as independent from the choice of a particular topology. We need to be aware, however, that in choosing a topology we must first define the parameters for evaluation that a respective network could consist of. This choice is one of design. From a classical perspective of theory, there are two strategies to provide the foundations for evaluating the models we come up with. Firstly, that of the mos geometricus – the geometrical method – and the idealized system of categorization derived from it, and, secondly, its secularized counter-concept of rules oriented unambiguously by the abstract grid and its coordinates, supposedly in itself devoid of any semantic preconceptions and mechanically regulated by the universals of the natural laws. Any strategy to evaluate models is faced with a fundamental empirical indeterminacy, as emphasized by Duhem, Quine, and Putnam. Thus, any evaluation of a model is, at least partially, a synthesizing activity, in other words, a designing activity. In the case of a design practice conducted with an awareness of the availability of its own conditions, we cannot take recourse to any such foundations. Design of objects and materials, of minds, and of environments for living – once we can design them in
the sense of constructing them from scratch, they no longer belong to one and the same language game as prior to this availability. In consequence, it would be problematic to assume that the "states of mind" that can meanwhile be produced synthetically9 refer to some sort of natural container of human (brain) capacities just waiting for us to become clever enough to invoke them strategically, nor can the "new materials" coming within reach thanks to recent advances in nanotechnology, or even "new organisms" with advances in genetic engineering, be thought of as somehow "implicitly natural." It is with these almost intuitively perceivable transgressions that the relation between theory and design needs to find ways of arranging itself. After all, what would it mean to do theory without the postulate of departing from or recurring to some sort of positive reference? This is where the scope needs to be enlarged again; it is where the notion of theory is related to that of philosophy. Quite obviously, the notion of theory itself changes thereby, but this strain cannot be dealt with here.
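To spell out the two-dimensionality of imaginary numbers invoked above (a standard fact, recalled here only as an aside): once a number is read as a point of the plane rather than a position on a single axis, multiplication by √−1 acquires a perfectly intuitive meaning as a quarter turn,

\[
i^{2} = -1, \qquad i\,(a + b\,i) = -b + a\,i, \qquad (a,\,b) \;\longmapsto\; (-b,\,a),
\]

so that applying it twice rotates by 180 degrees, which is multiplication by −1. The "opposite direction for a square" thus does exist, but only in a space with one more dimension than the number line.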
10.4 Figures of Thought: Virtuality and the Differential
An early approach pointing in this direction can be found in the philosophy of Gilles Deleuze and his conception of the virtual. In line with his suggestions, we can define the concept of virtualization10 as creative management of the specific frames of reference within which a something acquires its specific meaning: "The virtualization of a given entity consists in determining the general question to which it responds, in mutating the entity in the direction of this question and redefining the initial actuality as the response to a specific question [...]." (Lévy 1998, p. 26) In other words, the concept of the virtual helps in transcending the structural role in orienting our
Cf., for example, the article by Folkers G., et al. on Drug Design in this volume, pp. 53–63. 10 The philosophical concept of the virtual or virtualization should be distinguished from the technical term of virtualization, which is rather popular in the area of computer technology, as so-called virtual machines or as virtual reality. Both of these bear only a faint reflection of the philosophical concept.
evaluations, traditionally occupied either by geometrically representable figures and relations, or the mechanically regulated processes between these elements. The onus of this move, on the other hand, is that every evaluation must be oriented according to the purpose that a specified problem – meaning any formulated problem – aims to respond to. The link to a new form of rationality, in play wherever simulations and digital modeling are used as epistemological tools, is obvious here. It cannot possibly be a deductive-nomological one. What used to be considered a model of something is increasingly viewed as a simulation for something, so that any technological operationalization is in increasing dissociation from any understanding howsoever susceptible of being considered constitutive of this synthetic availability (Lenhard et al. 2006). This is not a small or insignificant move. Astonishingly, one could expect empiricism to transform itself into a "phantastic empiricism." Within the dispositive of the digital, experiments take place in real time, the concepts of function and representability fall apart – things are being calculated, generated, and carried out in the absence of any genuine understanding of the implied and complex functional relations, of the calculated as well as the calculating "instance." Somewhat surprisingly perhaps, this situation, which has gone hand in hand with the rise of computation as a cultural technique, is anything but a recent phenomenon. The issue was famously articulated by Pappus of Alexandria11 1,700 years ago as a fundamental problem germane to any analytical situation, i.e., in fact any situation in which we try to solve a problem reasonably, which means using abstraction and modeling within theory. For him, and this is where he is relevant to the present, analysis relates to procedures of problem solving that stand to be directed either way, backward as well as forward. Hintikka and Remes comment on this widely neglected aspect as follows: "Beyond the fact that analysis started with the assumption of the required, its direction of argument was not so definite. [...] In the actual analytical argument (as opposed to the formal analysis written down after success) probably both directions occurred. Characteristically, the heuristic procedure of a 'problematical' analysis would start arguing at any place
11
Pappus of Alexandria, c. 290–350 AD.
that looked promising as a link in the chain which ultimately was to connect the ‘given’ and the ‘required’.”12 In this sense, Pappus introduced the distinction between theoretical and problematical analysis. While the theoretical way orients itself on hypotheses whose status is defined as transcendental to the problematic situation, the problematical way is seen as an ever approximating, heuristic procedure that departs from an immanent perspective. With his philosophical notion of the virtual, Gilles Deleuze returns to this tradition. Deleuze takes this very peculiar concept of the “problematic” as one of the starting points for his philosophical quest. Within such a notion, problems in their specificity cannot be taken as logically anterior to their solutions. Both are constructed as a virtuous cycle, ever again adjusting the ways in which they might fit together. When a problem is properly formulated, then he would suggest that it is already solved, even if there may remain the need for a lot of practical experience to actually reach the goal. Deleuze here of course faces the same dilemma that had previously puzzled Pappus: How then can the model itself, if no proof by means of analytical calculation helps in this regard, be evaluated as to whether it is an adequate model or not? It is here that the philosophical framework orienting the methods of deduction becomes crucial. The development of the mos geometricus, the geometrical method as it relates to science and philosophy, originates from this dilemma. Time and again throughout the history of Western philosophy there have been attempts to provide analytical foundations for the geometrical schematisms and the “intuitive way” they appeal to our senses, and sometimes delude our reasoning – most famously, perhaps, those attempts by Rene´ Descartes driven by his interest in a rule-based method for analytical thinking. However, until the time of Nietzsche the philosophical status of geometry, as the epistemic correlate of what can be conceived in accord with the forms of intuition and with that the status of images, remained deeply troubling, so much so indeed that Nietzsche’s
12
Pappus of Alexandria, PAC, Pappi Alexandrini Collectionis quae supersunt, 3 vols., ed. Hultsch F. and Weidmann F. Berlin 1875–1878 [634-636]: analysis and synthesis, cited in: Hintikka, J. and Remes, U. The Method of Analysis, D. Reidel, Dordrecht 1974, p. 8–10.
haunting call for a reversal of Platonism13 has not ceased to influence and challenge philosophers up to this day. Gilles Deleuze's own way of dealing with this dilemma stands out from the philosophical tradition because he grasped and started out from a particular quality of the differential quotient. While the values of delta x and delta y do not cease evanescing, strictly speaking, and thus denote – in themselves, as the differential dy over dx – something like 0/0, the conceptual figure of the differential quite conversely is in each specific case well definable. Herein Deleuze discovers a vertically oriented operator14 that had been functioning in our thinking for some time already, but without being explicated properly. According to Deleuze's critique, a lot of attention has been given to one half of the dynamics of calculus. From a mathematical as well as from a logical perspective, for instance, it is often preferred to look along the way up, into abstraction, at the risk of neglecting the complementary movement all too often, back from abstraction into the initial frame of reference. There is a fine distinction between pure mathematics and applied mathematics, the latter being engaged with "the way down," back to the originating frame or context, or reality, as we often name it in ordinary speech. This complementary movement, necessary for each application, is called integration, or instantiation in a philosophical language. At every step of abstraction – and the more phenomena one is to integrate, the more abstract the thinking must be – it is necessary to set certain values as constants, for all phenomena and through all steps of abstraction. Therefore, in order to integrate an observation mathematically, or to instantiate a concept in everyday life, values must be assigned to a number of free constants. After some 400 years of practice in calculus, thinking in this way may feel quite natural – and for anyone familiar with the basics of any contemporary programming language, the wording above may even sound strangely familiar. However, there is more to it. In a
13
Nietzsche, F. Kleinoktav Gesamtausgabe I-XVI, Bd. IX S.190, here cited in: Kurt Hildebrandt. Nietzsches Wettkampf mit Sokrates und Plato. Sibyllen-Verlag, Dresden 1922. 14 We label it “vertical” since, as a kind of abstraction, the differential can be imagined as being situated above, or at least on a different level, in relation to the originating context, or, mathematically speaking, the represented function.
certain sense, models are created just by that instantiation. We experience here the slipping back and forth between a mathematical reading, or that of a software programmer, and a more philosophical gradation of the terms and concepts. We want to emphasize the possibility for a philosophical interpretation of mathematical figures of thought, through taking structural metaphors literally, so to speak. Any such model, whether mathematical or not, depends on hypotheses and may, in the broad majority of cases, be tested only heuristically. Depending on the complexity and the experience with a particular situation, many choices are to be put to the test. Actually, this is one of the main advantages that computers have brought to engineers – computers enable this testing to happen automatically (after appropriate software has been designed), and with enormous calculation power, resulting in a further vectorial field. A large part of digital modeling, at least regarding the structural side of it, concerns this kind of analysis. And what comes directly to light here is that going vertical does not mean simply one further step. Surely, Wittgenstein's ladder had several of them, and we suggest that it was also transportable.
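In elementary notation, and added here only as a reminder rather than as part of Deleuze's own argument, the two movements read as follows. The differential quotient is well defined in each specific case even though numerator and denominator both vanish, and the way back down, integration, always leaves a constant that has to be fixed before the result can be applied:

\[
\frac{dy}{dx} \;=\; \lim_{\Delta x \to 0} \frac{\Delta y}{\Delta x} \;=\; \lim_{\Delta x \to 0} \frac{f(x+\Delta x)-f(x)}{\Delta x},
\qquad
\int f'(x)\,dx \;=\; f(x) + C .
\]

The free constant C is precisely the kind of value that must be assigned, at every level of abstraction, before an abstraction can be instantiated in a concrete situation.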
10.5 The Ecotope of Design
Ever since design began to set itself apart from the ancient tradition of craftsmanship, it has striven to define itself through an analytical structure, be it to approach science in terms of exactitude, be it as a constituent of postindustrial production processes. Finally, design conceived itself as a kind of extended problem-solving, trying to take into consideration the transformation of everyday circumstances. Yet there is a fundamental difficulty with this. The intention to solve a problem implies the following assumptions: first, that there is a mis-configuration, second, that this could be recognized and "repaired" through thoughtful rearrangements, and third, that the necessary steps could be identified by a scientific attitude, where science is directed to an independent and objectifiable outside world. Thus, the principles behind optimal design are considered to be something natural. Yet, from a philosophical perspective, speaking of something like "natural design principles" is problematic.
While the rise of the natural sciences in the seventeenth century was prominently about de-semantizing "nature" in favor of purely formal mechanisms, the concept of "design" today carries an interesting etymological ambiguity. The word sign itself derives from signum, Latin for "mark, token, indication, symbol." Prior to the rise of the modern concepts of nature and science, anything significant, any sign, was tied up in one or another transcendental hermeneutics and was interpreted as a trace, or some evidence, of God. Ever since, the recourse to semantics and sense – when speaking about nature – seems to imply a falling back into that ancient dilemma around the origins of the symbolic. Nevertheless, we are today living in a world where "information" has come to stand on a common scale with what modern science has conceived of as "nature," its elements as well as mechanisms.15 The proliferating usage and wide acceptance of information and communication technologies for controlling the mechanisms of physical machines from within the new spaces of abstraction mentioned above (based on stochastic phenomena) allow for the emergence of a densifying and complexifying web of relations among objectified processes, goods, objects or, in the case of media, also between people and their ways of collective reasoning, not only across temporal distances, but across spatial distances as well. Here a link appears to what Michel Serres characterizes as on the yonder side of the principle of locality, and its manifest concreteness. The infrastructures that organize our surroundings today are operating with unparalleled, powerful generalizations. RFID chips, GPS and GPRS systems, WLAN or UMTS can all be taken as abstractions of any previous common place; there is no longer a need to follow historically predefined habitual patterns in organizing, broadly speaking, logistics – be it of goods or human intercourse, postal
15
Although in a mode and relation that has yet to be characterized. The founding paper for cybernetics, by Norbert Wiener, made this the basis for the promise of a new science: next to matter and energy, so he held, a third pole has been introduced with the same categorical status – the pole of information: “Information is information, not matter or energy. No materialism which does not admit this can survive at the present day.” Cf. Wiener, N. Cybernetics, or Control and Communication in the Animal and the Machine (1948), here cited in: Janich, P. Kultur und Methode. Suhrkamp Verlag, Frankfurt a/M 2006, p. 216.
deliveries or arrangements for real-time production of cars. Any particular position can easily be derived from those devices. These abstracted machines and the ubiquitous need to apply the abstractions induced by them result in a topology that outstrips Euclidean geometry. We can observe the same in the field of machine-like devices producing, storing and controlling the mobilized and volatilized “world objects”: computing processors, measuring just a few square centimeters, can be regarded as the machines of machines – you can use a computer as a typewriter, calculator, movie player, or even as a telephone, as well as for driving all phases in digital production chains in architecture, for example, or any other industry. In every instance of de-signing, no less than of sensemaking, problem-framing and problem-solving, setting up the processing of information immaterials in an orchestrated and reasonable manner – in every such case the available (infra)structures literally allow for a thousand and one possible ways of completion. Implementation starting from the space of abstraction back into a concrete situation is never a univocal act, and elicits decisions at various levels of integration. More specifically, these decisions are taken from an irreducible manifoldness of possible choices. It is basically this non-univocal manifoldness that introduces freedom into the actualization of concepts. Let us turn our imagination towards a whole wealth of ready examples and take a look at just one of the key infrastructures organizing this new level of operability, in which our thinking needs to develop ways of accommodating and thus directing itself. In order to mark the dissimilar character of what must here – for lack of a better term – still run under the process of “abstraction,” the abstract entities shall, somewhat metaphorically, be called “the (yet undetermined) integral of” (Fig. 10.1). Just as in the times of Pappus, the urgent question that arises is that of methodology. Etymologically, the notion of method goes back to the Greek word me´thodos, for “scientific inquiry, method of inquiry,” originally for “pursuit, following after” from meta´ – “amidst, in between, afterwards” – and hodo´s – “path, road, way, travel.” How do we plan our explorations, our mental travels in this topological space oriented according to its manifold vectors? How shall this realm of abstract yet extant potentiality, this realm where information technologies are operative by following the programs of defined rules, itself be
Fig. 10.1 The materiality of a processing computer chip may be regarded as an implementation of, and in some sense even as the embodiment of, the abstracted principle "machinic processing"
modeled? As Ludwig Wittgenstein put it some 50 years ago: “Our paradox is this: a rule could not possibly determine a course of action, since each course of action would have to be brought into agreement with the rule.”16 Prior to Wittgenstein’s formulation of this problematic (yet not necessarily paradoxical) self-referentiality inherent in any attempt to provide analytical foundations for the intuition of geometry, rules had probably not been recognized as problematic in themselves. The natural sciences have essentially defined themselves as striving to detect and discover the laws of nature by conceiving of them as unambiguous rules on which to orient our actions, rules that in themselves are devoid of any semantics. Such rules are not “rules” but regularities, and as such they are to be conceived either as “laws” or as formalistic tautologies. They are local circumstances on the common ground of universal principles. For Michel Serres, the observation that we have ceased organizing our lives in this way also means the end of the (Cartesian) grid and its descendants, which now appear to be a planar scheme for
16
"Unser Paradox ist dies: eine Regel könnte keine Handlungsweise bestimmen, da jede Handlungsweise mit der Regel in Übereinstimmung zu bringen sei." In Wittgenstein, L. Philosophical investigations. 1972. Blackwell, Oxford, §201. Wittgenstein resolved the threatening regress by the claim that rules are finally and completely absorbed in the forms of life.
orienting our thoughts, behaviors, and processes. The new technologies, so he holds, make us not only inhabit a world that needs to be conceived differently, they also ask us to think differently. "Absent from location, uprooted from locality, we are at a loss for answers to the old questions, where are you? Where are you speaking from? [. . .] Are we departing locality for joining globality? But what is the signification of the new habitat? Is our niche invading Earth altogether? Are we to understand that our presence is occupying space, universally? Are we turning into wanderers of the gentle Rousseau kind, travelers after Pascal's pathetic fashion: will we feel astray, lost, unrooted? Is the king, the law – not catching up with us any more? Are we out of rules giving direction to our minds? These novel technologies make us habitate, ergo think, differently."17 What is at stake with this proclaimed new way of thinking that allegedly accompanies our ways of inhabiting the "instrumented planet" is the status of rules. As long as rules are merely a stand-in for the Euclidean elements, axioms and symmetry relations in the ancient mos geometricus, we do indeed lack a feasible idea of how to reason, properly speaking, in a way that would provide us orientation within our new surroundings. It may by now have well become a common topos to view design in terms of problem-solving; still, it would be wantonly negligent to assume that it has thereby already been fully integrated into its new post-craft or postindustrial context. Striving to come to terms with the relation between design and problem-solving evokes many of the issues that were, at the time of Plato, virulent between him and the sophists. It affects the language games of both parties, that of design as well as that of
17
Serres (2001), p. 275. The English rendering above is by Reinhart Fischer; the original French reads: "Absents du lieu, déracinés du local, nous ne pouvons plus répondre aux vieilles questions: Où êtes-vous? D'où parlez-vous? [...] Quittons-nous le local pour rejoindre le global? Mais que signifie ce nouvel habitat? Notre niche envahit-elle la Terre entière? Faut-il comprendre que notre présence occupe l'espace, universellement? Devenons-nous des promeneurs à la manière aimable de Rousseau, des voyageurs, à la façon pathétique de Pascal: tragiquement, nous sentirons-nous errant, perdu, déraciné? Le roi, le droit ne nous rattraperont-ils plus? N'avons-nous plus de règles pour diriger notre esprit? Ces nouvelles technologies nous font habiter, donc penser autrement."
problem-solving, neither one of them remaining the same. Therefore, quite surprisingly, and interestingly enough, the affinity between the two is by no means of recent observation, as already alluded to. Analytical procedures must always preconceive what they want to end up "finding out;" this is one of the crucial insights from one of the first reflections on method, proposed by Pappus of Alexandria. This problem has been taken up throughout history time and again; different models of dealing with it have characterized, e.g., Plato's triangulation of the cosmic order in the Timaios and the systematics he derived from it; Aristotle's syllogistics and the subsequent hermeneutics of scholasticism; René Descartes' early attempt to provide analytical foundations for a geometrically conceived methodicity; Kant's suggestion to assume synthetic a priori propositions that condition our reasoning; and, in our time, Putnam's critique of the very assumption that a philosophy providing orientation grounded on something like "analytical propositions" is possible. The urgency of this issue today can again, briefly speaking, be illustrated by reference to digital modeling in epistemological contexts, even without actually entering the expert discourse on the topic.18 The outcome of computational analysis in principle comes to depend on the intention and scope with which a problem is framed. This renders any decision regarding which operations are to be "valued" as successful or not inherently problematic. In many respects, the weakening strength of "nature" as a scientific concept might be seen as a consequence of this situation. Methods of how to proceed in digital modeling have literally multiplied, and this not only in the field of statistical analysis strictly speaking. Here lies one of the driving factors for the prominent position taken by simulation software, as providing both the grounds and the possibility for purely symbolic experiments. These new tools, and the applications they make possible, have, by now, become so predominant that many even suggest dropping the language game of natural science and opening up a new one in its place, which might be called something like "sciences of the
18
Cf. for an introduction and overview Lenhard, J. Ku¨ppers, G. Shin, T. (2006): Simulation. Pragmatic Construction of Reality. Springer, New York.
artificial” (Simon 1996), “technoscience”19 or even “design science.”20 If design is about to be integrated into the realm of science, a question that is philosophically crucial arises regarding the nature of the problem something like “design science” is supposed to “solve.” Should these problems be taken as belonging to the same category as those of (natural) science? Supposedly not, as this would mean engaging in a naturalist fallacy. There are several ways of dealing with this. However, they all seem to depend on two dipolar stances, one of which would be to indeed consider something like “natural design principles.” This stance would mean giving up the distinction between “voluntary” and “natural” forever, even if this dualism were to subsequently reappear using slightly different names (such as “creation” and “evolution,” for example). The two poles would simply fall together. Another stance would refer to the distinction between something contingent, or culturally determined, as we might say, and certain developments that are thought of as being “unwanted.” For us, this second polarity is more interesting. Its challenge consists in coming up with concepts that are capable of differentiating in the empirical world of phenomena, without recourse to any kind of a “naturally given positivity” – while at the same time avoiding the loss of the concept of “phenomena” itself as the very ground for such a radical empiricism. With this we can now more clearly characterize the volatilizing power of information technologies. In what might be called the dispositive of the digital, providing the means to relate any sort of data across
19
More recent conceptualizations, more or less varying, but generally speaking following the tradition of cybernetics in postulating the concept of information as belonging to the same category as that of matter and energy [cf. footnote 8]. Cf. for example Haraway D.J. Modest Witness@Second Millennium. FemaleMan Meets OncoMouse: Feminism and Technoscience. Routledge, London 1997. 20 The term "Design Science" was introduced by Buckminster Fuller in the early 1950s. Today it is propagated by the Buckminster Fuller Institute in the following way: "What is Design Science? Design science is the comprehensive and anticipatory application of the principles of science to the creative design of solutions to the problems of society. It is a way of changing the world in preferred directions that is based on innovation and thrives on transparency." Available online: http://designsciencelab.org/what_is_the_design_science_lab/what_is_design_science (last accessed May 10, 2009).
the categorical or disciplinary boundaries – horizontally as well as vertically – information technologies allow for calculations with that which is, strictly speaking, under-determined and quasi-natural. These two polarities, the question of the categorical difference between science and the problems solvable by science on the one hand, and the challenge of empiricism without a given absolute on the other hand, seem to point to a fundamental change regarding the orientation of design: towards the conditions of the possibilities of design itself. How can the possibilities of design be designed as the topological locations for the actualization of the virtual? In the times of digitally mediated, forward-directed availability for synthesis, the ecotope21 of design is no longer a conceivable “given.” It is not concerned with the production of an ever exactly determinable “given,” as the technical language game of “result” would suggest, a language game that follows an inadequate conception of the relation among causality, language, and the social. The ecotope of “design” is itself an object of “design” in the here-developed sense, much like from the here-presented perspective, the central point of reference for rules is being recognized to be the rules. The final section of this article delves into the consequences of this shift of referencing frameworks. The Lebenswelt of “digital technology” includes not just machines and software strictly speaking, but also the humans in their respective roles as programmers, users, and actants in networks (to use Bruno Latour’s notion). While solving problems may well be treated by means of digitally archived recipes, so to speak in a quasi-automated manner, the invention and negotiation constitutive for the design of the design, and the relationality of design as an activity belong to the realm of the social.
21
By ecotope here we understand the abstract situs defined by the "networks of the Lebenswelt", as Bernhard Waldenfels put it in his appreciation of Michel Foucault's thinking. Similar to the notion as it is used in biology, where the ecological description of a biocenosis comprises all relations among all living organisms as well as to the physical environment, and as the diagrammatic representation of energy flows, an ecotope of a concept could be said to comprise all the possible and potential relations, the entirety of forces which flow through it.
10.6 Orthoregulation – Applying the Abstract
This change in the ecotope of design also concerns the problematics of the relation between theory and design. Traditionally, there is after all an interesting commonality between the two. In both of them, imagination has always played a constitutive role.22 As our resort to the philosophical notion of the virtual, as a certain way to conceive of problems, has revealed, there is something fantastic about analytical procedures per se: we need to assume a model in order to analyze and test it. And theories constitute the milieu within which we are guided to do that. In keeping with our terminology, there is no way (method) of finding the completely defined integral of an observed situation in its complexity, i.e., by following a certain chain of reasonable arguments. The assignment of values to integration constants remains a contingent assignment, and its testing can in most cases only be carried out through the pragmatics of approximation. Each assignment depends on its task, its aim (in applied mathematics significantly called the "cost functional"), as well as on the structures given by the related abstractum. Coming up initially with a model, in the context of science, is both empirically and theoretically motivated. "Empirical" here can only mean that the variables making up the model can be further determined via observation on a particular epistemic scale that is both different from as well as relative to that of the model itself. The scale of reference can, but need not, be that of sensory data; it might also be that of data that is integrated on more abstract scales, themselves relating back to other models of higher abstraction (and the respective theories in which they are embedded) than the one currently being explicated. Especially since we gain such an extensive amount of technically mediated "sensory data," it becomes obvious that the assumption of an elementary or even atomistic level of initial correspondence between a model and the world has become problematic. Departing from this, it is not just every measurement that is rule-based, but also any interpretation of data.
22
cf. for a discussion of the role of the imagination for Descartes’ analytical method: Stolarski, D. Die Mathematisierung des Geistes. Algebra, Analysis und die Schriftlichkeit mentaler Prozesse bei Rene´ Descartes, LIT Verlag, Wien/Mu¨nster 2009.
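As a purely illustrative sketch, not part of the author's argument: the general solution of the differential equation dy/dx = y is y(x) = C·e^x, and the equation itself leaves the integration constant C undetermined; C becomes fixed only relative to a task, for instance by minimizing a squared-error cost functional over observed data. The Python fragment below demonstrates this with data values invented solely for the demonstration.

```python
# Illustration: an integration constant is fixed only by a cost functional.
# The general solution of dy/dx = y is y(x) = C * exp(x); the ODE leaves C open.
import numpy as np

# Assumed "observations" -- invented here solely for the demonstration.
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y_obs = np.array([2.1, 3.2, 5.6, 8.8, 15.1])

basis = np.exp(x)  # the model family y = C * exp(x)

# Cost functional J(C) = sum_i (C*exp(x_i) - y_i)^2; its minimizer in closed form:
C = float(basis @ y_obs) / float(basis @ basis)

print(f"fitted integration constant C = {C:.3f}")
print("residuals:", np.round(C * basis - y_obs, 3))
```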
Very often, the higher order rules (the outside of our theoretical home base, relatively speaking) necessary for how and when to apply which (lower order) rules (the inside of our theoretical home base, relatively speaking) are referred to as if the organizing telos were not an object of negotiation, e.g., as tradition or decorum. But given the pace and the increasing future-directed availability the technologies of today make possible, this not only seems inappropriate, but also hinders a more powerful understanding of the micro-mechanics we are dealing with in the topological structures of networked reality. Instead, the notion of orthoregulation seems much preferable for those rule-governing rules.23 Viewed from this perspective, there is a dimension of contingency to any analytical calculation. It is the contingency in the selection of the integrability conditions, which we met above. Deleuze inquires about the philosophical status of these conditions and calls it the virtual. He thereby transcends the troubling questions – "Figure or symbol?", "element or atom?" – by ascribing an elementary or atomistic role to the vertically oriented operator of the differential, which in itself incorporates literally nothing, 0/0. As a vertical operator, of no value proper to itself, the differential nevertheless comes to regulate any "coherence" to be constructed from empirical data. Thus, if the virtual can be said to denote any "thing" at all, then it would be something as abstract as "potentiality." Like the latter, the virtual as a concept becomes meaningful only through its actualizations, that is, as "applied virtuality." The promise that a theory of design might hopefully hold, from such a perspective, would be the finding of methods to design these conditions of integrability and actualization. The question is, to sharpen the proposal, how to harvest and cultivate the unexpected and the unfolding of future potential. The outcome could consist not only of improved anticipation but also of better transparency of the decision-making processes involved in practicing design as a prognosis.24 The legitimizing of any particular choice for any particular
23
The concept of “orthoregulation” was introduced by Wassermann, K. Theorie als Sprachspiel. Wittgenstein, Wissenschaft und das Sprechen u¨ber den Geist. Prolegomena zu einer Antizipatorischen Vernunft als Methodizee. Unpublished manuscript 2008/2009, mimeo. 24 The notion of “design as an unavoidable prognosis” was proposed by the cultural anthropologist Manfred Fassler (2008).
situation becomes an issue of negotiation. So much so that design, as an integral part of all the disciplines we distinguish today, might at some point come to play out something like the pragmatics of scientific politics – an inventive, open, and involving politics of the Agora, hopefully not one of bureaucratic control.
References
Deleuze G (1994 [1968]) Difference and repetition. Continuum Press, London
Fassler M (2008) "Design 2020". In: Bühlmann V, Wiedmer M (eds) Pre-specifics: some comparatistic investigations in research on art and design. JRP Ringier, Zürich
Foucault M (2001) Das Denken des Außen. In: Schriften 1. Suhrkamp, Frankfurt a/M
Galison P, Daston L (2007) Objectivity. Zone Books, New York
Haraway DJ (1997) Modest Witness@Second Millennium. FemaleMan Meets OncoMouse: Feminism and Technoscience. Routledge, London
Hintikka J, Remes U (1974) The method of analysis. D Reidel, Dordrecht
Janich P (2006) Kultur und Methode. Suhrkamp Verlag, Frankfurt a/M, p 216
Kouba P (2008) "Two ways to the outside". In: Buchanan I (ed) Deleuze Studies, vol 2, nr 2. Edinburgh University Press
Lenhard J, Küppers G, Shin T (2006) Simulation: pragmatic construction of reality. Springer, New York
Lévy P (1998) Becoming virtual. Perseus Books, New York
Nietzsche F (1922) Kleinoktav Gesamtausgabe I–XVI, Bd IX, S 190. In: Hildebrandt K (ed) Nietzsches Wettkampf mit Sokrates und Plato. Sibyllen-Verlag, Dresden
Pappus of Alexandria (1875–1878) In: Hultsch F, Weidmann F (eds) PAC, Pappi Alexandrini Collectionis quae supersunt, 3 vols, Berlin, pp 634–636 [analysis and synthesis]
Putnam H (1979 [1975]) "Die Bedeutung von Bedeutung" (ed and transl: Spohn W). Vittorio Klostermann, Frankfurt a/M
Serres M (1991 [1968]) "Die Anamnesen der Mathematik". In: Hermes I: Kommunikation. Merve, Berlin
Serres M (1992) "Turner übersetzt Carnot". In: Hermes III: Übersetzung. Merve, Berlin
Serres M (2001) Hominescence. Le Pommier, Paris
Siegert B (2003) Die Passage des Digitalen. Zeichenpraktiken der neuzeitlichen Wissenschaften 1500–1900. Brinkmann & Bose, Berlin
Simon HA (1996 [1969]) The sciences of the artificial. MIT Press, Cambridge, MA
Stolarski D (2009) Die Mathematisierung des Geistes. Algebra, Analysis und die Schriftlichkeit mentaler Prozesse bei René Descartes. LIT Verlag, Wien/Münster
Sloterdijk P (1998–2004) Sphären I–III. Suhrkamp, Frankfurt a/M
Vossenkuhl W (1995) Ludwig Wittgenstein. Beck, München
Wittgenstein L (1984) Philosophische Untersuchungen, § 23. Werkausgabe Bd 1, Suhrkamp, Frankfurt a/M
11
Text Design: Design Principles for Texts
Wibke Weber
Abstract In his book “Visual Function,” the information designer Paul Mijksenaar quoted the famous Dutch writer, Multatuli, who wrote around 1873: “Endeavor – with the most diligent labor, O aspiring artist! – to master the content. The form will rise to meet you” (as cited in Mijksenaar 1997, p. 52). Multatuli compares the discussion about form and content with a “beggar who debates whether he would keep his gold in a purse or a pouch... if he had any gold!”(as cited in Mijksenaar 1997, p. 52). In other words: “Without content there is no form” (Mijksenaar 1997, p. 52). The content always affects the form – form follows content. This article is about content: about text, design, and text design. As Multatuli knew: writing has something to do with design. Writing means to design texts: to start with an idea, to shape words, to form phrases, to build sentences and then paragraphs, to frame a headline, to lay out, to illustrate the text with pictures and graphics, etc. Since writing obviously has something to do with design, the following questions arise: are there design principles for the layout as well as for the writing style? Is it possible to apply the gestalt laws to texts? This article wants to give an answer: it defines the terms text and text design and outlines some ideas about content, form, and design principles for texts.
11.1 What Text Design Is About
Using the term text, the first idea might be: it has something to do with words written in a line. In this case, the term text would be limited to its linguistic aspect. However, in the age of multimedia, text is more than any instance of spoken or written language. First, text appears to the reader in a visual form, as a "gestalt" (Sauer 2004, p. 52). The visual form of a text tells something about its content: a manual looks different from a scientific article; a poem looks different from a protocol or a business report. A text genre
W. Weber Stuttgart Media University, Stuttgart, Germany
without any reference to its visual side – its text image – does not exist (Sauer 2004, p. 54). Therefore, a text has two sides that are mutually dependent: content and form. Both sides define the message of a text. It is therefore necessary to widen the term text. According to Kress and van Leeuwen, the term text encompasses content and form, language and layout, words and visualizations, typography and graphics, stylistics and visual effects. "A written text (...) involves more than language: it is written on something, on some material (paper, wood, vellum, stone, metal, rock, etc.) and it is written with something [gold, ink, (en)gravings, dots of ink, etc.]; with letters formed in systems influenced by esthetic, psychological, pragmatic and other considerations; and with a layout imposed on the material substance,
whether on the page, the computer screen or a polished brass plaque" (Kress and van Leeuwen 1996, p. 39). This enhanced definition of the term text provides the basis for the next definition: the definition of text design. Using the term text design, the first idea might be: it has something to do with layout and typography, with print space, columns, paragraphs, and fonts. In this case, the term text design would be limited to the visual side of text – to the text image. However, text design is more than creating a page layout and putting a dummy text into it. Bucher defines text design as a strategy to bring together layout and text, form and content (Bucher 1996). According to Bucher, the term text design can be understood as a plea to connect both sides of a text: to bridge the gap between visual effects and the writing style. Therefore, text design means to shape texts visually and linguistically. Unfortunately, in practice both aspects of a text are very often considered apart from each other: on the one hand, you have the designers who use the text as a creative space for typography and layout; on the other hand, there are the linguists and journalists, who regard the text mostly as a verbal entity. However, if both sides are well coordinated and combined, text design may lead to powerful texts: clear, comprehensible, and convincing. The following paragraphs show examples of text design and depict how design principles can affect the message of a text and its comprehensibility.
11.2 Gestalt Laws: Designing the Visual Side of Texts
Since the reader first perceives text as a visual entity, I want to start with the visual side of text: the text image. It is of little use if a text is written excellently but its exterior appears unattractive and barely legible because of missing structure: no paragraphs and subheadings that attract the eye, a line length that is too long, narrow line spacing, and a font that is hard to read. The text image should whet the reader's appetite for the content; it should give the reader an orientation about the content and support comprehensibility. This article cannot explain all design aspects of typography or a user-friendly page layout (not to mention the fact that each medium, e.g. web, print, mobile,
touch screens, follows different design rules). Nevertheless, I want to mention some principles that are generally significant for the design process of texts: the gestalt laws. At the beginning of the twentieth century, the Gestalt School of Psychology was founded to investigate how we perceive form and organize visual elements into groups. The basic principle of gestalt is that in perception the whole is greater than the sum of its parts. For example, reading a text, the reader perceives each word first as a complete unit rather than seeing individual letters. The greater context arises from the configuration of the letters into a word. Based on this holistic theory, the gestalt psychologists Wertheimer, Koffka, Köhler, and later Arnheim (1971, 2000) developed several gestalt laws – a set of organizing principles to describe perceptual phenomena. Some important gestalt laws are:
Proximity: Objects close to each other are perceptually grouped together.
Similarity: Objects similar to each other are grouped together (because of form, color, size, or brightness).
Closure: We perceptually tend to complete objects that have gaps in them or are not complete.
Symmetry: Symmetrical shapes and forms are perceived as forming a group, even in spite of distance.
Law of Continuity: We tend to see visual elements that are continuous as visual entities rather than ones that make abrupt turns.
Usually, the gestalt laws are the domain of the designers; they use them for their visual products, e.g. photographs, posters, interfaces, and web pages. Thus, it might be surprising to apply these laws to the content – to the visual side of texts.
11.2.1 Law of Proximity
Texts that belong together ought to be grouped nearby. For example, headlines should stand next to the continuous text that follows, rather than to the text before (Figs. 11.1 and 11.2). Captions and figure legends should be situated directly by the figure, so that the reader can assign the caption to the figure. Teasers should stand directly by the image they refer to (Fig. 11.3).
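As a hypothetical illustration of how the law of proximity can be encoded in a page layout (not an example from this chapter), the following Python sketch writes a small HTML file whose style sheet gives a heading a large margin above and a small margin below, so that it visually attaches to the text it introduces; the element names and margin values are assumptions, not recommendations from the author.

```python
# Hypothetical sketch: encode the law of proximity in CSS spacing.
# A heading carries a large margin above and a small margin below,
# so readers perceive it as grouped with the paragraph that follows.
from pathlib import Path

PAGE = """<!DOCTYPE html>
<html>
<head>
<style>
  h2 { margin-top: 2.0em;    /* large gap to the preceding block */
       margin-bottom: 0.3em; /* small gap to the text it introduces */ }
  figcaption { margin-top: 0.2em; /* caption sits directly under its figure */ }
</style>
</head>
<body>
  <p>Previous section ends here.</p>
  <h2>Events</h2>
  <p>This paragraph belongs to the heading above it.</p>
</body>
</html>
"""

if __name__ == "__main__":
    Path("proximity_demo.html").write_text(PAGE, encoding="utf-8")
    print("wrote proximity_demo.html")
```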
Fig. 11.1 Although the yellow headlines (law of similarity) suggest another structure, the law of proximity forces the user to perceive two groups that do not match: “Veranstaltungen”
(Events) and “SAP-Pressekonferenz am 24. Januar 2007” (SAP Press Conference, 24 January 2007) (Retrieved April 2, 2007 from http://www.sap.com)
Fig. 11.2 Now the law of proximity corresponds with the law of similarity, the yellow headlines: “Veranstaltungen” (events) and “Pressemeldungen” (press releases)
Fig. 11.3 Which text belongs to which photo? Not easy to detect. The arrangement of texts and images makes it hard to perceive a certain structure. (Retrieved 28 February 2008 from http://www.krone.at)
11.2.2 Law of Similarity
Text elements that look similar are perceived as part of the same form or fulfill the same function. The law of similarity is closely related to consistency regarding the layout of texts, websites, and interfaces. Therefore, elements of a bulleted list, highlighted keywords, and boxes (Fig. 11.4) should be used consistently, also underlining, boldface type, color, and font size of headlines and subheadings, symbols and icons, because the user tends to group them together or assign the same function to them.
11.2.3 Law of Closure
We perceive every table, every column, and every box as an entity because of its closed form, even if the box is not totally complete or broken by other elements, as in Fig. 11.5. However, if the table line is placed wrong, the reader will be confused, because wrong table lines cause wrong groups (as seen in Figs. 11.6 and 11.7).
11.2.4 Law of Continuity
An example for this gestalt law is a list with bullet points. The reader will naturally follow this line or curve like a string of beads (Fig. 11.8). In addition, indents of paragraphs and tables of contents follow the law of continuity. Another example for continuity is shown in Fig. 11.9. These are only a few examples that demonstrate how important the gestalt laws are for text design, how tightly connected writing, typography, and layout are, and therefore, how relevant text design is for the message of the text (for more examples see Schriver 1997, Dynamics in Document Design).
11.3 Writing Style: Designing the Linguistic Side of Texts
Texts dressed in an attractive layout suggest an attractive content. Who would not be disappointed if the content does not fulfill what the packaging has promised? Like Multatuli, the German philosopher Arthur Schopenhauer also knew about the close connection
Fig. 11.4 All definitions are highlighted similarly in grey boxes. Headlines are marked in red. (Weber: Kompendium Informationsdesign, Springer, 2007)
Fig. 11.5 The law of closure works, even if the soccer player and the dog break through the lines. (Retrieved 28 February 2008 from http://www.faz.net/logo-download)
between writing and design. "Few write in the way in which an architect builds; who, before he sets to work, sketches out his plan, and thinks it over down to its smallest details. Nay, most people write only as though they were playing dominoes; and, as in this game, the pieces are arranged half by design, half by chance, so it is with the sequence and connection of their sentences," writes Schopenhauer in his essay "The Art of Literature" ("Über Schriftstellerei und Stil", 1851). For Schopenhauer, writing means to design texts, to build sentences, to have an idea of the smallest
units: the words and their effects. Thus, the question is: How to build a text? How to write a text that is not only easy to understand, but also enables the reader to catch the message immediately? “Writing is easy. All you have to do is cross out the wrong words,” recommends Mark Twain. However, which words are the wrong words? Journalists and authors are aware of all the guidelines on how to write an attractive news article or story, so that the target audience can read them easily. Writing experts like Wolf Schneider (2001) and Gary Provost (1985) recommend many proven techniques for powerful
Fig. 11.6 Instruction manual for an MP3 player. The tables disconnect what should be grouped together: buttons and descriptions of functions. (Samsung, Quick Start Guide YP-U2/U2R)
Fig. 11.7 Now the law of closure compels the user’s eye to perceive button and function as one unit
writing. I have chosen only some important recommendations referring to the word classes – the basis of a sentence. Verbs: Verbs, not nouns, are the driving force of a sentence. The smaller the ratio of verbs to other words in a sentence, the harder it is to read. Studies about comprehensibility confirm these guidelines (Nielsen 1997; Langer et al. 2002). Particularly active verbs improve the text comprehensibility. Active verbs speed up the sentence,
they color a sentence and add action and motion to it; inactive verbs (e.g. to be, have) are static and less attractive. Verbs in active voice: A sentence is mostly about actors and actions. Sentences in active voice are clearer, more direct, and more vivid than those in passive voice. Passive: The consequences were known by the students. Active: The students knew the consequences.
Fig. 11.8 The bulleted list illustrates the law of continuity applied to texts. (Weber: Kompendium Informationsdesign, Springer, 2007)
Active voice emphasizes the actor, passive voice what's being done. Passive voice is appropriate when the actor is unknown or not important (e.g.: The ring was stolen). Nouns/nominalization: Nominalization means to turn a verb or an adjective into a noun (e.g., to coordinate → the coordination; to use → the using; lead to creation instead of create; give a report instead of report). Nominal style may create long strings of nouns and prepositional phrases that are hard to read and to understand – the sentences sound dull and inactive. Business and academic writers often make use of the nominal-passive style (use nominal-passive style) and thus disregard the vigor of the verbs. Adjectives/adverbs: Adjectives are necessary when a noun or a verb cannot express the message
alone. However, too many adjectives can overload a sentence. Often, a strong verb or a strong noun can eliminate the adjective or adverb and make the message of a sentence precise, e.g.:
hurry or run instead of go quickly
wolf instead of eat quickly
limousine, sedan (or the brand of the car, like Lamborghini) instead of expensive and luxurious car
Prepositions: Too many prepositional phrases strung together can overload a sentence; the reader has to memorize a lot of information.
The results of the project of the students were submitted to the teachers of the departments in Germany at the end of March. (7 prepositions)
Fig. 11.9 Text-image composition: The landing stage extends into the text. Headline and lead text lead directly to the lake. (Die Zeit, 4 April 2007, p. 71)
The results of the student project were submitted to the teachers of the German departments on March 31. (4 prepositions)
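These word-class guidelines lend themselves to simple automatic checks. The following minimal sketch is an illustration only, not a tool described in this chapter: it assumes the spaCy library (installed together with its small English model), counts verbs and prepositions per sentence, and flags a low verb ratio or a long prepositional chain; the threshold values are arbitrary illustrative assumptions.

```python
# Hypothetical readability check based on the word-class guidelines above.
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

# Assumed thresholds (not values proposed by the author).
MIN_VERB_RATIO = 0.10   # flag sentences where fewer than 10% of words are verbs
MAX_PREPOSITIONS = 5    # flag sentences with long prepositional chains


def check(text: str) -> None:
    doc = nlp(text)
    for sent in doc.sents:
        words = [t for t in sent if t.is_alpha]
        if not words:
            continue
        verbs = sum(t.pos_ in ("VERB", "AUX") for t in words)
        preps = sum(t.pos_ == "ADP" for t in words)
        ratio = verbs / len(words)
        flags = []
        if ratio < MIN_VERB_RATIO:
            flags.append(f"low verb ratio ({ratio:.0%})")
        if preps > MAX_PREPOSITIONS:
            flags.append(f"{preps} prepositions")
        if flags:
            print(f"- {sent.text.strip()}\n  -> {', '.join(flags)}")


if __name__ == "__main__":
    check("The results of the project of the students were submitted "
          "to the teachers of the departments in Germany at the end of March.")
```

Run on the seven-preposition example above, such a check would flag both the prepositional chain and the scarcity of verbs; the rewritten version passes.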
There Are a Lot More Rules
Vary sentence length and structure
Use defined and established terms
Use terms consistently
Avoid wordy phrases
Avoid tautologies
Give only one piece of information in each sentence
Be concise
Be specific..., etc.
Comparing writing guidelines and style manuals in different languages, it is astonishing that these rules for writing style apply to English as well as to German or French. This leads to the thesis that these rules might be understood as design principles for effective writing, especially the rules mentioned above: the guidelines about the word classes. Why? Because the
roots of these rules lie in the time before writing: in the oral culture.
11.4 Listen to What You Write – The Oral Side of Text Design
"Writing should be conversational," writes Provost (1985). His statement is a link to the oral culture, and to music, too. "The words you write make sounds, and when those sounds are in harmony, the writing will work" (Provost 1985, p. 58). This means: Write for the ear, listen to what you write. However, when reading a text nowadays, most people keep silent. For most of history, humankind had an oral culture, conveying information by sound and spoken words. Writing changed this. With the introduction of writing and the development of print, western culture moved even further away from a sensory world dominated by hearing to a world governed by sight. Writing and printing completely carried words from a "sound world" into a visual one (Ong 2002). Printed words became abstract things, no longer written to be read out loud: The reader became silent. That is the reason
why nowadays text documents often sound so formal, unemotional, and difficult to understand: The authors disregard the principle of orality and rhetoric (cf. Plato's Phaedrus, 2002). "Sparsely linear or analytic thought and speech are artificial creations, structured by the technology of writing" (Ong 2002, p. 40). According to Ong, oral and written language have different characteristics: "Oral language is more additive rather than subordinative," more "aggregative rather than analytic," "more redundant or copious," more "empathetic and participatory rather than objectively distanced," "more situational rather than abstract." An effective writing style takes up these characteristics of oral language and internalizes them. The rules mentioned above allow for the principle of orality. Remembering the oral culture, it is safe to say that every written text has a hidden sound. Radio journalists have to be aware of this hidden sound. They always read texts aloud right away as they are writing them. Reading aloud shows whether a sentence flows or not. Reading aloud exposes a clumsy sentence structure. Reading aloud reveals gaps and dissonances (cf. Schneider 2001, p. 128–134). If someone listens to the text and doesn't catch the sense immediately, he gets lost. The same is true for the reading process. Of course, the reader of an article can always go back and read over a sentence two or three times. But this is precisely what he is not going to do. He will stop reading and continue with another article. These ideas lead to the question: Does a writing style considering the principle of orality (e.g., journalistic and fiction writing) sound different from a writing style based on literacy (e.g., academic writing and officialese)? Does a scientific article or a sales report have another hidden sound compared to a novel or a feature story? A color pattern of a text might illustrate this. I tried to visualize the different sounds of texts – that is, the different writing styles – by colors, using a color code that highlights the different word classes (as Barlow suggested on his website, Barlow n.d.). Why colors? First, colors are "visually distinct" (Liu et al., 2003 p. 2); they can encode different word classes and identify even small words like articles and prepositions. Second, "colors have an inherent affectively evocative nature" (Liu et al., 2003 p. 2). For the color code, I tried to follow the meanings of colors (according to western culture). Each word class received the color corresponding to its function. Verbs are the energy of a sentence, so the color that
fits best would be red because red is associated with power and vigor. In contrast, the color black denotes strength, authority, and formality. Too many nouns and especially a nominal writing style make a text heavy and wordy, and hard to understand. Grey would best be allotted to the determiners because they are rather neutral. Adjectives are colored green because they can make a text rich, but they should not be used too excessively because too many adjectives tend to cover the information and make the text grow exuberantly. Prepositions have a down-to-earth character; they are solid – that is why I have chosen brown. Moreover, conjunctions are unifying, so the color blue seems best suited. Interjections express emotions and vitality like the color yellow does (Weber 2007a, b; Table 11.1). The primary goal was to create a text image that demonstrates to the writer whether his or her text works or not: whether he or she overuses nouns, adjectives, or hidden verbs (nominalization). Here is an example: a sentence cited from the Information Design Source Book (IIIDj 2005, p. 15). At a glance, the text image of Fig. 11.10 shows that this text might sound dull because of its dark color. The writer can realize quickly that only one verb was used and that too many nouns and perhaps too many prepositions were used. He or she can quickly realize that the verb comes too late, nearly at the end of the sentence. The text image reveals the preponderance of nouns and the lack of verbs that is typical for academic and scientific style. This type of writing makes use of the nominal-passive style instead of the verbal-active style; therefore, it is not easy to understand. I rewrote the text, transforming the sentence into an active style and using verbs instead of nominalizations (Fig. 11.11).
Table 11.1 The colors highlight the different parts of speech (encoding scheme)
Verb: red
Noun, personal pronoun: black
Adjective, adverbial adjective: green
Determinative (article, demonstrative pronoun, indefinite pronoun, relative pronoun, interrogative pronoun): grey
Particle (preposition, conjunctive adverb, adverb of time, space, cause, e.g., today): brown
Conjunction: blue
Interjection: yellow
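As a minimal illustrative sketch of such a color coding (not the tool used to produce the figures in this chapter), the following Python fragment maps spaCy's coarse part-of-speech tags onto the colors of Table 11.1 and wraps every word of a sentence in a correspondingly colored HTML span; the library, the tag mapping, and the output format are assumptions made for the illustration, and the coarse tags only approximate the finer-grained scheme of the table.

```python
# Hypothetical sketch of the color coding in Table 11.1 using spaCy POS tags.
# The mapping approximates the author's scheme; spaCy's coarse tags do not
# distinguish all of the word classes listed in the table.
import html
import spacy

nlp = spacy.load("en_core_web_sm")

COLORS = {
    "VERB": "red",   "AUX": "red",                     # verbs: the energy of a sentence
    "NOUN": "black", "PROPN": "black", "PRON": "black",  # nouns, pronouns
    "ADJ": "green",  "ADV": "green",                   # adjectives / adverbial adjectives
    "DET": "grey",                                     # determinatives
    "ADP": "brown",                                    # prepositions (particles)
    "CCONJ": "blue", "SCONJ": "blue",                  # conjunctions
    "INTJ": "gold",                                    # interjections (yellow)
}


def colorize(text: str) -> str:
    """Return an HTML fragment with each word wrapped in a colored span."""
    spans = []
    for token in nlp(text):
        color = COLORS.get(token.pos_, "grey")
        spans.append(f'<span style="color:{color}">{html.escape(token.text)}</span>'
                     + token.whitespace_)
    return "".join(spans)


if __name__ == "__main__":
    print(colorize("The students knew the consequences."))
```

Running such a script over a noun-heavy and a verb-driven version of the same sentence would reproduce, in rough form, the contrast between the darker and the lighter text images discussed below.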
Fig. 11.10 Many nouns and prepositions are used in this text, but only one verb is used
Fig. 11.11 The utilization of fewer nouns and more verbs gives a dynamic to the text
Fig. 11.12 Different paragraphs taken from scientific articles (natural sciences and humanities)
Fig. 11.13 Different paragraphs taken from fictional narratives
The color pattern of text 1 (Fig. 11.10) differs from the color pattern of text 2 (Fig. 11.11). The text image of text 2 looks brighter and friendlier because it counts more verbs and fewer nouns. Therefore, it is more dynamic and is easier to understand. As shown in Figs. 11.10 and 11.11, a sentence with the same message can have different text images – depending on its writing style. A modified sentence structure produces a modified color pattern. Therefore, different text genres have different text images because they are written in different styles, e.g., scientific texts and fictional narratives. Figure 11.12 shows a sample of scientific texts, Fig. 11.13 a sample of fictional narratives. At first glance, the text-images have a different appearance. Fig. 11.12 shows a darker text-image. In contrast, Fig. 11.13 appears more mellow and light. These differences are due to different writing styles: the scientific samples (Fig. 11.12) contain more nouns, more nominalizations, more adjectives, and few verbs. The lighter image of the narrative texts (Fig. 11.13) reflects the use of more verbs and fewer nominalizations like in oral language. It remains to be investigated whether this holds true for other texts such as business plans, sales reports, or letters of administration. The result shows that the darker the text, the more dull its sound may be, and the harder it is to understand. These examples are a first step towards visualization of writing style. Research questions that still remain include: How to highlight passive style or abstract words, how to tag sentence length, bridge words or clichés, or how to highlight corporate wording. However, the text visualization offers two things: (1) The visualization illustrates to the writer the sentence structure so that he or she can see
troublesome phrases rapidly. (2) It shows that the writers of fictional narratives deal differently with language than the writers of scientific articles. The linguistic design principles of narratives are: more verbs, fewer nominalizations, and fewer adjectives.
11.5 Conclusion
Newspaper articles, press releases, science journals, websites, business plans, technical reports, manuals, newsletters – these documents are easier to read if the writer considers the principles of text design. Text design means to shape texts visually and linguistically. As shown, basic elements for text design are the gestalt laws (the visual side) and the writing style (the linguistic side). Both can be considered as design principles for texts. It is not possible to write a text and not to think about the layout at the same time and vice versa (Sauer 2004, p. 41). The gestalt laws can help to organize the text as an entity, so that both sides of a text – layout and content – convey the same message. The writing style can benefit from the characteristics of the oral language (the oral/acoustic side). Therefore, text design is form, content, and sound. Form, content, and sound compose a good text design: comprehensible, clear, and concise.
References
Arnheim R (1971) Visual thinking. University of California Press, Berkeley, Los Angeles
Arnheim R (2000) Kunst und Sehen. Eine Psychologie des schöpferischen Auges. de Gruyter, Berlin, New York
Barlow M (n.d.) Understanding texts through visualisation. http://www.michaelbarlow.com/viz.html (retrieved March 6, 2008)
Bucher HJ (1996) Textdesign – Zaubermittel der Verständlichkeit? Die Tageszeitung auf dem Weg zum interaktiven Medium. In: Hess-Lüttich EWB, Holly W, Püschel U (Hrsg) Textstrukturen im Medienwandel. Lang, Frankfurt am Main, S 31–59
IIIDj Institute for Information Design Japan (2005) Information design source book. Birkhäuser, Basel
Kress G, van Leeuwen T (1996) Reading images: the grammar of visual design. Routledge, London
La Roche W (1986) Fürs Hören schreiben. In: La Roche W, Buchholz A (eds) Radio-Journalismus. List Verlag, München
Langer I, Schulz von Thun F, Tausch R (2002) Sich verständlich ausdrücken. Reinhardt, München, Basel
Liu H, Selker T, Lieberman H (2003) Visualizing the affective structure of a text document. CHI 2003, Ft. Lauderdale, Florida, April 5–10, 2003
Mijksenaar P (1997) Visual function. Princeton Architectural Press, New York
Nielsen J (1997) How users read on the web. http://www.useit.com/alertbox/9710a.html (retrieved March 5, 2008)
Ong W (2002) Orality and literacy. Routledge, New York
Platon (2002) Phaidros. Reclam, Stuttgart
Provost G (1985) One hundred ways to improve your writing. Mentor Books, New York
Sauer C (2004) Der Stoff, aus dem die Texte sind. Vorläufige Betrachtungen zu Erscheinung und Materie von Texten. http://www.semiose.de/index.php?id=291,53 (retrieved March 5, 2008)
Schneider W (2001) Deutsch für Profis. Wege zu gutem Stil. Wilhelm Goldmann Verlag, München
Schopenhauer A (n.d.) The essays of Arthur Schopenhauer: the art of literature. http://infomotions.com/etexts/gutenberg/dirs/1/0/7/1/10714/10714.htm; http://ebooks.adelaide.edu.au/s/schopenhauer/arthur/lit/chapter2.html (retrieved February 25, 2008)
Schriver K (1997) Dynamics in document design. Creating text for readers. Wiley, New York
Weber W (2007a) Textdesign. In: Weber W (ed) Kompendium Informationsdesign. Springer, Berlin, pp 193–225
Weber W (2007b) Text visualization – what colors tell about a text. In: Proceedings of the 11th International Conference Information Visualization (IV 07). IEEE, Zurich, Switzerland, pp 354–359
12
Synesthetic Design of Music Visualization Based on Examples from the Sound-Color-Space Project Natalia Sidler
Abstract The term synesthetic design, and its practical application to the visualization of music, will be discussed in relation to synesthetic phenomena. A brief summary of the history of Sound Light Music follows, with an example from Wassily Kandinsky, which in turn provided the impulse to build an instrument to “change colors into sounds.” Together with the Color Light Organ (Fig. 12.1), designed especially for this study, the Sound-Color-Space Project illustrates the relationship among sound, color and space in novel ways. In conclusion, three synesthetic examples of music visualization demonstrate various possibilities of visualization software.
12.1 Synesthetic Design
The goal of synesthetic design is to coordinate as many sensory impressions as possible so that a coherent overall impression is achieved. Synesthetic design has its roots in synesthesia (Haverkamp 2009). The term refers to the application of a systematic arrangement of inter-modal relations, and Christian Fink proposed using it in connection with the discussion of a multi-sensory media esthetic (2004). Synesthetic design is based on the fact that all people are intuitively capable of establishing connections between the senses. If synesthetic design is applied as a multi-perceptual methodology, it will embrace all possible means of connection between the sensory modes. A systematic procedure for coordinating visual and auditory attributes in the development of industrial products, for example, has so far not been developed. However, in the visual arts, music and literature, connections have long been sought.
N. Sidler, A Transdisciplinary Project of the Zurich University of the Arts (ZHdK), Switzerland
For this reason, the analysis of drawings, paintings and sculptures aspiring to the visualization of acoustical events provides important basic material for the development of comprehensive design concepts.
12.1.1 Synesthesia
The term synesthesia comes from the Greek synaisthesis. The word is a fusion of the Greek elements syn = "together" or "unity" and aisthesis = "perception" or "sense". Thus, synesthesia is the experiencing of two or more sensory impressions simultaneously (Harrison 2007, p. 3). This concept has been known to medicine for more than 300 years. Like many other subjective human experiences, the phenomenon of synesthesia was mainly researched and discussed between 1860 and 1930 (Emmerich et al. 2004, p. 15). Synesthesia is the common name for two kinds of more than 40 interconnected cognitive conditions. The first of these two kinds, sensory synesthesia, involves the stimulation of a sense, for example smell, which then involuntarily and simultaneously activates synesthetic sensations in other sensory areas, for example,
sight or hearing. The sounds of different musical instruments can provoke color associations in a person (colored hearing); the linking of a specific musical instrument to a specific color association remains constant for the same person. The second kind of synesthesia is cognitive synesthesia. Letters, numbers or the names of persons seem connected to smells, colors or taste. The most common form of cognitive synesthesia occurs in relation to objects, with affected people associating letters, numbers, standard units of time, pitches or major and minor keys with colors (Jewanski and Sidler 2006, p. 1). The frequency of synesthetic perception in a population is uncertain. According to Baron-Cohen's investigations (1966), one person in 2000 is affected, but the American neuropsychologist Richard E. Cytowic proposes a rate of 1:2,000–1:25,000, depending on the specific type of synesthesia (1:500 for cognitive synesthesia involving colored numbers, letters or musical pitches, 1:3,000 for common kinds of sensory synesthesia such as colored musical sounds or colored tastes, and 1:25,000 or more for persons with rare kinds or combinations of synesthesia). Synesthetic connections can be detected using visual procedures such as magnetic resonance imaging (MRI), which visually depicts brain function. Synesthetic connections release electrical impulses that are recognizable and quantifiable.
Fig. 12.1 Color Light Organ, Prototype I
12.1.2 Colored Hearing
Through my research I came into contact with people with various forms of synesthesia, but my main interest was for those with colored hearing. At the Department of Neurology at the University Medical School in Hannover, I learned much from test subjects about the diversity of colored visions. Synesthetes reported differentiated color and shape impressions when listening to sounds, noise, music and their respective movements. In general, all these impressions seemed to take place in front of an “inner monitor” and to be perceived with the “inner eye.” They could also occur, albeit less frequently, as “projections” on walls and ceilings, or even outdoors. The following figures (Figs. 12.2–12.6) show a variety of synesthetic impressions. Some are from synesthetes, and some are so-called “acquired synesthesias” from people who have appropriated “colored hearing” as a memory aid or as an artistic-esthetic means of expression. Is then synesthesia a special aptitude or talent? What can be derived from it in artistic work? Which major artists incorporated their synesthesias and ideas of transdisciplinary thought in their works? These and other questions occupied me in connection with a theater text by Wassily Kandinsky and finally led to the creation of the Color Light Organ and ultimately to the design of visualized music.
Fig. 12.2 Visual synesthetic patterns from synesthete Alexandra Dittmar triggered by the vocal sound of 12 persons (Jewanski and Sidler 2006, pp. 43–44)
Fig. 12.3 Melanie Filsinger, synesthetic series of drawings, triggered by hearing sounds during the recitation of “Synthese” (2005), a poem based on a text by Wassily Kandinsky (1922). (Jewanski and Sidler 2006, pp. 418–421)
Fig. 12.5 Matthias Waldeck, synesthetic visual graphics triggered by hearing piano sounds (Jewanski and Sidler 2006, pp. 90–91)
Fig. 12.4 Matthias Waldeck, synesthetic images triggered by different noises (Jewanski and Sidler 2006, pp. 114–115)
Fig. 12.6 Natalia Sidler, synesthetic sketches, triggered by different sounds from the Color Light Organ (Jewanski and Sidler 2006, pp. 440–441)
12.2 Color Light Music
The relationships of sound, tone and interval to color and to terms situated between "color sound" and "sound color" have been debated and researched since antiquity. They have particularly preoccupied philosophers and scientists, poets, painters and musicians during the past two centuries. In 1910 Wassily Kandinsky presented his treatise "On the Spiritual in Art," a "treatise on harmony" for abstract painting, wherein the "inner sound" of colors and shapes should find their ultimate expression (Kandinsky 1994). The previous year had seen the first performance of the symphonic poem "Prometheus" by the Russian composer Alexander Scriabin. In this composition, Scriabin attempted to realize the inner visual images he imagined arising from music. With pieces from Color Light Music, which premiered in 1925, the Hungarian composer Alexander László tried to render his own synesthetic perceptions and experiences in ideally matched sound, light and color presentations (Jewanski 1997, pp. 12–43). The focal point of his compositions was a so-called Color Light Organ, which was connected to various projection devices, better allowing him to give expression to his ideas of visualized music.
12.2.1 Wassily Kandinsky and the Integrated Work of Art “The color is the key. The eye is the hammer. The soul is the piano with its many strings. The artist is the hand, which, through the appropriate playing of one or another key, causes the human soul to vibrate.” (Kandinsky 1912)
Wassily Kandinsky (1866–1944), the father of abstract art, was influenced by the first atonal compositions of
Arnold Schönberg (1874–1951) and the “synesthetic compositions” of the color-hearing Russian composer Alexander Scriabin (1872–1915). This resulted in both his exceptional paintings and his pre-Dadaistic plays, already tending towards transdisciplinary and Integrated Total Art. Total Art is nothing new. It had already made its appearance with Richard Wagner (1813–1883), for example, in his text “Art and Revolution” (1849). Wagner strove for a kind of Integrated Art, where, under the predominance of music, plot, text, costumes, stage sets, stage direction and technical realizations would all be planned together from the start. Wagner implemented his ideas in his many operas. To return to Kandinsky: his Integrated Art consisted of painting, theater and stage set design. His transdisciplinary thinking led from painting titles like “Composition,” “Improvisation” and “Improvisation III,” to descriptions of the various equally important parameters for his play, “The Yellow Sound” (1912). The artist wanted form, color, movement, music and sound all to have equal expressive value. Kandinsky’s objective, as described by the writer John Harrison, was “to create pictures which suggested sounds to the viewer.” The inclusion of a sound dimension in painting was a step along the way to Kandinsky’s final goal, the creation of a “programmatic-synesthetic Integrated Work of Art.” Kandinsky’s thinking can be summarized as follows: the more senses stimulated by a work, the greater the chance to reach the soul of the audience (Harrison 2007, p. 121).
12.2.2 “The Instrument which Changes Colors into Sound”
“The Colors and the Lute spin wildly.” (Kandinsky, Über das Theater)
Kandinsky’s stage works were all created between 1909 and 1914, an extremely productive creative phase, during which, through abstract painting, he developed a new system of signs, to be read in a specific way. Both forms of artistic expression – stage setting and two-dimensional painting – influenced each other reciprocally. Kandinsky always described his paintings by their process-like structure. The altered manner of perception demanded of the viewer – that he stroll through paintings, actively explore visual space and
feel his way from one color-form configuration to another – was, according to the artist, a time-consuming process. Kandinsky transposed the inner visual processes, forced into a two-dimensional painting and only set free by the act of viewing or of contemplation, onto the stage. As the single elements in his paintings are subjugated to the whole, so the individual figures in his stage compositions recede in favor of an integral visual effect and a uniform mood. In a famous passage from his autobiographical reminiscences of 1913, Kandinsky reports how he was jolted out of his old, rigid way of seeing by a painting standing on its side. His plays also seem to have contributed to showing him new ways of seeing and of perceiving the abstract qualities of a painting. In the “Apotheose,” the last act of the play “Violet,” there is a “Play of Light,” where the motif of a painting upside down appears anew. The painting spins time and again, so that the viewer’s final impression is one of all colors being swirled together. “The whole picture is turning: it positions itself on its left side, which is down; then the top has already become the bottom. Another fast turn. And another. Another again and then another. Always faster – the whole picture turns like a wheel – increasing in speed. Whip cracks become audible – ever louder and faster – the colors are spinning wildly [. . .].” (Boisell 1998, pp. 9–10).
12.3 Project Sound-Color-Space
The first step for the research project on Sound-Color-Space involved the transfer of synesthetic phenomena and characteristics from neuropsychological research into artistic-esthetic studies and works visualizing music. The second step dealt with the script for the play “Violet” (1914/26), by the Russian painter Wassily Kandinsky. The result was the construction of a specific musical instrument, the so-called Color Light Organ, which differs substantially in appearance and sound from a “normal” concert instrument. The project had two main goals: First, the construction of the Color Light Organ (Berlin 1998), and afterwards, in a separate research project in Zurich, the further development of the instrument, making it possible to synthesize relationships between sound, color and space in new ways. Without this instrument the three different Visualization Designs would never
have been developed, for the organ/piano, with its new sound-generating possibilities, influenced the developmental direction of these three computer programs decisively. The second goal was the development of visualization algorithms that would translate the sounds generated by the Color Light Organ into two- and three-dimensional geometries, structures, animations and color arrangements, and by doing so would permit the coupling of sound and color (Figs. 12.7, 12.8). With the help of the Color Light Organ, a new generation of interactive computer-based approaches to visualization could be studied and developed. The resulting software, based on a defined vocabulary of visualization, focuses on the flexibility and transportability of composing and staging practices and uses the newest computer graphic methods. An additional, more general goal was to fuse music, image and space into a “synesthetic” whole and to achieve a better understanding of the phenomenon of synesthesia.
12.3.1 Design and Technique of the Color Light Organ
The relationship between sound and color is extremely complex. Attempts to instantaneously visualize existing music through the computer are bound to fail on account of this complexity.
Fig. 12.7 Synesthetic computer images to the composition Topas (2002) for Color Light Organ by Peter Wettstein, Visualization: Natalia Sidler
Fig. 12.8 Synesthetic computer images to the composition Ganymed (2002) for Color Light Organ, Ondes Martenot and Piano by Natalia Sidler. Visualization: Natalia Sidler
The Color Light Organ therefore takes the opposite approach: because a normal organ/piano sound can never contain all timbres, the Color Light Organ has individual instruments with various sound characteristics built in, which reproduce individual shades of color and their levels of luminosity (see Fig. 12.9). Each instrument within the Color Light Organ was chosen for its timbral characteristics. This decision was made intuitively and is therefore subjective. However, it is possible to alter this makeup to suit the personal wishes and specific needs of a composer. The Color Light Organ was constructed so that each color in a seven-part color scale (from Isaac Newton’s “Opticks,” London 1704) corresponds to a note from the musical scale. Every individual color, with its shadings from very dark to very light, corresponds to a single instrumental timbre. The progression of color and luminosity begins with an extremely dark blue-black moving to light blue, then dark violet to light violet, dark red to light red, orange, yellow and light green. Christian Decker wrote for each color and its sound a specific visual graphic that reacts to the following parameters: strength of attack of the tone played (very soft/pp – soft/p – medium loud/mf – loud/f – very loud/ff), type of touch (staccato, portato, legato) and the combination with the previously played sound. To this color combination for every single tone is added the background color, which corresponds to the color
scheme of the piano keyboard. Therefore, every single visual graphic consists of the background base color, the color combination with the previous sound and the subjective color of the sound of the key played. The whole visual representation is in constant motion, corresponding to the music’s movement and its tempo. A major challenge in implementing this was the difference in perception and processing of stimuli by eye and ear. For the synthesis of esthetically pleasing and meaningful visualizations on the Color Light Organ, tempi, touch and the frequency and combination of sounds must be played in the correct proportion. In addition, acoustical rendering and visualization occur in a closed circuit with the performer, necessitating a totally new way of composing and performing. The goal is achieved when the recipient is completely enveloped in both a color complex and a sound complex; this is where the most intense stimulus occurs. Of course, the kind of projection and the performance space play a central role. Primarily, the Color Light Organ is an instrument to make visible the inner, subtly experienced images from
synesthetic perceptions. But it is more than just an instrument that allows visual design and musical ideas to flow into one another; it is also a work of art in its own right. Stage directions from “Apotheose,” the last act of “Violet,” by Wassily Kandinsky (visual presentation = score for Color Light Organ by Natalia Sidler, see Fig. 12.10): “On a blue ground, irregularly surrounded by yellow, a red dot arises out of the middle. White threads extract themselves from the corners and pull themselves together toward the dot. A garish green oval runs back-and-forth in all directions on the blue ground, but it can’t reach the red point: it comes very close – the closer it gets, the more it has to struggle – then bounces back instantly again. It rolls then almost appalled over the blue ground. The white threads tremble, and some flee back to their corners. They gradually come back to the red dot when new strength collects within them. And every time the red dot trembles in these cases, to become bigger and then smaller [. . .]” (Boisell 1998)
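The mapping described in this section (a seven-part color scale after Newton's "Opticks," a luminosity derived from the strength of attack, the type of touch, and a combination with the previously played sound) can be illustrated with a minimal sketch. The following Python fragment is purely illustrative; the concrete color values and the simplified scale-degree rule are assumptions and do not reproduce the shades actually built into the Color Light Organ.

```python
from dataclasses import dataclass

# Hypothetical base colors for the seven scale degrees; the real assignments
# were chosen intuitively by the instrument's builders and are not reproduced here.
SCALE_COLORS = {
    0: (0.05, 0.05, 0.20),  # blue-black
    1: (0.30, 0.10, 0.50),  # violet
    2: (0.60, 0.10, 0.15),  # red
    3: (0.90, 0.45, 0.10),  # orange
    4: (0.95, 0.85, 0.20),  # yellow
    5: (0.55, 0.85, 0.40),  # light green
    6: (0.40, 0.60, 0.90),  # light blue
}

ARTICULATIONS = ("staccato", "portato", "legato")


@dataclass
class VisualEvent:
    color: tuple        # RGB triple for the sounding tone
    luminosity: float   # 0..1, derived from the strength of attack (pp..ff)
    articulation: str   # type of touch; influences the shape of the graphic
    background: tuple   # base color taken from the keyboard's color scheme


def visual_for_note(note: int, velocity: int, articulation: str,
                    background: tuple = (0.1, 0.1, 0.1)) -> VisualEvent:
    """Map one played tone to the parameters of its visual graphic."""
    if articulation not in ARTICULATIONS:
        raise ValueError("unknown type of touch: " + articulation)
    degree = note % 7                  # simplified position in the seven-part scale
    luminosity = velocity / 127.0      # soft (pp) -> dark, loud (ff) -> light
    base = SCALE_COLORS[degree]
    color = tuple(min(1.0, c * (0.5 + luminosity)) for c in base)
    return VisualEvent(color, luminosity, articulation, background)


# Example: a loud, staccato middle C (MIDI note 60)
print(visual_for_note(60, velocity=110, articulation="staccato"))
```

In a full rendering, each such event would additionally be combined with the color of the previously played sound and with the background, as described above.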
Fig. 12.9 Inner mechanism and keyboard of the Color Light Organ

12.3.2 Visualization Software for the Color Light Organ

The musical and visual languages are related today, in our digital age, much as they have been in the past: each stands for itself. The difference today is that we can combine them meaningfully using the same techniques, so that they exert a reciprocal influence on each other. Attention to subtle details in the translation of the one into the other is necessary for musically or visually enhanced perception. Images reinforce music and vice versa. Especially in the genre of “invisible” music (sounds from a loudspeaker), which can often be confusing for the listener, visualization can bring clarity and a deepening of perception.
Fig. 12.10 Natalia Sidler, Visualization of a text passage from Wassily Kandinsky’s “Violet,” excerpt
Its application spans the range from the specific visualization of individual sounds to the projection of entire musical forms. Visuals can thus contribute to the understanding and reception of an acoustical work.

Modern 3D computer graphics have at their disposal powerful mathematical processes that can completely describe the geometry and coloration of an image. These processes can create animation by simulation using physical models. Such methods are used today in computer games and can, in combination with powerful graphics hardware, be used for calculations in real time. Except for a few interactive artistic installations (http://www.johnadamczyk.com), such integrated implementations of music (with the help of 3D computer graphics, especially in combination with physical modeling) have rarely been studied or developed. Of special importance here is the heightened expressiveness of the visualization of music given by three-dimensional form, color, lighting, shading and texture, as well as by the full utilization of room and space.

Only in recent years has an intensive analysis and discussion of “synesthetic” projects in twentieth-century art taken place [Sons et lumières: une histoire du son dans l’art du XXe siècle (2004), Brougher (2005), Grunnenberg (2005), Weibel and Jansen (2006)]. The influence of “visual music” extends into present-day trends, not only artistically, but also commercially. Music-related computer visualizations have become part of our mass culture through programs like Windows Media Player or Apple’s iTunes. Today we are surrounded by an incalculable profusion of videos, video clips, multimedia and laser shows. In most cases, elements like the amplitude and frequency of audio signals are converted into simple two-dimensional patterns and then animated in real time. Most programs for music visualization are, however, controlled by random processes. These kinds of visualization are very limited in their expressivity, as they ignore numerous channels of visual perception like spatial form, topology and texture.

In 2006, after 8 years of collaboration with various instrument companies, the Berlin University of the Arts and the Zurich University of the Arts, the development of the Color Light Organ was finally completed. Three computer graphics specialists wrote various visualization programs for the instrument and its music. The goal was to test various esthetics of synesthetic designs and graphic programming in concert.
Visualized music was composed especially for the Color Light Organ alone, as well as for the Color Light Organ with an ensemble and for the Color Light Organ with electronics.
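By way of contrast, the kind of conventional visualization criticized above, in which only the amplitude and frequency content of the audio signal is reduced to a simple two-dimensional pattern, can be sketched in a few lines. The buffer format, band count and sample rate below are assumptions chosen only for illustration; no musical information beyond the spectrum is used, which is precisely the limitation discussed above.

```python
import numpy as np


def frame_to_pattern(samples: np.ndarray, bands: int = 16, rows: int = 8) -> np.ndarray:
    """Reduce one audio frame to a rows x bands 'bar graph' pattern."""
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    # Group frequency bins into a handful of bands and normalize their energy.
    band_energy = np.array([chunk.mean() for chunk in np.array_split(spectrum, bands)])
    band_energy /= band_energy.max() + 1e-9
    # Light up a column of cells per band, proportional to that band's energy.
    pattern = np.zeros((rows, bands))
    for b, energy in enumerate(band_energy):
        pattern[: int(round(energy * rows)), b] = 1.0
    return pattern


# Example with a synthetic 440 Hz tone at an assumed 44.1 kHz sample rate.
t = np.arange(2048) / 44100.0
print(frame_to_pattern(np.sin(2 * np.pi * 440.0 * t)))
```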
12.3.2.1 Christian Decker’s Visualization Software

The first visualizations were developed by the Berlin programmer Christian Decker. He created Windows-based software specifically tailored to the Color Light Organ. The development of this program and of the visually determined parameters to control it took 6 years. Decker’s visual vocabulary was based on synesthetic images, predetermined by me for every single sound (see Fig. 12.11). To achieve the utmost precision in translating the drawn templates, Decker constructed the individual graphics pixel by pixel. The drawback is that design changes are relatively time-consuming. The visualizations are in two dimensions, but nonetheless show considerable depth. Decker wanted to do justice to Kandinsky’s visual score and hoped, through my very exact information, to make the “flat colors” and drawn forms appear more lifelike and capable of motion. Christian Decker’s software was written to encourage the development of algorithms and techniques for realizing strategies of musical visualization. It goes beyond the usual visualization techniques by analyzing the visually fixed vocabulary and by transferring all information from the Color Light Organ using the Musical Instrument Digital Interface (MIDI) protocol. This software has also been used as a tool for artistic research in visualization design. The transformation of sound to image and movement made possible by the Color Light Organ led to a unique new kind of transdisciplinary art, situated between the acoustic representation of music and the visual-spatial representation of image and partaking of both.
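Decker's program itself is not reproduced here, but the MIDI-driven architecture described above can be sketched roughly as follows: each incoming tone selects a pre-drawn graphic from the fixed visual vocabulary and is rendered together with the previously played sound. The use of the Python library mido, the port name and the show_graphic() placeholder are illustrative assumptions; the original was custom Windows software tailored to the Color Light Organ.

```python
import mido  # illustrative choice of MIDI library; not used in the original software

# note number -> pre-drawn graphic, filled from the fixed visual vocabulary
VOCABULARY = {}


def show_graphic(graphic, velocity, previous_note):
    """Placeholder for rendering: combine the graphic with the previous sound."""
    print(f"render {graphic} at intensity {velocity / 127:.2f}, after note {previous_note}")


def run(port_name: str = "Color Light Organ") -> None:
    """Listen to the instrument and trigger one graphic per played tone."""
    previous_note = None
    with mido.open_input(port_name) as port:   # hypothetical port name
        for msg in port:                       # blocks, yielding incoming MIDI messages
            if msg.type == "note_on" and msg.velocity > 0:
                graphic = VOCABULARY.get(msg.note, "default shape")
                show_graphic(graphic, msg.velocity, previous_note)
                previous_note = msg.note
```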
12.3.2.2 Visualization of Music by Jan Schacher

“It is interesting to see how closely aligned composer Alexander László’s technical development was to his musical needs. The sequence of images had its own line in the score and the lighting technician was, so to speak, a fellow musician playing the lights. (...) In his system the application of colored and structured glasses in combination with painted slides seems to me to be of particular interest. The fact that many projectors were available to László made it possible for him to produce very complex combinations.” (Schacher 2006, p. 357)
Fig. 12.11 Christian Decker, excerpts from projections from a concert with the Color Light Organ. Visualization of the piece The Memory of Colors by Adrian Koye, Berlin (Germany). Second Prize in the Composition Competition for ColorLight Organ and Ensemble, 2004 in Zürich
The starting point of Jan Schacher’s visualization concept (2004), like Decker’s, was the depiction and realization of a primarily synesthetic experience. At the same time, it was important not only to understand sound and color as compositionally related parallel occurrences, but also to couple the generation of forms and colors directly and causally with the state of individual sound elements and resonating entities. Schacher’s work was greatly influenced by the study of analyzed material from László’s Farblichtmusik examples from around 1925. Schacher constructed a mechanism to obtain information about the music while it was being played. This information, in turn, was used to influence certain processes. For example, a passage with many high notes can evoke a change to a brighter or lighter shade of a color. Many of the imaging processes are semi-autonomous. This means they follow their own rules and can only be partially influenced. A direct coupling of dynamics with movement, for example, is
striking, but its effect wears off easily. Schacher’s images mostly follow the general mood of the music and its movements within a somewhat prolonged timespan. The impression of a second voice, a counterpoint, results through the deliberately indirect connection of image to music. The images do indeed move in connection with the music, but as no absolute compatibility is sought, the images and music often diverge, creating tension. This result is often the opposite of what the audience expects, namely the 1:1 coupling of sound and image (Fig. 12.12). In designing his visualization software, Schacher clearly recognized its proportional limitations and thus gave the visual graphics a different energy density and speed than the music. He wanted to avoid over-saturating the senses with a flood of stimuli. This he achieved with techniques like extreme deceleration, by completely eliminating images or through long sustained progressions while “unfolding” an image. This enables the audience to better follow both elements, heightening their experience. Like László, Schacher interpreted the visual images for each concert anew. The basic structure of the images remained the same, but the concrete processes of motions and colors appeared differently every time.
Fig. 12.12 Jan Schacher, excerpts from projections from a concert on January 7, 2004, in Zurich. Visualization of the “Sonatina for Piano and ColorLight,” Op. 11, by A. László, Leipzig 1926 (DVD Color-Light Music Synesthesia and Color-Light Music)
Fig. 12.13 “Le Ton-beau de Frank” (2004) by José López-Montes for ColorLight Organ, Ensemble and Visualization
The program resembles a color instrument more than a film projector: the visualizations arise in the moment and then have to be “performed.” An important part of this diversity is due to chance combinations of colors and shapes. In the role of performer, the user of this
software reacts to the music, as well as to the newly created images, whose characteristics are known but whose progression is not. By this means a tension is created, very similar to that experienced in music.
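The deliberately indirect, decelerated coupling described in this section can be made concrete with a small sketch: the image does not react to every note, but eases slowly toward a brightness target derived from the register of the most recently played notes. The window length, easing factor and pitch-to-brightness rule below are assumptions chosen for illustration and are not taken from Schacher's software.

```python
from collections import deque


class MoodBrightness:
    """Slowly moving brightness derived from the register of recent notes."""

    def __init__(self, window: int = 64, easing: float = 0.02):
        self.recent_notes = deque(maxlen=window)  # rolling window of note numbers
        self.brightness = 0.5                     # current, slowly changing value
        self.easing = easing                      # small factor = strong deceleration

    def note_played(self, note: int) -> None:
        self.recent_notes.append(note)

    def step(self) -> float:
        """Advance one animation frame, independently of individual note events."""
        if self.recent_notes:
            mean_note = sum(self.recent_notes) / len(self.recent_notes)
            # Many high notes push the target up; the rule is the process's own
            # and is only indirectly influenced by the music.
            target = min(1.0, max(0.0, (mean_note - 36.0) / 60.0))
        else:
            target = self.brightness
        self.brightness += self.easing * (target - self.brightness)
        return self.brightness
```

Feeding note_played() from incoming events and calling step() once per frame yields the prolonged, mood-like response described above rather than a 1:1 coupling of sound and image.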
Fig. 12.14 José López-Montes, excerpts of projections from the composition “Julias Dyptychon” (2007) for live electronics, ambisonics and video (DVD Color-Light-Music Synesthesia and Color-Light Music)
12.3.2.3 Visualization of Music by José López-Montes

“With the study of electro-acoustic music the wish grew in me for a visual body which would mirror the acoustic happenings. This can be compared to a violinist in concert, who, with his physical movements, ‘visualizes’ the music, thereby bestowing on it more expressive power. In addition, I strove for a symbolic graphic score, which would be more inspiring and true-to-detail than traditional music notation.” (Schnebel 1969)
López-Montes made it a principle in his composing process to allow himself equal time for writing the musical and the visual software (Fig. 12.13). The distinguishing characteristic of his works is that the synesthetic design of his visualization and his music are borne by the same emotion. Whereas in some of his early works he conceived of music and video together but synthesized them separately, in a subsequent work for piano, live electronics and ambisonics he began to manipulate the music with video impulses. These images served as an impetus “to keep the music longer in the mind” (Fig. 12.14). Synesthetic design of music visualization is the connecting element between sound, space, movement and image. The virtual movement of the graphics and the spatial projections encourage a merging of the senses, awaken music to lively movement and convey information from multidimensional spaces. The individual experiences himself “playing and singing,” creating again and again messages from inner, infinitely rich worlds.
References

Boisell J (1998) Wassily Kandinsky, Über das Theater. DuMont Verlag, Köln
Brougher K (2005) Visual music: synesthesia in art and music since 1900. Thames & Hudson, London, ISBN-10: 0500512175
Emmerich HE, Schneider U, Zedler M (2004) Welche Farbe hat der Montag. Hirzel Verlag, Stuttgart-Leipzig
Grunnenberg Ch (2005) Summer of love: psychedelische Kunst der 60er Jahre. Hatje Cantz Verlag, ISBN-10: 377571670X
Harrison J (2007) Wenn Töne Farben haben. Springer-Verlag, Berlin Heidelberg
Haverkamp M (2009) Synästhetisches Design. Hanser Verlag, München/Wien
Jewanski J, Sidler N (2006) Farbe-Licht-Musik. Züricher Musikstudien, Band V. Peter Lang Verlag, Bern
Jewanski J (1997) Die Farblichtmusik Alexander Lászlós. Zeitschrift für Kunstgeschichte 60, no. 1
Kandinsky W (1912) Über das Geistige in der Kunst. Originalausgabe von 1912; revidierte Neuauflage, Benteli Verlag, Bern 2004
Kandinsky W (1994) Kandinsky: Complete Writings on Art. Da Capo Press
Kandinsky W (1998) Über das Theater. DuMont Verlag, Köln
Schacher J (2006) In: Jewanski J, Sidler N (eds) Farbe-Licht-Musik. Züricher Musikstudien, Band V. Peter Lang Verlag, Bern
Schnebel D (1969) Musik zum Lesen. Quoted from the blurb of the DVD Color-Light-Music Synesthesia and Color-Light Music, Art Adventures Verlag, 2009
Sons et lumières: une histoire du son dans l’art du XXe siècle (2004) Paris, Centre Pompidou
http://www.johnadamczyk.com (last retrieved January 28, 2008)
Weibel P, Jansen G (2006) Light art from artificial light: light as a medium in 20th and 21st century art. Hatje Cantz Verlag, ISBN-10: 3775717749
Index
A Applied science, 4, 6, 18 Automated design, 4
B Biological constructions, 65 Biological systems, 65, 72
C CAAD-CAM technologies, 109 Celebrity designers, 25 CGA shape grammar, 102 Circular thinkers, 33, 34 CityEngine, 99, 102, 104, 106 CNC milling, 110, 112 Color Light Music, 145–146, 151, 152 ColorLight Organ, 147, 150, 151 Complacent monologue, 87 Composite materials, 66–69, 73, 78 Computer aided design, 66, 69, 72, 75, 78 Computer aided engineering, 72 Computer assisted drug design (CADD), 56 Computer-based methods, 95, 107 Computer game, 39–50 Creative process, 40, 46
D Design attributes, 70 Design classifications, 13 Design criteria, 12, 13, 16–17, 23 Design DNA, 22, 25 Design engineering, 17–18 Designer drug, 53–62 Design evaluation, 17 Designing, 5–8, 10, 12, 15–17 Design issuers, 26 Design methodology, 3–5, 17 Design objectives, 70, 72, 77, 78 Design of languages, 6–7 Design of machines, 6
Design paradigms, 42, 43 Design principles, 25–26, 40, 44, 48, 131–141 Design process, 8–13, 15–17, 22, 24–27, 65, 69–72, 74, 78, 80, 83, 95, 99, 102, 104, 107, 109–112 Design requirements, 74, 77 Design research, 3–5 Design science, 3–5, 17–18 Design-space, 65, 68, 70, 72, 74, 75, 78 Design theory, 9 Deterministic paradigm, 54 Dialogue, 87–89, 91 Digital modeling, 110 Display and interaction systems, 96–98 Drug design, 53–62
E Empathetic design, 29, 38 Engineering design, 3, 5, 17, 69 Engineering materials, 66–68 Environmental simulation, 96–103, 107 Evaluation criteria, 9, 12 Evolutionary algorithms, 8, 15 Experimental science, 6
F Future cities, 95–107 Future cities simulation platform, 95–107
G Game artwork, 40, 42, 44, 46, 49, 50 Game assets, 40, 42, 44, 50 Game design, 39–42, 45–50 Game elements, 43–45, 48, 49 Game genre, 40, 42 Game mechanics, 46, 47 Game objective, 43, 48 Game physics, 45 Game procedures, 43 Game prototype, 42, 46, 47 Game rules, 43
Gaussian taste distribution, 31–32 Generic city, 97, 100–103 Genetic design, 9 Gestalt laws, 132–134, 141 GIS-based project management and collaborative city design, 96, 98 Global population, 29, 30 Good design, 24–25 Grammar-based planning, 99 Guidelines for material selection, 69
H Heuristic design approach, 10, 70, 74 Human body archetype, 36
I Individuality, 88, 89, 93 Information architecture, 97, 98 Integrative approach, 32 Intelligent materials, 69, 73 Iterative design, 41, 42 Iterative mode, 112
L Landscape design, 109, 110, 114 Lateral thinking, 33–35 Law of closure, 134–136 Law of continuity, 132, 134, 137 Law of proximity, 132–134 Law of similarity, 133, 134 Longitudinal thinkers, 34 Lysergic acid diethylamide (LSD), 59, 60
M Man-made constructions, 65 Mega cities, 29, 30 3,4-Methylenedioxy-methamphetamine, “ecstasy” (MDMA), 59 MINI, 29–38 Molecular design processes, 55 Molecular hedonism, 62 Molecular recognition principles, 5, 54 Motion planning, 103 Music visualization, 143–153
N Natural funativity, 45 Natural language, 6, 7
O Opium, 59 Optimal design, 125
P Path finding, 103 Playtesting, 42, 47–48, 50 Pragmatic position, 87 Principle of orality, 139 Principles of construction, 69 Procedural production process, 100, 101 Product design, 21–27 Progressive process, 11 Psychiatric drugs, 61 Psychopharmaceuticals, 60 Pure science, 6
R Rational drug design, 53, 59, 60 Raw material shortage, 30 Receptor theory, 54 Rules for embodiment, 69
S Science of design, 17 Scientific design, 4, 17 Self-organization design, 8 Sensual design, 36–37 Shape grammar tools, 99 Simplicity, 88 Smart drugs, 61 Social developments, 21 Sound-Color-Space, 143–153 Structural order, 88, 89 Successful design, 29, 31 Synesthesia, 143–144, 147, 151, 152 Synesthetic design, 143–153
T Technical structures, 65, 69 Text, 131–141 Text design, 131–141 Text image, 131, 132, 138–140 Troublemakers, 33, 34
U Unity, 88, 89, 93 Urban planning, 90–93, 96–101, 103–105, 107
V Variety, 88, 89, 91 Video games, 39, 40, 44–46, 50 Visualization of writing style, 140 Visualization software, 148–153 Visual vocabulary, 149
W Word classes, 136, 138, 139