The Competitiveness of Nations

in a Global Knowledge-Based Economy

January 2004


Carl Mitcham and Sanyin Siang *

Complexity, Simulations and Ethics

Professional Ethics Report, XIV (1), Winter 2001

Advances in science and technology often introduce ethical challenges.  After all, not everything that can be done should be done.  Hence, as technology expands the realm of the possible, it requires us to extend our critical assessment of the ethical limits on human action and of what should not be done.  This has been true with the development of nuclear weapons, biomedical engineering, computers, biotechnology, and more.  It is thus to be expected that new developments in the simulation and design of complexity should in their turn raise ethical issues.  To put the point another way, ethical problems have always been considered complex.  They become even more so when we are dealing with the science and engineering of complexity.  This article will explore this new region of interaction between science, technology, and ethics by reporting on a symposium that we organized at the 2001 AAAS annual meeting in San Francisco, and then discussing current ethical trends in the field.

The symposium, titled Complexity, Simulations, and Ethics, examined the ways that we can take lessons learned from existing approaches to research ethics and apply them specifically to complexity simulations.  The ensuing discussion included questions regarding whether this relatively new field of scientific inquiry might raise new and special ethical issues or responsibilities and whether ethical issues should be included in complexity simulations.  The symposium included representatives from the sciences, engineering, the social sciences, and the humanities.

Sergio Sismondo (Queen’s University, Canada), a philosopher and social scientist, began by noting how computer simulations cut across traditional scientific boundaries.  As he puts it, “simulations occupy an ill-defined space between experiment and theory.”  Whereas experiments produce either reliable or unreliable data, and theories are judged as true or false, simulations tend to be thought of in more pragmatic terms as more or less useful.  At the same time, because of their complexity - especially insofar as they model complexities - the computer programs on which simulations rest contain a multitude of unclarified assumptions and often unrecognized bugs, the adequacies and inadequacies of which are exceptionally difficult to discern.  “Complex science is nothing new.  But computer simulation is novel in having some of the local and idiosyncratic features of experimentation even while it looks like theory.”  What this means is that our trust in a simulation must in fact rely on “trust in the skills of people who have created it” more than on any comprehensively tested program.

Aerospace engineer Stephen M. Batill (University of Notre Dame) continued the discussion by distinguishing three interrelated complexities: the complexities of “large, collaborative groups representing many disciplines” that design complex systems; the complex systems themselves; and the complexities of interactions between these systems and the world.  Complexity in the design process is the result of “the curse of dimensionality,” which grows with the level of detail used to describe a system.  Using the aeronautics industry as an example, he described the complexity and uncertainties posed in the design of a plane.  In the design process, engineers try to predict the behavior of systems prior to their realization.  To achieve this end they commonly use “computer-based models and simulations intended to represent some of the most complex phenomena that science can describe.”  But there is always a degree of uncertainty associated with the information provided by such simulations.
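Batill’s “curse of dimensionality” can be illustrated with a toy calculation (the numbers of variables and levels are hypothetical, not drawn from his talk): if each design variable is sampled at only a few discrete levels, the number of candidate designs a team would have to evaluate grows exponentially with the number of variables.

```python
# Toy illustration of the "curse of dimensionality" in design.
# If each design variable is considered at a fixed number of discrete
# levels, a full-factorial enumeration of the design space grows
# exponentially with the number of variables. (Numbers hypothetical.)

def design_space_size(num_variables: int, levels_per_variable: int) -> int:
    """Number of distinct designs in a full-factorial enumeration."""
    return levels_per_variable ** num_variables

# Five variables at five levels each is a few thousand designs;
# twenty variables at five levels each is already ~10^14.
for n in (5, 10, 20):
    print(f"{n:2d} variables: {design_space_size(n, 5):,} designs")
```

This is why, in practice, design teams rely on simulation and sampling strategies rather than exhaustive exploration of the design space.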

Characterizing and quantifying the uncertainty of simulations “is a key element in developing information of value to the engineer in the design decision-making process.”  On the one hand, this very characterization has its limitations.  On the other, social expectations regarding the powers of technology have increased at the same time that tolerance for failure has decreased.  “This introduces new ethical issues into the design process related to the engineer’s ability to use uncertain information to provide an assessment of the risk/cost versus the benefit to society for new technology development.”
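One common way engineers characterize the uncertainty Batill describes is Monte Carlo propagation: sample the uncertain inputs from assumed distributions, run the simulation on each sample, and summarize the spread of the outputs. The sketch below is illustrative only; the simple drag model and all input distributions are hypothetical stand-ins for a real engineering simulation.

```python
import random
import statistics

def simulate_drag(speed: float, density: float) -> float:
    """Hypothetical stand-in for an expensive simulation:
    quadratic drag, D = 0.5 * rho * v^2 * Cd * A."""
    cd, area = 0.3, 2.0  # assumed drag coefficient and reference area
    return 0.5 * density * speed ** 2 * cd * area

def monte_carlo(n: int = 10_000, seed: int = 0) -> tuple[float, float]:
    """Propagate assumed input uncertainty through the model and
    return the mean and standard deviation of the predicted drag."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        speed = rng.gauss(250.0, 5.0)    # m/s, assumed uncertainty
        density = rng.gauss(0.4, 0.02)   # kg/m^3, assumed uncertainty
        samples.append(simulate_drag(speed, density))
    return statistics.mean(samples), statistics.stdev(samples)

mean, sd = monte_carlo()
print(f"predicted drag: {mean:.0f} N, spread: {sd:.0f} N")
```

The resulting spread is exactly the kind of “information of value” Batill points to: it lets the engineer weigh risk and cost against benefit rather than treating a single simulation run as certain.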

Biologist Joseph Berry (Stanford University) extended the questioning by considering some of these issues in relation to what might be termed a live-in simulation, Biosphere 2.  For him, “models are powerful tools for dealing with complex systems” that run the “danger of misinforming or misleading if the input data, parameterizations or model structure have important errors.”  As a result, modelers have special responsibilities “to use the best available science and to be candid about the limitations of any particular model.”

But another problem is that precisely because of their complexity, simulations are easily criticized or dismissed when their results turn out to be counter-intuitive or contrary to strong economic interests - a phenomenon that has been well illustrated by reactions to climate change models.  This is where Biosphere 2 offers a new approach.  A large-scale model or simulation such as Biosphere 2 can serve not only as a testbed in which scientists can assess the adequacy of models for various earth system processes, but also as a demonstration site through which the non-scientific public may come to have more confidence in the models and thus not so easily dismiss unpalatable implications.  With 40,000 visitors a year touring Biosphere 2, this simulation presents real opportunities for scientific education of the general public.

Finally, Carl Mitcham (Colorado School of Mines), representing the interdisciplinary field of science, technology, and society studies, argued that although simulations of complexity may not require the development of any new ethics, they do call for a deeper integration than has heretofore been achieved of research ethics and engineering ethics.  Research ethics in science is primarily concerned with process: doing research in the right ways.  Engineering ethics, by contrast, is primarily concerned with product: designing efficient or safe products, processes, or systems.  Simulations of complexity bridge the traditional science/engineering divide by serving as large-scale virtual experiments (simulations) mounted on complex engineering products (computers).  As in the case of Biosphere 2, these simulations may also have important policy implications for society at large; the questions they address are not simply issues of knowledge for its own sake.  As such, they engage not only research and engineering ethics, but even the politics of science.

Critical reflection on the ethical dimensions of complex computer modeling and simulations of complexity is something of which some members of the computer professional community have also become aware.  A member of the symposium audience, Billy Grassie (editor of the “Meta” listserv on science and religion, http://www.meta-list.org/), commented on the listserv: “Scientists, engineers, economists, and policy makers often take... simulations too literally, committing what A.N. Whitehead once labeled ‘the fallacy of misplaced concreteness.’...  Forget the debate about utilitarian ethics, deontological ethics, or virtue ethics, we are losing moral agency in our growing collective inability to predict the consequences of complex systems.”

Current work is being conducted by John Illgen of Illgen Simulation Technology, Inc., a Santa Barbara firm that focuses on modeling and simulations and holds direct contracts from government agencies and companies such as the FAA, NASA, Raytheon, and Saab.  He is developing a code of ethics for simulations and modeling that can be used throughout the industry, in the US and globally.  He stresses the importance of the community delivering what it promises, and of not being pressured by customers who want more than is technically possible.  Already, military offices of modeling and simulations - army, air force, navy, and marines - have been pressing for solid verification, validation, and accreditation (VVA) policies and cookbooks on how to perform such efforts.  “We want to make sure what’s produced is done in a quality fashion, meaning the architecture that the models and simulations reside on has all been properly tested - strict quality control, including verification, validation and accreditation.  There are VVA processes that have been developed by the Defense Modeling and Simulation Office.  We should leverage off the excellent work that that group has done,” Illgen has stated.

The code in development will strongly urge industry to cost programs in a realistic fashion.  It is just as important not to underbid programs beyond what a company is actually able to deliver as it is not to set profits too high.  The code will also stress the necessity of assigning appropriate personnel, in terms of proper experience and academic level, to achieve the task.  It will further require simulation modelers, when encountering unforeseen problems, to inform those for whom they work and to present solutions in an honest fashion.  Illgen believes that in parallel with a code of ethics, the computer simulation community needs certification in modeling and simulation.  Currently, the Society for Computer Simulation is working on such a certification process, with the idea that those doing computer simulations would have to be certified in order to receive public funding.  He anticipates completing the code by November 2001.

* Carl Mitcham is Professor of Liberal Arts and International Studies at Colorado School of Mines. Sanyin Siang is Deputy Editor of Professional Ethics Report. Mitcham can be reached at cmitcham@mines.edu and Siang at ssiang@aaas.org.
