Author: Wm. A. Wulf
Poised as we are between the twentieth and twenty-first centuries, it is the perfect moment to reflect on the accomplishments of engineers in the last century and ponder the challenges facing them in the next.
This past February, working with the engineering professional societies, the NAE selected the 20 greatest engineering achievements of the twentieth century. The main criterion for selection was not technical "gee whiz," but how much an achievement improved people’s quality of life. The result is a testament to the power and promise of engineering.
Reviewing the list, it’s clear that if any of its elements were removed our world would be a very different place, and a much less hospitable one. The list covers a broad spectrum of human endeavor, from the vast networks of the electric grid (no. 1) to the development of high-performance materials (no. 20). In between are advancements that have revolutionized virtually every aspect of the way people live (safe water, no. 4, and medical technologies, no. 16); the way people work (computers, no. 8, and telephones, no. 9); the way people play (radio and television, no. 6); and the way people travel (automobile, no. 2, and airplane, no. 3).
In announcing the achievements, former astronaut Neil Armstrong noted, "Almost every part of our lives underwent profound changes during the past 100 years thanks to the effort of engineers, changes impossible to imagine a century ago. People living in the early 1900s would be amazed at the advancements wrought by engineers." He added, "As someone who has experienced firsthand one of engineering’s most incredible advancements, space exploration, I have no doubt that the next 100 years will be even more amazing."
Given the immediacy of their impact on the public, many of the achievements seem obvious choices, such as the automobile and the airplane. The impact of other achievements is less obvious, but they nonetheless introduced changes of staggering proportions.
The no. 4 achievement, for example, the mechanisms to supply and distribute safe and abundant water, together with sanitary sewers, literally changed the way Americans lived and died during the last century. In the early 1900s, waterborne diseases like typhoid fever and cholera killed tens of thousands of people annually, and dysentery and diarrhea, the most common waterborne diseases, were the third largest cause of death. By the 1940s, however, water treatment and distribution systems devised by engineers had almost totally eliminated these diseases in America and other developed nations.
Engineering is all around us, so people often take it for granted. Engineering develops consumer goods, builds the networks for highway, air, and rail travel, creates innovations like the Internet, designs artificial heart valves, builds lasers for applications from CD players to surgical tools, and brings us wonders like imaging technologies and conveniences like microwave ovens and compact discs. In short, engineers make our quality of life possible.
The NAE’s full list of engineering achievements, with an expanded explanation of each item, can be found on the Web at www.greatachievements.org. The short form of the list appears below:
Challenges for the 21st Century
So much for the achievements of engineering in the twentieth century; now let’s look forward to the challenges of the twenty-first. I am an optimist. I believe 2100 will be "more different" from 2000 than 2000 was from 1900. I believe that the differences will bring further improvements in our quality of life, and that these improvements will be extended to many more of the people on the planet. But that is a belief, not a guarantee, and there are profound challenges twixt here and there. Some of those challenges are reflected in the NAE program initiatives: megacities, Earth systems engineering, technological literacy of the general public, and so on. Rather than talk about all of these challenges, I want to talk in depth about just one. It’s a challenge that I haven’t written or spoken about yet, one that I believe may be the greatest challenge for the twenty-first century, that I want to start an NAE program on, and that I want to begin a dialogue with you about. The challenge is engineering ethics!
Let me start by being clear that I believe engineers are, on the whole, very ethical. Indeed, ethics is a subject of great concern in engineering, reflecting the profession’s responsibility to the public. There are ethics courses at many engineering schools. There is a bewildering array of books on the subject. Every engineering society has a code of ethics; most start with something like "… hold paramount the health and welfare of the public."1 These codes typically go on to elaborate the engineer’s responsibility to clients and employers, the engineer’s responsibility to report dangerous or illegal acts, the engineer’s responsibility with respect to conflicts of interest, and so on.
Beyond the codes are the daily discussions that occur in the work of engineering. I have vivid memories of discussions with my father and uncle, with my professors, and with many colleagues, about everything from design margins to dealing with management pressure to cases where tough choices had to be made. All of that is still in place. It’s part of why I am proud to be an engineer!
So, why do I want to talk about engineering ethics? Why do I believe it may be the greatest challenge of the twenty-first century? Why do I think we need to start an NAE program activity on the topic? As you know, engineering is changing, and it is changing in ways that raise new ethical issues. These new issues are, I believe, "macroethical" ones that are different in kind from those that the profession has dealt with in the past.
The literature on engineering ethics, the professional society codes, and the college ethics courses all focus on the behavior of individual engineers; these have been called "microethical" issues. The changes I will discuss pose new questions for the profession more than for the individual.
In medicine, the microethical issues are very similar to those in engineering. But, in addition, there are many macroethical issues. For example, the individual medical doctor cannot and should not make broad policy decisions about "allocation": who should receive scarce organs for transplant, or doses of a limited stock of medicine, or even the doctor’s attention when there are more ill than can be accommodated. The profession, or, better, society guided by the profession, needs to set these policies.
Several things have changed to create these new macroethical questions in engineering, but I am going to focus on one: complexity. Moreover, I will focus specifically on complexity arising from the use of information technology and biotechnology in an increasing number of products. The key point is that we are increasingly building engineered systems that, because of their inherent complexity, have the potential for behaviors that are impossible to predict in advance.
Let me stress what I just said. It isn’t just hard to predict the behavior of these systems; it isn’t just a matter of taking more into account or thinking more deeply. It is impossible to predict all of their behaviors. There is an extensive literature on engineering failures: the Titanic, Three Mile Island, and so on. Engineering has, in fact, advanced and made safer, more reliable products because it has been willing to analyze its failures. I found two books on such failures particularly interesting: Normal Accidents, by Charles Perrow (1985, Basic Books),2 and Why Things Bite Back, by Ed Tenner (1997, Vintage Books). I found them interesting because of the progression in thinking in the 12 years between them about why systems fail and what engineers should do about it. For Perrow, the problem is that we don’t think about multiple failures happening at once in "tightly coupled systems," and the clear implication is that the solution is to think about them! For Tenner, there is the beginning of a glimmer that very complex systems have behaviors that are really hard to predict. But one still gets the feeling that if we just thought about it harder, if we just thought in the larger context in which the system is embedded, we would anticipate the problems.
Perrow and Tenner are not engineers (one is a historian, the other a sociologist), and they use the tools of their disciplines to analyze why failures happen. Mathematics isn’t one of those tools, so they are unlikely to have encountered the technical explanation I am about to give you. And, of course, they are partly right about the earlier failures they analyze; those systems may not yet have crossed the threshold beyond which prediction is impossible.
Over the last several decades a mathematical theory of complex systems has been developing. It’s still immature compared to the highly honed mathematical tools that are the heart of modern engineering, but one thing is very solid: a sufficiently complex system will exhibit properties that are impossible to predict a priori!
"Emergent Properties" and Intractability
I said the theory was "immature"; unfortunately, it also carries some undeserved baggage. The term used for these unanticipated behaviors is "emergent properties," a term that originally arose in the 1930s in "soft" sociological explanations of group behaviors. Some postmodern critics of science have also tried to use the theory to discredit reductionist approaches to scientific research. Despite this baggage, there are solid results, and impossibility (or "intractability," to use a more technical term) is one of them.

I don’t want to get technical, but I need to give you a flavor of why I say impossible. Consider the question of why software is so unreliable. There are many reasons, but one of them is not "errors" in the sense that we usually use the term. In these cases the software is doing exactly what it was designed to do; it is running "to spec." The problem is that the implications of the specified behavior were not fully understood because there are so many potential circumstances, and the software designers simply couldn’t anticipate them all. Not didn’t, but couldn’t! There are simply too many to analyze!
Let me just give you an idea of the magnitude of the numbers. The number of atoms in the universe is around 10^100. The number of "states" in my laptop, the configurations of 1s and 0s in its memory, is about 10^10,000,000,000,000,000,000. That’s just the number of states in the primary memory; it doesn’t include those on the disk. If every atom in the universe were a computer that could analyze 10^100 states per second, there still would not have been enough time since the Big Bang to analyze all the states in my laptop. When I say that predicting the behavior of complex systems is impossible, I don’t mean that there isn’t a process that, given enough time, could consider all the implications; it’s that there isn’t enough time!
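The combinatorics behind this claim can be sketched in a few lines of Python. The universe and analysis-rate constants below are the essay’s own round figures; the 1-kilobyte memory is a hypothetical, deliberately tiny stand-in for a real laptop, chosen to show that even it overwhelms the bound:

```python
# Rough combinatorics behind "there isn't enough time": even a 1 KB memory
# has more states than a universe of atom-sized computers could enumerate.

ATOMS_IN_UNIVERSE = 10**100          # the essay's estimate
ANALYSES_PER_ATOM_PER_SEC = 10**100  # the essay's fanciful analysis rate
SECONDS_SINCE_BIG_BANG = 5 * 10**17  # roughly 15 billion years

def memory_states(n_bits: int) -> int:
    """Each bit doubles the number of distinct memory configurations."""
    return 2**n_bits

# Total states checkable if every atom were such a computer, running
# flat out since the Big Bang:
max_checkable = (ATOMS_IN_UNIVERSE
                 * ANALYSES_PER_ATOM_PER_SEC
                 * SECONDS_SINCE_BIG_BANG)

# A single kilobyte (8,192 bits) already dwarfs that bound:
kb_states = memory_states(8 * 1024)
print(kb_states > max_checkable)  # True: ~10^2466 states vs ~10^218 checkable
```

Python’s arbitrary-precision integers make the comparison exact, and the gap only widens as memory grows: every additional bit doubles the state count, while the checkable bound stays fixed.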
So, that’s what has changed. We can, and do, build systems not all of whose behaviors we can predict. We do, however, know that there will be some such unpredicted behaviors; we just don’t know what they will be. The question then is: How do we ethically engineer when we know this, when we know that systems will have behaviors, some with negative or even catastrophic consequences, but we just don’t know what those behaviors will be?
Note that it wouldn’t be an ethical question if we didn’t anticipate that systems would have these negative properties. Ethicists and the courts alike have long held that if an engineer couldn’t reasonably have known the consequences of his or her actions, that’s okay. But here we know! So how should we behave? How should we "engineer"?
Everything Connected to Everything
A concrete example is the programmatic theme the NAE has embarked on: Earth Systems Engineering. Clearly the biosphere, our planet, is not fully understood and is a very complex, interconnected system. It’s a clear example of a system where "everything is connected to everything." Every action will have an effect on the whole, albeit perhaps not a large one in most cases. (But we have many examples where we thought that an action wouldn’t have a large negative impact, and it did.) It’s a system where, even if we did understand all the parts, we would not be able to predict all of its behaviors. Moreover, we must recognize that the Earth is already a humanly engineered artifact! Whether we consider big engineering projects, as in the proposed restoration of the Everglades, or simply paving over a mall parking lot that happens to feed an aquifer vital to a community hundreds of miles away, we have changed the planet.
Consider the case of the Everglades-either we do something or we don’t; both are conscious acts. Either way, knowing that we can’t predict all of the consequences, how do we proceed ethically? How do we behave? How do we choose? Clearly these are deep issues, and issues for the whole profession, not the individual engineer. The kind of ethics embodied in our professional codes doesn’t tell us what to do.
This spring, Bill Joy, cofounder and chief engineer of Sun Microsystems, raised a somewhat related, but different, issue. In what I thought was an irresponsibly alarmist article in Wired magazine (8.04), Joy mused that the interaction of information technology, nanotechnology, and biotechnology would lead to self-replicating systems that would "replace" human beings. He then raised the question of whether we should stop research on some or all of these technologies. I abhor the way that Joy raised the question, but I think we have to deal with the fact that something like it is at the root of the public’s concerns over cloning, genetically modified organisms, etc. We are meddling with complex systems; how can the public be assured that we know all of the consequences of that meddling? I am repelled, however, by the notion that there is truth we should not know. I can embrace the notion that there are ways we should not learn truth, research methods we should not use-the Nazi experiments on humans, or perhaps even fetal tissue research, for example. I can embrace the notion that there are unethical, immoral, and illegal ways to use our knowledge. But I can’t embrace the notion that there is truth, knowledge, that we should not know.
It’s ironic that the first academies in the seventeenth century were created because science, this new way of knowing truth, was not accepted by the scholastic university establishment. More than 100 years later, Thomas Jefferson made a radical assertion when, in founding the University of Virginia, the first secular university in the Americas, he said, "This institution will be based on the illimitable freedom of the human mind. For here we are not afraid to follow the truth wherever it may lead…." That’s the spirit of the pursuit of knowledge that I teethed on. Yet here I am in the Academy asking whether there is truth we should not know. Alas, I also have to admit that the history of the misuse of knowledge is not encouraging. I do not know the answer to Joy’s question, but it is also a macroethical one; it is not an issue for each of us individually. You might reasonably ask why we engineers need to ponder this as our ethical question. It’s because science is about discovering knowledge; engineering is about using knowledge to solve human problems. So, while I can’t bring myself to agree with the implied answer in Bill Joy’s question, I do believe it raises a deep question for engineers about the use of knowledge.
How should we behave to ensure proper use of knowledge? Again, it’s a question for the profession, not the individual. While an individual engineer perhaps should object to improper use of knowledge, such an act by itself will not prevent misuse. We need a guideline.
I could give other examples of new macroethical issues that engineering must face, but let me just summarize. Engineering, or rather engineers, have made tremendous contributions to the quality of life of citizens of the developed world. There have been missteps, and there is much to be done even to bring the benefit of today’s technology to the rest of the world. But I am unabashedly optimistic about the prospects for further increasing our quality of life in the twenty-first century and for spreading that quality of life around the globe. However, that is not guaranteed. There are significant challenges, and, in fact, those challenges are not a bad operational definition of what the NAE program should be. One of these challenges, and perhaps the greatest one, is a class of macroethical questions that engineers must face. There are many such issues, but I chose two to illustrate the point. Projects such as the further modification of the Everglades will be done with imperfect knowledge of all of the consequences. They should be done with the certainty that some of the consequences will be negative, perhaps even disastrous. At the same time we do not have the luxury of "opting out." Not to act is also an action, so we must address the question of what constitutes ethical behavior under such circumstances. Does the current nature of the engineering process support, or even allow, such behavior?
A separate but related question is how we ethically use the increasing knowledge we have of the natural world and the power that knowledge gives us to modify nature, which I think is the substantive question raised by Bill Joy’s article.
Both of these are questions on which society must give us guidance; our professional codes do not address them. But we must raise the issue and provide society with the information to help it decide, and we had better do it soon!
I happened on a quote from John Ladd, emeritus professor of philosophy at Brown, that captures part of the point I have tried to raise. He said, "Perhaps the most mischievous side effect of [ethical] codes is that they tend to divert attention from the macroethical problems of a profession to its microethical ones." Our ethical codes are very important, but now we have another set of issues to address. Let’s not let our pride in one divert us from thinking hard about the other.
1. This particular wording is from the National Society of Professional Engineers code, but many others are derived from it and use similar language.
2. Charles Perrow released an updated edition of Normal Accidents in 1999 (Princeton University Press).