Winter Issue of The Bridge on Complex Unifiable Systems
December 15, 2020 | Volume 50, Issue 4

The articles in this issue are a first step toward exploring the notion of unifiability, not merely as an engineering ethos but also as a broader cultural responsibility.

Epilogue: Toward an Engineering 3.0
Thursday, December 17, 2020
Author: Norman R. Augustine

The history of major engineering projects traces back at least 5000 years. It began in earnest with the construction of large stationary structures: pyramids, walls, roads, bridges, and aqueducts—what became known as civil engineering. The need to construct objects whose parts move relative to one another gave identity to the practice of mechanical engineering. Then the ability to separate, combine, and capitalize on the elements of matter enabled chemical engineering. The eventual ability to control the behavior of electrons defined the province of electrical engineering.

A second era followed in which needs and capabilities arose that did not neatly fit the traditionally defined engineering categories. As a result new, more specialized engineering disciplines were born: petroleum, aerospace, biomedical, computer, entertainment, and many more.

Then dawned the era that might be termed Engineering 3.0, a pursuit to better engage with complex systems of systems demanding great breadth as well as depth of knowledge. The engineering profession, well founded in methodology, nonetheless found itself ill prepared to deal with the interconnectedness and enormity of connected but uncoordinated systems and their consequences. Given the profusion of knowledge in each of the traditional engineering disciplines, the teaching of engineering had over the generations become highly compartmentalized, to the point of occasional disconnection from society and reality. The “stovepiping” trend was exacerbated by such forces as the academic accreditation process, which encouraged this narrower focus.
Similarly, industries largely structured themselves around highly specialized disciplines.

Complex Systems Challenges

It is clear that complex systems require far more than traditional engineering. For example, how can the natural environment be preserved when any solution demands that literally dozens of autonomous geopolitical entities work in concert? How can America’s public pre-K-12 education system, with its 14,000 independent school districts, be made to produce students who are uniformly competitive on a global scale? How can health care be provided to over 300 million people without bankrupting the nation? And how can congestion and gridlock on the nation’s 4 million miles of roads serving 247 million motor vehicles be eliminated?

Any objective assessment of the current state of the art in engineering such complex systems would likely conclude that there is a great deal of room for improvement and unifiability—“a target-rich environment,” as they say in the Pentagon. Beginning with climate change, the carbon concentration in the atmosphere has now risen to well over 400 ppm for the first time in at least 800,000 years. In education, US 15-year-olds finish in 25th place on international tests in combined reading, science, and mathematics scores—even as this country spends more per student than any other nation but one. America now devotes 7 more percentage points of its GDP to medical care than the next highest spending nation, yet has a declining life expectancy and fails to impress across many other health indices. The average adult American wastes 54 hours a year in traffic delays, and 36,000 Americans die in automobile accidents each year.

Two Complications for Engineering 3.0

Two particular complications confront the opportunities for Engineering 3.0. The first is that it involves…humans. Many systems include people, and they appear not only as individuals but also collectively as society.
Humans can be not only inconsistent but notoriously irrational as well. They may refuse to take vaccines that are known to save lives. They are more frightened of shark attacks than bee stings, although the latter kill 60 times more people in the United States each year. They may oppose the prospect of nuclear fusion energy because there have been accidents in nuclear fission plants and because of a fear of nuclear weapons. As research in the behavioral sciences has repeatedly shown, people implausibly value something they have more highly than the identical thing they don’t have (the so-called endowment effect). Attempts to model the behavior of the stock market or project election outcomes provide classic examples of systems tortured by such idiosyncrasies.

The second complexity multiplier concerns a relatively recently discovered colorless, odorless, weightless substance called…software. It flourishes in complex systems, but the accidental omission of a single bar among many thousands of lines of code can cause a spacecraft mission to Venus to fail (see Mariner 1). Further, adding a few lines of code to a major system is usually not very costly on the margin—which has led to the adage among some engineers that “If it isn’t broken, it doesn’t have enough features yet.” A modern automobile contains around 100 million lines of code—about a thousand times the number of lines in the Apollo spacecraft. It is a software app on wheels—and driverless cars are still in the future.

Further Challenges

Systems of systems involve feedback, interconnectedness, instabilities, nonlinearities, and discontinuities. Philosophers and metaphysicians over the generations have puzzled over Lorenz’s conundrum, which asks whether a butterfly flapping its wings in, say, New York can cause a hurricane in China. (We now know the answer: a microbe in China can shut down New York.) Similarly, the assassination of an archduke in Sarajevo can trigger a world war.
Or an argument between a street vendor and a police officer in Tunis can spark an “Arab Spring” throughout much of the Middle East when connectivity is provided through the widespread availability of cell phones. And a tree branch in Ohio can trigger a cascade of events that shuts off electric power for over 50 million people in the northeastern United States and part of Canada for up to 4 days (see the 2003 blackout).

Further, the challenge of designing and analyzing interdisciplinary systems usually requires accommodating legacy components of existing systems while maintaining operability as change is introduced: the classic problem of rebuilding an airplane in flight—or restructuring a national healthcare system, or introducing resilience into the nation’s existing electric grid.

A simple combinatorial argument gives the number of states in which a system of n elements can exist, assuming each element can affect each other element in the simplest of possible manners, a binary connection. A system of two elements thus has four possible states. But a system of just seven elements has a number of possible states that approximates the number of stars in our galaxy. While in most actual systems every element is not directly connected to every other element, the magnitude of the number of theoretical possibilities does suggest, among other things, why many failure modes are not caught in testing.

The pace of technological change intensifies the challenges faced by the modern systems engineer when a system can be out of date by the time it is deployed: the number of transistors on a chip has increased by a factor of about 10 million in just 50 years; the cost of gene sequencing has declined by over 6 orders of magnitude in less than 20 years; and the number of smartphones in use has grown from zero to 3.5 billion (half the world’s population) in 13 years.
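The combinatorial claim above can be checked directly. A minimal sketch, assuming each ordered pair of distinct elements carries an independent binary (on/off) connection, so a system of n elements has 2 to the power n(n-1) possible states:

```python
# State count for a system of n elements, assuming each ordered pair of
# distinct elements has a binary connection: there are n*(n-1) such
# ordered pairs, each independently on or off.
def states(n: int) -> int:
    return 2 ** (n * (n - 1))

print(states(2))  # 4 possible states for two elements, as the text says
print(states(7))  # 4398046511104, roughly 4.4e12 -- comparable to the
                  # number of stars in our galaxy
```

Even with this simplest possible connection model, exhaustive testing of a seven-element system is already hopeless, which is the essay’s point about undiscovered failure modes.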
Further, complex unifiable systems are often adaptive, as is particularly true of biological systems. Engineering such systems may entail compromises and trade-offs among unlike qualities.

Limitations of Modeling and Simulation

The rigorous practice of modeling and simulation as part of systems engineering can offer important insights into the design and analysis of complex systems of systems—sometimes aptly referred to as wicked problems. But even with these tools, challenges abound. When it comes to systems of systems, the optimum of the whole rarely equals the sum of the optima of its parts. Contrary to ritual, the best way to eat this kind of elephant is not one piece at a time. If a model is too encompassing it may defy analysis; if it is too narrow it may omit critical aspects of a system’s behavior.

Unfortunately, it is not uncommon for system failures to be caused by elements that did not rise to an adequate level of concern among system designers. It was, for example, not one of the 25,000 tiles that received so much attention on the Space Shuttle’s thermal protection system that caused the loss of the Challenger; it was an O-ring. A regional telephone company performed an extensive analysis of what would be needed to recover from a major hurricane in its operating area. It stockpiled wire, telephone poles, vehicles, and more. But when the hurricane struck, the bottleneck that emerged was absent from the models: daycare for children. With schools closed, families with two working parents had to have one parent remain at home to care for the children, just at the time a full workforce was critically needed.

So fundamental an issue as identifying figures of merit can be ambiguous in complex systems. There is, for example, the tension between controlling system cost and ensuring system resilience; e.g., just-in-time inventory vs. “just-in-case” inventory.
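The observation that the optimum of the whole rarely equals the sum of the optima of its parts can be made concrete with a toy model. The numbers and functions below (`subsystem_cost`, `total_cost`, and the coupling term) are invented purely for illustration: two subsystems are each cheapest on their own at x = 1, but a shared coupling cost makes the jointly best design quite different.

```python
# Hypothetical toy model: two coupled subsystems.
def subsystem_cost(x: float) -> float:
    return (x - 1) ** 2          # each part, in isolation, is best at x = 1

def total_cost(x1: float, x2: float) -> float:
    coupling = 3 * (x1 + x2)     # cost of a shared resource both parts draw on
    return subsystem_cost(x1) + subsystem_cost(x2) + coupling

# Optimizing each part separately (x1 = x2 = 1) gives total 0 + 0 + 6 = 6.
# A joint search over a grid finds a better whole at x1 = x2 = -0.5:
best = min(
    (total_cost(a / 10, b / 10), a / 10, b / 10)
    for a in range(-30, 31) for b in range(-30, 31)
)
print(best)  # (1.5, -0.5, -0.5): the jointly optimal design costs 1.5, not 6
```

Each part, viewed alone, is “wrong” at x = -0.5; only the system-level view reveals that this is the right design, which is the essence of the modeling dilemma described above.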
Is it better to be efficient or resilient with regard to stockpiling empty beds in a hospital?

Beyond Established Equations

Evaluating systems involving humans may require placing a value on a human life, a year of human life, a quality-adjusted year of human life, or some other such measure. Should a new highway be constructed through the middle of a city if it will save thousands of travelers many hours but will create a barrier to community life in the affected neighborhood? What is the exchange rate between tons of carbon emitted into the atmosphere and their social cost? Is it appropriate to put millions of people out of work, many of them into poverty, in order to save thousands of lives in a pandemic? Engineering complex systems not uncommonly finds itself engulfed more in the field of ethics than in engineering—confronting issues that have no standard equations for their solution. As the essays in this issue point out, in various styles and with varying substance, when it comes to engineering complex unifiable systems, both the profession and the practice may be better at reflecting on questions and offering insights than at delivering absolute solutions.

Finally, a critical factor that the construction of many complex systems often fails to adequately address is their vulnerability to external interference, intentional or otherwise. The design of the World Wide Web does not appear to have adequately accounted for the impact of malevolent individuals or nations—or even of nature itself—disrupting the intended functioning of the system. America’s electric grid is a canonical example of this problem. With 7300 power plants and 160,000 miles of high-voltage line, the latter owned by some 500 independent firms, the US grid possesses substantial vulnerabilities, and a massive failure of the system could persist for months, creating disruption of a magnitude even beyond that of COVID-19.
Communications would be curtailed, pumps in filling stations would not operate, refrigerators storing food would fail, and entire regions would go dark. A near-term task will be to take hostile threats into consideration when designing self-driving cars that will be used on connected highways. As the variously attributed aphorism has it, “Every system is perfectly designed to get the result it gets.” Even, unfortunately, unwanted results.

About the Author: Norman Augustine (NAE) is retired chair and CEO of Lockheed Martin Corporation and former chair of the National Academy of Engineering.