Authors: Thomas W. Smith III and Tara L. Hoke
Engineers’ primary ethical obligation in the performance of their professional duties is to protect public health, safety, and welfare. This obligation is captured in the rules adopted by state licensing boards, echoed in the oaths of the Order of the Engineer and its Canadian corollary,1 and codified as Fundamental Canon 1 in the ethical codes of the American Society of Civil Engineers (ASCE)2 and many other engineering professional societies.
While most engineers understand and are committed to their ethical obligations, professional services are not performed in a vacuum—that is to say, it is not always possible to make decisions that serve the public good without the challenge of potentially competing influences. We explain these competing influences and use recent examples of US engineering disasters to illustrate specific lessons for corporate culture in support of ethical practice.
In their practice, engineers are subject to a host of external and internal pressures—from clients, employers, or colleagues—to obtain a desired result, to contain costs or meet deadlines, and to compete in the marketplace. Often these pressures are at odds with one another.
Not all stakeholders on a project are affected equally by engineering decisions, so engineers may often find themselves in the difficult position of providing faithful service to multiple parties with conflicting needs and expectations. Even service to the public good is not without consideration of competing interests, as the most worthwhile project may still involve some tradeoff between degree of safety and financial feasibility.
The public places enormous trust in the judgment of engineering professionals to study and mitigate the risks that can be addressed, to communicate those that cannot be eliminated, and to warn when risks are too great or should not be taken. When, as is commonly the case, engineers strike the proper balance between these dictates, they are capable of transforming the environment for the betterment of humanity. When they fail to find this balance, however, the results can be calamitous.
The destruction of the space shuttles Challenger and Columbia in 1986 and 2003, the 2010 explosion and fire on the offshore drilling platform Deepwater Horizon, and the recall of 2.4 million GM vehicles in 2014 are a few of the most notorious recent examples of engineering failures. While the technical details differ greatly, these four cases share crucial similarities in the organizational and cultural failings that allowed them to occur.
Analysis of these examples offers multiple lessons on the importance of maintaining a corporate or organizational culture that respects and facilitates dissent in order to ensure that engineering professionals comply with their ethical obligations.
What Is Corporate Culture?
Corporate culture can be defined as the values and beliefs implicit in an organization’s conduct of its activities. While stated mission and goals may be a factor in determining this conduct, corporate culture is more rooted in the organization’s practices than in its ideology, shaped by people’s perceptions of how the organization sets priorities and encourages or discourages certain types of behavior.
In the engineering setting, the desired corporate culture is one that rewards commitment to the highest standards of professional ethics. Workers at all levels feel empowered and motivated to raise questions, address problems, and make decisions that serve the corporation while preserving the public good.
If, on the other hand, the culture favors loyalty to management or the company alone, service to the financial bottom line, or an unwillingness to “make waves,” the result will be an environment that hinders or even discourages the prioritization of public health, safety, or welfare. This unhealthy corporate culture places engineers faced with an ethical dilemma in the burdensome and isolated position of deciding whether to dissent on ethical grounds or to remain silent about a potentially catastrophic consequence.
How Can Corporate Culture Aid an Engineer’s Ethical Practice?
Clear, Written Policies and Procedures
One lesson from recent examples of engineering failures is the need for clear, written policies and procedures that emphasize legal compliance, financial transparency, and attention to the safety and welfare of workers, customers, and the public at large.
In their report on the Deepwater Horizon disaster, investigators cited BP’s failure to establish clear safety protocols as a significant contributing factor in the explosion (National Commission 2011, p. 126):
Corporations understandably encourage cost-saving and efficiency. But given the dangers of deepwater drilling, companies involved must have in place strict policies requiring rigorous analysis and proof that less-costly alternatives are in fact equally safe. If BP had any such policies in place, it does not appear that its Macondo team adhered to them. Unless companies create and enforce such policies, there is simply too great a risk that financial pressures will systematically bias decisionmaking in favor of time- and cost-savings.
While an ethical engineer should demand proper safety testing even in the absence of a clear organizational mandate, the existence of a written policy establishes a priority of safety checks and avoids placing the engineer in the potentially tricky position of advocating for increased cost and delay against an uncertain or unquantifiable risk.
Organizational Backing and Enforcement
Organizations must treat their policies as more than mere words by providing the organizational backing and enforcement to ensure that officers and employees alike comply.
The classic illustration of a disparity between policy and practice is the oft-cited Enron Code of Ethics,3 in which then CEO Kenneth Lay affirmed the duty of all Enron employees to conduct business “in accordance with all applicable laws and in a moral and honest manner”—only a year before the company collapsed under perhaps the business world’s most notorious example of institutionalized and willful corporate fraud.
Similarly, in its communications to investigators studying the 2003 Columbia disaster, the National Aeronautics and Space Administration (NASA) exposed a disconnect between its stated policies and the actual behaviors of its staff (Columbia 2003, p. 177):
NASA’s initial briefings to the Board on its safety programs espoused a risk-averse philosophy that empowered any employee to stop an operation at the mere glimmer of a problem. Unfortunately, NASA’s views of its safety culture in those briefings did not reflect reality.
Shuttle Program safety personnel failed to adequately assess anomalies and frequently accepted critical risks without qualitative or quantitative support, even when the tools to provide more comprehensive assessments were available.
If the culture of an organization is one that accepts corporate or management decisions without question, the pressures of conformity with existing norms will create added difficulties for engineers who wish to report or allay an ethical concern. Conversely, a culture in which discussions about risk are routine and questions are treated with respect and diligence will create a safer and more effective environment for communicating concerns.
Open Channels of Communication
Organizations need open channels of communication to ensure that critical information is conveyed to and received by the individuals best suited to address concerns, whether in management or in the field.
Communication failures have been recognized as a primary cause in many of the worst engineering disasters. In the Challenger accident report (Presidential Commission 1986, chapter V, “The Contributing Cause of the Accident”), for example, investigators noted a tendency of technical or management staff to avoid escalating problems to higher-level decision makers:
The Commission is troubled by what appears to be a propensity of management at Marshall [Space Flight Center] to contain potentially serious problems and to attempt to resolve them internally rather than communicate them forward. This tendency is altogether at odds with the need for Marshall to function as part of a system working toward successful flight missions, interfacing and communicating with the other parts of the system that work to the same end.
With Deepwater Horizon the most significant communication breakdown was a failure of technical experts to convey the importance of certain procedures down to the field-level staff charged with implementing them (National Commission 2011, p. 223):
Their management systems were marked by poor communications among BP, Transocean, and Halliburton employees regarding the risks associated with decisions being made. The decision making process on the rig was excessively compartmentalized, so individuals on the rig frequently made critical decisions without fully appreciating just how essential the decisions were to well safety—singly and in combination. As a result, officials made a series of decisions that saved BP, Halliburton, and Transocean time and money—but without full appreciation of the associated risks.
A muddled line of communication may create both practical and motivational challenges for an ethical engineer. The engineer might believe that communications were sent to or received by the correct parties when in fact they were not—or might conclude that raising an ethical concern is futile because no one is listening.
Also, the ease of communication may itself send a message about priorities. If a team assigned to monitor safety is buried under several layers of hierarchy below the true decision makers, this may be read as a signal that safety concerns are not as important as other considerations.
Personal Accountability
Organizations must impress upon their workers the need to accept personal accountability for protecting the public good.
The independent investigator’s review of the GM recall cited the failure of any one person or entity to take responsibility for investigating the ignition switch failures as a major issue (Valukas 2014, p. 255):
A cultural issue repeatedly described to us and borne out by the evidence is a proliferation of committees and a lack of accountability. . . . One witness described the GM phenomenon of avoiding responsibility as the “GM salute,” a crossing of the arms and pointing out toward others, indicating that the responsibility belongs to someone else, not me.
Particularly in cases where a potential issue is inadequately understood or unquantified, a healthy corporate culture inspires each person to take initiative and assume an active role in resolving ethical concerns, even if that role is not expressly included in the individual’s assignment of duties.
Protection from Reprisal
The organization must create an environment where workers can voice questions or concerns without fear of reprisal. An underlying factor in the failure cases examined in this article was a cultural norm of silence on ethical or safety-related matters:
From the GM recall report (Valukas 2014, pp. 252–253): “Some witnesses provided examples where culture, atmosphere, and the response of supervisors may have discouraged individuals from raising safety concerns, including . . . supervisors warning employees to ‘never put anything above the company’ and ‘never put the company at risk.’”
From the Columbia report (Columbia 2003, p. 138): “When workers are asked to find days of margin, they work furiously to do so and are praised for each extra day they find. But those same people (and this same culture) have difficulty admitting that something ‘can’t’ or ‘shouldn’t’ be done, that the margin has been cut too much, or that resources are being stretched too thin. No one at NASA wants to be the one to stand up and say, ‘We can’t make that date.’”
From the Deepwater Horizon report (National Commission 2011, p. 224): “A survey of the Transocean crew regarding ‘safety management and safety culture’ on the Deepwater Horizon conducted just a few weeks before the accident hints at the organizational roots of the problem. . . . Some 46 percent of crew members surveyed felt that some of the workforce feared reprisals for reporting unsafe situations.”
Though protection from reprisal invokes the concept of whistleblower protection, this aspect of a healthy corporate culture cannot be addressed simply by adopting an antiretaliation policy. Instead, the culture must first encourage and assist employees in raising concerns or offering dissenting opinions, and then reward those who participate in resolving safety or other ethical issues.
The ideal corporate culture is one that does not find it necessary to protect whistleblowers at all, because problems of safety or other ethical concerns are addressed openly and in the early stages, without the need for drastic intervention.
Other Impacts of Corporate Culture
Corporate culture has also been identified as a factor in ethical challenges beyond safety-related failures.
Accountability, communication, and prioritization of safety concerns were identified as issues in post-Katrina assessments of the New Orleans levee system (ASCE 2007), demonstrating the need to establish a culture of safety in complex, multiorganizational engineering projects. In efforts to combat corruption in the global market, corporate culture has also been identified as a key element for preventing ethical and legal lapses (US DOJ 2012).
Moreover, although the examples discussed in this article represent extreme cases of the impacts of problematic corporate culture, it is important to recognize that corporate culture has an important role in the day-to-day life of all engineering professionals. Every engineer, regardless of role, experiences pressure from employers, clients, competitors, regulators, or the public at large—and the environment in which engineers face that pressure can have a significant positive or negative influence on their ability to adhere to the profession’s ethical standards.
In fact, an examination of cases reviewed by ASCE’s Committee on Professional Conduct, which enforces ASCE’s Code of Ethics, reveals that a faulty corporate or organizational culture was frequently a contributing factor in an engineer’s ethical misstep. These include cases in which engineers plagiarized reports, conspired to make illegal political contributions, overbilled a public agency, colluded on bid submissions, or fraudulently altered an approved set of design plans (ASCE 2005, 2006, 2009, 2014, 2016).
Commitment to Corporate Culture
Extraordinary courage may be required for an engineer to speak out in an environment that operates to silence dissent. It is therefore important to train engineering professionals collectively to build ethics into their corporate culture—creating a framework in which dissent on matters of ethical concern does not rely on extraordinary behavior but rather can be offered at comparatively low personal risk.
Creating such an ethical corporate culture requires active commitment at all levels of an organization. Managers must set the tone for their departments, involving other staff members in decisions, being receptive to questions or criticism, and carrying the message of safety and ethics to all who report to them. Junior-level staff should be expected and encouraged to learn and expand their understanding of their professional responsibilities by studying applicable laws, corporate policies and guidelines, and ethical codes of conduct. Training and resources should be regularly made available for employees at all levels.
But even the most diligent managers and the most knowledgeable junior staff members cannot create an ethical culture without a wholehearted commitment to professional ethics by those at the top. The organization’s leadership has the greatest responsibility for establishing an ethical culture, as clearly stated by the Deepwater Horizon commission (National Commission 2011, p. 218):
[E]ven the most inherently risky industry can be made much safer, given the right incentives and disciplined systems, sustained by committed leadership and effective training. The critical common element is an unwavering commitment to safety at the top of an organization: the CEO and board of directors must create the culture and establish the conditions under which everyone in a company shares responsibility for maintaining a relentless focus on preventing accidents.
Engineering and corporate leaders must first “talk the talk,” by setting clear policies grounded in ethical standards, communicating a consistent message about ethical expectations, and offering training and resources to drive compliance at all levels. They must then “walk the walk,” by holding every person accountable for compliance with ethical standards, ensuring that organizational incentives and disincentives align with the desired behaviors, being open and transparent about decision making, and above all providing a model of ethical behavior through their own actions.
ASCE provides an array of ethics resources for engineering leaders, practitioners, and students. They include seminars, webinars, and publications on engineering ethics, and ethics sessions at ASCE’s technical meetings. In addition, ASCE hosts regular Order of the Engineer ceremonies and publishes a monthly column titled “A Question of Ethics” (www.asce.org/a-question-of-ethics), featuring engineering ethics case studies and current topics.
References
ASCE [American Society of Civil Engineers]. 2005. Engineer plagiarizes another firm’s report on similar project. Available at www.asce.org/question-of-ethics-articles/apr-2005/.
ASCE. 2006. Engineers collude to suppress competition on government contracts. Available at www.asce.org/question-of-ethics-articles/apr-2005/.
ASCE. 2007. The New Orleans Hurricane Protection System: What Went Wrong and Why. Hurricane Katrina External Review Panel. Reston, VA. Available at http://ascelibrary.org/doi/book/10.1061/9780784408933.
ASCE. 2009. Employer reimburses employee for political campaign contributions. Available at www.asce.org/question-of-ethics-articles/march-2009/.
ASCE. 2014. Acting as faithful agents or trustees. Civil Engineering 84(4):40–41. Available at www.asce.org/question-of-ethics-articles/apr-2014/.
ASCE. 2016. Zero tolerance for bribery, fraud, and corruption. Civil Engineering 86(4):44–45. Available at www.asce.org/question-of-ethics-articles/apr-2016/.
Columbia Accident Investigation Board. 2003. Report, Vol. 1. Washington: Government Printing Office. Available at www.nasa.gov/columbia/home/CAIB_Vol1.html.
National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling. 2011. Deep Water: The Gulf Oil Disaster and the Future of Offshore Drilling. Available at https://www.gpo.gov/fdsys/pkg/GPO-OILCOMMISSION/pdf/GPO-OILCOMMISSION.pdf.
Presidential Commission on the Space Shuttle Challenger Accident. 1986. Report to the President, vol. I. Available at http://history.nasa.gov/rogersrep/genindex.htm.
US DOJ [US Department of Justice]. 2012. FCPA: A Resource Guide to the US Foreign Corrupt Practices Act. Available at https://www.justice.gov/criminal-fraud/fcpa-guidance.
Valukas AR. 2014. Report to Board of Directors of General Motors Company Regarding Ignition Switch Recalls. Available at www.nhtsa.gov/staticfiles/nvs/pdf/Valukas-report-on-gm-redacted.pdf.
1 The Canadian “Calling of an Engineer” is a ceremonial ritual dating back to the 1920s, in which engineers take an oath to honor their ethical obligation to protect the public, and many wear a special ring as a symbol of that commitment. Fifty years later, American engineers decided to create their own version of this ritual; hence, the Order of the Engineer.
2 ASCE’s Code of Ethics is available at www.asce.org/code-of-ethics.
3 The Enron Code of Ethics is available at www.thesmokinggun.com/file/enrons-code-ethics.