In This Issue
Fall Issue of The Bridge on Cybersecurity
September 19, 2019 Volume 49 Issue 3
This issue features selected papers intended to provide a basis for understanding the evolving nature of cyber-security threats, for learning from past incidents and best practices, and for anticipating the engineering challenges in an increasingly connected world.

What Every Engineer Should Know about Cybersecurity

Thursday, September 19, 2019

Authors: Thomas A. Longstaff and Noelle K. Allon

Computer science and engineering—which includes computer, computational, communication, and information science and engineering—is the branch of engineering that concerns itself with cybersecurity. However, for the safe and secure development and deployment of engineering systems, attention to and knowledge of cybersecurity should extend beyond the domain of computer science and engineering to other branches of engineering. The participation of a cybersecurity engineer on a systems design team can ensure mitigations and modifications that will increase system resilience and longevity.

The Need for Cybersecurity Is Everywhere

Engineers develop many capabilities by embedding software in the systems they produce, from kitchen appliances and baby monitors to home security systems and cars, to name just a few. A large subset of these systems offer internet connectivity; together, such connected devices constitute the Internet of Things. There are already more than 25 billion “things” connected to the internet, and the number is constantly growing.

Even if a system is produced with no apparent need for security, it can be vulnerable to misuse. For example, a smart lightbulb may not seem to require traditional security, but if its computational resources (e.g., for its settings and remote control) are misused, the light will not function as users expect it to. Engineers must ensure that the embedded software of all systems meets safety and security requirements in the environments in which the systems are deployed.

In addition to growing in number, devices are becoming more complex, especially as engineers incorporate machine learning and other artificial intelligence capabilities. As the complexity increases, so does the number of vulnerabilities. As a result, each and every device is vulnerable to misuse, creating the potential for harm unless engineers can secure and continually update devices. As of April 2019, there were over 120,000 Common Vulnerabilities and Exposures entries in the National Vulnerability Database.[1] These vulnerabilities can exist in the code, the environment, or the system, and many can result in harm to the system itself or to people or other systems. Organizations can incur multiple costs—such as the breach of sensitive information and the loss of money, reputation, and time—when attackers successfully exploit a vulnerability.

While it is not reasonable to expect that every engineer will become an expert in cybersecurity, some awareness of cybersecurity threats, risks, and trade-offs can help any engineer understand cybersecurity requirements and how to work with cybersecurity engineers throughout the system lifecycle. The participation of cybersecurity engineers in systems engineering can help organizations reap benefits such as enhanced value of the system being engineered, improved system flexibility in a changing environment, protection of legitimate system users from harm, and greater likelihood that engineers will meet system requirements.

We review cybersecurity engineering goals and tools and offer five questions to guide implementation of cybersecurity in engineering endeavors. We discuss how organizations can address each question.

What Is Cybersecurity Engineering?

Cybersecurity engineering involves efforts to secure systems from both intentional and unintentional harm.

Researchers learned early lessons in cybersecurity engineering from incident responses to detected attacks and breaches, and these lessons have motivated better integration of cybersecurity in traditional systems and software engineering. From a financial and safety perspective, investing in cybersecurity up front is always cheaper than incurring the costs and risks associated with insufficient security.

Cybersecurity is not an end in itself but an ongoing set of practices in every stage of the system lifecycle to achieve all system goals and requirements.

Five Questions to Determine Whether Cybersecurity Engineering Is Needed

The following five questions are a starting point for incorporating cybersecurity goals and requirements in any engineered system:

  • What is the input to the system and who controls it?
  • What value does the system hold or need to protect?
  • What harm can adversaries do if they take control of the system?
  • Is there a fail-safe when a cybersecurity event is detected?
  • Can the system adapt to an evolving environment of use, attacks, and interoperability?

These questions do not represent a comprehensive set of cybersecurity concerns. Rather, they focus on (1) attack surface, (2) magnitude of consequence, (3) hazard, (4) resilience, and (5) system evolution, reflecting the most common ways that vulnerabilities and unintended consequences are introduced into systems. Their answers can help an organization determine whether to include a cybersecurity engineering specialist on the engineering team.

What Is the Input to the System and Who Controls It?

Keeping systems secure involves more than automating security rules to reduce human labor costs and error rates. It is important to know what the required inputs to the system are and who is controlling the system. Inputs to the system constitute the attack surface, which presents opportunities for both an adversary’s control of a system’s behavior and misuse by an authorized user.

Attackers can exploit vulnerabilities to force a system to violate safety constraints, divulge information, change or destroy valuable information, or use unauthorized resources. A cybersecurity engineer can use knowledge of the source of the data and which users control the system to develop an appropriate threat model and determine the controls necessary to restrict, detect, mitigate, and recover from adversarial use of the inputs.
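As a small, hypothetical illustration (the device and its commands are invented here, not drawn from the article), one basic control for shrinking the attack surface is to validate every input against an explicit allowlist before it can influence system behavior:

```python
# Hypothetical smart-lightbulb command handler (illustrative sketch only).
# Every input crosses the attack surface, so validate it against an
# explicit allowlist before acting on it.

ALLOWED_COMMANDS = {"on", "off", "dim"}

def handle_command(command, level=None):
    """Return the action taken, or raise ValueError for unexpected input."""
    if command not in ALLOWED_COMMANDS:
        raise ValueError(f"rejected unexpected command: {command!r}")
    if command == "dim":
        # Range-check parameters as well as command names.
        if not isinstance(level, int) or not 0 <= level <= 100:
            raise ValueError("dim level must be an integer in [0, 100]")
        return f"dimmed to {level}%"
    return f"light {command}"
```

Rejecting unexpected input outright, rather than trying to interpret it, keeps the set of reachable behaviors small enough to model and test.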

When a system contains confidential, private, or valuable information, the data and actions that control system behavior must yield predictable outcomes. To achieve such outcomes, cybersecurity engineers will model the system behavior under all foreseeable conditions to ensure that the model closes in a safe and secure state.

Complex systems with many interacting components can create conditions that are difficult to model; focusing on the portions of the system that are reachable through the attack surface can help to manage the complexity. In addition, engineers can leverage advances in model-based system engineering (McDermott et al. 2019) and formal methods such as static analysis tools to explore mitigations for vulnerabilities in the attack surface.

The incorporation of machine learning capabilities complicates this question because both the training data and operational data will determine the system behavior, and both sets of data may be susceptible to attacks. It is often difficult to discover attacks on these types of data through traditional monitoring.

A common example that illustrates this problem involves self-driving cars. They must be able to distinguish between different traffic signs. But researchers have discovered that when attackers place just a few stickers on a stop sign in a certain configuration, self-driving cars incorrectly classify it as a yield sign (Silver 2017), potentially creating an unsafe condition.
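The effect can be sketched with a toy model (purely illustrative; real sign classifiers are deep networks, and the weights and labels below are invented): for a simple linear classifier, a small perturbation chosen against the sign of each weight is enough to flip the decision, much as a few well-placed stickers flip a real classifier's output.

```python
# Toy linear "sign classifier" (illustrative only, not a real vision model).
weights = [0.9, -0.4, 0.2]

def classify(features):
    score = sum(w * x for w, x in zip(weights, features))
    return "stop" if score > 0 else "yield"

x = [1.0, 0.5, 0.3]   # a clean input, classified as "stop"

# Adversarial perturbation: nudge each feature against the sign of its
# weight -- the direction that lowers the score fastest.
eps = 0.6
x_adv = [xi - eps if w > 0 else xi + eps for w, xi in zip(weights, x)]

print(classify(x), "->", classify(x_adv))   # the small change flips the label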

What Value Does the System Hold or Need to Protect?

A system can house various types of data, from public material to medical histories of patients with the same health insurance to highly classified intelligence on government adversaries. Knowledge of the value of the information that a system needs to protect frames the goals and requirements for protecting that information, both while it is at rest and while it is being processed or transmitted. The answer to this question also calls for identifying both the type of adversary who would try to extract or manipulate the information held by the system and the attacker’s motivation for doing so.

The impacts of a successful attack are tied to the value of the information in a system. Evaluation of the extent of the impacts can inform engineering trade-offs in the development and deployment of mitigation technologies and the incorporation of data protections such as encrypted storage, secure protocols, and protected processing.

When a system holds information of significant value, a cybersecurity engineer familiar with the use of cryptography should participate in its design and development. Many systems have failed by incorporating or implementing cryptography with exploitable flaws. The participation of a cybersecurity engineer is also important when the system coordinates between distributed components, as detection of data in transit is a well-known attack strategy.

Other elements of a cybersecurity strategy include appropriate encryption key management, access control through identity management, and authentication and authorization of appropriate assets per user. In addition, a monitoring and resilience strategy should be developed to look for and mitigate any attempts to subvert the proper behavior of the system.
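As one concrete building block, components of a distributed system can attach a message authentication code so that modification of data in transit is detectable. The sketch below uses Python's standard-library `hmac` module; the door-lock message is hypothetical, and key management (secure storage, rotation) is assumed to happen elsewhere:

```python
import hashlib
import hmac
import secrets

# Sketch: authenticate messages between distributed components so that
# tampering in transit is detectable.

def sign(key, message):
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(key, message, tag):
    # compare_digest runs in constant time, avoiding a timing side channel.
    return hmac.compare_digest(sign(key, message), tag)

key = secrets.token_bytes(32)   # shared secret between components
tag = sign(key, b"unlock door 7")

assert verify(key, b"unlock door 7", tag)       # authentic message accepted
assert not verify(key, b"unlock door 9", tag)   # altered message rejected
```

Authentication of this kind complements, rather than replaces, encryption: the HMAC detects tampering, while encryption protects confidentiality.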

What Harm Can Adversaries Do If They Take Control of the System?

It is relevant to know whether the system controls a safety-critical function or physical action that could benefit an adversary. For example, software that controls an alarm system and electronic locks may depend on an external time source that an adversary could manipulate to allow unauthorized individuals to bypass the system and cause harm. For cyberphysical systems, such harm may affect physical materials or services managed by the system (e.g., electric power, water, alarm systems).

With knowledge of the context in which the system operates and the harm that could result if an adversary controls the system behavior, it is possible to create a threat model to drive a realistic risk analysis to reduce risks and harms to acceptable levels. This analysis could motivate the incorporation of cybersecurity capabilities that limit the system’s physical behavior to ensure that it always adheres to safety and security constraints regardless of software vulnerabilities or an adversary’s control of the input.

Is There a Fail-Safe When a Cybersecurity Event Is Detected?

Recognizing that a system has entered an undesired state is a key concern of software and systems engineering; developing the corresponding detection and response strategy is a task for a cybersecurity engineer.

Common cybersecurity strategies include developing detection and response capabilities when designing software-intensive systems to improve resilience. For a system to be resilient, engineers must design it to operate reliably (with a specified minimum set of capabilities) in foreseeable adverse conditions. Analysis of these conditions may indicate the incorporation of a resilient fail-safe mechanism to manage potential cybersecurity events.

A resilient design may involve a physical mechanism (e.g., a circuit breaker), but in software-intensive systems, cybersecurity engineers more often establish resilience through software exception handling. The dependency analysis necessary to safely and securely implement resilience in a complex system is nontrivial.
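A minimal sketch of that pattern (the controller, its thresholds, and the valve states are hypothetical, not drawn from the article): any anomalous reading or internal fault drives the system to a known safe state rather than leaving it in an undefined one.

```python
# Illustrative fail-safe via exception handling (hypothetical controller).
# On any garbled or physically impossible reading, fall back to a known
# safe state instead of continuing in an undefined one.

SAFE_STATE = "valve_closed"

def read_sensor(raw):
    value = float(raw)  # raises ValueError on garbled input
    if not 0.0 <= value <= 150.0:
        raise ValueError(f"reading {value} outside physical range")
    return value

def control_step(raw):
    try:
        pressure = read_sensor(raw)
    except ValueError:
        return SAFE_STATE  # fail safe on any undesired input state
    return "valve_open" if pressure < 100.0 else SAFE_STATE
```

The exception handler is the software analogue of a circuit breaker: it bounds the harm a bad input can cause, whatever its source.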

A cybersecurity engineer can ensure inclusion of appropriate system monitoring and automation to activate resilient mechanisms regardless of the input conditions that lead to an undesired state. Implementation of resilience tends to run more smoothly the earlier a cybersecurity engineer is involved in the project.

Can the System Adapt to an Evolving Environment of Use, Attacks, and Interoperability?

Finally, it is simply not possible to anticipate all future uses and enhancements of the system under design. To achieve resilience, it is standard practice to design and build a system that can adapt over time to an evolving environment. Traditional attributes such as modifiability, detailed models, architecture, and documentation all contribute to the ability of a system to easily evolve based on changing conditions.

However, a system’s evolution must not be accessible to an adversary who could introduce vulnerabilities and undesired behavior to a system. Furthermore, the authorized modification of the system must not subvert or corrupt any previous cybersecurity capability. The modification could introduce undesired consequences such as exposing encryption keys, removing monitoring capabilities, introducing previously mitigated vulnerabilities, and creating a new attack surface for an adversary.

Subversion or corruption is frequently an unintended consequence of a system modification because many cybersecurity capabilities remain silent until the system is attacked. Engineers therefore cannot easily test these capabilities through unit testing of the new system modification.

Adaptability is key for system resilience, but to the extent possible engineers must maintain cybersecurity goals and requirements for any modifications they make.


Investing in cybersecurity does not mean that all engineers will or should become cybersecurity experts. However, it does mean that engineers in different fields will be able to create a resilient system with safety and security assurances. Asking some simple questions can help to determine when and how a cybersecurity engineer should be added to the team.

The questions above are designed to be easily answered by any trained engineer working with a system. Often, however, the answers can reveal an unanticipated level of complexity. Although this complexity may introduce up-front cost in the design, the resulting system will be safer, more adaptable, and more secure than it would be without those answers—i.e., if engineers rely only on patching the system after having designed and built it. Mitigations at every layer—from code to software to network—can help ensure that a system functions correctly even under adverse conditions.

The addition of a cybersecurity expert to a system design team will prompt needed thinking about security. As engineers incorporate more machine learning and other artificial intelligence capabilities in the systems they build, these questions are crucial to ensure the safe and secure operation of the systems that people encounter and use every day.


McDermott TA, Canedo A, Clifford MM, Quirós G, Sitterle VB. 2019. System assurance in the design of resilient cyber-physical systems. In: Design Automation of Cyber-Physical Systems, eds Al Faruque M, Canedo A. Cham, Switzerland: Springer.

Silver D. 2017. Adversarial traffic signs. August 15.


[1]  This database, hosted by the National Institute of Standards and Technology, is available online.

About the Authors: Thomas Longstaff is chief technology officer of the Software Engineering Institute at Carnegie Mellon University. Noelle Allon is an analyst in the CERT Division of the Software Engineering Institute.