Tony Turner

The 5 D's of Cyber Sabotage

Cyber Informed Engineering


Cyber Informed Engineering (CIE) is a methodology proposed by the US Department of Energy to establish Secure by Design thinking in the engineering process and achieve critical function assurance. It comprises 12 core principles focused on reducing the consequences of failure for an organization's most critical functions.

Understanding adversary objectives in cyber sabotage and their impacts on your critical functions

  • Deny - creates a loss of availability of dependencies supporting a critical function
  • Disrupt - temporarily interrupts a dependency supporting a critical function
  • Deceive - impacts critical functions through deception that can alter operations
  • Degrade - decreases the performance of critical functions through sabotage
  • Destroy - destroys key assets or dependencies, creating the need for replacement

The 5 D's, Deny, Disrupt, Deceive, Degrade, and Destroy, have roots going back to the OSS Simple Sabotage Field Manual written in 1944. Some would argue even farther back, to foundational works on tactics such as The Art of War. These concepts were most recently covered in the fantastic security book Countering Cyber Sabotage by Andy Bochman and Sarah Freeman, and are greatly expanded on in the Idaho National Laboratory Consequence-driven Cyber-informed Engineering (CCE) methodology. We will see how these concepts apply to cyber sabotage and their relevant consequences.

The OSS manual, produced by the organization that became the CIA, details how common access to materials can be turned into sabotage of an adversary's mission, such as destroying engine components to disrupt critical manufacturing processes. The consequences of a failure to defend have not changed much, but the machines are now far more complex and much more difficult to understand and defend. Computing has added layers of abstraction that sometimes even technology workers don't fully understand, as further abstraction via cloud and mobile puts core workings out of their control.

But what are we really talking about here? What does the concept of cyber sabotage mean for us in cybersecurity, and more specifically, in critical infrastructure?


Deny

Denial of Service and Distributed Denial of Service are just two flavors of the same basic principle: create a condition where the target system either ceases to function or a critical process is made unavailable. For instance, a rotational control process might continue to operate normally, but what if the safety system is taken offline? What if an adversary can then manipulate the proper functioning of the system and, due to the loss of safety controls, a catastrophic condition, including loss of human life, can now be achieved? I witnessed a scenario years ago where a website I was protecting fell prey to a DDoS. The website stayed online, but a backend email server processing the site's comment form was inundated and failed. The website was non-critical, but the email system serviced operations for the entire facility.
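The lesson of that DDoS story is that the hidden dependency, not the public face, is what must be watched. A minimal sketch of such a probe in Python; the host names, ports, and the probe itself are illustrative assumptions, not any real infrastructure:

```python
import socket

def dependency_alive(host: str, port: int, timeout: float = 2.0) -> bool:
    """Liveness probe for a backend dependency.

    The public-facing service can look healthy while a dependency
    (here, a hypothetical mail relay behind a web form) is denied.
    Probing the dependency directly surfaces that gap.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Alert when the front end answers but the dependency does not;
# that disagreement is the "deny" signal worth paging someone about.
```

A real deployment would run this probe over an independent network path, so the monitor is not denied along with the service it watches.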


Disrupt

Disrupt is a close relative of deny, but it can materialize as intermittent failures of a system. Consider a continuous-process manufacturing job, where the mixture of a chemical and the stages in that process are critical to achieving the desired result. What if a highly explosive mixture is made more volatile because a key stage in the process was disrupted? Now consider that we are talking about jet fuel, and a fleet of military planes will rely on this fuel for their next mission.
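When the hazard is a skipped or reordered stage, the stage sequence itself is a safety-relevant signal worth verifying independently of the control system. A sketch of that idea, assuming a simple event log of stage names (the stage names and log format here are invented for illustration):

```python
# Illustrative only: the required stages and log format are hypothetical,
# not taken from any real process.
REQUIRED_STAGES = ["charge", "heat", "blend", "settle", "transfer"]

def stages_completed_in_order(event_log: list[str]) -> bool:
    """Check that every required stage appears, in order, in the log.

    Extra events between stages are fine; a missing or out-of-order
    stage (the fingerprint of a disrupted process) fails the check.
    """
    it = iter(event_log)
    # Membership tests against an iterator consume it, so each stage
    # must be found *after* the previous one.
    return all(stage in it for stage in REQUIRED_STAGES)
```

An independent check like this, fed from a data source the adversary does not control, can catch a disruption that the process historian reports as normal.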


Deceive

While we all certainly engage in the art of deception in the normal course of our lives, and certainly in spycraft, deception applied to an automation process may mean faulty sensor data, or quality-control checks that pass when they should have failed. Consider the explosive jet fuel again: if the operator was deceived into believing the mixture was correct, he has no idea it is not. Or consider the Oldsmar water incident, where deception was used to alter the chemical mixture of water intended for human consumption to toxic levels that could have killed thousands if not caught in time.
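One engineering answer to deceptive sensor data is a plausibility check: compare the reported value against an independently derived estimate, such as a physics-based calculation or a second sensor on a separate network path. A minimal sketch, where the 5% tolerance is an illustrative assumption rather than a recommendation for any specific process:

```python
def plausible(reported: float, independent_estimate: float,
              tolerance: float = 0.05) -> bool:
    """Flag a reported sensor value that strays too far from an
    independent estimate. A deceived HMI can show whatever an
    adversary wants; an out-of-band cross-check cannot be spoofed
    from the same foothold.
    """
    if independent_estimate == 0:
        return abs(reported) <= tolerance
    return abs(reported - independent_estimate) / abs(independent_estimate) <= tolerance
```

The value of the check comes from the independence of the second source; a cross-check computed on the same compromised controller proves nothing.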


Degrade

Sometimes a reduction in output or performance is all that is required. For instance, degraded power output may leave too little capacity to keep the heat on in winter or to maintain proper controls in a manufacturing process. Both the quantity and the quality of output can create catastrophic consequences.
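Because degradation is gradual by design, a single out-of-range sample rarely reveals it; a rolling average against a nominal level does better. A sketch of that monitoring idea, where the window size and the 90% floor are illustrative choices, not tuned values:

```python
from collections import deque

class DegradationMonitor:
    """Track a rolling average of process output and flag a
    sustained shortfall against the nominal level.
    """
    def __init__(self, nominal: float, window: int = 10, floor: float = 0.9):
        self.nominal = nominal
        self.floor = floor
        self.samples = deque(maxlen=window)  # oldest samples fall out

    def observe(self, value: float) -> bool:
        """Record a sample; return True if output has degraded
        below the floor (e.g. 90% of nominal) over the window."""
        self.samples.append(value)
        avg = sum(self.samples) / len(self.samples)
        return avg < self.nominal * self.floor
```

Trending against nominal is what separates deliberate degradation from ordinary noise; a single dip alarms nobody, but a window that never recovers should.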


Destroy

Can an adversary cause so much damage that an asset or a critical component ceases to function? It might be a motor, a motherboard, or a critical radio used to transmit data. What if an encryption key is destroyed, rendering a system inoperable and unrecoverable? If equipment fails in unexpected and violent ways, putting human safety at risk, the consequences can be even more catastrophic than the time it takes to replace or repair. Consider a safety control in a collating machine that protects operators by raising a shield to prevent bodily injury. If that mechanism fails, not only is the safety control removed, but the resulting explosion of metal might kill people.


These kinetic consequences are not new, but through the connection of technology they become far more accessible to our adversaries. Vulnerability advisories will not talk about them; they paint a picture of bits and bytes affected, with no real-world consequence. It is extremely important that we start connecting the consequences of failure to the esoteric scenarios that keep security folks up at night but are largely lost on the operator and the business. It's time we start thinking about cyber sabotage as part of a robust safety culture. It's time we take cybersecurity seriously.

Originally posted October 26, 2022.

Tony Turner

Founder, CEO

Experienced cybersecurity executive with 30+ years in the field; author of SANS SEC547: Defending Product Supply Chains and of Software Transparency.
