Stuxnet Report V: Security Culture Needs Work
Wednesday, March 23, 2011 @ 09:03 PM gHale
EDITOR’S NOTE: Stuxnet has hit and made its way through the system. The bad news is that if someone or something wants to target your organization, you will be hit. However, a manufacturer with a solid security plan that remains poised to fight back can minimize the damage. That means the situation is not hopeless. But make no mistake about it: Stuxnet was one of the most complex and well-engineered worms ever seen. Security professionals Eric Byres, Andrew Ginter and Joel Langill teamed up to publish a white paper titled “How Stuxnet Spreads – A Study of Infection Paths in Best Practice Systems.” This is the final installment in a five-part series detailing just how the Stuxnet worm was able to infiltrate a system, and how automation professionals can keep an eye out for the next type of attack.
By Eric Byres, Andrew Ginter and Joel Langill
Stepping back, one of the key lessons from this analysis is just how complex and interconnected a typical control system is. Potential pathways run all the way from the outside world, through the Enterprise Control Network, down to the process controllers.
Because of this complexity, Stuxnet had many possible pathways to get to its target process. In the graphic below we summarize some of these pathways in an attack graph, or infection data flow diagram. As complicated as this diagram looks, it is certainly incomplete: there are likely other potential paths this worm (and future worms) could take that we have missed.
To make matters worse, the worm might completely bypass some of the stages. For example, the infected USB storage drive might have first compromised one of the Support Stations and gained direct entrance to the Perimeter or Process Control networks. (Support Stations connecting via the Back-Firewall have a trusted connection to the Process Control Network, whereas Support Stations connecting via the Front-Firewall typically only get access to the semi-trusted Perimeter Network.) Alternatively, a PLC programming laptop, used and infected at another site, might have gone directly into the Control Network and been used to program the target PLCs. In these situations, the worm would have completely circumvented quite a few of the security controls proposed by the Siemens Security Concept documents.
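The attack-graph idea can be sketched in code. The zone names and connections below are illustrative assumptions, not the actual diagram from the white paper; the point is that even a small zone model yields multiple distinct infection routes, any one of which is enough for the worm.

```python
# Sketch: enumerating infection paths through a hypothetical set of network
# zones. The zones and edges are illustrative assumptions, not the actual
# architecture analyzed in the white paper.
ZONES = {
    "USB drive": ["Support Station", "Enterprise Control Network"],
    "Enterprise Control Network": ["Perimeter Network"],
    "Support Station": ["Perimeter Network", "Process Control Network"],
    "Perimeter Network": ["Process Control Network"],
    "Process Control Network": ["PLC"],
    "PLC": [],
}

def infection_paths(graph, src, dst, path=None):
    """Enumerate all simple paths from src to dst (possible infection routes)."""
    path = (path or []) + [src]
    if src == dst:
        return [path]
    paths = []
    for nxt in graph.get(src, []):
        if nxt not in path:  # avoid revisiting a zone (no cycles)
            paths.extend(infection_paths(graph, nxt, dst, path))
    return paths

for p in infection_paths(ZONES, "USB drive", "PLC"):
    print(" -> ".join(p))
```

Blocking any single edge in such a graph still leaves the other routes open, which is the argument against single-pathway defenses made below.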
Excessive focus on USB drives
What can the SCADA or control engineer learn from such a complicated diagram? First, there is a flaw in focusing on just a single path. For example, the topic discussed most heavily throughout most of the early Stuxnet discourse was the LNK vulnerability in the Windows operating system and how USB drives spread the worm. This was understandable, as the novelty, simplicity and zero-day nature of the LNK vulnerability generated widespread interest in the IT community.
Unfortunately, companies immediately focused on banning USB drives in their control system areas, failing to realize it was only one tool in Stuxnet’s bag of nasty tricks. As the above discussion and diagram illustrate, Stuxnet could have propagated to its target without ever needing to use a USB drive as a pathway, using, for example, a compact disc (CD) with infected project files to launch the initial infection. Focusing on one path while failing to address the others is a serious security failing.
Part I: Stuxnet attacked the Siemens SIMATIC PCS 7; Why that system?
Part II: How did Stuxnet infect a system?
Part III: A “high security site” targeted by Stuxnet or the Next Gen of Stuxnet-like worms.
Part IV: How Stuxnet infected a minor computer and then got deep inside a control system.
Part V: What should this mean for security of industrial control systems in the future?
Download the complete White Paper at Tofino Security.
Furthermore, while it is easy to ban all USB drives from the plant floor, there are cases when the drives are the lesser of many security evils. For example, imagine the network connection to a plant floor device, such as a switch, fails. The maintenance team needs log data to diagnose the failure. Plugging in a USB drive to download the logs is much safer than plugging in a laptop, but a complete ban on USB drives can force staff to either resort to the less secure option, or do without the diagnostic data and most likely take much longer to diagnose and repair the root problem.
RPC attack opportunities
The worm’s authors’ heavy use of the Remote Procedure Call (RPC) protocol for propagation and for their peer-to-peer (P2P) network provides important lessons. RPC is an ideal protocol for SCADA and ICS attacks because it serves many legitimate purposes in modern control systems. For example, the dominant industrial integration technology, OPC Classic, uses DCOM, which requires RPC traffic to flow between process areas. Furthermore, control system servers and workstations routinely share files or printers using the Microsoft RPC/SMB transport between networks. Perhaps most relevant in this example, all Siemens PCS 7 systems make extensive use of a proprietary messaging technology that travels over RPC. Simply blocking RPC traffic at control system firewalls would result in a self-induced denial of service for the entire process.
RPC will be a potential pathway for ICS worms for some time to come. The complexity of this protocol and its heavy use in proprietary systems means the opportunities for new zero-day vulnerabilities are significant. Stuxnet’s easy paths, such as USB drives, may soon face roadblocks at sites, but future worms will have many other paths to choose from – RPC, easily infected project files, and widespread use of hardcoded passwords just to mention a few.
Common-cause security failures
It is also important to consider that Stuxnet needed to attack both control and safety functions in the target system in order to be successful. We do not know whether the target system had these functions integrated in the same controller (i.e., the S7-315 PLC) or whether the S7-315 PLC provided the control functions while the S7-417 PLC provided the safety functions. What we do know is the worm was able to use a common protocol and programming system to affect both control and safety functions. This significantly reduced the complexity of the worm and the likelihood of failure.
In the safety industry, this would be called a common-cause failure mode. An effective mitigation is to ensure control and safety functions are independent and diverse in mission-critical systems. Unfortunately, the current trend in the industry is toward increasing integration and close coupling of these two functions.
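The arithmetic behind the common-cause argument is simple. Assuming, purely for illustration, that a single exploit succeeds with probability 0.1, a shared protocol and toolchain means one exploit compromises both functions, while diverse systems force the attacker to succeed twice independently:

```python
# Sketch: why diversity between control and safety functions matters.
# The exploit-success probability below is an illustrative assumption.
p_exploit = 0.1                  # chance a given exploit succeeds

common_cause = p_exploit         # shared stack: one exploit hits both functions
diverse = p_exploit * p_exploit  # diverse stacks: two independent exploits needed

print(round(common_cause, 4))
print(round(diverse, 4))
```

The tenfold difference in this toy calculation is exactly what Stuxnet’s authors avoided paying for, because control and safety shared a common protocol and programming system.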
Protecting the Crown Jewels
Is the situation hopeless? We certainly do not think so; however, we do believe ICS/SCADA security best practices must improve significantly.
First, the industry needs to accept the idea that complete prevention of control system infection is probably impossible. Determined worm developers have so many pathways available to them that some assets will suffer compromise over the life of a system.
Understanding there will be no complete prevention, the industry must create a security architecture that can respond to the full life cycle of a cyber breach. One area that needs attention is the early identification of potential attacks. Currently, few products are designed specifically for ICS environments, and in particular there is little capability for inspecting data in transit within control system networks. However, benefits can be gained from network behavior analysis and existing intrusion detection technologies that use normal traffic patterns to flag anomalies indicating potential threats.
These early warning signs can then be integrated with security event monitoring tools capable of reading and analyzing event information from multiple control system hosts, offering further insight into the state of the system. Complex alarm annunciation systems are common for plant safety; it is time users employed these same tools to address security issues that can compromise safety.
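The “learn normal, alarm on deviation” idea behind such anomaly detection can be illustrated with a toy baseline detector. A real ICS product would model protocols, hosts and timing rather than raw packet counts, and the sample data below is invented:

```python
# Sketch: flagging anomalous traffic volumes against a learned baseline.
# The per-interval packet counts are invented illustrative data.
from statistics import mean, stdev

def build_baseline(samples):
    """Learn the normal per-interval packet count from historical samples."""
    return mean(samples), stdev(samples)

def is_anomalous(count, baseline, threshold=3.0):
    """Alarm when a new observation deviates more than `threshold` sigmas."""
    mu, sigma = baseline
    return abs(count - mu) > threshold * sigma

# Control networks have unusually stable traffic, which is what makes
# this style of detection more viable there than on enterprise networks.
normal_traffic = [102, 98, 105, 99, 101, 97, 103, 100]
baseline = build_baseline(normal_traffic)

print(is_anomalous(101, baseline))  # typical interval: no alarm
print(is_anomalous(250, baseline))  # e.g. a worm's scanning burst: alarm
```

The design point is the comment in the middle: ICS traffic is far more repetitive than enterprise traffic, so even simple baselining catches deviations that would drown in noise elsewhere.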
Next, the industry needs to focus on containment of attacks when prevention fails. For example, assuming Iran was Stuxnet’s target, no matter what its engineering teams did, Stuxnet would have likely infected a number of computers. However, the number of infected systems in Iran (estimated at 60,000) would have been much lower with good zone-based design, such as the one suggested by the ANSI/ISA99 standards. This is an important lesson for all industrial sites anywhere in the world, as the next worm may not be so selective when choosing its victims.
Furthermore, if Stuxnet had not been able to make that final hop to the Siemens S7 PLCs, particularly the S7-417 PLC that was possibly the safety system, then the actual process would have been safe. This is an important lesson all SCADA/ICS asset owners should consider: while infected computers are bad, infected safety systems are deadly. The effort spent securing these “last-line-of-defense” critical systems needs to match the seriousness of the consequences if someone or something gets in and breaches them.
Better firewall granularity, deployment
The use of firewalls suggested in the Siemens Security Concept documents could also be improved. For example, the widely followed “NISCC Good Practice Guide on Firewall Deployment for SCADA and Process Control Networks” suggests:
“An extension to this concept is the idea of using ‘disjoint’ protocols in all PCN-enterprise communications. That is, if a protocol is allowed between the PCN and DMZ then it is explicitly NOT allowed between DMZ and enterprise networks. This design greatly reduces the chance of a worm such as Slammer actually making its way into the PCN/SCADA network since the worm would have to deploy two different exploits over two different protocols.”
Unfortunately, in the security architecture proposed by Siemens (and other vendors), the same protocols (particularly RPC) are allowed through multiple firewalls and zones. Rules that enforce disjoint protocols probably would not have stopped Stuxnet, as it had different exploits available, but they would make life much harder for the next worm developer.
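The disjoint-protocols rule is easy to check mechanically. The zone names and allowed-protocol sets below are illustrative assumptions, not Siemens’ actual rule sets:

```python
# Sketch: checking the NISCC "disjoint protocols" rule across firewall zone
# pairs. Zone names and allowed-protocol sets are illustrative assumptions.
def shared_protocols(rules):
    """Return protocols allowed on both sides of the DMZ, i.e. protocols a
    worm could ride end-to-end with a single exploit."""
    return rules[("enterprise", "dmz")] & rules[("dmz", "pcn")]

# Compliant design: no protocol crosses both firewalls.
GOOD_RULES = {
    ("enterprise", "dmz"): {"https", "smtp"},
    ("dmz", "pcn"): {"opc", "rpc"},
}
print(shared_protocols(GOOD_RULES))  # empty set: disjoint, compliant

# A PCS 7-style architecture that passes RPC through both firewalls fails.
BAD_RULES = {
    ("enterprise", "dmz"): {"https", "rpc"},
    ("dmz", "pcn"): {"opc", "rpc"},
}
print(shared_protocols(BAD_RULES))  # RPC crosses both zones
```

A non-empty intersection is precisely the single-exploit corridor the NISCC guidance warns about: a worm using that protocol needs only one exploit to traverse both firewalls.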
Finally, it should be clear the idea of providing security by simply blocking or allowing entire classes of protocols between areas is no longer sufficient. The fact RPC was critical to PCS 7 operations and at the same time a major vector for Stuxnet highlights the need for deep packet inspection (DPI) of key SCADA and ICS protocols. This type of fine-grained control of network traffic is currently available for a few protocols, such as Modbus TCP and OPC, but it is needed for other protocols as well.
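As a sketch of what such DPI could look like for Modbus TCP, the following parses the MBAP header and function code of a request and permits reads while denying writes. The allow/deny policy is an illustrative assumption, not a description of any shipping firewall product:

```python
# Sketch: deep packet inspection of a Modbus/TCP request. A DPI firewall
# can look past "TCP port 502 allowed" and enforce per-function policy,
# e.g. permitting reads while blocking writes from less trusted zones.
import struct

READ_FUNCTIONS = {1, 2, 3, 4}    # read coils/discrete inputs/registers
WRITE_FUNCTIONS = {5, 6, 15, 16}  # write coil(s)/register(s)

def inspect_modbus(frame, allow_writes=False):
    """Return 'allow' or 'deny' for a raw Modbus/TCP request frame."""
    if len(frame) < 8:
        return "deny"  # too short to hold MBAP header plus function code
    # MBAP header: transaction id, protocol id, length, unit id
    tid, proto, length, unit = struct.unpack(">HHHB", frame[:7])
    if proto != 0:     # protocol identifier must be 0 for Modbus
        return "deny"
    function = frame[7]
    if function in READ_FUNCTIONS:
        return "allow"
    if function in WRITE_FUNCTIONS and allow_writes:
        return "allow"
    return "deny"      # default-deny anything unrecognized

# Read Holding Registers (function 3): allowed
read_req = struct.pack(">HHHBBHH", 1, 0, 6, 1, 3, 0, 10)
print(inspect_modbus(read_req))

# Write Single Register (function 6): denied by default
write_req = struct.pack(">HHHBBHH", 2, 0, 6, 1, 6, 0, 1)
print(inspect_modbus(write_req))
```

The same principle applied to RPC-based protocols would have let PCS 7’s legitimate messaging through while stopping the calls Stuxnet needed, instead of the all-or-nothing choice a port-level firewall offers.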
Weak security culture
It is important to reiterate that the analysis presented in this paper is based on a security model that, though accepted in industry as a best practice, is often not implemented in real life. System architectures in the real world are typically much less secure than the one presented in the Siemens Security Concept document.
There are a number of reasons for this. One reason is there are often inaccurate perceptions about cyber security. For example, senior management often feels their control systems exist in isolation from the outside world, or they believe a firewall separates systems securely and therefore their situation is equivalent to being isolated from outside threats.
To date, the perceived risk from external threats has been low, and has not merited more than a cursory understanding by management. This low risk perception has led to most organizations not budgeting sufficient funds or people to protect their control systems from the multiple infection pathways of advanced threats.
For example, most operators today have not sufficiently segmented their control networks to limit the consequences of the occasional infection, do not have early warning infection detection systems in place, and do not include security assessments and testing as part of system development.
Another example: most procurement processes do not include security processes or components in their specifications. A vendor who includes extra components such as security servers, software and appliances that were not specified and involve additional costs is at a competitive disadvantage and will often lose the bid. Similarly, ICS vendors investing additional resources in creating a highly secure ICS product are likely to be at a competitive disadvantage in a world largely unaware of the need for security.
In addition to increased capital costs, there are ongoing operational and maintenance costs incurred to ensure a company maintains a strong initial security posture throughout the life of a deployment. Not all ICS management teams have understood the need for these expenses.
The fact the Siemens SQL Server systems had an embedded password that could not be changed is an excellent example of the conflict between what many ICS customers are willing to pay for and what is needed. This password was available on the Internet as early as 2008 and yet has still not been addressed. Clearly, default passwords that are both unknown to the end user and unchangeable even when known are not acceptable, so why are they still in the PCS 7 system?
The reason is that creating a changeable internal password system between the PCS 7 components is expensive, in terms of both product modifications and deployment costs. While Siemens will likely release a new version of PCS 7 that allows modifiable passwords, the bulk of the cost will fall on the customers who have to deploy these changes in live and highly distributed ICS systems. Thus, most ICS users have not demanded secure password management from any of their suppliers, with Siemens just an unfortunate example.
In a post-Stuxnet world, vendors, management teams and technical teams need to undertake frank conversations about the risk to their operations of advanced threats, and allocate resources accordingly. In short, they need to work together to improve their industrial security culture and practices.
Stuxnet is certainly not the last worm of its kind the SCADA/ICS industry will face. If Stuxnet was successful in damaging its target, whatever that target was, it is wishful thinking not to expect the injured party to respond in kind. Even if Stuxnet was not successful, it is clear the infrastructure of the developed and developing world is vulnerable to attack by malware as sophisticated as Stuxnet, and enemies of different countries and cultures now have an example of how to structure their own malware to carry out such attacks. This analysis also demonstrates that control systems are vulnerable not only due to the weaknesses of a particular vendor, but with any vendor, due to shortcuts or omissions from accepted industry best practices addressing security.
Government agencies are not the only potential threat. Organized crime rings across the globe have amply demonstrated over the last several years that the skills needed to construct most of the components of the Stuxnet worm are readily available on the black market. Acquiring the remaining PLC programming skills is a matter of identifying the target technologies, purchasing examples of them, and purchasing and attending vendor training in one of the many geographies where such training is offered. The payoff would be a powerful new tool for extortion threats against major infrastructure providers, a style of attack the banking industry has been dealing with for close to a decade.
Integrating individual components into a single package like Stuxnet is something we have not seen before, but the required skills seem comparable to the skills required to produce any complex software application. Creating another threat like Stuxnet seems straightforward for any organization with sufficient funds and a bit of time. Modifying copies of the Stuxnet worm to target other industrial platforms is also possible and should likely cost far less than writing an entirely new worm.
If the critical infrastructures of the world are to be safe and secure, then the owners and operators need to recognize their control systems are now the target of sophisticated attacks and need to adjust their security programs accordingly. In particular, security programs need to:
- Consider all possible infection pathways and have strategies for mitigating those pathways, rather than focusing on a single pathway such as USB keys,
- Recognize no protective security posture is perfect, and take steps to aggressively segment control networks to limit the consequences of compromise,
- Install ICS-appropriate intrusion detection technologies to detect attacks and raise an alarm when equipment suffers compromise or is at risk of compromise,
- Deploy, operate and maintain at maximum effectiveness ICS-appropriate security technologies and practices, including firewalls, antivirus technology, patching systems and whitelisting designed for SCADA/ICS, to make attacks by sophisticated malware much more difficult,
- Look beyond traditional network layer firewalls, toward firewalls capable of deep packet inspection of key SCADA and ICS protocols,
- Focus on securing last-line-of-defense critical systems, particularly safety integrated systems (SIS),
- Include security assessments and testing as part of the system development and periodic maintenance processes. Identify and correct potential vulnerabilities, thereby decreasing the likelihood of a successful attack, and
- Work to improve the culture of industrial security amongst management and technical teams.
These changes to improve defense-in-depth postures for industrial control systems are needed urgently. Waiting for the next worm may be too late.
Eric Byres, P. Eng., ISA Fellow, is the chief technology officer at Byres Security Inc. (firstname.lastname@example.org); Andrew Ginter, CISSP, is the chief technology officer at Abterra Technologies (email@example.com) and Joel Langill, CEH, CPT, CCNA, is the chief security officer at SCADAhacker.com (firstname.lastname@example.org) and Dept. of Critical Infrastructure Officer with The Cyber Security Forum Initiative (csfi.us).