Stuxnet: Staying Ahead of the Bad Guys
Last week I had the chance to attend a very interesting seminar at the Stanford Research Institute called the DHS/SRI Infosec Technology Transition Council (ITTC) meeting. It wasn't focused on SCADA or ICS or even Stuxnet, yet some of the talks had a lot of applicability to the control systems world.
A particularly interesting speaker was Dr. Ross Anderson, one of the few true experts in computer security (if you haven't read his book "Security Engineering", get a copy – I refer to it more than any other book).
Ross discussed the infiltration of the Dalai Lama's Tibetan exile centers through the GhostNet attacks of 2008/2009. These attacks were likely driven by intelligence agents working for the Chinese government. If you would like to read his full report, see "The snooping dragon: social-malware surveillance of the Tibetan movement".
Lessons for the ICS Community
Ross made three key points that are important for the ICS community:
- Few organizations outside the defense and intelligence sector could withstand an attack like this,
- While this case involved the agents of a major power, the attack could in fact have been mounted by a capable, motivated individual,
- New attack and malware technologies and capabilities quickly migrate from government agencies to organized crime groups.
“Back when we wrote the report in 2009 we said that what Chinese spooks did in 2008, Russian crooks will do in 2010. Unfortunately we were too optimistic – the criminals had all the tools in under a year,” noted Ross.
There is every reason to believe that Stuxnet will follow the same pattern, and its concepts and tools will migrate from sophisticated government operations to criminals quickly.
Reverse-engineered Stuxnet code is now freely available on the Internet (for example, see http://amrthabet.blogspot.com/), and the infamous default passwords in Siemens and other ICS systems are widely shared (see http://www.defaultpassword.com/). A "Son-of-Stuxnet" appearing on the world stage is only a question of when, not if.
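As a purely illustrative sketch of one basic hygiene step this implies, the short Python example below checks a hypothetical site inventory against a list of published default credentials and flags any device still using them. The inventory file, its columns, and the credential entries are assumptions made up for this example; they are not part of any vendor tool or of the resources linked above.

```python
# Illustrative sketch only: flag devices in a (hypothetical) site inventory
# that are still configured with publicly documented default credentials.
# The CSV layout and the credential list below are assumptions, not real data.

import csv

# Placeholder vendor/product default credentials; a real audit would populate
# this from the published default-password lists referenced above.
KNOWN_DEFAULTS = {
    ("example_vendor", "example_hmi"): {("admin", "admin"), ("operator", "operator")},
}


def audit_inventory(path):
    """Return hostnames whose recorded credentials match a known default."""
    flagged = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["vendor"].lower(), row["product"].lower())
            creds = (row["username"], row["password"])
            if creds in KNOWN_DEFAULTS.get(key, set()):
                flagged.append(row["hostname"])
    return flagged


if __name__ == "__main__":
    for host in audit_inventory("ics_inventory.csv"):
        print(f"WARNING: {host} is still using a default credential")
```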
Securing control systems from “Son-of-Stuxnet”
Owners of control systems need to start preparing for that day now. There is no quick fix, but there are solutions. For example, exida will be hosting a free webinar on February 24, 2011, entitled “The 7 Things Every Plant Manager Should Know About Control System Security” - I have seen the early slides and it looks interesting, especially if your company is embarking on an ICS Security program.
Finally, as you plan an ICS security program, remember that if you just focus on preventing the actual Stuxnet worm from infecting your plant, you are missing the point. Stuxnet’s day has come and gone. Instead consider Stuxnet as a learning tool. Study how it infected its victim, so you can prevent Stuxnet’s offspring from penetrating your defenses.
Somewhere in the world, bad guys are carefully analyzing Stuxnet so they can attack some unfortunate company for political or financial reasons. As an ICS/SCADA engineer, owner or operator, you need to stay ahead of those bad guys.
Comments
Cautiously Moving Forward
Dear Eric,
While I have been keeping abreast of the Stuxnet issue, I have also realised that the potential to blow this out of proportion is significant.
I believe targeted infiltration has always been a possibility. I wrote a comment about it on ControlGlobal's community website in 2009 (see http://community.controlglobal.com/content/your-management-showing-inter...).
I believed it then and I believe it now: while such events are doable, they are not easy.
We do need to look at the facilities that could cause the most significant impact on human life or the environment (such as nuclear facilities, oil/gas/petrochemical facilities, etc.), and in general an enormous body of standards and codes already covers the design and operation of these facilities.
In most critical facilities in developed countries, the control and safeguarding systems are distinct (although I acknowledge there is some movement toward bringing them together). In older facilities especially, the systems are legacy-based and use proprietary vendor software, something you would need to be really familiar with.
As I mentioned in the note to ControlGlobal, even if you were able to understand the Greek in the control system software, you would still need to access a separate system for safeguarding (typically from another vendor, with another proprietary language). Even if you were able to circumvent this problem and get through both these systems, you would need to understand what you are seeing, which means an implicit understanding of the process and of the automated safety protection for these systems (which, by the way, are frequently not cut-and-paste but purpose-built).
In the oil and gas industry (from which I originate), we tend to add an additional line of defence, what we call the mechanical line of defence. This gives us some measure of comfort if everything else fails. However, not all systems are designed this way, and vulnerabilities are still present in potentially 'explosive' equipment. Even with such equipment, destruction is not straightforward; you would need to know where to look and how to deactivate what are considered safety protections.
So, all in all, you would need:
i) a strong knowledge of control system configuration (usually dedicated personnel),
ii) a strong knowledge of safety system configuration (usually dedicated personnel),
iii) a very strong knowledge of the process (usually dedicated personnel from a completely different engineering field), and
iv) a person who understands weaknesses in process safety (possibly the person in (iii) above, but it needs to be someone really well versed in the facility, i.e. someone who has been there more than 10 years).
This is what I am getting at: while I think Stuxnet is an issue, for a lot of major facilities such an attack involves considerable effort.
If anything, I would move forward cautiously by critically identifying facilities which have a potential impact on human life or the environment (in most developed or developing countries, governments mandate a document called the Control of major incidents....); regulatory bodies would therefore already know which facilities are the most critical. An analysis would then be required to assess the potential for such an attack. This analysis would look not just at hardware issues but also at personnel issues (i.e. identification of critical personnel, including those who have left the company).
There is an old proverb I read somewhere: 'locks hold up against animals, nothing holds against man'. If there is intent and resources are available, it is extremely difficult to prevent that intent from coming true. As a bare minimum, however, there is a social responsibility attributable to the large industries to ensure that they do not become the one that changes the world. At least with that attitude, certain minimum measures can be taken by governments and organisations to ensure that these issues can be prevented from escalating.
Regards,
Prakash K.K.
On the Same Page
Hi Prakash,
Thanks so much for your comments. I think we are pretty much in agreement on all your points. I certainly agree that creating a worm that attacks control systems is not easy and requires knowledge of control systems, safety systems and the process.
Unfortunately, what Stuxnet showed is that it is possible to put such a team together and build such a worm. We will probably never know who wrote Stuxnet, but the resources to create a Son-of-Stuxnet are certainly available to government agencies, criminal gangs and various non-governmental actors. If you have the resources to put together a multi-person, multi-year attack like the world witnessed on 9/11, you probably have the resources to build something like Stuxnet.
Furthermore, the bar for those resources has certainly been lowered by Stuxnet – much of the Stuxnet design could be reused against other PLCs with nowhere near as much effort. Similarly, resources on both controllers and process theory are much easier to obtain today, thanks to the Internet. For example, even though I know little about natural gas compressor theory, I was able to invent a very plausible attack (at least according to my compressor engineering friends) against compressors for a class exercise. All it took was a few days of browsing the web and asking questions on various engineering newsgroups.
Your statement "as a bare minimum, there is a social responsibility attributable to the large industries to ensure that they do not become the one that changes the world" is right on the mark. In our latest paper, Andrew, Joel and I discuss how our industry needs to start by making sure our safety systems are secure first and then move on to the rest of the ICS. Unfortunately, the move by some players to bring the safety and primary control systems together into one unit is counterproductive – we are leaving ourselves open to a common-cause failure mode. Certainly it appears that whatever safety system Stuxnet's victim used, it was probably integrated into the Siemens PLCs. That certainly made life a lot easier for Stuxnet's creators.
As you point out, “certain minimum measures can be taken by governments and organizations to ensure that these issues can be prevented from escalating.” My wish is that those minimum measures are taken and the lessons of Stuxnet are not ignored until it is too late.
Thanks again for pointing out what really is needed to get our industry moving toward reasonable and effective security solutions.
Regards,
Eric