Stuxnet clones may target critical US systems, DHS warns

Officials with the US Department of Homeland Security warned that hackers could attack the country's power generation plants, water treatment facilities, and other critical infrastructure with clones of the Stuxnet computer worm, which was used to disrupt Iran's nuclear-enrichment operations. Stuxnet was first detected last …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward

    Never gonna happen

    Most of these systems are large, complex and, let's face it, old; they were written, installed and set up when security was not a #1 concern.

    I have no doubt that the amount of effort required to secure our large scale industrial / power generation facilities is beyond us at the moment.

    In this case "secure" means secure against penetration by unauthorized parties, either over the network or by tricking authorized staff into infecting systems. Deliberate sabotage by authorized staff is not really something you can fix in software.

    The job is just too large and complex, and involves too much crappy, disparate legacy code.

    For me to have any confidence in the security of these systems they would have to be almost entirely rewritten and:

    #1 Running on a highly secure, reviewed, standardised, purpose-built OS. Not sitting on Windows NT.

    #2 All SCADA applications requiring security certification AND high-cost penalties should any exploits be found in live code (both for the SCADA vendor and the certifying authority).

    #3 A regulating agency with the ability to perform surprise physical inspections, penetration testing and the authority to punish companies running insecure setups. (Ranging from financial penalties up to and including shutting down facilities).

    Do the above and maybe we can reduce the vulnerability of our systems; until then the sword of Damocles / Sun Tzu (or modern adversary of your choice) swings merrily above.

    1. cum grano salis

      so negative

      I have to disagree somewhat with your sky-is-falling viewpoint. A few policy changes followed by some precautions can protect systems like this.

      Step 1, create bulletproof policy and *enforce* it strictly. In NBC facilities this is easy: you imprison people. In other cases, such as the "omg hackers in the power grid" scenario, you fire them and sue them in civil court, because you *do* have a valid contract in your jurisdiction.

      As for protections, maintain an air gap in all cases. This is easier than most people think. First, unplug everything that isn't essential. Second, reduce cabling at the patch panel and the switches so that it is impossible to plug something else in and have it work. Third, fill every unnecessary computer port with epoxy cement, and epoxy the keyboards and mice into their own ports. Terminals are typically redundant in these establishments, so unless you fuck up all of your terminals in all of your redundant locations, you'll be fine. These few, minor tasks are easily surmountable in these large organizations.

      1. Anonymous Coward

        A polite response! I am shocked

        I agree with your first point: without actually kicking some arses, nothing will change. However, in a worst-case scenario, being able to sue after the fact may not be good enough; there needs to be serious proactive prodding.

        Regarding air-gapping, there must always be some way to get data into and out of a system, either via a network (for real-time monitoring) or via system / application patches (network, USB, CD, whatever), so a fully air-gapped system is simply not possible in the real world.

        You can do most of the steps you mentioned above and reduce the risk a heck of a lot, but you cannot isolate it fully.

        Air-gapping only hides the underlying system's weaknesses, so there is no defence in depth; with systems like these I think we need all the defence we can get.

        1. cum grano salis

          true, but

          If you need some holes punched through your air gap, you can provide secured rooms that only two (or so) qualified people can, and must, enter simultaneously. The secured-room details can be laid out at length, but the point is that it should be taken seriously (paper trail, full auditing, independent security monitoring both local and remote, etc.). This can still be defeated by collusion, but collusion is a separate problem from people plugging in infected iPods or taking information out on a CD-R. It is absolutely possible. (I've done it.)

          As for reducing risk, that may be all that is necessary for most SCADA systems. As you said, it would probably take a complete teardown and rebuild to get it right and have defense in depth, but that would have to be determined on a case-by-case basis. Air gapping critical systems would buy them enough time to drum up an emergency budget out of profits and capital (what those pricks should have already done decades ago) and execute some sort of plan to fix the rest. Until hackers find a way to impose a signal onto a network from a distance without frying their retinas out, things can be pretty safe.

        2. Paul Crawford Silver badge

          @A polite response

          As pointed out, until some proactive ass-kicking is done, nothing will change. At the very least the CEO/board should be held personally responsible for any lack of due diligence in critical systems; only then will things that cost money get done.

          While true air-gap protection may not be possible, at the very least they can have a properly separated / fire-walled red/blue-style network and *serious* penalties for anyone plugging non-approved equipment into the secure one. Like a big fine (contractor) and/or immediate dismissal (employee).

          Finally, there is the whole 'old Windows is fine' mentality, which is incredibly dumb, and in particular the even more piss-poor security of vendors (like Siemens; I mean hard-coded root passwords that break the system if changed, WTF!).

          Something like compulsory insurance, with a premium based on the evaluated security of the system, might just make people choose and/or replace things that are woefully insecure, because suddenly there is a very real ongoing cost, rather than an 'oh, if it happens we will react' mentality.

        3. Trygve Henriksen

          Air-gapping works...

          But only as a single layer in a multi-layered defense.

      2. clanger9
        Stop

        Re: so negative

        Please don't assume that an "air gap" is a panacea for securing these systems.

        By nature, utility infrastructure is often distributed over thousands of square km; the hardware is not all boxed up in some neat little secure control room. Plus, everything needs to talk to everything else (within reason), so maintaining effective air gaps in such a network is always going to be a challenge that borders on the impossible.

        Secondly, new "smart grid"/intelligent infrastructure makes significant use of real-time external data streams - things like weather, local demand, supply availability, fuel prices, etc. You can't (efficiently) do real-time operations behind an air gap, so data connections tend to pop up all over the place.

        Air gaps are a neat idea (and security in the control room is certainly important), but don't think for one second that "supergluing the USB ports and firing people for breach of policy" is going to have the slightest effect in securing the system. The real vulnerabilities lie elsewhere.

    2. Anonymous Coward
      WTF?

      Utterly flawed

      R"running on a highly secure, reviewed, standardised purpose built OS. Not sitting on windows NT."

      Yes, agreed on NT, but there is no such thing as a secure OS. Someone, somewhere will find a hole.

      "All SCADA applications requiring security certification AND high-cost penalties should any exploits be found in live code"

      You will NEVER find a contractor / supplier willing to sign up to that. So who will supply it?

      "A regulating agency with the ability to perform surprise physical inspections, penetration testing and the authority to punish companies running insecure setups. (Ranging from financial penalties up to and including shutting down facilities)."

      WTF? Seriously, WTF? 1. Running penetration testing against live CRITICAL infrastructure could be very dangerous; what happens if they take it off the grid? Will they pay? And as for shutting down? You are kidding, right? You are going to shut down a hydro dam, a major power plant or a water works for a bug? F**k off. You may find it a great idea, but the 1-2 million people without electricity or water may disagree.

      No computer system will ever be 100% secure; anyone who thinks so is an idiot. After all, idiots are often the ones using it.

      1. Anonymous Coward

        Ah, the impolite response...

        ...good, good, I was getting worried by how civil things were.

        "Yes agree with NT, but there is no such thing as a secure OS. Someone, somewhere will find a hole."

        I never said "secure" (implying a binary: secure / insecure). I said "highly secure" (implying a sliding scale).

        The point I was trying to make is that we should not be using a general-purpose operating system for this, but a dedicated, highly secure one. This is a tech site with a tech readership; I didn't think I needed to labour the point.

        "You will NEVER find a contractor / supplier willing to sign up to that. So who will supply it?"

        Did you read the title of my post? "Never gonna happen". The points I listed are what, in my opinion, would be required (at a minimum) for a system we could have confidence in, not a statement of how politically / commercially feasible they are.

        "WTF? Seriously WTF? 1. Running penertration testing againt live CRITICAL infrastructure could be very dangerous, what happens if the take it off grid? Will they pay? And as for shutting down? you are kidding right, you are going to shut down a hydro-dam, Major power plant, water works, for a bug? F**k off. You may find it a great idea, but the 1/2/ million people without electric or water may disagree.."

        You seem to have taken my general point, fleshed it out with your own ideas and then declared it "WTF". I could just ignore that, but I shall try to answer the points you came up with.

        1. Please go and look up penetration testing. Penetration testing != destroying the system. Breaking through an external firewall will not wreck the turbine control computer.

        1b. You can use test or simulated environments, depending on what you are trying to break into.

        1c. If you want to go against the live system you can do so during scheduled maintenance to avoid further impact.

        1d. Tests may not be done without co-operation of facility staff to further reduce risk.

        1e. Other ways of doing things that smarter people than I would think of.

        In summary, I am not saying that we should tell Lulzsec to go and try to rm -rf / every computer they gain access to.

        2. Regarding shutting down: facilities shut down all the time for maintenance. There are very few facilities (if any) which, if shut down, would totally wipe out a service to an area; there is redundancy built into these large-scale systems. Selective shutdown would also be an option ("Your reporting system which links to X, Y and Z is shit; shut it down while you fix it, it doesn't affect the main systems").

        A good example is Japan, currently running ~25 nuclear reactors shy of its full complement yet still with enough power to function almost normally. Shutting down one plant to fix bad security will not bring a country to a standstill.

        "No computer system will be 100% secure, anyone that thinks so is an idiot. After all, idiots are often the ones using it."

        The point of the above was not to make a 100% secure system but to create something we can at least have a greater degree of confidence in than the current crappy setup we have now.

  2. Mark 65

    So it turns out

    Someone in a glass house threw some very large stones when they wrote this code.

    1. Anonymous Coward

      The more interesting question is

      The more interesting question is why there are no shutters available to put in front of the glass in case stones start flying around. That way you stand at least some chance, and get an extra breather until you rebuild the house or put in bulletproof glass.

      Some of us have said it for years: SCADA is at the security level the Internet was at before Aleph1's paper "Smashing The Stack for Fun and Profit" 14 years ago. The security on most systems is directed only against people coming in via authorised access paths. To add insult to injury, it regularly runs on systems with prehistoric patch levels. A piece of attack code can sail through the "glass window" and smash it to bits any time, any day. In fact, Stux was total overkill. It was like smashing a shopfront window with an ICBM. Most systems out there require a fraction of the Stux effort.

      Unfortunately, the powers that be in charge of "putting up the shutters" were busy making sure that the shutters were designed by the "right" contractors, instead of putting up any shutters at all.

      As a result, as you noted, we still have a glass house with a lot of stones flying around. At least in this country.

    2. Anonymous Coward

      It's happened before, the last time SCADA was attacked

      From the world of smoke & mirrors: did it happen; didn't it happen?

      I'm referring to the Mitterrand/Canadian/CIA/KGB affair, where a SCADA design for gas pipeline control was filched by the USSR but had previously been firmware-poisoned, such that the pipeline went 'pop' in what is described as the world's largest non-nuke explosion!

      Except by those who deny it ever happened (to avoid damaging their/our glass houses).

      http://en.wikipedia.org/wiki/Siberian_pipeline_sabotage (the wiki page changes regularly as the debate continues from 1982 and the two or more sides edit their critical infrastructure protection away from the Streisand effect)

    3. John Smith 19 Gold badge
      Unhappy

      @Mark 65

      "Someone in a glass house threw some very large stones when they wrote this code."

      But it looked like *such* a good idea at the time.

      So elegant.

      So simple.

      So what could go wrong?

      After all, these <insert suitably derogatory racist epithet> are *far* too stupid to reverse engineer it and re-target it at systems *we* use, let alone insert it into *our* systems.

      I doubt those who built Stuxnet will *ever* have to fix the trouble their little "prank" will cause.

      1. Destroy All Monsters Silver badge
        FAIL

        The usual routine from the conjoined-twins comic relief duo

        1) Israel - You want?

        2) USA - You ask?

        3) ???

        4) Blowback!

    4. The Flying Dutchman
      Happy

      my first thought...

      ... when I read the headline, had something to do with "hoist" and "petard"...

  3. Anomalous Cowturd
    Holmes

    I believe it's called...

    Karma.

    Let's hope it bites them in the arse. Hard!

  4. Steen Hive
    Thumb Up

    Oh dear

    Those pesky roosting chickens once again, eh?

  5. cum grano salis
    FAIL

    whoops!

    "spread virally through SCADA, or supervisory control and data acquisition, systems"

    Not really.

    Here comes Frankenstein's monster. This is what happens when you create monsters, dictators, tearists, viruses, or Israel.

  6. amanfromMars 1 Silver badge
    Holmes

    Targeting the Weakest Link with SMARTer Virtual Machine CodedDINformation ....

    .... ESPecial Instructions of Sublime Construction?

    "At a hearing Tuesday before a subcommittee of the US House of Representatives Committee on Energy and Commerce, DHS officials said they are worried the wealth of technical details and code samples from Stuxnet could lead to clones that similarly target critical infrastructure in the US.

    "Looking ahead, the Department is concerned that attackers could use the increasingly public information about the code to develop variants targeted at broader installations of programmable equipment in control systems," "

    Are DHS officials worried that targeted programmable equipment in critical infrastructure control systems are its human assets/worker drones/systems administrators? For they are surely vulnerable to programming with advanced intelligence via a multitude of novel information sharing and product placement channels.

    Of course, such could/would/should be the very valid concern of any and all Intelligent Service Providers for there is every reason to suppose that it is a current program running silent phish and stealthy embed projects in specific control and lucrative command areas of high interest and third party concern.

  7. Roger Varley

    I wonder ...

    ... why the words "hoisted", "petard" and "by their own" come to mind.

    1. The Flying Dutchman
      Unhappy

      awww...

      ... you beat me to it.

  8. Aitor 1

    just disconnect them

    Ok, these systems are not going to be secure.

    So why not put them in their own VLAN or, better still, their own subnetwork?

    As for getting things into the systems, use a "bridge" system... and THAT should be the key: a relay system that only accepts industrial input.
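    Something like this rough sketch is what I have in mind for the bridge: a tiny TCP relay in front of the kit that only passes Modbus-style read requests. (The addresses, ports and allowed function codes below are made-up examples for illustration, not a drop-in product.)

    #!/usr/bin/env python3
    # Toy "bridge" relay: forwards only Modbus/TCP read requests to the PLC.
    # Purely illustrative - the addresses, ports and allow-list are assumptions.
    import socket
    import struct

    LISTEN_ADDR = ("0.0.0.0", 1502)     # hypothetical office-facing port
    PLC_ADDR = ("192.0.2.10", 502)      # hypothetical PLC on the process subnet
    READ_ONLY_FUNCS = {3, 4}            # read holding / input registers only

    def frame_is_allowed(frame: bytes) -> bool:
        # Well-formed Modbus/TCP header (MBAP) with a read-only function code.
        if len(frame) < 8:
            return False
        _tid, proto, length, _unit, func = struct.unpack(">HHHBB", frame[:8])
        return proto == 0 and length == len(frame) - 6 and func in READ_ONLY_FUNCS

    def relay_once(client: socket.socket) -> None:
        try:
            request = client.recv(260)  # Modbus/TCP frames are small
            if request and frame_is_allowed(request):
                with socket.create_connection(PLC_ADDR, timeout=5) as plc:
                    plc.sendall(request)
                    client.sendall(plc.recv(260))
        except OSError:
            pass                        # drop on any network error
        finally:
            client.close()              # anything that isn't a plain read is dropped

    def main() -> None:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind(LISTEN_ADDR)
            srv.listen(5)
            while True:
                conn, _addr = srv.accept()
                relay_once(conn)

    if __name__ == "__main__":
        main()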

    There it is, fixed for you...

    1. Anonymous Coward

      Or...

      All you really need to do is separate them in the same way that, say, ATMs are separated from a bank's network. They run XP, or later versions of Windows, but are locked down and firewalled off.

      You can run a perfectly secure system on Windows NT/XP/etc.; you just have to think about it. The same goes for Linux and Unix...
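      For what it's worth, it's easy to check whether a "locked down" box really exposes only what you think it does. A quick sketch (the target address and the allowed port are just examples I've made up):

      #!/usr/bin/env python3
      # Minimal port-exposure audit: flag anything listening outside an allow-list.
      # The target host and allowed ports are illustrative assumptions.
      import socket

      TARGET = "192.0.2.50"              # hypothetical locked-down ATM-style host
      ALLOWED_PORTS = {443}              # e.g. only the TLS link back to the bank
      PORTS_TO_CHECK = range(1, 1025)    # well-known ports; widen as needed

      def is_open(host: str, port: int, timeout: float = 0.3) -> bool:
          with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
              s.settimeout(timeout)
              return s.connect_ex((host, port)) == 0

      unexpected = [p for p in PORTS_TO_CHECK
                    if p not in ALLOWED_PORTS and is_open(TARGET, p)]
      if unexpected:
          print("NOT locked down; unexpected open ports:", unexpected)
      else:
          print("Only the approved ports answered.")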

  9. Trygve Henriksen
    Coat

    What I don't understand is...

    How are SCADA systems vulnerable to worms?

    I mean, it's not as if people connect systems controlling nuclear power plants and such directly to the internet, right?

    Sure, I understand the need for some systems to 'talk' to each other, but...

    It's still possible to lease dedicated lines. Or use analogue or ISDN dial-up, with all kinds of verification to block out the kiddies. (Not that a script kiddie would know what a modem is, or how to connect to it.)

    And if they absolutely have to connect through the internet, there's this marvellous tech called a VPN. There are firewalls that can block traffic, and all kinds of fun stuff.

    Mine's the one with the RSA dongle in the pocket...

    1. Naughtyhorse

      It's still possible to lease dedicated lines

      Not in the UK it isn't.

      It's more cost-effective to unilaterally bung everyone on an IP-based system, which is unacceptable for any secure / time-critical application.

      AFAIK Stux moved via USB keys - typically SCADA isn't connected by TCP/IP, but by a modern variant of RS-232. So code gets developed and tested in a sim, and shipped to remote sites on a flash drive in some dude's pocket.

      1. Trygve Henriksen

        No leased lines?

        What a bummer...

        Cost-effective is usually not the same as best when it comes to security.

        Frankly, I'd stick an old GSM modem in the systems, rather than trust a 'normal' internet connection.

        As for the flashdrive...

        Not only must that always be scanned by a separate computer (running a different OS and scanner package from the system that generated the content), but care should be taken to disable things such as Autorun for removable media on 'critical' systems.
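        A sketch of the kind of extra check that scanning computer could run: walk the drive, flag autorun.inf and executables, and log hashes for the audit trail. (The mount point and the extension list are my assumptions; this complements a proper scanner, it doesn't replace one.)

        #!/usr/bin/env python3
        # Extra check for a 'sheep dip' station: flag autorun.inf and executables
        # on a removable drive, and log SHA-256 hashes for the audit trail.
        # Mount point and suspect-extension list are assumptions for the sketch.
        import hashlib
        import sys
        from pathlib import Path

        SUSPECT_NAMES = {"autorun.inf"}
        SUSPECT_SUFFIXES = {".exe", ".dll", ".scr", ".lnk", ".vbs", ".js"}

        def sha256(path: Path) -> str:
            digest = hashlib.sha256()
            with path.open("rb") as fh:
                for chunk in iter(lambda: fh.read(65536), b""):
                    digest.update(chunk)
            return digest.hexdigest()

        def check(mount_point: str) -> int:
            flagged = 0
            for path in sorted(Path(mount_point).rglob("*")):
                if not path.is_file():
                    continue
                suspect = (path.name.lower() in SUSPECT_NAMES
                           or path.suffix.lower() in SUSPECT_SUFFIXES)
                flagged += suspect
                print(("SUSPECT" if suspect else "ok     "), sha256(path), path)
            return 1 if flagged else 0

        if __name__ == "__main__":
            sys.exit(check(sys.argv[1] if len(sys.argv) > 1 else "/media/usb"))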

        That's just common sense, though, so it probably won't be implemented in anything government-controlled...

      2. clanger9

        Re: It's still possible to lease dedicated lines

        ...and the new stuff is usually IP-enabled (see the IEC61850 communication standard).

        Combine that with the lack of availability of private lines and guess what's the cheapest way to get the control traffic back to headquarters? Yup, the good ol' public internet.

        Granted, there are ways to do this securely. Hint: trying to maintain an air gap isn't one of them.
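        One obvious candidate is making every link mutually authenticated and encrypted end to end, rather than trusting the transport. A minimal sketch with Python's ssl module (the hostname, port and certificate paths are placeholders, not anyone's real setup):

        #!/usr/bin/env python3
        # Minimal mutually-authenticated TLS client for hauling telemetry over
        # the public internet. Hostname, port and certificate paths are placeholders.
        import socket
        import ssl

        CONTROL_CENTRE = ("scada-hq.example.net", 8883)   # hypothetical endpoint
        CA_BUNDLE = "/etc/scada/private-ca.pem"           # trust only our own CA
        CLIENT_CERT = "/etc/scada/substation-42.crt"      # per-site client certificate
        CLIENT_KEY = "/etc/scada/substation-42.key"

        def open_secure_channel() -> ssl.SSLSocket:
            ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)  # verifies cert and hostname by default
            ctx.load_verify_locations(CA_BUNDLE)
            ctx.load_cert_chain(CLIENT_CERT, CLIENT_KEY)   # prove who *we* are, too
            raw = socket.create_connection(CONTROL_CENTRE, timeout=10)
            return ctx.wrap_socket(raw, server_hostname=CONTROL_CENTRE[0])

        if __name__ == "__main__":
            with open_secure_channel() as channel:
                channel.sendall(b"SUBSTATION42 feeder_load_kw=1234\n")  # toy telemetry record
                print(channel.recv(1024))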

  10. HalSpace2001

    Watch this informative video on TED

    http://www.ted.com/talks/ralph_langner_cracking_stuxnet_a_21st_century_cyberweapon.html

    1. Destroy All Monsters Silver badge
      Thumb Down

      That guy's a jerk

      Starts off by assuming that "Iran wants the bomb" (as we have been hearing ad nauseam since 1993 or thereabouts, with the only proof being scary cover artwork from The Economist), then proceeds from there.

      And who is that "we" this guy is talking about?

  11. Kurgan
    FAIL

    We are all doomed!

    Now, seriously... our IT industry runs on some very wrong assumptions. ("ass" is the word)

    We assume that software has to be insecure and prone to bugs and crashes, and that nothing can be done to fix it. We have Microsoft to thank for this. And after them, every software maker decided that it was OK to sell bug-ridden and insecure software to maximise profit. And since software crashed a lot, hardware vendors began selling crap hardware too; no one will notice anyway, since the computers keep crashing all the time.

    And here we are, running crap software on crap hardware. Oh, sure, it costs less than half (or maybe far less than half) of what good software and good hardware cost, but would you bet your life on it?

    Well, if you use crap software and crap hardware to run critical infrastructure, you are actually betting your life on it.

    I mean, come on... Windows should only be used to play video games, and any software that has blatant security issues like a hard-coded root password (which is not a bug, it's by fucking design) should just be laughed at and thrown away.

    PS: Please excuse my poor English.

  12. John Smith 19 Gold badge
    WTF?

    OMG. It builds a digital loop to spoof the control system while the hardware destroys itself.

    And the core is generic.

    Which raises an interesting question.

    Is this a failed sucker punch at Iran?

    Or could it be (like the Washington anthrax attacks) an attempted wake-up call to get people to take the threat *seriously*? By analogy with the Washington anthrax attacks, that would make the *most* likely developer a (very) unscrupulous security software supplier.

    Or both?

  13. Will Godfrey Silver badge
    FAIL

    eltiT

    Never saw that coming. Oh noes.

  14. Anonymous Coward
    FAIL

    "scanned by a separate computer" (and similar)

    This scanner of which you speak, it detects previously unknown exploits, does it?

    You do realise Stuxnet used multiple previously unknown exploits, don't you? (Call them zero-days if you will.)

    And the follow-ons will too.

    The scanner may look good to the clueless PHBs and the certified-Microsoft-dependent, but its actual security value to the organisation is negligible. The value of Windows as an underlying OS in critical applications is negative: it is not an asset, it is a liability, and the sooner the insurers pick up on this, the better.

This topic is closed for new posts.