Alaskan borough dusts off the typewriters after ransomware crims pwn entire network

A ransomware infection has cast the Alaskan borough of Matanuska-Susitna (Mat-Su) back to the dark ages. The malware was activated in mid-July, infecting 60 of the borough's Windows 7 PCs. As the IT department tried to clean the infection and reset passwords using a script, the malware started "attacking back", spreading to …

  1. Crisp Silver badge

    Because no one has ever stolen records from a filing cabinet before.

    Although to upgrade security they could always hang a sign on it saying "Beware of the Leopard" or something.

    1. DougS Silver badge

      Re: Because no one has ever stolen records from a filing cabinet before.

      Not from halfway around the world they haven't.

      1. I3N
        Pint

        Re: Because no one has ever stolen records from a filing cabinet before.

        nor thousands of paper records except for some prolific secret hoarders ...

  2. Alister Silver badge

    The attackers gained Active Directory admin access

    Only criminal negligence, or deliberate criminal intent of an insider, could allow that to happen, surely.

    This doesn't sound like a happenstance ransomware or malware infection, but a deliberately targeted attempt to destroy the borough's IT.

    1. Warm Braw Silver badge

      a deliberately targeted attempt to destroy the borough's IT

      Given that their recovery plan involves using backups, some of them up to a year old, it seems at least possible that they may have pinned the target to their own forehead.

      1. Anonymous Coward
        Anonymous Coward

        @Warm Braw "Given that their recovery plan involves using backups, some of them up to a year old, it seems at least possible that they may have pinned the target to their own forehead."

        Yep, blame the victim, that always helps.

        1. AndrueC Silver badge
          Stop

          Yep, blame the victim, that always helps.

          Finding the root cause of a problem usually does. Attempting to gloss over it or 'move on' means less chance of anyone learning from the mistake. Absolutely it shouldn't be a witch hunt and no-one should lose their job over it unless criminal intent or utter incompetence is discovered. But those responsible need to be made aware of what they did wrong so that they work out how to stop it happening again.

          That's why I dislike the term 'car accident'. Dismissing such events as 'accidental' implies that there's nothing anyone could have done differently, and therefore no reason for anyone to change the way they drive.

          Things go wrong. Mistakes get made. People shouldn't be vilified over them but people who make mistakes should be told then helped to avoid repeating them.

        2. sprograms

          It seems that in our language and culture the definition of the word "victim" grows ever more inclusive. If blaming the victim is taboo, it becomes vitally important to qualify as a victim. In this way the negligent and foolish become immune to responsibility and critique.

      2. Korev Silver badge

        Given that their recovery plan involves using backups, some of them up to a year old, it seems at least possible that they may have pinned the target to their own forehead.

        They may have also gone that far back in time to make sure that they weren't restoring the trojan. I guess only the people doing the work know for certain though.

        1. big_D Silver badge

          Exactly, Korev - the story says that the backup infrastructure was also infected, so it doesn't follow that under "normal" circumstances they would have had to go back a year.

          1. vtcodger Silver badge

            It sounds like they MIGHT be able to eventually, very carefully, recover their data from the infected backups. Personally, I'd look into using a Unix system to do so, in order to minimise the chance of propagating their old infection back into their system once they get it decontaminated and running again.

            1. Anonymous Coward
              Anonymous Coward

              The barrier is not the infection, which can be controlled and contained, but the encryption.

              A good trapdoor function will leave you with no chance of getting the data back before the heat death of the universe... unless someone comes up with much better cracking tools in a hundred or a thousand years.
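The "heat death of the universe" claim survives a quick back-of-envelope check. Here's a sketch, assuming a 128-bit key and an optimistic trillion guesses per second; both figures are illustrative assumptions, not from the article:

```python
# Rough brute-force estimate for a 128-bit symmetric key, assuming an
# attacker testing a (very generous) one trillion keys per second.
keyspace = 2 ** 128
rate = 10 ** 12                          # keys tried per second (assumed)
seconds_per_year = 60 * 60 * 24 * 365

# On average you find the key after searching half the keyspace.
years = keyspace / (2 * rate * seconds_per_year)
print(f"{years:.2e} years on average")   # on the order of 10^18 years
```

For comparison, the universe is roughly 1.4 × 10^10 years old, so even this wildly optimistic attacker needs around a hundred million times that.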

          2. Warm Braw Silver badge

            the story says that the backup infrastructure was also infected

            It doesn't. It says their "disaster recovery" systems were infected. That's unfortunate, but it should simply have meant that they couldn't recover instantly; they should still have been able to recover, in reasonable time, to a recent point in history.

            If you're having to go back a year to find backups that aren't infected then you either didn't notice for 12 months that you were infected or your backup process is not worthy of the description. Copies of files sitting on active systems that are capable of being infected are not backups, they're hostages to fortune.

      3. Donn Bly

        Using Old Backups

        Seriously? For most people having a recovery plan that involves using backups is not only normal, it is part of best practices.

        The fact that some of the backups are a year old isn't abnormal either. If the source code of a software package hasn't changed in years, likewise artwork for logos, etc., then why NOT use a years-old backup that you know is safe?

        When restoring a backup in this situation you want the OLDEST backups that have the data you need, not the newest.

        They had "disaster recovery" servers. I read that as hot spares with automatically replicated data. Unfortunately, automatically replicated data means a lack of air-gap, so they got infected with everything else because they didn't consider this type of "disaster". How do you recover from that? Well, you bust out your second-tier recovery solution which is generally archived backups.

        Yes, this "security event" was enabled by insecure policies and practices. Most likely some administrator had made a decision that a network-wide share that housed executables needed to be read-write (or the applications used demanded it), and/or one or more people with admin access used their admin account daily instead of having a second account. Those two situations - found in the MAJORITY of small networks, cause this type of problem to go from "annoyance" or "major catastrophe"

        1. bombastic bob Silver badge
          Unhappy

          Re: Using Old Backups

          "a network-wide share that housed executables needed to be read-write (or the applications used demanded it) "

          Ack.

          I've griped at Micro-shaft before about putting WRITABLE files *anywhere* within the 'C:\Program Files' tree... MANY TIMES before. At one time, they were doing this with SQL Server, actual database files within that directory tree. The problem of writable 'executable file' directories goes right back up to the source, at Micro-shaft, where they had DESIGNED IT THIS WAY.

          In any case, that kind of hindsight won't fix the specific problem at hand (the ransomware encrypting things and spreading itself) nor get the data back. And if the machines hosting the various services are compromised, then malware with admin-access could simply do 'whatever' and not be stopped. So even with proper practice of "nothing writable in directories with executables in it" the admin-level access by the malware would overwrite things anyway and bypass all of that.

          It doesn't stop me from figuring that maybe, JUST maybe, the original vector _WAS_ something so simple like user-writable executable file directories. There was an 'outlook express' virus/trojan that did something like that, a while back, now wasn't there? And MSN Messenger (on by default) spread the thing, as I recall...

    2. big_D Silver badge

      If someone with high level access was spearphished, it is unfortunate and they need to look at their training. But even with good security, there is always a weak link somewhere that allows them in.

    3. Joe Montana

      Domain admin

      Hardening active directory to make attacks like this difficult (not impossible) requires significant investment, you need third party tools, and highly competent (ie expensive) staff. Chances are this organisation didn't have the budget required to hire such staff, or do so in sufficient numbers to manage and monitor a network of this size.

      If not suitably hardened, active directory is extremely easy to compromise and since it's often tied into everything - that means you now have control of the entire organisation and are extremely difficult to remove.

      1. a_yank_lurker Silver badge

        Re: Domain admin

        I would think the talent pool they have to hire from is relatively thin, as Alaska is not a hotbed of the IT industry. It is not like there is a lot of talent flocking there for jobs, as there is to many major US cities.

    4. hplasm Silver badge
      Windows

      The attackers gained Active Directory admin access

      "Only criminal negligence, or deliberate criminal intent of an insider, could allow that to happen, surely."

      Sounds like a policy decision to me.

    5. Anonymous Coward
      Anonymous Coward

      Only criminal negligence, or deliberate criminal intent of an insider, could allow that to happen, surely.

      ------------------------------------------------------------------------------------------------------------------------------------

      Not really.

      Once a remote root exploit is achieved against a relevant target then a technically adept attacker can bootstrap that into almost any level of access to anything on the network, including active directory servers, anti-malware servers, intrusion detection systems, and the like.

      It takes some skill and patience, but it is easily within the realm of the possible.

      The initial compromise does not need to be within the corporate network. A compromised offsite computer used for remote administration and tech support, for example, can yield all the information needed to gain administrative control over key servers and services. A keylogger and patience will eventually get you everything.

      For that matter, if our hypothetical administrator were to use a USB key to transfer data from a compromised computer (any kind) to machine(s) inside the network, you might not have to wait for the right logins to be captured on the first machine.

      Given the large number of zero day remote exploits, a persistent attack will eventually succeed.

      Also, once you are inside a corporate network, machines are often running some older (more easily compromised) software for compatibility purposes. There are still applications - often legacy customised applications that would take ages and piles of cash to re-implement or replace - around that insist on talking only to Internet Explorer, for example.

      Couple that with the disinclination of many executives for spending more than the minimum on security, redundancy, and testing, and this state of vulnerability is not surprising, nor is it often seen as an issue until it becomes a disaster. Humans are remarkably poor at risk estimation. Think of how many people are scared of terrorism or flying, but think nothing of riding in a car, eating rare hamburgers, or going skiing.

  3. Anonymous Coward
    Anonymous Coward

    Using windows in a business environment is negligent in and of itself.

    Insecure POS.

    Apart from that, the 7 infected PCs must still have been connected to the network for it to spread, even though they were known to be in that state.

    You need a second alarm system alongside the fire alarm, so that people switch off their PCs when it goes off - shite like this spreads faster than a fire.

    1. the future is back!

      Re: Using windows in a business environment is negligent in and of itself.

      But...but. Isn't it Windows 7 PCs - or all 7 of them? I am confused.

  4. big_D Silver badge

    1st rule of an IT system...

    When I learnt computing, the first thing we were taught was that when you implement a new IT system, you also document the manual procedures for carrying on working if those systems go down.

    It looks like they managed to cope reasonably well, given the circumstances, although I doubt the manual procedures were defined in the disaster recovery plan.

    1. Tuesday Is Soylent Green Day

      Re: 1st rule of an IT system...

      It must have been fun watching Millennials try to figure out how to use a typewriter....

      1. nuked

        Re: 1st rule of an IT system...

        Or their cerebrum

        1. the future is back!

          Re: 1st rule of an IT system...

          Or their chainsaws on Win 7 PCs

    2. DougS Silver badge

      "Manual procedures" are less and less possible

      What's Amazon's manual procedure supposed to be when their web site goes down? What's the backup plan for a company that runs a fleet of self-driving taxis if GPS goes out?

  5. frank ly Silver badge

    Eggs .... Basket

    "Networked telephones and email went down, door-card entry was disrupted, ..."

  6. adam payne Silver badge

    The attackers gained Active Directory admin access, compromising the controller to reconfigure its security settings.

    Ouch!

    The borough is now reimaging its systems using backups, some of them up to a year old. However, a lot of data such as email has been lost.

    I can imagine desktop / laptop images being out of date by a year, but losing data on servers - how did that happen? What was the backup process? Surely they weren't just doing disk-to-disk backup.

    1. Ragarath

      Backups Infected

      DR backups were also infected. It's all well and good having these swanky connected backup systems, but, as said, they are connected and thus can also be infected.

      I assume they were diligent and had offline backups (though not that diligent if they are a year old) and these were the ones being restored.

      1. bombastic bob Silver badge
        Unhappy

        Re: Backups Infected

        I can see the possibility of incremental backups turning into excremental backups if they don't do frequent "everything" backups in between...

  7. defiler Silver badge

    Day-to-day domain admin

    What are the odds that one/some of the admins used Domain Admin creds on their normal day-to-day account? You know - the one they open their email and browse the web with.

    Obviously I can't say that this is definitely what happened, but plenty of us have done it in the past, and have only been lucky enough to get out of the habit before something like this kicked off...

  8. Version 1.0 Silver badge

    Hack America Great Again

    At least this is putting Americans back to work ...

  9. Doctor Syntax Silver badge

    "We immediately started to isolate servers, took workstations off the network"

    The implication is that this happened after they tried to clean machines. If so it looks as if they were doing things in the wrong order.

  10. sanmigueelbeer Silver badge

    The attackers gained Active Directory admin access

    Game over, man. Game over!

    So first they were hit with BitPaymer, and then followed by Emotet? No way that's coincidental.

    Someone is hell-bent on taking down the system (and records).

  11. cdegroot

    Onehundredandfifty?

    "120 out of 150 servers" - 150 servers sounds like a tad much for a borough that servers 100,000 people. Couldn't find it in the linked status update either so I guess someone has been misreading something?

  12. Walter Bishop Silver badge
    Linux

    Matanuska-Susitna ransomware infection

    A ransomware infection has cast the Alaskan borough of Matanuska-Susitna

    By any chance, did this ransomware infection run under Microsoft Windows?

    1. bombastic bob Silver badge
      Linux

      Re: Matanuska-Susitna ransomware infection

      yeah, we all can pretty much interpret/know that the ransomware 'ran under windows'. however, it's worth pointing out that if you use a utility (like rsync) to back up files to a Linux box, FROM WITHIN the Linux box - so that it reads files from remote systems but does NOT allow those remote systems to write TO it - and does so in a manner that can restore files to a 'point in time' (i.e. the July 12th version of that particular file, before it got encrypted by malware), then having live systems doing daily backups isn't so much of a security risk, keeping them "on all of the time".

      However, I suspect in THIS case that such backup/recovery/disaster systems were, in fact, ALSO running windows...

      So yeah the basic model here would be for a Linux box to use standard utilities, maybe Samba, maybe rsync, or maybe some 3rd party backup software, such that the backup server PULLS the data [and does NOT get data PUSHED to it], and then LOCALLY files it someplace in a manner that allows for getting back "the state of things on a particular date/time". Anyway that's my $.10 on it, and a Linux server running those backups with its own security context could help to prevent network-wide malware from infecting the disaster recovery backups.
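The "pull, then file it by date" model bob describes can be sketched in a few lines. This is a minimal local stand-in, with illustrative paths and names; a real deployment would pull from the remote hosts over rsync/ssh rather than copy a local directory, and ideally hard-link unchanged files (e.g. `rsync --link-dest`) so each snapshot only costs the space of what changed:

```python
import shutil
from datetime import datetime, timezone
from pathlib import Path

def snapshot(src, dest, stamp=None):
    """Copy src into its own dated directory under dest.

    Each run produces an independent point-in-time copy, so malware
    that later encrypts the source cannot rewrite earlier snapshots.
    """
    stamp = stamp or datetime.now(timezone.utc).strftime("%Y-%m-%dT%H%M%SZ")
    target = Path(dest) / stamp
    # Stand-in for the backup server *pulling* data from a remote host;
    # nothing on the source side ever writes into the backup tree.
    shutil.copytree(src, target)
    return target
```

The key design point is the direction of trust: the backup box reaches out and reads, and nothing on the (potentially compromised) network has credentials to write to it.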

  13. Dropper

    Honesty

    "The attack is notable not only for the way it dismantled an entire organisation's computer infrastructure, but the remarkable honesty of the victims."

    While I was living and working in Alaska, it was always refreshing that the IT staff I worked with didn't bother to waste anyone's time trying to hide mistakes. There wasn't really any need to. We just didn't have the money to hire the expertise that would have prevented this kind of attack, let alone the software and hardware. Besides, it helps with post mortem troubleshooting when you can step through mistakes without having to worry about whether a job is on the line. People are a lot more honest when they see that owning a mistake results in trying to figure out how not to make it again rather than finger pointing and disciplinary actions.
