Hack attack kills thousands of Aussie websites

Thousands of Australian websites have irretrievably lost their data and email files following a malicious security hack on Australian domain registrar and web host Distribute.IT. The company has been scrambling to save data and get customers back online or moved to safe servers since the security breach occurred over a week …

COMMENTS

This topic is closed for new posts.
  1. TomasF
    Coat

    All backups wiped

    To everyone who asks me when I'll give up duplicating to tape, I would hereby like to say "HAH!". And I'm not even going to mention offsite copies... oh... never mind.

    1. Aussie Brusader
      Facepalm

      @All backups wiped

      My thoughts exactly.

      This is why I always do my own backups and testing, and never rely on or trust a third party to have and follow correct procedures.

      In business, "Trust nobody, check everything" wins, or at least keeps your business alive.

    2. T J
      Paris Hilton

      The Tao Of Backup

      Yup, 2nded of course.

      So, DNS? Probably dead or at least very ill and in a nursing home.

      Cloud? Ooh that's dead, so very, very dead. :)

      Paris, just because she's a good generic symbol for public idiocy.

    3. melts
      Meh

      that's what i was pondering

      why no tape backups... (or any offline mechanism)

      seems like a beginner's mistake, not taking your data somewhere disconnected

      i assume they had offsite backup servers for the event of a natural disaster, and just left the links on and up... not the best, not the worst

      at least all their clients can just restore from their backups to a new provider, since they did do regular backups of their code and databases, right, right..?

      painful lessons learnt, i hope some people reading this are going into their hosted sites panel and getting copies of the db and whatnot, or urging their customers to pay up for a regular backup service :P
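For anyone taking that advice, a minimal sketch of keeping your own dated copy (the directory names are placeholders, and a real run would pull the web root and a database dump down from the host rather than using a local demo directory):

```shell
set -eu

# Stand-in for your real web root; in practice, download it from the host.
mkdir -p demo_site
echo "<html>hello</html>" > demo_site/index.html

# One dated directory per backup run.
STAMP=$(date +%Y-%m-%d)
mkdir -p "backups/$STAMP"

# Archive the site files. A real setup would also dump the database, e.g.
#   mysqldump --single-transaction mydb | gzip > "backups/$STAMP/db.sql.gz"
tar -czf "backups/$STAMP/site.tar.gz" demo_site
```

Run from cron, this leaves you a trail of dated copies that survive anything that happens to the host.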

    4. Jonski
      FAIL

      Here's some Fail for you, and Fail for you, and you and you.

      From another report of this mess... "I think I'm in shock ... I have lost everything .... I couldnt possibly replicate all those years of work again ... my whole lifes work is gone down the drain," wrote one.

      How does someone entrust another party with their life’s work, with no copies of it themselves?

      Epic, epic FAIL.

    5. Anonymous Coward
      Anonymous Coward

      Yes,

      TFA does rather suggest that backups were on disk, hot-connected to the servers involved, which does seem a little careless. (As in my experience, when a machine dies, Windows Server does occasionally take even connected USB sticks with it.)

  2. El Cid Campeador
    Mushroom

    I must agree with Tomas

    Where the effin' heck are their offsite (or at least offline) backups? (I know, I know, huge files, tons of data, blah blah blah... still...)

    Of course, maybe, just maybe, you the customer should have a backup of what you upload??

    So, friends, do you STILL want to outsource your enterprise? How is that hopey-cloudy thing working out, eh?

  3. Head
    FAIL

    Hmmm

    Backups.

    There is a reason they exist.

    Backup Policies.

    There is a reason they exist too.

  4. Grumpy Old Fart
    Trollface

    secure, I mean really secure. No, really.

    We have a team of Malaysian students who meticulously copy all our data down on reams of paper in binary format, and then photocopy those pages, and store them in climate-controlled rooms on two separate sites, so if we are ever hacked and lose our data we can reconstruct it.

    Of course, the team are currently 200-strong and about 3 years behind with the transcription process, but it's still a lot better than this newfangled fancy-dancy "cloud" rubbish.

  5. Pierson
    Meh

    Store your own backups...

    Hum...

    So, how many of these websites (especially the commercial ones) keep the primary copy of their data on the customer's local servers, where it is also fully backed up, including offline copies?

    That copy is then used to regularly synchronise the online servers at the ISP, so that the ISP-provided machines and accounts are merely an easily replaced conduit for traffic.

    Now, hands up everyone who simply relies on their ISP, however cheapo, reliably hosting their data for ever and a day with no loss whatsoever...

  6. Stuart 22
    WTF?

    Who is the villain?

    Something wrong here. The hacker appears to have highlighted a big hole in the hoster's backup policy. That is unforgivable. It's very hard to keep a server safe; that's why backups are all the more important.

    The worst a hacker should be able to achieve is wiping the server and possibly poisoning the last backup or two. That's why you should always archive backups: then you can work your way back to a safe position and minimise the loss.
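One way to sketch that "always archive backups" advice: each run writes its own dated archive and only the oldest generations are pruned, so a poisoned latest backup never overwrites the good ones (the retention count of 30 is an arbitrary assumption):

```shell
set -eu

# Demo data directory standing in for a real server's filesystem.
mkdir -p data archive
echo "important" > data/file.txt

# Each backup is a new dated archive; nothing is overwritten, so older
# generations survive even if the most recent backup is poisoned.
STAMP=$(date +%Y-%m-%d_%H%M%S)
tar -czf "archive/backup-$STAMP.tar.gz" data

# Prune, keeping only the newest 30 generations (GNU xargs -r: no-op
# when there is nothing to delete).
ls -1t archive/backup-*.tar.gz | tail -n +31 | xargs -r rm --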

  7. Jim Bob

    Only 11 days later...

    http://www.itnews.com.au/News/260306,distributeit-hit-by-malicious-attack.aspx

  8. Anonymous Coward
    Trollface

    hmmm

    Our backup policy was just fine! We were taking one of these fangled snapshots every day!

  9. Black Betty

    To all of the above. Tera(peta?)bytes.

    BOFH descriptions of "industry best practice" describe constant, off-site, hot duplication of data.

    So if I were to want to do something like this, perhaps I would come at the "problem" bass-ackwards. After compromising the main system, I'd poison only the backups over enough time to "get" them all, and only then take down the main system.

  10. Anonymous Coward
    Anonymous Coward

    No backups?

    Did the customers pay for backups, a snapshot, both, or neither?

    Not all failures like this, and data losses, are down to the company.

    1. Anonymous Coward
      Anonymous Coward

      Spot on.

      Any purchase is 'buyer beware'. If you don't understand what you are buying, get someone involved who does.

      If it is a core capability, keep it in house, test, audit etc.

  11. Bill Coleman
    FAIL

    one more backup rant

    so the hosting company had no disconnected/offline backups or offsite tapes. many of the customers had no personal local copies of their data. the data that so many people's livelihoods entirely depended on. are you freaking kidding me?! we're going to see more and more of this with budget "cloud" services appearing all over the place.

This topic is closed for new posts.