Hacker launches ransomware rescue kit

Security bod Jada Cyrus has compiled a ransomware rescue kit to help victims decrypt locked files and avoid paying off crooks. The kit sports removal tools for common ransomware variants, along with guides on how to perform the necessary tasks. Cyrus recommends users not pay ransoms, as doing so sustains the criminal business …

  1. frank ly

    Procedures

    "... some ransomware will quietly encrypt and decrypt data on-the-fly for months in a bid to spoil backups."

    Regular sample recoveries are a good idea.

    1. Anonymous Coward
      Unhappy

      Re: Procedures

      "Regular sample recoveries are a good idea."

      You mean you don't have the luxury of restoring data for people on an almost daily basis...lucky you.

    2. Bloakey1

      Re: Procedures

      "Regular sample recoveries are a good idea."

      Agreed. With or without ransomware, it is nice to know that the file you see is not corrupted, encrypted or sitting on a bad block, etc.

    3. Joey M0usepad Silver badge

      Re: Procedures

      I'm not sure how a sample recovery would tell you anything. If the file is getting encrypted but remaining useable for months - how will restoring a backup change anything?

      Admittedly, I don't know how the malware could encrypt quietly - i.e. have the file still open normally, yet still be encrypted when it wants it to be - unless there's an active transparent decrypter running. But that would only work if the files are only accessed by the one infected machine, which is not that likely in a commercial environment.

    4. The_Idiot

      Re: Procedures

      They are - and yet can still be, in some circumstances, impractical.

      For a business user? One would hope so - though all too often the backup is carried out religiously and trial restores, um, not so much. But for the domestic, Jill (or Joe) Public user?

      Even if they do backups (or if their helpful IT relative set them up with a backup), how many have 'spare' systems to restore to (I'm sure I don't have to explain why a test restore onto the source machine has potential problems)? How many would know _how_ to restore? How many would know if their backups are all or nothing in restore terms, or more granular (specific files)? How many would know the difference between an absolute and a relative file path, and how to make sure a test restore _doesn't_ get copied over the current 'live' version? And that's before we get into the complexities of whether just restoring a file here or there lets you know if things still actually work in terms of a full system restore, in the absence of a second system to restore the test to.
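
      As a rough illustration of the kind of test restore being described here - pulling a random sample of files out of the backup into a scratch directory, never over the live copies, and comparing checksums - something like the Python sketch below would do. The paths are hypothetical, and where a proper backup tool is in use its own restore command is the better choice. A mismatch may simply mean the file changed since the last backup, so treat it as a prompt to look rather than proof of trouble.

        import hashlib
        import random
        import shutil
        from pathlib import Path

        LIVE_ROOT = Path.home() / "Documents"        # assumed live data location
        BACKUP_ROOT = Path("/mnt/backup/Documents")  # assumed mirror-style backup
        SCRATCH = Path("/tmp/test-restore")          # restores land here, never over live files
        SAMPLE_SIZE = 5

        def sha256(path: Path) -> str:
            h = hashlib.sha256()
            with path.open("rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            return h.hexdigest()

        def main() -> None:
            SCRATCH.mkdir(parents=True, exist_ok=True)
            backed_up = [p for p in BACKUP_ROOT.rglob("*") if p.is_file()]
            for bak in random.sample(backed_up, min(SAMPLE_SIZE, len(backed_up))):
                rel = bak.relative_to(BACKUP_ROOT)
                restored = SCRATCH / rel
                restored.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(bak, restored)          # "restore" to scratch only
                live = LIVE_ROOT / rel
                if live.exists() and sha256(restored) == sha256(live):
                    print(f"OK      {rel}")
                else:
                    print(f"DIFFERS {rel}  (check before trusting this backup)")

        if __name__ == "__main__":
            main()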

  2. Anonymous Coward
    Anonymous Coward

    Amusingly...

    Someone in my company managed to infect one of the less important network drives with one of these today. They're right in the middle of restoration of last night's backup.

    I'm fairly surprised something like this hasn't happened sooner. Everyone here has full access to every file on almost every drive (although cross-office access is more restrictive). I've accidentally moved entire project directories into other project directories before. It wouldn't take much for someone to accidentally lean on the delete key and wipe out everything that wasn't nailed down.

    1. Velv
      Pirate

      Re: Amusingly...

      Damned if you do, damned if you don't.

      It's amazing how many businesses are open like this.

      Lock it down and all the users do is complain how much it "stops them doing their job". Infection causes a service outage and all you hear from the users is how much it "stops them doing their job and why weren't we protected against this".

  3. Nifty Silver badge

    Detection better than cure?

    The 'test your restores often' idea is obvious, but could this not be made easier by a regular scan from a utility that can test whether your online files are encrypted or not and flag them up accordingly? Such a utility would need to detect and bypass silent decryption being done by malware.

    1. Anonymous Coward
      Anonymous Coward

      Re: Detection better than cure?

      Won't all the other non-infected users do much the same job? If files are encrypted they'd notice. The example I saw recently was good enough to put a new file extension on too, just to confirm its presence :)

    2. Adam 1

      Re: Detection better than cure?

      How does one detect that a file is encrypted? It is just a sequence of 1s and 0s until an application decides how to process it. Detection online just moves the problem further down the stack. Take an xlsx file as an example. It is just a zip file holding a set of XML documents and other artifacts. What makes it valid, or valid to an online scanner? Is a valid zip file header enough? If so, you can expect the encrypted XML documents to be added to a valid zip file. It is a seriously hard problem to solve. Regular test restores to clean VMs are the best we have at the minute.
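
      To make the point concrete, here is a naive structural check of the sort an online scanner might attempt, written as a Python sketch: treat an .xlsx as "probably intact" if it is still a readable zip whose workbook XML parses. It is only a heuristic, and, as argued above, anything that merely inspects the container can be fooled by wrapping ciphertext inside a valid zip.

        import sys
        import zipfile
        from xml.etree import ElementTree

        def looks_like_intact_xlsx(path: str) -> bool:
            if not zipfile.is_zipfile(path):
                return False                       # whole-file encryption destroys the zip header
            try:
                with zipfile.ZipFile(path) as z:
                    if z.testzip() is not None:    # CRC failure on some member
                        return False
                    with z.open("xl/workbook.xml") as member:
                        ElementTree.parse(member)  # must still be well-formed XML
                return True
            except (KeyError, zipfile.BadZipFile, ElementTree.ParseError):
                return False

        if __name__ == "__main__":
            for name in sys.argv[1:]:
                print(name, "looks intact" if looks_like_intact_xlsx(name) else "SUSPECT")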

      1. Nifty Silver badge

        Re: Detection better than cure?

        By planting some reference files that can be compared?

        And the only help that encryption detection can offer is to warn you not to overwrite your backups.
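
        As a minimal sketch of that "planted reference files" idea: scatter a few canary documents that nothing legitimate should ever touch, record their hashes, and refuse to rotate the backups if any of them change. The share paths and the way this would hook into a backup job are assumptions for illustration only.

          import hashlib
          import json
          import sys
          from pathlib import Path

          CANARIES = [
              Path(r"\\fileserver\shared\00_canary.docx"),  # hypothetical planted files
              Path(r"\\fileserver\shared\zz_canary.xlsx"),
          ]
          STATE = Path("canary_hashes.json")

          def digest(path: Path) -> str:
              return hashlib.sha256(path.read_bytes()).hexdigest()

          def record_baseline() -> None:
              STATE.write_text(json.dumps({str(p): digest(p) for p in CANARIES}, indent=2))

          def canaries_untouched() -> bool:
              baseline = json.loads(STATE.read_text())
              return all(digest(p) == baseline[str(p)] for p in CANARIES)

          if __name__ == "__main__":
              if len(sys.argv) > 1 and sys.argv[1] == "baseline":
                  record_baseline()
              elif canaries_untouched():
                  print("Canaries intact - safe to rotate backups.")
              else:
                  print("A canary file has changed - do NOT overwrite existing backups.")
                  sys.exit(1)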

  4. TeeCee Gold badge
    Facepalm

    System administrators caught out by ransomware without recent clean backups must first avoid panic

    Difficult, that one, as the bit in bold would suggest a P45 moment is looming in their near future.

  5. Anonymous Coward
    Anonymous Coward

    Yes , "dont panic" and "remove machine from network and image it"

    are great tips .

    What this story isnt telling us is how you decrypt your files. Or is the intention just to prevent more files getting encrypted.

    also what I'd like to know is - If you see files getting encrpted on shared network drives how do you know where the infected machine is? My companys response recently was to run around unplugging the PC of anyone who was kind enough to alert I.T to the issue!

    1. Doctor Syntax Silver badge

      Did you follow the link in the article? The one that tells you what the kit does, including - no, look it up for yourself.

    2. Mpower181

      "also what I'd like to know is - If you see files getting encrypted on shared network drives how do you know where the infected machine is?"

      What I do is check the properties of the file for the last person that modified it; chances are the virus is on the device they're logged in to.

      Just unplug it from the network and check the local drives, as they're likely encrypted also.

      All the ones I've dealt with so far have scanned the drives\folders alphabetically. Maybe a user-based quota system, with a network E: drive for all users containing some very large files of different types that, when infected, bump the user over their quota limit, thus stopping the virus from modifying any more files?

      Either that, or something that can monitor open files per user and, if too many get modified too quickly, alerts the user.
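
      A rough sketch of that last idea - alerting when too many files change too quickly - could look like the following. The share path and thresholds are invented; working out which user was responsible would still mean checking the file ownership details described above.

        import time
        from pathlib import Path

        WATCHED = Path(r"\\fileserver\shared")  # hypothetical network share
        INTERVAL_SECONDS = 60
        THRESHOLD = 50                          # changed files per interval before alerting

        def snapshot(root: Path) -> dict:
            """Map every file under root to its last-modified time."""
            return {str(p): p.stat().st_mtime for p in root.rglob("*") if p.is_file()}

        def main() -> None:
            previous = snapshot(WATCHED)
            while True:
                time.sleep(INTERVAL_SECONDS)
                current = snapshot(WATCHED)
                changed = [p for p, mtime in current.items() if previous.get(p) != mtime]
                if len(changed) > THRESHOLD:
                    print(f"ALERT: {len(changed)} files modified in the last "
                          f"{INTERVAL_SECONDS}s - possible ransomware activity")
                previous = current

        if __name__ == "__main__":
            main()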

  6. Doctor Syntax Silver badge

    New OS approach needed

    ISTM that it's time to rethink the whole architecture of applications and OS.

    What I have in mind is that permissions would be based on a combination of user ID and application ID. For instance, only Twitbook would be able to write to Twitbook storage. If Facegram needed to read something from Twitbook's storage, it would have to have been given explicit permission as to what it could read; it would only be able to read from a specific user's storage and it wouldn't be allowed to write back.

    A way of implementing this would be to separate applications into front-end and back-end with back-end being something along the lines of a kernel module. The actual kernel itself would have much reduced facilities; it might be able to enforce quotas but it wouldn't be able to duplicate or over-ride the back-end kernel modules' reading and writing privileges. In some respects a micro-kernel architecture would fit but any existing micro-kernel would have to be enhanced with the extended permissions.

    Ideally this should prevent any rogue app getting in and overwriting everything. At worst, if a rogue managed to pass itself off as, for instance, Instanter, it wouldn't be able to encrypt Twitbook and Facegram data.
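
    A toy model of that permission scheme, sketched in Python: access is granted on the combination of user and acting application, not the user alone, so a rogue process posing as one app cannot touch another app's storage. The application names are the invented ones above (Twitbook, Facegram, Instanter), and the policy table stands in for whatever the kernel-side enforcement would actually be.

      from dataclasses import dataclass

      READ, WRITE = "read", "write"

      # (user, acting application, application that owns the data) -> allowed operations
      POLICY = {
          ("alice", "Twitbook", "Twitbook"): {READ, WRITE},
          ("alice", "Facegram", "Twitbook"): {READ},       # explicit, read-only grant
          ("alice", "Facegram", "Facegram"): {READ, WRITE},
      }

      @dataclass
      class Request:
          user: str
          app: str    # the application making the call
          owner: str  # the application whose storage is being touched
          op: str

      def allowed(req: Request) -> bool:
          return req.op in POLICY.get((req.user, req.app, req.owner), set())

      if __name__ == "__main__":
          # A rogue passing itself off as Instanter gets nowhere near Twitbook's data.
          print(allowed(Request("alice", "Twitbook", "Twitbook", WRITE)))   # True
          print(allowed(Request("alice", "Facegram", "Twitbook", WRITE)))   # False
          print(allowed(Request("alice", "Instanter", "Twitbook", WRITE)))  # False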

    1. This post has been deleted by its author

    2. phil dude
      Linux

      Re: New OS approach needed

      Well, in *nix space there is copy-on-write. It is probably ancient tech, as NetApp had a really good version of it and they did not invent it.

      I'm using ZFS on Linux. Perfect? No. But it works. There is BTRFS, but....well I'm not touching it for a while...

      And yes, you need to back up offsite to tapes, or you really don't care about your data *enough*...

      P.
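
      For what it's worth, here is a minimal sketch of what the copy-on-write point buys you on ZFS: take a read-only snapshot on a schedule and prune the oldest, so a ransomware pass over the live filesystem cannot rewrite history (though an attacker with root could still destroy snapshots, and none of this replaces off-site copies). The dataset name is an assumption; it would be run from cron or a systemd timer with the necessary privileges.

        import subprocess
        from datetime import datetime, timezone

        DATASET = "tank/home"  # hypothetical ZFS dataset
        KEEP = 30              # number of automatic snapshots to retain

        def run(*args: str) -> str:
            return subprocess.run(args, check=True, capture_output=True, text=True).stdout

        def take_snapshot() -> None:
            stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
            run("zfs", "snapshot", f"{DATASET}@auto-{stamp}")

        def prune_old_snapshots() -> None:
            out = run("zfs", "list", "-t", "snapshot", "-H", "-o", "name",
                      "-s", "creation", DATASET)
            snaps = [s for s in out.splitlines() if "@auto-" in s]
            for old in snaps[:-KEEP]:
                run("zfs", "destroy", old)

        if __name__ == "__main__":
            take_snapshot()
            prune_old_snapshots()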

      1. Robert Helpmann??
        Childcatcher

        Re: New OS approach needed

        And yes, you need to back up offsite to tapes, or you really don't care about your data *enough*...

        Or at least off-site. Using tape for backups is more of a corporate approach, but many people being targeted by this malware are home users. There are plenty of free and commercial options available for regular folks, so it is still good advice.

    3. Nifty Silver badge

      Re: New OS approach needed

      It's called iOS, and it's not popular with everyone.

  7. Tannin

    sigh

    Fond memories of the days when blank CDs cost 20c and you could fit all of your important stuff on one or two or three of them. Use write-once CD blanks (never re-writables) and every time you make a fresh backup, throw the older set into a shoe-box. When the shoe-box is full, put it in the shed and buy some new shoes. Result: an endless set of incorruptible backups, proof against anything bar fire, a maniac with a hammer, or your girlfriend having a little tidy-up.

    sigh
