Re: Wouldn't fly in my office @Crazy
Um. How would this have helped in this case?
Presumably, all the users must have access to the file servers in order to copy the files there. And I'm guessing that these shares are mapped all the time.
So the malware follows every path it has access to, and encrypts all of the files it finds. This includes the files on the hot file server.
How is this the fault of any individual (apart from the person clicking the link)?
Having on-line copies on permanently mounted shares is no protection from this type of malware unless one of the following is true:
1. The copy is made by a high-privilege task that puts the copies in an area of the file servers that general users who may run the malware cannot write to.
2. The copy is made to WORM (write once, read many) devices, which do not allow files to be overwritten or deleted, only new versions to be created.
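The second option can be sketched in a few lines. This is a toy model, not a real WORM implementation (proper WORM media or object-lock storage enforces immutability below the OS, where malware and admins alike cannot undo it); the function name and layout are my own invention:

```python
import os

def store_version(store_dir, name, data):
    """Write `data` as a new, numbered version of `name`.

    Existing versions are never overwritten or deleted, mimicking
    WORM semantics: malware that "encrypts" a file merely creates
    another version alongside the intact earlier ones.
    """
    os.makedirs(store_dir, exist_ok=True)
    n = 1
    while os.path.exists(os.path.join(store_dir, f"{name}.v{n}")):
        n += 1
    path = os.path.join(store_dir, f"{name}.v{n}")
    # 'x' mode fails if the file already exists -- never overwrite
    with open(path, "xb") as f:
        f.write(data)
    os.chmod(path, 0o444)  # read-only once written (illustrative only)
    return path
```

Note the `chmod` is purely illustrative: the file owner or root can still strip it, which is exactly why real WORM protection has to live in the device, not in file permissions.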
Even having the backups done by a high-privilege task is not perfect unless multiple versions are kept in some form, as it may be overwriting good data with bad. You've still not prevented the problem: you said there is an (singular) offline replica, and that the server is continuously wiped and rebuilt from the backups, which implies that if the problem goes undetected, one backup-and-restore cycle later you're still screwed.
It strikes me that there is a general failure of file-sharing practice in many organisations. There ought to be a much finer-grained permission system, where a user only has permission to write to the parts of the file store that they need for their job. This would prevent wholesale encryption of the data, but would not completely solve the problem.
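To see why finer-grained write permissions limit the blast radius, here is a toy model (the department layout and user names are entirely hypothetical): malware running as a user can only "encrypt" what that user could write to.

```python
# Hypothetical per-area write permissions on a shared file store.
WRITE_ACCESS = {
    "alice":  {"shares/sales"},
    "bob":    {"shares/hr"},
    "backup": {"shares/sales", "shares/hr", "backups"},
}

def encryptable(user, files):
    """Return the files that malware running as `user` could overwrite.

    With everyone granted write access everywhere (the common setup),
    one clicked link encrypts the lot; with per-job write areas, the
    damage stops at that user's own corner of the file store.
    """
    allowed = WRITE_ACCESS.get(user, set())
    return [f for f in files
            if any(f.startswith(area + "/") for area in allowed)]
```

The `backup` account above is the dangerous one, of course, which is why the copy task in option 1 must write *to* a protected area rather than hold write access everywhere from an ordinary user's desktop.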
Couple this with a proper off-line backup system (where the malware cannot overwrite the media, either because it's not writeable by ordinary processes or because the media is physically unavailable), one which keeps copies of various ages (daily copies kept for a week, one copy per week for six weeks, one copy per month kept for an extended period, for example). Or use a managed backup solution with offline media that keeps multiple versions (TSM, ARCserve, Amanda etc.)
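The example retention schedule above can be expressed as a simple rule. The numbers are the ones from the text; the choice of Sunday as the weekly copy and the 1st of the month as the monthly copy are assumptions for illustration:

```python
from datetime import date

def keep_backup(made, today):
    """Decide whether a backup made on `made` is retained under the
    example schedule: dailies for a week, one per week (Sundays,
    an assumed choice) for six weeks, one per month (the 1st, also
    an assumed choice) kept for an extended period."""
    age = (today - made).days
    if age <= 7:
        return True                              # daily copy, kept a week
    if age <= 42 and made.isoweekday() == 7:
        return True                              # weekly copy, six weeks
    if made.day == 1:
        return True                              # monthly copy, long-term
    return False
```

The point of the tiered ages is that slow-burning damage (like an undetected encryption run) can be rolled back past the point of infection, which a single rolling replica cannot do.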
In the medium and large systems environment, this is a well-established process. I'm sure I'm preaching to the converted here, but the lesson just does not seem to sink in for some SAs.
I know that the amount of data kept is now quite huge, even for relatively small organisations, but it seems to me that some of the current IT world have totally ignored the best practices of previous generations.
This may be, of course, because management and the bean counters are allowed to squash the required good practice on cost grounds, and override any suggestions from their experienced technical administrators (or engineer them out of the company), in which case they (the management) should be held entirely responsible.
Oh, and seriously control the users' ability to run any code, trusted or untrusted, directly from web pages or emails. At least make it a two-stage process where they have to download it first and then explicitly execute it. It's not much protection, but it will prevent casual click attacks, and since running the code is an explicit action, it makes it easier to discipline the culprit. This should extend to scripts in any language.