Ah, another day, another government initiative designed to educate users about cyber risk. The Canadian government has declared October “Get Cyber Safe” month. It has a web site, too, which advises users on how to avoid getting pwned. The advice list includes updating your malware signatures and not giving out your password. …
I'd love to see the gamification idea, however well-intentioned, square up against the wall of British cynicism and dry humour.
I can just imagine the latest virtual reward being hotly debated over a round in the pub afterwards... (to the utter embarrassment of the winner, that is).
On the other hand, a 50-buck (or 50-quid) gift card might get some positive attention.
Ok, so, security is hard because it has this reputation for being cumbersome and hopelessly in the way. Stands to reason because it's long been just shoveled in and indeed, been hopelessly in the way. Essentially, that's attempting to secure things by unspecific blanket because the techies already know there's sensitive data in them thar servers and let's not lose it, hmkay. So why not start there? Actually, there's an even better place to start.
That is to sit down and do a bit of DR Q&A*. Things like "what would happen if $info got copied and sold to the highest bidding competition?" What really are the most important assets that you don't want to lose, don't want to see others hare off with? That's a wonderful focus on securing right there. You'll get a much better response if people know when to care and when they're allowed to slack off a little. Much less tiring that way.
Then get down to practicalities. And I don't mean so much mapping out who can have what access. While a good idea in theory, access moves too fast in practice to set in stone. And it's the disconnect between what security forces people to do and their expectation of being able to get their work done where it bites.
So things like easy handing out of access to those that need it are pretty important. The rub lies in making sure that the ability to hand out access matches the burden of responsibility. If bosses want to do stupid things, well, that's up to them. Just make sure it's documented who did it. Make it easy to hand out, and natural to take back, and not just upon termination. Make sure that the people doing the handing out understand what it means and that it's their rep on the line in trusting whomever they're handing the access to. Make sure the ability to do it, the understanding of the implications, and the responsibility for it all coincide.
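The grant-and-take-back idea above can be sketched in a few lines. This is purely a hypothetical illustration (all names are made up): every grant records who handed it out, and expiry is the default rather than waiting for termination.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical sketch: each grant records the grantor, so the ability to
# hand out access and the responsibility for it coincide, and every grant
# expires by default ("natural to take back") instead of lingering forever.

@dataclass
class AccessGrant:
    resource: str
    grantee: str
    granted_by: str   # the grantor's rep is on the line
    granted_on: date
    expires_on: date

class AccessRegistry:
    def __init__(self):
        self.grants = []
        self.audit_log = []  # documented who did it, stupid or not

    def grant(self, resource, grantee, granted_by, days=90):
        today = date.today()
        g = AccessGrant(resource, grantee, granted_by,
                        today, today + timedelta(days=days))
        self.grants.append(g)
        self.audit_log.append(
            f"{granted_by} granted {grantee} access to {resource}")
        return g

    def active(self, on=None):
        on = on or date.today()
        return [g for g in self.grants if g.expires_on >= on]

reg = AccessRegistry()
reg.grant("payroll-db", "alice", "boss_bob", days=30)
print(reg.audit_log[0])  # the paper trail: who handed it out
```

The point of the sketch is the shape, not the code: the grantor is part of the record, and taking access back is the passage of time, not a cleanup task someone has to remember.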
Then work hard to integrate security to be a natural part of the workflow. About as much effort as unlocking a door, going through, and locking it again, is reasonable for most casual use. Some things need to be streamlined, other things might be justified in being more trouble. There, too, careful arrangement can make a lot of difference.
Now look at what sort of hoops traditional IT "securing" expects people to jump through. That gap, right there, is the chasm to overcome. All the rest is fluff, bells and whistles, nice ideas but not enough. Understand just what it is you're trying to secure first, and that really isn't a technical thing.
* For the rest of us: Play what if... with various disastrous things and what that'd do to the company. That sort of thing you need to know anyway, might as well exploit it for security streamlining and saving some costs by not securing that which doesn't need securing.
If you pick up a virus on your company machine, you get the sack.
If you give away a password to anyone, socially engineered or not, you get the sack.
If you cause the leak of any sensitive data, you get the sack.
If you bring a pendrive to work, you get the sack.
If you are the boss and you want to override IT department recommendations and use a silly password, the board and shareholders are informed, and you get the sack.
If everyone were this strict, antivirus software would be a thing of the past, as people would actively try to verify the source and content of what they install or download and err on the side of caution.
Only zero-day vulnerabilities would cause a problem for assigning blame, but every transaction on the network should be logged for auditing.
People will learn, if ignorance leads to hunger.
Plus ça change...
That's not quite the what-if I had in mind. You're thinking about what to do to the users here, and the result isn't very workable, so that's wishful thinking. What would be more productive is to integrate DR and security in such a way that data leaks and the like become quantifiable risks to the business. That is, find the pieces of information that would be most damaging if leaked or lost, and secure those. Then make it as easy as need be to work with that information in a secure fashion, and make it hard to work with it otherwise. Only then should you try to blame users for skipping the easy, right thing in favour of the damaging-to-the-company thing that is now self-evidently stupid and actually harder to do. Then getting the sack is fully justified. Without that, it isn't, and you're landing the company in a quagmire of unfair-dismissal litigation and other nasties.
For strict to work it has to be fair, too.
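The "quantifiable risks" point above boils down to simple arithmetic: expected loss is likelihood times impact, and you secure from the top of the ranking down. A rough sketch, with entirely made-up assets and numbers:

```python
# Hypothetical sketch: rank assets by expected loss (likelihood x impact)
# so the "what if $info got leaked?" exercise yields numbers, not hand-waving.

assets = {
    # name: (annual likelihood of leak/loss, damage if it happens)
    "customer list":  (0.10, 2_000_000),
    "payroll data":   (0.05,   500_000),
    "cafeteria menu": (0.90,         0),
}

def expected_loss(likelihood, impact):
    return likelihood * impact

ranked = sorted(assets, key=lambda a: expected_loss(*assets[a]), reverse=True)
print(ranked)  # secure the top first; don't bother securing the bottom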
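```

The payoff is the cost saving the footnote mentions: the cafeteria menu leaks nine years out of ten and it still ranks last, so nobody wastes effort locking it down.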
Does their advice include telling people not to use Winblows?
Turtle_Fan, you're a seer... Here's your first cynic :D
Or they could do it the right way
re: "Or mobile workers could be rewarded for connecting their managed laptops to the virtual private network for scheduled patching and maintenance."
My remote users are required to have current (and running) AV when they log in. The only user this seems to impact is the president of the company, because the host checker (when combined with his own changes to Windows) borks his machine on occasion.
It also forces them to maintain good (well, complex) passwords and change them on a regular basis.
Also, requiring them to log in to a domain account ensures that they connect to the corporate network often enough to keep their cached password current.