For the past 20 years, computing resources have been available via three approaches: client/server, Software as a Service (SaaS) and thin client. Organisations typically chose one approach rather than providing services through several different techniques. This is changing. In the client/server world, the bulk of the work …
Every good sysadmin knows
security-wise, your data is everywhere and available to all, until the day you need to restore it from a backup. Then it is nowhere...
Scarce Resources - there's an expert out there. Chances are he works for the vendor.
It's worth bearing in mind that the increasing complexity of vulnerabilities and exploits means that simply staying on top of your patch schedule is no longer a guarantee of security. There is limited availability of technical resources with deep knowledge of each and every application and appliance running on your network. In some cases it could be more secure to have Microsoft or Google host your mail and Salesforce.com your CRM data, just as you have to trust your storage provider to manage your externally hosted SAN. Not everybody can manually re-code a corrupted file structure in hex.
The increasing abstraction in systems means greater exposure for the sysadmin to a variety of systems, but also a growing reliance on technical experts with deep knowledge of a particular solution to go beyond the GUI. The IT manager's role is changing from technical expert to risk and vendor manager. We end up enforcing best practice in process rather than best practice in configuration.
One other thing to note is that in the traditional client/server model data always travels over a protected network link - LAN/WAN for office users and an encrypted VPN tunnel for road warriors/home workers. By contrast, data accessed over the internet via the browser needs end-to-end encryption, whether by SSL or another preferred method.
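As a minimal sketch of what "end-to-end encryption via the browser" means in practice (the function name and the plain-HTTP check are illustrative, not from the comment above), Python's standard library can enforce TLS with certificate and hostname verification:

```python
import ssl
import urllib.request

# ssl.create_default_context() enables certificate verification and
# hostname checking - the end-to-end protection discussed above.
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname is True

def fetch_secure(url: str) -> bytes:
    """Fetch a URL, refusing anything that is not HTTPS."""
    if not url.startswith("https://"):
        raise ValueError("refusing to send data over an unencrypted link")
    with urllib.request.urlopen(url, context=ctx) as resp:
        return resp.read()
```

The point is that, unlike traffic confined to a LAN or VPN tunnel, browser traffic crosses untrusted networks, so the client must verify both the encryption and the server's identity.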
In the 700-user firm I work for we are seeing a huge trend to move internally hosted systems to cheaper yet reputable service providers. Even our servers in an outsourced data centre are being decommissioned and replaced with new SaaS offerings.
> “Secure” no longer means requiring a username and a password. “Secure” means careful consideration of which access method is being used and the devices from which this method is employed.
I would sincerely hope this has always been true in best practice. It has certainly been ingrained in the advice and methodologies of the security folk I have heard, and in resources such as SANS, for as long as I can remember.
The good old days
"Secure means careful consideration of which access method is being used and the devices from which this method is employed."
"I would sincerely hope this has always been true in best practices."
Well, in the Golden Age, it was "any access method you want as long as it's a hardwired VT52 terminal/teleprinter/punch tape." So, considering the security of devices was not a lengthy exercise. Now, people are using personal phone devices to connect to enterprise computer systems. And management -approve-.
"“Secure” no longer means requiring a username and a password."
It never did unless you were a moron.
TIA by whom?
"Total Information Awareness" is different in the real world ("private sector") as contrasted to the State. To the modern State, TIA can best be paraphrased as "All your data are belong to us."