Trusted Computing, the widely-derided idea of computing secured for, and against, its users, is back and the necessary hardware is already in the majority of pockets. When Intel and Microsoft tried to introduce Trusted Computing, under the Palladium brand, they were pilloried as betraying the freedoms which had made desktop …
Apart from the basic "signing is not security" quagmire -- which this doesn't address -- there's something more insidious going on, and it still hasn't been addressed. What riled everyone up back then was that a maker of software (also the byword for shoddy, rushed-out-the-door, fix-it-in-version-three software) planned to take away our control of the devices we own, in the name of making up for their deficiencies, through means that don't actually do that. We're still there, sort of.
Personally I don't mind code signing; it could be helpful if we figured out how to deploy it usefully. But what this brings starkly to the fore is the simple question of who owns the device. If some other party retains the keys, I don't really own it.
See, for example, Sony (in another shining example of how not to treat your customers -- oh, how the mighty have fallen) and their removal of the "Other OS" option on PS3 consoles. That they were on the wrong path was already clear when they installed a shoddy rootkit by way of thanking their customers for buying their music. A more benign example is the Smart car, where you don't buy the car but a "transportation service", meaning that if the thing breaks they'll fix it or replace it. Some people hadn't read the small print and were upset when they got a replacement back instead of their own car.
So changing the device usage model doesn't need to be a problem as long as everybody knows this to be the case and understands the implications, but you need to be careful about it. Palladium was anything but subtle. But the point remains: if I don't have the keys, I don't own the device. If it is sold as if I did own it, then I must also receive (all) the keys.
We may have to put that in law. Just to keep the vendors honest.
are belong to us.
I think the simple code signing approach doesn't fit the problems it is touted to solve. Not security, not "enabling innovation". It is only really useful for putting control of the device where some middleman can monetise the device and/or the user. I have some choice words for that, but let's focus on the technical aspect for the moment.
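To make the "signing is not security" point concrete, here's a toy sketch. An HMAC stands in for a real public-key signature scheme (the key and the apps are made up); the point is the same either way: a validly signed binary passes verification whether or not it's harmful, because the signature only attests to who shipped the code, not what it does.

```python
# Toy sketch: why "signed" is not "safe".
# HMAC is used as a stand-in for a real public-key signature scheme.
import hashlib
import hmac

VENDOR_KEY = b"vendor-signing-key"  # hypothetical key held by the vendor

def sign(code: bytes) -> bytes:
    """The vendor signs whatever it ships -- good code or bad."""
    return hmac.new(VENDOR_KEY, code, hashlib.sha256).digest()

def verify(code: bytes, sig: bytes) -> bool:
    """The device checks WHO shipped the code, not WHAT it does."""
    return hmac.compare_digest(sign(code), sig)

honest_app = b"print('hello')"
buggy_app = b"os.remove('/home/user/everything')"  # shoddy, but signed

for app in (honest_app, buggy_app):
    print(verify(app, sign(app)))  # True for both: the signature says
                                   # nothing about whether the code is safe
```

Signing does tell you the code wasn't tampered with in transit, and who to blame; it just doesn't make the code any less shoddy.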
What we increasingly see is multiple parties trying to stake out a bit of "secure" space on a device they don't own. At the same time we see people wanting to do more things with the same device. Several phones already have multiple "faces", easily switchable themes for, say, business and leisure. I say this is not nearly enough.
It is entirely reasonable for corporate IT to want to control the environment the worker bee accesses sensitive corporate information with. This has to do with being able to trust it doesn't leak, no malware is installed, and so on. Partly this is because of shoddy software. Another part is ensuring things work reasonably well.
Now that this boundary is blurring, we're going to see more leaks, more trouble. One way to contain that is the "virtual machine" idea. You could restrict all access to corporate information to a "viewer" client (VNC, RDP, Citrix, you name it), but mobile devices are powerful enough to run such a corporate access VM locally, and could store the data within it.
Now split the device into multiple personalities using such VM technology: one for your personal use, one for corporate IT, maybe one for your bank to provide "managed banking" with, perhaps one for the operator to stuff their branding in. Or even more; nothing to stop a consultant from keeping one for each client, ensuring confidentiality that way.
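A toy model of that multi-persona idea, with made-up names and a plain dict standing in for what would really be a separate VM or container per persona -- the only point being that data filed under one persona is invisible to the others:

```python
# Toy sketch of per-persona isolation. In a real device each Persona
# would be a VM/container with its own storage and keys; here a dict
# is enough to show the access model.
class Persona:
    def __init__(self, name: str):
        self.name = name
        self._store = {}  # stand-in for an isolated filesystem

    def put(self, key: str, value: bytes) -> None:
        self._store[key] = value

    def get(self, key: str):
        return self._store.get(key)  # None if this persona never stored it

class Device:
    def __init__(self, *persona_names: str):
        self.personas = {n: Persona(n) for n in persona_names}

    def persona(self, name: str) -> Persona:
        return self.personas[name]

phone = Device("personal", "corporate", "bank", "client-acme")
phone.persona("corporate").put("report.xls", b"quarterly numbers")

# The personal persona can't see corporate data:
print(phone.persona("personal").get("report.xls"))   # None
print(phone.persona("corporate").get("report.xls"))  # b'quarterly numbers'
```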
That way customers don't need to endure the various battles between eagerly vying middlemen that all want a slice of an entirely artificial pie. We know how to do it. Now to actually do it.
Dogs do that; you're not a dog, are you, Reg?
"These days computer users seem happier to accept a few limits on their freedom in exchange for better security"
Users don't know any different because they've never been allowed to see any different with their mobile phones and fondleslabs. To imply that there was a conscious choice made is completely misrepresenting the situation.
Ask users if they're ok with their carriers deciding not to upgrade their software leaving them with giant gaping security vulnerabilities... or why they're not allowed to remove that NASCAR app that the carrier decided *had* to be there.
You might as well say that veal calves love being confined to their tiny little pens -- they don't know any different either.
So why haven't you gone into business selling phones and fondleslabs without NASCAR apps, and with a guarantee that you'll provide upgrades for the foreseeable future?
Maybe you actually agree with the phone companies that that's not a viable business plan?
Seems to work OK for Apple, and the Nexus phones have gotten decent feedback -- or so the word of mouth I've heard goes, at least.
I have a collection of old smartphones that my three-year-old plays with. All are still in working order... they power up, are capable of carrying a cell signal, etc. The software on them is obsolete, and I am not allowed/able to put anything else on them... so they all might as well be in a landfill, precisely because of what we're talking about here. Ever tried browsing a modern website with IE on WM 6.5? It's not pretty... if you manage to catch a glimpse of what it's trying to display before it crashes.
Maybe you, your family, your company, etc can afford to trash and replace all your electronics on a regular basis due to forced obsolescence. If so, good on you... I guess. Personally, I find it wasteful and ridiculous that a cell vendor might opt to not even keep my security patching current through the warranty period on the device, or that a hardware vendor in general would prevent an obsolete device from being re-purposed for something else.
Listen, I understand that all the carrier rewrites of the OS have created a frackin' mess, that they are getting subsidies for forcing crapware on us, and that it's insanely expensive for them to keep up with their software updates... but as a consumer I really couldn't give less of a shit, because it's about me, not them.
If they can't make a viable business plan that includes keeping me - their customer - happy, then maybe they shouldn't be in business.
Hint: "secure" generally implies something that cannot be brute-forced in a matter of seconds and also does not fall open and spill its contents when the device is reset (El Reg articles passim).
El Reg will be no more, then.
Windows 8 on ARM requires that all devices are equipped with a Mobile Trusted Module (MTM) Version 2.0.
MS is already trying to shaft everyone with (cough) trusted computing in the guise of UEFI bullcrap.
"[s/MS/Apple] is already [s/trying to shaft/shafting] everyone with (cough) trusted computing in the guise of UEFI bullcrap."
And that's just on the Macintosh.
What was old is new again, and customers are happy to gobble it up on their iDevices and 'droids when they didn't understand it on the bog standard PC. Subsidized devices and cheap games make for a more pleasant screwing-over, like being taken out to dinner first. Of course, designing TPM into an entirely new platform instead of bolting it onto a decades-old architecture helped some.
A spoonful of sugar helps the (TPM) medicine go down.
I'm not liking it any, but I'm not expecting to change the minds of the iSheep anytime soon with postings on El Reg.
Microsoft "Trusted" Module.
I don't think Microsoft and "trusted" belong in the same sentence.
A nightmarish future awaits.
Only 10% thrive creatively.
As old hardware will become new again
Shall we whip out our old analog Motorola phones?
Would you rather have a CB radio using throwaway surface-mount chips mixed with potting compound (to hide the chip name and make repair a several-day affair with razor blades cleaning the potting compound out), or something you can solder with a Hakko?
Just include it with all the Android devices, plus a little DIP switch under the battery cover to turn it off and boot from a ROM to alter the keys. All you need is the MD5 sum for the code, which could be put into some camera-readable format so you can pull it from a screen or printout. Allow multiple kernels and you have a fall-back too.
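The checksum step could look something like this sketch -- SHA-256 swapped in for MD5 (MD5 is broken for integrity checks), the image bytes made up, and plain hex groups standing in for the camera-readable encoding (a QR library would do that part properly):

```python
# Sketch: hash a kernel image and render the digest in a form you
# could print or photograph for out-of-band comparison.
# SHA-256 replaces the MD5 suggested above, which is no longer safe
# for integrity checking.
import hashlib

def kernel_fingerprint(image: bytes) -> str:
    digest = hashlib.sha256(image).hexdigest()
    # group into 4-char blocks so a human (or a camera + OCR) can compare
    return " ".join(digest[i:i + 4] for i in range(0, len(digest), 4))

trusted = kernel_fingerprint(b"zImage-3.0.8-stock")  # from the printout
flashed = kernel_fingerprint(b"zImage-3.0.8-stock")  # from the device
print(trusted == flashed)  # True: the image on flash matches the printout
```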
Look Ma! All the benefits with none of the drawbacks! I wonder why MS and Apple don't do that? Seeing as Apple already do this sort of thing and MS are doing it, just make it obvious to everyone how much of an anti-feature this can be.
Having said that, the real problem is large corporates skewing the market. This allows MS to use profits from elsewhere to subsidise the hardware and drive competitors out. Open hardware would allow the software to compete on its own merits.
Don't tar UEFI with the TPM brush. It doesn't have to have TPM in it; that's an MS requirement. UEFI just happens to be the starting point for securing the boot.
Personally I suspect there are other more serious security issues in these devices than boot compromises. These devices are pretty much always-on. I'd like to see much finer-grained controls. Such as, "This app connects to hosts in the *.music.com domain," rather than "This app requires full internet access." Maybe we could have, "This app can read & write to /sdcard/appdata/myapp and subdirectories," rather than "This app can read & write to the SD card."
Then we could have the kernel enforce this access.
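A sketch of what such a fine-grained manifest and check might look like -- a hypothetical policy format (no real mobile OS uses exactly this), matching hosts and paths with shell-style globs:

```python
# Sketch of a per-app permission manifest with host and path globs,
# the kind of rule a kernel-level enforcer would consult on each access.
# The policy format and app names are illustrative, not a real API.
from fnmatch import fnmatch

POLICY = {
    "myapp": {
        "net": ["*.music.com"],             # only these hosts
        "fs":  ["/sdcard/appdata/myapp/*"], # only this subtree
    },
}

def allowed(app: str, kind: str, target: str) -> bool:
    """Return True iff some rule for this app and access kind matches."""
    rules = POLICY.get(app, {}).get(kind, [])
    return any(fnmatch(target, pattern) for pattern in rules)

print(allowed("myapp", "net", "cdn.music.com"))                  # True
print(allowed("myapp", "net", "tracker.adnet.com"))              # False
print(allowed("myapp", "fs", "/sdcard/appdata/myapp/cache.db"))  # True
print(allowed("myapp", "fs", "/sdcard/DCIM/photo.jpg"))          # False
```

Default-deny falls out naturally: an app (or access kind) with no rules matches nothing.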
I know, I'm a dreamer...