Knew there was a reason...
I banned Quicktime off our network. Except for the one guy who just has to have an iPhone. Good to know he's undermining our security.
A security researcher has unearthed a “bizarre” flaw in Apple's QuickTime Player that can be exploited to remotely execute malicious code on Windows-based PCs, even those running the most recent versions of the operating system. Technically, the inclusion of an unused parameter known as “_Marshaled_pUnk” is a backdoor because it is …
I banned Quicktime off our network. Except for the one guy who just has to have an iPhone. Good to know he's undermining our security.
Quicktime is junk; you rightly banned it.
There are so many other programs like it, most of them better as well.
And what's wrong with the good old Windows Media Player people?
Clearly you don't think there have ever been any security vulnerabilities found in Windows Media Player?!
When did I say that, AC?
I'm talking about the better functionality, simplicity and design, which quicktime doesn't have.
Quicktime is crap bloatware
... on reflection, where's the evidence that QuickTime suffers security problems any more often than Acrobat, Internet Explorer, etc? That is, assuming you're willing to agree that all security flaws that allow control over a machine are equal — I'm not aware of any other mainstream product shipping with a security hole engineered into it deliberately but left in only accidentally.
"I'm not aware of any other mainstream product shipping with a security hole engineered into it deliberately but left in only accidentally."
The Sony BMG debacle with the unremovable rootkit DRM?
That can't have been tested at all or they'd have noticed at least some of the trouble it caused.
I assumed you were talking to the other guy about the security issue, so haven't replied to that
I think you'll find that all of your users are undermining your security. That's what users do.
As for Quicktime, I don't think it's required to sync an iPhone, I think you only need iTunes. It's just a bit tricky to get hold of iTunes without also lumbering yourself with Quicktime.
Given Jobs' declared prurient viewpoint and his wish for users to conform to his dictates, it was likely included to see what his software was being used for.
If not, this also demonstrates Apple's poor quality control routines, again.
To be fair, it's not solely because of the extraneous parameter; it's also due to the lack of ASLR and DEP with respect to that .dll.
I note that this vulnerability isn't present on OS X or Linux. While the QuickTime flaw may not be present for those OSes, the implementation of ASLR and DEP is also a factor.
Actually, DEP (or lack thereof) isn't a factor in this attack, as the actual code is executed in the .dll in question, not on the stack.
See the paper here: http://cseweb.ucsd.edu/~hovav/talks/blackhat08.html for more details on ROP.
ASLR would help to prevent this kind of attack, though I'm pretty sure there could be a workaround.
It can't be! Steve Jobs said all Apple software is totally free of bugs and security issues. It's Bill Gates' fault, stop lying and making our blessed savior and patron Saint Steve look bad.
Hand Grenade cause I can't wait for the flame war to start!
Let's face it, this hole wouldn't be possible if Microsoft's code execution protection worked properly.
Adding DEP and Intel adding a flag to stop the contents of the stack being executable has not stopped attacks being possible.
Sure, this exploit is via Quicktime, but I'm sure there are many other applications which could be used.
Why attack just Apple when the problem lies with Microsoft's OS as well? This fault hasn't been mentioned as being exploitable on OS X.
... is because Apple doesn't understand how to write secure software for Windows platforms. Safari carpet bombing, twice-monthly gaping holes in Quicktime. We kind of get tired of garbage third-party software.
Why should any application developer have to write secure software for any operating system?
It's the OS developers' task to provide a secure, efficient API, not the application developer's.
... you continue to use first-party garbage like Windows. Bizarre choice there, my friend. Bizarre.
"Adding DEP and Intel adding a flag to stop the contents of the stack being executable has not stopped attacks being possible."
DEP wouldn't prevent this type of attack, as only return addresses and data are stored on the stack; no actual code stream is ever placed on the stack.
See the paper here: http://cseweb.ucsd.edu/~hovav/talks/blackhat08.html for more details on ROP.
Sounds like a straw man argument to me.
The advertising often contrasts the amount of malware on OS X to the amount of malware on Windows to make an argument that a consumer is safer on Mac, but that's objectively true. It doesn't matter why, but there is substantially less.
Because if we try to remove their ability to put insecure crap on production systems they scream and yell until the safety measures are turned off again, that's why. Secure systems are possible, just not a lot of fun to work with. Besides, most coders don't have the brains to code properly, secure or not. Witness the crud usually found in connection with VBA and related "databases". Go read The Daily WTF for an extra dose of cynicism...
Unfortunate that the space appears to be kernel memory (at least, that's how the description reads to me).
That was a bad design choice on MS's part. Trading security for speed is always a bad trade.
"In addition to demonstrating the importance of regular code reviews to identify extraneous parameters, the exploit underscores the threat that comes from programs that fail to use of the ASLR and DEP protections baked into more recent versions of Windows."
Good bit of poor sport there. As an IT-type hack you ought to know that if the OS were any good at protecting itself and other applications from applications with holes in them, this suddenly becomes a lot less urgent. I think that chiding applications for not doing everything to protect the OS is turning the world upside down, and shows poor understanding of OS theory. Basic things like "why do we bother with an OS, anyway?" Not that micros~1 won't try to get people to believe their take on things. It's never their fault, according to them.
In principle, ASLR and (hardware) DEP should not need application support. As an aside, the existence of the NX/XD bits is a bit of a failure in the x86 architecture; there are bits that already ought to work like that but that weren't actually enforced previously. That's right, protection bits that didn't actually protect. But certain programs won't work with them, so you can turn it off again.
SoftDEP/SafeSEH needs software support, but both are a bit of a different beast than DEP, and again I don't see why programs should go out of their way to use it if the OS can't be arsed to get its stuff right. As an application programmer I have better things to do, like at least making sure my software works and doesn't contain obvious backdoors.
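For context on what "application support" amounts to here: on Windows these mitigations are effectively opt-in flags set on the binary at link time, not something the code itself has to implement. A sketch of the relevant MSVC linker switches (the switch names are real; whether a given toolchain version turns them on by default varies, and `myapp.obj` is a placeholder):

```
REM Opt a Windows binary into the OS mitigations at link time:
REM   /NXCOMPAT     marks the image as DEP-compatible
REM   /DYNAMICBASE  allows the image base to be randomised for ASLR
link /NXCOMPAT /DYNAMICBASE myapp.obj
```

Which rather supports the complaint above: the per-application effort is close to zero, yet the OS still treats the protections as optional for compatibility's sake.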
The reason OS-level protections aren't enforced on Windows is because developers scream and shout, and no OS vendor wants to piss off its developers. Did you see the hatred when Vista enforced LUA (running as a user)...?
The two devs in our IT department HATE security. Ranging from the Windows firewall (inbound protection only - yet they blame it when THEY didn't bother to look up how to get THEIR app to create a hole in the firewall and needed to do it manually), through to running with UAC turned on, running as a user rather than administrator, they nag about AV slowing down their performance... and so it goes on.
Some developers care about security and about the platform they write for. They wouldn't want their application to be a point of entry to a bug in another 3rd party app or the OS itself. I call it best practice.
However, MOST of the devs I've come across couldn't give a crap. DBAs seem to care only about performance (regardless of whether it's noticeable to the user or whether it reduces load on their hugely overpowered server), and application developers only seem to care about whether it works. As long as the bug can't be pinned on them, the app can be shoved out to the business.
If I put in place the AppLocker technology that I'd love to, the developers would be screaming at me for months. Same if I enforced UAC on their machines (it is for everyone else - just not the devs). It's taken two years for me to get them to understand that I'm putting a firewall between two different networks and that they NEED to tell me port numbers. The first thing a developer (both in-house and third-party support) tells me to do in the event of an application failing to execute is to turn off DEP.
Now I don't know (or care) if you're in the group of the few developers I've met who care about security or not - but believe me when I tell you that a LARGE chunk of developers would happily scream the house down and bitch about their platform rather than incorporate security features that the platform HANDS TO THEM ON A PLATE.
If developers didn't moan about running in a virtual sandbox environment without access to other apps or running processes, DEP was enforced, AV always scanning, always running as a user or lower security context with no choice about it, then two things would happen - developers would moan, bitch, gripe, and a number would probably piss off to another platform; however, the world would be a happier, more secure place.
The problem is that the platform needs developers as much as the developers need the platform. Ergo we have this middle ground until developers man up, realise they are human and thus will have holes in their code, and allow the OS to restrict their access to the platform substantially.
It's catch-22; until applications start using (and more importantly are testing) these security features then Microsoft can't enforce them due to compatibility. (Again, see Vista).
However developers like YOU won't use them until they are enforced.
See the problem?
Developers like ME? I beg to differ. I only very rarely "do windows" nowadays. My daily fare is thoroughly unix based -- using various free variants by preference. You know, the much older system, or rather extended family of systems now, that came with basic protections built right into it from the beginning, because in order to support multiple users you practically have to?
I've also worn the sysadmin hat often enough to know very well the nuisances of stupid app developers requiring admin rights for no discernible reason other than "refuses to work". Yes, that's stupid, irresponsibly so.
To be fair, there are unix devs that hate security, and there is a fair share of commercial, proprietary unix software that needs root for no discernible reason. Most open source things don't have that problem if only because someone'll look at the code, see no reason why it should have root, and hack it out. Or there will be tech-savvy users who'll bitch and demand no root be needed. There is thus far more back pressure than in "commercial" settings where the price of the app will be such that people won't complain, or complaints will get lost in the maze devoid of clue separating customer and developer.
But WRT developer hate, I think the rub is mainly in the circumstance that unix had some consistent, usable, if simplistic, security model built in right from the start. By stark contrast, "windows" started out as a shoddy replacement for command.com and went downhill from there. micros~1's take on "security" went from nonexistent via "not a priority" to various bodges, kludges, mis-steps, fuckups, marketeering stunts, through various confusing acronyms, and is thus a veritable theme park of historical and future failure picturesquely sited in a veritable abyss of blackhole-class suckage and cluelessness.
You really can't blame app devs or even project managers for not being up to snuff on the latest micros~1 buzzwords that add no direct business value and that micros~1 has apparently been intent on using as a pavlovian training tool for aversion. Had they had some model, even a simplistic one, that gave a clear idea of where the security barriers are and when to ask for permission, and had they stuck to it, then yes, one could legitimately gripe that devs hate security when they should know better. In the micros~1 universe, however, not even the basic APIs stay the same, never mind the security-related ones, instead seeing continuous complete replacement and complete model overhauls more often than linux sees VM swaps. No wonder it takes about three times more time and effort to write for windows than for "unix", while the latter has far more variety in systems. There they do have a unifying model and they do work mostly the same, predictably so.
To me, these "technologies" are more of the same. Up to and including the fact that "DEP" stands for two unrelated technologies, one of which requires app support and the other doesn't. This is just more proof that micros~1 is institutionally incapable of designing itself out of a wet paper bag to save itself, never mind the world.
So yeah, I've seen your problem and consider it trivial in light of the rest of the problems in this domain.
We all know that since the early days of Windows, Microsoft hasn't looked ahead at security one bit. Only since the mid-2000s has it been taken seriously, and the impact of that is the changes you are moaning about.
What should Microsoft do? Piss off the developers by rectifying their mistake in one fell swoop? (Destroying their market, as there are so many piss-poor in-house apps used in enterprises worldwide.) Give up, maybe, so we're back in 2002 with crap security everywhere?
Or how about they continue to push the right way whilst allowing for backwards compatibility? The only problem is developers who continue to ignore best practice. That just results in a worse world for all of us.
There's no need to play the "but Microsoft can't make their minds up about OS security" card. Do me a favour: high-level best practice such as LUA, inbound firewalls, AV, DEP/NX and even APIs like the security centre have been written up on MSDN since 2003. The problem is Microsoft can't just turn all of this on, for the reasons mentioned above and in my first post.
A fully patched XP using LUA, Vista and 7 are all pretty secure out of the box with auto-updates turned on. Not perfect, but good enough for consumers and for businesses. Applications developed without security in mind, and code monkeys that insist on developing/testing without UAC, without AV, without DEP and without other best practice for the platform they are developing for, are causing huge security issues, second only to end-user education.
On a side note: in the context of a thread on application and OS security, how can you consider my post on application security through developers not caring trivial? And please shed light on the "other problems in this domain" beyond bitching about a platform stacked with application-level security that isn't taken advantage of by shite developers, including Apple.
We have a thoroughly broken field and all you can think of is blame "the devs". That's not merely unfair, it's unworkable. Besides, I don't care as you seem to do whether micros~1 lives or dies. Ah, that's not true. I wouldn't mind at all if they died a fiery death for proven industry-standard-and-best-practice incompetence. "The market" you don't need to worry about. It will carry on and in fact is increasingly ready to jump ship to "linux" or some other platform. There might even be room for wacky alternatives like the late BeOS. The need for computing is still there, but it is less and less locked in. If the environment crumbles the actors will move to a new environment. Kicking and screaming perhaps, but given no choice, they'll move.
Yes, blaming the devs is trivial in the face of grand architect failure. If you didn't get that from my previous post, here's an analogy: What we have with windows "architecture" is not so much a cathedral, but a shanty town. Now you're blaming "the devs" for living there. You can't expect them to build on that sensibly. So they carry on because they need their paychecks too. And exactly because it is such a jumble, no project manager is willing to let them navigate everything there is on offer "for security". "What NOW then? We have to do WHAT? Sod that and move on." Only a vocal clientele will make them expend significant effort there.
And that is not unreasonable or even unfair, because the OS vendor isn't in this for free either. They're only adding "security" now after a decade of customer howling. And now they find that a lot of things in their shanty town just aren't fixable without breaking compatibility with everything that came before. All the (E)TLAed attempts to do just that notwithstanding.
Indeed I do think that your lovely acronyms are just that: Acronyms, buzzwords, marketeering. The technology is probably wonderful and built on sound ideas, but it's too little, too late, and too constrained by having to maintain backwards compatibility to do any good. If you know anything about security you know that a false sense of security is worse than knowing you have no security at all. I do think these things offer, perhaps through no fault of their own, but because applied to this environment, exactly that: A false sense of security. That makes the combination snake oil, sound ideas and engineering notwithstanding.
So you might as well not bother. The real problem remains that expecting devs to make up for the failings of the OS is layer separation violations writ large and a travesty of engineering. We need solid systems, and this is not it.
For future generations, you can try and teach them, devs and users and admins and everybody else, the importance of "security", but even then you'd have to give them something to work with. You need a workable model and a clear idea how to implement good enough safeguards with minimal hassle. Implement in the broadest sense, from writing code to establishing policies to trudging on in the daily grind. Especially the latter because that's where mechanisms are most circumvented and policies habitually disregarded.
micros~1 should've scrambled and put their brains to work on this very problem around 1995 at the latest. They knew they had the market. Mister visionary chief geek at that very outfit even wrote a book about interconnecting everything. We already knew that interconnecting everything would bring in Bad Stuff too. They didn't draw the obvious conclusion.
Now? Just burn the bloody thing down. How did that phrase go? Prior Preparation Prevents Piss Poor Performance.
... but am I right in thinking that this allows QuickTime to load up another file without the user's consent...? If so, wouldn't ASLR get around this for other non-MS applications, and therefore it's a way of loading into memory Microsoft DLLs already on the machine?
As in, it's not actually a huge massive security hole that will eat us all alive, but rather some shite coding from Apple, and Microsoft by-passing their own security features?
On another note - would I be right in assuming that as QuickTime is running under the context of the current user that any process spawned off it would only have the same access rights? If admin rights are required to launch the naughty DLL into memory would UAC not jump up?
...You've been _pUnk'd
Look how the Windows fanbois are out in force! Didn't you guys just say that Windows was MORE secure than OS X? So why then does the same flaw NOT affect Mac OS X? I mean the Mac is LESS secure, right? Oh wait... SNAP!
... writes pretty shaky software for Windows systems. It's always been an annoyance of mine; Clunktime has always been a security liability, and always will be. Less so for a Mac, but even then it still has exploitable flaws that require constant upgrades even on that platform.
Maybe the dumbass (apple) backdoor wasn't written into the Mac version, think of that? Eh? Eh?
Besides, try running 9 year old software on the latest OSX.
"Besides, try running 9 year old software on the latest OSX."
Well, since the latest OS X is UNIX-certified, any program written for UNIX will run on it. And UNIX has been around since the 70s. Ergo, we can run FORTY-YEAR-OLD code if we want.
Apple writes backdoor into their software.
Journalist actually surprised.
Maybe we should just have a vote on what is the worst garbage third-party software out there
My vote goes to Adobe's Flash.
What sort of lame-ass process is going on that Windows can't just throw down its protection like a blanket bomb, and applications either crash (and hopefully thus get fixed) or work. Surely inter-process negotiation and protection is what the damned OS is there for. Well, that and tarting up the UI with pretty icons and 3D effects.
Really, it shouldn't be up to *me* to support DEP. It shouldn't even be my choice. EPICFA~1 MICROS~1!
Let's break just about every piece of existing software there is from years back that's still used in the business market and make users think it's a good operating system! Not quite sure current consumer machines are powerful enough to sandbox every piece of old software with its own virtualized operating system to work around that though.
... provided you are 100% sure, without fail, there are no flaws or bugs in your software.
In fact, if you are so sure, put up a 200% refund on all your software for everyone who ever uses it, should one flaw ever be found.
Still so cocky?
... you can't "fix" windows without breaking existing applications. So what you need is a new secure operating system, since the applications have to be rewritten anyway. Maybe it should be called Linux?
Did you even read my post, or did you only get as far as the end of the title?
I was NOT saying my software is perfect, I was saying why is it up to ME to implement DEP in my programs when this is the sort of thing that the operating system ought to be doing for itself as a matter of course.
AFAIK the Videolan guys had fixed the DEP thing already, or were working on fixing it.
As someone said, those protection technologies should not need application support, but considering how badly many windows apps were written in the past... and how BAD old versions of windows were...
But does VLC play Quicktime, and if so then do we still have a problem?
As for responsibilities... The operating system's job is to provide services to applications, not to make it difficult. Whereas applications such as a media player - or anything that handles documents or other files from the internet or other insecure locations - are responsible for being careful.
I suppose there is a role for an application to request limited access to local resources - you know, like on Android.
The most infuriating thing about this bug is that it comes courtesy of a piece of software that NOBODY WANTS. I can't even find Apple fanbois who like QuickTime. It's ridiculous.
Just another example of how a large corporation can use its might to keep an awful, awful product alive years after its use-by date, even though basically 0% of its user base ever wanted it to begin with.
Windows has security problems and can't protect itself, despite desperate, CPU-jogging attempts and much hoop jumping.
So bizarre that DLLs, the walking dead, zombies from the '80s, are causing problems.
And, sure, nobody wants QuickTime. Good one! Why then did Softy famously lift its code and install it directly into Windoze Media?
"... and reorder the commands in a way that allowed him to take control of the underlying computer" -
Priceless. Sounds like a script from the IT Crowd...
Ha ha ha ha
Of course it is sweety, of course it is...
Yep.. note "to date".. :p
Problem is Windows would be damned if they do and damned if they don't. If they made DEP type protection mandatory in the OS, you'd break loads of stuff.
New WinOS =
Oh my drivers don't work anymore -> this new OS is crap.
Oh my piece of businessware that went out of support ten years ago won't work -> this new OS is crap.
Oh this game I really like won't work -> this new OS is crap.
Simple thing is users are not prepared to accept that an upgrade to the operating system mandates that they also upgrade every piece of software they own.
So instead MS is forced to "wean" us off the crap stuff, by making the OS more secure, but allowing the old stuff to still run - and recognising that you'll be supporting some legacy apps for the next decade at least (Win7 "XP Mode" anyone?).
So if you can't enforce protection globally in one go, you need the assistance of the developer community to work with you to write more secure software that would eventually mean you could flip the switch and lock down the OS.
Maybe this is actually a ploy by Apple - they tell their developers to write crap code that doesn't follow security best practice, and then they can blame Windows for not enforcing security protections globally.
"the exploit underscores the threat that comes from programs that fail to use the ASLR and DEP protections baked into more recent versions of Windows"
If ASLR and DEP protections were really baked into Windows then the "flawed" apps wouldn't be able to bypass them.
Not one Apple product is installed on my systems and there is no need for any Apple crap, especially Quirktime garbage, which is half a step above (or below) RealPlayer.
Just say NO, say it out loud, say it with your wallet. NO. NO. NO.