Flash should not be installed on any system.
Security researchers have defeated vulnerability protections baked into the latest versions of Internet Explorer, demonstrating that it's possible to poke holes in a safety net that's widely relied on to keep end users safe from drive-by exploits. By exploiting weaknesses in Adobe Systems' Flash Player, researchers have devised …
"Why can't we have Flash on the iPad?!!"
Here's why. Windows is doing a really good job here, and Flash is totally screwing this up. Can you remove Flash from IE8? Won't that help?
My thoughts exactly - there isn't a single decent exploit for the iPhone OS yet - the vuln published the other day still relies on a phishing-type approach (i.e. click here to be hacked, stupid user!). Presence of Flash could potentially blow this right open.
Come to think of it - is Jobs paying this guy to make his point for him?!
I've thought for a long time that the reason Apple are so hard-assed about what can be installed on the iPhone and now iPad is to prevent the installation of any apps that could break their security. This then leaves them with a problem - what would the more conservative investors say if they started distributing smut, or other "unsuitable" material? It does raise questions about how confident Apple are with their security, or if they are just very very protective of their "no viruses" image, which has seeped into their general corporate image.
Anyways back to the article - I wonder if the code is executed in the system context or the users' context?
Not too long ago ASLR and DEP were hailed as silver bullets, yet defeating them only took, well, adobe's JIT. I wonder if the JS JITs in vogue now aren't opening up similar vectors. But anyway, it turns out to be all too true what everyone should've seen coming right from the start: Executing foreign code, even supposedly crippled scripts in browser sandboxes, is asking for trouble. Of course, some trouble more so than others, but even so, remote code execution is never a good idea.
I despise Flash as much as the next person, but in this case it sounds like Flash was merely used to set the contents of memory to a known value. A separate exploit is still required.
Address randomization techniques serve only to make data structure overflows unpredictable (the attacker has a hard time figuring out where his malicious code will land). Instead of running the hacker's code, randomization makes application crashes more likely. However, the fundamental overflow bugs are still present.
Even with heap randomization, when enough of the heap can be initialized to contain malicious code, it could make successful overflow exploits much more likely. That's what it sounds like the case is here. Flash isn't the vulnerability, but a tool to prepare the heap for the attack.
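The two comments above — randomization turns exploits into crashes, but spraying tilts the odds back — can be sketched as a toy probability model. This is purely illustrative: no real memory or exploit is involved, and all the sizes and labels are invented.

```python
import random

# Toy model: "memory" is a list of cells. A hijacked pointer jumps to an
# address randomized by (toy) ASLR. Without spraying, the jump almost
# always lands on junk and the program crashes. Spray a large fraction of
# memory with attacker-prepared "sled" cells and the jump usually lands
# somewhere useful to the attacker.

MEM_SIZE = 100_000

def build_memory(sprayed_fraction):
    """Fill a toy address space; sprayed_fraction of it holds 'sled' cells."""
    sled_cells = int(MEM_SIZE * sprayed_fraction)
    return ["sled"] * sled_cells + ["junk"] * (MEM_SIZE - sled_cells)

def hijack_outcome(memory, rng):
    """Follow a corrupted pointer to an ASLR-randomized address."""
    target = rng.randrange(len(memory))
    return "owned" if memory[target] == "sled" else "crash"

def success_rate(sprayed_fraction, trials=10_000, seed=42):
    """Fraction of hijack attempts that land in sprayed memory."""
    rng = random.Random(seed)
    memory = build_memory(sprayed_fraction)
    hits = sum(hijack_outcome(memory, rng) == "owned" for _ in range(trials))
    return hits / trials
```

With nothing sprayed, `success_rate(0.001)` is negligible — randomization does its job and the attacker just crashes the app. Spray 80% of the heap and `success_rate(0.8)` shows most hijacks succeed, which is exactly why Flash-as-heap-preparer matters even though Flash itself isn't the hole.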
I was going to post a comment, but what's the point? It's just one more hole in MS's cruddy swiss-cheese software. WHY, oh, WHY does anyone use this stuff?
As for flash's part in this - I can't help thinking of the comment the other day from Apple about not supporting flash because it's so buggy. Mmmm.... maybe they have a point (and yes, it may actually be a bug as such in flash, but it's not exactly vindicating it is it?).
Can you clarify what you're saying here? Is this a Windows problem or an Adobe problem?
Secondly, is this an IE problem or would it also impact Chrome and Firefox?
Thirdly, could this same technique be used on other OS's?
I'd assume Firefox is not immune to this either then, since it's Flash causing it. The FlashBlock addon should help reduce the risk at least? https://addons.mozilla.org/en-US/firefox/addon/433
I never use Firefox without both Flashblock and Adblock-plus. Adblock makes sure that the buggest(*) source of potential exploits (bogus adverts served into other peoples' legitimate pages) never makes it onto my system. Flashblock means that all flash animations get replaced by a logo, which I click if I want to view them. That takes about half a second. Most of the time I don't want to see them, just read the text, and it probably saves those half-seconds ten times over in bandwidth-wait.
(*) a typo, but I like it.
"JIT-Spray" sounds like something you might see in a raunchy grumble flick. You certainly wouldn't want to get any on you.
But in our world, it's a perfectly reasonable topic to discuss at length in public. Of course, this is why non-techies think we're all keeping a private channel open to the mother ship, and it worries them.
QUOTE: "exploiting weaknesses in Adobe Systems' Flash Player"
Yes, I'm all shouty because this is bloody ADOBE'S fault AGAIN! If you DON'T have Flash installed you are OK, am I right or not? Presumably the same bug is going to occur in whatever browser is running Flash, i.e. Firefox & Safari will also be vulnerable, as Flash must install with Admin privilege, and is therefore beyond the browser 'sandbox' straight away, tearing whatever OS you're running a new one.
For Fckus Sake, if you install MALWARE with Admin privilege the same thing will happen, therefore ADOBE FLASH = MALWARE
No wonder Apple won't allow it onto their i-Products, Flash is downright dangerous!
My desktops are now 9 months Flash-free and have had not one problem whatsoever, running Win7-64, not one single crash. What are the odds, eh?
"therefore ADOBE FLASH = MALWARE"
But that doesn't alter the fact that Microsoft's piss-poor memory management allows this kind of attack, now does it? It's just a matter of time before another bit of code delivers the same type of exploit.
As a sidenote: that's spleled "for fsck's sake", just for future reference.
Well, we all know that Flash is buggy but if Windows wasn't a complete pile of crap then this wouldn't work.
You'll notice that the security guys aren't saying "we got the exploit to work on all browsers" - it's IE specific, and I'll wager the blame rests equally with MS and Adobe.
I think you're missing the point.
This particular exploit uses Flash, but Flash is just the conduit. We all know Flash is a piece of shit and is badly coded and full of bugs. The real story, however, isn't that yet another vulnerability has been discovered in Flash, but that by exploiting it you can bring down DEP and ASLR.
In this instance you can pat yourself on the back as your desktops are immune due to you not running Flash, but you can be sure that there will be other methods discovered that allow the same end result. Undoubtedly, eventually, one of those other apps will be installed on your machines.
As I see it, the real culprit here is Microsoft... Adobe isn't blameless, and in an ideal world all software would be bug free (but then I'd be out of a job) and this would be a non-issue. Instead, here on Planet Earth, there are bugs in software and those bugs will be used to attack the Crown Jewels, i.e. the target OS. Yet again, Microsoft's OS is found to be not as secure as Redmond would have you believe.
If Microsoft stopped relying on backwards compatibility quite so much, then they could spend their gazillions of dollars in the bank to re-write their OS from scratch and this time actually make it as secure as possible (instead of as user-friendly for all and sundry) from the beginning instead of trying to tack on security measures as an afterthought.
Not to be nominated as a candidate for the village savant, but seriously: Why do we even need Flash? Doesn't the unadorned Java API have enough to offer - seriously?
Looking at Adobe's "Tour de Flex", it looks like there are some highly specialized user interfaces being made with the Adobe toolkits. Do we need to recall the infamous Turing principle, in order to illustrate that those same UIs can be made with Java? So why haven't they been? (Hint: It's not a trick question, but I'll leave it as an open question, here, anyways)
Not artists. Not designers. *Programmers*. Got that?
And the last f*cking thing this planet needs is more programmers thinking they can build a bloody GUI worth a damn.
Adobe's Flash—for all its flaws and legacy cruft—comes with design tools aimed specifically at artists and designers.
Java comes with nowt. You can use Eclipse, Emacs and even Notepad to build stuff in it, but GUI development isn't really what it's intended for. It also requires a JVM of truly staggering proportions which makes even Flash's bloated arse look svelte.
(Personally I'd go with Unity over either option, but I'm biased.)
The ONLY application I have seen using flash that is actually useful is video display, and this could be done with other means anyway.
Other than this, I have never seen a single web page that is more usable WITH flash than without. It's a scourge that should be got rid of.
Hey, just because people have been lagging about human-computer interface design, doesn't mean us programmers need to be shat on for it, bub ;)
"Not artists. Not designers. *Programmers*. Got that?"
Uh ... no.
Java is a bastardization of a real programming language, designed specifically for the proletariat. Java is almost, but not quite, as bad as BASIC as a first language for programming neophytes. It has its niche, but it's hardly a *Programmers*(sic) tool.
Maybe I missed something. What was the first "bastion" of Windows security?
other than the power cord...
...going to recommend that their citizens uninstall Flash now?
And if not, why not?
Buffer overflows are a potential hazard in Linux and BSD as well as in Microsoft Windows.
In IBM's mainframe operating systems, and in VMS, they're much less of an issue, because text files on disks are organized as a length code followed by the characters in a record, instead of characters followed by a carriage return, line feed, or both. So if the length code is one byte long, for example, it's impossible for a 256-byte buffer to be overflowed from a disk file.
And the other I/O routines in those operating systems are designed to follow the same model. So the driver software takes care of buffer overflow, and applications programmers only have to follow the correct calling sequences for the routines they use; they don't have to spend extra cycles checking for overflow themselves.
For Linux, this late in the game, to shift gears and change from a Unix model to a traditional mainframe OS model, though, would seem highly impractical, I admit. And a new OS project would likely never get very far in terms of adoption.
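For illustration, here's a minimal sketch of the length-prefixed record scheme the comment above describes — in Python rather than any mainframe language, with invented names, so treat it as a model of the idea rather than how VMS or MVS actually implements it:

```python
import io
import struct

RECORD_BUFFER_SIZE = 256  # the 256-byte buffer from the comment above

def read_records(stream):
    """Read length-prefixed records: a 1-byte length, then that many bytes.

    Because the length field is a single byte, no record can exceed 255
    bytes, so a 256-byte buffer can never be overflowed by file contents.
    The I/O layer enforces the bound once; applications don't spend cycles
    checking for overflow themselves.
    """
    records = []
    while True:
        header = stream.read(1)
        if not header:
            break  # clean end of file
        (length,) = struct.unpack("B", header)  # 0..255 by construction
        body = stream.read(length)
        if len(body) != length:
            raise ValueError("truncated record")
        records.append(body)
    return records
```

Contrast this with C-style termination: a newline- or NUL-terminated read has no built-in bound, so every caller must remember to pass (and honour) a buffer size.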
>Buffer overflows are a potential hazard in Linux and BSD as well as in Microsoft Windows.
>In IBM's mainframe operating systems, and in VMS, they're much less of an issue, because text files on disks are organized as a length code followed by the characters in a record, instead of characters followed by a carriage return, line feed, or both. So if the length code is one byte long, for example, it's impossible for a 256-byte buffer to be overflowed from a disk file.
Provided that you aren't using 'C' as your programming language on VM/CMS and/or VMS. Fortunately, most operating systems manage to map heap memory into separate regions so that if you whack your own stuff, you don't whack everyone else. Alas, Windows doesn't seem to understand this (why heap and stack and code are in the same address space on Windows is a mystery to me... Why didn't MS map these things into distinct regions of memory and use those horrid segment registers to keep things nice and tidy right from the get-go? Inquiring minds want to know!)
If I use firefox on unix, it runs as me, and so any malicious things it is tricked into doing can potentially damage all my stuff.
Security features, such as file permissions, are there to be used. Obviously, these browser makers prefer to skip on that one. I am not interested enough to set up a 'nobody' user specially for the browser, so it can only wreck the files in the nobody-owned part of the filesystem. This should only include files that are explicitly downloaded anyway. So why isn't Firefox designed to run in such a sandbox?
As far as messing with the operating system's memory, which shouldn't belong to any user, this should be dealt with by the o/s itself (it shouldn't be possible). Even unix is at fault here. A root user shouldn't be able to write to any old memory.
I remember these features from the 80's, on long gone kit. The hardware itself generated 'pointer faults' for any attempt to access memory in a privilege ring that was different to the calling address. This includes the kernel ring 0 code calling procedures in a less privileged ring. There were still issues, but maybe lessons weren't learned because there were so many o/s's about.
"Even unix is at fault here. A root user shouldn't be able to write to any old memory."
I disagree. Such tricks come in handy, occasionally.
Now, on the other hand, one shouldn't run as root as a matter of course ... unless your root account is an ordinary user account, and the admin account(s) has a different name(s) ;-)
Ain't that one for the books, St Stevie being right about just how crap Flash is.
In IE8 just go Tools-Manage Addons, click on Shockwave Flash Player and then click on 'More Information'. From there click on 'Remove All' to disable Flash for all sites.
Then for sites you Trust, re-enable Flash by allowing Flash to run the first time you visit one of your trusted sites.
For all other sites (particularly dodgy Chinese ones!!!) Flash won't run, so no exploit.
Unlike Firefox, IE8 doesn't need a 3rd party add-on to block Flash!
Archos 605 (does have flash) and later ARM based Archos, Android phones, Symbian phones, iPhone and iPad are all ARM cpu based. x86 exploits can at worst simply crash them if they do anything (which is unlikely)
The problem is not Flash, but the programming model of C, C++ and other derived languages, especially for strings. That, and the mentality of how virtually all C/C++/C# WinAPI libraries are written. If it wasn't Flash it would be something else.
You can BSOD windows with a Java program. Thus an exploit could be crafted.
Java is nicely cross-platform and very "objectafied", but it's a slow painful development process and a slow runtime. VB6 running p-code easily beats it.
"Security researchers have defeated vulnerability protections baked into the latest versions of Internet Explorer, demonstrating that it's possible to poke holes in a safety net that's widely relied on to keep end users safe from drive-by exploits."
Doesn't El Reg post an article with that in it almost every week?
(If only there was an icon for "MS are crap" rather than "MS are evil". Well, MS software sucks and by that logic, perhaps the Paris icon will do...)
And with .NET or Silverlight?
Perhaps what needs to be prevented is the "spraying" of many, many copies of malicious code into system memory? More use of signed code, and a limit to the creation of multiple structures in memory from one source, can be considered - at a risk of breaking some current good software.
...think of the children, and stop using Flash. How many more gaping security holes will that pointless plugin create? At the very least, use ClickToFlash on FF or the Safari blocker on OSX to disable it and its insidious cookies.
Browsers should stop installing it by default, and display an urgent security warning if they subsequently detect it. Come in Flash, your time is up.
Microsoft is fully capable of running custom memory allocators for each application. No doubt they don't want to, but it is possible..
I'm down on my morning coffee consumption this morning...I'd tuned out by the 3rd paragraph and fell asleep.
So...for the coffee-deprived the story is basically: Flash = bad, Windows = good?
Saint steve 'cos he hates flash too.
Firefox for Maemo RC3 has been released and guess which "ubiquitous" plugin has been disabled because it "degraded the performance of the browser to the point where it didn’t meet our standards."? Who said Schadenfreude? It may not have a massive user base, but it's just more evidence that Flash at the very least needs fixing.
It's not clear from the article how they did it unless it really was as simple as it sounds, that the flash compiler + environment can't be depped because that would negate the compiler, so a hole is left. Can anyone clarify?
@gimbal: I can only speak for myself but I don't allow java in the browser either. I've noticed when it comes preinstalled it seems to stick itself everywhere and nag you repeatedly for upgrades. Also ask yourself if these complex in-browser UIs are needed at all (Hint: 97% of the time, no).
You're missing the point. There is indeed a vulnerability in Adobe's Flash which the researchers are exploiting to gain unauthorised access to the system, that much is true. However, what makes this particular story notable is that, under normal circumstances, once that vulnerability is exploited, the standard system protections of DEP and ASLR would nullify any gains made by the attacker; but during this particular attack, those protections are violated.
So, essentially, the researchers have demonstrated how to circumvent DEP and ASLR protection mechanisms (both low-level functions of the OS and/or hardware) by using the technique known as "JIT-Spraying". While you are right in that this technique was facilitated by a bug in Flash, that is merely circumstantial.
By the way, I thought that DEP was already proven vulnerable by a technique dubbed "Return-Oriented Programming", in which the attacker pieces together his payload from the legitimate executable instructions already present in memory. ASLR, on the other hand, seemed a bit more promising. Alas, such is life.
For more information on Return-Oriented Programming, visit the following page:
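The return-oriented idea mentioned above can be sketched as a toy: the attacker injects no code at all, only a sequence of "return addresses" that chain together snippets (gadgets) already present in the legitimate program. Everything here — gadget names, addresses, the mini state machine — is invented for illustration; real ROP works on machine-code fragments ending in `ret`.

```python
# Each toy "gadget" stands in for a tiny legitimate code snippet.
def gadget_load(state, value):   # stands in for e.g. "pop eax; ret"
    state["acc"] = value

def gadget_add(state, value):    # stands in for e.g. "add eax, imm; ret"
    state["acc"] += value

def gadget_store(state, _):      # stands in for e.g. "mov [mem], eax; ret"
    state["out"] = state["acc"]

# The "program image": gadgets that already live at fixed addresses.
GADGETS = {0x401000: gadget_load, 0x401010: gadget_add, 0x401020: gadget_store}

def run_chain(chain):
    """Execute a 'payload' that is pure data: (address, operand) pairs.

    Nothing the attacker supplies is ever executed as code, so DEP has
    nothing to object to -- control flow just reuses existing code.
    """
    state = {"acc": 0, "out": None}
    for address, operand in chain:
        GADGETS[address](state, operand)
    return state["out"]
```

Note that ASLR still bites here: if the gadget addresses move on every run, the attacker's data-only chain points at the wrong places — which is why defeating ASLR (e.g. by spraying) and defeating DEP tend to go together.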
We have an intrinsically insecure architecture right down to chip level, wherein data and instructions are only distinguished from each other by context at runtime, and we've replicated the problem at OS level via the ludicrous stack definition that allows data, parameters and return addresses of functions to reside adjacent to each other.
We've used this architecture in mainstream commercial microsystems since the year dot and it lets us down more and more often as time passes. Most successful exploits rely on abusing this single weakness in one way or another. Fancy tricks like ASLR and DEP are merely plasters that cover an increasingly festering ancient wound.
But there has been an alternative for ages - Harvard architecture, which segregates code and data in separate and completely independent memories. It's widely used in embedded controllers such as the PIC family, and would make practically all exploits of the type discussed here impossible. Why on earth don't we create a mainstream Harvard processor? Why in the interim don't we create a virtual Harvard OS?
This is a fundamental conceptual flaw - not a specific of Flash, IE or anything else at the application level. It's time we dealt with the real problem, not just went on tampering with its symptoms.
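The Harvard split argued for above can be sketched as a toy machine with a two-instruction invented mini-ISA (illustration only — not any real processor):

```python
class HarvardMachine:
    """Toy machine with separate code and data stores: instruction fetches
    come only from the code store, and stores can touch only the data
    store, so overflowing a data buffer can never inject instructions."""

    def __init__(self, program):
        self.code = list(program)   # instruction store: fixed after load
        self.data = [0] * 16        # data store: the only writable memory
        self.acc = 0

    def store(self, address, value):
        self.data[address] = value  # cannot reach self.code at all

    def run(self):
        for op, arg in self.code:   # fetches never interpret data as code
            if op == "LOAD":
                self.acc = self.data[arg]
            elif op == "ADDI":
                self.acc += arg
        return self.acc
```

An attacker who controls every byte of `self.data` can corrupt results, but can never change what instructions run — there is simply no path from a `store` to the code store. (The counter-argument a few comments down still applies: the moment you want firmware updates or a JIT, you've reopened a path from data to code.)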
Main stream ?
How about the StrongARM, or more recently the Cortex A8/A9.
Sure, but you won't find many of these running Windows on the office PC, will you. That's where the biggest target is. Quite apart from which, the conventional OS stack (still pretty much as derived from the K&R C stack) is a huge contributor to the problem, and that's a higher level issue than the processor choice.
... that code is data as well, and you have to have some mechanism for updating your code (firmware updates, etc.) from data. As long as the code is changeable, there will be a way to change it. Harvard architecture is no more a silver bullet than ASLR and DEP are (in fact, if you think about it, you'll see that DEP is an (admittedly incomplete) adaptation of the Harvard architecture concept to the Windows OS.)
Also, as mentioned in another comment, hard segregation would eliminate any JIT compilation, and there goes all of your scripting. Consider this: is HTML code or data? If you call it code, then the hard segregation rule would mean either not being able to download it, or verifying every page is from a trusted source. If you call it data, then you ignore the fact that it is a set of instructions for the browser, and there may be the possibility of exploiting a weakness of the browser simply by cleverly crafting those instructions (remember what some web sites managed to do with FRAMESET, or BLINK?)
As several comments have already pointed out, Flash is not the vulnerable element in those exploits, nor is an SWF file a vector for the attack. Flash is only used to fill the memory with highly repetitive code segments that could be, if directly executed, calling system functions. The actual direct execution had to be caused by attacking an existing overflow in another application, which could be anything from the TCP stack itself to the Infamous Explorer.
To sum it up: if one wants DEP to be effective, no JIT of any kind (and thus, no optimized execution of foreign scripts in any language) should be allowed. Goodbye, web 2.0.
But does not *cause* them.
"Buffer overflow" MS say.
The worst, most insecure browser demanded, usually by banks, to access the world's most secure sites!
Probably not. All three of those examples are managed code, they don't let you piss about with the memory - which is how it should be. If there was a flaw in the silverlight runtime that exposed this then yes, but I've yet to hear of anything being exploited in the way that flash does (Not to say it hasn't happened though).
It may be complex to fix the memory handler so that JIT-spraying doesn't work. I'd have thought that fixing DEP so that it isn't fooled by obfuscation techniques might be a tad simpler.
Accept Adobe's crapware stomping around in data memory, just make sure that what it stuffs in there can't be executed later (like wot DEP is supposed to do).
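The W^X rule DEP is supposed to enforce can be sketched as a toy (illustration only — real enforcement is done by the MMU's NX bit, not by application code, and the names here are invented):

```python
class Page:
    """Toy memory page under a W^X policy: writable or executable, never
    both, so bytes an attacker manages to write can never be run."""

    def __init__(self, writable, executable):
        assert not (writable and executable), "W^X: never both at once"
        self.writable = writable
        self.executable = executable
        self.contents = None

    def write(self, payload):
        if not self.writable:
            raise PermissionError("write to non-writable page")
        self.contents = payload

    def execute(self):
        if not self.executable:
            raise PermissionError("DEP fault: page is not executable")
        return f"ran {self.contents}"
```

The catch, and the crux of this story: a JIT compiler needs pages it can both write (to emit compiled code) and execute — exactly the exception that JIT-spraying abuses, which is why "fixing DEP" isn't as simple as tightening the policy.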
"If Microsoft stopped relying on backwards compatibility quite so much, then they could spend their gazillions of dollars in the bank to re-write their OS from scratch"
It's so very easy to say, so very nearly impossible for MS to achieve even by the mid 90s.
One patch tuesday not so long ago, I was reading through the "known issues with this security update" section for a particular patch, and came across the line "programs using modules written in turbo pascal..."
But this is the reality of the matter. Yes, MS could very easily rewrite their memory management routines from scratch, but it would break hundreds of badly written programs that businesses rely on, and the corporations won't replace said crap, because "it still works" and would cost millions to do so.
Apple is the perfect example of what MS wish they COULD do. They were a niche market to start with, so little pressure from blue chip companies to maintain the status quo. Then when they ditched their OS 9 and developed from the foundations of unix, they also switched processor.
A completely new OS, running on a totally different CPU instruction set? You can't get much of a cleaner sweep than that.
You can bet Billy G was green with envy at the time!
MS are mired in legacy by their very popularity, and it's support for this legacy that is both a blessing and a curse. Yes, they should have designed it properly in the first place, but hindsight is 20:20.