The Flying Wallendas were a legendary circus troupe that performed death-defying acts from a high wire without the use of nets or safety devices of any kind. Even when they performed their world-famous four-person, three-level pyramid act 50 feet in the air, patriarch Karl Wallenda steadfastly eschewed nets out of a belief they sapped the aerialists' concentration.
There's also a sports metaphor here.
Many sports show a similar effect. The dangerous ones tend to require protective gear, but in sports like hockey or football the gear gives the players an excuse to hit each other even harder than they otherwise would. Believing the gear will protect them, players come to rely on it, when they should be trying to avoid injuries as though they had no protection at all.
A great demotivational poster says:
"The purpose of your life could be only to serve as a warning to others." The poster then shows a shipwreck in shallow water. Let's be honest: for Adobe it is far too late. That boat sailed earlier this decade, when they decided to outsource their code base to the lowest bidder in India. The cost to refactor and fix their badly broken code would be astronomical (how their broken products became de facto standards on the internet is beyond me). Adobe, though, should be a warning to the mega-corporations that once you start down that hole, it is really tough to dig your way back out.
Hope for the best.
Plan for the worst.
Reality is consistently somewhere in between.
I like the no net idea
But I would be even happier if all of those clowns (get it?) at Adobe were forced to walk the plank instead.
To think that Adobe software is such a bug-filled mess that Microsoft looks good in comparison. I wonder how much longer MS will tolerate Adobe punching holes in its OS. MS could either cut them off and blacklist the bugware, or hold their feet to the fire and enforce an improvement programme.
Talk about broad brushes...
"At more than 41MB, it's more than five times as big as competing PDF reader Foxit, and that means there's five times the attack surface to exploit."
That's a bit of a crude measure, don't you think? How much of that 41MB is image files and other ancillary data?
Quantifying "attack surface" is pretty much impossible, but you could at least start by measuring the quantity of executable code in the two products, or by somehow comparing the relative number of features (on the assumption that features correlate with bugs, and bugs with security holes).
Why do we assume Foxit is more secure? Because it has fewer publicly disclosed vulnerabilities? That's a bit naive...
I could potentially buy calling it more secure on the grounds that you're less likely to be compromised, given the larger number of exploits targeting Acrobat versus Foxit, but I'm not sure that "less likely to be compromised" is the same as "more secure" in a general sense.
I would be very surprised if Foxit stood up to the same level of scrutiny that Adobe Reader/Acrobat is getting without just as many holes being found.
Unfortunately, we aren't likely to find out - I can't see Foxit commanding enough of an installed base to cause the crackers to switch targets.
Adobe's sandbox idea is all right, I guess, but why is this not a feature of the operating system? Applications need to operate on a minimum-rights principle, whereby they only have permission to do what they need to do. Unfortunately, configuring such a setup, whilst possible with Windows, is just too difficult to be practical.
It's not helped by applications which require more permissions than they actually need, due to there being no historical reason for them to be careful about what they do. There are still tons of Windows apps out there (I can count several at work) which need to be able to write to their program directory when run as a normal user!
IT bods don't like what Apple do with iOS apps, and I believe the restrictions on data-sharing between different apps are quite annoying (I don't have an iOS device, so may be wrong here), but I hear very little about security holes in iOS apps. Privacy holes, user tracking, brokenness, yes, but not security holes.
OK, that's the "rant" part of my Saturday todo list done. What's next...
DEP has nothing to do with digital signing.
All DEP does is clear the "this memory contains executable code" flag in the page table (or equivalent) for pages holding program data and the stack, on the basis that those do not typically contain CPU instructions.
Basically, it's a feature that should have been there since day one, because the only reason you would want to execute program data is for things like self-modifying code and other hacks that aren't worth it in 99.999% of cases.
From what I understand, DEP has to be explicitly enabled in:
* The BIOS
* The OS
* The compiler when building your program
i.e. it's not on by default in many cases, presumably because there is so much utterly shite code out there (both closed and open source) which would break if it suddenly couldn't execute its own data.
Basically, it should be on by default, with a clear warning when it is triggered by a crap program explaining that the cause is either a vulnerability or shite programming (the latter has a good chance of creating many of the former anyway), and that the program should therefore be fixed. If that's not possible, the feature can of course be turned off.
The "self-modifying code" idea is not so hot, but these days you have lots of "evals" (think of LISP evaling an s-expression that was just built on the fly) and virtual machine code patching (aspect-oriented programming and other code injection ideas come to mind). These never come near the no-execute bit, because the only things that execute are the VMs themselves; the rest, bytecode or syntax trees, stays data to be processed.
Adobe bought the car
Karl Wallenda landed on and thought that would do.
In this analysis...
... I think I like the pictures best.
The metaphor is very apt, though I am a bit more concerned about the faked certificate. If you can't trust the link between certificate and issuer, isn't the value of signing undermined somewhat?
It wasn't a fake certificate; it was a real certificate that was issued without the knowledge of the actual certificate owner.
On this note, how do El Reg readers secure their web server certificates? Once a certificate is installed in IIS 6 or 7, even if you install it non-exportable, a simple permissions change via Windows Explorer is enough to get access to the private key. In Apache, you just have to know where to look, unless you've configured it to require manual intervention (entering the passphrase) every time the service is restarted. Are there methods and tricks that I haven't come across that I SHOULD BE USING?
Re: Good Article
I think I get the point you're making, but I don't think it's what you think it is . . . ^_^ I'm going to get a bit pedantic, but this is a /*very*/ important issue and is the result of one of the most egregious errors a company can make.

The problem is a key management issue on the part of the private keyholder, not a problem between the CA and the keyholder. Verisign did its job correctly. Vantage Credit Union fscked up /*ROYALLY*/: they let someone get their private key. That is the biggest, primary, WTF-did-you-do, I-can't-believe-you-would-allow-that-to-happen screw-up that can be committed in a PKI. Whoever has the private key /*IS*/ the entity, as far as trust goes.

This is a /*ROYAL*/ failure of their key management process, and heads should roll over it. Obviously there was a significant lack of risk management process, governance, information security policy definition and enforcement, and plain ol' slack system administration. It's not like there is no guidance on the issue. For starters, see:

http://csrc.nist.gov/publications/nistpubs/800-57/sp800-57-Part1-revised2_Mar08-2007.pdf
http://csrc.nist.gov/publications/nistpubs/800-57/SP800-57-Part2.pdf
http://csrc.nist.gov/publications/nistpubs/800-57/sp800-57_PART3_key-management_Dec2009.pdf
http://www.verisign.com/static/005308.pdf

Key management is the Achilles heel of PKIs, and these folks committed the ultimate screw-up. It is also why I recommend against using public key cryptography when a company /*really*/ needs to protect data.
Time to switch to another language?
It's been many years since I programmed regularly (late 1980s, at an American university), but even then we were using Modula-2 or Ada. Then, around that time, C came on the scene.
For the life of me I don't understand why companies don't switch to Ada. The majority (all?) of these problems would disappear.
It seems it's time for the IT industry to stop and evaluate the tools it's using to create programs.
Wasn't that the language used to program the Ariane 5 that blew up due to a programming error?
It doesn't matter which language is used if it's still programmed by idiots, or worse, managed by managers who claim to understand money, or worse still, by programmers who think their language is best!
Ariane V first flight
Still, a great European success: Ariane V is now the dominant space launch system. Sh$t happens. Don't be confused by temporary setbacks. Work hard, move on.
Ariane V and Ada
The Ariane V incident is one of the few cases where laziness would have saved the day. The horizontal acceleration variable was guarded against a maximum value, AHmax. That's something you can do in Ada, and normally it serves a very good purpose: a float value larger than expected usually indicates a major problem.
With Ariane V they simply took the proven software from Ariane IV and never did a proper simulation run. Apparently the simulation gear/software was deemed too expensive and not necessary.
The only problem was that Ariane V was designed for much higher horizontal acceleration. This consequently triggered the AHmax guard, which stopped the primary control program. The standby control computer hit the same AHmax issue (as it runs the same software), and from then on the rocket was uncontrolled.
Another computer discovered a little later that the rocket was not properly aligned relative to the flight path and initiated destruction.
500 million euros wasted on a bad management decision. Not an Ada failure at all.
possibly not perfect example
"Even when they performed their world-famous four-person, three-level pyramid act 50 feet in the air, patriarch Karl Wallenda steadfastly eschewed nets out of a belief they sapped the aerialists' concentration."
That would be the dude who died when he fell off a high wire? Right. Thought so.
A good part of last week was spent trying to clean a neighbour's comp that got infected. It looks as though the problem got in through a .swf file. When it comes to ease of getting infected, Adobe is the Acquired Immune Deficiency Syndrome of the internet.
Given the company's appalling record on security, if you run Adobe Reader on your comp, it's probably best to treat all .pdf files on the web as virus-infected by default.
This harks back to A. K. Dewdney's Core War from the 1980s... See also http://www.koth.org/info/akdewdney/First.htm
DEP relies on digital signatures now?
"The attack also got around a second major defense that's known as DEP, or data execution prevention. The feature blocks the execution of code unless it has been digitally signed"
DEP lets the CPU check every memory page to see whether it has been marked as executable. Applications have access to a set of APIs to mark pages as executable (or not).
Digital signing, OTOH, means an entire module (such as an executable or DLL) has been signed with a certificate, so it is possible to detect whether the module has been tampered with before executing it. (Or one could have a policy forbidding unsigned executables from loading, as MS did with 64-bit Windows, where all drivers now have to be signed.)
“But it shows that just because that possibility exists on a platform doesn't mean that it's impossible to exploit. That's the key lesson here.”
The key lesson here is that if you do not mark ALL of your modules as ASLR- and DEP-capable, then you're screwed. ASLR was not compromised in the case described in the article; it was never enabled.
That said, the article does cover some basics correctly:
1) Less code means less exploitable surface (I've been saying this for years)
2) ASLR is still neat
3) Code needs to undergo some form of quality control, and developers must be in control of their build system (so that a module not marked as ASLR-capable won't slip by)
The biggest mitigating factor, though, is that few users are admins these days. Today's malware rarely gets to embed itself deep in the OS and is relatively easy to deal with.
And finally: even this year I have observed more problems with AV than with malware. The "cure" is still worse than the disease. I also note that AV systems themselves present exploiters with an expanded attack surface. And I am grateful to a particular AV that alerted me to some "malicious" stuff that turned out to be an internet shortcut (a 238-byte text file) that was part of the Foxit Reader installation. It was very useful to be alerted to such "threats".
Safety nets do have their uses.
It wasn't just patriarch Karl who died from a net-less fall.
Three other members of the family died in other falls and another was left in a wheelchair.
Makes an argument for open source
This really just argues for open source and against Windows:
1) Some may decide not to look for buffer overflows etc. because of this protection, but in open source projects this is not the case. They've even fixed bugs in error cases they aren't sure can ever execute (it would take VERY weird scenarios, like someone hot-swap pulling a card mid-transaction), just because they want the code to be as bug-free as possible. They will certainly NOT stop looking for buffer overflows just because there should be protection against them.
2) Windows ASLR is a big fail. It's a bit better than nothing, but making vendors specifically flag libraries and apps to use ASLR and NX just guarantees that there'll be plenty of exploits still available on the system, thanks to the libs and apps that don't use them. So, to use the Wallendas as an analogy, these problems have been caused by Windows having half a net underneath instead of a full one. The solution is definitely not to eliminate the net. This again argues for open source: apps avoided relying on address layout or executing stuff off the stack all along, and the few that cheated (oddly enough, Emacs was one; it seems odd for a text editor, no matter how fancy, to rely on this) were patched rapidly to support these security features rather than punching holes in them to accommodate the apps.
So where's the open source replacement for Adobe Reader, then?
There are a couple of open source PDF readers out there, but none of them has even a fraction of the mind-share of Foxit or PDFXchange, the two most popular alternatives. Lots of people use open source PDF creation tools, such as PDF Creator, but the tools that could really make a difference, the readers, just don't seem to have any traction.
Evince, Okular, Google Chrome (dev Version), Xpdf, GSView