It's been more than a week since Debian patched a massive security hole in the library the operating system uses to create cryptographic keys for securing email, websites and administrative servers. Now the hard work begins, as legions of admins are saddled with the odious task of regenerating keys too numerous for anyone to …
> They're very often done in order to make software fit Debian's "policies". The problem is that the "policies" are arbitrary - they're Debian-only standards which the original developer of the software may not be aware of, and there may be very good reasons not to do things the way the "policy" says.
The developer of the software is aware that the license they distribute allows Debian, Redhat et al. to do these sorts of things with their code. If you don't like it, don't distribute your code with licenses that allow modification and redistribution.
DJB did exactly that.
> In fact as I mentioned on another one of these comment pages, the first time I found a bug in a Debian package which didn't exist in the original source code (i.e. a bug which could be removed by uninstalling the Debian version of the software and then reinstalling from the original source code) was back in 1996
I guess you never reported said bug to the package maintainer so that it could be fixed? When you "reinstalled from source", were you linking against locally built libraries or the shipped ones? There are numerous things that could have caused a bug on your system that had nothing to do with the package you experienced problems with. Like our mutual friend Chris Thomas, you seem to have had one bad experience with Debian, and in your eyes that seems to give you the right to bad-mouth the thousands and thousands of hours that people have volunteered to the project.
I don't even see why you brought Windows into the argument.
Ideally, as a consequence of this, there will be a major paradigm shift in the process of changing the code. It's less to do with creating more restrictive patch policies, limiting who can make them, etc., and more to do with encouraging developer responsibility.
It should really never be the case that a developer looks at a block of code - commented, borked, free or closed - says to himself "Hmmm, I have no idea what effect changing this might have", and changes it anyway. It is less to do with "human error" or "n00b mistakes" and an awful lot more to do with downright irresponsible recklessness. If he tried to find a peer review and no help was forthcoming, the obvious next step is simply not to touch it, unless he has at least a vague idea of the purpose of the code.
Defend him however you want, but he showed a reckless disregard for good sense which would almost certainly have cost him his job in an organization. (As you said earlier, it takes two or three small mistakes - but this was pure arrogance)
@ Chris Thomas
"Who the f**k does something like "remove" a key component of the RNG in the first place, a dumbass, thats who, a total waste of space, do they even acknowledge the damage they have done?"
Unless it was done deliberately.
As a non-expert who gets through life by GUI only, I have come to understand that there is no such thing as computer security. Those who think that free software cannot be used because somebody makes a mistake could still benefit from its efficiency and low cost by adding a proprietary encrypted tunnel to the SSL, SSH etc., and perhaps another one inside that one that you roll yourself. Keep adding tunnels until you feel comfortable. If your business is storing atomic bomb detonation codes for the government, I would recommend pen and paper and an old-fashioned safe.
Since the world's business increasingly relies on shifting secret information through a global net that is used by all, it is very important that the encryption tools are made of free software. What would you rather have: Microsoft saying "we have made these tools and you are going to have to trust us when we say your secrets are safe", or Debian, where you can see for yourself exactly what is going on?
The remedy is not to argue for the death of Debian but to join Debian and help work on computer "security". It takes more effort than calling for people's heads, but you would be adding your weight to the masses of people who have pulled up a gem that is untainted by political and commercial interests.
Art R Big
Re: Beware applying the patch on remote servers!
> Is anyone keeping track of which SSL issuing authorities are providing instructions and free (zero-cost) certificate re-issues for affected certificates which need new CSRs?
I got a nice email from GlobalSign this morning:
"One of the advantages of being a GlobalSign customer and having the GlobalSign Certificate Center (formerly known as the Global Agent System) is the ability to re-issue a certificate in the event of private key loss, or in this case a possible weakness in the keys.
"If you do wish to re-issue your certificate, then please follow these simple instructions:-
"(1) Address the vulnerability by patching the affected operating system to the latest level.
"(2) Create a new CSR remembering to use exactly the same information that you used last time. (The information can be seen within your account area or within the certificate itself.)
"(3) Log into your GlobalSign Certificate Center account and click on the 'Certificate Application History' menu option and select the appropriate certificate that you wish to replace.
"(4) Click on the 'Re-Issue' button and enter your new CSR.
"(5) The new certificate, based on your new CSR and new private keys, will be issued and available for you to download in moments.
"Many thanks for choosing GlobalSign. We hope you see the benefit today and every day."
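For step (2), a fresh private key and CSR can be generated non-interactively with the standard openssl req tool. A sketch - the subject fields below are placeholders, and you would substitute exactly the same values as in your existing certificate:

```shell
# Generate a new 2048-bit RSA key and a matching CSR in one step.
# -nodes leaves the key unencrypted; the -subj fields are placeholders
# to be replaced with your certificate's actual details.
openssl req -new -newkey rsa:2048 -nodes \
    -keyout www_example_com.key \
    -out www_example_com.csr \
    -subj "/C=GB/O=Example Ltd/CN=www.example.com"
```

The resulting .csr file is what you paste into the re-issue form in step (4).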
Can't understand why anyone would use this O/S in a medium-to-large enterprise environment. It isn't listed as IA certified on the Common Criteria portal's Certified Product List.
Shame on any security engineers who actually let this product onto their network!
Just a reminder
... that the Windows WMF security flaw existed undetected from 1990 to the end of 2005. Compared to that, two years is fairly trivial.
The ads around this article
Anyone else notice loads of ads for Windows Server 2008 around this article. Made me chuckle.
Paris...because she prefers a secure tunnelling.
I haven't read comments but....
Surely in this day and age, when crims have massive botnets at their disposal and even smartcards and WEP are found to be flawed, you wouldn't base SSL or any other encryption on a random seed that wasn't KNOWN to be truly physically random? e.g. LSBs from a recording of the microphone jack, or even a dedicated RNG PCI card?
Seriously, it's down to common bloody sense and covering your arse to the limits of your technical ability these days. No excuses.
Flame cos I know the excuses will flow thick and fast.
Well of course you want the source of randomness to be random - and it is not that hard.
I hazily recall having to bash away at the keyboard; in some quasi finger dance to different melodies mixed by a DJ with no concept of the fade buttons.
And having to quaff a few pints (actual number of pints and music listened to not revealed, so as to reduce chance of replay attack), in the spirit of entropy creation.
Oh, those were the days.
My part to ease the pain
I've written a Windows tool that allows scanning of remote web servers (or anything that answers a TCP connection with an SSL exchange) to determine whether they have the weak keys identified by Ubuntu.
Read about it, then download it, from my blog at http://msmvps.com/blogs/alunj/archive/2008/05/22/1626252.aspx
Written in C#, with source code provided.
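For anyone rolling their own check, the idea behind all the blacklist tools is the same: because the process ID was effectively the only entropy, every possible weak key is known in advance and its fingerprint can be precomputed into a blacklist. A minimal sketch in Python - the fingerprint scheme here is illustrative only, and the real openssl-blacklist file format differs in detail:

```python
import hashlib

def key_fingerprint(modulus_hex: str) -> str:
    # Illustrative fingerprint: SHA-1 over a canonical "Modulus=..."
    # line, keeping the trailing 20 hex digits as the blacklist entry.
    # (The real openssl-blacklist format differs in detail.)
    digest = hashlib.sha1(f"Modulus={modulus_hex.upper()}\n".encode()).hexdigest()
    return digest[-20:]

def is_weak(modulus_hex: str, blacklist: set) -> bool:
    # A key is flagged weak if its fingerprint is on the precomputed list.
    return key_fingerprint(modulus_hex) in blacklist

# Toy blacklist containing one known-bad modulus
blacklist = {key_fingerprint("C0FFEE")}
assert is_weak("c0ffee", blacklist)        # case-insensitive match
assert not is_weak("DEADBEEF", blacklist)  # unlisted key passes
```

A remote scanner like the one above would extract the modulus from the server's certificate during the SSL handshake and run the same lookup.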
Suburban Inmate is right - there are several known good sources of entropy.
Unfortunately, the change to Debian meant that even if you _had_ good sources of entropy, they were ignored.
The line that was changed isn't the line that asked ssleay_rand_add to add 'entropy' from an uninitialised buffer.
The line that was changed is the line _in_ ssleay_rand_add that added entropy from a program-supplied buffer.
So, every time any caller (including other parts of OpenSSL) asked to add new entropy, the function threw it away.
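The practical effect is easy to demonstrate with a toy model in Python (the names here are hypothetical; the real code is OpenSSL's C entropy pool, not SHA-1 over a PID). If caller-supplied entropy is discarded, the process ID is the only seed material left, so two machines with the same PID derive the same key, and the whole keyspace collapses to the 32,768 possible PIDs:

```python
import hashlib

def broken_keygen(pid: int, caller_entropy: bytes) -> str:
    # Toy model of the broken path: the entropy the caller supplies
    # is silently discarded, so only the PID feeds the pool.
    pool = hashlib.sha1()
    pool.update(pid.to_bytes(2, "big"))
    # caller_entropy is never mixed in - that is the bug
    return pool.hexdigest()

# Two different machines, same PID: identical "random" key
assert broken_keygen(1234, b"host-A noise") == broken_keygen(1234, b"host-B noise")

# The entire keyspace is enumerable: one key per possible PID
assert len({broken_keygen(pid, b"") for pid in range(1, 32769)}) == 32768
```

Enumerating a few tens of thousands of candidate keys per key type and size is exactly what made precomputed blacklists (and brute-force attacks) feasible.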
helps with debugging
Saying "if it helps with debugging" is not the same as saying, "go ahead ship it out to all of your users and downstream it into a bunch of child-distros"
If something like that is intended to help debug a certain issue then that's fine, because the random number generator becomes a lot less random, and that makes it possible to debug certain aspects of the code more easily than before.
If you want repeatable tests you need to make random numbers predictable for the purposes of the tests, "debugging" this is not simply a golden ticket to ignore the issue. An inexperienced developer made a mistake, string 'em up!
** downward thumb because well, isn't that what the romans did? **
As for the person who said using an uninitialised spot of memory somewhere fairly randomly placed in the system is a bad idea: think again, it's probably one of the best sources of entropy on a computer system. Hard disk interrupts, for instance, can be manipulated; network interrupts can also be manipulated; keyboard interrupts etc. are all prone to injection.
However, I still agree we should all have some kind of frequency hopping galactic background radiation based hardware RNG system, or something equally as impossible to control.
Re: Just to flame
AC wrote "My point is that this issue has blown a massive argument for OSS out of the window."
Sorry man, you've missed the point - this cock up with Debian does in no way invalidate OSS as a method. Like others, I'd suggest that if this was a closed-source project, then it's likely that damage control would be invoked and the fix would be slipped out quietly. At least team Debian have 'fessed up to their mistake, and issued a fix.
However, having disagreed with your generalisations, I will agree with you on a specific point - to wit, that this has clearly demonstrated (to me at least) that Debian's code control process is in dire need of review. As a software "professional" myself I find it alarming that a _critical_ piece of software like SSL/SSH is allowed to be committed with only _one_ peer reviewer!
I would also agree with others that, if Debian's packagers are making modifications to code that are not purely to do with different locations etc, then these need to be subject to more scrutiny than would otherwise be the case. And they definitely should be feeding these "improvements" (?!) back to the original development teams!
I'll continue to use Debian after this, (mainly because I find it a deal more stable than Ubuntu), but I hope that they learn from this gaffe, and put in a process "fix" as quickly as possible!
The man in the middle
The real problem now is: has any Debian OpenSSL-encrypted traffic been sniffed over the last couple of years?
We know that BT and Phorm have been doing it. If a webserver was running Debian-patched OpenSSL and a customer gave their credit card details, then potentially those details could now be compromised.
And people who used an SSH-encrypted connection with a Debian-patched OpenSSL key to set up user accounts may also find that those details are compromised.
This is why illegal wiretapping should remain illegal. The man in the middle attack does not have to be done in real time, they can siphon off the data and wait until a vulnerability like this occurs, or a time when the problem of the primes is solved.
Interception of data should not be allowed, for reasons such as this. Phorm cannot hide behind the fact that data may be encrypted so they have no access; right now they probably have a load of encrypted data that is now vulnerable.
Most of the changes I've personally seen package maintainers apply to their packages were not made by the package maintainer; the package maintainer simply reviewed the code, verified it fixed the issue in question, and either applied it, or in the case of source code distros, properly marked it for being applied at a specific time relative to other patches for that package.
It's quite rare for a package maintainer to apply a patch to move files around, because most packages contain provisions to specify alternate locations for files via a configure script. Of course, this does not stop some people; I've seen Gentoo packages which patched the upstream package to do what could have been just as easily done by giving the proper command-line arguments to configure, and I've seen other Gentoo packages which patched the upstream package to move related files into unrelated directories - the configure script only allowed for selecting where each related set of files went. (The latter move, by the way, was very annoying for those of us who understood why the upstream package maintainer felt those files were related...)
"It's ok if it helps debugging" means that a certain change is suitable for a debugging version of the package; it doesn't mean it's suitable for everyone to use.
Note that the OpenSSL maintainers aren't responsible for the Debian developer deciding to noose all of his users. They advised - and their advice was not understood, because the Debian developer assumed that their mindset was the same as his, rather than seeking additional clarification.
Also, the problem has not had a huge impact on *everyone*, although the impact is certainly beyond Debian-based systems. At work, we ran all our certs through the blacklist tool, and found we had not been hit - except, of course, for the two Ubuntu systems themselves. The impact would've been much bigger if they had been considered the stable systems to work from, rather than being new and experimental.
Pen testing software UK and Germany
As Germany has now banned pen-testing software, how would they have ensured that the OpenSSL keys generated were good? And one assumes that if they were to use a tool to identify the problem today, they would be using an illegal tool in Germany.
Sure, someone could have done this by hands and eyeballs. But, any German writing software to check would have fallen foul of their laws.
And with the UK moving swiftly in the same misguided direction, if any UK citizen distributes software to aid in the identification of such flaws, they too will fall foul of the law soon.
I have just noticed that Alun Jones, above, is offering such a tool. It may be useful, but of course it could be used for nefarious purposes as well. And of course a lot of the pen-testing suites have already placed code in to detect servers that are using weak keys.
Most admins I would bet will be using such tools, and seeing the benefit, perhaps even checking prior to launch, but soon in the UK that will be illegal as the law stands.
A lot of misguided people here.
I don't know how anyone can comment on open source in a bad way. There is nothing negative that can come from open source, other than of course that greedy people cannot take money from the people who could use it in better ways, but hey, that's another subject altogether.
I think we should all take a big step back for a second. Unless the developer who made this apparent self-made mistake pushed his way into his current position by nothing but complete lies, then a lot of people here need to start questioning their own ability, especially within the development world.
How can one possibly leave such a large task to one man? I have no doubt in my mind that this person asked for help, and that this person will even take the rap for what has happened, even after asking for support.
What we have here is a lack of communication and nothing more. We might have seen development skills from both sides of the table, but I know which side of the table needs blaming, in my opinion, and I'll give you a hint - it's not Debian.
Seriously, what the hell is going on these days.
The thing I dont get
This vulnerability was introduced in Sep 06, meaning it was in the patch for the RSA exponent-3 vulnerability. That vulnerability was a great example of how complicated cryptography is, and how many things have to be right to get a secure system. The default exponent 3 was used by most CAs for the ease of calculation it provided, and this was considered safe until a mathematical vulnerability was found and an exploit published.
I think it needs to be highlighted that this defect was almost definitely injected in a patch to fix a major security vulnerability, which should have highlighted the difficulty in getting these things right.
Any tinkering to a patch fixing a major security vulnerability is probably not a good thing unless you really know what you are doing.
Complexity is the enemy of security. Personally I like and use Debian for a cheap hosted virtual machine server and Kubuntu on my desktop. Ubuntu can't afford to do all the work done by Debian volunteers, though it pays for a few of those working on Debian full time.

The more complex the system, the more likely there exist things about it we don't understand and would not like if we did. Sometimes a developer or distributor has to release or package a product without knowing as much about it as they would like. I've been there and done that as a programmer far too often to count, and almost any bug is a potential vulnerability in some circumstances.

And those who imagine that achieving software quality would be helped by the introduction of a blame culture into programming don't deserve the benefit of being able to use computers programmed by other people.
So what are the alternatives ? Not to use computers ? Put up with a proprietary OS which is 10 times as complex due to the commercial need to maintain binary compatibility with everything everyone has ever done on it in the past ? Hope that the fact that only criminals, spies and employees of the company that sells the proprietary OS have source access means that the employees and spies know enough consistently to defeat the crooks ?
The value added by distributions makes running free software OSs feasible for most users. Having consistency in how things are installed and removed and upgraded such that this process can be automated makes a massive difference if you want to use an OS which includes several thousand programs developed independently by tens of thousands of programmers - where you only have time to read the source code of less than a dozen of these programs or none at all.
Debian followed by Ubuntu are the only OSs I have used where I have consistently been able to upgrade through several major releases without ending up with a cruft-ridden or unstable system. Using Red Hat, Slackware or Mandriva I didn't have quite the same level of confidence in system performance and stability without reinstalling from scratch every 18 - 24 months - though maybe these other distributions have got better since I stopped using them.
So thanks to all Debian and Ubuntu developers out there, as well as all original free software programmers who work on making this possible. So long as you stay open and accountable, I don't expect you to be perfect.
On patching, OpenSource and Freetards :)
What I would like to see come of this, is full explanations of patching.
Open source is a funny one: sure, the code may be open, but not many bother to publish the design docs. I have tested the waters on this myself, with an application where both the design and the source code are open for viewing, and I have noticed most people just go straight to the download. But I personally have found it quite useful to refer to the design docs of my own applications when extending or changing them, so I have a space to rationalise about change.
With that said, myself personally, and I think quite a lot of other people, are interested in what a distro package maintainer does as far as patching is concerned. I would like to see a command that lists all the patches along with a succinct reason as to the patch.
See, opensource is great, but we are giving it lip service in many areas, and I personally would like to see some commercial benefit for developers of opensource code, ie. I think that monetary recompense to the developer for code, would actually create better opensource applications, and that's what we should be working on.
We need to get off the 'freetard' merry-go-round, keep the advantages of open source, and put in place some form of monetary model, instead of just throwing our hands up in the air and saying we cannot weave monetary remuneration into this.
"See, opensource is great, but we are giving it lip service in many areas, and I personally would like to see some commercial benefit for developers of opensource code, ie. I think that monetary recompense to the developer for code, would actually create better opensource applications, and that's what we should be working on."
The overwhelming majority of work on open source software (OSS) is paid work, as demonstrated by extensive automated analysis of contributions to Linux based on source code line-count and copyright signoff, see
The motivations for creating software are various. In my case this research supports teaching I am paid for. Some software creation is motivated by the sale of packages which might be harmed by opensourcing them, but package sales are a minority interest amongst programmers.
When looking at the benefits of OSS to its creators the greatest is the reduction in transaction costs in a world where a useful system has to be integrated from contributions by hundreds of thousands of programmers working for tens of thousands of independent organisations, for which sale of the software could never be our main line of business because the transaction costs would be greater than the sales value.
Creators of this code do want your lip service to support our viral method of distribution, but we value your custom when you buy one of our courses, consultancy services, software publications or hardware in which OSS is embedded or which is sold on the back of OSS even more. If you can help identify and fix a bug or contribute a useful patch that works for you and which benefits you by having it included in the upstream version even better.
I do think someone above should perhaps revisit their maths books.
Take a dice roll: roll that dice once, and we would say the dice had a 1 in 3 chance of showing a 3 or a 4.
But roll that dice a million times, and you will find that the average of the rolls tends to 3.5.
Now, every roll of the dice is random and you cannot predict what will be shown, but to test if something is random, you can run a basic test like this, and on average you will also see that the numbers selected are all fairly evenly spread, as long as the sample set is large.
If the numbers are not evenly spread, then your random system is not that random.
Sure there is a chance that a random system will return always the same number, but I personally would not use such a system, and in fact a sequence of +1 would be more secure than a random system that returned the same number each time.
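The frequency test described above can be sketched in Python: roll a seeded PRNG's die many times, check that the mean tends towards 3.5 and that each face turns up roughly one time in six, and note that a "random" source stuck on one value fails the same check. (The seed, sample size and tolerances here are arbitrary choices for illustration.)

```python
import random

def uniformity_check(rolls, faces=6, tolerance=0.05):
    # Basic frequency test: with a large sample, each face of a fair
    # die should appear close to 1/faces of the time.
    counts = [0] * faces
    for r in rolls:
        counts[r - 1] += 1
    expected = len(rolls) / faces
    return all(abs(c - expected) / expected < tolerance for c in counts)

rng = random.Random(42)
rolls = [rng.randint(1, 6) for _ in range(600_000)]

mean = sum(rolls) / len(rolls)
assert abs(mean - 3.5) < 0.02       # average of many rolls tends to 3.5
assert uniformity_check(rolls)      # faces are fairly evenly spread

# A "random" source that always returns the same number fails the test
assert not uniformity_check([4] * 600_000)
```

This is only a sanity check, of course: a counter cycling 1, 2, 3, 4, 5, 6 would pass it while being perfectly predictable, which is why serious RNG validation uses whole batteries of statistical tests.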
And using uninitialized memory is not secure from being gamed either.