Despite the numbers, devs should be aiming for zero vulnerabilities - no matter what the OS is.
These numbers prove that nothing is 100% secure and bug-free, despite certain sections of the IT community wearing rose-tinted spectacles.
Apple's operating systems and Linux racked up more vulnerability reports than Windows during 2014, according to research from security outfit GFI. Cupertino's OS X and iOS platforms topped the 2014 bug charts with 147 and 127 holes disclosed in each, nudging out the Linux Kernel with 119 flagged flaws, the National …
there is no such thing as a totally secure system
... because the phrase "totally secure" is meaningless. "Secure" is a relative attribute: it indicates the work factor faced by an attacker1 attempting to make the system do something it's not supposed to do, or fail to do something it is supposed to do, under a particular threat model.
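The "work factor" idea can be made concrete with a toy brute-force estimate. This is an illustrative sketch, not anything from the original post: the function name, the guess rate, and the password sizes are all assumptions chosen to show that "secure" only means something relative to an attacker's capabilities.

```python
import math

def brute_force_seconds(keyspace_bits: float, guesses_per_second: float) -> float:
    """Expected time to find a secret by exhaustive search: on average
    an attacker tries half the keyspace before hitting the right value."""
    return (2.0 ** keyspace_bits) / 2.0 / guesses_per_second

# Against an (assumed) attacker making 10^10 guesses per second:
pw_bits = math.log2(26 ** 8)                # 8 lowercase letters ~ 37.6 bits
print(brute_force_seconds(pw_bits, 1e10))   # ~ 10 seconds
print(brute_force_seconds(128, 1e10))       # ~ 1.7e28 seconds
```

The same system is "secure" against one attacker and trivially broken by another; change the assumed guess rate (the threat model) and the verdict flips, which is the point being made above.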
To paraphrase John von Neumann, anyone who speaks of a "secure system" in absolute terms is in a state of sin.
For the same reason, "security" is not solely defined by the system itself. Both what the system is "supposed to do" and the threat model are defined externally and can change. Thus you cannot speak accurately of a system's security as if that is an attribute of the system.
1In this context, "attacker" can be "accident".
>>"These numbers prove that nothing is 100% secure and bug-free, despite certain sections of the IT community wearing rose-tinted spectacles."
Indeed. I've had numerous arguments with GNU/Linux zealots (note: zealot != user) on here. Say what you want about Windows, but no-one has ever sat back and said: 'I don't need to worry about security, I use Windows.'
Anything as sophisticated as an OS is going to have flaws. I think most actual GNU/Linux sysadmins are smart enough to know how seriously they have to take security, but there is a second tier of zealots who talk as if GNU/Linux is far ahead of Windows in security. That hasn't been true for quite a long time now, but I still see it routinely on these forums. There was a post here just the other day that said Windows had fewer vulnerabilities than Linux in the last year (as this report suggests) and it got downvoted to oblivion.
"there is a second tier of zealots who talk as if GNU/Linux is far ahead of Windows in security"
I believe those guys talk and think about viruses, and in that respect they are right, of course. If you have, say, 500 known Linux viruses, there are more than a million for Windows.
But of course viruses are not the only thing there is regarding security.
If you search for antivirus for Linux you find stuff like this:
"When You Need an Antivirus on Linux
Antivirus software isn’t entirely useless on Linux. If you are running a Linux-based file server or mail server, you will probably want to use antivirus software. If you don’t, infected Windows computers may upload infected files to your Linux machine, allowing it to infect other Windows systems.
The antivirus software will scan for Windows malware and delete it. It isn’t protecting your Linux system – it’s protecting the Windows computers from themselves."
I have not downloaded any antivirus for Linux as I don't mix with any Windows machines, and still I have used Linux since '97 with no infections so far, but things may change. Nor would I feel secure downloading any free antivirus for Linux, as that program could contain just the malware I don't need at all.
I could imagine there has been a "snowball effect" regarding viruses on Windows, so many to learn from and tweak.
Then there is, of course, the one and only old explanation about there being more computers running Windows. And why not.
But then again, if you consider how Linux runs the backbone of the internet, more than half of the web servers on the internet, more or less every stock exchange in the world, big firms like Google, Facebook and similar, more or less every supercomputer and so forth, I would say it's rather silly to claim there are no interesting victims to hit.
But anyway software will contain bugs now and in the future.
Try to stay safe.
"But then again, if you consider how Linux runs the backbone of the internet, more than half of the web servers on the internet, more or less every stock exchange in the world, big firms like Google, Facebook and similar, more or less every supercomputer and so forth, I would say it's rather silly to claim there are no interesting victims to hit."
Presumably why hacking and defacement statistics demonstrate that you are several times more likely to be compromised if you run Linux as an internet-facing system than if you run Windows Server...
Of business servers that do the real day-to-day work (email, fileservers, database, authentication, web portals, etc.), the overwhelming majority run Windows Server, and its market share is still growing. Windows Server has a 75% market share according to Forbes.
This is extremely bad reporting. Really, do some fact-checking, El Reg. This isn't a report; it's a badly disguised press release by a security firm that sells services to the PC industry. The database they have trawled is the US National Vulnerability Database, which lists fixed vulnerabilities voluntarily reported by companies. There is no equivalence or assurance in terms of how comprehensive vendor reporting is, nor does the database pretend there is; all the reporting is voluntary. If a vulnerability isn't reported by the company, it isn't listed. If a company reports more fixed vulnerabilities, it will have a higher count on the database; if a company has a ton of vulnerabilities and fails to fix them, it will have a low count. If a company reports vulnerabilities on a precautionary basis that were never exploited, they will appear on the database. In other words, the database can tell you nothing about the relative state of security of OS A versus OS B.
The company that prepared this report works for PC industry vendors. It provides a nice bullet point for PC marketing. There is nothing, nothing, objective or professional about it. Your half-hearted disclaimer in the last paragraph is hardly sufficient to claim objective reporting on this one.
"There is no equivalence or assurance in terms of how comprehensive vendor reporting is, nor does the database try to pretend there is, all the reporting is voluntary."
Somehow I suspect that coverage of Apple, Microsoft and the Linux kernel is going to be pretty comprehensive...
It's no surprise there are more bugs reported against a large number of third-party applications (which you may or may not have installed) than against a handful of kernels (which you always have installed).
Applications may also be very large compared to a kernel (i.e. a database engine).
Anyway, a kernel bug is usually nastier than an application bug, because it often occurs in code executed with high privileges.
If the software is included in a distro, then it becomes a problem for the OS it's distributed with when it offers a way to compromise the entire system. The open-source ethos relies heavily on bundling 'third-party' software, as it makes more sense than reinventing the wheel (and the bugs!).
"...as that Linux or OSX has more vulnerabilities than Windows when you then go on to say that 80% of the flaws are with third party software."
That means that if you looked at a Linux distribution instead of just the kernel, the Linux figures would be 5 times worse....
You haven't specified a subject but I'm going to assume that you are talking about GNU/Linux. There are two answers to your question (neither mutually exclusive). The first is that you're wrong - there actually aren't a "vanishingly small number of attempts to exploit them". Companies face active attempts to compromise their GNU/Linux systems daily. It is end users who don't see many attacks.
And that last part leads into the second answer, which concerns the disparity between attacks on GNU/Linux end users and those on Windows end users. The reasons are fairly elementary. If it takes the same amount of effort to craft an attack on either OS, are you going to direct your malware efforts at the OS that has a huge proportion of the total end users, or the one that has a small proportion? Furthermore, are you going to target the userbase that is a mix of technically competent and technically incompetent people, or the one that is stripped of the technically incompetent people?
Short version: For back-end systems, your question is actually wrong - both GNU/Linux servers and Windows servers are actively targeted because they have equal value. For end users, the reason for the huge disparity is that the two sections do not have equal value.
"I'm going to assume that you are talking about GNU/Linux."
Assume all you want, but we have no stats for GNU/Linux, only Linux and it has nearly as many holes as **all versions of OS X combined**!
Now add Samba, Gnome, BASH etc and you can see what an utter dog GNU/Linux is from a security viewpoint.
@AC, given they specified that vulnerabilities like Shellshock and Heartbleed were among those that dinged the Linux kernel, we know they are in fact referring to GNU/Linux. We also know that one of two things is true: either you are unaware that Shellshock was related to Bash and Heartbleed to OpenSSL, not the Linux kernel proper, in which case you're too ignorant to comment intelligently, or you are aware, in which case you are a troll.
OS X contains both Samba and bash, as well as OpenSSL and many, many other GNU utilities, daemons, and packages. After all, OS X, in its current form, is not much more than *BSD with a ridiculously heavy-weight window manager, some extra drivers, and a couple system parameters tuned for the hardware.
I grew up in the UK hearing "begging the question" in the sense that something immediately demanded an obvious question be asked. So did most people grow up with that meaning around them. It's not like a word such as "whale" where it has a definition independent of common meaning and if someone calls a shark a whale you can correct them. It's a phrase. You have a different and far less intuitive understanding of the phrase which may or may not be older, but is not authoritative - because it's a phrase.
The only phrase that can be said to be inherently wrong is "I could care less" unless that's actually what someone intends to convey which it seldom is. Other than that I get tired of somebody popping up whenever other people are using a common phrase in the way both they and the listener are used to using it and attempting to tell them they're wrong and they should use the newcomer's definition. Really, such behaviour just begs the question of what they actually want by doing this, my answer to which is that they just like pretending they know more than other people.
TL;DR: Pedant Fail.
Hardly, with that rubbish about "whale" having some sort of independent meaning.
Language does not require different ontological categories for different words and phrases. They all have the same status: they are signals used by interlocutors in a dance that attempts to get their audience to converge on a meaning sufficiently similar to their own. To that end speech communities converge toward (but never quite to) a set of interpretations (denotations and connotations, generally weighted and context-sensitive) for any given word or idiomatic phrase. Different communities will have somewhat different sets, and every language user belongs to multiple communities and code-switches. That is all words are. They do not have existence independent of use, much less meaning.
That said, I too endorse the shibboleth of using "beg the question" for "raise the question"; just as a matter of style. It's unnecessary elevation. It's not quite as bad as, say, using "I" in the objective case ("between you and I" - a vile barbarism much loved by scriptwriters these days).1 But it sounds affected and it's unnecessary, even if it didn't grate on people familiar with the etymology of the expression.
1That is properly a matter of usage, the pronoun "I" traditionally being used specifically and exclusively for the nominative case. (It's not a "grammatical error", because grammar is not offended. There's a well-formed prepositional phrase there. It's simply an error of using a word in a form that is not traditionally the preferred one.)
Well that's one of the most fallacy laden responses I have ever seen!
First things first, "begging the question" is not a phrase it is a defined logical fallacy.
I also grew up in the UK and was taught the correct meaning of "begging the question" at school; it is basically circular reasoning. Asserting that a particular incorrect usage of the phrase by a large number of people makes your use correct is also incorrect, just because a proportion of people use a phrase in that way does not make that use correct.
The next part of the response is not a discussion of the reason behind the point it is simply an attack on the person making the post.
I was lucky and went to a South Yorkshire pit village Comprehensive school where we had a debating society, we were encouraged to learn how to spot fallacies in arguments and how to counter them.
"I also grew up in the UK and was taught the correct meaning of "begging the question" at school; it is basically circular reasoning."
The OED lists both definitions, noting that the meaning 'invite the obvious question' is by far the commonest use and has been in print for a hundred years. Neither meaning trumps the other; we are watching language mutate. Of course, avoiding the phrase reduces the attack surface of your prose and decrements its cliché count. But it was perfectly clear what she meant.
>>"Well that's one of the most fallacy laden responses I have ever seen!"
Really? Then allow me to list the fallacies in your response.
>>"First things first, "begging the question" is not a phrase it is a defined logical fallacy."
It is most certainly a phrase, it may or may not also be this other thing. False Dichotomy.
>>"I also grew up in the UK and was taught the correct meaning of "begging the question" at school"
Assuming the Answer. You declare that it is the correct meaning because you believe it to be so. Were you to argue that it was the original meaning, you would have more of a case, perhaps. But even there, the phrase in that sense is actually a mistranslation of petitio principii, which means "assuming the initial point". It is ironic that you are arguing that your definition is correct because your misuse is old. If you doubt any of this, by all means check and you'll find that I am correct.
>>"Asserting that a particular incorrect usage of the phrase by a large number of people makes your use correct is also incorrect, just because a proportion of people use a phrase in that way does not make that use correct."
Two flaws in this one. Firstly, a repetition of assuming the answer (stating it is incorrect, therefore my explanation must also be incorrect). Secondly, you argue that words have meaning other than their usage in order to try and show how a minority definition of the phrase is right. This argument carries some weight in some cases - such as my example of someone calling a hammerhead a whale. It has weight because the majority of people have a different understanding; there is a scientific classification that ties to it; and there is an existing better word to use, which is "shark". None of these are an absolute argument, but they are all good ones and amount to it being legitimate to correct someone. "Begging the question" isn't a word, it's a phrase with two different meanings. One is a minority-use debating term which also has a better and far less awkward alternative, which is "Assuming the Answer". Something I know you are familiar with because of your vaunted experience of Comprehensive School Debating Societies. (A rather sad Appeal to Accomplishment, btw.)
>>"The next part of the response is not a discussion of the reason behind the point it is simply an attack on the person making the post."
Correct. Just as they began this with an attack on someone else for using a phrase that everyone understood and which is commonly used that way by most people. An attack or insult of someone is not a fallacy unless it is used in lieu of argument. With me, you will find it is always a supplement.
>>"I was lucky and went to a South Yorkshire pit village Comprehensive school where we had a debating society, we were encouraged to learn how to spot fallacies in arguments and how to counter them."
Excellent. I suggest you read your own post in that case.
Actually it may well beg the question.
The OP states that the OS is full of holes and then asks why more exploits have not been seen.
There is an implicit assumption regarding the number of holes in the OS, and this is used as a premise to question the number of exploits. We do not know that the OS has a larger or smaller number of holes than any other OS, so the OP is begging the question in the formal sense; we have to accept the conclusion that there is a large number of holes in order to ask the question.
Did you even read the Wikipedia article?
The term "begging the question" originated in the 16th century as a mistranslation of Latin petitio principii "assuming the initial point". In modern vernacular usage, "to beg the question" is sometimes also used to mean "to raise the question" (as in "This begs the question of whether...")
I've no idea why your post has been downvoted so much.
I'm in agreement. This yearly release schedule is causing major headaches with the bugs it introduces. Not just the security bugs, of which most appear to have existed for quite some time before being discovered, but the stability bugs - the things that kill otherwise functioning stuff. Hence my main machine still runs Mavericks.
Apple is well overdue for a "No new features" release.
I have hope that this comments section will not become a sports match - all of the comments so far have been non-partisan. I guess we'll find out after lunch when the East Coast has woken up and seen this. ;)
Anyway, I don't think this shows a failure on GNU/Linux's part. I think instead it shows how far Windows has come. Go back to the Windows XP era and the situation was reversed. XP had a poor security model and was riddled with problems. GNU/Linux has actually improved as well. It's just that Microsoft bit the bullet with Vista and went through the massive pain of re-doing much of their system from the ground up. We're now seeing the long-term benefits of that process.
And aside from changes to their security model and obvious improvements to their quality control, there's another thing MS addressed which isn't reflected in those figures above but is improving actual daily security a lot: they took some of the responsibility for security back from the user and manage it themselves now. All Windows systems can have Windows Defender / SmartScreen / etc. on and running, and any system that doesn't normally has third-party anti-malware software running instead. Windows Defender isn't as comprehensive as something like Trend Micro or Kaspersky, but it does the job and has low impact. The fact that modern Windows installs have proper anti-malware up to date by default now is making a big difference to the general state of end-user security.
Agreed, and compared to Windows 95/98/ME, Windows NT/2000/XP was a veritable fortress.
Microsoft has come a LONG way, from a relative laughing stock that could not be taken seriously for anything moderately secure, to a reasonably decent platform.
Such a shame though coding for it is a royal pain due to its largely NIH-inspired programming interfaces.
Linux has moved forward in that time, but it didn't have as far to come on the security front.
It has a LONG way to go on the usability front. People at my workplace complain about how hard Linux is to use, even describe it as "weird", but that is because many of them started with Windows XP (or maybe Windows 98) and didn't see what Linux was like years ago when getting a graphical desktop meant a long session with XF86configurator and a need for deep knowledge of your hardware.
>>"It has a LONG way to go on the usability front."
I actually find it fine to use, though I will concede I started out with HP-UX and X Windows, so I may not be fully calibrated to the average user. But still, I think distros like Mint are pretty good out of the box. I agree it is light years ahead of where it was, and I have many memories of hours spent editing xorg files trying to get it to work right.
The area that I personally think GNU/Linux might want to improve on a bit more is enterprise tools. I'm happy to be corrected on this one if I'm wrong. I have programmed on GNU/Linux professionally and used to use Gentoo as my primary, so I have a reasonable understanding of the principles and how it is put together, but I have never administered a company's Linux systems, so I may not have a solid feel for this - like I say, if I am wrong I am happy to be corrected. But last year I encountered Puppet for the first time. I also have had to witness the painful, painful way in which user accounts are being managed across many Linux boxes / VMs. The sysadmins doing all this aren't idiots; they're smart people. So if this is really how things are done in the Linux enterprise environment, then they are actually behind the tools that MS provide for this by a considerable margin. Given Linux's stronghold is backend enterprise, I think this is as important as UI refinements, imo.
Of course it's difficult to find people who are experienced sysadmins of both Windows AND Linux, so informed comparisons are hard to come by. Unlike most of my posts, I won't be arguing in defence of this one either way - these are just my impressions.
System administration is one of those areas where Linux has suffered because of the diversity of the distros.
The one-size-fits-all processes like useradd will do the basic job at hand on the local system, and are pretty similar across all versions of Linux. Once you get beyond this, each of the distros has its own idea of how to streamline this and other admin tasks, and most of these are pretty distro-specific. In some cases they are proprietary and closed source to try and generate a revenue stream, and do not interoperate.
There is not even a consistent package management format across all versions of Linux.
It is very difficult for a new Open Source package to come along and streamline this. What is needed is a low-level tool that goes in at a suitable level so that it can manipulate the configuration files/databases/objects fundamental to Linux, to provide a consistent system management layer in all distros.
What you actually get (like with Puppet) is a whole load of distro specific methods layered on top of and driving the specific interfaces for each distro. This works, but is high maintenance, which often means that it becomes paid-for software (again, Puppet is an example of this).
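The "distro specific methods layered on top" pattern can be sketched in miniature. This is a hypothetical illustration of the general dispatch idea, not how Puppet is actually implemented; the tool table and function are invented for this example:

```python
# Hypothetical "veneer" layer: one abstract request (install a package)
# dispatched to whichever distro-native tool happens to be present.
PACKAGE_TOOLS = {
    "apt-get": ["apt-get", "install", "-y"],                 # Debian/Ubuntu family
    "dnf":     ["dnf", "install", "-y"],                     # Fedora/RedHat family
    "zypper":  ["zypper", "--non-interactive", "install"],   # SuSE family
}

def install_command(package: str, available_tools: set[str]) -> list[str]:
    """Build the native command line for the first known tool present.
    The per-distro knowledge lives in the table above - and keeping that
    table current as every distro evolves is the high-maintenance part."""
    for tool, argv in PACKAGE_TOOLS.items():
        if tool in available_tools:
            return argv + [package]
    raise RuntimeError("no supported package manager found")

# e.g. on a Debian-family box:
print(install_command("openssh-server", {"apt-get"}))
# -> ['apt-get', 'install', '-y', 'openssh-server']
```

The layer only drives each distro's documented front-end tools; it never touches the underlying configuration files, which is why it stays shallow and expensive to maintain rather than providing the consistent low-level interface described above.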
There are two ways this could happen. One is if the major distros decide to collaborate and produce a common administration interface. The other is for a standardisation body to add the specification of such an interface, and have the distros adopt that standard.
The former is unlikely to happen, as the distro specific sysadmin stuff is where people like RedHat and Canonical make some of their money. The latter cannot happen as there is no accepted Linux standard or even standardisation authority, and even if there were, it would be dominated by the commercial distro maintainers, because they are the only people who might have resources to invest in a standard, and then we are back to the former point.
So what we have left is paid-for software or home-grown scripts put together by sysadmins which do the job, but are seen as being messy.
I can see no way of moving this forward unless someone with big pockets and a lot of influence with the distro maintainers decides to take it on.
>>"The former is unlikely to happen, as the distro specific sysadmin stuff is where people like RedHat and Canonical make some of their money. The latter cannot happen as there is no accepted Linux standard or even standardisation authority, and even if there were, it would be dominated by the commercial distro maintainers, because they are the only people who might have resources to invest in a standard, and then we are back to the former point."
That's a really interesting post, I've just snipped out part of it. It might be optimistic (or naïve according to view) but perhaps there is a third option. Linux grew out of a community of people collaborating voluntarily. Perhaps given there is an evident need, the same can happen again. It may seem unlikely, but then the entire Open Source movement was, and yet people made it happen.
It is entirely possible that it could be done as a community project, but the resource involved would probably be too much for a one-man band, or even a small group of people doing it in their spare time, and the necessity to test it against the plethora of distros would be a similarly mammoth task.
It's easy to have a community project that adds a veneer over the top, because you can break the tool down into modules that drive the documented tools. Getting in at the fundamental layers, where the different distros tend to differ from each other, and where the documentation has not been maintained, or in some cases not even written, is a much harder task, and requires much more research and testing.
It would be difficult to get such a layer accepted to the extent that the major distro owners would adopt and maintain this common approach in preference to their own distro specific tool.
If we had had a situation where a fully free Linux had become a de facto standard, then if that distro maintainer was altruistic, they could have incorporated something like this and hoped that it would be picked up by other distros, but it seems unlikely that the increasingly fragmented Linux world will settle on a dominant distro (hell, systemd risks fracturing the community even more than it currently is).
What with Canonical, a company that was being portrayed as a bit of a white knight a few years back, going in a direction that is unlikely to be followed by other distros, I think the time for a dominant distro is fading into the past. Mint is unfortunately reliant on Ubuntu, and RedHat always had an agenda of trying to leverage support contracts from their users. SuSE, whose independence looked to be under threat, appears to have weathered the storm but has lost followers. Debian appears to be going with systemd, which will alienate a lot of people (and will be a nightmare to administer using a tool such as I am proposing).
I suppose that Lennart Poettering (systemd) could take on an administration tool that would plug into systemd and extend it to cover other sysadmin tasks, but I for one would not trust him to run such a project without making it almost completely unusable/unsupportable.
There's not much I can argue against in that post. Seems to be (sadly) right on the money. Especially your summary of the main distros. I'm quite sure that Poettering probably would take it on - seeing as there's nothing he's encountered so far that he hasn't tried to vacuum into systemd. But like you, that's not a solution I look forward to seeing.
I have many memories of hours spent editing xorg files trying to get it to work right.
I still have the occasional nightmare featuring sendmail.cf :)
I rather liked HP-UX, more than IBM's AIX (use the force menu, Luke). SunOS and Solaris weren't bad either, provided you got GCC installed asap. Ah, memories.. :)
I think it's a good thing that this apparent myth of invulnerability got cracked, because it ensures people go back to actually paying attention to security. This whole "it can't happen to me" feeling was dangerous IMHO.
Having said that, I still prefer a Unix derivative over Windows but that has more to do with expertise. I know what to look for to make a Unix derivative safe, whereas someone who works with Windows on a daily basis as sysadmin is always going to be better than me at keeping that platform clean.
" People at my workplace complain about how hard Linux is to use, even describe it as "weird", but that is because many of them started with Windows XP (or maybe Windows 98) and didn't see what Linux was like years ago when getting a graphical desktop meant a long session with XF86configurator and a need for deep knowledge of your hardware."
To a large extent "hard to use" can translate as "different" but the desktop you're providing can also make a difference. Presumably they'd have come up with exactly the same reaction to Win 8.
Personally I found it easier to transition from XP/7 to Linux Mint than to Windows 8. When some friends heard I was using Linux they "didn't want to know about all that command line stuff" until they saw the actual desktop in action and thought it was some version of Windows they'd not heard of.
The only area Linux falls behind Windows for me is printing, and that's down to manufacturers' driver support (specifically Canon, the bar stewards).
Biting the hand that feeds IT © 1998–2019