The GPL self-destruct mechanism that is killing Linux

Does one of the biggest-ever revolutions in software, open source, contain the seeds of its own decay and destruction? Poul-Henning Kamp, a noted FreeBSD developer and creator of the Varnish web cache, wrote this year that the open-source world's bazaar development model - described in Eric Raymond's book The Cathedral …

COMMENTS

This topic is closed for new posts.
  1. Lee Dowling Silver badge

    And your point is?

    I can't really get a handle on where the article is supposed to lead me.

    Linux is running on more machines worldwide than just about anything else. Android smartphones, TomTom navigation devices, set-top boxes and smart TVs alone outnumber the only operating system ordinary people have ever heard of (Windows). BSDs? I'd be hard pressed (apart from certain parts that made their way into things like the Windows TCP/IP stack decades ago) to name anything that's really come from them. So Linux isn't unpopular either in in-depth-hardware-geek territory (people who most certainly have heard of BSD, and would use it if they could - because it doesn't require them to expose their own code - and yet hardly anyone makes embedded devices that run on it), or in general homebrew usage (Raspberry Pi, various handheld consoles, etc.). I don't get the argument you're trying to present by suggesting it'll all go titsup.

    And Linus is preventing Linux forking by being unique - so if it forked, who would do Linus' job in the fork? Either someone would come along (and thus Linus wouldn't be unique), or someone wouldn't (in which case it wouldn't be forked).

    I've lost the point there, apart from suggesting that the GPL (*the* most popular open source license) is somehow corrupting. I believe that was its intention, so people couldn't freeload from it for their own commercial purposes and not contribute back (for hobbyist purposes, it has no real hindrance because you only have to offer your code to the people who end up with the end product of your derivative work, which is probably just you).

    And quoting Tanenbaum is really the last straw - his own progeny MINIX hasn't been touched in years, barely runs any of the huge amount of code out there today, and is unheard of outside of academic operating-systems courses. He operates in a world of perfect mathematical programs; no real-life OS would ever satisfy those criteria, and would always be "obsolete" (and, don't forget, MINIX predates Linux - Linux is basically the "I can do that better" version of it that Linus wrote. I think he proved his point).

    I'm not a massive advocate for the subtleties of open-source, I avoid licensing wars like the plague (seriously, BSD, GPL, or proprietary is all I really care about - even the versions don't bother me much), and I don't care for the personalities and their opinions much. But this article is a rambling mess that somehow tries to sow seeds of doubt about Linux with no, actual, real point to do it with. It's the sort of thing I'd expect on the FSF website, not here.

    1. petur
      Meh

      Re: And your point is?

      Well, I know of at least one big TV manufacturer running FreeBSD on its sets, but other than that I agree with your comment. Ask somebody from the BSD camp about the GPL and there's your article ;)

      1. David Kelly 2

        Re: And your point is?

        Panasonic Viera.

        1. Anonymous Coward 15
          Joke

          Re: And your point is?

          He's a footballer or something?

    2. Lars Silver badge

      Re: And your point is?

      Yes, about that article - I suppose if one's thinking is muddled, the writing is muddled too.

    3. frank ly

      Re: And your point is?

      "... you only have to offer your code to the people who end up with the end product of your derivative work, which is probably just you."

      As far as my understanding goes, (please correct me or give other opinions if you can), this also extends to in-house corporate use. If a company decides to use Linux, or any GPL software, for purely in-house use as part of its internal operations (e.g. process monitoring/control, networking, e-mail, etc), and they develop clever modifications and add-ons; then that corporation does not have to make their new source code available. In asking their employees to operate the clever machines and systems they have developed, they are not actually 'distributing' the code (as specified in the GPL license), they are simply making tools for employees to use.

      There are some people who argue against corporate use of GPL code by saying, " .. if we develop anything useful and clever, we have to give it away to the rest of the world, according to the license." I believe this is not true. They also say, " .. at least with Microsoft, we'll get years of product support." Hahaha.

    4. jocaferro

      Re: And your point is?

      FreeBSD choice of Clang vs gcc?

      But you'll need a crystal ball to guess because this article is a mess.

      This is an old war since the days when gcc became GPL v3 - I think FreeBSD was still using an older version of gcc.

      As a developer I welcome the inclusion of Clang - first there is more choice and second this will benefit Clang and at the same time gcc. However I still use gcc since I feel it is (yet) better - it's my opinion and certainly many people in the BSD "world" will disagree.

      Now - "Linux is dying"

      As a matter of fact Linux is gaining, and in many areas is dominating - from tiny embedded systems all the way to almost 95% of the world's 500 largest supercomputers!

      Except on the "desktop" - but the paradigm has changed, and nowadays the "desktop" no longer holds the majority of the userland market...

    5. Anonymous Coward
      Anonymous Coward

      I thought this article was about GPL

      ...then we go and invoke the old micro vs. monolithic kernel debate in the last paragraph. It leaves the impression of a hatchet job - [let's pile every bit of anti-Linux FUD we can find and throw it into one article and see what happens].

      Differing opinions are well and good, but in the end the current state of affairs for Linux is that its use is exploding pretty much everywhere other than the desktop... so the title premise - that something is killing it - is provably false.

      To the contrary, in the past few years I have seen a marked move on the part of server software vendors to closed, appliance VMs based on Linux and away from self-installed versions of their software for various OSs. Forking and the GPL make that possible.

      You can do better than this Reg.

      1. Lars Silver badge
        Coat

        Re: I thought this article was about GPL

        Yes perhaps, but I can find no reference to Liam Proven as working for the Reg.

        1. Daniel von Asmuth
          Flame

          Re: I thought this article was about GPL

          This is just the old flamewar about whether the GPL or BSD is the one true Open Source licence. There is just no objective answer. If you want the cathedral rather than the bazaar, get AT&T's Plan 9 (Unix done better) or Inferno (better still) or Microsoft's Midori OS.

    6. Anonymous Coward
      Anonymous Coward

      You've talked the point under.

      Actually, the BSD family is more of an insider tip in the industry. So what if it doesn't have the recognition or fanbase; there are a number of people who do know the value of the software and make good use of it. In fact, several embedded manufacturers would've gotten into a lot less trouble had they not tried to jump on the Linux bandwagon and then gotten bitten by the so-called "viral" features of the GPL. You know, where they "neglect" to release the source to their distributed modifications and end up getting sued.

      Apparently the industry suffers more from whippersnapper engineers never having heard of this cathedral thing than you might think - which is what the ACM article this builds upon argues, in typical forceful PHK style. Or maybe it's just the managers who drive the decisions, and do so mostly on bandwagoneering reasoning.

      The BSDs, in contrast, do allow you to simply take the source and do whatever you like with it. Don't even have to mention it these days (the licence got amended, and it's not GNU philosophy that drove opening the source in the first place, by the by). Still and all, there are a number of features that seeped back into the BSDs from commercial non-open-source use. Most notably netgraph, which is fantastically useful if you need that sort of thing. Came from a vendor that used the software to build routers with. There's still vendors that do that, juniper being a rather big name one.

      There are TVs, gambling machines and whatnot else that also run BSD software. That you don't see it advertised doesn't mean it's not true; it just makes it the insider tip alluded to earlier. The Windows networking stack, though, is something they "rewrote from scratch" --IIRC it's been replaced twice now-- reintroducing a number of really old DoS vulnerabilities around the time of the Vista previews. Apparently Redmond is not big on regression testing. By the by, the original BSD networking stack was copied from somewhere else, so as not to reinvent the wheel. The extensive work that happened afterward did make it good enough for others to take and build upon, which happened repeatedly. The GNU crowd, though, tends to leave all that aside because it's not the right licence for them - or rather, not left enough of one, the BSD licence having been nicknamed "copymiddle".

      The Tanenbaum comment isn't as offensive as you make it out to be: technically, as in from a computer-science point of view, Andy has a point, and Linus did leave a bit of an opportunity on the table on the architecture front. Of course, he was but a student, not a professor, and he didn't care. Seeing his massive success, especially when the Hurd didn't reach anywhere near it, he still doesn't care. And given that the sheer momentum of Linux's success manages to bulldoze over many a crack, well, few others care much either. But that doesn't make Andy wrong: QNX is a nice example of a technically successful microkernel, though its price tag didn't help uptake much outside of specialist niche markets. Then again, if you don't care about subtleties, then indeed you'd best ignore that comment.

      And you'd also best ignore MINIX and the fact that it's still being worked on. Its development doesn't come quick compared to the massive momentum Linux has, but it does progress, and the thing is still alive. Whether it'll stay with us if Andy perishes, well, who knows? But we don't know that of Linus either, and for such a large project that is a bit bothersome - much like the jury is still out on whether Apple will survive Jobs' demise, in fact. That was one of the points the article tried to make.

      1. Anonymous Coward
        Anonymous Coward

        Re: You've talked the point under.

        Regarding the Tanenbaum comment:

        The micro- vs. monolithic-kernel debate is, IMHO, tantamount to RISC vs. CISC: both sides have good points to make, academically one side is viewed as better, while the other has gained dominance in the market (save ARM on smartphones, and MIPS/ARM in the embedded space). The fact that both sides of each debate still exist and, to varying degrees, thrive in the market is a good thing. In both cases the debate will continue for the foreseeable future with, more or less, the same lines being drawn and points being made.

        Complaining (at least my complaining) that it was raised in this article was not due to the validity of Tanenbaum's argument but rather the inappropriateness of bringing it up as a parting shot in an article about the GPL. I'd have the same objection to an article about a comparison of the ARM vs. MIPS vs. Intel business models if a one-sided swipe of the old RISC vs. CISC debate was thrown into the last paragraph.

      2. JEDIDIAH
        Linux

        Re: You've talked the point under.

        This is just sour-grapes nonsense. The BSD folks are just mad that it's Linux that became popular and successful. On the one hand, the licence on the source just doesn't matter for most people and even most companies. Most people simply don't have a four-year-old's notion of property (what's mine is yours and what's yours is mine), so the whole drama of BSDL vs GPL is entirely irrelevant for them.

        On the other hand, contributors might object at being free labor for IBM or Oracle or Apple or Microsoft.

        THIS is why the GPL was created. It wasn't some subversive political agenda on the part of RMS. His contributors were p*ssed over exactly the kind of corporate proprietary assimilation that the BSDL allows for.

        You gotta keep the talent happy.

        Noisy BSD fans are like the Trench Coat Mafia fantasizing about revenge on the popular crowd.

        1. Anonymous Coward
          Linux

          BSD folks are just mad at Linux?

          Then I propose a de-fork, a merger of BSD and Linux ...

          1. superlinkx

            Re: BSD folks are just mad at Linux?

            Except Linux isn't a fork. It's a rewrite...

      3. jocaferro

        Re: You've talked the point under.

        "In fact, several embedded manufacturers would've gotten into a lot less trouble had they not tried to jump on the linux bandwagon then gotten bitten by the so-called "viral" features of the GPL. You know, where they "neglect" to release the source to their distributed modifications and end up getting sued."

        I think they - and you - still don't understand the GPL, otherwise you'd know there are at least two options:

        1. If you/they want to distribute the original code and your modifications entirely under the GPL, then yes, you must release the source;

        2. If you/they want to distribute your modifications under another licence, you can do so via a licence exception - provided the copyright holders grant one.

        So, what's the point here?

        They don't read the Licence or ask if they could somehow distribute their code under another licence?

        Anyway, I absolutely agree with your point about BSD.

    7. The Indomitable Gall

      @Lee Dowling

      The Torvalds vs Tanenbaum argument wasn't resolved to say "Linux is better" but "Linux is quicker and easier to produce". I don't think Tanenbaum ever argued against that, but rather suggested that Linux was quicker and easier to write because it was a hack. Ideologically, I'm on the side of the microkernel, but practically, I use Linux because it's there, and because there's stuff for it. And I use Windows more often than Linux, because there's even more stuff for it.

      But with the volume of people working on Linux now, I don't see why there isn't a concerted effort to shrink the kernel. It would save a lot of the "roll your own" work required for installing on non-standard or Frankenstein systems. (And it might help get rid of that persistent laptop backlight problem...)

    8. Christian Berger

      Many negative critiques about Unix and Open Source...

      Many negative critiques of Unix and Open Source are incoherent and muddled, so the quality of this article isn't too surprising.

    9. Vic

      Re: And your point is?

      > it has no real hindrance because you only have to offer your code to the people who

      > end up with the end product of your derivative work

      This is not true.

      [I'm going to quote GPLv2 here, but GPLv3 has similar clauses, albeit with different numbers.]

      You have the option to redistribute under section 3(a). This requires a copy of the source to *accompany* every single binary distribution - including downloads and updates. This clause quite clearly only means you need give your source to your downstream recipients (and the licence explicitly states that). Few projects use section 3(a) distribution.

      Most projects distribute under Section 3(b). This requires you to make an offer of source - valid for 3 years - to *any third party*. Any. Even those who haven't got your binary.

      The third form of distribution is Section 3(c). This is only available to non-commercial distributions, and only where the distributor himself received the code in binary form (e.g. from someone else's 3(b) distribution).

      Aside from that quibble, though, I'm with you on your response to the article...

      Vic.

    10. Antoinette Lacroix
      Devil

      Re: And your point is?

      If you're looking for devices running BSD, look no further than the PlayStation - CellOS is a FreeBSD branch - and there are others. There's a difference between devices running a streamlined kernel with a few programs made by professionals, and a full-blown Linux distro, though. What the latter suffer from is called 'Lego syndrome': too much code with too many weird dependencies. It's all about new features, more often than not inspired by Microsoft. (KDE4's "semantic desktop search" is a nice example - it needs its own database/server. WTF?) Regular Linux users don't realise the problem because new dependencies are automatically installed by their package manager. Just one more package - so what? If you're on BSD and have to compile hundreds of MBs of dependent code for a mandatory feature, you might see things differently. You'd ask yourself, like Poul-Henning Kamp did: "Is this really necessary? Can't they just code a few functions themselves instead of relying on all that third-party stuff?" Well... apparently not - and THAT is the point of the article. The more inexperienced programmers reuse code they don't understand, the more unmaintainable it gets.

    11. Anonymous Coward
      Big Brother

      Re: And your point is?

      "I'd be hard pressed (apart from certain parts that made their way into things like the Windows TCP/IP stack decades ago) to name anything that's really come from them."

      OS X? Though its kernel is a descendant of NeXT's Mach, it's fair to say the system application layer and command-line tools are still basically FreeBSD.

    12. RAMChYLD
      Boffin

      Re: And your point is?

      > BSDs? I'd be hard pressed (apart from certain parts that made their way into things like the Windows TCP/IP

      > stack decades ago) to name anything that's really come from them

      Mac OS X runs on top of a BSD kernel, albeit a heavily patched one.

      Also, FreeNAS devices (e.g. certain QNAP boxes) run a BSD kernel.

      > And Linus is preventing Linux forking by being unique - so if it forked, who would do Linus' job in the fork?

      Erm, didn't Google fork the Linux kernel for Android? Yeah, they still call it Linux, and they do keep pulling a new version of the kernel when it comes out, but then they heavily patch it to fit their needs. Does that count?

      1. superlinkx

        Re: And your point is?

        They started work to re-merge it last year. I believe that 3.5 or 3.6 brings both kernels in sync, at least mostly. If not a complete merge, I know it's planned to be using the mainline kernel at some point in the not so distant future.

    13. Daniel B.
      Boffin

      Re: And your point is?

      You'd see why the article calls Linux a series of cheap hacks if you read the part saying "... says some dude from FreeBSD". Every couple of years, someone from the bitter BSD camp will come out and bitch about Linux, because Linux went out and did what GNU and BSD were supposed to be: the free/open alternative to Unix. See Theo de Raadt basically spewing the same bile about 5+ years ago. (The GNU people have their own tantrum; they insist on calling Linux "GNU/Linux" as well.)

      That said, the flock of C-gulls description isn't that off the mark. I've been using Linux since 1998, and during that time I've seen the silliness of branching and deprecation done real quick for either personal tantrums, pride, or infighting within the dev groups. Anyone remember ALSA, which was the one standard to supersede all other sound systems in Linux? Now there are a zillion "sound systems" still duking it out. Ditto with the XMMS project mentioned in this article. Or mpg123 and mpg321. And now the kernel itself seems to be doing the stupid change dance as well. Anyone using the latest and greatest distro might have noticed that the standard ethernet interface is no longer "eth0" but some weird thing called "p6p1". What does that mean?

      So Linux and the FOSS community do need to get their act together, but it isn't as bad as the BSDites are painting it.

      1. Anonymous Coward
        Unhappy

        Re: And your point is?

        "Anyone using the latest and greatest distro might have noticed that the standard ethernet interface is no longer "eth0" but some weird thing called "p6p1". What does that mean?"

        There seems to be a pointless rush towards obfuscation at the moment. You see the same nonsense with the use of device UUIDs instead of /dev block devices in fstab, and the creation of that over-complicated abortion known as systemd. Why? God knows. The only reason I can think of is that the devs think it's more l337 to create and use stuff that is more cryptic than its predecessor, and frankly because sometimes they just can't seem to tell when something ain't broke.
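        For reference, the fstab scheme being complained about looks like this (the UUID value here is illustrative); the stated rationale is that /dev/sdX names depend on probe order, while a filesystem's UUID is stored in its own superblock and stays fixed:

        ```
        # /etc/fstab excerpt - mounting by filesystem UUID instead of device node.
        # The UUID for a partition can be read with `blkid /dev/sda1`.
        UUID=3e6be9de-8139-11d1-9106-a43f08d823a6  /     ext4  defaults  0  1
        /dev/sda2                                  swap  swap  defaults  0  0
        ```

        Whether that robustness justifies the extra indirection is, of course, exactly what the poster is disputing.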

      2. Anonymous Coward
        Anonymous Coward

        Re: And your point is?

        "Anyone remember ALSA, which was the one standard to supersede all other sound systems in Linux? Now there are a zillion "sound systems" still duking it out."

        I think you'll notice that most of the sound systems "duking it out" on Linux are in fact abstraction libraries over ALSA itself (eSound, PulseAudio, GStreamer, Phonon and so on are just ways of simplifying certain tasks; ALSA still does the heavy lifting). ALSA was meant to replace OSS (which at the time was outdated), not any of those. This it did.

    14. SFC

      Re: And your point is?

      Just because you don't know what's running FreeBSD doesn't mean its install base is minimal. Because you don't have to share the code, people don't have to tell you whether they're using it - and they never have to give anything back.

      Off the top of my head: NetApp and EMC Isilon are both based off FreeBSD, as are Kace (which Dell now owns) and JunOS by Juniper.

      http://en.wikipedia.org/wiki/List_of_products_based_on_FreeBSD

  2. Sam Liddicott

    more buy in

    Has this guy ever noticed the festering pile of hacks that also masquerade as proprietary software?

    Linux and GNU, with their terribly oppressive licences, managed to get more buy-in more quickly than the BSDs did.

    1. An0n C0w4rd

      Re: more buy in

      While you suggest that the reason GNU/Linux got more buy in is due to the license, the reality is rather more complex and a lot of people who were involved at the time believe it is more to do with the AT&T/USL lawsuit (mentioned in passing in the article) which accused UCB and the CSRG of copying proprietary files from SYS V into BSD. The end effect was several years where people looked elsewhere for their open source needs until the lawsuits were settled, and even then the free BSDs had considerable work to remove the allegedly offending code from their trees and move from BSD 4.3 Net-2 based code to the 4.4-Lite code. e.g. FreeBSD didn't get 2.0 (4.4-Lite based) out until 1994, and it wasn't really stable for a while.

      All the time Linux was quietly spreading without all these issues.

      A lot of people believe that the balance between the open-source OSes would be different today had it not been for the AT&T/USL suit.

      1. JEDIDIAH
        Linux

        Re: more buy in

        Linux was the first x86 Unix that supported my hardware. That's really all there is to it.

        You can make all the excuses you like, but I think the fact that Linux was a populist Unix early on contributed to its popularity quite a bit. At the time, I would have been willing to shell out the coin necessary for Solaris x86 or OpenStep if only either of those had actually supported MY hardware.

        On a long trajectory, a very small angular deviation can account for a very big difference where you end up.

        1. the spectacularly refined chap

          Re: more buy in

          Linux was the first x86 Unix that supported my hardware. That's really all there is to it.

          Never heard of Xenix then? Or SCO Unix, which was quite good back in the day at the "old" SCO even if the "new" SCO trailed its name through the mud? Or even 386BSD or Minix? Unix on x86 hardware goes back the best part of a decade before Linux.

          1. foo_bar_baz
            Facepalm

            Re: more buy in

            "My hardware" probably entails things such as drivers for a specific NIC and VGA card, not just the CPU architecture.

            But thanks for the history lesson, you really showed him, you refined chap.

            1. the spectacularly refined chap
              FAIL

              Re: more buy in

              That is nothing short of revisionism not backed up by reality. Try looking at the supported hardware of Linux distributions circa 1995 - i.e. after a few years of Linux development.

              Linux didn't support any video cards except as text consoles. XFree86 supported around a dozen, and even then generally as unaccelerated dumb frame buffers. In any case even then it was not tied to Linux but ran on other x86 operating systems if you so wanted so the idea that Linux supported any video cards that the others didn't is completely wide of the mark. Quite the reverse in fact: the X servers of the commercial Unixes had much better hardware support. SCO's X server for OpenServer supported 20-30 different cards, accelerated where supported by the underlying hardware. Linux simply wasn't in the same league.

              As for NICs it was the same story there. You had a choice of about six different cards, all ethernet so if you were a token ring shop back then you were out of luck. If you wanted 100Mbit ethernet you had a choice of two cards. OpenServer had support for 30-40 different adapters in the core distribution and pretty much any card not so supported would come with drivers for it - even low end cards. You can make exactly the same story about disk controllers too.

              The idea that Linux somehow bounded into existence with a full complement of device drivers from day one is very wide of the mark. From perhaps 1997 onwards Linux's hardware support had been rounded out much more fully, but it was actually quite poor to begin with. To pretend otherwise is either ignorant or deliberately misrepresenting the facts.

              1. Michael Wojcik Silver badge

                Re: more buy in

                The idea that Linux somehow bounded into existence with a full complement of device drivers from day one is very wide of the mark. From perhaps 1997 onwards Linux's hardware support had been rounded out much more fully, but it was actually quite poor to begin with.

                All true, but the post you were originally responding to[1] made no claim to the contrary. All it said was that Linux was "the first x86 Unix"[2] that supported some unspecified hardware. No mention was made of when Linux supported that hardware.

                That's not to say that I find JEDIDIAH's argument convincing, mind. In my experience, PC hardware support for any OS besides Windows was pretty lousy until the late '90s. And I worked with most of them: most of the MS-DOS versions, every version of Windows from 2.0 on, all of the OS/2 versions (which lacked decent drivers even for many of IBM's own Thinkpad models), SCO Open Desktop and Unixware, Solaris x86, FreeBSD, a handful of Linux distributions, Novell Netware, Coherent[4], and possibly some others which I've forgotten at the moment. Once Windows 3.0 became dominant in the PC market, most manufacturers just didn't care about drivers for other OSes.

                [1] By block-caps JEDIDIAH, which I assume is an acronym for a cabal of some sort or perhaps for an AI project cleverly mimicking a Reg commentator as part of a Shannon language-model test.

                [2] Technically incorrect, since Linux is not UNIX-branded,[3] but we'll accept "Unix" here as shorthand for "Unix-like OS".

                [3] IBM z/OS, on the other hand, is UNIX 95-branded, so if you want a real UNIX, I recommend buying a System z and installing z/OS. Old UNIX hands might find ISPF a bit odd, but using USS over a TN3270 connection will be all kinds of familiar.

                [4] A UNIX clone from Mark Williams Company, never very successful. Wikipedia claims it was the first UNIX-like OS for the PC and originally released in 1983; I haven't verified that (I didn't buy a copy until sometime around 1990). MWC was later made infamous by pursuing one of the first stupid-computer-patent lawsuits, over their patent on ... wait for it ... byte ordering.[5]

                [5] US 4,956,809. Read it if you dare.

  3. Sergey 1
    FAIL

    B0LL0X

    It's been GPL for years, and it's ended up in a bunch of fantastic OSes. Where's the doom and gloom?

  4. Tom 7

    The only seeds sown for its demise

    appear to be from the proprietary camp competing with lawyers instead of software. The only way the GPL is going away is if someone manages to make it illegal.

    And as for the Tanenbaum comment from 20 years ago - they had a bit of a tiff back then, but I doubt Tanenbaum feels the same way now, and if you were to go and read his works on OS design you would realise your article was mostly tosh. Especially the bit about reinventing the wheel: MS does it every two years, and as long as people are taught MS or new-language shit in college, and not Tanenbaum and Knuth, we are going to get new wheels every few years, with people patenting rounded corners on triangle wheels (one less bump than a square wheel!). And ask again what you think an oppressive licence is - you can't get much more oppressive than MS/Apple's pay-us-or-FOAD approach to licensing.

    1. Jaybus

      Re: The only seeds sown for its demise

      The seeds of destruction have little to do with the GPL and everything to do with the patent laws of various nations slowly making it impossible to write new code, free or otherwise. It is already impossible to come up with a new video codec that doesn't violate existing patents - hence the severe restrictions on the HTML5 video tag. Patents are being handed out for trivial adaptations that any engineer would see as hardly worthy of being termed an "invention". Ridiculous, really. This is the real danger to open source software. Closed source software runs into the same infringement issues, but of course it is much easier to prove an infringement against an open source project.

  5. Destroy All Monsters Silver badge
    Devil

    Nein! Nein! Nein! Nein! Plan 9!

    "A pile of old festering hacks, endlessly copied and pasted by a clueless generation of IT 'professionals' who wouldn't recognise sound IT architecture if you hit them over the head with it"

    Woah, I am relieved that I am not alone in the WTF reaction I had when I checked what happens underneath "./configure".

    Also, the original rant (a bit over the top, and dated August 15, 2012 but then again this is El Reg) and the comment section is of paramount reading importance. The commentariat is often better than the rant, but then you have things like this:

    ---------

    metageek | Mon, 20 Aug 2012 16:11:54 UTC

    This is a typical engineering point of view. Engineers like to think that the world could be designed, however Nature shows the opposite. Any sufficiently complex system cannot be built out of design, it has to be *evolved*. Evolution is messy (anarchic?): pieces that work get reused, pieces that do not are lost. There must be something good about autoconf that has enabled it to survive and eliminate those awful "designed" makefiles that needed manual tweeking even though they did not check for 15 different Fortran compilers (the status quo of the 1980s). Does it matter that autoconf looks for those now extinct compilers? No, but it gives it robustness which is why it has survived. Someday a new, less bloated, alternative to autoconf will appear and it will slowly overtake autoconf. In the meantime we are doing very well with it, thank you very much.

    Software is the most complex construction that mankind has so far created. It is so complex that it can only be evolved much like Nature's products are. Engineers can continue living in their nice little artificial world of linear systems and planned devices. Those that venture into the real world of complexity and are successful will take up Nature's ideas.

    Goodbye Unix of the 80s. Linux is here to stay

    ---------

    This is exactly the kind of person you actually want just carrying the boxes in the basement, lest they give you a Therac-20, again.

    1. Anonymous Coward
      Anonymous Coward

      Re: Nein! Nein! Nein! Nein! Plan 9!

      Configure does make me wonder. Why does it look for a Fortran compiler when the program is in C? Why does it take longer than the actual compilation?

      1. fch
        Mushroom

        Re: Nein! Nein! Nein! Nein! Plan 9!

        configure / autoconf doesn't make me wonder - it makes me curse, swear and use sewer language of the worst kind. Nuking it from orbit is too kind a death for it.

    It's not a tool, it's a non-tool. Full agreement with the *BSD ranter there - no one bothers understanding autoconf input or setting it up properly; it gets copied-from-somewhere and hacked-to-compile; if "development toolkits" provide/create autoconf files for you, they're usually such that they check-and-test for the world and kitchen sink, plus the ability to use the food waste shredder both ways.

        The result is that most autoconf'ed sources these days achieve the opposite of the intent of autoconf. Instead of configuring / compiling on many UN*X systems, you're lucky today if the stuff still compiles when you try on a different Linux distro than the one the autocr*p setup was created on.

    It had its reasons in 1992, but the UN*X wars are over; these days, if your Makefile is fine on Linux it's likely to be fine on Solaris or the *BSDs as well. Why bother with autoconf? Usually one of: "because we've always done it that way, because we've never done it otherwise, and by the way, who are you to tell us!"

    2. Jaybus

      Re: Nein! Nein! Nein! Nein! Plan 9!

      Indeed, software, and computer systems in general, have followed an evolutionary path from the start. I wouldn't argue otherwise. Computers still have BIOS beep codes and humans still have an appendix. However, there is a subtle difference. Engineers affect evolutionary gains (or failures) by designing changes to an existing system. In Nature, evolutionary gains (or failures) come about by random chance. As an engineer, I will stick to my belief that a guided evolution must surely affect the rate of evolutionary gains in a positive manner when compared to the rate afforded by random chance.

    3. Anonymous Coward
      Stop

      Don't rag on autotools

      The autotools are a bit difficult to learn, but I think they are worth the effort. M4 is a bit fugly, but with a bit of practice and ruthless factoring into small macros it's not that bad. (Top tip: add banner lines to your macros so you can spot the output in the generated configure script.)

      With the autotools, I get cross-compilation, packaging, cross-compiled unit tests that execute in a cross environment, transparent replacement of missing functions at link time, standardised argument handling which generates help messages, binutils access, and the ability to mung various assets (images, SQL, etc.) into my code with very little effort.

      Mostly I copy my build-aux and m4 directories into a new project and write a simple configure. My heart sinks when I have to work on project that doesn't use autotools.

      So I think the autotools survived because when you take into account everything it provides, it's streets ahead of everything else. (Libtool is still a thorn in my side, admittedly)

    4. Infernoz Bronze badge
      Stop

      Where are the /free/ Micro Kernel OSs; I think that says it all!

      The only use I have for BSD is the specialised FreeBSD dist. FreeNAS running my RAID box; BSD took too long to become free, so most free-OS users bypassed it for Linux. No, I don't have or want any iCrapple devices.

      I had an Amiga twice, I thought it was brilliant because the message passing made lots of stuff really easy to do, and did usable Multi-tasking unlike most other computers then; however Amiga OS is irrelevant now because it was not set free; such a missed opportunity!

      Linux is massively successful because it has always been free; true a better new OS may eventually replace it, maybe even a Micro Kernel, but it must be free; this is where the toy OS Minix failed hugely, and Tanenbaum was talking complete nonsense then to say that Multi-tasking was not critically important for any practical OS, especially a Micro Kernel!

      As for http://www.haiku-os.org/ and http://common-lisp.net/project/movitz/: you are taking the piss. The first "car" is a hobby alpha concept car, so not safe to drive away from a test track; the second "bicycle" is more like a unicycle with a solid wheel, no tyre and no brakes, so not practical!

      All the arguments for central/monopoly planning are being progressively demolished and humiliated, e.g. Steve Keen's critique of neo-classical economics aptly demonstrates why extrapolation of supply-and-demand behaviour from individuals to groups of people is utter nonsense and why the results are so different!

    5. Anonymous Coward
      Anonymous Coward

      Re: lest they give you a Therac-20

      ITYM Therac-25

      1. Destroy All Monsters Silver badge
        Unhappy

        Re: lest they give you a Therac-20

        Alzheimer strikes again

  6. Anonymous Coward
    Anonymous Coward

    Ho hum.

    As analysis goes, it misses a bit of background, doesn't quite catch the flavour, isn't all that thorough. Best try again and find more viewpoints to compare and contrast. This just doesn't do the unix history justice. It's not the lack of length, it's just that the short and sweet of it doesn't quite manage to hit the nail on the head. One of the larger misses is that windows is very much defined as not being unix, qv. that chief designer guy.

    Doesn't change that the current state of software is quite sorry, and that various vaunted world-improvers have not managed to actually, you know, improve the world. They just added more code and standards on top of an already big pile (Obxkcd left as an exercise).

    I for one was just mouthing off against Linux's sorry state of networking, which I won't repeat here except to note that it is fairly sorry, with fragmented, multivaried, multiversioned, mutually incompatible toolsets and hardware driver stacks. This sort of thing has me conclude that sometimes a design cabal is actually useful, staffed at minimum by people with a bit of background in no-reinventing conservatism (an explicit goal in the CSRG), though ideally with some design and architecting background as well. In that, Poul-Henning Kamp is quite correct, even if he doesn't always manage to live up to the expectations it raises.

    Linus Torvalds, though, is quite fine as a bad example. He's not great at large-scale architecture, which is why the design is still "obsolete", though it has managed to muddle by. His example, though, has made people mistake forcefulness for leadership, and this has crashed a few promising projects. This is not to blame him for those failures, but more an illustration of how a lack of good example fails to breed good example in its followers.

    Then again, forcefully opinionated is somewhat of a steady state in computing. One might say it is a sign of immaturity in the field. And since there are few truly one true ways of doing things, this isn't likely to change anytime soon.

    1. Anonymous Coward
      Anonymous Coward

      Re: Ho hum.

      Instead of criticizing Linus Torvalds (I'm not saying you're right or wrong doing that) why don't you or anyone who has a better idea fork the Linux kernel, push it in the right direction and bring us all into a new era of computing. I'm pretty sure Linus himself will not be bothered by that (we wouldn't care even if he did).

      What I don't like is people wanting to stick their ideas (good or bad) into the Linux kernel and complaining about being rejected. If there is a way to re-engineer the Linux kernel architecture, and I'm sure there are some other brilliant minds out there, what's stopping them?

      1. Anonymous Coward
        Anonymous Coward

        Why fork linux at all?

        Why would anyone want to try and weather the rough-and-tumble of the linux dev community only to be rejected because some other idea happened to be hot that day and yours wasn't? Or what if you don't happen to be into that cult of personality? And if you're going away from that, why stick with linux?

        I could tell you why, and I could tell you why not. But all that aside, I wasn't complaining my code got rejected. I observe myself getting bitten by lack of uniform interface provided by "linux", as it happens today particularly in its networking, overall supporting phk's argument. I usually use other systems that have less such trouble, but I can't always do that. But I digress.

        The point is that the state of software in general is quite poor. As far as examples of that go, linux certainly is not the only one, but it is a nice and illustrative one. For various reasons it is a common source of grief for other systems by dint of "linuxisms" in software written for it. That's not even the kernel code's fault or its apis or whatever, it's to do with the people surrounding the thing. You can't fix that by forking the linux code base.

        It's not that something is "stopping them", it's that expecting unspecified people to roll a fat one, sit back, and spend a good relaxing night of forking, is entirely irrelevant. So that forking suggestion, well that's not particularly helpful. Neither is the incessant myopic focus on linux-the-kernel. We're talking a rather bigger picture than that. Thanks for playing. Try again soon.

      2. Daniel von Asmuth
        FAIL

        Re: Ho hum.

        If you fork the Linux kernel, it's still under the GPL. Why not wait 50 years till Linus dies or something and elect a new leader? Why wait at all? Why use Linus's much-criticised kernel in the first place? What is a 1970s OS doing in the 21st century? What is keeping you from designing a clean, all-modern new OS?

        1. This post has been deleted by its author

  7. mike acker

    as far as MSFT v Linus goes MSFT is its own enemy while Linus has an unlimited pool of allies generating Open Source Software.

    The result: MSFT attempting to cram their style onto us (and make us pay for their stuff); Linus offering Freedom as the alternative at n/c

    how will this play?

    I think MSFT is retreating to the mobile and gaming area, leaving the desktop workstations to Linus -- various versions of Linux have already won the field for servers

    as MSFT pushes into the mobile/gaming field they will face Google and Android on their other flank though...

    "Half a league, half a league, half a league onward..."

    1. RICHTO
      Mushroom

      I think you are a bit deluded there. Microsoft own over 90% of the desktop market and circa 50% of the server market by hardware revenue. Versus about 1% and 20% respectively for Linux.

      And Microsoft's server market share is still growing (at the expense of UNIX).

      1. Tim Bates

        "circa 50% of the server market by hardware revenue"

        I find that one hard to see anyone measuring successfully - the invoices I see from Dell never list the price for Windows Server as a separate item. Does that mean this revenue is counting OEM licenses of Windows Server too? On a smaller server that can be more than the price of the hardware.

  8. Antti Roppola
    Paris Hilton

    Think of the furniture!

    All those kludged spice racks and poorly assembled flat pack furniture assembled by delusional dilettantes detracting from the value of artisan crafted heirlooms! How on earth will we be able to pick out the quality articles?

  9. Flocke Kroes Silver badge

    A few more lumps of confusion in the article

    The Linux kernel has forked: Android. Android had design features that worked well on a mobile phone, but were not appropriate to a cluster. Google considered merging back with Linus's Linux to be sufficiently valuable that they have been recoding bits of Android to scale better. Some of the new code has been merged. Linux and Android are getting closer.

    Forks are good in free software. If you are Microsoft, you have a few programmers. To achieve anything, you must pick a direction and herd your programmers that way. If you picked the right direction, all well and good. Celebrate and have a beer. If you guess wrong, you end up with Windows ME, Longhorn/Vista or WinPhone. Free software has many programmers. It is practical to let all of them code in different directions. The result is lots and lots of editors, toolkits, GUIs, and so on. Some of them are tripe. Some of them are not your cup of tea. Some are outstanding, and there is almost always something that gets the job done.

    Business friendly is a Microsoft term for code they can embrace, extend and extinguish. They labelled the GPL as not business friendly because if they used it, they would not be able to keep their customers locked in. Any other business that actually reads the license finds that the GPL is really friendly.

    The pipes are still there. Start an xterm/gterm/konsole/LXterminal, read man bash and info coreutils then pipe away to your heart's content.
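    And indeed, not only from the shell: any language can wire processes together the way the shell's `|` does. A minimal sketch in Python (assuming a Unix system with coreutils' `printf` and `tr` on the PATH):

    ```python
    import subprocess

    # Equivalent of the shell pipeline:  printf 'banana\napple\n' | tr a-z A-Z
    src = subprocess.Popen(["printf", r"banana\napple\n"], stdout=subprocess.PIPE)
    upper = subprocess.Popen(["tr", "a-z", "A-Z"], stdin=src.stdout,
                             stdout=subprocess.PIPE)
    src.stdout.close()            # upper keeps its own copy of the pipe's read end
    out, _ = upper.communicate()  # "BANANA\nAPPLE\n"
    print(out.decode(), end="")
    ```

    Same anonymous pipes, same small tools; the shell just gives you terser syntax for it.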

    You do not need a virtual machine to mix languages. GCC can mix C, C++, Fortran and Pascal (and probably a few more) with a little effort. I write lots of things in Python and replace bits with C if speed is a problem. The advantage of Java is that you can write once and run on one of several well-maintained virtual machines. The advantage of Mono is that if you make a profit, Microsoft can change the license and sue you for patent infringement.
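    On the mixing point: you don't even need a compile step to call C from Python. A small sketch using the standard ctypes module (this assumes a Unix-like system where `CDLL(None)` opens the running process's own symbols, which include the C library):

    ```python
    import ctypes

    # Open the symbols already loaded into this process; the Python
    # interpreter is linked against libc, so strlen() is in there.
    libc = ctypes.CDLL(None)
    libc.strlen.restype = ctypes.c_size_t
    libc.strlen.argtypes = [ctypes.c_char_p]

    n = libc.strlen(b"no virtual machine required")
    print(n)  # 27
    ```

    Replacing a slow Python function with a compiled one works the same way, with `CDLL("./mymodule.so")` pointed at your own shared object instead.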

    1. Anonymous Coward
      Anonymous Coward

      Re: A few more lumps of confusion in the article

      Forks may be good for the software, but they are terrible for the customers, and Linux devs should think of the people who use their software as customers, not as users. The product may be notionally free, but this doesn't mean that the userbase should expect poor customer service. Forking a product means that customers who have already chosen to use that product are presented with a choice of two or more products which may or may not do all of what they wanted in the first place. If you pick the wrong fork you may end up, a year down the line, using abandonware and facing a huge challenge getting back to where you were.

      1. Anonymous Coward
        Anonymous Coward

        @AC 16:38GMT - Re: A few more lumps of confusion in the article

        So you are advocating a suppression of choice for the fear one might end up picking the wrong alternative ?

        1. Anonymous Coward
          Anonymous Coward

          Re: @AC 16:38GMT - A few more lumps of confusion in the article

          Choice is what you have when you decide what you want to use in the first place. Inconvenience and irritation is what you have when the product you are using ceases to be and you are presented with two or more forks to choose from, forks which may or may not do the job you initially wanted.

          1. Graham Dawson Silver badge

            Re: @AC 16:38GMT - A few more lumps of confusion in the article

            No, that's choice. You seem to be complaining that there will be relatively painless alternatives when an open source project up and dies.

            What happens when proprietary software is abandoned? No choice there. You have to find something else that likely is completely incompatible with your existing system. With the forks you at least have something that resembles what you already do, and most likely have something that is exactly the same as the software you already use, except it has some bug fixes and Feature X tacked on.

      2. Anonymous Coward
        Anonymous Coward

        Re: A few more lumps of confusion in the article

        Consumers bewildered by choice and competition? That's the first argument against open source that appears to support central economic planning. Well done, Comrade.

    2. Anonymous Coward
      Anonymous Coward

      Re: A few more lumps of confusion in the article

      "...The advantage of Mono is that if you make a profit Microsoft can change the license and sue you for patent infringement...."

      Utter rubbish.

  10. Anonymous Coward
    Anonymous Coward

    There is an argument that says if you can just copy something then we don't really move on much.

    Open source can result in an elimination of competition and it is competition that often drives people to produce something better than someone else.

    1. Lee Dowling Silver badge

      Imagine if the first caveman to invent fire had hidden the secret away and not shared it with anyone and just produced a "fire shop" that you could buy some fire from. I don't think humans would have evolved as quickly as they did. Thus, copying (including academic "copying", which is different to blatant plagiarism) is actually the only real sensible way forward. "Copy, and make better" is the best mantra you can have, and the unstated business plan of many a corporation.

      But "better" itself is subjective. A brand-new Honda is almost certainly "better" than my own car in many areas. But if it's "better" for me or not is a much more subtle question - my requirements differ immensely to those of someone designing the perfect car. For a start, I need to afford it, and I'd quite like one I can repair without having to send it back to the manufacturer each time. Of course, a "better" car is likely to cost more and actually be beyond simple repair. But that makes it *not* better for my needs.

      BSD vs GPL vs proprietary is a similar argument. Proprietary does things that the others can't (i.e. run the programs I need to use for work and games I want to use at home, although that situation is in flux at the moment, support all the hardware in my PC, etc.). But equally the open-source offerings provide me with advantages that I can't get from proprietary software, like being able to hack into the source and change stuff (and, although I'm one of the few that can, this has saved my employers lots of money several times already on everything from fax systems to access-control systems to simple things like making use of old hardware), and being able to deploy as many units as I like without counting licenses.

      And even, when it comes to it, making up for some proprietary shortfall with some knocked-together solution. A case in point? How about an expensive proprietary access control system that stores all its data in a Firebird database on the controller machine - think SQLite, the whole database saved in a folder and run locally, but you can still do "database-y" things to it? We wanted a list of people who are on the premises when we click a certain button. The proprietary offering is another £x, plus some more on top of software we already had to pay for, and doesn't really do much. We can't hack into the program or fake a key to give us that feature without breaking the license agreement. But I can load a Firebird-compatible DB layer onto a Linux machine, probe the database remotely over Samba, throw in a couple of SQL statements and voilà, my results - Samba is GPL, Firebird is MPL, my code was "who cares, only I get to see it"-PL, and I get my solution. No doubt the proprietary software is "better" and would do a better, more accurate, more coherent job of that task. But in terms of the end result, my hacked-together script is "better" for my workplace (so much so that their reseller's field engineers asked me for it so they could use it themselves).
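      The "who is still on the premises" query is easy to sketch. The actual Firebird database and its schema aren't reproducible here, so this uses Python's built-in sqlite3 as a stand-in (the comparison the comment itself makes), with entirely hypothetical table and column names; the idea is just "show everyone whose most recent event was an entry":

      ```python
      import sqlite3

      # Hypothetical stand-in for the access controller's database:
      # one row per badge swipe, direction 'in' or 'out'.
      db = sqlite3.connect(":memory:")
      db.execute("CREATE TABLE events (name TEXT, direction TEXT, ts INTEGER)")
      db.executemany("INSERT INTO events VALUES (?, ?, ?)", [
          ("alice", "in", 1), ("bob", "in", 2), ("alice", "out", 3),
      ])

      # On premises = latest event for that person is an entry
      rows = db.execute("""
          SELECT name FROM events e
          WHERE direction = 'in'
            AND ts = (SELECT MAX(ts) FROM events WHERE name = e.name)
      """).fetchall()
      print([r[0] for r in rows])  # ['bob']
      ```

      Against the real system, only the connection line changes: a Firebird client library pointed at the database file over the Samba share, with the same shape of query.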

      Open source isn't "better" generally because nothing is. And if someone doesn't understand that, then I doubt they understand how to argue their own system is better anyway.

      1. Anonymous Coward
        Anonymous Coward

        "Imagine if the first caveman to invent fire had hidden the secret away"

        This reminded me of someone failing to do that and getting called all sorts of names, like "prince of lies" and so on.

        Anyhow, bygones and such. Back to the point: "Open source can result in an elimination of competition" is a rather interesting statement. In theory, yes, but that's not what happens. In practice, people see room to improve, the maintainer disagrees, forks ensue. Thus, we see easier competition than otherwise. I mean, how many alternatives to windows are there, and of the ones that tried, what caused their demise? Can similar things happen in open source land, why or why not?

        It's quite interesting to dig into this for a bit, also as it might shed light on how revenue models (must) differ from proprietary ones.

    2. feanor

      The alternative argument is that if you can't just copy something you waste endless man hours reinventing the wheel.

    3. Anonymous Coward
      Anonymous Coward

      Citation needed

      Can you give me an example of elimination of competition due to open source?

    4. Nuke
      FAIL

      AC @ 9 Nov 13:05

      Wrote :- "There is an argument that says if you can just copy something then we don't really move on much"

      Is there? Not me chum, I am never satisfied with what is already there even if I can just copy it. I have my own ideas, but there just isn't time in this life to implement them all. Nor is it very evident around me - like the fact that you can copy Shakespeare does not seem to stop new authors coming along all the time.

      And :- "Open source can result in an elimination of competition"

      That's a new one! Usually the complaint about Open Source is that there are too MANY versions (Fedora vs SuSE, KDE vs GNOME, LibreOffice vs KOffice, emacs vs vi ... need I go on?). In fact it is maybe the MAIN complaint about Open Source that there are too many alternatives clamouring for attention.

      If you want to see how competition is killed, take a look at Microsoft's history. Here is somewhere to start :-

      http://en.wikipedia.org/wiki/United_States_v._Microsoft

  11. Anonymous Coward
    Meh

    garbage

    IT is not Rocket Surgery or Brain Science. I would blame garbage collection for that. Certainly not the GPL.

  12. Pete 2 Silver badge

    Divide and conquer

    > the ability to freely modify whatever software is running on his computer and share it

    This ideal is great for the tiny minority of highly able individuals who have the skills, the motivation and the time to do this. So far as making Linux (or other GPL projects) popular, accepted and used by ordinary people goes, it's completely irrelevant. A theoretical nirvana that (even more than free speech) is promoted by its ignorant or naive proponents as being something it is not - and never, ever could be.

    So while a small number of geeks can take a project and fork it, they are simply diluting the brand. A few forks turn out to be more successful than the original vision, and yet more forks take over from dead, dying or stalled projects. However, most of them offer nothing different or innovative; they are merely platforms for someone's ego. (Exactly how many MP3 players or DVD rippers does one universe need?)

    What this means is that the GPL world is like an undisciplined army - a horde of separate, proud footsoldiers who take orders from no-one, rushing headlong with their battle-cry of "software wants to be free" towards the organised and fatally disciplined ranks of professional software developers. Sure: the GPL-ites may have numbers on their side, even enthusiasm too - but they are no match for superior marketing, design processes, documentation, training and support (albeit paid for), and the unwashed marauders' doom is inevitable - even if they happen to be right.

    So instead of GPL software being a directed force - applied to fixing the problems that everyday users want fixed, and supplying a free, ubiquitous, easy-to-use, flexible solution that would be universally adopted at low cost and without borders - we have a balkanised, unstable source of software. Instead of being usable by "the average person", it requires enormous, duplicated effort by each individual who wishes to install some of its parts - usually requiring a whole load of other dependent parts, too.

    What will be GPL's epitaph? Instead of a headstone reading "We made software for everyone to use", there will be a series of tiny pebbles with poorly spelled inscriptions scratched on them - mostly the same - that, when put together, will read: "A Lo'st oporrtinity LOL"

    1. Tim Parker

      @Pete : Re: Divide and conquer

      'Instead of being usable by "the average person" it requires enormous, duplicated effort by each individual who wishes to install some of its parts - usually requiring a whole load of other dependent parts, too.'

      Thank you for repeating the same, old, tired bollocks - I was wondering when that was going to make an appearance and was worried i'd missed it.

      Please pray tell, exactly what things are required by the "average user" on your average Linux-based OS that require "enormous, duplicated effort by each individual who wishes to install" it ?

      1. Pete 2 Silver badge

        Re: @Pete : Divide and conquer

        > exactly what things are required by the "average user" on your average Linux-based OS that require "enormous, duplicated effort by each individual who wishes to install" it

        You've obviously never had to watch an ordinary, non-technical but scarily intelligent human being going through the frustration of trying to install libdvdcss2 on her machine. I'm sure more than a millisecond's thought would throw up many more examples. But that will do for now.

        1. feanor

          Re: @Pete : Divide and conquer

          I've done this any number of times on any number of distros. My 14-year-old son worked that one out after 5 mins. Clearly not as intelligent as you were led to believe.

          Plus any complication around libdvdcss is imposed on distros by the proprietary nature of the code.

          So bad example.

          Next?

          1. Pete 2 Silver badge

            Re: @Pete : Divide and conquer

            > I've done this any number of time on any number of distros

            and there's the clue. We aren't talking about the self-selected collection of uber-geeks who frequent El Reg. We're talking about normal people who don't know, don't care, or feel it's impolite to ask whether they're running Debian, SuSE, CentOS or any of the million other not-quite-the-same distros. As for whether it's i386 or 64-bit? The blank look you get could swallow entire civilisations.

            That people who can do this with ease can only scoff and look down upon those who can't is exactly the problem with GPL software and is the clearest reason why it will never be a usable solution for the "other 99%"

            1. DiBosco
              Linux

              Re: @Pete : Divide and conquer

              Bullshit. Utter and total bullshit.

              You go into your repository (that's the app centre for the braincell challenged and unimaginative of you), select libdvdcss and, er, hit "install". Oh my gosh, how much of a geek do you need to be to do that. How did anyone without a degree in rocket science ever work it out?

              Or...

                ..install something like Mageia, Mint or a whole load of other distributions where it's installed by default. Sheesh, the FUD that people come out with.

              1. Anonymous Coward
                Anonymous Coward

                Re: @Pete : Divide and conquer

                @DiBosco - I don't think you understand what Pete is saying: some very intelligent people indeed, who just happen not to be IT people, don't even understand the concept of a repository, or that it could be possible to have an operating system which comes with a media player of some sort, on a system with a DVD drive, and yet can't play back DVDs until something is manually installed by them.

                A case in point: my father-in-law is a retired neuropathologist (i.e. smarter than most). He had a spare laptop kicking around (a small, generic, common Dell netbook) and mentioned to me that he'd heard of this Linux thing and would like to try it out. So last time I stayed at their place, I said that I'd stick Fedora onto it for him. Normally this is a pretty straightforward process - I've done it loads of times - but this particular install took an entire day, because the 1366x768 screen wasn't supported despite being generic hardware, and the wireless network required downloading a driver and compiling it myself. Both of these activities took large amounts of trial and error and internet research. A smart non-IT guy would not have been able to carry out this work because he wouldn't have known where to start. I had to research and repeat work which had been done by many people already and should have been in the build from day one. We're not talking about obscure hardware either...

                1. Anonymous Coward
                  Anonymous Coward

                  Re: @Pete : Divide and conquer

                  "because the screen was 1366x768 and this particular one wasn't supported despite it being generic hardware and the wireless network required downloading a driver a compiling it myself"

                  Live-CDs - use, check everything works and then go on and install

                  1. Anonymous Coward
                    Anonymous Coward

                    Re: @Pete : Divide and conquer

                    I may be able to use a live CD to check everything works, but if I have fixed hardware and a "put linux on this" task to complete, I still have to research on the Internet and manually make it work.

                    Let me re-iterate: this was a generic, bog-standard Dell laptop, a year or so old, with a standard NIC, graphics and wifi. It should have just worked.

                2. Ben Tasker

                  Re: @Pete : Divide and conquer

                  @AC

                  sorry, just to clarify - your father-in-law is a smart guy but a Linux newbie, and you chose Fedora? WTF did you think was going to happen when you introduced a newbie to something that's essentially bleeding-edge?

                  That's not a problem with Linux per se, more an issue with the advice you're giving. It's a bit like putting someone on Ubuntu 12.10 instead of an LTS - there are going to be kinks. Are you planning on installing the Win 9 RTM if/when we see that?

                3. JEDIDIAH
                  Linux

                  Re: @Pete : Divide and conquer

                  > Some very intelligent people indeed, who just happen to not be IT people, don't even understand the concept of a repository,

                  They understand App Stores and GUIs.

                  That's all that's necessary for a suitably complete distribution.

                  The entire situation is an artificial legal issue that has squat to do with the underlying usability of Linux.

                4. DiBosco

                  Re: @Pete : Divide and conquer

                  Yes, and he wouldn't have been able to install Windows either. That's a completely different argument.

                  As for taking a day to install and needing to compile wireless drivers, I again say, use a more non-techy, user friendly version such as the hugely underrated Mageia or Mint or similar. I've not had to compile wireless drivers for at least two years since even the Broadcom drivers are now open source.

                  As for not understanding a repository, I just do not accept that a two-minute explanation is beyond the understanding of someone with the most basic amount of intelligence. No-one says they can't understand the Apple App Store or the Android Play store, or that certain things need installing to carry out certain tasks. A repository is exactly the same thing. Linux comes with an office suite, a PDF reader and all sorts of other stuff that Windows does not have; do people run away from having to install those on Windows, without the benefit of the wonderful repository system?

                  What I do understand is that people just come out with an endless list of invalid excuses and FUD when it comes to talking about Linux, and conveniently forget that for every issue Linux has, Windows has one of its own.

                  1. LaeMing
                    Joke

                    @DiBosco

                    There is one huge fundamental difference between a Repository and an App Store - a repository doesn't need your credit card number to bill you for fundamental software functionality.

                    It's evil and anti-American I tell you!

                5. Anonymous Coward
                  Anonymous Coward

                  Re: @Pete : Divide and conquer

                  "A smart, non-IT guy, would not have been able to carry out this work because they wouldn't have known where to start. I had to research and repeat work which had been done by many people already and should have been in the build from day one. We're not talking about obscure hardware either..."

                  I replaced Vista with Win XP on a Gateway laptop for which XP was not officially supported. The experience was very similar to what you encountered with Fedora Linux.

                  So what was your point again?

                6. Keith Smith 1
                  FAIL

                  Re: @Pete : Divide and conquer

                  Oh my... 1366x768, you say. Please buy an OEM Windows XP, give an install a shot on that baby, and tell me if you are any more successful. I.e. I call bullshit on the argument. You're comparing a vendor pre-installation with a customer install. I've had no end of grief with 1366x768 on XP with various video cards.

                7. P. Lee

                  Re: @Pete : Divide and conquer

                  If you don't understand the tech, you buy the expertise or get it from elsewhere.

                  So you buy a dvd player or you get VLC from their website.

                  I'm surprised the major distros haven't caught onto this and included something in the installation script which says:

                  1. I am in the USA and wish to buy a DVD-player license for my DVDs (radio button)

                  2. I am legally allowed to use a free DVD player (radio button)

                  3. I do not wish to play DVDs.

                  The distros could make a bit of cash off that.

                8. Nuke
                  Holmes

                  Re: @AC 16:59 @Pete : Divide and conquer

                  My father in law .. (smarter than most) ... mentioned to me that he'd heard of this linux thing and would like to try it out. Normally this is a pretty straight-forward process, I've done it loads of times, but this particular install took an entire day because ...etc etc ... A smart, non-IT guy, would not have been able to carry out this work because they wouldn't have known where to start.

                  But would a smart non-IT guy have been able to install Windows?

              2. PhilBuk

                Re: @Pete : Divide and conquer

                @DiBosco - you've just demonstrated Pete's point.

                Phil.

                1. This post has been deleted by its author

                2. DiBosco

                  Re: @Pete : Divide and conquer

                  @PhilBUK

                  How?

              3. Anonymous Coward
                Anonymous Coward

                Re: @Pete : Divide and conquer

                I just installed libdvdcss on a Debian system; it's a bit geeky (./configure/make/make install) but easy enough if the development tools are installed. The problem with libdvdcss appears to be that in some places it's illegal software (posting anon for that reason). While this installation definitely is not for the faint of heart, it is not truly difficult and is far from the norm for application packages.

                Gnu/Linux has its challenges, but there appears to be a thriving market for PC geeks to assist largely windows users in managing their systems, so I guess there are issues with that as well. The PC, whatever the installed OS, undoubtedly is the most complicated machine that most people ever use.

              4. RICHTO
                Mushroom

                Re: @Pete : Divide and conquer

                How about playing back a Blu Ray movie under Linux then? Good luck with that one without lots of screwing around.

                The easiest way is to run up a Windows VM.....

            2. Bernardo Sviso
              Boffin

              Re: @Pete : Divide and conquer

              > > I've done this any number of time on any number of distros

              > and there's the clue. We aren't talking about the self-selected collection of uber-geeks who frequent El Reg. We're talking about normal people who don't know, care or feel it's polite to ask them if they're running Debian, SuSE, Centos or any of the million other none-quite-the same distros. As for whether it's i386 or 64 bit? the blank look you get could swallow entire civilisations.

              You mean people like me, who've never worked in IT (in favour of a career buying and selling used books) but somehow managed to install and get comfortable with Debian well over a decade ago?

              Or the mill-worker I met in the local Starbucks last spring, who told me that the Ubuntu variant running on his netbook was nice enough -- but he was going to reinstall Slackware again, because he liked it better?

              Or the grad students working on their History/Poli-Sci/English Lit theses, who were very happy with Ubuntu/Mandriva/Turbo Linux?

              Or the financial consultant who told me how his group practice/partnership had switched half their desktops and all their servers to Linux -- on their own, because they didn't have an in-house "IT guy" (and as a result they wasted less time and frustration doing IT stuff instead of paying work)?

              It's a matter of attitude, not a matter of how "uber-geek" one might be.

              1. LaeMing
                Meh

                On a serious note

                I shudder at what would happen if I moved my senior-cit. mum off Kubuntu. She would get totally lost in the whole missing-basic-functionality-until-you-pay-more, having-to-go-here-there-everywhere-on-the-internet-to-update-drivers-and-software commercial IT world.

            3. Anonymous Coward
              Linux

              Über-geeks who frequent El Reg?

              > We aren't talking about the self-selected collection of uber-geeks who frequent El Reg

              Install Ubuntu alongside Windows 7

        2. JEDIDIAH
          Linux

          Re: @Pete : Divide and conquer

          > through the frustration of trying to install libdvdcss2

          Are you kidding? That's a hack to get around the DMCA. It's an extreme legal grey area, made difficult only by a highly corrupt American copyright law. Its relevance to just about anything else is nil.

          If that's really the best you can do, then you just proved the other guy's point.

        3. Anonymous Coward
          Anonymous Coward

          Frustration of trying to install libdvdcss2

          "You've obviously never had to watch an ordinary, non-technical but scarily intelligent human being going through the frustration of trying to install libdvdcss2 on her machine"

          $ wget http://packages.medibuntu.org/pool/free/libd/libdvdcss/libdvdcss2_1.2.12-0.0medibuntu1_amd64.deb

          $ sudo dpkg -i libdvdcss2_1.2.12-0.0medibuntu1_amd64.deb

      2. Chemist

        Re: @Pete : Divide and conquer

        "duplicated effort by each individual who wishes to install"

        Well, it took me 25 mins last night - mind you, I was doing something else at the same time.

    2. Lee Dowling Silver badge

      Re: Divide and conquer

      A common misconception is that EVERYONE needs to contribute back. It's just not true. I have code in open-source projects out there, but it's pretty minimal to say the least, and I have several private patches that I wouldn't dare to pollute a foreign codebase with. But most of the people I know who are the largest users of open-source don't contribute anything at all, and yet it still thrives and grows every year (use Firefox? You're one of them).

      The reality is that *I* benefit from other people being able to see the code and play with it. The usual argument is security, but it goes far, far beyond that. A single patch, approved, tested and posted to the Linux kernel will end up on MILLIONS of machines within hours. I benefit from that. I benefit from moaning about it not working too. I benefit from other people looking at code and saying "I can't understand that, it looks like complete nonsense" on any of the projects that I use. I even benefit when projects are forked or abandoned because of that (because otherwise *I* would have to fork myself, be left without support, have to start up a rival project from scratch, or go seek out alternatives on my own).

      99.9999% of GPL-licensed software users push back exactly zilch. That's not a problem at all. Nobody really cares, but it's more than, say, the number of Microsoft users whose code ends up in Windows (good luck with that!). You're confusing users with developers, though.

      As the barest of bare amateur developers, I have tested, patched and hacked the code I have available to make it work the way I need it to. Everything from patching the rt2500 drivers on my private systems with a patch I had to craft to get them to compile on a new kernel (pre kernel inclusion; something to do with the way interrupt handling changed in that kernel, if I remember), to the patches I make to my own copy of HylaFAX to get it to run the numerous fax lines in my workplace without forcing me to upgrade to the next version to get feature X (risking a hefty mid-cycle upgrade), to fixing Tux Paint (I work in schools; it's one of their most used programs) to juggle the menu items to make them easier for little kids, to providing a routine for OpenTTD that reduced the number of bug reports they got from people using hacked/unofficial data files (which has since stopped a lot of spam on their bug tracker and told several people that their data files were unintentionally corrupted - who knows, maybe that patch told someone their hard disk was failing!).

      Those little changes are the freedom I pay for. If notepad doesn't want to open a particular file, there is bog-all I can do about it. But if *metapad*, the program I use in preference because it lets you do lots more, does it then I can work out why and change its behaviour, or lobby to have it changed. I don't expect a user to, but like some of the things mentioned above, after some time they may be able to do exactly what I've done without having to know how (the first person to invent the wheel was a genius, the people who followed after benefited from his genius, and now we all take wheels for granted).

      Similarly, I just hacked Classic Shell because that open-source project refused to allow in a feature that I think it needs (an option so right-clicking the toolbar provides the "old" Windows context menu by default and the "new" Windows context menu if you're holding Shift, instead of vice versa). So I hacked the code myself, added the feature, and *I'm* better off even if no-one else is. And I didn't require their co-operation at all (indeed, I received just the opposite). But how many people can actually go and tweak their *Microsoft* Start bar, or put it back into Windows, no matter how nice they are to MS?

      The GPL, especially, gives users and developers the freedom to benefit from each other. It's arguable whether the average end user ever really benefits from MS developers' wild ideas. But certainly, though your granny can't hack the KDE source code, she benefits from it being available to others.

      Think about the movie I, Robot:

      Man (sarcastically): "Can a robot create a masterpiece?"

      Robot: "Can you?"

      Arguing that users don't benefit is like saying that air passengers don't benefit from someone looking into aircraft designs, their safety, effectiveness and room for improvement, from outside the industry, or even the general public provoking outcry when a particular type of plane keeps having problems. Of course they do. They just can't necessarily do it themselves directly.

      The GPL's epitaph? "Someone better came along and replaced us. Mission accomplished."

    3. sueme2
      Linux

      Re: Divide and conquer

      World domination is just a matter of time. The last time I looked, Linux worked out of the box on more hardware than any other operating system. I cannot recall where I saw the stat, but it does roughly match reality. For me, it just works. There is a thing called a "learning curve". If you are unlearned then you are free to ask, and you will be given the free knowledge. If you are not prepared to learn, then stick with what you think you know.

    4. Anonymous Coward
      Anonymous Coward

      Re: towards the organised and fatally disciplined ranks of professional software developers

      So you're one of those people who think that only amateur hobbyists are Linux developers? Or are you just a FUD-spreading shill?

  13. plrndl
    Linux

    Hurding Cats

    20 years on from Tanenbaum's promotion of the microkernel as the new black, I am not aware of any such OS that has made it out of academia. When's the last time anyone heard anything about the GNU Hurd? Everything that isn't Windows is Unix-derived. Meanwhile Linux is taking over the world, apart from the desktop, which is rapidly going out of fashion.

    1. PhilBuk

      Re: Hurding Cats

      How about the Mach kernel? It's the basis of OS X's kernel (often mistaken for FreeBSD).

      Phil.

      1. JEDIDIAH
        Linux

        Re: Hurding Cats

        MacOS is going nowhere.

        PhoneOS is being marginalized.

        The academically objectionable approach is still doing very well, both in terms of pure performance and in its ability to drive sales. Linux continues to thrive in the server room, on mobile devices and in embedded applications.

        The main problem with a Mach kernel running on a Mac is not the kernel itself but the fact that you've got very narrow limitations when it comes to that hardware and what kind of system design tradeoffs you can make.

        You are better off running MacOS in a VM on a cheaper and much more powerful Linux machine.

    2. Anonymous Coward
      Anonymous Coward

      Re: Hurding Cats

      "Everything that isn't Windows is UNIX derived."

      Are you for real?

      z OS

      OS/400 (now IBM i)

      TANDEM

      RISC OS

      VMS

      NeXT (AKA Mac OS, let's face it)

      And any number of specialist real-time OSes

  14. Anonymous Coward
    Anonymous Coward

    Title and article

    I didn't find the article made the argument that the title seemed to suggest it would. The title is striking and forceful and then the article just rambles on about vaguely related stuff.

  15. Jim 59

    Q: "Does one of the biggest-ever revolutions in software, open source, contain the seeds of its own decay and destruction?"

    A: No.

    The article runs counter to nearly all the evidence of the last 20 years. Linux has been a phenomenal success and continues to thrive with almost explosive vigour, and that pattern seems likely to continue for the foreseeable future.

    "By the end of the 1980s, things were looking bad for Unix. AT&T's former skunkworks project had metastasised into dozens of competing products from all the major computer manufacturers..."

    - And how is that "looking bad"? In 1989 Unix was all over the datacentre like a cheap suit, also dominating the engineering, scientific and financial desktops, as well as the lower mid-range market subsequently taken over by NT. Unix was obscenely healthy in those days.

    "Microsoft hired DEC's genius coder Dave Cutler and ...the result was Windows NT ...enough time to get the new kernel working ...today it runs on about 90 per cent of all desktop computers."

    And that kernel has hamstrung Windows ever since. MS was so desperate to get NT out of the door that they made the fateful decision not to implement proper protected memory spaces and execution levels. The system was prey to every user process and virus. And every version of Windows since has carried this fatal gene. Cutler must have been grinding his teeth. Had the decision gone the other way, our world would be quite different.

    "But today's Unix descendants are large, complex graphical beasts, and so are their apps. Any significant modern application is a big project..."

    Obviously Unix apps are graphical. They always were. The OS is not graphical. You might run a file manager, but underneath it is still all pipes and everything is a file.

    Good article though.

  16. Anonymous Coward
    Anonymous Coward

    The problem with autoconf...

    ...is that it works.

    It takes a really keen person to rewrite something as important and as complicated (assuming you care about "legacy" which linux normally does [thank god]) as autoconf when, at the end of the project, no-one's going to care.

    1. Gerhard Mack

      Re: The problem with autoconf...

      Except that autoconf doesn't really work in a frightening number of cases. Often, if a library is new, the programmer who set up autoconf won't have known to add a check for it, causing a compile failure rather than the helpful error message autoconf was designed to give. The other common mistake is for autoconf to be set up to report which libraries exist and then not handle all of the possible results.

      I would say most of the time autoconf does nothing except run a ridiculous number of tests, and everyone just assumes it is working because it spends a lot of time doing things and displaying cryptic output; that's why I end up just using shell scripts for my projects.
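
For what it's worth, the missing-check failure mode described here has a standard fix: an explicit test in configure.ac makes the build stop at configure time with a readable message instead of a mysterious compile error later. The following is only a hedged sketch; the library name "foo" and its function "foo_init" are placeholders, not anything from this thread.

```m4
dnl Hypothetical configure.ac fragment (library "foo" is illustrative).
AC_INIT([myprog], [1.0])
AC_PROG_CC
dnl Fail early, with a clear message, if the header or library is absent.
AC_CHECK_HEADER([foo.h], [],
  [AC_MSG_ERROR([foo.h not found; install the libfoo development package])])
AC_CHECK_LIB([foo], [foo_init], [],
  [AC_MSG_ERROR([libfoo not found])])
AC_CONFIG_FILES([Makefile])
AC_OUTPUT
```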

      1. Frumious Bandersnatch

        Re: The problem with autoconf...

        On balance, it seems to me that most of the accusations levelled at autoconf here are more to do with how it's used than with the software itself. It is a fairly horrendous bit of software in its own right, thanks to its steep learning curve, and I've been hit a few times by some of its idiosyncrasies (incompatible versions, missing m4 macros, and the way it sometimes runs differently under 'sh ./configure' versus './configure', mainly), but on the whole, if you've got a project beyond a certain size and you care about portability, it's usually a no-brainer: use autoconf.

        As I already said, the problem is often more to do with how the software is used. It's not a magic bullet that will automatically make your program portable. You still have to do all the work in your source code to account for all the different flavours of *nix or whatever: whether they have certain library functions available (and which version), what system include files are needed, as well as, sometimes, lower-level concerns like the machine's endianness, word sizes, data alignment characteristics, and especially the right architecture- (or compiler-)specific flags to use. The other thing about autoconf, besides being an aid to portability, is that modern software generally has a multitude of dependencies, and without something like autoconf (and supporting tools/standards like pkg-config) any homebrew configure/make system is apt to get very complex very fast; worse, such systems are (relatively speaking) very difficult to maintain and often not very portable in themselves. Most problems with building (besides dependency problems) tend to come down to the author not writing portable code in the first place, or simply not knowing about the foibles of your particular system or toolchain. Again, that's not autoconf's fault, but it is what it's designed to help the coder with.

        "...and that's why I end up just using shell scripts for my projects."

        There's nothing wrong with rolling your own configure/build system, but for the end user (ie, the person building the system), I think that familiarity with the autoconf system usually makes it easier to handle cases where things go wrong for some reason. Once you've compiled a few dozen apps it becomes pretty easy to figure out where the build is going wrong and how to fix it. Maybe that's just my personal preference, though.
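
For what it's worth, the hand-rolled configure script mentioned above doesn't have to be elaborate. A minimal sketch, with an illustrative tool list (not taken from any particular project):

```shell
#!/bin/sh
# Minimal hand-rolled "configure" step: verify required tools exist up front,
# failing early with a readable message instead of a mid-build error.
for tool in sh grep sed; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "found $tool"
    else
        echo "error: required tool '$tool' not found" >&2
        exit 1
    fi
done
echo "configure: ok"
```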

  17. Real Ale is Best
    Stop

    Hmmm.

    Can't rate this article.

    Telling.

    1. Test Man
      FAIL

      Re: Hmmm.

      A bit like a lot of other articles. No conspiracy here, but nice one trying to make out that there might be.

      1. LaeMing

        Re: Hmmm.

        Wasn't article rating removed wholescale in the site redesign a few months back?

        I understand why: not being able to separate the ratings based on what people thought of the actual article from the ratings based on what people think of the topic would have made the system useless.

  18. Anonymous Coward
    Anonymous Coward

    ....FreeWho?

    Never heard of it.

  19. David Kelly 2

    BSD Dropped The Attribution Clause A Long Time Ago

    BSD dropped the attribution clause a long time ago, partly because they found shady operators were twisting it into an endorsement of products which used pieces of BSD code.

    Under BSD you can not claim the code was all your own (unless it really was). And you can not remove copyright notices. But you do not have to go out of your way to tell everyone that you used BSD code.

  20. Cthonus
    WTF?

    Parsing error

    "Linux itself hasn't split is the forceful, charismatic leadership of Linus Torvalds"

    Surely the bigger the headline, the easier it is to spot a cut-n-paste howler?

  21. Ken Hagan Gold badge

    Linus doesn't scale?

    "Linus" as a project management methodology does not *have* to scale.

    The principle (and it is both ancient and not particularly related to software design) is to maintain a single coherent vision of what the project is supposed to be. You do that by having a small group who do that and then organise the rest of the work-force to be delegated to so that the architect(s) can spend time maintaining conceptual coherence. (Brooks had a whole chapter on this, IIRC.)

    Of course, finding people to play the roles is tricky. The hard part is when the architect needs to say "That's shit." (or words to that effect) rather than "Are you sure about that?". At that point, the underling needs to have sufficient respect for the architect that they don't kick back. Linus seems to manage this. Bill Gates was supposed to command similar respect but I haven't heard similar remarks about his successors.

  22. spencer

    Closed source software can be just as rubbish.

    I don't think the argument holds water. Show me a bad open source program and I'll show you an equally bad closed source one. Fact is, if it's open source then at the very least there's a chance that someone smarter than you can correct your mess.

    "but - unfortunately - Linus doesn't scale. Very few projects get to have a Torvalds-like leader."

    Yup - this holds true of all software, not just open source.

    1. Christian Berger

      Actually

      The cause of bad software is usually bad developers. On closed source projects those developers are hired and will keep developing it, often without improving. As long as it sells - and it will sell, since there's a large pool of stupid customers - it'll be developed by those people. If a good new developer enters such a company, he'll wear out from dealing with idiots and quit or resign.

      On bad open source projects, two things can happen:

      First, the developers lose interest; since nobody wants to deal with that piece of crap, the project simply dies.

      Second, a new and good developer comes along and can either improve the quality of the programmers and the software, by rewriting code and mentoring developers, or fork the project.

      So bad open source software has much less chance of staying bad: it either dies or gets better.

  23. Sandtreader
    Joke

    Festering hacks, endlessly copied and pasted...

    Not a great article, but El Reg journalism isn't *that* bad.

  24. The BigYin
    Flame

    What is this all about?

    Windows (any version): a pile of festering hacks that you can't see.

    Linux (any version): a pile of festering hacks you can see.

    In the former, you can only find out about the problem after the fact. In the latter, you can do some due diligence (or pay someone else to do it) before the fact (maybe even knock up a few test cases; whatever). Which one is better?

    Oh, and most Linux devs are professionals who draw a salary.

    As to GPL "infection"... if code has a license you don't like, don't use that code; write it yourself! Who are you (or I, or anyone) to tell an author what license they should use? You could, of course, ask the author how much a dual-license deal will cost you. Y'know... pay them.

    People who moan about the GPL are fools who want to have their cake and eat it. Correction: they want to have your cake and eat it. Then demand you do the washing up.

  25. mrvvg
    Big Brother

    Calling libraries written in other languages

    Now there's a thought... hmm, that would be DEC's VMS - you could make library calls from ANY language supported by the operating system - and the architect? Mr Dave Cutler, genius...

    Proper operating system, proper clustering, properly scalable, totally reliable, who needs the BSOD?

    1. Christian Berger

      Other approach

      There is also another approach to solving the library problem: the Unix way of using text as an interface. In Plan 9, everything is in the file system. If you want to open a socket, you write into a file. The same goes for opening windows. Your own software can easily provide file-system-based interfaces too. There's an IRC client, for instance, which lets you connect to a server by writing into a file. This causes a directory to appear representing the connection, and from there you can join channels, all just by writing into files. It doesn't matter what programming language you use; it just works.
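
For flavour, Plan 9's own /net hierarchy works this way. The following is a rough, from-memory sketch of the convention, not runnable outside Plan 9; the connection number and address are illustrative:

```text
# Reading /net/tcp/clone allocates a fresh connection directory, say /net/tcp/4,
# which stays alive while the clone file is held open:
#   read /net/tcp/clone            -> "4"
# Dialling is just writing plain text to the control file:
#   echo 'connect 192.0.2.1!6667' > /net/tcp/4/ctl
# After that, the byte stream is just another file:
#   cat /net/tcp/4/data
```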

  26. Anonymous Coward
    Anonymous Coward

    FreeBSD developers need a reality check

    They are beginning to sound like the US Republican Party - when facts don't match their own delusion, they invent an alternate bubble of their own, and blame factual reality on a widespread conspiracy.

    About the migration from GCC to Clang/LLVM: quit yammering about how cool and amazing it is, because it hasn't happened yet. They've been working on it for two years. Maybe they should get it working first, and then brag about it.

    Linux is dying. Really? Sez none other than FreeBSD?

    Here's a few links - chosen at random - about FreeBSD's worldwide market share and usage:

    2012:

    http://w3techs.com/technologies/details/os-freebsd/all/all

    2011:

    https://ssl.netcraft.com/ssl-sample-report//CMatch/oscnt_all

    2011 again:

    http://fossreview.wordpress.com/2011/02/01/an-analysis-on-the-trend-of-ossfs-market-share/

    I haven't yet seen, or heard of, a single mobile device, of any kind, running FreeBSD.

    And I'm not even a Linux die-hard. I just like facts.

    Put down the pipe, guys.

  27. Jack 23
    Stop

    Confused much?

    I don't see coherence in your arguments. You pull out the viral nature of GPL code, and I don't see that as a bad thing. But that fact doesn't explain the open/closed development model. There are plenty of closed-development projects that (regardless of the breadth of platforms they support) are actually *targeted* at Linux users. There is nothing in the GPL that forces an organisation of developers to accept code from outside parties. In fact, developers of GPL code regularly come in for criticism because of their closed development practices. That's not because of the GPL license either, though.

    On the non-scalability of Linus: I'd still rather have him dropping slightly too many patches than the opposite situation - where almost every patch is accepted. But you seem to be going against the main grain of your argument in bringing that up anyway.

    The main difference between development of core FreeBSD and core Linux is nothing to do with the licenses. Both are relatively low-level systems and code for each of them that is brought in from outside the main development community is BSD-licensed or GPL-licensed respectively. I do appreciate the closed-house approach that FreeBSD advocates, but differences between that and the barely-contained-whirlwind approach of Linux kernel development are due to historical imperative, *not* the licenses.

    Finally, "bizarre" is spelled "bizarre", not "bazaar".

  28. Jean-Luc
    FAIL

    What is this article trying to say, exactly?

    None of the criticism voiced here is specific to Linux or the GPL. And one alleged autoconf mess does not a general indictment make.

    What I read here instead is a broad indictment of open source. Not entirely unwarranted in some cases, but way too broad and not argued well.

    Can open source programs be a mess? Yes. So can closed source programs. The first steps in doing anything with an open source project are: 1. check when the program was last updated; 2. check the open bugs; 3. if you are a dev planning to use the libraries, take a look at the code.

    I know step 3 got me to junk a once-favored Python alternative to Django - code was an incredibly ugly mess of nested IFs that would discredit any programmer. Not clever - I have a hard time grokking Django's internals because it is too clever for me - just ugly.

    None of the three vetting criteria above can be applied as efficiently to closed source, since even bug counts are generally kept under wraps.
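
The first and third vetting checks above are a one-minute job from a shell. A hedged sketch, where a throwaway repository stands in for the real project (in practice you would clone the project instead, and the nested-if grep is only one crude code-smell heuristic):

```shell
#!/bin/sh
# Illustrative first-pass vetting of a checkout (the repo here is fabricated
# so the commands are self-contained; in practice, clone the real project).
mkdir -p /tmp/vet-demo && cd /tmp/vet-demo
git init -q .
git -c user.email=demo@example.org -c user.name=demo \
    commit -q --allow-empty -m 'initial'
# Step 1: when was it last touched?
git log -1 --format='last commit: %ci'
# Step 3: eyeball the code, e.g. flag suspiciously nested conditionals.
grep -rn 'if .*if .*if' . --include='*.py' || echo 'no triple-nested ifs found'
```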

    Second, a case could be made that the BSD family of Unixes is kept on a tighter leash than Linux. But that is more due to the smaller teams and a reluctance to change things than to a GPL-vs-BSD license argument. Stability over features and innovation. That's a different question, though, and not one the article covers.

    Third, can open source programs be less than innovative? Yes, many are. So are most commercial programs. Can they be useless forks or vanity projects? Yes, and it behooves you to estimate long term viability before coupling your code or business processes to an open source project.

    Last, spot the reasoning:

    a) autoconf is a mess and uses GPL

    b) Linux uses GPL

    c) Therefore Linux is a mess

    I prefer BSD over GPL in general, but I find this character assassination less than convincing. And microkernel vs. monolithic has, again, little to do with the GPL. It's not like microkernels are broadly used in any license family.

  29. Will Godfrey Silver badge
    Unhappy

    Maybe I'm getting old and senile (don't answer that!), but can somebody explain exactly what is being said here.

  30. Paul Hovnanian Silver badge
    FAIL

    I suppose ...

    ... if I had just tried to use autoconf and failed on a BSD or OSX system, I'd be ranting about those O/Ss.

  31. MissingSecurity

    If one of the issues is rogue coding, wouldn't the OSS community do themselves a favor by being better teachers in this regard?

    Anyone with basic ambitions can learn to code in a number of languages, but I have yet to find a quality document or tutorial explaining elegant coding principles beyond your basics.

  32. Anonymous Coward
    Anonymous Coward

    The Art of Computer Programming

    by Donald Knuth, thank you very much for asking.

  33. 2briancox
    Thumb Down

    Synopsis time!

    Oh boy, this is a fun article to review. To save anyone the trouble of trying to understand this nonsense, I'm going to graciously provide a much needed synopsis.

    First, the author complains that Linux is being killed (which it of course isn't) by GPL, because it is a collection of copy-and-pastes. The author attempts to justify this statement with a series of copy-and-pastes as follows:

    1- Kamp, a BSD developer, once complained that Linux is a pile of festering hacks copied and pasted. No further examples of actual problems are provided. Let's keep looking to see if the author can actually make a case...

    2- After a brief history of Unix, Windows, Apple and Linux, the author complains about how BSD often has its parts taken and used in various projects with little kudos. This is of course not a comment on Linux. But a comment on how this article is a series of unrelated hacks strung together in an incoherent manner.

    3- The author then quotes Ballmer's irritation about the fact that if you copy and paste code from a GPL program, you then have to open the code of your own program. Let's stop and congratulate the author for directly addressing his thesis. Well, ok, he didn't. But he did use the phrase "copy and paste" which was in his thesis. Kudos on kinda talking about something that was in your thesis, author.

    4- The author next conquers the subject of forking. And no, this hasn't happened to Linux, but forking sure is bad! I have no idea why he thinks this. You won't get an explanation of how this harms Linux or any other project. But he does admit, "Well ok, sometimes forked projects even merge." So what? Don't read the article if you want an answer to that. You won't find it. But interjected in this portion of the article he complains that it's only personality that holds Linux together. Then he ignores the fact that it is corporate support that is keeping it together.

    5- The article then meanders into the fact that Linux is a collection of C programs. Or maybe one day it will merge with other languages. The author goes on to suggest maybe Java would be a good choice.

    6- And the best quote of them all ... someone in 1992 proclaimed that Linux is already obsolete. Yup. That's his summary to bring together all of the above points.

    SO, what the hell did that have to do with the idea that Linux is dying because it includes copy and pasted hacks? NOTHING! But what a joyfully insane rambling of writing. I have been on roller-coasters with fewer crazy twists and turns. At least the roller-coasters ended up back in the same place they started.

    1. Christian Berger

      Thanks, this makes me wonder...

      How does one get to be a paid Reg-Author? It seems like a job even I could do.

      1. RICHTO
        Mushroom

        Re: Thanks, this makes me wonder...

        Post controversial crap that generates lots of clicks. The job's yours...

        1. Anonymous Coward
          Anonymous Coward

          Re: Thanks, this makes me wonder...

          "Post controversial crap that generates lots of clicks."

          The job IS yours !!

  34. Jean-Luc
    Meh

    Go to the source article...

    Start @ http://queue.acm.org/detail.cfm?id=2349257

    The point that the original article is trying to make is much narrower in scope.

    1. FreeBSD takes a huge amount of time to compile.

    2. That's because there are a lot of Ports (think apt-get or rpm) pointing to LOTS of programs

    3. The programs have horrendous package dependencies.

    Example: Firefox requiring, somewhere upstream, a TIFF package, either directly or through its dependencies, even though FF does not do TIFF.

    Or a package requiring both PERL and Python directly (WTF???).

    4. Supposedly, autoconf makes a hash of what it has to deal with in 2. and 3. The author therefore laments that the kids these days don't know how to code.

    Personally, regardless of the very ugly plumbing and cruft, which I am sure the original poster is much better qualified to comment on, I am rather impressed that I can go on an Apple command line and run the macports to install & compile a program automatically, including its dependencies.

    Or that the various sudo apt-get flavors on Linux manage the same feat on essentially the same program source code.

    When you think about it, that IS pretty impressive and a huge achievement of open source. Or are we supposed to pine for the heydays of 1990s Unix fragmentation???
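    The transitive pull-in described above (Firefox ending up depending on TIFF through an intermediate library) is just graph reachability. A toy sketch of how a ports/apt-style tool orders builds, with an entirely made-up dependency graph that does not reflect any real port:

    ```python
    # Hypothetical package graph -- these edges are invented for
    # illustration and are not the real ports tree.
    DEPENDS = {
        "firefox": ["cairo", "libpng"],
        "cairo": ["libpng", "tiff"],   # tiff arrives transitively here
        "libpng": [],
        "tiff": [],
    }

    def install_order(pkg, graph, seen=None):
        # Depth-first walk: dependencies are emitted before their
        # dependents, which is the order the tool must build them in.
        if seen is None:
            seen = []
        for dep in graph[pkg]:
            install_order(dep, graph, seen)
        if pkg not in seen:
            seen.append(pkg)
        return seen
    ```

    Running `install_order("firefox", DEPENDS)` puts tiff before firefox even though firefox never names it, which is exactly the surprise the original post complains about, and exactly the feat that makes one-command installs work.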

    Even though I can't disagree with the OP that there is a lot of cruft and hacks involved. And I am sure there are many incompetent coders distributed amongst all the FOSS licenses and proprietary stacks.

  35. This post has been deleted by its author

  36. PAT MCCLUNG

    Clerk

    IN RE OBSOLESCENCE, TANENBAUM WAS OBSOLETE LONG BEFORE 1992. TANENBAUM WAS JUST PLAIN WRONG.

    1. Destroy All Monsters Silver badge
      Facepalm

      Re: Clerk

      lolwhat

  37. E 2
    Facepalm

    This article is flamebait, you wankers!

  38. Anonymous Coward
    Anonymous Coward

    I don't see what is wrong with hacky code or copy-and-pasted code, really, as long as you follow the license of the code that you are using.

    There are lots of people with good/useful ideas but fewer people who can construct code that won't offend the most anal FreeBSD developer (basing an article about Linux dying on a FreeBSD developer's comments is just laughable...), and I guess a lot of the time the people with the ideas and the people who can develop "perfect code" aren't the same people.

    So... a person with ideas and no coding skills will probably employ a cheap worker to implement the idea, resulting in a lump of hacky code copy-and-pasted from Stack Overflow answers. A person with an idea and limited coding skills will generate something of equal quality, and the FreeBSD developer(TM) will sit on their hands until the "crappy code" ends up in ports, because although it is a mess it is something that users have found useful. If anyone cares enough they can fix the issues or rewrite the code from scratch.

    If you have the time to write a blog post/news article about how crap something is, you should be prepared to actually fix the problem yourself. Quite a few times I have worked with some library, thought "this library is a piece of shit", and started reimplementing it myself or fixing the issue, only to get so far and back out because I hit exactly the same issues/problems the original developer had but didn't see as a consumer.

  39. Anonymous Coward
    Anonymous Coward

    The beautiful cathedral of Unix, ...

    "...deservedly famous for its simplicity of design, its economy of features, and its elegance of execution."

    What a load of crap. UNIX was a quick & dirty hack job cobbled together for (even at that time) underpowered hardware. There's absolutely no beauty in this mess, and its stupid design features (like 'everything is a file') have made the implementation of many modern features much more painful than necessary, and are still holding it back.

    The true beauty was MULTICS, which unlike UNIX was a really advanced OS, and had it not been decided to go for the crap job to make a fast buck, we wouldn't have to sustain the turd that UNIX is.

    1. Destroy All Monsters Silver badge
      Trollface

      Re: The beautiful cathedral of Unix, ...

      1/10, because at least MULTICS was mentioned.

    2. Anonymous Coward
      Anonymous Coward

      Re: The beautiful cathedral of Unix, ...

      Not sure why others have downvoted you..

      For BSD at least there was/is a ton of hacky code.. there are some BSD history videos where it's clearly stated by one of the people involved early on that BSD's much-lauded TCP/IP stack was a massive hack, and only done because AT&T or whoever was actually contracted to do the TCP/IP stack took too long about it.

  40. CJatCTi
    Holmes

    So what's the difference?

    The fundamental flaw in this hatchet job is the assumption that closed OSes are perfectly designed & constructed, when all the evidence is that there is no difference: they are all patch jobs. The difference with open source is that everyone can see it.

  41. Anonymous Coward
    Anonymous Coward

    Alarmist headline...

    Baseless article. Do you work for Microsoft?

  42. Robert Brockway
    Meh

    Let me join the chorus

    The article is muddled. It makes major claims about the future of FOSS and mostly talks about the Linux kernel, a single FOSS project. The article further mixes in issues like monolithic vs micro- kernels.

    I also want to note that while it is true that forks are fairly common, successful forks are not. Forking tends to be an unstable equilibrium: either the fork will fail or the original project will disappear following the fork. While there are examples of a fork and the original project both going on to be successful, this occurs in a minority of cases.

  43. Anonymous Coward
    Anonymous Coward

    What is a bazaar development model?

    At first I thought it was an attempt at wittiness, but even if one were to liken the open source world to a bazaar, the comparison falls apart quickly. This was the first of many deficient points in the article. While the topic is one that should support many interesting discussions, this article fails to follow through and deliver. The Register disappoints again.

  44. Anonymous Coward
    Anonymous Coward

    Ignorance?

    This is nothing new in the world. The Great US of A is founded on a similar principle: the Republic can elect representatives that give away everything and require a subsection of their population to pay for it. Sorta like they did by re-electing the Socialist AssHat Obama.

    Guess what kiddos, when you give people freedom you have to accept that they might be incredibly stupid with it. See the previous paragraph.

    I guess I could sum up my entire post in 1 word: DUH!

    You'll figure some of this stuff out in 15 or 20 years... until then you'll be a junior whatever. ;)

  45. simonckenyon
    FAIL

    clueless

    i read the article twice, trying to discover the point that the author was trying to make.

    my conclusion was that there was none.

    this is not up to the usual standard of the register.

    1. diodesign (Written by Reg staff) Silver badge

      Re: clueless

      Essentially this: somewhere along the line, the Unix world went from elegance to, in some ways, "an embarrassing mess" (eg: autoconf, you either love it or hate it). Are licences, such as the GPL, encouraging the confusion or not? Liam discusses.

      C.

      1. Vic

        Re: clueless

        > Liam discusses.

        I'm not entirely sure that's the correct verb, TBH...

        Vic.

  46. Anonymous Coward
    Anonymous Coward

    Angry

    Claim BSD is in any way better than Linux and then slag off my beloved Autoconf... I'm so angry I need to lie down.

  47. Anonymous Coward
    Anonymous Coward

    perfection is not the point, is it ?

    linux is not about quality, just like windows or os-x is not about quality.

    it provides all kinds of individuals, organizations, companies with an independent platform that is owned by every one.

    level playing field, it frees you from having to give money to whomever for something that's already there for years.

    Imagine what India or China or African nations would face in financial difficulty if they had to pay microsoft, apple or ibm for every instance of windows/os-x/ os-2.

    that's why linux is a success. it solves, for a whole lot of people (4 billion or so?), the financial problem of being allowed to use their computer.

    RG

This topic is closed for new posts.

Other stories you might like