Fedora 10 alpha code is go

The Red Hat-sponsored open source team behind Fedora has released an alpha version of Fedora 10 (Cambridge) with better security features and audio support. The operating system is suitable for testing but not for running on any critical systems. Along with bug fixes, Fedora promises better audio support, a graphical security …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward
    Alert

    So....

    We're really alpha-alpha testing RHEL6 features then?

  2. Iain Watson
    Alert

    How does alpha code differ?

    I've come to the conclusion that Fedora in general is only suitable for testing. It's definitely no good for running on any critical systems, unless you enjoy rebooting them weekly.

  3. Anonymous Coward
    Anonymous Coward

    @ So...

    Fedora is one of the most up-to-date distros, and I expect users' and testers' findings are applied to a lot more than just RHEL: bugs that are found may well go back upstream, to the Samba team for example, and then, given that it's all open source, get taken up by all Linux users sooner or later...

  4. Anonymous Coward
    Linux

    Re: So....

    Yes. Fedora have never really tried to deny that.

  5. Anonymous Coward
    Alert

    Re: How does alpha code differ?

    "It's definitely no good for running on any critical systems, unless you enjoy rebooting them weekly."

    My company's webserver begs to differ - running patched FC7 and a tad over 235 days uptime so far. We've got several 'internal' servers, mostly running FC8 (and one running FC9 for evaluation) with similar uptimes.

    If you're having to reboot weekly, you've screwed up somewhere. Gentoo is my distribution of choice, but for work purposes, Fedora fits the bill just fine.

  6. Anonymous Coward
    Anonymous Coward

    Re: How does alpha code differ?

    Yes, it's true that Fedora is considered an experimental distro. Even so, I have used it at work for development tasks since Fedora 1, and even for some servers (though it has to be said that the most critical ones run RHEL or CentOS). In general it has performed quite well, and I will stick with it.

    I have learned that it is better to wait a few weeks after a new version has been released, until the code has stabilized a bit. For instance, getting an Nvidia card to run AIGLX (or any 3D) on FC9 was not possible in the first weeks, because there was no driver available for the Xorg version shipped at the time (which was a pre-release, not a final version).

  7. Dave

    Re: How does alpha code differ?

    I've had Fedora running for months at a time without a reboot. As others have said, you're doing something wrong if it won't run for long without a reboot. Of course, it's necessary to reboot at least once a year, simply to upgrade to a newer version of Fedora, which could be a downside for production. I use CentOS for systems where I would rather not have to do major work that often.

  8. Chris Thomas
    Coat

    @How does alpha code differ

    You know, when I first started out with Fedora, it was rock solid: no features were allowed in that would seriously compromise a system. It worked, and it was on par with pretty much all distributions.

    However, that was then. NOW they allow pretty much any alpha feature you like in and f***k up the system with all kinds of experimental things. Do you remember the time when alpha-quality software on Linux usually meant the equivalent of full-quality software on Windows? I sure do.

    I remember a time when I wanted to recompile my first kernel and patch it with FAT32 support, with the original version that was available back in 1996 (more or less). My C language lecturer commented to me, "I'd trust alpha code for Linux over release code for Windows", and he was right back in those days. Alpha really didn't mean alpha in the way we all KNEW it meant; you just rode with the punches, and they didn't come that often. Alpha really was stable as houses.

    Nowadays, alpha really does mean alpha. Where did the quality go? Maybe the pool of developers has filled up with people who aren't really up to the same level of quality as the developers of years ago. Now any muppet who can write Python can get their project into a Linux distro, almost no questions asked.

    It really hit me when I found out that Fedora 9 would come with a pre-release of X.org, and not even a very functional one at that unless you had an Intel graphics card. So let's get this straight: what percentage of laptops or desktops have Nvidia or ATI cards inside AND run Fedora 9?

    ALL of those people have just been screwed, and we're back to messing with config files and disabling Nvidia and ATI kernel modules because the new Xorg doesn't support binary drivers. When this was pointed out, I was told that Fedora is an experimental distribution, etc., etc., blah blah.

    As if I'd just walked through the door five minutes ago, rather than been with Fedora since day 1, or Red Hat Linux before that.

    We've gone from an age of rock-solid reliability to beta quality, alpha quality, who-knows-what quality.

    Anyone else noticed that Gnome crashes just as often as Windows, or more? Anyone else noticed that the time between reboots of your machine is now measured in days rather than weeks (including hibernate)? KDE is causing problems for pretty much everyone (read the blogs).

    Maybe I'm just bitter because I got bitten, but beta used to mean rock solid years ago; now it means barely functional (PulseAudio, anyone?).

    TL;DR

  9. Anonymous Coward
    Anonymous Coward

    Make it easier to...

    ...install proprietary drivers for audio and video. It is a royal pain in the arse to do at present.

  10. Owen Williams
    Linux

    Fedora is as stable as you are...

    competent, patient and active in the community.

  11. James Butler
    Linux

    @Dave 13:39

    "it's necessary to reboot once a year at least, simply to upgrade to a newer version of Fedora"

    There's no reason to reboot unless you're upgrading the kernel, which would also be the case with CentOS ... as it is a Red Hat package conglomeration, similar to Fedora. Check your yum.
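
    For what it's worth, a quick way to check whether a box actually has a reboot pending is to compare the running kernel with the newest kernel package installed. Here's a rough Python sketch of that check (my own illustration, not an official Fedora tool; it assumes an RPM-based system with rpm and uname on the path):

        #!/usr/bin/env python
        # Rough sketch: is a newer kernel installed than the one currently running?
        # Assumes an RPM-based box (Fedora/RHEL/CentOS) with rpm and uname on PATH.
        import subprocess

        def run(cmd):
            return subprocess.check_output(cmd, universal_newlines=True).strip()

        running = run(["uname", "-r"])  # e.g. "2.6.25-14.fc9.x86_64"

        # "rpm -q kernel --last" lists installed kernel packages, most recently
        # installed first; each line starts with the package name,
        # e.g. "kernel-2.6.25-14.fc9.x86_64".
        newest_pkg = run(["rpm", "-q", "kernel", "--last"]).splitlines()[0].split()[0]
        newest = newest_pkg[len("kernel-"):]

        if running == newest:
            print("Running the newest installed kernel (%s); no reboot needed." % running)
        else:
            print("Kernel %s is installed but %s is running; reboot to pick it up." % (newest, running))

    If the two match, nothing is pending; if they differ, the next reboot picks up the newer kernel, and everything else can just be restarted in place.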

    Oh ... my Fedora systems reboot as often as my RH systems ... with each kernel upgrade. I've got 2 Fedora desktops and one RH7 desktop; the others are production servers running Fedora 4, Fedora 5, Fedora 9, RH7 and RH9. Like anything else, care and feeding are important, but the OSs themselves have been rock solid and a pleasure to administer. I really love Fedora 9. It just works, and feels good to me while doing so.

    As noted by Chris Thomas, KDE is on its way out ... but to differ with him/her, the recent Gnome interface is fantastic! I haven't had any crashing issues at all.

  12. James Butler
    Thumb Up

    Oh, and Chris...

    I also agree with your statements about initial quality, to some extent, although I'm certainly not missing the Big Slack or the Mandrake experiences ... the current releases (not alphas or betas) are pretty damn good when compared to earlier releases (not alphas or betas).

  13. Chris Thomas
    Heart

    @James Butler

    In a way you're right, in another way wrong; it's a blurry issue, tbh. Some days I can go without a single crash, but when recovering from hibernation, sometimes my "login screen" is completely grey (a Compiz issue, perhaps?). Still, Gnome will take the bullet, because it's what I see, not what's behind it. Perception is everything, I guess.

    Sometimes NetworkManager fails to find any access points, while my girlfriend, who runs Linux too (shock horror: news at 11), has no problems and can see about 20 of them, etc., etc.

    The point I was trying to make is that, COMPARED to five or six years ago, Linux is increasingly reaching the point where it crashes MORE than it did before. I think what we might be seeing is that Linux was never immune to these things; it's just that because the software was less complex, less capable and less used, problems got missed and the design strategies of the time couldn't find them. Now we've got 10x the users, 10x the desktops and 10x the complexity with 10x the functionality, and all of a sudden we start crashing the way we took the piss out of Windows for doing for years.

    Sorry to say it, but Linux in the past crashed less often because it did less, or it wasn't used as much. You want to tell me I'm wrong? Go for it, but you won't win; I've been using Linux for 10+ years and I've seen it. Ten years ago, X Windows was barely a window manager unless you hacked the holy mother out of fvwm and fvwm2. I remember Gnome 1.0; it wasn't anywhere close to Windows 95 in terms of ease of use, etc., etc. So beware, any of you who think you can tell me I'm wrong, because I ain't. Don't waste your breath picking at straws or terminology; get a proper argument too.

    Basically, Linux has turned teenager and got spots. Welcome to Windows 95 territory; expect to see more of this before it gets better.

    FORTUNATELY, Linux moves at such a rapid pace that this phase will be over quite quickly, but it will still be painful for some.

  14. James Butler

    Kind of agree

    "The point I was trying to make, is that COMPARED to 5, 6 years ago, linux is increasingly coming to the point where it crashes MORE than it did before"

    I do agree that a system running Linux is more susceptible to crashes than a similar system from five or six years ago; however, I would argue that it's not necessarily the OS that's crashing. Your issues, for example, seem to be with Gnome and/or your WLAN sniffer app and/or your hardware's "wake" functionality (the most likely culprit, based on so little info), rather than with Linux per se.

    Since 90% of the stuff I do with Linux involves boxes halfway around the planet and straight CLI manipulation, I don't get much chance to use the GUI except for when I'm at home.

    Under those circumstances (CLI-managed production boxes), the things that go wrong typically involve performance failures under stress, for example, and not application design or execution failures ... unless you count stress failures as design failures, which I could understand while still quibbling with.

  15. Boris the Cockroach Silver badge
    Linux

    oh goody

    It's Fedora 10 "alpha release", or as a certain large software company with no morals calls it, "done, finished and ready for the user".

    Boris

    <<running 2 boxes with Fedora 6 on them and sees no reason to change

  16. This post has been deleted by its author

  17. Dave

    @James Butler

    I guess the point I was making with reboots to upgrade Fedora is that each version has a limited life before it stops getting updates (I know there's another level of support that continues but I don't use it), so once a year it helps to refresh to a newer version (twice a year if you're really keen). CentOS follows the RHEL support cycle so I know I'll still be getting security patches and other updates without having to upgrade the whole system.

    Having tried an in-place upgrade between Fedora versions, I have no faith in it, so I go for a clean install. It helps clean out the dross that accumulates without needing to go through a failed hard disk scenario.

  18. John Haxby
    Linux

    Fedora stability

    Fedora 10 alpha is very, very early in the cycle. You won't get access to anything else as early unless you're actually working on it yourself and that is the point: if you want to get involved very early on you can do.

    Fedora 9 gave a much-needed impetus to the KDE 4.0 work and while KDE 4.0.5 undoubtedly has a lot of rough edges, it's much better for the extensive testing and fixing that it got through the Fedora exposure -- KDE 4.1 (and Fedora 10) hold great promise as a result.

    Fedora, though, whatever the release, can be seen as the enthusiast's distro: it has everything bar the kitchen sink (I checked), and the latest version of all of those things. By its very nature it's not stable, although I have less trouble with it than I do with Windows (except that I hardly use Windows any more).

    If you crave stability and you want to stick with Red Hat, then go for RHEL or CentOS. Of course, you won't get the latest of everything and your favourite package probably won't be updated ("rebased") for the lifetime of the release, but it's stable and well-tested, and the only time these systems get rebooted is when the kernel gets a security fix or when the next update for the series comes along. Or some idiot trips over the power lead.

    You could write practically the same about Ubuntu and Ubuntu LTS -- the enthusiast releases will never have the stability that the long-term support releases have.

  19. James Butler

    @BKB

    Not all of us run simple web servers ... and the number of machines is merely an indication of the breadth of our experience. If the 100 computers are all running different versions of the OS, then that experience translates into a broader information base from which to draw. When I say that I've not had problems, I'm saying that across many years and many releases, which is saying something different than if I had only built one system with one release.

    That said, nicely put, John Haxby, but remember that even though the releases are available "with everything", it's trivial to select a minimal set of packages to install, which is what most sysops do.

    Oh ... and I just went through one of those "mass upgrades" noted by ESR in his note to Linux.com with no dependency issues and no hang-ups. Of course each system has its own hardware, so I mention this simply to note that ESR's experiences are not universal, and neither are mine.
