Mark Shuttleworth, head of Canonical and founder of the Ubuntu project, has called on other Linux developers to synchronize releases of new versions of their distros. He also pledged to deliver the next Long Term Support (LTS) release of Ubuntu, version 10.04, in April 2010 - unless, of course, Red Hat, Novell and Debian decide …
Does this seem like A) A good idea or B) Something which is remotely possible?
Why even have different distros if they all use the same software? This can only lead to inter-distro politics and standards which will ultimately just waste a hell of a lot of time.
Does it seem to anyone else that the Linux community is sort of losing focus on the things that once made it awesome?
While I'm not sure about the method, it seems like the biggest problem linux has is that there's a constant reinvention of the bicycle. Most distros rewrite the glue that holds the applications together for no apparent reason, instead of collaborating on one front. I think that it's a good idea to synchronize in a greedy sense - if it works, it would yield a lot of end-user benefit, because the dev time can be spent on improvement instead. However, ultimately I must agree - linux is the kind of hacker OS where personal preference is the only thing that matters. Limiting that could even scare some devs off.
If you want everything personalized, try out Gentoo instead of any of those listed in the article.
I like Mark Shuttleworth's idea in principle. If GNU/Linux is to seriously rival closed-source OSs there needs to be more common ground between the different distros. Otherwise, what chance is there of attracting the technology-challenged non-geek? At the moment, IMHO, Ubuntu comes closest, although others are now slowly closing that gap. Until then the likes of Micro$haft will dominate.
Why even have different distros if they all use the same software?
There's no mention of them using all the same software.
Just about all distributions (except the smaller 'Live' distributions such as Damn Small Linux & Puppy Linux, and possibly one or two others) use the current kernel, OpenOffice, KDE or Gnome (and have the other packaged in their mirrors) and gcc.
Shuttleworth isn't suggesting that Fedora, Ubuntu, etc. become clones, merely proposing that the next versions come out in the same month.
Is that a problem?
Linux reinvents the slashdot effect
Is it just me or does this seem like they're literally trying to kill the internet's mirrors? One distro is slow enough to get ahold of on release day, but all of them at the same time?
I kinda agree
The biggest problem with Linux, and equally its biggest barrier to entry, is the lack of coordination between different distributions.
On the one hand, this is what Open Source is about: everyone working on the same project to make it better. Also, FOSS isn't about rivalry between projects, but different ways of solving the same problem.
On the other hand, though, choice is what makes Linux great. Anyone can come along and make a new distro/project in any way they like without having to adhere to standards and licences.
Argument 1 if you want more users
Argument 2 if you want to keep the philosophy (I hate that word) of FOSS
"While I'm not sure about the method, it seems like the biggest problem linux has is that there's a constant reinvention of the bicycle."
That's really a strength. I mean, it's true that it spends developer time that could go to other things.. but
a) Developers like to take existing code and make it better. These same developers may not be interested at all in working on some new project. (After all, many are working on it on a personal, not paid basis.)
b) It avoids ending up with a Microsoft-like mess of code.
1. Ugly code, even if it works, will be pulled out and replaced with clean code. This avoids a maintenance nightmare later on. Code that isn't ugly but is slow is sped up. This helps make it possible to release new distros without drastically increasing hardware requirements... so depending on how you look at it, you can run the latest'n'greatest on your old hardware (I have run Ubuntu 8.04 on some systems that are pushing 10 years old and it ran OK) or you can get more out of your shiny new servers, because the apps are much more efficient than they would otherwise be.
2. People don't just glue what should be separate apps into an interdependent mass, because doing so makes the apps break. Which can sound bad when someone wants that app out NOW, but is good overall because it means the app will keep running in the future, stay portable to other systems, and so on. See Microsoft having trouble with their own apps breaking with each OS release, because they intermingle what could/should be a separate app with OS functions.
This is increasingly irrelevant
For me at least. After playing around for a while in a VirtualBox with the latest Ubuntu offering, I just decided to keep my current 7.04. What drove me to do that was the simple question: what's there for me? The answer was just as simple: nothing that I don't already have. A friend of mine upgraded the system driven by sheer enthusiasm only to find the same as I, that is, nothing relevant enough to notice a big difference.
Perhaps Kubuntu 8.10 with the KDE 4.1 desktop stabilised will give me compelling reasons for the upgrade, but I think Linux, and desktop OSs in general, have reached a level of maturity where new versions attract mainly newcomers; current users who are already happy with their XP, OS X, Fedora, Ubuntu, SUSE or whatever are becoming less and less enthusiastic about upgrading.
"Sure there's a distro," Doc Torvalds replied. "Distro-22. The only good open and dynamic o/s is a constrained and restricted one."
There was only one distro and that was Distro-22, which specified that a concern for one's safety in the face of dangers that were real and immediate was the process of a rational mind. Shuttleworth was crazy and so he couldn't make a new release. All he had to do was ask; and as soon as he did, he would no longer be crazy and would have to make another LTS. Shuttleworth would be crazy to release another LTS and sane if he didn't, but if he was sane he had to release them. If he released them he was crazy and didn't have to; but if he didn't want to he was sane and had to. Userrian was moved very deeply by the absolute simplicity of this clause of Distro-22 and let out a respectful whistle.
"That's some distro that Distro-22," he observed.
"It's the best there is," Doc Torvalds agreed.
[ black helicopter supplied by Milo Minderbender Enterprises ]
The problem with Linux: nobody's in charge
This leads to two irritating characteristics of Linux:
1. The user interface varies arbitrarily from app to app. If you want to open the "file" menu, some apps want "press alt, release, then press F". Others want "press alt, hold it down, and press F".
Linux may be more robust and secure under the hood, but Windows unquestionably has the more polished UI.
2. The developers seem to be fixated on adding bells and whistles, resulting in bloat. If the Linux community really wants to differentiate Linux from Windows, it wouldn't hurt to actively attack bloat.
Many, many years ago when I worked in "large systems support" for Burroughs (B6700 days), releases of the MCP alternated between new features and improved efficiency. One release would add bells and whistles, and the machine would slow down. Next release would add few new features, but everything got faster and more responsive.
Not a bad model for an OS release cycle, no?
I imagine trying to direct the Linux development community in one direction is much like trying to herd cats, but can't a good border collie manage the latter?
Or am I a hopeless dreamer?
Heart, because at my advanced age one hopelessly dreams of love. [N.B. I *did* *not* use the Paris icon this time. Paris may be for lovers, but Paris herself and the emotion "love" seem to live on different planets.]
@Dan (slow distro dl's on release day)
Torrents: they're not just for porn anymore!
It might just work...
Many moons ago, when XP was trucking along laying waste to everything someone said "All it'll take is Microsoft to screw up once and we'll have them". Dunno who that was, but here we are. Vista tanked, Vista-SP1 tanked, XP-SP3 tanked. The beast's on the floor right now, give it a damn good kicking!
Realistically, Gnome 2.22's features are universal amongst any distro using it, same with KDE and so on. It's not 'Ubuntu' or 'Red Hat' detecting hardware... it's the 2.6.xx kernel coming of age doing the leg work.
Canonical, Red Hat, Debian and the other distros who haven't signed up to MS's FUD attack (the beasts with two backs get no favors from me) all going forward in lock step, combining their budgets, throwing out the same FUD Microsoft does, advertising, signing OEM deals and such would be that damn good kicking.
Paris, because her spell checker is stuck in American as well.
Sync then would result in all the big distros releasing the same thing.
Diversity can be a pain in the neck - thus sysadmins like homogeneity - but it also results in higher levels of innovation. Would Shuttleworth have a committee spec distro releases?
F**k, I'm gonna start to sound like a marketer, or manfrommars, if I keep this up.
Shuttleworth only wants synchronisation so Ubuntu can feed more effectively off other distributions' work. Ubuntu doesn't have much manpower (as opposed to hordes of enthusiasts) in the areas which need heavy lifting - kernel, toolchain, OpenOffice and the desktops, anything ending in 'Kit' - and so has to follow the trails blazed by openSUSE and Fedora. RHEL and SLES are based on the stabilised work of the free distros to make a commercial product, but Ubuntu LTS, the product Shuttleworth wants to make money with, is effectively slaved to these, as the work is happening on Novell and Red Hat's terms.
It's not about end-user benefit - this is already present in the innovative distributions - but about creating the conditions to feed the Ubuntu wagon with the least investment.
You can object that it's all Free Software and in the end the user wins if you combine Ubuntu's community-building magic with the other distros' engineering prowess, and this is true in the longer term. In the short term, a lot of the tough problems that are solved to make Linux competitive are solved by professional developers in-house. These guys have salaries that are paid for by selling RHEL and SLES. Setting up conditions that make it easier for Ubuntu to enlarge its commercial Linux market share reduces those sales, and talented distro guys leave for Google or Web 2.0 startups.
What's clever about his offer is that he only requires "two out of three of Red Hat (RHEL), Novell (SLES) and Debian" to sync in order to sync Ubuntu to them. Thus it comes down to getting the Debian leaders (also net recipients and, realistically, the easiest group to seduce) to shift their release cycle, and he's created a 75% bloc that will hurt the distro left out.
Saint Bill, because it's worth looking past the shine of the halo sometimes.
...user interface varies arbitrarily from app to app. If you want to open the "file" menu, some apps want "press alt, release, then press F". Others want "press alt, hold it down, and press F...
I have been using GNU/Linux for 10 years as my desktop system and have never opened an application in either manner - perhaps you might let us know which applications you are using?
...Linux may be more robust and secure under the hood, but Windows unquestionably has the more polished UI...
Two disconnected statements pretending to be an argument. Actually the UI point was valid, but as long ago as SUSE 8.3 (openSUSE will be 11.0 soon) Relevantive did a usability study against XP showing little difference - and KDE 4.1 is chasing the Mac. I don't doubt that Gnome is fairly impressive, given the popularity of Ubuntu.
...developers seem to be fixated on adding bells and whistles...
Such as a better UI?
...resulting in bloat...
I read (though I am not a developer) that KDE4 has a smaller overhead than KDE3.
...If the Linux community really wants to differentiate Linux from Windows, it wouldn't hurt to actively attack bloat...
...trying to direct the Linux development community in one direction is much like trying to herd cats, but can't a good border collie manage the latter...
The myth of anarchy in Free/Open development has been de-bunked several times over. Picking on the kernel, I think there is a Finnish border collie.
Ubuntu blows.. Linux sucks
Ubuntu really blows. I've tried to install it a few times and it still falls down at integrating with the rest of the network.
Having to mess about in weird text files to try and lock the speed of a network card - which is 3Com, so hardly uncommon - and appalling performance when trying to stream files from a NAS; it's as broken as Vista on these basic functions!
Why is it so difficult to have a GUI interface to change the NIC speed? It's not rocket science; there are many questions on such issues in forums with vague answers that don't work... and why can it not work against other file systems without having to resort to brain-surgery-style exercises...
Sorry, I went back to XP, which just works out of the box
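For what it's worth, locking a NIC's speed and duplex without a GUI is usually a one-liner with ethtool. A minimal sketch, assuming the interface is called eth0 (an assumption; yours may differ) and a Debian-style /etc/network/interfaces:

```shell
# Force 100 Mbit/s full duplex with autonegotiation off
# (eth0 is an assumed name; list real interfaces with 'ifconfig -a')
ethtool -s eth0 speed 100 duplex full autoneg off

# To persist across reboots on Debian/Ubuntu, hook it into the
# relevant stanza of /etc/network/interfaces:
#
#   iface eth0 inet dhcp
#       post-up ethtool -s eth0 speed 100 duplex full autoneg off
```

Not a substitute for a proper GUI, but it is the same mechanism any graphical tool would call underneath.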
The point is not to make all the distros look and feel the same, it's more about commonality of effort.
If the releases are synchronized then you can be fairly sure that when you're bug fixing before a release then the external developer you want to talk to won't be on holiday, at a conference, busy with other stuff etc.
They will also be up to speed and working on the same version that you are and it won't take them a day to build a new test system to replicate your problem.
It will also be much easier to organize other co-ordinated development efforts like "Bug Days".
I think you're all missing the point
The idea is not to have exactly the same distro, but to use the same kernel and X so bugs can be fixed at the kernel/X level all in one go; otherwise developers spend time fixing version x, then y, then applying the fix for y back to x, and so on. Also, the different distros already use each other's fixes for hardware patches, etc., but as they are all at different stages of kernels and X, there is no cohesion to what they are working on.
It really makes sense - as developers can all concentrate on fixing one set of bugs, rather than a mishmash spread over different versions on different distros; then they can spend their time doing fancy UI stuff.
@It might just work...
Really? XP SP3 and Vista SP1 tanked? The beast isn't on the floor; it sold more OEM copies last month than Linux has users. Good luck to *nix, but get real
They aren't talking about merging
"all going forward in lock step, combining their budgets, throwing out the same FUD Microsoft does, advertising, signing OEM deals and such"
Shuttleworth's suggestion is nothing at all to do with what AC wrote.
All he wants to do is set release dates in common, nothing more.
Hmmm, looks like he's trying to get the "competition" to release at the same time, hoping the end user will rather wait for his version...
I use the word competition in a general sense. I release open source software, so I know that warm fuzzy feeling inside when people use yours rather than someone else's :P
Also, think about it: all the major software companies will hang back with their own releases for the distros... that would lengthen the road forward some more. Surely for the community, faster and more regular is better?
Alien, 'cause I want an Alienware laptop
When I get a new release it usually comes down at around 100kbps which is about as high as I can get on my sloppy net connection.
@Nexox Enigma & @Anonymous Gentoo fan
Whilst Ubuntu is recognised as a worthy desktop distribution, surely Shuttleworth is trying to make it more relevant in the server room. It would seem he thinks Red Hat and Debian currently hold the enterprise laurels (note *not* Suse). And he'd be right.
Gentoo will become a supportable, enterprise-capable Linux distro shortly after hell freezes over.
Paris because she runs her business on Gentoo.
@ It might just work...
Why? Because HP fluffed their images and applied the wrong processor drivers in all images. I've been testing SP3 on many systems and it's very stable; the only thing it's achieved is to make Vista SP1 look like a failure.
That's not as tanked as my Ubuntu 8 upgrade, which 'tanked' repeatedly and won't install now on my notebook; a shame, as I've been using 6 happily for ages.
Linux is Broken.
Until the following is fixed, Linux will NEVER be mainstream.
1. Common Kernel across all distros - managed and developed by an independent Kernel Developers Group
2. The removal of X as the GUI engine. X was designed as a remote graphical engine, allowing users to run GUIs on dumb terminals connected over networks to the main multiuser *nix machines. Desktop Linux does not have this requirement. Apple realised this, and wrote a new GUI engine for OS X.
3. A standardised application deployment/installation system. Loads of bloated shell scripts just don't cut it, you need something like MSI (used on Windows) that can install a standard application package onto ANY distro.
Fix those three things, and then, maybe, you'll have something your granny can use.
Linux isn't broken, neither does it 'suck'.
It is still a toy for hackers and people who like to play. It's the likes of Ubuntu, who try to push this stuff out to the 'simple' end user before making sure there is a means to manage all aspects of the system without hacking text files, who are broken.
Gentoo & Debian do exactly what they are intended to do.
The point of moving all distros over to one installer is interesting, but complete bollocks. There are different installers for good reasons. However, polishing them and making it easier for the frontends to have the same 'appearance' is needed. Making it possible for a manufacturer to create a simple install script that can just pass the required information to the installer is all that is required.
One unified hook across the different installers would do great.
Portage on RedHat? RPM on Gentoo? Hell, neither would work too well would they.
Of course, adding common hooks means that package dependencies need studying, due to different files being present in different packages across systems (I think this is improving already, though).
Getting it into schools is the trick to making it big. And there needs to be a lot of work done before that happens.
Windows is broken
Until the following is fixed, Windows will NEVER be secure or useful.
1. Differing Kernels for differing hardware - managed and developed by an independent Kernel Developers Group. One size fits all is sooo 1990s
2. Use X as the GUI engine. X was designed as a remote graphical engine, allowing users to run GUIs on remote terminals connected over networks. Don't the fools see how useful network transparency is?
3. Replace the deployment/installation system with some simple shell scripts, so you can install a package onto ANY version.
Fix those three things, and then, maybe, you'll have something an expert can use.
Re: conspiracy :P
Well there's one obvious difference between commercial development work and bedroom hackers who do things at their own whim... people employed by companies/organisations are usually vetted for their ability to communicate coherently in some known language using appropriate grammar and syntax.
I've always been disturbed by the thought of relying on software written by people who display a lack of formal education. They can't possibly have written good quality code when they can't communicate to (other?) humans clearly.
Paris, 'cos she c*nt speel eyether.
Taking your points in order....
1: There is - the only real difference is which version they settle on. The fact that the kernel changes so much within minor revisions is the problem - and that's a more fundamental issue
2: The fact that the desktop and remote access work perfectly well together is a nice thing. Not having to use something like VNC makes my life easier. The X11 protocol is also being upgraded in the background to the extent that remote sessions can take advantage of the accelerated features - something I've not seen in either the M$ or Apple products.
3: What do you think happens behind an MSI install? There are already multiple package management options - most of which exceed the functionality of the MSI installers (uninstalls that actually work are almost a rarity in the Windows world, and as for dependency checking...)
So basically - I feel you're talking rubbish.
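To make the comparison concrete: on a package-managed distro, a dependency-resolving install and a working uninstall are each a single command. A sketch - the package name is a placeholder, and flags vary slightly by version:

```shell
# Debian/Ubuntu family: fetch, resolve dependencies and install
apt-get install somepackage
# Clean uninstall, including no-longer-needed dependencies
apt-get remove --auto-remove somepackage

# Red Hat family equivalent
yum install somepackage
yum remove somepackage
```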
Debian developers responded????
... that was quick for anything to have happened on the Debian project ;-)
also @FlatSpot's flamebait.... thanks for trying ubuntu, in terms of what can and what can't be done, your comments only show that you don't know nearly as much as you like to think you do about computers or networks. What you are really complaining about is not having a GUI wizard to wipe your butt for you. Have patience, do more research and try again and you might have better luck.
But will it be ready for the desktop?
If three or four distros come out at once then the packages they use can also be timed so that reasonably stable versions are ready a month or two before the distro comes out. This is good.
Maybe they can all decide on a common file tree as well while they're at it, and we might be getting somewhere. Better for installers, easier for people to find their way around the system.
I tried Ubuntu once. I spent a weekend messing around with xorg.conf and decided to uninstall it and do something else more productive with my time. I suppose I could blame either Ubuntu or ATI for crappy dual monitor support, but I'm sure the whole process would have been far less painful without X sodding 11 in the middle. If they ever do manage to fix that abortion then it'll be something completely new with the same name, because as it is it's a 25-year-old irreversibly broken program that should be thrown out as soon as possible (I feel better now).
Xorg is a significantly cleaner X11 than XFree86; generally it works 'out of the box', and with most graphics card drivers now supporting XRandR, the usability of X11 is similar to that of Windows. I have no truck with GUI apps to configure any part of my system - I'm old fashioned that way - but IIRC there are plenty of XRandR applets now.
Almost as an aside, that has nothing to do with Linux. X11 is for Unix boxes, and it happens to be able to run on Linux. The drivers for X11 are drivers for X11, not Linux, and so also support BSD, Solaris, etc. Personally, I think the way most Linux distros are going is not great; we'll just end up with a bunch of Windows clones, with auto-conf wizards and a bloody GUI tool to configure your NIC. The only tools you need to configure your NIC are man(1) and ifconfig(8).
Delicious pasta is delicious and I must eat it.
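In fairness to the applet-averse, the underlying XRandR commands are short. A sketch - the output names LVDS and VGA-0 are assumptions and vary by driver:

```shell
# List detected outputs and the modes each one advertises
xrandr --query

# Put an external screen to the right of the laptop panel
xrandr --output LVDS --mode 1280x800 \
       --output VGA-0 --mode 1024x768 --right-of LVDS
```

A GUI applet is essentially a front end for exactly these calls.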
Why SHOULDN'T someone be able to use a graphical tool to perform common tasks? Linux config being mainly text files is good because a) it's easier to migrate settings than the Windows registry, b) it's easier in how-tos to use terminal commands rather than endless screenshots of GUI windows, and c) it's easier to control over an ssh connection. None of those benefits is lost by also having a GUI.
And while some unlikely combinations might require hand tweaking, this should be a rare case, not a common one. Every version of Ubuntu I've installed on my home machine, I've tried to get by with only the GUI tools. Every time, I've been thwarted (mainly by X configs). It is a travesty that in this day and age you still can't tweak your monitor settings without the distinct possibility of needing to un-mangle your X config afterwards.
"don't know nearly as much as you like to think you do about computers or networks" Yeah, because knowing how to configure NICs in linux is fundamental computing or networking knowledge?
Linux is Broken?
Perhaps in your particular living room remote desktop displays are not a requirement. Please do not extrapolate from that to assert that the rest of the world is identical to your living room; for a significant proportion of Linux users (including people in my living room) the network transparency of X is essential. Thin clients never did go away, you know - and they make just as much sense today as ever before.
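That network transparency is also trivial to use. A sketch - the user and host names are hypothetical, and it assumes the remote sshd permits X11 forwarding:

```shell
# Run a graphical app on the remote machine, drawn on the local
# display, with the X traffic tunnelled over ssh (-X)
ssh -X user@buildbox xterm

# Thin-client style without the tunnel: point the remote app at
# the local display (needs e.g. 'xhost +buildbox' locally first)
ssh user@buildbox 'DISPLAY=mydesktop:0 xterm &'
```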
We already have a common kernel across distros - it's called Linux. You can use pretty much whichever version you want - it's exceptionally unlikely to cause problems at a used-by-granny application level. Use what came with your distro and you aren't likely to go far wrong.
As for point 3... no. Not much more to be said there, really; if you're a technically inept user, stick to the packages provided by your distro - it's at least as easy if not more so to install these than it is to install most legit software on Windows.
You may not like GUI apps, but as I'm trying to configure two screens I think we can make an exception in this case.
I would like Ubuntu to do the following for me...
1. Detect I have an LCD screen and a TV (it did, just, after installing a new driver).
2. Detect the resolutions, colour depth, and refresh rates that these screens can use (it didn't).
3. Start with a default that is visible (it did, probably by accident, but only on one screen).
4. Allow me to choose another resolution, colour depth, and refresh rate for each screen and let me position screen 1 and 2 how I want them (it couldn't).
5. Bonus points for detecting when I plug in and unplug the telly and resizing the desktop accordingly.
I just want to be able to open a window, choose those three settings for each screen, and drag screen 2 around screen 1 so it matches where things are in the real world.
All of this X11 is manifestly unable to do. I don't particularly want to spend a weekend booting into text or VGA mode, cranking up vi and diving into xorg.conf trying to tell it all the combinations of resolutions, colour depths, refresh rates, the horizontal sync rate of the LCD screen and telly, whether the desktop should extend onto this screen, etc... I get the feeling that if I type the wrong thing my TV will implode.
And the moment you plug in something new or unplug something you have to do it all over again.
Other people have mentioned that X11 works over the network, which is sort of like keeping an old mangy three-legged dog because it can do roll-over and play-dead tricks. The right thing to do is humanely put it out of its misery.
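For reference, the hand-edited route those lost weekends go into is an xorg.conf fragment along these lines. Every identifier and mode below is illustrative, and the exact options depend on the driver:

```
Section "Monitor"
    Identifier "LCD"
EndSection

Section "Monitor"
    Identifier "TV"
    # RandR 1.2 drivers: place the TV to the right of the LCD
    Option "RightOf" "LCD"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Card0"
    Monitor    "LCD"
    SubSection "Display"
        Modes "1280x1024" "1024x768"
    EndSubSection
EndSection
```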
@ herding cats
Obviously not seen any cats lately.
If a border collie tries to herd cats, one of the following will happen:
(1) the cats will ignore the collie, only occasionally looking at it in that arrogant way cats have of looking at things
(2) the collie will go after the cats, all barking and teeth and drooling. This will lead to:
(2a) the cats will absolutely rip the doggy to pieces. A few cats may get hurt but the community will survive, without the collie annoying them.
(2b) the cats will initially run away, then (with the exception of the one the dog is chasing) they will stop, lick their genitals and carry on as before. Sure the one chased by the collie will suffer but the community will survive.
(3) The collie barks a bit, maybe runs around a little to remind everyone he's still there but leaves the cats alone on the principle that "as long as no kitty is trying to scratch my eyes out I won't try to eat any of them".
The problem* with FOSS is that anyone can write stuff. The problem** then is that it becomes difficult to turn down "improvements" without loss of control and forks of the projects. This then leads to the problem*** that you don't get one single, standard piece of, well anything.
Drop X ?
Now that it works properly? And replace it with what? Are you aware of the fact that almost every graphical *nix application is written with an X toolkit? Are you suggesting all those apps should be rewritten? Rewritten by volunteers, in their spare time, so YOU can use them without having to think or learn? It's not X's fault if you couldn't get it to work. It's not the fault of ATI, Nvidia, DistroXY, or whoever. It's YOUR fault! All you had to do was type "man xorg.conf"; this manpage really tells you everything. If that's too difficult for you "experts", any *nix comes with a configuration tool. It just doesn't hit you in the face and shout "CLICK ME". It's true, X is a tad more difficult to set up than Windows or OS X graphics, but it's easy enough, and we like it that way. It gives us a lot of options, and keeps the mousepushers where they belong. Always keep in mind: you are allowed to join our community, and if you choose to do so, it's for your own good. We don't force you to, and we don't necessarily WANT you to!
I'm not sure choice is a bad thing. Sure, _unnecessary_ choice isn't good (like having 1001 different text editors), but people cope in other areas of life like choosing a car, or a breakfast cereal. I think there is plenty of room for three or four main distros, and many smaller specific ones for routers, firewalls, NAS servers etc. People should be able to choose what works well for them.
I guess what Mark Shuttleworth is suggesting shows how much code is shared between the main distros (e.g. Gnome, KDE, Xorg, the kernel, etc). Working together improving the software and fixing bugs would be a great benefit.
The key thing is ensuring that each distro "just works" for anyone who picks it up and uses it.
As annoyed as I might be with failing X configs, I for one wouldn't ditch X. The problem resides in that when it correctly detects your config, it'll work fine. When it doesn't, prepare for a living HELL. I had these issues with XFree86 back in 1999; my Fujitsu Lifebook 200 had a video chipset that XFree didn't like. It took me a year of waiting until one XFree86 update gave me the correct drivers.
My current setup was easier, as nVidia has made "self-extracting executable installers" which do everything for you ... but they are command-line installers, *and* require kernel-source or whatever that pkg is called in your different distros. (Done it in Ubuntu, and Fedora 6.) I think that at least in nVidia's case, they've done very well on making the X experience easier on the user.
Too bad not everyone's as good on that, yet.
I take my hat off to Mark; just don't stall the bleeding edge
I take my hat off to Mark and hope that the response from the Debian community comes through. Linux is, and always will be, a forward-moving OS driven by the community, but collaboration between distributions can only be a good thing. My only concern is that the bleeding-edge distros may suffer from release and version synchronizing (Ubuntu is amazing, but long live the Fedora project).
And the point....
...I fail to see it.
Why coordinate the releases of Linux distros?
I (mainly) use openSuSE, I don't care for the release of other distros, I wait for mine, then I get a copy of it.
So tell me, what is the point of coordinating them again?
Hey, if you need automatic multi screen TV support so much, then write a module to do it yourself, or pay for someone who has the skills to do it for you.
One of the good things about open source is that once there's a big enough need the support will come, as is happening now with wifi, despite what appears to be the best efforts of some card manufacturers to stop it. And that support will stay even after said manufacturers have orphaned their products and moved on.
I've done the PC to TV and Monitor hookup probably a grand total of four times over the 15yrs I've had a PC, and I'm supposedly a geek. So I would imagine Joe Average hasn't even thought about it, even if they know the option exists. So I can see why that might not be a priority for X. Rather, just icing on the cake.
Whenever I connect a new PC to a new monitor/TV, I always make sure I have the manual handy so that the best supported resolution/refresh rate/colour depth is chosen, including the time recently when I tried the family's laptop on our shiny new widescreen LCD TV. Maybe it's because I'm old, but if I'd tried it with the Kubuntu install on said laptop, I wouldn't have found it such a bind to copy and edit a configuration to match the values from that manual. Typing, on a PC - yeah I know, how old fashioned - I made this comment by dragging and dropping text from other people's comments, as I'm sure you did with yours.
Moreover, if the GPU / monitor / TV doesn't report what it supports correctly, isn't that just a symptom of the manufacturers either not following a standard (VGA, SVGA, XGA or whatever it is now) or not releasing the drivers / specifications needed to work with their devices?
As you feel so strongly about it, I feel you're the ideal candidate to be the FOSS representative who liaises with those companies and persuades them to supply the necessary details or drivers, and to lead the configuration-tool-writing effort. I'm sure all in the community will thank you, as they have Mr Shuttleworth for his sterling efforts.
@ Antoinette Lacroix
I'm afraid it is Ubuntu's fault... the preferences apps overwrite xorg.conf in a way that means I have the choice of Ubuntu's preferences or editing xorg.conf by hand, but not both. There's also a bug which leaves things in an odd state if you restart X on Ubuntu. This makes every change to xorg.conf a royal pain in the arse to test.
I'm afraid it is X's fault... X defines a basic set of preferences in xorg.conf, but if you want anything other than that, the syntax changes according to the driver you have installed.
I'm afraid it is ATI's fault... documentation for their Linux drivers is non-existent. Source code is non-existent too. So that leaves hitting Google and seeing what other people have tried (which sometimes works and sometimes doesn't).
Now a lot of this could be mitigated by co-ordinating releases and having a date to work against. Once that is done, targets for said date can be set. Then we might get somewhere, advancing by design instead of by accident.
With release dates you could do stuff like distros, X, ATI and nVidia agreeing on a common syntax for xorg.conf. Once we have a common syntax for xorg.conf, preferences applications can read and write it and configure every make of video card that has a Linux driver. Then we might get somewhere approximating a pleasant user experience.
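The point about a common syntax is that tools could then round-trip the file mechanically. A toy sketch of the reading half, assuming only the simple Section/EndSection grammar (real xorg.conf files also have SubSections, Options and quoting rules that this deliberately ignores):

```python
import re

def parse_sections(text):
    """Parse xorg.conf-style Section/EndSection blocks into a list of
    (section_name, [(key, value), ...]) tuples. Simplified sketch:
    ignores SubSections; strips '#' comments and surrounding quotes."""
    sections = []
    current = None
    for raw in text.splitlines():
        line = raw.split('#', 1)[0].strip()   # drop comments and whitespace
        if not line:
            continue
        m = re.match(r'Section\s+"([^"]+)"', line)
        if m:
            current = (m.group(1), [])
            sections.append(current)
            continue
        if line == 'EndSection':
            current = None
            continue
        if current is not None:
            parts = line.split(None, 1)
            key = parts[0]
            value = parts[1].strip('"') if len(parts) > 1 else ''
            current[1].append((key, value))
    return sections

sample = '''
Section "Monitor"
    Identifier "LCD-TV"   # our telly
    VertRefresh 50.0-75.0
EndSection
'''
print(parse_sections(sample))
# → [('Monitor', [('Identifier', 'LCD-TV'), ('VertRefresh', '50.0-75.0')])]
```

With an agreed grammar, a GUI preferences app could parse, edit and re-emit the file without trampling hand-made changes, which is exactly the complaint above.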
I assume you use "mouse pushers" in the same way that other people use the phrase "mouth breathers". Obviously some people (such as yourself) feel under threat from mouse pushers and preferences apps that work with a GUI. Having a computer that just works when you use it might then leave you with too much time on your hands and then you might come to the conclusion that despite spending ages at the keyboard you're not actually doing that much useful, you're just tinkering about with files in /etc.
Anyway, I reluctantly agree, it's probably too late to drop X11 (which incidentally has a whole chapter devoted to it in The Unix-Haters Handbook, entitled "The X-Windows Disaster"). The best thing would be to just make the damn thing work, and have it write out xorg.conf so that other applications that need it still work, but leave it at that.
@Schroeder: And I suppose if I needed it then I could write a module to do it, or pay someone to do it, if a) I weren't up against the combined might of every implicated party (the distro, X11, and my video card manufacturer) failing to agree on anything, and b) I weren't sold the line with every single major release that "our distro just works", when it's obvious that there's still a while to go, probably a good couple of years yet. If I'm told it just works, then I would like it to just work. If it doesn't, then I'm well within my rights to vent spleen.
I think you'll find there are two main obstacles to getting your non mainstream requirement to work.
A) The hardware manufacturers, who are just finally beginning to accept that Linux may well have a place on the desktop and are beginning to defy Microsoft and work with the community to improve support.
B) The simple fact that what you need for Linux to 'just work' in your case isn't that common, otherwise someone would have already gone through all that hard work for you.
The fact that you feel it's only important enough to rant on here about, and to tie it to your hatred of all things 'nix and non-GUI based, rather than actually do anything constructive about it, says rather more about you and your reasons than you might like.
Linux a couple of years away from being useful? Some of us who've found it easy to install, configure and use on the desktop for more than 10 years might, quite rightly, take issue. Especially when all you can rant about is an obscure case with an easy workaround, while admitting you couldn't even be bothered to put some work towards the nicer method you'd like, so that everyone else can benefit.
Oh, and your claim about the hype for major Ubuntu releases: I can think of a big OS company that goes much further than that. Their last product worked so well that, after over 6 years in development, they're already telling their users that the next version is coming real soon now, and will be even better.
1... 2... 3... heave... 1... 2... 3... heave...
If all (or even most) of the Linux distros were pulling in the same direction, they could all feed off each other's work, avoiding duplication of effort in bug fixing, etc. Just how many people out there are compiling and recompiling software for different kernels, different package managers and different platforms?
There are far more important jobs to be done out there. I don't have a problem with having different distros (competition is good, they say), but I worry more that every piece of software that uses an address book seems to create its own, rather than using a common platform like LDAP that can be shared. I worry that we still do not have a decent shared calendaring system. I worry that I cannot sync my PDA or phone with my server, because everybody seems to want to do it via the desktop, which defeats the object of having a PDA.
I honestly think that GNU/Linux will never make it as a mainstream desktop system until it offers something different. I don't want to say that GNU/Linux needs quality rather than quantity, because that would demean so much good work, but why can't address book writers use LDAP, and LDAP only? Why can't calendar programmers use only CalDAV? Package managers are so good these days that installing a properly configured server package is a breeze. You just need to ask during install whether to install the full package or just the client. Why are we constantly re-inventing the wheel?
An example: address books (yes, my favourite whinge). To share an address book in Outlook + Exchange you can create a public folder. BUT if you have a Windows-based PDA or a BlackBerry, you cannot use that data without extra software on your desktop. Macs can share an address book by creating a .Mac account and synchronising the address books across all the desktops. How stupid is that? On my server I have an address book in OpenLDAP. I can access the data from SquirrelMail, and I can access it from Thunderbird. We have written a (too) simple web query so that I can get to it from the web. I can access the data from my fax machine, and from our telephone handsets.
All I need now is for those same software writers to understand that I also want to be able to save data from those devices back into LDAP. It's not rocket science. It would be an awesome tool, a unique selling point, for GNU/Linux. And that is only one of the many things Open Source software makes possible. And that is why I love it.
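For readers who haven't used an LDAP address book: contacts like these are typically stored as inetOrgPerson entries, and any client that speaks LDAP can query the same record. A sketch of one entry in LDIF (the DN suffix and all values here are made up for illustration):

```
dn: cn=Joe Average,ou=addressbook,dc=example,dc=com
objectClass: inetOrgPerson
cn: Joe Average
sn: Average
mail: joe@example.com
telephoneNumber: +44 20 7946 0000
```

Writing a change back from a phone or mail client would just be an LDAP modify against that same DN, which is why a single shared store is so appealing.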
My obscure case is... playing videos on the TV. If the BBC have set up iPlayer to let you download videos, if most graphics cards from the past few years have a second output, and if Macs and Windows PCs let you configure these features as standard, then it's not that obscure.
I'm not a Unix hater; I often use Unix in my work, though without X Windows getting in the way. I've also got a Mac at home, which is a version of Unix aimed at the desktop market and done right. As I use Unix, Windows and Mac, I don't have any problem recognising that if one OS or distro needs two days to set something up and still doesn't work right, and the rest don't, then there's an obvious design problem with that OS or distro.
I'm not sure what the easy work-around is, a weekend of my time couldn't get it to work. Maybe you could tell me what it is? (Pointing a video camera at my monitor and connecting that to my telly is not an option.)
I'm not talking only about Ubuntu, I'm talking about most Linux distros aimed at the user market. I'll happily recognise that Ubuntu is an improvement on SuSE, which I tried a few years back. So, given the pace at which I've seen things move, I don't think a couple of years is an overestimate of the time needed to fix things. That is, if everyone starts to agree on release dates (which is a sensible suggestion, and something I mentioned in the first post) as well as to fix common targets.
And however much of a dog Vista is (I won't be trying it unless it's set up for me at work, as its reputation has preceded it), I have yet to hear about problems when you plug in a second monitor or a TV. Much of my disappointment with Ubuntu was due to my expectations, based on the reputation it has gained, and then seeing it fall at the first hurdle.
I suppose part of the hype for Ubuntu is due to disappointment with Vista and a willingness to try another OS. Going along with this hype and blaming the user for using standard features seen on other computers is possibly one way of going about things, but as more users come along and hit the same problems, things may turn sour (as with Vista). Perhaps making it clear from the start that dual-screen support, and some other features that users coming from other OSes may take for granted, are not guaranteed would lead to less disappointment.
But, Dan, what you want to do is obscure. I work in a large IT department and no one I know uses their PC to watch videos directly on their TV. Those that do download stuff and want to watch it on their nice big tellies generally do so either by streaming it via a media client, like a PS3 or Xbox, or by burning it to DVD. Probably because both solutions are fairly simple and far neater than having the PC / laptop hooked up under the TV. Or even pointing a video camera at your monitor. And to be honest, until HD became common, the image on a TV was no match for even a fairly cheap monitor, which was actually the main disappointment for me when I tried it with the first card I bought that supported S-Video output.
iPlayer? You want to run a Flash-based stream up onto a big screen? As for the P2P version - surely the DRM still allows you to use a DLNA server to stream it to a media hub? It would certainly save you from having to fumble behind the TV and plug the laptop in every time you forgot to record a programme.
The dual-monitor support seen in Windows / Mac was probably driven by two separate things: travelling reps wanting to use a projector or big screen to show those lovely PowerPoint slide shows, and those MS Flight Sim enthusiasts from years ago who always wanted a triple-monitor set-up (I used to be one of those).
So the card manufacturers ensured the drivers they created for Microsoft and Apple systems had working multiple-monitor support (after all, it's a function of the video card, not the OS, is it not?). I can imagine Apple and Microsoft were more than happy to extend their configuration applets to take advantage of this.
Again, I see you backed away from trying to help persuade the card manufacturers to work with the community to ensure that good multi-screen driver support extends to X Windows too. But that's the shill's base argument, isn't it? There's no market for Linux to do X, Y or Z, so why should manufacturers waste time working on it. If someone such as yourself, who is happy to post such detailed comments on a site like this, claims they can't be bothered to feed their issues back to the community and the manufacturers so they can be addressed and fixed properly, can you perhaps see why some of us would suspect your motives? Especially when you then try to extend that to saying Linux is completely unfit for general desktop use as a result.
Here, I'll give you an example that I think is probably less of an obscure requirement than running a TV from a laptop, and probably more relevant to Joe Public. The SLI or Crossfire dual card setup, much coveted by hardcore gamers. If you buy such a system, yet find that it doesn't bring the claimed performance benefit and is often unstable, who do you blame? Microsoft? Or the card manufacturer for writing crap drivers? Who regularly gets slammed by the gaming community for this currently? Not Microsoft.
To say that Linux distros should all declare themselves unfit for use by Joe Public, just because setting up a slide show might take a little work, is FUD. By the same token, I bet most of the El Reg readership has seen a manager fumbling around to get a presentation going on a Windows laptop, so maybe Microsoft should also stop saying Windows is fit for Joe Public, by your standards. I'm sure my outfit doesn't have all the clueless PHBs in the world.
As I said, I've used Linux on the desktop for over 10 years, with probably fewer issues than I've had with Windows over the same time.
As for Linux advertising that it doesn't support multi-screen setups: I don't think I've ever seen that given as a reason to try Linux. Similarly, I don't think I've come across any Windows advertising material that specifically highlights this feature. I might have seen it pushed as a feature by a video card manufacturer, but then they always instruct you to install their drivers as soon as you fit the card, to give you access to all the wonderful features you shelled out for!
In real-world terms, I don't think dual-monitor support in Linux would be high on Joe Public's list, and I certainly don't think it comes close to all the issues being raised with Vista. And honestly, I don't think Joe Public cares about, or is even aware of, most of the issues that cause people such as myself to move away from Windows. I bet you'll have more joy getting the X Windows devs to look into sorting out your multiple-monitor support than you will waiting for Microsoft to change its spots.
Sounds good for the 32-bit... stabilize it, standardize it, make it... ah, never mind.
But we have yet to scratch the potential of 64-bit, so let the race begin.
GUI, games and ease of use? Practicality is already here.