The Red Hat–backed Fedora Project has released the latest version of its Linux-based operating system, Fedora 15, into the wild. Despite the similarities of the two leading Linux-based PC operating systems, Fedora has long played second fiddle to Ubuntu in the minds of many Linux fans. Now – for the first time – there are …
Gimme a break
"After all, GNOME 2 borrowed much of its UI design and basic interface concepts from Windows 95 – and it's been a long time since Windows 95 was cutting-edge."
IIRC Win95 introduced the pop-up start menu idiom as the main interface to the OS. 98 kept the pop-up idiom, as did Me, as did 2000, as did 2003, as did 2008, as did Vista, as did 7.
Things got more bling-encrusted as time passed, but the basic idiom never changed.
Qua UI, I'd say Win 95's idiom has weathered the past 16 years very well indeed. It's hardly changed at all.
Open Sourcers should perhaps stick to writing some of the best s/w the world has ever seen and leave the marketing babble to M$ & co.
Start menu was a knock-off
Technically, the Win95 Start menu was a knock-off of the Apple menu from pre-OSX versions of Mac OS, with the minor proviso that the Apple menu sat in the top left corner.
it's been a long time since Windows 95 was cutting-edge
Since April 1992, I'd say.
When '95 came out, it looked an awful lot to me like the Workplace Shell from Tracey.
Ok, the front end looks awful but surely I can replace all that crap with Windowmaker and carry on as normal, can't I? Or has the bonnet finally been welded shut?
Of course you can. That would be a hell of a bonnet-welding operation to undertake otherwise...
I've never used Linux as my MAIN system so can't comment on the differences between Gnome 2 vs 3. Not being able to have icons/files/directories on the desktop seems like a strange decision though. But Fedora does seem like it's the future distro for me. Not sure about Ubuntu these days, with all social media blarg all pre-integrated. I think I'd end up spending more time disabling and removing stuff.
But for me to switch to Linux full time (and I really really want to, believe me!) there needs to be more polish in the lead programs, not the OS. Proper CMYK support in GIMP and a decent Acrobat equivalent (not just a PDF reader/creator but full PDF manipulation). Libre/OpenOffice is alright but needs another 12 months or so of work. Seems to crash with pivot tables frequently but I've worked with them on that. Inkscape is brilliant and Scribus is close. Then I'll be able to switch most of my workflow over to Linux.
But a serious question... do we ever think commercial software companies will start developing applications for Linux that aren't just an afterthought, feature parity with Win/Mac etc?
"But a serious question... do we ever think commercial software companies will start developing applications for Linux that aren't just an afterthought, feature parity with Win/Mac etc?"
Things you need to bear in mind here are what kind of commerce the software company is engaged in, and what kind of customers they are going after. If the software supplier is primarily service or hardware driven (or both) and their customer is technical, e.g. like an ISP or Google, then Linux is a very good fit as a delivery platform, because it's more than likely the software development platform. Google now have, I think, around 80% of their staff using Linux, and I think IBM were planning something similar last time I heard.
If the software company sells services or hardware as opposed to packaged software, there is little loss in giving away access to the source code, and much to gain in spreading the cost of software development. There are a few closed source packages ported to Linux, but it is the open source packages which can be kept more up to date and which don't restrict the user to particular supported versions, so closed source on Linux isn't a very good fit.
In one sense what we're seeing in the growth of Linux is a cultural shift, a bit like the way the music business is changing from one where money is made from packaged recordings to one where money is made from live performances. Software companies which have little revenue other than from sale of packaged software will have a hard time with Linux and have little incentive to support it - so while Linux isn't likely to take over the computer games market anytime soon, in relation to standard desktop platforms, for software development and for networking and servers, Microsoft and Apple have real competition.
My company ONLY produces software for Linux. In fact, we used to have a Windows version but we dropped it because it was detracting development resources from where we make the most money: Linux.
"There are a few closed source packages ported to Linux, but it is the open source packages which can be kept more up to date and which don't restrict the user to particular supported versions, so closed source on Linux isn't a very good fit."
do you really think this is down to ideology though, or more practical reasons, such as cost versus potential userbase?
Packaging software for linux is a royal PITA: you have to roll numerous distribution packages (well, okay, probably an RPM and a .deb, but still), and even then that doesn't cover the dozens of edge cases.
Whilst it's not ideal (in fact, it's horrendous), Windows does have a fairly well established system of installation and uninstallation, which allows me to be fairly confident that software will work on the version of Windows I've tested it against. Linux even falls down here, as there's often not even a single 'version': it's fine supporting the RHs and the Ubuntus, but you've got to also support the more esoteric stuff, especially if you've got paying customers.
Apple (or more correctly NeXT, I suppose) have gone a long way to address this by 'borrowing' Acorn's application-as-a-self-contained-directory idea. Until there's a suitable mechanism available for Linux, or a dramatic upswing in desktop usage to make the fringe-case headaches worth the return financially, I don't see this changing.
But on the other hand it may be more ideological than I give it credit for?
"Packaging software for linux is a royal PITA"
You sure? I find that packaging someone else's software can be a bit of work when you have software that does not follow established conventions, or when it uses less common build systems, etc. The good news is that you can still do it.
Packaging your own products, on the other hand, is completely straightforward. On ours it is a simple matter of running a make target when we want to build a package set. The only manual input required is entering the password for the signing key.
"well, okay, probably an RPM and a .deb, but still"
Not much packaging experience, I gather? For any non-trivial program you will have a different package for each supported version of each supported distro. There are three main approaches to this:
One: you look after your project and publish a tarball or public access VCS. Distro packagers take care of packaging your software.
Two: same as above, but you create a more or less generic/skeleton package recipe which distro packagers can tailor to their own OS idiosyncrasies.
Three: you are the developer and the packager. You choose which distros you are going to support, prepare and publish packages for those. The door is still open to other distros to carry your product as well, but they look after it themselves.
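For approach two, a skeleton recipe is really just a minimal .spec (or debian/ directory) with the distro-specific bits left for the packager to fill in. A sketch of what such an RPM skeleton might look like; everything here (the name myapp, the version, the file list) is a placeholder, and a real packager would still add BuildRequires and distro tweaks:

```shell
# Sketch only: write out a minimal, generic RPM .spec skeleton that a
# distro packager could adapt (approach two). All names are placeholders.
cat > myapp.spec <<'EOF'
Name:     myapp
Version:  1.0
Release:  1%{?dist}
Summary:  Placeholder summary
License:  GPLv2+
Source0:  %{name}-%{version}.tar.gz

%description
Skeleton recipe; packagers add BuildRequires and distro-specific macros.

%prep
%setup -q

%build
%configure
make %{?_smp_mflags}

%install
make install DESTDIR=%{buildroot}

%files
%{_bindir}/myapp
EOF
```

The %{?dist} tag in Release is what lets the same recipe produce distinguishable packages per distro version.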
"Whilst it's not ideal (in fact, it's horrendous), Windows does have a fairly well established system of installation and uninstallation,"
I have no Windows experience whatsoever so I cannot comment there, but I have heard rather otherwise from my Windows colleagues.
"Linux even falls down here as there's often not even a single 'version'"
No, of course there is not. Different distros are in effect different operating systems, usually from different vendors, and often targeted at very different markets. When it says "Linux" on the tin, it is describing the kernel being used and little else. "Linux" by itself does not denote an operating system, at least not at a level that's useful from a user's point of view.
Having a varied yet interoperable ecosystem is precisely one of the advantages being brought into the game. As a developer I much welcome this.
Umm.. Slight correction..
If you are distributing via a repository, then yes, you need an RPM/DEB or whatever, just as you need the relevant file type to download an Android or iOS app. Centralised repositories are like that.
And with open source software, not a problem. The distro maintainer handles it, or the users set up their own repos. We are after all talking about open source based OSs. Not really set up for all the closed source stuff. Why should they be? Different business model.
But user- or distro-created packages of paid-for software would be a bit tricky to reconcile with per-seat licenses. Eh?
A repository for each program would get very awkward very quickly too. Totally impractical. And providing the source is obviously out of the question..
But that is only install method 1.
Install method 2 also turns out to be unsuitable. Releasing the source code is not practical.
Which is why you use install method number three.. The one that nobody seems to talk about. Outside those of us who actually use Linux that is.
Pre compiled binaries. The Aspirin to your distribution headaches.
Download and uncompress a tarball, type "./install.sh", and it drops you into a set-up program, asking you where you want to put it, what working directories you want to use, etc. Just like Windows, really. I assume they could make a network-wide or multi-user install, so long as they put the config files in home. Perhaps even have an uncompressed version on CD that just runs when you pop the disk in the drive. But auto-running tends to be frowned upon, so users would have to know how to make a file executable at the very least.
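A minimal sketch of what such an install.sh could look like; the app name (myapp), the default prefix, and the config location are all made up for the example:

```shell
#!/bin/sh
# Hypothetical install.sh shipped inside a pre-compiled binary tarball.
# All paths and names here are placeholders for the sketch.
set -e
PREFIX="${1:-$HOME/opt/myapp}"   # let the user pick an install location
mkdir -p "$PREFIX"
# copy the pre-compiled payload unpacked next to this script, if present
if [ -d bin ]; then
    cp -r bin "$PREFIX"/
fi
# keep per-user state in $HOME so a multi-user install stays possible
mkdir -p "$HOME/.myapp"
echo "installed to $PREFIX"
```

Run as `sh install.sh /where/you/want/it`; with no argument it falls back to a per-user prefix, which is what makes the no-root, Windows-style install work.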
The old "Buuuut you have to make a different exe for every distro.." chestnut is about as realistic as complaining about not being able to mount a USB key. Basically, a deprecated stick to beat Linux with.
Realistically, no reason why someone couldn't wrap a Linux program up in all kinds of lovely phone home DRM, and demand a key disk and license number be typed in if they really wanted. None of that is actually down to the OS on Windows either.
And before the next old wives tale pops up. No prohibition on closed source software running on Linux. It's not going to catch GPL.
"Not much packaging experience, I gather? For any non-trivial program you will have a different package for each supported version of each supported distro. There are three main approaches to this:"
No, I'll freely admit I don't. I'm looking at this from a small developer's point of view too. I maintain a small piece of freeware, as a hobby, and because I use the rather wonderful Qt framework I can ship for Win/Linux/OSX easily, so I do.
My build process for a release takes about an hour (involving 2 virtual machines for the 32/64 bit linux builds), and is fairly boring.
However, at the end I end up with a windows installer I know will work on Windows from 32bit XP to 64bit Windows 7, 2 OSX builds, one for intel from 10.5 onwards, and one universal one that'll work on anything from 10.4 onwards on either PPC or intel.
I also end up with 2 tars for the Linux distribution, which do work on most of the Linux distros out there. (BTW, I get that it's GNU/Linux, but hey..) However, if I was to offer distribution packages as well I would (unless I'm missing something) have to have at least another 1 (or 2 for 32/64 builds) distros installed in virtual machines, and more and more complexity as part of my build process.
If I was a company with a build infrastructure this obviously wouldn't be a major issue, but as a single developer working on a small hobby project, the extra time/effort for the very small number of users it would affect is not something I'm going to do.
So I have an installer that works on all the versions of windows I support, all the versions of OSX and 90-95% of the linux distros out there. That was my point.
"I have no Windows experience whatsoever so I cannot comment there, but I have heard rather otherwise from my Windows colleagues."
Windows installer makers are usually horrible (InstallShield is yukky, NSIS is free and... strange...). However, my point was more that, if you understand Windows DLL loading semantics, you can create an installer and have confidence it will work on your target OS, not just that it /should/ work assuming all the dependencies are satisfied.
"Having a varied yet interoperable ecosystem is precisely one of the advantages being brought into the game. As a developer I much welcome this."
I wouldn't disagree with this. I do quite like Linux (although I'm not hugely into the ideology behind the GPL; if I'm honest my *nix of choice is NetBSD). However, my point was that easily packaging software for the desktop is difficult, especially if you want to be distribution agnostic, and this is possibly one of the hindrances to large-scale adoption of it as a desktop platform.
Oh, and @John Bailey:
"Which is why you use install method number three.. The one that nobody seems to talk about. Outside those of us who actually use Linux that is.
Pre compiled binaries. The Aspirin to your distribution headaches."
Ironically, this is what I do do: I ship a tar with the required shared libs all precompiled, and I'll agree it does work. However, it does occasionally fall down. (E.g., I have a report from a user on a slightly more esoteric distro that they can't get my software to run; the only way I'm going to fix it is to install the distro they're using to track down the issue, as it works fine for me on the ones I try.)
"The old "Buuuut you have to make a different exe for every distro.." chestnut is about as realistic as complaining about not being able to mount a USB key. Basically, a deprecated stick to beat Linux with."
However, any non-trivial program is going to have a fair chunk of dependencies. Now, there's a good chance (especially these days) that most distros do ship with fairly similar versions of whatever libs you've linked against, but there's always that /slight/ chance that someone will try your precompiled software on a machine with an old or incompatible library, and, from a user's point of view, weird dynamic linker errors aren't particularly friendly.
"And with open source software, not a problem. The distro maintainer handles it, or the users set up their own repos. We are after all, talking about open source based OSs. Not really set up for all the closed source stuff. Why should they be. Different business model."
Because I'm using an operating system, not an ideology? At the end of the day I want it to provide an abstraction layer for software to run on, and ideally I'd like it to run whatever software I choose, not just whatever software happens to be ideologically compatible. I.e., I don't run Linux at home because it's 'open source', I run it because it's the best platform to drive the software I need (in this case MythTV). But that's just a different point of view, I suppose.
"so while Linux isn't likely to take over the computer games market anytime soon"
This kinda stuck out for me and is interesting.
I've been enjoying id's Q3A @ quakelive.com
Re-living the good old days, and the entertainment value is phenomenal. I noticed some of the older maps have been spruced up a little and, quite frankly, this game is every bit as good as, if not better than, your CODs and MOHs. Quake Live has killed my Xbox dead for some 6 months now. I'm developing a 7th sense with the rail, with a good chance of hitting something that catches the corner of my eye briefly...
Amazes me that Flash Player can be used in this way. When Adobe first came on the scene and everyone/most ppl were still on dialup, I couldn't believe they were getting away with such crap! I wasn't one for turning off images when browsing, lynx style or whatever, but I couldn't fathom any possible use for this sh*te.
Kinda makes you wonder if id isn't showing a clear path here? A lifeline, if you will?
Love KDE btw, always found Windows OK to use. Miss WB, and saw someone using a Mac the other day, first time in years. Got a bit of a shock at how good it looked. Is this the market forces messing with the GUIs? (Answer plz)
I'll use the man with the safety glasses. I like safety glasses, safety glasses are good....
Have a seat
"user or distro created packages of paid for software..."
Not a problem there, at least for vertical enough markets.
"would be a bit tricky to reconcile with per seat licenses."
Probably. But that model, on the desktop, has been dead since the last CD was sold :). If you want to play that game with any prospect of a longer-term future you need to move to the mobile arena.
That leaves us with two common viable options: you can sell content, and you can sell support. In either case, the software is just a means to an end, not the end in itself, so closing the source does not help. In fact, if your target market requires an assurance of continuity (i.e., in case you go under) that plays very much against you.
One way or the other, paid-for open source software is doing very well indeed. It just happens that the particular sales model you mention can only work where you have tight control of the distribution channels, as was the case in the era of foil-wrapped software, or is the case now with mobile marketplaces.
Note btw that I'm not talking about the future of closed-source as a development approach, which is an entirely different beast.
"Pre compiled binaries."
You mean software appliances?
"The Aspirin to your distribution headaches."
In the general case, that would be at the expense of the user's convenience (if you're talking a massive statically linked blob), and experience, unless you have gone to great lengths to ensure smooth integration with the user's environment, or your application is intentionally designed to be sui generis.
There are specific cases where I would say a software appliance is exactly the right approach, but that's not something I have experience on, so I won't comment.
"However, if I was to offer distribution packages as well I would (unless I'm missing something) have to have at least another 1 (or 2 for 32/64 builds) distros installed in virtual machines, and more and more complexity as part of my build process"
You may want to try the OpenSUSE Build Service (http://build.opensuse.org/) if you haven't already done so. I have the impression that's what most of us use for multidistro packaging these days.
If your products are FOSS you can use their build farm at the above address. Else, the build service itself is open source so you can download it and install it on your own servers (for your own use, a commodity machine with a large enough hard drive should be adequate).
Incidentally, you do not need VMs for multidistro packaging, although that's probably the easiest choice if you need to get something going in a hurry. Outside of using the build service, that is ;)
"if you understand windows DLL loading semantics my point was more that you can create an installer and have confidence it will work on your target OS, not that it /should/ work assuming all the dependencies are satisfied."
Thanks for your insight on the Windows build process. As regards your quote above, how does the dependency solving work in Windows? You just ship everything you think your product may need on the installation blob, and then some program takes care of looking at what's already installed and what's missing, that sort of thing?
"[Open SUSE Build service]"
Thanks, didn't know about that. It looks like exactly what I need for this kind of thing in future. :)
"how does the dependency solving work in Windows? You just ship everything you think your product may need on the installation blob, and then some program takes care of looking at what's already installed and what's missing, that sort of thing?"
Generally yes, just bung it all in the same directory as the program and it should just work.
The example I was talking about uses Qt, which is compiled using Visual C++ 2008; to make it work I need to ship the Qt DLLs and the Visual C++ runtime. Microsoft recommend you ship, or point people at, the VC redistributable installer, which will install it system-wide on the end user's PC. However, I wasn't happy with that (as I don't like asking people to install additional software), so I have shipped it in the application directory, which works fine. Same goes for Qt: it's just a small config file to tell it where to load the plugin DLLs it uses.
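For what it's worth, the small config file in question is Qt's qt.conf: dropped in the application directory next to the executable, its [Paths] section redirects the plugin search path to a directory you ship yourself. A sketch, with myapp and the plugins/ layout as placeholders:

```shell
# Hypothetical layout: ship Qt's plugin DLLs in a subdirectory next to the
# executable and point Qt at them via qt.conf in the application directory.
mkdir -p myapp/plugins
cat > myapp/qt.conf <<'EOF'
[Paths]
Plugins = plugins
EOF
```

The Plugins path is resolved relative to the directory containing qt.conf, which is what keeps the whole bundle relocatable.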
Windows isn't too bad an OS these days, it's just got years and years of bad design to deal with, however if you're aware of these issues it's now perfectly possible to ship apps that cooperate well with others. (eg, you can use them on a machine with limited user rights, they'll write settings to correct folders to support profile roaming, etc etc.) The main problem still is many windows devs don't.
FYI, Krita is coming along (part of KOffice) and says it has CMYK support.
Krita seems to export PDF too, but loads of apps do, so I presume you have specific requirements for PDF output?
"....If you hate GNOME 3 with the sort of passion most people reserve for politics and religion, well, your best bet is to stick with Fedora 14. Forever...."
A more constructive suggestion may be to try XFCE. This new direction with Unity and GnomeShell is in my view a huge mistake. Sure, kids might like it 'cos it looks like a huge Android phone, but to get work done, no way. I have given Gnome 3 a week and that was a week too long. I switched to XFCE today and I'm very happy to get back to a proper work machine.
GNOME 3 & Unity
I think both desktops will get there eventually but both IMO at present need work. My biggest gripe with GNOME 3 is you can't make links on the desktop. Other annoyances would be the dock which resides offscreen where you can't see what's in it without explicitly looking, the lobotomized prefs, and the lack of minimize / maximize buttons. In all these cases I do not accept that it would interfere with the design of GNOME to improve this behaviour, e.g. by looking how Windows 7 & OS X manage to do it.
Unity is more conservative but has annoyances such as that horrible global app menu, and a lack of prefs dialogs to configure its quite irritating default behaviour.
There is no doubt that GNOME 2.x was pretty mouldy or "decadent" as someone put it. It was fine for what it was but what it was was about 10 years behind the curve in terms of desktop design. So I'm glad to see a bit of effort gone back into moving things on even if first results are still lacking. Hopefully a point release or 2 will make these desktops more palatable and hopefully will pave the way for wayland too when X can be dumped entirely from the local desktop experience.
I'm sure Fedora 15 has many nice new features. Shame I won't be using it though, BECAUSE of the new GNOME 3.0 interface. I tried to use it the other day and it sucked. Everything needed more mouse clicks or key presses to do. It seems to be a GUI designer's wet dream rather than a working interface. I have stuck with Fedora/RH over the years from pre-F1 to F14 and now I am looking for a new distro. Very sad.
You don't have to use it. Install one of the myriad of other desktop/WM environments available.
I'm pretty sure that's what he was saying...
No - he said he's moving distro. He could stick with Fedora and use a different DE, such as Xfce. If he wants gnome 2 then move distro - but eventually this option is going to disappear as other distros shift from gnome 2 to gnome 3.
You don't have to use Gnome 3.0
I tried it because it was new and went back to KDE because I've been using that for about the last, oh, goodness knows, ages. There's at least two others to choose from as well: XFCE being the most popular. There's also a couple of extremely lightweight options.
People like to bash X, but one thing it does well is provide you with choice. If you don't like one particular desktop paradigm there are others to choose from, and this is what gives us progress. The change is exciting, and I'm sure big changes are on the way that will make life better for all of us: it's taken a long time to shake off the Win95 and CDE legacy.
I'm On The Fence Regarding Gnome3
Unity? No way. Not ever.
I'd like to try Fedora, because I've got an itch to switch from Ubuntu cos I don't like the direction they're going in. I came from Redhat (RH9 was the last one I used) and I use Centos a bit here at work, and I have to say that were it not for yum/rpm I would have moved back already.
I simply cannot stand yum and RPM.
Dependency hell - Do NOT tell me that this is a thing of the past, it is not.
Unable to proxy repositories - I use apt-cacher-ng currently so that I don't have to download the same packages over and over and over. How do you do that with RPM? I've never figured it out.
I just find YUM and RPM to be unnecessarily clunky and problem prone compared to APT/DEB so I don't care how good a distro is, if it is RPM based I will avoid it like the plague.
I like RedHat, I think they are great for the Linux community but not enough to get over my hatred of RPM
I'm sure I'll get downvoted by a bunch of RH fans but whatever, this is where I stand. I'm switching over to Debian 6 for my server, I still have to choose between Mint Debian or vanilla debian for my desktops.
Thumbs up for Redhat though.
It really is.
"Dependency hell - Do NOT tell me that this is a thing of the past, it is not."
No, it really is.
The term applied to the situation that happened before dependency-solving package managers, where you had to solve deps manually: you'd download evolution.rpm, try and install it, it'd tell you it needed gtk, so you'd download gtk.rpm, try and install it, it'd tell you it needed freetype, you downloaded freetype.rpm...and on and on and on to the bottom of the stack.
Fedora has had a dependency resolving package manager ever since it became Fedora, so Fedora has never been subject to 'dependency hell', as the term is properly applied.
These days it tends to be *mis*applied to situations where you try to use third party repositories and run into dependency problems, or the now very rare situations where there are dependency problems within the main Fedora package set. But neither of these things are actually 'dependency hell', and neither of them has anything at all to do with rpm or yum; they're simply issues in the dependencies themselves. You can have dependency problems with absolutely any package format that _has_ a concept of dependencies; there's nothing the package format or any package manager can do to stop maintainers making mistakes.
Your problem is not with yum/rpm
I've upgraded my CentOS systems from 5.3 to 5.5 and then to 5.6 and from 4.5 directly to 4.9 without any dependency problem. On the other hand I got hit pretty bad when I tried to install drivers for my shiny new HP laser color printer on CentOS v5.3 64bit (I've managed to hack it somehow though).
Your enemy is not yum/rpm in itself, it's perhaps the lack of understanding the Linux distro policy/philosophy and the correct usage of RPM repositories.
On a long term support distro (as it should be on almost any production server) just stick with the official repositories and you can live long and prosper without YUM/RPM coming out to get you. I admit though that sometimes this position can become untenable.
Of course I'm not going to downvote you; after all, Linux is about choice, isn't it?
Take a look at mrepo in the DAG Wieers repository. It will let you set up a centralised yum/apt proxy containing mirrors of just about everything you want.
Simple: you create your own local copy. Yes, it takes 20G of disk, but it allows you to point to a local repository. I have 20 CentOS servers (most virtual) and one of them holds my local repository. With a weekly cron job that repository is kept up to date. I simply updated the /etc/yum.repos.d/CentOS-Base.repo on each machine to point at my local repository. You'll find that the speed of your disk subsystem is your new bottleneck (if you have a Gig+ LAN).
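As a sketch, the client side of that setup is just a repo file whose baseurl points at the mirror host; the hostname and paths below are made up, and the mirror box's weekly cron job would keep the tree current with something like rsync plus createrepo:

```shell
# Hypothetical client-side repo file; repohost.example.lan is a placeholder.
# On the mirror host, the weekly cron job would run something like:
#   rsync -avz --delete <upstream-mirror>/centos/5/os/x86_64/ /srv/repo/centos/5/os/x86_64/
#   createrepo --update /srv/repo/centos/5/os/x86_64/
cat > CentOS-Base-local.repo <<'EOF'
[base]
name=CentOS base (local mirror)
baseurl=http://repohost.example.lan/centos/5/os/x86_64/
gpgcheck=1
enabled=1
EOF
```

On each client the generated file would replace the stock baseurl/mirrorlist entries in /etc/yum.repos.d/.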
ah yes, the old create a mirror suggestion
I was waiting for that.
So, in order to avoid re-downloading a few dozen packages I should build a 20G mirror, and have that constantly downloading updated packages for every single package in the entire repository even though I will only ever use a fraction of the packages therein?
How exactly is that saving on downloads?
Thanks for that, I'll take a look. As long as it doesn't want to replicate the entire Fedora ecosystem in a local mirror then it might do the trick. I might even give Fedora a go (If I overcome my laziness long enough to rebuild my desktop anyway)
Re ah yes, the old create a mirror suggestion
I use apt-cacher-ng on one of my workstations to serve my 7 other machines. It only downloads the packages I use, and when I install another machine for someone it's like lightning! The equivalent for yum is, I believe, yum-cacher, and if it's like apt-cacher-ng then the 10 minutes it takes to configure it will be repaid a thousand times.
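For reference, pointing the clients at a cache like that is a one-liner on each side. These snippets assume the cache runs on a host called cachebox on apt-cacher-ng's default port 3142 (both names are placeholders for your own setup):

```shell
# apt client: route HTTP package fetches through the cache.
# This would normally live in /etc/apt/apt.conf.d/02proxy.
cat > 02proxy <<'EOF'
Acquire::http::Proxy "http://cachebox:3142";
EOF

# yum client: the roughly equivalent proxy= line for /etc/yum.conf,
# which routes repo downloads through the same caching proxy.
cat > yum-proxy.conf <<'EOF'
proxy=http://cachebox:3142
EOF
```

Unlike a full mirror, the cache only ever stores packages a client has actually requested.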
Why the comparison with Win95?
I mean... come on, the so-called Win95 interface has been around a lot longer than Win95. I had pop-up menus and Win-like desktops under Linux before Win95 was released. And was using proper desktops since the days of old Solaris and Amigas... why do people think the desktop paradigm started with Win95?
Yep, I used CDE in HPUX in 1993. It was quite ugly, but had more functionality than Win95.
Win95 introduced the Start button and menu.
Before then everything else relied upon you opening a folder and double clicking a program to run it. Or dragging it to a dock to create a quick link.
Win95 also put the close buttons on the right of the window when the rest of the world had them on the left.
Perhaps there is prior art for some of the above, but obscure X Windows desktops don't count as they were not on 90+% of desktops.
Start menu, task bar etc
I think Windows can claim credit for a start menu that can launch apps and double up as a task bar. It can't claim credit for docks which appeared in numerous ways e.g. Acorn Archimedes, CDE, OS/2, NeXT etc.
Neither can it claim credit for unifying the concept of task bar and dock into a single thing. Arguably OS X got there before Windows did (with Windows 7) in a mainstream OS, and I'm sure there are some precedents before that too.
I do think the concept of a dock/taskbar is an accepted design concept now. It's good to see the Linux desktop modernising, though Unity and GNOME 3.0 clearly need more work to fulfill their potential.
The UNIX community didn't grok Windows for a long time
At Princeton prior to 1995, we had a variety of UNIX workstations in the CS department: IRIX, HPUX, a lab full of Suns for the undergraduates, and if you were really low on the pecking order, you got an X terminal that logged into a DEC mainframe running UNIX. If you looked at anyone's screen, what you saw was half a dozen command-shell windows running various text-oriented programs. It was still very much a command-line interface world, still heavily dependent on text editors to do any kind of content creation. That was the state of the art in UNIX. You could copy/paste text between windows, but there was none of the object/embedding/scripting action that you had on an Apple or Windows PC. You were just beginning to see people play with Tcl/Tk, an ersatz version of Visual Basic, but UNIX "apps" were still very limited by the lack of underlying system support.
At a deeper level, UNIX was still very monolithic in those days, while Windows was very modular and object oriented. UNIX was just beginning to support DLLs and device driver interfaces, so the operating system was pretty much one giant C program. Windows was based on dynamically loaded libraries and COM interfaces that let Microsoft update and swap out components without compiling from source. I still remember the pain of installing a new scanner on my SUN workstation, having to fiddle interrupt vector tables and then compile the whole damn kernel. This lack of modularity plagued the UNIX window system as well. You can go read the X Windows paper in ACM ToG, and it's all about how clever they were at supporting overlapping of bitmaps and handling redraw when you moved something. But there's nothing there about the underlying logic of communicating interface objects that Apple and Microsoft were focused on. A UNIX window was a virtual command console. If a program presented a GUI interface, it was pure brute force: it had to track the cursor and know where any "buttons" were.
Even in the late 1990s, when SGI briefly flirted with Windows NT on their hardware, it was pretty amusing to watch them demonstrating drag and drop, like it was something new discovered on Mars.
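The dynamic loading the post contrasts with kernel recompilation can be sketched with Python's stdlib ctypes, which binds a symbol from a shared library at run time, no relink, no rebuild. This is an illustration of the general mechanism, not anything specific to the systems discussed above; a glibc-style Linux system is assumed for the library name lookup.

```python
import ctypes
import ctypes.util

# Locate and load the C math library at run time, then bind one symbol
# from it. This is the modular, swap-in-place model the post describes,
# as opposed to statically linking everything into one giant program.
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

print(libm.cos(0.0))  # -> 1.0
```

The same idea underlies loadable kernel modules and COM components: the binary being extended never has to be rebuilt, only told where to find the new code.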
Those were the days
"If you looked at anyone's screen, what you saw was half a dozen command-shell windows running various text-oriented programs"
Exactly! That was (and still is) the beauty of X. Whereas before you were limited to one text terminal, you could now easily have upwards of eight terminals all on the same screen. That was pure genius, that was!
"I still remember the pain of installing a new scanner on my SUN workstation, having to fiddle interrupt vector tables and then compile the whole damn kernel."
I, like you, have only ever written one device driver (though we did have shared libraries by then), which makes us both basic users.
"A UNIX window was a virtual command console. If a program presented a GUI interface..."
Well, yes. But you're talking about the days before the advent of VIM.
I guess I'll go recompile a kernel now, just for old time's sake (snif!)
@Don - Depends on which flavour of UNIX
IBM introduced dynamic driver load/unload, shared libraries by default, and a virtual kernel address space (meaning you never had to sysgen a system again), along with journaling filesystems and many other features, in 1990.
Shared libraries were around in SunOS before then, although the norm was still statically linked libraries for several years.
I think that your description of X11 applications is completely wrong for everything except Java graphical programs (but that is a Java problem).
The concept of drag-and-drop in X-Windows (and it was probably X10 at the time) was shown to me on a Torch TripleX running X.desktop (although I'm sure it was also called LookingGlass and possibly OpenTop) in the mid-80s, along with desktop icons and walking menus. I concede that MacOS had these concepts before then, but they were not foreign to UNIX even before Windows.
The standard X-Windows model for GUI programs was indeed to use toolkits and widgets (effectively library code) for drawing things like buttons, text boxes and pixmaps, and this does mean that the application has to keep some sort of track of what is going on in its own graphical space, but it is the server that keeps track of where the cursor is. X-Windows is built around callbacks and managed data objects, which means that the X server (the thing that controls the keyboard, mouse and screen) always has a degree of separation from client programs (really to allow X-Windows to run across a network, something that Windows still does not do well), but it can only marginally be called object-oriented.
This separation allows a client to be completely ignorant of the position of the cursor and of which parts of its window were obscured by another window. Each click, key press and other event was tagged with the current cursor position by the server, and when part of a window was uncovered, the server sent one (or sometimes many) Expose events saying exactly which part of the window needed to be redrawn. And if the server was configured with BackingStore, the server itself could fill in the missing bits without bothering the client. This was designed to run efficiently with a network between the server and client.
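The server/client split described above can be caricatured in a few lines: the "server" stamps every input event with the cursor position and tells the client exactly which rectangle to repaint. This is a toy model for illustration only; none of these class or method names are real Xlib calls.

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str            # "ButtonPress" or "Expose"
    x: int = 0           # cursor position, stamped by the server
    y: int = 0
    region: tuple = ()   # damaged rectangle for Expose events

class Client:
    """Knows nothing about global screen state; just reacts to events."""
    def __init__(self):
        self.redrawn = []

    def handle(self, ev):
        if ev.kind == "Expose":
            # Repaint only the rectangle the server says was uncovered.
            self.redrawn.append(ev.region)
        elif ev.kind == "ButtonPress":
            print(f"click at ({ev.x}, {ev.y})")

class Server:
    """Owns the keyboard, mouse and screen; forwards tagged events."""
    def __init__(self, client):
        self.client = client

    def click(self, x, y):
        self.client.handle(Event("ButtonPress", x=x, y=y))

    def uncover(self, rect):
        self.client.handle(Event("Expose", region=rect))

c = Client()
s = Server(c)
s.click(10, 20)            # prints: click at (10, 20)
s.uncover((0, 0, 64, 48))  # client repaints just that rectangle
```

Because the client only ever sees tagged events, the same exchange works identically whether the server is on the local machine or across a network, which is the property the post is pointing at.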
In addition, things like window decorations (frames, resizing options, window control buttons) are all handled by a component separate from both the client applications and the X server. This is the window manager, which is what allows you to rapidly change the look and feel of the GUI. It works by encapsulating each application window (X11 defines a window hierarchy, with the root window at the top, application windows in the middle, and individual graphics contexts at the bottom handling widgets within application windows), allowing keyboard and button events to be acted upon before they are passed to the client. This is also an OO-type feature.
I don't think that Windows integrated COM into the presentation manager until the late '90s, probably with Windows 2000, although it was available to applications, and all windowing applications needed to manage their own
HP VUE and then CDE did provide something like COM, and this was before Windows 95, but coding for CDE was difficult, and the old X11 models still worked, so they were still used.
There are not many people now who actually code at the X11 level. Almost all applications are now written with toolkits or SDKs (like Motif, Qt and GTK+), which hide almost all of the complexity of how X11 works.
"At Princeton prior to 1995, ..."
You've got your history the wrong way around. The first time I used a window system on UNIX was in 1986 (and it wasn't new then); in 1987 I was using X10, and by the time I eventually got a PC (a 25MHz 386 running Windows 3.1) in 1992, X11 was well established. By that time, the protocols and mechanisms for things like drag and drop and the object-oriented features we all take for granted were becoming well established.
The lack of policy in X was one of the major strengths in developing those mechanisms. On a platform where the policy was fixed there was an apparent leap-frog where things like drag and drop gave the impression of being polished. The X Window System didn't hurry, it took its time to get things _right_. You might think that it took too long (and you could well be right) but while the X desktop was evolving people were learning from it and applying what they learned elsewhere.
I think that the solid base that we now have is paying serious dividends: we have something that new paradigms can be tried with. Gnome 3, Unity and KDE are quite different, but they cooperate with each other where it matters and I'm sure that we're on the brink of seeing great stuff.
Any news of virtual workspaces? Given that the article states that the desktop is gone or something like that, I'd guess no virtual workspaces either. Is that so?
Virtual workspaces are my favorite feature on Linux desktops. Without them, it's hard to see myself happily working with a computer. I mean, if I have to have a computer that looks like a Mac user's, with a ton of overlapping windows covering every surface (I don't know why, but the few people I've observed using Macs for a longer period always had screens like that), then I'd rather pass...
I know this is off-topic
but Macs have had virtual desktops built into OSX for a while now, and you could get hold of third-party hacks to add them in on earlier versions. (God alone knows why they weren't there to begin with though -- that was one of the most annoying things about swapping to OSX from Linux for me, too.)
I guess the Mac users you know are lazy or something. I only have four screens configured but I always use all of them with three or four windows on each.
Gnome 3 provides 2 by default, in a vertical configuration.
Since it is so raw, I haven't found the settings area to change this, but I did notice that as soon as I started working in the second workspace it automatically added a third, empty one below it.
Thank God for that at least.... multiple workspaces, are IMO, one of the most useful features ever...
Shell actually kind of expects you to use workspaces quite heavily (though I don't). As Skrrp says, there's a workspace switcher bar on the right hand side of the overlay which is mostly hidden and pops out when you mouse over it; it shows a thumbnail of each desktop. There is always exactly one empty workspace at the bottom: if you put a window on that workspace, a new empty one appears; if you remove the last window from any workspace, it vanishes so there's still only one empty one. This is neat, but it does screw up the workflow some people use where they always have X workspaces and always put their apps in a particular configuration. There's an extension which changes the workspace system to be more static, for those people.
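The dynamic-workspace rule described above (there is always exactly one empty workspace at the end; filling it appends a new one, and emptying any other workspace removes it) is simple enough to sketch. This is a hypothetical helper illustrating the invariant, not actual GNOME Shell code.

```python
class Workspaces:
    """Maintains GNOME-Shell-style dynamic workspaces: the list always
    ends with exactly one empty workspace, and no other workspace is
    ever empty."""

    def __init__(self):
        self.spaces = [[]]  # start with the single empty workspace

    def add_window(self, index, win):
        self.spaces[index].append(win)
        self._normalize()

    def remove_window(self, index, win):
        self.spaces[index].remove(win)
        self._normalize()

    def _normalize(self):
        # Drop every empty workspace, then re-append the one trailing
        # empty workspace the invariant requires.
        self.spaces = [s for s in self.spaces if s] + [[]]

ws = Workspaces()
ws.add_window(0, "terminal")   # filling the empty workspace spawns a new one
print(len(ws.spaces))          # -> 2
ws.remove_window(0, "terminal")
print(len(ws.spaces))          # -> 1, back to a single empty workspace
```

The "static workspaces" extension mentioned above effectively replaces this normalization step with a fixed-length list, which is why it suits people who pin particular apps to particular workspace numbers.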
Just one question
How easy is it to adjust the UI for use at 10 feet on an HTPC? Guess I'll have to give it a whirl to find out.
Why would you run a desktop interface on an HTPC and not, well, an HTPC interface? It's not like Linux is short of them - MythTV, XBMC, Freevo...
The HTPC interfaces are great for the HT side of the equation. When you want to do the PC side... not so much.
And jumps on the iPhone bandwagon
"GNOME emerges from last century"... and assumes I have a low-res 3" x 4" display device and only ever want to see one thing at a time.
I have been giving GNOME 3 a red-hot go, but the whole assumption that I'm driving my user experience from a tiny touch screen is really starting to grate. Why shouldn't I be able to have my power settings and network preferences open at the same time?