"...it's tough to deny that both haven't borrowed..."
So you're saying they haven't?
It's been a tumultuous year for the Linux desktop. Anno domini 2011 saw the release of not one, but two major new desktops, the GNOME project's GNOME 3 shell and Ubuntu's rival Unity desktop. By the time most distros hit their stride in 2011, the GNOME 2.x line had been replaced with GNOME 3. With change comes angst - …
"Activity" and "Activity oriented workspace" are sheets of the _ANDROID_ songbook, not Apple's. Both KDE4 and Gnome 3 are such a trainwreck because of copying _THAT_ obsession without copying any of the state management and workstack management which goes with it. iOS is still application based, not Activity + Intent based.
Just read the frigging Android and iOS developers manuals for crying out loud, look at the changes in KDE 3-4 and iOS and it becomes immediately clear who copied whom.
25 Up votes?
Go on then, I'll bite, name me another distro that is relevant on the desktop?
Isn't this sweet, a distro goes hell for leather for mainstream and the freetards around here just gun it down.
Don't worry fella, I'm sure 2012 is the year of the Linux desktop but obviously we don't want it to be a popular one!
I've been a perfectly satisfied Linux user for a very long time. But the desktop developments are beginning to annoy me.
I'm an X fan, and have often used its redirection abilities to remote displays for navigation and science systems. I got on perfectly well with the basic window managers.
Now we have fantastically elaborate desktops, dbus, half a dozen sound managers, and hardware integration is no better. Perhaps the effort has gone in the wrong place?
Half of me thinks that the changes for Gnome3 and the new KDE are good, because innovation is good. But you tell me now that they are copying apple, not breaking new ground? All I know is that things get more complex, harder to understand, but not /better/. I feel much the same about Unity: change is often good, but it isn't working out.
I like the E17 desktop, and get on with that. I like LKCFE or whatever it is called, I use that on a Suse netbook.
Laptops have been getting some really dismal screens lately (768, 800 vertical pixels in many cases) and if people need to rip up and start again, why not change to a left hand menu to restore much needed vertical screen space (I know about Unity)?
But really I feel that the Desktop problem is largely solved until someone comes up with a dramatically different paradigm (like merging the desktop screen and two or three tablets into a single environment). Can we not get some of this astonishingly skilled development effort into finishing a few applications instead? Fix the bugs in Libreoffice/openoffice. Finish at least one of the CAD projects. Why do we have 6 desktops and only one Gimp? Skype has stolen the public space that GPL started. Voice recognition? Access for the blind?
Can we fix all the scruffiness about power up/hibernate/sleep/restore?
How about fixing the printer problem with something similar to NDIS network driver handling? Get all those dodgy Lexmark printers working just using their Windows driver disk?
Stop re-inventing the front offside wheel. Fix the other wheels as well.
but I have 6 machines running OpenSuse 11.4, a netbook, a laptop, 2 workstations, a fileserver and a desktop in our holiday home.
I don't have a problem with KDE, although like you I'm quite happy with more 'primitive' managers.
I don't have a problem with sleep mode
I have two printers - a Samsung laser and an Epson scanner/copier/inkjet - which both work perfectly and network nicely from the fileserver.
I do agree about GIMP but I often process RAW photos with showFoto
I don't use Windows at all and expect to do everything I want in Linux - these days I find this quite easy.
Incidentally, I'm thinking of building a compute server to offload a lot of intensive scientific calculations and modelling, and also for the transcoding/rendering of 1080/50p video.
> Stop re-inventing the front offside wheel. Fix the other wheels as well.
Are you willing to provide the funds?
There are basically two kinds of developers in the Linux arena: those who get paid to work on it, and those who are self-funding.
The former generally work on specific projects assigned to them by their employers, who expect to benefit from it within the context of their own business plans--this is mostly not consumer oriented stuff, unless you count Android.
The latter, which apparently is most of us, work on whatever we feel like and are able to tackle within the constraints of our own knowledge, competence, and motivation. Projects with a positive and welcoming "feel" to them also tend to attract more contributions--this is why I, for example, have contributed to the KDE4 code base and ecosystem, even though I definitely do not consider myself a desktop developer, nor is it something I'm attracted to; it's just an opportunistic reaction.
Personally, I feel you have every right to a) use the stuff I create or contribute to (in the FOSS sense, including profiting from it), and b) complain if something I have done doesn't work as it should or is a piece of rubbish in any other way.
On the other hand, unless you're willing to dig into your pockets, I don't see how you can tell me what I or anyone else should be working on. Either pay for it and have it the way you want it (or code it yourself if you are able), or go with one of the various commercial off-the-shelf offerings from Microsoft, Apple, or whoever else if they already provide what you need. They do some good stuff too.
I'm so sick of people saying that! There is no way in hell, I'm ever going to switch completely to a laptop/netbook/tablet and use only web apps, it's just NOT going to happen, so stop trying to tell me I will!
We have a tablet in the house, my partner loves it, but for her graphic design work, she still returns to her desktop. I have both a netbook and a laptop and a Galaxy S, but I do 98% of my daily work on my desktop, that's not likely to change, ever!
I'm with Robert E A Harvey: why does hibernate still not work properly, why do not all printers just plug and play the way they should? Linux has SO much going for it on the desktop, which is still just as important as it ever bloody was, and yet it still falls over with simple tasks like being able to plug in a cheap printer or run 2 versions of an app in different desktops. WTF, people! Stop whinging about the bloody web and get on with the desktop, for crying out loud! Let Android deal with the web, it's doing a damn fine job; Ubuntu is a Desktop OS!
"why does hibernate still not work properly"
There's a lot of hardware out there, a lot of design issues.
Hibernate works on my Acer Aspire One, Opensuse 12.1 (it did on the 11s too). An MSI windbook 64bit is a bit more touchy, will be rolling it back to 11.4 but that's because I have the choice.
"it still falls over with simple tasks like being able to plug in a cheap printer"
I use HP, have done for years, on my third having started with a 640C (?). If Lexmark don't want to support Linux, then don't buy their printers. Network-based printing will solve all that over time.
"or run 2 versions of an app in different desktops"
bit difficult to argue that generality - does that never happen with Apple? It does with Windows and the only solution is to spend money.
With Linux, possibly we're all in this together and by using feedback to the devs we can reduce the problems.
Hibernate doesn't work on my Aspire One, but that's because I've got no swap space set up.
However, I chose to stick with LXDE for the window manager on it, responsive and perfectly adequate for the small screen. I hope I am never forced to use Unity, and I wasn't too impressed with Gnome 3, but then I always preferred KDE over Gnome anyway.
Developers working on Gnome, KDE and such mostly do not intersect with those that work on the kernel and drivers. Otherwise, Linus would fork whatever he wanted all by himself.
Hibernate works on all of the computers I have seen so far, albeit with a strange GRUB_CMDLINE_LINUX="resume=/dev/sdax" addition in the grub config. Suspend-to-RAM is a little more problematic with KMS/DRI on Intel sometimes... I haven't seen a non-working printer for a long time. Like other people said, do not buy from manufacturers that do not support GNU/Linux.
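For what it's worth, that resume= addition is just a one-line kernel parameter. A sketch of what the relevant part of /etc/default/grub might look like (the device name /dev/sda2 is a placeholder, not from the comment above - use your own swap partition):

```
# /etc/default/grub -- tell the kernel which swap device holds the
# hibernation image. /dev/sda2 is illustrative; find your swap
# partition with `cat /proc/swaps` first.
GRUB_CMDLINE_LINUX="resume=/dev/sda2"

# Then regenerate the config, e.g.:
#   grub2-mkconfig -o /boot/grub2/grub.cfg    (openSUSE-style)
#   update-grub                               (Debian/Ubuntu-style)
```

Note that hibernate also needs the swap partition to actually exist and be large enough to hold RAM contents, which is why the Aspire One case mentioned further down (no swap set up) fails.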
My cheapo Brother HL-2230 works perfectly and is controlled by a CUPS server on Debian. All Ubuntu machines can access it through the router and their own CUPS. I heard that this is harder to implement on the Windblows machines. Moreover, almost every day I find a taken-for-granted feature on Linux to be non-existent or more complex to get on Windows. A friend who asked me to salvage data from her burnt laptop said that someone couldn't get her HDD to connect via SATA-to-USB to a Windie machine. It got mounted for me right away ...
In general, both MS Windows and even Mac OS X are so very lame compared to free and open OSes, even with all the "unholy mess" that we have in DEs right now.
Since most distros use CUPS (pretty much Apple's only useful contribution to this world), it's usually safe to say that if the printer is advertised to work with OS X, it'll work in Linux. Although this of course is not always the case, it's a pretty safe bet.
Samsung was always good about putting a penguin on their laser printers, and whatever you do, DO NOT buy from Lexmark. They have always been a tool of Microsoft, and although a select few models do work in Linux, they involve some sort of badly written Java-based [shudders] driver and installer. Epson is a good one - in fact most of their modern all-in-ones, like the Artisan 810 that I have, actually use embedded Linux behind the scenes. I've yet to find an HP that didn't work (aside from hardware issues).
Printers are practically the easiest Linux hardware to buy as far as compatibility. Just for god's sake don't buy Lexmark, no matter how cheap they are.
I almost completely agree with you, except that we should not be thankful to Apple for CUPS. It was only in 2007 that ESP was bought by Apple; Apple had been using CUPS since 2002. We should be thankful to Michael Sweet, though, and to all the developers that contribute to the project (including Apple; HP and Brother even more so).
I agree with you, Robert. It is amazing how many devices out there are not advertised to work flawlessly under GNU/Linux when they do, including the mentioned printer. The installation through CUPS (plus some front-end) is pretty straightforward, except that the suggested driver might not be the best, so a little tweaking is needed. As in the case of the HL-2230: an older version of CUPS did not know which driver was best for it; the newer one found it right away. The manufacturer's poor job...
I find it just as irritating to be told by a manufacturer that some device 'only' works with Windows 'X' or maybe Apple and to find that actually it works perfectly well or better with Linux.
Recent examples include a USB 3G dongle, a firewire video camera and a USB/serial converter
>Developers working on Gnome, KDE and such mostly do not intersect with
>those that work on the kernel and drivers.
Except, perhaps, in organizations where they are paid to make linux work.
Yes, Mr Canonical. I'd have far rather you came up with a Linux version of TWAIN that will work inside all graphical and photo apps than bugger about with a desktop that isn't broken.
Or stop pitching and rolling from one audio driver, patch panel, and music app to another like a drunken sailor who has the price of one bottle of rum and is equidistant between three of them.
A Linux version of TWAIN????
Are you in any way familiar with TWAIN's architecture? I thought even Windows had got rid of it years ago (but I wouldn't know about that).
I presume you know about SANE. What's wrong with it? And more importantly, how are you planning to fix it? Or do you just feel like having a whinge about a product that is made available to you in an incredibly permissive way and at no charge, should you wish to use it?
My wife bought me a nice Canon scanner that only works in Windoze, because the one in my HP all-in-one is so bleeding slow.
Or stupid Android phones that don't use USB bulk storage, but expect MTP instead.
> Or scanners.
It's a long time since I've had a scanner that didn't work under Linux.
My old Mustek stopped working under Windows (apparently the cable is broken), but works just fine in Linux.
> stupid android phones that don't use USB bulk storage
Haven't seen one of those - which ones are you talking about?
I've yet to acquire one that doesn't work.
> TV tuners
I had a problem with a Realtek chipset in a tuner. I asked them for a datasheet so I could write a driver - they sent me a driver instead. Then they said I could release it under GPL.
> Sat navs
The only sat navs I've got access to just give me a filesystem. Which ones are you trying to talk to?
If you felt a bit brave and had some spare time, ask 100 strangers two questions:
1. Do you know what Linux is?
2. Do you know what Ubuntu is?
Perhaps shift the demographic in favour of the computer literate and ask the same question to strangers in, say, PC World or an Apple retail outlet.
I feel certain that right now, you'd be lucky to find a single person who knew what Linux or Ubuntu was.
However, go back a few years and ask "Do you know what Android is?"
You would probably get a reply along the lines of "It's a movie about robots?"
These days, most people would know it has something to do with mobile phones - you'd probably get a few people saying "Oh yeah, that's HTC?" and more savvy people "That's google, right?"
Where I'm going with this is that it's taken Google with all their billions and eyes on computer screens to realise Android and get it to market.
Ask a further question now:
1. What is Google's mobile operating system based on?
How many people would say 'Linux' ?
For Ubuntu to have any chance in the Tablet market, they will need to increase market awareness massively.
They've made a tiny dent in the Desktop market amongst the geek set, which was no mean feat, but it's taken them 8 years to get to this point and still the total Linux desktop market share is tiny - figures vary, but at most it's 3%
Canonical are right to pursue the mobile market, but to do this effectively, they need brand awareness and that's going to cost a heck of a lot of cash.
In the interim, Apple, Google & Microsoft aren't exactly going to be standing still and with massive war chests they'll drown out any noise that Canonical may make.
If I were a betting man, I wouldn't put any money on Ubuntu getting any significant share of the mobile market, unless they sold out to Google.
More accurately, someone asked me. Working in library services, she asked me if a new e-book service she was testing worked on Linux (not Android), as she was concerned about universal access. (No, it didn't.)
She didn't know that Android was Linux-based. I used the growth of content-consumption devices based on Android as a reason to pressurise the supplier - who replied with the gem that "he didn't think that Linux for Android had been released yet".
I don't know the outcome of that one, yet, however this isn't a race, it's erosion. Ballmer was right.
Once market share gets to a level much higher than it should need to be (15%?) universal access to online public services will create political pressure to level the playing field a bit more, thus removing one more barrier. Possibly Android for tablets will enable LOTD as it becomes more possible to refurbish old hardware for general use.
Well, when someone really thinks that a big non-touch monitor needs to run the same interface as a small touch one, I think it's time to say "sure, go on with this madness" and promptly choose another distro or another desktop manager.
I like the Android touch-friendly interface on my phone and on my tablet, but I DON'T WANT the same interface on my 28 inch non-touch monitor.
Interface designers in commercial products think that users are stupid, suffer from attention disorder, and cannot focus on more than one simple task in one big window that covers all of the screen, with no more than two big buttons at a time. Everything more complex is absolutely too hard to use. And while there are smarter users in the commercial software world, there are also a lot of brain-damaged users.
But, if we keep helping the brain damaged users, sooner or later the smart users will die of boredom.
Have you seen the movie "Idiocracy"? It is a perfect example of where we are headed.
Windows is at least going in the right direction here with the "snap to the edge of the screen" thing.
Ok, I hate the implementation and want to find how to turn it off on my laptop, but at least they are thinking about things like this.
Likewise, full screen apps on Lion - probably good for 10" and 13" laptops... not so great for 27" displays.
Size does change behaviour and one OS behaviour does not meet the needs of a 19" and a 27" screen user. Come on, Linux desktops are renowned for giving options. All we need is a bit of logic to test screen size and a manual override button.
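That "bit of logic" could be as small as a few lines. A hypothetical sketch of the idea (the function name, thresholds, and mode labels are mine for illustration - a real desktop would read the physical size from the display server, the way KDE's netbook detection mentioned below does, and offer the manual override the comment asks for):

```python
import math

def choose_ui(width_px, height_px, dpi):
    """Pick a shell layout from the screen's physical diagonal size.

    Illustrative only: thresholds are arbitrary and a real desktop
    should let the user override the detected mode.
    """
    diag_in = math.hypot(width_px, height_px) / dpi
    if diag_in <= 11:
        return "netbook"        # save vertical space, global menu
    if diag_in <= 19:
        return "desktop"        # standard layout
    return "desktop-large"      # no full-screen-by-default apps

# A 10" netbook panel and a 27" monitor get different layouts:
print(choose_ui(1024, 600, dpi=118))     # netbook
print(choose_ui(2560, 1440, dpi=109))    # desktop-large
```

The point is just that the dispatch is trivial; the hard part is designing each layout, which is presumably why vendors ship one UI for everything.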
"I like the Android touch-friendly interface on my phone and on my tablet, but I DON'T WANT the same interface on my 28 inch non-touch monitor."
"Likewise, full screen apps on Lion - probably good for 10" and 13" laptops... not so great for 27" displays."
Yeah. On my 28" (actually dual 28-inchers, which makes the interface issues stand out even further) Unity totally falls apart. Windows 8 doesn't do very well either, and separately I've been subjected to what the modified "Lion" interface does with larger screens with multiple windows. But for the dual 28s, the modern/stable UIs of KDE 4.7 (now 4.8 beta, for fun) and Win7 work fine.
For a 15.6" laptop monitor or smaller screens on netbooks and such with 1366*768 max resolution, the Unity and Lion (Mac) UIs work. But get them on large screens and they're unfriendly. The mouse gets tired running ALL the way across the screen for a menu option that isn't attached to the window in the lower right. :) Then Win8 isn't friendly on anything without a touchscreen...
What I've liked about KDE is that it detects when it's on a 10" netbook and adjusts accordingly to the "netbook" interface, which saves real estate, and also switches to the "global menu". On a normal desktop, it is a fully equipped desktop. I don't see why that strategy isn't adopted by MS, Canonical, Apple, etc. Why have exactly the same UI across all netbook/laptop/desktop computers?
"Interface designers in commercial products think that users are stupid, suffer from attention disorder, and cannot focus on more that one simple task on one big window that covers all of the screen, with no more that two big buttons at a time."
Only ONE big button, thanks. OFF!!
"Not everyone wants to relearn how to use their computer just so Canonical's designers can show off how they think the desktop ought to work."
Summed it up nicely, IMO.
Very glad there's a fork of Windowmaker (windowmaker-crm) now in active development again so there's no reason to switch to Gnome or KDE in the foreseeable future.
It seems to me the Linux crowd need to decide if they want a desktop THEY can use, or one which will make Linux popular to regular PC users.
Linux users are often quite strong evangelists of Linux, but they seem to fail to realise regular folk NEED an easy interface, along the lines of Win/Mac. However, if they make an 'open source Windows/Mac' shell that _would_ actually attract those they evangelise to, they could end up with something they don't themselves like.
So do they want Linux to go mass-market, or do they want to keep it for those who understand computers? The former seems an obvious answer, but if they're not in it for the money then perhaps the latter is better.
Why should we choose between a mass market interface *or* a power interface? Do hot rod enthusiasts decide between a tweaked out vehicle *or* a vehicle their mom can drive, or do they enjoy a world where both are well-supported?
The point of FOSS and Linux to me is *choice*! I want total power over my computing experience, and an app-compatible mass market platform for the non-geeks I love (and support).
I borrowed a Mac Mini a few weeks ago to see what the fuss was about.
I was expecting a life changing experience after all the hype.
What a disappointment. It just felt clunky and looked clunky. I'm a long term Windows user and after using Win7, OSX just felt awful.
I even prefer using the Ubuntu 10.4 Netbook edition on my Acer Aspire One netbook for getting stuff done.
> I borrowed a Mac Mini a few weeks ago to see what the fuss was about.
Think yourself lucky. I bought one for a job.
> I was expecting a life changing experience after all the hype.
As was I.
> What a disappointment. It just felt clunky and looked clunky.
I don't know about "clunky", but there were definitely a number of things that didn't "just work". And finding the fix was less than easy :-(
> I'm a long term Windows user and after using Win7, OSX just felt awful.
I'm a long-term Fedora user. I now have three Macs (for various work projects), but I would not consider changing my personal machines for Apples.
... as long as one has a proper shell and perhaps a few GUI applications like Browsers.
What would be interesting, instead of re-inventing the same wheel, would be to find a way to actually have _functional_ graphical user interfaces. Interfaces which are more than just forms and switchboards. Where I can draw a command, and the computer executes it and gives me the result... just like I type a command, and the computer will execute it.
> Where I can draw a command, and the computer executes it and gives me the result
Visual programming paradigms have been around for donkeys' years. I last used one in anger in the mid-90s.
Sadly, they don't really work for long. If you have the richness of interface that a typical command achieves, you have a very complex interface to try to support that richness. It very quickly becomes much easier to learn the "traditional" text-based interface than to try to work through the mappings from that interface to a visual paradigm.
So the visual thing gets you through the first few weeks of development, as you start up the learning curve, but soon becomes an impediment to progress, not an aid.
... so nobody ever will.
Which makes absolutely no sense—especially in the IT industry, where Apple has proved *all* the mobile phone industry's veterans flat-out wrong about how touch-screen interfaces *should* be designed.
The reason most visual programming tools suck is because they've been designed by programmers. Most programmers have all the graphic and interface design skills—and you really do need such skills for a _visual_ tool—of a used sheet of toilet tissue, so expecting them to get something like this right, is like expecting a congenitally blind child to paint the Mona Lisa.
A programmer will only ever create a visual interface to a traditional programming paradigm. The result is just a bunch of visual stand-ins for APIs that you wire up like a circuit, or slot together like a bizarre version of Tetris. Neither is a particularly good representation of visual programming because both rely on a fundamentally flawed assumption: that a visual tool must simply provide a like-for-like representation of the underlying, traditional, written-code-centric API.
Before you can create a visual programming tool, you have to first redefine "programming" to suit the new medium. That means hiring *non-programmers*, and having them lead the team. Programmers already *know* how to code; they cannot possibly be expected to approach the problem with fresh eyes.
> ... so nobody ever will.
My opinion is that interfaces are inherently wide - either a small number of classes with a lot of options, or a large number of classes with fewer options.
To create a visual paradigm for this, you either have a vast number of options on each item, or you have a very large palette of items from which to select. Either of these situations ends up with the user needing to know so much about the API that he might as well just be coding to it.
It's the same old GUI/CLI argument: the reason we developed speech, rather than just pointing and gesturing at images, is that speech is very much more expressive. You can convey an *accurate* message with far less effort.
And that is the end-point of programming; the language doesn't matter, it's all about describing the solution to the problem space. The more expressive a coder can be, the more effectively he can fulfil that task. And so a GUI is excellent when getting to know a system, but the CLI becomes the tool of choice when the user is more experienced with its capabilities.
 Choose your term here; "class" is appropriate, but I'm not going to get into a semantic argument about how to term a collection of programming elements.
I'm sure that, after several iterations, GNOME 3 will eventually get it right. However, for both Ubuntu and Fedora to dive straight into it after many years with a mature GNOME 2 was a clear mistake. It was such a jarring change that the distros should have provided a decent fallback for a couple of releases (i.e. the old GNOME 2, not some half-baked "GNOME 3 made to look a bit like GNOME 2" effort that they dished out).
During the transition period, they could then develop a *proper* GNOME 2 lookalike to sit on top of GNOME 3 (like the Linux Mint effort), which could have been provided for a couple more releases at least (during which time, some of the "missing" GNOME 2 stuff could hopefully re-appear in core GNOME 3).
I don't have time to wait for all of that, so I'm trying out XFCE in Fedora 16 and have got it pretty close to my preferred GNOME 2 setup that I used in Fedora 14 (the last Fedora with GNOME 2). I'll keep trying out new Fedora and Ubuntu releases in the hope that GNOME 3 becomes usable, but to me, the current GNOME 3 looks like it's designed exclusively for a touchscreen (huge icons, lots of scrolling) which makes it an epic fail for desktops at the moment.
" I don't have time to wait for all of that, so I'm trying out XFCE in Fedora 16 ... I'll keep trying out new Fedora and Ubuntu releases in the hope that GNOME 3 becomes usable, "
Yep, I can sympathize. You've just described what I did with KDE for years while waiting for it to restabilize after the major, major disruption with version 4.
If I might make a recommendation for another experiment - LXDE has been growing on me from trying Lubuntu out on some lower end hardware where it ran OK (PIII600, 256MB RAM, ouch). I then tried it out on modern hardware and WHOA - FAST! It has few of the cool gee whiz effects that slow stuff down, although after having had a bunch of that for a while, somehow going back to a simple, "it just works" (and nothing extra) UI is refreshing. I might start switching to LXDE as my normal desktop UI. I've already switched to it for desktop VM's as it is very gentle for memory/proc requirements.
True story, I disabled ALL the eyecandy years ago on an aging WinXP work laptop at the end of its lifecycle to regain performance and drop some memory in the process. I was desperate with an overheating video card, running into the 3GB limit, and was still stuck with that laptop for 6 more months. It ended up looking quite plain and spartan, but it was more stable and faster as a result. I got more compliments on it, and even one person asking if I was beta testing a new version of Windows and where could she get it? It very much surprised me, since I'd basically made it look like Win95... Dull/drab and functional for me, but clean and fast apparently beat gee whiz. (For disclosure's sake, this was with a bunch of hw/sw engineers, so that might not hold true in a "normal office")
For years, Linux desktops did everything they could to copy Microsoft. Under the misguided belief that users could only be attracted if Linux looked like Windows, KDE and Gnome both did everything they could to look and feel like a warmed-over version of Windows 95. Real innovation was sorely lacking, and most users saw nothing in these desktops to get them to switch from Windows.
Now they are copying Apple instead. But this may not be such a change: Microsoft itself has copied Apple in its recent revamp of the Taskbar. So we could say that KDE and Gnome continue to follow the lead of Microsoft.
On four desktops on a single Mint 10 laptop, I had almost thirty applications - including Windows 2000 running in a VM with a Z80 emulator, itself running CP/M and sundry 8-bit development tasks... some of us do *work* on our tools; they're not just advertainment channels.
Sure, I *can* do the same thing with Unity or Gnome 3 - but what I *can't* do is keep track of what's going on with anything like the same ease and ability. And that's because the new interfaces are less than ideal for anything other than full-screen applications on smallish screens.
And until I can, they will remain interfaces for touchpads, not for usable tools.
My greenhouse runs on CP/M. Has for 26 or 27 years. It works, so why change it? I maintain similar systems for a few commercial flower, fruit & veg growers here in the Sonoma/Napa/Mendocino/Lake county area. I no longer charge for the service, it's more a hobby. On the other hand, I haven't had to pay for fruit & veg & cut flowers for over 20 years ...
The fully tested backup system runs a stripped-down version of Slackware, but I fully intend to keep the Z80/S-100 systems running as long as possible. Yes, I have plenty of spare parts ... but I rarely need to use them. The old stuff was built like a shit-brickhouse :-)
"I believe it was Apple that borrowed from KDE."
and they swiped from OS/2 Warp for the bottom dock. OS/2 did have some interesting experimentation going on. Microsoft has also borrowed from KDE on eye candy for Vista/Win7. Apple's web browser is based on KDE's KHTML engine as well. One thing about this industry is that people borrow all the time. Nothing new.
OS X was built on NeXTSTEP. This is why Jobs returned to Apple: at the end of 1996 Apple urgently needed a replacement for their old OS and NeXTSTEP was the best fit, so they bought the entire company, including Jobs.
Take a look at some videos of what NeXTSTEP used to look like. You'll be surprised at just how many of the features in OS X today were already there 20 years ago. It really was a long, long way ahead of its time.
NeXTSTEP first appeared in 1988 (v1.0 shipped in 1989), which predates KDE by some 8 years. KDE copied from NeXTSTEP (and, later, OS X), not the other way around.
And, yes, that dock was also in NeXTSTEP first. All Apple did was move it to the bottom of the screen by default, although it can be trivially reset to the right or left if desired.
One of the most common complaints that I receive as a senior computer architect is that the user interface on Windows changes with every release. Windows 95 didn't have a Start button - and the XP Start button is very different from AeroPeek. Win7's task bar works very differently from XP's task bar + quick launch bar. The control panel for each Windows release is different enough that we have to invest in training for each release. And on and on.
Similarly, while I was a MacOS 2 power user, when I tried to evaluate the OS/X platform in 2011 I was *lost*. It was much better once I learned it, of course, but "exactly like 1984"? Not even close!
The extent of similarity between the various Windows and Mac releases and different Gnome and KDE releases are roughly equivalent. The only significant difference is that Linux users have a choice, while Windows and Mac users are stuck with Microsoft's and Apple's view of how a GUI should work.
I prefer choice, by the way...
Swing and a miss. Windows 95 *did* have a 'Start' button; it was the first version of Windows where one clicked 'Start' to shut the machine down. In fact, the word 'Start' and the ubiquitous button featured prominently in their marketing (see http://www.youtube.com/watch?v=AGNGRU5g1aU&feature=related). Windows 1.0 through 3.1 *didn't* have a 'Start' button...
"The only significant difference is that Linux users have a choice" is a trite fallacy. What does the choice give them? Not a lot, really. It certainly doesn't make the UX any better or worse, and it's not trivial to edit the configs or switch between the two, which IMHO renders that particular point moot.
But it's on the wrong end of the taskbar. As usual, Microsoft stole someone else's ideas and botched the implementation (mind, they perhaps didn't want a repeat of the Apple lawsuit).
A quick list of the new stuff in W95 that was already in a desktop environment in 1987 (that should give knowledgeable people a clue): a taskbar along the bottom edge which hides under application windows, a backdrop to which it's possible to pin documents and applications (and/or links to them), context-sensitive menus (there's a lot more).
Stuff missed: altering scroll direction without having to move the mouse, not giving focus to windows just because you wanted to scroll the contents of a window that wasn't the currently active one, true drag & drop between applications.
There are people using Windows 7 with the classic theme, which is assured to be there on Windows 8. There are graphics designers and various large-screen professionals who never used the screen corners, or even disabled them.
I plan to use Gentoo with Window Maker myself, so you aren't really talking to some Windows or Mac guy, or an old-fashioned person. I am saying that the company that introduced the GUI to the mass market doesn't change the paradigm, and its followers still have the option to keep it. What are the credentials of the GNOME UI designers?
Actually, if you could use Windows v2, then you can use Win7 easily enough. It amazes me that the initial desktop UIs derived from the Xerox Star basically still work OK.
Win8 clearly breaks that clean succession, though. Win8 is where the paradigm shifts more severely, as it copied a combination of the Unity/Android/iOS/WinPhone phone and tablet UIs, and then kept tweaking until it barely works.
Windows 8 is assured to have the classic UX and even the classic theme. The GNOME guys changed the entire thing and forced it on users.
And people (including me) theorize conspiracies about desktop Linux failure, especially on large installs. Perhaps there is no conspiracy; just imagine what it would cost to retrain all your users if you had 40,000 GNOME desktops. Or perhaps there is a conspiracy, since the Trojan Mono app in Debian comes with GNOME ;)
Those options will save you from the latest brainfarts of the GNOME people. In about three years GNOME 3 will probably be robust and usable.
Nobody forces you to use any particular Linux version, and newer is definitely not better. It's not like the proprietary world, where you simply cannot buy WinXP anymore, even if you <b>want</b> to do exactly that.
Also, Linux gained even more dominance in the server domain, where the GUI does not matter at all. That's the important 2011 message.
Always has been, and undoubtedly will remain so. I see no reason why, or even how, this will change. (For the desktop)...
But let's look at integrated devices, i.e. set-top boxes (cable & satellite), phones and tablets, as well as data storage systems. Linux has always thrived here. And none of these applications requires a "desktop" of sorts.
I have used various Linux Distros, Redhat, SuSe, Mandrake, Gentoo, Debian and Ubuntu to name a few.
The thing I always hated about Linux is that once you spend enough time figuring out how to do things on the CLI, you then have to relearn those incantations for another system.
As a huge fanboi of Klaus Schmidinger's VDR project (http://linuxtv.org/vdrwiki/index.php/Main_Page), one of the major highlights of my life is when I need to either update to a newer (or change over to a different) distro, and find that I then need to reconfigure that distro's version of LIRC to work with my remote. Or, now that most of the newer distros are finally moving to Ubuntu as a base, find that LIRC has been superseded by something even more asinine, which completely fails to do its job, even as the distro makers chime in about how this will make most remotes work OoTB. Imagine my surprise when ~90% of the buttons actually worked, but the Blue, Menu, Exit and Info buttons DID NOT WORK!
But then, I'm still loathing the withdrawal of the 60W incandescent light bulbs as well.
plus ça change I guess....
But Linux on the desktop? Nahh, it's never gonna happen, and unlike my desktop-loving friends here, I can't wait to see the back of 'em. What I wouldn't give to find a decent mobo that actually still had a full complement (i.e. FIVE) of 32-bit PCI slots on it. Good luck tracking one of those down. Gaming belongs on the consoles, and productivity can be done on a notebook/netbook or even tablets or phones (although I'd like to see those two pick up where Crapple left the Newton for this endeavour). The desktop may very well still have a lot of life left in it, for those chores that mobile devices either can not or will not handle. But these are on the whole very limited in scope, and those who'd cry the most about 'em will not stem this tide.
The desktop as we know it is dying out; I wouldn't even give it ten years before it falls off the map, and Windows 8 with its touchy-feely interface is as good a harbinger of this as any.
Because one of the great things about it (VDR) is that you don't need a Skybox to record things, and as Linux is otherwise unrestricted you can actually record what you want, when you want, and not what Sky (or others; I'm thinking of HD+ here) tells you you can.
That supposes everything is going to end up moving to a web-server based paradigm. While at the moment it's going that way, we have to be aware that this is very fashionable so while in many cases it's a great idea, many people are _only_ doing this because it's cool... porting desktop apps that work great, to web-versions which are not as rich and require constant web connectivity.
I wonder if things will end up balanced a bit further towards the desktop in a few years... instead of everything being forced onto the browser simply because the browser is now capable of doing so.
2011 saw the Linux kernel put into the hands of more new people than any year ever - in the form of Android. November found the sum at over 200,000,000 Android users, and December saw the number swelling by 700,000 per day. With over half the market in the US, and soon the world, Android looks to be the delivery mechanism to finally bring Linux to the masses.
Does this not deserve at least a mention in your year-end Linux wrapup?
Probably not worth mentioning, because they didn't choose the underlying OS; they chose the functionality provided (or lots of stuff they don't actually use), and little (if any) of that functionality is unique to Android: iOS and WinPhone provide similar, and I'd guess that Symbian could too. Perhaps they chose based on a GUI that matched the devices their friends had shown them, but don't think that the OS itself is the decider for most people. It's like saying that people chose ARM-based phones rather than Intel/Motorola/Zilog because of the processor architecture rather than what the processor architecture supports. Especially considering most applications on iOS and Android (and, I'd guess, some kind of WinPhone CLR) run in some kind of VM, the underlying OS is probably even less important.
Why do I think that most phone users don't choose based on OS? Because I like Android, have an Android phone and recommend it when asked, but won't consider an Android tablet that does not allow use of the Android Market out of the box; so no cheap Arnova device, because they use AppsLib, even though they are good hardware.
"Probably not worth mentioning because they didn't choose the underlying OS..."
That's just silly; it's like saying iPhones are irrelevant because most users didn't explicitly choose iOS. The point he was making is that Android is introducing Linux to a very large number of users, most of whom would probably never use Ubuntu or similar.
About adding Google Market to a tablet - there's an app for that - and you only have to run it once.
Linux was and still is all about end-user freedom; unlike Android, it is FOSS. I know, I know, Android is using a fork of the Linux kernel, but it is hopelessly shackled deep into the bowels of a proprietary, locked hardware platform. In order to do something in Android you absolutely need:
1- someone to write an app for that
2- someone to allow you to use that app (app store or Something)
I was pretty close to buying a Samsung Galaxy tablet running Android v3, and I was baffled to discover at the last minute that:
1. there is no Skype for Android on tablets, although there is one for Android phones. More generally, apart from Google Talk (unavailable in large areas of the world), there is no other video+voice communication solution
2. Samsung will not offer the upgrade to Android 4, and I'll have to buy another tablet with the new version even though no hardware upgrades would be required.
Fortunately, I'm old enough to be able, in the near future, to stick with my desktop running on the (still) open PC hardware. I know for sure very few of you can understand and appreciate the ability to own and trust your computer, as well as the rewarding feeling that your computer trusts you too.
> Android is using a fork of Linux kernel
No. Android is using a Linux kernel, but it doesn't use the Gnu userspace that many people seem to want to call "Linux". That's sort of what RMS's (oft-misunderstood) rant was all about...
> 1- someone to write an app for that
You need someone to write an app for anything you want to do with any computer. But writing for Android really isn't very hard.
> 2- someone to allow you to use that app (app store or Something)
Absolutely wrong in every possible respect.
Android is *not* a walled garden. You can put what you want on it. Side-loading of APKs is commonplace on the cheaper tablets which don't have a marketplace, and supported on every single one of them.
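For readers who haven't tried it, side-loading is nothing exotic: with USB debugging enabled you can install any APK from a PC using the Android SDK's `adb` tool (the filename and package name below are just illustrative):

```shell
# One-time on the device: tick Settings > Applications > Unknown sources
# and enable USB debugging. Then, from the PC:
adb devices                          # confirm the tablet/phone is visible
adb install someapp.apk              # push and install the side-loaded package
adb uninstall com.example.someapp    # removal is just as easy
```

Or skip the PC entirely: copy the APK onto the SD card and open it with any file manager, and the system package installer takes over.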
> no Skype for Android on tablets although there is one for Android phones
And what is the *difference* between a tablet and a phone?
> Samsung will not offer the upgrade to Android 4
That's between you and Samsung. It has nothing to do with Android.
But if the Samsung hardware can support Android 4, you can bet someone will release code that enables your hardware to run it. It's up to you whether or not to take that route.
GNU/Linux on the desktop has failed - adding all the hundreds of distros together barely gives a market share of 1%. Are we expected to believe that suddenly GNU/Linux will be on phones, tablets etc? Not going to happen.
The GNU/Linux crowd had a massive chance (probably their final one) when MS dropped the ball with Vista. But the ultra-nerds were too busy kicking sand in each other's faces and forking to do anything cohesive. The very few OEMs they did (somehow) manage to con into releasing hardware with GNU/Linux saw a massive flop, as there was no clear vision.
Loathe or love MS and Apple, at least one knows what one is getting, and they have a clear direction, hardware support and standardisation of idioms. This does not exist in GNU/Linux: they can't even decide what a right-click should do; the behaviour differs from program to program in the same desktop environment. Pathetic. This is what you get when there is no clear design and no quality assurance (the users are the testers).
GNU/Linux will be around for a while longer in niche areas (like set-top boxes), but over time it will disappear from there as well. Their ecosystem is too chaotic, too fractured, too adversarial, too cliquey for any manufacturer of any decent size to consider using it.
GNU/Linux on the desktop has failed - adding all the hundreds of distros together barely gives a market share of 1%.
'Market share' can be calculated (and exaggerated) by the developers of items sold.
Should market share also be calculated on number of downloads, magazine cover disks (do they still do them?), CD/DVD/Usb sticks handed to mates?
And the reason is very simple: a Linux UI is created by developers for developers and some closely related professions like sysadmins.
And the reason that nobody else wants to use a Linux UI is because these systems insist that you enjoy doing the computer equivalent of tying bootlaces and replacing nappies. Children grow up, but these computers are still acting as if they are 2 year olds.
I'm not sure why that is a reply.
Actually, the person who sought my advice didn't know that Android was Linux based either. For unidentified reasons, a librarian with a need to think about universal access had become aware that the computing world was becoming more diverse and she had to do something.
Unless one is using some forms, opening a PDF on any platform "just works". Whether the typical end user notices it is Okular or Adobe (etc.) is not relevant. Similarly, internet banking "just works", and you can now use British Airways' website (for example), whereas before ActiveX (or something) was a requirement.
None of that will be of the slightest interest to the typical end-user. Why should it?
As I understand it Linux netbooks were popular because of their price advantage but failed because on balance they didn't "just work" for a lot of the things end-users wanted to do.
One of the LOTD areas that most definitely doesn't "just work" is online public services. It's not that amazing on Apple either. A particular joy is the public sector unthinking adoption of file formats that still only work well on one platform (if at all on others), e.g., the drift to using .docx without any obvious decision to do so.
Over time the growth of Android will produce a critical mass of don't care end-users that do care that some stuff they want to do, doesn't "just work". This will create pressure for change.
Possibly your niece is about to start a PhD in an aspect of kernel design, but I could easily find 100 people that don't know that there's a difference between hardware and software. Or how a light switch works. Or that there isn't such a thing as a tin of striped paint.
I didn't understand your point.
This might be true, but EVERY one of those users has chosen to move to Linux.
In a world where essentially all desktop computers are sold with Windows or Apple OS pre-installed and where most people, in any case, don't care I'd count the desktop usage of Linux as a major success.
I remember happily running GNOME 1.4; then someone at Debian decided they were going to be the torchbearers for GNOME 2. It was *hideous*: they removed the control panel and didn't replace it with anything... half the apps used different font settings and had no GUI to change them. It had some kind of explorer thing where most of the buttons didn't do anything and came up with blank pages... Then there was the 'let's swap OK/Cancel for no good reason' debate...
By the time they (presumably) fixed it in 2.1 I was already so used to KDE it was irrelevant to me. KDE 3->4 didn't really change much that I used, so never felt any push back to gnome.
It'll sort itself out. If the picture of the UI on the article is anything to go by I can't see myself using it, but I'm sure it'll be a competent desktop by 3.1.
Personally I predict android will become such a force that it'll start pushing onto the linux desktop, and onto netbooks/ultrabooks, making the gnome/kde dichotomy somewhat irrelevant. It's not that far off now.
Of course, it doesn't answer the burning question: why the *hell* was this titled 'EL REG'S 2011 LINUX-LAND ROUNDUP'? It's about as far from a 2011 roundup as it can be: no mention of Android? New kernels? Really, nothing about '2011' and everything about a single desktop.
... wasn't talking about Linux, per se, but rather it was talking about user interfaces. This old laptop boots into three bash prompts. From there, I can launch any one of about a dozen GUIs, if I want to. I usually run xfce (on my variation of Slackware). The Wife & DearOldMum (mid 70s) & GreatAunt (mid 90s) all use a cut-down variation of Slackware & a much modified version of KDE 3.5 ... I get zero calls for tech support from them.
The kids using the Slackware + KDE machines in the barn's clubhouse also don't have any issues. Anyone claiming that Linux isn't ready for the desktop hasn't given it an honest evaluation.
With that said, shovelware distros like the *buntus, with their "we must be all things to all people, so let's throw in the kitchen sink" corporate attitude, have the exact same issues as operating environments from Redmond and Cupertino, and for exactly the same reasons.
 One is the native laptop screen, the second is on a larger flat-screen display, the third is on an old "dumb terminal" attached to a serial port on the docking station.
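For the curious, the "boot to bash, launch a GUI if I want to" workflow described above is just `startx` reading `~/.xinitrc`. A minimal sketch along these lines (the window manager commands are the standard ones; adjust for whatever is actually installed) lets you pick a desktop per invocation:

```shell
# ~/.xinitrc -- choose a desktop per invocation:
#   startx              -> default (XFCE)
#   WM=wmaker startx    -> Window Maker
#   WM=kde startx       -> KDE
case "${WM:-xfce}" in
  wmaker) exec wmaker ;;       # Window Maker session
  kde)    exec startkde ;;     # KDE 3.x session
  *)      exec startxfce4 ;;   # XFCE session
esac
```

Since nothing graphical runs until you ask for it, the same machine happily serves consoles, serial terminals and a dozen different GUIs side by side.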
I am seriously low quality but tomorrow I will be spending extra time in bed not buggering about with the XP install downstairs. I know it is shafted but fuck it and the person who fucked it and still uses it.
I'd upgrade them to Vista if I was feeling pissed. Is that any better?
As a user of Linux since 1996, I have seen a lot of changes. Most, in time, seemed appropriate. I can't say that I was very happy about them at the time they were brought forward to my use, and I do understand how most people feel comfortable with what is, instead of what will be.
I use aptosid Linux. A rolling release... and all things change in time.
What people want is some stability and continuity, not radical revamping without getting used to the change first... that is why the grumbling users get mad.
It's the unpredictable change that's upsetting. But when you look at any technology, any human activity and any history... nothing has ever been static.
Embrace the future; why not? You really do not have a choice if you wish to be where the future is... Linux is reactive to change; that is why it survives.
Linux is supremely adaptive. But also understand the past; it is how you know where you are.
Host/Kernel/OS "Eyland1" running Linux 3.1-6.slh.1-aptosid-amd64 x86_64 [ aptosid 2011-02 Ἡμέρα - kde-lite - (201107131633) ]
Canonical needs to make money soon else no Ubuntu releases once the Spaceman runs out of cash, so hope they sell lots of embedded tabletty shiny cloudy things.
gnome-fallback-session is in Gnome 3.3 dev release and so will make it into Gnome 3.4 public release, which takes us to Ubuntu LTS 14.04 (12.04 just misses the Gnome 3.4 ship date). Ubunties of a conservative type have until 2019. If you really must have genuine Gnome 2, CentOS 6 has support until 2017. CentOS can run R, LaTeX and LibreOffice so I'm covered.
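For anyone who wants the fallback session before 3.4 lands: GNOME 3.0-3.2 can already be pushed into it with a single gsettings key (key name as of GNOME 3.2; check your release, and make sure the gnome-panel package is installed):

```shell
# Force the GNOME 2-style fallback panel on next login:
gsettings set org.gnome.desktop.session session-name 'gnome-fallback'
```

Setting it back to 'gnome' restores the shell, so it's easy to try without committing.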
That's all after my retirement date, after which I'll be using a Morse key and 10 watts on 3.5MHz. Email? Nah, postcards.
PS: has anyone tried putting a netbook running Unity in front of a group of Younger People? They seem to get the hang of it quite quickly (I always put restricted extras on first so they can actually play music &C).
GNOME lost almost all its popularity amongst the older users because it's a load of crap.
Most of the guys I know, and myself, use KDE, because GNOME is a load of it.
Don't conflate GNOME with Linux and say Linux is imitating Apple. It's not.
The GNOME desktop is, and most of the real enthusiast crowd who were with me 12 years ago agree GNOME 3 is a POS. Please don't get mixed up and say Linux imitates Apple; GNOME may be in some ways, but Linux certainly ain't. For a real desktop that's as easy to install and use as Ubuntu, there's a Gentoo-based distro called Sabayon Linux, KDE version. Probably the best version around and, as you will see if you install and try it, it's got nothing to do with Apple.
Please avoid titles that really confuse people.
Because the Linux community does not use the ergonomics experts that Apple/MS spend a lot of money on.
It is these guys who create the requirements for the programmers to code to. In the Linux community it's mainly the programmers who decide what the requirements are, or end users who ask:
"Make it look like Windows/OSX"
But why is it that all the Linux desktops I've seen just look blurred? The fonts are crap, and to me they still look like the X Windows of old.
This article is pure fluff - "how it looks". Nothing could be more trivial and tedious, and nearly totally insignificant to the subject - "Linux in 2011". Truly, if one doesn't like "how it looks", it is pretty easy, (no big deal), to make it look any way you want it to look.
These facts are independent of the beliefs of all the strange and boring humans out there, who seem to worry about "how it looks", as opposed to what you can do with it.
I replaced my parents' aging WinXP computer last year with a silent mini-ITX running Ubuntu 10.04.
It's been great for them, and me. They can do everything they want, and aren't forced to do anything they don't want. Their complaints run along the lines of "I spent 5 seconds searching for how to do something new" rather than "I lost all my work because Windows decided to restart". Rather than spending three hours every week fixing their problems, I spend 2 minutes every month telling them what to search for in the Software Centre.
Until Unity, that is. "Install all the updates all the time, that'll keep everything running perfectly", I said. So they did, and suddenly Unity hit them. They couldn't find anything, their settings were removed, even their desktop background was changed to the Unity default.
I couldn't even figure out how to revert, I had to reinstall Ubuntu and disable updating to 11.
Ubuntu obviously took a page from the rude book of Microsoft - "We know what you should want, and if you disagree you're wrong". 10 is great, but I'm steering well clear of 11 on anything with a mouse.
Gnome has long been annoying and the Devs have long had a God-complex.
The systematic dumbing down of Gnome was well underway before the Gnome 3 debacle and the Devs were dismissing questions regarding stupid design decisions with sneering indifference some years ago.
On the other hand - who cares?
If they wish to drill holes in the bottom of their own boat, let the fuckers drown as a consequence. Anyone who thinks GNOME 3/Unity is an improvement on what went before is not worth the oxygen they waste on a daily basis.
Vive la KDE