"the system has a more modern and 'flatter' look"
Ubuntu 18.10 is about to make its scheduled appearance: Cosmic Cuttlefish will take centre stage from previous incumbent, Bionic Beaver. Looking back, version 18.04 of Ubuntu Linux was a shock to some and featured a desktop not to everyone’s taste, being the first Long Term Support (LTS) release of the distribution without the …
That caught my eye too, but the first question I have is: with KDE* available, why does anyone use Gnome?
I mean it's the default desktop in so many popular distros that there must be a good reason, but I can't for the life of me work out what it is.
* Or Budgie, or Cinnamon, or XFCE etc etc...
I came to the comments to weigh in on this same topic. I don't get the drift to flat interfaces. 5-year-old graphics chips are perfectly capable of rendering 3-D graphics. I prefer icons with some depth; the appearance is visually pleasing. So who decided that we all need flat interfaces? Is this change simply for change's sake?
> Is this change simply for change's sake?
You've answered your own question. About a great many things.
It was a sad day when some Linux developers decided they were "artists" instead of "engineers". The gnome lot are probably the worst, though systemd takes the special mention prize for zealotry.
" I don't get the drift to flat interfaces"
It's like MS Windows with a chest wrap. Or a mastectomy. Whichever.
Some millennial "children" out there [and their enabling oldsters] *FELT* (not 'think' but 'feel') that THEY should impose a Windows 2.x interface upon THE REST OF US, with their "It's *OUR* turn now" twist upon it, because they *FELT* (not 'think' but 'feel') that it was somehow "better" because in the early noughties, this is all that SMART PHONE INTERFACES could manage. So, in the realm of "all devices are phones" and "all OS's must ACT like phones", they cram this 2D FLATSO FLUGLY eldritch abomination (read: zombie resurrection of the Windows 2.x interface) entirely up the rest of our backsides JUST to make themselves *FEEL* better about being important or something...
The UI, on any operating system, is something people are reluctant to change. We still have, with slight differences, elements that have been here since Windows on MS-DOS. Look at how minimise/maximise/close has and hasn't changed.
I have seen ideas for UI changes which might be improvements, but the struggle to overcome all those decades of habit made them more like failures.
@Dave Bell : Look at how minimise/maximise/close has and hasn't changed.
hum ... dunno ... I removed the minimise buttons, as I use "present windows" instead, and replaced them with a "keep above" button, which allows me to have a window visible and on top even while typing in another one. Or take "roll-up" and "roll-down" when rolling the mouse wheel over the title bar, like MacOS 9 did in its time.
Windows and MacOS don't offer these very useful UI features; only KDE (Kwin) does.
"I have seen ideas for UI changes which might be improvements, but the struggle to overcome all those decades of habit made them more like failures"
For users, any UI change has to provide an amount of benefit that exceeds the cost of changing work habits (and that cost is rather high). If a UI change provides a benefit that doesn't reach that threshold, then it doesn't just seem like a failure, it is a failure.
For me, I don't consider KDE an option, ever, because it wastes so many resources on eye candy, bogging down my system. In a VM, KDE is unusable too because it is so bloated.
Then again, I don't use Gnome either. First thing I do is install xmonad, a non-decorated tiling window manager. I come to my computer to do stuff, not to get dazzled by eye candy or to get RSI endlessly arranging windows with a mouse.
Multi-monitor support also sucks for both kde and gnome.
@Hstubbe : "Multi-monitor support also sucks for both kde and gnome."
I don't know about gnome (haven't used it for 10 years) but Kwin (KDE Plasma 5.12) is clearly one of the best window/display managers. I have multiple screens, at work, at home, in conferences, HDMI, DisplayPort, DVI, and it all just works. I never need to reboot, uptimes are in weeks.
What needs do you have that Kwin doesn't do ?
When it comes to Multi-monitors, workspace per screen is the only way to go IMO.
Few Desktops have Windowmanagers that do it.
Gnome's default of workspaces on only one screen isn't too bad - a little limited, but you can leave an app on another screen undisturbed while you flip workspaces on the main screen.
Openbox has a patched version that does workspace-per-screen (works OK with LXQt; not tried with KDE/Openbox).
Enlightenment does it
But yes, the most functional are mostly the Tiling Window Managers.
When it comes to Multi-monitors, workspace per screen is the only way to go IMO.
Definitely! I've always run that way. When XFCE dropped support for multiple screens a few years ago, I ended up switching to running multiple instances of openbox+tint2. It works great 99% of the time, but there is still a bug where openbox occasionally locks up the X11 system. That's annoying but not bad enough to make me give up having multiple screens.
Does a flat file server handle flat file databases?
But seriously, uptimes of months are easy and common in every world but the one with Redmond in it. There are several machines around here that only get rebooted when kernel security updates demand it. They always come back up gracefully.
With that said, anybody who maintains personal uptimes just for the sake of bragging probably deserves what they get. If your system's security/performance/whatever would benefit from a reboot, then reboot the fucking thing already! That particular DSW was over a couple decades ago, and BSD clearly won by a nose, with a properly setup Linux machine coming in a close second, followed by Apple trailing by a couple lengths. Redmond, sadly, DNFed and needed to be put down ... but for some reason was spared by the fanbois. Perhaps it'll be put out to pasture soon, it certainly won't be offered up for stud ...
 Corporate uptimes are a whole 'nuther kettle of worms.
With that said, anybody who maintains personal uptimes just for the sake of bragging probably deserves what they get.
After 18 years of not using Windows, occasionally noticing my uptime is more than a week or two still gives me a little thrill.
I still find it a pain having to reboot every time a kernel or graphics driver update (get your shit together, Nvidia) forces me to. SystemD promised quicker reboots, but I'm not seeing them: every time, some errant process takes nearly two minutes to close, and UEFI insists on polling every single external drive looking for something to boot, despite boot order settings.
I still find it a pain having to reboot every time a kernel or graphics driver update (get your shit together, Nvidia) forces me to.
Why do you have to reboot for an Nvidia driver update? I've been running nvidia cards for decades, and don't remember ever having to reboot for driver updates.
I do remind myself every couple of weeks (when I've got some spare time) to reboot my desktop boxes just to make sure they are still bootable. If you wait for six months, then inevitably something will force a reboot right in the middle of some urgent work -- and only then will you find that some update or other that happened during the past months required a configuration adjustment that you forgot. Now you've got to figure out what went wrong while people in manufacturing are twiddling their thumbs waiting for you.
This is a virtualization host, with only trusted hosts on it, and of course it's firewalled against access except from a few hosts (and has no Internet access). It runs desktop hardware and consumer SSDs!!
You can't argue with years of service! It's the only currency in the operations business!
"For me, I don't consider KDE an option, ever, because it wastes so many resources on eye candy, bogging down my system. In a VM, KDE is unusable too because it is so bloated."
What are you on about? KDE uses less CPU and RAM than even XFCE these days; KDE is incredibly light.
I went directly for Budgie after trying it in a VM. It has a few flaws, like not being able to spawn panels on a second monitor, but I don't really care. It looks nice, it's very lightweight, and at the end of the day I do my work in the shell or a browser; what's underneath it is almost irrelevant. I could just start my stuff from a shell, for that matter.
I used to really love KDE, but it sadly lost its way from version 4 onwards, requiring a graphics chipset with quite some oomph at a time when many computers still didn't have one, making it (for the time) bloated and slow.
On the other hand, early releases of Gnome 3 were indeed horrible, but, now, with a useful search box, an unobtrusive dock, a Mac-like launcher, a Mac-like window thumbnail previewer (you might as well borrow other people's good ideas), and good keyboard control, I find that I do actually quite like it.
In a way, it's oddly amusing that the open source community has managed to make something that is perhaps everything that Windows 8 was trying to be (but failed), and did a far, far better job of it.
My only real complaint would be that there are a number of features that are still very desirable and useful that are relegated to the Tweak Tool rather than being properly available via the main Settings window.
I realise that it is not for everyone, but I've never been one to have a desktop full of icons; there's no way I'll ever see any of them behind all of my open windows!
"Modern KDE can indeed be far too resource-heavy, but at least you can configure it to be light."
Can you easily get rid of crap like akonadi being a prerequisite of everything I'd like to install but don't want a goddamn desktop indexer for?
I used to like KDE a lot. Used to.
"I used to really love KDE, but it sadly lost its way from version 4 onwards, requiring a graphics chipset with quite some oomph at a time when many computers still didn't have one, making it (for the time) bloated and slow."
I keep hearing this, but my reality doesn't match the griping. I installed Slackware 13.0 on a (then) 6 year old laptop in 2009. It came with KDE 4.x. The laptop in question (HP zv5105) has low-end Intel graphics with "shared" 64megs of memory. It ran KDE 4.x quite nicely, and was my primary computer for 7 years.
I wrote: "The laptop in question (HP zv5105) has low-end Intel graphics with "shared" 64megs of memory. It ran KDE 4.x quite nicely, and was my primary computer for 7 years."
Make that an ATI Radeon 9000 IGP w/64Megs of shared memory. It was my primary computer for 7 more years. I just booted it up, and it is running the video software that came with Slackware 14.0 (no ATI driver). It works fine, and I still use it on the road when I'm in a place where it might get lost, stolen or strayed (nobody is going to steal a 15 year old laptop!).
She started on Slack 9.1 and was updated as Slack was updated. The old gal got wiped&re-installed for a couple releases of Slack, after major architecture changes. She's now running Slack 14.0 -stable ... I'm tempted to update to Slack 14.2 -stable, just to see how it goes. KDE has never even stumbled in all that time (aside from issues that everybody was having, of course).
They say the memory is the first to go ... or maybe I forgot it was ATI video because I never thought about it again after the initial installation. Still, not too bad for a 15 year old machine :-)
KDE runs quite well on my Intel Braswell dual core laptop with 4GB of RAM and Intel integrated graphics. It's probably the most underpowered CPU/GPU you can get from Intel right now, but it doesn't have any trouble with KDE. It runs better than Windows 10 did for the very short time it was on there.
"Same reason Linux took off and BSD not so much. Because LibQT was open sourced just a bit too late."
Nah. LibQT became an important part of the equation many years after Linux blew past BSD in the number of developers and/or users column. The real reason is because AT&T's lawyers had issues with BSD. Linux was seen as being unencumbered with perceived AT&T license problems. We needed a cheap-or-free *nix for work, home, and play ... Minix wasn't it (yet ... it's a good general purpose OS now, check it out!), Coherent was about to go away (open sourced in 2005, find it at tuhs), the other commercial offerings were either brain-dead or stupid-expensive for home/student use. So Linux+GNU won by default.
That was the mid-90s; the rest, as they say, is history.
Note that I'm not anti-BSD, far from it ... I've used BSD since roughly the same time that ken got to Berkeley. It's a great OS, and IMO a better server OS than Linux ... although Linux is breathing down BSD's neck in that department.
It was "... windows and controls gain a lighter feel." that caught my eye. I have absolutely no idea what it means, and the rest of the article failed to enlighten me.
Are we talking about "light" as opposed to "dark", here, or "light" as opposed to "heavy"? I'm not sure that either means any more than the other, in this context, anyway.
Yup, Microsoft has had to go flat to lower CPU and GPU loads to help with battery life. If you don't recall, Microsoft Windows on portables has always been a big battery hog compared to all other OSes. So with Microsoft going "flat", that seems to be what some people think is "modern".
Linux distros should always stand up and stand out for what's better. Following Microsoft is never a good thing.
"Microsoft has had to go flat to lower CPU and GPU loads to help with battery life"
From a company that walks over dollars to pick up dimes, then. Even *IF* (and I highly doubt it) that this 2D FLATSO trend 'saves battery life', your computer is now LESS efficient because of ".Not" and UWP, and so whatever CPU efficiency is allegedly "made up for" by the 2D FLATSO FLUGLY has _EASILY_ been UN-DONE by the ".Not" "UWP" gross inefficiencies!!! Just watch the CPU during any application loadup involving ".Not" and you'll see what I mean.
And while running, UWP applications have a tendency to 'spin' the CPU from what I've seen. It's somehow worse when you run it in a VM, and worst of all when multiple UWP CRapps fight one another for CPU time while waiting for god knows what interprocess communications they're trying to do, between the 'start thing', cortana, and all of that slurping.
Yeah, what a joke, to even REMOTELY try and claim that MS was trying to save on battery life by doing the 2D FLATSO. HA HA HA HA HA HA HA HA HA HA HA HA HA HA HA!
What is the matter, you don't like searching each screen for a minute to find where the buttons are?
Well, I suppose at least you might be less likely to commit any hasty mistakes.
You should be able to just hit Return for OK/Accept, but that ability appears not to be working (at least the last time I used Gnome-based GTK3 applications).
I've seen this. A pop-up asks for a response, you type in text, hit return, and for some crazy reason the program responds as if you clicked on a different button that is hidden behind the pop-up. It's the sort of thing that makes you wonder if programmers are human, or some monster which will be revealed in the next episode of Doctor Who.
For many thousands of years, we and our ancestors have been dealing with three dimensional objects in space. Being able to do this well and quickly no doubt often meant the difference between making a meal of a dangerous animal or it making a meal of you. We're well adapted to living in a three dimensional spatial world.
In terms of biology, we're no different than humans who lived and died before personal computers existed. We're still wired to intuitively respond to three dimensional objects. In terms of GUIs, I've said before that our perception of skeuomorphic, non-flat UIs is hardware-accelerated in our brains. Effects like shading and shadows that give the illusion of depth allow us to instantly recognize that a window is distinct from the background because it has a shadow or that buttons are meant to be pressed because they look like actual buttons. Flat interfaces that don't look like anything we're wired to instantly recognize require more cognitive processing for us to figure out what UI elements are actionable, and that takes a person's attention away from the task at hand for a brief moment. It adds up... I think the Reg article some time ago said that the subjects using flat interfaces were 22% slower than those using skeuomorphic ones, according to the study they were reporting upon. A particularly bad flat interface could be still worse if you have to hunt around to see what is an active UI element.
Looking modern or what some people think is attractive (I think flat interfaces are ugly) is not a good reason to make user interfaces (or themes that define their appearance) that are slower and less intuitive. Function is beauty.
> the obvious question is “what’s new?” The answer is… not a whole lot.
But this is true of almost every Linux [ and by "Linux" we all know that means the kernel and the suite of apps that make up a distribution ] - and has been for years.
The question that rarely gets asked and even less frequently gets a satisfactory answer is: what will I be able to do, with this release, that I could not do before?
And most times the answer is "nothing". For many years now, all new Linux releases have been merely rolling the version numbers on libraries and utilities (squashing bugs and fixing security problems), adding support for new hardware and fiddling with the UI.
The only real change that has arrived in recent years is systemd. But even that is 4 years old, is hated as much as it is adopted and makes no difference at all to the users and the list of functions they can use.
One could argue that stability is a major benefit. That being able to take a user from 20 years ago (i.e. me!) and plunk them down in front of a Linux desktop that they will instantly recognise and be able to use, is a good thing. Apart from some minor silliness, like moving the position of menus and toolbars, it is totally familiar. This is very true. But it is not innovation, it is not "cutting edge" and it is not what developers want to spend their time doing.
Linux has grown fat and slow in middle age. It is no longer the inspirational "alternative" it once was. It no longer leads in terms of utility or design. Yet it contains all the old baggage that makes it a hostile environment for people to adopt. Just try adding a new package - download this, edit that, compile the other, add new libraries to satisfy installation criteria, fix conflicts and maybe - just maybe - after a full day of effort and Googling user forums that shiny new app will work.
We should be at the stage where all a user has to do is sit at a screen and say (or type) "I want to write a document" (or letter, email, flame, program, magazine review ... ) and everything just happens. And the same applies to hardware - especially stuff you can plug in like USB. None of these should be issues, but they are all insoluble due to group dynamics and office politics within the community.
So Linux will continue to increment version numbers. Giving the illusion of progress without change. And in 20 years time someone else will re-write this comment about Ubuntu 38.10. That is, if the Y2038 problem hasn't destroyed the world.
"it contains all the old baggage that makes it a hostile environment for people to adopt"
To a certain extent, yes it does - if you've come from Windows to Linux, just try to add a second hard drive for extra storage! It's relatively straightforward once you know what to do, but seeing how good the current crop of distributions is, it would be better if Linux had a built-in utility which could perform all these steps for you.
Not everyone knows how to boot from a pendrive and edit fstab in vi to recover their system.
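For the curious, the manual steps behind "adding a second drive" are short; the sketch below is hedged -- /dev/sdb, /mnt/storage and the UUID are all placeholders, so check your own machine with lsblk and blkid before touching anything:

```shell
# Adding a second drive by hand -- a sketch, not a script to paste blindly.
# /dev/sdb, /mnt/storage and the UUID below are placeholders.

# 1. Partition and format (DESTRUCTIVE -- verify the device name first):
#      sudo parted /dev/sdb --script mklabel gpt mkpart primary ext4 0% 100%
#      sudo mkfs.ext4 /dev/sdb1
# 2. Make a mount point and read the partition's real UUID:
#      sudo mkdir -p /mnt/storage
#      sudo blkid /dev/sdb1

# 3. Compose the /etc/fstab entry from that UUID:
uuid="123e4567-e89b-12d3-a456-426614174000"   # placeholder -- use blkid's value
entry="UUID=$uuid /mnt/storage ext4 defaults 0 2"
echo "$entry"   # append this line to /etc/fstab, then run: sudo mount -a
```

The last two fstab fields are the dump flag and the fsck pass order; `0 2` is the usual choice for a non-root data filesystem.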
I was in general agreement with your post until...
Just try adding a new package - download this, edit that, compile the other, add new libraries to satisfy installation criteria, fix conflicts and maybe - just maybe - after a full day of effort and Googling user forums that shiny new app will work.
I have not had to do that(make, make install etc) for years. Modern package managers handle the adding of dependencies for you.
Perhaps you need to get out of your basement a little more (to use a US put down)?
> And most times the answer is "nothing". For many years now, all new Linux releases have been merely rolling the version numbers on libraries and utilities (squashing bugs and fixing security problems), adding support for new hardware and fiddling with the UI.
Then the answer isn't "nothing", is it? The answer is "use shiny new hardware and not get pwnd".
"Linux has grown fat and slow in middle age."
This is just not true. I run it on a couple of low-powered machines. It's totally fine as long as you're not trying to use a bloatware desktop environment.
"Just try adding a new package - download this, edit that, compile the other, add new libraries to satisfy installation criteria, fix conflicts and maybe - just maybe - after a full day of effort and Googling user forums that shiny new app will work."
Sounds like you haven't used it for a while. We have AppImages, snaps, and PPAs now. If you can't find one of these for what you want, then it's *very* esoteric. I have exactly one program on my machine that's custom-compiled, and that's because I contribute patches to it - I could just install it with apt.
"add new libraries to satisfy installation criteria"
apt-get build-dep <program>
"We should be at the stage where all a user has to do is sit at a screen and say (or type) "I want to write a document" (or letter, email, flame, program, magazine review ... ) and everything just happens"
You know you could have spent your time building that rather than trolling, right?
"And the same applies to hardware - especially stuff you can plug in like USB. None of these should be issues, but they are all insoluble"
Why do I keep hearing this? You know it's not 1994 anymore, right? I literally can't remember the last time I plugged in a piece of hardware and it didn't just work - I think it was probably around 2005. USB!? Geez, it might not have been this century the last time I plugged in a USB device that didn't just work. I have hardware that is not supported in Windows that works perfectly fine in Linux. Insoluble?? You are just trolling, right?
with respect to building an application from source that requires dependencies be installed...
For those of us trying to create those packages in the FIRST place, or use 'bleeding edge' or otherwise "unpopular" or "unsupported" software on Linux, it can be a bit of a chore.
However, if you find yourself doing 'make install', I would hope that you're familiar enough with the package system to install whatever 'development file' packages your target needs to build. It usually takes me a few tries until I get them all, yeah.
And it's really not THAT hard.
Still, I'd prefer it if developers didn't use so many obscure packages for their "whatever". It can sometimes get irritating, and I don't blame Linux itself for that.
[yeah I've built a few deb packages in my day. It's been a while so I'd have to read the docs again, but there are really good instructions available from debian.org - I just look there when I need it]
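As a sketch of how small the packaging floor actually is (real packages should use debhelper, per the debian.org docs mentioned above; every name and path here is invented for illustration):

```shell
# Minimal binary .deb layout -- illustrative only; use debhelper and the
# Debian docs for anything you intend to distribute. All names invented.
mkdir -p hello-pkg/DEBIAN hello-pkg/usr/local/bin

# The control file holds the package metadata dpkg needs:
cat > hello-pkg/DEBIAN/control <<'EOF'
Package: hello-pkg
Version: 1.0
Architecture: all
Maintainer: Example <you@example.com>
Description: demo package
EOF

# The payload: everything outside DEBIAN/ is installed as-is.
printf '#!/bin/sh\necho hello\n' > hello-pkg/usr/local/bin/hello
chmod +x hello-pkg/usr/local/bin/hello

# Build step (needs dpkg-deb, so shown as a comment here):
#   dpkg-deb --build hello-pkg    # -> hello-pkg.deb
grep '^Package:' hello-pkg/DEBIAN/control
```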
"We should be at the stage where all a user has to do is sit at a screen and say (or type) "I want to write a document" (or letter, email, flame, program, magazine review ... ) and everything just happens."
To do that, I click on Applications/Office/Writer. Been that way for at least 15 years. Not sure how it could be any easier.
"And the same applies to hardware - especially stuff you can plug in like USB."
Also 15 years ago: I bought my first digital camera, an el-cheapo Kodak. My kids were small and playing in the leaves. I had a Windows computer and one running SUSE. After taking several pictures, I was out of space. So I plugged the USB into the Windows computer. Nothing at all. After about an hour of Googling and downloading 100MB of crapware from Kodak I was *finally* able to download the photos to the computer. It was getting dark so I wouldn't be able to get any more pictures. I had assumed that using the camera with Linux would be harder but I hadn't tried it. So I plugged the same camera into the SUSE computer. I immediately got a dialog: "A camera has been detected. Would you like to import your photos into F-Spot?"
It was that day that Windows basically ended for me. Actually *doing* things was now easier on Linux.
I used to be a big fan of Ubuntu - and I still think that it’s an excellent OS, but…
The problem is that there are other, excellenter (sic), free OSs out there. Mint is lovely and as refreshing as its namesake (it’s what Ubuntu should be in my view). Elementary is shaping up very nicely - and, although it’s not a Linux variant, Haiku looks like it might be worth a bash.
At any rate, I’m a little bit fed up with the constant fiddling with the Ubuntu UI. Menus in the bar, menus out of the bar, Unity, Gnome, you put your left leg in. Simmer down Canonical. Pick a UI - and then stick to it, please!
Ubuntu's decision to put the menus at the top of the screen was a usability disaster. I could see how it might make sense on a very small screen where vertical space matters. But for most desktop users, it just makes it a pain in the arse to use menus.
Likewise the decision to hide scrollbars. Hiding them may make sense on small screens, but if you have the space, it just increases the effort to scroll things.
It was smart to finally dump Unity for GNOME shell. I'm not sure about the decision to go back from Wayland - perhaps for an LTS release it makes sense, but they should be doing their best to take X out of the baseline, and defaulting back to Wayland would hasten that transition.
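On the hidden-scrollbars point, GTK 3 applications at least honour a stock environment variable that turns the auto-hiding overlay scrollbars back into permanent ones (this is a GTK knob, not Ubuntu-specific, and it only affects GTK 3 apps):

```shell
# Make GTK 3 apps show old-style permanent scrollbars instead of the
# auto-hiding overlay ones. Put this in ~/.profile or /etc/environment
# so it applies to your whole session.
export GTK_OVERLAY_SCROLLING=0
echo "$GTK_OVERLAY_SCROLLING"
```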
I’ve read the studies - but, personally, I don’t care. Stick the menu where you like - hide it altogether and make it a pop up only if you like - just don’t keep moving it. I don’t want to have to retrain muscle memory after every new release. As for the scrollbars, I haven’t used them since the scrollwheel was invented.
Ubuntu's decision to put the menus at the top of the screen was a usability disaster.
Too true, but it's a disaster shared by Mac OS. It gets even better when you install Excel on a Mac: you have the ribbon, a menu at the top of the window, and another, subtly different menu at the top of the screen.
I thought there were people who actually study the ergonomics of user interfaces. Do they keep their results secret?
I've just migrated to Gnome (Ubuntu 18.04) from Unity (16.04) and... I miss the menus! I didn't ever like having them at the top of the screen (never liked that on MacOS or even Amiga!), but I did really like having them replace the window title.
Probably not the ideal place for new users to find and understand how to use them, but it really does save screen space which is particularly good when you have a laptop with ~800px vertically.
I'm also missing Unity lenses and HUD. And I'm irritated by stupid stuff like the lock screen looking like a phone lock screen to the extent of being able to swipe upwards (I press escape but that's not the point). And I really hate various sounds like the bell in the text editor (gedit) that AFAIK can't be disabled without disabling system-sounds altogether. And it's got a bloody sound that it makes when USB storage is inserted and removed. Just like Windows XP did! And an alert for the same events. Just like Windows XP. And that always really annoyed me because I *KNOW* when I've just inserted or removed USB storage devices... I don't need to be told about it.
In fact, Gnome is really bloody annoying. I'm trying to like it. It is at least a bit more configurable than Unity was, but good luck with quickly cobbling a little extension to do something simple because there's next to no documentation for any of it. Plus most of the extensions that do exist, don't work quite right! (Prolly because of lack of docs?! I dunno!).
You've made me rant now. Rant-end! :D
I absolutely hate the switch from Unity to Gnome. The wasted space with the window titles, the added gnomey application menu that doesn't do anything, the clock in the middle of the top bar for... reasons. It took me about a month to get used to Unity in the first place, and from that point on it's just been right.
I guess if you run massive screens with 30 windows on then it could be annoying, but I work on a laptop, mostly using the keyboard. There's very little reason not to have everything maximised.
"they should be doing their best to take X out of the baseline and defaulting back to Wayland would hasten that transition."
ugh, not that Wayland crap again. How about 'stay with something that we know WORKS' and 'stay with something that supports REMOTE SESSIONS' ??
Here's an example of why I can't use wayland:
I'm on an X11 desktop logged in as 'a_user'. But I want to run a desktop application as 'b_user'. So, how can you do this with WAYLAND? As far as I understand it, you can NOT. [this is especially useful when editing system files using a GUI editor while logged in as root].
Or, what if I'm running an application on a Raspberry Pi - a graphical editor - and that RPi is HEADLESS, so there's no GUI to run it on? How can I use the GUI editor with WAYLAND? As far as I understand it, you can NOT.
To avoid a lengthy post, and the 'captcha' irritation, I'll summarize by saying it involves correctly setting the 'DISPLAY' environment variable, using the 'xhost' application to enable a host to connect to the X server, using '.xserverrc' to enable the X server to listen for TCP connections, blocking unwanted incoming connections to port 6000 in your firewall, and then 'just running the GUI application' in the logged-in session. It works, remotely interacting with your GUI desktop on a different [or sometimes the same] computer, using TCP to talk to the X server, and allowing for things like running a GUI app as root when needed.
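For reference, that recipe boils down to only a handful of commands; the hostnames and usernames below are placeholders, and the xhost lines must be run inside the session that owns the X server:

```shell
# On the machine whose X server will display the windows:
#   xhost +si:localuser:root         # allow another local user (e.g. root)
#   xhost +inet:rpi.example          # or a trusted remote host over TCP
# (the server must also be started without -nolisten tcp, e.g. via .xserverrc)

# On the remote host (or in the other user's shell), aim X clients at it:
export DISPLAY="desktop.example:0"   # host:display of the listening X server
echo "$DISPLAY"
#   some-gui-editor &                # hypothetical app; now draws remotely
```

If an SSH connection is acceptable, `ssh -X remotehost` wraps the DISPLAY and authorization plumbing in an encrypted tunnel and avoids opening port 6000 at all.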
One of the reasons I like Mint is that they are willing to experiment with the UI, but they are able to support a cluster of different shells and keep them available in parallel. They might not all be released on the same day, and I doubt that's a good idea anyway, but it is one of the things that makes it more than an Ubuntu clone.
One of the biggest barriers to "linux on the desktop" is the bewildering array of ... well what *are* GNOME, KDE, etc, etc ????
*I* know what they are. *You* do. But it's one of the hardest things to discuss with Windows users who have no frame of reference.
Especially when distros mess around with their choices over the years ... Ubuntu dropping GNOME for Unity, then reverting. Mint using Cinnamon as default (but you can change that if you like).
It wouldn't really matter a pile of beans until you then move up a bit, and discover that certain programs require a specific Desktop. For example, Cinnamon doesn't have a calendar widget for Evolution. You need GNOME. And so on.
As for developers ... well bad luck if you were writing a suite for Unity, and your target market switches to GNOME.
" ...and discover that certain programs require a specific Desktop."
I don't see what your problem is. I use Mate on PCLinuxOS, but I came from KDE many years ago and still prefer some of the KDE utilities, like K3b. I just installed K3b and Synaptic went and fetched the KDE runtimes so that K3b would run. Yes, I know it uses more space on the drive and introduces other overheads, but the systems these days are more than capable of dealing with such issues.
I find the reverse. If you're a Windows user and you didn't like what Microsoft did with Win 8, what were you to do? You could find some shareware alternative desktop, but it wouldn't be supported by many others.
Similarly, there are programs that recreate a proper start menu for Win 10. Which Microsoft break every six months with their compulsory updates.
Apple too take the view that they know what you want more than you do, and - when it comes to the iPhone - are prepared to ban any program doing it differently.
With Linux, if you don't like what GNOME did with GNOME 3, or Ubuntu's Unity, it is utterly trivial to find another desktop. That works. Even if you don't go for an Ubuntu version with your favourite - and they cover most of them - installing another one is easy. At worst, you'll pull in a hundred library files to get it working.
What makes an interface, “modern”? Almost all interfaces are the same CDD (Click-Drag-Drop) interfaces we have had since the mouse was invented.
Whether one uses a 7-button mouse, a stylus, or a touchscreen, (or even a hand with a 3-D imaging camera), it is still a CDD interface. All that is being done is changing the appearance of WIPs (Windows-Icons-Pointers), and that does not make it any more or less modern.
When I look at an icon and wink at it, and it opens a window, —sounds modern, doesn't it— it is still CDD on a WIP. Put a mask on my face and a glove on my hand, same thing. We are still talking keyboards, mice, and screens. It is just as modern —or archaic, as the case may be— as every other interface over the last four decades.
Tired of this stupid argument. One uses apps on Linux distros by going to the "App Store" equivalent, and clicking "Install." MS finally caught up with everyone else in that arena.
If the app is not in the Store, then download the install package and click it. Same as in Windows or Mac. *.exe, *.msi, *.deb, *.rpm,… doesn't matter.
If there is no installer package, click on the executable. Snap packs, flat packs, flap jacks,… whatever.
No executable? Then, just as one would do it in the Windows world, one does the same thing in the Linux world: compile and install! …Except most Windows users just don't know that that is an option, and say, "I wish there was an app to do such and such."
They can find the app and install it just as we can, but they don't.
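The escalation described above can be sketched as a tiny dispatcher; the package name and filename here are placeholders, not a real app:

```shell
# Pick an install command from the package file's extension:
pkg="someapp_1.0_amd64.deb"
case "$pkg" in
  *.deb)       cmd="sudo dpkg -i" ;;   # Debian / Ubuntu / Mint
  *.rpm)       cmd="sudo rpm -i"  ;;   # Fedora / openSUSE
  *.exe|*.msi) cmd="(wrong OS)"   ;;   # that's a Windows installer
  *)           cmd="./configure && make && sudo make install" ;;  # from source
esac
echo "$cmd $pkg"
```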
Yeah, like Windows or Mac has had any big, shiny, new things lately. Sure, every now and then, Windows changes the look of their interface, calls it new & better, half the people complain, the other half are wowed, and the world moves on. Linux distros hardly ever make the claim of, "all new and improved user experience!"
What they often claim is, "better stability, new technology compliance, bugs crushed, more secure, less resource intensive, improved HAL, more capable drivers," and that is really all that an updated OS ought to bring.
Confusing Desktop Choices
The operative term there is, "choices." When MS changes the Windows desktop, the end user doesn't really have a choice, (until enough of them complain and MS sends an update, allowing them to revert to the old desktop, then removes that choice in the next Windows major release). Most new users are not 'offered' a choice, but get the distro's default desktop. They can still choose to use any of the many desktops out there, by simply installing them.
Running GNOME but want to install a KDE app? No problem! Install the KDE app! It will install all of the KDE libraries it needs, so you don't have to worry. Running KDE but want to install a GNOME app? …You got this! Install, and it works!
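On a Debian-family system that cross-desktop install looks like this, using K3b (the KDE burner mentioned elsewhere in the thread) as the example:

```shell
# Install a KDE app from a GNOME (or MATE, or Xfce) desktop; the package
# manager pulls in the needed KDE/Qt runtime libraries automatically:
#   sudo apt install k3b
# To preview what would be dragged in without installing anything:
#   apt-get install --simulate k3b
# Or inspect the dependency chain directly:
#   apt-cache depends k3b
```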
Back in the days of Windows 3.x to Windows XP/2000, one was able to choose one's desktop, (although MS did not make it clear that you could). Remember the Packard Bell desktop, anyone? I used Lotus SmartSuite as my desktop at one time. Today, not possible, due to the Secure Platform Initiative —or whatever it is called— from MS, preventing one from changing basic system configuration, allegedly to keep us safe.
Linux keeps us safe AND allows desktop choices, without “confusing us” with this or that option.
Doesn't Run On Linux
I am just about as tired of the Windows photographers who tell me that they cannot use Linux because it doesn't run Adobe CS, as I am with the ones who tell me they cannot use DarkTable because it doesn't run on Windows. If you are so tied to an OS because it is all you learned in school, or so tied to an app because it is all you learned in college, then your education system needs rehabilitation.
If you have Linux, use DarkTable. If you have Windows, use Lightroom. What's the problem? "But I NEED Lightroom!" "But I NEED Windows!" I hold that both those statements are false, and won't get into it now, but if you think you need Windows, or any of its apps, use it and its apps. No one is forcing you to change.
The argument that the Linux desktop is not ready for prime time because this or that Windows app does not run on Linux is a non-starter. Linux has enough apps for whatever one wants to do that the Windows apps are not required.
I feel the need to point out that many apps are "cross-platform" apps —such as the acclaimed SolidWorks— and can run on several different platforms, but such a statement would not help those who are adamant about the 'faults' of the Linux desktop.
With the exception of some printers and scanners, I have not had a piece of hardware in the last ten years which did not work on Linux. …Except for my Harmony remote, but some frustrated programmer fixed that. It was totally Logitech's fault. To be sure, the Harmony remote is NOT a piece of computer hardware which did not work; it was a piece of TV hardware which came with a Windows application to program it.
This is not a Linux hardware compatibility issue. Linux saw it just fine. Linux, like Windows, just had no clue what to do with the hardware it saw.
As for scanners, most scanners do not just work on Windows either, because TWAIN is only an interface: they all NEED special drivers to work, and it is usually up to the manufacturers to provide those drivers. A similar issue happens with some printers on Linux and Windows, where very specific drivers are required, and they do not work out of the box on Windows, either.
The scanners which work on Linux are either those whose manufacturers have provided drivers, just as they did for Windows, or those which work because the Linux community made them work, in spite of the lazy manufacturers. But SANE outshines TWAIN, because it gives the same interface to all scanners, where the TWAIN interface varies from one maker to another.
There is no standard scanner protocol, but there are several standard printer protocols. If your printer uses any of them, it works on Linux. If it uses a non-standard, proprietary protocol, then it only works on Linux —or Windows, or Mac OS— if the manufacturer provided proprietary drivers, or if a frustrated programmer decides to do the simple task which the maker was too silly to do themselves.
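As a concrete illustration of that uniform interface, these commands (from the sane-utils and cups packages, assuming they're installed) are the same whatever the scanner or printer make:

```shell
# SANE: one front end for every supported scanner
#   scanimage -L                        # list detected scanners
#   scanimage --format=png > scan.png   # scan from the default device
# CUPS: standard-protocol / driverless printing (IPP Everywhere etc.)
#   lpinfo -v                           # list detected printer device URIs
#   lpstat -p                           # list configured printers
#   lp myfile.pdf                       # print to the default queue
```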
All other hardware just seems to work.
Some Other Nonsense
Not going there. It is nonsense. I have installed Linux on failed computers of several non-technical family members, and they all seemed to use it fine for all they do. They surf the web, they read email, they write reports, use spreadsheets, edit photos and videos, make music,… all they desire.
Not one complaint, (except by a sister-in-law, who insisted that a web-based app she needed only worked on Internet Explorer on Windows, but it worked just fine on both Chrome and Chromium on Linux. We never checked Firefox).
A fine post, Logics. I'd like to add something.....
A lot of development work runs under a customized Eclipse-based platform that's supplied by chip vendors. This platform runs a combination of scripts, custom utilities and GNU tools and, being Java based, is largely agnostic of the GUI environment. The tools, being for the most part command line, aren't interested in the GUI environment either. The real differences show up when you switch between Windows and Linux. Since the tools are effectively Linux based, not native Windows applications, they're invariably run using Cygwin. Here the deficiencies of the Windows environment show up in innumerable small ways due to legacy issues with file separators and the like, a really weird user model, hit-and-miss USB implementations, rogue interactions with anti-virus programs and so on. People cope with this because Corporate invariably specifies Windows, because the be-all and end-all of their work environment is Office and they're the "decision makers".
I'd suggest that the big mistake Linux distributors have made over the years is imitating Windows. They assume they're going to compete with it, to convert 'Corporate' to using this platform. It isn't going to happen, at least not in the foreseeable future. The Windows desktop is too entrenched, too familiar and functional to be easily taken down. The rest of the system is, not to put too fine a point on it, total crap. Microsoft have kind of acknowledged it by providing a Linux shell capability, but with their typical flair for taking the straightforward and making a pig's ear of it, the result is less than perfect. Although people swear by virtual machines, the best way to marry Corporate's love of Windows with reality is through an X server on the Windows desktop -- between that and a couple of utilities like WinSCP and PuTTY that's pretty much all you need. You just need a spare computer, and there'll be plenty of those about since Windows' greed for resources obsoletes a lot of perfectly good machines.
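In practice, once an X server is running on the Windows box (VcXsrv and Xming are common choices), the plumbing reduces to a single ssh option; 'linuxbox' here is a placeholder hostname:

```shell
# From the Windows machine (PuTTY exposes the same thing under
# Connection > SSH > X11 > "Enable X11 forwarding"):
#   ssh -X user@linuxbox    # forward X11; sshd sets DISPLAY on the remote end
#   ssh -Y user@linuxbox    # 'trusted' forwarding, fewer X security extension limits
# Then, on linuxbox, any GUI app paints on the Windows desktop:
#   xclock &
```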
... the appearance of WIPs (Windows-Icons-Pointers)
Don't forget "Menus", menus are important, too (even if dysfunctional 'modern' GUIs do keep trying to hide them from us in ever-more-annoying ways) ... or did you think the 'M' in "WIMP" stood for "Mouse"?
An upvote is not sufficient for the depth of this post and its rantiness quotient, so have a (virtual) pint.
I would just add that employer number 1 provides an rdp session that I can use to access their Windows Only business logic applications and employer number 2 has just adopted Office 365 (rightly or wrongly, we'll see how that pans out). So I can render unto my twin caesars what is due from the comfort of my Slackware running on a humble dual core processor.
What makes an interface, “modern”?
1. Complete lack of immediate visual discoverability so you have to mouse over everywhere to find out where the widgets are.
2. Doesn't follow the platform style guidelines (which doesn't matter any more as they've probably been a shitshow over the past 5-10 years anyway).
(a bit TLDR even for me)
But your comment about "Modern": I think it's more used as a PEJORATIVE to INSULT anyone who does NOT jump on that bandwagon. We become luddites, old fogies, sticks in the mud, who won't learn, who maybe even CAN'T LEARN, stuck in our ways, opposed to change, blah blah blah blah.
/me swings a clue-bat and a cat-5-o-nine-tails at ANYONE calling that so-called 'modern' 2D FLAT crap "modern" [and actually believing it]
"/me swings a clue-bat and a cat-5-o-nine-tails at ANYONE calling that so-called 'modern' 2D FLAT crap "modern" [and actually believing it]"
Well good to see you.
Here's our cutting-edge UI that our team have researched the hell out of to bring the user and the computer together in a more harmonious state. You can, if you wish, change it to something you are more familiar with, but this is our vision of the future, so why not get on board now?
I'm just upgrading.
Well good to see you're still with us.
Here's the UI you have always used, but I might point out that more options are now available within the bounds of this UI. You might want to read more, or even have a peek at the other UIs we make available.
I'm just downgrading
Well I'm really sorry our latest iteration foisted something awful at you.
Here's the system you left behind. We have noted where we think the new system didn't meet your approval and will notify you just as soon as Torvalds finishes wringing the necks of those behind your bad experience.
I've jumped from Windows
Well good for you, and welcome aboard.
Here's the UI you're most probably used to, but I'm afraid we won't be able to recreate the same dysfunctional experience that Microsoft designs into its products. You might want to open your computer and randomly rake a fork across the motherboard to recreate the Windows effect.
I've jumped from Apple
We're really sorry to hear you've lost your design job/trust fund/savings.....
Actually after 15 years on macOS I am considering jumping back to Linux. Back, because I ran it when I had to roll (and patch) my own kernels to do so.
Why? It feels like the quality has gone. Stuff used to "just work", and work brilliantly - Snow Leopard stands out in my memory as a high point. But now... Time Machine fails so often my trust has gone. I'm unconvinced I could restore from backup to a new machine when required. iTunes breaks virtually every rule of user interface design that has ever been written. Login for various online systems seems to work, or not work, depending on whether the cloud is having a good day. I can't delete or move certain things around in / unless I reboot to some sort of safe mode, even as root. If I upgrade my MacBook I have to lose USB-A. And, most significantly, despite half a lifetime in UNIX I can't diagnose and fix anything because I haven't got any f*ing documentation, logs or source code to help me through.
So I've ordered a comparable laptop and will be testing Linux on the Desktop. I'm going to give Ubuntu and Mint a spin I think and see how it plays out.
Design and money are not a factor. This is an engineering decision - I suspect I will spend less time trying to get Linux working properly than macOS.
Biting the hand that feeds IT © 1998–2019