Wot! Someone copying Apple?
Please.... No... Please someone tell us all that Apple copied it from someone. Then the Karma of the Universe will be restored.
Canonical is shifting around the trash can icon on the upcoming Ubuntu 17.10 release, which might give some a sense of déjà vu. Apple kicked off the trash in the corner trend in 1983, with an easily accessible icon for storing junk on its Lisa computer. In 1995, Microsoft added a "recycle bin" to the DOS replacement, Windows …
I’m pretty sure (and someone is bound to correct me if I’m wrong) that Apple hasn’t invented anything per se. What it has done is innovate (a lot) - and many of those innovations have since been copied by other computer manufacturers.
The beige plastic case. It might seem stupid, but this did a lot to make computers acceptable for home use. Before Apple? Heavy, pressed steel case full of bodged wires and unfriendliness. After Apple? Streamlined plastic, and finished circuit boards.
The floppy drive. Apple didn’t invent the floppy drive - but, before Apple, disk drives cost more than the computer - and contained their own CPUs, RAM and so forth to drive the, er, drive. After Apple, the computer’s own CPU drove the drive using software run on the computer itself. Thanks to Woz the price of disk drives dropped dramatically.
Colour graphics. Before Apple, if your computer could even drive a display it was driven like a teletype. No moving graphics. Just text - and strictly black and white. Using some clever kludges based on the inadequacies of NTSC, Apple gave us colour graphics.
Drop-down menus. The GUI existed before Apple, but it was very menu driven. No one thought of hiding the menus away so that they weren’t visible until clicked. In fact, I think that the icon representation of files and folders might be an Apple innovation too (Xerox used lists of filenames).
Regions. This is the cunning method by which only the parts of the screen which have changed get redrawn, rather than the entire visible area. It’s how Apple got away with using comparatively weedy CPUs and limited memory compared with the beast that was the Xerox Star.
ADB. Imagine a desktop bus through which you could daisy chain keyboards, mice, joysticks - even slow handheld scanners. Sounds like USB? Actually, it’s ADB - and the year is 1986.
Desktop spanning multiple monitors. Apple may have been the first - but even if it wasn’t, it was the first affordable (relatively) implementation. Yours since 1986 (with an add-on board and monitor which clipped to the CPU of the Mac Plus).
CD-ROM. Again, Apple wasn’t the first - but it was the first to ship an optical drive as an integral part of the computer (Mac IIvi / IIvx).
No floppy drive (or CD-ROM). How everyone laughed. And then copied this useful cost-saving idea.
The Dockable Computer. If only they hadn’t abandoned this useful idea. I wish modern Macs had a dock connector - but that doesn’t alter the fact that the Duo did it first, and (even today) did it peerlessly well.
I could go on. The postscript laser printer, the swipeable touch screen smartphone, the ‘intelligent’ PDA, the modern tablet computer and many more besides. None of these products was, strictly speaking, the first - all had ancestors - but all did it in a way that made them significantly more useful than anything that went before.
Ultimately, you might not want to use an Apple product (whether for good reasons (there’s another system which fits your use-case better) or stupid ones (I hate Apple and I’ll never buy an Apple product)), but if you use a computer then you have no choice but to use Apple’s innovations.
Dead dead wrong. Apple just about copied the whole shebang wholesale. Sure, they improved lots here and there, but the basic work for the whole deal was Xerox's.
Could be wrong, and someone will correct me, but I don't think Apple EVER did ANYTHING that was genuinely "new".
I think that it's fair to say that Woz was the genius behind the design, but a good design isn't enough to succeed in this world. You need business acumen as well - and without the business acumen of Steve Jobs there wouldn't be an Apple (or a Pixar) today. You may not like him, but Apple would have gone the way of Osborne and Sinclair without him.
And Woz wozn't (sic) the only hardware / software genius at Apple. Let's not forget the likes of Burrell Smith, Bill Atkinson, Andy Hertzfeld…
Stanford's NLS (by Doug Engelbart) can reasonably claim to be the first ever GUI. Xerox's Alto (http://toastytech.com/guis/salto.html) followed up on this groundbreaking work - but had no icons, or drop down menus, at all. Nevertheless, it was a huge step forward, but nothing like a GUI that you'd recognise today.
Star, also by Xerox, improved on Alto with icons and resizable, overlapping windows - but, in the case of Star, a huge amount of CPU power was required because the entire screen had to be redrawn whenever a window was moved. Oh, and it didn't have a trash either - at least, not in its earliest incarnation - and by the time it did get one, Lisa and Macintosh had launched. Nor did it have drop-down menus - all its menus were in a ribbon-like bar at the top of the app, plainly on view at all times.
So yes, I think that Apple can reasonably claim to have innovated the first recognisably modern GUI. More importantly, I think that Apple (Bill Atkinson, to be more accurate) can reasonably claim to have invented the crazily complex maths required to do Quickdraw Regions - that clever functionality whereby only the parts of the screen which have changed get redrawn. Xerox were astonished by the regions functionality - they hadn't thought it possible - and it permitted Apple to run a full GUI on a 5MHz 68000 CPU (Lisa).
Furthermore, Apple built on the NLS work of Doug Engelbart (1968) and ENQUIRE by Tim Berners-Lee to develop HyperCard - the first mixed text and media hypertext system. HyperCard in turn influenced Tim Berners-Lee (very recursive) and Robert Cailliau to develop what we now call a web browser. Which is a very useful innovation.
But really, has anyone ever done anything that was genuinely 'New'? We're all just standing on the shoulders of giants. It's giants - all the way down.
Where to start? Almost everything you mentioned was available on some other computer or some other OS before it appeared on an Apple device.
Almost all the GUI elements you see on a modern computer were developed at Xerox PARC and were commercially available on Xerox Viewpoint workstations years before the Lisa or the Mac.
There were many home computers from Tandy, Commodore etc which came in neat beige boxes with built-in floppy drives at the same time as the first Apples (cheap enough chips were available to make a hobby PC viable - several companies had a go!).
Probably the only real innovation was the full-page glossy ads in Omni magazine.
I think you are trying to say that Linux is perfectly good as a desktop (and has been for a number of years) and is being used more and more, but has not yet replaced the established operating systems installed on the majority of desktop computers? It also runs on more than 99% of the TOP500 supercomputer list, dominates the smartphone market and owns the server market too. Lastly, it is the most popular operating system in the world!
I know that you are a semi-troller, but although it might not (yet) be the year of Linux on the desktop, it positively is the year of Linux in the pocket, since the number of mobile/tablet devices sporting Android has in fact surpassed Windows overall and Android is for the most part built on top of Linux.
"Don't use it ... the rm command works nicely for me."
But it's not the same thing. In most implementations, the bin is a directory or similar structure to which unwanted files are moved. How they then get finally deleted is a matter for both the implementation and possibly a user's config choices. Many *nix GUI desktops will even default to the DEL key moving files to the bin, with the user having to press SHIFT-DEL to actually delete the file.
So, although the rm command works well for you, it's not the same action for most users in anything approaching the default settings.
Personally, I'm more likely to be using the rm command too, and change the default window manager action so that DEL deletes properly.
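The distinction is easy to sketch in a few lines of shell. This is a deliberately minimal, hypothetical version of what the DEL action in a file manager does - real desktops follow the freedesktop.org trash spec and also record each file's original path in a `.trashinfo` file so "Restore" works, which this skips:

```shell
# Minimal sketch of "move to bin" versus rm. The path matches the
# XDG trash layout, but the .trashinfo metadata a real file manager
# writes alongside each trashed file is omitted here.
TRASH="${XDG_DATA_HOME:-$HOME/.local/share}/Trash/files"
mkdir -p "$TRASH"

# DEL in most *nix file managers: the file is moved, not destroyed
soft_delete() {
    mv -- "$1" "$TRASH/"
}

# SHIFT-DEL (or rm at the shell): gone immediately, nothing to restore
hard_delete() {
    rm -- "$1"
}
```

Restoring is then just an `mv` back out of the trash directory, which is why the bin costs almost nothing to implement but saves the occasional slip of the finger.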
Apple kicked off the trash in the corner trend in 1983, with an easily accessible icon for storing junk on its Lisa computer.
I didn't have a Lisa. But my failing memory is that the trash can on my first Mac (an SE in 1987, System 6.0.x) didn't store anything. It either trashed files or ejected disks. Am I wrong about that?
More or less. Dragging a file to the wastebasket (as it was known on U.K. Macs back then) didn't result in the immediate deletion of the file. The file could be recovered from the trash until Finder quit. On System 6 or earlier this meant that the file(s) would be deleted whenever you launched a program (in single-tasking mode) or at shutdown (if MultiFinder was running for cooperative multitasking).
System 7 (1991, for the Mac Plus and above) fixed this so that deleting files worked more or less as it still does today, from a user perspective at least.
Indeed - System 7 is basically System 6 with MultiFinder permanently on, a new Control Panel and Fonts implementation, and VM support (if your ROM was 32-bit clean, of course) - along with some other refinements.
The Wastebasket lived on until OS 9 came out, where it became Trash across the board.
32-bit cleanliness is not required for VM. 32-bit cleanliness is required to use more than 10MB RAM. To use virtual memory you need a 68030 CPU, or a 68020 CPU with an MMU fitted to your computer. In practice this means that all non-68000 Macs can use virtual memory, 32-bit clean or not, with the exception of the LC which, despite being 32-bit clean, had no MMU - and a multiplexer which was limited to 10MB RAM. Making the original LC a bit of a shit computer.
...instead of lines of text, programs and documents on the new interface will be represented by "icons," which symbolically represent the nature of the object. Users will be able to manipulate a device called a "mouse" to move an on-screen pointer. By clicking buttons on the "mouse" while the pointer is over an "icon," the user can "tell" the computer to perform common actions without having to memorize complicated command line instructions.
...instead of lines of text, programs and documents on the new interface will be represented by icons
Icons? What decade are you still living in? Take a look at the quintessential paper design interface: icons are on the way out, to be replaced by large areas of white space and some text (just to break the monotony).
Look at GNOME: hardly any icons (well, none that seem to denote anything remotely symbolic or helpful - notifications is just a lucky-bag circus guessing game, and systray icons are confined to their leper colony on the bottom left) until you go into Activities, and then it's just brand logos.
Icons are the purview of religious fanatics (again).
Icons? What decade are you still living in
That was part of the joke, you see.
Riffing off of the idea that a trash can was somehow "new," I took that to the next step and satirically implied that the whole concept of a GUI (circa 1984) was also new.
It seemed clever at the time. Apparently I was a bit too subtle.
I honestly don't think I have ever used this thing on linux or windoze (not an apple user). Dragging files/icons/whatever there is more effort than hitting del after you have selected them. TBH I'm not even sure if the trash can is enabled 'cause I've never used it to restore anything either.
I used Ubuntu for the first time today and, as my first Linux desktop and what's supposedly the most robust of them all, I was massively disappointed. The UX oscillates violently between overly simplistic and ridiculously convoluted. Specifically, I had to navigate its abysmal nomenclature for safely removing a USB flash drive, because apparently where Windows 7 needs just the one term, Ubuntu needs four different but overlapping ones. After half an hour of StackExchange research I finally understood that my USB didn't need to be Unmounted, Ejected, or Powered Down, but Safely Removed, and that not doing so would be... unsafe. Naturally though, you can't Safely Remove a USB from Disks - no, that's just Unmounting and Powering Down. The only location from which you can Safely Remove a USB is the Launcher, which, of course, my USB wasn't displaying on.
This is the great Ubuntu that every Church of Linux sysadmin swears by? Am I missing something?
Your points are valid ones. Over the last few years developers spent a lot of time and effort to replace something that was familiar and actually worked with messy implementations of the latest trends in GUIs. The move from Gnome 2 to Unity or Gnome 3 was pretty much made mandatory by most distros, even though those interfaces were unfinished and buggy.
A bit like the mandatory move to systemd.
However, I suspect your sysadmin friends who swear by Linux either swear by it as a server operating system, or, if it's a desktop, they use something sane and sensible for the desktop environment (I'm using Cinnamon, but XFCE and Pantheon are quite popular).
On the other hand, I had a quick go on Windows 10 for the first time the other week and found it to be a confused, mish-mashed, monstrous, un-intuitive nightmare of a GUI. So it's swings and roundabouts really.
> "As my first Linux desktop, and what's supposedly the most robust of them all"
No one in the Linux community would consider Ubuntu (Unity) to be the most robust desktop... not even close. That would be KDE Plasma, which is usable via Kubuntu or KDE Neon.
> "The UX oscillates violently between overly simplistic and ridiculously convoluted."
It's fine that you don't like Unity - there are many other options that provide a lot more features and control - but Unity is also being replaced in just a couple of months, so your first test landing on something already deprecated is unfortunate timing.
> "The only location from which you can Safely Remove a USB is the Launcher, which of course, my USB wasn't displaying on."
You can also just right-click the option at the top right of the top system panel (aka the system tray, as Windows users would expect, though at the top right instead of the bottom right).
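For what it's worth, the same operations can be done from a terminal with udisksctl (part of udisks2, shipped by default on Ubuntu), which makes the distinction between the GUI labels clearer. The device names below are examples only - check yours with lsblk first:

```shell
# Example device names - substitute your own (see: lsblk)

# "Unmount": flush writes and detach the filesystem;
# the drive itself is still powered
udisksctl unmount -b /dev/sdb1

# "Power off" (what "Safely Remove" amounts to): spin down and
# cut power to the whole device, after which it's safe to pull
udisksctl power-off -b /dev/sdb
```

"Eject", by contrast, historically applies to removable media in a drive (CDs, card readers) rather than to the whole device.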
> "This is the great Ubuntu that every Church of Linux sysadmin swears by? Am I missing something?"
Sysadmins typically suggest Ubuntu because it is the simplest and easiest to start with. They don't swear by it at all and usually use something much more complex and robust, but one that would be irresponsible to suggest to others as their first venture into the Linux ecosystem.
It is easy to make a suggestion of which distro someone should use if you are talking to them one on one because we can ask questions to narrow down the options. However, it is very hard to suggest any one distro to "everyone" so we all just default to Ubuntu because it has the biggest backing and the simplest approach out of the box.
And a friggin' recycle bin makes the news??? Most every PC that gets recycled by me has a trash bin in the corner of the desktop, either put there automatically by the OS or placed there intentionally. Deleted files in Win 3.1 were saved to a directory from which they could be restored, most of the time. I am surprised that somebody has not blamed Unity for this.
Biting the hand that feeds IT © 1998–2019