Happy birthday, Lisa: Apple's slow but heavy workhorse turns 30

Read a press release from Apple in the 1990s and it'll end with something along the lines of: “Apple ignited the personal computer revolution in the 1970s with the Apple II and reinvented the personal computer in the 1980s with the Macintosh.” All of which is true up to a point, but the statement does overlook the product …

COMMENTS

This topic is closed for new posts.
  1. Alan Bourke
    Pint

    Great stuff.

    Can we have at least one feature like this a day? Kthxbai.

    1. zb

      Happy birthday Lisa

      I saw the headline and groaned "not another Simpsons story" and skipped the rest.

      1. taxman
        Facepalm

        Re: Happy birthday Lisa

        Doh!

    2. keithpeter Silver badge
      Happy

      Re: Great stuff.

      "Can we have at least one feature like this a day."

      An ebook of the best of these historical articles with edited comments (the ones where the protagonists come out of the woodwork and comment on the proceedings) would probably sell well. I'd drop at least a fiver on a Kindle edition...

    3. J. R. Hartley
      Thumb Down

      Bollocks stuff.

      “Apple ignited the personal computer revolution in the 1970s with the Apple II and reinvented the personal computer in the 1980s with the Macintosh.”

      You mean:

      “Commodore ignited the personal computer revolution in the 1980s with the C64 and reinvented the personal computer in the 1980s with the Amiga.”

      There, fixed it for you.

      1. Anonymous Coward
        WTF?

        Re: Bollocks stuff.

        "“Commodore ignited the personal computer revolution in the 1980s with the C64 and reinvented the personal computer in the 1980s with the Amiga.”

        There, fixed it for you."

        You're fscking kidding, right? The C64 was good but it was in no way a pioneer. The Sinclair ZX80 and ZX81 and the BBC Micro beat it by a few years, even if you don't count the Apple machines as proper personal computers (and I've no idea why you wouldn't). As for the Amiga, it was a damn good machine but it didn't reinvent anything; it was simply an evolutionary progression. And you seem to be forgetting the Atari ST of the same era, which cleaned up on the amateur and professional music production side with its MIDI support.

  2. Anonymous Coward
    Anonymous Coward

    He was convinced that the graphical user interface he saw at Xerox, along with the mouse, had the potential to shape the future of computing. He was right... then he copied the idea in a flash and was successful in sucking his followers into believing he was the messiah.

    1. Anonymous Coward
      Anonymous Coward

      Beware the fanbois

      As it happens, I worked on the UK launch of the Star kit (in Brum, IIRC) for Rank Xerox. It was in 1981, two years before the Lisa appeared.

    2. Sorry that handle is already taken. Silver badge

      I loved this bit

      "How many buttons for the mouse is a question that still rages today, but Apple’s testing on computer novices found that one button reduced confusion and eliminated the occasional glance to check the mouse buttons being used."

      I wasn't aware of this, but I think an argument can rightly be made that the way a novice uses a computer isn't a good design guide for optimal use of a computer.

      1. jai

        Re: I loved this bit

        i always maintained that if you needed a second mouse button to do something, then it wasn't worth doing it.

        that was, of course, until Apple implemented a right-click action on their buttonless mice...

        1. TeeCee Gold badge
          WTF?

          Re: I loved this bit

          So you never wanted a second mouse button until Apple told you that you did?

          Is this a one-off, or do you rely on Apple to do all your thinking for you?

          1. Dave 126 Silver badge

            Re: I loved this bit

            >Relying on people remembering or using experience isn't a good way to design a human computer interface.

            They won't be able to learn quickly if they have to remember strange keyboard shortcuts, but if they do the task often enough then their 'muscle memory' will make it almost automatic. This is why I like menus as a training aid: the novice can select File > Save with the mouse, or to give their wrist a rest they can use Alt > F > S (or use Alt > cursor keys), or when they are used to the system they can save time by using Ctrl+S.

            What I don't like about menus is when they get nested, and it becomes a test of mouse dexterity to select an item 3 levels deep... Oh well.

          2. jai
            Trollface

            Re: I loved this bit

            oh no, my entire life philosophy can be boiled down to that. once you accept the reality distortion field, life becomes so much simpler and just works

        2. rcorrect
          Mushroom

          Re: I loved this bit

          With each Mac purchase, the first thing to be replaced was the mouse and keyboard. Even after Apple introduced the Mighty Mouse I still wasn't a fan: sure, you could left-click or right-click, but not both at once, because it was still effectively a single-button mouse. Having separate buttons is handy, especially for gaming: you can navigate by holding down the left button, then click the right button to do something else such as shoot.

      2. Anonymous Coward
        Anonymous Coward

        Re: I loved this bit

        Yes and no.

        Your first instincts are often correct. Relying on people remembering or using experience isn't a good way to design a human computer interface.

        Take a swinging door, for example. If it only opens one way and there are handles on both sides, then your instinct is to pull it open, but on one side of the door this won't work. So really, the side where you push should have a finger plate and the other side a handle.

        Forget the simple applications we use, like web browsers; their limited number of buttons and UI controls is pretty simple to work out. But look at a professional tool for video editing, 3D modelling or sound engineering. A sub-optimal UI in such a tool is rather annoying when you're spending seven hours a day using the software, with lots of mouse clicks and keystrokes.

        Of course, most tools tend to provide power users with keyboard shortcuts. This is where web interfaces suck big time, and why ChromeOS is dead in the water: it's not simple to support keyboard shortcuts in a web application.

      3. Anonymous Coward
        Anonymous Coward

        Re: I loved this bit

        "[...]I think an argument can rightly be made that the way a novice uses a computer isn't a good design guide for optimal use of a computer."

        Louder, please, I don't think the Windows 8 and Gnome developers heard you the first time.

      4. joejack
        Megaphone

        Re: I loved this bit

        Macs have ALWAYS had second and even third mouse buttons; it's just that they're (even more confusingly) located on the keyboard. CTRL-Click = right click, option-click = ...

        1. Blake St. Claire
          Boffin

          Re: I loved this bit

          > Macs have ALWAYS had second and even third mouse buttons; it's just that they're (even more

          > confusingly) located on the keyboard. CTRL-Click = right click, option-click = ...

          They were actually on the mouse if you could be bothered to buy one of the three button mice that were on the market. I saved for a long time to buy my first Mac, and almost immediately purchased a three button mouse for it.

          I spent a few years in the late '80s and early '90s working for one of the NASA labs; one project I had some peripheral involvement with was building a system for the US military using Macs running A/UX (and DECstation 3100s running Ultrix). The biggest user complaint was the one-button mouse. I edumacated them about three-button mice, they bought them, problem solved.

      5. Anonymous Coward
        Unhappy

        Re: I loved this bit

        "I wasn't aware of this, but I think an argument can rightly be made that the way a novice uses a computer isn't a good design guide for optimal use of a computer."

        Quite. Having more than one button on a mouse isn't an indulgence; it makes some contextual operations a damn sight easier. Apple's single button not only makes some things more of a faff, it also makes it next to impossible to use some X Windows applications properly on OS X, because they require three buttons and Apple's two-button bodge really doesn't work very well: you're never quite sure if you've pressed it correctly until something happens on screen. And we won't get into the lack of a scroll wheel. The Apple mouse could be used as the dictionary definition of style over practicality.

    3. messele
      Pint

      The truth of the matter is he liberated the Xerox engineers from their short-sighted idiot upper management on the other side of the country by firstly recognising their genius (which the Xerox board never did) and then offering them a job at Apple. I wouldn't call that poaching them, since their ideas were rotting in a lab and were never going to see the outside world via Xerox.

      The fact that the Xerox board approved Jobs' and the Apple engineers' visits in return for the privilege of purchasing Apple stock before its flotation tells you how short-sighted they were.

      Ask Bob Metcalfe, Larry Tesler, Alan Kay or any of the multitude of others who worked at PARC what they feel about Apple and Steve Jobs, and you'll get a frank, not always flattering, but overwhelmingly grateful response, to a man.

    4. This post has been deleted by its author

    5. Dana W
      Meh

      Paid for the idea. He saw its potential, Xerox did not.

      They paid in stock. Guess what it would be worth now if Xerox had KEPT it?

  3. Mondo the Magnificent
    Thumb Up

    Excellent

    Excellent in-depth story about the LISA and what she brought to the party with regard to "personal" computing.

    This was briefly covered in the documentary film titled "Welcome to Macintosh", where some of the original development team members were interviewed.

    As for a LISA being worth almost US$25K, I wonder how many former owners rue junking their LISAs now?

    1. Franklin
      Thumb Up

      Re: Excellent

      "As for a LISA being worth almost US$25K, I wonder how many former owners rue junking their LISAs now?"

      I junked mine about three years ago when it totally quit working. The motherboard had so much corrosion on the circuit traces that it would no longer even turn on.

      I picked it up for $100 in about 1987 or 1988, from a used computer place that said it wasn't working. It turned out that a cable for the video tube had popped off; easy 5-minute fix. It was a Lisa 2/5, with the 5 MB hard drive, and it served as my primary computer for about the next four years or so; I replaced the ROMs with Mac XL ROMs and found it ran Mac software quite nicely. (I was running System 6.0.8 at the time.)

      Lovely machine. I was sorry when it finally failed for good.

  4. Electric Panda
    Joke

    I suppose the Lisa would make a dent...

    if you dropped it

  5. Silverburn

    Pic 1, page 1

    hmmm....lovely desk finish...just perfect for using your 80's mouse on. Or not. Then there's the small matter of the diarrhea colour pattern...

    1. TheRealRoland
      Facepalm

      Re: Pic 1, page 1

      Uhm, remember having to clean the innards of mice, because lint, dust and crud would collect around the two axis thingies? Mice have not always been 'optical'...

      Am I showing my age?

      1. Michael Xion
        Thumb Up

        Re: Pic 1, page 1

        Ah, the old 'drunken mouse' after dousing the insides with rubbing alcohol to get rid of accumulated cruft.

        1. Muscleguy
          Boffin

          Re: Pic 1, page 1

          One advantage of working in a science lab is that industrial quantities of pure alcohol for cleaning mouse rollers and balls were no problem. I still have a small vial of the stuff for loosening the ball on my current mouse, the one on top.

          Meths will do just fine as well, of course, though it's smellier.

  6. Shonko Kid
    Mushroom

    "Apple invents the Personal Computer. Again."

    I see that their marketing department has never rested on its laurels, and has continued to provide fresh, up-to-the-minute copy...

    1. Goldmember

      Re: "Apple invents the Personal Computer. Again."

      Indeed. And it seems that flogging hugely overpriced, underwhelming kit and fighting long, expensive lawsuits are also ancient traditions kept very much alive today.

  7. Anonymous Coward
    Anonymous Coward

    The Apple II was my first self-contained micro in 1979. As far as I recall the Lisa/Mac didn't offer an affordable upgrade or experimenter card slots. So I migrated to IBM compatibles instead.

    1. Dave 126 Silver badge

      Yep, I've heard from engineers older than myself that they went with IBM compatibles because whilst they liked Apples, they just couldn't connect anything to them.

      1. Blake St. Claire
        Boffin

        > Yep, I've heard from engineers older than myself that they went with IBM compatibles because

        > whilst they liked Apples, they just couldn't connect anything to them.

        Yes, they went with PCs, and added cards for all the things the Macs had built-in.

    2. Dazed and Confused

      Apple II and card slots

      You used to see Apple IIs with all sorts of cards hanging out (did they ever make a top cover for it?)

      The Apple II was so successful because it was such an open system.

      Then with the Apple III they closed the doors, so it wasn't any use to anyone who'd been using the II outside the office.

      1. Blake St. Claire
        Boffin

        Re: Apple II and card slots

        Er, the Apple III had slots. (Google it.)

    3. Franklin

      "As far as I recall the Lisa/Mac didn't offer an affordable upgrade or experimenter card slots."

      True of the first Macs, not of the Lisa. The Lisa had a card cage next to the motherboard, with (if I recall correctly) three slots. On my machine, one of the slots was occupied, but I don't recall what was in there. (Parallel port, maybe?) I bought an aftermarket SCSI card for the second slot, and used it to connect SCSI devices when the parallel-port Profile hard drive (with its whopping 5-megabyte capacity) started to get a bit flaky. If I remember right (it's been quite a while), the computer couldn't boot from a SCSI drive but it could use them.

  8. Anonymous Coward
    Anonymous Coward

    I'm waiting for the "but Android is better" comment.

    1. Anonymous Coward
      Anonymous Coward

      My pleasure...

      Android is better.

      :o)

  9. Anonymous Coward
    Anonymous Coward

    Forgotten?

    I'm not so sure Lisa is "the machine Apple would rather you didn't remember". From Jobs' biography, it seems to me it's more that Jobs deliberately spoiled it with the original Mac in a (somewhat typical) fit of pique when he was sidelined in the company.

    Perhaps a better description would be the machine *Jobs* would rather you didn't remember.

    To be fair to him though (flawed flaky genius and all that stuff), the Mac proved to be a better thought out, more commercial machine.

  10. hugo tyson
    Unhappy

    Everyone tried to copy it

    Certainly "troubled Cambridge micro-maker" Acorn tried in the early '80s. The first ARM powered machine was supposed to be Lisa-like, but the researchers in the USA treated it like ongoing research rather than a product to be finished (obviously my bit was finished in time :-> ), that it more-or-less caused the "troubled" epithet and the development of Arthur and RISCOS in a hurry and instead.

  11. Anonymous Coward
    Anonymous Coward

    >"rectangles with rounded corners [...] were “everywhere”."

    So, the one thing he invented, and he didn't even invent it because it was already a commonplace idea, by his own admission. Good riddance to the egotistical liar.

    1. Dave 126 Silver badge

      Re: >"rectangles with rounded corners [...] were “everywhere”."

      An invention that exists only on paper is of no good to anybody (except patent trolls). Look at how much tech has been invented in the UK, and then look at how successfully it has been turned into money to reward the inventors. That observation alone should tell you that people who aren't inventors are required to turn ideas into products and money. That was Jobs' role.

      What's yours, AC?

      Even obvious and clearly superior ideas need to be championed, sadly. If you live in the UK, look at the light switch on your wall: chances are it is an inch-long switch with sharp corners sitting in the middle of a 4in plate, and it requires a firm press. Nasty. Now look at the light switches commonly used on the continent: the switch is the same size as the plate, it has rounded corners and it can be easily tapped to switch between on and off.

      1. Anonymous Coward
        Anonymous Coward

        Re: >"invention that exists only on paper is of no good to anybody"

        How exactly are rounded rectangles "an invention that exists only on paper", when the quote quite clearly says that they were "everywhere"? What do you think that line about Jobs taking the guy out for "an educational stroll" meant? He was pointing out actual rounded rectangles in everyday life, not just chatting to him. They were a commonplace, as Jobs himself admitted at the time, but years later he reversed his position and claimed to have invented them; that makes him a bare-faced liar.

  12. Neill Mitchell

    Is it me?

    Or is that Lemmings advert incredibly ironic now?

    1. Darryl
      Happy

      Re: Is it me?

      Does kind of remind me of the Apple Store lineups on new-release day...

  13. Eponymous Cowherd
    Meh

    The problem with Jobs?

    People seem to think Jobs' biggest fault was the way he stole others' ideas.

    It isn't.

    That is the way technology evolves, and always has evolved. See a good idea and make it smaller / faster / cheaper / easier to use. That is what Jobs did, and I have no problem with that. In fact he should be lauded for it because he did it well.

    The issue with Jobs is that he wanted to have his cake and eat it. He was happy to use other people's ideas, but threw his toys out of the pram when someone did the same with one of his products. There are recordings of him boasting about stealing Xerox's ideas, and then there's his famous "kill Google" rant.

    1. Dave 126 Silver badge

      Re: The problem with Jobs?

      I remember Bill Gates reacting to Jobs' accusation that he stole the Windows GUI from Apple. Gates used an analogy along the lines of: 'Imagine you had a friend who stole a TV set from his neighbour... now you go to the same neighbour and steal his other TV set, but your friend says you stole it from him...'

  14. Spoonsinger
    Happy

    1983 - A good year for Kev!!!

    The arm of the dead guy in 'The Big Chill' and an Apple advert. Where did it all go wrong?

    1. Destroy All Monsters Silver badge
      Pint

      Re: 1983 - A good year for Kev!!!

      You can see his trademark way of answering the phone - resting the whole upper body weight on the elbows...

  15. Wade Burchette
    FAIL

    Icons and windows

    "Spreadsheets would never be the same again and neither would the way we relate to computers. This world of icons, folders and office stationery remains to this day and likewise the impression that these places exist on the computer continues to shape our thinking as we engage daily in direct manipulation of computational tasks."

    Until Microsoft decided to scrap that for a user interface that only works well on phones and tablets, and forces that user interface on everybody while ignoring people who legitimately do not like it. (The people who claim that "if you don't like Windows 8 you must not have used it" are a side-effect, because Microsoft is too square to have a blind following.)

    1. Dave 126 Silver badge

      Re: Icons and windows

      Install a 3rd party replacement for the Start Menu. It ain't that difficult. But yeah, 'twas silly of MS to give people a reason to bash them, when it was so easily avoided. Still, they probably figured a lot of people are happy enough with Win7 and wouldn't upgrade anyway, so they thought they'd get a bit experimental with Win8.

      Thinking positively, being able to choose from a few options for different parts of Windows' desktop environment might work out better for the end user... you could choose from a selection of file browsers that are competing on quality, or are just better suited to the way you do things. Intermediate and advanced users already use 3rd-party software to give shortcuts to deeply buried settings, and many OEMs impose their own interfaces for audio options and the like on their customers. Logitech's Windows software gives the user a clone of OS X's 'Mission Control: Show all Windows' feature, which I find handy...

      [Now, on the other hand, that Ribbon interface was very poorly handled... there was no reason why it couldn't co-exist with normal menus for a version or two. And it ate up too many vertical pixels when people have too few to begin with... very silly, MS. What really took the piss was that rather than provide a plugin that reinstated menus, they directed you to an interactive "Where the bleedin' heck is that thing I'm looking for?" guide.]

  16. Mystic Megabyte
    Trollface

    Start as you mean to carry on?

    "It was also a non-standard media – it had additional cut-outs on the sleeve"

    "The Lisa failed because it was very underpowered,"

    "fatal combination of poor performance and high price"

    1. Lars Silver badge
      Thumb Down

      Re: Start as you mean to carry on?

      My interest in Apple died when the Lisa came; I am not sure why. It did have running horses, though.

  17. Anonymous Coward
    Anonymous Coward

    Patent system in reverse

    All this talk of who copied whom is just silly. It is quite possible that graphical user interfaces existed even before Xerox PARC. The problem with these very early systems is that there simply wasn't enough computing power to make them successful; in those days DOS-like systems were viable and hence very successful. Graphical user interfaces, and how such human-computer interaction would improve productivity, must surely have been discussed in university research papers. As processor technology improved over the years, much of what had been dreamt about decades earlier was finally realised.

    If somebody tomorrow manages to make a spaceship or a time-travel wormhole, would you say they copied Einstein? Any idiot can sit around and dream about goblins and futuristic technology. Getting this technology to work is what most people would call genius.

    The patent system isn't designed to protect lamers who have nothing better to do than dream about technology they would like. E.g. I couldn't patent a wristwatch that projects an interactive computer desktop hologram into thin air; on the other hand, if I developed LED technology which allowed this, then I should get a patent for my LEDs. If somebody else manages to project a hologram with music speakers then they should get a patent for that. The way certain governments are running their patent offices hampers technological advancement: intelligent people are not willing to develop the required technology because some lamer is waiting to take their cut. The system was devised to encourage technological advancement, but it seems to be working in reverse.

  18. Don Casey

    Trivia

    At the time, Cullinet (makers of IDMS, later absorbed into CA) was developing an integrated desktop application called Goldengate, which included Word/Excel/PPT equivalents (my memory is hazy on the latter).

    One bit of this was the ability to upload/download mainframe database data from an IDMS facility known as the Information DataBase (a set of packaged IDMS facilities that provided a quasi-relational database).

    Apple and Cullinet were in joint development, until Apple (as I hear it) pulled out of their side. Cullinet went on to develop the product for the PC side of things, where it pretty much underwhelmed, in spite of being arguably revolutionary.

    I still have my square "Lisa/Cullinet: the Intelligent Link" button.

    1. Michael Wojcik Silver badge

      Re: Trivia

      Thanks for the reminder - I'd completely forgotten Goldengate until I read your post.

      Even more trivial: I met John Cullinane (Cullinet's founder) once. Seemed like a nice guy.

  19. Harman Mogul

    Excellent!

    A really fascinating read. It reminds me of all the stuff I failed to grasp when subbing Practical Computing magazine. I went to the UK press launch of the Lisa and as far as I could tell it was operated by PFM. But the drink did not aid comprehension.

  20. William Roberts

    I was lucky enough to have used a Lisa

    The team I worked with at Phillips Petroleum from '81 to '85 used the Lisa as a standard office workstation. It was a revelation after my earlier experience with IBM AT technology and software. The integrated software was far ahead of its time. In '85 Apple came and demonstrated the Macintosh. When we found out that the software from the Lisa would never be migrated to the Mac, we laughed them out of the conference room. There was simply no way the initial Mac could ever compete with the Lisa. To this day, I have a hard time understanding how it ended up on the trash heap.

  21. Paul McClure
    Thumb Up

    History

    It's nice looking back to see today's processors many hundreds of times faster, communication speeds a thousand times faster, and storage 100 thousand times more vast. Sweet.

    1. Anonymous Coward
      Anonymous Coward

      Re: History

      And software that is really not all that different. A Lisa user transported to 2013 would find a current Mac or Windows 7 PC fairly recognizable, albeit much faster and more colorful.

      1. Peter Mc Aulay

        Re: History

        Not that much faster, TBH.

  22. Herby

    Understand the Lisa for what it really is!

    In the grand scheme of things, it was the prototype for the Mac. You develop all the features and work lots of the troubles out. Then you go and make a "second system". While this "second system" really isn't related to the first one, it grows and improves upon the original. Sure, it did take a while to get the Mac up to the capability of the Lisa (around the Mac Plus), but things did work out.

    My feeling is that it was a shame they dropped the 68k processor. For a given clock speed the 68k is far superior to the 80x86 chips. When they went to the PPC processor, development of the 68k kinda died. Had it continued, I suspect the 68k processors would have kept pace. Oh, well!

    1. Anonymous Coward
      Anonymous Coward

      Re: Understand the Lisa for what it really is!

      I could be mistaken, but I seem to recall reading it was Motorola who decided to discontinue the M68000-series, not Apple, SGI, Sun, NeXT, Commodore or anyone else who was using it.

      1. Kristian Walsh Silver badge

        Re: Understand the Lisa for what it really is!

        Yes, because only Motorola could discontinue their processor line, but the reason for them killing the line was that the customers for the high-powered desktop chips were moving away from Motorola. A product without buyers is of no use to anyone.

        SGI had already gone RISC, Atari had moved back to videogame consoles (and never progressed past the 68030), Commodore's Amiga was hitting the end of the line (and also never shipped with anything beyond 68030 - the 040 and 060 chips came from the resurrected organisation).

        Intel's phenomenal success in the late 1980s gave them so much more money to pour into deep pipelining and other tricks to make their architecture work for them, but Moto couldn't justify it. They had very few volume sales for the high-end CPUs, and at the same time, the embedded device customers were pulling them towards low-power operation, which is where the 68k ended up.

        Motorola stopped developing high-performance 680x0s, killed their 88000 RISC line, and joined the AIM (Apple IBM Motorola) alliance in the early 1990s. The idea was to produce a modern, RISC-based desktop to mainframe processor architecture between them and reap the rewards of the larger economies of scale. Apple and IBM were to provide an OS (Taligent), Moto and IBM did the fabrication, and all three would make customer hardware.

        It didn't work that way: Taligent died very slowly, IBM concentrated on Power servers and Motorola ended up making PowerPC chips for embedded devices, with a high-performance variant just for Apple, who were also dying on their feet. The G4 was the last Moto chip in a Mac, but while its embedded focus made Apple's laptops king of the heap for battery life, the desktop line was falling behind the competition, not just on the customer-facing marketing point of peak clock speed, but also on real performance. The IBM-sourced G5 was the last hurrah, but as a server part, it would never make it to a portable.

        I learned assembly on a 68000 (Atari ST), and 68030 (Atari Falcon030), and always remember the chips fondly...

      2. Tridac

        Re: Understand the Lisa for what it really is!

        Not quite correct - you can still buy an M68000 from Motorola even now; the difference being that it's in CMOS and much faster. Motorola did stop development after the 68040, but only because better architectures and processes were being developed and it had run its course.

        As for the graphics, Evans & Sutherland really wrote the book on early computer graphics, including a line clipping algorithm in this US patent from 1972:

        http://www.google.com/patents?id=hwI1AAAAEBAJ&printsec=abstract&zoom=4#v=onepage&q&f=false

        and a complete computer graphics system running on a DEC System 10, from 1969:

        http://bitsavers.informatik.uni-stuttgart.de/pdf/evansAndSutherland/lds-1/LDS-1_Brochure.pdf

        It may be fashionable to think that Apple invented all this stuff, but they were merely building on much earlier work, done in the days when computers were barely powerful enough to run any kind of graphics system...

        1. Destroy All Monsters Silver badge
          Angel

          Re: Understand the Lisa for what it really is!

          > http://bitsavers.informatik.uni-stuttgart.de/

          DAT LINK!!

        2. Kristian Walsh Silver badge

          Clipping...

          It may be fashionable (and justifiable) to think Apple are nothing but a hollow marketing operation, but that doesn't mean that the company never invented, or never pushed technology forward.

          The QuickDraw region-clipping algorithm *was* new - other graphics systems allowed you to specify a single polygon if you were lucky, but usually only a rectangle. For irregular overlaps, this meant dividing your L-shaped exposed area into two rectangles, and then repeating your draw operation for each one. With an expensive draw operation, that meant wasted computation, or more complex code (preCompute(); for each rect in clip list: clip(); draw(); )

          The QuickDraw system, unlike these, allowed arbitrary-shaped clipping regions. You could open a "region" handle, draw into that region with any of the QD primitives to define its shape, and then finalise it. Once you defined your clipping area using that Region, the clipping was done at the bit level, not geometrically (it's actually bit masking, but with some optimisations to a. pack the mask bit structure efficiently, and b. never execute fully-obscured draw commands).

          The famous Evans & Sutherland patent is for geometric clipping on vector displays. The same techniques can be adapted for bitmaps, but the bitmapped nature of the display allowed much more sophisticated clipping to be performed. This is what Apple, or rather Bill Atkinson, did, and it's why he was awarded the patent.

          (more history on this: http://www.folklore.org/StoryView.py?project=Macintosh&story=I_Still_Remember_Regions.txt )
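
          To make the region idea concrete, here is a minimal sketch in Python (purely illustrative: not Apple's QuickDraw code, and all the names here are invented for the example). The clip region is an ordinary bit mask that shapes are unioned into, and each draw call then tests that mask per pixel instead of clipping geometrically:

              W, H = 16, 8  # a tiny "screen"

              def empty_region():
                  return [[False] * W for _ in range(H)]

              def union_rect(region, x0, y0, x1, y1):
                  # Union an axis-aligned rectangle into the region's bit mask.
                  for y in range(y0, y1):
                      for x in range(x0, x1):
                          region[y][x] = True

              def fill_rect_clipped(fb, region, x0, y0, x1, y1, ch):
                  # One draw call; the per-pixel mask test does all the clipping,
                  # however irregular the region is.
                  for y in range(max(0, y0), min(H, y1)):
                      for x in range(max(0, x0), min(W, x1)):
                          if region[y][x]:
                              fb[y][x] = ch

              # An L-shaped exposed area - the case that forced two clip/draw
              # passes on rectangle-only systems - built from two rectangles:
              region = empty_region()
              union_rect(region, 0, 0, 12, 4)
              union_rect(region, 0, 4, 4, 8)

              fb = [["."] * W for _ in range(H)]
              fill_rect_clipped(fb, region, 2, 2, 14, 7, "#")
              print("\n".join("".join(row) for row in fb))

          A real implementation would pack the mask into a compact per-scanline encoding rather than storing one boolean per pixel (the folklore.org story linked above describes Atkinson's compressed region structure), but the masking principle is the same.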

          1. Michael Wojcik Silver badge

            Re: Clipping...

            While Atkinson's work was important, it was hardly unprecedented. The Sutherland-Hodgman clipping algorithm from 1974 can clip to arbitrary polygons. Wolfgang Straßer and Edwin Catmull had independently described z-buffering in '74 and '75 respectively, and that's a perfectly good method for doing raster clipping (just treat the windows as planes parallel to the viewport). Weiler-Atherton, published in 1977, can clip to an arbitrary window.

            The bitmap-mask method described in Atkinson's patent may well have been novel at the time - I don't know of any prior or independent invention of it - but it falls out pretty naturally from BitBLT with the appropriate raster op, and much of that came from PARC.
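
            For comparison, here is a minimal Python sketch of the Sutherland-Hodgman algorithm mentioned above (again purely illustrative; the names are invented). The subject polygon is clipped against each edge of a convex clip window in turn - the geometric style of clipping that the bitmap-mask approach sidesteps:

                def clip_polygon(subject, clip):
                    # Clip `subject` (a list of (x, y) vertices) against each edge
                    # of the convex polygon `clip`, given in counter-clockwise order.
                    def inside(p, a, b):
                        # True if p lies on the interior (left) side of edge a->b.
                        return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]) >= 0

                    def intersection(p, q, a, b):
                        # Intersection of segment p-q with the infinite line through a-b.
                        (x1, y1), (x2, y2) = p, q
                        (x3, y3), (x4, y4) = a, b
                        den = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
                        t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / den
                        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))

                    output = list(subject)
                    for i in range(len(clip)):
                        a, b = clip[i], clip[(i + 1) % len(clip)]
                        inputs, output = output, []
                        for j, cur in enumerate(inputs):
                            prev = inputs[j - 1]  # wraps around to the last vertex
                            if inside(cur, a, b):
                                if not inside(prev, a, b):
                                    output.append(intersection(prev, cur, a, b))
                                output.append(cur)
                            elif inside(prev, a, b):
                                output.append(intersection(prev, cur, a, b))
                    return output

                # Clip a triangle to the unit square and print the resulting vertices:
                square = [(0, 0), (1, 0), (1, 1), (0, 1)]
                triangle = [(-0.5, 0.5), (0.5, 1.5), (1.5, 0.5)]
                print(clip_polygon(triangle, square))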

  23. Snipp

    Lisa, it's your birthday. Happy birthday, Lisa.

    1. Destroy All Monsters Silver badge

      "Thank you, HAL"

  24. Philip Lewis
    Pint

    30 years on

    My career started in April 1982 IIRC.

    Sometime after that, my cycling mate and flatmate Philip got a job at Apple and started his career.

    We had an Apple III at home with plans to write business software for it, but it was clear early on that the III was not going mainstream, and we were busy trying to eke out an existence. And "existence" is the right word. A small two-bedroom flat in the western suburbs, 2 mattresses, 2 chairs and a table, and an old B&W TV we sponged from a mate when he moved on. We also had the ultimate in utility furniture - the floor. Life was minimal in those days. Our bikes were the only valuable things we owned - and many a hard weekend ride was had to Waterfall and through the National Park, where I was mercilessly hammered.

    Then the Lisa got released in Australia and Phil was the demo guy. We had the Lisa in the flat to play with for a while as well. I didn't understand at the time how revolutionary it was, or the impact its successors would have on my life. It seemed like a toy (which it was), but its legacy is with us today, like it or not.

    I bought my first car during this period, a not too shabby Mk. II E-Type Jaguar, about a year after the Lisa was available (since I had started getting paid and saved a few pennies). I got it from some guy, a structural engineer, who needed the money to buy the Lisa! I believe some engineering stress analysis software had become available that was useful and cool for his engineering work. I hope it helped him make money. I, on the other hand, put a shitload of miles on the Jag!

    So, this article reminds me that I have been in the business for a long time. It reminds me that I was once clueless, and now I am not. It reminds me that I was once penniless and now I am not. It reminds me of old friends who were there at the beginning as well.

    Cheers for the Lisa, it was there right at the beginning for me ...

    Beer, obviously.

    philip

    1. Zmodem

      Re: 30 years on

      I'd happily forget a random autobiography.

  25. Anonymous Coward
    Anonymous Coward

    Lisa

    The computer Apple would like you to forget, named after the daughter that Steve tried desperately to forget.

  26. david 12 Silver badge

    DE-9 plugs found on the Mac

    Another victim of the Wikipedia vocabulary Nazis.

    I've been following the emergence of this term (DE-9) on the web for years. Originally, it was only used by a few people who had a sad desire to feel superior to everyone else. Gradually it has become more common, and now (if you search part suppliers) you see that even (many) manufacturers have adopted the new nomenclature.

    However, since this was a historical article, it seemed anachronistic to see it used here.

  27. f1rest0rm

    Excellent Article

    Very enjoyable read ...

  28. Phlip
    Thumb Up

    There's a working Lisa on display in the entrance to The National Museum of Computing to mark the anniversary.

  29. paulll

    "Smalltalk microcode"

    Umm .... what exactly are we thinking,"microcode," means?

  30. Anonymous Coward
    Anonymous Coward

    GUI Desktop

    Apple didn't invent the graphical user interface. The revolutionary computer scientist Douglas Engelbart developed many of the concepts we take for granted today; it was his research papers back in the 1960s that detailed many of the concepts on our desktops now, and he even produced a prototype in 1968 demonstrating some of his ideas. Apple would like everybody to believe they developed the desktop, or that they paid Xerox for the technology. It wasn't even Xerox who invented the GUI; they merely implemented Engelbart's ideas in full, which is a reason why they did not apply for a patent. It's remarkable how the whole world thinks Apple invented the computer desktop. Anyone who has attended a decent university and studied Human-Computer Interaction would know that this is not the case.

    1. Michael Wojcik Silver badge

      Re: GUI Desktop

      Yes, if only the article had mentioned Engelbart. Somewhere around the middle of page 2 would have been a good choice.

      You probably would have noticed that, had you not been so busy grinding your axe.

  31. dssf

    Jobs and Cruise?

    Is it me, or could Tom Cruise today play the Steve Jobs of the era of that photo? I just thought it was funny. Similar to how various middle-aged actors in Hollywood today look like the younger versions of other actors.

  32. Hillman_Hunter
    FAIL

    led to MS Windows

    Let's face it, the Apple Lisa was not a success, and MS Windows wiped the floor with the Mac. Jobs stuck the signposts up, though; he just struggled with a market that only ever purchased IBM kit. The computer business was very conservative back in the day. When the market was ready, it was Bill Gates who had the genius to exploit it with Windows (and mug IBM to boot). A similar thing is happening with the iPhone now: Apple bust down the doors with the right product, but one which their competitors can easily copy, and now they watch their competitors make off while they're arguing with the judge.

    1. Anonymous Coward
      Anonymous Coward

      Re: led to MS Windows

      It was obvious which direction the smart-phone market was headed: Symbian Series 60, then UIQ, going on to Linux handsets; mobile browsers going from WAP to Opera Mobile and then full-function browsers. Apple realized this and developed one of the most expensive phones on the market, selling it at a loss by giving large discounts. In addition to this they heavily marketed their products, trying to convince the world they invented the smart-phone.

      Apple's competitors did not rip off Apple. The smart-phone is nothing more than a phone and a computer. Apple's competitors waited for the right time, until the fast, expensive technology came down to a price that a consumer is willing to pay. In fact Apple copied the user interface of others, then claimed they were the first to do it on a smart-phone and hence should be granted a monopoly. Apple seems to think that if they do something obvious first, even if it's uneconomical, then they should be granted a monopoly.

      The idiots at the patent office help them in their lunacy by granting me-too obvious patents. E.g. phone messages went from text to picture messaging; a patent would be granted if somebody saw this and implemented picture-messaging features in their blogging software, which was text-only before.

  33. Martin Huizing
    Windows

    “A year later these FileWare drives would be replaced by a single Sony 3.5in 400KB floppy drive”

    3.5in? You sure you don't mean 5¼in?

  34. Ian 55

    What was the name of the TV programme that featured lots of them?

    Probably Channel 4, about a kidnap, possibly in Ireland. The businessman having to come up with the ransom was clearly targeted because he could afford dozens of them...

    Oh, and bonus question, when will Bill Atkinson receive the widespread fame he deserves?

  35. pelicanmike

    I worked on developing some of the first accounting software for the UK market, in conjunction with the UK launch of the Lisa.

    It is easy to dismiss the Lisa as a brief blip on the Apple calendar but at the time it was seen as completely revolutionary.

    We worked on the Apple stand at the Which? Computer Show in Birmingham for three days demonstrating our software. For the entire time, there was a massive queue of people wanting to see the new system in action. After three days of delivering demos I literally couldn't talk (unbelievable, I know).

    The Lisa was an important evolution in the Personal Computer Market. Some features we have come to take for granted, were first introduced by the Lisa project (with obvious credit going to work also carried out at Xerox).

    The delays in the project did indeed allow IBM/Microsoft to gain a dominant position in the marketplace. Here we are 30 years later and the technology landscape has changed beyond even the wildest imaginings of all involved in the industry back then. I remember having conversations about the 3.5in floppy disk and asking "whatever next?".

    Happy birthday Lisa
