The Steve Jobs of supercomputers: We remember Seymour Cray

Before Steve Jobs, there was Seymour Cray – father of the supercomputer and regarded as something close to a God in the circles he moved in. Jobs’ Apple Computer is reputed to have bought one of Seymour’s massive machines back in the day: a Cray, to design the brand-new Macintosh personal computer. This would have been a …

  1. Aristotles slow and dimwitted horse

    Pretty sure...

    The first real "CGI" in movies was done on a CRAY XMP. The Last Starfighter and such like...

    1. Richard Wharram

      Re: Pretty sure...

      Tron was done on a Foonly was it not?

      1. Bit Brain

        Re: Pretty sure...

        3 different companies did the CGI for Tron. According to the "The New Magicians" episode of the old TV series Equinox, one of them used the X-MP.

      2. Michael Strorm Silver badge

        Re: Pretty sure...

        Interestingly, despite the fact it *did* include quite a bit of innovative CGI, the majority of Tron's "high tech" look was done using non-digital masking, layering, film processing techniques and backlighting. (#)

        It certainly wasn't the first film to use any of these techniques- indeed, backlit animation was very popular in the early 80s for that neon/computer look back when real CGI was limited and expensive. However, it's been observed that it was probably the first (and will likely remain the only (##)) film to use them in such an ambitious and extensive manner- basically, every scene inside the computer that isn't CGI uses these film processing techniques to some extent.

        Ironically, Tron's reputation as groundbreaking CGI has overshadowed this (also impressive) use of traditional filmmaking techniques in such an original way.

        (#) See my comment on "Max Headroom" elsewhere for another example of early-CGI-that-wasn't-actually-CGI-at-all.

        (##) I say this because- with the huge technical advances and reductions in cost of actual CGI since Tron came out- there's no way anyone would do it that way today. Even if they wanted to replicate exactly the same appearance and feel, it would still be so much easier to do digitally that no-one with that sort of budget would do it the incredibly tedious and error-prone analogue way.

      3. John Smith 19 Gold badge
        Unhappy

        "Tron was done on a Foonly was it not?"

        From an article at the time, Tron used 3 different systems for different scenes in the film, with different models (wireframe versus frame buffer) and at least one using custom-built hardware.

        The article said it bankrupted all three production houses involved.

        The Last Starfighter was all done on a Cray, and IIRC the company was quite proud that the animation got done at about 1/24 of real time, i.e. 1 frame a second, from scratch.

        So pretty much what a competent Blender user could achieve on a new PC.

    2. Amorous Cowherder

      Re: Pretty sure...

      A CRAY was used as a fancy prop in the Robert Redford/Ben Kingsley flick Sneakers. Redford, after being knocked unconscious, wakes up on the "seated" part of the CRAY beside Kingsley, the evil genius paid by the Mafia to run their office "clerical" systems!

      1. oldcoder

        Re: Pretty sure...

        No. They wanted to use a Cray, but the company said no - they would never sell to a Mafia backed company.

        So they used a fake that had some of the visual cues of a Cray.

  2. Anonymous Coward
    Anonymous Coward

    A superfast computer ...

    ... and a comfy bench when you're all calc'ed out.

    1. Teiwaz
      Coat

      Re: A superfast computer ...

      It's a wonder there is no 'themed' furniture, or lego?

      Maybe next time I'm building my own desktop I'll try for a mock-up (think I still have an 'action man' or two in a box somewhere to put on the seats).

    2. Ilgaz

      $38M chair eh?

      You can also plug in your phone. Read what they did in the '70s:

      http://www.0x07bell.net/WWWMASTER/CrayWWWStuff/Cfaqp2.html#TOC12

      Regular Crashes

      The following is an approximate description of an event that took place in the late '70s:

      There was the time that an early Cray-1 serial number was running pre-ship reliability tests and was crashing consistently at roughly the same time in the pre-dawn morning hours. After much head scratching, somebody realized that the (newly hired) third shift custodial engineer was coming through on his regular pass through the checkout bays at about the time the failures happened. He was questioned when he came in the next night to find out if he might have been moving or operating any of the system machinery in the bay during his rounds. He denied doing anything of the sort. Later that night however he was noticed plugging his Hoover into one of the 60Hz utility outlets conveniently located on the base of a Cray-1 power supply 'bench' in an adjacent checkout bay. The outlets were handy for plugging in test equipment during checkout activities but an ECO (engineering change order) was issued soon afterward removing them.

      1. Tcat

        My 3 Daemons

        For my mid-life crisis I went to IT Training.

        EMI, RFI and static were front and centre from minute zero when real training began.

        I remember the irony of the police car radio triggering the stereo store's burglar alarm (when parked on one side of the building), the DSU/CSU mortally wounded because a power vacuum met the data cable, and the ID10T who ran Cat 3 inside an active elevator shaft.

        And the poor sod who was such a natural static generator in his office that I found him an anti-static tech mat. To this day I keep in good service a pair of Birkenstocks with ESD protection built into the heel.

  3. Andy The Hat Silver badge

    Dribble mode <engaged> ...

    Sorry, I always loved the Cray-1 ... the Cray XMP wasn't bad either ... vector processors ... ooh missus get your fingers off my keyboard!

    It may date me but I'd now be happy if I could simply have a Cray 1 as a seat - like a computer version of Clarkson's Top Gear furniture ... Full geekdom would be in sight :-)

  4. Anonymous Coward
    Boffin

    What use as a bitcoin miner?

    1. Dani Eder

      Bitcoin miner

      These days, anything other than custom bitcoin mining hardware isn't worth the electricity it consumes.

      The bitcoin network currently runs at 5.5 million Petaflops. That far outclasses the 361 Petaflops of the world's top 500 supercomputers *combined*. The reason the network is so fast is that these days it runs on custom chips that implement the hashing algorithm in hardware and are massively parallel. The downside is that these chips are useless for almost any other purpose. You can't reprogram them to do some other calculation; it's wired into the silicon.

      The reason these custom chips were worth making is that newly mined bitcoins are worth $315 million a year at the moment, more than enough to justify a custom chip.
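
      For anyone curious what those custom chips actually compute, here's a minimal Python sketch of the double-SHA-256 proof-of-work search they bake into silicon. The header layout, target and names are simplified stand-ins, not the real Bitcoin wire format.

      ```python
      # Toy proof-of-work search: find a nonce whose double-SHA-256 hash of the
      # header falls below a target. A real block header is an 80-byte structure
      # and the real target makes this search astronomically harder.
      import hashlib
      import struct

      def sha256d(data: bytes) -> bytes:
          """Bitcoin-style hash: SHA-256 applied twice."""
          return hashlib.sha256(hashlib.sha256(data).digest()).digest()

      def mine(header_prefix: bytes, target: int, max_nonce: int = 10_000_000):
          """Return the first nonce whose hashed header is below the target."""
          for nonce in range(max_nonce):
              header = header_prefix + struct.pack("<I", nonce)
              if int.from_bytes(sha256d(header), "little") < target:
                  return nonce
          return None

      if __name__ == "__main__":
          # Toy target: roughly 1 hash in 2**20 succeeds, so this finishes quickly.
          print("found nonce:", mine(b"toy-block-header", target=1 << 236))
      ```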

  5. Anonymous Coward
    Devil

    A question

    Steve Wozniak built the Macintosh in order to democratise computing

    So why are they so fucking expensive?

    1. Teiwaz

      Re: A question

      "Steve Wozniak built the Macintosh in order to democratise computing

      So why are they so fucking expensive?"

      Ah, I see your mistake was in thinking 'democratise' in the traditional sense, not in the modern sense where money buys legislature.

      But then Steve Wozniak's thinking was probably the former; the other Steve was more of the latter.

    2. James 139

      Re: A question

      Probably a combination of things, such as seeing the price people were willing to pay for a Cray.

      As for democratisation, don't confuse available to all with achievable by all.

    3. Steve Todd

      Re: A question

      To my knowledge Woz wasn't involved in the design of the Macintosh. He designed the Apple I and Apple II to that end, and they were cheap compared to commercial kit at the time.

      PC manufacturers these days have two basic options: compete on price (which leads to a race to the bottom), or compete on features (at which point PCs and Macs are about on parity for a given price point).

      You're not forced to buy a Mac, in the same way you aren't forced to buy a BMW rather than a Ford. Providing there are enough people out there who think that BMW/Apple are worth the extra then they make a living. That seems fundamentally democratic to me.

      1. MondoMan

        Re: A question

        Yep, the Macintosh was famously Jobs' baby. The anecdote should reference the Apple I/II instead of the Macintosh.

      2. This is my handle

        Re: A question

        > or compete on features (at which point PCs and Macs are about on parity for a given price point).

        I hear this all the time from Apple users, and have always been skeptical, at least for US pricing. "One 3rd the number of mouse buttons, 3 times the price" is what I've always said.

        I'm eating my words though; parity is near. The local BestBuy offers the machine below for $1K US before taxes: Apple - MacBook Air® (Latest Model) - 13.3" Display - Intel Core i5 - 4GB Memory - 128GB Flash Storage - Silver. Comparable WinDoze machines vary from $800 - $1,400 depending on whether or not you want "touch", or will reformat the Win10 drive w/ Ubuntu, LOL.

      3. TheOtherHobbes

        Re: A question

        Woz was distantly involved in the pre-Jobs Jef Raskin-Mac. Raskin wanted a $500 computing appliance, with a supporting national network (!) and some unusual software (like a pseudo-APL "calculator") running on a conservative hardware spec to keep costs down.

        Jobs liked the all-in-one idea but wanted a more powerful spec because he'd been to PARC by then, and a 6809 with 64k wasn't going to make it happen.

        Raskin and Jobs weren't a mutual fan club, so Raskin left (he never forgave Jobs), leaving Jobs with the idea of friendly+cheap and a brilliant design team.

        The friendly part stuck, the cheap part didn't. Nor did Raskin's unusual software ideas - some of which would still be interesting today.

    4. Anonymous Coward
      Thumb Up

      Re: A question

      Yep, Commodore et al did more toward that goal!

    5. Anonymous Coward
      Anonymous Coward

      Re: A question

      Because they design and integrate both the software and the hardware so it works.

    6. SImon Hobson Bronze badge

      Re: A question

      > So why are they so fucking expensive?

      Ah, that was democratise as in ... make accessible to mere users, not just the nerds. And at the time, other computers (of decent spec) weren't exactly cheap!

      The vision (and as pointed out, it was Jobs, not Wozniak) was to make a computer that was easy enough to use that anyone could use it - that's the democracy bit. Instead of having to learn loads of nerdy crap to do anything, you just had this simple visual desktop metaphor that just about anyone could get to grips with.

      Alongside that was possibly the most important feature - a printed 3 volume set of developer info, Inside Macintosh, one whole volume of which was on how to "do the right thing" with the interface. Most developers followed this and so their programs were easy to pick up and use. For the few that ignored it, the users generally told them where to stick their crap UI and the developer either fell into line or the program flopped.

      There was much detail in the dev books (I had a set). Even the mundane things like "thou shalt have an Apple menu, and it shall contain ...", and "the next menu will be File, and it will contain ...", and so on.

      For those too young to remember, this really was a major milestone. Even without the graphic interface, just the user interface consistency was a major thing - back then every (almost all text based) program "did its own thing", so having learned one program was naff all help in using another, because every developer had their own idea of how it should be done.

      And the general lack of modality was another breakthrough. At around this time, in ${dayjob} the standard word processor was IBM's Displaywrite. This was highly modal - you went into one menu from where you could edit a document; if you wanted to print you had to save the job, exit that menu, go into the printing menu, print the file, and then go back to the edit menu to continue editing. I don't recall if it had background printing or whether you had to wait for printing to finish before doing something else.

      On the Mac it was all event driven - and that challenged some developers who were used to letting the user do half the work. So in a word processor, you could be typing away, and without a thought just whizz up to the File menu, select Print, and the program had to cater for that.
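
      A minimal Python sketch of that modeless, event-driven shape, as opposed to the save-and-switch-menus style; the event names and handler here are purely illustrative.

      ```python
      # Toy event loop: whatever the user does next - type, print, quit - arrives
      # as an event, and the program has to be ready for any of them at any time.
      def handle(event, document):
          if event == "key:a":
              document.append("a")                    # editing continues as normal
          elif event == "menu:File/Print":
              print("spooling:", "".join(document))   # no mode switch, no saving first
          elif event == "menu:File/Quit":
              return False
          return True

      doc = []
      for event in ["key:a", "key:a", "menu:File/Print", "key:a", "menu:File/Quit"]:
          if not handle(event, doc):
              break
      ```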

      I never got into "classic" programming on the Mac, but I did get to do a fair bit in HyperCard. For its day that too was a real breakthrough, with fairly simple (but capable) programming, object orientation, and rudimentary database capability. I know a lot of stuff was built with that!

    7. Dadmin
      Holmes

      Re: A question

      Because they bothered to put more thought into the design and the coupling to the OS than anybody else in the biz did, or does. Apple only produced a single architecture at a time and took great pains to make it all "look and feel" better than your experience with any other desktop machine of the era. To that end, it's easy to build the entire computing experience, from opening the box, to setup, to booting, to working on it, when you're all one company doing one thing. Think about it, who else bothers to do this? Apple had an entire half of a building floor filled with Human Interface design engineers. This was back in the late '80s and early '90s when I worked there. No other computer hardware manufacturer or software manufacturer went to these lengths to get the entire system to work together in a fluid way that makes you concentrate on your work, not how to get past a fucking giant windows modal dialog box that thwarts you from doing even the most simple tasks.

      Linux desktops are getting quite good, and I'm moving my primary home desktop from an old Mac Mini to a Raspberry Pi setup, with some time before I completely cut over, or at least as long as this old Mac stays running. But, as to your question: it's the amount of time they spent on crafting their product, and the supply and demand, that can justify the price. You pay a bit more for the design and the convenience factor, not because they put a cheap Intel heater inside it.

      Some notes relating to the article:

      -The first Cray at Apple Computer was a Cray X-MP

      -It was custom painted purple at a nearby auto body shop in Sunnyvale CA

      -The second Cray they purchased was the Cray Y-MP

      -I was present the day we ALMOST had a halon dump in the main Valley Green 3 computer room. This was back in the early 1990s and I was but a lowly network apprentice. Somehow the air in the computer room became much more humid (it's those chillers, I tell you!); at any rate the smoke detectors under the raised floor got a bit misty-eyed and started the countdown to HALON DUMP TIME! Luckily one of the Unix admins quickly assessed the situation and hit the "don't fucking dump the halon, no fire" button. Good times.

      -This is the same admin who showed me how to stitch together multi-part uuencoded porn pics from USENET. Good guy.

      1. Dan 55 Silver badge
        Windows

        entire computing experience... one company doing one thing... who else bothers to do this?

        At the time, Commodore, Atari, Acorn...

    8. Captain DaFt

      Re: A question

      "Steve Wozniak built the Macintosh in order to democratise computing

      So why are they so fucking expensive?"

      Well it's all a matter of scale. The iPhone may be pricey, but it pales in comparison to what a Cray cost, and can out-compute a room full of them.

      Anyone can buy an iPhone; only governments could afford a Cray.

      That looks like democratisation to me.

      1. Steve Davies 3 Silver badge

        Re: A question

        But..... I've still not seen anyone brave/foolhardy/idiotic/etc wearing an Apple Watch.

        Even in the Apple Store in Washington DC I visited earlier today (to get out of the rain...) I failed to spot anyone wearing one.

        Bit of a damp squib if you ask me.

        On the other hand, a quick glance around the BA lounge at Dulles showed hardly any non-Apple devices in operation.

        Make what you want of that.

        Kudos to the Apple Store though, they did have a Mac Pro on show and working.

        Oh, and the Computer Museum at Bletchley Park has a Cray-1 complete with seats.

  6. Little Mouse

    In the nerdy corner of the school playground in the eighties I remember clearly that Cray computers were considered the absolute ultimate when it came to processing power.

    My mate swore blind that using one was absolutely definitely the ONLY way that Max Headroom could have been made (!)

    1. Teiwaz

      My mate swore blind...

      I'd a mate at school who always tried to outdo everyone, and didn't let his ignorance get in the way. He once claimed he was getting a Cray (I think he got an MSX, in the end).

      1. Destroy All Monsters Silver badge
        Childcatcher

        Re: My mate swore blind...

        I think he got an MSX, in the end

        Damn you Greg. Long time, no see!

      2. PaulyV

        Re: My mate swore blind...

        Ah, every school had one of those. Ours was called 'Peter' and it wasn't long before he had the nickname of 'The Incredible BullSh*tting Man'.

    2. Michael Strorm Silver badge

      Max Headroom

      @ Little Mouse; The irony- as I suspect you know, but others might not- being that Max Headroom himself wasn't CGI at all, but actor Matt Frewer wearing a load of prosthetics. I'm not sure anything approaching Headroom's appearance would have been possible- let alone practical- with computers of the time. (People- even intentionally fake-looking ones like Headroom- were always much harder to do convincingly for early CGI than shiny, flat-surfaced spaceships and plastic balls).

      Admittedly the effect was enhanced (I'm assuming) by what would then have been state-of-the-art digital editing effects et al, but that's still not CGI in itself. The rather simpler background graphics in some later episodes were apparently created on an Amiga, but that's hardly in the same league of complexity.

      1. Anonymous Coward
        Anonymous Coward

        Re: Max Headroom

        Max Headroom was fantastic; shame it didn't last.

      2. Frank Bough

        Re: Max Headroom

        The BG gfx weren't done on an Amiga. I used to work with the guy who did them; I think it was a Matisse system that they used. Not sure.

        1. Michael Strorm Silver badge

          Re: Max Headroom

          @Frank Bough; Re: the background graphics, I was only going by what was written in the Wikipedia article (hence "apparently") which claims that the Amiga was used for backgrounds in the later US TV show and they were originally done by hand by the same guy that did the pseudo-computer-displays for the Hitchhikers Guide to the Galaxy TV show. Neither of these claims are referenced, though, so I've no idea where they came from.

          1. Clueless_Shot

            Re: Max Headroom

            Thanks for a fun to read thread

            I do hope that this can be answered

            I'm curious: years ago, way back in the eighties, reading through one of my fabulous ZX Spectrum magazines, I recall reading about the BBC using the Cray used by the Met Office to render an image of Max Headroom. Does anyone here remember this happening, or were you involved in it? I'm interested to find out as it's bugging the heck out of me. I cannot for the life of me remember how long it took; I seem to think it was a week to render one image?

            Anyone?

  7. Tromos

    Happy days

    I used Cray designed kit throughout most of my early career, loads of CDC 6000 and 7000 series and then Cyber17x. Managed to use (and sit on!) not one but two Cray-1s. Always loved the clean and elegant machine instruction sets. That's probably why I much preferred the Signetics and Motorola microprocessors to the Zilog and Intel ones when the personal computer revolution started.

    1. Michael H.F. Wilkinson Silver badge

      Re: Happy days

      Same here. My first programming was on a CDC 7600 and much later I did loads more serious work on the J932 and SV1e. Cray didn't just provide good hardware, but also cracking good compilers. They could recognize just about the most obfuscated bunch of for-loops as a matrix multiplication and replace the code with some optimized routine from their library. What I really liked about both the SV1e and J932 is how these shared-memory machines managed to attain an average performance of roughly two-thirds of the theoretical peak. Some really nifty scheduling going on, which is VERY hard to accomplish on clusters.
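
      To illustrate the kind of loop nest those compilers could reportedly spot and swap for a tuned library routine, here's a sketch in Python/NumPy; this is nothing Cray ever shipped, just the before-and-after idea.

      ```python
      # The "obfuscated bunch of for-loops" on top, and the optimised library
      # call it amounts to underneath.
      import numpy as np

      def matmul_naive(a, b):
          n, k = a.shape
          k2, m = b.shape
          assert k == k2
          c = np.zeros((n, m))
          for i in range(n):                 # plain triple loop, O(n*m*k)
              for j in range(m):
                  for p in range(k):
                      c[i, j] += a[i, p] * b[p, j]
          return c

      a, b = np.random.rand(40, 30), np.random.rand(30, 20)
      assert np.allclose(matmul_naive(a, b), a @ b)   # same result, library-fast
      ```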

  8. John Smith 19 Gold badge
    Coat

    The cry of all big data apps is

    "Feed me, Seymour."

    "Feed me all night long."

    Time to be gone.

  9. Peter Simpson 1
    Thumb Up

    CYBER 74

    My university took delivery of a previously owned (they got a deal) CDC CYBER 74 while I was there. It was on this machine that I took my required assembly language programming course from the CDC applications engineer who came with the machine. 60 bit words, hardware floating point operations and a "count ones" instruction. "Anyone know why that's in the instruction set?" he asked. No one did. "By special request from a three letter government agency", he replied.

    // much more fun than waiting in line at 3AM for time on the PDP-11

    1. Loud Speaker

      Re: CYBER 74

      The "Count ones" instruction was there because it is used for database joins. (Codd, Date). It also has some use in image processing (not sure what).

      "Report the highest bit set" is generally found in the same context.

      Debugging Compass was not great fun.

      1. oldcoder

        Re: CYBER 74

        It is also good for very fast schedulers. Just assign a bit to signify the queue. On a 60 bit machine, that allowed for 60 priority queues - the first bit set to 1 gives you the queue index.
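
        A toy Python sketch of that bitmap-scheduler idea, with int.bit_length() standing in for the hardware "find highest set bit" instruction; the class and names are illustrative only.

        ```python
        # One bit per priority queue: a set bit means that queue is non-empty,
        # and the position of the highest set bit picks the queue to run next.
        WORD_BITS = 60  # as on a 60-bit word machine

        class BitmapScheduler:
            def __init__(self):
                self.ready = 0                           # one bit per priority level
                self.queues = [[] for _ in range(WORD_BITS)]

            def enqueue(self, priority, task):
                self.queues[priority].append(task)
                self.ready |= 1 << priority              # mark that level non-empty

            def pick(self):
                if not self.ready:
                    return None
                level = self.ready.bit_length() - 1      # index of highest set bit
                task = self.queues[level].pop(0)
                if not self.queues[level]:
                    self.ready &= ~(1 << level)          # clear bit when queue empties
                return task

        sched = BitmapScheduler()
        sched.enqueue(3, "housekeeping")
        sched.enqueue(42, "interactive job")
        print(sched.pick())   # "interactive job" - the highest-priority non-empty queue
        ```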

      2. IvyKing
        Boffin

        Re: CYBER 74

        COMPASS == Complete pain in the ass

        One nice thing about the 6000/7000 series instruction set was that it was really easy to look through an octal dump, compared to what's involved with an x86 hex dump. Also had fond memories of the use of A1 to A5 for reading memory into the associated X register and A6 and A7 for writing to memory.

        Text processing on the other hand was a royal pain.

    2. DaveB

      Re: CYBER 74

      "Cray supercomputers early on featured a population count machine instruction, rumoured to have been specifically requested by the U.S. government National Security Agency for cryptanalysis applications."

      https://en.wikipedia.org/wiki/Hamming_weight

    3. John Smith 19 Gold badge
      Boffin

      "Anyone know why that's in the instruction set?"

      Small side point: I wonder if it uses the bit-counting algorithm outlined in "Combinatorial Algorithms" by Reingold, Nievergelt and Deo, whose complexity varies with the binary log of the word length, takes a fixed number of cycles, and which they "do not give explicitly"?

      As opposed to the other 3 bit-counting methods they happily describe in detail.
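
      Whether or not the hardware did it that way, here's a sketch in Python of a log-of-the-word-length population count of the sort the book presumably means, written for a 64-bit word.

      ```python
      # Divide-and-conquer popcount: each step sums bit counts in fields twice as
      # wide as the previous one, so a 64-bit word takes 6 steps regardless of
      # how many bits are set.
      def popcount64(x: int) -> int:
          x &= (1 << 64) - 1
          x = (x & 0x5555555555555555) + ((x >> 1) & 0x5555555555555555)    # 2-bit sums
          x = (x & 0x3333333333333333) + ((x >> 2) & 0x3333333333333333)    # 4-bit sums
          x = (x & 0x0F0F0F0F0F0F0F0F) + ((x >> 4) & 0x0F0F0F0F0F0F0F0F)    # 8-bit sums
          x = (x & 0x00FF00FF00FF00FF) + ((x >> 8) & 0x00FF00FF00FF00FF)    # 16-bit sums
          x = (x & 0x0000FFFF0000FFFF) + ((x >> 16) & 0x0000FFFF0000FFFF)   # 32-bit sums
          return (x + (x >> 32)) & 0xFF

      assert popcount64(0b10110001) == 4
      ```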

  10. Anonymous Coward
    Anonymous Coward

    Flight Of The Navigator

    <--- this was also an early use of CG.

    Anyone know what it was done on?

    1. Christian Berger

      Re: Flight Of The Navigator

      http://dave.zfx.com/f1.html

      It was the Foonly F1

  11. Bob Merkin

    Typical management

    "Barack Obama issued an executive order in July telling his nation’s technologists to build the world’s fastest and most powerful supercomputer – one exaflop – by 2015."

    Wait until the last minute to give an order, then expect the impossible. I don't think they're going to make it.

    1. Anonymous Coward
      Facepalm

      Re: Typical management

      Well, we did send a man to the moon and return him safely.

    2. Destroy All Monsters Silver badge

      Re: Typical management

      If COMMUNISTS can order up supertech on demand, then SO CAN WE!

    3. PleebSmash

      Re: Typical management

      Came here to quote that. I think it's a misprint by El Reg.

    4. Anonymous Coward
      Anonymous Coward

      Re: Typical management

      Bob Merkin:

      > Wait until the last minute to give an order, then expect the impossible.

      > I don't think they're going to make it.

      The engineers cannae make it Cap'n!

  12. Voland's right hand Silver badge

    I am surprised

    So many comments and none on one of the greatest Cray legacies:

    The best thing someone ever said about virtual memory: Memory is like an orgasm. It's a lot better if you don't have to fake it.

    1. John Hughes

      Virtual Memory

      Memory is like an orgasm. It's a lot better if you don't have to fake it.

      He was just annoyed that ATLAS was delivered before the CDC 6600

  13. GlenP Silver badge

    Memories...

    Back in my first job (mid-eighties) I did get the chance to submit the occasional job to the Cray at a government nuclear research facility*. Unfortunately a promised visit there never materialised so I never actually saw the thing, it was just an address at the end of a PSS connection.

    *Not sure if this would still be covered by the Official Secrets Act!

    1. StuartF

      Re: Memories...

      Glen, same here. I was at AEE Winfrith and submitted code overnight that usually failed. I did manage to visit it. It was a great thrill to sit on the seat as a 22/23-year-old.

      Also, I probably cannot mention where it was, but it began with H and ended with L.

      Now I have written that you may have to kill me!

      1. John Savard

        Re: Memories...

        That would be Harwell.

        1. Christine Munro Silver badge

          Re: Memories...

          My gf's late father worked at the H-place and there was a whole load of analytical printouts lying around here until recently. I have no idea which system spawned them, though, and they meant little to a random Unix-head who doesn't really want to admit the grade awarded to her physics A-level.

  14. Clive Harris

    Back in the 70's, a friend of mine was interviewing someone for a computing job. The man mentioned that he had once worked for "Crays".

    "What?", my friend said, "You worked for Seymour Cray?" .

    "No", he replied, "Ronnie and Reggie" (Kray)

    1. Anonymous Coward
      Anonymous Coward

      Seymour Cray, do you know my name?

      I'm genuinely curious as to what sort of job he was going for such that the interviewer knew who Seymour Cray was *and* yet your friend thought it was a good idea to mention his "experience" with the Kray twins. :-/

      1. Loud Speaker

        Re: Seymour Cray, do you know my name?

        "Nice business you have here, shame if anything were to happen to it"

      2. Anonymous Coward
        Anonymous Coward

        Re: Seymour Cray, do you know my name?

        Maybe he can't tell us the sort of job it was.

        Or who the employer was - although if they just hid behind their post office box 500 anyway, who'd guess?

        (They sound like the perfect customer for a Cray/Kray partnership btw ... )

      3. Clive Harris

        Re: Seymour Cray, do you know my name?

        > I'm genuinely curious as to what sort of job he was going for such that the interviewer knew who Seymour Cray was *and* yet your friend thought it was a good idea to mention his "experience" with the Kray twins. :-/ <

        No, you've got it the wrong way round. My friend was a manager of a computer department, interviewing people for a job there (back in the days when a PDP11 was state of the art). The man who mentioned the Krays was being interviewed by him. I don't know why he mentioned them - perhaps he wanted to demonstrate his ability to work under pressure. Apparently he didn't get the job, and my friend didn't suffer any retaliation.

    2. Anonymous Coward
      Anonymous Coward

      Cray Twins ?

      Back in the eighties when the Met Office had a pair of Krays, I'm pretty sure they were called Ronnie and Reggie.

  15. 45RPM Silver badge

    From the stories on Folklore (which don't mention a Cray), I'd guess that the Apple Cray was used in the design of the Power Macintosh (but I'm far too lazy to find / cite any sources). Seymour Cray, on discovering this, reputedly laughed and said that he was using a Macintosh to design the next Cray. Which only goes to show what an incestuous (and possibly full of bullshit and myth) business we're in.

    1. MondoMan
      Trollface

      "POSSIBLY full of bullshit and myth"

      Ahhh, British understatement...

    2. Ilgaz

      http://www.0x07bell.net/WWWMASTER/CrayWWWStuff/Cfaqp3.html#TOC23

      Cray FAQ says:

      The machines were originally purchased to help out on a computer-on-a-chip project; they eventually earned their keep running MOLDFLOW, an injection plastic modelling program (producing some results in the form of QuickTime movies), and later as a file server. Other applications were CFD codes for disk drive design improvement, and one source reports ".. they sometimes ran the first XMP as a single user MacOS emulator ... They had a frame buffer and a mouse hooked up to the IOP."

  16. Wilseus

    It's a shame

    I've always wondered if we'll ever return to the days of interesting, elegant hardware rather than everything using banks of shitty, off the shelf x86 chips. This doesn't just apply to supercomputing either, but Macs, SGI workstations and games consoles as well.

    Perhaps the pendulum will swing back again sometime in the future and there'll be the next Seymour Cray ready and waiting to come up with something fascinating.

    1. Nigel 11

      Re: It's a shame

      It's become far too easy to mass-produce complexity these days. Also, almost nobody writes in assembly language. That said, the ARM instruction set is quite elegant, and any RISC architecture (just about anything today except x86) owes a lot to Cray who designed the first one.

      1. Anonymous Coward
        Anonymous Coward

        Re: It's a shame

        Worth remembering that even "x86" architectures these days aren't, really. Since the Pentium Pro (and its mainstream sibling, the Pentium II), they've essentially been entirely different designs with a wrapper that converts x86 instructions to the more-easily-optimised-and-reordered native format instructions on the fly.

        Whether they're actually RISC chips is open to question- while I've seen some claim this, I've also seen others dispute that this is actually the case.

        1. Voland's right hand Silver badge

          Re: It's a shame

          Worth remembering that even "x86" architectures these days aren't, really. Since the Pentium Pro

          Nope. The first one was the UMC U5, and it was quite openly declared to be using RISC underneath plus a translator. It was creaming its counterpart, the 486SX, so badly that it was not even funny. I had one - it ran on par with DX2 and DX4 chips. To add insult to injury it ran cold, while Intel chips by that point already required heatsinks and fans. Intel killed them using patents. I have never ever seen Intel move so fast as in that case. Not surprising, as UMC apparently had a DX part and a Pentium killer lined up too.

          This was several years before the Pentium Pro. In fact, I think AMD started using this tech in the K5 and K6 before Intel. I would not be surprised if IBM's Blue Lightning used some of that too. Intel, as usual, innovated through marketing and "interesting" practices before giving up and following. Same as with x86_64, AES instructions, RNG and many other things.

          1. Michael Strorm Silver badge

            Re: It's a shame

            That's very interesting, thanks for the story. Despite vaguely remembering Cyrix from when I was choosing my first PC in the late 90s, I'd never even heard of UMC let alone been aware of that story until now. Strange!

            I should have been clearer though; I was talking about *Intel's* chips specifically and how they hadn't been "true" x86 since the Pentium Pro/II. Specifically, I hadn't been sure about AMD, but guessed they'd probably ended up doing something similar to avoid being backed into the same corner as Intel- your comment on them was interesting.

            Ditto the RISC comment; I don't know if Intel's own underlying designs were or weren't truly RISC, and assuming it was only accessible via the x86 layer in normal use, there's no reason (in theory) they couldn't have used two or several completely unrelated microarchitectures without affecting compatibility.

            (Interestingly, about 15 years ago, before I was aware of the Pentium Pro's background, I remember reading in a textbook about how ludicrously complex the x86 design was becoming with all the legacy cruft and wondering how on *earth* they were able to do anything with all that baggage. Well... I guess they cheated, sort of!)

        2. Alan Brown Silver badge

          Re: It's a shame

          Unsure about Intel, but AMD's K5s were x86 decoders wrapped around their 29000 RISC system.

          1. Anonymous Coward
            Anonymous Coward

            Re: It's a shame

            "Unsure about Intel, but AMD's K5s were x86 decoders wrapped around their 29000 RISC system."

            Takes me back. I remember being shown an AMD board that emulated an 808(0?5?) in bitslice, the idea being that you could just leave the rest of your hardware and software untouched and get an instant several times boost in arithmetic and logic handling.

            It wasn't cost effective for civilian stuff but I wonder if it found its way into any military systems?

  17. This post has been deleted by its author

  18. Christian Berger

    The genius part was to simplify the problem

    I mean that's what distinguishes him from modern day computing company managers. A C64 probably took more engineering effort to design.

    The Cray 1 didn't use any custom silicon. It used generic ECL gates which you could buy in bulk anywhere. It used careful design to get speed out of it. For example every board was designed so the propagation delay was constant. Every line between 2 components, and particularly between 2 boards, was a carefully run transmission line. Every long line between 2 boards had the same length. All of this suddenly makes the problem much easier, as you could count on certain universal preconditions. For example your signal would arrive 1 clock cycle later at the other board because of wire delays. It would _always_ do that and you could count on it.

    ECL also has the nice property of drawing a constant amount of power. That way there are no current transients on your boards, which are a huge problem today. That's why, under most CPUs in modern PCs, you will find a whole battery of capacitors to satisfy the current demands. The Cray just drew a constant current. This also simplified the power supply. It was a simple 6-phase rectifier with a few capacitors after it. The regulation was done externally, with an electromechanical converter turning the line power into 400 Hz 3-phase as well as regulating the output voltage against slow variations of the supply.

    There are 2 talks by him on Youtube. They are worth watching, even if you are not into engineering. He's a rather good speaker:

    https://www.youtube.com/watch?v=vtOA1vuoDgQ

    https://www.youtube.com/watch?v=xW7j2ipE2Ck

    BTW Steve Jobs was a salesperson; Seymour Cray actually designed most of the logic in Boolean equations. So comparing the two is kinda offensive to engineers.

    1. Anonymous Coward
      Anonymous Coward

      Re: The genius part was to simplify the problem

      ISTR that the Cray-1 used phased memory banks to increase main memory access rates. The chips used to implement main memory were much slower than the CPU, and to get around this the main memory was arranged in multiple banks running in different phases; access was switched to each bank, sort of one after the other, as each bank came 'in phase'. Clever stuff, but not really simple, except in concept I suppose.
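
      A toy Python model of that interleaving, purely to show why a unit-stride access stream flows nicely while hammering a single bank stalls; the bank count and busy time here are illustrative (the Cray-1 reportedly used 16 banks with a bank cycle time of about four clocks).

      ```python
      # Consecutive word addresses go to different banks, so sequential accesses
      # can overlap even though each individual bank is much slower than the CPU.
      NUM_BANKS = 16
      BANK_BUSY_CYCLES = 4   # cycles a bank needs before it can be accessed again

      def stream_cycles(addresses):
          """Total cycles to read a list of word addresses, one request per cycle."""
          bank_ready = [0] * NUM_BANKS          # cycle at which each bank is free again
          clock = 0
          for addr in addresses:
              bank = addr % NUM_BANKS                   # low address bits pick the bank
              clock = max(clock + 1, bank_ready[bank])  # stall only if that bank is busy
              bank_ready[bank] = clock + BANK_BUSY_CYCLES
          return clock

      sequential = list(range(64))                      # unit stride: new bank each access
      same_bank  = [i * NUM_BANKS for i in range(64)]   # stride 16: same bank every time
      print(stream_cycles(sequential), stream_cycles(same_bank))   # e.g. 64 vs ~250
      ```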

      1. Christian Berger

        Re: The genius part was to simplify the problem

        Well of course it used all the tricks that were known in processor design back then; after all it was already a vector machine and used pipelining and such. However they avoided problems wherever they could, turning an "impossible" problem into one you can solve with a handful of people in acceptable time. Once you have done that, development will go rather quickly, as you don't have large teams to worry about: just hire good people, let them solve their problems, and the job will get done.

  19. tengel

    Great article; minor correction

    Thanks for the great article remembering the man and his companies that heralded today's technologies (like the 'supercomputer in your pocket' iPhone) ... Regarding "Just one Cray-3 was ever delivered – to the US National Center for Atmospheric Research – but unreliability issues saw it taken out of service." I wish to correct the record: that system, s/n 5, which NCAR called "graywolf" (see https://www.cisl.ucar.edu/computers/gallery/cray/graywolf.jsp), was in use by NCAR until the day after Cray Computer Corporation filed Chapter 11. NCAR also had access to another Cray-3, s/n 7, that was located in the checkout bay at CCC's Colorado Springs headquarters, and used it extensively.

  20. Anonymous Coward
    Anonymous Coward

    General Motors, VW, Ford and others ran millions of jobs/hours on Cray computers to improve car crash safety, and the poor genius died in a car crash... I worked with a lot of good people from Cray, but nobody knew the brand of the car that killed him :-(

    1. linuxguru

      He was driving a Jeep Cherokee or some similar SUV, IIRC. The vehicle which prompted him to swerve was a rust-bucket '70s cookie-cutter - maybe a Chevy or Pontiac. This was reported in the local newspapers at that time (1997).

  21. James Wheeler

    Tried Porting to a Cray Once

    It was in the mid-80s, and the Cray salesman thought he could sell a machine to a certain 3-letter agency if only it had an implementation of the APL language. We had an APL system written in C, and there was a C compiler that more or less worked, so Cray set us up with their porting center.

    Once the code was running we eagerly fired up our benchmark suite, only to discover that the Cray ran them slower than a Sun workstation. The Cray salesman was crushed. Apparently the machine was really designed to run Fortran, and really only got going if you used the vector instructions. The cost of adapting our technology to fit the machine was much larger than we'd ever recoup in software licenses, so that was the end of that.

    No intent to knock Cray here, only to reflect that to get value out of a "supercomputer" you really have to understand what that architecture actually is and what must be done to exploit it.

    1. Nigel 11

      Re: Tried Porting to a Cray Once

      These days you can have your own desktop "supercomputer" that works just like a Cray once did: a big GPGPU board in a PC. If you understand your GPGPU and use codes that vectorize and parallelize well, it'll be hundreds of times faster than if you don't.

    2. nijam Silver badge

      Re: Tried Porting to a Cray Once

      > Apparently the machine was really designed to run Fortran, and really only got going if you used the vector instructions.

      What an irony - APL was a much better language for specifying vector and matrix operations! Of course you'd need a native compiler - C has many good features but fast native vector processing isn't one of them, so it wouldn't make a good intermediate language in this case.

  22. davebarnes

    Apple did have a Cray.

    I saw it close up in the early 90s.

    Seymour Cray was a god.

    I loved the CDC 6600 that I got to use. The PPs were a brilliant idea.

  23. Andrew Tyler 1

    American Idioms

    Is it possible the quotation on page two, if it was spoken rather than written, isn't "[worked] through curves" but "threw curves" as in like a baseball pitcher throwing a curveball, which usually means someone saying or doing something unexpected? It doesn't quite fit with the normal usage, but I don't know what 'working through curves' means either.

  24. zen1

    Comparison to Jobs?

    Throughout my career I've met the likes of both Steves (Jobs & Woz) and Bill Gates. And while I never met Mr. Cray, he was definitely at the top of the list of all the people I've ever wanted to meet. I think the comparison needs to be worded to reflect Mr. Jobs being "the Cray of micro and hand-held computing".

    From what I've heard about Mr. Cray, he was relatively soft-spoken in comparison to the other gentlemen, and was obsessed with building the absolute best. In my humble opinion, in his day he was the epitome of the digital revolutionary, and everything everybody else did was nothing more than the next logical evolutionary step of giving large-scale computing to the masses. I'm pretty much convinced I'll get down-voted for this statement, but Mr. Cray was light years beyond Mr. Jobs, or any of his contemporaries.

    1. John 104

      Re: Comparison to Jobs?

      "Cray’s reputation as a genius – he was as close to Steve Jobs among that community as you could get."

      Shame on whoever said that for being such a mindless sheep.

      Jobs may have been a marketing genius, but Cray actually invented and built things. Jobs and company have been in the improving-someone-else's-technology game for far too long to be considered geniuses or deserving of any other complimentary term.

    2. anonymous boring coward Silver badge

      Re: Comparison to Jobs?

      I have a lot of respect for what Jobs achieved, but to compare him with a technical genius?

      Cray on his own easily surpasses the combined technical abilities of Woz and Jobs (which, let's face it, is about the same as just Woz on his own). Even Woz would probably acknowledge this.

      Jobs was a visionary, and a genius at that. For long-term dedication to a specific goal he can't be beat.

  25. Douglas Crockford

    Unlike Jobs

    Cray, unlike Jobs, actually designed and built this stuff. He wanted to make the fastest computer on Earth. He was not building stuff for the rest of us.

  26. Johnny Canuck

    Build your own

    Anyone wishing to build their own Cray-1 case to house their motherboard and bits can find instructions here -

    http://www.bit-tech.net/modding/case-mod/2010/07/28/cray-1-by-daryl-brach/1

    You can probably find other examples out there on the interwebs as well.

    1. arnieL

      Re: Build your own

      Rather than just a case, how about a cycle-accurate scale replica? http://hackaday.com/2012/01/10/help-chris-boot-his-cray-1-supercomputer/

  27. Deryk Barker

    Pity that

    The article seemed to focus on the wiring and cooling of the CDC6600.

    What still makes that machine so important are its architectural features - it was the first superscalar machine, and in order to support multiple in-flight instructions he invented scoreboarding, a technique still in use today.

  28. Alister

    Unfair comparison

    Like two previous posters, I feel it is a disservice to compare Cray to Jobs.

    Jobs was a great salesman, but not a designer or builder, whereas Cray was all three.

    The Cray legacy is all down to one man, who designed, built, sold and evangelised his products. He had a clear vision of what he wanted to produce, and he himself (with assistance) built, tested and refined the product until it did what he wanted.

    The Apple legacy is much more of a dispersed effort, with Jobs as the figurehead. Jobs knew what he wanted the end product to be, but the realisation of that vision was done by other people.

    1. Sean Timarco Baggaley

      Re: Unfair comparison

      @Alister:

      Jobs' early (pre-Apple) years did involve studying electronics. How do you think he *met* Wozniak? But he also studied design - he had a particular interest in typography - so Jobs combined both design and engineering. This is how he turned Apple around when he returned to the company.

      Contrary to popular belief, the same mind-set that can recognise both good interface design and good engineering isn't limited to electronics or software: Both skills are used when dealing with complex systems and making them work well together. You'll find any number of complex systems and interfaces between them in any major corporation. Indeed, most such corporations fail because of a lack of such mind-sets in their upper management.

    2. jonathanb Silver badge

      Re: Unfair comparison

      I would say that Jobs was more than just a salesman. He had a very clear vision about how computers should work, even if he didn't have the technical knowledge to make it happen.

  29. Richard Taylor 2
    Thumb Up

    Seymour Cray - a REAL programmer (as well as much more) - cue boot - http://www.boo.net/~jasonp/progrmrs

  30. stephanh
    Boffin

    Cray FP != IEEE 754

    Cray floating point arithmetic was notoriously erratic. Most floating point units internally use more precision during calculation to minimize round-off errors, but Cray didn't do that because of performance concerns. The result was rounding errors that would not occur on IEEE 754 FPUs.

    See: http://www.cs.berkeley.edu/~demmel/cs267/lecture21/lecture21.html

    "In particular, this means that if a and b are nearly equal, so a-b is much smaller than either a or b (this is called extreme cancellation), then the relative error may be quite large. For example when subtracting 1-x, where x is the next smaller floating point number than 1, the answer is twice too large on a Cray C90, and 0 when it should be nonzero on a Cray 2."

    This does mean that algorithms written using the IEEE 754 guarantees on floating point rounding behaviour to avoid catastrophic cancellation (e.g. Python's math.fsum) don't work on the Cray.
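
    A small Python demonstration of the contrast on an IEEE 754 machine (Python 3.9+ for math.nextafter); the Cray behaviour quoted above obviously can't be reproduced here, only the IEEE side.

    ```python
    import math

    # 1 - x with x just below 1.0 is exact under IEEE 754; the quote above says
    # it came out twice too large on a C90 and zero on a Cray 2.
    x = math.nextafter(1.0, 0.0)      # the double immediately below 1.0
    print(1.0 - x)                    # 1.1102230246251565e-16, i.e. exactly 2**-53
    print(1.0 - x == 2.0 ** -53)      # True

    # Correctly-rounded summation (fsum) vs. naive left-to-right accumulation.
    values = [1.0, 1e100, 1.0, -1e100] * 10_000
    print(sum(values))                # 0.0  - the small terms are lost
    print(math.fsum(values))          # 20000.0 - exact
    ```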

    1. John Smith 19 Gold badge
      Boffin

      Re: Cray FP != IEEE 754

      "his does mean that algorithms written using the IEEE 754 guarantees on floating point rounding behaviour to avoid catastrophic cancellation (e.g. Python's math.fsum) don't work on the Cray."

      Unfortunately the first meeting of the working group for it was in 1977, and the Cray 1 came out in 1975.

      The first version of 754 was published in 1985.

      There is a story Cray users ran programs twice to compare results due to FP concerns.

      1. Destroy All Monsters Silver badge
        Holmes

        Re: Cray FP != IEEE 754

        Speaking of floats, does anyone know whether there are implementations of the UNUM floating point representation out there in the HPC world?

  31. John Savard

    Appropriate

    Remember, Apple manufactured millions of Macintosh computers and sold them. So saving even a fraction of a cent here or there in the design was worthwhile, and it made sense to use a supercomputer to optimize the design of a Mac. On the other hand, while Cray's computers were bigger and more powerful, they were made in small quantities - they needed to be designed to work properly, but not to be one-tenth of a cent cheaper to build. So a Macintosh might well have been adequate for the purpose of assisting Seymour Cray in that task.

  32. Anonymous Coward
    Anonymous Coward

    Germanium transistors - really?

    The only really fast switching germanium transistors I remember had a very short life - around 50 hours or so.

    I seem to recall (and am very open to correction) that Cray used ECL - emitter coupled logic - which is nonsaturating and so uses the maximum possible speed of the transistors. IBM and others used first RTL and then TTL, which saturates the base/emitter junction and so achieves a good voltage swing but is slow due to the need to get the carriers out of the base region to switch. Of course, eventually CMOS scaled to the point at which it was faster than TTL and used less power than ECL, and the rest is the history of portable computers and mobile phones.

    IBM were also very conservative with their designs to ensure reliability. They sold systems of which the computer was only one element, so it was heavily value engineered.

    Can anybody expand on this and provide a more correct explanation of Cray's switching speed superiority?

    1. This post has been deleted by its author

    2. Destroy All Monsters Silver badge

      Re: Germanium transistors - really?

      Actually, this being about 1962, the transistors at the time were germanium. When Cray ran into trouble getting the multiplier circuits for the CDC 6600 to work, he looked for something better, and Fairchild Semiconductor had just started marketing FASTER transistors in silicon. With these, progress could be made just as the demands of business started to close its jaws around the project.

      1. IvyKing

        Re: Germanium transistors - really?

        I'm very sure that the CDC 6600 was designed using silicon transistors; the germanium machine was the CDC 1604, which also pioneered the use of a peripheral processor. IIRC, the CDC 3600 was the 1604 redone with silicon transistors. FWIW, the Smithsonian Air & Space museum by Dulles has a CDC 3800 (dual processor 3600).

  33. Retired old bloke

    Great hardware, but ...

    The first three Cray 1s were the _very_ fastest - at the time: they didn't use ECC like the later ones. The Los Alamos machine was shipped elsewhere (Livermore?), then to Daresbury labs (UK), then to the University of London Computing Centre. I worked with it at those two sites. The application software they shipped with it was pretty crap - the CFD code we were running ran faster on a CDC 7600 than on the Cray, initially. We (at QMC, London, & a colleague at Harwell) took the Cray's FFT software to pieces & totally reworked it so that it ran OK on the Cray. We then offered the FFTs back into the community under the old pals act. Some six months later Cray were shipping the improved code in their application packages. Some months after that, at QMC, we received a letter from the US State Department stating that we were running un-approved (for export) software, and would we desist. Prof Leslie, our head of department, got in touch with Harwell, worked out how much time we had all spent getting the FFTs to run OK, and sent a letter back to the State Dept along the lines of

    "Thank you for drawing this to my attention with respect to the software we developed. My staff and colleagues spent so-and-so much time on this software and I enclose an invoice for £XXXX. I await your payment." The letter was longer than that of course & I can't remember what the £XXXX sum was - something in the region of £16K or so. In any event, we never heard another peep from any US Govt. department.

  34. SirLurksAlot

    Love The Love Seat

    Some serious flops in its day.

  35. Dick Head

    Not everyone loved the Crays.

    I worked on a competitor's AP product in the early 1980s whose development was entirely funded by Shell in Houston. They didn't like having to run their geophysics jobs at least twice to ensure no undetected errors got into the calculation. Our product was never widely successful in the long run, but nobody really cared too much because the development costs were largely covered.

  36. T. F. M. Reader

    Nitpicking

    Before Roadrunner there was Blue Gene. I am really surprised it is not mentioned, all the more so as even the current Top500 list has 3 Crays and 4 Blue Gene/Qs in the top 10, and that's a hell of an achievement for both Cray and IBM.

    And Roadrunner was based on Cell processors (i.e., PowerPC cores), not AMD as the article claims.

  37. John Smith 19 Gold badge

    For those interested in Seymour Cray's background

    There's a veterans' group of old staff from the companies he worked for:

    http://vipclubmn.org

    One of Cray's lesser-known achievements was using a computer (back then hugely expensive) to help manufacture more of them reliably, developing some of the first CAE software to take care of the mental "grunt work" of logic equation checking, front panel layouts, etc. Obvious now, but very bold for the early 1950s.

    One of the history files points out this was possible because a secret government customer was willing to support the effort. This was long before CDC existed.

  38. Chris Gray 1
    Happy

    Fun days

    As part of a job, I visited the Cray HQ once. I was impressed by the carpets - they were done in a Mandelbrot Set pattern - that must have cost a pretty penny!

    I specifically looked for a Cray-1 through the windows in the hallway I walked past, but didn't see one - just lots of rectangular boxes. I was disappointed.

    Later, in a successor job, I was doing work making a pthreads library to run on the Tera MTA (Tera later bought Cray, then renamed itself to Cray, if I recall correctly). I had a remote login for testing to the MTA machine in SDSC - great fun.

    Small groups, working out of the mainstream, produce the most interesting things!

    No-one has mentioned the story about a Cray machine that was shipped to an unnameable USA agency. The delivery instructions were something along the lines of "put it in a semi; leave the trailer here on this night; pick up the trailer two days later".

  39. D Moss Esq

    Ferrous drape sales

    It’s all reminiscent of the early days of Cray in the 1970s and 1980s, when Cray’s eponymously named systems were for friends and Cold War allies only. Supercomputers were on a list of technologies whose export to foreign powers was tightly controlled by Washington DC ... In the mid-1980s, the CIA reckoned (PDF) that the purchase of a single Cray-1 could have doubled the total scientific computing power available to their ideological enemies in the USSR.

    I remember newspaper reports of a Control Data machine being sold to the Russians. They didn't have any dollars to pay for it with. They bartered for it with ... Christmas cards, presumably quite a lot of them.

    Can't find a link in any of the comics I used to read – Computer Weekly, Computing, Stop Press – but Google turns up this link, which includes Pepsi-Cola's sale of concentrate to Hungary in return for film distribution rights, but not the sale of ditto my friend C_________ did for dried onion soup.

    1. Anonymous Coward
      Anonymous Coward

      Re: Ferrous drape sales

      One may remember Thierry Breton's "Softwar" thriller, whereby a "Craig-1" is sold to the Soviets with a "logic bomb" cunningly included in the software package (take-home advice: install from scratch, remove crapware, go open-source, audit and keep the debugger handy). This book seems to be un-history, with scant data on Google. It got me interested in CS and Russian Ladies Doing Science. Success in these two subjects is very lopsided.

  40. John Savard

    Seymour Cray's accomplishments

    The Univac 1103 was one of his early accomplishments. And the CDC 6600 certainly was an impressive accomplishment for its day.

    But the CDC 6600 was just a big computer - the same way that the NORC, the LARC, and the STRETCH before it were big computers. IBM, with the 360/91 and then the 360/195, significantly improved on Cray's design - so the Pentium and its descendants largely follow the lead of the 360/195. Some say that the 6600 was the first out-of-order machine, even if IBM, with the 91, later perfected that, but the STRETCH also was out-of-order to a limited extent.

    The Cray I, however, was a unique design; its use of vector registers meant that it far outclassed previous attempts at vector-oriented supercomputers. The design was widely copied, for example for add-ons to the Univac 1110, the IBM 3090, and the VAX.

    So I think that the Cray I stands as where he made the most original and greatest contribution to the science of computers.

    As for the Roadrunner and similar machines - thanks to Moore's Law, specialized hardware was eventually beaten by simply throwing large numbers of commodity microprocessors at the problem. Good interconnects between the many parallel processors are valuable and require real design and engineering expertise - Cray himself, just before he died, tried his hand at this new era - but the credit for this type of computer taking over lies in the gradual improvement of microprocessor chips, not primarily in any ingenuity on the part of supercomputer designers, at IBM or anywhere else.

    1. lambda_beta
      Linux

      Re: Seymour Cray's accomplishments

      How can you even mention Jobs and Cray in the same article? Jobs was, at best, a marketing guy, and had nothing to do with any computer technology or innovation. Even the Mac was a rip-off of the Xerox Star. Cray was a true pioneer in innovation, design and implementation.

    2. nijam Silver badge

      Re: Seymour Cray's accomplishments

      > IBM, with the 360/91 and then the 360/195, significantly improved on Cray's design

      In other words (as well as all the clever vector hardware etc.), he invented loads of stuff that quickly became commonplace in mainstream mainframes (OK, maybe not the IBM 360 series itself)?

  41. Anonymous Coward
    Anonymous Coward

    A review of The Supermen: The Story of Seymour Cray and the Technical Wizards behind the Supercomputer - a book I remember reading sometime back in the nineties.

    Note to uni profs: before you throw jargon on the blackboard ("CDC6600" huh? wuh?), please add a genealogical tree, however imprecise. Freshmen and even grad students have a warped notion of the time and effort spent on this kind of work, of where ideas come from and how they are developed, and of the network of businesses and geographical "hot spots" that make development possible - they actually think technology springs fully formed out of white-collar office worker dens.

  42. dc_m

    Why no Google Doodle?

    Google seem to celebrate everything else! Seems a little odd given the connections.

  43. Anonymous Coward
    Anonymous Coward

    CDC 1604 - the world's first fully transistorized computer? Not!

    The article states:

    "Cray left the Navy to form the first of four companies in 1957 – Control Data Corp. Here he built the world’s first fully transistor-based computer, the CDC 1604."

    The Harwell CADET of 1955 was fully transistorized, so the CDC 1604 was a bit late to the party given that the first one was delivered in 1960:

    https://en.wikipedia.org/wiki/Harwell_CADET

    Various other transistorized computers had been completed from 1953 onwards, but they'd all used a small number of valves - mostly in the clock circuitry, from what I've just read.

    The CDC 1604 (first one delivered 1960) has a valid claim to be called the second commercially successful fully transistorized large scale computer, after the IBM 7090 (first one delivered 1959):

    http://www.thocp.net/hardware/mainframe.htm

  44. Stevie

    Bah!

    An article on Seymour Cray with no mention of his chief competitor.

    I can understand the omission of WOPR, but to completely ignore the Gibson?

  45. Anonymous Coward
    Anonymous Coward

    Jobs??

    Don't compare a Creep to a God - the least you could have done is say Woz.

  46. botnet3

    To set the record straight - Apple did have a Cray, at least in the early 90s. Not sure when it was acquired.

  47. Anonymous Coward
    Anonymous Coward

    Seymour is gone but Steve Chen is still going strong!

    Steve Chen is the other "Super-Brain" from the glory days of Cray. Chen's Cray X-MP actually beat Seymour Cray's original architecture. Steve is still going strong, driving a vision of a universal health care cloud:

    http://english.cw.com.tw/article.do?action=show&id=12054

  48. Anonymous Coward
    Anonymous Coward

    Still the same headline?

    Given how much The Reg seems to love the late Steve Jobs, it makes you wonder how much they love/admire Seymour Cray? Not a lot?

  49. Ian Joyner Bronze badge

    Bob Barton was the original 'Think Different' computer architect

    Bob Barton was the architect of the Burroughs B5000 - a computer so far ahead of its time that it is still ahead in 2015 (as Unisys ClearPath Libra MCP systems).

    In the early 1960s, Barton was sick of seeing electronic circuit designers design instruction sets and then turn them over to programmers to make something out of the mess. He instead had the software people design the instruction set and turn that over to the electronics people. Thus the B5000 had things like good support for high-level languages (ALGOL was the system programming language, long before the lower-level C came along as a backward step) and automatic memory management. The most significant thing is probably single-level memory, where registers, cache, RAM and virtual memory all looked like one level of memory to the programmer (even to most system programmers). This was the first commercial implementation of virtual memory - the idea having come from Manchester University, where Turing worked - and Turing machines have one level of durable memory.

    We can also attribute Reverse Polish Notation to Barton, as used by Hewlett Packard, which employed a lot of ex-Burroughs engineers.
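
    For anyone who hasn't met it, RPN puts the operator after its operands, so an expression can be evaluated with nothing more than a push-down stack - which is why it suited a stack machine like the B5000 and, later, HP's calculators. A minimal, integer-only sketch (names and limits are just for the example):

        /* Minimal RPN evaluator sketch: "3 4 + 2 *" evaluates to 14.
         * Integer-only and deliberately simple; a stack machine does the
         * same push/pop dance in hardware. */
        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        int main(int argc, char **argv)
        {
            long stack[64];
            int top = 0;

            for (int i = 1; i < argc; i++) {
                const char *tok = argv[i];
                if (strcmp(tok, "+") == 0 || strcmp(tok, "-") == 0 ||
                    strcmp(tok, "*") == 0) {
                    /* operator: pop two operands, push the result */
                    if (top < 2) { fprintf(stderr, "stack underflow\n"); return 1; }
                    long b = stack[--top];
                    long a = stack[--top];
                    stack[top++] = (*tok == '+') ? a + b
                                 : (*tok == '-') ? a - b
                                 :                 a * b;
                } else {
                    /* operand: push it (no overflow check - it's a sketch) */
                    stack[top++] = strtol(tok, NULL, 10);
                }
            }

            if (top == 1)
                printf("%ld\n", stack[0]);
            return 0;
        }

    Run it as, say, ./rpn 3 4 + 2 '*' (the * quoted to keep the shell out of it) and it prints 14 - no parentheses, no precedence rules, just a stack.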

    Bob Barton was certainly a model for Steve Jobs' Think Different. Barton went on to teach at the University of Utah, where his students included Alan Kay, who went on to invent the window and other things at Xerox PARC - work that was later integrated into Apple's efforts, since Xerox dropped the ball.

    https://en.wikipedia.org/wiki/Robert_S._Barton

    http://www.computer.org/csdl/proceedings/afips/1961/5058/00/50580393.pdf

    http://www.computer.org/csdl/proceedings/afips/1963/5062/00/50620169.pdf

  50. Sacioz

    Steve Jobs

    Steve Jobs, Steve Jobs, Steve Jobs... just a show pony! Wozniak was/is/will always be the icon to adore, along with Simon et al.

    1. Ian Joyner Bronze badge

      Re: Steve Jobs

      No, Jobs really got what computing for everyone was. He saw that computers would come out of their security glass rooms.

      That's not to deny what a brilliant engineer Woz is, but I don't think he got it in the same way Jobs did.

  51. Anonymous Coward
    Anonymous Coward

    Yes, on a Macintosh

    Yes, Seymour did do the Cray-3/4 design on a Mac.... in text and by hand.

    To say that Seymour had the "big picture" would be an understatement.

    The problem with the Cray-3 was not so much the GaAs logic circuitry as the Si-based static RAM. The first supplier had little experience with SRAM and produced parts which performed poorly. Ultimately a deal was cut with an overseas supplier with a clue about memory, and the parts worked. Keep in mind that we were buying bare die, something probably no other company in the world was doing. All our chips were bare die, and we wire-bonded, placed and pinned them to the boards ourselves.

    To say that CCC was "vertically integrated" would also be an understatement. We made every piece, from the circuit boards, to the pins which held them together in stacks, to the chips on each board in each layer of every stack.

    My experience working at CCC was immensely enjoyable if not financially rewarding. I lament the fact that we ran out of cash before the Cray-4 was completed. But I relish the challenge I faced when Seymour said "Make it simpler". And I did.
