Microsoft and Rambus will get schwifty in quantum-cum-cryogenic computation collab

Microsoft and Rambus have announced "an expanded collaboration" to develop prototype systems that can optimise memory performance at cryogenic temperatures. The pair first joined forces back in 2015 to investigate new computer architectures in anticipation of the Moore's Law apocalypse. The concern is that demand for real-time …

  1. Dave 126 Silver badge

    And for anyone who clicked on this article for the headline, the first episode of Rick and Morty Season 3 was broadcast on the first of April. Rest of the season to follow later this summer.

    For those slow on the uptake, imagine a cross between Doc Brown and Futurama's Professor - but alcoholic and misanthropic - careering around an HHGTTG universe. Yeah, it's great.

    1. frank ly

      There's an amusing fifteen-second vignette at the end of every episode. My OCD habit of watching all the end credits has paid off with this series.

    2. Anonymous Coward
      Anonymous Coward

      wide band

      there's this gem, but on the flip side, the idea of a portable universe that can spew out energy you didn't put in when you created it is kinda daft. I know it's just a plot device and I guess that makes me That Guy but oh well.

  2. Anonymous Coward
    Anonymous Coward

    Rambus still going

    Wow, there's a surprise.

    1. ecofeco Silver badge

      Re: Rambus still going

      Rubba dubba Rambus!

  3. Anonymous Coward
    Anonymous Coward

    Why WHY WHYYYY..

    would you get into bed with one of the original patent trolls after the great DRAM fiasco? I know the first trait I look for in a joint-venture partner is one who got caught filing patents behind their partners' backs and then suing them for infringement back in the P4 era.

    as reported here:

    https://www.theregister.co.uk/2001/03/18/racketeer_act_enters_rambus_infineon/

    1. Flocke Kroes Silver badge

      Re: Why WHY WHYYYY..

      Before RDRAM, memory manufacturers had come up with several different replacements for SDRAM. Intel decided to make their CPUs compatible with none of them. The message was clear: you spend money on R&D and Intel will ensure it is money down the drain. The unique selling point of RDRAM was the patents. In return for creating a market for RDRAM, Intel could use the patented interface to connect their CPU to the north bridge chip. Everyone else making north bridge chips could be bled dry with patent royalties by Intel while Rambus trolled the memory manufacturers.

      For some reason, memory manufacturers did not immediately leap under the bus. RDRAM did not arrive in quantity until months after Intel paid $1B to Micron. Even then the speed was terrible because RDRAM was difficult to manufacture, and the latency was appalling because RDRAM was defective by design. As an added bonus, Intel CPUs could only operate RDRAM at a few different speeds, which did not match the speeds that could be manufactured in quantity, so the CPU had to select the next speed down - which turned out to be a big step down to the lowest available setting.

      All this was abundantly clear in advance to the techies that lived through it. Some PHBs got burned by the memory translation hub fiasco, but I am not convinced that many of them understood exactly what was defective or why Intel had been so determined to create a spectacularly bigger cock-up than the FDIV bug.

      The obvious answer to 'why WHY WHYYYY..' is that Rambus have tremendous expertise in selling bullshit to PHBs, and the PHBs at Microsoft do not understand just how badly Intel got crippled and burned (although - to be fair - AMD managed to let that huge opportunity drag painfully past them).

      Get your popcorn, sit back and get ready for a giggle. A mosquito has got into bed with a leech. I would like to think that Rambus can deliver an even bigger train wreck than RDRAM, but I doubt Microsoft will commit themselves to filling a black hole with cash quite as enthusiastically as Intel did.

  4. Throatwarbler Mangrove Silver badge
    Joke

    There's a better cooling solution

    Datacenters. In. SPAAAAAAAAACE!

  5. This post has been deleted by its author

  6. tedleaf

    I've still got plenty of 128MB sticks if they need some more Rambus!!!

  7. Nolveys
    Gimp

    Microsoft and Rambus

    Cryogenic RAM for interfacing with quantum computers and a log attached to the end of an industrial pile driver for interfacing with customers.

  8. Anonymous Coward
    Anonymous Coward

    Eye roll...

    This is supposed to be a power/density play - rather than bothering with memory at all, first work out how to reliably maintain cryogenic temperatures at scale in a fashion that doesn't immediately negate whatever power/density savings you achieved.

    I'll wait...

  9. John Smith 19 Gold badge
    WTF?

    supercomputers <> superconductive.

    4K is the territory of niobium-tin alloys. Low-level work has gone on with low-temperature superconducting systems for decades, at SSI levels, for specialist systems.

    Those readers who operate around big data centres know that plumbing large amounts of chilled air through a server farm is non-trivial.

    Anyone who thinks it will be easier and less hassle to do with liquid cooling should look at the IBM 3090 (water-cooled) or the Cray-1s (CFC-cooled).

    But these guys seem to be talking liquid helium or (best case) nitrogen.

    Saving energy by radically increasing the amount of energy you have to remove from the coolant before use? Is anyone else getting a serious bovine whiff coming off this "con-cept"?
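    To put rough numbers on that whiff: the Carnot limit alone sets a floor on the cost of lifting heat out of a cryostat, before any real-world cryocooler inefficiency. A back-of-envelope sketch, assuming round figures of 300 K ambient, 4 K for liquid helium and 77 K for liquid nitrogen (illustrative numbers, not from the article):

      \mathrm{COP}_{\mathrm{Carnot}} = \frac{T_{\mathrm{cold}}}{T_{\mathrm{hot}} - T_{\mathrm{cold}}}

      T_{\mathrm{cold}} = 4\,\mathrm{K},\; T_{\mathrm{hot}} = 300\,\mathrm{K}:\quad
      \mathrm{COP} = \frac{4}{296} \approx 0.0135
      \;\Rightarrow\; \text{at least} \approx 74\,\mathrm{W} \text{ of input work per watt of heat removed}

      T_{\mathrm{cold}} = 77\,\mathrm{K} \text{ (liquid nitrogen)}:\quad
      \mathrm{COP} = \frac{77}{223} \approx 0.35
      \;\Rightarrow\; \approx 2.9\,\mathrm{W} \text{ of input work per watt of heat removed}

    Real cryocoolers run at a small fraction of Carnot efficiency, so the practical overhead at 4 K runs to hundreds or thousands of watts of wall-plug power per watt of heat removed, which is the gist of the objection above.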

  10. jason.bourne
    Terminator

    What?

    What datacenter is going to facilitate cryogenic cooling?

    Why am I reading my own question with the CinemaSins voice?

  11. Anonymous Coward
    Anonymous Coward

    RAMBUS

    I remember them fondly from the past, in the same manner I look back and remember measles.
