RIP HPE's The Machine product, 2014-2016: We hardly knew ye

HPE lab boffins have finally, after years of work, built a proof-of-concept prototype of their fairytale memory-focused computer, The Machine. The achievement is bittersweet. A decision has been made within Hewlett Packard Enterprise management to step back from manufacturing and selling the much-hyped fabled device as a …

  1. Dan 55 Silver badge
    Facepalm

    Driving Innovation to Product

    So much for that slide.

    HPE, RIP.

    1. Rosie Davies

      Re: Driving Innovation to Product

      That seems a little harsh. It appears that HP was approaching The Machine as a test bed for a variety of different approaches rather than an end in itself, which seems sensible enough, prototypes and all that.

      Hopefully some of what's been found out will turn out to be useful and find its way into boxes that can be bought. HP used to be good at that sort of thing.

      Rosie

      1. asdf
        Facepalm

        Re: Driving Innovation to Product

        >The Machine as a test bed for a variety of different approaches rather than an end in itself, which seems sensible enough, prototypes and all that.

        Which is fine if it stays internal to R&D, but you let marketing folks and senior management run with it, and even the engineers start making public promises, and then the next thing you know the world laughs at you à la Itanium (which at least delivered something, even if it was underwhelming).

    2. Anonymous Coward
      Anonymous Coward

      Re: Driving Innovation to Product

      It says "Driving Innovation to Product", as opposed to "Driving Innovation to a Product"; the extra "a" makes a big difference.

      The demise of The Machine as a product is a bit of a shame, if not unexpected. Timescales were drifting outwards and the R&D spend was no doubt considerable. Shame we won't see it ship, though.

  2. Anonymous Coward
    Anonymous Coward

    Dead alongside its creator

    I don't think anyone is surprised this is officially toast.

    It was always a long-running joke anyway.

    But now it's blown to the four winds, much like the rest of HPE.

  3. John Smith 19 Gold badge
    Unhappy

    So some fast processors + biggish RAM + bigger chunk NVM with quick sharing by others.

    Not really seeing what the fuss is about.

    And it seems neither does HP.

    In the mid '80s, Chapel Hill did some student chips called "Smart memory", with fine-grained processing built close to small chunks of RAM for immediate access.

    30 years later that still looks quite advanced.

    This does not.

    1. Naselus

      Re: So some fast processors + biggish RAM + bigger chunk NVM with quick sharing by others.

      The idea was that it wouldn't have RAM or storage, just enormous quantities of NVRAM operating in both roles. And the processors would be specialized hardware cores, hundreds or thousands of them, all sharing the same memory. So from an architecture point of view, it requires a big rethink of how a computer actually works.

      So yeah, it was actually a bit more interesting than you're making out. Not massively original, though; it's a fairly obvious idea once you have NVRAM, and just needs the details worked out. The Machine was an attempt to do that working out.
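
      To make the single-level-store part of that concrete, here's a rough C sketch. It's purely illustrative: an ordinary mmap()'d file stands in for NVRAM, and the path, layout and sizes are all made up. The point is that state is simply updated in place and survives restarts, with no separate load-from-storage or save-to-storage step.

      /* Stand-in for an NVRAM pool: an ordinary file mapped into the
       * address space. On The Machine this would be real non-volatile
       * memory; here it is only a demo. */
      #include <fcntl.h>
      #include <stdint.h>
      #include <stdio.h>
      #include <string.h>
      #include <sys/mman.h>
      #include <unistd.h>

      #define POOL_PATH  "/tmp/fake_nvram.pool"   /* hypothetical pool file */
      #define POOL_SIZE  (1u << 20)                /* 1 MiB is plenty here   */
      #define POOL_MAGIC 0x4e56u                   /* "NV" marker            */

      struct pool_header {
          uint32_t magic;     /* set once: tells us the pool is initialised */
          uint64_t restarts;  /* state that persists because the pool does  */
      };

      int main(void) {
          int fd = open(POOL_PATH, O_RDWR | O_CREAT, 0600);
          if (fd < 0 || ftruncate(fd, POOL_SIZE) != 0) return 1;

          struct pool_header *hdr = mmap(NULL, POOL_SIZE,
                                         PROT_READ | PROT_WRITE,
                                         MAP_SHARED, fd, 0);
          if (hdr == MAP_FAILED) return 1;

          if (hdr->magic != POOL_MAGIC) {   /* first run: "format" the pool */
              memset(hdr, 0, sizeof *hdr);
              hdr->magic = POOL_MAGIC;
          }

          /* The "database" is the memory itself: update it in place. */
          hdr->restarts++;
          printf("pool has seen %llu runs\n",
                 (unsigned long long)hdr->restarts);

          /* On real NVM this would be a cache-line flush/persist barrier. */
          msync(hdr, sizeof *hdr, MS_SYNC);
          munmap(hdr, POOL_SIZE);
          close(fd);
          return 0;
      }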

      1. John Smith 19 Gold badge
        Unhappy

        Re: So some fast processors + biggish RAM + bigger chunk NVM with quick sharing by others.

        "The idea was that it wouldn't have RAM or storage, just enormous quantities of NVRAM operating in both roles. "

        So, single-level storage. Fine if you can get the price/bit of NVM down to DRAM's at the same access speed. As for the "large number of processors", SP2s of the late '90s were running up to about 64K processors, and IIRC hundreds of thousands of processors are certainly known today.

        As for "be specialized hardware cores" aren't all processors SOC's and so (to a certain extent) "specialized"? If you mean specialized to individual companies servers then that would make each block of machines sold unique to their customer.

        "Persistent" storage was available 40 years ago.

        It's called "core."

        NVM as the main memory for a processor node is somewhat bold. So how does it compare to flash?

  4. Anonymous Coward
    Anonymous Coward

    Think this through to endgame

    There is far more revenue and profit potential in mainstreaming this technology in the existing product lines than there ever was in bringing a boutique "The Machine" to market in a narrow segment.

    Likewise the decision to truly open Gen-Z to the industry rather than just developing it as the proprietary ASICs of a boutique product line.

    The questions, then: has HPE anticipated technology's direction well enough? How does the industry reconfigure for shared access to byte-addressable storage-class memory? When does the memory industry actually deliver the ten-millionth affordable storage-class memory die? And who ends up with the lion's share of the revenue? The next decade will be really, really fun.

    1. asdf

      Re: Think this through to endgame

      Which would be fine if it were HP selling the storage-class memory, but it's not. The memristor always was the equivalent of the gyroball in baseball (technically exists, but more bullshit and myth than anything).

      1. Steve Chalmers

        Re: Think this through to endgame

        There are at least three basic technologies, all persistent/nonvolatile, competing for the brass ring of replacing DRAM in the 2020s. Memristor by itself is no longer one of those three, but a descendant is. (ReRAM, or resistive RAM, is a category which includes memristor and several related technologies.)

        Concur that what matters is who gets the revenue/margin. I honestly don't know what the market will look like in 2025, much less what share of what segment of that market HPE will have. But HPE invested in a very long view of driving technology for that era, not just following the herd, and we should respect that choice and keep an open mind for the medium and long term consequences of that choice.

  5. Dave 126 Silver badge

    Just for fun:

    Here's HP's Star Trek tie-in teaser trailer for The Machine:

    https://www.youtube.com/watch?v=y3sHh6CsN7c

    [To be read in your best movie-trailer voice:]

    At the beginning of the 21st century, the earth needed to find a way to keep up with the data from over 30 billion connected devices, which changed the basic architecture of computing. This year, Hewlett Packard Enterprise will preview [dum dum dum!] The Machine

    1. Doctor Huh?

      Re: Just for fun:

      No, the Star Trek tie-in that applies here is just this:

      "It's dead, Jim."

  6. Steve Davies 3 Silver badge
    Holmes

    Welcome to 'The Machine'

    Oh wait, it has been canned.

    Elementary, my dear 'Watson'.

    1. Steve Knox

      Re: Welcome to 'The Machine'

      Come in, HP, boy, have a cigar...

  7. Mark York 3 Silver badge
    Big Brother

    Did Samaritan Win Out After All?

    How sad, not a single Person of Interest reference.

    1. Dave 126 Silver badge

      Re: Did Samaritan Win Out After All?

      If you skip back to The Reg's previous article about The Machine, you'll see me recommending that excellent television series. It really gets going (and then some!) halfway through the third season. The first season, whilst having a gentle overarching plot, is largely a 'monster of the week' police procedural.

  8. Anonymous Coward
    Anonymous Coward

    The Machine was obsoleted by Intel's Purley Platform

    The memory-centric design of The Machine was obsoleted by the memory-centric capabilities of Intel's upcoming Purley servers. Purley's Apache Pass memory architecture with 3D XPoint DIMMs will provide extreme memory density (6TB/socket) without the need for fabric-connected DRAM.

    I do not know where or how HPE will apply concepts from The Machine, except perhaps in their Superdome X platform to address scaling Intel Xeon beyond 8 sockets.

    1. Steve Chalmers

      Re: The Machine was obsoleted by Intel's Purley Platform

      It's not that simple.

      A large pool of (storage class, nonvolatile) memory can be built either by putting some in each of many servers, or with memory boxes somewhat analogous to today's disk arrays.

      However, a key point of sharing byte-addressable storage-class memory like this is, well, accessing it directly, inline in user-space code. Like a DAX space, but shared at the rack level (or larger). Not calling an RPC, not calling a storage stack, just reading and writing (subject to access controls, of course).
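
      For what it's worth, the programming model looks roughly like the sketch below, with an ordinary mmap() doing the work. The path is invented; on Linux today a DAX-mounted file or a /dev/dax device would play the role of the fabric-attached memory, and the access controls would sit in the fabric rather than in this code.

      #include <fcntl.h>
      #include <stdint.h>
      #include <stdio.h>
      #include <sys/mman.h>
      #include <unistd.h>

      struct shared_record {
          uint64_t sequence;   /* bumped by the writer, polled by readers */
          char     note[56];   /* payload read directly by anyone mapped  */
      };

      int main(void) {
          /* Hypothetical name for a shared, byte-addressable segment. */
          int fd = open("/mnt/fam/segment0", O_RDWR | O_CREAT, 0600);
          if (fd < 0 || ftruncate(fd, sizeof(struct shared_record)) != 0)
              return 1;

          struct shared_record *rec = mmap(NULL, sizeof *rec,
                                           PROT_READ | PROT_WRITE,
                                           MAP_SHARED, fd, 0);
          if (rec == MAP_FAILED) return 1;

          /* The "write path" is just a store: no RPC, no block I/O stack.
           * A reader mapping the same segment sees it with a plain load. */
          snprintf(rec->note, sizeof rec->note, "written in user space");
          __sync_synchronize();            /* crude ordering for the demo */
          rec->sequence++;

          printf("record %llu: %s\n",
                 (unsigned long long)rec->sequence, rec->note);

          munmap(rec, sizeof *rec);
          close(fd);
          return 0;
      }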

      Another key point is that the limit on the number of DRAM DIMMs in a server today is far too low, and the reach of DRAM connections is far too short, for memory to replace rather than merely supplement storage.

      Intel is a smart, resourceful company, but Purley was developed to run with today's software, not in the future software world The Machine envisions. So while Intel has the Cray PGAS software to draw from, and could probably share the storage within a multi-socket server over QPI or its successors, there is no indication of user-space (inline) access over Omni-Path.

  9. Deltics

    Am I the only one...

    ...finding my nostalgia centre unexpectedly stimulated by those pictures of boards festooned with high-density, high-chip-count daughterboards...?

    Takes me right back to 1990s PCs. Ironically. :)

    1. Mpeler
      Mushroom

      Re: Am I the only one...

      Reminds me of the beginning of "Soul of a New Machine", where Tom West went into a data center where a new DEC machine was being installed, opened it up, counted the boards, and estimated the parts and the resulting cost and complexity of manufacturing it. DG's goal was a one-board CPU...

      From "Soul of a New Machine" to "sold off a new machine" (sort of). Sad.

      RIP, HP Labs and the HP Way.

      1. Anonymous Coward
        Anonymous Coward

        Re: Am I the only one...

        Aye, it is like that... new hardware architecture, new OS, new programming paradigms... this was even harder than what DG did; they only had to clone the PDP-11, IIRC (and extend it to 32-bit? whatever, not rocket science)

        Beats me why you got downvotes. Negative, perhaps, but honest. Like DEC and DG, HP is toast. That was clear the day HP knelt down with a diamond ring for Compaq. What the hell were they thinking?!

  10. bombastic bob Silver badge
    Devil

    "Big Iron" thinking won't win

    Just to point out: since the invention of the PC, distributed processing has been the natural way for things to go, yet there's still a lot of "big iron" thinking out there, trying to drive computing back toward the past [i.e. a big supercomputer and lots of dumb clients]. "The Cloud" is one of these trends, and it's not trending so well in my opinion (i.e. "highly overrated").

    Sure, there will always be a need for centralized data and storage, and even occasional centralized processing, and rent-a-CPU cloud services try to fill that need. However, no data pipe is fat enough to match what a properly designed, multi-threaded algorithm running locally on a multi-core CPU can do with summarized/snapshot data, in lieu of some monster server trying to run mega-queries on "all that noisy nonsense".

    Hopefully HPE's good ideas will end up on THE DESKTOP COMPUTER, where they belong. Or a game console. Or some other such 'end user device'. That's because the benefits of "Big Iron" just aren't there for the larger segment of the industry.

    So if _I_ were HP, I'd focus on leveraging multi-core desktops instead. It will have a bigger and more sustained payoff.

    1. asdf

      Re: "Big Iron" thinking won't win

      >So if _I_ were HP, I'd focus on leveraging multi-core desktops instead.

      Ask Acer how making that your main focus is working out.

  11. James Wheeler

    It's more than memristors

    I believe there's still great potential in turning machine architecture "inside out" as HP envisioned: making memory-centric machines where processors are a resource applied to large data stores that reside permanently in high-speed storage. If not memristors, then DRAM, or whatever new non-volatile memory does catch on. Bring the computing to the data, rather than scrape the data off a disk and feed it to the processor. Columnar data stores like HANA and others could really benefit. Or so my thought experiment goes...
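
    To illustrate just the columnar half of that thought experiment, here's a toy C comparison (all sizes and field names invented): the same query touches roughly 4MB of data in a column layout versus roughly 64MB in a row layout, which is the sort of scan a big memory pool is good at feeding.

    #include <stdint.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define N_ROWS 1000000

    struct row {                /* row layout: the price travels with   */
        uint64_t order_id;      /* everything else, as it would when a  */
        uint32_t customer_id;   /* whole record is read back from disk  */
        uint32_t price_cents;
        char     padding[48];   /* pad to 64 bytes per row              */
    };

    int main(void) {
        struct row *rows   = calloc(N_ROWS, sizeof *rows);
        uint32_t   *prices = calloc(N_ROWS, sizeof *prices);  /* column */
        if (!rows || !prices) return 1;

        for (size_t i = 0; i < N_ROWS; i++)
            rows[i].price_cents = prices[i] = (uint32_t)(i % 997);

        /* Same query both ways: total revenue. The column scan streams
         * ~4 MB of prices; the row scan drags ~64 MB through the caches. */
        uint64_t total_rows = 0, total_cols = 0;
        for (size_t i = 0; i < N_ROWS; i++) total_rows += rows[i].price_cents;
        for (size_t i = 0; i < N_ROWS; i++) total_cols += prices[i];

        printf("row-scan total=%llu, column-scan total=%llu\n",
               (unsigned long long)total_rows, (unsigned long long)total_cols);
        free(rows);
        free(prices);
        return 0;
    }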

  12. Francis Vaughan

    As described, it was never going to be a product. It was just a mish-mash of stuff with no clear use case and, more importantly, no software to take advantage of it.

    IMHO the big problem was that the processors had no sensible architectural support for making use of a world with huge amounts of addressable persistent memory. My personal crusade is for tagged memory. HP are one of the few companies that could conceivably create a commercially useful architecture with this, IBM being the obvious other. Tags for data type, whether a word is a pointer, access protection, and concurrency control would, at a minimum, make life vastly more interesting. You can tie concurrency control and pointer identification into your memory network control. Suddenly lots of optimisations are available at a hardware level, and you eliminate a whole raft of crud from software.
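
    As a purely illustrative sketch (no shipping CPU works this way, and every name and bit assignment here is invented), per-word tags might look something like the C below, with the checks a tagged machine would do in hardware modelled as a helper function:

    #include <stdbool.h>
    #include <stdint.h>
    #include <stdio.h>

    enum word_kind { KIND_RAW = 0, KIND_INT = 1, KIND_PTR = 2 };

    struct tagged_word {
        uint64_t value;          /* the payload the program sees           */
        unsigned kind     : 2;   /* data type: raw bits, integer, pointer  */
        unsigned writable : 1;   /* access protection                      */
        unsigned locked   : 1;   /* a (very) simplified concurrency bit    */
    };

    /* Software stand-in for what tagged hardware would enforce on a store. */
    static bool tagged_store(struct tagged_word *w, uint64_t v,
                             enum word_kind kind) {
        if (!w->writable || w->locked) return false;    /* would trap       */
        if (w->kind == KIND_PTR && kind != KIND_PTR)
            return false;                               /* no forging ptrs  */
        w->value = v;
        w->kind  = kind;
        return true;
    }

    int main(void) {
        struct tagged_word counter = { .value = 0, .kind = KIND_INT,
                                       .writable = 1, .locked = 0 };
        struct tagged_word handle  = { .value = 0xdeadbeef, .kind = KIND_PTR,
                                       .writable = 1, .locked = 0 };

        printf("store int into counter:      %s\n",
               tagged_store(&counter, 42, KIND_INT) ? "ok" : "rejected");
        printf("store int into pointer word: %s\n",
               tagged_store(&handle, 42, KIND_INT) ? "ok" : "rejected");
        return 0;
    }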

    None of this is exactly new. IBM's AS/400 was well on the way there, and it was a commercial success. And there were many other small-volume and research systems built. But the ubiquity of x86, Windows, and Linux ensures that the barrier to entry for a properly new paradigm is very high.

  13. Anonymous Coward
    Anonymous Coward

    Sun Microsystems' "Rock" processor says hello...

    Some of the ideas remind me of the "Sea of Memory" stuff that used to be talked about around that program more than ten years ago. A great conceptual idea does not have to translate into a single marketable product of its own to have an impact on the industry. Living on as "components" of something else is still a significant legacy.

  14. hellwig

    Big Data isn't the Future

    So what if modern machines will be crunched by big data? Who actually cares?

    You know the primary use of big data today? Advertising. I don't think anyone would care if advertising stayed as non-personal as it used to be. We have billion-dollar industries being driven by the need for personalized advertising to justify the billions of dollars being spent on personalized advertising. I think companies were getting along A-OK before Facebook, Google, and the like.

    Now, big-data analytics can benefit certain fields, especially scientific and medical research, and I don't see why the industry wouldn't shift focus to... oh right, money.

    Let's face it, we were doing fine without a lot of the "data" we have today. I don't recall the last time any big-data company found a cure for a disease or solved some critical problem in the world.
