Zombie Moore's Law shows hardware is eating software

After being pronounced dead this past February - in Nature, no less - Moore’s Law seems to be having a very weird afterlife. Within the space of the last thirty days we've seen: Intel announce some next-generation CPUs that aren’t very much faster than the last generation of CPUs; Intel delay, again, the release of some of …


      1. jason 7

        Re: Nothing wrong with the chips.

        "No, it's not that simple. Code is a product. It is paid for with money."

        So after all that...it's still the code that's the problem!

    1. ITnoob

      Re: Nothing wrong with the chips.

      Is that you Linus?

  1. boltar Silver badge

    The software is still there

    It's just called "firmware" or "microcode". Good luck implementing complex graphics algorithms using hard-wired TTL logic. You'd need a chip the size of a bus.

    1. 8Ace

      Re: The software is still there

      That's why the largest current FPGAs and VLSI chips have billions of transistors: you "connect them up". You are still using logic primitives in a lot of cases, but they are soft-configured within the device.

      1. boltar Silver badge

        Re: The software is still there

        "That's why the largest current FPGAs and VLSI chips have billions of transistors: you "connect them up"."

        And at the end of it they generally solve ONE problem. Now imagine a hardwired chip that had EVERY modern graphics algorithm built into it. Seeing my point?

        1. 8Ace

          Re: The software is still there

          "And at the end of it they generally solve ONE problem. Now imagine a hardwired chip that had EVERY modern graphics algorithm built into it. Seeing my point?"

          Jeez calm down, if you are a developer your output isn't going to be replaced by an FPGA anytime soon. The story is saying that hardware is more efficient in a lot of cases, and those cases will increase. Nobody is saying that hardware can replace software, any more than they are saying that software is possible without hardware.

          1. boltar Silver badge

            Re: The software is still there

            "Jeez calm down, if you are a developer your output isn't going to be replaced by an FPGA anytime soon."

            Calm down? WTF, I was just making a point. Why are some people so wet they see any disagreement as some kind of confrontation? And my original point was that reproducing the functions of a modern graphics card in hard-wired logic (no microcode) would require a massive die.

    2. Steve Todd

      Re: The software is still there

      Talk to nVidia or AMD. Their GPUs are a mix of dedicated hardware and stream processors. No one said hardware could do everything, but there's a lot of performance to be had by offloading the right bits of a task to it.

      1. boltar Silver badge

        Re: The software is still there

        "Talk to nVidia or AMD. Their GPUs are a mix of dedicated hardware and stream processors."

        Any complex modern computer processor requires microcode to operate. Microcode is software.

        1. 8Ace

          Re: The software is still there

          "Any complex modern computer processor requires microcode to operate. Microcode is software."

          ARM designs are supplied to licensees with HDL descriptions of instruction decode; these are then implemented directly in hardware
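The distinction being drawn here can be sketched in a few lines: hardwired decode is a purely combinational mapping from opcode bits straight to control signals, with no micro-program in between, which is the kind of logic an HDL description synthesises directly into gates. A rough Python sketch (field layout, opcode values, and signal names are all invented for illustration):

```python
# Toy sketch of hardwired (non-microcoded) instruction decode: a purely
# combinational mapping from an opcode field to control lines.
# Field layout, opcodes, and signal names are invented for illustration.

def hardwired_decode(insn_word):
    """Combinational decode: opcode bits -> control signals, no microcode."""
    opcode = (insn_word >> 12) & 0xF       # top 4 bits select the operation
    table = {
        0x0: {"alu_op": "ADD", "reg_write": True,  "mem_read": False},
        0x1: {"alu_op": "SUB", "reg_write": True,  "mem_read": False},
        0x2: {"alu_op": None,  "reg_write": True,  "mem_read": True},   # load
    }
    # Unknown opcodes assert nothing (would trap in real hardware).
    return table.get(opcode, {"alu_op": None, "reg_write": False, "mem_read": False})

print(hardwired_decode(0x1234)["alu_op"])   # SUB
```

In silicon the dictionary lookup is just a block of AND/OR gates; there is no stored micro-program to fetch, which is the point of the HDL-described decoders mentioned above.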

        2. John Savard Silver badge

          Re: The software is still there

          The 386 used microcode, like a System/360 Model 65. The 486, though, was hardwired, like a System/360 Model 75.

          It is true, though, that even the latest x86 processors use microcode to handle a few complex instructions, as do the latest System z processors. Most RISC chips, though, eschew instructions so complex as to make microcode a necessity, even though they still do things like floating-point arithmetic that take multiple cycles.

          1. oldcoder

            Re: The software is still there

            The x86 uses a much more complex form of microcode.

            The x86 instructions are first translated into a RISC instruction stream and optimized...

            LOTS of microcode there.
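The translation step described above can be sketched roughly: a CISC-style instruction with a memory operand is cracked by the front end into simpler RISC-like micro-ops. A toy Python illustration (the mnemonics and micro-op format are invented, not any real encoding):

```python
# Toy sketch: cracking a CISC-style memory-operand instruction into
# RISC-like micro-ops, loosely the way a modern x86 front end does.
# All mnemonics and the micro-op tuple format are invented for illustration.

def decode(cisc_insn):
    """Translate one CISC instruction into a list of micro-ops."""
    op, dst, src = cisc_insn
    if op == "ADD" and src.startswith("["):      # ADD reg, [mem]
        addr = src.strip("[]")
        return [("LOAD", "tmp0", addr),          # split off the memory access
                ("ADD",  dst,  "tmp0")]          # then a register-register add
    return [cisc_insn]                           # simple ops pass through as-is

print(decode(("ADD", "eax", "[rbx]")))
# [('LOAD', 'tmp0', 'rbx'), ('ADD', 'eax', 'tmp0')]
```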

        3. Long John Brass Silver badge

          Re: The software is still there

          @Boltar

          "Any complex modern computer processor requires microcode to operate. Microcode is software."

          It kind of is and it kind of isn't

          Microcode is a series of bit patterns that enables/disables/connects the various chunks of logic blocks in the CPU "fabric", although fabric is probably not the right word.

          So while it's updateable, microcode is not really what you would consider a program or software

          Caveat: I'm not a CPU designer but I play one on the internet :)
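That description of microcode as bit patterns driving logic blocks can be made concrete with a toy model: each micro-instruction is just a control word whose bits assert enable lines on successive clock ticks, rather than anything resembling a conventional program. A Python sketch (all signal names and the micro-program are invented for illustration):

```python
# Toy model of microcode as control-bit patterns: each micro-instruction
# is a word whose bits assert enable lines in the datapath, one word per
# clock tick. Signal names and the micro-program are invented.

ALU_EN, REG_WRITE, MEM_READ, PC_INC = 1, 2, 4, 8   # one bit per control line

# "Micro-program" for a hypothetical load instruction: three control
# words applied on successive clock ticks.
LOAD_UCODE = [
    MEM_READ,                 # tick 0: drive the memory bus
    ALU_EN | REG_WRITE,       # tick 1: pass data through ALU into a register
    PC_INC,                   # tick 2: advance the program counter
]

def signals(word):
    """List which control lines a control word asserts."""
    names = {ALU_EN: "ALU_EN", REG_WRITE: "REG_WRITE",
             MEM_READ: "MEM_READ", PC_INC: "PC_INC"}
    return [n for bit, n in names.items() if word & bit]

print([signals(w) for w in LOAD_UCODE])
# [['MEM_READ'], ['ALU_EN', 'REG_WRITE'], ['PC_INC']]
```

There is no control flow, arithmetic, or data in the micro-program itself, which is why it sits awkwardly between "hardware" and "software".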

  2. Wommit

    @Boltar7

    Who pissed into your cornflakes this morning?

  3. kars1997

    This is not going to save Moore's law

    I don't buy the premise of this story.

    Yes, doing stuff in specialized hardware gives you a 200x or even a 1000x boost over doing it in software on a general-purpose chip.

    But that's a one-time boost. At the end of the day, the performance of that hardware is still going to be limited by its process density.

    So all you're really doing is delaying the point at which you can no longer improve performance, even in hardware, at the cost of adding extra chippery for various functions.
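The "one-time boost" arithmetic is easy to make concrete: a fixed hardware speedup is equivalent to a bounded number of Moore's-law doubling periods, after which the same process-density wall applies. A back-of-the-envelope sketch (figures purely illustrative):

```python
import math

# A one-time 1000x speedup from dedicated hardware buys a fixed number
# of Moore's-law doubling periods; it delays the wall, it doesn't move it.
# The speedup and cadence figures are purely illustrative.

speedup = 1000.0
doublings_bought = math.log2(speedup)          # ~10 doublings
years_per_doubling = 2.0                       # classic Moore's-law cadence
delay_years = doublings_bought * years_per_doubling

print(f"{speedup:.0f}x buys about {delay_years:.0f} years of headroom")
# 1000x buys about 20 years of headroom
```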

    1. 8Ace

      Re: This is not going to save Moore's law

      No, process density is only one factor; there are others, such as the process itself (lithography etc.) and the device type being implemented. Currently the full density can't be exploited due to these other limitations; however, there are ways round this. For example, the FinFET devices now being used have advantages over previous devices, so the technology continues to move forward. All the way from bipolar to CMOS, SOI etc., the devices have been improved. The same applies to the process and the process density. Engineering is problem solving, after all.

  4. Alan J. Wylie Silver badge

    The wheel of reincarnation

    http://www.catb.org/~esr/jargon/html/W/wheel-of-reincarnation.html

    [coined in a paper by T.H. Myer and I.E. Sutherland, "On the Design of Display Processors" (Comm. ACM, Vol. 11, no. 6, June 1968)] Term used to refer to a well-known effect whereby function in a computing system family is migrated out to special-purpose peripheral hardware for speed, then the peripheral evolves toward more computing power as it does its job, then somebody notices that it is inefficient to support two asymmetrical processors in the architecture and folds the function back into the main CPU, at which point the cycle begins again.

    Several iterations of this cycle have been observed in graphics-processor design, and at least one or two in communications and floating-point processors. Also known as the Wheel of Life, the Wheel of Samsara, and other variations of the basic Hindu/Buddhist theological idea. See also blitter.

  5. inmypjs Silver badge

    "we’re seeing a migration away from software and into hardware"

    ", wringing every last bit of capacity out of the transistor."

    WTF are you talking about?

    A transistor in a circuit dedicated to video decompression for example sits doing nothing when you are not decompressing video. A transistor sitting idle most of the time is hardly squeezing every last bit of anything out of it.

    Dedicated circuits can be faster and use less energy, but they cost to manufacture and development is expensive. High-volume applications (to amortise development costs) with low energy requirements, like smart phones, are an obvious candidate, especially high-end smart phones where production cost is less of an obstacle.
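The amortisation point can be put in numbers: a fixed NRE (design plus mask-set) cost spread over production volume dominates at low volumes and vanishes at high ones. A back-of-the-envelope sketch (all dollar figures invented for illustration):

```python
# Back-of-the-envelope amortisation of a dedicated ASIC block: the one-off
# NRE (design + mask-set) cost collapses per unit as volume rises.
# All dollar figures are invented for illustration.

def cost_per_unit(nre_dollars, unit_cost, volume):
    """NRE spread across production volume, plus marginal silicon cost."""
    return nre_dollars / volume + unit_cost

nre = 5_000_000          # one-off design + mask-set cost
unit = 2.50              # marginal silicon cost per chip

for volume in (10_000, 1_000_000, 100_000_000):
    print(f"{volume:>11,} units: ${cost_per_unit(nre, unit, volume):.2f} each")
# At smartphone volumes the NRE all but disappears; at niche volumes it
# dominates, which is why dedicated silicon favours high-volume parts.
```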

    1. Charles 9 Silver badge

      Re: "we’re seeing a migration away from software and into hardware"

      "A transistor in a circuit dedicated to video decompression for example sits doing nothing when you are not decompressing video."

      But if the times when it's NOT decompressing video (or compositing a UI or whatever task it is dedicated to perform) are few and far between, then odds are you get a net benefit for it. That's part of what's happening now. They're taking a look at what things CPUs have to do all the time and offloading them so that the CPU has more time for more generalized workloads, much like having a specialist for handling particular jobs that happen to come up quite frequently.
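The point above is essentially Amdahl's law: the benefit of offloading depends on what fraction of the total workload the offloaded task represents. A quick Python sketch (the fractions and accelerator speedups are illustrative):

```python
# Amdahl's-law view of offloading: the whole-system gain from accelerating
# a task depends on how much of the total workload that task is.
# Fractions and accelerator speedups are illustrative.

def overall_speedup(offloaded_fraction, accel_speedup):
    """Amdahl's law: speedup of the whole when a fraction is accelerated."""
    return 1.0 / ((1.0 - offloaded_fraction)
                  + offloaded_fraction / accel_speedup)

# Rarely-used block: 5% of the work, even with a 100x accelerator.
print(round(overall_speedup(0.05, 100), 2))   # 1.05
# Frequently-used block: 80% of the work, same accelerator.
print(round(overall_speedup(0.80, 100), 2))   # 4.81
```

Hence the strategy described above: offload the things the CPU does all the time, because that is where a fixed-function block actually moves the needle.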

  6. Anonymous Coward

    This is not a new trend...

    We didn't always have a math co-processor in the CPU. Intel and AMD both added extensions to their processors, including ones to increase multimedia performance. Outside of computers we have multiple ASICs in everything from DVD players and receivers to cars.

    One could even argue that software is driving a need for more logic in hardware.

    1. Charles 9 Silver badge

      Re: This is not a new trend...

      The argument being that you're starting to see similar kinds of software being used all the time. If you have a particular job being done again and again, it becomes practical to push this function into an ASIC to (a) speed up the turnaround on that process, and (b) offload work so that the CPU can concentrate on more generalized tasks. That's one reason SIMD/vector computing instructions were introduced: to better deal with common math functions that were used in programs of the day. It's why recent Intel CPUs include AES-NI: an increased need for security has pushed the use of AES so much that we end up using it all over the place.
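The SIMD idea mentioned above, in miniature: one operation applied across several packed data lanes at once instead of a scalar loop, the way an SSE/AVX register holds multiple values. A pure-Python illustration only (real SIMD happens in hardware registers, not lists):

```python
# The SIMD idea in miniature: one "instruction" operates on all data
# lanes at once, rather than a scalar loop touching one value at a time.
# Pure-Python illustration; real SIMD lives in CPU vector registers.

def simd_add(a, b):
    """Element-wise add across all lanes in one conceptual instruction."""
    return [x + y for x, y in zip(a, b)]

# Four values processed per "instruction", the way a 128-bit SSE register
# holds four packed 32-bit integers.
print(simd_add([1, 2, 3, 4], [10, 20, 30, 40]))   # [11, 22, 33, 44]
```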

  7. ajny

    The price of the masks and the rest of the tool flow means this is a corporate endeavor. The marginal price of writing original software is only opportunity cost.

  8. Poncey McPonceface

    Nice one El Reg, made the front page of Hacker News!

    More discussion of the software/hardware divide over yonder.

    https://news.ycombinator.com/item?id=12555500

  9. Colin Tree

    Chuck Moore's Law

    They've discovered Chuck Moore's Law

    Less is More

    Software has layers of abstraction, so it doesn't matter what hardware you run an application on. Programmers want to focus on the high level task and not on the nuts and bolts.

    How wrong that paradigm is, and I hope it goes away.

    1. roger stillick

      Re: Chuck Moore's Law

      Unfortunately the paradigm is exactly right, and it also applies to parts count in hardware..

      IMHO=less is always more if it actually works.. RS.

  10. roger stillick

    Fast internals useless if it can't TALK

    AT&T's old single PowerPC chip had communication channels internal to the chip to allow anything the processor did to be IO'd externally.. IBM's current Power 8 chip has 8 processors inside, wrapped around a non-blocking on-chip switching network connected to a wideband IO MXR..

    Haswell and newer Intel chips have the processors, alas they have a blocking communications system forcing space/time/space MXing for the IO stuff.. like limited-instruction-set boxes, they might appear to be faster, but their data-crunching throughput is really not much faster..

    Hypervisor software and divided data streams allow these Intel chips to scream.. however the IBM Power 8 architecture CPUs run simply as fast w/o special sauce software to make them work faster, or at all (sort of what this article implies= hardware + special sauce gives Moore's Law traction).. Happy C-64 ?? (still faster than my Haswell laptop)..

    IMHO= a non-blocking network is needed on chip to take advantage of the many cores on a chip..RS.

  11. martinusher Silver badge

    The wheel has finally arrived

    When the PC turned up it was indeed neat to have a computer that was small and cheap enough to own. The price we paid for it was a return to hardware architecture that was twenty years out of date. Increases in hardware performance masked this giant step backwards; we got used to code bloat being managed by plummeting memory prices and blazing fast processors. (You'd be amazed at just how fast a generic Intel processor is when it's not encumbered by the software we normally run on it.)

    We're finally moving into the world we wanted to be in in the 1980s, and it's possible because -- finally -- hardware and tools don't require multi-million dollar investments to build anything: the blocks are cheap, the tools are cheap and the techniques are well understood.

    It's unfortunate that our software technology is still pretty crude -- in fact modern applications programming looks awfully like "chuck a load of mud at the wall and see what sticks". This might be a practical solution to getting the job done with the available resources, but the size of modern programs relative to their functionality is embarrassing. (...and no, memory isn't cheap -- the parts are, but the time taken to load and unload the stuff mounts up)
