Moore's Law has ten years to run, predicts physicist

Renowned theoretical physicist Michio Kaku has predicted Moore's Law will run out of steam within the next ten years as silicon designs run up against the laws of physics. "In about ten years or so we will see the collapse of Moore's Law," he says. "In fact we already see a slowing down of Moore's Law. Computing power cannot …

COMMENTS

This topic is closed for new posts.
  1. Jeebus

    I know people get paid a lot, but "Futurologist" is a horrible title; it just screams "false", like "conspiracy theorist" and such.

    1. TeeCee Gold badge
      Happy

      Don't knock it.

      It beats standing on street corners flogging lucky heather and reading palms, old skool style.

    2. nexsphil

      True

      Humans never conspire to do bad things. Especially not those with power. Anyone that proposes otherwise is insane. Despite this awesome wisdom I'm not sure my Grandkids will jump on board, looking back at the way things are today.

    3. MacGyver
      Facepalm

      Arg.

      He lost all credibility with that stupid "Sci Fi Science: Physics of the Impossible" show.

      "To travel across our galaxy, first we design a worm-hole generator." Really, that's all we need to do, Dr. Michio Kaku? I love future-tech shows, just not when they're written by and for redneck hill people.

      That kind of show, and those stupid "Ancient Aliens" shows are corrupting real science.

      I predict that "ten years" just happens to be longer than anyone will remember that he made the prediction.

      1. elderlybloke
        Angel

        Re: Arg.

        Dr Kaku is a very successful and rich BS artist.

        I actually enjoy his TV shows, he has a very creative imagination and who knows what may come true.

        Black holes were science fiction a few decades ago, and now they are a very active subject of research.

    4. Anonymous Coward
      Anonymous Coward

      we have a futurologist

      everyone calls him the bullshit artist

  2. Anonymous Coward
    Anonymous Coward

    Moore's Law - so what

    We already have enough compute power to watch movies on mobile phones, as well as make video calls. Surely anything else is superfluous for most people's needs?

    (I am excluding Large Hadron Collider type non-personal use here, obviously :-D).

    Perhaps when we do hit the wall, the old art of hand crafting code tweaks will be back in vogue ;-)

    1. Quantum Leaper

      Re: Moore's Law - so what

      I remember when someone said 640K was more than enough RAM for a personal computer. You will never know what tomorrow's killer app will be or how much power it will need...

      1. Christian Berger Silver badge

        We all don't remember

        When people actually thought the Concorde was something sensible and that there would be faster planes after it.

        Technology often reaches a certain peak. We now wonder why people once thought that sitting in a plane for 4 hours to cross the Atlantic was a good thing, when you can just use e-mail or the telephone.

        I mean the whole idea of a home computer, like we now all have, was to build a computer that was less powerful than a "real computer", but also a lot smaller and cheaper.

        1. Aldous
          Facepalm

          Re: We all don't remember

          Concorde was sensible at the time it was designed and built: no email and cheap oil. If it wasn't for the nimbys/tree huggers (propped up by US Aerospace) you would have supersonic flight everywhere (at least the option of it).

          Instead there was the whole "think of the children" protest over sonic booms over land, plus threats by individual states not to allow Concorde overhead, etc., and so they were restricted to supersonic over water only (wonder how long UK-Australia would take in a long-range Concorde?). Not to mention that email/phone is not so good for some stuff; supersonic cargo planes, anyone?

          Mind you, it's probably for the best, given the "OMGZ THE SKY IS FALLING" reaction to the recent Eurofighter sonic booms (I was in an affected area and thought my motorcycle had fallen over again, not that it was the end of the world, OMG terrorists!). Some people even claimed to see bright flashes of light accompanying the boom, ffs.

          1. Jan 0
            Pint

            @Aldous

            Sensible? It was a justifiable choice, nevertheless, there were other choices. There could have been a bigger push to improve person to person text*, audio and video communication instead. Maybe you wouldn't be thinking of supersonic cargo planes if we now had high speed submarine container ships. Why do people want under-ripe tropical fruit anyway? Maybe we should have empowered psychologists back in the 1950s!

            *Weren't teletype 'hotlines' more important to governments and businesses than passenger 'planes anyway?

            I'll raise a pint to Vannevar Bush, but why was Alan Kay so late?

            1. Anonymous Coward
              Anonymous Coward

              Speaking of communication...

              ...the basic idea of the internet dates back to a 1960 paper from the RAND Corporation.

              http://www.rand.org/pubs/papers/P1995.html

              I'll raise a pint to Paul Baran, but why did Peter Kay stop being funny?

          2. Nigel 11

            Bright flashes ...

            "some people even claimed to see bright flashes of light accompanying the boom ffs"

            Almost certainly, they did. It's called synaesthesia. It's quite common. For most people it occurs only when one of their senses is overloaded by a sudden and unexpected input. There is some sort of neural spill-over in their mental processes that registers as a different sense. If the triggering experience is sufficiently rare, they may not recognise it as an internal rather than an external phenomenon.

            For me, a sudden loud noise also registers as taste (acid on my tongue).

            For a smaller number of people, the linkage between their senses is a permanent part of everyday experience. They're not mad, because they are fully aware that it's their own internal "wiring" that is different to that of most other people, and because it doesn't cause them any distress.

          3. Anonymous Coward
            Anonymous Coward

            Re: We all don't remember

            "some people even claimed to see bright flashes of light accompanying the boom"

            You've never known someone with interlinked senses, then? People who get coloured flashes in front of their eyes when they hear certain types of music or taste certain flavours? Some people are wired differently; the world doesn't always conform to what the textbooks (the wiki and forums!) state. For the 98% it's true, but there are the special ones who offset the averages.

          4. boltar Silver badge

            Re: We all don't remember

            "If it wasn't for the nimbys/tree huggers (propped up by US Aerospace) you would have supersonic flight everywhere (at least the option of)."

            I'm not so sure about that. Supersonic flight uses a huge amount of fuel, and short of charging utterly ridiculous ticket prices it's hard to see how it could be a viable business proposition these days. British Airways (don't know about Air France) didn't ditch Concorde because of safety fears; it ditched it because the aircraft required huge amounts of maintenance and was making very little money. The safety scare was simply an excuse to get rid of a "prestige" service that was actually a boat anchor around the company's finances.

            "(wonder how long UK-Australia would take in a long-range Concorde?)"

            Quite a long time. It would have had to stop at least twice to refuel.

            1. Tom 13

              @boltar

              No, he's right. The reason Concorde didn't make money is because most of the places where it could have made money were put off limits by the tree huggers here in the US. Old Blighty to Australia might not work, but the US would have. If we'd granted the permits, which we should have. And I expect that had THAT worked, they could have upped the fuel capacity on a newer model to make the Aussie run work too.

              1. Michael Wojcik Silver badge

                Re: @Tom 13

                > The reason Concorde didn't make money is because most of the places where it could have

                > made money were put off limits by the tree huggers here in the US.

                Evidence, please. The air-travel market is pretty mature, transparent, and efficient, and it's barely profitable as it is. No doubt contemporary SST commercial-passenger aircraft would be more fuel-efficient than the Concorde, had development continued; but SSTs would still be ravenous consumers of fuel, so their ticket premium would be large - and even more sensitive to oil-price spikes than conventional air travel.[1]

                So commercial SST travel would only be profitable on routes where the time savings was sufficient to justify the large price premium to a substantial fraction of the market. Most travelers already take the cheapest option possible, choosing multiple-segment coach itineraries over direct flights and more comfortable seating. Businesses, meanwhile, have cut back both on premium travel options and on travel overall, as an easy area for cost savings.

                Historically, worldwide premium air travel has grown at around 5% annually, but much of this is international.[2] And in recent years it has stumbled badly, as it did in 2009 and 2011.[3]

                Maybe commercial SST travel would have been profitable - but I tend to doubt it.

                [1] This can be hedged, of course, and the airlines do; but ultimately that just spreads the cost out.

                [2] See eg this IATA press release

                [3] See eg this Bloomberg piece from 2009 and this from Peter Greenberg in 2011

          5. Christian Berger Silver badge

            Re: We all don't remember

            Well, that's exactly the point. It used to be sensible _at the time_. Today it's just idiotic. The requirements of a society change. Maybe one day we won't strive for faster computers any more, just as we no longer strive for faster modes of transport.

        2. /dev/null

          Re: We all don't remember

          Quite. We've got used to computer technology becoming more powerful every year, but the world won't end if that doesn't happen any more. Intel and AMD might find it harder to sell chips though.

          The aerospace industry is an interesting comparison - in its first 60 years, it went from the Wright brothers to the Chinook helicopter. In the subsequent 50 years, it has gone from the Chinook helicopter to... more Chinook helicopters.

          1. Morg

            Re: We all don't remember

            That's mostly an illusion; the only military tech you know of is public knowledge, i.e. those weapons that were shown and that governments could not hide because they used them extensively.

            However, there has been ongoing military research, and no real war to force the new tech onto the field, for at least 50 years; anyone with half a brain understands that there has to be a ton of stuff in store for the unlikely event of another real war.

          2. Ben Holmes
            Joke

            Re: /dev/null

            Actually, taking the latest Strategic Defence Review into account, it means 'no more Chinook helicopters.'

          3. Paul_Murphy

            Re: We all don't remember

            Similarly, it didn't take that long to go from Kitty Hawk to the Moon, but after roughly the same period of time we have gone from the Moon to nowhere.

            ttfn

        3. Andus McCoatover
          Windows

          Re: We all don't remember

          We worry about "Moore's Law", then forget that an atom is just so sized. So while we talk about getting a transistor's gate down to 6 atoms thick, we drive to the lab in a "suck-squeeze-bang-blow"* driven vehicle whose technology was invented 120 years ago and hasn't changed much since... When someone shrinks the atom... (No, Rick Moranis, that's not a cue for a sequel!)

          *Infernal combustion engine.

        4. Anonymous Coward
          Anonymous Coward

          Re: We all don't remember

          They did *have* telephones in the 1960s. It wasn't a case of flying across the Atlantic because that was the only way to talk to somebody on the other side. It was about getting there in 40% less time. I take it you regard the High Speed Rail plan as ridiculous on the same grounds?

      2. Pen-y-gors Silver badge

        Next killer app?

        To be honest, the last 'killer apps' for me were e-mail and a web browser. (both circa 1992). We could have a long wait...

      3. boltar Silver badge
        Unhappy

        Re: Moore's Law - so what

        "You will never know what tomorrow killer app will be or how much power it will need..."

        Apps just suck up more and more power but don't deliver the equivalent amount of functionality. Do I really need a GHz-class CPU to run a friggin' word processor when, for example, MacWrite, which had perfectly serviceable functionality and a pleasant GUI, ran happily on a 20 MHz 68000?

        Sure, for 3D games, AI and maths-intensive operations such as image transforms in Photoshop you need the fastest silicon you can get, but for everything else? No, sorry. The reason most apps require more CPU is a combination of lazy and/or incompetent programmers, inefficient bloated libraries and, in a lot of cases, slow, memory-sucking managed languages.

      4. Dire Critic
        Facepalm

        Re: Moore's Law - so what

        "You will never know what tomorrow killer app will be or how much power it will need..."

        We just know that it'll be bloated, slow and take up twice as much drive space as its predecessor.

      5. Anonymous Coward
        Anonymous Coward

        Re: Moore's Law - so what

        virtual reality porn........... it's the next big app

    2. atippey
      Facepalm

      Re: Moore's Law - so what

      Goddammit, I want sub-pixel polygons, molecular level physics simulation, and a direct-to-brain interface. Otherwise, Crysis IX will not be worth my money.

      1. Omgwtfbbqtime Silver badge
        Trollface

        "Otherwise, Crysis IX will not be worth my money."

        It won't anyway as it will only take 8 minutes to complete.

      2. Evil Auditor Silver badge

        Re Crysis IX will not be worth my money

        IX?! Not even Crysis 2 was worth the money.

  3. pip25
    Joke

    Oooh, time travel!

    I think I've read this article already, around ten years ago!

    1. RIBrsiq
      Thumb Up

      Re: Oooh, time travel!

      I recall reading it in the early nineties, myself...

      No, really: the number of graphs and whatnot showing this exponential growth trend in computing power surviving several paradigm shifts is *huge*. Just Google it and see.

  4. Anonymous Coward
    Anonymous Coward

    often stated and pretty consistently wrong

    Since both Intel and AMD have been leaking their experiments with real multi-layer 3D chip designs, I'd say that in 10 years we will have hammered out another generation of fairly conventional silicon production by building up instead of shrinking. Considering that the big problem with this has actually been heat (as in their test Pentium III + RAM dies melting in the lab), and that there have been some interesting developments in that arena in the last few months, I am betting we can look forward to quite a few more "Moore's Law" keynotes. Then will probably come optical interconnects, persistent-state RAM and a host of other new tech.

    Pity too, because when all of that runs out of headroom, things may actually get interesting. Until then, unless fabs stop costing tens of billions of dollars, things will probably stay incremental and safe and dull. Though I do wonder if AMD will be around to see it (as anything other than a GPU house, at least).

    1. Morg

      Re: often stated and pretty consistently wrong

      Optical interconnects are NOT better... light travels only marginally faster than electrons, AND it requires two electric-to-optical converters to work. It is highly unlikely that this will be used any time soon INSIDE a CPU or even a motherboard.

      Don't forget graphene and frequency scaling; there's a lot of room there still.

      1. Hungry Sean
        Boffin

        Re: often stated and pretty consistently wrong

        If you're going to act authoritative, at least be correct. First, the electrons themselves aren't traveling at speed through a wire; the charge is (think of a wave moving across the ocean). Second, the problem isn't the speed of propagation of electricity through an isolated wire, it's the capacitance with neighboring wires that introduces a lovely RC time constant into the switching. Capacitance also allows wires to potentially induce errors on each other. Between these two things, you end up needing beefy transistors to drive the wire, extra transistors at the receiving end to avoid some of the potential nasty effects, and possibly dead wires in between to "shield" things.

        Optical interconnects don't have these problems and so they could certainly be better on the scale of a PCB and might even make sense as global interconnects on a die, or maybe even as an alternative to through silicon vias. As I understand it, the main barrier to all of this cool stuff is that the technology is very young, not that it is a dumb idea in principle.
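        The RC effect described above can be put in rough numbers with a lumped Elmore-style estimate, delay ≈ 0.69·R·C. The per-micron resistance and capacitance below are illustrative assumptions, not figures from any real process:

```python
# Rough lumped-RC (Elmore-style) delay for an on-chip wire: delay ~ 0.69 * R * C.
# The per-micron values are illustrative assumptions, not real process data.

def wire_delay_s(r_per_um_ohm, c_per_um_f, length_um):
    """Delay in seconds for a wire of the given length."""
    r = r_per_um_ohm * length_um  # total resistance, ohms
    c = c_per_um_f * length_um    # total capacitance (incl. neighbours), farads
    return 0.69 * r * c

# Assumed: 1 ohm/um and 0.2 fF/um for a thin global interconnect.
delay = wire_delay_s(1.0, 0.2e-15, 1000)
print(f"1 mm wire: {delay * 1e12:.0f} ps")
```

        Because both R and C grow with wire length, the delay grows with the square of the length, which is why long on-chip wires need the beefy drivers (and repeaters) mentioned above.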

      2. Field Marshal Von Krakenfart
        Boffin

        Re: often stated and pretty consistently wrong

        Actually, electrons move very slowly; it could take a free electron anything up to 12 hours to travel through 1 metre of copper, depending on the diameter of the copper, the voltage and the current.

        I always understood that the real CPU limit was not the mask size but a combination of mask size and clock speed: as most clock speeds are now well into the radio-frequency range, the real limiting factor is radio-frequency interference (RFI) within the chip, where a signal in one track of a CPU induces an unwanted current in an adjacent track.

        Because of this we may hit the limit for CPU clock speeds and transistor density before we hit any physical manufacturing constraint.

        As for EUV, I was also under the impression that some masks were now photo-etched using the non-visible spectrum (X-rays???)

        Technical bits open to correction.
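        The "up to 12 hours per metre" figure is the right order of magnitude: drift velocity follows v = I / (n·q·A). The current and wire size below are assumptions for illustration (smaller currents give the longer times):

```python
# Electron drift velocity in copper: v = I / (n * q * A).
N_COPPER = 8.5e28  # free electrons per cubic metre of copper
Q = 1.602e-19      # elementary charge, coulombs

def drift_velocity_m_s(current_a, area_m2):
    return current_a / (N_COPPER * Q * area_m2)

# Assumed example: 1 A through a 1 mm^2 (1e-6 m^2) wire.
v = drift_velocity_m_s(1.0, 1e-6)
hours_per_metre = 1 / v / 3600
print(f"{v:.2e} m/s, i.e. ~{hours_per_metre:.1f} hours per metre")
```

        The signal itself, of course, propagates as an electromagnetic wave at a sizeable fraction of the speed of light; only the electrons dawdle, per the wave analogy above.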

  5. gronkle

    It does seem a bit of an affliction for scientists, once they reach a certain age or level of influence, to start making "that's it, we've done it all, it's all downhill from here on" type pronouncements, despite every generation having faced the same kind of pronouncements previously.

    My money (none) is on Graphene keeping us ticking for another few years yet...

    http://www.theregister.co.uk/2011/09/22/src_nsf_research_grants/

    1. Kristian Walsh Silver badge

      He was talking about Silicon, though...

      Graphene is one of the most likely materials to take over from Silicon, I think, but even if it doesn't, it has so many other useful properties that you're going to be seeing it in lots of products long before Silicon runs out of steam: first as a replacement for the difficult-to-source indium used in touch displays (where it will allow capacitive and capacitive+resistive designs), but also as an engineering and surfacing material.

      But Kaku did say that the problem is with the physics of *Silicon*, and it's pretty hard to dispute this argument. There's a certain critical mass of atoms below which a semiconductor junction won't work. Semiconductor junctions work on the principle of doping pure silicon with other elements; these impurities are what provide the "one-way current" behaviour that all digital electronics relies on.

      Make these features too small, and the absolute number of doping atoms becomes significant, rather than their ratio (a silicon atom has a diameter of a touch over 0.2 nanometres, so a 5 nm feature size is less than 25 Si atoms across).

      Of course, this doesn't preclude three-dimensional construction of devices (although cooling is a major problem here), or hybrid Silicon/Something-else designs, but I think that's his point: using Silicon alone, you cannot go on forever reducing feature sizes. My guess is that it'll be economics, not physics that prevents us reaching the theoretical limit of Silicon.
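      That atom-counting argument is easy to check; a quick sketch using the ~0.22 nm silicon diameter quoted above (the list of feature sizes is just illustrative):

```python
# How many silicon atoms span a given feature size?
SI_DIAMETER_NM = 0.22  # "a touch over 0.2 nanometres"

def atoms_across(feature_nm):
    return feature_nm / SI_DIAMETER_NM

for feature in (22, 14, 5, 2):
    print(f"{feature} nm feature: ~{atoms_across(feature):.0f} atoms across")
```

      At 5 nm that is roughly 23 atoms across, so a handful of misplaced dopant atoms is no longer statistical noise but a significant fraction of the junction.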

      1. Kool-Aid drinker
        Thumb Up

        Re: He was talking about Silicon, though...

        Aye, the stone age didn't end because the stone ran out or we exhausted its possibilities, it ended because something better came along, and so it will be with silicon.

        1. Jedit
          Headmaster

          "The Stone Age didn't end because the stone ran out"

          Well, obviously - if it had, we wouldn't have been able to have a Silicon Age.

        2. Tom 13
          Joke

          Re: ...it ended because something better came along...

          Damn young whipper-snappers!

          Paper wasn't any improvement over stone for record keeping: it rots; stone is forever. Sure, it requires a bit more space to store than paper, but once you're done with acid-free paper, humidity controls and temperature controls (and their effects on our atmosphere), you haven't really gained ANYTHING. And don't get me started on the silly crap about using electrons to store data!

    2. Dave 126 Silver badge

      "It does seem a bit of an affliction for scientists once they reach a certain age or level of influence to start making "that's it, we've done it all, it's all downhill from here on" type pronouncements"

      -gronkle

      Or, put another way:

      When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.

      -Arthur C. Clarke, Clarke's first law

      1. Anonymous Coward
        Anonymous Coward

        Corollary

        You've forgotten Isaac Asimov’s Corollary to Clarke's first law:

        "When, however, the lay public rallies round an idea that is denounced by distinguished but elderly scientists and supports that idea with great fervor and emotion — the distinguished but elderly scientists are then, after all, probably right."

        and Egbert's Corollary to Asimov's Corollary:

        "Whenever the lay public repeat and ridicule the original statement, they will conveniently forget that it was made in a specific context with specific conditions, and instead will generalise it to a grotesque caricature of what was originally said."

        Thomas J. Watson: "I think there is a world market for maybe five computers". If he said it at all (dubious) it would have held true for about 10 years and, in any case, was probably (and correctly) being said about the computers being created *at that time*.

        Michio Kaku: "Computing power cannot maintain its rapid exponential rise using *standard silicon technology*."

  6. Rolland Trolland

    And not a moment too soon!

    The horrific bloated slop that passes for code these days is an embarrassment to anyone of pre-GUI age.

    Maybe when/if a processor cap appeared, Moore's Law could be continued (in a fashion) by people dumping some of these lazy libraries and putting a bit more thought into their code, so that the processor is actually doing something useful and not merely navigating its way around excessive layers of pointless abstraction!

    1. Mussie (Ed)
      Thumb Up

      Re: And not a moment too soon!

      You must be a fan of Mel

      http://foldoc.org/The+story+of+Mel,+a+Real+Programmer

      1. Rolland Trolland
        Thumb Up

        @Mussie

        Aye, those were t'days. Real programmers wrote code by candlelight with rusty t' soldering iron, on motherboard made out of sheeps bladder :)

    2. Anonymous Coward 101

      Re: And not a moment too soon!

      "The horrific bloated slop that passes for code these days is an embarrassment to anyone of pre GUI age."

      Why, there was a mythical golden age of wondrous coding! And did kids know their place as well?!

    3. Some Beggar
      FAIL

      Re: And not a moment too soon!

      Absolute twaddle. There was shit code in the 1970s and there's good code today. Engineering is all about pragmatism and engineering effort is expensive. There's no excuse for sloppy code, but there's equally no excuse for wasting effort optimising something to the Nth degree when you don't need to. The increased power of processors and the increased capability of high-level languages is unequivocally a Good Thing. You can hand craft an entire system in assembler on punch cards if you want. The rest of us will take advantage of whatever whizzbangery we have available.

    4. Michael H.F. Wilkinson Silver badge
      Happy

      Re: And not a moment too soon!

      As Niklaus Wirth says: Software is getting slower faster than hardware is getting faster.

      Word 2.0 was a very capable word-processor, and ran happily on my 25 MHz 80386 machine with 4 MB of RAM (I really splashed out on that machine :-) ). Word 2010 would require rather more. More in fact than the processing power and memory of the good old Cray Y-MP. That is worrying.

      GUIs of course do need more resources, but the above example suggests you can run a full-blown GUI-based word processor in 1,000 times less memory than we seem to need today. If you look at the memory footprint of something like FLTK, which is so small you can afford to link it statically for easier distribution, and compare that to some other tools, you cannot help but question the efficiency of some code.

      Much of the best coding is going on in the embedded-systems world. You really have to conserve resources in that arena.

      1. Some Beggar

        Re: And not a moment too soon!

        "Software is getting slower faster than hardware is getting faster."

        But this is clearly nonsense. If it were true then the latest games consoles would be incapable of playing Manic Miner.

        "That is worrying."

        Because ... ?

        1. Michael H.F. Wilkinson Silver badge

          @ Some Beggar

          What Wirth means is that for a given task, the current software needs vastly more resources (CPU and memory alike) than similar software years ago.

          Why is this worrying? Because it suggests that we could get by on much leaner compute capacity for many mundane tasks. It means machines that still work fine have to be replaced when the software is updated, and the minimum specs are upped again. This is ultimately wasteful. It also means that bigger server parks are needed for a given workload. If you can make code more efficient, less hardware is needed, and less energy is wasted. Mobile computing (like embedded) can give an impetus to leaner programs, simply because cutting clock cycles can cut battery usage.

          1. Some Beggar

            Re: @ Some Beggar

            What Wirth actually said was this:

            "Do increased performance and functionality keep pace with the increased demand for resources? Mostly the answer is no. "

            Which is simply a statement that we become more profligate when we have more resources. There are variants of this truism dating all the way back to the Hebrew Bible. It's why Americans build gigantic cars with inefficient engines - they have big roads and cheap petroleum. The only time this becomes a problem is when the resources start to be throttled. This isn't going to happen any time soon with MIPS and memory. Wirth wrote this nearly 20 years ago and the computing industry hasn't collapsed as far as I've noticed.

            The paraphrased version that you quoted is simply nonsense.

          2. Fibbles

            re: Michael H F Wilkinson

            I'd like to agree with you about mobile computing, but it's not what I see when I look at smartphones. For every improvement in battery tech and every efficiency gained through die shrinkage, I see an increase in clock speed to deal with ever more bloated software.

            1. Some Beggar

              Re: re: Michael H F Wilkinson

              And these smartphones have processing capability (in terms of what the user can actually do - not simply in terms of what the silicon can theoretically do) that is well beyond what a desktop PC would have had back when Wirth made his oft-misquoted statement.

              So you can now do things on a battery-powered pocket device that were impossible on a mains powered box that lived under your desk twenty years ago. How exactly does this demonstrate that software has bloated faster than hardware has speeded up?

  7. JeffyPooh Silver badge
    Pint

    Clarke's Law trumps Kaku's prediction, thus extending Moore's Law

    Arthur C. Clarke's laws of prediction:

    1) When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.

    2) The only way of discovering the limits of the possible is to venture a little way past them into the impossible.

    3) Any sufficiently advanced technology is indistinguishable from magic.

    Michio Kaku is certainly distinguished. He's also recently turned 65, and that makes him elderly. See Rule 1. Moore's Law will shift gears, but it will continue further than expected. Just like last time. And the time before that.

    1. Pierre Castille
      Stop

      Re: Clarke's Law trumps Kaku's prediction, thus extending Moore's Law

      "Michio Kaku is certainly distinguished. He's also recently turned 65 and that makes him elderly."

      Steady on with the elderly bit! I'm 66 and I consider myself only a teeny bit old (though my children and grandchildren might hold a different opinion!)

  8. Muckminded

    Moore's Misnomer

    It actually refers to integrated circuits, not silicon, and it isn't a law, but who cares. As mentioned above, there will inevitably be discoveries that ever push the horizon a few years forward, until the day we realize we've had enough math.

    1. Gavin King
      Boffin

      Re: Moore's Misnomer

      Strictly speaking the Second Law of Thermodynamics isn't a law either, but it still seems to hold.

      And don't talk about enough math.

  9. Anonymous Coward
    Anonymous Coward

    The path beyond the next few Moore's Law process generations has always been quite murky.

    However, just because previous predictions of its demise have been wrong doesn't mean current predictions are also wrong.

    If I understand correctly, a *lot* of the innovation in the past decades has been about pushing the need for big changes (like EUV) forward into the next process, then the next, and so on. How long can that go on? Even if the technologies have passed far beyond my understanding into the realm of apparent "magic", Moore's Law is not a magic law operating outside the physical world - exponential growth will always slow down at some point.

  10. Anonymous Coward
    Anonymous Coward

    There's Moore's Law Part 2, though.

    Part one states that every 2 years or so, the transistor count in an IC will double.

    Part two states that every 2 years or so, someone will incorrectly predict the end of part one.
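    Part one is easy to put into numbers; a sketch of the doubling, seeded with roughly the Intel 4004's transistor count (the figures are illustrative):

```python
# Moore's Law part one: transistor count doubles roughly every two years.
def moores_law(start_count, start_year, year, period_years=2):
    return start_count * 2 ** ((year - start_year) / period_years)

# Illustrative seed: ~2300 transistors in 1971 (roughly the Intel 4004).
for year in (1971, 1991, 2012):
    print(f"{year}: ~{moores_law(2300, 1971, year):,.0f} transistors")
```

    Forty-odd years of that doubling lands in the billions, which is about where 2012-era desktop CPUs actually sit.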

  11. Wunderbar1

    They are already working on Silicon replacements

    http://www.research.ibm.com/nanoscience/nanotubes.html

    1. Charlie Clark Silver badge

      Re: They are already working on Silicon replacements

      Yes, the materials and production processes will probably change. The economics of chip development have changed significantly in the last ten years due to the cost of making the machines that make the chips. There are fewer suppliers of lithography machines than there were ten years ago, which is driving up the cost of each new generation. At the same time, even Intel's margins are starting to come under pressure as its designer hardware struggles to differentiate itself from the commodity ARM clones.

      The number of companies entering nano-technology and additive manufacturing to address some of the same problems is increasing. If they can get it right, printed OLED screens may be the first children of this revolution.

      Worth noting that IBM and Samsung may well be right in not having an asset-light strategy in this area.

  12. imanidiot Silver badge
    Facepalm

    Am I the only one?

    I don't have that much respect for Michio Kaku. Yes, he's got a good brain and yes, he's good at dreaming up fantastical stuff. But he has a penchant for seeking media attention and spouting crap about areas of expertise he's not nearly qualified to talk about (like around the time of the whole Fukushima debacle).

    Maybe I've missed it, but I haven't seen anything useful done by mister Kaku. He keeps "thinking up" wonderful new crap that's either highly improbable or just plain wrong and impossible from an engineering standpoint (using the materials and methods he suggests, at least - and often those things aren't even all that original).

    I wouldn't be bothered if he just kept making dull TV shows, but he keeps going to the media with these kinds of statements, that, to me, just seem like he's seeking media attention again.

    1. Anonymous Coward
      Anonymous Coward

      Re: Am I the only one?

      I couldn't agree more. If you've ever seen Koo-Koo Kaku speak in person, you'd have been witness to a spotlight-seeker who'll say nearly anything to grab the attention of folks with weaker scientific chops than he has. Sure, he's educated and articulate, but his pronouncements are increasingly tales told by an idiot, full of sound and fury, signifying nothing.

      Also, remember that he's a physicist and/or astrophysicist, not a materials science guy. If you want to know the future of microprocessor technology, talk to microprocessor technologists. I think the Reg has published some stories about this matter in which they talk to people who actually know what they're talking about.

      Hang it up, Michio – your 15 minutes of fame are over.

  13. gecco2

    Moore? Really

    Why does Moore continue to get credit for the prediction of processor speed doubling annually? It was not Moore who discovered or first made the prediction; he merely made it well known among non-physicists!

  14. JDX Gold badge

    They've been saying we're about to run into physical limits for as long as I can remember. It's a bit like fusion... always just around the corner.

  15. geocrunch
    FAIL

    Doubling CPU cores is also doubling transistors

    In the end the article says that manufacturers will look for other ways of expanding CPU power, such as increasing the number of cores in a CPU, etc. CPU core counts have been doubling every two years over the last five years or so - how is that not doubling the number of transistors? This physicist obviously isn't an IT professional, who would otherwise know that the next step of scalability is not vertical, but horizontal.

    1. Peter Gathercole Silver badge

      Re: Doubling CPU cores is also doubling transistors

      One of the problems that chip designers have is how to use the vast number of transistors that can be fitted onto the large die-sizes at the smallest scale.

      They got to the point where more registers, more cache and more instructions units in a single core was not making for faster processors, so they then started using the still increasing transistor budget to put multiple cores on a single die.

      There is a lot to be said for a large number of cores on a single die, but this has its own problems with access to memory, cache coherency between cores, and I/O.

      Another avenue is putting disparate processors (like GPUs) on the same die, or even System on a Chip (SoC), where all of the functional elements (I/O, Graphics, memory etc) of a complete system appear on a single piece of silicon (think what is going into 'phones and tablets).

      In my view, to make use of the vast scale of integration, it's about time we had a fundamental rethink about how processors work. I don't have any new ideas, but I think that listening to some of the people with outlandish ideas might be worthwhile in coming up with a completely new direction to investigate.
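      One classic way to quantify why piling cores onto a die eventually stops paying off is Amdahl's law - a minimal sketch, assuming a fixed serial fraction of the workload:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when only part of a task parallelises."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# With 90% of the work parallelisable, speedup saturates below 10x,
# no matter how many cores the transistor budget can buy.
print(round(amdahl_speedup(0.9, 4), 2))     # 3.08
print(round(amdahl_speedup(0.9, 1024), 2))  # 9.91
```

      Which is why "just add cores" only helps as far as the serial parts of the workload - and the memory system - allow.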

  16. Iad Uroboros's Nemesis
    Go

    Graphene will be key to take Moore's Law strain

    As Gronkle says, Graphene is already spinning up & ready to take over from where Silicon's physical limits are reached. With the atoms being so much smaller than Si and some, I believe, wondrous other characteristics at such a small scale, I think we have a way to go yet, maybe even until quantum chips do come online in quantity.

  17. lulu8137

    am i the only person to know that in 2011, MIT lab researchers already produced non-silicon chips that can run at 100GHz or more? sometimes i feel i am. we already have this technology and 99.99% of the world has no clue it exists.

    it's just the manufacturers don't want to put in the R&D and production costs when they're making enough $ with silicon.

    1. Muckminded

      OMG, RLY?

      Those guys are so yesterdollar. AMIRITE?

    2. Bronek Kozicki Silver badge
      IT Angle

      I'm in this 99.99%

      would you please enlighten me as to what this technology is and, perhaps, why it isn't commercially used?

  18. Pete 2 Silver badge

    Sadly, Wirth's Law will keep on going

    "Software gets slower faster than hardware gets faster"

    Luckily for us, the annual increments in processing power have masked the continual degradation in software performance. If Moore's law does hit the buffers, someone somewhere is going to have a 50 year project on their hands to fix all the inefficiencies, crocks, bad design and unoptimised code.

    Maybe once there is an actual incentive to write fast, efficient code instead of the traditional "do whatever it takes, just deliver it by Monday" approach to software development, we'll have a renaissance in programming. If we're lucky, we may even find that once people start writing well-designed, reliable and efficient software that programmers start to win back some professional esteem and respect, too.

    1. JDX Gold badge

      Re: Sadly, Wirth's Law will keep on going

      Software is ALWAYS about doing what it takes to run on the target hardware. It's all well and good being nostalgic, but people used to hand-optimise ASM because they had to, to make it run fast enough, not due to some sense of craftsmanship.

      Those old-school skills are fun to learn and good to know, but you learned them because it was necessary to get the job done.

    2. Charlie Clark Silver badge

      Re: Sadly, Wirth's Law will keep on going

      It seems to have escaped your notice how much work has been done on compilers in the last few years which do an increasingly good job of removing many of the inefficiencies. That and the ability to shift tasks to hardware implementations (encryption, signal processing, video compression and decompression).

      1. JDX Gold badge

        how much work has been done on compilers in the last few years

        People have also been saying that for years. It's undoubtedly true but a good optimiser can still beat the hell out of the compiler through knowing the context and stuff like SSE. The claim "modern compilers can't be beaten by hand coding" is simply bogus - however the truth is it doesn't normally need optimising.

    3. Anonymous Coward 101
      Thumb Down

      Re: Sadly, Wirth's Law will keep on going

      "Maybe once there is an actual incentive to write fast, efficient code instead of the traditional "do whatever it takes, just deliver it by Monday" approach to software development, we'll have a renaissance in programming. If we're lucky, we may even find that once people start writing well-designed, reliable and efficient software that programmers start to win back some professional esteem and respect, too."

      Whining cry baby.

  19. Some Beggar

    What ... AGAIN??

  20. Anonymous Coward
    Anonymous Coward

    While I won't go so far as to predict the timescale, I will say this: Moore's Law will end one day. There is a physical limit to the information you can process with a given quantity of matter.
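    One such limit, for what it's worth, is Landauer's principle: erasing one bit of information dissipates at least kT·ln 2 of energy. A back-of-the-envelope sketch:

```python
import math

BOLTZMANN = 1.380649e-23  # Boltzmann constant in J/K (exact, 2019 SI definition)

def landauer_limit(temperature_kelvin):
    """Minimum energy (joules) to erase one bit at a given temperature."""
    return BOLTZMANN * temperature_kelvin * math.log(2)

# At room temperature (~300 K) the floor is roughly 2.87e-21 J per bit -
# far below what today's chips dissipate, but a hard floor all the same.
print(landauer_limit(300))
```

    Current silicon is many orders of magnitude above this floor, which is exactly why the end keeps being further away than predicted - but the floor is still there.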

    1. TheOtherHobbes

      You're assuming

      you have to use matter to do the processing.

      1. Some Beggar

        Re: You're assuming

        In order to be useful to us material humans, at some point the processing has to interact with the material universe.

        Or are you suggesting we have to transcend to some higher plane of existence to get around Moore's law?

  21. Anonymous Coward
    Anonymous Coward

    Chinny Reckon?

    "the first proper quantum systems won't come online until late in the 21st century"

    I'd give it 20 years meself, and I'm not known for my optimism.

    1. amanfromearth

      Re: Chinny Reckon?

      Much uncertainty in quantum computing there is, young padawan

    2. William Towle
      Joke

      Re: Chinny Reckon?

      > "the first proper quantum systems won't come online until late in the 21st century"

      > I'd give it 20 years meself, and I'm not known for my optimism.

      And even then they'll be online and not online at the same time (...er, if I've understood it correctly?)

  22. Tom 7 Silver badge

    We havent even started yet....

    when I first started making microchips 32 years ago there were lots and lots of techniques being looked at that haven't been re-examined since, and that could easily keep Moore's law going for another decade or two.

    I have often wondered if they were really dead ends, just swept aside by the usual monotonic rush, or possibly set aside because they were no longer patentable and so wouldn't give more than a moment's advantage to anyone who developed them to fruition.

    But as has been alluded to earlier in this thread, Moore's law has an 18-month doubling and Wirth's law has software bloat doubling every 17 months.
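    Taking the 18- and 17-month figures at face value, the net effect is easy to work out: perceived speed goes as 2^(t/18 − t/17) = 2^(−t/306), i.e. things feel half as fast roughly every 306 months. A quick sanity check:

```python
def perceived_speed(months):
    """Hardware doubling every 18 months vs software bloat doubling every 17."""
    return 2 ** (months / 18 - months / 17)

print(perceived_speed(0))    # 1.0  (baseline)
print(perceived_speed(306))  # 0.5  (half as fast after ~25.5 years)
```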

    Me - I'd be quite happy with a 4-core ARM machine with a socket for adding more when I need to.

    It won't run Microsoft Office - so what? My car won't pull a plough.

  23. Daniel Garcia 2
    Go

    Naaa, there are still new technologies coming up

    A stack of graphene and silicene techno-wizardry, combined with nanotube conduits and on-chip dynamic (fast-moving fluid) heat transfer, is my bet for the future.

    We are still in our infancy in terms of internal heat transfer, which is, at the end of the day, the main physical limitation.

  24. MJI Silver badge

    Software bloat

    I am personally going the other way, trying to squeeze a little more out of it.

    Word processors need little hardware, if they do they are poorly written.

  25. stucs201

    What happened to diamond chips?

    I'm sure I read about those years ago. As I recall they were very heat-tolerant, so we could go back to the GHz race (well, for desktops; heat might continue to be an issue for stuff you have to be able to hold).

  26. nigel 15

    Everything that will ever be invented has already been.

    no really. it has.

    idiot

  27. Colin Millar
    Boffin

    Duped again

    Why is it that supposedly hard nosed tech types keep cleaving to these marketing mantras like "Moore's Law".

    It's not a law cos it doesn't have a proof. It's not even a theorem, because it isn't based on any prior established proofs. Moore himself called it a "trend" and its likely continuance for another decade a "prediction", but these days it is really just a marketing soundbite designed to promote the buying cycle.

    As a statement it is tremendously inaccurate in its terms which relate to two variables in a multi-variable environment and it seems to assume "all other things being equal" without any real reason for making such an assumption - particularly as the original statement was made over 40 years ago and carried a caveat by the guy who actually said it.

    I am constantly amazed at the ability of supposedly intelligent people to completely ignore actual evidence in favour of theologising their prejudices.

  28. kit

    Optical network, embedded memory, Graphene, Nanoptube and TSV will replace silicone downsizing.

    Silicone tech may run out of steam in perhaps 10 yrs' time. Other techs like inter-core optical networking, massive embedded fast memory (resistive or phase change memory) as well as the application of new materials (graphene and nanotube), and 3-D chips (TSV) , will replace silicone downsizing.

    1. Chemist

      Re: Optical network, embedded memory, Graphene, Nanoptube and TSV will replace silicone downsizing.

      "Silicone tech may run out of steam in perhaps 10 yrs' time."

      Silicone is wubbery polymer !

      1. kit

        Re: Optical network, embedded memory, Graphene, Nanoptube and TSV will replace silicone downsizing.

        Attached is the news release from ibm for your question.

        http://www-03.ibm.com/press/us/en/pressrelease/33115.wss

  29. rav

    Moore's Law???

    If it was Moore's Law then it would not have 10 years to go. It is Moore's observation.

  30. YARR
    Boffin

    No mention of Nanotechnology?

    I'm concerned that for an esteemed futurologist he made no mention of nanotechnology.

    I had thought that the technology for producing ICs would transition into molecular manufacturing, bringing about a new era of mechanical nanocomputers.

    When built in 3D, we'd fit the most powerful supercomputers into the size of a speck of dust, without the frequency, overheating and current leakage issues that affect electronics at this scale. Or did Eric Drexler get it wrong?

    1. attoman

      Re: No mention of Nanotechnology?

      Nanotechnology is that technology in the range of 100 nanometres down to 0.1 nanometre. Semiconductors were there in the last millennium - where were you?

      Drexler predicted there would be robot assemblers building things an atom at a time using any material. He predicted that carbon atoms would be assembled into diamond, for instance. He ignored DNA and life, and the limits of energy in the reactions that drive the ancient nano-assemblers in all life. After billions of years, if it were somehow possible to form a diamond carbon bond without spending 1000 degrees C worth of energy, wouldn't life have found it and used it instead of calcium bones or cartilage?

      Drexler also had no place for electromagnetics in his actual implementation predictions.

      Drexler got it wrong. But Drexler got us talking and thinking about Feynman's point, and he deserves a lot of credit for publicizing the field. No core nanotech inventions that I'm aware of, though, have his name as principal inventor.

  31. Chris Hennick
    Go

    Moore's Law: 10 years left since 1965!

  32. attoman

    Moore's Wannabe Law of Device Physics

    Gordon expressed a hope and goal, not a law of nature or of man. Ever since inventors have used their imagination to sustain the hope and goal and the economy of much of the world.

    Spoiled brats like the readers of this rotter's blog haven't a clue where the miracles that make their culture and life come from, or in many cases how core and critical those miracles are to their well-being, until they are denied. Then, like the obscene reaction of lawmakers and courts to the loss of their crackberry fix a few years ago, the addicts will do anything, break any code, blame any helpless group to get their drug back.

    When the miracle of ongoing Moore's Law Invention ends, so does the love affair with ever more powerful, less expensive personal computing including iPads, iPhones and all their contemporaries, and successors.

    So also does the preeminence of Silicon Valley, and the economy of the USA.

This topic is closed for new posts.

Biting the hand that feeds IT © 1998–2019