Moore's Law has ten years to run, predicts physicist

Renowned theoretical physicist Michio Kaku has predicted Moore's Law will run out of steam within the next ten years as silicon designs run up against the laws of physics. "In about ten years or so we will see the collapse of Moore's Law," he says. "In fact we already see a slowing down of Moore's Law. Computing power cannot …

COMMENTS

This topic is closed for new posts.

  1. Jeebus

    I know people get paid a lot for it, but "Futurologist" is a horrible job title; it just screams "false", like "conspiracy theorist" and such.

    1. TeeCee Gold badge
      Happy

      Don't knock it.

      It beats standing on street corners flogging lucky heather and reading palms, old skool style.

    2. nexsphil

      True

      Humans never conspire to do bad things. Especially not those with power. Anyone that proposes otherwise is insane. Despite this awesome wisdom I'm not sure my Grandkids will jump on board, looking back at the way things are today.

    3. MacGyver
      Facepalm

      Arg.

      He lost all credibility with that stupid "Sci Fi Science: Physics of the Impossible" show.

      "To travel across out galaxy, first we design a worm-hole generator." Really, that's all we need to do Dr. Michio Kaku? I love future tech shows, just not when written by and for redneck hill people.

      That kind of show, and those stupid "Ancient Aliens" shows, are corrupting real science.

      I predict that "ten years" just happens to be longer than anyone will remember he made this prediction.

      1. elderlybloke
        Angel

        Re: Arg.

        Dr Kaku is a very successful and rich BS artist.

        I actually enjoy his TV shows; he has a very creative imagination, and who knows what may come true.

        Black holes were science fiction a few decades ago, and now they are a very active subject of research.

    4. Anonymous Coward
      Anonymous Coward

      we have a futurologist

      everyone calls him the bullshit artist

  2. Anonymous Coward
    Anonymous Coward

    Moore's Law - so what

    We already have enough compute power to watch movies on mobile phones, as well as make video calls. Surely anything else is superfluous for most people's needs?

    (I am excluding Large Hadron Collider type non-personal use here, obviously :-D).

    Perhaps when we do hit the wall, the old art of hand crafting code tweaks will be back in vogue ;-)

    1. Quantum Leaper

      Re: Moore's Law - so what

      I remember when someone said 640K of RAM was more than enough for a personal computer. You will never know what tomorrow's killer app will be or how much power it will need...

      1. Christian Berger

        We all don't remember

        When people actually thought the Concorde was something sensible and that there would be faster planes after it.

        Technology often reaches a certain peak. We now wonder why people once thought that sitting in a plane for four hours to cross the Atlantic was a good thing, when you can just use e-mail or the telephone.

        I mean the whole idea of a home computer, like we now all have, was to build a computer that was less powerful than a "real computer", but also a lot smaller and cheaper.

        1. Aldous
          Facepalm

          Re: We all don't remember

          Concorde was sensible at the time it was designed and built: no email and cheap oil. If it wasn't for the nimbys/tree huggers (propped up by US Aerospace) you would have supersonic flight everywhere (or at least the option of it).

          Instead there were the whole "think of the children" protests over sonic booms over land, threats by individual states not to allow Concorde overhead, etc., and so they were restricted to supersonic over water only (wonder how long UK-Australia would take in a long-range Concorde?). Not to mention email/phone is not so good for some stuff - supersonic cargo planes, anyone?

          Mind you, it's probably for the best given the "OMGZ THE SKY IS FALLING" reaction to the recent Eurofighter sonic booms (I was in an affected area and thought my motorcycle had fallen over again, not that it was the end of the world, omg terrorists!). Some people even claimed to see bright flashes of light accompanying the boom, ffs.

          1. Jan 0 Silver badge
            Pint

            @Aldous

            Sensible? It was a justifiable choice; nevertheless, there were other choices. There could have been a bigger push to improve person-to-person text*, audio and video communication instead. Maybe you wouldn't be thinking of supersonic cargo planes if we now had high-speed submarine container ships. Why do people want under-ripe tropical fruit anyway? Maybe we should have empowered psychologists back in the 1950s!

            *Weren't teletype 'hotlines' more important to governments and businesses than passenger 'planes anyway?

            I'll raise a pint to Vannevar Bush, but why was Alan Kay so late?

            1. Anonymous Coward
              Anonymous Coward

              Speaking of communication...

              ...the basic idea of the internet dates back to a 1960 paper from the RAND Corporation.

              http://www.rand.org/pubs/papers/P1995.html

              I'll raise a pint to Paul Baran, but why did Peter Kay stop being funny?

          2. Nigel 11

            Bright flashes ...

            "some people even claimed to see bright flashes of light acompanying the boom ffs"

            Almost certainly, they did. It's called synaesthesia. It's quite common. For most people it occurs only when one of their senses is overloaded by a sudden and unexpected input. There is some sort of neural spill-over in their mental processes that registers as a different sense. If the triggering experience is sufficiently rare, they may not recognise it as an internal rather than an external phenomenon.

            For me, a sudden loud noise also registers as taste (acid on my tongue).

            For a smaller number of people, the linkage between their senses is a permanent part of everyday experience. They're not mad, because they are fully aware that it's their own internal "wiring" that is different to that of most other people, and because it doesn't cause them any distress.

          3. Anonymous Coward
            Anonymous Coward

            Re: We all don't remember

            "some people even claimed to see bright flashes of light accompanying the boom"

            You've never known someone with interlinked senses, then? People who get coloured flashes in front of their eyes when they hear certain types of music or taste certain flavours? Some people are wired differently; the world doesn't always conform to what the textbooks (the Wiki and forums!) state. For 98% it's true, but there are the special ones who offset the averages.

          4. Anonymous Coward
            Anonymous Coward

            Re: We all don't remember

            "If it wasn't for the nimbys/tree huggers (propped up by US Aerospace) you would have supersonic flight everywhere (or at least the option of it)."

            I'm not so sure about that. Supersonic flight uses a huge amount of fuel, and short of charging utterly ridiculous ticket prices it's hard to see how it could be a viable business proposition these days. British Airways (don't know about Air France) didn't ditch Concorde because of safety fears; it ditched it because the aircraft required huge amounts of maintenance and was making very little money. The safety scare was simply an excuse to get rid of a "prestige" service that was actually a boat anchor around the company's finances.

            "(wonder how long uk-australia would take in a long range concorde?)."

            Quite a long time. It would have had to stop at least twice to refuel.

            1. Tom 13

              @Baltar

              No, he's right. The reason Concorde didn't make money is because most of the places where it could have made money were put off limits by the tree huggers here in the US. Old Blighty to Australia might not work, but the US would have. If we'd granted the permits, which we should have. And I expect that had THAT worked, they could have upped the fuel capacity on a newer model to make the Aussie run work too.

              1. Michael Wojcik Silver badge

                Re: @Tom 13

                > The reason Concorde didn't make money is because most of the places where it could have

                > made money were put off limits by the tree huggers here in the US.

                Evidence, please. The air-travel market is pretty mature, transparent, and efficient, and it's barely profitable as it is. No doubt contemporary SST commercial-passenger aircraft would be more fuel-efficient than the Concorde, had development continued; but SSTs would still be ravenous consumers of fuel, so their ticket premium would be large - and even more sensitive to oil-price spikes than conventional air travel.[1]

                So commercial SST travel would only be profitable on routes where the time savings was sufficient to justify the large price premium to a substantial fraction of the market. Most travelers already take the cheapest option possible, choosing multiple-segment coach itineraries over direct flights and more comfortable seating. Businesses, meanwhile, have cut back both on premium travel options and on travel overall, as an easy area for cost savings.

                Historically, worldwide premium air travel has grown at around 5% annually, but much of this is international.[2] And in recent years it has stumbled badly, as it did in 2009 and 2011.[3]

                Maybe commercial SST travel would have been profitable - but I tend to doubt it.

                [1] This can be hedged, of course, and the airlines do; but ultimately that just spreads the cost out.

                [2] See eg this IATA press release

                [3] See eg this Bloomberg piece from 2009 and this from Peter Greenberg in 2011

          5. Christian Berger

            Re: We all don't remember

            Well, that's exactly the point. It used to be sensible _at the time_. Today it's just idiotic. The requirements of a society change. Maybe one day we won't strive for faster computers any more, just as we no longer want faster modes of transport.

        2. /dev/null

          Re: We all don't remember

          Quite. We've got used to computer technology becoming more powerful every year, but the world won't end if that doesn't happen any more. Intel and AMD might find it harder to sell chips though.

          The aerospace industry is an interesting comparison - in its first 60 years, it went from the Wright brothers to the Chinook helicopter. In the subsequent 50 years, it has gone from the Chinook helicopter to... more Chinook helicopters.

          1. Morg

            Re: We all don't remember

            That's mostly an illusion: the only military tech you know of is public knowledge, i.e. those weapons that were shown off, or that governments could not hide because they used them extensively.

            However, there has been ongoing military research, and no real war to force the new tech onto the field, for at least 50 years. Anyone with half a brain understands that there has to be a ton of stuff in store for the unlikely event of another real war.

          2. Ben Holmes
            Joke

            Re: /dev/null

            Actually, taking the latest Strategic Defence Review into account, it means 'no more Chinook helicopters.'

          3. Paul_Murphy

            Re: We all don't remember

            Similarly, it didn't take that long to go from Kitty Hawk to the Moon, but after roughly the same period of time we have gone from the Moon to nowhere.

            ttfn

        3. Andus McCoatover
          Windows

          Re: We all don't remember

          We worry about "Moore's Law", then forget that an atom is only so big. So while we talk about getting a transistor's gate down to six atoms thick, we drive to the lab in a "suck-squeeze-bang-blow"* vehicle whose technology was invented 120 years ago and hasn't changed much since... When someone shrinks the atom... (No, Rick Moranis, that's not a cue for a sequel!)

          *Infernal combustion engine.

        4. Anonymous Coward
          Anonymous Coward

          Re: We all don't remember

          They did *have* telephones in the 1960s. It wasn't a case of flying across the Atlantic because that was the only way to talk to somebody on the other side. It was about getting there in 40% less time. I take it you regard the High Speed Rail plan as ridiculous on the same grounds?

      2. Pen-y-gors

        Next killer app?

        To be honest, the last 'killer apps' for me were e-mail and a web browser. (both circa 1992). We could have a long wait...

      3. Anonymous Coward
        Unhappy

        Re: Moore's Law - so what

        "You will never know what tomorrow killer app will be or how much power it will need..."

        Apps just suck up more and more power but don't deliver the equvalent amount of functionality. Do I really need a Ghz class CPU to run a friggin word processor when , for example, MacWrite which had perfectly servicable functionality and and pleasant GUI ran happily on a 20Mhz 68000?

        Sure , for 3D games, AI and maths intensive operations such as image transforms in photoshop you need the fastest silicon you can get, but for everything else? No , sorry. The reason most apps require more CPU is a combination of lazy and/or incompetent programmers, inefficient bloated libraries and in a lot of cases slow , memory sucking managed languages.

      4. Dire Critic
        Facepalm

        Re: Moore's Law - so what

        "You will never know what tomorrow killer app will be or how much power it will need..."

        We just know that it'll be bloated, slow and take up twice as much drive space as its predecessor.

      5. Anonymous Coward
        Anonymous Coward

        Re: Moore's Law - so what

        virtual reality porn........... it's the next big app

    2. atippey
      Facepalm

      Re: Moore's Law - so what

      Goddammit, I want sub-pixel polygons, molecular level physics simulation, and a direct-to-brain interface. Otherwise, Crysis IX will not be worth my money.

      1. Omgwtfbbqtime
        Trollface

        "Otherwise, Crysis IX will not be worth my money."

        It won't anyway as it will only take 8 minutes to complete.

      2. Evil Auditor Silver badge

        Re Crysis IX will not be worth my money

        IX?! Not even Crysis 2 was worth the money.

  3. pip25
    Joke

    Oooh, time travel!

    I think I've read this article already, around ten years ago!

    1. RIBrsiq
      Thumb Up

      Re: Oooh, time travel!

      I recall reading it in the early nineties, myself...

      No, really: the number of graphs and whatnot showing this exponential growth trend in computing power surviving several paradigm shifts is *huge*. Just Google it and see.

  4. Anonymous Coward
    Anonymous Coward

    often stated and pretty consistently wrong

    Since both Intel and AMD have been leaking their experiments with real multi-layer 3D chip designs, I'd say that in 10 years we will have hammered out another generation of fairly conventional silicon production by building up instead of shrinking. Considering the big problem with this has actually been heat (as in their test Pentium 3 + RAM dies melting in the lab), and there have been some interesting developments in that arena in the last few months, I am betting that we can look forward to quite a few more "Moore's Law" keynotes. Then will probably come optical interconnects, persistent-state RAM and a host of other new tech.

    Pity too, because when all of that runs out of headroom, things may actually get interesting. Until then, unless fabs stop costing tens of billions of dollars, things will probably stay incremental and safe and dull. Though I do wonder if AMD will be around to see it (as anything other than a GPU house, at least).

    1. Morg

      Re: often stated and pretty consistently wrong

      Optical interconnects are NOT better... light travels only marginally faster than electrons, AND it requires two electrical-to-optical converters to work. It is highly unlikely that this will be used any time soon INSIDE a CPU, or even on a motherboard.

      Don't forget graphene and frequency scaling. There's a lot of room there still.

      1. Hungry Sean
        Boffin

        Re: often stated and pretty consistently wrong

        If you're going to act authoritative, at least be correct. First, the electrons themselves aren't traveling at speed through a wire; the charge is (think of a wave moving across the ocean). Second, the problem isn't the speed of propagation of electricity through an isolated wire, it's the capacitance with neighboring wires that introduces a lovely RC time constant into the switching. Capacitance also allows wires to potentially induce errors on each other. Between these two things, you end up needing beefy transistors to drive the wire, extra transistors at the receiving end to avoid some of the potential nasty effects, and possibly dead wires in between to "shield" things.

        Optical interconnects don't have these problems, and so they could certainly be better on the scale of a PCB and might even make sense as global interconnects on a die, or maybe even as an alternative to through-silicon vias. As I understand it, the main barrier to all of this cool stuff is that the technology is very young, not that it is a dumb idea in principle.
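
        For the curious, here's a minimal back-of-the-envelope sketch (Python) of why the RC term, not raw signal speed, dominates on-chip wire delay. The per-mm resistance and capacitance are assumed ballpark figures for a thin on-chip wire, not real process numbers:

          # Back-of-the-envelope: distributed-RC delay of an unbuffered wire
          # versus the raw flight time of the signal. Per-mm values are assumed,
          # illustrative ballpark figures, not process data.
          R_PER_MM = 1000.0            # ohms per mm of thin on-chip wire (assumed)
          C_PER_MM = 0.2e-12           # farads per mm, incl. coupling to neighbours (assumed)
          SIGNAL_SPEED_MM_S = 1.5e11   # ~half the speed of light, in mm per second

          def rc_delay_s(length_mm):
              """Elmore delay of a distributed RC wire: 0.5 * R_total * C_total."""
              return 0.5 * (R_PER_MM * length_mm) * (C_PER_MM * length_mm)

          def flight_time_s(length_mm):
              """Time for the electromagnetic wave itself to cross the wire."""
              return length_mm / SIGNAL_SPEED_MM_S

          for length in (1.0, 5.0, 10.0):
              print(f"{length:4.1f} mm wire: RC delay ~{rc_delay_s(length)*1e12:7.0f} ps, "
                    f"flight time ~{flight_time_s(length)*1e12:5.1f} ps")

        The RC delay grows with the square of the wire length while the flight time grows only linearly, which is why long wires get buffered - and why optics starts to look attractive at board scale.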

      2. Field Marshal Von Krakenfart
        Boffin

        Re: often stated and pretty consistently wrong

        Actually, electrons move very slowly: it could take a free electron anything up to 12 hours to travel through 1 metre of copper, depending on the diameter of the copper, the voltage and the current.

        I always understood that the real CPU limit was not the mask size but a combination of mask size and clock speed. As most clock speeds are now well into the radio-frequency range, the real limiting factor is radio-frequency interference (RFI) within the chip, where a signal on one track in a CPU induces an unwanted current in an adjacent track.

        Because of this we may hit the limit for CPU clock speeds and transistor density before we hit any physical manufacturing constraint.
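
        Rough numbers for anyone who wants to check the two claims above, as a small Python sketch. The 1 A current and 1 mm² cross-section are my own assumed example values, not anything measured:

          # Sanity check 1: electron drift velocity in copper, v = I / (n * q * A).
          N_COPPER = 8.5e28        # free electrons per cubic metre of copper
          Q_ELECTRON = 1.602e-19   # electron charge, coulombs
          CURRENT_A = 1.0          # assumed current
          AREA_M2 = 1.0e-6         # assumed cross-section: 1 mm^2

          drift_v = CURRENT_A / (N_COPPER * Q_ELECTRON * AREA_M2)   # metres per second
          print(f"drift velocity ~{drift_v*1000:.2f} mm/s, "
                f"~{1.0 / drift_v / 3600.0:.1f} hours to cover one metre")

          # Sanity check 2: at GHz clock rates a signal (moving at roughly half
          # the speed of light) only covers a few centimetres per clock cycle.
          for clock_ghz in (1.0, 3.0, 5.0):
              cm_per_cycle = 0.5 * 3.0e10 / (clock_ghz * 1e9)   # 3.0e10 cm/s = c
              print(f"{clock_ghz:.0f} GHz clock: ~{cm_per_cycle:.0f} cm of signal travel per cycle")

        Lower currents or thicker wire make the drift even slower, so hours-per-metre figures are entirely plausible; the clock-period numbers show why on-chip timing and crosstalk bite long before electron speed does.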

        As for EUV, I was also under the impression that some masks were now photo-etched using the non-visible spectrum (X-rays?).

        Technical bits open to correction.

  5. gronkle

    It does seem a bit of an affliction for scientists, once they reach a certain age or level of influence, to start making "that's it, we've done it all, it's all downhill from now on" type pronouncements, despite every previous generation having faced the same kind of pronouncements themselves.

    My money (none) is on Graphene keeping us ticking for another few years yet...

    http://www.theregister.co.uk/2011/09/22/src_nsf_research_grants/

    1. Kristian Walsh Silver badge

      He was talking about Silicon, though...

      Graphene is one of the most likely materials to take over from Silicon, I think, but even if it doesn't, it has so many other useful properties that you're going to be seeing it in lots of products long before Silicon runs out of steam: first as a replacement for the difficult-to-source Indium used in touch displays (where it will allow capacitive and capacitive+resistive designs), but also as an engineering and surfacing material.

      But Kaku did say that the problem is with the physics of *Silicon*, and it's pretty hard to dispute this argument. There's a certain critical mass of atoms below which a semiconductor junction won't work. Semiconductor junctions work on the principle of doping pure silicon with other elements; these impurities are what provide the "one-way current" behaviour that all digital electronics relies on.

      Make these features too small, and the absolute number of doping atoms becomes significant, rather than their ratio (a Silicon atom has a diameter of a touch over 0.2 nanometres, so a 5 nm feature size is less than 25 Si atoms across).
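
      As a quick arithmetic check of that figure (a minimal Python sketch; the ~0.22 nm atomic diameter is taken from the "touch over 0.2 nm" above, and the feature sizes are just examples):

        # How many silicon atoms span a given feature size?
        SI_ATOM_DIAMETER_NM = 0.22   # "a touch over 0.2 nanometres"

        for feature_nm in (22, 14, 5):
            atoms_across = feature_nm / SI_ATOM_DIAMETER_NM
            print(f"{feature_nm:2d} nm feature ~ {atoms_across:.0f} Si atoms across")

      At that scale a handful of dopant atoms more or less per junction becomes a big statistical swing, which is the point about ratio versus absolute count.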

      Of course, this doesn't preclude three-dimensional construction of devices (although cooling is a major problem here), or hybrid Silicon/something-else designs, but I think that's his point: using Silicon alone, you cannot go on forever reducing feature sizes. My guess is that it'll be economics, not physics, that prevents us reaching the theoretical limit of Silicon.

      1. Kool-Aid drinker
        Thumb Up

        Re: He was talking about Silicon, though...

        Aye, the stone age didn't end because the stone ran out or we exhausted its possibilities, it ended because something better came along, and so it will be with silicon.

        1. Jedit Silver badge
          Headmaster

          "The Stone Age didn't end because the stone ran out"

          Well, obviously - if it had, we wouldn't have been able to have a Silicon Age.

        2. Tom 13
          Joke

          Re: ...it ended because something better came along...

          Damn young whipper-snappers!

          Paper wasn't any improvement over stone for record keeping - it rots; stone is forever. Sure, it requires a bit more space to store than paper, but once you're done with acid-free paper, humidity controls and temperature controls (and their effects on our atmosphere) you haven't really gained ANYTHING. And don't get me started on the silly crap about using electrons to store data!

    2. Dave 126 Silver badge

      "It does seem a bit of an affliction for scientists once they reach a certain age or level of influence to start making "that's it, we've done it all, it's all down here from now on" type pronouncements"

      -gronkle

      Or, put another way:

      When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.

      -Arthur C. Clarke, Clarke's first law

      1. Anonymous Coward
        Anonymous Coward

        Corollary

        You've forgotten Isaac Asimov’s Corollary to Clarke's first law:

        "When, however, the lay public rallies round an idea that is denounced by distinguished but elderly scientists and supports that idea with great fervor and emotion — the distinguished but elderly scientists are then, after all, probably right."

        and Egbert's Corollary to Asimov's Corollary:

        "Whenever the lay public repeat and riducule the original statement they will conveniently forget that it was made in a specific context with specific conditions and instead will generalise it to a groesque characature of what was originally said".

        Thomas J. Watson: "I think there is a world market for maybe five computers". If he said it at all (dubious) it would have held true for about 10 years and, in any case, was probably (and correctly) being said about the computers being created *at that time*.

        Michio Kaku: "Computing power cannot maintain its rapid exponential rise using *standard silicon technology*."

  6. Rolland Trolland

    And not a moment too soon!

    The horrific bloated slop that passes for code these days is an embarrassment to anyone of pre-GUI age.

    Maybe when/if a processor cap appears, Moore's Law could be continued (in a fashion) by people dumping some of these lazy libraries and putting a bit more thought into their code, so that the processor is actually doing something useful and not merely navigating its way around excessive layers of pointless abstraction!

    1. Mussie (Ed)
      Thumb Up

      Re: And not a moment too soon!

      You must be a fan of Mel

      http://foldoc.org/The+story+of+Mel,+a+Real+Programmer

      1. Rolland Trolland
        Thumb Up

        @Mussie

        Aye, those were t'days. Real programmers wrote code by candlelight with t' rusty soldering iron, on a motherboard made out of a sheep's bladder :)

    2. Anonymous Coward 101

      Re: And not a moment too soon!

      "The horrific bloated slop that passes for code these days is an embarrassment to anyone of pre GUI age."

      Why, there was a mythical golden age of wondrous coding! And did kids know their place as well?!

    3. Some Beggar
      FAIL

      Re: And not a moment too soon!

      Absolute twaddle. There was shit code in the 1970s and there's good code today. Engineering is all about pragmatism and engineering effort is expensive. There's no excuse for sloppy code, but there's equally no excuse for wasting effort optimising something to the Nth degree when you don't need to. The increased power of processors and the increased capability of high-level languages is unequivocally a Good Thing. You can hand craft an entire system in assembler on punch cards if you want. The rest of us will take advantage of whatever whizzbangery we have available.

    4. Michael H.F. Wilkinson Silver badge
      Happy

      Re: And not a moment too soon!

      As Niklaus Wirth says: Software is getting slower faster than hardware is getting faster.

      Word 2.0 was a very capable word processor, and ran happily on my 25 MHz 80386 machine with 4 MB of RAM (I really splashed out on that machine :-) ). Word 2010 would require rather more. More, in fact, than the processing power and memory of the good old Cray Y-MP. That is worrying.

      GUIs of course do need more resources, but the above example suggests you can run a full-blown GUI-based word processor in 1,000 times less memory than we seem to need today. If you look at the memory footprint of something like FLTK, which is so small you can afford to link it statically for easier distribution, and compare that to some other tools, you cannot help but question the efficiency of some code.

      Much of the best coding is going on in the embedded-systems world. You really have to conserve resources in that arena.
