I know people get paid a lot, but "Futurologist" is a horrible title; it just screams "false", like "conspiracy theorist" and such.
Moore's Law has ten years to run, predicts physicist
Renowned theoretical physicist Michio Kaku has predicted Moore's Law will run out of steam within the next ten years as silicon designs run up against the laws of physics. "In about ten years or so we will see the collapse of Moore's Law," he says. "In fact we already see a slowing down of Moore's Law. Computing power cannot …
-
-
Tuesday 1st May 2012 10:27 GMT MacGyver
Arg.
He lost all credibility with that stupid "Sci Fi Science: Physics of the Impossible" show.
"To travel across our galaxy, first we design a worm-hole generator." Really, that's all we need to do, Dr. Michio Kaku? I love future-tech shows, just not when written by and for redneck hill people.
That kind of show, and those stupid "Ancient Aliens" shows are corrupting real science.
I predict that "Ten years", just happens to be longer than anyone would remember that he made that prediction.
-
Monday 30th April 2012 23:14 GMT Anonymous Coward
Moore's Law - so what
We already have enough compute power to watch movies on mobile phones, as well as make video calls. Surely anything else is superfluous for most people's needs?
(I am excluding Large Hadron Collider type non-personal use here, obviously :-D).
Perhaps when we do hit the wall, the old art of hand crafting code tweaks will be back in vogue ;-)
-
-
Tuesday 1st May 2012 08:07 GMT Christian Berger
We all don't remember
When people actually thought the Concorde was something sensible and that there would be faster planes after it.
Technology often reaches a certain peak. We now wonder why people once thought that sitting for 4 hours in a plane to cross the Atlantic is a good thing, when you can just use e-mail or the telephone.
I mean the whole idea of a home computer, like we now all have, was to build a computer that was less powerful than a "real computer", but also a lot smaller and cheaper.
-
Tuesday 1st May 2012 09:00 GMT Aldous
Re: We all don't remember
Concorde was sensible at the time it was designed and built: no email, and cheap oil. If it wasn't for the nimbys/tree huggers (propped up by US aerospace) you would have supersonic flight everywhere (at least the option of it).
Instead there were the whole "think of the children" protests over sonic booms over land, threats by individual states not to allow Concorde overhead, etc., and so they were restricted to supersonic over water only (wonder how long UK-Australia would take in a long-range Concorde?). Not to mention that email/phone is no good for some stuff; supersonic cargo planes, anyone?
Mind you, it's probably for the best, given the "OMGZ THE SKY IS FALLING" reaction to the recent Eurofighter sonic booms (I was in an affected area and thought my motorcycle had fallen over again, not that it was the end of the world, omg terrorists!). Some people even claimed to see bright flashes of light accompanying the boom, ffs.
-
Tuesday 1st May 2012 10:17 GMT Jan 0
@Aldous
Sensible? It was a justifiable choice, nevertheless, there were other choices. There could have been a bigger push to improve person to person text*, audio and video communication instead. Maybe you wouldn't be thinking of supersonic cargo planes if we now had high speed submarine container ships. Why do people want under-ripe tropical fruit anyway? Maybe we should have empowered psychologists back in the 1950s!
*Weren't teletype 'hotlines' more important to governments and businesses than passenger 'planes anyway?
I'll raise a pint to Vannevar Bush, but why was Alan Kay so late?
-
Friday 4th May 2012 11:34 GMT Anonymous Coward
Speaking of communication...
...the basic idea of the internet dates back to a 1960 paper from the RAND Corporation.
http://www.rand.org/pubs/papers/P1995.html
I'll raise a pint to Paul Baran, but why did Peter Kay stop being funny?
-
-
Tuesday 1st May 2012 10:38 GMT Nigel 11
Bright flashes ...
"some people even claimed to see bright flashes of light accompanying the boom ffs"
Almost certainly, they did. It's called synaesthesia. It's quite common. For most people it occurs only when one of their senses is overloaded by a sudden and unexpected input. There is some sort of neural spill-over in their mental processes that registers as a different sense. If the triggering experience is sufficiently rare, they may not recognise it as an internal rather than an external phenomenon.
For me, a sudden loud noise also registers as taste (acid on my tongue).
For a smaller number of people, the linkage between their senses is a permanent part of everyday experience. They're not mad, because they are fully aware that it's their own internal "wiring" that is different to that of most other people, and because it doesn't cause them any distress.
-
Tuesday 1st May 2012 11:31 GMT Anonymous Coward
Re: We all don't remember
"some people even claimed to see bright flashes of light accompanying the boom"
You've never known someone with interlinked senses then? People who get coloured flashes in front of their eyes when they hear certain types of music or taste certain flavours? Some people are wired differently; the world doesn't always conform to what the textbooks (the Wiki and forums!) state. For the 98% it's true, but there are the special ones who offset the averages.
-
Tuesday 1st May 2012 11:35 GMT Anonymous Coward
Re: We all don't remember
"If it wasn't for the nimbys/tree huggers (propped up by US Aerospace) you would have supersonic flight everywhere (at least the option of)."
I'm not so sure about that. Supersonic flight uses a huge amount of fuel, and short of charging utterly ridiculous ticket prices it's hard to see how it could be a viable business proposition these days. British Airways (don't know about Air France) didn't ditch Concorde because of safety fears; it ditched it because the aircraft required huge amounts of maintenance and was making very little money. The safety scare was simply an excuse to get rid of a "prestige" service that was actually a boat anchor around the company's finances.
"(wonder how long uk-australia would take in a long range concorde?)."
Quite a long time. It would have had to stop at least twice to refuel.
-
Tuesday 1st May 2012 13:39 GMT Tom 13
@Baltar
No, he's right. The reason Concorde didn't make money is because most of the places where it could have made money were put off limits by the tree huggers here in the US. Old Blighty to Australia might not work, but the US would have. If we'd granted the permits, which we should have. And I expect that had THAT worked, they could have upped the fuel capacity on a newer model to make the Aussie run work too.
-
Tuesday 1st May 2012 21:52 GMT Michael Wojcik
Re: @Tom 13
> The reason Concorde didn't make money is because most of the places where it could have
> made money were put off limits by the tree huggers here in the US.
Evidence, please. The air-travel market is pretty mature, transparent, and efficient, and it's barely profitable as it is. No doubt contemporary SST commercial-passenger aircraft would be more fuel-efficient than the Concorde, had development continued; but SSTs would still be ravenous consumers of fuel, so their ticket premium would be large - and even more sensitive to oil-price spikes than conventional air travel.[1]
So commercial SST travel would only be profitable on routes where the time savings was sufficient to justify the large price premium to a substantial fraction of the market. Most travelers already take the cheapest option possible, choosing multiple-segment coach itineraries over direct flights and more comfortable seating. Businesses, meanwhile, have cut back both on premium travel options and on travel overall, as an easy area for cost savings.
Historically, worldwide premium air travel has grown at around 5% annually, but much of this is international.[2] And in recent years it has stumbled badly, as it did in 2009 and 2011.[3]
Maybe commercial SST travel would have been profitable - but I tend to doubt it.
[1] This can be hedged, of course, and the airlines do; but ultimately that just spreads the cost out.
[2] See eg this IATA press release
[3] See eg this Bloomberg piece from 2009 and this from Peter Greenberg in 2011
-
-
-
-
Tuesday 1st May 2012 09:49 GMT /dev/null
Re: We all don't remember
Quite. We've got used to computer technology becoming more powerful every year, but the world won't end if that doesn't happen any more. Intel and AMD might find it harder to sell chips though.
The aerospace industry is an interesting comparison - in its first 60 years, it went from the Wright brothers to the Chinook helicopter. In the subsequent 50 years, it has gone from the Chinook helicopter to... more Chinook helicopters.
-
Tuesday 1st May 2012 13:20 GMT Morg
Re: We all don't remember
That's mostly an illusion, the only military tech you know of is public knowledge, i.e. those weapons that were shown and that governments could not hide because they extensively used them.
However there has been ongoing military research and no real war to force the new tech on the field for at least 50 years, anyone with half a brain understands that there has to be a ton of stuff in store for the unlikely event of another real war.
-
-
Tuesday 1st May 2012 11:35 GMT Andus McCoatover
Re: We all don't remember
We worry about "Moore's Law" then forget that an atom is just so sized. So while we talk about getting a transistor's gate down to 6 atoms thick, we drive to the lab in a "suck-squeeze-bang-blow"* driven vehicle, the technology of which was invented 120 years ago and hasn't changed much since... When someone shrinks the atom... (No, Rick Moranis, that's not a cue for a sequel!)
*Infernal combustion engine.
-
Friday 4th May 2012 11:08 GMT Anonymous Coward
Re: We all don't remember
They did *have* telephones in the 1960s. It wasn't a case of flying across the Atlantic because that was the only way to talk to somebody on the other side. It was about getting there in 40% less time. I take it you regard the High Speed Rail plan as ridiculous on the same grounds?
-
-
Tuesday 1st May 2012 11:28 GMT Anonymous Coward
Re: Moore's Law - so what
"You will never know what tomorrow killer app will be or how much power it will need..."
Apps just suck up more and more power but don't deliver the equivalent amount of functionality. Do I really need a GHz-class CPU to run a friggin' word processor when, for example, MacWrite, which had perfectly serviceable functionality and a pleasant GUI, ran happily on a 20MHz 68000?
Sure, for 3D games, AI and maths-intensive operations such as image transforms in Photoshop you need the fastest silicon you can get, but for everything else? No, sorry. The reason most apps require more CPU is a combination of lazy and/or incompetent programmers, inefficient bloated libraries and, in a lot of cases, slow, memory-sucking managed languages.
-
-
-
Monday 30th April 2012 23:51 GMT Anonymous Coward
often stated and pretty consistently wrong
Since both Intel and AMD have been leaking their experiments with real multi-layer 3D chip designs, I'd say that in 10 years we will have hammered out another generation of fairly conventional silicon production by building up instead of shrinking. Considering the big problem with this has actually been heat (as in their test Pentium 3 + RAM dies melting in the lab), and there have been some interesting developments in that arena in the last few months, I am betting that we can look forward to quite a few more "Moore's Law" keynotes. Then will probably come optical interconnects, persistent-state RAM and a host of other new tech.
Pity too, because when all of that runs out of headroom, things may actually get interesting. Until then, unless fabs stop costing tens of billions of dollars, things will probably stay incremental and safe and dull. Though I do wonder if AMD will be around to see it (as anything other than a GPU house, at least).
-
Tuesday 1st May 2012 13:24 GMT Morg
Re: often stated and pretty consistently wrong
Optical interconnects are NOT better... light travels only marginally faster than electrons, AND it requires two electric-to-optical converters to work. It is highly unlikely that this will be used anytime soon INSIDE a CPU or even a motherboard.
Don't forget graphene and frequency scaling. There's a lot of room there still.
-
Tuesday 1st May 2012 14:41 GMT Hungry Sean
Re: often stated and pretty consistently wrong
If you're going to act authoritative, at least be correct. First, the electrons themselves aren't traveling at speed through a wire; the charge is (think of a wave moving across the ocean). Second, the problem isn't the speed of propagation of electricity through an isolated wire, it's the capacitance with neighboring wires that introduces a lovely RC time constant into the switching. Capacitance also allows wires to potentially induce errors on each other. Between these two things, you end up needing beefy transistors to drive the wire, extra transistors at the receiving end to avoid some of the potential nasty effects, and possibly dead wires in between to "shield" things.
Optical interconnects don't have these problems and so they could certainly be better on the scale of a PCB and might even make sense as global interconnects on a die, or maybe even as an alternative to through silicon vias. As I understand it, the main barrier to all of this cool stuff is that the technology is very young, not that it is a dumb idea in principle.
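The RC point above can be put in rough numbers. A quick back-of-the-envelope sketch; all values are illustrative ballpark figures for a long, thin on-die wire, not measurements:

```python
import math

# Lumped RC model of a long on-die interconnect.
# R_wire and C_wire are assumed ballpark values, not measurements.
R_wire = 1_000.0   # ohms: a long, thin metal wire has real resistance
C_wire = 200e-15   # farads: capacitance to neighbouring wires and substrate

tau = R_wire * C_wire          # RC time constant of the wire
t_90 = tau * math.log(10)      # time to charge to ~90% of the supply rail

print(f"RC time constant: {tau * 1e12:.0f} ps")   # 200 ps
print(f"~90% settling:    {t_90 * 1e12:.0f} ps")  # ~461 ps
```

On those made-up but plausible numbers, the far end of the wire takes nearly half a nanosecond to settle, comparable to a whole clock period at 2 GHz, which is exactly why you need the beefy drivers and shielding described above.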
-
Tuesday 1st May 2012 17:33 GMT Field Marshal Von Krakenfart
Re: often stated and pretty consistently wrong
Actually electrons move very slowly; it could take a free electron anything up to 12 hours to travel through 1 metre of copper, depending on the diameter of the copper, the voltage and the current.
I always understood that the real CPU limit was not the mask size but a combination of mask size and clock speed; as most clock speeds are now well into the radio-frequency range, the real limiting factor is Radio Frequency Interference (RFI) within the chip, where a signal in one track in a CPU induces an unwanted current in an adjacent track.
Because of this we may hit the limit for CPU clock speeds and transistor density before we hit any physical manufacturing constraint.
As for EUV, I was also under the impression that some masks were now photo-etched using the non-visible spectrum (x-Rays???)
Technical bits open to correction.
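The slow-electron claim checks out against the standard drift-velocity formula v = I / (nAq). A quick sketch with assumed textbook-style values (one free electron per copper atom, and an arbitrary 1 A through a 1 mm² wire):

```python
# Electron drift velocity in a copper wire: v = I / (n * A * q).
# n assumes one free electron per copper atom; I and A are
# illustrative choices, not figures from the comment above.

n = 8.5e28      # free electrons per cubic metre of copper
q = 1.602e-19   # electron charge in coulombs
I = 1.0         # current in amperes
A = 1e-6        # cross-section: 1 mm^2 expressed in m^2

v_drift = I / (n * A * q)                 # metres per second
hours_per_metre = 1.0 / v_drift / 3600.0  # time to drift one metre

print(f"drift velocity:  {v_drift * 1e3:.3f} mm/s")   # 0.073 mm/s
print(f"one metre takes: {hours_per_metre:.1f} hours") # ~3.8 hours
```

With a smaller current or a fatter wire the figure stretches toward the 12 hours quoted above. The signal itself, of course, propagates at a sizeable fraction of light speed; it's only the individual electrons that dawdle.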
-
-
-
Tuesday 1st May 2012 00:58 GMT gronkle
It does seem a bit of an affliction for scientists, once they reach a certain age or level of influence, to start making "that's it, we've done it all, it's all downhill from now on" type pronouncements, despite every generation having faced the same kind of pronouncements themselves.
My money (none) is on Graphene keeping us ticking for another few years yet...
http://www.theregister.co.uk/2011/09/22/src_nsf_research_grants/
-
Tuesday 1st May 2012 08:36 GMT Kristian Walsh
He was talking about Silicon, though...
Graphene is one of the most likely materials to take over from silicon, I think, but even if it doesn't, it has so many other useful properties that you're going to be seeing it in lots of products long before silicon runs out of steam: first as a replacement for the difficult-to-source indium used in touch displays (where it will allow capacitive and capacitive+resistive designs), but also as an engineering and surfacing material.
But Kaku did say that the problem is with the physics of *Silicon*, and it's pretty hard to dispute this argument. There's a certain critical mass of atoms below which a semiconductor junction won't work. Semiconductor junctions work on the principle of doping pure silicon with other elements; these impurities are what provide the "one-way current" behaviour that all digital electronics relies on.
Make these features too small, and the absolute number of doping atoms becomes significant, rather than their ratio (a silicon atom has a diameter of a touch over 0.2 nanometres, so a 5 nm feature size is less than 25 Si atoms across).
Of course, this doesn't preclude three-dimensional construction of devices (although cooling is a major problem here), or hybrid Silicon/Something-else designs, but I think that's his point: using Silicon alone, you cannot go on forever reducing feature sizes. My guess is that it'll be economics, not physics that prevents us reaching the theoretical limit of Silicon.
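The atoms-across arithmetic is easy to sketch. The 0.22 nm effective spacing here is an assumed round figure, a touch over the ~0.2 nm diameter quoted above:

```python
# Atoms spanning a feature: feature size divided by effective atomic spacing.
# SI_SPACING_NM is an assumed round figure, not an exact lattice constant.
SI_SPACING_NM = 0.22

for feature_nm in (22, 14, 7, 5):
    atoms = feature_nm / SI_SPACING_NM
    print(f"{feature_nm:2d} nm feature is roughly {atoms:3.0f} Si atoms across")
```

At 5 nm that's about 23 atoms across, matching the "less than 25" figure above, and it makes clear why a handful of dopant atoms more or less starts to matter at that scale.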
-
-
Tuesday 1st May 2012 13:44 GMT Tom 13
Re: ...it ended because something better came along...
Damn young whipper-snappers!
Paper wasn't any improvement over stone for record keeping - it rots; stone is forever. Sure, stone requires a bit more space to store than paper, but once you're done with acid-free paper, humidity controls and temperature controls (and their effects on our atmosphere) you haven't really gained ANYTHING. And don't get me started on the silly crap about using electrons to store data!
-
-
Tuesday 1st May 2012 10:00 GMT Dave 126
"It does seem a bit of an affliction for scientists once they reach a certain age or level of influence to start making "that's it, we've done it all, it's all downhill from now on" type pronouncements"
-gronkle
Or, put another way:
When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.
-Arthur C. Clarke, Clarke's first law
-
Tuesday 1st May 2012 13:10 GMT Anonymous Coward
Corollary
You've forgotten Isaac Asimov’s Corollary to Clarke's first law:
"When, however, the lay public rallies round an idea that is denounced by distinguished but elderly scientists and supports that idea with great fervor and emotion — the distinguished but elderly scientists are then, after all, probably right."
and Egbert's Corollary to Asimov's Corollary:
"Whenever the lay public repeat and ridicule the original statement they will conveniently forget that it was made in a specific context with specific conditions, and instead will generalise it to a grotesque caricature of what was originally said."
Thomas J. Watson: "I think there is a world market for maybe five computers". If he said it at all (dubious) it would have held true for about 10 years and, in any case, was probably (and correctly) being said about the computers being created *at that time*.
Michio Kaku: "Computing power cannot maintain its rapid exponential rise using *standard silicon technology*."
-
-
-
Tuesday 1st May 2012 00:58 GMT Rolland Trolland
And not a moment too soon!
The horrific bloated slop that passes for code these days is an embarrassment to anyone of pre-GUI age.
Maybe when/if a processor cap appears, Moore's Law could be continued (in a fashion) by people dumping some of these lazy libraries and putting a bit more thought into their code, so that the processor is actually doing something useful and not merely navigating its way around excessive layers of pointless abstraction!
-
Tuesday 1st May 2012 09:19 GMT Some Beggar
Re: And not a moment too soon!
Absolute twaddle. There was shit code in the 1970s and there's good code today. Engineering is all about pragmatism and engineering effort is expensive. There's no excuse for sloppy code, but there's equally no excuse for wasting effort optimising something to the Nth degree when you don't need to. The increased power of processors and the increased capability of high-level languages is unequivocally a Good Thing. You can hand craft an entire system in assembler on punch cards if you want. The rest of us will take advantage of whatever whizzbangery we have available.
-
Tuesday 1st May 2012 09:45 GMT Michael H.F. Wilkinson
Re: And not a moment too soon!
As Niklaus Wirth says: Software is getting slower faster than hardware is getting faster.
Word 2.0 was a very capable word-processor, and ran happily on my 25 MHz 80386 machine with 4 MB of RAM (I really splashed out on that machine :-) ). Word 2010 would require rather more. More in fact than the processing power and memory of the good old Cray Y-MP. That is worrying.
GUIs of course do need more resources, but the above example suggests you can run a full-blown GUI-based word processor in 1,000 times less memory than we seem to need today. If you look at the memory footprint of something like FLTK, which is so small you can afford to link it statically for easier distribution, and compare that to some other tools, you cannot help but question the efficiency of some code.
Much of the best coding is going on in the embedded-systems world. You really have to conserve resources in that arena.
-
-
Tuesday 1st May 2012 11:38 GMT Michael H.F. Wilkinson
@ Some Beggar
What Wirth means is that for a given task, the current software needs vastly more resources (CPU and memory alike) than similar software years ago.
Why is this worrying? Because it suggests that we could get by on much leaner compute capacity for many mundane tasks. It means machines that still work fine have to be replaced when the software is updated, and the minimum specs are upped again. This is ultimately wasteful. It also means that bigger server parks are needed for a given workload. If you can make code more efficient, less hardware is needed, and less energy is wasted. Mobile computing (like embedded) can give an impetus to leaner programs, simply because cutting clock cycles can cut battery usage.
-
Tuesday 1st May 2012 12:46 GMT Some Beggar
Re: @ Some Beggar
What Wirth actually said was this:
"Do increased performance and functionality keep pace with the increased demand for resources? Mostly the answer is no. "
Which is simply a statement that we become more profligate when we have more resources. There are variants of this truism dating all the way back to the Hebrew Bible. It's why Americans build gigantic cars with inefficient engines - they have big roads and cheap petroleum. The only time this becomes a problem is when the resources start to be throttled. This isn't going to happen any time soon with mips and memory. Wirth wrote this nearly 20 years ago and the computing industry hasn't collapsed as far as I've noticed.
The paraphrased version that you quoted is simply nonsense.
-
-
Wednesday 2nd May 2012 08:23 GMT Some Beggar
Re: re: Michael H F Wilkinson
And these smartphones have processing capability (in terms of what the user can actually do - not simply in terms of what the silicon can theoretically do) that is well beyond what a desktop PC would have had back when Wirth made his oft-misquoted statement.
So you can now do things on a battery-powered pocket device that were impossible on a mains powered box that lived under your desk twenty years ago. How exactly does this demonstrate that software has bloated faster than hardware has speeded up?
-
-
-
-
-
Tuesday 1st May 2012 02:36 GMT JeffyPooh
Clarke's Law trumps Kaku's prediction, thus extending Moore's Law
Arthur C. Clarke's laws of prediction:
1) When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.
2) The only way of discovering the limits of the possible is to venture a little way past them into the impossible.
3) Any sufficiently advanced technology is indistinguishable from magic.
Michio Kaku is certainly distinguished. He's also recently turned 65, and that makes him elderly. See Rule 1. Moore's Law will shift gears, but it will continue further than expected. Just like last time. And the time before that.
-
Tuesday 1st May 2012 15:46 GMT Pierre Castille
Re: Clarke's Law trumps Kaku's prediction, thus extending Moore's Law
"Michio Kaku is certainly distinguished. He's also recently turned 65 and that makes him elderly."
Steady on with the elderly bit! I'm 66 and I consider myself only a teeny bit old (though my children and grandchildren might hold a different opinion!)
-
-
Tuesday 1st May 2012 06:05 GMT Anonymous Coward
The path beyond the next few Moore's Law process generations has always been quite murky.
However, just because previous predictions of its demise have been wrong doesn't mean current predictions are also wrong.
If I understand correctly, a *lot* of the innovation in the past decades has been about pushing the need for big changes (like EUV) forward into the next process, then the next, and so on. How long can that go on? Even if the technologies have passed far beyond my understanding into the realm of apparent "magic", Moore's Law is not a magic law operating outside the physical world - exponential growth will always slow down at some point.
-
-
Tuesday 1st May 2012 09:18 GMT Charlie Clark
Re: They are already working on Silicon replacements
Yes, the materials and production processes will probably change. The economics of chip development have changed significantly in the last ten years due to the cost of making the machines that make the chips. There are fewer suppliers of lithography machines than there were ten years ago, which is driving up the cost of each new generation. At the same time, even Intel's margins are starting to come under pressure as its designer hardware struggles to differentiate itself from the commodity ARM clones.
The number of companies entering nano-technology and additive manufacturing to address some of the same problems is increasing. If they can get it right, printed OLED screens may be the first children of this revolution.
Worth noting that IBM and Samsung may well be right in not having an asset-light strategy in this area.
-
-
Tuesday 1st May 2012 07:08 GMT imanidiot
Am I the only one?
I don't have that much respect for Michio Kaku. Yes, he's got a good brain, and yes, he's good at dreaming up fantastical stuff. But he has a penchant for seeking media attention and spouting crap about areas of expertise he's not nearly qualified to talk about (like around the time of the whole Fukushima debacle).
Maybe I've missed it, but I haven't seen anything useful done by mister Kaku. He keeps "thinking up" wonderful new crap that's either highly improbable or just plain wrong and impossible from an engineering standpoint (using the materials and methods he suggests, at least; and often those things aren't even all that original).
I wouldn't be bothered if he just kept making dull TV shows, but he keeps going to the media with these kinds of statements, that, to me, just seem like he's seeking media attention again.
-
Wednesday 2nd May 2012 22:54 GMT Anonymous Coward
Re: Am I the only one?
I couldn't agree more. If you've ever seen Koo-Koo Kaku speak in person, you'd have been witness to a spotlight-seeker who'll say nearly anything to grab the attention of folks with weaker scientific chops than he has. Sure, he's educated and articulate, but his pronouncements are increasingly tales told by an idiot, full of sound and fury, signifying nothing.
Also, remember that he's a physicist and/or astrophysicist, not a materials science guy. If you want to know the future of microprocessor technology, talk to microprocessor technologists. I think the Reg has published some stories about this matter in which they talk to people who actually know what they're talking about.
Hang it up, Michio – your 15 minutes of fame are over.
-
-
Tuesday 1st May 2012 07:37 GMT geocrunch
Doubling CPU cores is also doubling transistors
In the end the article says that manufacturers will look for other ways of expanding CPU power, such as increasing the number of cores in a CPU, etc. CPU cores have been doubling every 2 years over the last 5 years or so - how is that not doubling the number of transistors? This physicist obviously isn't an IT professional, who would otherwise know that the next step of scalability is not vertical, but horizontal.
-
Tuesday 1st May 2012 14:15 GMT Peter Gathercole
Re: Doubling CPU cores is also doubling transistors
One of the problems that chip designers have is how to use the vast number of transistors that can be fitted onto the large die-sizes at the smallest scale.
They got to the point where more registers, more cache and more instructions units in a single core was not making for faster processors, so they then started using the still increasing transistor budget to put multiple cores on a single die.
There is a lot to be said for a large number of cores on a single die, but this has its own problems with access to memory, cache coherency between cores, and I/O.
Another avenue is putting disparate processors (like GPUs) on the same die, or even System on a Chip (SoC), where all of the functional elements (I/O, Graphics, memory etc) of a complete system appear on a single piece of silicon (think what is going into 'phones and tablets).
In my view, to make use of the vast scale of integration, it's about time we had a fundamental rethink about how processors work. I don't have any new ideas, but I think that listening to some of the people with outlandish ideas might be worthwhile in coming up with a completely new direction to investigate.
-
-
Tuesday 1st May 2012 07:42 GMT Iad Uroboros's Nemesis
Graphene will be key to take Moore's Law strain
As Gronkle says, Graphene is already spinning up & ready to take over from where Silicon's physical limits are reached. With the atoms being so much smaller than Si and some, I believe, wondrous other characteristics at such a small scale, I think we have a way to go yet, maybe even until quantum chips do come online in quantity.
-
Tuesday 1st May 2012 07:44 GMT lulu8137
Am I the only person to know that in 2011 MIT lab researchers already produced non-silicon chips that can run at 100GHz or more? Sometimes I feel I am. We already have this technology and 99.99% of the world has no clue it exists.
It's just that the manufacturers don't want to put in the R&D and production costs when they're making enough $ with silicon.
-
Tuesday 1st May 2012 08:13 GMT Pete 2
Sadly, Wirth's Law will keep on going
"Software gets slower faster than hardware gets faster"
Luckily for us, the annual increments in processing power have masked the continual degradation in software performance. If Moore's law does hit the buffers, someone somewhere is going to have a 50 year project on their hands to fix all the inefficiencies, crocks, bad design and unoptimised code.
Maybe once there is an actual incentive to write fast, efficient code instead of the traditional "do whatever it takes, just deliver it by Monday" approach to software development, we'll have a renaissance in programming. If we're lucky, we may even find that once people start writing well-designed, reliable and efficient software that programmers start to win back some professional esteem and respect, too.
-
Tuesday 1st May 2012 08:41 GMT JDX
Re: Sadly, Wirth's Law will keep on going
Software is ALWAYS about doing what it takes to run on the target hardware. It's all well and good being nostalgic, but people used to hand-optimise ASM because they had to, to make it run fast enough, not due to some sense of craftsmanship.
Those old-school skills are fun to learn and good to know, but you learned them because it was necessary to get the job done.
-
Tuesday 1st May 2012 09:44 GMT Charlie Clark
Re: Sadly, Wirth's Law will keep on going
It seems to have escaped your notice how much work has been done on compilers in the last few years which do an increasingly good job of removing many of the inefficiencies. That and the ability to shift tasks to hardware implementations (encryption, signal processing, video compression and decompression).
-
Sunday 6th May 2012 17:33 GMT JDX
how much work has been done on compilers in the last few years
People have also been saying that for years. It's undoubtedly true but a good optimiser can still beat the hell out of the compiler through knowing the context and stuff like SSE. The claim "modern compilers can't be beaten by hand coding" is simply bogus - however the truth is it doesn't normally need optimising.
-
-
Tuesday 1st May 2012 10:59 GMT Anonymous Coward 101
Re: Sadly, Wirth's Law will keep on going
"Maybe once there is an actual incentive to write fast, efficient code instead of the traditional "do whatever it takes, just deliver it by Monday" approach to software development, we'll have a renaissance in programming. If we're lucky, we may even find that once people start writing well-designed, reliable and efficient software that programmers start to win back some professional esteem and respect, too."
Whining cry baby.
-
-
Tuesday 1st May 2012 09:24 GMT Tom 7
We haven't even started yet....
When I first started making microchips 32 years ago, there were lots and lots of techniques being looked at that haven't been revisited since, and that could easily keep Moore's law going for another decade or two.
I have often wondered whether they were really dead ends, just swept aside by the usual monotonic rush, or perhaps set aside because they were no longer patentable and so wouldn't give more than a moment's advantage to anyone who developed them to fruition.
But as has been alluded to earlier in this thread, Moore's law has an 18-month doubling, while Wirth's law has software bloat doubling every 17 months.
Me - I'd be quite happy with a 4-core ARM machine with a socket for adding more when I need to.
It won't run Microsoft Office - so what? My car won't pull a plough.
-
Tuesday 1st May 2012 11:16 GMT Daniel Garcia 2
Naaa, there are still new technologies coming up
Stacks of graphene and silicene techno-wizardry, combined with nanotube conduits and on-chip dynamic (fast-moving fluid) heat transfer, are my bet for the future.
We are still in our infancy in terms of internal heat transfer solutions, which at the end of the day are the main physical limitation.
-
Tuesday 1st May 2012 13:42 GMT Colin Millar
Duped again
Why is it that supposedly hard-nosed tech types keep cleaving to marketing mantras like "Moore's Law"?
It's not a law cos it doesn't have a proof. It's not even a theorem, because it isn't based on any prior established proofs. Moore himself called it a "trend" and its likely continuance for another decade a "prediction", but these days it is really just a marketing soundbite designed to promote the buying cycle.
As a statement it is tremendously imprecise: its terms relate to two variables in a multi-variable environment, and it seems to assume "all other things being equal" without any real reason for doing so, particularly as the original statement was made over 40 years ago and carried a caveat from the guy who actually said it.
I am constantly amazed at the ability of supposedly intelligent people to completely ignore actual evidence in favour of theologising their prejudices.
-
Tuesday 1st May 2012 14:14 GMT kit
Optical networks, embedded memory, graphene, nanotubes and TSV will replace silicon downsizing.
Silicon tech may run out of steam in perhaps 10 years' time. Other techs - inter-core optical networking, massive embedded fast memory (resistive or phase-change memory), the application of new materials (graphene and nanotubes), and 3-D chips (TSV) - will replace silicon downsizing.
-
Tuesday 1st May 2012 21:34 GMT YARR
No mention of Nanotechnology?
I'm concerned that, for an esteemed futurologist, he made no mention of nanotechnology.
I had thought that the technology for producing ICs would transition into molecular manufacturing, bringing about a new era of mechanical nanocomputers.
When built in 3D, we'd fit the most powerful supercomputers into the size of a speck of dust, without the frequency, overheating and current leakage issues that affect electronics at this scale. Or did Eric Drexler get it wrong?
-
Monday 7th May 2012 08:12 GMT attoman
Re: No mention of Nanotechnology?
Nanotechnology is technology in the range of 100 nanometers to 0.1 nanometer. Semiconductors were there in the last millennium; where were you?
Drexler predicted there would be robot assemblers building things an atom at a time from any material; carbon atoms assembled into diamond, for instance. He ignored DNA, life, and the energy limits of the reactions that drive the ancient nano-assemblers in all living things. After billions of years, if it were possible to form a diamond carbon bond without somehow spending 1000 degrees C worth of energy, does anyone really doubt that life would have found it and used it instead of calcium bones or cartilage?
Drexler also had no place for electromagnetics in his actual implementation predictions.
Drexler got it wrong. Drexler got us talking and thinking about Feynman's point, and he deserves a lot of credit for publicizing the field. No core nanotech inventions that I'm aware of, though, have his name as principal inventor.
-
-
Monday 7th May 2012 07:54 GMT attoman
Moore's Wannabe Law of Device Physics
Gordon expressed a hope and a goal, not a law of nature or of man. Ever since, inventors have used their imagination to sustain that hope and goal, and with it the economy of much of the world.
Spoiled brats like the readers of this rotter's blog haven't a clue where the miracles that make their culture and life come from, or in many cases how core and critical they are to their well-being, until they are denied them. Then, like the obscene reaction of lawmakers and courts to the loss of their crackberry fix a few years ago, the addicts will do anything, break any code, and blame any helpless group to get their drug back.
When the miracle of ongoing Moore's Law Invention ends, so does the love affair with ever more powerful, less expensive personal computing including iPads, iPhones and all their contemporaries, and successors.
So also does the preeminence of Silicon Valley, and the economy of the USA.