So,
"My computer's got a virus" will be a good thing.
We may have a long way to go before Moore's Law, which calls for the doubling of transistor counts — and therefore computing capacity — every two years or so, runs completely out of gas on current electron beam and optical lithography techniques. The question is, will the IT industry, which is predicated on the idea that …
Nice article. IBM researchers may indeed be close to a giant leap in miniaturization and cost efficiency. This could theoretically lead to extremely powerful, cheap and energy efficient consumer electronics. This is exciting because future computer applications will rely heavily on brain-like neural networks that are based on huge numbers of parallel processing entities. Future consumers will demand portable intelligent assistants that would fill a warehouse if based on today's technology.
However, there is a monster that threatens to destroy these exciting developments. It's called memory bandwidth. It is a monster that gets meaner and nastier every time you add another core to a processor. The reason is that all the cores must use a single data bus and a single address bus to access a single piece of data at a time and this creates a paralyzing bottleneck. Current memory systems would be hard pressed to keep up with a processor with a hundred cores, let alone the thousands and millions of cores that are being predicted by the pundits.
It goes without saying that the industry must eliminate the bottleneck. The core/bus/memory approach to computing is fast approaching the end of its usefulness. We must come up with a completely different type of computer, one that solves the bandwidth problem by embedding huge numbers of elementary processors directly into the memory substrate. Decentralization will be the only game in town. Biological brains get around the bandwidth problem by growing as many nerve fibers as necessary. However, this is a very slow process, one that would be inadequate for the fast random-access requirements of modern software. See the link below if you're interested in this subject.
Parallel Computing: The Fourth Crisis
http://rebelscience.blogspot.com/2009/01/parallel-computing-fourth-crisis-part-i.html
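The shared-bus contention described above can be put in back-of-envelope numbers. Both figures below are illustrative assumptions, not measurements from any real system:

```python
# Back-of-envelope sketch of the shared-bus bottleneck:
# every core must take its turn on one memory bus.
BUS_GBPS = 25.6     # assumed total bandwidth of the shared memory bus (GB/s)
DEMAND_GBPS = 4.0   # assumed bandwidth one busy core would like (GB/s)

for cores in (1, 4, 16, 100, 1000):
    share = BUS_GBPS / cores             # fair share of the bus per core
    met = min(1.0, share / DEMAND_GBPS)  # fraction of demand actually met
    print(f"{cores:>5} cores: {share:8.3f} GB/s each, {met:6.1%} of demand met")
```

On these assumed numbers, at 100 cores each core gets only a few percent of the bandwidth it wants, which is the "paralyzing bottleneck" in miniature.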
Don't get me wrong I think that there are some positive uses and that you're on the path to an 'organic' computer. This and the article on mini lasers are showing some nice new technology.
However, is it just me, or should we start to be concerned about some of the potential negative uses for this technology? (Hence the black helicopter)
"Real Genius" anyone? :-)
-G
For processes such as this, atom numbers aren't really used for two reasons:
1) Atoms vary in diameter depending on the element, ionization and even, to a degree, electron spin. So it's possible not only for two atoms of different elements to be of different sizes, but also for two atoms of the same element to be of different sizes. Going further, two atoms of the same element and the same charge can also differ in size.
2) " x atoms thick" or "x atoms long" is only really used in PR, as most people seem to take it as some kind of perspective, despite having no idea how big an atom really is. Go figure.
That said, this is between 40 and 220 atoms long and wide (depending on the geometric face).
Source: http://hypertextbook.com/facts/MichaelPhillip.shtml
Yadda yadda yadda, welcome to the DNA-based supercomputer in the next Xbox. Maybe they can even make it quiet....
Which, following Moore's Law (a law derived from empirical observation rather than from any underlying physical model), would mean a further 6 generations after that.
So ho hum back to the future.
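The "further 6 generations" figure above checks out under one assumption: that each generation halves the linear feature size (a cruder reading than Moore's actual transistor-count doubling). A quick sanity check:

```python
import math

ATOMS_ACROSS = 90   # the feature width quoted above, in atoms
# Assumption: one "generation" halves the linear feature size.
halvings = math.log2(ATOMS_ACROSS)
print(f"about {halvings:.1f} halvings to a one-atom feature")
```

That comes out to roughly six and a half halvings, matching the six-generation estimate.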
The bottleneck problems have been known since the late 60s at least.
You might like to look up the Inmos Transputer as an architecture to try to handle this.
Mine will be the one with K. E. Drexler's Nanosystems in the (very large) side pocket.
So we're going to implant viruses into every CPU? I for one welcome our new superintelligent viral network masters.
@Louis Savain
"Future consumers will demand portable intelligent assistants that would fill a warehouse if based on today's technology."
Will I? Will we also demand matter transporters and single day returns to Mars?
@Suburban Inmate
Atom: 30-300 pm (1 pm = 1/1,000,000,000,000 m)
1 nm = 1/1,000,000,000 m
So between 73 and 733, unless my fag-packet calculator has gone wonky.
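For what it's worth, the 73-733 range above is consistent with a 22 nm feature; that figure is an assumption worked backwards from the arithmetic, since the article's exact dimension isn't quoted in the comment:

```python
# Sanity check on the 73-733 atoms-per-feature figure.
FEATURE_PM = 22_000                 # assumed feature size: 22 nm in picometres
ATOM_MIN_PM, ATOM_MAX_PM = 30, 300  # atomic diameters quoted in the comment

fewest = FEATURE_PM // ATOM_MAX_PM  # biggest atoms -> fewest per feature
most = FEATURE_PM // ATOM_MIN_PM    # smallest atoms -> most per feature
print(f"between {fewest} and {most} atoms across")  # between 73 and 733
```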
@ Ian Michael Gumby
More importantly, why are you talking about the human race in the third person?
Unproven technology won't arrive faster simply because it involves "bio" and "DNA". There is a bit of a tough transition between the aggregate (classical) and the molecular (quantum) world. Biological systems bridge both worlds, but you should expect some decades before humankind will understand and harness that power. And even then it might not help with computing. (<-- That's a full stop, as in "Full Stop"!)
Dream on ...
Synthesising semiconducting carbon nanotubes is rather expensive at the moment (non semiconducting ones are cheaper... but somewhat more boring). However, it's already been demonstrated from a top-down perspective that you can arrange carbon nanotubes to form logic gates (you only need to make a NAND or NOR to be able to make everything else...) which is only a few steps away from a rudimentary processor. Once a processor can be developed from the top-down perspective using nanotubes, using DNA scaffolds to self-assemble will follow.
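The "you only need NAND" point is easy to demonstrate in software. Here is a minimal sketch of the standard constructions, with Python functions standing in for wired nanotube gates:

```python
# NAND is functionally complete: every other gate can be built from it alone.
def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)            # NOT from one NAND
def and_(a, b): return not_(nand(a, b))      # AND = NOT(NAND)
def or_(a, b):  return nand(not_(a), not_(b))  # OR by De Morgan

def xor_(a, b):                              # XOR: the four-NAND construction
    t = nand(a, b)
    return nand(nand(a, t), nand(b, t))

# Print the full truth table for a visual check.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", and_(a, b), or_(a, b), xor_(a, b))
```

The same trick works with NOR, which is why getting either gate out of a nanotube is the key milestone.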
If you look at the development of any new technology, automobiles for instance, in the beginning it is all about more powerful/faster then an economic/cultural limit is reached and more useful/cheaper takes over as the prime mover. It is currently possible to make a 700mph car but they wouldn't sell well, consumers prefer better fuel economy and service intervals.
I think we are already seeing the end of the beginning period in PC development. Raw power is not the only demand. Of course there will always be some use for a faster PC, but most people have enough power as it is. Cheaper, easier to maintain, quieter, more usable is what sells PCs now. Look at the market for netbooks (small, cheap, useful) and the difficulty Microsoft had in shifting Vista (power hungry, little or no added usefulness). We are being told that Win7 is smaller and faster on the same hardware. Moore's Law may well be true, but it will become less and less relevant as time goes on.
Re Thanks! (Suburban Inmate)
I'm not a marketeer, but their usual response seems to be "ours is smaller or bigger or faster".
So on reducing further from 90 atoms (by 90 atoms), we're inevitably heading into quantum-effects land (which makes la-la land look quite sensible), where what few electrons can be put onto a minute isolated feature will tunnel away somewhere else (or even somewhen else).
"All the computing power in the world doesn't mean much if you can't afford it."
Afford?! In terms of what? Cash, or human labour? If there *has* to be a profit behind it, of course it will be slow to appear, but that's the same with every technology just now. The system holds us back. It's not human-focused but profit-focused. And yeah, that brings about some nice new "technologies", but imagine if we just went ahead and chose, as a species, to develop these things for us :)
How much nicer this world would be.
Mine's the one with the Reality sticker on it
I'm connected.
You don't have much use for all that in the run-of-the-mill work most folks do today, much less for really using the memory and multiple processors. Remember, folks: most desktop computers are still built around how many interrupts, how many registers, how much memory. If you don't have better access to your data, none of this will matter.
just because....
"I think we are already seeing the end of the beginning period in PC development. Raw power is not the only demand. Of course there will always be some use for a faster PC, but most people have enough power as it is."
This is one of those "I forecast the world will only need about 7 (mainframe) computers" statements.
The limit (as the article says) is due to heat problems and the ability to dump so much heat at a given level of technology.
DARPA are pursuing the petaflop computer cabinet (1x10^15 FLOPS, if my maths still works), which they reckon is the level needed for processing power equal to a human brain.
At 1 GFLOPS per processor core, that's a million-processor box, which I doubt is what's sitting on your desk.
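The petaflop-to-cores arithmetic above, spelled out (the 1 GFLOPS-per-core figure is the comment's assumption, not a spec):

```python
PETAFLOPS = 1e15    # target machine: 1 petaflop/s
CORE_FLOPS = 1e9    # assumed per-core rate: 1 GFLOPS
cores_needed = PETAFLOPS / CORE_FLOPS
print(f"{cores_needed:,.0f} cores")  # 1,000,000 cores
```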
Raw power and bandwidth have enabled not just new versions of apps but whole new classes of apps. CFD/CAD on the desktop. Personal high-resolution image adjustment (Photoshop's commercial predecessors ran on custom bit-slice hardware to get the speed and bandwidth needed at the right price). Route-finding apps now plan journeys that a 1984-era PC would still be crunching a week after you started it (if it could run them at all).
Of course it will probably be handicapped by running Windows.
@ Keith Oldham
"So on reducing further from 90 atoms (by 90 atoms), we're inevitably heading into quantum-effects land (which makes la-la land look quite sensible), where what few electrons can be put onto a minute isolated feature will tunnel away somewhere else (or even somewhen else)."
A bit behind the curve on this one. Quantum effects have been a design factor since the early 80s, when 1 micrometre was the Everest of IC production and lots of people were saying the next generation would have to be the last before the shift to full-on X-ray systems using plasma sources and storage rings.
Outside the digital world, people make single-electron transistors (sometimes called ballistic electron transistors), and given that the actual charge on all flash RAM and DRAM cells these days is so small (IIRC about 1x10^-15 C; 1 electron is 1.6x10^-19 C), you might as well talk in terms of the number of electrons anyway. Applying Moore's Law here says we are about 12-13 generations from a 1-electron storage element.
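Taking the quoted figures at face value (10^-15 C per cell was an "IIRC", and charge halving once per generation is a further assumption), the count works out like this:

```python
import math

CELL_CHARGE_C = 1e-15      # assumed charge on a flash/DRAM cell (coulombs)
ELECTRON_C = 1.602e-19     # elementary charge (coulombs)

electrons = CELL_CHARGE_C / ELECTRON_C  # electrons per cell, roughly 6000
halvings = math.log2(electrons)         # generations if charge halves each time
print(f"{electrons:.0f} electrons per cell, {halvings:.1f} halvings to one electron")
```

On these numbers it comes out to roughly 13 halvings, though a different starting charge moves the answer by a generation or two either way.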