The great ultraviolet hope that will enable Intel to break the 10nm chip geometry barrier is further away than ever, the vendor’s CTO has admitted, raising the spectre of Moore’s Law* running out of steam. The breakthrough revolves around the distance between the circuit-lines chipmakers can etch onto the surface of a computer …
Wouldn't ultraviolet be under the rainbow?
"Wouldn't ultraviolet be under the rainbow?"
No, that would be infrared...
Re: "Wouldn't ultraviolet be under the rainbow?"
"No, that would be infrared"
No, they were right. A primary rainbow has red on the outside of the curve (longest wavelength, so least refracted) and violet (shortest wavelength, so most refracted) on the inside.
So UV would appear inside the violet band, and thus, as the original A.C. said, under the rainbow.
Moore's Law Theory
In the strictest sense Moore's Law is under threat - the number of transistors may not double every two years. It has hard limits in the size of not only the tracks but the transistors themselves - you can't go smaller than the atom (yet).
However perhaps more interesting is the correlation between the computing power in the same physical space and the cost of said computing power. How closely does that continue to match Moore's Law?
There are also other avenues of computational power on the horizon - quantum and optical processors might reduce the number of elements in a processor.
Re: Moore's Law Theory
"There's also other avenues of computational power on the horizon - quantum and optical processors might reduce the number of elements in a processor."
Indeed, and on a more mundane level World+Dog is stacking chips and layers these days - so the effective density is still going to go up for a while yet, even as the process steps take longer and longer.
I don't know. Is it so bad that Mr Moore's Law will one day go off track?
What Moore actually said was that the number of components which can be integrated *at the minimum cost per component* doubled every two years (or whatever), since this is what drives the economics of what can be integrated.
In this sense it's already dead, even at 20nm the projections are that the cost *per transistor* won't fall below that for 28nm in the foreseeable future, and neither will 14nm, because the cost per square mm is going up as fast as the gates/mm2.
The industry had better get used to the fact that from now on you can integrate more, but it will cost more -- the days of the free lunch are over :-(
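The arithmetic in that comment is easy to check with a toy calculation. A quick sketch, using made-up node figures purely for illustration (not real foundry data): if wafer cost per mm² climbs as fast as transistor density, cost per transistor stops falling, even as the process name shrinks.

```python
# Illustrative sketch with hypothetical numbers: when cost per mm^2
# rises as fast as transistors per mm^2, cost per transistor is flat.

# (node name, transistor density in millions per mm^2, wafer cost per mm^2
# in arbitrary units) -- invented figures for the sake of the example.
nodes = [
    ("28nm", 10.0, 1.00),
    ("20nm", 16.0, 1.60),
    ("14nm", 25.0, 2.60),
]

for name, density_mt, cost_mm2 in nodes:
    # cost per million transistors = (cost per mm^2) / (Mtransistors per mm^2)
    cost_per_mt = cost_mm2 / density_mt
    print(f"{name}: {cost_per_mt:.3f} units per million transistors")
```

With these numbers the 20nm node is no cheaper per transistor than 28nm, and 14nm is slightly dearer, which is exactly the "more integration, but at higher cost" regime described above.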
Bright enough; not trying very hard are they?
I have an LED torch which shines UV light, and apparently graphene looks possible as a lasing material for loads of wavelengths!
"EUV" or "X-Ray" lithography was a b**ard to do in the 80's
And guess what.
It still is.
High energy accelerators?
We are approaching relativistic/quantum-level physics, and the two are disconnected. For those who see the LHC as money wasted, it may well connect the very large with the very small, and obsolete all existing game machines.