Re: Moore's law
You went and fucked it all up there right at the end. Moore's law is demonstrably bollocks and promotes a worldview that is... unhelpful, as acknowledged by the great man himself.
Sorry, but you are completely wrong. From a physicist's or engineer's perspective, it's a scaling law that predicted (back in the 1980s) that there was absolutely nothing fundamental in the way of going from the earliest CMOS computers, with a few tens of thousands of transistors running at a few MHz, up to today's billion-transistor chips clocked at a few GHz. It also predicted where the law would inevitably fail, i.e. run out of predictive power, and that's where we are today: transistors have to be made out of discrete atoms, and the gate dielectric is now only a few atoms thick, about as thin as it can get while still behaving as an insulator.
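To put rough numbers on that fit (my round figures, not anything from upthread): going from about \(5\times10^{4}\) transistors in a mid-80s CMOS part to about \(10^{9}\) today is

\[
\log_{2}\!\left(\frac{10^{9}}{5\times10^{4}}\right) \approx 14.3 \ \text{doublings over roughly 30 years},
\]

i.e. one doubling every two years or so, which is exactly the cadence the law names.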
Back in the days of bipolar transistors, a bit was represented as a flow of current, and engineers faced what they called the "smoking hairy golfball" problem: you had to space the components far enough apart to keep them cool, which in turn restricted the clock speed because of the speed of light. Shrink it too much and you couldn't stop it catching fire (not helped by the fact that bipolar transistors suffer from thermal runaway).
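To see the speed-of-light constraint concretely (the usual back-of-envelope, on my assumption that a signal has to cross the whole machine within one clock period):

\[
cT = \frac{c}{f}: \qquad f = 100\ \text{MHz} \Rightarrow cT = 3\ \text{m}, \qquad f = 3\ \text{GHz} \Rightarrow cT \approx 10\ \text{cm}.
\]

So every centimetre you added for cooling came straight out of your clock budget.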
CMOS, on the other hand, dissipates energy only when a logic element changes state, and it scales so that the heat generated per unit area of active electronics stays constant as you shrink the transistors, shrink the operating voltage, and scale up the clock speed, all by the same factor.
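This is what's usually called Dennard scaling. A minimal sketch of the argument, assuming the standard switched-capacitance model for dynamic power (the model is my gloss, not something stated upthread): scale all linear dimensions, the supply voltage, and the clock by the same factor \(k > 1\):

\[
P = \alpha C V^{2} f, \qquad C \to \frac{C}{k},\quad V \to \frac{V}{k},\quad f \to kf
\;\Longrightarrow\; P \to \alpha\,\frac{C}{k}\left(\frac{V}{k}\right)^{2} kf = \frac{P}{k^{2}}.
\]

Since each gate's area also shrinks by \(k^{2}\), the power per unit area \(P/A\) is unchanged, which is exactly why shrinking CMOS never turned into another smoking hairy golfball.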
Agreed, Moore's law is now historical and no longer predictive (the entire context of this thread is history!). We've pretty much reached the physical limit on how small a transistor can be, and any future improvements in CPU performance will have to come from using the billion or so transistors on a chip more intelligently.