MrBillious
I think you've misunderstood the nature of windfarm costs...
First, costs are rising, not falling - historically, they've risen ahead of inflation. There's a glut on the market at the moment, but that mainly reflects cancellations.
Second, if anything, capacity factors etc. are likely to fall - inevitably, the more favourable sites get developed first, and the more expensive/less favourable sites later. That's certainly been the experience to date.
Third, the empirical evidence is that developers are demanding higher, not lower, subsidies over time in order to get projects launched. The original (onshore) schemes could make money at 1 RO/MWh. Then, in 2009 or 2010, that had to be upped to 1.5 RO, and subsequently 2 RO/MWh, for HMG to be able to claim it was maintaining progress. That's due to drop back to 1.5 sometime in the next 12 months - my bet is it'll end up being held at 2 RO.
However, there's a more fundamental reason why you're not about to see a step change - all the things you mention increase capital cost while (potentially) improving performance, and there are good reasons to doubt that improving performance can deliver a step change in the economics.
Have you heard of "Betz's law"?
http://en.wikipedia.org/wiki/Betz%27s_law
It follows from conservation of mass and momentum through the blade disc - and it places a hard limit on the maximum proportion of the energy incident on the disc that can be recovered: 16/27, a bit under 60%. Further, there are limits on how closely that can be approached in practice, and those vary with the wind/blade speed.
Current-generation turbines already do fairly well - they'll extract 40%-plus of the available energy. But note what that means: at best, they could do about half as well again.
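To put rough numbers on that headroom (my arithmetic, using the 40% figure above as an assumed starting point, not any specific turbine's datasheet):

```python
# Betz limit: the maximum fraction of the wind's kinetic energy that any
# disc-type turbine can extract, from actuator-disc momentum theory.
betz_limit = 16 / 27           # ~0.593, i.e. a bit under 60%
current_efficiency = 0.40      # assumed figure for a modern turbine (per the text)

# Even a hypothetical turbine hitting the Betz limit exactly would only
# improve on a 40%-efficient machine by this factor:
headroom = betz_limit / current_efficiency

print(f"Betz limit: {betz_limit:.1%}")                    # roughly 59.3%
print(f"Best-case improvement factor: {headroom:.2f}x")   # under 1.5x
```

So blade/control refinements can only ever claw back a fraction of what's already being captured - there's no 2x or 3x sitting on the table.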
Then there's the matter of how much energy's actually available. It varies with the CUBE of wind speed. So if I design a turbine to make 1 MW (its maximum output) in a 50 kph wind, two things follow. First, in anything above 50 kph I'll have to spill energy (because that's the maximum my plant is built for), or worse, trip to avoid overspeeding the blades. Second, if the wind's 20 kph, I'm not going to get 400 kW - the linear share of design maximum - I'm going to get 64 kW.
In reality, I'll get less, because of that efficiency relationship.
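The cube relationship is easy to sketch. This is an idealised model of the 1 MW / 50 kph example above (the function name and the hard clip at rated output are my illustration - real turbines also lose part-load efficiency, as noted):

```python
def power_output_kw(wind_kph, rated_kw=1000.0, rated_wind_kph=50.0):
    """Idealised output of a turbine rated at rated_kw in a rated_wind_kph wind.

    Energy in the wind scales with the cube of wind speed; above rated
    speed the machine caps out (or trips), so output is clipped.
    """
    if wind_kph >= rated_wind_kph:
        return rated_kw  # excess energy is spilled, or the turbine trips
    return rated_kw * (wind_kph / rated_wind_kph) ** 3

print(power_output_kw(50))   # 1 MW at rated wind
print(power_output_kw(20))   # roughly 64 kW, not the 400 kW linear scaling suggests
print(power_output_kw(70))   # still capped at 1 MW - the extra energy is wasted
```

That (20/50)^3 = 0.064 factor is the whole story: a wind at 40% of rated speed delivers only 6.4% of rated power, before any efficiency losses.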
Also, as to costs: if you look at where the cost is in a turbine build, it's dominated by site-erection costs and by a few major components - the blades, hub and pylon. The generator and the gearbox probably account for no more than 10-15% of the cost of an erected turbine.
Why's that important? None of them is likely to be greatly affected by economies of scale in production. Blades are basically aircraft wings (especially so if you make them more complex, as you suggest). Despite a century of aircraft manufacturing, we've never managed to apply mass-production/assembly-line techniques to wing construction, especially for larger wings. Go to any Airbus or Boeing factory and you'll see it's dominated by hand-assembly processes - right down to the riveting of wing skins onto the underlying structure. Most turbine blades are made from fibreglass - hand-laid, and subject to multi-day curing times. An even worse prospect... Similarly, we've never automated propeller or rotor hub construction, for reasons of complexity.
The same applies to the pylon, and pretty much by definition to the erection activities.
You've completely lost me as to why foundation costs should come down - the experience to date is very much the opposite: rigidity, etc. will have to increase from earlier designs, many of which are suffering from grouting failures and the like. And the design of foundations will always be dominated by the local geology/soil mechanics. No economies of scale there.