A good article
I disagree with a few points though.
Firstly, there is the Orlowski-style simile comparing compulsory coding over a one- to three-year period with a journalist trying it for a day. It doesn't really mean anything: you could delete the whole journalist-doing-HTML part from the article and lose nothing.
Secondly, there is the question of where the time for coding comes from. Well, it comes from not teaching kids how to use Word so much. There is already time on the curriculum for IT; the question is how we use it.
Thirdly, there is the Wizard of Oz nonsense. Abstraction is what computers do. The whole point of using computers, at every level, is to rely on the man behind the curtain. But each man behind a curtain is itself a piece of software or hardware, which talks to a man behind another curtain. Ultimately it's all governed by the quantum wave equation, but fortunately you don't need to be able to solve that in your head in order to add a div element to a webpage. That's progress, Andrew, not philistinism.
But I think the main reason Orlowski falls down here is a lack of imagination. The article is good in that it acknowledges the current state of things rather than appealing purely to historical precedent, but we're not teaching kids to enter the workforce of 2012; we're teaching them to enter the workforce of 2016 or 2020, where they will stay until 2060 or so.
Now, it is noticeable today that not many people program - it is a vocation, not an everyday skill like driving a car. But the benefits of using computers are always greater for those who understand them, and the ease of programming, and of understanding programming models, keeps increasing. It is perfectly conceivable - perhaps inevitable - that by the middle of these children's careers it will be quite normal for everyone to program at some level. Even now, a manager who can write an Excel macro is going to be more productive than one who has no idea.
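To make that last point concrete: the level of programming I mean is trivial. Here is a sketch in Python rather than a macro (the data and the "region"/"amount" columns are made up for illustration) of the sort of weekly tallying a manager would otherwise do by hand or not at all:

```python
import csv
import io

# Hypothetical weekly sales export; in reality this would be a
# spreadsheet or CSV file the manager receives each week.
DATA = """region,amount
North,1200
South,950
North,300
"""

def total_by_region(csv_text):
    """Sum the 'amount' column per region - the kind of one-off
    calculation an Excel macro would otherwise handle."""
    totals = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["region"]] = totals.get(row["region"], 0) + int(row["amount"])
    return totals

print(total_by_region(DATA))  # {'North': 1500, 'South': 950}
```

Ten lines of logic, no computer science degree required - which is exactly the kind of thing years 7-9 could teach.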
This may be speculative, but the idea that the industry will be unchanged in 30 years is obviously wrong. What we can agree on, surely, is that computers are pretty important, that using them beyond a certain level requires some programming knowledge, that basic programming knowledge is easy enough to teach, and that teaching kids to drive the UIs of programs which may not be around in 10 years is stupid. In which case, teaching programming in years 7-9 seems a no-brainer.
I also find it somewhat bizarre to find anyone arguing for the status quo in education today, but hey.