C# is one of the fastest-growing languages in the world. Practically everyone who writes VB is moving over to C#.
A-level computer science students will no longer be taught C, C# or PHP from next year following a decision to withdraw the languages by the largest exam board. Schools teaching the Assessment and Qualifications Alliance's (AQA) COMP1 syllabus have been asked to use one of its other approved languages - Java, Pascal/Delphi, …
What a marvellous idea: let's stop teaching kids things that are useful and instead have them learn Pascal and Delphi - what wonderful careers those will set them up for.
Instead let's just prep them for a degree in computer science at some uni; then when they come out they won't have a clue, like all the other grads with no useful knowledge.
They are at uni to learn computer science, not to get work experience in it. You get work experience at work, not at uni.
Fundamentally it is a version of the difference between a civil engineer and a builder. A builder can become a builder through work experience alone. A civil engineer needs to study lots of boring stuff like maths to become a civil engineer; he cannot become one on work experience.
Coming back to your comment: first of all, there is a reason for this. Take this as an example:
You cannot teach even the basics of algorithms and data structures in Java, C# or PHP. They simply deny you access to pointers and the low-level operations required to show students a data structures problem. Try doing a doubly linked list in Java, for example. That is good for the trade, but it is bad for teaching.
While it is possible in C or Perl, making it readable and presentable is a nightmare.
While I am not a great fan of Pascal, credit where credit is due - it is a language designed for teaching. It is one of the very few languages which can be used to learn things like linked list manipulation, pointers, etc., and it is _READABLE_. You simply cannot do that in C#, Java or PHP.
It is the university's job to provide higher education. It produces the CS equivalent of engineers, not builders. A person who knows data structures and the fundamentals of CS can start coding in _ANY_ language in a couple of days, including Java, C# or PHP.
If, however, the universities follow your great advice, they will produce "builders" who can practise only their trade and cannot do any of the real stuff. No better way to ensure that this country never ever has a Google, an Infovista, or even a measly Yahoo to be proud of.
I can only say - applause, long overdue.
El Reg readers would be the first to complain if schoolchildren were taught not 'word processing' or 'spreadsheets' but Word and Excel.
How is teaching the use of a specific language any different?
Education is there to teach the concepts and principles behind computer science, so that they can be applied anywhere, to any language - not how to write simple Java programs.
Learning how to use a particular programming language is something that is learned after leaving school/university at work. What is commercially important changes rapidly and the educational system neither needs nor can afford to continually follow what is in fashion.
In an educational setting the aim should be to first teach principles. In the context of creating software that means things like how to manipulate data, what algorithm to apply to a problem, or how to translate a system into objects. An established and stable programming language that can be used to learn these things - and I think that Pascal and related languages are very good - is what is needed.
Any competent person can program a solution in several programming languages. Teaching the basics properly should produce someone able to do that.
Modern coding may well be done in C# or Python, but using languages which are so high-level as to be almost pseudo-code teaches nothing about how to write efficient or elegant code.
I did VB in my CompSci A-Level. Can you guess how much I learned about writing efficient sorting algorithms?
I applaud moving away from object-oriented languages, at least until a firm grasp of good basic coding practices is established. Up until the end of college, maybe...
This does appear to be a somewhat remarkable decision. I could understand if the languages were defunct, but C is still widely used, as are C++, C# et al.
When I did O level computer studies (y-e-a-r-s ago) we had to do a project in BASIC. We could use one of the school's PETs or our home computers (Speccy, C64, VIC20, BBC, etc). My mate was a dab hand at Z80 and wrote a good 80% of his project in machine code, only to be told to re-write it as the examiners wouldn't be able to disassemble it. Then, at college, we were told to submit projects in 6502 machine code as that was the way forward.....
Seems that 25 years on some people haven't learned a darn thing....
Sure, why teach 'em something that might prove useful?
Just about any language can be used to teach programming fundamentals, but leaving out C (but keeping Delphi?) seems about as practical as teaching general driving skills with the use of a tractor.
Should we be looking forward to a generation of VB-wielding engineers?
If Pascal/Delphi are better Computer Science learning tools then it may, indeed, be sensible to teach these instead of C#. The amount of time actually spent learning a language in school is small compared to that which an employee will spend learning a scripting or coding language.
I'm neither a software engineer nor a teacher, but I don't believe a school's primary duty is to turn out only productive worker bees.
Yes, you're right.
However, I'm not sure how VB gives you access to the low-level facilities required for teaching things like pointers and memory management, which are important aspects of any computer science course. Actually, C is generally used to teach these. Strange one, that.
What on earth do you think is efficient about writing your own sorting algorithms!? Other people have written plenty of well-honed, well-considered sorting algorithms for you; all you need to know is which one to use if the standard one proves too slow...
Sorting an object by typing whatever.sort(), as opposed to spending a whole afternoon debugging and testing your homebrew, barely remembered, un-reusable quicksort - now that's efficiency right there! Supporting features like duck typing so you don't have to rewrite your quicksort function for every type you might ever want to use - now that's efficiency.
Personally I'd advocate a Beauty and the Beast strategy - I think Python and C (proper C, not C#, not C++) should be the only languages in use at A level. Early exposure to (the beauty of) Python should ensure they are naturally repelled by all the other awful, backwards, compiled, non-dynamic languages out there, and if they really need recourse to some serious low-level voodoo there's really no need for anything other than C. The two play together fairly well too.
Props to them for ditching PHP BTW, I've had to write far more code in that ugly terrible language than I care to recall. You know a language is a dog when programming in Adobe Actionscript feels like taking a holiday!
Dropping C, and C# in favour of Pascal, Delphi and VB?
Seriously, it may be easier to _teach_ those languages, but it's hardly going to be of any value to the student once they have their A-level, unless they want to go into a career where their job involves maintaining spreadsheets and software written 40 years ago.
If anything, they should drop the obsolete and 'toy' languages in favour of languages like C#. It might be harder for the students, and they might not all get A-grades, but they'd get an employable skill.
"Most centres offer Pascal/Delphi and Visual Basic as the language of choice for their students. This selection is based on the experience of the teacher in that centre and their own comfort with that language."
So students have to learn a language which won't get them a proper job because it's the only language their teacher knows? Anybody else sensing a vicious cycle?
That may be taught in higher education? Well, let's not teach anybody English, French or whatever until they get to uni, if they're studying modern languages courses... See how that flies...
Pascal/Delphi/Ada isn't bad as a 'basics' language... Maybe for O level...
VB? Don't make me laugh.
While C/C# may have a bit of a learning curve, A-levels aren't meant to be too easy, and you don't need to know all of C/C#/C++ to code basic programs in them.
Removing some of the industry's most used and effective languages, while leaving niche languages on a syllabus just strikes me as wrong.
No more than Visual Basic, and certainly much less than C.
I say C++. Teaching a multi-paradigm programming language is the best basis for students moving on at a later date to other technologies. C++ is one of the most fundamental technologies in the industry.
Please, not Visual Basic. It teaches almost nothing about technology.
I can understand dropping C, simply because it's very low level (for the most part), whereas C++ allows for some relatively high-level development with the right tools.
I think you've missed the point of computer science. It isn't about "technology" at all. Technology is the end result of people with an education trying to solve real-world problems. Computer science is (in part) about teaching the different theories and methodologies used to make computers do things. Object oriented programming is one, as is functional programming, imperative programming and so on. Each is an entirely distinct system of thought, each has its own particular strengths which promote its use for solving certain classes of problem, and each has weaknesses which are revealed when trying to use it for the wrong thing.
It's necessary to learn the low level in order to understand why the high level works as it does. One needs this in order to create code which is sympathetic to the way the machine operates and is therefore efficient. Jumping straight in at the high level leads to quick results, but without the deep, intuitive understanding of what's going on under the covers the student cannot build on their knowledge. It's the difference between a tourist phrasebook and properly learning a foreign language. One will get you a beer, the other will get you a wife.
A "multi-paradigm" language like C++ is probably the worst possible case. Mixing imperative (or structured, or procedural) and object oriented programming muddies both concepts. Many students struggle with the difference; presenting them in the same language is likely to hinder their understanding. C is perfect for teaching structured programming because it is so close to the underlying machine code and so intimately bound up with what the computer is actually doing. You can easily see why one algorithm may be much less efficient than another, because you can see what the machine is actually doing. C is not in any way hard. Once you understand the _concepts of that paradigm_ it is very simple, it just gets out of the way. The "learning curve" you refer to is actually the difficulty in learning to program properly, which is what comp sci teaches!
Do you have a computer science degree?
... but they need a good grounding in data structures, pointers, and the machine model. If that can be combined with a basic understanding of OO all the better. Sadly few coming out of UK universities have the skills, which is why we mostly recruit overseas.
Computer science is not, or shouldn't be, about learning to program in language X; it's about learning the full software development methodology.
Its goal is not to teach you C#, Java, C, whatever; its goal is to teach you about memory, variables, data structures, code flow, etc.
If you understand these concepts, the actual implementation language doesn't matter as much. Yes, you can code some pretty impressive stuff quite easily in C#, but if it doesn't work exactly as you expected and you don't understand the basic concepts that sit underneath it, you'll never know why.
I did a software engineering degree, and at no point during it was I ever taught to actually program. You were expected to go away and learn 3-4 different languages on your own, no specifics given; all of the lectures and concepts were done in a form of pseudocode.
When you come across sorting code that chunders along slowly and find out it's because someone is constantly copying memory all over the place instead of just swapping pointers around - that's the sort of stuff computer science should be teaching, not how to use library sorting functions. Once you understand how to do it yourself, then you can start using the libraries!
I am now a professional, and guess what: in reality it's not about the language. If you're needed to do some C, you get a CBT course, then you're coding in C. Yes, experienced people are needed to support and review, but in the main, unless you are doing something specific, the actual language is unimportant. If you need to work in VB, you learn VB; it doesn't take long to get going if you have the basic grounding in the fundamentals that these courses are there to teach.
The school should be teaching the student how programming works, NOT how a specific language works.
This way the student will have sufficient knowledge to pick up any language from the basics learned in school.
VB is taught because it is a nice, easy language to begin with; the student can see instant results and not be bogged down with complexities too early on. Once they have the basics mastered, they progress to something like C at uni.
Remember, it is up to the individual student to learn further languages themselves. I (and I'm sure a lot more of you) taught myself several languages (Java, C++, PHP, Perl etc.) from the basics I learned in school.
As someone once said, give a man a fish and he will eat a meal. Give a man a fishing rod and he will provide for himself!
I'm quite sick of people saying that the programming languages used in schools must be popular in industry. After all, a lot can change between the time students learn programming and the time they apply for a job. Of course, you can try to predict which languages will be popular in the future, but you are as likely to be wrong as right, and even if you guess right, the languages are likely to change even if their names don't.
Also, in my experience, if you have learned to really program (as opposed to doing trivial cut-and-paste exercises), then you can very quickly apply this to new languages. So instead of using a language where you have to write a lot of apparently meaningless mumbo-jumbo just to produce a hello-world program, use one where you only write things that have direct relevance to the problem you solve. And use a simple language. Instead of being introduced to a new language construct or library function every time they need to solve a new problem, the students should learn only a few constructs and build everything up from these. They might not be able to create flashy animations or games as quickly, but they learn more.
Tories take over and the world ends.
PHP is the only language where pay held up and there weren't mass redundancies during the election.
C is a crazy important language anyway - everything useful is written in it.
C# is the future, and coincidentally is a great reference language, because it takes everything that's been learnt about writing software over the years and boils it down into a new language with an excellent feature set - Microsoft's implementation taking the most beautiful language ever written and effectively turning it into a faster Java notwithstanding.
Java is just useless across the board and it can only be a good thing it's going out the window.
Delphi? Come on. Python is pretty ugly too. Not saying PHP isn't but at least PHP knows it is, Python thinks it's God's gift to software developers and falls waaaay short of the mark.
I mostly take issue with the killing of C and C#; they're quite possibly the most important languages on the planet right now. Everything we have is written in C, so you need to know it to maintain all that other code, and C# is the future.
And no, this stuff isn't too hard; if they can't get to grips with it at A-level, they probably won't at uni either.
If we're really trying to create a generation of totally unemployable people why not just teach them all Fortran and have done with it?
"This selection is based on the experience of the teacher in that centre"
On that basis, I should have been taught COBOL using punch cards by Mr Crusty McOldfart at Technical College.
I suppose it's too much to ask that the teachers are actually competent in a more 'current' language rather than just the one they were taught 25 years ago?
The only job you'll get working with Pascal is teaching it.
The OU has a stubborn attachment to Java for some reason. Yes, it's getting a lot of support, from mobile devices to enterprise stuff, but I'd rather stick my spuds in a vice than do another Java module. How about offering some other languages?
I've been working with C# for a couple of years now. As an old C/C++ dinosaur, I'm hugely impressed by the way it gently pushes towards good, clean programming practice.
It's just a shame the IDE decides to eat its own feet occasionally.
OK, who the hell thought teaching Microsoft languages was a good idea? Really?! Aren't they quite likely to be dropped without trace (J++), or to have the name applied to something totally different (Visual Basic)?
Don't even get me started about Delphi...
I guess some things never change. I had to study COBOL, dBASE IV and Fortran 77, and never used any of them. I also did Pascal, but I suppose I can see the point of that as an "introductory" language.
Clearly educationalists are slow learners (ironically).
It's an A-level not a complete programming course. As long as the teacher is competent enough to teach the basics of programming using whatever chosen language then does it matter? If the students really want careers as developers then they will learn all that they need either in higher education or on line.
My higher education programming consisted of:
Pascal and VB6 for beginners programming
C for more than beginners programming
Motorola 68k assembler for hardware principles
Java for Object Oriented stuff
A, B, Z and C++ for formal methods
And now, professionally, I write Databasic for Reality.
If the intention is to teach students the concepts of programming, then using a language suited to that purpose (pascal/delphi) is not a bad thing. With pascal/delphi you can code in a functional or object oriented style, for example.
Once you know programming concepts and how to code in one language, you should be able to pick up other languages reasonably easily - this is what is required of programmers in the real world.
Don Knuth used a made-up assembly language in The Art of Computer Programming. Anyone who learned that language and worked through the examples in that book is likely to have much deeper knowledge of programming, and be more useful in the commercial world, than someone who learned and only knows PHP/C#/Java/whatever.
'but a course that covers the fundamentals of computing' - last time I looked, most OSs were written in C/C++, and C syntax still forms the basis of many modern languages. So I guess the main reason for dropping it is simply that the teachers don't have the depth of knowledge to teach it.
You might want to look up what the __stdcall prefix is for in some of the older Windows APIs.
As for the C family's undeserved popularity: perhaps it's still common because many operating systems are still built on the lumbering dinosaur known as UNIX. How a 40-year-old OS (and OS design) has managed to remain relevant to today's IT needs is a mystery.
Schools don't have huge IT budgets—hence the continued support for VB6, or did everyone miss that?—so it's hardly a great shock that some older languages remain. Teachers have to teach with whatever they have to hand. If that means a classroom stuffed with ageing Pentium 4s running Windows 2000 and Delphi, so be it. The teacher doesn't get to demand new PCs capable of running the latest toolchains.
Few companies are hiring programmers fresh out of 6th form. They're hiring them out of *uni*. So there's at least another 3-4 years of learning after the student has their A or AS Level.
(Of course, when I were a lad, we were satisfied with a lone ZX80, Commodore Pet, and the school's mighty Research Machines "Link 380Z" running its "Cassette Operating System". Kids these days don't know they're born. [INSERT PYTHON SKETCH HERE]. Etc.)
"Schools don't have huge IT budgets—hence the continued support for VB6, or did everyone miss that?"
Hence them using some outdated proprietary shit as opposed to a real language with a free compiler/interpreter? Or did you miss that?
I think a large part of why they use VB and Delphi is that either A) the teachers know nothing else and are too scared or lazy to change environments and course materials, or B) they don't credit their students with enough intelligence to grasp a language not inherently entombed in a point-and-click graphical IDE. Neither of these is a very good excuse, and neither is money.
Most people here seem to be subscribers to a false dichotomy. There is no reason a language can't be both useful for teaching and useful in the real world. Find me a computing concept that can't be well demonstrated using Python or C, or both - they are both useful real-world languages, no?
"Hence them using some outdated proprietary shit as opposed to a real language with a free compiler/interpreter? Or did you miss that?"
No, I didn't "miss that". Believe it or not, the cost of a toolchain is not an indicator of its quality.
I've used vim and gcc. I've written in low-level and high-level languages, from various assembly languages through C++ to .NET and Objective-C. I've written entire games using HiSoft's GenST tool and even the Picturesque Assembler for the ZX Spectrum (at a time when I couldn't afford anything other than a cassette deck for storage; believe me, you learn to write good, quality code *fast* in such environments).
Microsoft, for all their recent management problems, actually do know what they're doing when it comes to developer tools. They've been making them for much longer than they've been making operating systems. And they're actually pretty good. One of their key philosophies is to make programming *easier*. Visual IDEs are a part of that. (The GNU / Linux community appears to believe in making programming *harder*. I'm not yet sure why, although, having used Emacs, I suspect masochism may have a lot to do with it.)
Laboriously writing your commands out in dumb, flat text files and telling the computer how to link them together is a *terrible* solution for most software development problems. That it's how it was done 40 years ago does not mean it's a good way to do it now. Unfortunately, the programming fraternity is ridiculously conservative.
Windows and OS X each provide a single, homogeneous platform to target, with easy development tools that let you get results quickly. The myriad variants of UNIX do not: their development platforms are fragmented and barely coherent, let alone cohesive.
My job as a teacher was teaching *programming*. And, since "BASIC" actually stands for "Beginner's All-purpose Symbolic Instruction Code", I think it's fair to say that it's not a terrible choice of programming language for—you know—*beginners*.
Knowing how to choose the right algorithm or library is far more valuable than understanding the finer details of the "make" command or Emacs.
Similarly, the argument that lots of people use "C" or "C++" does not wash. Lots of people love "X Factor" and "Pop Idol" and drinking themselves to the edge of alcohol poisoning too. Doesn't mean they're *right*.
That the tools are *easy to learn* and *easy to teach with* is of far more value to a teacher than whether they're particularly popular. Most languages are very similar syntactically, and by the time the student has been through university, he'll have had plenty of experience with other programming languages already.