Academics slam Java

The choice of Java as a first programming language in computer science courses is undermining good programming practice, according to two leading academics. In a withering attack on those responsible for setting the curriculum for computer science courses, doctors Robert Dewar and Edmond Schonberg of New York University (and …

COMMENTS

This topic is closed for new posts.
  1. Thomas Beek

    Bias of Academics Against Commercial Uses

    The bias of the "good doctors" is that they are allowing themselves to be distracted by Java's uses and how it is being taught, rather than anything about the language in particular.

    Java is being used everywhere, not just in Web apps.

    -Tom

  2. Thomas Beek

    Critics Also Praise Java

    If you follow the link to their company's website, you will find this quote about Ada:

    "A major improvement [to Ada] is that Java-like interfaces are introduced thereby permitting simple multiple inheritance; null procedures have also been introduced as a category of operation."

  3. Anonymous Coward
    Boffin

    He's right about Basic programmers...

    'Edsger Dijkstra described those exposed to Basic as "mentally mutilated beyond hope of regeneration"'.

    I'd say it goes double for ASP numpties... :-)

    More seriously, though, it isn't just the choice of language or how the language is taught. It has a lot more to do with the curriculum as a whole. How much and what quality of Maths are they taught? Do they get taught these subjects (Discrete Maths, Computer Electronics and Programming) in isolation, or are they taught in a way that shows how each affects the others?

    Unfortunately, in many schools the Maths department teaches maths as though no other subject exists, and the Computing and Electronics departments are guilty of the same.

  4. Anonymous Coward
    Stop

    Criticism odd?

    "While the good doctors acknowledge that "real programmers can write in any language", they specifically laud the virtues of C, C++, Lisp and Ada.

    All of which makes the criticism of Java somewhat odd. Java syntax is derived from C++ ..."

    Actually, no, it doesn't make the criticism odd unless the criticism was specifically about syntax, and I sincerely doubt that. You do realize that there's slightly more to programming languages than syntax, don't you? And that beyond syntax Java and C are about as far apart from each other as two imperative languages can be?

  5. Michael Hoffmann Silver badge

    Now they love C eh?

    Back when I was in uni those profs lambasted C while extolling the virtues of Pascal/Modula 2.

    I would have expected them to at least praise something like Haskell.

    Mike

  6. Stephen B Streater
    Heart

    Consumer software runs on the web these days

    It's funny to criticise Java for working so well on the web. Most applications are run on the web these days - Google, Ebay, Amazon, Facebook, Myspace, Wikipedia (can I mention that here?). Modern JITs are efficient enough to allow even the most challenging of applications - real time video editing and playback.

    Software is there for its consumers, not its programmers. So let's get away from purism and observe that, in the real world, if you want real-time, safe, secure and accessible software, Java works best.

  7. Anonymous Coward
    Anonymous Coward

    And now that I actually bothered to read the original article...

    Christ, it's spot on. The list of "languages that matter" - nothing missing, in the right order, on the list for the right reasons. You can hate academics all you want, but the author of this particular critique seems to have a valid point to make. And yes, there's something unique about C as the layer between the machine and the programmer/compiler designer. I never cease to be amazed at the foresight (or luck) of those two guys working for AT&T in the mid-seventies.

    But what do I know, I just design instruction set architectures, implement them with pipelining, multithreading and support for loosely-coupled memory interconnects, supervise compiler projects serving them and finally write a simple operating system and play a game of tetris running in VHDL simulation once all the pieces are there. And yes I'm even being paid for all that. :-)

  8. Anonymous Coward
    Unhappy

    Haskell

    I kid you not we have just had a 12 week course on... Haskell.

    Talk about a massive waste of tuition fees.

  9. Martin Gregorie

    Java considered bad for learning?

    The main criticisms of Java in this article would seem to be a mixture of unfamiliarity with the language and poor teaching material.

    Unfamiliarity

    ============

    The two statements:

    - Students found it hard to write programs that did not have a graphic interface

    - its restricted use in the development of web applications

    are mutually contradictory since web applications typically run on servers and never have a graphical interface.

    Poor teaching material

    =====================

    If students don't understand the relationship between the source program and what code is generated or can't write command line utilities they simply haven't been taught properly.

    ====

    It seems to me that Java is a nearly ideal first language since it gives excellent training in creating modular, well structured programs. This can be carried forward into other languages. In addition, it has such a rich set of standard classes that non-trivial assignments can be written much faster than would be possible in, say, C.
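    For example, a word-frequency counter - a typical early assignment - is a handful of lines using nothing but the standard collections, where the C version means hand-rolling the hash table and the string handling. A rough sketch (the class name is mine, purely illustrative):

    import java.util.Map;
    import java.util.Scanner;
    import java.util.TreeMap;

    public class WordCount {
        public static void main(String[] args) {
            // Count how often each word appears on standard input.
            Map<String, Integer> counts = new TreeMap<String, Integer>(); // sorted keys for free
            Scanner in = new Scanner(System.in);
            while (in.hasNext()) {
                String word = in.next().toLowerCase();
                Integer seen = counts.get(word);
                counts.put(word, seen == null ? 1 : seen + 1);
            }
            for (Map.Entry<String, Integer> e : counts.entrySet()) {
                System.out.println(e.getValue() + "\t" + e.getKey());
            }
        }
    }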

    Teach Java first, then C or Ada and leave C++ for last. If you want students to understand the hardware, then teach them assembler right after C.

  10. Herby

    Oh, well....

    Back to Fortran (which is what I'm using at this very moment).

    Yes, java is terrible. So is (IMHO) C++.

    People should learn all about command line interfaces before doing much else. Leave that GUI stuff to silly artist types.

  11. Sampler
    Thumb Up

    I can see the point

    I've worked as tech support at a dev base for HBOS - a building of two to three hundred programmers and developers meant I didn't get any software calls, as they dealt with those themselves (where feasible; access restrictions do get in the way, but for those calls I was merely there to type in the password ;) ) - but the hardware fault list went through the roof, as most of these programmers didn't know the basics of how a computer ran. Which always bemused me: how can they write efficient, clean code if they don't know how the system works?

    But then again I'm just a box swapper and don't code :D

  12. Anonymous Coward
    Thumb Down

    History keeps on keeping on...

    To me, the lambasting that keeps on keeping on has more to do with the dilemma of how much a college should be making its students "commercially ready" and how much they should be getting into the nitty-gritty of computer science. This sort of rant from college people happens on a somewhat regular cycle.

    That being said, note that these people have their own language bias towards Ada.

  13. Anonymous Coward
    Thumb Down

    As far as BASIC goes,

    it was the first language I ever learned, on a ZX81. I followed it up with BBC BASIC and 6502 assembly language.

    I currently work as both a web developer (using multiple scripting languages) and as an Oracle certified developer, having previously worked with IBM Assembly Language and COBOL on IBM mainframes.

    Yep. BASIC sure screwed up my chances of being an IT professional.

  14. Greg

    Bollocks

    Prepare for some fanboyism...

    What a load of utter crap. What they're saying is, programmers aren't as hardcore and uber-nerdy about piddly little details as they used to be, and these old-timers don't like it.

    So what they're really saying is, someone's made a decent language where you don't HAVE to care about that shite, and that's a bad thing. Well, sorry lads, some of us got bored of having to worry about constant pointer issues and memory leaks.

    I've used many many languages over the years, starting with BASIC and Pascal, moving on through web languages like PHP to C++ and finally to Java (via a gazillion others). And of them all, Java is the one that impresses me the most. Contrary to the opinions of these morons, it's flexible beyond belief. Want an app for a mobile phone? Perhaps your wristwatch? A website? Or maybe you'd like to code up a scalable enterprise banking system instead?

    Sure, you don't get down to the really hardcore stuff as much, but that's kind of the point of Java - why the hell would you want to when someone's written a perfectly good library already? With the wonders of proper OO, you can just drop in a free-licence library and save yourself a ton of time and effort. How can that not be brilliant?

    And how can using a very well structured, strongly typed, flexible language undermine good practice? If you don't follow good practice in Java, your program's likely to be crap. Or not even compile, for that matter.

    I'm sorry, but their remarks are nonsense. Especially the one about "not knowing how to build programs without a graphical interface." That one made me laugh. On my University course they taught us how to code servlets and the like before they taught us how to code with Swing.

    Truth is, take any student that has sat a "programming course," having never done programming before or outside the course, and they will still know sod all about programming in most languages when they come out. There's only so much you can teach, and there's such a vast amount to learn. Of course they don't know everything - that comes with experience. And obviously the course tutors are going to settle on a language that a) they like, and b) will be useful to the students in the future. And that's Java. (No matter how a certain over-enthusiastic lecturer at Leeds Uni wishes it was Python. Don't worry, we're getting him treatment.)

    I wouldn't mind them teaching C++. Really, I wouldn't. It's not a crap language. It's just that...well...Java's better. By miles.

    Right, rant over. Back to fiddling with PHP 4 and wishing it was PHP 5.

  15. Morely Dotes

    Snobs, bigots, and venal professors

    Edsger Dijkstra was (is? He's dead to me) a narrow-minded, bigoted specialist.

    My second programming language was BASIC (Tiny-BASIC, to be precise). Hardware constraints forced me to teach myself assembly language, and, since no compiler available would fit in the memory I had available (this was the Z80 era, 4K RAM was expensive and 16K was luxury beyond belief), I also had to hand-compile it. Prior to that, I had learned enough FORTRAN to make it through my local community college's CompSci 101 course (all programs to be punched on lovely cards, sorted, and run through numerous torture devices before they could be submitted to the High Priests who determined exactly what was to be run, and what was to be discarded, presumably by reading the entrails of a freshman).

    I later learned C, PowerBASIC (a lovely compiled BASIC that rivaled C for final code execution speed), C++, and several other "languages" which are Web-related and which I therefore consider less qualified to be called "programming" languages. I've been published, worked in Intel's Network Products Division as a build engineer, and moved on to enjoying the fruits of others' programming rather than writing much more than bash scripts these days.

    The leeches deprecating Java are not wholly incorrect, but they have placed emphasis on the language which they prefer for venal reasons, rather than logical ones. Students should learn to program in pseudo-code, hand-compile, and run the results, before they are ever "taught" a high-level language such as Fortran, BASIC, or Java - or Ada. I wasn't fortunate enough to go that route myself, but I was sensible enough to go seek out the knowledge I needed when I needed it. I know from my experience as a supervisor that not everyone will do that.

    Naturally, I don't post with my real name.

  16. Mr B
    Thumb Up

    An over hyped heap of p**h

    At last ... java should 'av been char broiled years ago.

    OOOh pointer arithmetic is error prone ... lets not doo that!!! but the OS is mostly assembly & C, so are most of the third party apps ARRRGHHHH ... quick JNI to the rescue. That whole pointer thing is a joke, most of optimised assembly code I wrote is not even reentrant.

    Wot's happening in the JVM ... who noze??? Lets instrument ... but you'll have to write some C++ to get the full (alleged) benefits of JVM TI ... rubbish!!!

    Have a look at the Hot Spot JVM source code ... when I started as a low grade C++ code pee'er I would have been thrown out of the window for writing that kind of code.

    Virtual machines are not new, and some I used (Smalltalk & Forté 4GL) were said to be slow ... lightning fast compared with the Gosling T21somic child; they started that crap in the early 90's.

    I'll shut up the day a Doom/Quake like game written entirely in Java will perform the same on the same machine. If it's not fit for a million dollar game how can it be for a billion dollar bank app?

    Good experiment but please Ditch the stuff!

  17. Joe M

    Read it again Phil!

    Very poor form Phil! If you read the article again, it does not attack Java at all. It simply states the fact, obvious to anyone who has tried to hire fresh young programmers with sparkling new degrees, that Java is not a great language to teach the fundamentals in (flamers note: the fundamentals). Neither is COBOL for that matter.

    It's not academic snootiness but a simple fact of life. I have found to my cost that trying to alter the mindset of Java-only programmers is likely to be a very expensive re-training exercise.

    By the way, the two best programmers I have ever worked with had no formal qualifications of any sort and were entirely self-taught. One ended up as the principal partner of one of the most specialised software companies in the Old Dart. This may say something about formal education.

    Your characterisation of Tony Hoare as an ivory tower egghead is so far off the mark that I must assume that "you are making with the little jokes, si?". A more practical minded guru there never was.

    But keep up the good work writing about languages. It's always interesting.

  18. Rafael
    Stop

    Sure, Ada, Lisp, what else?

    Cobol? Fortran?

    The way basic CS courses are taught is more important than the language itself. On the other hand, it is quite hard to motivate students to learn languages such as Ada, Lisp, Prolog and the like.

    Yet another case of "I hate it, refuse to learn it therefore it is crap"?

    Oh, PL-1, wherefore art thou?

    Rafael

  19. Aubry Thonon

    I happen to agree

    Based on personal experience working with "new graduates", those who went through the theoretical programming courses (ie, basics of OS, memory management, set theory) were more useful than those who simply picked up a language where all the "dirty work" (garbage collection, memory handling, etc) was done for them. The first group, being cognisant of the "theory" of programming, could pick up new languages in a very short time (all programming follows certain basic patterns). Those in the second group were lost once they were asked to move outside of the language they were taught.

    And that's the problem: the first group were taught to programme, the second group were taught Java/.Net/etc... Big difference.

    Don't get me wrong - Java (for example) is a great language *if you already know the theory of programming*. I remember recoiling in horror when I attended a Java class (I needed a couple of credits to finish my post-grad) and a student asked why Java arrays start at 0... and the lecturer could not answer! Going straight into Java, .Net and their ilk produces a generation of "button pushers" who have no idea of what is actually going on inside the HW (even virtually) and are totally lost once they have to think "outside the IDE".
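    (The answer the lecturer couldn't give is inherited straight from C, and fits in a couple of lines - a purely illustrative sketch, class name made up:)

    public class WhyZero {
        public static void main(String[] args) {
            int[] a = {10, 20, 30, 40};

            // In C terms, a[i] is shorthand for *(a + i): the index is an offset
            // from the start of the array, measured in elements. The first element
            // sits zero elements from the start, hence a[0].
            for (int i = 0; i < a.length; i++) {
                System.out.println("offset " + i + " -> " + a[i]);
            }
        }
    }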

    As far as I'm concerned, don't let *anyone* near a 3.5G language until they've learnt the theory...

  20. Anonymous Coward
    Thumb Down

    Those who can, do, those who can't, pontificate

    "restricted use in the development of web applications".

    I can only think that they are talking about the core language and not the profusion of APIs and frameworks that have sprung up around it to simplify web development.

    However, learning Lisp and Ada is hardly going to put graduates in a good position to apply for technical positions on leaving education. Learning a language well adopted in the business sector, such as C++, Java or C#, will.

    And what is wrong with the console for feedback anyway?

  21. This post has been deleted by its author

  22. Mark Rendle
    Thumb Up

    Damned for popularity

    They're attacking Java because it's what all the universities teach; if they taught C#, VB or Python I'm sure those would be under fire. And rightly so: CS students should learn the basics and the basics start with console applications written on *nix in C, using the standard libraries.

  23. Anonymous Coward
    Anonymous Coward

    Back to your towers, guys.

    OK, so they complain of "the lack of mathematical rigor and formal techniques" and then propose C over Java? Remind me never to bother with any papers from these guys. Oh hang on, I've NEVER come across any papers from these guys before.

    Your comment about their criticism of Java's "close association" to web applications is spot on. I'm a strong advocate of Java as a language for teaching computer science, and we never see a single GUI window in our class.

    Juxtaposing these guys with Dijkstra and Hoare is an insult to computer scientists everywhere :)

  24. Chris Rimmer

    Similarity of syntax misses the point

    While the syntax of Java is derived from C++, the semantics are very different, and I expect that that is what the two academics are uncomfortable with.

    There's nothing wrong with teaching Java - I consider it a good programming language, with a good set of class libraries. But it is a much higher-level language than C, C++ and Lisp (I don't know Ada), in the sense that the language constructs do not map closely to operations in a typical physical machine. Java is designed to run in a virtual machine with complex semantics; a lack of understanding of lower-level concepts would make it hard to understand how this virtual machine can in principle be implemented on common hardware, and thus what the cost in time and space of Java constructs is likely to be.
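    To give one small example of the sort of hidden cost I mean (the class name is mine, and this is only a sketch): code that looks like "adding ints to a list" quietly allocates a wrapper object per element, because the collection cannot hold primitives. Knowing how the virtual machine could implement this is exactly what tells you the cost in time and space.

    import java.util.ArrayList;
    import java.util.List;

    public class HiddenCost {
        public static void main(String[] args) {
            List<Integer> boxed = new ArrayList<Integer>();
            for (int i = 0; i < 1000000; i++) {
                boxed.add(i);            // autoboxing: roughly one heap object per add
            }

            int[] primitive = new int[1000000];   // one contiguous block, no boxing
            for (int i = 0; i < primitive.length; i++) {
                primitive[i] = i;
            }
            System.out.println(boxed.size() + " boxed vs " + primitive.length + " primitive ints");
        }
    }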

    One of the best books I read at university (early 90s) was 'Structured Computer Organisation' by Andrew Tanenbaum, which presented logic circuits, microcode, several assembly languages, and then C-like languages, showing how each level of abstraction could be built on an implementation in the level immediately below. Java could usefully be taught once these lower-level abstractions are understood. The student would then be equipped to understand which problems are suitable for tackling in a Java-like language, and which would be better implemented in something more like C++, or even machine code.

  25. Paul Murray
    Flame

    What??

    So, Java is a bad teaching language, but C++ is ok? Or Ada?

    If the kiddies are learning bad programming with a lack of rigor, maybe they ought to teach good, rigorous programming. Java is a perfectly fine language for this purpose. If their courses are too narrowly focused on web apps, maybe they ought to change the content of their courses - which they themselves designed and are teaching. If you get my drift.

    The problem is not with java. It is with sausage-factory "higher education" which is nothing but vocational training. Take their money, teach 'em just enough so they can pursue a career, and to hell with learning.

    Yes, it all comes down to money, and to the way universities really work these days.

    IMO: programming-for-money ought to be taught in technical colleges. It is a trade - you are building something.

  26. Brian Miller

    All high-level languages are fallacies

    Joke: Assembler is the only true language! All hail assembly!

    Not joke: BASIC was my first language, as the first computer I used was a PET 2001-N. Then I tried learning 6502 machine language, but it was too difficult without a book.

    What is being missed is that the students are not taught design. They have no concept of design, so they blindly write something that kind of fits the language.

    Of course basic computer use is essential. Microsoft routinely hires people who can't type, can't use a command prompt, don't know how to use Windows, and can't even hook the computer up to a KVM. This means that their productivity is nearly nil, and they constantly complain that tasks are "too hard."

    I chatted recently with a fellow who had a professor who taught Java. The prof in question was a Java-worshiper, who was completely dismissive of any language which could access a pointer. These instructors are damaging the students. The real power of a computer is only realized when you get close to the hardware. Everything else eats away at the full power of the system, especially throwing away CPU cycles to a virtual machine.

    All languages have good parts and bad parts. What needs to come back into use is domain-specific languages. C and C++ are horrible for GUI work. Java is absolutely the wrong choice for writing a device driver. COBOL would be horrific for sending a man to the moon. Use the language that is appropriate for the problem at hand.

  27. JIm Davies

    Withering attack?

    It's not really a withering attack, is it? Not on those responsible for the curriculum, and not even on Java. And I don't think that you've chosen terribly good examples of ivory tower academics.

    The argument that the authors put forward might seem a little out of date - or perhaps it's too positive about Ada, and not positive enough about Java - but the concerns are real enough.

  28. Leo Davidson

    Syntax isn't everything

    They didn't slag off Java's syntax so why do you mention its similarity to the C/C++ syntax? Their complaint seems to be that Java hides what the actual machine is doing, pointers and so on, which is completely true, and that students don't seem to understand anything other than GUI programs (which seems unfair on the face of it; Java can write bad command-line apps as well as it can write terrible GUI apps).

    Syntax is a very small part of languages, and just because Java gets it right by using a good syntax doesn't mean everything else is right with the language.

    If you don't like academics in their ivory towers, read what Joel Spolsky said about Java, which seems to be along similar lines to what these two academics have said.

    (At university I was taught functional programming followed by C, and then Java, on a course that most would consider excellent. I already knew assembler, so I guess I didn't need teaching about hardware and pointers, but I'd say that, at the time at least, Java probably made sense as a teaching language for certain things, though certainly not everything. Any course which only uses one language is stupid if you ask me, but at the same time I would say that, these days, I'm not sure Java makes sense for anything, as there are better languages with a superset of its features and no relative downsides.)

  29. Maksim Rukov
    Joke

    Pointers? Sheer bloody luxury!

    When I were a lad, we wrote our programs using strange three-letter words and the occasional number. If we wanted a variable we 'ad to go diggin' through the stack. And we thought we were lucky!

    Words? With letters? You were lucky! I 'ad to punch 'oles in little pieces of card and line up behind ten thousand other programmers to put the cards in the room-sized machine. And we thought we were lucky!

    And you try and tell the young people of today that ... they won't believe you.

  30. Anonymous Coward
    Anonymous Coward

    Well most of us hate Java :)

    java was designed for multimedia TVs, not the web; it just keeps trying to reinvent itself.

    java is a bit like the ugly girl at the dance, wearing one of those pink princess dresses, and unlike a fairytale there is no Godmother to wipe away those blemishes, it is permanent acne.

    IMO Python is probably the best language to start on, and to use day to day. C should still be taught, as should C++. The shells are useful - at least one should be known. Haskell for the functional side. O'Caml is quite interesting, and of course Assembler should still be taught at degree level if not earlier. Ada is worth looking at; good programming practice can be drawn from that language.

    The ugly sisters are things like ColdFusion, Gopher, and Modula; they have very little application, and if you do find someone specializing in them they have either never had an academic grounding in computer science or have never left academia.

    I must admit I rather like Perl, but it is a connoisseur language like Lisp; unlike Lisp, though, a lot of people pick Perl up who perhaps need to start with something else.

    PHP is popular but of course the language is not really academic, it is a quick hack really trying to up its respectability, but of course that is the language most ISPs include with their free web space, so there are quite a few younguns who grew up with it. PHP will always be Personal Home Page to me. :)

    java is just a verbose mess; at least there exists Jython for those who cannot convince the 'powers that be' that the Java platform is not a smart move. You see it especially in the development life cycle: you could probably fit three comparable Perl projects into one comparable java one. With Python you will probably be looking at 2.5 projects, but then at least you can revisit with a bit more ease. If you are developing using java you are probably actually conning the client a bit.

    "Wherefore Art, Thou" by Larry Wall of Perl fame does a good job of describing the differences between some languages; java is muzak, and that is a generous description of it. :)

  31. Joe M

    @Greg

    Greg my dear boy, anyone who gets so worked up about a computer language has to be a great programmer and I’m sure you have built some terrific software in your time. But, psst…, I’ll let you in on a little secret. You know the mobile phone and the wristwatch, which you think you can program so well. You may not believe this - at any rate you tell us emphatically that you don’t care - but some poor sod who actually understands the hardware and all the other “piddly little details” of the device, wrote some nerdy embedded stuff which makes it possible for you and other programmers working at your elevated level to think that you are in control of the thing.

    If he or she was good at their work and virtualised the underlying levels well enough, you may never ever have to think about the piddles yourself. But eventually something will come and bite you and when that happens you won’t have the foggiest clue what to do. You know why? Because by your own admission you can’t really program a computer at all. Not really really. You need at least six layers of other people’s work between you and that “sub ax,ax”.

    Now, that does not mean that you are stupid or incompetent or in any way inferior. It means that whatever you do, and probably do well, relies on the competence and hard work of many, many other people, most of whom would certainly acknowledge and admire your efforts. Maybe you should return the compliment, or just think of them sometimes.

    And another whisper: the guy who understands what “sub ax,ax” means (and implies) can probably change places with you and write some pretty good Java if he sets his mind to it. Can you do his work as well!?

  32. Steve Roper

    Teaching novice programmers

    Speaking as a one-time IT lecturer, I have to say that NO one language is suitable for teaching programming to beginners. All languages have their quirks and the last thing novice students need is to get bogged down in syntax before they've even had the chance to learn the basics.

    Back when I was lecturing, I developed a curriculum module for entry-level students called "The Robot's Kitchen". The premise was simple: given a gridded map of a kitchen, with all the ingredients and equipment in specified locations and a recipe for chocolate cake, get a robot to make the cake. The robot can only understand simple commands, like MOVEFORWARD, TURNLEFT, TURNRIGHT, PICKUP etc. as well as IF/ELSE/ENDIF and DO/WHILE/UNTIL for repetitive task optimisation. In this way, I was able to impart the essential principles of programming - complex problem breakdown, function calls, conditional branching, looping and process flow - without the students having to learn the syntax of a specific language or writing a single line of code. Most of the students were amazed at how a half-page cake recipe turned into a twenty-page assignment, but as I explained to them, that's exactly what programming is all about - taking a complex problem and breaking it down into simple, single-action steps.
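    (For the curious, the command vocabulary is easy to model in a few lines of Java - this is an illustrative sketch of mine, not my actual course material, and certainly not something the students ever saw:)

    import java.util.Arrays;
    import java.util.List;

    public class RobotKitchen {
        enum Command { MOVEFORWARD, TURNLEFT, TURNRIGHT, PICKUP }

        private int x = 0, y = 0;
        private int heading = 0; // 0 = north, 1 = east, 2 = south, 3 = west

        void execute(Command c) {
            switch (c) {
                case MOVEFORWARD:
                    if (heading == 0) y++;
                    else if (heading == 1) x++;
                    else if (heading == 2) y--;
                    else x--;
                    break;
                case TURNLEFT:  heading = (heading + 3) % 4; break;
                case TURNRIGHT: heading = (heading + 1) % 4; break;
                case PICKUP:    System.out.println("pick up item at (" + x + "," + y + ")"); break;
            }
        }

        public static void main(String[] args) {
            RobotKitchen robot = new RobotKitchen();
            // One small step of the "recipe", executed command by command.
            List<Command> recipeStep = Arrays.asList(
                    Command.MOVEFORWARD, Command.MOVEFORWARD,
                    Command.TURNRIGHT, Command.MOVEFORWARD, Command.PICKUP);
            for (Command c : recipeStep) {
                robot.execute(c);
            }
            System.out.println("robot now at (" + robot.x + "," + robot.y + ")");
        }
    }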

    This paid dividends to the students in the second term, when I started teaching them actual programming, using Pascal to begin with, then C - NOT C++! Because I'd primed the students with programming methods in the Robot's Kitchen assignment, they had far fewer problems getting their heads around the principles of programming than students in other classes. While I don't want to sound like I'm blowing my own trumpet, I should point out that my class consistently had the highest pass rate in the faculty, and that by my second year, new students who had friends in my previous year's classes were asking specifically to be in my class at enrolment.

    As to programmers not needing to know about machine micromanagement (memory allocation/deallocation, managing the stack and so on), I have to disagree. As a programmer, you are taking complete control of the machine, and if you don't understand how the machine works, you have no hope of controlling it. Even if you end up as a DBA or frontend developer, you should still know about memory management, stack size and clock cycles because you still need to optimise your code for security, speed and efficiency - and you can't do that if you don't know what makes a computer tick. All programmers, even 3GL programmers, should be able to think in binary and hexadecimal and know the powers of two at least to 65536. They should know about the limitations of stack size and the importance of memory management - the famous buffer overflow exploit is a prime example of what happens when programmers don't understand these basics. That's like needing to know your alphabet before you can read Shakespeare.

    Finally, C is probably the best language to teach once your students have a good grounding - not at entry-level. Nearly all other 3GLs are based on C and use its syntax, so if you can program in C, you can easily learn any other language. I don't like C++ because it takes too many shortcuts and is too idiosyncratic. But C enforces good programming practice and forms a solid foundation for the student to build their programming career on.

  33. Tesco
    Thumb Down

    Pish

    "Students found it hard to write programs that did not have a graphic interface"

    Are they for real? What's easier than System.out.printing to console? Have you ever tried to write GUI code from scratch? Sure, drag-and-drop GUI-building tools tend to come built-in to Java IDEs these days, but it's not a feature of the Java language. Considering how many GUI-building Java tools there are available, and how bloody hard it sometimes is to write Swing code that works, I'd rather say it's more a feature of Java to /obfuscate/ GUI operations!
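    To make the point concrete (class name mine, just a sketch): console output is one line, while even the most minimal Swing window needs a frame, a component, a size and the event dispatch thread before anything appears.

    import javax.swing.JFrame;
    import javax.swing.JLabel;
    import javax.swing.SwingUtilities;

    public class ConsoleVsSwing {
        public static void main(String[] args) {
            System.out.println("Hello from the console");   // the supposedly "hard" non-GUI case

            // The minimal Swing equivalent: build the UI on the event dispatch thread.
            SwingUtilities.invokeLater(new Runnable() {
                public void run() {
                    JFrame frame = new JFrame("Hello from Swing");
                    frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                    frame.add(new JLabel("Hello from Swing", JLabel.CENTER));
                    frame.setSize(300, 100);
                    frame.setVisible(true);
                }
            });
        }
    }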

    "had no feeling for the relationship between the source program and what the hardware would actually do"

    Well, isn't that the point of Java? "Write once, run anywhere"? I hear that writing genuinely portable Java code is not always possible in practice but, as long as the program is doing what I thought I told it to, I'd rather not care what the hardware is up to, thankyouverymuch! If you wanna get in touch with the hardware, write in C or assembly!

    "(most damaging) did not understand the semantics of pointers at all, which made the use of C in systems programming very challenging"

    C can't express everything that assembly code can, but it gives the option to call assembly code from its functions. Similarly, methods in Java may be declared 'native'; that is, they are implemented in C.
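    (A minimal sketch of what that looks like on the Java side - the "hello" library name is made up and the C implementation is omitted, so this won't run on its own:)

    public class NativeDemo {
        // Implemented in C, e.g. as Java_NativeDemo_add in libhello.so / hello.dll,
        // compiled separately against the JNI headers.
        public static native int add(int a, int b);

        static {
            System.loadLibrary("hello");   // hypothetical library name
        }

        public static void main(String[] args) {
            System.out.println("2 + 3 = " + add(2, 3));
        }
    }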

    Shock-horror: Java-learned folks come to C and understand nothing of the semantics of C's pointers (shocking, considering Java deliberately abstracts such things away), which makes the use of C in systems programming very challenging. Great - why not teach them C then, if the point is to program systems in C?

    Java is a different language for a different purpose, and Java can't be blamed for not being able to express the concepts of a language of a lower level than itself.

  34. This post has been deleted by its author

  35. Ian Michael Gumby

    Oh the choice of languages...

    C? Yes, it's one of those perfect languages.

    (COBOL too could be considered a perfect language of its generation.)

    But C++? Hmmm no. (See my rant below. ;-)

    If you're going to look at a good OO language, Objective-C would get my vote. (Think of it as a blend of Smalltalk and C.)

    We're talking practical languages here. Not just something to teach you theory.

    <begin rant>

    But I'd still take Java over C++. Have you ever had to debug C++ written by someone who doesn't understand languages? The issue in the real world is that you're required to turn concepts into reality quickly and at the lowest cost possible. Java is very portable and you can find "cheap" labor to write something that "runs". Think of it as the AK-47 of programming languages. Think of C as an M1A1 and C++ as an early M-16.

    It's not that C++ is a *bad* language, but it's harder to support, or to find good support for, and frankly you could pretty much write the solution in C in a way that is more maintainable and less prone to errors.

    <end rant>

    And everyone who has ever programmed will have their favorite language.

  36. RW
    Paris Hilton

    @ Brian Miller

    "Joke: Assembler is the only true language! All hail assembly!"

    I think you may be confusing symbolic machine language with assembler language.

    Certainly the IBM System/360 Assembler (and its presumable Nth-generation descendants) was a devil of a lot more complicated than mere machine language. On the one hand, you had to worry about making sure the hardware environment was suitable (all those lovely base registers - anybody else remember those?), and on the other hand you had a macro facility that was an invention of Satan himself.

    [Let me annoy the moderator by discoursing at some greater length on the topic, how to teach programming.]

    It seems to me that it's a mistake to think that teaching any one language is "teaching programming." All that does is churn out graduates with a single skill and very little understanding, especially if the one-and-only language is a very high level one. There's no particular virtue in suffering, but exposure to the hardware guts of a computer and to a selection of languages of varying levels of abstraction is really The Thing To Do. (If I use caps like that, can I use the Man from Mars icon?)

    The point of doing so is to teach not gruesome details of syntax but rather, and much more importantly, underlying principles. The successful student would be able to relate the rather simple structure of classical procedural languages (Algol, Fortran, Cobol) to all sorts of wild and wooly derivatives: OO programming environments, Lisp, Ada, Python, Java, you name it.

    With the right selection of languages taught, the student is then able to get his head around whatever turns up next a lot faster than the poor doofus who thought learning one language was "learning programming."

    I've been out of touch with the field for so long I can't begin to even speculate what selection of programming languages would be optimal for teaching, but I think readers will understand my point.

    [Hey, what happened to the Man From Mars icon???? Guess PH will have to do in lieu.]

  37. Trix
    IT Angle

    So why the maths?

    I am not a programmer, I am a lowly systems admin. But can anyone tell me what high-level mathematics has to do with computer science? Boolean logic, yes. Binary, of course. Anything else? (Not at the physics level of little chunks of electricity and circuits)

  38. Ramon Casha
    Boffin

    Java is the best language to teach

    Java is the best language to teach students - not only is it good for learning concepts gradually, but in the end you've been trained in a language that has very solid commercial prospects. You can use it for web applications, GUI applications, command-line applications, database stored procedures, business logic, mobile applications, and even Blu-ray interactive stuff.

    Languages like C still have their place, of course, but it's a much smaller place - low-level stuff like drivers, kernels etc. I'd include it in a later stage of programming tuition.

    Personally, one of the things I hated about C++ is that if I wrote a program using Borland C++, and later had to switch to MS-C++, all the dependencies on the Borland libraries didn't work any more. Oh, and forget about switching to a different platform like Unix. Anything beyond some fairly basic standard libraries is different in each brand of compiler. The shift nowadays is for the language to include a substantial library of classes/routines which is always present whichever compiler or platform you use. So, whether it's Java or C# or PHP, you're not so tied down to one brand. Java is tops in that regard.

    It's good for programmers to have at least some knowledge of what goes on behind the scenes when they make use of a library, but on the other hand it's nice to be able to just call a function to read an HTTP connection without having to learn all the details about how to handle proxies, redirect status codes, and all the other nitty gritty.
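    Something like this is all it takes to pull down a page (the URL is just an example), with redirects and proxy settings handled behind the scenes by the standard library:

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.net.URL;

    public class FetchPage {
        public static void main(String[] args) throws IOException {
            URL url = new URL("http://www.example.com/");
            // openStream() deals with the connection details for us.
            BufferedReader in = new BufferedReader(
                    new InputStreamReader(url.openStream()));
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
            in.close();
        }
    }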

  39. Keith T
    Heart

    Professionals versus tradesmen

    There are 2 sorts of computer staff.

    1. Professionals who understand the theory of what they are doing and are capable of independently learning new skills, and who are capable of making a wide variety of decisions on IT and related fields.

    2. Technicians/trades people who understand a language or two and a few specific tools. They are generally better at using those specific tools than the professionals, but their limited range of knowledge limits the range of tasks they can safely do.

    Our industry has not recognized these 2 categories yet. But this is similar to accounting, engineering, and medicine.

    Yes, a professional might use Java. But a professional will also know the theory of what is going on behind the scenes, which you don't learn or think about when you learn Java. Of course you could learn just the C-like part of Java and then learn what happens behind the scenes, but then you would be learning C, not Java.

  40. Christian Berger

    It's not just about languages

    I believe students should learn the ideas behind the languages, for example by getting an intro into one language of each group. As most languages can be learned in 2 weeks (there are exceptions like C and C++) it shouldn't be a problem.

    I'd start with the classical "spaghetti-code" languages like old BASIC or some assembler. It gives the student understanding of the basic way of how computers work.

    Then I'd teach stack-based languages (like Forth). The shock is rather small and it gives much understanding for the next group.

    Algol-like languages (Pascal, C). Unfortunately those are now pretty much the only thing in use.

    Symbolic languages: since they are so small I'd teach both LISP and Prolog.

    Object Orientated languages (Smalltalk, Ruby, etc) to show people how object orientated languages work.

    Pseudo OOP-languages: Like C++, to show how you are not supposed to do object orientation.

    .

    It's definitely a bad idea to throw students into C or C++. In my opinion C++ is dangerous to your mental health.

  41. Joe M

    @Steve Roper

    Hear, hear! Teach programming, not programming languages, is the gist of what you and the article quoted by Phil Manchester are saying, I think.

    By the way, my old Computer Machinery lecturer (may he rest in peace) used knitting patterns from the Woman's Weekly for the same purpose that you used the kitchen and cake with your students. (I think I just gave my age away. In those days women's magazines dealt with knitting and crocheting, not with nooking and crotches.) Eventually, as an assignment, we had to describe the knitting language grammar in BNF. Ah, the good old days! But what I would have given for the PC I'm pecking this on!!!

  42. Michael H.F. Wilkinson Silver badge
    Boffin

    Teaching programming BASICS

    I have taught programming basics both in Pascal and, more recently, Java. The main reason we deprecated Pascal at our department is that the version we had available at that point (an older GPC) did not do good run-time checking. Given that Pascal's strong type checking and good run-time checking in the previous incarnation (HP Pascal under HP-UX - especially bounds checking; buffer overflow errors, anyone?) are extremely helpful to the novice programmer, one of the main reasons for using Pascal was lost. A novice might deal with an error message along the lines of:

    Array index out of bounds at line ....

    but not with

    segmentation fault

    which C would produce (if you were lucky). The problem we face teaching imperative programming in Java is that many students treat the problem of programming as a "cut and paste" exercise. Too often they just look up a suitable class/method and kludge together a program which does sort an array as required, but they have not understood ANYTHING about the algorithms or data structures behind sorting. This is unacceptable. In practice, it is an important skill to use available classes etc, but a programmer, and certainly a computer scientist, must be able to develop NEW algorithms as well. This requires a thorough understanding of existing algorithms and data structures. I have actually just developed a new algorithm which performs an image filtering in O(N) rather than O(N^2), and it is hugely satisfying to see computing time drop from 4 - 10 minutes to less than a second on a 3 megapixel image. This may be developed in an "ivory tower" environment, but it will be used (hopefully) in real-world applications. Without a thorough grounding in the basics of programming, including complexity analysis, I could not have developed this algorithm.
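    (To make the contrast concrete - an off-by-one like the one below is caught and named by the Java runtime, where the equivalent C would at best die with "segmentation fault". Class name is mine, purely illustrative:)

    public class OutOfBounds {
        public static void main(String[] args) {
            int[] a = new int[5];
            // Off-by-one: i == 5 is not a valid index.
            for (int i = 0; i <= a.length; i++) {
                a[i] = i;
            }
            // Fails at run time with java.lang.ArrayIndexOutOfBoundsException,
            // naming the offending index and the line it happened on.
        }
    }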

    I think the authors of the paper raise a valid point: we need something which allows us to focus on teaching people the basics of algorithms and data structures. I could imagine using first C and then going on to C++/C#/Java. But only if we can get C to give our students sensible error messages, etc. Of course industry needs people who know their way around the vast number of available classes and methods of, e.g., Java, and it is very nicely portable, but industry also needs people who can develop new algorithms.

  43. James Anderson

    @Stephen

    " most applications are run on the web " ... well a lot anyway

    However, most of these applications are not written in Java!

    Outside the "gated community" of the Fortune 500, web applications are written in Perl, PHP, C, C++ - in fact, were you to name any programming language of the last 20 years, there is a 50/50 chance I could point you to a site implemented in that language.

    As for the list you give:-

    Google -- Mostly C/C++

    Ebay -- Java

    Amazon -- almost everything that can run on a windows box

    Facebook -- PHP

    Myspace -- Cold Fusion

    Wikipedia --PHP

    So one "Java" site out of 6. I suppose you could argue "ColdFusion" is really Java.

    Anyway, what did these academics ever do for us?

    Computer Science gets less and less relevant as the technology improves. It's as if music was taught as "Resonance Sciences" and aspiring violinists had to do a paper in "String Science" before they were given a violin.

    Currently on "Joel on Software" there is an article advocating that there should be a Bachelor of Arts course in programming and application design, where students would spend most of their time programming in the same way that film school students spend most of their time filming.

  44. George Flecknell
    Thumb Down

    Dinosaur frenzy

    Robert Dewar and Edmond Schonberg of New York University are clearly fools. Everyone is taught, as they learn Java, that it is a severely limited language in some respects. As people who claim they are academics, they are doing nothing to further the cause of students who have moved across disciplines to Java and have found it an easy route into learning development skills.

    Such elitist nonsense is typical these days. I am a science grad who is retraining in development - Java has been an easy starting point for me. On my own I have gone on to fill in the holes in my geometry, algebra and low-level language skills.

    These people are clearly thinking only of furthering their own cause by promoting their own languages of choice. Unfortunately other people listen to them.

  45. Shakje
    Stop

    My experience is...

    people will defend to the hilt the languages that they were educated with. Personally, I started coding when I was about 8 in BASIC, and only made the leap to C++ about 4 years later. I've done a game coding course and a course in Java since, but I was always a C++ advocate, until recently. It was only very recently that I started using C#, but I really love it. Anyway, that's by the by, since the real issue is NOT whether Java is a better language than others, but whether it should be used as a teaching tool. Personally, I think Java has an important part to play, however much you like or dislike the language itself, and I have to say that Java has my favourite IDEs over those available to other languages.

    Yes, it might be portable, and you might be able to get "cheap" labor to code it, but the truth is that a lot of corporate companies have legacy code in C++ or C, and companies in the games sector will more than likely have a very high percentage of code in C++. This would not in itself be a problem, but the techniques that are involved in C++ are easily transferable to Java. My own experience of this is that I learnt to use Java reasonably well within two weeks or so, and part of this is because I already understood the fundamentals of programming at a lower level. I don't think that it would be so smooth the other way around.

    Using Java in education is a shortcut, it means that you can skip all those difficult sections on pointers, stacks and heaps, and concentrate on the other difficult concepts such as the basics of O-O coding. Having known people who learnt Java as a first language, I know that a lot of them didn't get O-O, and would write procedural code inside a few objects.

    Personally, I think that everyone should learn C first. Not because I think it's a better language, but because without a knowledge of C you can't possibly understand why a language is needed, or where it fits into the hierarchy of C-style languages; you need to learn concepts in a backwards fashion (such as how parameters are passed internally, and why that int isn't changing outside your function); and also because I think you need a knowledge of procedural before you move to O-O, so that you know exactly what the difference is. After you learn C, and the principles of O-O, it should be reasonably easy to pick up other languages (especially if you start with C++). A lot of languages have a lot of different features, but C++ has pretty much all the features that other languages have in some form or other (pointers, templates/generics, namespaces, almost all forms of inheritance, the ability to slip into assembler, include files), and once you have a good understanding of the aspects of C++ you will have a grounding that cannot be matched. Of course, with the advent of C++0x, which supports garbage collection and many other important new features, in the next few years the face of programming could easily change again.
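    (The "why isn't my int changing?" moment, for anyone who hasn't hit it - Java passes parameters by value, so the method only ever sees a copy. Illustrative class name:)

    public class PassByValue {
        static void tryToChange(int n) {
            n = 42;                      // modifies only the local copy
        }

        public static void main(String[] args) {
            int x = 7;
            tryToChange(x);
            System.out.println(x);       // prints 7, not 42
        }
    }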

    Do I think that pointers are a good thing? Not necessarily, but the theory behind them makes for intelligent and well-versed programming.

  46. Paul Clark

    This is computer science, not development

    Folks, this isn't about which language is best for production development; it's about which is best for illustrating the core principles of Computer Science. Java is great for certain kinds of high-level development, largely because of the scope of the libraries and development platforms that come with it.

    If you actually want to understand what's under the hood, though - which is what Computer Scientists are supposed to do - you'd be better off trying to implement a JVM or JIT in C or C++. And if you _really_ want to understand what's happening, you need to implement a kernel, which can only be done in C with small bits of assembler.

    (CS degree, former assembler & C programmer, now using C++ *and* Java *and* PHP, for different jobs)

  47. Steve
    Thumb Up

    Well said

    "no feeling for the relationship between the source program and what the hardware would actually do"

    Too true. There are way too many people around today who have no clue what's happening underneath, and who leave cr@p everywhere because the underlying implementation will clean up their garbage for them, so they don't have to care.

    It's not the language that's the problem, it's the sloppy practices it encourages by hiding the programmers from the consequences. That's why we end up with monstrosities like Windows Vista that won't run in less than 1GB of RAM. My home file server has 32MB (and no Java) and even that seems extravagant... Now where did I leave that Fortran compiler?

  48. Viet
    Linux

    Basic ?

    Like many people born at the beginning of the '70s, my first exposure to computers was through Basic. But *not* the brain-damaged Basic from Microsoft, which couldn't even get its arithmetic right, rather a CBASIC dialect from DRI. In many respects, Java is simply a grandson of that particular kind of Basic, with regard to the pseudo-code / VM part of the thing. I had much fun with it, doing some relatively complex things, but I have to say that I only really got a feeling for programming years later. By then I had already decided to pursue a career path other than becoming an IT professional, but I was still hanging around with nerdy friends (all were beginning their CS degrees). So I taught myself C and x86 asm just to be able to do their assignments as a challenge. There, I met pointers and my life changed forever. Once you can imagine your pointers dancing in front of you, you can program anything in any language: you're sure to beat the cr*p out of any high-level programmer, in terms of performance of course, but also in terms of internal logic; your programs are sure to be of *much* better quality.

    I understand the rant of the two professors: Java hides the pointers, and it's the worst trick you can play on a wannabe programmer. If you can't tell a value from a reference, you can only produce substandard code. Maybe Java is right for seasoned professionals, to help them meet their goals quickly, but a seasoned professional intuitively knows what the language is really doing from the look of it. So they can feel whether a method is actually using a value or a pointer, and pick the right one accordingly. A beginner can't make a good choice while he doesn't know there is a difference. C is an extremely good language to teach pointers, because if you mess with them, they bite you hard.
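    (In Java terms, the distinction looks like this - copying an int copies the value, copying an array variable copies only the reference. A small illustrative sketch:)

    public class ValueVsReference {
        public static void main(String[] args) {
            int a = 1;
            int b = a;          // independent copy of the value
            b = 99;
            System.out.println(a);        // still 1

            int[] xs = {1, 2, 3};
            int[] ys = xs;      // copy of the reference, not of the array
            ys[0] = 99;
            System.out.println(xs[0]);    // 99: xs sees the change made through ys
        }
    }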

    Today? I'm still not part of the IT crowd, but I like to toy with programming occasionally. While I can do decent C, and a few other languages, I've reverted to my old Basic friend since I met Gambas under Linux. This language, despite its quirks, can produce amazing results (think of it as Java semantics with a Basic syntax). The interpreter/VM is not polished yet, but its main author is working hard on it, and it already works extremely well on a 32-bit Linux/Intel box (next step: 64-bit cleanup).

  49. Anonymous Coward
    Anonymous Coward

    @Morley Dotes

    Students should learn to program in pseudo-code, hand-compile, and run the results, before they are ever "taught" a high-level language...

    This is the most sensible thing that has been said, and is also the method by which I was taught. I would venture to say that language is completely irrelevant when learning to program. What makes a good programmer is the ability to solve problems. You can take any language you like, but when it comes down to it all you ever do is move things from one place to another and use some basic mathematical functions; this requires sod-all so-called programming skill and can be done in any language. You will constantly hear, at least I have, that the best programmers have physics degrees and the worst computer science ones. The former discipline lends itself well to encouraging problem solving. As far as I've ever seen, all the latter teaches you is how to add up using two digits.

    In general programmers are an elitist bunch with their heads so far stuck up their own backside that they can't see past their own pet language. They have perpetuated the myth that what they do requires immense skill so much so that they actually believe it themselves. Utter twaddle, any fool can program in any language.

    Courses in problem solving are what's needed, not a specific language. After all, the Java of today will be the COBOL of tomorrow, and programs will still be adding, subtracting and moving bits about.

  50. Anonymous Coward
    Anonymous Coward

    More pointless flamebait

    The best thing about this article is that I can use it to annoy the Java programmers in our department, by emailing them the link and pretending that I agree with the "report".

    Such is office life - I'm sure you all have ways of livening up your day too.

    Of course it's total bollocks though, because actually they are very skillful guys who are good at their jobs and write nice software.

    As arguments go, it's about as useful as the Mac vs. Windows or Vi vs. Emacs debates - i.e. not at all. You might as well print something that says scientists have proved photography is better than painting.

  51. Mark
    Thumb Up

    I agree with academics

    Hi,

    I have done a computer science degree and a computer systems engineering degree... and high-level languages are just too far removed from the hardware...

    So what happens is that coding for performance and elegance goes out the window... whereas knowing low-level languages requires a good understanding of the interaction between hardware and software...

    When I taught in university I always made sure my students pseudo-coded everything... and then used the pseudocode as comments for the code... That way, in theory, they could code in any language.
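    (Something along these lines - a made-up example, not my actual teaching material - with the pseudocode kept as comments above the code that implements each step:)

    public class LargestNumber {
        public static void main(String[] args) {
            int[] values = {12, 7, 31, 4, 19};

            // SET largest TO first value
            int largest = values[0];

            // FOR each remaining value
            for (int i = 1; i < values.length; i++) {
                // IF value > largest THEN SET largest TO value
                if (values[i] > largest) {
                    largest = values[i];
                }
            }

            // PRINT largest
            System.out.println("Largest: " + largest);
        }
    }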

    However that said, what is really lacking in the graduates I have employed in the last 4 years, is a real foundation in language agnostic problem solving! Being able to look at a problem and figure out 2 or 3 approaches to solving the problem.

    Changing programming languages should be easy... However, solving problems is where the focus should be.

  52. TeeCee Gold badge
    Happy

    Programming courses.

    The second thing I was taught to write in at college (after BASIC, of course) was Z80 machine code. That's right, the institution concerned was too tight to shell out for an assembler, so we sat in front of the memory map and keyed the stuff in. The 0-9 and A-F keys on the keyboard were well worn and the remainder virtually untouched. If we wanted to write in assembly language, we did it on a bit of paper, hand-optimised it and then translated it into hex ourselves.

    With the benefit of hindsight it was a bit like building a house without using any power tools. Perfectly possible, but you wouldn't want to make a habit of it.

    Kids of today <mutter, mutter>.

    I accuse Maksim of misuse of the "Joke" icon.....

  53. chris stephenson
    Coat

    Why can't we all just get along

    Cool your egos chaps, use whatever language, tools, or OS for that matter, you like. If you and those that pay your wages are happy with the work you do then who gives a flying... Why is it that IT people get so fundamentalist about their favourite language/framework/OS? Different tools for different jobs chaps, diversity is a good thing.

    Excuse me whilst I strap a bomb to myself and recite my mantra "there is but one language but ...

  54. Richard
    Boffin

    It's not about which language is best!

    This article was always going to produce a fanboy comment war.

    Steve Roper's comment above hits the nail firmly on the head - it's not about whether Java is a good language or nay, it's about whether it's a good FIRST language. It's not, just as a Hummer probably isn't the best vehicle to learn to drive in.

    The toolkit approach starts at way too high a level of abstraction. The problem isn't that you can't learn the techniques this way, it's that you start to assume that the toolkit IS the language and thus make completely invalid assumptions as to its capabilities and flaws. In a similar vein, I think that students should have to write and run programs outside of an IDE for at least a little while to realise what goes on under the hood.

  55. Rich Silver badge

    Sort-of agree

    I sort-of agree with the comments of the profs.

    Personally, I don't like Java, but that's just me - I don't have a problem with anyone using it.

    The problem with Java (and indeed any very high level, highly abstracted language like it) is that it's very easy to lose sight of what's actually going on behind the scenes. And of course, if this is the only language you are using, you NEVER appreciate this.

    If your shiny new program is running in a multi-GHz, Multi-GB PC then this doesn't really matter much. The problem comes about when the hardware you are writing for is not so capable. Embedded systems with limited resources need a much better appreciation of what's ACTUALLY happening, rather than what a cursory glance at the high level source code would suggest is happening.

    If you don't realise that your single Java do_everything() method is actually allocating shed-loads of memory and churning through huge amounts of data, then you are going to come unstuck! Unfortunately, there are a surprising number of people who DON'T appreciate this. I see code all the time (written in C) that is desperately inefficient and resource-hungry. On an embedded system (which is what I usually work with), this matters. It matters a LOT. And this is code written in C; a language which SHOULD automatically give you an appreciation of what is really going on under the covers. Just imagine how bad (by 'bad', I mean inefficient and resource-hungry) the code would be if it was written in Java.

    And it's not just embedded systems. Why do you think today's desktop applications are massively larger than the ones we had 10 years ago and, despite not actually doing a great deal more than the old software ever did, and running on hardware that is hugely faster and more capable, actually run SLOWER?! The reason is, of course, because the code is massively less efficient!

    Comments by some people here along the lines of "why use something cruddy like C when you can just click your fingers and Java will do it for you" obviously come from people who don't need to worry about running their code on low-powered, resource-limited systems. Indeed, some of them almost certainly do not even appreciate the problem. It is THESE people that the profs are referring to - softies that have been brought up on a diet of very high level coding who simply do not understand the full implications of what they are coding. As a result, these people are useless when it matters.

  56. Joe Stalin
    Stop

    Real Programmers don't use Pascal

    Am I showing my age by using that title? If you remember it then you will know the author made some very good points; he favoured FORTRAN over PASCAL but his points stand the test of time. Some bubblewrapped kid can't compete with a dirty-handed code monkey. Like him, I would not trust a Java programmer to go back to his code 6 months after they last looked at it and expect them to fix/modify it, or worse still have them fix/modify some-one else's code. That would be a job for a real programmer, not one of those quiche eaters.

  57. Anonymous Coward
    Boffin

    Re:So why the maths?

    @Trix,

    You know the computer games you play. You would be surprised to find many if not all of them have some rotational matrix calculations in there somewhere.
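
    For anyone wondering what that looks like in practice, here's a rough Java sketch of the standard 2D rotation matrix applied to a point (purely illustrative - the class and method names are made up, not taken from any real game engine):

        // Rotate a 2D point (x, y) about the origin by 'angle' radians using
        // the standard rotation matrix [cos -sin; sin cos].
        public class Rotate2D {
            public static double[] rotate(double x, double y, double angle) {
                double cos = Math.cos(angle);
                double sin = Math.sin(angle);
                return new double[] { x * cos - y * sin, x * sin + y * cos };
            }

            public static void main(String[] args) {
                double[] p = rotate(1.0, 0.0, Math.PI / 2); // quarter turn
                System.out.println(p[0] + ", " + p[1]);     // roughly (0, 1)
            }
        }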

  58. Edward Rose

    I love these 'discussions'

    @Martin Gregorie

    I want what you're smoking, maybe then I would start to like my employers. Anywho, you'll find an awful lot of java web applications do have a graphical frontend. I'm not saying that Java isn't used as a backend, but I've never seen it (yes, humour intended).

    Pseudo-code is by far the best to teach first.

    I taught myself Basic, then a small amount of assembly. School taught us pseudo as the teachers knew no languages. Best education I've had to date.

    Elec degree at uni taught us pseudo, C and assembly. Because that's what we'd use as gingerbeers (Okay, HDLs too).

    In my current job a firm is being contracted to write a fairly simple embedded system (I wasn't allowed to, despite the saving it could have made the company, because I would have actually enjoyed it). They tried to insist on writing in C/C++. ???? Why?

    I mostly work in C, but there is no doubt that this project should be done in asm. It's just not big enough to need high-level.

    So, how about teaching the students a short bit on asm, a larger bit on C and some on a language like Java (about as much as you teach on C). There is nothing wrong with any of these languages (well, okay, lots wrong, but meh ;) so let the students learn the differences and where each should be used. Of course, start with the theory of structure, modules and pseudo code.

    A project on a language of their choice should be given (ie, write tetris, choose any of the taught, or maybe untaught, languages to do so). I think this is what is called an education, unlike what most places seem to give now.

    To the above comment, solving problems comes down to extracting fingers and trying to think. Kids of today sit back and expect everyone else to do the thinking (yep, I'm nearly one of these kids, and I do work with these kids in my free time). They are always given the answer so don't ever expect them to work it out.

    Can we have an icon with a teacher/lecturer being spanked please?

  59. amanfromMars Silver badge

    Creating a New Language creates a NeuReal State?

    "I have regarded it as the highest goal of programming language design to enable good ideas to be elegantly expressed.".... http://www.cs.ucsb.edu/~ravenben/papers/coreos/Hoa81.pdf

    Amen to that Biggy, Brother. ITs Holy Grail.

    "You do realize that there's slightly more to programming languages than syntax, don't you?" ..... Would ConText be the Prime Core Discipline for Real Virtual Drive, AC? If Programming Languages do not Relate to Real Life Conditions they will not Drive IT Forward with Shared IDers* Programs....... NeuReal Agendas in New World Order Programming .....always AI Work in Progress, Intellectual Property Exercise ..... Presently executed Abysmally, probably definitely because it is not exercised at all outside of NEUKlearer HyperRadioProActivity Rings of Steganographic Stealth.

    * IntelAIgent Designers

    "Naturally, I don't post with my real name." ..Morely Dotes Posted Thursday 10th January 2008 22:48 GMT

    It is real enough for work here, Morely Dotes. And such CyberIDEntities as are Created for InterNetional Space Stationery Archives .... Virtual Reality Libraries for Realising Application .... can easily MetaMorph/Phlip Phlop In and Out of Imagined Spaces made Real. Today's Reality is only the Result of Yesterday's Thoughts Shared and Acted upon. Does C stand for C.ore? And C++, C.ore Creation/Immaculate Source. Or would that be a Special Application of Programming 42 Provide Virgin Root for New Growth for a Pretty Good Privacy Perly Gates Python snakeoil of ITs Own .... biding in the Open for Added Security with a Taste and a Swallow for all Languages.

    AI Unifying Script for Virtual Machinery, Magical Mystery Turing in Uncharted and Unchartered Waters/Environments.... Private Pirate PlayGrounds?

    Hmmm. I suppose UN Charted and UN Chartered Virtual Environments would be a logical Tangent for AI Political Control of CyberIntelAIgent HyperRadioProActivity although that would Create a QuITe Enigmatic Moral Dilemma for IT would surely be asked .."Who is in Control of Whom and Servering to What?" Suck IT and See will Answer that Riddle, Definitively.

    To ask the question is to Imagine that IT be True? :-) Indeed, In Deed, it does.

    Language is Siloed Hardware which Creates everything Artificial [the things that blight/enhance the Natural Environment .... which we also plant and infest with Seeds and their attendant Weeds. Although such Natural Growth is badly Abused and Used.

  60. Darrell

    @Sort-of agree

    Fair point, but surely this comes down to programming ability not the language.

    e.g.

    the String vs StringBuffer methods in java.

    If you append a String to a String then a new String object is created for every append; StringBuffer only uses the one object, thus saving (potentially) shed loads of memory.
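
    To illustrate (a rough sketch, not from any real codebase - the class name is invented):

        // Builds a 10,000-character string two ways. The first allocates a
        // new String object on every pass; the second appends into a single
        // StringBuffer and converts once at the end.
        public class ConcatDemo {
            public static void main(String[] args) {
                String s = "";
                for (int i = 0; i < 10000; i++) {
                    s += "x";           // new String created each time round
                }

                StringBuffer sb = new StringBuffer();
                for (int i = 0; i < 10000; i++) {
                    sb.append("x");     // one buffer, grown as needed
                }
                String t = sb.toString();

                System.out.println(s.length() + " " + t.length());
            }
        }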

    You can write a poorly running, resource heavy, inefficient program in any language.

    (well, the ones I have experience with - I know, I've done it!)

    P.S.

    Java was originally designed to be run on embedded systems (back in the old Oak days). So saying that Java is a web-only language is just plain wrong!

  61. Joe M

    Re:So why the maths?

    There are many situations where maths is an essential part of the solution e.g. minimal spanning trees, numerical analysis, linear programming and optimisation, goal seeking, signal analysis, Fourier transforms, pattern matching, statistics.... the list goes on and on. Although it is possible to use canned methods it's best if the developer at least understands the underlying mathematics.

    But the single most important reason for being a reasonable mathematician is because it makes you think in a certain way about logic, proof and method. This way of thinking is excellent for problem solving in the IT field. I have always found that although non-mathematicians can do a good job, mathematicians can often do it a little bit better.

    (And I won't bore you here about the guy who pulled my bacon out of the fire on a tricky printer driver job using advanced trig and calculus. But it happened.)

  62. Darrell
    Paris Hilton

    If only Java didn't have a GUI

    "Students found it hard to write programs that did not have a graphic interface"

    If someone could find a way of writing a Java program without the need for a graphic interface, students could learn how to program properly. Hang on... I already do program in Java and NEVER use a GUI.

    It's no different to using Visual C++ to build a form-based program.

    Edsger Dijkstra described those exposed to Basic as "mentally mutilated beyond hope of regeneration".

    As far as Beginner's All-purpose Symbolic Instruction Code goes, I thought that it was supposed to be simple and easy to learn as a first step on the ladder to programming? Should every 13 year old computer studies student wanting to write their first "Hello World" program jump straight into C?

    I started on LOGO, then BASIC, then Pascal and finally C and C++ before learning Java. So does that make me mentally mutilated beyond hope of regeneration?

    Probably

    The Paris tag is to show what a real mentally mutilated person beyond hope of regeneration really looks like.

    Although I wouldn't kick her out of bed!

  63. Andrew

    Military Intelligence

    The actual article is on a "U.S. Air Force" affiliate website. This explains the bewildering choice of ADA as the "best first language" to learn. If you are programming for the military it's the only language they'll use (I've heard). I've never heard of ADA being used in any of the companies I've worked for.

    The Stroustrup quotes come from "The Power of Ten – Rules for Developing Safety Critical Code", so it seems that these boys are only interested in teaching how to develop safety critical code for the Air Force in ADA, presumably using consultants from AdaCore. Their assertion that you shouldn't learn Java because it results in an inability to program without a user interface is also a bit inexact. Do you think they actually meant Visual Basic 6?

  64. David
    Stop

    Useless

    Well they've completely missed the point of Java spectacularly; to not be dependent on hardware/platform specific details. If they wanted to give students a grasp of more technical issues, they should get them to write java byte code to execute on the JVM.

    They also might want to know about System.out.println("Hello world"); which makes it easy to write non-ui programs.
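
    For the avoidance of doubt, a complete GUI-free Java program really is this small (compile with javac, run from the command line - no window in sight):

        // A complete Java program with no GUI anywhere.
        public class Hello {
            public static void main(String[] args) {
                System.out.println("Hello world");
            }
        }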

    Maybe they should go back to doing something they're good at; plagiarising other people's material for their own courses (something I've seen done in academia)

  65. Philip Sargent
    Thumb Down

    ML

    The Cambridge computer science course uses lots of languages for different purposes, but the one that is used to introduce new students to the idea of computer *science* is ML. This certainly inculcates some useful habits of thought.

    http://www.cl.cam.ac.uk/admissions/undergraduate/myths/

    "We start by teaching a language that few if any will have even heard of let alone used – so all our students start off at roughly the same level, and we build from there. "

    "Teaching yourself to program can lead to your picking up bad habits that will hinder your progress later. In particular, you should avoid languages like C and C++. "

  66. breakfast Silver badge
    Boffin

    @trix

    High level maths has everything to do with programming. Absolutely everything. Not just the basic language, nor the operations performed, but the whole theory of how and why programs work, how efficient they are, how they interact with the hardware or the virtual machine - all of it is mathematics.

    The code you see may not look that way, but the best programmers are the ones who can understand the maths of what they are doing.

  67. Anonymous Coward
    Happy

    hrmm

    Any sensible uni course teaches a variety of technology and language; mine used Java as the main teaching language (that is, for things like OO concepts, software architecture/frameworks and various concepts) but augmented it with courses using languages like C and a few others for the specifics that Java couldn't cover (like lower level code/assembly, pointers, compiler writing, etc).

    It seems they're making a criticism of the language (derived from its real-world usage) when it really should be a criticism of the use of it in teaching, or teaching methods too wedded to a specific language or task area.

  68. David

    Abstract and Theorectical

    Well they've completely missed the point of Java spectacularly; to not be dependent on hardware/platform specific details. If they wanted to give students a grasp of more technical issues, they should get them to write java byte code to execute on the JVM.

    In the real world a lot of programming is in a web environment where platform details are explicitly not important.

    Scientists in Physics spend a large percentage of their time fixing pointer errors instead of writing code that solves their problems. What an earth has the exact details of an x86 processor got to do with the mass of a photon ;-)?

    They also might want to know about System.out.println("Hello world"); which makes it easy to write non-ui programs.

    Maybe they should go back to doing something they're good at: spending 'charity' donations plagiarising other people's material for their own courses (something I've seen done in academia) - which is why I did an applied degree instead of something abstract like computer science.

  69. Anonymous Coward
    Anonymous Coward

    Nice one Mars

    I was getting a bit worried because some of amanfromMars' posts on other topics were starting to make sense.

    Good to see he's back on top form with this one - but what happened to the little Mars icon? I smell a conspiracy.

  70. Llanfair
    IT Angle

    when I was a lad...

    I remember starting programming by learning Basic on the Amstrad CPC machines. That gave me the fundamentals of programming, introducing for loops, while loops, if statements and so on. When I was at college, they taught Pascal and I loved the language as it was a strongly typed language.

    We also learnt Z80 assembler. The way we did that was we were given a problem and had to write pseudo code for how to solve it. Then we had to convert it to assembler. We had to predict what would happen to all the registers and the stack. We also had, I think, HP Z80 boards that took input from the built-in keyboard; we had to convert the code into opcodes and enter it in hexadecimal. The other thing we had to work out was how long it would take. We also hooked up an oscilloscope to watch the pins going up and down when we used the step-by-step function. I really enjoyed programming from then on because I learnt exactly what my code was doing on the legs of the Z80 microchip (no innuendo intended). Those were the days.

    Unfortunately these days, when applying for jobs, they don't care that I used MSDOS or have used assembly language. It has never really helped me to get a job.

  71. Mark Wills
    Happy

    What about Forth then?

    : MYOPINION 10 0 DO ." Forth Rocks " LOOP ;

  72. Shakje

    @David

    You have spectacularly missed the point of the article, and the following comments.

    Programming in byte code would nullify the exercise of learning a high-level language. The argument is that Java is too high-level, and of course byte code is too low level. A better way of illustrating what you have said is: teach them to write platform independent code in C (which is pretty straightforward) so that they know the platform discrepancies and what to look out for, and then show them a platform-independent language such as Java, which circumvents the problems they may have faced writing platform independent code.
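
    As one concrete example of the sort of platform discrepancy meant here: byte order differs between machines, and Java lets you pin it down explicitly rather than inheriting whatever the host CPU does. A small sketch (class name invented, nothing to do with any particular course):

        import java.nio.ByteBuffer;
        import java.nio.ByteOrder;

        // ByteBuffer defaults to big-endian on every platform, and you can
        // ask for little-endian explicitly - a detail a C programmer has to
        // manage by hand when writing portable code.
        public class Endian {
            public static void main(String[] args) {
                ByteBuffer big = ByteBuffer.allocate(4); // big-endian by default
                ByteBuffer little = ByteBuffer.allocate(4).order(ByteOrder.LITTLE_ENDIAN);
                big.putInt(1);
                little.putInt(1);
                System.out.println(big.get(0) + " " + big.get(3));       // 0 1
                System.out.println(little.get(0) + " " + little.get(3)); // 1 0
            }
        }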

    In the real world, I would venture that a lot more code is platform-dependent apps and systems code.

    "Scientists in Physics spend a large percentage of their time fixing pointer errors instead of writing code that solves their problems. What an earth has the exact details of an x86 processor got to do with the mass of a photon ;-)?"

    This is a ridiculous statement, and completely off-topic. The article is based on programming taught in CS courses, nothing whatsoever to do with physics scientists who use coding to make their life easier. They should be using a high-level language, designed for science, and the comment is irrelevant to the discussion.

    println has been mentioned plenty already.

    It's not plagiarism, it's referencing.

    Lastly, why post the post you already posted again? How about just putting the new bit in the middle into a new comment instead.

  73. Dan Clark
    Happy

    I've heard of Java.

    It's a bit like Javascript isn't it? ;)

  74. Anonymous Coward
    Alien

    Hardware vs. Software.

    Java is promoted by a HW vendor, what do you expect? They argue it's platform independent, but they make sure the platform it runs the fastest on is theirs (which they're having trouble with anyway; J6 is sluggish on T2 processors compared with Xeon or Itanium).

    So the slower & more bloated the better, it means more hardware.

    The next step, I think, is to ship some hardware 'JVM add-on PCIe x 256 cards' to boost the slug. Which is going to be the JV(irtually)V(irtual)M - I see the irony comin' up.

    Come on it's not only bad teaching wise, it's bad on all counts, except marketing.

  75. James
    Jobs Horns

    another programming holy war!

    Anybody who spouts off about language A being better than language B needs a bit of a reality check. It's really all about the best tool for the job, and there are times where C++ wins over a managed language like Java and vice versa.

    After all, you wouldn't write a real-time rigid-body simulation to run on specialist hardware in Java, would you? And at the same time, you would probably not want to write a web app, or multi-threaded server code, in C++.

    And as for those of you claiming that a certain programming language makes it easier to make mistakes, I suggest you find another career outside of IT. Nobody's interested in your excuses for your incompetence, least of all me.

  76. Anonymous Coward
    Flame

    typical academic snobbery

    Having worked at universities for 8 years now, I can safely say that this is the kind of crap academics spout ad infinitum. They are always pooh-poohing stuff which they aren't involved in. 6 months/years later, when they are involved, they extol the virtues of it.

    What riles me is that these people are the ones teaching others.

  77. Anonymous Coward
    Happy

    At last ...

    "If your shiny new program is running in a multi-GHz, Multi-GB PC then this doesn't really matter much"

    It's this attitude that lies at the heart of this debate. In a world that is trying to reduce power consumption and go green, all resources become important. A faster processor consumes more power and dissipates more heat, and all that memory consumes power to remember the 1000s of extraneous bytes of code and data that inefficient resource use requires. Of course, if we used all those cycles efficiently, those tasks we still complain are slow would be so much faster and thus use less power to complete, but companies like Intel wouldn't be very happy as they wouldn't have such a ready market for their next multi-cored, faster processor 'cause we wouldn't need it!

    I have been arguing for many many years now that garbage collection is the devil's idea, as it instantly increases a running program's footprint with 'dead' objects lying around waiting to be reclaimed and removes the programmer from understanding and controlling the lifecycle of their data.
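
    For anyone who wants to watch that happen, a trivial sketch (nothing clever, just churn - the class name is made up; run it with the JVM's standard -verbose:gc flag and watch the collector chew through the garbage):

        // Allocates short-lived objects in a loop; each one is dead almost
        // immediately but sits on the heap until the collector gets to it.
        // Run with: java -verbose:gc Churn
        public class Churn {
            public static void main(String[] args) {
                long sum = 0;
                for (int i = 0; i < 1000000; i++) {
                    byte[] junk = new byte[1024]; // garbage as soon as the loop moves on
                    sum += junk.length;
                }
                System.out.println(sum);
            }
        }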

    I admit to being one of the old breed of Computer Science graduate (not to mention having a childhood writing code to run on 6502s, 68000s and building hardware with my father that needed low-level bit-twiddling control code) who learnt how all these small black plastic bits with pins could be joined together to allow a game such as Elite to run using so few resources on a BBC micro. I'd like to see a modern 'programmer' create a game of such power with so few resources available now.

    Rant over! I'm pleased to see that these academics and so many readers here have the same view as me - there is hope for this industry yet.

  78. Anonymous Coward
    Anonymous Coward

    @Dan Clark

    That's the way it started, and should have stayed. Java decrepit.

  79. David Harper

    Re What about Forth then?

    Ah, Forth. A language which combines reverse Polish notation with the ability to define new keywords in the language itself. And it was invented by an astronomer to control telescopes. What more could a Real Programmer ask for!

  80. Dr. Ellen
    Flame

    Taking a utility belt from my professors

    Bear in mind that I'm speaking as a physicist rather than a computer scientist, but long ago (the 1960s) I was taking a programming course. ALGOL, nothing but ALGOL. I noted that I'd rather have FORTRAN. Steam started coming out of the professor's ears.

    Academe has a strong anti-utilitarian bias. Always has, always will. "Are we a trade school, then?" they will say. And as a result, they turn out people unfit for actually DOING things.

  81. Anonymous Coward
    Jobs Horns

    Sigh....

    I think oranges are better than apples because they have more citrus.... really now - ever heard the term Fitness for Purpose? Academia sometimes amazes me...

  82. Ross

    @Joe M

    [And another whisper: the guy who understands what “sub ax,ax” means (and implies)]

    It means you'd better hope you're in a 16-bit environment, otherwise Murphy's law says you're gonna have one difficult-to-trace bug next time you do anything with eax.

    Personally I prefer bitwise operations like xor or and to do the same trick.

    Back to the article: Java has its uses in teaching. It's a tool like any other language, and like any tool it can be used for the right reason and the wrong reason. For example I can use a hammer to put a screw into a batten on the wall, but it's an awful lot more hard work than using the right tool, and the result will be somewhat less elegant.

    Java is good when you're teaching logic and methods etc - the student can focus on the task at hand rather than wondering why their 4 hours of work just threw a SIGSEGV. It's crap for teaching people about the link between software and hardware, as it purposely abstracts that away.

    Pick the right tool for the job at hand and you'll get a better result.

    And whilst we're randomly ragging on various languages - C++ is an abomination unto the Lord and its use should be punished with fire. *Lots* of fire. You may as well add PEEK and POKE to Javascript as add OO to C. Anyway, you can approximate classes with judicious use of structs, function pointers and void pointers (oh no he didn't just say that? He did?!)

  83. Ishkandar
    Boffin

    @Basic Anonymous Coward

    Of course, Basic screwed up your IT employment opportunities !! Unfortunately, that was saved by your knowledge of COBOL. These days, to get a half-way decent COBOL programmer, one, frequently, has to fetch a Voodoo practitioner to "re-call" them !!

    Alas, excessively long hours, loads of black coffee and tons of ciggies have taken their toll !!

  84. Anonymous Coward
    Boffin

    Computer Science vs Computing

    *** Possible Rant Warning ***

    Personally, I think that a lot of comments here have been way off the mark.

    To my mind, as a Computer Science graduate, there is a very big difference between CS and Computing.

    I would expect a CS student to learn to understand how the CPU, Microprocessor or Peripheral Device they are coding for works **at the electronics level**. In contrast, I would expect a Computing student to learn to become a code monkey - someone who can code GUI / Web applications but relies on all the other support code (OS, Server, Comms Protocols, Support Libraries etc) to be written by others.

    Now, for Computing students who don't need to know the fine detail, Java is probably fine.

    On the other hand, a CS student NEEDS to know about and understand things like Pointers, Memory Management, optimisation of Algorithms and Data Structures, Threads and Concurrency (and how to do them properly). Java is not a good teaching tool for these concepts.

    Because Java shields the programmer from the 'low level' concepts it is not good for anyone who needs to know and understand the low level - eg. when coding for things like Operating Systems, Embedded Systems, Safety Critical Systems, SCADA, Military Systems, Command and Control etc.

    Java has its place - but its place is in courses churning out Code Monkeys, not on serious CS courses.

  85. Mo

    Object Pascal

    It's a shame Borland's Object Pascal is all but dead these days. {Turbo,Borland} Pascal 7/8 were fantastic environments for teaching the principles of programming in.

    If you've mastered Object Pascal, you can pick up most other languages fairly easily (including, of course, the lecturers' beloved Ada).

  86. Rich Silver badge

    @David

    "In the real world a lot of programming is in a web environment where platform details are explicitly not important."

    ...and a HUGE amount of programming is not in a web environment and platform details are very much important :-)

  87. Darrell

    @Computer Science vs Computing

    Hows the weather up there in your ivory tower?

  88. David

    @Rich

    "...and a HUGE amount of programming is not in a web environment and platform details are very much important :-)"

    ... So that makes the case of the web environment a tiny insignificant one then?

  89. David

    @Shakje

    "Programming in byte code would nullify the exercise of learning a high-level language. The argument is, that Java is too high-level, and of course, byte code is too low level. A better way of illustrating what you have said is, teach them to write platform independent code in C (which is pretty straight-forward) so that they know platform-discrepancies and what to look out for, and then show them a platform-independent language such as Java, which circumvents the problems they may have faced writing platform independent code.

    "

    Well I might well be wrong but I've certainly heard of people regarding this as a useful exercise. Never tried it myself though. I agree with write platform independent code in C.

    "In the real world, I would venture that a lot more code is platform-dependent apps and systems code."

    You can argue this until the cows come home. Platform independent code isn't some kind of tiny niche though and I'd venture it will increase in amount.

    "Scientists in Physics spend a large percentage of their time fixing pointer errors instead of writing code that solves their problems. What an earth has the exact details of an x86 processor got to do with the mass of a photon ;-)?"

    "This is a ridiculous statement, and completely off-topic. The article is based on programming taught in CS courses, nothing whatsoever to do with physics scientists who use coding to make their life easier. They should be using a high-level language, designed for science, and the comment is irrelevant to the discussion.

    "

    Not at all irrelevant. Much of computing is solving real world problems with a specified budget/time budget. Teaching should reflect this. The point is they don't use a high level language and pay for it dearly.

    "

    println has been mentioned plenty already.

    "

    Oh I''m sorry.

    "It's not plagiarism, it's referencing."

    Not when they remove it from a web server because it's getting too many hits and they're worried someone will work out where they got it from ;-)

    "Lastly, why post the post you already posted again? How about just putting the new bit in the middle into a new comment instead."

    A simple mistake.

  90. marc caron

    Developer

    The point of C/C++ in the education of developers is that it is one of the few languages in which you can move seamlessly from Procedural Programming to Object Oriented programming without having to either learn a different syntax or ignore half the code.

    Java and other strictly OO languages are confusing for newbies because of the excessive framework and OO constructs that mire the beginner's ability to do simple procedural programming.

    Also any new developer out of college is not expected to know any of the frameworks real well. That is one of those things learned in the workforce or on your own time. It's not the college's responsibility to teach you all the Vendor specific crap that's out there.

    Also, too many colleges are allowing their curriculum to be dictated by local business needs. E.g. here in Iowa, Cobol and VB are pushed in many colleges because the local companies want a cheap development staff that just churns out code and doesn't question authority because they don't know any better.

  91. Darrell

    @ oliver jones

    "CS graduate comes out of University completely unprepared for the real world"

    Your post has nothing to do with which language they use and everything to do with poor teaching standards and the university curriculum.

  92. Rune Moberg
    Flame

    Java? Who cares?

    Where I used to work, Java was considered a bad joke. We would celebrate whenever a competitor chose Java, because it meant they had to hire more staff and their products would suffer. Java applets are usually memory intensive, CPU hungry and offer a clunky UI experience (to put it kindly). The deployment footprint grows and time to market is slow.

    For those of you that require platform independence, I sympathise... But ruining the user experience on all supported platforms is seldom the answer.

    Native Windows development: Delphi or C#. Use C/C++ for shell extensions and similar.

    Everything else? Well... Not Java in any case.

  93. Daniel

    Who watches the watchers?

    Ultimately, someone else has to write the software that sits too close to the hardware layer for the hardware to be ignored. They have to do this in order for others to be able to write the layers above that.

    On the whole, you do not need a higher level educational qualification in software design to work in those upper layers of programming, and that's fine. These guys are saying that you shouldn't even have to go to university if what you want to do is write in Java, or some similar higher-level language. However, someone has to work in the lower layers of programming, because those layers will never go away. If students on university degree courses are not the ones being taught to do this work, then who is? To quote minimsft, writing of an absence of such talents in core areas of Redmond:

    "More and more candidates who can lay down the smack with Java and script can't manipulate memory and discuss deep operating system constructs just-in-time at all. I need you to be able to write a Garbage Collector, not be in an unhealthy co-dependent relationship with one."

    (From: http://minimsft.blogspot.com/2006/11/microsoft-academy.html)

  94. Francis Vaughan
    Thumb Down

    Read the cited article!!!!!

    Dear oh dear. I would have to say that almost every comment above has been made by people who never bothered to actually read the article being cited, but simply relied upon the paraphrasing of it.

    The authors DO NOT dislike Java. They probably don't even mind it being used as a teaching language. What they are worried about are the concomitant problems in teaching that have been allowed to arise. The section on Java is titled "The Pitfalls of Java as a First Programming Language". Pitfalls. I.e. problems that can arise if you don't watch out.

    I spent a long time in academia before joining industry. I taught second year data structures many times - in Java. I got involved in heated arguments many times with my colleagues, and most of the issues raised have real resonance.

    I well remember when Java became the core teaching language. And soon after, the first year introduction to programming course started to delight in showing the students how to pop up windows, do cute little GUI tricks. The idea being that the students saw some tangible gratification quickly. It didn't take long (about the time these students moved to the later courses) for everyone to realise that the students had actually failed to "get it" and were on a path to a toolbox (almost cargo cult) mentality. This is exactly the issue the authors of the cited critique raise. They are NOT criticising Java, they are criticising the mindset that uses the toolbox features early in teaching to the detriment of teaching how to program. There are only so many lectures, and so many hours available for programming in a course. Each is precious. Devoting any time early on to GUI coding is always going to be a bad deal for the students. It will always be at the cost of some other more important concept.

    One of the worst things that Java brought to teaching has been the OO theology. There has been a huge movement that OO must be taught right from the start. I found myself teaching courses (prepared by others) where there was more time spent trying to work through OO issues than fundamental data structure and algorithm issues. This again is appallingly bad. You can teach the basics of OO to a student that has a background in graph theory in about an hour. Less if they are bright. But if you try to teach OO first you just end up confusing the student. They think designing the class hierarchy is designing the data structure. And it just gets worse.

    The article bemoans the loss of mathematical skills. This is desperately true. We are beginning to see a generation of programmers who don't even know enough to know that they don't know. Numerical analysis was compulsory when I was a student. Like most, I hated it. Now I am quite convinced it should go back as a compulsory component. We have reached the point where there are programs out there that generate results which have no validity, simply because the programmers had no idea what they were doing. If a programmer uses floating point and cannot explain what the term "epsilon zero" means they have no right to claim the program is correct.
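
    (For anyone wondering, "epsilon zero" here is presumably machine epsilon - the gap between 1.0 and the next representable floating point number - and you can find it in a few lines. A quick sketch, not from the cited article:)

        // Computes double-precision machine epsilon: the smallest power of
        // two eps such that 1.0 + eps is still distinguishable from 1.0.
        public class Epsilon {
            public static void main(String[] args) {
                double eps = 1.0;
                while (1.0 + eps / 2.0 > 1.0) {
                    eps /= 2.0;
                }
                System.out.println(eps); // roughly 2.22e-16 for double
            }
        }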

    Many programmers disclaim the value of "higher" or even basic mathematics. Again, with sufficient ignorance they don't even know what it is they don't know. An ability to understand, reason about, and sometimes design algorithms is always going to be important for any real professional. If you can't prove your algorithm is correct you shouldn't be doing it; similarly if you can't derive its complexity. If you can't reason about the selection of algorithms based upon complexity, and understand the pitfalls in simple order N analysis, you shouldn't be designing programs. Further, discrete mathematics and language theory is critical in many diverse places. Automata theory, language translation basics - nobody involved in design of a complex GUI should be without these. So many are, and the dire mess so many interfaces are in reflects this. And so it goes.

    A computer professional should be able to be totally comfortable with the hierarchy from gates to GUIs. There is a full lifetime ahead. To imagine you can survive without that is naive in the extreme.

    To return to the article in question: it does not slag Java. It slags laziness in curriculum design and a worldwide move to dumb down computer science courses and make them "more accessible" and "sexy". Removal of hard maths, core computer science components, emphasis on ephemeral toolboxes - all are creating a generation of programmers that are far less well equipped for a lifetime in the industry than their predecessors.

  95. paddy carroll

    clever people

    The more senior they are the higher the grade of bollocks

    Me

    FORTRAN->BASIC->FORTH->6502->C->VB->C++->Java!

    I love it, it is a proper language.

    Let's face it, you could find some idiot to recommend Logo if you looked hard enough.

  96. Don Mitchell

    Learn C++

    I think it is important to learn a compiled language that allows you to allocate memory and see address pointers. An interpreted scripting language with garbage collection is much more removed from the hardware, just as they say. I remember one graduate student from Brown U. who applied for a summer internship. He knew Java, but had no idea how pointers worked. He was completely unable to help us with a project in C++.

    Java is pretty complete, but it is still an interpreted scripting language, essentially a modernized analog to Visual Basic. It is good to teach something like Basic or Java or C# in a course about GUI design and rapid prototyping, but not as a basic programming course. You want students to understand what the machine is really doing. Not assembly language, but at least a compiling language.

    There seem to be three categories of languages:

    1. Pure academic experiments in programming like Haskell, Lisp, ML, CAML, etc. These languages represent some clever thinking about computation, but I have never thought they were practical languages for big projects.

    2. Scripting languages like Python, Ruby, Javascript, Perl, etc. These languages expose important functionality, but they are often rather messy and amateurish in design. Not saying they aren't important, just that whoever designed them was not very familiar with basic computer science -- their grammars are full of ambiguities and readability/maintainability hazards.

    3. Practical languages designed to get computation done. FORTRAN, C, ADA, etc. The grammar of ADA is very professionally designed, free of ambiguities (like if-else) with a lot of thought about maintainability and QA issues. And they are designed to compile into efficient code. Programs written in these languages will run 10 to 100 times faster than an interpreted script (even with something like Microsoft's JIT engine).

  97. amanfromMars Silver badge

    Root Virgin Source ....... Krakatoan Cracked Tao*?

    "Because Java shields the programmer from the 'low level' concepts it is not good for anyone who needs to know and understand the low level - eg. when coding for things like Operating Systems, Embedded Systems, Safety Critical Systems, SCADA, Military Systems, Command and Control etc." ....Posted Friday 11th January 2008 14:35 GMT

    Java then shields one from Infections, AC? A Perl amongst Swine?

    Does Imagination code for things like Operating Systems, Embedded Systems, Safety Critical Systems, SCADA, Military Systems, Command and Control etc.?

    * http://www.cs.wustl.edu/~schmidt/TAO.html

  98. Joe M

    @Ross

    "It means you'd better hope you're in a 16 bit environment otherwise Murphys law says you're gonna have one difficult to trace bug next time you do anything with eax."

    I have an old embedded 80C186 in mind. I never had the need to program 32 bit assembler as I use 8 or 16 bit hardware, but thanks for the thought.

    Your comments on Java are well taken but what I find is that graduates leave University with Java only, which I find amazing. When I needed Java for server work it was great but even then there were a few problems diving below the line when needed.

    (I think anyone who makes a strong public argument for any one particular language should be forced to write a significant app in Forth. That will fix them!)

  99. Robert Harrison

    @Joe M

    "And another whisper: the guy who understands what “sub ax,ax” means (and implies) can probably change places with you and write some pretty good Java if he sets his mind to it. Can you do his work as well!?"

    Damn I wish I had replied earlier before the comments built up :o)

    A quick whisper in your ear Joe: Of course you could write good Java with your hands tied behind your back because all of us front-end/UI programmers just throw objects together OO style with lots of libraries and it all Just Works (tm). Obviously saying nothing about event-driven programming, multithreading, swishy fast graphics and all the other things that users demand more and more of. In answer to your admittedly slightly rhetorical question: Yes. Given some time and experience, in the same way that an embedded/system programmer would be able to 'cross over' to the interface code. Each arena has certain disciplines associated with it and each deserves some respect for the skills required.

  100. Anonymous Coward
    Black Helicopters

    moving away from the holy war...

    Am I the only one bothered by the fact this article is using Wikipedia as a trusted source of information, to illustrate the differences between C++ and Java?

    Maybe Wikipedia black choppers left Utah and descended upon Vulture Central...

  101. N
    Unhappy

    Gee? You think!

    Most computer science courses are completely replacing the standard programming languages (C, C#, C++, Cobol, and even HTML) with out-of-the-box "programs it for you" languages.

    DeVry alone, which used to have some programming clout about its curriculum, has done away with any programming language in favor of Java and VB.net. Both of these write the code for you.

    ...

    Where's the logic?

  102. Dave
    Gates Horns

    @Joe M

    For starters, I'm a Java programmer. Cobol, C++ and Java were the languages taught to me during my 3 year higher education. This was a vocational training. I did not learn any higher maths there. I was taught programming, program design, a (very) small course on opcodes (decoding binary to opcodes with pen & paper for example), pen & paper simulation of an idealized CPU machine, databases, and various other things that I'm not going to keep typing out here. But everything came down to the fact that I only got the basics handed to me.

    Becoming good in IT is something no place of higher learning can do for you. They can help you, sure, but in the end you have to do it yourself.

    So, in my first week when I heard we would learning Cobol, I went down to the school library during lunch, took out the 2 syntax manuals and the manual for the compiler, and I started to read. To this day, I have not stopped reading about programming.

    It's just a matter of attitude.

    Anyway, Joe, I'm very sure the engineer who handled the assembler programming will take about as long to learn Java at my level as I will take learning asm at his level. So, yes, I can change places with an assembler programmer. I just don't want to.

    And, incidentally, I have only once regretted not having more computer/math theory in my training, and that was while implementing Reed-Solomon codes. Only time I (for a while) felt myself out of my league.

  103. Joe M

    @Dave @Robert Harrison

    My comments were not meant as a putdown even if they were strongly put. Of course low-level programming has no place where Java is needed. (In fact there are very few places still left for it.) And Java is not some trivial toy. Nothing which enables modelling real-world systems is ever trivial. All programming needs a specific skill set which takes years to acquire, and as the old saying goes, once you can program you can program in any language.

    The trouble is, as the original article pointed out and as I have experienced, many people now believe that one does not need to know the fundamentals of our profession at all. Not because they are lazy or ignorant, but because they think that it is unnecessary. I meet this attitude all the time. The article is simply trying to pinpoint, successfully I think, where this idea originates from and how powerful high-level tools like Java contribute to it.

    Final word. It's been a pleasure to read so many well thought out and interesting comments on this topic - especially those that I disagree with. They are the ones, which make you think and perhaps learn something new.

  104. amanfromMars Silver badge
    Alien

    Quick Flash, Gordon, for Real Networks

    "Final word. It's been a pleasure to read so many well thought out and interesting comments on this topic - especially those that I disagree with. They are the ones, which make you think and perhaps learn something new."

    Final final words ... I concur and applaud all who would share what they think. And as irreverent and light-hearted as the Register can appear to the casual observer, behind it are some Real Astute Programmers Internetworking Data for AI Virtual Stealth. Without y'all, where would we be?

  105. Robert Harrison

    @Joe M

    "The trouble is, as the original article pointed out and as I have experienced, many people now believe that one does not need to know the fundamentals of our profession at all. Not because they are lazy or ignorant, but because they think that it is unnecessary."

    To close this discourse amicably I couldn't agree more. Sadly, I have too often encountered individuals who don't get as far as the "think it's unnecessary" part. That really grates me because you have programmers who sit down and say to themselves "I want to get from A to B". They will then churn out a pile of worthless code that solves that path, and nothing else. Not reusable, not elegant, not efficient. They do it time and time again, and then sadly when it comes to debugging have no idea even how it works, which is too bad because their peers stand even less of a chance.

  106. Anonymous Coward
    IT Angle

    My thoughs on this subject.

    Ok, ignoring the person that said the author had gone off on a bit of a tangent.

    Having done a modern computer degree... I might have a thought on this subject.

    I did first year around 11 years ago part-time, had to quit because I changed jobs. Came back into 2nd year and completed my degree a few years ago.

    Now not all universities will be the same, and even in a given institution folk will have their own opinions. However let me make this clear about my University.

    1. In first year you learn Hello World, and pseudo-code. You also learn things like RAD (rapid application development), where often you will create things using off-the-shelf products like Access, and combine the basics of databases and programming to create a record collection. Messing around with Access (or similar) is a mere practical application of the theory they are filling your head with. Most first years will include hardware courses telling you how the memory connects, what a CPU cycle is and so on.

    2. From second year you're expected to know the fundamentals of programming and they begin to explain 2 very important things. The first is Software Development, that being client/server relationships; depending on your course you may do HCI (human computer interfaces) modules, networking, web development and so on and so forth. Needless to say, if you are being trained as a programmer you will do more SD than everybody else, plus other important modules on managing errors and other such stuff. (Don't ask me, I did networking.)

    The second important thing they teach you, and this is really from day one, is that programming is not knowing a language; teaching somebody a language to program in is a dead end, as languages change much like jobs, and being too focused on one language leaves your skills sadly difficult to transfer. They teach the theory of programming, how to figure out and break down all the pesky challenges that will face you... they then grab the programming language of the day and throw you in.

    As it happens, back when I did my first year (knowing programming languages such as Pascal and Cobol was an entry requirement), C++ was flavour of the day; however when I returned it was Java. Now I heard whining about people using graphic interfaces to program in Java. I for one (and I only completed my degree a few years back now, and I happen to know it's still the same) did all my programming in Notepad, or a Notepad-like programming environment.

    Do I think Java is the best language to learn doing a degree? Well, if it was the only one I learned or knew... maybe not. HOWEVER nobody I know comes out of a degree knowing only ONE language; all are exposed to Web, Database, Scripts and all sorts of other stuff. I mean... I learned Perl...

    I find Java a good language once you know one or two others, simply because you can teach advanced and complex things in Java that would take you all week just to program in C or C++. Remember, the code is not important; knowing how to program is not about the code.

    Thus having done my round of programming in Java and messing around with arrays, Swing and other stuff, I moved on to more network-orientated work. However (I dunno, maybe as a joke) on the network modules list was a 3D course..... to my horror (I am not a fan of programming and can be considered reluctant) I had to learn to program 3D stuff in Java. Now in hindsight I think it was a good thing to include, but at the time it was murder, as programming is like riding a bike: you never forget, but that does not mean you will not fall off a few times anyway.

    I have now started gibbering. In summary, at uni I was taught how to program, I was not 'taught' a language. Passing the degree was considered proof that I was capable of programming, and it happened to teach a language (or four) that might be useful to an employer. However, as it has always been, graduates are taught theory; employers are meant to take a given quality of person and theory and 'show them the ropes'.

    All the Software Engineers that I still know from my course do what I would consider 'real' programming. One mainly does JavaScript for a large oil multinational, and the other, whilst Java Certified by Sun (did a placement there), has gone on to program in some (from what I hear) horrific in-house language. My pal doing the JavaScript did in fact say to me the other day that when he came out of uni he thought he 'knew most of JavaScript'; now, over two years of using it every day later, he thinks 'maybe' he knows most of what you can do with it. (For the record he does do more than just JavaScript.)

    Now I took networking, and I have had to program in VBA.... yuk, but still, even having never learned VB or any other form of Basic, I managed to get the job done. Now whilst I hope to never have to poke my head into 'proper' programming again, I do use my knowledge of programming and apply it to other things, such as routers and phone systems.

    All in all I do not think that you can make a general statement about the teaching of programming in Computer Degrees, other than to say it's a little different from yesteryear, because everything is a bit different from yesteryear.

    Anyway, I am sorry for going on so long, I am sorry that I was too sleepy to spell-check this post... I know my spelling sucks. Most of all I am sorry for not proofreading this before posting, as I am sure there are some awesome gaffes.

    However, of the posts I read before putting fingers to keyboard, I agreed with these:

    Bollocks - By Greg

    Snobs, bigots, and venal professors - By Morely Dotes

    And at:

    @Greg - By Joe M

    The main market for programmers in the UK is currently in higher-level languages; low-level programmers, whilst in demand (because there are not all that many of them), are mostly trained on an electronics degree, where they must master both the hardware (creation) and the control software.

    Thus, an Electronics Degree student may create a device and produce low-level drivers, whilst a Computing Degree student will design a program which requires a code monkey to create some libraries to allow it to be plugged into an already existing web application, which somebody who has done a Web Development Degree created.

    The IT field has grown very large, even since I first joined it a decade or so ago. With so much to learn it is no surprise that there are now specialisations. Low-level programming is such a specialisation. If you want somebody with both awesome program design skills and machine code abilities, you will need to find somebody with one of the skills already and train them in the other.

    Such is life.

    -Ano

  107. Anonymous Coward
    Thumb Up

    Then again …

    "The point of C/C++ in the education of developers is that it is one of the few languages that you can move seamlessly from Procedural Programming to Object Oriented programming without having to either learn a different syntax or ignore half the code.

    Java and other strictly OO languages are confusing for newbies because of the excessive framework and OO constructs that mire up the beginners ability to do simple procedural programming."

    COULDN'T AGREE MORE.

    BTW, COBOL is far from dead. I'd advise anyone to learn it (you should be proficient in a few months if you can really handle working in IT) because there is a LOT more COBOL out there in the “real” world than Java, C, C++ or even VB.

    In a few years there will be a big thrust to move vast amounts of “legacy” code (God how I hate that term) from mainframes to clustered servers (right reason or none, it will happen) and there will be a lot of ca$h in this for anyone with good/half-way decent COBOL knowledge and proficiency in C or C++.

    Watch this space …

  108. Anonymous Coward
    Anonymous Coward

    40 years of muddle

    Having read this with interest, I can look back over 40 years of bickering about the best language.

    This discussion recurs regularly and various interest groups try to press their favourites on us all. Why?

    I "learnt" Algol 60 at university in the 1960s, except that I never really understood it until much later. We covered a simplified machine architecture and assembler too, for an Elliott 803 computer, all in one afternoon a week for a term. Incidentally, we had a visit to the computer room and met the operators, but I never actually ran a program.

    Then in my first job as an engineer I learnt ICL's Jean, which was like Basic, followed by Fortran IV. As I used this every day I thought in it - and this is what seems to be behind people's preferences: the first language they can really think in.

    Following that I taught and tutored Fortran (IV, 77 and 90), Algol 60, Basic, Pascal, C++ and Java, whilst writing applications in Basic, C and Visual Basic, and later Perl. However much object orientation is the fashion, I still often think in terms of that original Fortran, whatever its limitations. It is like my native language.

    Students need to be able to have early success with very simple programs that they can understand. Programming is like Lego(r): you put the same bricks together to make a wide range of objects, but students must be taught, and must endlessly practise, how to make and use those little bricks - loops, IFs and so on - before they can build their fancier programs. They also need a reason to do this, and motivation is in short supply these days when so much of what they are taught is modular and they work towards an exam and then promptly forget what they learnt.
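
    (To show what I mean by bricks - a tiny C++ sketch of my own, rather than anything a student was actually set: one loop and one IF, put together into a complete little program.)

        #include <iostream>

        int main() {
            // Brick 1: a loop that counts from 1 to 10.
            for (int i = 1; i <= 10; ++i) {
                // Brick 2: an IF that separates even numbers from odd ones.
                if (i % 2 == 0) {
                    std::cout << i << " is even\n";
                } else {
                    std::cout << i << " is odd\n";
                }
            }
            return 0;
        }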

    Many just seem to want to pass an exam and get a qualification while learning as little as possible on the way. Some, however, are different.

    Motivation starts before university level: those old (pre-PC) minicomputers, with their Basic programming built in, gave a whole generation a start in programming, but this phase seems to have passed. Web programming is much more complex just to get going with.

    Finally, the programming problems that many students are set in classes are often on subject areas outside their experience. They often do not understand the problems themselves properly, so have no hope of solving them. Setting problems about playing-card games when many of the students are from cultures where they have never seen any, and where gambling may be thought undesirable, is bizarre. As is setting problems about business when most students are only just starting to buy things with their own money and open their first bank accounts, so rarely understand how businesses work.

    Lecturers should try to discover areas that all the students understand, and programs they would actually find useful, so that they continue to improve them after the course is finished. This might require a great deal of sharing amongst academics - not something they do readily.

    Plus ça change... as the French say.

  109. Jay

    Reminds me of university!

    1992 compsci degree course starts with Modula-2. "...we could teach you something commercially viable like Ada, but we're not into that..."

    1993 compsci degree course starts with Ada...

  110. Shakje

    @Dave

    Sorry if the reply came across a bit harshly at first.

    The first point we appear to agree on. I can see that learning bytecode would be a good learning experience for a Java coder, but I don't see it as an academic exercise, more as personal development. I still think that learning platform-independent C will give you a better grounding in the differences between platforms, and even in the place of Java.
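
    For instance (a minimal sketch of my own, assuming nothing beyond standard C++): even a trivially "portable" program makes you confront the fact that type sizes and byte order differ from one platform to the next.

        #include <cstdint>
        #include <iostream>

        int main() {
            // Fundamental type sizes vary between platforms and ABIs:
            // long is commonly 4 bytes on 64-bit Windows and 8 bytes on 64-bit Linux.
            std::cout << "sizeof(long)  = " << sizeof(long) << '\n';
            std::cout << "sizeof(void*) = " << sizeof(void*) << '\n';

            // Byte order differs too: the same 32-bit value is laid out
            // differently in memory on little-endian and big-endian machines.
            std::uint32_t value = 0x01020304;
            const unsigned char* bytes =
                reinterpret_cast<const unsigned char*>(&value);
            std::cout << "first byte in memory: "
                      << static_cast<int>(bytes[0]) << '\n';  // 4 little-endian, 1 big-endian
            return 0;
        }

    A JVM hides most of this from you, which is exactly why seeing it once in C or C++ is so useful.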

    I'm not saying it's a niche; I just don't think its size justifies the number of courses which teach Java over other languages. It may well increase; it depends on the aftermath of Vista.

    Yes, computing is about adapting to the current situation and solving real-world problems. I may have elaborated on my point a bit too much. The comment is irrelevant because it talks about physicists and teaching methods, whereas the article talks about CS students. Continuing this, though: if the physicists had been taught C properly, they would have no problems with pointer issues. :)

    The last bits were dribble; I apologise.

    Cheers.

  111. Anonymous Coward
    IT Angle

    Perhaps it is funding-related?

    Profs tend not to like it if funding bids don't go their way.

    Maybe Ada lost out to Java on some funding awards/applications?

This topic is closed for new posts.