Academics slam Java

The choice of Java as a first programming language in computer science courses is undermining good programming practice, according to two leading academics. In a withering attack on those responsible for setting the curriculum for computer science courses, doctors Robert Dewar and Edmond Schonberg of New York University (and …

COMMENTS

This topic is closed for new posts.

  1. N
    Unhappy

    Gee? You think!

    Most computer science courses are completely replacing the standard programming languages (C, C#, C++, Cobol, and even HTML) with out-of-the-box "programs it for you" languages.

    DeVry alone, which used to have some programming clout in its curriculum, has done away with any real programming language in favor of Java and VB.NET. Both of these write the code for you.

    ...

    Where's the logic?

  2. Dave
    Gates Horns

    @Joe M

    For starters, I'm a Java programmer. Cobol, C++ and Java were the languages taught to me during my 3-year higher education. This was vocational training. I did not learn any higher maths there. I was taught programming, program design, a (very) small course on opcodes (decoding binary to opcodes with pen & paper, for example), pen & paper simulation of an idealized CPU, databases, and various other things that I'm not going to keep typing out here. But everything came down to the fact that I only got the basics handed to me.

    Becoming good in IT is something no place of higher learning can do for you. They can help you, sure, but in the end you have to do it yourself.

    So, in my first week, when I heard we would be learning Cobol, I went down to the school library during lunch, took out the 2 syntax manuals and the manual for the compiler, and started to read. To this day, I have not stopped reading about programming.

    It's just a matter of attitude.

    Anyway, Joe, I'm very sure the engineer who handled the assembler programming would take about as long to learn Java to my level as I would take learning asm to his level. So, yes, I could change places with an assembler programmer. I just don't want to.

    And, incidentally, I have only once regretted not having more computer/maths theory in my training, and that was while implementing Reed-Solomon codes. The only time I (for a while) felt out of my league.

  3. Joe M

    @Dave @Robert Harrison

    My comments were not meant as a putdown, even if they were strongly put. Of course low-level programming has no place where Java is needed. (In fact, there are very few places still left for it.) And Java is not some trivial toy. Nothing which enables modelling real-world systems is ever trivial. All programming needs a specific skill set which takes years to acquire, and as the old saying goes, once you can program you can program in any language.

    The trouble is, as the original article pointed out and as I have experienced, many people now believe that one does not need to know the fundamentals of our profession at all. Not because they are lazy or ignorant, but because they think that it is unnecessary. I meet this attitude all the time. The article is simply trying to pinpoint, successfully I think, where this idea originates from and how powerful high-level tools like Java contribute to it.

    Final word. It's been a pleasure to read so many well-thought-out and interesting comments on this topic - especially those that I disagree with. They are the ones which make you think and perhaps learn something new.

  4. amanfromMars Silver badge
    Alien

    Quick Flash, Gordon, for Real Networks

    "Final word. It's been a pleasure to read so many well-thought-out and interesting comments on this topic - especially those that I disagree with. They are the ones which make you think and perhaps learn something new."

    Final final words ... I concur and applaud all who would share what they think. And as irreverent and light-hearted as the Register can appear to the casual observer, behind it are some Real Astute Programmers Internetworking Data for AI Virtual Stealth. Without y'all, where would we be?

  5. Robert Harrison

    @Joe M

    "The trouble is, as the original article pointed out and as I have experienced, many people now believe that one does not need to know the fundamentals of our profession at all. Not because they are lazy or ignorant, but because they think that it is unnecessary."

    To close this discourse amicably: I couldn't agree more. Sadly, I have too often encountered individuals who don't even get as far as the "think it's unnecessary" part. That really grates on me, because you have programmers who sit down and say to themselves "I want to get from A to B". They will then churn out a pile of worthless code that solves that path and nothing else. Not reusable, not elegant, not efficient. They do it time and time again, and then sadly, when it comes to debugging, have no idea even how it works, which is too bad because their peers stand even less of a chance.

  6. Anonymous Coward
    IT Angle

    My thoughts on this subject.

    Ok, ignoring the person who said the author had gone off on a bit of a tangent.

    Having done a modern computing degree... I might have a thought on this subject.

    I did first year around 11 years ago part-time, had to quit because I changed jobs. Came back into 2nd year and completed my degree a few years ago.

    Now, not all universities will be the same, and even in a given institution folk will have their own opinions. However, let me make this clear about my university.

    1. In first year you learn Hello World and pseudo-code. You also learn things like RAD (rapid application development), where you will often create things using off-the-shelf products like Access, and combine the basics of databases and programming to create a record collection. Messing around with Access (or similar) is a mere practical application of the theory they are filling your head with. Most first years will include hardware courses telling you how the memory connects, what a CPU cycle is, and so on.

    2. From second year you're expected to know the fundamentals of programming, and they begin to explain 2 very important things. The first is Software Development: client/server relationships and, depending on your course, you may do HCI (human-computer interaction) modules, networking, web development and so on and so forth. Needless to say, if you are being trained as a programmer you will do more SD than everybody else, plus other important modules on managing errors and other such stuff. (Don't ask me, I did networking.)

    The second important thing they teach you, and this really is from day one, is that programming is not knowing a language. Teaching somebody a language to program in is a dead end, as languages change much like jobs, and being too focused on one language leaves your skills sadly difficult to transfer. They teach the theory of programming, how to figure out and break down all the pesky challenges that will face you... then they grab the programming language of the day and throw you in.

    As it happens, back when I did my first year (knowing programming languages such as Pascal and Cobol was an entry requirement), C++ was flavour of the day; however, when I returned it was Java. Now, I have heard whining about people using graphical interfaces to program in Java. For my part (and I only completed my degree a few years back now, and I happen to know it's still the same), I did all my programming in Notepad, or a Notepad-like programming environment.

    Do I think Java is the best language to learn during a degree? Well, if it was the only one I learned or knew... maybe not. HOWEVER, nobody I know comes out of a degree knowing only ONE language; all are exposed to web, database, scripting and all sorts of other stuff. I mean... I learned Perl...

    I find Java a good language once you know one or two others, simply because you can teach advanced and complex things in Java that would take you all week just to program in C or C++. Remember, the code is not important: knowing how to program is not about the code.
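    A minimal sketch of that point (the class `WordCount` and its contents are my own illustration, not from the poster): counting word frequencies takes a handful of lines with Java's built-in collections, whereas in C you would first have to write or find a hash table.

```java
import java.util.Map;
import java.util.TreeMap;

public class WordCount {
    // Count how often each word appears, using the built-in map type.
    // In C this would mean hand-rolling the data structure first.
    public static Map<String, Integer> count(String text) {
        Map<String, Integer> freq = new TreeMap<>();
        for (String word : text.toLowerCase().split("\\s+")) {
            freq.merge(word, 1, Integer::sum); // insert-or-increment in one call
        }
        return freq;
    }

    public static void main(String[] args) {
        System.out.println(count("the cat sat on the mat"));
        // prints {cat=1, mat=1, on=1, sat=1, the=2}
    }
}
```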

    Thus, having done my round of programming in Java and messing around with arrays, Swing and other stuff, I moved on to more network-orientated work. However (I dunno, maybe as a joke), on the network modules list was a 3D course... To my horror (I am not a fan of programming and can be considered reluctant), I had to learn to program 3D stuff in Java. Now, in hindsight, I think it was a good thing to include, but at the time it was murder, as programming is like riding a bike: you never forget, but that does not mean you will not fall off a few times anyway.

    I have now started gibbering. In summary: at uni I was taught how to program; I was not 'taught' a language. Passing the degree was considered proof that I was capable of programming, and it happened to teach a language (or four) that might be useful to an employer. However, as it has always been, graduates are taught theory; employers are meant to take a given quality of person and theory and 'show them the ropes'.

    All the software engineers that I still know from my course do what I would consider 'real' programming. One mainly does JavaScript for a large oil multinational, and the other, whilst Java Certified by Sun (he did a placement there), has gone on to program in some (from what I hear) horrific in-house language. My pal doing the JavaScript did in fact say to me the other day that when he came out of uni he thought he 'knew most of JavaScript'; now, after over two years of using it every day, he thinks 'maybe' he knows most of what you can do with it. (For the record, he does do more than just JavaScript.)

    Now, I took networking, and I have had to program in VBA... yuk. But still, even having never learned VB or any other Basic, I managed to get the job done. And whilst I hope never to have to poke my head into 'proper' programming again, I do use my knowledge of programming and apply it to other things, such as routers and phone systems.

    All in all, I do not think that you can make a general statement about the teaching of programming in computing degrees, other than to say it's a little different from yesteryear, because everything is a bit different from yesteryear.

    Anyway, I am sorry for going on so long, and I am sorry that I was too sleepy to spell-check this post... I know my spelling sucks. Most of all, I am sorry for not proofreading this before posting, as I am sure there are some awesome gaffes.

    However, of the posts I read before putting fingers to keyboard, I agreed with these:

    Bollocks - By Greg

    Snobs, bigots, and venal professors - By Morely Dotes

    And at:

    @Greg - By Joe M

    The main market for programmers in the UK is currently in higher-level languages; low-level programmers, whilst in demand (because there are not all that many of them), are mostly trained on an electronics degree, where they must master both the hardware (creation) and the control software.

    Thus, an Electronics degree student may create a device and produce low-level drivers, whilst a Computing degree student will design a program, which requires a code monkey to create some libraries to allow it to be plugged into an already existing web application, which somebody who has done a Web Development degree created.

    The IT field has grown very large, even since I first joined it a decade or so ago. With so much to learn, it is no surprise that there are now specialisations. Low-level programming is one such specialisation. If you want somebody with both awesome program-design skills and machine-code abilities, you will need to find somebody with one of the skills already and train them in the other.

    Such is life.

    -Ano

  7. Anonymous Coward
    Thumb Up

    Then again …

    "The point of C/C++ in the education of developers is that it is one of the few languages that you can move seamlessly from Procedural Programming to Object Oriented programming without having to either learn a different syntax or ignore half the code.

    Java and other strictly OO languages are confusing for newbies because of the excessive framework and OO constructs that mire up the beginners ability to do simple procedural programming."

    COULDN'T AGREE MORE.
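    A minimal sketch of what the quoted poster means (the class `Sum` is my own hypothetical example, not from the article): even a purely procedural task in Java must be wrapped in OO ceremony before a single statement runs, since there are no free functions.

```java
// Summing 1..n is purely procedural, yet it must live inside a class
// and a static method; a newcomer has to wave past "public", "static",
// "void" and "String[]" on faith before writing the actual logic.
public class Sum {
    public static int sumTo(int n) {
        int total = 0;
        for (int i = 1; i <= n; i++) {
            total += i; // the procedural core: three lines out of the whole file
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(sumTo(10)); // prints 55
    }
}
```

    In C the same program is essentially one function, which is the "seamless procedural start" the poster is describing.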

    BTW, COBOL is far from dead; I'd advise anyone to learn it (you should be proficient in a few months if you can really handle working in IT), because there is a LOT more COBOL out there in the “real” world than Java, C, C++ or even VB.

    In a few years there will be a big thrust to move vast amounts of “legacy” code (God, how I hate that term) from mainframes to clustered servers (right reason or none, it will happen), and there will be a lot of ca$h in this for anyone with good/half-way decent COBOL knowledge and proficiency in C or C++.

    Watch this space …

  8. Anonymous Coward
    Anonymous Coward

    40 years of muddle

    Having read this with interest, I can look back over 40 years of bickering about the best language.

    This discussion recurs regularly and various interest groups try to press their favourites on us all. Why?

    I "learnt" Algol 60 at university in the 1960s, except that I never really understood it until much later. We covered a simplified machine architecture and assembler too, for an Elliott 803 computer. All in one afternoon a week for a term. Incidentally, we had a visit to the computer room and met the operators, but I never actually ran a program.

    Then, in my first job as an engineer, I learnt ICL's JEAN, which was like Basic, followed by Fortran IV. As I used this every day I thought in it - and this is what seems to be behind people's preferences: the first language they really can think in.

    Following that I taught and tutored Fortran (IV, 77 and 90), Algol 60, Basic, Pascal, C++ and Java, whilst writing applications in Basic, C and Visual Basic, then later Perl. However much object orientation is the fashion, I still often think in terms of that original Fortran, whatever its limitations. It is like my native language.

    Students need to have early success with very simple programs that they can understand. Programming is like Lego(r): you put the same bricks together to make a wide range of objects, but students must be taught, and must endlessly practise, how to make and use those little bricks - loops, IFs etc. - before they can build their fancier programs. They also need a reason to do this, and motivation is in short supply these days, when so much of what they are taught is modular and they work towards an exam, then promptly forget what they learnt.
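    A sketch of that Lego idea, in the Java the thread is arguing about (the class `Bricks` and its method are my own hypothetical example): the same two bricks, one loop and one if, snapped together into a slightly bigger object.

```java
public class Bricks {
    // Combine the two basic bricks to count how many numbers
    // below n are divisible by k.
    public static int multiplesBelow(int n, int k) {
        int count = 0;
        for (int i = 1; i < n; i++) {  // brick one: the loop
            if (i % k == 0) {          // brick two: the if
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(multiplesBelow(100, 7)); // multiples of 7 below 100
    }
}
```

    Everything fancier is, at bottom, more of these bricks in different arrangements, which is why they need to be practised until they are automatic.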

    Many just seem to want to pass an exam and get a qualification learning as little as possible on the way. Some however are different.

    Motivation starts before university level. Those old (pre-PC) mini-computers, with their Basic programming built in, gave a whole generation a start in programming, but this phase seems to have passed. Web programming is much more complex just to get going in.

    Finally, the programming problems that many students are set in classes are often in subject areas outside their experience. They often do not understand the problems themselves properly, so have no hope of solving them. Setting problems about playing card games, when many of the students are from cultures where they have never seen any and gambling may be thought undesirable, is bizarre. As is setting problems about business, when most students are only just starting to buy things with their own money and have their first bank accounts, so rarely understand how businesses work.

    Lecturers should try to discover areas that all the students understand, and programmes they would actually find useful, so that the students continue to improve them after the course is finished. This might require a great deal of sharing amongst academics - not something they do readily.

    Plus ça change... as the French say.

  9. Jay

    Reminds me of university!

    1992 compsci degree course starts with Modula-2. "...we could teach you something commercially viable like Ada, but we're not into that..."

    1993 compsci degree course starts with Ada...

  10. Shakje

    @Dave

    Sorry if the reply came across a bit harshly at first.

    The 1st point we appear to agree on. I can see that learning byte code would be a good learning experience for a Java coder, but I don't see it as an academic exercise, more a personal development one. I still think that learning platform-independent C will give you a better grounding in the differences between platforms, and even in the place of Java.

    I'm not saying it's a niche; I just don't think its size justifies the number of courses which teach Java over other languages. It may well increase; it depends on the aftermath of Vista.

    Yes, computing is about adapting to the current situation and solving real-world problems. I may have elaborated on my point a bit too much. The comment is irrelevant because it talks about physicists and teaching methods, whereas the article talks about CS students. Continuing this, though: if the physicists had been taught C properly, they would have no problems with pointer issues. :)

    The last bits were drivel; I apologise.

    Cheers.

  11. Anonymous Coward
    IT Angle

    Perhaps it is funding related?

    Profs tend not to like it if funding bids don't go their way.

    Maybe Ada lost out to Java on some funding awards/applications?
