Recently a director at a huge bank asked me “Do British students learn algorithms?” At first I thought he was joking, but even though he was paying three times what the average new grad gets paid, he felt despair. Because of similar experiences I was surprised to read that only 17 per cent of CompSci grads from last year haven’ …
I'm gay and a geek, so I'm doubly F....
But back to the subject of the article, it definitely echoes my experience of Comp Sci Graddies. They think they know all there is to know about IT, but sadly the truth is very different. It is rare to find one who can roll straight into the job without having to be hand-fed for the first six months.
Fail, because Universities appear to be doing just that.
I'm not for a moment suggesting that universities are not failing, but having to train up new graduates for six months before they are ready to start work is not in itself an indication of that - it is not the purpose of university to mould students to the needs of a particular business.
Sounds like a win to me!
All too true...
This is unbelievably accurate. As a senior programmer for a big mobile house, I've been doing recruiting for graduates to take on as junior iOS programmers.
I used to teach CS in the USA - studied it there too. My intro language was C. When I started getting CVs through here, I was surprised to find a lot of people applying for iPhone positions without any C/C++ education. Since Obj-C on iOS lacks garbage collection, having some experience of memory management is really rather important, and yet it's a skill completely lacking from many coming out of UK CS courses.
I have singularly failed to find a single graduate student who formally studied C or C++ for their bachelors. I'm honestly surprised *more* CS graduates aren't unemployed. Plenty of them deserve to be.
As a senior programmer for a big mobile house
Winnebagos need programming?!!
They shouldn't be applying for iOS positions maybe ..
... but the number of other languages out there that they might expect to be involved with that don't handle memory management is ever dwindling.
Do I get a cookie? :D
I chose to do C++ =D
But then again, I'm employed :)
It follows logically that garbage collection would be alien if all the CS grads are using Java as the lubricant to degree achievement.
Formally studying languages is the problem
The languages don't matter. I studied at a top ten university, graduating seven years ago and we didn't do a single class designed to teach a language. From memory, there was:
• principles of programming; an introductory course with submissions in Scheme but the point being to understand standard programming constructs
• computer architectures; involved an invented stack-based assembly for the practicals, but a written exam on instruction set architectures and design was the main part of the course
• computer graphics and visualisation; rasterisation algorithms mainly, with the necessary toe dipping into light number theory — coursework submission required in C and OpenGL
• algorithms and data structures; big O issues mainly, work submitted in Ada95
etc, etc. And, as a Maths & CS student, I spent only half my time in the CS department. I don't think we touched Java at any point, but I don't think anybody in the department there would consider that a failing. Though I guess they might, for example, shuffle Ada95 out for Java if general shift in tool chains made it more reasonable.
This does, inevitably, make for graduates who finish still a few months short of being fully up to speed on any specific job if they've not done anything for themselves outside of the course, but it's much better than having explicit language courses, even though it allows for cheap shots that there are no universities teaching languages other than Java. That's possibly true, but it doesn't mean that Java is all universities teach.
The Engine Management Systems do..
Yep, I don't doubt there is a part of Winnebagos that needs programming. The EMS..
Anyhow, TBH, when I did my Comp Sci degree, we started off learning Modula-2. Why? Because according to my lecturer, it's a good language to learn the basic structures and thought processes involved in programming. It's also a language that isn't used in industry, so we'd be forced to learn something more useful. I think the idea was that if you know one language, it's easier to learn a second. If you know two, it's easier to learn a third and so on.
2nd year, we learned C++, eventually going on to use the MFC to program simple Windows apps. We also had to knock up simple apps using Gnome on Solaris. In C++. In fact, the bulk of my programming throughout my degree was in C++.
The students now use all sorts of essentially scripting languages, the hardest of which seems to be Java...
When they said Indians would dominate the industry....
... I thought they meant Hindi speakers, not Winnebagos, Cherokee, .....
I suppose someone has to keep those AS400's humming at the local casino....
RE: Formally studying languages is the problem
I'm guessing you studied at the University of York, which in the last year or two has switched almost exclusively to Java. It's rather saddening as, while it was a bit of a struggle to learn a new programming language with each module, it was rather insightful to see how things can be done in different ways.
Re: RE: Formally studying languages is the problem
Yep, York it was. And I'm an AC just per the standard rules of healthy separation between online and real life. I quite liked the place and, at least in its early 2000s incarnation, was impressed by the academic approach taken. Not always so keen on my colleagues though; on one of the whiteboards on the way into the department at some point someone had written one of those mathematical 'proofs' that 1 equals zero based on fatuously factoring out a multiplication by 0. Someone else had written 'this is a divide by 0, the result is undefined' or something like it by the side and a third person had cleverly added 'this is computer science, not maths'. Hmmm. Still, one of the big advantages of a campus university is that you end up socialising a lot outside of your department...
Really sad to hear that they've decided to give implicit approval to a concrete technology over just teaching the principles, especially as they're still 6th in the country for Computer Science and 17th in Europe in general, per the latest lists that The Times will furnish. I'd still be surprised if they're churning out that many graduates that can't pick up C++ in a week. Compiler construction was still a popular item on the roster for second year students in my day - that and the emphasis on algorithms and how things work at the machine level elsewhere can't leave people too stranded, can it? And I was under the impression that there was quite a lot of actual hardware engineering stuff for solely Computer Science students? I definitely had to help someone with Z80 assembler at some point, having covered it at A Level.
Sorry, my first language was 6502 assembler and a couple of different flavors of basic.
Add to this Fortran, Cobol, C, all in college. (Ok, this was before Java and C++ was still relatively new).
But my point was that the language didn't matter. (There's a class for that.)
Java, Objective-C all came later.
Yeah, I do know C++, which is why I can make the statement that C++ blows. Sure, it's my opinion, and there are others who would disagree with me.
The key is that I can defend my opinion. Any decent programmer should be able to defend their opinion. When you interview a developer, you should ask questions based on their stated experience that require a detailed response. Not only will it show their technical expertise, but also their communication skills, which are just as important.
PS. Sorry, I dislike C++ because as a consultant, I'm called in to fix projects that have gone wrong. Cleaning up bad C++ is a hell of its own... ;-)
No garbage collection in Java?
Eh? I think you were thinking of "memory management". "Garbage collection" should certainly not be alien when using Java. It's handled for you, certainly, but you still configure it with VM tuning and should at least be aware of it, and if you've never run into memory leaks from unintentionally retained references or similar pitfalls of garbage collection mechanisms, then you're probably not using Java as much as you thought you did.
Re: C++ blows - Agreed!
As a young teenager in the 80s I started playing first with a Commodore PET at school and then my own BBC Micro. So I picked up BASIC. My 'O' level was written in 6502 machine code. My 'A' level was written in BASIC. They had taught us Pascal, which was an option, but I was pissed at them for not letting me use machine code for the 'A' level.
I would give Pascal another chance a few years later.
Even doing the 'A' level I started to become disillusioned with what we were being taught, so when I saw the degree syllabus I decided I'd had enough of this education rubbish. I still remember the computer science teacher calling me out of a geography lesson to help him set things up when I was 13!
From there the big old world of work started. COBOL, PCs, 8086 machine code, DOS TSRs, windoze, bit of OS2, VB, Delphi, C etc. Intarweb and TCP/IP.
Delphi (Pascal++) is probably my favourite for writing Windows apps. It's just so quick. How C became so dominant when it was designed to make things unreadable after you turn away for 5 minutes is beyond me.
I never ceased to be amazed at the number of grads I worked with who just had no fundamental knowledge of computers. Sure, I could forgive them for not having built a computer from spare bits aged 15 (not everyone was as geeky as me), but to look astonished when I open a command prompt and type a few magical incantations is unforgivable.
Typical example - "I have a load of music mp3s, mpeg movies and word docs all together in a directory, how can I move just the tunes and movies?"
Errr, move *.m* <destination>
Reclaim all the drive space windows has cluttered up with temporary files?
del "%temp%\*.*" /s /f /q
These days I program Windows apps and microcontrollers. It's interesting juggling between Delphi Windows apps and C/ASM on the MCU, and yes, fixing other people's C code.
There is always a sigh and relaxed smile on my face when I return to Delphi and have native string handling once again! Not to mention the inherent security aspect of having a language which knows how to handle strings, and isn't just forcing you to throw random characters about in memory!
But when a programmer has some consideration of ethics is that really "trash?" I don't think so.
It's not just about coding skills
You're quite right. It's a decent article but the comment "and other nonsense where you write essays rather than think" is just stupid.
Why? Because if there's one thing I have consistently found it's that CS grads with no social skills are very limited in their choice of profession. Those who can be let loose on customers win over every time.
Yes, there was a point to going out, getting drunk and getting laid at University ... in fact, for CS students there was an exponentially greater benefit than for those who were naturally more sociable anyway.
Communications skills are highly underrated
I've been programming since I was 14 - started on a PDP-11 with paper cards, so it was a while ago :) I've done many flavours of Assembler, Cobol, Fortran, C++, Java, etc. Everything from smart cards to mainframes. But new grads know at most one or two languages and have no clue how a computer actually works. It's a magic box to them. They also don't know how to communicate, neither in writing nor verbally.
Despite my many years of programming experience I spend less than 15% of my time programming, most of my time is spent interacting with customers, giving talks at events etc. And that ability has put me in the top 2% of income earners. Comp Sci students definitely need to learn more hardcore, in-depth programming skills, but they also need to learn how to communicate what they know to others.
If you have to be taught ethics...
I don't want to know, never mind work with you.
My experience of CompSci was one of thoroughly awful lecturers. While there were a few truly decent ones at my Uni, it was the mediocre to downright appalling ones that were teaching some of the hardest subjects on the course. I've no doubt that these guys were leaders in their particular research field, but generally they couldn't lecture themselves out of a paper bag.
At least arts lecturers get into the field expecting to be teaching, or at least addressing a crowd. CS lecturers however seem to have no social skills and often are brought in from various corners of the world and as such have either very poor or heavily accented English.
The fact is though, if you have decent social skills, an engaging personality and are knowledgeable in the subject, then you have far better employment opportunities than as a university lecturer. Most of the lecturers I found were either end-of-career types, whose understanding of the basic fundamentals of computing is excellent but who really lack any modern experience; those who lack the personality to make it in the job market; or those who just want to do research, for whom lecturing is something you're forced to do rather than any kind of calling.
Why do they get the post grad students to teach?
I had to take Z80 assembler in my first year of IT at Leicester Poly (in the days when IT consisted of analogue and digital communications, analogue and digital electronics, assembler and C programming, plus the woolly social implications stuff). I had been writing games and the like from scratch on the ZX81 and Spectrum for a few years whilst I was at school, so I knew Z80 assembler inside out. (And I had blown a few up interfacing to them!)
The lecturer was a postgrad student who was reading from a book. About half an hour into the first lecture, after I pointed out for the third time some glaring mistake he had made, he told me to f**k off and not bother him again. Others on the course passed the assignment questions to me through the year and I passed back my answers via the course leader (so they could not get lost). I reckon that I spent more time teaching that course to the students in the pub than the lecturer ever did. And the bast*rd only gave me a 99% mark for the year, dropping me a mark for not attending the lectures!
There were a couple of other postgrads who also had little idea about teaching, but at least they had more of a clue than the one old guy who was only employed because he had a large research grant in tow!
The best lecturers all had a passion for the subject.
Now, when I interview applicants, I get them to write (with a pen and paper) about their journey to the interview. If they can't write a coherent, legible description, then I'm not even going to waste my time on them. I can teach them a programming language, but they need to show an aptitude in their own native language first!
So, bad lecturers and bad students. Who is surprised by the unemployment! Not me!
Definitely an issue
All of my database related courses were taught by post-grad students, who were also overseas postgrad students!
The fundamentals of database design and architecture, passed on by someone who didn't seem to understand the subject that well, couldn't express himself very well, due to limited English, and was unable to answer questions to any degree, also due to struggling to understand.
Object oriented software engineering, taught by someone who completely failed to get across any concepts of object orientation. Later an 'old fashioned lecturer' went off on a tangent, in a completely different subject, about data types, and built it up from simple data types to the concept behind object orientation and how classes work, making it all fall into place.
We had "tutors" in the programming labs to go to with any issues, but most of them lacked even the most basic understanding of debugging - often, even of how to sort out compilation errors. The basic skills and understandings were overlooked in selecting most of the people who were there to teach and support.
I now work with a lot of people who did computer science at Uni, most know how to program, but not a one really knows how to design and construct an application. They generally learned by looking at the existing code and applications and duplicating it, cut & paste coding, tweaking bits here and there, never actually knowing what they are really doing in any great detail.
As the article states "a good programmer knows that it is how you think, not the language you code in, that determines your ability." Unfortunately, recruiters aren't good programmers!
With a wife who has a PhD, I find the issue of postgrad lecturing difficult. The problem is that many universities will not consider someone highly, even as a basic lecturer, who has not got some teaching experience. I think the real problem is that there's no formally accepted training system in universities for 'teacher training', but you can see the reasons for this. PhD students have already spent at least 5 years as a student, and are looking at at least another 3 years on top of that, so what do you think would happen if they were required to spend another year or two doing teacher training to become a lecturer? It's practically impossible to fit proper training into a Masters or PhD schedule without increasing the length (for some people I could see how this could be done, but it wouldn't be fair on those with heavier-content doctorates, and it might also affect private funding if sponsors thought that some of their money was going towards teacher training instead of benefiting the private sector). You would end up with more and more postgrads going into research and private jobs and fewer and fewer lecturers, which in turn would lead to less qualified lecturers as universities tried to fill positions with people who were willing to learn to teach rather than spend time on their own subject.
I can fully understand that it's not a perfect process, but teaching experience is vital to PhD students if they wish to pursue lecturing. What I think could be done is to make sure that postgrads begin by teaching tutorial groups instead of full lectures. But, being bluntly honest, when I was at uni I had my fair share of bad full-time lecturers (who seemed glued to just reading out their notes) as well as plenty of very good ones, and I think one thing that you should take away from uni is that you won't always get the best person communicating the content, but that it's the content that's important. If you can't grasp this, I can see it being very difficult to operate in a business setting.
Not much has improved since my day, then
I did my secondary schooling in the late 1990s, and was so utterly put off IT by my teachers (and by the woeful syllabus they had to teach against) that I didn't actually do my CS degree until five years later, having basically dropped off the map during that time.
The first year of my CS degree was similarly dismal; being taught what integers were, and how to perform boolean evaluations, and the like.
There seems to be a (not inaccurate) assumption within the university system that, unlike any other subject I know of (except for Art, which requires a foundation year), their first-year students will know NOTHING AT ALL about their chosen subject.
You wouldn't just walk into a university one day and ask if they had any places available on a Chemistry degree course, and yet that is exactly what I did when starting my CS degree. I just walked in, they checked for empty places, and signed me up on the spot.
When I asked what I'd need to know before starting, they explained to me that there wasn't any requirement beyond basic reading and writing skills (and some UCAS points, I suppose). This is a problem that starts in schools at the (utterly, utterly woeful) ICT GCSE level. It wastes university resources, and valuable time that students should be spending at the end of their degrees on advanced subjects.
Re: Not much has improved since my day, then
Sounds like the same experience I had in the late 80s.
All universities said "no prior knowledge" required, and at one university, the person I went to see said it was actually better to have no prior qualifications.
Seeing as I had an A at O level, then got an A at A level, and a 1 (distinction) at S level, I had no intention of doing a course with people with no knowledge, so I fell back on my second love, electronic engineering.
I still went into the computer industry, and was employed on my knowledge rather than qualifications.
After a while we developed a "unix test" for future employees. It wasn't hard - any seasoned unix programmer / systems guy should have easily got 100%, but some of these so called experts were getting 20% or less
Could be worse, could be Flash
Early on in my degree I was told that we should trust the OS and write for that - that being Windows 3, and the course being real-time systems engineering design. Having been a mature student with a couple of years doing stuff like reverse engineering control system code, that depressed me somewhat. So did having to learn Z, but I appreciate that more now. Unis seem to churn out people that think they can code, but often with no idea about why they're doing it and how it relates to the business they're in.
I found Java the dullest part of my CompSci course 10 years ago - we used Haskell, Pascal, Java, C++ and Matlab, and spent a good deal of time on algorithms, operating systems (minix - written in C), hardware architecture, and concurrency.
And now I code in C# most of the time - but when I need to use other languages, I can and do, and my understanding of the computer as a whole stack (hardware/OS/software) gives me an advantage in writing better software over people who seemed to just spend 3 years learning Java, and no "computer science" whatsoever.
I don't have any issue with Java being the primary language of education
It is after all the most sought after skill by employers. But if all they've learned is Java then they are only one step above useless. They need a proper grounding in the basics of computer hardware, OSes and algorithms. And you can't properly do that without at least some C, hopefully some C++ and maybe even assembler skills as well.
"Computer Science" education of today is hardly any better than a mail order "learn to program in 90 days" course.
On the button
This is exactly right. I agree almost entirely. There is no merit to any CS grad if all they know is one language and no operating system internals. Java is OK, but then so is C#. It should be mandatory to study some part of an operating system, and as Linux is open source and coded in C, it makes sense to use C. Let them learn how pointers work. Defensive programming. Database work at the lowest level.
Has no-one understood why all the best CS grads in well paid jobs were not educated in English Universities?
Of course, this starts at school. Kids are crammed to get good A levels. Just this weekend my daughter had a friend staying over from a good English university reading Maths. She is in her first term. She said the course is hard work - "They don't teach us everything, we are expected to find things out ourselves and have to use the library". She was almost equally horrified when I suggested that research is not looking things up on Wikipedia.
Personally, I don't care what languages you have or what DBs you have used. But you must have more than one language and know when to use different ones; and at least have good SQL. Know that and I can teach you how our shop works. But without that you are a drone.
We used to have an education system that was the envy of the world. Where did it all go wrong?
Adrian (AC so I don't totally hack off my team during milk rounds)
"We used to have an education system that was the envy of the world. Where did it all go wrong?"
Politicians, politicians and politicians.
Most British are uninterested in politics (and who can blame them) so the ones we have are basically the lowest common denominator elected by those who don't understand and don't care.
So we're well on our way to hell in a handcart.
Unless sufficient remember that rebellion is the right of every citizen - and sometimes a moral duty.
Re: On the button (AC 1/11/2010 12:58 GMT)
<quote>Personally, I don't care what languages you have or what DBs you have used. But you must have more than one language and know when to use different ones; and at least have good SQL. Know that and I can teach you how our shop works. But without that you are a drone.<unquote>
What barbarous nonsense. Ted Codd studied maths and knew nothing of SQL when he invented the relational model for databases, without which SQL would never have existed. I was a lot younger than Ted, but was doing research in databases and information retrieval long before SQL was invented. SQL is just another language (actually a badly screwed-up version of Ted Codd's idea for a relational calculus based language), and saying that someone who doesn't know that particular language is a "drone" for that reason is total garbage (and I say this as someone whose last two technical director/VP level jobs were based partly on my SQL expertise, not as someone who wants to claim SQL doesn't matter because they don't know it).
<quote>We used to have an education system that was the envy of the world. Where did it all go wrong?<unquote>
When they began to let idiots who think some particular computer language is an essential part of CS education have some influence? (Unfortunately that really has happened, and at a large number of universities that language is Basic, and - o tempora, o mores - at an even larger number it is Java; but the real bad news is that it's yet more often C++.)
What I have noticed a lot of, is the people who are working in IT are no longer Geeks, in the true sense of the word.
I hear a lot of: "I'm a programmer, I don't care about computers." Just the other day our SQL DBA called me a geek (not that I minded), just because I have a NAS at home and multiple laptops and computers in the house. When I first started out (early 90s, professionally), only geeks worked in IT - now it seems everybody does, but most don't actually have any passion for it (apart from us old timers)..
More people = More diversity
There are several hundred times more people employed in the computing industry than in the 1990s, so it's not surprising that there are proportionally fewer geeks, but I doubt there are fewer geeks in total. They're probably just more concentrated in the challenging jobs, such as assembly programming (life expectancy 30 years, sanity expectancy 3 years) and those nutty hardware guys (just guessing).
diversity == FAIL
Sadly, since things have moved to a 'mobile' world (iOS/Android/web-based tools), there is a belief that there is no longer a need for high-level languages/OSes. WRONG! I call this the 'University of Phoenix' approach to education: Teach the 'Hot' tool and get them edumahcated.
Epic FAIL because students have become as lazy as the so-called institutions that are teaching them.
Re: Geeks?? Where??
Same thing over here. When I mentioned that I had bought a used Cisco Catalyst 2950 switch, a lot of former college friends asked me why would I want something like that. I responded that it was so I could separate my home network into 3 VLANs. It brought another ton of "why?" questions.
It really, really seems that "because I want to tinker with this stuff" is no longer accepted, even among CompSci grads. :(
Funny thing: when I told my Mechanical Engineering student peers that I wanted a few VLANs (bought that cheesy Netgear managed gigabit switch), they were halfway interested.
More comedy: At my university the CS students had to ask an EE or ME student to fix their computer when it broke. Their interest in computers ended at the window border of their IDE.
Even more: Thanks to a mid-year shift in the standard syllabus, my class of ME students learned 2 languages in the first year (C and Matlab,) while CS students only learned Java.
Also: ME and EE students were required to take an intro to Linux course. CS students were not.
The Best Part: With my ME degree and barely out of university, I've now got a programming job (long story) and I interview CS grads for non-programming positions. And they all suck. A third of a page (out of at least 3!) of your resume should not be a bulleted list of every version of Windows you've ever used, especially when you're applying for a Linux-only position. And spelling and grammar do count, though I've had to thoroughly give up on judging by resume layout, since they're all pretty much based on the same tragic MS Word template.
I have been called a "geek" by IT support people
It was intended in a more-or-less friendly way, at least I choose to interpret it as such.
I was a mere deployment tech (put PC on desk, turn on, copy user's data, remove old PC). But my boss tended to sneak me out to do the curlier support tasks as she had no faith in most of support to fix more than someone's word file. Most IT people today seem to treat a PC as a magical box. They have rote-learned some arcane gestures to manipulate the icons on the magic window to keep the godlet in the box mostly happy but have no real knowledge of how it really works.
I'm old-skool - I started with digital electronics after dropping out of a completely pointless senior high school experience back in the 80s. I can build (and have built) an 8- or 16-bit computer from the chips up, and can pick up the basics of any language in a few days (interestingly, it took exposure to Java back in the 90s to get my head around object orientation before I could finally manage C++). A few years ago I spent a boring lunch break drawing a pure-logic router circuit on the whiteboard while doing a networking course at trade college. It became very obvious why all the masks are inverted - a significant saving in transistors, which was important back when these protocols were created.
Most of my knowledge is not formal and so not attached to a piece of paper, so finding a job I liked was a bit of a slog, but by starting at the entry level and showing what I had, I think I am finally there now - I now get to help Digital Media Creative Arts students make electronics/microcontroller-based art, amongst many other things - heaven.
My Uni teaches C++ to compscis (and "C with elements of C++" to physicists), together with Java, MIPS assembly and SML, and expects students to learn any language they may need for a chosen project on their own, and the majority of people are doing pretty well at that. Programming is considered a useful skill rather than the goal of a Computer Science degree, and people are generally more concerned with the formal approach (algorithms being one aspect of it).
I, therefore, fail to see any problem.
At least in Cambridge.
I hardly see F# being a useful end point for Computer Science graduates. Maybe for a bitter IT hack... But a computer scientist, with a knowledge of more than just this month's flavour language, is infinitely more useful.
The idea that "only Queen Mary's" produces students able to write C++ is frankly ridiculous, perhaps the author should try contacting some other universities.
And bemoaning the lack of IT skills in Computer Science graduates shows nothing more than a distinct lack of knowledge of the field. If I end up stuck in an IT job after graduating as a scientist/engineer I will hang myself.
Assumption about Queen Mary's is not so ridiculous
after you stumble on this: http://www.qmul.ac.uk/alumni/profiles/computerscience/index.html#dominicconnor
I think you have misunderstood my argument.
A well taught graduate will not leave with a huge amount of knowledge of specific technologies. With the pace of change in the industry that would be a useless skill set.
Rather, a graduate should leave with the underlying skills to learn, understand and utilise whichever technology they need to.
Why teach networking in the context of setting up a Windows Server 2008 install when a few hours on Google will show you how to apply your abstracted knowledge to that specific skill?
You wouldn't design code to handle only very specific cases, why would you want to train graduates in such a way?
Arrogant people unwilling to apply their knowledge cut off their noses to spite their faces, but in what discipline would that not be true?
I agree, but...
I've yet to see an Oxbridge CompSci with a 1st who can't set-up a server. Granted, the first time they do it it may take them slightly longer as these Unis don't waste time on teaching technology or product X, but rather general principles, how to apply them and how to build upon them.
There is some specialisation in this discipline - while it's reasonable to expect a good graduate to be able to set-up a server for whatever they've written, there might be people who will do it better than they do.
Maybe it's only the dross which are using your services, the rest of them are quite capable of landing themselves a decent job.
I read this with genuine interest assuming it was an employer, until I saw it was a recruitment agent.
Now it makes perfect sense: you ask for a Java developer and they send you someone with a CV full of "skills" you need to look up on wikipedia.
Meanwhile they tell graduates they can be on 100K.
I agree CS courses need geeks.
Having recently interviewed a number of CS grads for a networking post, I was dismayed at how little they knew about the fundamentals of networking.
One applicant had Networking and Advanced Networking modules on their CV but could not explain the difference between a router and a switch.
I also agree that unlike some subjects a CS grad has to have a genuine interest in the subject. Maybe when Universities start having to publish REAL employment stats they will worry about having relevant course content.