except you won't get past the HR CV sift without a degree.
Learning to code in your bedroom will prepare you for the IT job market just as well as a three-year degree costing £27,000, professionals said in a survey published today by CWJobs.co.uk. More than half the IT professionals polled said they would not do an IT-related degree today if they were paying the increased fees, which …
This is why the industry is in such a shithole. HR in some companies need to be fired or shot simply for rejecting any CV without a degree on it.
Some of the greatest developers don't have a degree. Requiring a degree just to get to interview stage will risk that company losing some pretty good talent.
I would easily consider taking a developer with 3 years' experience and no degree, depending on his/her evidence and some basic interview concept testing.
A degree acts as a credential that says 'I learned something' - a degree from a big name uni counting for a lot more than one from a newer uni.
In general, I think too much focus is placed on degrees these days. A lot of people could go into certain jobs without a degree and be just as good. But employers insist on them (irrationally), so everyone must trot along to university or be judged a waste of space.
I don't have a degree. But I do have a shedload of experience. Furthermore, quite what a degree acquired 20+ years ago tells the hiring community beats me. It may have suggested that at age 21-ish I managed to scrape through some exams, but it doesn't take into account that the ongoing loss of brain cells ever since has removed that capability.
I was a badge-carrying member of the Tufty Club some 50 years ago - does that mean I will get preferential treatment for any job which involves crossing the road? Or is it safe to assume that the no-hopers who work in the HR departments making ridiculous decisions wouldn't take this acquired experience into consideration?
I don’t have a degree and seem to be doing very well thank you very much; however, I have worked since I was 16 / GCSE age, in a computer shop etc, before working my way up. So I have experience.
The other side of the coin is the people doing IT degrees because IT gets sold as a well-paying job. They come out of uni with no experience and wonder why they don’t get hired for the well-paying jobs.
I’m now in a position to hire etc - I would take experience over a degree, and in a perfect world industry certs with experience as my preferred choice.
Weirdly the degree you actually have is actually less important than having one... a degree basically says "yup, I left my parents' basement and went away for three years to gain some life experience*" ... my programming experience is largely web based (LAMP stack) but I've also taught myself Perl, C#, Java to a certain extent and tinkered with C++ (when Valve released the Half-Life SDK with the full source code I HAD to have a look).
The one advantage I'd say self-taught "bedroom enthusiasts" have is that they tend to remain enthusiasts (e.g. enthusiastic about programming) ... something the education machine can have a nasty habit of draining away.
Oh, my degree ... Arts & Design, finalled in Fine Art Sculpture - on paper I'm more qualified to cast bronze, use an arc welder, or chisel stone than I am to turn on a PC - and I have far less enthusiasm for art these days than I do well structured objects or good database design.
*life experiences such as alcohol tolerances, surviving food poisoning and experimentation with controlled substances and sexuality ...
You're close. Having a degree tells the prospective employer that you are capable of learning on your own. And it doesn't matter what subject.
Carly Fiorina of late memory had a degree in Medieval History, for gawd's sake (but she sure knew how to destroy HP and the shareholders).
Anyway, I used to work for a company called EDS, which was very successful in hiring 'liberal arts' graduates and teaching them Cobol/PL-1/IMS/DB2 etc, and did very well.
Not so well in the new reality of Java/C++ etc...
The post below reminds me of myself.
I was a Mathematics graduate and really, really thick for a long time before I started to understand DP....
First Cobol program, I tried to use a tape drive for an indexed file.....
When I'm searching for jobs and come across anything which suggests I must have at least a 2.1 I automatically practice my toilet swilling abilities. Not only do I not have a degree, but if someone is making serious hiring decisions about job applicants based upon a degree which could have been acquired 20+ years ago whilst ignoring the vast experience that I have then I really don't want to be working there. Ever.
I think at least part of the problem is that those who are doing the hiring and who are stating that the degree is a requirement most probably went to university and assume that they are somehow superior to the underlings who didn't waste 4 years of their life running up massive debts.
For something like medicine then a degree could potentially add real value, because to get that degree you must have sat thru lectures and done lab work - plus answered some questions in an intelligent way, all of which would be relevant to the profession.
The worrying thing is that in today's society I fear that it won't be very long before McD's are hiring burger flippers based upon degree qualifications.
20+ years ago, degrees were a rarer beast; not having one was the norm. Nowadays they are a standard part of education, so not having one makes you look uneducated.
But to be honest, I haven't seen many CVs where the person jumped out at me as being competent to be let anywhere near real software. The absolute worst were the ones straight out of uni, who claim to build commercial-grade applications in their spare time. (Always Java.)
They already hire people based on degrees in the most menial jobs where I live. If you don't have one you can't even get a job in a store selling cigarettes.
I don't live in the western world though.
I've spoken to guys recently who told me they generally don't hire anyone who hasn't got a Masters degree, never mind a plain old BSc.
I work with two people who managed to secure degrees and get jobs in IT in my dept and they are still as thick as flour-enriched, extra-strength pig-shit! It's not just a case of not being technically knowledgeable but genuinely f**king thick at most things!
I wonder how on earth they manage to get themselves dressed in the morning, some days! Sad fact of the matter is these sorts of people only got jobs in IT as they were promised good money, not 'cos they care about the work or enjoy technical challenges on a daily basis.
I studied up to A-level and enjoyed working as a tape-monkey at a data centre so much during that last Summer at school, I got a real job. Eventually I moved over to system admin on Novell and finally to Unix/Oracle admin. I have no wish to be in management which almost certainly requires a degree these days. I enjoy being back-room and treated like a mushroom, thanks very much!
Alright, a degree might have helped me get my foot in a few more doors and I have to "fight" a little harder in the interviews I do get, but 25 years' experience in IT admin seems to have kept me going. I have every intention of working my mortgage off by the time I'm 45 and then getting a series of ever-crappier jobs to keep me going until I can retire and take up my hobby of photography, full time!
Your description sounds a little like my CV!
I did A levels (back in 1981) and was just sick of academia, so rather than waste time and money going to uni I looked for a job.
When I left school I had only ever seen/used one computer, doing a summer job in an insurance company (my school got its first BBC micro the year after I left), but I ended up getting a job in operations. The main qualification was probably my driving licence, as I needed to do shift work and the trains/buses don't run that well at 1:00am.
When I had some free time I started doing little scripts (in DCL mainly) and later became the trainee programmer, entirely because the management saw what I had already managed to do by reading the manuals on my own.
Everything else I have learnt by reading the manuals, along with some trial and error, and learning from some very experienced co-workers.
Over the years I only know of one occasion where I have not been called for an interview due to not having a degree, and that was with an outfit that was a spin-off from a university, so prejudice was to be expected.
Also, over the past quarter century that I have been doing this job, I have only met about 4 or 5 IT graduates I would regard as being anything other than useless. Of those, 3 had done the job full-time beforehand, and only got the academic qualifications after being made redundant.
The main reason most graduates are useless is that they have in fact been trained in the worst programming practices known, by people who typically cannot do real/useful software development themselves. The end result is very old children who technically would have been better off buying a few good books and playing with software development on a Linux download, with a total cost of maybe £100 or so plus an old PC.
I started work straight from school with not even bedroom coding practice, and by 20 I was an "Associate Systems Engineer" at EDS; by 21 I was a full Systems Engineer and was training 22-24 year old graduates who were on about 10K less than me... I had no debts whereas they all had crippling student loans, even when there were no tuition fees to cope with
all it really takes to get on in this world is a serious desire to succeed and the tiniest bit of luck... mostly all it requires is standing up and saying "I can do that" whenever you hear someone saying they need something doing - even if you don't know how you can quickly find out how and get the job done
I have been working in IT for 30 years. I have been a programmer, network engineer, pc tech, etc. etc. All the jobs I have applied for have some tests, and as long as you pass the tests and do well on them, you are in. Most corporations want results more than a piece of paper.
The ones that will require the piece of paper are school-based jobs and other accredited positions. They are in the business of education and so they require one in order to qualify, but the vast majority of IT jobs have been found to be better filled by people who are self-taught.
The four Yorkshiremen sketch was from At Last The 1948 Show, not Monty Python. 50% of the original four Yorkshiremen went on to form Monty Python, and the sketch was performed by the MP team, but accuracy is important, I feel.
A wonderfully innovative and prolific time for UK comedy.
The problem with not having a degree (in many cases) is that candidates end up with general skills, making their jobs far easier to outsource. When the shareholders come asking HR "Can we find this elsewhere for cheaper?" the answer is usually "Yes".
Being able to use the tool is one thing. Applying it in an industry is another story. I.T. is used in many places. The few examples I have come across in my career are: mathematical analysis of financial statistics, chemical engineering and instrumentation design and testing.
Get a degree, find a niche and lessen the chances of the ba*tards outsourcing you.
Unfortunately it's not just HR. My friends and I are increasingly finding that, these days, the worst barrier is the job agency sales droids, who don't want to put your CV through to the companies if it doesn't have a degree on it.
You can even try to reason with the job agency sales droids - point out that you've got decades of experience, worked on large projects, have finished products on sale for years etc. - but it often doesn't work, because the sales droids have such an apathetic, negative attitude towards you as soon as they know you don't have a degree that they pass that attitude on to the company even if you can finally get them to pass on your CV, and some simply refuse to pass it on.
If you don't have a degree, the job agency sales droids don't want to know you, because you are harder to sell to companies, because you don't fit into their pigeonholed tick boxes - and it's getting worse, as more companies go through job agencies to find staff.
Even worse, some companies are now not even allowing people to contact them directly, because increasingly they also sub-contract out the HR CV sift to the job agencies! … That unfortunately means you have no choice. You have to deal with the sales droids and they don't want to know. :(
I think a lot of people here are guilty of not reading exactly what was said...
"Learning to code in your bedroom will prepare you for the IT job market just as well as a three-year degree costing £27,000, professionals said in a survey published today by CWJobs.co.uk.
More than half the IT professionals polled said they would not do an IT-related degree today if they were paying the increased fees, which will come into force next year."
The first sentence is an incorrect inference from the second (the result). What IT professionals are saying is that they would not do *an IT-related degree* if they were paying the increased fees.
This is spot on from what I've seen. I've worked with Mathematicians, Physicists, Engineers etc as well as non-degree people (tough getting past HR without good history) who have all been up to the job. An IT-degree hasn't been a necessity for a long time but probably helps in those hardcore C++ architectural/leading edge positions.
I'm a self taught developer of many years and I got my first job (in programming) straight out of high school. Experience is much more valuable than a paper degree and any employer who you would want to work for will know this.
It might just be my personal experience, but I have met more programmers than I can count on one hand who have University degrees with no sense of how to actually apply their "skills" to real world problems.
I have seen a number of graduates who don't understand even the most basic principles of how computers work; things like instructions, stacks, address space, and pointers are things some have never even heard of.
The typical teenager in the mid-to-late eighties had a better idea, when all they did was read Byte and write simple BASIC programs on the BBC down at Dixons.
Playing devil's advocate here. But surely back in the 80s, with computer technology being less divergent than it is today, it was simpler to leave school and pick up programming on the job. With the amount of different technologies people are meant to know today it would seem to me that a good degree would be invaluable in terms of having a proper overview of computing. Without that I don't see how people would know where to start.
Obviously a degree can only go so far, the rest is down to the individual to go the extra mile.
and have seen quite some horrors produced by various "bedroom coders". They do get things to work, but the code is often not an "oil painting". There are certainly those who can teach themselves, but there are those who do benefit from learning a more disciplined approach.
The lack of discipline can be really astounding in some. There was one guy who insisted he wanted to hand his work in in C# rather than Java. I told him of course he could hand his assignment in in C#, so long as he did not mind failing the course. He found this unreasonable. I suggested my attitude reflected that of a potential employer or customer, who more often than not has some requirements on programming languages, coding style, and comments. If you hand in your work in a different language, you would be in breach of contract, or get fired.
He still thought I was being very unreasonable.
Another story I like is the guy who handed in an iterative solution where the assignment explicitly stated: "implement a recursive method to compute ....." He argued this was more efficient, we said that was true, but that the assignment was to learn recursion. He said but my implementation is more efficient, we said that was true, but that the assignment was to learn recursion. He said but my implementation is more efficient, we said that was true, but that the assignment was to learn recursion. ..............................
This went on a while until we terminated this infinite loop (not by kill -9, but more humane methods)
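For anyone who has never had to make the distinction the student was resisting, here is a minimal Python sketch (the actual assignment isn't quoted above, so the summing task is just an illustration):

```python
# Two ways to sum 1..n: the iterative version the student preferred,
# and the recursive version the assignment actually asked for.

def sum_iterative(n):
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_recursive(n):
    # The base case stops the recursion; without it, this really
    # would loop forever (well, until the stack overflows).
    if n <= 0:
        return 0
    return n + sum_recursive(n - 1)
```

Both produce the same answer; the point of the exercise was the shape of the solution, not the result.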
This is not to say the tuition fees aren't outrageous. You can get a much more favourable deal in the Netherlands, and the university I work at (Groningen) is drawing more and more students from the UK. Our MSc courses are English language anyway, and our BSc courses are headed that way as well.
Also rather ironic that you expect a lecturer in computing to also be an expert in English.
Yes, it might be useful when teaching computing to know the usage of the apostrophe in English (oh, hold on, maybe not), but I'm pretty sure it doesn't work the other way round. Do you know any English lecturers who can teach programming? Or indeed many experts IN ANY FIELD who can use the apostrophe correctly?
I taught myself to program starting with the zx81 using basic and moving on. However I never had any confidence that I was actually doing things in the best manner. It was utterly impossible to get a job back then without a degree. I eventually went and did a BSc in Software Engineering. I was horrified that some of the computer science degrees at our uni didn't even require a single programming module to be taken over the three year course!
Even of those who did do courses which required some programming many were dreadful to be honest. Lots of them were there because it was a trendy subject which could earn you good money afterwards. I think the answer is
There are good degrees and there are dreadful ones.
There are good educated programmers and dreadful ones
There are good bedroom coders and dreadful ones.
Long interviews and some testing are the only real way to determine who you're dealing with.
He's teaching computing not English. You might be forgiven for expecting impeccable English from all teachers whose first language is English. When the teacher has already stated that he(?) is based in the Netherlands and English probably isn't his first language, it seems a little unreasonable.
That said, most people who learn English outside of the English speaking world seem to get a better education in the language.
If you think it is idiotic to expect people to follow instructions, then good luck job-hunting. It's not hard. You are at a university. You are asked a question. You don't answer a different question.
The only thing proved here is how tolerant some academics are when presented with yet another cretinous self-obsessed student who is so full of his own "entitlement" that he thinks he can redefine the course requirements to suit himself.
While this all may be true ... you can run into issues with fully qualified programmers with bits of paper and everything - particularly when the education system is geared towards creating drones for a particular software environment...
Programmer: "Why don't we use .NET on the website?"
Me: "Because it's a LAMP stack using Slackware, Mono isn't production ready and we don't need the overhead".
Programmer: *blank look* "But .NET works on everything? What's Mono?"
It all comes down to the person and how they're educated - my biggest worry, being self-taught and effectively working in isolation most of the time, is that I might be very, very wrong and never even know it - I work fairly hard to ensure that I'm not but still...
Good point. We TRY to teach them, but then some just replicate instructions until they pass the exam, and then forget all about it. Some people are, as we say "resistant to education." At the same time there are many self-taught people who are excellent. Besides, there are many practical skills we do not teach (especially on the wealth of different tools out there).
The most important thing you can learn in education is learning itself. I have had no formal training in lattice theory, but I have acquired the skill set to learn it, and now contribute to the field. Many of my coding and debugging skills I learned in practice, in my first job as scientific programmer (developed a 150 kloc image processing system). That is when theory gets turned into practice.
You're being rather knob-headed to expect any programmer to know what .NET and LAMP stacks are, and to expect programmers to come pre-loaded to your requirements.
Such detailed knowledge will depend very much on the programmer's area of expertise.
What is more important is if the person can learn and become productive within a short time.
Take me for instance. I'm pretty capable. I've been writing software for around 30 years - mainly kernel stuff and embedded. I only have the slightest knowledge of what a LAMP stack and .NET are (enough to get me into trouble and certainly not enough to get out again).
I could however learn and come up to speed quickly if needed.
I won't return the compliment by calling you an idiot for not understanding the details of how the CAN bus or ARM assembler works.
Actually - I DON'T expect a programmer to come pre-loaded to my requirements; that was kind of my point - universities seem to be churning out programmers that ARE pre-loaded to some predetermined blueprint, to *somebody's* requirements, with a specific set of skills - they don't seem to have been taught how to learn or solve problems, nor do they have a real interest in the field they're employed in (remember crapverts like "get an IT qualification - earn ££££s"?)
I don't care if they're a C# .NET developer or some kind of warped guru who thinks in x86 Assembly... however I do expect, if I'm employing a programmer, that they have enough interest in the field in which they're working to know that there are other operating systems beyond Windows, that .NET is MS-specific (Mono notwithstanding), and that if they're a .NET developer (for example) they know the shortcomings of that as well as the benefits; they must have an ability to think on their feet rather than dogmatically pursue a language/framework that they're comfortable with when it's not the best tool for the job.
The most important part of programming is problem solving - coding is pretty much a matter of grammar and vocabulary, in which you can be retrained; it's not a massive leap to move from C# to Java, for instance, or, to a lesser extent, C++ or PHP. I'd expect a computing graduate to understand the principles and be able to problem-solve - _really_ detailed knowledge of the language comes best when you're using it day in, day out, i.e. on the job.
I'll happily admit that I know nothing about how ARM assembler works, I've never needed to, BUT I have enough interest in this crap to fondly remember RiscOS on the Archie (which ran on an ARM chip) from some oooooh 15+ years ago.
I agree that some bedroom coders produce crap but the degree-bearing ones aren't any better. I say this as someone who's self-taught and has just spent the whole morning fixing bugs in code produced by someone who has a degree. I am, however, in the process of getting a degree to keep HR departments happy. So far I've learnt that I don't like Java, but as you say that's not the point if you're being paid to produce it.
Oh I agree.
Both Bedroom coders and degree bearing gits can both produce crap code.
Universities today fail to teach the basics, even to their masters students.
If you can't think, you can't code.
While that sounds simple, you would be surprised at the number of stupid gits who come up with crap, or want to touch code just for the sake of getting their name and credit on it.
And yes, I've cleaned up enough crap from all types that I almost decided to call myself a sanitation engineer.
As someone with an IT degree who regularly interviews to hire developers, I can say that I don't personally rate IT degrees much at all.
I agree that a degree should be more than just about learning a subject; a true degree teaches you how to solve problems, not how to solve *a* problem (that's the job of GCSE).
My biggest issue is that IT degrees seem to be simplifying and removing a lot of the deeper knowledge that I require in all of my developers. I don't care if you only work in Java/.NET, if you can't discuss the heap and the stack at a conceptual level and you don't know what an "abstract" function does, you're going to need to blow me away with your understanding of your chosen framework.
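For readers who haven't met the concept, this is roughly what an "abstract" function means, sketched here in Python rather than the Java/.NET the poster has in mind: a method declared on a base class with no body, which concrete subclasses are forced to implement.

```python
from abc import ABC, abstractmethod

class Shape(ABC):
    @abstractmethod
    def area(self):
        """Declared but not implemented: subclasses must supply the body."""

class Square(Shape):
    def __init__(self, side):
        self.side = side

    def area(self):
        return self.side * self.side

# Shape() itself cannot be instantiated - only concrete subclasses can.
```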
I'd much rather hire someone with 3 years of industry experience than a degree, because if they can talk about it, they've gone out to learn it themselves. But, as a lot of people have said, it's very rare a non-graduate gets past the CV filter stage.
...and am self-taught and degreeless. I agree 100% with all your points about bad coding, unreasonable assumptions, etc - apart from one point. You generalise when you tar all 'bedroom coders' with the same brush.
I have worked in large IT consultancies for many, many years. As a child of the Sinclair home coder boom in the Uk, many of my peers (now in their late 40s) entered the industry in much the same way as I did - eg via open-minded employers. Once at management level I was able to interview for new team members, and the experience was enlightening...
I once had a young grad join my team (whom I did not interview), who was working on a simple VB UI for one of our products. He said his VB skills were good, so I gave him a small task involving writing a simple encryption key that we could use to ringfence certain functionality on a per-client basis. I roughed it for him and handed it over, only to have him come back to me at the end of the afternoon. He wouldn't believe that a XOR operation could reveal the original value of a previously-XORd number, when used with the original key.
I drew him truth tables and walked him through it. He still didn't get it. In the end I had to finish the task and moved him on to some form / button work.
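The property the graduate wouldn't believe takes only a few lines to demonstrate; a Python sketch (the original task was in VB, so the values and names here are made up):

```python
def xor_apply(value, key):
    # XOR is its own inverse: (value ^ key) ^ key == value,
    # because x ^ x == 0 and x ^ 0 == x.
    return value ^ key

plaintext = 0b10110010
key = 0b01101100

ciphertext = xor_apply(plaintext, key)
recovered = xor_apply(ciphertext, key)  # applying the same key again
assert recovered == plaintext
```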
Nice guy, but clueless. Out of curiosity I asked him what programming projects he worked on at home. He looked at me as if I were a weirdo.
Yes, some 'bedroom coders' with no access to the real world can make a real hash of things. Structured programming can be an alien concept to many, but I would still take someone who has real passion, and if possible has cut their teeth in assembly language over someone who (as this guy did admit) had moved into IT solely as a good earner.
Note: You can be a passionate home-coder AND a grad, of course! I never met very many, sadly.
I actually state that there are those who can teach themselves, maybe I should have stressed that point more. In fact I studied astronomy, so much of my computer science and coding skills is self taught. This is why I am still learning new things about core computer science.
I also notice there are huge differences between IT degrees themselves. Some degrees are much more oriented to learning to use available tools and languages than to actual problem solving. We sometimes get students from these courses applying for our MSc course in computer science. These guys know way more than our students when it comes to common tools out there. Where they have huge problems is in their problem-solving skills. The best acquire these skills; the rest flunk the course.
I've been hired by numerous degree students to write their dissertations for them, because they were too lazy and badly trained to write it themselves.
I think your example just demonstrates the point - that your recursive student developer was far more useful in the real world, the fact that he could prove his solution was more efficient should have amply illustrated that he had already learned what a recursive algorithm was.
IMO self-taught developers are ALWAYS more skilled than those with formal training or education.
There is generally much less time for pointless exercises in navel gazing in the "real world" - like insisting that something be recursive when there is no need for it.....
> and have seen quite some horror's produced by various "bedroom coders"
Similarly, I've seen some total dogs' breakfasts served up by formally-taught coders. Including academics.
> I told him of course he could hand his assignment in in C#, so long as he did not
> mind failing the course
Was this a Java course, or a programming course?
> He found this unreasonable
As do I, unless it was specifically a Java course.
> I suggested my attitude reflected that of a potential employer or customer
My experience of both employers and customers is that almost all are entirely amenable to the programmer picking the tools to solve the problem, so long as he can justify his decisions. I would suggest that your attitude is representative only of a few dogmatic employers.
 I had one University lecturer who was introduced to us as a real high-flier, who wrote code so complex that computers weren't available that could do his code justice, so he set up a company to build his own computers. When I later went on to work for the company that supplied him the processors, he was known as "that nob that can't write code to fit on a sensible number of cores"...
Your first point made sense, then you kind of threw it all away.
The vast majority of software employment is maintenance - an application already exists, it must be extended to add new features and bugfixed to correct the mistakes and incorrect assumptions made in the past.
Do you think you can choose to write your new features and bugfixes for a C# program in Java, or a LAMP stack in HLSL?
There are only two situations where it's possible to choose the language and framework for a task:
A) It's a brand new project, and will not be using any modules that company created for previous projects.
B) The project is to port an existing product to a new language or framework - in which case the programmer rarely gets to choose because they're being employed/directed to do that particular port.
In all other cases, the language and framework are pre-defined by history - either by the modules you already have or the program it is extending.
Even in case A), any software company will have a set of preferred languages and frameworks, and as a programmer you will write in the one chosen for that project.
In a good team you'll be able to influence which language and framework are chosen, but at the end of the day it will be selected and whether you agree or not, you will write that project in that language and framework.
Your post implies that you've usually been writing-from-scratch on your own, and that's not the way most people earn their corn.
University education is often the first time people are held to a specification - coding at home you can make the program do whatever however you like, but in the real world there are specifications to meet and other people and teams to work with if you want to get paid.
It's pretty rare for the requirements not to define the OS(s), language or framework.
> The vast majority of software employment is maintenance
OK so far.
> Do you think you can choose to write your new features and bugfixes for a C# program in Java
The answer, as always, is "perhaps".
What I am advocating - what I am always advocating - is using the right tools for the job.
If that means switching languages within an application - so be it.
You should notice that, in my original post, I made the point about "so long as he can justify his decisions". A change in language mid-project is going to cause ructions; if a programmer can justify that change, then he has chosen the right tool for the job. If he cannot justify it, then he has not.
> There are only two situations where it's possible to choose the language
No, you are incorrect here.
Most toolchains have the ability to export symbols to other tools. Mixing languages within a project is hardly uncommon.
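As a small illustration of symbols crossing a language boundary, here is Python's ctypes calling a function compiled by a C toolchain. (This sketch assumes a POSIX system, where `CDLL(None)` exposes the symbols already loaded into the process.)

```python
import ctypes

# Load the C standard library already linked into this process.
# On Linux/macOS, CDLL(None) hands back the process's global symbol table.
libc = ctypes.CDLL(None)

# Call C's abs() from Python: a C symbol consumed by a different toolchain.
result = libc.abs(-42)
```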
> Your post implies that you've usually been writing-from-scratch on your own
Not so. Your post implies you haven't seen the mixed environment that frequently occurs in large projects.
> It's pretty rare for the requirements not to define the OS(s), language or framework.
I would argue the opposite; it's quite rare, at a requirements capture level, for those to be defined. Once the detailed design is completed, some or all of them may have indicators of varying strengths, but only a truly closed mind would refuse to consider alterations if a proper substantiation is supplied.
I spent some years writing quick and dirty 'job doers' when problems needed solutions yesterday. Yet your account struck certain chords with me. The candidates with whom you crossed swords were clearly not cut out to succeed, yet they had reached a university; I find that worrying in itself.
With a tiny bit of thought (gosh, that is a rare concept) they would have realised that they could do this and pass, or do that and fail. However, that would still only prove they can find the way through a simple maze to reach a self-interested goal: do right and pass, do wrong and fail - a pretty straightforward logic gate to me.
My experience was of graduates overseas who replaced the qualified, time-served trainees. As electronics technicians with a book they were possibly OK, perhaps even better than OK. Set them to fault-find real equipment problems, though, and you could walk away; it was going to be a long wait for progress, and you might need not only a good meal but a bed for the night.
An IT graduate was set a straightforward process to create. Three months later: masses of code and no output. We booted him off and another practical chap did it in six weeks while still doing his day job. I guess the first one went to the NHS systems development team.
Time served practitioners at least know how to approach their work, possibly the questions to ask and the answers they need to progress the job.
"He said but my implementation is more efficient, we said that was true, but that the assignment was to learn recursion."
If he worked for you, and you worked for me, I would quickly have given him YOUR job, because efficient code is required to make or save money. I want programmers who generate results, not ones who follow the same old broken cookie cutter.
Schools often teach formalized ways of doing things, but they are not usually practical in the real world.
I started programming in primary school, went through secondary school taking every IT class they had and breezed through it. Didn't bother going to uni. Got a job developing at a private investment bank on the strength of my predicted grades (didn't ever finish or hand in any coursework), and doubled my salary every 12-15 months until I was 22, then quit full-time work on a salary enormously higher than any graduate was getting, and started freelancing for even more.
Aside from being up to date with the latest frameworks and teaching techniques, most uni graduates are hopelessly inept at solving real-world problems until they've been working in the wild for a bit. I'm not saying they aren't smart people - some of them are very smart. But having a degree is no yardstick for how competent somebody is at solving a problem or applying abstract thinking through code.
I gather that the guy who handed in the iterative solution to your task may not have got the top marks, but probably went on to earn much more than your teacher's salary?
"Another story I like is the guy who handed in an iterative solution where the assignment explicitly stated: "implement a recursive method to compute ....." He argued this was more efficient, we said that was true, but that the assignment was to learn recursion. He said but my implementation is more efficient, we said that was true."
The guy has the right attitude: simplicity and effectiveness. Programming is all about solving the problem, and if the problem can be solved in a way that's simpler and more effective then it's a better solution, and forcing someone to take an approach that's not that way goes against good programming.
I suggest that what you do is to find a problem that can't be done without recursion. The classic one is a product/part explosion where product A is made up of products B and C, B is made up of D and E, C is made of F and G and so forth. Then get someone to provide a list of parts and a total cost (or something like that). You can't solve that without recursion.
And please teach people about trapping situations where the database gets in a mess and A is made up of B and C, and C is made up of A and D. I've seen that a few times in the real world and the program just sits there constructing parts until a stack overflow occurs.
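The part-explosion problem and the cycle trap described above fit in a few lines. Here's a hypothetical Python sketch (the BOM data, part names and costs are all made up for illustration) showing both the natural recursion and the defence against a corrupted database:

```python
# Hypothetical bill-of-materials: each assembly lists its direct
# sub-parts with quantities; leaf parts have a unit cost.
BOM = {
    "A": [("B", 1), ("C", 2)],
    "B": [("D", 3), ("E", 1)],
    "C": [("F", 2), ("G", 4)],
}
COST = {"D": 1.50, "E": 4.00, "F": 0.25, "G": 2.00}

def explode(part, qty=1, path=()):
    """Total cost of `qty` units of `part`, recursing through sub-assemblies."""
    if part in path:  # e.g. A -> C -> A: the database is in a mess
        raise ValueError("cycle detected: " + " -> ".join(path + (part,)))
    if part not in BOM:  # leaf part: priced directly
        return qty * COST[part]
    return sum(explode(sub, qty * n, path + (part,)) for sub, n in BOM[part])
```

Carrying the path of assemblies already on the stack is what turns the A-is-made-of-C-is-made-of-A loop into a clean error instead of the stack overflow described above.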
I taught computing too (OK, it was one of those storefront Ponzi schemes you see advertised during the day: "train to become a medical receptionist, cosmetician, security professional, computer programmer...").
I did teach my students the basics (the 3 operations: sequence, iteration, decision), file systems (flat, indexed, random) etc. etc.
There were a couple of students who were really serious, and I gave them the best advice I could: work on typing/data entry and RPG, and try to get a data entry/operator position at a small company with an S/36 or S/38, then work your way up because you know RPG. IBM had just come out with OS/400 and there were lots of opportunities.
but there is flexibility in how you get it.
Degrees don't guarantee good performance in any field, nor does exuberance. We all know self-taught programmers who are good and well-papered people who are crap, but those are the exceptions.
What we do know is that if you go looking for good programmers amongst those with CS degrees then you have a far better chance of finding a good one than if you take random candidates from the pavement.
Ours is an industry of change and you need to work hard just to stay current. Good programmers invariably keep reading and keep experimenting through their lives. Those with a piece of paper see the degree as just a start and not the end of their education.
Contributing to an open source project from your bedroom is fairly hard graft:
First you have to hang out on a message board for months, then you've got to get to know the code, which will take forever, then you'll be asked to justify whatever changes you've made (this is quite tricky when large egos are involved).
All this and you have to first learn a programming language in the first place :-)
I'd say this is harder than the three years of my Electronic Systems Engineering degree - and in the end it was the bedroom coding experience in TCL (of all things) that landed me my first job.
...but I have found that most of the good developers I've worked with did their degrees in electrical engineering, not ICT.
--- this is not relevant to the previous post, just being lazy and tagging this on ---
Nothing annoys me more than the "a degree proves you have learned to learn" line, or the cobblers about university being about "gaining life experience" that so many academics witter on about.
If modern A levels were anything like as difficult today as they were 20-30 years ago (when very few would even attempt more than 3 because of the workload required), they would prove you had "learned to learn", because if you hadn't, you failed them. Also, nothing provides better "life experience" than getting out of school, getting a job in the real world and working with others, but most academics in teaching have never done that, so they wouldn't know.
but I seriously doubt I'd consider hiring a developer who didn't know the difference between the word 'waste' and the word 'waist'.
It would make me think that attention to detail wasn't your strongest area, and I probably wouldn't get clear and unambiguous documentation out of you.
My university course was, other than a few key modules, pretty damned useless. Almost everything I do as a developer involves stuff I taught myself or acquired on the job. I've since encouraged people younger than me to consider whether university is necessary for their career, or whether they would be better served spending that time getting direct work experience and skipping the large amounts of time and debt involved.
Personally, I'd have 3 of those 4 years back in a heartbeat.
> My university course was, other than a few key modules, pretty damned useless.
Much of mine was worse than useless.
There were a number of us on my course who had been software professionals before going to university - in those days, software was still largely seen as a new discipline, so people without formal qualifications were routinely found throughout the discipline.
There were several stand-offs where an academic tried to push through principles that we "would need in industry" which were somewhat different to what we were using in industry...
 My favourite example was when one guy was trying to get us to favour recursion over iteration. His argument was that the formal proof of a recursive function is easier than an iterative one. That's true, IMO - and entirely irrelevant; what's vital is that the tiny embedded system you're programming doesn't eat its entire stack in the first operation. That it is harder to prove to be correct merely means that the coder has to work harder to do so...
Sure, you need to be careful of stack space. That still doesn't mean recursion isn't the right solution though, how about tail recursion? Works fine for many problems, is amenable to formal proof and doesn't consume any stack space.
If you've thrown out recursion as a concept then you're never going to find that optimal tail recursive solution if it exists - of course, an iterative solution might be better still depending on the specific problem.
Arguing that iterative is always better than recursive is stupid - arguing that one is often better than the other is defensible.
Incidentally, the formal proof that your algorithm is correct is very important in some industries, while at the same time being almost irrelevant in many others.
Just because a given concept appears pointless for the work you're doing right now, doesn't mean it's pointless for every job you're ever going to do. Put that idea in the back of your brain and sooner or later you'll find it's part of the best solution for something.
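To make the tail-recursion point concrete, here's a small sketch, in Python purely for readability. Python itself does not eliminate tail calls, which is exactly why the mechanical rewrite to a loop is shown alongside:

```python
def gcd_rec(a, b):
    """Euclid's algorithm, tail-recursive: the recursive call is the very
    last thing done, so no per-frame state needs to survive it."""
    return a if b == 0 else gcd_rec(b, a % b)

def gcd_iter(a, b):
    """The same function after the mechanical tail-call-to-loop rewrite:
    rebind the parameters and jump back to the top."""
    while b != 0:
        a, b = b, a % b
    return a
```

In a language with tail-call elimination (Scheme being the classic example) the first form already runs in constant stack space; elsewhere the second form is the safe translation, and the two are straightforwardly equivalent, which is the formal-proof attraction.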
> If you've thrown out recursion as a concept
I hadn't. No-one had. I didn't mention anything about throwing out recursion.
What I mentioned was an academic who was trying to throw out *iteration* because it made the formal proof harder.
Please tell me you can see the difference between those situations...
> Arguing that iterative is always better than recursive is stupid
So it's a good job no-one did that, then.
Would you also consider it stupid to argue that recursion is better than iteration? That's what this lecturer did, and I consider that very stupid.
> Incidentally, the formal proof that your algorithm is correct is very important in some industries
Yes, I've worked in those industries. I understand the need for formal proof. However, the point of my anecdote was that this guy was insisting on an extremely sub-optimal implementation just to make that proof a little easier to do. That's just laziness.
Please respond to the post I made, rather than making up what you thought I said...
It sounds (reads) as if most of you went to a university when you should have gone to study City and Guilds or NVQs or some sort of apprenticeship or whatever the modern job training system is. A good degree, with the exception of certain professions, trains the mind. Even medicine and law top up the academic education with practical, work related training.
As an employer or manager, one of your most important jobs is to teach new staff the reality of working and the new skills required for that, not to make smug observations or challenge them with your pet programming job, probably badly specified because you never learnt how to specify and document.
The two worst types of horror I have experienced are the Windows and the Linux "hacker". They get something working; but it costs a fortune to correct, maintain and understand for ever afterwards, using all the tweaks available only in their pet home environment. I have suffered these opinionated fools in three countries. Aaaaaaarrrrhhhh!
> I've run into several University grads who are useless as programmers.
When I was interviewing grads for a large software operation, I had several candidates who decided I owed them a job because they had Firsts.
It was quite amazing how quickly some of them lost the plot when presented with some fairly straightforward coding problems.
One guy flatly refused to answer my questions when I gave him a design problem involving what he considered to be a mundane piece of household equipment - he thought it was beneath him. I wanted to see how he would go about designing an embedded computer system, and the application is entirely irrelevant.
Education is for the benefit of the student, not employers.
Study something you enjoy learning about. Your tutors are there to teach you how to learn.
I don't know what they teach these days, but I did computer science, which covered all sorts of things such as AI, relational algebra and database theory, super-computing theory, and 3d graphics. The programming was mostly incidental to learning other things so we did Pascal, C/C++, Ada, Lisp, Miranda and so Forth (geddit?). Sadly no cobol, as that's a real money spinner...
Anyway, there wasn't much that I've seen in action since or used in anger. All I use now is what I learnt at o-level - binary/hex/decimal conversions and logical operations (used for IP address calculations).
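For what it's worth, those o-level basics really are all an IP-address calculation needs. A quick Python sketch (the addresses here are made up):

```python
ip, mask = "192.168.13.37", "255.255.255.0"

def to_int(dotted):
    """Pack a dotted-quad string into a 32-bit integer."""
    return int.from_bytes(bytes(int(octet) for octet in dotted.split(".")), "big")

def to_str(n):
    """Unpack a 32-bit integer back into dotted-quad notation."""
    return ".".join(str(b) for b in n.to_bytes(4, "big"))

network = to_int(ip) & to_int(mask)                  # AND with the mask
broadcast = network | (~to_int(mask) & 0xFFFFFFFF)   # OR with the inverted mask

# to_str(network)   -> "192.168.13.0"
# to_str(broadcast) -> "192.168.13.255"
```

Binary/hex conversion plus AND, OR and NOT: nothing a degree course adds to that.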
Having been in a position to say "been there, tried that and failed miserably", I can tell you that without a degree you stand a snowball's chance in hell of getting an IT job. Employers won't take a second look at IT coders/admins without a degree, because the vast majority of people in the industry do have degrees and the cost saving is negligible.
Nice idea, except a large number of job adverts specify that you must have a degree, some of the higher paid ones insist on a 2:1 or better. Also I'm sure I saw a games job advert that required a degree. You also see this "Must have a degree" for php jobs which isn't as far as I know taught at unis.
... is that few "traditional" forms of education are equipped to handle the modern world ... and even fewer "cutting edge" kids are equipped to handle the real world.
It's probably been that way since the year dot ... but with today's connectivity, it's becoming blindingly obvious that humanity needs to step it up a notch. Or two. Allow, and indeed encourage, tool users to use tools ... it's what made us Human in the first place ...
Education is a racket to create jobs for the educators.
In principle some people could benefit from a specialised and abstract education rather than on the job apprenticeship. In practice academics produce more academics, advancing an academic career and being useful in the real world grow ever further apart. Bright people with a passion for whatever subject will teach themselves, the dullards will game the system and get bits of paper. The world is full of the dullards trading bits of paper among themselves.
Started in computer operations in 1973, went to university in 1995 because no one would look at my CV without a degree. Now I have a PhD and work as a consultant.
... but try getting that past the HR fucknuts who seem to think a degree is required to take a shit unsupervised...
Not having a degree has put me out of the running for so many jobs, despite having the ability to do them blindfolded and with one hand tied behind my back.
Just ignore the "must have a degree" and apply.
My last 4 employers (that's over a decade's worth) all stated "must have education to degree level" or similar in their job ads, but that didn't stop me getting to the interview and being given the jobs.
Perhaps the fact that I ignore all the CV-writing recommendations (from the educationalists) and put the totally irrelevant "what I did at school" stuff on the back page of my CV is what helps. The HR staff probably can't be bothered to read more than the first paragraph.
When I was made redundant in May 09 (the whole company has gone now) I didn't work for about 6 months. That was because I really didn't want to start commuting to London again, but there just were no jobs in Kent I wanted (agencies kept offering me HTML/Perl, but I do C/C++), and I definitely was not looking to move house. After 6 months and a brief contract period, I gave in and decided I'd have to include London: I was back in full-time work 3 weeks later.
"45 per cent said they feel a degree in computing is no longer valuable for securing a career in IT"
SECURING a career - that's different to getting your foot in the door in the first place.
It is also a different mindset between "learning to code" and studying for a degree - it's not a bl**dy 3-year training course!
...that the little kiddies are sold the lie (by those in education mainly) about a degree getting you a well paid job!
Most degrees are worthless to employers unless they are specific to real-world tasks, like medicine, law or real engineering (civil, electrical, ...).
In most cases (IT included) both employers and employees would be better off if companies provided real apprenticeships. For starters the apprentices would be earning some money not getting into debt, and they would also learn from people who actually know what is required in the real world so would be learning useful skills.
I believe anything else should be regarded as Hobby courses and should be paid for by the person taking it.
Ironically - when I was "straight out of Uni" (11 years ago) - every job advert said 2 years experience; my first job out of Uni - web designer on £13k a year ... which I was happy to take as I knew it would start to get me that experience.
If I'd clocked up £27k worth of debt going to Uni now - I'd be less than pleased with having to have chalked up that massive debt for that wage.
£27k debt for a £13k salary is not a great deal on the face of it, but let's not forget that you won't have to pay a damn thing back until you earn quite a bit more, and if you're good with your cash then you can repay early to reduce the debt.
I'd take the deal if I didn't have two kids and their mother to support.
Don't get me wrong - I'd still have taken that £13k p/a job; it's just that you're sold the "get a degree - earn more £££s" line all the time - the reality isn't quite like that and if I'd bought into that line to the tune of £27k debt that reality would have been a bigger kick in the nuts... not to mention that your "quite a bit more" figure is going down by the year...
When I went to Uni (first year with loans and we had grants as well and no tuition fees - kerching!) the amount you had to earn to _begin_ paying your student loan back was about £28k p/a IIRC which is adjusted for inflation every year so it's always been more than my salary... and at such a low interest rate I've never bothered to pay it back (it's actually at -0.5% atm so it's paying itself off without me doing anything) - financially it makes better sense to put the money into an ISA or higher interest e-banking account until such time as the interest rate on the loan is higher than it would be on the ISA (or you're earning "too much") and then just pay the loan off in a lump sum; student loans were only about £1500 p/a then mind as they didn't have to cover tuition fees.
If you go to Uni now it's what, £18k before you have to start paying your loan back? Less?
But every job I've been self-qualified for and interested in says "Graduate ....."
Write a covering letter explaining why you are right for the job. Don't whine about how life is unfair - a prospective employer just doesn't care. Tell him why you are going to make him rich. Convince him that securing hot-and-cold running blowjobs for the rest of his career depends on hiring *you*.
This is an example of solving a problem using the tools available. It's what we programmers are supposed to be good at.
As an employer, I'm quite lenient about qualifications; I'm more interested in proven ability and I'm prepared to do some of the recruiting myself - but that puts me firmly in the minority. If you don't have a computing-related degree, you WILL be at a disadvantage in the current market, compared to those who do.
It's not just employers at fault. If it will save them some effort, recruiters will tend to set "silent" additional requirements of their own, and the ones who use automated CV filtering a lot - that's most recruiters, remember! - will have dropped a lot of non-degree-qualified candidates before the prospective employer ever gets to see the list. And that can happen even if the employer explicitly stated they were prepared to consider non-graduates.
So I'd take opinion polls of potential candidates along with a couple of packs of Maldon's finest, because the candidates only see half of the picture, at best. I'll put this to the nay-sayers: just for giggles, when you're next looking for a job, try putting together a CV replete with your top-notch job history, but miss out the qualifications - and see how far you get. You might feel differently after the 50th "Unfortunately, on this occasion..." response.
>> You might feel differently after the 50th "Unfortunately, on this occasion..." response.
Some of us would consider it good going even to get an acknowledgement of the application; actually being told you were unsuccessful would just be the cherry on top. Would it be so hard, or cost so much, to send out the standard "your application is being considered" and "sod off, you're not the droid we're looking for" letters? Sadly it seems that standards have dropped so far that many agencies can't even be bothered with that.
Now, I'm off to polish up the CV, announcement of impending short time and/or redundancies at work :(
AC because some people at work read ElReg.
having gone through two months of my previous work slowly going out of business, paychecks being missed, etc., I know what you're going through. The best bit was definitely the rejection letter from ARM I received a full year after I'd applied (by which time I was luckily at my new work). I'm not sure why they even bothered at that point.
Bedroom coders are probably doing it because they are interested, and in my experience passion for a subject usually makes you better at it than someone who is doing it "because my careers teacher said there were good-paying jobs in IT".
I'm not sure how people select uni courses these days... presumably not always for sensible reasons. Only this week I was at a meeting where a pretty intelligent young chap said he was going to uni to study 'x'. He said "I don't know what 'x' is, but I guess I'll find out soon".
PS. I may be biased...I was a bedroom coder who used my interest in IT to make a 30+ year (and counting) career in IT.
Paris...because she's passionate.
I disagree with the survey. I started coding around age 10, but studying formal methods, functional programming, etc during my CS degree really made a difference to how I _think_ when I code.
I've seen lots of code written by self-taught coders who learned it on the job (or incidentally while doing some other degree) and they don't remotely compare to those who've actually _studied_ computer science.
We do teach computer science. There are IT courses taught by the Hanze University of Applied Sciences next door, but we, and all other traditional universities over here still teach computer science. However, very few of our graduates ever become coders. They become researchers (in academia or industry, a lot of ours go to companies like Philips Medical Systems) , or become software engineers and architects (OK, they are also involved in coding, but cost a lot more ;-) )
I've been interviewing people for system design/development positions for over 15 years. I've watched standards decline over that period to the present point, where it seems that the universities have decided to produce the graduates industry thinks it needs rather than providing students with a strong theoretical background that will be useful throughout their professional lives.
Trivial example to prove a point. Try asking a recent graduate to sketch out pseudo code for quicksort. Assuming they understand what you're asking, you'll find fewer and fewer able to do this adequately. Often it will be pointed out that the current framework du jour provides sorting as standard so they don't need to know "this stuff". Which is all well and good until you're working on a micro-controller and you need to write the code from scratch in assembler.
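For reference, the sort of whiteboard answer being asked for above is only a few lines. A Python sketch, deliberately the simple not-in-place version rather than production code:

```python
def quicksort(xs):
    """Textbook quicksort: pick a pivot, partition the rest around it, recurse.
    O(n log n) on average; this simple version allocates new lists rather
    than partitioning in place."""
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    return (quicksort([x for x in rest if x < pivot])
            + [pivot]
            + quicksort([x for x in rest if x >= pivot]))
```

A candidate who can produce this shape, and then discuss pivot choice and the in-place variant, has shown exactly the "stuff" the framework du jour hides.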
I know what you mean, but I would recommend trying different languages as an even better way of expanding the way you think about code. Use C/Java/C++, but learn enough to do at least simple programs in Smalltalk, Lisp, Forth, Io, Ruby, Assembler, Prolog, Haskell etc. Even if you never get paid to program in these languages, it will make your code better and you more adaptable.
I completely disagree with your general claim that people who have studied computer science by doing a degree in it are in any way superior coders to those who have studied it because they have a love of it. There are *plenty* of computer science students who have zero real interest in the subject, and it shows very clearly in their code. A degree is not any sort of certificate of ability in the subject, only of the ability to get through university.
Totally agree about expanding the program language repertoire ...
My interest in computing got serious when I was at university (doing a degree in Art *shrugs*) and the web was just beginning to really enter the public consciousness; before the original .com boom - I wasn't about to try and switch course so I started to teach myself because it fascinated me - being able to make something, in a computer, and see it there, on the screen - upload it and everyone can see it...
So I started with the very basics for the web, HTML - added in CSS to simplify things and begin separating the semantics from the style... added JS to make it more interactive... added PHP/MySQL to make it easier to maintain (not having to upload files over FTP; just very basic CMS-type stuff initially)... then Perl and so on throughout my degree and well into my working career.
Started learning Java in my spare time and OOP suddenly clicked; even more so with C# as that "feels" like a better language to me and I find it much easier to "get along with" than Java - I think that the PHP code I use in my day-job is now MUCH better because of my experiences with Java and C#. I don't try and shoehorn PHP into a strictly typed language (it isn't one) but I do much better understand OOP principles (abstraction, interfaces, patterns ... ) and can better apply them to PHP.
All well and good
But you don't start off with experience; it needs to be acquired, and you can't acquire experience because no employer will give you a job due to your inexperience.
Tell me how my generation gets its foot in the door, when every job application seems to require experience, and not only that but most places demand that you have at least a 2:1 (which I have).
The only people who say experience is all you need are those that already have it.
Which is exactly what I did for my BSc Soft Eng.
When I graduated I too found that despite the qualification most employers wanted two years experience.
Of course the company I did my sandwich course with hired me two months later. I worked there for 5 years, from code monkey to senior developer. Just don't screw up your year in industry :)
> Tell me how my generation gets its foot in the door, when every job application
> seems to require one
The thing you need to realise is that something like 60% of job vacancies are never advertised.
If you're having difficulty getting into a career - change your approach. Find people who can recommend you from the inside of a company.
This will undoubtedly require some hard work on your part. Them's the breaks.
I work with a lot of bioinformatics people, many of whom come from a biology background and are self-taught on the programming side. Some of them can code properly, but many of them are worse than useless - for example producing millions of files and cocking up the network filesystem for everyone, instead of taking half a day to learn to use a bit of SQL and put it in a database table.
Firstly, I don't have a degree in computing of any sort. Secondly, I've got about 12 years experience at all levels.
Over the years I've employed developers with and without degrees, and to be honest the ones without degrees tend to be better. Their skill set tends to be more up to date, and they tend to be able to think their way around problems. Graduates tended to be able to do things in only one particular way, and that way was often two or three years out of date.
This is changing now though. We're seeing graduates who've got themselves some work experience in a real company, learning that a developers job isn't always just sitting down and typing away in your preferred language, coding what you decide you want to code. There's a whole lot more to making yourself employable than just getting a degree. We've had graduates in the past that turn up late, leave early, dress poorly, refuse to answer phones or doors, talk loudly and wonder why they don't last through their trial periods. We've also had bedroom coders with the same problems, but they tend to not think that the job should be theirs because they've got a bit of paper.
But it is changing, some Universities are realising that the degrees they were offering weren't making their students employable so they've started working with companies to give their students "real world" experience.
For me personally though, do I wish I'd gone to uni and got a degree? No. Back then there were no degrees in what I now do. What I could have done would have been a waste of time. IT moves fast, and thankfully the unis are starting to catch up to working at that speed.
Do I think that students deciding whether or not to go to University should go? Yes, but choose *very* carefully which University, and which course you go to. If you can, talk to the people at the sorts of companies you want to work for and find out what *they* think of the degrees. And if you do go, get involved in as much of the social aspect of the computing courses as you can. I still think you learn just as much from talking to other developers as you do from text books.
I've seen many rubbish coders, some had degrees, some PhDs and some didn't have any higher qualifications at all. I've seen some great coders, again, some had degrees and some didn't. From my experience, having a degree is no indication of coding ability. Either you get it or you don't. There's an interesting paper at http://www.eis.mdx.ac.uk/research/PhDArea/saeed/paper1.pdf.
I think you are quite right: bedroom coders often lack a proper foundation. They probably won't even have heard of Knuth or Fred Brooks, and have little idea of the history of software (quite important if you're not to reinvent the wheel - sometimes in square or pentagonal shape).
Trouble is, most employers know just as little about software and software engineering. They hire people who have superficial coding experience, cut costs, set unrealistic deadlines, and then are surprised when their software is bad or doesn't even work at all. It's a vicious circle, which will be broken very gradually as people who do know what they are doing get into positions of responsibility and start hiring good experienced programmers.
I don't have a degree, but did go to uni for a couple of years on an electronic engineering/comp sci related degree. The issue that I have is that many people without degrees don't really understand what's going on under the hood, or have fully rounded knowledge of the systems. However, conversely many people with degrees don't understand various areas in business which you don't learn at uni and can be equally important.
....and who were told to get off our 18 year old arses and "get a bloody job!" This can't be said enough.
Thanks to Blair making degrees the equivalent of the maths/English GCSE, those of us of a certain age with no degree (as it wasn't expected of us) but 20+ years of solid experience can't get a bloody job worth a damn.
I can list all the IT skills I have, all the major IT projects I've delivered on time and on budget etc. etc., but as I don't have a crappy medieval studies degree I can't get an interview.
I do partly blame the dumb-as-ditchwater HR depts too. When you read the job description/requirements it's bloody obvious they have just listed every IT term they could find in a computer magazine. It's almost like you have to have been genetically created for that very job.
Haven't ever used any of the so-called Computer Science they taught us. The one course that might have been useful was the database design one, but it was boring as hell so I tended to stay in the flat playing Fallout and StarCraft instead of going to the lectures. Even then, I've picked up all the database knowledge I've needed through Google when a problem has presented itself.
The IT world changes so much that by the time you leave uni, what was useful when you started is going to be old-hat and out of fashion. So there's no hope for whatever topics they put on the curriculum 5 years before you signed up, unless they constantly change it every 6 months.
Having worked in IT for a number of years I often come across the 'I taught myself' type of IT 'professional'. Unfortunately these people are so far up their own backsides that they fail to see that despite doing something for the past 20 years they have been doing [insert sys admin or programming skill here] wrong or badly for all that time. Not their fault, they haven't been formally trained in the subject but they fail to see this.
Having more formally qualified people in the field should gain real IT professionals more respect; would you want someone designing bridges or buildings to have taught themselves about engineering in their bedroom or rather have them complete a 4 year degree in the subject? Designing large computing systems is the same nowadays and it's these bedroom cowboys who are the reason for so many failed IT projects.
As someone with nearly thirty years in IT, I have managed to progress pretty well without a degree and recently joined one of the top consultancies without issue. Anything I could possibly have learned at university in the early 1980s would be somewhat useless today.
And when I recently tried to recruit someone whilst at a major UK retailer, the internal HR department were shocked when I told the assembled recruiters that a degree was not a prerequisite.
But I'm certain I am an exception. If under 30 and wanting to work in IT, a degree of some kind is vital to even be considered - at least if one is applying for anything more senior than 1st line support.
I am of the ZX81, 6502 machine code generation. I passed computer studies O level where you had to actually write code.
I have 25 years in the industry and no degree. I've worked first line replacing broken disc drives and unjamming printers. I worked on small database developments pre MS Access (Paradox, DataEase) and on large multi gigabyte Oracle databases. I now specialise in BI solutions on large DBs. I have experience, a proven track record and good references.
HR departments that receive 200+ applicants have and do throw out my CV for not having a degree. Not only as an easy way to filter down the numbers but also to avoid accusations of some form of *ism. If you can show you have applied objective criteria it is easier to defend a decision.
I have managed to get a number of jobs through word of mouth, on more than one occasion having been rung up and advised to apply. That simply does not happen today except in small companies.
Funny - the Romans seemed to manage to build things without bits of paper using some kind of strange apprenticeship scheme - the laws of physics haven't changed in the last 2000 years so there's no real reason why an architect shouldn't be taught "on the job" (especially now that Physics is no longer a requirement for degrees in architecture).
However programming principles, designs and practices change frequently - a computing degree from 10 years ago is probably ONLY worthwhile as an indicator that the person has (or should have, it's not always the case) a good understanding of IT principles - if they've not kept their skills current they're probably going to be useless in a real-world environment - even with a degree you still need to keep learning on the job.
Though I tend to agree - we really need something like RIBA for software architects (RIBSA perhaps).
Sorry but from my experience that's complete BS. Most self taught coders are atrocious. They usually have massive gaps in their knowledge of basic practices and theory so tend to create bizarre overly complex solutions to solve simple problems. They also tend to be much more unwilling to learn anything new unless they discover it meaning they are a nightmare to work with.
Programming in itself is easy. Really, so is playing the piano. You point to any key or set of keys on the piano and I can hit them. Does this mean I can play the piano? Not even close. The same analogy goes for programming. Just because I can do an "if" statement, call a function or class, and/or increment a variable, it doesn't mean I am a good programmer.
The bigger issue at hand here is how one designs/architects one's code, in other words puts it all together. As a person with a degree, the big thing I noticed was that during the dot-com days too many people dropped out of college or just outright skipped it to jump into the game. Many of these people never learnt the discipline and understanding of software design/architecture that is required to write good software, and now many of these same people like to think that they are "experts" and deserve something special when in reality they stink as programmers.
My favourite quote is "I don't understand why my application is running so slow .... we are using a database"
Another non-graduate here and I'm doing pretty well after 23 years; however, that's not to say it wouldn't have been easier with a degree.
The apprenticeship approach is what's needed if things are going to be different, and that didn't really exist back when I started and it certainly doesn't now.
You can succeed without a piece of paper, but you need to be dedicated, not put off at the first 100 rejections and don't imagine that a big corp will be your first job.
It becomes much easier once you have some experience behind you and the playing field levels out.
For those starting out now, I have no idea how to advise them.
Yes and, at the same time, no.
Someone having a CS degree does not immediately make them a good developer, and I know many excellent developers who do not have CS degrees.
However, at some point the depth of their knowledge usually fails. You can't* talk to them about efficiency using O() notation, and they won't know GoF design patterns, or the ISO/OSI 7 layer model, what a B-tree is, etc etc
I've expressed this as absolutes, obviously some people will know more than others, and some very special people will know it all, but the bottom line is if you want someone with a grounding in computer science, the best place to look is at graduates from decent universities who have spent 3-4 years studying it.
If you're just looking for code monkeys to operate from a well produced plan and design, then bedroom coders can be fine.
I'm slightly jaded though, we had a bedroom coder come on and join us for a couple of years. The libraries he worked on work perfectly well from the outside, but the innards of the libraries are full of horrible indirection and NIH-inspired code, and are so unmaintainable that we've deprecated them and rewritten them.
Benefits of university over bedroom:
1. Experience of meeting deadlines and working under pressure.
2. Working in a team.
3. Exposure to ideas and avoiding reinventing the wheel. (Bedroom coders have access to the web, but who filters out all the crap?)
4. Experience of working to protocols and procedures.
Fees are too much, and not worth it at some unis. EU unis will benefit.
If you look at the competence of people in their jobs, a degree has little to do with it. There's good and bad people both with and without degrees. You only have to look at the people often provided by the top consultancies to see the 'quality' of many with degrees. Degrees have been devalued by every Tom, Dick and Harry taking them and are now of little use as a determinator. I've met many an idiot with a degree, whether IT or not.
I learned coding way before I went to uni and was pretty good with C++ when I started. I've sifted CVs before and I'm always going to pick someone with a degree above someone without AT ENTRY LEVEL. It has absolutely nothing to do with coding ability (my experience is that most academic institutions aren't very good at really prepping students for business-level coding [there's hardly any focus on standards, and they seem to currently be more focused on helping people who don't know how to code as much as those who do, which means advanced topics aren't as well covered]), but is down to 4 simple things:
1. That person has shown that they are able to stick at something and do well at it
2. That person has been placed (usually) in team project situations
3. That person has experience of working to a deadline
4. That person has experience of other tools and techniques than the ones that they have chosen themselves
If the person has relevant experience in other roles that mitigates this then I'll put them forward, but in terms of just choosing people based on CV for their first job in IT, if, for instance, I'm given a bunch of them and told to pick 5 for interview, I don't see much contest. After you've got some industry experience it's swings and roundabouts, but a degree still gives you a bit of a foot in the door.
You are entirely correct that self-learning with nobody to review your work can lead to ingrained, horrendous habits... BUT computer science courses are not about teaching students how to program - that's typically seen as too vocational.
The point I'll make is that someone with a passion for programming who spends their free time doing it for sheer pleasure, is probably preferable to someone who has never done any coding before university and expects to be taught it all.
The ideal combination is someone who has a lot of experience from hobby coding, and THEN goes to university to learn the theory side and learn better habits while they are still young enough to adapt.
I didn't go to uni but instead after my A-levels did my CCNA and MCSA (I'm systems and network and not programming) but it has never stopped me. Getting the first job was a bit tricky but I'd say only as much as if I was a graduate 3-4 years later. Most of my managers and upper management said they didn't care if I had a degree or not. I can do the job and I am approachable by people in the business; they are the two key factors. 6 years on, I think I can safely say it's not going to hold me back in any way.
'IT professional' covers a huge number of different jobs, but for developers I think that a good CS degree is a huge benefit. A good degree gives exposure to multiple programming languages, complexity and performance, low-level network design, CPU design, development methodologies and different programming paradigms (OO, functional, declarative etc).
If you teach yourself to code it's likely that you're missing out on all of those other lessons that turn you into a well rounded developer.
I'd add "5. Prepared to stick at something for at least 3 years".
I've worked with numerous people doing coding who haven't had a computer science degree... on the whole they've all been extremely competent at web and web-service type stuff which didn't demand complex designs or have problems of great technical difficulty to overcome.
When it's come to doing less conventional dev work, they've started to flounder. Their architectures tend to be mediocre, they've often got no familiarity with moderately esoteric things like function pointers or closures and they take a long time to 'get' patterns and idioms. Anything that isn't a straightforward coding exercise tends to be a struggle, because it is something they've not really had to worry about before.
If they're caught early enough and given a job in a decent team writing non-trivial applications, everything is great. But due to the difficulty of landing that first job without a degree, they grow up, their brains become less plastic and they end up with a load of ingrained atrocious habits.
That's not to say an appropriate degree washes all these sins away, but it makes a good foundation for a competent software engineer and puts them in a better position to answer the tough interview questions too.
When I take on PhD students for projects with a lot of coding, I always look for passion, for people who do extracurricular stuff (not just in terms of coding, mind you). One problem is that 3-5 years is way too short to learn to code REALLY well. You typically need ten years (as for any real skill). A Uni can give you the theoretical foundation, you have to build a house on that by work experience (or hobby).
A nice site on this topic is
I am a self-taught programmer who also got his diploma, and I agree the diploma doesn't help in terms of skills.
However, the comments about HR are true. The world's simply not fair and whining about it only makes the truth more painful.
Another scenario I saw more than once proves my point. The self-taught non-diploma guru reaches mid-life in a consulting company. He has a good reputation, maybe even manages some coders, but the new hires (with diplomas) make his salary starting out! This is especially true in sectors where jobs are contracted to the government. Allowable billed rates are based on measurable education levels, at least with the US government. Such people have a harder time going back to school because they have families, mortgages, and tuition is more expensive than it was 15-20 years prior.
Most "IT" degrees don't seem to be very useful. They add a lot of noise to the signal here - Yet Another Java Course often just indicates that someone was told that an IT qualification would automagically lead to a good career (poor naive sod).
A Computer Science degree should (SHOULD) be different. It's not about how to code, it's about how to think, devise algorithms, understand the business of abstraction. There are plenty of non-degreed coders who are very good at wrangling giant masses of braces and semicolons but shy away from (or just fail at) anything that looks like maths or devising new abstract algorithms to tackle new problems. There are plenty of CS degreed types who know the theory but can't code, but plenty more who can do both (if only a majority of code was written by these people!)
So the survey isn't really measuring anything useful (surprise surprise)
People with no degree who have taught themselves BOTH sides are like gold dust, but there are just not many around and they're not always easy to identify. (Plus there's no guarantee that these remarkable talents correlate with other abilities that make someone successful in the workplace, useful on a project, or a bearable colleague.)
I've worked with bedroom coders and degree educated softies, and both groups have their fair share of idiots and experts.
However - as an employer, I want someone who plays well with others, is in work in the same part of the day as their team-mates, who can communicate their ideas well, and leave some sort of paper trail as to where they've been. Solving the problem is not as important as solving the problem in a maintainable, repeatable, scalable manner.
There are certainly passionate, articulate, self-taught people who are a valuable part of a team. That's not the same as being a genius coder, and often orthogonal to having spent your formative years locked in a room by yourself. Unfortunately in the IT industry, the latter (great intellect, no social skills) are often confused with the former.
I would never reject someone who didn't have a degree on their CV. However in their interview I would be looking for evidence of a whole set of skills that the degree educated candidate would have less need to prove. Conversely, in a degree educated candidate I'd be looking for a spark of passion and commitment that is presumed in a self-taught individual.
True most of the time.
I have a degree but it isn't in IT. IT Degree courses didn't exist when I was at university (79-83).
I have a physics degree which has not been much help to me as a coder but at least it does help me get my sums right.
In all my years of doing this lark, the best coders I've met either had no degree or they had a degree in something totally unrelated to software design and engineering.
My first programming job was coding flight recorders in assembler -- and I learned this from my boss who was a woman with a degree in Eng. Lit. One of THE best programmers I've ever met.
Modern IT degrees do NOT teach real-life real-world coding or algorithmic problem solving skills. They appear to be little more than Java-Wonk factories.
Firstly, I think more people should look at doing the BCS Professional Graduate Diploma instead of a full-time honours degree course - it has the same academic standing (and qualifies you for doing a Master's afterwards), but you have the option of self-study as well as doing it through course providers. This allows you to pick up a stack of books, learn for yourself and sit the exams, spending £1k instead of £27k.
I left a (well-respected) uni Computing degree course because I wasn't happy with the quality of the teaching (e.g. near 300 students in one of the courses lectures so no one-on-one time realistically possible). I then worked my way up from tech support into programming, and I'm pretty certain that I've been passed over for a lot of the jobs I've applied for because I didn't tick that degree checkbox - perhaps half my applications, or something like that. Yes, I'm in a well-paid job, and I'm only just finishing up that above-mentioned PGD in IT over 10 years on, but at the same time it has sometimes made things harder.
People should bear in mind that during in-demand periods you can get a job with less experience - almost none during the dotcom boom, even - but when there's a crunch and more redundancies, then that missing degree may be a much bigger problem. Also, having come back later, some of what seemed irrelevant at the time made more sense, and I knew where I had gaps that were filled in. So yes, a degree prepares you for work (to some extent), but there are areas which you may not cover if you're just learning on the job.
You can write both good and awful code without any formal training. The formal training aims to give you a better understanding so you're less likely to (not that this always works as my recent interviewing of candidates for my employer showed). That said, I'd certainly take a bright enthusiast over a less bright and interested degree-holder, as the field moves fast enough that an interest in ongoing skill development is essential. What we could probably do with are more apprenticeships, where people study towards exams like the BCS's while working for a company, and get a balance between real-world experience and academic knowledge, while also earning a modest wage. Bring on the IT Apprenticeship!
I hire a shed load of IT guys. I am an IT guy who is now a manager, so I can see both sides, and I know full-well that bedroom coders often outshine their uni colleagues in terms of pure coding.
BUT, bedroom coders vary massively. Some are brilliant. Some are truly appalling. Uni coders usually meet a minimum standard - even if it's not great, it's nearly always good enough. I can't afford to interview 10 bedroom coders to find an outstanding one, when I can interview 3 uni coders and find a good one.
To put it bluntly, however easy it may now be to get a 2:1 in an IT degree, it is always going to be harder than just typing 'I like coding in my spare time' on a CV.
I also find that bedroom coders have more 'personality clashes' with co-workers, and often aren't great at dealing with an often non-technical customer. The uni coders seem more rounded - possibly because their courses didn't focus solely on coding, possibly because they're the type of people who chose to get out and go to Uni and meet people face to face, rather than stay at home in the dark. I don't know, but that's definitely the pattern I see.
(And don't flame me with "I'm a bedroom coder who gets out" posts. I know many bedroom coders can't afford to go to uni, or don't feel it worthwhile. But many others I have interviewed clearly didn't like the human contact at all. I'm sorry, but that's just the way it is.)
P.S. This is just for first time job seekers. Once a candidate has loads of experience and good references, the education takes a back seat.
I know it's possibly a stupid thing, but if you're considering moving to the US and applying for a Green Card this does make a fairly considerable difference to the time it takes to process an application... right now the difference between an EB-2 and an EB-3 is about 6 years (with the quicker route being for someone with at least a 4-year degree).
Ironically after 20 years in the industry coding everything from 6502/68000 assembler to COBOL, CASE tools to 4GLs, designing and implementing large scale projects my application is on the slow path... because all I had was an HND in Computer Studies
Just goes to show that there's still a disproportionate value attached to that bit of paper even half a lifetime after you've paid off your student loans
IT jobs have never really needed a degree. Heads of companies will advertise for people and require a degree, but once the head admin is employed they will advertise a job that is open, and as long as you can answer the hotspot questions and prove you are able, then the job could be yours with or without the degree.
.. so that I could interface the REAL WORLD to the IT WORLD.
If you did engineering then formal methods, functional analysis and system architecture just naturally fall into your lap.
Oh, oh, oh, oh! And when it comes to best endeavours on a typical UK IT project ('cause fuck me you lot can't do IT and that's why you ship it all off to India), I am able to step into the breach and actually solder up an RS232 null modem cable when needed - as happened a few days ago. Bet none of the CS-degree folk even know what I am talking about here!
No degree here and I'm happily much further along my career path than I would have been had I spent the first 3 years in uni instead. I'm sure there's plenty of us here that started out with basic rollouts and learnt our way up the ladder through the course of our careers thus far; real-life experience outweighs a nice piece of paper every time for me.
It's such a shame because judging from some of the comments above it seems the industry has changed and it's much harder to get your foot in the door and we'll be worse off because of it.
I did some CS degree stuff... even got a diploma in it, then ran out of money and dropped out and yes I know exactly what a null modem cable is because I've made enough over the years of struggling to interface machine tools and robots with PCs, even down to reading up all the specs of the communication protocols the various control manufacturers use and then writing a serial port driver that implemented said protocol.
But I haven't got that all-important degree, so according to HR and recruiters I'm not qualified to do any programming, even though I'm anal enough to sit down and work out how many clock cycles are being consumed by an interrupt servicing routine to ensure it doesn't overrun its time slice, no matter which way through the code it runs.
But then I've given up on recruiters since seeing that ad asking for 2 years+ experience in C# and .Net when the products had only been out SIX F***KING MONTHS!
As an employer I look for a mix. My background, by the way, is a bedroom coder (albeit from the 80s-90s) who then became a computing graduate when I came of age.
What a degree says to me is that you've got a combination of ability/dedication. It's impossible to say whether you got your degree because you were a superstar or you just worked damn hard (or obviously somewhere in the middle). What it does tell me is you're not a complete muppet.
I then get REALLY interested in what the candidate does as home projects. This tells me if the candidate has a passion for this art.
Lastly, once past all this I give them a simple practical. This can be quite revealing. I've had so-called experienced people that really couldn't even code, let alone structure a solution. I've seen the rough end of bedroom coders who actually produced a working solution, but one so fundamentally awful you knew they were never going to get out of their ways of working without either going on a course or being mentored continuously.
I also found that being obsessed with the language was unnecessary. In fact, when I was looking for C++ programmers I actually found it better getting Java programmers than so-called C++ programmers with a background in C, as many never truly became OOP programmers - they just used the C++ syntax.
I have a non-CS first degree and taught myself to program in my bedroom. Many years ago I started working as a programmer and worked myself up to a point where it was clear that if my career was to progress, I needed a CS degree.
I studied in my spare time, got a CS MSc and it certainly made me a better programmer. I was forced to get to grips with concepts that I had found complicated / boring / irrelevant. It taught me how to write architectural diagrams and project proposals. More than anything it got me past the CV filter of 'CS or relevant degree' and into a better job.
It didn't make me a better person, but I learned things I wouldn't have otherwise, and got me interviews for senior roles for which I would have been rejected without my Masters.
I've been in the (un)lucky position of hiring a few developers in my time; I was a drop-out (still am as I suppose I never graduated) so I don't bitch and moan at HR to only give me degree holding candidates.
This is a Java shop, and Java certified programmers are 10 a penny so I have a programming test that they can download and take away, based on that I ask back for 2nd interviews. It's wholly unstructured; it will take about 1/2 a day. It's not even that hard; I could google or bing all the "programmatic answers" in about an hour.
What they send back can tell you a lot about the applicant; you just have to find the system that lets you make the best decision.
Degrees vs No-Degree isn't just about whether or not having that piece of paper improves your job prospects (it might do, but *only at the start of your career* IMO). Even though I dropped out, attending university gave me the "best years of my life"; I've made good friends and contacts, but I probably shouldn't have gone when I did at 18.
The general theory is that education is a quality signal to employers (or whoever!) since there is some level of effort required to complete a degree or any other qualification you care to mention. To illustrate this using an extreme case, you cannot pull a random Joe/Joan off the street into an exam and get them to pass it. This works in practice too - if it was easy, everyone would have a CS degree which implies that the more talented people find it easier to complete qualifications than less talented people. The upshot of this is that people who have degrees (or in this case a CS degree) are more talented than people without and they have shown more commitment to their chosen profession - another quality signal. Ditto higher degrees: More talented/committed people have Masters, professional qualifications etc.
Even with costly degrees here in the UK this still applies. We can all hopefully accept that if you get a 1st class cs degree from a Russell group uni you will earn more than a 3rd class (degree awarding) college degree. If you are SO clever, how come you did not get into Oxbridge and get a 1st? Money is not an excuse since you will make it back in next to no time (our earlier axiom/assumption). Have not got the time? All you blind-folded guys/gals with one hand behind your back should be able to do a degree part-time/online and still have time to party since you are so good and the work is easy. Again, money is not the issue (see earlier). Lazy? Yeah, it shows. Do not want to do a degree on principle? Another indicator that you are a moron and need to get with the world of work.
There will be people who have degrees and are useless but using degrees and qualifications is an effective filtering tool. After you have done the filtering, then look for the guys/gals with good experience. If anyone actually ends up working with a clueless noob then you need to get the techies into the interview process.
Personally I don't deal much with coders so maybe a degree would be useful for them, but within the sysadmin, build, support type roles we deal with I would agree that a degree is of very little value.
I have always felt that sysadmin jobs are really more of a vocational job that an academic one and as such its aptitude and attitude that win out.
Some of my best admins have no formal qualifications past A-level but just have that feel for IT systems and understanding of what keeps service ticking along which you only get be being on the job.
Maybe if we spent a little less time forcing degrees on people, and placed more value on vocational skills, we wouldn't have such a problem funding higher education for those areas where it really does make a difference. Just my 2p.
A CS graduate may in some cases have no programming experience outside the mandatory homework and may even choose electives that have as little actual coding as possible. These will, in most cases, be terrible programmers, and self-taught bedroom coders will be much better. But the majority of CS graduates either work as part-time programmers alongside their studies, have (often ambitious) hobby programming projects or contribute to open source. These will be excellent programmers having both practical experience and theory in place, and for anything non-trivial they can code circles around 99% of the self-taught programmers. The last 1% is really hard to find, though.
You can compare it to music students who only practice music enough to pass their courses. They will typically be worse performers than self-taught musicians who practice every day and play in clubs at night. But few self-taught musicians will be able to transpose a piece of music to a different key, follow the directions of a conductor and arrange orchestral works.
As for university fees, I think the UK is shooting itself in the foot with this. Students wishing an M.Sc. in CS can go to one of the many English-language CS degrees that are free to all EU citizens. For example, the one here at the University of Copenhagen.
It is. Programming should be on this level, not an academic discipline. This would turn out people better suited to modern IT requirements as it would be more dynamic in responding to real-world demand.
Upskilling and reassessment of skills would be carried out every few years or by request (somebody deciding: "I used to be a Java coder but I'm really more interested in data analysis and database design now", for example).
I cannot think why this isn't being done. It would give those interested in this career a firm and assessable universally recognised qualification and provide life-long skill updating to meet demand.
Like being a plumber or a sparky. Some of whom I know would actually make damn good coders.
If I think about some of the interview questions I ask, there are some that are significantly more likely to be answered correctly by someone with a degree than without, and some where it doesn't matter at all. For example, if I ask someone a simple algorithms question - say, how can I reverse all the words in a string in place - I would expect a self-taught programmer to stand the same chance of answering it as a degree-taught programmer.
If I ask a question on algorithmic complexity I would be surprised to find a self taught programmer who knew his stuff in that space. For example, what is the algorithmic complexity of finding something in a balanced binary tree compared to a hash container.
Then again, if I ask something on design patterns, I would be very surprised to get anything from a self taught programmer. For example, explain to me what the observer pattern is and how I can use it in a real application.
In fact, a number of my standard interview questions are actually plucked from either the first or second year of my own degree course. I use them simply because if someone did a degree and doesn't know this stuff, then I'm not convinced they will be able to learn new stuff from me.
Incidentally, I regularly interview both degree and non-degree educated people, that is why I have the views I do. I will also happily say that getting some sort of computer science degree doesn't make someone a decent candidate. In fact I've interviewed plenty of truly terrible people with degrees. Probably the funniest was someone whose CV claimed that they taught C++ at a decent university. They were unable to deal with basic concepts like const and static, let alone put together a simple class.
I'm a self educated programmer, and, if you'll excuse the lack of humility for a moment, a pretty good one. I have near a decade of experience and a whole lot of projects in my portfolio, but no degree. I have trouble getting HR departments to even look at my resume, and I get paid about $15k-$20k less than what a programmer working in my specific field with my experience usually gets paid, according to every article I've ever seen on the subject. After years of dealing with this I'm sick of it and am in the process of getting my degree just so that I can get my foot in the door. You may or may not need the formal education to get the skills (after all, not everyone can effectively self educate) but you definitely need the formal education if you want to find a decent job.
// reverse in place (needs <string.h> and <errno.h>)
void str_reverse(char *str)
{
    if (str == NULL) {
        errno = EINVAL;
        return;
    }
    if (*str == '\0')
        return; // nothing to reverse
    char *begin = str;
    // swap begin and end until we meet in the middle
    for (char *end = str + strlen(str) - 1; end > begin; --end, ++begin) {
        // using a temporary is probably faster these days but I like this method
        *end ^= *begin;
        *begin ^= *end;
        *end ^= *begin;
    }
}
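(As another commenter points out further down, the above reverses the characters, not the words. For completeness, a sketch of the word version - the usual trick is to reverse the whole string, then reverse each word back; C++ here for brevity, and the function name is my own.)

```cpp
#include <algorithm>
#include <cstring>

// Reverse the *words* of a string in place:
// 1) reverse the whole string, 2) reverse each word back.
void word_reverse(char *str)
{
    if (str == nullptr) return;
    char *end = str + std::strlen(str);
    std::reverse(str, end);                  // "one two" -> "owt eno"
    for (char *w = str; w < end; ) {
        while (w < end && *w == ' ') ++w;    // skip spaces between words
        char *we = w;
        while (we < end && *we != ' ') ++we; // find the end of this word
        std::reverse(w, we);                 // "owt" -> "two"
        w = we;
    }
}
```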
Hash tables/containers are O(1) -> lookup complexity is not dependent on the size of the container
Balanced trees are O(log n) -> lookup complexity is proportional to the depth of the tree, which is logarithmic in the number of elements
// Design patterns
The observer pattern is a fancy name for a list of callbacks: something that is "observable" provides a mechanism for interested parties to register a callback (e.g. a function pointer in C).
When something happens, the observable object iterates over its list of interested parties calling the callback.
You might use that in an application by using the observer to update the view in response to changes in the model, in a traditional MVC application.
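A minimal sketch of that idea (C++ with std::function standing in for the raw function pointers; all the names here are illustrative):

```cpp
#include <functional>
#include <string>
#include <vector>

// Observable keeps a list of callbacks; notify() walks the list.
class Observable {
    std::vector<std::function<void(const std::string&)>> observers;
public:
    // interested parties register a callback
    void subscribe(std::function<void(const std::string&)> cb) {
        observers.push_back(std::move(cb));
    }
    // when something happens, tell every registered observer
    void notify(const std::string& event) {
        for (auto& cb : observers) cb(event);
    }
};
```

In the MVC case, the model would be the observable and each view would subscribe, so a change in the model fans out to every view without the model knowing anything about them.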
For what it's worth these questions are a bit rubbish; anyone with any STL knowledge will know the complexity ratings of the various containers.
Secondly, GoF is probably the most overrated book in the literature; some of the patterns are useful and some are widely overused outside of the problem they actually solve, e.g. the singleton solves the static initialization problem in C++ but almost everywhere else is just a disguised global.
How about some questions which assess useful knowledge?
// knowledge of testing and refactoring
1) how would you refactor a piece of code which is atrocious in implementation but correct in behavior to a more maintainable design while preserving behavior.
2) how would you prove the behavior was the same.
// knowledge of existing methods
3) implement an efficient hash function for strings
4) explain why you chose that implementation and any tradeoffs in the design.
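For question 3, one well-known answer is Bernstein's djb2 hash - offered here as a sketch, not the only valid choice:

```cpp
// djb2 (Dan Bernstein): hash = hash * 33 + c, seeded with 5381.
// Cheap (a shift and two adds per character) with decent spread for
// short ASCII keys; not remotely collision-resistant, which is the
// trade-off question 4 is fishing for.
unsigned long djb2(const char *s)
{
    unsigned long hash = 5381;
    for (; *s != '\0'; ++s)
        hash = ((hash << 5) + hash) + static_cast<unsigned char>(*s);
    return hash;
}
```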
// Recursion -> Iteration
5) implement a recursive pre-order tree traversal i.e. visit(*node)
6) implement the iterative equivalent.
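Questions 5 and 6 might come out something like this (C++, node type my own; the iterative version simply trades the call stack for an explicit one):

```cpp
#include <stack>
#include <vector>

struct Node {
    int value;
    Node *left;
    Node *right;
};

// 5) recursive pre-order: visit the node, then its left and right subtrees
void preorder_rec(const Node *n, std::vector<int> &out) {
    if (n == nullptr) return;
    out.push_back(n->value);
    preorder_rec(n->left, out);
    preorder_rec(n->right, out);
}

// 6) iterative equivalent: push right before left so left is visited first
void preorder_iter(const Node *root, std::vector<int> &out) {
    std::stack<const Node*> st;
    if (root) st.push(root);
    while (!st.empty()) {
        const Node *n = st.top();
        st.pop();
        out.push_back(n->value);
        if (n->right) st.push(n->right);
        if (n->left)  st.push(n->left);
    }
}
```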
7) name three methods of IPC
8) explain when you would prefer one method over another and why.
// Memory Management
9) Explain how you would determine the maximal memory usage of a process
10) is there a way to impose a hard limit on the usage of your application (e.g allocators)
11) describe the behaviour of a quicksort
12) describe the behaviour of bubblesort
13) describe when is a bubblesort preferable to a quicksort
// STL knowledge
14) name three sequence containers
15) when should you prefer a pair of sorted std::vector<K> over a std::map<K,K>
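On question 15: roughly, prefer the sorted vectors when the data is built once and looked up many times - the search is the same O(log n) as a std::map, but the storage is contiguous and cache-friendly, with no per-node allocation. A hypothetical sketch (with an int value type for brevity):

```cpp
#include <algorithm>
#include <string>
#include <vector>

// keys must be kept sorted; values[i] corresponds to keys[i]
const int *lookup(const std::vector<std::string> &keys,
                  const std::vector<int> &values,
                  const std::string &key)
{
    // binary search over contiguous storage
    auto it = std::lower_bound(keys.begin(), keys.end(), key);
    if (it == keys.end() || *it != key) return nullptr;
    return &values[it - keys.begin()];
}
```

The downside, of course, is insertion: keeping the vector sorted makes each insert O(n), which is why this only wins for build-once, query-many workloads.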
// CI & build systems
16) what is the function of Continuous integration
17) name two CI systems
18) name three build systems e.g. Make/Ant/CMake
// Copy correctness
19) what properties are needed to make an object "trivially" copyable
// Estimation and planning
20) How do you estimate how long it will take to complete a piece of work
21) What safety factors do you build into your estimate.
// Low level knowledge
22) what is an atomic operation
23) how would you implement atomic increment and decrement on an x86 processor
24) how would you implement a mutex given the following primitives atomic_increment() and atomic_compare_and_swap();
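And one possible shape for question 24, using C++11 std::atomic as a stand-in for the atomic_compare_and_swap() primitive - a busy-waiting spinlock sketch, not production code (a real mutex would sleep in the kernel rather than spin):

```cpp
#include <atomic>

class SpinLock {
    std::atomic<int> state{0};   // 0 = unlocked, 1 = locked
public:
    void lock() {
        int expected = 0;
        // spin until we atomically swap 0 -> 1
        while (!state.compare_exchange_weak(expected, 1,
                                            std::memory_order_acquire))
            expected = 0;        // CAS wrote back the current value; reset
    }
    bool try_lock() {
        int expected = 0;
        return state.compare_exchange_strong(expected, 1,
                                             std::memory_order_acquire);
    }
    void unlock() {
        state.store(0, std::memory_order_release);
    }
};
```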
BTW, left school at fifteen, been a paid programmer since seventeen, been training graduates and post-graduates since the age of twenty-three, now the highest paid person on the team (the only one without a phd).
Do I regret not taking a degree? Sometimes, but not so much for the three years reading textbooks that I could read at any time - more for the esoteric parts of the discipline: compiler construction, access to architectures other than x86 and MIPS.
I learned a lot on the job, working with really good programmers, and I'm still learning all these years later. It's not about code - it's about design and architecture. You can learn these things yourself over time, but the idea that three years trying to get your end away and killing brain cells in the student union confers some advantage over three years at the coal face learning your craft is a joke.
The main advantage of the really good courses is the additional elements of the industry, but for your developer, as opposed to your quant, a degree doesn't confer much except a whole bunch of bad habits which must be shaken out of the incoming member of staff.
As for getting your foot in the door, here's what you do...
Set up a GitHub account.
Start writing code, make sure that code has tests.
Go to the agile conferences, meet people around the industry.
Learn new technologies, for example teach yourself erlang.
Buy a copy of Sedgewick's Algorithms - implement them, understand them.
Install Linux on an old PC, any old piece of junk will do.
Write some code, disassemble it, try and understand the relationship between the high level code you have written and the assembly it generated.
Learn a scripting language - like ruby or python - learn how to interface them to native languages.
Write simple network applications and use a packet sniffer (tcpdump or Wireshark) to examine the packets and understand what's going on when you send something across the wire.
Use distributed source control (git or mercurial).
Keep your chin up and your resolve strong and you will do it, keep applying and don't let the bar-stewards grind you down.
Qualifications don't mean anything, ability and experience are everything.
You can't teach ability and remember experience is a fancy way of saying "I made that mistake already so I won't do it again" no more - no less.
I love the way you have tried to rubbish what I said by answering my questions (in some cases well, in some cases badly). Specifically, another poster pointed out you reversed the characters in a string rather than the words in a string.
Incidentally, I don't like your interview questions and would avoid using them. Let me explain some simple reasons.
1) Refactoring - not a bad general question, but unless you have some good follow ups is easily covered with a wishy washy answer.
2) Testing - again, not a bad question, but I'll answer it in a few words - unit and regression testing. Follow up questions are key.
3) Hash function for strings - I don't care if someone can remember a hash function for strings. If they need to write one it will take them 10s to look up 5 different implementations.
4) See (3) above.
5&6) Iterating tree nodes - not a bad question. A bit simple for what I normally ask, but ok.
A lot of the remaining questions are far too specific. Naming 2 CI systems or 3 build systems is a funny one. I've known lots of really good programmers who have only ever used one of each. If they can code C++ well, they can pick up any build system, and I don't need all my programmers to be specialists in multiple CI or build systems. OS-specific questions on things like IPC and memory management are only useful if you want someone with specific experience on an OS. Again, I find the standard of C++ person I employ can switch between OSs very easily. Similarly for architectures: I've interviewed people who could tell me in detail about atomic ops on SPARC but had never looked up the operations on x86, or vice versa. A 10s Google search will answer that question for them, though.
As an aside, I used to set up my interview questions in a very particular way. I would write them out, and send them to every candidate 3 days in advance. My reasoning was that every programmer has multiple resources in front of them in a normal job, but not in an interview. I want to know how they survive in the job. If they passed my first interview, I would bring them in for a half day interview which involved sitting down in front of our code base with a real bug that someone had recently fixed and seeing how they went about debugging it. Out of interest, I only ever got 2 people who aced the questions I sent out 3 days in advance. A third who came close to acing them.
Are there people without degrees who are good? Yes, of course. It is the exception rather than the rule. I've interviewed lots of both category. Up to now I'll recruit about 5-10% of the degree qualified who I interview, and I've recruited no-one without a degree - why - because they failed to answer my questions.
In my experience it is people with degrees that want to employ people with degrees. It is like a little club.
A degree just means you could remember stuff when you had to answer questions on a paper, whether they were real world problems is another thing.
I have worked in a uni and out of one. Have seen, worked with and interviewed folks with and without degrees and as far as techy subjects are concerned the people who are worth having didn't need the degree as they were obviously smart and switched on people in the first place.
A degree may teach you something, but in the 3 years a degree takes you can get a shed load of useful experience and learn what you need, not what someone else thinks you may need.
IT Managers are happy to look at applicants that do not have a degree but have the required experience.
In the real world of applying for jobs it becomes quite apparent that very few IT managers actually do any of the initial vetting of CVs. That unfortunately, is all done by feckless recruitment drones and Pam in the HR dept.
No degree = Delete.
Just thought I'd chip in to this debate. I have been teaching myself programming languages since I was 10, and I am currently doing a degree because getting into the industry without one, as mentioned above, would be a hell of a lot harder than it is if you do have a degree. I agree that university courses don't really prepare graduates for the real world, but I'd like to point out that some universities get it right. I go to Sheffield Hallam (yes, you can stop giggling now); the course content hasn't been the most challenging but it has been interesting on occasion. However, I've just completed a placement year - the course I'm on is a sandwich course where we are meant to spend 1 year working in the industry (for near slave wage). When I started I was pretty useless and out of my depth, but it didn't take me long to pick it up. Nonetheless, by the end of it I was a key part of the team I worked with and one of the go-to guys when people had problems, and they offered me a job for when I finish my final year of university, so I will be going back. This has been a massive help to me, and I feel that my university, through their links with the industry, have helped me tenfold over non-sandwich-course degrees. Just thought I'd throw that out there :)
I've seen a lot of degree envy among coders who do not hold a degree, and a lot of that envy is appearing in this comment section.
A person who has not obtained a degree from an accredited college or university either has not tried, did not qualify or did not follow through to graduation. Any of those conditions shows either a lack of ability or a lack of self-discipline, which will make itself apparent in their work as a coder. Most of these coders exhibit one common trait: their code usually resembles spaghetti, regardless of how well it appears to work. The example of a "self-trained" coder who was truly smarter than her professors is such a rare exception it borders on mythology. That kind of wunderkind learned to read at 2 or 3 years old and had 5 certifications by the time they were 10. By the time they are 12 they've graduated from college and are working on their advanced degree. Those gifted students are smarter than their professors, but you won't find them settling for writing code in some shop.
I taught programming for 18 years. During that time I have seen many of the usual kind of "wunderkind" coming out of high school who demonstrated a lot of skill using a specific language, usually Pascal, some form of BASIC or a GUI framework. But their skill set was full of glaring holes. They couldn't identify the core elements that make up any language. Their knowledge of hardware interfaces was scant or non-existent. They may have discovered the Bubble Sort on their own, but most could not code any form of Quicksort or other sorting algorithms, and if it wasn't a function or object in their framework they couldn't use it. Their knowledge of relational or other forms of databases rarely progressed beyond the flat file or a simple dbf tool. They could not design to Third Normal Form, and usually didn't know what that meant. Few had ever heard of a Pick DBMS. Most glaring of all was their lack of sufficient math skills and/or Boolean logic. Few of these wunderkind knew of or used version control software or test suites. They never collaborated with a user base to identify data input and output, report requirements, create program specifications, establish production products and determine their milestone dates so that the user base had something they could test and sign off. Their communication and writing skills were often atrocious.
Using APEX or a Java or some other web based GUI design tool to slap together a glitzy web page may impress the owner of some SOHO, but when that app fails to treat edge cases properly, or blows up for unknown reasons, the purpose of a quality education will become apparent.
...more despair that degrees have been pushed at everyone for the past 15 years and now it's considered essential for any position above burger flipping by those less well informed in employment circles.
A friend of mine is a retired university lecturer and he always tells me "University wasn't meant for everyone, just those that needed the extra education, such as doctors and lawyers. Estate agents, bank tellers and the like didn't need to apply!"
Just the top 20% so to speak. Not everyone.
Degrees are now a total barrier to entry for those of us that were encouraged years ago "to work your way up and get experience!"
It's frustrating. Oh well best get my apron on.
Just because you have never heard of ambulo doesn't mean you don't do it (Latin for walking, by the way). Fancy words don't make a programmer.
I don't have a degree, and I certainly don't envy those with one. I do envy missing the student lifestyle, though.
Education tends to remove innovation: rather than do a task the way you have been taught, why not find a better way of doing it?
You may sense degree envy; I sense degree snobbery.
I have seen lots of code, Degree written code usually works but is generally slower and fatter.
"I have seen lots of code, Degree written code usually works but is generally slower and fatter."
I think this may come down to one of the differences, the way you said that suggests that you automatically think slower & fatter code means worse code!
If it's slower and fatter because it's abstracted, structured, more resilient and, ultimately, maintainable then that's something preferable to speed in most cases. But it does depend on the application. Unless you have severe memory, processor, or response time restrictions then speedy code generally takes a back seat to other, more important, factors.
From personal experience, it's no good having code that completes in half the time*, if no-one else but the original author can follow it and it needs to be completely re-written for even the smallest change. (from a project where a number was extended 2 digits, and masses of it had to be completely re-written because of 'optimisations'. After the re-write, it may have been a tiny bit slower, but if it needs to be extended again, or changed in other ways, it will take days instead of months.)
In the 'good old days' code had to be stripped right down to get it to run with any semblance of usability on the hardware available. It was fast, but that was all, 'best-practice' was thrown out of the window. We are still paying the price for some of the decisions taken 15+ years ago in the name of optimisation. This cutting back isn't necessary anymore, and code no longer has to suffer for it. Although, some of the developers from that time find it hard to get out of the mindset of optimising everything to the absolute limit.
*obviously if you are doing bulk batch processing, then yes optimisations that shave tiny fractions of a second off each item are worth it over maintainability.
Coders with no degree have a tendency to err toward spaghetti code. But they also continuously learn from their mistakes, are happy to admit they're wrong, adapt, become better. They also tend to have their eyes set on the goal, i.e. finish the damned project.
Coders with CompSci degrees suffer from another form of pasta code: Lasagne code. They layer and implement interface upon interface upon abstract class upon class it's almost impossible to find where the bloody meat of the code is! There is also no bloody way to change the point of view of a compsci student who has 'mastered' a particular design pattern. They have an awful tendency to become design pattern zealots for no bloody reason. They do however generally only have to write once and reuse the code over time, though.
So which one do you pick? Spaghetti or Lasagne?
Truth be told, you need a bit of both.
I've been asked several times by parents with teenaged offspring whether I'd recommend a career in IT (or whatever the current trendy term might be). It's given me a good living for a quarter of a century. Each time, my answer is "Don't do it - go into Electrical, Mechanical or Civil Engineering instead, but don't touch IT".
When I started in the business, there was a recognised career path. Start as a junior analyst or programmer, gain more experience, work your way up, move employers a few times, try a bit of contracting or move into management. The issue of a degree - or otherwise - only really mattered at the very first stage. Everything else was based around experience. The degree got you the first interview - and had progressively less importance.
Wind the clock forward twenty years, and you see large numbers of graduates clutching the 2:1 or 2:2 degree, wanting to start on £30k plus per annum - and they find themselves out of work. Modern corporates have realised that offshoring to your friendly Indian outsourcer, or onshoring a load of drafted-in ICT-visa holders, is far cheaper. Why fund a graduate - with all their demands and training costs - when you can import cheap skills and leave someone else to worry about pay and rations? Modern corporates wound back their milk-round recruitment years before the recession kicked in. It isn't a phenomenon of the last couple of years.
The ironic thing, of course, is that even India is now being priced out of the market. I've recently seen work outsourced to eastern Europe and the Philippines, because the usual outsourcers' crews - at one fifth of the cost of experienced local skills - were still deemed "too expensive". A real race to the bottom, led by the public sector - and the banks.
I wouldn't start in IT now, even if I found someone enlightened enough to pay me...
Just to point out that the GBP 9000 amount only covers the annual university fees; for most students there is also the small matter of accommodation & living costs, so that's another GBP 5000 (say).
Of course the loan to cover the fees has special conditions; e.g. if you end up on a 4-year course and then earn not much more than the average wage over the next 20 years (which would seem to be more likely as the proportion of students increases) then you end up paying back ~GBP 30000 and the tax payer coughs up (writes off) ~GBP 35000.
I agree with HR: a degree (even one not in IT) is a requirement to show you are interested in learning and not one of those types living weekend to weekend.
Look for professional qualifications as well - these will become increasingly important for those without an on-topic degree (or any degree) in a few years' time.
There is a lot of half-baked scum out there to sift through.
I find that many people without a degree are often better than graduates...
You're either good at it, in which case you will have taught yourself before even reaching the stage of taking a degree, or you're not naturally inclined towards the subject and thus took the degree solely because you saw money in it...
And in a fast changing area like IT, you will probably be learning old tech at any educational establishment anyway... True enthusiasts will keep up to date on their own.
I live in a society where all of my colleagues have a degree. So having a degree is not a differentiator.
However, ten years ago I had the chance to work with and learn from a truly genius network engineer. He studied Maths for one year at university, but then he quit. He had had a job since he was 17 and decided that university was a waste of time for him. When he was 20 (10 years ago), he got his first CCIE. He became the senior designer of an ISP network. At the age of 25, nobody wanted to promote him to a management position because he did not have a degree. Then he finally opted for a no-name university that gave him a degree in 2 years, just to satisfy society. The truth is that we look for degrees as we look for an approval stamp. But there is no guarantee that the stamp really delivers. It is passion and experience that do.
...the real problem we work in an immature and rapidly expanding industry, where the talent pool is dwarfed by demand and the end product is opaque.
The massive demand means there are lots of idiots getting jobs, the opaque nature of the product means badly written but functional code will not get discovered for many, many years, and the specialist nature of most software development (as most software is developed to fulfil specific business needs) makes comparative assessment hard.
Or to put it another way, whilst it may hold my weight I can see you built that bridge out of old toilet rolls, bits of twigs, blu-tak and hope so I'm not going on it.
I have been a self-taught programmer for many, many years (and I mean a lot), and to be honest I feel that I have missed out on many opportunities because of not having that degree on my CV.
I have a ton of experience in numerous languages, but without the formal training my work was somewhat sloppy, though it always worked as expected. I really needed some direction, so I recently decided to pay my dues and go to university, and while yes, there is a lot of rubbish being taught, such as the "Personal & Professional Development" rubbish that you have to do (government-mandated self-reflection bullshit), there are some important programming concepts being taught.
For example, I never fully understood classes and objects until I began at university, my algorithms were virtually non-existent, and many other concepts were missing from my "skillset" (gawd I hate that word).
So, I am not at University because the money in IT is great (it isn't, but it should be), I'm there to sharpen up my programming. I highly recommend it to any budding programmer, if you love programming, a degree will give you a greater power behind the console. Just watch out for the fees these days, yikes!!
I don't connect with the idea you needed a class or a lecturer-at least in person-to get these ideas.
My own skills are rather rudimentary and syntax frequently makes me a Tourette's sufferer. However, the concepts themselves are frequently well explained in many books, many articles and not a few videos.
In addition, for VS users, DevExpress refactor product or ReSharper (and similar) is a massive help. These things suggest refactorings that often caused me to find out what they were on about. Very bloody handy.
I'll miss out on big corporate work I'm sure as I don't pay the papers due respect and have none. Fine with that, I never got on with larger structured establishments in any case. It's interesting to note the two points of view that this article divides us into however. For years, I've been surrounded by other IT guys-any programmers were usually cleverer at it than me (still are)-and without exception these were all home grown types.
The one uni guy I knew, I was also training on stuff. I was 17 :). So, my bias isn't obvious :).
ok, i'm heading towards south, but this just had to be said. so, here we go.
what uni teaches is scientific approach and gives you framework for further development. you don't have that, and you're just a craftsman. very skilful, very experienced, but just a craftsman. is this bad? no, not at all. IT (and programming) became a craft these days. bit like plumbing. or bricklaying. advanced, yet craft.
can you learn these methods yourself? i doubt it. hundreds of years were invested into formulating them, and into laying education and training foundations. hardly anyone can be so advanced as to just pick it up. yes, doable, but not for everyone. the rest need some support. from those who are in the know. the professors.
do we need a degree to do IT? no. it's craftsmanship. no need to get a degree to become a successful carpenter.
I easily can tell whether the person interviewing me has a degree. the one without usually asks specific craftsmanship questions. what are the 'grep' switches. name all 'tcp' packet fields. what's the best propane-butane/oxygen ratio. stuff like that. one with degree is more interested in my thinking processes. there's the difference.
do we need that in IT? no. not in the bottom two thirds anyway.
there. said it. have fun. it's the red button.
Wow! What a lot of anti-degree shoulder chippery!
The primary talent of a good programmer is problem solving. I suspect that that's something that simply can't be taught and so, clearly, one can find self-taught programmers who're great problem solvers and graduates who're rubbish.
It's also fairly self-evident that, on balance, for a really demanding technical role where in-depth knowledge of how microprocessors work, efficiency of sorting algorithms, mathematics, etc. is required, a sensible employer would prefer a graduate with experience over a self-taught coder with similar experience. The former would be far less likely to have holes in his knowledge of the subject matter. Even the best self-taught programmers are unlikely to be able to hack university-level maths.
In fact, on one of my first jobs as a new graduate with a degree in computer science and electronics, I found myself hand-coding algorithms in 8051 microcontroller assembly language. The bloke who was the team leader was entirely self-taught and much more experienced than I. However, I was the one who coded all of the mathematical operations, i.e. addition, subtraction, multiplication, division and all of the trig functions because he couldn't do long division!
It's a fairly strong bet that the vast majority of graduates we've spewed out over the last 10 years are utterly useless, because they would have had to be stupid to embark on a degree in computer science after the dot com bubble burst and the writing was on the wall for onshore software development. The industry in the UK is dying on its arse, and I wouldn't recommend anyone shelling out £27k to get a degree to work in an industry where they're competing against programmers from India, China and Brazil who'd bite employers' hands off for £9k pa.
In other words, don't bother with the degree or IT if you want a life-long career. Try and find something that can't be off-shored or commoditised. Gigolo perhaps? That's what I'd do if I hadn't spent the last 25 years sitting on my arse writing code:(.
... because, as everybody knows, managers don't actually *do* everything -- and without a degree to prove it, nobody would know whether they are any good at doing whatever it is they don't actually do.
Programmers without degrees can just get on with the job and do it well, so that some manager with a degree can take the credit for it, because that's what managers seem to be for.
... interviewed more than 100 people for a job vacancy, most of whom had degrees and varying industry certification.
NONE of these guys could answer basic questions well. Only one person managed to answer a question involving knowledge of what happens on the wire as a PC boots up and goes to a website.
That one person had no degree, but DID have 15 years experience, so they hired him (ME!)
What the exercise proved is that paper qualifications aren't worth a hell of a lot (Several were RHCEs, some were CNEs) and subsequent hiring cycles have shown this to be more and more true - so we're less inclined on insisting on degrees and more interested in looking at suitability for the job.
Said employer is a major London university. YMMV but not usually by much.
I actually started out my coding from a college course, but actually I learned everything I used in my job from summer work. My degree got me some interviews.
I think a bedroom coder shows a willingness to learn, which is something to be admired. Not that many people learn things off their own back, and can turn it in to employment. It's all down to the willingness to learn at that point. Some people leave uni and within a year class themselves as an expert in a language and are bothering their employer for a Senior title.
One of the most talented coders I know has no formal education beyond GCSE, and some certifications which he has taken.
I've seen quite a few CVs these days, and I'll quite happily interview someone. Though, I do request code samples first to see what their code is like. Because frankly, what some people pass as code good enough to show to an employer is borderline comedic.
Been in this industry for over 20 years, and there is a difference between degree-holding consultants and others... but if you are talking coding, sure... I will admit that requiring a degree to code seems a little overboard.
That all being said... an individual's aptitude and work ethic play far more of a role in their overall ability to excel than a degree does.
This is the predictable consequence of the dumbing-down of higher education:
Programming is just the second step in Comp. Sci. (after discrete mathematics & logic - but I get the impression that this foundation is being skimped on as 'too difficult'), so when you get graduates who cannot even program, something is wrong.
Generally if I see a 'Computer Science' degree on a CV, especially if said person was in education after around 1995, then I take it with a large barrel of salt.
I simply cannot fathom how CompSci grads can come out of university knowing practically nothing about the hardware or the system. How on earth can you call yourself a programmer if you don't know how to fully exploit the hardware you're given? (In my case an absolute joy, as we control the hardware we supply our systems on, not dissimilar to Apple - *this* is a luxury few devs can afford!)
These are questions CompSci grads couldn't answer in my interviews:
1. What is the difference between the heap and the stack?
2. How and why would you implement a semaphore?
3. What is a register in the computer?
Tell me honestly, are these questions ridiculous? Because the number of people I've turned away beggars belief.
Seriously, if you don't know a little bit of C++ - enough to take advantage of, say, SIMD-optimised matrices - in addition to your general high-level language, i.e. Java/C#/Python (delete as appropriate), then as far as I'm concerned you'd be a liability to the team.
> Tell me honestly, are these questions ridiculous?
The second one is :-
> 2. How and why would you implement a semaphore?
The only real answer to "why" is "because someone needs a semaphore". And the answer to "how" is extremely hardware-dependent.
But if you'd asked "How and why would you use a semaphore?", it would be much more reasonable. Most coders will need to understand using semaphores. Very few will ever have to write one.
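For what it's worth, the "use" case is easy to sketch. Here's a minimal, illustrative example using Python's `threading.Semaphore` (the scenario - limiting how many workers touch a scarce resource at once - and all names are made up for illustration):

```python
import threading
import time

pool = threading.Semaphore(3)   # allow at most 3 workers in the section at once
lock = threading.Lock()         # protects the two counters below
active = 0
max_active = 0

def worker():
    global active, max_active
    with pool:                  # acquire: blocks once 3 workers are inside
        with lock:
            active += 1
            max_active = max(max_active, active)
        time.sleep(0.01)        # pretend to use the limited resource
        with lock:
            active -= 1
        # the semaphore is released automatically when the 'with' block exits

threads = [threading.Thread(target=worker) for _ in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(max_active)               # bounded by the semaphore: never more than 3
```

That's the "why": it bounds concurrency, which a plain mutex (a semaphore of one) can't do. Implementing the semaphore itself is a different job entirely - that's atomic hardware operations and scheduler territory.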
Working in various areas of software like computer vision, AI, data mining, etc. with some of the best experts in the field in the world today was an amazing experience. I was fortunate enough to get my university education completely free of charge. I think the biggest flaw of bedroom coders is that they don't even know what they're missing.
IT (by which I mean system / network / database administration, simple website development, etc.) is more focused on specific tools rather than general principles, and is thus far better suited to vocational training.
Programming (by which I mean writing good code to solve non-trivial problems) is far more about algorithms, code structure and general clarity of thought than the language in which it is written, and whilst experience will give you practice at writing good code, a degree also provides many useful skills.
My degree was in a technical subject, though only covered a very small amount of programming. However, I still found it invaluable training for the problem solving ability it gave me.
Moreover, a real degree from a good university is proof that you are able to think (though absence of a degree is most definitely not absence of the ability to think).
Unfortunately, the fact that there is some correlation between the presence of a degree and intelligence makes this an easy discriminator for lazy HR departments. I fell foul of a similar requirement for experience whilst employed, and it was only when I found a good agency with sufficient clout to get my CV (and thence me) in front of the people for whom I would actually be working that I made any progress.
I can see such broad brush approaches are justifiable where employers are simply seeking someone good enough, and where the extra expense of guaranteeing to hire the best candidate isn't justifiable given the benefit to the employer. However, a position where there's no benefit in being skilled beyond a certain level is not one which I would wish to fill!
Similar arguments apply to university applications done solely on the basis of A-levels: whilst a useful indicator, there's no substitute for interviewing a candidate, yet relatively few universities actually interview candidates. Again, I was fortunate enough to attend one that did.
This has turned from a brief comment on the difference between vocational and academic disciplines into a rant about the evils of "good enough" syndrome, and being on page 4 of the comments, will certainly go unread, so I shall leave it there.
Let me get straight down to the facts. Most people go and study IT because it used to be a quick way of making good money and supporting yourself. The point is that most of these people don't do it because they love it, or are even capable of doing it.
Look at the hackers of today - kids doing it from their bedrooms, putting corporate security to shame while all the graduates scratch their heads over how it was done. Game mods are also done by kids who love the challenge.
My experience is that when you take someone straight from uni, they expect to be treated as special and want fast career progression. This means fewer coders who actually do the job. Architects designing is all well and good, but sitting with a whole department of architects, analysts and managers just doesn't get the job done. Some are good, no doubt, but the majority we recycle quite frequently since they just don't cut it on real coding skills. I'm talking about real-time transactions, not some lame business form with fields.
For this reason we thoroughly test applicants in my company. Uni or not, if you can master our tests (not some conceptual nonsense on paper - real programming), it counts in your favour.
We're at our most adaptive and in a position to learn most when we're young.
At a university you are guaranteed to be challenged to a certain minimum level and to spend all your time constantly learning new things.
You might be so lucky in your first job, but you could easily get thrown in a corner and taught a lot of very bad habits.
If you are hugely ambitious (this probably matters more than smarts) and are confident that you can drive yourself forward then a degree probably isn't necessary. For anyone else, you might get lucky - but you'll probably drive yourself down a dead end career.
If I were an 18 year old today I would get a degree in agricultural engineering. Lots of very old farmers about to pop it and a huge shortage of fertilizers coming in the next 40 years. You stand to make a mint.
I wouldn't advise people against going to university, but I would recommend that people consider their options very carefully. A lot of people opt to do a degree without thinking much about what they want to do at the end of it and what the industry is looking for in terms of qualifications.
I decided to go down the professional certification route instead of doing a degree and I haven't regretted that choice. I think it's opened up more doors for me than a degree would have. Jobs often ask for "degree or equivalent" - I've applied for such jobs and been successful with my IT Pro certifications alone. I don't think I've missed an opportunity because of a lack of degree. Having said that, I'm sure a degree would still add value to my CV.
In my opinion experience is far more valuable than any qualification, but you do need to have something on your CV in terms of academic or professional qualification.
I think degrees work for some people when they include a placement year - this can give people experience and a foot in the door that might actually be more valuable than the degree itself.
Although I think IT certifications are generally more relevant, they do have a limited shelf life. A degree is for life, but IT certifications will either expire or become irrelevant over time. Anyone working in IT needs to keep their skills up-to-date regardless though - degree or no degree.
So, you guys are getting a few things mixed up. You're talking about a degree for IT, then talking about programming and so on. If you want a degree for a programmer you are looking for a computer science degree. Computer science is not IT.
I do agree though, both for programming and for IT -- and I say this as someone *with* a computer science degree -- that important things are probably in order enthusiam/interest, an aptitude in the subject, and experience/degree in the subject.
If someone is just in it "for the money" they are not going to be arsed to keep up on any new developments in either IT or computer science, unless they are told by some boss "you must go and learn this topic". Having someone who has some enthusiasm in what they are doing is always better than someone who just goes through the bare minimum and goes home. The person with enthusiasm is more likely to make suggestions on improvements, rather than just maintaining the status quo.
Aptitude -- there are certainly *some* "drones" in IT who just have to follow a checklist -- "Hey you, here's a dozen machines, follow this checklist to prepare them for deployment and let me know if anything goes wrong with a machine and I'll come by and take a look at it." But for problem solving, implementing anything new, or even deciding whether anything new *should* be implemented at this point in time, it takes some aptitude in terms of logical problem solving and so on. In the long term, someone with aptitude will learn about your specific systems, while someone with prior experience of your type of systems will not gain aptitude.
Experience and/or degree. Nice, but these are really BOTH overemphasized. If someone is interested and has aptitude, they can be taught a new programming language relatively quickly if they know in general how to program; they can be taught the specific IT procedures for you, and since they have aptitude and interest will be able to tweak and improve upon this pretty shortly. On the other hand, if you just go down a checklist and end up with people that have experience in your specific language, but no interest or perhaps not enough aptitude, then you can end up with someone mediocre, who becomes worthless if you change languages at all. I do think I got something out of my comp sci degree, but I would definitely think a company is daft for demanding one.
. . . a degree is learning at a higher level, with feedback at that level. Who would allow access to even a dev server without some proof of knowledge of what they are doing? Would you trust a bedroom/hobbyist builder to construct your home?
The sad thing is that in the past a degree represented the top 5% and now represents the upper 50% (or some other ridiculous figure). I worked in an academic environment for a few years, and I heard an advert on local radio while travelling to work:
"Failed your A-levels? Why not contact [insert your local town here]? We can help you find the course that meets your needs."
The best students I/we never ever saw or heard from; they just handed in exceptional work before the deadline. The others, bless them, struggled with concepts such as variables and took hours of support :(
A degree ain't what it used to be, but that is all changing now - not sure for the better (it's complex).
This is a human situation - no simple rules. If you have problems with the people recruited then look at your recruitment people. If you have problems with degrees then look at the people that confer those degrees.
I don't get how universities can justify the price of bachelor degrees given their lack of value in the market place, especially in IT. A computing degree might offer someone with zero programming skills some basic knowledge, enough to get them started at the very lowest level, but not much else.
Certainly they offer nothing when it comes to serious network administration or technical support. What they teach you about software use should be something that anyone who wants to work in IT has learned years before even attending university.
Employers do like to know that you have a degree, because it shows a work ethic and the ability to conform. You are showing that you are willing to work without supervision and that you will do what you are told to do without whining about whether it's really useful or not. But any degree at all will show this, so one based on computing subjects is largely irrelevant.
The other things employers like are obvious. Technical certifications and experience. If you want to work as an administrator or in tech support and have to choose between a degree and something like Microsoft certification, get the certification. So even without any on the job experience, you can show you know things that are actually very useful in the workplace. I know a lot of people dislike Microsoft products, they have some very valid justification for their point of view, but when it comes to getting a job you need knowledge of systems businesses actually use - and MS administrator certifications absolutely DO teach you real world solutions to problems you will encounter in the work place.
Programming is different. If all you know about a programming language is what you learned at university, you'll probably never get a job writing software. Yes you've learned something valuable about design, but you'll still have to teach yourself a lot more about programming than you're taught in the classroom. With software engineering you need both. You need to teach yourself, preferably being able to show potential employers software you've written that does something useful - and get as much as you can from your coursework, showing employers you are willing to conform to design standards and design processes they need for long term projects.
My degree at Exeter University gave me an awesome base in Software Engineering - DB schema structure (1NF, 2NF, 3NF, etc), parallelism, specification languages, GUI implementation (admittedly it was Motif on Unix, but the infrastructure used is common), Prolog, Assembly Programming, C++, OOA/D, etc.
Totally, someone can develop in a company without the above - but if they don't learn it, then they are lacking.
Diplomas and certificates serve to impress managers, but any operator worth his salt knows that paper knowledge is a dangerous thing if anything. You can't really learn IT either: you're either good at it or you suck, there's hardly a grey area, and education doesn't change that.
I do have an education and stacks of certificates, but their only purpose was to get a job; I don't really use any of that knowledge anymore.
Given the choice, I will pick out adaptable, self taught and self driven people any day of the week. Divergent thinkers who've got the discipline to sit and learn the principles and practices of a profession (or three) get my vote.
Critical use of the internet and of proper books, peer-reviewing the knowledge sources used, and of course the main staple, direct personal trial, error and review (after Tourette's attacks ad infinitum), I have found to be rather effective.
Using the Dreyfus expertise scale, I'm competent at coding. The necessity of refactoring, principles of reusability, contracts and coherent classes all make some sense to me - and some I can use :). In addition, it's obvious where my own gaps are and what to do next to fill them in. So grasping a new bit is easy enough; it's just a time/effort issue.
All of it fitted in, when I needed it, over the last two years, around a dozen other (non-programming) roles and tasks of a new job.
Books used? Code Complete, The Pragmatic Programmer, the inevitable collection of MS books on specific technologies/Kindle books - then on to the deeper principles themselves, the theory I would have picked up at uni.
Time invested: 4-8 months of actual work, and a few hundred pounds of books and software.
The profit ratio in my very early days - i.e. now - of programming and developing is rather high. And I'm not servicing a student debt.
So, tongue in cheek here, it's also an IQ test.
Is the lady or gent in front of me, looking for work, clever enough to do it themselves with resources like those I used, or did they waste a number of years and five figures doing it the long way?
Cynical perhaps, but learning to learn is another job of time, effort and appropriate materials.
I can't think of a sterling reason to use a university for my future education requirements - though I'm honest enough to admit I'll use the products of some of their cleverest bods... the books and other stuff they come out with.
It amazes me that many of you lot have got jobs at all, regardless of whether you studied for a degree. Just like the 'educated' people (including professors) I work with every day, almost all of the commenters above have atrocious communication skills.
It doesn't take an expert in English language to correctly use an apostrophe or spell commonly-used words. It doesn't take a genius nor is it particularly time-consuming to read what you have just written before posting, sending an e-mail or publishing an academic paper.
I would argue strongly that anyone wanting their advice, opinions or work in any field to be taken seriously should learn to communicate effectively.
Those who work in technical fields such as IT regard themselves as scientists and engineers. Every day we follow rules and systems far more complex than the three simple conditions that define the use of an apostrophe.
It's actually really embarrassing that the standard of English amongst people who speak it as a second language is generally much higher than amongst most people who have grown up in an English-speaking country. There are several languages with even more convoluted rules than English, and yet people seem to use them successfully - without moaning constantly because they are too lazy and ignorant to find out about things they didn't pay attention to in school.
I don't understand why so many people here have such chips on their shoulders about CompSci graduates. While I will admit they (myself included) tend not to have a lot of knowledge about things like software development in a professional setting, the degree does act as a primer to what should be a lifetime of education.
A degree is useful today, as without one it will be a lot harder to progress your career above the 'glass ceiling', or even to get an interview in some cases.
Combine the degree with a raft of certifications and you are onto a winner.
Biting the hand that feeds IT © 1998–2019