Resistant to education
... I LIKE that term! :D
Learning to code in your bedroom will prepare you for the IT job market just as well as a three-year degree costing £27,000, professionals said in a survey published today by CWJobs.co.uk. More than half the IT professionals polled said they would not do an IT-related degree today if they were paying the increased fees, which …
I actually state that there are those who can teach themselves; maybe I should have stressed that point more. In fact I studied astronomy, so much of my computer science and coding skill is self-taught. This is why I am still learning new things about core computer science.
I also notice there are huge differences between IT degrees themselves. Some degrees are much more oriented towards learning to use available tools and languages than towards actual problem solving. We sometimes get students from these courses applying for our MSc course in computer science. These guys know way more than our students when it comes to the common tools out there, but they have huge problems with their problem-solving skills. The best acquire these skills; the rest flunk the course.
What you said is so true, and unfortunately also occurs in other scenarios, e.g. Java, which does run on almost everything (or certainly a lot more than .NET), and yet it's still not the ideal solution...
I've been hired by numerous degree students to write their dissertations for them, because they were too lazy and badly trained to write it themselves.
I think your example just demonstrates the point - that your recursive student developer was far more useful in the real world, the fact that he could prove his solution was more efficient should have amply illustrated that he had already learned what a recursive algorithm was.
IMO self-taught developers are ALWAYS more skilled than those with formal training or education.
There is generally much less time for pointless exercises in navel gazing in the "real world" - like insisting that something be recursive when there is no need for it.....
> and have seen quite some horror's produced by various "bedroom coders"
Similarly, I've seen some total dogs' breakfasts served up by formally-taught coders. Including academics.
> I told him of course he could hand his assignment in in C#, so long as he did not
> mind failing the course
Was this a Java course, or a programming course?
> He found this unreasonable
As do I, unless it was specifically a Java course.
> I suggested my attitude reflected that of a potential employer or customer
My experience of both employers and customers is that almost all are entirely amenable to the programmer picking the tools to solve the problem, so long as he can justify his decisions. I would suggest that your attitude is representative only of a few dogmatic employers.
 I had one University lecturer who was introduced to us as a real high-flier, who wrote code so complex that computers weren't available that could do his code justice, so he set up a company to build his own computers. When I later went on to work for the company that supplied him the processors, he was known as "that nob that can't write code to fit on a sensible number of cores"...
I taught myself to program starting with the zx81 using basic and moving on. However I never had any confidence that I was actually doing things in the best manner. It was utterly impossible to get a job back then without a degree. I eventually went and did a BSc in Software Engineering. I was horrified that some of the computer science degrees at our uni didn't even require a single programming module to be taken over the three year course!
Even among those who did do courses requiring some programming, many were dreadful, to be honest. Lots of them were there because it was a trendy subject which could earn you good money afterwards. I think the answer is:
There are good degrees and there are dreadful ones.
There are good educated programmers and dreadful ones
There are good bedroom coders and dreadful ones.
Long interviews and some testing are the only real way to determine who you're dealing with.
I spent some years writing quick and dirty 'job doers' when problems needed solutions yesterday. Yet your account struck certain chords with me. The candidates with whom you crossed swords were clearly not cut out to succeed, yet they had reached a university; I find that worrying in itself.
With a tiny bit of thought (gosh, that is a rare concept) they would have realised that they could do this and pass, or do that and fail. However, that would still only prove they can find the way through a simple maze to reach a self-interested goal, e.g. do right and pass, do wrong and fail - a pretty straightforward logic gate to me.
My experience was of graduates overseas who replaced the qualified, time-served trainees. As electronics technicians with a book they were possibly OK, perhaps even better than OK. Set them to fault-find real equipment problems, though, and please walk away: it was going to be a long wait for progress; you might need not only a good meal but a bed for the night.
An IT graduate was set a straightforward process to create. Three months later: masses of code and no output. Booted him off, and another practical chap did it in six weeks while still doing his day job. I guess the first one went to the NHS systems development team.
Time served practitioners at least know how to approach their work, possibly the questions to ask and the answers they need to progress the job.
Oh I agree.
Both Bedroom coders and degree bearing gits can both produce crap code.
Universities today fail to teach the basics, even to their masters students.
If you can't think, you can't code.
While that sounds simple, you would be surprised at the number of stupid gits who come up with crap or want to touch code just for the sake of getting their name and credit on the code.
And yes, I've cleaned up enough crap from all types that I almost decided to call myself a sanitation engineer.
"He said but my implementation is more efficient, we said that was true, but that the assignment was to learn recursion. He said but my implementation is more efficient, we said that was true, but that the assignment was to learn recursion."
If he worked for you, and you worked for me, I would have quickly given him YOUR job, because efficient code is required to make or save money. I want programmers that generate results, not ones who follow the same old broken cookie cutter.
Schools often teach the formalized way of doing things, but it is not usually practical in the real world.
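To make the recursion-vs-efficiency argument above concrete, here is a small sketch of my own (not from the thread; the actual assignment is never named, so factorial stands in as a hypothetical example) showing a recursive implementation next to the kind of iterative one the student handed in:

```python
def fact_recursive(n: int) -> int:
    # What the assignment asked for: a base case, then a self-call.
    return 1 if n <= 1 else n * fact_recursive(n - 1)

def fact_iterative(n: int) -> int:
    # The student's point: a loop gives the same answer in constant
    # stack space, with no function-call overhead.
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

assert fact_recursive(10) == fact_iterative(10) == 3628800
```

Both are correct; the disagreement in the thread is only about which one the exercise was meant to teach.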
Your first point made sense, then you kind of threw it all away.
The vast majority of software employment is maintenance - an application already exists, it must be extended to add new features and bugfixed to correct the mistakes and incorrect assumptions made in the past.
Do you think you can choose to write your new features and bugfixes for a C# program in Java, or a LAMP stack in HLSL?
There are only two situations where it's possible to choose the language and framework for a task:
A) It's a brand new project, and will not be using any modules that company created for previous projects.
B) The project is to port an existing product to a new language or framework - in which case the programmer rarely gets to choose because they're being employed/directed to do that particular port.
In all other cases, the language and framework are pre-defined by history - either by the modules you already have or the program it is extending.
Even in case A), any software company will have a set of preferred languages and frameworks, and as a programmer you will write in the one chosen for that project.
In a good team you'll be able to influence which language and framework are chosen, but at the end of the day it will be selected and whether you agree or not, you will write that project in that language and framework.
Your post implies that you've usually been writing-from-scratch on your own, and that's not the way most people earn their corn.
University education is often the first time people are held to a specification - coding at home you can make the program do whatever however you like, but in the real world there are specifications to meet and other people and teams to work with if you want to get paid.
It's pretty rare for the requirements not to define the OS(s), language or framework.
I started programming in primary school, went through secondary school taking every IT class they had and breezed through it. Didn't bother going to uni. Got a job developing at a private investment bank on the strength of my predicted grades (didn't ever finish or hand in any coursework), and doubled my salary every 12-15 months until I was 22, then quit full-time work on a salary enormously higher than any graduate was getting, and started freelancing for even more.
Aside from being up-to-date with the latest frameworks and teaching techniques, most uni graduates are hopelessly inept at solving real world problems until they've been working in the wild for a bit. Not saying they aren't smart people - some of them are very smart. But having a degree is no yard-stick for how competent somebody is at solving a problem or applying abstract thinking through code.
I gather that the guy who handed in the iterative solution to your task may not have got the top marks, but probably went on to earn much more than your teacher's salary?
"Another story I like is the guy who handed in an iterative solution where the assignment explicitly stated: "implement a recursive method to compute ....." He argued this was more efficient, we said that was true, but that the assignment was to learn recursion. He said but my implementation is more efficient, we said that was true."
The guy has the right attitude: simplicity and effectiveness. Programming is all about solving the problem, and if the problem can be solved in a way that's simpler and more effective then it's a better solution, and forcing someone to take an approach that's not that way goes against good programming.
I suggest that what you do is to find a problem that can't be done without recursion. The classic one is a product/part explosion where product A is made up of products B and C, B is made up of D and E, C is made of F and G and so forth. Then get someone to provide a list of parts and a total cost (or something like that). You can't solve that without recursion.
And please teach people about trapping situations where the database gets in a mess and A is made up of B and C, and C is made up of A and D. I've seen that a few times in the real world and the program just sits there constructing parts until a stack overflow occurs.
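A minimal sketch of that part explosion, with the circular-BOM trap handled explicitly rather than left to blow the stack (Python; this is my own illustration - the part names and price table are invented for the example):

```python
def total_cost(part, bom, prices, seen=frozenset()):
    """Recursively price a part: its own cost plus all sub-parts.

    bom maps an assembly to its list of sub-parts; prices maps parts
    to a unit cost. `seen` tracks the parts on the current path, so a
    circular BOM (A contains C, C contains A) raises an error instead
    of recursing until a stack overflow occurs.
    """
    if part in seen:
        raise ValueError(f"circular BOM at part {part!r}")
    cost = prices.get(part, 0)           # assembly's own cost, if any
    for sub in bom.get(part, []):        # leaf parts have no entry
        cost += total_cost(sub, bom, prices, seen | {part})
    return cost

# A = B + C, B = D + E, C = F + G, as in the post above
bom = {"A": ["B", "C"], "B": ["D", "E"], "C": ["F", "G"]}
prices = {"D": 1, "E": 2, "F": 3, "G": 4}
assert total_cost("A", bom, prices) == 10

# The corrupt-database case: A -> C -> A is trapped, not a crash
try:
    total_cost("A", {"A": ["C"], "C": ["A"]}, {})
except ValueError:
    pass
```

Note that `seen` tracks only the current path, so a diamond-shaped BOM (the same sub-part used in two branches) is still priced correctly in both places.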
But it's not a work situation. It's a school situation. The desired outcome is not efficient code, but for the student to learn recursion, so he failed.
In order to learn recursion, you must first learn recursion...
He's teaching computing not English. You might be forgiven for expecting impeccable English from all teachers whose first language is English. When the teacher has already stated that he(?) is based in the Netherlands and English probably isn't his first language, it seems a little unreasonable.
That said, most people who learn English outside of the English speaking world seem to get a better education in the language.
I taught computing too (O.K. it was one of those store front Ponzi schemes you see advertised during the day "training to become a medical receptionist, cosmetician, Security professional, computer programmer..')
I did teach my students the basics (3 operations, sequence, iteration, decision)
file systems (flat, indexed, random) etc. etc
There were a couple of students who were really serious and I gave them the best advice I could: work on typing/data entry and RPG, and try to get a data entry/operator position at a small company with an S/36 or S/38 and work your way up, cos you know RPG. IBM had just come out with OS/400 and there were lots of opportunities.
> The vast majority of software employment is maintenance
OK so far.
> Do you think you can choose to write your new features and bugfixes for a C# program in Java
The answer, as always, is "perhaps".
What I am advocating - what I am always advocating - is using the right tools for the job.
If that means switching languages within an application - so be it.
You should notice that, in my original post, I made the point about "so long as he can justify his decisions". A change in language mid-project is going to cause ructions; if a programmer can justify that change, then he has chosen the right tool for the job. If he cannot justify it, then he has not.
> There are only two situations where it's possible to choose the language
No, you are incorrect here.
Most toolchains have the ability to export symbols to other tools. Mixing languages within a project is hardly uncommon.
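As one concrete (and hedged) illustration of crossing a language boundary: Python's standard-library ctypes module can bind symbols exported from C. This sketch is my own, not from the thread, and assumes a POSIX system, where `CDLL(None)` opens the running process and thus the C library it is linked against:

```python
import ctypes

# On POSIX, CDLL(None) loads the current process, whose dynamic
# symbol table includes the C library it is linked against.
libc = ctypes.CDLL(None)

# Declare the C signature so ctypes marshals arguments correctly:
#   size_t strlen(const char *s);
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

assert libc.strlen(b"mixed-language") == 14
```

The same mechanism is how large projects routinely mix Python, C and C++ in one codebase.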
> Your post implies that you've usually been writing-from-scratch on your own
Not so. Your post implies you haven't seen the mixed environment that frequently occurs in large projects.
> It's pretty rare for the requirements not to define the OS(s), language or framework.
I would argue the opposite; it's quite rare, at a requirements capture level, for those to be defined. Once the detailed design is completed, some or all of them may have indicators of varying strengths, but only a truly closed mind would refuse to consider alterations if a proper substantiation is supplied.
but there is flexibility in how you get it.
Degrees don't guarantee good performance in any field, nor does exuberance. We all know self-taught programmers who are good and well-papered people who are crap, but those are the exceptions.
What we do know is that if you go looking for good programmers amongst those with CS degrees then you have a far better chance of finding a good one than if you take random candidates from the pavement.
Ours is an industry of change and you need to work hard just to stay current. Good programmers invariably keep reading and keep experimenting through their lives. Those with a piece of paper see the degree as just a start and not the end of their education.
Grocers' apostrophe - brilliant! However, a quick re-parse...
"I taught programming and have seen [that] quite some horror [i]s produced by various "bedroom coders"."
It's an old code, but it checks out!
You're being rather knob-headed to expect any programmer to know what .NET and Lamp Stacks are and expecting programmers to be pre-loaded to your requirements.
Such detailed knowledge will depend very much on the programmer's area of expertise.
What is more important is if the person can learn and become productive within a short time.
Take me for instance. I'm pretty capable. I've been writing software for around 30 years - mainly kernel stuff and embedded. I only have the slightest knowledge of what a LAMP stack and .NET are (enough to get me into trouble and certainly not enough to get out again).
I could however learn and come up to speed quickly if needed.
I won't return the compliment by calling you an idiot for not understanding the details of how the CAN bus or ARM assembler works.
Actually - I DON'T expect a programmer to come pre-loaded to my requirements; that was kind of my point - Universities seem to be churning out programmers that ARE pre-loaded to some predetermined blueprint, to *somebody's* requirements, with a specific set of skills - they don't seem to have been taught how to learn or solve problems, nor do they have a real interest in the field they're employed in (remember crapverts like "get an IT qualification - earn ££££s"?)
I don't care if they're a C# .NET developer or some kind of warped guru who thinks in x86 Assembly... however I do expect, if I'm employing a programmer, that they have enough interest in the field in which they're working to know that there are other operating systems beyond Windows, that .NET is MS-specific (Mono notwithstanding), that if they're a .NET developer (for example) I'd expect them to know the shortcomings of that as well as the benefits; they must have an ability to think on their feet rather than dogmatically pursue a language/framework that they're comfortable with when it's not the best tool for the job.
The most important part of programming is problem solving - coding is pretty much a matter of grammar and vocabulary in which you can be retrained; it's not a massive leap to move from C# to Java for instance - or to a lesser extent C++ or PHP. I'd expect a computing graduate to understand the principles and be able to problem solve - _really_ detailed knowledge of the language comes best when you're using it day in day out, i.e. on the job.
I'll happily admit that I know nothing about how ARM assembler works, I've never needed to, BUT I have enough interest in this crap to fondly remember RiscOS on the Archie (which ran on an ARM chip) from some oooooh 15+ years ago.
I don't need no steenking degrees!
For people who are going to be purely coders of desktop and web applications, there's probably a large grain of truth but for many IT jobs, there's a lot more going on than just coding. Any decent degree course *should* be preparing students for a lot more than just coding.....
A degree is not always required to do the job, but good luck trying to get an interview at any large employer without one.
Contributing to an open source project from your bedroom is fairly hard graft:
First you have to hang out on a msg board for months, then you've got to get to know the code, which will take forever, then you'll be asked to justify whatever changes you've made (this is quite tricky when large egos are involved).
All this on top of having to learn a programming language in the first place :-)
I'd say this is harder than the 3 years of an Electronic Systems Engineering degree - and in the end it was the bedroom coding experience in TCL (of all things) that landed me my first job
...but I have found that most of the good developers I've worked with did their degrees in electrical engineering, not ICT.
--- this is not relevant to previous post, just being lazy and tagging this on ---
Nothing annoys me more than the "a degree proves you have learned to learn" or the cobblers about university being "gaining life experience" that so many academics witter on about.
If modern A levels were anything like as difficult today as they were 20-30 years ago (when very few would even attempt more than 3 because of the work load required), that proved you had "learned to learn", because if you hadn't you failed them. Also nothing provides better "life experience" than getting out of school, getting a job in the real world and working with others, but most academics in teaching have never done that, so they wouldn't know.
but I seriously doubt I'd consider hiring a developer who didn't know the difference between the word 'waste' and the word 'waist'.
It would make me think that attention to detail wasn't your strongest area, and I probably wouldn't get clear and unambiguous documentation out of you.
My university course was, other than a few key modules, pretty damned useless. Almost everything I do as a developer involves stuff I taught myself or acquired on the job. I've since encouraged people younger than me to consider whether university is necessary for their career, or whether they would be better served spending that time getting direct work experience and skipping the large amounts of time and debt involved.
Personally, I'd have 3 of those 4 years back in a heartbeat.
> My university course was, other than a few key modules, pretty damned useless.
Much of mine was worse than useless.
There were a number of us on my course who had been software professionals before going to university - in those days, software was still largely seen as a new discipline, so people without formal qualifications were routinely found throughout the discipline.
There were several stand-offs where an academic tried to push through principles that we "would need in industry" which were somewhat different to what we were using in industry...
 My favourite example was when one guy was trying to get us to favour recursion over iteration. His argument was that the formal proof of a recursive function is easier than an iterative one. That's true, IMO - and entirely irrelevant; what's vital is that the tiny embedded system you're programming doesn't eat its entire stack in the first operation. That it is harder to prove to be correct merely means that the coder has to work harder to do so...
Sure, you need to be careful of stack space. That still doesn't mean recursion isn't the right solution though, how about tail recursion? Works fine for many problems, is amenable to formal proof and doesn't consume any stack space.
If you've thrown out recursion as a concept then you're never going to find that optimal tail recursive solution if it exists - of course, an iterative solution might be better still depending on the specific problem.
Arguing that iterative is always better than recursive is stupid - arguing that one is often better than the other is defensible.
Incidentally, the formal proof that your algorithm is correct is very important in some industries, while at the same time being almost irrelevant in many others.
Just because a given concept appears pointless for the work you're doing right now, doesn't mean it's pointless for every job you're ever going to do. Put that idea in the back of your brain and sooner or later you'll find it's part of the best solution for something.
It sounds (reads) as if most of you went to a university when you should have gone to study City and Guilds or NVQs or some sort of apprenticeship or whatever the modern job training system is. A good degree, with the exception of certain professions, trains the mind. Even medicine and law top up the academic education with practical, work related training.
As an employer or manager, one of your most important jobs is to teach new staff the reality of working and the new skills required for that, not to make smug observations, challenge them with your pet programming job, probably badly specified because you never learnt how to specify and document.
The two worst types of horror I have experienced are the Windows and the Linux "hacker". They get something working; but it costs a fortune to correct, maintain and understand for ever afterwards, using all the tweaks available only in their pet, home environment. I have suffered these opinionated fools in three countries. Aaaaaaarrrrhhhh!
> If you've thrown out recursion as a concept
I hadn't. No-one had. I didn't mention anything about throwing out recursion.
What I mentioned was an academic who was trying to throw out *iteration* because it made the formal proof harder.
Please tell me you can see the difference between those situations...
> Arguing that iterative is always better than recursive is stupid
So it's a good job no-one did that, then.
Would you also consider it stupid to argue that recursion is better than iteration? That's what this lecturer did, and I consider that very stupid.
> Incidentally, the formal proof that your algorithm is correct is very important in some industries
Yes, I've worked in those industries. I understand the need for formal proof. However, the point of my anecdote was that this guy was insisting on an extremely sub-optimal implementation just to make that proof a little easier to do. That's just laziness.
Please respond to the post I made, rather than making up what you thought I said...
On the one hand, I firmly believe that a computing degree (particularly software engineering) is a must for developers - there are too many untrained cowboys.
On the other hand, the salary one can get as a developer makes a £9000 loan for the degree seem like a bad deal.
I've run into several University grads who are useless as programmers. The attitude of "This should work so it's not my fault if it doesn't" is a problem on both sides of the fence.
> I've run into several University grads who are useless as programmers.
When I was interviewing grads for a large software operation, I had several candidates who decided I owed them a job because they had Firsts.
It was quite amazing how quickly some of them lost the plot when presented with some fairly straightforward coding problems.
One guy flatly refused to answer my questions when I gave him a design problem involving what he considered to be a mundane piece of household equipment - he thought it was beneath him. I wanted to see how he would go about designing an embedded computer system, and the application is entirely irrelevant.
Education is for the benefit of the student, not employers.
Study something you enjoy learning about. Your tutors are there to teach you how to learn.
I don't know what they teach these days, but I did computer science, which covered all sorts of things such as AI, relational algebra and database theory, super-computing theory, and 3d graphics. The programming was mostly incidental to learning other things so we did Pascal, C/C++, Ada, Lisp, Miranda and so Forth (geddit?). Sadly no cobol, as that's a real money spinner...
Anyway, there wasn't much that I've seen in action since or used in anger. All I use now is what I learnt at o-level - binary/hex/decimal conversions and logical operations (used for IP address calculations).
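Those o-level basics really are the working set for IP arithmetic. A small sketch of my own (not the poster's) of the mask-and-AND calculation the comment alludes to:

```python
def network_address(ip: str, mask: str) -> str:
    # AND each address octet with the corresponding mask octet.
    octets = (int(a) & int(m) for a, m in zip(ip.split("."), mask.split(".")))
    return ".".join(str(o) for o in octets)

# A /24 mask keeps the first three octets and zeroes the host part.
assert network_address("192.168.13.37", "255.255.255.0") == "192.168.13.0"

# The same first octet in the binary/hex/decimal forms mentioned above.
assert int("11000000", 2) == int("C0", 16) == 192
```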
This is exactly right, and to be honest has been right for at least 5 years
Having been in a position to say "been there, tried that and failed miserably", I can tell you that without a degree you stand a snowball's chance in hell of getting an IT job. No employer will look at IT coders/admins without a degree, because the vast majority of people in the industry do have degrees and the cost saving is negligible.
When I was made redundant last Christmas I, without a degree, had to decide which of the 3 job offers I had received by mid January I wanted to take. It got to the point I was turning down interview offers.
Nice idea, except a large number of job adverts specify that you must have a degree, some of the higher paid ones insist on a 2:1 or better. Also I'm sure I saw a games job advert that required a degree. You also see this "Must have a degree" for php jobs which isn't as far as I know taught at unis.
You also see this "Must have a degree" for php jobs which isn't as far as I know taught at unis.
What's that got to do with the price of bacon - a degree SHOULD NOT teach you a programming language (specifically) it should teach you how to program (big difference).
" a degree SHOULD NOT teach you a programming language (specifically) it should teach you how to program (big difference)."
Couldn't agree more, and yet Java and similar now seem to be taught a lot more than C and similar. People won't learn memory management etc. etc.
That's pretty much an automatic ask from an HR dept though. If graduates weren't as common, they'd stop asking for it once a position has gone unfilled for long enough.
... is that few "traditional" forms of education are equipped to handle the modern world ... and even fewer "cutting edge" kids are equipped to handle the real world.
It's probably been that way since the year dot ... but with today's connectivity, it's becoming blindingly obvious that humanity needs to step it up a notch. Or two. Allow, and indeed encourage, tool users to use tools ... it's what made us Human in the first place ...
Education is a racket to create jobs for the educators.
In principle some people could benefit from a specialised and abstract education rather than on the job apprenticeship. In practice academics produce more academics, advancing an academic career and being useful in the real world grow ever further apart. Bright people with a passion for whatever subject will teach themselves, the dullards will game the system and get bits of paper. The world is full of the dullards trading bits of paper among themselves.
I started in computer operations in 1973, went to university in 1995 because no one would look at my CV without a degree. Now a PhD and working as a consultant.
... but try getting that past the HR fucknuts who seem to think a degree is required to take a shit unsupervised...
Not having a degree has put me out of the running for so many jobs, despite having the ability to do them blindfolded and with one hand tied behind my back.
Just ignore the "must have a degree" and apply.
My last 4 employers (that's over a decade's worth) all stated "must have education to degree level" or similar in their job ads, but that didn't stop me getting to the interview and being given the jobs.
Perhaps the fact that I ignore all the CV writing recommendations (from the educationalists) and put the totally irrelevant "what I did at school" stuff on the back page of my CV helps. The HR staff probably cannot be bothered to read more than the 1st paragraph.
When I was made redundant in May 09 (the whole company has gone now) I didn't work for about 6 months. That was because I really didn't want to start commuting to London again, but there just were no jobs in Kent I wanted (agencies kept offering me HTML/Perl, but I do C/C++), and I definitely was not looking to move house. After 6 months and a brief contract period, I gave in and decided I'd have to include London: I was back in full-time work 3 weeks later.
20+ years experience, asked for by name by clients, citations from multiple IT managers/directors.
Can I get my CV past the HR departments?
Can I hell.
I've been working in IT for 17 years now but larger companies won't touch you unless you have a degree, although bizarrely that only applies if you're full time, they'll take anyone on contract.
Biting the hand that feeds IT © 1998–2018