C overwhelmingly proved the most popular programming language for thousands of new open-source projects in 2008, according to license tracker Black Duck Software. The company, which monitors 180,000 projects on nearly 4,000 sites, said almost half - 47 per cent - of new projects last year used C. Black Duck said 17,000 new open- …
link to the article
here's a link for the source, which seems to be missing from the reg article:
Not much more there than what the Reg has posted; a shame, as it would be pretty interesting to see a proper table breaking down the language rankings!
I code in C, but most of the youngsters use C++ or C# ... Yes, I know the latter languages too, and when to use them. I just prefer good ol' C for day-to-day coding. It doesn't get in the way when I want to play StupidHardwareTricks[tm] ...
Anyway, any idea on the percentage breakdown of C-like languages?
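For the curious, here's a flavour of what I mean by StupidHardwareTricks[tm]: mask-and-shift field extraction from a packed status word. The register layout below is entirely invented for illustration; on real hardware you'd read the word through a volatile pointer at the device's mapped address:

```c
#include <stdint.h>

/* Invented layout: bits 0-3 = device id, bit 4 = ready flag,
 * bits 8-15 = error count. */
#define DEV_ID(reg)    ((reg) & 0x0Fu)
#define DEV_READY(reg) (((reg) >> 4) & 0x1u)
#define DEV_ERRS(reg)  (((reg) >> 8) & 0xFFu)

/* Pack the fields back up - handy for writing control registers,
 * or, as here, for testing the masks against themselves. */
static uint32_t make_status(uint32_t id, uint32_t ready, uint32_t errs)
{
    return (id & 0x0Fu) | ((ready & 0x1u) << 4) | ((errs & 0xFFu) << 8);
}
```

Try doing that tersely in a language that hides the bits from you.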
bash-3.1# which C
which: no C in (/usr/local/sbin:/usr/sbin:/sbin:/usr/local/bin:/usr/bin:/bin:/usr/games:/usr/lib/java/bin:/usr/lib/java/jre/bin:/usr/lib/qt/bin:/usr/share/texmf/bin)
I don't expect it to be massively popular, but I'd expect it to show up at least?
It's not much of a surprise to me
Some PHP and Ruby fanboys out there fail to realise there is more to the interweb than a web site. And that is mostly what PHP and Ruby seem to be used for. There is an endless supply of PHP scripts for blogs, CMS, and so on. That doesn't mean they're bad programs, but a lot of them do very similar things but with different bells and whistles.
You could possibly write a database in PHP or Ruby, but most likely C++, C or Java will be better.
What is (Open) Solaris written in? And Linux? And FreeBSD? And OpenOffice? And (core) Apache? And much of the GNU tool set? It's not PHP or Ruby... They may have Perl/Python/whatever add ons but they are not crucial and can most likely be written in any language.
I'm not knocking PHP or Ruby, but I have noticed that some fanboys really are ignorant.
So basically, their PR department, which compiled this press release, doesn't know the difference between C, C++ and possibly C#.
When can we dump C & use modern techniques?
It's past time to use the right tool for the right job. C is good when the occasional StupidHardwareTricks[tm] are necessary; otherwise it's just fine for UnnecessaryDatastructureViolationBugs(C).
Moving up won't eliminate bugs, but it'll bring significant improvement.
That'll piss off Sun. They like to think that most of the internet is built in Java.
Graph for 2001 to 2006
I just threw up in my mouth a little.
"When can we dump C & use modern techniques?"
It will happen only if someone comes up with a modern technique that beats the portability and power of good ol' C for certain tasks ... Trust me, the kernel coders of the world are fully aware of when to use "modern" programming languages, but they still code kernels in C.
C was written for a reason, and it's still being used for that reason.
Note that I'm not addressing people choosing C for inappropriate projects. It's not the hammer's fault if someone tries to use it as a screwdriver.
Also note that I'm not addressing errors that can be introduced by neophyte C programmers ... Those of us who have been using C for two or three decades or more understand the issues. It's the people who have taken C-101, 102, and 103 at Uni and then get hired as "professional" C programmers with zero street smarts who are the problem, not the language itself.
Also note that I don't blame the neophytes. I blame the so-called teachers who rarely even go as far as teaching what the heap and the stack are for, much less how the compiler uses them ... and the HR-drones who only look at pieces of paper when hiring.
I have many tools. All are good, in their place. I don't blame any of 'em for my own faults.
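Case in point: the heap/stack lesson those teachers skip fits in a dozen lines of C. A toy sketch (the function name is mine, not from any syllabus):

```c
#include <stdlib.h>
#include <string.h>

/* Stack: automatic storage, gone the moment the function returns -
 * never hand its address to the caller. Heap: malloc'd storage that
 * survives the return, and freeing it is the CALLER's job - forget
 * that and you have the classic neophyte leak. */
char *dup_on_heap(const char *s)
{
    char stack_buf[64];                       /* dies at return */
    strncpy(stack_buf, s, sizeof stack_buf - 1);
    stack_buf[sizeof stack_buf - 1] = '\0';

    char *heap_buf = malloc(strlen(stack_buf) + 1);  /* survives */
    if (heap_buf != NULL)
        strcpy(heap_buf, stack_buf);
    return heap_buf;                          /* caller must free() */
}
```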
re: When can we dump C & use modern techniques?
Perhaps when modern engineers can code for toffee?
Without being bitten on the backside by the damned machine actually doing what you ask, I've had the misfortune to see some truly hideous "working" code.
At least bad C code has a tendency to cause nasal demons, which are easily cured with the appropriate balms.
Well said, that man
Presumably you're including C++ in that C? But not C# which isn't C and isn't #. And presumably not Objective C, which objectively isn't C either.
RE: When can we dump C & use modern techniques?
Shush! I was just getting comfy with the idea all those old C skills I haven't used for years are still relevant. Actually, all those C skills I have forgotten would be closer to the truth.
Regarding the C++ question posted above, I must admit the majority of my C++ code was simply old C code in a C++ wrapper, and I found that quite prevalent even on professional projects. It might be that the code scanner they are using to detect the language is simply picking out the internal C patterns and ignoring the wrapper.
Re: More Releases = Better?
From their site:
We would expect projects that support a large number of releases to tend, more often than not, to have a more committed team of talented developers.
I really don't think that follows.
More Releases = Better?
Call me stupid, but I don't get why you would judge a project on the number of releases; I'd have thought that 191 releases a year shows a lack of QA, no?
RE: Re: More Releases = Better?
I'm guessing they are looking at the glass half-full, which goes against all that is holy in the world of development.
They look at multiple releases as committed developers bringing out new features each release, whereas some would tend to see bug-fixes, and/or wild thrashing of the keyboard in a coffee-induced fit at the desk at 3.31am and accidentally hitting commit!
Every language has its place... and every programmer has their preferred language of the moment...
Re: Re: More Releases = Better?
no, they probably do have "...a more committed team of talented developers," but they probably spend all their time developing rather than testing that the stuff they churn out actually works or does what it's supposed to do (if they even know what that is...)
Agree 100%. In fact, more releases might well = worse, as things get patched due to the poor quality of previous releases. They should be using a mixture of metrics such as commits, committers, downloads and so on. I imagine, though, that then they would actually, you know, have to do some work for the report rather than just auto-analyse a bunch of stats on open source sites.
@When can we dump C & use modern techniques?
Because most "modern techniques" are like shitting in a bidet - usable, but misguided - and you are likely to end up smeared in shite.
re: When can we dump C & use modern techniques?
It might reflect the prevalence of media stuff recently; only an idiot would prefer Java over C for any kind of codec or media player.
C++ isn't really different enough from C to be considered independently.
I bemoan the lack of definition regarding C and its offspring.
C has always somehow made a lot more sense to me than C++, and it would be good to see how many new projects are using actual C. You've got your memory and your instructions all wrapped up in some nice meaningful words - why would you need more than that?
I have also just recently discovered the joys of Haskell and Lisp. It would be nice to see Haskell get a boost.
Rather surprising how little Python there is in their survey.
Lots of ways to measure popularity
Measuring language popularity is a difficult task: quality data is hard to come by, and what exactly is meant by "popularity" is itself debatable.
re: re: When can we dump C & use modern techniques?
@jake: I completely agree and I'm not blaming the languages (it's fine, and as @Beard said, "C has always somehow made a lot more sense to me than C++" - have to agree there too in some ways). I'm blaming the people who use it where better tools exist.
But it was never designed for writing huge systems, and the continual buffer-overflow failures in everything from databases to OSs seem to be telling us something.
@ Ian C: hard to know if modern engineers are any better than old ones. As for "bad C code has a tendency to cause nasal demons", well the obvious faults can cause obvious explosions. Buffer overflows, race conditions etc. tend not to, which is why they are so serious security-wise. Even good people make mistakes, and I want fail-stop not fail-crawl-corrupt-leakData-crash. The only way to ensure that is through modifications in language/compiler/software engineering.
@Steen Hive: Um, it's down to informal use of, for example, multithreading that so many race conditions occur. It's been known since the 70's that the compiler could (and some did - qv Per Brinch Hansen) catch most race conditions, but that got ignored. See <http://brinch-hansen.net/papers/1999b.pdf>, extracts:
In 1975 Concurrent Pascal demonstrated that platform-independent parallel programs (even small operating systems) can be written in a secure programming language with monitors. It is astounding to me that Java's insecure parallelism is taken seriously by the programming community, a quarter of a century after the invention of monitors and Concurrent Pascal.
It has no merit.
Although the development of parallel languages began around 1972, it did not stop there. Today we have three major communication paradigms: monitors, remote procedures, and message passing. Any one of them would have been a vast improvement over Java's insecure variant of shared classes. As it is, Java ignores the last twenty-five years of research in parallel languages.
If programmers no longer see the need for interference control then I have apparently wasted my most creative years developing rigorous concepts which have now been compromised or abandoned by programmers.
I'm sure all of us here have had to debug other people's race conditions and stray reads/writes, and it's horrible. We end up being smeared in shite, in your words, partly because we choose not to avoid being smeared in shite.
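For illustration, the interference control a monitor automates has to be done by hand in C. A minimal pthreads sketch (the counter example and names are mine): nothing in the language stops anyone touching `counter` without the lock, which is exactly Brinch Hansen's complaint.

```c
#include <pthread.h>

/* A hand-rolled "monitor": by convention, counter may only be
 * touched with the mutex held. The compiler does not enforce it. */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static long counter = 0;

static void *bump(void *arg)
{
    for (int i = 0; i < 100000; i++) {
        pthread_mutex_lock(&lock);
        counter++;                 /* critical section */
        pthread_mutex_unlock(&lock);
    }
    return arg;
}

/* Run n bumping threads (n <= 16 in this toy) and return the total. */
long run_threads(int n)
{
    pthread_t t[16];
    counter = 0;
    for (int i = 0; i < n; i++)
        pthread_create(&t[i], NULL, bump, NULL);
    for (int i = 0; i < n; i++)
        pthread_join(&t[i], NULL);
    return counter;
}
```

Drop the lock/unlock pair and the result becomes nondeterministic garbage, which is the whole point.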
Oh well, IMO anyway. Flame on.
Oh, and @Beard, I'd love to use Haskell but it seems not ready for production use.
C does make sense. Lots of sense....
Yes, it would be nice to have some definition regarding C++ here, but C is the "big beast".
Apparently across the commercial and FOSS worlds Java is now the most used, at 19%, followed by C at 15% and then there's C++ at 10%. See the TIOBE Programming Community Index.
"The numbers are a surprise" ?
They aren't really a surprise unless you happen to be a PHP or Ruby weeny and harbouring the delusion that your skills are somehow relevant in the wider computing sphere. Both are domain specific tools for building web sites, useful in that context, but frankly not much use outside of it where there are far better and more appropriate tools.
I suspect the prevalence of C reflects the fact that outside of the odd CMS-derived frameworks like ticketing and CRM systems, most FOSS software is not run in the 'cloud', but on a box, e.g. applications. Running on a box means either native, which means C, because nothing else is portable enough to even compile across nix flavours, or cross platform, which means C with a cross-platform framework like QT, or Java, which, to the picky, will be running a VM that was written in C.
C is, was, and will be, the rock upon which the church of IT is built. This will remain the case at least until someone builds an OS out of a managed, type safe, language, which won't happen until the type safe wars have died down a bit and a language that everyone can accept, or can be stuffed down enough people's throats, comes along. Which may well be never; nothing has ever had the broad appeal of C.
and......no Python ?
wow looks like Python doesn't even exist.........
It comes down to this, multi-threading or no: you can't prove to anyone that a formally correct program written in C is any less reliable than a formally correct program written in ToolDuJour or any other language, and in practice it is no easier to prove that a program *is* formally correct in another language than it is in C.
If it were, Ada would absolutely rule the universe at the moment, but it doesn't, because it's rubbish. C is rubbish too, but responsibility for its rubbishness lies solely with the programmer.
/gets coat and shakes head as several million 100MB runtimes are loaded to execute System.out.println().
"...and it is no easier in practicality to prove that a program *is* formally correct in another language than it is in C"
That's entirely not true, in whole-program proofs or proving only of certain important fragments.
Read the link I posted. Another quote:
"The Concurrent Pascal interpreter ensured mutual exclusion of all operations on the variables of any process or monitor. "
By imposing quite reasonable constraints on the language he allowed the compiler to prove absence of race conditions automatically. Try getting gcc to do that on an arbitrary piece of C.
So he did this 30 years ago and we still haven't learnt.
C was never designed for huge systems?
You mean like UNIX?
Hmmm.... methinks you have things backwards there.
C (and C++) offers unparalleled flexibility and performance. You can code to the metal and you do exactly what you want with it.
Which is why some folks get into trouble, sure.
Also, concurrency in C is fine if you're not an idiot.
"The Concurrent Pascal interpreter ensured mutual exclusion of all operations on the variables of any process or monitor. "
Oh come on! Wrapping every variable in a mutex will make your code run like a dog and all you'll get for your trouble is replacing your earth-destroying race conditions with deadlocks instead!
Further, the Concurrent Pascal interpreter (judging by its name) does the one thing that could possibly make Pascal a more hideous language to use - it interprets it! - thereby cheating by having runtime info available to it that a compiler can't.
I'm always amazed that people can't see why C persists - It's not inertia, it's not laziness, it's not elegance. It's actually got bugger all to do with programming per se - it's called the law of diminishing returns.
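To be fair to the mutex camp, the deadlock half of that trade yields to one dumb discipline: always take locks in a single global order. A sketch (the account example is mine; ordering by pointer value is common practice, though comparing pointers into different objects is strictly undefined territory in ISO C):

```c
#include <pthread.h>

struct account {
    pthread_mutex_t lock;
    long balance;
};

/* Lock both accounts in address order, so two transfers running in
 * opposite directions can't each grab one lock and then wait forever
 * for the other - the textbook deadlock. */
void transfer(struct account *from, struct account *to, long amount)
{
    struct account *first  = (from < to) ? from : to;
    struct account *second = (from < to) ? to : from;

    pthread_mutex_lock(&first->lock);
    pthread_mutex_lock(&second->lock);
    from->balance -= amount;
    to->balance   += amount;
    pthread_mutex_unlock(&second->lock);
    pthread_mutex_unlock(&first->lock);
}
```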
"But it was never designed for writing huge systems, and the continual buffer-overflow failures in everything from databases to OSs seem to be telling us something"
Yeah, it tells me lots. It tells me that many of today's C programmers aren't worth a shit, combined with the fact that for some reason "lines of code per day" is seen as important ... I agree that C wasn't designed to write huge systems, in that what we consider huge systems today would have used more computer power than even existed a third of a century ago. It's a tribute to K&R that C is as flexible as it is.
I also agree that, in theory, there are better ways of doing things. A couple weeks ago, some friends and I were having a "what if" conversation, discussing where DEC would be today if they hadn't squandered away the TOPS-10 & TOPS-20 franchises ...
But what we have is C. And as I always say, "running code trumps all".
 That's not original to me, I first read it on Usenet over twenty years ago, probably during the C/C++ wars ... If you eyeball dejagoo, you can probably find the original, and maybe even pick out my contributions to the flames, if you squint ... No, I'm not proud of myself :-)
> some title or other <
The problems with posting here is justifying it afterwards. So...
I'm sure I read that C was designed to make small, modular subsystems which interacted with each other. The nearest I can find is <http://web.cecs.pdx.edu/~jrb/ui/bsdstack/bach1.txt>: "to provide o.s. primitives that enable users to write small, modular programs that can serve as building blocks for more complex programs".
But anyway as jake points out, early unix was a tiddler compared to the modern CD- and DVD-based distros as it had to run in a few K (IIRC PDP11 had 64K - not sure).
> C (and C++) offers unparalleled flexibility and performance. You can code to the metal and you do exactly what you want with it.
Yup, its strength is also its weakness. I want to be able to do all that, but only when I explicitly want to, not be exposed to my own mistakes constantly. Being human I do make mistakes, and I'd rather the compiler took care of as much as it can so I can work at a different level. Not, for the most part, bare metal.
> Also, concurrency in C is fine if you're not an idiot.
Yeah, well 1) I don't agree; in general it's inhumanly hard, and even in simple cases it's bitten me despite a paranoid level of care, and 2) even if you're right, there are enough programmers out there who don't do it right even in major software (no names). I've had to debug it and I don't want to do it again. Also 3) I think anyone who claims that all you need is not to be an idiot is kidding himself. It's just plain hard. The more tools you have to prevent basic human mistakes - in threading, pointers or even high-level stuff such as requirements capture - the better.
> "Oh come on! Wrapping every variable in a mutex will make your code run like a dog ..."
Point taken and his OS was not considered to be more than single user. However if static analysis had been pushed in the last 30 years (actually, make that 45 years by now), how much doggish performance could have been optimised out?
"...and all you'll get for your trouble is replacing your earth-destroying race conditions with deadlocks instead!"
Mmm. Which do you prefer: subtle corruption bugs (I presume we've both had to debug these when written by others, and tried to clean up the data afterwards, which is hellish in both cases), or very evident deadlocks (which can be handled by static analysis if you're lucky, or by cycle-detection and, with care, rollback & retry - this generally means keeping an in-memory log or state - and other tricks), which we can safely and cannily optimise around to usually regain speed?
You can't magic away complexity.
BTW Concurrent pascal was compiled.
@jake: a thoughtful post but as before, I can't automatically accept that today's C programmers are worse than their fathers. It's just a difficult problem which has grown more difficult by dint of hugely more ambitious programs that get written now, while we certainly haven't evolved bigger brains.
But, "running code trumps all"? How buggy may that code be before we decide the author shouldn't have bothered, and we shouldn't have bothered buying it? If we can kill certain classes of serious bugs by using different languages/tools, why the hell not?
"You can't magic away complexity"
I think it's arguable whether the programs that solve the problems of today are actually more "complex" by any decent definition of complexity. Interactions between program components should follow more-or-less the same very narrow set of rules as always - typical of which is that if more than one execution context can access the same data then it should be guarded and serialized in the most appropriate way for the situation - whether this be a RAM location or some esoteric OODB horror on the other side of the World.
My point would be that most of the apparent "complexity" in today's problems is quantitative, rather than qualitative and programmers spend more time fighting stupid abstractions, paradigms, extreme fashion methods and downright bloat than they do solving the fundamental problem at hand.
I put it to you that your average programmer well-versed in, say, Java would have a lot more trouble writing a decent OS kernel than a kernel programmer would have writing a distributed enterprise system. And before I get flamed, it's not to say that Java doesn't engender fantastic bespoke systems in the manner of cobol before it - I've had to bite the bullet and use it myself - with good results, but as a general and expressive programming solution it is pants. Of course if your Java program falls over, you can always try to blame the guy that wrote the superclass :-)
"I can't automatically accept that today's C programmers are worse than their fathers."
It's not a matter of better or worse. It's a matter of having 30 years experience over someone who just passed a two year programming class. Or to put it another way, I *am* the father of a C programmer. She's making the same mistakes that I did. And she's learning from those mistakes, just like I did. It's called a learning curve. On the other hand, she's smart enough to call me when she finds herself in a pickle. A lot of the young hot-shots are too proud to ask for help when they get into trouble ... which is probably where some of the errors creep in.
"It's just a difficult problem which has grown more difficult by dint of hugely more ambitious programs that get written now, while we certainly haven't evolved bigger brains."
I call on that 30 years, again. As projects built with the language grew, so did we. People who administered DOS 1.0 on the original IBM PC, and stayed with MS products thru' XP (I don't do Vista) don't find XP to be complicated, despite the obvious complexity. Likewise, I've been using UN*X-like programs for the same amount of time (I was at Berkeley at the right moment in time ... just lucky, I guess). At home, I switched from Coherent to Slackware 1.0 when Mark Williams Company was obviously about to close up shop. I've been running Slack ever since. As a result, Slackware 12.2 isn't complicated to me. Complex, yes. Complicated, no.
"But, "running code trumps all"? How buggy may that code be before we decide the author shouldn't have bothered, and we shouldn't have bothered buying it?"
Four words: BIND, Sendmail, Apache and Mozilla ... maybe a low blow :-)
"If we can kill certain classes of serious bugs by using different languages/tools, why the hell not?"
Personally, I agree with you. Functionally, I actually do this. I teach Smalltalk, Lisp & Scheme occasionally (when allowed by the administration). I use awk or grep instead of perl where awk or grep (or whatever) will do the job. I (tried to) give a sysadmin/securityadmin concepts class using an old VAX cluster running BSD, so the people in the class wouldn't get hung up on applications and could concentrate on the concepts I was trying to teach.. And etc. Unfortunately, you and I can't turn the world around.
As a last word, judging by how much COBOL and Fortran code is still out there doing useful work (maybe a couple billion lines, by some estimates), I rather suspect that C will be with us for at least another 50 years. The installed base is just too big. Call it inertia if you like, but it's reality.
Note to the mods: If this has come thru' a couple times, I apologize. The 'net wibbled at me.
feedback & corrections
I implied Brinch Hansen's work went further towards managing concurrency safely than it actually did. I checked again and, as Steen Hive picked up, it was monitors only. They were part of the language and therefore a technically 'reasonable constraint' but still, my memory served me a bit too creatively there.
BTW an extreme example of language constraints that allow concurrency to be managed easily is in the immutability typically found in functional languages. Static everything = concurrent safe. Certainly not the whole answer, though.
@Steen Hive: Interesting thoughts on complexity, but I've no idea how to evaluate whether the complexity is quantitative vs qualitative. You also seem to suggest that it's the tool-user not the tool that counts, and in an ideal case, yes, surely! - but things are rarely ideal.
@jake: Good points on the 30 years maturing period; you've clearly got a decade of experience on me. And no, it's not a low blow.
I guess it's always going to be uphill. I'm asking for consideration of (not obligation to use, just consideration of) new tools where the old seem deficient in some manner, but sometimes I despair when I meet programmers who don't actually 'get' the most basic tools such as procedures/functions [*] (they really don't see the point of them) and those who do but disdain them unless they're heroically long and twisted. Even the next step up - simple old-fashioned assertions - is little known.
And all that's before we get to preprocessors (they still have their place), OO, DSLs, static analysis, new language styles....
[*] I've just realised there are even more basic tools such as comprehensibly-written code, comments, external documentation and project planning and management, and although these aren't considered tools in the normal sense, they certainly are. Most projects I've worked on lack even these.
****It comes down to this, multi-threading or no: you can't prove to anyone that a formally correct program written in C is any less reliable than a formally correct program written in ToolDuJour or any other language, and in practice it is no easier to prove that a program *is* formally correct in another language than it is in C.
If it were, Ada would absolutely rule the universe at the moment, but it doesn't, because it's rubbish. C is rubbish too, but responsibility for its rubbishness lies solely with the programmer.****
If you could prove algorithms correct in general (Turing proved you can't), then we would have perfect programming languages and no need for debuggers. I don't know where you get off saying Ada is rubbish. The fact that every language in common use besides C/C++ is interpreted (Java, C#, Python, Perl, PHP, etc.) means that today's programmers can't handle low-level programming. Ada is the only compiled language that is built to be safe, with very strong type checking and very insular modularity. But you can just keep using Java... you clod
I think you need C and something like Java...
I don't think it's an either/or type of thing.
For building user-facing software, you'd obviously pick Java because of its GUI libraries and ease of development. You can use Java for most day to day programming and be very happy and content.
For building system-level stuff, and high performance stuff, you'd pick C. You might even use C to write some native libraries called by your Java code. It's as fast as assembler, but easier to work with, and you can use inline assembler with it, so that's cool.
If you really want to get nutty, you can write some code in C, halfway compile it with GCC and get assembly language code, edit and tweak the assembly language code until you're happy, and do your final build.
Don't be a "one book man".
Don't be a guy with a hammer always looking for nails.
Mix and match! It's pret a porter!
I'd hazard a guess that none of these projects would actually compile under strict ISO rules. In all probability, the compiler is invoked in the mode that treats it as C++, but the code doesn't use many obvious C++ extensions, like classes or templates. Most of the code published on the internet is of such poor quality that I suspect the authors don't actually realise what language they've written it in.
@Drak: learn the subject before calling someone a clod
and @AWeirdoNamedPhil: entirely agreed in principle, I'm just not sure we're using the best (ie. widest) range of tools.
My alter-ego "steve" hive uses Java only when forced to by circumstances, you clod.
"Ada is the only compiled language that is built to be safe by very strong type checking and very insular modularity."
Yeah, it reads like Sanskrit backwards and is all but unusable for any project unless you have a fat wallet of military $$ that makes the grief worthwhile. I'd go as far as to say Ada is provably shit, because it is freely available and very safe, but no-one in their right mind outside civil servants and those parasiting on their teats would touch it with a barge-pole. So I'll stick to C, you clod.
I think it's because C++ was a lesson in how NOT to do OOP and most languages that handle it decently are monoparadigm. C, on the other hand, can approach OOP with structs and unions. It might not have things like inheritance, garbage collection or a language-defined GUI, but shared libraries offer the language a large array of abilities.
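What that struct-based OOP looks like in practice: a vtable by hand, with function pointers doing the dynamic dispatch (the shape example is mine):

```c
/* Poor man's polymorphism: the function pointer in the struct plays
 * the role a C++ vtable entry would. */
struct shape {
    double (*area)(const struct shape *self);
    double w, h;
};

static double rect_area(const struct shape *s) { return s->w * s->h; }
static double tri_area(const struct shape *s)  { return s->w * s->h / 2.0; }

/* "Virtual" call site: the caller neither knows nor cares which
 * concrete area function each shape carries. */
double total_area(const struct shape *shapes[], int n)
{
    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += shapes[i]->area(shapes[i]);
    return sum;
}
```

No inheritance, no GC, but dispatch is dispatch.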
"I've no idea how to evaluate whether the complexity is quantitative vs qualitative. You also seem to suggest that it's the tool-user not the tool that counts and in an ideal case, yes, surely!, but things are rarely ideal."
I often think about qualitative vs quantitative complexity in terms of Microsoft Excel (really!). I ask myself the question "What does Excel 2008 really give you in several hundred MB that Visicalc didn't 30 years ago in 27 KB?" Now of course it's a dumb question in many ways, but I always come to the answer "in terms of the actual job, not a whole bloody lot." - Alternatively, "25 years ago, how did a bunch of hippies manage to code a beautiful, fully pre-emptive-multitasking, high colour OS in 256KB of ROM for the Amiga?" Answer: They properly solved the problem at hand - so what if the DOS was written in BCPL! - A 2009 Windows PC with a top of the line nvidia card still can't genlock a TV signal properly.
About things rarely being ideal - things are *never* ideal and I always look upon that issue in the manner I look at "zero-tolerance" laws - usually well meaning, but ultimately just wrong. A hypothetical, "perfect" computer program *will* break as soon as you put a human in the loop, and humans are thus far always in the loop. Observe this is not saying that one should not strive to write as-correct-as-possible code, but where current developments are taking us I fear we will be solving all our computing problems by running a copy of Excel of infinite size :-)
C and its bastard children are still around because they solve enough basic computing problems well enough - this is an empirical observation and not an academic proof. This won't always be the case, but I'm not holding my breath, because I have a program to get on and write, composed of most of the same problems I have ever solved, but in a different order.
A wise man doesn't need a full toolbox.
"I'm just not sure we're using the best (ie. widest) range of tools."
Of course not. Think about it this way ... How many people have the know-how and all the tools necessary to repair all the little things that go wrong around the house? How many people have the tools and know-how to build any new tool that they need? How many of us are tool & die makers and/or blacksmiths?
Plumbers, electricians, carpenters, roofers, and HVAC folks all have their own problems, and the tools to resolve them. I'm sure you can hang a bookshelf ... but can you build a shed, and install a generator and transfer switch according to your local building code? Would you attempt it? If you did, would it pass inspection? Would your insurance still be valid? (I have, because our horses are on well water so we need power 24/7 ... it's not all that difficult, if you talk to experts first.)
Do you want your next-door neighbor's kid, who completely restored a 1959 VW Bug, working on your brand new 7-series BMW? Come to think of it, do you want the kid who recently passed (with honors!) the BMW technical school working on your '09 Learjet 60 XR?
Why should computers be any different?
I'm NOT saying that people shouldn't have a wide variety of tools available to them (I'm an old UN*X hacker, the more uni-tasking bits of code you give me, the better I like it ...).
What I am saying is that trying to specialize in all of it is a fool's errand. Computers are the most complex bits of kit that good ol' HomoSap has ever invented. There is NO WAY to understand computing in its entirety. Rather, you're better off picking a couple of Swiss Army Knives that'll get the job done with a minimum of hassle.
C is, to the best of my knowledge, the best all-round tool in the programmer's toolbox. Don't get me wrong, C is not the best tool for every programming job. But it'll work in a pinch in every scenario I have ever seen, under any OS, and on any hardware. I can't say that about any other computer language.
Should a programmer have many tricks up their sleeve? Of course. Should they know when one of their other languages is a better solution than C? Of course. Should they continue to learn new languages indefinitely? No. Not in my opinion. At some point you'll hit the point of diminishing returns ... which by definition is negative-flow.
Taking it back to automotive analogies (not perfect, I know), a friend of mine is the best wrench I've ever known. I've known him since we were about 12 years old. He was my mechanic when I owned an OMC dealership in the Port of Redwood City. He has taken it upon himself to collect as many tools as possible "just in case". Large tools, small tools, machine tools, hand tools, electric tools, pneumatic tools, hydraulic tools, diagnostic tools ... Name it, he's got it. He can work on wood, glass, metal, fiberglass, plastic, the land ... He has rebuilt railway cars, boats large (200ft+) and small, cars, motorbikes and trucks (large and small). He has restored houses (mine) and added onto them (again, mine), and ripped up and replanted a pest ravaged vineyard (again, mine ... our first batch of wine from that is aging (finally!)). If man has built it, he can probably fix it.
Unfortunately, his tool collection is now so large that he needs four 40 foot shipping containers to house the bits that need to stay out of the weather; the weather-proof bits would make a largish equipment rental yard jealous (I know, his collection is behind one of my barns ... It took seventeen trips with a heavy equipment flatbed to get the weather-proof bits here). Keeping it cataloged and in good repair is a full time occupation. I only had him work on my house to keep him out of the poor-house ... He's got so much kit that without my ranch for storage, and to keep him busy, he'd spend so much time taking care of his collection and paying for its storage that he'd go broke. I'm trying to convince him to sell it on ebay ... He must have close to 10 million dollars worth of stuff that is mostly unused ...
On the other hand, I have a basic set of Craftsman tools (probably $4000 to replace), some basic woodworking, electrical and plumbing tools, three welders, two compressors (one portable, one in the tractor shed) and their attendant tools, large, medium and small tractors (with just over a dozen total attachments), a riding lawn mower, three chainsaws, a couple weedeaters and a water truck ... and I manage to properly maintain a largish horse ranch without going to a rental yard more than a couple times per year.
Sometimes less is more ... The trick is to pick which less to learn in its entirety.
Or as Grandpa used to put it, "A wise man doesn't need a full toolbox" ...
Now, where's my f****ng gun!
When I hear the bleating calls of "When can we dump C & use modern techniques?" I look for the cattle-prod or the Glock - whichever I find first is not important, as long as it is ready ;-)
Arrrh - The polite answer is: whenever you lot learn to actually design something. Most of you can't, but the tools make it look like you can - so I have to clean up your mess!
"Modern Techniques" is neophyte Java-geek speak for endless threads (well, they do end when all memory is exhausted), huge object graphs cached everywhere in memory because just looking up rows in a database is sooo "Non-Oh-Oh", etc.
Unfortunately, Java itself, being very modern - progressive, even - does the best it can to keep the idiot designers from hanging themselves soon enough, so their bloated abominations actually "work" long enough to get shipped.
Very few people are actually capable of designing anything; the ones who can all have around ten years' development experience. The worst part is that the people coming straight from university are particularly obnoxious: all brain, no intelligence, and stuffed to the limit with "Modern Techniques", breeding hate and revulsion wherever they go.
Something that deserves some kind of response
This is the most entertaining thread I've read in ages.
I'm one of those newbie coders who claims to know C#/.NET, Java, C++ and C, but only has four years' knowledge of each. I've worked with some real geniuses and some stuck-up 'I know it alls', both in the past and the present.
I figure I must fit in not to badly however as I can't spell if my life depended on it and my grammer sux. This rule of thumb seems to fit in with all great developers.
One day I'm going to be old and look down on the young whippersnappers who are programming their computers with voice commands in C-shore, and tell them that they can't code for toffee, and that back in my day we knew people who could build an operating system with 4 lines of pure binary.
Things won't have changed much either
Dev:"Computer create application which says 'Hello World'"
*2000 core 4Thz Computer churns away....*
2 Minutes later
Comp:"Your program is complete - executable size 2TB"
Dev:"Computer run program 'Hello World'"
Comp:"Error at line 6"
Dev:"Cross reference with all libraries available, and run program 'Hello World'"
Comp: "Executable size 4 PB"
Comp: "Hello World"
Dev: "Run 'Hello World' on Mike's computer"
Comp: "Please insert Microsoft Office Disk 2065"
Comp: "System Error, Would you like to inform Microsoft?"
There is no "best" tool for the job in programming or any other field.
There is a whole new generation learning programming languages like C++ by rote in quick courses in places like junior colleges, and they don't understand the underlying concepts of the language or how it applies to computer architecture. And then these people go around asking "experts" why the language they are learning is the "best tool for the job", hoping they will get an explanation to justify their choice of language. But that is a mistake; no one should ever accept anything as best at face value, or even on the word of an "expert". The best language is the one that gives you the results you want, balanced against the power and flexibility you want, and none of these things comes free. C++ is a difficult language to use. Even "experts" who think they understand it often rely on all manner of parlor tricks to accomplish even the most minor tasks. Runtime safety, for example, is given very little value. A language like Java might suffice for enterprise infrastructure apps, but for complex commercial standalone apps only a language like Ada will provide that kind of runtime safety, for people smart enough to use it.
@A wise man doesn't need a full toolbox
@jake: I'd thought we'd come to the end of the thread, but OK, you've made me think, and move away a little from the idea of hands-on tools (actual programming languages) to conceptual tools. Like procedures, OO, etc. What happens in your head, not so much what you install in your computer/fill up your barn with (err, I take it you're from the States then? I hear you've got a new king or something last Tuesday. Good luck to him, he's going to need it. Anyway).
You're right, I shouldn't have said widest range. That was wrong. Trouble is, I'm not sure exactly what I meant. I guess I'm trying to say a wide exposure to different tools (software & conceptual), so you have a sufficiently (not excessively - touché!) stocked toolbox.
I don't worry if my *specific software* tools lie in the dark for a while; when the time comes I open the door and spend a day or two kicking off the rust. The tools of value *conceptually* are what I learnt from different sources, and there are far fewer of them, and in them lies the real value, because they are transferable and composable. There are tens of thousands of languages but in reality very few 'paradigms' of programming, to wit: functional, logical, procedural, OO, SQL (technically a subclass of logical) and a couple more I forget, and some rare oddities thrown in.
Once you know how to use these few and the bits that come with them, your toolbox is pretty good. Once you are comfortable with closures, you can use them in any language that supports closures to (sometimes majorly) tidy up and tuck in loose ends - if you use them properly. Once you are familiar with one parser generator you should be pretty well set up for others - if you know such software tools exist.
Let's be concrete: in my last company they used a commercial report generator which had a Pascal subset built in. Except it didn't work. The sods who wrote it didn't use a parser generator (software tool) and they didn't know how to write a parser by hand (conceptual tool). So much time was wasted by our staff over this as it failed to work in random & often unfixable ways.
But if people can't even use subroutines properly and write huge procedures with everything cut&pasted inline, and (I've actually seen this) disable all compiler warnings because "they get in the way", well why bother hoping.
@gun-totin' AC 17:20 GMT: misuse of tools (eg. threads) is as bad as ignorance of them, granted. And I find those that have been to university have less arrogance and are more willing to listen as a whole than those who haven't. YMMV.
I have no answers
Yeah, I'm a Yank. Born at Stanford Hospital, Palo Alto native. One of SillyConValley's few local-born technogeeks. But don't let that fool you ... I traveled between California and the UK a lot as a child. I got my O & A levels in England (somehow, considering half my schooling was in Palo Alto), and spent my first year of Uni at Kings College, London ... Then I attended Berkeley, just in time for ken to give the series of lectures that led to this conversation.
"But if people can't even use subroutines properly and write huge procedures with everything cut&pasted inline, and (I've actually seen this) disable all compiler warnings because "they get in the way", well why bother hoping."
Indeed. I've been thinking about this since I last posted. I think the issue is more a social problem than a tool problem. The tools exist. We know that.
Basically, and in a C shell, development environment complexity (and perceived "ease of use"), combined with the perceived ease of use of computers in general, has created a meme where somehow, any idiot can be a programmer or designer. As a result, many idiots are.
I think the true question should be "How do we get back to teaching the kids how the underpinnings of the system work, so that when they actually start programming it's easy to teach them what memory, CPU and cache are, how they are used, how the compiler sees them, and why?" ... Learning to crawl before walking used to be considered a GOOD thing ... Unfortunately, we're living in an age of "I want it NOW!", and nobody seems to want to make an effort anymore. ::sighs::
I blame the Internet, video games and Apple. Seriously. When I was a kid, the computer itself was the game. We built 'em, and a win was making 'em work (I had a home-built 16-bit LSI 11 based Heath H11 in 1977). Then Apple introduced the "Ease of Use" myth with the first Mac (which was bloody useless, I bought one new out of curiosity ... biggest waste of money ever), followed by the first VGA video games that just worked, right off the floppy. The next big step was the Internet in every home ... which gives kids ... err ... other outlets, and no interest in the technology itself.
As a result, these days you can't get kids into playing with the system itself. And on the rare occasion when you CAN, they are easily distracted by something else once they login.
Is it any wonder they don't bother learning the ins & outs of a programming language's guts? Much less several different programming languages?
I fear we are about to lose something nearly undefinable ... With the exception of online criminals, we are in danger of losing the hackish attitude that got us to this point. This is not good. I have no answers.
I do, however, have a mare foaling ... so I guess life goes on :-)