I encounter your work every day.
C programming language inventor Dennis Ritchie is reported to have died. Rob Pike, a Google engineer and former colleague of Ritchie, said on Google+ that the 70-year-old, who was a founding developer of Unix and known as dmr, died at home over the weekend after a long illness. At the time of writing, Ritchie's web page on …
Don't forget that he also co-wrote "The C Programming Language" which I still consider the gold standard in how to write a clear and yet concise book about a programming language.
Of course some of that is simply a reflection on the simplicity of C itself, but it was the first non-BASIC programming book I read and I have yet to read a more clearly written (and short) programming book that is not a "nutshell" style reference.
It's also a great demonstration of why nerds need to be good communicators.
For me personally this is a far more significant passing than Jobs, and I say that as an Apple user.
It's a shame that so few people (including in the industry) understand how much we owe to the old guard of early software pioneers, many of whom are still kicking around.
Those of us who can type at significant speed know that Windows is legendary for "losing" characters while it stalls momentarily to manage itself.
Also, you will find many of the transposition and other errors are a direct result of the electrical path-length differences between the right and left hands. Yes, your brain sent the messages in the right sequence; no, they didn't arrive in that order.
As for DMR - one of the more significant figures of my age. Another sad loss for our industry. Who cannot recognise that big blue C? No one I know!
Have one for me up there!
While I use C++ a bit more, because it has features I'd keenly miss in those programs that benefit from them, the language is just a bit too large to fit in an elegant book. I like that its warts and quirks turn out to have technical reasons (often deep, tricky, or obscure, but almost always there), but there's something to be said for a less complex and much more elegantly describable language, too. So I do convert things that need no more than C, so that a C compiler is all they need. And yes, my "dot h" files can all be fed to a C compiler, too.
Tangentially: with the proliferation of integrated development environments and the general reliance on graphical user interfaces, computer (and thus also programming-language) books have tended to include lots of screenshots, bloating them and making them that much more vulnerable to version changes - i.e. this generates lots of "virtual dead weight" in computer-related books. There are also quite a lot of titles nowadays, but very few genuine gems.
C is by no means perfect, but it did strike a chord somehow, as does the book describing it. Something to remember, over a glass, but also as something to strive for.
Goldberg & Robson's Smalltalk-80 book is the only other thing I can think of that even comes close, but K&R is the gold standard. Ritchie did tremendous work bringing elegance to the internal structure and organisation of computer systems and deserves every compliment already posted here and a million more.
Whilst I largely agree with your first sentiment, I cannot agree with the second. Firstly, the Reg mods doubtless have far, far better things to be doing with their time than acting as sentiment police or niceness filters. Secondly, can you even begin to imagine the uproar on these boards if people discovered that that sort of comment was being censored?
You may not like what people have to say here, but if you need a nanny to prevent you from hearing it, then can I suggest China or Saudi Arabia?
But completely true. Given the mainstream coverage of Jobs you'd think he could walk on water...
One man created tools, and the other man created trinkets. Neither was evil, both will be missed. Dennis Ritchie will be missed by far fewer people though, despite having a greater effect. Hardly anyone you see in the street will know his name, let alone his accomplishments. Nobody will make a movie of his life.
RIP, Dennis. My sympathies to his family and friends.
The man who built the company that made the machines that Ritchie used.
The guys at good old DEC changed the world just as much as (and probably more than) Jobs and Apple ever did.
These were the guys who changed the way every company in the world worked, rather than just providing toys to the masses.
You know, it was Steve Jobs who brought UNIX (yes, OS X is still UNIX, and even certified) and Objective-C (still C) down to the hands of the general public.
Years ago, people imagined #login on a green monitor when you said UNIX; now an ordinary, slightly technically curious person might say, "I heard this stability comes from the fact that it is UNIX with a nice interface."
Or if you really want to be old school just:
But this practice is deprecated these days.
Please... no "void main()" stuff. It probably won't make him turn in his grave, but I reckon you owe it to this guy not to do it :P Although, in fairness, under some conditions (compiler, OS, embedded) it may not matter. But fix main up the way it's meant to be.
And... one sloppy habit may mean others. Find these and get rid of them; you'll be better for it.
RIP Ritchie. I may not have met him, but I learnt a lot from him and his like. Those three lads (or more, at Bell Labs!) did more than we can probably ever appreciate.
It was almost by chance that I found myself sitting in front of a Unix system and a shelf of manuals. For the first few months, I didn't realise that I was teaching myself the career that I would follow for the rest of my working life.
With no previous computing background, I took to Unix, its philosophy, methods, and even the language of its documentation. This was computing, not marketing.
The masses will probably never even recognise the name. The true innovators pass unnoticed.
I've read so much about the man's history and achievements and still rank "The C Programming Language" as in my top 2 tech books of all time, jostling for top spot with Stevens' "TCP/IP Illustrated, Volume 1 - The Protocols".
A TRUE pioneer and innovator. Pretty much personifies those words, in fact.
Rest in peace Dennis.
To put it mildly. In fact most operating systems are just versions of Unix, or thinly disguised versions. Linux, Android, OSX. And most embedded systems, at least in my house, routers, NASes, and so on, run versions of Unix.
ISTR it was the "POSIX subsystem" of WNT that got certified, which may or may not still be supported, but it certainly wasn't easily accessible (did it require separate installation? I don't recall, but it was and is simpler to slap on Cygwin or use MinGW or something) and didn't come with much of any graphics support. Bit of a dead ducky in that pond. But hey, it could be certified, and that's marketeering winings* right there. Windows NT was also designed by a well-known Unix hater**, and that's clearly noticeable.
While we're talking defining influences, Unix itself was a third system after multics and its second system effect ailments, so in a sense it started out as a "3.0", to lasting success. Which is curious, seeing the long history of antagonism, lawsuits, and infighting. But let's not dwell on that today, eh. It did get a couple things very right and the result is useful to this day.
* typo left in.
** look it up if you don't believe me
Yeah, I'd forgotten about Cutler. Thanks for the reminder.
I do remember pinging a fresh out-of-the-carton NT box when we were setting up servers for a new project and spending more than a minute wondering why I kept finding some BSD server when we had no BSD servers, only Solaris, AIX, and the new NT box.
It saddens me to learn of the demise of one of the fathers of modern computer science, Dennis Ritchie, inventor of the C language and one of the founders of the Unix system.
He belonged to that generation of forerunners; he was what we might call a computer genius, and he really could write three lines of code without making two compilation errors.
He was one of those people who built their reputation as visionaries on their technical skills, and not merely on their ability to crush their competitors, terrorize their underlings, and screw their gullible customers.
A computer scientist who never cared about the visual appearance of C, or about how many hours a Chinese factory worker needs to build a UNIX machine.
Someone who helped humanity move forward, and who invented things we could not live without.
Someone that no one cares about...
(Original by Asp Explorer)
Swapping the operands so that the non-lvalue is on the LHS (e.g. the 0 in the example) causes a compilation error if "=" is typed where "==" was meant. It's a trick some people use to help prevent the mistake. It's not perfect, because it doesn't work for comparisons of two lvalues, but it's popular in some places nonetheless.
Put your pet peeve in the context of this guy and his mates effectively making a cross-platform assembly language which has endured the test of time.
Which you sound like you still use perhaps?
My philosophy is if you play with guns and knives, and you hurt yourself, you've only yourself to blame. If you play with guns and knives, you'd BETTER know your tools.
And unfortunately hardly any media coverage unlike the hype over a certain turtle-necked jumper wearer who died last week. (I'm looking at you BBC News).
Yeah, yeah, Apple made nice shiny things but without Dennis' work none of it would have existed.
So let's raise a glass to Dennis, thank you for giving us UNIX and C.
Unix was derived in part from Multics, a venerable OS from the 60s and 70s which was tied to GE (later Honeywell) kit. Ritchie worked on the Multics project and realised that it did some things very well, and some things arguably very badly (and there were a lot of arguments in the 70s and 80s). Unix of course won the day, mostly because it was relatively lightweight and wasn't tied down to specific hardware.
People who read El Reg probably understand what impact Unix and C have had. Even though Joe Public has probably never heard of Unix, all iPhones (via Mach) and Android smartphones (via Linux) run an OS derived from the work Bell Labs and Ritchie did. And C is pretty ubiquitous too.
It took nearly 40 years for the technology developed at Bell Labs to end up in everyone's pocket. I wonder what he thought of that?
Thanks Dennis. The world is a better place for the pioneering work you did.
"If I have seen further it was by standing on the shoulders of giants." - How perfectly apt for my feelings today.
I actually shed a tear when the news sunk in earlier. A great man and a great contributor to our world. Without him much of what we do - I mean those of us reading El Reg - would not be doing it. Simple as that.
K&R was, and remains, the only programming book I have just sat down and READ. It remains, decades later, as the shining example of what a programming book should look like.
C was a good, solid, easy-to-learn language for its time, but I can't say the world would be THAT different without it. But I CAN say that the way hundreds of thousands of programmers understand programming would never have been the same without him and the book he co-authored.
Take any contemporary bit of IT kit, be it a Windows box, a Mac, iPad, iPhone, Android phone, Kindle, routers, sat-navs, etc, etc.....
You will find that they are either based on Unix and/or created using C or one of its derivatives.
Jobs had a big impact on a relatively small section of IT, Ritchie's work underpins almost all of it.
A very great and sad loss to our profession.
For my first machine was an Apple ][....
So, as much as I *hate* saying this, I owe Jobs some. No, plenty. But nope, I don't like that man. It's no secret of mine. It's like he turned evil or something. Either way, I wish Jobs well - he has left grieving people behind, and it must have been a bitch to die of cancer.
And, please, C still *IS* a good, solid language, without which a lot of us would be doing stuff like mucking around with assembler or Forths. Well, we still do, don't we ;)
So RIP Ritchie... and yes, you too, Jobs.
I was fortunate enough to have visited Bell Labs and met Dennis Ritchie (along with Ken Thompson, Brian Kernighan and Bjarne Stroustrup) in the mid eighties. What struck me, apart from the sheer intellectual horsepower he had, was what a nice bloke he was too.
Having had the opportunity to share a beer and a pizza with him remains one of the more memorable moments of my professional life 25 years or so on.
The work done by the Unix team (and he'd be the first to acknowledge the huge input of less well-celebrated people like Joe Ossana, Bill Plauger and many others) did change the world I knew, and in most respects for the better.
I had lunch once with Dennis and Ken at the UKUUG in London in 1990 - really down-to-earth blokes, and both incredibly knowledgeable.
Sadly, the huge influence on computing will have passed without many having even been aware of his influence.
I'll toast his memory at lunch....
You need both, as with most good things. The Bell people and their predecessors were necessary to lay the foundations, build the tools to build the next generation. Jobs and his people were/are brilliant at using these to produce the technology in well designed, attractive packages of use to the "common man", to make it widespread and profitable outside computer specialists.
C and UNIX, my bread and butter and delight for thirty years, was not doing well against Windows and others. Even Linux and Minix were just fringe activities in the wider world. Apple made it, is making it, mainstream. Android built on the market stimulated by Jobs, who built his on clever design and use of existing tools, most built on C and UNIX.
You can not separate them. Both men in this case are extraordinary in their fields and all, including Windows phone and Android phone makers and users, should be grateful to both.
Life is seldom "either or".
I'm surprised by how few of the commenters seem to get that. The thing that sets Jobs apart from Gates et al was that he had an eye for good, clean and elegant design decisions (like the ones made by Ritchie and friends). Consider him an evil businessman if you will but at least his choice of technologies to use when building his evil empires wasn't based on tossing a coin (or even worse, breaking things deliberately).
It's a shame, when the lives of vacuous airheads are paraded endlessly in the broadsheets and public media, that the passing of such an influential figure has gone largely unreported.
Having spent a number of years in a bookshop flogging "Kernighan & Ritchie" yearly to the new intake at the local Uni I can attest to the longevity of his works.
My development as a programmer went:
1. BASIC. Mostly Applesoft at high school.
2. Pascal. First language I learned at University. I was never much good at it.
3. Fortran. Second language I learned at University. Much more to my taste.
4. C. Third language I learned at University. I learned the basics in a weekend on a Pyramid 90x in the middle of 1986. Loved the language and have been using it ever since.
I've learned and used other languages since then but C is my favourite.
I'm currently a Unix/Linux system administrator but if ever asked I tell people that I'm really just a C programmer in disguise.
It's worth pointing out that C had a huge influence on hardware evolution: the developers of the various RISC CPUs optimised almost exclusively for C and Unix performance.
Sad news indeed.
Somebody in one of these posts pointed out that C came from B. I would add that B was a derivative of BCPL from Cambridge University, England, a good tool that I used briefly. Curiously, Bourne, of Bourne shell fame, was also English (an Algol specialist), if I remember correctly. One wonders what it was about England, specifically Cambridge, that gave rise to this.
There is a straight line from reading the BOFH on Usenet in my mother's office twenty years ago, to installing Linux as soon as I could find it, to my first job at Lucent (with a brief stint on the Bell Labs side of the building maintaining the exptools distribution) to sitting at my desk typing this morning. The influence of dmr on this path is incalculable.
I am very saddened to hear about Dennis' passing. My thoughts go out to his family.
His contribution to the computing and digital world has paved the way for the advancement of modern technology, and as a result he has touched possibly every human being on the planet in one way or another - even if they don't know it! He will be forever remembered alongside other brilliant minds such as Einstein, Edison, Bohr, et al.
I'm sad for his loss, as for any human, but it's not like he did wonders for IT.
C wasn't that good, almost indistinguishable from the assembler of the day. It still doesn't make sense nowadays if not for all the inertia. But I guess the ability to point out arcane features of the language makes many IT people feel intellectually superior.
Pascal and a few others would have been a far better "default" language than C and VMS again a much better system than Unix (which was never even designed to be an operating system).
But as jwz would say "Worse is Better" and wins any day. Can't argue with that.
Started programming in BASIC 32 years ago in high school (punch cards being taken to the local uni to be run as a batch for us).
Went out into the real world to work as a manual labourer for the next 10+ years. Then, 20 years ago, I decided to get out of manual labour and start learning how to program in real languages.
Started off learning Pascal, to learn proper structured programming, then moved to C, ASM and COBOL, in that order. C stuck as my preferred language.
Having learned how to program in C, every other C-based language can be picked up in weeks, if not days, because the basic syntax and semantics carry over.
I started using Unix-based OSes about 20 years ago as well. Before that it was the ever-dodgy (Q)DOS as supplied by good old MS. To be honest, I think Unix hands DOS a serious arse-kicking.
return (dmr != NULL ? "Hello World" : "Goodbye World");
I'm not going to get into the slanging match as to whose death is more important, I will leave that to the fanbois..
All I know is this: as time passes, the greats are moving on, one by one, and with each passing the world becomes that bit more dreary a place.
I remember the 70s and 80s, when all this stuff was new, before the lawyers got involved and the net was just a gleam in Berners-Lee's eye. Spectrum/C64/Amstrad/BBC Micro slanging matches, Elite... those were truly the golden years :)
Didn't know the guy personally, didn't need to.
K+R. A genuine classic in computer books, one many authors could learn from. My copy of the book went walkabout literally decades ago, and its replacement more recently, but the language is sufficiently compact and memorable that I don't actually need the book much.
Thirty years or so ago, I read K+R, and "Software Tools (in Pascal)" co-written by the K in K+R. K+R changed the way my employers built programs, long after I had moved elsewhere. Sadly their first experience of C was via Whitesmiths, but they recovered eventually, especially once the VAX arrived.
Respect is due. Sympathy to those close to him (including to bwk)
I think the K&R bible on the C language is by far my most read and referenced book (even more so than the Hitchhikers Guide to the Galaxy!). A true legend and genius of the computer world. RIP
He did so much more than Pope Jobs I and his collection of strokable, shiny bauble devices, although we know who will be idolised more in the sycophantic media...