A true innovator
... and a real "genius". One that Michael Bloomberg probably won't have heard of.
It was 1968 and students and workers were on the march, protesting against the Vietnam War, with the western world seemingly teetering on the brink of revolution. In the sleepy, leafy suburb of New Jersey's Murray Hill, a young maths and physics graduate was laying the groundwork for an entirely different revolution. For Dennis …
... and a real "genius". One that Michael Bloomberg probably won't have heard of.
With much of the Bloomberg platform written in C & Fortran from the early days, it is likely that Michael Bloomberg knew he was using K&R C (before ISO standardized it).
Agreed, a quiet hero in my mind.
ATC staff don't know or care what language Air Traffic Control systems are written in. Just so long as they work.
Most bank staff don't know or care what technology underpins their daily working lives (and I can guarantee that the bank board doesn't either).
So why Michael Bloomberg should, I completely fail to understand.
Is/was he a coder before he was a politician? No. It's like saying Reuters managers know anything about the internals of the Reuters system.
Fail to see your point.
For all the fun that you made possible with the tools that you gave us, thank you Dennis.
Does anyone know if they ever met? I'd like to hear what they had to say to each other...
RIP, dmr, and thanks for all the C!
Along with the likes of Alan Turing and William Shockley, Dennis Ritchie was one of the founding fathers of modern computing, and I was saddened to learn of his passing. Without his genius and insight, we would not have home and office computing today. He will be sorely missed by the IT community. Condolences to his family, may the great man rest in peace.
These blokes were indeed heroes and great contributors to the industry and their input should be recognised.
It is a complete myth that the industry would not have happened without them. The truth is that there were many parallel threads of development and if Ritchie had never been born his work would still have been performed by others. C is an Algol-family language and there was a rich primordial soup of languages and OSs.
The same myth is encountered all over the place. If the Wright Bros had not invented powered flight we'd still be using ships to get from USA to Europe. Well no, there were many parallel threads working on powered flight. Wright Bros just scooped the glory. The same applies to much of Edison's work, Intel and the 4004 and so on.
What gave us Unix and C was not so much Ritchie as the legal mechanism that forced Bell to release the code to universities. If that had not happened we'd probably have ended up on a very different trajectory than Unix/C. Perhaps PASCAL/Modula-2 based, perhaps something else, and quite likely something without many of the pitfalls of C.
So if we really want to give thanks for C and Unix then perhaps - as much as it galls us - we should also thank the lawyers.
Everything anyone ever did depended on what those who went before built for them, but I was disgusted when Steven Spielberg described Steve Jobs as "the greatest inventor since Thomas Edison". He didn't wield a soldering iron or write code, he was a CEO who hired teams of highly skilled engineers to invent things for him. Dennis Ritchie at least actually built stuff with his own hands and therefore deserves that much more acknowledgement for his achievements.
>Wright Bros just scooped the glory.
Then they patented it and got left in the dust.
you couldn't be more wrong charles. dennis ritchie was a hero inventor twice over and you disgrace his memory.
yes, if it wasn't for the lawyers, unix and c might not have made it into academia just as cs departments were buying computers for themselves. however dennis's inventions -- as well as ken thompson's and the others at bell labs -- just would not have caught on if they weren't any good. unix and c prevailed because they were (and still are) light-years ahead of the alternatives.
those inventions are still going strong today ~40 years later. which is more than can be said for pascal and modula-2. c might not be as pure as those languages but when there's real work to do, c gets the job done time and time again. which is why it is so heavily used and the likes of modula-2 isn't.
most decent operating systems and programming languages in use today are a direct result of dennis ritchie's genius. many of those that aren't are heavily influenced by his work.
now maybe someone else might have invented unix. or c. but they didn't. dennis did. well, ken thompson shares the credit for unix. you would do well to remember that and be thankful for the wonderful platform they gave to the world.
And Al Gore
"at least [Ritchie] actually built stuff with his own hands and therefore deserves that much more acknowledgement for his achievements."
Well if you're using that analogy then Mark Zuckerberg also deserves a lot of credit.
The difference between Ritchie and people like Jobs (or Zuckerberg) is how much they care about making the sound of their own voices heard. Some people like to just create great stuff and don't do it for glory or credit.
It's quite sad that many people don't even know who this guy was, but it's also a compliment to how little he gave a shit about "marketing" himself!
"those inventions are still going strong today ~40 years later. which is more than can be said for pascal and modula-2."
I just wonder though, is the popularity of C as much to do with the popularity of products like Borland Turbo C and MickySoft C and their access to the windoze API?
it only encourages them to multiply, and we've got enough of the parasites already.
Second, whether or not someone else would have done something at the time is not what makes a leader a hero. The question is whether or not someone else could have done as well or better than they did it. When that leader consistently turns in exceptional results, they are indeed a hero, regardless of what else was percolating at the time.
I think it was Spielberg who, a few weeks after 9/11, said the US should use its soft power and beam Seinfeld into the yurts of all the Taliban in Kathmandu.
Shortly before 9/11 most of the Nepalese royal family was assassinated by one of their own number. The day before Spielberg's asinine suggestion, the last remaining Nepalese princess died in a helicopter crash. At the time Mongolia had suffered a number of years of drought and very harsh winters; as a consequence most of the grazing stock died of starvation, forcing the herdsmen to move into the towns and cities.
The Taliban don't live in Kathmandu, the capital of Nepal. The Taliban live in Afghanistan the capital of which is Kabul, nor do they live in Yurts. In fact no one in Kathmandu lives in a Yurt, nor in Afghanistan. But 1800 miles away, Mongolian herdsmen do live in Yurts when they're out on the steppes, except when all of their animals are dead.
So what the f**k would Spielberg know about anything.
God bless you Dennis Ritchie - I wanted to write you an obit, but seeing Spielberg's name in this context made me mad. We met once, in Boston I think - a long time ago; Ken was there too, not sure about Brian.
Rest in Peace brother - I'll always treasure what you and your colleagues Ken T & Brian gave to me and so many others
Most stuff made with C doesn't even run on Windows, or even have an interface for that matter. People seem to forget that almost everything with a chip also has some programming done and burned into a ROM somewhere, and C is the most common language for writing that stuff. C is #2 on TIOBE, but I wonder whether, if a more specific count were done on these small gadgets, it wouldn't jump to #1 by a whole order of magnitude.
I am no expert, but surely to say "Linus Torvalds announced his project of writing an open-source clone of Unix from scratch in 1991" is a bit misleading? A kernel, yes. But a great deal of the rest was GNU -- notably the C-compiler. I can see that Stallman has not made any friends in the past week, but that doesn't justify airbrushing GNU out of Unix history like Trotsky.
Richard? Is that you?
"Hello everybody out there using minix -
"I’m doing a (free) operating system (just a hobby, won’t be big and
professional like gnu) for 386(486) AT clones. This has been brewing
since april, and is starting to get ready. I’d like any feedback on
things people like/dislike in minix, as my OS resembles it somewhat
(same physical layout of the file-system (due to practical reasons)
among other things).
"I’ve currently ported bash(1.08) and gcc(1.40), and things seem to work.
This implies that I’ll get something practical within a few months, and
I’d like to know what features most people would want. Any suggestions
are welcome, but I won’t promise I’ll implement them :-)
"Linus ([email protected])
"PS. Yes – it’s free of any minix code, and it has a multi-threaded fs.
It is NOT portable (uses 386 task switching etc), and it probably never
will support anything other than AT-harddisks, as that’s all I have :-(."
He did announce just that, and started with the kernel. After getting the kernel working, other people ported the GNU system to work on it - it had pretty much everything working except the Hurd kernel which still even today isn't considered ready for production use.
every time you run a GNU tool, it'll happily show the GNU credits. It isn't Linus' fault that the GNU project got stalled short of releasing the full GNU OS.
One of the reasons Ritchie is not more famous is Stallman's hatred for the Bell Labs group and the practical suppression of pre-Torvalds UNIX history. Remember, GNU stood for "GNU's Not Unix". They hated UNIX/C because it came out of a corporate research lab, and most of all because it eclipsed the TOPS-10/LISP culture that the MIT AI Lab dominated in those days.
Yes, part of UNIX's success was that it was free and open source. But it was also elegantly simple. I worked at Bell Labs Research in the 1980s and knew Dennis and Ken. I heard horror stories about Multics, how it was so complex and inefficient it could only "time share" two users at once. Ken came off that project determined to do something completely different. Simplicity is hard to achieve, and a lot of thought went into UNIX and its (now long-lost) philosophy of composing simple tools to perform complex tasks.
There was a long rivalry between MIT and Bell Labs about this question of design simplicity. The hacker culture had a macho attitude of "my code is bigger than your code", while Ken and Dennis spent hours trying to boil things down to the fewest lines of code and the fewest necessary features.
I remember one attempt to reconcile. The UNIX team invited Stallman to visit Bell Labs. I don't recall much about Stallman's talk, but everyone remembers that he picked his nose and ate a booger in front of everyone. In return, Rob Pike was sent to give a talk at MIT, which he was not able to deliver, because Stallman and his friends heckled him.
But for some reason I like him. I think we need a crazy bastard like him to counter the weight of the crazy bastards on the other side. DMR was in the middle, not because he compromised, but because he wasn't crazy.
or at least cause a segfault for writing to RODATA. And a number of other nitpicks. But as it's an obituary, I'll leave them be.
One thing I will say, and that's that as so often, history needs its actors to act or something else entirely will happen. Anyone smart enough could've made similar improvements, but it needed someone to do it there and then. Besides, you never know in advance where you'll end up, and in hindsight it often looks far more obvious than it did at the start. In fact, if it doesn't look obvious (to you) afterwards, you may be doing it wrong. Both because elegance tends to be inherent (to quote another great man, "everything should be as simple as possible, but no simpler") and because you usually want to stay down to earth and off to the next adventure. If everyone else falls over themselves exclaiming they could or could not have done it, well, that's just people.
None of that changes that dmr left us a few very useful things, and he will be missed all the more for it.
This man made a bigger contribution to the world of computing than Steve Jobs
Fact: Apple's OS and toolset all came from Ritchie's work. Tablets, mobiles, mainframes all use Unix-like operating systems and toolkits.
Another loss of another pioneer.
It's sad to see another of that pioneering generation heading off to the great operating system in the sky. However, I take issue with one part of this obituary: C is most emphatically *not* a "high level" language! I don't think Dennis Ritchie himself would thank you for referring to his "portable assembly language" as such.
Also, the history of UNIX through the ages seems more than a little revisionary: BSD Unix (which really *is* UNIX) and the BSD-derived NeXTStep were already available by 1991, long before Torvalds' kernel made it out into the wild as an integral part of a viable operating system.
I'd also disagree violently that Linux did anything to spread UNIX: Linux is no more "UNIX" than a Compaq PC *clone* was an actual IBM PC.
Linux is a *UNIX-like* operating system, but it is not itself a bona fide version of UNIX and has, therefore, done more to *reduce* the use of Thompson & Ritchie's UNIX operating system and its later releases than even Microsoft have managed. Apple's BSD-derived OS X has probably done far more for UNIX's popularity in the consumer space, as BSD really is a direct descendant of T&R's UNIX and not a clone.
This is analogous to Compaq's reverse engineering of the original IBM PC's BIOS, opening up the market for compatible IBM PC *clones*. Linux is a clone of UNIX. It is a "UNIX-compatible" OS.
It quacks like UNIX®, walks like UNIX®, has an interface roughly according to POSIX, and slimy CEOs controlling a UNIX® use it for shakedowns over being UNIX®.
It doesn't have the "historicity", but good enough: It's a Unix,
(UNIX® is a registered trademark of The Open Group.)
They've "fixed" it. Now it says "ALGO".
Following your logic - my Windows 2003 box running Services for UNIX 3.5 is UNIX.
No, it's not and Linux isn't UNIX either.
"BSD Unix (which really *is* UNIX)" - not in the most pedantic sense.
A lot depends on your definition. BSD (which was originally a series of add-ons and modifications published in source) became a full OS distribution and split from Bell Labs UNIX around version/edition 6/7, and was never re-integrated (although SVR2/3/4 all added BSD features back into the AT&T-sourced versions).
As a UNIX pedant, I would say that BSD is *NOT* UNIX. Remember the lawsuit that forced AT&T code to be removed from BSD, leading to BSD/Lite, FreeBSD and BSD/386, so it is difficult to justify the claim that BSD is UNIX.
By comparison, HP/UX, AIX, Xenix, UNIXWare/SCO UNIX, Altix, SINIX and many more were derivatives of AT&T code, and passed UNIX branding tests, so could legally be called UNIX.
I accept that a lot of people who were from outside the Bell Labs/AT&T world may well have seen BSD before any commercial version of UNIX, and may well have referred to BSD as UNIX before AT&T got commercially sensitive about the UNIX brand, but that does not alter the fact that it was a very early fork of UNIX which never gained UNIX branding. I am not arguing that BSD is no good, because clearly it is, but that its claim to be UNIX is subject to interpretation.
As far as I am aware, the only BSD variant that passes any of the UNIX compliance test suites is OSX!
... deserves complete admiration and respect. Ritchie's contribution to computing is far more significant than the one from Steve Jobs (yes, I realise that I am comparing apples to oranges, here (no pun intended)). It is a pity that Ritchie's passing will be overshadowed by Jobs'.
Must be one of the very few occasions Jobs got his "product" to market first.
And one who had a much greater impact than anything Apple/MS related. Yet I don't see obits on the news, and no fanbois crying and lamenting his passing.
Or was it Thompson? Anyway, I think it was Ritchie - Massey University, New Zealand, about 1985. He worked out that, per head of population, it was the largest UNIX gathering he had attended. I asked him what, in his opinion, UNIX's biggest defect was; he answered, "No concurrency", while crouching to feed the ducks. Asked during the main Q and A session why UNIX commands had such cryptic names, he said it was to make sure they did not get confused with anything else. Very approachable, and fitted the image of beard, long hair and sandals exactly.
By the way, B may come from Bell Labs, but it derives, actually, from BCPL, from the University of Cambridge (England). The old question was: what would the successor to "C" be called, "D" or "P"? As for Algol, so casually dismissed: it was a decent language that influenced Pascal, Ada and Modula, and was not without influence on C++ and Java. I believe Steven Bourne used Algol ideas, if not Algol, for his first Bourne shell implementation.
Before he went to Bell Labs, Steve Bourne got his PhD at Cambridge where he worked on an Algol68C compiler. The similarity of some constructs in the shell to those of Algol68 is not accidental! (Both case...esac and if...fi come from Algol68. Loops would have had do...od if "od" weren't already an octal dump.)
"C paved the way for object-oriented programming ... Today, C and its descendent C++ are a popular choice for building operating systems"
That should be
"C was refitted with object-oriented programming features, yielding something not unlike a gaily decorated truck out of Lahore ... Today, C and its descendent C++ are a popular choice for shooting oneself in both feet trying to build operating systems"
Will there be an ASCII art candle vigil? Somehow I doubt it.
Funny how the media are not shouting to the world about this compared to the recent Apple announcement.
An ASCII art candle vigil
- that put a smile on my face, thanks.
Not attempting to post one here due to a) copyright and b) no monospace.. but a good collection at:
This is a great loss. In many ways this touches me more than the death of Steve Jobs last week.
Must find K&R books on shelf. Must not give up. Repeat.
You're joking, right?
Six flavours doesn't even come close, we had that many flavours just on one site - which was particularly odd as we only had machines from two vendors.
In fact the old Siemens-Nixdorf T35 actually had two flavours on one machine both available at the same time; you could just switch between the two with a simple command once logged in. (No, there was no virtualisation involved.)
The Pyramid RISC had "universes" of different UNIXes, as many as you like.
God, I'd clean forgotten about Pyramid - what happened to them? I know I worked on one, a money market system I think.
Memories long past are fading,
But in flesh or in spirit Dennis,
Ken & Brian always burn brightly
Everlasting into the long dark night
Thanks to Dennis R, respects to Ken T & Brian K.
Ritchie was the man. He was not a showman like Jobs. He was the real deal in terms of important impact on the IT industry. A genius. RIP
A real innovator and one who genuinely changed computing, but whom hardly anyone had heard of, or the megalomaniac control freak Jobs - I know which one I feel the field is the poorer for losing.
"and oh by the way those aren't pointers to bytes, they're pointers to words"
I met this on a microcontroller, not too many moons ago. And lo, there was much wailing and gnashing of teeth, and darkness was on the face of the softies who had to deal with this f**king stupid idea.
Welcome to the wonderful world of embedded processors. We use a new and different architecture in every second or third project. If you don't like it, stick to desktops.
If one were to compare the recent losses in technology to certain fictional figures, then I would not consider it unfair to say that:
Dennis Ritchie == Jesus
Steve Jobs == Daniel Plainview