Happy Birthday System/360 and COBOL ...
... and hats off to Grace Hopper, despite your misgivings - "Widely regarded as the worst mistake in computing history"? That'll be a hundred thousand rock-solid disasters then?
Apart from RBS of course.
Cobol is the language most associated with mainframes, especially the IBM System 360 whose 50th anniversary is being celebrated or at least commemorated this week. But when COBOL was first spawned in the mid-1950s, it wasn’t intended for programmers. It was aimed instead at “accountants and business managers” – basically a …
Exactly so. COBOL was of its time and it should not be criticised for not being designed with 30 years of nonexistent hindsight. It was designed in a day when the whole understanding of what a computer was and what it did was very different, when the work to be done by a compiler had to be minimised because processing power was expensive. As noted in TFA, it was originally seen as a language to be used by technical people to solve problems in their discipline, not to be used by a special class of programmers. Rather like ALGOL and FORTRAN, in fact.
The number of academic languages which are theoretically wonderful that have emerged since, and which have sunk, if not without trace, at least into footnotes, is evidence that sometimes what people are used to is actually the best solution.
I actually heard Grace Hopper speak several times and her story was that she and her coworkers who Designed COBOL had two major objectives:
1) A Language that did the things Business needed to do. That's why they called it: COmmon Business Oriented Language.
2) A language that looked enough like English that it would be Easy to teach and read. That was one of the reasons the Pentagon helped Pay for the development of COBOL. They were looking for a language that could be taught to the average Military Recruit with a High School Education in a short period of time.
They also knew exactly how difficult it is to find the Time to Document the program when your Manager is breathing down your neck. So, they added "Self Documenting" to the list of attributes. Yet another push in the direction of "Looks Like English".
The fact that so many COBOL programs are still chugging along, just doing the job, is proof that Hopper and friends succeeded in meeting 99% of their objectives.
Oh, and when it comes down to the accusation of "Wordiness", how many pages' worth of reading is required for a Properly Documented C++ Program that the next programmer can just pick up and figure out what it's REALLY Doing??
In 1980, I was in first year at university and had holiday jobs working in various computer places.
Back then, most COBOL programmers had only a three month programming course and that was it.
Essentially, they took the more intelligent looking filing clerks and ran them through a 3 month course from IBM, ICL, or whatever, and out popped a newly minted COBOL programmer who could convert flowcharts into COBOL. The better ones could even generate the flowcharts too.
They could do basic stuff like inventory, accounting, etc. COBOL was great for that task.
Many of these programmers got a bit big headed. They were needed to produce the end of month accounts and could wheedle various favours out of management.
In about 1986, one place I worked at discovered desktop computers (Macs) running spread sheets. Suddenly managers could generate their own reports without some of these COBOL programmers holding them hostage. Many of the most dickish programmers were quickly fired.
Never been fond of it, even though I've done a fair bit over the years. It got slightly more tolerable when Cobol 2 came along and you didn't have to type with your little finger constantly on the shift key though. As Dominic points out, insanely verbose though, especially when people insisted that variable (or "field") names should be suitably meaningful. Not fun on 3270s without cut'n'paste. These days, of course, IBM do perfectly good 'C' and C++ compilers which work nicely with any s/360-derived data formats if you avoid that string malarkey, and there is a perfectly good JVM if you really enjoy typing (pun intended).
I guess it was popular simply because it was intended for accountants and business managers; just unfortunate that, whilst a language syntax may be relatively easy to pick up, writing decent software, for any purpose, remains rather hard.
But the (wholly intentional) advantage of its verbosity is that you could take a section of procedural code and show it to an intelligent accountant or business manager (yes, they do exist, honestly) and explain what it was meant to do. If you were lucky, they might even be able to point out why what you were attempting to achieve wasn't actually what the business needed.
Try doing that in C.
ISTR editors allowed macros, so simple keypresses could fill in most of the shit, GIVING ease of use.
Can't remember when CAPS LOCK first came in but my old typewriter had it - though you couldn't easily use it to write templates and macros that could do 90% of the work once you'd worked out a name for the PROGRAM-ID.
I loved writing COBOL on the Vax - EDT and TPU FTW.
I miss those "simple" days. I think the verbosity of the code was really nice, you felt you had achieved something when you had written a screen full of code. With C, in the same vertical space you probably had 2 lines of code and 20 lines of curly braces! :-P
I Remember hearing the screaming of the line printer coming to a halt, and then seeing one of the programmers reach down and pick up 6 inches of green bar so he could see what was wrong. After several iterations of this the printer seemed to quiet down for quite a while. He must have been back at his desk trying to figure out what the compiler dump printout said. All in a day's work, I guess. I always knew that it was the programmers because the first page always had their name on it instead of a program name! The programmers must have decimated hundreds of square miles of forest with their printouts!
"writing decent software, for any purpose, remains rather hard."
I don't think this was fully appreciated at the time. You only have to look at the futurological predictions to realise that people always underestimate just how difficult new technology is going to prove (e.g. "flying cars" - they exist all right, but it turned out that helicopters were hugely expensive, fuel-inefficient and need highly skilled and trained people to operate. Or fusion, which is now more years in the future than it was thought to be in the 1960s.)
You could argue that people should by now make allowances for underestimation of difficulty, but people with money tend to get the jitters when you put even 20% of contingency into a project. And perhaps being economical with the forecast is right; if the US Government had realised how expensive the Manhattan Project was really going to be, they would probably have invested in improvements to conventional weapons instead. Sometimes you just need to take a walk in the dark to make progress.
Actually the COBOL PICTURE stuff used for formatting reports (COBOL's bread and butter) is far less verbose than attempting the same thing in C.
So is the record copying: Copy a record of one type to another type and all the fields with common names get copied across. One line does what n lines of C/Algol/Pascal/... does and does not need to be changed when the field names are changed.
COBOL is incredibly useful, and reasonably succinct, when used for what it was intended for.
Don't try writing an OS in COBOL though...
"Not fun on 3270's without cut'n'paste"? It was simply a different paradigm, that's all. Who needed cut'n'paste when you had ISPF edit? Incredibly powerful - if you grew up with it, you could make it jump through hoops. Far from needing cut and paste, when a GUI interface was all I had, I missed the sheer raw power of Edit's command line. In my later years, I had a rich choice of coding environments; I used GUIs for some languages, such as Java, but 3270 for others (most definitely including COBOL). Horses for courses, and all that.
I started in other languages, and came to COBOL late-ish in my career (supporting CICS development - and I consciously started writing my code in COBOL because, yes, that's what a lot of the customers I talked to were still using). Yes, it had lots of annoying details - and until COBOL 2 came along, some serious linguistic omissions too - but of all the annoyances, the biggest one of the lot for me was that comparatively trivial, and wretched, terminating period. Absolutely mandatory in some places; a syntax error after the self-same statement in others. Slap a conditional around a piece of perfectly good, working code, and suddenly it wouldn't even compile. If I'd had a swear jar in the office, it would have been full about twice a day.
If there really are a million COBOL (I'm old school) programmers out there, I bet their average age isn't much less than 50. So you're losing getting on for 10% of your 'stock' every year, most (hopefully) to well-earned retirement. So who's going to maintain your 100,000 COBOL programs in 10 years' time? Do you have a cunning plan to rewrite/redevelop them all in some hip modern language? How many programmers do you need to rewrite 10,000 programs a year?
I've got lots of questions, but I don't hear many answers.
"So who's going to maintain your 100,000 COBOL programs in 10 years' time?"
Me and a whole bunch of like-minded guys. Our motto shall be "You don't need a new data center, you need people who won't sneer at the one you've got".
Working on the details as I type. Meanwhile, back to your desk to find that missing semi-colon or malformed (but entirely compiler legal) "if" equality that is needlessly crippling the general ledger.
Aw, c'mon - you could spend days debugging a COBOL program if the punch girl missed a slightly faint full-stop. And, despite my disinclination to type long variable names (and let the compiler find the typos), to this day I habitually put a horizontal bar through 7s and Zs when I write them down.
I would say that COBOL has achieved its longevity (we are talking about mainframes here) largely through:
a) IBM making the S/360, S/370, etc. architectures backward-compatible (genius IMHO)
b) Programming in assembly language being just too hard/too easy to create nightmares out of.
Pace that COBOL ur-program mentioned above - I'll bet it was a sequential batch update ported from the original assembler by someone who happened to be extremely proud of their penmanship!
Well *I* didn't have to because my mired-in-the-50s-mindset factory used verifiers, so most typos got caught.
I slash zeros and write G with a big tail on it too as a legacy of coding sheets, but since confusion between letters and numbers is still cause for concern when typing is not an option I view it as a positive rather than as reason to start drinking the Java Kool-Aid.
But why on earth would you use an IBM machine to do what a Unisys Sperry (or Burroughs) node in a 2200 does so much better and so much more securely on every level? IBM have only ten years of one technology that was done, tested, rolled out and earning its keep on Unisys machines back in 1992.
I still remember being asked by an IBM DBA "When you lose a disk in the online day and have to recover, how much data do you lose?" and me standing there like a numpty trying to understand his question, because for over a decade the answer for me had been "none" and I'd forgotten the bad old days pre-integrated recovery.
It's just another language. Not that big a deal. So I bet that in big Cobol-using companies, astute folks in their 40s who fancy the idea of steady employment until the pension fund is healthy enough might be saying hey boss, let me work with old Bill until he retires and I'll learn that stuff so we are covered.
The fourth, or should that be the third, is "The Psychology of Everyday Things". Should be mandatory reading for any UI designer. Trying to use some of these fondleslabs makes me wonder why I don't see more electronic devices smashed on the pavement. But I digress. Anyone try Fujitsu Visual Cobol? Pity it was not as easy to use as MicroFocus. The idea of resurrecting the Screen Painter was good, but the usual COBOL dialect differences made me drop it for Delphi. Now what happened to Kylix on Linux? About time someone tried again.
Back in the eighties, COBOL was a good part of the degree that I was studying.
Never used it after I was finished with the degree, but the one lasting thing it did do was to teach me how to spell "environment" correctly. Without COBOL I'd still be misspelling that word.
That's only pants because it's not:
DIVIDE cake INTO 8 GIVING slices
If you can read that you can understand it. I think that's how computer languages should work, let the compiler/interpreter take care of converting it to machine, I want something I can read!
If anything, Python has the worst punctuation.
The Python punctuation is whitespace, and impossible to see. I've had Python code mangled by editors, emailing and the like, which took ages to fix.
At least if C code gets mangled it is reasonably easy to fix with a pretty printer.
Another calumny spread by the one-course wonders.
You could write DIVIDE cake BY 8 GIVING slices even on the old ICL 1901T.
So double 8oP to Anonymous Ivy, too idle to crack a manual.
If you wanted to live dangerously you could COMPUTE slices=cake/8 but that is a Cobol 101 error in the making. Rule of Thumb: If you let the computer decide, it will decide on the worst possible option for your desired outcome. Works for just about everything, even today, even using C-like languages on Unix-like operating systems.
Dominic Connor worked on getting Cobol to work properly under OS/2, which was so successful that he’s now a City headhunter.
If ever there was a time when the City needed new heads a’hunted, DC, it is now, for its key secret* is reverse engineered to base foundation and realised to be indefensibly vulnerable to simple text based complex attack/SMARTR IntelAIgent Discourse.
* Don’t let anyone fool you into thinking and wasting time on discovering secrets, plural, when its key secret, singular, rules over everything under the sun. And it be priceless and worth whatever it takes and is asked, to keep it a closely guarded secret known only to a carefully vetted few, who invariably have been clever enough to work things out for themselves too, making it a sort of autonomous, self-actuated and self-actuating group, practically astute, virtually anonymous and highly active.
Play the wrong game against them, and the secret universal control mechanism of which we speak, the City’s key secret, is made known to everyone worldwide in a language which they cannot fail to perfectly understand. And one wouldn’t want to be a big City player whenever that info hit the fan and main street, for the baying mobs will not be stopped having their rightful revenge on the source of all of their pains.
It wasn't just the big companies that used COBOL. In the late 80s we were taught COBOL on a BTEC computing course at college. It seemed on its last legs even then: having to write out the code on coding sheets to be entered by a roomful of typists; if your program came back with one typo, you had to watch everyone else gleefully calculate the compound interest on a 15-year loan. The one we used was called Flinders COBOL, if I remember rightly, and ran on a Prime Computer. A year after finishing the course, I visited the campus and saw all the Prime terminals piled up in a skip.
It's perhaps interesting to ask whether the coding sheets method actually made for better programmers. Nowadays auto-suggest, auto-complete and instant debugging have made the skill of being able to write code without trivial errors somewhat redundant - but from my old fart perspective, striving for 100% accuracy first time was good mental discipline.
We also need to remember that for server side code, writing the code using an IDE is just as much an abstraction as doing it on coding sheets - the fact that it is being written "on a computer" isn't directly relevant to code quality (it can just be written much faster with fewer trivial errors and without all that consulting of paper manuals).
In the early days of microcontroller assembler I used to write assembly language in French squared exercise books because I could see far more lines at a time than I could on a green screen monitor and it was quicker than using an ASR-33. Nowadays, of course, that would be impossible (and pointless).
"It's perhaps interesting to ask whether the coding sheets method actually made for better programmers"
Coding sheets weren't used by choice. They were used because the equipment for getting code into a computer used to be expensive and so everyone submitted their code to the punch department (always called "punch girls" because I never saw a bloke doing data entry) where it would be converted to card or paper tape, verified by punching it again in a special machine before it was sent into the computer room to be run, spooled or whatever variety of input process your mainframe used.
In the late 70s I worked at an ICL site that had one VT for a department of a dozen programming staff. At one point the waiting list to use it was three days long.
When we converted to Sperry and got a bank of eight VDUs, and later, one each, we were gobsmacked. Turns out our pointy-haired manager didn't think we "needed" more on the ICL kit.
They replaced him for the Sperry kit-out.
I know they weren't used by choice, but because I/O was very expensive. That wasn't my question. The question is whether it was good training in accuracy to have to do it that way.
Watching some programmers bash code in nowadays, they are either geniuses or doing things in the most "obvious" way without really stopping to think. Some people still use paper a lot for initial thinking. Is there a best or optimal approach?
Some of us coded in red or blue ink. Not only was it good discipline, the keypunchers could read it more easily than pencil, so we got fewer errors. Then we read the cards, of course, before submitting them for compile. The idea was to get it to run the first time without compilation errors, and preferably correctly.
Also, we all had copies of punched cards showing all the default stuff, and when more than one program needed the same file structures, copies of those as well. We did not have to recode all that. Of course, later there were copy libraries, but that had to wait until we were using terminals rather than cards.
"... the mainframe Cobol stuff focused on being rock solid rather than interesting; ask yourself if you want a more reliable air traffic control system or one that’s more exciting?"
what is widely regarded as the worst mistake in computing history
By whom, Dominic? COBOL was only one of the languages pilloried by Dijkstra in "Truths" - not a thoughtful or useful piece in any case, however wildly overrated in some quarters. Sure, there's plenty of grumbling about COBOL, primarily among people who haven't used a modern variant. But substantial critiques are few and far between, and I'd like to see the evidence for this "widely regarded" rubbish.
As Chris points out, the verbosity of COBOL was specifically to make it readable, and for some programs writable, by non-programmers, so that domain experts - chiefly lawyers and accountants - could validate the business logic. That's an admirable goal that wasn't really taken up again until Knuth's Literate Programming twenty years later (and Knuth's scheme suffers from abysmal source syntax, in typical Knuthian fashion, though the underlying idea is good).
Really, if you can't write an application in COBOL - particularly using modern simplified syntax - in essentially the same amount of time that you can write it in, say, C++, you don't understand at least one of those languages.
(I work for Micro Focus, but I write code professionally in C, C++, procedural COBOL, OO COBOL, and a smattering of other languages; and academically in a couple dozen more. And programming languages is one of my areas of research, and I've presented on COBOL and the economics of source-code readability at academic conferences. So I have a little experience with this topic.)
Also, I don't know where "the two remaining Cobol compiler teams" came from. Did you miss Open COBOL?
I require a good reason why I would bother learning and using Cobol, when other languages are readily available and widely supported.
It doesn't feel like this is the case with Cobol. GNU Cobol is obviously making things better, but it's implemented in C. I wonder just what is the advantage of using Cobol, when there are so many languages that are better-integrated with my operating systems, and when I personally don't have any Cobol legacy code.
As someone who has written rather a lot of COBOL I can say it has a number of serious deficiencies.
No standard libraries. Everything gets written in house.
No type checking on subroutine calls. This is the source of numerous bugs and crashes. It discourages modular programming and leaves cut and paste as the only practical way to reuse code.
It's not portable. There are so many variations you cannot even port between different machines from the same manufacturer.
"what is widely regarded as the worst mistake in computing history."
Yeah. That would explain its predominance in the marketplace even today for financial reporting and tracking.
You have to love people who think they know a lot about something based on a single course taken for their CS degrees.
I remember with fondness a paragraph in the old Wikipedia entry on Cobol (which was edited - in part by me - a long time ago) "explaining" by means of an example why Cobol "doesn't work" - said example containing a Cobol 101 "never do" *programming* error prominently listed in every manufacturer's documentation as a route to misery along with the entirely reasonable technical explanation as to why it was a bad idea. Hint: it had to do with the internal representation of intermediate values, just like it would with C# or Java in analogous circumstances. Yes, I can break Java (or any C-derived language) by the same idiot way of mis-doing arithmetic. Can I use this as "proof" that C, Java et al "don't work"? I wish.
Contrast this with Java: Hard to learn, loaded with confusing syntax (the semi colon and the two different equals uses* were, in 1997, judged to be the leading causes of internet start-up cash losses and programmer overtime generation in a peer-reviewed white paper circulating in Venture Capitalist businesses I read back when I thought such things were important), and no native currency type which, in conjunction with really crappy explanations of the binary-floating-point-number-used-to-represent-decimal-amounts issue in text books, leads to the same "charming" mistake being made again and again and again in software suite rewrites.
And to top it off Java doesn't even solve the major issue facing anyone using a large enterprise software library - the ability to search it for something you already have faster than it would take a programmer to write something of their own to do the job, defeating the "wheel reinvention" rule that is supposed to be so important. To do that you have to write your own class library browser with a search facility. Or switch to Eiffel, which not only includes usable class library browsing but is also rigorously Object Oriented in its implementation - which Java ain't.
Just because it doesn't look like "C" doesn't mean it is Satan made manifest. Quite the reverse.
Even the most primitive Cobols I've had to wrangle (ICL) came with integral support for structures, data masking on the fly, native, error-free currency types *as well as* a floating point number implementation (of little use to be honest, most financial apps requiring scaled decimal more often than large inaccurate real numbers).
Buffer cleverness was integrated into the language so one could make stuff lightning fast with a couple of sentences added at the right place (with the understood cost of doing so of course).
Segmentation was a built-in feature so when computers were not clever at managing their very limited memory even a moderately talented programmer could do so, thereby fitting a gallon into a pint box. Of less use now but when Cobol was specced out object building and tear-down would consume more resources than you had for the entire program.
Unisys's Cobol could be written procedurally, declaratively or as a truth table even in the 1970s. You picked your metaphor of the day and got stuck in. Today there are "Object Oriented" versions you can play with, though why you'd bother is a mystery to me.
So, yeah. A total waste of time. *nods*.
The most honest critique I've heard from my colleagues of the language is that it isn't concise, to which I reply "and your point is?"
And before anyone weighs in with Y2K I should like to point out that a) that was a compiler limitation imposed by manufacturers, *not*, as has been mooted in these very pages, programmer laziness and b) our tape library system crashed hard just before Christmas and yours truly was dispatched to find out what was wrong with the perl that makes it go. Turns out that there is a Y2K38 problem with the 32-bit epochal time returned by many unix-like systems. Known and worked around using perl libraries (not in the local perl at the time, of course) but still. With all the foofaraw over Y2K you'd think that I'd have seen *one* mention of the issue in the open somewhere. 8o)
Oh, and those Air Traffic Control systems were more likely to be written in Coral 66 than Cobol, which was never intended to do anything other than make the payroll and balance the books - jobs it still does and does well, notwithstanding the spokesman for the US combined military payroll office who seemed to think a few weeks ago that mainframes and Cobol were why soldiers weren't getting paid properly, rather than logjam paperwork in the wetware stages of the operation and the "we don't make mistakes" doctrine.
So 8oP to the lot of you.
*Yes, I know what the difference between them is. Do you at three in the morning after pulling an all-nighter on someone else's code with the CIO breathing down your neck?
But I know many who have.
Remember it comes from a time of 16K core store, tape drives and all the compute power of a 4004 in a cabinet the size of a small room.
Grace conceived it, as far as I can tell, as a way of making the hideousness of assembler accessible to people who didn't actually understand a damned thing about registers, memory size or anything beyond 'this set of words will produce this result'.
And that is very important: There are a vanishingly small number of people who understand the intricacies of hardware and machine level instructions AND business modelling and problem solving.
Computer scientists will sneer at COBOL because it doesn't fit their prejudices as to what a 'proper programming language' ought to be.
But millions of people are grateful that someone had the nous to provide a tool that was accessible enough to enable them to USE the computing power at their disposal to solve enormous problems in finance, banking and other aspects of business related IT.
Like most other successful innovations COBOL was not designed by a theoretical scientist, it was designed by a practical person with a particular use in mind: to enable non technical people to write code to solve their problems.
It is a tool, and a useful tool and at the time the best tool there was. It did all it needed to and the fact it's still in use shows that it was a good tool.
It ranks with C and possibly FORTRAN as a basic software tool that enables people to solve problems.
One of the "Funnier" stories Grace Hopper told about the very Early development stage of COBOL was how they failed in their first sales pitch for the language.
The prototype "Demo" compiler only understood a list of about a dozen verbs.
Add, Subtract, Move, etc. The team worried that might not be impressive enough, so they made a very Quick modification which demonstrated that the program could recognize those Same instructions in FRENCH, giving the potential to Sell COBOL versions in Any European language, thus at least Doubling the market.
As Hopper put it: The Board of Directors KNEW they were being Scammed because there was no way a God Fearing American Computer Built in Philadelphia could Ever speak French.
Another pile of easily researched bollox.
French Cobol was developed because the French needed computers they could talk to in something other than Linear B and resented the living enfer out of having to use English instead. I know this because it was told to me in person by a nice Control Data representative during my year-long search for a job in the late 70s economy and I later spoke to a number of Les Consultantes from the Land o' Truffles who confirmed it with a hearty may wee.
So I disbelieve the truth of your "Grace Hopper Said" story as the product of the Git Generation, Bakana.
As Poul-Henning Kamp writes, C has caused The Most Expensive One-byte Mistake, that is, data structures without built-in bounds checks. Most recently, that type of programming is in the news because of the Heartbleed bug. (Yes, I know the Heartbleed bug doesn't specifically use strcpy; the principle is the same.)
The menace from C is insidious and inescapable. Modern processors are built to be good at running C, not some safe programming language, and runtimes for other languages are implemented in C. For example, in theory, Java is a very safe programming language, but in practice Java is the vehicle of countless security vulnerabilities. Because of so much legacy investment in C-based systems, including Windows, Mac, and Unix, I don't see an easy way out of this trap.
Well, as we are there (inter alia) to provide a service to managers and beancounters, what on earth is wrong with that? I know the managers and beancounters are suffering from the delusion that they are useful in themselves, rather than simply providing a service to the people who actually create the product, but that doesn't mean we should make the same mistake.
"the worst mistake in computing history."
Actually the problem was not so much COBOL itself, but rather The Great Bad Idea it had: Noise Words. (Note that I am gritting my teeth furiously as I type this.)
A noise word, children, is a word of code that, although written and read by and for humans, is actually IGNORED by the compiler. You may recognise it, for instance, in versions of BASIC where you may or may not use the word 'Let'. In COBOL it is an entire cult (which may be a misprint).
The problem about making code readable for accountants and so on is to confuse 'readability' with verbosity. Certainly the problem with just about every COBOL program I ever had to amend/update (which was a lot) was that the writer seemed under the impression that the more that actually got typed, the 'better' the code would be. And while I never even aspired to be a manager, my dream was to hang a banner across the ceiling of the office saying
You do NOT get paid per line of code
But I was just about the only person who actually realised that.
It is only a slight digression to mention comments, meaning an asterisk in column 7. There would be an entire page of comment STATEMENTS with very little actual comment (such as 'PRINT LINE OF CODE') and at the bottom of the page the actual PRINT statement. Talk about 'No shit, Sherlock'. (And then of course, precisely when the code gets complex, comments explaining the processes involved are conspicuously absent.)
So a program that takes several dozen pages of code to print out COULD be confined to a few screensworth (and laid out there neatly and READABLY). In fact my rule was that, by default, noise words should NOT be included. The only noise word I ever used at all regularly was 'THEN', especially when the actual 'IF' statement itself was non-trivial and the THEN was immediately above the ELSE.
Apologies if this is tl;dr but thank you for its therapeutic effect on myself.
PS. Why does El Reg keep inserting blank lines in my text ?
I learned Cobol REALLY well in college! In fact I was able to do my Data Structures assignments in Cobol because the IBM 370 we had, had a graphics framework/formswork accessible from Cobol. So I got extra credit for doing the assignments in a graphical nature! Miss that stuff!
What I missed about COBOL is the fact that if you needed to read a dump, the data would be there as laid out in the program. Sometimes you just wanted to see what was there when the thing blew up. And if you ordered a PMAP you could see the procedures also. This was very useful, believe it or not. I still remember two striking bugs that I will not bore you with.
It was quite a shock to me to find that PL/I did not lay out storage the same way.
I worked for a successful software firm from 2004 to 2007, making a niche application for the wholesale and logistics industry. It does stock-keeping and rather complicated pricing and label-printing and warehouse management and loads of other things that various customers have demanded be added over the years, like managing fleets of lorries. The whole thing is built entirely in Cobol, and not even because it's old legacy code; most of it's pretty new.
We were using Acucobol, which doesn't bother with any of that crazy indentation stuff the mainframes insist on, and provides a nice VB-like programming environment. It even automatically generates full stops and puts them on their own separate line, which is really the only sensible way to deal with the bastards.
The major use of Cobol to the firm was, I think, that they could bring in new programmers, even with no Cobol experience, and get them up to speed pretty damn quickly, precisely because the code is so readable. Try that with Java.
Since the company is one of the major players in wholesaler's software in the UK and Ireland, I can guarantee that a young(ish) company who have chosen to use Cobol have enabled you to buy stuff in the shops. Which is nice.
COBOL is verbose?
Thanks Grace Hopper!
Cobol programs I wrote in 1972 are running on MF, Unix, Linux and PC machines, and lots of programmers have put their hands on the self-documenting code to implement new functions.
What I see with C# and Java is continuous remaking, caused by a cryptic style of programming.
What the author forgets is that C was "sold" as a replacement for assemblers, to get a better handle on memory, but that was in an era of memory scarcity.
Then came Java, sold as a "secure" language in a client-server environment, another technology of the past, when computing load had to sit partly on the host, partly on a local server and partly on the client due to slow telecommunications connections.