...thanks for sharing them. My son is mad-keen on all things NASA (with the keenness and 'just get it done' outlook of a typical 10yr-old) and will enjoy reading this too.
The IT runs strong in my family. One day -- this is more than thirty years ago -- when I pulled myself out of the code mines, I found my Aunt Anna visiting. I mentioned (proudly) that I was mastering the arcana of assembly language programming on the TRS-80, using Microsoft’s MASM macro-assembler. “Really?” she replied. “A …
Indeed it rang a few bells for me. My first program was in DEUCE machine code. The DEUCE was a valve computer built by English Electric, and I won a prize to work on one for a few days at Nelson Labs in '64. It was the opposite of today's computer rooms - all the windows were wide open to get rid of the huge heat output from the valves.
Raw programming in machine code - and because the main memory capacity was so small you could keep in your head all the memory locations you were using. Saved having to document and meant you could get stuff running very quickly. Of course updating it later was kinda difficult.
I went on to proper programming at university, and afterwards in high-level languages on a variety of ICL mainframes. It was only when I bought a TRS-80 with my own money that I got back to real raw programming in Z80 code and assembler. It was a joy, and I found I could put stuff together much faster and more reliably than on the ponderous mainframes. GEORGE 3 had a half-life of only 15 minutes at one time.
So I lashed up a termiprinter to the Trash-80, which would take standard 132-column computer printout paper. I actually had a major business planning application running on it. I kept it secret and presented the results on printouts that the board assumed were from the 1902A mainframe they had allocated to me. Fun while it lasted.
What happened when they found out I may tell another time.
When I was at college, we had Z80 boards with hex keypads for programming them. Write the assembly, hand-convert it to opcodes, and punch it in.
The same for my ZX81: assembly -> opcodes -> POKEd into a REM statement at the end of the program.
My first day at college, we had to write a program to calculate the minimum number of coins to return in change, in CBM BASIC on a PET 4000. We had a double lesson to do this. I finished it in around 15 minutes, including testing. So I spent the rest of the lesson punching in machine code to draw borders around the screen, split the screen in two, use GETs to accept the input, and make 8x8 block-graphic numbers to represent the figures entered. It also used block graphics at the bottom to draw piles of coins...
The lecturer's reaction? "I didn't know you could do that with a computer."
And that was on my first day! :-S
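For anyone curious, the change-counting exercise from that first lesson boils down to a greedy loop over the coin denominations, largest first. A minimal sketch in Python - the coin set and function name are my own illustration, not from the original CBM BASIC version:

```python
def coins_for_change(amount_pence, denominations=(50, 20, 10, 5, 2, 1)):
    """Greedy change-making: largest coins first.

    Greedy is optimal for canonical coin sets like the UK's; arbitrary
    denominations would need dynamic programming instead.
    """
    result = {}
    for coin in denominations:
        count, amount_pence = divmod(amount_pence, coin)
        if count:
            result[coin] = count
    return result

print(coins_for_change(88))  # -> {50: 1, 20: 1, 10: 1, 5: 1, 2: 1, 1: 1}
```

A handful of lines today - the impressive part in 1980-something was doing the same, plus the screen handling, in hand-punched machine code.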
The BBC and my Memotech, with their built-in assemblers, were a huge step forward. The built-in machine code monitor was also very useful on the Memotech. It was much more than just a pretty machine - it combined the best bits of the Spectrum, BBC and C64; unfortunately it never took off.
The 704 was the first computer I ever programmed, not that they let -- snif -- STUDENTS actually touch it. But even back in 1959, there were higher-level languages for it. I've almost never programmed anything that didn't have at least assembly language. One experience coding in absolute octal was enough for me.
"One experience coding in absolute octal was enough for me."
In the 1970s it was not unusual to write raw machine code, as the prototype mainframe didn't yet have an OS or compilers.
That came in useful on one occasion in a distant land when a customer operator had accidentally relabelled their mainframe development hard disk - and owing to archiving problems they hadn't done a back-up for months. In theory all it needed was a bit setting in the header record on the hard disk. The self-loading program was designed and the machine code written on paper. The binary was then punched onto papertape - hole by hole using a hand "dibber". Finally an initial program load sequence for the papertape reader was entered via the cpu engineering panel handkeys. It was a one-shot chance - and it worked.
Quite often a new comms machine we were supporting needed a bug fixed in the field ASAP. Each machine had a uniquely compiled system binary. That meant coding the fix by hand - and entering it as byte values on papertape to be fed into the console teletype for the system debugger. The only compiler tools were on special machines back at the factory, which could only produce complete compiled system binaries. Sometimes these field machine code patches were fundamental redesigns of part of an application that had originally been written in a high-level language.
As the sign on my desk said "We do the impossible immediately - miracles take a bit longer". Computing was fun in those days - hard work - but the adrenaline highs were spectacular.
Yeah, knowing how hardware works is still important.
It got me a job a couple of years back. By changing some code for an eCommerce system, I managed to get it from falling over at 200 simultaneous transactions across 4 servers to not breaking a sweat with 200 transactions on 1 server - just by changing the logic around to be processor friendly.
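To illustrate the kind of rearrangement that pays off like this - a hypothetical sketch of my own, not the actual eCommerce code - hoisting repeated expensive work out of the per-transaction path is often all it takes:

```python
import functools
import time

# Hypothetical expensive lookup -- stands in for a database call or
# heavyweight calculation done once per line item in the slow version.
def tax_rate(category):
    time.sleep(0.001)  # simulate the expensive work
    return {"books": 0.0, "electronics": 0.2}.get(category, 0.175)

def total_slow(items):
    # Recomputes the rate for every single line item.
    return sum(price * (1 + tax_rate(cat)) for cat, price in items)

@functools.lru_cache(maxsize=None)
def cached_tax_rate(category):
    return tax_rate(category)

def total_fast(items):
    # Same result, but repeated categories cost nothing after the first hit.
    return sum(price * (1 + cached_tax_rate(cat)) for cat, price in items)
```

On a batch of a few hundred items the cached version does a handful of lookups instead of hundreds - same totals, far less work per transaction.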
Was enjoying the article until I saw the RIP at the end, which meant there was not going to be a follow up.
A pity with all these entrepreneurs collecting old kit, no-one is collecting the memories of those who used it (and how they made it work). Perhaps we need an archive of their experiences for future generations to refer back to?
Usenet: alt.folklore.computers (it's Google Groups now).
Where people like Dennis Ritchie used to hang out. It was always fun watching some new member arguing about some obscure point of C: dmr would say his piece, and when asked what he knew about it would politely point out that he wrote it.
"[...] when asked what he knew about it would politely point out that he wrote it."
There was a conversation in one of the usenet groups on network protocols. The slide-lock on the obsolescent Ethernet AUI cable was criticised for being awkward and not infallible. To which a regular contributor*** agreed that he regretted designing that feature.
***Might have been the legendary Bob Metcalfe??
One of my aunts headed an OS development team in a high-tech company on the West Coast, and earlier her husband started development of stock trading systems in NY on the IBM 1401 and IBM 360. I learned all this after I started to write assembler code on an IBM 4341 under MVS. Needless to say, I "played" with 1401s and 360s as a very lucky teenager.
This brings back nice memories.
Wasn't a "computer" back then someone who "just" did the maths by hand, or maybe on a mechanical tabulator? I can certainly understand engineers who built bridges or the like needing a human "computer" to offload the preparation of load tables, etc. However, the author is pretty clear his Aunt translated formulas into instructions to run on the IBM, which could take input, probably process through multiple iterations, and generate some set of outputs. Sounds like programming to me. (My first program was to solve the general quadratic equation on a Canola programmable calculator - allowing you to rerun the same program but with different inputs. This was quite different from the preceding couple of years at school of just using a calculator to "compute" maths answers.)
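That calculator program was essentially the quadratic formula made rerunnable with different inputs. A minimal modern sketch in Python (the function name is mine, not the Canola original):

```python
import math

def solve_quadratic(a, b, c):
    """Real roots of a*x^2 + b*x + c = 0, assuming a != 0."""
    disc = b * b - 4 * a * c
    if disc < 0:
        return []                      # no real roots
    if disc == 0:
        return [-b / (2 * a)]          # one repeated root
    root = math.sqrt(disc)
    return [(-b + root) / (2 * a), (-b - root) / (2 * a)]

# Rerun with different inputs, as on the programmable calculator:
print(solve_quadratic(1, -3, 2))   # x^2 - 3x + 2 -> [2.0, 1.0]
```

The distinction holds up: a calculator "computes" one answer, while a stored program parameterises the whole calculation.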
I'm pretty sure the 704 had FORTRAN and almost certainly had an assembler.
Mark's aunt Anna's foray into programming reminds me of the start of my own assembly programming, a bit over a decade later. The prof introduced us to the instruction set of the year-old Intel 8008, and our first programming was done in octal; we ran the code on an 8008 simulator running on a CDC 6400. ISTR that the simulator was written in FORTRAN.
As for the article - that was a very nice tribute to his aunt.
The prof introduced us to the instruction set of the year-old Intel 8008 and we ran the code on an 8008 simulator running on a CDC 6400.
At the Helsinki University of Technology we had an actual 8008 machine - very obsolete even then, in the mid-1980s - but still used for some student exercises. We were to write a little program by hand in hex, burn it into an EPROM chip, and run it (the simple 8008 machine had no other storage devices). I must admit I cheated a bit: I used an 8080 assembler (running inside a CP/M emulator on a PC, a set-up I already had around because of other interests) and avoided those instructions that were not available (the 8008 instructions were a subset of the 8080's).
Indeed it did, complete with IF (SENSE LIGHT) and READ DRUM statements.
Not sure if it was true of the 704 compiler, but I do recall reading that early IBM FORTRAN compilers did a lexical pass on the input card deck, overpunched a code indicating each statement's type, sorted the deck accordingly, loaded the compiler fragments for each statement type in sequence, and generated (I presume) some intermediate code which was then sorted back into the original statement order.
Yes indeed. When I walked out of university in December 1965 with my degree in mathematical statistics and into my new boss' office the next day, he greeted me with "Can you program a computer?" My response was "How do you spell that - with an e or an o?" - because there was a caste of highly numerate people with quick right wrists who used to do complex calculations at dizzying speeds on Facit manual calculators. They were called "computors"... Receiving the inhumane answer, I said "No". His response was to hand me a Fortran 2 Reference Manual for my new employer's IBM 704 with the comment "Well, you'd better teach yourself, it won't be long before all statisticians will have to program computers." Wasn't too far off.
An engineer once told me of a time a very old (PDP-11?) machine had some hard-to-trace timing-related problem. He did some digging and found a wire connecting two adjacent pins on an IC which seemed to serve no purpose, but which stopped the machine working when disconnected. Shorting the two pins together directly also stopped it, so he traced the wire. And traced. AND traced. Six feet of wire later, he realised that this was a very effective delay line!
Turns out the problem was actually not that IC at all but something else.
An even more hilarious anecdote: in the days when vacuum tubes were king and transistors weren't stable or cheap enough for computers, engineers faced with a working but unstable batch of expen$ive tubes procured a large domestic oven and "cooked" the tubes, to cause the presumably slightly underfired getters to grab some more stray molecules and render the tubes stable at the high speeds used.
Then they used a very ingenious handheld mini-Tesla-coil-based diagnostic rig to measure the impedance of each tube - a metal kitchen scourer on the cap and a second on all the other pins - to avoid plugging the valves in and potentially damaging them. If a valve passed this test it was used; if not, it was cooked again.
Some of these lasted several months in use, so obviously the weak ones were being weeded out.
Suggesting the PDP-11 is "very old" is making me feel old! When I started working on them they had been around for a while though I guess.
We had one course at Uni that was done with pure hand converted machine code on LSI-11 boxes (essentially PDP-11 without the peripherals). I think it was included to give us an understanding of the background rather than in the expectation we would even need to do this professionally. Back then I could count in Octal as well as Decimal & Hex.
Being young and stupid I also did a reasonable amount of this on the Sinclair machines (REM statement in the first line(s) of the program to create the space then Poke the 2-byte instructions in).
Probably to make it go faster. That's what I did on a Commodore PET to get over the atrocious speed of the BASIC interpreter. First I poked a subroutine into what was largely a BASIC program. Then, on a more ambitious machine-code-only program, it was SYS 1024 and start banging in the hex. The improvement was so great that I had to artificially slow it down to make the game playable. That would have been somewhere in time between the ZX81 and Speccy release dates.
"6 feet of wire later, realized that this was a very effective delay line !"
The engineers developing our System 4/70 mainframe prototype in 1967 used a loop of several feet of twisted pair wires when a problem indicated a delay was needed.
That particular prototype used a new range of fans - several of which were blowing upwards through the roof of each of the many high cabinets. They burned out so often that we ended up putting a strip of papertape over each one, with one end weighed down by a large nut. The top of each cabinet then had an array of these white flickering tongues. If a fan failed it was quickly noticed and replaced.
The normal engineering component fault paperwork used to have the dates of fitting and failure - it had to be extended to date, hour, and minute.
Weird the things you find out about your family. My uncle Bill was an electronics engineer for most of his working life. He's the reason I turned out the nerd I am. He designed (some of) the control systems for the Harrier, and worked on the avionics for Concorde. He was one of the people aboard the prototype Concorde #002 when they flew it out of Filton airport.
He always kept it really quiet and downplayed it. I only really found out about just how much he'd done at his funeral, talking to his colleagues. We'd assumed he was just a guy on the spanners, but it was much more than that.
Christ, I miss that old nerd.
Yes, there were a lot of unsung pioneers in the field of computer science and we owe them a lot!
My 1st computer was an Olivetti programmable computer that used magnetic cards for both data entry and memory. The language: Olivetti assembler. Amazing what you could do with only 4k of memory!
Actually included X-ray erasure because UV light wouldn't do the trick.
I hear that NASA still uses 25+ year old CPUs (8086) in its Shuttle solid rocket boosters, and has to trawl online auctions for new-in-box chips because they aren't made any more.
The Russian versions do work, but aren't quite similar enough for legacy code which has to work 100% of the time.
The ISS still runs on decades-old technology, and the laptops are still Pentium-based because none of the modern CPUs (i.e. sub-100nm) will run for long in space without crashing.
Your aunt may well have had my father as a math instructor at Tufts. He was teaching there at the time.
Boston-area mathematics was not very lucrative back then. My father had to work three jobs to make ends meet. Eventually, the family moved to California, where I was born. He more than doubled his salary in the process. As a mathematician in the high-tech haven that was Northern California in the later 1950s and through the 1960s, he solved differential equations by hand, including numerical estimation. The collapse of the Tacoma Narrows Bridge was relatively fresh in people's minds. Calculation was viewed as a must. But some sets of simultaneous equations would take him and his boss a couple of weeks to work out.
Computers were enormous curiosities. "Real" mathematicians didn't need them. My father never really embraced computers. The closest he got to a computer addiction was playing Gorilla-- a QBasic classic -- on DOS. Your aunt was the new generation -- computer savvy.
I studied econometrics in college, which was a computer-based pursuit by then. Eventually I vectored off into computers proper. For about 12 years I followed the siren call of assembly language programming, on IBM mainframes and on the early PCs -- 386 through the first Pentiums. It was fun work. I have a lot of respect for your aunt's ability to work directly in machine code. That's a real intellectual challenge, and machine code is utterly unforgiving.
The Old Ways! Kids today with their gangsta rock 'n' roll and their drag 'n' drop. They don't know the value of a base register. ;)
I enjoyed the article. Thanks for sharing your aunt's story.
Biting the hand that feeds IT © 1998–2019