Even then...
... technology was amazing.
Some info here:
http://history.nasa.gov/computers/Ch6-2.html
Reading that makes yer eyes go wonky though.
If you thought Fortran and Cold War-era assembly language programming is pointless and purely for old-timers, guess again. NASA has found an engineer comfortable with the software to keep its old space-race-age systems ticking over. In an interview with Popular Mechanics this month, the manager of NASA's Voyager program …
You'll notice in fact there are 3 processors involved here. Some came from the Viking programme, but all (AFAIK) are custom processors built out of LS TTL, usually around the LS181 ALU, like the PDP-11 and the Xerox PARC Alto.
When one of the processors was not fast enough to do the work they decided to add a DMA mode to all instructions to allow "hidden" data movements without the direct involvement of the CPU.
Not something the average x86 or ARM programmer is used to considering as a design option.
I suspect the JPL does have the necessary documents, but you may have to rehost the assembler if you're going to have a go at re-programming Voyager, as data updates are going to be sloooooow.
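The "hidden" per-instruction DMA idea above can be sketched with a toy cost model. All the numbers and function names here are invented purely for illustration; nothing below reflects the real Voyager hardware:

```python
# Toy model (invented cycle counts): compare a CPU that must move data
# with explicit load/store instructions against one where every
# instruction can carry one "hidden" DMA word transfer alongside it.

def cycles_without_dma(n_words, compute_instrs, copy_cost=2, instr_cost=1):
    # Every word moved costs explicit instructions on top of the real work.
    return compute_instrs * instr_cost + n_words * copy_cost

def cycles_with_per_instruction_dma(n_words, compute_instrs, instr_cost=1):
    # Each instruction shuttles one word "for free" in parallel, so the
    # copy is completely hidden whenever there is enough real work to do.
    return max(compute_instrs * instr_cost, n_words)

print(cycles_without_dma(100, 50))               # 250 cycles
print(cycles_with_per_instruction_dma(100, 50))  # 100 cycles
```

The point of the design choice falls out of the `max()`: once transfers ride along with every instruction, data movement stops being an additive cost and only matters when it exceeds the compute time.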
"When one of the processors was not fast enough to do the work they decided to add a DMA mode to all instructions to allow "hidden" data movements without the direct involvement of the CPU.
Not something the average x86 or ARM programmer is used to considering as a design option.
"
DMA usage is very common. As I write this I'm taking a 10 minute break from debugging a DMA issue on an ARM micro. The CPU is doing almost no work (CPU loading of about 1%), but the DMA is working at about 70%.
Go look in the Linux kernel - stuffed to the gunnels with DMA. cd linux; grep -ir dma
" but you may have to rehost the assembler if you're going to have a go at re-programming Voyager "
What do you mean by rehosting the assembler? I would expect the assembler is a cross-assembler (ie. it runs on a normal machine (eg. originally a Vax or such, but now a Linux box), but generates code for the target CPU. That's how most embedded systems are developed.
"DMA usage is very common. As I write this I'm taking a 10 minute break from debugging a DMA issue on an ARM micro. The CPU is doing almost no work (CPU loading of about 1%), but the DMA is working at about 70%."
You need to read the chapter. Slowly.
The design team added DMA to each individual instruction's implementation when data transfers were not quick enough. Not an option for any modern MPU's IP.
"What do you mean by rehosting the assembler? I would expect the assembler is a cross-assembler (ie. it runs on a normal machine (eg. originally a Vax or such, but now a Linux box), but generates code for the target CPU. That's how most embedded systems are developed."
True, these systems date from the '70s. IOW you're looking at '70s assembler written in the '70s version of its implementation language and running on a '70s computer.
It all depends on how up to date NASA's tool hosting has been.
If the toolset was developed in a mainstream language without using too many supplier-specific features, it'll be simple. If they relied on special features of that language or its support libraries, you'd either have to duplicate them or build a new tool chain.
"
If the toolset was developed in a mainstream language without using too many supplier-specific features, it'll be simple. If they relied on special features of that language or its support libraries, you'd either have to duplicate them or build a new tool chain.
"
There won't be any "support libraries". It will be from-scratch code all written by the project programmer(s). Take a look at the ZX Spectrum disassembly (google it) to see how it was done in those days. A 1- or 2-pass assembler, *maybe* followed by a linker - though frequently the code was fed to the assembler as effectively one single module, in which case linking would not be required: the assembler output the completed binary.
There are a few generic assemblers available. You start by defining the basic architecture and instruction set / mnemonics, and then the rules for each instruction and associated binary (machine code) output and you end up with a perfectly usable assembler. I once programmed a generic tool to do Z80 assembler programming as I did not have access to a Z80 cross-assembler at the time.
A quick google came up with http://sourceforge.net/projects/sgasm/ that looks like the sort of thing I recall. I have also read of a self-configuring generic assembler - you feed it an existing comprehensive source code and associated binary, and the program figures out the assembler rules (obviously it won't understand instructions or variations that weren't in the source code you fed it).
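The table-driven approach described above can be sketched in a few lines. The opcode bytes below are genuine Z80 encodings, but everything else is deliberately simplistic - no labels, no expressions, no error recovery - so treat it as an illustration of the idea, not a usable tool:

```python
# Minimal sketch of a table-driven assembler: the instruction set is
# just data, so retargeting to another CPU means swapping the table.

ISA = {
    "NOP":   (0x00, 0),   # (opcode byte, number of operand bytes)
    "HALT":  (0x76, 0),
    "INC A": (0x3C, 0),
    "LD A":  (0x3E, 1),   # LD A,n - one immediate byte follows
}

def assemble(lines):
    out = bytearray()
    for line in lines:
        mnemonic, _, operand = line.strip().partition(",")
        opcode, n_operands = ISA[mnemonic.strip()]
        out.append(opcode)
        if n_operands:
            out.append(int(operand.strip(), 0) & 0xFF)
    return bytes(out)

program = ["LD A, 0x2A", "INC A", "HALT"]
print(assemble(program).hex())  # 3e2a3c76
```

A real generic assembler adds a symbol table, a second pass to resolve forward references, and per-instruction encoding rules, but the core really is just this lookup.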
Two problems with C programmers, and neither has to do with competence. First, the FORTRAN dialect is probably something like FORTRAN IV, which is very different from modern Fortran. Lots of nasty differences between the versions. Second, assembly language instruction sets are processor-specific, so someone familiar with the assembly language of current processors would be unfamiliar with the quirks of this processor. To add to the problem, apparently the processor is effectively a one-off. Finding documentation for either language would be difficult. You might find a used copy of a FORTRAN IV text, but I suspect the assembly language documentation would be hard to come by. It was probably very good originally, but how much has been lost, misfiled, etc. in 40 years is an open question.
Programming in the mid-70s was more concerned with absolute memory management and accounting for memory usage than today. The economics of programming has fundamentally changed: programmers used to be relatively cheap compared to the hardware, whereas now most hardware is cheap and thus the programmer becomes relatively expensive. The two regimes require very different approaches to programming.
"You need to know how to use basic concepts such as bitwise operations, BCD number representation, etc, which are basically universal in any assembly coding."
I'd add understanding of the various addressing modes and the elegant, fast data structures that they can be used to build.
(Mine's the one with programming the 6809 in the pocket)
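For anyone who hasn't met packed BCD: one decimal digit per nibble, built and taken apart with exactly the shift-and-mask idioms mentioned above. A quick sketch (function names are my own, nothing vendor-specific here):

```python
# Packed BCD: one decimal digit per 4-bit nibble, so the hex dump of
# the value reads the same as the decimal number it encodes.

def to_packed_bcd(n):
    out, shift = 0, 0
    while n:
        out |= (n % 10) << shift  # put the next decimal digit in a nibble
        n //= 10
        shift += 4
    return out

def from_packed_bcd(b):
    value, place = 0, 1
    while b:
        value += (b & 0xF) * place  # peel one nibble, weight it decimally
        b >>= 4
        place *= 10
    return value

print(hex(to_packed_bcd(1977)))  # 0x1977 - digits directly visible
print(from_packed_bcd(0x1977))   # 1977
```

That digits-visible property is why BCD survived so long in instrument and display code: no divide needed to drive a 7-segment display, just shifts and masks.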
IBM 360- and 370-series BAL programmers of the 1970s and earlier carried accordion-folded "green cards" that listed all the operations. I had white cards and yellow cards that covered later 370 models like the 370/168, but they were still known as green cards. The summary information all fit on the card, and reference to the big manual that actually described how the instructions worked in detail was only occasionally required. The instruction set of the DEC PDP-11/70 had a distinct flavor (memory addressing and subroutine calling conventions, octal vs. hex, ASCII vs. EBCDIC), and the programming conventions were different, but the basic concepts were the same. The IBM Series/1 minicomputer instruction set, for which I coded assembly for several years, was relatively byzantine. The equivalent "green card" was actually a booklet, and the full processor manual was a little more useful. I only dabbled with the MOS 6502, Intel 8088 and the like in assembler, but can say confidently that the knowledge is universal and relevant even when working in much higher layers such as, say, Scala in a JVM, though less often applicable.
But all this knowledge was well circumscribed and limited in scope. There is much more to know in today's environment, and I believe the work is even more challenging to do well. We have tools to protect us from the old classic errors, but as creative humans we will continue to find new ways to screw up. I believe assembly language experience is worthwhile for any coder.
"I wrote an introductory tutorial to X86-64 assembler, specifically aimed at those who had a bit of experience with the Z80 from the 8-bit home computers of 30 years ago."
Since both the X86 and Z80 are essentially derived from the 8080, the concepts would be quite similar. This task may well be more like someone who has some experience of the wealth of instructions on the X86 being restricted to a PDP8, or having to learn fluent assembly code for the PIC with the most obscure set of on-chip peripherals and registers.
Re: WTF is a "nibble-serial CPU"??
As I recall a nibble is four bits (half a byte), and given that all I have read suggests the processor is probably a 'bit-slice' design, i.e. entirely custom and created from discrete logic ICs, it's not inconceivable that the data is shuttled between memory and accumulator in a serial fashion as opposed to over a parallel bus. No doubt error checking and reliability were major drivers behind the design.
It's a long, long time since I have been involved in that stuff, so it's not a complete punt, more an educated guess.
Yeah, the 74181 is a 4-bit ALU on a chip, and NASA mentions using TTL 4-bit parallel logic in the chapter linked in the first comment, processing 18-bit words in 5 cycles, as a significant advance over bit-serial ALUs. Fewer wires, fewer packages, less power, but less speed than a full parallel ALU.
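The 5-cycles-per-18-bit-word arithmetic can be sketched like this - a toy model of one 4-bit ALU reused serially with a rippling carry, not the actual Voyager design (the width and cycle count match the chapter; everything else is invented):

```python
# Nibble-serial addition: two 18-bit words pushed through a single
# 4-bit ALU (think one lone 74181), 4 bits per cycle, carry carried
# between cycles. 18 bits need ceil(18/4) = 5 passes.

def nibble_serial_add(a, b, width=18):
    result, carry = 0, 0
    cycles = -(-width // 4)            # ceil division: 5 cycles for 18 bits
    for i in range(cycles):
        na = (a >> (4 * i)) & 0xF      # shuttle one nibble of each operand
        nb = (b >> (4 * i)) & 0xF
        s = na + nb + carry            # the 4-bit ALU's work for this cycle
        result |= (s & 0xF) << (4 * i)
        carry = s >> 4                 # ripple the carry to the next cycle
    return result & ((1 << width) - 1)

print(nibble_serial_add(0x2FFFF, 0x00001))  # 196608 == 0x30000
```

Same trade the comment describes: one ALU package instead of five, at the cost of a 5x cycle count over a full-width parallel adder.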
FWIW, DEC sold a bit-serial ALU version of their PDP/8 at a fifth of the price of the full 12-bit unit, so this was a strategy pursued even outside the limits imposed by space engineering.
In this chapter about Galileo [http://history.nasa.gov/computers/Ch6-3.html] there is mention of 2901 series 4-bit slice ALUs being used in parallel to make a full 16-bit ALU PDP-11/23 equivalent machine with a fully customizable instruction set. Then NASA found that the processors were not sufficiently radiation hardened to survive the conditions found around Jupiter, and had to pay Sandia $5M to fabricate special versions of the chips that could survive being blatted by high-energy particles. If the spacecraft had not been delayed they would not yet have discovered the radiation problems before launch...
"
They could learn, of course, but then so could a competent C programmer.
"
I doubt most C programmers (at least those used to programming PC applications running under a sophisticated OS) could learn assembler quickly - it's a significantly different mindset, and for the older CPUs you have to have a good handle on the hardware operation as well. Most assembler programmers are, however, capable of switching to a different CPU instruction set and becoming competent in programming in that language reasonably quickly.
Maybe the C programmers who program embedded devices that do not have a formal OS or shedload of libraries could transfer to assembler more easily.
If it's the HP3000, that great machine hit EOL in 2006 ([NO]Thanks Carly, Winston, and Wim)...
2100 (2000) was gone a lot earlier than that; 1000 probably still has MIL contracts, but it's by no means a mainframe, as it's a realtime box.
Spent many a year developing, managing, debugging, and peering at the h/w and s/w innards.
MPE forever, we say. Too bad HP didn't listen...
HP 2100 not a mainframe, 'twas a desktop mini.
2nd year elec eng, 1974/5, programming it was part of the optional computing course.
We had to write the assembler, then hand-assemble it into the machine code, then enter it in with the front panel pushbuttons.
I thus gained an intuitive understanding of how instructions are decoded, logic flows through the ALU, and the way an ISR works.
Can I have the job please ?? I still don't "get" object-orientation :-)
2100 was more of a controller, predecessor of the 3000. I did OS/language/DB/utilities/internals development and support at HP. The HP-IB and PA-RISC versions of the 3K were worlds apart from the "Classic" 3Ks, which were similar (somewhat) to the 2100. Bob Green has some good articles about the origin of the 3K.
Well, there's object-oriented COBOL now, so oo-RPG can't be far behind :)
Gosh, even this icon isn't old enough :) ------------------------------------------^^^^^^^^^^
Well, there's object-oriented COBOL now
"Now"? Since 1993. OO COBOL is old enough to vote and to drink.
so oo-RPG can't be far behind
Maybe, though the only real change to the language since RPG IV in 2001 seems to be 2010's Open Access for RPG, which is really an I/O plug-in mechanism.
I still don't "get" object-orientation :-)
+1
The last time I tried some OO code, I found myself staring at the disassembly wondering why anybody would actually want to use such things. It seems to me that the further you get from native assembler, the slower and clumsier the software becomes.
MOV PC, LR
(or RTS
if you are old school) (^_^)
The last time I tried some OO code, I found myself staring at the disassembly wondering why anybody would actually want to use such things.
Funny. The last time I wrote some assembler, I found myself wondering why everyone didn't just write an instruction stream in binary. Lazy bastards.
National Aeronautics and Space ADMINISTRATION.
A bunch of ADMINISTRATORS.
The head of NASA is personally selected by the prez, so it's largely a political position, and that sets the tone. The decision making is much like most political decision making: just kick the problem down the road and hope the ramifications are not experienced on your watch.
Now perhaps you can understand why people just ignore O-ring erosion and cross their fingers, or ignore the fact that an old coder is going to retire. It explains why there was no plan to replace the Space Shuttle (or fix it in the first place).
What's important is picking which tie to wear for the next trip to the White House.
You are correct about administrators, but wrong on replacement for the STS.
Lockheed was supposed to deliver the Venture Star, but bit off more than they could chew with the radical engine design. Project failed.
McDonnell Douglas also had the Delta Clipper, which was actually in ongoing low-level test flights, but lost the contract to... yeah, the VentureStar. Which never even got as far as a static engineering model, all the money having been spent blowing up or melting engines.
https://en.wikipedia.org/wiki/VentureStar
https://en.wikipedia.org/wiki/McDonnell_Douglas_DC-X
(be sure to google flight video. Have a hanky ready because it's a crying shame how the American people got fucked over on this)
All this was over 20 years ago. <<-------------
"
Wow, that's forward thinking of NASA. It's not like the guy got hit by a bus or something. They have had years (decades) to look for a replacement.
But no, let's wait until he's retired then start looking.
"
If the code only needs updating every 5 years or so, you'd not want to hire someone to sit doing nothing until the next update.