Oh, hi there, SKYNET: US military wants self-enhancing software that will outlive its creators

The US military's nerve-center of secret-squirrel boffinry DARPA wants to write software capable of running for a full century without becoming obsolete. Dubbed "Building Resource Adaptive Software Systems" (BRASS), the project [PDF] will look into the creation of a new software stack that can automatically make use of …

  1. Chris Miller

    I suspect some (many?) banks are today running COBOL programs that are over 50 years old, and I wouldn't bet against more than a few of them still being around in 50 years' time.

    1. This post has been deleted by its author

      1. HildyJ Silver badge

        COBOL/FORTRAN programmers haunting the fora? I am. As Monty Python said, I'm not dead yet. As for big applications, forget reactors: with US tax day coming up, your returns will be processed by the IRS using COBOL programs.

        1. Anonymous Coward
          Anonymous Coward

          "...will be processed by the IRS using COBOL programs."

          C, if it's filed electronically or scanned in, it will be C. Most of the COBOL that did the heavy lifting set sail in the early '80s. However, if it requires human editing or a human interface for audit, then all bets are off (but COBOL is still heavily used, along with overly expensive mechanical custom keyboards). It wasn't until recently, very recently, that you started to see actual windowed-form GUIs for internal service systems.

          1. HildyJ Silver badge

            Only the front ends have changed. When it gets to the Master File (officially MF) it's COBOL on a mainframe for both the Individual MF and the Business MF. I was a consultant on the second attempt to rewrite the MF, which failed (the first and third also failed - by the time you finished the last bit of system documentation, Congress had seen to it that the first bit was obsolete).

      2. Anonymous Coward
        Anonymous Coward

        "I wonder how many COBOL and FORTRAN coders inhabit El Reg forums?"

        Don't forget the other member of that applications triad - ALGOL.

        1. Anonymous Coward
          Anonymous Coward

          Fuck your shitty melancholy mood!

          Out of what nether domain of Dino the Punchy Punchcard Nibbler's wonder world does the COBOL discussion emerge, now? That COBOL was used widely is not due to the fact that it is a sensible or flexible way of writing code, but due to the fact that it was considered a safe "standard", low-wattage personnel could be taught to develop in it in a jiffy, and it could, sorta-kinda, get the job done on deca-kilobyte machines - the "job" being in this case electronic processing of tabular data of the kind done perfectly well by Dehomag/Hollerith/IBM mechanical/electromechanical machines a bit earlier (inb4 "muh holocaust"). That it is still widely used is down to good old-fashioned inertia and the absence of any sensible economic reason to change it (source code getting lost doesn't count).

          Similarly, I hear some demented individuals are still using "C/C++" for anything more than bare-knuckles macho programming contests in 2015. Now THAT is most unreasonable.

      3. Mullerrad

        I was taught COBOL and FORTRAN at school... also BASIC and machine code.

        No A-level computing then, just maths and physics teachers adding value... this was 1976 :)

        1. Anonymous Coward
          Anonymous Coward

          Since 1976 the whole thing has been commercialized. College is a business now, and like a business, they have affiliates. You're lucky to have experienced college as an educational institution, or rather how it's advertised. Today, it appears that if you don't go Ivy League, you'll be stuck in some abstraction nightmare learning abstract methods that will never even run on a microwave, let alone anything that would leave orbit.

    2. HildyJ Silver badge

      COBOL is a programming language, not an operating system, so it doesn't qualify. But what it typically runs on, the IBM mainframe operating systems, might be a good place to start.

  2. BongoJoe

    What happens when the military have new hardware to lob at people? New communications techniques and methods? Nothing is going to remain static for 50 years unless they write code which handles today's equivalent of teletypes and keep it supported while newer stuff comes along.

    This sounds like a committee from hell being formed.

    1. This post has been deleted by its author

    2. Charles Manning

      Committee from hell?

      Nope. It sounds like the next greatest trough for the research snouts at universities etc.

      The more outlandish your claims and "vision" the more you'll get funded.

  3. Marketing Hack Silver badge
    Happy

    In other news...

    DARPA puts worldwide software industry out of business!

    1. amanfromMars 1 Silver badge

      Re: In other news...

      Hi, Marketing Hack,

      The DARPA BRASS Program, which is admirably speculative, is much more a project which contracts to provide an absolute leadership for stagnating business and corrupted money systems and ponzi stock markets with worldwide software and firmware than anything else, methinks.

      And there is a parallel track from the more civil sector exploring the notion and possibility, and therefore it's most likely future reality/virtual reality and calling for papers, here ...... https://www.h-net.org/announce/show.cgi?ID=221414

  4. Matt Piechota

    So: Java (shudder)

    I'm not going to hold Java up as a paragon of reliability, but realistically something like it with a stable set of classes that will "never" be phased out gets you close. You could selectively add stuff to the base as time goes on to support new features, as long as it doesn't interfere with the older classes.

    1. T. F. M. Reader

      Re: So: Java (shudder)

      with a stable set of classes that will "never" be phased out...

      ...and semantics changing between one version and another, by specification, introducing incompatibly different behaviour of multithreaded applications (just an example, mind you) when using different versions of JVM, thus preventing upgrades?

      I shudder indeed recalling the horrors of having several different versions of JVM because applications written by the same group in a big multinational corporation (that sold JVMs among lots of other things) each required a different version of Java and could not run on the others. And whole research teams inside the said multinational working on software that would detect the incompatibilities and tweak the code (and/or bytecode) to automate the transitions. [Come to think of it, maybe they should submit proposals to DARPA...]

      Pleeease...

    2. asdf

      Re: So: Java (shudder)

      >Java ... with a stable set of classes that will "never" be phased out

      Java, backward compatibility, hahaha good one. Almost as funny as write once run everywhere. I do take your general point but find Java to be a miserable implementation for what they are trying to achieve.

    3. Christian Berger

      Re: So: Java (shudder)

      No. Oracle has made that clear by saying that Java will not be the "new COBOL", which means that Java will not stabilize and will always remain an ever-changing language with more and more features bolted on.

  5. martinusher Silver badge

    Plenty of old code out there

    Software doesn't wear out, so if the environment is modular there's no reason that 50-year-old or even 100-year-old code shouldn't work fine. The problem today is that the software design methodologies we use militate against reliable code - a typical modern application is essentially non-deterministic, a dog's breakfast of code and libraries that probably runs on the system it was developed on, but if that system is changed even slightly then nobody can guarantee that it will continue to function.

    I mostly write embedded code, code for 'things'. Once the code is written and running it just runs until something goes wrong with the underlying hardware. If the hardware lasts for 100 years then the code will also be running in 100 years. It's as simple as that.

    1. DropBear Silver badge

      Re: Plenty of old code out there

      I'm not disagreeing (heck, I do the same thing with similar results), but I'd like to point out that these 'things' are meant to be static and not change once they're installed - DARPA seems to want something that can keep up with the obligatory changes over time that right now force an upgrade every now and then. To be honest, never having to uproot every custom setting I ever made just to raze everything and start all over again would be nothing short of heaven for me too - I'm already clinging to scary old versions of things way past what's reasonable, but something invariably comes along that forces me to move on or lose functionality that used to work before.

    2. Christian Berger

      Re: Plenty of old code out there

      Actually if you look at systems like Maxima which is based on a 1982 version of Macsyma which is from 1968, we are getting close to 50 year old code still being in widespread active use.

  6. GBE

    feather an egg?

    "feathered a fair number of consultants' nest eggs"

    I like the clever mixed metaphor.

    1. This post has been deleted by its author

    2. Phil.T.Tipp
      Thumb Up

      Re: feather an egg?

      +1 Eggzackly. Beat me to it. All yolking aside, these dim DARPA types need to unscramble their metaphors.

  7. Hardrada

    Where have I heard this before?

    Here: http://www.bricklin.com/200yearsoftware.htm

  8. Anonymous Coward
    Anonymous Coward

    IBM OS/400 ?

    Maybe DARPA should speak to IBM. Their OS/400 operating system is already a quarter of the way there (~25 years old). From Wikipedia: "One feature that has contributed to the longevity of the IBM System i platform is its high-level instruction set (called TIMI for "Technology Independent Machine Interface" by IBM), which allows application programs to take advantage of advances in hardware and software without recompilation. TIMI is a virtual instruction set independent of the underlying machine instruction set of the CPU. User-mode programs contain both TIMI instructions and the machine instructions of the CPU, thus ensuring hardware independence." and "An application saved from the older 48-bit platform can simply be restored onto the new 64-bit platform where the operating system discards the old machine instructions and re-translates the TIMI instructions into 64-bit instructions for the new processor.". Unfortunately, IBM has already got form with the military-industrial complex (see Wikipedia: IBM and the Holocaust).

    1. Anonymous Coward
      Anonymous Coward

      Re: IBM OS/400 ?

      The ICL VME O/S of the 1970s was designed against a "target" machine. The idea was that any future technology changes would be below that target interface.

      One of its major "hardware" features was Data Descriptors. IIRC they limited the access to a declared variable by software. Presumably if Intel had used their billions of gates on a chip to implement that feature - then buffer overflow exploits would not be possible.

    2. Anonymous Coward
      Anonymous Coward

      Re: IBM OS/400 ?

      ""An application saved from the older 48-bit platform can simply be restored onto the new 64-bit platform where the operating system discards the old machine instructions and re-translates the TIMI instructions into 64-bit instructions for the new processor.""

      It would be interesting to know if they maintain exactly the same results for floating point operations with differing legacy precision.

      It was not unusual in the 1960s to get different results when moving a scientific program to different hardware. For applications using convergence the effects could be significant.

      The number of bits for mantissa and exponent could be different - or the microcode FP algorithm was different even between compatible op-code machines. It could even matter what value the FP algorithm returned when the computation could not be completed.

      International standards possibly took care of the FP algorithms eventually. However - any precision conversion might not be transparent.

    3. amanfromMars 1 Silver badge

      Re: IBM OS/400 Branding on Alien Interference?

      Technology Independent Machine Interface = Plain Text Humanoid Readable whenever DIMM Chipped Mankind is a collection of Virtual Machines programmed by AI for Leading Boffins.

      And all fools waste time and effort in space trying to decide on which comes first in that revolutionary chicken and egg.

      Oh, and something for DARPA to be getting their heads around for Future Autonomous Command and Absolute Global Control via BRASS and AIdDVenturing in Alternate Channels and Dark Web Networks ...... http://thedailybell.com/news-analysis/36217/Netanyahu-Is-Either-Lying-or-Incompetent/#comment-1952692427 ..... but the field is a market place of intense lucrative competitive interest, and even localised terror concern, given what can be so easily done with the DIMM Ignorant Media Machine.

    4. asdf

      Re: IBM OS/400 ?

      >Maybe DARPA should speak to IBM.

      DoD and IBM were drinking buddies long before you were born. Vendor lock-in is a particular danger to a project like this, I would assume.

  9. Christian Berger

    This could either become a disaster...

    ...by building frameworks that try to abstract the program logic away from its implementation (or such nonsense)...

    ...or they end up just recreating UNIX, which uses a few simple principles to make sure your software will play nice with just about anything from COBOL to J2k. In fact you can even re-implement parts of your software easily without breaking the rest.

    1. Anonymous Coward
      Anonymous Coward

      Re: This could either become a disaster...

      Whatever happened to the bright future of Formal Definition Techniques producing perfect hardware and software? IIRC the RSRE Viper was eventually shown to be less than perfect.

  10. DeathSquid
    Go

    LLVM?

    Software compiled into LLVM intermediate representation can be easily targeted to new hardware. LLVM has front ends for a wide variety of languages, including old school (Fortran, Ada, C), new school (Java, Rust, Ruby) and the plain weird (Haskell).

  11. Kevin McMurtrie Silver badge
    Terminator

    LLVM is a start

    LLVM is a start but cruft will still pile up in the LLVM environment. A system that can refactor and test itself would be larger than itself. The cheat to that paradox is massive social interaction where systems get feedback from other systems on whether or not they're doing better or worse. Being a cheat, you just have to hope that the feedback sends evolution in the right direction and us meat sacks aren't eradicated in the process.

  12. T. F. M. Reader

    Portable code

    Many, many years ago I was working in an environment (it would be called "cloud" today, but the term had not been invented then) with a large variety of target platforms (CPU/OS/libraries/etc.). A standing meta-requirement was that one's code (it was a period when new college graduates - not me - no longer graduated with Fortran skills, so C was used more and more instead) had to run on any system, including those that the company didn't have yet, and those that had not yet been invented.

    Turned out quite possible, with standards (POSIX, SUS, IEEE754, etc.) and paying attention to every compiler warning imaginable, etc. I grew into the habit of writing software that way, and the results proved reproducible through the years and in different settings. Not only did the software work correctly on just about anything that got hot, when new CPUs came out and compilers caught up, the same code would run more efficiently after recompilation, possibly with a new set of options. Definitely one of the most important lessons of my career.

    [Disclaimer: Microsoft Windows was always out of scope.]

    I don't mean to criticise (really!), but I suspect that the presence of a compiler (and thus static analysis and machine code independent of the programming environment and language) is significant in this context. When an evolving language (and that's a good trait) includes its own runtime environment things get more complicated, possibly because maintaining compatibility is difficult when you deal with more complex, high-level concepts. I reacted to someone's touting of Java's "stability" separately. Python2->Python3 does not seem to be bliss so far, either, does it? And I am not sure the conceptual difference is as big as between different Fortran versions (or C++98/11/14 for that matter).

  13. Medixstiff

    I hope they hook it up to some drones too.

    That way once someone tries to inject malware or penis pill SPAM emails, the system can target the source and take it out, saving billions in lost productivity for the rest of us per year.

  14. BongoJoe

    ADA

    Wasn't this the purpose of that strange Algol-derived monstrosity?

  15. This post has been deleted by its author

  16. Anonymous Coward
    Anonymous Coward

    wait

    >DARPA mulls stable code that will run for decades.

    Doesn't the military complex already have this with Ada? Are they looking to justify why they should continue to use it forever?

  17. Swarthy
    Terminator

    Hmm...

    Maybe my coffee hasn't kicked in yet, but are they looking for code that will never be updated (other than by itself), or are they looking for software that will update seamlessly? For the latter, one could use principles similar to POSIX with a micro-kernel that allows patching on the fly, combined with a standard config file format, having the standard state the failure mode for absent config settings.

    Or possibly a poly-kernel wherein the kernel operations are dispersed among multiple semi-kernels so that each handles its own tasks and is independently patchable. Specify that the interoperability is to be based on the robustness principle.

    For the former (self-updating code): https://xkcd.com/534/

  18. This post has been deleted by its author

  19. Paul Hovnanian Silver badge

    Outlive its creators?

    Well, there's C, COBOL and FORTRAN.

    On the other hand, you could use .NET. And get developers to pursue its creators with sharp garden implements.

    1. Swarthy
      Pirate

      Re: Outlive its creators?

      Oooh. I'd sign up to be part of the chasing party.

  20. Vic

    Emacs, obviously.

    It grows into anything you want it to be (and quite a lot you don't).

    Where do I apply?

    Vic.

Biting the hand that feeds IT © 1998–2020