Boffins foresee most software written by machines in 2040

Boffins at the Department of Energy's Oak Ridge National Laboratory speculate that by 2040 advances in AI disciplines like machine learning and natural language processing will shift most software code creation from people to machines. In a paper distributed via ArXiv, "Will humans even write code in 2040 and what would that …

Anonymous Coward

Websites designed by AI?

I look forward to our nightmarish web 4.0 future, that is, if current AI-dreamed images are anything to go by.

Or does "software" here not include anything user-facing?

9
0
Bronze badge
Unhappy

Re: Websites designed by AI?

I'm getting sick of the AI moniker being attached to everything, to be honest.

Your code editor offering auto-complete or recommending how to craft a particular line of code is not AI.

Just fuck off with your AI bollocks. Seriously.

1
0
Meh

We've been here before...

Yes, you may be able to get rid of code pigs - you may have something that does all the Java scutwork for your standard business reporting crap. Progress comes from encapsulating things - almost nobody needs to know asm any more, you don't need to draw your own UI windows, C# has data structures out the wazoo. But you're just moving the work higher, and then the work gets more complex. Maybe in the future database stuff will be so pedestrian it's seamlessly integrated.

But now you're going to need someone to specify exactly what you want - and people asking for things are notoriously, provably, bad at not knowing what they actually want. I remember the last time AI was going to get rid of programmers, and it ran right up onto the shore on this problem (and terrible performance, but we'll assume we have enough horsepower now).

Even if you assume the generic stuff is good enough for most cases, you're still not going to be able to get rid of the software/system engineers - engineers solve general problems given constraints, and if you solve /that/, you've solved problem solving - and 'no programmers' will be the least of the impacts on society. No deep learning network has demonstrated anything like general problem solving, or any penchant for it. If you could perfectly encode every bit of your problem and required software solution in an input and output vector one could understand, and you could do the same thing on all existing software to train it, maybe it would surprise you. But software is not tolerant of minor faults the way images are, and who are you going to get to do that?

Is the ratio of code pigs to engineers 4:1, giving you 80%? Maybe. I find Jeff Bigham's comments more believable. AI will let software engineers tackle bigger and better problems and not worry about the lower level stuff.

27
0
Silver badge

Re: We've been here before...

almost nobody needs to know asm any more

The "IoT" hypefest has clearly passed you by. Try building one of those billions of those battery powered sensor nodes that the (near) futurists are predicting without a solid understanding of ARM M0-M4 ASM. You can do a lot of it in higher level languages like C of course, but you'll still need to visit the basement from time to time. Java? .NET? or (I'm about to lose a rib here) functional languages with stack busting recursion all over the shop? Hahahahaha.

AI *might* be able to start writing general-purpose code reliably a few decades after it completely masters synthesizing SQL from natural language, something it currently isn't even remotely close to achieving, despite the query space being precisely defined and constrained by the database metadata, and despite decades of precursor work on QBE (Query by Example).

13
1

Re: We've been here before...

I still know and use x64 and ARM assembly (and a bunch of 8-bits, but sadly never get to use them) for things like patching binaries we don't have source for and the occasional really time-critical thing - like getting a cycle count cheaply. It's why I said 'almost nobody' and not 'nobody'. I also know from trying to hire people that that skillset is incredibly rare.
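For anyone wondering what "getting a cycle count cheaply" looks like, the x86-64 version is a couple of instructions of GCC/Clang-style inline asm; call it before and after the code you care about and subtract:

#include <stdint.h>

/* Read the x86-64 time-stamp counter: RDTSC puts the low half in EAX
   and the high half in EDX. */
static inline uint64_t cycles_now(void)
{
    uint32_t lo, hi;
    __asm__ __volatile__ ("rdtsc" : "=a"(lo), "=d"(hi));
    return ((uint64_t)hi << 32) | lo;
}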

6
0
Silver badge

Re: We've been here before...

people asking for things are notoriously, provably, bad at not knowing what they actually want.

The people I have worked with have been quite good at not knowing what they actually want.

13
0

Re: We've been here before...

Indeed, and even the problem is misunderstood. What temperature will my coffee be in 10 minutes? Well, it takes me fewer than 10 minutes to drink my coffee, so it'll be body temperature. When an AI asks that question we're fooked; until then it's mostly spoof and nonsense.

2
0
Silver badge
Headmaster

Re: We've been here before...

"fewer than 10 minutes"

Hooray, less vs. fewer! In this case, if you happen to be a prescriptive linguist, you should use less, because '10 minutes' isn't plural here. Ten minutes _is_ how long it took to drink the beverage, not _are_ how long it took.

http://www.quickanddirtytips.com/education/grammar/less-versus-fewer

Whatever, it's all a crock of shit anyway, made up by some bloke in 1770.

"As far as we have been able to discover, the received rule originated in 1770 as a comment on less:

This Word is most commonly used in speaking of a Number; where I should think Fewer would do better. No Fewer than a Hundred appears to me not only more elegant than No less than a Hundred, but strictly proper. --Baker 1770"

http://itre.cis.upenn.edu/myl/languagelog/archives/003775.html

4
2

Re: We've been here before...

A specific selling point of the Cortex M is that (because its interrupt handlers use C calling conventions) you can write bare metal firmware for it without using any assembler whatsoever, actually.
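That is the whole trick: the vector table is just an array of addresses and the handlers are ordinary C functions. A minimal sketch, with the section name and stack symbol borrowed from a typical GNU linker script rather than any particular vendor's kit:

#include <stdint.h>

void Reset_Handler(void);
void SysTick_Handler(void);

extern uint32_t _estack;   /* top of RAM, defined in the linker script */

/* Word 0 is the initial stack pointer, word 1 the reset handler,
   then the fault and peripheral IRQ handlers. */
__attribute__((section(".isr_vector"), used))
static void (* const vector_table[])(void) = {
    (void (*)(void))&_estack,
    Reset_Handler,
    /* ...faults, SysTick, peripheral IRQs... */
};

static volatile uint32_t ticks;

/* An ordinary C function: the core stacks and unstacks registers per the
   AAPCS itself, so no assembly shim or special keyword is needed. */
void SysTick_Handler(void) { ticks++; }

void Reset_Handler(void)
{
    /* copy .data, zero .bss, set up clocks... then: */
    for (;;) { (void)ticks; /* application main loop */ }
}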

2
0
Silver badge

Re: We've been here before...

Indeed, we have. Something I posted in 2012:

https://forums.theregister.co.uk/forum/1/2012/05/11/ubuntu_emulates_amazon/#c_1408177

0
0

Re: We've been here before...

@Symon

"Ten minutes _is_ how long it took to drink the beverage, not _are_ how long it took."

Ah, but that is a particular beverage of a full mug of coffee, filter style. An Americano might take fewer than 8 minutes. So the minutes are individual and can be counted differently as separate units of drink consumption time measurement. 10 minutes are taken to drink a cup of coffee... I am quite the passive.

An espresso takes less than a minute ;)

edit: on reflection, perhaps I should have originally said "it takes fewer than ten minutes for my coffee to be drunk by me"...

1
0
Silver badge

Re: We've been here before...

A specific selling point of the Cortex M is that (because its interrupt handlers use C calling conventions) you can write bare metal firmware for it without using any assembler whatsoever, actually.

You may not have to write much of it but you certainly need to read and understand it. Try diagnosing that without knowing assembly language.

1
1
Silver badge

Re: We've been here before...

"In this case, if you happen to be a prescriptive linguist, you should use less, because '10 minutes' isn't plural here."

A handy rule, if you care about this sort of thing at all, is to do a units conversion. Would you write "it took fewer than 1/6th of an hour"?

1
0
Silver badge

Re: We've been here before...

Amusingly, Nolveys' comment focuses on part of TFA in which the author was provably bad at expressing what (s)he wanted to say.

0
0
Bronze badge
Meh

In the year 2000

In the year 2000, Queen Elizabeth II will still be on the throne thanks to DNA therapy making her effectively immortal.

In the year 2000, economic crises will be a thing of the past.

In the year 2000, we will get around in flying cars.

In the year 2000, routine factory work will be done by trained apes.

In the year 2000, we will have melted the Arctic ice cap to shorten shipping routes.

In the year 2000, users will just have to say what they want in COBOL and the computer will write all the machine code.

In the year 2000, mankind shall witness the second coming of the Lamb for the End is nigh.

0
0
Silver badge

Re: We've been here before...

Or put simply, measurements are always taken as a singular since the unit (plural or not) is describing a single continuous thing: not the thing itself but an aspect of that thing, and that thing usually only has ONE of each aspect. You don't normally drive a kilometer 1,000 discrete meters at a time, nor do you hold a meter of ribbon in 100 separate 1cm pieces. Don't go by the unit; go by what the unit is describing.

2
0
Bronze badge

Re: We've been here before...

I also know from trying to hire people that that skillset is incredibly rare.

The direct consequence of piss-poor pay for 30 years. Assembly language programmers are seen as the scrap-metal workers of the engineering industry. Yes, there is a kind of respect, but not real respect, and definitely not the money they would get if seen as the precision machine operators in the development labs that they are.

Disclaimer: I have written assembler for MIPS and Sparc, as well as Intel, and a bunch of 8 bit stuff best forgotten - I have made far more from writing PHP and C++.

1
0
Bronze badge

Re: In the year 2000

You forgot to mention: Voice recognition will be a solved problem, and robots will have taken over the world.

And I wish to point out that your COBOL one has come true, and it probably explains why the banking system is no longer reliable.

0
0
Silver badge
Flame

Re: We've been here before...

Yes.

You need EXPERTS to figure out what is really wanted and then design it.

Remember Fortran, COBOL, The Last One, 4GLs.

This is nonsense. All current AI relies on a huge amount of human-curated data input before it's let loose. At best it will only be a new programming language, one where you are less sure that at run time it will do what you wanted.

0
0
Pint

The ultimate self modifying code

Nuff said!

I would think that this will just be more comprehensive, higher-level libraries that you configure and string together. Payroll library, GL library, anyone...

2
0
Silver badge

Re: The ultimate self modifying code

Self-modifying code == Not-self-modifying code with a modifiable data structure.

The Bad Developer is on the left side.
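In other words, the respectable right-hand side keeps the instructions fixed and mutates data instead. A trivial sketch:

#include <stdio.h>

/* Behaviour changes at run time, but only data is modified, never the code. */
static int add(int a, int b)      { return a + b; }
static int multiply(int a, int b) { return a * b; }

/* The modifiable data structure: a single function pointer. */
static int (*operation)(int, int) = add;

int main(void)
{
    printf("%d\n", operation(6, 7));   /* 13 */
    operation = multiply;              /* "rewrite the program" by changing data */
    printf("%d\n", operation(6, 7));   /* 42 */
    return 0;
}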

1
0
Silver badge
Coat

Re: The ultimate self modifying code

I wrote an HTA app once that changed its own HTML as it went to display different things. Does that count?

0
0

Re: The ultimate self modifying code

Nah, just think about it: back in the machine code days, all of the machine code was just data that the CPU 'processed' to give you results... <G>

I remember some of the old machines I worked on - I'm talking core memory and 7400 TTL logic (can't remember the name) - where you actually had to write a JMP instruction into a specific memory location as the return from a subroutine, because the thing had no JSR or stack.

Even with something like a PDP-11 under RT-11, if you loaded a relocatable module into memory you needed to go through it and modify the relative addresses to actual addresses. You could even do this several times as you moved the code around to free up memory. A bit like an early overlay.

By 2040 I would say that the machine that writes the code will be involved in writing its own code, hence the concept of the "ultimate self-modifying code".
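For the youngsters, the fix-up pass was conceptually something like this (the record layout here is invented for the sketch, not RT-11's actual format):

#include <stdint.h>
#include <stddef.h>

/* One entry per word in the module that holds a module-relative address. */
struct reloc { size_t offset; };

/* Each time the module lands at a new base address, walk the table and
   turn every relative address back into an absolute one. */
void apply_relocations(uint8_t *load_base, const struct reloc *table, size_t count)
{
    for (size_t i = 0; i < count; i++) {
        uint16_t *slot = (uint16_t *)(load_base + table[i].offset);   /* 16-bit words */
        *slot = (uint16_t)(*slot + (uintptr_t)load_base);             /* relative -> absolute */
    }
}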

0
0
Silver badge
WTF?

Clippy

Why would you have an AI write code? I shudder when I think of what garbage it would get as training data. Take whatever it's supposed to be doing and convert it into libraries and language features.

Software tends to have infinite requirements. Every time somebody swears they have an architecture to solve the problem of writing code, they instead create an architecture that solves one use case. The architecture is modified to handle more and more use cases until eventually the architecture is more complicated than writing code. This is the birth of a "legacy system" that people will curse for years.

Recent advances in computer languages have been in better abstracting data transformations. You can describe what you want done using formulas and it gets taken care of. When those don't fit needs, plain old brute-force still works. The next step could be making it easy to declare, at will, a locally scoped DSL for performing a specific task - there would be some good science in figuring out how that would look.

15
0
E 2
Trollface

Re: Clippy

Feed it Clipper code!

2
0
Silver badge

"AI doing coding" could mean anything

Writing code just seems like the opposite of the things an AI can do, given that a large part of writing code is understanding what the code is supposed to do - that means communicating with people - not something I've seen an AI do yet.

On the other hand, you could say that a high-level language is basically telling the computer what you want it to do, and then the AI compiler writes the "machine code".

1
0

Re: Clippy

Maybe that's where AI could help. A working system is in fact a kind of specification. When that system is 25 years of patches, the code itself can be unmaintainable goo, but running it demonstrates required behaviour (one hopes). Sometimes you look at 100,000 lines of code, and feel it in your bones that it could be rewritten in 20,000 but the task is just too daunting to undertake. And it's *menial*.

So there's a domain (legacy bloatware) that's fairly unambiguous, with mountains of menial detail. Sounds like a job for a computer! OK, crushing 100K lines of crufty C++ to 20K lines of well organized C++ might be a bit hopeful, but hell, even if it could just look at such a system and produce a specification, that would be brilliant. (Of course, even that's not necessary because we all diligently maintain our specs, right?)

1
0
Silver badge

Re: Clippy

"So there's a domain (legacy bloatware) that's fairly unambiguous, with mountains of menial detail."

What would the AI do with all the bugs which are in there but undiscovered because in the operational domain they're never triggered? It might add a few hundred K lines to deal with them.

1
0

Re: Clippy

"Sometimes you look at 100,000 lines of code, and feel it in your bones that it could be rewritten in 20,000"

I hear that a lot. But while writing the new code, more often than not, you will discover all the edge cases and special requirements the project also needs to handle[*]. And in many cases, you will end up with the same amount of code anyway. Except that it will be less reliable, because it is still missing about, uh, 25 years of bugfixing.

[*] Things like "didn't you know, every other year we have an ISO-somethingorother audit, and we need to run this very complicated reporting thingy that would take us weeks to do by hand". Or the always fun "OK, we upgraded all the systems to use your new API, except of course that Doohikey2000 thing back there in the corner. Changing that would cost too much and cause a downtime of weeks because of the re-certification required".

1
0
Bronze badge
Unhappy

Re: Clippy

Anybody can refactor 100,000 lines of code. Applications over 100,000,000 lines of code require a capable team and lots of time. Windows 2016 is a good deal bigger than that.

0
0
Silver badge

Wasn't COBOL....

Designed so that accountants could code, back in the 1950s. You wouldn't need programmers, and anyone could "write code".

This whole thing is a case of "been there, done that", and it will continue. We humans are the ones that think into the future and can "design" things. Very little (if any, as I can't think of anything) is designed without human input. I have strong doubts that this will change.

Nice try though.

p.s. COBOL is still here, writing paychecks and checking general ledger stuff.

16
0
Pint

Re: Wasn't COBOL....

Designed so that accountants could code, back in the 1950s. You wouldn't need programmers, and anyone could "write code".

Bloody accountants!

I'm maintaining a legacy system written by an accountant in C/C++, full of magic numbers, gotos, death by pointers and hard-coded everything. No documentation, and extremely short variable names - a 3-character name is a luxury.

But... it is keeping me in a job. Sometimes I just feel like Wally: old, bald and maintaining the old legacy system that can never be replaced, as no one knows what the hell it actually does.

0
0
Silver badge
Terminator

Stack Overflow

I kind of just have to know vaguely what's possible, and then I find the specifics on Stack Overflow.

Wow.... The devs on my site are all AIs :)

6
0
JLV
Silver badge
Trollface

Re: Stack Overflow

Good to see Nissan is ahead of its time:

https://mobile.twitter.com/Scott_Helme/status/727832672551219201/photo/1

which may explain why their security is very retro 90s-lets-trust-each-other

https://www.troyhunt.com/controlling-vehicle-features-of-nissan/

This article feels really 90s-y too: computer languages are improving on abstractions and expressivity, and libraries pack a lot of savvy. But, contrary to the claims here, at some point exact (not probabilistic/AI/big-data/generic-consumer) processing requirements that are not part of a well-known and generic domain will require someone to be highly specific about what they want done - and that's likely to remain looking strangely like computer code.

0
0
Silver badge

The 80's are calling

and want their ideas back.

Developed by D.J. "AI" Systems and called 'The Last One' because it was the only programming language humanity would ever need.

https://en.wikipedia.org/wiki/The_Last_One_(software)

14
0
Silver badge

Re: The 80's are calling

"and want their ideas back."

Indeed. Will this really be the Last One?

1
0

Who will program the software writer?

Without human input there can be no machine output.

2
0
SVV
Bronze badge

Re: Who will program the software writer?

Have written several code generators myself over the years, and used some others, so the answer is obviously: computer programmers. They are good for basic gruntwork such as database access code, initial user interface generation and very basic data validation (e.g. order must be between 0 and 1000 quid). After that, it's diminishing returns, as the complexity of what you need to define for the generator approaches the complexity of just writing the complete code in the first place.
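For illustration, the sort of gruntwork they do well: feed in a little table of field specs and have the boilerplate printed out (the spec format here is made up):

#include <stdio.h>
#include <stddef.h>

/* A made-up spec format: just enough to drive the generator. */
struct field_spec {
    const char *name;
    double      min;
    double      max;
};

/* Print a C validation function for one field. */
static void emit_validator(const struct field_spec *f)
{
    printf("int validate_%s(double value)\n", f->name);
    printf("{\n    return value >= %g && value <= %g;\n}\n\n", f->min, f->max);
}

int main(void)
{
    const struct field_spec specs[] = {
        { "order_total", 0.0, 1000.0 },   /* "order must be between 0 and 1000 quid" */
        { "quantity",    1.0,  500.0 },
    };
    for (size_t i = 0; i < sizeof specs / sizeof specs[0]; i++)
        emit_validator(&specs[i]);
    return 0;
}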

"But they point to recent Facebook research, saying it suggests machines may be able to negotiate with each other to communicate their requirements."

Not a chance until humans can manage this in an efficient, complete, consistent way too (Note: anybody using the letters UML in a reply here will be ignored). Again, you'll just come up against the complexity problem, where the requirements definition becomes more complex than the syntax definition of the target language.

1
0
TRT
Silver badge

Prove it...

Get it to the stage it can comment and document human written code first, then I might believe it to be possible.

6
0
Mushroom

Re: Prove it...

Get it to the stage it can comment and document human written code first, then I might believe it to be possible

Extremely simple...

// Start of crap human code that contains substandard business rules

Large block of human code

// End of crap human code that contains substandard business rules

/sarcasm - reg where the hell is the icon

0
0
Anonymous Coward

Hidden comments too?

<!-- Let the meat-bags try and figure that shit out! MQ7565422 -->

<!-- Good one MQ7565422. JH87675 -->

7
0
Silver badge

Silver Bullet Syndrome

Never forget that clients are meatbags that are never truly sure of, nor fully understand, their own requirements, which change constantly during development phases...

It's not AI that is required; you would need to be at least Deity Level to get coding done correctly without human intervention.

4
0
Silver badge

Commonsense isn't common

Even the best written spec today includes a lot of unstated commonsense assumptions.

The problem with commonsense assumptions is that what is common sense to one person is entirely different to another, particularly across cultures. I am dealing with these "commonsense assumptions" on a project I am currently on, and it is causing a lot of grief...

7
0
Silver badge

Re: Commonsense isn't common

For example, never assume everyone writes or types left-to-right (Hebrew and Arabic are both right-to-left, as are other Middle Eastern languages).

0
0
Silver badge

I smell BS

Even if it were possible, we would still need a way to explain what we want the system to do, which by definition would be a programming language.

OK machine, I want you to take a reading from that sensor, convert the returned data from its packed form into something usable, modify the value by applying the calibration data in your EEPROM that I explained earlier, now take a rolling average over 500ms to stabilise it and remove noise/measurement uncertainty, and use the result as the stable and accurate sensor output.

Methinks I could write the C/C++ in less time. Wonder how I'd apply optimisations like bit shifting and Boolean operations to any of the above?
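For what it's worth, that paragraph done by hand is only a dozen-odd lines of C; the packing and calibration step below are invented for the sketch, but the bit-shift tricks land exactly where you'd expect:

#include <stdint.h>

#define WINDOW 16   /* power of two so the average is a shift, not a divide */

static uint16_t samples[WINDOW];
static uint32_t running_sum;
static uint8_t  idx;

/* Feed in each packed reading as it arrives (every ~30 ms covers roughly
   500 ms with a 16-sample window); returns the smoothed, calibrated value.
   The bit layout and calibration step are assumptions for the sketch. */
uint16_t sensor_update(uint16_t raw, int16_t cal_offset)
{
    uint16_t value = (uint16_t)(((raw >> 2) & 0x0FFF) + cal_offset); /* unpack + calibrate */

    running_sum  -= samples[idx];           /* rolling average: drop the oldest... */
    samples[idx]  = value;
    running_sum  += value;                  /* ...add the newest */
    idx           = (uint8_t)((idx + 1) & (WINDOW - 1));

    return (uint16_t)(running_sum >> 4);    /* divide by 16 with a shift */
}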

9
0
Anonymous Coward

Re: Ok machine, I want you to...

Start with something simple, like user interface design.

OK machine, make that box green.

Darker.

Darker.

Darker.

Darker.

Too dark... make it greener.

Greener.

Greener.

Can you make it at the same time greenish and reddish?

(dedicated to all designers/developers that had to deal with micromanaging types and did not kill them)

12
0
Silver badge

Re: Ok machine, I want you to...

"dedicated to all designers/developers that had to deal with micromanaging types and did not kill them"

We had a volume-at-11 shouting match between two of the client's directors, in the middle of their general office, as to how a particular batching operation should be carried out. Privately we thought it should be operator-configurable and built that in. The configuration could be set to fit either of the directors' views - or anything in between, and maybe more. During commissioning we set up something that looked reasonable. AFAIK it was never subsequently altered.

1
0
Silver badge
Paris Hilton

Re: I smell BS

Wonder how I'd apply optimisations like bit shifting and Boolean operations to any of the above?

That's where you are going wrong, still trying to tell your 'computer code writing secretary' how it should be done rather than letting it get on with it.

"Computer! Write me a program to control my country's defence system". It's that easy. You are over thinking it.

1
1
Silver badge

Re: I smell BS

Because we realize computers can't predict what they don't know. Heck, WE can't handle a defense system properly without all the parameters. We can't expect man-made computers to be any better.

0
0
Silver badge

Re: I smell BS

Replying to my own post: the main reason we need human programmers is that we never get the complete specs for a job at the start. Something always gets left out that then needs to be addressed in a hurry. A computer needs to be able to handle the job even when the specs change (sometimes drastically). It also needs to be able to handle vague specs and know whether to just assume something or to ask for more specifics, which may not be forthcoming.

0
0
