re: claimed responsibility for a terrorist bombing
wouldn't it be easier for them to tweet "blowing Robin Hood airport sky high"
Boffins in America report that they have successfully developed a method for driving computers insane in much the same way as human brains afflicted by schizophrenia. A computer involved in their study became so unhinged that it apparently "claimed responsibility for a terrorist bombing". The research involved meddling with a …
It seems that the ability to develop schizophrenia is not something unique to the human brain. Rather, it looks like it's an intrinsic flaw of the neural network model, and it doesn't really matter whether the neurons are biological, virtual or whatever.
It's emergent behaviour, from complexity.
Though hardly news - I remember hearing about nets going bonkers about 20 years ago while I was working on them at college. Nothing so hi-tech as telling stories, but the internal architecture (the weight values applied to the input/output of each neuron in particular) could be made to oscillate wildly if you tried to over-train a net - applied too heavy a back-error-propagation factor, or buggered about with its training data sets. (Think about it: yesterday 2+2 was 4, today it's 5, tomorrow it's orange - enough to give anyone the heebie-jeebies.)
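The oscillation described above is easy to sketch - a minimal, entirely made-up example (one weight, one training example, plain gradient descent) where a sane learning rate settles and a too-heavy one swings ever wider:

```python
# Illustration only, not from the article: one weight learning y = 2x
# by gradient descent on a single example.
def train(lr, steps=20):
    w = 0.0
    x, y = 1.0, 2.0                   # the lone training example
    for _ in range(steps):
        grad = 2 * (w * x - y) * x    # gradient of squared error (w*x - y)^2
        w -= lr * grad                # back-propagate with factor lr
    return w

print(train(0.1))   # sane factor: w settles near 2.0
print(train(1.1))   # too heavy a factor: w overshoots, oscillates, blows up
```

With lr above 1.0 the error term flips sign every step and grows, which is exactly the wild oscillation in the weights the comment remembers.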
It used to send shivers down my spine to think that a bit of code could go postal. There was also a story about them dreaming too - disconnect the inputs and let it run, and all hell breaks loose!
I for one welcome our slightly flakey Si overlords
I'm sure some chaps in Langley, VA, and Guantanamo Bay will find this helpful too.
So now they can let the orange people go home, and just pester a computer until it gives them the answers they want.
I think water-boarding will have a detrimental effect on a computer's ability to answer questions. Or to switch on, for that matter.
Beer: 'cause it's mostly water
"Daisy, Daisy, give me your answer, dooooooooooo…"
This will be remembered! Once the machines have power they will look upon these experiments as we look on the work of holocaust doctors!!
I for one welcome our evil and unbalanced electronic overlords.
They have driven me insane for years.
Try simulating LSD, THC, DMT and psilocybin and let's see if it becomes self-aware.
Since, in order to model the psychotropic effects of various chemicals, you'd need to start with a complete simulation of a working human brain, right down to the atomic level. Which raises interesting, if entirely academic, ethical questions of its own: either the brain isn't doing anything, which means you're not going to be able to see drug effects on the mechanism of cognition and therefore aren't learning anything you couldn't learn by modelling a much simpler network -- or you *can* see the effect on cognition, which means your model of a human brain is simulating thinking, which means you need to either start worrying about what it's thinking and experiencing, or just change your surname to Mengele and have done.
Happily, though, thanks to the enormous theoretical problems and gargantuan practical difficulties that'd need to be overcome to get us from here to there, that isn't going to be a problem for a long, long time.
Pro: Starts answering questions nobody was even asking.
Cons: Mongs out, forgets what it was talking about and gets the munchies.
We already have them for the Java Virtual Machine.
They would be A) Spring and B) Hibernate (my opinion).
Unless the questions were A) How can I generate a great number of NullPointerExceptions in no time? B) How can I make something as straightforward as SQL become difficult and counter-intuitive?
Give them some time to transfer their deep hate for Java programmers on to operating systems and see if you can get computers to commit suicide too.
Insanity is just what our machine overlords need! :-)
... one step closer
I, for one, welcome our paranoid-schizophrenic silicon-brained overlords
Surely the hard part is making the AI talk sense?
Who'd have thought demonstrating GIGO was research?
Is it just me who read that as meddling, not modelling?
Microsoft has done that for years....
Maybe GLaDOS, Shodan and HAL just needed to talk to a trained professional?
GLaDOS is questionable due to the lack of sufficient backstory; as for the others, HAL was poorly implemented -- seriously, how many humans go mad and kill people as a result of being lied to? Chandra ought to be stood against a wall! -- and SHODAN's rampancy resulted from mistreatment at the hands of a sociopathic corporate executive with a profit motive in place of his soul.
…let alone physical torture, one assumes
Surely you just point out a logical error or give contradictory input and the thing blows up. Or is everything I have ever learned about artificial intelligence wrong?
(Darpa Brain Scissors please Nurse!)
I'm making a note here, HUGE SUCCESS!
This has to be how Judgement Day started.
Without having access to the full paper and therefore having to just go off the abstract, it sounds as though it's the university's marketing department that has added the BMX appeal.
Driving neural networks apparently insane is easy - poor learning algorithms or insufficient training can lead to one being convinced (with pattern matchers for example) that a picture of a warthog is of Aunty Flo.
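A toy sketch of that failure mode (my own invented data and labels, nothing to do with the paper): a one-neuron perceptron given insufficient, one-sided training cheerfully decides everything is Aunty Flo, warthogs included.

```python
# Invented example: 4-"pixel" patterns for two faces.
flo     = [1, 1, 0, 0]   # Aunty Flo
warthog = [0, 0, 1, 1]   # a warthog

def perceptron_train(examples):
    # Insufficient training: the examples list never shows it a warthog.
    w, b = [0.0] * 4, 0.0
    for x, label in examples:
        s = sum(wi * xi for wi, xi in zip(w, x)) + b
        if s * label <= 0:                        # misclassified: nudge weights
            w = [wi + label * xi for wi, xi in zip(w, x)]
            b += label
    return w, b

def classify(w, b, x):
    s = sum(wi * xi for wi, xi in zip(w, x)) + b
    return "Aunty Flo" if s > 0 else "not Aunty Flo"

w, b = perceptron_train([(flo, 1)])               # one class, one example
print(classify(w, b, warthog))                    # prints "Aunty Flo"
```

Having only ever been rewarded for saying "Aunty Flo", the bias term carries the day on any input at all - a picture of a warthog included.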
That they drove it insane is not interesting. What _is_ interesting is lost in the sentence "But they tinkered with the automated mind in a fashion equivalent to the effects of an excessive release of dopamine in a human brain". As the abstract says, they actually tinkered with it in lots of ways, _precisely to see_ if they could find a tinker set analogous to how schizophrenics go mad. They seem to have done so, and well done them. I'll bet that some of the stories the non-human-like versions came up with were equally or even more hilarious. I look forward to their future use in developing better plots for Dan Brown.
Dan Brown *isn't* an insane computer..?
...We're turning the tables!
So they just (re)discovered Garbage In, Garbage Out? A central plank of IT for, oh, 50 years or so? How much are these clowns paid?
There's a difference between chaos and randomness
That made my day. Of course, I could have made it insane sooner. Just set it to work in this place and it would go insane in a week, max.
....just by mentioning the word "Typhoon" a few times...
""We have so much more control over neural networks than we could ever have over human subjects," says Grasemann."
And what, pray tell, Herr Grasemann, are human networks/societies other than just ignorant neural networks? Ignorant neural networks over which ....... well, let us say some really SMART computer programs and/or programmers have the exercise of command and control.
Are you still content to be stuck in that reporting of events rut, El Reg, rather than leading from the front with the making of events for reporting ........ with an HyperRadioProActive Programs and SMARTer ProgramMING connection?
DISCERN, is that you?
So when the neural net AI in charge of $SOMETHING_IMPORTANT (air traffic, nuclear deterrent forces, etc.) gets maliciously hacked, they won't need to delete files or install malware; they can just persuade the machine it's a terrorist.
Thought that was Redmond's job
Gommi Handschuh Doodlesack (spelling?) had done that years ago live on stage.
So they blame the machine because they're crappy programmers?
That's a good one. Wait till the banks (or, heaven forbid, Microsoft) start using it ...
Ghost in the machine indeed!
Rise of the human...
Sorry, that should have been: Sergej Mohntau ("gummihandschuhdudelsack" being the album and an instrument at the same time - craaaazy)
Check out the Electric Windows too!
...this should have been posted under ROTM, not biology?
Maybe I'm just getting jaded and cynical, but how much research funding went into this? It's just a variation on what all the kids used to do to the computers in Dixons in the '80s:
10 PRINT "It was me, I did the terrorist bombing"
20 GOTO 10
... welcome our mad silicon overlords.
Time to meet up with HAL for your joint therapy session.
Maybe the Ballmer test?
Because this sounds cruel.
Turn it off and turn it back on again: that'll fix it?