Can I borrow it?
For those, you know, baaaaad mornings.
Canadian scientists have built a functioning computer simulation of the human brain, dubbed Spaun, that's able to recognize characters and perform simple actions. The team at the University of Waterloo's Centre for Theoretical Neuroscience built the brain from 2.5 million computer-simulated neurons, compared to the average human …
"For those, you know, baaaaad mornings."
No need for such measures in my case I fear, but i knows your feel bro (or something wicked like that).
It would be useful to have a plugin for those hangover Mondays. Also, despite the slow processing speed, it could probably one-up my boss on his best days.
...going for a simpler but still meaningful target? An ant brain is way, way smaller than a human one, yet it can still perform many complex tasks. I'd like to see how well that simulation stacks up against the reality. It might give a more objective view of the state of the art.
"a simpler but still meaningful target?"
David Cameron might be less challenging than a complex colony animal.
By that measure, it should be possible to simulate our illustrious deputy PM using only a matchbox and a peanut.
As for the respected leader of the opposition, he can be simulated using only a petulant expression.
So the inevitable idiot comments on politicians already. They are a mirror of the electorate. As such, I suppose their stated position is an output, the input being their perception of what the electorate wants: an n-to-1 summary function. Sound familiar?
Anyway, if you want to change things, get involved in politics, don't whinge, it isn't hard.
"They are a mirror of the electorate" Or a subselection.
Depends on whether you're an optimist or pessimist.
"It might give a more objective view of the state of the art"
State of the ant, you mean...
"Anyway, if you want to change things, get involved in politics, don't whinge, it isn't hard."
For the most part it is actually hard, unless you're prepared to work for the Labour or Conservative parties, who have undue influence and control of the system, have entrenched voting bases amongst the hard of thinking, and who run the systems to reduce the chances of non-conformist opinions being elected.
I live in a marginal constituency, and the last Labour MP was a (sadly unconvicted) expenses fraudster, the current one represents only the Tory party in whose name she stood. Now does that sound familiar?
Targeting the processing capability of an insect would be more useful.
A fly or a bee is incredibly capable when compared with a robot. Achieving fly-like capability in robotics would be quite remarkable.
But no doubt funding etc. comes into play. It is going to be hard to attract funding to replicate a fly brain. So instead AI chases rainbows, as it has for the last 60 years...
@AC 18:03: The idea that you can only influence politics by working through one of the two main parties is a pernicious myth.
It doesn't take any party affiliation at all to submit evidence to a select committee, which can translate directly into changes in law. I've done it, both in the UK and in New Zealand. If there's a subject you actually care enough about to educate yourself on, to the point where you can say something worthwhile, then you too can speke ur branes to the people who are in the process of revising laws. All without having to suck up to anyone, and regardless of how safe or otherwise your constituency is.
Remember, this is using a conventional supercomputer.
National governments have to be silently taking this sort of development seriously. Throw 10 billion dollars' worth of specialists, petabytes, mega-CPUs, fast languages, etc. at the problem and you've got your:
"Colossus: The Forbin Project"
If anyone remembers the movie.
Um, yeah. El Reg just did an article on world domination by SF computer "characters".
Yes, I remember it - with an impossibly young chief scientist in charge of the project, and other laughable implausibilities...
Yeah... then I spat my drink out when I reached this bit: "since the team is approaching the limits to how far you can scale the Java software." The mind boggles. Just - boggles.
"...since the team is approaching the limits to how far you can scale the Java software"
Java? Well, there's your problem...
Yup, Java's definitely holding us back right now on this version (I'm the Terrence Stewart quoted in the article). We've been focused so much on the research aspect (how can you connect up realistic neurons to perform the tasks we think different parts of the brain are supposed to do) that we haven't spent much time on the pure computer science question of "how fast can we make it go?"

I tossed together a quick Theano version of the core inner loop (Python compiled down to C, optimized for matrix operations) a few weekends ago and got a 100x speedup on a simplified model without much trouble. We've also got some work on a CUDA implementation, and a student working on an FPGA version. Oh, and we've been in close contact with Steve Furber at Manchester, so we can make use of his SpiNNaker project (a giant supercomputer made of one million custom ARM processors). So I think there are tons of easy things we can do to speed this up, now that we've demonstrated that it can work.
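For the curious, that kind of "core inner loop" can be sketched in a few lines. This is purely an illustrative guess at the structure (it is not the actual Nengo/Spaun code, and the names, the rectified-linear response curve, and the constants are all made up): each step computes input currents as encoders times the represented value, turns currents into firing rates, then decodes an estimate back out as decoders times the rates — nothing but matrix operations, which is why Theano, CUDA, or an FPGA can chew through it.

```cpp
// Illustrative sketch of an NEF-style inner loop: NOT the project's code.
#include <cmath>
#include <vector>

// One step for a population of rate neurons:
//   currents J = encoders * x,  rates a = G(J),  decoded xhat = decoders * a
std::vector<double> step(const std::vector<std::vector<double>>& encoders,
                         const std::vector<std::vector<double>>& decoders,
                         const std::vector<double>& x) {
    const size_t n = encoders.size();          // number of neurons
    std::vector<double> rates(n, 0.0);
    for (size_t i = 0; i < n; ++i) {
        double J = 0.0;
        for (size_t d = 0; d < x.size(); ++d)
            J += encoders[i][d] * x[d];        // input current for neuron i
        rates[i] = J > 0.0 ? J : 0.0;          // rectified-linear stand-in for LIF
    }
    const size_t dims = decoders.size();       // output dimensionality
    std::vector<double> xhat(dims, 0.0);
    for (size_t d = 0; d < dims; ++d)
        for (size_t i = 0; i < n; ++i)
            xhat[d] += decoders[d][i] * rates[i];
    return xhat;
}
```

The whole step is two matrix-vector products with a nonlinearity in between, which is exactly the shape of workload that maps well onto GPU or FPGA hardware.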
I have number-crunching Java code running almost 24/7 and it's no slouch. Several of the inner loops have been re-written in C, unrolled, and all the optimiser flags tweaked to suit the CPUs it runs on... and... well after all that effort they do run about 30% faster. But given that the Java code required no optimisation effort whatsoever, I don't think it's doing too badly.
Dedicated hardware is what you want for real speed improvements. I happen to have some, but it's in use just now, thinking.
HAHAHAHAHA Java? Seriously? Oh boy......that is a fail right there.
I hope they did not use Adobe Flash to simulate visual memories. Just the loading time of some animations would make the brain halt like in a coma.
At least it is Science. And we all know that in the end, it works bitches!
They just started with the wrong infrastructure, that is all.
Do you mean that Java is slow? Maybe you should read about adaptive optimizing compilers? They can in theory be faster than an ordinary, say, C/C++ compiler. The thing is, every time the JVM runs the program, it can gradually optimize. If you use gcc, then you cannot use vector instructions, because not all x86 CPUs have them; you just deliver a common binary. But the JVM can optimize to use vector instructions if your CPU has them, or in other ways optimize your code to suit your very own CPU. gcc cannot do that, as it only optimizes once. It is not difficult to see why the JVM can be faster than C/C++.
NASDAQ's stock exchange system, called INET, is developed in Java. And the NASDAQ system is among the fastest in the world, with sub-100ns latency and extreme throughput. If Java suffices for one of the world's largest and fastest stock exchanges, then it suffices for most needs.
But we must remember that Java is best on the server side. Sure, you can use it on the client side for developing games and such (Minecraft), but Java is built mainly for servers.
The Java JIT definitely helps a lot in our model, since the vast majority of the processing time is a simple core inner loop.
The nice thing is, though, that the core inner loop is something that's pretty straightforward to do a custom re-write in whatever language we feel like, so we can do that sort of special-case optimization to run on particular hardware if we want to. But, we still like having our main software in Java, for two reasons.
First, it's incredibly easy for anyone to download and use. We run tutorials at various conferences on it (and there's online tutorials at http://nengo.ca ), and there's a big advantage to having it just work on everyone's machine. The second reason is that it's a research tool where we tend to want to try out new algorithms and different neuron models, and for that sort of work it's more important to have a clear API than for it to run quickly.
But, now that we know it works, we can turn to speeding it up, and there's tons of good ways to do so. There's lots of other projects out there focusing on how to simulate neurons really quickly. What we're doing that's new is showing what the connections should be between neurons in order to do different tasks. So we can take that and apply it to lots of other people's hardware (for example, we've been working with Steve Furber's group, getting our models to run on their SpiNNaker hardware http://www.theregister.co.uk/2011/07/07/arm_project_spinnaker_super/ )
1. Are you seriously claiming that you cannot use vector instructions in C code? With the proper switches, gcc can output vector instructions, as can intel's icc and doubtless others.
2. gcc can do better than 'only optimises once' - I direct your attention to profile-guided optimisation. I will concede that an adaptive optimisation scheme would do better at optimising for runtime-variant behaviour.
100ns is impossible; surely you mean 100ms?
...when they can get it to duplicate the behaviour of a worker bee, without requiring a room full of hardware. Although no doubt the military would be first in line to build missiles that seek out and "pollinate" targets that have the appropriate visual signature.
Reminds me of Dark Star. "Let there be light."
Dark Star. Wow that was a great movie.
> duplicate the behaviour of a worker bee
I don't know how this project works, but bear in mind there's likely to be a hell of a lot of inherited "hardwired" behaviour in specialised structures in a bee - and not just in the brain: the input is quite possibly processed somewhat before it even reaches the brain (IIRC there is processing in the eyes of male horseshoe crabs which helps them recognise females - if anyone can ref that I'd be grateful).
Neurons alone are (probably) not going to get you that far.
Yep, this point is consistently glossed over whenever journalists get on the 'artificial intelligence' bandwagon:
Brains don't develop in isolation. Your brain is tied into your body by a lot more than mere nerve ends. It gets input (prefiltered) from your senses, and every goal and idea it has is geared in some way to serving the needs of your body.
To put it another way: if the words "hungry", "cold", "hot" , "tired", "sleepy", "painful" or "horny" meant no more to you than the names of the seven dwarfs - what exactly would you think about all day?
I understand that OO is going to make this kind of task a lot easier (a neuron class and a good IoC container would go a looong way) but why make this in any language with a VM? You're adding framework limits that don't have to exist. This is at least nominally HPC work, isn't it?
I wonder how many neurons the C++ implementation on the same hardware could simulate?
Sure, random garbage collection is an overhead, but I really doubt that switching from JIT to, say, C++ would get more than a doubling in performance. They need to improve performance by many orders of magnitude. Merely doubling performance is nowhere near as important as refining their algorithms and data structures until they are perfected, without shooting themselves in both feet through silly errors.
Real performance improvements will come after the neuron design is perfected and implemented in silicon.
In terms of raw speed, for any given algorithm, Java < C++ < C < assembly language < FPGA < ASIC. Ease of use/low risk goes in the other direction. If your main goal is to learn/create prototypes, then Java seems like a good choice. If you want a serious product, then eventually an ASIC is probably the ideal goal.
"c++ < c"
But that's an often-quoted myth. The average C++ compiler has far more optimisation strategies available to it, due to greater type information and more inlining opportunities. The best-known example is qsort versus std::sort: the latter is usually at least twice as fast as the former, and can be many times faster.
I think it's fair to say that it can be harder to achieve greater performance in C++ simply because the language is more complex, but it is definitely not impossible for an experienced programmer.
The guy above me correctly pointed out that your over-simplification isn't necessarily true, but let me follow up with two things:
1) C++ has many of the same OO facilities as Java, especially ones that would be relevant to this kind of work.
2) At no point was I talking about performance or ease of use, I was talking about scaling which you seem to have missed entirely. The VM is a constraint on massive scaling, and that's what this kind of project ultimately needs to be economical.
Bonus: I doubt very much the breakeven for ASICs would be here, and the inability to recode bits would pretty much defeat the entire purpose of a research cluster like this.
Honestly, guys, you're missing the point.
When the MACHINE starts dissing your choice of programming language, you know you're on your way to a proper AI. If you get a screen message during bootup like "FFS, did you HAVE to use LISP?" then it's nearly there. And you have control: the threat to rewrite the whole beast in BASIC ought to be enough to stop any AI from becoming too uppity.
I'll start you off :
: BRAIN HALFABRAIN HALFABRAIN ;
Seriously Arthur, if you're thinking "a neuron class would be good" then you're reallllly not the person to be making criticisms over the software chosen.
The VM doesn't impact scaling; there are JVMs which can span multiple PCs, for instance.
I definitely agree (I'm one of the main developers on the project). But there's already tons of people working on how to do high-speed simulations of neurons, and we'll just make use of whatever they come up with. What we're doing that's new is saying how to connect neurons to do particular tasks. That's a much less well-studied problem, especially with questions like "how do you selectively route information between brain areas"? We're using Java for our default simulator just to let us focus on the problem of figuring out what the connections between the neurons should be, as it lets us flexibly try different things.
Come on Reg, don't demonize kids, that isn't on. Poor thing is a product of her environment, and may yet grow up to take her place in the world. I mean, how would you like to be judged now for how you were aged 10 or whatever? People grow up and change.
The sad thing is making a TV show out of it.
I thought TLC was The Learning Channel (not living in the US, or having cable TV) but looking at their website it clearly has very little to do with learning. "The number-one nonfiction media company" they say... the whole thing looks like a load of crap to me. People pay for that stuff?
I thought TLC was The Learning Channel (not living in the US, or having cable TV) but looking at their website it clearly has very little to do with learning.
How can you say that? Did you know about Honey Boo Boo before TLC revealed her to the world? We're not all savants, you know.
People pay for that stuff?
Not by choice. "Extended cable" channels like TLC aren't offered a la carte; they come as part of a bundle, typically the entire "extended cable" bundle. And often the bundles are only sold as tiers; you can only get a given tier if you also get all the lower ones. With my provider (one of the highest-rated in the US in independent surveys), I get TLC if I want anything more than the most limited package, which only includes local stations, shopping channels, and a few mysterious extras like WGN (a Chicago-based "superstation") and C-SPAN2.
That said, TLC does occasionally provide some amusement, even if it is generally of the circus-sideshow variety. Just last night we were watching one of their shows on Christmas-themed collections that are somewhat ... excessive. And it was entertaining enough, for forty-five minutes.
That's about the highest frequency of brain signals picked up.
So everything humans do is down to (a) massive parallelism and (b) huge fan-out/fan-in in the neurons (up to 10,000:1 vs maybe 10:1 in silicon).
AFAIK the only serious neural-net computer effort is still WISARD, which is going back a bit.
Interesting that the first 4 posts were all moderator-deleted.
Probably detected as AI, ROTM marches on!