Re: A section of rat brain
No need to apologize!
The one point I'd make -- having played student politics at Brasenose against one D Cameron -- is that simulating a politician's brain would be a very tricky task indeed.
R Olsen writes: "@David Lester: I'm not a neuro-scientist but I like to follow the research/read articles etc. It seems like there are complexities in synapse and neuron function that would need to be accounted for to make the model valuable."
Well, I'm not a neuroscientist either!
You've hit on exactly the right question. What level of modelling accuracy is required to obtain the results you're interested in? Steve and I are particularly interested in "plasticity and learning", i.e. mechanisms that allow animals to learn and remember their responses to previous stimuli.
"I'm sure you are aware of many more examples, but two I've been reading about recently:
1-Dendrite preprocessing of information, seems like there is a lot more going on there than previously thought, not sure if the models take that into account.
2 - Neurons switching between slow and fast firing type depending on conditions.
Are the models being used (for synapse and neuron activity) good enough to think the entire model will provide a reasonable simulation of actual?"
Let's answer this in two parts: Henry's model does indeed feature dendritic computation, and the neurons also switch modes. Our simplified SpiNNaker models do not currently include dendritic computation (instead we model the currents passed into the neuron as a linear rather than a multiplicative property), but by using the Izhikevich neuron we do get the burstiness property.
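For the curious, the Izhikevich model is only a couple of lines of arithmetic. Here's a minimal Euler-integration sketch (my own toy illustration, using the textbook "intrinsically bursting" parameter set, not SpiNNaker's fixed-point implementation):

```python
def izhikevich(I=10.0, a=0.02, b=0.2, c=-55.0, d=4.0, dt=0.5, steps=2000):
    """Return spike times (ms) for a constant input current I (Izhikevich, 2003)."""
    v, u = -65.0, b * -65.0        # membrane potential and recovery variable
    spikes = []
    for step in range(steps):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:              # spike threshold: record and reset
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

print(len(izhikevich()))           # fires repeatedly under constant drive
```

The a, b, c, d values select the firing regime, which is how one model covers both the slow and fast (bursty) modes mentioned above.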
Previously we'd have been able to say "well our model shows many of the same properties as Henry's more complicated models", but this wouldn't say anything about how well that matched biological reality.
So, for us the interesting next test is: "Do we need non-linear dendrites?" Because everything in SpiNNaker is done in software, we can make this change, but it will affect the speed and/or density of function we can achieve. One thing to point out is that the systems we're developing in HBP can model one brain area in high fidelity and the rest of the brain at a much lower level of detail.
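To make the linear-versus-non-linear distinction concrete, here's a toy contrast (entirely my own illustration -- the sigmoidal branch, its threshold and its gain are invented for the example; this is not HBP or SpiNNaker code):

```python
import math

def linear_soma(currents):
    """Point-neuron style: synaptic currents simply add."""
    return sum(currents)

def nonlinear_branch(currents, threshold=2.0, gain=4.0, amplitude=5.0):
    """A sigmoidal dendritic branch: co-located inputs amplify supralinearly."""
    s = sum(currents)
    return amplitude / (1.0 + math.exp(-gain * (s - threshold)))

# Four small inputs clustered on one branch versus spread over four branches:
clustered = nonlinear_branch([0.6] * 4)
spread = sum(nonlinear_branch([0.6]) for _ in range(4))
print(clustered > spread)  # with a non-linearity, where inputs land matters
```

With the linear soma the two arrangements are indistinguishable; the non-linearity is what makes the spatial pattern of synapses computationally significant.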
Schultz writes: "Is that a financial stimulus package or do they expect to get 1.2 billion worth of scientific knowledge out of their supercomputer?"
Well, the results reported here are part of the Blue Brain Project, which has been primarily funded by the Swiss Federal Government. The Human Brain Project -- of which I'm a part -- is funded on a biennial basis contingent on results, probably to the tune of about €0.5B in ten years. This funding originated in FET-ICT ("Future Emerging Technologies for ICT") whose mission is to foster industrial collaboration between different ICT companies in the EU. Currently, the money is coming from DG Connect, which is part of the EU's Digital Agenda, which I think was established by The Register's favourite recently-retired EU Commissioner: Neelie Kroes.
The University of Manchester contribution with Technical University of Dresden is a new 28nm SpiNNaker component. SpiNNaker-1 cost the UK research council ~£5M for a 130nm component. Our HBP funding is order €10M.
OK, let's be a bit less gnomic, and try to place the work of Henry and the rest of the Blue Brain team into an IT context.
A cortical microcircuit is really the minimal functional unit of a brain. An analogy is to consider a neuron as a transistor and the microcircuit as some sort of generic IC. One of the achievements of this work is to provide us with a provisional count for the number of distinct neuron (i.e. transistor) types. When you consider that biology has what we'd call an "extremely high process variation" and that neuroscientists are basically given a pile of 37,000 different cells and asked to classify them into "morphologies" then you're getting an idea of what's needed.
Another key idea in this paper is that the places where connections are made occur geometrically, that is wherever the "wires" get sufficiently close to permit connections (synapses) to form. Obviously, it remains for the results to be confirmed by other labs, but if this result proves true, then again our task becomes a bit easier, as the biology becomes a bit less tentative.
To me, as someone tasked with providing an even more simplified version of this circuit (but running in real time, rather than taking hours to simulate a second, as the model reported in the original article does) the true significance is that we have a "reference semantics" against which we can compare the behaviour of the SpiNNaker model.
A test version of SpiNNaker-2 was taped out in July, and we should get the results back just before Christmas. Although the test chip is just trying out ideas (in 28nm), our[*] aim is to permit a microcortical circuit of 40,000-100,000 neurons[**] to be realised on a single one-watt chip.
There are of course any number of features which Henry's model does not yet include, but the intriguing thing is that the model already makes a number of predictions about the results of future experiments.
[*] Sebastian Hoppner (TUD), Christian Mayr (TUD), Steve Furber (UMAN), Dave Lester (UMAN).
[**] Both the number of neurons and the number of connections increases as the animal's brain complexity increases, e.g. for a macaque we're looking at about 80,000 neurons with a fan-in/out of ~5,000. For the rat we're looking at 37,000 neurons with a fan-in/out of ~1,000.
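For a sense of scale, the synapse counts implied by those figures (my own back-of-envelope arithmetic, not a project estimate):

```python
# neurons x fan-in gives a rough synapse count per microcortical circuit
rat = 37_000 * 1_000          # ~3.7e7 synapses
macaque = 80_000 * 5_000      # ~4.0e8 synapses
print(f"rat: {rat:,}  macaque: {macaque:,}")
```

So roughly an order of magnitude more synapses per circuit for the macaque, which is where the interconnect and memory budgets get spent.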
Mr Mage opines "I'd bet it's absolutely nothing like ANY biological brain..".
What stake do you have in mind, Mr Mage?
Dave Lester (Chip designer for Human Brain Project, led by Henry Markram)
As part of GEC Hirst Research Centre, I had the immense privilege to work with Pierre America's POOL/DOOM team in my first EU project: ESPRIT 415.
My favourite moment was being asked to explain to the lab director how it was possible for a meeting "to have ordered and consumed over a bottle each for lunch?" My reply, which was accepted, was: "There were a significant number of Frenchmen attending the meeting, Sir."
Since the project meetings took place every six weeks, my late father also took to asking: "When are you going to get a real job? You're enjoying this one far too much!"
I also discovered that, despite its no-alcohol policy, GEC accountants must have assumed that "wijn" was Dutch for some sort of duck!
As an academic getting some EU money (about 15%, since you ask), here's your reality cheque:
(*) If the UK is to make savings from its exit from the EU, it cannot be intending to recycle the money in exactly the same way; otherwise where would the saving be? It'd be pretty naive to think that my current EU grants would continue upon Brexit, and fairly unrealistic to expect that the UK would pay for further EU science participation.
(*) All academic funding in the ICT field is made with an eye to enhancing technological competitiveness. I have an interesting project which is stymied because it is not in the UK's interests to let industrial tech leak to Australia, and not in Australia's interests to boost our Tech Industry. So this project is never going to happen. Nor will it happen if/when we are out of the EU: it would still not be in Australia's interests to pay for ARM to become more dominant.
(*) Academic funding is offered for a number of reasons. Some schemes provide individual researchers with funds to do their own thing; examples are responsive-mode EPSRC (UK) and ERC Advanced Grant (EU). Others, such as Marie-Curie (EU) or EPSRC's DTC (UK) grants, provide funding for PhD students. And some are to promote collaborative research, i.e. H2020 FET-ICT (EU) and EPSRC Programme Grant (UK). I find a mix of these is the most effective way to undertake my research: the collaborative programmes give you a much broader perspective of how your research fits into a wider field. (Full disclosure: I believe the only grant scheme I have not held from the list above is the Marie-Curie.)
(*) There is no requirement to have a weak European partner, and every reason to avoid them: they will make effective working extremely hard.
(*) Both Switzerland and Norway pay over the odds to be members of EU Science, as does Israel. In addition they do not sit in on the meetings to decide on the next five year research agenda. The Swiss have complained loudly about being cut out of the Marie-Curie and ERC schemes, which was a result of their referendum refusal to extend free movement to Slovenian nationals. In particular the withdrawal from ERC caused problems since their universities have been using the award of ERC advanced grants as a proxy for the academic quality of their academics.
(*) Finally, the last I read about Juncker's proposed alternative destination for the H2020 money, it was supposed to go to the CAP. The UK gets back more from the science budget than it pays in, but far less from the agriculture budget than it pays in.
This is possibly a marginally better idea than that of our new lords and masters:
And there I was, expecting a quick mention of the cryptanalysts at Bletchley Park.
A rather nice photo of Joan Clarke is in the article:
"DiMarco, a longtime IBM employee, was previously in charge of designing, building, and running IBM's 300mm fab in East Fishkill, NY."
Really? 300mm? One imperial foot?
Are you sure you're not having a Dr Evil moment?
(should it in fact be 30nm?)
"What bothers me is - does turning it on to see in the lock screen has contact info count as "using it without authorisation"?"
Well, to put the icing on the cake. After handing the phone in, and getting a receipt ("We don't want you to be done for receiving stolen goods, do we Sir"), the WPC turned it on, and because the PIN wasn't in use, she rapidly found "Mum" under the contacts list.
The ensuing conversation "She did what? On the bus?" did not bode well for a happy Mother-Daughter reunion that evening!
Right, in that case -- as someone who found a mobile on a bus one morning and turned it into the police (rather than the bus driver) -- what is the protocol in this situation?
Your lawyers may have "views"!
"Exascale does not just mean "ARM based vector processor".... for specialized applications."
Talking with Nvidia executives in the back of a taxi in Lausanne: "Exascale is just a marketing term; it means whatever we want it to mean!"
Still, to be serious for a moment, the major energy consumption is going to lie in the interconnect. Making it fully general-purpose and scalable will be extremely expensive. As most supercomputers are made of stock Intel components, it might be useful to consider custom interconnect in order to drive energy costs down. The different supercomputer customers have wildly different requirements. Google, for example, the largest user of supercomputers on the planet, has no (or very little) need of floating point.
What I expect to happen with next generation supercomputers is:
(*) Vectorization (drives down fetch-execute costs)
(*) DRAM stacked 3D (reduces memory access energy costs by factor of 5 at 28nm)
(*) As many cores as you can put in. (Steve's on record saying that energy efficiency dictates that these cores need to be as small and simple as possible; my only comment as the programmer is I'd like one heavy-duty core to handle IO)
Furber and I announced an ARM-based vector processor for neuroscience applications with an equivalent power-performance ratio in Lisbon in June, and no need to invent new technologies, just 28nm, mobile DRAM, and 3D packaging.
... Of course, if you insist on something as power-hungry as x86, you'll need to be a bit more inventive, ...
AC writes: "What he seems to have said doesn't look particularly wrong to me. Turing had written Computable Numbers in 1938..."
1936. See http://www.cs.virginia.edu/~robins/Turing_Paper_1936.pdf
"... but it was only after the war, at the NPL, that he focused on building the universal machine. Due to the lack of momentum at NPL he decided to go to Manchester where at least he would have access to a working computer."
True. But part of the problem at NPL was too many people trying to "help" design it. Turing included.
"For a mathematician he was very hands on, and did lots of building of electronics."
"Finally, it is arguable that the Manchester device was the worlds first programmable digital logical computer, rather than a calculating machine which is what all previous devices were (and mostly analogue)."
Paul Turner has kindly pointed out similarities between SpiNNaker and the HBP proposal. This is no accident: Steve Furber and I have had considerable input into HBP, and we are major partners in this project.
As far as we are concerned, we gain immeasurably from access to interested Neuroscientists and the other skills we do not have. For example Seth Grant (Edinburgh/Cambridge/Sanger Institute) is one of the foremost scientists in the field of genetics and neuroscience. Stanislas Dehaene is a great Cognitive Scientist.
The difficulty with multi-disciplinary research projects is finding people willing to cooperate and willing to invest the time to understand the new languages used to describe other fields of science. For example, integrating the SpiNNaker chip into robotics is not something many mainstream roboticists wish to undertake, and with good reason. With most robots one would want to be sure about what it will do, for safety reasons if no others; this is not an option if the device's behaviour changes as it learns.
Steve and I have already invested time and effort talking to the "neurorobotic" part of the project: Alois Knoll and the rest of his team in Munich (there's a SpiNNaker board there already linked to one of their robots), Murray Shanahan at Imperial, and others.
We also need the biological insights that will come from the neuroscience part of the project led by Henry Markram at EPFL Lausanne. Without this, we will struggle to make our work "biologically relevant".
Of course, there is also the Graphene research that won the other €1 billion prize; they're celebrating on the floor below me!
... I'm making damned sure he's up to his eye balls in EU funding proposals at the moment (it's my job); but if you like we'll see what we can do after Jan 6.
... because I rather fancy a wager on the success of neuro-computing.
Say $50 on the Human Brain Project ( http://www.humanbrainproject.eu/ ) turning out something useful if it gets ten years funding.
The snag with teaching yourself neuroscience is that it's really an entirely new subject; you'd be better off getting real experts involved. So HBP involves people like Henry Markram (Neuroscience), Steve Furber (ARM designer), and Seth Grant (Human Genome project, neuro-genomics expert).
Dave Lester (APT Manchester / HBP project)
I'm looking through an old book of mine ("Approaches to Numerical Relativity", Ray d'Inverno, CUP 1992), and numerical simulations of colliding neutron stars appear to occur over very short periods: i.e. the final approach from 20km to merger takes about 2ms.
Does anyone have any more recent estimates? There's also an assumption that the pulsars are not shredded at 20km, which would have been a necessary simplification 20 years ago, but may be considered unrealistic nowadays. Anyone?
Does this tie in with the observation that neutrino bursts from distant supernovae appear to arrive 20 minutes prior to the visual confirmation? Is this the right time difference to match the experiment? (Previously explained as "dust" interactions.)
As neutrinos notoriously don't interact much, could this be something to do with virtual particle creation/destruction in transit?
... is that of heat transmission.
One of the "computing engines" in the Science Museum's collection is Carter's Ringing Machine. This was devised circa 1900 by Mr Carter, who was a Birmingham bell-ringer, and -- for aficionados -- it was capable of ringing Stedman Triples on handbells via an electro-mechanical linkage. The problem was that with the original steam engine, heat was transmitted via the drive shaft into the system, causing sufficient expansion for the tolerances to be exceeded and the machine to seize.
In more recent times, the machine has rung a "full extent" of Stedman Triples (that's all 7! = 5040 changes or about three hours), using an electric motor as power supply. It's worth a look if you can get the curator to give you a private demonstration.
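The arithmetic behind a full extent is easy to check; the ~30 changes-per-minute ringing rate below is my own round figure, not a measurement of the machine:

```python
from math import factorial

changes = factorial(7)     # Stedman Triples: every permutation of 7 bells, once
minutes = changes / 30     # at ~30 changes per minute (assumed rate)
print(changes, f"changes, about {minutes / 60:.1f} hours")
```

5040 changes at that rate is 168 minutes, which squares with the "about three hours" quoted above.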
The topic for discussion next Monday with Kevin Gurney (Computational Neuroscience, Sheffield) is: "The striatum: what do we know? Can SpiNNaker model it convincingly?"
But the point about robots is well-taken. We have already made contact with potential robotics partners in both the UK and EU. What Tim has not focused on (there's rather a lot of work behind the press release) is that the system runs in real time. And it's low power --- each chip consumes 1A at 1V (for 1W power consumption), and for the neural simulation it has the computing power of a typical high-end desktop. The full system runs at less than 50kW (depending on workload).
Still, the question we all have is: just what do you have to do to get an article filed under RiseOfTheMachines? Buy all the London-based staff beers?
Whilst the popularity of Psychology probably lies more in its promise to aid understanding of other humans than in subsequent employability, there's a mystery as to why -- here at Manchester -- our Computer Science courses are so unpopular, at least among UK students.
Last year the department led the Engineering and Science Faculty tables on employability. Has this led to a major improvement in applications? Of course not! As Admissions Tutor I will be one of the few in the University going into clearing again this summer, and this despite having the target reduced by 10% a few weeks ago.
Perhaps the Reg readership would like to get to the bottom of the subject's unpopularity. Is it that compulsory IT GCSE leaves people with the mistaken impression that Computer Science is IT? Do most people not realise that the subject is either discrete maths or electrical engineering, and consequently fail to get the requisite maths A-level? Answers on a postcard, please ...
Still, it's the same story over in Chemical Engineering, and their newly-minted graduates are the best-rewarded of all.
Another question is the old Private Eye trick, whereby we conclude an article about the fragrant Imogen, with -- apropos nothing at all -- a quick: "Lester Haines is 57".
I noticed rather a lot of that in the last issue, including apparently interesting speculation on who best to invite to a dinner party with Andrew Marr.
I think it'd be best if their Lordships got their thinking straight on these relatively simple matters before taking on the complexity of the interwebs.
Binary Translation, anyone?
(Seriously, if this is the best Intel can muster, then the writing really is on the wall isn't it?)
The most famous example in the UK would be the late Robert Runcie MC: a tank platoon commander in 1944/5; latterly ABofC.
Not content with "genericness", we also have "fact-intensive assessment" inflicted upon us.
All I can suppose is that it makes a pleasant change from the usual lawyerly[*] amble through the sunlit uplands of "fact free" legal discourse.
[*] What do you mean: of course any noun can be f**kingly adverbed. It's part of my first amendment rights.
... you've only forgotten the most famous sword in Anglo-Saxon folklore (we'll ignore the Romano-British Excalibur, shall we?)
As someone who's been acting as ambassador to my boss on contract negotiations with the EU, the whole feel of these diplomatic cables gives me deja vu.
The boss (for whatever reason) cannot attend the mid-level meeting. He can read the minutes. But what he really needs is a feel for how the meeting went. That's what these thumbnail portraits (of foreign worthies) and gossip (the stuff that's not minuted) really represents.
And with few exceptions, what we see is that US diplomacy in private is virtually identical to what it says in public. And that's a jolly good thing.
Typical porcine behaviour; boar are vicious buggers when they're defending their young.
Still, to add to your list: pheasants...
This is from a memorable Xmas special BMJ issue (1987); they used to reserve their strangest articles for the holiday period.
Steve Evans writes: "Jeeez The US patent office will allow a patent on anything won't they!"
I would like to patent a device that can automatically generate a patent that the US patent office will reject. Provided that such a device/software/business method is granted a patent -- and I'm unaware of prior art, nor do I think such a device is obvious to practitioners of the art -- then I think I can diagonalize (after Georg Cantor) the US patent office, and we will never hear from them again.
... well more an RT executive, truth be told.
Whilst Amdahl's Law cannot be revoked, our intended application (neuroscience) has P=1 (i.e. everything is parallelizable). And hence (modulo interconnect performance) a million chips run a million times faster. By Amdahl's Law. Cool, eh?
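The claim is easy to sanity-check; Amdahl's law is one line of code (a generic sketch, nothing SpiNNaker-specific):

```python
def amdahl_speedup(p, n):
    """Speedup on n processors when a fraction p of the work is parallel."""
    return 1.0 / ((1.0 - p) + p / n)

# With P = 1 the speedup is simply n: a million chips, a million-fold.
print(amdahl_speedup(1.0, 1_000_000))
# Any serial fraction at all caps it: with p = 0.95 you never beat 20x.
print(amdahl_speedup(0.95, 1_000_000))
```

It's that saturation in the second case that makes the embarrassingly-parallel structure of neural simulation such a gift.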
An 18 core ARM chip running at 233MHz? Pretty wimpy. (But can be powered from the USB port)
A million of them, with sufficient interconnect? Now that's what I call interesting ...
For those in need of the IT angle, I commend Leslie Lamport's excellent article on Buridan's Ass:
These are government statistics. This means that we will be using all HESA course codes in subject area 25.
(Decrypt: G4xx Computer Science, G5xx IT, G6xx Software (and other) Engineering, G7xx AI, G8xx I just don't care).
So the message might more simply have been stated as: if you have a degree in IT, you'll struggle to get a job. The CS students will be a minority amongst the sample included within these statistics. Certainly here in Manchester, our graduates have turned out to be more employable than any others in the science faculty for the last year.
I was impressed enough with Daniela's work when I saw it in Italy last month that I'd intended to tip off El Reg about it.
One drawback on the practical front is the power consumption. Those smart wires draw five amps. She also has other vids of self-sculpting matter and smart matter. Again the current drawn by the 1624 minimotors she's using would worry me.
Still, I'm no roboticist; instead it's a matter of some pride that she's using our low-power ARMs!
Panto season started early this year Mr Carnegie?
Having spent twenty-five years with threading and multi-processor programming, I'd say that this is probably the finest introduction to the subject that you'll ever come across.
Dear AC (14.50GMT),
I thought I might get a reply of that nature. ;-)
We are in complete agreement: what matters is the resources available to "UK based boffinry". And by that I mean PhD grants and money for post docs. If it's taken away, then there's not much point in my under-boffins or any xeno-boffins replacing me is there? They'd also be better off elsewhere. Don't forget that in today's world, my under-boffins are not UK nationals. As for getting UK nationals as PhD students....
And by taking away the PhD grants, the primary mechanism for finding your overseas boffins (for subsequent use in UK industry) is also removed. But you knew that, didn't you?
The question you have to ask DBIS is this: how many boffins do you expect to leave the UK in search of better prospects?
We put up with the low pay, and the petty bureaucracy, in return for sufficient funding to do something interesting and then tell others about it. We build up networks of friends around the globe. And, if conditions become uncomfortable, we are highly mobile.
So, for computing: Seattle, the Bay Area, or Sophia Antipolis vs Manchester? Right now, what holds me back are worries about health insurance and the doubtful nature of my rusty French. If either country has the foresight to make me an offer, I'm interested.
All it would take to wreck UK Science and Engineering is for politicians in another part of the world to make us an offer we can't refuse. An interesting ploy for the USA, at the right time, would be to make a blanket offer to consider all UK scientists in disciplines of interest, short-circuiting current immigration red-tape. For France the immigration status is even easier to sort out, and the climate in Nice is attractive. So to the French minister I say: "Give me a modest salary, and the research funding I need, and I'll move tomorrow."
And besides, buying up another country's scientists is probably the cheapest way to undermine its viability.
(And if you want an example of how to do science in a recession look at the Alvey Project in the early 1980s. The UK CS scene was transformed by this, and ARM is only the most significant of the many productive companies resulting from that decision.)
What has David Willetts got to say?
Mr Anonymous Coward writes:
" - there is no offence of plotting to commit libel."
I wonder, however, whether there might be a charge of "conspiracy to commit libel"? Or is "conspiracy" restricted purely to criminal charges?
Time to call Schillings and Carter-Ruck...
Surely you meant to ask Mr Gumby whether he wished to become "...our sex toy co-respondent" before pointing out that such a task might not involve any journalistic talent.
Paging Simon Travaglia...
Remind me: wasn't I using Intel chips in the 1970s with these features?
How long does a patent last in the good ole US of A?
Pace the informative correspondence attached to your last item on plod/computer interactions, it should be abundantly clear by now that the Met have no intention of taking this to trial. Because if they did intend to go to trial, they now have no audited evidence trail for Mr Greene's PC or hard drive image.
We can therefore safely infer that the only remaining purpose of the SO15 operation is to serve as an intimidatory warning to others.
I'm still puzzled by the red herring of warrants. These are signed by a judge or magistrate on behalf of Her Majesty. And She is the one person who, through centuries of hard-fought tradition, is prohibited, along with all Her agents, from entering the House of Commons, except by the express invitation of the House.
Logically, any search of Parliament can only take place with the express permission of the Speaker and/or Sergeant-at-arms. Which, luckily for the Met, is exactly what happened.
First they came for the Icelandic Banks; but I wasn't an Icelandic Bank.
Then they came for opposition MPs; but I wasn't an opposition MP.
Perhaps it's time for all of us all to get our coats?
Alisdair was a lecturer here, not a research student.
And as I recall it, the initial development of the idea was undertaken by final year undergraduates.
A quick Google through www.berr.gov.uk/pressroom (now cached only) shows that a certain Right Honourable Stephen Timms had this to say about Nominet:
"I am really pleased that Nominet has taken the lead, showing that this really is not a government-led, top-down initiative, but one grown from the Internet community. It is an initiative that engages with all parts of the social and economic fabric that benefits from, and contributes to, the development of the information society.
The Forum is not a club for governments, not a decision or treaty making forum. It serves to improve our understanding, to help us make better decisions, to find better ways of addressing problems.
The Internet Governance Forum has identified key topics to help orient its discussion: access, diversity, openness and security. As well as concern from some countries about the management of critical Internet resources. These issues are all important for the future of the Internet, and they cannot be addressed in isolation from each other.
Next June the OECD is organising a ministerial conference on the future of the Internet economy. That will see a similar focus: how to increase global connectivity? How to respond to the changes in use of the Internet? How to allow individual choice for access to content?
These things are looked at differently around the world. There are different expectations and concerns. The more we can do to understand the challenges and the opportunities, the better prepared we will be to take advantage of the knowledge economy, of the societal benefits from the innovation and growth of this powerful channel of communications and human interaction.
And that’s where Nominet’s Best Practice Challenge comes in. The UK has a good story to tell. We can use our position as a major knowledge economy to help show others examples of good practice, to help others address concerns. Sharing good practice like this will also help us to identify things that we could do better.
And so to the best practice challenge awards.
First, I’d like to commend Nominet for their initiative: this is really important in helping underline UK leadership.
Second, I welcome the community’s response in coming forward with first class examples of what can be achieved. That must have made Alun Michael’s job as chair of the selection difficult.
And third to commend the selection team for choosing between the really good to identify the excellent.
Competition has been strong. There is a wealth of good examples which we can show the world with pride. Which show the UK off as a forward thinking, Internet economy that cares for its citizens, too.
Perhaps Bob and Lesley could tell us about the shortlist?"
I'm rather taken with the epistemology involved in the statement "... <True> Nothing<ness> exists".
Lewis writes: "Regarding the matter of which Shooting Star might eventually become the world's first space parachutist - truly, the first really haute couture sky diver - Giovangrossi played his cards close to his chest.
"Tests with humans are a long way off", he said. ®"
Surely the perfect moment to suggest that El Reg contributes to the advancement of science, no?
Weren't you even mildly tempted to suggest that there might be suitably qualified El Reg staffers prepared to go where no one has gone before in pursuit of the best in Italian high fashion?