Three chemists have been awarded the Nobel Prize in their field, not for arcane experiments with bubbling beakers but for writing software to make computers do all the hard work. "In the 1970s, Martin Karplus, Michael Levitt and Arieh Warshel laid the foundation for the powerful programs that are used to understand and predict …
You have to wonder...
Just how many people give up in the face of their peers being closed-minded.
Well done lads. Enjoy Stockholm.
First Nobel for computational science?
I can't recall any others. Hopefully this will be the start of a new trend. Because of tenure and the academic hierarchy, too many scientists still view computing with contempt, perhaps because back in the days when they were graduate students, computers were women with slide rules and mechanical calculators.
Re: First Nobel for computational science?
Not even the first for computational chemistry. Kohn and Pople in '98.
Re: First Nobel for computational science?
back in the days when they were graduate students, computers were women with slide rules and mechanical calculators
You associate with some rather ... elderly ... scientists, I take it, since they were apparently in graduate school over half a century ago.
The Nobel lags decades behind current practice (because it's given for work that's stood the test of time). It's quite reasonable that there haven't been many Nobels awarded for computational science yet; as Warshel notes, in the '70s, the computing power wasn't generally available.
I don't know of a scientific field today that doesn't have a sizable body of practitioners who are fans of the "third leg", as Wilson called computational research. Even in the humanities it's nearly ubiquitous: CACM has run articles on computational folkloristics, computational journalism, and the like in recent years.
Sure, I can see the advantages of the whole approach, but I can also see the dangers. The main problem with relying on computer models is that you're relying not so much on a computer, which can't make mistakes (a very commonly used argument to promote stuff like this), as on a computer program made by a human, who can.
Mistakes which can happen both in the programming and in the underlying logic that has been applied.
This is not to say their efforts aren't impressive. But I honestly think that in some cases the use of computers is also dumbing us down. For example, scientists before us who had no computers or anything of the sort were still able to calculate the orbits of the bodies around us, such as the Moon, as well as harder-to-spot ones like Jupiter, Saturn and Mars.
Something tells me that scientists today are no longer capable of doing so without a computer being present. Which doesn't have to be a bad thing per se, but it does make me wonder about the growing dependency.
A dependency which is easily excused, often while forgetting or ignoring that computers, or better put their programmers, aren't immune to making mistakes.
In science something becomes scientific when several scientists have taken notice of a piece of work and have agreed with it. So what about scientific computer models?
From a slashdot comment, apparently the 1998 Nobel Prize went to Kohn & Pople, whose work led to the development of the GAUSSIAN code. There is a whole pile of smelly stuff around that code, which you can read about here:
http://www.nature.com/nature/journal/v429/n6989/full/429231a.html (paywalled) title "Software company bans competitive users".
The purpose of a model is to capture the essential elements of the system under study that represent the feature of interest.
For example, a model of the planets works fine with spheres, so long as the mass is correct. If there were an atmosphere, the geometry would matter.
The following is true of *all* molecular dynamics calculations. In quantum mechanics the only atom that has a closed-form solution is the hydrogen atom. Everything else (atom-wise) is an approximation - a very *good* approximation, but an approximation nonetheless. There are no parameters required beyond the specification of the particles involved and their energies, e.g. electrons, neutrons, protons, photons etc. To make it tractable, the approximations are "systematic". All chemistry is electrons, so we have the "Born-Oppenheimer" approximation of static nuclei. Since H is solved, we make "approximations" for multi-electron systems. This is a brief summary, loads of details missing...
For "classical" molecular dynamics the "forces" are parameterised from experiment (or from ab initio calculations) using the techniques from above.
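To make "parameterised forces" concrete, here is a minimal Python sketch of the Lennard-Jones pair potential, the classic fitted interaction in classical force fields. The eps and sigma values below are illustrative only (roughly the textbook argon parameters), not anything from the comment above.

```python
# Toy classical force field: the Lennard-Jones pair potential,
#   U(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6)
# eps (well depth) and sigma (particle diameter) are the fitted
# parameters; the values here are assumed, roughly those for argon.
EPS = 0.996     # kJ/mol (illustrative)
SIGMA = 0.3405  # nm     (illustrative)

def lj_energy(r):
    """Pair energy at separation r (nm)."""
    sr6 = (SIGMA / r) ** 6
    return 4.0 * EPS * (sr6 * sr6 - sr6)

def lj_force(r):
    """Magnitude of the radial force, -dU/dr."""
    sr6 = (SIGMA / r) ** 6
    return 24.0 * EPS * (2.0 * sr6 * sr6 - sr6) / r

# The potential minimum sits at r = 2**(1/6) * sigma, where the
# force vanishes and the energy equals -EPS.
R_MIN = 2.0 ** (1.0 / 6.0) * SIGMA
```

Fitting eps and sigma to reproduce experiment (or ab initio data) is exactly the parameterisation step the comment describes.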
The use of computers may dumb people down if they are used that way, but they also permit us to test things that are *impossible* by experiment, and permit the construction of new experiments. But they are NOT the real world - that is the whole point of a model!
If you want a good book, Allen & Tildesley's "Computer Simulation of Liquids" is venerable but a very nice read.
If you want a nice comp.sci way of looking at this field, I highly recommend reading about the Anton supercomputer from D. E. Shaw's group. It is a special-purpose machine (that can basically solve a 32K 3D FFT in <4us). They used ASICs, fixed-point arithmetic, really fine-grained networks, etc... It is very well explained.
The point of this machine is that it addresses the really thorny question of "computer models": it is the first computer ever built that could run a trajectory for a millisecond (timesteps are femtoseconds, 10^-15 s, so you need 10^12 of them for a ms!). This is within the range of EXPERIMENTS that can measure small-protein shapes on the same timescale, so a direct comparison should be possible. So D. E. Shaw said "We plan to simulate and measure all proteins that are known to fold within 1 ms" (SC 2010?).
We are still waiting to hear the results, as without their machine nobody can replicate them... ;-)
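The step count quoted above is easy to sanity-check with one line of arithmetic:

```python
# Sanity check of the step count: femtosecond timesteps,
# one millisecond of simulated time.
FEMTOSECOND = 1e-15   # seconds per integration step
MILLISECOND = 1e-3    # seconds of trajectory wanted

steps_needed = MILLISECOND / FEMTOSECOND   # ~1e12 steps
```

A trillion sequential timesteps is why a general-purpose cluster doesn't get there and a special-purpose machine was built.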
In conclusion, since the Nobel prize is usually about betting on "winners", I would say this is a very positive step for computational science.
You are criticising the use of computers on various levels, some of which are justified, some not.
Where I'm with you is the current trend to replace causal understanding with statistics. Headlines starting with "studies show ...", and in the last paragraph some psychologist makes a wild-ass guess connected to evolution. You find this is mostly in psychology (which is crackpot science) and medicine (where the argument "as long as it works and kills fewer people than it saves" holds). This has no place in the hard sciences.
Programming bugs are a short term problem, annoying as they can be. If the underlying theory and method is published, more than one group will implement it, and discrepancies will be spotted.
Zooming in on chemistry: the underlying physics is known and understood. But to get results for anything bigger than hydrogen or helium, you have to introduce increasing levels of approximation, trading accuracy for tractability with current hardware and software. The skill of a computational chemist lies mainly in finding the right balance here. This year's Nobel Prize is for the combination of several of these approximate methods in such a way that chemical or biological systems large enough to be of actual interest can be treated with meaningful accuracy.
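The combination of methods described above can be sketched in a few lines. This is a toy illustration of subtractive QM/MM bookkeeping (an ONIOM-style scheme, one common way to combine the levels), not any of the laureates' actual codes; the energy functions and molecule are invented stand-ins.

```python
# Toy subtractive QM/MM (ONIOM-style) bookkeeping:
#   E_total = E_MM(whole) + E_QM(region) - E_MM(region)
# i.e. describe everything cheaply, then swap in the expensive
# quantum description for the small chemically active region only.

def qmmm_energy(e_qm, e_mm, whole, region):
    """Combine a cheap and an expensive energy model over two scales."""
    return e_mm(whole) + e_qm(region) - e_mm(region)

# Dummy stand-ins for a force field and an electronic-structure code:
# each charges a flat per-atom energy (invented numbers).
mm = lambda atoms: 1.0 * len(atoms)   # "cheap" classical model
qm = lambda atoms: 0.8 * len(atoms)   # "expensive" quantum model

whole = ["C", "H", "H", "H", "O", "H"]   # a hypothetical molecule
region = ["O", "H"]                      # the chemically active part

# Only the two region atoms get the quantum correction:
# 6*1.0 + 2*0.8 - 2*1.0 = 5.6
total = qmmm_energy(qm, mm, whole, region)
```

The point of the scheme is visible even in the toy: the expensive model is only ever evaluated on the small region, so system size is limited by the cheap model.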
There is the implied accusation that it makes things "too easy". I agree, but that's a side-effect of making computational tools accessible for non-theoreticians, and it's not the fault of the method developers if they are misused. When I was a lad, you had to study physics and theoretical chemistry for three years before you could coax the available software into producing a meaningful result. The developers (of which I was one, long ago) worked long and hard to make the software more accessible. The result is that people can't be bothered to study the manual for three hours. They click on shiny user interfaces and get meaningless results, which they publish to seven digits' accuracy. It's not the fault of Pople (Gaussian, Nobel Prize 1998) or Karplus (CHARMM, NP 2013). Or mine, for that matter.
If you fire up an engineering simulation application, the disclaimer "This software is intended to reduce testing, and is not intended to be a replacement for it" is displayed. You still need to test real models in a wind tunnel, just not as many of them.
The same idea is applicable to software simulation of chemical compounds.
Of course, the real advances in science come when there is a discrepancy between the expected result and the real result of an experiment, a discrepancy that then needs explaining.
You find this is mostly in psychology (which is crackpot science)...
crackpot 1883, probably from pot in a slang sense of "head." Cf. crack-brain "crazy fellow" (c.1570).
I find that I must agree with your definition. Psychology as a science encompasses the study of crackpots.
Certainly *not* the first use.
I was looking up the history of Harris (formerly Datacraft) minicomputers (24-bit, AMD 2901 bit-slice implementation, 3 MB DRAM memory in 1972) installed in a US university to do this.
That machine was bought because, at about $128K, it was cheaper than the CDC 6700 at around $1800/hr (cheap because that only included the running costs).
The point was that it was 1/30 the speed of the CDC, but they could use it for a lot of work without having to pay for machine time.
So definitely not the first, but certainly in the first wave. I can't say if this work was better, but I guess it was.
Balls and sticks? Balls and sticks!
As opposed to the implications in the subtitle, the prize actually honors the classical balls-and-sticks approach: it was the virtual balls-and-sticks model -- with a little quantum physics injected where required -- that won this year's Nobel Prize. The combination adds value, because the proper quantum description (very time-consuming) can be applied to proper systems (beyond a few atoms).
The art is in finding the right balance between the wrong (classical calculations just won't do for molecules) and the infeasible (quantum calculations just take too much time). Most attempts get it wrong, but the field is still young.
Numbers, Big numbers, and Random numbers
The article contains a not-so-smart quote from Kersti Hermansson "When you solve equations on the computer, you obtain information that is at such detail it is almost impossible to get it from any other method."
D'oh - when you solve equations on the computer you get numbers of arbitrary precision (it's a digital device, after all, and it doesn't specify error bounds). Whether those numbers bear any resemblance to some property of a chemical system is a completely independent question that is much harder to address. Once you properly consider the approximations going into most molecular calculations, you'll realize that the details are quite unreliable. The art is to ask the right question, a question that can be answered by a calculation despite the lousy approximations and despite all the wrong details.
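A tiny Python illustration of the precision-versus-accuracy point, using a deliberately crude stand-in model (a three-term Taylor series for exp, chosen purely for illustration): the machine happily prints as many digits as you ask for, but the model error swamps all but the first of them.

```python
import math

# A deliberately crude "model": exp(x) truncated to three Taylor terms.
def exp_3terms(x):
    return 1.0 + x + x * x / 2.0

x = 1.0
approx = exp_3terms(x)   # prints to arbitrary "precision" on request
exact = math.exp(x)

# Printed to seven digits both look equally authoritative, yet the
# model error is already in the first decimal place (about 0.22 here).
model_error = abs(exact - approx)
```

The digits past the model error are exactly the "wrong details" of the comment above: real-looking numbers with no physical content.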
I like the statement of Phil Bucksbaum: "Most theoreticians perform small-brain, big-computer calculations." The problems of this approach should be readily apparent.
This prize honors the tools of the trade. Now smart people have to use those tools.
Re: Numbers, Big numbers, and Random numbers
Just look at predicted LogD; it's usually way off experimental results because almost no one predicts it in 3D and accounts for solvent accessibility. But meh, good enough for government work.
Re: Numbers, Big numbers, and Random numbers
"Just look at predicted LogD; it's usually way off experimental results"
Calculating (or rather estimating) LogD is POTENTIALLY risky, but often surprisingly good. HOWEVER, the best method (short of measuring everything - which can be time-consuming & error-prone itself) is a good measurement of a representative molecule, then using that as a basis for prediction.
I've worked with the best in this area and there are definitely some huge pitfalls in BLINDLY estimating LogD by machine.
The name of the game these days is collaboration.
In my department it is increasingly common for postgrads from the wet chemistry side of things to enlist the help of someone from computational. We get the best of both worlds, the wet chemists get to save a load of time on possibly fruitless avenues and the computational guys get more raw data to improve their models.
On a wider note, it is becoming increasingly rare not to be collaborating; it's just the way things are done now.
Doing chemistry on a computer?!! Where's the fun in that for Lavoisier's sake?
"Doing chemistry on a computer?!! Where's the fun in that for Lavoisier's sake?"
Do you remember how he ended up? And he tried to get the execution delayed because he wanted to do one last experiment.
BTW, the classic Einstein quote in this area is "The trouble with Chemistry is that it is too hard for chemists" - I've tried to live my life by this motto.
Re: Einstein. Big words for someone who had to get someone else to do the science to prove his maths worked. Important point: nothing Einstein ever did or said gave you a better fridge or a new form of transport or an iPhone. He can only be properly understood by hard sums experts and physicists and they believe in particles that go backward in time (or did when I was a twentysomething).
Lavoisier was guillotined by the French, which only goes to show you shouldn't do chemistry in France. Perhaps Einstein should have said "Chemistry is too upsetting to the French when they are in the mood to chop off heads". At least that would be supported by evidence.
To paraphrase Sir Humphrey Appleby:
Chemistry is fun. It is the Rolls Royce of science. It is the Science that Harrods would sell one. What more can I say?
Plus: a good chemistry lab is like a Bruce Willis movie on steroids. Explosions, belching flames, toxic spills, liquid nitrogen all over the floor and clouds of opaque and possibly deadly gases rolling over the bench in a thick carpet. A chemist must not only be a crackerjack scientist and a dab hand with the burette, he or she must also be an Olympic-class sprinter and jumper in order to escape other people's carelessness with the Kipp's.
What do physicists do of a day? Sit around a giant magnet in Switzerland that either doesn't work properly or produces such ambiguous answers that no-one understands or believes in them. I know which science I'd rather be photographed doing.
As for mathematicians, what can you say about a bunch of people who say they are working with "imaginary numbers" except that they have a future in politics?
"nothing Einstein ever did or said gave you a better fridge or a new form of transport or an iPhone."
I must be too obtuse to detect a clear sign of irony in your post...
You do realize that Einstein got a Nobel Prize for the photoelectric effect, on which much of modern electronics is based? Or that we owe him the theory of stimulated emission, which is the basis of lasers (without which we would not have DVDs, laser printers, barcode scanners, laser surgery, or all sorts of other useful stuff)? Or that without his special and general theories of relativity you would not have, e.g., a GPS in your iPhone?
I do expect The Reg's audience to understand that mathematics is but a tool of science (a tool of "unreasonable effectiveness", in the words of Eugene Wigner), and that Einstein's work was science (physics), not maths. He was a theorist and others did experiments to verify (or implement) his theories - this is normal collaboration, though in Einstein's case sometimes with a few dozen years in between, because it was so damned hard. And the experimentalists got recognized as well.
One should recall, by the way, that historically Nobel was awarded for work with practical implications. This was one reason why Einstein got his prize for photoelectric effect and not for relativity (he did both works in 1905). Eddington, who conducted the first observations proving the theory of relativity, did not get a Nobel Prize, but Joseph Taylor did many years later for his binary pulsar observations that led to extremely precise measurements of general relativistic effects that are crucial for satellite navigation and for proper functioning of the GPS in your iPhone (on top of verifying Einstein's prediction of gravitational waves). Prokhorov, Basov, and Townes got their Nobel for masers and lasers. Perrin got his for experimental verification of Einstein's theory of Brownian motion, which served as nothing less than the proof that atoms and molecules exist (not to mention being an essential tool of modern finance, helping to protect your pension, savings, and investments regardless of what you think of traders and bankers).
None of that would be possible without Einstein's work, so next time you play a DVD, print a document on your laser printer, pay at the till in a supermarket, pass through automatic doors with a photoelectric sensor, or use your GPS, I suggest you respectfully acknowledge your debt to the man.