German supercomputing chiefs are chuffed to announce today that the Forschungszentrum Jülich supercomputing centre will provide the main computer for the ITER fusion reactor, the international effort intended to solve the human race's energy problems. "We are proud that the European Fusion Development Agreement has chosen to …
Depressingly, probably not.
In before "Can it run crysis?"
Hmmmm. 24GB per CPU or 24GB for the entire system? Doesn't sound right if it's the entire system.
Our bog standard web hosting nodes have 16GB RAM each.
Physicist with blinkers on.
The superconductor cathedral is a waste of time and money. The basic concept is not good, and instead of developing new concepts they are obsessed with the stupidity that ion-confinement plasma with a Maxwellian distribution is the only way. Lusers.
Now they think they need to run a supercomputer to keep the plasma going. More energy input, that's the way to go. Yeah...
Shall we play a game?
"This is plainly possible - the sun and all the other stars run on self-sustaining fusion reactions - but achieving it using human technology has had top boffins stumped for decades."
er... is that not a thermonuclear bomb? We can already do that - the trick is to *contain* the fusion reaction, something the Sun doesn't need to bother with.
"The knotty question of how to power all the electric heating, electric industry, electric cars etc after the switch away from oil and gas (or how to produce all the hydrogen for the hydrogen cars, etc) will have been answered."
We'll run out of copper before we run out of fossil fuels.
Teraflops pedantry on a snowy winter morning
For you young folk out there, the origin of "teraflops" is "flops", which is an acronym standing for "FLoating point Operations Per Second". Therefore in traditional speak, one "teraflop" is one "trillion floating point operations per". We used to get quite annoyed by the missing "s" because "operations per" doesn't make a lot of sense without "second". Nowadays it doesn't bother us so much because we still derive a small amount of satisfaction from thinking we know better, but mostly we're too old to care.
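To make the "trillion operations per second" point concrete, here's a quick back-of-envelope sketch (all figures illustrative, not from the article: the 100-teraflops machine size and the 3 GHz single core are just example numbers):

```python
# Illustrative arithmetic only: "tera" is 10**12, so a 100-teraflops
# machine performs 10**14 floating point operations per SECOND.
TERA = 10**12
machine_tflops = 100
ops_per_second = machine_tflops * TERA

# How long would a single hypothetical 3 GHz core, doing one flop per
# cycle, take to match one second of the big machine's work?
single_core_flops = 3e9
seconds_on_one_core = ops_per_second / single_core_flops
print(f"{seconds_on_one_core:,.0f} seconds (~{seconds_on_one_core / 86400:.1f} days)")
```

Which is exactly why the "per second" matters: without it, "operations per" is a rate with no denominator.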
Actually, thermonuclear bombs rely on nuclear fission. That is, the breakdown of heavy elements (usually uranium) into lighter ones, releasing large amounts of heat in the process, and requiring the heavy elements in the first place.
Fission is the other way around, more or less, and isn't quite the same as a bomb going off. Hopefully.
Playing Games..... and Definitely not one for Boys with Toys as IT Requires XXXXPerienced Craft.
Talking about games? .. Control of the Great Game is One for Real in the Virtual World. Being AIMaster in that Parallel/Mirror, Delivers Source Currency for ITERative Power and Shared Controls. ...... [02 February 2009 at 10:54 .... http://www.newstatesman.com/economy/2009/01/labour-darling-interview-brown#reader-comments ]
A Little something for The Old Lady of Threadneedle Street to Pimp/Pump/Prime with Pretty Paper? Yes, of course it is, Darling, and how very Prudent too.
The sun uses gravity to do the job
So we already have a thermonuclear reactor that generates useful power. All we need to do to harness it is build a few dozen square miles of concentrated solar thermal power generation in deserts and use solar energy indirectly in the form of wind energy.
Seems likely to be a great deal quicker, simpler and cheaper to me, but I guess that doing what we already know very well how to do on a bigger scale or just a little bit cheaper isn't the stuff that great careers in science and Nobel prizes are made of.
So they're designing the supercomputer now, for a facility that won't be opening until 2018?
Surely they could wait another 5 years, then spec something from the technology of the day? Imagine how much more powerful (and less power-consuming) a supercomputer designed then could be.
Would be like running a current facility with a load of Pentium 2s.
Meaningless indication of processing speed?
To get 100tflops the figures appear to be assuming 4 flops/clock on every node, isn't this just a touch hopeful?
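A rough sketch of how such a peak figure gets derived (every number below is hypothetical, chosen only to show how much the flops-per-clock assumption moves the total):

```python
# Back-of-envelope peak-flops check. All inputs are made-up example
# values, not the actual machine's specs.
def peak_tflops(nodes, cores_per_node, clock_ghz, flops_per_clock):
    """Theoretical peak in TFLOPS: nodes x cores x GHz x flops/clock."""
    return nodes * cores_per_node * clock_ghz * flops_per_clock / 1000.0

# e.g. 2,000 nodes x 4 cores x 3.125 GHz x 4 flops/clock = 100 TFLOPS
print(peak_tflops(2000, 4, 3.125, 4))   # 100.0
# The same hardware credited with only 1 flop/clock manages a quarter:
print(peak_tflops(2000, 4, 3.125, 1))   # 25.0
```

So assuming 4 flops/clock (i.e. a fully-fed SIMD/FMA unit on every cycle) inflates the headline number fourfold over a conservative count, and real sustained performance is usually well below either peak.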
>Actually, thermonuclear bombs rely on nuclear fission.
Err no, the bombs dropped on Japan did, but modern bombs use fusion.
And how do we spell Jet?
I thought the process of producing a self-sustaining reaction (i.e. one that produces more than enough power to contain the reaction, albeit not much headroom) had already been achieved in the UK in the 90s?
re: oliver mayes
(I'll ignore the wee mistake in the last sentence, I assume you meant to say "fusion is the other way around...")
"The second basic type of nuclear weapon produces a large amount of its energy through nuclear fusion reactions. Such fusion weapons are generally referred to as thermonuclear weapons or more colloquially as hydrogen bombs (abbreviated as H-bombs), as they rely on fusion reactions between isotopes of hydrogen (deuterium and tritium). However, all such weapons derive a significant portion – and sometimes a majority – of their energy from fission (including fission induced by neutrons from fusion reactions). "
The way I was taught it was that a fission event (à la "atomic bomb") provides the initial energy to kick off the fusion process, creating a "thermonuclear bomb". The US bombs over Japan were fission bombs, but hydrogen bombs have only been tested. Thank god.
The Sun uses fusion in much the same way as H-bombs do, except it has a tad more fuel to burn through, and its own gravity stops it from exploding. When enough mass is burned then gravity won't stop the explosion and the earth will experience some severe global warming... :)
As I said, we can already produce uncontrolled Sun-like fusion, but it's the controllable aspect which is delaying the technology.
Thermonuclear bombs use a fission explosion to get started. The X-rays that this produces drive the fusion reaction that provides the bulk of the energy output. Typical output from a simple fission device - 15 or 20 kilotons. Typical output from a thermonuclear device - megaton range
Sorry Oliver, but you have that turned around.
NUCLEAR bombs rely upon fission, the splitting of uranium or plutonium atoms via cascading neutron bombardment, leaving a lighter set of fission products behind (most of which are intensely radioactive). These are the "classic" bombs developed in the 1940s and used over Japan.
However, THERMONUCLEAR bombs utilize a standard fission bomb as a "trigger", which then compresses a capsule of hydrogen isotopes and causes the hydrogen to initiate a fusion reaction - hence the term "hydrogen bomb". It takes the intense blast of the initial fission reaction to set off the fusion, such are the requirements of heat and pressure to start the fusing.
The output force of thermonuclear devices is orders of magnitude larger than that of fission devices... and the power derived from a fusion reactor would be much larger than that of our current fission reactors, if only we could contain it safely and harness it.
"easier to find than scarce fissionables like uranium and thorium."
You're not the real Lewis Page. He would never say anything so blatantly anti-nuclear!
Not to be too pedantic, but thermonuclear bombs are the ones that rely on fusion - i.e. H-bombs. Fission bombs are the ones known as A-bombs and have a considerably lower yield (kilotonnes vs megatonnes). As I understand it, the detonation of an H-bomb usually uses an A-bomb to provide the required amount of heat to initiate the fusion reaction (10s of millions of Kelvin IIRC).
How excellent to have your expert opinion on that. Does it take into account the fact that copper can be easily recycled? And what about humanity's growing need for plastics of every possible description (and our seeming complete lack of willingness to recycle anything that isn't metal)?
You do realise that the primary use for Oil is creating plastics and other such important chemicals, not powering your 6 tonne SUV with Dual SLI V12 Engines? (With sandbags in the back for extra traction on icy roads, and Quad Crossfire Air Conditioners for the heat.)
Peak Oil is already in the past, and this MOMENTARY drop in Oil prices will pass. Don't let that deter you, however. Your lifestyle isn't wasteful or harmful in the least, and it is your Deity-given RIGHT to take everything you can, and piss on everyone else, the environment, and especially those hippies who think that we should try to live a more moderate, sustainable life.
Now, turn the furnace and the A/C up all the way, and see if they can keep your house room temperature. Now you're boosting the economy! Yay!
Meanwhile... to the sane part of the world:
F**K YEAH FUSION. I don't care about the whys and wherefores; if they can get a Tokamak producing more than they put in, on a long-term sustainable level... that's just bloody cool.
Top boffins not stumped as claimed
The previous fusion reactor to ITER was JET, an experimental fusion reactor built to prove the concepts were correct. It was never scaled or designed to produce power.
It was the work at JET that provided the design parameters for ITER, which IS designed to prove you can produce more power than you put in, to be followed by a commercial prototype.
Alright, you caught me
Here I was hoping my GCSE physics would allow me to blag my way into people thinking I was a nuclear physicist. I've obviously got this wrong then, but despite pretty much all of my previous statements being wrong I still stand by the fact that I am right.
"...access 24 gigabytes of total main memory ..." - is this right?
24 gig per chip (that's an awful lot), or 24 terabytes overall?
Something's not right.
I recall ~20 years ago in Scientific American an article on the (then-nonexistent) teraflop computer which said something interesting, that when teraflop machines became a reality then computers wouldn't be seen as computers per se but instead take their place as another tool to examine the real world, next to traditional items such as the microscope. I wonder if that's become true now.
About our hunger for more speed: is it out of hand? The more you've got, the more you want, but are current computing models right, on the whole? Current CPUs use IEEE arithmetic (which takes extreme care over arithmetic precision), but the problems they deal with are often unstable, so would statistical approaches work better? Using real models?
Analogue computers? ( <-- most serious suggestion)
Just asking if there's another way than applying bulk computing power.
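The precision-vs-stability point is easy to demonstrate. A tiny sketch in standard Python (nothing machine-specific assumed): IEEE double precision gives each individual operation a carefully rounded result, yet a naive sum over ill-conditioned inputs still loses information, while a compensated algorithm recovers it.

```python
import math

# Each term below is exactly representable, but the naive running sum
# absorbs the 1.0 into 1e16 (whose spacing between doubles is 2.0).
vals = [1e16, 1.0, -1e16]

print(sum(vals))        # 0.0 -- the 1.0 is silently lost
print(math.fsum(vals))  # 1.0 -- compensated (Shewchuk) summation keeps it
```

So the bottleneck is often the numerical algorithm, not the raw flops thrown at it — which is roughly the questioner's point.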
The Elephant in the Room
Actually the Soviets reported a net positive energy output from one of the last of their Tokamak test devices in the 1980s. A seldom mentioned problem they found was that the magnetic field does little to contain the prodigious neutron flux created by the plasma which has the annoying habit of causing everything nearby to glow in the dark for a very long time as well as rapidly degrading the structural integrity of the reactor components, not a very desirable thing at all. This would seem to be the greatest obstacle to practical fusion power generation. Thus, fusion reactors are not so "clean" as their proponents frequently state. Even if one were to find some way of containing the neutron flux, what does one do with a few tons of waste neutrons?
Also, just to pick a nit, the majority of the yield of most thermonuclear weapons comes from fission of the depleted uranium "tamper" used to compress the thermonuclear fuel. This is also responsible for most of the radioactive fallout. The highly energetic neutrons from the thermonuclear reaction are capable of fissioning depleted uranium and this is used to dramatically boost the yield of the device.
Waste neutron disposal. A suggestion
Crush the lot into neutronium and blow up the resulting little lump with dynamite.
Or just put it in a memory stick stamped "Her Majesty's Government: Confidential" on one side and on the other scrawl 'Jaquis pix an eveywuns soshal wourkas rekords an criem and polis an stuff touch this an U die biches!!!!!!!!!!!' and just wait for it take care of itself on the train or in the boot of a car in the traditional Governmental way.
The sun's too gentle.
"This is plainly possible - the sun and all the other stars run on self-sustaining fusion reactions - but achieving it using human technology has had top boffins stumped for decades."
The power output of the sun is less than a microwatt per kilogram of "fuel". In fact, the reaction inside the Sun's core is such a damp squib that it takes several thousand million years before needing refuelling. This is clearly inadequate for a fusion power station.
It would probably be better if the nuclear technologists first built another sun as a proof of concept before going on to the much higher temperatures and pressures which will be required by a terrestrial fusion power station.
Alan Mackenzie (Nuremberg, Germany).
suppose they get it to work, everything goes electrical, and electricity demand and supply go up 100-fold (for the sake of argument). That's a load of extra heat output. Does anyone (skank-features aside) know much about the environmental effects of this? Does heat from human activity even register next to solar and geothermal? Would it just radiate away? If we fuse it all really quickly, can we compensate for rising sea levels? what about all the helium that's blasted out?
in case it sounds otherwise, i'm thoroughly pro-tokamak - please don't go throwing clarksons at me - but there are still waste products from this process and I'd like to know what effects they may have when produced on the kind of scale that would be justified by truly free electricity.
Damn good question. It & its relatives have been worrying me. There's an answer somewhere (a recent article in new scientist?) suggesting that at current rates of increase of energy emission we would start making a globally significant contribution in heat in about a century (figure from memory, don't trust it).
We won't be doing it solely by fossil fuels because we'll be out before then, hence I wasn't worried by this particular extrapolation, but humanly speaking we seem to abuse any resource bequeathed us, and if/when geothermal takes off I fear we'll abuse it likewise. Someone's calculation was that geo could give us 30,000 years of energy. I think that assumed we'd use it wisely. I think that is not a valid assumption.
Just for comparison (on the computing front), JET used an IBM 3090 for data processing. It needed to get all the processing done in about half an hour as that is how frequently plasmas were run. There was often some preprocessing done on Solaris boxes nearer to the diagnostics but that couldn't have access to the other data (magnetic field strengths etc) that was necessary for full processing.
My memory says that in the 90s it was producing 30GB of data per pulse but don't rely on that data...
As for actually supplying energy: JET demonstrated "breakeven", which means that the fusion reaction generated more energy than it took to create and sustain the plasma. But that is before the efficiencies of conversion. Then there is the "Lawson criterion", which means that you take into account the electrical generation efficiencies and the efficiency of the equipment supplying the power (microwaves or neutral beams). Finally there is ignition, where you kick it off and the plasma generates enough power internally to sustain its own reaction. This means that anything you collect is usable, and once you have gone beyond the amount you needed to create the plasma you are actually generating something.
ITER was originally designed to achieve ignition but I think this was later deemed impossible.
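That gap between "scientific breakeven" and "useful power out" can be sketched with some toy bookkeeping (the function name and every efficiency figure below are hypothetical illustrations, not JET or ITER data):

```python
# Toy Q-factor bookkeeping: shows why Q = 1 ("breakeven") is still a
# large net electrical loss once conversion efficiencies bite.
def fusion_balance(fusion_power_mw, heating_power_mw,
                   wall_plug_eff=0.4, thermal_to_electric_eff=0.35):
    """Return (scientific gain Q, net grid electricity in MW)."""
    q = fusion_power_mw / heating_power_mw            # power gain Q
    grid_in = heating_power_mw / wall_plug_eff        # electricity to drive heaters
    grid_out = fusion_power_mw * thermal_to_electric_eff
    return q, grid_out - grid_in

q, net_mw = fusion_balance(fusion_power_mw=16, heating_power_mw=16)
print(q, round(net_mw, 1))   # 1.0 -34.4  (breakeven, yet deeply negative)
```

Hence the push for Q well above 1: with the hypothetical efficiencies above, the plasma gain has to cover roughly a factor of seven in conversion losses before a single net watt reaches the grid.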
First of all, this computer will obviously be obsolete in a few years, so for sure it cannot be used to analyze a plasma that will not occur until 2018 at the earliest. I would assume that it will be used for design purposes and simulations of plasma performance.
Secondly, ITER faces significant challenges. For example, there is a shortage of the liquid helium needed to make its tens of kilometers of coils into superconductors. And then there is the inherent problem of a tokamak: it is a pulsed device unless you can drive the current in the correct profile forever. Pulses cause severe problems because of the extreme thermal and mechanical stresses that are placed on the components, especially those in the blanket and the shield. And no one has shown that there exists a steady-state burning tokamak equilibrium that handles things like heating, fueling, ash removal, maintaining the current profile, and avoiding major disruptions. (JET, mentioned in other comments, actually rose up into the air during one such disruption.)
There are only a handful of tokamak design parameters that can be used to satisfy the above design constraints: The major and minor radii, the cross-sectional shape, the toroidal field, and the current and its profile.
In contrast, modern stellarators have succeeded in decoupling the physics of the plasma from the coil design. Any toroidal surface with "stellarator symmetry" can be turned into a flux surface, and from this, appropriate coils can be created. Essentially currentless plasmas allow stellarators to be truly steady-state devices. The lack of current also means that particle orbits depend on the magnitude of the magnetic field, |B|, instead of its vector. This means that symmetric |B| devices can be created (to improve confinement) while vector B is still three-dimensional. The challenge for stellarators is that the coils are more complicated (greatly ameliorated by modern computer-aided design and construction) and that, because the coil geometry improves as the plasma-coil spacing decreases, it is hard to make compact stellarator reactors with the required 1-m space for a blanket and shield. However, modern stellarators seem to perform as well as tokamaks of an equivalent size, especially the Large Helical Device in Toki, Japan. The next big test for stellarators will be Wendelstein 7-X, under construction in Greifswald, Germany. It is the first large stellarator designed using the new advances described above. Learn more about stellarators at http://www.ornl.gov/sci/fed/stelnews
Nonetheless, ITER is a worthwhile step forward because it will hopefully explore the physics of burning plasmas.
This being Germany an' all, why aren't they running AmigaOS on it?