Re: Solid Labour seats
Oxford West and Abingdon? ~170 majority and postal ballots have gone missing
"Sweden's foreign minister criticised Saudi Arabia's human rights record.
Last month, foreign minister Margot Wallström said it was unethical for Sweden to continue with its military co-operation agreement with Saudi Arabia."
Well said. Mr Hammond, are you listening?
Disagree. The need for ABP (and I can't understand how people browse without it) is a symptom, not the cure. The cure will come when the advertisers understand that they haven't got a god-given right to shove their crap in your face 24/7 whether you want it or not. But it's going to be a chilly day in Hades before that happens.
Just like paperwork and your desk, the computation expands to fill the computer ...
Remember that in computational science it is very rare for computers to give exact solutions to the real physical problem. Rather, what you get are approximate solutions to models of reality, and the game is to get the most accurate solution to the best model that you can solve. Ten years ago the computational facilities could solve certain models to a certain accuracy. Ten years of improvement allows
a) more accurate and complex models to be solved
b) already used models to be solved at higher accuracy
c) a mixture of both
I don't know what kind of models they are solving, but for b) it might be as simple as resolving features in the solution at 1 km instead of 5 km (cf. weather forecasting). And the higher accuracy in this case really is very much better - I really don't want these things going kablooey unrequested, and like the poster above I'm keen that this is done from every viewpoint AND with the best accuracy that is currently feasible.
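(Back-of-the-envelope, and my numbers rather than anything from the article: going from 5 km to 1 km horizontal resolution means 5 times as many grid points in each horizontal direction, so ~25x as many points, and the time step typically has to shrink by roughly the same factor of 5 to keep the numerics stable, so you're looking at something like 125x the work for the same forecast before you've changed the model at all. A decade of hardware improvement gets swallowed very easily.)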
Indeed. But today research is run by the bean counters who only care if the books balance at the end of the year. Any research getting done is purely incidental and of no great import compared to that.
Simple. Just send the B ark out first. Of course the others can follow later.
And the Unified Model would run like a dog on it, you (and the Met Office's paying customers) probably wouldn't get tomorrow's prediction until next week at best, and it would cost a huge amount more in recurrent rather than capital expenditure.
Comparing embarrassingly parallel workloads on loosely coupled, rented hardware and communication sensitive codes that require tightly coupled, low latency hardware is at best an exercise in futility.
"He's fucking fantastic people"
Too much information ...
"One emulatrix tweeted:"
"He later added:"
'There should be a box saying " the information provided is for Electoral Administration purposes only " for you to tick.'
Rather, at least online, there should be a box for you to untick.
Agreed, but if a pox on the companies, what malady is sufficient for the ultimate problem here, the ridiculous state of the US legal system wrt IP issues?
(And yes, I know there are multiple redundancies in that sentence)
Might be "bleeding obvious" to you, but till it is observed it ain't science - it was probably bleeding obvious in 1588 that the heavier a thing was the faster it fell, but then Galileo's balls dropped and the world was changed.
And kudos to those involved just for being able to see this, as noted above.
Please, the capitalisation ... It's Fortran. It's been officially Fortran for almost 25 years now. And if you look at the original Fortran manual
you'll see it was Fortran in 1956
Not my area, but my guess is that it is because the spectrum gets messed up in strong magnetic fields. See e.g.
Note that refers to first order perturbation terms - I suspect (but don't know) that the fields used at CERN are strong enough to make that a very poor approximation, and that many more terms are needed.
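(If my rusty undergraduate quantum mechanics is right, the usual first-order, weak-field result is the Zeeman shift, delta E = g_J * mu_B * B * m_J, i.e. line splittings that grow linearly with the field B - and once the field is strong enough that the higher-order terms matter, that simple picture falls apart.)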
Please, factual posts like this look much better if you give at least some references to support your assertions. At the very least please quote your primary source, which I believe is www.pluckedoutmyass.com
Not the snappiest acronym for the bill. How about the "Courtesy Unto Normal TravellerS" act?
"(And can one manufacture perfluorographene, which might make PTFE look sticky if it can exist at all? )"
Yes you can. Carbon monofluoride has been known for years, and more recently graphene fluoride has been made.
Personally, even if, and it's a big if, the predicted properties are found to be true I doubt stanene will ever be made, and if it is it will be highly unstable under any useful conditions - it will just disproportionate to elemental tin and either the di- or tetra-fluoride. Tin just ain't carbon!
"Well, given most of the population of the planet, when presented with a decent description, would gibber and their heads would explode, probably not a good idea."
Ideal for one of the recent party conferences, then.
'In "The Age of Austerity", do we really need any toothless (or muzzled) regulators?'
Of course we do! Regulation inhibits growth!! Under no circumstances should we regulate, that's anti-business!!! You're not one of these red commie socialists, are you?!!!!
"Unless you're doing some obscenely computationally intensive tasks, the Intel compiler isn't going to make a jot of difference other than scoring Intel a tick-in-a-box. Not really worth all the signing up."
I'm from the HPC world and while I agree with the "computationally intensive" bit that's not the only reason for getting a compiler. Different compilers have different debugging capabilities and expose different problems in code, and so I always try out code with a number of different compilers while developing - to get sucked into a "one true compiler" mentality can be very dangerous. So while I'm not sure about the relevance of the execution speed here, the possibility of checking the code with another compiler can only be a good thing.
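To give a flavour of the sort of thing I mean, here's a made-up toy (nothing to do with the article, just a sketch):

program oops
  implicit none
  integer :: i
  real :: a(10)
  ! off-by-one: the loop writes one element past the end of a
  do i = 1, 11
    a(i) = real(i)
  end do
  print *, sum(a)
end program oops

Built with default options most compilers will quite possibly run this and print something plausible. Build it with run-time checking turned on (e.g. gfortran -fcheck=bounds, or ifort -check bounds) and it stops dead at the out-of-bounds write. Different compilers have different checks and give different warnings, which is exactly why putting your code through several of them is worth the bother.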
"Wait a minute..... so 30 degrees of global warming is a good thing, but 30.5 would suddenly be a catastrophe?"
Please define catastrophe, otherwise that has little to do with science.
All one can say scientifically is that
a) Current models estimate that the greenhouse effect warms the planet by ~30 degrees
b) Those models seem to do a reasonable job at describing (pre-)historic climates
c) These models correlate increasing CO2 in the atmosphere with increasing temperature
d) According to the models increasing global temperature has a number of climatic consequences
Anything beyond this is politics, and not science.
"CO2 isn't (by an order of magnitude or so) the largest component of "green house gas". Good old dihydrogen oxide is."
Certainly H2O is the major contributor to the greenhouse effect (order of magnitude is debatable). However that's irrelevant. The greenhouse effect is mostly a good thing, making the planet on average ~30 degrees warmer than one would expect. The problem is pushing a good thing too far, so what is at issue is not the major contributor to the greenhouse effect, but how the different contributions are changing. And CO2 is certainly a major contributor to the effect, and its concentration in the atmosphere has markedly increased over recent history.
Good luck Nvidia. My personal experience of the PGI compiler suite is that it is one that gives mediocre performance while failing to help me spot my stupidities. This is supported by
though of course this is hardly full and complete coverage.
Personally I much prefer gfortran, not for political reasons, just because I think it's a better compiler.
"There were no differences in initial conditions or in processing methods. The same starting conditions and same processing methods should lead to the same results"
Errrrrr, no. The initial conditions may be the same, but looking at the (not yet peer-reviewed) paper shows that the model was run using a number of different compilers at a number of different optimisation levels using a number of different MPI implementations. All of these can lead to differences in operations as simple as adding up a list of numbers, and thus over time the results of the calculations may diverge. Hell, depending on the way the MPI (and possibly OpenMP) is implemented it's perfectly possible to get differing results from two runs on the same machine with the same executable.
And that's even before I consider whether the machine itself is strictly IEEE 754 compliant (maybe, probably not if you look at the grubby details), whether the code is being compiled to exploit the machine in an IEEE 754 manner (probably not in all cases), and whether all compiler/library combinations used by the code can be expected to give identical answers in every single case (probably not). And then there's comparing runs done on differing numbers of cores. Et cetera, et cetera, et cetera ...
This is usual behaviour. And what is being presented looks like (I haven't read the paper in detail) a reasonable attempt at trying to quantify the observed divergences.
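If anyone wants to see how little it takes, here's a tiny sketch (nothing to do with the model in the paper, just the principle):

program sum_order
  implicit none
  integer, parameter :: n = 1000000
  integer :: i
  real :: forward, backward
  real, allocatable :: x(:)

  allocate(x(n))
  ! a wide range of magnitudes, so rounding matters
  do i = 1, n
    x(i) = 1.0 / real(i)
  end do

  forward = 0.0
  do i = 1, n
    forward = forward + x(i)
  end do

  backward = 0.0
  do i = n, 1, -1
    backward = backward + x(i)
  end do

  print *, forward, backward, forward - backward
end program sum_order

Same numbers, same machine, just added up in a different order, and in single precision the two totals already disagree in the last few digits. Now imagine the order of the partial sums depending on which MPI rank reports back first, feed that difference into a chaotic system, and integrate for a few thousand time steps.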
You might take a look at
"I suspect most people would squirm with that question. Even if you believe the Big Bang theory, where did the energy come from to create the Big Bang? As far as I am aware there is currently no good answer for that."
Not my area of science but I was under the impression that
summarises the current position
It was killed by Ronnie
"Obviously, Windows 8.1 isn't as dramatic an upgrade as the leap from Windows 7 to Windows 8."
Indeed. But what is required is as dramatic a leap as from Windows 8 to Windows 7.
Guessing here, but the RANDOMIZE "convention" might come from the ZX81, where it may have been used in preference to the LET form as it took up less memory, both in storing the source (keywords were tokenised) and possibly in avoiding having to store a variable - when you've only got ~3/4 kByte for your program every little counted!
Well Fortran, as it has been spelt for 20+ years, is probably there as the GNU Compiler Collection contains a Fortran front end, gfortran
Ian, who doesn't work in a bank
It's quite simple - the 2nd film is entirely about escaping from the goblin's dungeon
"Last time I looked (tried), ignorance was no defence."
Well you're probably not a director of a multinational media company then. Or a minister in Her Majesty's Government.
There's even a special Journal for publishing such articles - It's called "The Daily Mail"
but on a somewhat larger scale?
Depends on what you have (obviously), but something in the region of 10 GFlops will be the right sort of ballpark.
There are a number of reasons why clouds can't meet all needs, at least in the scientific area. One is that many scientific codes need the low latencies that these machines provide to scale to
anything like the number of cores they contain, and the network in clouds just doesn't cut it. There are also I/O and other issues. For a recent report take a look at
there's a key findings section of about a page and a half near the beginning.
Certainly looks like he's in a pickle
"The eclipse will also allow stargazers good views of other planets, by cutting out a major source of light pollution"
The moon is light pollution? So I suppose the big yellow shiny thing during the day is as well? Or is it a case of BLOW UP THE MOON SO WE CAN SEE THE STARS!
As I understand it, yes. But take this with a pinch of salt, it is second hand and I don't have many more details than I've said already.
The network is just not up to it. I have an associate who has benchmarked these things using applications that currently run on 1000s of cores of various dedicated HPC resources, and once you get outside one node there is no competition. For instance one chemistry example runs in 194 seconds on 4 nodes (32 cores) of a University HPC server, but 450 seconds on 4 nodes (32 cores) of the cloud, even though the single node speeds are about the same (534 v. 520).
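To put numbers on that: single node times of ~530 seconds either way, so the dedicated machine is getting a speed-up of roughly 2.7x on four nodes (down to 194 s) while the cloud manages barely 1.2x (450 s). Near-identical single node performance, hopeless scaling - that's the interconnect at work.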
To be fair if you don't need good, low latency comms it's not nearly that bad. Fortunately not a lot of proper scientific HPC falls into that camp - this keeps me in a job!
Well according to
one way is just as god intended - Fortran, message passing and OpenMP.
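For anyone wondering what "Fortran, message passing and OpenMP" looks like in the small, the standard hybrid starting point is something like this (a bare-bones sketch, not production code):

program hybrid_hello
  use mpi
  use omp_lib
  implicit none
  integer :: ierr, rank, nranks, tid

  call MPI_Init(ierr)
  call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
  call MPI_Comm_size(MPI_COMM_WORLD, nranks, ierr)

  ! MPI handles the between-node (distributed memory) parallelism,
  ! OpenMP the within-node (shared memory) parallelism
  !$omp parallel private(tid)
  tid = omp_get_thread_num()
  print *, 'rank', rank, 'of', nranks, ', thread', tid
  !$omp end parallel

  call MPI_Finalize(ierr)
end program hybrid_hello

Build with something like mpif90 -fopenmp hybrid_hello.f90, launch with mpirun, and you get one MPI process per node (say) with a team of OpenMP threads inside each.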
"where people trade their personal information for the chance to watch TV programmes"
Well they are idiots then.
> ... with Big Bang 2.0 and all (Did we learn why the Tardis is exploding?)
It was jump started by Lister
OK, having a degree in chemistry and being a tedious pedant I had to comment.
Gold is what is produced by this reaction, not what goes in (citric acid dissolves gold? Wedding rings wouldn't last long in the kitchen). What is input is AuCl4-, and this is photoreduced to produce gold nanoparticles, the reaction being catalysed by the protein. And it is the nanoparticles that cause the purple colour.
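For those following along at home, the overall reduction (leaving aside exactly what donates the electrons - that's where the light and the protein come in) is just
AuCl4- + 3e- -> Au(0) + 4Cl-
and it's the Au(0) atoms aggregating into nanoparticles that give the colour.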
So purest purple is actually gold ... Percy didn't stand a chance!
(Also, why is the El Reg spill chucker objecting to "colour"?)
This is political correctness gone mad
So yet again machines are being judged by how well they can run matrix multiplies (hidden as a linear equation solve), an algorithm that is
1) All but ideally suited to achieve macho flops to be quoted by hardware vendors, whatever the architecture
2) All but ideal for scaling to many compute cores
3) All but totally divorced from how the vast majority of real apps work
And now it's on a machine that the vast, vast majority of the small number of HPC coders can't program, at least in the scientific arena.
Given this I fully expect performance for the Linpack benchmark to continue being crucial to many procurement exercises.
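The arithmetic behind 1) and 2), for anyone who cares (my numbers, not the article's): factorising an n x n matrix costs about (2/3)n^3 flops but only touches about 8n^2 bytes of data, so the flops-per-byte ratio grows with n. Make the matrix big enough and the whole run sits in cache-friendly BLAS3 kernels at close to peak, communication costs included. Most real scientific codes are nowhere near that regime - they're memory bandwidth or network latency bound - which is point 3).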
Hmmm, it's difficult to find any kind of interesting technical detail, but one of the news articles does say
"Parallelizing applications will be performed by the upcoming Blue Cheetah product called Coalition, which is scheduled for release in the January 2011 timeframe."
"The decision means major projects such as the Diamond Light Source can go ahead."
Ermmmm, I'm a bit confused. Obviously the big building I see when I visit the Rutherford Appleton Lab is a figment of my imagination and
is wrong where it says "Construction of this new scientific facility began in early 2003 and Diamond became operational on schedule in January 2007." Well it's either that or El Reg is a little confused and talking rubbish, which would never happen