Re: Think sideways
Tribute to the Goodies? Like a trandem (three-seater bike) as replacement for the three-man Soyuz capsule
2747 posts • joined 24 Apr 2007
A few bottles of Château de Chasselas from the four Yorkshiremen? To wash down the spam?
You could go further back.
Could I build a KIM cluster? Might even catch up with an 80486 if you put together a few hundred.
Interesting developments, in particular having multiple MPI tasks able to run simultaneously on a single Kepler chip. Great added flexibility.
I understand the why, it is just of no use to me. Others will probably be very happy with the machines.
I will aim at a much cheaper, only slightly heavier 13.3" or 14" notebook with nVidia 520 or 540 graphics so I can run CUDA and OpenCL stuff (there are a few very nice ones from Asus, Samsung, and even Dell). The whole idea of an ultrabook is hobbled by the insistence on Intel graphics. For the prices they are asking they could put in a decent graphics chipset. Until Intel supports CUDA (i.e. when hell freezes over) I will steer clear of any machine with only Intel graphics.
Now there is an excuse for a HUGE monitor (or two, or three) if ever I heard one.
Seriously, nice toolkits are coming out for this kind of work. Much needed too, as parallel computing allows you to get things wrong MUCH quicker.
Some will say it is because the Merkin government wants to spy on them.
Others will say it is because many American drivers cannot even handle a stick-shift.
Which is it?
Still like blowing things up though
And after a PhD you become Dr. Knob?
I would be willing to try the chips (and other AMD/Radeon graphics), and the newer Linux drivers. Only problem is we would have to port quite a bit of stuff from CUDA to OpenCL (which might be a good idea anyway, similar performance and no vendor lock-in).
Regarding the binary nVidia drivers, I have no problems there. I rather like the fact that after inserting a new nVidia card in my PC to replace the old nVidia card, Linux runs without any adaptation, whereas Windows needs a new driver.
I must be allowed to keep up with the register anywhere, even whilst walking across the <SPLAT>
What, and miss out on free drinks and food, and any vendor freebies?
Think like a BOFH:
1. Go to the conference
2. Head for the bar
3. Wheedle/bribe/blackmail vendor into inviting you to another conference for free
4. goto 1
They are really getting the excitement in space exploration back to (near) Apollo levels.
Unless you purchase the extra extended warranty, which extends for a further 74 milliseconds for a mere 25% surcharge!!
Only while stocks last!
Or "Budget Suites in SPAAAAAAAAAAAACE!!"
You do not have to sit through them. Using the DVDs as frisbees can bring hours of healthy entertainment. Experiments with microwave ovens are encouraged.
Not long, rest assured, not long
You are right when talking about looking through CDs and glass blackened with soot. There are however perfectly safe solutions. My Thousand Oaks glass objective solar filter works fine on my 8" scope. I watched and photographed the solar eclipse in 1999 with that scope, and the previous transit of Venus in 2004. Baader Solar film is perfectly safe, if attached correctly in front of the objective lens. All eyepiece filters are an absolute menace. I have recently made a solar filter out of it for my kids' 4.5" F/4.4 Newtonian, and my eldest son and I had a nice view of sunspots through it. All harmful UV is blocked, and the total energy levels remain quite low.
Projection is actually dangerous in most reflectors, and certainly Maksutovs, Schmidt-Cassegrains, and other scopes with fast primary mirrors, as the secondary can shatter under thermal stress, and even in fast refractors I would prefer a filter as the thermal stresses might cause eyepieces to shatter. In slow refractors projection is fine, especially as multiple people can watch simultaneously.
"88 percent of respondents said they are willing to seek professional help to treat smut addiction, but would prefer to do it online;"
Well obviously they prefer to do it online, that was exactly the problem, wasn't it?
I have installed Dolphin as my default browser (after trying and not liking firefox on android). No worries on that front
It is SOOOOO tempting to suggest any orangutans on facebook would be in the upper quartile of users intelligence-wise.
But we must resist such temptations, must we not?
says it all!
Mimmoths, preferably pronounced to sound like Inspector Clouseau trying to say "mammoth", have my vote.
Am I alone in reading Mammuthus creticus as Mammuthus cretinus?
Time to head home
The technique worked here because a body heated to 2000 K emits copious amounts of IR (and even visible) radiation. Move an object of the same size to an orbit with a more hospitable (at least for us) 273-300 K, or roughly 7 times cooler, and the same surface emits 7^4 = 2,401 times less radiation (about 8.5 magnitudes lower). That would make a super-Earth at room temperature much harder to spot against the glare of its star.
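A quick back-of-the-envelope check of that T^4 scaling (the 2000 K and 285 K figures are just illustrative round numbers for a hot super-Earth versus a room-temperature orbit):

```python
import math

# Stefan-Boltzmann: total emitted power per unit area scales with T^4.
T_hot = 2000.0   # K, roughly the detected planet's surface temperature
T_cool = 285.0   # K, a "room temperature" orbit (273-300 K range)

ratio = (T_hot / T_cool) ** 4
print(f"Emission ratio: {ratio:.0f}x")  # ~2400x less radiation when cool

# Expressed in astronomical magnitudes: delta_m = 2.5 * log10(ratio)
delta_m = 2.5 * math.log10(ratio)
print(f"Magnitude difference: {delta_m:.1f} mag")  # ~8.5 magnitudes fainter
```

With exactly 7x cooler you get the 2,401x in the post; 2000/285 gives ~2,425x, either way about 8.5 magnitudes, which is why a temperate super-Earth disappears into the star's glare.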
Darn, so I cannot use my Hercules graphics card after all
Speed limit on Autobahn? Only in some places. On most stretches they allow you to hit Mach 2. I once saw a video of a Renault Espace fitted with an F1 engine hitting 200 mph on a track. Perfectly OK for the German Autobahn.
The funny thing is most Germans support a 130 km/h speed limit, but the car industry lobbies very successfully against it.
Brilliant, isn't it. Big thumbs up to the engineers!
So the dog deserves to be shot for that?
What a result, good excuse for a pint or two this evening.
(as if we need an excuse)
Should have gone for
in damages, just for the sake of it.
team on crack?
exactly how many iPads may they claim on expenses?
If only it were a one in a million chance, we would see them every time
Mine is the one with "The Truth" in the pocket
I can just hear a couple of surviving dinosaurs having a conversation
"When I was a lad, we had proper fleas, not these miniature little things that cannot bite through a piece of paper if they wanted to!"
"Right you are! I remember fleas that could bite right through a Triceratops's scales, they could"
"That's nothing, I saw some fleas that could drill straight through an Ankylosaur's club, no less"
"Rubbish, we had fleas which could drill for oil, they could, bite so strong it would go a mile through solid rock, it could!"
"And the problem with kids these days is that when you tell them they don't believe a word you say!"
Came with better screens and nVidia graphics. Light machines which run CUDA stuff. Not cheap though. My even older Vaio SZ is still working, and even it can run more (older) CUDA stuff. No Ultrabook ticks that box.
Fortunately, there has been a spate of very decent 13" and 14" laptops with nVidia 520 and 540 graphics on board. Cheaper than the Ultrabooks too. So, guess what I will get to replace my crumbling SZ.
The stars spell
So long, and thanks for all the fish
Ah, the well-known cloud magnet effect. My scope is 15-16 years old, but every time I buy a new eyepiece clouds rush in.
What Wirth means is that for a given task, the current software needs vastly more resources (CPU and memory alike) than similar software years ago.
Why is this worrying? Because it suggests that we could get by on much leaner compute capacity for many mundane tasks. It means machines that still work fine have to be replaced when the software is updated, and the minimum specs are upped again. This is ultimately wasteful. It also means that bigger server parks are needed for a given workload. If you can make code more efficient, less hardware is needed, and less energy is wasted. Mobile computing (like embedded) can give an impetus to leaner programs, simply because cutting clock cycles can cut battery usage.
As Niklaus Wirth says: Software is getting slower faster than hardware is getting faster.
Word 2.0 was a very capable word-processor, and ran happily on my 25 MHz 80386 machine with 4 MB of RAM (I really splashed out on that machine :-) ). Word 2010 would require rather more. More in fact than the processing power and memory of the good old Cray Y-MP. That is worrying.
GUIs of course do need more resources, but the above example suggests you can run a full-blown GUI-based word processor in 1,000 times less memory than we seem to need today. If you look at the memory footprint of something like FLTK, which is so small you can afford to link it statically for easier distribution, and compare that to some other tools, you cannot help but question the efficiency of some code.
Much of the best coding is going on in the embedded-systems world. You really have to conserve resources in that arena.
I must say I was a bit miffed at not being able to take part. I have several ideas of what to do on a petaflop machine, but as I am not a US or Canadian resident (excluding Quebec (mais pourquoi?)) I cannot send them in. AMD had similar rules for their "what would you do with 48 cores?" competition. There may be some law in the US requiring this, so nVidia may be obliged by law to include this rule. Pity.
"FFS - get yourself a high end graphics card! 256 cores + 1TB memory (or more) + proper parallel coding using CUDA. £300 will get you the dogs-danglies in the commercial world. If you want mil-aero specification - GE-IP have them for reasonable prices.
Image processing - visible, infra-red, microwave, radar, or all combined is exactly what CUDA on these high end graphics cards was designed for."
I assume you mean 1GB not 1TB of memory. If you can supply me a 1TB memory graphics card for £300, I am happy to give you a £300 tip :-). My images start at about 1GB, and end at 1.5 TB (for now). So they will not fit in my graphics cards (with the additional data needed during processing).
Regarding image processing and CUDA: For quite a few image processing problems you are right, but in this case you are wrong. The connected filters that we use have a strictly data-driven processing order, which does not work well in CUDA. Indeed, because the outcome for every pixel potentially depends on every other pixel, parallelization itself is hard (see the first method, which can be found here (warning, pdf)). On 64 cores I am now getting about 30x speed-up. I am trying to adapt this to CUDA (or OpenCL) in collaboration with the CSIRO in Sydney, Australia, but we have not got it running yet.
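To see why connected filters resist a naive per-pixel parallel split, here is a toy sketch (nothing like our production code, and pure Python for clarity): a binary area opening via union-find, where whether a pixel survives depends on the size of its entire connected component, however far away the rest of it is.

```python
# Toy binary area opening: remove 4-connected components smaller than
# min_area. The data dependency is global: a pixel's fate depends on
# ALL pixels in its component, so you cannot simply tile the image and
# process tiles independently.
def area_opening(img, min_area):
    h, w = len(img), len(img[0])
    parent = list(range(h * w))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    # Union pass: merge 4-connected foreground neighbours.
    for y in range(h):
        for x in range(w):
            if img[y][x]:
                if x + 1 < w and img[y][x + 1]:
                    union(y * w + x, y * w + x + 1)
                if y + 1 < h and img[y + 1][x]:
                    union(y * w + x, (y + 1) * w + x)

    # Count the size of each component by its root.
    size = {}
    for y in range(h):
        for x in range(w):
            if img[y][x]:
                r = find(y * w + x)
                size[r] = size.get(r, 0) + 1

    # Keep only pixels whose component is large enough.
    return [[1 if img[y][x] and size[find(y * w + x)] >= min_area else 0
             for x in range(w)] for y in range(h)]
```

For example, `area_opening([[1,1,0,1],[0,1,0,0],[0,0,0,1]], 2)` keeps only the 3-pixel component and wipes the two isolated pixels. The grayscale connected filters in the post (attribute filters on max-trees) have the same global dependency, only worse.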
I am just testing a new compute server for processing remote sensing images (64 cores, 512 GB of RAM). My first runs already use up to 480 GB of that, and chug through a detailed analysis of a 3.5 gigapixel image in just over two minutes (was nearly an hour). The new images will be 2.2x larger for the same area covered.
I WANT A TERABYTE of RAM!!!
that was my optional title which was somehow removed (did I offend? ;-) )
These are not often mentioned, but they had quite a following. It was a neat machine, with a Z80A and 128 kB of memory expandable to 4 MB, and two ASICs controlling memory, sound and graphics. The memory worked at twice the clock frequency of the CPU, so the controller and CPU did not interfere (one had the even clock ticks, the other the odd). Nice machine to play around with. Decent BASIC and word processor on board, very expandable. Linked it up to my Dad's daisy-wheel typewriter (what a racket that was).