Re: Did anyone else notice?
Mr Eisenberg, are you certain you want to name your son Hagai? You realize you are dooming him to a career of quantum physics?
A real scientist will go on arguing until he or she is satisfied by the proofs/arguments given (and even then could start arguing again as new data become available).
Dead right. Imagine the excitement if the fluid turned out to be alcohol (how easy would it be to get volunteers for a trip to Mars then?).
Alternatively, it was octane, and some Martian equivalent of the Humvee became so popular that they used every last drop (hence the carbon-dioxide atmosphere).
There's a Dutch saying which translates to:
The best sailors are always on shore
I think this is most appropriate.
I am also reminded of Harry Enfield's perennial
You don't wanna do it like that!
The difference is actually made in patent law in most countries.
Most patent laws explicitly exclude the patenting of mathematical equations (though not necessarily their application). Because any algorithm can be expressed in the lambda calculus, it could be (and has been) argued that algorithms should not be patentable; their application (e.g. as a codec) could be, under various laws.
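For the curious, the "algorithms are just maths" point can be made concrete in a few lines. Here is the standard Church-numeral sketch (illustrative only, nothing to do with any particular patent): arithmetic expressed as nothing but lambda terms.

```python
# Church numerals: natural numbers encoded as pure lambda expressions.
# A numeral n is "apply f n times to x"; addition is just composition.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral back to an ordinary int."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```

Addition here is a mathematical expression through and through, which is rather the point of the argument.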
So, as Mr. Slant would say: "This is an interesting case."
With degree of interest being defined as proportional to the amount of money it brings to lawyers.
Sorry, couldn't resist.
Mine is the one with "What is Life?" by Erwin Schrödinger in the pocket (very much worth the read!)
The beancounter should have heeded the warning sign:
Danger! Dropping tables! Hard-hat and further protective clothing required.
or the more generic:
Here be dragons
when approaching the BOFH's den
Old git 1: When I was a lad, we never had these newfangled integrated bleeding development environments!
Old git 2: Aye, we had emacs, and we were glad of it!
Old git 3: Emacs! we would have loved to have had emacs, we were stuck with vi, but we were happy!
Old git 4: We did not have editors at all, we had piles of punched cards! Hundreds of lines of FORTRAN, one to a card!
Old git 1: Luxury! We had no programming languages at all! Everything was machine language!
Old git 2: Machine language!? We would have died for machine language. We had to hand-wire connections in our computer to program it!
Old git 3: Right! When I was young we had to compute entire navigation tables with nought more than pencil and paper and a hand-cranked mechanical computing apparatus!
Old git 4: And the problem with kids today is that if you tell them they don't believe a word you are saying!
very nice indeed
Stewing the results in beer is recommended
They seem to be getting better very slowly. I have yet to see a real killer app for these, but maybe one will come along.
I will just stick with my current watch. I only ever buy a watch when the previous one breaks in such a way that repair is no longer economical. Given that my current (only my third) watch is a year old, and on average, my watches last about 19 years (OK, N=2 statistics), there could be some really enticing options available when I am up for my fourth watch.
They might be able to claim "Force majeure"
Shouldn't that be:
Mine is the one with the "Das Boot" director's cut DVD in the pocket
Tonight I think I might be serving some portions of monkfish fillet, lightly drizzled in olive oil and some lemon juice, sprinkled with pepper and some fresh sage from the garden, wrapped in lean smoky bacon, baked in the oven at 220 °C for just 15 minutes, served with pasta and pesto alla genovese, and spinach.
Alternatively, I might just have some pizza. Some beer or wine would go down a treat as well
Pity really, I liked my HTC Desire a lot. I now have a Desire X (in part because the interface was familiar, in part because I got a good deal), but when that needs replacement I may well have to look elsewhere, unless by then telcos are dumping their HTC phones as they too bail out (something tells me living in the Netherlands has influenced me ;-) ).
One thing that the government here in the Netherlands might try (maybe the same elsewhere) is to let teachers teach, not administrate. I gave a few computer science lessons at nearby schools, to show kids that CS is certainly not learning to work in MS Office. I was appalled at how much time teachers have to waste on administration during each lesson. And then there are the endless assessments and tests that are foisted upon school children here from the age of 4 upwards, endless assessments to measure outcomes. As one teacher pointedly stated: "A pig doesn't get fatter by weighing it more often". This tendency to measure everything to death has even reached the preschool crèches we have. I have heard people agonize over the fact that a 3- or 4-year-old kid only recognized 4 letters whereas they should know 5 at that age. As if that difference is at all meaningful!
When teachers are assessed by the teaching outcomes of their pupils, they will train them to perform well at the test, which is not the same as giving them understanding of the subject matter (let alone stimulating them).
Or contains sauce
Darn, I'm hungry now
Just don't cross-connect it to an electric monk, it will try to believe all 16 channels simultaneously.
Mine is the one with "Dirk Gently's Holistic Detective Agency" in the pocket
So now, drivers can mess things up with sub-millimetre precision!
OK, stop this sketch it's getting silly
Mine is the one with a map in the pocket
(and yes, sub-millimetre precision is impressive!)
Never? You mean only when they wrench it out of your cold dead hands. In which case you will be past caring.
which runs Linux, allows you to log into your servers. A full keyboard would perhaps be a bit much, but a small set of function keys with the following functions could be enough:
F1: rm -rf *
F2: kill -9
F3: shutdown -h now
like this one
Suggestions for other keys welcome
Cramming 41Mpixel into such a small format is apparently possible, but only by making the photosites tiny. This means the electron well can only hold comparatively few electrons before saturating. This in turn means that the number of photons detected before saturation is low, which means increased noise and lower dynamic range. By binning multiple photosites together into a much smaller number of pixels, you can counteract this, and no doubt produce some very nifty 8MP images (especially with good processing techniques).
However, this will never match the quality of a DSLR, simply because the much bigger lens of a DSLR gathers more photons, which means a better signal-to-noise ratio. The reason is that quantum efficiencies (percentage of photons detected) of photosites are already very high indeed (40% and above), so little can be gained there (and every gain at the phone end can be made equally at the DSLR end). A DSLR lens easily has 4x the diameter of that of a camera phone, meaning it gets 16x the number of photons. A phone sensor would need to collect 800% of all photons (which is clearly impossible) to match a DSLR sensor with 50% quantum efficiency.
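The back-of-the-envelope sums above can be written out explicitly (the 28 mm / 7 mm diameters are illustrative stand-ins for "easily 4x the diameter", not measured specs):

```python
# Photon count scales with lens area, i.e. with diameter squared.

def photon_ratio(big_diameter_mm: float, small_diameter_mm: float) -> float:
    """How many times more photons the bigger lens gathers."""
    return (big_diameter_mm / small_diameter_mm) ** 2

ratio = photon_ratio(28.0, 7.0)  # assumed 4x diameter difference
print(ratio)  # 16.0

# Quantum efficiency a phone sensor would need to match a DSLR sensor
# that detects 50% of its (16x more numerous) photons:
required_qe = 0.5 * ratio
print(required_qe)  # 8.0, i.e. 800% -- physically impossible
```

Which is why no amount of clever processing changes the underlying physics.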
So, will these cameras produce nice images, good enough for most purposes? Even after discounting the marketing speak: yes. Will they match professional DSLR kit? No; this little thing called physics gets in the way. On the other hand, do I carry a phone with me all day? Certainly. Do I carry several kilogrammes of DSLR kit around all the time? No way!
I do not care whether he is first or second: there is going to be (another) British astronaut. Cheers to him
Put like that, maybe NASA could get sponsorship from Ferrari to up the ante on that front
Share and enjoy!!
"We'll tell you:' Go stick your head in a pig'"
As sung by a choir of robots with their voice boxes exactly one flattened fifth out of tune.
No large, friendly letters available I am afraid
It might have been a scary (or more accurately, very, very sad) person who is responsible for the down vote, or alternatively, it might have been a very clumsy person.
Mine is the one with Jingo! in the pocket
between a beer icon and a helicopter icon.
OK, let's go for the tinfoil-hat brigade:
The government laces drone-supplied beer with truth serum so they can spy on you
In truth of course, beer in sufficient quantities can always make you divulge inconvenient truths
Every time I see the ISS fly over (it happens quite often whilst stargazing; I could even spot its overall shape with big binoculars), I just have to give quiet praise to the people up there, and the people who helped put them there. However much we bemoan not doing enough space exploration, some people are dedicating and even risking their lives in its name.
People like Cmdr Hadfield set a shining example in my book. Cheers to him and all like him
CSP is powerful. The main problem I find is in keeping communication down, especially in terms of how often processes need to communicate. It is much cheaper to shuttle a few large chunks from one process to another than a whole lot of little messages. It is not just the latency (though that plays a role); the synchronization also costs time (barriers are particularly costly).
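The "few large chunks beat many small messages" point can be sketched with Python's `queue.Queue` standing in for a CSP channel (the numbers and names are illustrative; only the structure matters):

```python
# Fine-grained vs coarse-grained message passing: both compute the same
# sum, but the fine-grained version pays one synchronized hand-off per
# item, while the coarse-grained one pays a single hand-off per chunk.
import queue
import threading

N = 100_000

def fine_grained() -> int:
    q = queue.Queue()
    def producer():
        for i in range(N):
            q.put(i)      # one synchronization per item
        q.put(None)       # sentinel: end of stream
    threading.Thread(target=producer).start()
    total = 0
    while (item := q.get()) is not None:
        total += item
    return total

def coarse_grained() -> int:
    q = queue.Queue()
    def producer():
        q.put(list(range(N)))  # one synchronization for the whole chunk
    threading.Thread(target=producer).start()
    return sum(q.get())

assert fine_grained() == coarse_grained()
```

Timing the two makes the synchronization overhead of the fine-grained version painfully visible.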
I have some rather memory-intensive code that I once ran (or rather walked) on a ScaleMP machine (8 boards with 8 cores each, an older incarnation). Performance was dismal. Why? Each thread may need access to any part of memory, because the outcome for each pixel in these huge images may depend on any other pixel, and you do not know beforehand which ones matter. Everything works hunky-dory as long as each processor only accesses the memory on its own board, but the moment it needs large amounts of data from another board, latency kills performance. Getting a speed-up of 0.5 at 2 threads (if they weren't explicitly pinned to cores on the same board) is rather discouraging.
What they are doing is putting a software layer over a distributed-memory or NUMA machine, so as to hide the complexities and allow shared-memory algorithms to run (or rather walk) without the need to rewrite the code. ScaleMP hides the actual NUMA architecture very well. Curiously, this leads to problems when optimizing the parallel code for that particular hardware, because the details are too well hidden. You really need to understand the memory architecture and the latencies of the machine to design the appropriate algorithm. Parallel programming on shared-memory machines and on distributed-memory/NUMA machines are two very different ball-games, often requiring a careful rethink of the algorithms in order to get the processors to spend their time working, not talking (just like an old-fashioned classroom), or waiting for data.
A ScaleMP-like approach could work if the latencies are kept very low (like the QPI approach). On run-of-the-mill network connections (or even InfiniBand), you need to rethink shared-memory code, not so much because of bandwidth, but because of latency. For Cell/GPU-type systems similar rethinks are needed, for much the same reason.
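The "explicitly pinned to cores on the same board" trick mentioned above looks like this in minimal form (Linux-only; `os.sched_setaffinity` does not exist elsewhere). On a real NUMA box you would pass the set of core IDs belonging to one board/node, e.g. as reported by `numactl --hardware`; here we just pin to core 0 so the sketch runs anywhere:

```python
# Pin the current process to a fixed set of cores, so all its memory
# traffic stays on one board's local memory (on a NUMA machine).
import os

if hasattr(os, "sched_setaffinity"):  # Linux only
    os.sched_setaffinity(0, {0})      # pid 0 = the calling process
    print(sorted(os.sched_getaffinity(0)))  # [0]
```

With explicit pinning (or `numactl --cpunodebind`/`--membind`), the pathological cross-board latencies described above can largely be avoided.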
Or: OOOK! as the librarian warns you not to land on the dome of the library.
Sorry, couldn't resist.
Was he wearing suitable sun-block protection on his bare arms?
We must know!!!!!
"is" in between
More seriously, it looks like my decision to postpone the acquisition of a laptop until such time as MS backtracked was a good one
The coolant of choice is raspberry juice
I'll get me coat
Who says all researchers live in an ivory tower?
When people subscribe to a forum I help moderate, our software sends an authentication e-mail with a link that lets the user authenticate himself. Hotmail, and Hotmail alone, corrupts the URL we send, so an administrator has had to program a workaround. Let's hope and pray that Outlook.com doesn't mess up in the same way.
Standards? MS has heard of them, I recall.
I know, I am an eternal optimist.
Yep, noticed the same. Even the most measured posts are receiving thumbs down from somebody. Not necessarily an American though, there are plenty of idiots on either side of the big pond. Idiocy is very democratic in that sense
There is still a lot of common sense about, I see it all around me every day. I see morons too, of course, but they are in somewhat shorter supply. Modern communication technology just means you hear about the morons much quicker. Do not forget, people being sensible is not something that sells papers, or generates clicks on ads.
Silica (quartz) is not usually a true glass. Fused silica is used in optics, but the high temperatures needed for its glass transition limit its use. Typical glass combines silica (about 75%) with sodium and calcium oxides, so silica is the main constituent in many cases, but not the whole story. Most silica has a proper crystalline structure and is not glass-like in many of its properties (e.g. the thermal conductivity of glass is typically closer to that of certain liquids than to that of crystalline silica or carborundum (silicon carbide)).
Sorry, couldn't resist. Sounds like it's time to go already
There are many tasks that run like the clappers under CUDA. These are all those tasks that are of a more-or-less SIMD nature, like large matrix multiplications, Fourier transforms, and any other method that has a predefined processing order, preferably with a lot of micro-parallelism in there. Subtasks also need to be fairly isolated, to minimize communication load. For those tasks Kepler and Tesla-like processors are great (we have a couple).
However, there are also tasks in which the processing order is data driven, and where each processor might need to access arbitrary parts of the (large) data set. I am currently doing multi-scale analysis of 3.9 Gpixel images, and doing that on a Kepler or Tesla board is a nightmare. Our 64-core Opteron machine gets a 32-50x speed-up, because this algorithm works best with coarse-grained parallelism. x86-64 machines are not going away soon, and GP-GPU processing is not a panacea (great though it is).
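The coarse-grained pattern can be sketched as follows. Caveat stated loudly: this toy assumes the chunks are independent, which (as described above) the real multi-scale algorithm's data-driven dependencies do not allow; `analyse_chunk` is a hypothetical stand-in, not the actual analysis.

```python
# Coarse-grained parallelism on a shared-memory multicore box: split
# the data into a handful of big chunks, one per worker process.
from multiprocessing import Pool

def analyse_chunk(chunk):
    # Placeholder for the real per-chunk work.
    return sum(chunk)

def parallel_analysis(chunks, workers=4):
    with Pool(workers) as pool:
        return pool.map(analyse_chunk, chunks)

if __name__ == "__main__":
    image = [list(range(r, r + 100)) for r in range(0, 800, 100)]
    print(parallel_analysis(image))
```

A few big chunks per core keeps the processors computing rather than communicating, which is the whole game on this kind of hardware.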
I cannot say that the fact that one particular missus cannot understand a phone is the best measure of usability. My missus moans about her Android phone endlessly (which my kids understand in the blink of an eye), but at the same time refuses to listen to any advice (like what the different buttons/apps do, what gestures are available, etc.). In quite a few cases I find that moaning is not about getting help or advice; it is purely about getting attention. If you truly solve the problem, you deny them the chance to moan about it again.
BTW, this is not a female thing: we have several excellent and technically very competent female PhD students here. It is much more of an anti-tech mindset that some people develop.
Sorry, end of rant, it has been that kind of morning. Beer, because I feel I am in need of that.
To see the Horsehead Nebula you need very dark sites, a big scope, and for preference an H-beta filter. Even with my 8" in the Alps I failed to spot it (I did not have the filter though; next winter might be better).
We are lucky to live in a time when we have instruments like Hubble and Herschel to capture such beauty.
even at the subatomic level, being hit by a WIMP does not have much effect
Yeah, right. I should have gone home long ago
Just to keep back-seat drivers occupied
And the young'uns like to be near the queen of course. To paraphrase Loudon Wainwright III
"The cutest ant that I have ever seen
is our own big fat sexy queen!
True she hasn't got such great legs,
but you should see the girl lay eggs!"
Mine's the one with the dead skunk print on the back.
A virus scanner that identifies itself as malware
At least it will remove itself, or at least file harakiri.dll