Re: Well, you know what they say...
Don't be silly! It is never that small
WWI is in the Concise Oxford Dictionary. Until 1939 it was referred to as the Great War or the World War. Only after WWII did the phrase WWI become commonplace.
No spam other than the usual rate (like from BILL GATES FONDATION, or the widow of the late UJUMBU N'TUITIF, or warnings that bank accounts are blocked until I update my personal information, usually from banks where I have no account). Internet business as usual, in other words. No offers for cheap Viagra today, or cheap PhDs (maybe they did find out I have a proper PhD through LinkedIn).
My password does not seem to have leaked, no important stuff is on there, and I have changed my password to be on the safe side.
baffling "design decision." I will stick with Win 7 at home, unless I seriously want to tick off the missus (I really only boot to Linux).
I must convince the people at work that the GTX 680 would sit very nicely in my desktop, if only to test some neat CUDA fluid dynamics code a student has just made (which works nicely on a dual GTX 590 machine in the lab). It is so nice we are getting humongous compute clout for such modest prices, compared to the old Cray J932 and SV1e we used to have.
Good point about the southern hemisphere.
"NGC 6537 looks very like a cross - and many other planetary nebula exhibit the hourglass appearance that could be interpreted as a cross. The helix nebula is the size of the full moon and 2.5 light years across and may be a candidate - not sure how fast its expanding!
A nearby one may have dispersed by now, or possibly have blown itself away with a greater explosion later on - maybe the crab nebula was planetary before it went supernova?"
The Red Spider (NGC 6537) is pretty unique in shape, and at 1.5 arcminutes is not very big (and would have been smaller in the past). The hourglass-type side-on nebulae like the Dumbbell are not that cruciform, and though the Helix is the size of the full moon, it is very difficult to spot, even through my 8" scope, as its surface brightness is very low. This is the key problem: high surface brightness planetaries are small; ones big enough to resolve by eye have very low surface brightness.
If the white dwarf at the centre of a planetary is part of a binary (spotted one such system last year), it could go supernova (Type Ia), otherwise this is unlikely (not enough mass). There is no indication that the Crab pulsar is part of a binary, I think. There may be a supernova remnant as yet undiscovered, of course. Supernovae embedded in a star forming region could generate strange light echoes on the surrounding dust and gas.
Finally, the "red crucifix" in the sky may have been atmospheric, rather than deep sky: a curious illumination of clouds after sunset, noctilucent clouds in a strange formation, auroras, or a bright meteor which exploded (and formed a cruciform pattern). And of course, if your king or local lord had stated he saw a red cross in the sky (after imbibing some bad mead, maybe), stating you could not see it might be a terminal career move ;-)
"A planetary nebula could easily be mistaken for a crucifix so I guess a nearby one would easily cover all the requirements. We just need to find the culprit."
Not the planetary nebulae I know. Besides, they are not large enough to be resolved by the naked eye. Planetary nebulae are not supernova remnants, but form when a star roughly the size of the sun blasts off its outer layers. However, let us not forget that the 1054 supernova was seen in the east, but not by western observers (too busy bashing each other's brains in?). I do not know of a nearby SN remnant which could be a candidate. A gamma-ray burst may be the culprit, as others have noted.
As this event is not so much impossible, but very, very improbable, I suggest the Heart of Gold is to blame
They almost make it sound dirty.
Nudge, nudge, wink, wink!
Your thunderbolt, does it go, hey, does it go?
I bet it likes lightnin', know what I mean? Say no more, say no more!
because it runs my code faster. On 8 or 16 cores I also tend to get a slightly better load balance than on 6 and 12, because the binary tree structure used in the gather phase of many algorithms is nicely balanced. Furthermore, hyperthreading is great mainly if the different threads share a lot of the data they work on, so that you do not get cache contention issues. In my code I find it does not contribute anything, and can actually harm performance.
The same does not hold for a lot of code out there. Horses for courses. For my desktop, the AMD chip is best (but not with an AMD/Radeon graphics board, because we also use CUDA); others may be served better with Intel chips.
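For anyone curious, here is a minimal sketch (a toy memory-bound kernel, nothing to do with my actual code) that shows the hyperthreading effect. Compile with gcc -O2 -fopenmp, then run once pinned to physical cores and once including the HT siblings; on most machines the second run is no faster, and sometimes slower:

#include <omp.h>
#include <stdio.h>

#define N (1 << 24)
static double a[N], b[N];            /* big enough to be memory-bound */

int main(void) {
    for (long i = 0; i < N; ++i) { a[i] = 1.0; b[i] = 0.5; }
    double t0 = omp_get_wtime();
    double sum = 0.0;
    #pragma omp parallel for reduction(+:sum)
    for (long i = 0; i < N; ++i)
        sum += a[i] * b[i];          /* HT siblings fight over the same caches here */
    printf("%d threads: %.4f s (sum = %g)\n",
           omp_get_max_threads(), omp_get_wtime() - t0, sum);
    return 0;
}

Try OMP_PLACES=cores OMP_PROC_BIND=close ./a.out versus OMP_PLACES=threads ./a.out and compare.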
I cannot see the transit of Venus due to bleedin' clouds today
Been there, loved it. Spent some time in Sydney, and really liked the place (people, climate, food, surroundings).
If you find Oz boring, can I have your helping?
Of course we must assume any past trend continues unchecked. Double brain size by 3000 AD? Make that 4x bigger by 4000 AD, a massive 10 kg brain by 5000 AD, etc. How long before our brains are visible from space?
had to be said
They would not communicate with cricket-playing nations, would they?
Held in very bad taste out there in the galaxy.
Duh! Batmat == Batman
Calm down, Robin, it's just the batbot
It can't be that time already?
What is the speed of a kangaroo in vacuum?
Mine is the one with the roo-leather Barmah hat
Hat, coat, outahere!
They simply had to put up with whatever was preinstalled, or foisted upon them in the workplace. People are adaptable, and can learn new ways of working, but some "improvements" weren't.
I love the 13" form factor, but dislike the terrible resolution.
Unless the user let the browser remember the password. There should perhaps be stronger security than username/password, e.g. verification via a code sent to a mobile phone, or the PIN/bankcard validation used for online banking.
Neat machine, but ultimately beyond my means. A year later I was coding image processing software for a living on an 8 MHz 80286 with 640 KB RAM, and a Matrox PIP1024 image capture and processing board, which had a whole 1 MB of RAM. I yearned for the vast 4 MB RAM of the A440.
Elon Musk is quite aware of the similarities. When interviewed by Jon Stewart on the Daily Show, he acknowledged he should get a white Persian cat.
Great guy, great achievement by the whole SpaceX team!
"As a (senior) lecturer, you probably have no concept of how unabidably crap the average corporate trainer is. It's a certificate culture out there, and the delegates (I won't demean the term "student" by using it here) are expected to sit, listen, maybe "brainstorm" a bit, then walk away with a piece of paper."
Actually, a colleague had to attend an "Academic Leadership" training course given by a corporate trainer. His description was telling. Unabidable crap is a fitting designation. The trainer asked questions like: "What would you do if a PhD student turns up at 9:30 each morning?"
Answers from (experienced) trainees varied from "Nothing, as long as he gets his work done" to "Commend him for consistently arriving before the head of the department."
These were not the right answers according to the trainer (who clearly had no concept of an academic working environment). He honestly expected people to work regular 9-5 shifts. When told that this was not how we work, and that many PhD students work, say, 10 am to 8 pm shifts or longer, he stated this was no way to run a lab. When asked whether he had ever run a research department, he had to admit this was not the case, but he stuck to his guns that he knew how it should be run.
My colleague and all other trainees considered the course a complete waste of time, but you had to get the certificate for the new tenure track system. I gather they have now got rid of this course.
"Either the team responsible for this cock-up didn't attend - or those teaching the courses need to be fired."
As a (senior) lecturer, I cannot accept responsibility for all cock-ups my students make after following or even passing my course. The people teaching the course in source attribution may have been sterling, but let us not forget a student's ability to forget, misconstrue, or otherwise garble any information or skills imparted to them. I have seen all too often that students learn things only to the level needed to pass the exam, and then get totally plastered to ensure they erase that section of memory as effectively as possible. Fortunately, there are also many students who really want to learn and work hard at it. I decided long ago to focus my efforts on the latter class, and lose no sleep over the former. After all, they are grown-ups; they are responsible.
Many image search tools exist; they should have been able to find the source. If anyone is to be fired, fire those responsible.
The 16th-century ball tested looks more like Mr Nutt's design than the lumps of wood with rags used before it.
You are assuming the lawyer gets paid to help you, which is not how many lawyers see it. They get paid so they get richer. As the lawsuit is delayed, they charge more. The same holds for the lawyers of UPS. What incentive do they have to get things over with quickly if they get paid by the hour?
Agreed, they employ many salesmen of the caliber of C.M.O.T. Dibbler. To sell a bad product as successfully as Microsoft clearly can, you must be a hell of a salesman. Windows 7 is mostly OK, however.
Quality comedy, that is, or quality trolling, maybe.
Now they have to worry about IE as well as IEDs
Sorry, couldn't resist, given all the trouble IE6 (aargh!) and 7 (less so) gave with our conference web site, which worked fine with Opera, Safari, Firefox, and passed W3C compliance tests.
On the other hand, I would not expect a computer to drive more dangerously than a large percentage of drivers (word used without prejudice) in Crete or Cyprus. Driving there was, let us say, an interesting experience, after which a quick bout of dodging charging bulls seems like a picnic.
In relation to the earlier Streisand remark, I just read that as
and of course I had to think of Cohen the Barbarian (a.k.a. Ghenghiz Cohen)
Mine is the one with "Interesting Times" in the pocket.
That is correct, because too many lusers are now using it after hearing it is secure
Fortunately one login for all except the HPC systems suffices here.
Those who love metric don't like mm Hg, they like Pascal.
El Reg readers like
All this drinking to their success may start to stress livers (and brain cells) in many places
Brilliant stuff. Takes me right back to my childhood memories of the Apollo program.
He has joined the choir invisible
He has gone to meet his maker
It is an ex-donkey
was my immediate thought.
Terry Pratchett, you are a wise man, and clearly we all live on the Discworld.
you will imagine hearing it say
"More powerrr, we need more powerrrr!"
Not the frothy kind! Distill the stuff, I say. A glass of single malt to my fond memories of Scotty saying
Ye cannae break the laws of physics, capt'n!
I will wait to see the scientific paper on this. A problem with these tests is that they mix hardware and software performance measurement. Gaining speed by increasing communication bandwidth (and decreasing latency, for preference) just gets the "duh" response it deserves.
The only way to see whether two algorithms really differ is to (i) do a proper complexity analysis (computing time and memory/bandwidth use) to see how each should scale theoretically (both in terms of data size and number of processors), and (ii) time optimized versions on the same hardware (or different sets of hardware), using a variable number of processors or nodes.
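As an illustration of (ii), a bare-bones strong-scaling harness (the kernel is a made-up stand-in for the algorithm under test): run the same work at 1, 2, 4, ... threads on one machine and watch where the speedup curve flattens.

#include <omp.h>
#include <stdio.h>

/* placeholder kernel standing in for the algorithm under test */
static double work(void) {
    double s = 0.0;
    #pragma omp parallel for reduction(+:s)
    for (long i = 0; i < 200000000L; ++i)
        s += 1.0 / (double)(i + 1);
    return s;
}

int main(void) {
    double t1 = 0.0;
    for (int p = 1; p <= omp_get_num_procs(); p *= 2) {
        omp_set_num_threads(p);
        double t0 = omp_get_wtime();
        double s = work();
        double t = omp_get_wtime() - t0;
        if (p == 1) t1 = t;                       /* single-thread baseline */
        printf("%2d threads: %.3f s, speedup %.2f (s = %g)\n", p, t, t1 / t, s);
    }
    return 0;
}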
Oh really. I do not need spectacular visual effects on my desktop, unless I am playing a computer game, or running scientific visualization software. My OS should not try to dazzle me, I need to get work done. The best OS is the one you hardly notice. This may involve smart use of visual effects. Some visual effects I find useful (compiz-fusion has some things I find very handy, in particular in the area of switching desktops and looking for the right open app), but most are just battery-draining eye candy. It is telling that the spectacular visual effects are mentioned before the streamlined navigation (which is useful). I want substance, not bling.
I agree up to a point. Languages do need to change, and they are in fact changing. OpenMP is a sort of "bolt-on" solution for C(++) which allows the compiler to treat for loops as for-all statements, and provides various other mechanisms for syncing. A functional approach such as in Erlang is often proposed. I do have some doubts that we can solve all sorts of problems merely with new languages. We need to learn new ways of thinking about these problems. A good language can inspire new ways of thinking, of course.
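To make the "bolt-on" point concrete, a minimal sketch (toy functions, purely illustrative): one pragma turns an ordinary for loop into a for-all, and the second shows one of the syncing mechanisms.

#include <omp.h>

void scale(double *x, long n, double s) {
    /* iterations are independent, so the runtime may execute them in parallel */
    #pragma omp parallel for
    for (long i = 0; i < n; ++i)
        x[i] *= s;
}

long count_hits(const double *x, long n, double thresh) {
    long hits = 0;
    #pragma omp parallel for
    for (long i = 0; i < n; ++i) {
        if (x[i] > thresh) {
            #pragma omp atomic    /* sync mechanism; a reduction(+:hits) would be faster */
            hits++;
        }
    }
    return hits;
}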
I have to disagree a bit here. Parallel computing is great, but at the same time it is hard work, and it is only useful in particular, data and compute intensive tasks. Memory access bottlenecks have been reduced greatly by getting rid of the front-side bus (guess why Opterons are so popular in HPC), but they are still very much present in GPUs, in particular in communication between GPU and main memory. There are improvements in tooling, but they are too often over-hyped. Besides, as with all optimization, you need understanding of the hardware.
Parallel computing is at the forefront of computer science research, and new (wait-free) algorithms are being published in scientific journals, as are improvements in compilers, languages and other tools.
Throughout its early history, the HPC field was dominated by physics simulation, with its emphasis on matrix-vector work. Now a much larger variety of code is being parallelized. People are finding out the hard way that parallel algorithm design is a lot harder than sequential programming.
As I like to tell my students: parallel computing provides much faster ways of making your program crash.
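In case anyone wants a concrete taste of that, here is a hypothetical toy with a classic data race:

#include <omp.h>
#include <stdio.h>

int main(void) {
    long counter = 0;
    #pragma omp parallel for
    for (long i = 0; i < 1000000; ++i)
        counter++;                        /* unsynchronized read-modify-write on shared data */
    printf("counter = %ld\n", counter);   /* almost never 1000000 with more than one thread */
    return 0;
}

Add reduction(+:counter) to the pragma and it behaves; leave it out and the answer changes from run to run. Faster ways of getting the wrong answer, indeed.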
I thought the Unseen University had prior art on that. Go ahead Apple, sue UU, they have a pond full of people who tried to sue them.
Doffs hat to TP
Wasn't stage five "Light fuse"?
or for a more emotional (i.e., high blood-pressure) pitch, Steve Ballmer
Not sure about that. Quite a few people (myself included) drop the default browser in Android for something with more functionality. My HTC Desire's default browser had no tabs; I tried Firefox on Android briefly but was not impressed, and run Dolphin now. There may be better browsers out there for Android, but I rather like Dolphin, so I won't change now.