UNIX has a card in its hand:
I ATEN'T DEAD!!
2658 posts • joined 24 Apr 2007
Were they using procrastinators to get the filming done quickly?
Mine is the one with "Thief of Time" in the pocket
I would have no problems remembering 5 digit or longer ones.
Regarding blacklisting: given that your bank knows your date of birth, surely they could forbid you from using a number derived from that date? That would not prevent you from using your wife's or kids' birthdays, but those are not printed on your ID, as a rule, so at least some security is added.
or actually, not that cool (the moon, that is, not the science).
Many image processing tools are easily broken up into small loads. The tools we develop are so-called connected filters, in which data from any part of the image may influence any other part. Often you want to avoid this, but for, e.g., analysis of complete road networks, you cannot predict which parts need to communicate with which other parts. To cut communication overhead down you need comparatively coarse-grained parallelism.
We have been able to devise a scheme (which we are now implementing) which cuts local memory requirements from O(N) to O(N/Np + sqrt(N)), with N the number of pixels and Np the number of processor nodes. Normally you require only O(N/Np) per node. Communication overhead drops from O(N) to O(sqrt(N) log(N)). This has moved the problem from the impossible to the "possible, but you need a coarse-grained cluster".
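For a feel of what that bound buys, a back-of-the-envelope sketch in Python (the N and Np figures below are illustrative numbers I picked, not from our actual runs):

```python
import math

def naive_pixels_per_node(n):
    # naive connected filter: every node may need the whole image, O(N)
    return n

def scheme_pixels_per_node(n, n_p):
    # the scheme above: O(N/Np + sqrt(N)) pixels per node
    return n / n_p + math.sqrt(n)

N = 10**12   # a terapixel image (illustrative)
Np = 1024    # processor nodes (illustrative)
print(f"naive:  {naive_pixels_per_node(N):.3g} pixels per node")
print(f"scheme: {scheme_pixels_per_node(N, Np):.3g} pixels per node")
```

With those figures the per-node footprint drops by about three orders of magnitude, which is what moves the problem into "possible on a coarse-grained cluster" territory.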
Of course, the first (shared-memory) parallel algorithm for these filters dates from 2008, so we still have much to learn. Other problems in science can also require global interaction between particles (gravity has infinite reach). A lot of work goes into cutting communication down, but this is often at the expense of accuracy.
We are looking at terapixel-scale image processing. One aim is to process the entire aerial photography data set of the Haiti earthquake for damage assessment, in a few hours. This can still be done at the multi-gigaflop rates provided by decent-size clusters. For the 2004 tsunami, we would need orders of magnitude more compute power to handle all the measurements required (we are talking petapixel range). These are computations that cannot possibly wait. Though exaflop processing might not be needed yet, as image resolution increases, so do the data rates, and compute power will have to keep up. Real-time emergency response requires real-time processing of vast amounts of data. The compute nodes of EC2 and similar cloud systems are far too small for our purposes: we need at least 32 (preferably 64) cores per node and between 128 GB and 1 TB of RAM per node.
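To put rough numbers on the data volumes involved (the bytes-per-pixel figure is my assumption here, single-band 16-bit imagery):

```python
BYTES_PER_PIXEL = 2  # assumption: single-channel 16-bit imagery

def raw_terabytes(pixels):
    # raw image volume in (decimal) terabytes
    return pixels * BYTES_PER_PIXEL / 1e12

print(f"terapixel (Haiti-scale):   {raw_terabytes(10**12):g} TB")
print(f"petapixel (tsunami-scale): {raw_terabytes(10**15):g} TB")
```

At 2 TB, a handful of fat nodes with hundreds of GB of RAM each can hold the lot; at 2,000 TB you are firmly in big-cluster territory, hence the RAM-per-node requirement.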
So, yes, I can see real-world, life and death applications for these kinds of compute powers. Other applications include large scale scientific simulations (think evolution of galaxies, or the cosmic web, but also earthquakes, ocean currents, or even the blood flow through a sizeable chunk of the vascular system).
it is "merely" liquid nitrogen (LiN) level. Not everyday stuff, but not nearly as impractical as previous devices requiring liquid helium (LiHe) temperatures. Think of the Josephson switch, once hailed as a breakthrough in high-performance computing, but hampered by its LiHe cooling requirements.
If we can build qubit devices based on this, the potential gains might outstrip the cost of LiN cooling.
They (the studio, producer, whatever) enter into an agreement with the estate, and now in retrospect they don't want to pay? Can a written agreement be ignored so easily?
Two balloons seems more fitting for LOHAN
I know, I know, I'll get me coat
sorry, I'll get me coat
If all OSS products were withdrawn or crippled in one fell swoop, how much of the internet would be left standing? Users do not care that Apache is OSS, or that their home wifi router runs Linux, but unplug all OSS products from the net and the wails of anguish would be deafening. Much of OSS (or software in general) is not sexy. It just sits in the background doing some vital job. That means people only begin to care when something fails.
Small runs become much more interesting this way. Ultimately, one might envisage chip making becoming commoditised the way 3D printing is. One-off parts are becoming affordable (see LOHAN). Imagine designing your own ASIC from scratch rather than going the FPGA route: design a schematic, send it off, and have a wafer-full made, mounted, and sent to you.
Far fetched now, but maybe some day.
What do you mean "a man can dream"? A man should dream!
or two, or three!
Devices like these could SERIOUSLY boost the performance of some seriously big image processing chores (multi-terapixel stuff). We require both massive storage and rapid access to LOTS of data.
Aren't we living in wonderful times, technology wise.
"The steady stream of embarrassments are being seen as evidence of a power struggle within the church as one group of elderly men pits itself against another group of elderly men to replace an even more elderly man. Possibly with an elderly Italian."
Loved that bit to bits! It almost sounds like the bad old days at the Unseen University (dead men's shoes and all that).
On the plus side, if we wait just a bit longer, maybe nature will take its course and all these elderly men will start falling over from old age. Whether or not they will meet the maker they expect (or would have us believe they expect) is another matter entirely.
It may have been the 11.x version that would not boot on various NVidia-powered laptops (specifically), not just mine. I then got the older Kubuntu (maybe the Gnome versions worked better), which on occasion froze, until that fateful evening when it froze 4 times in quick succession (whilst writing some LaTeX in emacs, not a graphics- or CPU-intensive load). Later a patch was made available for 11.x, but by then I had switched back. The new version might be fine, and we use a Kubuntu derivative on my desktop machine at work, which seems happy enough with the NVidia support. It just did not get along with my old Vaio SZ.
Kubuntu 10.04 crashed on my NVidia-powered laptop 4 times in one evening (complete freeze of the desktop, lost quite some work that evening). The 10.10 version would not even boot on NVidia (beta testing apparently done by end users of the so-called stable version). Switched back to OpenSuse 11.2, no worries since.
You just know it. Better still, rig the Boss/head of IT's smartphone/fondleslab to remotely mess things up, with an easy trail leading to said Boss/head of IT. Just one more way of getting rid of a boss.
it is Microsoft trying to persuade users not to use IE at all
Not likely, I must admit, but it is an alternative explanation.
How many femtobarns in a nano-wales?
"They couldn't hit the broadside of a barn at this dist ..."
Here we modified that to:
"They couldn't hit a barn if they were standing inside it."
With a femtobarn, this would create difficulties of course.
a whole new meaning of "bipolar" springs to mind watching that video.
Or is it just me thinking that
do both front and rear of the dress show the front of the skeleton?
Mine is the one with the anatomical atlas in the pocket
The 16" guns used shells in the order of a ton in weight, which left the barrel at supersonic speed (820 m/s), yielding some 340 MJ of kinetic energy.
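A quick sanity check of the muzzle energy via KE = ½mv², taking a round metric ton for the shell mass (an order-of-magnitude assumption; the real shells varied):

```python
m = 1000.0   # kg, shell mass (order-of-magnitude assumption)
v = 820.0    # m/s, muzzle velocity as quoted
ke_mj = 0.5 * m * v**2 / 1e6  # kinetic energy in megajoules
print(f"{ke_mj:.0f} MJ")  # → 336 MJ
```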
Of course, this is just a prototype being tested.
Why are you assuming the Denisovans were related to the French?
Monsieur, you offend me, we French are related to the tall, graceful Cro-Magnon!
And anyway, as Pratchett and Gaiman (1990) have stated, Bishop Ussher was off by at least 15 minutes.
was on my mind more, but then I am that old
6mm might still be OK, but I would start to worry at some point that the thing would break, and given where I carry my phone, that would leave Gorilla Glass shards in my groinal area.
The eyes water at the thought
Coat, for that is where I should keep an overly thin phone.
Just over half of the moon's surface is visible from my back garden with my telescope (given enough time, and the various wobbles that allow us to see a bit more than half). Using bigger kit, and the odd lunar orbiter, you can see a lot more. It is also a lot easier to walk on the moon's surface (ask Neil), than on the bottom of the oceans, thousands of meters below (ask any scuba diver). Getting to the bottom of the ocean is relatively simple, of course. Getting there alive is quite a different matter.
runs for cover
plug the iron into the shirt
Mine is the one with "Danger, High Voltage" on the back
did it taste like chicken?
Mine is the one with the "cooking with crocodiles" book in the pocket
Really amusing. Brilliant touches of surrealism. Hysterically funny, and sad at the same time. A sort of accidental literary genius.
I aten't dead!!
I will be happy if they first go to the moon. That will supply plenty of cool things to look forward to, and no, I won't be saying things like:
"Call that a moon rocket? When I was a kid they made proper moon rockets, none of this luxury stuff with 64-bit computers and touch screens. Why, when I was small we had 8 bits, and we were happy with that!"
Grumble, mutter, ....
Brilliant stuff. These guys are putting the excitement back into space research and space travel. It really reminds me of the heady Apollo days, when I sat glued to the (black and white) TV for every launch. At age seven I badgered my parents to let me see the first lunar landing (they allowed it, bless them! And yes, I am that old).
Now let's go to Mars. SpaceX is the odds-on favourite in my books.
I am about to get a 64-core Opteron-based machine (4 x 16 Bulldozer cores and 512 GB RAM; memory hungry, our applications?), and have funding for another machine somewhere down the line. Might well have a good look at the Piledrivers at that time.
Ah, so the blancmanges were really jellyfish in disguise?
Mine is the one with "Angus Podgorny Fan Club" on the back
But I am building them a mini-Dobsonian telescope (and they are helping me build it). I somehow do not think keeping them interested is going to be that hard, because they are showing promising signs of nerdiness already. Nerds are interested in things for their own sake, not just because other people think it is cool.
Why did my old school not have such cool projects? Really nice way to get kids into science, or rather keep kids in science, as from a young age they seem interested in anything, and only later, when the "snobbery of indifference" sets in in puberty, do many seem to lose interest.
Nice one, NASA!
"What's up doc?"
But was it preemptive multitasking, cooperative multitasking, or did tasks genuinely run in parallel on 10 cores?
Chemical magic is still needed, but not the way Arthur C. Clarke knew it.
I will still raise a pint to him. He had more imagination than most of us.
Wasn't that the sneeze of the Great Green Arkleseizure?
Mine is the one with the H2G2 radio plays in the pocket
Precisely. I have no problems with people playing proper R&B (Ray Charles and the like)
I remember several cartoons containing bullets, which skidded round corners (maybe in a roadrunner cartoon). Should Warner Bros. or MGM claim prior art?
will it run Crysis?
I'll get my coat
Bit behind the times or shape of things to come?
They should have used large, friendly letters in the headline.
On a more serious note, climate debate is fine; it is what science is about. If the proof is incontrovertible, the global warming advocates need not resort to strong-arm tactics. If it is not, the skeptics have every right (if not a duty) to challenge it.
Saving fuel seems a good idea whichever way you look at it. Oil has uses beyond keeping hummers on the road.
It's "one, two, many, lots"
Troll, because, well, this is Sgt. Detritus