1655 posts • joined Wednesday 24th March 2010 19:26 GMT
You must be out of your mind !
(One of my mates invented Arimidex)
"working with browser manufacturers"
You can just see the scene : a grimy factory, belching smoke, grim-faced workers enduring noise, soot and low wages, bashing out, probably with steam hammers, countless billion browsers a shift.
"how much is this like a magnox reactor then"
Not at all. This governs the working fluid in the turbine stage - which will be steam in any conventional nuclear plant.
From the BBC
"They point to studies suggesting that a 10km/h reduction in speed saves closer to 5% on fuel rather than 15%. The government's own figures suggest it could forfeit large sums in tax revenue due to the fuel savings. And the bill for changing the road signs for just four months runs to 250,000 euros."
Mind, this is the same BBC that has 130 kph translating as 75 mph or 81 mph on two adjacent lines
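The conversion is simple enough that a quick sanity check settles which line was right:

```python
# Sanity check on km/h -> mph conversions.
KM_PER_MILE = 1.609344  # exact definition of the international mile

def kmh_to_mph(kmh):
    return kmh / KM_PER_MILE

print(round(kmh_to_mph(130), 1))  # 130 km/h is ~80.8 mph - so 81 mph, not 75
```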
"That's something for Linux users to think about,"
I thought about it - a long time ago - Microsoft's cr*p doesn't go near my systems.
Problem is ..
Anywhere outside the solar system is such a dickens of a long way away that the chances of a very fast chunk of rock arriving here, finding the Earth, surviving entry AND containing signs of life seem pretty remote
For all who loved messing with the MK14....
Get yourselves a handful of 18F PICs (http://www.microchip.com/) and a programmer. It's easy to set one up to do all sorts of jobs and hook it up to a PC via a serial or serial/USB converter. One of mine is providing me with info about my house (temperatures etc.) via a small server program on my fileserver even though I'm 800 miles away in Switzerland.
Another is emulating a serial port chip on my homebrew FORTH 6809 system.
Cheap, robust, versatile and available in 0.1" pin spacings so PCBs can be made using laser printers by printing the mirror image onto photo paper, ironing that onto clean copper board, soaking off the paper and etching the board.
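A minimal sketch of the PC side of such a setup. The `temp1=21.5` line format and the function name are my own illustration, not the actual protocol; the real thing would read lines from a serial port (e.g. with pyserial) rather than a list of strings:

```python
# Parse simple key=value status lines as a hobby PIC might emit them
# over its serial link, e.g. "temp1=21.5" (format is illustrative).
def parse_readings(lines):
    readings = {}
    for line in lines:
        line = line.strip()
        if "=" not in line:
            continue  # ignore noise or partial lines from the serial link
        key, _, value = line.partition("=")
        try:
            readings[key] = float(value)
        except ValueError:
            pass  # discard corrupted values rather than crash the monitor
    return readings

print(parse_readings(["temp1=21.5", "temp2=19.0", "garbage"]))
# -> {'temp1': 21.5, 'temp2': 19.0}
```

A small server loop polling the port and serving the dict over HTTP is all the "house monitor" then needs.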
Chronically poisonous, not acutely, so rather a slow assault weapon
but it's certainly rather poor. Very fit, muscular athletes are often rather high BMI but low body fat
It's better to be the 'correct ' weight AND fit, but it seems better for health to be fit and rather overweight than unfit and 'correct' weight.
There seem to be better measures of weight. BMI is just the easiest to measure, but studies have shown that % body fat or even just waist measurement are better predictors of poor health outcomes.
Exercise does seem to be VERY important for long-term good health.
Because of chance & biological variation lots of people will experience different outcomes.
Changed it for you
PS: Could it be all that food they eat down south?
Seriously, the combination of even a little too much food with too little exercise is enough to explain it all. A single slice of bread needs a mile of walking to burn it off. Eat just 100 cals a day over what you need and that's 36500 cals a year, which is ~4.5kg of excess weight, and that's ~1.5-2 BMI units (OK, BMI is a poor measure).
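Spelling the arithmetic out (the ~7700 kcal per kg of body fat is the commonly quoted figure, and the height is an illustrative assumption):

```python
# Back-of-envelope: a small daily calorie surplus compounds over a year.
SURPLUS_KCAL_PER_DAY = 100
KCAL_PER_KG_FAT = 7700        # commonly quoted approximation
HEIGHT_M = 1.75               # illustrative height

yearly_kcal = SURPLUS_KCAL_PER_DAY * 365          # 36500 kcal
weight_gain_kg = yearly_kcal / KCAL_PER_KG_FAT    # ~4.7 kg
bmi_change = weight_gain_kg / HEIGHT_M ** 2       # ~1.5 BMI units

print(round(weight_gain_kg, 1), round(bmi_change, 1))
```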
Er, don't know but he did say ..
"The problem with chemistry is that it's too difficult for chemists" or similar
Borsa Italiana runs on the TradElect system, based on Microsoft .Net and written in C#
Although emulation is mentioned in the article I can't see any performance figures. Is it me ? The only performance mentioned relating to 80x86 is the 30% hit on emulation. I assume the main use will be running native code under MIPS variant Linux.
Yes I appreciate that - I was just puzzled by the comment "but he is very enthusiastic about the Atom family, particularly now that it runs 64-bit code," which makes it sound as though this is a recent thing.
I built the file server when I needed it - I'd much rather have an even lower powered one.
I wasn't supporting the use of cryogenic hydrogen...
but irneb's post I was replying to said :
"....as it would be to hold enough H2 in a canister at thousands of atmospheres to get it into its liquid form "
I was merely pointing out that hydrogen can't be liquefied by pressurization - above its critical temperature (about 33 K) no amount of pressure will do it. Further, even liquid hydrogen has a much lower energy density than hydrocarbons.
By the way considerable research is being done into the use of cryogenic hydrogen by BMW and others
I'm no fan of using pure hydrogen as an energy carrier unless the numerous practical problems can be overcome, maybe by on-board generation from precursors (formic acid for example - which brings its own set of problems)
Unfortunately even fuel cells have a low-ish efficiency (~50% tank-wheel) esp. when combined with the efficiency of hydrogen production. No doubt some considerable improvements remain to be made.
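Chaining the rough figures shows why the overall picture is poor - all the numbers below are ballpark assumptions for illustration, not measurements:

```python
# Rough well-to-wheel chain for electrolytic hydrogen (ballpark figures).
electrolysis_eff = 0.70   # electricity -> hydrogen, typical-ish
compression_eff  = 0.90   # compression/liquefaction losses, rough guess
fuel_cell_eff    = 0.50   # tank -> wheel, as per the ~50% above

overall = electrolysis_eff * compression_eff * fuel_cell_eff
print(f"{overall:.0%}")   # roughly a third of the input electricity reaches the wheels
```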
It's the printer drivers
It isn't !
The vast majority of printers work with Linux. Even cheapo scanner/copier/printers have printer drivers.
"H2 in a canister at thousands of atmospheres to get it into its liquid form"
Er, you can't. Above its critical temperature (~33 K) no amount of pressure will liquefy hydrogen - it needs to be cooled at great energy cost to liquefy it
Generating hydrogen by electrolysis doesn't need high voltages but it does have a low-ish efficiency. (OK if you can use the waste heat)
"They won't be making extra electricity to power ....... electricity is already available"
I suggest you don't comment on something you clearly don't know anything about.
What rubbish, electricity is generated to demand. More demand - more generation
businesses will never take OSS seriously
Having worked for one of the world's largest pharmaceuticals companies for 38 years I can tell you that OSS was taken very seriously, from Linux servers to large Linux compute farms to the 3D desktop workstations that computational chemists used. Even though some of the desktop software we used was some of the most expensive I've ever heard of, the use of Linux and Xeons gave the stability and power to run molecular dynamics or protein modeling flat out for days at a stretch without expensive proprietary Unix/hardware. This wasn't just now - this was 6-8 years ago
Re : A genuine question
OpenSUSE and KDE
the brewers. Panic now
On the other hand ..
You, supported by no-one, have ranted on without any evidence and totally ignored any real references to actual hardware and systems. I think you should attempt to get yourself up-to-date with the whole area.
Blacklight, the World’s Largest Coherent Shared-Memory Computing System, is Up and Running at the Pittsburgh Supercomputing Center
PITTSBURGH, PA., October 11, 2010 — Researchers are making productive use of Blacklight. This new system, which the Pittsburgh Supercomputing Center (PSC) acquired in July (aided by a $2.8M award from the National Science Foundation) features SGI’s (NASDAQ:SGI) newest scalable, shared-memory computing platform and associated disks. Called Blacklight, the SGI® Altix® UV1000 system’s extremely large, coherent shared-memory opens new computational capability for U.S. scientists and engineers.
Featuring 512 eight-core Intel Xeon 7500 (Nehalem) processors (4,096 cores) with 32 terabytes of memory, Blacklight is partitioned into two connected 16-terabyte coherent shared-memory systems — creating the two largest coherent shared-memory systems in the world.
These days, the Top500 list of the world's most powerful supercomputers is dominated by cluster designs assembled from many independent computing nodes. But there's still a place in the world for an earlier approach, as evidenced by a new machine called Blacklight at the Pittsburgh Supercomputing Center.
insideHPC: And that’s a single system image for all those cores?
Dr. Eng Lim Goh: Yes. It runs as a Single System Image on the Linux operating system, either SuSe or Red Hat, and we are in the process of testing Windows on it right now. So when you get Windows running on it, it’s really going to be a very big PC. It will look just like a PC. We have engineers that are compiling code on their laptops and the binary just works on this system. The difference is that their laptops have two Gigabytes of memory and the Altix UV has up to 16 Terabytes of memory and 2000+ physical cores.
So this is going to be a really big PC. Imagine trying to load a 1.5 Terabyte Excel spreadsheet and then working with it all in memory. That’s one way of using the Altix UV.
(I may be a chemist but my involvement in computing goes back almost 30 years, from SC/MP, 6502, 6809, 68000 and PICs, through PDP-11/34, Evans and Sutherland vector graphics systems and VAX/VMS, all the way to SMP workstations and 1024 cpu Linux clusters.)
"if it's not being ingested then the TNT"
It's being eaten - the TNT gets into the grass, the sheep eat the grass, but then the bacteria in their gut break down the TNT to, presumably, safer and/or more biodegradable products. These may well be absorbed/excreted/exhaled
I don't think it could have been
It was sometime in the early 80s and I was more interested in what could be done with it.
We had some fine kit in the next few years. I seem to recall an Evans and Sutherland colour vector graphics display run by a bit-slice processor and with a PDP11 as its file store.
By the time I retired we had stereo-graphics-equipped dual-Xeon workstations running Linux with access to 1024-machine Linux clusters for the 'hard' stuff. Some of the easier protein modeling took a weekend to run on dual-Xeon machines
You seem to be under the wrong impression..
A multiprocessor system that runs ONE copy of the OS is not a cluster - how can it be ?
You seem to think that a multiprocessor NUMA system is a cluster - it's not
I repeat : the PSC Blacklight is an Altix® UV1000 with 4096 cores and 2 copies of Linux
A computer cluster is a collection of computers that are highly interconnected via a high-speed network or switching fabric. Each computer runs under a separate instance of an Operating System (OS).
A multiprocessing computer is a computer, operating under a single OS and using more than one CPU, wherein the application-level software is indifferent to the number of processors. The processors share tasks using Symmetric multiprocessing (SMP) and Non-Uniform Memory Access (NUMA).
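One practical way to see the difference, as a sketch: on a single-system-image machine like the Altix UV, one OS instance sees every core, so an ordinary process can count them all; on a cluster each node's kernel sees only its own cores, and spanning nodes needs message passing (MPI and the like).

```python
import os

# On a single-system-image machine (SMP/NUMA under one OS), every core
# is visible to one kernel, so any ordinary process can count them all.
# Run on a cluster node, this reports only that node's cores.
visible_cores = os.cpu_count()
print(f"One OS image, {visible_cores} cores visible")
```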
Very reasonable plan
Let's get the spanners out Gromit !
You seem to have had bad experiences. I don't know why. I use OpenSUSE 11.2 on 5 quite different machines and have had no problems - they are all rock solid. My ultra-low power file-server is only rebooted for updates.
Ah, the A1000
To reply to some comments down the forum :
Even if the USB stick didn't automount, the same/similar vulnerability would be triggered if the user then mounted it and opened it with whatever flawed application.
The main problem here is that the file manager opened the usb stick automatically - that should never be allowed and indeed I don't think it is in most distributions.
Well in a general sense I agree, Big Yin
but it doesn't allow anything that isn't predefined. No scripts or executables.
Certainly using OpenSUSE 11.2, a USB stick or any removable device *automounts* but ONLY puts up a requester asking for a reasonable option to deal with the device (do nothing, file manager, picture viewer or whatever for a USB stick)
Oh good grief !
Are we really supposed to take this seriously ?