Re: Leaks? @Me
I MUST MUST MUST proofread my posts better. I meant to say "You leave your fingerprints everywhere...."
You leave your password everywhere, unless you are like Michael Jackson, and wear gloves all the time.
As soon as someone finds a way of lifting your fingerprints off the glass you drank your last pint from, and sorts out a method for creating a facsimile/feeding the correct hash from that into an authentication system, it will be busted wide open. And if there is a single hashing method, that will not take very long. Sounds soooooo secure to me!
If you are going to use biometrics, use something that is not generally available! But as soon as you do, the data from that biometric will leak (your iris or retina data is only safe now because you have never had a reason to have it scanned. As soon as you do, it will become generally available).
I'm also a little unhappy about putting my eye up to an optical device in a public place, because it would be possible for such a device to be hacked (like bank ATMs are now for card skimming) to do irreparable damage to my eyes (scenario, use a pulsed solid state laser to burn some small random patch of the eye. No immediate symptoms, so device may not be spotted immediately, but repeated use would degrade sight).
So, possession of a physical token, plus a changeable secret, with additional further authentication to resolve conflicts, which may include biometrics used at some trusted local identity broker (physical presence required) would be my preferred solution.
Not necessarily. It would be perfectly possible to write the command line phone interface to take the last group of numbers on the command line as a single phone number. Not normally the way you would write a UNIX-like command, but possible.
What I would say about the OP is that I would not want my phone book installed in /etc. That is for non-personal, system configuration files owned by root. Maybe something like ~/.phonebook instead.
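To illustrate both points (the `phone` command and the one-name-one-number file format here are entirely made up, just a sketch of the idea):

```shell
# Sketch: a hypothetical phone wrapper, with the personal phone book
# in ~/.phonebook (per-user dotfile) rather than /etc.
# Assumed made-up format: one "name number" pair per line.
PHONEBOOK="${PHONEBOOK:-$HOME/.phonebook}"

lookup() {
    # Print the number for a name from the phone book; if the argument
    # is not found (e.g. it is already a number), echo it back unchanged.
    n=$(awk -v name="$1" 'tolower($1) == tolower(name) { print $2 }' \
        "$PHONEBOOK" 2>/dev/null)
    echo "${n:-$1}"
}

# The wrapper then simply treats the last argument as the thing to dial:
# phone -speaker "$(lookup alice)"
```
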
It's not HP that is being sued. It's the named parties, "HP's current CEO Meg Whitman, her predecessor Leo Apotheker, former HP chairman Ray Lane and Autonomy founder Mike Lynch, along with other senior execs and HP's banking aides, Barclays Capital and Perella Weinberg Partners."
If the defendants lose, the individuals personally, or the companies named, will have to find whatever recompense the court deems suitable. It should not cost HP anything, so the only damage will be reputational. That's the problem of being a senior officer in a US company. You have a fiduciary duty to the shareholders, which puts you on the spot if they are not happy.
I don't know about the US, but I believe that it is pretty difficult to make anything stick to the auditors, because they are acting as agents, and unless negligence or deliberate fraud is proved, can absolve themselves of blame.
Give me a voice activated TV! Say what you want done.
Probably cheaper to implement as well (microphone, noise discrimination circuit to avoid feedback from the TV sound, Google voice recognition, job done).
Scotty: "Computer.... Computer"
But the cloud service will probably be paid direct to Adobe US, and bypass the UK company completely. Remember, the Internet is international!
Sodium. My chemistry teacher did this in a large Pyrex bowl. He said he had done it many times before. This time, the sodium ball stuck to the edge of the bowl and caused the bowl to 'explode'.
Fortunately, nobody was hurt, but it did shake everybody up, including the teacher.
If you look at the 'front', there appears to be a ring of what look like LEDs, possibly multi-colour, which may work like those in a Solsuno watch, replacing the main hands of the watch. That would be pretty cool.
Damn. Beaten to the punch!
As caesium atomic clocks use the stable isotope caesium-133, it is not radioactive, and there is no danger of being accused of moving nuclear material while travelling.
It says 1kW.
When I installed an electricity monitor about four years ago, I was appalled by the base load of the house. It prompted me to go through all the devices, thinking about what was being left on or on standby that should have been powered off (seriously, CRT tellies in standby can draw 60-100W).
It also encouraged me to identify all of the lights that are on for large parts of the day and to make sure that I used the lowest-power bulbs that did the job (my house has people in it 24x7 at the moment, because my wife does not work and all the kids have moved back in! Seen the sitcom "My Family"? It's like that).
Since then, I have also had nearly all the old CRT tellies replaced by LCD ones (well, it was as good an excuse as any, and an easy Christmas present for the kids with benefits to me), moved my firewall onto a laptop, rationalised the number of devices needed to drive the home network, and made sure that the freezer is kept defrosted (it really makes a difference), and also used smart-power strips to remove the power from several devices when one is put into standby.
I just wish that more devices had physical power switches (and I can't use the switch on the socket because in most places I have more than one device plugged into the socket, and I want to, for example, power the telly down while leaving the Sky box running).
My base load is still around 500W, and I'm struggling to identify where that is going. Probably not something that a smart meter would help with unless they also supplied per-plug metering devices.
It does make you wonder when you can tell that one of the kids has left their gaming rig on overnight to download some game patches, and you can see 300-400W of additional drain. And also when the gas central heating kicks in, and the electric pump starts drawing 700-800W of power itself.
If only I could persuade my wife that the tumble-drier really is one of the biggest expenses. She will not accept that 2 hours at 2.5kW uses far more energy than 24 hours at 30W for the firewall.
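For anyone who wants to check my sums, here they are as a couple of lines of shell arithmetic (figures as above):

```shell
# Energy comparison: a 2-hour tumble-dry at 2.5 kW versus the
# firewall running 24 hours at 30 W.
dryer_wh=$(( 2 * 2500 ))     # 2 h x 2500 W = 5000 Wh (5.0 kWh)
firewall_wh=$(( 24 * 30 ))   # 24 h x 30 W  = 720 Wh (0.72 kWh)
echo "one dryer load = $(( dryer_wh / firewall_wh ))x the firewall's daily energy"
```
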
Network Security 101.
Thou shalt, under pain of ridicule by thy peers, turn off all services that pass data, especially authentication information, across the 'net unencrypted.
At the time, if you wanted a serious calculator that would work for years, HP was your best bet. Their calculators were the Rolls-Royce of calculators.
Also at the time, HP were a major computer manufacturer as well as a medical and test equipment manufacturer (which is where they started). Printers were a bit of a late addition to their product set.
I agree about the EeePC 701, and mine is still working and in use running Ubuntu.
the '-r' switch in ls? As in reverse the order of the listing?
I use it multiple times a day. It reverses the order of the search, and is incredibly useful when used with the '-t' flag to find the most recently changed file, as in 'ls -ltr | pg' (or less, if you like).
You could just let it run to the bottom, but that's not what I want to do.
One thing which has gone wrong, IMHO, is that the order in which files are sorted has changed with the advent of multi-byte character sets, such that the sort order is affected by the collating sequence of the NLS code page you are using. Add to this the stupid (again IMHO) default of GNU ls to ignore non-alphabetic characters (like '.') when sorting, and the order shown becomes almost complete nonsense for any practical purpose.
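You can see the effect with sort, which follows the same locale collation rules that GNU ls does; the UTF-8 locale name below is an assumption and may need changing for your system:

```shell
# Strict byte-order collation, the traditional behaviour:
# '.' sorts before letters, and capitals before lower case.
printf 'a.txt\n.profile\nab.txt\nA.txt\n' | LC_ALL=C sort

# Under a UTF-8 locale (if installed), punctuation and case are
# weakly ordered, so the same names come out in a different order:
# printf 'a.txt\n.profile\nab.txt\nA.txt\n' | LC_ALL=en_GB.UTF-8 sort
```
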
Life used to be so much more simple.
No. He's right.
If a problem in an LTS release is not fixed in the first year of the life, it almost certainly won't ever be.
One of the problems in 10.04 that gathered a huge number of sufferers was the interface to MusicBrainz, which is used to identify the CD you're playing or ripping. The fact that it would not work in the library implementation that shipped with Lucid meant that you could not use it to rip CDs without keying in the album and track info manually, regardless of the ripper you used (they all relied on the library implementation in Ubuntu).
After a year of relative inactivity, the responders on the Ubuntu bug tracker suggested people upgrade to a newer release of Ubuntu. After 2 years (still in the support window for Desktop), they closed down all of the reports as "will not fix". And the stupid thing is that the problem was well known, and could have been worked into the repository with comparatively little effort. People even offered to do it, but the updates were not put back into the source tree.
None of the Canonical responders were prepared to put the effort into what they saw as an old version. So I have to ask, what value is there to LTS releases? I'm not asking them to back-port features from newer releases, but I would expect them to fix known bugs, even if they are not security related.
To fix the problem, I had to update to Precise, with all the collateral pain that caused. Now using Cinnamon on Precise relatively happily. I try Unity every now and then (it's still installed), but inevitably go back.
I'm fast becoming convinced that Canonical do not want Ubuntu to be seen as a Linux distribution, but as an operating environment that happens to run on top of Linux, in the same way that OSX is an operating environment running on top of Mach and BSD.
They want it to be distinct and different, and that appears to extend to being prepared to piss off long term UNIX/Linux users.
This may be a clever strategy. If they can get the wider population to adopt Ubuntu and accept it for all its strengths as an environment in its own right, rather than as a Linux distribution, then it may be able to ditch the (undeserved, IMHO) stigma that appears to blight Linux in the eyes of the community of non-Linux computer users.
But it sure does make me cross.
I'm just waiting for comments like "Oh. Ubuntu. That used to be based on Linux at one time, didn't it?". As soon as that perceived distance has been achieved, then maybe, just maybe, Mark Shuttleworth will be content.
Or maybe I'm just an old fart, too out of touch to be relevant any more.
Cheques in the UK remain a valid payment method for transactions with businesses that choose to accept them. What has been phased out is the 'cheque guarantee' function of your card.
What has happened is that most major retailers have chosen not to accept cheques (it is their choice), although they did it on the back of the presumption that cheques would be phased out. In the end, they weren't, because of the lack of a non-cash, disconnected payment method that many older people and particularly charities complained was missing. The Payments Council concluded that there was still a role for cheques (http://www.paymentscouncil.org.uk/media_centre/press_releases/-/page/1575/).
After making such an inaccurate statement about cheques being withdrawn, I wonder whether the icon you've chosen is actually justified.
If you were to assume that being present on a website shows ownership, then what prevents someone taking your image, forging some metadata that 'proves' it was taken earlier than you posted it, and then accusing you of stealing the image yourself? Being posted to a web site is just not enough, especially if you are dealing with Instagram or Google.
When it comes to identifying photos, 'diligent' will mean either a quick check for the presence of metadata, or an incredibly huge and impractical manual search of millions and millions of images.
As far as I know, automatic comparison of pictures is still an inexact science, which means that it will be very difficult to automate the process of working out whether a picture is the same as another picture that someone claims ownership of. It's probably easier with scanned film than digital images, because you can look for grain patterns and defects, but even those can be altered with digital filters.
Considering what is done routinely to crop, rotate, change the colour palette, touch-up and resize images, you would have to have some means of automatically and reliably hashing a picture using the major distinctive features and be able to discriminate between different pictures of the same subject.
I'm sure there must be some major research going on, but I would think that any research will mainly be working on identification of the subject, not proving that two images are the same. Whether one can come from the other is a moot point, but without this technology, I would be much happier without this legislation unless you make it a major crime with appropriate punishments to strip or forge metadata.
Mr Dabbs was just a bit too late.
As a student in the late '70s and early '80s, I had some of the earlier Amstrad hi-fi, including an IC2000 amp and an IC3000 tuner (and a JVC KD720 tape deck, and a turntable from Strathern, a failed Northern Irish employment project). I also had a set of Comet speakers, which were the weakest components but were the same as Amstrad speakers of the time, and definitely had two drivers, although they were replaced by a set of Keesonic Kubs, which I still have today (great little bookshelf speakers).
Now I know it was not up to the grade of my friends who had Rega, Quad, Tangerine, A&R and Mordaunt-Short kit, but it was definitely better than the so-called 'Music Centres' or pseudo stacks that many of my friends had. It was a good compromise between cost and quality.
The follow-up Amstrad kit that came in hardboard boxes with tin foil glued on to make it look like metal was crap, however. The switch from discrete power transistors to integrated circuits for the power amplification was the point where it went downhill. (BTW, the IC in the IC2000 amp referred to a single IC in the pre-amp stage, not the rest of the amp.)
It's not all that clear cut here in the UK. Considering how small the British Isles are, there is a huge variation in regional dialect, from Scots to Welsh to Cornish, with the industrial regions of the Midlands, Liverpool and the North East all having broad and very distinct accents, some of which are as difficult to understand as your brand of English.
What Americans often think of as British English is an artefact of everybody wanting to talk like the Royal Family (often called the Queen's English), promoted mainly by the BBC since radio and TV came along. This is a real effect, but even around London we have Cockney and Estuary English. Accents and dialects are slowly dying, but they're not dead in England yet!
Don't jump to too many conclusions. In many cases, it is British English that has changed, and American English that has stayed the same (I'm not talking here about unbearable brashness, street talk or Spanglish here - they're American!).
Many words used in America are hangovers from older forms of English, and some of the spellings that we think of as American are just archaic use of English.
Indeed, I've heard it said that if you want to find out how people spoke in 17th Century England, you should visit the southern Outer Banks in North Carolina, where people have lived with few outside influences for several hundred years. Just be quick about it, because they've got satellite television now!
I know that the LTS periods for desktop and server do not match, but I fail to see the difference when it comes to the repositories. I have a 10.04 desktop build (although it is used more like a server, but I do directly log in to it relatively regularly) in my environment, and it is still getting many updates from the repository.
I know that there is a good chance that some packages will not be updated (in particular, Chrome and Firefox updates do not happen, or happen infrequently), but the kernel and the basic OS appear to be receiving patches.
So I have taken the attitude that my 10.04 system will remain at that level for the foreseeable future. I believe that I will get basic OS security patches as long as the server LTS release is maintained, and most of the user-access stuff is sufficiently stable that I'm not overly worried that I may not get updates. Same goes for my Asus EeePC 701, which is really too small (4GB internal SSD - not upgradeable) for anything later than 10.04.
For the record, I'm using 12.04 with Cinnamon on my laptop, and I am getting by, although I really would like to re-instate the pre-Unity elevator boxes on my terminal sessions. The pop-up up/down/drag slider button thingy just annoys me when it disappears.
Should have used the Joke icon. Plus, I think you mean "netbook" rather than "notebook". Maybe spell-checker error.
Ah, but why did DEC become a target for a Compaq buyout?
They had been in decline for some time, but licensing problems with Intel (Intel infringed on some DEC patents IIRC, but DEC suffered as a result - never really understood how, but they did) and problems further developing Alpha meant that DEC's share price dipped, and Compaq made an offer the shareholders could not refuse.
Compaq probably over-reached themselves, which, coupled with a loss of market share, meant that they became weak themselves.
but as the entirety of UK Government expenditure comes from taxes or sales of national assets, everything that the government does, including many things that are directly for "national security" are at our expense.
I totally agree that "national security" is hugely overused without the correct justification. I suspect that this is because some MPs are prepared to rubber-stamp anything that mentions the term without asking whether it is being correctly used.
Of course, if you do a global substitution to replace "national" with "Government", you may get a different picture.
that what has been standard practice in the Mainframe/Midrange platform market is finally becoming a reality in other architectures.
I'm an IBM pSeries and Power person, and we have been able to do remote IPL, console, and configuration/management for years. OK, you paid a premium for some of the features, but many of the base capabilities have been built in for well over a decade, and now include IVM/PowerVM. The RS/6000 F40 had a service processor when announced in 1996.
It does help that most of the system admin can be done from a command prompt, though.
One of the customers I worked at had a majority of their critical servers in lights-out, mainly unattended sites scattered around the country from before the Millennium.
IIRC, the original intent of the 1984 Data Protection Act was mainly to enable people about whom information was being kept to make sure that it was correct, and to know what it was being used for, rather than for any other reason.
It may seem difficult to believe these days, but the idea of data-mining was so far off the agenda as to be unimportant, at least outside of the Security Services. In 1984, computer systems were rarely networked, and datasets were stored in isolation from each other. Client-server computing applications were still relatively rare, and discovering new aspects of people's lives by joining datasets together was so difficult as to be nearly impossible.
I remember looking at the requirements to be able to change all copies of incorrect data with a degree of horror, as I had no practical way of re-writing data on system backup tapes.
Fortunately, the only personal records I was responsible for were the login details of users of the computers I administered. The Data Protection Officer for the Polytechnic where I was working judged that, with a small amount of change, the login details (held without any identifying information about the user other than their name and the course they were on, which was implied by the naming convention) were exempt from registration, although we did go through the exercise of filling in the forms to document that the exercise had been completed.
I was immensely grateful for this.
The only thing I can assume is that the reason you're not running Linux on the laptops is because you haven't really tried, even though you say you run it elsewhere.
I've put Ubuntu on a huge variety of laptops and netbooks from Asus, Lenovo, HP, Samsung, Dell etc, and it just works. No additional drivers, no command line tweaking, all sound, video and network devices at least working. Maybe not the accelerated graphics, and maybe not Bluetooth, but enough to use. Certainly better than a raw XP install from MS media, where almost nothing works without the vendor's driver disk.
You missed the bit where MS had decided that XP was at end-of-marketing, and not only did they extend XP, they actually introduced a new version specifically for the reduced memory in netbooks. This was either a shrewd marketing move or a cynical U-turn, depending on your point of view.
It would have been interesting to see whether a more mainstream Linux variant would have made them any more successful.
I'm still using my early eeePC 701 with Lucid on it on a regular basis, and would like to find another to replace my current firewall system.
It's not quite the same. In a typing pool, they would often type from dictation, either via a dictation machine, or through the phone system (or in the really old-fashioned office, by a secretary taking shorthand). The typists needed to be able to correct grammatical errors, and spell correctly, and also know how to format a letter.
Data entry is normally repetitive, very rarely free text prose, and extremely boring. And it's slowly being replaced by OCR and mechanical form reading, or direct entry over the Internet anyway.
That's probably true now, although as standards progress, it means that you keep having to update your adapter (or phone or tablet) every time a new codec becomes 'standard'.
But it is not just the software. In your comments, you're assuming that the people who write an alternative implementation have ripped off your code.
This is always about software patents, not the code itself. OK, you write a nice implementation, your code should be protected, I totally agree. But the algorithm used should be open, so that someone can provide an alternative implementation. If you can prove that they copied your code, then I would support you suing them through every court in the land. But if they wrote their own, through their own efforts.....
It's a serious dilemma, I admit, but all the time we have ambitions to produce a truly free operating system suitable for everybody, then we have these problems.
I could (although with some reluctance) support going to a model where the OS is free, but the licensed codecs you need have a reasonable cost associated with them (in line with the H.264 charges that seem entirely reasonable). That is how it was in the early Windows days (remember Windows when it could not play media out-of-the-box and you had to buy software to play music and video), although too many people just copied MP3 and DVD packages on Windows.
If we go to this model, it should be clear that this is the case to users of all operating systems, and maybe other OS vendors should be prevented from providing the software as part of their OS offerings. But this has not exactly been a totally successful strategy in the Browser rulings.
And as long as the alternative implementations abide by the rules on the use of LGPL toolchains, this should not fall foul of any open systems licensing, either.
I got it wrong. It's 100,000 units, not 10,000.
This is a quote from the MPEG-LA H.264 License terms summary.
For (a) (2) branded encoder and decoder products sold on an OEM basis for incorporation into personal computers as part of a personal computer operating system, a Legal Entity may pay for its customers as follows (beginning January 1, 2005): 0 - 100,000 units/year = no royalty (available to one Legal Entity in an affiliated group); US $0.20 per unit after first 100,000 units/year; above 5 million units/year, royalty = US $0.10 per unit. The maximum annual royalty (“cap”) for an Enterprise (commonly controlled Legal Entities)
is $3.5 million per year in 2005-2006, $4.25 million per year in 2007-08 and $5 million per year in 2009-10, and $6.5 million per year in 2011-15
All rights to this text belong to MPEG-LA (just a disclaimer to avoid any copyright issues)
So, 20 cents for every shipped copy between 100,000 and 5,000,000, and 10 cents after that, up to a maximum of $5 million. That's quite acceptable if you are incorporating it into a product costing $20, but not so good if you want to include it in a popular free Linux distribution. I wonder whether the fact that you are not 'selling' Linux is enough to get out of the "sold on an OEM basis" part of the clause?
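As a sanity check on those tiers, here's the sum for a hypothetical 6,000,000 units/year (my example figure, not MPEG-LA's), worked in cents to keep the arithmetic integral:

```shell
# Royalty under the quoted OEM tiers: first 100,000 units free,
# $0.20 per unit up to 5 million, $0.10 per unit above that.
units=6000000
tier1=$(( (5000000 - 100000) * 20 ))  # 100k-5M units at 20 cents each
tier2=$(( (units - 5000000) * 10 ))   # above 5M units at 10 cents each
total=$(( (tier1 + tier2) / 100 ))    # back to dollars
echo "royalty: \$${total}"            # $1,080,000 - well under the cap
```
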
I'm perfectly happy with software being paid for running on Open Source platforms, but patenting the codecs such that you can't legally provide them as part of a free (as in free beer) OS puts huge amounts of leverage against projects who want to provide a free OS.
The problem is that if you stay within the law, and don't ship what may become the de-facto standard for video, then something like Linux will always be seen as not for general consumption.
Alternatively, if you ship the codecs as part of a distribution regardless, so that the experience to the end user is good, then MPEG-LA can then demand a payment from you. You have no revenue stream because you are providing the software for free, and cannot pay unless you are Mark Shuttleworth (who paid for an H.264 distribution license for Ubuntu, and got heavily criticised for it).
The problem clauses in the H.264 licence are the volume clause, which says something like the distributor has to pay a licence fee per copy deployed if they ship more than 10,000 copies, and the one that says you have to pay a fee if you use the encoder to produce commercial videos.
Bearing in mind how viral Linux distributions can be, how do you measure how many times it has been deployed? I download one install image, use it to install thousands of systems, and offer re-distribution from my web site. How is that measured? And who should pay?
And what qualifies as commercial? If one of my kids records the neighbour's cat doing something comical and uploads it to YouTube, and Google attaches adverts, is the video for commercial purposes? Should I pay for the encode? Should Google, even though they may not have encoded it?
Licensing like this has been a legal minefield for open software since the days of MPEG2 Layer 3 (aka MP3) and GIF. My point is that it would be so much better if the codecs (or even just the algorithms) were available under a permissive licence.
Let's hope that MPEG-LA are more generous about the licenses, although I would be surprised if they were.
And to pre-empt people who say that H.264 was freely available, I suggest that you look at the commercial encoding and decoder volume distribution clauses in the license agreement.
I kept being phoned up by my Service Provider saying I could upgrade my phone.
At the time (pre-iPhone), my answer was another question "What can I upgrade to?"
When all they could offer was a Windows Mobile phone, I normally answered "And that is an upgrade?" Eventually, I went Android, although an odd quirk was that because I didn't take an upgrade when I was entitled to it, the discount I was offered when I eventually did was less than it would have been if I had upgraded promptly. Bizarre!
I was very happy with my Treo, and it is still my fall-back phone. And now, over 6 years later and still on its original battery, it lasts longer on one charge than my Sony Xperia.
Was it not "The Eggman" in the Japanese version, rather than Dr Robotnik?
Sonic also spawned at least two cartoon series on TV. Not quite the 360 degree marketing that Nintendo had with Pokemon, but still quite high market penetration.
I can still remember the nasal "I'm waaaaiting" from the cartoon series, which was akin to Sonic crossing his arms and tapping a foot if you didn't move in the games.
This was a really strange blend of Sonic and pinball, and yet it worked quite well.
When I got my oldest son a Dreamcast (picked it up at 00:00 on launch day), together with a copy of Sonic Adventure, I remember watching in dizzy, sickened awe at the speed of the traversal of the 3-D ramps and loops with the background swooping and rolling around. I found it unbelievable that it was able to render the backgrounds as fast as it did.
Whilst Sonic was undoubtedly a triumph, Sega had a stable of other exceptional titles that over the years included Alex Kidd, Ecco the Dolphin, Panzer Dragoon (particularly Saga), Nights into Dreams, and my personal favourite, Burning Rangers (the absolute heyday of the Saturn IMHO).
I'm sure I still have a picture in the 1976 "Electronics Tomorrow" special edition of the magazine Electronics Today International (the one that also had pre-release articles about Star Wars) of a PET 2001 prototype that was curvy.
When the production PETs came out, I thought the steel case and chiclet keys were just plain ugly, although that did not stop a group of us on the college staff-student consultative committee from trying to get one bought for the college. Unfortunately (for us), the council voted for a mini-bus instead. In hindsight, that was probably the best choice, but it did not seem so at the time.
I could not believe how many attempts at passing the stats the psychology students at my university were allowed before they failed. IIRC, they had at least six chances, and I only had two at the maths supporting my computing/electronics course. And their stats were really simple! I had done the same at A-Level maths.
It's not that one is better than the other. I suspect that, from a purist's point of view, the most elegant, sophisticated and efficient code is written by old-school computer science graduates - you know, those who actually understood the reasons for doing things, not just the learn-by-rote of current teaching methods.
But I would also accept that the code that most resembles what is required for a particular problem may be produced by people who don't have a computer science degree.
The thing is that someone who has been taught computer science probably has a relatively poor understanding of problems that are not directly related to computers. So if you are programming a system for another discipline, someone with outside skills who has cross-trained to get relevant programming skills may not turn out the best code, but may have a better understanding of the requirements, especially if they have applicable knowledge of the problem domain. This is not a hard-and-fast rule - there will be exceptions - but computing is a terrifically introverted area of work.
A previous poster pointed out that the best technical writers are not computer scientists, and I agree. Writing good documentation is a totally different skill from writing good code. Someone with a basic technical understanding, access to the code writers to ask technical questions, and good writing skills will in almost all circumstances turn out better documentation than the code writers themselves. At least the spelling and grammar will be correct!
and if anybody is likely to have problems, it is me.
My closest English transmitter is Mendip (channels 49-58), but as that is about 40 miles away, I currently need a multi-tap amplifier to make sure that all the TVs in the house get an acceptable signal.
Within a few degrees of arc of the direct line to Mendip, and at a distance of no more than 600 metres, there are two cell base stations run by operators who won slots in the 4G auction. So there is a great chance that if these base stations start operating in the 800MHz band, my TV aerial and amplifier combination will pick up a great 4G signal, with little chance of a directional antenna alleviating the problem. And if a filter attenuates the TV signal too much, it will probably degrade it to the point where it is no longer viable.
I'm trying to get information from the operators (found using http://www.sitefinder.ofcom.org.uk/search), but so far they have not answered my queries. I have several TVs in the house, and want to find out the impact as soon as possible.
I just hope that I don't have to wait until too late to find out.
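As a rough sanity check on the frequencies involved, here is a sketch. The 8 MHz European UHF channel raster starting at 470 MHz for channel 21, and the 791 MHz lower edge for the 800MHz-band LTE downlink, are my assumed figures, not from the post:

```python
# European UHF TV channels are 8 MHz wide, with channel 21 occupying
# 470-478 MHz. The cleared 800MHz 4G band's downlink starts at 791 MHz,
# so a filter protecting Mendip's top channel has little room to roll off.

def uhf_channel_edges(ch):
    """Return the (lower, upper) edges of a UHF channel in MHz."""
    lower = 470 + 8 * (ch - 21)
    return lower, lower + 8

top_of_tv = uhf_channel_edges(58)[1]   # upper edge of Mendip's highest channel
lte_downlink_start = 791               # assumed 800MHz LTE downlink lower edge

print(f"TV passband must reach {top_of_tv} MHz")
print(f"Guard band for the filter: {lte_downlink_start - top_of_tv} MHz")
```

On those assumptions the filter only has about 17 MHz between the wanted TV signal and the 4G downlink, which is why a sharp (and therefore lossy) filter may be unavoidable.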
Even that's not really possible. They use an evaporative cooling system as much as they can (they really are into providing the perception that they use as little power as possible - which could be understood if only you knew who they are). The only time that this cannot be done is when the outside temperature is too high, and this is likely to be the time that heating anything is least required.
I got my information directly from the building power and cooling engineer/manager, and if he can't work out a way of getting something useful back, I'm certain it can't be easy.
The heat that comes out of these systems is regarded as low-grade. What this means is that the temperature differential with the surroundings is not high enough to make the heat particularly practical to use.
Roadrunner is air-cooled (see the picture, spot the hot-cold aisles and no water-cooled rear doors), so the heat will be picked up by the air handlers.
I've worked with two generations of IBM water-cooled supers, and the output temperature of the water is around 25 degrees centigrade (slightly hotter for the newer system). This is colder than the ambient temperature of the halls (it's a power-conscious organisation that is experimenting with running the machine rooms hotter than you would normally expect, to save power). That makes it less than lukewarm, and certainly not hot enough to wash your hands in. The cooling works by chilling the water before sending it to the super (input temperature around 13 degrees centigrade), so the cooling is actually applied at the wrong end for heat recovery.
Of course, you could use heat pumps to concentrate the heat, as ground-source heating systems do, but there is a law of diminishing returns at work. If it takes more electrical power to concentrate the heat than would be needed to generate the same amount of heat directly, there is no gain.
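The break-even point can be sketched in numbers. A heat pump beats direct electric heating only while its coefficient of performance (COP, heat delivered per unit of electricity) stays above 1; the theoretical ceiling is the Carnot COP. The 60°C target and the 0.5 fraction-of-Carnot efficiency below are my assumptions for illustration; the 25°C source is the water temperature quoted above:

```python
# Carnot COP for a heat pump lifting heat from t_cold to t_hot (Celsius).
# Real machines achieve some fraction of this theoretical limit.

def carnot_cop(t_cold_c, t_hot_c):
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return t_hot_k / (t_hot_k - t_cold_k)

source = 25.0       # degrees C: the super's return water
target = 60.0       # degrees C: useful for heating (assumed)
efficiency = 0.5    # fraction of Carnot a real unit manages (assumed)

cop = efficiency * carnot_cop(source, target)
print(f"Estimated COP: {cop:.1f}")   # above 1 means it beats direct heating
```

So a heat pump could in principle still win on pure energy arithmetic; the real diminishing returns come from the capital cost of the plant and finding somewhere nearby to use the heat.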
is not that appropriate as a title for a picture showing fibre cables! Those are fibre InfiniBand cables, with electro-optical transceivers (like SFPs) at both ends. The colour is a dead giveaway.
Undoubtedly there will be quite a lot of copper in the system, but not in the picture.
...availability vs. cost.
There have been highly available and even uninterruptible services around for years, but you have to pay for them, and they don't come cheap. Nor do the staff to set them up and run them.
Microsoft probably have an HA offering now, but I expect that even they will charge a premium.
Segregating the function, so that you can put your information-distribution systems on simple, small, cheap, redundant servers in front of your actual service machines, can help with the appearance of a service being available (as well as improving security), but if you truly want high availability, it's going to cost.
He's obviously not seen what birdshot fired from a shotgun can do. If you can hit a clay pigeon or a bird at 20 metres, which is well within the range of most shotguns, you should be able to do enough damage to bring one of these 'copters down.
And if they tried to retrieve it, that would be trespass.
I would dispute that seti@home is an HPC workload. It is a distributed workload, partitioned into units that can be worked on in isolation from each other.
There is a huge difference between a distributed workload and a proper HPC workload, and people like weather agencies, atomic research institutions etc. would be only too happy to explain.
Proper HPC needs a fast, low-latency interconnect, so that a single model can be broken down into multiple threads spanning many systems, all constantly passing data between each other.
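The difference shows up in the data dependencies. Here is a toy serial sketch of a 1D diffusion stencil (my own illustration, not any particular code): every cell's next value needs its neighbours' current values, so if the grid were partitioned across nodes, each partition would have to exchange boundary cells on every single timestep — unlike SETI work units, which never talk to each other at all:

```python
# Toy 1D diffusion stencil on a periodic grid. Each update reads both
# neighbours, so a grid split across machines would need a boundary
# exchange every step. That constant traffic is what the HPC
# interconnect exists to serve.

def step(grid):
    n = len(grid)
    return [
        grid[i] + 0.25 * (grid[(i - 1) % n] - 2 * grid[i] + grid[(i + 1) % n])
        for i in range(n)
    ]

grid = [0.0] * 8
grid[4] = 100.0          # a single hot cell
for _ in range(10):      # heat spreads only through neighbour coupling
    grid = step(grid)

print([round(x, 2) for x in grid])
```

Scale that coupling up to billions of cells and thousands of nodes and you see why weather and nuclear codes live or die by interconnect bandwidth and latency, not just raw FLOPS.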
BTW, IBM P7 795s are quite cool, but P7 775s are even cooler (literally: water-cooled cooler!). I know, because I work with a couple of clusters' worth.