I always like when people put flammable materials...
... close to equipments that may overheat or send sparks around...
Never let it be said that techies aren't agile or innovative – and perhaps a little slapdash – when solving problems on a tight budget. Only this week, El Reg was given early sight of a pre-patented swanky new cooling system. IT departments at businesses that asked to remain anonymous, for obvious reasons, have devised a …
"...The fools! They should use inflammable materials....."
Careful..........Americans might read this and take your words to heart!
They won't know what to do.........
As a Canadian, I can appreciate the distinction, but just in case, I did notice some stainless steel tanks with an "Inflammable Liquid" logo sticker on them that I'd MAYBE suggest using as an IT gear coolant. Just spray it on the hottest parts.....and see if it cools down your gear to a great enough degree!
That's why you always know where the nearest fire extinguisher is
They are all currently locked away in the H&S dept because we don't have an official document specifying whether the test date is month/day/year or day/month/year, so the test label is invalid, so the extinguisher can't be used.
"Environmentally friendly recycling, and since they couldn't be used as extinguishers H&S can't complain"
They would probably ask for and get a budget increase approval to have them painted a special non-fire-extinguishy colour with a large notice on saying "NOT A FIRE EXTINGUISHER" and send all staff on a half day compulsory H&S refresher course to make sure everyone was aware that the new recycled door-stops are NOT FIRE EXTINGUISHERS, all at a far, far higher cost than just buying in a few quid's worth of door stops.
No, it would just not have an up-to-date test label signed by the correct authority - the absence of this renders it unusable, so there would be no confusion.
We had to replace all the simple colour-coded red/black/cream/blue extinguishers with identical red ones with the type written in small letters on the label. So simply labelling these as contents "none" for use on "no fire types" would be sufficient and fully in line with H&S policy
Couldn't we just take all the old extinguishers, empty them and use them as door stops?
I know a guy who took a rather large CO2 extinguisher and converted it to a beer tap. Rather impressive to draw a beer and quench one's fiery thirst.
When I started working at one company, I saw that they were storing half-empty paint tins in the same cupboard as the fuse box. That cupboard was located between our offices and the stairs, cutting off escape if it caught fire. I got my manager to move them.
For these reasons, when purchasing network equipment, I now take operating temperature ratings into account. Some equipment can accept very high temperatures, making additional cooling during summer unnecessary.
I never had to use a fan or had heat related outages, but I know these small patch racks can get quite hot at times.
One company I worked for had a rack in a south-facing 3rd floor room with large windows. The CEO got air con in his office, but he declared that the IT had survived this long without AC, so it didn't need it.
The "trick" was, the first person in in the morning opened the windows wide to allow the air to circulate... :-S
I installed a thermometer in the room and in the rack. Average summer temperature in the room was 38°C. The middle of the rack was approaching 60°C!
Interestingly, we only had one server throw a hissy fit, an 8 year old HP server. The rest (only 6 years old) all ran stably throughout the summer! We did however borrow an air compressor in June and cleaned the dust out of every machine in the rack, 6 years worth of dust isn't good for the lungs!
6 years worth of dust isn't good for the lungs
How about 10 years-worth of cat hair and associated dried mud? That's what I vacced out of an old server once (at home I hasten to add).
These days, my home computer room door is kept shut - prompted by one of the cats being sick all over the network switch. Which meant buying a new one since half the ports stopped working once the stomach acids had done their work on the circuit board..
 Much to the annoyance of senior female cat - she regards any closed door as a personal affront to her dignity.
 Just as well that many years ago, we had aircon fitted to that room - back in the days when I was a contractor and had my own limited company. Which paid for my then motorbike and the computer room aircon.
At one place I worked, the in-house facilities guys turned one end of an office block into a fully airconned computer room with raised flooring:
1) They boxed-in a row of radiators behind drywall - but didn't shut them down, so from the getgo the room never reached the expected temperature. The aircon guys spent ages recalculating things, checking equipment etc., before someone commented 'does this wall seem warm to you..?' Out came the padsaw, holes were cut and valves were turned. The room temperature dropped, but the ugly holes were never fixed.
2) They put the room stat on a pillar next to a window so it was affected by outside temperature and sunshine. The room went into superchill mode when the sun was shining, and on very cold days the aircon would hardly kick in and the room stayed toasty. When someone put 2+2 together, the stat was relocated.
"Some equipment can accept very high temperatures, making additional cooling during summer unnecessary."
If you set up your patch rack correctly then any internal fans will be more than adequate to push airflow through and out of the cabinet - and if you think they're going to get hot then specify one with appropriate fans and acoustic baffling in the first place.
One way of achieving this is to ensure that the first places to lose network connectivity in the event of overheating are the offices of the beancounters, sales and HR.
This is normal for us Brits, I remember building a server with bits from Simply Computers many years ago. It had a ton of 9.2GB SCSI drives in it that ran hot enough to fry an egg on; obvs we took the front panel off and stuck a desk fan in front of it to blow through. Worked like a champ for years :P
Those sound like Micropolis disks. No need for a space heater in winter when you've got a couple of those in your workstation.
That, or Quantum Fireballs. The most aptly-named disks I ever used because they always felt like they were going to spontaneously burst into flame after a few hours use.
Icon, because toasty disks - AIIIIEEE !!! hot hot HOT HOT HOT !!!
No need for a space heater in winter when you've got a couple of those in your workstation
Ditto for a Dell server with 8 15K RPM drives in the front. The acoustic case mostly muffled the noise but didn't do much for the heat generated. Still, it kept upstairs nice and warm in the winter..
the original 4GB Seagate Barracudas (long before Quantum started naming drives) _required_ forced ventilation and not having it meant loss of warranty. At the time at 7k rpm they were the fastest drives on the market and sounded intimidating as they spun up (they were also $3500 apiece)
I applaud your use of a cable, rather than the more traditional gaffa tape, to secure the fan in position. If, however, you accomplished this job whilst standing on a stepladder rather than balancing precariously on an office chair, then you have lost all respect that I might have had for you.
The issue is that we don't often get hot weather in the UK, so proper cooling would be a "waste of money". ISTR that UK Elf n Safety regulations specify the lowest temperature staff can be made to work in, but not an upper limit.
Apparently the business being shut down by overheated kit isn't a problem?
"During working hours, the temperature in all workplaces inside buildings shall be reasonable."
I take that to mean the old Shops, Offices and Railway Premises Act, which did have a minimum working temperature has been superseded then? (IIRC, the minimum temp. had to be reached within an hour of the start of the work day.)
"....There's an opportunity here for some bright sparks to invent some form of cooling computers.
British businesses should ask how they do it in California...."
Here in Western Canada, especially parts of southern Alberta (i.e. a Western Canadian province), some areas are a TRUE DESERT with temperatures as high as 45 degrees Celsius (113 F) in the summer, so Alberta oil company systems engineers who had servers in that area had an ingenious solution! Cutting fluid!
Quad CPU server motherboards (Tyan brand usually) and hard drives (Seagate ATA or Western Digital SCSI) were encased in large powder-coated aluminum electrical mains boxes that you could buy at any major construction supplies outlet for less than $30 CAN each or about 20 Euros! The powder coating covered the entire interior and exterior of the case so the cutting fluid wouldn't react with the aluminum, and the electrical and network connectors were the ruggedized kind. They sank 20 such cases into a 1 metre cubed tub and ran the cables to the outside connections. Cutting fluid is usually used in metal machining to cool the milling bits as they cut through stainless and very hard cobalt steels.
It's the perfect fluid to cool electronics! Just NOT in direct contact with the motherboard because the cutting fluid IS ionic and therefore CONDUCTIVE ... BUT...if you dip a PROPERLY SEALED aluminum mains case with a motherboard or drive into a pool of the stuff, the cooling power was amazing.
The motherboards were running NO HOTTER than 40 to 45 Celsius at FULL LOAD because the heat transfer was so good even in the hottest part of summer! They just made sure the CPU cooler fins were cut and fitted so the CPU heat transferred directly to the aluminum case and then out to the external cutting fluid.
The entire process SHOULD have been patented, but the engineers (mostly oil and gas electrical and mechanical engineers) were just solving a local problem! Anyways, it's worked for DECADES! Some of those 2 and 4 core Quad CPU servers are STILL running after nearly 15 to 20 years untouched processing oil and gas reservoir drilling data! Even the DRIVES are still running after 15-to-20 years which is nearly UNHEARD OF in the industry! You would think after so many years the capacitors on the motherboards and the drive bearings would have dried out or broken down after being on 24/7/365 for 20 years!
Actually it showed just how WELL Tyan motherboards and Western Digital SCSI drives were built in those days! They weren't cheap! BUT they are STILL running!
In a rare display of candour, and ingenuity, their IT types will happily point out that their servers all faced Westward--that is, before the seasonal Santa Ana winds came and the hardware was engulfed in the blaze along with a few dedicated-but-hopelessly-outdated fire extinguishers.
If you have critical kit which regularly gets above 45C, tightly wrap it in 20 metres of 6mm clear plastic tubing. Insert one end of the tubing into one upper femoral artery of your youngest and healthiest intern and the other into the basilic vein of the opposite arm once all the air has been forced from the tubing. The intern will act as an active heat sink for your critical device.
Either that or is from the US..
No. If I were from the US I would have used obscure variations on medieval units of measurement rather than bog-standard metric.
(With apologies to the El Reg Standards Soviet for the obvious omissions.)
Those 10U wall racks aren't usually that big a problem regarding heat, though. Half the height is patch panels, one or two cable management panels, three or four switches at maybe 100W each. A decent rack design can deal with that.
Mine (10U) takes 130W max, and internally it's about 5 degrees above ambient
I, and half a dozen other people, worked in a stuffy windowless basement which was where the prototype hardware we were developing ROM code for resided.
In vain we pleaded for aircon. The temperatures were up to 27 degrees - even 30.
I was working the ICE - in-circuit emulator, to the initiated - a vast box of power-hungry ECL (emitter-coupled logic) and Schottky TTL that, on a fair day with a following wind, could pretend to be an 8086 sufficiently convincingly to be of use debugging the firmware.
Except when it wasn't. One day code that had previously run well, stopped. Analysis of breakpoint audit trails showed nonsensical behaviour. Opcodes were simply not being executed....as they should have been.
I reported to my boss 'The ICE has gone west' .
"Dunno. It is getting pretty hot. "
And we grabbed the manual and leafed to the back
OPERATING TEMPERATURE: 15C-27C AMBIENT
Now I came from a hardware background, and I knew as well as anyone that those chips had capabilities to go over 100C without popping, and the thing was fan cooled, so the only reason they would have the temperature rating so low was that it was on the bleeding edge built out of selected chips and the timings would not hold out beyond that.
The thermometer one of the permies had installed on the wall said 28C
"Leave it with me" said the Boss
The next Monday we arrived to find air conditioning had been installed.
It's always nice to know management values its ICE more than it does its coders
"Now I came from a hardware background, and I knew as well as anyone that those chips had capabilities to go over 100C without popping"
Ah, ECL. Runs hot but the problem is, as you note, that being non-saturating its parameters are very temperature sensitive, so unless you have an expensive piece of kit with cooling, the temperature range is limited.
IIRC there was a Nat Semi ECL minicomputer which had a self test routine where the tail voltage and frequency was adjusted for the CPU periodically.
At a large machine room in Tokyo owned by a well known consumer electronics manufacturer whose (short) name begins with "S":
A long row of very heavily loaded racks, with front and back doors off, and in front of each one a standard tall office fan set to maximum.
The entire division's operations depended on those fans working.
Glad they're using fans.
Customer 1) 6 servers in a converted kitchen with 1 extractor fan (there due to the previous function) taking heat out. Warned on many occasions they wouldn't be supported due to overheating, which did end up costing them for repairs to overheated kit.
Customer 2) Survey completed in a partitioned off area to add a server. No aircon, room was already warm to hot and they wanted to add more heat? Advised they'd need to provide cooling before we'd be able to provide a quote which they wouldn't.
Customer 3) our server has gone down. On site and no air con, it had failed months ago. Manglement had refused to pay until we refused to fix (at their cost..).
Seems there's a trend with cooling not being taken as seriously as it should.
All time favourite? Company with managed air conditioning system. The system developed a fault in the server room one night, the technician turned up to repair the fault...to find nobody there because only the IT department could get into the server room and none of them answered the call.
It was an expensive mistake.
The BEST company server room I have ever seen is a local one in Vancouver, Canada which I had a tour of last year. It used a silicone oil liquid immersion system. They rack many quad-CPU or eight-way SuperMicro motherboards into what look like large home refrigerators but are in fact filled with silicone oil, which is NON-CONDUCTIVE and is allowed direct contact with the electronics of the motherboards including the GPUs, RAM, DRIVES, etc. A 16 kilowatt power supply (ALSO submerged!) powers each large box, which is about 2 metres tall by one metre wide by 75 cm deep. The silicone oil is pumped into the bottom and flows up past each VERTICALLY PLACED motherboard, coming out the top warmer than it was at the bottom. An external radiator and condenser assembly uses huge radiator fins on the roof of the warehouse-like building to get rid of the heat, recooling the silicone oil, which is pumped back to the bottom of each sealed rack system for reuse. Since BC Hydro supplies electricity as cheap as 8 cents per kWh if you buy blocks of it up front, their mains costs are VERY LOW compared to other parts of the world, which is WHY they are located in the Vancouver area!
There were 150 racks containing 12 motherboards each with over 7200 8-core CPU's (that's 57600 CORES!) doing fast algorithmic day-trading. They could afford it! I was told by a 3rd party that they were pulling in over Two Million Canadian Dollars GROSS RECEIPTS A DAY 24/7/365 for the less than 10 shareholders...NOW EVEN WITH TAXES AND EXPENSES THAT IS STILL A MONSTER INCOME from a bunch of CPU's dipped in silicone cooling oil!
Some people take IT cooling VERY VERY SERIOUSLY!
I've seen many repurposed broom cupboards. Plus a specialised computer room, which would have been great if the computers hadn't been moved in before it was finished. The servers were lifted (and dropped) by the contractors laying the floor covering. And covered in little piles of brick dust where shelves were being put in.
One of my employers saw the light, and moved the servers and noisy high speed line-printers out of the general office to their own room. With not just an extractor fan, but external air from the cool side of the building sucked in! Unfortunately wasps built a nest near the intake one year, and we had a computer room full of dead wasps.
Over 20 years ago, so no smartphone pics.
Aircon popped in computer room. Getting toasty rapidly.
Prop open all the fire doors etc, grab, commandeer, steal desk fans from all over the place, step carefully over the spaghetti of cables and watch it roar. It worked, just about, for long enough. Back in the day of the really big hard drives without all these fine tolerances :)
although there was another occasion (minus the fans) in a different computer room where aircon poppage caused the heat buildup to be so intense that getting into the room to switch off the old minicomputer was itself tricky and required hands to be covered with a jumper. This resulted in a melted 8" boot floppy, and I am sure you could have cooked breakfast on the top of the racks. Took a day or so to cool down! The drives were built like engine blocks and held their heat nicely!
Over 20 years ago, so no smartphone pics.
Aircon popped in computer room. Getting toasty rapidly.
Close to 35, in my case: I'm in the computer room for some reason when I hear the dull roar soundscape changing. After a couple of seconds it dawns on me that it's the aircon rumble that's absent now. Which means that there's now close to 100kW going in every second that's not taken out, which is clearly suboptimal for the continued operation of the equipment present. I sprint out the door and into the sysadmin pen, and alert those present. Half get assigned to raiding the offices and confiscating any fan they see, the others shut down and switch off all the systems and storage not deemed absolutely essential. It's the one time I've seen a thermograph move, climbing 10 degrees in as many minutes.
We lost maybe three RA81 HDAs and one logic board out of well over a hundred, and one memory board from one of the systems.
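The thermograph moving is about what a back-of-the-envelope calc predicts. A sketch, assuming a plausible room size (the ~100 kW is from the story; the room volume and air properties are my assumptions):

```python
# Back-of-the-envelope: how fast does a machine room heat up
# when the aircon dies? Assumed numbers, not from the story.

P_in = 100_000.0     # W of heat load (the ~100 kW mentioned)
room_volume = 500.0  # m^3 -- assumed room size
rho_air = 1.2        # kg/m^3, density of air
cp_air = 1005.0      # J/(kg*K), specific heat of air

air_mass = rho_air * room_volume           # ~600 kg of air
rate_k_per_s = P_in / (air_mass * cp_air)  # K/s if ONLY the air soaked it up
rate_k_per_min = rate_k_per_s * 60.0

print(f"Air-only heating rate: {rate_k_per_min:.1f} K/min")
```

The air alone would heat at roughly 10 K per minute; in practice the thermal mass of the racks, floor and walls soaks up most of that, which is why the observed climb was "only" 10 degrees in as many minutes. Either way, you don't have long.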
We lost maybe three RA81 HDAs and one logic board out of well over a hundred, and one memory board from one of the systems.
It was tough kit! Once had a VT-240 terminal that had been in a fire (sadly the VAXes didn't survive). The case had partly melted but when we powered it up it still came up with a reassuring "VT-240 OK"
"After a couple of seconds it dawns on me that it's the aircon rumble that's absent now. Which means that there's now close to 100kW going in every second that's not taken out"
You really _REALLY_ have to wonder why with millions of squids worth of kit and redundancy up the wazoo, manglement cut corners and allow a single point of failure for the cooling.
You really _REALLY_ have to wonder why with millions of squids worth of kit and redundancy up the wazoo,
Eh, what? Redundancy? At that time there were a dozen 785/8600/750 systems, half of them in a cluster, the others standalone (although the two PDPs could run the other's tasks if one of them failed) plus a scattering of MVII's. Disks: lots. Four HSC50s, maxed out, for the cluster. Plus some RAs for the standalones. PDPs ran off 4 RP06es each. I don't think there was disk mirroring in use (can't be arsed to look up if that was even an option with VMS 4.5), but there were a couple of unused disks which were used as lukewarm spares when needed. They didn't even have a no-break until the site became crucial as a comms hub for NW Europe. Which was close to a year later.
You’ll recall that film cameras have no more than 36 photos available before the SD card, sorry, film canister, needs to be ejected and then developed.
So, we weren’t really in the habit of snapping every random thing that took our fancy, with the gay abandon that we are today.
Every frame was sacred!
Got a late night call out on New Year's Eve as the Comms room for a law firm in one of the buildings we looked after had gone down. It turned out that they had scheduled maintenance on the Comms room cooling system that night as network traffic was at a minimum. In order to keep things cool while the CRAC units got serviced, several portable A/C units had been brought into the Comms room, which the A/C contractor had plugged into power outlets on the wall of the Comms room, all of which were backed up by the Comms room UPS.
Cue UPS overload alarm followed by immediate power loss to all computer equipment in the room.
It took me a while to get there as the CBD was closed to all traffic on New Years Eve and I had to walk from a couple of kilometers away, so by the time I got there they had already figured out their mistake and had everything running again.
Had another callout to a bank branch that was suffering from heat related issues. The A/C plant on the roof had broken down so an A/C company had brought in portable units to keep things cool, with the hot air being pumped into the false ceiling which acted as the return air of the currently not working A/C plant. Thus, all the hot air came flowing out of the return air vents in the ceiling and the place just got hotter and hotter, since no one had actually thought about how they were actually going to remove the heat from the building. Their computers were already slowing down due to CPU throttling and it would not have been long before their network equipment shutdown.
I explained the basic thermodynamics to the manager, and the fact that they would soon be unable to trade when their network died. They craned a large temporary unit onto the roof the next day.
At my old place, management had declined requests for aircon in the equipment room - apparently they considered the existing arrangement, whereby hot air was simply fanned/ducted into the gents' loo next door, was a perfectly adequate solution.
During a particularly hot summer a piece of melted/distorted hardware was ceremoniously dropped on the MDs desk with the statement "that used to be part of a business-critical server, and that's why we need aircon"
management had declined requests for aircon in the equipment room
One place I was at we were combining two computer rooms into a new building. I'd (privately) done the power and heat calculations and so requested a certain UPS and aircon setup. Unfortunately, they'd already bought the UPS and aircon setup as part of the building fit-out.
Both of which were nowhere near enough. They would have been OK for either of the old computer rooms but not for the new, merged, room.
By the time we got to 60% of the move the UPS went into power bypass because the power draw was too high. And because of the room configuration, we couldn't add extra capacity even though the UPS was expandable. And the aircon was a two-chiller design that, in an ideal world, would have let the whole room run off just one chiller so that we could take the other one down for maintenance. But again, we had to run both units at full to dissipate the heat - and soon discovered that the 'architect' who had designed the room had put the chiller drip-tray drainage above the server racks.
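The power and heat sums that should have happened before the fit-out are not complicated. A sketch of the sanity check, with entirely hypothetical loads and capacities (two old rooms at ~30 kW each against kit sized for one):

```python
# Quick sanity check for UPS / cooling sizing when merging rooms.
# All loads and capacities below are hypothetical, purely to show
# why kit sized for one old room falls over in the merged one.

def sizing_check(rack_loads_w, ups_capacity_w, cooling_capacity_w,
                 headroom=0.8):
    """Does total IT load fit within capacity at a safety margin?"""
    total = sum(rack_loads_w)
    return {
        "total_load_w": total,
        "ups_ok": total <= ups_capacity_w * headroom,
        "cooling_ok": total <= cooling_capacity_w * headroom,
    }

# Two old rooms at ~30 kW each, merged; UPS/aircon bought for one room.
result = sizing_check([30_000, 30_000],
                      ups_capacity_w=40_000,
                      cooling_capacity_w=45_000)
print(result)   # both checks fail: 60 kW against 32/36 kW usable
```

The headroom factor matters too: a UPS run flat out has no margin for inrush or growth, and N+1 cooling that needs both chillers at full isn't N+1 at all.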
Not one of the finest fit-outs that I was involved with - I left that place two months later.
"Both of which were nowhere near enough. They would have been OK for either of the old computer rooms but not for the new, merged, room."
BTDT. The result was a server room which was maxxed out thermally at 1/3 of its physical capacity with manglement wanting to put desks in there. Explaining the legal ramifications of putting people in with _lethal_ fire suppression kit made them go a little white. ("manslaughter charges" were mentioned)
Where I work earlier this year, at the beginning of the heatwave, the machine room just below where I sit suffered a total air conditioning failure.
By the time someone followed up the alerts, the ambient temperature was heading towards 40°C. What it was inside the racks we don't know, but it would have been much higher.
As we had just recovered from a major storage failure a few months earlier, and did not want to have to do the same again, the managers put everyone on an emergency footing, but decided not to do an EPO, but wanted a rapid ordered power down.
Well, that made a very busy afternoon, shutting down 100+ servers and the associated comms. and storage gear in the correct sequence, and in order to try to slow the rise in temperature, every single desk fan was commandeered, as was the portable air-con unit that was keeping the coffee shop cool. All the doors and windows in the corridors around the machine room were wedged open, and security were told to ignore all of the warnings they were getting. The managers themselves manned the doors and open windows to make sure that nobody tried to get in, while we did the technical work.
Fortunately, most of the systems in the room were for test and development, so the external facing systems were not affected.
Never seen the managers, project managers and PMAs do so much physical work!
It must have worked, because we did not suffer any more than the normal number of failed disks following a power down, and in general, it all came back cleanly once the air-con. was fixed. Took a lot longer to bring up than it did to shut down, though.
One contract I had was with a branch of the Dutch Ministry of Agriculture, housed in (then) 40-year old barracks. Single story, low tarpaper roof, and even on just moderately sunny days the offices were already uncomfortably toasty. Cooling to the computer room was half-knackered, as one of the two units had a pinhole leak that apparently wasn't fixable, which had caused them to use up their freon-based coolant quota already. This lack of cooling caused serious swings in the computer room temperature, with a resultant well-higher than average component failure rate.
Their method of trying to keep the temperature within acceptable values was by installing garden sprinklers underneath the heat exchangers, and turning them on when temps in the computer room hit 25 degrees. Time to turn the tap off was often well into the evening, requiring one of the sysadmins to clock several hours of overtime. On really warm days boosting the cooling capacity like this was insufficient, and relief had to be brought by opening the back door and pointing half a dozen floor-standing fans at it.
(My suggestion to dump a few buckets of white paint on the roof, or tack a couple of rolls of alu-coated foil over it was dismissed, as 'this is just a temporary building'. It had been temporary since the mid-1950's, and it continued to be temporary for at least five more years after I left)
"My suggestion to dump a few buckets of white paint on the roof, or tack a couple of rolls of alu-coated foil ..."
The top floor of H Mansion has a flat roof with black bitumen roofing felt. That floor used to get uncomfortably hot but I assumed that most of the heat came in through the windows and not through the roof as that's insulated. But applied special aluminium bitumen paint to it last week and it's made a real difference. Unfortunately I didn't measure the before and after temperatures.
Tip: paint the roof before the weather gets too hot. In hot weather the primer, even when dry, is very sticky and that makes work difficult.
. . .
Ok. Racks get hot because the equipment in them draws cold air from the front, and then blows hot air out of the back. As wallmount cabling cabinets are poorly designed for airflow, the hot air in the back can escape in two ways: firstly through the tiny holes drilled in the side, which probably wouldn't let enough air in to keep a hamster alive, and secondly out of the front in the gaps around the mounting rails. This hot air then mostly gets sucked back in the air intakes at the front of the equipment in the rack. The rack then gets hotter and hotter until the equipment generating the heat melts.
We can all agree that's the problem? Cool.
So why in the name of $DEITY do people blow cold air into the hot aisle they've created to cool the cabinet down? It's just pushing the hot air out the front, hopefully faster than it's being pulled back in again. A more elegant solution is to pull the air out of the hot aisle so that cool air is sucked in the front and hot air goes out of the back.
A PC case fan is ideal for this, and most IT departments have at least one dead PC waiting to be properly disposed of that can part with its case fan. Figuring out how to get 12V to it should be an exercise left to the imagination of the reader, but I shall comment that most supermarkets will sell 12V PSUs for a few quid. This combination provably lasts years running 24/7 as long as the H&S people don't see the spliced wiring connecting these two parts.
There are of course other methods such as using transformers already owned by your organisation, such as running a long cable from the 12v rail on a nearby PC, buggering the dead PC's PSU so that it'll run without anything connected, sacrificing one of those 12v transformers in that box of surplus to requirement parts saved for a rainy day, or wiring 4 (12v) fans from the (48v) output from the PoE switch which is generating most of the heat in the cabinet for Cooling Over Ethernet, but a dedicated 12v supply is by far the best option.
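For the "4 fans off the 48V PoE switch" trick, the arithmetic is just a series voltage divider. A rough sketch, treating each fan as an equal fixed resistance (real DC fans aren't resistive loads, and mismatched or stalled fans will skew the split, which is one more reason the dedicated 12V supply really is the better option; the 24-ohm figure is an illustrative assumption):

```python
# Four identical 12V fans in series across a nominal 48V supply.
# Modelling each fan as a fixed resistance is a simplification --
# real DC fans are not resistive -- but it shows why the split works.

def series_voltages(supply_v, resistances):
    """Voltage across each element of a series resistor chain."""
    current = supply_v / sum(resistances)  # same current through all
    return [current * r for r in resistances]

fans = [24.0, 24.0, 24.0, 24.0]   # ohms each, assumed identical
per_fan = series_voltages(48.0, fans)
print(per_fan)                     # each fan sees ~12 V
```

If one fan in the chain stalls or has a different impedance, the others see more than 12V, so don't expect it to fail gracefully.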
With a half dozen top mounted case fans in fullsize cabinets, yeah. Bog standard arrangement.
But the article was showing pictures of those little wallmount comms cabinets which are generally mounted as close to the ceiling as the cabling contractor could get, which means you can't get access to the top. The article showed people having the same heat management problems with the comms cabinets as with larger cabinets and trying to control the temp by blowing ambient air into the hot space at the back with a normal deskfan.
My point is simply that controlling temperatures in a small cabinet is best done in exactly the same way as with a larger cabinet; kick the hot air out with a case fan or two and cooler air gets sucked in the other opening.
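There's simple arithmetic behind the "kick the hot air out" advice: at steady state the exhaust runs above intake by the heat load divided by (airflow × air's volumetric heat capacity). A sketch with assumed numbers - a ~100W loaded PoE switch and one small case fan at roughly 40 CFM, neither figure from the thread:

```python
# Steady-state exhaust temperature rise for a small comms cabinet:
# delta_T = P / (rho * cp * Q). All numbers are illustrative assumptions.

P_load = 100.0     # W dissipated in the cabinet (e.g. a loaded PoE switch)
flow_cfm = 40.0    # case-fan airflow, cubic feet per minute
rho_air = 1.2      # kg/m^3, density of air
cp_air = 1005.0    # J/(kg*K), specific heat of air

flow_m3_s = flow_cfm * 0.000471947          # convert CFM -> m^3/s
delta_t = P_load / (rho_air * cp_air * flow_m3_s)
print(f"Exhaust runs about {delta_t:.1f} K above intake")
```

A single-digit rise over ambient squares with the earlier report of a 130W cabinet sitting about 5 degrees above ambient; the trouble starts when the exhaust gets recirculated back into the intake instead of leaving the cabinet.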
August 1990 broke all sorts of UK records for high temperatures. I was responsible for mainframe systems across two locations, both with well-specified (we thought) aircon units. We started getting over temperature alarms from the backup system, and when we checked we found that temperatures in the sun on the roof (the computer room was in the basement of a multistory office block) were pushing 50C. As a result the heat exchangers were actually heating the water coming from the machine room rather than cooling it.
Fortunately this was on a Friday afternoon, and though we had a hectic weekend sourcing additional aircon (like hens' teeth, because of the high temps across the country), the weather had broken by Monday. (We subsequently replaced the 'dry' heat exchangers on the roof with 'wet' ones that use water evaporation to provide additional cooling.)
Once reviewed a health-testing laboratory's IT room. While the main servers had backup generators for power failures, the air-conditioning units didn't. Ah, no problem, they said: we have some big household fans we can bring in... OK, so it's a power failure... where are you intending to plug them in?
Not the largest problem, really... the fridges for the medical testing specimens had no backup power, so things would get rather smelly rather quickly. And what was the point of the IT kit running when they had nothing to test...
I have an 8-port PoE Cisco prosumer gigabit switch acting as my core switch in my (unconditioned) loft/attic here in toasty old Florida. Solid as a rock; the temps up there get well over 100°F every day, and it's been there untouched for a couple of years now. I know it is still working because if it stops I get text alerts at work from the W.E.W.S (Wife Early Warning System) and also more alerts from the U.K.W.N.I.S (Upset Kid With No Internet System).
I was called out last Sunday evening when our refinery monitoring systems went dark. The links between the monitoring network and the normal data network turned out to rely on 18-year-old Cisco kit that had overheated when the aircon in a server room failed. Cue the open windows and as many stand-up fans as I could find. What fun!
Clearly some of these "tech" people are complete fucking morons.
Those boxes have twin mains-powered fans in the top, or the ability to mount them.
And yet they mount the box almost flush to the ceiling.
We run kit in factories in China, same boxes, fronts closed, IN FACTORY PRODUCTION LINES,
and yet we don't need to engage in this sort of fuck feast.
Mind you...... we do ensure that they are checked weekly for dust blockages and that they all have functioning fans.
The Federal Government agency I worked for before had business units that operated in two ways: fiefdoms and JFDI (just f*ckin' do it). Fiefdoms came in the form of putting physical barriers (like actual walls) between their business units and "others", so a floor of a building would have several walled "gardens", with swipe-card access limited to a few. JFDI is followed by "... or else ..." (career-ending gossip). Combine the first and the second and you get a business unit doing something without the other business units knowing about it (particularly IT). Duplicate work? Yeah, tons of it. The role of IT was to "filter" that duplicated work and tell the other unit that "`tis been done already by <enter business unit here>." Also, I'd like to emphasize that with JFDI, business units don't like "change management" or "change control".
So back in 2001, a business unit decided to wall off a quiet corner of an underground car park (two floors below the street) because they wanted somewhere to put some servers. They put a tiny a/c unit in this space and it worked fine. Over the next 5 years, other business units caught wind of this scheme and, without consulting anyone, started putting their own servers in. The space grew from one rack to >12 racks (each side), and then one day someone christened it a "data centre".
So it was no surprise that during winter, when the car park temperature was around 5°C, the inside of the "data centre" was hovering at >25°C (both chillers running at 100% for >4 years without proper maintenance).
One day, one of the chillers decided to call it quits. Within 30 minutes it was a full-blown mess (servers auto-shutting down and all boards lit up with a sea of red). It was >35°C at the entrance (and hotter in the middle). The a/c tech said there was no way to source the parts in Australia or Asia. It took MONTHS for the CIO (yup, he had to sign off on that due to the cost involved) to authorize new chillers.
To keep the temperature down, the two doors were left open (with a full-time security guard at each entrance) during winter, and blower fans were used during the summer.
Fun times, that was.
Instant flashback to the many jerry-built CPU cooling solutions I've used down the years. The 12cm cooling fan tied onto a 12-to-8 fan reducer and stupidly heavy copper radial heatsink with a shoelace worked surprisingly well. As in, it didn't fall off or break the vertically mounted motherboard or CPU holder.
Still always a surprise how well popping the case and aiming a desk fan at the motherboard works as well.
"That's hardly even warm (says the Australian)"
The Australian isn't getting 18 hours of daylight coming from 270 (horizontal) degrees of sky. It gets a little tricky to shield against the direct heating effects of this kind of thing. (Surprisingly, it's _much_ easier to work against near-direct overhead sunlight at low latitudes than against low incidence angles at high latitudes. On the other hand, you only need to deal with this shit for 2-3 months of the year.)
I always add a good-quality inline 150mm fan plus ducting to suck air out of the rack and blow it into a nearby corridor. It's the circulation of air that helps, even if the air coming in is already warm.
In case the fan fails, I also get an electrician to add a temp alarm that can be muted but keeps a red light on, with instructions for staff.
Done this at many sites for many years; cheap, and it works really well. For those of you with selfish CEOs, pipe it into their office and do your farting by the rack.
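For what it's worth, that mute-able alarm boils down to simple threshold logic. A minimal sketch, with a made-up 35°C threshold (a real install would read a sensor and drive a relay and lamp rather than return a dict):

```python
# Minimal sketch of a mute-able over-temperature alarm like the one
# described above. The 35 C threshold is an example value, not from the
# original setup; the red light stays on whenever over temperature,
# while only the sounder can be muted.

def alarm_state(temp_c: float, threshold_c: float = 35.0,
                muted: bool = False) -> dict:
    """Return which alarm outputs should be active for a given reading."""
    over = temp_c > threshold_c
    return {"red_light": over, "sounder": over and not muted}

print(alarm_state(40.0, muted=True))  # red light on, sounder silenced
```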
We ordered some ali box sections to make a 1.2x1.2x2.4m box that fits around our server cabinet. Picked up some 25mm insulation board from Wickes, made some brackets (rivnuts are awesome) to keep the panels in, cut a slot and shoved a window AC unit on a shelf dumping in cool air.
Keeps the interior of the cabinet at a nice 20-23°C, while the outside can get up to 35°C. The AC runs through our UPS, so if there is a power cut it'll run as long as the cabinet does.
I kept the fact we have an AC unit quiet, in case the meat sacks get jealous!
Back in the day (2002) we had an IT tent, complete with a pair of DL380s and associated kit. Ambient was 38°C at its peak, and the wind direction determined the server maintenance schedule (daily or weekly). Still, could've been worse... and lo, two years later, it was: 55°C, no time for acclimatisation, and queues for the portaloos (emptied at 1500 daily). But at least they were civilised enough to have British mains sockets!
1970s-vintage telecoms transmission and multiplexing equipment at my place of work is kept cool with the aid of the pedestal fan that used to keep me cool at my desk! It's a vital piece of the nation's infrastructure and is only kept alive by a £20 fan from Argos.
The UK; investing for the future!
Previous company I worked for (small IT company, did some maintenance and support): one of the customers had a small Dell server sitting under the stairs in their very old building.
Got a call one morning and went out to it, to discover that it had shut down overnight due to the cold (pretty cold winter in Scotland).
A big blanket to trap the heat in its little nook, and a heater nearby, and it was fine. They were advised that it should be in a temperature-controlled area, but the Scottish newsreaders that owned the company probably never did anything about it.
At around the same time, the company that I worked for moved into new offices in Glasgow with no aircon in the server room. The solution proposed by my boss (he wanted to reuse the waste heat from the servers to warm the office) was to put a small desktop aircon unit on the shelf in front of the only window in the room and open the window. Servers regularly hit 50°C during summer months, and shut down more than once while I was there.
At a previous company to that, one of our customers had their servers in a hall cupboard. Just about every night the servers would shut off due to heat, and come back on in the morning, and they were always complaining about the systems being down in the morning. Turns out they were closing the cupboard door when they left and opening it when they came in: no airflow, heat built up, servers shut down.