Re: Mumbai Multiway
Having electricity isn't the amazing thing. Not dying by it is!
Would you believe 100Mbit server networking down a stripped and twisted cable? (and does it work with newfangled Gbit? I've never tried).
Well, what would you do at 4am, when you discover that you have to connect the server to a switch 13m away from the rack, and the longest cable you have is 10m?
It was working the next morning, and the strip-and-twist cable assembly was swapped out for a proper cable as soon as possible the next day.
And the Quark chip runs finger-burningly hot.
Presumably it is engineered to do so. As were Atoms before. And any chip well-designed for passive cooling (because you need a fairly large delta-T before convection gets going).
I remember an old Athlon system I once serviced. The heatsink fan had failed and I sizzled my finger on the heatsink (i.e. over 100C; heaven knows what the chip temperature was). Nevertheless it was "working perfectly" (the perceived problem was a failed CD drive), and it carried on working perfectly until it became obsolete a couple of years later.
Going back even further, I saw a power supply that had become overloaded because of a fault elsewhere, but which only failed when a rectifier diode melted its soldered connections and dropped off the circuit board. It was still a working diode.
CMOS silicon is very tolerant of high temperatures. The CMOS switching speed drops in inverse proportion to the temperature in degrees absolute, so the Tmax for a chip is usually the temperature above which the manufacturer will not guarantee correct function. Running too hot is akin to overclocking (which is why overclockers are into radical heatsink designs). Tmax is certainly not the temperature at which the chip will be destroyed in seconds. It gets much hotter when being soldered in place. Operational life is shortened by high temperature operation, but chips will function for decades and become obsolete long before ... dropping life expectancy from 30 years to 15 years rarely matters.
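To put a number on that 1/T claim (a back-of-the-envelope sketch; the temperatures are illustrative, not device data):

# Rough illustration of the claim above: CMOS switching speed
# proportional to 1/T, with T in kelvin. Numbers are illustrative only.
def relative_speed(t_kelvin, t_ref=300.0):
    # speed relative to operation at t_ref (300 K is about 27 C)
    return t_ref / t_kelvin

print(relative_speed(373.0))  # ~0.80: a 100 C chip runs ~20% slower
print(relative_speed(473.0))  # ~0.63: even at 200 C it still switches

Which is why an overheating chip mis-clocks long before the silicon actually gives up.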
NUC isn't intended for embedded at all. If for anything, it's for the Mac Mini market. See my post above re GA-C1037UN-EU for details of a PC board that is easy to run off a vehicle 12V supply (or a 12V power brick) with far more interface options than an NUC. The board itself has a standard ATX power connector, the Pico-PSU range is the other half of the solution, and the C1037 CPU is 17W TDP and fanless.
Just for completeness, one should perhaps add that it is possible to obtain a full PC for not a huge amount more. I recently purchased a Gigabyte GA-C1037UN-EU ITX board. Add a Pico-PSU, DDR3 RAM and a disk device (a USB stick will boot for embedded) and you're away. The board is fanless (17 watt TDP), and to my surprise even comes with two COM ports (one with D9 on the back-panel), two Gbit Ethernets, a PCI slot and an LPT header. Also 3 x SATA (one of which is 6Gb/s, for an SSD) and e-SATA on the back panel. Cost around £120 (for board, Pico-PSU and RAM).
Agreed, it's rather more power-hungry and expensive than boards being discussed here. But it's also a LOT more powerful. I wanted mine for a silent always-on home server cum internet browser, but I immediately thought that if I wanted a computer to run off 12V in an automobile or boat, it would be a perfect starting point.
And I've discovered that I'm no longer using my core-i5 desktop for anything except gaming these days. This little beast is always on so no boot-up wait, and feels plenty fast enough for everything else.
you went and fucked it all up there right at the end. Moore's law is demonstrably bollocks and promotes a world view that is.... unhelpful, as acknowledged by the great man himself.
Sorry, but you are completely wrong. From a physicist's or engineer's perspective, it's a scaling law that predicted (back in the 1980s) that there was absolutely nothing fundamental in the way of going from the earliest CMOS computers, with a few tens of thousands of transistors running at a few MHz, up to today's billion-transistor chips clocked at a few GHz. It also predicted where the law would inevitably fail (i.e. run out of predictive power). That's where we are today: the transistors have to be made out of discrete atoms, and the thickness of a gate is now as thin (as few atoms) as it can be while remaining an insulator.
Back in the days of bipolar transistors, a bit was represented as a flow of current and engineers faced what they called a "smoking hairy golfball" problem. You had to put the components sufficiently far apart so you could keep them cool, which restricted the clock speed because of the speed of light. Shrink it too much and you can't stop it catching fire (aided by the fact that bipolar transistors suffer thermal runaway).
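For scale, here's the speed-of-light budget (my numbers, not from the original discussion): light covers about 30cm in a nanosecond, so a 1GHz machine can be at most one light-cycle across, and real signals in wires are slower still.

# Distance a signal can cover in one clock cycle, at best.
C = 3.0e8  # speed of light, m/s
for clock_hz in (1e6, 100e6, 1e9):
    print(clock_hz, "Hz ->", C / clock_hz, "m per cycle")  # 300, 3, 0.3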
CMOS, on the other hand, scales so that energy is dissipated only when a logic element changes state, and the heat generated per unit area of active electronics is a constant as you shrink the transistors, shrink the operating voltage, and scale up the clock speed, all by the same factor.
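A quick check of that constant heat-per-area claim (the classic constant-power-density scaling argument; the factor of 2 is just an example):

# Shrink linear dimensions and voltage by k, raise clock speed by k.
k = 2.0

capacitance = 1.0 / k   # gate capacitance scales with feature size
voltage = 1.0 / k       # operating voltage shrinks by k
frequency = k           # clock speed rises by k

power_per_transistor = capacitance * voltage**2 * frequency  # C*V^2*f
transistors_per_area = k**2                                  # density rises by k^2

print(power_per_transistor * transistors_per_area)  # 1.0: heat per unit area unchanged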
Agreed, Moore's law is now historical and no longer predictive. (the entire context of this thread is history!) We've now pretty much reached the physical limits for the smallness of a transistor, and any future improvements in CPU performance will have to come from using the billion or so transistors on a chip more intelligently.
but wasn't the invention of the programmable computer an invention that just maybe has a smidgen more of an impact on the modern world?
Definitely, although it was invented by Konrad Zuse in Germany before the war! Colossus came second, or maybe third if you believe the Yanks. Or even fourth, if you accept Babbage's mechanical engine as the first programmable computer.
Also don't under-estimate the input of genius physicists: http://en.wikipedia.org/wiki/History_of_the_transistor. If we still had to use thermionic valves there might be fewer than a thousand computers in the world. If we still had to use bipolar transistors there might be fewer than a million. Oddly, the FET was discovered first, though CMOS arrived quite late and was the enabling technology for Moore's law.
Which is actually true. http://en.wikipedia.org/wiki/Konrad_Zuse
Cracking Enigma was a huge achievement, but the hardware used was not a general-purpose computer. Turing's other contributions of genius were to the mathematics of computing and computability. He wasn't an engineer.
Microsoft thought that people would meet Windows 8 on their new mobiles, like it, and would then demand it on their desktop.
Whereas in fact they met it on their desktop, hated it, and will now buy anything except Microsoft for their next (or first) smartphone?
Assuming the sugar is of biological origin, it was made in a plant by photosynthesis using atmospheric CO2. So it's a closed loop (assuming the plant is regrown ... a fair assumption for agriculture).
There is a carbon cost, in that agriculture uses fossil fuels for powering machinery and for making nitrogenous fertilizer.
... if you live in Scunthorpe.
I suspect that the only router you can trust is your own Linux system. (And that's only a maybe).
Paranoid mode on. They used to come from China with an NSA-approved backdoor in the flash with the vendor's secretly compelled acquiescence, plus a Chinese government backdoor without such acquiescence. Now, in order to provide plausible deniability, they've degraded the firmware so that they can blame their activities on organised slime, or indeed on any old Tom, Dick or Harrietta with a router.
It also lets the manufacturers sell "enterprise" routers at 20x the profit margin, which come with the better-engineered backdoors.
Linux Weekly News nearly shut down, because they didn't think that lots of readers around the world would pay them a few tenners per annum. Luckily they gave it a try, and the money rolled in.
I'd suggest that OpenBSD sets up a contributions site. It's probably easier to get 400 people to pledge and pay $50, than one to pay $20,000.
If you could strictly control the functionality of Glass - such as a Driving Glass product variant - then there could be potential for benefits.
Now there's a sensible idea that's almost trivial to implement. Mandate cars to have a low-power transmitter in the steering wheel, that puts any Glasses in the vicinity of the driver's seat into legally mandated driving mode. For cheap uncertified ones, that would simply be "off". For better certified ones, that would enable augmented reality for drivers. I don't believe the technology for the latter is good enough yet, but give it another decade and it will be.
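A minimal sketch of the decision logic such a mandate might imply (the mode names and the "certified" flag are my inventions for illustration, not any real Glass API):

# Hypothetical driving-mode selection for a head-mounted display.
def select_mode(beacon_detected, certified_for_driving):
    if not beacon_detected:
        return "normal"      # not near a driver's seat
    if certified_for_driving:
        return "driving-ar"  # certified device: driving AR only
    return "off"             # cheap uncertified device: switch off

print(select_mode(True, False))  # off
print(select_mode(True, True))   # driving-ar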
This is bad... the police won't know it is on until they stop someone, and then they will just turn it off. Police will get charged and just won't stop anyone.
You mean like drivers who were texting or otherwise playing with their mobile, who turn it off just after they've killed someone?
If necessary, mandate that Google glasses and suchlike maintain an activity log, and that the police are entitled to check that log. Like mobile phones do, and the police can.
A time will probably come when Nth-generation Google glasses will be good enough to provide full augmented reality, and it will then become safer to have your car's instrumentation relayed into your field of forward vision than to have to take your eyes off the road to (for example) check your speed. Likewise traffic warnings, which if displayed on roadside devices can be missed due to (say) a high-sided vehicle on your nearside. At that future time, I imagine a certification process will be required, to separate the products of adequate quality from the cheap toys. Some decades later, they may even become compulsory.
LOOK at them, a big bar down the right hand side of the glasses.. that WILL impede your peripheral vision,
Careful ... do you want to create a significant minority who are banned from driving? Some people don't have peripheral vision. They may need to wear very strong corrective lenses, which can correct only what's in front of them not what's to the side. Or they may have had certain eye diseases which have destroyed or damaged their peripheral vision before diagnosis and (in some cases) cure.
You'd also have to ban motorcycles, since it's not legal to ride one without a helmet, and helmets cut your peripheral vision.
At present (in the UK at least) peripheral vision is not a requirement for driving. Be careful what you wish for!
Overpriced, definitely. I'd argue for over-specified rather than under-
Are there any inexpensive 15" laptops out there with 1920x1080 screens? Do we really need the high-end gubbinsry that this beast is encrusted with? Or just an ordinary computer with a decent screen to run Windows and/or Linux for serious work away from our desks?
@Ragarath - can I interest you in a car with a square steering wheel, the brake pedal where the accelerator used to be, and the throttle on the dashboard? I assure you, with a little practice it really is possible to drive the thing.
You can trademark "Cascade" for audio products, if no-one else is already using that name in connection with audio or digital networking. The latter has a problem: www.cni.net Maybe they decided the phonemes first and the lawyers decided they had to change the spelling?
PURE - the people who make DAB radios?
If DAB is your sound source and he doesn't think there's anything wrong with it, there's something wrong with his ears. It's already even more FUBAR than MP3, and no way could you notice any gain from using audiophile components downstream. (Though I'd agree that spending money on well-chosen electronics and speakers is a better use for it than fancy pieces of wire, if your source is FM, CD, or Vinyl.)
If the delay is completely constant, then I agree it's probably not detectable, and otherwise equivalent to moving yourself or one of your speakers by less than a foot.
On the other hand, if it occasionally glitches (changes abruptly) that would be disconcerting, and if it glitches frequently or drifts continuously that would be horrible. There's a less serious reverse effect you can experience by wearing headphones. You move your head, and the soundstage moves with you. You get used to it, but in the first instance you are anticipating the sound being fixed when you move. The effect of your head moving when it wasn't would be worse. Like being drunk or motion-sick?
Never ceases to amaze me what people will pay and do to avoid using a piece of wire. (Failing which, an analogue RF transmitter/receiver which will maintain coherence of 1us per quarter-kilometer, or thereabouts).
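The arithmetic behind both of those figures, assuming a wireless-speaker delay of the order of 1ms (my assumption, to match the "less than a foot" above):

# Sound: a constant 1 ms delay is acoustically the same as moving
# a speaker by roughly a foot.
SPEED_OF_SOUND = 343.0  # m/s in air
print(SPEED_OF_SOUND * 1e-3)  # ~0.34 m

# Radio: propagation delay over a quarter of a kilometre.
SPEED_OF_LIGHT = 3.0e8  # m/s
print(250.0 / SPEED_OF_LIGHT * 1e6)  # ~0.83 microseconds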
Of course if you are turning a typical MP3 file into sound, it's FUBAR whatever you do with it. The only decent audio file is one that's compressed losslessly, if at all.
You get a low-res image. If it's of any interest you click on the image and get more details and a "visit page" link for the site that hosts the original image. How is this bad?
Windows 14, of course.
(Ask a Chinese speaker if you need an explanation of the joke).
still uses Imperial measurements or, perhaps more accurately, does not use the metric system
Perhaps the rest of the world could start (accurately) calling them British imperial units, to help the USA readjust?
They probably mean the Windows 8 tiles page will use a better pattern matcher, so that a dyslexic can also find his apps. Well, more often than at present. Xecel ... Cexle ... Exlec ...
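The sort of forgiving matching I mean, sketched with Python's standard difflib (the app list and cutoff are just examples):

# Fuzzy matching of mistyped app names; the cutoff is chosen loosely
# so that scrambled spellings still find their target.
from difflib import get_close_matches

apps = ["Excel", "Word", "Outlook", "Explorer"]
for typo in ["Xecel", "Cexle", "Exlec"]:
    print(typo, "->", get_close_matches(typo, apps, n=1, cutoff=0.4))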
People who have Windows phones hate Windows 8 on their desktop PC. People who have Windows tablets hate Windows 8 on their desktop PC. Apple understands this, and sells three different interfaces matched to three classes of devices: phones, tablets, and desktop computers.
The choices will be Windows 8, Windows 9, or migrate away (Apple? Android? Linux?)
It's make or break for Microsoft. If businesses can see that they are going to have to migrate from XP/7 "Windows" to something that shares only a Microsoft logo on the packaging and a kernel, the other alternatives won't look nearly so radical as they once did.
As already posted above, all Microsoft has to do is give businesses what they actually want. Otherwise, Microsoft will be signing its own corporate death warrant.
I'm conflicted about that one. True, Aero will run on any modern graphics including Intel on-chip. But is it worth the extra electricity cost the 3D effects will inflict on your organisation?
I'd say bring back XP-style windows. Neither Aero nor the Metro desktop was an improvement.
Depending on your employment:
"C++ for dummies"
"Visual Basic for dummies"
"Linux system management for dummies"
And methinks there are a few bastards out there who should have read "Banking for dummies" but never did, and never let it hold them back.
Other missing options are "none of the above" and "it all depends on who catches you".
And Mao was the worst of the lot. Not only did he have even more people to kill, but he was also a paedophile.
(Or should one judge by percentage of population murdered, in which case Pol Pot is the worst?)
The lesson to learn is that the greater the concentration of power at the top, the worse the consequences. Or in a variant I once heard, "The best system of government is a benign dictatorship. Except that we've never worked out how to keep the dictator benign, and we never will, so don't go there".
A Dan Brown book is not un-enjoyable if you picked it up in a charity shop out of curiosity and have time to kill at an airport and on a plane. You do have to park your critical faculties and intellect in neutral; maybe some people can't do that. But isn't that true of most fiction?
A week later I gave it back to the charity shop to sell again.
Nevertheless, people hold politicians in sufficiently low regard that politicians telling them what not to read may actually elevate the banned or merely deprecated material in certain people's minds. (Especially, I fear, in the minds of people who lack the intellectual capacity to read for themselves anything longer than one column in a down-market newspaper.)
So bans are counterproductive, even if well-intentioned.
The moment the authorities ban a book or try to persuade you that it will warp your mind, read it. Ditto if any significant pressure group is protesting its outrage. You may well decide it's a load of old rubbish, but if there's one thing in this world to avoid, it's allowing other people to make up your mind for you. You are a human being, not an ant.
Because it was always likely that drinking a pint or so of water with a small amount of dissolved chemicals would lead to dehydration
Dehydration is misunderstood. Perhaps surprisingly, thirst and hunger aren't similar. One experiences hunger once one's body is capable of processing more food; one isn't in danger of physiological distress from lack of food for a day or more after one's last meal. In contrast, thirst is a physiological distress call: you needed to ingest more water a significant time *before* you felt thirsty.
The best guide is the colour of your pee. Pale straw: sufficiently hydrated. Darker: you aren't ingesting enough water. (Bright yellow: lay off the artificially coloured snacks!)
I can assure you that drinking a pint of water laced with a small amount of certain pharmaceuticals will result in a pint of pee within an hour, followed by more pints of pee, and severe dehydration if you don't replace the water. Coffee is in the fourth division compared to a real diuretic drug.
Coffee is a weak diuretic, but who cares? You go to the loo, and then you visit the water fountain to replace the water. A small price to pay for the concentration-enhancing effect of coffee.
(I've no idea whether it boosts my memory. It certainly gets rid of sleepiness and, to some extent, seasonal blues).
Not sure about Viglen, I'm guessing it's just up against Dell and the like
They were more than holding their own (in niche markets such as education and HPC) until maybe two years ago. You could order exactly what you wanted, and you'd know that there would be no component substitutions made without your approval. If you look after hundreds of PCs and want trouble-free image installations, that's quite important. Also, they were pretty reliable.
I think the problem is technological. As more and more got built into the chips, there's less and less customisation available to a system builder, and less and less to differentiate motherboards and base systems. Also, Viglen specialized in systems built from Intel-branded motherboards, and Intel is stopping making them.
It's not like clockwork, and the timescale is geological. May not happen for hundreds of centuries yet. Hope not.
Actually it's conceptually easy to trigger a supervolcano eruption. Drill down as far as you can, to maybe 500m above the magma, then put a "Tsar Bomba" hundred-megatonne nuke at the bottom and similar nukes every 500m or so all the way to the top, and blow them all at once. Fortunately, I don't think even the leadership of North Korea is quite that crazy. (Scarily, ISTR that there is a supervolcano reservoir inside North Korea's borders).
Definitely no joke. I'd be planning to relocate as soon as reasonably possible. Vesuvius erupts far more often than supervolcanoes do. If you leave relocation until there's smoke coming out of the volcano, it may be too late to get yourself (and the entire population of Naples) safely out of town.
Scientific Linux is not a clone of Centos. It's also a derivative of Red Hat's source (not quite a clone, for significant reasons).
CERN(*) depended on the old Red Hat free-to-copy model. When Red Hat started charging after RHL 9, CERN had a problem. (Methinks someone at Red Hat didn't understand that CERN had thousands, perhaps millions, of systems embedded in apparatus, and really could not countenance any per-CPU charging scheme. I suspect that if Red Hat had offered CERN a no-support unlimited-copies license at a reasonable price, i.e. the status quo, they'd have paid for it, and the rest of us would be poorer for it).
Anyway, they didn't, and CERN took the only route that they could. Changing horses was not an option. Taking the source, and building their own distribution, was an option. CERN has a lot of very smart IT guys. So Scientific Linux was born (with the most inappropriate name of any Linux distribution).
Maybe it was a clone on day one, but they take the attitude that if something is needed for CERN that's not in Red Hat, it goes in, and if a bug is troubling CERN, then they fix it (even if Red Hat hasn't, or won't). However, they prefer to avoid divergence. From an ordinary user's point of view, you'll find it hard to tell the difference between Scientific Linux and Red Hat. The most obvious change is that a default Scientific Linux install has automatic yum updating turned on. The next most noticeable thing is that SL has a fair number of (science-related, optional) packages in the distribution repositories which are not in Centos or RHEL. I'm told that the SL kernel has a few extra things built in or moved into modules, but I've never run into anything that works on Centos or RHEL that doesn't work on the corresponding SL.
Centos used to claim bug-for-bug compatibility with Red Hat, but since RHEL6 that has become harder for them (different build tools). Anyway, do you really want to suffer a fixable bug just because some other distribution hasn't yet fixed it? So now Centos is also not quite a clone.
Perhaps it's like evolution. They're strains or races, not yet different species. The environmental change that would cause a speciation event (or a fork) has not yet happened, and hopefully won't.
(*) CERN implies "and Fermilab", everywhere.
Which RHEL are you talking about? 5, 6 or 7?
They are all current. If you want the most stable production platform, and provided you don't *need* the newer features, 5 might still be the best choice. (Though 6 seems pretty darned stable to me).
Also you need to evaluate the anatomy of whatever bugs are hurting you. If it's a bug in, say, Samba, the chances are high that you'll find the same bug with Samba running atop SuSe or Ubuntu. I.e., it's not Red Hat's code or package-building at fault.
At least the bugs do get fixed. As opposed to being swept under the carpet until a black hat starts exploiting them, or being documented as features, or being told to migrate to an incompatible and expensive version N+1 or lose all support. Techniques frequently adopted by closed-source alternatives.
The previous history is relevant.
With RHEL5, it was easy for Centos or anyone else to strip out the Red Hat copyrighted images and repeat Red Hat's build process using the open sources which Red Hat are obliged to distribute. They didn't care that Centos (and CERN - Scientific Linux) did this. They did care when Oracle did the same.
So with RHEL6 they made the build tools less open and more obfuscated, and that's why Centos 6 arrived a rather long time after RHEL6 (they had rather a lot of reverse-engineering to do). Centos was "collateral damage". Oracle was the target (it was basically taking Red Hat's software, relabelling it, and reselling it in competition).
I'd feared that they would complete the process with RHEL7 and make RHEL7 close to uncloneable despite the openness of the source. Does anyone know if they are freely licensing proprietary build tools to Centos and other free-beer distributions, while leaving Oracle to stew? If they are, it seems like the best possible solution.
You won't get useful amounts of energy from the cosmic microwave background, nor from acoustic noise somewhere you can hear a pin drop.
Mind you, acoustic scavenging might actually fly in some workplaces I can think of!
I was thinking how completely irritating and disabling it is, if there's anything on your eyeball that doesn't move exactly the same way as the eyeball. Think of a grain of sand in your eye, or conjunctivitis.
If the implant does react exactly the same as the eyeball with no added mechanical resistance, there's no way to harvest mechanical energy.
Solar power sums: the usual figure is 100 watts per square meter harvested from bright sunlight. That's 100 microwatts per square millimeter (10 microwatts on a dull day, maybe 1 microwatt indoors with office grade lighting).
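The unit conversion, for anyone checking the sums:

# 100 W per square metre, re-expressed per square millimetre.
W_PER_M2 = 100.0
MM2_PER_M2 = 1000.0 * 1000.0  # 1 m^2 = 10^6 mm^2

uW_per_mm2 = W_PER_M2 / MM2_PER_M2 * 1e6  # watts -> microwatts
print(uW_per_mm2)        # 100.0 in bright sunlight
print(uW_per_mm2 / 10)   # 10 on a dull day
print(uW_per_mm2 / 100)  # 1 under office-grade lighting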
And further to that thought, if the custom silicon is on a card on the bus in a conventional server PC, you can yank it out and plug in this year's model. Rather more work than upgrading pure software(*), but not nearly as much work as replacing the entire server farm.
(*) that's after you've sorted out all the reconfiguration issues, and just have to repeat the same tried-and-tested process over and over again.
What is it that you need in the server farm that you can do with hardware integrated on the same slice of silicon as the CPU, but can't do with a separate piece of silicon attached to an Intel CPU's external bus?
I appreciate that at the consumer device end, there are serious economies to be reaped by integration of a system on one chip. (Serious economies means maybe a few tens of dollars per system). In the server room, I don't think a $50 cost advantage will win any arguments. It needs to be a technological advantage, or a price advantage at least one order greater.
Ultimately, I do expect Intel will be fabbing the world's ARM CPUs with their world-leading process technology, but in the first instance for mobiles, not for servers. ARM will conquer the server room last, if ever.
Agree. Better a standard USB stick, with some way of pulling out a short flexible micro USB cable (an inch or two would be long enough).
I have a portable DVD drive which has a USB cable that clips into recesses mounted in the plastic base of the drive when it's not being used. Something similar to that?