Re: It's part of a bigger picture
still uses Imperial measurements or, perhaps more accurately, does not use the metric system
Perhaps the rest of the world could start (accurately) calling them British imperial units, to help the USA readjust?
They probably mean the Windows 8 tiles page will use a better pattern matcher, so that a dyslexic can also find his apps. Well, more often than at present. Xecel ... Cexle ... Exlec ...
People who have Windows phones hate Windows 8 on their desktop PC. People who have Windows tablets hate Windows 8 on their desktop PC. Apple understands this, and sells three different interfaces matched to three classes of devices: phones, tablets, and desktop computers.
The choices will be Windows 8, Windows 9, or migrate away (Apple? Android? Linux? )
It's make or break for Microsoft. If businesses can see that they are going to have to migrate from XP/7 "Windows" to something that shares only a Microsoft logo on the packaging and a kernel, the other alternatives won't look nearly so radical as they once did.
As already posted above, all Microsoft has to do is give businesses what they actually want. Otherwise, Microsoft will be signing its own corporate death warrant.
I'm conflicted about that one. True, Aero will run on any modern graphics including Intel on-chip. But is it worth the extra electricity cost the 3D effects will inflict on your organisation?
I'd say bring back XP-style windows. Neither Aero nor the Metro desktop was an improvement.
Depending on your employment:
"C++ for dummies"
"Visual Basic for dummies"
"Linux system management for dummies"
And methinks there are a few bastards out there who should have read "Banking for dummies" but never did, and never let it hold them back.
Other missing options are "none of the above" and "it all depends on who catches you".
And Mao was the worst of the lot. Not only did he have even more people to kill, but he was also a paedophile.
(Or should one judge in percentage of population murdered, in which case Pol Pot is the worst)?
The lesson to learn is that the greater the concentration of power at the top, the worse the consequences. Or in a variant I once heard, "The best system of government is a benign dictatorship. Except that we've never worked out how to keep the dictator benign, and we never will, so don't go there".
A Dan Brown book is not un-enjoyable if you picked it up in a charity shop out of curiosity, and have time to kill at an airport and on a plane. You do have to park your critical faculties and intellect in neutral; maybe some people can't do that. But isn't that true of most fiction?
A week later gave it back to the charity shop to sell again.
Nevertheless, people hold politicians in sufficiently low regard that politicians telling them what not to read may actually elevate the banned or merely deprecated material in certain people's minds. (Especially, I fear, in the minds of people who lack the intellectual capacity to read for themselves, anything longer than one column in a down-market newspaper).
So bans are counterproductive, even if well-intentioned.
The moment the authorities ban a book or try to persuade you that it will warp your mind, read it. Ditto if any significant pressure group is protesting its outrage. You may well decide it's a load of old rubbish, but if there's one thing in this world to avoid, it's allowing other people to make up your mind for you. You are a human being, not an ant.
Because it was always likely that drinking a pint or so of water with a small amount of dissolved chemicals would lead to dehydration
Dehydration is misunderstood. Perhaps surprisingly, thirst and hunger aren't similar. One experiences hunger once one's body is capable of processing more food; one isn't in danger of physiological distress from lack of food for a day or more after one's last meal. In contrast, thirst is a physiological distress call. You needed to ingest more water a significant time *before* you felt thirsty.
The best guide is the colour of your pee. Pale straw: sufficiently hydrated. Darker: you aren't ingesting enough water. (Bright yellow: lay off the artificially coloured snacks!)
I can assure you that drinking a pint of water laced with a small amount of certain pharmaceuticals will result in a pint of pee within an hour, followed by more pints of pee, and severe dehydration if you don't replace the water. Coffee is in the fourth division compared to a real diuretic drug.
Coffee is a weak diuretic, but who cares? You go to the loo, and then you visit the water fountain to replace the water. A small price to pay for the concentration-enhancing effect of coffee.
(I've no idea whether it boosts my memory. It certainly gets rid of sleepiness and, to some extent, seasonal blues).
Not sure about Viglen, I'm guessing it's just up against Dell and the like
They were more than holding their own (in niche markets such as education and HPC) until maybe two years ago. You could order exactly what you wanted, and you'd know that there would be no component substitutions made without your approval. If you look after hundreds of PCs and want trouble-free image installations, that's quite important. Also, they were pretty reliable.
I think the problem is technological. As more and more gets built into the chips, there's less and less customisation available to a system builder, and less and less to differentiate motherboards and base systems. Also, Viglen specialised in systems built from Intel-branded motherboards, and Intel has stopped making them.
It's not like clockwork, and the timescale is geological. It may not happen for hundreds of centuries yet. Hope not.
Actually it's conceptually easy to trigger a supervolcano eruption. Drill down as far as you can, maybe 500m above the magma, then put a "Tsar Bomba" hundred-megatonne nuke at the bottom and similar nukes every 500m or so all the way to the top, and blow them all at once. Fortunately, I don't think even the leadership of North Korea is quite that crazy. (Scarily, ISTR that there is a supervolcano reservoir inside North Korea's borders).
Definitely no joke. I'd be planning to relocate as soon as reasonably possible. Vesuvius erupts far more often than supervolcanoes. If you leave relocation until there's smoke coming out of the volcano, it may be too late to get yourself (and the entire population of Naples) safely out of town.
No consistency problems. Eruptions of different supervolcanoes are not correlated. Eruption times of a single supervolcano are vaguely cyclical, but not anything like clockwork. So the earth has had two go pop in the last 100K years, and may have none go pop in the next 100K years, averaging out as one every 100K years.
In reality (a) you'd have to average over a few Myears to get meaningful statistics, and (b) the Yellowstone supervolcano is likely to go pop in the geological near future. (Say in the next 100k years with probability 0.7 or higher).
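Those statistics can be made concrete under the simplest possible assumption: treat eruptions as a memoryless Poisson process averaging one per 100 kyr. (This is a deliberate oversimplification; real eruption records are quasi-periodic, which is why the Yellowstone estimate above can exceed this baseline.)

```python
import math

rate_per_100kyr = 1.0  # assumed long-run average: one supereruption per 100,000 years

def p_at_least_one(n_intervals):
    """P(>= 1 eruption) in n_intervals * 100 kyr, for a memoryless Poisson process."""
    return 1 - math.exp(-rate_per_100kyr * n_intervals)

# Under this naive model the chance of at least one eruption in the next
# 100 kyr is about 63%, not 100%, even though the long-run average is one
# per 100 kyr. "Averaging out" is entirely compatible with quiet spells.
print(round(p_at_least_one(1), 2))  # 0.63
```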
Scientific Linux is not a clone of Centos. It's also a derivative of Red Hat's source (not quite a clone, for significant reasons).
CERN(*) depended on the old Red Hat free-to-copy model. When Red Hat started charging after RHL 9, CERN had a problem. (Methinks someone at Red Hat didn't understand that CERN had thousands, perhaps millions, of systems embedded in apparatus, and really could not countenance any per-CPU charging scheme. I suspect that if Red Hat had offered CERN a no-support unlimited-copies license at a reasonable price, i.e. the status quo, they'd have paid for it, and the rest of us would be poorer for it).
Anyway, they didn't, and CERN took the only route that they could. Changing horses was not an option. Taking the source, and building their own distribution, was an option. CERN has a lot of very smart IT guys. So Scientific Linux was born (with the most inappropriate name of any Linux distribution).
Maybe it was a clone on day one, but they take the attitude that if something is needed for CERN that's not in Red Hat, it goes in, and if a bug is troubling CERN, then they fix it (even if Red Hat hasn't, or won't). However, they prefer to avoid divergence. From an ordinary user's point of view, you'll find it hard to tell the difference between Scientific Linux and Red Hat. The most obvious change is that a default Scientific Linux install has automatic yum updating turned on. The next most noticeable thing is that SL has a fair number of (science-related, optional) packages in the distribution repositories which are not in Centos or RHEL. I'm told that the SL kernel has a few extra things built in or moved into modules, but I've never run into anything that works on Centos or RHEL that doesn't work on the corresponding SL.
Centos used to claim bug-for-bug compatibility with Red Hat, but since RHEL6 that has become harder for them (different build tools). Anyway, do you really want to suffer a fixable bug just because some other distribution hasn't yet fixed it? So now Centos is also not quite a clone.
Perhaps it's like evolution. They're strains or races, not yet different species. The environmental change that would cause a speciation event (or a fork) has not yet happened, and hopefully won't.
(*) CERN implies "and Fermilab", everywhere.
Which RHEL are you talking about? 5, 6 or 7?
They are all current. If you want the most stable production platform, and provided you don't *need* the newer features, 5 might still be the best choice. (Though 6 seems pretty darned stable to me).
Also you need to evaluate the anatomy of whatever bugs are hurting you. If it's a bug in, say, Samba, the chances are high that you'll find the same bug with Samba running atop SuSe or Ubuntu. I.e., it's not Red Hat's code or package-building at fault.
At least the bugs do get fixed. As opposed to being swept under the carpet until a black hat starts exploiting them, or being documented as features, or being told to migrate to an incompatible and expensive version N+1 or lose all support. Techniques frequently adopted by closed-source alternatives.
The previous history is relevant.
With RHEL5, it was easy for Centos or anyone else to strip out the Red Hat copyrighted images and repeat Red Hat's build process using the open sources which Red Hat are obliged to distribute. They didn't care that Centos (and CERN - Scientific Linux) did this. They did care when Oracle did the same.
So with RHEL6 they made the build tools less open and more obfuscated, and that's why Centos 6 arrived a rather long time after RHEL6 (they had rather a lot of reverse-engineering to do). Centos was "collateral damage". Oracle was the target (it was basically taking Red Hat's software, relabelling it, and reselling it in competition).
I'd feared that they would complete the process with RHEL7 and make RHEL 7 close to uncloneable despite the openness of the source. Does anyone know if they are freely licensing proprietary build tools to Centos and other free-beer distributions, while leaving Oracle to stew? If they are, it seems like the best possible solution.
You won't get useful amounts of energy from the cosmic microwave background, nor from acoustic noise somewhere you can hear a pin drop.
Mind you, acoustic scavenging might actually fly in some workplaces I can think of!
I was thinking how completely irritating and disabling it is, if there's anything on your eyeball that doesn't move exactly the same way as the eyeball. Think of a grain of sand in your eye, or conjunctivitis.
If the implant does react exactly the same as the eyeball with no added mechanical resistance, there's no way to harvest mechanical energy.
Solar power sums: the usual figure is 100 watts per square meter harvested from bright sunlight. That's 100 microwatts per square millimeter (10 microwatts on a dull day, maybe 1 microwatt indoors with office grade lighting).
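A quick sanity check of that unit conversion (1 m = 1000 mm, so 1 m^2 = 10^6 mm^2):

```python
# 100 W per square metre harvested in bright sunlight: the ballpark figure above.
watts_per_m2 = 100.0
mm2_per_m2 = 1000 * 1000  # 10^6 square millimetres in a square metre

microwatts_per_mm2 = (watts_per_m2 / mm2_per_m2) * 1e6

print(microwatts_per_mm2)        # 100.0 uW/mm^2 in bright sunlight
print(microwatts_per_mm2 / 10)   # ~10 uW/mm^2 on a dull day
print(microwatts_per_mm2 / 100)  # ~1 uW/mm^2 under office lighting
```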
I'd have thought a better solution would be those power cells that metabolise glucose.
Good idea for implants in blood-perfused tissue, but the surface of an eye is not well-supplied with blood or glucose.
The best way for an eyeball implant would probably be photovoltaic ("solar power"). Eyeballs are well supplied with light, at least while you are awake. Transparent solar cells can be made (indeed, large ones are being installed as windows on "green" skyscrapers).
Another way would be wireless power (I'm assuming only microwatts are needed). You'd need to wear a power transmitter elsewhere on your person if continuous power was needed. You might be able to scavenge power from a 21st century environment if continuous power was not vital. (ie, parasitise off WLANs and mobile phones).
Yet another way would be piezo-kinetic power scavenging (as in the Seiko Kinetic watches). Again I can see that working much better for implants in other parts of the body.
And further to that thought, if the custom Silicon is on a card on the bus in a conventional server PC, you can yank it out and plug in this year's model. Rather more work than upgrading pure software(*), but not nearly as much work as replacing the entire server farm.
(*) that's after you've sorted out all the reconfiguration issues, and just have to do the tried and tested same over and over again.
What is it that you need in the server farm, that you can do with hardware integrated on the same slice of silicon as the CPU, that you can't do with a separate piece of silicon attached to an Intel CPU's external bus?
I appreciate that at the consumer device end, there are serious economies to be reaped by integration of a system on one chip. (Serious economies means maybe a few tens of dollars per system). In the server room, I don't think a $50 cost advantage will win any arguments. It needs to be a technological advantage, or a price advantage at least one order greater.
Ultimately, I do expect Intel will be fabbing the world's ARM CPUs with their world-leading process technology, but in the first instance for mobiles, not for servers. ARM will conquer the server room last, if ever.
Agree. Better a standard USB stick, with some way of pulling out a short flexible micro USB cable (an inch or two would be long enough).
I have a portable DVD drive which has a USB cable that clips into recesses mounted in the plastic base of the drive when it's not being used. Something similar to that?
A lot of thought would need to go into providing a dictionary. Carefully chosen well-labelled pictures ought to allow a long-lost language to be reborn. If only (say) the Minoans or Etruscans had produced children's picture books on stone tablets.
In an SF story I once read, the (accidental) Rosetta stone was an annotated periodic table, although obviously only a fairly advanced civilisation could decode that one.
We don't want them being used as fetish objects or clubs by the cave-dwelling man apes that will wander the post-apocalyptic wastelands.
Or maybe we do, provided they're suitably resilient. Something that's periodically rediscovered and regarded as treasure from the great ones of yesteryear may stand a better chance of finally being decoded, than something lost in a hole in the ground getting buried deeper and deeper with every passing millennium. Tableware made of a high-tech ceramic much tougher than mere porcelain might be a good choice. If some barbarian manages to smash it and dumps it in a midden, the information loss probably isn't very great.
Anyway you'd run both strategies in parallel, with very many identical plates for redundancy.
Perhaps this method is right, for billion-year timescales.
If we're aiming only at the next civilisation after this one fails, or perhaps the next intelligent species after this one goes extinct, then a Babylonian solution beckons. Fired clay tablets (with a modern twist). Perhaps a somewhat more advanced ceramic, such as the all but indestructible "Corelle" that Corning once made plates from. (Same problem as un-ladderable stockings. The plates last forever. No repeat sales. No profits. Discontinued. Sigh.)
You could put some low-tech diagrammatic writing on the plates, in an attempt to draw attention to the high-definition data embossed in the ceramic, or ink-jet printed at ~100dpi. Make them both beautiful and startlingly hard to break, and stone-age or barbarian peoples might preserve them rather than destroy them. Anyway, it's really hard work to destroy the information on a ceramic surface, and a subsequent civilisation will reassemble the fragments from your rubbish tip. Make lots and lots of them, distribute them as superior mass-market tableware, and survival of some of them is virtually guaranteed. All we need is a billionaire with a long-term view of things to underwrite the project. (Thinks ... an Indian one ... must be good for a lot of positive Karma. The largest Hindu unit of time is a large multiple of My, and they believe in re-incarnation and in cycles of creation and destruction. It all fits.)
(Apparently we know more about the everyday minutiae of Babylonian life than about any civilisation since!)
The Earth will become uninhabitable a lot sooner than 5 Gigayears hence. We're actually rather close to the inwards edge of the habitable zone around the Sun, and the Sun is getting hotter as it ages. Unless "we" initiate major planetary protection operations within the next Gigayear (orbital sunshades, or orbit expansion), life will be over by then. Some estimate as soon as 300My, before Earth suffers thermal runaway the same as Venus. (OMG multicellular life is having its midlife crisis! )
If we want to leave a *really* long-term record, Earth isn't really the right place. Too much corrosive oxygen and water and those awkward plate tectonics, and a boiling sulphuric acid nightmare after the end of life on Earth. The Moon is better (dig in deep to protect against all but huge meteor strikes, and position-mark with long-life radioactives near the surface). An outer moon of Saturn would be better still, might even survive Sol going red giant and nova. (Ring any bells? ....)
I'd be most surprised if he didn't start by dropping weights. It's not hard to distinguish clunk! from clunk-clunk!, where the impacts are separated by as little as 50ms (maybe less). Then he thought "that's interesting". Then he'd have looked for a way to make the experiment more accurate, and got lucky by using solid rather than hollow spheres on an inclined plane. Right result, but missing a large chunk of reasoning about then-unknown rotational kinetic energy and the importance of the spheres being homogeneous.
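The rotational-energy point can be made quantitative. For a sphere rolling without slipping down an incline, a = g·sin(theta) / (1 + I/(m·r^2)), and the inertia factor I/(m·r^2) is 2/5 for a solid sphere but 2/3 for a thin hollow shell, so only homogeneous spheres give clean, repeatable results. A small sketch (the incline angle is an arbitrary choice; the ratio is independent of it):

```python
import math

g = 9.81  # m/s^2
theta = math.radians(10)  # a gentle 10-degree incline, chosen arbitrarily

def rolling_acceleration(inertia_factor):
    """a = g*sin(theta) / (1 + I/(m r^2)) for rolling without slipping."""
    return g * math.sin(theta) / (1 + inertia_factor)

a_solid = rolling_acceleration(2 / 5)   # solid sphere: I = (2/5) m r^2
a_hollow = rolling_acceleration(2 / 3)  # thin hollow shell: I = (2/3) m r^2

# A solid sphere rolls about 19% faster than a hollow shell of the same mass
# and radius, so an inhomogeneous sphere would have muddied Galileo's result.
print(round(a_solid / a_hollow, 3))  # 1.19
```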
The gravitational interaction of two human-sized objects has been observed, and Newton's law of gravitation (inverse-squares) confirmed for masses of the order of kilogrammes at distances of the order of a meter. The experiments are very hard, and the accuracy isn't great. Some alternative theories have proposed that gravitation breaks down on milli- or micro-meter scales, and of course at much smaller scales still we get into quantum effects. These aren't yet testable.
Other theories posit that inverse-squares gravitation breaks down on the scale of a galaxy and above, and attempt to explain away dark matter and dark energy in terms of a different law of gravitation on very large scales. So far, such attempts have not been very successful in doing away with the need for dark matter and energy to explain the observations.
The equivalence principle is much better-tested for terrestrial masses, than Newton's law itself. But all the masses tested were made of conventional atoms. Perhaps fortunately, we don't have neutronium or large masses of antimatter to play with.
AFAIK the orbits of the Voyager spacecraft are still observed to be not quite as expected. Tantalizing, but they weren't ever designed as gravity probes, and their deviation from predicted orbit is just within the likely errors of observations.
I haven't worked out the details (which will be very complex, probably involving serious supercomputer resources to model) but there will be gravitational perturbations of each orbit by the other bodies. Here in the solar system, the existence and position of Pluto was deduced and calculated by very careful observation of the orbits of Uranus and Neptune. Once they knew where to look, they pointed a powerful telescope at that patch of sky, and there was Pluto.
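A toy illustration of the kind of computation involved (the real thing needs high-precision ephemerides and far better integrators; all numbers below are made-up toy units): a minimal leapfrog n-body integrator, where the perturbations of each orbit by every other body fall straight out of the pairwise inverse-square sums.

```python
import math

G = 1.0  # toy gravitational constant; real work uses SI or astronomical units

def accelerations(positions, masses):
    """Pairwise inverse-square accelerations on each body (2D, for brevity)."""
    n = len(positions)
    acc = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - positions[i][0]
            dy = positions[j][1] - positions[i][1]
            r = math.hypot(dx, dy)
            f = G * masses[j] / r**3  # |a| / r, pointing from i towards j
            acc[i][0] += f * dx
            acc[i][1] += f * dy
    return acc

def step(positions, velocities, masses, dt):
    """One kick-drift-kick leapfrog step; symplectic, so orbits don't decay spuriously."""
    acc = accelerations(positions, masses)
    for i in range(len(positions)):
        velocities[i][0] += 0.5 * dt * acc[i][0]
        velocities[i][1] += 0.5 * dt * acc[i][1]
        positions[i][0] += dt * velocities[i][0]
        positions[i][1] += dt * velocities[i][1]
    acc = accelerations(positions, masses)
    for i in range(len(positions)):
        velocities[i][0] += 0.5 * dt * acc[i][0]
        velocities[i][1] += 0.5 * dt * acc[i][1]

# A primary, a planet in a near-circular orbit, and a light outer perturber.
masses = [1.0, 0.001, 1e-6]
positions = [[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]]
velocities = [[0.0, 0.0], [0.0, 1.0], [0.0, 0.7]]
for _ in range(1000):
    step(positions, velocities, masses, dt=0.01)
```

Detecting an unseen body then amounts to fitting the model to observed positions and looking for a systematic residual, which is what was done for Pluto.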
For this system we can test GR rather than simple Newtonian gravity, because the bodies are very dense and close. It tests the equivalence principle, because a neutron star is made of neutrons and a white dwarf is made of ordinary atomic nuclei (i.e. with protons, and electrons between the nuclei). Short of a black hole, that's about as different as two astronomical masses can be. (*)
I'll have to file this system along with zircon dating of the age of the Earth. $DEITY is being extremely kind to us, providing us with the means to scientifically investigate questions we might have thought were forever beyond our grasp.
(*) I've always wondered whether bosons (such as photons, or composite bosons like helium-4 nuclei) and fermions (such as electrons, protons, and neutrons) are gravitationally equivalent. Is there any way to probe this, even conceptually, given that the electromagnetic interactions of electrical charges are about 10^40 times greater than any gravitational effects?
Here on Earth, if you put the feather on top of the hammer and drop in still air, the feather will fall with the hammer. The hammer pushes the air out of the way, and gravity does the rest.
I'd be very surprised if Galileo hadn't tried this, though he was probably smart enough to realise that the limitations of this particular experiment would be seized upon by his enemies.
All it takes is to split the hull; air pressure, plus the 600mph breeze outside, will do the rest.
Contradictory datapoint: the Aloha airlines "open-top 737" incident.
Puncture? Almost certainly yes, if taped to the skin of the airframe itself. On the inner plastic skin, I doubt it. Further away, no chance. It's also unlikely that a small hole could bring down an airliner.
Yes, I know. MS would like all of its users to be tied in to a browser that won't let them block adverts, and which reports more than you know back to Microsoft so that they can target you with more "relevant" and unblockable advertising. The users, on the other hand ....
Ghostery and NoScript...
Adblock-plus, Flashblock, and Tabkit (tabs down the LHS in collapsible trees, not along the top)
Privacy concerns about what is being sent to Microsoft (IE) or Google (Chrome) without my knowledge or consent.
Mozilla doesn't have to hope anything. Firefox works just fine on Linux and anything else that might replace XP and Windows 7 on business desktops (including the Windows 8 desktop).
The day Microsoft announces EOL for Windows 7 without having an upgrade path that doesn't involve massive costs (including retraining all a business's low-skill keyboard-pokers), is the day Microsoft will have signed its own death warrant.
If I hadn't been around while Digital self-destructed, I'd think it couldn't happen.
There's actually a decent successor to Windows 7 in Windows 8, if only it could be made to boot to desktop with a 7-compatible login /switch user screen and start menu, and TIKFAM made so it can be defaulted hidden and configured completely unavailable using AD policies. Doesn't actually look too hard, if Microsoft would only stop pushing TIKFAM at business users who know they absolutely don't want it in any shape or form.
Personally I wish they'd decide that there never will be an official version of Firefox that works anywhere on Windows 8 except for the Windows 7-compatible desktop, until someone stumps up the entire cost of porting it.
If Microsoft think the non-availability hurts their chances of persuading people that TIKFAM is any more seaworthy than the post-iceberg Titanic, let them pay the Mozilla foundation to do the port. If Microsoft doesn't care to do that, tell the world that's why there's no TIKFAM-Firefox.
A thought on the NUC form-factor.
Intel needs to put more SATA connectors on it, not just one, so that it can be used as the basis of systems that need >1 hard disk. (IMO every system with locally-stored data needs >1 hard disk, for mirroring - at £40 for an HD, can you afford not to?) How much do four SATA connectors cost? Surely on-chip SATA interfaces are (or can be) designed to idle at microwatts if there is no hardware connected to them? There's clearly no great cost to putting six SATA ports on desktop boards that rarely have more than three of them connected. If there were, they'd leave them out and the minority would buy PCIe SATA controllers.
You could connect extra disks by extending the NUC upwards with a section that bolts on top of the NUC box and holds the disks. Keeping the original lid and bolting that on top of the extended NUC would be a nice touch.
If the form-factor catches on the price ought to drop as NUC-format boards start being made by ASUS, Gigabyte etc., along with cases for passive cooling, multiple disks, etc. in various different shapes.
Intel set the ATX PSU format and the common desktop motherboard formats ATX, mATX, ITX. OTOH they failed to persuade anyone that BTX PSUs were a good idea. Standard form-factors are a good idea in principle.
For the record, if anyone finds this post with Google, I have at last found my passively cooled ITX board with a faster CPU than an Atom. It's a Gigabyte GA-C1037UN-EU. The -EU is important because there is another variant with a fan and (I presume) a wimpier heatsink and/or a different BIOS that will moan if the fan isn't present.
The CPU is a Celeron 1037U, 17W TDP, which benchmarks about 2.5x my Atom. It also takes up to 16GB of DDR3, which should save on disk IO. The board (incl CPU) is inexpensive (£72 incl VAT and shipping) and even boasts 3 x SATA (one of which is SATA-3, great for an SSD), USB3, and one E-SATA.
Fingers crossed. Intel shouldn't be marketing these chips as Celerons. They should market them as low-wattage low-end Ivy Bridge i3 (which they are), or even as faster Atoms. There's a Haswell version coming "soon", but I decided not to wait for it.
I presume it still has an annoying fan in there somewhere, which is why IMHO it doesn't warrant a premium price. (If I'm wrong and the whole case is metal acting as a passive heatsink, please correct me and I'll reconsider).
What I want is a passively cooled board that will fit in an ITX case, to replace the Atom D520 I have at present. Having gotten used to silence, I don't want even a whisper of fan noise. Nobody seems to make such a board, though there's an i3 laptop CPU that's loads faster than the Atom and which has much the same TDP.
Failing which I'll be returning to a mini-tower case with a fan-less PSU and a NoFan heatsink on an ordinary Intel desktop CPU. Small size isn't a killer feature. Silence is.
Bitcoin is not untraceable. It's perfectly traceable for all time: that's what the blockchain is! Trouble is, it would take someone with the resources of a whole nation to actually follow the trail (and then it would probably dead-end at a corrupt exchange where the traceable bitcoins were turned into untraceable cash).
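The traceability point can be sketched with a toy transaction graph (every address name below is invented for illustration): a breadth-first walk from a known-bad address finds every downstream address the coins touched. The hard part in practice is mixers and exchange hops, not the graph walk itself.

```python
from collections import deque

# Toy transaction graph: address -> addresses it sent coins to.
tx_graph = {
    "ransom_addr": ["mixer_1", "wallet_a"],
    "mixer_1": ["wallet_b", "exchange_x"],
    "wallet_a": ["exchange_x"],
    "wallet_b": [],
    "exchange_x": [],  # trail typically dead-ends where coins become cash
}

def trace(start):
    """Breadth-first walk of everywhere the coins flowed from `start`."""
    seen, queue = {start}, deque([start])
    while queue:
        addr = queue.popleft()
        for dest in tx_graph.get(addr, []):
            if dest not in seen:
                seen.add(dest)
                queue.append(dest)
    return seen

print(sorted(trace("ransom_addr")))
# ['exchange_x', 'mixer_1', 'ransom_addr', 'wallet_a', 'wallet_b']
```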
Personally, I think that we should impose consecutive sentences on anyone proved to be responsible for deliberately destructive malware. Let's be generous, say just one day per victim of extortion. Destroy 36,500 or more computers with your malware (that's a hundred years' worth of days), and go to jail for life. Other countries might prefer to hang them, and I'd not be particularly bothered if they do. Less so than about many murderers. There are people losing their livelihoods by the thousands because of acts like these, and I'd bet that there will have been suicides (plural, probably tens of them) as a consequence. Yes, I know that everything should be stored on servers run by professionals who make nightly or hourly read-only snapshots of their filesystems, but in the real world there are very many small businesses who don't have any IT staff at all (but still rely on a few PCs).
Yes, I'm ranting, so I'll stop.
If it's like a portable Chromebook, it won't be hard to install Linux instead of Chrome OS. There's also the possibility of keeping Chrome OS and using it as a thin terminal for anything that needs more than a web browser.
It was quite a few years ago that IBM sold its hard disk business. At the time there was speculation as to motive. Was it simply because IBM saw it was labouring under a disadvantage, trying to sell its hard drives to competitors in the server space such as HP and Compaq? Or was it because IBM believed this was a business with a declining long-term future, and sold out before that view became widely held?
I suspect the latter. They compare IBM to an elephant. It can't gallop, but it can move surprisingly fast, and it knows better than most large companies where it is and where it wants to get to.
fighting against engineers using the terms "master" and "slave" in their documentation
I'm intrigued to know what she suggested as an alternative. IMO the terms are spot on, and may even amount to an implicit ethical warning for some (far?) future date, that there's a major problem should "slave" software ever approach the threshold for sentience.
"Producer" and "Consumer" are also used, but differently: this terminology suggests that the consumer side is not incapable of taking decisions. And of course, it's also politically loaded!
I can't help observing that to get a Fermion back into the same state it started in, you have to rotate it through not 360 degrees but 720 degrees. Are Fermions female? Discuss.
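For what it's worth, the 720-degree property is easy to check numerically. A spin-1/2 rotation about z is U(theta) = exp(-i*theta*sigma_z/2), which is diagonal, so two complex exponentials suffice:

```python
import cmath
import math

def rotation_z(theta):
    """Diagonal entries of the spin-1/2 rotation U = exp(-i*theta*sigma_z/2)."""
    return [cmath.exp(-1j * theta / 2), cmath.exp(1j * theta / 2)]

full_turn = rotation_z(2 * math.pi)    # 360 degrees
double_turn = rotation_z(4 * math.pi)  # 720 degrees

# After 360 degrees the state picks up an overall factor of -1 (observable in
# interference experiments); only after 720 degrees is it truly back.
print([round(x.real) for x in full_turn])    # [-1, -1]
print([round(x.real) for x in double_turn])  # [1, 1]
```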