It's not like clockwork, and the timescale is geological. May not happen for hundreds of centuries yet. Hope not.
Actually it's conceptually easy to trigger a supervolcano eruption. Drill down as far as you can, maybe 500m above the magma, then put a "Tsar Bomba" hundred-megatonne nuke at the bottom and similar nukes every 500m or so all the way to the top, and blow them all at once. Fortunately, I don't think even the leadership of North Korea is quite that crazy. (Scarily, ISTR that there is a supervolcano reservoir inside North Korea's borders).
Definitely no joke. I'd be planning to relocate as soon as reasonably possible. Vesuvius erupts far more often than supervolcanoes. If you leave relocation until there's smoke coming out of the volcano, it may be too late to get yourself (and the entire population of Naples) safely out of town.
No consistency problems. Eruptions of different supervolcanoes are not correlated. Eruption times of a single supervolcano are vaguely cyclical, but not anything like clockwork. So the Earth has had two go pop in the last 100K years, and may have none go pop in the next 100K years, averaging out as one every 100K years.
In reality (a) you'd have to average over a few Myears to get meaningful statistics, and (b) the Yellowstone supervolcano is likely to go pop in the geological near future. (Say in the next 100k years with probability 0.7 or higher).
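As a back-of-envelope illustration (my numbers, not anything rigorous): if you model eruptions as a Poisson process with an assumed average rate of one per 100,000 years, then neither "two in the last 100K years" nor "none in the next 100K years" is at all surprising:

```python
import math

rate = 1 / 100_000   # assumed average eruption rate: one per 100,000 years
t = 100_000          # window of interest, in years
mu = rate * t        # expected number of eruptions in the window

# Poisson process: probability of at least one eruption in the window
p_some = 1 - math.exp(-mu)
print(f"P(at least one eruption in 100 kyr): {p_some:.2f}")   # ~0.63

# ... and of exactly two eruptions in a window of the same length
p_two = mu**2 * math.exp(-mu) / math.factorial(2)
print(f"P(exactly two eruptions in 100 kyr): {p_two:.2f}")    # ~0.18
```

So a run of two eruptions followed by a long quiet spell is entirely consistent with a one-per-100K-year average, which is why you need a few Myears of data for meaningful statistics.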
Scientific Linux is not a clone of Centos. It's also a derivative of Red Hat's source (not quite a clone, for significant reasons).
CERN(*) depended on the old Red Hat free-to-copy model. When Red Hat started charging after RHL 9, CERN had a problem. (Methinks someone at Red Hat didn't understand that CERN had thousands, perhaps millions, of systems embedded in apparatus, and really could not countenance any per-CPU charging scheme. I suspect that if Red Hat had offered CERN a no-support unlimited-copies license at a reasonable price, i.e. the status quo, they'd have paid for it, and the rest of us would be poorer for it).
Anyway, they didn't, and CERN took the only route that they could. Changing horses was not an option. Taking the source, and building their own distribution, was an option. CERN has a lot of very smart IT guys. So Scientific Linux was born (with the most inappropriate name of any Linux distribution).
Maybe it was a clone on day one, but they take the attitude that if something is needed for CERN that's not in Red Hat, it goes in, and if a bug is troubling CERN, then they fix it (even if Red Hat hasn't, or won't). However, they prefer to avoid divergence. From an ordinary user's point of view, you'll find it hard to tell the difference between Scientific Linux and Red Hat. The most obvious change is that a default Scientific Linux install has automatic yum updating turned on. The next most noticeable thing is that SL has a fair number of (science-related, optional) packages in the distribution repositories, which are not in Centos or RHEL. I'm told that the SL kernel has a few extra things built in or moved into modules, but I've never run into anything that works on Centos or RHEL that doesn't work on the corresponding SL.
Centos used to claim bug-for-bug compatibility with Red Hat, but since RHEL6 that has become harder for them (different build tools). Anyway, do you really want to suffer a fixable bug just because some other distribution hasn't yet fixed it? So now Centos is also not quite a clone.
Perhaps it's like evolution. They're strains or races, not yet different species. The environmental change that would cause a speciation event (or a fork) has not yet happened, and hopefully won't.
(*) CERN implies "and Fermilab", everywhere.
Which RHEL are you talking about? 5, 6 or 7?
They are all current. If you want the most stable production platform, and provided you don't *need* the newer features, 5 might still be the best choice. (Though 6 seems pretty darned stable to me).
Also you need to evaluate the anatomy of whatever bugs are hurting you. If it's a bug in, say, Samba, the chances are high that you'll find the same bug with Samba running atop SuSe or Ubuntu. I.e., it's not Red Hat's code or package-building at fault.
At least the bugs do get fixed. As opposed to being swept under the carpet until a black hat starts exploiting them, or being documented as features, or being told to migrate to an incompatible and expensive version N+1 or lose all support. Techniques frequently adopted by closed-source alternatives.
The previous history is relevant.
With RHEL5, it was easy for Centos or anyone else to strip out the Red Hat copyrighted images and repeat Red Hat's build process using the open sources which Red Hat are obliged to distribute. They didn't care that Centos (and CERN - Scientific Linux) did this. They did care when Oracle did the same.
So with RHEL6 they made the build tools less open and more obfuscated, and that's why Centos 6 arrived a rather long time after RHEL6 (they had rather a lot of reverse-engineering to do). Centos was "collateral damage". Oracle was the target (it was basically taking Red Hat's software, relabelling it, and reselling it in competition).
I'd feared that they would complete the process with RHEL7 and make RHEL 7 close to uncloneable despite the openness of the source. Does anyone know if they are freely licensing proprietary build tools to Centos and other free-beer distributions, while leaving Oracle to stew? If they are, it seems like the best possible solution.
You won't get useful amounts of energy from the cosmic microwave background, nor from acoustic noise somewhere you can hear a pin drop.
Mind you, acoustic scavenging might actually fly in some workplaces I can think of!
I was thinking how completely irritating and disabling it is, if there's anything on your eyeball that doesn't move exactly the same way as the eyeball. Think of a grain of sand in your eye, or conjunctivitis.
If the implant does react exactly the same as the eyeball with no added mechanical resistance, there's no way to harvest mechanical energy.
Solar power sums: the usual figure is 100 watts per square meter harvested from bright sunlight. That's 100 microwatts per square millimeter (10 microwatts on a dull day, maybe 1 microwatt indoors with office grade lighting).
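A quick unit-conversion check of those sums (using the figures quoted above as assumptions):

```python
# Sanity check on the power-density figures quoted above
watts_per_m2 = 100                    # bright sunlight, harvested (assumed figure)
mm2_per_m2 = 1000 * 1000              # 1 m = 1000 mm, so 1 m^2 = 10^6 mm^2

watts_per_mm2 = watts_per_m2 / mm2_per_m2
microwatts_per_mm2 = watts_per_mm2 * 1_000_000
print(microwatts_per_mm2)             # 100.0
```

Scale that by 1/10 for a dull day and 1/100 for office lighting and you get the 10 and 1 microwatt figures.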
I'd have thought a better solution would be those power cells that metabolise glucose.
Good idea for implants in blood-perfused tissue, but the surface of an eye is not well-supplied with blood or glucose.
The best way for an eyeball implant would probably be photovoltaic ("solar power"). Eyeballs are well supplied with light, at least while you are awake. Transparent solar cells can be made (indeed, large ones are being installed as windows on "green" skyscrapers).
Another way would be wireless power (I'm assuming only microwatts are needed). You'd need to wear a power transmitter elsewhere on your person if continuous power was needed. You might be able to scavenge power from a 21st century environment if continuous power was not vital. (ie, parasitise off WLANs and mobile phones).
Yet another way would be piezo-kinetic power scavenging (as in the Seiko Kinetic watches). Again I can see that working much better for implants in other parts of the body.
And further to that thought, if the custom Silicon is on a card on the bus in a conventional server PC, you can yank it out and plug in this year's model. Rather more work than upgrading pure software(*), but not nearly as much work as replacing the entire server farm.
(*) that's after you've sorted out all the reconfiguration issues, and just have to do the tried and tested same over and over again.
What is it that you need in the server farm, that you can do with hardware integrated on the same slice of silicon as the CPU, that you can't do with a separate piece of silicon attached to an Intel CPU's external bus?
I appreciate that at the consumer device end, there are serious economies to be reaped by integration of a system on one chip. (Serious economies means maybe a few tens of dollars per system). In the server room, I don't think a $50 cost advantage will win any arguments. It needs to be a technological advantage, or a price advantage at least one order greater.
Ultimately, I do expect Intel will be fabbing the world's ARM CPUs with their world-leading process technology, but in the first instance for mobiles, not for servers. ARM will conquer the server room last, if ever.
Agree. Better a standard USB stick, with some way of pulling out a short flexible micro USB cable (an inch or two would be long enough).
I have a portable DVD drive which has a USB cable that clips into recesses mounted in the plastic base of the drive when it's not being used. Something similar to that?
A lot of thought would need to go into providing a dictionary. Carefully chosen well-labelled pictures ought to allow a long-lost language to be reborn. If only (say) the Minoans or Etruscans had produced children's picture books on stone tablets.
In an SF story I once read, the (accidental) Rosetta stone was an annotated periodic table, although obviously only a fairly advanced civilisation could decode that one.
We don't want them being used as fetish objects or clubs by the cave-dwelling man apes that will wander the post-apocalyptic wastelands.
Or maybe we do, provided they're suitably resilient. Something that's periodically rediscovered and regarded as treasure from the great ones of yesteryear may stand a better chance of finally being decoded, than something lost in a hole in the ground getting buried deeper and deeper with every passing millennium. Tableware made of a high-tech ceramic much tougher than mere porcelain might be a good choice. If some barbarian manages to smash it and dumps it in a midden, the information loss probably isn't very great.
Anyway you'd run both strategies in parallel, with very many identical plates for redundancy.
Perhaps this method is right, for billion-year timescales.
If we're aiming only at the next civilisation after this one fails, or perhaps the next intelligent species after this one goes extinct, then a Babylonian solution beckons. Fired clay tablets (with a modern twist). Perhaps a somewhat more advanced ceramic, such as the all but indestructible "Corelle" that Corning once made plates from. (Same problem as un-ladderable stockings. The plates last forever. No repeat sales. No profits. Discontinued. Sigh.)
You could put some low-tech diagrammatic writing on the plates, in an attempt to draw attention to the high-definition data embossed in the ceramic, or ink-jet printed at ~100dpi. Make them both beautiful and startlingly hard to break, and stone-age or barbarian peoples might preserve them rather than destroy them. Anyway, it's really hard work to destroy the information on a ceramic surface, and a subsequent civilisation will reassemble the fragments from your rubbish tip. Make lots and lots of them, distribute them as superior mass-market tableware, and survival of some of them is virtually guaranteed. All we need is a billionaire with a long-term view of things to underwrite the project. (Thinks ... an Indian one ... must be good for a lot of positive Karma. The largest Hindu unit of time is a large multiple of My, and they believe in re-incarnation and in cycles of creation and destruction. It all fits. )
(Apparently we know more about the everyday minutiae of Babylonian life, than about any civilisation since! )
The Earth will become uninhabitable a lot sooner than 5 Gigayears hence. We're actually rather close to the inwards edge of the habitable zone around the Sun, and the Sun is getting hotter as it ages. Unless "we" initiate major planetary protection operations within the next Gigayear (orbital sunshades, or orbit expansion), life will be over by then. Some estimate as soon as 300My, before Earth suffers thermal runaway the same as Venus. (OMG multicellular life is having its midlife crisis! )
If we want to leave a *really* long-term record, Earth isn't really the right place. Too much corrosive oxygen and water and those awkward plate tectonics, and a boiling sulphuric acid nightmare after the end of life on Earth. The Moon is better (dig in deep to protect against all but huge meteor strikes, and position-mark with long-life radioactives near the surface). An outer moon of Saturn would be better still, might even survive Sol going red giant and nova. (Ring any bells? ....)
I'd be most surprised if he didn't start by dropping weights. It's not hard to distinguish clunk! from clunk-clunk!, where the impacts are separated by as little as 50ms (maybe less). Then he thought "that's interesting". Then he'd have looked for a way to make the experiment more accurate, and got lucky by using solid rather than hollow spheres on an inclined plane. Right result, but missing a large chunk of reasoning about then-unknown rotational kinetic energy and the importance of the spheres being homogeneous.
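For a rough sense of why a 50ms gap would be audible, here's a sketch with an assumed 2m drop height (my number, nothing from Galileo):

```python
import math

g = 9.81    # m/s^2
h = 2.0     # assumed drop height in metres

# Fall time for either weight, if (as Galileo argued) mass doesn't matter
t = math.sqrt(2 * h / g)
print(f"fall time from {h} m: {t*1000:.0f} ms")     # ~639 ms

# A 50 ms gap between impacts is a substantial fraction of the fall time,
# so clunk-clunk! is easily distinguished from clunk! by ear.
dt = 0.050
print(f"50 ms is {dt / t:.0%} of the fall time")    # ~8%
```

Aristotle's claim (speed proportional to weight) would predict a gap vastly larger than that for a 10x heavier weight, so even this crude experiment settles the question.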
The gravitational interaction of two human-sized objects has been observed, and Newton's law of gravitation (inverse-squares) confirmed for masses of the order of kilogrammes at distances of the order of a meter. The experiments are very hard, and the accuracy isn't great. Some alternative theories have proposed that gravitation breaks down on milli- or micro-meter scales, and of course at much smaller scales still we get into quantum effects. These aren't yet testable.
Other theories posit that inverse-squares gravitation breaks down on the scale of a galaxy and above, and attempt to explain away dark matter and dark energy in terms of a different law of gravitation on very large scales. So far, such attempts have not been very successful in doing away with the need for dark matter and energy to explain the observations.
The equivalence principle is much better-tested for terrestrial masses, than Newton's law itself. But all the masses tested were made of conventional atoms. Perhaps fortunately, we don't have neutronium or large masses of antimatter to play with.
AFAIK the trajectories of the Pioneer spacecraft (the "Pioneer anomaly") were observed to be not quite as expected. Tantalizing, but they weren't ever designed as gravity probes, and their deviation from predicted orbit is just within the likely errors of observations.
I haven't worked out the details (which will be very complex, probably involving serious supercomputer resources to model) but there will be gravitational perturbations of each orbit by the other bodies. Here in the solar system, the existence and position of Pluto was deduced and calculated by very careful observation of the orbits of Uranus and Neptune. Once they knew where to look, they pointed a powerful telescope at that patch of sky, and there was Pluto.
For this system we can test GR rather than simple Newtonian gravity, because the bodies are very dense and close. It tests the equivalence principle, because a neutron star is made of neutrons and a white dwarf is made of ordinary atomic nuclei (ie with protons, and electrons between the nuclei). Short of a black hole, that's about as much different as two astronomical masses can be. (*)
I'll have to file this system along with Zircons dating of the age of the Earth. $DEITY is being extremely kind to us, providing us with the means to scientifically investigate questions we might have thought were forever beyond our grasp.
(*) I've always wondered whether baryons (protons and neutrons) and leptons (electrons) are gravitationally equivalent. Is there any way to probe this, even conceptually, given that the electromagnetic interactions of electrical charges are about 10^40 times greater than any gravitational effects?
Here on Earth, if you put the feather on top of the hammer and drop in still air, the feather will fall with the hammer. The hammer pushes the air out of the way, and gravity does the rest.
I'd be very surprised if Galileo hadn't tried this, though he was probably smart enough to realise that the limitations of this particular experiment would be seized upon by his enemies.
All it takes is to split the hull, and air pressure plus the 600mph breeze outside will do the rest.
Contradictory datapoint: the Aloha airlines "open-top 737" incident.
Puncture? Almost certainly yes, if taped to the skin of the airframe itself. On the inner plastic skin, I doubt it. Further away, no chance. It's also unlikely that a small hole could bring down an airliner.
Yes, I know. MS would like all of its users to be tied in to a browser that won't let them block adverts, and which reports more than you know back to Microsoft so that they can target you with more "relevant" and unblockable advertising. The users, on the other hand ....
Ghostery and NoScript...
Adblock-plus, Flashblock, and Tabkit (tabs down the LHS in collapsible trees, not along the top)
Privacy concerns about what is being sent to Microsoft (IE) or Google (Chrome) without my knowledge or consent.
Mozilla doesn't have to hope anything. Firefox works just fine on Linux and anything else that might replace XP and Windows 7 on business desktops (including the Windows 8 desktop).
The day Microsoft announces EOL for Windows 7 without having an upgrade path that doesn't involve massive costs (including retraining all a business's low-skill keyboard-pokers), is the day Microsoft will have signed its own death warrant.
If I hadn't been around while Digital self-destructed, I'd think it couldn't happen.
There's actually a decent successor to Windows 7 in Windows 8, if only it could be made to boot to desktop with a 7-compatible login /switch user screen and start menu, and TIKFAM made so it can be defaulted hidden and configured completely unavailable using AD policies. Doesn't actually look too hard, if Microsoft would only stop pushing TIKFAM at business users who know they absolutely don't want it in any shape or form.
Personally I wish they'd decide that there never will be an official version of Firefox that works anywhere on Windows 8 except for the Windows 7-compatible desktop, until someone stumps up the entire cost of porting it.
If Microsoft think the non-availability hurts their chances of persuading people that TIKFAM is any more seaworthy than the post-iceberg Titanic, let them pay the Mozilla foundation to do the port. If Microsoft doesn't care to do that, tell the world that's why there's no TIKFAM-Firefox.
A thought on the NUC form-factor.
Intel needs to put more SATA connectors on it, not just one, so that it can be used as the basis of systems that need >1 hard disk. (IMO every system with locally-stored data needs >1 hard disk, for mirroring - at £40 for a HD, can you afford not to?) How much do four SATA connectors cost? Surely on-chip SATA interfaces are (or can be) designed to idle at microwatts if there is no hardware connected to them? There's clearly no great cost to putting six SATA on desktop boards that rarely have more than three of them connected. If there were, they'd leave them out and the minority would buy PCIe SATA controllers.
You could connect extra disks by extending the NUC upwards with a section that bolts on top of the NUC box and holds the disks. Keeping the original lid and bolting that on top of the extended NUC would be a nice touch.
If the form-factor catches on the price ought to drop as NUC-format boards start being made by ASUS, Gigabyte etc., along with cases for passive cooling, multiple disks, etc. in various different shapes.
Intel set the ATX PSU format and the common desktop motherboard formats ATX, mATX, ITX. OTOH they failed to persuade anyone that BTX PSUs were a good idea. Standard form-factors are a good idea in principle.
For the record, if anyone finds this post with Google, I have at last found my passively cooled ITX board with a faster CPU than an Atom. It's a Gigabyte GA-C1037UN-EU. The -EU is important because there is another variant with a fan and (I presume) a wimpier heatsink and/or a different BIOS that will moan if the fan isn't present.
The CPU is a Celeron 1037U, 17W TDP, which benchmarks at about 2.5x my Atom. It also takes up to 16GB of DDR3, which should save on disk IO. The board (incl CPU) is inexpensive (£72 incl VAT and shipping) and even boasts 3 x SATA (one of which is SATA-3, great for an SSD), USB3, and one E-SATA.
Fingers crossed. Intel shouldn't be marketing these chips as Celerons. They should market them as low-wattage low-end Ivy Bridge i3 (which they are), or even as faster Atoms. There's a Haswell version coming "soon", but I decided not to wait for it.
I presume it still has an annoying fan in there somewhere, which is why IMHO it doesn't warrant a premium price. (If I'm wrong and the whole case is metal acting as a passive heatsink, please correct me and I'll reconsider).
What I want is a passively cooled board that will fit in an ITX case, to replace the Atom D520 I have at present. Having gotten used to silence, I don't want even a whisper of fan noise. Nobody seems to make such a board, though there's an i3 laptop CPU that's loads faster than the Atom and which has much the same TDP.
Failing which I'll be returning to a mini-tower case with a fan-less PSU and a NoFan heatsink on an ordinary Intel desktop CPU. Small size isn't a killer feature. Silence is.
Bitcoin is not untraceable. It's perfectly traceable for all time: that's what the blockchain is! Trouble is, it would take someone with the resources of a whole nation to actually follow the trail (and then it would probably dead-end at a corrupt exchange where the traceable bitcoins were turned into untraceable cash).
Personally, I think that we should impose consecutive sentences on anyone proved to be responsible for deliberately destructive malware. Let's be generous, say just one day per victim of extortion. Destroy 36500 or more computers with your malware, and go to jail for life. Other countries might prefer to hang them, and I'd not be particularly bothered if they do. Less so than about many murderers. There are people losing their livelihoods by the thousands because of acts like these, and I'd bet that there will have been suicides (plural, probably tens of) as a consequence. Yes, I know that everything should be stored on servers run by professionals who make nightly or hourly read-only snapshots of their filesystems, but in the real world there are very many small businesses who don't have any IT staff at all (but still rely on a few PCs).
Yes, I'm ranting, so I'll stop.
If it's like a portable Chromebook, it won't be hard to install Linux instead of Chrome OS. There's also the possibility of keeping Chrome OS and using it as a thin terminal for anything that needs more than a web browser.
It was quite a few years ago that IBM sold its hard disk business. At the time there was speculation as to motive. Was it simply because IBM saw it was labouring under a disadvantage, trying to sell its hard drives to competitors in the server space such as HP and Compaq? Or was it because IBM believed this was a business with a declining long-term future, and sold out before that view became widely held?
I suspect the latter. They compare IBM to an elephant. It can't gallop, but it can move surprisingly fast and knows better than most large companies where it is and where it wants to get to.
fighting against engineers using the terms "master" and "slave" in their documentation
I'm intrigued to know what she suggested as an alternative. IMO the terms are spot on, and may even amount to an implicit ethical warning for some (far?) future date, that there's a major problem should "slave" software ever approach the threshold for sentience.
"Producer" and "Consumer" are also used, but differently: this terminology suggests that the consumer side is not incapable of taking decisions. And of course, it's also politically loaded!
I can't help observing that to get a Fermion back into the same state it started in, you have to rotate it through not 360 degrees but 720 degrees. Are Fermions female? Discuss.
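For the curious, the 720-degree fact can be checked numerically with the standard spin-1/2 rotation operator about z, exp(-i*theta*sigma_z/2) (a small numpy sketch):

```python
import numpy as np

def spin_half_rotation(theta):
    """Rotation operator about z for a spin-1/2 particle: exp(-i*theta*sigma_z/2).
    Since sigma_z is diagonal, the matrix exponential is just a diagonal of phases."""
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

I = np.eye(2)
U_360 = spin_half_rotation(2 * np.pi)   # one full turn: the state picks up a minus sign
U_720 = spin_half_rotation(4 * np.pi)   # two full turns: back to the identity

print(np.allclose(U_360, -I))   # True
print(np.allclose(U_720, I))    # True
```

The minus sign after 360 degrees is unobservable in isolation but shows up in interference experiments, which is why fermions genuinely need the full 720.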
There's satire, and then there's just being a bunch of dicks
Indeed, and this isn't even a patch on Intercal
" It is a well-known and oft-demonstrated fact that a person whose work is incomprehensible is held in high esteem. For example, if one were to state that the simplest way to store a value of 65536 in a 32-bit INTERCAL variable is:
DO :1 <- #0¢#256
any sensible programmer would say that that was absurd. Since this is indeed the simplest method, the programmer would be made to look foolish in front of his boss, who would of course have happened to turn up, as bosses are wont to do. The effect would be no less devastating for the programmer having been correct.
It looks like something you'd fill with ice to cool a bottle of wine.
At the small end you could try a Cubieboard, with 1 x SATA on board. BTW Linux supports SATA port multipliers (forget the exact right term - but like SCSI LUNs on SATA, can attach four or more drives to one SATA port). Might be a fun project.
At the bigger end you could stick with Intel and get an Avoton server with 12 SATA ports. http://techreport.com/news/25703/asrock-combines-avoton-soc-12-sata-ports-on-mini-itx-mobo
There are plenty of whisper-quiet Intel solutions. Start with a low-wattage CPU (think 35W TDP is the lowest). Then research all components with fans. Typically you use a large case with a 12cm fan running at minimum speed and a large-fan CPU heatsink which again will result in the fan running at minimum speed (or off most of the time, if the motherboard is chosen to be capable of running the fan at zero rpm when the CPU isn't in need of any cooling). You can get completely fan-less PSUs. You'll also want to choose hard disk drives with care lest they be the noisiest part of your system. WD "Red" aren't the fastest, but they may be the quietest. I can't hear them seeking at all!
I've been trying to find a mini-ITX board with an Intel i3-4210U soldered onto it and passively cooled (15W TDP) but nobody seems to make that. Annoying. (I have an Atom-based home server which I'd like to speed up, but not at the expense of starting again with an Intel NUC and a 3rd-party passive-cooling case and not at the noise cost of even whispering fans).
Gnome 3.10 ... hopefully yum install cinnamon still deals with that. (Or you could have picked KDE in the first place).
They've already made BBC World Service DAB only (well, apart from AM ... it wasn't ever on UK FM). This is the only reason my DAB receiver didn't get thrown in the trash can.
I'd also note that in the intervening four years we also had something the world hasn't seen since: total war
We've seen it many times, just not on the same scale. I wonder how much consolation it was to a Korean or Vietnamese, or is to a Syrian, that most of the rest of the world was / is not likewise at war.
Except, one hopes that the NSA or someone is reverse-engineering the BIOSes being shipped, to keep the other side honest. Booby-trapping all BIOSes shipped is the sort of dirty trick that you could get away with only once (and pay a huge economic price afterwards). Unless you posit a multi-national conspiracy, if it had been happening we'd have heard about it by now.
Unless absolutely nobody is checking.
It's been around ever since some bean-counter demanded removal of the write-protect switch from a system's flash logic circuitry.
How it ought to be, is that to do a BIOS upgrade you'd start by taking the lid off the system and moving a jumper or switch to write-enable. Then update. Then set it back to write-protect. (Note: nothing to stop manufacturers shipping it write-enabled, if they know that their average customer is a moron. Intelligent customers would protect it on delivery -- or buy from a different manufacturer).
How much did removing one jumper save? One cent? Probably less. Bullet, meet foot.
Why in the thirty-two Hells of Carmack would they want to get into building their own hardware especially at the silicon level?
Because they can see a way to reduce their energy consumption by doing something differently in Silicon? Because of their massive scale, things may look different to Google compared to lesser companies.
What Google really wants is an ARM solution fabbed by Intel at 14nm!
Maybe Google is a large enough customer, that Intel might consider fabbing custom chips that aren't sold to anyone except Google. Google could buy any necessary ARM license and have it fabbed by any company willing to take their money.
Also maybe a hybrid chip with both x86_64 and ARM cores is possible and useful to Google.
Another, shorter formulation:
Any sufficiently developed bureaucracy is indistinguishable from malice
I think the Commodore SR-36 goes back a bit further. It had Red LED displays, not LCD (which ISTR hadn't yet been commercialized). Superb bit of kit for the time.
I used mine for about fifteen years, until the batteries faded to the point you could only use it tethered to a wire, and I threw it into a box in the attic. I recently found it and fired it up, but something had failed while it was in storage and it would display only gobbledegook.
Still looks good, though.
Invent an algorithm for your passwords.
If you forget your password recalculate My_Algorithm( "Facebook", other_things_that_I_CAN_remember)
It doesn't have to be complicated. You're already ahead of 99% of the crowd. I don't imagine for a moment it'll defeat spooks (who have inside access to Facebook et al anyway). It will defeat the sort of criminal who assumes that all your passwords are likely to be the same, or steals your computer so he can use your stored passwords.
And use a different algorithm for passwords that unlock real money!
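By way of illustration only (the function name, parameters, and 15-character output length here are all my invention, and a real scheme would want a proper KDF and a high-entropy master phrase), My_Algorithm could be as simple as an HMAC over the site name:

```python
import hmac, hashlib, base64

def site_password(master_secret: str, site: str, hint: str = "") -> str:
    """Deterministically derive a per-site password from things you CAN remember.
    Illustrative sketch only -- not a vetted password scheme."""
    digest = hmac.new(master_secret.encode(), (site + hint).encode(),
                      hashlib.sha256).digest()
    # 15 urlsafe-base64 characters is ~90 bits; adjust to fit each site's rules
    return base64.urlsafe_b64encode(digest)[:15].decode()

print(site_password("my memorable phrase", "Facebook"))
# Same inputs always regenerate the same password; a different site name
# (or a different algorithm for the money accounts) gives an unrelated one.
```

The point is exactly as above: nothing stored, nothing to steal, and a forgotten password is just a recalculation away.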