By the way, the control codes at CERN really DO have to include a calculation of the phase of the moon. The earth suffers elastic tidal distortion, enough to change the alignment of the LHC to a significant extent.
The difference is that they're relatively small and rigid structures that can survive the gradual distortion of the ground they are built on by anything less than a truly huge earthquake. Whereas a LINAC 10 km long that has to be dead straight might end up bent out of alignment, which would ruin it.
A picture from California is worth a thousand words
Why not? It's no different to the everyday usage of "nuke it"!
Doesn't everyone know about natural cosmic rays up to 10^19 eV, and the number of them that have interacted with the sun in the last 4 billion years? (Hint: the sun is BIG). If really high energy particles had any untoward effects on large collections of matter, we wouldn't be here to talk about it. (Or the fun alternative, maybe they once DID have dramatic effects on the pre-universe, and that's WHY we're here talking about it! )
That's a bit like upgrading from a Bentley to a Bugatti, when you know that what you really need is a supersonic jet. Not a big enough advance to be worth paying that much for. The LHC is the biggest ring we'll build in any foreseeable future.
Linear accelerators are the most obvious way we might get a really big advance. In outline ... an electric field of a million volts per metre, over ten kilometres, is 10 GV, so a singly-charged particle traversing it gains 10 GeV (the gain is charge times voltage, whatever the particle's mass). Now work on field strengths two or three orders of magnitude higher and the same ten kilometres gives you TeV to tens-of-TeV protons ... none of this is impossible, it's just unknown territory in engineering terms.
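A back-of-envelope check of that arithmetic; the gradients beyond 1 MV/m are assumptions about future technology, not parameters of any existing machine:

```python
# Back-of-envelope linac energy: gain (eV) = charge (in units of e) x gradient (V/m) x length (m).
# The higher gradients below are illustrative assumptions, not measured machine parameters.

def linac_energy_ev(gradient_v_per_m: float, length_m: float, charge_e: int = 1) -> float:
    """Energy gained by a particle of the given charge traversing the full length."""
    return charge_e * gradient_v_per_m * length_m

for gradient in (1e6, 1e8, 1e9):  # 1 MV/m (conventional RF) up to 1 GV/m (speculative)
    ev = linac_energy_ev(gradient, length_m=10_000)
    print(f"{gradient:.0e} V/m over 10 km -> {ev / 1e12:.2f} TeV")
# 1e+06 V/m over 10 km -> 0.01 TeV   (i.e. 10 GeV)
# 1e+08 V/m over 10 km -> 1.00 TeV
# 1e+09 V/m over 10 km -> 10.00 TeV
```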
For a rather further-fetched idea, look up PASER (and consider where we are today, starting from some theoretical 1930s papers on the field effect ... with a big detour via germanium junction transistors).
Bigger screen + much bigger battery = win
Unless you start to notice the extra weight?
Those places will probably not give you the kind of lifestyle you are used to.
Switzerland looks good for a decade or two longer, if you can hack the exchange rate. Unfortunately, it's surrounded.
I find the omnipresent surveillance nightmare more troubling than any other SFnal visions of the near future. Soon enough, we will be living in a world where everything is networked, and everything has eyes and ears and a CPU, and no-one will be willing or even able to break the rules. I can't see a way past that trap. And then we'll be no more able to react to changing circumstances than insects, and the subsequent fall will be deeper than any previous fall in history. Especially if by the time it happens, there is no longer any place outside "the empire" to flee to or to be invaded by.
When the Roman Empire fell, the people left in England lost the technology of throwing pots on a potter's wheel. This, even without nukes and surveillance.
In most contexts the two are close enough to be treated as ~equal. In software documentation concerning (mostly) disk partition tables and related entities, the writers take care to get it right. Often, command lines use the abbreviations m for 10^6 and M for 2^20, rather than M for decimal and Mi for binary, but the documentation spells it out.
Scientists will also disagree, because M, G, T, etc. are SI prefixes defined as powers of ten. The correct prefixes for 2^10, 2^20, ... are indeed Ki, Mi, Gi, Ti, ...
A 4k7 resistor was 4700 ohms, and a "megohm" resistor 1,000,000 ohms, before the word "byte" had been coined.
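For anyone who wants to see how far the decimal and binary readings drift apart as the prefixes climb, a quick sketch:

```python
# Decimal (SI) prefixes vs binary prefixes: the discrepancy grows with each step up.
SI = {"k": 10**3, "M": 10**6, "G": 10**9, "T": 10**12}
BINARY = {"Ki": 2**10, "Mi": 2**20, "Gi": 2**30, "Ti": 2**40}

for (si, dec), (bi, bn) in zip(SI.items(), BINARY.items()):
    print(f"1 {si}B = {dec:,} B    1 {bi}B = {bn:,} B    ({bn / dec - 1:+.1%})")
# 1 kB = 1,000 B                1 KiB = 1,024 B                 (+2.4%)
# ...
# 1 TB = 1,000,000,000,000 B    1 TiB = 1,099,511,627,776 B     (+10.0%)
```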
A very necessary skill on a starship crew ....
By the time this is over I expect they'll be looking for a career at the bar in front of them. But they'll have trouble buying a drink there, let alone landing a job.
Wish I could upvote that a hundred times over. Why, why, WHY do people see any advantage in a wireless "key" rather than a contact "key"? Same as paying more for notebooks lacking a wired network socket, I guess.
Driving a junker works well. Someone recently radio-unlocked my 12-year-old car - presumably the tech to break 12-year-old radio security is now available for less than the cost of a new key? Anyway, they couldn't find anything much worth stealing, neither car nor contents.
Actually a major source of unwanted sulfur is the refining of crude oil and the purification of natural gas.
On the other hand, even if the oil and gas industry didn't land us with heaps of sulfur to get rid of, gypsum (a common mineral) is calcium sulphate.
Are you confusing it with hydrogen sulphide? (which is about as toxic as hydrogen cyanide, except that you're more likely to notice the rotten-eggs smell in time to make a hasty escape).
Sulfur dioxide isn't in that league, and there are many things much more toxic than cyanide.
If the raw materials are plentiful and cheap, you don't need a large energy density. Imagine that you could make a battery out of (say) Silicon dioxide and Calcium carbonate. You'd just pile up enough of it to solve the problem. It's only if the battery has to move its own mass around, as in a 'leccy car, that energy density becomes critical.
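To put rough numbers on "pile up enough of it" (every figure below is invented for illustration):

```python
# How much a stationary battery weighs if its energy density is low but the
# material is cheap. Densities and the storage target are made-up illustrative
# numbers, not datasheet values.

def mass_tonnes(energy_kwh: float, density_wh_per_kg: float) -> float:
    return energy_kwh * 1000 / density_wh_per_kg / 1000

need_kwh = 10_000  # e.g. overnight storage for a small neighbourhood (assumed)
for density in (250, 50, 10):  # Wh/kg: Li-ion-ish, cheap chemistry, dirt-cheap chemistry
    print(f"{density:>3} Wh/kg -> {mass_tonnes(need_kwh, density):,.0f} tonnes")
# 250 Wh/kg -> 40 tonnes
#  50 Wh/kg -> 200 tonnes
#  10 Wh/kg -> 1,000 tonnes
```

For a fixed installation, a thousand tonnes of dirt-cheap material may well beat forty tonnes of expensive material.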
Sulfur is cheap and plentiful, Lithium rather less so.
The general problem with a battery is that it involves surface chemistry, but you need a lot of extra bulk material to give the surfaces some mechanical integrity. This is why having a reliable solid-electrolyte chemistry would be a big step forward: a solid electrolyte will itself add to the mechanical integrity of the whole.
NiMH cells (which I read up on) are constructed much like a toilet roll: a long thin roll of charge-storage sandwich. You get a higher storage capacity by making the sandwich thinner, but thinner means greater charge leakage, and a law of diminishing returns sets in rapidly for an AA cell packing more than 2700 mAh.
And you don't want to be downwind of a large pile of burning sulfur. Not the most toxic of materials, but most definitely unpleasant (S + O2 -> SO2; SO2 + H2O -> H2SO3; H2SO3 + more O2 -> H2SO4).
I read about a proposal to use sodium-sulfur batteries in electric cars in (I think) the 1980s, and it struck me as one of the craziest ideas I'd ever read: like putting a shock-sensitive detonator in a petrol tank. (Now they're talking about CNG ... at least a CNG cylinder has to be tough).
According to Wikipedia (http://en.wikipedia.org/wiki/Nickel%E2%80%93iron_battery):
"Due to its low specific energy, poor charge retention, and high cost of manufacture, other types of rechargeable batteries have displaced the nickel–iron battery in most applications."
Lots more interesting stuff. No mention of Exide. The batteries are out there and in use, in places where the weight of the battery is less important than its reliability or ruggedness. They're under review for renewable energy storage. For automotive use, weight is important, as is charge retention (cars are left unused for weeks, occasionally months), and in a fuel-driven car the battery is dead weight except for the few seconds when you are starting the engine. Doesn't sound like a competitor for lead-acid to me.
Just as long as it's not SO smart that one can't simply "forget" to connect it to the internet.
Of course, that rather raises the question of how much longer broadcast TV will exist. I suspect digital TV may go the way of analogue TV within a couple of decades. Internet TV or no TV thereafter.
It's like bio-warfare. The Chinese(*) wrote it, but it escaped, and is now propagating itself beyond their ability to "recall" it. Scary.
(*) or anyone else, possibly including some near-autistic kid living in a council flat near you.
Funny, on the PCs I maintain today I've never needed anything other than a #1 cross-head.
I do remember that early Dells were assembled with a nightmare mix of Phillips, flat-blade, Torx, hex-head, and OMG tamper-proof. But that was a LONG time ago.
Why is it that Dell (and other "corporate" PC vendors) hold to the idea that it's quicker to get at a machine's innards if you don't need a screwdriver? My experience has always been that by the time you've worked out which tricky little catches to release and what to slide in which direction, you could have replaced a disk drive in a drive bay in a well-designed machine where it's held in place with screws.
And shortly after you've worked it out, you'll notice blood dripping from a finger, and have to take a break to find a sticking-plaster.
(Are there really any service engineers out there so dim that they can't use a screwdriver? )
I also think MS has too much negative sentiment associated with its brand to really excite large portions of the market.
And when did IBM last excite large sections of the market? What percentage of people outside IT even know what IBM does? And ... so what?
Microsoft's bread and butter is desktop PCs used by businesses or for business. For content creation not content consumption. If they carry on down the road they are on, someone is going to eat their core business the way Linux(*) is eating the iPhone.
By the way, I don't care. I almost hope Microsoft does go the same way as Digital. Those whom the gods would destroy, they first make mad. Telling your customers that they're wrong and should <go away> is a symptom of corporate madness. It's also what Digital did more and more, until their fall.
(*) Android. It's Linux under the skin. Though the moral would be the same even if it weren't.
One danger MS has now is they've split their customers into two groups, those who prefer a traditional desktop environment and those who prefer Metro.
Linux solved that problem ages ago. When you log in you select the UI you prefer from a list of those installed on your system. If your UI of preference isn't on the list and you have admin access to your machine, you download and install the one you want. I bitched like hell when Gnome 3 destroyed Gnome 2 instead of letting the two be installed side by side ... that's NOT THE LINUX WAY ... but the open-source community fixed this brain damage both possible ways, with Mate (a fork of the original Gnome 2) and Cinnamon (a new but similar UI running atop the Gnome 3 libraries).
But Microsoft think there must be one and only one UI, to be torn up and replaced with something utterly different on a whim. And the Windows 8 UI is frankly about as lovable as Jimmy Savile. The more one learns ....
At the moment I have five windows visible on my screen. That's how I want to work. I don't want anything to ever go full-screen unless I click on the full-screen button - which I never do, because no application I run needs 1920 x 1200 pixels. Several more windows are hidden but I can bring them to the foreground with one click and return them to underneath with another one click. I don't have to touch the keyboard to context-switch.
Microsoft think I should be working differently.
If Windows 7 ever goes away, I'll find a job that doesn't involve ever having to interact with a Microsoft system at all.
My thought too. For now, USB3 is good enough for almost everything, and eSATA good enough for the exception.
But maybe Intel are looking a long way ahead. When memristor technology arrives (dirt-cheap randomly addressable nonvolatile RAM - OK, I'm an optimist), how are you going to connect your 1 TB memory stick in the year 2020? Hint: USB3 will offer only a small fraction of the attainable speed.
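Some rough transfer times for filling such a stick, using nominal raw link rates (sustained throughput is always lower, and the memory-class figure is pure assumption):

```python
# Time to move 1 TB at various nominal link speeds. Raw bit rates, ignoring
# protocol overhead; the last entry is an assumed figure for a DRAM-class link.
SIZE_BITS = 1e12 * 8  # 1 TB in bits

links_gbps = {
    "USB2 (480 Mb/s)": 0.48,
    "USB3 (5 Gb/s)": 5.0,
    "Thunderbolt, 2 channels (20 Gb/s)": 20.0,
    "DRAM-class link (assumed 100 Gb/s)": 100.0,
}
for name, gbps in links_gbps.items():
    minutes = SIZE_BITS / (gbps * 1e9) / 60
    print(f"{name}: {minutes:.1f} min")
# USB2: 277.8 min; USB3: 26.7 min; Thunderbolt: 6.7 min; DRAM-class: 1.3 min
```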
Also think about why USB2 beat Firewire (which was faster).
If anyone wanted to read a non-encrypted document in some old format or other, I doubt that they'd need to be qualified to work at GCHQ in order to "break" the (non-)code. It's just reverse-engineering something that actually isn't designed to conceal.
Strong encryption, on the other hand, means that once the decryption key is gone, so is the document.
Colour temperature is a stupid attempt to sum up a spectrum in one number. What you really need is a spectroscope. You'll then see that an incandescent bulb of any sort emits a nice smooth spectrum (which can indeed be derived from the filament's temperature). What comes out of a CFL is mostly green and all lumps and bumps, and what comes out of an LED is a rather smoother two-humped distribution with an unnaturally high amount of blue.
Our eyes are evolved to work best with a single-hump spectrum a.k.a. sunlight. A halogen bulb comes closest, but is not as hot as the sun so it's a bit deficient in blue and violet. An old-fashioned "yellow" filament bulb generates less green and virtually no blue and violet, but the spectrum is still smooth.
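For the curious, the smooth single-hump curve comes straight out of Planck's law; a minimal sketch, with typical (assumed) operating temperatures:

```python
# Planck's law: spectral radiance of a black body - the smooth single-hump curve
# that a filament approximates. The temperatures below are typical assumed values.
import math

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann (SI)

def planck(wavelength_nm: float, temp_k: float) -> float:
    lam = wavelength_nm * 1e-9
    return (2 * H * C**2 / lam**5) / (math.exp(H * C / (lam * KB * temp_k)) - 1)

for temp, label in ((2700, "filament bulb"), (3000, "halogen"), (5800, "sunlight")):
    peak_nm = 2.898e-3 / temp * 1e9  # Wien's displacement law
    blue_vs_red = planck(450, temp) / planck(650, temp)
    print(f"{label} ({temp} K): peak ~{peak_nm:.0f} nm, blue/red ratio {blue_vs_red:.2f}")
# filament bulb (2700 K): peak ~1073 nm, blue/red ratio 0.16
# halogen (3000 K):       peak ~966 nm,  blue/red ratio 0.24
# sunlight (5800 K):      peak ~500 nm,  blue/red ratio 1.13
```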
There's also mounting evidence that blue and violet light is involved in maintaining our bodies' circadian rhythms. Yellowish lighting for night-time use was probably a very sound non-choice, and the medical impact of switching to CFLs and LEDs may be yet to be appreciated. If anyone is suffering from insomnia, I'd suggest it's worth trying to kick out all the CFL and LED bulbs from your house, and run halogen bulbs on a dimmer in the evening to suppress their blue output. Conversely, on grey winter mornings, a full-on halogen bulb or even the bluer LED illumination may be a very good thing.
The difference in wattage between CFL or LED and incandescent is so great that there's no way they aren't saving electricity. Less than is advertised, though, for two reasons, one of which you'd not spot with any measurement of electricity going into the bulb.
1. CFL light spectrum is awful, and we compensate by upping the brightness. CFL warm-up time likewise. LED is far better and zero warm-up time.
2. "Waste heat" isn't all wasted. In winter, part heats the room, and the rest the space above. If you eliminate this heat source, the central heating has to supply the noticeably missing part(s). And in UK homes, lights are turned on for a much greater time in winter than in summer (short vs. long daylight hours).
Energy-saving bulbs make far more sense in the tropics, where they also save on air-conditioning bills (i.e. extra electricity being used to pump the completely unwanted waste heat out of one's living quarters).
Anyway, give LEDs a few more years' development, and they'll probably become at least as good as halogen incandescent lighting.
It ought to be possible to produce a cheap version of powerline networking, with a bandwidth of a few kilobits per second, for the purpose of controlling domestic equipment. Get it ISO standardised, get the price down to a few pennies per chip, and build them into every appliance right down to the light bulb level. (Or into the ceiling roses or into bayonet to ES adapters! )
Just give us a physical disable mechanism, so technophobes or safety-critical devices can't be hacked!
For incandescent bulbs, I preferred screw. The problem is that the heat makes the plastic of the lampholder become brittle, and in my experience the bayonet holders are far more prone to breaking when you try to remove a blown bulb. That should be a thing of the past with fluorescents and LEDs. Screw-fit bulbs working loose was probably caused by thermal cycling, so that is also probably a historical problem.
I also wonder how long it will be before designers realize that if an LED bulb will last 25 years, why make it a replaceable unit? Isn't it better to integrate the low-voltage PSU and the LEDs permanently into a metal light fitting? The light fitting would also provide a better heatsink for the LEDs, extending their life and/or increasing their brightness.
I've now got to ask - how do Germans indicate a diaeresis, given that the usual method of doing so looks exactly like an umlaut?
Awful code? Or just code written by a person whose first language is not English, whose idea of a good variable name is therefore not the same as yours?
Maybe with more CPU cycles, they could start analyzing words with prefixes and suffixes, and put a wiggly brown line under words that might be OK but aren't actually in its dictionary (i.e. words that break down as common prefixes and suffixes around something that is a known word in the middle).
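A toy sketch of that decomposition; the affix and dictionary lists here are invented for illustration:

```python
# Flag a word as "might be OK" when stripping known affixes leaves a dictionary
# word. The word lists are toy examples, not a real lexicon.
PREFIXES = ("un", "re", "pre", "over")
SUFFIXES = ("ness", "able", "ly", "ing", "ed", "s")
DICTIONARY = {"do", "think", "grump", "network", "cache"}

def might_be_ok(word: str) -> bool:
    """True if the word reduces to a known word after optional prefix/suffix stripping."""
    stems = {word}
    stems |= {word[len(p):] for p in PREFIXES if word.startswith(p)}
    stems |= {s[:-len(x)] for s in set(stems) for x in SUFFIXES if s.endswith(x)}
    return bool(stems & DICTIONARY)

for w in ("rethinking", "grumpness", "uncacheable", "blorple"):
    print(w, "->", "wiggly brown line" if might_be_ok(w) else "red line")
# rethinking, grumpness and uncacheable get the brown line; blorple stays red.
```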
German (and Turkish and others) basically don't put spaces in a noun phrase, so there's no real significance to the length of a "word" and you're free to invent your own. Worth noting that spaces are actually a relatively modern development in writing. Theancientsdidn'tusespacesatall.
My thought too. Unlike most lengthy detritus in the dictionary, this word is actually useful (and has been so for centuries). If it didn't exist, political discourse would require that it be invented.
Does superluminal signalling automatically imply time travel?
I've read that it does. Interestingly, it's the IT angle of time travel (even if restricted to information only) which seems to me the greatest paradox. The problem is that if you can send even as little as one bit back in time, you can arrive at the result of any convergent iterative process in the time taken to compute a single iterative step (by sending the result of that iteration back in time to replace the initial approximation).
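For concreteness, here's the conventional loop; with the (entirely hypothetical) send-it-back-in-time channel you'd pay for only the final verification step:

```python
# The conventional way: iterate until convergence, paying for every step.
# With a (hypothetical!) result-back-in-time channel, you'd receive x already
# converged and pay for only the single step that verifies it.

def newton_sqrt(a: float, x0: float = 1.0, tol: float = 1e-12) -> tuple[float, int]:
    x, steps = x0, 0
    while abs(x * x - a) > tol:
        x = 0.5 * (x + a / x)  # one iterative step
        steps += 1
    return x, steps

root, steps = newton_sqrt(2.0)
print(f"sqrt(2) ~= {root} after {steps} steps")  # ~5 steps here; 1 with the time loop
```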
It's only a slight stretch to claim that such a computer would inevitably transform line noise into a strongly superhuman, perhaps ultimate, intelligence. So akin to the Fermi paradox, where is IT?
Am I the only person surprised that Intel hasn't used a big.LITTLE design? (i.e. one with a much-simplified core for housekeeping when there's very little going on, sharing state with a much faster core to which it would hand over when things get too busy). Can they dynamically shut down so much of a core that they don't actually need to use silicon real-estate for a separate housekeeper-core architecture?
I'm trying to think of any reason why HTML5 needs to contain the DRM standard, rather than just having a separate standardised DRM.
Surely if the media lobby wants a single world-standard DRM they should decide on one themselves and put it forward to ISO. Once standardised, users would download just one DRM entity to handle all standards-conforming media sites, and the (non-)problem of twenty incompatible DRM systems would be solved. Some browsers might even choose to implement ISOxxxx natively rather than as a separate user-requested plugin.
The only conclusion I can draw is that the media lobby has an ulterior motive in trying to get DRM into html5. Tell them to sod off and standardize it amongst themselves without fouling the HTML standard!
You could ask the same question about Gold, or any other fungible store of value. Whatever your nation enacts as law, is probably as good an answer as you'll get. In the UK, the rules are different for Sovereigns (=UK currency), Krugerrands (=SA currency) and gold bullion bars (commodity, not currency).
The irony is that Bitcoin is exactly the sort of thing that libertarian-leaning right-wingers ought to be welcoming with open arms and trying to protect. It's early days, but the intention is to create a form of currency that's (a) as "hard" as gold, (b) electronically tradeable, and (c) anonymous, just like fiat paper used to be, until circa 1970.
Shows how far the Republican party has fallen, that it will shoot itself in both feet just to bash the Democrats. Do they WANT the state to be able to trace every financial transaction by every "free" citizen? Do they WANT the state to be able to hyper-inflate everybody's savings out of existence? I fear that both of these things will be visited on the USA, and probably the UK, within my lifetime.
It's not a design flaw. It's an easily user-configurable option. It's also perfectly usable, if non-optimal, straight out of the box, because it's self-documenting. A program can't read your mind about where to insert itself, so it puts itself under its manufacturer's name and leaves it up to you to move or copy its entry to your place of choice. (Some programs do ask; more should). It also remembers the things you use most often and offers them to you at the top level (which is probably why most people never bother re-organising it).
If there's any problem, it's remembering how to change language, if the usual user of the PC is Finnish, or Arabic, or Chinese. But that's a general problem, not specific to the start menu. It would have been so much more useful if there were a button (called Babelfish? ) on the keyboard, instead of a Windows button.
It's possible to run KDE on Windows (and possibly other Linux desktops).
Reference please! It can't be any more shit than those tiles!
No, that's not how it is.
While one is learning to use one's tools, one has to think about the tools, and one's productivity is low. When one has mastered them, one doesn't really think about them at all. They are just a part of one's environment, while one thinks about the actual useful work that one is accomplishing.
It's the difference between learning to ride a bicycle (if you can remember that far back) and riding one. You don't think about how to move the handlebars or when and how to change gears, you think about where you want to be and about what the other road users are up to. You can even day-dream, and you won't fall off, though failing to keep your mind on the other road-users can be fatal for other reasons.
Windows was wired into me back in the days of Windows 98. Thinking back, DECwindows and the AIX UI weren't so hugely different. (What? say the youngsters. Yes, windowing UIs predate Microsoft). XP SP3 was probably the best UI, with Gnome 2 a close second. Windows 7 was a tolerable irritation, like learning to drive a new car that you didn't actually want but couldn't refuse. Windows 8 is not a tolerable change (and neither was Gnome 3, but the open-source community rapidly fixed that stupidity, whereas Microsoft are very hard of hearing and have a monopoly on the source code).
Try configuring a PC for a university, or even a school. Large numbers of different programs. Students in each department and/or year use a different small subset of them. A menu is a perfect way to organise them. 1000+ tiles in random (or even defined) order isn't.
I'd be perfectly happy if those stupid tiles were a new option that an experienced sysadmin could turn OFF. But no, Microsoft insists on ramming them down everyone's throat. And that's only a small part of what's wrong with the Win 8 UI. It also insists on going full-screen at every opportunity. What I want to do is have several windows open on my nice large 1920x1080 screen, and switch between them with one mouse-click. That's WHY I have a large monitor (and they're cheap and flat these days, so it's no great expense - some of my colleagues have TWO).
If they don't fix it by the time Windows 7 is EOLed, I'll say goodbye to Microsoft even if I have to find a new job to do so. For now I'll just say no to 8. They've still got a while for the penny to drop, though it's looking a lot as if the board is going to have to eject Ballmer while there's still time to save the company. I've seen other big companies go down the plug-hole because the boss's ego was too big to make the U-turn that the customers were demanding.
Nice to see a mention of Cinnamon.
When the Gnome folks messed up with "3", the open-source community bitched about it a lot and then got on with doing what Microsoft are signally failing to do. One lot wrote a user-friendly environment to run on top of the new graphics libraries (Cinnamon). The other lot rescued the old interface, warts and all (Mate). And of course, KDE, Gnome 2, XFCE, and dozens of others never actually got taken away.
I tried KDE, Mate and Cinnamon, decided I liked Cinnamon most, made "yum install cinnamon" a standard part of my set-up repertoire, and stopped bitching.
Microsoft's "start button" is no Cinnamon. It's a fig-leaf with very serious caterpillar damage.
Just tell them to buy one running Windows 7, because if they get "8" they'll hate it and you won't be able to do anything to help them.
If they're hard of hearing you get to say "told you so" and remind them that you said you couldn't help.
We're a university. Our computers have abnormally large numbers of programs installed, because they are shared by students from many different departments, each of which needs a few programs that the rest don't.
A hierarchical menu is the perfect way to organise these. 1000+ tiles is not. This isn't the return of the start menu, it's a sop thrown in our faces. I wonder if it'll mollify enough people for Microsoft to get its misguided (or evil?) way with us?
There is evidence of sudden level changes in the form of certain estuaries. Sea level fell quite fast, geologically speaking, creating a new steep run for the river down into the sea. The resulting fast-moving water caused rapid erosion of a steep-sided valley. Then sea level rose again, and the valley flooded with seawater, preserving its steep underwater form, because erosion below the low-tide level is very slow. Take a look at Dartmouth or the Wye.
It's very unlikely that the level of the UK land-mass could change fast enough to account for this. The UK is geologically pretty stable (large earthquakes are rare, for example). The Andes are an example of rapidly-changing land levels: frequent catastrophic earthquakes, river flows reversed in their valleys, and beaches raised tens of metres in mere thousands of years, as noted by Darwin.
BTW glaciation creates a straight-ish U-shaped valley (or fjord), while river erosion creates a wiggly V-shaped valley, so we can tell these southern estuaries aren't the result of glacier erosion.
One can also calculate what sea-level will be if all the ice-caps melt. At the very least, that should put you off investing in any land or property anywhere near to present sea-level! More seriously, the worst-case scenario is so bad that I do think we ought to heed the precautionary principle before it's too late.
Worst-case is permafrost thawing releasing methane causing global warming causing more permafrost thawing ... a positive feedback loop that won't stop until all the permafrost has thawed. There is geological evidence that such runaway warming events have happened several times in recent geological history. There is also geological evidence that the long-term stable (say 30Myear-average) situation for the Earth is with no large ice-caps at all and a MUCH warmer climate regulated by percentage cloud cover. Ice causes instability and positive feedback loops. We live in climatologically interesting times.
Laffer discredited? It's surely bleeding obvious that the receipts from a tax set at 100% are zero, because you'd have to be mad to earn under such a regime. So there must be a curve from zero income at 0% tax to zero income at 100% tax. The argument is over what shape that curve is and how it varies with time, for tax rates between the extremes, and the effects of changing the rates. This, like many interesting things in economics, cannot be measured with useful precision. (Though gut reaction says that if a government is taking more than half of what one earns, it's time to start thinking about leaving! )
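A toy illustration of that curve; the shape parameter k below is pure assumption, which is rather the point:

```python
# A toy Laffer curve: revenue = rate x taxable base, where the base shrinks as
# the rate rises. The elasticity k is an invented number - the whole argument
# is over its real-world value, which is exactly what can't be measured precisely.

def revenue(rate: float, base: float = 100.0, k: float = 1.0) -> float:
    return rate * base * (1 - rate) ** k  # zero at rate=0 and at rate=1

for pct in range(0, 101, 20):
    print(f"rate {pct:3d}% -> revenue {revenue(pct / 100):6.1f}")
# 0% -> 0.0, 20% -> 16.0, 40% -> 24.0, 60% -> 24.0, 80% -> 16.0, 100% -> 0.0
# The peak sits somewhere in the middle; exactly where depends entirely on k.
```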
Since so many companies can avoid paying this tax by relocating, why not scrap it?
There would then be no incentive to leave the UK and every incentive to relocate to the UK. More employment, so fewer benefits paid by the government to the unemployed. More purchasing by companies in the UK, so more indirect taxes raised. Maybe the government would find that it didn't need to raise other taxes to make up any shortfall, maybe even the opposite. Worth a try?