2561 posts • joined 10 Jun 2009
Re: Another idea...
If you like Cinnamon but not so much the rest of the Mint distribution, Cinnamon is also available for Fedora through the standard package installer. In fact, installing Cinnamon is the first thing I do after installing Fedora, because I really dislike Gnome 3 and prefer Cinnamon to KDE or XFCE. (Haven't checked if Mate is available for Fedora, since Cinnamon made me rather lose interest.)
Scientific Linux 7 should be coming fairly soon (3 months after RHEL 7 ships?), and will be based on Fedora 17. I hope that means I'll be able to install Cinnamon on Scientific Linux, which may come fairly close to perfection in my eyes.
I don't recommend Fedora in general because it's a bit bleeding-edge for my taste, and forces you onto an upgrade treadmill in that there's no long-term support for Fedora releases. However, it does tend to have support for the latest hardware, so I reach for it when SL lacks the hardware support I need.
Re: mint support
Is there a commercially supported stable Linux that does not charge an arm and a leg for private use? The corporate Red Hat prices are a bit steep, I think.
Depends what you mean by "supported".
If you want to pay so there's someone on a phone who will help you with any "issues", there are various organisations out there who will do that for popular distributions. I don't have any experience of who is how good.
If you mean you want the long-term security updates and platform stability that you can get by paying for RHEL, then look at CentOS or Scientific Linux. Both are built from the RHEL open-source packages. Both are free. For most usages, they are interchangeable. Both, for example, can use all(?) binary packages from Red Hat's EPEL repository (Extra Packages for Enterprise Linux).
CentOS claims 100% bug-for-bug compatibility with RHEL.
Scientific Linux (the most misnamed distribution?) is an almost-clone of RHEL and can be used virtually anywhere that RHEL can. It might more aptly be named "CERN Linux": it's supported at CERN and used extensively by CERN. The not-quite-a-clone bit is that if CERN finds a bug or needs a kernel feature that RHEL doesn't fix or ship, CERN will fix or build it themselves. In practice I have never found anything that works on RHEL that doesn't work on the same-numbered release of Scientific Linux. It also ships with auto-updating turned on; CentOS ships with it off.
There's always a risk that the CentOS distribution might go away. There were reports of bust-ups amongst the project leaders around the time of RHEL 6.0, but it's still with us.
I doubt that Scientific Linux could go away any time before CERN goes away. That makes it my choice for a free Enterprise-grade Linux. YMMV.
Re: WINE or VM?
What sort of tasks produce this sort of results?
The most spectacular one for me was installing Windows XP + apps (i.e., developing images for deployment). Even after I worked out that VMWare had optimized formatting a virtual disk into an (almost-?) no-op and after adding back the time saved at that point, it was still considerably faster installing into a virgin VM than installing onto the hard disk of the same system natively. Some day, out of curiosity, I may repeat the measurements under native Linux KVM or Virtualbox and with Windows 7, but VMWare workstation on Linux was there first and that's when I was doing this stuff. It's someone else's baby now, and he insists on doing it all the all-MS way.
Why? My guess is that Linux's cacheing of the hard disk to RAM is greatly superior to Windows'. Operations like copying a large folder containing a lot of files were also faster within a VM than when Windows was in direct control of the disk. Or maybe it was just that 32-bit XP couldn't use all 4Gbytes of RAM, but 64-bit Linux could. (In fact so can 32-bit Linux, via PAE, though not all on one single process).
The even greater saving of time was from VMware VM snapshotting. Snapshot - sysprep - image - test deploy - find bugs - what then? On the VM as opposed to a native installation, you just revert to the snapshot, which takes mere seconds (and saves waiting for a boot!), then start fixing the bugs. You can also do much the same at the Linux LVM level, except it's probably a good idea to make sure the VM is shut down at the time you take the snapshot, if the VM is unaware that a snapshot is being taken.
Re: What about...
KDE is probably the "heaviest" of the Linux desktops. If you have a powerful desktop system it's OK. As for whether you like the KDE experience ... try it and see. The nice thing about Linux is that you can choose between desktop UIs at log-in time.
But another good thing about Linux is that it can be very usable on elderly hardware, which you can obtain for free or almost-free because it's not man enough to run Windows 7 comfortably. My feeling is that KDE and Windows 7 desktops are similarly demanding. Something lighter is better on such hardware.
Re: One of the first truly usable Linux desktops
Boggle. On any Distro I know, install Chrome or Chromium (or other open browser of your choice) and then forget about Firefox until you are sure you are happy with your chosen replacement. Then use the package manager to un-install Firefox, if you are really keen to reclaim a smallish amount of disk space.
Or you could close your eyes and jump: un-install Firefox first and then install Chrome or whatever. You can always put Firefox back later.
Linux is a perfectly adequate alternative to Windows right now, and has been for years. EXCEPT ...
People often don't want alternatives. They want exact bit-for-bit and feature-for-feature identity. For example, they don't regard Openoffice as an MS office alternative, neither on Linux nor on Windows. Ditto Gimp versus Photoshop (much bigger bucks at stake here). Ditto Octave versus Matlab. Some even object to browsers that aren't MSIE.
And then there are existing organisations that have locked themselves into MS proprietary file formats, and now can't afford to escape. Access databases are one obvious lock-in.
But if you are starting an organisation right now, think long and hard before you let Microsoft get a toe in your door. The best time to break free, is never to enslave yourself in the first place.
WINE or VM?
Personally I've never had much joy with Wine, whereas a Windows VM under Linux works extremely well, sometimes faster than running Windows native on the same hardware. NB you do need enough RAM for the two O/Ses side by side. Don't try this with Windows 7 on a 2Gb Linux system!
Windows into VM
If you have enough RAM and if Windows licensing doesn't block you, it's a good next step to make a Windows VM so that you never have to shut down Linux to run Windows.
Re: Question for you enterprise chaps
It's more a matter of conservatism and long planning stages, resulting in Microsoft "lemons" being squeezed out (sorry) before any deployment happens. IT is there to support the business. Disasters happen when the tail is allowed to wag the dog.
Most enterprises avoided Vista altogether because they were happy with XP for the foreseeable future, and their techies were able to say "don't go there" before any serious commitments were made.
Many enterprises are still migrating XP to Windows 7. Many would prefer that XP remained viable forever, but Microsoft have hit the XP kill switch. There's no reason that Microsoft couldn't have incrementally improved XP, including replacing its kernel, without inflicting a painful migration. Or have given us proper migration tools for going from XP to 7. So that was perhaps the first sign of the rot setting in at Microsoft.
The posts above should give you a pretty good idea of what enterprise techies are telling their bosses about Windows 8. Soon, we'll be reading about what happens to the bosses and companies who override their techies because they prefer to believe Ballmer's promises. It'll be interesting. I wonder if it'll take down a bank? (Kidding - I hope). Most companies will wait and watch for now. It's been a good strategy in the past.
IMO "Windows 9" will be the last chance saloon for Microsoft. If they EOL Windows 7 without providing a Windows that enterprises are happy with, the decision will be taken that if there's got to be a really painful migration, then why should it involve Microsoft at all? Until then there is still time for Microsoft to fix things.
Sooner or later if Microsoft carry on the way they are at present, a big company with deep pockets and a long-term plan will decide to challenge Microsoft with a Linux-based Enterprise Desktop and some heavyweight migrate-from-Microsoft support. A company like IBM or Samsung or possibly a company from outside IT altogether. (Anyone remember what Nokia did before entering the mobile phone business?)
Years ago I watched as Digital Equipment self-destructed because of managerial greed, incompetence, and hubris. Now I feel that Microsoft is treading the same path to corporate oblivion.
My thought also: WD and HGST will merge, so duplicating research is pointless. Hybrid drives are WD's job.
As for "robust device drivers" I understand what he is saying. A modern CPU is more powerful than the microcontroller in a hard drive, especially when it comes to algorithms that can't be accelerated by custom hardware. So letting an OS combine an SSD-based cache with conventional disk drives may be the best-performing solution. However, it does require someone to write that driver and to support it. (Is it a hint that post-merger, HGST may be getting into the software business?)
The nice thing about a hybrid drive is that you know exactly who to blame if it mangles your data. I feel pretty confident that the caching in hybrid drives will be robust and reliable, even if it's far from optimal for any particular usage pattern.
Re: and let's also add...
Actually it [petrol] is not much of a fire hazard and safe storage is easy
The connector is the really big headache
You've got to design a quick-swap connector that can support very high currents and voltages, and associated environmental shielding for the swappable battery. 150kW implies 150A at 1000V, or 600A at 250V, or something like that. It's got to be safe. It's got to work in an automotive environment, where salty water is being sprayed around it at 80mph. It's got to have consistently low resistance or else the car will go a few miles and cut out with a thermal alert (or just catch fire).
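To put rough numbers on that (a toy calculation; the 150 kW transfer rate and the milliohm contact resistance are illustrative assumptions, not specs from any real car):

```python
# Toy calculation: currents and contact-resistance heating for a 150 kW
# battery connector at a couple of plausible pack voltages.

def current_amps(power_w, volts):
    """Current needed to deliver power_w at the given voltage."""
    return power_w / volts

def contact_heat_w(current_a, resistance_ohm):
    """I^2 * R heat dissipated in the connector contact."""
    return current_a ** 2 * resistance_ohm

POWER = 150_000  # 150 kW

for volts in (1000, 250):
    i = current_amps(POWER, volts)
    # Even a single milliohm of contact resistance dissipates serious heat:
    heat = contact_heat_w(i, 0.001)
    print(f"{volts} V -> {i:.0f} A, {heat:.0f} W lost per milliohm of contact")
```

At 600 A, one stray milliohm is already a soldering-iron's worth of heat sitting inside the connector, which is why consistently low resistance matters so much.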
The lead-acid battery in your car is connected using a spanner and contact jelly, not a plug and socket, for very good reason. The much lower-current hot-swap connector in a single-kilowatt UPS is a weak point. I've seen what happens when it fails. Not pretty. I suspect Boeing's Dreamliner woes were also a failed design iteration of this same problem.
The trouble is that a battery degrades with age, and that the degradation is a function of how it is used. In contrast a Calor gas container is pretty much binary: OK with 100% capacity, or completely fubar.
The only way I can see to make it work would be for every electric car owner to pay a one-time battery fee (transferable, refundable when the car is scrapped) and for all batteries to be owned by the motor company (or by the national battery bank(*)). That, plus standardized batteries and swap-robotics across all electric cars. That, plus some in-battery circuitry so that its charge capacity is known when it's swapped, and folks not needing full range would get a discount on an age-degraded battery.
It's not impossible, but it maybe points at a need to give up private ownership of the cars themselves. The Streetcar model?
(*) I mean bank. They used to use gold as capital. Now they use lies. Why not use Lithium?
I firmly believe that NSA will (if they haven't already) crack quantum computing long before the private sector. When they do, they won't tell anyone.
I'm quite sure that they haven't already.
You may be right as of some near future, but the consequences of that will be greater and faster and stranger than you imagine. "The Laundry" plays this idea for (very uneasy) laughs, but I expect laughs would be the last thing on our minds. The result would be more like Skynet going active crossed with the Stargate sequence from 2001.
I don't expect the singularity to arrive this way, because I don't believe nature will support quantum computing work for numbers of qubits sufficient to break strong cryptography ... but I don't have any particular hotline to the future and may be proved wrong. In which case, may the Eschaton be merciful. Cracking cryptograms for our amusement will be the last thing on its mind.
Re: What's spanish for "exterminate"?
Wrong TV show
These ballbots are, of course, prototypes for "Rover". I am not a number, I am a free man!
Could be worse
The dead maniac might have made a very dangerous convert.
Or it could have been reality imitating fiction. In "Iron Sunrise" Charles Stross imagined a place where they studied Hitler, Mao and Pol Pot as examples of leaders who were excessively lenient.
Re: Somewhat ironic....
Not ironic at all. A simple case of "that's OUR job not YOUR job", or just "YOU do what WE say".
I'd be far less concerned about Google if I could be assured that governments could never get their hands on Google's data. Because there are no rules at all as to what a government can do. It's the government that MAKES the rules (and usually exempts itself from them).
Re: but do they still burn
Titanium. With a big enough magnifying glass, probably rather better than you expected. Rather an oops, requiring immediate retreat, and I hope you are outdoors. Sand won't put it out, and adding water would be like throwing petrol at an ordinary fire. Think Aluminium (thermite) on steroids.
Forever is a long time
There's a huge difference between 25 years and forever, or even just 250 years. I'm sure back in Faraday's time, the majority view was that electricity would never be of much use to anyone ... and of course, the famous quote that there might be a world market for a dozen or so computers.
A billion years of forever gave us self-replicators that didn't even need printers. (Or sex, for that matter ... sex was new in Life 2.0, or maybe 3.0 or 3.1)
I thought it was the Kiwis that had the giant insects, and the Aussies that had the small but deadly ones?
Re: There's no way an app can be written to run on any Windows device
Which is why the UI on an iMac is not the same as the UI on an iPad and neither is the same as the UI on an iPhone. THREE classes of device, three different UIs. Not one size fits all, nor two sizes fit all. If Apple is smart, there are considerable amounts of common code underlying them, but the user doesn't care one way or the other. He just cares about having the right UI for the device and work to hand.
The mark for how badly MS has fscked up, is that I'm starting to sound like an Apple fanboi!
Apologies seem to be completely out of fashion. Especially prompt and sincere ones.
I would have hoped that the guidelines would suggest there's a major difference between someone who apologises once it becomes clear he's caused offense, or who retracts something that's both hurtful and inaccurate, and someone who refuses to and instead reiterates or reinforces his original post.
Re: The article is about PC makers problems, not MS's internal problems
How does anyone know how many desktops and laptops there are out there running Linux? It's not as if you have to buy Linux from anyone. I burn a distro DVD for anyone who asks. Most of the hardware that is running Linux has got a paid-for Microsoft license key glued to it, so it's being counted as a Windows sale by someone. Some started being used with Windows, and were then re-purposed. Some dual-boot. Some run a Linux VM in Windows (good way to start a personal migration) and some run a Windows VM in Linux (which can be faster than running Windows natively on the same hardware!). Some were nuked to Linux out of the box, and the Windows sticker represents a tax levied by a monopoly which you cannot avoid short of building your own PC from components. (Technically-minded Linux users do self-build far more often than Windows users, because they avoid paying the Windows tax that way, but it's not an option with a laptop.)
I've seen more Linux laptops in our students' hands this year than I can remember in previous years. That's nothing compared to the increase in the number of Macbooks, though. Tablets of any sort less so - in a university, you tend to need a keyboard. Wouldn't be surprised if the better-off students also have a tablet back home. Just like me. And unlike me, they all seem to be toting smartphones, mostly Android or iPhones.
Re: My solution - elephant in the room is Windows.
The masses are not buying Mac.
Their old PC running XP or 7 breaks down, and they go down to a box-shifter on the high street and walk away with a new PC. They may see "Windows 8" as a mild positive if they are clueless. They get it home. They hate it. Their money is gone, so they are stuck with it.
When they have enough money to be able to buy again, what are they going to buy? "Fool me once, shame on you. Fool me twice, shame on ME".
Re: It's not that hard to see the problem
No, the OS is NOT irrelevant.
It's true that the PC market is a replacement market, and that people are now only buying new PCs because their last PC has broken down.
But this particular PC user does NOT want Windows 8. To me, its value is NEGATIVE, in other words I will pay more in order NOT to have to struggle with its numerous deficiencies. If I didn't have access to corporate downgrade rights (ie Windows 7) I would buy something from Apple. Or something running Android. Or install Linux onto a PC that came with Windows 8, if that remains possible and if I cannot find any alternative.
Microsoft is blind to this, and every day that goes by without Windows 7 or Windows 9-like-7 returning to Joe Public's marketplace is another day that Apple (in particular) eats further into Microsoft's home-user market. Folks DON'T NEED a new PC immediately, and they DON'T WANT one running Windows 8. Two different facts.
Toshiba may also gain a lot of reliability by using SLC. The other manufacturers are presumably confident that they can detect failing blocks in MLC and retire them without any detriment to the end-user, but SLC is intrinsically thousands of times more reliable as well as faster. And 1000x more write cycles means the firmware can be more aggressive (or responsive) about replacing blocks in the flash cache.
BTW it's possible that thinner actually confers some sort of advantage to the electromechanical parts in a single-platter drive. Perhaps head aerodynamics? Or that the shorter spindle down the middle is more rigid? Or maybe it's just that it'll find a home in new ultra-thin designs where the fatter drives can't venture.
AMD - Intel dynamics
I really wish I could be privy to Intel's boardroom's view of AMD.
My theory is that Intel regards AMD as an asset because Intel fears becoming a monopoly. That would mean (a) government interference and regulation, and (b) nothing to stop it increasing margins by decreasing R&D ... short-term gain ... until it missed something radical from a company that was completely off its radar. At that point it would be unable to respond fast enough, because its R&D would by then be a shadow of its former glory.
So AMD gets to play with exotic CPUs, while Intel gets to say it's not a monopoly because folks can buy from AMD instead. Some do, as proof thereof. Most don't. Intel has an intrinsic advantage from its process technology and scale. It does boring mainstream stuff extremely well. It can follow wherever AMD shows there is a serious new direction to move in (like x86-64, for example). As long as it keeps its process technology advantage, its designs only have to be 2/3 as effective to benchmark as equals, and it can improve on 2/3 x 3/2 on its next design iteration. In short, AMD keeps Intel on its technological toes without making a serious dent in its profits. Intel for its part keeps AMD in financial distress, but not terminally so. It could kill AMD with a price war, but doesn't want to.
Intel could probably also put AMD out of business through the courts for patent violations, but doesn't want to. Intel probably believes that it can get away with copying anything AMD has patented, because in a nuclear war between patent lawyers, the company with the deepest pockets will win even when it loses.
Well, that's my theory. Thoughts?
Re: Theory v Practice
I'll have a go. (If your question is WHY is quantum reality like (or somewhat like) this, no-one has the faintest idea. It's like asking why gravity exists. At present, just assume that $DEITY thought it was a good idea).
A quantum bit can be in states 0 and 1 at once; in an equal superposition, when it is observed there is a 50% probability that it will be a 1 and a 50% probability that it will be a 0.
Entangled bits are in a state where their properties are mathematically correlated, so for example two qubits represent 0, 1, 2 and 3 at the same time. Call an ensemble of entangled quantum bits a quantum register. Start so that its N bits represent all numbers in the range 0 to (2^N)-1, with equal probability of 1/(2^N), at once.
Now perform arithmetic operations on your N-bit quantum register. For example, if you add a quantum register to itself and then observe it, you are guaranteed an even number. The self-addition operator causes bit 0 to assume a zero probability of being a one. So far, somewhat trivial.
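The self-addition step can be sketched as classical probability bookkeeping (a toy model only, not a real quantum simulation: doubling mod 2^N isn't unitary, and there are no amplitudes or interference here, just probabilities):

```python
# Classical toy model of "add the register to itself and bit 0 becomes
# certainly zero". We track one probability per value the N-bit register
# can hold.

N = 4
SIZE = 2 ** N

# Start with all 2^N values equally probable.
probs = [1.0 / SIZE] * SIZE

# "Add the register to itself": value x becomes 2*x mod 2^N.
new_probs = [0.0] * SIZE
for x, p in enumerate(probs):
    new_probs[(2 * x) % SIZE] += p

# Every odd value now has zero probability: bit 0 is certainly 0.
odd_prob = sum(p for v, p in enumerate(new_probs) if v % 2 == 1)
print(f"probability of observing an odd number: {odd_prob}")  # 0.0
```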
It gets really interesting if they can make a quantum register with a large number of bits, and perform operations on it such that the probability of it representing any number that is not a factor of a specified much larger number is zero. If the specified number is the product of two large prime numbers, when you observe the quantum register you will obtain one factor or the other. The probability of any other number (pattern of observed bits) is zero. Suddenly not nearly so trivial!
You obtain your other factor, and also check that the quantum entanglement had not failed during your quantum computation, by conventional long division on a conventional computer. Entanglement failure during a quantum computation is akin to a hardware failure in a conventional CPU except far more likely and (on the positive side) not irreversible. For the factorisation application (i.e. cracking N-bit PK cryptography) even an unreliable quantum computer is useful, just as long as it occasionally spits out a right answer with a much higher probability than 1/(2^N) (i.e. guesswork).
Last time I read up on it, they'd managed a four-bit quantum computer and were able to factorize 15 (in other words, when observed after the mathematical operators were applied, their 4-bit register almost always said 3 or 5, rarely anything else). Of course that's trivial, but a 2048-bit quantum computer would be anything but. I don't know the latest qubit count; 30-plus would start getting practically very useful, and probably very secret.
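What that end state looks like can be caricatured classically (this is just probability bookkeeping on the final distribution; a real machine would have to reach it via something like Shor's algorithm, which this sketch makes no attempt at):

```python
# Classical caricature of the factoring story: start with a uniform
# 4-bit register, zero the probability of every value that is not a
# non-trivial factor of 15, renormalise, then "observe" by sampling.
import random

def factor_distribution(n, bits):
    """Probability distribution concentrated on non-trivial factors of n."""
    size = 2 ** bits
    probs = [1.0 / size] * size
    probs = [p if (1 < v < n and n % v == 0) else 0.0
             for v, p in enumerate(probs)]
    total = sum(probs)
    return [p / total for p in probs]

dist = factor_distribution(15, 4)
sample = random.choices(range(16), weights=dist)[0]
print(sample, "is a factor of 15:", 15 % sample == 0)  # always 3 or 5
```

Given one factor, the other comes from ordinary division on a conventional computer, exactly as the post describes.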
My money is on quantum computing breaking down somewhere between four bits and 2048 bits, and that this will prove to be a gateway to some new physics. (Hopefully not of the Laundry variety, involving a takeover of our universe by beings best not even thought about). Philosophically, I'm not prepared to take on board the implications of a working Megabit quantum computer (or Giga - Tera - Zetta- ...!)
BTW if you are doing research in this field and make a sudden breakthrough of huge magnitude, your only chance of staying alive and at liberty is to spam the details to as much of the world as you can as fast as possible. There are also some mathematical theorems (unproven but believed true by almost all mathematicians), for which a disproof would have similarly vast real-world implications.
It's called chaos, or sensitivity to initial conditions. And also, you are comparing chalk and cheese. The black hole model is attempting to model possible behaviour of gas (plasma, surely!) in that environment, and to deduce some average properties such as its temperature. Weather forecasting is an attempt to predict exactly what state the atmosphere will be in tomorrow or next week, given observations of what state it's in today. A black hole type model simply tells you that it's possible that somewhere on the planet you will find a square mile where the air temperature is 40C and the humidity is high. And that nowhere will you find such a square mile where the temperature is 80C.
The weather forecast model is chaotic, or sensitive. You can't measure the atmosphere's current state with complete accuracy. The measurement errors grow with simulated time, to a greater or lesser extent, until the model is wholly erroneous when compared to the later reality. (It still represents possible weather, just not the correct realisation thereof). On the bright side, your forecast for a week out is useful maybe two times in three, and that for two weeks out is a waste of computer time. In the worst case, the not-quite-hurricane that is going to devastate northern France in six hours' time does a sudden right-angled turn and devastates Southern England instead. (Michael Fish, you are forgiven).
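The sensitivity described above has a textbook toy demonstration, the logistic map (nothing to do with any real forecast model, but it shows the mechanism: a tiny measurement error roughly doubles every step until the two trajectories have nothing in common):

```python
# Sensitivity to initial conditions: the logistic map x -> r*x*(1-x)
# at r = 4, its fully chaotic setting. Two starting states differing by
# one part in a billion decorrelate completely within ~50 iterations.

def iterate(x, r=4.0, steps=50):
    """Apply the logistic map `steps` times starting from x."""
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

a = iterate(0.123456789)
b = iterate(0.123456790)  # a 1e-9 "measurement error"
print(a, b)  # typically nothing alike after 50 steps
```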
Re: Yeah, its about protecting your stolen phone - seriously?
No, that can't be it. If they believe that a phone is going to be used as a trigger for a bomb, all they have to do is ask (or require) the phone company to disconnect that phone number (or better, for evidence-gathering, to divert the phone number to their evidence-gathering facility). If they can't do this today, it's a relatively uncontroversial bit of legislation to fix it so they can.
On the other hand I can see why they might want to permanently kill every phone within 100m of a demonstration at which police brutality was happening. Other things like that. And of course, it's completely impossible that some hostile foreign government might get its hand on the kill switch for every mobile phone in the USA. Isn't it?
Re: Quality of life
Really FIX them? Or just work out how to steer the lusers around the rocks?
I don't know the details of what exactly the supremes did allow them to patent. As I said, if it's the novel lab or field techniques that allow testing for the BRCA1 gene, that may be fair enough. But if it's the complementary DNA sequence itself, that's really no different to patenting the BRCA1 gene (which the supremes disallowed).
An analogy might be a court saying that you aren't allowed a patent on the 23rd page of "Wuthering Heights", but letting you obtain a patent on the same with all occurrences of the words "The" and "A" deleted. Complementary DNA is the original DNA with sequences known as introns deleted.
If they've managed to keep a patent on a single string of complementary DNA, then they've pulled a blinder on the supremes. I don't even work in life sciences, but I know that's a re-coding of the original DNA with some bits deleted. So it surely ought to be as un-patentable as the original human gene.
If they're the company that first worked out HOW to do this in a lab, perhaps they deserve a patent for the laboratory technique (though in principle, they're just using the same mechanisms that nature evolved).
ZFS is coming to Linux
ZFS is coming to linux - it escaped Oracle's control.
As usual for open source there's healthy competition between the ZFS port and that new kid on the block called btrfs. And as usual with filesystems, especially complex ones, it'll be a few years yet before either is ready for serious production work.
Re: Why the hell would they do that?
There are a lot of other problems that have to be solved to build a close-to-lightspeed starship, including an improbably large initial mass to payload ratio. Realistically, they'd need a MUCH longer lifespan, high radiation tolerance, and be happy to journey at 0.001c, so they're unlikely to be showing up for quite a few centuries yet.
But yes - the flaw in the strong anthropic principle is that the universe is NOT perfectly tuned for human beings, but rather for creatures that measure their lifespans in millennia. If such can exist ... otherwise it's not well-tuned for starfaring civilisations at all. We do have trees that have lived a few millennia, and fungi that have lived maybe twenty times longer ... and if the fungi were/are conscious, could we ever know it?
Then there's the possibilities for our Silicon AI children to clock slower than their parents rather than faster. At which point the Fermi paradox again kicks in. Where are the alien AIs?
Depends what you are simulating
It depends now much you need to experience the neural input of real life physics (kinetics, mostly). For piloting a large ship, a simulator is virtually the same as reality. For an airliner it's close, although there are situations close to disaster where kinetic forces can be felt and reacted to. For a fighter aircraft or racing car, it's far less close. And so on.
Re: re. "... electrons and positrons forced to interact.."
Created as positron / electron pairs by gamma radiation or high-energy electrons. Rest mass of an electron is 0.511 MeV, so anything much above 1 MeV can do it. Electric potential of a thunderstorm is 100s of MV.
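The threshold arithmetic, spelled out (a pair means creating an electron AND a positron, so the minimum photon energy is twice the electron rest mass-energy):

```python
# Pair-production threshold from the electron's rest mass-energy.

ELECTRON_REST_MEV = 0.511  # electron rest mass-energy in MeV

def pair_threshold_mev():
    """Minimum photon energy to create an electron-positron pair at rest."""
    return 2 * ELECTRON_REST_MEV

def can_pair_produce(photon_mev):
    return photon_mev >= pair_threshold_mev()

print(pair_threshold_mev())     # 1.022
print(can_pair_produce(100.0))  # True: thunderstorm potentials are 100s of MV
```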
Re: Win 8.1 - It's not a shot in the arm
I was thinking knee. Or maybe groin. Head, if they EOL Windows 7.
"Can I buy one with Windows 7"
"Do you sell anything with Windows 7?"
"Oh" (looks disappointed, wanders off to look at the TVs or kitchen appliances).
For every technology, the market eventually becomes mature and saturated, and the result is that volume of sales decreases to the replacement level because New and Old are approximately equal if Old still works at all. Witness kitchen appliances. Automobiles. Audio systems.
PCs have reached this point. Previously they became obsolete in three years, now they become worn out in 6-9 years, so a 2-3 fold shrinkage in volume is completely predictable (and tough for manufacturers). In contrast tablets haven't yet saturated their market, neither are they a mature technology. Most folks have worked out that PCs and tablets aren't competing products, any more than a car and a bicycle are competing products. PCs are for creating content. Tablets are for consuming content. Many of us have both.
Re: +1 avoiding an upgrade
You have to make an image backup of the HDD to another HDD before you do a Windows swap, because Windows is perfectly capable of borking the disk into complete unrepairability if you try to boot it in a new hardware environment. Microsoft doubtless considers this a desirable feature, not a bug.
In contrast, I've yanked a Linux disk out of an AMD multi-core system and booted it on an utterly different Intel single-core system with no trouble. I also had the assurance that / and /home were separate partitions, so / was expendable and /home wouldn't be mounted after commenting out its line in fstab.
Re: Classic Coke
Can't speak for Macgyver, but I have the same problem. The same hardware works perfectly fine when loaded with Windows 7. Install Windows 8, and it's fubar on the networking front. Repeat-tested on two instances of the same hardware.
Re: >"a few more minutes"
"The bad news is that it's dropped into the earth's core and is now irreversibly eating our planet. The good news is that it'll take the next three billion years, give or take a few million."
I fear no-one would care much even if it were just three thousand years.
Re: basic science
What did (Prof. Sir) Nevill Mott get out of studying the field effect in the 1930s? At the time it was just theoretical physics. It wasn't even the first sort of transistor we made ....
Curiosity has survival value, or at least always has done until now. If it hadn't, we'd have evolved out of it. (And I'm not even sure we're the most curious species. If there were a way to normalize curiosity by intellectual capacity, I'm sure cats would beat us).
Re: If you want to bash Positrons & Electrons...
Doesn't work. The trouble is that charged particles emit electromagnetic radiation when you change the direction they are moving. The higher the energy of the particles and the tighter the bend, the more they do it. You reach a limit where it's not feasible to pump in energy any faster than the particles are shedding it, and the only way around that is a bigger ring (or ideally a straight line for zero losses, which is where this discussion started).
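For scale, the standard rule of thumb for electron synchrotrons, energy lost per turn U0[keV] ~ 88.5 * E^4[GeV] / rho[m], with LEP-ish numbers (100 GeV beam, ~3000 m bending radius; both figures illustrative):

```python
# Synchrotron radiation loss per turn for an electron beam, using the
# standard rule of thumb U0 [keV] ~= 88.5 * E^4 [GeV] / rho [m].

def loss_per_turn_gev(energy_gev, bend_radius_m):
    """Energy an electron sheds per turn, in GeV."""
    u0_kev = 88.5 * energy_gev ** 4 / bend_radius_m
    return u0_kev * 1e-6  # keV -> GeV

loss = loss_per_turn_gev(100.0, 3000.0)
print(f"~{loss:.2f} GeV lost per turn")  # roughly 3% of the beam energy, every turn
```

Note the E^4: double the beam energy and the loss per turn goes up sixteen-fold, which is exactly the wall the text describes.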
By the way, the control codes at CERN really DO have to include a calculation of the phase of the moon. The earth suffers elastic tidal distortion, enough to change the alignment of the LHC to a significant extent.
The difference is that they're relatively small and rigid structures that can survive the gradual distortion of the ground they are built on by anything less than a truly huge earthquake. Whereas a LINAC 10km long that has to be dead straight might end up bent out of alignment, which would ruin it.
A picture from California is worth a thousand words
Why not? It's no different to the everyday usage of "nuke it"!
Doesn't everyone know about natural cosmic rays up to 10^19 eV, and the number of them that have interacted with the sun in the last 4 billion years? (Hint: the sun is BIG). If really high energy particles had any untoward effects on large collections of matter, we wouldn't be here to talk about it. (Or the fun alternative, maybe they once DID have dramatic effects on the pre-universe, and that's WHY we're here talking about it!)
Re: meanwhile in the texas desert
That's a bit like upgrading from a Bentley to a Bugatti, when you know that what you really need is a supersonic jet. Not a big enough advance to be worth paying that much for. The LHC is the biggest ring we'll build in any foreseeable future.
Linear accelerators are the most obvious way we might get a really big advance. In outline ... an electric field of a million volts per metre, over ten kilometres, is 10GV, i.e. 10GeV per pass for a proton or an electron (the energy gained is charge times voltage, whatever the particle's mass). Now work on higher field strengths and longer machines ... none of this is impossible, it's just unknown territory in engineering terms.
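For the record, the energy a singly-charged particle gains in a linac is just field strength times length, independent of its mass (numbers below are the illustrative ones from the text):

```python
# Energy gained by a particle of charge e traversing a linac:
# E = field strength (V/m) * length (m), converted to GeV.

def linac_energy_gev(field_v_per_m, length_m):
    """Energy gained by a unit-charge particle, in GeV."""
    return field_v_per_m * length_m / 1e9

print(linac_energy_gev(1e6, 10_000))  # 10.0 GeV from 1 MV/m over 10 km
```

Pushing the gradient up (or the machine out) scales the energy linearly, which is why "higher field strengths" is the whole game.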
For a rather further-fetched idea, look up PASER (and consider where we are today, starting from some theoretical 1930s papers on the field effect ... with a big detour via germanium junction transistors).
Bigger screen + much bigger battery = win
Unless you start to notice the extra weight?
Re: Time to pack up and leave
those places will probably not give you the kind of lifestyle you are used to.
Switzerland looks good for a decade or two longer, if you can hack the exchange rate. Unfortunately, it's surrounded.
I find the omnipresent surveillance nightmare more troubling than any other SFnal visions of the near future. Soon enough, we will be living in a world where everything is networked, and everything has eyes and ears and a CPU, and no-one will be willing or even able to break the rules. I can't see a way past that trap. And then we'll be no more able to react to changing circumstances than insects, and the subsequent fall will be deeper than any previous fall in history. Especially if by the time it happens, there is no longer any place outside "the empire" to flee to or to be invaded by.
When the Roman Empire fell, the people left in England lost the technology of throwing pots on a potter's wheel. This, even without nukes and surveillance.