8 posts • joined 3 Aug 2010
Mobile phones don't have remotely enough range
There are no cell towers in the middle of the ocean.
Agreed, it looks like it was probably a fire that disabled the crew and passengers with fumes but wasn't severe enough to take the plane down. In the first moments they changed course to head to an airport, but then the plane just travelled on as a ghost plane.
So the pilots weren't terrorists or suicidal, and all the fantasies about the plane being diverted and landing somewhere were wrong. It just tragically crashed into the ocean.
I still don't understand why they had enough time to turn the jet but not enough to radio a distress call. Still, this makes more sense than anything else I've heard so far.
Note that the announcement says the user passwords in the database were cryptographically hashed, and with a strong hashing algorithm.
The contents of the database were the public wiki and ticket system, which are already indexed by Google.
Leptons not Hadrons
The ILC is an electron-positron collider, not a proton-proton or proton-antiproton collider. Electrons and positrons are leptons, not hadrons. A 500 GeV collision energy is sufficiently far above the 125 GeV mass of the Higgs that they expect to be able to create Higgs particles in massive numbers, cleanly, without as much QCD background as the LHC produces. The LHC uses more massive particles so it can reach higher energies, which makes it better for discovery; the ILC would be cleaner and better for precision measurement.
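As a back-of-envelope check of that energy margin: the dominant Higgs production mode at an e+e- machine is e+e- → ZH, whose threshold is roughly the Z mass plus the Higgs mass. Assuming the ILC's 500 GeV design centre-of-mass energy:

```python
# Rough arithmetic: ILC centre-of-mass energy vs. the e+e- -> ZH threshold.
m_H = 125.0     # Higgs boson mass, GeV
m_Z = 91.2      # Z boson mass, GeV
sqrt_s = 500.0  # assumed ILC design collision energy, GeV

threshold = m_H + m_Z   # minimum energy to produce a Z and a Higgs together
margin = sqrt_s - threshold
print(f"ZH threshold: {threshold:.1f} GeV, margin at 500 GeV: {margin:.1f} GeV")
```

With well over 280 GeV of headroom above threshold, the cross-section is comfortably in the regime where large, clean Higgs samples are expected.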
The Register, as usual, gets all the details wrong in favor of attitude...
Private cloud can still be done cheaper. However, we still have a generation of IT managers who lust after "big iron": their idea of a private cloud is huge servers, a huge SAN, VMware vCloud, lots of Cisco 7k-series switches and routers, and massive checks written to VMware, EMC and Cisco. Those dinosaurs need to die.

The next generation needs to be building thin, dense private clouds on OpenStack, largely without SANs, paying nothing to VMware and EMC, keeping the Cisco checks under control, and using open source monitoring and management tools. You can absolutely beat Amazon's prices by a factor of 3-4x if you do that.

If you hire a bunch of lowest-common-denominator idiots for the lowest dollar to run your IT, though, you'll find yourself chasing after vendors to make up for the idiots you hired, burning money on six-figure IT solutions vendors over and over again, and you'll be roadkill. Hire smart people, build around open source, keep it lean, and private clouds are the way to go.
The main point of DevOps
Put the SDEs on call for their own software bugs, and close the loop, so that the SDEs aren't just tossing an unlimited stream of operational problems over the Chinese wall between dev and ops and getting all their technical debt "off their books", so to speak.
"as long as they understand the fundamentals of how a computer works"
Okay, I'll bite... what fundamentals are you talking about?
The fundamentals of how systems work that I interview for are things like what strace does, what a system call actually is, and what subsystems one would expect to see in a Unix or Unix-like kernel. Explain the TCP three-way handshake, or explain how traceroute works. Other interviewers like to use the boot-process question.
Generally you find very little understanding of the /fundamentals/ of how a computer works in people who just push the mouse around in a GUI. Those are the admins who don't know TCP at all, or who barely stumble through SYN, ACK, SYN|ACK (yes, usually in that order). The ones who know options negotiation, PMTU discovery, SYN cookies, etc. are the ones who also have strong command-line fu.
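The handshake and system calls mentioned above can be watched from userspace. A minimal loopback sketch (no special privileges needed) that you could run under `strace -e trace=network` to see the socket calls the kernel performs the handshake behind:

```python
import socket

# The three-way handshake (SYN, then SYN|ACK, then ACK) happens inside the
# kernel during connect()/accept(); userspace never sees the raw segments.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0: let the kernel pick a free port
server.listen(1)
port = server.getsockname()[1]

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))  # kernel sends SYN, awaits SYN|ACK, replies ACK
conn, addr = server.accept()         # returns once the handshake has completed
print("handshake done, peer:", addr)

client.close()
conn.close()
server.close()
```

Under strace you'd see the `socket`, `bind`, `listen`, `connect` and `accept4` system calls in order — which is exactly the kind of mapping between library calls and syscalls the interview questions probe for.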
Only using a GUI keeps you dumb.
Another saving: if you're using Amazon, you're stuck with the few instance memory sizes you can get from EC2: 1.7GB, 7.5GB, 15GB, etc. If you need a bunch of 4GB virts, you're stuck buying 7.5GB RAM servers and either politically wrangling apps onto the same OS instance or just wasting RAM.
You'll get better "compression" by being able to size the virts yourself by running your own internal cloud.
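The arithmetic behind the "wasting RAM" complaint is simple bin-packing waste. Using hypothetical numbers (a dozen 4GB workloads, one per forced 7.5GB instance):

```python
# Hypothetical sizing mismatch: 4GB virts forced onto 7.5GB EC2 instances.
needed_virts = 12     # assumed workload count, for illustration
virt_gb = 4.0         # RAM each workload actually needs
instance_gb = 7.5     # smallest EC2 size that fits it, one virt per instance

wasted_per_instance = instance_gb - virt_gb
waste_fraction = wasted_per_instance / instance_gb
total_wasted = wasted_per_instance * needed_virts
print(f"{wasted_per_instance:.1f}GB wasted per instance "
      f"({waste_fraction:.0%} of what you paid for), "
      f"{total_wasted:.0f}GB across the fleet")
```

Nearly half the RAM you rent goes unused — which is exactly the "compression" a right-sized internal cloud recovers.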
I've also been doing these calculations repeatedly, because our board of directors seems to be infected with irrational lust for the cloud, and I keep finding that break-even on a cash-flow basis comes after about 3-4 months for 100% duty-cycle instances -- so Vijay is probably being conservative about what can be achieved in the real world.
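A sketch of that cash-flow comparison, with hypothetical figures (neither Vijay's nor mine): amortize server capex against the on-demand rent for a comparable instance running flat out.

```python
# Hedged break-even sketch with made-up numbers; plug in your own.
# Deliberately ignores power, space, and staff, which cut both ways.
server_cost = 2500.0     # hypothetical capex per server, USD
ec2_hourly = 0.98        # hypothetical on-demand rate for a comparable instance
hours_per_month = 730    # average hours in a month

monthly_rent = ec2_hourly * hours_per_month
breakeven_months = server_cost / monthly_rent
print(f"break-even after {breakeven_months:.1f} months")
```

With these inputs the purchased server pays for itself in about three and a half months of 100% duty cycle — the rest of its service life is pure saving.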
However, if your IT department spends millions on expensive enterprise-class tools and money is flying out the door to VMware, EMC, NetQOS and every other IT vendor out there, then it may be cheaper to go with the cloud -- but in going with the cloud you are getting Amazon's framework built on top of Xen and open source or in-house tools. You are not getting infrastructure engineered to "five 9s"; you are getting infrastructure engineered to "you bet it'll fail -- so spin up another one!"
If you run the numbers on cloud and find that it makes economic sense, you should fire your IT staff and start over from scratch with people who understand cheap open source.
Does not contradict the science
Climate science has consistently predicted that climate change up until the past 20-30 years was dominated largely by natural forces (orbital cycles, solar cycles, volcanoes, etc.), and that only in the past few decades should the temperature start to respond to rising levels of CO2 in the atmosphere, due to the integration over time of the historical build-up of CO2 and the thermal lag of the oceans.
Anthropogenic climate change from CO2 and other greenhouse gases is also partially offset by anthropogenic emissions of climate-cooling aerosols.
Prior to 1900, anthropogenic climate change was largely negligible, and up until 1960 aerosol emissions largely balanced GHG warming; it is only post-1960 that anthropogenic GHGs are predicted to have dominated climate change.
So, this is like using pictures of astronauts in space to "falsify" gravity. Actually, gravity predicts that astronauts in free fall should experience weightlessness. Similarly, climate science predicts that, prior to the last few decades, non-anthropogenic forces should have dominated. This article just confirms that climate science has been correct, and is yet another piece of evidence that natural forces dominated in the past and have only very recently become completely decoupled from typical behavior because of the introduction of anthropogenic GHGs.
Of course, in the "skeptical" world, black is white, and evidence that is completely consistent with climate science becomes evidence that contradicts it. All this new study contradicts, however, is the simplistic parody of climate science that the editors at The Register hold -- they have, in fact, falsified their own understanding of climate science, which just underscores that the editors here know nothing at all about it.