Only weeks since mathematicians proved it couldn’t be done, CERN boffins have put the smile back on sci-fi fans’ faces everywhere by discovering neutrinos travelling faster than light. The astonishing results, reported by Reuters and others, came as the result of the OPERA experiment in which 15,000 beams of neutrinos were fired …
CERN are off their rocker
Here's a rational look at these folks and their "problems" with software.
Oh god that link made me cringe. 40,000 bugs! Oh no! All their work must be wrong!
How many bugs has Windows had, and still does have? Yet most people seem quite happy with it.
Oh no, hackers know C++, so they must be able to automatically hack other C++ code and break into CERN.
No, that is not a rational look, it's an irrational rant. Of course ROOT has bugs; every large software system ever written has bugs. And of course such bugs can increase the systematic error in a scientific result - actually the systematic error is always more worrisome than the statistical error, because what if some source of systematic error has been overlooked? But there's no news here. Archimedes had systematic error in his experiments, and so did Galileo.
Thus, a deviation of 20 ppm may or may not mean anything - hence, the next step will be an independent experiment somewhere else. And in any case, do you know whether ROOT was used for the OPERA analysis?
The first computer course I did (long after I had taught myself how to program) was mainly full of stuff that I already knew, and so tended to be boring. However, it did introduce me to a couple of rules that I still hold as fundamental:
Rule 1: No non-trivial computer program either a) works perfectly first time, or b) is entirely free of bugs
Rule 2: Any programming issue can be boiled down to one of two things: a) Garbage-in:Garbage-out, or b) Speed
Oh noes! It's written in C++!
Obviously, if you are truly serious about writing software that won't get hacked, you must write it in a language that hackers do not know. Something like... er... lemme think... COBAL.
That way, if you're ever hacked you know you're looking for a hacker in his 60s, something of a rarity.
@Gav - I think that COBAL is so secure
because you are the only person who knows about it!
I suspect that you mean COBOL, and if there is any modern (post millennium) serious scientific application (I will not accept financial software as falling in this category, even if it is for scientific establishments) written in COBOL, I'll eat my copy or K&R.
Can you define "serious scientific application" as distinct from... say a..... humorous scientific application??? Or do you think Cobol can't do any hard sums??
Even a terminator runs on 6502 assembler
Eat your copy or K&R?
Muphry's Law rides again!
nah, he meant
COBOL is just SQL.. or rather, it is a version of SQL written by an evil genius - who decided to write a programming language where your code's failure to compile depends on time rather than errors. The more time-critical your project, the more likely your code will fail to compile.
You can take the exact same code that failed to compile one day and it will work the next, as long as your deadline has already passed.
COBOL is for adding up accounting info, and database operations.
COBOL was able to add up numbers of arbitrary length (directly in decimal, usually) while others were stuck with 16-bit ints. Even 32 bits only gets you to $20M (with pennies). So it can do 'hard sums' in that sense.
But, go write a COBOL procedure which finds the inverse of an m x m matrix, with variable m, and report back to us how it went. And if you succeed, let us know how fast it runs relative to the C or fortran version.
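The word-size point is easy to demonstrate. A minimal Python sketch (standing in for COBOL's packed decimal, since Python's arbitrary-precision decimals make the same point):

```python
from decimal import Decimal

# A signed 32-bit int tops out at 2**31 - 1 pennies:
INT32_MAX = 2**31 - 1
print(INT32_MAX / 100)   # 21474836.47 -- roughly the $20M ceiling noted above

# Arbitrary-precision decimal arithmetic has no such ceiling:
total = Decimal("99999999999999999999.99") + Decimal("0.01")
print(total)             # 100000000000000000000.00
```

Plain Python ints are arbitrary precision too; COBOL was pulling the same trick decades earlier, directly in decimal.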
El Reg commentards -- had again
"My colleague, Gyro Gearloose, has unearthed statistics"
"No non-trivial program a) works perfectly first time, or b) is entirely free of bugs"
I reckon Microsoft could screw up "NO-OP".
I thought they already have, it's called Data Execution Prevention (DEP :-)
@Field Marshal etc.
Yeh, yeh. Very droll.
I was trying to exclude the daft things students do at University.
But seriously, Cobol is quite definitely a commercial language, and is not at all suited to scientific work. It's missing lots and lots of things you take for granted in any language more suited. There is only one language (apart from the out-and-out weird ones for specific purposes) that I can think of that is less suited, and that is RPG!
On the subject of 6502 assembler, I'm sure if you looked hard you may still find a BBC Micro or two buried in the depths of some lab somewhere. BBC BASIC was written in 6502 assembler originally, and people did lots of interesting things in that, so 6502 assembler by proxy.
"6502 assembler by proxy."
Almost ALL programming languages can be considered "assembler by proxy" you know...
(Almost because I remember having read something about processors using high level languages as their assembly languages a long time ago... Perhaps I even dreamt of it...)
For pointing out the silliness in the reporting, and the importance of the scientific method - BRAVO! HEAR HEAR!
This isn't a case of "They arrived in half the time", this is a case of barely measurable, maybe something maybe error, squint and it goes away type errors. Neutrinos are hard to detect, and by the time you go from impact to scintillation to detection, there's enough fuzz on this measurement that, while it's "more than noise", it's also still "less than a clear signal".
And faster than light happens all the time...
... just not faster than light in a vacuum (which the rocky 700km between CERN and "Big Rock" aren't).
The Cherenkov effect --- the greenish light always seen around say nuclear fuel rods in cooling water for example --- is exactly that, (charged) particles going faster than light in the medium at hand.
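For anyone who wants the numbers behind that, a quick sketch (the refractive index is the usual approximate figure for water):

```python
# Light in a medium travels at c/n, where n is the refractive index.
# Charged particles can exceed that without breaking relativity; when
# they do, they emit Cherenkov light (the greenish glow mentioned above).
C = 299_792_458.0        # speed of light in vacuum, m/s
n_water = 1.33           # approximate refractive index of water

light_in_water = C / n_water     # about 2.25e8 m/s
particle_speed = 0.9 * C         # a charged particle at 0.9 c

print(particle_speed > light_in_water)   # True: Cherenkov radiation possible
```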
I gather the result was six sigma
60ns measured, vs 10ns error. They also repeated the measurement 15,000 times. I agree there is room for systematic errors in the OPERA experiment, and we should simply wait for confirmation (or the reverse) from independent experiments.
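As a back-of-envelope check on the figures quoted above (the split into statistical and systematic parts below is illustrative, not OPERA's published breakdown):

```python
import math

# Numbers quoted above: a 60 ns early arrival, ~10 ns total uncertainty.
delta_t_ns = 60.0

# Independent statistical and systematic errors combine in quadrature.
# (The 7-and-7 split here is purely illustrative.)
stat_ns, sys_ns = 7.0, 7.0
total_ns = math.hypot(stat_ns, sys_ns)       # about 9.9 ns

print(f"combined uncertainty ~ {total_ns:.1f} ns")
print(f"significance ~ {delta_t_ns / total_ns:.1f} sigma")
```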
It ain't over till the fat lady signs
The replication part
is true. The result isn't valid unless it's replicated.
But if it *is* valid, the science reporting is spot on. It's not just popular science that has elevated relativity to an absolute limit - it's science itself.
If the reporter knew anything about relativity he'd understand that relativity is a philosophical position about causality, and not just some fancy footwork that describes velocities in a counterintuitive space time.
Mainstream science has been insistent that the speed of light is an absolute limit for most of the last century.
If it turns out it isn't, that's not just a bit of an adjustment and an extension to existing models - it's a complete revision of how reality works.
because light travels faster than sound ?
"But if it *is* valid, the science reporting is spot on. It's not just popular science that has elevated relativity to an absolute limit - it's science itself."
Not really - see Quantum Mechanics.
"Mainstream science has been insistent that the speed of light is an absolute limit for most of the last century."
It has always been a concern of mine, whether the limit really is the (max) speed of light, or whether it is actually something a fraction faster.
In any case, E=mc2 only really applies to 'normal' stuff with measurable mass - it has always seemed likely to me that there are particles out there that differ so much from 'normal' as to allow them to ignore this rule.
I think most physicists at Gran Sasso already know that....
They're comparing to the honest-to-goodness, full-fat, speed-limit-o'-the-universe in vacuo speed....
A better comparison would be with the apparently super-luminal velocities of the plumes of active galactic nuclei 'observed' a while back... Further (non-trivial) research showed that actually, all was right with the cosmos and it was illusory.
Hmm, I could either pull up a reference for this, or down tools and head to the pub.... Guess which is likelier
@The First dave
Indeed, is 'c' the limit to the speed of light, or a more general limit that light maybe doesn't achieve?
I recall, back in the days when people were actually measuring the speed of electromagnetic radiation, that the speed of radio frequency radiation appeared to be a little different to that of light.
Have the theoretical physicists ignored this since?
....because light travels faster than sound ?
Have to agree. You should see how quickly my Girlie vanishes out of the room when she hears me fart....
If an object has any mass (which neutrinos do), then as you approach the speed of light the amount of energy required goes to infinity, correct? Just like with tachyons, this applies in both directions, both faster and slower. I guess of course this only applies if you follow the curvature of space time. Hope this is another cold fusion moment.
The shortest distance between two points is the line along space-time curvature. Not following space-time merely slows you down and requires more energy.
To my knowledge the only tested 'faster than light' phenomena are - quantum tunnelling, entanglement, and these meta-materials - all of which aren't really 'faster than light' in a strict sense, or are quantum uncertainty in action.
faster than speed of light in a perfect vacuum that is
Relativity only says you cannot travel at or accelerate to the speed of light. If something starts out faster than light, it will take infinite energy to make it slower than light. Also it takes less and less energy to make it go faster yet.
> Also it takes less and less energy to make it go faster yet.
You're assuming that, in that situation, mass would remain real. Einstein's equation tells us that it would become imaginary. I have no idea what that means to Newtonian mechanics...
All things start stationary and must be accelerated. And you yourself said you can't accelerate it past the speed of light...
I always assumed
that relativity predicted that you could not travel at the speed of light, because it would imply infinite mass and thus infinite energy.
But if there was a discontinuous way of jumping over the speed of light without actually accelerating through it, I believe that the equations could still hold, although I suspect that it would require a completely new branch of physics to explain the discontinuous speed jump in the first place, and also some strange concepts like negative mass.
I'm expecting serious physicists to rip this suggestion to shreds ( I got no further than Principal Physics in my General Science degree thirty years ago - equivalent to the 2nd year of a normal Physics degree), and I'm expecting to be thumbed down, but it will be interesting to see what is said!
I think you misunderstand what infinity is. In the sense you are talking about, it's a process, something incremental; it's not a value something can "change to". I think it's more likely that, if you go faster than the speed of light, the amount of energy required is an unknown by current thinking. It can't just be set to infinity as if it were a cap.
The specific part of the equation you're looking for is the correction factor
which takes different forms, but the easiest expression to work with is 1/sqrt(c*c - v*v), where c is the speed of light and v is the observed speed of the object. If v=c the divisor is 0; if v>c the divisor is imaginary. What division by 0 and what imaginary might mean is left as an exercise for the student.
Einstein essentially reasoned that it was an inherently continuous process and you could therefore never get an object with mass to the speed of light. Since quanta are very, very small quantities of energy, and increased acceleration requires ever-increasing amounts of energy, it seems a reasonable assumption.
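To see that correction factor misbehave numerically, here's a small sketch using the equivalent dimensionless form 1/sqrt(1 - (v/c)^2), with Python's cmath so that v > c yields an imaginary result rather than an exception:

```python
import cmath

C = 299_792_458.0   # speed of light in vacuum, m/s

def lorentz_factor(v):
    """Correction factor 1 / sqrt(1 - (v/c)^2), the dimensionless
    equivalent of the 1/sqrt(c*c - v*v) form quoted above.

    cmath is used so that v > c returns an imaginary number
    instead of raising an exception."""
    return 1 / cmath.sqrt(1 - (v / C) ** 2)

print(lorentz_factor(0.6 * C))   # real, approximately 1.25
print(lorentz_factor(1.2 * C))   # purely imaginary for v > c
```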
'All things start stationary and must be accelerated. '
Good luck accelerating a photon.
I'm fairly certain
that imaginary mass would mean that an apple would fall sideways and miss Sir Isaac's head altogether.
Perhaps it would catch in a (Kip) Thorne bush?
1/sqrt(c*c - v*v)
and yes it does go lopsided for v>c
The main argument Einstein started from is that if you travelled as fast as light you would see a static electric field with no charge to produce it.
If it missed his head and went sideways...
...then surely the Apple iPhone would slide neatly into Sir Isaac Newton's shirt pocket.
Sorry. Was that too obvious?
Yes, I realized that about 30 seconds after I hit the POST button,
but couldn't find my post to correct it.
Lasers depend on relativity? Really? Perhaps you are confused with quantum mechanics.
There ain't no Physics based on QM without Special Relativity.
What do you think the Dirac Equation is good for?
Short answer: special relativity predicted stimulated emission, at least that's my understanding.
Re: Re: Relativity?
Nah, I'm pretty sure stimulated emission (the Einstein A and B coefficients) was derived from thermodynamic arguments. Lasers work due to population inversions and stimulated emission producing coherent photons, neither of which has much to do with special relativity. (Though if you can prove me wrong I'd be interested to read about it!)
Also, Destroy All Monsters, there is QM that doesn't depend on special relativity. The Dirac equation is good for explaining spin-1/2 particles travelling at relativistic speeds; if they're going slowly you don't need to use it.
The Schrödinger equation is the non-relativistic limit (in effect a Taylor approximation) of the Dirac equation. QM arose independently of Relativity (see Planck), and it is still not possible to combine general relativity with QM.
lasers depend on relativity
well, since lasers produce light, and light follows Maxwell's equations, and Maxwell's equations are relativistic ...
Maxwell's equations are relativistic?
As Maxwell published in 1861 and Einstein in 1905, that is somewhat unlikely. Maxwell can be reframed in a relativistic context, but so can Newton.
Let's not comment on bad science reporting with bad science commentary, eh?
Re: lasers depend on relativity
It doesn't really work like that. The Maxwell equations are invariant under a Lorentz transformation, which means they take the same form in different moving frames of reference; equivalently, the speed of light is constant in any frame. This doesn't necessarily mean that a working laser proves that special relativity is correct just because light is involved somewhere. You can show how a laser works without involving SR at all.
A better example of SR would be time dilation of relativistic particles http://en.wikipedia.org/wiki/Time_dilation_of_moving_particles
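That time dilation example boils down to a one-liner. Here's the standard textbook muon illustration (the lifetime and speed figures are approximate):

```python
import math

C = 299_792_458.0        # speed of light in vacuum, m/s

# Textbook example: atmospheric muons, mean lifetime ~2.2 microseconds,
# travelling at roughly 0.98 c.
tau = 2.2e-6             # proper mean lifetime, s
v = 0.98 * C

gamma = 1 / math.sqrt(1 - (v / C) ** 2)   # time dilation factor, ~5

naive_range = v * tau                     # ~650 m if lifetimes didn't dilate
dilated_range = gamma * v * tau           # ~3.2 km -- why muons reach the ground

print(f"gamma ~ {gamma:.2f}")
print(f"range without dilation: {naive_range:.0f} m")
print(f"range with dilation:    {dilated_range:.0f} m")
```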
re: Maxwell's equations are relativistic?
Perhaps remarkably, Maxwell's equations indeed are relativistic, and were so before SR was proposed by Einstein. If you want a plausibility argument, then consider that SR uses light(speed) propagation to establish a metric, and that Maxwell's equations describe light propagation.
Otherwise, I suggest you go read any EM textbook.
Or ponder why, post-SR, Maxwell's equations are so popular.
Or feel free to invent a theory of non-relativistic light to describe your laser. Good luck with that, eh.