2130 posts • joined Tuesday 24th April 2007 14:31 GMT
Will you compare it to:
I have never tasted (nor do I intend to taste) this artery-clogging horror, but I would be interested in a fair comparison. All in the interest of science, of course.
Beer, because it is Friday afternoon.
Re: Shouldn't be any such thing as hate crimes
If people beat people up, that's bad: agreed. It's just the level of badness that is at stake.
In my book at least, it is worse if someone beats you up because he hates the group you belong to, rather than, e.g., what you said about him or his mother, or because you took a swing at him. There is even a sort of "bad sense" in mugging: people do it for profit. It is definitely bad, but not as bad as an unprovoked attack based on a person's religion, race, or sexual orientation.
"Now is the time to take advantage of the large webOS user base."
What? Both of them?
Sorry, I like WebOS; I even got the SDK out of curiosity, as a long-time user of Palm hardware (my Tungsten T3 was only retired last year). The joke is bittersweet in that a good idea seems to have been ruined by bad marketing. The HP statement is just surreal.
let every male mourner embrace, and perhaps even kiss, a male protester (I realize this requires setting aside your natural revulsion, but be strong!!).
For a nice touch add: "I forgive you."
I think this might freak these guys out, and they could hardly sue you for it: you are merely following the example set by Christ, showing love to thy neighbours.
According to inflation theory
it was space that moved faster than the speed of light. This is perfectly allowable according to relativity. It is hard to get your mind around, I agree.
Someone may have copyrighted the phrase "Tick Tock"
Now astronomers have one more reason to detest astrology
only lawyers will benefit from this.
Keyboard, eat hot, er,.... tea!
The 60ns difference equates to 18 m
I trust there has not been an 18 m shift in position, as that would be of earthquake magnitude. The mean temperature of rock beneath the ground is really stable, simply due to its HUGE thermal capacity and high degree of thermal insulation. This is why (wine) cellars often have high thermal stability.
Previously, the peculiar, apparently superluminal motion of jets in quasars and active galaxies could be explained elegantly by invoking special relativity: by assuming the jet is traveling at near light speed almost directly towards us. Here general relativity may well be an explanation (but it may not be).
He will be missed most by those near to him of course, but many others will miss his undoubted leadership. Though I am not an Apple user, many Apple designs set new standards, and gave others a point to aim at.
I will join you in raising a glass to his memory and achievements.
I can see why you might take offense
but I read the "as well" as meaning that it is OK to learn Java, and "real programming" also can be done in Java.
On the topic of teaching programming: some years ago we decided to use C in the initial programming course (imperative programming). This allowed us to ditch the object-oriented overhead of Java, which was used before. We go on to teach them algorithms and data structures in C, complete with structs containing function pointers. The latter paves the way for understanding what objects actually do for you. They also learn to clean up any mess they leave behind. Object-oriented programming is then done in Java. Later courses include functional programming (I think in Haskell), and parallel programming and concurrency.
This is just the main programming track of the curriculum, other tracks focus on software engineering and architecture, database theory, operating systems, compilers, etc. Many of these skills might never be used in practice, but they do give better insights, and mean you have learned to learn difficult topics.
The best reason to learn a programming language is to learn new ways of thinking about problems. It does not help to learn 15 subtly different OO languages at university, it is important you learn different programming approaches. Once you have learned a style well, it is trivial to learn another. I learned programming in Pascal, the switch to C was quite trivial; I started OO programming in C++, learning Java was quite simple.
And there I was thinking
they might be suffering from dyslexia.
Or they were in such a hurry to make money they forgot the last letter of their nam.
Having said that, their behaviour causes dyspepsia.
re: dark matter
I just love it when people come by and start saying: look, scientists got it wrong in the past, so current science must also be wrong. This shows they do not quite get the scientific process.
"Getting things wrong" is part and parcel of science. All current theories do is model the world in such a way that a large set of observations is explained. Testing theories consists of making predictions, doing experiments, and seeing where the theory gets it wrong. If it gets things wrong, we have to make a new theory which explains all the old observations AND the new ones. Alternatively, there may be a mistake in the new observations, and the theory survives to be tested again. Each new theory offers a better approximation of the behaviour of nature, and should be harder to prove wrong than the previous one. However, even if it mimics the behaviour of nature exactly, we have no guarantee it is the real mechanism behind nature; it is just a perfectly good model.
It is natural for scientists to be cautious when a theory confirmed by thousands of experiments is contradicted by a single experiment. At the same time many physicists are unhappy with the notions of dark matter and dark energy, and are looking for alternatives. Where there is disagreement there is progress in science.
I actually state that there are those who can teach themselves, but maybe I should have stressed that more. I studied astronomy, and am therefore largely self-taught when it comes to computer science. This is why I am still learning more about core computer science today.
When I take on PhD students for projects with a lot of coding, I always look for passion, for people who do extracurricular stuff (not just in terms of coding, mind you). One problem is that 3-5 years is way too short to learn to code REALLY well. You typically need ten years (as for any real skill). A Uni can give you the theoretical foundation, you have to build a house on that by work experience (or hobby).
A nice site on this topic is
We do teach computer science. There are IT courses taught by the Hanze University of Applied Sciences next door, but we, and all other traditional universities over here, still teach computer science. However, very few of our graduates ever become coders. They become researchers (in academia or industry; a lot of ours go to companies like Philips Medical Systems), or become software engineers and architects (OK, they are also involved in coding, but cost a lot more ;-) )
I also notice there are huge differences in IT degrees themselves. Some degrees are much more oriented to learning to use available tools and languages than to actual problem solving. We sometimes get students from these courses applying for our MSc course in computer science. These guys know way more than our students when it comes to the common tools out there, but they have huge problems with their problem-solving skills. The best acquire these skills; the rest flunk the course.
Good point. We TRY to teach them, but then some just replicate instructions until they pass the exam, and then forget all about it. Some people are, as we say "resistant to education." At the same time there are many self-taught people who are excellent. Besides, there are many practical skills we do not teach (especially on the wealth of different tools out there).
The most important thing you can learn in education is learning itself. I have had no formal training in lattice theory, but I have acquired the skill set to learn it, and now contribute to the field. Many of my coding and debugging skills I learned in practice, in my first job as a scientific programmer (where I developed a 150 kloc image-processing system). That is when theory gets turned into practice.
Play "Air Traffic Controller" with added realism !!
I'll get me coat
Now that is cool
I'll doff my hat
(the Tilley today, the Barmah is too hot in this weather)
Later today, I will no doubt raise a glass as well
I taught programming
and have seen quite a few horrors produced by various "bedroom coders". They do get things to work, but the code is often not an "oil painting". There are certainly those who can teach themselves, but there are also those who benefit from learning a more disciplined approach.
The lack of discipline can be really astounding in some. There was one guy who insisted on submitting his code in C# rather than Java. I told him of course he could submit his assignment in C#, so long as he did not mind failing the course. He found this unreasonable. I suggested my attitude reflected that of a potential employer or customer, who more often than not has requirements on programming language, coding style, and comments. If you hand in your work in a different language, you would be in breach of contract, or get fired.
He still thought I was being very unreasonable.
Another story I like is the guy who handed in an iterative solution where the assignment explicitly stated: "implement a recursive method to compute ....." He argued this was more efficient, we said that was true, but that the assignment was to learn recursion. He said but my implementation is more efficient, we said that was true, but that the assignment was to learn recursion. He said but my implementation is more efficient, we said that was true, but that the assignment was to learn recursion. ..............................
This went on a while until we terminated this infinite loop (not by kill -9, but more humane methods)
This is not to say the tuition fees aren't outrageous. You can get a much more favourable deal in the Netherlands, and the university I work at (Groningen) is drawing more and more students from the UK. Our MSc courses are English language anyway, and our BSc courses are headed that way as well.
They would say that!
After hearing it, they just had an uncontrollable urge to ......
A lot of people seem to be jumping the gun a bit. Firefox have not said they are going to block Java, they are considering the balance of usability vs security (as they should). It could also be a shot across Oracle's bows to get them to fix the hole in Java. As far as I can see nothing has been decided yet.
Why not retrophrenology? You would just need a massively parallel hammer
In the words of Deep Thought
"The Milliard Gargantubrain, A mere abacus"
(RIP Douglas Adams)
Or just use Parson's steam turbine
The Babbage engine in turbo mode
The difference in hours is explainable for core-collapse supernovae, because light gets absorbed and re-emitted many times before reaching the surface, whereas neutrinos zip through the dense core matter. The currently observed effect would have resulted in a difference of about 4 years, even for a nearby SN in the Large Magellanic Cloud.
"Someone actually gave you a thumbs up for that?"
May have hit the wrong button ;-)
One problem is the 1987 supernova
The neutrino burst from this was detected 3 hours before the first light, which was expected because the neutrinos travel from the collapsing core to the outside much faster IN THAT MEDIUM than the light. This is because the photons are constantly absorbed and re-emitted by all the nuclei and electrons in the dense stellar matter, whereas the neutrinos move like ghosts through everything. If the current difference is correct, the neutrinos should have arrived 4 years before.
Now these neutrinos may behave differently than the ones observed in OPERA, but it shows not all neutrinos move faster than light.
The Schrödinger equation is a non-relativistic approximation; its relativistic counterpart is the Dirac equation. QM arose independently of relativity (see Planck), and it is still not possible to combine general relativity with QM.
I gather the result was six sigma
60ns measured, vs 10ns error. They also repeated the measurement 15,000 times. I agree there is room for systematic errors in the OPERA experiment, and we should simply wait for confirmation (or the reverse) from independent experiments.
It ain't over till the fat lady sings