2511 posts • joined 24 Apr 2007
Niccce fissshesssss, gollum!
I actually state that there are those who can teach themselves, but maybe I should have stressed that more. I studied astronomy, and am therefore largely self-taught when it comes to computer science. This is why I am still learning more about core computer science today.
When I take on PhD students for projects with a lot of coding, I always look for passion, for people who do extracurricular stuff (not just in terms of coding, mind you). One problem is that 3-5 years is way too short to learn to code REALLY well. You typically need ten years (as for any real skill). A uni can give you the theoretical foundation; you have to build a house on that through work experience (or a hobby).
A nice site on this topic is
We do teach computer science. There are IT courses taught by the Hanze University of Applied Sciences next door, but we, and all other traditional universities over here, still teach computer science. However, very few of our graduates ever become coders. They become researchers (in academia or industry; a lot of ours go to companies like Philips Medical Systems), or become software engineers and architects (OK, they are also involved in coding, but cost a lot more ;-) )
I actually state that there are those who can teach themselves, maybe I should have stressed that point more. In fact I studied astronomy, so much of my computer science and coding skills are self-taught. This is why I am still learning new things about core computer science.
I also notice there are huge differences between IT degrees themselves. Some degrees are much more oriented towards learning to use available tools and languages than towards actual problem solving. We sometimes get students from these courses applying for our MSc course in computer science. These guys know way more than our students when it comes to the common tools out there; where they have huge problems is in their problem-solving skills. The best acquire these skills, the rest flunk the course.
Good point. We TRY to teach them, but then some just replicate instructions until they pass the exam, and then forget all about it. Some people are, as we say, "resistant to education." At the same time there are many self-taught people who are excellent. Besides, there are many practical skills we do not teach (especially given the wealth of different tools out there).
The most important thing you can learn in education is learning itself. I have had no formal training in lattice theory, but I have acquired the skill set to learn it, and now contribute to the field. Many of my coding and debugging skills I learned in practice, in my first job as a scientific programmer (where I developed a 150 kloc image processing system). That is when theory gets turned into practice.
Re: Rather ironic
I stand corrected.
It is a good thing I don't teach language ;-)
I taught programming
and have seen quite some horrors produced by various "bedroom coders". They do get things to work, but the code is often not an "oil painting". There are certainly those who can teach themselves, but there are also those who benefit from learning a more disciplined approach.
The lack of discipline can be really astounding in some. There was one guy who insisted he wanted to hand his code in in C# rather than Java. I told him of course he could hand his assignment in in C#, so long as he did not mind failing the course. He found this unreasonable. I suggested my attitude reflected that of a potential employer or customer, who more often than not have requirements on programming language, coding style, and comments. If you handed in your work in a different language, you would be in breach of contract, or get fired.
He still thought I was being very unreasonable.
Another story I like is the guy who handed in an iterative solution where the assignment explicitly stated: "implement a recursive method to compute ....." He argued this was more efficient, we said that was true, but that the assignment was to learn recursion. He said but my implementation is more efficient, we said that was true, but that the assignment was to learn recursion. He said but my implementation is more efficient, we said that was true, but that the assignment was to learn recursion. ..............................
This went on for a while until we terminated this infinite loop (not by kill -9, but by more humane methods).
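For anyone who has not met the distinction: a minimal sketch in Java (factorial is just a stand-in here, not the actual assignment):

    public class Factorial {
        // what the assignment asked for: a recursive method
        static long factorialRecursive(int n) {
            return (n <= 1) ? 1 : n * factorialRecursive(n - 1);
        }

        // what the student handed in: the (indeed more efficient) loop
        static long factorialIterative(int n) {
            long result = 1;
            for (int i = 2; i <= n; i++) result *= i;
            return result;
        }

        public static void main(String[] args) {
            System.out.println(factorialRecursive(10)); // 3628800
            System.out.println(factorialIterative(10)); // 3628800
        }
    }

Both produce the same answer; the point of the exercise was the recursive one.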
This is not to say the tuition fees aren't outrageous. You can get a much more favourable deal in the Netherlands, and the university I work at (Groningen) is drawing more and more students from the UK. Our MSc courses are English language anyway, and our BSc courses are headed that way as well.
Play "Air Traffic Controller" with added realism !!
I'll get me coat
Now that is cool
I'll doff my hat
(the Tilley today, the Barmah is too hot in this weather)
Later today, I will no doubt raise a glass as well
WON'T SOMEONE THINK OF THE CHILDREN!!!!!
Surprised that did not pop up in this context.
They would say that!
After hearing it, they just had an uncontrollable urge to ......
A lot of people seem to be jumping the gun a bit. Firefox have not said they are going to block Java, they are considering the balance of usability vs security (as they should). It could also be a shot across Oracle's bows to get them to fix the hole in Java. As far as I can see nothing has been decided yet.
We have some workloads coming up which require SERIOUS i/o bandwidth. The TMS cards look interesting indeed.
That is often true
The USPTO has a mixed history in this respect, to put it diplomatically.
Not a replacement for my ageing SZ
until AMD/Radeon gets its Linux drivers sorted. Pity really
But what about the MOT
So it has got the extra leg
When will it sprout the extra head, and develop into a new species which has to be named
Why not retrophrenology? You would just need a massively parallel hammer
poolean overflow error, core dumped
'T is a death trap!
for auditors, certainly
In the words of Deep Thought
"The Milliard Gargantubrain, A mere abacus"
(RIP Douglas Adams)
Or just use Parson's steam turbine
The Babbage engine in turbo mode
The difference of hours is explainable for core-collapse supernovae, because light gets absorbed and re-emitted many times before reaching the surface, whereas neutrinos zip through the dense core matter. The currently observed effect would result in a 4-year difference even for a nearby SN in the Large Magellanic Cloud.
"Someone actually gave you a thumbs up for that?"
May have hit the wrong button ;-)
One problem is the 1987 supernova
The neutrino burst from this was detected 3 hours before the first light, which was expected because the neutrinos travel from the collapsing core to the outside much faster IN THAT MEDIUM than the light. This is because the photons are constantly absorbed and re-emitted by all the nuclei and electrons in the dense stellar matter, whereas the neutrinos move like ghosts through everything. If the current difference is correct, the neutrinos should have arrived 4 years before.
Now these neutrinos may behave differently than the ones observed in OPERA, but it shows not all neutrinos move faster than light.
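For those who want to check the 4-year figure, a back-of-envelope sketch in Java, using the published rough numbers (730 km CERN-Gran Sasso baseline, 60 ns early arrival, roughly 168,000 light years to the LMC):

    public class NeutrinoTiming {
        public static void main(String[] args) {
            double c = 3.0e8;            // speed of light, m/s
            double baseline = 730e3;     // CERN-Gran Sasso baseline, m
            double early = 60e-9;        // OPERA early arrival, s
            // fractional speed excess (v - c) / c
            double excess = c * early / baseline;      // ~2.5e-5
            double distanceLy = 168_000;               // SN 1987A, light years
            double earlyYears = excess * distanceLy;   // ~4 years
            System.out.printf("excess = %.2e, early arrival = %.1f years%n",
                              excess, earlyYears);
        }
    }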
The Schrödinger equation is a Taylor approximation of the relativistic Dirac equation; the former is a non-relativistic equation. QM arose independently of Relativity (see Planck), and it is still not possible to combine general relativity with QM.
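The Taylor approximation bit, spelled out (this is the standard textbook expansion, nothing specific to this thread): expand the relativistic energy in powers of p/(mc),

    E = \sqrt{m^2 c^4 + p^2 c^2} = m c^2 \sqrt{1 + \frac{p^2}{m^2 c^2}}
      \approx m c^2 + \frac{p^2}{2m} - \frac{p^4}{8 m^3 c^2} + \dots

Drop the constant rest-energy term mc^2 and the higher-order corrections, and you are left with the familiar p^2/2m kinetic term of the non-relativistic Schrödinger equation.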
I gather the result was six sigma
60ns measured, vs 10ns error. They also repeated the measurement 15,000 times. I agree there is room for systematic errors in the OPERA experiment, and we should simply wait for confirmation (or the reverse) from independent experiments.
It ain't over till the fat lady sings
It would seem fitting that a game of ping-pong heralds a new type of computer.
Sounds interesting. I might give this one a more detailed look.
My thoughts exactly
Does require an even bigger craft and thus an even bigger rocket. It does seem to be a simple problem compared to shielding against cosmic rays and the like.
I always expect the world to be abnormal
mine is the one with the nonparametric statistics book in the pocket
What are the odds of politicians making sense of pure maths?
Apparently vanishingly small. If you look at many of the current innovative technologies, so many derive from fields of mathematics other than statistics and applied probability that the mind boggles at the sheer stupidity of this policy. We are currently working (amongst other things) on diffusion tensor imaging. Tensor maths is quite hard to get your mind around, but of outstanding use in e.g. finding out how the brain is wired (cheap shot: politicians' brains contain serious wiring defects), apart from being of use in general relativity (hands up who does not use GPS).
Shortsighted policy? Willfully blind more likely.
There is tangible proof of a massive impact
exactly on the K-T boundary. Paleontology that, not climate science. You are right in saying that the exact mechanisms of the extinction and the climate change as a result of the asteroid impact are speculative, but there is proof of massive forest fires in the fossil record. The estimate of the asteroid size (approx 5-10km, estimated from the impact crater remains) combined with the fact that the iridium-rich layer at the K-T boundary is found all over the world is powerful evidence that something affected the whole world just at the point dinosaurs went extinct. Bit of a coincidence that.
So the result from the break up could still be out there.....
RUN FOR THE HILLS!!
GET BRUCE WILLIS INTO A SHUTTLE,
Darn, that won't work
WE'RE DOOMED, DOOMED
always a trusty fall-back
Explosion, because well ....
That is fairly automatic these days
The Earth's rotation is the big one, but even my little equatorial mount can correct quite nicely for that. I think the 2.2 m is also on an equatorial mount. Nowadays, autotrackers are available for amateur scopes, so a big scope like this should be fine. The Earth's movement around the sun has very little effect; the parallax created in one night or a few days is smaller than the point spread function of the optics.
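To put rough numbers on that (the 10 parsec distance below is just an illustrative worst case, far closer than anything in a deep-sky shot):

    public class TrackingNumbers {
        public static void main(String[] args) {
            double siderealDay = 86164.1;                   // seconds
            double rate = 360.0 * 3600.0 / siderealDay;     // ~15.04 arcsec/s
            // Earth covers ~2*pi AU per year; half a day of baseline:
            double baselineAU = 2 * Math.PI / 365.25 * 0.5; // ~0.0086 AU
            // parallax (arcsec) = baseline (AU) / distance (parsec)
            double parallax = baselineAU / 10.0;            // ~0.0009 arcsec
            System.out.printf("tracking rate: %.2f arcsec/s%n", rate);
            System.out.printf("one-night parallax at 10 pc: %.4f arcsec%n",
                              parallax);
        }
    }

Even at 10 pc, the one-night parallax is under a thousandth of an arcsecond, way below a ~1 arcsec seeing-limited PSF.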
Actually, the hard thing is to get a wide field
No zooming required with a telescope, but a big (268 megapixel or so) camera to get this wide (OK, maybe it was not the 268 megapixel OmegaCAM, but still a biggie). The other hard part is collecting enough data, as these objects are faint. Hours of exposure are needed, and postprocessing to remove artifacts introduced by imperfections in the camera (dark current, hot pixels, slight differences in pixel sensitivity, and shading by the optics), and finally dynamic-range compression to show both the brighter and fainter parts in one shot.
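The calibration step itself boils down to something like this toy Java sketch (real pipelines such as OmegaCAM's also deal with cosmic rays, stacking, astrometry and much more):

    public class Calibrate {
        // raw, dark and flat are same-sized exposures; flat normalised to 1
        static double[][] calibrate(double[][] raw, double[][] dark,
                                    double[][] flat) {
            int h = raw.length, w = raw[0].length;
            double[][] out = new double[h][w];
            for (int y = 0; y < h; y++)
                for (int x = 0; x < w; x++)
                    // subtract dark current, divide out sensitivity/shading
                    out[y][x] = (raw[y][x] - dark[y][x]) / flat[y][x];
            return out;
        }

        // crude dynamic-range compression: log scaling shows faint and bright
        static double compress(double pixel) {
            return Math.log1p(Math.max(pixel, 0.0));
        }
    }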
unless you had fun smoking whatever it was.
Ye cannae change the laws of physics capt'n!
The smaller the camera, the smaller its aperture, the smaller the number of photons reaching the sensor. This causes an increase in photon noise. Though of course noise reduction techniques have improved, there is no substitute for more photons, and therefore a cleaner input image.
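In numbers (illustrative photon counts, not measurements from any particular camera):

    public class ShotNoise {
        public static void main(String[] args) {
            // photon count scales with aperture area (diameter squared)
            double photonsBig = 40_000;
            double photonsSmall = photonsBig / 4;   // half the diameter
            // shot-noise-limited SNR = N / sqrt(N) = sqrt(N)
            System.out.printf("SNR big:   %.0f%n", Math.sqrt(photonsBig));   // 200
            System.out.printf("SNR small: %.0f%n", Math.sqrt(photonsSmall)); // 100
        }
    }

Halving the aperture diameter quarters the photon count and halves the SNR.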
Still a cool looking little device. Could even be used by a Playmobil puppet. Could this be useful for LOHAN? ;)
Comparatively measured statement
by Stallman's standards
Unless like me
you have put in a BIOS password, and set the boot sequence to HDD only. It will then not boot from CD, USB, or (heavens forbid) floppy (remember them?). You can still open up the PC, zero the CMOS RAM by shorting it, and try again, but that is not trivial to do without drawing some attention.
I still prefer my Linux, but the missus and the kids want their Windows.
Just drop a book on the keyboard. New password: owrfqjerlkcsm, which will be displayed as *************
Maybe the BOFH will now allow shiny NEW macbooks in the workplace. The possibilities, the possibilities
that is just excellent
and not only just
Maybe I should make this mandatory reading for the new business computing track in our curriculum
the hills will be vapourized as well
Imagine if they had landed
"Hello Houston, this is Tranquility Base. Snoopy has landed"
Doesn't have the same ring to it, does it?
I tried running a nicely parallel shared-memory workload (75% efficiency on 24 cores in a 4-socket Opteron box) on a 64-core ScaleMP box with 8 2-socket boards linked by InfiniBand. Result: horrible. It might look like shared memory, but access to off-board bits has huge latency.
BTW, I once got 110% efficiency on two cores, because the problem did not fit into the memory bandwidth of one socket, so spreading the workload over 2 or more cores reduced the latency nicely. Weird but true.
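For anyone wondering how you get past 100%: parallel efficiency is T1 / (p * Tp), and nothing stops Tp from beating T1/p when the extra sockets bring extra cache and memory bandwidth. A sketch with made-up timings:

    public class Efficiency {
        // classic definition: serial time over (cores times parallel time)
        static double efficiency(double t1, int p, double tp) {
            return t1 / (p * tp);
        }

        public static void main(String[] args) {
            // 24 cores at 75% efficiency: Tp = 100 / (24 * 0.75)
            System.out.println(efficiency(100.0, 24, 100.0 / (24 * 0.75))); // 0.75
            // the superlinear case: two cores finishing in 45.5 s
            System.out.println(efficiency(100.0, 2, 45.5)); // ~1.10
        }
    }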
Cheers to them
Now there is a game worth playing! Excellent manners to give the gamers full credit.
We used a metal thermos flask from a department store to transport liquid nitrogen quite safely (when we were running a test for an infrared spectrograph). Never encountered any problems.
My Windows 7 install has not shown BSODs yet, neither has my XP install on my decidedly aged laptop done so for years. Having said that, a smiley like that would not go far to improve my temper once it did happen. Seeing one in a demo of a product is perhaps not too odd, as it is probably still in beta. I will give MS the benefit of the doubt here. If the BSOD is not likely to pop up at all in the final product, I don't much care what it looks like.