2556 posts • joined 24 Apr 2007
I am somewhat puzzled how they predict this linear growth based, as it seems, on two data points: 2010 and 2011. OK, mathematically that defines a line, but won't markets become saturated after a while? Some error bars might not be amiss.
Many people do:
My cooking range and central heating are gas-fired. Replacing all the incandescent bulbs by fluorescent lights saved us a significant portion of our energy bill.
As I recall
a lawsuit was threatened by Xerox, and Apple hastily withdrew its case against Microsoft over "look and feel"
The case was settled, making it legal in retrospect, I guess.
seems to spring to mind unbidden
Does anyone see the similarity between the iOS components listed above and my ancient Tungsten T3 (grid of icons, lack of buttons, soft keyboard when needed)? Apple may have put them together in a better, sleeker way, but they did not invent that stuff. This fits in with earlier Apple history (Xerox's work on GUIs, anyone?). Again, Apple may have combined elements better than others before, and that is to be applauded, but they also often stood on the shoulders of giants.
so they say
Rewriting compilers from C++ to Basic and C#??
Is it just me, or does this remind others of Niklaus Wirth's (failed) attempt to write a Pascal compiler in FORTRAN?
Or are they just modifying the compiler to make interim listings available in C# and Visual Basic, showing the transformations performed by the code optimizer (in much the same way as the Cray FORTRAN and C compilers I used in the 90s)?
The claim that Steve Jobs may have had a good idea for a product the day before he died is still more believable than a similar claim about Steve Ballmer on any arbitrary day.
More seriously, though I am not an Apple user in any form, I do not find the former claim beyond the realm of the possible. Though the story does have a strong ring of spin to it, it may well be true, and it may be a bit cynical to reject it offhand (though cynics are right depressingly often). Trying to keep working may be a good way of keeping your thoughts off what must be scary to anyone. It can also take your mind off pain, or give you some positive feeling.
We are living in amazing times, aren't we? At some point you will hold the compute power and memory storage of a Cray Y-MP in your pocket.
At which point it becomes possible to run MS-Office 2010
Cheers to the lot of them
Glad to see the Dutch team on the podium (as me mum is Dutch), kudos to Tokai and Michigan too of course!
Balloon, buffoon, baboon, they all mix nicely to spell Ballmer, don't they?
Ballmer apparently needs to vent vast quantities of hot air from time to time, or else he might burst.
(Oh bleeding hell, now I cannot get that image out of my mind).
If he means what he says, Microsoft shareholders might be in for a rocky ride, if he stays at the helm.
They just look at the situation through rose-tinted rather than polarizing glasses
I'm on my way
"In the parallel universe where 1997 saw the end of Apple, I don't think anyone will be casting any blame for not sticking with it. Jobs pulled off a 1000:1 shot."
I think it was a million-to-one shot, and everybody knows you can always pull those off! Just ask Sergeant Colon
The real question is
Did he put the pizza or his brains in the blender?
My bet is on the latter.
"... whereas if you open a normal car door six inches you couldnt get a sheet of paper out , never mind a person!"
Certainly not certain American persons
But will it do time travel?
Or can it not hit 88 mph?
there was a BOFH episode
C. Omputer (you mean Chaz!)
and many others, all receiving payment as consultants
stop this sketch
it's getting silly
The Rt Hon. Major Smyth-DeVere-Charteris, in a sauce of green herbs
Wasn't he a pet halibut?
What will be the next names?
Some modest proposals:
Maybe I should not get into marketing
Exactly; the editors may expect a visit with a cattle prod.
or a modded security robot
Old men shouting at pigeons often make more sense
Millennium hand and shrimp, buggrit, buggrit. I told em, I told em, I told em, I told em. They'd only run out. Doorsteps!
Mine's the long trench coat.
And of course
WHY on earth would you bring a trailer full of cans of "Bud" to Australia when you can drink Aussie beer?
Mine's a James Boag's please.
And under the hood of the solar car
are hundreds of Duracell bunnies
Could be of use to me though
I write some heavily multi-threaded stuff at work, and could test these when working at home before running it on the 24-core Opteron beasts we have here. We should be getting a 48-core one shortly, but maybe I will delay the purchase until the 64-core versions with Bulldozer cores come out. For our research purposes, testing for scalability over many cores is more important than absolute speed. For production machines, that changes, of course.
So I guess nobody
wants to welcome our hyper-intelligent giant kraken overlords?
I'll have mine sliced, battered, and deep fried (and with garlic sauce)
A stabilizer requires extra weight, complexity and electricity, all of which come at a premium in these vehicles, I would guess. There may also be regulations against it.
Will you compare it to:
I have never tasted (nor do I intend to taste) this artery-clogging horror, but I would be interested in a fair comparison. All in the interest of science, of course.
Beer, because it is Friday afternoon.
Re: Shouldn't be any such thing as hate crimes
If people beat people up, that's bad: agreed, it's just the level of badness that is at stake.
In my book at least, it is worse if someone beats you up because he hates the group you belong to, rather than, e.g., what you said about him or his mother, or because you took a swing at him. There is even a sort of "bad sense" in mugging: people do it for profit. It is definitely bad, but not as bad as an unprovoked attack based on a person's religion, race, or sexual orientation.
let every male mourner embrace, and perhaps even kiss a male protester (I realize this requires setting aside your natural revulsion, but be strong!!).
For a nice touch add: "I forgive you."
I think this might freak these guys out, and they could hardly sue you for it; you are merely following the example set by Christ, showing love to thy neighbours.
"Now is the time to take advantage of the large webOS user base."
What? Both of them?
Sorry, I like WebOS, I even got the SDK out of curiosity as a long time user of Palm hardware (my Tungsten T3 was only retired last year). The joke is bittersweet in that a good idea seems to have been ruined by bad marketing. The HP statement is just surreal.
According to inflation theory
it was space that moved faster than the speed of light. This is perfectly allowable according to relativity. It is hard to get your mind around, I agree.
The 60ns difference equates to 18 m
I trust there has not been an 18m shift in position, as that would be of earthquake magnitude. The mean temperature of rock beneath the ground is really stable, simply due to its HUGE thermal capacity and high degree of thermal insulation. This is why (wine) cellars often have high thermal stability.
Previously, the peculiar, apparently superluminal motion of jets in quasars and active galaxies could be explained elegantly by invoking special relativity, by assuming the jet is travelling at near light speed almost directly towards us. Here general relativity may well be the explanation (but it may not be).
Someone may have copyrighted the phrase "Tick Tock"
Now astronomers have one more reason to detest astrology
only lawyers will benefit from this.
Keyboard, eat hot, er,.... tea!
He will be missed most by those near to him of course, but many others will miss his undoubted leadership. Though I am not an Apple user, many Apple designs set new standards, and gave others a point to aim at.
I will join you in raising a glass to his memory and achievements.
Just two questions:
Do these things come with kevlar reinforcements?
I can see why you might take offense
but I read the "as well" as meaning that it is OK to learn Java, and that "real programming" can also be done in Java.
On the topic of teaching programming: some years ago we decided to use C in the initial programming course (imperative programming). This allowed us to ditch the object-oriented overhead of Java, which was used before. We go on to teach them algorithms and data structures in C, complete with structs containing function pointers. The latter paves the way for understanding what objects actually do for you. They also learn to clean up any mess they leave behind. Object-oriented programming is then done in Java. Later courses include functional programming (I think in Haskell), and parallel programming and concurrency.
This is just the main programming track of the curriculum, other tracks focus on software engineering and architecture, database theory, operating systems, compilers, etc. Many of these skills might never be used in practice, but they do give better insights, and mean you have learned to learn difficult topics.
The best reason to learn a programming language is to learn new ways of thinking about problems. It does not help to learn 15 subtly different OO languages at university; it is important that you learn different programming approaches. Once you have learned a style well, it is trivial to learn another language in that style. I learned programming in Pascal, and the switch to C was quite trivial; I started OO programming in C++, and learning Java was quite simple.
Ruling makes sense
These are the kinds of rulings that let ordinary people believe there are benefits to being part of the EU.
I missed taking a towel to work on May 25
I missed saying "ARRH" on September 19
I think I will give October 14 a miss as well
Pirate, because, well AAARRRRHHH!
I thought they just made planets
but they might be branching out into whole galaxies.
re: dark matter
I just love it when people come by and say: "Look, scientists got it wrong in the past, so current science must also be wrong." This shows they do not quite get the scientific process.
"Getting things wrong" is part and parcel of science. All current theories do is model the world in such a way that a large set of observations are explained. Testing theories consists of making predictions, doing experiments and seeing where the theory gets it wrong. If it gets things wrong, we have to make a new theory which explains all the old observations AND the new. Alternatively, there may be a mistake in the new observations, and the theory survives to be tested again. Each new theory offers a better approximation of the behaviour of nature, which should be harder to prove wrong than the previous. However, even if it mimics the behaviour of nature exactly, we have no guarantee it is the real mechanism behind nature, it is just a perfectly good model.
It is natural for scientists to be cautious when a theory confirmed by thousands of experiments is contradicted by a single experiment. At the same time many physicists are unhappy with the notions of dark matter and dark energy, and are looking for alternatives. Where there is disagreement there is progress in science.
And there I was thinking
they might be suffering from dyslexia.
Or they were in such a hurry to make money they forgot the last letter of their nam.
Having said that, their behaviour causes dyspepsia.
"Intel hardware can't run the OpenGL test"
Why not? Is the OpenGL support incomplete, or does it lack required extensions?
If so, it is wholly unsuitable for me, alas.
Niccce fissshesssss, gollum!
I actually state that there are those who can teach themselves, but maybe I should have stressed that more. I studied astronomy, and am therefore largely self-taught when it comes to computer science. This is why I am still learning more about core computer science today.
When I take on PhD students for projects with a lot of coding, I always look for passion, for people who do extracurricular stuff (not just in terms of coding, mind you). One problem is that 3-5 years is way too short to learn to code REALLY well. You typically need ten years (as for any real skill). A uni can give you the theoretical foundation; you have to build a house on that through work experience (or a hobby).
A nice site on this topic is
We do teach computer science. There are IT courses taught by the Hanze University of Applied Sciences next door, but we, and all other traditional universities over here, still teach computer science. However, very few of our graduates ever become coders. They become researchers (in academia or industry; a lot of ours go to companies like Philips Medical Systems), or become software engineers and architects (OK, they are also involved in coding, but cost a lot more ;-) )
I actually state that there are those who can teach themselves, but maybe I should have stressed that point more. In fact I studied astronomy, so much of my computer science and coding skill is self-taught. This is why I am still learning new things about core computer science.
I also notice there are huge differences in IT degrees themselves. Some degrees are much more oriented to learning to use available tools and languages than to actual problem solving. We sometimes get students from these courses applying for our MSc course in computer science. These guys know way more than our students when it comes to the common tools out there, but they have huge problems with their problem-solving skills. The best acquire these skills; the rest flunk the course.