Intel wants to own the weather prediction business

The conventional wisdom - whatever that is worth - pegs Intel EVP Sean Maloney as the company's successor to CEO Paul Otellini. That's great news for technology hacks because, man, this Maloney guy is quite a bit more open about his personal life and feelings than Otellini. One need only look at this piece in the Times to see …

COMMENTS

This topic is closed for new posts.
  1. Herby

    Maybe, just maybe they will ACCURATELY predict the weather

    Or climate models, global warming, etc...

    Oh, Al Gore was wrong, whoops!!

  2. Mad Hacker
    Coat

    "I am not a finance guy," Maloney said.

    But he is, apparently, a comedian!

    Was he laughing when he said that, or was that the somber end to the interview?

  3. Pete "oranges" B.
    Coat

    New motto:

    Intel: Building Meteorology a Better Random Number Generator.

    (The slicker, seeing as there's a 39.999856738903 percent chance of rain here)

  4. Mike Horton

    A Little Silicon Valley History Might Help

    Ah, how soon the world starts rewriting history. The interface portrayed in Minority Report was based on the Starfire project, initiated by Bruce Tognazzini back in 1992 while he was working at Sun Microsystems ( http://www.asktog.com/starfire ). Interestingly enough, at a conference Bruce commented that they decided not to proceed with the interface because after a day's arm-waving for filming purposes, the actors complained that their arms and shoulders were almost dropping off!

  5. Stuart Van Onselen

    Chaos theory

    Add "meteorology" and "advanced mathematics" to the things he knows nothing about. Better weather prediction is not just about throwing computing power at the problem!

    To put it simply, chaos theory implies that you would need a near-perfect record of the current meteorological conditions to predict the weather accurately beyond a few days from now. Think "what is the temperature, wind speed, wind direction, humidity etc. for every 1 m^3 of atmosphere on the planet". And even that probably won't buy you predictions that hold up beyond two weeks.
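
    Back-of-the-envelope (a Python sketch with round-number assumptions of mine: Earth's surface area, a ~10 km weather-bearing atmosphere, five variables per cell, four bytes each), just to show how hopeless a 1 m^3 grid is:

    ```python
    # Rough storage cost of a 1 m^3 weather grid. All figures are
    # order-of-magnitude assumptions, not measured values.
    surface_area_m2 = 5.1e14   # Earth's surface, ~5.1e8 km^2
    troposphere_m = 1.0e4      # ~10 km of weather-bearing atmosphere
    cells = surface_area_m2 * troposphere_m

    variables_per_cell = 5     # temperature, pressure, humidity, 2 wind components
    bytes_per_value = 4        # single-precision float

    total_bytes = cells * variables_per_cell * bytes_per_value
    print(f"cells: {cells:.1e}")                    # ~5e18 grid cells
    print(f"storage: {total_bytes / 1e18:.0f} EB")  # ~100 exabytes, per timestep
    ```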

    Intel has spent the last decade or so trying to find problems for their solutions!

  6. Marvin the Martian
    Paris Hilton

    A country mile?

    Where did that non-standard unit slip in? Is it shortish like a furlong, or is it the "oh you go 5 minutes there to the intersection, then right for another" type of league upon league?

    PH because we have a right to know, or we may have a right to remain ignorant; we don't know and refuse to find out.

  7. Anonymous Coward
    Anonymous Coward

    Stuart Van Onselen

    What the man is talking about is using your own comp to RUN the prediction. Something that at the moment is only possible on a mainframe or some cluster job.

    He is not talking about extending the prediction period, but about asking your desktop, "will it rain tomorrow?" Actually a pointless exercise, as someone else will already have done it, a lot faster than you have.

    Weather prediction has certainly improved over the last 10 years or so; there was a time when the simple statement "the weather tomorrow will be the same as today" was more accurate than the Met Office.

    As for 2 weeks ... bah, I wouldn't trust a weather prediction more than 3 days into the future. Which is not at all the same as climate modelling. Not understanding the difference means you have absolutely no point to make, and should avoid the subject in future. I'm looking at you, Herby :)

  8. Ru
    Flame

    Software lags hardware...

    This is probably because software can easily be far more complex than hardware. Witness the number of operating system disasters vs the number of processor disasters.

    It's all very well saying 'ooh, throw more cores at it, throw more processors at it', but actually making use of all these fancy new hardware resources requires a non-trivial amount of programming talent and, more importantly, powerful compilers.

    Why do you suppose that Itanium, with its fancy EPIC architecture geared towards pushing more code through more hyperthreaded cores, hasn't taken off as Intel hoped? It certainly didn't help that they pushed the burden of making better use of the processor onto the compiler. And sufficiently capable compilers just aren't out there.

  9. Elmer Phud
    Pirate

    "Big screens"

    "You'll have these giant screens and be able to throw images around them"

    Weather report:

    Global warming increases rapidly due to the significant power requirement of 'giant screens'. Please remain indoors, you will be updated on your Intel BeFOS (big fuck-off screen) on the situation as it changes.

    caution: Intel BeFOS are not intended for use by people with any motor-neurone dysfunctional ailments or by small children

  10. Martin
    Coat

    Supercomputing?

    I work for a meteorology company just now and have previously worked for another, so I can assure you that we have no supercomputers (though we probably have more servers than the average service business). The last company I worked for had a lot of computing power at their main headquarters, but I'm buggered if I know what they used it for; it was probably there to make the place look like NASA mission control.

    Conventional climate prediction models are run on nothing fancier than a 4-8 CPU rackmount with about 4-8GB of RAM on your favourite Linux distro. There isn't much business here for Intel to "own", really, as far as conventional climate prediction goes. Perhaps this will change with the later release of finer-grained models which use more complex mathematics, but by the time it does we will probably have servers which are very fast indeed and very cheap.

    No need for the coat, the weather's fine.

  11. Anne van der Bom
    Paris Hilton

    Software lags hardware - 2

    It really stuns me that a man who is supposedly intelligent (otherwise he wouldn't be at the helm of one of the most successful companies in history) can state that 'software lags hardware'. I have never heard such utter nonsense.

    Software and hardware are two totally incomparable things. It is the same as saying that 'cars lag roads' or 'music lags digital audio'.

    Paris, cos even she's smart enough to understand that.

  12. Stuart Van Onselen

    Precision

    I read it as him wanting to increase the *accuracy* of the report, but I got side-tracked on the "length of accurate forecast" angle, so I buried my own point while missing his.

    But we can agree, I'm sure, that whether he was talking about better accuracy, or doing your own, personal prediction, he's still an idiot? :-)

  13. malcolmus_rex
    Boffin

    @ Martin

    So, Martin, you work for a meteorology company. I assume that means your company sells forecasts based on publicly available forecast data... where do you think that data comes from, if not from one of the National Weather Services with a great big supercomputer?

    Here are the supercomputers in meteorological organisations currently in the Top 500 list:

    67 ECMWF

    68 ECMWF

    72 Korea Meteorological Administration

    141 China Meteorological Administration

    197 Japan Meteorological Agency

    198 Japan Meteorological Agency

    219 NOAA/Geophysical Fluid Dynamics Laboratory (GFDL)

    232 Beijing Meteorological Bureau

    246 Fleet Numerical Meteorology and Oceanography Center

    410 Fleet Numerical Meteorology and Oceanography Center

    The chances are that your meteorology company gets its data from number 219, or from the UK Met Office (whose supercomputer has dropped off the Top 500 list until it gets a new one...).

    Climate models don't need much CPU or RAM. A weather forecasting model needs a lot of both. 8GB of RAM really doesn't come close...

  14. Tanuki
    Thumb Down

    23.921548790659632% chance of rain. Film at 11.

    So, they tell me there's a 50% chance of rain tomorrow. What good is that? Should I take half an umbrella?

  15. ImaGnuber
    Coat

    Moles

    Only some mole confined to a windowless cube farm would think that you need a supercomputer to predict the next couple of days of weather. Clue in to an old trick taught to me by an elderly member of the local indigenous population: "Want to know weather? Look at sky!"

    And for long term climate predictions - just stand some fat-assed ex-politician in front of a camera.

    Yes, yes. I'm gone. No, didn't need a coat today.

  16. brainwrong

    Chaos theory

    Edward Lorenz, the father of chaos theory, passed away just last month aged 90.

    http://en.wikipedia.org/wiki/Edward_Norton_Lorenz

    In the early 1960s he created a very simplified computer model of the atmosphere (I believe more specifically of convection currents). He noticed that the tiniest change in the initial conditions gave rise to outputs that initially looked the same but soon diverged. It is important to note that the system was completely deterministic; there was no (pseudo-)random element. This was discovered by chance: Lorenz had a printout of one simulation run, with the initial conditions, and wished to re-run the simulation. The starting variables had been rounded to (I think) 4 decimal places when they were printed.

    Lorenz later simplified his model to the minimum possible of three variables, which gives rise to the Lorenz attractor. He also coined the term "butterfly effect".

    You can't measure the current state of the atmosphere (and oceans (and land!)) to any real precision. And your model probably isn't too good anyway. That's why you can't predict the weather.
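
    For the curious, here's a minimal sketch of the effect he found (plain Python/NumPy, with crude Euler stepping and toy numbers of my own choosing, nothing like Lorenz's original code): two runs that differ by one part in a million part ways within a few dozen simulated time units.

    ```python
    import numpy as np

    def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        # One crude forward-Euler step of the three-variable Lorenz system.
        x, y, z = state
        return np.array([
            x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z),
        ])

    a = np.array([1.0, 1.0, 1.0])
    b = a + np.array([1e-6, 0.0, 0.0])   # differ by one part in a million

    for step in range(1, 3001):
        a, b = lorenz_step(a), lorenz_step(b)
        if step % 1000 == 0:
            print(f"t = {step * 0.01:4.0f}  separation = {np.linalg.norm(a - b):.3e}")
    ```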

  17. Anonymous Coward
    Anonymous Coward

    forecasting on supercomputers vs. desktops

    Supercomputers are required (and used, as you can see from the list above) to run _global_ weather models. Some smaller weather companies run so-called mesoscale models (which do not require supercomputers) that are driven by data from the global models. Those mesoscale models are run to get better regional forecasts.

    With climate models it is a bit different. There are global models that you can actually run on a desktop, but they have only a very coarse resolution and a comparatively "simple" formulation of the climate system. To run state-of-the-art models you will definitely need a supercomputer.

  18. Anonymous Coward
    Boffin

    Getting to the core of the issue

    Three things:

    1) The HPC crowd, especially those involved in anything fluid-dynamical, need all the cores they can get. Our software is already MPI-ed up to the eyeballs; just give us the cores and memory, and we will use them. (A toy sketch of the sort of thing I mean follows this list.)

    2) Simplifying parallel programming. MIMD programming models are fearsome beasts, and horrendous to debug: you can spend weeks rather than days getting the buggering programs to work. But does anyone remember the fit of SIMD parallelism of the early '90s? The Connection Machine was actually /easier/ to program than the serial equivalent. (Nvidia's CUDA provides an interesting avenue worth exploring.)

    3) To Martin: your models couldn't have been doing anything clever. Running state-of-the-art CFD models simulating anything larger than 10^6 m^3 requires a serious amount of horsepower. That's why they practically all run off the back of MPI/OpenMP.
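
    As promised above, the toy version of what our codes spend their lives doing (Python with mpi4py assumed installed; real codes do this in Fortran or C, and the 1-D field and diffusion stencil here are made up purely for illustration):

    ```python
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Each rank owns one strip of a 1-D field, plus a ghost cell at each end.
    n_local = 1000
    u = np.full(n_local + 2, float(rank))

    left = rank - 1 if rank > 0 else MPI.PROC_NULL
    right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

    for _ in range(100):
        # Swap edge cells with neighbours so the stencil can see them.
        comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
        comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
        # A simple diffusion stencil on the interior cells.
        u[1:-1] = 0.5 * u[1:-1] + 0.25 * (u[:-2] + u[2:])

    print(f"rank {rank}: interior mean = {u[1:-1].mean():.4f}")
    # Run with e.g.: mpiexec -n 4 python halo_demo.py (hypothetical filename)
    ```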

  19. Anonymous Coward
    Anonymous Coward

    Itaniwhat?

    "How's that grizzled veteran of high-end computing Itanium panning out?"

    Ask SGI and NASA, who just signed a deal for a nice little Linux-based supercomputer, as reported right here on El Reg:

    http://www.theregister.co.uk/2008/05/06/sgi_moon_nasa/

    In contrast with some of the earlier IA64 supercomputers, which were based on free Itanium funded by Intel, this deal is Itanium-free.

    Q: "Is Itanium in the black?"

    A: "... increased profitability ..."

    You don't have to be a financial genius, or an Intel shareholder, to work out that Intel money spent on Itanium would have been, and would still be, more profitably spent on Xeon. Unless of course HP (as noted in the article, the only surviving Itanium customer of note) are paying a fortune for access to Itanium, and passing that ransom cost straight on to their captive HP-UX and VMS customers, thus supporting Itanium's "increased profitability", at least while it lasts.

  20. malcolmus_rex
    Boffin

    RE: forecasting on supercomputers vs. desktops

    A local area model running on a desktop is also not doing anything clever - a local area model that's of any use has a resolution so much higher than a global model's that it too needs a supercomputer. In fact it generally takes longer to run a good mesoscale model than it does to run a global one.

    Then there's the computer resources needed to process GB upon GB of observation data from all the different satellites, radar, weather stations and balloons etc. to produce a good starting point for the model. That should be done by a good mesoscale model too and is computationally expensive (and for a global model it's very very expensive indeed).

    Then, to give a much better forecast and to get around the butterfly effect, many different forecasts can be run, each with very slightly different starting conditions (within the errors of the observations). This ensemble of model forecasts can be used to quantify the probability of events occurring... It's particularly useful for severe weather events. Of course it also multiplies the computing cost by however many model runs you do, but the civil contingency people and utilities companies will pay good money for this. Sadly this sort of stuff never makes it to the BBC weather forecasts or the weather websites. There's a perception that the public are just too dumb to understand it (and they may be right: http://www.theregister.co.uk/2007/11/08/scratchcard_anarchy/ ).
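
    To make the ensemble idea concrete, a toy sketch (plain Python/NumPy; the one-line chaotic "model", the 50 members, the observation error and the event threshold are all stand-ins of mine, not a real forecast system):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def toy_model(x, steps=200):
        # Stand-in "forecast model": the fully chaotic logistic map.
        for _ in range(steps):
            x = 4.0 * x * (1.0 - x)
        return x

    analysis = 0.3        # best estimate of the current state
    obs_error = 1e-3      # assumed observation uncertainty

    # Perturb the starting state within observation error; run each member.
    members = analysis + obs_error * rng.standard_normal(50)
    outcomes = np.array([toy_model(m) for m in members])

    # The fraction of members exceeding the threshold is the forecast
    # probability of the (made-up) "severe" event.
    print("P(event) =", np.mean(outcomes > 0.5))
    ```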

    Of course those smaller companies you mention might not actually have a better forecast but I'd bet good money that they're able to disseminate the information they get from their forecasts in a more useful (and profitable) way than many of the bigger organisations.

  21. Anonymous Coward
    Alert

    RE: Itaniwhat?

    Who cares for Itanium, anyway?

    What size is the Itanium market in comparison to the PA-RISC market in 2000?

    The most interesting piece of information is missing anyway.

    The development agreement between Intel and HP.

    From the fact that both companies are still throwing insane amounts of money into that black hole, it must be obvious that the liability provisions in that contract have to be nasty on a yet-to-be-invented scale.

  22. Steve Mann

    Haaaaang on.....

    I think people posting here are confusing "modeling the weather" with "forecasting the weather".

    While the point of the former is to do a better job of the latter, you can forecast the weather for most purposes, to an acceptable level of error, with the naked eye, a couple of instruments that have been around in essentially the same form for several hundred years, and a bit of knowledge.

    The addition of comm links to other people doing the same stuff elsewhere makes life even easier, of course, as does a radar setup of the right kind in the right place. Satellite pix are the ultimate bee's knees.

    Not perfect by any means, but good enough for most purposes.

    It is my understanding that modeling the weather requires so much digital oomph that the further out you try and predict it the more likely it is that it is already happening by the time you get the answer.

    My next door neighbour uses a rock hanging from a string outside his kitchen window to forecast the weather. If he looks out and the rock is wet, it's raining. If it's not wet, it's not raining. If his window is broken, it's windy.

  23. Anonymous Coward
    Anonymous Coward

    USA NWS on Top 500

    FYI: National Weather Service operational weather forecasting systems are 74 and 75 on the Nov 2007 Top 500 list. System reliability (in excess of 99%) is a top consideration in a 7X24X365 operational environment.

  24. Anonymous Coward
    Anonymous Coward

    "development agreement between Intel and HP"

    There's that, and there's also the ten-year agreement between Palmer's DIGITAL (aka DEC) and Intel, which was part of a huge patent-suit settlement between Intel and DEC (a settlement which, according to some, allegedly led to the premature termination of Alpha development, though bits of it live on elsewhere). The details of that ten-year agreement were never disclosed, but its lifetime expires around now.

    You don't have to be a conspiracy theorist to wonder if Intel effectively wrote: "you commit to killing your 64-bit stuff (Alpha) ASAP, and we'll commit to providing you the world's finest 64-bit stuff (Itanium, obviously) on 'most valued customer' terms for the next ten years (by which time Intel's 64-bit will be the Industry Standard, obviously)".

    http://www.news.com/2100-1023-204668.html

  25. Solomon Grundy

    Multi Thread Software

    If you read carefully, it seems that Intel is a bit concerned that developers aren't able to keep up with multi-threaded development. Duh! 95% of them can't keep up with single-core development - is Intel really surprised?
