Intel and Microsoft dump $20m on researchers to avert software crisis

Microsoft and Intel have put their money where their fear is. The two companies have shelled out $10m each to the University of California, Berkeley and the University of Illinois to fund research around advanced software programming techniques for PCs and mobile devices. The grants mark a significant effort on the part of …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward

    I hope their brief...

    ...extends beyond dogs with wagging tails and paper-clips that tap on your monitor when you do something wrong.

  2. Joe

    Too easy

    FPGAs should be programmed to run the .NET runtime, or the entire OS. The time to upload a program to an FPGA is relatively short and well worth the wait for the improved performance.

  3. Kurt Guntheroth

    hardware crisis

    There is a hardware crisis, not a software one. The *hardware* makers can't make a core go faster.

    I won't be simulating weather or challenging chess grandmasters on my PC any time soon. But I will keep typing characters into my word processor, linearly, one character after the other. Most computer programs are sequential, because most of them implement processes that happen over time. Duh. I can use a little bit of concurrency if I want to type and compile at the same time, but oops! my three-year-old 2.5GHz P4 is way fast enough, even time slicing.

    The sad fact is, if we can't find a way to sop up all those cores, then consumers will demand that prices fall as parts get smaller and cheaper to make. And we can't have that, oh no! Moore's Law was a free lunch, not just for lazy coders, but for lazy chip makers too.

    And it's over. Even though you can still put more aggregate power in a chip each year, you can't put more sequential execution speed in anymore. Time to look for a new industry to invest in, or else time to get way smarter about using the resources you have to do more (sequential) work.

    Maybe there's a voice recognition or virtual reality app out there somewhere, that everyone will want, that will sop up 128 cores and save Intel. Maybe it's AI. But it won't come from parallelizing inherently sequential applications.

    I hear biotechnology is the hot new investment area. Maybe they can use 128 cores for something. Something specialized that can be written by experts.

  4. Louis Savain

    Nightmare on Core Street

    One day soon, the computer industry will realize that, 150 years after Charles Babbage came up with his idea of a general purpose sequential computer, it is time to move on and change to a new computing model. The entire industry will be dragged kicking and screaming into the 21st century. Intel is already committed to the multithreaded approach to parallel computing. They've been having a hard time trying to make it easy for programmers to use threads to code parallel applications. They think they can solve their problem by throwing more money at it. Little do they realize that the problem is not with parallel programming but with the thread-based approach. Berkeley and other parallel programming research centers are being handed an impossible mission. See the link below to find out why threads are not part of the future of parallel programming.

    Nightmare on Core Street:

    http://rebelscience.blogspot.com/2008/03/nightmare-on-core-street.html

  5. Anonymous Coward

    What about F#?

    It seems to me that Microsoft are going this way already with the 'productization' of F#. The key advantage of functional programming is the potential to implicitly parallelize code with no explicit action needed by the programmer.
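
    As a rough sketch of the idea (in Haskell rather than F#, since the point is the same in any functional language, and assuming the standard parallel package's Control.Parallel.Strategies): because a pure function has no side effects, a library can farm its calls out across cores without the programmer touching a single thread.

        import Control.Parallel.Strategies (parMap, rdeepseq)

        -- A pure function: no side effects, so independent calls can
        -- safely be evaluated on different cores, in any order.
        expensive :: Int -> Int
        expensive n = sum [1 .. n * 100000]

        main :: IO ()
        main = print (sum (parMap rdeepseq expensive [1 .. 64]))

    Built with ghc -threaded and run with +RTS -N, the same source uses however many cores are available; swap parMap back to plain map and it is the sequential program again.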

  6. Joe Cooper

    The future!

    Well, as far as single-threaded, linear things like typing one character after another go, even that 2.5GHz P4 is overkill. I'm running Ubuntu 7.10 here on a PENTIUM III 700MHz. It runs fast, it boots almost as fast as my Core Duo Mac, and does everything I want it to.

    But when we get into things like video processing and multimedia and gaming that a LOT of people do with their PCs, not only is every bit of CPU power useful, but so is every core.

    Any bulk data processing from video to gaming to compiling can and often does take advantage of multiple cores. There isn't that much difference in terms of computational needs between bulk video processing and anything you might be doing with Blue Gene except for the sheer scale, so why not use multiple cores? SMP is GREAT for the sorts of bulk data processing that people ~actually do~ with their computers.

    It's not a fairy tale.

    Of course, if you're not doing that, there's no reason not to be happy with a Pentium III, and nobody's stopping you from using it. That's what I'm doing right here. 20-watt processor for the win.

  7. John

    I can't believe Micros~1 gave Berkeley $$$

    What the hell are Micros~1 thinking? Throwing money at these commies who make BSD. God damnit, BSD release all of their source code and let any goon modify it and redistribute it!! Bloody FOSS people and their webservers which have uptimes measured in years!!

    BSD are going to use this money to make some sort of usable multi-core framework and then release it under their FreeBSD license. This means that anybody who uses it has to provide the source to their program. Can't make proprietary code using this.

    What the hell are Micros~1 thinking????? Has their cheese slipped off their cracker??? Has Linux-friendly Intel pulled a fast one???? Does Micros~1 actually want BSD to work even better, so when they finally buy Yahoo! (almost exclusively BSD) they can have a webserver OS that outperforms anything they can come up with???

  8. Henry Cobb

    The software is the problem

    If you're going to blame anybody for the speed limit, blame Einstein for imposing a speed of light limit on moving information around. The future lies with massive numbers of computing elements.

    Step one for M$ is to admit their miserable failure at kernel design and follow Apple to the BSD kernel. (Hopefully one of the micro-kernels.)

    Then they can start work on totally asynchronous GUI design where the big event loop is replaced with a bunch of tiny reflexes that lead to emergent behavior.

    -HJC

  9. Andraž Levstik

    @John

    Get your facts straight...

    a) Berkeley doesn't produce any of the BSD OSes, last I checked...

    b) The FreeBSD license is a highly permissive license.

    Guess what: Micros~1 at one point took the BSD networking stack and used it in NT systems.

    c) The BSD license doesn't require you to publish modified source code.

    Thank you for your time...

  10. Steve

    @Kurt

    Maybe you will be playing a game that is simulating the behaviour of multiple bad guys, running a physics engine and rendering the results out to screen. It's quite easy to see how something like that could make use of many cores.

    Or maybe you'll just be editing your holiday footage, recoding it for BR, adding a soundtrack and doing the odd fade here and there. Again, plenty of scope for multi core work there.

    Or hell, maybe it's just the 40 different processes that Windows is running in the background whilst you slowly type into your word processor; maybe it's your firewall, your virus scanner, your Yahoo widgets and your webserver all busy doing stuff whilst you type... plenty of things for several cores to be busy doing there as well.

  11. Ken Hagan

    Re: What about F#?

    Whilst not disagreeing about the advantages of a functional language for exposing parallelism in calculations, my understanding is that all functional languages have to introduce non-functional features if they are to exhibit side-effects, at which point the parallelism all goes away. They aren't a silver bullet.
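
    A minimal Haskell sketch of that point, assuming the parallel package's Control.Parallel.Strategies: the pure computation below imposes no evaluation order, so it can be spread across cores, while the side-effecting version pins an order and the implicit parallelism is gone.

        import Control.Parallel.Strategies (parMap, rdeepseq)

        -- Pure: no observable order, so the runtime is free to
        -- evaluate the elements on different cores.
        squares :: [Int]
        squares = parMap rdeepseq (\n -> n * n) [1 .. 8]

        -- Side-effecting: each print must follow the previous one,
        -- so the actions are inherently sequential.
        printSquares :: IO ()
        printSquares = mapM_ (print . (\n -> n * n)) [1 .. 8]

        main :: IO ()
        main = print squares >> printSquares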

    My concern would be that MS released an implementation of F# that ran like treacle on a single processor, became vaguely acceptable with 16 cores and really only became comparable to existing technology once you had 64 cores or more. MS would see that as "look how well this scales".

  12. amanfromMars

    If you use urImagination Everything is Real to You .......

    ...and Sharing IT makes IT Real to Everyone Else too.

    The Finance System Meltdown .... Central Processor Unit Failure ... or a Software Bug, Requiring a Simple ReWrite for a SurReal ReBoot.

    "When told by Intel of the shift toward multi-core chips, Bill Gates remarked, "We can't write software to keep up with that." Gates then urged Intel to continue producing faster processors as it had always done. "No, Bill, it's not going to work that way," Intel vice president Pat Gelsinger responded." ..... the Chips are SMARTer and more Powerful than Present Software Controls.

    Would that be a fair and Acurate Assessment of the Virtual Reality, Pat? If so, that would be as AIMicrosoft AdultERated PlayGround/CyberIntelAIgently Designed DreamScape for VistaXXXXE ...XXXXEsoteric Version.

    XXXX0Sets to Stun Shock and Awe.

    ""I was kind of staggered by that comment - that one of the leaders of computing sees the future in linear time computing," said Kurt Keutzer, a professor at Berkeley."

    He probably doesn't, professor, when it is exponential.

    "Maybe there's a voice recognition or virtual reality app out there somewhere, that everyone will want, that will sop up 128 cores and save Intel. Maybe it's AI." .... By Kurt Guntheroth

    Posted Tuesday 18th March 2008 20:21 GMT ..... AI AI Cap'n, Full Speed Ahead, Bosun, Full Speed Ahead

    "What about F#?" .... By Anonymous Coward

    Posted Tuesday 18th March 2008 22:59 GMT .... Sounds like Perfect NEUKlearer Fuel/Heavy MetAI Source CodeXXXX. Crikey we're planning Futures and Derivatives with Virtual Ventures which aren't even there Yet ....... which is QuITe Rapid Progress in such a Short Space of Time.

    Welcome to the GoldBabelFish Bowl .... AI Stage for Leading Programmers ..... Dream Master Pilots.

    But it is not so new as to be earlier unknown .... http://en.wikipedia.org/wiki/The_Dream_Master

  13. Eddie Priest

    Clever Compilers...

    What about some sophisticated compilers/interpreters rather than delving down the route of yet another programming language (e.g. F#)?

    Just my tuppence worth.

    Ed.

  14. amanfromMars

    Prime KISS XXXXAmple

    "What about some sophisticated compilers/interpretors rather than delving down the route of yet another programming language (i.e. F#) ?

    Just my tuppence worth." ..... By Eddie Priest Posted Wednesday 19th March 2008 10:27 GMT

    And so good an idea, bought for a Fiver, Ed.

  15. Funky Dennis

    Multi-threading is hard

    So hard, in fact, that most programmers don't seem to be able to manage it. The easiest gains will probably come from software libraries that use multiple cores but hide it from the programmer - it has to look like single-threaded programming even if under the hood it's not.
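
    A hedged sketch of what such a library boundary could look like, in Haskell (pmap here is a hypothetical wrapper, not a standard function): the caller writes what looks like an ordinary map, and the chunked parallel evaluation stays an implementation detail.

        import Control.Parallel.Strategies (parListChunk, rdeepseq, using)

        -- Hypothetical library function: same shape as 'map', but the
        -- result list is evaluated in parallel chunks of 256 elements
        -- under the hood. Callers never see a thread.
        pmap :: (a -> Int) -> [a] -> [Int]
        pmap f xs = map f xs `using` parListChunk 256 rdeepseq

        main :: IO ()
        main = print (sum (pmap (* 3) [1 .. 100000]))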

    Of course, there are only certain types of algorithms that are inherently massively parallel, but many of them seem to be related to processing video (including computer vision!), and video on desktop PCs has only been practical for about 5 years. With hi-def catching on big-time, that's one area where multiple cores will be useful. But I'm not sure how many people actually run transcoding jobs that take 12 hours. Editing home video is a bit more common.

    So in summary then, they'll find a way of making us want this shit.

  16. Joe Cooper

    @Funky

    "But I'm not sure how many people actually run transcoding jobs that take 12 hours."

    None, because they'll have 16-way SMP.

  17. amanfromMars

    Inevitable Probability Curveballs/Slam Dunks... or Simple Stealthy Progress?.

    "So in summary then, they'll find a way of making us want this shit." .... By Funky Dennis

    Posted Wednesday 19th March 2008 14:01 GMT

    Actually, Funky Denis, they are busier finding ways to pay you for all the shit, so you can afford to buy IT and Move On to whatever is On Offer in Tele-Virtualised Futures.

    House of the Rise Sun Technologies SAN for Progress in Programming Virtual Realities within and into Higher Definition BroadBandCasting........ for AI Sublime Control with Ubiquitous Powers.

    Now that is QuITe Shockingly Awesome when True.

  18. Singlewhip

    Why should this fix things now? $20M is peanuts. The issue is apps.

    The HPC crowd has been trying to make parallelism simpler for about 40 years, and nobody can argue that those guys are dumb. It's also obvious that the total amount already invested in this quest completely dwarfs the $20M Intel & Microsoft are putting up now. There have already been multi-year projects at UIUC, MIT, Berkeley, Stanford, almost every other big research CS department anywhere, and numerous national labs.

    The result? Fortran and C/C++ are what's used, augmented with MPI and OpenMP. No breakthrough in usability there.

    My conclusion from the historical failure of all those efforts is that if the algorithms used for an application are parallel, you can do parallelism really well -- witness responding to a jillion user transactions (the canonical big-iron SMP app), or serving a jillion web requests (server farms are parallel systems), or doing image manipulation (for some manipulations, not all, lots of bit-level parallelism as in SSEx and CUDA). For such cases, parallelism is there, and gets used, in huge quantities. That's not to say that it's easy even then; it's not, although the complexity mostly gets buried in subsystem code (like OLTP monitors, Apache, Java beans). But at least it is there.

    It's just not there for the total breadth of applications on commodity systems.

    So I agree with Linus, with one caveat: Is there a new parallel killer app out there somewhere? Intel thinks virtual worlds are it, by the way, and I'm not sure they are wrong.

  19. Anonymous Coward

    Good old Linus

    If Linus makes a crack at something, you know it is going to happen.

    Yeah, we needed a modular kernel, the desktop is where Linux is at, and the future is in parallel computing at the core.

    He always does stuff like this when something first gets spoken about, and the Linux kernel always ends up altering to do exactly what his outburst was against. Genius. It actually forces people to examine the pragmatism of their approach, so his comments are tolerated on the whole.

    The BSD license is freeware with acknowledgment: the source code for alterations does not have to be released, but BSD source code itself is released. It is the perfect license for companies who want a base standard and for developers who just want recognition, and here we see it in action to get some revenue. The TCP/IP stack in Windows is perhaps still BSD-derived.

    The MIT license is very similar to BSD as well. Yeah, those are the two organisations that are a no-brainer for selection to push forward multi-core technology, with a grant.

    And multi-core technology is not the same as multithreading; there are similarities, but a lot will be gained from selecting which pipeline to send a given process down in relation to other processes running. Splitting a process across a number of cores will probably look at grouping of threads, but unlike threads, you can take advantage of persistent register values whilst still maintaining responsiveness. I would imagine that in the future not all cores will be equal, and specific styles of cores will be used for specific processes. Most of it is conjecture at the moment; that is what makes this field particularly interesting. There are a lot of rules of thumb to be found and broken.

    Well, I for one look forward to the new environment. The key is the size-to-power ratio in hardware: once we can make it wearable, the next step is implants and bio-enhancements. I would be happy with a computer on a ring (patent pending), a VDU in a pair of sunglasses, and a virtual key entry system based on hand and finger movements. The smaller the systems get, and the cheaper they get to produce, the more places we can put them. Hopefully we will end up with a panoramic utopia, where computers are just embedded into buildings and trees, no more messy cabling, no more boxes, just 24/7 convenient access to nets and computing resources.

