New WebCL toolkit hooks browser apps into GPUs – and that's not good news for Apple

Web browser apps on PCs and smartphones will be able to hook directly into the system's graphics chip and boost performance, thanks to the first WebCL release. WebCL 1.0 is a set of JavaScript bindings to the OpenCL parallel programming standard, its developers at the Khronos Group announced today: the software interface lets in …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward

    How many...

    Fart Apps

    Cat Videos

    Simple business apps

    Launchers

    etc

    etc

    would have their performance noticeably improved by using WebCL in the application?

    Come on, El Reg, why not look objectively for a change and at least do a finger-in-the-air guesstimate of the percentage of apps in somewhere like the App Store that would be hit by Apple NOT allowing WebCL access on their systems?

    My guess from looking at the applications I use regularly is very few. Photoshop & Lightroom are, for me, just about it, and even Photoshop seems to use the graphics card for only some manipulation on the few Macs I've had access to from time to time, because you get a warning that the memory is full.

    There might be implications for the MacPro when used for Video rendering and that is probably worth an article in its own right.

    1. Anonymous Coward

      Re: How many...

      On the whole I have to agree with you ... as far as apps are concerned.

      Some apps cost money; unless you're going to have a login on a web page, you're not going to be able to collect that money.

      Free Apps have "in-game-purchases" ... the "beauty" of the app stores is that they already have this transaction mechanism in place.

      Big apps, or apps with big data stores, will still want to be downloaded and stored on the device, rather than downloaded each time (and eating into data contracts).

      I can see a great use for things like Minecraft, and I can see Mozilla writing a video player in it (and updating their Flash alternative to use it as well), and probably Google going overboard with presentations, or spreadsheets, with graphs and whatnots, but I can't see it replacing even 10% of apps.

      1. Eddy Ito

        Re: How many...

        As you keenly point out, it isn't about how many apps, especially when you take Sturgeon's law into account. It also comes down to the problem of Apple wanting to make only consumptive mobile devices rather than creative ones. This is where WebCL might make tablets a viable alternative to PCs for designers, artists and engineers, in addition to opening new possibilities in collaboration.

        Heck, I almost forgot the most creative types of all, sales and marketing personnel. The horror, the horror.

  2. as2003

    Cue millions of websites (hacked or otherwise) becoming host to embedded bitcoin miners.

    Perhaps not, but I do wonder how many zero-day hacks lie in wait for us.

    1. Michael Wojcik Silver badge

      Indeed. This seems like an impressively bad idea.

      I hope that when Firefox implements it, it can be disabled in the config.

      (Presumably it will be, since you can disable WebGL via about:config. I have, though I also block it with NoScript, because it's interesting to know when a site tries to use it, and because no doubt I'll eventually have to enable it in Firefox for some damned thing I'm required to use for work, and then at least I'll be able to whitelist it.)
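
      For reference, the Firefox pref that disables WebGL under about:config is:

```
webgl.disabled = true
```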

  3. mafoo
    FAIL

    Silly headline

    Remember how everyone said HTML5 was going to destroy the appstore?

    Websites require the app to be downloaded every time you want to run it; mobile apps are stored on the device. That is the major advantage.

    That's why about 30% of the popular non-game apps on the App Store are just JS/HTML in an app wrapper à la PhoneGap.

    This will just mean that PhoneGap apps can have pretty graphics - not that browser apps will suddenly decimate mobile apps - although some nefarious websites might try and harness your GPU for a giant bitcoin-mining grid :P

    1. Barry Dingle

      Re: Silly headline

      I've never heard of localStorage either.

      1. M Gale

        Re: Silly headline

        I've never heard of localStorage either.

        All 5-10 megs of it?
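
        A minimal sketch of living inside that 5-10 MB per-origin localStorage budget; the in-memory stand-in exists only so the snippet also runs outside a browser:

```javascript
// Browsers typically cap localStorage at roughly 5-10 MB per origin, and
// signal the cap by throwing (QuotaExceededError) from setItem.
// In-memory stand-in so this sketch also runs outside a browser.
const store = (typeof localStorage !== "undefined") ? localStorage : (() => {
  const m = new Map();
  return {
    setItem: (k, v) => m.set(k, String(v)),
    getItem: (k) => (m.has(k) ? m.get(k) : null),
  };
})();

// Save app state, reporting quota exhaustion instead of crashing.
function saveState(key, value) {
  try {
    store.setItem(key, JSON.stringify(value));
    return true;
  } catch (e) {
    return false; // quota hit: caller can evict old entries and retry
  }
}

function loadState(key) {
  const raw = store.getItem(key);
  return raw === null ? null : JSON.parse(raw);
}
```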

        1. Robert Grant

          Re: Silly headline

          I didn't realise that was a limit that could be changed in the future either.

        2. P. Lee

          Re: Silly headline

          or a browser cache?

          I suspect it's all moot anyway; the main benefit of the App Store is an easy way to pay without giving CC details to dubious devs.

          It might push more business PayPal's way if people opt out of stores taking 30%. What I suspect you'll get is more free versions in the App Store with the premium version paid externally. Google might push this, as they have less of an interest in purchases and more interest in web pages / high usage numbers.

        3. Anonymous Coward

          Re: Silly headline

          The entire works of Shakespeare come in under 10MB, so you shouldn't underestimate what can be done with that...

  4. jnffarrell1

    Who should suffer

    A user waiting for the big pixels while buffering stalls, or an app maker selling obsolete technology? I choose to let the app maker learn the new open API at his own expense and stop yelping about competition.

  5. Anonymous Coward
    IT Angle

    Interesting even w/o Obligatory Swipe

    I can't see Apple caring very much until there's a heck of a lot more customer push-back than the grief around Flash. I can see some interesting uses here given my addiction to all kinds of simulations. What will be very nice is for games (FPS, RPG, &c.) to be capable enough for the (newish) high-PPI displays. Not my niche. Memory and battery life in the face of GPU demands will be the new constraints.

    I wonder how crazy fast you can get on non-memory-constrained desktops with high-end cards? Heck, it might be the hot ticket for Intel and AMD on-die GPUs. [I have an AMD FirePro 3D W7000 looking for some light entertainment.]

  6. JDX Gold badge

    "This a big deal because now you'll be able to access everything on the web"

    Except there is no actual reason everything needs to be done through the web in the first place. This whole idea of moving every application to a web-site is just dumb. You shoe-horn your 3D game into a browser-plugin or rework your well-designed UI to use HTML and end up with something which is less good but "hey it's in the browser".

    If you need the kind of processing power that only a GPU can give, write a proper application.

    1. Robert Grant

      Re: "This a big deal because now you'll be able to access everything on the web"

      This is true right up until the point where HTML can be as fluid as an app and companies realise that keeping around dev teams for every app platform is stupid.

      1. JDX Gold badge

        Re: "This a big deal because now you'll be able to access everything on the web"

        Yeah, but it's not. And nobody is changing that - we're still working with horrible CSS/DOM nonsense, and this isn't set to change.

  7. phil dude
    Linux

    MD @home in webcl...?

    I have a few versions of a mini MD code, so perhaps I should make it an "at home" app ;-)

    I fully expect the folding folks to get in on this....

    P.

  8. IGnatius T Foobar
    Pint

    Viva la web!

    It's great that Netscape won the browser war. Things are getting better all the time.

    Yes, you heard that right. The browser war wasn't really Netscape vs. Microsoft; it was web apps vs. Windows apps. Netscape TOTALLY WON the browser war, even though they died winning it. Everything is done through a web browser now.

    Technologies like WebGL and WebCL keep getting released now, proving that there is NO application for which people can continue to say "that'll never be able to run in a web browser." And that's the real reason why Micro$oft is so afraid of Chromebooks.

  9. Pet Peeve

    WebGL? You mean the thing that everyone said to turn off when it was even in beta?

    Native code in a browser = bad idea. I think NoScript already disables it, even if you don't do it yourself in the browser settings.

  10. Peter 48

    fear not

    If it truly were a risk to the App Store, Apple would simply ban it with the excuse that it is dangerous/laggy/battery-consuming and whatever else they accused Flash of being.

    1. Anonymous Coward

      Re: fear not

      "the excuse that it is dangerous/ laggy/ battery consuming and whatever else they accused flash of being."

      Just as truth is an absolute defence in a libel case, a statement of truth as a basis for policy goes a long way.

      You may hate Apple, but "Flash" really is an appalling mistake realised, and doubly so in the mobile computing environment for all the reasons Apple cited.

      I have flashblock installed, and I see fewer and fewer blocked flash markers, indicating the world has recognised the truth and moved away from flash. OK, it is still a popular video delivery vehicle, but that won't last either.

  11. LordHighFixer

    The first thing I thought

    Was very cool, now we can have exploits and virii coded directly into images and videos....

  12. Michael Wojcik Silver badge

    It's a terrific API. Manual resource management is "strongly recommended", because you can't trust the ECMAScript implementation's garbage collector to reclaim resources promptly. (Well, no. It's a garbage collector.) So, lacking proper RAII or some other structured resource management, we'll have all sorts of crap apps leaking scarce resources.
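
    For illustration, the discipline being recommended looks roughly like this; `Resource` and `withResource` are stand-ins (not WebCL API) for an object exposing the spec's manual `release()` method:

```javascript
// Stand-in for a WebCL object (buffer, command queue, context); the real
// API exposes release() for exactly this manual-reclamation purpose.
class Resource {
  constructor() { this.released = false; }
  release() { this.released = true; }
}

// Poor man's RAII: guarantee release() even if the work function throws,
// rather than hoping the garbage collector gets around to it.
function withResource(work) {
  const r = new Resource();
  try {
    return work(r);
  } finally {
    r.release();
  }
}
```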

    Asynchronous design with callbacks is "strongly recommended" to avoid blocking the implementation's main (generally only) thread. That's fine; no programmers who work on toy applications ever have trouble with asynchronous design.
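
    A minimal sketch of that recommended callback style, with `runKernelAsync` a hypothetical name and `setTimeout` standing in for a real WebCL enqueue:

```javascript
// Hand long-running work off the current call stack so the page's single
// main thread is never blocked; completion arrives via a Node-style
// (err, result) callback rather than a return value.
function runKernelAsync(input, done) {
  setTimeout(() => done(null, input.map((x) => x * 2)), 0);
}
```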

    A zillion magic constants, defined as integers. No type safety. (And yes, you can impose some run-time type validation in ECMAScript; you just have to do it yourself. A WebCL implementation could have, if the API accommodated it.)
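
    What that bolted-on, run-time validation looks like in practice - the two constant values mirror OpenCL's real CL_DEVICE_TYPE_CPU/GPU bit flags, but `checkDeviceType` itself is a hypothetical helper the caller has to write:

```javascript
// WebCL-style APIs expose options as bare integers; these values mirror
// OpenCL's CL_DEVICE_TYPE_CPU / CL_DEVICE_TYPE_GPU bit flags.
const DEVICE_TYPE_CPU = 1 << 1; // 2
const DEVICE_TYPE_GPU = 1 << 2; // 4

// Nothing in the language stops a caller passing any integer at all, so
// type safety has to be imposed at run time by hand.
function checkDeviceType(value) {
  const valid = new Set([DEVICE_TYPE_CPU, DEVICE_TYPE_GPU]);
  if (!valid.has(value)) {
    throw new TypeError("not a known device-type constant: " + value);
  }
  return value;
}
```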

    WebCLDevice.getInfo is great for fingerprinting a victim's machine.

    "WebCL has been designed with security as a primary concern." Oh, well, that's all right then. What hath this "primary concern" yielded? Memory access protection, no object reuse, and a recommendation that the underlying OpenCL implementation guard against runaway apps. Groundbreaking.

    They did leave out a lot of the more-dangerous OpenCL features, so that's something. But not nearly enough for me to trust WebCL - not that I can imagine having any need, or indeed use, for it anytime soon.

    An aside: I noted the following hilariously vile phrasing from the Wikipedia article on WebCL: "WebCL allows web applications to actualize speed with multi-core CPUs and GPUs". At last, my web applications can actualize themselves some speed! "Man, this trip is taking forever. Can't this car actualize any more speed?"

  13. PeterGriffin

    The primary adopter of New Technology...

    ...will no doubt exploit this new technology to create the best interactive 3D porn yet!

  14. jubtastic1
    Thumb Up

    What could go wrong?

    Everything.

  15. Ilsa Loving

    Oh great

    So now we can have web pages write crappy OpenCL and cause our machines to lockup/reboot?

    I tried playing with some OpenCL code once and it was incredibly finicky. It would make my graphics chip shart itself and cause my whole machine to spontaneously reboot.

    If Apple hasn't implemented this system, then that's a point in their FAVOR.

  16. Vociferous

    Hype it till you make it

    I remember I was impressed when I first saw a 3D shape reflect the skybox. I think it was in the Final Reality demo. An alien chromium spaceship. That was back in 1999, if memory serves.

    That browsers, 15 years later, have reached the same capability is in many ways impressive, but "same performance as native code" it clearly is not, and I doubt Apple's App Store has much to fear. I understand the need to hype new products, such as this or that new cell-phone chip which claimed to outperform Nvidia's baddest offerings, but... seriously. Come on.
