Mozilla eyes multi-threaded webpage rendering

Mozilla is exploring ways of building a multi-threaded browser DOM for Firefox, so that a single web page can be rendered using multiple processor cores. "We think it's possible," Mozilla open source evangelist Chris Blizzard said on Thursday at the O'Reilly Velocity conference in Santa Clara, California. "This is an active …
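To make the gist concrete, here is a minimal sketch of the general idea, not Mozilla's actual design: carve a DOM-like tree into independent subtrees and lay each one out on its own core. It's written in Rust, the Mozilla-backed language mentioned in the comments below; the Node type and layout pass are invented for illustration.

```rust
use std::thread;

// A toy stand-in for a DOM node; real engines track far more state.
struct Node {
    width: f32,
    children: Vec<Node>,
}

// A made-up, bottom-up layout pass over one subtree.
fn layout(node: &mut Node) {
    for child in &mut node.children {
        layout(child);
    }
    node.width = node.children.iter().map(|c| c.width).sum::<f32>().max(1.0);
}

// Hand each top-level subtree to its own thread. The borrow checker
// proves the subtrees don't alias, so no locks are needed.
fn layout_parallel(root: &mut Node) {
    thread::scope(|s| {
        for child in &mut root.children {
            s.spawn(move || layout(child));
        }
    });
    root.width = root.children.iter().map(|c| c.width).sum();
}
```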

COMMENTS

This topic is closed for new posts.
  1. Mark C Casey

    The current issue

    From a Firefox user, I think the biggest problem with Firefox right now is its single process nature.

    Because the engine that renders the web pages also renders the UI, the entire user interface can slow to a crawl when you hit a large number of tabs or a JavaScript-heavy website (see the sketch at the end of this comment).

    Firefox is currently the only major browser with this problem; they really need to get Electrolysis (multi-process Firefox) sorted out faster. For me the other browsers out there aren't as good overall, but they do some things better or faster.

    I would use Chrome, except Firefox has significantly better add-ons and a better bookmarking system, and I don't trust Google. (Plus, I prefer a separate search box... seriously, all that horizontal space wasted by the silly location bar in Chrome.)
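    For what it's worth, a rough sketch of the fix being asked for, in Rust to match the language discussed further down (the channel-polling loop is a stand-in for a real event loop): heavy page work runs on a worker thread, so the thread handling the UI never blocks.

    ```rust
    use std::sync::mpsc;
    use std::thread;
    use std::time::Duration;

    fn main() {
        let (tx, rx) = mpsc::channel();

        // Worker thread: stands in for JavaScript-heavy page work.
        thread::spawn(move || {
            let result = (0..10_000_000u64).sum::<u64>(); // pretend layout/JS
            tx.send(result).unwrap();
        });

        // "UI" loop: polls for the result instead of blocking on it.
        loop {
            match rx.try_recv() {
                Ok(result) => {
                    println!("page ready: {result}");
                    break;
                }
                Err(mpsc::TryRecvError::Empty) => {
                    // UI stays responsive: handle input, repaint, etc.
                    thread::sleep(Duration::from_millis(16));
                }
                Err(mpsc::TryRecvError::Disconnected) => break,
            }
        }
    }
    ```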

    1. Anonymous Coward
      Anonymous Coward

      Too damned right.

      As usual, more fun creeping featurism without really tackling major existing 'boring' problems, for at least two major versions!

      All the tabs should be self-contained processes, thread groups, threads, whatever, just not a monolithic incestuous mess, and each site should be sandboxed, period, so that one crashed tab (a YouTube Flash tab, say) cannot ever stall or crash the whole damned browser.

      As for the 1.7GB limit, that is so retarded: I have 12GB of RAM, but the browser can only use a small fraction of the space left by my hungry 64-bit apps. It's about freaking time the browser used 64-bit addressing when available, even if they have to have a translation layer, or preferably a fully separate 32-bit process for unstable 32-bit plug-ins like Flash!

      1. Oninoshiko
        Megaphone

        You WANT a browser to use more than 1.6GB?

        I don't know about you, but I think 1GB is excessive for a web browser.

        I have excessive amounts of memory for my non-web-browser applications. There is NO reason that FF should ever need more than 1GB.

        Anyone remember the reason FF got popular? It was a slimmed-down replacement for Mozilla's bloat. I think maybe it's time to put it down and start again.

  2. Paul Crawford Silver badge
    Unhappy

    As if browsers were not buggy enough already

    "It's a C++-like languages designed to let you build in parallelism and security," Blizzard said.

    I can't be the only one who doesn't associate C++ with 'security', even if it is good for native speed?

    So, not content with the current bugginess of browser implementations, we can now add the joys of trying (and usually failing) to implement and debug multi-threaded code.

  3. overloaded
    Alert

    MOAR!! Memory

    I wonder if it means more memory used per thread; Firefox might end up using a gig of memory if you go by current standards.

  4. Dirk Vandenheuvel
    Holmes

    Quest

    And so the quest continues to turn our browsers into the most over-engineered, complex platforms, running shitty programming models and spaghetti code. It feels like 1990 again.

  5. <user />
    Stop

    Do we really need to continue this speed-related numbers pissing contest? Bored of the JavaScript engine speed contest? Let's move on to the actual DOM now.

    The real bottleneck is the connection, not the page render speed. This numbers pissing contest is frankly becoming a fucking joke.

    1. ChrisC Silver badge
      Thumb Down

      Yes, yes we do

      Your network connection (don't assume everyone's is the same) may be the bottleneck for the initial page render, but is it still the bottleneck if the site you're visiting uses client-side processing to re-render the page contents based on your actions, without any further network access being required? No. Given the ongoing drive to move apps from the desktop into the cloud, the ability of our "browsers" (an increasingly inaccurate description of what they actually are) to re-render our work in progress from locally-cached data, at a rate high enough to make us forget we're using web-based apps rather than native desktop ones, is going to become ever more important.

  6. Ru
    Boffin

    A new language, huh?

    Usually this rings alarm bells. Unless you're writing some sort of declarative domain-specific scripting thing, creating a new language almost always seems to be the wrong thing to do. Rust actually looks like it might be quite sensible, which is almost astonishing. Sensible memory management, Erlang-style exception handling, sophisticated compile-time invariant checking, RAII... seems like it has a lot more going for it than the usual offerings.
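    For the curious, a tiny sketch of the compile-time safety being described, in modern Rust syntax (the language has changed enormously since this was written): sharing mutable state across threads without synchronisation is a compile error, not a latent race.

    ```rust
    use std::sync::{Arc, Mutex};
    use std::thread;

    fn main() {
        // Arc gives shared ownership across threads; Mutex gates mutation.
        let counter = Arc::new(Mutex::new(0));

        let handles: Vec<_> = (0..4)
            .map(|_| {
                let counter = Arc::clone(&counter);
                thread::spawn(move || {
                    // Remove the Mutex and this line doesn't become a
                    // data race at runtime; it becomes a compile error.
                    *counter.lock().unwrap() += 1;
                })
            })
            .collect();

        for handle in handles {
            handle.join().unwrap();
        }

        println!("total: {}", *counter.lock().unwrap());
    }
    ```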

  7. Anonymous Coward
    Thumb Down

    Terrible idea

    Firefox already has go-slow sessions (even though I have NoScript and Adblock Plus running). The only redeeming feature is that, when it does, it only brings one (virtual) core to its knees and the rest of the machine keeps running. Making Firefox 50% faster (but still slow) is less important to me than keeping the computer itself interactive.

    @overloaded: My Firefox already sits around 1.6GB, but then I have a *lot* of tabs open. It would be nice if it didn't fall over every time it tops 1.7GB, though.

    1. Anonymous Coward
      Anonymous Coward

      That is a good point

      But there must be a better solution to that problem than not making the browser multi-threaded.

      Either the browser or the OS should have an option to prevent the browser from using too many resources (CPUs or memory).
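      On Unix-likes the OS half of this already exists via setrlimit; a minimal sketch, assuming the libc crate (the 2GiB cap and the choice of RLIMIT_AS are illustrative):

      ```rust
      use libc::{rlimit, setrlimit, RLIMIT_AS};

      // Cap this process's address space so runaway allocations fail
      // instead of eating the whole machine. Values are illustrative.
      fn cap_address_space(bytes: u64) -> std::io::Result<()> {
          let lim = rlimit {
              rlim_cur: bytes, // soft limit: allocations beyond this fail
              rlim_max: bytes, // hard ceiling for the soft limit
          };
          // SAFETY: setrlimit only reads the struct passed to it.
          let rc = unsafe { setrlimit(RLIMIT_AS, &lim) };
          if rc == 0 {
              Ok(())
          } else {
              Err(std::io::Error::last_os_error())
          }
      }

      fn main() {
          cap_address_space(2 * 1024 * 1024 * 1024).expect("setrlimit failed");
          // ... then start the memory-hungry work.
      }
      ```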

    2. Anonymous Coward
      Anonymous Coward

      Firefox bombing at 1.7 gig

      Had that myself.

      32-bit processes under Windows get 2GB of user address space by default (up to 4GB for large-address-aware processes on 64-bit Windows), and address-space fragmentation makes allocations start failing well before the hard limit, which is roughly where the 1.7 gig figure comes from.

      Fx should handle that sensibly, but it doesn't. It wasted a lot of time for me.

  8. David Harper 1
    FAIL

    I remember the good old days ...

    ... when you could browse the web on a 66MHz 486-based PC, running Windows 3.1 in 16MB of memory. Using NCSA Mosaic.

    Now, apparently, you need a freaking supercomputer to render a web page.

    I blame all the Flash-powered adverts and JavaScript-driven pole-dancing squirrels.

    What do you mean, you don't see the squirrels? How can you miss them? They're wearing hot-pink leotards, for Pete's sake! Oh, you've got the SquirrelBlocker plugin ...

    1. Anonymous Coward
      Anonymous Coward

      Tell me about it...

      I remember when an 8MHz ARM did an okay job of rendering the web. A 486 was a luxury. "You don't need a high-powered computer for web browsing, email and a bit of word processing, only for serious work."

      These days, emacs is light-weight, code compilation is trivial and I can ray trace usefully on machines that provide a painful web surfing experience. It's almost as though someone noticed that the "average Joe" wasn't buying the best machine he could afford any more, and arranged things so that the average computer *needed* to be faster. I don't think it's even entirely the fault of composited window managers (although that doesn't help...)

  9. David Harper 1
    Alert

    This is why you need a render farm to display web pages

    The article we're discussing contains 2463 bytes of actual content, including the headline, by-line and publication date.

    To display it, my browser had to download 507,673 bytes in 46 separate files.
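    That works out to roughly 206 bytes downloaded for every byte of actual content: 507,673 / 2463 ≈ 206.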
