Nvidia to acquire ray tracing startup

Nvidia in the past has jeered at Intel's heavy investment in ray tracing as a successor to rasterization for graphics rendering, but it has always stopped short of dismissing the technology completely. That logically led many to assume Nvidia was developing its own ray tracing technology on the side. As it turns out, those bets …

COMMENTS

This topic is closed for new posts.
  1. Anonymous Coward
    Coat

    The Real Problem

    ...is finding enough Rastafarians named Ray.

  2. Francis Boyle Silver badge

    OK

    All together now.

    "Maybe they could use traceroute"

  3. jon
    Paris Hilton

    Real time vs "real time"

    It's cool that the graphics hardware makers are catching up with the non-RT graphics technologies and admitting it'll take a mix of techniques to make things look great, instead of betting the farm on pushing pure polygon counts, fill rates, texture RAM or whatever the latest buzzword happens to be. Even big Hollywood movies do some shots with an unglamorous 2D effect; it's all about appropriate technology. But is this puncturing of the ray-tracing hype an admission of PR defeat? Have the buzzwords run out, leaving only common sense? That seems quite unlikely, but if you look at the current generation of HD games, the hardware is no longer really the limiting factor (and no, it's not the limitations of imagination that hold us back). Instead, it's the mundane business of paying enough people enough money, for the real time it takes to make and market a current-gen game, that defines the new boundaries of RT graphics capabilities.

    PH because the ray-traced Utah teapot icon appears to have gone missing.

  4. Troy Shanahan
    Thumb Up

    @ AC

    Ahahahaha! Best thing I've read all week. Well said.

    Regarding it being good at rendering cars, who here reckons we'll see this in the next Need for Speed, even if the hardware doesn't support it? I wouldn't put it past EA, somehow.

  5. jedd

    A serving of ray-tracing on the side

    Aren't these two things contradictory:

    "Nvidia was developing its own ray tracing technology on the side."

    and

    "Nvidia will soon announce its acquisition of a ray tracing startup"

    Is this in the same manner that Microsoft develops technology on the side?

  6. ulric
    Boffin

    nVidia already owns a ray tracer

    nVidia acquired the much more significant Mental Images last December.

    http://www.nvidia.com/object/mental_images.html

  7. Torben Mogensen

    Raytracing will win (eventually)

    Just like solid polygon rendering replaced wireframing, raytracing will eventually replace Z-buffer or "painter's algorithm" polygon rendering. There are several reasons:

    1. Raytracing can, as the article suggests, produce shading effects that are difficult to get with "traditional" rendering techniques.

    2. The time needed to produce a ray-traced image is largely independent of the number of objects in the scene. A rough estimate is (number of pixels) * log(number of objects), whereas Z-buffer and painter's algorithm need roughly (number of pixels) * (number of objects) to do the same, albeit with a much smaller constant factor. Since both estimates scale with the pixel count, the comparison really comes down to scene complexity: while scenes stay simple, the smaller constant factor keeps traditional rendering ahead on speed, but as the number of objects grows, log(n) beats n and raytracing will eventually win.

    Raytracing is easy to parallelise, so it seems a natural development at a time when you have more transistors than you know how to use.
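
    To illustrate the parallelism point, here is a minimal sketch (the hard-coded sphere and the OpenMP pragma are my own placeholder choices, nothing Nvidia-specific): every pixel's ray is fully independent, so a single pragma is the entire parallelisation effort.

        /* Minimal parallel ray caster: one sphere, one ray per pixel.
         * Compile with: gcc -fopenmp -O2 raytrace.c */
        #include <stdio.h>

        #define W 320
        #define H 240

        static unsigned char image[H][W];

        /* Does a ray from the origin through direction (dx,dy,dz) hit a
         * unit sphere centred at (0,0,5)? The standard quadratic-
         * discriminant test. */
        static int hits_sphere(double dx, double dy, double dz)
        {
            double a = dx * dx + dy * dy + dz * dz;
            double b = -2.0 * dz * 5.0;   /* -2 * (d . C), C = (0,0,5) */
            double c = 5.0 * 5.0 - 1.0;   /* |C|^2 - r^2, r = 1 */
            return b * b - 4.0 * a * c >= 0.0;
        }

        int main(void)
        {
            int x, y;

            /* Every pixel's ray is computed from scratch and written to
             * its own slot: no shared state, no locks, so this pragma is
             * the whole parallelisation effort. (Compiled without
             * -fopenmp it still runs, just serially.) */
            #pragma omp parallel for private(x)
            for (y = 0; y < H; y++)
                for (x = 0; x < W; x++) {
                    double dx = (x - W / 2.0) / H;  /* pinhole camera */
                    double dy = (y - H / 2.0) / H;
                    image[y][x] = hits_sphere(dx, dy, 1.0) ? 255 : 0;
                }

            /* Dump a PGM so the result can be eyeballed. */
            printf("P2\n%d %d\n255\n", W, H);
            for (y = 0; y < H; y++)
                for (x = 0; x < W; x++)
                    printf("%d%c", image[y][x], x + 1 == W ? '\n' : ' ');
            return 0;
        }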

  8. Nick Sargeant

    Arvo and Kirk

    Isn't David Kirk the one who, together with James Arvo, wrote papers presented at SIGGRAPH in the '80s and '90s on ray tracing and techniques for speeding it up?

    This might mean that Nvidia's acquisition has a nostalgic, self-indulgent twinge to it rather than a specific product in mind -- David collecting technology components together to see what might happen.

  9. Anonymous Coward
    Boffin

    Specific rays

    I played around (cast about, ha! ha!) with a hybrid ray tracing / rasterization engine back in the early '90s. It was really just a concept fleshed out with some Mode X C code, but the idea was, as some people here have said, to use it to replace a Z-buffer, plus a little bit more. Mine would also have decided which polygons to render in the first place. Since my quality standards weren't too high (see time of development), I was looking at having it 'miss' far-off polygons intentionally, which would result in a kind of blurry background. So I didn't have to trace a ray for each pixel; I'd do every fifth or so, which meant only a few thousand rays instead of the full 76,800 for a 320x240 screen.

    The nice bit was that it handled a lot of neat effects without much of a speed hit - translucent windows and such.

    If I'd kept going I think it could have been pretty cool, but I was really young at the time and didn't have the knowledge to exploit it. And now the standards are much higher, so I still don't have the knowledge to exploit it even though I have more knowledge! So it goes...
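
    For what it's worth, here is roughly what that visibility pass would have looked like. This is a from-memory sketch, not the original code: screen-space circles stand in for real polygons, there's no depth ordering, and all the names and numbers are made up for illustration.

        #include <stdio.h>

        #define W 320
        #define H 240
        #define STEP 5          /* one ray per 5x5 block of pixels */
        #define NPOLY 8

        /* Stand-in scene: each "polygon" is a circle in screen space, so
         * the ray test is trivial. A real engine would intersect a 3D ray
         * with real geometry and keep the nearest hit. */
        typedef struct { float cx, cy, r; } Poly;

        static const Poly scene[NPOLY] = {
            {160, 120, 40}, { 60,  60, 12}, {260,  50, 9}, { 40, 200, 25},
            {300, 220,  6}, {201, 181,  3}, {122,  32, 2}, {282, 133, 1.5f},
        };

        static int first_hit(float x, float y)
        {
            for (int i = 0; i < NPOLY; i++) {
                float dx = x - scene[i].cx, dy = y - scene[i].cy;
                if (dx * dx + dy * dy <= scene[i].r * scene[i].r)
                    return i;
            }
            return -1;          /* ray hit nothing */
        }

        int main(void)
        {
            int visible[NPOLY] = {0};

            /* Sparse pass: 64 x 48 = 3072 rays instead of one per pixel. */
            for (int y = 0; y < H; y += STEP)
                for (int x = 0; x < W; x += STEP) {
                    int id = first_hit((float)x, (float)y);
                    if (id >= 0)
                        visible[id] = 1;
                }

            /* Only polygons some ray actually saw get rasterized; anything
             * smaller than the ray spacing falls through the grid -- the
             * intentional "blurry background" miss. */
            for (int i = 0; i < NPOLY; i++)
                printf("polygon %d (r=%.1f): %s\n", i, scene[i].r,
                       visible[i] ? "rasterize" : "skip");
            return 0;
        }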

  10. TeeCee Gold badge
    Joke

    Making cars look shiny.

    If Nvidia can come up with a way of making *my* car look shiny without my having to polish it, they have a sale!

  11. Torben Mogensen

    @David

    Only tracing every fifth pixel is a good optimisation -- but you can vastly improve the picture quality if you follow it with another step: if the four traced pixels at the corners of a square all hit the same object, interpolate the texture and intensity between them, but if they don't, follow all the "missing" rays for the intervening pixels. I did that with a raytracer I made in '86 (except I followed every fourth pixel instead of every fifth), and I found that, with relatively simple scenes, I got immense speed-ups and only a little degradation in picture quality. In an animated sequence, I doubt you would notice the difference.
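
    In code, the scheme looks something like this. It's a sketch, not the '86 original: the one-disc "scene" and every name in it are stand-ins of mine, and a real tracer would return the nearest object's id along with its shading.

        #include <stdio.h>

        #define W 320
        #define H 240
        #define STEP 4   /* trace the corners of every 4x4 block */

        typedef struct { int object_id; float shade; } Hit;

        /* Stand-in tracer: one shaded disc on a dark background. */
        static Hit trace_ray(int x, int y)
        {
            float dx = x - W / 2.0f, dy = y - H / 2.0f;
            float d2 = dx * dx + dy * dy;
            Hit h;
            if (d2 < 3600.0f) { h.object_id = 1; h.shade = 1.0f - d2 / 3600.0f; }
            else              { h.object_id = 0; h.shade = 0.1f; }
            return h;
        }

        static float lerp(float a, float b, float t) { return a + (b - a) * t; }

        static float img[H][W];

        int main(void)
        {
            int traced = 0;
            /* Blocks that would stick out past the edge are skipped in
             * this sketch, and shared corners get retraced: untuned. */
            for (int y = 0; y + STEP < H; y += STEP)
                for (int x = 0; x + STEP < W; x += STEP) {
                    /* Trace only the four corners of the block. */
                    Hit c00 = trace_ray(x, y);
                    Hit c10 = trace_ray(x + STEP, y);
                    Hit c01 = trace_ray(x, y + STEP);
                    Hit c11 = trace_ray(x + STEP, y + STEP);
                    traced += 4;

                    int same = c00.object_id == c10.object_id &&
                               c00.object_id == c01.object_id &&
                               c00.object_id == c11.object_id;

                    for (int j = 0; j <= STEP; j++)
                        for (int i = 0; i <= STEP; i++)
                            if (same) {
                                /* All corners agree: bilinear interpolation. */
                                float u = (float)i / STEP, v = (float)j / STEP;
                                img[y + j][x + i] =
                                    lerp(lerp(c00.shade, c10.shade, u),
                                         lerp(c01.shade, c11.shade, u), v);
                            } else {
                                /* An object edge crosses the block: trace
                                 * every intervening ray after all. */
                                img[y + j][x + i] = trace_ray(x + i, y + j).shade;
                                traced++;
                            }
                }
            printf("rays traced: %d for %d pixels\n", traced, W * H);
            return 0;
        }

    The printed ray count shows the speed-up directly: only blocks straddling an object edge pay full price.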

  12. Andreas
    Boffin

    NVIDIA Gelato

    NVIDIA Gelato is a renderer with raytracing support...

    http://www.nvidia.com/page/gz_home.html

    Gelato is a production-quality, non-interactive "final frame" renderer. (In fact, I wouldn't be surprised if Gelato uses mental ray from their Mental Images acquisition.)

    RayScale, on the other hand, appears to be a different beast, in that it aims for interactive (i.e., real-time) raytracing.

    My guess about NVIDIA's acquisition of RayScale is that they are hoping to benefit from the RayScale technology/knowledge to improve future GPUs with real-time raytracing support.

  13. Lee McKenzie
    Go

    The Amusement Machine - Real Time Raytracing...test it now.

    Interesting bit of news indeed. Please try http://theamusementmachine.net/ . We are showing real-time ray tracing on NVIDIA GPUs, combined with rasterization in some of the videos. There is also a beta demo you can download and try if you register. Quite a bit different from the classic spheres and reflections. Enjoy.
