News Imagination demos PowerVR real-time raytracing tech

Discussion in 'Article Discussion' started by Gareth Halfacree, 7 Jan 2016.

  1. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    4 Dec 2007
  2. GuilleAcoustic

    GuilleAcoustic Ook ? Ook !

    26 Nov 2010
    Reminds me a lot of the Caustic R2100 and R2500 (which used Imagination GPUs too).
  3. Gadgety

    Gadgety New Member

    23 Jan 2014
    5x is impressive stuff. So you can either save on hardware and/or energy, or do more work. So this is hardwired ray tracing.
  4. yougotkicked

    yougotkicked A.K.A. YGKtech

    3 Jan 2010
    Wow, I remember reading a snippet in Custom PC magazine ~8 years ago about one of the first experiments with real-time ray tracing. Someone modified Quake 4 to rely entirely on a ray tracing render engine; the only way to achieve a "playable" frame rate was to render the game window at a comically small size, like 64x64 pixels.

    Of course, if I understand them correctly, this isn't actually fully ray traced. It's a "hybrid" render engine, and they don't really detail how much rasterization is going on. The linked article says the PC with four GR6500s running in parallel managed "in excess of a billion rays per second", but it doesn't really give us much context about how many rays need to be computed to render a single frame, so most people won't have any idea what that means.

    *Partway through writing a slightly dismissive post I stopped to do some research, and now I'm actually impressed. The presentation hand-waves a few points, so I assumed they were glossing over some important caveats to the demo, but now that I know a bit about the computational requirements, this is legitimately impressive.*

    The demo was running at 30 fps, presumably on a standard HD panel, so: 1920 x 1080 x 30 = ~62.2 million pixels per second. In principle an image can be ray-traced with 1 ray per pixel, but this results in the usual problems you see when using discrete elements (in this case pixels) to represent a system of continuous information. Each ray you trace gives you the correct color for a single dimensionless point in the image, not the average of the colors over an area, which is the value a pixel should have. The solution is to approximate the average color over the pixel's area by tracing several rays through different points within the pixel and averaging the results. There are some clever adaptive algorithms around for this, but even a simple 4x4 grid of rays per pixel yields OK results.
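    That per-pixel averaging idea can be sketched in a few lines of Python. This is just a toy illustration, not anything from the PowerVR hardware: `trace()` here is a hypothetical stand-in for a real ray tracer, returning a made-up smooth "scene" value, and the jittered sub-cell placement (stratified sampling) is one common way of spreading the samples across the pixel.

    ```python
    import random

    def trace(x, y):
        # Hypothetical stand-in for a real ray tracer: returns the "scene"
        # brightness at a single dimensionless point on the image plane.
        return (x % 1.0) * (y % 1.0)

    def pixel_color(px, py, grid=4):
        # Approximate the average color over the pixel's area by tracing a
        # grid x grid pattern of rays through the pixel and averaging them,
        # instead of firing one ray through the centre.
        total = 0.0
        for i in range(grid):
            for j in range(grid):
                # Jitter each sample within its sub-cell of the pixel.
                x = px + (i + random.random()) / grid
                y = py + (j + random.random()) / grid
                total += trace(x, y)
        return total / (grid * grid)
    ```

    With grid=4 that's 16 rays per pixel, which is where the "4x4" figure below comes from.
    
    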

    Coincidentally, 62.2 million x 4 x 4 = ~1 billion rays per second. This means that four GR6500s would be powerful enough to fully ray-trace a game on low quality settings and deliver 30 fps on a 1920x1080 screen. This is actually really impressive considering these are basically cell-phone GPUs: they've developed specialized hardware that can do ray-trace calculations very efficiently, and paired it with a conventional GPU and a specialized render engine so traditional rasterization can give everything a final polish.
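    For anyone who wants to check the arithmetic, the numbers work out like this (assuming 1080p, 30 fps, and the 4x4 sample grid described above):

    ```python
    # Ray budget for fully ray-tracing 1080p at 30 fps with 4x4 rays per pixel.
    width, height, fps = 1920, 1080, 30
    pixels_per_second = width * height * fps        # 62208000, i.e. ~62.2 million
    rays_per_pixel = 4 * 4
    rays_per_second = pixels_per_second * rays_per_pixel

    print(rays_per_second)  # 995328000 — just under the "billion rays per second" figure
    ```
    
    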

    A purely ray-tracing-based render engine still isn't computationally viable (matching the image quality we expect from rasterization would require hundreds of rays per pixel), but this is a big step towards it. The demo didn't really convince me they were doing anything special, but now that I understand what a billion ray traces per second can do, I think this legitimately counts as a 'breakthrough' in computer graphics.

    (I found this set of PowerPoint slides particularly informative and well-presented, if anyone feels like reading up on the subject a little:
  5. Bindibadgi

    Bindibadgi Tired. Forever tired.

    12 Mar 2001
    His name was Daniel Pohl; he was a PhD or post-doc researcher when I met him at IDF 2007, IIRC, and he was later hired by Intel to continue ray-tracing research.
