Clipless Dual-Space Bounds for Faster Stochastic Rasterization

We present a novel method for increasing the efficiency of stochastic rasterization of motion and defocus blur.

In contrast to earlier approaches, our method is efficient even at the low sampling densities commonly encountered in real-time rendering, while allowing the use of arbitrary sampling patterns for maximal image quality.

Our clipless dual-space formulation avoids problems with triangles that cross the camera plane during the shutter interval.

The method is also simple to plug into existing rendering systems.
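As a rough illustration of the kind of conservative bounding that makes such culling possible, the sketch below computes an axis-aligned screen-space box covering a triangle under linear vertex motion across the shutter interval. This is a deliberately simplified stand-in, not the paper's clipless dual-space formulation: the function name `motion_bounds` and the linear-motion assumption are illustrative only.

```python
# Simplified illustration (NOT the paper's dual-space method): a conservative
# axis-aligned screen-space bound for a triangle whose vertices move linearly
# over the shutter interval t in [0, 1]. Stochastic samples whose (x, y)
# position falls outside this box can never hit the triangle and may be culled.

def motion_bounds(verts_t0, verts_t1):
    """Return ((min_x, min_y), (max_x, max_y)) covering the swept triangle.

    verts_t0, verts_t1: lists of three (x, y) screen positions at shutter
    open and shutter close. Because each vertex moves along a straight line,
    the swept triangle lies inside the box of all six endpoint positions.
    """
    xs = [p[0] for p in verts_t0 + verts_t1]
    ys = [p[1] for p in verts_t0 + verts_t1]
    return (min(xs), min(ys)), (max(xs), max(ys))

# Example: a triangle translating 10 pixels to the right during the shutter.
lo, hi = motion_bounds(
    [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)],
    [(10.0, 0.0), (14.0, 0.0), (12.0, 3.0)],
)
print(lo, hi)  # (0.0, 0.0) (14.0, 3.0)
```

The real method tightens this idea considerably by bounding the primitive per screen-space tile in a dual space, which remains well-defined even when the triangle crosses the camera plane during the shutter interval.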
