When I’m at SIGGRAPH, I’m planning to visit some friends in the Intel booth to see their new motion blurring technology. I’m sure people more knowledgeable in the field will find even more interesting developments from these engineers, who are helping drive software defined visualization work, in particular with their Embree open source ray tracing library and the OSPRay open source rendering engine for high-fidelity visualization (built on top of Embree). Key developers will be in the Intel booth at SIGGRAPH, and they have a couple of papers in the High-Performance Graphics (HPG) conference that is co-located with SIGGRAPH.
The Embree/OSPRay engineers have two interesting papers they will present at HPG. Both will be presented on Saturday, July 29, in “Papers Session 3: Acceleration Structures for Ray Tracing”:
- Improved Two-Level BVHs Using Partial Re-Braiding, by Carsten Benthin, Sven Woop, Ingo Wald, Attila Afra
- STBVH: A Spatial-Temporal BVH for Efficient Multi-Segment Motion Blur, by Sven Woop, Attila Afra, Carsten Benthin
The SDVis machine (I refer to it as a “dream machine for software visualization,” but they simply call it the “Software Defined Visualization Appliance”) will be in the Intel booth at SIGGRAPH. I did not get to see it at ISC in Germany, where they showed off HPC-related visualization work, with the hot topic being “in situ visualization.” At SIGGRAPH, they will have demos around high-fidelity (realistic) visualization, specifically demonstrating Embree’s novel approach to handling multi-segment motion blur and OSPRay’s photorealistic renderer interactively rendering a scene. These demos relate to their HPG papers.
Image: the motion blur approach, original (left) and with blurring (right); original image (CC) caminandes.com
I’m sure the partial re-braiding is amazing, but it’s the blurring that caught my attention.
First of all, in theory motion blur is not needed: with a high enough frame rate and resolution, the motion would simply look to us like real life. At least, I think that’s right.
However, at realistic frame rates and resolutions, we perceive a scene as unrealistic when the blur is missing. This is temporal aliasing at work: in some cases, spokes on wheels even appear to spin backwards (the classic wagon-wheel effect).
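To make that concrete, here’s a tiny back-of-the-envelope program (the wheel and camera numbers are made up purely for illustration) showing how sampling a spinning wheel at 24 fps can alias the true rotation into a slow backwards one:

```cpp
#include <cmath>
#include <cstdio>

// Toy illustration of the wagon-wheel effect: sampling a spinning
// wheel at a finite frame rate aliases the true rotation.
int main() {
    const double spokes    = 8.0;    // spokes on the wheel (hypothetical)
    const double revPerSec = 2.8;    // true rotation speed (hypothetical)
    const double fps       = 24.0;   // camera frame rate

    const double spokeSpace   = 360.0 / spokes;             // degrees between spokes
    const double truePerFrame = 360.0 * revPerSec / fps;    // true degrees per frame

    // Because the spokes are indistinguishable, the eye only sees the
    // rotation modulo the spoke spacing, folded into [-half, +half].
    double apparent = std::fmod(truePerFrame, spokeSpace);
    if (apparent > spokeSpace / 2.0) apparent -= spokeSpace;

    std::printf("true rotation per frame:     %+.1f deg\n", truePerFrame);
    std::printf("apparent rotation per frame: %+.1f deg\n", apparent);
    // With these numbers the wheel really turns +42 deg/frame but
    // appears to turn -3 deg/frame, i.e. slowly backwards.
    return 0;
}
```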
The solution? Blurring. But, like many topics, what seems simple enough is not. A simple algorithm might be to take adjacent frames and create a blur based on the changes between them; perhaps render at a higher frame rate and average the subframes down to the target frame rate for your final production. Unfortunately, this approach is not efficient, because the geometry has to be processed multiple times per frame, and adaptive noise reduction on parts of the image is not possible.
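For the curious, here is roughly what that naive scheme looks like in code. Everything here (the Image type, renderFrame, the parameters) is hypothetical scaffolding, just enough to show where the cost goes:

```cpp
#include <vector>

// Hypothetical framebuffer and renderer, just enough to compile.
// renderFrame(t) stands in for "re-render the whole scene at time t",
// which is the expensive step this naive scheme repeats per subframe.
struct Image { std::vector<float> rgb; };

Image renderFrame(double time) {
    // Stub: a real renderer would re-process all geometry here.
    return Image{ {float(time), float(time), float(time)} };
}

// Naive motion blur: render K subframes across the shutter interval
// of each output frame and average them.  Every subframe pays the
// full cost of processing the scene geometry again, and there is no
// way to adaptively refine just the noisy (fast-moving) pixels.
Image blurFrame(double frameStart, double shutter, int K) {
    Image accum = renderFrame(frameStart + shutter * 0.5 / K);
    for (int i = 1; i < K; ++i) {
        double t = frameStart + shutter * (i + 0.5) / K;  // stratified times
        Image sub = renderFrame(t);
        for (size_t p = 0; p < accum.rgb.size(); ++p)
            accum.rgb[p] += sub.rgb[p];
    }
    for (float& c : accum.rgb) c /= float(K);
    return accum;
}

int main() {
    Image frame = blurFrame(/*frameStart=*/0.0, /*shutter=*/0.5, /*K=*/16);
    return frame.rgb.empty();
}
```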
That’s where “A Spatial-Temporal BVH for Efficient Multi-Segment Motion Blur” comes in. These engineers took a different approach, in which they pay attention to the actual motion of each object. Imagine a scene with a helicopter blade spinning around and around while a bird flies through the scene along something much closer to a straight line. Their method comprehends the actual motion and creates blurring based on it. Of course, doing this with both high performance and high fidelity is what really makes their work valuable. In the example images above, the train’s blurring varies in a realistic fashion.
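On the user-facing side, Embree exposes this motion comprehension as multi-segment motion blur: you give a geometry several time steps, and each ray carries a time at which the geometry is interpolated before intersection, so the BVH has to handle the motion efficiently. Here’s a minimal sketch using the Embree 3-style API; that API postdates the version being demoed, and the paper is about the acceleration structure underneath, so treat the exact calls as illustrative rather than as their method:

```cpp
#include <embree3/rtcore.h>
#include <limits>

// Minimal multi-segment motion blur sketch: one triangle with three
// time steps (two motion segments) drifting along x, intersected by
// a single time-stamped ray.
int main() {
    RTCDevice device = rtcNewDevice(nullptr);
    RTCScene  scene  = rtcNewScene(device);

    RTCGeometry geom = rtcNewGeometry(device, RTC_GEOMETRY_TYPE_TRIANGLE);
    rtcSetGeometryTimeStepCount(geom, 3);   // 3 time steps -> 2 motion segments

    // One vertex buffer per time step: the triangle moves in x.
    for (unsigned step = 0; step < 3; ++step) {
        float* v = (float*)rtcSetNewGeometryBuffer(
            geom, RTC_BUFFER_TYPE_VERTEX, step, RTC_FORMAT_FLOAT3,
            3 * sizeof(float), 3);
        float dx = 0.5f * step;             // per-step displacement (made up)
        float verts[9] = { 0+dx,0,0,  1+dx,0,0,  0+dx,1,0 };
        for (int i = 0; i < 9; ++i) v[i] = verts[i];
    }
    unsigned* idx = (unsigned*)rtcSetNewGeometryBuffer(
        geom, RTC_BUFFER_TYPE_INDEX, 0, RTC_FORMAT_UINT3,
        3 * sizeof(unsigned), 1);
    idx[0] = 0; idx[1] = 1; idx[2] = 2;

    rtcCommitGeometry(geom);
    rtcAttachGeometry(scene, geom);
    rtcReleaseGeometry(geom);
    rtcCommitScene(scene);

    // Shoot one ray at a chosen shutter time; a renderer averages many
    // such rays per pixel, at randomized times, to produce the blur.
    RTCIntersectContext ctx;
    rtcInitIntersectContext(&ctx);
    RTCRayHit rh = {};
    rh.ray.org_x = 0.3f; rh.ray.org_y = 0.3f; rh.ray.org_z = -1.f;
    rh.ray.dir_z = 1.f;
    rh.ray.tnear = 0.f;
    rh.ray.tfar  = std::numeric_limits<float>::infinity();
    rh.ray.mask  = (unsigned)-1;
    rh.ray.time  = 0.5f;                    // sample time within the shutter
    rh.hit.geomID = RTC_INVALID_GEOMETRY_ID;
    rtcIntersect1(scene, &ctx, &rh);

    rtcReleaseScene(scene);
    rtcReleaseDevice(device);
    return rh.hit.geomID != RTC_INVALID_GEOMETRY_ID ? 0 : 1;
}
```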
If you want to read a better description of their work, and their comparisons with previous work, you should read their paper and/or visit them at HPG, or in the Intel booth at SIGGRAPH.
I hope to see some of you at SIGGRAPH. I’m co-teaching a tutorial “Multithreading for Visual Effects” on Monday at 2pm. After that, I’m running over to see the Embree/OSPRay folks in the Intel booth.