WIP: HDR Rendering – Gaussian Filter using Compute Shader
The Gaussian blur filter is one of the most widely used effects in graphics programming. It underpins many other effects such as Bloom and depth of field (DOF).
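For reference, the filter is built from a 1-D Gaussian kernel. A small Python sketch (illustration only; the renderer itself uses HLSL, and the sigma = radius/2 heuristic is my own assumption, not fixed by the algorithm):

```python
import math

def gaussian_weights(radius, sigma=None):
    """Normalized 1-D Gaussian kernel with 2*radius + 1 taps."""
    sigma = sigma if sigma is not None else radius / 2.0  # assumed heuristic
    w = [math.exp(-(x * x) / (2.0 * sigma * sigma))
         for x in range(-radius, radius + 1)]
    total = sum(w)
    return [v / total for v in w]

weights = gaussian_weights(3)
print(len(weights))  # 7 taps: 2*radius + 1
```

The key point for what follows is that the tap count, and therefore the work per pixel, grows with the radius.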
I coded a version of it using pixel shaders earlier, in my naive pixel-shader-based HDR renderer. Although it served the purpose, it was very limited in performance, and hence in quality, since the blur radius cannot be increased beyond a certain point while staying real-time.
I then posted about my first compute shader based implementation here, which was based on Frank Luna’s algorithm. But even that was limited by the fact that the blur radius cannot grow past a certain point (depending on the GPU) without hurting performance.
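To make that radius limitation concrete, here is a minimal Python sketch of one horizontal pass of a separable blur (illustration only; the real implementation is an HLSL compute shader). Each output pixel reads 2*radius + 1 taps, so the per-pixel cost is linear in the radius:

```python
def blur_1d(row, weights):
    """One pass of a separable blur: 2*radius + 1 taps per pixel,
    clamping reads at the image edges."""
    r = (len(weights) - 1) // 2
    out = []
    for i in range(len(row)):
        acc = 0.0
        for k, w in enumerate(weights):
            j = min(max(i + k - r, 0), len(row) - 1)  # clamp to edge
            acc += w * row[j]
        out.append(acc)
    return out

print(blur_1d([0.0, 0.0, 10.0, 0.0, 0.0], [0.25, 0.5, 0.25]))
# -> [0.0, 2.5, 5.0, 2.5, 0.0]
```

A full 2-D blur runs this once horizontally and once vertically, which is exactly why doubling the radius roughly doubles the cost of each pass.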
Here are two screenshots (800 x 600) –
1st Screenshot – 1.01 ms or 988 FPS
2nd Screenshot – 27.7 ms or 36 FPS
Constant Time Blur
I was also aware of talks about using a repeated box filter as a constant-time blur, from the Microsoft Build 2013 event. The main idea is to compute the SAT, or Summed Area Table, of the image using compute shaders, after which a Gaussian-like blur of any radius can be performed while the number of arithmetic operations stays the same. (I am planning to write a tutorial-style post later with all the details of the work and research I did for this, because computing a SAT with compute shaders is a very useful technique. Interested readers can also find a working example in the Nvidia DirectX 11 SDK.)
Some screenshots of my Constant Time Gauss Blur Implementation (800 x 600) –
1st Screenshot – 6.8 ms or 147 FPS
2nd Screenshot – 6.8 ms or 147 FPS
3rd Screenshot – 6.8 ms or 147 FPS
As we can see, if the blur radius (and/or the number of passes) is small, the first algorithm wins, but for larger blur radii the second algorithm is the clear winner. Also note that the time taken by the second algorithm is almost the same for all the images – that is why it is called a constant-time blur.
For now I am keeping both algorithms in my post-processor, because I am not yet sure which one will better serve my purpose in the end.