Tuesday, September 27, 2011

Fixing the holes in my Ozone layer


Although I've been working on the procedural architecture system to produce city buildings more interesting than simple boxes, I finally became irritated enough with a long-standing bug to put that work on hold and get it sorted.  The bug in question was in the atmospheric scattering shader, and caused unsightly circular artifacts in the effect as the camera moved further and further away from the planet's surface:




The atmospheric scattering effect is an implementation of the algorithm described in Eric Bruneton and Fabrice Neyret's excellent paper Precomputed Atmospheric Scattering, and to be fair to them the bug shown above wasn't a problem with their algorithm; it was a problem with the way I was calculating the entry and exit points of the ray from the camera through each pixel into and out of the atmosphere.

The atmosphere is modelled as the space between two concentric spheres: one representing the innermost radius of the planet (the lowest possible point on the terrain) and one the outermost radius of the atmosphere.  Rather than render some sort of triangulated sphere with the atmospheric scattering shader on it, however, the effect is applied in screen space by rendering a full-screen quad, the pixel shader for which calculates a ray from the camera through that pixel and into the space between these inner and outer spheres.  If your planet is perfectly smooth you can use an analytical ray-sphere intersection calculation to find the atmosphere-intersecting portion of this ray, which is generally good enough for distant views of the planet, but this fairly brutal assumption breaks down as you get closer to the planet because any sort of interesting terrain is by its nature not perfectly flat.
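For reference, the analytical ray/sphere test I'm talking about boils down to something like this little C++ sketch; the names and types here are just illustrative, not lifted from my actual shader code:

```cpp
#include <cmath>
#include <optional>
#include <utility>

struct Vec3 { double x, y, z; };

static double Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Intersect a ray (origin, unit-length direction) with a sphere centred on the
// planet's origin.  Solves |origin + t * dir|^2 = radius^2 for t and returns the
// near/far distances along the ray, or nothing if the ray misses entirely.
std::optional<std::pair<double, double>> RaySphere(const Vec3& origin, const Vec3& dir, double radius)
{
    const double b    = Dot(origin, dir);                  // half of the quadratic's b term
    const double c    = Dot(origin, origin) - radius * radius;
    const double disc = b * b - c;
    if (disc < 0.0)
        return std::nullopt;                               // ray misses the sphere
    const double s = std::sqrt(disc);
    return std::make_pair(-b - s, -b + s);                 // entry and exit distances
}
```

Clamping the entry distance to zero handles the camera-inside-the-atmosphere case, and running the test against both the outer and inner spheres gives the portion of the ray that actually passes through atmosphere.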

As I want to be able to fly right down and walk around on the surface of my planet's terrain this simply wouldn't do, so instead of ray/sphere intersections I use the depth rendered by the terrain shader itself to calculate the end point of the atmosphere-intersecting ray.  The terrain is rendered in two passes: the first is a Z pre-pass which only writes to the Z buffer, while the second does all the texture sampling, blending and lighting before writing not just the final terrain colour but also a single floating point value representing the depth of that pixel to a separate depth buffer render target.  This depth buffer is a generally useful thing to have to hand: it's used to allow the edges of water geometry to alpha-blend smoothly where they intersect the terrain and ultimately for soft particle effects, but one of its most useful roles is in the atmospheric scattering, as an accurate per-pixel distance allows for proper scattering on distant terrain rather than being limited to simple distance-based fog solutions.
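Conceptually the scattering pass then picks its ray length per pixel something like the sketch below, built on the RaySphere helper and Vec3 type from the earlier snippet.  The sentinel value for sky pixels that have no terrain under them is just a placeholder here, not the real value I use:

```cpp
#include <algorithm>

// Pick the far end of the scattering ray for one pixel: the terrain surface
// where the depth render target holds a hit, otherwise the point where the
// ray leaves the outer atmosphere sphere.
double ScatterRayLength(const Vec3& cameraPos, const Vec3& rayDir,
                        double terrainDepth, double outerRadius)
{
    const double kNoTerrain = 1.0e30;      // placeholder: value the depth target is cleared to
    if (terrainDepth < kNoTerrain)
        return terrainDepth;               // per-pixel distance written by the terrain pass

    auto hit = RaySphere(cameraPos, rayDir, outerRadius);
    return hit ? std::max(hit->second, 0.0) : 0.0;   // exit of the outer sphere, or nothing to scatter
}
```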




The problem, however, is that as you fly away from the planet the polygonal nature of its geometry becomes apparent: the depth rendered by the terrain shader does not follow a true sphere but is only accurate at the vertices, with the depths along the triangle edges and in their interiors being interpolated from those values.  As the triangles are flat, these interpolated values also describe flat surfaces, so you get a faceted depth image instead of a nice smooth sphere.  Once these faceted depth values are fed through the atmospheric scattering code, areas where the erroneous interpolated depth produces points inside the inner atmosphere sphere understandably give incorrect scattering results, and hence the artifacts seen above.
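To put a rough number on how bad that interpolation error can get, here's a back-of-the-envelope calculation using an illustrative Earth-sized radius and a 2-degree triangle edge (neither of which are my actual figures) showing how far the midpoint of a chord sags below the sphere:

```cpp
#include <cmath>
#include <cstdio>

int main()
{
    const double pi      = 3.14159265358979323846;
    const double radius  = 6371000.0;            // illustrative Earth-like radius in metres
    const double halfArc = 1.0 * pi / 180.0;     // half the angle subtended by a 2-degree triangle edge

    // The midpoint of the chord sits at radius * cos(halfArc) from the centre,
    // so it sags below the sphere's surface by radius * (1 - cos(halfArc)).
    const double sag = radius * (1.0 - std::cos(halfArc));
    std::printf("chord midpoint sits %.0f m below the sphere\n", sag);   // roughly 970 m
    return 0;
}
```

Nearly a kilometre of sag is easily enough to push interpolated points well inside the inner atmosphere sphere.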

To resolve this I extended the atmospheric scattering pixel shader so that when the camera is close to the planet it uses the rendered per-pixel depth, when it's far away it uses an analytically computed ray-sphere intersection depth, and when it's somewhere in between these two thresholds it blends between the two based upon distance.  This ensures there is no visible glitch when switching from one method to the other, and by the time the camera is far enough away to use the true sphere distances the terrain details are so small you can't see the difference anyway.
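The blend itself amounts to something like the C++ sketch below; the two altitude thresholds are made-up placeholders rather than the values I actually tuned:

```cpp
#include <algorithm>

// Blend between the per-pixel terrain depth and the analytic sphere depth
// based on the camera's altitude above the planet's inner radius.
double BlendedRayDepth(double perPixelDepth, double sphereDepth, double cameraAltitude)
{
    const double kNearAltitude = 50000.0;    // placeholder: below this, trust the rendered depth entirely
    const double kFarAltitude  = 200000.0;   // placeholder: above this, trust the analytic sphere entirely

    double t = (cameraAltitude - kNearAltitude) / (kFarAltitude - kNearAltitude);
    t = std::clamp(t, 0.0, 1.0);
    t = t * t * (3.0 - 2.0 * t);             // smoothstep so there's no visible kink at either threshold

    return perPixelDepth + (sphereDepth - perPixelDepth) * t;
}
```

Since the blend factor depends only on the camera it could just as well be computed once per frame and passed to the shader as a constant rather than evaluated per pixel.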






It would be slightly more efficient to produce three shaders, each performing just one of these methods, then switch between them depending on camera distance before rendering the scattering, but I'll leave that as something for another day.  Another optimisation would be to render a quad or low-polygon disk around the screen-space bounds of the planet's atmosphere instead of a full-screen quad, to eliminate the increasingly large number of redundant pixels processed by the scattering shader as the planet becomes smaller on screen.  This one I'm more likely to do as it really is very wasteful in these situations, but for now I'm just happy to finally be rid of those ugly splotches!
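For what it's worth, a conservative version of that bounding quad could be computed on the CPU by projecting the corners of the atmosphere sphere's bounding box and taking the min/max in normalised device coordinates.  This is just a sketch of the idea (reusing the Vec3 type from the first snippet), not code from the engine:

```cpp
#include <algorithm>

struct Mat4 { double m[4][4]; };                 // row-major view-projection matrix
struct Clip { double x, y, w; };
struct Rect { double minX, minY, maxX, maxY; };  // NDC rectangle to stretch the quad over

static Clip Project(const Mat4& vp, double px, double py, double pz)
{
    return Clip {
        vp.m[0][0] * px + vp.m[0][1] * py + vp.m[0][2] * pz + vp.m[0][3],
        vp.m[1][0] * px + vp.m[1][1] * py + vp.m[1][2] * pz + vp.m[1][3],
        vp.m[3][0] * px + vp.m[3][1] * py + vp.m[3][2] * pz + vp.m[3][3] };
}

// Project the eight corners of the sphere's bounding box and take the min/max.
// Falls back to a full-screen quad if any corner lands behind the camera,
// where the projection stops being reliable.
Rect AtmosphereScreenBounds(const Mat4& viewProj, const Vec3& centre, double radius)
{
    Rect r { 1e30, 1e30, -1e30, -1e30 };
    for (int i = 0; i < 8; ++i)
    {
        const Clip c = Project(viewProj,
                               centre.x + ((i & 1) ? radius : -radius),
                               centre.y + ((i & 2) ? radius : -radius),
                               centre.z + ((i & 4) ? radius : -radius));
        if (c.w <= 0.0)
            return Rect { -1.0, -1.0, 1.0, 1.0 };          // corner behind the camera: use a full-screen quad
        r.minX = std::min(r.minX, c.x / c.w);
        r.minY = std::min(r.minY, c.y / c.w);
        r.maxX = std::max(r.maxX, c.x / c.w);
        r.maxY = std::max(r.maxY, c.y / c.w);
    }
    r.minX = std::max(r.minX, -1.0);  r.minY = std::max(r.minY, -1.0);
    r.maxX = std::min(r.maxX,  1.0);  r.maxY = std::min(r.maxY,  1.0);
    return r;
}
```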
