
Adaptive exposure with average log-luminance, using Pattanaik's technique


Hi, I have implemented adaptive camera exposure, but it doesn't seem to be working correctly. My strategy is this:

  1. render everything to an HDR color buffer
  2. downsample the HDR color buffer progressively down to 16x9 with bilinear filtering, where the first downsample converts to log-luminance (see the sketch right after this list)
  3. dispatch a compute shader to sum-reduce the 16x9 texture, divide by 144 (the average), and apply exp2 to convert back to linear luminance
  4. adapt the luminance using Pattanaik's technique, and write the result to a 1x1 texture
  5. in the tonemapping shader, apply an exposure based on the average luminance to the color, then tonemap the result (a sketch of this is further down)
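
To make step 2 concrete, the first downsample pass looks roughly like this (a simplified GLSL sketch; the texture and variable names are placeholders, not my actual code):

#version 450

// First downsample pass: sample the HDR color buffer and output
// log2-luminance instead of color. Subsequent passes just downsample
// this single-channel texture with bilinear filtering.
layout(binding = 0) uniform sampler2D uHdrColor;

layout(location = 0) in vec2 vTexCoord;
layout(location = 0) out float outLogLum;

void main()
{
    vec3 hdr = texture(uHdrColor, vTexCoord).rgb;

    // Rec. 709 luminance weights; the small epsilon avoids log2(0).
    float lum = dot(hdr, vec3(0.2126, 0.7152, 0.0722));
    outLogLum = log2(max(lum, 1e-5));
}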

For step 4, that means this piece of code (in the compute shader):

// Pattanaik/Krawczyk-style exponential adaptation toward the current
// average luminance.
float adaption = 1.0f - exp(-timeDelta * adaptionSpeed);
float newAdaption = lumPrev + (lumCurr - lumPrev) * adaption;
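
For context, those two lines sit in a compute shader that covers steps 3 and 4, roughly like this (a simplified sketch with a single invocation doing the whole reduction; the bindings and resource names are placeholders, not my actual code):

#version 450
layout(local_size_x = 1) in;

// 16x9 log-luminance texture produced by the downsample chain.
layout(binding = 0) uniform sampler2D uLogLumTex;
// 1x1 texture that persists the adapted luminance between frames.
layout(binding = 1, r32f) uniform image2D uAdaptedLum;

uniform float timeDelta;
uniform float adaptionSpeed;

void main()
{
    // Step 3: sum-reduce the 16x9 log-luminance texture, divide by 144,
    // and exp2 back to linear luminance.
    float sum = 0.0;
    for (int y = 0; y < 9; ++y)
        for (int x = 0; x < 16; ++x)
            sum += texelFetch(uLogLumTex, ivec2(x, y), 0).r;
    float lumCurr = exp2(sum / 144.0);

    // Step 4: Pattanaik-style exponential adaptation toward the current
    // average, then store the result in the 1x1 texture.
    float lumPrev = imageLoad(uAdaptedLum, ivec2(0, 0)).r;
    float adaption = 1.0 - exp(-timeDelta * adaptionSpeed);
    float newAdaption = lumPrev + (lumCurr - lumPrev) * adaption;

    imageStore(uAdaptedLum, ivec2(0, 0), vec4(newAdaption));
}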

But somehow, the adaption converges to 1.0 and stays at 1.0, no matter the brightness of the scene. I start with an artificial initial average luminance of 50.0 because it looks cool. The algorithm then gradually converges to 1.0, which looks good. But after that it never takes on any value other than 1.0, regardless of the lighting conditions in the scene: if I go to a dark area, nothing happens; if I go to a bright area, nothing happens.

Perhaps I have a bug in my application, but I just wanted to hear whether what I'm doing sounds reasonable; in my head it should be fine. I'm basically following the same approach as in github.com/TheRealMJP/BakingLab, except that I downsample to a 16x9 texture instead of all the way to 1x1, and I use a full-screen-quad fragment shader for the downsampling instead of a compute shader.
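
And for completeness, this is what I mean by step 5 in the tonemapping shader (again a simplified sketch; the key value of 0.18 and the Reinhard curve are just stand-ins for my actual operator, and the names are placeholders):

#version 450

layout(binding = 0) uniform sampler2D uHdrColor;    // scene HDR color
layout(binding = 1) uniform sampler2D uAdaptedLum;  // 1x1 adapted luminance

layout(location = 0) in vec2 vTexCoord;
layout(location = 0) out vec4 outColor;

void main()
{
    vec3 hdr = texture(uHdrColor, vTexCoord).rgb;
    float adaptedLum = texture(uAdaptedLum, vec2(0.5)).r;

    // Exposure derived from the adapted average luminance (key / avgLum).
    float key = 0.18;
    float exposure = key / max(adaptedLum, 1e-4);
    vec3 exposed = hdr * exposure;

    // Simple Reinhard curve as a placeholder for the real tonemap.
    outColor = vec4(exposed / (1.0 + exposed), 1.0);
}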

Any ideas?

