I don't see any sense in using the max intensity value, because a single pixel can be extremely bright and ruin the tonemapping.
Quote:
If the sun power or some lights are ridiculously strong you can just cap the max luma at some reasonable point.
No, as I said in a previous message, it's hard to balance adaptation against human perception. Most game developers read those docs and write the same code, but what is the result? Shit.
But the most important thing is that everybody writes unrealistic computations while at the same time trying to implement exactly the same code described in some docs; it's ridiculous. Less important is the fact that the Skyrim source data is LDR, while the docs describe HDR tonemapping techniques for real images, so the result will never be realistic. It simply can't be.
Quote:
to compress HDR range exactly to 0..1 range without losing any information
This can't look realistic. result=color/(1.0+color); already compresses to the 0..1 range without losing information, but what do you get visually?
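To illustrate the point: this is the simple Reinhard-style operator, and it is mathematically invertible, so no information is "lost" in the sense of the quoted claim; it just squeezes all bright values into a narrow band near 1.0, which is exactly why the result looks flat. A minimal sketch (function names are my own, not from any shader):

```python
def reinhard(x: float) -> float:
    """color / (1 + color): maps [0, inf) monotonically into [0, 1)."""
    return x / (1.0 + x)

def reinhard_inverse(y: float) -> float:
    """The mapping is bijective on [0, 1), so it can be undone exactly."""
    return y / (1.0 - y)

# Highlights are crushed: luminance 10 maps to ~0.909, luminance 100 to ~0.990,
# so a 10x brightness difference survives as only ~0.08 of output range.
```

So "compressing to 0..1 without losing information" is trivially achievable; the visual quality of the compression is the actual problem.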
Quote:
does that mean if I set AdaptationSensitivity to 1.0, the grayadaptation would be the maximum luminance of the current image?
No. The process of downsampling the screen to the small adaptation texture consists of many stages, and the first of them use average computation; the max-average interpolation is applied closer to the end, when the image is small enough. This guarantees minimal "noise" from small spots in the image. I don't remember exactly, but my guess is that AdaptationSensitivity is applied at the stage where the screen is reduced to 16*16 pixels.
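The staged scheme described above can be sketched roughly like this. Everything here is an assumption for illustration: the 2x2 averaging, the 16x16 cutoff, and the parameter name `sensitivity` mirror the discussion, not any actual shader source.

```python
import numpy as np

def downsample_adaptation(luma: np.ndarray, sensitivity: float) -> float:
    """Hypothetical sketch of staged downsampling for eye adaptation.

    Early stages: plain 2x2 averaging, which dilutes tiny bright spots
    so a single hot pixel cannot dominate the result.
    Final stage (once the image is ~16x16 or smaller): interpolate
    between average and max luma, weighted by `sensitivity`
    (0 = pure average, 1 = pure maximum).
    """
    img = luma.astype(np.float64)
    # Average-downsample by 2x2 blocks until the image is small enough.
    while min(img.shape) > 16:
        h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
        img = img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    avg, mx = img.mean(), img.max()
    # Max-average interpolation applied only at the end.
    return avg + (mx - avg) * sensitivity
```

With `sensitivity = 1.0` the result is the maximum of the *already averaged* small image, not the brightest pixel of the full-resolution frame, which is why isolated bright pixels cannot ruin the adaptation.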