- Values to define how far away the HDR background environment is from the camera.
The user needs to be able to select at what distance from the camera the HDR background is simulated.
In theory, the assumed distance between the HDR background and the camera should have a significant effect on several other parameters.
When simulating outdoor landscape scenes, the assumed distance from the HDR environment to the camera should be much larger than in interior scenes.
Example:
landscapes - distance from HDR background to camera: 10,000 m
interiors - distance from HDR background to camera: 10 m
One would assume that the lighting contribution of an HDR background should be calculated differently in these two extreme scenarios.
In a huge HDR skydome assumed to be 10,000 m away, the "tint" of the environment should be far less noticeable than the tint of an HDR interior background that simulates a photo studio.
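A quick back-of-envelope calculation (not Octane's actual light transport) illustrates why: if a scene object is treated as a point source and the environment as a diffuse surface at distance d, the light bounced back to the object falls off roughly with 1/d², so a studio wall at 10 m tints the scene about a million times more strongly than a skydome at 10,000 m. A minimal Python sketch:

```python
# Back-of-envelope (not Octane's actual light transport): treat a scene
# object as a point source and the environment as a diffuse surface at
# distance d. The light bounced back to the object falls off roughly
# with 1/d^2, so a close studio wall tints the scene far more strongly
# than a distant skydome.

def bounce_return_factor(distance_m: float, albedo: float = 0.5) -> float:
    """Relative strength of one bounce returned from the environment
    surface to a scene object (arbitrary units)."""
    return albedo / distance_m ** 2

interior = bounce_return_factor(10.0)       # studio wall, 10 m
exterior = bounce_return_factor(10_000.0)   # skydome, 10,000 m
print(f"interior/exterior tint strength: {interior / exterior:.0e}")  # 1e+06
```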
- - -
HDR distance in combination with scene-sized volumes
The differences in HDR lighting should be even more noticeable if a fog or haze volume is placed between the camera and the HDR background.
The effect of the fog in combination with the HDR light in a small studio should be different from its effect in a huge outdoor scene.
In an interior scene, the effect of light bounces between the HDR background and the scene geometry should be much more noticeable than in an exterior scene, where the light has to travel a much larger distance for each bounce and is much more strongly affected by any volumes it travels through.
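A minimal sketch of that difference, assuming a homogeneous fog volume and simple Beer-Lambert extinction (`sigma` is a hypothetical extinction coefficient, not an existing OR parameter):

```python
import math

# Beer-Lambert transmittance through a homogeneous fog volume; `sigma`
# is a hypothetical extinction coefficient, not an existing OR parameter.

def transmittance(path_length_m: float, sigma: float) -> float:
    """Fraction of light surviving a straight path through the fog."""
    return math.exp(-sigma * path_length_m)

sigma = 1e-4  # light haze
# Each bounce segment in the interior travels ~10 m, in the landscape
# ~10,000 m, so the fog barely touches the studio light but removes
# ~63 % of the light per segment outdoors.
print(f"interior segment  (10 m):     {transmittance(10.0, sigma):.4f}")      # 0.9990
print(f"landscape segment (10,000 m): {transmittance(10_000.0, sigma):.4f}")  # 0.3679
```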
One solution would be for OR to assume that volumes are created between the camera and the HDR background distance the user selected.
A more advanced option would be to add further controls for the distance and position at which volumes are created.
The user could then define multiple volumes, each with its own distance and relative position, to simulate fog, haze, clouds, or even the atmosphere for space scenes.
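A hypothetical parameter layout for such a volume stack could look like this (every name and value is invented for illustration; none of this is an existing OR feature):

```python
import math
from dataclasses import dataclass

# Hypothetical parameter layout for user-defined volume layers between
# the camera and the HDR background; all names and values are invented
# for illustration.

@dataclass
class VolumeLayer:
    name: str
    near_m: float   # distance from camera where the layer starts
    far_m: float    # distance from camera where the layer ends
    sigma: float    # extinction coefficient inside the layer

# Example stack for a space scene looking down through the atmosphere:
layers = [
    VolumeLayer("ground haze",      0.0,      500.0,   2e-4),
    VolumeLayer("cloud deck",       1_000.0,  2_000.0, 5e-3),
    VolumeLayer("upper atmosphere", 2_000.0, 80_000.0, 1e-6),
]

# Combined transmittance from camera to HDR background through all layers:
total = math.prod(math.exp(-l.sigma * (l.far_m - l.near_m)) for l in layers)
print(f"camera-to-background transmittance: {total:.4f}")
```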
- - -
Camera movement linked to HDR background
Currently it seems that when the camera is moved along the x axis in an animation, the visible section of the background HDR is not automatically adjusted based on scene size.
The user has to rotate the HDR background manually to match the camera translation on the x axis.
In a landscape scene in which the HDR background is 10,000 m away, it makes sense that a small x translation of the camera does not result in a large shift of the visible HDR background section.
In an interior scene, however, a small x translation of the camera should have a large effect on the visible section of the HDR background.
Instead of the user "just moving some sliders until things look about right", the shift of the HDR environment needed to compensate for camera movement on the x or y axis could be calculated automatically based on the HDR distance the user selected.
- - -
Side Note:
I stumbled on this while trying to figure out how camera motion blur works in OR.
Without any way to adjust how far away the HDR background is, camera motion blur just seems to look "wrong" in combination with HDR backgrounds.
compare:
http://render.otoy.com/forum/viewtopic. ... 41#p244266
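As a rough illustration of the motion blur point: with the environment treated as infinitely far away, camera translation produces no background blur at all, while a finite HDR distance implies an angular smear of roughly atan(v * shutter_time / distance) during the exposure. A sketch with invented names and values:

```python
import math

# Rough illustration: treated as infinitely far away, the background
# gets no translational blur at all; at a finite HDR distance a camera
# moving at speed v should smear the background by roughly
# atan(v * shutter_time / distance) per exposure.
# All names and values are invented for illustration.

def background_smear_deg(speed_mps: float, shutter_s: float,
                         hdr_distance_m: float) -> float:
    """Apparent angular smear of a background point straight ahead."""
    return math.degrees(math.atan2(speed_mps * shutter_s, hdr_distance_m))

print(background_smear_deg(2.0, 1 / 50, 10.0))      # interior:  ~0.23 deg
print(background_smear_deg(2.0, 1 / 50, 10_000.0))  # landscape: ~0.00023 deg
```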
Summary:
The user should be able to select at what distance from the camera the HDR background is simulated.
Light bounces, infinite fog and haze volumes, camera motion blur, etc. would then be calculated realistically based on the distance between the camera and the HDR background and the user's scene volume parameters.
Camera movement and HDR background placement could be linked to simulate a realistic interaction in animations based on scene size.
- - -