Both - but mostly animations of 3D models for inclusion in AR experiences.
The goal is to get 3D models in AR looking as photorealistic as possible. I wanted to use this Opera House example as a learning experience.
The current approach is to 1) use a PBR global lightmapper to bake the building's lighting, then 2) apply real-time AR ambient lighting on top.
The hypothesis is that the object will look more real because of the baking. However, it won't quite look right in AR, because the baking happened in an artificial environment unlike the present AR environment (but baking first and then using ambient AR lighting for soft shadows on the pre-baked object seems like the best we can do these days). So when you rotate the Opera House around on the table, for example, it doesn't look natural, because the pre-baked lighting does not reflect the true ambient lighting. Is this right, or is there perhaps a better way?
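To make the mismatch concrete, here's a toy numeric sketch (not any engine's API - the normals, light directions, and Lambertian-only shading are all illustrative assumptions): the baked value is frozen into the lightmap texture and rotates with the object, while physically correct shading would re-evaluate N·L against the real environment's light each frame.

```python
import numpy as np

def lambert(normal, light_dir):
    """Clamped Lambertian term N.L for a unit normal and light direction."""
    return max(0.0, float(np.dot(normal, light_dir)))

def rot_y(deg):
    """Rotation about the vertical (table) axis."""
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

# One surface normal on the model, and the light direction that existed
# in the artificial scene at bake time (both hypothetical values).
normal = np.array([1.0, 0.0, 0.0])
bake_light = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)

# The baked shading is stored in the texture, so it rides along with the
# object no matter how the user spins it on the table.
baked = lambert(normal, bake_light)

# Correct shading would re-evaluate N.L against the *real* ambient light
# (assumed here to coincide with the bake light when the object spawns).
real_light = bake_light
for deg in (0, 45, 90):
    true_shade = lambert(rot_y(deg) @ normal, real_light)
    print(f"{deg:3d} deg: baked={baked:.3f}  true={true_shade:.3f}")
```

At 0° the two agree, but as the rotation grows the baked value stays constant while the true shading falls off - which is exactly the "doesn't look natural when rotated" effect described above.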
Sorry, what do you mean by 'viewing baking'?