Hi guys,
I seem to be having a real problem with UV baking in Octane at the moment. Repeating distortions appear in the edges of the UV bakes I make, in particular around the light sources.
Is there anything that can be done about them? Currently this is ruining the render I'm trying to make, as when the bake is reduced to low resolution by mipmapping these distortions creep into the edges of the model in Unity. Is there any way to clean this up?
If anyone could help me resolve this I would be ecstatic!
Draydin_r
Octane UV Bake Error
Hi Draydin_r,
Difficult to say from your screenshot.
If you can create and share a simple example scene that shows the issue, even via PM, it would be very useful.
ciao beppe
Noise may show up at the edges of your UV geometry if a low number of samples actually hit that area. That noise may spread along the padding area, which is what you see.
You may try to reduce the Edge noise tolerance of your baking camera settings so these hot pixels are removed during the baking process: https://docs.otoy.com/StN_H/StandaloneM ... Camera.htm
I hope that helps.
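To illustrate the idea behind such an edge-noise filter (a minimal sketch only, not Octane's actual implementation; the function name `suppress_edge_hotpixels` and its `tolerance` parameter are hypothetical), here is how hot texels at the border of a UV island could be clamped against their fully covered neighbours:

```python
import numpy as np

def suppress_edge_hotpixels(img, alpha, tolerance=0.5):
    """Clamp overly bright texels at UV-island borders to the average
    of their fully covered neighbours. Hypothetical sketch, not
    Octane's actual edge-noise-tolerance filter."""
    out = img.copy()
    h, w = img.shape
    interior = alpha >= 1.0  # fully covered (trustworthy) texels
    for y in range(h):
        for x in range(w):
            if interior[y, x]:
                continue  # only process partially covered edge texels
            # gather the fully covered texels in the 3x3 neighbourhood
            ys = slice(max(y - 1, 0), min(y + 2, h))
            xs = slice(max(x - 1, 0), min(x + 2, w))
            nb = img[ys, xs][interior[ys, xs]]
            if nb.size == 0:
                continue
            ref = nb.mean()
            # replace texels more than `tolerance` above their neighbours
            if img[y, x] > ref * (1.0 + tolerance):
                out[y, x] = ref
    return out
```

Lowering the tolerance makes the filter more aggressive, which is why a hot pixel that survives even at a low setting usually means the whole edge region is under-sampled rather than a single fired texel.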
Thanks for the response! I have the edge tolerance cranked down to 0.0 and this is still the result I'm getting. One thing I have noticed is how ragged the edges of the UVs can get.
Here are the results with edge tolerance turned to 0.0 and the edging brought to 16 pixels.
My biggest issue is that the edges are so jagged for what should be straight lines, and as a result there are sections that simply will not clean up even with adaptive sampling. I suspect that this could be the source of my troubles. Is there anything that can resolve this?
Increasing the Filter size seems to help a bit but does not remove the problem. Is there anything that you can suggest? If you would like to take a look at the scene file send me a message and I can fire off the package for you to look at.
Looking forward to hearing back about this!
Draydin_r
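As an aside on why mipmapping drags these edge errors into the model: each mip level halves the bake's resolution, so a fixed padding border shrinks by a factor of two per level, and once it drops below one texel the neighbouring island (or the noisy gutter) bleeds in. A quick back-of-the-envelope helper (hypothetical function name, standard mipmap arithmetic):

```python
def padding_for_mips(mip_levels):
    """Minimum lightmap padding, in texels at full resolution, that
    still leaves at least one texel of padding after `mip_levels`
    halvings. Padding shrinks by a factor of 2 per mip level, so the
    requirement is 2**mip_levels texels."""
    return 2 ** mip_levels

# A bake that Unity mips down 4 levels (e.g. 1024px -> 64px) needs
# 16 texels of padding, which matches the 16-pixel edging above.
```

So the 16-pixel edging is only safe down to the fourth mip level; if the artifacts appear at lower mips than that, widening the padding alone won't fix them and the noisy edge texels themselves have to be cleaned up.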
Last edited by Draydin_r on Wed Aug 02, 2017 12:34 am, edited 1 time in total.
Intel Core i7 CPU 6850K @ 3.60GHz 128.0 GB Ram / Win 10 64 bit / 1x GeForce RTX 3080
Thank you for sharing your scene.
What you see here is pixels with a low sample count, because they sit right at the edge of your UV geometry and/or no light reaches them; if you used an alpha channel they would be nearly transparent, as you can see here.
These low-sample pixels are used to extrapolate your lightmap padding, which causes that effect. What I think is happening is that, because your scene in general takes a lot of samples to converge, these edge pixels are especially affected and take even longer to clean up.
Can you see any artifacts after mapping the baked image back to your mesh?
This is from the final image exported from Octane (one of my earlier renders), so it is just an example alongside the in-viewport renders in the other examples.
The problem is that even with 100K samples they never clean up. This is a rather serious issue, as at low resolution mipmapping occasionally shows significant errors (blotchy lines that repeat across an edge or in a corner, etc.). This would make the lighting extremely difficult to implement, as I would have to clean up the maps in Photoshop after every render. That kind of time expenditure is impossible for me to fit into a development timeline; I would easily log 50 hours per room on top of render time in the long run.
Is there no way to interpolate the edges to account for these kinds of errors? Part of the issue is that I'm even seeing these in corners that are extremely well lit. The irony is that a few versions back I didn't have this issue, but for the life of me I can't find the version I was using that gave me such good results. Is there anything else that could be causing this in the scene?
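The per-render Photoshop cleanup described above could in principle be scripted. A minimal sketch of standard lightmap dilation, assuming the bake is exported with a coverage/alpha channel so untrustworthy edge texels can be identified (the function name and `valid` mask are assumptions, not an Octane or Unity feature):

```python
import numpy as np

def dilate_lightmap(img, valid, passes=16):
    """Flood-fill invalid/low-sample texels from neighbouring valid
    texels, one texel ring per pass. `valid` marks texels with a
    trustworthy sample count; everything else gets overwritten by the
    average of its valid 4-neighbours. Note: np.roll wraps at the image
    borders, which is acceptable for a sketch but would bleed across
    opposite edges of a real atlas."""
    img = img.copy()
    valid = valid.copy()
    for _ in range(passes):
        if valid.all():
            break
        acc = np.zeros(img.shape, dtype=float)
        cnt = np.zeros(img.shape, dtype=int)
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            shifted = np.roll(img, (dy, dx), axis=(0, 1))
            svalid = np.roll(valid, (dy, dx), axis=(0, 1))
            acc += np.where(svalid, shifted, 0.0)
            cnt += svalid
        fill = ~valid & (cnt > 0)  # invalid texels with a valid neighbour
        img[fill] = acc[fill] / cnt[fill]
        valid |= fill
    return img
```

Run per channel after each bake, this overwrites the noisy gutter with colours extrapolated from well-sampled texels, which is essentially what manual clone-stamping in Photoshop achieves.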