What you see here are pixels with a low sample count because they sit right at the edge of your UV geometry and/or receive no light. If you used an alpha channel, they would be nearly transparent, as you can see here:
This is from a final image exported from Octane (one of my earlier renders), so it's just an example of what the in-viewport renders in the other examples show.
These low-sample pixels are used to extrapolate your lightmap padding, which causes that effect. What I think is happening is that because your scene in general takes a lot of samples to converge, these edge pixels are hit especially hard and take even longer to clean up.
The problem is that even with 100K samples they never clean up. This is a rather serious issue, because at low resolution, mip mapping shows rather significant errors (blotchy lines that repeat across an edge, in corners, etc.). This would make the lighting extremely difficult to implement, as I would have to clean up the maps in Photoshop after every render. That kind of time expenditure is impossible for me to fit into a development timeline; in the long run I would easily log 50 hours per room on top of render time.
Distortions_08.JPG
Is there no way to interpolate the edges to account for these kinds of errors? Part of the issue is that I'm even seeing these in corners that are extremely well lit. The irony is that a few versions back I didn't have this issue, but for the life of me I can't find the version I was using that gave me such good results. Is there anything else in the scene that could be causing this?
Distortions_09.JPG
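For what it's worth, the Photoshop cleanup step could probably be scripted. Below is a rough sketch of the kind of edge-dilation pass I mean: it replaces low-coverage edge texels with the average of their valid neighbours, so the padding is filled from clean interior pixels instead of the noisy edge samples. This assumes you can export the lightmap with its alpha channel; the function name, the 0.5 alpha threshold, and the pass count are my own choices, not anything Octane provides.

```python
import numpy as np

def dilate_edge_texels(rgba, alpha_threshold=0.5, passes=4):
    """Replace low-coverage edge texels in an RGBA lightmap with the
    average of their valid 8-neighbours. Each pass bleeds the fill one
    texel further outward into the padding region."""
    img = rgba.astype(np.float64).copy()
    valid = img[..., 3] >= alpha_threshold  # texels we trust
    h, w = valid.shape
    for _ in range(passes):
        out = img.copy()
        filled = valid.copy()
        for y in range(h):
            for x in range(w):
                if valid[y, x]:
                    continue  # keep trusted texels as-is
                acc = np.zeros(3)
                n = 0
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and valid[ny, nx]:
                            acc += img[ny, nx, :3]
                            n += 1
                if n:  # at least one valid neighbour to borrow from
                    out[y, x, :3] = acc / n
                    out[y, x, 3] = 1.0
                    filled[y, x] = True
        img, valid = out, filled
    return img
```

This is essentially the same "dilate/solidify" trick most bakers apply internally; running it as a post-process would at least replace the per-render Photoshop work with one batch script.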
Intel Core i7-6850K @ 3.60 GHz / 128 GB RAM / Win 10 64-bit / 1x GeForce RTX 3080