out of core not working as expected: Mac/Win 3.07R2
Posted: Sun Jun 10, 2018 3:47 pm
Hey Guys,
Hoping someone can shed some light on something for me.
Have a mix of GPUs here: 1080Tis, 1070s, and 980Tis. Only ever one type per box.
Netrender on 3.07R2 works great on all of them (yes yes will update soon, it's super stable so whatever).
I have Out of Core on the slave on the 980Ti box set to 8GB. So, 6GB on the GPUs plus 8GB OOC, sounds great.
My scenes hover around 6.27GB of the 8GB VRAM. The OOC on the 1070s handles this fine, but OOC on the 980Tis doesn't seem to do anything. Actually, it makes things worse.
If I netrender with the 1080Tis and 1070s, everything is great. If I then add the 980Ti box to the mix, rendering actually slows down by a minute or two per frame.
This is odd, because with a small scene there's no slowdown, so it's not network speed or transfer. So it must be OOC? I'd heard OOC was slow, but this seems crazy slow.
Is this the problem? Or could it be something else?
Thanks