I'm having a little trouble baking out textures, especially when I exceed 2,999 x 2,999 pixels. All my GPUs fail no matter which kernel or combination of settings I try. I've updated to the latest Standalone build and still have the issue. I need to hit 8K bakes ASAP.
Does anyone know of this issue and/or a fix for texture baking in Standalone?
I have to say, in all the years I've been using Octane, this is the very first time I've ever encountered an issue, and it has kind of caught me off guard.
As it turns out, if I only enable the main GPU (the one my monitors are plugged into), I can bake textures at 8,192 pixels.
If I enable any other GPUs, they all fail.
I have two GTX 980 Tis, two GTX 580s, and one GTX 670.
I use one of the GTX 980 Tis for the monitors.
I created a simple test scene with a floor, a bench, and one mesh light... nothing awesome. The memory at one parallel sample with a baking camera comes to 402 MB, although I've seen it come to 900 MB on the exact same scene after a restart. Either way, this is well below the maximum memory each of these GPUs can handle.
Dear Octane team, I hope this gives you a starting point. Please help. I spent countless hours setting up a scene to bake, and now it seems impossible.
When using the baking camera, Octane requires slightly more memory to render the same scene than it does with other cameras, because it needs to keep some additional data structures for baking and acceleration.
Remember that Octane cannot use more memory than the smallest memory size among all the cards that are currently enabled. When you disable your 580s and 670 you're effectively reducing your processing power but increasing the available memory from 1.5 GB to 6 GB.
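To make that rule concrete, here's a minimal sketch of the budget calculation, using the cards mentioned in this thread as an example (the 670's size depends on the model, so adjust the numbers to your own setup):

```python
# Rough sketch of the VRAM budget rule: the usable scene memory is the
# SMALLEST card among the enabled GPUs, not the sum of all of them.
# The sizes below (in MB) are just this thread's cards as an example;
# the 670 ships with 2 GB or 4 GB depending on the model.
enabled_gpus = {
    "GTX 980 Ti (display)": 6144,
    "GTX 980 Ti": 6144,
    "GTX 580": 1536,
    "GTX 580 #2": 1536,
    "GTX 670": 2048,
}

print("Budget with everything enabled:", min(enabled_gpus.values()), "MB")  # 1536 MB

# Disabling the 580s and the 670 raises the floor to the 980 Tis' 6 GB.
only_980tis = {name: mb for name, mb in enabled_gpus.items() if mb >= 6144}
print("Budget with only the 980 Tis:", min(only_980tis.values()), "MB")     # 6144 MB
```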
However, I'm not too sure this is actually what's causing your issue, since your scene doesn't seem to reach the 1.5 GB limit in your case. Could you please share your scene via PM?
Right, I have taken the memory aspect into consideration, but the test scene still gives me the same result, so I will send you that first. It's a floor and a random bench I simply UV'd.
It's not a memory issue here. One 980 Ti runs fine, but two 980 Tis fail?
I appreciate any help in this matter. Happy New Year.
A small reminder from old threads on this forum: it seems some users had issues with bakes on slower interfaces (x1 lanes). I'm not sure what you use or whether that issue was ever solved, but it's worth taking a look as well.
Thank you very much for the advice. I do in fact think you are right. I tested this theory by disabling my x1-lane GPUs for high-resolution bakes, and I haven't seen it crash since.
This is definitely on the right track, but it's still a huge issue for me, as my other 980 Ti and the 670 are on the x1 lanes.
Here is my GPU Setup.
x16 = 980 Ti
x16 = 580
x16 = 580
x1 = 980 Ti
x1 = 670 FTW
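In case it helps anyone reproduce this, here's a quick way to confirm what link width each card actually negotiated. It just shells out to nvidia-smi, which ships with the NVIDIA driver; older Fermi cards like the 580s may report N/A for some fields, so treat it as a rough check:

```python
# Query each GPU's current vs. maximum PCIe link width via nvidia-smi,
# to confirm which boards are really running at x1 through the risers.
import subprocess

fields = "name,pcie.link.width.current,pcie.link.width.max"
out = subprocess.run(
    ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.strip().splitlines():
    name, current, maximum = (part.strip() for part in line.split(","))
    print(f"{name}: running at x{current} (board supports x{maximum})")
```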
Anyway, all GPUs, including the ones on the x1 lanes, work fine at lower-resolution bakes. When I turn the resolution up to 3K-8K, everything fails, even though we're still under the maximum VRAM limit.
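For what it's worth, here's a back-of-the-envelope estimate of just the bake output buffer, assuming a 32-bit float RGBA target per pixel; Octane's actual internal buffers aren't documented here, so this is only an illustration of how quickly the footprint grows with resolution:

```python
# Rough size of a bake target at W x H pixels, assuming 4 channels (RGBA)
# stored as 32-bit floats. This is only the output buffer; the scene,
# acceleration structures and any extra baking data come on top of it.
def bake_buffer_mb(width, height, channels=4, bytes_per_channel=4):
    return width * height * channels * bytes_per_channel / (1024 ** 2)

for size in (2048, 2999, 4096, 8192):
    print(f"{size} x {size}: ~{bake_buffer_mb(size, size):,.0f} MB")
# Prints roughly 64, 137, 256 and 1,024 MB; an 8K float buffer alone is
# already a big slice of a 1.5 GB card before the scene is even loaded.
```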
Turning off the x1 lanes is a workaround, but I kind of need them.
Yeah, one guy on the forum noticed this using an Amfeltec GPU cluster (which connects via x1). At that time I was building something based on Amfeltec's backplane, which is also x1, but in my case it worked without issues. I'm not sure whether this issue was ever solved, since I don't use that function myself and still prefer to have GPUs on at least x4. However, I could try to find the thread where we discussed this issue; you could read a bit more there or even ask the user directly what his solution was. Please bump this if you don't get the link within the next hour or so (I'll post it here =).
Hmm.
No expansion clusters here, just risers from the motherboard's x1 slots. The motherboard has three x16 slots and two x1 slots, no x4.
I've never had any issues with anything, even rendering 10K still images with all GPUs enabled.
OK, I could not find the exact link to the post/thread, but you can drop a line to Notius. He's pretty friendly, and I'm sure he would answer any of your questions =)