
OctaneRender™ 4.0 for LightWave™ - Beta build 4.00.0.8 XB3
Moderator: juanjgon
Thanks Juanjo, works perfectly for us.


lightvu wrote: The hype for 4.0 is great, and I really want to use the denoiser feature.
Is this version safe to use right now? What I mean is, is it just as stable as the latest 3.x build, but with the new features?
Or, if I am in production mode, should I stick with the 3.x version I have installed?

You should not change the Octane version in the middle of a production, especially not to a beta build like this one.
Thanks,
-Juanjo
- lstudios3d
- Posts: 38
- Joined: Wed Mar 08, 2017 10:55 pm
I had the render abort right away when trying to render with a really high number of instances and the denoiser active; not overly surprised, assuming it's maxing out the memory on my cards. I did, however, come across a bug while trying to find out how far I had to reduce the instance count to get it to render. I duplicated the instance generator, turned one of them off, and then lowered the numbers on the other one. When I went to render, the instances weren't rendering; as soon as I deleted the instance generator that was turned off, the instances rendered again.

I didn't test whether the order in the stack made a difference, but it seems that at least if the top instance generator is turned off, Octane doesn't see anything else in the stack. I can put the scene back to how it was when it was having the problem and save it out if you need it, but I think it will be easy to reproduce in a simple scene.
lstudios3d wrote: I didn't test whether the order in the stack made a difference, but it seems that at least if the top instance generator is turned off, Octane doesn't see anything else in the stack. I can put the scene back to how it was when it was having the problem and save it out if you need it, but I think it will be easy to reproduce in a simple scene.

This is a known issue related to a limitation in the LightWave SDK. The plugin can only get the enable state of the first instance generator, so that state is used for all the generators on the same instancer.
Thanks,
-Juanjo
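To make that limitation a bit more concrete, here is a minimal C++ sketch of the behavior described above. The struct and function names are hypothetical stand-ins rather than real LightWave SDK or plugin symbols; the point is only that the single enable flag the SDK exposes (the first generator's) ends up governing the whole stack.

// Hypothetical sketch of the limitation described above.
// "InstanceGenerator" and countRenderableInstances() are illustrative
// stand-ins, not real LightWave SDK or Octane plugin symbols.
#include <vector>

struct InstanceGenerator {
    bool enabled;        // per-generator flag set in the LightWave UI
    int  instanceCount;  // instances this generator would emit
};

// The SDK only reports the enable state of the *first* generator on an
// instancer, so the plugin has to apply that single flag to every
// generator in the stack.
int countRenderableInstances(const std::vector<InstanceGenerator>& stack) {
    if (stack.empty() || !stack.front().enabled)
        return 0;  // first generator disabled -> the whole stack is skipped

    int total = 0;
    for (const InstanceGenerator& gen : stack)
        total += gen.instanceCount;  // individual enable flags can't be honored
    return total;
}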
Hi,
I'm having trouble rendering scenes that render with 3.08.1.0 without any issue.
HW setup: 2x 1080 Ti, 1x 1080 and 1x 780 Ti, 64 GB RAM.
3.08 renders fine with the out-of-core setting at 24 GB.
I have two scenes, one quite complex with a lot of instances.
The 4.0 beta just reports a render failure with out of memory on different cards. Out-of-core is also set to 24 GB, and the log reports out-of-core memory used 0/24576, so that should not be the real issue?
This is without denoising.
Even with only the 1080 Ti that has no display attached enabled, it still won't render.
Loading another scene with the 4.0 beta, I can render every shot I want without denoising active.
When I enable the denoiser (active on all cards), it fails.
When I enable the denoiser only on the 1080 Ti without a monitor (maximum VRAM available), it still fails.
Switching back to 3.08, I can render all scenes with all cards.
Log with failure:
00:00:26 (0026.86) * OCTANE API MSG: CUDA error 2 on device 2: out of memory
00:00:26 (0026.86) * OCTANE API MSG: -> failed to allocate device memory
00:00:26 (0026.86) * OCTANE API MSG: device 2: failed to allocate film buffer
Log of a successful rendering:
00:00:52 (0052.48) | [profile] Function "GetImage" over "" execution time: 0.238 seconds
00:00:52 (0052.48) | ()()() Rendering done
00:00:52 (0052.54) | <> FRAME 499 done, render time 26 seconds
00:00:52 (0052.54) | <> Mem Used/Free/Total: 2354 / 0 / 3072
00:00:52 (0052.54) | <> Triangles/DisplaceTris/Hairs/Objects/Voxels rendered: 1653184 / 0 / 0 / 900080 / 38452266
00:00:52 (0052.54) | <> Textures RGB32/RGB64/GREY8/GREY16 used: 218 / 0 / 0 / 0
00:00:52 (0052.54) |
00:00:52 (0052.54) | Close and free scene, free buffer: 0, reset scene: 1
00:00:52 (0052.54) | ... setRenderTargetNode(NULL)
00:00:52 (0052.90) | ... update()
00:00:52 (0052.90) | ... getRootNodeGraph()->clear()
00:00:53 (0053.12) | >>> Refresh preview window
00:00:53 (0053.13) | >>> Draw preview window progress bar
00:00:53 (0053.13) | >>> Refresh preview window done
00:00:53 (0053.13) | Scene closed
00:00:53 (0053.13) |
00:00:53 (0053.13) | Close preview window
00:00:53 (0053.14) |
00:00:53 (0053.14) | Octane Render for Lightwave, end of log system
Is memory usage so much higher with 4.0 than in 3.0x?
hkleton wrote: Hi,
I'm having trouble rendering scenes that render with 3.08.1.0 without any issue.
HW setup: 2x 1080 Ti, 1x 1080 and 1x 780 Ti, 64 GB RAM.
Is memory usage so much higher with 4.0 than in 3.0x?

In theory, only the denoiser needs additional memory, and the new out-of-core geometry feature should help. I've seen reports of the same problem from other plugins, so this is probably an Octane core issue. If you can send me a scene that reproduces the problem here, that would help.
Thanks,
-Juanjo
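As a side note for anyone chasing these out-of-memory failures: a small standalone CUDA runtime program can report how much VRAM each card has free before the denoiser's extra film buffers are allocated. This is just an illustrative sketch, assuming the CUDA toolkit is installed; it is not part of the Octane plugin or its API.

// Standalone sketch (not part of the Octane plugin): print free/total
// VRAM per CUDA device, to see how much headroom is left for the
// denoiser's additional film buffers.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int deviceCount = 0;
    cudaGetDeviceCount(&deviceCount);

    for (int dev = 0; dev < deviceCount; ++dev) {
        cudaSetDevice(dev);
        size_t freeBytes = 0, totalBytes = 0;
        cudaMemGetInfo(&freeBytes, &totalBytes);
        std::printf("device %d: %zu MB free of %zu MB total\n",
                    dev, freeBytes >> 20, totalBytes >> 20);
    }
    return 0;
}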
- FrankPooleFloating
- Posts: 1669
- Joined: Thu Nov 29, 2012 3:48 pm
Yo Juanjo, what is up with that cudnn64_7.dll weighing in at a whopping 332 MB?!!!
The CUDA file we put in the bin folder previously was like 356 KB!!! Zoinks!

Win10Pro || GA-X99-SOC-Champion || i7 5820k w/ H60 || 32GB DDR4 || 3x EVGA RTX 2070 Super Hybrid || EVGA Supernova G2 1300W || Tt Core X9 || LightWave Plug (v4 for old gigs) || Blender E-Cycles
FrankPooleFloating wrote: Yo Juanjo, what is up with that cudnn64_7.dll weighing in at a whopping 332 MB?!!! The CUDA file we put in the bin folder previously was like 356 KB!!! Zoinks!

This is all the NVIDIA AI and deep learning stuff. Remember that you no longer need to copy anything to the LW/bin folder. All the Octane system files must be located in the same folder as the .p plugin, as they are in the .zip file.
Thanks,
-Juanjo
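If you want to double-check the file layout after unpacking the .zip, a short C++17 sketch like the one below can confirm that the support DLLs sit next to the .p plugin rather than in LW/bin. The plugin path and file list here are placeholders for illustration only; adjust them to your actual install.

// C++17 sketch: verify that the Octane support files sit in the same
// folder as the .p plugin. The path and file names are placeholders.
#include <cstdio>
#include <filesystem>

namespace fs = std::filesystem;

int main() {
    const fs::path pluginDir = "C:/LightWave/Plugins/Octane";  // adjust to your install
    const char* required[] = { "cudnn64_7.dll" };

    for (const char* name : required) {
        const bool found = fs::exists(pluginDir / name);
        std::printf("%-16s %s\n", name, found ? "OK" : "MISSING");
    }
    return 0;
}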