Hi. I am trying to use Octane Render but it never renders anything. I am using a GTX 1050.
I can use the demo version fine, even with meshes that have millions of vertices, and it runs very well. Rendering in Blender with millions of vertices also works perfectly fine.
However, even the simplest scene in Unreal won't work. It actually worked for about two minutes at first, but then it stopped and hasn't worked since.
Does Unreal Engine + Octane really push my GPU that hard? The scenes have literally nothing in them, yet it still fails.
GPU keeps failing
Moderator: ChrisHekman
- ChrisHekman
- Posts: 1062
- Joined: Wed Jan 18, 2017 3:09 pm
Could you share the octane logs with me?
You can view the logs by going to the octane dropdown button on the main unreal toolbar.
Then click OctaneGUI.
Then go to the top toolbar in the new Octane GUI window and go to "windows" and then select "create log window".
In the meantime you could try to update your GPU drivers.
- lukaskawalec
- Posts: 8
- Joined: Thu Oct 17, 2019 2:23 am
I already updated my drivers. Same issue.
For some clarification: If I open an Octane file (import ORBX scene) then I can get it to consistently work, and I can go from there.
If I open an Unreal Engine level ("blank", for example) it will instantly fail. Here are the logs for that scenario.
"Started logging on 27.03.20 19:27:01
OctaneRender Prime 2020.1 RC2 (8000004)
OptiX context initialized more than once
CUDA error 2 on device 0: out of memory
-> failed to allocate device memory
device 0: failed to allocate film buffer
denoiserThread0 : Initialization failed. Restart required"
I also have Out of core memory enabled.
I find it hard to believe my graphics card is out of memory on a blank Unreal Engine scene when it is fine if I load an ORBX scene... maybe something in the Unreal level is very unoptimized? My card has 2 GB of VRAM, by the way, which I know isn't a lot, but is it really not enough?
- ChrisHekman
- Posts: 1062
- Joined: Wed Jan 18, 2017 3:09 pm
I see an Optix message, do you have RTX on?
You can check how much VRAM is available in the Octane GUI:
Go to "File" and select Preferences.
Then go to Devices.
Could you tell me how much is still available? (preferably share me a screenshot of your window)
*Make sure you have activated rendering by pressing render on a rendertarget.
The window should look like this:
- lukaskawalec
- Posts: 8
- Joined: Thu Oct 17, 2019 2:23 am
I started a new project, so now there is no Optix in the logs. Logs are the same except for the last line + the optix message.

Maybe my GPU simply does not have enough memory for this?
Once again though, it does work fine if I import an Octane scene and work from there. Here is that same scene when doing that.


- lukaskawalec
- Posts: 8
- Joined: Thu Oct 17, 2019 2:23 am
***UPDATE***
I think it's a denoising issue. I looked up the error message "denoiserThread0" on Google and got one result, and the solution was to use CPU memory for this. viewtopic.php?f=27&t=70309
I don't know how to do this. Can you help me with that?
- ChrisHekman
- Posts: 1062
- Joined: Wed Jan 18, 2017 3:09 pm
CPU memory is an option in the out-of-core tab under the devices tab.
Just to be clear, the above picture with 1.2 GB of engine runtime memory is in Octane for Unreal Engine.
And the picture with 6.21 GB of engine runtime memory is in Octane standalone?
Not sure what you mean by "import an octane scene and work from there"
- ChrisHekman
- Posts: 1062
- Joined: Wed Jan 18, 2017 3:09 pm
Two other things that could be the issue.
What is the size of your octane viewport window in unreal?
The automatic resizer will increase the film buffer size to match the window size, which can use a lot of memory if the window is as large as your screen.
Parallel samples might also be the issue; you could see if the memory footprint shrinks when you decrease the parallel samples in the rendertarget kernel.
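To see why parallel samples can matter so much on a 2 GB card, here is a rough back-of-envelope sketch. The `bytes_per_pixel` value and the idea that the footprint scales linearly with parallel samples are assumptions for illustration, not Octane's documented internals:

```python
# Rough estimate of film buffer memory vs. parallel samples.
# Assumption (not from Octane docs): RGBA float32 per pixel (16 bytes)
# and one buffer's worth of data per parallel sample.

def film_buffer_mb(width, height, parallel_samples, bytes_per_pixel=16):
    """Estimate film buffer memory in MB for a given viewport size."""
    return width * height * bytes_per_pixel * parallel_samples / (1024 ** 2)

# A full-HD viewport at 16 parallel samples vs. 8:
print(round(film_buffer_mb(1920, 1080, 16)))  # ~506 MB
print(round(film_buffer_mb(1920, 1080, 8)))   # ~253 MB
```

Whatever the real per-sample cost is, halving the parallel samples halves that part of the footprint, which is a large saving when total VRAM is only 2 GB.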
- lukaskawalec
- Posts: 8
- Joined: Thu Oct 17, 2019 2:23 am
Wow! Reducing the parallel samples fixed it! The memory usage is way down because of that too. Thank you so much! (Also could you explain what parallel samples is exactly?)
This was the issue: When I start a blank file and have Unreal create a Rendertarget node for me, parallel samples defaults to 16 and I can't render.
When I import an ORBX scene, the render target node was already made, and parallel samples was set to 8. That's why I could render stuff when I imported an ORBX scene into Unreal but not when I started from the Blank level.
Both of the pictures above are in Unreal. The top one is when I start from the "Blank" level, and the bottom one is when I import an ORBX scene and load that instead.
Thank you again!