ScreamerNet uses both GPUs

Newtek Lightwave 3D (exporter developed by holocube, Integrated Plugin developed by juanjgon)

Moderator: juanjgon

ScreamerNet uses both GPUs

Postby Rainbow-3D » Wed Jan 09, 2019 11:57 am

Hi,
my computer has an RTX 2080, which I use only for rendering, and a GTX 750 Ti, which I don't use for rendering but only for working (OpenGL and so on), so I can keep working in LightWave and at the same time render the scenes, one after the other, as they are ready.
To do this I use ScreamerNet for the rendering.
The problem is that ScreamerNet renders on both GPUs, but I want it to use only the 2080.
Is there a way to force ScreamerNet not to use the 750 Ti?
In the attached image you can see my settings in the renderer options.

Thank you for the help.
Attachments
Cattura3.JPG
Win 7 x64 | Asus P8P67-M - i7 2600K | 16 GB RAM | 1 x nVidia RTX 2080
Win 11 | i7 13700K | 64 GB RAM | 1 x nVidia RTX 4090
Rainbow-3D
Licensed Customer
Posts: 63
Joined: Tue Dec 11, 2018 7:27 pm

Re: ScreamerNet uses both GPUs

Postby juanjgon » Wed Jan 09, 2019 11:58 pm

Hi,

To configure the render nodes for LWSN you need to add a plain text config file named octane_node.cfg to the C:\ drive of each render node. This config file holds some basic information about the Octane configuration, and it must have the following format:

Code: Select all
"UserID" (Not used in Octane 3 or 4, it can be any string)
"UserPassword" (Not used in Octane 3 or 4, it can be any string)
Out-of-core enabled (0 or 1)
Out-of-core RAM usage limit [GB]
Out-of-core GPU head room [MB]
"X:\path\to\TFD\TFD_loader_64.p" (Can be any string if you don’t have the TFD plugin)
GPU0 enabled (0 or 1, optional)
GPU1 enabled (0 or 1, optional)
GPU2 enabled (0 or 1, optional)
GPU3 enabled (0 or 1, optional)
....



For example, this is the config file to enable out-of-core rendering with 8 GB of RAM available and a 300 MB GPU head room, enabling only the first GPU in a system with two GPUs:

Code: Select all
""
""
1
8
300
"M:\LW_Plugins\TurbulenceFD_LW_v1-0_1401\TFD_loader_64.p"
1
0



Hope it helps,
-Juanjo
juanjgon
Octane Plugin Developer
Posts: 8867
Joined: Tue Jan 19, 2010 12:01 pm
Location: Spain

Re: ScreamerNet uses both GPUs

Postby Rainbow-3D » Thu Jan 10, 2019 12:59 am

Thank you very much, Juanjo! It works now!

To make it work correctly I had to change the GPU enabled lines to
1
0
0
0

and not only
1
0

With only two lines, ScreamerNet prints only
GPU0 set in 1 state
but nothing about GPU1... I think ScreamerNet doesn't read the first 0 (after the 1), so now with four numbers it prints
GPU0 set in 1 state
GPU1 set in 0 state
GPU2 set in 0 state
I don't have a GPU2, so it doesn't matter whether I write that line or not, but now ScreamerNet correctly prints "GPU1 set in 0 state" and only the RTX 2080 renders the frames.
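
For reference, a complete octane_node.cfg with all four GPU lines might look like the sketch below (the out-of-core settings and the TFD path are copied from Juanjo's example above; adjust them to match your own system):

Code: Select all
""
""
1
8
300
"M:\LW_Plugins\TurbulenceFD_LW_v1-0_1401\TFD_loader_64.p"
1
0
0
0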

Thank you again and have a good night.
Gianluca
Win 7 x64 | Asus P8P67-M - i7 2600K | 16 GB RAM | 1 x nVidia RTX 2080
Win 11 | i7 13700K | 64 GB RAM | 1 x nVidia RTX 4090
Rainbow-3D
Licensed Customer
Posts: 63
Joined: Tue Dec 11, 2018 7:27 pm
