Hi everyone,
I wanted to ask a quick question regarding GPU usage. Apologies if there's already a thread for this; I tried searching.
My current computer has a GTX 770, and I've just had my workplace order a new computer with 2 x 780 Ti for Octane use. Looking forward to the speed increase! What I'm wondering is whether I could also put the 770 in the new PC alongside the two 780 Ti cards (there is a third slot for this). Ideally, I'd like the two 780 Ti cards working away at Octane rendering whilst the 770 doesn't render and handles the day-to-day interface instead, so the computer doesn't feel slowed down whilst rendering.
Is this possible? If so, how would it be done?
I'd really appreciate any help / advice on this! Thanks!
assigning GPUs to different tasks
Forum rules
Please add your OS and Hardware Configuration in your signature, it makes it easier for us to help you analyze problems. Example: Win 7 64 | Geforce GTX680 | i7 3770 | 16GB
Hi MoGrafik,
Yes, it's possible. You simply have to go to Preferences and untick the 'active' box next to your GPU, leaving the two 780 Tis in your case to render. Your third card will handle the display and other tasks, such as accelerating some Photoshop actions or handling navigation in modelling apps, etc. =)
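As a side note, if you want to sanity-check which cards the driver actually sees before unticking anything, here's a minimal Python sketch. It assumes nvidia-smi (which ships with the NVIDIA driver) is on your PATH:

```python
import subprocess

# List every NVIDIA GPU the driver can see: index, name, and total VRAM.
result = subprocess.run(
    ["nvidia-smi", "--query-gpu=index,name,memory.total",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)

for line in result.stdout.strip().splitlines():
    index, name, memory = (field.strip() for field in line.split(","))
    print(f"GPU {index}: {name} ({memory})")
```

The order may not match Octane's device list exactly, but the names make it easy to tell the 770 apart from the two 780 Tis before deciding which boxes to untick.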
- prehabitat
- Posts: 495
- Joined: Fri Aug 16, 2013 10:30 am
- Location: Victoria, Australia
Interesting, does the 'display' card need to be hooked up to the monitors for this to work? Or is the driver smart enough to realise that GPU 3 is idle (although not connected directly to the monitor) and send out display data via PCIe to GPU 1 (presumably connected to the monitor)? It seems unlikely...
Win10/3770/16gb/K600(display)/GTX780(Octane)/GTX590/372.70
Octane 3.x: GH Lands VARQ Rhino5 -Rhino.io- C4D R16 / Revit17
Thanks for your help! Sounds easy enough!
I guess I never considered that an option, since I've only been using one GPU until now and never realised that you could untick it! I'll give it a go. Thanks so much for your help!
Win 8.1 / i7 4930k / 2 x GTX Titan / Cinema 4d
Hi everyone. My work computer is great! Now I'm thinking about adding a second GPU to my home workstation.
I'm running a scene with several high-resolution textures, and the GPU is beginning to run out of memory. I have two questions:
1) Can Octane allocate to the TOTAL memory of both cards? (e.g. 2 x 3GB cards = 6GB)
and
2) Do they have to be the same graphics card? I have a GTX 780 and was thinking about getting a GTX 770 with 4GB so I have more space to play with!
Please let me know, guys! Thanks for all your help!
Win 8.1 / i7 4930k / 2 x GTX Titan / Cinema 4d
No, Octane cannot allocate to the TOTAL memory of both cards:
2 x 3GB cards = 3GB
1 card with 3GB + 1 card with 4GB = 3GB
The scene is not cut into parts. The total available amount is always that of the card with the least memory, as all the cards have to load the same complete scene in parallel.
Also, the real available amount is always a bit less than the expected amount: for example, my 580 3GB really gives me 2.88GB.
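To put rough numbers on it, here is a small illustrative Python sketch. The ~4% overhead figure is only an assumption inferred from the 3GB / 2.88GB example above, not an Octane specification:

```python
# Every card loads the complete scene, so the smallest card sets the
# VRAM budget for the whole setup; memory does not pool across cards.
cards_gb = [3.0, 4.0]      # e.g. a 3GB GTX 780 plus a 4GB GTX 770

# Assumed overhead: the driver reserves a slice of each card, so the
# usable figure sits a little below the sticker value (3GB -> ~2.88GB).
OVERHEAD = 0.04            # ~4%, inferred from the example above

usable_gb = min(cards_gb) * (1 - OVERHEAD)
print(f"Scene budget: ~{usable_gb:.2f} GB, not {sum(cards_gb):.0f} GB")
```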
French Blender user - CPU : intel Quad QX9650 at 3GHz - 8GB of RAM - Windows 7 Pro 64 bits. Display GPU : GeForce GTX 480 (2 Samsung 2443BW-1920x1600 monitors). External GPUs : two EVGA GTX 580 3GB in a Cubix GPU-Xpander Pro 2. NVidia Driver : 368.22.
Hey Roubal,
Thanks very much for getting back to me so fast. That's good to know - shame you can't double up on memory though. Maybe I'll end up having to scale down some textures after all!
Thanks very much for your help.
Win 8.1 / i7 4930k / 2 x GTX Titan / Cinema 4d
Just one other quick question, if you don't mind...
I actually saved my project with textures and it came in at only 180MB including textures. I have a 3GB graphics card, so how is it that the GPU is nearly full? What is using that other memory?
If anyone has any ideas, or can link me to any documentation I'd be very grateful!
Thanks
Win 8.1 / i7 4930k / 2 x GTX Titan / Cinema 4d
- LudovicRouy
- Posts: 216
- Joined: Sat Apr 03, 2010 5:05 pm
- Location: FRANCE
- Contact:
The render size is probably what overloads your VRAM.
Try splitting your image into smaller ones with parallax offset, or, if quality doesn't matter, scale it down.
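To make the arithmetic concrete: project files store textures compressed (JPEG, PNG and the like), but the GPU holds them as raw pixels, and the render buffers scale with output resolution, which is why a 180MB project can nearly fill a 3GB card. A rough back-of-envelope sketch in Python, where the texture sizes, formats and buffer layout are illustrative assumptions rather than Octane internals:

```python
# Textures are compressed on disk but sit uncompressed in VRAM, and the
# film buffer grows with render resolution. Back-of-envelope figures:

def texture_vram_mb(width, height, channels=4, bytes_per_channel=1):
    """Raw in-memory size of one decompressed texture, in MB."""
    return width * height * channels * bytes_per_channel / 1024**2

# A 4096x4096 RGBA texture that might be only ~2MB as a JPEG on disk:
print(f"One 4K texture: {texture_vram_mb(4096, 4096):.0f} MB uncompressed")

# Ten such textures already cost ~640MB before geometry and buffers:
print(f"Ten of them:    {10 * texture_vram_mb(4096, 4096):.0f} MB")

# An assumed float RGBA film buffer at a 4000x3000 render resolution:
film_mb = texture_vram_mb(4000, 3000, channels=4, bytes_per_channel=4)
print(f"Film buffer:    {film_mb:.0f} MB")
```

So a handful of large textures plus the render buffers can plausibly account for the gap between a 180MB project file and a nearly full 3GB card.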