I have two graphics cards in my PC. The first is an 8800, which drives the screen. The second is a 460, which I use for Octane. Thanks to some other members of the forum, I have made the shocking discovery that my 460 has been crippled for the last 6 months by my 8800. Despite the fact that the 460 is the active card and the 8800 is disabled for Octane, the render speeds I am getting are those of the 8800. Is this a bug in Octane? Does anyone know what I might be able to do to rectify this situation? At the moment I have had to remove the 8800 in order to improve render times, but this has the side effect of making the interface completely unresponsive whilst rendering, which proves that the 8800 had been working as the screen driver whilst the 460 worked as the render engine.
I really don't want to spend another £180 on another 460 card simply so that the 460 card that I already have will work correctly.
Hopefully there are some PC boffins out there with the answer.
So far I have fiddled with drivers, swapped the two cards around so that the 460 is the primary, etc. At one point I did have the 460 running at the correct speed whilst the 8800 was driving the screen, but as soon as I rebooted, the problem came back. Maybe there is a clue in that?
Secondary graphics card
Forum rules
NOTE: The software in this forum is not 100% reliable. These are development builds meant for testing by experienced Octane users. If you are a new Octane user, we recommend using the current stable release from the 'Commercial Product News & Releases' forum.
This is weird... are you sure you have set the 8800GT as the display adapter, not the other way around? I think the GPU with the "0" designation is the display adapter, so you should have that one set as inactive. I could be wrong, though.
EDIT: Just saw the other thread about lighting with the CUDA settings screen, and I was probably wrong about the "0" thing; more likely it's related to the PCI-E slot position. I had 2x GTX 460, and now I remember I used the upper card for rendering, and thinking about it, that one was "0" too. Anyway, I had that upper card plugged into the monitor, but it was too noisy while gaming, so I decided to plug the display into the other card. However, to do this I had to set the system to look for the display on a PCI-E slot other than the primary one; it was not just a simple matter of unplugging the cable from one card and plugging it into the other. So maybe this is part of your issue.
Intel Core i7 980x @ 3,78GHz - Gigabyte X58A UD7 rev 1.0 - 24GB DDR3 RAM - Gainward GTX590 3GB @ 700/1400/3900 Mhz- 2x Intel X25-M G2 80GB SSD - WD Caviar Black 2TB - WD Caviar Green 2TB - Fractal Design Define R2 - Win7 64bit - Octane 2.57
Well, after a few hours of swapping cards, cables and drivers, I finally got something working. At the moment I appear to have pulled the wool over the PC's eyes, because it boots in DOS using the 460 card. Then, as Windows loads, the 8800 takes over (the screen is blank until Windows starts loading). The effect is that the 8800 is not influencing the 460, so the 460 runs at full speed whilst the 8800 drives the screen. Odd but true.
So you couldn't just go into the NVIDIA Control Panel and turn off CUDA for the card you don't want used?
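As an aside, a quick way to check which device index maps to which physical card (and which PCI-E slot it sits in) is to query the driver with nvidia-smi. This wasn't around in quite this form in the forum-era drivers, and Octane's own device list is what ultimately matters, but on a reasonably modern NVIDIA driver the `--query-gpu` flags below are standard. A minimal sketch (the sample string is purely illustrative, modelled on the 460/8800 setup described above):

```python
# Sketch: map GPU device indices to card names and PCI bus IDs,
# to see which index corresponds to which physical card/slot.
# Assumes nvidia-smi is on PATH; --query-gpu and --format=csv,noheader
# are standard nvidia-smi options on current drivers.
import subprocess

def parse_gpu_list(csv_text):
    """Parse 'index, name, pci.bus_id' CSV rows as emitted by nvidia-smi."""
    gpus = []
    for line in csv_text.strip().splitlines():
        index, name, bus_id = [field.strip() for field in line.split(", ")]
        gpus.append({"index": int(index), "name": name, "bus_id": bus_id})
    return gpus

def query_gpus():
    """Run nvidia-smi and return the parsed device list (needs an NVIDIA driver)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=index,name,pci.bus_id",
         "--format=csv,noheader"],
        text=True)
    return parse_gpu_list(out)

if __name__ == "__main__":
    # Illustrative output for a box like the one in this thread
    # (names and bus IDs are made up for the example):
    sample = ("0, GeForce GTX 460, 00000000:02:00.0\n"
              "1, GeForce 8800 GT, 00000000:01:00.0")
    for gpu in parse_gpu_list(sample):
        print(gpu)
```

If the card you expect to render with isn't index 0, that lines up with the PCI-E slot theory above.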


EVGA sr2 2x x5650 12c/24t| 64Gbddr3 1600mhz | 4x gtx580 3gb on h20|4x 128x corsair ssd | 2x Strider 1500watt + 500watt |swiftech Blocks ,3x quad rads + dual rad (two loops)||8Tb|Laptop:Dell XPS 1730| 4Gb| 2x 9800GTX sli
Portfolio/blog
Excellent idea, I will try it. It does mean that I will have to attempt to undo my hack, but it sounds like it could be a winner.
(HW) Intel i7 2600k, 16GB DDR3, MSI 560GTX ti (2GB) x 3
(SW) Octane (1.50) Blender (2.70) (exporter 2.02)
(OS) Windows 7(64)
Wonderful, it worked like a dream. Many thanks for the tip. I notice that you have 4 graphics cards. Are they in some sort of breakout box, or have you got some magical motherboard with umpteen PCI-e slots?
Ahh, just read your signature spec. That is some impressive system.

Last edited by steveps3 on Tue May 03, 2011 5:11 pm, edited 1 time in total.
(HW) Intel i7 2600k, 16GB DDR3, MSI 560GTX ti (2GB) x 3
(SW) Octane (1.50) Blender (2.70) (exporter 2.02)
(OS) Windows 7(64)
- Jaberwocky
- Posts: 976
- Joined: Tue Sep 07, 2010 3:03 pm
steveps3 wrote:Wonderful, it worked like a dream. Many thanks for the tip. I notice that you have 4 graphics cards. Are they in some sort of breakout box or have you got some magical motherboard with umpteen PCI-e slots?
Looking at the bottom of his post, he seems to have this one. See link:
http://www.evga.com/articles/00537/
Nice board.


CPU:-AMD 1055T 6 core, Motherboard:-Gigabyte 990FXA-UD3 AM3+, Gigabyte GTX 460-1GB, RAM:-8GB Kingston hyper X Genesis DDR3 1600Mhz D/Ch, Hard Disk:-500GB samsung F3 , OS:-Win7 64bit
I just did. Wow, that's the sort of kit I wouldn't mind. I wonder if the wife would mind if I sell the house.
(HW) Intel i7 2600k, 16GB DDR3, MSI 560GTX ti (2GB) x 3
(SW) Octane (1.50) Blender (2.70) (exporter 2.02)
(OS) Windows 7(64)
- Jaberwocky
- Posts: 976
- Joined: Tue Sep 07, 2010 3:03 pm
steveps3 wrote:I just did. Wow, that's the sort of kit I wouldn't mind. I wonder if the wife would mind if I sell the house.
Nah... just tell her it's the latest tech in heating systems for the winter.

CPU:-AMD 1055T 6 core, Motherboard:-Gigabyte 990FXA-UD3 AM3+, Gigabyte GTX 460-1GB, RAM:-8GB Kingston hyper X Genesis DDR3 1600Mhz D/Ch, Hard Disk:-500GB samsung F3 , OS:-Win7 64bit
Put the mobo on water...

Got rid of the 480's ...



EVGA sr2 2x x5650 12c/24t| 64Gbddr3 1600mhz | 4x gtx580 3gb on h20|4x 128x corsair ssd | 2x Strider 1500watt + 500watt |swiftech Blocks ,3x quad rads + dual rad (two loops)||8Tb|Laptop:Dell XPS 1730| 4Gb| 2x 9800GTX sli
Portfolio/blog