PCIe 2.0 vs PCIe 3.0

Rikk The Gaijin
Licensed Customer
Posts: 1528
Joined: Tue Sep 20, 2011 2:28 pm
Location: Japan

I apologize if this question has been asked before, I couldn't find it on the forum.

I recently purchased a GeForce GTX 680 4GB to replace my two GTX 570s.
The card works fine, but the box says it's PCIe 3.0, while my motherboard only supports PCIe 2.0.
Is there any significant performance boost in Octane from running the same card in a PCIe 3.0 slot, or is the difference minimal?

Thank you!
pixelrush
Licensed Customer
Posts: 1618
Joined: Mon Jan 11, 2010 7:11 pm
Location: Nelson, New Zealand

PCIe 3.0 is backward compatible with 2.0; the card will just be a little slower loading the data, nothing you would really notice in practice.
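As a rough back-of-the-envelope illustration of why the slot generation barely matters here (assuming approximate theoretical x16 bandwidths of ~8 GB/s for PCIe 2.0 and ~15.75 GB/s for PCIe 3.0, and a hypothetical 2 GB scene uploaded once):

```python
# Sketch: time to upload a scene to the GPU over PCIe.
# Bandwidth figures are approximate theoretical maxima for an x16 slot.
PCIE2_X16_GBPS = 8.0     # PCIe 2.0 x16, ~8 GB/s
PCIE3_X16_GBPS = 15.75   # PCIe 3.0 x16, ~15.75 GB/s

def upload_seconds(scene_gb, bandwidth_gbps):
    """Time in seconds to push scene_gb gigabytes over the bus."""
    return scene_gb / bandwidth_gbps

scene_gb = 2.0  # hypothetical 2 GB scene
t2 = upload_seconds(scene_gb, PCIE2_X16_GBPS)
t3 = upload_seconds(scene_gb, PCIE3_X16_GBPS)
print(f"PCIe 2.0: {t2:.3f}s  PCIe 3.0: {t3:.3f}s")
```

Since the scene is uploaded once per scene change rather than continuously during rendering, the difference is a fraction of a second either way.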
i7-3820 @4.3Ghz | 24gb | Win7pro-64
GTS 250 display + 2 x GTX 780 cuda| driver 331.65
Octane v1.55
Rikk The Gaijin
Licensed Customer
Posts: 1528
Joined: Tue Sep 20, 2011 2:28 pm
Location: Japan

pixelrush wrote:PCIe 3.0 is backward compatible with 2.0; the card will just be a little slower loading the data, nothing you would really notice in practice.
Thank you, that's a relief! :lol:
FrankPooleFloating
Licensed Customer
Posts: 1669
Joined: Thu Nov 29, 2012 3:48 pm

Rikk, how many slots do you have on your mobo?.. you are not actually yanking both 570s out, are you?!!

If you can keep at least one of the 570s in (in addition to the 680), do it!! And not just for dedicated display... every benchmark I have seen has the 570 faster than the 680... Using both to render (when you do not have 4GB scenes loaded) would be more than twice as fast/powerful.
Win10Pro || GA-X99-SOC-Champion || i7 5820k w/ H60 || 32GB DDR4 || 3x EVGA RTX 2070 Super Hybrid || EVGA Supernova G2 1300W || Tt Core X9 || LightWave Plug (v4 for old gigs) || Blender E-Cycles
Rikk The Gaijin
Licensed Customer
Posts: 1528
Joined: Tue Sep 20, 2011 2:28 pm
Location: Japan

FrankPooleFloating wrote:Rikk, how many slots do you have on your mobo?.. you are not actually yanking both 570s out, are you?!!
If you can keep at least one of the 570s in (in addition to the 680), do it!! And not just for dedicated display... every benchmark I have seen has the 570 faster than the 680... Using both to render (when you do not have 4GB scenes loaded) would be more than twice as fast/powerful.
Wait, do you mean I can use all three cards? :shock:
How does that work? Does Windows 7 have some way to select which card is the default?
My MoBo is the Asus Maximus III Extreme (LOL at the name :lol: ). I think I have enough slots, but my PSU doesn't have six 6-pin connectors (each card requires two 6-pin connectors); I think I have just four... :? I'm not a hardware expert, is there a way I can add extra 6-pin connectors with an adapter or something? :?
FrankPooleFloating
Licensed Customer
Posts: 1669
Joined: Thu Nov 29, 2012 3:48 pm

Absolutely buddy!!.. you want Molex to 6-pin adapters... any decent PC store should have these.

If you have 6 slots (2 for each card, obviously, and 3 of them PCIe) -- toss those puppies in!!

You aren't screwing with me, are you?... you did not know you can have multiple GPUs in your system?... this is one of the favorite topics in half the threads here. :shock:

I am going to bed Rikk.. it is 1:00am here... when I wake up I want to see that you have all three GPUs rendering away in Octane!!

Seriously though -- depending on what else you have, you would likely want a minimum of an 850W power supply... so make sure you are able to power all three (or even two) before you stick any 570s back in.
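A quick way to sanity-check that 850W figure is to add up the cards' reference TDPs plus a guess for the rest of the system, then leave headroom. In this sketch the TDP values are NVIDIA's reference-board ratings, while the rest-of-system wattage and the headroom factor are assumptions for a typical i7 build:

```python
# Rough power-budget sketch. TDPs are NVIDIA reference ratings;
# REST_OF_SYSTEM_W and HEADROOM are assumed, not measured.
TDP_W = {"GTX 680": 195, "GTX 570": 219}
REST_OF_SYSTEM_W = 200   # CPU, drives, fans, mobo (assumption)
HEADROOM = 1.25          # keep the PSU at or below ~80% sustained load

def recommended_psu_watts(cards):
    """Suggested PSU rating for a list of card names in TDP_W."""
    gpu_w = sum(TDP_W[c] for c in cards)
    return (gpu_w + REST_OF_SYSTEM_W) * HEADROOM

print(recommended_psu_watts(["GTX 680", "GTX 570"]))              # 767.5
print(recommended_psu_watts(["GTX 680", "GTX 570", "GTX 570"]))   # 1041.25
```

Two cards come in just under 850W with headroom; all three would want a PSU in the ~1000W class, which lines up with the advice above.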
Win10Pro || GA-X99-SOC-Champion || i7 5820k w/ H60 || 32GB DDR4 || 3x EVGA RTX 2070 Super Hybrid || EVGA Supernova G2 1300W || Tt Core X9 || LightWave Plug (v4 for old gigs) || Blender E-Cycles
Rikk The Gaijin
Licensed Customer
Posts: 1528
Joined: Tue Sep 20, 2011 2:28 pm
Location: Japan

I knew you can have multiple cards, but I thought every card HAD to be exactly the same; I didn't know you can mix different models, or even different chipsets! :?
I guess games will not use that configuration though, am I right?
I use the PC to play games as well (when I have time, hehe); will that be handled by the NVIDIA drivers?
pixelrush
Licensed Customer
Posts: 1618
Joined: Mon Jan 11, 2010 7:11 pm
Location: Nelson, New Zealand

Early in Octane's development it was like that, you needed the same cards, but these days as long as they are CUDA capable they can contribute to the rendering. Ideally, though, you should use cards with the same amount of VRAM.
On recent mobos you can usually specify in the BIOS which PCIe slot the system should use for the primary display.
For playing games you could use the 2x 570 for graphics and assign the 680 to PhysX. Octane doesn't utilise SLI. I am not sure whether the 570s' SLI bridge needs to be removed or whether you can leave it in and disable SLI in the driver as required.
i7-3820 @4.3Ghz | 24gb | Win7pro-64
GTS 250 display + 2 x GTX 780 cuda| driver 331.65
Octane v1.55
Rikk The Gaijin
Licensed Customer
Posts: 1528
Joined: Tue Sep 20, 2011 2:28 pm
Location: Japan

If it works, I'm gonna use the 680 as primary, as it should perform better than the two 570s (for games).
Rikk The Gaijin
Licensed Customer
Posts: 1528
Joined: Tue Sep 20, 2011 2:28 pm
Location: Japan

Alright, I tried, but there are two main problems:
1) I thought I had enough slots for all three cards, but actually I can fit just two.
2) I connected the 680 (4GB) and the 570 (1.5GB), but Octane sees only the memory of the smallest card, so it reports just 1.5GB! :?

Why??? The reason I purchased the 680 4GB is to have 4GB, of course. And it works if I uncheck the 570 from the list, but with both cards active the memory goes down to 1.5GB :(
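This is the usual behaviour for GPU renderers of this era without out-of-core support: every active card must hold a full copy of the scene, so the usable VRAM is the minimum across the selected devices, not the sum. A minimal sketch of that rule (the device list and sizes are hypothetical, mirroring the cards above):

```python
# Each active GPU must hold the entire scene, so the effective limit
# is the smallest VRAM among the devices selected for rendering.
devices = {"GTX 680": 4.0, "GTX 570": 1.5}  # VRAM in GB (hypothetical list)

def usable_vram_gb(selected):
    """Largest scene (in GB) that fits on every selected device."""
    return min(devices[d] for d in selected)

print(usable_vram_gb(["GTX 680", "GTX 570"]))  # 1.5
print(usable_vram_gb(["GTX 680"]))             # 4.0
```

That is why unchecking the 570 brings the full 4GB back: a big scene then only needs to fit on the 680.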