Octane and PSU size

Leonardo
Licensed Customer
Posts: 37
Joined: Sun Oct 23, 2011 3:36 pm

I just purchased a second GTX580. From reading posts here and on other websites, it seems I would be a fool to expect to power my system with my 650W power supply unit, and that I should upgrade to (at least) 850W.
However, I couldn't resist the temptation of plugging in a watt meter and launching Octane with both cards, to see how far I was from my PSU's limit. The meter reads 520W, with peaks of 550-570W. Being an 80 Plus unit, that means my components are using about 470W (assuming 82% efficiency). That's nearly 200W of headroom! Or, in other words, my PSU is working at about 3/4 of its capacity.
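For anyone who wants to check my arithmetic, here's a quick sketch of the wall-draw to DC-load conversion (Python, purely illustrative; the 82% efficiency figure is just my assumption for an 80 Plus unit at this load):

```python
# Rough sketch of the headroom arithmetic above.
wall_peak_w  = 570    # peak reading on the watt meter (AC, at the wall)
efficiency   = 0.82   # assumed efficiency for an 80 Plus unit at this load
psu_rating_w = 650    # Corsair HX650 rated output

dc_load_w     = wall_peak_w * efficiency     # power actually delivered to the components
headroom_w    = psu_rating_w - dc_load_w     # spare rated capacity
load_fraction = dc_load_w / psu_rating_w     # fraction of rated output in use

print(f"DC load ~{dc_load_w:.0f}W, headroom ~{headroom_w:.0f}W, load ~{load_fraction:.0%}")
# -> DC load ~467W, headroom ~183W, load ~72%
```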

Am I being too cautious here? Should I stick with my 650W unit? FYI, it's a Corsair HX650, with 624W on the 12V rail. I don't mind stressing it, but does that put other pieces of hardware at risk?
Z77 - i7 2600K - 16Gb | iGPU HD3000 | GTX580 + GTX590
Win7 HP 64 | Sketchup 8 - Octane standalone v1.16
pixelrush
Licensed Customer
Posts: 1618
Joined: Mon Jan 11, 2010 7:11 pm
Location: Nelson, New Zealand

well that depends ;)

GTX 580 = 245W each
Quadro 2000 = 60W
i7-2600K = 95W
fans, disks, mobo etc. ~ 75W (?)
Total = 720W x 125% for capacitor degradation over time = 900W
This is probably a better capacity to go for if you were building from scratch.
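If you want to play with these numbers yourself, here's a quick sketch (Python, just for illustration; the 75W misc figure and the 25% margin are rough assumptions, not measurements):

```python
# Rough PSU sizing sketch for the parts listed above.
# TDPs are manufacturer figures; the misc figure and the 25% margin are ballpark assumptions.
tdp_w = {
    "GTX 580 x2":      2 * 245,  # 245W per card
    "Quadro 2000":     60,
    "i7-2600K":        95,
    "fans/disks/mobo": 75,       # rough guess
}

total_w       = sum(tdp_w.values())  # worst-case simultaneous draw
recommended_w = total_w * 1.25       # +25% margin for capacitor degradation over time

print(f"total ~{total_w}W, recommended PSU ~{recommended_w:.0f}W")
# -> total ~720W, recommended PSU ~900W
```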
But of course not everything will be working at 100% while Octane renders, although the 580s will be near 100%.
The Corsair is a quality unit, so it's probably robust and well regulated even near its maximum capacity.
If you're confident the whole draw is only 75% of its capacity then it would seem ok to keep using it like that IMHO :)
It might die earlier than it otherwise would have but then again it could soldier on for a long time...
i7-3820 @4.3Ghz | 24gb | Win7pro-64
GTS 250 display + 2 x GTX 780 cuda| driver 331.65
Octane v1.55
FooZe
OctaneRender Team
Posts: 1335
Joined: Tue May 15, 2012 9:00 pm

Hi Leonardo,

If it's a good PSU (which it seems to be) then I doubt you are going to damage anything.
Most likely, if the PSU detects an overload it will just switch itself off. Another thing is that the wattage rating is usually the maximum peak, and the maximum continuous output is not necessarily the same. Being 80 Plus efficient means your components are drawing anywhere from 470W (at ~82% efficiency) through to 570W (at a theoretical 100%), so you're drawing a MINIMUM of about 470W, most likely more.
NVidia rates each 580 at a max of 244W, which I would take note of. So that's 488W + CPU etc.; I wouldn't be surprised if this is about what you are experiencing. Try running something CPU-intensive while you are rendering and see what the power usage is then.

Personally, I would be happy to run on it, but with the precaution of checking the temperature by hand when it's been under load for a while, and I would keep in the back of my mind that it could very well burst into flames at any stage (literally - I have seen smoke billowing out of PSUs before, and I turned it off at the wall quick smart!). I wouldn't want to leave it running overnight or anything like that before I had built up some trust that it would not overheat.

Thanks
Chris.
Leonardo
Licensed Customer
Posts: 37
Joined: Sun Oct 23, 2011 3:36 pm

Thank you both for your answers. I have to say, the idea of smoke coming out of the PSU is quite disturbing... I think I'll upgrade in the next few days, just for peace of mind.
You both claim that the 580s will be working at 100%. My numbers don't quite add up; is it possible that some scenes are less demanding (in power terms) than others? FYI, I've been doing my measurements with the benchmark scene, and the sample rate is correct (the same as other 580 users on this forum).

Incidentally, I'm selling my Quadro 2000. If anybody is interested, please PM me.
Z77 - i7 2600K - 16Gb | iGPU HD3000 | GTX580 + GTX590
Win7 HP 64 | Sketchup 8 - Octane standalone v1.16
FooZe
OctaneRender Team
Posts: 1335
Joined: Tue May 15, 2012 9:00 pm

Hi Leonardo,

The PSU that threw smoke out the back was some years ago, most likely before they had the protection circuits they have today, so I doubt this will happen.
I doubt your cards are using the maximum 245W each, but I think it only makes sense to plan for this...
Even in the benchmark you should be pushing 100% usage, but I'm betting NVidia is playing it safe with the 245W max - it will be higher than actual usage, I'm sure.

If you want to do it correctly then I would follow pixelrush's calculations.
What you can actually get away with is a black art / game of risk :)

Thanks
Chris.