Hello everybody!
I have a problem: I have a large job coming up which will require me to render a great number of architectural images. My current machine, a 2600K with 2x 3GB GTX 580s, takes between six and ten minutes per image, which is interrupting my workflow. Therefore, I need a new machine ASAP. What's the fastest configuration I can put together, whilst keeping everything in one box?
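For context, here's the rough arithmetic behind what I'm hoping for - a naive sketch that assumes render time scales inversely with total CUDA core count (the four Titan Blacks are just a hypothetical target, and per-core throughput differs between GPU generations, so treat this as an optimistic ceiling):

```python
# Naive speed-up estimate from CUDA core counts alone.
# Assumption: render time scales inversely with core count. Per-core
# throughput differs between GPU generations (Fermi vs. Kepler), so
# this is an optimistic ceiling, not a promise.
current_cores = 2 * 512     # two GTX 580s, 512 CUDA cores each
target_cores = 4 * 2880     # e.g. four Titan Blacks (hypothetical target)

speedup = target_cores / current_cores
for minutes in (6, 10):
    print(f"{minutes} min/image -> ~{minutes / speedup:.1f} min/image "
          f"(x{speedup:.1f} in raw cores)")
```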
I've found a couple of motherboards that have 8 PCIe slots - does anyone have experience using one of these? Are they stable? The MSI Big Bang Marshal seemed like a good first step, but it doesn't use the newest Intel socket, though I don't know how relevant that is in this context.
I'm willing to sacrifice some 'portability' and consider the use of PCIe risers or splitters - do they affect performance or stability? Is there a hard limit on the number of cards a single PC will recognise? I also wouldn't know how to put a system like that together, as I've only ever built single-box machines.
I also need to retain Solidworks compatibility, so the primary GPU would have to be a Quadro.
To summarise, please help me multiply my rendering speed by as great a factor as possible!
Thanks a lot for your time and help!
There are a couple of different routes: 1) put everything in one box, 2) put everything in one box and add an expansion chassis, or 3) put together separate boxes.
1) If you want to put everything in one box (which is the simplest solution), then the base system (without cards) could be something like this: http://pcpartpicker.com/p/XrKh23 which comes out to ~$2,500. It's built on the newer X99 chipset because that's the future and budget doesn't seem to be the biggest concern. It's also based on a big case (Corsair 900D) to handle liquid cooling radiators, since you might need them to cool all the GPUs. Obviously that can be adjusted, and you might save $100. So with a base of $2,500, you can decide on which graphics cards. The Quadro is gonna hurt your budget-to-rendering-power ratio, but I think the K5200 is an OK balance between price and CUDA cores if you must have a Quadro.
- You could go the (1) Quadro K5200 [$2,000] + (3) Titan Black [$3,000] approach = $7,500 total with 10,944 CUDA cores (2,304 + 3 x 2,880). But there's gonna be a lot of heat in there throttling your Titans, so if you want to water cool them, add another ~$1,000. Switch the Titans to 780 6GB cards for a $1,400 savings, but a loss of 1,728 CUDA cores.
2) If you wanted to put everything in one box, but add an expansion chassis for more graphics, you can use the Netstor NA255A. With all accessories, it's about $2,500 for the box. It can handle up to 1,200 Watts, but having 4 cards in there might cause overheating. I'd start with 3 cards.
- You could go (1) Quadro K5200 [$2,000] + (2) 780 6GB [$1,100] + Netstor [$2,500] + (3) 780 6GB [$1,600] = $9,700 total with 13,824 CUDA cores. That brings you right up to your max budget, though. Water cooling the computer would put you well over.
3) Split them into multiple boxes. Well, there are lots of options here, but some of your budget is going to be used up in redundant computer hardware and software licenses that wouldn't need to be purchased with the above options. I'm not saying it's a bad option - you may be able to include existing hardware. Just something to consider.
Also, I have seen the Tyan 8-GPU servers. There are two potential problems: first, they likely require (2) Xeon CPUs, and there goes a chunk of your budget before even one graphics card; second, gaming cards like the Titan or 780 will produce so much heat in there it will be ridiculous. The Quadro cards run cooler because they're down-clocked, but you'd run out of money before you filled the case with them. Just something to consider.
Edit: I forgot to include my slightly ridiculous idea. You could build a single render-only machine made up of the base system and (2) Titan Z cards [$6,000] = $8,500 total with 11,520 CUDA cores. It costs more than (4) Titan Blacks, but you wouldn't have to worry about building a water cooling loop. Also, as a render node, you could knock down the size of the SSD and put in an air cooler for the CPU.
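To put those three configurations side by side, here's a quick tally - a sketch using only the prices and core counts quoted above (2014-era figures from this post, not current market prices; the $2,500 base system is folded into each total):

```python
# Cost and CUDA-core totals for the configurations discussed above.
# Prices and core counts are the ones quoted in this post.
BASE = 2500  # X99 base system, without graphics cards

options = {
    "K5200 + 3x Titan Black":       (BASE + 2000 + 3 * 1000, 2304 + 3 * 2880),
    "K5200 + 5x 780 6GB (Netstor)": (BASE + 2000 + 1100 + 2500 + 1600, 6 * 2304),
    "2x Titan Z (render node)":     (BASE + 6000, 2 * 5760),
}

for name, (cost, cores) in options.items():
    print(f"{name:30s} ${cost:>6,}  {cores:>6,} cores  ${cost / cores:.2f}/core")
```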
Riggles put this more or less every way you might want to go..
but the Titan Z is now on sale for $2k (for boutique builders..) and EK has released a dual-slot water block for them - so at about $11k you could probably get four of them in a relatively small box =) no extra licences, no expanders, and it would look more or less like a regular high-end rig.
If looks don't matter, simply get multiple boxes with 3-4 GTX 780 6GB cards each - that will be the best value.. =) even with the extra Octane licences..
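Putting rough numbers on that 'best value' claim - a sketch using the prices mentioned in this thread (~$550 per 780 6GB from Riggles' figures, plus the two Titan Z price points; it ignores PSUs, licences, and the rest of the box):

```python
# CUDA cores per dollar, using prices quoted in this thread.
cards = {
    "Titan Z @ $2k (builder price)": (2000, 5760),
    "Titan Z @ $3k (retail)":        (3000, 5760),
    "GTX 780 6GB @ ~$550":           (550, 2304),
}
for name, (price, cores) in cards.items():
    print(f"{name:30s} {cores / price:.2f} cores per dollar")
```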
glimpse wrote: but the Titan Z is now on sale for $2k (for boutique builders..) and EK has released a dual-slot water block for them - so at about $11k you could probably get four of them in a relatively small box =) no extra licences, no expanders, and it would look more or less like a regular high-end rig.

Unfortunately, that price reduction is really only for businesses that are system-builder partners with NVIDIA. It's still $3,000 for the rest of us. It's too bad, because at the ~$1,850 system-builder price I'd sell my two Titan Blacks and get a Titan Z instead. But at $3K, not a chance.
Well, I have an 8-PCIe-slot motherboard - and I would recommend Asus if they still make those - bulletproof motherboards.
- There is room to fit only 4 double-slot cards. If you want to fit more with PCIe risers, you will still have a box that is not very portable, because the extra cards will stick out, and you will not be able to cool them.
- The very best option would be to get water-cooled single-slot cards on a single machine and populate it with 8 cards.. if there are any single-slot water-cooled cards out there.
- I would recommend an external box for a more elegant solution. It is a lot of work to get more than 4 GPUs with risers to install properly - mine is a bit of a mess, but I manage 5 graphics cards. If you squeeze in 4 cards you will have a few mm of clearance between them... but maybe try that as the first option.
- With 10k to spend, you can get an external box and an 8x-PCIe-slot machine - put 3 cards on the mainboard and the rest in the external one. And consider getting big PSUs - I combined two PSUs instead of a single big one; see the rough sizing sketch below.
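Since PSU sizing comes up with every multi-GPU build, here's that sizing sketch. The wattages are ballpark assumptions (roughly 250 W per 780-class card, ~150 W for CPU, board, and drives) - check the actual TDPs of whatever cards you buy:

```python
# Ballpark PSU sizing for a multi-GPU box. TDP figures are assumptions;
# the 30% headroom keeps the PSU(s) out of their least efficient range
# and explains why 5 cards pushed me to two supplies.
GPU_TDP = 250       # watts, rough figure for a GTX 780-class card
SYSTEM_BASE = 150   # watts, CPU + board + drives (assumption)
HEADROOM = 1.3

for gpus in (3, 4, 5):
    load = gpus * GPU_TDP + SYSTEM_BASE
    print(f"{gpus} GPUs: ~{load} W load -> size PSU(s) for ~{int(load * HEADROOM)} W")
```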
3dmax, zbrush, UE
//Behance profile //BOONAR
//Octane render toolbox 3dsmax
acc24ex wrote: the very best option would be to get water-cooled single-slot cards on a single machine and populate it with 8 cards.. if there are any single-slot water-cooled cards out there

Man, that's been my dream for a long time =) if only I had a need, or someone would give me a chance to mess with such a project..
Actually, there haven't been many single-slot cards for a while, but handy DIYers have managed to modify water-cooled Titans/780s down to single slot.
The biggest headache for me wouldn't be the cards, the single-slot brackets, or the connectors that you have to hard-mod, but the PCIe power itself. I'm still not sure 7-8 cards wouldn't melt the mobo - boards recommend the extra power connector for four cards, so what would plugging in twice that do? I have a feeling, though it's not based on any facts, that all those connections on consumer mobos were not designed to run that many hungry GPU cards.. maybe with other add-on cards, but not with GPUs. Even folders and miners don't go that crazy on a single board; 6 seems to be the sweet spot =)
(grr.. hate auto correct..)
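To put that worry in numbers - a worst-case sketch based on the PCIe spec, which allows up to 75 W per x16 slot through the motherboard itself (how much a given card actually pulls from the slot versus its 6/8-pin cables varies by model, so this is an upper bound):

```python
# Worst-case power the motherboard must deliver through its PCIe slots.
# 75 W is the PCIe spec maximum per x16 slot; actual slot draw varies
# by card, which is why boards want the extra connector past ~4 cards.
SLOT_LIMIT_W = 75

for cards in (4, 6, 8):
    print(f"{cards} cards: up to {cards * SLOT_LIMIT_W} W through the board's slot traces")
```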
Hi Eisberger,
It's strategic-thinking-under-pressure time for you. The best kind. If I were in your position, I'd build TWO new identical machines, perhaps with the X79 chipset, a 3930K, and maybe 12GB of RAM. I'd put 4x GTX 780 6GB cards in each (unfortunately reference coolers seem not to be available for the 6GB version, so you'd better get a good case with lots of airflow). Watercooling is an option, in which case I'd put 3 cards in each for cost - or even watercool one box with 4 cards and aircool the other with 2, since you'll have more space and less heat. I'd then get the additional slave licences and use Octane's network rendering feature.
Benefits:
1. It's conventional tech. I'd leave experimenting with risers (and even watercooling, unless a specialist builder does that for you) for another time. Keep it simple and stick to what you know;
2. For probably under $9k you get 18,432 CUDA cores (excluding your 580s);
3. You keep your current machine with all its settings & apps (nothing new on, or just before, race day);
4. Redundancy. If any of the three machines croaks, you have two others.
In fact, I may even just build one machine first to test out temps, noise, etc. Who knows, it may be enough for the project, and it should bring your 10 minutes down to around 2. It would also limit your risk in case the job goes south.
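As a sanity check on that estimate - a minimal sketch assuming Octane's network rendering scales with total CUDA cores across the nodes (per-core throughput differs between the Fermi 580s and Kepler 780s, so expect the real times to land somewhat above these):

```python
# Naive render-time estimate for Octane network rendering: assume each
# node contributes in proportion to its CUDA core count. Core counts
# are published specs; the linear scaling is an assumption.
nodes = {
    "current box (2x GTX 580)": 2 * 512,
    "new box A (4x 780 6GB)":   4 * 2304,
    "new box B (4x 780 6GB)":   4 * 2304,
}

baseline = nodes["current box (2x GTX 580)"]
total = sum(nodes.values())
for minutes in (6, 10):
    print(f"{minutes} min/image now -> ~{minutes * baseline / total:.1f} "
          f"min/image across all three nodes (optimistic)")
```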
Now Eisberger, this is your momma speaking: make sure you have the job firmly in the bag before splashing out on dem shiney things, you hea? And this is your dad: well done son, for making the leap from 2 GTX 580s to crazy-GPU-power-town overnight. Daddy's proud of you. And I am jealous...
Well that's my $9k, I mean my 2c... Hope it helps. Let us know what you decide.
Seeker
Win 8(64) | P9X79-E WS | i7-3930K | 32GB | GTX Titan & GTX 780Ti | SketchUP | Revit | Beta tester for Revit & Sketchup plugins for Octane
Oo. I just re-read your post and saw you want to keep everything in one box... which makes my advice pretty redundant.
Just wondering why you'd want to do that? If you're not on a network, you might consider a NAS box (I love Synology's products) - you should be able to set up a LAN pretty quickly without a conventional server, switch, etc. The NAS would also offer good backup functionality for your project.
Just a thought....
Else, if you really want to stick everything in one box, I'd go with Riggles' suggestion and get 2 Titan Z's. Watercool them (EKWB) - if you don't, you won't have space for the Quadro, since they're triple-slot cards. Dream render machine.
Best,
Seeker
Win 8(64) | P9X79-E WS | i7-3930K | 32GB | GTX Titan & GTX 780Ti | SketchUP | Revit | Beta tester for Revit & Sketchup plugins for Octane