Boris, that's a beast =) & the temps are really good!
could You be more descriptive about the cost of this extension
(without GPUs, PSUs, etc.)? I've written to them but have no answer..
I doubt You're going to fill it with more than 8 GPUs.. too many have failed doing that..
thanks for sharing Your thoughts on this! great stuff!..
cheers
expansion kit (host card, target card, 3m cable): $1426
bgp8032 backplane (17x PCI Express 2.0 x16 slots): $3078
shipping to europe fedex international priority (deadline!!): you do not wanna know.
cheers
boris
P6T7 WS SuperComputer / i7 980 @ 3.33GHz / 24 GB / 1500W + 1200W PSUs / 6x GTX 680 4 GB + 1x Tesla M2070 6GB (placeholder)

glimpse wrote: I doubt You're going to fill it with more than 8 GPUs.. too many have failed doing that..
- the fastra did 12 (on linux). the modified linux kernel and BIOS were only needed to solve the addressing problem.
- ATI bitcoin miners can't because of a driver limitation.
the only problem I see at the moment is the BIOS/UEFI addressing (if we believe nvidia's customer support), and that should be solvable with a 64-bit UEFI. many new boards have just begun to use UEFI.
trenton's backplane computers (which we could put where the target card sits right now) have a BIOS option "allow addressing above 4gb". but at the moment we feel we have spent enough money at trenton, so for testing such a rig we'll probably go for a gamer UEFI board as the host board...
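for anyone who wants to check how many cards actually come up once the backplane is attached, a minimal CUDA sketch like the one below (nothing here is specific to the cyclone or trenton hardware, it just asks the runtime what it can see) prints each visible device and the PCI location the BIOS/UEFI mapped it to:
[code]
// enumerate_gpus.cu -- list the CUDA devices the runtime can see
// build (assuming a CUDA toolkit is installed): nvcc enumerate_gpus.cu -o enumerate_gpus
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        std::printf("cudaGetDeviceCount failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    std::printf("CUDA devices visible: %d\n", count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // pciBusID/pciDeviceID show where the BIOS/UEFI placed each card
        std::printf("  %d: %s (PCI bus %d, device %d), %zu MB\n",
                    i, prop.name, prop.pciBusID, prop.pciDeviceID,
                    prop.totalGlobalMem / (1024 * 1024));
    }
    return 0;
}
[/code]
if a card shows up in the OS device list but not here, that would at least be consistent with the addressing problem rather than with the expansion link itself.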
cheers
boris
P6T7 WS SuperComputer / i7 980 @ 3.33GHz / 24 GB / 1500W + 1200W PSUs / 6x GTX 680 4 GB + 1x Tesla M2070 6GB (placeholder)

now that sounds interesting! =)
please share what You find out later. I believe it could be interesting for some guys. wish U all the best!
& thanks for the prices. the expansion kit is actually very reasonably priced indeed! though with shipping it's another story..
- gabrielefx
the new PCIe 3.0 Tyan barebone costs €3,500, with no cables or DIY chassis, 3 PSUs included.
it supports 8 K20s or 8 GTX 680s at full x16.
Tyan is now distributed in Europe by PNY.
quad Titan Kepler 6GB + quad Titan X Pascal 12GB + quad GTX1080 8GB + dual GTX1080Ti 11GB
after heavily testing our system i can say that for these particular cuda processes the pci bandwidth is not that important. it may be with other applications of course, but that depends on how they work.
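for anyone who wants to put a number on that for their own rig, a rough host-to-device copy test like the sketch below gives a ballpark figure for the link a card sits behind (the buffer size and repeat count are arbitrary choices, and it only measures raw transfer rate, not what a renderer actually needs):
[code]
// bandwidth_check.cu -- rough host-to-device PCIe transfer rate estimate
// build: nvcc bandwidth_check.cu -o bandwidth_check
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    const size_t bytes = 256ull * 1024 * 1024;   // 256 MB test buffer
    void *host = nullptr, *dev = nullptr;
    cudaMallocHost(&host, bytes);                // pinned memory for realistic rates
    cudaMalloc(&dev, bytes);

    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);   // warm-up copy

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    const int reps = 10;
    cudaEventRecord(start);
    for (int i = 0; i < reps; ++i)
        cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    std::printf("host -> device: %.2f GB/s\n",
                (double)bytes * reps / (ms / 1000.0) / 1e9);

    cudaFree(dev);
    cudaFreeHost(host);
    return 0;
}
[/code]
running it against a card in the backplane versus one in a native slot would show how much the expansion link actually costs in raw bandwidth.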
we rendered out a 2 min animation at 1280x720 in 48 hours, including interior shots and animated trees. ok, 15 fps, retimed in post to 30 (roughly 1,800 frames, so about 1.6 minutes per frame). sorry that i am not allowed to post a link yet.
still images we render at 5000x3500 with no problems.
a tyan barebone of course needs at least one xeon processor, which is pricey and not to be forgotten.
if you do not want headphones on your head all the time while having your gpu rig right beside you, placing 8 gtx 680s side by side can only be done with watercooling IMO. of course you can work with pcie riser cables, but then the barebone case is somewhat useless.
I somehow like the idea of manufacturing a case once a 17 card setup is running (dreaming...).
at first we thought about putting the backplane in mineral oil, but after reading some threads about that we dropped the idea, and we were really stunned by how cool those gainward cards run as long as they are not stacked side by side.
I also like that the host card has a connector where I connected a switch, which allows me to start the system without the backplane activated, just to save some energy and gpu lifetime when it's not needed.
it's not a cheap solution, i know, but it has worked great so far and will hopefully grow a bit more in the future.
cheers
boris
P6T7 WS SuperComputer / i7 980 @ 3.33GHz / 24 GB / 1500W + 1200W PSUs / 6x GTX 680 4 GB + 1x Tesla M2070 6GB (placeholder)

Yeah, Xeons would put the Tyan barebone at a relatively high price, plus all the noise (those three fans are not the most silent, I believe, plus the PSUs..).. in addition, a lot of users do have a spare PCIe slot for connectivity. 1.5k for the expander is a bargain..
Though the Tyan case makes sense if You have a spare room where You could house the server & use something like a PCoIP solution to bring Your desktop to Your working space. OR actually use the CPU power to work with hybrid engines like ARION or IndigoRenderer (though the last time I looked it still didn't allow multi GPU computation, but that might have changed..=)
but if You don't need all the extra power from the CPUs, something based on X79 for LGA2011 with two PLX PEX 8747 switches (that effectively gives 72 PCIe lanes, presumably the switches' 2x32 downstream lanes plus the 8 CPU lanes left over after their 2x16 uplinks, & that's enough to run 7 cards) like the asrock x79 extreme 11 eventually starts making a lot more sense. Even if You might need a custom case etc.
This is a link with some prices I received when asking by mail:
César wrote: This is very interesting. If I understand, you just have to connect the big PCI backplane to your computer with the PCI extension cable and it works?
...
Cyclone stuff : http://www.cyclone.com/products/expansi ... /index.php
They sell backplanes too, but I can't find the price.
...
Prices can be found on our ecommerce site. http://yhst-72904622531421.stores.yahoo.net/
4090+3089ti & Quad 1080ti
ArchiCAD25, of course Octane & the OR-ArchiCAD plugin (love it)
http://www.tapperworks.com
http://www.facebook.com/pages/TAPPERWOR ... 9851341126
http://www.youtube.com/user/Tapperworks/videos