Hi everyone,
Back to my latest added board, a 780 Ti (I also just added it to the water-cooled loop): Windows 8.1 does not like too much change. I just reflashed the other two 780 Tis with a new BIOS, and bam: only 7 boards are recognized, and the last one (connected with a USB 3.0 riser) has an exclamation mark in Device Manager. I tried adding the registry entries like before, but it always failed. I tried a clean driver install, but no luck. If I disable one board and reboot, the 780 Ti works.
I went to Windows 10, and no problem: all 8 boards work. Hmm, what can I try now? OK, on to Windows 10...
External Graphics Cards PC
Forum rules
For new users: this forum is moderated. Your first post will appear only after it has been reviewed by a moderator, so it will not show up immediately.
This is necessary to avoid this forum being flooded by spam.
Cont'd:
I couldn't stay frustrated with only 7/8 boards working on Windows 8.1!
I decided to clean-install a fresh Win 8.1 and finally got them all working (with the registry hack).
So I went back to the old Win 8.1 and did a clean driver installation:
- Run DDU (Display Driver Uninstaller) in safe mode and remove all GPU drivers.
- Reboot and reinstall the latest drivers. Before the reboot that completes the installation, I also applied the registry entries, then rebooted.
- Bingo! After the reboot, all 8 of my boards are working!
I'll say it again: the latest version of Windows 10 works well, with no problems on GPU detection.
I have enough power for 2 more GPUs. Hmm... I'll wait for cheaper 980 Tis.


I7-3930K 64Go RAM Win8.1pro , main 3 titans + 780Ti
Xeon 2696V3 64Go RAM Win8.1/win10/win7, 2x 1080Ti + 3x 980Ti + 2x Titan Black
Thanks for sharing your experience! =) Might come in handy!
Hello!
Tutor, thanks for your insights again. My CPU is an Intel Core i7-3930K @ 3.20 GHz. I've been looking into my issue as a BIOS one with ASRock (not Asus, as I incorrectly wrote earlier!).
Basically, I can install 10 GPUs, but with more than 10 GPUs I receive a "BF" error. ASRock suggested I try a newer-than-stock BIOS that allows enabling the "Above 4G" option; however, that BIOS predates my current one, which is even newer and added "PCIe improvements" (that latest one was the first to support the GTX Titan X). So once I reverted to try the 4G option, I could no longer use the Titan X, and I'm asking if they could bring the "Above 4G" feature to the newer BIOS that supports the GTX Titan X. I'll have to see what they say... I've got to hand it to you, Tutor, you called it with looking into the 4G ROM option!
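For anyone following along, the "Above 4G" option matters because, without it, every GPU's memory-mapped register windows (its BARs) must fit into the limited MMIO space below the 4 GiB boundary. A rough back-of-the-envelope sketch in Python; the 256 MiB-per-card and 1 GiB-window figures are illustrative assumptions, not measured values for these boards:

```python
def total_bar_space(n_gpus, bar_mib=256):
    """Total MMIO address space the GPUs request, in MiB.
    256 MiB per card is an illustrative assumption; real cards
    expose several BARs of varying sizes."""
    return n_gpus * bar_mib

def fits_below_4g(n_gpus, window_mib=1024, bar_mib=256):
    """Can all the BARs fit in the 32-bit MMIO window below 4 GiB?
    Firmware typically reserves well under 1 GiB down there, which is
    why many-GPU rigs hang or error out at POST without Above 4G."""
    return total_bar_space(n_gpus, bar_mib) <= window_mib

print(fits_below_4g(4))   # True  -> a handful of cards squeeze in
print(fits_below_4g(11))  # False -> 11 cards overflow; 64-bit decoding needed
```

With "Above 4G Decoding" enabled, the firmware can place those BARs above 4 GiB, where there is effectively unlimited room, which is why the option keeps coming up in this thread.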
I was imagining a second networked rig to add more GPUs, but then I realized the cost of all this, so I am hoping for a BIOS breakthrough from ASRock.
Win 10 Pro 64, Xeon E5-2687W v2 (8x 3.40GHz), G.Skill 64 GB DDR3-2400, ASRock X79 Extreme 11
Mobo: 1 Titan RTX, 1 Titan Xp
External: 6 Titan X Pascal, 2 GTX Titan X
Plugs: Enterprise
- A Polish Ginger
- Posts: 14
- Joined: Sun Mar 08, 2015 7:32 pm
- Location: Chicago, IL
Hello gentlemen, seeing all these insane setups I thought I'd try it out myself... but I can't get past 4 GPUs.
I decided to go with the 16x to 1x USB risers http://www.amazon.com/RIF6-PCI-E-Adapte ... e+extender
I'm trying to run 6 GPUs (five 980 Tis, one Titan): I had 5 of them connected via the USB risers and one 980 Ti in the 16x PCIe slot. When I tried to start my computer, the boot menu never appeared and the screen stayed blank. I looked at the LED debug display (or whatever it's called) and it kept repeating itself, so I guess it was in a boot loop. After hours of tinkering I was able to get at least 5 of the 980 Tis to work, but when I benchmarked I got a score of 506, whereas the 4 980 Tis got 490. I'm no expert in this, but that doesn't sound right at all.
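For what it's worth, those two scores alone suggest the fifth card is barely contributing. A quick sanity check of what linear scaling would predict, using only the numbers from the post:

```python
def per_card_score(total_score, n_cards):
    """Average contribution of one card, assuming linear scaling."""
    return total_score / n_cards

def marginal_gain(prev_score, new_score):
    """Extra score contributed by the most recently added card."""
    return new_score - prev_score

# Scores from the post: 490 with four 980 Tis, 506 with five cards.
avg_card = per_card_score(490, 4)          # ~122.5 points per card
expected_five = 490 + avg_card             # ~612.5 if the 5th scaled like the rest
gain = marginal_gain(490, 506)             # only 16 points

print(f"expected ~{expected_five:.0f} with 5 cards, observed 506; "
      f"5th card added {gain} points ({gain / avg_card:.0%} of a full card)")
```

So the fifth card is delivering roughly 13% of what a healthy card should, which points at that card's slot, riser, or driver state rather than normal scaling loss.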
I see everyone here not experiencing massive issues like this, so does anyone have a suggestion? Is there some kind of system bypass to let the cards pass through that I'm missing? I'm not well versed in BIOS settings or coding in general; the I/O space stuff Tutor was talking about, yeah, no idea what that is.
I'm also sure it's not a PSU thing, because I connected two 1600 W units together via that Add2PSU board.
Any help would be highly appreciated.

Win 7 64 | GTX Titan, GTX 980 ti x5 | i7 4930K | 32 GB
Hello Polish Ginger!
I see you are experiencing some things similar to what has been written in this thread, so I'll take a crack at it, although my remarks may just raise more considerations rather than having any immediate value.
Just as a follow-up to my own GPU exploration: I am getting a "bF" error on my motherboard, and am awaiting further information from the mobo company to see if they can offer any suggestions or BIOS mods that would allow a GTX Titan X with 5 accompanying Titan Zs.
I have since tried some unorthodox maneuvers to see if I could get my 5th Titan Z recognized (that would be 11 GPUs, including a Titan X as primary):
(1) I tried turning on the external Zs after the BIOS had already loaded. While they did power on, the fans on the cards and on the external PSUs went to max, so I shut everything off right away because it just sounded wrong for them to be doing that;
(2) I hooked up two of the cards on one of my external PSUs to an outlet on a separate circuit via an extension cord, to see if this was all a power thing. No good: the cards powered up fine, but still the same "bF" error.
My conclusion: while I had thought my motherboard's BIOS might be acting as an arbitrary gatekeeper of some sort, it probably is not. In my case it may be a genuine hardware limitation on the current running through the motherboard beyond 10 GPUs. So I am still waiting for a BIOS surprise.
Okay, so in your situation there are 3 things going on:
(1) Your mobo won't boot with 6 GPUs (1 primary, 5 external); it goes into a boot loop (sounds just like when I try my 5th Z for 11 GPUs). But you can get 5 GPUs working just fine, and you say your power supply should be fine. All I can think of is that some motherboard hardware limit is preventing the 6th: the board, despite any tinkering, may not physically support more than 5, even if 6 can be plugged in via the available PCIe lanes. However, you could explore updating the system BIOS on the motherboard, or first check, as Tutor suggested to me, whether there is an "Above 4G" option in the BIOS. (In a nutshell: when your PC boots, before it goes into Windows, it does the BIOS check, and it probably flashes an option to hit F2 to enter setup. In the BIOS, under the boot settings, there may be an option to enable a "4G" mode of some sort. It could be that enabling this is all it would take for the boot to occur normally. If not, maybe an updated BIOS would have it.)
(2) You have 5 set up, but the performance increase is so slight it doesn't make sense. First thought: just make sure you have enabled all 5 as devices in the preferences menu, which you probably did. Second thought, along those lines: enable each device in a testing sequence, adding one more at a time (test 1, then 2, then 3, then 4, then 5), and watch how a single rendered scene progresses through its rendering. Not that the benchmark is invalid, but see how it works in your real-world rendering environments and whether the pattern of the 5th card having little impact holds there too. If it does, maybe someone else can make suggestions. But apart from not being able to get all 6, it is a good thing that your PC sees the 5 as devices and can render using all of them.
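That add-one-card-at-a-time sequence can be sketched as a tiny script: it lists which device sets to enable for each run, and, once you've recorded the cumulative score after each run, shows each card's marginal contribution (the scores below are placeholders built from the numbers in this thread, not real measurements):

```python
def enable_sequence(n_cards):
    """Device sets to test in order: card 1 alone, then 1+2, then 1+2+3, ..."""
    return [list(range(1, k + 1)) for k in range(1, n_cards + 1)]

def marginal_gains(cumulative_scores):
    """Score added by each newly enabled card, given the cumulative
    benchmark score recorded after each step of the sequence."""
    gains = [cumulative_scores[0]]
    for prev, cur in zip(cumulative_scores, cumulative_scores[1:]):
        gains.append(cur - prev)
    return gains

print(enable_sequence(5))
# Placeholder scores; a healthy rig should show roughly equal gains per card.
print(marginal_gains([123, 245, 368, 490, 506]))  # last gain flags the weak card
```

If every card but one shows a similar gain, the odd one out is the card (or slot/riser) to investigate.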
Let us know. Good luck!
Win 10 Pro 64, Xeon E5-2687W v2 (8x 3.40GHz), G.Skill 64 GB DDR3-2400, ASRock X79 Extreme 11
Mobo: 1 Titan RTX, 1 Titan Xp
External: 6 Titan X Pascal, 2 GTX Titan X
Plugs: Enterprise
I said at the end of my last post that you had 3 things going on... I should have said 2.

Win 10 Pro 64, Xeon E5-2687W v2 (8x 3.40GHz), G.Skill 64 GB DDR3-2400, ASRock X79 Extreme 11
Mobo: 1 Titan RTX, 1 Titan Xp
External: 6 Titan X Pascal, 2 GTX Titan X
Plugs: Enterprise
Hello Notiusweb
Thank you for replying and here is what I tried so far...
I tried hooking just one GPU to the riser and plugging it into each PCIe slot sequentially to see if the slots were the problem, and only slot 1 worked 100%. I was able to get slots 3 and 5 to kind of work, but the display was laggy, there was a "curtain" effect when loading the Start menu, and there were random black screens; I even got a "bF" error code. And I couldn't get slots 2, 4, and 6 to work at all: I kept getting error codes 94 and b2 when the card was plugged into those slots. I'll post some pics.
I then tried going back to a 4-way setup and got an error code 43 on the 4th GPU.
Unfortunately, all of this was tested before I enabled "Above 4G" in the BIOS, except for the 4-GPU setup. I had it enabled during that, but still got error code 43. I'll give it another go in a couple of days, because I need my computer to be operational tomorrow.
Part of me, if not the majority, wants to buy a new motherboard, because I feel like this should be a lot simpler (at least this early in the process, with only 6 GPUs). I did order that GPU cluster from Amfeltec, so hopefully that will show some promise, but which motherboards do you guys recommend?
Also, this isn't a brand-new motherboard; it's been returned at least 5 times, so it wouldn't surprise me if it were the problem.
Sincerely,
A Polish Ginger
Win 7 64 | GTX Titan, GTX 980 ti x5 | i7 4930K | 32 GB
I am reading this topic with curiosity. Exceeding limits with GPUs is still my plan.
As for the motherboard, I'd go with something that has 2 PLX chips or even more, to preserve sufficient PCIe lanes. From what Tutor wrote, the Supermicro http://www.supermicro.com/products/moth ... DRX_-F.cfm
seems to be of great value. Otherwise I'd go with the Asus X99-E WS, ASRock X99 WS-E, or ASRock Extreme 11; they all have two PLXs.
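To make the PLX point concrete: a PCIe switch takes one uplink from the CPU and fans it out to more downstream lanes, so a 40-lane CPU can feed far more electrical slot lanes than it natively has. A rough sketch; the x16-up/32-down figures are the commonly quoted layout for switches like the PEX 8747, so treat them as assumptions rather than a spec for any particular board:

```python
def slot_lanes_with_switches(cpu_lanes, n_switches,
                             uplink=16, downstream=32):
    """Electrical slot lanes available when each PCIe switch consumes
    an x16 uplink from the CPU and exposes 32 downstream lanes."""
    direct = cpu_lanes - n_switches * uplink   # lanes left for direct slots
    return direct + n_switches * downstream

print(slot_lanes_with_switches(40, 0))  # 40 lanes from the CPU alone
print(slot_lanes_with_switches(40, 2))  # 72 lanes with two PLX switches
```

Note the bandwidth through each switch is still capped by its x16 uplink; the switches buy you connectivity (more cards enumerated), not more total bandwidth, which is usually the right trade for GPU rendering.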
When you get amfeltec working please post some info.
3090, Titan, Quadro, Xeon Scalable Supermicro, 768GB RAM; Sketchup Pro, Classical Architecture.
Custom alloy powder coated laser cut cases, Autodesk metal-sheet 3D modelling.
build-log http://render.otoy.com/forum/viewtopic.php?f=9&t=42540
Polish Ginger,
Cool pics of your rig. Sorry you're hitting these error codes!
You may want to consider just using PCIe slot 1 for your display card. At defaults this will usually yield the best performance. I don't know your motherboard's ins and outs, but in general slot 1 will be set up for higher bandwidth through the hardware layout, the system BIOS config, or both. So use slot 1 for the display card, then mess around with the other PCIe slots to add as many working rendering cards as possible. I wouldn't worry that the other slots don't work as well as slot 1; that may be how it's supposed to be, with the manufacturer assuming you'll use slot 1 for your primary display GPU.
In my case, for whatever reason, my primary Titan X has to be connected directly to the motherboard itself; the system will not boot (it gives error beeps) with it on the USB 3.0 risers I have. If you are able to use a riser for your primary display, that is pretty cool, because it frees up your slot 2 (mine is blocked)...
One other consideration: you mentioned the benchmark tests with 5 working cards (1 display, 4 additional). If you want to test how each card is pumping out CUDA, open a scene in the OctaneRender standalone and, in the preferences menu, tick the boxes to enable or disable any combination of the 5 cards. This lets you see how the cards work together in a real-world render situation. Keep in mind that enabling whichever card is primary will slow the display down when you rotate the angle, change focus, or modify materials, lighting, etc., because that card is both driving the display and generating CUDA computations for the render.
I myself am still awaiting feedback from my BIOS manufacturer as to why my motherboard doesn't like more than 10 GPUs. I have enough power and lanes to exceed that; it may be that the motherboard just can't handle it. In any event, I try to have a firm understanding of where the config stands when it's working. If scenarios work sometimes but not others, I try to understand what is going on so I can take it into consideration as I add complexity.
Well, let us know how it goes!

Win 10 Pro 64, Xeon E5-2687W v2 (8x 3.40GHz), G.Skill 64 GB DDR3-2400, ASRock X79 Extreme 11
Mobo: 1 Titan RTX, 1 Titan Xp
External: 6 Titan X Pascal, 2 GTX Titan X
Plugs: Enterprise