Large Poly count and memory

treddie
Licensed Customer
Posts: 739
Joined: Fri Mar 23, 2012 5:44 am

steveps3 wrote:My three cards are identical (560 Ti), so I either leave one of them free to drive the OS or, if I want the extra speed boost, I use all three. Obviously, if I use all three to render, I once again get the problem of some of the memory being eaten by Windows.
That doesn't seem like much of an "issue", as it were. You get two cards totally dedicated to the render plus a significant boost from the third, and the only cost is that Windows gets the "jitters". When I'm working on a project and using all of my resources, I'm so excited about how the render is progressing that I say, "To hell with what Windows needs!" :)

Which brings up a question... is it possible to de-allocate a CUDA device in the middle of a render? That would be nice: you could spend all of your resources on the render, with the option of taking a card out of the loop later to let it concentrate on Windows.
Win7 | Geforce TitanX w/ 12Gb | Geforce GTX-560 w/ 2Gb | 6-Core 3.5GHz | 32Gb | Cinema4D w RipTide Importer and OctaneExporter Plugs.
steveps3
Licensed Customer
Posts: 1118
Joined: Sat Aug 21, 2010 4:07 pm
Location: England

I used to have a 260 as my one and only card. When I was rendering with it, the whole system was practically unusable. Because the 560 is a lot faster, and I think partly down to changes within Octane, the same problem doesn't really exhibit itself any more. There is still a limitation in that you can't do anything graphically intense, such as streaming video, but for modelling and the like I can leave all three of my cards thrashing away while I carry on working.
(HW) Intel i7 2600k, 16GB DDR3, MSI 560GTX ti (2GB) x 3
(SW) Octane (1.50) Blender (2.70) (exporter 2.02)
(OS) Windows 7(64)
treddie
Licensed Customer
Posts: 739
Joined: Fri Mar 23, 2012 5:44 am

I hope to have a 570 or 580 here this week sometime.

But I'm still curious: can a card be de-allocated mid-render?
Win7 | Geforce TitanX w/ 12Gb | Geforce GTX-560 w/ 2Gb | 6-Core 3.5GHz | 32Gb | Cinema4D w RipTide Importer and OctaneExporter Plugs.
steveps3
Licensed Customer
Posts: 1118
Joined: Sat Aug 21, 2010 4:07 pm
Location: England

You can turn cards on and off, yes. So if you want to do some work, you can deactivate one card, and when you're ready for a boost, activate it again.
(HW) Intel i7 2600k, 16GB DDR3, MSI 560GTX ti (2GB) x 3
(SW) Octane (1.50) Blender (2.70) (exporter 2.02)
(OS) Windows 7(64)
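
At the CUDA API level, the underlying mechanism looks roughly like this: an application can stop submitting work to one device, let its in-flight kernels finish, and reset it so the driver gets it back for the desktop. Below is a minimal sketch of that idea; the function name and the choice of which device to release are illustrative only, and this is not Octane's actual code (its internals are not public).

#include <cstdio>
#include <cuda_runtime.h>

// Hand one GPU back to the OS/driver while the application keeps
// running on the remaining devices. Illustrative sketch only.
void release_device(int dev)
{
    cudaSetDevice(dev);          // make `dev` the current device
    cudaDeviceSynchronize();     // wait for any in-flight kernels
    cudaDeviceReset();           // free this process's allocations and state on `dev`
}

int main()
{
    int count = 0;
    cudaGetDeviceCount(&count);
    std::printf("%d CUDA devices available\n", count);

    // ... render on devices 0..count-1 ...

    if (count > 1)
        release_device(count - 1);  // give the last card back to Windows

    // ... keep rendering on the remaining devices ...
    return 0;
}

The catch, as the next post explains, is what happens to the work that card had already accumulated.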
treddie
Licensed Customer
Posts: 739
Joined: Fri Mar 23, 2012 5:44 am

Too cool.

I have one card waiting for a neighbor. :)

I'll just have to accept, though, that I won't be able to witness the full power of the 570/580 if I bring in the 260 as well. It would be more like using two 260s, and in that configuration it would probably actually run slower.
Win7 | Geforce TitanX w/ 12Gb | Geforce GTX-560 w/ 2Gb | 6-Core 3.5GHz | 32Gb | Cinema4D w RipTide Importer and OctaneExporter Plugs.
roeland
OctaneRender Team
Posts: 1823
Joined: Wed Mar 09, 2011 10:09 pm

steveps3 wrote:You can turn cards on and off, yes. So if you want to do some work, you can deactivate one card, and when you're ready for a boost, activate it again.
This works, but a current limitation of Octane is that the film buffer is stored on the card, so you lose the samples rendered on a card when you deactivate it. It therefore makes most sense to activate all cards only when you are going to let the render finish.

--
Roeland
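
The reason those samples are lost follows from how a multi-GPU path tracer typically accumulates its image: each card keeps its own film (accumulation) buffer in its own VRAM, and the displayed image is a merge of all of them. Here is a rough sketch of that idea; the struct, names, and buffer layout are assumptions for illustration, not Octane's actual internals.

#include <vector>
#include <cuda_runtime.h>

// Each active GPU accumulates its own film buffer. Hypothetical layout.
struct DeviceFilm {
    int       dev;     // CUDA device index
    float4*   d_accum; // device-resident running sum of radiance per pixel
    long long samples; // number of samples accumulated on this device
};

// Merge the per-device buffers into one host-side image, averaging over
// the total sample count. A card that was deactivated without its buffer
// being copied out first simply never reaches this loop -- its samples
// are gone, which is the limitation described above.
void merge_films(const std::vector<DeviceFilm>& films,
                 std::vector<float4>& h_out, int num_pixels)
{
    std::vector<float4> tmp(num_pixels);
    h_out.assign(num_pixels, make_float4(0, 0, 0, 0));

    long long total = 0;
    for (const DeviceFilm& f : films) {
        total += f.samples;
        cudaSetDevice(f.dev);
        cudaMemcpy(tmp.data(), f.d_accum, num_pixels * sizeof(float4),
                   cudaMemcpyDeviceToHost);
        for (int p = 0; p < num_pixels; ++p) {
            h_out[p].x += tmp[p].x;
            h_out[p].y += tmp[p].y;
            h_out[p].z += tmp[p].z;
        }
    }
    if (total > 0)  // final pixel value = accumulated sum / total samples
        for (int p = 0; p < num_pixels; ++p) {
            h_out[p].x /= total;
            h_out[p].y /= total;
            h_out[p].z /= total;
        }
}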
treddie
Licensed Customer
Posts: 739
Joined: Fri Mar 23, 2012 5:44 am

roeland wrote:This works, but a current limitation of Octane is that the film buffer is stored on the card, so you lose the samples rendered on a card when you deactivate it.
Does this mean that, in the future, Octane will move the film buffer data somewhere else? Would it go into another GPU's memory, or into system RAM?
Win7 | Geforce TitanX w/ 12Gb | Geforce GTX-560 w/ 2Gb | 6-Core 3.5GHz | 32Gb | Cinema4D w RipTide Importer and OctaneExporter Plugs.
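
For illustration, the mechanism treddie is asking about, parking a card's film buffer in system RAM before the card is released, would look something like this at the CUDA level. The function name and buffer layout are hypothetical, and whether Octane adopts anything like this is a roadmap question only the developers can answer.

#include <cuda_runtime.h>

// Hypothetical: preserve a device's film buffer in system RAM before the
// card is deactivated, so its samples could be merged back in later.
// Not a description of Octane's actual or planned behaviour.
float4* park_film_in_host_ram(int dev, const float4* d_accum, int num_pixels)
{
    cudaSetDevice(dev);

    float4* h_copy = nullptr;
    // pinned host memory keeps the device-to-host copy fast
    cudaMallocHost((void**)&h_copy, num_pixels * sizeof(float4));

    cudaMemcpy(h_copy, d_accum, num_pixels * sizeof(float4),
               cudaMemcpyDeviceToHost);

    cudaDeviceReset();  // now the card can go back to driving Windows
    return h_copy;      // caller merges this into the final image later
}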
steveps3
Licensed Customer
Posts: 1118
Joined: Sat Aug 21, 2010 4:07 pm
Location: England

roeland wrote:
steveps3 wrote:You can turn cards on and off, yes. So if you want to do some work, you can deactivate one card, and when you're ready for a boost, activate it again.
This works, but a current limitation of Octane is that the film buffer is stored on the card, so you lose the samples rendered on a card when you deactivate it. It therefore makes most sense to activate all cards only when you are going to let the render finish.

--
Roeland
Ah, OK. I can't say I've ever removed a card at any point other than right at the start of a render. I've added cards midway through a render, but obviously all that does is leave the new card playing catch-up.
(HW) Intel i7 2600k, 16GB DDR3, MSI 560GTX ti (2GB) x 3
(SW) Octane (1.50) Blender (2.70) (exporter 2.02)
(OS) Windows 7(64)