face_off wrote:So in summary, IMO you can still work with the 580, but you just need to be wise about the scene metrics. If you have a specific example scene - post a screenshot here, along with the output from the Scene Texture Counter, and we can offer suggestions for fitting in the textureslots limits.
That is exactly my point - I can still use Octane, but only if my scene fits within the 32-grayscale-map limit. In my last render
http://ken1171.deviantart.com/art/From-Nyanna-with-Love-358222578 there was only a single figure (Vicky 4), and I already had to sacrifice bump maps all over. I assume this was because the clothing relied mainly on transparency maps, which, along with the hair, blew past the 32 CUDA grayscale map limit in a jiffy. I hence had to replace the room she was in with a plain backdrop just to get Octane to quit moaning about the CUDA limits.
I have condensed all the texture maps that were repeated, reused specular maps as bump maps (ugh! there goes my pretty skin shader), and basically removed almost all the bump maps until I got down to 32 grayscale maps total, replacing what could be replaced with Octane procedural maps (with dubious results). There was nothing I could do about the hair, because transmaps cannot be removed or replaced with procedural textures.
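For anyone else trying this consolidation dance, the bookkeeping boils down to counting *unique* grayscale images across all materials, since a deduplicated map only costs one slot no matter how many materials reuse it. Here is a minimal sketch of that audit in Python - the material dictionary and file names are purely hypothetical (real scenes would come from the Scene Texture Counter or the plugin itself), only the 32-slot budget is from the discussion above:

```python
# Hypothetical audit of grayscale texture slots against the 32-map CUDA limit.
GRAYSCALE_SLOTS = 32

# Illustrative scene data: each material lists the image files it plugs into
# grayscale channels (bump, specular, transparency). Not a real Octane API.
materials = {
    "skin":  {"bump": "skin_b.png", "specular": "skin_s.png"},
    "hair":  {"transparency": "hair_tr.png", "specular": "hair_s.png"},
    "dress": {"transparency": "lace_tr.png", "bump": "lace_b.png"},
}

def grayscale_usage(materials):
    """Return the set of unique grayscale maps; duplicates share one slot."""
    used = set()
    for channels in materials.values():
        used.update(channels.values())
    return used

used = grayscale_usage(materials)
print(f"{len(used)} of {GRAYSCALE_SLOTS} grayscale slots used")
if len(used) > GRAYSCALE_SLOTS:
    print("Over budget - consolidate repeated maps or drop bump maps")
```

The point of the set is exactly the consolidation step above: pointing two materials at the same file frees a slot, whereas two identical maps saved under different names still cost two.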
At this point things are no fun anymore, for instead of creating I am fighting the CUDA limits just to render a single character. The lace details were lost because the grayscale bump maps just had to go, and Octane doesn't support the procedural noise nodes we have in Poser. Bump maps didn't work very well as replacements for specular maps, so my skin shader looks rather flat. And just forget about putting 2 or more characters in the scene, unless it were a nude beach scene - but nude is not my thing. Maybe it would be possible if I carefully selected clothing (and hair) that won't blow past the 32-map CUDA grayscale limit.
As for the GTX580, I do indeed use it with 3DSMAX. By the time I bought this card last year, there was no GTX680 model that shipped with more than 1 GB, or with a more robust cooling solution (like the GTX580 Classified has). The hardware reviews also rated the GTX680 poorly in CUDA performance, claiming the older GTX580 still outperformed it in basically all benchmarks. At the time it seemed like a bad investment. nVidia blames the drivers for the low CUDA performance, but there are rumors that nVidia intentionally crippled the CUDA performance to protect their higher-end Quadro cards, which ship with way fewer streaming processors but cost a lot more, from losing the competition.
In any case, assuming that people should buy the latest and greatest high-end video card (around $500) is not quite reasonable these days. Considering the cost/benefit, I probably just made a poor deal with Octane. Only 32 grayscale maps is just not enough... -_______-