GPU rendering is not the magic bullet it was promised to be
Forum rules
Please add your OS and Hardware Configuration in your signature, it makes it easier for us to help you analyze problems. Example: Win 7 64 | Geforce GTX680 | i7 3770 | 16GB
- AquaGeneral
- Posts: 23
- Joined: Fri Mar 19, 2010 6:00 am
Keep in mind if he is using 12 threads, that means he most likely has 2x Intel Core i7 980X's or the Xeon equivalent. That machine would cost 10x more than a standard computer with, say, a GTX 470. The fact that the benchmark pits new high-end CPUs against older workstation-oriented high-end GPUs is unfair in its own right.
GeForce GTX 980 + GeForce GTX 570 | Intel Core i7 2600K | 8GB of RAM
AquaGeneral wrote: Keep in mind if he is using 12 threads, that means he most likely has 2x Intel Core i7 980X's or the Xeon equivalent. That machine would cost 10x more than a standard computer with, say, a GTX 470. The fact that the benchmark pits new high-end CPUs against older workstation-oriented high-end GPUs is unfair in its own right.
Exactly, not to mention you can then add six GTX 470s in Xpanders for the price of one of those 12-core Boxx workstations.
I could put that scene in mental ray, turn on a few switches, and easily make it take 10 minutes.
The part of the demo I think isn't explained is how the scenes are optimized (Octane's maxdepth setting?).
Also, those Quadro cards aren't the GPUs you'd choose to go toe to toe with 2x 3.33GHz Xeons.
It would be fairer to do the benchmark using 1 Xeon ($1700) against 4 GTX 480s ($1800),
or 2 Xeons ($3400) against 8 GTX 480s ($3600).
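To make that cost argument concrete, here's a minimal sketch of the price-normalised comparison being proposed. Only the hardware prices come from the post above; the render times are made-up placeholders, since no actual benchmark figures are given here.

```python
# Hypothetical price/performance comparison. Prices are the ones quoted above;
# render times are placeholders, NOT real benchmark results.
configs = {
    "1x Xeon 3.33GHz": {"price_usd": 1700, "render_time_s": 600},
    "4x GTX 480":      {"price_usd": 1800, "render_time_s": 150},
    "2x Xeon 3.33GHz": {"price_usd": 3400, "render_time_s": 300},
    "8x GTX 480":      {"price_usd": 3600, "render_time_s": 75},
}

for name, c in configs.items():
    frames_per_hour = 3600.0 / c["render_time_s"]                 # raw throughput
    per_kilodollar = frames_per_hour / (c["price_usd"] / 1000.0)  # cost-normalised
    print(f"{name:16s} {frames_per_hour:6.1f} frames/hr, "
          f"{per_kilodollar:5.1f} frames/hr per $1000")
```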
2x xeon5650 | 3 x titan | Win7 64 | maya2014x64 | mountain mods
The goal of Brad's video is not to shoot down Octane. He used Octane because to them it probably looked like the best one out there to test. It could have been any other GPU renderer.
What he was pointing out was simply that the speed boost from GPU rendering is not what it was hyped to be, and because of this they will not code it into their upcoming release. What they are trying to do, however, is find the balance between CPU and GPU and, as he said, they are trying to 'crack that nut'.
It's definitely not to say Octane is bad or not a good renderer. That is not the style of Luxology.
I don't want to sound like a fan, but reading some of these posts suggests there may be some misinterpretation of why he showed the comparison.
I am sure Octane can produce real quality images too, but the speed boost is just not there yet according to their testing.
To me, however, they are definitely not pissing in this pond, and many on the Lux forums don't feel that way either. Most took it simply as an explanation of why GPU rendering probably won't be in the next release.
peace
I watched it again and I'm sorry, but I think it's largely a snow job.
Some of it sounded like a none-too-subtle slur on Octane's capability, even though it's known to be in early beta and limited.
As I understand it, Octane is actually the only fully GPU renderer out there, so I don't buy the idea that they just happened to pick Octane to compare against.
They could have chosen to do a three-way test with CPU+GPU as well, and it would have been more respectable 'research'.
As for respecting and admiring Octane, he just didn't look like a happy fellow to me.
I don't think the comparison was fair, and I don't think Lux customers are getting an unbiased report. As others have mentioned, there are a multitude of factors to consider.
Nevertheless, tales propagated by Lux for their customers are not our concern.
We are all happy campers in Octaneland.
It should be noted that the whole Lux strategy is based on multiple CPUs.
It won't be easy for them to change that.
I think this is the fundamental problem.
They are not in a position to double back.
They have a pretty mature product and an established position, but it is suddenly a superseded technology.
GPU computing has quite a long way to go yet.
Actually, Lux have an agreement with SolidWorks to provide a cut-down version of the renderer for their CAD program.
SolidWorks are known to be working on cloud computing applications, including rendering.
It wouldn't surprise me if Lux pops up on a SolidWorks cloud, and this is what he might have been referring to.
Still, I think it's a last-gasp effort to retain customer interest when quite obviously GPU is the new kid in town.
Last edited by pixelrush on Thu Aug 12, 2010 4:47 am, edited 2 times in total.
i7-3820 @4.3Ghz | 24gb | Win7pro-64
GTS 250 display + 2 x GTX 780 cuda| driver 331.65
Octane v1.55
Modo and other renderers are hardly superseded; as you go on to say, GPU tech has a long way to go, with several limitations at present. Maybe future GPUs will be designed with fewer of these limits.
Win 10. Threadripper 1920X 32gb Ram GTX 1080Ti GTX 980Ti 2xGTX 780
http://ajdesignstudio.co.nz/
Indeed, I think it's quite exciting that the GPU hardware as well as its applications are still basically in their infancy. I would expect GPU hardware speeds to potentially increase much faster than CPU speeds for a while, as CPUs have been scraping the bottom of the performance barrel for quite a while now, hence the basically forced move to multi-core and multi-CPU systems.
And the competition from GPUs will just push the CPU manufacturers to accelerate building even-more-core CPUs sooner.
The trick may be for enough GPU-accelerated applications (or maybe ludicrous-resolution gaming) to justify the GPU makers continuing to push performance; the same is probably true on the CPU side, as most people won't know what to do with a 64-core CPU chip either.
Fun times we live in.
Z.
Dell XPS730 H2C | Q9550 4GB | Win 7/64 | GTX-470
Let's hope a major split in development isn't needed between GPUs for games and GPUs for other tasks (rendering). That could lead to fewer of the chips being sold, increased costs, etc., if it became a one-trick pony. Look at ArtVPS: they used to make raytracing hardware but no longer do, and now have a software/CPU solution.
Win 10. Threadripper 1920X 32gb Ram GTX 1080Ti GTX 980Ti 2xGTX 780
http://ajdesignstudio.co.nz/
Pixelrush, you sound a bit too much like a fanboy...calm down mate.
I also think this test wasn't meant to discredit Octane; why would they do that? I think from a technological point of view he is absolutely right in saying that currently the best way to get the maximum horsepower out of any given PC is to make use of both CPU and GPU. Modo can't do this currently. So he isn't claiming Modo is already on the right path for the future as a renderer; he's saying it isn't at the moment, but that they will steer towards utilising the GPU as well where applicable.
CPU and GPU are two very different things that both have their strengths and weaknesses.
I for one think it would also benefit Octane to think outside of its "comfort zone" a bit. The current video-RAM limit is a huge problem in Octane. It's probably the biggest reason I'm not using it much, even though I love the look of the output you get from it. It's nice for small images, but for print sizes (which is what I do a lot of), what is the point? Now imagine Octane rendering small tiles in VRAM, filling a big framebuffer that sits in system RAM rather than VRAM, preferably writing out different passes at 32 bits per channel, while still using the video RAM for its superfast calculations. Even that simple use of your other hardware (not even using your CPU for rendering calculations) would be an enormous benefit to Octane. Staring yourself blind at one specialised bit of kit is just not clever.
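That tiling idea can be sketched in a few lines: render small tiles on the GPU (so only a tile ever has to fit in VRAM) and copy each one into a big 32-bit-per-channel framebuffer held in ordinary system RAM. This is just an illustration of the concept; render_tile_on_gpu is a made-up placeholder, not Octane's actual API.

```python
import numpy as np

TILE = 256                   # tile edge in pixels; only this much needs VRAM at once
WIDTH, HEIGHT = 8192, 8192   # print-size output, held in system RAM, not VRAM

def render_tile_on_gpu(x0, y0, w, h):
    """Placeholder for a GPU render call returning a float32 RGBA tile.
    A real renderer would launch its kernels for just this region of the
    image; here we fake the result with noise."""
    return np.random.rand(h, w, 4).astype(np.float32)

# Big 32-bit-per-channel framebuffer lives in ordinary RAM.
framebuffer = np.zeros((HEIGHT, WIDTH, 4), dtype=np.float32)

for y in range(0, HEIGHT, TILE):
    for x in range(0, WIDTH, TILE):
        w = min(TILE, WIDTH - x)
        h = min(TILE, HEIGHT - y)
        # Render one tile on the GPU, then stash it in the host-side framebuffer.
        framebuffer[y:y + h, x:x + w] = render_tile_on_gpu(x, y, w, h)
```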
gristle,
Well yeah, there's nothing wrong with the Lux renderer; it still gets you from A to B.
I wasn't trying to run down its quality or capability, although I disagree with their assessment of GPU rendering using Octane as the example.
Cross-ply tires were circular like the radials that replaced them.
An unbiased renderer isn't the final word on the kind of rendering a GPU can be used for, either.
Maybe someone makes a very nice realtime raytracer that attracts a lot of customers away from Lux. What are you going to say about speed then? RT at 35 fps versus Lux at 35 spf?
I guess you talk about the other things it can't do and Lux can.
There are other applications too that run very well on the GPU, which people might like to have a grunty CUDA GPU in their PC for as well.
I don't think the case for GPU starts and stops with Octane.
Zoot,
Yes, I think it's likely the CPU has peaked while the GPU has only just got going.
It seems likely that multiple cores won't go too much further and will probably always be at a price/performance disadvantage.
Last edited by pixelrush on Thu Aug 12, 2010 6:03 am, edited 2 times in total.
i7-3820 @4.3Ghz | 24gb | Win7pro-64
GTS 250 display + 2 x GTX 780 cuda| driver 331.65
Octane v1.55