GPU rendering is not the magic bullet it was promised to be

Generic forum to discuss Octane Render, post ideas and suggest improvements.
Forum rules
Please add your OS and Hardware Configuration in your signature, it makes it easier for us to help you analyze problems. Example: Win 7 64 | Geforce GTX680 | i7 3770 | 16GB
pixym
Licensed Customer
Posts: 597
Joined: Thu Jan 21, 2010 4:27 pm
Location: French West Indies

If I remember correctly, Furry Ball's render quality is DirectX 11-based… (as is MachStudio GPU).
Workstation: MB ASUS X299-Pro/SE - Intel i9 7980XE (2.6GHz, 18 cores / 36 threads) - RAM 64GB - RTX 4090 + RTX 3090 - Win10 64
Net render: MB ASUS P9X79 - Intel i7 - RAM 16GB - Two RTX 3080 Ti - Win10 64
pixelrush
Licensed Customer
Posts: 1618
Joined: Mon Jan 11, 2010 7:11 pm
Location: Nelson, New Zealand

This is all happening because today is Friday the 13th. :o
Well, it is here anyway @ 1.30am.
Enough forum torture for today. :evil:
1 more sleep and maybe Santa has come ;)
i7-3820 @4.3Ghz | 24gb | Win7pro-64
GTS 250 display + 2 x GTX 780 cuda| driver 331.65
Octane v1.55
_pepis
Licensed Customer
Posts: 36
Joined: Tue Apr 06, 2010 2:17 pm

I have used some traditional renderers (always biased), and now I use only Octane. For my purposes Octane works perfectly!
What's more, I get results in seconds that I couldn't get with other tracers. I have produced so many renders since my Octane purchase.

I am really impressed (and wowed) by Octane's speed. I've got an AMD64 X2 5600 CPU and a crappy 8800 GTX, and this 8800 generates fantastic renders with Octane; I can't imagine waiting for the results with the Maxwell demo.
I also tried V-Ray, and it is a good renderer, but in my opinion it is like a turtle in comparison to Octane... (and still biased ;-)) I think that once you assemble all this magic, MLT and so on, Octane will blow everything away with its speed! ;-)

Peter

P.S. Sorry for my English.
Pauls
Licensed Customer
Posts: 37
Joined: Fri May 07, 2010 1:41 pm

I've been a user of Modo for a few years and find it to be a good app. Luxology is a good company that I think tries to be fair, and Brad Peebler is a good guy; I'm sure there was no evil intent towards Octane. I'm just guessing that Octane has gotten the most buzz of late (especially on the Modo forums), so it was the target of the video.

My personal belief is that this video was meant to calm the existing user base regarding the "why no GPU development in 501?" question. That is all... just a marketing communication to show users they aren't missing much by Modo not having any GPU development. It's just marketing. I have a feeling that Luxology wishes it could take the video back now... there was a reason the video was shown to existing users only.

I love Octane and will love it even more once some of the needed features are released. I personally think GPU rendering is very fluid, and a very creative and fresh approach to creating 3D imagery.

My 2 cents... all personal opinion and supposition.
DayVids
Licensed Customer
Posts: 350
Joined: Tue May 11, 2010 2:35 am

That sounded like a company trying to quell their customers' requests for something they don't want to implement / aren't prepared to invest in. I've never used Modo, and I don't even do a lot of 3D work; I do sales, marketing, video production, etc. But to me it was very clearly a "we're not going to do this, so let's get people off our backs." The comparisons were not fair in that they were not apples to apples, but they were fair in that they were product to product, for similar purposes, toward a desired result.

I like Octane: I like how it is fast and easy, how I can invest minimal money and get fast results, and I am hoping we'll see even faster results as it ages and gets more refined. I don't do hair, fur, etc., so that doesn't bother me. I would like to see particle calculation done on the GPU at some point, as I would think the GPU is ideally suited for that. But I'll be keeping Octane in my toolbox and up to date as long as I can pay for it (which doesn't appear to be a problem by the looks of it) and as long as it's useful to me.

Anyway, yeah, that's my $.02.
CPU - i7-950 3.06 Ghz, 24GB Ram, Win7 x64, 2 display monitors, GeForce GTX 580 3GB Classified. I'm glad to say I LOVE OCTANE!
Pauls
Licensed Customer
Posts: 37
Joined: Fri May 07, 2010 1:41 pm

DayVids wrote:That sounded like a company trying to quell their customers' requests for something they don't want to implement / aren't prepared to invest in. I've never used Modo, and I don't even do a lot of 3D work; I do sales, marketing, video production, etc. But to me it was very clearly a "we're not going to do this, so let's get people off our backs." The comparisons were not fair in that they were not apples to apples, but they were fair in that they were product to product, for similar purposes, toward a desired result.

Anyway, yeah, that's my $.02.
I completely agree... I just feel it was done to calm the user base. And as a company, that's probably a smart thing to do if one doesn't plan on introducing any sort of GPU "buzz-word" rendering features in an upcoming release. I don't think it was a very fair portrayal of Octane, or of the benefits of incorporating this new technology.

For certain kinds of work, a GPU workflow is very fluid and very creative... this aspect was not dealt with in the video. Anyway, I don't think it is too hard to portray something in a less than flattering light if one really wants to... and Luxology really didn't put forward any of the positives a GPU renderer has. I personally think minds were made up before much testing was even done.

2.5 cents' worth, and probably all I really need or want to say, other than: more power to Octane and the guys developing it.
adrencg
Licensed Customer
Posts: 236
Joined: Wed Feb 24, 2010 4:20 am

I think he's right about GPU + CPU being the future.
Ryzen 5950x
128GB Ram
RTX 3070 x 3
wacom

"Luxology is working with intel to benchmark GPU renderers like Octane, Arion and others to see how they compare to CPU renderers." Lux rep

Well, this is all you need to know: Luxology is getting paid, by Intel, to do "tests" of CPU vs. GPU rendering. Hmm, they didn't ask mental images. Oh, that's right, because mental images is owned by Nvidia, which is currently eating into Intel's research, development, medical, simulation, and oil-and-mining-exploration markets with inexpensive, faster GPU-based options! Additionally, ATI and Nvidia are just too good at making GPUs fast and cheaply... and Intel, which has tried several times to sway the market, has not been able to deliver on that front.

So boy, it's just so puzzling why Luxology would take the time to single out the main GPU-only render engine on the market.

Intel is on the attack, pure and simple, to help thwart further deterioration of certain markets. Who do you think buys the bulk of their high-end chips on the open market? Probably not people playing Minesweeper or Crysis. Intel is playing the long game, trying to plant the seed now that you'll always need a CPU, while quite a few markets show that in the future the distinction between CPU and GPU is going to be increasingly blurred (i.e. AMD's next round of chips). Every high-end GPU that 3D-Coat, Arion, or Octane indirectly sells is most likely an Intel chip NOT bought or upgraded.

The fact of the matter is that with CUDA and OpenCL the genie is out of the bottle, and there is no reason there might not someday be some kind of Nvidia GPU that has an Atom-class processor embedded on it. Think about it: if the base parts of the OS ran on an Atom-like chip, but all of your simulation, deformation, rendering, transform solving, etc. ran on the GPU... would you need a 12-core CPU? This is exactly the kind of plausible future that Intel never wants us to see, as it would mean either greatly re-tooling their company or giving up huge amounts of market share.
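
To make that concrete, here is a minimal CUDA sketch of the one-thread-per-pixel style of work these chips now chew through; the kernel, names, and numbers are made up for illustration, not anyone's actual renderer code:

// Illustrative sketch only: fan ~1M "pixels" out across GPU threads,
// the way CUDA/OpenCL let any parallel workload run on the card.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void shade(float *pixels, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per pixel
    if (i < n)
        pixels[i] = 0.5f + 0.5f * __sinf(0.01f * i); // stand-in for real shading
}

int main() {
    const int n = 1 << 20;                        // ~1M "pixels"
    float *d_pixels;
    cudaMalloc(&d_pixels, n * sizeof(float));
    shade<<<(n + 255) / 256, 256>>>(d_pixels, n); // 4096 blocks of 256 threads
    cudaDeviceSynchronize();                      // wait for the GPU to finish
    cudaFree(d_pixels);
    printf("shaded %d pixels on the GPU\n", n);
    return 0;
}

Nothing in that sketch needs a big CPU; the host core just queues the work.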

Way to be an Intel pawn, Luxology. Is there nothing that company will not do for a quick buck?
Michael314
Licensed Customer
Posts: 51
Joined: Sat Mar 06, 2010 3:41 pm
Location: Germany

Hello,
I would not take that video too seriously. As has been pointed out earlier, I see it as Luxology backing up their decision not to implement a GPU-based renderer in Modo 501; obviously they had tons of people asking for that.

I even believe there are lots of cases where CPU-based computing is better right now, mainly due to the RAM limitations of current graphics cards. But we are not talking about today. Octane is just at its beginning; the 1.0 version is not out yet. By the time we see Octane 3.0 or so, graphics chips will be orders of magnitude more powerful, with more than 1,000 compute cores and 8 GB of RAM or more per card. I don't think any CPU-based renderer will be able to catch up with such a rig then; CPU development is much slower right now. And the video confirms it: they will look into putting tasks on the GPU as well, just not right now. But by the time they start on that, Octane will already be miles ahead.
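
To illustrate the RAM point: before a GPU renderer can do anything, the whole scene has to fit on the card. A minimal CUDA sketch of that check (the scene size is a made-up number, not anything from Octane):

#include <cstdio>
#include <cuda_runtime.h>

int main() {
    // Ask the driver how much onboard RAM the current card has left.
    size_t freeBytes = 0, totalBytes = 0;
    cudaMemGetInfo(&freeBytes, &totalBytes);

    // Hypothetical scene footprint: geometry + textures + framebuffer.
    const size_t sceneBytes = 900ull * 1024 * 1024;   // a made-up 900 MB scene

    printf("GPU RAM: %zu MB free of %zu MB total\n",
           freeBytes >> 20, totalBytes >> 20);
    if (sceneBytes > freeBytes)
        printf("Scene does not fit in VRAM -- where CPU renderers still win.\n");
    else
        printf("Scene fits -- the GPU can take the job.\n");
    return 0;
}

Once cards routinely report 8 GB free, that check stops being the bottleneck.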

Best regards,
Michael
Win XP 64 | GTX 580 (3 GB) | Core2Quad 2.83GHz (Q9550) | 8GB
Win7 Ultimate 64 | GTX 680 (4 GB) | Core i7 X980 | 24GB
kivig
Licensed Customer
Posts: 152
Joined: Fri Apr 09, 2010 10:42 pm

Michael314 wrote:Hello,
I would not take that video too seriously. As has been pointed out earlier, I see it as Luxology backing up their decision not to implement a GPU-based renderer in Modo 501; obviously they had tons of people asking for that.

I even believe there are lots of cases where CPU-based computing is better right now, mainly due to the RAM limitations of current graphics cards. But we are not talking about today. Octane is just at its beginning; the 1.0 version is not out yet. By the time we see Octane 3.0 or so, graphics chips will be orders of magnitude more powerful, with more than 1,000 compute cores and 8 GB of RAM or more per card. I don't think any CPU-based renderer will be able to catch up with such a rig then; CPU development is much slower right now. And the video confirms it: they will look into putting tasks on the GPU as well, just not right now. But by the time they start on that, Octane will already be miles ahead.

Best regards,
Michael
+1

Though computer architecture could change too; new CPUs could become more like modern GPUs.
http://www.visnevskis.com
Vista64/Ubuntu, GTX470/580