Sometime before v4 stable we’ll turn on v4 enterprise features for all v3 box licenses at no cost to them. At that point, just replace the words “20 GPU limit” with “200 GPU limit”. Also, what card are you denoising on? A single 1080 Ti can denoise 1080p in 1-2 seconds...

Notiusweb wrote: LOL!

milanm wrote: Well, Tutor could give us an answer to that. If it doesn't work now, it should be available to us at some point later on. IMHO XB1 = 'eXtremely Buggy one', so I wouldn't bet on that working right now.

Notiusweb wrote: So we with a standalone license can use 200 GPUs now with the XB1?
Notiusweb wrote: Denoise with one word answer - Yes or No?

42.
Cheers
Milan
I'm thinking 42 translates to "No" for us, then.
So, Otoy, if this is the case, could we get the 200 GPU limit confirmed as part of the XB2?
AND the faster denoise processing time. (For animators rendering at 1080p+, it would be Kraken great...)
Rain down on us with an army of Krakens...
Hello World: OctaneRender 4 is here
Forum rules
NOTE: The software in this forum is not 100% reliable. These are development builds meant for testing by experienced Octane users. If you are a new Octane user, we recommend using the current stable release from the 'Commercial Product News & Releases' forum.
Hi Goldorak, okay, a one-texture scene at 1080p will indeed denoise very quickly on my Titan X Pascal, but more textures slow it down at 1080p+ resolutions. I am not saying that the scene you are referring to has one texture, but really, with a medium volume, emitters, and many textures, are you saying 1-2 seconds at 1080p is currently feasible? What do you measure on the 1080 Ti at 2160p, then?
Goldorak, can you, or someone at Otoy, set up a test scene where you get 1-2 seconds on the 1080 Ti at 1080p, and let us all try it and benchmark it? We may find some system parameter that either speeds up or slows down the denoise. This would benefit every user, and could be crucial to our getting the best from Octane V4.
Otherwise, Vijay_thirukonda explained that there are plans to reduce the denoising processing time, so it can't be at full speed now; it seems that prioritizing this improvement for higher-resolution scenes would be huge for advertising.
(I mean, even Redshift can render fast at 800 x 600, which is what is always shown, and then they say, "See, it's as fast as Octane!"
But it's not when you go to higher resolutions. That's when Octane obliterates it.)
Thanks!
Win 10 Pro 64, Xeon E5-2687W v2 (8x 3.40GHz), G.Skill 64 GB DDR3-2400, ASRock X79 Extreme 11
Mobo: 1 Titan RTX, 1 Titan Xp
External: 6 Titan X Pascal, 2 GTX Titan X
Plugs: Enterprise
Okay, I did some review testing.
I do, to Goldorak's accurate point, get an actual denoising processing time of 2 seconds at 1080p.
For me the same scene at 2160p / 4K / UHD gives me a denoising time of 8 seconds.
But here was the thing - the render time leading up to the denoise is what is slowed down. It takes longer to hit a given sample count when the denoiser is active, because the denoiser is gathering data.
i.e. - my 1080p scene to 100 s/px takes 3 seconds with the denoiser off, versus 10 seconds with the denoiser on plus 2 seconds of processing time (Denoiser Tax), for a net total of 12 seconds including the Denoiser Tax.
('Denoiser Tax' is the Tax charged for data gathering services....more specifically, the data gathering services 'rendered'...GET IT?...)
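The tax can be put into numbers. A quick sketch using my timings above (the variable names are just for illustration; your numbers will vary by scene and GPU):

```python
# "Denoiser Tax" arithmetic for my 1080p test scene at 100 s/px.
render_off = 3.0     # seconds to reach 100 s/px, denoiser off
render_on = 10.0     # seconds to reach 100 s/px, denoiser on
denoise_pass = 2.0   # final denoise processing time

total_on = render_on + denoise_pass        # 12.0 s net, including the tax
denoiser_tax = total_on - render_off       # 9.0 s of pure overhead
overhead_factor = total_on / render_off    # 4.0x slower overall

print(f"net total: {total_on:.1f}s, tax: {denoiser_tax:.1f}s, "
      f"overhead: {overhead_factor:.1f}x")
# prints: net total: 12.0s, tax: 9.0s, overhead: 4.0x
```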
Anyway, let me be more technically specific:
The Kraken Storm would involve:
(1) faster rendering times when the denoiser is active; ideally, the render time would equal that with the denoiser "off"...
(2) a faster actual denoiser processing time when the renders are UHD; if that was also 1-2 seconds...Oh My....
Krakens would be Off the Hook
Win 10 Pro 64, Xeon E5-2687W v2 (8x 3.40GHz), G.Skill 64 GB DDR3-2400, ASRock X79 Extreme 11
Mobo: 1 Titan RTX, 1 Titan Xp
External: 6 Titan X Pascal, 2 GTX Titan X
Plugs: Enterprise
Hi Notiusweb,
when you have found the sampling level that you want for denoising, enable the option to start denoising at the end of the rendering, to have the same rendering time as with denoising deactivated.
ciao Beppe
bepeg4d wrote: Hi Notiusweb,
when you have found the sampling level that you want for denoising, enable the option to start denoising at the end of the rendering, to have the same rendering time as with denoising deactivated.
ciao Beppe

Hi Beppe, thanks.
I know what you are saying: if you check the "Denoise on Completion" box, it disables the denoising previews governed by Min Denoiser Samples and Max Denoising Interval, so 'denoising once' at the end is faster than having it update along the way. Or, if you do have Min-Max enabled, you can set the min sample amount higher than the intended target sample amount to accomplish the same 'denoise once' effect.
But that is indeed already what I am doing and testing. Even with the "Denoise on Completion" box checked, or with the min sample amount higher than the intended target, the renderer is still collecting data for the denoiser, and this still slows down the render process versus not using the denoiser at all. And this is understandable, as it was explained that it is not yet fully developed and optimized.
But, when it is...
Get Ready....
Because we are ready!
Thanks!
Win 10 Pro 64, Xeon E5-2687W v2 (8x 3.40GHz), G.Skill 64 GB DDR3-2400, ASRock X79 Extreme 11
Mobo: 1 Titan RTX, 1 Titan Xp
External: 6 Titan X Pascal, 2 GTX Titan X
Plugs: Enterprise
Simple scene exported from 3ds Max
1) Render without Denoiser - 2:55
2) Render with denoiser - 7:08 WHAT?
Also, I don't see any help from AI Light. ON or OFF - same speed and result.
Samples: 30000
Adaptive samples: ON
size: 1920x1080px
@Ramone163
ramone163 wrote: Simple scene exported from 3ds Max
1) Render without Denoiser - 2:55
2) Render with denoiser - 7:08 WHAT?
Also, I don't see any help from AI Light. ON or OFF - same speed and result.
Samples: 30000
Adaptive samples: ON
size: 1920x1080px
1 - Try setting the sample amount to, say, 200 s/px (vs. the 30,000).
2 - Set the active render pass to the one labeled "DeM" (I see you have it on "Main"). Click the little box that says "DeM" in your green-highlighted screenshot.
3 - Check the "Denoise on Completion" box in the Imager tab settings; this will denoise the DeM render pass only at the end, when it reaches your sample target (say, 200 s/px).
Then play with it to see if you can get a better-looking render faster than 2:55. I find that in some scenes it works (the denoiser beats a straight render), but most often the denoiser 'net' takes longer.
For simple animations, a straight render is almost always faster; at the current processing speed, the denoiser adds no net value. Your comparison shows how the denoise calculation slows down the renderer.
PS - I have no idea myself what the AI Light thing is and how it works!
Win 10 Pro 64, Xeon E5-2687W v2 (8x 3.40GHz), G.Skill 64 GB DDR3-2400, ASRock X79 Extreme 11
Mobo: 1 Titan RTX, 1 Titan Xp
External: 6 Titan X Pascal, 2 GTX Titan X
Plugs: Enterprise
For my scene, the Denoiser is useless.
(Neat Video Denoiser gives a better result with my scene.)
2000 samples and I get this result with denoiser:
ramone163 wrote: For my scene Denoiser is useless.
(Neat Video Denoiser makes better result with my scene)
2000 samples and I get this result with denoiser:

You had that render at 1:06 for 2,000 s/px. But your 30,000 s/px with no denoise was clocked at 2:55.
So the true test now would be to render with denoise on to ~5,900 s/px, or whatever sample amount clocks in close to 2:55.
Then you can see which looks better: (1) denoise with, say, ~5,900 s/px @ 2:55, or (2) no denoise with 30,000 samples @ 2:55.
(But I know what you are saying: you would like it to render much faster, with better results, with the denoiser...)
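For what it's worth, that equal-time sample budget can be estimated by scaling your 1:06 denoised run linearly (a rough assumption; it ignores the fixed denoise pass at the end, which is why it lands a bit under my ~5,900 eyeball figure):

```python
# Back-of-envelope: how many samples could the denoised render reach in the
# same wall-clock time as the 2:55 no-denoise render? Assumes render time
# scales linearly with sample count, which is only roughly true.
denoise_samples = 2000   # s/px reached with denoiser on
denoise_time = 66        # seconds (1:06)
target_time = 175        # seconds (2:55, the no-denoise render)

equiv_samples = denoise_samples * target_time / denoise_time
print(f"~{equiv_samples:.0f} s/px with denoiser in {target_time}s")
# prints: ~5303 s/px with denoiser in 175s
```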
Win 10 Pro 64, Xeon E5-2687W v2 (8x 3.40GHz), G.Skill 64 GB DDR3-2400, ASRock X79 Extreme 11
Mobo: 1 Titan RTX, 1 Titan Xp
External: 6 Titan X Pascal, 2 GTX Titan X
Plugs: Enterprise
Notiusweb wrote: You had that render at 1:06 for 2,000 s/px. But your 30,000 s/px with no denoise was clocked at 2:55.
So the true test now would be to render with denoise on to ~5,900 s/px, or whatever sample amount clocks in close to 2:55.
Then you can see which looks better: (1) denoise with, say, ~5,900 s/px @ 2:55, or (2) no denoise with 30,000 samples @ 2:55.
(But I know what you are saying: you would like it to render much faster, with better results, with the denoiser...)

In my scene, No Denoise with 30,000 samples @ 2:40 is much better than Denoise with 9,000 s/px @ 2:40.
And I don't understand why 30,000 samples render in 2:40, but the same scene with Denoiser ON renders in 7:40.
Maybe the Denoiser ignores the adaptive sampling option?
- 30000 samples Denoiser - OFF, Adaptive sampling - ON = 2:40
- 30000 samples Denoiser - ON, Adaptive sampling - ON = 7:40
- 30000 samples Denoiser - OFF, Adaptive sampling - OFF = 10:50
- 30000 samples Denoiser - ON, Adaptive sampling - OFF = 15:28
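Crunching those four timings (a small sketch; the `secs` helper just converts mm:ss) shows the denoiser's relative slowdown is far bigger when adaptive sampling is on, which fits the suspicion that the denoiser doesn't play along with adaptive sampling:

```python
# Compare the denoiser's slowdown with adaptive sampling on vs off,
# using the four 30,000-sample timings above.
def secs(t: str) -> int:
    m, s = t.split(":")
    return int(m) * 60 + int(s)

# {denoiser state: total render time in seconds}
adaptive_on  = {"off": secs("2:40"),  "on": secs("7:40")}   # 160s, 460s
adaptive_off = {"off": secs("10:50"), "on": secs("15:28")}  # 650s, 928s

slowdown_adaptive_on  = adaptive_on["on"] / adaptive_on["off"]    # ~2.88x
slowdown_adaptive_off = adaptive_off["on"] / adaptive_off["off"]  # ~1.43x
print(f"adaptive ON: {slowdown_adaptive_on:.2f}x, "
      f"adaptive OFF: {slowdown_adaptive_off:.2f}x")
# prints: adaptive ON: 2.88x, adaptive OFF: 1.43x
```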
Developers? How does it work? If we have more samples, does the denoiser need more time?