Render question / VERY Slow Scene Update

We're currently rendering a scene that takes 1 minute to render, but 10 minutes to "update" between frames.
Granted, the scene uses FFX which may play a (large) role in that, but I'm wondering what other factors determine the update speed?
I've tried searching the forum but the "search" feature seems nearly useless.
Any insight would be appreciated.
Thanks!
Thanks for looking at these.
If you need more info, please let me know.
The FFX settings on each of the braids are slightly different, but only in terms of fiber density and minor styling changes.
Logs and settings attached...
- Attachments
- octane.log (354.68 KiB)
Yes, the problem is that LightWave spends around 3 minutes updating the scene for each motion blur time sample (three samples for the standard 2-step MB). The only workaround is to disable the motion blur, but I suppose that isn't an option. In other apps, like Houdini, all the scene information can be exported to cache files to avoid this problem, but I'm not sure how far you can go with this workflow in LightWave.
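A rough sanity check, assuming the ~3 minutes per time sample above is the whole story, lines up with the original report:

    2 motion blur steps -> 3 time samples per frame
    3 samples x ~3 min of scene update each -> roughly the 10 minutes of "update" per frame reported above, versus ~1 minute of actual rendering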
Thanks,
-Juanjo
- NemesisCGI
I did put in a feature request for an MDD-based cache system for FFX; that would have helped here.
I'm sure some third party could create a cache system; the SDK is there for that. They would just have to read the 2-point mesh from FFX & save out a mesh sequence, adding a vertex map for motion (if that can be read).
The workaround I use is to export the FFX to a mesh (2-point polygon chains), but if your model is deforming you are a little stuck matching the frozen fiber mesh to its source. There is the displacement node MetaLink; you can sometimes get away with using that for short fur & baking to MDD. Just be warned that it can take quite some time to initialize & won't play nicely with Sub-Ds. Syflex works too, though it's crash-prone. If you're using bones, you could just transfer the weights to the fiber mesh.
One last way is to use 3rdPowers' cage deformer; again, it's worth baking to MDD for the final render.
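To make the "save out a mesh sequence" / export-to-mesh idea above a bit more concrete, here is a purely illustrative Python sketch (not LightWave SDK code; the fiber data, point counts, and file name are made-up placeholders) that writes a set of guide curves to one OBJ per frame as chains of 2-point line segments, keeping the point order identical from frame to frame so the sequence could later be baked to MDD:

    # dump_fiber_frame.py -- illustrative only; the fiber point data is assumed
    # to come from some FFX export step that is NOT shown here.
    def write_fiber_obj(path, fibers):
        """fibers: a list of curves, each curve a list of (x, y, z) tuples.
        Each curve is written as consecutive 2-point line segments ("l" records)
        so the per-point order stays stable from frame to frame."""
        with open(path, "w") as f:
            index = 1  # OBJ vertex indices are 1-based
            for curve in fibers:
                start = index
                for x, y, z in curve:
                    f.write("v %.6f %.6f %.6f\n" % (x, y, z))
                    index += 1
                # connect consecutive points into 2-point segments
                for i in range(start, index - 1):
                    f.write("l %d %d\n" % (i, i + 1))

    if __name__ == "__main__":
        # fake data standing in for whatever an FFX export step would give you
        fake_fibers = [
            [(0.0, 0.0, 0.0), (0.0, 0.1, 0.0), (0.0, 0.2, 0.01)],
            [(0.1, 0.0, 0.0), (0.1, 0.1, 0.0), (0.1, 0.2, -0.01)],
        ]
        write_fiber_obj("fibers_frame_0001.obj", fake_fibers)

Whether LW keeps the "l" records as 2-point polygon chains on import is something to verify; the point is only that a fixed-order per-frame dump is all a disk cache really needs.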
Win 7 pro 64bit GTX780
- NemesisCGI
I forgot to add: I did a hair sim in Houdini & exported that as an OBJ sequence. I loaded this into LW and, even though it's an OBJ sequence, baked it out to MDD. This works because each OBJ has the same number of points. So what I'm saying here is that if someone wrote that FFX cache-to-disk tool as an LWO sequence, you could convert it to MDD & get the motion blur that way.
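For anyone who wants to try the OBJ-sequence-to-MDD route by hand, here is a minimal, hypothetical Python sketch that packs the vertex positions from a fixed-point-count OBJ sequence into the usual big-endian MDD layout. The file pattern, frame rate, and the assumption that every OBJ lists its vertices in the same order are placeholders, so treat it as a starting point rather than a finished converter:

    # obj_seq_to_mdd.py -- rough sketch, not a production tool.
    # Assumes every OBJ in the sequence has the same number of vertices in the
    # same order, and writes the common big-endian MDD layout:
    #   int32 numFrames, int32 numPoints,
    #   float32 frameTimes[numFrames],
    #   float32 xyz[numFrames][numPoints][3]
    import glob
    import struct

    def read_obj_vertices(path):
        """Return a flat [x, y, z, x, y, z, ...] list from one OBJ file."""
        verts = []
        with open(path) as f:
            for line in f:
                if line.startswith("v "):
                    parts = line.split()
                    verts.extend((float(parts[1]), float(parts[2]), float(parts[3])))
        return verts

    def write_mdd(obj_pattern, mdd_path, fps=25.0):
        files = sorted(glob.glob(obj_pattern))       # e.g. "braid_cache_*.obj"
        frames = [read_obj_vertices(p) for p in files]
        num_frames = len(frames)
        num_points = len(frames[0]) // 3
        with open(mdd_path, "wb") as out:
            out.write(struct.pack(">2i", num_frames, num_points))
            # one time stamp per frame, in seconds
            out.write(struct.pack(">%df" % num_frames,
                                  *[i / fps for i in range(num_frames)]))
            for verts in frames:
                if len(verts) != num_points * 3:
                    raise ValueError("vertex count changed mid-sequence")
                out.write(struct.pack(">%df" % len(verts), *verts))

    if __name__ == "__main__":
        write_mdd("braid_cache_*.obj", "braid_cache.mdd", fps=25.0)

As noted above, this only works because every OBJ has the same number of points in the same order; if the count changes anywhere in the sequence, the MDD is useless.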
Win 7 pro 64bit GTX780