Hi guys,
I'm interested in testing Octane render with Unity to calculate light fields for VR, and in learning more about the workflow. I've seen a lot of videos about it.
Best,
JO
Goldorak wrote:
There are a lot of details you can read about in this thread:
https://www.reddit.com/r/oculus/comment ... important/
We plan to release a Unity+Octane build as soon as the version we showed at Unite is a little more stable. Any scene node you can turn into an ORBX package today can be turned into a light field volume on ORC and streamed back into Unity game mode as a baked LF layer. For an LF render job you don't need a render target; the bounding scene or object is the default. If you do include a camera/RT node, it will go through a new render-job node. The automatic conversion for LF rendering uses the IPD you set in the stereo camera plus a radius (typically 50 cm).
The medium-term goal is for the Unity Editor + Octane integration to let a creator set up the layers/elements/viewpoints that bake into Unity meshes (with Octane 3's current baking) and/or ORBX light fields / glTF streams (more advanced baking outside the Unity mesh system, placed at a fixed URL GUID as ORBX media streams or files). Those load into game mode in place of a proxy bounding box or shape when played back with the ORBX media runtime, and are blended correctly with Unity elements. At some point after the first beta release, we plan to enable the ORBX exporter to also package your Unity project into an ORBX media file, which will play back via the baseline OMP runtime from a URL as-is, even when Unity updates to new versions. So far this is looking promising on most platforms.
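To make the IPD-plus-radius rule above concrete, here is a minimal sketch of how such a light field render job might be described. All of the field and function names here are hypothetical illustrations, not the actual ORC API; only the rule itself (capture volume derived from the stereo camera IPD plus a radius, typically 50 cm, with no render target required) comes from the post.

```python
# Hypothetical sketch of an ORC light field render-job description.
# Field names are invented for illustration; only the IPD + radius
# rule mirrors the workflow described above.

def lightfield_job(scene_orbx, ipd_m=0.064, radius_m=0.50):
    """Build a render-job description for a light field volume.

    The capture volume is a sphere whose radius is half the stereo
    camera IPD plus the extra radius (50 cm is the typical default).
    """
    return {
        "scene": scene_orbx,             # any scene node packaged as ORBX
        "type": "lightfield_volume",     # instead of a stereo cube map
        "volume_radius_m": ipd_m / 2 + radius_m,
        "render_target": None,           # not required; bounding scene is default
    }

job = lightfield_job("product_scene.orbx")
print(job["volume_radius_m"])  # ~0.532 for a 64 mm IPD and 50 cm radius
```

The point of the sketch is simply that the capture volume is derived automatically from camera settings, so a plain scene packaged as ORBX is enough to submit a job.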
Jolbertoquini wrote:
Thanks Goldorak,
So if I understand correctly, I don't need Unity to use light fields in VR. My question is: how do I take a full product scene created in Standalone and display it on Samsung VR with the light field?
Or what do I do if I want to create a web viewer of the product as a light field, with the option to switch to a VR view?
I was thinking of something simple like a car or a laptop; what I mean is just a single product.
Best,
JO
Goldorak wrote:
Next, imagine this same cloud render job on ORC had a checkbox for one more resolution type, similar to the stereo cube map at 72K detail, but instead of a dual-eye stereo render it produces a bubble of light rays. This format can be sent as a live stream to a VR or 2D client at 1.5 to 25 Mbps, where you can move anywhere in the bubble and the result is exactly like moving through the scene in the Octane viewport, but without noise. This is the light field (plus extras like AOVs) that extends the stereo cube map we have now with volume boundaries, enabled with a few checkboxes.
When the render is done, it will be added to the JSON manifest of your VR render jobs on ORC, and the stream will be viewable on any device, falling back to .mp4 or OpenGL-compressed cube map video in an ORBX media file if you need to run it offline. We can also offer a baked-mesh option for near-field content if the scene is under 15 million polys.
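The playback options above amount to a small decision chain: stream the light field live when the client supports it, otherwise fall back to an offline format. The sketch below illustrates one possible ordering of that chain; the function and format names are invented, and the exact fallback priority is an assumption on my part. Only the 15-million-polygon threshold and the format list come from the post.

```python
# Hypothetical sketch of the playback fallback chain described above.
# Format names and the ordering are illustrative assumptions, not part
# of any shipping OMP runtime; the 15M-poly limit mirrors the post.

def pick_playback(supports_lf_stream, offline, scene_polys):
    """Choose how a finished ORC light field job is played back."""
    if supports_lf_stream and not offline:
        return "lightfield_stream"       # 1.5 to 25 Mbps live stream
    if scene_polys < 15_000_000:
        return "baked_mesh"              # near-field option for small scenes
    return "orbx_cubemap_video"          # .mp4 / compressed cube map fallback

print(pick_playback(True, False, 40_000_000))   # lightfield_stream
print(pick_playback(False, True, 5_000_000))    # baked_mesh
```

However the runtime actually orders these, the key point is that every job in the JSON manifest ends up viewable on any device, with progressively cheaper representations as capability drops.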
ristoraven wrote:Jolbertoquini,
My advice: don't promise your clients anything yet, and don't make any plans yet. This tech is not ready; it is cooking and looking good. Nevertheless, even when it seems ready, it still needs to be tested through and through before it has matured enough to be offered to a paying client. This will all take time. Also, how much a five-minute light field animation would cost to render in the cloud is still a mystery.
So hold your horses. Your company is not the first one interested in this tech.
Skyt8000 wrote:May I know why Unreal Engine is left out of this? Will there be an integration with Unreal after Unity?