What's the rough procedure for creating an ORBX VR movie? For example, the Adam animated VR movie, where I can look around while the movie is playing back. It's no longer a .png file like the stills; it's an .orbx file. I use Blender with the Octane demo right now. Does it support .orbx VR movie files?
All of those VR movies were created on ORC, where we use unused CPU cycles to encode each SCM frame for multiple OpenGL ES device profiles while Octane uses the GPU. I wish we had one codec, but differences between GPUs make this hard. So the GearVR GL codec is not the same as the PC VR desktop GL codec; we create different .OKX frames for each and use ORC to link the multiple encoded outputs to the render job ID, which you can then use to download (or live stream) the right ORBX media file based on user agent (the baseline stream is HTML5/WebVR through orbx.js).
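The selection step described above (one render job ID linked to several encoded outputs, with the right variant picked per user agent) can be sketched roughly like this. This is a hypothetical illustration, not OTOY's actual API: the profile names, the lookup table, and both function names are made up for the example.

```python
# Hypothetical sketch: serve the right encoded ORBX variant per device.
# Profile names and the job-ID -> variants table are illustrative only.

def pick_profile(user_agent: str) -> str:
    """Map a browser/device user agent to an encoding profile."""
    ua = user_agent.lower()
    if "gearvr" in ua or "mobile vr" in ua:
        return "gearvr-gles"      # mobile OpenGL ES profile
    if "oculus" in ua or "openvr" in ua:
        return "pcvr-gl"          # desktop GL profile
    return "html5-webvr"          # baseline stream (orbx.js)

# Each render job ID is linked to multiple encoded outputs on the cloud:
ENCODED_VARIANTS = {
    "job-1234": {
        "gearvr-gles": "job-1234.gearvr.orbx",
        "pcvr-gl":     "job-1234.pcvr.orbx",
        "html5-webvr": "job-1234.webvr.orbx",
    },
}

def media_for_request(job_id: str, user_agent: str) -> str:
    """Return the download/stream URL path for this job and client."""
    return ENCODED_VARIANTS[job_id][pick_profile(user_agent)]
```

In a real deployment the lookup would of course sit behind the media server rather than in a dictionary, but the idea is the same: one job ID, several encodings, dispatch by user agent.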
Encoding took longer than rendering for the Adam frames (and would cost a lot if we didn't run it in parallel with the GPU work). It is handled by the ORBX Media Server module on the cloud, which can also be used to set up live adaptive streaming of the same media (and baked meshes, or LF streams in the future). The ORBX Media Server is currently a separate app from Octane, but you can download the non-cloud Windows PC version for free from install.otoy.com. It can be used to try some of the ORBX live VR streaming and broadcasting features (currently Virtual Chrome and Virtual Desktop) inside Oculus Social or OMP.
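The parallelism mentioned above, where idle CPU cycles encode already-rendered frames for every device profile while the GPU keeps rendering, can be sketched as a simple pipeline. This is only an illustration of the scheduling idea; `render_frame` and `encode_frame` are hypothetical stand-ins, not real Octane or ORC calls.

```python
# Illustrative sketch: overlap CPU-side encoding with GPU-side rendering.
# While the (serial) render loop produces frame N+1, a thread pool encodes
# frame N for each device profile in the background.
from concurrent.futures import ThreadPoolExecutor

PROFILES = ["gearvr-gles", "pcvr-gl", "html5-webvr"]  # made-up names

def render_frame(n: int) -> str:
    """Stand-in for the GPU render step; returns a frame identifier."""
    return f"frame-{n}.exr"

def encode_frame(frame: str, profile: str) -> str:
    """Stand-in for the per-profile CPU encode step."""
    return f"{frame}.{profile}.okx"

def run_pipeline(num_frames: int) -> list[str]:
    futures = []
    with ThreadPoolExecutor() as pool:
        for n in range(num_frames):
            frame = render_frame(n)  # GPU renders one frame at a time...
            # ...while encoding for every profile is queued to idle CPU cores
            futures += [pool.submit(encode_frame, frame, p) for p in PROFILES]
        return [f.result() for f in futures]
```

Because encoding is slower than rendering here, hiding it behind the render loop like this is what keeps the extra cost low.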