We are planning to update the ORBX player next week, in tandem with some announcements being made at Oculus Connect 3.
We are introducing a new Lua 'render job' node in 3.04, which can combine multiple render target nodes (e.g. animation, multiple cameras, or layers) as well as previously global scripts such as turntable, batch, or daylight, and wrap them into a single ORBX media output folder, including the basic JSON+Lua needed to play it back (e.g. multi-view, animation, and multi-light playback are auto-generated).
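For a sense of what the single ORBX media output folder and its auto-generated JSON+Lua playback data might contain, here is a rough Lua sketch of a playback descriptor; every field name below is a hypothetical placeholder for illustration, not the actual format the render job node writes out.

    -- Hypothetical playback descriptor (illustrative only; the real
    -- JSON+Lua is auto-generated by the render job node and will differ).
    local media_job = {
      name    = "myvrrender",
      outputs = {
        { kind = "multi-view",  targets = { "cam_front", "cam_left", "cam_right" } },
        { kind = "animation",   frames  = 240, fps = 30 },
        { kind = "multi-light", lights  = { "key", "fill", "rim" } },
      },
    }

    -- A player would walk the descriptor and queue each output for playback:
    for _, output in ipairs(media_job.outputs) do
      print(("queueing %s output"):format(output.kind))
    end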
If you render this locally, you can test the output by placing it in the ORBX\Media folder on your phone.
If you use ORC, it will compress the output per device (e.g. mobile VR, WebVR, PC VR) and, if you choose to publish to the web, generate a public short URL on x.io. This will be based either on your job ID GUID, e.g.
http://x.io/ygt6ue
or on username+projectname[+optversion], e.g.
http://me.x.io/ristovan/myvrrender/
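To make the published URL scheme concrete, here is a minimal Lua sketch that reproduces the two forms above; the function names and the exact path layout are assumptions inferred from these examples, not from ORC itself.

    -- Illustrative only: URL layout inferred from the examples above;
    -- function and parameter names are hypothetical.
    local function job_url(job_id)
      -- job ID GUID form
      return "http://x.io/" .. job_id
    end

    local function project_url(username, project, version)
      -- username+projectname[+optversion] form
      local url = "http://me.x.io/" .. username .. "/" .. project .. "/"
      if version then
        url = url .. version .. "/"
      end
      return url
    end

    print(job_url("ygt6ue"))                      --> http://x.io/ygt6ue
    print(project_url("ristovan", "myvrrender"))  --> http://me.x.io/ristovan/myvrrender/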
This means you can connect any browser or app that has partial or full ORBX media support (currently OMP, but soon Samsung Internet) to view the media file. You can also embed or link an HTML page to the ORBX media file and use it as a live overlay or multi-light source in the next version of OMP. We are working with Samsung on how JS can talk to the ORBX environment enclosing the live web page texture.
OMP will be supporting GC (see the Mattel project). We have just started HoloLens support. For PC, we are going to be deploying the ORBX Media Server - this allows us to do much more than OMP does on mobile, including local light field rendering, high-resolution glTF mesh support, and 49K stereo cube map streams at 120 fps from disk.