ORBX viewer 2.0
Another question, is it possible to have tooltips pop-up when hovering a hotspot?
Any news on the availability of the full documentation?
Henri --- C360.NL
Just a quick note - JPG is not likely to get you more than ~30 frames of stereo cube map
sequences on a Note 4 or S6.
We had to come up with a faster and more efficient (albeit imperfect) encoding/decoding system for high-speed 18K stereo cube map streams. It is device specific, as it uses OpenGL shaders to decode the data, and this depends further on the OpenGL shader level and supported extensions exposed on the device.
We want to handle this invisibly for artists, and one day we will. Right now, we expose this encoder on ORC as a new export option (tied to a specific device/GL feature set or platform, e.g. Gear VR), alongside generic PNG/EXR/MP4. This encoding step may be built into Octane one day, if needed, but probably not in its current form. Encoding is slow (sometimes taking longer than the render), and the output for mobile doesn't work on desktop and vice versa. As a result, you have at least two data streams to generate for basic GVR/CV1 cross-device support. On ORC we can squeeze all this work in between frame renders, so dual-format encoding basically comes for free on multi-frame jobs, and a GUID lets the viewer fetch the right media file for the platform and device it is running on.
Today that is just Gear VR, but down the line we need to support ORBX players on desktop VR, PSVR, iOS, Shield/Tango, JavaScript, etc.
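To make the GUID-based fetch described above concrete, here is a minimal sketch of per-device media selection. All names here are hypothetical illustrations, not the actual ORBX/ORC API: it just assumes the render service publishes a manifest mapping (platform, GL feature set) to the GUID of the stream encoded for that target, so a player never downloads a stream it cannot decode.

```python
# Hypothetical sketch - MANIFEST, media_url, and the URL scheme are invented
# for illustration; the real ORC/ORBX manifest format is not documented here.
# Idea: each (platform, GL profile) pair maps to the GUID of the stream that
# was encoded specifically for that device class.

MANIFEST = {
    ("gearvr", "gles3"): "a1b2c3",   # mobile-encoded stereo cube map stream
    ("desktop", "gl45"): "d4e5f6",   # desktop-encoded stream (not cross-compatible)
}

def media_url(platform: str, gl_profile: str,
              base: str = "https://render.example/media") -> str:
    """Return the URL of the stream encoded for this device, or raise if unsupported."""
    try:
        guid = MANIFEST[(platform, gl_profile)]
    except KeyError:
        raise RuntimeError(f"no encoded stream for {platform}/{gl_profile}")
    return f"{base}/{guid}.orbx"

print(media_url("gearvr", "gles3"))  # -> https://render.example/media/a1b2c3.orbx
```

The point of the lookup is the cross-compatibility constraint mentioned above: because the mobile and desktop encodings are mutually unusable, a multi-frame job carries at least two streams, and the player resolves the correct one at load time rather than shipping a single universal file.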
- Rikk The Gaijin
- Posts: 1528
- Joined: Tue Sep 20, 2011 2:28 pm
- Location: Japan
I finally got the American Gear VR working again, so I have the ORBX v2 app installed, but... is it just me, or are there no new samples yet? There is a new icon with the ORBX logo on it, but it doesn't do anything...

Rikk The Gaijin wrote:I finally got the American Gear VR working again, so I have the ORBX v2 app installed, but... is it just me, or are there no new samples yet? There is a new icon with the ORBX logo on it, but it doesn't do anything...

As of right now, all the best ORBX samples for the 2.0 app are not ours to share. Here is a review of one of them:
http://www.roadtovr.com/first-look-batm ... ideo-tech/
This runs in the public 2.0 GVR app if you drop the media file into the ORBX folder - same as any user project. It shows what is possible at scale for a studio like WB. Yet it's all built with the same toolchain we are exposing through ORC and Octane in 2.25.
That being said, we are looking ahead to 3.x script nodes that handle all exporting of Lua/JSON from the Octane/host app UI (and maybe HTML too, for templates).
- Rikk The Gaijin
- Posts: 1528
- Joined: Tue Sep 20, 2011 2:28 pm
- Location: Japan
Well, making amazing demos that nobody can experience makes no sense to me. Seriously, what's the point? Share the goddamn thing already!
And if you can't share it, leak it.
- 360precision
- Posts: 63
- Joined: Mon Apr 27, 2015 8:55 am
Just some documentation for what we can currently do would be helpful. All this talk of V3 isn't looking good as it's too easy to say we'll release everything then. I have no idea why the V2.0 app has even been released if artists can't create anything impressive for it.
Matt
Rikk The Gaijin wrote:Well, making amazing demos that nobody can experience makes no sense to me. Seriously, what's the point? Share the goddamn thing already!
And if you can't share it, leak it.

It's a commercial product, not a demo. We would like to make a demo around your RTM scene if you are open to it; we would need to do it on ORC. PM me.
360precision wrote:Just some documentation for what we can currently do would be helpful. All this talk of V3 isn't looking good as it's too easy to say we'll release everything then. I have no idea why the V2.0 app has even been released if artists can't create anything impressive for it.
Matt

Everything in the WB project was done in V2.
We're working on docs and samples. Our immediate priority this week is getting APK updates working correctly on Oculus Home:
https://www.reddit.com/r/GearVR/comment ... led_on_s6/
We have to reliably push out hot fixes (like 2.07) as we expose more features and you guys report more issues.
Once that is sorted out, we'll be able to move more quickly.