
OTOY's Siggraph AR - how to do this?

Posted: Mon Oct 08, 2018 8:17 am
by MaxXR
Hi
Saw this awesome, photorealistic object at 41m in Jules' talk: https://www.youtube.com/watch?v=IPXKDS7hz24
How does one do this?
I'm curious because 1) it looks amazing, 2) Jules mentions he's pushing for ORBX (perhaps because the current workflow isn't practical?), and 3) I want to test it on Magic Leap.

Is it a metallic baked object from Octane, bounced to ORBX and then placed via ARKit? How does the ambient lighting work?
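
To make the question concrete, here's the kind of thing I'd assume on the ARKit side for the lighting: just a generic sketch using Apple's light estimation API. The delegate class and the "ambientLight" node name are my own placeholders; I have no idea what OTOY actually does under the hood.

Code:
import ARKit
import SceneKit

// Sketch (my assumption, not OTOY's pipeline): feed ARKit's per-frame
// ambient light estimate into an SCNLight so a placed asset picks up
// the room's brightness and colour temperature.
class LightEstimationDelegate: NSObject, ARSCNViewDelegate {
    weak var sceneView: ARSCNView?

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let frame = sceneView?.session.currentFrame,
              let estimate = frame.lightEstimate,
              // "ambientLight" is a placeholder node I'd add to the scene myself.
              let light = sceneView?.scene.rootNode
                  .childNode(withName: "ambientLight", recursively: true)?.light
        else { return }
        // ARKit reports intensity in lumens (~1000 = neutral) and temperature in kelvin.
        light.intensity = estimate.ambientIntensity
        light.temperature = estimate.ambientColorTemperature
    }
}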

Either way, looks great, well done

Re: OTOY's Siggraph AR - how to do this?

Posted: Tue Oct 09, 2018 3:25 pm
by Goldorak
It's a WIP of mobile AR tech being developed in tandem with Octane 2019 and our WIP iOS build of Octane. Just like we can render and export pre-computed VR for the ORBX Media Player on VR HMDs today, this would do the same for AR. There will be a slightly updated presentation of the video above at Unite LA later this month.

Re: OTOY's Siggraph AR - how to do this?

Posted: Wed Oct 10, 2018 1:10 am
by MaxXR
Goldorak wrote: It's a WIP of mobile AR tech being developed in tandem with Octane 2019 and our WIP iOS build of Octane. Just like we can render and export pre-computed VR for the ORBX Media Player on VR HMDs today, this would do the same for AR. There will be a slightly updated presentation of the video above at Unite LA later this month.


Very nice. So will one be able to pre-bake volumetric animations and play them back via an AR platform?

Also, I'm curious: does Weta Digital use Octane?
I just tried Dr Grordbort and the assets look the best of any I've seen on an HMD so far: https://www.youtube.com/watch?v=45ZHNq9_7eY