Hello there!
We're looking into doing something very similar to the ARKit demo from last year and are considering evaluating your solution.
I would appreciate it if someone could contact me about that and answer a few questions, such as:
Was that demo from last year using RNDR to do the actual rendering live on the iPhone?
What parts of the rendering were done on the network vs. on the device?
Was it doing the heavy pre-computation of the irradiance cache on the network, streaming cache updates to the device, and then using the cache on the device for final rendering?
Was it using a pre-existing engine for the interactivity, or a custom-written application that combines ARKit with the RNDR client?
How feasible would it be to integrate this with an existing game engine like Unity or Unreal?
Thanks in advance; I would appreciate any answers you can "render".