OTOY unveils holographic video

Generic forum to discuss Octane Render, post ideas and suggest improvements.
Forum rules
Please add your OS and Hardware Configuration in your signature, it makes it easier for us to help you analyze problems. Example: Win 7 64 | Geforce GTX680 | i7 3770 | 16GB
Nuge
Licensed Customer
Posts: 68
Joined: Wed Jan 09, 2013 9:36 pm
Location: Hamilton, New Zealand

http://render.otoy.com/newsblog/?p=547

Wow, just need a light saber and my dreams have come true!!!
C4D R19 / Win 10 / 7 GTX 1080
User avatar
Goldorak
OctaneRender Team
Posts: 2321
Joined: Sun Apr 22, 2012 8:09 pm
Contact:

Scenes exported in this format render instantly on any GPU (even mobile). The offline rendering time is proportional to the navigable volume, in increments of ~ 1 cubic foot. This is also where cloud rendering comes in handy.
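A quick back-of-envelope sketch of the cost model described above: render time proportional to the navigable volume, captured in increments of roughly one cubic foot. The per-cube cost figure is a made-up placeholder, since actual time depends on scene complexity and GPU count.

```python
import math

def estimated_render_minutes(volume_cubic_feet: float,
                             unit_cost_minutes: float = 30.0) -> float:
    """Rough offline-bake cost estimate.

    The bake is described as proportional to the navigable volume,
    in increments of ~1 cubic foot, so we round up to whole cubes.
    unit_cost_minutes is an illustrative assumption, not a real figure.
    """
    cubes = math.ceil(volume_cubic_feet)  # one capture increment per cubic foot
    return cubes * unit_cost_minutes

# A 2 ft x 2 ft x 2 ft navigable volume is 8 cubes
print(estimated_render_minutes(8.0))  # 240.0
```

This linear scaling is also why the post points at cloud rendering: each cube can be baked independently, so the work parallelizes across machines.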
Vue2Octane
Licensed Customer
Posts: 88
Joined: Thu Jun 26, 2014 8:16 am

Impressive idea.
This is basically a lightfield camera realised in 3D software.

This is the camera that is simulated. It also allows 3D from one shot (in Octane one render).
https://www.lytro.com
User avatar
Goldorak
OctaneRender Team
Posts: 2321
Joined: Sun Apr 22, 2012 8:09 pm
Contact:

Vue2Octane wrote:Impressive idea.
This is basically a lightfield camera realised in 3D software.

This is the camera that is simulated. It also allows 3D from one shot (in Octane one render).
https://www.lytro.com
Correct. Lytro has a very small grid size / capture volume, but an ORBX lightfield can be any size. You can scale up in cubes of roughly 10 cm per side. GPU LF rendering w/ Octane scales really nicely, just like 2D rendering. It is a good fit for the cloud service. Also, compared to traditional baking, this is better - you capture reflections/refractions, SSS, etc., not just 'splotchy' GI.

We can embed info channel kernel / render pass data into the LF pixels (by default, depth is included). Lots of possibilities... especially when you combine this with some of the other announcements coming out of SIGGRAPH this week...
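To make the scaling concrete, here is a tiny sketch of tiling a navigable volume into the ~10 cm capture cubes mentioned above. The tiling rule (simple ceiling division per axis) is an illustrative assumption; the actual ORBX layout is not public.

```python
import math

def lf_cube_count(dims_m: tuple[float, float, float],
                  cube_edge_m: float = 0.10) -> int:
    """Number of ~10 cm capture cubes needed to tile a navigable
    volume of dims_m = (x, y, z) in metres. Partial cubes on each
    axis are rounded up to a whole cube.
    """
    return math.prod(math.ceil(d / cube_edge_m) for d in dims_m)

# A 1 m cube of navigable space -> 10 x 10 x 10 capture cubes
print(lf_cube_count((1.0, 1.0, 1.0)))  # 1000
```

The cube count grows with the volume, not the scene's polygon or material complexity, which is what makes the approach behave so differently from traditional baking.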
Vue2Octane
Licensed Customer
Posts: 88
Joined: Thu Jun 26, 2014 8:16 am

You guys really know what you are doing!
I looked into LF cameras myself, and at one point designed one in the Zemax optical design software.
It is quite nice to see how the Octane team keeps adding game changers to the rendering world.
Not just the same technique slightly refined or modified, but real game changers.
Adapting the lightfield concept to a render engine is just smart!
Rikk The Gaijin
Licensed Customer
Posts: 1528
Joined: Tue Sep 20, 2011 2:28 pm
Location: Japan

Image
User avatar
linvanchene
Licensed Customer
Posts: 783
Joined: Mon Mar 25, 2013 10:58 pm
Location: Switzerland

edited and removed by user
Last edited by linvanchene on Mon Oct 20, 2014 3:27 pm, edited 1 time in total.
r-username
Licensed Customer
Posts: 217
Joined: Thu Nov 24, 2011 3:39 pm

Very nice and glad to be using otoy software.

Is there a time frame for a beta version of Standalone with Rift-type renders?

I have a fire phone with "dynamic perspective", can we render content for this device?
i7 960 - W7x64 - 12 GB - 2x GTX 780ti
http://www.startsimple.com/ - http://www.gigavr.com/
MOSFET
Licensed Customer
Posts: 84
Joined: Thu Jun 03, 2010 1:30 am

@Goldorak: Do you plan on having an online demo of this tech? This announcement may have had more impact if you had one available to the public at the same time as the press release.
User avatar
Goldorak
OctaneRender Team
Posts: 2321
Joined: Sun Apr 22, 2012 8:09 pm
Contact:

linvanchene wrote:This is amazing news! :-)

But I am a bit confused about some points:

In the blog it says this was presented on a mobile phone.

Was that a prototype with a special display?

How do those displays know from which angle we are looking at it?

or

Does the display always show all angles of the frame?

This seems like a lot of data to process.

What is the file size of one frame?
What minimum RAM should devices have?
You have two options: have the cloud service decode the lightfield in HD and stream it down with depth and an LF mipmap (for quick reprojection - this also plugs into time warping on the Oculus).

OR

Send down the scene as an ORBX LF volume, for local or offline viewing using OpenGL ES (or WebGL + ORBX.js). The size can be 16 MB to 16 GB depending on the volume.

The ORBX LF codec is still early in development, but the size is getting smaller as we develop it further and use more info from the info channel kernel to compress the LF. At medium quality, a 1-foot LF view cube is about 8x larger than a hi-res 2D surface PNG @ 650 dpi.

If you are on the cloud, you can keep streaming in more LF cubes as you move through the scene. If you are on a mobile device with OpenGL ES 3, the idea is you download an LF cube and view the volume locally rather than look at a 2D picture. It should fit into device memory in that case, but larger, full-res volumes should be streamed from the cloud (you could cache a mipmap of the current LF cube from the stream itself).
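The two delivery options above amount to a simple decision: download the ORBX LF volume for local viewing when the device supports OpenGL ES 3 and the volume fits in memory, otherwise stream decoded frames from the cloud. A minimal sketch of that choice, with illustrative thresholds that are assumptions rather than OTOY's actual logic:

```python
def delivery_mode(volume_bytes: int,
                  device_free_bytes: int,
                  has_gles3: bool) -> str:
    """Pick between the two delivery paths described above.

    - "download-local": send the ORBX LF volume to the device and
      view it there (requires OpenGL ES 3 and enough free memory).
    - "cloud-stream": decode the lightfield in the cloud and stream
      HD frames down with depth and an LF mipmap.
    """
    if has_gles3 and volume_bytes <= device_free_bytes:
        return "download-local"
    return "cloud-stream"

# A 16 MB cube fits on a device with 2 GB free; a 16 GB volume does not.
print(delivery_mode(16 * 1024**2, 2 * 1024**3, True))   # download-local
print(delivery_mode(16 * 1024**3, 2 * 1024**3, True))   # cloud-stream
```

The stated 16 MB to 16 GB size range is why both paths exist: small cubes are practical to ship whole, while room-scale volumes only work streamed, cube by cube, as the viewer moves.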
Post Reply

Return to “General Discussion”