Help about HDRI and matte painting

milanm
Licensed Customer
Posts: 261
Joined: Tue Apr 30, 2013 7:23 pm

Hi

Well, let's see what we can learn from how "The Martian" was made ;)
https://www.fxguide.com/featured/life-o ... e-martian/
https://www.youtube.com/watch?v=5CPbISkPqKk

So that would translate to this approach:

1. Your matte painting should not be yellow/orange; it should match the color of your live footage.
2. The same color correction (the yellow/orange look) should be applied to all elements composited together.
3. The HDRI from the set should be color-matched to the live footage, and both should be in linear space.
4. You might have to replace some reflections in your live footage with reflections of your matte painting. The same reflections would appear on your CG elements.
5. To make that possible, your matte painting should be actual geometry around your CG, and you may need to model and matchmove some of the live footage elements so that they can "catch" those reflections. Depending on the complexity of your shot, that may require a separate pass.
6. Of course, all compositing needs to be done in linear space (see the sketch after this list).
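
To make points 2, 3 and 6 a bit more concrete, here's a minimal sketch of "everything in linear, one grade on the whole comp". It assumes an sRGB-encoded matte painting, a linear Octane render and a simple gain-style grade; the plate names and values are made up to show the idea, not anyone's actual pipeline:

```python
import numpy as np

def srgb_to_linear(img):
    # Decode sRGB-encoded footage into linear light before any compositing math.
    img = img.astype(np.float32)
    return np.where(img <= 0.04045, img / 12.92, ((img + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(img):
    # Re-encode the finished comp for display.
    return np.where(img <= 0.0031308, img * 12.92, 1.055 * img ** (1.0 / 2.4) - 0.055)

def over(fg_rgb, fg_alpha, bg_rgb):
    # "Over" composite with straight (unpremultiplied) alpha, done in linear space.
    a = fg_alpha[..., None]
    return fg_rgb * a + bg_rgb * (1.0 - a)

# Hypothetical plates; everything is converted to linear first.
matte_painting = srgb_to_linear(np.random.rand(1080, 1920, 3))
cg_render = np.random.rand(1080, 1920, 3)   # Octane output is already linear
cg_alpha = np.random.rand(1080, 1920)

comp = over(cg_render, cg_alpha, matte_painting)

# One "Mars look" grade applied to the whole comp, not to each element separately.
mars_grade = np.array([1.15, 0.95, 0.75], dtype=np.float32)
graded = linear_to_srgb(np.clip(comp * mars_grade, 0.0, 1.0))
```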

I'm not sure, but I think that in V3 you could have two separate environment maps for specular and diffuse/visibility; those could be useful in this case.

There are many ways to do this and A LOT depends on what you have to work with.
As you can see in the making-of for 'The Martian', you could easily get away with just color correcting the final result.

Adding a "look" to each element separately would just make it harder to get a believable composite. If we did a white balance on Mars, colors would be very close to what we have on Earth, right? Color correcting the HDRI would introduce a lighting problem when matching CG elements with live footage.
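
To illustrate the white balance point: a simple von Kries-style balance just scales each channel so a known neutral reference reads grey, which is why a balanced Mars plate would end up looking surprisingly Earth-like. A toy sketch, assuming linear data and a known grey card; the plate and patch coordinates are invented:

```python
import numpy as np

def white_balance(img_linear, neutral_patch):
    # Von Kries-style balance: scale each channel so the sampled
    # neutral patch averages out to grey.
    ref = neutral_patch.reshape(-1, 3).mean(axis=0)
    gains = ref.mean() / ref
    return img_linear * gains

plate = np.random.rand(1080, 1920, 3)    # hypothetical linear plate
patch = plate[500:520, 900:940]          # a grey card somewhere in frame
balanced = white_balance(plate, patch)
```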

I hope that helps.

Regards
Milan
Colorist / VFX artist / Motion Designer
macOS - Windows 7 - Cinema 4D R19.068 - GTX1070TI - GTX780
ristoraven
Licensed Customer
Posts: 390
Joined: Wed Jan 08, 2014 5:47 am

That is a sound approach Milanm.

If, on the other hand, the production team gives you a matte painting already in its final color hue and you need to add elements to it from different sources (a RED camera and Octane), you have to handle those assets separately, because they are not from the same source and their color values are all offset from each other.

As you can see in the making of 'The Martian', there are a bunch of layers for the different CGI assets and elements, and on top of those layers comes the final color correction. The final color correction is of course done after the final edit, as a last step to give the entire movie a unified look.
milanm
Licensed Customer
Posts: 261
Joined: Tue Apr 30, 2013 7:23 pm

ristoraven wrote:That is a sound approach Milanm.

If, on the other hand, the production team gives you a matte painting already in its final color hue and you need to add elements to it from different sources (a RED camera and Octane), you have to handle those assets separately, because they are not from the same source and their color values are all offset from each other.

As you can see in the making of 'The Martian', there are a bunch of layers for the different CGI assets and elements, and on top of those layers comes the final color correction. The final color correction is of course done after the final edit, as a last step to give the entire movie a unified look.
Well yeah, in that case it comes down to: matte painting vs. everything else, including actors. And we don't want our actors to look weird because of our fancy background, right? ;)

I think the approach in "The Martian" was basically as if everything (including CGI and Matte Painting) was filmed on a real set (on Earth) in a.. sort of.. "day for night" style.

Of course all of this largely depends on the type of shot and number of shots. The method I described in my previous post is more universal.
If you had only one shot in a 15-second ad to do, you could do whatever you want to get the job done and it would look great. But three of them in 4K? You're already in trouble. 100 shots? No way. You would need a method that works every time, no matter the planet, camera model, lens, etc. ;)

I hope that makes sense, it's late here and my English is rusty. :)

Regards
Milan
Colorist / VFX artist / Motion Designer
macOS - Windows 7 - Cinema 4D R19.068 - GTX1070TI - GTX780
ristoraven
Licensed Customer
Posts: 390
Joined: Wed Jan 08, 2014 5:47 am

Believe me, they go a "few" extra miles in CGI when it's a bigger production :)
Basically, anything that can be on its own layer is on its own layer, simply because of brightness values. If you have two assets from different sources on the same layer and one is brighter than the other, it makes no sense to try to match them on one layer.

Now that we are talking about this, there's the new deep pixel feature in Octane. That adds a whole new dimension to compositing: "deep compositing". At the moment Nuke is, as far as I know, the only compositor that can handle that data.

This video goes through pretty well what it's all about:
https://www.youtube.com/watch?v=19w3vkFp5X0
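
For anyone wondering what "deep" actually means: instead of one flat RGBA value, every pixel stores a list of samples with depth, so elements can be merged correctly even where they interleave in depth (fog, hair, motion blur). A toy sketch of the merge logic only, with made-up sample lists; this is not Nuke's or Octane's actual API:

```python
def deep_merge_pixel(samples_a, samples_b):
    # Each sample is (depth, r, g, b, alpha), premultiplied.
    # Interleave both lists by depth, then composite front to back.
    samples = sorted(samples_a + samples_b, key=lambda s: s[0])
    r = g = b = a = 0.0
    for _, sr, sg, sb, sa in samples:
        # Whatever is already in front attenuates this sample ("over").
        r += sr * (1.0 - a)
        g += sg * (1.0 - a)
        b += sb * (1.0 - a)
        a += sa * (1.0 - a)
    return r, g, b, a

# Hypothetical pixel where fog samples sit both in front of and behind a creature,
# something a flat z-depth pass could never resolve.
fog = [(1.0, 0.1, 0.1, 0.1, 0.2), (5.0, 0.1, 0.1, 0.1, 0.2)]
creature = [(3.0, 0.4, 0.3, 0.2, 1.0)]
print(deep_merge_pixel(fog, creature))
```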
milanm
Licensed Customer
Posts: 261
Joined: Tue Apr 30, 2013 7:23 pm

ristoraven wrote:Believe me, they go a "few" extra miles in CGI when it's a bigger production :)
Basically, anything that can be on its own layer is on its own layer, simply because of brightness values. If you have two assets from different sources on the same layer and one is brighter than the other, it makes no sense to try to match them on one layer.

Now that we are talking about this, there's the new deep pixel feature in Octane. That adds a whole new dimension to compositing: "deep compositing". At the moment Nuke is, as far as I know, the only compositor that can handle that data.

This video goes through pretty well what it's all about:
https://www.youtube.com/watch?v=19w3vkFp5X0
Oh yes, of course I totally agree with you about layers.. and lots of them. Breaking everything apart is essential for compositing CG into live action. I thought this was considered common practice, so I didn't mention it. :)

Re: Deep Data: Oh yeah, very cool stuff. So many possibilities. I've been following its development for a while now. Fusion was actually one of the first apps to adopt it aside from Nuke. And I finally went back to Fusion recently after a long struggle with Ae, so I can't wait to try that stuff out. There are big downsides to it though; it's MASSIVE. But compositing volumetrics is a massive pain in the neck without it.

Here's a great overview about how it was used in the latest Star Wars: https://youtu.be/G42oiW2EosU?t=21m55s

Thanks for that Nuke video. Cool stuff!

Regards
Milan
Colorist / VFX artist / Motion Designer
macOS - Windows 7 - Cinema 4D R19.068 - GTX1070TI - GTX780
calus
Licensed Customer
Posts: 1308
Joined: Sat May 22, 2010 9:31 am
Location: Paris

milanm wrote: Re: Deep Data: Oh yeah, very cool stuff. So many possibilities. I've been following its development for a while now. Fusion was actually one of the first apps to adopt it aside from Nuke.
Hi Milan,
Fusion user here,
and sorry I'm going to disappoint you,
but Deep Pixel compositing in Fusion is not at all the same thing as Deep Image compositing in Nuke.
This is just a cool technique (and tools) using a world position pass.

Nuke is really the only one, for the moment, with tools that can use the deep data from deep images rendered in Octane...
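
For reference, the Fusion-style "deep pixel" tools calus is describing work on a world position pass: every pixel stores the XYZ point it came from, which lets you build 3D masks or relight without any real per-pixel sample lists. A rough sketch of a spherical mask built from a position pass, with made-up names and values:

```python
import numpy as np

def spherical_mask(wpp, center, radius, softness=0.5):
    # wpp: H x W x 3 world position pass (one XYZ point per pixel).
    # Returns a mask that is 1 within 'radius' of 'center' and fades out over 'softness'.
    dist = np.linalg.norm(wpp - np.asarray(center, dtype=np.float32), axis=-1)
    return np.clip((radius + softness - dist) / softness, 0.0, 1.0)

# Hypothetical position pass and a mask around one point in the scene,
# e.g. to grade or relight just that region.
wpp = np.random.rand(1080, 1920, 3) * 100.0
mask = spherical_mask(wpp, center=(50.0, 10.0, 50.0), radius=20.0)
```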
Pascal ANDRE
ristoraven
Licensed Customer
Posts: 390
Joined: Wed Jan 08, 2014 5:47 am

Thanks for the Star Wars vid. :)

I am just thinking, is there any use for deep compositing in VR..?

Theoretically plenty, but practically? I have no idea.

Lightfield+deep pixels?
Deep pixels+cubemaps?
milanm
Licensed Customer
Posts: 261
Joined: Tue Apr 30, 2013 7:23 pm

calus wrote:
milanm wrote: Re: Deep Data: Oh yeah, very cool stuff. So many possibilities. I've been following its development for a while now. Fusion was actually one of the first apps to adopt it aside from Nuke.
Hi Milan,
Fusion user here,
and sorry I'm going to disappoint you,
but Deep Pixel compositing in Fusion is not at all the same thing as Deep Image compositing in Nuke.
This is just a cool technique (and tools) using a world position pass.

Nuke is really the only one, for the moment, with tools that can use the deep data from deep images rendered in Octane...
Hmm.. so we couldn't use deep data from Octane in Fusion? Well, that's a shame.. But I still love it for its simplicity, the same reason I like Octane. I'm sure it will get there eventually, by the time Octane V3 matures a bit, especially with Blackmagic taking over. Here's hoping.

ristoraven, VR stuff scares me with its super high resolutions.. and we're already way off topic and close to hijacking the thread here :)

Regards
Milan
Colorist / VFX artist / Motion Designer
macOS - Windows 7 - Cinema 4D R19.068 - GTX1070TI - GTX780
ristoraven
Licensed Customer
Posts: 390
Joined: Wed Jan 08, 2014 5:47 am

milanm wrote:
ristoraven, VR stuff scares me with its super high resolutions.. and we're already way off topic and close to hijacking the thread here :)

Regards
Milan
You're right. Those resolutions are insane, the data sizes too, and let's not hijack this topic :)
mbetke
Licensed Customer
Posts: 1293
Joined: Fri Jun 04, 2010 9:12 am
Location: Germany

In film they color correct the output. They use curves, change the RGB and so on. It's just more complex, with a post-production pipeline like After Effects and other tools.
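
Curves are just a per-channel remap, so in spirit it's something like this tiny sketch (the control points are invented, not a real film look):

```python
import numpy as np

def apply_curve(channel, points):
    # Remap one channel through a curve defined by (input, output) control points.
    xs, ys = zip(*points)
    return np.interp(channel, xs, ys)

img = np.random.rand(1080, 1920, 3)
# Hypothetical grade: lift the reds slightly, pull the blues down a touch.
img[..., 0] = apply_curve(img[..., 0], [(0.0, 0.02), (0.5, 0.55), (1.0, 1.0)])
img[..., 2] = apply_curve(img[..., 2], [(0.0, 0.0), (0.5, 0.45), (1.0, 0.95)])
```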
PURE3D Visualisierungen
Sys: Intel Core i9-12900K, 128GB RAM, 2x 4090 RTX, Windows 11 Pro x64, 3ds Max 2024.2