Digital Compositor Checklist

Below is a list of procedures a compositor should follow. I also have a blog post of optimization tips, which works as a companion document to this one.

Distortion

Every piece of footage has some sort of lens distortion, and not dealing with it before 3D integration can lead to disastrous results. It's good practice to first undistort the background plate > do the CG integration > use the same lens distortion data to re-distort the whole output. This way the CG gets distorted as well, giving the feel that it was shot with the camera.

You can find more about the distortion workflow on Joe Raasch's blog. Distortion data is also handled by tracking software and can be exported straight from there.
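
As a rough illustration of that order of operations, here is a minimal NumPy sketch using a toy one-parameter radial lens model. The `plate` and `cg` arrays are stand-ins, and `k` is a made-up amount; a real comp would use the lens data from your tracking or lens-distortion tools instead.

```python
import numpy as np

def radial_warp(img, k):
    """Resample img with a toy radial lens model: each output pixel reads the
    input at radius r * (1 + k * r^2), so the warp grows towards the edges."""
    h, w = img.shape[:2]
    y, x = np.mgrid[0:h, 0:w].astype(np.float32)
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    nx, ny = (x - cx) / cx, (y - cy) / cy                     # normalised [-1, 1] coords
    r2 = nx * nx + ny * ny
    sx = np.clip(nx * (1 + k * r2) * cx + cx, 0, w - 1).round().astype(int)
    sy = np.clip(ny * (1 + k * r2) * cy + cy, 0, h - 1).round().astype(int)
    return img[sy, sx]

def over(fg_rgba, bg_rgb):
    """Premultiplied over: fg.rgb + bg.rgb * (1 - fg.alpha)."""
    return fg_rgba[..., :3] + bg_rgb * (1.0 - fg_rgba[..., 3:4])

plate = np.random.rand(270, 480, 3).astype(np.float32)       # stand-in background plate
cg = np.random.rand(270, 480, 4).astype(np.float32)          # stand-in premultiplied CG + alpha
k = 0.05                                                      # made-up distortion amount

flat_plate = radial_warp(plate, -k)                           # 1. undistort the plate (approximate inverse)
comp = over(cg, flat_plate)                                   # 2. integrate CG over the undistorted plate
final = radial_warp(comp, k)                                  # 3. re-distort the finished comp
```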

Color Matching aka Color Correction

Color correction isn't confined to compositing alone; it is present from the start of the pipeline. An output that combines images from different sources can't look real until every element is color matched to the base image. A good understanding of color in general is a plus.
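
As a very crude first pass before grading by eye, you can match the per-channel mean and standard deviation of the CG element to the plate. A minimal NumPy sketch, with `fg` and `bg` as stand-in float images:

```python
import numpy as np

def match_color(fg, bg):
    """Shift and scale each channel of fg so its mean and standard deviation
    match the background plate. Only a starting point for a manual grade."""
    out = np.empty_like(fg)
    for c in range(fg.shape[-1]):
        fg_mean, fg_std = fg[..., c].mean(), fg[..., c].std()
        bg_mean, bg_std = bg[..., c].mean(), bg[..., c].std()
        out[..., c] = (fg[..., c] - fg_mean) * (bg_std / (fg_std + 1e-6)) + bg_mean
    return out

fg = np.random.rand(270, 480, 3).astype(np.float32)   # stand-in CG element
bg = np.random.rand(270, 480, 3).astype(np.float32)   # stand-in background plate
balanced = match_color(fg, bg)
```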

Match Grain

Looked at closely, a footage plate contains some noise, but our renders usually come out at full quality, and static images aren't grainy either. So we have to match the grain level of the background plate on the CG elements or images we are using, to give a sense of realism.

There's even a node in Nuke for this: F_ReGrain, part of the FurnaceCore plugin set.
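
A dedicated regrain tool does this far better (F_ReGrain samples and reproduces the plate's actual grain pattern), but the basic idea can be sketched in NumPy: estimate the plate's grain as the residual of a light blur, then add noise of the same strength to the clean CG. Inputs are stand-in float images.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def add_matched_grain(cg, plate, seed=0):
    """Measure the plate's grain level as the residual of a small blur and add
    Gaussian noise of the same per-channel strength to the clean CG."""
    residual = plate - gaussian_filter(plate, sigma=(1.5, 1.5, 0))
    grain_std = residual.std(axis=(0, 1))                    # per-channel grain level
    rng = np.random.default_rng(seed)
    grain = rng.normal(0.0, grain_std, size=cg.shape)
    return cg + grain.astype(cg.dtype)

plate = np.random.rand(270, 480, 3).astype(np.float32)      # stand-in grainy plate
cg = np.random.rand(270, 480, 3).astype(np.float32)         # stand-in clean CG render
grained_cg = add_matched_grain(cg, plate)
```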

Motion/Edge Blur

Final outputs are not as sharp as the rendered plates are (there can be exceptions). At 24 fps there is some motion blur, unlike games, which usually run at 60 fps. The whole point of films still being broadcast at 24 fps is to give a feeling of realism to the eyes.

A motion vector pass is generally used to apply this motion blur, and it can be obtained the same way other passes are rendered. More on motion vectors on Lester Banks' blog.
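
In Nuke this is typically a VectorBlur-type node fed with the motion vector pass; conceptually it smears each pixel along its vector. A toy gather-style NumPy version, assuming `motion` holds per-pixel (u, v) offsets in pixels:

```python
import numpy as np

def vector_blur(img, motion, samples=16, scale=1.0):
    """Gather-style motion blur: average samples taken along each pixel's
    motion vector. Real vector-blur nodes are far more careful at edges."""
    h, w = img.shape[:2]
    y, x = np.mgrid[0:h, 0:w].astype(np.float32)
    acc = np.zeros_like(img, dtype=np.float32)
    for t in np.linspace(-0.5, 0.5, samples):
        sx = np.clip(x + motion[..., 0] * scale * t, 0, w - 1).round().astype(int)
        sy = np.clip(y + motion[..., 1] * scale * t, 0, h - 1).round().astype(int)
        acc += img[sy, sx]
    return acc / samples

img = np.random.rand(270, 480, 3).astype(np.float32)            # stand-in CG render
motion = np.random.randn(270, 480, 2).astype(np.float32) * 5.0   # stand-in motion vector pass
blurred = vector_blur(img, motion)
```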

Edge blur, on the other hand, is used to soften the sharp edges of rendered objects. It affects only the edges of the elements, following their alpha rather than any movement in the scene; this is what differentiates it from motion blur.
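
A simple way to think about it: build an edge matte from the element's alpha and blend a blurred copy of the image back in only along that edge. A rough NumPy sketch with stand-in inputs:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def edge_blur(rgb, alpha, size=2.0):
    """Soften only the edge of a rendered/keyed element: a blurred alpha gives
    an edge matte (it peaks where the alpha transitions), which is used to
    blend in a blurred copy of the image along that edge only."""
    soft = gaussian_filter(alpha, sigma=size)
    edge_matte = (4.0 * soft * (1.0 - soft))[..., None]     # ~1 across the edge, ~0 elsewhere
    blurred = gaussian_filter(rgb, sigma=(size, size, 0))
    return rgb * (1.0 - edge_matte) + blurred * edge_matte

rgb = np.random.rand(270, 480, 3).astype(np.float32)          # stand-in element
alpha = (np.random.rand(270, 480) > 0.5).astype(np.float32)   # stand-in hard alpha
softened = edge_blur(rgb, alpha)
```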

Chromatic Aberration

There's no lens in the universe without chromatic aberration. Chromatic aberration is a phenomenon in which the color channels of the image are offset from each other by a certain number of pixels. Just like distortion, this offset increases as we move away from the middle of the frame.
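
A cheap way to fake it is to scale the red and blue channels by slightly different amounts around the frame centre, so the channel offset grows towards the edges. A NumPy sketch; the `shift` value is just a made-up amount:

```python
import numpy as np

def chromatic_aberration(img, shift=0.002):
    """Fake lateral chromatic aberration: resample red slightly magnified and
    blue slightly shrunk around the centre, so the fringing grows towards the
    edges the way it does on a real lens."""
    h, w = img.shape[:2]
    y, x = np.mgrid[0:h, 0:w].astype(np.float32)
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0

    def resample(channel, scale):
        sx = np.clip((x - cx) * scale + cx, 0, w - 1).round().astype(int)
        sy = np.clip((y - cy) * scale + cy, 0, h - 1).round().astype(int)
        return channel[sy, sx]

    out = img.copy()
    out[..., 0] = resample(img[..., 0], 1.0 - shift)   # red reads from slightly inside -> pushed outward
    out[..., 2] = resample(img[..., 2], 1.0 + shift)   # blue reads from slightly outside -> pulled inward
    return out

img = np.random.rand(270, 480, 3).astype(np.float32)  # stand-in comp
fringed = chromatic_aberration(img)
```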

Depth of Field

Depth of Field is another factor which makes the plate look real.

A z-depth pass rendered out from Maya is used inside Nuke's ZDefocus node. Other packages should have a similar node.
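
A bare-bones Nuke Python sketch of hooking that up is below. The Read node name is hypothetical, 'ZDefocus2' is the script class the ZDefocus node usually writes out, and knob names can vary between Nuke versions, so check the node's properties panel.

```python
import nuke  # only available inside Nuke's script editor / Python environment

plate = nuke.toNode('beauty_read')      # hypothetical Read node carrying rgba + depth.Z
zdefocus = nuke.nodes.ZDefocus2()       # ZDefocus node (script class is usually 'ZDefocus2')
zdefocus.setInput(0, plate)
# Point the node's depth channel at your z-depth pass and set the focal plane
# in its properties panel; knob names differ slightly between Nuke versions.
```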

LightWrap

When a keyed object is merged over the background, it usually stands out. LightWrap helps there. The plugin made by Red Giant for After Effects is also available natively in Nuke. A good tutorial about LightWrap in Nuke can be found on Huey Yeng's Nuke blog.
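
The basic trick is to let a blurred copy of the background spill onto the inside edge of the foreground and screen it back over. A rough NumPy sketch of that idea, with stand-in float inputs (the real node has many more controls):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def light_wrap(fg_rgb, fg_alpha, bg_rgb, spread=8.0, intensity=0.6):
    """Let a blurred background spill onto the inner edge of the foreground,
    then screen it over the foreground so the keyed element sits into the shot."""
    blurred_bg = gaussian_filter(bg_rgb, sigma=(spread, spread, 0))
    outside = gaussian_filter(1.0 - fg_alpha, sigma=spread)          # soft matte of the bg side
    wrap = blurred_bg * (outside * fg_alpha * intensity)[..., None]  # only the fg's inner edge
    return 1.0 - (1.0 - fg_rgb) * (1.0 - wrap)                       # screen the wrap over the fg

fg_rgb = np.random.rand(270, 480, 3).astype(np.float32)          # stand-in keyed element
fg_alpha = (np.random.rand(270, 480) > 0.5).astype(np.float32)   # stand-in alpha
bg_rgb = np.random.rand(270, 480, 3).astype(np.float32)          # stand-in background plate
wrapped = light_wrap(fg_rgb, fg_alpha, bg_rgb)
```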

Lens Flares?

Depending upon the lighting intensities, a lens flare can make the scene better. I find it the least necessary of everything listed here.

If you've read this whole post and are still reading, do share your opinions in the comments section.
