Making spherical HDR images

One of the crucial tasks of a Visual Effects supervisor on set is to shoot spherical 360-degree high dynamic range (HDR) panoramas. These panoramas can be very useful in post-production as a source of light or simply as a good reference for computer generated environments. This article goes through all the steps required to create basic HDRIs.

Reasons for on-set HDRI capture

  • Use in a 3D application for image-based lighting
  • Use as a background or base for a matte painting
  • 360 degree environment projections
  • Reference of the environment

Equipment

  • Camera: Canon 5D Mark 2
  • Lens: Canon 8 mm
  • Tripod
  • Nodal Ninja (Figure 1)
  • Smartphone or pen and paper
  • Camera remote (optional)
Figure 1: Nodal Ninja

Setup Procedure

  • Mount Nodal Ninja on the tripod head.
  • Mount your camera on the Nodal Ninja (NN).
  • Configure the Nodal Ninja.
How to find your lens nodal point
Nodal Ninja manual

Configuring the tripod

Level your tripod as shown in Figure 2.

Figure 2: Properly leveled tripod

Configuring Nodal Ninja

The purpose of the Nodal Ninja is to capture parallax-free images, making them suitable for stitching into one continuous panorama later on.

Figure 3: Setting the nodal point, side view [1]
Figure 4: Setting the nodal point, front view [1]

Shooting process

As a professional you have to be consistent with your results. That is why it is very important to have a clearly defined procedure in place. This video explains what to check before shooting and how to properly shoot HDRIs.

Pre-shoot checklist

  • Tripod leveled
  • Shoot a slate
  • Image is in focus
  • Camera set to manual mode (M)
  • Autofocus is disabled
  • White balance is locked
  • The exposure should change between each bracketed shot and follow a repeating pattern (e.g. 1/25, 1/100, 1/400, 1/25, 1/100, 1/400, and so on).
  • ISO set no higher than 800

Shoot a set of pictures with different exposures in 4 directions (90 degrees apart)

Assuming that you use an 8 mm fisheye lens on a full-frame camera, you will end up with generously overlapping pictures for each direction, since the lens covers roughly 180 degrees while the shooting directions are only 90 degrees apart.

Depending on your goal, you can shoot between 3 and 5 pictures for each direction with a 1 or 2 f-stop difference between them.
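As a quick illustration, here is a minimal Python sketch (assuming a 2-stop spacing and a 1/25 s slowest exposure, matching the checklist example) that prints the repeating bracket pattern for all four directions:

base = 1.0 / 25      # slowest shutter speed in seconds
spacing = 2          # f-stops between brackets; each stop halves the light
brackets = 3         # pictures per direction

for direction in range(0, 360, 90):
    for i in range(brackets):
        shutter = base / (2 ** (spacing * i))
        print("%3d degrees: 1/%d s" % (direction, round(1.0 / shutter)))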

Slate

It is good practice to shoot a slate for every panorama that you shoot. The reason is that when you are stitching your photos it might be difficult to find where one sequence ends and another begins. An example of a slate made with a smartphone is illustrated in Figure 5.

A slate for each panorama should include the following (a small formatting sketch follows Figure 5):
  • Project name
  • Scene (e.g. Forest, Silver Lake, Garage Interior etc.)
  • Point Number (Different shooting point within the same scene)
Figure 5: Example of an Android slate
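If you prefer a digital slate, here is a minimal Python sketch (the field values are hypothetical) that formats the three fields above into a single display string:

def make_slate(project, scene, point):
    # Combine project name, scene, and shooting point into one slate line
    return "%s | %s | Point %02d" % (project, scene, point)

print(make_slate("MyFilm", "Silver Lake", 3))  # MyFilm | Silver Lake | Point 03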

Stitching images together

I would recommend PTGui for stitching your images. It does a fantastic job of combining multiple-exposure Canon RAW images and then stitching them together.
If you did not mess up alignment or image exposure along the way, this process should be automatic.

Figure 6: Example of a final lat-long panorama

Sources

 


Maya shader transparency and alpha multiplication

Problem statement: black edges on objects with transparency in Maya 2015, V-Ray 3.0.

Everyone who has ever worked on integrating a computer generated (CG) image over live action footage knows the unpremult/premult workflow. The main rule is that before applying any color corrections to a CG render, it must be divided by its alpha channel (unmultiplied). The whole thing about pre-multiplication and transparency is quite simple.

What in Nuke jargon is called “premult” is, in normal math language:

newRGB = RGB * Alpha

Similarly for “unpremult”:

newRGB = RGB / Alpha

The problem occurs when you get footage which is already premultiplied and you multiply your RGB channels by alpha again. In fact, that is exactly what happens when you try to use footage with an alpha channel as transparency in Maya.

If you think your image is Unpremultiplied (but it’s really Premultiplied) and you add a Premult node you basically multiply the image twice (RGB x Alpha x Alpha). If you have an RGB pixel of 0.6 and you multiply it by the alpha pixel of 0.4 you get a correct value of 0.24 for that RGB pixel. But if you premultiply the image twice you are effectively doing 0.6 x 0.4 x 0.4 giving you a value of 0.096, which darkens the edge.

There are situations when you will want to Unpremultiply an image. The general rule for any form of colour correction is: Unpremult the image first, do the colour correction and then Premult back to its correct state. This is so you don’t accidentally colour correct any of the transparent Alpha edges. [2]
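Here is the same arithmetic as a minimal Python sketch (the 1.5 gain is just an arbitrary example grade):

rgb, alpha = 0.6, 0.4

premult = rgb * alpha             # 0.24  -- the correct edge value
double_premult = premult * alpha  # 0.096 -- the darkened edge artifact

# Colour correction done right: unpremult, grade, then premult back
gain = 1.5
graded = (premult / alpha) * gain * alpha
print(premult, double_premult, graded)  # roughly 0.24, 0.096, 0.36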

Correct workflow with premultiplied assets [1]
Incorrect workflow with premultiplied assets, causing black edge artifacts [1]
Sources

  1. http://yamagishi-2bit.blogspot.com/2012/06/aftereffectspremultiplyunpremultiply.html
  2. http://www.spherevfx.com/written-training/miscellaneous-written-training/understanding-premultiplied-images/

Maya Linear Workflow vs. Texture Gamma

Linear workflow is the subject of a lot of mystery. Indeed, it is convoluted and hard to understand, especially when there are multiple ways to achieve similar but not identical results.
Linear workflow means that all your images go through the pipeline in linear color space. For example, the render output from Maya in linear space goes to Nuke, where it gets processed and rendered to the final delivery format in sRGB.
In this article I want to show the observations and experiments that I have done with V-Ray and Maya, and how things that look similar at first glance can produce quite different results.

The most important thing to know is that all render engines perform their calculations with linear colors, thus all the sources supplied to them, such as texture images, have to be linear. People often forget that shaders also need to be supplied with linear images.

Most source files (.jpg, .png, .tif, etc.) used as textures have a non-linear gamma (2.2) applied to them.

The majority of render engines perform texture filtering in two places: first, when a shader processes a texture, and second, when the render engine calculates the image (anti-aliasing). We want to convert our images before the shader starts its calculations, otherwise the filtering math will be wrong.

When you select a gamma conversion in an application there are usually a couple of choices: sRGB and Gamma 2.2. These color spaces have slightly different curves, which will probably be indistinguishable. Here is an example of the two overlapping curves taken from the Nuke LUT settings.

[Image: overlapping sRGB and Gamma 2.2 curves from the Nuke LUT settings]
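To see how close the two curves really are, here is a minimal Python sketch comparing the standard piecewise sRGB encoding with a pure 2.2 power function:

def srgb_encode(x):
    # Piecewise sRGB curve: linear toe near black, power segment above
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1.0 / 2.4) - 0.055

def gamma22_encode(x):
    return x ** (1.0 / 2.2)

for x in (0.001, 0.01, 0.18, 0.5, 1.0):
    print("%.3f  sRGB=%.4f  gamma2.2=%.4f" % (x, srgb_encode(x), gamma22_encode(x)))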

My test setup includes two texture images. One texture is used as-is in gamma 2.2 color space; the other was converted to linear using the Nuke Colorspace node.

[Images: the brick texture in gamma 2.2 and converted to linear]

[Image: Nuke Colorspace node converting from gamma 2.2 to linear]

First, I disabled all filtering on the texture as well as in the render settings in order to get a clean, more predictable result. In terms of linear workflow I have been testing the following cases:

Case     Input                  Option
Case 1   Texture in gamma 2.2   Linear Workflow toggle on
Case 2   Texture in gamma 2.2   V-Ray Texture Input Gamma 2.2
Case 3   Linear texture         All off

Here is an example with V-Ray Texture Input Gamma set to 2.2.

[Image: V-Ray Texture Input Gamma extra attribute]

This image shows the Linear Workflow toggle turned on.

[Image: the Linear Workflow toggle in the V-Ray render settings]

With the linear workflow toggle on, you will get a warning saying that the toggle is deprecated, but the documentation does not explain why; I also tried to find information on the Chaos Group forum with no luck.

Linear workflow – this option is deprecated and will be removed in future versions of V-Ray. When this option is checked V-Ray will automatically apply the inverse of the Gamma correction that you have set in the Gamma field to all VRayMtl materials in your scene. Note: this option is intended to be used only for quickly converting old scenes which are not set up with proper linear workflow in mind. This option is not a replacement for proper linear workflow.
http://docs.chaosgroup.com/display/VRAY3/Color+Mapping#

The following image demonstrates the difference between the Linear Workflow toggle and on-texture gamma correction. The main thing to note here is that the image corrected with the V-Ray extra attribute (Case 2) shows no difference from the linear image without any correction (Case 3). On the other hand, the image rendered with the linear workflow toggle on has noticeable degradation in the dark colors.

[Image: rendered comparison of the three cases]

Most likely the degradation happens due to the filtering that takes place when the texture gets processed by the shader. By specifying the texture input gamma on the shader, V-Ray can convert the texture to linear space before doing the actual calculations. When linear workflow is instead turned on in the render settings, the texture gets filtered with its non-linear gamma still applied, which leads to errors in the shader math; only after that does the renderer convert the texture to linear space and perform further calculations.
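A minimal Python sketch (with two arbitrary neighbouring texel values) shows why the filtering order matters; averaging the gamma-encoded texels and linearizing afterwards gives a noticeably darker result than linearizing first:

g = 2.2
dark, bright = 0.0, 1.0  # neighbouring texel values, gamma 2.2 encoded

# Correct order: linearize each texel, then filter (average)
linear_first = (dark ** g + bright ** g) / 2.0   # 0.5

# Wrong order: filter the encoded texels, then linearize the result
filter_first = ((dark + bright) / 2.0) ** g      # about 0.22, darker

print(linear_first, filter_first)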

Resources

This article was inspired by a great tutorial from Alexey Mazurenko. Unfortunately, it is only available in Russian.
https://vimeo.com/53806660

A fabulous paper from SIGGRAPH about RenderMan:

http://old.siggraph.org/publications/2006cn/course25.pdf

What the RI spec never told you:

http://renderman.pixar.com/view/what-the-rispec-never-told-you

Houdini Materials Override

Overriding materials in Houdini is not as simple a task as you might think. There are many ways to do it, such as Takes or the shop_surfacepath method mentioned here (http://forums.odforce.net/topic/13385-global-material-override/).

The Python filtering technique described in the same post by King Tapir works pretty well and does not require all of your materials to be assigned at the object level. However, it might be difficult to figure out how to use it for the first time.

Make a .py file. In my case it is called filter.py and contains the following code.

import mantra

def filterInstance():
    # Replace the object's surface shader with the AO material
    mantra.setproperty('object:surface', 'op:/shop/AO'.split())
    # Ignore any per-primitive shaders and use only the object's shader
    mantra.setproperty('object:overridedetail', 1)

print "Python Filter Finished"
  • filterInstance(): called just prior to the ray_end statement which locks off the settings for an instance object. The function can query object: settings and possibly alter them.
  • object:surface = (''): the surface shader attached to the object.
  • vm_overridedetail / object:overridedetail = ('false'): when geometry has shaders defined on a per-primitive basis, this parameter will override those shaders and use only the object's shader. This is useful when performing matte shading on objects. Not supported for per-primitive material assignment (material SOP).
  • 'op:/shop/AO': the SHOP material that we are going to override with.
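As a variation (a sketch based on the mantra filtering docs; the 'hero' name check is just an example), you can restrict the override to particular objects by querying the instance name first:

import mantra

def filterInstance():
    # Only override objects whose path contains 'hero'
    name = mantra.property('object:name')[0]
    if 'hero' not in name:
        return
    mantra.setproperty('object:surface', 'op:/shop/AO'.split())
    mantra.setproperty('object:overridedetail', 1)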

Tell mantra to use the filter.py file as a Python filter.

[Image: mantra command line with the Python filter specified]
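From the command line this corresponds to mantra's -P option (treat the exact invocation as an assumption to verify against the screenshot above):

mantra -P filter.py -f scene.ifd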

Also, don't forget to force the writing of all shaders into the .ifd by setting Declare All SHOPs.

[Image: Declare All SHOPs setting]

Create a simple SHOP material. In this case I am making a very simple ambient occlusion shader.

[Image: simple ambient occlusion SHOP material]

If everything is set correctly, you should see the output “Python Filter Finished” in your console.

[Image: console output showing the filter message]

This is my result

[Image: ambient occlusion render test]

Useful links to explore:

http://www.sidefx.com/docs/houdini11.0/rendering/ifd

Converting Houdini Non-Commercial Files

This task was bothering me for a long time. We have an educational license of Houdini at our school which works without any limitations within the school network. But I often want to reuse assets created at school for other projects; however, an attempt to do so transforms my .hip file into an apprentice version with all of the limitations applied.

By watching Ari Danesh's tutorials (https://vimeo.com/60757344) I found a couple of very useful commands.

opscript

http://www.sidefx.com/docs/houdini14.0/commands/opscript

cmdread

http://www.sidefx.com/docs/houdini14.0/commands/cmdread

This is an example of usage.

opscript -G -r / > F:/un/objgroups.cmd

And then load the generated .cmd back.

cmdread F:/un/objgroups.cmd
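The same pair of commands can also be run from Houdini's Python shell via hou.hscript (a sketch; F:/un is just my example directory):

import hou

# Dump the whole scene as hscript commands (same as the opscript line above)
hou.hscript('opscript -G -r / > F:/un/objgroups.cmd')

# Later, in a session on a commercial license, read the commands back in
hou.hscript('cmdread F:/un/objgroups.cmd')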

I did not test it with locked digital assets; according to the documentation, it will not work.

Unpacking Houdini Files

I just stumbled across this thread:

http://www.sidefx.com/index.php?option=com_forum&Itemid=172&page=viewtopic&p=28954&sid=2170ee4566cf91f7b356d0951dcf0501

It took me some time to figure it out because the standard “hexpand” utility does not quite work on Windows. I also failed to solve the “Permission denied” error discussed in the thread.

Finally, I just used the “hcpio.exe” utility. It seems to work pretty well.

http://www.sidefx.com/docs/houdini14.0/ref/utils/hcpio

Here is the command.

"C:\Program Files\Side Effects Software\Houdini 14.0.201\bin\hcpio.exe" -idI F:\Projects\test_prj_01\halway_micropoly_01.hip

Nuke Custom LUTs

Problem

Our post-production facility receives DPX plates with .3dl LUT files from DaVinci Resolve. The LUTs are for two different cameras, one of which is an ARRI Alexa and the other a RED. The task is to apply these LUTs to the dailies rendered by all artists.

Solution

First, we need to create a gizmo for each LUT containing a Nuke Vectorfield node with the following expression in the “vectorfield file” field.

[getenv NUKE_LUT]/A_CAM_ArriLogC.3dl

where “getenv NUKE_LUT” grabs the session environment variable named “NUKE_LUT”, which contains the path to the LUTs folder.


Nuke_vectorfield_example


Now create an “init.py” file inside your .nuke directory.

 C:\Users\User name\.nuke

The “init.py” file should look something like this:

import nuke
import os

# Path to the directory containing this file
cwd = os.path.dirname(os.path.realpath(__file__))

# Provide env variable for use in vectorfield node as a relative path
os.environ['NUKE_LUT'] = os.path.join(cwd, 'BB_LUT')

# Paths to viewer gizmos
RED_Viewer_LUT = os.path.join(cwd, 'BB_LUT', 'BB_RED_Viewer_LUT.gizmo').replace("\\", "/")
ARRI_Viewer_LUT = os.path.join(cwd, 'BB_LUT', 'BB_ARRI_Viewer_LUT.gizmo').replace("\\", "/")

# Paths to node gizmos
RED_LUT = os.path.join(cwd, 'BB_LUT', 'BB_RED_LUT.gizmo').replace("\\", "/")
ARRI_LUT = os.path.join(cwd, 'BB_LUT', 'BB_ARRI_LUT.gizmo').replace("\\", "/")

# Register custom LUTs
nuke.ViewerProcess.register("BB_ARRI_LUT", nuke.Node, (ARRI_Viewer_LUT, ""))
nuke.ViewerProcess.register("BB_RED_LUT", nuke.Node, (RED_Viewer_LUT, ""))

# Register a custom menu for creating the LUT gizmos
nuke.menu('Nodes').addCommand( 'BB_Tools/ARRI_LUT', lambda: nuke.createNode(ARRI_LUT))
nuke.menu('Nodes').addCommand( 'BB_Tools/RED_LUT', lambda: nuke.createNode(RED_LUT))


Besides registering the gizmos for the Nuke Tab menu, I also register them as custom Nuke display LUTs.

[Images: the custom LUTs in the Nuke viewer menu and as nodes]


Note how the “init.py” script sets the ‘NUKE_LUT’ environment variable, which holds the path to the current directory plus the LUT sub-directory.

After that we should be able to access that variable through the following TCL expression inside of Nuke.

getenv NUKE_LUT

Now we can copy the folder containing our LUTs, gizmos, and “init.py” to any Windows or OS X machine without worrying about hard-coded paths to our LUTs.

DOWNLOAD PROJECT FILES