It's a circus out there

2009

Rendering


Mplay Tips

1st line (first pair of numbers, next to the image) - the pixel coordinates you are inspecting. They run from left to right and top to bottom (the top-to-bottom direction depends on the image type).


2nd line (second pair of numbers, next to the image) - the UV location you are inspecting (from 0 to 1, left to right and top to bottom).


3rd line (first set of 4 numbers) - the RGBA values in 8-bit notation, where 0 is the minimum value and 255 the maximum (this notation is used mostly for computer monitor displays).


4th line (second set of 4 numbers) - the RGBA values normalized (what you normally use in a colour wheel, where 0 is the minimum and 1 the maximum). The set of 3 numbers preceded by HS/Lum is the hue, saturation and value for the pixel you are inspecting.
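For example, the normalized values are just the 8-bit values divided by 255, so a pixel that reads 128 64 255 255 on the 3rd line will read roughly 0.502 0.251 1.0 1.0 on the 4th line.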



General Tips


  1. To open an image at the correct aspect ratio, type “ mplay myimage -x2 ” in a shell. This will open the image at the correct ratio.
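From memory mplay also takes a gamma flag on the command line, so something like “ mplay myimage -x2 -g 2.2 ” should open the image at the correct ratio and at gamma 2.2 in one go (worth checking against mplay -h for your build, as the flag name here is not from the original tip).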

Irradiance cache

Apply the extra parameters to the mantra ROP. This way you can write out your irradiance cache. Use the .pc extension and the cache will be written out as a point cloud file, which you can then view by loading it into a geometry node.

PBR Render Engines:-


There are two render engines available (PBR stands for physically based rendering):


  1. Micropolygon PBR decouples the min ray samples from the pixel samples. The pixel samples control the motion blur and anti-aliasing settings, while the Min Ray Samples setting controls the quality of the global illumination. A handy tip is to make the min ray samples the square of the pixel samples parameter, so if your pixel samples are 12, 12 your min ray samples will be 12^2, which is 144.

  2. The PBR engine uses pure ray tracing, and the pixel samples parameter controls everything.

  3. PBR :- To get rid of noise when using micropolygon PBR you need to increase the max ray samples. This figure is normally the product of the two pixel samples values, so if your pixel samples are 12 x 12 you need to increase the max ray samples to 144, which is basically 12 squared (a textport sketch of this follows the list).
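A minimal textport sketch of the sampling rule above (the vm_* parameter names and the /out/mantra1 node name are from memory rather than from this page, so check them against your own mantra ROP before relying on them):

# pixel samples of 12 x 12 on the mantra ROP
# min/max ray samples = 12 x 12 = 144
opparm /out/mantra1 vm_minraysamples 144
opparm /out/mantra1 vm_maxraysamples 144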

The  irradiance function is used to compute irradiance in mantra. Irradiance is computed by sampling stochastically over the hemisphere and computing the incoming illumination from the other geometry in the scene.

Tips

  1. Make sure geometry has a thickness or you will get light leaking

Verbose information from a render:-


The following is an example of timing information and how to read it. 


Render Time: 389.000u 1.062s Memory: 72.13 MB of 74.25 MB arena size. 

The u value is the actual time in seconds mantra took to render the image.

The s value is the system overhead incurred in rendering the frame (disk I/O, swap, etc.). This value might not be 100% accurate depending on the OS and other system variables.

Arena size is the amount of memory mantra allocated to actually render the image. It does not reflect how much memory mantra actually used.  

Mantra needs to grab contiguous chunks of memory as it builds its data structures. Once it frees the data, the operating system controls the arena size, shrinking it back to the free pool of available memory where it finds contiguous chunks. This is memory allocation and memory deallocation. You don't want the arena size to be much larger than the actual memory used.

Ray Bias:-


"Ray Bias" (or "Shadow Bias") where it cheats the ray origin a tiny bit to prevent self-shadowing due to (1) numerical issues, and (2) the approximation due to planar faces being shaded as if smooth surfaces.

There are two solutions: (a) reduce your shadow bias (on your light or shader) until it looks acceptable, and if this doesn't eliminate the issue without artifacts, (b) model some thickness into the little box/house you have there. The only thing you need to ensure is that your walls are thicker than the shadow bias amount; that way no rays will make their shadow tests outside of your house walls.
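As a purely illustrative textport sketch of option (a) - the node path and the shadow_bias parameter name here are assumptions, not from this page, so check your light's Shadow tab for the real channel name:

# hypothetical light and parameter names - lower the bias until the artifacts stop
opparm /obj/hlight1 shadow_bias 0.01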

Linear workflow:-


First off, unless you go out of your way, shaders are linear. Texture maps, typically sRGB, need to be linearized before use (the same as with any renderer). All you need to do is set the mplay you're doing your shader tests in to a gamma of 2.2 (lower right of the interface). Then light/shade.
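As a rough rule of thumb for that linearization step (the true sRGB curve is piecewise, so treat the 2.2 power as an approximation rather than the exact transform): linear = pow(srgb, 2.2), and going back the other way for display is display = pow(linear, 1/2.2).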

Get it to look the way you want in that context and you have a linear workflow - the images rendered from that should be read in as 'linear' in Nuke, and you're done. The goal is to have everything - the maps, the render and the final image - all linear. If you mplay your rendered image in a default mplay, it should appear 'darker' than you would expect. This is because mplay (and most other image viewers) makes some assumptions about what it is displaying and where. Nuke is more anal about assumptions of colour space.


If I'm matching to a live plate, I'll typically throw a BG grid in there with a linearized plate constantly displayed on it, and render against that. You need to be careful here - again, if you mplayed that plate it would look dark unless you set the gamma in mplay to 2.2. Don't have a Cineon or sRGB plate there, or something's out of the linear pipeline.


Rendering and Using a Vector Pass:-


Create the shader :-  I tend to package it up with a toggle parameter in an if statement so it can be dropped into a surface material, giving the option to run it and output it as an arbitrary output.


Set up the rop :-  TO COME


Use it in Nuke :-  TO COME


Use Pre and post render scripts to render a separate shadow pass :-


An example file to show the use of pre and post render scripts to render a separate shadow pass using a switch node to control which shader gets called at render time.


Notes:-

Refer to OS tips for textport use and expression testing.

The example uses a global variable set in the houdini.env file (a sample line follows these notes).

The switch SHOP uses $RENDERPASS to toggle between 1 and 0.

The example uses takes to switch shadows off for the beauty pass.
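If you want the same global in your own setup, the houdini.env line would look something like this (standard VAR = value houdini.env syntax; the name RENDERPASS just matches the example above):

RENDERPASS = 0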


The render ROP is chained and uses a pre and post render script to drive the switch SHOP.


  1. The pre render script is:  set RENDERPASS = 1; varchange

  2. The post render script is:  set RENDERPASS = 0; varchange
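The idea is that the switch SHOP's input-selection parameter simply holds the expression $RENDERPASS, so input 0 (the beauty shader) is used normally and input 1 (the shadow shader) is picked up while the pre render script has set the variable to 1. (The exact parameter label on the switch node is from memory here, so treat this as a sketch of the idea rather than a copy of the hip file.)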


Hip file example :- rse_rop_shadowSwitcher_project.rar

Utility Image planes:-


Rendering a P Pass:-

Add the P image plane in your rop.

Quantize :- 32 bit float as the output so as not to get any antialiasing on the edges.

Sample filter :- to Closest Surface

Pixel filter :- to minmax min

 

If you anti-alias the P channel, you don't actually have a real P value there but a blurred nonsense value that will cause errors in the composite. The standard P output is in camera world space.
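For reference, a textport sketch of adding an extra image plane like this on a mantra ROP (the vm_*_plane parameter names and the menu tokens below are from memory, not from this page, so verify them on your own ROP before relying on them):

# one extra image plane slot, pointed at P
opparm /out/mantra1 vm_numaux 1
opparm /out/mantra1 vm_variable_plane1 P
opparm /out/mantra1 vm_quantize_plane1 float
opparm /out/mantra1 vm_sfilter_plane1 closest
opparm /out/mantra1 vm_pfilter_plane1 'minmax min'

The Pz and N planes below follow the same pattern, swapping P for Pz or N in the variable parameter.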




Rendering a Pz Pass:-


Add the Pz image plane in your rop.

Quantize :- 32 bit float as the output so as not to get any antialiasing on the edges.

Sample filter :- to Closest Surface

Pixel filter :- to minmax min


Set the sample filter to “closest surface” and the pixel filter to “minmax min” so there is no antialiasing or feathering of edges, which would give useless depth information.



Rendering a Normal Pass:-


Add the N image plane in your rop.

Quantize :- 32 bit float as the output so as not to get any antialiasing on the edges.

Sample filter :- to Closest Surface

Pixel filter :- to minmax min



Hip file example :- rop_extraimageplanes.hip

H11 PBR test scene

Notes :-  

A basic scene for testing PBR in H11. Uses H11 shaders.

scene file :- rse_pbr_H11_test.hip

otl file :-


H12 Scene set up for rendering / lighting


Notes :-   Houdini renders linear by default, but you view and set up swatches in a gamma 2.2 colour space. This normally means you set a gamma of 2.2 in mplay to view your image correctly. But what about swatch colours and other such areas of Houdini?


To get the correct setting for colours, so you don't have to pick them blind, go to Edit > Color Settings > Color Correction and set it as shown in the image.

All your renders will still be linear.


This of course is only done if no LUT is being used.




H12 Mantra PBR render.


Notes :-  For PBR renders when not using a LUT, set the color space parameter to gamma 2.2. Why?


It is the color space in which the sampling is performed.

If you set it to gamma 2.2, the noise level threshold will be tested on samples in gamma 2.2 color space as opposed to linear, so it will result in more samples in darker areas.

In other words, with this set to gamma 2.2 you will have a more constant noise step across your render when you look at the linear image with gamma 2.2 applied on top, because it was sampled in a color space similar to how the human eye perceives light information.

If the sampling was performed in linear color space you may see more noise in dark areas, since the human eye is more sensitive to them. So in conclusion, change it to gamma 2.2 whenever you are viewing your renders with gamma 2.2 on top, which should be every time you want physically correct lighting.



Sage words of wisdom from Tomas Slancik
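If you want to flip that from the textport rather than the parameter pane, it would be something like the line below (the vm_colorspace name and the gamma token are assumptions from memory, not from the note above, so check the actual parameter on your mantra node):

# hypothetical parameter/token names - sample in gamma 2.2 rather than linear
opparm /out/mantra1 vm_colorspace gamma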


H12 Hbatch.


Notes :- Launch from the command line tools


hbatch myfile.hip

cd /out

render mymantranode


quit = will quit you out of hbatch and back into the shell.


Example hbatch render command:  render -f 87 300 -Va1 -s mymantra_rop
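The same render can also be fired off non-interactively by handing hbatch the commands up front, e.g. (the -c flag runs the quoted commands; worth double checking against hbatch -h on your build):

hbatch -c "render -f 87 300 -Va1 -s mymantra_rop; quit" myfile.hip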

H13 Post render Geometry ROP.


Notes :- Use opparm to execute a parameter, i.e. hitting the reload button on a File SOP and setting the load type. The Hscript is entered in the ROP's post render tab:


opparm -c /`opname("../..")`/box_object1/file1 reload

opparm -c /`opname("../..")`/box_object1/file1 loadtype 0


scene file :- rse_georop_post_renderscript.hip

otl file :-