

Shaders and RenderMan


Introduction

Shader Types

Shader Space  


Introduction

At the core of RenderMan is the versatile RenderMan Shading Language. What follows is a discussion of some key concepts that may benefit users of RenderMan for Maya. Because Maya Materials are converted automatically, this knowledge isn't strictly required, but it is valuable for understanding how shaders are implemented in RenderMan and will benefit those who wish to create well-executed shaders.

All shaders answer the question, "What's going on at this point?" That is, a shader determines the color of a point by running algorithms to gather information about certain scene elements. The RenderMan Shading Language allows shaders to ask very complicated questions about what is happening at any given point. With the RenderMan Shading Language, shaders can be created without limitations, familiar shading models can be extended, or totally unique shaders can be developed. Physical properties of materials can be simulated or flat out defied in order to deliver whatever look is desired. The effects of studio lighting can be constructed by building lights that have special lenses, concentrators, flaps, or diffusers. Material types can also be combined, simulating the many coats of paint or finish applied to a surface. Remarkably realistic images can be produced with a few fairly simple shapes which have shaders that are asking the right questions. 
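
For illustration, here is a minimal surface shader written in the RenderMan Shading Language, closely following the classic "plastic" model from the RISpec; the parameter names (Ka, Kd, Ks, roughness, specularcolor) are the conventional ones rather than anything required:

    surface plastic(float Ka = 1, Kd = 0.5, Ks = 0.5, roughness = 0.1;
                    color specularcolor = 1)
    {
        /* Shading normal facing the viewer, and the view direction. */
        normal Nf = faceforward(normalize(N), I);
        vector V = -normalize(I);

        /* Combine ambient and diffuse illumination with a specular highlight,
           scaled by the surface's own color and opacity. */
        Oi = Os;
        Ci = Os * (Cs * (Ka * ambient() + Kd * diffuse(Nf)) +
                   specularcolor * Ks * specular(Nf, V, roughness));
    }

At each shaded point the shader gathers the incoming light (via ambient(), diffuse(), and specular()) and combines it with the surface's own color and opacity to answer "what's going on at this point?"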

Shaders can be generated in a number of ways. RenderMan for Maya renders Maya Materials as well as shaders created by Slim. Alternatively, custom shaders can be written in the RenderMan Shading Language and then imported into RenderMan for Maya. Either way, high-quality shaders can be created with whichever workflow is preferred.


Custom RenderMan shaders can be imported into RenderMan for Maya. Once imported into Maya, the parameters of these custom RenderMan shaders can be animated, but other Maya Materials cannot be wired into them. These RenderMan shaders can be connected to a top-level shading group, though those connections are constrained to the three available slots: surface, displacement, and volume.



Shader Types

RenderMan understands five types of shaders:

Surface Shader: Surface shaders are attached to all geometric primitives and are used to model the optical properties of the materials from which the primitive was constructed. A surface shader computes the light reflected in a particular direction by summing over the incoming light and considering the properties of the surface.

Displacement Shader: Displacement shaders change the position and/or normals of points on the surface and can be used to place bumps on surfaces (a minimal sketch follows this list).

Light Shader: Lights may exist alone or be attached to geometric primitives. A light source shader calculates the color of the light emitted from a point on the light source towards a point on the surface being illuminated. A light will typically have a color or spectrum, an intensity, a directional dependency, and a fall-off with distance.

Volume Shader: Volume shaders modulate the color of a light ray as it travels through a volume. Volumes are defined as the insides of solid objects. The atmosphere is the initial volume defined before any objects are created.

Imager Shader: Imager shaders are used to program pixel operations that are done before the image is quantized and output.
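
As rough sketches of two of the types above, here is a small displacement shader and a point light shader. The light closely follows the standard "pointlight" shader from the RISpec, while the displacement shader's name and parameters (Km, texturename) are illustrative assumptions:

    /* A minimal displacement shader: reads an amplitude from an optional
       texture, pushes P along the normal, and rebuilds the normal. */
    displacement simplebump(float Km = 0.1; string texturename = "")
    {
        float amp = 0;
        if (texturename != "")
            amp = float texture(texturename, s, t);
        P += Km * amp * normalize(N);
        N = calculatenormal(P);   /* normals must be recomputed after moving P */
    }

    /* A minimal point light, essentially the RISpec "pointlight":
       intensity falls off with the square of the distance (L . L). */
    light pointlight(float intensity = 1;
                     color lightcolor = 1;
                     point from = point "shader" (0, 0, 0))
    {
        illuminate(from)
            Cl = intensity * lightcolor / (L . L);
    }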


Shader Space

RenderMan for Maya automatically translates Maya Materials, including associated projections. In certain cases, however, RenderMan's shader space attributes can be helpful (e.g. when using imported custom RenderMan shaders or when an esoteric shader space like "NDC" is required).

Shader space defines how 3D procedural shaders and 2D projections are applied to objects in a scene. This is in contrast to using the natural parameterization of a surface (UV mapping) to map 2D textures onto objects. A 3D procedural shader generates its pattern throughout three-dimensional space; such shaders are often called solid shaders. A solid shader must be given a point in 3D space as the center from which the pattern emanates, and that point is defined with a coordinate system.
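
As a sketch of a solid shader, assuming an illustrative name and frequency parameter, the following surface shader evaluates a stripe pattern in "shader" space, i.e. the coordinate system that was active when the shader was instanced:

    surface solidstripes(float frequency = 4)
    {
        /* Express the shading point in "shader" space so the pattern is
           anchored to the shader's coordinate system rather than to the
           surface's UV parameterization. */
        point Psh = transform("shader", P);
        float stripe = mod(xcomp(Psh) * frequency, 1);
        normal Nf = faceforward(normalize(N), I);
        Oi = Os;
        Ci = Oi * Cs * (stripe < 0.5 ? 1 : 0.4) * diffuse(Nf);
    }

Because the pattern lives in 3D space, two objects that interpenetrate will show continuous stripes across their shared volume, which is the hallmark of a solid shader.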

Any node in Maya (object, light, etc.) can be used to declare a coordinate system for a procedural shader, but the RISpec already provides a number of useful predeclared shader spaces (a short example using one of them follows the list):

"object": The coordinate system in which the current geometric primitive is defined. The modeling transformation converts from object coordinates to world coordinates.

"world": The standard reference coordinate system. The camera transformation converts from world coordinates to camera coordinates.

"camera": A coordinate system with the vantage point at the origin and the direction of view along the positive z-axis. The projection and screen transformation convert from camera coordinates to screen coordinates.

"screen": The 2-D normalized coordinate system corresponding to the image plane. The raster transformation converts to raster coordinates.

"raster": The raster or pixel coordinate system. An area of 1 in this coordinate system corresponds to the area of a single pixel. This coordinate system is either inherited from the display or set by selecting the resolution of the image desired.

"NDC": Normalized device coordinates; like "raster" space, but normalized so that x and y both run from 0 to 1 across the whole (un-cropped) image, with (0,0) at the upper left of the image and (1,1) at the lower right (regardless of the actual aspect ratio).

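As a sketch using one of the predeclared spaces, the following surface shader transforms the shading point into "NDC" space to fade a surface out toward the left and right edges of the frame; the shader name and edge parameter are illustrative, and a user-declared space (for example one published from a Maya node) would be referenced by name in exactly the same way:

    surface screenfade(float edge = 0.1)
    {
        /* Position of the shading point in normalized device coordinates:
           x and y both run from 0 to 1 across the full image. */
        point Pndc = transform("NDC", P);
        float x = xcomp(Pndc);
        float fade = smoothstep(0, edge, x) * (1 - smoothstep(1 - edge, 1, x));
        normal Nf = faceforward(normalize(N), I);
        Oi = Os;
        Ci = Oi * Cs * fade * diffuse(Nf);
    }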

 




 

 

Pixar Animation Studios
Copyright © Pixar. All rights reserved.
Pixar® and RenderMan® are registered trademarks of Pixar.
All other trademarks are the properties of their respective holders.