mental ray supports texture, bump, displacement, and reflection mapping, all of which may be derived from an image file or defined procedurally by user-supplied shaders. The mental ray shader interface provides various support functions for these operations, but all of them can also be implemented entirely in shaders.
Procedural textures are computed by shaders, while image textures are read from image files. In practice, textures are often a combination of both methods: a procedural texture shader accepts an image texture parameter, reads and filters its pixels, and then modifies the result according to other parameters to implement projections, scaling, cropping, replication, and other common operations on texture images. Purely procedural texture shaders that do not rely on texture images at all also exist, for example marble or cloud shaders. Some shaders use textures for special purposes; for example, a fur shader that computes hair in a volume might use a texture image to control the hair length or brushing direction.
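As an illustration, the following is a minimal sketch of such a combined texture shader, written against the standard mental ray C shader interface (shader.h). The shader and parameter names ("wrap_tex", "tex", "repeat", "tint") are hypothetical, and error handling is reduced to a simple fallback.

```c
#include <shader.h>

struct wrap_tex_paras {
    miTag    tex;      /* image texture to read */
    miVector repeat;   /* replication factors in u and v */
    miColor  tint;     /* multiplied onto the looked-up color */
};

DLLEXPORT int wrap_tex_version(void) { return 1; }

DLLEXPORT miBoolean wrap_tex(
    miColor *result, miState *state, struct wrap_tex_paras *paras)
{
    miTag    tex    = *mi_eval_tag(&paras->tex);
    miVector repeat = *mi_eval_vector(&paras->repeat);
    miColor  tint   = *mi_eval_color(&paras->tint);
    miVector coord  = state->tex_list[0];      /* first texture space */

    /* scale the coordinates to replicate the image, then read it */
    coord.x *= repeat.x;
    coord.y *= repeat.y;
    coord.z  = 0.0;
    if (!mi_lookup_color_texture(result, state, tex, &coord))
        result->r = result->g = result->b = result->a = 1.0;  /* fallback */

    /* procedural modification of the image texture */
    result->r *= tint.r;
    result->g *= tint.g;
    result->b *= tint.b;
    return miTRUE;
}
```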
The following table lists the file formats accepted by mental ray for reading texture image files:
| format | description | color map | compress | comp. | bits/comp. | extensions |
|---|---|---|---|---|---|---|
| rla, rlb | Wavefront image | - | RLE | 3, 4 | 8, 16 | color+z channel |
| pic | Softimage image | - | RLE, - | 3, 4 | 8 | |
| alias | Alias image | - | RLE | 3 | 8 | |
| rgb | Silicon Graphics color | - | RLE, - | 3, 4 | 8 | |
| jpg | JFIF image | - | JPEG | 3 | 8 | |
| png | Portable Network Graphics | - | RLE, - | 3, 4 | 8 | |
| | | yes | RLE, - | 3, 4 | 8 | |
| exr | OpenEXR | - | all | 1, 3, 4 | half, float | tiled storage, multi channel, filter pyramid |
| tif* | TIFF image | - | RLE, Deflate | 1 | 1, 4, 8 | |
| | | - | RLE, Deflate | 3, 4 | 8, 16, float | |
| | | yes | RLE, Deflate | 3, 4 | 4, 8 | |
| iff | Maya IFF image | - | RLE, - | 1, 3, 4 | 8, 16, float | tiled storage, color+z channel |
| picture | Dassault Systèmes PICTURE | - | RLE | 3 | 8 | |
| hdr | Radiance RGBE | - | - | 4 | 8 | |
| ppm | Portable pixmap | - | - | 3 | 8, 16 | |
| tga | Targa image | - | RLE, - | 1, 3, 4 | 8 | |
| | | - | RLE, - | 3 | 5 | |
| | | - | RLE, - | 4 | 5/1 | |
| | | yes | RLE, - | 3, 4 | 8 | |
| lwi | Solidworks texture (read-only) | - | RLE | 3 | 8 | |
| bmp | MS Windows/OS2 bitmap | - | - | 3, 4 | 8 | |
| | | yes | - | 3, 4 | 1, 4, 8 | |
| dds | DirectX texture | - | DXTn | 1, 3, 4 | 8, 16, float | |
| qnt | Quantel/Abekas YUV image | - | YUV | 3 | 3 | |
| ct* | mental images texture | - | - | 4 | 8, 16, float | |
| st* | mental images alpha texture | - | - | 1 | 8, 16 | |
| | | - | - | 1 | float | |
| vt, wt | mental images basis vectors | - | - | 2 | 16 | |
| zt | mental images depth | - | - | 1 | float | |
| nt, mt | mental images vectors | - | - | 3 | float | |
| tt | mental images label (tag) | - | - | 1 | 32 | |
| bit | mental images bit mask | - | - | 1 | 1 | |
| map | memory mapped texture | - | - | any | any | filter pyramid |
| | memory mapped (tiled) texture | - | - | any | any | tiled storage, filter pyramid |
In the table, any combination of the comma-separated values forms a valid format subtype. For example, the SGI RGB image format is read when the data type is 8 bits per component, with or without alpha, either RLE-compressed or uncompressed. The actual image format is determined by analyzing the file content, not just by checking the filename extension; this allows, for example, replacing texture files with memory-mapped textures without changing the name. The asterisk (*) is a placeholder for format variants; for example, ct* includes ctfp (floating point), cth (HDR), and ct16 (16 bits).
The extensions column lists special support for file format features. If an image can be stored in tiles rather than as a single large block, mental ray can optimize access to the file using texture caching. If texture filtering is requested for a tiled texture format, the image file must also provide the pre-filtered subimages (extension: filter pyramid) to support caching; otherwise caching is not used for that texture during rendering, and mental ray computes the filter pyramid as usual.
Typical image types such as black/white, grayscale, color-mapped, and true-color images, optionally compressed, are supported. Some of them can carry an additional alpha channel (number of components greater than 3). The collection covers the most common platform-independent formats such as TIFF and JFIF/JPEG [3] and OpenEXR [4], special UNIX (PPM) and Windows (BMP, DDS) types, and well-known application formats. The mental images formats, normally created by mental ray itself, mainly serve to exchange data that cannot be stored in other formats. As a special case, mental ray allows storing RGBE data in file formats that accept RGBA. For formats that support multiple channels, mental ray can store several frame buffers in the same file; for example, the iff and rla formats can hold both the color and the depth buffer if the type list "+rgba,z" is specified. For OpenEXR [4] files the number of channels is essentially unlimited.
A user-defined material shader is not restricted to the above applications for textures. It is free to evaluate any texture and any number of textures for a given point, and use the result for any purpose.
Typical texture shaders take a named texture map as an input parameter and return a color value taken from the texture image. Such shaders can be connected to any color input of other shaders, for example the diffuse color input of a material; the color of the diffuse component will then vary across the surface. To shade a given point on a surface, the coordinates of the point in texture space are first determined; the diffuse color used for shading calculations is then the value of the texture map at these coordinates. The shader interface is extremely flexible and permits user-defined shaders to use a number of different approaches, or completely different formats. The remainder of this section describes the standard shader parameters only.
The standard mental ray material shaders support texture mapping for all standard material parameters except the index of refraction. Shinyness, transparency, refraction transparency, and reflectivity are scalar values and may be mapped by a scalar map. Most shaders use color maps to implement bump mapping by sampling the color map three times, but some shaders accept a vector map (normal map) that requires only a single sample.
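The following hedged sketch shows the three-sample approach: a height value is read from the map at three nearby texture coordinates, the resulting gradient perturbs the shading normal, and the normal is renormalized. Real bump shaders use the bump basis vectors supplied with the geometry; here a tangent frame is improvised from the normal purely for illustration, and all shader and parameter names are hypothetical.

```c
#include <math.h>
#include <shader.h>

struct bump_demo_paras {
    miTag    bump_map;   /* height map (luminance of a color texture) */
    miScalar strength;   /* bump amplitude */
};

static miScalar height_at(miState *state, miTag tex, miScalar u, miScalar v)
{
    miColor  c;
    miVector coord;
    coord.x = u; coord.y = v; coord.z = 0.0;
    if (!mi_lookup_color_texture(&c, state, tex, &coord))
        return 0.0;
    return (c.r + c.g + c.b) / 3.0;            /* luminance as height */
}

DLLEXPORT int bump_demo_version(void) { return 1; }

DLLEXPORT miBoolean bump_demo(
    miColor *result, miState *state, struct bump_demo_paras *paras)
{
    miTag    tex = *mi_eval_tag(&paras->bump_map);
    miScalar amp = *mi_eval_scalar(&paras->strength);
    miScalar u   = state->tex_list[0].x;
    miScalar v   = state->tex_list[0].y;
    miScalar d   = 0.001f;                     /* offset in texture space */
    miScalar h, du, dv;
    miVector up, t1, t2;

    /* three texture samples give the height gradient */
    h  = height_at(state, tex, u, v);
    du = (height_at(state, tex, u + d, v) - h) / d;
    dv = (height_at(state, tex, u, v + d) - h) / d;

    /* improvised tangent frame; real shaders use the geometry's bump basis */
    up.x = 0.0; up.y = 0.0; up.z = 1.0;
    if (fabs(state->normal.z) > 0.99) { up.x = 1.0; up.z = 0.0; }
    mi_vector_prod(&t1, &state->normal, &up);
    mi_vector_normalize(&t1);
    mi_vector_prod(&t2, &state->normal, &t1);

    /* perturb and renormalize the shading normal */
    state->normal.x -= amp * (du * t1.x + dv * t2.x);
    state->normal.y -= amp * (du * t1.y + dv * t2.y);
    state->normal.z -= amp * (du * t1.z + dv * t2.z);
    mi_vector_normalize(&state->normal);

    result->r = result->g = result->b = result->a = 1.0;  /* dummy result */
    return miTRUE;
}
```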
Color textures are normally not implemented in the material shader but in a separate texture shader, which is then referenced by the material shader. This separation of work between material and texture shaders allows more flexibility, because any texture shader may be combined with any material shader without having to program the combination into a new material shader. Also, texturing does not have to be programmed into every material shader parameter. Instead, the material shader offers simple parameters like "diffuse", "ambient", "transparency", "shinyness", and so on. Each parameter may be assigned a static color such as "white", or it may be attached to a texture shader. Even if the material shader was never programmed to accept textures, every aspect of its operation for which it has a parameter becomes texturable. In fact, it is possible to build whole multilevel graphs of shaders using parameter assignment, as sketched below.
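The mechanism that makes this work on the material side is parameter evaluation: calling mi_eval_color() on a parameter returns either the constant value given in the scene or, transparently, the result of whatever shader was assigned to that parameter. The following is a minimal Lambert-only material shader sketch, assuming the standard C shader interface; the shader and parameter names are hypothetical and light sampling is simplified.

```c
#include <shader.h>

struct simple_mtl_paras {
    miColor   ambient;
    miColor   diffuse;   /* constant color or attached texture shader */
    miInteger i_light;   /* offset into the light array */
    miInteger n_light;   /* number of lights */
    miTag     light[1];  /* light array, declared "array light" in the .mi file */
};

DLLEXPORT int simple_mtl_version(void) { return 1; }

DLLEXPORT miBoolean simple_mtl(
    miColor *result, miState *state, struct simple_mtl_paras *paras)
{
    miColor ambient = *mi_eval_color(&paras->ambient);
    miColor diffuse = *mi_eval_color(&paras->diffuse);   /* texturable */
    int     i_l     = *mi_eval_integer(&paras->i_light);
    int     n_l     = *mi_eval_integer(&paras->n_light);
    miTag  *light   = mi_eval_tag(paras->light) + i_l;
    int     i;

    *result = ambient;
    for (i = 0; i < n_l; i++, light++) {       /* simple Lambert shading */
        miColor  lcol;
        miVector dir;
        miScalar dot_nl;
        int      samples = 0;
        while (mi_sample_light(&lcol, &dir, &dot_nl, state, *light, &samples)) {
            if (dot_nl > 0) {
                result->r += diffuse.r * dot_nl * lcol.r;
                result->g += diffuse.g * dot_nl * lcol.g;
                result->b += diffuse.b * dot_nl * lcol.b;
            }
        }
    }
    result->a = 1.0;
    return miTRUE;
}
```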
Shaders that do all the work internally are called monolithic shaders, while shaders designed for easy graph building are called base shaders.
Determining the texture coordinates of a point on a surface to be shaded requires defining a mapping from points on the surface to points in texture space. Such a mapping is itself referred to as a texture space for the surface. Multiple texture spaces may be specified for a surface. If the geometry is a polygon or subdivision surface, a texture space is created by associating texture vertices with the geometric vertices. If the geometry is a free-form surface, a texture space is created by associating a texture surface with the surface. A texture surface is a free-form surface which defines the mapping from the natural surface parameter space to texture space. Texture maps, and therefore texture spaces and texture vertices, may be one, two, or three dimensional.
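In a C shader, the texture spaces defined for the intersected object are available as the state->tex_list array, one entry per texture space. The small sketch below, with a hypothetical "space" parameter, simply visualizes the coordinates of a chosen texture space; it assumes the index is valid for the object being shaded.

```c
#include <shader.h>

struct show_space_paras {
    miInteger space;   /* index of the texture space to visualize */
};

DLLEXPORT int show_space_version(void) { return 1; }

DLLEXPORT miBoolean show_space(
    miColor *result, miState *state, struct show_space_paras *paras)
{
    int      space = *mi_eval_integer(&paras->space);
    miVector uvw   = state->tex_list[space];   /* 1, 2 or 3 meaningful components */

    /* show the coordinates as a color, a common debugging aid */
    result->r = uvw.x;
    result->g = uvw.y;
    result->b = uvw.z;
    result->a = 1.0;
    return miTRUE;
}
```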
Pyramid textures are a variant of mip-map textures. When loading a texture that is flagged with the filter keyword, mental ray builds a hierarchy of texture images at different resolutions that allows elliptical filtering of texture samples. (The .map format may already contain this pyramid, which saves a lot of time and memory.) Without filtering, distant textures would be point-sampled at widely separated locations, missing the texture areas between the samples and causing texture aliasing. Texture filtering projects the screen pixel onto the texture, which yields an elliptical area on the texture. Pyramid textures allow this ellipse to be sampled very efficiently, taking every texture pixel inside the ellipse into account without actually sampling each one. Pyramid textures are not restricted to square or power-of-two resolutions, and work with any RGB or RGBA picture file format. The shader can either rely on mental ray's texture projection or specify its own, and filter blurriness can be adjusted per texture.
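Conceptually, a pyramid level is chosen so that its resolution roughly matches the projected pixel footprint, so that only a few samples cover the whole ellipse. The helper below is illustration only and not part of the mental ray API; it assumes level 0 is the full-resolution image and each level halves the resolution.

```c
#include <math.h>

/* Pick the pyramid level whose resolution best matches a screen pixel's
 * projected footprint, measured in texels of the full-resolution image. */
static int pyramid_level(double footprint_texels, int num_levels)
{
    int level = (footprint_texels <= 1.0)
                    ? 0                          /* magnification: full resolution */
                    : (int)floor(log2(footprint_texels));
    if (level > num_levels - 1)
        level = num_levels - 1;                  /* clamp to coarsest level */
    return level;
}
```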
A procedural texture is free to use the texture space in any way it wants, but texture files are always defined to have unit size and to be repeated through all of texture space. That is, the lower-left corner of the file maps to (0.0, 0.0) in texture space, and again to (1.0, 0.0), (2.0, 0.0), and so on; the lower-right corner maps to (1.0, 0.0), (2.0, 0.0), etc. and the upper right to (1.0, 1.0), (2.0, 2.0), etc.
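The repetition rule amounts to wrapping each coordinate back into the unit interval before the image is addressed; mental ray does this internally for file textures. A one-line sketch of the wrapping:

```c
#include <math.h>

/* Wrap a texture coordinate into [0,1): 2.3 maps to 0.3, -0.25 maps to 0.75. */
static double wrap_unit(double t)
{
    return t - floor(t);
}
```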
[3] The JPEG software is based in part on the work of the Independent JPEG Group.
[4] OpenEXR is available on Windows, Linux, MacOS X, and Irix platforms. Integration source code is available on request for custom integration.