OGL Texture

 
 
 


Category: RealTime > OpenGL

Shader Family: Realtime

Output: Realtime

The OGL Texture shader defines the image source, projection method, and other attributes of a single texture image. You can also use several OGL Texture nodes together for more complex multi-texturing effects; each texture is set on a specified target.

General

Uniform Name

If this OpenGL texture shader is connected to a programmable shader that declares sampler types (for example, sampler2D) as uniforms, enter the name of the uniform variable as you defined it in your shader code.
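For example, if your fragment program declares "uniform sampler2D diffuseMap;", then diffuseMap is the name to enter here. The following is a minimal C-side sketch of how such a sampler is bound to a texture unit (the program handle, texture ID, and the name diffuseMap are hypothetical; a current OpenGL 2.0 context and headers exposing its entry points are assumed):

    #include <GL/gl.h>
    #include <GL/glext.h>

    /* The shader declares:  uniform sampler2D diffuseMap;              */
    /* "diffuseMap" is then the value to enter in the Uniform Name box. */
    void bind_diffuse_map(GLuint program, GLuint textureId)
    {
        GLint loc = glGetUniformLocation(program, "diffuseMap");

        glActiveTexture(GL_TEXTURE0);            /* texture target 0 */
        glBindTexture(GL_TEXTURE_2D, textureId);

        glUseProgram(program);
        glUniform1i(loc, 0);                     /* sampler reads from unit 0 */
    }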

Target

Defines the texture stage for multi-texturing operations in a single draw pass.

Each OGL Texture node needs to be bound to a unique texture target. A texture target is a layer in which a texture is set. These layers are then modulated together programmatically via a fragment shader. The number of available texture targets depends on the hardware you are using.
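Conceptually, each target corresponds to an OpenGL texture unit. The following C sketch shows roughly what binding two targets looks like at the API level (the texture IDs are hypothetical; a current OpenGL 1.3+ context is assumed):

    #include <GL/gl.h>
    #include <GL/glext.h>

    /* One texture is bound per target (texture unit); a fragment shader
       or the fixed-function texture environment then combines them.    */
    void bind_texture_targets(GLuint baseTex, GLuint detailTex)
    {
        glActiveTexture(GL_TEXTURE0);            /* target 0 */
        glBindTexture(GL_TEXTURE_2D, baseTex);

        glActiveTexture(GL_TEXTURE1);            /* target 1 */
        glBindTexture(GL_TEXTURE_2D, detailTex);
    }

    /* The number of targets available to a fragment shader can be
       queried with glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, &n). */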

Image

Defines an image clip to use. Click Edit to open the Image Clip Property Editor, where you can modify the image clip currently in use. To retrieve a new clip, click New and choose whether to select a new clip from a file or create one from a source. For more information about working with images, see Managing Image Sources & Clips [Texturing].

Texture Space

Lets you choose a texture projection. If no texture projection has been defined, you can create one by clicking New. If a texture projection is already defined, you can edit it by clicking Edit.

When multiple objects with texture projections share the same material, the Texture Space widget includes an object selection list where you can specify the texture projection to be used for a specific object or all of a scene's objects.

Edit: Opens the Texture Projection Property Editor for the selected texture projection.

New: Specifies the new texture projection to be created. You can choose from the following:

  • UV: Projects the texture along the object's U and V coordinates.

  • Planar XY: Projects the texture on X and Y coordinates only.

  • Planar XZ: Projects the texture on X and Z coordinates only.

  • Planar YZ: Projects the texture on Y and Z coordinates only.

  • Cylindrical: Projects the texture as though it is a cylinder wrapped around an object.

  • Spherical: Surrounds the object with a spherical mapping across the whole surface with some distortion.

  • Spatial: The texture is centered on the scene's origin.

  • Cubic: Applies the texture onto the object by first assigning the object's polygons to faces of a cube, and then projecting the texture onto each face by default. The layout of a cubic projection can be completely customized using the options in the Texture Support property editor.

  • Camera Projection: The texture is mapped relative to the camera's center. In this coordinate system, the camera is the center of the "world," with an up vector, looking toward the negative Z axis. The result resembles an image projected from a camera. The texture will move with the camera and will be affected by any scaling, rotation, and translation.

  • Unique Uvs (Polymesh): Applies a texture to polygon objects by assigning each polygon's UV coordinates to its own distinct piece of the texture so that no two polygons' coordinates overlap each other.

  • Advanced: Opens the TextureWizard from which you can explicitly define a texture projection.

Border

Sets the color of the texture image's border.

The border is used for clamping purposes, as you may want to clamp the texture to a certain portion of the object without having its colors bleed onto the rest of the object. When you clamp the texture using its border, portions of the object not covered by the texture use the border color.
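In OpenGL terms, this corresponds to a clamp-to-border wrap mode combined with a border color, roughly as in the following C sketch (the texture ID and color are hypothetical; a current OpenGL 1.3+ context is assumed):

    #include <GL/gl.h>
    #include <GL/glext.h>

    void set_border_clamp(GLuint textureId)
    {
        const GLfloat border[4] = { 0.0f, 0.0f, 0.0f, 1.0f };  /* opaque black */

        glBindTexture(GL_TEXTURE_2D, textureId);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_BORDER);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_BORDER);
        glTexParameterfv(GL_TEXTURE_2D, GL_TEXTURE_BORDER_COLOR, border);
    }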

Modulation

Determines how the diffuse lighting attribute affects the texture color. The default texture environment is Modulate (GL_MODULATE), which multiplies the texture color by the primitive (or lighting) color.

The available texture modulation modes are:

  • Modulate: [ C = Cf * Ct, A = Af * At ]

  • Decal: [ C = Cf * (1 - At) + Ct * At, A = Af ]

  • Blend: [ C = Cf * (1 - Ct) + Cc * Ct, A = Af * At ]

  • Replace: [ C = Ct, A = At ]

  • Add: [ C = Cf + Ct, A = Af * At ]

(Where f = fragment, t = texture, and c = the constant color GL_TEXTURE_ENV_COLOR)
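These are the standard OpenGL fixed-function texture environment modes. The following is a minimal C sketch of selecting a mode and setting the constant color Cc used by Blend (the color value is illustrative; the fixed-function pipeline is assumed):

    #include <GL/gl.h>

    void set_blend_modulation(void)
    {
        const GLfloat envColor[4] = { 1.0f, 0.0f, 0.0f, 1.0f };  /* Cc in the Blend formula */

        glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_BLEND);
        glTexEnvfv(GL_TEXTURE_ENV, GL_TEXTURE_ENV_COLOR, envColor);
        /* GL_MODULATE, GL_DECAL, GL_REPLACE, and GL_ADD select the other modes. */
    }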

The order in which the object is shaded and modulated is as follows:

  1. The object is shaded either by a fixed-function shading node, vertex colors, or a programmable vertex shader.

  2. The texture set in target 0 is then modulated on top of the previous shading.

  3. The texture set in target 1 is then modulated on top of the texture set in target 0.

  4. This continues until you reach the maximum number of texture targets available on your hardware.
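In fixed-function terms, the texture environment is applied one unit at a time, so the result of each target becomes the input of the next. The following is a rough C sketch of two stacked targets (the texture IDs and chosen modes are hypothetical):

    #include <GL/gl.h>
    #include <GL/glext.h>

    void stack_two_targets(GLuint baseTex, GLuint lightMap)
    {
        /* Target 0: modulated on top of the underlying shading. */
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, baseTex);
        glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

        /* Target 1: combined with the result of target 0. */
        glActiveTexture(GL_TEXTURE1);
        glBindTexture(GL_TEXTURE_2D, lightMap);
        glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_ADD);
    }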

Compression

Toggles texture compression on and off.

If texture quality is less important (for example, the texture is far away or already contains a lot of noise), you can activate Compression to save texture memory. This converts the texture to S3TC format, which compresses it by roughly 4:1 in video memory.
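The following is a minimal sketch of what this amounts to at upload time, assuming the EXT_texture_compression_s3tc extension is available (the image data and dimensions are hypothetical):

    #include <GL/gl.h>
    #include <GL/glext.h>   /* for GL_COMPRESSED_RGBA_S3TC_DXT5_EXT */

    void upload_compressed(GLsizei width, GLsizei height, const void *rgbaPixels)
    {
        /* Requesting a compressed internal format makes the driver
           compress the RGBA data to S3TC (roughly 4:1) on upload.  */
        glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                     width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, rgbaPixels);
    }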

Format

Specifies the image format. Typically, RGB/RGBA is used, although you can specify DSDT (red/green) or DSDT (red/blue) where necessary. For example, you may need a texture converted to DSDT format in order to use bump reflection instructions on NV2x-based video cards.

Filter

Magnification Filter

Determines the type of optimization used to smooth textures that are close to the camera. Magnification Filter controls how textures are filtered when texels get magnified — that is, when their coverage exceeds the area of one pixel.

Minification Filter

Determines the type of optimization used to smooth textures that are far from the camera. Minification Filter controls how textures are filtered when texels get minified — that is, when their coverage is less than the area of one screen pixel.
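At the OpenGL level, the two settings map onto a texture's filter parameters, roughly as in this C sketch (the texture ID is hypothetical; the chosen filters are just one common combination):

    #include <GL/gl.h>

    void set_filters(GLuint textureId)
    {
        glBindTexture(GL_TEXTURE_2D, textureId);

        /* Magnification: a texel covers more than one screen pixel. */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

        /* Minification: a texel covers less than one screen pixel;
           trilinear filtering between mipmap levels.                 */
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    }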

Mipmap LOD Bias

When mipmapping is used by the minification filter, the LOD bias shifts the level of detail up or down to reduce blurring and/or artifacts on textures when they appear far from the camera.
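The following is a small C sketch of adjusting the bias, assuming OpenGL 1.4 / EXT_texture_lod_bias (the bias value is illustrative only; the call affects the active texture unit):

    #include <GL/gl.h>
    #include <GL/glext.h>   /* for GL_TEXTURE_FILTER_CONTROL and GL_TEXTURE_LOD_BIAS */

    void set_lod_bias(void)
    {
        /* Negative values select more detailed mipmap levels (sharper),
           positive values select coarser levels (blurrier).             */
        glTexEnvf(GL_TEXTURE_FILTER_CONTROL, GL_TEXTURE_LOD_BIAS, -0.5f);
    }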

Anisotropic Filtering Level

Anisotropic filtering allows you to increase the level of detail on textures that are far away from the camera without consuming large amounts of memory.

You can use anisotropic filtering to get a crisper minification filter. To do so, set the Minification Filter to Linear and increase the Anisotropic Filtering Level.
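The following is a minimal sketch of this combination, assuming the EXT_texture_filter_anisotropic extension (the texture ID and the level of 8.0 are illustrative):

    #include <GL/gl.h>
    #include <GL/glext.h>   /* for GL_TEXTURE_MAX_ANISOTROPY_EXT */

    void set_anisotropy(GLuint textureId)
    {
        GLfloat maxSupported = 1.0f;
        glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxSupported);

        glBindTexture(GL_TEXTURE_2D, textureId);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                        maxSupported < 8.0f ? maxSupported : 8.0f);
    }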

Render Tree Usage

This shader lets you add textures to your realtime shader tree. It can output to any other realtime shader. It receives an input from an image clip node, and can receive an input from any other realtime shader.