NVEffects Sample Descriptions

And Effects Browser Framework

 

Please send comments/questions/suggestions to

Christopher Maughan (cmaughan@nvidia.com)

 

Introduction

This document outlines the NVEffects vertex/pixel shading samples.  Some of the samples demonstrate simple techniques, such as transform, lighting, texture coordinate generation, and texture coordinate transformation.  Others are intended to show the vast possibilities that vertex and pixel shaders enable.  Some samples have been modified from existing SDK demos to show the advantages of a programmable vertex shader over traditional fixed-function T&L.  The samples are designed to be educational and to provide inspiration for uses of programmable shaders.  It is easy to extract sections of vertex or pixel shader code and combine them to develop new shaders.

 

Though some of the examples may run on GeForce 256 (or even a TNT), many require a GeForce3 card for full hardware acceleration.

 

Implementation

All examples are built on a framework that shows the current vertex and/or pixel shader in a pane on the left, along with the resulting demo in a pane on the right.  In the case of an OpenGL example, the vertex programs, texture shader programs, and/or register combiner programs are shown in the pane.  The various effects are selectable from the tree in the left-most window.  Some effects have custom menu options, accessed by right-clicking the 3D window or using the main menu bar.  The NVEffectsBrowser searches its own subdirectories for executable effect .dll files, and can also search additional directories specified by the user.  We often release new effects with source code.  For DX8 effects, you merely need to copy the new .dll and associated files anywhere into the Effects subdirectory.  For OpenGL effects, you must either point the browser at the directory holding the effects, or place them in the Effects\GLUT_Programs subdirectory.  Alternatively, if you have NVIDIA’s OpenGL SDK installed, the browser will automatically look for OpenGL effects there.

 

All samples are built on DirectX 8 or OpenGL, and make extensive use of the various features each API offers.  The DX8 effects use the D3DX helper APIs for clearer and more concise display of objects, while the OpenGL effects use NVIDIA’s own nvparse tool to make the code easier to understand and to customize.

 

Each DX8 demo is contained in a directory with its own source files.  Look in the Effects subdirectory to see all related source code files.  Code for all the OpenGL effects is included in NVIDIA’s OpenGL SDK.

 

Building the DX8 effects

The DX8 effects projects were created with Microsoft’s Visual Studio® (C++) 6.0.  Each effect may rely on shared assets or code included with the NVEffectsBrowser package.  Each effect project file (.dsp) should be two subdirectories down from the NVEffectsBrowser.dsp root location.  For example:

 

NVEffectsBrowser  <dir>
     NVEffectsBrowser.exe / .dsp
     Effects  <dir>
          Quad  <dir>
               Quad.dsp
               Quad.cpp … etc.
     Media  <dir>

 

Source files often have a directory path at the very beginning.  Use this as a guide for the correct relative path (the listed absolute path is irrelevant and will vary from project to project, but the paths should match starting from their first common directory on down).  Each effect compiles to a .dll, which must live in a directory somewhere beneath the NVEffectsBrowser.  You can also load the NVEffectsBrowser workspace file (.dsw) in the root directory and do a ‘batch’ build to compile all the shaders you have downloaded.  Of course, you must also build the effects browser itself in order to launch the effects.

Building the OpenGL effects

The OpenGL effects projects were also created with Microsoft’s Visual Studio® (C++) 6.0.  Each effect may rely on shared assets or code included with NVIDIA’s OpenGL SDK.  The projects must be compiled with the Browser configuration: the standard Release/Debug configurations create an .exe file, while the NVEffectsBrowser requires DLLs instead.  All the OpenGL effects are written with a subset of the standard GLUT interface, in combination with nvparse (the source code of which is available as part of the standard OpenGL SDK).

 


Table Of Contents

 

Introduction
Implementation
Building the DX8 effects
Building the OpenGL effects
Table Of Contents
1.  Spinning Quad and Spinning Quad With Specular
2.  DX7 Point Light
3.  DX7 Directional Light
4.  Many Point Lights
5.  Two-Sided Lighting
6.  Fish-Eye lens
7.  Sine Wave Perturbation
8.  Cube Mapping
9.  Mesh Blending
10.  Layered Fog
11.  Dot 3 Bump Directional Light
12.  Dot 3 Bump Point Light
13.  Motion Blur
14.  Particle System
15.  Blinn Bump Reflection
16.  Clip-Planes with Texkill
17.  Toon Shading
18.  Filter Blitting
19.  Two-Sided Texturing
20.  Depth of Field
21.  Rainbow Rendering
22.  Anisotropic Rendering
23.  Single Pass Reflection and Refraction
24.  Membrane lighting
25.  Perlin Noise Generation
26.  Keyframe Interpolation
27.  Matrix Palette Skinning w/ Dot3
28.  Minnaert Lighting
29.  NVLink Version 2.0
30.  Silhouette Rendering
31.  Simple Fog Example
32.  Game of Life
33.  Dynamic True-Reflective Bump Mapping
34.  Dynamic Dot3 Bump Mapping
35.  Lens Flare Texture Masking
36.  Mesh Skinning with Vertex Programs
37.  Per Pixel Bumped Reflected Skinned Mesh with Vertex Programs and Texture Shaders
38.  Register Combiner Bump Mapping setup with Vertex Programs
39.  Quaternion Product with Vertex Programs
40.  Per Pixel Lighting with cube map
41.  Per Pixel Distortion using Texture Shaders
42.  Depth Sprites
43.  Vertex Lighting using a Vertex State Program and a Vertex Program
44.  Sine Wave Perturbation
45.  Cull Fragment Simple
46.  Cull Fragment Complex
47.  Secondary Color
48.  Separate Specular
49.  Texture LOD Bias
50.  Simple P-Buffer
51.  Holy Grail Bump-Mapping
52.  Earth Quad
53.  Earth Sphere
54.  Earth Sphere with Vertex Program setup
55.  Earth Sphere with Vertex Program and Texture Shader
56.  Bump Reflection with Local Light Source
57.  Bumpy Shiny Patch
58.  Bumpy Refractive Patch
59.  Bumpy Shiny Patch w/ dynamic normal map
60.  Explosion Demo
61.  Mipmap Toon Rendering
62.  Dot Product Texture 2D Bump Mapping
63.  Simple Bump Reflection
64.  Tangent Space Bump Reflection
65.  Hardware Shadow Mapping
66.  Woo Shadow Mapping with Hardware Acceleration
67.  Depth Peeling
68.  Simple Vertex Array Range (VAR) example
69.  Simple Multi-texture-based Shadow Mapping
70.  Offset Bump Mapping
71.  Order-Independent Transparency
72.  Pbuffer to Texture Rectangle
73.  Per-Pixel Attenuation
74.  Simple Projective Texture Mapping
75.  Homogeneous Texture Coordinates
76.  Simple Rotation Example
77.  Simple Multitexturing Example
78.  Simple Shared PBuffer
Third Party Effects
1.  Bezier Patch
2.  Bump Refraction
3.  Vortex
4.  Fresnel Refract and Reflect
5.  Fur
6.  Grass
7.  Height Field
8.  Matrix TexGen
9.  Rain
10.  Bezier Spline
11.  Spyrorama
12.  Twister
13.  Flying Worlds
14.  Shinning Flare
15.  Pencil Sketch
16.  Cook-Torrance lighting
17.  Hair
18.  1D Particles
19.  Particle System
20.  Plasma
21.  Fur Effect with Dynamic Per Pixel Lighting
22.  Jellyfish
23.  Volume Lighting
24.  Volumetric Fog
25.  Non-Photorealistic Rendering


 

1.     Spinning Quad and Spinning Quad With Specular

This sample draws a single spinning quadrangle in the center of the viewport.  The shader program executes 7 instructions per vertex.  First, it transforms the input vertex by a model-view-projection matrix stored in the constant registers; this value is output as the vertex position.  Second, it sets the output color based on the dot product of the vertex normal with the light vector – one of the simplest lighting calculations there is.  To avoid transforming the normal into eye space, the light vector is supplied in model space.  Third and last, it copies the texture coordinates.
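
As a rough sketch of what such a 7-instruction shader looks like (the register layout below is an assumption for illustration, not necessarily the sample’s actual layout):

    vs.1.1
    ; assumed layout: v0 = position, v3 = normal, v7 = texcoord,
    ; c0-c3 = model-view-projection matrix, c4 = model-space light vector,
    ; c5 = material color
    dp4 oPos.x, v0, c0        ; transform the position by the MVP matrix
    dp4 oPos.y, v0, c1
    dp4 oPos.z, v0, c2
    dp4 oPos.w, v0, c3
    dp3 r0, v3, c4            ; N.L in model space - no normal transform needed
    mul oD0, r0, c5           ; modulate the material color by the lighting term
    mov oT0, v7               ; pass the texture coordinates through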

 

Demonstrates: Simple transformation and dot product lighting in model space

 

 

2.     DX7 Point Light

The shader lights in eye space; it also calculates a half-vector between the light and the eye, using a local viewer.  The shader uses the LIT instruction to correctly clamp the normal/light dot product and to calculate a power function from the half-vector/normal dot product created in the shader.  The shader also calculates constant, linear and quadratic attenuation, using the DST instruction to speed up the calculation.  The result is 3 different point lights with specular highlights and distance attenuation.  This light is implemented in a similar way to a DirectX 7 point light.
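
The DST/LIT idiom at the heart of this shader looks roughly like the following sketch (the register assignments are assumptions for illustration):

    ; assumed: r0 = eye-space normal, r1 = un-normalized vertex-to-light vector,
    ; r7 = half-vector (computed earlier), c10 = (constant, linear, quadratic, 0)
    ; attenuation terms, c11.w = specular power, c12/c13 = diffuse/specular colors
    dp3 r2, r1, r1            ; d*d
    rsq r3, r2.w              ; 1/d
    dst r4, r2, r3            ; r4 = (1, d, d*d, 1/d)
    dp3 r5, r4, c10           ; a0 + a1*d + a2*d*d
    rcp r5, r5.w              ; attenuation = 1 / (a0 + a1*d + a2*d*d)
    mul r1, r1, r3.w          ; normalize the light vector
    dp3 r6.x, r0, r1          ; N.L
    dp3 r6.y, r0, r7          ; N.H
    mov r6.zw, c11.w          ; specular power in w (z unused)
    lit r8, r6                ; r8.y = max(N.L, 0), r8.z = (N.H)^power, zeroed if N.L <= 0
    mul r9, r8.y, c12         ; diffuse contribution
    mad r9, r8.z, c13, r9     ; plus specular contribution
    mul oD0, r9, r5.w         ; apply the distance attenuation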

 

Demonstrates: Use of the ‘LIT’, ‘DST’ instructions, Specular/Diffuse lighting, half-vector calculation, local viewer lighting, and attenuation

 

 

3.     DX7 Directional Light

This shader implements a directional light in the same style as a DirectX 7 directional light.  The shader lights in eye space.  The viewer is assumed to be an infinite distance away, hence a constant half-vector is used.  The LIT instruction is again used to correctly clamp the normal/light dot product and calculate a power function from the half-vector/normal dot product computed in the shader.  The result is a directional light with a specular highlight, two of which are implemented in the shader.

 

Demonstrates: Use of the LIT instruction, Specular/Diffuse lighting, half-vector calculation, non-local viewer lighting.


 

4.     Many Point Lights

Because of the large amount of storage available in the constant memory, we can use it to enable more lights than were previously possible.  This example creates 17 point-light sources, using a very simple diffuse lighting model.  Each light is calculated using just 7 instructions.  A macro enables easy placement of each light calculation in the shader.  The resulting instruction count is 126 – very near the 128-instruction maximum of a DX8 vertex shader.
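
One such 7-instruction diffuse light might look like the following sketch (the register layout is assumed for illustration; the macro stamps this out once per light):

    ; assumed: r0 = world-space vertex, r1 = world-space normal,
    ; r5 = color accumulator, c0.x = 0, c8 = light position, c9 = light color
    add r2, c8, -r0           ; vector from the vertex to the light
    dp3 r3, r2, r2            ; squared distance
    rsq r3, r3.w              ; reciprocal distance
    mul r2, r2, r3.w          ; normalize the light vector
    dp3 r4, r1, r2            ; N.L
    max r4, r4, c0.x          ; clamp negative lighting to zero
    mad r5, r4.x, c9, r5      ; accumulate this light's diffuse contribution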

 

Demonstrates: Simple point lighting.  Use of the MAX instruction for clamping each light calculation


 

5.     Two-Sided Lighting

This sample uses non-indexed triangle lists.  Each vertex contains not only a vertex normal for correct lighting, but also a face-normal.  The face normal is checked against a vector to the eye.  The dot product of these two vectors indicates whether the face is oriented towards or away from the eye.  Based on this assessment the shader sets up the material color to be one of two options for front and back facing materials.  The example also does the lighting calculation in eye space and therefore transforms the normal vector to eye space and normalizes it before use.
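
The branchless color selection can be done with the SGE instruction, along these lines (a sketch with assumed registers):

    ; assumed: r0.x = face-normal . eye-vector, c0.x = 0,
    ; c20 = front material color, c21 = back material color
    sge r1, r0.x, c0.x        ; mask = 1 if the face points toward the eye, else 0
    mov r2, c20               ; front color
    add r2, r2, -c21          ; front - back
    mad r3, r1.x, r2, c21     ; back + mask * (front - back)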

 

Demonstrates: Conditional code without branching, Directional lighting in Eye Space, unusual usage of input streams.


6.     Fish-Eye lens

This example transforms a vertex into eye space and applies a distortion to it.  The distortion formula is loosely based on the formula found at “www-cad.eecs.berkley.edu/Respep/Research/interact/playpen/fish”.  A variation of this formula is used to calculate a vertex distortion in eye space, which is then applied to all the geometry in the scene.

 

The lower image shows the original scene; the upper image is the same scene with the distortion applied – giving an effective fish-eye lens effect.

Because the lighting of the terrain is slightly different from that of the trees, two different shaders are used, though both apply the same distortion calculation.

 

The menu option allows toggling wire-frame rendering.

 

Demonstrates: Irregular transform, use of MAX instruction for performing an ABS operation.
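
The MAX-as-ABS trick is a one-liner, since max(x, -x) = |x|:

    max r1, r0, -r0           ; absolute value of r0, computed per component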


7.     Sine Wave Perturbation

A flat grid of polygons is deformed by a sine wave, evaluated with a Taylor series whose coefficients are stored in the constant memory.  The shader calculates the first 4 terms of the series.  The distance from the center of the grid is used to generate a height, which is applied to the vertex coordinate.  A normal is also generated and used to compute a reflection vector from the eye, which is written to the texture coordinates to look into a cubic environment map.
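
The Taylor expansion sin(x) ≈ x - x^3/3! + x^5/5! - x^7/7! maps naturally onto MUL/MAD, roughly as follows (a sketch; the coefficient constant is an assumption):

    ; assumed: r0.x = angle, wrapped into [-pi, pi]; c20 = (1, -1/3!, 1/5!, -1/7!)
    mul r1.x, r0.x, r0.x      ; x^2
    mul r2.x, r1.x, r0.x      ; x^3
    mul r3.x, r2.x, r1.x      ; x^5
    mul r4.x, r3.x, r1.x      ; x^7
    mul r5.x, r0.x, c20.x     ; x
    mad r5.x, r2.x, c20.y, r5.x   ; - x^3/3!
    mad r5.x, r3.x, c20.z, r5.x   ; + x^5/5!
    mad r5.x, r4.x, c20.w, r5.x   ; - x^7/7!  ->  r5.x ~ sin(x)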

 

The input data for this shader is simply 2 floating-point values.  The shader generates all normal, texture and color information itself.

 

The menu option allows rendering in wire frame, and left-clicking allows rotating the mesh.

 

Demonstrates: Calculation of approximate sine/cosine values using a Taylor series.  Generation of normals, texture coordinate generation.

8.     Cube Mapping

Cube maps have various uses.  This shader demonstrates how to generate a texture coordinate for use with environment cube maps.  The input normal vector and the eye-space vertex are used to calculate a reflection vector.  This vector is loaded into the texture coordinates for use by the rasterizer in drawing the reflected object with a cube map.  As can be seen, the object accurately reflects its surroundings.

 

After calculating a vector to the eye and a normal, the shader does a simple reflection calculation (4 instructions) – note that this example gives true local-viewer reflections into the cube map.
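
The reflection calculation evaluates R = 2(N.E)N - E; in vertex shader assembly the four instructions look roughly like this (registers assumed for illustration):

    ; assumed: r0 = unit normal, r1 = unit vertex-to-eye vector
    dp3 r2, r0, r1            ; N.E
    add r2, r2, r2            ; 2 * (N.E)
    mul r3, r0, r2            ; 2 * (N.E) * N
    add oT0.xyz, r3, -r1      ; R = 2(N.E)N - E, written as the cube-map coordinate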

 

Demonstrates: Texture generation for cubic environment map reflections.

9.     Mesh Blending

The Mesh Blending shader implements blending/morphing between two input meshes.  The meshes and their normals are blended in model-space before being transformed and lit in eye-space.  In this case the weights are supplied in constant memory and change each frame to cause the dolphin to wave its tail.  There is nothing to stop the weights being passed into the shader on a per-vertex basis, if required.  A more advanced form of mesh blending could use several matrices and ‘pick’ the correct set from a list in constant memory, based on settings in the input stream (this technique is often referred to as ‘matrix palette skinning’).  This sample also generates procedural texture coordinates, based on the eye vertex position.  These are used to lookup the caustic texture applied to the top of the dolphin.
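
The core of the blend is a simple per-vertex lerp between the two input streams, along these lines (a sketch; the register names are assumed):

    ; assumed: v0/v1 = positions from the two input meshes, c40.x = blend weight
    mov r1, v1                ; second mesh position
    add r1, r1, -v0           ; delta between the two positions
    mad r0, c40.x, r1, v0     ; lerp: pos0 + weight * (pos1 - pos0)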

 

This shader also lights the dolphin from above using a directional light.  In this case, the alpha channel is set to contain the intensity of the light.  This enables the texturing pass to apply the caustic texture and the dolphin skin to the lit dolphin in a single dual-texture pass, using a pre-modulate texture blend mode and two texture stages.  The original DirectX SDK sample required two passes to draw the dolphin – here the vertex shader has been used to simplify the rest of the pipeline by a very simple modification to the lighting pipeline, previously impossible with fixed-function T&L.

 

Demonstrates: Mesh blending, multiple input coordinates, modified lighting calculation

10.    Layered Fog

 

This shader implements layered fog.  The concept of layered fog is quite simple; the fog intensity is a function of height above the ground as well as distance from the viewer.  The sample first generates a texture map containing a 2D lookup function.  The vertex shader then calculates, for each vertex, the distance from the viewer and the height above the ground, and uses these values to set up the texture coordinates.  The texture coordinates then index the fog map texture, and this is applied as a second texture pass to each of the polygons.
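
The texture coordinate setup amounts to two scaled values per vertex, something like this sketch (the register layout is assumed):

    ; assumed: r0 = eye-space position, v0.y = model-space height above ground,
    ; c30 = (1/fog distance range, 1/fog height range, 0, 0)
    mul oT1.x, r0.z, c30.x    ; u = normalized distance from the viewer
    mul oT1.y, v0.y, c30.y    ; v = normalized height above the ground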

 

Demonstrates: Texture coordinate generation, directional lighting

 

 


11.    Dot 3 Bump Directional Light

This shader implements the per-vertex setup required to rotate a light vector into texture space.  The setup code first sets up 3 basis vectors – S, T and SxT.  These only change when the mesh is modified.  During vertex shading the directional light vector is rotated into texture space and stored in the diffuse vertex color.  The matrix to do the rotation is calculated by transforming the basis vectors into world space and combining them to form a rotation matrix.
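
The rotation itself is three dot products, followed by a scale-and-bias to pack the vector into a color, roughly as in this sketch (registers assumed):

    ; assumed: r0/r1/r2 = world-space S, T and SxT basis vectors,
    ; c12 = world-space light direction, c20.x = 0.5
    dp3 r3.x, r0, c12         ; rotate the light vector into texture space
    dp3 r3.y, r1, c12
    dp3 r3.z, r2, c12
    mad oD0.xyz, r3, c20.x, c20.x   ; scale/bias from [-1,1] into the [0,1] color range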

 

Later, in the texture stages, the vertex color is combined with the normal map to create the bumpy surface.  A second texture stage then applies the base texture image to complete the effect.

 

The menu options allow the display of the bumpmap, the light vector that is calculated by the vertex shader, and the base texture.

 

Demonstrates: Dot3 bumpmapping setup in hardware.

 

 

12.    Dot 3 Bump Point Light

This shader computes a local, attenuated light-vector and transforms it with the basis vectors passed into the shader.  As with the directional dot3 light example (see above) the setup code first sets up 3 basis vectors – S, T and SxT.  These only change when the mesh is modified.  During vertex shading the local light vector is rotated into texture space and stored in the diffuse vertex color.  The matrix to do the rotation is calculated by transforming the basis vectors into world space and combining them to form a rotation matrix.

 

Later, in the texture stages, the vertex color is combined with the normal map to create the bumpy surface.  A second texture stage then applies the base texture image to complete the effect.

 

The menu options allow the display of the bumpmap, the base texture, the attenuation, the normalized texture-space light vector, the bumped light, the bumped light with the base texture, or all the effects combined.

 

Demonstrates: Dot3 bumpmapping setup for attenuated point lights in hardware.

 

13.    Motion Blur

This shader creates a smooth motion-blur effect in two passes.  The first pass draws the object normally.  The second pass transforms each vertex with both the current frame’s transform and the previous frame’s transform.  The difference between these two points is a per-vertex motion vector.  If the dot product of this motion vector with the vertex normal is positive, then the vertex faces into the direction of motion and we transform and light it normally.  If, however, this dot product is negative, then the vertex faces away from the direction of motion and we transform it with the previous frame’s transform.  We therefore stretch the object from its old position to its new position on the screen.  To enhance the motion-blur look, we also modulate the alpha component of the diffuse color of vertices facing away from the direction of motion by the length of the motion vector.

 

The menu options allow toggling the motion-blur effect, increasing object speed, pausing the demo, artificially lengthening (or shortening) the motion-blur trail, or rendering in wire frame.

 

Demonstrates: Conditional code without branching, unusual transform use.

 

14.    Particle System

This shader creates a particle system that executes exclusively on the GPU.  After creating particles in a vertex-buffer, the CPU never touches any vertex ever again! 

 

Each vertex corresponds to a single particle that is displayed as a point-sprite.  It stores various particle parameters in its per-vertex data, such as when the particle is first born, its lifetime, its air-resistance, its initial velocity, one half of a random-number seed, and its initial color.  The vertex shader evaluates a closed-form function that computes a particle’s current position based on the current time (passed in through constant memory), initial particle velocity, particle air-resistance, etc.  A particle’s initial direction is randomized through a combination of its random-number seed and a random-number seed from constant memory.  Color and fog values are modulated based on the age of a particle.  The sprite-point size depends on the perspective camera distance.
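
Ignoring air-resistance and randomization, the closed-form position evaluation boils down to a few MADs, as in this simplified sketch (the register layout is assumed):

    ; assumed: v0 = initial position, v1 = initial velocity, v2.x = birth time,
    ; c0.x = current time, c1 = gravity vector, pre-scaled by 0.5
    add r0.x, c0.x, -v2.x     ; particle age = current time - birth time
    mul r1.x, r0.x, r0.x      ; age squared
    mov r3, v0                ; initial position
    mad r2, v1, r0.x, r3      ; p0 + velocity * age
    mad r2, c1, r1.x, r2      ; + (g/2) * age^2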

 

The menu options allow freezing time and limiting the number of drawn particles to ten, as well as rendering in wire frame.  Left-clicking and moving the mouse changes the camera orientation and the arrow-keys move the camera around.

 

Demonstrates: Conditional code without branching, use of the oFog parameter, use of sprite-point size, use of the address-indexing register a0, calculation of approximate sine values using a Taylor series, and generation of position data from extremely unusual per-vertex data.

 

15.    Blinn Bump Reflection

This shader implements reflective Blinn bump mapping.  At set-up time it computes and stores per-vertex S and T basis vectors.  Rendering then commences in two passes: in the first pass the vertex shader computes the S and T cross product and creates a rotation matrix from the S, T, and SxT vectors.  This rotation matrix is applied in the pixel shader to the per-pixel normal read from a bump map.  The eye vector is then reflected about the rotated per-pixel normal and used to look up the reflected color from an environmental cube map.  The second pass modulates the reflected color with a gloss map.  The example optionally uses the HILO texture format.

 

The menu options allow switching between the HILO and 8-bit signed texture formats, and showing the effect on either an animated Bezier patch or a sphere.  Additionally, you can change the height of the bumps and start and stop the animation.  The mouse can be used to rotate the object.

 

Demonstrates: Blinn Bump reflection setup in hardware, vertex shader setup for pixel shaders.


16.    Clip-Planes with Texkill

This shader implements clipping planes using texkill.  The vertex shader first generates 2 clip planes and stores these in 2 columns of a 4x4 matrix (up to 4 clip planes can be used per texture coordinate).  It then transforms the incoming vertex coordinates by the clip-plane matrix.  This effectively generates x and y values which will be less than 0 if the position is outside the corresponding clip plane.  The resulting x,y,z,w value is loaded into a texture coordinate and iterated across the polygon.  Later, the pixel shader discards any pixels that lie on the wrong side of a plane.
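
Stripped to its essentials, the idea looks like this sketch (the plane constants and register layout are assumed):

    ; vertex shader fragment: write signed plane distances into a texture coordinate
    ; (c20/c21 = clip-plane equations, c0.x = 1)
    dp4 oT0.x, v0, c20        ; signed distance to clip plane A
    dp4 oT0.y, v0, c21        ; signed distance to clip plane B
    mov oT0.zw, c0.x          ; keep z and w positive so they never cause a kill

    ; pixel shader
    ps.1.1
    texkill t0                ; discard the pixel if t0.x, t0.y or t0.z is negative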

 

Demonstrates: Texkill, special texture coordinate generation, per-pixel clipping

 


 

17.    Toon Shading

 

This vertex shader computes L dot N and E dot N per vertex and uses the resulting values to index into two 1D textures.  The L dot N computation is used to generate cartoon-like three-tone shading (shadowed, lit, highlighted), and the E dot N is used to darken silhouettes.

 

Just using E dot N for silhouettes can result in over-darkening of flat regions, so this demo varies the texture indexed by E dot N over its mip levels, resulting in darkening only for the edge-on pixels which access the higher mip levels.
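
The texture coordinate generation is just two dot products per vertex, one per 1D texture, as in this sketch (registers assumed):

    ; assumed: r0 = unit normal, r1 = unit light vector, r2 = unit eye vector
    dp3 oT0.x, r0, r1         ; L.N indexes the three-tone 1D shade texture
    dp3 oT1.x, r0, r2         ; E.N indexes the 1D silhouette-darkening texture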

 

Demonstrates: Custom texture coordinate generation

18.    Filter Blitting

 

This shader shows how to filter-blit a texture in DirectX 8.  The texture to filter is set as the active texture for all four texture-units.  A vertex shader shifts the texture coordinates of a simple screen-spanning quad to achieve the desired (bi-linearly interpolated) texel sampling-points, before a pixel shader finally combines these four texel-samples.  The screen shows the original texture in the upper-left corner, and applying the filter once, twice, and three times in the upper-right corner, lower-right corner, and lower-left corner, respectively.  The luminance-based edge detection is different in that it shows partial results: upper-right corner shows the luminance-converted texture, lower-left corner shows the result of the edge-detection and the lower-right corner shows the final result of 2xModulating edges and luminance.

 

In addition to the luminance-based edge-detection, the following filters are implemented; each filter is a single-pass operation:

[Figure: filter kernels for the 9-sample cone filter, the 9-sample box filter, the sharpening filter (whose kernel may be arbitrarily rotated), and the 16-sample box filter.]


The menu options allow switching between the various filter-types and rendering in wire-frame.

 

Demonstrates: Render to texture; filter-blitting textures using the GPU; how to implement cone-, box-, and sharpening-filters using the texture units’ bilinear filtering and pixel shaders.

 

19.    Two-Sided Texturing

 

This shader demonstrates correct lighting of a single-surface object viewed from the front or back, as well as a technique for applying different textures to the front and back faces of an object.  If the single-sided object were lit by the standard TnL pipeline, incorrect lighting would result as the strip twists to expose its back side.  Only one normal can be specified per vertex, and the back sides require a different normal than the front – one which points in the opposite direction.  As the single specified normal twists away from the viewer, lighting would go to zero, which is not correct for the geometry of the single-sided object.  A vertex shader program is able to correct the back-facing normals and light the twisted strip correctly.  The vertex shader also writes an alpha value to the vertex color based on whether the light is in front of or behind the object relative to the camera, so the shading can mimic a thin transparent object with a different appearance (a different texture) for reflected light versus transmitted light.

 

 

Demonstrates:  Vertex shader for single-sided objects;  Vertex shader virtual branching; Multitexturing for more interesting materials;  Pixel shader

 

 

20.    Depth of Field

 

This demo allows control over several camera parameters (camera position, camera orientation, focal length, focal distance, and f-stop) and renders the scene according to those parameters.  In particular, it approximates depth-of-field, i.e., it renders objects that are in focus sharply, and objects that are out-of-focus blurred, yet runs in real-time.

The effect renders the scene only once.  During this rendering vertex-shaders compute the per-vertex distance to the camera.  This distance gets interpolated per-pixel and an associated pixel-shader transforms this distance into a “blurriness-interpolator” that is stored in the frame-buffer’s alpha-component.  The frame-buffer is then filter-blitted multiple times to generate blurred versions.  Finally, based on the “blurriness-interpolator” stored in the alpha-component we blend between the original in-focus frame-buffer and its various blurred versions to derive the final result.  This technique is very similar to the algorithm described by M. Potmesil and I. Chakravarty in “A lens and aperture camera model for synthetic image generation,” Computer Graphics (Proceedings of SIGGRAPH 81), 15 (3), pp. 297-305 (August 1981, Dallas, Texas).

The “blurriness-interpolator” is looked up in a texture.  This texture is currently parameterized by distance to camera and focus distance.  Thus, when the f-stop changes, this texture is regenerated.  If f-stops change frequently, one could re-parameterize the texture accordingly.  Alternatively, one can operate on a 1D texture (parameterized by distance to camera only) and regenerate it every time any of the other parameters change – since this 1D texture would be of size ~1kB, performance should remain real-time.

This demo was inspired by an mpeg-trailer for a car-racing game.  The trailer used in-game video-footage that was post-processed to add depth-of-field to make the visuals more dramatic.  The technique shown here runs in real-time and applies to all racing-games that have a replay-option once the race is over.  The many cameras used in replay-mode would become more dramatic if depth-of-field effects were applied to them.

 

The menu options allow rendering in wire-frame, to view the per-pixel distance-to-camera, or to visualize the amount of blurriness.  Left-click-hold and moving the mouse controls the camera’s view-direction, and the arrow-keys move the camera forward, backward, left, and right.  The keys I and O zoom in and out (i.e., change focal-length); U and P change the focus-distance, and K and L change the f-stop.

 

Demonstrates: Render to texture; blurring an image by repeated filter-blitting on the GPU; generating a depth-of-field effect.

 


21.    Rainbow Rendering

 

This demo uses some ideas from Toon Shading (see above) to create texture coordinates for each vertex using a vertex shader, based on the half-angle vector, the light and eye vectors, and the surface normal.  Two samples are grabbed from the same texture and combined to create a rainbow iridescent effect.

 

Demonstrates: Vertex Shader, Multi-texturing

 

22.    Anisotropic Rendering

 

Similar to the Toon Shading and Rainbow Rendering examples, this example uses a vertex shader to find edges and map them into a texture.  It uses ideas from anisotropic lighting to calculate H dot N for one axis of a texture and L dot N for the other axis.  The texture contains a bright diagonal area where H dot N and L dot N are equal, to simulate a velvet effect.

 

Demonstrates: Vertex Shader, Multi-texturing, Anisotropic Lighting

 

 

 

23.    Single Pass Reflection and Refraction

 

This shader demonstrates how to generate two sets of texture coordinates for use with environment cube maps in order to produce a combined reflection and refraction effect in a single pass.  The input normal vector and the input vertex are first moved into world space, with the normal being normalized to account for world-transform scaling.  The world-space eye position (passed as a constant) is used to calculate a world-space eye-to-vertex vector, which is then used to calculate the reflection and refraction vectors.  The refraction effect is approximated by shortening the normal and passing it through the standard reflection calculation.  These vectors are loaded into the texture coordinates for use by the rasterizer in drawing the reflected/refracted object with a cube map.  The blend between the reflection and refraction is controlled by the D3DRS_TEXTUREFACTOR renderstate and the D3DTOP_BLENDFACTORALPHA texture blend op.  As can be seen, the object reflects and refracts its surroundings convincingly.

 

Demonstrates: Texture generation for cubic environment map reflection and refraction

 

 

 

24.    Membrane lighting

 

A thin, partially scattering, transmissive membrane is more visible when held at an angle than when viewed flat-on.  Plastic films, jellyfish, and other transparent marine life demonstrate this effect.  For this example, we use a vertex program and a 1D texture to vary the light contribution from a surface as its angle to the viewer becomes more or less steep.  This method of rendering produces some interesting effects and is also useful in CAD or data visualization.  Try loading various .x file models and your own custom textures.

 

Demonstrates: Vertex shader; Advanced transparency; Visualization technique

 

 

25.    Perlin Noise Generation

 

This shader generates a vertex offset for each coordinate based on a noise calculation.  The noise is calculated from a permutation table stored in the constant memory.  The style of noise is similar to Perlin noise.  The resulting offset is used to distort the patch and set up the color of the vertices.  Pressing the PAGEUP/PAGEDOWN and UPARROW/DOWNARROW keys modifies the frequency and amplitude.

 

Demonstrates: Generating noise in a vertex shader

 


 

 

26.    Keyframe Interpolation

 

These two examples show two methods for doing keyframe interpolation in a vertex shader.  The first simply does linear interpolation of the vertices, similar to the dolphin demo.  The second blends between four keyframes using a Hermite-spline blending function (a Catmull-Rom spline) to achieve a smoother animation effect.
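
Once the four spline weights have been evaluated (or precomputed on the CPU and passed in a constant), the blend itself is a weighted sum over the four keyframe streams, as in this sketch (the register layout is assumed):

    ; assumed: v0-v3 = the same vertex in four keyframes, c40 = spline weights
    mul r0, v0, c40.x
    mad r0, v1, c40.y, r0
    mad r0, v2, c40.z, r0
    mad r0, v3, c40.w, r0     ; blended position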

 

Demonstrates : Vertex blending, multiple vertex streams

 

 

27.    Matrix Palette Skinning w/ Dot3

 

This shader performs matrix palette skinning with dot3 setup.  Each vertex has two indices which reference the two bones in constant memory which influence this particular vertex, along with the weights of influence.  We calculate the vertex position in world space as influenced by each bone (so, two separate positions are generated per vertex) and then we blend between these two positions according to the weights to generate the final vertex position in world space. 
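
Bone selection uses the address register to index the palette of matrices in constant memory, roughly as follows (a sketch with assumed registers; in the demo this is repeated for the second bone and the two results blended by the weights):

    ; assumed: v0 = position, v1.x = first bone index, v2.x = first bone weight
    mov a0.x, v1.x            ; select the bone matrix in constant memory
    dp4 r0.x, v0, c[a0.x + 0] ; transform the position by the indexed bone matrix
    dp4 r0.y, v0, c[a0.x + 1]
    dp4 r0.z, v0, c[a0.x + 2]
    dp4 r0.w, v0, c[a0.x + 3]
    mul r2, r0, v2.x          ; weight this bone's contribution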

 

A per-vertex local texture-space basis is also passed into the shader as texture coordinates, and each of the three basis vectors is skinned and blended just as the vertices are.  We then transform the light vector (passed in through the constant memory) into local texture space using this skinned basis, and output the local light vector through the diffuse color.  The texture stage states then calculate NormalMap dot LightVector, modulated with the DecalTexture.

 

Demonstrates: Dot3 setup, indexed matrix palette skinning

 

 

28.    Minnaert Lighting

 

Minnaert lighting is a BRDF which models subtle darkening effects, such as those in velvet.  This demo uses the vertex shader to place the eye vector (E), the normal (N), and the light vector (L) in iterated texture coordinates, and uses the texm3x2pad and texm3x2tex pixel shader instructions to index into a texture map containing the Minnaert lighting values.
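
The pixel shader side of this lookup is the standard texm3x2 idiom, roughly as in this sketch (the stage assignments are assumed):

    ps.1.1
    ; assumed stages: t0 = normal map; t1/t2 = iterated vectors written by the
    ; vertex shader, with the 2D Minnaert lookup texture bound at stage 2
    tex t0                    ; sample the per-pixel normal
    texm3x2pad t1, t0_bx2     ; u = first row . N
    texm3x2tex t2, t0_bx2     ; v = second row . N, then sample the map at (u, v)
    mov r0, t2                ; output the looked-up lighting value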

 

Demonstrates: texm3x2 pixel shader instructions, normalization cube maps


 

29.    NVLink Version 2.0

 

NVLink is a tool that enables developers to create complex vertex shaders from shader ‘fragments’.  First, the developer creates small pieces of vertex shader code to solve common problems, such as “create_eyespace_eyevector”.  These files of fragments are assembled by NVASM to form fragment files, which are loaded by the linker.  During game execution an application can then request combinations of fragments, which are returned as D3D shader op-codes.  In this way, a developer can create many different shaders from a set of smaller fragments.

 

The demo, under “Tools\NVLink Vertex Shader Combiner”, uses NVLink to create shaders based on user input.  The dialog box enables the user to set the number of lights (point and directional), where the lighting calculations are done (eye space/world space), and whether the lights are specular or not.  Additional options are available for texture coordinate generation for traditional cube mapping and for the more complex Blinn bump mapping setup (requiring pixel shader support), as well as a fogging option.  The demo requires a minimum of a GeForce-class card to run in hardware.  It will also run on the Reference Rasterizer, but very slowly, and with reduced functionality on other hardware.  When a shader is created, it is shown in the output pane, so you can see the shader that was generated for the current scene, and the number of registers, instructions and constants that it took.  The default setup for the demo shows a jester with skinning, per-pixel reflections and vertex lighting.

 

Version 2 of the linker adds skinning support and caching of recently created shaders.  A basic optimization path has also been added which creates more efficient shaders by removing redundant instructions.

 

30.    Silhouette Rendering

 

This demo shows a two-pass technique for rendering silhouettes around objects.  The basic technique involves rendering the object once with depth-buffer writes disabled, using a vertex shader that extrudes each vertex along the vertex normal.  Then, we render the object again, this time normally.  This causes only the outline of the first rendering to be visible, resulting in an accented silhouette.

 

The halo effect is achieved by changing the alpha value dependent on E dot N, so vertices which are nearly edge-on are blended more.

The toon effect is done by simply doing the first rendering using a constant color.

 


 

31.    Simple Fog Example

 

This example renders a fogged landscape with two types of fog.  It shows the interaction of fog calculations with pixel and vertex shader programs.

 

The first fog factor is a traditional screen-depth-based fog, calculated from the Z component of the vertex after the model-view-projection transform.  This value is written to the shader’s oFog.x output, which is used in a blending stage after the pixel shader program.

 

The second fog term is based on the model space landscape height and appears as the white fog in the valleys.  You can change the scaling of this second term with the “<” and “>” keys.  The value is passed into the pixel shader program, where it is added to the base texture color.

 

The landscape height is generated as the sum of three blurred and scaled random noise fields.  The blurring produces features of varying roughness, and the more blurred components are added in with a greater scale factor.

 

Demonstrates:  Fog with vertex and pixel shaders, landscape generation

 


 

32.    Game of Life

 

This effect demonstrates Conway’s Game of Life running in pixel and vertex shader programs on the GeForce3.  This game is one of a class of cellular automata (CA) programs which animate pixels (cells) based on their local environment within the image.  The technique is useful for generating procedural textures and texture animations which require very little storage space or video memory.  Some of the impressive effects in Unreal are generated by similar algorithms, and we plan to develop other CA examples for water, fire, and smoke effects.

 

The patterns can tile automatically (hit ‘B’, then ‘3’ to see this in the demo) by enabling texture wrapping.

 

For this example, producing the next generation requires three render-to-texture passes, and involves a dependent green-blue texture lookup (D3D’s texreg2gb instruction) in the final pass.  An 8x2 texture encodes the rules of the game.  You can modify this Rules.bmp to be any power-of-two size, with whatever colors you like.  A few rules maps are provided.

 

Demonstrates:  Cellular automata, render to texture, texture animation


 

33.    Dynamic True-Reflective Bump Mapping

 

This example uses the pixel processing of the GeForce3 to run a height-based water simulation on the graphics processor.  The physical state of the water is stored in three textures:  one for height, one for velocity, and one for force.  Each texel represents the state of one point on the height-based water grid.  These textures are alternately used as source textures or render targets in rendering, so that the previous frame's textures generate those for the next frame.  In this way, six textures in video memory generate an endless non-repeating water animation.

 

The water animation involves grayscale values.  A separate DX8 pixel shader calculates an RGB normal map from the grayscale height field in a single render-to-texture pass.  This RGB normal map is used with the GeForce3’s true-reflective bump mapping to produce per-pixel environment reflections off of the rippling surface.

 

See the README.txt included with the demo and our developer website for more information.

 

Demonstrates:  Cellular Automata, Dynamic Normal Maps, GPU Physics

 


 

34.    Dynamic Dot3 Bump Mapping

 

This effect features a water simulation almost identical to that of our “Dynamic True-Reflective Bump Mapping” example.  In this case, one of the four nearest neighbor sample points of the physics simulation is weighted more heavily than the others, and traveling waves are the result.  The shape of these waves can be altered by several keyboard controls.  Hit ‘H’ or ‘F1’ to list these controls.

 

The resultant normal map (the blue map shown behind the sphere) is used in per-pixel DOTPRODUCT3 lighting to render the sphere.

 

See the demo’s README.txt and our developer website for more information.

 

Demonstrates:  Cellular Automata, Dynamic Normal Maps, GPU Physics

 

35.    Lens Flare Texture Masking

 

This demo shows the benefits to be had from the texture masking technique.  Briefly, the demo uses the graphics hardware to generate a texture map representing the intensity of a lens flare.  This texture map is then modulated with the lens flare to attenuate it depending upon how occluded the sun is in the world.

 

The demo lets you switch between texture masking and locking the frame buffer – the more traditional way of doing this effect.  Note that there is a substantial win from using the texture masking technique, and the win grows with the size of the frame buffer.  Note also that the lock method and the texture masking technique do not necessarily compute exactly the same occlusion, so the intensity of the flare may vary from one technique to the other.  They could be brought more closely together with effort, but they are fundamentally different techniques.

 

See the README in the texture masking effect subdirectory for more information.

 

 

Demonstrates: Texture Masking technique using render to texture

 

 

 

36.    Mesh Skinning with Vertex Programs

 

This demo illustrates different methods of achieving mesh skinning on a programmable GPU.  It explains how you can optimize the mesh with regard to the vertex programs in order to maximize performance.  The two different skinning methods utilized in this demo illustrate a trade-off between CPU usage and bus bandwidth usage.  Furthermore, it demonstrates that display lists can be used in conjunction with vertex programs to improve performance.

 

 

Demonstrates: Mesh Skinning done on the GPU, Skinning performance

 

 

37.    Per Pixel Bumped Reflected Skinned Mesh with Vertex Programs and Texture Shaders

 

This demo illustrates how to achieve a per-pixel bumped, reflected, skinned mesh on a programmable GPU.  It explains how you can optimize the mesh with regard to the vertex programs in order to maximize performance.  It also shows how to compute, in the vertex program, the skinned texel matrix needed to appropriately set up the texture shader program used to achieve the per-pixel bumped reflection with the environment cube map.

 

 

Demonstrates: Per Pixel Bumped Reflected Mesh Skinning done on the GPU, Skinned Texel Matrix

 

 

 

38.    Register Combiner Bump Mapping setup with Vertex Programs

This demo illustrates how to take advantage of vertex programs to compute the transformations needed for register combiner based bump mapping.  The direct benefit of this setup is to relieve the CPU of those extra computations, improving the overall performance of the application.

 

 

Demonstrates: Register Combiner Bump Mapping with Vertex Programs, 100% GPU accelerated bump mapping

 

 

 

 

39.    Quaternion Product with Vertex Programs

This simple demo illustrates how one can use vertex programs to transform a model with a quaternion.  Instead of using a classic 3x3 transformation matrix to encode the rotation, we pass only a quaternion and process each vertex with the quaternion product.
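
The rotation of a vertex v by a unit quaternion q can be written as v' = v + q.w*t + q.xyz × t, with t = 2*(q.xyz × v).  Expressed in the DX8-style vertex shader assembly used elsewhere in this document (the OpenGL vertex program form is analogous; registers are assumed):

    ; assumed: v0 = model-space position, c20 = quaternion (x, y, z, w)
    mul r0, c20.yzxw, v0.zxyw
    mad r0, -c20.zxyw, v0.yzxw, r0   ; cross(q.xyz, v)
    add r0, r0, r0                   ; t = 2 * cross(q.xyz, v)
    mul r1, c20.yzxw, r0.zxyw
    mad r1, -c20.zxyw, r0.yzxw, r1   ; cross(q.xyz, t)
    mad r2, r0, c20.w, v0            ; v + q.w * t
    add r2, r2, r1                   ; rotated position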

 

 

Demonstrates: Mesh transformation with quaternions

 

 

 

40.    Per Pixel Lighting with cube map

This demo illustrates how one can make use of cube map functionality to achieve Phong or Blinn per-pixel lighting.  By precomputing the complete lighting solution of the scene into the cube map, one can set up the GPU texture coordinate generation to look up the lighting solution directly in the cube map.  The direct benefits of this technique are its visual quality over vertex lighting, and its rendering performance, since it uses only one texture map for any number of lights in the scene.

 

 

Demonstrates: Per Pixel lighting using cube maps

 

 

 

 

41.    Per Pixel Distortion using Texture Shaders

This demo illustrates how texture shaders can be used to achieve per-pixel distortion of a texture.  By rendering the teapot into a texture and using a sequence of animated noise textures, this example simulates noisy distortion on a per-pixel basis.  This effect could be used to fake bad signal reception.

 

 

Demonstrates: Texture Shader Offset 2D, Per Pixel Distortion of texture maps.

 

 


 

42.    Depth Sprites

This demo illustrates how texture shaders can be used to create sprites with real depth data.  The depth sprites differ from normal sprites in that they can realistically intersect in 3D with other depth sprites or standard 3D objects.  The register combiners are used to normalize the light vector, calculate its reflection off the normal-mapped surface, and evaluate both the diffuse and specular lighting equations.

 

 

Demonstrates: Texture Shader Dot Product Depth Replace, Sprite Drawing using Vertex Programs, Per Pixel Lighting using Register Combiners.

 

 

 

43.    Vertex Lighting using a Vertex State Program and a Vertex Program

This simple demo illustrates how to use a vertex state program and a vertex program to perform Phong lighting.   The vertex state program is used to transform a light vector and a view vector from camera space into world space.  It also computes a world space halfangle vector.  The vertex program computes specular and diffuse contributions of the Phong lighting model and transforms the vertices to screen space as well.  In addition, a texture is applied and the register combiners are used to control the texture blending.

 

 

Demonstrates: Transformation and Lighting using a Vertex Program, Using a Vertex State Program, Simple Register Combiner Blending.

 

 

44.    Sine Wave Perturbation

This demo illustrates using a vertex program for custom vertex perturbation and texture coordinate generation.  A flat grid of polygons is deformed by a sine wave, sampled from the constant memory using a Taylor series.  The vertex program calculates the first 4 values in the series.  The distance from the center of the grid is used to generate a height, which is applied to the vertex coordinate.  A normal is generated and used to generate a reflection vector from the eye that is used to access a cubic environment map.

 

The input data for the vertex program is simply 2 floating-point values representing (u,v) grid position.  The vertex program generates all normal and texture coordinate information itself.

 

 

Demonstrates: Calculation of approximate sine/cosine values using a Taylor series.  Generation of normals, texture coordinate generation

 

 

45.    Cull Fragment Simple

This demo illustrates using the cull fragment texture shader.  Automatic object space texture coordinate generation is used to provide texture coordinates representing relative distance to clipping planes.  Based upon distance to the two planes, a fragment is rejected  or accepted for further processing.  This is similar to an alpha test but instead of using alpha, the (s,t,r,q) texture coordinates determine if a fragment should be eliminated from further processing.  The texture shader program used is very simple.

 

 

Demonstrates: Texture Shader Cull Fragment Operation.

 

46.    Cull Fragment Complex

This demo illustrates using the cull fragment texture shader in conjunction with a vertex program to perform complex cull fragment operations.  In this example a vertex program is used to compute custom texture coordinates representing distance from a point (or minimum distance from a set of points).  These texture coordinates are then used to reject or accept a fragment during rasterization.  The vertex program is also used to compute a simple diffuse lighting term.

 

 

Demonstrates: Complex Texture Shader Cull Fragment Usage.

 

47.    Secondary Color

This simple demo illustrates using the secondary color extension to specify both a primary (diffuse) color and a secondary color on a per-vertex basis.

 

Demonstrates: Using Secondary Color to specify a second per-vertex interpolated color.

 

 

 

48.    Separate Specular

This simple demo illustrates using the separate specular extension for controlling texture blending.  Separate specular offers a small subset of the texture blending flexibility of the register combiners, but it is useful for simple blending operations and works on a broad range of hardware (the GeForce family as well as TNT/TNT2).

 

Demonstrates: Using Separate Specular to specify a post-texture modulate specular sum operation.

 

 

49.    Texture LOD Bias

This simple demo illustrates using texture LOD bias for biasing the mip-map level selected during texture fetches.

 

Demonstrates: Using Texture LOD Bias, Anisotropic Filtering.


50.    Simple P-Buffer

This demo illustrates how to use a P-Buffer for offscreen rendering. 

 

Demonstrates: Using P-Buffers for Off-Screen Rendering.


51.    Holy Grail Bump-Mapping

This demo illustrates one way to perform specular and diffuse bump-mapping using the Phong lighting model and 16-bit normal maps.  A vertex program is used to perform the per-pixel lighting setup, while the texture shaders are used to compute the specular and diffuse lighting coefficients.  The register combiner programs are used to control the blending and final visual appearance of the Grail.

 

Demonstrates: Per-Pixel Lighting using the Texture Shaders and Register Combiners, Bump-Mapping setup in a vertex program.


 

 

52.    Earth Quad

This demo illustrates simple per-pixel lighting using an object-space formulation.  The bump mapping is done on a single quad, and the equation is evaluated in a single pass within the register combiners.

 

Demonstrates: Object-space Per-Pixel Lighting using Register Combiners.


 

53.    Earth Sphere

This demo extends the “Earth Quad” example by performing tangent-space bump mapping on a sphere.  The evaluation of the illumination equation still occurs in the register combiners.  This demo also illustrates a technique for normalizing denormalized vectors within the register combiners.  This is useful because it does not require using a normalization cube map.

 

Demonstrates: Tangent-space Per-Pixel Lighting using Register Combiners.


 

54.    Earth Sphere with Vertex Program setup

This demo is functionally identical to the “Earth Sphere” demo.  The implementation of the bump mapping setup, however, occurs in a vertex program, which runs in hardware on GeForce3.

 

Demonstrates: Tangent-space Per-Pixel Lighting using Register Combiners and a vertex program for lighting setup.


 

55.    Earth Sphere with Vertex Program and Texture Shader

This demo is a variation on the other “Earth Sphere” demos.  It uses the dot product texture shaders to evaluate the illumination equation.  The advantage of this technique is that much shinier surfaces can be rendered.  One disadvantage is that it requires more surface tessellation to avoid denormalization artifacts.

 

Demonstrates: Tangent-space Per-Pixel Lighting using vertex programs and texture shaders.


 

56.    Bump Reflection with Local Light Source

One of the significant features of GeForce3 is True Reflective Bump Mapping.  Most illustrations of this technique emphasize reflection into an environment map which is infinitely far away. This effect demonstrates the use of “reflect cube map” hardware to get very high quality specular for local light sources.

 

Demonstrates: Light-space Per-Pixel Lighting using vertex programs and texture shaders.


 

57.    Bumpy Shiny Patch

This effect illustrates the use of vertex programs for true reflective bump mapping with the texture shaders.  It supports a number of tweakable features that illustrate the flexibility of the effect.

 

Demonstrates: True reflective bump mapping.  Cubemap-space Per-Pixel Lighting using vertex programs and texture shaders.


 

58.    Bumpy Refractive Patch

This demo is a variation on the reflective bump mapping that illustrates how to approximate refractive bump mapping.

 

Demonstrates: Refractive bump mapping using vertex programs and texture shaders.


 

59.    Bumpy Shiny Patch w/ dynamic normal map

Another variation on the bumpy shiny patch is the ability to render normalmaps on-the-fly.  In this effect, little bulges move around (independently) all over the surface of the reflective patch.  This dynamic normalmap effect works for refractive mode as well.

 

Demonstrates: True reflective bump mapping.  Cubemap-space Per-Pixel Lighting using vertex programs and texture shaders.


60.    Explosion Demo

This is a multi-layer particle system that uses vertex shaders to do the physics calculations for the particles.  The vertex shader is pretty big, but it is commented quite thoroughly.  The C++ code includes a particle file parser as well as a class for implementing these types of particles.  The particle system is in 3D, as can be seen from the picture.

 

Demonstrates: Vertex Programs, Constant Setting, Alpha Blending for compositing.

61.    Mipmap Toon Rendering

This demo illustrates a clever use of texture coordinate generation, LOD selection, and register combiners to quickly render the silhouettes of some types of objects.

 

Demonstrates: Toon Silhouette rendering with mipmapping and Register Combiners.


 

62.    Dot Product Texture 2D Bump Mapping

This effect is a very basic illustration of the dot product texture 2D texture shader mode for bump mapping.  One important distinction about this form of bump mapping is that much shinier surfaces (greater specular exponents) can be rendered.

 

Demonstrates: Object-space Per-Pixel Lighting using texture shaders.


 

63.    Simple Bump Reflection

This effect is the simplest example of true reflective bump mapping.  It is applied to a single quad.

 

Demonstrates: True reflective bump mapping using texture shaders.


 

64.    Tangent Space Bump Reflection

This effect is a minor extension to the previous example.  It illustrates a tangent-space formulation of true reflective bump mapping.  This example also illustrates the application of an RGB glossmap as a subsequent pass.

 

Demonstrates: True reflective tangent-space bump mapping using texture shaders.


 

65.    Hardware Shadow Mapping

This effect demonstrates the GeForce3 hardware shadow mapping support.  It uses the SGIX_shadow and SGIX_depth_texture extensions.  In addition, the shadow map generation is performed in a pbuffer (making it independent of the window size), and the depth buffer is copied to a shared depth texture using fast glCopyTex{Sub}Image2D() support.

 

Demonstrates: Hardware shadow mapping support.


 

66.    Woo Shadow Mapping with Hardware Acceleration

This effect demonstrates a research-oriented extension to the hardware shadow mapping example.  It makes use of depth peeling and the dot product depth replace texture shader to compute Woo-style shadow maps in two passes.  Woo shadow maps are shadow maps whose depth is halfway between the nearest and second-nearest surfaces from the light’s point of view.

 

Demonstrates: Hardware accelerated Woo shadow mapping support (research).


 

67.    Depth Peeling

This effect demonstrates a new technique called depth peeling.  This technique uses two depth tests per fragment (one via shadow mapping) to peel away layers of the scene exposing fragments that would otherwise have failed the visible surface determination depth test.  The example image shows the 2nd nearest surface from a rendering of a teapot.  The teapot is rendered with two-sided lighting (red on the outside, green on the inside), and a blue ground plane is underneath.  This effect allows you to interactively increase and decrease the number of layers that are peeled away in the scene.

 

Demonstrates: Depth Peeling with hardware shadow mapping support and texture shader (research).


 

68.    Simple Vertex Array Range (VAR) example

This effect demonstrates using the VAR and fence OpenGL extensions for getting the most efficient transfer of geometry to the GPU and the most efficient pipelining between the CPU and GPU.

 

Demonstrates: NV_vertex_array_range and NV_fence extensions.


 

69.    Simple Multi-texture-based Shadow Mapping

This effect demonstrates the use of simple multitexturing and combiners to perform a limited variant of shadow mapping without specific hardware support.

 

Demonstrates: Shadow mapping on TNT-class hardware.


 

70.    Offset Bump Mapping

This effect demonstrates offset bump mapping on GeForce3 using either the offset texture 2D or dot product texture 2D texture shaders.  The offset texture 2D mode (often called DX6 EMBM) cannot be used with tangent-space bump mapping because the 2x2 matrix is global state.  The dot product texture 2D shader improves on this by allowing the 2x2 matrix to be specified per-vertex, and so it can be used for correct offset bump mapping on a sphere or other curved surface.
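
Both modes boil down to the same dependent lookup, sketched here in C++ (illustrative only):

// (du, dv) are the signed offsets fetched from the bump map; m00..m11 is the
// 2x2 matrix, which dot product texture 2D lets you supply per vertex.
void offsetTexCoord(float s, float t, float du, float dv,
                    float m00, float m01, float m10, float m11,
                    float& s2, float& t2)
{
    s2 = s + m00 * du + m01 * dv;
    t2 = t + m10 * du + m11 * dv;
    // the color map is then sampled at the perturbed coordinate (s2, t2)
}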

 

Demonstrates: Object-space and tangent-space bump mapping with texture shaders.


 

71.    Order-Independent Transparency

This effect demonstrates the use of depth peeling to correctly extract and re-composite the various layers of fragments within a scene.  There is a separate whitepaper on this technique.

 

Demonstrates: Order-independent transparency via depth peeling (research).


 

72.    Pbuffer to Texture Rectangle

This demo illustrates the use of pbuffers and NV_texture_rectangle.

 

Demonstrates: Dynamic textures with fast copies from pbuffer to texture rectangle.


73.    Per-Pixel Attenuation

This effect demonstrates the use of simple multitexturing to compute attenuation on a per-fragment basis.

 

Demonstrates: Simple multitexture and combiner math for per-pixel attenuation.


 

74.    Simple Projective Texture Mapping

This effect demonstrates simple projective texture mapping.  There is a separate whitepaper on projective texture mapping.

 

Demonstrates: Projective texture mapping.


75.    Homogeneous Texture Coordinates

This effect demonstrates the assignment of homogeneous texture coordinates.

 

Demonstrates: Using homogeneous texture coordinates.


76.    Simple Rotation Example

This effect is a simple illustration of a rotation constructed from four input vectors.  Typically we think of computing rotations between only two vectors where we (somewhat arbitrarily) choose the axis of rotation to be perpendicular to both vectors.  This formulation has more practical application to per-pixel lighting.

 

Demonstrates: Constructing a rotation from two from-to constraints.


77.    Simple Multitexturing Example

This effect demonstrates simple multitexturing in action.  It shows the two input textures in the upper subwindows, while the lower windows illustrate two combine modes.  In both cases texture 0 replaces the incoming fragment color.  In the lower left, texture 1 is modulated in; in the lower right, texture 1 is added in.
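
The two lower windows correspond to texture environment setups along these lines (a sketch; texture objects and coordinates omitted):

// Texture 0 replaces the incoming fragment color.
glActiveTextureARB(GL_TEXTURE0_ARB);
glEnable(GL_TEXTURE_2D);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

// Texture 1 is combined in: GL_MODULATE for the lower-left window,
// GL_ADD (texture_env_add) for the lower-right one.
glActiveTextureARB(GL_TEXTURE1_ARB);
glEnable(GL_TEXTURE_2D);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);  // or GL_ADD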

 

Demonstrates: Use of the ARB_multitexture extension.


78.    Simple Shared PBuffer

This effect demonstrates dynamic texture generation by rendering into a pbuffer and copying the rendering into a shared texture with a fast glCopyTex{Sub}Image2D().  In addition this effect uses the SGIS_generate_mipmap extension, which automatically updates the mipmap pyramid when the base image is modified.
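
The update path is roughly the following (a sketch, assuming the pbuffer context is current when the copy is issued and that dynamicTex is shared between contexts):

glBindTexture(GL_TEXTURE_2D, dynamicTex);
glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP_SGIS, GL_TRUE);

// ... render the new image into the pbuffer ...
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, texWidth, texHeight);
// SGIS_generate_mipmap rebuilds the mipmap pyramid automatically whenever
// the base level changes.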

 

Demonstrates: Fast dynamic textures via render-to-pbuffer.


Third Party Effects

This section describes the additional effects submitted to NVIDIA as part of the shader competition.  It includes details on how to run and use the effects, compiled from information supplied by the original developers.  If the developers have used extra commands beyond the standard Effects Browser controls, they are documented here.  All information on the shaders and their properties was provided by the creator of each effect, and only minor changes have been made to the text for readability.

 

All the effects live under the 'effects\xtra3rdparty' directory, and are designed to be built and run from there.

 


1.     Bezier Patch

Creator:
Francesco Banterle

 

Description:
This effect shows how to generate a bezier patch inside the vertex shader.
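
For reference, this is the math such a shader evaluates, sketched in C++ under the assumption of a bicubic patch with 16 control points held in constant registers (names are illustrative):

struct Vec3 { float x, y, z; };

float bernstein3(int i, float t)  // cubic Bernstein basis functions
{
    float s = 1.0f - t;
    switch (i) {
        case 0:  return s * s * s;
        case 1:  return 3.0f * t * s * s;
        case 2:  return 3.0f * t * t * s;
        default: return t * t * t;
    }
}

Vec3 patchPoint(const Vec3 P[4][4], float u, float v)  // (u, v) carried per vertex
{
    Vec3 p = { 0.0f, 0.0f, 0.0f };
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j) {
            float w = bernstein3(i, u) * bernstein3(j, v);
            p.x += w * P[i][j].x;
            p.y += w * P[i][j].y;
            p.z += w * P[i][j].z;
        }
    return p;
}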

Special Controls:
Up: Zoom In.
Down: Zoom Out.
b/B: Create a new random control point.
+: Increase the level of detail of the patch.
-: Decrease the level of detail of the patch.


2.     Bump Refraction

Creator:
Tomohide Kano

 

Description:
Does per-pixel refraction and reflection, with options to control the refractive index of the material to simulate objects made from glass, diamond, etc.


3.     Vortex

Creator:
Jim Bumgardner

 

Description:
This vertex shader produces a twisting vortex effect by remapping the texture coordinates. As a shortcut, the texture coordinates (tu, tv) are pre-computed as polar coordinates.  The vertex shader adds a twist rotation to the angular coordinate and then converts them back to Cartesian coordinates.  Since the center of the vortex contains the most detail, the mesh that is used has a correspondingly higher level of detail in the center.
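
The remapping is roughly the following (C++ equivalent; the exact falloff toward the center is an assumption, not taken from the shader):

#include <cmath>

// (r, a) are the precomputed polar coordinates carried by each vertex.
void vortexTexCoord(float r, float a, float twist, float time,
                    float& tu, float& tv)
{
    float a2 = a + twist * time / (r + 0.1f);  // stronger twist near the center (assumed falloff)
    tu = 0.5f + r * std::cos(a2);              // back to Cartesian, centered on (0.5, 0.5)
    tv = 0.5f + r * std::sin(a2);
}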


4.     Fresnel Refract and Reflect

Creator:
Kristian Olivero

 

Description:
This vertex shader implements a Fresnel reflection effect. Objects tend to be more reflective when light is incident at a shallow angle. Good examples of the Fresnel effect are a highway, which is non-reflective when looked at directly but mirror-like in the distance, or a piece of glass, which you can look through with almost no reflection when viewed directly but which becomes a near-perfect mirror when held at an angle. In the menu, alternate between the uniformly blended reflect and refract, and the Fresnel version, to see what a big difference this effect makes.

This shader generates the camera space reflection coordinates in T0, and a reasonable approximation of refraction coordinates in T1. The approximation for refraction is set up by shortening the vertex normal and passing it through the standard reflection calculation.

The dot product between the vector from the eye to the vertex, and the vertex normal vector is calculated and negated into the alpha channel of the diffuse vertex color. This is used to blend between the reflection and refraction stages for the Fresnel reflection effect.
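
Put as plain C++, the blend factor described above is simply (illustrative):

struct Vec3 { float x, y, z; };

// E = unit vector from the eye to the vertex, N = unit vertex normal.
float fresnelBlend(const Vec3& E, const Vec3& N)
{
    float d = -(E.x * N.x + E.y * N.y + E.z * N.z);  // -dot(E, N)
    return d < 0.0f ? 0.0f : d;  // ~1 head-on (mostly refraction), ~0 at grazing angles (mostly reflection)
}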

Modified From Original NVIDIA source to include Fresnel Reflections, and to easily switch between different reflection and refraction combinations to illustrate differences.


5.     Fur

Creator:
Sebastien St-Laurent

 

Description:
This effect is a fur rendering shader. It simulates straight fur by rendering multiple hulls around an object to create a 3D fur effect. A vertex shader is used in this renderer to facilitate the rendering of the hulls without having to manipulate the original mesh.
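
The hull rendering amounts to a loop along these lines (illustrative C++; the helpers are hypothetical stand-ins for the sample's code):

void drawFur(int numHulls, float furLength)
{
    for (int hull = 0; hull < numHulls; ++hull)
    {
        float offset = furLength * (hull + 1) / numHulls;  // distance pushed out along the normal
        float alpha  = 1.0f - (float)hull / numHulls;      // strands fade toward the tips
        setHullConstants(offset, alpha);  // hypothetical; the vertex shader computes p' = p + n * offset
        drawMesh();                       // hypothetical; same mesh redrawn, no CPU-side changes
    }
}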

Special Controls:
M - Increases the number of hulls applied for fur.
L - Reduces the number of hulls applied for fur.
LEFT/RIGHT - Translates along X axis.
PGUP/PGDOWN - Translates along Y axis.
UP/DOWN - Translates along Z axis (equivalent to zooming).
+/- - Zoom in/out.


6.     Grass

Creator:
Sebastien St-Laurent

 

Description:
This effect is a grass rendering shader. It simulates straight blades of grass by rendering multiple hulls around an object to create a 3D grass effect. A vertex shader is used in this renderer to facilitate the rendering of the hulls without having to manipulate the original mesh.

Special Controls:
M - Increases the number of hulls applied for grass.
L - Reduces the number of hulls applied for grass.
LEFT/RIGHT - Translates along X axis.
PGUP/PGDOWN - Translates along Y axis.
UP/DOWN - Translates along Z axis (equivalent to zooming).
+/- - Zoom in/out.


7.     Height Field

Creator:
Steffen Bendel

 

Description:
This effect demonstrates how you can create a (massively parallel) stack engine in the pixel shader.  This is only possible if you can use the last computed value as input for the next pixel. Generally this is not implemented in current hardware, but you can use the render target as a source for the next step. The reference device can process a bitmap as source and render target at the same time, so you can render the current line X and use line X-1 as the source.  To avoid problems with hardware rendering, this implementation uses two alternating line textures for 'line X' and 'line X-1'. The target and source textures are therefore different, and the result must be copied to the real target. This is a little slower than necessary, because there are a lot of state changes and a line is rendered using two triangles instead of a real line primitive. Because a stack element is only 4*8 bits in size, the demo uses only the red component for color; alpha holds the current height and blue holds the pointer to the next stack element.  Green is not explicitly used, but must be 1.0 for the texture-lookup operation.

In the first pass, the height map is rendered from the bottom (front) to the top (back). The height component (alpha) is scaled and added to the current line height.  The next pointer is the last line.  If this is higher than the last, it is the result of the operation; otherwise the last line is used.  After this pass, we have a top-down sorted list of stack elements (invisible ones are rejected), with the first being the stack pointer of the top line.  In the next pass, this list is drawn. The current stack element is used until the render height is lower than the height of this element; then the next one is used.  This is retrieved using a texture lookup (the last-line stack pointer is an index into the lookup table created in the first pass).

The effect is limited by color resolution. The stack pointer has only 8 bits, so the target picture resolves at most 256 lines. The x-resolution is perhaps better, because an iterated value is used as the coordinate for the lookup. This program is not well optimized, and the visual quality could perhaps be better (higher precision with another implementation, a perspective view, etc.), but it does demonstrate that some complex stack-based operations are possible using a pixel shader.


8.     Matrix TexGen

Creator:
Kristian Olivero

 

Description:
This is a vertex shader implementation of the traditional texture-transform shader, which allows you to rotate, scale, and translate texture coordinates.
The texture coordinate transformation matrix is set up, then the necessary components are passed in through the vertex shader constants. Any number of effects beyond the given example are possible with this shader by generating and multiplying the correct component matrices and passing in the composite. This is a nice, speedy effect, spending only 4 more instructions on the texture coordinate transformation than the simple quad shader.

A 4x4 texture coordinate transformation matrix is used so that the D3DX helper functions can be used easily, and to facilitate an easy transition to a full 3D transformation for 3D textures. The matrix is transposed so that only two vectors need to be passed in and used, one to transform each texture coordinate.
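
Per vertex, the work then reduces to two dot products, one per packed row (a C++ sketch of what the shader's two dp4-style instructions compute):

// row0 and row1 are the two packed constant vectors: one row each of the
// transposed texture matrix. The input coordinate is treated as (u, v, 0, 1).
void transformTexCoord(const float row0[4], const float row1[4],
                       float u, float v, float& uOut, float& vOut)
{
    uOut = row0[0] * u + row0[1] * v + row0[3];  // z term is zero for 2D coordinates
    vOut = row1[0] * u + row1[1] * v + row1[3];
}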


9.     Rain

Creator:
Sebastien St-Laurent

 

Description:
This renderer uses an accelerated vertex shader program to render a "rain splash" effect onto a mesh. The shader takes some time indexes and a rain direction vector as input.  From this information, the water splash sprite is scaled and alpha-blended properly to create an animated rain-splash effect.  In this sample, a separate vertex buffer is calculated (based on a random function and some triangle surface area calculations).  But if a mesh is sufficiently dense and non-uniform, the base mesh of a model could be used as the source of the rain splashes.


 

10.    Bezier Spline

Creator:
Lee Baldwin

 

Description:
This shader performs the cubic Bezier spline computations, based on the 4 control points packed into the constant registers.  The input vertex stream consists of a single float per vertex: the iterated parameter values, from 0.0 to 1.0, at which to evaluate the spline.  The number of vertices is equal to the number of segments to tessellate, and the primitive is drawn as a line list.
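
A sketch of the two halves (illustrative C++): the CPU builds the one-float-per-vertex stream, and the shader applies the cubic Bernstein weights to the four control points.

#include <vector>

// CPU side: N segments need N+1 parameter values from 0.0 to 1.0.
std::vector<float> makeParamStream(int segments)
{
    std::vector<float> t(segments + 1);
    for (int i = 0; i <= segments; ++i)
        t[i] = (float)i / segments;
    return t;
}

// Shader side (equivalent math), with control points c0..c3 in constants:
//   p(t) = (1-t)^3*c0 + 3t(1-t)^2*c1 + 3t^2(1-t)*c2 + t^3*c3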

Special Controls:
The + and - keys increase or decrease the number of segments, meaning the vertex buffer gets rebuilt with a new set of deltas.
You can also toggle some simple animation on the y-axis of the in/out vectors in the bezier basis.


11.    Spyrorama

Creator:
Jim Bumgardner

 

Description:
The vertex texture coordinates (tu, tv) are precomputed during initialization as polar coordinates. These are used to generate a groovy psychedelic pattern, suitable for wild parties and gas giant planets.

The color formula for each RGB component i is (range -1 to 1):

cos(log2(d)*x.i + a*y.i + z.i)*cos(a*w.i)

in which:

d is the distance polar coordinate (from v2.x)
a is the angle polar coordinate (from v2.y)
x is shader constant 10 (aka "twist")
- values from 0-10 are good. 0 = no twist
y is shader constant 11 (aka "rotate-direction")
- these should be integers - typically -1 or 1
z is shader constant 12 (aka "rotate phase")
- used to put r,g,b out of phase - values from 0-2pi
w is shader constant 13 (aka "pleats" (angular cos frequency))
- these should be integers - values from 0-6 are good

Shader constants x,y,z correspond to red,green,blue, respectively.
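
A direct C++ transcription of the formula, for one channel (illustrative; d must be greater than zero):

#include <cmath>

// d, a: polar coordinates; x, y, z, w: that channel's component of
// constants 10..13 (twist, rotate-direction, rotate phase, pleats).
float channelIntensity(float d, float a, float x, float y, float z, float w)
{
    float log2d = std::log(d) / std::log(2.0f);                // log2(d)
    return std::cos(log2d * x + a * y + z) * std::cos(a * w);  // range -1..1
}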


12.    Twister

Creator:
Alain Bellec

 

Description:
This effect consists of a twist which moves along the x-axis or the y-axis of the world coordinate system.
Sine and cosine are calculated in the vertex shader to create the rotation matrix applied to every vertex.
The angle of rotation depends both on the time (a simple counter which increases and then decreases) and on the x (or y) position of the current vertex of the object.  The relation between the vertex's x (or y) position and the angle depends on a "step" value; in other words, the smaller the "step", the more pronounced the rotation over the same distance.  It is possible to vary the maximum value of the loop, the "step" value, the offset of the start point, and the "speed" of the effect.  When any value is modified, it is often necessary to modify the other values as well; for example, the duration of the loop must be adapted to the new value of the speed or "step".
The lighting calculation uses the same rotation matrix.

This effect is based on a simple use of sines and cosines, and it is easy to create other effects with a small modification of the vertex shader code.
Thus "Twister.cpp" is reused to implement the "Sine", "Scaling" and "Manta" effects.

Special Controls:
+ or - to increase or decrease the "step" of the effect.
(AltGr +) or (AltGr -) to increase or decrease the time of the loop.
PageUp or PageDown to increase or decrease the speed of the effect (only 5 values).
Left Arrow to decrease the offset of the start point.
Right Arrow to increase the offset of the start point.


13.    Flying Worlds

Creator:
Steffen Bendel

 

Description:
This effect demonstrates the possibility of emulating real geometry using pixel shading. A billboard object is used to draw an illuminated sphere that can be viewed from all directions. This is similar to the principle of reflective bump mapping, but here the normal map is more macroscopic (because there is no underlying geometry) and no reflection is calculated.

For this effect we need a normal map and a cube map texture.  The processing here is separated into two steps: lighting/border culling, and texturing.

Pass 1:
Vertex Shader:
- transformation for the billboard object
- calculate the intensity-scaled light vector in eye space

Pixel Shader:
- get the normal vector from the texture
- calculate dp3 with the light vector for brightness
  (more than one light should be no problem here)
- add the ambient color to get the resulting brightness
- use the normal map alpha to clip the border of the sphere

Blending:
- alpha >= 0.5 compare to clip the non-sphere parts
  (the normal map has a smooth alpha gradient so that, after texture
  interpolation, the 0.5-alpha contour is really round)

Pass 2:
Vertex Shader:
- transformation for the billboard object
- create the transformation matrix for the cube map lookup

Pixel Shader:
- get the normal vector from the texture
- transform the vector by the matrix
- look up the cube map to get the resulting color

Blending:
- multiply the result and the backbuffer color
- use z-equal compare to write only the sphere


14.    Shining Flare

Creator:
Jarno Heikkinen

 

Description:
This is a quickie for the NVIDIA shader competition. Rotate the flare to change its orientation.  It displays a texture-mapped triangle fan that is radially distorted. A vertex shader is used for interpolation. The radial distortion and flare gradients are loaded from files.


15.    Pencil Sketch

Creator:
David E. Tin Nyo

 

Description:
This shader effect was designed to simulate a hand-drawn pencil sketch. Shaded areas of the 3D model appear to have a crosshatch pattern drawn on them. These lines are in screen space and are not affected by the rotation of the camera or object, to simulate the way a real human would render a pencil drawing.

The shader is implemented by modulating pre-drawn 2D horizontal and vertical line maps with 1D grayscale ramps. The Vertex Shader for this plugin is used to scale the texture coordinates of the 2D bitmaps relative to the Z distance, which keeps the texture size constant in relation to the screen coordinates.

The vertex shader also computes the dot product of the surface normal and light direction and uses this as the texture coordinate of the grayscale texture, creating a transition from light to dark just like the toon texture.

A Pixel Shader is used to modulate the greyscale ramp with the 2D sketch texture, creating a sketch texture that appears white (no lines) in the direction of the light, but appears more distinct (darker lines) in areas of shade. The pixel shader is necessary because we need to combine the inverse of the textures, since the sketch textures are drawn as black lines on white background.  Additional interest is added by using two light sources and separate textures for horizontal and vertical lines. This creates areas of the drawing where the hatch lines "separate" from each other, adding to the authenticity of the hand drawn look.
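
The ramp lookup set up by the vertex shader reduces to the following (illustrative C++):

struct Vec3 { float x, y, z; };

// N = unit surface normal, L = unit direction to the light.
float rampCoord(const Vec3& N, const Vec3& L)
{
    float u = N.x * L.x + N.y * L.y + N.z * L.z;  // N dot L
    return u < 0.0f ? 0.0f : u;  // texcoord into the 1D grayscale ramp:
                                 // bright (no hatching) toward the light, dark in shade
}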

The mouse can be used to rotate, pan, and zoom the view as with the other NVEffects. F1 or H can be used to view the help for the keyboard/mouse commands. The Pencil Sketch menu (or right-click in the view) allows the selection of three fixed light positions. The first, default, setting places the lights to the upper right, but slightly displaced from each other. The second setting places one light.

The crosshatch textures can be modified to give the shader a wide range of sketch styles. Also, since the greyscale texture is treated independently for each light, different textures can be supplied for different shading responses from each hatch texture.  Rotating the hatch textures using the Vertex Shader might provide some more interesting hatch behavior. For example, rotating the texture coordinates depending on the angle between the surface normal and the light.

This shader should have an option to automatically rotate the lights, similar to some of the samples (eg. the point-light sample), since there are a wide range of effects with varying light angles.


16.    Cook-Torrance lighting

Creator:
Eric Duhon

 

Description:
My primary goal for this effect was the accurate rendering of metallic surfaces.  The Cook-Torrance lighting model was designed for just such a purpose. All of the real-time hardware implementations of BRDF lighting models I have come across seem to rely on some form of pre-computation. This has many drawbacks: the accuracy of the pre-computation is limited by the size of the lookup table (LUT) you create, and a different LUT texture has to be generated for every type of material you wish to have in your program. There is no question, however, that these techniques are very fast.  Vertex shaders provide enough computational power that the Cook-Torrance lighting model can be calculated in real time. This allows the programmer to change the material by just changing a few numbers in constant registers. The accuracy of the shader is not nearly as limited as a LUT; only the precision of the shader registers limits accuracy, and there is no need to pre-calculate anything. The first part of my effect is this dynamic Cook-Torrance vertex shader.

The second problem with most attempts at rendering metals is secondary reflections.  Usually an ambient term is added to the lighting model to approximate secondary reflections.  While an ambient term works fairly well for diffuse secondary reflections, it cannot accurately represent specular secondary reflections. Adding a cubemap for the specular secondary reflections to an ambient term for the diffuse secondary reflections provides a fairly good approximation, but this approach tends to give too much of a mirror appearance to the material.

For this effect I used a variation of the basic cubemap. The cubemap is treated as a light source and fed into the Cook-Torrance lighting model to come up with a final value for the specular secondary reflection. This gives the cubemap more of the color of the material at high angles of incidence and more of the color of the light source (cubemap) at low angles of incidence. This method also works extremely well for dull materials: when looking at them straight on, the cubemap is barely visible, but at low angles of incidence the cubemap becomes more visible.  A good example of this type of behavior in the real world is a semi-polished floor. If you look straight down at it, you will not see much of a reflection; only when looking further away from you do you begin to see reflections in the floor. Traditional cubemaps don't exhibit this behavior. The Cook-Torrance cubemap makes up the second part of my effect.

The effect contains both a Phong implementation and the Cook-Torrance implementation for comparison. You can also compare the primary lighting models and the secondary (i.e. cubemap) lighting models separately.

Effect options described:
The first group of options allows you to select the lighting model you wish to use.  It defaults to Torrance and Cook lighting plus the Torrance and Cook cubemap, which is the complete shader.  The other lighting models are for comparison; they are described below.

Phong Lighting:
primary diffuse light: Standard phong diffuse Lighting
primary specular light: Standard phong specular lighting
secondary diffuse light: Standard phong ambient term
secondary specular light: none

Torrance and Cook Lighting:
primary diffuse light: Standard phong diffuse Lighting
primary specular light: Torrance and Cook specular lighting
secondary diffuse light: Standard phong ambient term
secondary specular light: none

Phong Cubemap:
primary diffuse light: none
primary specular light: none
secondary diffuse light: none
secondary specular light: standard cubemap

Torrance and Cook Cubemap:
primary diffuse light: none
primary specular light: none
secondary diffuse light: none
secondary specular light: Torrance and Cook cubemap

Phong Lighting plus Cubemap:
primary diffuse light: Standard phong diffuse Lighting
primary specular light: Standard phong specular lighting
secondary diffuse light: Standard phong ambient term
secondary specular light: standard cubemap

Torrance and Cook Lighting plus Cubemap:
primary diffuse light: Standard phong diffuse Lighting
primary specular light: Torrance and Cook specular lighting
secondary diffuse light: Standard phong ambient term
secondary specular light: Torrance and Cook cubemap

The second set of options is used for choosing the type of material you wish to view.
Shiny Gold (default): represents highly polished gold.
Dull Gold: represents a duller gold than the previous.
Dull Blue Material: represents no particular material; it is used to show the shader's effectiveness at rendering dull materials.
Silver: just another cool shiny material to look at.
Mirror: an example of a mirror surface rendered using the shader.

The third set of options allows you to control the brightness of the two lights. Their meanings are self-explanatory.

The fourth option pauses the motion of the two lights around the teapot.


Mathematical theory for the effect
Below are the mathematical formulas used in the shader. "dot" denotes the 3-component dot product of two vectors.

Standard Phong ambient term:
A = Ka * Dc where
A = final color
Ka = amount of secondary diffuse light (ie from material properties)
Dc = Diffuse color of material

Standard Phong Diffuse Lighting:
D = Il * (Kd * (N dot L) * Dc) where
D = final color
Il = light intensity
Kd = amount of primary diffuse light
Dc = Diffuse color
N = normal vector
L = Light vector

Standard Phong Specular Lighting:
S = Il * (Ks * (N dot H)^n) where
S = final color
Il = Light intensity
Ks = amount of primary specular light
N = Normal Vector
H = Halfway Vector
n = Specular power

Standard Cubemap:
Start with previous phong specular equation S = Ks * (N dot H)^n
since the light vector is just the view vector reflected around the normal
vector in a cube map, the halfway vector by definition is equal to the normal
vector in a cube map so the equation reduces to:
S = Ks * Il where
S = final color
Ks = amount of secondary specular light
Il = Light intensity and color sampled from the cubemap

Cook and Torrance lighting
The original Cook and Torrance equation (except for F, which is from Schlick) is:
S = dw * Ks * ((F * D * G) / (pi * (N dot V))) where
D = 1/(4 * m^2 * (N dot H)^4) * e^(-((tan(arccos(N dot H))^2) / m^2))
G = min(1,Gm,Gs)
Gm = (2 * (N dot H) * (N dot V)) / (V dot H)
Gs = (2 * (N dot H) * (N dot L)) / (V dot H)
F = F0 + (1-(N dot V)) * (1 - F0) from Schlick

This is not very practical, so it is best to rearrange and simplify the equation so that it works better with vertex shaders. All transcendentals have been removed except the exponential in the equation below.

S = (B * F * D * G) / (N dot V)
B = (dw * Ks) / (2 * m^2 * pi), which can be calculated on a per-object basis
D = e^((1/m^2) * (1 - 1/(N dot H)^2)) / (N dot H)^4
G = min(0.5,Gm,Gs)
Gm = ((N dot H) * (N dot V)) / (V dot H)
Gs = ((N dot H) * (N dot L)) / (V dot H)
F = F0 + (1-(N dot V))^5 * (1 - F0) from Schlick
N = Normal Vector
H = Halfway Vector
V = View vector
L = Light vector
m = Reflectivity
dw = .0001
Ks = amount of primary specular light
F0 = fresnel coefficient at 0 degrees angle of incidence

Cook and Torrance Cubemap:
Start with previous Cook and Torrance specular equation
since the light vector is just the view vector reflected around the normal
vector in a cube map, the halfway vector by definition is equal to the normal
vector in a cube map and N dot V is equal to N dot L so the equation reduces to:

S = (B * F) / (N dot V)
B = (dw * Ks) / (4 * m^2 * pi), which can be calculated on a per-object basis
F = F0 + (1-(N dot V))^5 * (1 - F0) from Schlick
N = Normal Vector
V = View vector
m = Reflectivity
dw = .0001
Ks = amount of secondary specular light
F0 = fresnel coefficient at 0 degrees angle of incidence
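
For reference, the simplified specular term transcribes directly into C++ like this (a sketch; the dot products are assumed precomputed, positive, and non-zero):

#include <cmath>

// B folds dw*Ks/(2*m^2*pi) and is computed once per object, as noted above.
float cookTorranceSpecular(float NdotH, float NdotV, float NdotL, float VdotH,
                           float m, float F0, float B)
{
    float nh2 = NdotH * NdotH;
    float D   = std::exp((1.0f / (m * m)) * (1.0f - 1.0f / nh2)) / (nh2 * nh2);
    float Gm  = (NdotH * NdotV) / VdotH;
    float Gs  = (NdotH * NdotL) / VdotH;
    float G   = Gm < Gs ? Gm : Gs;  // G = min(0.5, Gm, Gs)
    if (G > 0.5f) G = 0.5f;
    float F   = F0 + std::pow(1.0f - NdotV, 5.0f) * (1.0f - F0);  // Schlick
    return (B * F * D * G) / NdotV;
}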

References
Phong, B. (1975). Illumination for computer-generated pictures. Communications of the ACM, 18(6), 311-317.
Cook, R. L. and Torrance, K. E. (1982). A reflectance model for computer graphics. Computer Graphics, 15(3), 307-316.
Watt, A. (2000). 3D Computer Graphics.


17.    Hair

Creator:
Steffen Toksvig

 

Description:
This vertex shader takes a vertex stream:
D3DXVECTOR3 p; // position
D3DXVECTOR3 tx, ty; // tangents
D3DXVECTOR2 tex; // texture coordinate
D3DXVECTOR4 parm; // .x = u parameter value, .y = radius, .z = length
as input to render strips of triangles that more or less resemble hair.

The animation and rendering of the hairs are controlled by a number of constants:
- Diffuse and specular colors
- Specular reflection and sharpness
- Transparency fade along the hair
- Constant bending
- Gravity point with distance attenuation
- Dynamic bending calculated from the object motion
- Light source position
- Bending shape polynomial

See ShaderConstants.h for a description of the constants.

In this version of the program, fur can be grown on any parametric surface that implements the ParameterSurface interface. However, only a sphere and a torus are implemented.

Special Controls:
PgUP,PgDown Change number of hairs
+,- Change length
*,/ Change number of segments
R,E Change radius
P Change model


18.    1D Particles

Creator:
Martin Mittring

 

Description:
The job of the vertex shader is to project the particle (2 positions with 2 size values) and calculate the right texture coordinates (16 sub-pictures in one texture). I'm sure this calculation can be done with more precision, but I ran out of time to prove it.  No special pixel shader is used, so this will work on most graphics cards.


19.    Particle System

Creator:
Greg Snook

 

Description:
Particle System #1
The first shader runs a complete particle simulation within the shader. Trajectory, wind shear, gravity and magnetic forces can all be applied in real time, in addition to particle scale and rotation. The shader also provides four-frame texture animation for the particles using creative UV mapping on a single texture. Each vertex for this shader is only 16 bytes (a single vector). A huge dialog box of sliders is presented to adjust the particle system in real time. It's not exactly user-friendly, but it does allow full control of the system and the ability to save/load particle system definitions.

Particle System #2 (Model Based particle system)
The second shader also runs a complete particle simulation within the shader, just as above. The main difference is that in this case the particles are launched from the surface of a donor model instead of being emitted from a single point in space. The vertex stride increases to 32 bytes (two vectors), but the end effect is really useful. As with the first shader, a full dialog of ugly slider controls is provided, plus you can right-click on the particle system viewer to load a different donor model.


20.    Plasma

Creator:
Laurent Mascherpa

 

Description:
A colourful procedural plasma effect.


21.    Fur Effect with Dynamic Per Pixel Lighting

Creator:
Jeremie Allard

 

Description:
This effect uses a stack of transparent layers to simulate fur. It uses a vertex program to set up the per-pixel lighting environment, which is rendered with register combiners. Only bump textures are used, to demonstrate the added quality of per-pixel lighting. It uses only 2 register combiners, so it runs on any GeForce card, but the vertex program is hardware-accelerated only on a GeForce3. The code is based on the vtxprg_regcomb_setup sample program from the NVOGLSDK; it needs to be copied into the NVOGLSDK directory to be compiled.


22.    Jellyfish

Creator:
Wade Lutgen

 

Description:
This is a fairly complex model which is extremely memory- and CPU-efficient.  The entire model is defined by about 118 floating-point values representing the 1D curved line that is the cross section of the jellyfish.  The body is made by sweeping this line around in a circle and placing the result in a display list.  The tentacles are 100% procedurally generated.  The shading is based on the NVIDIA membrane shading demo, and uses a small 1D texture map.

 

All of the animation and shading are done within the vertex program by perturbing the original vertices and normals according to a simple formula ( new_vertex = (1 + cos(theta)) * old_vertex).  Theta depends on y, z, and t.  The time is sent once per frame as a parameter.
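
In C++ terms the perturbation is simply the following (illustrative; the exact form of theta is not given beyond its dependence on y, z, and t, so it is passed in here):

#include <cmath>

struct Vec3 { float x, y, z; };

// theta depends on y, z and the per-frame time t; its exact form is the
// author's, so it is taken as an input here (an assumption).
Vec3 perturb(const Vec3& v, float theta)
{
    float s = 1.0f + std::cos(theta);  // new_vertex = (1 + cos(theta)) * old_vertex
    Vec3 out = { s * v.x, s * v.y, s * v.z };
    return out;
}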

 

This adds up to almost no memory or CPU requirements.  An undersea scene could be filled with hundreds of these without becoming a resource hog.


23.    Volume Lighting

Creator:
Stefan Guthe

 

Description:
The volume dataset is visualized using a stack of 2D textures aligned with the main viewing direction.


24.    Volumetric Fog

Creator:
Peter Popov

 

Description:
This is a volumetric fog demo.


25.    Non-Photorealistic Rendering

Creator:
Szczepan Kuzniarz

 

Description:
This effect is an implementation of the non-photorealistic shading method described in [1] and the silhouette rendering described in [2]. Register combiners are used to render the silhouette, and vertex programs to calculate lighting and texture coordinates. The lighting model allows shading with mid-tones only, so edges and specular highlights remain visible.

 

References:

 

[1] Amy Gooch, Bruce Gooch, Peter Shirley, Elaine Cohen, 'A Non-Photorealistic Lighting Model for Automatic Technical Illustration', Department of Computer Science, University of Utah, http://www.cs.utah.edu

[2] Cass Everitt, 'One-Pass Silhouette Rendering with GeForce and GeForce2', NVIDIA Corporation



[1] Vertex shader and pixel shader are terms used by Microsoft DirectX 8.  Similar functionality is also available under the OpenGL API via the vertex program, texture shader and register combiner extensions.