Tuesday, March 29, 2011

[Direct3D HLSL] Normal Mapping tutorial

For an explanation of why to use tangent space in the first place, read the 'Why to use tangent space' post below this one.


Let's assume we're using Direct3D and HLSL.

Converting to Tangent (or texture) space
Normals stored in the texture depend on the orientation of the surface they're applied to: they are expressed in what's called Tangent Space. All the other lighting components, such as the view direction, are supplied in world space. Since we can't sensibly compare vectors that live in different spaces, the solution is to convert every lighting component we want to compare against the normal into tangent space as well. In other words: compare apples to apples.

Changing coordinate systems requires a transformation. I'll skip the hardcore math, but the point is this: just like we need a matrix to get from object space to world space, we need a matrix to convert from world space to tangent space. Remember this:
  • We need the surface orientation, because that's what the texture normals depend on.
  • We know everything about our surface (a triangle).
  • Any lighting component we need in the PS (lightdir, viewdir, and so on) needs to be multiplied by the resulting matrix.
/* We need the 3 triangle corner positions (pos1, pos2, pos3), their 3 texture coordinates (tex1, tex2, tex3) and the surface normal. Tangent and bitangent are the vectors we're constructing */
D3DXVECTOR3 tangent, bitangent;

// Determine the surface orientation by calculating the triangle's edges
D3DXVECTOR3 edge1 = pos2 - pos1;
D3DXVECTOR3 edge2 = pos3 - pos1;
D3DXVec3Normalize(&edge1, &edge1);
D3DXVec3Normalize(&edge2, &edge2);

// Do the same in texture space
D3DXVECTOR2 texEdge1 = tex2 - tex1;
D3DXVECTOR2 texEdge2 = tex3 - tex1;
D3DXVec2Normalize(&texEdge1, &texEdge1);
D3DXVec2Normalize(&texEdge2, &texEdge2);

// The determinant of the texture-space edges tells us how the UV mapping is oriented
float det = (texEdge1.x * texEdge2.y) - (texEdge1.y * texEdge2.x);

// Guard against a degenerate (near-zero) determinant
D3DXVECTOR3 bitangenttest;
if(fabsf(det) < 1e-6f) {
    // A determinant of (almost) zero means the texture coordinates are
    // degenerate (the texture edges are parallel), so fall back to default axes
    tangent.x = 1.0f;
    tangent.y = 0.0f;
    tangent.z = 0.0f;

    bitangenttest.x = 0.0f;
    bitangenttest.y = 0.0f;
    bitangenttest.z = 1.0f;
} else {
    det = 1.0f / det;

    tangent.x = (texEdge2.y * edge1.x - texEdge1.y * edge2.x) * det;
    tangent.y = (texEdge2.y * edge1.y - texEdge1.y * edge2.y) * det;
    tangent.z = (texEdge2.y * edge1.z - texEdge1.y * edge2.z) * det;

    bitangenttest.x = (-texEdge2.x * edge1.x + texEdge1.x * edge2.x) * det;
    bitangenttest.y = (-texEdge2.x * edge1.y + texEdge1.x * edge2.y) * det;
    bitangenttest.z = (-texEdge2.x * edge1.z + texEdge1.x * edge2.z) * det;

    D3DXVec3Normalize(&tangent, &tangent);
    D3DXVec3Normalize(&bitangenttest, &bitangenttest);
}

// The bitangent is the cross product of the surface normal and the tangent, so calculate it that way
D3DXVec3Cross(&bitangent, &normal, &tangent);

// The cross product might point the wrong way, so compare it with the one computed above and flip it if needed
float crossinv = (D3DXVec3Dot(&bitangent, &bitangenttest) < 0.0f) ? -1.0f : 1.0f;
bitangent *= crossinv;

/* and add the tangent and bitangent to our model's vertex buffer */

We need a 3x3 matrix to convert our lighting vectors into surface-relative (tangent space) ones. Build it by placing the tangent, bitangent and normal as the rows of a matrix, then transpose it in the Vertex Shader:

// tangentin, binormalin and normalin are 3D vectors supplied by the CPU
float3x3 tbnmatrix = transpose(float3x3(tangentin,binormalin,normalin));

// then multiply any vector we need in tangent space (the ones to be compared to
// the normal in the texture). For example, the light direction:
float3 lightdirtangent = mul(lightdir,tbnmatrix);
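
Putting the vertex-shader side together, here is a minimal sketch of how that could look. Note that the structure and variable names (VS_INPUT, VS_OUTPUT, worldviewprojection, world, lightposition) are my own placeholders, not part of the original tutorial, so adapt them to your own effect file:

// Hypothetical constants -- the names are placeholders, not from the tutorial
float4x4 worldviewprojection;
float4x4 world;
float3 lightposition;

struct VS_INPUT
{
    float4 position : POSITION0;
    float2 texcoord : TEXCOORD0;
    float3 normal   : NORMAL0;
    float3 tangent  : TANGENT0;
    float3 binormal : BINORMAL0;
};

struct VS_OUTPUT
{
    float4 position        : POSITION0;
    float2 texcoord        : TEXCOORD0;
    float3 lightdirtangent : TEXCOORD1;
};

VS_OUTPUT NormalMapVS(VS_INPUT input)
{
    VS_OUTPUT output;
    output.position = mul(input.position, worldviewprojection);
    output.texcoord = input.texcoord;

    // Bring the TBN basis into world space so it matches the light vector
    // (this assumes the world matrix contains no non-uniform scaling)
    float3 t = normalize(mul(input.tangent,  (float3x3)world));
    float3 b = normalize(mul(input.binormal, (float3x3)world));
    float3 n = normalize(mul(input.normal,   (float3x3)world));
    float3x3 tbnmatrix = transpose(float3x3(t, b, n));

    // Build the world-space light direction and convert it to tangent space
    float3 worldpos = mul(input.position, world).xyz;
    float3 lightdir = lightposition - worldpos;
    output.lightdirtangent = mul(lightdir, tbnmatrix);

    return output;
}

The TEXCOORD1 interpolator carries the tangent-space light direction to the Pixel Shader, which is the value the snippet below reads back as lightdirtangentin.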

Then we're almost done. All that's left is to pass the converted vectors on to the Pixel Shader. In the Pixel Shader, retrieve the normal from the texture; you now also have, for example, the light direction in tangent space. Do your lighting calculations as you always would, the only exception being the source of the normal:

// we're inside a Pixel Shader now
// texture coordinates are equal to the ones used for the diffuse color map
float3 normal = tex2D(normalmapsampler, coordin).rgb;


// color is stored in the [0,1] range (0 - 255), but we want our normals to be
// in the range of [-1,1].
// solution: multiply by 2 (yields [0,2]) and subtract one (yields [-1,1]).
// renormalize, since texture filtering may have shortened the vector slightly
normal = normalize(2.0f*normal - 1.0f);


// now that we've got our normal to work with, obtain (for example) the light
// direction for Phong shading.
// lightdirtangentin is the tangent-space light direction (lightdirtangent) we
// computed in the VS above, interpolated and passed in by the rasterizer
float3 lightdir = normalize(lightdirtangentin);


/* use the variables as you would always do with your favourite lighting model */
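
As a small example of that last step, and continuing the Pixel Shader snippet above, a basic Lambert-style N dot L term could look like this. The diffusesampler and lightcolor names are assumptions for illustration only, not part of the original shader:

// Hypothetical continuation of the Pixel Shader above: a simple N dot L term.
// diffusesampler and lightcolor are assumed names, not from the tutorial.
float ndotl = saturate(dot(normal, lightdir));
float3 diffuse = tex2D(diffusesampler, coordin).rgb * lightcolor * ndotl;

return float4(diffuse, 1.0f);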

Why to use tangent space

You might've been wondering why you see lots of people doing their vector calculations in what they call texture or tangent space. Why can't they just do it in World space for example?

Well, because they want their special texture maps to be portable.

Let's take normals for example: 3D vectors which can be stored in a texture to provide a detail normal for every pixel of that texture (texel). Textures which store normals are called normal maps or bump maps. Here's an example of a normal map of a brick wall:

This image has three channels: R, G and B. They're used to store the X, Y and Z components of the normal that's supposed to be at that given texel. This normal can be used to change the surface lighting without having to add geometry. The only downside is that the object will still look flat from the side: it's a trick to change the lighting response, not to add actual depth. The lighting changes because these per-texel normals are used in the lighting calculations instead of the interpolated vertex normals passed on by the Vertex Shader.

Now about the colors: red corresponds to the part of the normal running along the surface (the tangent direction), green to the part along the bitangent, and blue to the part pointing straight up away from the surface.

A mostly-blue (128, 128, 255) pixel, the one that decodes to the perfect surface normal (0, 0, 1), will be our test subject.
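
As a quick sanity check, here's that pixel run through the same decode formula used in the Pixel Shader of the tutorial above:

// (128, 128, 255) is sampled by the GPU as roughly (0.5, 0.5, 1.0)
float3 decoded = 2.0f * float3(0.5f, 0.5f, 1.0f) - 1.0f;  // = (0, 0, 1), straight up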

Let's say we were using this vector as a world space normal. That would work perfectly fine for a surface lying flat on its back: the GPU would read (128, 128, 255), decode it and conclude that the normal points 100% up in world space. Now imagine an object rolled onto its belly with this same texture. The GPU will still read (128, 128, 255) and still believe the normal points straight up in world space. That's no longer true, since the object itself has rotated. Oops.

Looks like we can't just pluck world normals out of a texture, because world normals are object orientation dependent. The normals in the texture are relative to the surface the texture is applied to. So, we need to know the surface orientation to be able to use these normals; there's no way to store portable world normals in a texture.

But there's always hope: just convert the data that is stored in world space to tangent space.

Sunday, March 27, 2011

Introduction

Hello fellows from all over the world.

I've decided to start a blog about my own little 3D Engine I've started making.

Why a 3D Engine you might ask? Well, I got bored a while ago. Bored with all the PHP and Access shite at school. Sounds familiar eh?

I just wanted to know how the 3D stuff in all these epic games worked, and, well, I just started creating my own. "Just as simple as that, start your own," you might think. Well, it is not.

It's just called 'Engine' for now, and that will do. Enough of the personal stuff, let's race ahead to my actual latest version (I started this blog a tad too late).