Hello again, my friends!

Let's start a new series of tutorials. This time we'll go deep into the shader universe, the most exciting part of the OpenGL programmable pipeline. We'll cover textures, lights, shadows, per-pixel effects, bump, reflections and more.

This series is composed of 3 parts:

  • Part 1 - Basic concepts about GLSL ES (Beginners)
  • Part 2 - Shader Effects (Intermediate)
  • Part 3 - Mastering effects with the OpenGL Shading Language (Advanced)




At a glance

We'll study the shading language in depth (more specifically GLSL ES, the shading language for Embedded Systems) and make great effects with shaders, like specular lights, reflections, refractions, bump maps and more.

In this first part I'll cover the basic concepts about shaders and GLSL ES. We'll also need to drill deep into something called Tangent Space, which is an intermediate-level topic, so I'll write an article between parts 1 and 2 dedicated to the Tangent Space concept and its creation.

In the second part we'll start creating some interesting effects, like specular lights, reflections and refractions. We'll also create environment mapping using cube textures.

Finally, in the last part, we'll build the most advanced effect, bump mapping, and see the differences between Normal Mapping, Bump Mapping and Parallax Mapping.

Let's get to work!



Shading Types

First off, we need to understand the evolution of what we call a shader. Today we run many computations on the GPU and have several shading techniques that achieve really good results, but how did we get here?

Once upon a time there was a single shading technique, called Flat Shading. In it, light is computed from a normal vector, and each FACE of the mesh has exactly one normal vector. The term "FACE" here means a polygon (usually a triangle).

Flat Shading

With the evolution of the hardware we started to process values for each vertex. This improvement took us to a new level, enabling more lighting effects. This shading technique was called Gouraud Shading.

Gouraud Shading

Then we discovered something that produces really nice results: interpolation. It's a process in which we calculate all the intermediate points between two other points, and we use it all the time. For example, if a point A has a texture coordinate of U:1 V:2 and a point B has U:2 V:5, the interpolation will calculate all the points in between, from (1, 2) to (2, 5). This is known as Phong Shading.
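Here is a tiny GLSL illustration of that exact example; mix() is GLSL's built-in linear interpolation, defined as mix(a, b, t) = a + t * (b - a):

    vec2 interpolate(float t)
    {
        vec2 coordA = vec2(1.0, 2.0); // point A: U:1 V:2
        vec2 coordB = vec2(2.0, 5.0); // point B: U:2 V:5

        // t = 0.0 gives A, t = 1.0 gives B, t = 0.5 gives (1.5, 3.5).
        return mix(coordA, coordB, t);
    }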

Phong Shading

But for many reasons, to simulate the real world we need more than a linear interpolation. Surfaces have many details that change the way light and shadow react. So we discovered a way to produce a virtually infinite range of values over a surface while using less processing time. This technique was called Bump Shading.

Bump Shading

Today we have several advanced bump techniques, but all of them share the same basic concept: store each surface deformation value as an RGB color. So, basically, Bump Shading takes advantage of a texture map that stores coordinates in RGB format. The coordinates in it can be used for many things: Normals, Tangents, Bitangents, vertex positions or anything else we want. Usually the textures for Bump Shading are called Normal Maps, because we store the Normal Vector values in them.
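Just to make that idea tangible, here is a minimal sketch of reading such a map in a Fragment Shader (uNormalMap and vTexcoord are hypothetical names of mine). Since a unit vector has components in [-1.0, 1.0] while an RGB texel holds [0.0, 1.0], the value is remapped when stored and restored when read:

    precision mediump float;

    uniform sampler2D uNormalMap; // stored as rgb = normal * 0.5 + 0.5
    varying vec2 vTexcoord;

    void main()
    {
        // Undo the remap and re-normalize to recover the unit vector.
        vec3 normal = normalize(texture2D(uNormalMap, vTexcoord).rgb * 2.0 - 1.0);

        // Visualize it, just to keep this sketch complete.
        gl_FragColor = vec4(normal * 0.5 + 0.5, 1.0);
    }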

By default, OpenGL uses Phong Shading, interpolating the Vertex Shader's outputs into the Fragment Shader's inputs. This information is very important, so I'll repeat: "Vertex Shader's outputs are interpolated to Fragment Shader's inputs". Technically, this is what happens:

Interpolation Table

             Vertex Shader    Fragment Shader
  Vertex A   0.0              0.0
             -                0.25
             -                0.50
             -                0.75
  Vertex B   1.0              1.0



OpenGL Shaders

Here is a brief review of the shader concepts we've seen in previous tutorials:

  • A Shader is the way to calculate everything related to our 3D objects on our own (from the vertex positions to the most complex light equations).
  • Vertex Shaders (VSH) are processed once for each of the object's vertices. Fragment Shaders (FSH) are processed once for each fragment (not necessarily a pixel) of the visible object (http://blog.db-in.com/all-about-opengl-es-2-x-part-1).
  • You can set constant values as Uniforms to use throughout the VSH and FSH processing (http://blog.db-in.com/all-about-opengl-es-2-x-part-2).
  • Dynamic variables can be assigned only to Attributes, which are exclusive to the VSH. You can send a variable from the VSH to the FSH via Varyings, but remember that those values will be interpolated! A minimal sketch of all this follows the list.
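To make the review concrete, here is a minimal sketch of a complete Shader Program in GLSL ES (the variable names are mine, not a fixed convention), with an Attribute, a Uniform and a Varying connecting the two steps:

    // Vertex Shader (VSH): runs once per vertex.
    attribute vec4 aPosition;   // dynamic per-vertex value (Attribute)
    uniform mat4 uMVPMatrix;    // constant for the whole draw call (Uniform)
    varying vec4 vColor;        // output, interpolated on its way to the FSH

    void main()
    {
        vColor = aPosition * 0.5 + 0.5; // just something to interpolate
        gl_Position = uMVPMatrix * aPosition;
    }

    // Fragment Shader (FSH): runs once per fragment.
    precision mediump float;

    varying vec4 vColor; // arrives already interpolated

    void main()
    {
        gl_FragColor = vColor;
    }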

Shaders are small pieces of code that are processed directly on the GPU. Unfortunately, even these days (2011) our hardware is very poor and slow compared with the real amount of calculation that exists in the real world. The most advanced hardware, trying to calculate a real phenomenon such as sunlight passing through the water of a pool, could take days to compute a single frame. On the other hand, nature computes all physical phenomena instantly (OK, Mother Nature makes a kind of "calculation", not exactly an equation).

While we stay in the "Era of Bits" we can't really calculate the true phenomena (I wrote an article about the "Binary World" where I talked about the new era of quantum computers; maybe there, in the "Era of Quanta", we'll be able to reproduce our 3D world much closer to reality).

Anyway, what I mean is that shaders try to reproduce the real world with very abstract code, making a bunch of simplifications. So if you want to master shaders, you must learn to extract abstract pieces of code from real-world phenomena. But don't worry too much about it now; in time you'll see that it's an easy task and can be fun as well.

It's important to start thinking in terms of the "Shader Program". It's a set of 2 (and only 2) shaders: a vertex shader and a fragment shader. So we must think of the render as 2 different steps (vertex and fragment). Usually the Fragment Shader is processed many more times than the Vertex one. If a mesh has 10,000 vertices, its Vertex Shader will generate 10,000 outputs for the Fragment Shader, and those outputs will always be interpolated during the fragment processing. So, to increase performance, we always try to place the heavy calculations in the Vertex Shader. Obviously there are calculations whose values can't accept interpolation, like the bump effect; only in those cases do we do the math inside the Fragment Shader.

However, make sure you get the distinction between making the calculations inside the Vertex Shader and another concept called Per-Vertex/Per-Fragment Light. We'll see those concepts in-depth later on, but just to clarify (see the sketch after this list):

  • Per-Vertex Light means you have all the light calculations inside the Vertex Shader and then interpolate the result to the Fragment Shader.
  • Per-Fragment Light means you have the last light calculations (the output value) inside the Fragment Shader, regardless of whether the earlier steps were made in the Vertex or in the Fragment Shader.
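Here is a hedged sketch of the difference, using a simple diffuse (Lambert) term; uLightDir is assumed to be a unit vector in the same space as the normals, and all names are my own:

    // Per-Vertex Light: the whole equation runs in the VSH;
    // only its RESULT is interpolated to the FSH.
    attribute vec4 aPosition;
    attribute vec3 aNormal;
    uniform mat4 uMVPMatrix;
    uniform vec3 uLightDir;
    varying float vDiffuse;

    void main()
    {
        vDiffuse = max(dot(aNormal, uLightDir), 0.0);
        gl_Position = uMVPMatrix * aPosition;
    }

    // ...and the FSH just outputs the interpolated result:
    //     gl_FragColor = vec4(vec3(vDiffuse), 1.0);

    // Per-Fragment Light: the VSH only passes the normal along
    // (vNormal = aNormal) and the last step runs for every fragment:
    //     float diffuse = max(dot(normalize(vNormal), uLightDir), 0.0);
    //     gl_FragColor = vec4(vec3(diffuse), 1.0);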

The interpolation happens on all Vertex Attributes, like the Texture Coordinates. As shown above in the Interpolation Table, the interpolated values from a texture coordinate can retrieve all the pixels of a texture. Texture Coordinates are usually defined by a technique called "UV Map" or "UV Unwrap Map", which is an artistic job; it's actually almost impossible to create detailed UV Maps with code alone. Often a professional 3D application exports the Texture Coordinate values along with the model coordinates, based on definitions made in a user-friendly UV editor.

But there is another per-vertex Attribute that is very important to the 3D world. With shaders we calculate lights, shadows, reflections, refractions and any other effect we want, and all of them need something in common: a Normal vector.



Normal Vector

This is one of the most important per-vertex attributes and its concept is very easy to understand. In the real world there are basically two things that can alter how light rays affect a surface: the material (reflectiveness, refraction, specularity, shininess, etc.) and the surface's angle. Well, actually, the point of view (the viewer's eyes) also affects how we see the light, but let's focus on the first two things. The normal vector is related to the surface's angle. As performance is crucial to us, instead of recalculating the angle of each surface (triangle) on every shader pass, we usually pre-calculate a normal vector for every surface (triangle).

In simple words, the normal vector is a unit vector (magnitude equal to 1.0, with each component in the range [-1.0, 1.0]) which represents the surface's angle. Nice; now, how can we calculate it?
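Just to illustrate the unit vector property (this is not the normal derivation, which comes later): any vector can be turned into a unit vector by dividing it by its own magnitude, which is exactly what GLSL's built-in normalize() does:

    vec3 toUnit(vec3 v)
    {
        // length(v) = sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
        // dividing by it yields a vector of magnitude 1.0.
        return v / length(v);
    }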

Well, that's not an easy task, my friend. I had to read, watch and try a bunch of tutorials until I found the right formula. There are many people trying to teach how to calculate the normals. Some say you must calculate per-face normals and store them in a buffer, others say to average the normals of adjacent faces, and some even say you need to weight each face by its surface area in the final calculation. But no one gave me the right formula! I had to find it by myself, with the help of a great 3D application called MODO (by the way, I love it!).

Unfortunately, I'll not explain how to calculate the normals in this first tutorial. I'll write a separate article to show you how to get the right formula for calculating the normals. The normals deserve more attention than a small section inside one tutorial.

The most valuable point here is for you to understand that the Normal is a unit vector, to visualize how the normals work together, and to see how they fit into our shaders' context.



Normal's Smooth Angle

As we always abstract the real world, trying to simplify it, we've created a concept that does not exist in the real world: the Smooth Angle. Imagine this: in the real world, surfaces have infinitely many vertices. Take for example a sphere, maybe a bowling ball.

Try to imagine the smallest face/triangle that composes that bowling ball. Even with a microscope, we would never see a faceted area. Now take a look at our virtual spheres: even if we create a 3D mesh with a stupidly high resolution of 8 million polygons, we'll stay very far from the perfection of the real world. Besides, for our 3D applications and games we must work with thousands of polygons, not millions. Does that mean our 3D lighting will look ugly on low-poly meshes? Fortunately, we have a solution.

Our 3D world will always have imperfections.

This problem can be solved with the Normal's Smooth Angle. In simple words, it represents the maximum angle between two adjacent faces at which the light will still look continuous when reflected by the surface. The following picture helps us understand this point better:

The Smooth Angle is the angle between two adjacent faces.

Remember that the smooth angle must be applied when we calculate the Normals, so any later change to the smooth angle means recalculating all the Normals. I'll talk more about the smooth angle in the article about Normal calculation.
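Just to give a feel for the test involved (a sketch under my own assumptions, not the formula from the upcoming article): since the dot product of two unit vectors is the cosine of the angle between them, two adjacent face normals can be compared against the smooth angle like this:

    bool isSmooth(vec3 faceNormalA, vec3 faceNormalB, float smoothAngle)
    {
        // Both normals are assumed to be unit vectors and smoothAngle
        // is in radians; angles below the threshold get averaged
        // (smoothed), others keep a hard, faceted edge.
        return dot(faceNormalA, faceNormalB) >= cos(smoothAngle);
    }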



Tangent Space

The Tangent Space is composed of three vectors, one of which is the Normal Vector itself. As we've already talked about the normals, let's focus on the other two: the Tangent and the Bitangent (also known as Binormal, although the term "Binormal" is a misnomer).

The Tangent and Bitangent are unit vectors, just like the Normal, and the combination of these three vectors must form an orthogonal, orthonormal set. Before we go ahead, let me explain these two concepts in simple words:

  • Orthogonal: two vectors that are perpendicular (they form an angle of 90 degrees).
  • Orthonormal: a set of vectors that are all orthogonal to each other and of unit length.

OK, this set of three vectors, called the Tangent Space, is defined per-vertex. Its purpose is to define a local space for each face/vertex, which will be used to interpret the surface's imperfections (the bump map). The bump map (also known as a normal map) is an RGB map that defines the relief of the surface.

This can sound confusing in a text explanation. Just try to imagine this: as we always optimize everything in the 3D world, the bump map is a technique that stores the surface's deformations in a single image file. The Tangent Space is a set of vectors that allows us to parse the bump information for each fragment, independent of the mesh's rotation, position or scale. The following image illustrates the Tangent Space and its connection with the Texture Coordinates.

The Tangent Space is a set of 3 vectors.

Basically, the Tangent Vector points to where the "S" coordinate increases (S Tangent) and the Bitangent points to where the "T" coordinate increases (T Tangent).
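Here is a minimal sketch of how those three vectors are typically used in a Vertex Shader (the attribute and uniform names are hypothetical). Packing them into a matrix lets us move a vector, like the light direction, into Tangent Space, where it can later be compared against the normal map values:

    attribute vec4 aPosition;
    attribute vec3 aTangent;    // points to where S increases
    attribute vec3 aBitangent;  // points to where T increases
    attribute vec3 aNormal;

    uniform mat4 uMVPMatrix;
    uniform vec3 uLightDir;     // unit vector in object space

    varying vec3 vLightTS;      // light direction in Tangent Space

    void main()
    {
        // The columns are the Tangent Space axes; multiplying a vector
        // on the left projects it onto each axis, converting it from
        // object space into Tangent Space.
        mat3 tbn = mat3(aTangent, aBitangent, aNormal);
        vLightTS = uLightDir * tbn;

        gl_Position = uMVPMatrix * aPosition;
    }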

A single vertex can have more than one Tangent Space; in that case the vertex is split into two or more vertices sharing the same position. There are other important concepts in Tangent Space, but I'll not bother you with the details here. Just as with the Normal calculations, I'll leave this complex part to another article dedicated to the subject. The important thing here is for you to understand what bump maps are and why the Tangent Space is important for bump effects.



Texture Coordinates

This is the most common vector component for shaders. It's responsible for placing an image on a mesh's surface. The Texture Coordinate (texcoord for short) is defined per-vertex and is interpolated between vertices to achieve a per-fragment result. The texcoord is usually given in the range [0.0, 1.0], in the order [S, T], which are the normalized values of the [U, V] notation.
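As a tiny sketch, this is all a Fragment Shader needs to place an image on the surface (uDiffuseMap and vTexcoord are assumed names; vTexcoord arrives already interpolated):

    precision mediump float;

    uniform sampler2D uDiffuseMap;
    varying vec2 vTexcoord; // interpolated [S, T]

    void main()
    {
        // Sample the texture at this fragment's coordinate.
        gl_FragColor = texture2D(uDiffuseMap, vTexcoord);
    }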

The texcoord has more to do with the artistic work than with our code; usually the 3D applications are responsible for generating it. The texcoord directly affects how the texture image file should be created, I mean, the image of the texture must be painted based on the texcoord positions. Some 3D applications accept multiple texcoord channels. That can be useful in situations where multiple designers are working together, but it's not good for performance and optimization. There is nothing multiple texcoord channels can do that a single channel can't. So keep it simple: always try to work with a single texcoord channel.

The texcoord is very important for creating the Tangent Space. Multiple texcoord channels would require multiple Tangent Spaces as well. So, again, multiple channels are never a good idea.



Conclusion

OK, my friends, I don't want to make this first tutorial too long, so these are the basic concepts about shaders. Now you know how the shaders work, what their limitations are, where their power lies and what we need to know before entering the shaders' world.

In this tutorial you saw:

  • Shaders are responsible for all the visual results of our 3D world, including lights, shadows, reflections, refractions, etc.
  • There are four shading techniques in common use: Flat Shading, Gouraud Shading, Phong Shading and Bump Shading. OpenGL by default uses Phong Shading.
  • The values passed from the Vertex Shader to the Fragment Shader will always be interpolated.
  • We have 3 very important per-vertex vectors: Position, Normal and Texture Coordinate.
  • The Normal Vector plus 2 other vectors form the Tangent Space, a fundamental concept for producing Bump Shading and any other displacement technique (like Parallax Mapping).
  • Always try to use only one texcoord channel.

Well done!
My next article will not be the second part of this series; instead, it'll be a short article covering how to calculate the Normal Vector and the Tangent Space. We'll need to have those vectors exactly right before entering the real math inside the shaders' world.

If you have any doubts, just Tweet me.

See you soon!

© db-in 2014. All Rights Reserved.