Let's look at how we can make things more realistic with directional lighting. This video is quite complex, so please take your time and pause or rewind as you need to. Let's start by going back to our light and illumination model. If we know the direction of the light source and the direction the surface of the object is facing, which is the normal of the surface, we can take the dot product of the two vectors, which gives us the cosine of the angle between them. This cosine can then be used to calculate the illumination intensity reflected from the light source. If we know which direction the surface of the 3D object is facing, we can find the amount of illumination on the surface based on the cosine of the angle theta. The intensity will equal one if the light is pointing directly at the surface, and minus one if the light source is pointing in the directly opposite direction. According to Lambert's law, the reflected energy from a surface is proportional to the cosine of the angle between the direction of the incoming light and the normal of the surface. Therefore, the intensity of the light reflected from the surface of the object depends on the angle of the incoming light. So let's look at how we can calculate this. Let l be the unit vector of the light source direction, n the unit vector of the surface normal direction, and theta the angle between the normal vector and the light source direction vector. Cosine theta is then the dot product of these two vectors. We can calculate the diffuse reflection intensity by multiplying this dot product by L_d, which is the diffuse light source color, and K_d, which is the diffuse coefficient. When comparing the point light source to the diffuse reflection, the diffuse reflection looks more realistic. This is because it considers the direction of the light source and provides a more realistic illumination of a 3D object.
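The diffuse term described above can be sketched numerically. This is a minimal Python illustration, not the lecture's actual shader code; the function names are my own, and I clamp negative cosines to zero, which is the common convention for surfaces facing away from the light.

```python
def normalize(v):
    """Scale a vector to unit length."""
    length = sum(x * x for x in v) ** 0.5
    return tuple(x / length for x in v)

def dot(u, v):
    """Dot product of two vectors."""
    return sum(a * b for a, b in zip(u, v))

def diffuse_intensity(light_dir, normal, L_d, K_d):
    """Lambert's law: I = L_d * K_d * cos(theta), cos(theta) = dot(n, l).

    Negative values (light behind the surface) are clamped to zero,
    a common convention for diffuse shading.
    """
    l = normalize(light_dir)
    n = normalize(normal)
    cos_theta = max(dot(n, l), 0.0)
    return L_d * K_d * cos_theta

# Light pointing straight along the normal: cos(theta) = 1
print(diffuse_intensity((0, 0, 1), (0, 0, 1), 1.0, 0.8))  # 0.8
```

A light at right angles to the normal gives cos(theta) = 0 and hence no diffuse contribution, which matches the intuition in the lecture.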
Apart from the light source direction, we also need to consider the attenuation of the light. The light intensity should attenuate when the light source is further away from the object. The light intensity attenuation is inversely proportional to the distance between the light source and the object surface, where q is the distance. To avoid division by zero, I add a constant 1 to the denominator of the equation, and a coefficient b to control how fast the light intensity attenuates. However, linear reduction does not look realistic, as light actually attenuates quite rapidly, so we add a quadratic term to the denominator with a coefficient c. This equation models the attenuation of the light based on its distance from the object. The diffuse reflection equation can then be modified to take the light attenuation into account. Instead of a constant 1 to avoid division by zero, I replace it with a coefficient a. Hence, the variables a, b, and c are coefficients for adjusting the light attenuation. In order to calculate directional light reflection, we need to find the unit normal vector of the surface. For a triangle with vertices P_1, P_2, and P_3, we can define two direction vectors u and v. The vector u is the vector from P_1 to P_2, and the vector v is the vector from P_1 to P_3. The normal vector of the surface is the cross product of the two vectors u and v. The unit normal vector can then be calculated by dividing the vector by its length, which is the square root of x squared plus y squared plus z squared. What do you think is the surface normal vector of a vertex on the surface of a sphere centered at the origin? For any point on the surface of such a sphere, say p = (x, y, z), the normal vector at that point is just (x, y, z). How about the surface normal for an arbitrary 3D object like this? This 3D object is constructed with two spheres linked by a polygon object.
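The attenuation factor and the triangle normal can both be sketched in a few lines of Python. Again this is an illustration rather than the program's actual code; the default coefficients 1, 0.35, and 0.44 are the ones used later in the lecture.

```python
def attenuation(q, a=1.0, b=0.35, c=0.44):
    """Attenuation factor 1 / (a + b*q + c*q^2).

    a avoids division by zero at q = 0; b and c control how fast
    the light falls off linearly and quadratically with distance q.
    """
    return 1.0 / (a + b * q + c * q * q)

def cross(u, v):
    """Cross product of two 3D vectors."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def unit_normal(p1, p2, p3):
    """Unit normal of the triangle (P1, P2, P3)."""
    u = tuple(b - a for a, b in zip(p1, p2))  # vector P1 -> P2
    v = tuple(b - a for a, b in zip(p1, p3))  # vector P1 -> P3
    n = cross(u, v)
    length = sum(x * x for x in n) ** 0.5     # sqrt(x^2 + y^2 + z^2)
    return tuple(x / length for x in n)

# A triangle lying in the z = 0 plane has unit normal (0, 0, 1)
print(unit_normal((0, 0, 0), (1, 0, 0), (0, 1, 0)))
```

Note that at q = 0 the attenuation factor is exactly 1/a, so the light is undimmed at the surface and falls off smoothly from there.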
We know that the normal vector for each vertex of a sphere is basically the vertex itself, but how about the cones joining the two spheres together? As we construct the cones in a similar way to the spheres, the normal vector for each vertex is, again, basically the vertex itself. Let's look at how we can implement these normal vectors for the arbitrary 3D object. I create a buffer for each section of the 3D object for storing the normal vectors. SphereNormal and Sphere2Normal are defined for the normal vectors of the two spheres, and ringNormal is defined for the normal vectors of the cones joining the two spheres together. I have also added the variables diffuselightlocation, diffusecolor, and attenuation for setting the light source location, the light color, and the attenuation coefficients, respectively. For the two spheres, the normal vectors are just the vertices. Similarly, for the cones joining the two spheres together, the normal vectors are also just the vertices. Let's look at the vertex shader. In the vertex shader, I've defined an attribute variable aVertexNormal for getting the normal vectors. The uniform variables uDiffuseLightLocation, uDiffuseColor, and uAttenuation are defined for getting the diffuse light source location, the color of the light source, and the attenuation coefficients. The varying variables vDiffuseLightWeighting and vDiffuseColor are then defined to pass the illumination weighting and the diffuse light color to the fragment shader. To calculate the diffuse reflection, I first calculate the light source direction from the vertex and the normal vector. The function normalize is used to calculate the unit vectors. Then, I use the length function to calculate the distance between the light source and the vertex, which is then used to calculate the attenuation factor. The diffuse reflection weighting is then found by calculating the dot product between the normal vector and the diffuse light direction and multiplying that by the attenuation factor.
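Putting the pieces together, the per-vertex calculation described for the vertex shader can be mirrored in Python. The real code is GLSL; this sketch uses illustrative names and the attenuation coefficients quoted later in the video, and stands in for the shader's normalize, length, and dot built-ins.

```python
def vertex_diffuse_weighting(vertex, normal, light_loc,
                             atten=(1.0, 0.35, 0.44)):
    """Mirror of the vertex-shader math: attenuated Lambert weighting.

    vertex, normal, light_loc are 3D tuples; atten holds the
    coefficients (a, b, c) of the attenuation equation.
    """
    # Direction from the vertex to the light source (GLSL: lightLoc - pos)
    to_light = tuple(l - p for p, l in zip(vertex, light_loc))
    # Distance between light source and vertex (GLSL: length())
    dist = sum(x * x for x in to_light) ** 0.5
    # Unit light direction and unit normal (GLSL: normalize())
    l = tuple(x / dist for x in to_light)
    n_len = sum(x * x for x in normal) ** 0.5
    n = tuple(x / n_len for x in normal)
    # Attenuation factor 1 / (a + b*d + c*d^2)
    a, b, c = atten
    att = 1.0 / (a + b * dist + c * dist * dist)
    # Attenuated diffuse weighting: att * max(dot(n, l), 0)
    return att * max(sum(ni * li for ni, li in zip(n, l)), 0.0)
```

For example, a vertex at the origin with normal (0, 0, 1) lit from (0, 0, 2) gives a distance of 2 and a weighting of 1/(1 + 0.7 + 1.76) ≈ 0.289.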
The vDiffuseLightWeighting and vDiffuseColor passed by the vertex shader are then multiplied together in the fragment shader to calculate the diffuse illumination color. The diffuse color is then added to the vertex color to draw the fragments of the object. In the application program, I then create the buffers for storing and passing the normal vectors of each section of the arbitrary 3D object to the shaders. The normalBuffer and normal2Buffer are the buffers for the normal vectors of the two spheres, and the ring_normalBuffer is the buffer for the normal vectors of the cones joining the two spheres together. I then set mNormalHandle to point to the attribute variable aVertexNormal, and diffuseLightLocationHandle, diffuseColorHandle, and attenuateHandle to point to the corresponding uniform variables. The diffuselightlocation is then set to (3, 2, 2), and the diffusecolor is set to (1, 1, 1, 1), which is the color white. The attenuation coefficients are set to 1, 0.35, and 0.44. In the draw function, the values of the uniform variables diffuseLightLocation, diffuseColor, and the attenuation coefficients are passed to the shaders. The normal vectors of each section of the object are then passed to the attribute variable aVertexNormal through mNormalHandle. Please note that a different normal vector buffer is passed to mNormalHandle before drawing each section of the object. Let's implement this in the example program. We'll first add the attribute and uniform variables in our vertex shader, and then calculate the unit vector of the diffuse light direction, the unit normal vector, and the attenuation of the light source. We use these to calculate the diffuse light weighting, and pass it to the fragment shader to calculate the new fragment color. We then add the buffers for storing the normal vectors for the different sections of the object. As mentioned, the normal vector of a vertex on the sphere surface is just the vertex itself.
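The fragment-stage step is simple enough to sketch as well. This is an illustration with made-up names, not the actual GLSL: the interpolated weighting scales the diffuse color, which is added to the vertex color, and the result is clamped so channels stay in the displayable [0, 1] range.

```python
def fragment_color(vertex_color, diffuse_color, weighting):
    """Combine the vertex color with the weighted diffuse color.

    Each channel is vertex + weighting * diffuse, clamped to 1.0
    so bright lights cannot push a channel past full intensity.
    """
    return tuple(min(vc + weighting * dc, 1.0)
                 for vc, dc in zip(vertex_color, diffuse_color))

# Dark grey vertex color, white diffuse light at half weighting
print(fragment_color((0.25, 0.25, 0.25), (1.0, 1.0, 1.0), 0.5))  # (0.75, 0.75, 0.75)
```

With a weighting of zero the fragment keeps its original vertex color, which is why surfaces facing away from the light stay unlit rather than turning black.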
And similarly, for the cones joining the two spheres together, the normal vectors are also just the vertices. We then define the buffers for passing the attribute variables to the shaders, and define the handles for pointing to the attribute and uniform variables. In the draw function, we first pass the values to the uniform variables, and then pass the unit normal vectors to the attribute variable aVertexNormal, doing so for each section of the object. When you run the program, you will see the diffuse reflection effect on our 3D object. In the next video, we'll look at another type of reflection called ambient reflection.