
**Update 7b - Application**
The code for our fresnel effect is quite simple.
```
Shader "ShaderChallenge/FresnelSchlick"
{
    Properties
    {
        _Albedo("Albedo", Color) = (1.0, 1.0, 1.0, 1.0)
        _Fresnel("Fresnel", Range(1.0, 8.0)) = 5.0
        _IOR("IOR", Range(1.0, 5.0)) = 1.45
        _Metallic("Metallic", Range(0, 1)) = 0.0
    }
    SubShader
    {
        Pass
        {
            Tags { "LightMode" = "ForwardBase" }

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityStandardBRDF.cginc"

            float4 _Albedo;  // Albedo color value
            float _Fresnel;  // Fresnel strength (default is 5.0)
            float _IOR;      // Material's refractive index at 0 metallic (default is 1.45 from Blender)
            float _Metallic; // Metallic value determining how metallic the material is

            struct v2f
            {
                float4 pos : SV_POSITION;
                float3 normal : TEXCOORD1;
                float3 worldPos : TEXCOORD2;
            };

            v2f vert(appdata_base v)
            {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                // Convert vertex position to world space
                o.worldPos = mul(unity_ObjectToWorld, v.vertex);
                // Normal vector
                o.normal = v.normal;
                return o;
            }

            float3 FresnelSchlick(float3 n, float3 v)
            {
                // Dot product between normal and view vector
                float cosTheta = DotClamped(n, v);
                // Calculate F0 value
                float3 F0 = abs((1.0 - _IOR) / (1.0 + _IOR));
                F0 = pow(F0, 2.0);
                // Lerp between F0 and albedo color based on Metallic value (0 is dielectric, 1 is metallic)
                F0 = lerp(F0, _Albedo.rgb, _Metallic);
                // Final calculation
                return F0 + (1.0 - F0) * pow(1.0 - cosTheta, _Fresnel);
            }

            float4 frag(v2f i) : SV_TARGET
            {
                // Normal vector
                float3 n = normalize(i.normal);
                // Light vector
                float3 l = _WorldSpaceLightPos0.xyz;
                // Viewport (camera) vector
                float3 v = normalize(_WorldSpaceCameraPos - i.worldPos.xyz);
                // Get the result as an RGB value
                float3 result = FresnelSchlick(n, v);
                // Output the result with alpha being 1
                return float4(result, 1.0);
            }
            ENDCG
        }
    }
}
```
Now, you may have noticed something a bit different here compared to our previous shaders: I've put all the fresnel calculations in a separate function rather than calculating them in *frag()*. This makes the code easier to read, and the benefit will become apparent once we start adding all the parts and calculations needed for our final shader. Anyway, let's break down the code.
In *Properties{}*, we have four variables. *_Albedo* is going to act as a color, *_Fresnel* decides how strong the effect will be, *_IOR* is the material's refractive index and *_Metallic* determines whether our material is dielectric or metallic. I have set the default value for *_Fresnel* to 5 as that's what the original Schlick's approximation uses, and the *_IOR* value to 1.45 from Blender. I probably won't change those values much when making our material, but I like to give artists more control over how their material looks.
Over to *FresnelSchlick()*, where we set up two parameters: the normal and view vectors needed for *cosθ*. Remember to use *float3* because we are working with 3D vectors (x, y, z). We then calculate the F0 value based on the equations above, using a new built-in function called *abs()* to make sure the value is absolute (the negative sign is removed if the value is negative). After that, we come across another new function called *lerp()*. Note that our F0 value is a *float3*, meaning it's an RGB color. This F0 represents the color value when our material is dielectric, and the metal color (or albedo in this case) represents the color when it is metallic. We can transition between those two color values easily by lerping between them based on our *_Metallic* value; that's what the *lerp()* function does: it interpolates smoothly between two given values. Based on the range I've restricted *_IOR* to, our F0 RGB value will be between (0,0,0) when *_IOR* is 1 and roughly (0.44,0.44,0.44) when *_IOR* is 5. These values determine the amount of reflection a dielectric material gets: the brighter F0 is, the more reflection the material gets. We then put all the calculations together and return the result as a *float3* color value.
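To get a feel for what these equations produce numerically, here's a small Python sketch of the same F0 and Schlick calculations (plain Python rather than shader code; the function names are mine, purely for illustration):

```python
import math

def f0_from_ior(ior):
    """Base reflectivity F0 for a dielectric, from its index of refraction."""
    f0 = abs((1.0 - ior) / (1.0 + ior))
    return f0 ** 2.0

def fresnel_schlick(cos_theta, f0, power=5.0):
    """Schlick's approximation: F = F0 + (1 - F0) * (1 - cos(theta))^power."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** power

# F0 spans 0 at IOR 1 up to ~0.44 at IOR 5, matching the _IOR range above
print(round(f0_from_ior(1.0), 2))   # 0.0
print(round(f0_from_ior(5.0), 2))   # 0.44
print(round(f0_from_ior(1.45), 3))  # 0.034 (glass-like dielectric)

# Looking straight at the surface (cos = 1) gives just F0;
# at a fully grazing angle (cos = 0) the reflectance goes up to 1
print(fresnel_schlick(1.0, 0.04))   # 0.04
```

Note how small F0 is for a typical dielectric like glass (about 3-4%) compared to how bright the rim gets at grazing angles; that contrast is exactly the effect you see on the plane below.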
We display the fresnel effect in our *frag()* function by calling our fresnel function. Here are [three results](http://imgur.com/a/0zLwf) with me changing the values of *_Fresnel*, *_IOR* and *_Metallic*. I have added comments explaining what's going on in each of those results. The fresnel effect shows up as a white color, which will be replaced by reflections once we make the final PBR shader. You can see this effect in greater detail on a flat surface like the plane below. At a grazing angle you see more white/reflections, while looking straight at the surface diminishes them.
![](http://imgur.com/t2aauqo.gif)
So, that's the first part done for our PBR shader! Next, we will be focusing on diffuse, but we are not going to use our previous Lambertian model. Many game engines actually use it for their PBR materials, and I might switch back to it if the alternative turns out too computationally expensive. But for now we're going to work with the [Oren-Nayar model](https://en.wikipedia.org/wiki/Oren–Nayar_reflectance_model), which is more physically accurate.



**Update 8a - Theory**
Moving forward with our PBR series, we are now ready to tackle a more physically accurate diffuse lighting model: the [Oren-Nayar reflectance model](https://en.wikipedia.org/wiki/Oren–Nayar_reflectance_model). This model was developed by Michael Oren and Shree K. Nayar and is more accurate at depicting a wide range of rough materials such as concrete and plaster. It is a generalization of the standard Lambertian model, but it comes at the cost of being more computationally expensive. Because of this, someone from the Unity forum shared with me a rewritten version of the Oren-Nayar BRDF, called [van Ouwerkerk's rewrite of the Oren-Nayar BRDF](http://shaderjvo.blogspot.com.au/2011/08/van-ouwerkerks-rewrite-of-oren-nayar.html). The rewrite replaces the sine and tangent calculations with dot products, making it better optimized for modern GPUs.
With the Lambertian model, the brightness of the object's surface never changes regardless of our viewing direction. In reality, the surface should become brighter as our view direction approaches the light direction. This can be explained with V-cavities, which the Oren-Nayar BRDF model tries to represent. A V-cavity can be understood as a really tiny cavity consisting of two microfacets: small planar surfaces too small to be seen with the naked eye. We assume these microfacets are Lambertian in nature, and they produce three geometrical effects: shadowing, masking and interreflection. Interreflection is the reason the edges of rough surfaces are not fully dark, as they would be with Lambertian reflection.
![](http://i.imgur.com/JZsD3ki.png?1)
Now, it’s time to apply these concepts mathematically. Remember that I'm using van Ouwerkerk's rewritten version which is slightly different than the original OrenNayar equation.
![](http://i.imgur.com/lBn63DA.png)
![](http://i.imgur.com/ylmci39.png)
![](http://i.imgur.com/N8LWcHk.png)

The equation looks pretty daunting at first, so let's break it down. Firstly, *a^2* is the roughness value (0-1) raised to the power of two; users will be able to modify the roughness once we're done implementing it. We can replicate the equations for *A* and *B* easily in the shader. Now, I don't actually know what those terms contribute to the final result as I haven't studied the derivation in detail, so bear in mind that these equations are just something I can implement in the shader. If you want to learn the theory in more detail, check out the links I provided above and you'll probably end up understanding it better than I do right now.
Anyway, let's see what is inside the final equation. *L* is the dot product between the light and normal vectors, while *V* is the dot product between the view and normal vectors. *P* is the dot product between two special normalized vectors. The first vector is calculated as *l - n × L*, where *l* is the light vector and *n* is the normal vector. The second vector is *v - n × V*, where *v* is the view vector. After that's done, we finish by multiplying with a fraction where the numerator is the square root of *(1 - L^2) × (1 - V^2)* and the denominator is the larger of the two dot products, *L* or *V*.
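As a sanity check on these equations, here's a rough Python sketch of the rewrite (the vector helpers and the name `oren_nayar` are my own, not part of the shader). One nice property to verify: at roughness 0, *A* is 1 and *B* is 0, so the whole model collapses back to plain Lambert, i.e. the *n · l* dot product:

```python
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return [x - y for x, y in zip(a, b)]
def scale(a, s): return [x * s for x in a]

def normalize(a):
    length = math.sqrt(dot(a, a)) or 1.0  # guard against zero-length vectors
    return [x / length for x in a]

def oren_nayar(n, v, l, roughness):
    # A and B terms from the rewrite
    a2 = roughness ** 2.0
    A = 1.0 - 0.5 * (a2 / (a2 + 0.57))
    B = 0.45 * (a2 / (a2 + 0.09))
    # L and V: clamped dot products with the normal
    L = max(dot(n, l), 0.0)
    V = max(dot(n, v), 0.0)
    # P: dot product of light/view vectors projected onto the surface plane
    light_plane = normalize(sub(l, scale(n, L)))
    view_plane = normalize(sub(v, scale(n, V)))
    P = max(dot(light_plane, view_plane), 0.0)
    fraction = math.sqrt((1.0 - L * L) * (1.0 - V * V)) / max(L, V)
    return L * (A + B * P * fraction)

n = [0.0, 1.0, 0.0]
l = normalize([0.0, 1.0, 1.0])
v = normalize([1.0, 1.0, 0.0])
# At roughness 0 the model reduces to plain Lambert (n . l)
print(abs(oren_nayar(n, v, l, 0.0) - dot(n, l)) < 1e-9)  # True
```

This mirrors what the shader below computes per pixel; the scalar version is just easier to poke at.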
Using these equations, we can create our own Oren-Nayar BRDF shader. The final shader will have a color and a roughness property the user can modify. Changing the roughness creates the impression of the material being smooth at 0 and rough at 1.
***
**Summary:**
The Oren-Nayar BRDF is a generalization of the Lambertian model and depicts rough materials more correctly than Lambert does. Following real-life lighting more closely, it takes into account how the surface of a material becomes brighter as our view vector approaches the direction the light comes from. We need a roughness property to tell the shader how rough the material should be: smooth at 0 and rough at 1. The formula we're using is called *van Ouwerkerk's rewrite of the Oren-Nayar BRDF*, and as the name suggests, it's a rewritten version of the original Oren-Nayar formula. It removes the trigonometric instructions from the original and replaces them with much faster dot products.



How about you include a small summary below your posts where you keep things really simple? Kinda like you did with your last post. Just separate it from the rest and make it stand out. This way everyone could follow your progress without having to understand every formula and every detail.



> *Originally posted by **[Tulrog](/forums/4/topics/702427?page=2#11237302)**:*
> How about you include a small summary below your posts where you keep things really simple? Kinda like you did with your last post. Just separate it from the rest and make it stand out. This way everyone could follow your progress without having to understand every formula and every detail.
Not sure which last post you're referring to but I'll put a summary from now on. Thanks for the idea.



**Update 8b - Application**
```
Shader "ShaderChallenge/OrenNayar"
{
    Properties
    {
        _Color("Color", Color) = (1.0, 1.0, 1.0, 1.0)
        _Roughness("Roughness", Range(0.0, 1.0)) = 1.0
    }
    SubShader
    {
        Pass
        {
            Tags { "LightMode" = "ForwardBase" }

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityStandardBRDF.cginc"

            float4 _Color;
            float _Roughness;

            struct v2f
            {
                float4 pos : SV_POSITION;
                float3 normal : TEXCOORD1;
                float3 worldPos : TEXCOORD2;
            };

            v2f vert(appdata_base v)
            {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                o.worldPos = mul(unity_ObjectToWorld, v.vertex);
                o.normal = v.normal;
                return o;
            }

            // van Ouwerkerk's rewrite of the Oren-Nayar BRDF
            // http://shaderjvo.blogspot.com.au/2011/08/van-ouwerkerks-rewrite-of-oren-nayar.html
            float OrenNayar(float3 n, float3 v, float3 l)
            {
                // A and B calculations
                float a2 = pow(_Roughness, 2.0);
                float A = 1.0 - 0.5 * (a2 / (a2 + 0.57));
                float B = 0.45 * (a2 / (a2 + 0.09));

                // Replace trigonometric instructions with dot products
                float2 cosTheta = float2(DotClamped(n, l), DotClamped(n, v));
                float2 cosTheta2 = pow(cosTheta, 2.0);
                float sinTheta = sqrt((1.0 - cosTheta2.x) * (1.0 - cosTheta2.y));
                float3 lightPlane = normalize(l - n * cosTheta.x);
                float3 viewPlane = normalize(v - n * cosTheta.y);
                float P = DotClamped(lightPlane, viewPlane);

                // Put it all together
                float fraction = sinTheta / max(cosTheta.x, cosTheta.y);
                return cosTheta.x * (A + B * P * fraction);
            }

            float4 frag(v2f i) : SV_TARGET
            {
                // Normal vector
                float3 n = normalize(i.normal);
                // Light vector
                float3 l = _WorldSpaceLightPos0.xyz;
                // Viewport (camera) vector
                float3 v = normalize(_WorldSpaceCameraPos - i.worldPos.xyz);

                float3 diffuse = OrenNayar(n, v, l);
                // Add ambient lighting to the final result
                return float4(unity_AmbientSky.rgb + diffuse * _Color.rgb, 1.0);
            }
            ENDCG
        }
    }
}
```
Like before in our Fresnel shader, I'm putting all the calculations needed in its own function to keep it tidy and organized. Our *OrenNayar()* function will need three vectors that we can get from our fragment function.
In *Properties{}*, we have a color and a roughness property. You can change the color to an albedo texture if you want, but I'm gonna keep it simple here. The roughness property has a range of 0 to 1 that outside users can modify. Inside *OrenNayar()*, we first calculate *a^2*, *A* and *B*, which are needed for the final equation. Once that's done, we move on to the dot products.
Now, I'm going to introduce you to something I learned while researching this: you can calculate multiple things at once if you pack them into a vector. What I mean is, if you have a 2D floating-point vector (x, y), you can put one value in x and another in y, then access them with *VectorName.x* or *VectorName.y*. I thought that was pretty neat, and that's what I did for the next part. We have a 2D vector called *cosTheta* holding the two dot products needed for the shader: the dot product between the normal and light vectors in x, and the dot product between the normal and view vectors in y. After raising *cosTheta* to the power of 2, we access those dot products in the next part using *cosTheta2.x* and *cosTheta2.y*. There's also a new built-in function here called *sqrt()*, which returns the square root of the expression inside the brackets. Moving on, we calculate *P* from the equation by first normalizing two special vectors called *lightPlane* and *viewPlane*. The dot product calculations finish with the dot product between those two vectors.
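The packing trick above can be mimicked outside of shader code too. Here's a tiny plain-Python sketch of the idea (illustrative only; on the GPU the squaring really is a single vector operation, whereas here it's just a loop):

```python
# Pack the two dot products into one small vector, like
# float2(DotClamped(n, l), DotClamped(n, v)) in the shader
cos_theta = [0.8, 0.6]

# One pass squares both components, like pow(cosTheta, 2.0) on a float2
cos_theta2 = [c ** 2.0 for c in cos_theta]

# Access the components individually, like cosTheta2.x and cosTheta2.y
x, y = cos_theta2
print(x, y)
```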
The last part introduces another new built-in function called *max()*, which returns the greater of two given values. After putting all the parts together and returning the result, we call *OrenNayar()* in our fragment function. The final output returns a color value of *unity_AmbientSky + diffuse × _Color*. Unlike our previous Lambertian shader, I decided to add ambient lighting for this one, although you can do that for Lambert and any other shader too. You can modify the color value of *unity_AmbientSky* by going to Window > Lighting > Ambient Color (under the Scene tab). Adding ambient lighting keeps the material from having pitch-black shadows, which I think looks better.
![](http://imgur.com/CCORvcj.gif)
As you can see, the Oren-Nayar material has a brighter spot where the light meets the surface head-on. Lowering the roughness value makes the material brighter, which is a characteristic of smooth materials. In the next part, we're going to start working on the specular component for our final PBR shader. The specular component is complex, so I'm going to break it down into a few parts.



**Update 9**
The specular part of our PBR shader is going to be based on a microfacet model called the Torrance-Sparrow model. A lot of the shaders I've seen so far use a specular microfacet model called [Cook-Torrance](https://en.wikipedia.org/wiki/Specular_highlight#Cook.E2.80.93Torrance_model), and I tried implementing that myself in Unity, but the final result wasn't right no matter how many times I tried. So I decided to explore what Unity uses for their own Standard shader, and looking at their shader source code (which you can download [here](https://unity3d.com/get-unity/download/archive)), they use the Torrance-Sparrow model. Even so, I can't find any documentation or resources on the model online, but we're going to go with it and base all our calculations on what Unity used.
The specular microfacet model will need three parts:
**1. Normal Distribution Function**
The NDF determines how many microfacets are aligned with the halfway vector, based on a roughness value. When the roughness is low (smooth), a bright spot appears on the surface of the material because the microfacets essentially simulate a perfect mirror. A rough material, on the other hand, has lots of imperfect and jagged microfacets, causing light to scatter in all directions, which makes the highlight dimmer and spread over a larger area. The NDF we'll be using is called Trowbridge-Reitz GGX.
**2. Visibility/Geometry Function**
Because we're using the Torrance-Sparrow model, our second part is a Visibility function. The Cook-Torrance model uses something called a Geometry function, and because I can't find any resources on the Visibility function, I'm just going to explain what the Geometry function is (they're probably the same thing, just calculated differently). The function describes how many microfacets are blocked by other microfacets, causing the light rays to lose energy and create a shadowing effect. We'll be using the Schlick-Smith visibility function.
**3. Fresnel**
The last thing we need for the specular part is Fresnel. As some of you may already know, the Fresnel equation describes how much light is reflected off a surface depending on our view vector: more reflection becomes visible as we look at the surface at a more grazing angle. We've already done this part; as a reminder, we used Schlick's approximation for our fresnel effect.
We'll be combining all these parts for our specular shader. When that's done, we'll put down the basis for the final PBR shader, starting from adding the diffuse and specular parts up until we've implemented normal maps and Image-Based Lighting. There are still quite a few steps to go before we finish this PBR series, so I'm looking forward to that.
***
**Summary:**
The Torrance-Sparrow model will be the microfacet model used for our specular part. We're basing our calculations on Unity's own Standard shader. Our microfacet shader needs three parts: a Normal Distribution Function, a Visibility Function and Fresnel. We can combine these components to achieve the final specular result needed to complete our PBR shader.



**Update 10**
The first part we need for our PBR specular component is the Normal Distribution Function. This function describes the distribution of microfacets across the surface normal of the mesh. I've already explained what microfacets are a couple of times now, but as a reminder: if you looked at a surface under a microscope, even a smooth material wouldn't have a completely flat surface. It has tiny planes that can be assumed to have perfect specularity (a perfect mirror), and on a rougher material these planes are more jagged and uneven, causing light to reflect in all sorts of directions. This scattering of light determines how bright the specularity of the material will be, and that's what the NDF captures. Phong and Blinn-Phong are other models that describe this, but for PBR we're going to use Trowbridge-Reitz GGX.
![](http://i.imgur.com/g7Xz3Ng.png)

A pretty simple formula. Remember that *a* is the roughness value here, and *n* and *h* are the normal vector and halfway vector respectively. We can put this into shader code easily too.
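If you want a feel for the formula before putting it in a shader, here's a scalar Python sketch (the function name is mine, the math is the same). A handy check: at *n·h* = 1 the denominator collapses to *π × a^4*, so *D* = 1 / (*π × a^2*), which is why low roughness produces a tall, narrow peak:

```python
import math

def distribution_ggx(n_dot_h, roughness):
    """Trowbridge-Reitz GGX normal distribution term."""
    a2 = roughness ** 2.0
    den = n_dot_h ** 2.0 * (a2 - 1.0) + 1.0
    return a2 / (math.pi * den ** 2.0)

# Peak of the highlight (n.h = 1): D = 1 / (pi * a^2)
print(round(distribution_ggx(1.0, 0.1), 2))  # ~31.83 - tall, narrow highlight
print(round(distribution_ggx(1.0, 1.0), 2))  # ~0.32  - low, wide highlight
```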
```
Shader "ShaderChallenge/TrowbridgeReitzGGX"
{
    Properties
    {
        _Roughness("Roughness", Range(0, 1)) = 0
    }
    SubShader
    {
        Pass
        {
            Tags { "LightMode" = "ForwardBase" }

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityStandardBRDF.cginc"
            #define pi 3.14159265359

            float _Roughness;

            struct v2f
            {
                float4 pos : SV_POSITION;
                float3 normal : TEXCOORD1;
                float3 worldPos : TEXCOORD2;
            };

            v2f vert(appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                // Convert vertex position to world space
                o.worldPos = mul(unity_ObjectToWorld, v.vertex);
                // Normal vector
                o.normal = v.normal;
                return o;
            }

            // Trowbridge-Reitz GGX
            float DistributionGGX(float3 n, float3 h)
            {
                float a2 = pow(_Roughness, 2.0);
                float nh = DotClamped(n, h);
                float nh2 = pow(nh, 2.0);
                float num = a2;
                float den = (nh2 * (a2 - 1.0) + 1.0);
                den = pi * pow(den, 2.0);
                return num / den;
            }

            float4 frag(v2f i) : SV_TARGET
            {
                // Normalize the normal
                float3 n = normalize(i.normal);
                // Light vector from the mesh's surface
                float3 l = normalize(_WorldSpaceLightPos0.xyz);
                // Viewport (camera) vector from the mesh's surface
                float3 v = normalize(_WorldSpaceCameraPos - i.worldPos.xyz);
                // Halfway vector
                float3 h = normalize(l + v);

                // Trowbridge-Reitz GGX result
                float3 result = DistributionGGX(n, h);
                // Output the result
                return float4(result, 1.0);
            }
            ENDCG
        }
    }
}
```
The calculations are in the *DistributionGGX()* function. There's nothing new here, so hopefully everything is straightforward. Looking at the output visually, the highlight on the material's surface becomes brighter and narrower as the roughness value approaches 0, and dimmer and more spread out as roughness approaches 1; this effect simulates the scattering of light.
![](http://imgur.com/PADpYEF.gif)
We're going to focus on the Visibility function next.
***
Since I updated to Unity 5.6.1f1, it seems you can now replace *mul(UNITY_MATRIX_MVP, v.vertex)* with *UnityObjectToClipPos(v.vertex)*. Unity should automatically upgrade this in your own shader code, but I'll leave this here as a heads up.



**Update 11**
The Visibility (or Geometry) function describes the ratio of microfacets that occlude other microfacets, and light bounces off multiple microfacets before leaving the surface. Because of this, light energy is lost in the process, and this is an important part of the energy conservation we observe in the real world. Without this function, the surface may reflect more light than physics allows. Fortunately, the calculation for our Visibility function is as simple as our Normal Distribution Function. We're using a function called Schlick-Smith, and mathematically it looks like this.
![](http://i.imgur.com/IyjLM4K.png)

It should be clear by now that *n*, *v* and *l* correspond to the normal, view and light vectors of our shader. The one we don't know is *k*, which represents *Roughness × Roughness × 0.5*. Let's put this formula into code.
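Numerically the term is easy to play with. Here's a small Python sketch (function name mine) showing that with roughness 0 (so *k* = 0) it reduces to 0.25 / ((*n·v*)(*n·l*)), and that increasing the roughness pulls the value down, i.e. more shadowing and masking:

```python
def visibility_smith(n_dot_v, n_dot_l, roughness):
    """Schlick-Smith visibility term with k = roughness^2 * 0.5."""
    k = roughness ** 2.0 * 0.5
    V = n_dot_v * (1.0 - k) + k
    L = n_dot_l * (1.0 - k) + k
    return 0.25 / (V * L)

# Roughness 0: nothing is shadowed/masked, so this is 0.25 / ((n.v)(n.l))
print(visibility_smith(0.5, 0.5, 0.0))            # 1.0
# Roughness 1 darkens the same configuration
print(round(visibility_smith(0.5, 0.5, 1.0), 3))  # 0.444
```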
```
Shader "ShaderChallenge/SchlickSmith"
{
    Properties
    {
        _Roughness("Roughness", Range(0, 1)) = 0
    }
    SubShader
    {
        Pass
        {
            Tags { "LightMode" = "ForwardBase" }

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityStandardBRDF.cginc"
            #define pi 3.14159265359

            float _Roughness;

            struct v2f
            {
                float4 pos : SV_POSITION;
                float3 normal : TEXCOORD1;
                float3 worldPos : TEXCOORD2;
            };

            v2f vert(appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                // Convert vertex position to world space
                o.worldPos = mul(unity_ObjectToWorld, v.vertex);
                // Normal vector
                o.normal = v.normal;
                return o;
            }

            // Schlick-Smith
            float VisibilitySmith(float3 n, float3 v, float3 l)
            {
                float k = pow(_Roughness, 2.0) * 0.5;
                float V = DotClamped(n, v) * (1.0 - k) + k;
                float L = DotClamped(n, l) * (1.0 - k) + k;
                return 0.25 / (V * L);
            }

            float4 frag(v2f i) : SV_TARGET
            {
                // Normalize the normal
                float3 n = normalize(i.normal);
                // Light vector from the mesh's surface
                float3 l = normalize(_WorldSpaceLightPos0.xyz);
                // Viewport (camera) vector from the mesh's surface
                float3 v = normalize(_WorldSpaceCameraPos - i.worldPos.xyz);

                // Schlick-Smith visibility result
                float3 result = VisibilitySmith(n, v, l);
                // Output the result
                return float4(result, 1.0);
            }
            ENDCG
        }
    }
}
```
Again, the calculations are all put together neatly under the *VisibilitySmith()* function and there should be nothing new here that I have to point out. So let's just see what this shader visually looks like.
![](http://imgur.com/zxiiNHj.gif)
Compared to previous shaders, the result may look odd, but once we put everything together as one big shader it should look visually correct. For the next update, we're going to combine the three components needed for the specular part of our PBR shader: the Normal Distribution Function, the Visibility Function and Fresnel. After that, we'll work on adding the diffuse and specular components together, which will serve as the backbone for the final PBR shader. The final shader will have a [normal map](https://en.wikipedia.org/wiki/Normal_mapping) component that adds 'bumps and scratches' to the surface of the object, and [IBL](https://en.wikipedia.org/wiki/Image-based_lighting) (Image-Based Lighting), which essentially captures the surrounding environment to illuminate the objects in the scene. That way, you can have reflections on the surface of smooth objects. We still have quite a way to go before we finalize our PBR shader, but we're getting closer, so look forward to that.



Learning shader programming has been on my list for quite some time now, especially when working in Unity, considering that it does not have the best post-processing effects built in. I haven't really found the time to work on it though, since I'm now working full-time and finishing up my final year in college. Instead, I've been relying on the Asset Store for such effects. I will be keeping a close eye on this thread for valuable information.


