

**Update 7b - Application**
The code for our Fresnel effect is quite simple.
```
Shader "ShaderChallenge/FresnelSchlick"
{
Properties
{
_Albedo("Albedo", Color) = (1.0, 1.0, 1.0, 1.0)
_Fresnel("Fresnel", Range(1.0, 8.0)) = 5.0
_IOR("IOR", Range(1.0, 5.0)) = 1.45
_Metallic("Metallic", Range(0, 1)) = 0.0
}
SubShader
{
Pass
{
Tags { "LightMode" = "ForwardBase" }
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityStandardBRDF.cginc"
float4 _Albedo; // Albedo color value
float _Fresnel; // Fresnel strength (default is 5.0)
float _IOR; // Material's refractive index at 0 metallic (default is 1.45 from Blender)
float _Metallic; // Metallic value determining how metallic the material is
struct v2f
{
float4 pos : SV_POSITION;
float3 normal : TEXCOORD1;
float3 worldPos : TEXCOORD2;
};
v2f vert(appdata_base v)
{
v2f o;
o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
// Convert vertex position to world space
o.worldPos = mul(unity_ObjectToWorld, v.vertex);
// Normal vector
o.normal = v.normal;
return o;
}
float3 FresnelSchlick(float3 n, float3 v)
{
// Dot product between normal and view vector
float cosTheta = DotClamped(n, v);
// Calculate F0 value
float3 F0 = abs((1.0 - _IOR) / (1.0 + _IOR));
F0 = pow(F0, 2.0);
// Lerp between F0 and albedo color based on Metallic value (0 is dielectric, 1 is metallic)
F0 = lerp(F0, _Albedo.rgb, _Metallic);
// Final calculation
return F0 + (1.0 - F0) * pow(1.0 - cosTheta, _Fresnel);
}
float4 frag(v2f i) : SV_TARGET
{
// Normal vector
float3 n = normalize(i.normal);
// Light vector
float3 l = _WorldSpaceLightPos0.xyz;
// Viewport(camera) vector
float3 v = normalize(_WorldSpaceCameraPos - i.worldPos.xyz);
// Get the result as an RGB value
float3 result = FresnelSchlick(n, v);
// Output the result with alpha being 1
return float4(result, 1.0);
}
ENDCG
}
}
}
```
Now, you may have noticed something a bit different here compared to our previous shaders: I have put all the Fresnel calculations in a separate function rather than calculating them in *frag()*. This makes the code easier to read, and the benefit will become apparent once we start adding all the parts and calculations needed for our final shader. Anyway, let's break down the code.
In *Properties{}*, we have four variables. *_Albedo* acts as a color, *_Fresnel* decides how strong the effect will be, *_IOR* is the material's refractive index and *_Metallic* determines whether our material is dielectric or metallic. I have set the default value of *_Fresnel* to 5, as that's what the original Schlick's approximation uses, and the *_IOR* value to 1.45, which is Blender's default. I probably won't change those values much when making our material, but I like to give artists more control over how their material looks.
Over in *FresnelSchlick()*, we set up two parameters: the normal and view vectors needed for *cosθ*. Remember to use *float3* because we are working with 3D vectors (x, y, z). We then calculate the F0 value based on the equations above, using a new built-in function called *abs()* to make sure the value is absolute (the negative sign is removed if the value is negative). After that, we come across another new function called *lerp()*. Note that our F0 value is a *float3*, meaning it's an RGB color. This F0 represents the color value when our material is dielectric, and the metal color (or albedo in this case) represents the color when it is metallic. We can transition between those two colors based on our *_Metallic* value, and that's exactly what *lerp()* does: it interpolates smoothly between two given values. Based on the range I've restricted *_IOR* to, our F0 RGB value will be (0, 0, 0) when *_IOR* is 1 and roughly (0.44, 0.44, 0.44) when *_IOR* is 5. These values determine the amount of reflection a dielectric material gets: the brighter F0 is, the more reflection there is head on. We then put all the calculations together and return the result as a float3 color value.
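If you want to play with this math outside Unity, here's a quick plain-Python sketch of the same F0 and Schlick calculations. The function names are my own, not anything from the shader, and this is just a sanity check of the formulas, not an implementation you'd ship:

```python
def f0_from_ior(ior):
    """Base reflectance F0 for a dielectric: ((1 - IOR) / (1 + IOR))^2."""
    f = abs((1.0 - ior) / (1.0 + ior))
    return f ** 2

def lerp(a, b, t):
    """Same behavior as Cg's lerp(): linear interpolation between a and b."""
    return a + (b - a) * t

def fresnel_schlick(cos_theta, f0, power=5.0):
    """Schlick's approximation: F0 + (1 - F0) * (1 - cosTheta)^power."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** power

# An IOR of 1 gives F0 = 0, and an IOR of 5 gives roughly 0.44,
# matching the range mentioned above.
print(f0_from_ior(1.0), round(f0_from_ior(5.0), 2))
# At a grazing angle (cosTheta near 0) the reflectance approaches 1.
print(round(fresnel_schlick(0.0, f0_from_ior(1.45)), 2))
```

Running this confirms the endpoints of the *_IOR* range quoted above without having to step through the shader in a debugger.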
We display the Fresnel effect in our *frag()* function by calling our Fresnel function. Here are [three results](http://imgur.com/a/0zLwf) where I changed the values of *_Fresnel*, *_IOR* and *_Metallic*; I have added comments explaining what's going on in each of those results. The Fresnel effect shows up as a white color, which will be replaced by reflections once we make the final PBR shader. You can see this effect in greater detail on a flat surface like the plane below. At a grazing angle you see more white/reflections, while looking straight at the surface diminishes them.
![](http://imgur.com/t2aauqo.gif)
So, that's the first part done for our PBR shader! Next, we will focus on diffuse, but we are not going to use our previous Lambertian model. Many game engines actually use Lambert for their PBR materials, and I might switch back to it if the alternative is too computationally expensive. For now, though, we're going to work with the [Oren-Nayar model](https://en.wikipedia.org/wiki/Oren–Nayar_reflectance_model), which is more physically accurate.



**Update 8a - Theory**
Moving forward with our PBR series, we are now ready to tackle a more physically accurate diffuse lighting model, the [Oren-Nayar reflectance model](https://en.wikipedia.org/wiki/Oren–Nayar_reflectance_model). This model was developed by Michael Oren and Shree K. Nayar and is more accurate at depicting a wide range of rough materials such as concrete. It is a generalization of the standard Lambertian model, but it comes with the cost of being more computationally expensive. Because of this, someone from the Unity forum shared with me a rewritten version of the Oren-Nayar BRDF called [van Ouwerkerk's rewrite of the Oren-Nayar BRDF](http://shaderjvo.blogspot.com.au/2011/08/van-ouwerkerks-rewrite-of-oren-nayar.html). The rewrite replaces the sin and tan calculations with dot products, making it more optimized for modern GPUs.
With the Lambertian model, the brightness of the object's surface never changes regardless of our viewing direction. In reality this is not the case: the object's surface should become brighter as our view direction approaches the light direction. This can be explained with V-cavities, which the Oren-Nayar BRDF model tries to represent. A V-cavity can be understood as a really tiny cavity consisting of two microfacets, which are small planar surfaces too small to be seen with the naked eye. We assume that these microfacets are Lambertian in nature, and they produce three geometrical effects: shadowing, masking and interreflection. Interreflection is the reason why the edges of rough surfaces are not fully dark, as they would be with Lambertian reflection.
![](http://i.imgur.com/JZsD3ki.png?1)
Now, it's time to apply these concepts mathematically. Remember that I'm using van Ouwerkerk's rewritten version, which is slightly different from the original Oren-Nayar equation.
![](http://i.imgur.com/lBn63DA.png)
![](http://i.imgur.com/ylmci39.png)
![](http://i.imgur.com/N8LWcHk.png)The equation looks pretty daunting at first, but let's break it down. First, *a^2* is the roughness value (0 to 1) raised to the power of two, and users will be able to modify the roughness value once we're done implementing it. We can replicate the equations for *A* and *B* easily in the shader. Now, I don't actually know what those terms contribute to the final result as I haven't studied the derivation in detail, so bear in mind that these equations are simply something I can implement in the shader. If you want to learn the theory in more detail, check out the links I provided above and you'll probably be able to understand it better than I do right now.
Anyway, let's see what is inside the final equation. *L* is the dot product between the light and normal vectors, while *V* is the dot product between the view and normal vectors. *P* is the dot product between two special normalized vectors. The first vector is calculated as *l - n × L*, where *l* is the light vector and *n* is the normal vector. The second vector is *v - n × V*, where *v* is the view vector. After that's done, we finish it off by multiplying by a fraction where the numerator is the square root of *(1 - L^2) × (1 - V^2)* and the denominator is the bigger of the two dot products, *L* or *V*.
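To see how the rewrite comes together numerically, here's a small plain-Python sketch. The function name is my own, the inputs *L*, *V* and *P* are the clamped dot products described above, and the 0.57/0.09 constants are the same ones used in the shader later in this thread:

```python
import math

def oren_nayar(L, V, P, roughness):
    """van Ouwerkerk's rewrite:
    L * (A + B * P * sqrt((1 - L^2) * (1 - V^2)) / max(L, V))."""
    a2 = roughness ** 2
    A = 1.0 - 0.5 * (a2 / (a2 + 0.57))
    B = 0.45 * (a2 / (a2 + 0.09))
    fraction = math.sqrt((1.0 - L ** 2) * (1.0 - V ** 2)) / max(L, V)
    return L * (A + B * P * fraction)

# At roughness 0, A = 1 and B = 0, so the model collapses to plain Lambert:
# the result is just L, the light/normal dot product.
print(oren_nayar(0.5, 0.8, 1.0, 0.0))
```

This also shows why Oren-Nayar is a generalization of Lambert: setting the roughness to 0 reproduces the Lambertian term exactly.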
Using these equations, we can create our own Oren-Nayar BRDF shader. The final shader will have a color and a roughness property that users can modify. Changing the roughness creates the impression of the material being smooth at 0 and rough at 1.
***
**Summary:**
The Oren-Nayar BRDF is a generalization of the Lambertian model and depicts rough materials more accurately than Lambert does. Since it follows real-life lighting more closely, it takes into account how the surface of the material becomes brighter as our view vector approaches the direction the light comes from. We need a roughness property to tell the shader how rough the material should be: the material will be smooth at a roughness of 0 and rough at 1. The formula we're using is called *van Ouwerkerk's rewrite of the Oren-Nayar BRDF* and, as the name suggests, it's a rewritten version of the original Oren-Nayar formula. It removes the trigonometric instructions from the original and replaces them with much faster dot products.



How about you include a small summary below your posts where you keep things really simple? Kinda like you did with your last post. Just separate it from the rest and make it stand out. This way everyone could follow your progress without having to understand every formula and every detail.



> *Originally posted by **[Tulrog](/forums/4/topics/702427?page=2#11237302)**:*
> How about you include a small summary below your posts where you keep things really simple? Kinda like you did with your last post. Just separate it from the rest and make it stand out. This way everyone could follow your progress without having to understand every formula and every detail.
Not sure which last post you're referring to but I'll put a summary from now on. Thanks for the idea.



**Update 8b - Application**
```
Shader "ShaderChallenge/OrenNayar"
{
Properties
{
_Color("Color", Color) = (1.0, 1.0, 1.0, 1.0)
_Roughness("Roughness", Range(0.0, 1.0)) = 1.0
}
SubShader
{
Pass
{
Tags { "LightMode" = "ForwardBase" }
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityStandardBRDF.cginc"
float4 _Color;
float _Roughness;
struct v2f
{
float4 pos : SV_POSITION;
float3 normal : TEXCOORD1;
float3 worldPos : TEXCOORD2;
};
v2f vert(appdata_base v)
{
v2f o;
o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
o.worldPos = mul(unity_ObjectToWorld, v.vertex);
o.normal = v.normal;
return o;
}
// van Ouwerkerk's rewrite of the Oren-Nayar BRDF
// http://shaderjvo.blogspot.com.au/2011/08/van-ouwerkerks-rewrite-of-oren-nayar.html
float OrenNayar(float3 n, float3 v, float3 l)
{
// A and B calculations
float a2 = pow(_Roughness, 2.0);
float A = 1.0 - 0.5 * (a2 / (a2 + 0.57));
float B = 0.45 * (a2 / (a2 + 0.09));
// Replace trigonometric instructions with dot products
float2 cosTheta = float2(DotClamped(n, l), DotClamped(n, v));
float2 cosTheta2 = pow(cosTheta, 2.0);
float sinTheta = sqrt((1.0 - cosTheta2.x) * (1.0 - cosTheta2.y));
float3 lightPlane = normalize(l - n * cosTheta.x);
float3 viewPlane = normalize(v - n * cosTheta.y);
float P = DotClamped(lightPlane, viewPlane);
// Put it all together
float fraction = sinTheta / max(cosTheta.x, cosTheta.y);
return cosTheta.x * (A + B * P * fraction);
}
float4 frag(v2f i) : SV_TARGET
{
// Normal vector
float3 n = normalize(i.normal);
// Light vector
float3 l = _WorldSpaceLightPos0.xyz;
// Viewport(camera) vector
float3 v = normalize(_WorldSpaceCameraPos - i.worldPos.xyz);
float3 diffuse = OrenNayar(n, v, l);
// Add ambient lighting to the final result
return float4(unity_AmbientSky + diffuse * _Color, 1.0);
}
ENDCG
}
}
}
```
As in our Fresnel shader, I'm putting all the calculations in their own function to keep things tidy and organized. Our *OrenNayar()* function needs three vectors that we can get from our fragment function.
In *Properties{}*, we have a color and a roughness property. You can change the color to an albedo texture if you want, but I'm going to keep it simple here. The roughness property has a range of 0 to 1 that users can modify. Inside *OrenNayar()*, we first calculate *a^2*, *A* and *B*, which are needed for the final equation. Once that's done, we move on to the dot products.
Now, I'm going to introduce something I learned while researching this: you can calculate multiple things at once if you pack them into a vector. What I mean is, if you have a 2D floating-point vector (x, y), you can put one value in x and another in y, then access them with *VectorName.x* or *VectorName.y*. I thought that was pretty neat, and that's what I did for the next part. We have a 2D vector called *cosTheta*, and inside it are the two dot products needed for the shader: the dot product between the normal and light vectors in x, and the dot product between the normal and view vectors in y. After raising *cosTheta* to the power of 2, we access those dot products in the next part using *cosTheta2.x* and *cosTheta2.y*. There's also a new built-in function here called *sqrt()*, which returns the square root of the expression inside the brackets. Moving on, we calculate *P* from the equation by first normalizing two special vectors called *lightPlane* and *viewPlane*. The dot product calculations finish off with the dot product between those two vectors.
The last part introduces another new built-in function called *max()*, which returns the greater of two given numbers. After putting all the parts together and returning the result, we call *OrenNayar()* in our fragment function. The final output returns a color value of *unity_AmbientSky + diffuse × _Color*. Unlike our previous Lambertian shader, I decided to add ambient lighting to this one, although you can do that for Lambert and any other shader too. You can modify the color of *unity_AmbientSky* by going to Window > Lighting > Ambient Color (under the Scene tab). Adding ambient lighting keeps the material's shadows from being pitch black, which I think looks better.
![](http://imgur.com/CCORvcj.gif)
As you can see, the Oren-Nayar material has a brighter spot where the light meets the surface head on. Lowering the roughness value makes the material brighter, which is a characteristic of smooth materials. In the next part, we're going to start working on the specular component for our final PBR shader. The specular component is complex, so I'm going to break it down into a few parts.



**Update 9**
The specular part of our PBR shader is going to be based on a microfacet model called the Torrance-Sparrow model. A lot of the shaders I've seen so far use a specular microfacet model called [Cook-Torrance](https://en.wikipedia.org/wiki/Specular_highlight#Cook.E2.80.93Torrance_model), and I tried implementing that myself in Unity, but the final result wasn't right no matter how many times I tried. So I decided to explore what Unity uses for its own Standard shader, and looking at the shader source code (which you can download [here](https://unity3d.com/get-unity/download/archive)), they use the Torrance-Sparrow model. Even so, I can't find much documentation on the model online, but we're going to go with it and base all our calculations on what Unity uses.
The specular microfacet model will need three parts:
**1. Normal Distribution Function**
The NDF determines how many microfacets are aligned with the halfway vector, based on a roughness value. When the roughness is low (smooth), a bright spot appears on the surface of the material because the microfacets essentially simulate a perfect mirror. A rough material, on the other hand, has lots of imperfect, jagged microfacets, causing light to scatter in all directions, so the light spot becomes dimmer and covers a larger area. The NDF we'll be using is called Trowbridge-Reitz GGX.
**2. Visibility/Geometry Function**
Because we're going to be using the Torrance-Sparrow model, our second part is a Visibility function. The Cook-Torrance model uses something called a Geometry function, and because I can't find any resources on the Visibility function, I'm just going to explain what the Geometry function is (they're probably the same thing, just calculated differently). The function describes how many microfacets are blocked by other microfacets, causing the light rays to lose energy and create a shadowing effect. We'll be using the Schlick-Smith visibility function.
**3. Fresnel**
The last thing we need for our specular part is Fresnel. As some of you may already know, the Fresnel equation describes how much light is reflected off a surface depending on our view vector: more reflections become visible as we look at the surface from a more grazing angle. We've already done this part; as a reminder, we used Schlick's approximation for our Fresnel effect.
We'll combine all these parts together for our specular shader. When we're done with that, we're going to lay the groundwork for the final PBR shader, starting with adding the diffuse and specular parts together and continuing until we've implemented normal maps and image-based lighting. There are still quite a few steps before we finish this PBR series, so I'm looking forward to that.
***
**Summary:**
The Torrance-Sparrow model will be the microfacet model used for our specular part, and we're basing our calculations on Unity's own Standard shader. Our microfacet shader needs three parts: a Normal Distribution Function, a Visibility Function and Fresnel. We can combine these components to get the final specular result needed to complete our PBR shader.



**Update 10**
The first part we need for our PBR specular component is the Normal Distribution Function. This function describes the distribution of microfacets across the surface normal of the mesh. I've already explained what microfacets are a couple of times now, but as a reminder: if you look at a surface under a microscope, even a smooth material won't have a completely flat surface. It will have tiny planes that can be assumed to have perfect specularity (a perfect mirror), and on a rougher material these planes will be more jagged and uneven, causing the light to reflect in all sorts of directions. The scattering of light determines how bright the specularity of the material will be, and this is what the NDF models. Phong and Blinn-Phong are other models that describe this, but for PBR we're going to use Trowbridge-Reitz GGX.
![](http://i.imgur.com/g7Xz3Ng.png)A pretty simple formula. Remember that *a* is the roughness value here, and *n* and *h* are the normal vector and halfway vector respectively. We can put this into shader code easily too.
```
Shader "ShaderChallenge/TrowbridgeReitzGGX"
{
Properties
{
_Roughness("Roughness", Range(0, 1)) = 0
}
SubShader
{
Pass
{
Tags { "LightMode" = "ForwardBase" }
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityStandardBRDF.cginc"
#define pi 3.14159265359
float _Roughness;
struct v2f
{
float4 pos : SV_POSITION;
float3 normal : TEXCOORD1;
float3 worldPos : TEXCOORD2;
};
v2f vert(appdata_base v)
{
v2f o;
o.pos = UnityObjectToClipPos(v.vertex);
// Convert vertex position to world space
o.worldPos = mul(unity_ObjectToWorld, v.vertex);
// Normal vector
o.normal = v.normal;
return o;
}
// Trowbridge-Reitz GGX
float DistributionGGX(float3 n, float3 h)
{
float a2 = pow(_Roughness, 2.0);
float nh = DotClamped(n, h);
float nh2 = pow(nh, 2.0);
float num = a2;
float den = (nh2 * (a2 - 1.0) + 1.0);
den = pi * pow(den, 2.0);
return num / den;
}
float4 frag(v2f i) : SV_TARGET
{
// Normalize the normal
float3 n = normalize(i.normal);
// Light vector from mesh's surface
float3 l = normalize(_WorldSpaceLightPos0.xyz);
// Viewport(camera) vector from mesh's surface
float3 v = normalize(_WorldSpaceCameraPos - i.worldPos.xyz);
// Halfway vector
float3 h = normalize(l + v);
// Trowbridge-Reitz GGX result
float3 result = DistributionGGX(n, h);
// Output the result
return float4(result, 1.0);
}
ENDCG
}
}
}
```
The calculations are in the *DistributionGGX()* function. There's nothing new here, so hopefully everything is straightforward. Looking at the output visually, the highlight on the material's surface becomes brighter and narrower as the roughness value approaches 0. It becomes dimmer and spreads out more as roughness approaches 1, which simulates the scattering of light.
![](http://imgur.com/PADpYEF.gif)
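If you want to poke at the NDF outside the shader, here's a quick plain-Python version of the same math as *DistributionGGX()*. The function name is my own sketch, not Unity code, and the input is the clamped dot(n, h):

```python
import math

def distribution_ggx(cos_nh, roughness):
    """Trowbridge-Reitz GGX NDF: a2 / (pi * ((n.h)^2 * (a2 - 1) + 1)^2),
    with a2 = roughness^2, as in the shader above."""
    a2 = roughness ** 2
    den = cos_nh ** 2 * (a2 - 1.0) + 1.0
    return a2 / (math.pi * den ** 2)

# Looking straight along the highlight (n.h = 1), the peak value is 1 / (pi * a2),
# so a lower roughness gives a much taller (brighter, narrower) peak.
print(distribution_ggx(1.0, 0.2) > distribution_ggx(1.0, 0.8))
```

This matches the behavior in the GIF above: the peak of the distribution grows as roughness shrinks.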
We're going to focus on the Visibility function next.
***
Since I updated to Unity 5.6.1f1, it seems you can now replace *mul(UNITY_MATRIX_MVP, v.vertex)* with *UnityObjectToClipPos(v.vertex)*. This should be changed automatically in your own shader code, but I'll leave this here as a heads up.



**Update 11**
The Visibility (or Geometry) function describes the ratio of microfacets that occlude other microfacets, and accounts for light that bounces off multiple microfacets before leaving the surface. Light energy is lost in the process, and this is an important part of the energy conservation we observe in the real world. Without this function, the surface may reflect more light than it physically should. Fortunately, the calculation for our Visibility function is as simple as our Normal Distribution Function. We're using a function called Schlick-Smith, and mathematically it looks like this.
![](http://i.imgur.com/IyjLM4K.png)It should be clear by now that *n*, *v* and *l* correspond to the normal, view and light vectors of our shader. The one we don't know is *k*, which represents *Roughness × Roughness × 0.5*. Let's put this formula into code form.
```
Shader "ShaderChallenge/SchlickSmith"
{
Properties
{
_Roughness("Roughness", Range(0, 1)) = 0
}
SubShader
{
Pass
{
Tags { "LightMode" = "ForwardBase" }
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityStandardBRDF.cginc"
#define pi 3.14159265359
float _Roughness;
struct v2f
{
float4 pos : SV_POSITION;
float3 normal : TEXCOORD1;
float3 worldPos : TEXCOORD2;
};
v2f vert(appdata_base v)
{
v2f o;
o.pos = UnityObjectToClipPos(v.vertex);
// Convert vertex position to world space
o.worldPos = mul(unity_ObjectToWorld, v.vertex);
// Normal vector
o.normal = v.normal;
return o;
}
// Schlick-Smith
float VisibilitySmith(float3 n, float3 v, float3 l)
{
float k = pow(_Roughness, 2.0) * 0.5;
float V = DotClamped(n, v) * (1.0 - k) + k;
float L = DotClamped(n, l) * (1.0 - k) + k;
return 0.25 / (V * L);
}
float4 frag(v2f i) : SV_TARGET
{
// Normalize the normal
float3 n = normalize(i.normal);
// Light vector from mesh's surface
float3 l = normalize(_WorldSpaceLightPos0.xyz);
// Viewport(camera) vector from mesh's surface
float3 v = normalize(_WorldSpaceCameraPos - i.worldPos.xyz);
// Halfway vector
float3 h = normalize(l + v);
// Schlick-Smith visibility result
float3 result = VisibilitySmith(n, v, l);
// Output the result
return float4(result, 1.0);
}
ENDCG
}
}
}
```
Again, the calculations are all neatly contained in the *VisibilitySmith()* function, and there's nothing new here that I have to point out. So let's just see what this shader looks like visually.
![](http://imgur.com/zxiiNHj.gif)
Compared to previous shaders, the result may look odd, but when we put everything together as one big, complex shader, it should look visually correct in the end. In the next update, we're going to add together all three components needed for the specular part of our PBR shader: the Normal Distribution Function, the Visibility Function and Fresnel. After that, we'll work on combining the diffuse and specular components, which will serve as the backbone of the final PBR shader. The final shader will have a [normal map](https://en.wikipedia.org/wiki/Normal_mapping) component that adds 'bumps and scratches' to the surface of the object, and [IBL](https://en.wikipedia.org/wiki/Image-based_lighting) (Image-Based Lighting), which essentially captures the surrounding environment to illuminate the objects in our scene. That way, you can have reflections on the surface of smooth objects. We still have quite a long way to go before we finalize our PBR shader, but we're getting closer, so look forward to that.
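As a side note, the Schlick-Smith term above is easy to sanity-check in plain Python. This is just my own sketch of the same formula as *VisibilitySmith()*, with the clamped dot products as inputs:

```python
def visibility_smith(cos_nv, cos_nl, roughness):
    """Schlick-Smith visibility: 0.25 / (V * L), where
    V = dot(n, v) * (1 - k) + k, L = dot(n, l) * (1 - k) + k,
    and k = roughness^2 * 0.5, as in the shader above."""
    k = roughness ** 2 * 0.5
    V = cos_nv * (1.0 - k) + k
    L = cos_nl * (1.0 - k) + k
    return 0.25 / (V * L)

# Smooth surface (k = 0), viewed and lit head on: the term is just 0.25.
# It grows toward grazing view angles, where occlusion matters most.
print(visibility_smith(1.0, 1.0, 0.0), visibility_smith(0.1, 1.0, 0.0))
```

The growth toward grazing angles is what produces the bright rim you can see in the GIF above.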



Learning shader programming has been on my list for quite some time now, especially when working in Unity, considering it does not have the best post-processing effects built in. I haven't really found the time to work on it though, since I am now working full-time and finishing up my final year in college. Instead, I've been relying on the Asset Store for such effects. I will be keeping a close eye on this thread for valuable information.



**Update 12**
I'm sorry there's been such a long delay since the last update, but I finally have time to continue this challenge. In this update, we're going to add all the specular parts together for the Torrance-Sparrow model. The math is really simple: it's just a multiplication of the three parts along with the dot product of the normal and light vectors and pi.
![](https://i.imgur.com/BCLloeR.png)*D*, *V* and *F* represent the Normal Distribution, Visibility and Fresnel functions.
```
Shader "ShaderChallenge/TorranceSparrow"
{
Properties
{
_Albedo("Albedo", 2D) = "white" {}
_Fresnel("Fresnel", Range(1.0, 8.0)) = 5.0
_IOR("IOR", Range(1.0, 5.0)) = 1.45
_Roughness("Roughness", Range(0, 1)) = 1.0
_Metallic("Metallic", Range(0, 1)) = 0.0
}
SubShader
{
Pass
{
Tags { "LightMode" = "ForwardBase" }
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityStandardBRDF.cginc"
#define pi 3.14159265359
sampler2D _Albedo;
float _Fresnel;
float _IOR;
float _Roughness;
float _Metallic;
struct v2f
{
float4 pos : SV_POSITION;
float3 worldPos : TEXCOORD0;
float3 normal : TEXCOORD1;
float2 uv : TEXCOORD2;
};
v2f vert(appdata_base v)
{
v2f o;
o.pos = UnityObjectToClipPos(v.vertex);
o.worldPos = mul(unity_ObjectToWorld, v.vertex);
o.normal = UnityObjectToWorldNormal(v.normal);
o.uv = v.texcoord;
return o;
}
// Trowbridge-Reitz GGX
float DistributionGGX(float3 n, float3 h)
{
float a = pow(_Roughness, 2.0);
float a2 = pow(a, 2.0);
float nh = dot(n, h);
float nh2 = pow(nh, 2.0);
float num = a2;
float den = (nh2 * (a2 - 1.0) + 1.0);
den = pi * pow(den, 2.0);
return num / den;
}
// Schlick-Smith
float VisibilitySmith(float3 n, float3 v, float3 l)
{
float k = pow(_Roughness, 2.0) * 0.5;
float V = DotClamped(n, v) * (1.0 - k) + k;
float L = DotClamped(n, l) * (1.0 - k) + k;
return 0.25 / (V * L);
}
// Schlick
float3 FresnelSchlick(float3 n, float3 v, float3 albedo)
{
float cosTheta = DotClamped(n, v);
float3 F0 = abs((1.0 - _IOR) / (1.0 + _IOR));
F0 = pow(F0, 2.0);
F0 = lerp(F0, albedo.rgb, _Metallic);
return F0 + (1.0 - F0) * pow(1.0 - cosTheta, _Fresnel);
}
// Based on the Torrance-Sparrow microfacet model
float3 Microfacet(float3 n, float3 h, float3 v, float3 l, float3 albedo)
{
float D = DistributionGGX(n, h);
float V = VisibilitySmith(n, v, l);
float3 F = FresnelSchlick(n, v, albedo);
float specularTerm = D * V * pi;
specularTerm = max(0.0, specularTerm * DotClamped(n, l));
return specularTerm * F;
}
float4 frag(v2f i) : SV_TARGET
{
// Normalize the normal
float3 n = normalize(i.normal);
// Light vector from mesh's surface
float3 l = normalize(_WorldSpaceLightPos0.xyz);
// Viewport(camera) vector from mesh's surface
float3 v = normalize(_WorldSpaceCameraPos - i.worldPos.xyz);
// Halfway vector
float3 h = normalize(l + v);
float3 albedo = tex2D(_Albedo, i.uv).rgb;
float3 result = Microfacet(n, h, v, l, albedo);
return float4(result, 1.0);
}
ENDCG
}
}
}
```
Before the *frag()* function, we add in the three parts that make up the specular component, each with its own calculations. There's one slight adjustment I made inside the *DistributionGGX()* function: for *a*, we'll be using Unreal Engine's version of roughness, which is roughness to the power of two. So *a2* is now roughness to the power of four rather than the power of two like in the previous version. This makes the highlight on the material smaller, which I personally think looks better in the end.
The combination of all the parts can be viewed in the *Microfacet()* function. Notice that we multiply by *F* at the end. In the final shader, we're going to multiply by *F* outside of *Microfacet()*, placing it in a final specular calculation composed of IBL along with other things. Applying the Fresnel later will ensure the final shader looks correct in the end.
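Here's a rough plain-Python sketch of the same combination, with the Fresnel term deliberately left out as a separate multiplier, as discussed. The function name and inputs (the clamped dot products) are my own, and it uses the roughness-to-the-fourth remap described above:

```python
import math

def microfacet(cos_nh, cos_nv, cos_nl, roughness):
    """Torrance-Sparrow style combination: max(0, D * V * pi * dot(n, l)),
    to be multiplied by the Fresnel term afterwards.
    Uses the Unreal-style remap a = roughness^2, so a2 = roughness^4."""
    a2 = roughness ** 4
    den = cos_nh ** 2 * (a2 - 1.0) + 1.0
    D = a2 / (math.pi * den ** 2)                  # Trowbridge-Reitz GGX
    k = roughness ** 2 * 0.5
    V = 0.25 / ((cos_nv * (1.0 - k) + k) *
                (cos_nl * (1.0 - k) + k))          # Schlick-Smith visibility
    return max(0.0, D * V * math.pi * cos_nl)

# Fully rough and everything head on: D = 1/pi and V = 0.25,
# so the pi factors cancel and the term is 0.25.
print(round(microfacet(1.0, 1.0, 1.0, 1.0), 4))
```

The *max(0, ...)* mirrors the clamp in the shader, so surfaces facing away from the light contribute no specular at all.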
![](https://imgur.com/z9Db7OG.gif)![](https://imgur.com/jCTJPG0.gif)
Comparing the two GIFs (the first one is metallic, the second dielectric), it's apparent that the first GIF shows stronger specularity, which makes sense because the material is metallic. For a metallic material, lowering the roughness creates a narrower specular highlight and also 'coats' the surface with more black. The black determines the amount of reflection on the surface, so making it completely black at 0 roughness is logical, because that essentially represents a perfect mirror, or perfect reflection. In a dielectric material, the specular highlight is weaker, and the black color will be replaced by the albedo of the material in the final shader.



**Update 13**
It's time to add a diffuse component to our Torrance-Sparrow shader from before. This shader will be our partial PBR shader, as the final shader will still need a normal map, an emission map and IBL. Remember that all calculations are based on Unity's own built-in shader, so I'm merely converting their metallic-only workflow to a roughness-metallic workflow with better code readability.
```
Shader "ShaderChallenge/CustomPBRPartial"
{
Properties
{
_Albedo("Albedo", 2D) = "white" {}
_Fresnel("Fresnel", Range(1.0, 8.0)) = 5.0
_IOR("IOR", Range(1.0, 5.0)) = 1.45
_Tint("Tint", Color) = (1.0, 1.0, 1.0, 1.0)
_MetallicTint("Metallic Tint", Color) = (1.0, 1.0, 1.0, 1.0)
_Roughness("Roughness", Range(0, 1)) = 1.0
_Metallic("Metallic", Range(0, 1)) = 0.0
}
SubShader
{
Pass
{
Tags { "LightMode" = "ForwardBase" }
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityStandardBRDF.cginc"
#define pi 3.14159265359
sampler2D _Albedo;
float _Fresnel;
float _IOR;
float4 _Tint;
float4 _MetallicTint;
float _Roughness;
float _Metallic;
struct v2f
{
float4 pos : SV_POSITION;
float3 worldPos : TEXCOORD0;
float3 normal : TEXCOORD1;
float2 uv : TEXCOORD2;
};
v2f vert(appdata_base v)
{
v2f o;
o.pos = UnityObjectToClipPos(v.vertex);
o.worldPos = mul(unity_ObjectToWorld, v.vertex);
o.normal = UnityObjectToWorldNormal(v.normal);
o.uv = v.texcoord;
return o;
}
// Trowbridge-Reitz GGX
float DistributionGGX(float3 n, float3 h)
{
float a = pow(_Roughness, 2.0);
float a2 = pow(a, 2.0);
float nh = dot(n, h);
float nh2 = pow(nh, 2.0);
float num = a2;
float den = (nh2 * (a2 - 1.0) + 1.0);
den = pi * pow(den, 2.0);
return num / den;
}
// Schlick-Smith
float VisibilitySmith(float3 n, float3 v, float3 l)
{
float k = pow(_Roughness, 2.0) * 0.5;
float V = DotClamped(n, v) * (1.0 - k) + k;
float L = DotClamped(n, l) * (1.0 - k) + k;
return 0.25 / (V * L);
}
// Schlick
float3 FresnelSchlick(float3 n, float3 v, float3 albedo)
{
float cosTheta = DotClamped(n, v);
float3 F0 = abs((1.0 - _IOR) / (1.0 + _IOR));
F0 = pow(F0, 2.0);
F0 = lerp(F0, albedo, _Metallic);
return F0 + (1.0 - F0) * pow(1.0 - cosTheta, _Fresnel);
}
// Based on the Torrance-Sparrow microfacet model
float Specular(float3 n, float3 h, float3 v, float3 l)
{
float D = DistributionGGX(n, h);
float V = VisibilitySmith(n, v, l);
float specularTerm = D * V * pi;
specularTerm = max(0.0, specularTerm * DotClamped(n, l));
return specularTerm;
}
float4 frag(v2f i) : SV_TARGET
{
float3 lightCol = _LightColor0.rgb;
float3 ambient = unity_AmbientSky;
float3 tint = _Tint.rgb;
float3 metallicTint = _MetallicTint.rgb;
// Vectors required for the shader
float3 n = normalize(i.normal);
float3 l = _WorldSpaceLightPos0.xyz;
float3 v = normalize(_WorldSpaceCameraPos - i.worldPos.xyz);
float3 h = normalize(l + v);
float3 albedo = tex2D(_Albedo, i.uv).rgb;
float3 f = FresnelSchlick(n, v, albedo * tint);
// Diffuse and specular component
float3 diffuse = ambient + DotClamped(n, l);
diffuse *= 1.0 - _Metallic;
float3 specular = Specular(n, h, v, l);
// Final diffuse and specular component
float3 finalDiffuse = albedo * tint * (lightCol * diffuse);
float3 finalSpecular = albedo * metallicTint * (specular * lightCol * f);
float3 color = finalDiffuse + finalSpecular;
return float4(color, 1.0);
}
ENDCG
}
}
}
```
Let's break down the code and understand what it's trying to do. Not much changed in the first part up until the *Specular()* function, apart from a couple of new properties that should be familiar by now. In the *Specular()* function, we omitted the Fresnel term because, as mentioned before, we're going to add it in at the final specular calculation.
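To sanity-check the specular math outside Unity, here's a hypothetical Python sketch of the two terms above (the function names and scalar dot-product inputs are mine; the shader works on vectors):

```python
import math

def distribution_ggx(n_dot_h, roughness):
    """Trowbridge-Reitz GGX normal distribution."""
    a2 = roughness ** 4                           # a = roughness^2, then a^2
    den = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * den * den)

def visibility_smith(n_dot_v, n_dot_l, roughness):
    """Schlick-Smith visibility term, folding in the 1/(4 NdotV NdotL) factor."""
    k = roughness * roughness * 0.5
    v = n_dot_v * (1.0 - k) + k
    l = n_dot_l * (1.0 - k) + k
    return 0.25 / (v * l)

# The distribution spikes sharply at the half vector as roughness shrinks:
print(distribution_ggx(1.0, 0.1))  # large spike (~3183)
print(distribution_ggx(1.0, 0.9))  # broad, flat lobe
```

This is why low roughness gives a tiny, bright highlight: nearly all of the distribution's energy concentrates where the half vector lines up with the normal.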
Moving on to the *frag()* function. First, we add a few variables purely for readability's sake. Then we have our vectors, the albedo, and the Fresnel function that we'll use later. We need a diffuse and a specular component before calculating their final look. The diffuse component is simple Lambertian lighting with ambient lighting added in. Note that you're allowed to use [Oren-Nayar](http://www.kongregate.com/forums/4gameprogramming/topics/7024272017challengedevelopingshadersusingshaderlabcg?page=2#posts11245115) in place of Lambert if you want the result to look slightly more realistic, but I left it out because Oren-Nayar produces strange black artifacts at the edges of our objects. If you do add it in, make sure you pass the correct vectors as parameters. We also need the diffuse term to decrease as metallic goes higher, with no diffuse at all when the object is fully metallic; this is achieved with the simple multiplication on the next line. The specular component is just the *Specular()* function we've already written.
In the final diffuse calculation, we multiply albedo by the tint color, which only modifies the diffuse color; this is then multiplied by the light color and the diffuse component. The final specular calculation looks slightly different: we multiply albedo by the metallic tint (which modifies the color of the metal), then by the specular component, the light color, and the Fresnel term we've held back so far. Adding the final diffuse component to the final specular component gives the final result of our shader.
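As a plain-Python sketch of that final combine, per channel, with made-up scalar stand-ins for the diffuse, specular, and Fresnel terms:

```python
def combine(albedo, tint, metallic_tint, light_col, diffuse, specular, fresnel):
    """color = albedo*tint*(light*diffuse) + albedo*metallicTint*(specular*light*fresnel)."""
    final = []
    for a, t, m, lc in zip(albedo, tint, metallic_tint, light_col):
        final_diffuse = a * t * (lc * diffuse)
        final_specular = a * m * (specular * lc * fresnel)
        final.append(final_diffuse + final_specular)
    return final

# White light, red tint: the diffuse picks up the tint, the specular doesn't.
print(combine([1, 1, 1], [1, 0, 0], [1, 1, 1], [1, 1, 1], 0.5, 0.2, 1.0))
# -> [0.7, 0.2, 0.2]
```

Note how the tint only colors the diffuse lobe while the metallic tint only colors the specular lobe, which is exactly the split the shader makes.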
![](https://imgur.com/4IoADvp.gif)![](https://imgur.com/iuKNwvH.gif)
These GIFs should hopefully clear up some lingering questions from our previous shader. You can see what I meant by the black color being replaced by the albedo color in dielectric materials; in metallic materials, that black will later be replaced by reflections. This shader isn't usable yet because it's not fully complete: IBL will be the major factor in PBR that gives us the realistic look we want in our projects. But before we move on to that final step, we're going to step away from PBR for a bit to figure out how to implement a normal map and an emission map. Once that's done, we'll write a C# script to customize the Inspector of our shader, because it's going to get quite messy with all the properties the final shader will have; maybe you can see the start of that in the GIFs above. In the meantime, learn how to use [Reflection Probes](https://docs.unity3d.com/Manual/class-ReflectionProbe.html) if you haven't already; you'll be needing them.



**Update 14**
Before moving on with PBR, we first have to know how to implement a normal map and an emission map. An emission map is a simple texture containing color information that will be mapped onto the surface of your object. It's really simple: essentially a texture with black representing areas that aren't emitting anything and color values representing emission. To implement it, we simply add the color value of the emission map to the final color result.
A normal map, on the other hand, seems daunting at first but is actually pretty easy to understand. Normal maps are used to add bumps and scratches in place of using more vertices to create realistic details; this saves performance and works really well. A lot of you probably already know what a normal map looks like: it's bluish in color, and you can get a pretty good idea of how the bumps and scratches will look under lighting. Each texel, or [texture pixel](https://en.wikipedia.org/wiki/Texel_(graphics)), in the normal map represents an XYZ vector, a normal vector. According to [Wikipedia](https://en.wikipedia.org/wiki/Normal_mapping), for a left-handed orientation system, the RGB value of each texel corresponds to the normal vector like this:
* X: -1 to +1 -> Red: 0 to 255 (0 to 1)
* Y: -1 to +1 -> Green: 0 to 255 (0 to 1)
* Z: 0 to +1 -> Blue: 128 to 255 (0.5 to 1)
The bluish tone of normal maps comes from the fact that the Z (blue) value always ranges from 0.5 to 1. If we want a normal vector facing directly towards the viewer, (0, 0, 1) in XYZ, the RGB value of the texel will be (0.5, 0.5, 1), giving a light blue color. Now that we know what the colors in a normal map mean, we have to understand how to extract the normals contained within the map and use them to add detail to our objects.
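That mapping is just a scale-and-bias in each channel. A quick hypothetical Python sketch (the helper names are mine):

```python
def decode_normal(r, g, b):
    """Map an RGB texel in [0, 1] back to a [-1, 1] normal vector."""
    return (r * 2.0 - 1.0, g * 2.0 - 1.0, b * 2.0 - 1.0)

def encode_normal(x, y, z):
    """Map a normal vector to the RGB value stored in the texture."""
    return (x * 0.5 + 0.5, y * 0.5 + 0.5, z * 0.5 + 0.5)

# The 'flat' normal (0, 0, 1) is stored as the familiar light blue:
print(encode_normal(0.0, 0.0, 1.0))  # (0.5, 0.5, 1.0)
print(decode_normal(0.5, 0.5, 1.0))  # (0.0, 0.0, 1.0)
```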
Converting the normals in the texture to a world-space coordinate system can be done using the tangent space matrix. Tangent space can be thought of as the coordinate system of the texture itself. If you understand how UV coordinates work, you can visualize the X axis pointing in the direction in which the U value increases, while the Y axis points toward increasing V. The Z axis in tangent space is the surface normal, perpendicular to the surface itself. Next, we need to know the terms used for these axes.
1. Normal - the vector perpendicular to the surface of the object (the Z axis, blue)
2. Tangent - the vector parallel to the surface of the object (the X axis, red)
3. Bitangent - the cross product of the normal and tangent vectors (the Y axis, green)
![](https://i.imgur.com/6TA8bZP.png)
This image, taken from [opengl-tutorial](http://www.opengl-tutorial.org/intermediate-tutorials/tutorial-13-normal-mapping/), visualizes the vectors really well. We need all three of these vectors in our tangent space matrix, which converts the normal from tangent space to world space.
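Building that basis is mostly one cross product plus a sign flip. Here's a hypothetical Python sketch of the idea (a stand-in for what the vertex shader does with Unity's built-ins):

```python
def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def build_tbn(normal, tangent, tangent_sign=1.0):
    """Bitangent = cross(normal, tangent), flipped by the tangent's w sign."""
    bitangent = tuple(c * tangent_sign for c in cross(normal, tangent))
    return tangent, bitangent, normal

# For a surface facing +Z with tangent +X, the bitangent comes out as +Y:
t, b, n = build_tbn((0.0, 0.0, 1.0), (1.0, 0.0, 0.0))
print(b)  # (0.0, 1.0, 0.0)
```

The `tangent_sign` mirrors the `v.tangent.w * unity_WorldTransformParams.w` factor in the shader, which handles flipped UVs and negatively scaled objects.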
```
Shader "ShaderChallenge/NormalAndEmission"
{
Properties
{
_Color ("Color", Color) = (1.0, 1.0, 1.0, 1.0)
_BumpScale ("Bump Scale", Range(0.0, 1.0)) = 0.25
_NormalMap ("Normal Map", 2D) = "bump" {}
_EmissionMap("Emission Map", 2D) = "black" {}
}
SubShader
{
Pass
{
Tags { "LightMode" = "ForwardBase" }
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityStandardBRDF.cginc"
#include "UnityStandardUtils.cginc"
float4 _Color;
float _BumpScale;
sampler2D _NormalMap;
sampler2D _EmissionMap;
struct v2f
{
float4 pos : SV_POSITION;
float3 worldPos : TEXCOORD1;
float2 uv : TEXCOORD2;
float3 tspace0 : TEXCOORD3; // tangent.x, bitangent.x, normal.x
float3 tspace1 : TEXCOORD4; // tangent.y, bitangent.y, normal.y
float3 tspace2 : TEXCOORD5; // tangent.z, bitangent.z, normal.z
};
v2f vert(appdata_full v)
{
v2f o;
o.pos = UnityObjectToClipPos(v.vertex);
o.worldPos = mul(unity_ObjectToWorld, v.vertex);
o.uv = v.texcoord;
// Get normal and tangent in world space
float3 wNormal = UnityObjectToWorldNormal(v.normal);
float3 wTangent = UnityObjectToWorldDir(v.tangent.xyz);
// Get bitangent from cross product of normal and tangent
float tangentSign = v.tangent.w * unity_WorldTransformParams.w;
float3 wBitangent = cross(wNormal, wTangent) * tangentSign;
// Output tangent space matrix
o.tspace0 = float3(wTangent.x, wBitangent.x, wNormal.x);
o.tspace1 = float3(wTangent.y, wBitangent.y, wNormal.y);
o.tspace2 = float3(wTangent.z, wBitangent.z, wNormal.z);
return o;
}
float4 frag(v2f i) : SV_TARGET
{
float3 tnormal = UnpackNormal(tex2D(_NormalMap, i.uv));
// Multiply the x and y axis of the normal to increase or decrease bump
tnormal.xy *= _BumpScale;
// Convert normal from tangent space to world space
float3 worldNormal;
worldNormal.x = dot(i.tspace0, tnormal);
worldNormal.y = dot(i.tspace1, tnormal);
worldNormal.z = dot(i.tspace2, tnormal);
// Vectors required
float3 n = normalize(worldNormal);
float3 l = _WorldSpaceLightPos0.xyz;
float3 v = normalize(_WorldSpaceCameraPos - i.worldPos.xyz);
float3 diffuse = dot(n, l);
float3 emission = tex2D(_EmissionMap, i.uv);
// Output final result
float3 result = (diffuse * _Color) + emission;
return float4(result, 1.0);
}
ENDCG
}
}
}
```
In the *v2f* struct, we have three *tspace* vectors, together forming the 3x3 matrix we need. The *vert()* function gathers the world-space tangent, bitangent, and normal and packs them into these *tspace* rows so we can convert the normal from tangent space to world space. In the *frag()* function, we first unpack the normal map using the built-in *UnpackNormal()* function; Unity stores its normal maps in DXT5nm format, and this function retrieves the normal vector from the map. We can then control how bumpy the object looks by multiplying the XY components of the normal vector by *_BumpScale*. The final step is the conversion from tangent space to world space, after which the shader can use the converted normal vector for lighting calculations. Also notice how simple it was to add emission: just make sure you add the emission color after you calculate the final color value.
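The fragment-side steps can be sketched in Python to see why the *tspace* rows work (names are my own; the row dot products are exactly what the shader's three `dot()` calls do):

```python
import math

def dot3(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def perturb_normal(tnormal, tspace0, tspace1, tspace2, bump_scale):
    """Scale the tangent-space XY by bump_scale, rotate to world space, normalize."""
    scaled = (tnormal[0] * bump_scale, tnormal[1] * bump_scale, tnormal[2])
    world = (dot3(tspace0, scaled),   # tangent.x, bitangent.x, normal.x
             dot3(tspace1, scaled),   # tangent.y, bitangent.y, normal.y
             dot3(tspace2, scaled))   # tangent.z, bitangent.z, normal.z
    length = math.sqrt(dot3(world, world))
    return tuple(c / length for c in world)

# With an identity TBN basis, a bump scale of 0 flattens any detail back
# to the plain surface normal:
rows = ((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0))
print(perturb_normal((0.6, 0.0, 0.8), *rows, bump_scale=0.0))  # (0.0, 0.0, 1.0)
```

Each *tspace* row holds one world-space component of all three basis vectors, so the three dot products are just the matrix-vector multiply written out by hand.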
![](https://imgur.com/Q5nqYll.gif)
Next up, we'll create a C# script to customize the properties of our final shader and keep them neat. Remember that when you import a normal map texture, set its Texture Type to Normal map. If you added an emission map and it doesn't look as bright as you want, don't worry: we're going to use HDR for our final emission color, which will be set up in the next update.



**Update 15**
First of all, this update won't delve too much into the C# code we're using; it merely describes what our custom shader GUI will contain for the final PBR shader. If you want to know how the code works, check out this [tutorial](http://catlikecoding.com/unity/tutorials/rendering/part-9/) from Catlike Coding, which my code is based on. Another good resource is the documentation page for [ShaderGUI](https://docs.unity3d.com/ScriptReference/ShaderGUI.html).
The reason we're writing a custom shader GUI is mainly to make our shader more user-friendly. If we let Unity decide how our shader's properties appear in the Inspector, the whole thing will look messy and cluttered. With that, let's organize how to expose our PBR shader in a neat manner.
**Main Maps - contains all the textures or maps we can use in the shader**
* Albedo - the base color of the surface of the object; includes a color picker to tint the surface
* Roughness - controls how smooth or rough the object looks (ranges from 0 to 1)
* Metallic - controls how metallic the object looks (ranges from 0 to 1)
* Normal - controls how bumpy or scratched the object looks (ranges from 0 to 1)
* Emission - the parts of the object containing 'light', such as LEDs; includes an HDR color picker to enable extra brightness
* Occlusion - the self-shadowing parts of the object (ranges from 0 to 1)
**Customization - extra properties to further customize our shader**
* Metallic Tint - controls the color of the metallic parts of the object
* Fresnel - controls how strong the fresnel effect looks (ranges from 1 to 8)
* IOR - controls the amount of reflection in dielectric materials (ranges from 1 to 5)
We haven't discussed implementing ambient occlusion yet, but it's going to be simple. To get the code working, create an Editor folder in the Project window and add the script there. We also have to add one line near the end of the final PBR shader, but that comes next.
```
using UnityEngine;
using UnityEditor;
public class PBRInterface : ShaderGUI
{
private Material target;
private MaterialEditor editor;
private MaterialProperty[] properties;
public override void OnGUI(MaterialEditor editor, MaterialProperty[] properties)
{
target = (Material)editor.target;
this.editor = editor;
this.properties = properties;
MainMaps();
Customization();
}
MaterialProperty FindProperty(string name)
{
return FindProperty(name, properties);
}
GUIContent CreateLabel(string label, string tooltip)
{
GUIContent newLabel = new GUIContent(label);
newLabel.tooltip = tooltip;
return newLabel;
}
void SetKeyword(string keyword, bool state)
{
if (state)
target.EnableKeyword(keyword);
else
target.DisableKeyword(keyword);
}
void MainMaps()
{
GUILayout.Label("Main Maps", EditorStyles.boldLabel);
SetMap("_Albedo", "Albedo", "_Tint");
SetMap("_RoughnessMap", "Roughness", "_Roughness");
SetMap("_MetallicMap", "Metallic", "_Metallic");
SetMap("_NormalMap", "Normal", "_BumpScale");
Emission();
SetMap("_OcclusionMap", "Occlusion", "_OcclusionStrength");
}
void Customization()
{
GUILayout.Label("Customization", EditorStyles.boldLabel);
MetallicTint();
Fresnel();
IOR();
}
void SetMap(string property, string name, string extraProperty)
{
MaterialProperty map = FindProperty(property);
GUIContent label = CreateLabel(name, name);
editor.TexturePropertySingleLine(label, map, FindProperty(extraProperty));
}
void Emission()
{
MaterialProperty map = FindProperty("_EmissionMap");
GUIContent label = CreateLabel("Emission", "Emission");
ColorPickerHDRConfig emissionConfig = new ColorPickerHDRConfig(0f, 99f, 1f / 99f, 3f);
editor.TexturePropertyWithHDRColor(label, map, FindProperty("_Emission"), emissionConfig, false);
}
void MetallicTint()
{
MaterialProperty metallicTint = FindProperty("_MetallicTint");
editor.ColorProperty(metallicTint, "Metallic Tint");
}
void Fresnel()
{
MaterialProperty slider = FindProperty("_Fresnel");
GUIContent label = CreateLabel("Fresnel", "Fresnel Effect");
editor.ShaderProperty(slider, label);
}
void IOR()
{
MaterialProperty slider = FindProperty("_IOR");
GUIContent label = CreateLabel("IOR", "Index of Refraction");
editor.ShaderProperty(slider, label);
}
}
```



@PlazmaGames
Great tutorial !
For users who want to test more advanced shaderlab techniques, please see:
> https://github.com/przemyslawzaworski/Unity3DCGprogramming
Some shaders in action:
> https://www.youtube.com/watch?v=NqcWY15OOPo
> https://www.youtube.com/watch?v=MzIuGkUYEmM
> https://www.youtube.com/watch?v=SZW14hp4hNw
> https://www.youtube.com/watch?v=ayHaykMcmiA



> *Originally posted by **[gtx660](/forums/4/topics/702427?page=2#11713398)**:*
> @PlazmaGames
> Great tutorial !
> For users who want to test more advanced shaderlab techniques, please see:
Wow, those shaders look really nice and cool. If this is all your work, then great job. I'll look into some of it when I have spare time.



**Update 16**
And here we are! The final post (for now) in this long PBR series. We're going to add Image-Based Lighting to our PBR shader, which should magically transform it into something realistic looking. Again, [Catlike Coding](http://catlikecoding.com/unity/tutorials/rendering/part-8/) and Unity's own built-in shader are the basis on which I wrote this shader. Let's look at the code.
```
Shader "ShaderChallenge/CustomPBRFull"
{
Properties
{
_Albedo("Albedo", 2D) = "white" {}
_Fresnel("Fresnel", Range(1.0, 8.0)) = 5.0
_IOR("IOR", Range(1.0, 5.0)) = 1.45
_RoughnessMap("Roughness Map", 2D) = "white" {}
_MetallicMap("Metallic Map", 2D) = "white" {}
_Tint("Tint", Color) = (1.0, 1.0, 1.0, 1.0)
_MetallicTint("Metallic Tint", Color) = (1.0, 1.0, 1.0, 1.0)
_Roughness("Roughness", Range(0, 1)) = 1.0
_Metallic("Metallic", Range(0, 1)) = 0.0
_NormalMap("Normal Map", 2D) = "bump" {}
_BumpScale ("Bump Scale", Range(0.0, 1.0)) = 1.0
_EmissionMap("Emission Map", 2D) = "black" {}
_Emission("Emission", Color) = (0.0, 0.0, 0.0, 1.0)
_OcclusionMap("Occlusion Map", 2D) = "white" {}
_OcclusionStrength("Occlusion Strength", Range(0.0, 1.0)) = 1.0
}
SubShader
{
Pass
{
Tags { "LightMode" = "ForwardBase" }
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityStandardBRDF.cginc"
#define pi 3.14159265359
sampler2D _Albedo;
float _Fresnel;
float _IOR;
sampler2D _RoughnessMap;
sampler2D _MetallicMap;
float4 _Tint;
float4 _MetallicTint;
float _Roughness;
float _Metallic;
sampler2D _NormalMap;
float _BumpScale;
sampler2D _EmissionMap;
float3 _Emission;
sampler2D _OcclusionMap;
float _OcclusionStrength;
struct v2f
{
float4 pos : SV_POSITION;
float3 worldPos : TEXCOORD0;
float2 uv : TEXCOORD1;
float3 tspace0 : TEXCOORD2;
float3 tspace1 : TEXCOORD3;
float3 tspace2 : TEXCOORD4;
};
v2f vert(appdata_full v)
{
v2f o;
o.pos = UnityObjectToClipPos(v.vertex);
o.worldPos = mul(unity_ObjectToWorld, v.vertex);
o.uv = v.texcoord;
float3 wNormal = UnityObjectToWorldNormal(v.normal);
float3 wTangent = UnityObjectToWorldDir(v.tangent.xyz);
float tangentSign = v.tangent.w * unity_WorldTransformParams.w;
float3 wBitangent = cross(wNormal, wTangent) * tangentSign;
o.tspace0 = float3(wTangent.x, wBitangent.x, wNormal.x);
o.tspace1 = float3(wTangent.y, wBitangent.y, wNormal.y);
o.tspace2 = float3(wTangent.z, wBitangent.z, wNormal.z);
return o;
}
// Trowbridge-Reitz GGX
float DistributionGGX(float3 n, float3 h, float roughness)
{
float a = pow(roughness, 2.0);
float a2 = pow(a, 2.0);
float nh = DotClamped(n, h);
float nh2 = pow(nh, 2.0);
float num = a2;
float den = (nh2 * (a2 - 1.0) + 1.0);
den = pi * pow(den, 2.0);
return num / den;
}
// Schlick-Smith
float VisibilitySmith(float3 n, float3 v, float3 l, float roughness)
{
float k = pow(roughness, 2.0) * 0.5;
float V = DotClamped(n, v) * (1.0 - k) + k;
float L = DotClamped(n, l) * (1.0 - k) + k;
return 0.25 / (V * L);
}
// Schlick
float3 FresnelSchlickRoughness(float3 n, float3 v, float3 albedo, float roughness, float metallic)
{
float cosTheta = DotClamped(n, v);
float3 oneMinusRoughness = 1.0 - roughness;
float3 F0 = abs((1.0 - _IOR) / (1.0 + _IOR));
F0 = pow(F0, 2.0);
F0 = lerp(F0, albedo, metallic);
return F0 + (max(oneMinusRoughness, F0) - F0) * pow(1.0 - cosTheta, _Fresnel);
}
// Based on the Torrance-Sparrow microfacet model
float Specular(float3 n, float3 h, float3 v, float3 l, float roughness)
{
float D = DistributionGGX(n, h, roughness);
float V = VisibilitySmith(n, v, l, roughness);
float specularTerm = D * V * pi;
specularTerm = max(0.0, specularTerm * DotClamped(n, l));
return specularTerm;
}
// Indirect lighting
UnityIndirect IndirectLight(float3 n, float3 v, float roughness, float ao)
{
UnityIndirect il;
il.diffuse = max(0.0, ShadeSH9(float4(n, 1.0))) * ao;
float3 reflectionDir = reflect(-v, n);
float roughnessMod = roughness;
roughnessMod *= 1.7 - 0.7 * roughness;
float4 envSample = UNITY_SAMPLE_TEXCUBE_LOD(unity_SpecCube0, reflectionDir, roughnessMod * UNITY_SPECCUBE_LOD_STEPS);
il.specular = DecodeHDR(envSample, unity_SpecCube0_HDR) * ao;
return il;
}
float4 frag(v2f i) : SV_TARGET
{
float roughness = tex2D(_RoughnessMap, i.uv.xy).g * _Roughness;
float metallic = tex2D(_MetallicMap, i.uv.xy).r * _Metallic;
float3 lightCol = _LightColor0.rgb;
float3 tint = _Tint.rgb;
float3 metallicTint = _MetallicTint.rgb;
float3 emission = tex2D(_EmissionMap, i.uv.xy) * _Emission;
float ao = lerp(1.0, tex2D(_OcclusionMap, i.uv.xy).g, _OcclusionStrength);
// Normal map calculations
float3 tnormal = UnpackNormal(tex2D(_NormalMap, i.uv));
tnormal.xy *= _BumpScale;
float3 worldNormal;
worldNormal.x = dot(i.tspace0, tnormal);
worldNormal.y = dot(i.tspace1, tnormal);
worldNormal.z = dot(i.tspace2, tnormal);
// Vectors required
float3 n = normalize(worldNormal);
float3 l = _WorldSpaceLightPos0.xyz;
float3 v = normalize(_WorldSpaceCameraPos - i.worldPos.xyz);
float3 h = normalize(l + v);
float3 albedo = tex2D(_Albedo, i.uv).rgb;
float surfaceReduction = 1.0 / (pow(roughness, 2.0) + 1.0);
float3 F = FresnelSchlickRoughness(n, v, albedo * tint, roughness, metallic);
UnityIndirect il = IndirectLight(n, v, roughness, ao);
// Diffuse and specular
float3 diffuse = il.diffuse + DotClamped(n, l);
diffuse *= 1.0 - metallic;
float3 specular = Specular(n, h, v, l, roughness);
// Final diffuse and specular component
float3 finalDiffuse = albedo * tint * (lightCol * diffuse);
float3 finalSpecular = albedo * metallicTint * (specular * lightCol + surfaceReduction * il.specular * F);
float3 color = finalDiffuse + finalSpecular + emission;
return float4(color, 1.0);
}
ENDCG
}
}
CustomEditor "PBRInterface"
}
```
It's quite a big shader, but let's break it down. We added a couple of new properties, mainly related to ambient occlusion. Everything after that looks the same until we reach the function named *IndirectLight()*, which adds indirect lighting to our shader. The function returns a *UnityIndirect* structure containing indirect diffuse and specular light: the diffuse part is ambient lighting, while the specular part is the reflection we see in smooth and metallic objects. We use a handful of built-in functions and algorithms to get the indirect light, including making sure that reflections in metals become blurrier the rougher the object gets. More details can be found at the Catlike Coding link I provided above.
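That blur-with-roughness trick boils down to one remap before picking a cubemap mip. Here's a hypothetical Python sketch of it (assuming six specular mip steps, which is what `UNITY_SPECCUBE_LOD_STEPS` defaults to):

```python
def specular_mip_level(roughness, lod_steps=6):
    """Remap roughness, then scale it into a cubemap mip level: rougher = blurrier."""
    r = roughness * (1.7 - 0.7 * roughness)
    return r * lod_steps

print(specular_mip_level(0.0))  # 0.0 -> sharpest mirror reflection
print(specular_mip_level(1.0))  # 6.0 -> blurriest mip
print(specular_mip_level(0.5))  # somewhere in between
```

The `1.7 - 0.7 * roughness` factor bends the curve so mid-range roughness values jump to blurrier mips faster than a straight linear mapping would.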
In the main *frag()* function, the code looks similar to that of our partial PBR shader, but we need to add a few more things. Remember that I'm basing all these calculations on Unity's own shader. Firstly, we make a slight alteration to our Fresnel function: the previous version always produced a full fresnel rim at grazing angles, but rough surfaces shouldn't reflect that strongly, so the modified function caps the grazing reflectance based on roughness. At the final diffuse and specular calculations, we add the indirect lighting, and that pretty much finishes everything we need for the final PBR shader. Also, to use the custom shader GUI we built in the previous post, we add one simple line before the last line of the shader; the name between the quotes should be the class name you used in the custom shader GUI script.
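Mirroring the *FresnelSchlickRoughness()* function above, a hypothetical scalar Python sketch shows how roughness caps the grazing-angle reflectance (`f0` and `power` stand in for the computed F0 value and the `_Fresnel` property):

```python
def fresnel_schlick_roughness(cos_theta, f0, roughness, power=5.0):
    """Schlick Fresnel whose grazing reflectance is capped by (1 - roughness)."""
    grazing = max(1.0 - roughness, f0)
    return f0 + (grazing - f0) * (1.0 - cos_theta) ** power

# At a grazing angle (cos_theta = 0), a mirror-smooth surface reaches full
# reflectance, while a fully rough one never rises above its base F0:
print(fresnel_schlick_roughness(0.0, 0.04, 0.0))  # ~1.0
print(fresnel_schlick_roughness(0.0, 0.04, 1.0))  # 0.04
```

Head-on (`cos_theta = 1`), both cases collapse to plain F0, so the change only affects the rim.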
As this post is already long enough, see the next post for the PBR shader in action.



**Update 17**
This post is dedicated to looking at several results of our final PBR shader. I got the models and materials from these sites: [CGTrader](https://www.cgtrader.com/free-3d-models/pbr) and [Free PBR Materials](https://freepbr.com).
Our custom PBR shader uses the roughness-metallic workflow. Unity's workflow is different, so if you prefer roughness-metallic, you're more than free to use this shader. However, there are some pros and cons. Unity's Standard shader definitely has more features, such as shadows, multiple lights, and global illumination, and I personally think their metallic materials turn out looking better than mine; for smooth dielectric materials, it's the opposite. Regarding customization, this shader is able to separately tint the dielectric and metallic parts of a combined material, which as far as I know is not possible with the Standard shader. In the end, I'm pretty happy with the result of the custom PBR shader, considering it's the first one I've ever made. See the results below.
[Full Gallery](https://imgur.com/a/ME0SF)
![](https://i.imgur.com/DVw0E5Q.png)
![](https://i.imgur.com/nCsQLfq.png)
![](https://i.imgur.com/ReCaoXa.png)
They look pretty good, I'd say. The textures and models are not mine and belong to their respective owners. I also want to stress that creating great textures is important when you want realistic models, even if your shader is working fine; these results show how much good textures do to make the graphics stand out in the end.
Well, now that we're finished with PBR, what's next? I've been thinking of maybe working with 2D shaders or post-processing effects. I might also look into some shaders on [Shadertoy](https://www.shadertoy.com) and try to recreate them in Unity. The 2017 challenge is ending really soon, so I probably won't be able to do much, but I've been thinking about updating this thread after the challenge to post some of my future shader work. Anyway, that concludes this PBR series, so for the time being, just wait for the next updates!



> *Originally posted by **[PlazmaGames](/forums/4/topics/702427?page=2#11726359)**:*
> > *Originally posted by **[gtx660](/forums/4/topics/702427?page=2#11713398)**:*
> > @PlazmaGames
> > Great tutorial !
> > For users who want to test more advanced shaderlab techniques, please see:
>
> Wow, those shaders look really nice and cool. If this is all your work, then great job. I'll look into some of it when I have spare time.
>
Thanks :)
Well, for the most part the code was written by myself, but I based it heavily on many references, especially various resources from ShaderToy.com. For example, the FBM generator has algorithms related to Perlin Noise, so I can't say these shaders were 100 percent invented by me :) On the other hand, you also use "external" algorithms like Fresnel or the Torrance-Sparrow model. So you know what I mean :)



And here I'd generally like to encourage other users to study shader programming, because it is a fascinating field. It's a kind of magic when you can create beautiful graphics from math. This is true Art :)


