Weathering Shader

Been having more fun with HLSL recently! Off the back of my alpha erosion shader, I wanted to use a similar technique to replicate some weathering I saw on objects in the Naples National Archaeological Museum.


I started off by shifting the alpha erosion technique into an albedo blend: I added the erosion function's output to the albedo and then multiplied the result by the albedo.

I then created a vertex function to push out all the verts within the eroded area along their normal. This gave me the raised blue surface. This could definitely look a lot more elegant, but I think it's not bad for a first try!

void vert(inout appdata_full v) {
    //Weathering
    float4 wm = tex2Dlod(_AlphaMask, float4(v.texcoord.xy, 0, 0));
    float4 w = tex2Dlod(_Weather_Texture, float4(v.texcoord.xy, 0, 0));
    wm = weathering_effect(wm, _Erosion);
    //push verts in the eroded area out along their normals
    v.vertex.xyz += lerp(0, v.normal * (_Erosion * .1), wm.r);
}

The last thing I added was AO for the raised area, to make it look like it was casting soft detail shadows. To do this, I used the erosion function again, but pushed the erosion amount so there were more dark areas. I then added this to an inverted version of the original erosion, leaving a line that fades out on both sides.


I added .2 to this so it wasn’t super dark, and got the small shadow that you see below.
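In code, the idea looks roughly like this – a sketch only, where _AOAmount is a hypothetical property for how far the darker pass is pushed, and alpha_erosion is the erosion function from further down this page:

```hlsl
//darker pass: push the erosion further so more of the area reads as 0
float eroded = alpha_erosion(erosion_tex, _Erosion);
float pushed = alpha_erosion(erosion_tex, _Erosion + _AOAmount);
//add the inverse of the original so only a thin band stays dark,
//then lift it by .2 so the shadow isn't pitch black
float ao = saturate(pushed + (1 - eroded) + 0.2);
o.Albedo *= ao;
```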



Finished Erosion Shader

Made the finishing touches to my shader – a glowing edge! This is great for fire or sci-fi effects.

I gave this thickness and softness controls to allow for different looks depending on user need. The softness takes a 0–1 value and lerps it between 0 and the thickness value. We then smoothstep between this and the thickness. The lower the softness, the wider the band we smoothstep across, and so the larger the variation between the emissive values.

You’ll notice that this relationship doesn’t make sense from a user perspective – surely it should be the other way around! For simplicity’s sake, I’ve done it this way around and then just swapped the values around on the input slider. Saves me doing remapping or a whole bunch of one-minuses!

float emissive_edge (float alpha_erosion) {
    //when alpha is less than thickness, make pixel emissive
    //when alpha is less than softness, make it fully emissive
    float softness = lerp(0, _Thickness, _Softness);
    return smoothstep(softness, _Thickness, alpha_erosion);
}
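The return value then drives the emission output. Since the smoothstep returns 0 inside the edge band, it gets inverted before tinting – a sketch, assuming a hypothetical _EdgeColor property:

```hlsl
float edge = 1 - emissive_edge(erosion); //1 inside the band, 0 outside
o.Emission = _EdgeColor.rgb * edge;
```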


Alpha Erosion Function

After fixing my normal issues, I moved onto the main purpose of my shader – alpha erosion! I wanted to replicate a sort of burn erosion effect using a mask input by the user.


I didn’t want to clutter up my surface function by doing it all in there, so I split it out into a new one. The syntax was a little odd for me – being used to Python, I keep forgetting to give everything a type – but I got there in the end!

I’m using a user input to subtract from the erosion texture, in order to move it from the original texture to fully transparent.

However, I didn’t want to start with the alpha being the texture; I wanted to start fully opaque and then begin eroding. To do this, the value I subtracted was a lerp between -1 (essentially adding 1 to the texture to make everything white) and the erosion value. To determine the interpolator I used a smoothstep. This takes a value plus min and max inputs, and outputs a value between 0 and 1 based on a smooth curve between the min and max. The reason I used this was that I could set the maximum input to 0.01 – meaning any value greater than or equal to this would evaluate as 1.

After this I did a ceil on the subtraction, giving me a hard edge as every pixel comes out as 0 or 1. Using a ceil rather than a round keeps the values within the correct range. I then saturated it to prevent any values lower than 0 or higher than 1.

float alpha_erosion (fixed4 erosion_texture, float erosion_value) {
    //If user erosion input is 0, keep alpha fully opaque by setting
    //the subtracted value to -1.
    float fully_opaque = smoothstep(0, 0.01, erosion_value);
    float opaque_erosion_value = lerp(-1.0, erosion_value, fully_opaque);
    //Subtract alpha
    return saturate(ceil(erosion_texture.r - opaque_erosion_value));
}


Unpacking Normals

I’ve been working into the shader from the last blog post a little more. Once I’d finished with transparency and set my alpha to 1, I noticed the lighting seemed off, with a band of specular light across the middle. This was because I had forgotten to unpack my normal!

When a map is marked as a normal map in Unity (and many other game engines), it changes its GPU compression type to DXT5. DXT5nm (or BC5) compression ditches the blue channel in order to reduce file size. This works well with normal maps because the blue channel represents the Z component of the surface normal, and since the normal is unit length, Z can always be reconstructed from X and Y. As we need the blue channel in order to calculate our normals, we use the UnpackNormal function.
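For the curious, what UnpackNormal does for a DXT5nm texture is roughly this (a sketch, not Unity's actual source):

```hlsl
float3 unpack_dxt5nm(float4 packed_normal) {
    float3 n;
    //X is stored in the alpha channel, Y in the green channel
    n.xy = packed_normal.wy * 2 - 1;
    //rebuild Z from X and Y, since the normal is unit length
    n.z = sqrt(1 - saturate(dot(n.xy, n.xy)));
    return n;
}
```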

fixed4 n = tex2D(_Normal, IN.uv_Normal);
o.Normal = UnpackNormal(n); //unpacks from dxt5nm to rgb

Below is a comparison of an unpacked normal vs a DXT5-packed normal – in the packed version the Z is always set to 0, regardless of the result of the dot product with the camera heading.



Transparency in Unity Shaders

I decided to get started on an alpha erosion shader today, as I’d recently made one in nodes and wanted to convert it to HLSL.

I started off by adding inputs for material and normal. I added these as properties at the top of the shader, then added a sampler2D in the subshader so that I could use the texture, as well as UV1 in the Input struct so I could access its texture coordinates.


In order to actually sample the texture, I used the built in HLSL function tex2D(), which takes a sampler2D and a UV coordinate. Once I had this, I could use my surf function to fill in the metallic and smoothness outputs in my instance of Unity’s SurfaceOutputStandard. This struct gives information about surface properties to Unity in order to build the actual pixel shader, as well as generating any required render passes.


This gave me some lovely boxes with a shiny indented squiggle!


Getting transparency working was a little more challenging, as I had to think about render pass order and blending modes. Not being familiar with Unity’s render pipeline, this was very interesting!

The first things we need to do are set the correct render queue, and remove the object from the depth buffer write. As this is blended transparency, I want my object to be forward rendered and don’t want it pushed into an earlier pass for the zbuffer.

To do this, I set the render queue and type tags to “transparent”, and added “ZWrite Off”.


After this, I had to specify the blend mode and set the correct lighting model. As I wanted this to use traditional transparency, I set the source to source alpha and the destination to one minus source alpha. By default the blend operation that combines these is Add, but I believe you can change this. To set the lighting model, I added “alpha:fade” after the lighting model pragma. The pragma sends additional information to the shader compiler – if no alpha is specified, you’ll get an opaque object!
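Pulled together, the relevant ShaderLab lines look something like this – a sketch of the setup described above:

```hlsl
Tags { "Queue"="Transparent" "RenderType"="Transparent" }
ZWrite Off                      //don't write into the depth buffer
Blend SrcAlpha OneMinusSrcAlpha //traditional alpha blending

CGPROGRAM
#pragma surface surf Standard alpha:fade
```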


Once that was all set, I had to actually specify my alpha values in the surface function! Initially, I just set the surface output alpha parameter to equal my erosion texture r multiplied by my alpha multiplier input, but this only produced transparency between particles in the same system and not between other objects (probably to do with the way Unity batches particle draw calls?). I needed to set the alpha of the colour value (c.a) I created from my texture sampler to this, and then set the alpha output to c.a.
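In code it amounts to something like this – a sketch, with _AlphaMask and _AlphaMultiplier as assumed property names:

```hlsl
fixed4 c = tex2D(_MainTex, IN.uv_MainTex) * _Color;
//route the alpha through the colour value rather than straight
//into the output, so it behaves between separate objects
c.a = tex2D(_AlphaMask, IN.uv_AlphaMask).r * _AlphaMultiplier;
o.Albedo = c.rgb;
o.Alpha = c.a;
```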


Doing this gave me some nice transparency! Next I want to add photoshop levels style controls to the alpha mask in order to create the erosion effect. I’ve previously done this using a mixture of smoothstepping and remapping values. It looks like remap value is not a built in function, so I’ll have to write my own which should be interesting.
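If it comes to it, a hand-rolled remap is only a couple of lines – a sketch of the usual linear version:

```hlsl
//remap a value from one range to another, e.g. for levels-style controls
float remap(float value, float in_min, float in_max, float out_min, float out_max) {
    return out_min + (value - in_min) * (out_max - out_min) / (in_max - in_min);
}
```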





HLSL First Look

I do shader work at work and at home, but have only used node based shader editors. I’ve been wanting to learn something new and projects haven’t really been sticking recently, so figured I’d get a little lower level and give HLSL in Unity a go.

I did some real basics tonight, just having a look at the basic surface shader and having a go at changing around some parameters and adding a little functionality.

The first thing I did was change the smoothness input to a roughness one. This is what I’m used to using and seemed like a cool way to look at the various built in variables that the standard surface shader has.

I changed the smoothness value in SurfaceOutputStandard from _Glossiness (the name of the user smoothness/roughness input) to (1 - _Glossiness) to invert the relationship. This let me define the roughness, rather than the smoothness.
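The change itself is a single line in the surf function:

```hlsl
//invert so the user slider reads as roughness rather than smoothness
o.Smoothness = 1 - _Glossiness;
```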


After that, I added a colour overlay. For this I had a to add a new user input and edit the albedo.

The input parameters, called properties, are defined at the top of the script and take a display name, type and default value. Being someone who’s used to node based and python scripting, having to be type safe is a bit of an adjustment!

I added this, and then added it to the colour definition, which is then used for albedo.
At first, this wouldn’t work, and I kept getting a variable is undefined error. This was because the properties and actual shader code are separate, and I needed to define the variable inside of the subshader itself.
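In other words, the property gives you the inspector slot, but the shader code needs its own matching declaration – sketched here with a hypothetical _OverlayColor input:

```hlsl
//in the Properties block:
//_OverlayColor ("Overlay Colour", Color) = (1,1,1,1)

//inside the SubShader's CGPROGRAM, it has to be declared again,
//or the compiler reports the variable as undefined:
fixed4 _OverlayColor;
```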


Once I’d got the hang of those, I thought I’d add some new functionality to the shader with some scrolling UVs.

Again, I added the properties and then defined them in the subshader. The UVs are defined in the Input struct before the main bulk of the variables. A struct is a data type that stores multiple variables in the same physical memory space. It's a bit like a class, but its members are public by default – in C++ (and therefore HLSL) class members are private by default. This means we can access a specific instance of the struct and everything within it is available to us. For example, below we use IN.uv_MainTex, where IN is the instance of Input that surf takes as a parameter.

To perform the scrolling, we multiply our scroll speed by time then frac it to take only what’s after the decimal point. This prevents the actual gametime from influencing our shader – we just need it to produce a constantly moving number.

This is really common, and something I'm very used to doing in node based shader graphs. The interesting thing here was that I had to use _Time.y – why this and not just _Time? In Unity, _Time is a float4 that contains time/20, time, time*2 and time*3. It's just an easy way of storing a bunch of time-related values that you might be after.

After getting our scroll values, we add them to the original UV coords. The code looks something like this!

fixed2 scrolledUV = IN.uv_MainTex; 
fixed xScrollValue = frac(_ScrollSpeedX * _Time.y);
fixed yScrollValue = frac(_ScrollSpeedY * _Time.y);
scrolledUV += fixed2(xScrollValue, yScrollValue);
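The scrolled coordinates then replace the originals in the texture sample – something like:

```hlsl
fixed4 c = tex2D(_MainTex, scrolledUV) * _Color;
o.Albedo = c.rgb;
```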


This was really fun – I learned a lot, even if I was just doing some very basic things! Next step I’d like to recreate a shader that I’ve made in nodes. I’ve been working on an alpha erosion shader recently, so might do that.


I think this is the first time I’ve ever posted 2D art to this blog! It’s not something I do often, but occasionally it’s nice to come away from code and go back to where this all started.

Here’s a sketch of my D&D character, Kagri. She’s a barbarian with a penchant for ripped up princess dresses…


It’s very “chicken scratch” – if I do more 2D work I’d like to learn how to refine linework nicely.