Subsurface Scattering

For an ice effect I worked on recently, I used this tutorial by Alan Zucconi to learn about the implementation of subsurface scattering in Unity shaders.

The concept and mathematics are explained properly in his tutorial, but in brief: subsurface scattering is when light hits a translucent material and, instead of passing straight through as it would with a transparent one, bounces around inside until it finds its way out. Light that enters at one point therefore exits at a different point.

This can be seen when things appear to glow slightly in real life – think skin, milk and marble.

In code, we build a vector from the light direction plus the surface normal multiplied by a distortion amount, then take the dot product of the view direction with the negation of that vector. We then shape the result using a texture, a scale and a power. Essentially, we are treating light as if it had come straight out of the other side of the object, checking how much of it the player can see given the angle between their view direction and that vector, and distorting the vector based on some user parameters. This distortion is the fake subsurface scattering. In my example, it mostly relies on a thickness texture to allow artist control, which is shown below.
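Paraphrased, the core of Zucconi's formula boils down to these two lines, where L is the light direction, N the surface normal and V the view direction (pseudocode, not my final function, which is further down):

// H = normalize(L + N * distortion)   (light vector bent by the surface normal)
// I = pow(saturate(dot(V, -H)), power) * scale   (how much of the "exiting" light the viewer sees)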

[image]

I also added ambience and intensity controls, which Zucconi talks about in the second part of his tutorial: an artificial scalar for light propagated across the surface, and the overall intensity of that light.

This is all done in a function that extends Unity's standard PBR lighting with a translucency term. Code below!

#include "UnityPBSLighting.cginc"
inline fixed4 LightingStandardTranslucent(SurfaceOutputStandard s, fixed3 viewDir, UnityGI gi)
{
//Original Color
fixed4 pbr = LightingStandard(s, viewDir, gi); //inbuilt lighting function

//Swap out thickness value if area is not ice
thickness *= smoothstep(0, 0.1, (1 - iceAmount));

//Translucency
float3 lightVector = gi.light.dir;
float3 viewDirection = viewDir;
float3 surfaceNormal = s.Normal;

float3 halfPoint = normalize(lightVector + surfaceNormal * _Distortion);
float intensity = pow(saturate(dot(viewDirection, -halfPoint)), _Power) * _Scale; //view direction against the negated, distorted light vector
float subsurface = _Attenuation * (intensity + _Ambient) * thickness;

//Add to PBR
pbr.rgb += gi.light.color * subsurface;
return pbr;
}

void LightingStandardTranslucent_GI(SurfaceOutputStandard s, UnityGIInput data, inout UnityGI gi)
{
LightingStandard_GI(s, data, gi);

}
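For completeness, the lighting function gets hooked up through the surface shader pragma, referenced without its Lighting prefix. A minimal sketch of the rest of the wiring (the texture and property names here are assumptions, not lifted from my shader):

#pragma surface surf StandardTranslucent fullforwardshadows
#pragma target 3.0

struct Input
{
float2 uv_MainTex;
};

float thickness; //shader-scope so the lighting function above can read it

void surf(Input IN, inout SurfaceOutputStandard o)
{
//Assumed properties, for illustration only
o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb * _Color.rgb;
thickness = tex2D(_ThicknessTex, IN.uv_MainTex).r; //sampled here, read in LightingStandardTranslucent
}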

The other shader effects were a mixture of combining different channel packed textures and the alpha erosion and normal recalculation techniques I’ve talked about before on this blog.

The texture below was the main texture, with each channel scrolling separately to create the magical under-surface effect. This looks like it sits below the surface of the ice because the details are not present on the thickness map, and so are independent of the subsurface scattering light function.
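As a rough sketch of what I mean by separately scrolling channels (the texture name, UV input and scroll rates here are made up for illustration):

//Each channel of the packed texture pans at its own rate and direction
float r = tex2D(_UnderTex, i.uv + _Time.y * float2(0.03, 0.01)).r;
float g = tex2D(_UnderTex, i.uv + _Time.y * float2(-0.02, 0.04)).g;
float b = tex2D(_UnderTex, i.uv + _Time.y * float2(0.01, -0.03)).b;
float under = saturate(r + g + b); //recombined into the under-surface detail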

[Image: main channel-packed texture]

This is the thickness map, which is largely used to say which parts of the ice are thicker than others, and thus how much light exits when passing through the surface. By using the Pixelate -> Cells filter in Photoshop, I changed a clouds render into a Voronoi noise pattern. This gave me the nice faceted sort of look that I wanted for my stylized ice.

[Image: thickness map]

The normal map was created from the thickness map, to make sure that reflected light was consistent with transmitted light.

[Image: normal map]

The alpha erosion used a cloud render, with levels applied to increase the value range. Black areas erode sooner than white areas because a subtractive method is used to control coverage.
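In case you haven't seen the earlier posts, subtractive erosion looks something like this (a minimal sketch; _ErosionTex, _Erosion and _Hardness are assumed names):

fixed mask = tex2D(_ErosionTex, i.uv).r;
col.a = saturate((mask - _Erosion) * _Hardness); //dark areas dip below zero first, so they erode sooner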

[Image: alpha erosion texture]

Ice Shader – Hard Edges in the Shader

I started working on an ice shader recently and was trying to replicate a hard edged effect purely in the shader, with a soft edged mesh. My idea was that this would be good for two reasons:

  • We could duplicate the mesh that the ice was growing on, saving memory (one less unique mesh) and saving modelling time.
  • We could keep vertex manipulation in the shader, as hard edges would otherwise break apart when displaced (being made up of additional non-connected vertices).

Hard Edges

To create the hard edges I used the ddx and ddy functions in the fragment shader, which are partial derivatives. What these do is look at a 2×2 block of pixels in screenspace and compare the current pixel with the adjacent one (+1 in x or y depending on the function), giving us an idea of how different they are from one another. This is often used for things like choosing mip levels based on screenspace coverage. For more info on derivative functions, see A Clockwork Berry.

What I've done here is use the fragment's interpolated world position as the comparator. Each function tells us how much the world space position changes from one pixel to the next along that screen axis, giving two vectors that lie in the plane of the triangle. To combine them, we cross them and then normalize, which gives us the normal of the face. This gives us a hard edged normal for each face.

float3 x = ddx(i.worldPos); //change in world position along screen x
float3 y = ddy(i.worldPos); //change in world position along screen y
float3 normal = normalize(cross(x, y)); //face normal from the two in-plane vectors
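For completeness, the world position being differentiated just comes down from the vertex shader in the usual way (a sketch; the TEXCOORD slot is an assumption):

//In v2f: float3 worldPos : TEXCOORD2;
o.worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;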

[Image: hard edged normals]

Luminance

I wanted to remove colour from the above so that I could use it as lighting for my shader.
To convert from colour to greyscale, I took the dot product of the normal and (0.22, 0.707, 0.071), then saturated it.

[image]

The weighting values come from the way that we perceive light. The human eye is most sensitive to green wavelengths, so green contributes most to our perception of luminance; red is next, and blue contributes the least. Coefficients like these come from TV standards and are often used to convert RGB values to greyscale – the classic NTSC set is (0.299, 0.587, 0.114), while the values here are close to the Rec. 709 ones.
We use a dot product to multiply the colour by the luminance factors and sum the result into a scalar. We need a scalar because greyscale values always have R = G = B, if not stored as a single channel.
We use the normal as the colour because, to get hard edges, we want the colour to change every time the angle of the face changes relative to the light. This gives us a banded, hard edged look.

float3 lumi = dot(normal, float3(0.22, 0.707, 0.071)); //scalar luminance, broadcast to float3
float4 lighting = saturate(float4(lumi, 1)); //alpha stays at 1
return col * lighting * _Color;

[image]

Final result with this technique below. It was very interesting and I may use the above techniques in future, but I’m going to go in a different direction with this shader. I want a more triangular, fractal look and this just looks far too uniform.

Also, as I'd need to duplicate the mesh and render it twice (different materials = different draw calls), the very small memory saving from reusing the mesh and manipulating the verts just isn't worth it compared to modelling a separate ice mesh that I have direct control over. The hard edges could then be achieved through texture mapping.

[Image: final result]

LCD Shader

Today I used my lunch break to get started on an LCD shader for my current project. While the monitor casing looks more likely to be CRT, I really like the subpixel element and moiré looks that LCD provides.

[Image: LCD effect]

To create the effect above, I pixelated the main image, then multiplied it with an RGB subpixel component texture.

[Image: RGB subpixel texture]

To pixelate the texture, I multiplied the UV by the number of pixels I wanted on screen, rounded that to create a grid, then divided by the number of pixels to bring the tiling back down, but clamped to the grid. (Shown below with a 10 pixel grid for ease of understanding – the image at the top uses a 100 pixel grid.)

[Images: pixelation steps]

fixed4 pixel = tex2D(_MainTex, (round(i.uv * _Pixel) / _Pixel)); //snap the UV to the grid

I then made sure that the UV used to sample the RGB component texture was multiplied by the pixel count, so the two grids line up.
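That matching step is roughly this, with _SubPixelTex as a placeholder name for the RGB component texture:

fixed4 subPixel = tex2D(_SubPixelTex, i.uv * _Pixel); //one subpixel cell per grid cell
fixed4 col = pixel * subPixel;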

[images]

 

Starting Something New!

This might be the fastest I’ve started something new after finishing a project! I really liked the style I was working in with my pool shader, so wanted to continue.

I’m taking the rose tinted skybox, blown out lighting and simple shapes to a project I’ve had floating about in my head for a while – a sort of adventure game, sort of visual novel, based on internet mysteries and creepypasta. I’m hoping the juxtaposition of cutesy art direction and darker subject matter will make for something interesting!

I’m letting this one lead with art, as I know if I do mechanics first I’ll never get done! The 3D art will be a computer screen and then the majority of the rest of the art will be UI based.

[image]

Below I've got a quick first pass of the model with some Unity default UI stuff in there. The blinking light is an incredibly simple shader – just a couple of lines for that effect!

float blink = round(sin(_Time.y * _Speed) * 0.5 + 0.5); //remap sin to 0-1, then round to a square wave
fixed4 col = _Color * blink;

[image]

Finished Water Shader

Finally done with this! It's been a while since I posted and I can't quite remember what I've done since then, so I'll just post the full shader code at the bottom of this post so you can take a look!

As far as I can remember, I added some smaller waves, changed the uv scroll to be sin based rather than a constant scroll, and made the wee sparkle particles.

[image]

Deciding how to present this one was interesting – I wanted to emphasise the shader but just having a shaderball didn’t really show what it could do. I opted for this super simple style so that the main piece could shine.

[images]

Here’s a video and the full code.

Shader "Unlit/Sh_Water_Unlit_River"
{
	Properties
	{	//Texture pack u freak
		_Color("Body Color 1", Color) = (1,1,1,1)
		_Color2("Edge Color", Color) = (1,1,1,1)
		_Color3("Body Color 2", Color) = (1,1,1,1)
		_MainTexture("Body Texture", 2D) = "white" {}
		_EdgeTexture("Edge Texture", 2D) = "white" {}
		_Distance("Edge Thickness", Float) = 1.0
		_Normal("Normal Map", 2D) = "bump"{}
		_Speed("Wave Speed", Range (0,10)) = 1.0
		_Noise("Wave Texture", 2D) = "white"{}
		_Amount("Wave Amount", Float) = 1.0
		_Speed2("Scroll Speed", Range(0,10)) = 1.0
		_TexAmt("Little Waves Amount", Float) = 1.0

	}
		SubShader
	{
		Tags { "RenderType" = "Transparent" "Queue" = "Transparent" "LightMode" = "ForwardBase"}
		LOD 100
		Blend SrcAlpha OneMinusSrcAlpha

		Pass
		{
			CGPROGRAM
			#pragma vertex vert
			#pragma fragment frag
			#pragma multi_compile_fog

			#include "UnityCG.cginc"

			struct appdata
			{
				float4 vertex : POSITION;
				float2 uv : TEXCOORD0;
				float4 normal : NORMAL;
			};

			struct v2f
			{
				float2 uv : TEXCOORD0;
				UNITY_FOG_COORDS(1)
				float4 vertex : SV_POSITION;
				float4 screenPos : TEXCOORD1; //Custom Data in V2f
				float2 uv2 : TEXCOORD2;
				float4 normal : NORMAL;
				float3 viewDir : TEXCOORD3;
			};

			sampler2D _EdgeTexture;
			float4 _EdgeTexture_ST;
			float4 _Color;
			float4 _Color2;
			float4 _Color3;
			uniform sampler2D _CameraDepthTexture;
			float _Distance;
			sampler2D _Normal;
			float4 _Normal_ST;
			float _Speed;
			float _Speed2;
			sampler2D _Noise;
			float _Amount;
			sampler2D _MainTexture;
			float4 _MainTexture_ST;
			float _TexAmt;

			v2f vert (appdata v)
			{
				v2f o;
				//Wobble
				float4 noiseTex = tex2Dlod(_Noise, float4(v.uv * 20, 0, 0));

				//y pos = sin for general up down movement * texture for waves and sides need to come up at different times
				v.vertex.y += (sin(_Time.y * _Speed + v.vertex.x + v.vertex.z) * _Amount * (noiseTex * 2)); //also fix normal
				v.vertex.y += (sin(_Time.y * _Speed * 0.5) * _TexAmt) * round(noiseTex);

				o.vertex = UnityObjectToClipPos(v.vertex);
				o.uv = TRANSFORM_TEX(v.uv, _EdgeTexture);
				o.uv2 = TRANSFORM_TEX(v.uv, _MainTexture);
				o.screenPos = ComputeScreenPos(o.vertex); //Get vertex in clip space, compute screen position
				o.normal = v.normal;
				o.viewDir = normalize(ObjSpaceViewDir(v.vertex)); //object space view direction from the object space vertex
				UNITY_TRANSFER_FOG(o,o.vertex);
				return o;
			}

			fixed4 frag (v2f i) : SV_Target
			{
				half depth = LinearEyeDepth(SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture, UNITY_PROJ_COORD(i.screenPos)));
				half screenDepth = saturate((depth - i.screenPos.w) / _Distance); 

				float4 noiseTex = tex2D(_Noise, i.uv2 * 0.5);

				float2 pan_uv = float2((i.uv.x * noiseTex.r), (i.uv.y + (sin(_Time.y * _Speed * 1.5) * _Amount * _Speed2)));
				float2 pan_uv2 = float2((i.uv2.x * noiseTex.r), (i.uv2.y + (sin(_Time.y * _Speed * 1.5) * _Amount * _Speed2)));

				float4 noiseTex2 = tex2D(_Noise, pan_uv2);

				fixed4 edge = saturate(tex2D(_EdgeTexture, pan_uv).b) * _Color2;
				fixed4 body = saturate(tex2D(_MainTexture, pan_uv2).b) + lerp(_Color, _Color3, smoothstep(0.25, 0.75, noiseTex2.r));

				float fresnel = max(0, 1 - (dot(i.viewDir, i.normal.xyz)));
				float wobbly_fresnel = fresnel;

				fixed4 color = lerp(edge + body, body, screenDepth);
				fixed4 col = saturate(lerp(color * 0.1, color, wobbly_fresnel));
				col.a = saturate(lerp(edge.a, _Color.a, screenDepth) + wobbly_fresnel * col.a);

				UNITY_APPLY_FOG(i.fogCoord, col);
				return col;
			}
			ENDCG
		}
	}
}

 

Water Shader Updates

First post here. 

Made some updates to my water shader this week!

I started with some vertex movement: a sin wave with the x and z of the vertex added to the phase, to get a rocking motion over the entire object. The further along in x or z a vertex is, the more offset its displacement, and because the sin wave runs in the -1 to 1 range, this reverses over time to give even displacement over the whole object.

I multiplied this by a noise texture to get a nice wavy movement – the displacement will only be applied where the texture is white.

v.vertex.y += (sin(_Time.y * _Speed + v.vertex.x + v.vertex.z) * _Amount * noiseTex);

[image]

I then added some textures to get this looking a bit less like programmer art! I made use of the _ST variable available in Unity, where we can access the user-input tiling and offset from the texture parameter in the format (tiling.x, tiling.y, offset.x, offset.y). This allowed me to have a more versatile shader that's easily manipulated by the user.

sampler2D _EdgeTexture;
float4 _EdgeTexture_ST;

To output this to my fragment shader, I used TRANSFORM_TEX when declaring my uv.

o.uv = TRANSFORM_TEX(v.uv, _EdgeTexture);
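Under the hood, TRANSFORM_TEX is just a tiling multiply plus an offset add – this is the macro from UnityCG.cginc:

//#define TRANSFORM_TEX(tex,name) (tex.xy * name##_ST.xy + name##_ST.zw)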

I used this with panning in both directions and blended it with my colours to add textures to the body and edge. The edge is multiplied as I want the white in the edge texture to be coloured by the colour2 input, whereas the body is added as I want the main body to be color1 with white on top.

float2 pan_uv = i.uv + _Time.y * _Speed2;
float2 pan_uv2 = i.uv2 + _Time.y * _Speed2;
fixed4 edge = saturate(tex2D(_EdgeTexture, pan_uv).b) * _Color2;
fixed4 body = saturate(tex2D(_MainTexture, pan_uv2).b) + _Color;

I then replaced my straight lerp with a lerp between the edge and body colours added together and the body colour. This meant that my edge always stands out and no greys are introduced, while maintaining the colour the user has input.

fixed4 col = saturate(lerp(edge + body, body, screenDepth));

[image]

After that I added in a little fresnel for faked reflections. This isn't quite finished and still needs some tweaks. Fresnel essentially describes how directly you are looking at a surface. To compute it, we check the angle between the camera heading and the object's normal, using a dot product. For unit vectors, the dot product returns 1 when they are parallel and 0 when they are at right angles. If the result is less than 0, we just return 0 using max().

To do this, I got the normal in appdata using its semantic, then added another output to my v2f struct called viewDir.

float4 normal : NORMAL;

float3 viewDir : TEXCOORD3;

I then got the camera heading by using Unity's inbuilt function ObjSpaceViewDir. This returns the object space direction from a given object space vertex position towards the camera.

o.viewDir = ObjSpaceViewDir(v.vertex);

I then use the dot product as explained above and multiply this with a texture for a more painterly result.

float fresnel = max(0, 1 - saturate(dot(i.viewDir, i.normal.xyz)));
float wobbly_fresnel = fresnel * tex2D(_Noise, i.uv).r;

[image]

This looked a bit funky – as it turns out, ObjSpaceViewDir returns an unnormalized vector (its magnitude is the distance to the camera), so the distance from the camera was being taken into consideration. Once I normalized this to get only the direction, I got a much subtler result.
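The fix is just wrapping the existing call in a normalize:

o.viewDir = normalize(ObjSpaceViewDir(v.vertex)); //keep the direction, discard the distance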

[image]

Using Custom Vertex Streams in Unity Shaders

Custom vertex streams are a nice new addition to Unity that allow you to access per-particle data in shaders. This is really nice for velocity or lifetime based shader effects.

To set it up in your particle system, go to the Renderer module and enable custom vertex streams, then add the data that you want. In brackets, it shows the semantic each stream is packed into.

[Image: custom vertex streams dropdown]

Declare the semantic in your appdata, making sure that you're using the right data type. As AgePercent is packed into TEXCOORD0, normally used for particle UVs, the declaration defaults to a float2 – which would drop the extra component. Not changing this caught me out!

float4 uv : TEXCOORD0;

This then needs to go into your v2f struct so you can declare the variables you'll want to use in the fragment shader. For the new variable (in my case age), use a new TEXCOORD.

float2 uv : TEXCOORD0;
float age : TEXCOORD1;

This can then be output from your vertex shader, using the variable name you declared in the struct and the component of the appdata input that matches the packing shown in the custom vertex streams dropdown.

o.age = v.uv.z;

This can then be used to do whatever you want in your fragment shader!

fixed4 col = i.age; //the scalar broadcasts to all four channels, so age shows as greyscale

As you can see below, the particle colour is equal to the age percentage of the particle – going from 0 when it is born to 1 when it dies.
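As one example of putting this to use, you could drive alpha erosion with it (a sketch – the texture and property names here are made up):

fixed mask = tex2D(_MainTex, i.uv).r;
fixed4 col = _Color;
col.a = saturate((mask - i.age) * _Hardness); //the particle dissolves away as it ages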

[Image: particle colour driven by age]

Pretty excited to see what I can do with this!

Water Shader

Wiping your PC’s data to fix it is a good excuse to start a new project right? 😉

I began working on a stylized water shader in Unity this morning. I’m aiming for something like the images below, with depth based texture effects, reflection, refraction, fresnel and a bit of movement.

[Image: water moodboard]

Render Type and Blend Mode

I started off by making a simple transparent shader. I used Unity's unlit preset, as this will allow me to add additional outputs to the vertex-to-fragment struct, which I'll need for getting things like screen position later.

For the blend mode I've set it to use traditional transparency. The generated colour is multiplied by its own alpha, and the colour already on screen is multiplied by one minus the generated colour's alpha. This gives us traditional layered transparency, where the two weights add up to one. The higher the value in the source alpha, the more opaque the generated colour will appear.

I might change this to additive later depending on the type of look I want to achieve.

Tags { "RenderType"="Transparent" "Queue" = "Transparent"}

Blend SrcAlpha OneMinusSrcAlpha
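Written out, what the hardware computes with that blend mode is:

//result = src.rgb * src.a + dst.rgb * (1 - src.a)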

[image]

Depth Fade

The depth fade should colour pixels that are near geometry that intersects with the water plane.

The first thing I needed to do was get the depth texture that is already being rendered as part of the inbuilt forward rendering pipeline in Unity.

uniform sampler2D _CameraDepthTexture;

After this, I added a new entry to my v2f struct that would allow me to get the object's screen position.

float4 screenPos : TEXCOORD1;

I used this in the vertex function to get the screen position, using Unity's built in function from UnityCG.cginc. This takes the vertex's clip space position and uses it to work out the position on screen as a screenspace texture coordinate.

I get the clip space position by using UnityObjectToClipPos, which is in the shader by default anyway. This transforms a position in 3D object space to the 2D clip space of the camera by multiplying it with the model, view and projection matrices.

o.vertex = UnityObjectToClipPos(v.vertex);

o.screenPos = ComputeScreenPos(o.vertex);

I then use this in the fragment shader to get the final colour.

Depth is computed using the LinearEyeDepth function which, given the rendered depth texture, calculates the depth between the camera and each pixel. To sample the depth texture, I used an HLSL macro and gave it the screen position as a UV.

To compare this with our surface, I took the view-space depth of the surface itself (stored in the 4th, homogeneous coordinate of the screen position) and subtracted it from the scene depth, allowing me to compare the depth of what's behind the water with the plane I've generated.

half depth = LinearEyeDepth(SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture, UNITY_PROJ_COORD(i.screenPos)));
half screenDepth = depth - i.screenPos.w;
fixed4 col = saturate(lerp(_Color, _Color2, screenDepth));

This is what screen depth looks like, represented as red = 0 and green = 1. Green marks pixels whose depth difference is 1 or more, as there is no intersecting geometry close behind them.

[Image: screen depth visualised]

There's a wee gotcha here – remember to saturate (clamp 0-1) the final colour value, otherwise you'll end up with negative numbers inverting your colour. I think this is to do with the depth difference still being a humongous eye-space value rather than a 0-1 one, so I'll have a look at fixing that properly.

[image]

Here's the final result of the depth! Next I'm going to look at combining this with a texture to generate some nice edges on objects.

[image]

Accessing Particle Colour in Unity Shaders

Alongside my Ren'Py game, I'm currently working on a campfire scene that incorporates a number of elements including VFX, shaders and prop modelling.

I've just started on the particle fire shader, for which I need to access particle colour. Doing this in CG/ShaderLab is a little more complicated than the "drop in a particle colour node" workflow I'm used to in Unreal/Shader Forge/at work!

[image]

When you change colour over life etc. in Shuriken, you are actually changing the vertex colour of the particle.

To access vertex colour, you need to first define it in the input structs for both your vertex and pixel shaders.

The syntax shown below is for an HLSL semantic: a string attached to a variable that indicates its intended use, which allows the variable to be passed between shader pipeline stages. There are a number of built in semantics that let us access things like the tangent or a texture coordinate; I've used the built in one for colour below.

fixed4 color : COLOR;

Once you've done this, you need to set the colour output from your vertex shader to the vertex colour as defined in the input struct (v).

o.color = v.color;
return o;

You can then use this in the pixel shader, as long as the v2f struct (i) is its input.

fixed4 col = i.color;

Full code for an unlit shader that takes vertex colour below!

Shader "Unlit/color only"
{
	Properties
	{
	}
	SubShader
	{
		Tags { "RenderType"="Opaque" }
		LOD 100

		Pass
		{
			CGPROGRAM
			#pragma vertex vert
			#pragma fragment frag
			// make fog work
			#pragma multi_compile_fog

			#include "UnityCG.cginc"

			struct appdata
			{
				float4 vertex : POSITION;
				float2 uv : TEXCOORD0;
				fixed4 color : COLOR; //per-vertex colour, set by the particle system
			};

			struct v2f
			{
				float2 uv : TEXCOORD0;
				UNITY_FOG_COORDS(1)
				float4 vertex : SV_POSITION;
				fixed4 color : COLOR;
			};
			v2f vert (appdata v)
			{
				v2f o;
				o.vertex = UnityObjectToClipPos(v.vertex);
				o.uv = v.uv;
				o.color = v.color; //pass the vertex colour through to the fragment shader
				UNITY_TRANSFER_FOG(o,o.vertex);
				return o;
			}

			fixed4 frag (v2f i) : SV_Target
			{
				fixed4 col = i.color;
				// apply fog
				UNITY_APPLY_FOG(i.fogCoord, col);
				return col;
			}
			ENDCG
		}
	}
}

Weathering Shader

Been having more fun with HLSL recently! Off the back of my alpha erosion shader, I wanted to use a similar technique to replicate some weathering I saw on objects in the Naples National Archaeological Museum.

[image]

I started off by shifting the alpha erosion technique into an albedo blend. I added the erosion result to the albedo and then multiplied it by the albedo.

I then created a vertex function to push out all the verts within the eroded area along their normal. This gave me the raised blue surface. It could definitely look a lot more elegant, but I think it's not bad for a first try!

void vert(inout appdata_full v) {
//Weathering
float4 wm = tex2Dlod(_AlphaMask, float4(v.texcoord.xy, 0, 0));
float4 w = tex2Dlod(_Weather_Texture, float4(v.texcoord.xy, 0, 0));
wm = weathering_effect(wm, _Erosion);
v.vertex.xyz += lerp(0, v.normal * (_Erosion * .1), wm); //push verts out along their normal where the mask is set
}

The last thing I added was AO for the raised area, to give the look of soft detail shadows. To do this, I used the erosion function again, but pushed the erosion amount so there were more dark areas. I then added this to an inverted version of the original erosion, to leave us with a line that fades out on both sides.

[Image: AO band texture]

I added .2 to this so it wasn’t super dark, and got the small shadow that you see below.
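Sketched in code, that band might look something like this (the names are assumed from the snippets above, not the exact implementation):

float widened = weathering_effect(wm, _Erosion * 2).r; //push the erosion further for more dark area
float band = saturate(widened + (1 - wm.r) + 0.2); //add the inverse of the original erosion, lift by .2
o.Albedo *= band;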

[image]

[image]