Unity Toon Shader Tutorial (2023)

Posted by Michael Sacco
May 15, 2023

What you’ll learn

In this tutorial, you will learn how to write a lit toon shader. You will write a custom shader for Unity URP using HLSL. The shader will receive lighting from a single directional light, and it will cast and receive shadows. This tutorial is based on Unity 2021.3 LTS.

Who am I?

Hi! I’m Michael, the developer behind OccaSoftware. I am writing this tutorial to help you learn about shaders and improve your Unity skills. I want your game to be a success, so I’m happy to do whatever it takes to help empower you and help you build your project.

As the developer for OccaSoftware, I created three toon shader assets for Unity that fit different use cases and different types of projects.

I have also developed more than 30 assets for Unity. More than 20,000 game developers use my game assets to help players fall in love with their games. If you want to learn about custom shading in Unity, I’m your guy.

Introduction to Toon Shading

So what is Toon Shading anyway?

Toon shading is a type of expressive rendering.

You can most easily understand expressive rendering in contrast to photorealistic rendering.

Photorealistic rendering aims to accurately render materials as the human eye would perceive them, using physically-based rendering algorithms.

In contrast, expressive rendering aims to enable art-driven rendering to achieve specific visual styles.

Toon shading is also sometimes called cel shading. According to Wikipedia, the name cel shading "comes from cels (short for celluloid), clear sheets of acetate which were painted on for use in traditional 2D animation". I’ll just say that this was before my time and move on.

Toon shading is most often used to mimic the look of traditional cartoons, comics, or animations.

This shading style is characterized by having a limited number of discrete shades of color to create a flat look with blocks of color. Toon shading typically features some specular highlights as well as rim lighting.
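As a preview of what this looks like in code, the "limited number of discrete shades" idea can be expressed in a few lines of HLSL. This is purely an illustrative sketch with names of my own choosing, not code from the shader we will build:

```hlsl
// Quantize a smooth 0-to-1 lighting value into a fixed number of bands.
// 'shades' controls how many flat blocks of color appear.
float Posterize(float lighting, float shades)
{
    return floor(lighting * shades) / shades;
}
```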

Outline Shaders

In many projects that use toon shading, artists also often use outline shaders to further sell the cartoon look. Outline rendering is a complex topic on its own, so it is outside of the scope of this tutorial.

You will typically see toon shading applied only on character models with level geometry rendered using traditional shading techniques.

Examples of Toon Shading

At this point, I’m sure you’re wondering whether toon shading has been used in production titles.

If you’re familiar with the gaming world, then you already know that the answer is a resounding: yes. Toon shading is an extremely popular art style in games.

One of the most famous historical examples of toon shading is Jet Set Radio. I loved the Monster Rancher series, so I’ll also give a shout-out to Monster Rancher 3. More recent examples include Astral Chain, Genshin Impact, and The Legend of Zelda: Tears of the Kingdom. In my opinion, toon shading is the best fit for games that want to communicate an anime or cartoon-inspired aesthetic.

Prerequisites

For this tutorial, I will assume that you are using Unity 2021.3 LTS with the Universal Render Pipeline.

This tutorial is for writing shaders in Unity URP, which I think of as the default render pipeline as of this writing. If you don’t have the Universal Render Pipeline installed in your project, install it now.

This tutorial is designed for the Forward rendering path. Technically, it "just works" with the Deferred path as well because we will use the ForwardOnly LightMode.

I also assume that you are fairly comfortable using the Unity engine and that you have a basic understanding of programming.

Set up your project

Before we start writing our shader, we will set up a simple test environment.

  1. Create a new scene and open it.
  2. Add a plane, center it on the origin.
  3. Add a sphere, set the world position to (0, 1, 0).
  4. Move your camera from (0, 0, -10) to (0, 0, -3) so that we can see our sphere a little better.
  5. Create a new material in your Project folder, then apply it to the sphere.

This is what my test environment looks like:

Note the newly-created Toon Shader Material asset in the Project view (and applied to the Sphere). We will come back to this material later.

(and ignore my messy project directory, thank you).

Write the shader boilerplate

Now that we have created our test environment, we will create our shader file, write some boilerplate to get our shader off the ground, and apply it to the material.

Create your shader file

In your project directory, create the shader file. I normally use the Unlit Shader type as my default. Open it in your text editor of choice.

Fortunately, Unity has populated our shader file with all of the boilerplate we need, so we’re already done with this section.

…

……

…………

……………………

Sorry, I lied. :)

Unity did populate the shader with what appears to be helpful boilerplate. Unfortunately, the code in the shader is applicable to the Built-In Render Pipeline.

We are not using the Built-In Render Pipeline for this tutorial. Therefore, this code is 100% useless to us.

We will need to write our own boilerplate.

Fortunately, I made boilerplate files for you that you can download and import directly to your project.

(Now is when you say "Thanks, Michael!").

This is our .shader file:

/*
This is the name of the Shader in the inspector.
You can set directory hierarchy using forward slashes,
like shown here.
*/
Shader "My Custom Shaders/Toon Shader"
{
       /*
       The properties block enables you to edit and save the defined
       Material properties.
       If you don't define anything here, you can still set properties 
       by code, but you can't edit properties from the inspector, and
       changes don't persist between sessions.
       */ 
       Properties
       {
           [MainTexture] _ColorMap ("Color Map", 2D) = "white" {}
           [MainColor] _Color ("Color", Color) = (0.91, 0.91, 0.38, 1)
       }
       
       
       /*
       SubShaders let you define settings and programs that can vary
       depending on hardware, render pipeline, and runtime settings.
       
       This is a more complex topic, but it enables you to write
       shaders that can easily work between different render pipelines.
       */
       
       SubShader
       {
       
           /* 
           We need to make sure the Tags includes our RenderPipeline 
           so that this shader works properly.
           */
           
           Tags { "RenderType"="Opaque" "RenderPipeline"="UniversalPipeline" }
           
           
           /*
           These are Shader Commands that control 
           GPU-side rendering properties.
           */
           
           Cull Back
           ZWrite On
           ZTest LEqual
           ZClip Off
           
           Pass
           {
               /* 
               The lightmode tag is extremely important.
               Unity sets up and runs render passes.
               These render passes render objects depending on the
               included LightMode tags. 
               
               e.g., "Render only UniversalForwardOnly in this pass".
               
               Our other LightModes (ShadowCaster, DepthOnly, DepthNormalsOnly)
               are automatically queued up and rendered by Unity during
               the appropriate render pass.
               */
               
               Name "ForwardLit"
               Tags {"LightMode" = "UniversalForwardOnly"}
               
               
               HLSLPROGRAM
               
               // These #pragma directives make fog and Decal rendering work.
               #pragma multi_compile_fog
               #pragma multi_compile_fragment _ _DBUFFER_MRT1 _DBUFFER_MRT2 _DBUFFER_MRT3
               
               // These #pragma directives set up Main Light Shadows.
               #pragma multi_compile _ _MAIN_LIGHT_SHADOWS _MAIN_LIGHT_SHADOWS_CASCADE _MAIN_LIGHT_SHADOWS_SCREEN
               #pragma multi_compile _ _SHADOWS_SOFT
               
               // These #pragma directives define the function names
               // for our Vertex and Fragment stage functions
               #pragma vertex Vertex
               #pragma fragment Fragment
               
               #include "ToonShaderPass.hlsl"
               
               ENDHLSL
           }
           
           Pass
           {
               Name "ShadowCaster"
               Tags {"LightMode" = "ShadowCaster"}
               
               
               HLSLPROGRAM
               
               // This define lets us take an alternate path 
               // when we get the Clipspace Position during the Vertex stage.
               #define SHADOW_CASTER_PASS
               
               #pragma vertex Vertex
               
               // In this case, we want to use the FragmentDepthOnly
               // function instead of the Fragment function we 
               // used in the ForwardLit pass.
               #pragma fragment FragmentDepthOnly
               
               #include "ToonShaderPass.hlsl"
               
               ENDHLSL
           }
           
           Pass
           {
               Name "DepthOnly"
               Tags {"LightMode" = "DepthOnly"}
               
               
               HLSLPROGRAM
               
               #pragma vertex Vertex
               
               // Our DepthOnly Pass and ShadowCaster pass
               // both use the FragmentDepthOnly function
               #pragma fragment FragmentDepthOnly
               
               #include "ToonShaderPass.hlsl"
               
               ENDHLSL
           }
           
           Pass
           {
               Name "DepthNormalsOnly"
               Tags {"LightMode" = "DepthNormalsOnly"}
               
               
               HLSLPROGRAM
               
               #pragma vertex Vertex
               
               // And our DepthNormalsOnly pass uses our
               // Fragment function named FragmentDepthNormalsOnly.
               #pragma fragment FragmentDepthNormalsOnly
               
               #include "ToonShaderPass.hlsl"
               
               ENDHLSL
           }
       }
}

And our .hlsl include file:

#ifndef MY_TOON_SHADER_INCLUDE
#define MY_TOON_SHADER_INCLUDE
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Lighting.hlsl"
#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/SpaceTransforms.hlsl"
// See com.unity.render-pipelines.universal/ShaderLibrary/ShaderVariablesFunctions.hlsl
///////////////////////////////////////////////////////////////////////////////
//                      CBUFFER                                              //
///////////////////////////////////////////////////////////////////////////////
/*
Unity URP requires us to set up a CBUFFER
(or "Constant Buffer") of Constant Variables.
These should be the same variables we set up 
in the Properties.
This CBUFFER is REQUIRED for Unity
to correctly handle per-material changes
as well as batching / instancing.
Don't skip it :)
*/
// Textures and samplers are resource objects, not constants,
// so they must be declared outside of the CBUFFER.
TEXTURE2D(_ColorMap);
SAMPLER(sampler_ColorMap);

CBUFFER_START(UnityPerMaterial)
       float4 _ColorMap_ST;
       float3 _Color;
CBUFFER_END
///////////////////////////////////////////////////////////////////////////////
//                      STRUCTS                                              //
///////////////////////////////////////////////////////////////////////////////
/*
Our attributes struct is simple.
It contains the Object-Space Position
and Normal Direction as well as the 
UV0 coordinates for the mesh.
The Attributes struct is passed 
from the GPU to the Vertex function.
*/
struct Attributes
{
       float4 positionOS : POSITION;
       float3 normalOS   : NORMAL;
       float2 uv         : TEXCOORD0;
       
       // This line is required for VR SPI to work.
       UNITY_VERTEX_INPUT_INSTANCE_ID
};
/*
The Varyings struct is also straightforward.
It contains the Clip Space Position, the UV, and 
the World-Space Normals.
The Varyings struct is passed from the Vertex
function to the Fragment function.
*/
struct Varyings
{
       float4 positionHCS     : SV_POSITION;
       float2 uv              : TEXCOORD0;
       float3 positionWS      : TEXCOORD1;
       float3 normalWS        : TEXCOORD2;
       
       // This line is required for VR SPI to work.
    UNITY_VERTEX_INPUT_INSTANCE_ID
       UNITY_VERTEX_OUTPUT_STEREO
};
///////////////////////////////////////////////////////////////////////////////
//                      Common Lighting Transforms                           //
///////////////////////////////////////////////////////////////////////////////
// This is a global variable; Unity sets it for us.
float3 _LightDirection;
/*
This is a simple lighting transformation.
Normally, we just return the WorldToHClip position.
During the Shadow Pass, we want to make sure that Shadow Bias is baked 
in to the shadow map. To accomplish this, we use the ApplyShadowBias
method to push the world-space positions in their normal direction by the bias amount.
We define SHADOW_CASTER_PASS during the setup for the Shadow Caster pass.
*/
float4 GetClipSpacePosition(float3 positionWS, float3 normalWS)
{
       #if defined(SHADOW_CASTER_PASS)
           float4 positionHCS = TransformWorldToHClip(ApplyShadowBias(positionWS, normalWS, _LightDirection));
           
           #if UNITY_REVERSED_Z
               positionHCS.z = min(positionHCS.z, positionHCS.w * UNITY_NEAR_CLIP_VALUE);
           #else
               positionHCS.z = max(positionHCS.z, positionHCS.w * UNITY_NEAR_CLIP_VALUE);
           #endif
           
           return positionHCS;
       #endif
       
       return TransformWorldToHClip(positionWS);
}
/*
These two functions give us the shadow coordinates 
depending on whether screen shadows are enabled or not.
We have two methods here, one with two args (positionWS
and positionHCS), and one with just positionWS.
The two-arg method is faster when you have 
already calculated the positionHCS variable.
*/
float4 GetMainLightShadowCoord(float3 positionWS, float4 positionHCS)
{
       #if defined(_MAIN_LIGHT_SHADOWS_SCREEN)
           return ComputeScreenPos(positionHCS);
       #else
           return TransformWorldToShadowCoord(positionWS);
       #endif
}
float4 GetMainLightShadowCoord(float3 PositionWS)
{
       #if defined(_MAIN_LIGHT_SHADOWS_SCREEN)
           float4 clipPos = TransformWorldToHClip(PositionWS);
           return ComputeScreenPos(clipPos);
       #else
    return TransformWorldToShadowCoord(PositionWS);
       #endif
}
/*
This method gives us the main light as an out parameter.
The Light struct is defined in 
"Packages/com.unity.render-pipelines.universal/ShaderLibrary/RealtimeLights.hlsl",
so you can reference it there for more details on its fields.
This version of the GetMainLight method doesn't account for Light Cookies.
To account for Light cookies, you need to add the following line to your shader pass:
#pragma multi_compile _ _LIGHT_COOKIES
and also call a different GetMainLight method:
GetMainLight(float4 shadowCoord, float3 positionWS, half4 shadowMask)
*/
void GetMainLightData(float3 PositionWS, out Light light)
{
       float4 shadowCoord = GetMainLightShadowCoord(PositionWS);
       light = GetMainLight(shadowCoord);
}
///////////////////////////////////////////////////////////////////////////////
//                      Functions                                            //
///////////////////////////////////////////////////////////////////////////////
/*
The Vertex function is responsible 
for generating and manipulating the 
data for each vertex of the mesh.
*/
Varyings Vertex(Attributes IN)
{
       Varyings OUT = (Varyings)0;
       
       // These macros are required for VR SPI compatibility
       UNITY_SETUP_INSTANCE_ID(IN);
    UNITY_TRANSFER_INSTANCE_ID(IN, OUT);
    UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(OUT);
       
       
       // Set up each field of the Varyings struct, then return it.
       OUT.positionWS = mul(unity_ObjectToWorld, IN.positionOS).xyz;
       OUT.normalWS = TransformObjectToWorldNormal(IN.normalOS);
       OUT.positionHCS = GetClipSpacePosition(OUT.positionWS, OUT.normalWS);
       OUT.uv = TRANSFORM_TEX(IN.uv, _ColorMap);
       
       return OUT;
}
/*
The FragmentDepthOnly function is responsible 
for handling per-pixel shading during the 
DepthOnly and ShadowCaster passes.
*/
float FragmentDepthOnly(Varyings IN) : SV_Target
{
       // These macros are required for VR SPI compatibility
       UNITY_SETUP_INSTANCE_ID(IN);
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
       
       return 0;
}
/*
The FragmentDepthNormalsOnly function is responsible 
for handling per-pixel shading during the 
DepthNormalsOnly pass. This pass is less common, but
can be required by some post-process effects such as SSAO.
*/
float4 FragmentDepthNormalsOnly(Varyings IN) : SV_Target
{
       // These macros are required for VR SPI compatibility
       UNITY_SETUP_INSTANCE_ID(IN);
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
       
       return float4(normalize(IN.normalWS), 0);
}
/*
The Fragment function is responsible 
for handling per-pixel shading during the Forward 
rendering pass. We use the ForwardOnly pass, so this works
by default in both Forward and Deferred paths.
*/
float3 Fragment(Varyings IN) : SV_Target
{
       // These macros are required for VR SPI compatibility
    UNITY_SETUP_INSTANCE_ID(IN);
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
       
       return 0;
}
#endif

Our boilerplate consists of two files: a .shader file and a .hlsl file. Our .shader file has #include directives that cause the contents of our .hlsl file to be automatically imported into our .shader file.

We structure our files this way because it makes it easier for us to split up our boilerplate shader pass setup and #pragma directives from our core shader functions.

It also makes it easy for us to share the same core shader functions across multiple shader passes.

In this boilerplate setup, we have four shader passes. If we write our code inline, we need to copy and paste our code in four separate places - once for each pass. By using the #include approach, we only have to write our code once.

Understanding the Shader Boilerplate

At a high level, Shaders consist of the following elements:

  • The Shader block and its Properties
  • One or more SubShaders, with their Tags and Shader Commands
  • One or more Passes, each containing an HLSL program

I will not go into detail on each of these elements as part of this tutorial, but just be aware that they exist and that I set all of this up for you in the boilerplate downloads.

Also be aware that we will need to go back and add new properties to the Properties block, which means we also need to add new properties to the CBuffer block as well.

Take a few minutes to explore the .shader and .hlsl files and familiarize yourself with them. I wrote dozens of comments to explain the purpose of nearly every line of code, so take a look.

Getting Started with Lighting

Finally! We are done setting up the scene and importing our boilerplate. Now, we can have some fun with introductory concepts to lighting.

Let’s take a step back and imagine time and space before the big bang. Everything was dark. No stars, no moons, no planets, no atmospheres, no rocks, no nothing.

Now imagine that you create a perfectly round, perfectly white sphere, similar to the one that we have in our scene view right now. What would you see?

Well, nothing. Everything around you is black. No light is hitting the sphere because there is no light in this universe. The sphere is technically white. But without any light to illuminate it, it appears completely black. You can touch it, but you can’t see it. This is what happens without lighting.

Diffuse Lighting

Now imagine that we create a star, our sun, and lob it pretty far away. This star is bright, so the light from this star is hitting our sphere. The rest of the universe is still completely black, since there is nothing for the light from this sun to bounce off of.

But the sphere, well half of it at least, is now illuminated with the light of the sun. You can now see that the sphere is white. The sphere is most bright at the point that is directly facing the sun, and then gradually becomes darker until we get halfway round the sphere, at which point it is completely dark. This half, the half facing away from the sun, is still in complete darkness - no light is hitting it.

How can we think about this phenomenon? Each point on the surface of the sphere has a normal vector. The normal vector is the direction that is perpendicular to the surface for each point.

The normal vectors of a warped plane:

The normal vectors of a sphere:

When the normal vector on our sphere is perfectly opposite to the direction of the sun’s light rays, let’s say that the sphere is receiving 100% of the light from the sun. When the normal vector on our sphere is perpendicular to the direction of the sun’s light rays, then let’s say that our sphere is receiving 0% of the light from the sun. This is the intuition for the diffuse component lighting attenuation model that we will use later.
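This intuition maps directly to a dot product in HLSL. As a sketch (the variable names are illustrative; saturate clamps the negative, away-facing values to 0):

```hlsl
// 1 where the surface directly faces the light,
// falling to 0 at 90 degrees and beyond.
float diffuse = saturate(dot(normalWS, lightDirectionWS));
```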

Accounting for the Color of Light

Now imagine that we tinted the sun red. What color would the sphere appear to be? To your eyes, the red light of the sun would appear to tint the white surface of the sphere so that the sphere also now looks red. It is like we multiplied the color of the sphere by the color of the sun. This is the intuition for how we will blend the light color with the material color later on. Materials in the real world have a base color that is neither perfectly black nor perfectly white, but rather fall into a range of values in between.
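In shader terms, this tinting really is just a per-channel multiply. A sketch with illustrative names:

```hlsl
// A red light (1, 0, 0) times a white surface (1, 1, 1) yields red.
float3 shadedColor = surfaceColor * lightColor * diffuse;
```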

Specular Lighting

The light also has a specular component to it. We won’t go into the math here, but think about it like this. Assume that the ball is a little bit reflective. There will be some direction between your eyes and the ball where the light from the sun is reflected directly from the ball to your eyes, resulting in a bright highlight. This is what we call a specular term.
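The standard way to approximate this effect is the Blinn-Phong half-vector technique, which we will build step by step later. As a preview sketch (the names and the exponent are illustrative):

```hlsl
// The half vector sits halfway between the light and view directions.
float3 halfVector = normalize(lightDirectionWS + viewDirectionWS);

// The closer the surface normal is to the half vector, the brighter
// the highlight. Larger exponents produce tighter highlights.
float specular = pow(saturate(dot(normalWS, halfVector)), 32.0);
```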

Ambient Lighting Term

Now imagine that we surround ourselves, the sun, and the sphere - the entire universe - with an infinitely large sphere, and that this sphere emits some color of light. This color will apply evenly to all parts of the sphere. This is what we will use for our ambient term.
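In URP, one common source for this term is SampleSH(normalWS), which samples the scene's ambient and light-probe data; a flat color property works too. As a sketch with illustrative names:

```hlsl
// Ambient light applies evenly, regardless of the main light's direction.
float3 ambient = SampleSH(normalWS);
float3 finalColor = surfaceColor * (directLighting + ambient);
```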

Getting Started with Code

So, we’ve introduced the intuition for how we will approach diffuse lighting, specular lighting, ambient lighting, and how we will account for the light’s intrinsic color. Now we will get into the meat of the coding. We’ll start by setting up directional lighting and shadows, we’ll add specular and rim lighting, make some quality improvements, and then incorporate ambient lighting to finish up.

Setting up Directional Lighting

The first thing we will add for our shader is directional lighting. Let’s focus on the Fragment function in our ToonShaderPass.hlsl.

To add directional lighting, we will need a few pieces of information about the object and the material at this position.

  • What is the material’s color?
  • What is the normal vector?
  • What is the light’s color and direction?

We will work through these problems one-by-one.

Getting the Material’s Color

In our Properties and CBuffer, we defined a Color property (a color) and a ColorMap property (a texture). We can use these properties directly and update the Fragment shader like so:

float3 Fragment(Varyings IN) : SV_Target
{
     // These macros are required for VR SPI compatibility
    UNITY_SETUP_INSTANCE_ID(IN);
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
     
     return _Color * SAMPLE_TEXTURE2D(_ColorMap, sampler_ColorMap, IN.uv);
}

Your sphere will now render the same color as the input, effectively acting as an unlit material.

Getting the Main Light Properties

Next, we need to get the main directional light (and hopefully some additional information about it as well).

To get the main light, we can use one of the helper methods that I included in the ToonShaderPass.hlsl file: GetMainLightData(float3 PositionWS, out Light light).

This method expects a world-space position as the first argument and a light struct as the second argument.

Our Varyings struct contains the world-space position, but we need to create a variable for the light struct and pass that in, like so:

float3 Fragment(Varyings IN) : SV_Target
{
     // These macros are required for VR SPI compatibility
     UNITY_SETUP_INSTANCE_ID(IN);
     UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
       
     Light light;
     GetMainLightData(IN.positionWS, light);
       
     return _Color * SAMPLE_TEXTURE2D(_ColorMap, sampler_ColorMap, IN.uv);
}

This method populates the light data for us. The Light struct contains information about the light, including the direction, color, and shadow attenuation.

Using the Main Light Data

We will use the light color and light direction to set up the directional lighting.

We need to compute the extent to which the main directional light is illuminating each point on the surface of the sphere. Remembering our intuition from earlier, we need to figure out the extent to which the normal vector and light direction vector are aligned. We need the dot product, which tells us how much two vectors point in the same direction.

Fortunately, there’s a handy intrinsic method built into HLSL that can calculate this for us: dot(x, y). We can use the XoY notation (read as X dot Y) to denote a variable that is a result of a dot product. In this case, we call the variable NoL (N dot L) to represent the dot product of the Normal and Light vectors.

float3 Fragment(Varyings IN) : SV_Target
{
       // These macros are required for VR SPI compatibility
    UNITY_SETUP_INSTANCE_ID(IN);
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
       
       Light light;
       GetMainLightData(IN.positionWS, light);
       float NoL = dot(IN.normalWS, light.direction);
       
       return _Color * SAMPLE_TEXTURE2D(_ColorMap, sampler_ColorMap, IN.uv) * NoL * light.color;
}

Our sphere is now lit according to the intuition that we formed earlier.

We also incorporated the main light’s color. Your sphere might look a little yellow now depending on the Directional Light’s color in your scene.

Take a few minutes to play with the light intensity, light color, and light position to see how it affects the sphere.

Switching to Toon Rendering

Of course, toon shading doesn’t look like this, so we need to take a few more steps to make sure that we render in the toon style. The classic toon style has a sharp cut-off between "lit" and "unlit" positions on the object.

To achieve this look, we are going to use the step(y, x) HLSL intrinsic. Step compares x and y: if x is greater than or equal to y, it returns 1; otherwise, it returns 0. In our case, we know from our lighting setup that NoL > 0 for all normal vectors facing toward the light source and NoL < 0 for all normal vectors facing away from it.

So, we will use step(0, NoL), which evaluates to 1 for all points facing the light source and 0 for all points facing away from the light source.

Calculate this using a new float variable called toonLighting, then replace NoL in our lighting calculation with our new toonLighting variable.

float3 Fragment(Varyings IN) : SV_Target
{
       // These macros are required for VR SPI compatibility
    UNITY_SETUP_INSTANCE_ID(IN);
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
       
       Light light;
       GetMainLightData(IN.positionWS, light);
       float NoL = dot(IN.normalWS, light.direction);
       
       float toonLighting = step(0, NoL);
       
       return _Color * SAMPLE_TEXTURE2D(_ColorMap, sampler_ColorMap, IN.uv) * toonLighting * light.color;
}

Congratulations! You’ve successfully set up your first toon directional lighting shader. Next up, we will work on shadows.

P.S. The edge where light turns dark is a little pixelated. Don’t worry - we will revisit this artifact later on in our quality pass.

Receiving Shadows

You may have observed that your sphere is already casting shadows. This is thanks to the ShadowCaster pass that I included in our boilerplate. This is great, but we need to do a little extra work to make sure that our sphere can receive shadows and that we render these shadows in that signature toon style.

To make sure that we have an appropriate test environment for programming our shadow-receiving functionality, let’s make one small change to our demo scene. Simply add another sphere in a position where it would cast shadows on the toon-shaded one. You can tell that it will when the two shadows on the plane overlap.

What’s going on here? Why is our sphere casting shadows but not receiving any?

The reason is that shadow casting happens during the ShadowCaster pass, which I’ve set up for you in our boilerplate. During the ShadowCaster pass, Unity renders a depth texture containing all of the opaque shadow-casting objects in your scene. This texture is called a Shadow Map.

Meanwhile, the shadow receiving happens during the material’s rendering pass (here the ForwardOnly pass). To receive shadows, each material samples that Shadow Map and determines whether it is in shadow or not. If it’s in shadow, then it shades itself. This is what we are going to set up.

For us, we can get the results of that Shadow Map sample as part of our light variable. The Light struct includes a shadowAttenuation variable, which represents the shadows that should be applied to our sphere. To incorporate this into our lighting, it’s as simple as multiplying our return value by the light.shadowAttenuation variable.

float3 Fragment(Varyings IN) : SV_Target
{
       // These macros are required for VR SPI compatibility
    UNITY_SETUP_INSTANCE_ID(IN);
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
       
       Light light;
       GetMainLightData(IN.positionWS, light);
       float NoL = dot(IN.normalWS, light.direction);
       
       float toonLighting = step(0, NoL);
       
       return _Color * SAMPLE_TEXTURE2D(_ColorMap, sampler_ColorMap, IN.uv) * toonLighting * light.color * light.shadowAttenuation;
}

Great! Only one problem - it’s not in the toon style.

At this point, it’s a simple matter of re-using what we learned from making NoL in the toon style. We can re-use the step function to make sure that shadows have this hard edge as well.

float3 Fragment(Varyings IN) : SV_Target
{
       // These macros are required for VR SPI compatibility
    UNITY_SETUP_INSTANCE_ID(IN);
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
       
       Light light;
       GetMainLightData(IN.positionWS, light);
       float NoL = dot(IN.normalWS, light.direction);
       
       float toonLighting = step(0, NoL);
       float toonShadows = step(0.5, light.shadowAttenuation);
       
       return _Color * SAMPLE_TEXTURE2D(_ColorMap, sampler_ColorMap, IN.uv) * toonLighting * light.color * toonShadows;
}

In this case, we create a new float variable, toonShadows. Then, we call the step function and pass in 0.5 as the y arg and light.shadowAttenuation as the x arg.

Why 0.5? Looking at the traditionally lit shadows from before, you can see that the shadow attenuation forms a gradient from 0 (fully shadowed) to 1 (fully unshadowed). For our toon shader, we want the cut-off to fall somewhere along this gradient rather than having our shadows end too early or too late, and 0.5 is the midpoint.
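If you ever want to tune this cut-off per material, you can expose it as a property instead of hard-coding 0.5. This is an optional sketch, not part of the tutorial’s boilerplate; the _ShadowThreshold name is my own:

```hlsl
// Hypothetical property. Add to the Properties block as:
//   _ShadowThreshold ("Shadow Threshold", Range(0, 1)) = 0.5
// and declare it inside the UnityPerMaterial CBUFFER:
float _ShadowThreshold;

// The cut-off is now adjustable from the Material inspector.
float toonShadows = step(_ShadowThreshold, light.shadowAttenuation);
```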

Our directional shadows are now working! Next, we will set up some advanced lighting characteristics: specular lighting and rim lighting.

Specular Lighting

In this section, we will add a specular lighting term to our toon shader.

As we reviewed earlier, you can think of the specular lighting term as light bouncing off our shiny sphere and being reflected toward your eyes. To estimate this, we need to figure out "the direction the surface normal would need to be facing in order for the viewer to see a specular reflection from the light source". This direction is commonly referred to as the half vector.

The half vector is the vector halfway between the light direction and the view direction. It can be calculated by summing the light direction and the view direction, then normalizing the result.

To normalize a vector in HLSL, call the normalize() intrinsic.
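As a quick refresher, normalize() divides a vector by its own length, producing a unit-length direction vector:

```hlsl
// normalize(v) is equivalent to v / length(v).
float3 v = float3(3.0, 0.0, 4.0); // length(v) == 5
float3 n = normalize(v);          // (0.6, 0.0, 0.8), which has length 1
```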

In other words:

float3 halfVector = normalize(light.direction + viewDirectionWS);

To calculate this vector, we need two pieces of information:

  • The light direction (which we have), and
  • The view direction (which we don’t have yet).

So, we need to calculate the view direction, then form the half vector, then use the half vector to create our specular term.

Get the View Direction and Create the Half Vector

We need to add the viewDirectionWS variable to the Varyings struct so that we can pass it to the Fragment function. Do that now by adding this line to your Varyings struct:

float3 viewDirectionWS : TEXCOORD3;

Next, we need to calculate the value for this variable in the Vertex function. We will use the GetWorldSpaceViewDir(positionWS) method to calculate this value. We want to make sure that the output value is normalized, so we also normalize it. Do this now by adding this line to your vertex function after you have calculated the OUT.positionWS variable.

OUT.viewDirectionWS = normalize(GetWorldSpaceViewDir(OUT.positionWS));

Finally, we need to calculate the half vector in the fragment function. We can create this vector by summing the light direction and the view direction, then normalizing the result. This is like averaging the two vectors. Do this now by adding the following line to your fragment function (before the return, obviously :)).

float3 halfVector = normalize(light.direction + IN.viewDirectionWS);
// [...]
// [...]
struct Varyings
{
      // [...]
       float3 normalWS        : TEXCOORD2;
       float3 viewDirectionWS : TEXCOORD3; // Add the View Direction to our Varyings struct. Use the TEXCOORD3 semantic.
       
      // [...]
};
// [...]
// [...]
Varyings Vertex(Attributes IN)
{
       // [...]
       
       // We need to calculate the positionWS first, since
       // it is an argument for our GetWorldSpaceViewDir method.
       OUT.positionWS = mul(unity_ObjectToWorld, IN.positionOS).xyz;
       
       
       // Then, use the GetWorldSpaceViewDir(positionWS) 
       // method to calculate the view direction.
       OUT.viewDirectionWS = normalize(GetWorldSpaceViewDir(OUT.positionWS)); 
       
       // [...]
}
// [...]
// [...]
float3 Fragment(Varyings IN) : SV_Target
{
       // [...]
       float toonLighting = step(0, NoL);
       float toonShadows = step(0.5, light.shadowAttenuation);
       
       
       // Calculate the half vector by normalizing the 
       // result of the light direction + the view direction.
       // This is like averaging the two vectors.
       
       float3 halfVector = normalize(light.direction + IN.viewDirectionWS); 
       // [...]
}

Calculating NoH

Now that we have our half vector, we can calculate our specular term and apply it to the material.

Reusing our new knowledge of the dot product, we can measure the alignment between the surface normal and our half vector by computing NoH. In this case, it's important to make sure that our NoH is >= 0, so we will also use the max(x, y) intrinsic.

float NoH = max(dot(IN.normalWS, halfVector), 0);

Now that we have calculated the NoH, we can use it to evaluate our specular term.

Thinking about the Specular Term

Let’s take a second to talk about the specular term. Where should the specular lighting appear? If the position is in shadow, then it should logically receive no specular lighting. If the position is facing away from the light source, it should also receive no specular lighting, since there is no light hitting the surface for it to reflect.

We need to incorporate our shadow and NoL terms into our specular equation. Just like our diffuse lighting, we also need to incorporate the light color.

Finally, we need some property to help us control the falloff for the specular term. This property is often called Smoothness, Glossiness, or Roughness (which is just 1 - Smoothness). We will call our property smoothness.

We will start by adding this new property to our Shader Properties and our CBUFFER, then we will calculate our specular term, and finally we will do a little clean-up.

Add Smoothness to our Shader Properties and CBUFFER

To add Smoothness to your Shader Properties, re-open your ToonShader.shader file. Look for the Properties section near the top, then add our new smoothness property just after the _Color property.

_Smoothness ("Smoothness", Float) = 16.0

We also need to add our new smoothness property in the CBUFFER block. So, go back to your ToonShaderPass.hlsl and locate it. Then, add the _Smoothness property like so:

float _Smoothness;

Your Property block and CBUFFER blocks should look like this now:

// In ToonShader.shader:
Properties
{
       [MainTexture] _ColorMap ("Color Map", 2D) = "white" {}
       [MainColor] _Color ("Color", Color) = (1.0, 1.0, 1.0, 1.0)
       _Smoothness ("Smoothness", Float) = 16.0
}
//[...]
// In ToonShaderPass.hlsl:
// Note: texture and sampler objects cannot go inside a constant
// buffer, so we declare them just above the CBUFFER block.
TEXTURE2D(_ColorMap);
SAMPLER(sampler_ColorMap);

CBUFFER_START(UnityPerMaterial)
       float4 _ColorMap_ST;
       float3 _Color;
       float _Smoothness;
CBUFFER_END

Calculate our Specular Term

To calculate the specular term, we need to incorporate the shadow and NoL terms as well.

We start with the raw specular term:

float specularTerm = pow(NoH, _Smoothness * _Smoothness);

Then, we can attenuate this term using our toonShadows and toonLighting variables:

specularTerm *= toonLighting * toonShadows;

To toon-ify it, we re-use the step function using a small number together with our specularTerm:

specularTerm = step(0.01, specularTerm);

Finally, we combine it into the final lighting function by adding it with the directional lighting.

return _Color * SAMPLE_TEXTURE2D(_ColorMap, sampler_ColorMap, IN.uv) * (toonLighting * light.color * toonShadows + specularTerm * light.color);

Your fragment method should look like this now:

float3 Fragment(Varyings IN) : SV_Target
{
       // These macros are required for VR SPI compatibility
    UNITY_SETUP_INSTANCE_ID(IN);
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
       
       IN.normalWS = normalize(IN.normalWS);
       IN.viewDirectionWS = normalize(IN.viewDirectionWS);
       
       Light light;
       GetMainLightData(IN.positionWS, light);
       
       float NoL = dot(IN.normalWS, light.direction);
       
       float toonLighting = step(0, NoL);
       float toonShadows = step(0.5, light.shadowAttenuation);
       
       float3 halfVector = normalize(light.direction + IN.viewDirectionWS);
       float NoH = max(dot(IN.normalWS, halfVector), 0);
       float specularTerm = pow(NoH, _Smoothness * _Smoothness);
       specularTerm *= toonLighting * toonShadows;
       specularTerm = step(0.01, specularTerm);
       
       return _Color * SAMPLE_TEXTURE2D(_ColorMap, sampler_ColorMap, IN.uv) * (toonLighting * light.color * toonShadows + specularTerm * light.color);
}

If you reviewed the code carefully, you’d spot that we also added two normalize calls up towards the top there.

What gives?

Normalizing the Normal and View Direction in the Fragment function

You may have noticed some shading artifacts on the sphere while we were setting up our specular term.

These shading artifacts come about when we use the interpolated normal and view direction values from the vertex stage without normalizing them first.

Resolving this is fairly simple. In your fragment stage, simply normalize the two values and re-assign them back.

IN.normalWS = normalize(IN.normalWS);
IN.viewDirectionWS = normalize(IN.viewDirectionWS);

Make sure you normalize these values before using them elsewhere in your Fragment shader stage.

Wrapping up the Specular Term

Our specular term calculation is all set. But, our return call is getting a little messy. Let’s simplify it by pre-calculating our surface color and final lighting data, then combining just the two at the end.

float3 surfaceColor = _Color * SAMPLE_TEXTURE2D(_ColorMap, sampler_ColorMap, IN.uv);
float3 directionalLighting = toonLighting * toonShadows * light.color;
float3 specularLighting = specularTerm * light.color;
float3 finalLighting = float3(0,0,0);
finalLighting += directionalLighting;
finalLighting += specularLighting;
return surfaceColor * finalLighting;

You can see the full Fragment function here:

float3 Fragment(Varyings IN) : SV_Target
{
       // These macros are required for VR SPI compatibility
    UNITY_SETUP_INSTANCE_ID(IN);
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
       
       IN.normalWS = normalize(IN.normalWS);
       IN.viewDirectionWS = normalize(IN.viewDirectionWS);
       
       Light light;
       GetMainLightData(IN.positionWS, light);
       
       float NoL = dot(IN.normalWS, light.direction);
       
       float toonLighting = step(0, NoL);
       float toonShadows = step(0.5, light.shadowAttenuation);
       
       float3 halfVector = normalize(light.direction + IN.viewDirectionWS);
       float NoH = max(dot(IN.normalWS, halfVector), 0);
       float specularTerm = pow(NoH, _Smoothness * _Smoothness);
       specularTerm *= toonLighting * toonShadows;
       specularTerm = step(0.01, specularTerm);
       
       float3 surfaceColor = _Color * SAMPLE_TEXTURE2D(_ColorMap, sampler_ColorMap, IN.uv);
       
       float3 directionalLighting = toonLighting * toonShadows * light.color;
       float3 specularLighting = specularTerm * light.color;
       
       float3 finalLighting = float3(0,0,0);
       finalLighting += directionalLighting;
       finalLighting += specularLighting;
       
       return surfaceColor * finalLighting;
}

Next up, we’ll add in Rim lighting.

Rim Lighting

Rim lighting refers to the practice of adding extra light to the edges of an object to create the illusion of reflected light or backlighting. This technique is particularly beneficial for toon shaders, since it enhances the object’s shape against the smooth, shaded backgrounds.

To identify the "rim" of an object, we want to find the surfaces that are pointed "least towards" the camera. To calculate this, we’ll re-use the dot product. This time, we’ll use the normal direction and view direction to find the alignment between your eye and the surface. The less aligned they are, the more rim lighting we want, so we will flip the result: (1 - result).

Calculate the Rim Term

We’ll take the dot product of the normal and view direction, invert it, and raise the result to a power. Then we’ll apply the toon lighting and toon shadows terms to make sure we don’t draw rim light in areas that aren’t receiving any light, and finally use our step function to make the result cohere with our toon shading model.

Note that you also need to add the RimSharpness float property to your Shader Properties and CBUFFER. I’ll leave that to you as a little challenge. If you get stuck, revisit how you did it for the Smoothness property.

float NoV = max(dot(IN.normalWS, IN.viewDirectionWS), 0);
float rimTerm = pow(1.0 - NoV, _RimSharpness);
rimTerm *= toonLighting * toonShadows;
rimTerm = step(0.01, rimTerm);

Integrating the Rim Term

Next, we want to multiply the rim term by a new _RimColor property, then add it to the final lighting variable so that it is included in our output.

Add the new _RimColor in your Shader Properties as an HDR Color.

[HDR] _RimColor ("Rim Color", Color) = (1.0, 1.0, 1.0, 1.0)

And in your CBUFFER as a float3:

float3 _RimColor;

Then, in your Fragment function, create a new float3 variable called rimLighting and set it to the rim term multiplied by the rim color.

float3 rimLighting = rimTerm * _RimColor;

Finally, add your rim lighting to your final lighting variable:

finalLighting += rimLighting;

We covered a lot of ground here, so here are up-to-date versions of the .shader and .hlsl files for you to reference if you get stuck.

/*
This is the name of the Shader in the inspector.
You can set directory hierarchy using forward slashes,
like shown here.
*/
Shader "My Custom Shaders/Toon Shader"
{
       /*
       The properties block enables you to edit and save the defined
       Material properties.
       If you don't define anything here, you can still set properties 
       by code, but you can't edit properties from the inspector, and
       changes don't persist between sessions.
       */ 
       Properties
       {
           [MainTexture] _ColorMap ("Color Map", 2D) = "white" {}
           [MainColor] _Color ("Color", Color) = (1.0, 1.0, 1.0, 1.0)
           _Smoothness ("Smoothness", Float) = 16.0
           _RimSharpness ("Rim Sharpness", Float) = 16.0
           [HDR] _RimColor ("Rim Color", Color) = (1.0, 1.0, 1.0, 1.0)
       }
       
       
       /*
       SubShaders let you define settings and programs that can vary
       depending on hardware, render pipeline, and runtime settings.
       
       This is a more complex topic, but it enables you to write
       shaders that can easily work between different render pipelines.
       */
       
       SubShader
       {
       
           /* 
           We need to make sure the Tags includes our RenderPipeline 
           so that this shader works properly.
           */
           
           Tags { "RenderType"="Opaque" "RenderPipeline"="UniversalPipeline" }
           
           
           /*
           These are Shader Commands that control 
           GPU-side rendering properties.
           */
           
           Cull Back
           ZWrite On
           ZTest LEqual
           ZClip Off
           
           Pass
           {
               /* 
               The lightmode tag is extremely important.
               Unity sets up and runs render passes.
               These render passes render objects depending on the
               included LightMode tags. 
               
               e.g., "Render only UniversalForwardOnly in this pass".
               
               Our other LightModes (ShadowCaster, DepthOnly, DepthNormalsOnly)
               are automatically queued up and rendered by Unity during
               the appropriate render pass.
               */
               
               Name "ForwardLit"
               Tags {"LightMode" = "UniversalForwardOnly"}
               
               
               HLSLPROGRAM
               
               // These #pragma directives make fog and Decal rendering work.
               #pragma multi_compile_fog
               #pragma multi_compile_fragment _ _DBUFFER_MRT1 _DBUFFER_MRT2 _DBUFFER_MRT3
               
               // These #pragma directives set up Main Light Shadows.
               #pragma multi_compile _ _MAIN_LIGHT_SHADOWS _MAIN_LIGHT_SHADOWS_CASCADE _MAIN_LIGHT_SHADOWS_SCREEN
               #pragma multi_compile _ _SHADOWS_SOFT
               
               // These #pragma directives define the function names
               // for our Vertex and Fragment stage functions
               #pragma vertex Vertex
               #pragma fragment Fragment
               
               #include "ToonShaderPass.hlsl"
               
               ENDHLSL
           }
           
           Pass
           {
               Name "ShadowCaster"
               Tags {"LightMode" = "ShadowCaster"}
               
               
               HLSLPROGRAM
               
               // This define lets us take an alternate path 
               // when we get the Clipspace Position during the Vertex stage.
               #define SHADOW_CASTER_PASS
               
               #pragma vertex Vertex
               
               // In this case, we want to use the FragmentDepthOnly
               // function instead of the Fragment function we 
               // used in the ForwardLit pass.
               #pragma fragment FragmentDepthOnly
               
               #include "ToonShaderPass.hlsl"
               
               ENDHLSL
           }
           
           Pass
           {
               Name "DepthOnly"
               Tags {"LightMode" = "DepthOnly"}
               
               
               HLSLPROGRAM
               
               #pragma vertex Vertex
               
               // Our DepthOnly Pass and ShadowCaster pass
               // both use the FragmentDepthOnly function
               #pragma fragment FragmentDepthOnly
               
               #include "ToonShaderPass.hlsl"
               
               ENDHLSL
           }
           
           Pass
           {
               Name "DepthNormalsOnly"
               Tags {"LightMode" = "DepthNormalsOnly"}
               
               
               HLSLPROGRAM
               
               #pragma vertex Vertex
               
               // And our DepthNormalsOnly pass uses our
               // Fragment function named FragmentDepthNormalsOnly.
               #pragma fragment FragmentDepthNormalsOnly
               
               #include "ToonShaderPass.hlsl"
               
               ENDHLSL
           }
       }
}
#ifndef MY_TOON_SHADER_INCLUDE
#define MY_TOON_SHADER_INCLUDE
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Lighting.hlsl"
#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/SpaceTransforms.hlsl"
// See com.unity.render-pipelines.universal/ShaderLibrary/ShaderVariablesFunctions.hlsl
///////////////////////////////////////////////////////////////////////////////
//                      CBUFFER                                              //
///////////////////////////////////////////////////////////////////////////////
/*
Unity URP requires us to set up a CBUFFER
(or "Constant Buffer") of Constant Variables.
These should be the same variables we set up 
in the Properties.
This CBUFFER is REQUIRED for Unity
to correctly handle per-material changes
as well as batching / instancing.
Don't skip it :)
*/
// Note: texture and sampler objects cannot go inside a constant
// buffer, so we declare them just above the CBUFFER block.
TEXTURE2D(_ColorMap);
SAMPLER(sampler_ColorMap);
CBUFFER_START(UnityPerMaterial)
       float4 _ColorMap_ST;
       float3 _Color;
       float _Smoothness;
       float _RimSharpness;
       float3 _RimColor;
CBUFFER_END
///////////////////////////////////////////////////////////////////////////////
//                      STRUCTS                                              //
///////////////////////////////////////////////////////////////////////////////
/*
Our attributes struct is simple.
It contains the Object-Space Position
and Normal Direction as well as the 
UV0 coordinates for the mesh.
The Attributes struct is passed 
from the GPU to the Vertex function.
*/
struct Attributes
{
       float4 positionOS : POSITION;
       float3 normalOS   : NORMAL;
       float2 uv         : TEXCOORD0;
       
       // This line is required for VR SPI to work.
       UNITY_VERTEX_INPUT_INSTANCE_ID
};
/*
The Varyings struct is also straightforward.
It contains the Clip Space Position, the UV, and 
the World-Space Normals.
The Varyings struct is passed from the Vertex
function to the Fragment function.
*/
struct Varyings
{
       float4 positionHCS     : SV_POSITION;
       float2 uv              : TEXCOORD0;
       float3 positionWS      : TEXCOORD1;
       float3 normalWS        : TEXCOORD2;
       float3 viewDirectionWS : TEXCOORD3;
       
       // This line is required for VR SPI to work.
    UNITY_VERTEX_INPUT_INSTANCE_ID
       UNITY_VERTEX_OUTPUT_STEREO
};
///////////////////////////////////////////////////////////////////////////////
//                      Common Lighting Transforms                           //
///////////////////////////////////////////////////////////////////////////////
// This is a global variable; Unity sets it for us.
float3 _LightDirection;
/*
This is a simple lighting transformation.
Normally, we just return the WorldToHClip position.
During the Shadow Pass, we want to make sure that Shadow Bias is baked 
in to the shadow map. To accomplish this, we use the ApplyShadowBias
method to push the world-space positions in their normal direction by the bias amount.
We define SHADOW_CASTER_PASS during the setup for the Shadow Caster pass.
*/
float4 GetClipSpacePosition(float3 positionWS, float3 normalWS)
{
       #if defined(SHADOW_CASTER_PASS)
           float4 positionHCS = TransformWorldToHClip(ApplyShadowBias(positionWS, normalWS, _LightDirection));
           
           #if UNITY_REVERSED_Z
               positionHCS.z = min(positionHCS.z, positionHCS.w * UNITY_NEAR_CLIP_VALUE);
           #else
               positionHCS.z = max(positionHCS.z, positionHCS.w * UNITY_NEAR_CLIP_VALUE);
           #endif
           
           return positionHCS;
       #endif
       
       return TransformWorldToHClip(positionWS);
}
/*
These two functions give us the shadow coordinates 
depending on whether screen shadows are enabled or not.
We have two methods here, one with two args (positionWS
and positionHCS), and one with just positionWS.
The two-arg method is faster when you have 
already calculated the positionHCS variable.
*/
float4 GetMainLightShadowCoord(float3 positionWS, float4 positionHCS)
{
       #if defined(_MAIN_LIGHT_SHADOWS_SCREEN)
           return ComputeScreenPos(positionHCS);
       #else
           return TransformWorldToShadowCoord(positionWS);
       #endif
}
float4 GetMainLightShadowCoord(float3 PositionWS)
{
       #if defined(_MAIN_LIGHT_SHADOWS_SCREEN)
           float4 clipPos = TransformWorldToHClip(PositionWS);
           return ComputeScreenPos(clipPos);
       #else
    return TransformWorldToShadowCoord(PositionWS);
       #endif
}
/*
This method gives us the main light as an out parameter.
The Light struct is defined in 
"Packages/com.unity.render-pipelines.universal/ShaderLibrary/RealtimeLights.hlsl",
so you can reference it there for more details on its fields.
This version of the GetMainLight method doesn't account for Light Cookies.
To account for Light cookies, you need to add the following line to your shader pass:
#pragma multi_compile _ _LIGHT_COOKIES
and also call a different GetMainLight method:
GetMainLight(float4 shadowCoord, float3 positionWS, half4 shadowMask)
*/
void GetMainLightData(float3 PositionWS, out Light light)
{
       float4 shadowCoord = GetMainLightShadowCoord(PositionWS);
       light = GetMainLight(shadowCoord);
}
///////////////////////////////////////////////////////////////////////////////
//                      Functions                                            //
///////////////////////////////////////////////////////////////////////////////
/*
The Vertex function is responsible 
for generating and manipulating the 
data for each vertex of the mesh.
*/
Varyings Vertex(Attributes IN)
{
       Varyings OUT = (Varyings)0;
       
       // These macros are required for VR SPI compatibility
       UNITY_SETUP_INSTANCE_ID(IN);
    UNITY_TRANSFER_INSTANCE_ID(IN, OUT);
    UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(OUT);
       
       
       // Set up each field of the Varyings struct, then return it.
       OUT.positionWS = mul(unity_ObjectToWorld, IN.positionOS).xyz;
       OUT.normalWS = NormalizeNormalPerPixel(TransformObjectToWorldNormal(IN.normalOS));
       OUT.positionHCS = GetClipSpacePosition(OUT.positionWS, OUT.normalWS);
       OUT.viewDirectionWS = normalize(GetWorldSpaceViewDir(OUT.positionWS));
       OUT.uv = TRANSFORM_TEX(IN.uv, _ColorMap);
       
       return OUT;
}
/*
The FragmentDepthOnly function is responsible 
for handling per-pixel shading during the 
DepthOnly and ShadowCaster passes.
*/
float FragmentDepthOnly(Varyings IN) : SV_Target
{
       // These macros are required for VR SPI compatibility
       UNITY_SETUP_INSTANCE_ID(IN);
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
       
       return 0;
}
/*
The FragmentDepthNormalsOnly function is responsible 
for handling per-pixel shading during the 
DepthNormalsOnly pass. This pass is less common, but
can be required by some post-process effects such as SSAO.
*/
float4 FragmentDepthNormalsOnly(Varyings IN) : SV_Target
{
       // These macros are required for VR SPI compatibility
       UNITY_SETUP_INSTANCE_ID(IN);
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
       
       return float4(normalize(IN.normalWS), 0);
}
/*
The Fragment function is responsible 
for handling per-pixel shading during the Forward 
rendering pass. We use the ForwardOnly pass, so this works
by default in both Forward and Deferred paths.
*/
float3 Fragment(Varyings IN) : SV_Target
{
       // These macros are required for VR SPI compatibility
    UNITY_SETUP_INSTANCE_ID(IN);
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
       
       IN.normalWS = normalize(IN.normalWS);
       IN.viewDirectionWS = normalize(IN.viewDirectionWS);
       
       Light light;
       GetMainLightData(IN.positionWS, light);
       
       float NoL = dot(IN.normalWS, light.direction);
       
       float toonLighting = step(0, NoL);
       float toonShadows = step(0.5, light.shadowAttenuation);
       
       float3 halfVector = normalize(light.direction + IN.viewDirectionWS);
       float NoH = max(dot(IN.normalWS, halfVector), 0);
       float specularTerm = pow(NoH, _Smoothness * _Smoothness);
       specularTerm *= toonLighting * toonShadows;
       specularTerm = step(0.01, specularTerm);
       
       float NoV = max(dot(IN.normalWS, IN.viewDirectionWS), 0);
       float rimTerm = pow(1.0 - NoV, _RimSharpness);
       rimTerm *= toonLighting * toonShadows;
       rimTerm = step(0.01, rimTerm);
       
       float3 surfaceColor = _Color * SAMPLE_TEXTURE2D(_ColorMap, sampler_ColorMap, IN.uv);
       
       float3 directionalLighting = toonLighting * toonShadows * light.color;
       float3 specularLighting = specularTerm * light.color;
       float3 rimLighting = rimTerm * _RimColor;
       
       float3 finalLighting = float3(0,0,0);
       finalLighting += directionalLighting;
       finalLighting += specularLighting;
       finalLighting += rimLighting;
       
       return surfaceColor * finalLighting;
}
#endif

Improving our Directional Lighting

Our lighting is still a little rough around the edges. You can see clear pixel edges everywhere. Although anti-aliasing can help somewhat to smooth these edges out, it’s better to provide cleaner results out of the gate.

In this section, we will replace our step() calls with smoothstep() function calls that give us a tight gradient.

Normally, we want our smoothstep to look something like smoothstep(min, min + y, x), where y is some small number like 0.01. We can create a nifty helper function to make this super easy to apply to all our current step calls.

Add a new method in your ToonShaderPass.hlsl:

float easysmoothstep(float min, float x)
{ 
    return smoothstep(min, min + 0.01, x);
}

Then replace each instance of step(y, x) with easysmoothstep(y, x).
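After the replacements, the step-based lines in your Fragment function become:

```hlsl
// Each hard step() cut-off is now a tight smoothstep() gradient.
float toonLighting = easysmoothstep(0, NoL);
float toonShadows = easysmoothstep(0.5, light.shadowAttenuation);
// ...
specularTerm = easysmoothstep(0.01, specularTerm);
// ...
rimTerm = easysmoothstep(0.01, rimTerm);
```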

Poof! Your hard edges are now (mostly) gone.

Setting up Ambient Lighting

What is ambient lighting?

Ambient lighting is crucial for toon shading. As you’ve seen so far, accounting only for direct lighting causes us to have black shadows and dark regions. That’s not quite the look we are going for.

So, we’ll introduce an ambient lighting term to help alleviate the dark regions.

Ambient lighting creates a uniform light level throughout the scene and is applied evenly to all parts of the material.

Lighting is additive, so we add the ambient term to our finalLighting variable, then multiply it with the surfaceColor like the other lighting information.

I know what you’re about to ask: Doesn’t this mean that pure black materials stay black?

The answer is that, yeah, they do stay black. In fact, that’s how pure black diffuse materials also work in the real world, too. In real life, even the darkest materials (coal) have an albedo of around 0.04.

Should we use the skybox color or a uniform color?

To set up the ambient lighting, we will use a uniform color for the entire world, in keeping with our toon shading goals. If we wanted to use the world skybox color instead, we could call SampleSH() and pass in the normal direction. If we used the skybox color, our mesh would look like this:
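For reference, the skybox-based alternative (which we are not using here) would look something like the sketch below; SampleSH() is provided by URP's shader library:

```hlsl
// Alternative (not used in this tutorial): sample the scene's ambient
// probe / skybox spherical harmonics along the surface normal.
float3 ambient = SampleSH(IN.normalWS);
finalLighting += ambient;
```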

Ok, how do I set up the uniform color?

The approach we will take is easy and gives you a lot of control over each material.

Add a new Color property in your ToonShader.shader Properties block. Name it _WorldColor. Then, add the same property in your UnityPerMaterial CBUFFER block.
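The two additions look like this:

```hlsl
// In ToonShader.shader, inside the Properties block:
[HDR] _WorldColor ("World Color", Color) = (0.1, 0.1, 0.1, 1.0)

// In ToonShaderPass.hlsl, inside the UnityPerMaterial CBUFFER:
float3 _WorldColor;
```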

Finally, combine this property with the finalLighting term by adding it together like this:

finalLighting += _WorldColor;

I went ahead and added a Global Volume with Bloom and Tonemapping to make sure we can still see our object since we are working in linear color space.

You can adjust the world color from the material properties in the inspector.

Your sphere should look like this now:

The Final Shader

If you have made it this far:

CONGRATULATIONS!

This was a long tutorial, and I’m proud of you for taking the time to learn something new.

The final shader and hlsl files:

    // This is the name of the Shader in the inspector. You can set
    // directory hierarchy using forward slashes, like shown here.
    
    Shader "My Custom Shaders/Toon Shader"
    {
        // The properties block enables you to edit and save the defined
        // Material properties.
        //
        // If you don't define anything here, you can still set properties
        // by code, but you can't edit properties from the inspector, and
        // changes don't persist between sessions.
        
    
        Properties
        {
            [MainTexture] _ColorMap ("Color Map", 2D) = "white" {}
            [MainColor] _Color ("Color", Color) = (1.0, 1.0, 1.0, 1.0)
            _Smoothness ("Smoothness", Float) = 16.0
            _RimSharpness ("Rim Sharpness", Float) = 16.0
            [HDR] _RimColor ("Rim Color", Color) = (1.0, 1.0, 1.0, 1.0)
            [HDR] _WorldColor ("World Color", Color) = (0.1, 0.1, 0.1, 1.0)
        }
        
        
        // SubShaders let you define settings and programs that can vary
        // depending on hardware, render pipeline, and runtime settings.
        //
        // This is a more complex topic, but it enables you to write shaders
        // that can easily work between different render pipelines.
        
        SubShader
        {
        
            // We need to make sure the Tags includes our RenderPipeline so
            // that this shader works properly.
            
            Tags { "RenderType"="Opaque" "RenderPipeline"="UniversalPipeline" }
            
            // These are Shader Commands that control GPU-side rendering
            // properties.
            
            Cull Back
            ZWrite On
            ZTest LEqual
            ZClip Off
            
            Pass
            {
                // The lightmode tag is extremely important. Unity sets up
                // and runs render passes. These render passes render
                // objects depending on the included LightMode tags. 
                //
                // e.g., "Render only UniversalForwardOnly in this pass".
                //
                // Our other LightModes (ShadowCaster, DepthOnly,
                // DepthNormalsOnly) are automatically queued up and
                // rendered by Unity during the appropriate render pass.
                
                Name "ForwardLit"
                Tags {"LightMode" = "UniversalForwardOnly"}
                
                
                HLSLPROGRAM
                
                // These #pragma directives make fog and Decal rendering
                // work.
                #pragma multi_compile_fog
                #pragma multi_compile_fragment _ _DBUFFER_MRT1 _DBUFFER_MRT2 _DBUFFER_MRT3
                
                // These #pragma directives set up Main Light Shadows.
                #pragma multi_compile _ _MAIN_LIGHT_SHADOWS _MAIN_LIGHT_SHADOWS_CASCADE _MAIN_LIGHT_SHADOWS_SCREEN
                #pragma multi_compile _ _SHADOWS_SOFT
                
                // These #pragma directives define the function names for
                // our Vertex and Fragment stage functions
                #pragma vertex Vertex
                #pragma fragment Fragment
                
                #include "ToonShaderPass.hlsl"
                
                ENDHLSL
            }
            
            Pass
            {
                Name "ShadowCaster"
                Tags {"LightMode" = "ShadowCaster"}
                
                
                HLSLPROGRAM
                
                // This define lets us take an alternate path when we get
                // the Clipspace Position during the Vertex stage.
                #define SHADOW_CASTER_PASS
                
                #pragma vertex Vertex
                
                // In this case, we want to use the FragmentDepthOnly
                // function instead of the Fragment function we used in the
                // ForwardLit pass.
                #pragma fragment FragmentDepthOnly
                
                #include "ToonShaderPass.hlsl"
                
                ENDHLSL
            }
            
            Pass
            {
                Name "DepthOnly"
                Tags {"LightMode" = "DepthOnly"}
                
                
                HLSLPROGRAM
                
                #pragma vertex Vertex
                
                // Our DepthOnly Pass and ShadowCaster pass both use the
                // FragmentDepthOnly function
                #pragma fragment FragmentDepthOnly
                
                #include "ToonShaderPass.hlsl"
                
                ENDHLSL
            }
            
            Pass
            {
                Name "DepthNormalsOnly"
                Tags {"LightMode" = "DepthNormalsOnly"}
                
                
                HLSLPROGRAM
                
                #pragma vertex Vertex
                
                // And our DepthNormalsOnly pass uses our Fragment function
                // named FragmentDepthNormalsOnly.
                #pragma fragment FragmentDepthNormalsOnly
                
                #include "ToonShaderPass.hlsl"
                
                ENDHLSL
            }
        }
    }
The code below goes in a second file named ToonShaderPass.hlsl, which each pass above includes.

    #ifndef MY_TOON_SHADER_INCLUDE
    #define MY_TOON_SHADER_INCLUDE
    
    #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
    #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Lighting.hlsl"
    #include "Packages/com.unity.render-pipelines.core/ShaderLibrary/SpaceTransforms.hlsl"
    // For additional helper functions, see com.unity.render-pipelines.universal/ShaderLibrary/ShaderVariablesFunctions.hlsl
    
    ///////////////////////////////////////////////////////////////////////////////
    //                      CBUFFER                                              //
    ///////////////////////////////////////////////////////////////////////////////
    
    /*
    Unity URP requires us to set up a CBUFFER
    (or "Constant Buffer") of Constant Variables.
    These should be the same variables we set up 
    in the Properties.
    This CBUFFER is REQUIRED for Unity
    to correctly handle per-material changes
    as well as batching / instancing.
    Don't skip it :)
    */
    
    // Textures and samplers are not uniform constants, so they are
    // declared outside of the CBUFFER.
    TEXTURE2D(_ColorMap);
    SAMPLER(sampler_ColorMap);
    
    CBUFFER_START(UnityPerMaterial)
        float4 _ColorMap_ST;
        float3 _Color;
        float _Smoothness;
        float _RimSharpness;
        float3 _RimColor;
        float3 _WorldColor;
    CBUFFER_END
    
    ///////////////////////////////////////////////////////////////////////////////
    //                      STRUCTS                                              //
    ///////////////////////////////////////////////////////////////////////////////
    
    /*
    Our attributes struct is simple.
    It contains the Object-Space Position
    and Normal Direction as well as the 
    UV0 coordinates for the mesh.
    The Attributes struct is passed 
    from the GPU to the Vertex function.
    */
    
    struct Attributes
    {
        float4 positionOS : POSITION;
        float3 normalOS   : NORMAL;
        float2 uv         : TEXCOORD0;
        
        // This line is required for VR SPI to work.
        UNITY_VERTEX_INPUT_INSTANCE_ID
    };
    
    /*
    The Varyings struct is also straightforward.
    It contains the Clip Space Position, the UV, and 
    the World-Space Normals.
    The Varyings struct is passed from the Vertex
    function to the Fragment function.
    */
    struct Varyings
    {
        float4 positionHCS     : SV_POSITION;
        float2 uv              : TEXCOORD0;
        float3 positionWS      : TEXCOORD1;
        float3 normalWS        : TEXCOORD2;
        float3 viewDirectionWS : TEXCOORD3;
        
        // This line is required for VR SPI to work.
        UNITY_VERTEX_INPUT_INSTANCE_ID
        UNITY_VERTEX_OUTPUT_STEREO
    };
    
    ///////////////////////////////////////////////////////////////////////////////
    //                      Common Lighting Transforms                           //
    ///////////////////////////////////////////////////////////////////////////////
    
    // This is a global variable; Unity sets it for us during the ShadowCaster pass.
    float3 _LightDirection;
    
    /*
    This is a simple lighting transformation.
    Normally, we just return the WorldToHClip position.
    During the shadow pass, we want to make sure that shadow bias is baked
    into the shadow map. To accomplish this, we use the ApplyShadowBias
    method to offset the world-space positions by the depth and normal bias amounts.
    We define SHADOW_CASTER_PASS during the setup for the ShadowCaster pass.
    */
    
    float4 GetClipSpacePosition(float3 positionWS, float3 normalWS)
    {
        #if defined(SHADOW_CASTER_PASS)
            float4 positionHCS = TransformWorldToHClip(ApplyShadowBias(positionWS, normalWS, _LightDirection));
            
            #if UNITY_REVERSED_Z
                positionHCS.z = min(positionHCS.z, positionHCS.w * UNITY_NEAR_CLIP_VALUE);
            #else
                positionHCS.z = max(positionHCS.z, positionHCS.w * UNITY_NEAR_CLIP_VALUE);
            #endif
            
            return positionHCS;
        #endif
        
        return TransformWorldToHClip(positionWS);
    }
    
    /*
    These two functions give us the shadow coordinates 
    depending on whether screen shadows are enabled or not.
    We have two methods here, one with two args (positionWS
    and positionHCS), and one with just positionWS.
    The two-arg method is faster when you have 
    already calculated the positionHCS variable.
    */
    
    float4 GetMainLightShadowCoord(float3 positionWS, float4 positionHCS)
    {
        #if defined(_MAIN_LIGHT_SHADOWS_SCREEN)
            return ComputeScreenPos(positionHCS);
        #else
            return TransformWorldToShadowCoord(positionWS);
        #endif
    }
    
    float4 GetMainLightShadowCoord(float3 PositionWS)
    {
        #if defined(_MAIN_LIGHT_SHADOWS_SCREEN)
            float4 clipPos = TransformWorldToHClip(PositionWS);
            return ComputeScreenPos(clipPos);
        #else
            return TransformWorldToShadowCoord(PositionWS);
        #endif
    }
    
    /*
    This method gives us the main light as an out parameter.
    The Light struct is defined in 
    "Packages/com.unity.render-pipelines.universal/ShaderLibrary/RealtimeLights.hlsl",
    so you can reference it there for more details on its fields.
    This version of the GetMainLight method doesn't account for Light Cookies.
    To account for Light cookies, you need to add the following line to your shader pass:
    #pragma multi_compile _ _LIGHT_COOKIES
    and also call a different GetMainLight method:
    GetMainLight(float4 shadowCoord, float3 positionWS, half4 shadowMask)
    */
    
    void GetMainLightData(float3 PositionWS, out Light light)
    {
        float4 shadowCoord = GetMainLightShadowCoord(PositionWS);
        light = GetMainLight(shadowCoord);
    }
        
        
    ///////////////////////////////////////////////////////////////////////////////
    //                      Helper Functions                                     //
    ///////////////////////////////////////////////////////////////////////////////
        
    // Remaps x to a hard 0-or-1 step at the given edge, with a tiny
    // (0.01-wide) transition band that keeps the edge anti-aliased.
    // The parameter is named edge to avoid shadowing the min() intrinsic.
    float easysmoothstep(float edge, float x)
    {
        return smoothstep(edge, edge + 0.01, x);
    }
        
    
    ///////////////////////////////////////////////////////////////////////////////
    //                      Functions                                            //
    ///////////////////////////////////////////////////////////////////////////////
    
    /*
    The Vertex function is responsible 
    for generating and manipulating the 
    data for each vertex of the mesh.
    */
    
    Varyings Vertex(Attributes IN)
    {
        Varyings OUT = (Varyings)0;
        
        // These macros are required for VR SPI compatibility
        UNITY_SETUP_INSTANCE_ID(IN);
        UNITY_TRANSFER_INSTANCE_ID(IN, OUT);
        UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(OUT);
        
        
        // Set up each field of the Varyings struct, then return it.
        OUT.positionWS = TransformObjectToWorld(IN.positionOS.xyz);
        OUT.normalWS = NormalizeNormalPerPixel(TransformObjectToWorldNormal(IN.normalOS));
        OUT.positionHCS = GetClipSpacePosition(OUT.positionWS, OUT.normalWS);
        OUT.viewDirectionWS = normalize(GetWorldSpaceViewDir(OUT.positionWS));
        OUT.uv = TRANSFORM_TEX(IN.uv, _ColorMap);
        
        return OUT;
    }
    
    /*
    The FragmentDepthOnly function is responsible 
    for handling per-pixel shading during the 
    DepthOnly and ShadowCaster passes.
    */
    
    float FragmentDepthOnly(Varyings IN) : SV_Target
    {
        // These macros are required for VR SPI compatibility
        UNITY_SETUP_INSTANCE_ID(IN);
        UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
        
        return 0;
    }
    
    /*
    The FragmentDepthNormalsOnly function is responsible 
    for handling per-pixel shading during the 
    DepthNormalsOnly pass. This pass is less common, but
    can be required by some post-process effects such as SSAO.
    */
    
    float4 FragmentDepthNormalsOnly(Varyings IN) : SV_Target
    {
        // These macros are required for VR SPI compatibility
        UNITY_SETUP_INSTANCE_ID(IN);
        UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
        return float4(normalize(IN.normalWS), 0);
    }
    
    /*
    The Fragment function is responsible 
    for handling per-pixel shading during the Forward 
    rendering pass. We use the ForwardOnly pass, so this works
    by default in both Forward and Deferred paths.
    */
    
    float3 Fragment(Varyings IN) : SV_Target
    {
        // These macros are required for VR SPI compatibility
        UNITY_SETUP_INSTANCE_ID(IN);
        UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(IN);
        
        IN.normalWS = normalize(IN.normalWS);
        IN.viewDirectionWS = normalize(IN.viewDirectionWS);
        
        Light light;
        GetMainLightData(IN.positionWS, light);
        
        float NoL = dot(IN.normalWS, light.direction);
        
        float toonLighting = easysmoothstep(0, NoL);
        float toonShadows = easysmoothstep(0.5, light.shadowAttenuation);
        
        float3 halfVector = normalize(light.direction + IN.viewDirectionWS);
        float NoH = max(dot(IN.normalWS, halfVector), 0);
        float specularTerm = pow(NoH, _Smoothness * _Smoothness);
        specularTerm *= toonLighting * toonShadows;
        specularTerm = easysmoothstep(0.01, specularTerm);
        
        float NoV = max(dot(IN.normalWS, IN.viewDirectionWS), 0);
        float rimTerm = pow(1.0 - NoV, _RimSharpness);
        rimTerm *= toonLighting * toonShadows;
        rimTerm = easysmoothstep(0.01, rimTerm);
        
        float3 surfaceColor = _Color * SAMPLE_TEXTURE2D(_ColorMap, sampler_ColorMap, IN.uv).rgb;
        
        float3 directionalLighting = toonLighting * toonShadows * light.color;
        float3 specularLighting = specularTerm * light.color;
        float3 rimLighting = rimTerm * _RimColor;
        
        float3 finalLighting = float3(0,0,0);
        finalLighting += _WorldColor;
        finalLighting += directionalLighting;
        finalLighting += specularLighting;
        finalLighting += rimLighting;
        
        return surfaceColor * finalLighting;
    }
    #endif
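If you want to sanity-check the lighting math without opening Unity, the core of the Fragment function is easy to replicate on the CPU. The sketch below uses Python as a stand-in for HLSL, with illustrative input values of my own choosing (a light shining straight down, a slightly tilted surface, and shadow attenuation assumed to be 1). It mirrors easysmoothstep, the Blinn-Phong-style specular term, and the rim term:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def easysmoothstep(edge, x):
    # Same as the HLSL helper: a near-step transition 0.01 wide.
    t = max(0.0, min(1.0, (x - edge) / 0.01))
    return t * t * (3.0 - 2.0 * t)

# Example inputs (chosen for illustration, not from the tutorial scene).
normal = normalize((0.0, 1.0, 0.2))   # surface tilted slightly toward +Z
light_direction = (0.0, 1.0, 0.0)     # light shining straight down
view_direction = (0.0, 0.0, 1.0)      # unit vector from the surface toward the camera
smoothness = 16.0
rim_sharpness = 16.0

# Diffuse banding: NoL snaps to 0 or 1.
n_dot_l = dot(normal, light_direction)
toon_lighting = easysmoothstep(0.0, n_dot_l)

# Blinn-Phong-style specular, then snapped to 0 or 1.
half_vector = normalize(tuple(l + v for l, v in zip(light_direction, view_direction)))
n_dot_h = max(dot(normal, half_vector), 0.0)
specular_term = easysmoothstep(0.01, n_dot_h ** (smoothness * smoothness))

# Rim term: bright where the surface grazes the view direction.
n_dot_v = max(dot(normal, view_direction), 0.0)
rim_term = easysmoothstep(0.01, (1.0 - n_dot_v) ** rim_sharpness)

print(toon_lighting, specular_term, rim_term)  # 1.0 0.0 1.0
```

The printed values show the idea behind the whole shader: the continuous NoL, specular, and rim values all pass through easysmoothstep's tiny 0.01 transition band, which flattens them into the hard-edged blocks of color that define the toon look.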

Need help? Have feedback?

If you have questions, get stuck, or see a mistake, just let me know. You can reach me by email at michael@occasoftware.com or join our Discord.

What should I read next?

Great question! Try checking out our other write-ups on the blog. If you liked this tutorial, you might like our comparison of URP, HDRP, and Built-In Render Pipelines, our tutorial on how to change your skybox in Unity, or our guide to finding great game development communities.

Licensing?

This tutorial was designed to help you learn about shaders and toon shading in Unity. In that spirit, the shaders discussed here are licensed under CC-BY-NC-SA. For attribution, you can credit OccaSoftware, this blog post, or the website directly.
