Shader Journey #0: Introduction

Here’s the beginning of my new adventure in the world of shaders! 🙂

This article is also available on Medium.

The famous Minecraft game can get pretty impressive if you add some shaders to it!

A subfield of game dev that has fascinated (and resisted) me for years is the amazing world of shaders. I’ve always enjoyed watching a few lines of code transform my 3D objects into mirror-like chrome balls, or strange glowy spheres, or even spiky bubbles that pulse in the air… but I never really understood the logic behind it!

Then, I discovered Freya Holmér's incredible YouTube channel, and in particular her 3-part series on shaders for game dev. She explains what shaders are, how they work and how you can write them for your Unity games. These videos were a real revelation for me: I finally understood the basics of shaders, I got past Unity's ShaderLab syntax and I started to actually correlate the weird lines of HLSL code with the nice visual effects on screen. So – kudos to her, and if you're new to this topic and you want to learn more about shaders, make sure to watch those videos 😉

Shaders can completely transform the look and feel of the game (like the screenshot from Minecraft above), and they’re also a very optimised way of showing some pretty complex graphics because they take advantage of GPU parallel computation.

Note: in this series, I’ll mostly focus on shaders inside of Unity. Also, I won’t be diving into the new Shader Graph, I’ll only discuss writing shaders as code (in .shader asset files) 🙂

What are shaders?

A quick overview

Basically, a shader is a little piece of code that has a set of input parameters, as well as some additional implicit data, and then specific functions that tell the computer how a specific object in the scene or a screen VFX should be rendered.

As Freya says – shaders are just some low-level frontend: they're about writing maths and turning it into colours to fill the various pixels on your screen with the right info.

The whole point of shaders for 3D rendering is to go from the 3D space to the 2D screen space: you initially have data for each of the vertices in your mesh, and you then pass and/or convert this data in various ways to eventually end up with per-fragment (= per-pixel) info.

The first phase is the vertex shader: this function first takes in per-vertex data (that is passed directly by Unity itself to your shader code). This data can contain the position of the vertex, its UV coordinates, its normal, etc. Then, you either modify it a bit (for example if you want to add some displacement to your shape) or pass it through directly – either way, the goal is to have it fit a new intermediary data structure: the interpolators.

The problem is that, of course, you don't have one vertex for every pixel on your screen: there are way more pixels! So you can't simply iterate through your vertices, get a colour for each, and print those colours on the screen: this would give you a screen filled with empty spaces, and only a few scattered coloured pixels. To solve this issue, we use interpolation to infer the intermediary values (in-between your vertices) – hence the name "interpolators". This is done automatically by the GPU: it simply blends the values you computed for your vertices to get the interpolated values.

When this interpolation phase is done, you enter the last stage of the shader: the fragment shader. At this point, you have per-pixel data and you just want to output the proper matching pixel colour.

Here is a little diagram that sums up the entire process for a simple cube with coloured vertices:
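In code, this per-vertex colour pass-through could look roughly like the sketch below. The structure names and semantics follow Unity's usual conventions, but the snippet itself is my own illustrative example, not the exact code behind the diagram:

```hlsl
// Sketch of the vertex-to-fragment data flow for a mesh with coloured vertices
struct appdata {
    float4 vertex : POSITION;     // vertex position in local (object) space
    float4 color  : COLOR;        // per-vertex colour
};

struct v2f {
    float4 vertex : SV_POSITION;  // position in clip space
    float4 color  : COLOR;        // will be interpolated across each triangle
};

v2f vert (appdata v) {
    v2f o;
    o.vertex = UnityObjectToClipPos(v.vertex);
    o.color = v.color;            // pass the colour through to the interpolators
    return o;
}

fixed4 frag (v2f i) : SV_Target {
    // by the time we get here, i.color has already been blended
    // between the vertices surrounding this pixel
    return i.color;
}
```

So the fragment shader never sees the raw vertex colours: it only ever receives the already-interpolated values.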

Shader or material?

Something important to note is that shaders can't be used directly by your 3D objects. Be it in Unity or in 3D software like Blender, you first create materials that use a shader, and then apply those materials to your objects.

So, materials are sort of “instances” of shaders: the shader is the blueprint, the template; and the material uses this template with some custom parameters (for example a specific colour, or a specific glossiness…).

A peek at Unity shaders

Unity uses a declarative language called ShaderLab to write its shaders. This language describes a shader object as a set of nested blocks (delimited by braces).

At the root level, you create a Shader block that encapsulates all of your code and that sets some Unity-specific info like the name of the shader in your project, the path to access it, etc. Then, there are two types of blocks inside:

  • the first block is the Properties block: it defines all the parameters that appear in the Inspector for materials that use this shader and that can be defined by the user – then those parameters can be used in your shader code by associating a variable to each
  • the next blocks are SubShader blocks: you usually have only one, but you can create several if you want your shader to behave differently depending on the hardware/render pipelines it’s run with

These SubShader blocks themselves contain some sub-blocks:

  • the level of detail, or LOD (optional): the level of detail this sub shader should be used for
  • the Tags (optional): these define where objects using this shader are placed in the render queue, whether they are opaque or transparent…
  • most importantly: the Passes blocks
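To make this nesting more concrete, here is a bare-bones skeleton of a .shader file (the shader name and path are just placeholder choices of mine):

```shaderlab
Shader "Custom/MySkeleton" {
    Properties {
        // user-tweakable parameters shown in the material's Inspector
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }
        LOD 100

        Pass {
            // the actual HLSL code of this pass goes here
        }
    }
}
```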

A Pass block is where you define the actual core code of the shader, using the HLSL language. Once again, you can define some tags and additional commands, for example to play with the depth buffer. But, at the beginning, you’ll mostly focus on these 5 main parts of your Passes:

  1. the #pragmas and #includes: those are C-like preprocessor instructions that define your shader's entry points and import code from Unity's shader libs, providing you with lots of util methods and variables to actually create Unity-viable code!
  2. the variables associated with the user-defined properties you listed in your Properties block: they are named exactly the same as the parameters, but the data types might be a bit different. You have to write this boilerplate “conversion” to be able to access the values the user set in the Inspector in your code.
  3. two data structures, called appdata and v2f by default: those define the fields that you can access at the vertex level in your vertex shader (appdata) and at the fragment level in your fragment shaders, after the interpolation (v2f)
  4. the HLSL code of your vertex shader, usually in a vert() function: this one takes in an instance of your appdata structure and spits out an instance of your v2f structure (it’s the 1st phase in the pipeline I discussed earlier)
  5. the HLSL code of your fragment shader, usually in a frag() function: this one takes in an instance of your v2f structure and spits out a colour, either as a float4 or a fixed4 (because you have the 3 usual colour channels, R, G, B; and you also have an alpha channel – so that’s 4 components in total)
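For point 2, the boilerplate "conversion" between a property and its matching HLSL variable could look like this (the `_Color` name here is purely illustrative):

```hlsl
// In the Properties block (ShaderLab syntax):
//     _Color ("Main Colour", Color) = (1, 1, 1, 1)

// Then, inside the Pass's HLSL code, redeclare it to access the user-set value:
float4 _Color;  // must have exactly the same name as the property
```

Once this matching variable is declared, you can read the Inspector value anywhere in your vert() or frag() functions.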

I’m not diving into all the details of multi-passes, rendering order and all that sort of stuff – that’s another interesting but very complex subject 🙂

To sum up: Unity’s ShaderLab language wraps some HLSL snippets into higher-order objects, with a nested hierarchy of blocks, and it defines shader assets that can be used in materials to define how 3D objects in your scene will eventually be rendered.

A basic example

The simplest shader you can think of is one that always returns a “white” colour. In that case, you’ll simply ignore the incoming data and always return the same value: a float4 with all components set to 1. So you simply disregard the data inside of your appdata and v2f (you still have to fill the v2f, or Unity will yell at you saying you’re missing some info in the fragment shader) and return a float4(1, 1, 1, 1).

Because of the nested structure I described before, even something this simple can result in somewhat “long” code, but it’s quite straightforward once you’ve gotten used to those various levels in the hierarchy.
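Here is a minimal sketch of such an always-white unlit shader (the "Custom/SimpleWhite" name is my own placeholder, not necessarily the one in the actual project):

```shaderlab
Shader "Custom/SimpleWhite" {
    SubShader {
        Tags { "RenderType" = "Opaque" }

        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata {
                float4 vertex : POSITION;     // vertex position in local (object) space
            };

            struct v2f {
                float4 vertex : SV_POSITION;  // position in clip space
            };

            v2f vert (appdata v) {
                v2f o;
                // project the local 3D position onto the 2D screen (clip space)
                o.vertex = UnityObjectToClipPos(v.vertex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target {
                // ignore the interpolated data: always output opaque white
                return fixed4(1, 1, 1, 1);
            }
            ENDCG
        }
    }
}
```

Everything between CGPROGRAM and ENDCG is HLSL; the rest is ShaderLab wrapping.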

In the vertex shader, we have to go from the object’s local space to the screen clip space, which can be done using a Unity built-in tool (from the UnityCG.cginc lib we included above): the UnityObjectToClipPos() function. This ensures that the position of the vertex in its local 3D space is properly projected onto our 2D screen.

Now, suppose you create a basic 3D scene with Unity’s default template and a few built-in 3D shapes like a cube, a sphere and a capsule:

If you create a material from this shader and you apply it to the 3D meshes, you’ll get simple white objects (I’ve changed the camera background to be a solid black colour):

Pretty dull, right?

Yep – all the lighting, shadows and reflections we’re so used to having are actually shader stuff that we need to implement ourselves if we want to reproduce those effects! So our very simple shader is just able to output a white colour for each pixel that corresponds to one of our three 3D objects, and we get basic unlit flat shapes on our screen.

“Shader Journey”: introducing the project

Now that I’ve finally understood the basics of shaders in Unity, I’ve decided it’s time I embark on a long new adventure: exploring this intriguing world of shaders!

Throughout this series, I will try to write various shaders to make VFX or rendering styles that I’m interested in: toon shading, crazy displacements, UI elements (yes, it’s doable with shaders!), glowing effects… It will be a great opportunity to get better at writing those and phrasing some “visual problems” into math equations 😉

Note however that this series won’t be a series of tutorials: I won’t necessarily be re-detailing the basic concepts (for example Lambertian or Blinn-Phong lighting models), I’ll simply jump right in and talk about a specific thing I tried to make, and some of its underlying building blocks.

Each time, I will make a little video of the result and I’ll list all the features that I managed to implement with this shader.

🚀 I’ll also share the code for all these shaders on my GitHub. The one for this episode 0 with the simple white shader is already online! 🙂

I’ll also be improving my Unity skills, because I want to use Unity for the entire process, i.e. also for the video recording and (crude) editing. For this, I’m going to take advantage of the video recorder now built into Unity (for versions 2020+) and some UI canvas animations for panel or text fade-ins and fade-outs.

Note: by the way, if you want to learn more about animations for UI, you can check out another article I wrote about smooth scene transitions with UI cross-fades 🙂


This new series is yet another exciting opportunity to broaden my toolbox as a game developer! I’ll be working both on CGI and rendering, and on programming – shaders are a neat way of mixing two domains I really like!

Next time, I’ll start by doing some basic toon shaders!

I hope you’ll like this project – and as always, feel free to react in the comments if you have ideas of effects or shaders I could try 😉
