RTS Interlude #4: Improving the healthbars (Unity/C#)

Let’s digress for a while and leverage the power of shaders in our RTS game!

This article is also available on Medium.

In previous episodes of the series, we talked a bit about how to create healthbars for our units. In particular, we had some logic to create a UI element and “snap it” above the head of the unit.

However, this technique is pretty flawed and not very efficient because:

  • the healthbars can quickly get strange offsets
  • we have to manually reposition them on the screen whenever either the camera or the unit moves, plus when the camera zooms
  • we are deleting and re-creating objects (the UI healthbars) each time we select or deselect our units

All of this is inefficient and power-hungry… but there is a way to improve it! So, today, let’s revisit our healthbars and see how we can use shaders to display them efficiently, robustly and easily 🙂

Using a simple quad to have the healthbar follow the unit properly

Ok – first things first, let’s handle the positioning problem. For now, having to manage a second object and re-compute its screen position is a real thorn in our side, and we run the risk of making lots of mistakes along the way as we convert between all those spaces (local, world, screen…).

On the other hand, just by adding a little quad primitive inside our unit prefab hierarchy and placing it above its head, we directly get a 3D object that moves along with the unit and never gets offset!

This quad is simply added to my prefab with a bit of Y offset and some scaling. I removed the MeshCollider component because I don’t need any interactions, and I’ve (temporarily) switched the material to an unlit built-in one, the Default-Line, so my healthbar clearly shows in the demo clip. But we’ll see in the following sections how to set up a more appropriate material!

Of course, here, I have also removed the code responsible for spawning the UI healthbar element in my UnitManager, and instead I’m toggling this healthbar game object on or off. For the UpdateHealthbar() method, I’ve removed the previous logic because those objects don’t exist anymore, and I’ll leave it blank for now:
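Here is a minimal sketch of what this can look like — method names like Select()/Deselect(), the healthbar field and the Unit data are assumptions standing in for the series’ actual code:

```csharp
using UnityEngine;

public class UnitManager : MonoBehaviour
{
    // the quad child we just added to the prefab, assigned in the Inspector
    public GameObject healthbar;

    public void Select()
    {
        // ... existing selection logic ...
        if (healthbar != null)
        {
            healthbar.SetActive(true);
            UpdateHealthbar();
        }
    }

    public void Deselect()
    {
        // ... existing deselection logic ...
        if (healthbar != null)
            healthbar.SetActive(false);
    }

    protected virtual void UpdateHealthbar()
    {
        // blank for now - we'll fill this in once the shader is ready
    }
}
```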

You might remember that we overrode the UpdateHealthbar() function in the BuildingManager to handle the construction phase, so we need to clear the logic there too:
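Again as a hedged sketch (assuming BuildingManager derives from UnitManager, as in the rest of the series):

```csharp
public class BuildingManager : UnitManager
{
    protected override void UpdateHealthbar()
    {
        // previous construction-phase UI logic removed - blank for now too
    }
}
```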

Note: don’t forget to assign your “Healthbar” game object to the new healthbar slot in your UnitManager-derived classes 😉

Great, right? No? Ok – I know that the rotation is going crazy… but bear with me: we’ll fix this in a couple of minutes!

Writing our own healthbar shader

But before we do that, let’s actually add some colours to this healthbar, and start to discuss how we can make the display update dynamically depending on the unit’s current health thanks to shaders.

Why do we need shaders?

Something important to remember is that to render something in 3D, be it in a game engine like Unity or a piece of 3D software like Blender or Maya, you effectively need to “translate” the 3D data into 2D info so that it can be drawn on your flat screen. And this in turn requires your 3D objects to have several levels of components that each contribute to this translation.

Roughly put, rendering a 3D object requires you to know two things about the object:

  • its shape: this is given by its mesh, which is a collection of vertices and faces in the 3D world
  • its rendering properties: is it coloured? glossy? with an image texture? – this is determined by the object’s material

Once you have those two pieces of info, you can write some logic to “merge” them and compute the colours of the pixels matching your 3D object on the screen. And this merging is done thanks to bits of code called shaders.

This subfield of game dev is really fascinating, but also pretty complex – I’d been trying things out for years and never managed to get anything out of it! But then, I discovered Freya Holmér’s incredible YouTube channel, and in particular her 3-part series on shaders for game dev. In these videos, she explains what shaders are, how they work and how you can write them for your Unity games. They are an amazingly comprehensive and clear intro to the fundamental concepts as well as the ShaderLab syntax (the language Unity uses for its shaders), so I really recommend you watch them if you want more details 🙂

I’ve also talked about Unity materials and shaders in a short series I made a few months ago, my “Shader Journey”, so you can check it out if you want additional info on the structure of Unity shader code.

Just to give you some pointers, however, here is a quick summary of shader basics.

What actually are shaders?

In short, a shader is a little piece of code that takes a set of input parameters, plus some additional implicit data, and defines specific functions that tell the computer how a given object in the scene (or a screen VFX) should be rendered. Shaders are a very optimised way of displaying pretty complex graphics because they take advantage of the GPU’s parallel computation.

However, shaders can’t be used directly by your 3D objects – they are just the “blueprints” for your materials; and the materials use these templates with some custom parameters (for example a specific colour, or a specific glossiness…).

Now, from a Unity dev point of view, shaders are programs: code that you write in .shader files and save in your Unity project, then assign to your materials by picking them from the dropdown at the top of the Inspector.

In terms of structure, all shaders rely on the same basic workflow. As we saw before, their goal is to allow the computer to go from the 3D space to the 2D screen space: so you initially have data for each of the vertices in your mesh, and you then pass and/or convert this data in various ways to eventually end up with per-fragment (= per-pixel) 2D info.

The process works in two (and a half) phases:

  • the vertex shader: this function takes in per-vertex data (passed directly by Unity to your shader code), optionally modifies it (for example to add some displacement to your shape) or passes it through as-is, and outputs a new intermediary data structure: the interpolators
  • the interpolation: of course, you don’t have one vertex for every pixel on your screen: there are way more pixels! So you need to infer the intermediary values (in-between your vertices) and blend the values you computed for your vertices to get the interpolated values for the rest of the pixels on-screen. This is done automatically by the computer (hence the “half” phase) 🙂
  • the fragment (or pixel) shader: finally, you use the per-pixel data and output the proper matching pixel colour – this is where you apply your textures, your material colouring, your glossiness, etc.

Here is a little diagram that sums up the entire process for a simple cube with coloured vertices:

Applying this to our healthbar problem

Ok – enough talking! To create my healthbar shader, I mostly followed Freya’s course and in particular her mid-course exercises, then combined some of their outputs to get the following result:

I won’t go into all the details of the shader code itself, but overall this healthbar uses the following techniques:

  • the shader takes in an input parameter, _Health, which is a float that determines both where the fill stops and what colour to use
  • the colour changes are further specified using other input parameters like the “low health colour”, the “high health colour” and the matching thresholds
  • taking into account those thresholds, the healthbar fill colour is either one of the two extreme colours or a lerp (= a blend) between them
  • the rounded corners and the border mask are computed using a signed-distance function (or SDF)
  • the anti-aliasing is done via partial derivatives with the fwidth() function
  • if the “Pulse Is Low” parameter is toggled on, then I use Unity’s _Time variable to get a periodic flash of my colour over time

The really cool thing is that just by creating a new material, giving it my custom shader and assigning it to my quad object, I can then tweak all the values in the Inspector. To get the pulse animation in the editor, make sure to check the “Always Refresh” toggle in the scene view options:

Now, what does the shader look like? Here’s the code!
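More precisely, here is a minimal sketch in the spirit of the full shader, built from the techniques listed above – it is an approximation rather than the project’s exact code, and every property name apart from _Health (as well as all default values) is an assumption:

```shaderlab
// Healthbar.shader - a hedged approximation; property names other than
// _Health, and all default values, are assumptions.
Shader "Custom/Healthbar"
{
    Properties
    {
        _Health ("Health", Range(0, 1)) = 1
        _LowHealthColor ("Low Health Colour", Color) = (1, 0, 0, 1)
        _HighHealthColor ("High Health Colour", Color) = (0, 1, 0, 1)
        _LowHealthThreshold ("Low Health Threshold", Range(0, 1)) = 0.2
        _HighHealthThreshold ("High Health Threshold", Range(0, 1)) = 0.8
        _BorderSize ("Border Size", Range(0, 0.5)) = 0.1
        [Toggle] _PulseIsLow ("Pulse Is Low", Float) = 1
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            float _Health;
            float4 _LowHealthColor, _HighHealthColor;
            float _LowHealthThreshold, _HighHealthThreshold;
            float _BorderSize, _PulseIsLow;

            struct MeshData
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct Interpolators
            {
                float4 vertex : SV_POSITION;
                float2 uv : TEXCOORD0;
            };

            // vertex shader: nothing fancy yet, just project to clip space
            Interpolators vert (MeshData v)
            {
                Interpolators o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = v.uv;
                return o;
            }

            // fragment shader: rounded bar via an SDF, border mask,
            // threshold-based colouring and optional low-health pulse
            float4 frag (Interpolators i) : SV_Target
            {
                // remap the UVs so the bar is 8 units wide and 1 unit tall,
                // then measure the distance to the central segment (SDF)
                float2 coords = i.uv * float2(8, 1);
                float2 pointOnSegment = float2(clamp(coords.x, 0.5, 7.5), 0.5);
                float sdf = distance(coords, pointOnSegment) * 2 - 1;
                clip(-sdf); // discard everything outside the rounded bar

                // border mask, anti-aliased with screen-space derivatives
                float borderSdf = sdf + _BorderSize;
                float aa = fwidth(borderSdf);
                float borderMask = 1 - saturate(borderSdf / aa);

                // fill colour: remap the health between the two thresholds,
                // then blend between the low and high colours
                float t = saturate((_Health - _LowHealthThreshold) /
                                   (_HighHealthThreshold - _LowHealthThreshold));
                float3 healthColor = lerp(_LowHealthColor.rgb, _HighHealthColor.rgb, t);

                // optional periodic flash when the health is low
                if (_PulseIsLow > 0.5 && _Health < _LowHealthThreshold)
                    healthColor *= 1 + 0.3 * cos(_Time.y * 4);

                // only fill the bar up to the current health value
                float fillMask = i.uv.x < _Health ? 1 : 0;
                return float4(healthColor * fillMask * borderMask, 1);
            }
            ENDCG
        }
    }
}
```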

As expected, we have the vertex shader, the interpolators and the fragment shader. The meat of the code might not be completely obvious if you’re new to shaders but this post is meant to be more of a peek at shaders than an in-depth course, so I’ll let the curious dive into the topic on their own 😉

This .shader file can be saved anywhere in your Unity project and, because at the very top we set its “path” to be “Custom/Healthbar”, it will now be accessible to all materials by going to the shader dropdown, in the “Custom” submenu.

So to use this shader and apply it on the quad, you first need to create a material and give it the custom shader:

And then set this material on your quad:

Making the quad always face the camera

Now, of course, the 3D object we put in our unit’s hierarchy translates along with the rest of the transform… but it also rotates along with its parent! As we saw before, if I move my Soldier, for example, and it starts to rotate, then the healthbar will rotate with it and it won’t be facing the camera anymore!

We could use some Vector3 directions and angles to re-adjust this rotation and force the healthbar to face the camera, but it would require heavy computations (because of all the conversions between local and world spaces) and we would need to do them often – so, it’s clearly not a good idea.

Instead, we can take advantage of the power of shaders to offload this computation to the GPU, thanks to a technique known as “billboarding”. As explained in this nice article by William from NovelTech:

A Billboard shader is a shader that computes the direction from the camera to a given object and adjusts the rendering angle of the object to be always facing the camera.

The idea is basically to jump in during the vertex shader phase and, instead of transforming the vertices with the object’s full transformation matrix, compute their positions as if we were looking straight at the object with the camera. Then, we re-apply an offset based on the object’s position in world space so that the mesh’s actual 3D location still impacts the rendering.

This is fairly easy to do in our shader; this Wikipedia link gives us the formula to use, so we just have to re-integrate this transformation of our vertex position in the vertex shader:
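As a hedged sketch, here is one common view-space billboarding approach (it may differ from the exact formula on the linked page, and the MeshData/Interpolators structure names are assumptions): the vertex shader becomes something like this.

```hlsl
// Billboarded vertex shader: project the vertex as if the quad had no
// rotation relative to the camera, keeping only its world position and scale.
Interpolators vert (MeshData v)
{
    Interpolators o;
    // the object's pivot in world space, then in view space
    float3 worldOrigin = unity_ObjectToWorld._m03_m13_m23;
    float3 viewOrigin = mul(UNITY_MATRIX_V, float4(worldOrigin, 1)).xyz;
    // keep the scale baked into the object-to-world matrix
    float3 scale = float3(
        length(unity_ObjectToWorld._m00_m10_m20),
        length(unity_ObjectToWorld._m01_m11_m21),
        length(unity_ObjectToWorld._m02_m12_m22));
    // apply the local vertex offset directly in view space so the quad
    // always faces the camera, then project to clip space
    float3 viewPos = viewOrigin + v.vertex.xyz * scale;
    o.vertex = mul(UNITY_MATRIX_P, float4(viewPos, 1));
    o.uv = v.uv;
    return o;
}
```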

And tadaa! This gives us a 3D object that always faces the camera 🙂

(You can see that the selection box of the object does move along with the view, i.e. the 3D object is rotated relative to the camera, but the visualisation we get after rendering via our shader is always snapped to the camera.)

Note: as mentioned by William in his article, a slight drawback of this method is that shadows won’t move properly along with the object – this is not an issue in our case because we are working on UI “overlaid” elements without any shadows, but keep this in mind if you plan on billboarding a 3D object inside your scene 😉

Updating the healthbar display with the current unit health

Last but not least, we need to use the _Health parameter we prepared earlier in our shader and actually update the healthbar display depending on the current unit health – we basically want to do via scripting what I did before when I manually updated my _Health variable slider in the Inspector.

The idea here is to use our C# scripts to dynamically change the shader parameter: from our UnitManager, we want to change the value of the _Health variable on the material of the unit we are currently working with, in place of our previous UI element updates.

The quick (and dirty) way

Whenever you have a material, you can access any of the properties of its shader and modify them using Unity-defined functions like SetFloat(), SetColor(), etc. Just pick the one that matches your property type and give it the name of the property to update! For example, here, we want to set a float, _Health, so we can modify our UpdateHealthbar() method to get our brand new quad’s renderer and set the property of its material’s shader:
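Here is a minimal sketch of what this can look like — the healthbar field is the quad from before, and the Unit reference with its HP/MaxHP values is an assumption standing in for the series’ health data:

```csharp
using UnityEngine;

public class UnitManager : MonoBehaviour
{
    public GameObject healthbar;
    public Unit Unit;   // hypothetical reference to the unit's data (HP, MaxHP...)

    // ... Select() / Deselect() unchanged ...

    protected virtual void UpdateHealthbar()
    {
        if (healthbar == null) return;
        MeshRenderer r = healthbar.GetComponent<MeshRenderer>();
        // _Health is the float property exposed by our custom shader,
        // normalised between 0 and 1
        r.material.SetFloat("_Health", Unit.HP / (float)Unit.MaxHP);
    }
}
```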

Again, remember to also update the overridden version in the BuildingManager:
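As a hedged sketch again — the _isUnderConstruction flag and _constructionRatio value are hypothetical stand-ins for the series’ construction-phase data:

```csharp
using UnityEngine;

public class BuildingManager : UnitManager
{
    private bool _isUnderConstruction;    // hypothetical stand-ins for the
    private float _constructionRatio;     // series' construction-phase data

    protected override void UpdateHealthbar()
    {
        if (!_isUnderConstruction)
        {
            base.UpdateHealthbar(); // normal health display
            return;
        }
        if (healthbar == null) return;
        MeshRenderer r = healthbar.GetComponent<MeshRenderer>();
        // during the construction phase, show the construction progress instead
        r.material.SetFloat("_Health", _constructionRatio);
    }
}
```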

The better (and more efficient) way

But as brilliantly explained by Thomas Mountainborn in this really cool (and short) article, directly assigning properties to the shader via the material can quickly eat up your memory and frame rate as you start to have more and more instances of the material to show on screen at the same time.

That’s because Unity will have to create copies of your material so that each instance gets the right value for the property and they don’t just “share” the same display. This means more memory consumption, but also CPU overhead, because the program will have to upload a lot of material changes to the GPU.

Instead, a better technique is to use Material Property Blocks, which allow you to change properties in multiple copies of your material more efficiently.

Important note: of course, as with all optimisations, it adds a little bit of complexity to the code and should therefore only be implemented after profiling the game and checking you actually need this improvement… but in the spirit of sharing and writing a comprehensive tutorial, I thought it was a good opportunity to show how it works on this very simple example 😉

The overall logic remains exactly the same: to update our healthbar display, we want to change the _Health float property of the shader. It’s just that now, this change will be done via a Material Property Block instead of directly doing it on the material. And, of course, to keep things smooth, you should make sure to only instantiate the Material Property Block variable once and cache it for further re-use – for example with the Singleton pattern:
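A minimal sketch of this version — the shared block is cached in a static field here, which plays the same role as the Singleton mentioned above, and the Unit data is still a hypothetical stand-in:

```csharp
using UnityEngine;

public class UnitManager : MonoBehaviour
{
    public GameObject healthbar;
    public Unit Unit;   // hypothetical, as before

    // instantiated once and shared by all managers (same idea as a Singleton)
    protected static MaterialPropertyBlock materialPropertyBlock;

    protected virtual void UpdateHealthbar()
    {
        if (healthbar == null) return;
        if (materialPropertyBlock == null)
            materialPropertyBlock = new MaterialPropertyBlock();

        MeshRenderer r = healthbar.GetComponent<MeshRenderer>();
        // read the current block, update just the _Health float and
        // re-apply it: no material copy is created this time
        r.GetPropertyBlock(materialPropertyBlock);
        materialPropertyBlock.SetFloat("_Health", Unit.HP / (float)Unit.MaxHP);
        r.SetPropertyBlock(materialPropertyBlock);
    }
}
```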

And in the BuildingManager:
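Same hedged assumptions as before, but going through the shared MaterialPropertyBlock:

```csharp
using UnityEngine;

public class BuildingManager : UnitManager
{
    private bool _isUnderConstruction;    // hypothetical, as before
    private float _constructionRatio;

    protected override void UpdateHealthbar()
    {
        if (!_isUnderConstruction)
        {
            base.UpdateHealthbar(); // normal health display
            return;
        }
        if (healthbar == null) return;
        // materialPropertyBlock is the shared static block from the UnitManager sketch
        if (materialPropertyBlock == null)
            materialPropertyBlock = new MaterialPropertyBlock();

        MeshRenderer r = healthbar.GetComponent<MeshRenderer>();
        r.GetPropertyBlock(materialPropertyBlock);
        // during construction, the fill shows the construction progress
        materialPropertyBlock.SetFloat("_Health", _constructionRatio);
        r.SetPropertyBlock(materialPropertyBlock);
    }
}
```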

After all this work, here is a small demo of our updated healthbar system – of course, you can adapt the style and size of the healthbar to better fit your game, this example is pretty basic! 😉

Conclusion

This interlude was a good opportunity to discuss more advanced topics such as shaders or billboarding. And, even if I didn’t get into all the details, I hope it gave you an idea of how powerful shaders can be! Asking your GPU to do the computations rather than your CPU allows you to greatly improve the efficiency of your game…

And in some cases, it even makes for simpler code! For example, we can now get rid of our Healthbar class entirely, the “Healthbar” prefab, and even the GetBoundingBoxOnScreen() method in our Utils that helped us compute the on-screen UI element position 😉

Next week, we’ll get back on the main route and discuss the final feature of our RTS prototype: we’ll see how to implement technology trees…

