Nodevember #1

Last November, the Nodevember event took place. It focuses on procedural rendering with 3D software and offers a themed challenge every day: the goal is to start from the simplest possible 3D model (a cube, a sphere…) and to create amazing visuals using only nodes and procedural shading.

Preliminary note: this article is not meant to be (too) technical. I won’t detail the keywords I use, like “material”, “shading” or “rendering engine”, but if you’re interested in learning more or you’re not familiar with these terms, just browse the Internet or take a look at a 3D glossary.

Procedural shading is the idea of generating materials automatically from a large set of parameters, mixed together through various pipelines to produce a final visual result. It relies on breaking down the important features of the material you want to create into parameters: what color is it? Is it shiny or rough? Does it have little bumps on the surface…? The interesting thing about procedural shaders is that they are basically a mix of many mathematical functions, RGB colors, gradients and other small blocks, or “nodes”, that you simply link to each other to build a big graph ultimately leading to a “material output” containing all the data the rendering engine needs to render your object on screen. This means it is not one big black box but rather a highly tweakable and adaptable ensemble of bricks that can easily be transferred to another object, be it entirely or in part. For example, with this technique, you can “extract” what makes a material fluffy and hairy and then apply this little piece of shading to several objects in your scene without having to restart from scratch every time.

Nodevember’s icon: it’s all about building a node pipeline to procedurally generate outstanding visuals!

The Nodevember event is organized by Jonas Dichelle and Luca Reed and takes place every year. It’s a bit like the “Advent of Code” daily programming challenge I talked about last December, but for procedural art. To participate, you simply post your creations on Twitter throughout the month of November, and you’re free to use whichever 3D software you prefer.

I myself am a big fan of Blender: it’s a free, open-source tool for 3D creation; plus it now ships with a very powerful set of rendering engines and a really rich node system. To be honest, I’ve only scratched the surface… for now, I’m still making basic things. But it’s great fun, and I can already get some nice-looking results!

Disclaimer: I am by no means a true 3D artist. I like to have fun with modelling, rendering, rigging and even a bit of animation, but this is not my job and I don’t actually keep up with all the latest updates to all the available 3D software. There are lots of competitors to Blender: Autodesk Maya, ZBrush, Autodesk 3ds Max… I really like Blender because of its open-source philosophy, because I’ve (finally!) managed to get a grasp on the controls, because I find its recent “Eevee” engine quite fascinating and because I like the fact that you can incorporate Python scripts directly into your scenes to create objects, rescale or move elements, generate parts completely out of nowhere…

Also, I’m just using the Nodevember challenge as a source of ideas for improving my technical skills and my understanding of Blender’s node system. So I probably won’t do a full series of pictures or movies for every Nodevember challenge, but rather tackle some of them or branch off onto other themes. Still, I’ll try to post the various things I make on this topic on my website in the months to come; at the very least, it might make for cool images 🙂

First things first: nodes?

Alright, so what does it mean to use nodes to create materials? Blender’s node system lets you create materials by starting from simple built-in shaders, adding custom parameters, blending them together or adding displacement effects.

As explained in Blender’s docs:

Just in case you are not (yet) familiar with the concepts: when you create a system of nodes, you are describing a data-processing pipeline of sorts, where data “flows from” nodes which describe various sources, “flows through” nodes which represent various processing and filtering stages, and finally “flows into” nodes which represent outputs or destinations. You can connect the nodes to one another in many different ways, and you can adjust “properties” or parameters, that control the behavior of each node. This gives you a tremendous amount of creative control. And, it will very quickly become intuitive.

Even though you start from basic data, by successively applying transformations to it and passing it on to the next node in the chain, you can achieve incredible results… just take a look at some Nodevember 2019 entries!

Simon Thommes’s contribution to Nodevember 2019 (follow the link for day-to-day tweets with animations):

Hans Chiu’s contribution to Nodevember 2019 (follow the link to see the animations!):

Pretty amazing, huh? You can see even more incredible visuals and get more references in this YouTube video by Curtis Holt. Remember that all of these were produced with but a single cube in a 3D scene! The craziest thing is that, for most of them, the artists even share the full graph of nodes they used, and I’m still unable to really understand (let alone reproduce) these results…

A basic step-by-step example

Now, it’s quite hard to see how you can get from a simple cube to things like that. So let’s walk through a basic example and watch our cube gradually turn into a pointy alien rock…

We are going to go from here (1) to there (2):

(1) Initial cube: we apply a few setup steps to add smoothing and more subdivisions so the node pipeline can have a greater impact on the object

(2) Final result: we transformed our cube into a weird peaky alien rock with funny colors!

Note: for those who want to try it out, if you want your cube to be correctly affected and deformed by the node pipeline, you need to turn on Blender’s experimental settings as shown in Curtis Holt’s YouTube video. Then, switch to the “Rendered” view (by holding Z and clicking the “Rendered” option) to see the result live as you update your pipeline.
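
For reference, here is a rough Python (bpy) sketch of this setup step, assuming Blender 2.8x with Cycles and the default cube as the active object; the exact settings shown in the video may differ slightly:

```python
import bpy

cube = bpy.context.active_object

# Smooth shading + a Subdivision Surface modifier, so the displacement
# later on has enough geometry to push around.
bpy.ops.object.shade_smooth()
subsurf = cube.modifiers.new(name="Subdivision", type='SUBSURF')
subsurf.levels = 4
subsurf.render_levels = 4

# Switch Cycles to the experimental feature set so true displacement
# (actual mesh deformation from the node pipeline) becomes available.
scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.feature_set = 'EXPERIMENTAL'
```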

After we set up Blender, our cube already looks smoothed out like a sphere (because we enabled Blender to deform our mesh via the nodes). If we open Blender’s node editor, we see a very simple pipeline that currently defines our (sphere-like) cube’s material (on the left, the node pipeline; on the right, the current render result):

The node on the left is called Principled BSDF. Despite the somewhat barbaric name, it simply tells Blender to “render this object with a plain color” (it has a lot of other options, but by default that’s what it does). Its output is fed into the “Surface” input of the final Material Output node because we want this color applied to the surface of our object. Hence the whitish sphere on the right.
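
In Python terms, this default pipeline looks roughly like the sketch below (assuming the active object already has a node-based material and the default English node names of Blender 2.8x); the last line also switches the material to true displacement, which we will need shortly:

```python
import bpy

mat = bpy.context.active_object.active_material
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

bsdf = nodes["Principled BSDF"]     # the "plain color" shader
output = nodes["Material Output"]   # the end node the renderer reads from

# The default link: BSDF output -> "Surface" input of the Material Output.
links.new(bsdf.outputs["BSDF"], output.inputs["Surface"])

# Tell Cycles to actually deform the mesh with the displacement data.
mat.cycles.displacement_method = 'DISPLACEMENT'
```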

The first thing you might be wondering is: since we have a soft and smooth ball for now, and we are not “allowed” (in this context) to use 3D modelling to move vertices around and change the shape itself, how are we going to get pointy peaks on it? The trick is to use displacement: the point here is not to change the color of the material or to paint shapes onto it as you would with actual RGB and texture nodes, but rather to use the texture as a height map of sorts.

To better grasp the concept of height maps, suppose you have a black-and-white image. We are going to map the luminosity of each pixel to the height of a vertex in 3D: say that “white” corresponds to mountains and “black” corresponds to valleys. Then, for example, this image (left) would produce this volume (top right) when applied to the geometry’s vertices (bottom right).

If we blur the height map, we get smoother sides:
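
To make the height-map idea concrete, here is a small toy example in plain Python (not Blender-specific); the image values and maximum height are arbitrary:

```python
import numpy as np

# Each pixel's brightness (0 = black, 255 = white) becomes a vertex height.
image = np.array([
    [  0,   0, 255,   0,   0],
    [  0, 255, 255, 255,   0],
    [  0,   0, 255,   0,   0],
], dtype=float)

max_height = 2.0                      # arbitrary world-space height
heights = image / 255.0 * max_height  # white -> mountains, black -> valleys

# A crude box blur: averaging each value with its neighbours smooths the
# slopes, which is why the blurred map gives gentler sides.
blurred = np.zeros_like(heights)
for y in range(heights.shape[0]):
    for x in range(heights.shape[1]):
        blurred[y, x] = heights[max(0, y-1):y+2, max(0, x-1):x+2].mean()
```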

Displacement works similarly: it takes your input texture (it can be an image or, in our case, a simple Blender built-in “noise texture”) and uses it as a reference for height. To do that, we are going to add two nodes to our pipeline (with Shift+A): a Noise Texture node and a Displacement node. By connecting the output of the Noise Texture to the “Height” input of the Displacement, and the output of the Displacement to the “Displacement” input of the final Material Output node, we get this new visual:
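
Here is roughly what this step looks like scripted with bpy, under the same assumptions as before (Blender 2.8x node and socket names):

```python
import bpy

tree = bpy.context.active_object.active_material.node_tree
nodes, links = tree.nodes, tree.links
output = nodes["Material Output"]

noise = nodes.new("ShaderNodeTexNoise")     # the "Noise Texture" node
disp = nodes.new("ShaderNodeDisplacement")  # the "Displacement" node

# Noise "Fac" drives the displacement height, which in turn feeds the
# "Displacement" input of the Material Output node.
links.new(noise.outputs["Fac"], disp.inputs["Height"])
links.new(disp.outputs["Displacement"], output.inputs["Displacement"])
```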

We can already see some bumps appearing. It’s nice, but it’s not as pointy as expected.

Not to worry: the whole power of nodes resides in the customization available through parameter tweaking and successive transformations. To get actual peaks, we can go back to our Noise Texture node and play around with the “Scale” and “Detail” parameters. This is what we can get with some adjusted values:
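
Scripted, the tweak is just two assignments; the numbers below are hypothetical, not the exact ones used for the render above:

```python
import bpy

# Re-grab the Noise Texture node (default name in Blender 2.8x) and tweak it.
noise = bpy.context.active_object.active_material.node_tree.nodes["Noise Texture"]
noise.inputs["Scale"].default_value = 6.0    # more, smaller features
noise.inputs["Detail"].default_value = 16.0  # finer, sharper bumps
```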

This is starting to look like an interesting shape, but it’s a bit small in the middle of our square render. Let’s take care of the height of the peaks. For now, we simply take the output of the Displacement and pass it on to the Material Output. What we can do is add another node in between, called Vector Math: among the various settings of this node, we can pick the “Scale” operation and thus multiply our output displacement vector by some numerical amount to increase the effect. Here’s an example with a scale factor of 2:
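
As a sketch, inserting that Vector Math node could look like this (the “Scale” operation requires Blender 2.81 or later, and the node names assume the defaults from the earlier steps):

```python
import bpy

tree = bpy.context.active_object.active_material.node_tree
nodes, links = tree.nodes, tree.links

scale = nodes.new("ShaderNodeVectorMath")
scale.operation = 'SCALE'
scale.inputs["Scale"].default_value = 2.0  # the factor-of-2 example

# Re-route: Displacement -> Vector Math (Scale) -> Material Output.
links.new(nodes["Displacement"].outputs["Displacement"], scale.inputs[0])
links.new(scale.outputs["Vector"], nodes["Material Output"].inputs["Displacement"])
```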

Now, last but not least: the color. Let’s be honest, this whitish shade is simple but not very alien-like. So far, Blender has provided us with a basic Principled BSDF node. It has lots of parameters, among which is the “Base Color”. This input can be set by hand to a direct RGB color (by clicking on the colored square). For example, if we click on the square and choose some red-orange tint, we get this new result:
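
In bpy, setting that base color by hand is a one-liner; the RGBA values below are only an approximation of the tint in the screenshot:

```python
import bpy

bsdf = bpy.context.active_object.active_material.node_tree.nodes["Principled BSDF"]
bsdf.inputs["Base Color"].default_value = (0.85, 0.25, 0.05, 1.0)  # red-orange
```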

It’s fun but still quite plain. What we want is for the peaks to change color, to have some sort of gradient tint. So we are instead going to feed the “Base Color” parameter of the Principled BSDF node with the output of another new node: the Color Ramp node. Let’s add it to the graph and connect it to the “Base Color” parameter of the Principled BSDF node. The default black-to-white gradient gets us back to a whitish paint job, because the black end is “hidden” in the middle of our alien rock.
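
Here is the corresponding sketch; in bpy, the Color Ramp node goes by the identifier ShaderNodeValToRGB:

```python
import bpy

tree = bpy.context.active_object.active_material.node_tree
nodes, links = tree.nodes, tree.links

ramp = nodes.new("ShaderNodeValToRGB")  # the "Color Ramp" node
links.new(ramp.outputs["Color"], nodes["Principled BSDF"].inputs["Base Color"])
# With the default black-to-white gradient, the render still looks whitish,
# since the black end sits "hidden" inside the rock.
```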

We still have 2 things to do:

  • prepare some nice colors on this gradient
  • make the “Fac” parameter of the Color Ramp depend on the position along the peak (the “height” of the vertex along the peak), so that the gradient is applied along our peaks

For this last point, we can reuse our Noise Texture node and hook up its output a second time, this time to the “Fac” input of the Color Ramp! With a little Math node (using the “square root” operation) in the middle and some reds, purples, whites and pinks in the Color Ramp, here is what we can get:
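
Put together, this last step could be scripted roughly as below; I look the nodes up by type rather than by name to stay version-agnostic, and the ramp colors are only guesses at the ones in the final image:

```python
import bpy

tree = bpy.context.active_object.active_material.node_tree
nodes, links = tree.nodes, tree.links

noise = next(n for n in nodes if n.bl_idname == "ShaderNodeTexNoise")
ramp_node = next(n for n in nodes if n.bl_idname == "ShaderNodeValToRGB")

# Noise "Fac" -> square root -> Color Ramp "Fac".
sqrt = nodes.new("ShaderNodeMath")
sqrt.operation = 'SQRT'
links.new(noise.outputs["Fac"], sqrt.inputs[0])
links.new(sqrt.outputs["Value"], ramp_node.inputs["Fac"])

# Rough stand-ins for the reds/purples/whites/pinks of the final render.
ramp = ramp_node.color_ramp
ramp.elements[0].color = (0.45, 0.05, 0.30, 1.0)  # deep purple-red at the base
ramp.elements[1].color = (1.00, 0.85, 0.90, 1.0)  # pale pink/white at the tips
```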

And voilà! If we go back to a “solid” view in our scene, we see that we still just have our cube. But if we render the image, we get this result. I find this amazing 😉

This is a basic pipeline: real pros can do way more, as the examples I showed above demonstrate: animation, incongruous shapes, glittery and shiny effects…

A few visuals of my own

I myself have tackled two of the Nodevember 2019 challenges so far, plus one of my own: the “asteroid” (day 21), the “mineral” (day 26) and the “moon” (custom).

Note: the one additional trick I used for these animations, which was not detailed above (because we were making a still image), is to add keyframes on the various parameters of my pipeline so that the colors and deformations change throughout the video.
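
As a sketch, keyframing a node parameter from Python looks like this (the parameter, values and frame numbers are hypothetical):

```python
import bpy

# Animate the Noise Texture "Scale" socket over the course of the video.
tree = bpy.context.active_object.active_material.node_tree
scale_socket = tree.nodes["Noise Texture"].inputs["Scale"]

scale_socket.default_value = 4.0
scale_socket.keyframe_insert("default_value", frame=1)

scale_socket.default_value = 12.0
scale_socket.keyframe_insert("default_value", frame=120)
```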

There’s still a lot of room for improvement and I might even take these challenges on again someday if I feel I can do a better job… In the meantime, here are my first results; I hope you’ll like them!

Day 21: “Asteroid”

Day 26: “Mineral”

Custom: “Moon”

References
  1. Nodevember’s website: https://nodevember.io/
  2. Blender’s website: https://www.blender.org/
  3. Autodesk’s website: https://www.autodesk.com/
  4. ZBrush’s website: https://www.zbrushcentral.com/
  5. Small glossary of 3D terms: http://www.timaxmedia.com/html/help/Glossary_of_3D_Terms_.htm
  6. Blender’s doc on the node system: https://docs.blender.org/manual/en/2.79/render/blender_render/materials/nodes/introduction.html
  7. Blender’s doc on the concept of displacement: https://docs.blender.org/manual/en/2.82/render/materials/components/displacement.html
  8. S. Thommes’s Twitter for Nodevember 2019: https://twitter.com/i/events/1190959967668494337?s=13
  9. H. Chiu’s contribution to Nodevember 2019: https://chiuhans111.github.io/Nodevember2019/
  10. C. Holt, “Amazing Nodevember Results (+ Free Resources)” (https://www.youtube.com/watch?v=JhLVzcCl1ug), Dec. 2019. [Online; last access 1-June-2020].
  11. M. Cauchi, “Beginners guide to: Procedural shading”, (https://www.mikecauchiart.com/single-post/2016/06/22/Beginners-guide-to-procedural-shading-using-Maya-2016-and-vray), June 2016. [Online; last access 1-June-2020].
