Mapping Our Way to PBR: Writing Unity URP Shaders With Code, Part 4

NedMakesGames
27 min read · Dec 19, 2022

Hi, I’m Ned, and I make games! Have you ever wondered how shaders work in Unity? Or, do you want to write your own shaders for the Universal Render Pipeline, but without Shader Graph? Either because you need some special feature or just prefer writing code, this tutorial has you covered.

This is the fourth tutorial in a series on HLSL shaders for URP. If you’re starting here, I recommend downloading the starting scripts, so you have something to work from.

  1. Introduction to shaders: simple unlit shaders with textures.
  2. Simple lighting and shadows: directional lights and cast shadows.
  3. Transparency: blended and cut out transparency.
  4. Physically based rendering: normal maps, metallic and specular workflows, and additional blend modes.
  5. Advanced lighting: spot, point, and baked lights and shadows.
  6. Advanced URP features: depth, depth-normals, screen space ambient occlusion, single pass VR rendering, batching and more.
  7. Custom lighting models: accessing and using light data to create your own lighting algorithms.
  8. Vertex animation: animating meshes in a shader.
  9. Gathering data from C#: additional vertex data, global variables and procedural colors.

If you prefer video tutorials, here’s a link to a video version of this article.

Today, we’re diving deep into material options to give our models more realistic surfaces. With physically based rendering (PBR), we’ll implement normal mapping, metallic and specular workflows, more transparency blending modes, parallax occlusion, and clear coat effects. I’ll end with a section outlining techniques to optimize texture use and make the shader your own.

Before I move on, I want to thank all my patrons for helping make this series possible, and give a big shout out to my “next-gen” patron: Crubidoobidoo! Thank you all so much.

Well, let’s get started!

Physically Based Rendering. So far, we’ve used the Blinn-Phong shading algorithm in MyLit. It’s great for simple objects, but we can do better. Most rendering engines have settled on a standard shading paradigm called “physically based rendering,” commonly abbreviated as PBR. It simulates realistic lighting using a “bidirectional reflectance distribution function.” Thankfully, Unity programmed this already for the default Lit shader!

PBR supports many different material types, like metal, plastic, and glass.

It’s easy to reuse their implementation in our MyLit shader.

In the ForwardLitPass Fragment function, change the final function call to UniversalFragmentPBR. It accepts the same arguments in Unity 2020 and 2021, making it safe to remove the #if branch.

PBR includes specular lighting by default — the #define in the shader file is not necessary.
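Here’s roughly what the end of Fragment looks like afterwards. This is a sketch; it assumes the lightingData (InputData) and surfaceData (SurfaceData) variable names from the starting scripts.

```
// MyLitForwardLitPass.hlsl, at the end of the Fragment function:
return UniversalFragmentPBR(lightingData, surfaceData);
```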

In the scene view, things look very similar.

In Unity 2020, smoothness values should range from 0–1 when using PBR.

If you’re on Unity 2020, lower the material’s smoothness to a value between 0 and 1. In fact, we can enforce that in the inspector.

In the .shader file, change the _Smoothness property to be the Range type, with bounds 0 and 1.
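For example (the default value here is just a placeholder; keep whatever yours was):

```
// MyLit.shader, in the Properties block:
_Smoothness("Smoothness", Range(0, 1)) = 0.5
```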

The inspector renders a slider and clamps the value between the min and max.

That’s really all it takes to use PBR in your shaders, albeit a bare-bones version. We’ll implement more features shortly. Right now, notice the more realistic specular highlights. I won’t go into the math behind the BRDF or PBR algorithms in this tutorial, but if that sounds interesting, please let me know!

The Frame Debugger.

Debugging Views. As a shader gets more and more sophisticated, the need for better debugging tools grows. We used the frame debugger in the previous section. It’s useful for viewing draw order, but what if we need to check the value of a variable? In C#, there’s Debug.Log, but HLSL has no similar function.

Well, the fragment function outputs colors, and as we know, colors are just numbers. We can use them to visualize any value — as long as the value ranges from zero to one.

For instance, it’s helpful to output a normal vector as a color. There are a couple small steps to make that happen.

First, remap each component to range from 0 to 1. Since normal vectors are normalized, the length is always 1, and each component ranges from -1 to 1. Remap them by adding 1 and dividing by 2.

Second, Fragment outputs a float4, while normal vectors are float3s. Fix this by appending a 1 to the normal vector using a float4 constructor.
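Putting both steps together, the debug output might look like this, assuming the world space normal is stored in input.normalWS:

```
// Debug view: visualize the world space normal as a color.
float3 normalWS = normalize(input.normalWS);
// Remap each component from [-1, 1] to [0, 1], then append 1 as alpha.
return float4((normalWS + 1) * 0.5, 1);
```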

When viewing the results, you can interpret redder values as normals pointing in the positive X axis, green in the Y axis, and blue in the Z axis. Hey! The colors handily match Unity’s XYZ gizmo!

Starting from the top left, the normal colors for +X, +Z, -X, -Z, +Y, and -Y.

Here are the colors corresponding to each cardinal direction.

Visualizing UVs!

You can use this technique to visualize any value! Try outputting smoothness, UVs, or scaled world position. This is the Debug.Log of shader development — it’s a little crude, but a good place to start if you run into a problem!

The Rendering Debugger. Unity 2021 has a new feature called the Rendering Debugger which makes many common outputs automatically available. MyLit should already support a bunch of these!

The logic is inside UniversalFragmentPBR. Many views just output data fed into the InputData and SurfaceData structs.

To support a few more views, set the clip space position field in InputData…

…and add a DEBUG_DISPLAY shader feature to the forward lit pass.
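In code, those two steps might look like the following. I’m assuming the SV_POSITION field is named positionCS, and mirroring the multi_compile the URP Lit shader uses for this keyword:

```
// MyLitForwardLitPass.hlsl, in Fragment, while filling out the InputData:
lightingData.positionCS = input.positionCS;

// MyLit.shader, in the forward lit pass block:
#pragma multi_compile_fragment _ DEBUG_DISPLAY
```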

Viewing albedo, or base color.

There are still some views MyLit doesn’t support, where objects will appear either black or white. As we progress through this tutorial series, these gaps will be filled. Unfortunately, if you’re using Unity 2020, you will need to stick with old-fashioned debugging methods.

Normal Mapping. You’ve probably used normal maps already if you’ve worked with 3D models at all. Also called bump maps, these textures encode directions, and modify the model’s normal vectors. Remember that normal vectors determine the strength of diffuse lighting; you can cheaply and easily add detail to a model using normal maps.

The left has a normal map applied. Notice the extra lighting details on the forehead and arms.

Before we implement normal mapping, let’s understand how they work. Textures store colors, but once again, these colors are just numbers. We can interpret them as normalized vectors, similarly to how we output debug normals a few paragraphs ago.

An example texture encoding the normal vectors of a sphere.

Each pixel stores one vector, remapped to range between zero and one. To reverse this, multiply by two and subtract one. This gives us a vector centered around zero.
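In code, the naive decode looks something like this (a reconstruction for illustration):

```
// Demo only: naively decode a normal map sample.
float4 packedNormal = SAMPLE_TEXTURE2D(_NormalMap, sampler_NormalMap, input.uv);
// Undo the remapping: multiply by two, subtract one.
float3 normalTS = packedNormal.rgb * 2 - 1;
```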

This is demo code to illustrate a naive implementation of normal mapping. It’s not required for your shader.

If you plugged in a normal map and tried to output these vectors, they would not look correct. There are two reasons. Number one: Unity encodes normal maps in a special format, optimized for normal vector storage.

This is why Unity frequently bugs you about flagging textures as normal maps in their import settings.

URP has a function to decode normal map texture samples, UnpackNormal, which also includes the remapping step.

These look slightly better, but there’s still something wrong. The normals are all bluish (pointing in the z-direction), and don’t change when the object rotates.

If you took the vector stored in the normal map and used it to apply lighting, it’s obvious we’re missing a step.

Tangent Space. Vectors in normal maps exist in a special frame of reference called tangent space. The easiest way to visualize it is to picture a globe. Up points outwards from the north pole, while right and forward point in perpendicular directions out the equator.

Now, picture yourself standing on the globe’s surface. From your perspective, up points above your head, right along your right arm, and forward in front of you. You exist in tangent space! It’s the perspective of an object standing on the surface of the mesh.

In tangent space, the up, right and forward vectors have special names. The up vector is pretty obviously the mesh’s normal vector. The forward and right vectors are called the tangent and bitangent.

Notice that these directions change (relative to world space) depending on where you stand on the globe.

That’s all well and good, but in the world of shaders, the light, camera, and all other data used in rendering are in world space. The normal vector must be as well if we want to use it in lighting calculations! Thankfully, it’s easy to convert a vector from tangent space to world space using a change of basis transform.

A visual representation of a tangent space to world space calculation.

A basis is the mathematical name for the coordinate axes: X-Y-Z in world space or tangent-bitangent-normal in tangent space. If you have the basis directions of one space given in terms of another, it’s easy to convert points between the two spaces.

In the 2D example above, we want to convert a point from tangent space to world space. Given the tangent and bitangent directions in world space, multiply each basis direction with the corresponding component in the point vector and add the products. The point (2, 1) equals the tangent vector times two plus the bitangent vector times one.
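As a quick sketch in shader terms:

```
// 2D change of basis: tangentWS and bitangentWS are the tangent space
// axes expressed in world space; (2, 1) is the point in tangent space.
float2 pointWS = tangentWS * 2 + bitangentWS * 1;
```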

So, we need to get the tangent-bitangent-normal basis directions in world space. We already have the normal, given to us through the mesh data. With that, we could technically pick any two vectors perpendicular to it (and to each other) for the tangent and bitangent. However, in shader dev, we define the tangent and bitangent so they correspond to the model’s UV coordinates.

Return to the globe and overlay it with simple UV values, resembling latitude and longitude. The U-coordinate is red while the V-coordinate is green.

The tangent vector points in the direction of growing U-value. In other words, if you walk in the tangent direction, the U-value under your feet will always increase. Similarly, the bitangent points in the direction of growing V-value.

This works out really well. Since normal maps are a texture applied to the model with UVs, the tangent and bitangent align with the texture. If you unwrap the globe and watch where the tangent and bitangent point as you walk around, from the perspective of the normal texture, they always point in the same direction.

Meshes store the tangent and bitangent vectors as a mesh data stream, just like normals or positions. With that, we now have the entire tangent space basis in world space, can convert a vector extracted from the normal map to world space, and use it for lighting.

Voila!

Phew, that was a lot of math, but I feel it’s important to really understand how normal maps work. They’re fundamental to shader development! Let’s get started programming!

Adding Normal Mapping. First, open MyLit.shader and add another texture property. Flag it with two attributes. The first, NoScaleOffset, hides the tiling fields in the inspector. This is preferable because the normal map should match the color map. The second, Normal, prompts Unity to check that any texture placed inside is correctly encoded.

White is not a good default color for normal maps. White, or (1, 1, 1), would decode to a weird, unnormalized normal pointing diagonally. The default value should act as if no normal map were present, just using the mesh vertices’ normal vectors for lighting.

Default normal map blue.

The “bump” default has a value of (0.5, 0.5, 1), a periwinkle-like color, which resolves to the (0, 0, 1) tangent space vector.
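The property might look like this (I’m calling it _NormalMap; use whatever name you like):

```
// MyLit.shader, in the Properties block:
[NoScaleOffset][Normal] _NormalMap("Normal map", 2D) = "bump" {}
```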

In MyLitCommon.hlsl, add the texture declaration.
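Something like:

```
// MyLitCommon.hlsl:
TEXTURE2D(_NormalMap); SAMPLER(sampler_NormalMap);
```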

In the forward lit pass, we need the mesh’s tangent and bitangent vectors. Tangents are stored in a vertex data stream with the TANGENT semantic. Like NORMAL, these tangent vectors are in object space.

I did lie earlier… bitangents are not stored in the mesh, at least not directly. Instead, we can derive them from the normal vector, tangent vector, and a special bitangent sign value. Unity stores this sign value in the TANGENT semantic’s w-coordinate.

We also need the tangent and bitangent in the fragment stage, requiring a new tangent field in the Interpolators struct. It will be in world space by now, and also still contain the bitangent sign. Tag it with the next available TEXCOORD semantic.

In the Vertex function, we need to convert the tangent vector to world space. The GetVertexNormalInputs function has an overload taking the tangent vector and bitangent sign, outputting world space tangent vectors. Store this in output, passing along the bitangent sign as the W-component.
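Taken together, the vertex-stage changes might look like this sketch. Struct names and TEXCOORD assignments follow the starting scripts; yours may differ:

```
// MyLitForwardLitPass.hlsl

struct Attributes {
    float3 positionOS : POSITION;
    float3 normalOS : NORMAL;
    float4 tangentOS : TANGENT; // XYZ: object space tangent, W: bitangent sign
    float2 uv : TEXCOORD0;
};

struct Interpolators {
    float4 positionCS : SV_POSITION;
    float2 uv : TEXCOORD0;
    float3 positionWS : TEXCOORD1;
    float3 normalWS : TEXCOORD2;
    float4 tangentWS : TEXCOORD3; // XYZ: world space tangent, W: bitangent sign
};

// In the Vertex function:
VertexNormalInputs normalInputs = GetVertexNormalInputs(input.normalOS, input.tangentOS);
output.normalWS = normalInputs.normalWS;
output.tangentWS = float4(normalInputs.tangentWS, input.tangentOS.w);
```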

Now the Fragment function has everything it needs. First, calculate the tangent space normal vector. As mentioned earlier, Unity encodes normal maps in a special way; use the URP function UnpackNormal to decode the tangent space normal vector inside.

Next up, calculate the tangent space basis. We have the normal and tangent, but not the bitangent!

The two blue vectors are perpendicular to the red and green vectors.

In 3D space, given any two different directions, there are exactly two directions perpendicular to both: a direction and its opposite! Use the bitangent sign to choose the correct one. URP will calculate this for us in the CreateTangentToWorld function. Pass it the normal vector, tangent vector, and bitangent sign.

It returns a float3x3 matrix, which is simply three float3 vectors stacked together. You can think of a matrix as a two-dimensional array or a list of vectors. Remember the formula to convert a vector from tangent space to world space? It’s easy to implement this in HLSL using matrix multiplication.

The function TransformTangentToWorld multiplies the tangent space normal vector (from the normal map) with the tangent to world matrix, producing a world space vector. We will use this to calculate lighting!

One final step: normalize this new normal to prevent rounding errors. Some testing is in order! Return the normal vector, remapping it appropriately.
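Here’s the whole fragment-stage sequence as one sketch:

```
// MyLitForwardLitPass.hlsl, in Fragment:
float4 packedNormal = SAMPLE_TEXTURE2D(_NormalMap, sampler_NormalMap, uv);
float3 normalTS = UnpackNormal(packedNormal); // tangent space normal

// Build the tangent-to-world matrix, convert, and renormalize.
float3x3 tangentToWorld = CreateTangentToWorld(input.normalWS, input.tangentWS.xyz, input.tangentWS.w);
float3 normalWS = normalize(TransformTangentToWorld(normalTS, tangentToWorld));

// Temporary test line: output the normal as a color.
return float4((normalWS + 1) * 0.5, 1);
```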

In the scene editor, find a test normal map and, in the texture importer, set it as a normal map. Apply it to your material and make sure everything makes sense. If you remove the normal map, the displayed colors should not wildly change. Rotating the model should result in rotated normal vectors.

It even works with double sided rendering! Perfect!

Return to the code and remove the test line. Then set the normal vector in lightingData.
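Assuming the InputData variable is named lightingData:

```
lightingData.normalWS = normalWS;
```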

Normal Strength. Apply the normal map to your material again and marvel at the added detail! Sometimes the lighting effect is too strong. There’s an easy way to adjust normal map strength without editing the texture.

Add a normal strength property in MyLit.shader; a range from zero to one is usually recommended.

Add the declaration to the common HLSL file.

Then, switch the UnpackNormal function with UnpackNormalScale, passing the strength.
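All three steps, sketched out:

```
// MyLit.shader, in the Properties block:
_NormalStrength("Normal strength", Range(0, 1)) = 1

// MyLitCommon.hlsl:
float _NormalStrength;

// MyLitForwardLitPass.hlsl, in Fragment:
float3 normalTS = UnpackNormalScale(packedNormal, _NormalStrength);
```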

That’s all there is to it!

The left uses OpenGL-style bitangents, where the bitangent points from bottom to top. The right uses DirectX-style bitangents, the opposite, where the bitangent points from top to bottom.

Debugging Normal Maps. There’s one more wrinkle you should know about. When choosing a normal map, URP and our shader assume the bitangent points in the direction of growing-V, or from the bottom to top of the normal map. Some other programs use the opposite bitangent. Make sure your normal map follows the “OpenGL” convention and you’re good to go.

Finally, in Unity 2021, we can do a bit more to support additional views in the rendering debugger. Set the tangent-to-world transformation matrix in lightingData, and the tangent space normal in surfaceData. (Even though it’s unused in 2020, normalTS does exist in SurfaceData in that version.)

Then, signal to the debugger that the shader supports normal maps by defining this _NORMALMAP keyword in the .shader file.
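For example (the tangentToWorld field on InputData exists in Unity 2021’s URP):

```
// MyLitForwardLitPass.hlsl, in Fragment:
lightingData.tangentToWorld = tangentToWorld;
surfaceData.normalTS = normalTS;

// MyLit.shader, in the forward lit pass block:
#define _NORMALMAP
```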

Now, the renderer debugger can display tangent space normals and remove normal mapping from lighting.

That’s the basics of normal mapping. There is complex math behind it, but URP does a good job helping you get by without thinking about it too much. Be sure to leverage normal maps to help your models shine.

The left side has metallic maps enabled. Notice the richer colors and sharper highlights around the nose.

Metallic Workflows. One cool aspect of PBR shaders is their ability to create metallic surfaces. URP’s UniversalFragmentPBR function makes metals really easy to implement in our shaders!

First, add a _Metalness float property in MyLit.shader, which ranges from zero to one.

In MyLitCommon.hlsl, declare the new property.

Then, in MyLitForwardLitPass.hlsl, sync the metallic field in surfaceData to this property.
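All three steps might look like this:

```
// MyLit.shader, in the Properties block:
_Metalness("Metalness", Range(0, 1)) = 0

// MyLitCommon.hlsl:
float _Metalness;

// MyLitForwardLitPass.hlsl, in Fragment:
surfaceData.metallic = _Metalness;
```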

In the scene editor, try it out! Notice how metallic strength changes the way highlights affect the material. It looks quite dark right now because reflections dominate lighting on metallic surfaces. We will set those up in the next part of this series!

Metalness Masks. Objects are rarely completely metallic, and it would be helpful to vary metalness across the mesh. This is pretty easy to do with another texture! Store metalness values in the texture and read them using UVs, like the color and normal map. Using a texture to turn on and off shader features is commonly called masking.

In your shader file, add another texture property for the metalness mask…

…and the new property to the common HLSL file.

In the Fragment function, sample the metalness mask. Since metalness is a float, only save the red channel’s value. Multiply that with the _Metalness property and set it in surfaceData.
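A sketch, with _MetalnessMask as an assumed property name:

```
// MyLit.shader, in the Properties block:
[NoScaleOffset] _MetalnessMask("Metalness mask", 2D) = "white" {}

// MyLitCommon.hlsl:
TEXTURE2D(_MetalnessMask); SAMPLER(sampler_MetalnessMask);

// MyLitForwardLitPass.hlsl, in Fragment:
float metalnessMask = SAMPLE_TEXTURE2D(_MetalnessMask, sampler_MetalnessMask, uv).r;
surfaceData.metallic = _Metalness * metalnessMask;
```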

Back in Unity, set a metallic texture in your material. Since this texture encodes special data and not colors, it’s a good idea to turn off the sRGB setting in the texture importer. This should be off for any texture that stores data, like masks or look-up textures.

Regardless, only areas where the mask is white should appear metallic.

You can verify this by looking at the metallic view mode in the rendering debugger (or outputting metallic strength). Compare this with your mask by placing the mask in the color map slot.

This violin uses the specular workflow to specify highlight colors per-metal.

The Specular Workflow. The Lit shader actually has two distinct modes to handle metallic surfaces: the metallic workflow and the specular workflow. The specular workflow uses a specular color texture to determine the color of specular highlights over the model. The brightness of the specular color controls the metalness at that point.

To enable the specular workflow, first #define the _SPECULAR_SETUP keyword in the forward lit pass block of MyLit.shader. We’ll also need properties for the specular texture and a tint.

Define them in MyLitCommon.hlsl too.

In MyLitForwardLitPass.hlsl’s Fragment function, sample the specular map and multiply the color by the specular tint. In specular workflow mode, UniversalFragmentPBR actually ignores the metallic value. Use a #if block to set the specular color only if _SPECULAR_SETUP is enabled, and metallic otherwise.
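Sketched out, with assumed property names:

```
// MyLit.shader, in the Properties block:
[NoScaleOffset] _SpecularMap("Specular map", 2D) = "white" {}
_SpecularTint("Specular tint", Color) = (1, 1, 1, 1)

// MyLit.shader, in the forward lit pass block (for now):
#define _SPECULAR_SETUP

// MyLitCommon.hlsl:
TEXTURE2D(_SpecularMap); SAMPLER(sampler_SpecularMap);
float4 _SpecularTint;

// MyLitForwardLitPass.hlsl, in Fragment:
#if defined(_SPECULAR_SETUP)
    surfaceData.specular = SAMPLE_TEXTURE2D(_SpecularMap, sampler_SpecularMap, uv).rgb * _SpecularTint.rgb;
    surfaceData.metallic = 0; // ignored in specular mode
#else
    surfaceData.specular = 1;
    surfaceData.metallic = _Metalness * metalnessMask;
#endif
```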

In the scene view, try adding a colorful specular texture and see how it affects things! Trippy. Notice that the specular highlight is kind of the inverse of the surface color — this is optics at work!

You can also inspect specular color in the renderer debugger.

You might have noticed we used a keyword to switch between metallic and specular workflows. It’s easy to add a property to toggle keywords on and off, no custom inspector code required!

In MyLit.shader, add a float property with the special Toggle attribute. The keyword name to enable and disable goes inside the attribute. The property name doesn’t really matter, just choose something relevant. Finally, set the property value to zero to disable the keyword by default.

In the forward lit pass block, replace the #define with a #pragma shader_feature_local_fragment.
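Like so:

```
// MyLit.shader, in the Properties block:
[Toggle(_SPECULAR_SETUP)] _SpecularWorkflow("Specular workflow", Float) = 0

// MyLit.shader, in the forward lit pass block, replacing the #define:
#pragma shader_feature_local_fragment _SPECULAR_SETUP
```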

In the scene editor, you can now easily switch between metallic and specular modes.

Smoothness Masks. While on the topic of specular lighting, it would be very nice to vary a model’s smoothness across its surface. We can do that with another mask texture, this time containing smoothness values.

Add a smoothness texture to the shader properties…

…declare it in the common file…

…and sample it in the forward lit pass fragment function. Using the red channel, multiply it with the preexisting _Smoothness property and set it in surfaceData.
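A sketch of all three steps, with _SmoothnessMask as an assumed name:

```
// MyLit.shader, in the Properties block:
[NoScaleOffset] _SmoothnessMask("Smoothness mask", 2D) = "white" {}

// MyLitCommon.hlsl:
TEXTURE2D(_SmoothnessMask); SAMPLER(sampler_SmoothnessMask);

// MyLitForwardLitPass.hlsl, in Fragment:
float smoothnessMask = SAMPLE_TEXTURE2D(_SmoothnessMask, sampler_SmoothnessMask, uv).r;
surfaceData.smoothness = _Smoothness * smoothnessMask;
```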

In the scene editor, try adding a smoothness mask. Again turn off sRGB. Pretty cool, these textures go a long way!

If you work with other 3D programs, you might see some use a gloss or glossiness map — this is just another name for a smoothness map. Some others use a roughness mask, which is just the inverse of smoothness. In other words, smoothness equals one minus roughness.
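For instance, with a hypothetical _RoughnessMask texture:

```
// Roughness is the inverse of smoothness, so invert the sample:
float roughness = SAMPLE_TEXTURE2D(_RoughnessMask, sampler_RoughnessMask, uv).r;
surfaceData.smoothness = _Smoothness * (1 - roughness);
```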

This is example code — you do not need to add it to your shader.

If you want to use roughness in your shader, just invert the texture sample before setting it in surfaceData. You could even add a property to optionally do this! However, for the rest of this tutorial, I will just support simple smoothness masks.

I know those of you with shader experience might be cringing at our use of separate textures for all these masks. At the end of this tutorial, I will explain how we can optimize a bit. Put this in the back of your mind for now.

Transparent Blending Modes. MyLit already has full transparency support, but URP’s lit shader has a few extra transparency modes we could add: additive, multiply, and premultiplied. Additive and multiply modes are very useful for particles, while premultiplied mode helps simulate glass.

Additive particles and multiplicative particles.

The bulk of the work lies in the custom inspector. Add a new BlendType enumeration with the four modes we’ll support: Alpha, Premultiplied, Additive, and Multiply. To store the currently selected blend mode, we’ll add another property, _BlendType, similarly to how we handled _SurfaceType. Get and set this new property in the OnGUI function.

The main difference between all these blend modes is the source and destination blend settings they employ. Remember, these instruct the renderer how to combine colors from the fragment function with colors on the screen. In UpdateSurfaceType, read the blendType property. In the switch statement where we set blend modes, add another switch in the TransparentBlend case. Inside, set the appropriate source and destination blend modes for each case.

The Alpha mode works like we’re used to, blending based on the alpha value of the source pixel. In this mode, the rasterizer multiplies the source color by the alpha value, the destination color by one minus the alpha value, and then adds them together.

The camera lens and viewfinder window use premultiplied alpha to simulate glass.

Premultiplied mode assumes alpha has already been multiplied with color and stored in a texture’s RGB values. That explains the name — alpha has been previously multiplied. In this case, the source blend type should be one, so alpha isn’t applied again, but the destination should still behave like normal. Premultiplied alpha gives artists better control over how a transparent object looks — effectively, brighter pixels are more opaque.

Additive mode is mathematically the inverse of Premultiplied mode. In this mode, the source is affected by alpha, but the destination is not. The destination color is only ever added to. Additive mode lightens the scene, and is great for particle effects like lightning and fire.

Finally, Multiply mode is for specialized use. It utilizes a new blend mode, SrcColor, which is simply the source pixel’s RGB values. Multiply mode completely ignores alpha, just multiplying the source and destination together. It tends to darken the scene; sometimes useful for otherworldly effects and masking.
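Here’s a sketch of that logic. It assumes the _SourceBlend and _DestBlend float properties from part three, and it needs using UnityEngine.Rendering for the BlendMode enum:

```
// MyLitCustomInspector.cs
public enum BlendType {
    Alpha, Premultiplied, Additive, Multiply
}

// In UpdateSurfaceType, inside the TransparentBlend case:
BlendType blendType = (BlendType)material.GetFloat("_BlendType");
switch (blendType) {
    case BlendType.Alpha:
        material.SetFloat("_SourceBlend", (float)BlendMode.SrcAlpha);
        material.SetFloat("_DestBlend", (float)BlendMode.OneMinusSrcAlpha);
        break;
    case BlendType.Premultiplied:
        material.SetFloat("_SourceBlend", (float)BlendMode.One);
        material.SetFloat("_DestBlend", (float)BlendMode.OneMinusSrcAlpha);
        break;
    case BlendType.Additive:
        material.SetFloat("_SourceBlend", (float)BlendMode.SrcAlpha);
        material.SetFloat("_DestBlend", (float)BlendMode.One);
        break;
    case BlendType.Multiply:
        material.SetFloat("_SourceBlend", (float)BlendMode.Zero);
        material.SetFloat("_DestBlend", (float)BlendMode.SrcColor);
        break;
}
```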

In MyLit.shader, don’t forget to add the new _BlendType property.
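For example:

```
// MyLit.shader, in the Properties block:
_BlendType("Blend type", Float) = 0
```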

Check the new modes out in the scene! It’s cool how we can get so many effects just by manipulating blending mode.

Premultiplied Mode Specular Opacity. URP further enhances Premultiplied mode by having lighting affect the material’s alpha. It’s again wonderful for glass, where specular highlights appear opaque. UniversalFragmentPBR handles everything; we only need to enable a keyword: _ALPHAPREMULTIPLY_ON.

Back in the custom inspector, enable or disable the keyword appropriately.

Then, in MyLit.shader, add a shader feature to the forward lit pass.
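A sketch of both pieces, assuming the SurfaceType enum and surfaceType variable from part three:

```
// MyLitCustomInspector.cs, in UpdateSurfaceType:
if (surfaceType == SurfaceType.TransparentBlend && blendType == BlendType.Premultiplied) {
    material.EnableKeyword("_ALPHAPREMULTIPLY_ON");
} else {
    material.DisableKeyword("_ALPHAPREMULTIPLY_ON");
}

// MyLit.shader, in the forward lit pass block:
#pragma shader_feature_local_fragment _ALPHAPREMULTIPLY_ON
```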

In the scene editor, try it out! The specular highlight definitely appears more opaque.

Emission gives the illusion that the headlights are glowing.

Emission. Some objects, like electronics or magical artifacts, have small glowing parts. It’s too expensive to use real lights for these fine details, but we can do our best with something called an emission texture! It defines glowing areas on a mesh.

Emission is easy to implement! Add an emission texture property in MyLit.shader, and an emission tint.

The HDR attribute flags the tint as a high-dynamic-range color, meaning that its components can take values greater than one. This is useful in combination with some post-processing effects, like bloom. We’ll talk more about it later in the tutorial series.

Add the new properties to the common HLSL file.

Then sample the emission texture in the forward lit Fragment function. Multiply it with _EmissionTint and store it in surfaceData.
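The whole feature, sketched with assumed names (note the black default tint, so emission is off until you raise it):

```
// MyLit.shader, in the Properties block:
[NoScaleOffset] _EmissionMap("Emission map", 2D) = "white" {}
[HDR] _EmissionTint("Emission tint", Color) = (0, 0, 0, 0)

// MyLitCommon.hlsl:
TEXTURE2D(_EmissionMap); SAMPLER(sampler_EmissionMap);
float4 _EmissionTint;

// MyLitForwardLitPass.hlsl, in Fragment:
surfaceData.emission = SAMPLE_TEXTURE2D(_EmissionMap, sampler_EmissionMap, uv).rgb * _EmissionTint.rgb;
```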

And that’s all you need to support emission. Test it out by grabbing an emission texture and adjusting the emission tint. Note that if emission tint is black, no emission will appear.

You can check out emission in the renderer debugger. If you want to disable it temporarily, there’s an option to do that with the lighting features enumeration.

Emission works by ignoring shadows and overexposing affected areas; it doesn’t actually light the scene, unfortunately. We can fix this by tying it into baked lighting, which will happen in part five of this series!

The right half of this video has parallax occlusion mapping, where UVs are offset based on viewing angle and a height map.

Parallax Occlusion Mapping. URP’s Complex Lit shader has a few cool options we should look at, the first being parallax occlusion. Parallax is a method of simulating depth by scrolling textures based on their distance from the camera. We can move UVs around to simulate small divots in our material!

The height map of the sand texture.

Let’s see how it works. First, we need another texture called a height or displacement map to determine which areas are depressed below the surface. White pixels are at the mesh’s surface, while black pixels are lowered below the surface.

The blue line is the view ray. It casts through the original UV. The collision with the deformed surface gives a new UV.

Next, pretend this height map actually deforms the mesh. When evaluating the parallax for a specific UV coordinate, cast a ray from the camera, through the UV, until it hits the imaginary deformed surface. Project the impact point back onto the texture and use that new UV coordinate to sample the color map, normal map, etc.

In this way, UVs change based on view direction — more for areas depressed by the height map. The effect gives the illusion of depth.

Luckily, all the math behind the raycasting and such is handled by URP functions. To get started, add a height map and parallax strength property to MyLit.shader.

In MyLitCommon.hlsl, add declarations for these new properties.

In MyLitForwardLitPass.hlsl, rearrange code so the normal and view direction are calculated before any texture sampling. We need to calculate a new UV based on these values! In the process, use these positionWS and viewDirectionWS variables throughout the function, instead of the values in input.

The parallax occlusion mapping algorithm needs view direction in tangent space, since the imaginary surface formed by the height map exists in tangent space. URP provides a GetViewDirectionTangentSpace function to do that. Include SRP’s ParallaxMapping.hlsl file to make it available, and pass it the world space tangent (with bitangent sign), normal, and view direction vectors. GetViewDirectionTangentSpace needs the vertex normal in an unnormalized state, an important bit of info to keep in mind!

Now call the magic function, ParallaxMapping, to do the work. It requires the height map and sampler. To pass textures and samplers to functions, you must use this special macro: TEXTURE2D_ARGS. It exists to get around platform differences, similarly to texture declarations.

Then, pass ParallaxMapping the tangent-space view direction, parallax strength property, and current UV. ParallaxMapping returns the amount to offset UVs, according to the parallax algorithm explained above. Add the offset to the current UV, and be sure to use this new UV while sampling all other textures.
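Putting the whole parallax setup together, as a sketch with assumed property names:

```
// MyLit.shader, in the Properties block:
[NoScaleOffset] _HeightMap("Height map", 2D) = "white" {}
_ParallaxStrength("Parallax strength", Range(0, 1)) = 0.005

// MyLitCommon.hlsl:
TEXTURE2D(_HeightMap); SAMPLER(sampler_HeightMap);
float _ParallaxStrength;

// MyLitForwardLitPass.hlsl, near the top:
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/ParallaxMapping.hlsl"

// In Fragment, before sampling any other textures:
float3 viewDirWS = GetWorldSpaceNormalizeViewDir(input.positionWS);
// Pass the unnormalized vertex normal here.
float3 viewDirTS = GetViewDirectionTangentSpace(input.tangentWS, input.normalWS, viewDirWS);
float2 uv = input.uv + ParallaxMapping(TEXTURE2D_ARGS(_HeightMap, sampler_HeightMap),
    viewDirTS, _ParallaxStrength, input.uv);
```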

Check it out in the scene! Find a height map and set the parallax strength. It doesn’t take much for a really strong effect! Usually a parallax strength around 0.05 is plenty.

This is as far as Complex Lit’s parallax mapping goes, but there’s a lot more we could do with it. Would you be interested in learning more in another tutorial series? Please let me know.

The right side has a clear coat mask applied.

Clear Coat. Besides parallax, there’s another advanced feature hidden away in the Complex Lit shader: clear coat! You know the shiny coating on car paint? Clear coat can recreate that effect.

Like emission before, it’s easy to implement. Add a clear coat strength mask and float property. Clear coat emulates a transparent coat of paint on top of a surface, so it can have its own, independent smoothness value. Add a clear coat smoothness mask and strength property as well. In the forward lit pass block, define the _CLEARCOATMAP keyword so UniversalFragmentPBR includes the calculation.

In MyLitCommon.hlsl, define the new textures and float properties.

In MyLitForwardLitPass.hlsl, sample the masks, taking their red channels and multiplying with their float properties. Set the clearCoatMask and clearCoatSmoothness fields in surfaceData.
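A sketch with assumed property names:

```
// MyLit.shader, in the Properties block:
[NoScaleOffset] _ClearCoatMask("Clear coat mask", 2D) = "white" {}
_ClearCoatStrength("Clear coat strength", Range(0, 1)) = 0
[NoScaleOffset] _ClearCoatSmoothnessMask("Clear coat smoothness mask", 2D) = "white" {}
_ClearCoatSmoothness("Clear coat smoothness", Range(0, 1)) = 1

// MyLit.shader, in the forward lit pass block:
#define _CLEARCOATMAP

// MyLitCommon.hlsl:
TEXTURE2D(_ClearCoatMask); SAMPLER(sampler_ClearCoatMask);
TEXTURE2D(_ClearCoatSmoothnessMask); SAMPLER(sampler_ClearCoatSmoothnessMask);
float _ClearCoatStrength;
float _ClearCoatSmoothness;

// MyLitForwardLitPass.hlsl, in Fragment:
surfaceData.clearCoatMask = SAMPLE_TEXTURE2D(_ClearCoatMask, sampler_ClearCoatMask, uv).r * _ClearCoatStrength;
surfaceData.clearCoatSmoothness = SAMPLE_TEXTURE2D(_ClearCoatSmoothnessMask, sampler_ClearCoatSmoothnessMask, uv).r * _ClearCoatSmoothness;
```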

That’s basically it! Clear coat is useful for many different objects, like cars, furniture, or even candy. It is expensive though, basically requiring two BRDF calculations per material. Consider leaving this out if you don’t need it.

Optimizing Texture Use. With clear coat under our belt, we support just about all surface options built into URP. However, our shader has ballooned quite a bit, and now is a good time to think about slimming it down.

In future parts of this tutorial series, I’m going to stick with this general purpose, unoptimized shader. However, once you know exactly which features your personal shader requires, revisit the techniques in this section to speed it up and make it easier to use.

Step one: figure out which features you need. Always use the metallic workflow? Remove support for the specular workflow, including the specular color texture and tint. Removing expensive features you don’t use, like clear coat, parallax, or normal mapping, is especially worth it. It’s also possible to toggle a feature on or off using a keyword.

Optional Features. For example, say only some of our materials need normal mapping. We can set up the inspector to disable a keyword when no normal map texture is assigned.

In MyLitCustomInspector’s UpdateSurfaceType function, use GetTexture on the material. Enable or disable the _NORMALMAP keyword based on whether it returns null.

In order to call UpdateSurfaceType when the normal map changes, move the base.OnGUI call in between the BeginChangeCheck and EndChangeCheck calls.

In MyLit.shader, change the #define to a #pragma shader_feature.

In MyLitForwardLitPass.hlsl, use the mesh’s normal vector if _NORMALMAP is not defined. This corresponds to a tangent-space normal vector of float3(0, 0, 1).
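All three pieces as one sketch:

```
// MyLitCustomInspector.cs, in UpdateSurfaceType:
if (material.GetTexture("_NormalMap") == null) {
    material.DisableKeyword("_NORMALMAP");
} else {
    material.EnableKeyword("_NORMALMAP");
}

// MyLit.shader, in the forward lit pass block, replacing the #define:
#pragma shader_feature _NORMALMAP

// MyLitForwardLitPass.hlsl, in Fragment:
#if defined(_NORMALMAP)
    float3 normalTS = UnpackNormalScale(
        SAMPLE_TEXTURE2D(_NormalMap, sampler_NormalMap, uv), _NormalStrength);
#else
    float3 normalTS = float3(0, 0, 1);
#endif
```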

If you go a step further and remove parallax mapping, you have no need of tangents at all. It’s important for performance to keep the Interpolators struct as small as possible, since each field is more data the rasterizer has to interpolate. Using a keyword there works great.

Four mask textures combined into one.

Texture Channel Packing. Another useful technique is texture channel packing. Notice how the various mask textures only use the texture’s red channel. What if we combined several masks into one texture? For instance, we can put the metallic mask into red, smoothness into green, clear coat strength into blue, and clear coat smoothness into alpha.

Metalness, smoothness, clear coat, and clear coat smoothness are packed into a single texture.

Texture sampling is a big resource hog, especially on mobile platforms, and it’s very nice to merge four samples into one! You can construct this texture using Photoshop or any other photo editing software. Treat each mask as a grayscale texture and place them each in one of the color channels.
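In the shader, the four samples then collapse to one. A sketch with a hypothetical packed texture named _MaskMap:

```
float4 mask = SAMPLE_TEXTURE2D(_MaskMap, sampler_MaskMap, uv);
surfaceData.metallic = _Metalness * mask.r;
surfaceData.smoothness = _Smoothness * mask.g;
surfaceData.clearCoatMask = _ClearCoatStrength * mask.b;
surfaceData.clearCoatSmoothness = _ClearCoatSmoothness * mask.a;
```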

A simple opaque shader where the smoothness mask is packed into the color map’s alpha channel.

Many times, alpha channels aren’t used — they’re prime real estate for masks. If your material is always opaque, you can hide a mask in the color map’s alpha channel. The specular color and emission texture’s alpha channels are usually unused as well!

Heads up! We will be adding one more mask texture later on in this series: an occlusion mask. This controls the strength of ambient lighting across the model. If you go ahead and optimize your shader now, leave room for this texture as it’s quite important.

Adding cool surface features is one of my favorite aspects of shader development. Now, we can make metal objects, glass objects, bumpy objects, and a brand new car!

We’ve done a lot with our own material, it’s time to look outward! In the next tutorial, I will focus on a hotly anticipated topic: advanced lighting. We will learn how to support multiple point and cone lights, baked lights, occlusion textures, reflection probes, light cookies and more!

For reference, here are the final versions of the shader files after completing this tutorial, without any optimizations (as mentioned before).

If you enjoyed this tutorial, consider following me here to receive an email when the next part goes live.

If you want to see this tutorial from another angle, I created a video version you can watch here.

I want to thank my next gen patrons Michael Samuels and Kasey Vann for all their support, as well as all my patrons during the development of this tutorial: Adam R. Vierra, Alexander Thornborough, AlmostFamous Bit, Amin, Anja Irniger, Arturo Pulecio, Ben Luker, Ben Wander, Bert, bgbg, Bohemian Grape, Boscayolo, Brannon Northington, Brian Knox, Brooke Waddington, Cameron Horst, Charlie Jiao, Christopher Ellis, CongDT7, Connor Wendt, Crubidoobidoo, Dan Pearce, Daniel Sim, Danik Tomyn, Davide, Derek Arndt, Drew O’Meara, Elmar Moelzer, Eta Nol, Fabian Watermeier, Henry Chung, Howard Day, Isobel Shasha, Jack Phelps, Jay Liu, Jean-Sébastien Habigand, John Lism Fishman, John Luna, Jon Mayoh, Joseph Hirst, JP Lee, jpzz kim, JY, Kat, Kel Parke, Kyle Harrison, Lasserino, Leah Lycorea, Levino Agaser, lexie Dostal, Lhong Lhi, Lien Dinh, Lukas Schneider, Mad Science, Makoto Fujiwara, Marcin Krzeszowiec, Mattai, Max Thorne, Michael Samuels, Minh Triết Đỗ, MrShiggy, Nazarré Merchant, NotEvenAmatueR Streams, Oliver Davies, P W, Parking Lot Studio, Patrick, Patrik Bergsten, Phoenix Smith, rafael ludescher, Richard Pieterse, Robin Benzinger, Sailing On Thoughts, Sam CD-ROM, Samuel Ang, Sandro Traettino, santhosh, SausageRoll, SHELL SHELL, Shot Out Games, Simo Kemmer, Simon Jackson, starbi, Steph, Stephan Maier, Sung yup Jo, Syll art-design, T, Taavi Varm, Team 21 Studio, thearperson, Thomas Terkildsen, Tim Hart, Tomasz Patek, Tortilla Laser, ultraklei, vabbb, vertex, Vincent Loi, Voids Adrift, Wei Suo, Wojciech Marek, Yuriy T., and 杉上哲也.

If you would like to download all the shaders showcased in this tutorial inside a Unity project, consider joining my Patreon. You will also get early access to tutorials, voting power in topic polls, and more. Thank you!

If you have any questions, feel free to leave a comment or contact me at any of my social media links:

🔗 Tutorial list website ▶️ YouTube 🔴 Twitch 🐦 Twitter 🎮 Discord 📸 Instagram 👽 Reddit 🎶 TikTok 👑 Patreon ☕ Ko-fi 📧 E-mail: nedmakesgames gmail

Thanks so much for reading, and make games!

Change log:

  • May 29th 2023: Update custom inspector code to add support for Unity 2022’s material variants.

©️ Timothy Ned Atton 2023. All rights reserved.

All code appearing in GitHub Gists is distributed under the MIT license.

Timothy Ned Atton is a game developer and graphics engineer with ten years’ experience working with Unity. He is currently employed at GOLF+ working on the VR golf game, GOLF+. This tutorial is not affiliated with nor endorsed by GOLF+, Unity Technologies, or any of the people and organizations listed above. Thanks for reading!
