The Graphics Pipeline and You: Writing Unity URP Shaders with Code Part 1

NedMakesGames
24 min read · Apr 12, 2022


Video version of this tutorial.

Hi, I’m Ned, and I make games! Would you like to start writing shaders but don’t know where to start? Or, have you encountered a limitation of URP’s Shader Graph you need to overcome? In this tutorial series, I’ll walk you through writing a fully featured, general purpose shader for Unity’s Universal Render Pipeline — exclusively in code.

This tutorial is broken into nine sections, outlined below. You’re reading the first part, where I introduce shaders and how to write them. As I publish future sections, I will update this page with links! You can also subscribe here to receive a notification when I finish another part.

  1. Introduction to shaders: simple unlit shaders with textures.
  2. Simple lighting and shadows: directional lights and cast shadows.
  3. Transparency: blended and cut out transparency.
  4. Physically based rendering: normal maps, metallic and specular workflows, and additional blend modes.
  5. Advanced lighting: spot, point, and baked lights and shadows.
  6. Advanced URP features: depth, depth-normals, screen space ambient occlusion, single pass VR rendering, batching and more.
  7. Custom lighting models: accessing and using light data to create your own lighting algorithms.
  8. Vertex animation: animating meshes in a shader.
  9. Gathering data from C#: additional vertex data, global variables and procedural colors.

The examples in this tutorial were tested in Unity 2020.3, 2021.3, and 2022.3. As you follow the tutorial, you will come across many features only possible in Unity 2021 or later. However, I have written the shaders so the same code runs in all of these versions.

Let’s lay out some goals of this tutorial series. My aim is to teach how to write shaders. I will show several examples, writing them step by step and explaining as we go. Do not view these as ready-for-production shaders, but rather as blueprints you can customize to your game’s needs.

After completing the series, you’ll know how to write your own version of URP’s standard lit shader, as well as customize it with your own lighting algorithm and more. You’ll also know several optimization techniques and how to leverage Unity’s debugging tools to diagnose and fix rendering bugs.

There’s quite a lot to set up and understand before anything displays on screen, which can make learning shaders difficult. To speed up the process, I’ll only explain information as needed, and I might gloss over some edge cases or technicalities. Don’t worry, I’ll address them later if they become important.

To keep this series from becoming even longer, I will assume you have some general game development knowledge. You should be comfortable using Unity and its 3D game tools - models, textures, materials, and mesh renderers. And, although you need no prior experience writing shaders, you should know how to program. C# experience will definitely be useful.

With all that out of the way, let’s get started!

Project Set Up. In this series, we will use Unity’s Universal Render Pipeline. It’s one of several rendering frameworks Unity provides. I chose URP for this project because it’s modern and lightweight while supporting a wide variety of platforms.

I would highly suggest creating a fresh project for this tutorial. You can either select the URP project template…

Or use the blank template, add URP manually through the package manager, and activate it in Graphics settings. In the settings object, make sure that “Depth Priming Mode” is set to “Disabled” and that the rendering mode is “Forward.”

Either way, create a new standard scene to work with.

The Anatomy of a Shader. Large shaders, like the one we will write, are made of several files. To keep things organized, create a new folder called “MyLit” to contain them all. Create a shader file by selecting “Unlit Shader” from the create dialog, then name it “MyLit.shader.”

If you create a material, you’ll see that your shader already shows up in the shader selection menu of the material inspector (under the Unlit submenu).

Open the shader file in your code editor of choice. Sometimes Unity doesn’t generate Visual Studio projects if there are no C# scripts present, so create an empty C# script if your shader doesn’t appear in the solution explorer.

Regardless, inside “MyLit.shader” is a lot of automatically generated code — delete it all for now. This part of a shader is written in a language called ShaderLab, and it defines meta information about the drawing code.

This first line opens a shader block and defines the shader’s name in the material inspector. Any slashes will create subsections in the selection menu — useful for organization. The block is bound by curly braces, like classes in C#.
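In MyLit.shader, that looks something like this (the exact name and menu path are up to you; I’m using a sketch here):

    Shader "NedMakesGames/MyLit" {
        // Everything else goes inside this block.
    }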

A shader is more than just code that draws. A single shader is actually made of many — sometimes thousands — of smaller functions. Unity can choose to run any of them depending on the situation. They’re organized into several subdivisions. The top-most subdivisions are known as subshaders. Subshaders allow you to write different code for different render pipelines. Unity automatically picks the correct subshader to use.

Define a subshader with a SubShader block, and then add a Tags block to set the render pipeline. Tags blocks hold user-defined metadata in a format kind of like a C# dictionary.

Tell Unity to use this subshader when the Universal Rendering Pipeline is active by setting “RenderPipeline” to “UniversalPipeline.” That’s the only subshader we’ll ever need in this tutorial.
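Here’s a sketch of the subshader so far, sitting inside the shader block from before:

    SubShader {
        // Unity uses this subshader when the Universal Render Pipeline is active.
        Tags { "RenderPipeline" = "UniversalPipeline" }
    }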

Subshaders are just the first subdivision; below them are passes, which have narrower jobs. Each pass has a specific role in drawing the entire scene — like calculating lighting, casting shadows, or computing special data for post processing effects. Unity expects all shaders to have specific passes to enable all of URP’s features. For now, let’s focus on the most important pass: the one that draws a material’s lit color.

To signal that this pass will draw color, add a Tags block inside. The pass type key is “LightMode”, and the value for our lit color pass is “UniversalForward.” You can also name passes, which helps a lot when debugging.

OK, we’re almost ready to write some code. URP shader code is written in a language called HLSL, which is similar to a streamlined C++. To mark a section of a shader file as containing HLSL, surround it with the HLSLPROGRAM and ENDHLSL code words.
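Putting that together, the pass might look roughly like this, sitting inside the SubShader block (I’ve named it “ForwardLit,” but the name is up to you):

    Pass {
        Name "ForwardLit" // Helpful when debugging.
        Tags { "LightMode" = "UniversalForward" } // This pass draws lit colors.

        HLSLPROGRAM
        // HLSL code will go here.
        ENDHLSL
    }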

To keep things organized, I like to keep HLSL code in a separate file from the .shader metadata. Thankfully, this is easy to do. You can’t create an HLSL file directly in Unity, but you can in Visual Studio (choose any template and change the extension to “.hlsl”) or your file system (create a text file and change the extension to “.hlsl”).

Name this new file “MyLitForwardLitPass.hlsl,” and open it in your code editor. I just want to mention that many code editors don’t work well with URP shaders. While working through this tutorial, ignore any errors you see in the code editor and rely on Unity’s console instead.

The Graphics Pipeline. When writing shaders, you need to have a different mindset than you would working with C#. For one, there’s no “heap,” meaning most variables work like numeric primitives. You also can’t define classes or methods, or use inheritance. Structs and functions are still available to help organize your code!

Here is an example of simple code implemented similarly in C# and HLSL. HLSL has no support for methods, inheritance, polymorphism, or heap variables.
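As a rough illustration (not part of our shader), here’s how a tiny piece of logic might look in HLSL. In C#, you might write a class with a method; in HLSL, you’re limited to structs and free functions:

    // Structs are just bundles of values, passed by value. No classes, no methods.
    struct Rectangle {
        float width;
        float height;
    };

    // Logic lives in standalone functions instead of methods.
    float CalculateArea(Rectangle rect) {
        return rect.width * rect.height;
    }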

If you’ve ever worked with data driven design, you will feel at home writing shaders. The focus is gathering data and transforming it from one form to another.

In the broadest sense, shaders transform meshes, materials, and orientation data into colors on the screen.

There are two standard functions which the system calls, kind of like Start and Update in MonoBehaviours. They’re called the vertex and fragment functions. Every pass must have one of each.

These both transform data from one form to another. The vertex function takes mesh and world position data and transforms it into positions on the screen. The fragment function takes those positions, as well as material settings, and produces pixel colors.

Unity’s rendering system employs something called the graphics pipeline to link together these functions and handle low level logic. The pipeline gathers and prepares your data, calls your vertex and fragment functions, and displays the final colors on the screen. It’s made of several stages, running one after another, like an assembly line.

Each stage has a specific job, transforming data for stages down the assembly line. There are two special stages, the vertex and fragment stages, which are “programmable,” running the vertex and fragment functions you write. The others are not programmable and run the same code for all shaders, although you can influence them with various settings.

Let’s take a look at each stage, starting at the beginning: the input assembler. It prepares data for the vertex stage, gathering data from meshes and packaging it in a neat struct. Structs in HLSL are very similar to those in C# — pass-by-value variables containing various data fields. This struct is customizable — you can determine what data the input assembler gathers by adding fields to it.

Each vertex has data for each field: position, normal, and UVs.

What data can the input assembler access? The input assembler works with meshes, specifically mesh vertices. Each vertex has a bunch of data assigned to it, such as a position, normal vector, texture UVs, etc. Each data type is known as a “vertex data stream.” To access any stream in your input structure, you simply need to tag it and the assembler will automatically set it for you.

For example, here’s an input struct for our forward pass’s vertex function. It defines a struct called “Attributes”. It has a field called “position,” with a type called “float3.” float3 is the HLSL term for a C# Vector3, or a vector containing three float numbers.

Use the POSITION semantic to access position data.

Use semantics to tag variables — the input assembler will automatically set them to a particular vertex data stream. For example, the POSITION semantic corresponds to vertex position. Keep in mind, only the semantic determines what data the input assembler will choose — the variable name does not matter. Feel free to name variables however you wish. We’ll see more semantics later on — HLSL uses them often to help the graphics pipeline.
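Here’s what the struct might look like in MyLitForwardLitPass.hlsl:

    // The input assembler fills this struct using the mesh's vertex data streams.
    struct Attributes {
        float3 position : POSITION; // The vertex position, in object space.
    };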

With that, let’s move to the vertex stage. As a programmable stage, you get to define the code that runs here.

Defining a function in HLSL is very similar to C#, with a return type — void this time — a function name, and a list of arguments. An argument’s type precedes the variable name. This function only needs a single argument of Attributes type.
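For now, the vertex function is just an empty stub:

    // Runs once per vertex. We'll fill this in shortly.
    void Vertex(Attributes input) {
    }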

The vertex stage’s primary objective is to compute where mesh vertices appear on screen. However, notice that the Attributes struct only contains a single position — only data for a single vertex. The render pipeline actually calls the vertex function multiple times, passing it data for each vertex until all are placed on screen. In fact, many calls will run in parallel!

If you’ve ever programmed multi-threaded systems, you know that parallel processing can introduce a lot of complexity. Shaders bypass much of this by forbidding storage of state information. Because of this, each vertex function call is effectively isolated from all others. You cannot pass the result of one vertex function — or any data computed inside — to another. Each can only depend on data in the input struct (and other global data; more on this later).

In addition, each vertex function call only knows about data for a single vertex. This is for efficiency: the GPU doesn’t have to load an entire mesh at once.

Viewing a vertex’s object space position in Blender.

We need to compute the screen position for the vertex described in the Attributes struct. When talking about positions, it’s important to know which coordinate system they’re defined in: their “space.” The position vertex data stream gives values in object space, which is another name for the local space you’re accustomed to in Unity’s scene editor. If you view a mesh in a 3D modeling program, these are also the positions it displays.

Unity’s Transform component describes how to move a vertex from object to world space.

Another common space is world space. This is a common space that all objects exist in. To get world space from object space, just apply an object’s Transform component. Unity provides this data to shaders, as we’ll see soon.

However, a vertex’s position on screen is described using a space called “clip space.” An explanation of clip space could fill an entire tutorial, but luckily we don’t have to work with it directly. URP provides a nice function to convert an object space position into clip space. To access it, we first need to include the URP shader library.

In HLSL, we can reference any other HLSL file with “#include” directives. These commands tell the shader processor to read the file located at a given location and copy its contents onto this line. If you’re curious what’s inside Lighting.hlsl, or any other URP source file, you can read it yourself in the packages folder.
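At the top of MyLitForwardLitPass.hlsl, include Lighting.hlsl from the URP package:

    // Pull in the URP shader library.
    #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Lighting.hlsl"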

Lighting.hlsl in Unity 2020.3

Included files can themselves #include many other files, leading to a kind of tree structure. For instance, Lighting.hlsl will pull in many helpful functions from across the entire URP library.

From the URP shader library. You don’t need this code in your shader.

One such function, GetVertexPositionInputs, is located in ShaderVariablesFunctions.hlsl. Its source code isn’t important now, but it returns a structure containing the passed object space position converted into various other spaces. Clip space is one of them!

Note that the clip space position is a float4. If you tried to store it in a float3, Unity would give you a warning that data will be truncated — or lost. This is a common source of bugs, so always heed these warnings and use the correct vector size!

Keeping track of which space a position is in can get tricky, fast! Standard URP code adds a suffix to all position variables indicating the space. “OS” denotes object space, “CS” clip space, etc. Let’s follow this pattern as well.

Next, we must fulfill the vertex stage’s job and output the clip space position for the input vertex. To do that, define another struct, called “Interpolators,” to serve as the vertex stage’s return type. Write a float4 positionCS field inside with the SV_POSITION semantic. The semantic signals that this field contains clip space vertex positions.

Have the Vertex function return an Interpolators struct, declare a variable of Interpolators type, set the positionCS field, and return the structure.
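Putting the last few paragraphs together, the structs and vertex function now look something like this (note the OS and CS suffixes):

    struct Attributes {
        float3 positionOS : POSITION; // Vertex position in object space.
    };

    struct Interpolators {
        float4 positionCS : SV_POSITION; // Vertex position in clip space.
    };

    Interpolators Vertex(Attributes input) {
        Interpolators output;

        // Convert the object space position into other spaces, including clip space.
        VertexPositionInputs posnInputs = GetVertexPositionInputs(input.positionOS);
        output.positionCS = posnInputs.positionCS;

        return output;
    }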

With that, the vertex stage is complete. The next stage in the rendering pipeline is called the rasterizer. The rasterizer takes vertex screen positions and calculates which of the mesh’s triangles appear in which pixels on screen. If a triangle is entirely off screen, the renderer is smart enough to ignore it!

The rasterizer will identify the light-gray pixels as covering the triangle and pass this data down the pipeline.

The rasterizer then gathers data for the next stage in the pipeline: the fragment stage.

The fragment stage is also programmable, and the fragment function runs once for every pixel the rasterizer determines contains a triangle.

The fragment function calculates and outputs the final color each pixel should display, but of course, each call only handles one pixel.

The fragment function takes a struct as an argument, which contains data output from the vertex function. Naturally, the argument’s type should match the vertex function’s return type.

The values inside input have been modified by the rasterizer. For instance, positionCS no longer contains clip space position, but rather this fragment’s pixel position.

Pixel positions for fragments. Pixel (0, 0) is at the bottom left in this diagram.

You can pass other data from the vertex function to the fragment function through the Interpolators struct, a technique we’ll investigate later on.

The color yellow is encoded as the vector (1, 1, 0, 1). Red = 1, green = 1, blue = 0, and alpha (opaqueness) = 1.

The fragment function outputs a float4 — the color of the pixel. It may be strange to think about, but colors are just vectors as well. They contain a red, green, blue, and alpha value, each ranging from zero to one.

To let the pipeline know we’re returning the final pixel color, tag the entire function with the SV_TARGET semantic. When tagging a function with a semantic, the compiler interprets the return value as having that semantic.

So we can finally display something on screen, let’s just color all pixels white. Return a float4 with all components set to one. Note that you don’t need to write “new” in HLSL when constructing vectors — just the type name is fine.
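The fragment function ends up looking something like this:

    // Runs once for each pixel the rasterizer found to be covered by a triangle.
    // SV_TARGET marks the return value as the final pixel color.
    float4 Fragment(Interpolators input) : SV_TARGET {
        // Solid white for now. No "new" keyword needed to construct a vector.
        return float4(1, 1, 1, 1);
    }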

The last stage in the graphics pipeline is the presentation stage. It takes the output of the fragment function and, together with information from the rasterizer, colors all pixels accordingly.

The completed “MyLitForwardLitPass.hlsl” file — for now!

There’s one last thing to do to complete a functioning shader: we need to register our vertex and fragment functions to a shader pass. Open your MyLit.shader file.

The completed “MyLit.shader” file — also for now!

Tell the compiler to read the code inside your MyLitForwardLitPass.hlsl file using a #include command. Next, register the vertex and fragment functions using a “#pragma” command. #pragma has a variety of uses relating to shader metadata, but the vertex and fragment subcommands register the corresponding functions to the containing pass.

Make sure that the function names match those in your HLSL file!
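With the include and pragmas in place, MyLit.shader reads roughly like this:

    Shader "NedMakesGames/MyLit" {
        SubShader {
            Tags { "RenderPipeline" = "UniversalPipeline" }

            Pass {
                Name "ForwardLit"
                Tags { "LightMode" = "UniversalForward" }

                HLSLPROGRAM
                // Read the HLSL code from the other file.
                #include "MyLitForwardLitPass.hlsl"

                // Register the vertex and fragment functions for this pass.
                #pragma vertex Vertex
                #pragma fragment Fragment
                ENDHLSL
            }
        }
    }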

And now you’re finally ready to view your shader! Make sure your material has MyLit selected, create a sphere in your scene, and give it the material. It should now appear as a flat white circle.

If there’s any issue, check Unity’s console and the shader asset to see if there are any errors.

Adding Color with Material Properties. We have a flat white circle now — a good starting place! Let’s make the color adjustable from the material inspector. This is possible through a system Unity calls “material properties.”

Material properties are essentially HLSL variables that can be set and edited through the material inspector. They are specified per material and allow objects using the same shader to look different. If you’re wondering “What is the difference between a shader and a material?,” this is it! A material is a shader with specific property settings.

We can define properties inside the .shader file with a Properties block. The syntax for these is… inconsistent, but I will explain. To define a color property, first decide on a reference name. This is how you’ll access this property in HLSL. By convention, properties have an underscore prefix.

Follow that with a parentheses pair, like you’re writing function arguments. The first argument is a string. This is the label — how it will display in the material inspector. The next argument is the property type. There are various types, but “Color” defines a color property.

Close the parentheses and set a default value. The syntax is different for each property type, but for colors, start with an equals sign and then the red-green-blue-alpha values inside parentheses.

As mentioned before, colors are just four float values, each corresponding to a color channel — red, green, blue and alpha. Each number ranges from zero to one, where white is all ones and black is all zeroes. For alpha, 1 is opaque and 0 is invisible. If you’d like more info on how these numbers combine to create a color, check this link.
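Putting that syntax together, the Properties block looks something like this (the “Tint” label and the white default are just my choices):

    // Goes inside the Shader block, above the SubShader.
    Properties {
        // Reference name("Inspector label", Type) = default value
        _ColorTint("Tint", Color) = (1, 1, 1, 1)
    }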

You can now see your property in the material inspector!

Later, this shader will have many properties, so add a header label denoting surface options. To do that, use a Header command. Strangely, the label is not enclosed in quotation marks here.

There’s one last thing we should do. Properties can also be tagged with attributes, like classes in C#, which give the properties special features. Tag _ColorTint with [MainColor]. Now it’s possible to easily set this property from C# using Material.color.
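With the header and attribute added, the property reads like this:

    Properties {
        [Header(Surface Options)] // Draws a label in the inspector. No quotes here!
        [MainColor] _ColorTint("Tint", Color) = (1, 1, 1, 1)
    }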

The property is set up, but the value is not reflected on screen. Open MyLitForwardLitPass.hlsl.

Although we defined a property in ShaderLab, we also must define it in HLSL — make sure the reference names match exactly. Unity will automatically synchronize this variable with the material inspector.
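Near the top of the file, below the include, declare a matching variable:

    // Unity automatically fills this with the material's _ColorTint property.
    float4 _ColorTint;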

The fragment function can also access material properties.

Earlier, I said that vertex and fragment functions could only access data from a single vertex or fragment. While this is true, they can also access any material properties. These variables are “uniform,” meaning they don’t change while the pipeline is running. Unity sets them before the pipeline begins, and you cannot modify them from a vertex or fragment function.

With this in mind, have the fragment function return _ColorTint as the final color.
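The fragment function becomes a one-liner:

    float4 Fragment(Interpolators input) : SV_TARGET {
        return _ColorTint;
    }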

Return to the scene editor, select your material and change the color tint property. The shader should immediately reflect your choice!

Varying Colors with Textures. Flat colors are great, but I’d like to vary color across the sphere. We can do this with textures!

Zoom in close to a texture (without blending) and the fact that they’re just 2D arrays becomes more apparent. Each position in the array holds a color.

Shaders love working with textures. They’re just image files, but shaders think of them as 2D arrays of color data. To add a texture to a shader, first add a texture material property.

Texture Properties. Defining a texture property is much the same as a color property. Instead of listing the type as Color, set it as 2D. The syntax for default textures is strange. Following the equals sign, type “white” (with quotes) followed by a pair of curly braces. If no texture is set in the material inspector, Unity will fill this property with a small, white texture. You can also set the default color to “black,” “gray,” or “red.”

Similarly to the [MainColor] attribute, there is a [MainTexture] attribute. Tagging this property makes it easily assignable from C# using the Material.mainTexture field.
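The texture property sits alongside the tint in the Properties block (again, the label is my choice):

    [MainTexture] _ColorMap("Color", 2D) = "white" {}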

Your property should show up in the material inspector now. Notice the four numbers beside it. They allow you to set an offset and scale for this texture, which is useful for tiling.

The right sphere has horizontal tiling. (For demonstration, your shader will not do this yet.)

Textures and UVs in HLSL. Now, let’s take a look at the HLSL side of things.

Defining a texture variable is a bit more complicated than colors. You must use this special syntax to define a 2D texture variable. Once again, the name must match the property reference exactly.

TEXTURE2D here is not a type but something called a “macro.” You can think of macros similarly to functions, except they run on the text making up code. You can create macros yourself using the #define command.

Here’s a sketch of what a simple macro might look like (the names are made up; you don’t need to add this to your shader):
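    // Every occurrence of DOUBLE(...) in the code is replaced with the text below.
    #define DOUBLE(value) ((value) * 2)

    // After the preprocessor runs, this line reads: float x = ((3) * 2);
    float x = DOUBLE(3);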

Before compiling, the system will search for any text matching a defined macro and replace the macro name with the text you specify. Macros can also have arguments. The system will replace any occurrences of argument names in the macro definition with whatever text you pass in.

This is a very simple overview of macros, but they can be quite useful in shader code. HLSL does not have inheritance or polymorphism, so if you want to work with any structure that has a positionOS field but you don’t necessarily know the structure type, a macro can do the trick.

For instance, a sketch along these lines (the macro name is made up; you don’t need to add this to your shader):
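    // Works with Attributes, or any other struct that has a positionOS field.
    #define GET_POSITION_OS(v) (v).positionOS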

They’re also great at handling platform differences, which is what Unity has done with TEXTURE2D. See, different graphics APIs (DirectX, OpenGL, etc.) use different type names for textures. To make shader code platform independent, Unity provides a variety of macros to deal with textures. That’s one less thing for us to worry about!

Moving on, there are a couple more variables Unity automatically sets when you define a texture property. Textures have a companion structure called a “sampler,” which defines how to read a texture. Its options include the filter and wrap modes you’re familiar with from the texture importer — point, bilinear, repeat, clamp, etc.

Unity stores samplers in a second variable which you define with the SAMPLER macro. The name here is important; it must always have the pattern “sampler” followed by the texture reference name.

Finally, remember the tiling and offset values from the material inspector? Unity stores those in a float4 variable. The name must follow the pattern of the texture name followed by “_ST.” Inside, the X- and Y-components hold the X and Y scales while the Z- and W-components hold the X and Y offsets. (And yes, the fourth component of a float4 vector is referred to as “W” in HLSL.)
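In MyLitForwardLitPass.hlsl, the three declarations look like this:

    TEXTURE2D(_ColorMap); // The texture itself.
    SAMPLER(sampler_ColorMap); // Its sampler: "sampler" + the texture's reference name.
    float4 _ColorMap_ST; // Tiling in XY, offset in ZW, set by the material inspector.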

Now, we’d like to sample the texture, or read color data from it. We’ll do that in the fragment stage to apply texture colors to pixels. Use the SAMPLE_TEXTURE2D macro to get the color out of a texture at a specific location. It takes three arguments: the texture, the sampler, and the UV coordinate to sample.

This sphere-like model has UVs mapping each vertex to a position on a flat plane. We can use them to draw a texture on a 3D shape.

First, what are UVs? UVs are texture coordinates assigned to all vertices of a mesh which define how a texture wraps around a model. Think of how cartographers try to unwrap a globe to fit on a flat map. They’re basically assigning UVs to positions on the globe.

This globe model “unwraps” its texture onto a flat plane.

UVs are float2 variables, where the X- and Y-coordinates define a 2D position on a texture. UVs are normalized, or independent of the texture’s dimensions. They always range from zero to one.

A few UV-coordinates on a texture. Notice (1, 1) is always the top right corner, no matter what dimensions or aspect ratio the texture has.

Interpolating Vertex Data. We can’t grab UVs out of thin air in the fragment stage — they’re another vertex data stream the input assembler needs to gather.

Add a float2 uv field to the Attributes structure with the TEXCOORD0 semantic, which is short for “texture coordinate set number zero.” Models can have many sets of UVs — Unity uses TEXCOORD1 for lightmap UVs for example, but we’ll get to that later.

The Attributes struct is not available in the fragment stage either. However, we can store data in the Interpolators struct, which will eventually make its way to the fragment stage.

Add another float2 uv field there, also using the TEXCOORD0 semantic.
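Both structs gain a UV field:

    struct Attributes {
        float3 positionOS : POSITION;
        float2 uv : TEXCOORD0; // UV set number zero.
    };

    struct Interpolators {
        float4 positionCS : SV_POSITION;
        float2 uv : TEXCOORD0; // Passed along to the fragment stage.
    };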

In the Vertex function, pass the UV from the Attributes struct to the Interpolators struct. We can also apply UV scaling and offset here — might as well! This way we’ll only compute it once per vertex instead of once per pixel. It’s a good idea to do as much as possible in the vertex function, since it generally runs fewer times than the fragment function.
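The vertex function now copies the UV across, using a URP helper macro to apply the material’s tiling and offset:

    Interpolators Vertex(Attributes input) {
        Interpolators output;

        VertexPositionInputs posnInputs = GetVertexPositionInputs(input.positionOS);
        output.positionCS = posnInputs.positionCS;

        // Apply _ColorMap's tiling and offset values, then pass the UV along.
        output.uv = TRANSFORM_TEX(input.uv, _ColorMap);

        return output;
    }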

For reference, the TRANSFORM_TEX macro’s definition in the URP shader library looks roughly like this. You don’t need to add it to your shader:
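    #define TRANSFORM_TEX(tex, name) ((tex.xy) * name##_ST.xy + name##_ST.zw)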

Unity provides this macro to apply tiling and offset. There are two interesting things about it. First, the double hash “##” tells the preprocessor to append text to whatever is passed in as an argument. When the macro expands, you can see how it replaces name with _ColorMap, correctly referencing _ColorMap_ST.

Second, the xy and zw suffixes give easy access to a pair of components. This mechanism is called “swizzling.” You can ask for any combination of the x-, y-, z-, and w-components, in any order. The compiler will construct an appropriately sized float vector variable for you. You can also use r, g, b, and a the same way — more intuitive for colors. It’s even possible to assign values with a swizzle operator.
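A few quick swizzle examples:

    float4 v = float4(1, 2, 3, 4);
    float2 scale = v.xy;     // (1, 2)
    float2 offset = v.zw;    // (3, 4)
    float3 reversed = v.zyx; // (3, 2, 1)
    float alpha = v.a;       // r, g, b and a work too, which reads nicer for colors.
    v.xy = float2(5, 6);     // You can even assign to a swizzle.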

Each vertex has a different value. How does the rasterizer decide which value to give to each Fragment call?

Anyway, now we have UV data in the fragment stage. But, let’s take a moment to really think about what’s happening here. The vertex function outputs data for each vertex. The rasterizer takes those values, places the vertices on the screen, figures out what pixels cover the formed triangle, and finally generates an input structure for each fragment function call. What value will input.uv have for each fragment?

A value interpolated between two ends of a line. Linear interpolation.

The rasterizer will interpolate any field tagged with a TEXCOORD semantic using an algorithm called “barycentric interpolation.” You’re probably familiar with linear interpolation, where a value on a number line is expressed as a weighted average of the values at the end points.

A value interpolated between three corners of a triangle. Barycentric interpolation.

Barycentric interpolation is the same idea except on a triangle. The value at any point inside the triangle is a weighted average of the values at each corner.
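In pseudo-shader terms, the idea looks like this (the rasterizer computes the weights for you, based on where the fragment sits inside the triangle):

    // w0, w1 and w2 always sum to 1. uv0, uv1 and uv2 are the values the
    // vertex function output for the triangle's three corners.
    float2 interpolatedUV = w0 * uv0 + w1 * uv1 + w2 * uv2;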

An example of how the rasterizer would assign values to each fragment call by interpolating vertex data.

Luckily, the rasterizer handles this for us, so the algorithm is not important. To recap, the values in Interpolators are a combination of values returned by the vertex function. Specifically, for any fragment, they are a combination of values from the three vertices forming the triangle covering that fragment.

Sampling the Texture. All that for some UVs, but now we have all we need to call SAMPLE_TEXTURE2D. It returns a float4, the color of the texture at the specified UV position. Depending on the sampler’s sample mode (point, bilinear, etc), this color may be a combination of adjacent pixels, to help smooth things out.

Regardless, multiply the sample with the color tint property and return it. In HLSL, multiplying two vectors is component-wise, meaning the X-components of each vector are multiplied together, then the Y-components, etc. All arithmetic operators work like this.
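The finished fragment function samples the texture and tints it:

    float4 Fragment(Interpolators input) : SV_TARGET {
        // Read the texture's color at this fragment's UV position.
        float4 colorSample = SAMPLE_TEXTURE2D(_ColorMap, sampler_ColorMap, input.uv);

        // Component-wise multiply: (r1*r2, g1*g2, b1*b2, a1*a2).
        return colorSample * _ColorTint;
    }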

In the scene editor, set a texture on your material and marvel at what you’ve accomplished!

Do note that if your texture has an alpha component, the shader doesn’t handle transparency yet. The sphere will always be opaque. Stay tuned to fix that!

I hope this whets your appetite for shader programming, because we’re just getting started! In the next part of this tutorial series, we’ll add simple lighting to finally give objects dimensionality. Then, we’ll delve into shadows and learn about adding additional passes to a shader. You’ve gotten past a lot of the theory and can now focus on the fun stuff!

For reference, here are the final versions of the shader files.
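Reconstructed from the steps above, they should look roughly like this. First, MyLit.shader:

    Shader "NedMakesGames/MyLit" {
        Properties {
            [Header(Surface Options)]
            [MainColor] _ColorTint("Tint", Color) = (1, 1, 1, 1)
            [MainTexture] _ColorMap("Color", 2D) = "white" {}
        }
        SubShader {
            Tags { "RenderPipeline" = "UniversalPipeline" }

            Pass {
                Name "ForwardLit"
                Tags { "LightMode" = "UniversalForward" }

                HLSLPROGRAM
                #include "MyLitForwardLitPass.hlsl"
                #pragma vertex Vertex
                #pragma fragment Fragment
                ENDHLSL
            }
        }
    }

And MyLitForwardLitPass.hlsl:

    #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Lighting.hlsl"

    float4 _ColorTint;
    TEXTURE2D(_ColorMap);
    SAMPLER(sampler_ColorMap);
    float4 _ColorMap_ST;

    struct Attributes {
        float3 positionOS : POSITION;
        float2 uv : TEXCOORD0;
    };

    struct Interpolators {
        float4 positionCS : SV_POSITION;
        float2 uv : TEXCOORD0;
    };

    Interpolators Vertex(Attributes input) {
        Interpolators output;

        VertexPositionInputs posnInputs = GetVertexPositionInputs(input.positionOS);
        output.positionCS = posnInputs.positionCS;
        output.uv = TRANSFORM_TEX(input.uv, _ColorMap);

        return output;
    }

    float4 Fragment(Interpolators input) : SV_TARGET {
        float4 colorSample = SAMPLE_TEXTURE2D(_ColorMap, sampler_ColorMap, input.uv);
        return colorSample * _ColorTint;
    }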

If you enjoyed this tutorial, consider following me here to receive an email when the next part goes live.

If you want to see this tutorial from another angle, I created a video version you can watch here.

I want to thank Crubidoobidoo for all their support, as well as all my patrons during the development of this tutorial: Adam R. Vierra, Alvaro LOGOTOMIA, andrew luers, Andrew Thompson, Ben Luker, Ben Wander, bgbg, Bohemian Grape, Brannon Northington, Brooke Waddington, burma, Cameron Horst, Charlie Jiao, Christopher Ellis, Connor Wendt, Danny Hayes, Danyow Ed, darkkittenfire, Davide, Derek Arndt, Eber Camacho, Electric Brain, Eric Bates, Eric Gao, Erica, etto space, Evan Malmud, Florian Faller, gamegogojo, gleb lobach, Isobel Shasha, J S, Jack Phelps, Jannik Gröger, Jesse Comb, Jessica Harvey, Jiwen Cai, JP Lee, jpzz kim, Justin Criswell, KIMIKO KAWAMORITA, Kyle Harrison, Lanting Dlapan, Leafenzo (Seclusion Tower), Lhong Lhi, Lorg, Lukas Schneider, Luke Hopkins, Lune Snowtail, Mad Science, Maks Kaniewski, Marcin Krzeszowiec, Marco Amadei, martin.wepner, Maximilian Punz, maxo, Microchasm, Minori Freyja, Nick Young, Oliver Davies, Orcs Yang, Oskar Kogut, Óttar Jónsson, Patrick, Patrik Bergsten, Paul, persia, Petter Henriksson, rafael ludescher, Rhi E., Richard Pieterse, rocinante, Rodrigo Uribe Ventura, rookie, Sam CD-ROM, Samuel Ang, Sebastian Cai, Seoul Byun, shaochun, starbi, Stefan Bruins, Steph, Stephan Maier, Stephen Sandlin, Steven Grove, svante gabriel, T, Tim Hart, Tvoyager, Vincent Thémereau, Voids Adrift, Will Tallent, 智則 安田, 이종혁

If you would like to download all the shaders showcased in this tutorial inside a Unity project, consider joining my Patreon. You will also get early access to tutorials, voting power in topic polls, and more. Thank you!

If you have any questions, feel free to leave a comment or contact me at any of my social media links:

🔗 Tutorial list website ▶️ YouTube 🔴 Twitch 🐦 Twitter 🎮 Discord 📸 Instagram 👽 Reddit 🎶 TikTok 👑 Patreon Ko-fi 📧 E-mail: nedmakesgames gmail

Thanks so much for reading, and make games!

Changelog:

  • May 29th 2023: Add support for Unity 2022.

©️ Timothy Ned Atton 2023. All rights reserved.

All code appearing in GitHub Gists is distributed under the MIT license, unless otherwise specified.

Timothy Ned Atton is a game developer and graphics engineer with ten years’ experience working with Unity. He is currently employed at Golf+ working on the VR golf game, Golf+. This tutorial is not affiliated with nor endorsed by Golf+, Unity Technologies, or any of the people and organizations listed above. Thanks for reading!
