Graphics

HDRP Wind Corruption Shader Post-Mortem

Unity HDRP
Decal Projector
Shader Graph
Optimization

Faced with the “volumetric corruption” requirement, we ruled out heavy solutions like Uber Shaders and RT-driven approaches and pivoted to a lightweight Native Decal Projector with mathematical static/dynamic decoupling; this post documents why.

HDRP Decal Projector Corruption Shader Analysis

Rendering Pipeline Decision: The Journey from Complex R&D to Pragmatic Implementation

When receiving Mike’s requirement for “volumetric corruption with dynamic wind feedback,” the initial instinct was to lean towards complex, cutting-edge techniques. However, during the actual R&D and testing phases, I had to make pragmatic trade-offs between performance and visual fidelity.

This post-mortem not only documents how our final Shader is constructed, but more importantly, why we chose NOT to use those “advanced-sounding” solutions, and how we left the door open for future extensions.


1. The Initial Blueprint: Exploring Heavy Architectures

In the early stages of the project, to completely solve the lack of volumetric depth in 2D decals, I evaluated several highly ambitious technical routes:

  • Route A: Uber Shader Injection. Injecting the corruption logic directly into the Master Node of all terrain and prop materials via a Subgraph, utilizing true world-space coordinates for vertex-level influence.

    Pipeline overview (SCHEME_A::UBER_SHADER_INJECTION): the programmable stages (Vertex Shader, Tessellation, Geometry Shader, Fragment Shader) carry the injected logic, while the fixed-function stages (Input Assembler, Rasterizer, Output Merger) remain untouched.
  • Route B: Mesh Decal Proxy. Spawning a real semi-transparent sphere mesh at the corruption site and calculating soft blending at intersections by reading the CameraDepthTexture.

    Pipeline overview (SCHEME_B::MESH_PROXY_BLEND): the programmable stages (Vertex Shader, Tessellation, Geometry Shader, Fragment Shader) carry the blending logic, while the fixed-function stages (Input Assembler, Rasterizer, Output Merger) remain untouched.
  • Route C: Render Texture Channel Packing (Enrico’s Proposal). Discarding global variable parameters in favor of a real-time Render Texture (RT). By packing the corruption shape, wind perturbation, and footprint state into different channels, this would allow for fully independent physical interactions for every corruption ring in the scene.

Theoretically, all these approaches are perfectly valid and represent standard practices for environmental VFX in AAA engines. However, following a rigorous Profiler audit, I decided to put them on hold.


2. The Reality Check: Why We Pivoted

The primary reasons for discarding the above solutions boil down to a poor Performance-to-Visual Ratio and exorbitant maintenance costs.

  • Why we dropped the Uber Shader: The Variant Explosion Risk. Adding a corruption macro toggle to a vast library of existing materials would cause Shader Variants to multiply exponentially, severely bloating VRAM and build times. Furthermore, if environment artists updated the base terrain shaders later, these injected connections would easily break, resulting in an unmanageable maintenance nightmare.
  • Why we dropped the Mesh Proxy: The Abyss of Overdraw. Proxy spheres are inherently part of the Transparent render queue, meaning they cannot benefit from Early-Z culling. If multiple corruption spheres overlap in a narrow corridor, the fragment shader (frag) overdraw would instantly blow through our performance budget.
  • Why we postponed the RT Solution: Avoiding Premature Over-engineering. Enrico’s RT proposal is undeniably the ultimate form for complex environmental VFX. However, our current weather system only provides a “globally unified” wind parameter. Until game design explicitly demands granular interactions like “wind occlusion inside caves,” forcing a dynamic RT pipeline would introduce unnecessary memory bandwidth pressure.

Based on these considerations, I decided to pivot to the most lightweight, non-invasive solution available: HDRP Native Decal Projector.

Pipeline overview (FINAL_SCHEME::HDRP_DBUFFER_PROJECTION): the programmable stages (Vertex Shader, Tessellation, Geometry Shader, Fragment Shader) carry the decal logic, while the fixed-function stages (Input Assembler, Rasterizer, Output Merger) remain untouched.
Note: Although it acts as a decal, under the hood it still renders a Projection Box (ia/vert), calculates inverse projection and internal logic in the fragment stage (frag), and finally blends into the D-Buffer (om).

3. Solving the Decal’s Inherent Visual Flaws

Having committed to Native Decals, the new challenge emerged: Using purely mathematical logic, how do we make a 2D projection feel like an organic fluid torn by the wind, without clipping awkwardly into the environment?

Through a synchronized effort between C# and Shader Graph, we tackled three major pain points:

Pain Point 1: Visual Snapping During Wind Speed Changes

Initially, I multiplied Time * WindSpeed directly inside the Shader to pan the noise. The fatal flaw? Because Time keeps growing, any change in WindSpeed instantly rescales the entire accumulated product, causing the offset to jump drastically and the VFX to aggressively flicker and snap. The Solution (C# Phase Accumulation): Strip the Shader of its time-calculation duties. Instead, use Time.deltaTime in the C# script to accumulate the incremental physical distance traveled each frame. We pass this stabilized, continuously accumulating vector (_CustomWindOffset) to the Shader, while applying modulo operations to prevent floating-point precision loss over long play sessions.
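The accumulation trick is engine-agnostic, so it can be verified outside Unity. Below is a minimal Python sketch of the same math; the real implementation lives in a C# script feeding a vector to _CustomWindOffset, while here a scalar offset, the function names, and the wrap value of 1000 are all illustrative:

```python
import math

def naive_offset(t, wind_speed):
    """The flawed approach: the offset jumps whenever wind_speed changes,
    because the whole elapsed time is rescaled at once."""
    return t * wind_speed

def accumulate_offset(offset, dt, wind_speed, wrap=1000.0):
    """Per-frame accumulation: only the increment depends on the current
    wind speed, so the offset stays continuous across speed changes.
    The modulo keeps the float small over long play sessions."""
    return (offset + dt * wind_speed) % wrap

# Simulate a wind-speed step from 1.0 to 4.0 at t = 10 s, with dt = 0.1 s.
dt, offset, t = 0.1, 0.0, 0.0
history = []
for frame in range(201):
    speed = 1.0 if t < 10.0 else 4.0
    offset = accumulate_offset(offset, dt, speed)
    history.append(offset)
    t += dt

# The accumulated offset never moves more than dt * max_speed per frame,
# whereas the naive product would leap from 10*1 = 10 to 10*4 = 40 instantly.
max_step = max(abs(b - a) for a, b in zip(history, history[1:]))
print(max_step <= dt * 4.0 + 1e-9)  # True: motion stays continuous
```

The same continuity argument is why the modulo wrap value must be shared with (or irrelevant to) the tiling period of the sampled noise.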

Pain Point 2: The Entire Decal Sliding Across the Ground

If we applied the wind offset to the global UVs directly, the entire corruption pit would slide across the floor like a skateboard. The Solution (Static/Dynamic Decoupling): Strictly isolate the static and dynamic components within the Shader. Static UVs are exclusively used to generate the central black hole mask; meanwhile, the wind perturbation doesn’t shift the UVs but is instead added to the Radius of the Distance node. This locks the pit in place, while allowing only the outer flames to stretch downwind.
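A toy Python version of this decoupling makes the behaviour concrete. The real shader perturbs a noise field rather than a hard threshold, so `corruption_mask` and the sample coordinates here are purely illustrative:

```python
import math

def corruption_mask(u, v, radius, wind_noise):
    """Static distance field; wind perturbs only the threshold radius,
    never the UVs, so the pit's centre cannot drift."""
    d = math.hypot(u - 0.5, v - 0.5)      # pristine, wind-free distance
    return 1.0 if d < radius + wind_noise else 0.0

# The centre pixel stays inside the pit for any plausible wind value...
assert all(corruption_mask(0.5, 0.5, 0.3, w) == 1.0
           for w in (-0.1, 0.0, 0.1))
# ...while a rim pixel flickers in and out as the wind noise oscillates,
# which is exactly the "flames stretching downwind" behaviour we want.
rim = [corruption_mask(0.5, 0.82, 0.3, w) for w in (-0.1, 0.0, 0.1)]
print(rim)  # [0.0, 0.0, 1.0]
```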

Pain Point 3: The “Tree-Climbing” Artifact (Vertical Stretching)

When a decal is projected near a tree trunk or steep wall, the 2D planar projection inevitably creates hideous vertical stretching.

Decal texture stretching vertically on a tree trunk

The Problem: Severe stretching artifacts caused by Decal projection on vertical surfaces.

We deployed a “Dual Defense Mechanism” to handle this:

  1. Defense A (Physical Culling - Decal Layer Mask): For individual objects that should absolutely never be corrupted (e.g., crucial quest items, specific walls), we exclude them entirely using HDRP’s Decal Layer Mask. This takes effect at the base Culling stage, serving as a zero-cost hard isolation.
  2. Defense B (Visual Blending - Height Fade Mask): For organic transitions over tree roots or slight slopes, a hard Layer cut looks terribly rigid. Therefore, I built a volume height clipping mask inside the Shader based on absolute world height (World Space Y), infused with noise to break up the cut line. It allows flames to naturally “lick” slight elevations, but smoothly fades to transparent if it climbs beyond a set threshold, perfectly hiding the stretching artifact.
Result after applying the height fade mask: stretching is perfectly hidden

The Solution: Vertical stretching is smoothly eliminated after applying the Height Fade Mask.


4. Shader Graph Core Logic Breakdown

To execute the concepts above, the Shader is cleanly modularized into several functional groups:

Shader Graph global overview

Shader Graph Global Overview: Clean functional modularization

Group: WindFlowNoise (Wind Drive Engine)

Receives the accumulated variable _CustomWindOffset from C#.

  • Logic: Multiplies the global offset by an intensity parameter. It doesn’t modify vertices directly but acts as the dynamic “coordinate displacement thrust” for subsequent noise sampling.
Wind Flow Noise node setup

Group: UVDistance Field (Static Distance Field)

This is the anchor for the underlying corruption pit, ensuring the core doesn’t drift.

  • Logic: Uses a pristine local UV coupled with a center point of (0.5, 0.5) fed into a Distance node to calculate a perfectly static radial gradient. Inside this group, introducing any wind variables is strictly prohibited.

Group: Corruption Mask (Multi-dimensional Masking)

This is the heart of the deformation and restriction logic. I split it into two dimensions: planar perturbation and height clamping.

  • Dimension 1: Downwind Surface Stretching (XZ Noise). Handles fluid deformation on the 2D surface.

    XZ plane noise logic
    • Method: I split the wind offset vector and feed it into two separate Simple Noise nodes. By independently adjusting the XZ and Y scales, we artificially create a directional bias—elongated in the downwind direction and compressed on the flanks.
    • Crucial Detail: During this phase, I use a Subtract node to subtract 0.5 from the 0~1 noise. This shifts pure positive numbers into a range containing negatives, ensuring the flames oscillate back and forth across the original boundary, preventing the entire ring from migrating unidirectionally.
  • Dimension 2: Height Clamping (Y Mask). Specifically addresses the “tree-climbing” stretching artifact (Pain Point 3).

    Y-axis height mask logic
    • Method: Extracts the absolute Y-axis height of the pixel using the Position (World) node. This is fed into a Remap node to define the falloff zone, then a Power node controls the harshness of the fade, and finally, a Clamp restricts it between 0 and 1.
    • Result: This computed Float acts as an invisible “horizontal guillotine,” forcing all pixels that stray “too high off the ground” to aggressively fade out right before the final output.
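Outside Shader Graph, the Remap -> Power -> Clamp chain reduces to a few lines of math. This Python sketch assumes the fade runs from fully visible at `fade_start` to fully transparent at `fade_end`; the parameter names are hypothetical, not the graph's actual exposed properties:

```python
def height_fade(world_y, fade_start, fade_end, harshness=2.0):
    """Remap world-space height into [0,1] over the falloff zone,
    clamp it, then sharpen with a power curve and invert, mirroring
    the Remap -> Power -> Clamp chain described above."""
    t = (world_y - fade_start) / (fade_end - fade_start)  # Remap
    t = max(0.0, min(1.0, t))                             # Clamp
    return 1.0 - t ** harshness                           # Power + invert

# Ground level stays fully visible; above the threshold it fades to zero.
print(height_fade(0.0, 0.2, 1.0))  # 1.0
print(height_fade(1.5, 0.2, 1.0))  # 0.0
```

Raising `harshness` steepens the guillotine; values below 1.0 would soften it instead.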

Group: M * (1-M) * 4 (Emission Edge Extraction & Final Blend)

  • Logic: A pure mathematical trick. By adding the computed XZ noise to the static distance field and subtracting our designated corruption radius, we generate a base dynamic mask, M.
  • Final Composite: Utilizing the mathematical property of M * (1 - M), we extract a brilliant peak exclusively along the light-to-dark transition boundary. We take this razor-sharp flame ring, multiply it by our previously calculated Y Mask (to trim excess vertical climbing), and finally multiply it by our target color before feeding it into the Emission channel.
Final emission extraction and height mask blending

Final Composite: Blending the mathematical edge extraction with the height mask
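The M * (1 - M) * 4 trick is worth verifying numerically. A quick Python check (the function name is illustrative) shows it is zero at both extremes of the mask and peaks at exactly 1.0 on the 50% boundary:

```python
def edge_glow(m):
    """M * (1 - M) * 4: a parabola that is zero where the mask is fully
    outside (M=0) or fully inside (M=1), peaking at M=0.5. The factor 4
    normalises the peak of m*(1-m), whose maximum is 0.25, up to 1.0."""
    return m * (1.0 - m) * 4.0

print(edge_glow(0.0))  # 0.0 - outside the corruption: no glow
print(edge_glow(1.0))  # 0.0 - deep inside the pit: no glow
print(edge_glow(0.5))  # 1.0 - the light-to-dark transition: full ring
```

Because the function only lights up near the mask boundary, any wind perturbation of the boundary automatically drags the emissive ring along with it for free.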

Final in-game visual result

The Final In-Game Result


5. Future Extensions: Leaving the Door Open for RT

Currently, passing parameters via Shader.SetGlobalVector is incredibly performant and offers the best bang for our buck. However, keeping Enrico’s advice in mind, this setup can smoothly transition if environmental interaction demands increase.

Triggers for a Pipeline Upgrade:

  1. Design requires localized wind occlusion (e.g., zero wind inside a cave).
  2. The need for precise footprint feedback, such as flames extinguishing when a character steps on them.

When that day arrives, because our current Shader architecture already solidifies the underlying logic for “Static/Dynamic Decoupling” and “Planar/Height Separation”, we only need to swap out the few nodes receiving C# global variables. By replacing them with nodes that sample specific channels of a global Render Texture (e.g., reading the G channel for wind strength, B channel for footprint masks), we can seamlessly plug into a holistic environmental ecosystem like TVE.

Conclusion

Stepping back from a complex AAA architectural R&D phase to a streamlined mathematical implementation has been the most valuable lesson of this pipeline selection process. As Technical Artists, we must avoid the trap of treating everything like a nail just because we hold a powerful hammer.

By leveraging precise mathematical decoupling and height masking, we traded a handful of cheap ALU instructions for excellent volumetric visuals. More importantly, we clearly defined the boundaries of this solution while actively paving a smooth transition path toward a more robust RT architecture in the future.


6. Iteration: The “Breathing Lightning” Core (An Art-Driven Visual Revamp) (Mar.12)

Iteration Background & Positioning: While the pure math M * (1-M) * 4 approach was practically flawless in terms of performance, subsequent internal playtests revealed a visual shortcoming: the corruption felt too “passive” and “mild.” The Art Director’s vision demanded that the core area read as a highly unstable energy anomaly brimming with violent vitality.

It is crucial to clarify to the readers: This current iteration is strictly an “Art-Driven” visual proof-of-concept. Its primary mission is to achieve the highest Visual Target inside the engine without any compromises, establishing the benchmark for our art style. Once we hit this visual ceiling, we will inevitably subject it to rigorous performance downgrades and pipeline refactoring in the upcoming production cycles.

Technical Implementation Breakdown: To achieve the “crackling wandering energy” and “rhythmic breathing pulse,” we completely refactored the core node group, introducing a highly expressive—albeit heavy—complex noise network:

  • Core A: Polar Noise Composite Warping & Voronoi In-Place Evolution. We abandoned simple UV panning, as it often just looks like a static texture sliding unnaturally across a surface. To imbue the energy with an outward-tearing and highly erratic sense of violence, we heavily customized the Voronoi noise across both spatial and temporal dimensions.

    • Spatial Dimension (Composite UV Tearing): First, the base UV is routed through a Polar Coordinates node to establish a radial spatial foundation. Simultaneously, we sample a standalone Simple Noise. The critical step: we take the output value of this basic noise and Add it directly to the output UV of the Polar Coordinates. This intensely warped, radially-biased, and uneven composite UV is then fed into the Voronoi node. This ensures the generated cellular structure is inherently torn through spatial distortion from the very beginning.

    • Temporal Dimension (Angle Offset Drive): Instead of using time to pan the UVs—the traditional trap—we take the accumulated time variable (Time * Movement), driven by the _LightningMovement property, and plug it directly into the Voronoi node’s Angle Offset port. This forces the Voronoi cell feature points to rotate, deform, and cannibalize each other in place, spawning an incredibly vivid, “crackling” wandering lightning dynamic that completely shatters the stiffness of linear panning.

    • Bandpass Filtering Extraction: Finally, to refine the chunky cellular shapes into razor-sharp lightning, we introduced the concept of Bandpass Filtering. We feed the Voronoi output into a Smoothstep node, locking Edge 1 at 0.1, while Edge 2 is dynamically controlled by the _LightningThickness property. This operation surgically extracts the microscopic “black gaps” crawling between the cells, instantly converting them into sharp, high-frequency, anti-aliased lightning meshes.

    Voronoi lightning generation logic using Polar and Noise combined UV warping, paired with time-driven Angle Offset

    Step A: Composite UV Warping + Angle Offset Temporal Evolution + Smoothstep Bandpass Filtering to extract lightning meshes

    Isolated visual effect of the dynamic wandering lightning mesh from Core A

    Isolated Core A Effect: Discarding conventional UV panning to reveal an outwardly tearing, wildly erratic wandering lightning mesh.

  • Core B: Spherize Cloud Base & Phase Shifting Pulse. Beneath the wandering lightning meshes, we needed to lay down a “dark matter base” that could support the illusion of volume and throb like organic tissue. To create this slightly elevated, rhythmically pulsating physical illusion, we employed a highly elegant Phase Shifting technique:

    • Step 1: Constructing the Static Spherize Space & Cloud Base. We first pass the native UV into a Spherize node, sculpting a 3D convex-lens-like spatial base. This ballooned, distorted UV is then used to sample a Simple Noise (density governed by _L_BreathingNoiseScaler), generating a layer of static cloud cover with a volumetric, wrapping feel.

    • Step 2: Bandpass Texture Extraction (Smoothstep). The raw noise is too blurry and soft. We route it into a Smoothstep node, allowing the exposed _L_BreathingNoiseThickness property to dynamically hijack Edge 1 and Edge 2. Like a scalpel, this slices the blurry cloud into stark, high-contrast black-and-white static patches.

    • Step 3: Temporal Phase Drive & Sine Ripple. This is the core magic of the underlying logic. We do not move the UVs. Instead, using an Add node, we inject the continually accumulating Time variable directly into the grayscale values of the static patches we just extracted. This forces the numerical value of every single pixel to climb at a constant rate. We then feed this result into a Sine node. Because every pixel starts with a different initial grayscale value, processing them through a Sine function produces asynchronous, non-linear oscillation. Visually, this translates flawlessly into “breathing ripples” where the cloud base texture continuously expands and contracts in place.

    • Step 4: 0-1 Normalization. Given that a Sine wave outputs values oscillating between -1 and 1, overlaying this directly would cause color inversion or artifact blackouts. At the tail end of the logic chain, we use simple arithmetic—Add(1) followed by Divide(2)—to seamlessly remap the wave into an absolute positive range of 0 to 1, outputting a pristine breathing pulse mask.

    Left to Right: Spherize noise extraction, adding time variable, and generating phase breathing ripples via Sine node

    Step B: Spherize Noise Extraction -> Temporal Value Addition -> Sine Phase Ripple -> 0 to 1 Normalization

    Animated visual result of the Spherize and Phase Shifting breathing effect

    Isolated Core B Effect: The Phase Shifting pulse in action, creating an organic, volumetric breathing ripple at the base.

  • Core C: Energy Aggregation & HDR Exposure Physical Enhancement. Finally, we multiply and merge the lightning texture (A) with the breathing noise (B), applying Saturate to strictly prevent negative value overflow. To ignite a genuine sense of energy within the HDRP environment, this dynamic mask is piped directly into an Exposure node before being multiplied by an extreme-intensity EmissionColor. This guarantees that even amidst drastic fluctuations in environmental lighting, the corruption core sustains a blindingly bright, physically accurate emissive Bloom.
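The phase-shifting pulse behind Core B (Steps 3 and 4) is easy to demonstrate outside the engine. In this Python sketch, `breathing_value` stands in for the Add -> Sine -> normalize node chain; the sample values are illustrative:

```python
import math

def breathing_value(static_gray, t):
    """Core B, steps 3-4: inject accumulated time into a pixel's static
    grayscale, run it through a sine, then remap -1..1 back to 0..1
    via Add(1) -> Divide(2). No UVs are moved at any point."""
    s = math.sin(static_gray + t)      # per-pixel phase = its grayscale
    return (s + 1.0) / 2.0             # normalisation step

# Two pixels with different static grayscale values oscillate out of phase:
a = [breathing_value(0.0, t * 0.5) for t in range(8)]
b = [breathing_value(2.0, t * 0.5) for t in range(8)]
print(a != b)  # True: asynchronous "breathing" without moving any UVs
print(all(0.0 <= v <= 1.0 for v in a + b))  # True: safely normalised
```

Because each pixel's initial grayscale acts as its own phase offset, neighbouring cloud patches expand and contract at staggered times, which is what reads as organic ripple rather than a uniform global pulse.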

Final in-game iteration visual result showing lightning and breathing effects

Iteration Result: A highly oppressive visual performance merging ionization and breathing.

Ultimate composition featuring Render Textures, PostFX Bloom, Crystal Distortion, and Volumetric Fog

Ultimate Composition: The complete VFX suite integrating the procedural core with Render Textures, Post-FX Bloom, Refraction, and Volumetric Fog.


7. Next Steps: Performance Reclamation and Texture Baking

As emphasized earlier, this current version exists to secure “visual sign-off.” Calculating Voronoi, Polar Coordinates, and Spherize in real-time during the Fragment stage incurs a heavy ALU (Arithmetic Logic Unit) overhead that cannot be ignored. For an environmental decal that might be massively deployed across a scene, this level of consumption is unsustainable.

Future Optimization Strategy: Procedural to Texture Baking

Can we switch to using textures instead? Absolutely, and it is the mandatory path toward large-scale production. Once the visual target is unequivocally locked in, our next iteration will focus entirely on “baking” these expensive mathematical calculations into lightweight static assets:

  1. Voronoi Texture Downgrade: We plan to utilize Substance Designer or in-engine baking tools to freeze the wandering lightning effect into a continuous Flow Map paired with a static, high-frequency Noise texture. Within the Shader, simple UV Panning combined with Flow Map distortion will allow us to nearly perfectly simulate the current dynamic Voronoi calculations at a fraction of the cost, trading ALU overhead for cheap Texture Fetches.
  2. Polar Pre-computation Baking: We will strip out the real-time arctangent (atan2) and spherize distortion math from the Shader. Instead, we will bake a UV Lookup Table (LUT) texture containing the polar distortion, or simply downgrade the effect to utilize a pre-rendered Flipbook sequence.
  3. Distance Fade Culling: Even if we retain a fraction of the core calculations, we will aggressively implement an LOD logic based on Camera Distance. This will allow players to witness the high-fidelity procedural calculations within a 5-meter proximity, while smoothly Lerp-ing down to the cheapest static single texture for anything beyond 15 meters.

8. Update (DLC): The Great Purge & Production-Ready Refactor (Mar.17)

Following the “Art-Driven” visual approval documented in Section 6, we immediately initiated the performance reclamation phase. The goal was simple: Ruthlessly eliminate all real-time procedural noises while preserving the chaotic, “breathing lightning” visual identity.

8.1 The Reality of ALU Bottlenecks

We executed a global purge across the Shader Graph. Every instance of Voronoi, Polar Coordinates, Spherize, and the 4 Simple Noise nodes was permanently deleted. Leaving these in the graph, even disconnected, is a well-known production pitfall, as they can cause variant bloat and confuse future maintainers.

In their place, we introduced a pre-baked Texture (T_Flow(RG)_Cell(B)_Noise(A)) authored in Substance Designer. This single texture packs our essential data:

  • R & G Channels: Flow vectors (pre-calculated radial distortion).
  • B Channel: Cellular structure (the base for the lightning meshes).
  • A Channel: High-frequency noise (for edge breakup).

8.2 Flowmap Blending: The Seamless Infinite Flow

Replacing Polar Coordinates with a Flowmap presented a new challenge: simple UV panning looks unnatural, and applying a modulo (Fraction) to the time variable causes a jarring, instantaneous visual reset when the value jumps from 1 back to 0. Similarly, using a PingPong (triangle wave) approach resulted in an awkward “breathing/accordion” artifact where the flow reverses direction.

The industrial-standard solution is Flowmap Blending (Dual Phase).

Flowmap Blending Subgraph utilizing two shifted fractions and a cross-fading lerp

The custom Flowmap Blending Subgraph. Notice the strictly decoupled static directional vector and the time-driven dual phases.

The Mathematical Mechanics:

To prevent infinite stretching, we resolve the “snap” of the cycle by running two identical engines (Phase_A and Phase_B) with a half-cycle offset:

  1. Temporal Offset:
    • Phase_A = Fraction(Time)
    • Phase_B = Fraction(Time + 0.5)
  2. Weighting Function (The Mask): We use a triangle wave derived from the phase to ensure a phase is only visible when its UV distortion is stable: W = 1 - |Phase * 2 - 1|
  3. The Cross-fade: When Phase A reaches its reset point (0.99 -> 0), its weight W_A is exactly 0. At that exact moment, Phase B is at its stable midpoint (0.5) with a weight of 1.0. The Lerp node toggles between them, hiding the jump.

TA Note: When sampling Flow vectors (RG), ensure the texture is not compressed using standard block compression (ASTC/BC7) and ideally exported at 16-bit precision to avoid “grid-like” vector quantization artifacts.
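The dual-phase weighting can be sanity-checked numerically. Here is a Python sketch of the subgraph's math (function names are illustrative; in the graph these are Fraction, Absolute, and Lerp nodes):

```python
import math

def frac(x):
    return x - math.floor(x)

def dual_phase(t):
    """Two half-cycle-offset phases with triangle-wave weights.
    A phase's weight hits 0 exactly when its Fraction() resets,
    so the UV 'snap' is always hidden behind the other phase."""
    phase_a = frac(t)
    phase_b = frac(t + 0.5)
    w_a = 1.0 - abs(phase_a * 2.0 - 1.0)
    w_b = 1.0 - abs(phase_b * 2.0 - 1.0)
    return phase_a, phase_b, w_a, w_b

# At the instant Phase A wraps (t = 1.0), its weight is exactly zero
# while Phase B sits at its stable midpoint with full weight:
pa, pb, wa, wb = dual_phase(1.0)
print(wa, wb)  # 0.0 1.0
# The two weights always sum to 1, so the cross-fade loses no energy:
print(all(abs(sum(dual_phase(t / 100.0)[2:]) - 1.0) < 1e-9
          for t in range(200)))  # True
```

The constant weight sum is the key property: the Lerp never mixes in "nothing", so brightness stays stable through every reset.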

8.3 Macro Masking & Edge Control

With the heavy lifting shifted to textures and Flowmap Blending, we restructured the Main Graph to cleanly separate the Micro Detail (texture sampling) from the Macro Masking (world-space constraints).

The fully refactored, production-ready Main Shader Graph

The finalized Main Graph: Procedural nodes purged, relying on texture caches and decoupled macro-masking logic.

A crucial mathematical fix was implemented for the central radial mask. Previously, relying solely on a raw Distance node left the edges of the 1x1 quad with a value of ~0.5, causing the lightning noise to bleed aggressively into the hard edges of the decal box. By adjusting the math to Saturate(Distance * 2.0) -> One Minus, we forcibly pushed the 0.5 edge distance up to 1.0 before inversion. This acts as a surgical scalpel, guaranteeing a pure, pitch-black falloff at the perimeter, keeping the corruption perfectly contained within its radial bounds.
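A quick numerical check of the containment fix in Python; `radial_mask` is an illustrative stand-in for the Distance -> Saturate -> One Minus node chain:

```python
import math

def radial_mask(u, v, raw=False):
    """The decal's central radial mask. With the fix, the ~0.5 distance
    at the quad edge is pushed to 1.0 before inversion, guaranteeing a
    pitch-black perimeter."""
    d = math.hypot(u - 0.5, v - 0.5)        # Distance from UV centre
    if raw:
        return 1.0 - d                      # old: edge still holds ~0.5
    return 1.0 - min(1.0, d * 2.0)          # Saturate(d * 2) -> One Minus

print(radial_mask(1.0, 0.5, raw=True))   # 0.5 - noise bleeds to the edge
print(radial_mask(1.0, 0.5))             # 0.0 - edge fully contained
print(radial_mask(0.5, 0.5))             # 1.0 - centre untouched
```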

8.4 The Visual Trade-off: Procedural Evolution vs. Texture Smearing

While the performance gains of this refactor are undeniable, it is vital to document a specific visual compromise we accepted during this transition: the slight loss of the erratic, “crackling” lightning behavior.

In our procedural iteration (Core A), we drove the Angle Offset of the Voronoi node using Time. Mathematically, this caused the cellular walls to dynamically collapse and reconnect, generating true Topological Evolution.

By switching to a pre-baked static Voronoi texture (T_Flow(RG)_Cell(B)_Noise(A)) distorted by a Flowmap, we are no longer evolving the cells. Instead, we are physically smearing a 2D image plane. Consequently, the lightning tendrils now inherently appear as continuous, warped lines (akin to stretched taffy) rather than disjointed, snapping arcs.


Mitigation Strategies:

  1. Art Direction (Current): Relying on the environment art team to iterate on the Flowmap and Cell textures. Authoring pre-broken cellular structures and high-frequency vector noise can artificially disrupt the continuous smearing.
  2. Tech Art Alternative (Future): If the stretching remains visually unacceptable, the ultimate middle-ground is a Flipbook Texture (SubUV). Baking the Angle Offset evolution into a low-cost 16-frame flipbook and playing it through the Flowmap pipeline would recover the chaotic cell-snapping dynamics at a fraction of the procedural ALU cost.

9. Update (DLC 2): The Spatial Refactor & Texture Evolution (Mar. 26)

Iteration Background: As we moved from the R&D sandbox into rigorous environmental placement, we hit a few unexpected integration hurdles.

  1. Masking Inconsistencies: The previous spatial masking logic exhibited some structural instability during edge-case placements, occasionally failing to clip the decal bounds cleanly.
  2. The “Brush Stroke” Artifact: The baked procedural noise felt thick and artificial—like digital paint—rather than the violent, razor-sharp plasma we originally envisioned.
  3. The Paradigm Shift: We realized that forcing a single Decal Projector to carry 100% of the volumetric visual weight was hitting a creative ceiling—it was becoming visually monotonous.

9.1 The Architecture Refactor & Dropping the Wind

Upon further stress-testing, I decided to completely refactor the underlying spatial masking logic. We stripped out the convoluted UV-based distance fields and unified all clipping strictly under Position (Object) space. By generating a pure 3D capsule intersection (Mask_xz * Mask_y), we restored absolute, mathematically perfect control to the Radius and Height parameters.

The refactored Shader Graph utilizing unified Object Space positioning for strict 3D capsule masking

The refactored spatial masking pipeline. Notice the clean separation of Vertical and XZ constraints.
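A hard-edged Python sketch of the capsule intersection; the graph uses smooth falloffs on both terms, so the binary thresholds and names here are illustrative only:

```python
import math

def capsule_mask(px, py, pz, radius, half_height):
    """Object-space 3D capsule intersection: a radial XZ constraint
    multiplied by a vertical band (Mask_xz * Mask_y). Because both
    terms read Position (Object), the clip region follows the decal's
    transform exactly, independent of UVs."""
    mask_xz = 1.0 if math.hypot(px, pz) <= radius else 0.0
    mask_y = 1.0 if abs(py) <= half_height else 0.0
    return mask_xz * mask_y

print(capsule_mask(0.0, 0.0, 0.0, 1.0, 0.5))  # 1.0 - inside both constraints
print(capsule_mask(2.0, 0.0, 0.0, 1.0, 0.5))  # 0.0 - outside the radius
print(capsule_mask(0.0, 2.0, 0.0, 1.0, 0.5))  # 0.0 - above the height band
```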

Addition by Subtraction (Removing the Wind): During this refactor, we made a crucial art-direction call: We completely removed the dynamic wind distortion from the decal. Why? Because our environment’s procedural grass shaders already react beautifully to the global wind parameters. Having the glowing ground decal also warp with the wind created visual clutter and redundancy. By letting the grass carry the “wind feedback” and keeping the energy decal focused on pure, crackling emission, the scene immediately felt more grounded and performed noticeably better.

Furthermore, this frees us up for our next major step: transitioning away from relying solely on a 2D Decal. By treating this shader as just the “base burn mark” and layering it with actual GPU particles and volumetric meshes, we can break the monotony of planar projections.


9.2 Eradicating the “Brush Stroke” Artifact (Substance Designer)

With the spatial masking stabilized in Unity, the focus shifted to the raw texture quality. The heavy “brush stroke” feel was a byproduct of pushing a standard Histogram Scan too hard on low-frequency noise, which inherently yields smooth, vector-like edges.

To achieve the aggressively sharp, crackling lightning threads (the “Tar vine” look), we had to abandon simple thresholding and develop a dedicated “Skeleton Extraction and Scatter” pipeline in Substance Designer.

Substance Designer node cluster: Highpass Grayscale -> Levels -> Slope Blur -> Blend

Substance Designer node cluster: Highpass Grayscale -> Levels -> Slope Blur -> Blend.

As illustrated in the node cluster above, the secret sauce lies in this specific four-step operation:

  1. The Highpass Scalpel (Highpass grayscale): We completely bypassed Histogram Scan. Instead, the torn radial shapes were fed into a Highpass filter. This acts as a non-linear frequency separator, discarding the bloated glowing areas and isolating only the most extreme structural transitions.
  2. Levels Edge-Crush (Levels): Immediately following the Highpass, an aggressive Levels node crushes the remaining mid-tones to absolute black. What survives is a razor-sharp, sub-pixel-width energy skeleton.
  3. Fractal Micro-Distortion (Slope blur grayscale + Fractal sum 1): A pristine, sharp line looks too “digital.” To break this, we split the output of the skeleton. One branch is fed into a Slope Blur driven by a high-frequency Fractal Sum noise. This operation physically chews up the smooth edges of the skeleton, introducing chaotic, organic electrical tearing and atmospheric dispersion.
  4. Core-to-Scatter Blending (Blend): Finally, we use a Blend node (Screen/Lighten) to recombine the clean, sharp skeleton (representing the blinding energy core) with the slope-blurred, distorted version (representing the surrounding energy scatter).

9.3 Conclusion of DLC 2

This dual-layer compositing strategy provides a solid, intensely bright structural core surrounded by erratic, crackling micro-details. When this meticulously crafted texture is multiplied by our refactored 3D Master Mask and pushed through HDRP’s EmissionColor, it completely sheds the flat “painted” look.

The result is a volatile, highly aggressive volumetric plasma base—perfectly optimized, visually fierce, and ready to be combined with our broader VFX toolset.