This document provides basic descriptions of a number of shaders used to support the art, effects, and design departments throughout the development of Kaos Studios' last two projects. I've tried to be as clear and complete as possible without compromising some of the technology I helped bring to the studio, but please don't hesitate to contact me if you have any questions.
Tests were conducted to reconstruct basic lighting models for experimentation and debugging purposes.
NOTES: (A unique ambient lighting term and a spherical-harmonics-based lighting solution were used to push the visibility of specular and normal information and to provide a feasible method for achieving the lighting style desired by the project's scope. Many issues were encountered when experimenting with these techniques, and reconstructing fundamental lighting models proved beneficial during the implementation of both systems. These tests were helpful in exposing new ways to utilize the technologies and in assuring the project took full advantage of the systems. The tests were also used to identify and isolate issues that multiplied as the lighting solutions grew in complexity to meet the demands of the project.)
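As a rough illustration of the kind of lighting-model reconstruction described above, here is a minimal CPU-side sketch of a Lambert diffuse plus Blinn specular evaluation of the sort that is useful for debugging. All function names are illustrative, not taken from the original shaders.

```python
# Minimal sketch of a reconstructed basic lighting model:
# Lambert diffuse plus Blinn specular for a single directional light.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lambert_blinn(normal, light_dir, view_dir, shininess=32.0):
    """Return (diffuse, specular) terms for one directional light."""
    n, l, v = normalize(normal), normalize(light_dir), normalize(view_dir)
    diffuse = max(dot(n, l), 0.0)
    specular = 0.0
    if diffuse > 0.0:
        # Blinn half-vector between the light and view directions.
        half = normalize(tuple(a + b for a, b in zip(l, v)))
        specular = max(dot(n, half), 0.0) ** shininess
    return diffuse, specular
```

A head-on light and view direction yield full diffuse and specular; a light at a grazing angle drops both toward zero, which is the behavior these debug reconstructions help verify.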
Blend modes and filters were recreated for designing and implementing different effects. Below you can see the two gradient textures used for testing the blends.
NOTES: (Post-process effects such as health/damage states, shell shock, night/thermal vision, flash bang grenades, heat haze, volumetric light scattering, and video/digital feeds were all prototyped using special blend modes and filters. These techniques are all based on options available in Photoshop. One thing to note here is that Photoshop's blend calculations are hard-coded math executed on the CPU, while the same equations in a shader are evaluated on the GPU, which often uses lower-precision approximations. This is the cause of slight discrepancies in the visual results when comparing the UE3 implementation of these equations to the results from an application such as Photoshop.)
NOTES: (Below is a screenshot displaying the results of many blends and filters applied to the same scene. These techniques were combined with different technologies available in UE3 to produce several post-process effects.)
I used world space normals to map effects onto individual assets. This technique makes an effect look as though it was projected onto a surface from a particular direction. It was used for water/rain, snow/ice, dust/sand, moss/mud, and ash/soot, but for the samples I used snow/ice.
NOTES: (In the screenshots below you will see the node network responsible for the projection mapping calculations as well as some before and after comparisons I did to show the benefit of integrating this tech into the parent shaders for all objects. The final comparisons show how a single developer can take a warm, sunny map and, with a single day's worth of work, transform the look of the environment into its complete opposite.)
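The core of the projection idea can be sketched in a few lines (this is an illustrative approximation, not the original node network): the blend mask for the snow layer comes from how closely a surface's world-space normal faces the projection direction, here straight up. The `coverage` and `hardness` parameters are assumptions for the sketch.

```python
# World-space projection mask: up-facing surfaces receive the effect.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def saturate(x):
    return min(max(x, 0.0), 1.0)

def snow_mask(world_normal, coverage=0.5, hardness=4.0):
    """Blend weight for the snow layer based on the world-space normal."""
    facing = saturate(dot(world_normal, (0.0, 0.0, 1.0)))  # 1 = faces up
    # Remap so coverage sets how far down the sides the snow creeps
    # and hardness controls how sharp the transition is.
    return saturate((facing - (1.0 - coverage)) * hardness)
```

An upward-facing normal returns a full mask, a wall returns zero, and tilted surfaces fall in between — which is what makes the effect read as projected from above regardless of how an asset is rotated.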
For an E3 presentation a shader was needed to achieve the quality necessary for camera close ups against different residential building assets.
NOTES: (Five effects were needed to accomplish this material. For varying reasons these details could not be modeled into the base mesh. Animated detail blending was used to interpolate between base information and a high frequency pass. A grunge overlay was needed to break up tiling, and a color tint option was implemented to exclude the grunge data. Bump offset was used to give the illusion of a self-occluding surface, but this introduced stretching pixel artifacts at extreme view angles. To solve for the artifacts I attenuated the displacement values based on the view angle relative to the surface normal.)
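The bump offset fix can be sketched as follows, with illustrative names and an assumed (simple N·V) damping curve: standard bump offset shifts UVs by height along the view vector, and scaling that offset by how head-on the view is damps the stretching at grazing angles.

```python
# Bump offset (parallax) with view-angle attenuation to reduce
# the stretching artifacts seen at extreme view angles.

def parallax_offset(uv, height, view_ts, scale=0.05):
    """view_ts: tangent-space view vector (x, y, z), z along the normal."""
    n_dot_v = max(view_ts[2], 0.0)   # 1 when head-on, 0 at grazing angles
    damped = scale * n_dot_v         # less displacement at shallow angles
    return (uv[0] + view_ts[0] * height * damped,
            uv[1] + view_ts[1] * height * damped)
```

Head-on views get the full offset, while a grazing view contributes almost none, trading some parallax strength at shallow angles for the removal of the stretched-pixel look.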
NOTES: (The screenshot above shows, towards the bottom of the shot, how the slats appear to occlude the shaded sections of the material and towards the top of the image the shaded sections seem more exaggerated.)
Objects in the user interface had to animate based on in-game events, and materials were used to help convey different gameplay mechanics.
NOTES: (This UI element had to show the vehicle's current health percentage, how many pieces of armor remain, the current health percentages of each armor piece, and the base had to rotate independently from the turret. When health dropped by 33% a yellow tint would become visible in that affected section. When health dropped by 66% a flashing red effect would then be apparent in those damaged areas. When a piece of the vehicle got damaged beyond repair, that piece would get masked out and become invisible to the player.)
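The damage-state logic described above can be summarized as a simple threshold mapping. This is a sketch only: the 33%/66% breakpoints come from the text, while the state names and exact comparisons are assumptions.

```python
# UI damage-state mapping for an armor piece, per the thresholds above.

def armor_piece_state(health_pct):
    """Map a piece's remaining health percentage to its UI display state."""
    if health_pct <= 0.0:
        return "hidden"         # destroyed pieces are masked out entirely
    if health_pct <= 34.0:      # health has dropped by roughly 66%
        return "flashing_red"
    if health_pct <= 67.0:      # health has dropped by roughly 33%
        return "yellow_tint"
    return "normal"
```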
Many effects for decals were animated via the material system including hot embers in scorch marks, expanding blood pools, dripping liquids, and crumbling debris.
NOTES: (The screenshots below show an example of dripping blood and the node network responsible for the effect. The shot of the blood was taken at the last stage of the animation. Initially when this splat appears there are no drips, and as time passes individual sections of the decal begin to drip down independently from each other.)
A material needed for one project had to make objects look as though they had been burned or were still burning. It was built keeping all parameter names intact from the other shaders in our project, so if we wanted to make something look burned it was a simple matter of changing the material's parent.
NOTES: (Below you see an image showing the parent material. The shader needed to contain almost all functionality available in other opaque rendered materials and required us to keep the naming conventions consistent for all parameters. This way if we wanted to make objects appear burnt, all that was needed was a simple switch of the material’s parent reference. The material took the base opaque rendered output and interpolated between that and a series of soot overlay textures (diffuse, specular, and normal). Another option added was responsible for the look of the burning embers. The embers were exposed with settings like brightness intensity, tiling amount, wind direction/speed, and flicker range/frequency.)
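The ember flicker can be illustrated with a small sketch: a periodic wave remapped into a brightness range. The exact waveform and parameter names here are assumptions; the exposed settings named above (brightness, flicker range/frequency) are what the sketch models.

```python
# Ember flicker: brightness modulated by a periodic flicker wave.
import math

def ember_flicker(time, frequency=3.0, flicker_range=0.4, brightness=2.0):
    """Ember intensity over time; flicker_range sets how deep the dips go."""
    # Sine wave remapped from [-1, 1] into [0, 1].
    wave = 0.5 + 0.5 * math.sin(time * frequency * 2.0 * math.pi)
    # Oscillate between brightness * (1 - flicker_range) and full brightness.
    return brightness * (1.0 - flicker_range + flicker_range * wave)
```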
A parent shader that defined the look of glass was implemented to help provide the consistency and quality desired by the directors.
NOTES: (Below you see an image of the glass parent shader with some default settings and values applied. This was used for everything glass, ranging from weapon scopes to windows and even the tiny shards used in some particle systems I built for the destructible objects. Since translucency was needed, we had to go with UE3's default unlit solution, but I was able to recreate the ambient, diffuse, and specular models needed to light the glass properly.)
NOTES: (Another thing to note is that the image was captured from the single sided version of the shader, but a double sided version was also successfully used in the project…on the weapon scopes, for example. UE3 does not handle double sided translucency well and can introduce sorting artifacts, so I used a single sided material with shader tricks to reproduce the behavior and look of a spherical double sided material.)
Since we used UE3's default lighting tech, we needed a solution for getting more lights into our scenes without hurting performance, and once again we looked to materials.
NOTES: (Below you see examples of a material that recreated the visual results of adding a point light to a scene. The material came with certain caveats but all in all it was a major win from a performance and workflow standpoint.)
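The fake-point-light idea can be sketched as follows (an illustrative approximation, not the shipped node network): the material itself computes a radial falloff and an N·L term against a light position, so surfaces appear lit without a real dynamic light being added to the scene.

```python
# Material-driven "fake" point light: attenuation and N·L computed
# in the shader instead of adding a real dynamic light.

def fake_point_light(surface_pos, normal, light_pos, radius, intensity=1.0):
    """Approximate lighting contribution at a surface point."""
    to_light = tuple(l - p for l, p in zip(light_pos, surface_pos))
    dist = sum(c * c for c in to_light) ** 0.5
    if dist >= radius or dist == 0.0:
        return 0.0
    l = tuple(c / dist for c in to_light)
    n_dot_l = max(sum(a * b for a, b in zip(normal, l)), 0.0)
    falloff = (1.0 - dist / radius) ** 2   # smooth radial attenuation
    return intensity * n_dot_l * falloff
```

The caveats mentioned above follow directly from the trick: the "light" only affects surfaces using this material and casts no shadows, but it costs a handful of shader instructions instead of a full light pass.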
Using the material system we were able to yield fantastic results in our attempts to recreate the details observed when viewing video footage.
NOTES: (Below are some images showing the material and its effect on a scene. It contained details such as film grain, chromatic aberration, grid lines/scan bands, static noise, smudge/scratch layers, and distortion.)
Similar to the glass shader, we needed to use UE3’s default translucent unlit lighting/blend model to get the smooth alphas needed for particles. Since it ignores lighting we recreated the calculations necessary to diffusely light our particles.
NOTES: (This was done not only to better integrate effects into our scenes but also to improve the behaviors of the effect’s shading. Smoke would appear to get brighter and more opaque out in the sun while inheriting the scene’s primary light color, and vice versa, when in shadow become more transparent and illuminated by the scene’s ambient light colors.)
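The sun/shadow behavior described above can be sketched as a pair of interpolations (all specific values and names here are illustrative assumptions): particle color blends between the scene's ambient color and its primary light color, while opacity scales down in shadow.

```python
# Lit-particle sketch: color and opacity interpolate between a shadowed
# (ambient-lit, more transparent) state and a sunlit (brighter, more
# opaque) state.

def lerp(a, b, t):
    return a + (b - a) * t

def light_smoke(sun_visibility, sun_color, ambient_color,
                base_opacity=0.6, shadow_opacity_scale=0.7):
    """sun_visibility: 0 in full shadow, 1 in direct sun."""
    color = tuple(lerp(a, s, sun_visibility)
                  for a, s in zip(ambient_color, sun_color))
    opacity = base_opacity * lerp(shadow_opacity_scale, 1.0, sun_visibility)
    return color, opacity
```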
For the weapons and vehicles a parent metal material was produced that contained all necessary conditions for conveying a metallic look.
NOTES: (A Blinn lighting model was used, and options for varying reflection styles were exposed. Special masks were introduced into the specularity channels to define areas where highlights would get tinted by the albedo hues and sections that would allow the highlights to pass through uninfluenced by the underlying surface color. This helped exaggerate the differences between dielectric surfaces and conductive metals. In addition, we also implemented a camo system which had to work for all weapons and weapon attachments.)
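The specular tint mask can be sketched like this (an illustrative reconstruction; the straight lerp and names are assumptions): where the mask is high, the highlight is multiplied by the albedo as on a conductive metal, and where it is low, the highlight keeps the light color, as on a dielectric surface.

```python
# Specular tint mask: metal-like areas tint the highlight by the albedo,
# dielectric-like areas let it pass through uninfluenced.

def tinted_specular(light_color, albedo, metal_mask, spec_intensity):
    """metal_mask: 1 = highlight tinted by albedo, 0 = untinted highlight."""
    return tuple(spec_intensity * l * (1.0 - metal_mask + metal_mask * a)
                 for l, a in zip(light_color, albedo))
```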
Water was used in our environments and separate materials were needed for the different objects in our scenes.
NOTES: (Our material for bodies of water had the same issues as glass and we benefitted greatly from recreating the lit specular calculations. I used the depth buffer to make the water become more or less opaque depending on how “deep” it was. Other depth based effects like foam forming around objects and roughness normals getting blended into the areas that interpenetrate terrain were implemented and later we created a dynamic shadow factory expression that allowed our water to receive shadow.)
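The depth-based opacity can be sketched as follows (illustrative names and values, not the production expression): the scene depth behind the water is compared against the depth of the water surface, and the difference is mapped through a fade distance.

```python
# Depth-based water opacity: deeper water reads as more opaque,
# shallow edges stay translucent.

def saturate(x):
    return min(max(x, 0.0), 1.0)

def water_opacity(scene_depth, surface_depth, fade_distance=200.0,
                  min_opacity=0.15, max_opacity=0.95):
    """Opacity from the thickness of the water column under the surface."""
    depth = max(scene_depth - surface_depth, 0.0)  # water column thickness
    t = saturate(depth / fade_distance)
    return min_opacity + (max_opacity - min_opacity) * t
```

The same depth comparison is what drives the other effects mentioned above, such as foam forming where objects intersect the surface.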
NOTES: (The functionality used for opaque objects had to be integrated into the parent opaque shaders and all cross functional parameters needed to match in naming for consistency reasons. The water information was in world space by default so designers could place objects however they desired in the scene and the different water effects would make sense visually…rain drops on surface tops and water flowing down the sides of objects. Almost all settings were exposed with global multipliers so features like wind direction and tiling could be changed on a per map basis. A procedural method for generating world space gradients was implemented to give designers masking controls so objects placed in a river could look wet in only the submerged areas.)
NOTES: (The shot below shows the combined result of many different water shader permutations in a single scene.)
We used special materials to achieve different artifact effects seen within the human optical system and with current film cameras.
NOTES: (Below you see an example of the node network I constructed for lens flares. We decided at the beginning of the project that if a lens flare looked good, we would use it even if it contradicted the style of lens conveyed by other effects. This resulted in a mix of both camera based lens flares and flares observed within the human eye showing up in the same scene together.)
NOTES: (The images below show some of the different effects we used in our flares, like edge-scaled anamorphic streaks, eyelash artifacts, and moisture/dust that is revealed on the lens when approaching a light source.)
© 2015 GavinLerner.com . email@example.com