The limitations of texture sheet animation can be pretty obvious at times. A texture sheet with 6 columns and 6 rows gives a grand total of 36 distinct frames. That means after 1 second of playback (at an animation speed of 36 frames per second), we have finished our animation.
But what if we want a longer particle lifetime than that? What if it’s a huge burning building and we need billowing smoke that lasts for maybe 10-20 seconds?
Sure, we could just stretch the lifetime and let the texture sheet play over a longer period of time. But this introduces laggy-looking artifacts every time the texture sheet switches frames.
Laggy frame switching: a 6×6 texture sheet stretched over 5 seconds, rendered at 60 FPS
This comes from mismatched framerates. The texture sheet can deliver at most 36 unique frames per second before it falls out of sync with the rendering speed. If the engine renders at 60 FPS, we’re skipping (lagging) 24 frames every second, and stretching the sheet over several seconds makes the gap even worse.
This is a problem I’ve encountered many times, and I usually worked around it by putting more frames in my texture sheet or more particles in the particle system. Often, though, this leads to fillrate problems, especially in a game where lots of particle effects are already happening. But now that I’m writing a custom particle editor, I thought I might solve this problem once and for all.
I wanted a solution that would meet the following requirements:
A) Smooth transitions
B) Speed & lifetime independent (think slow motion, speed-ups, whatever the game might need)
C) Good performance (hopefully purely on a GPU level)
The solution I came up with has apparently already been developed elsewhere. I’ve heard rumors that Guerrilla Games wrote an internal shader for this purpose (although this is only speculation at this point; I haven’t confirmed it), and I believe some YouTubers have done this with Unreal’s node-based shader network.
Unity Technologies also incorporated this technique into its particle system component in a recent patch, although we don’t have any insight into how it works internally. For the user, it’s just another setting that sends some extra UVs to the Pixel Shader.
Luckily, using just another set of UVs and a few simple math tricks, we can solve it ourselves in the Geometry Shader stage. Assuming the particle shader is already built around a GS, we need a few parameters to make this work (a sketch of how they might be passed along follows the list below).
Non-interpolated particle vs. interpolated particle, both at 60 FPS
To figure this out, we need:
- Total amount of frames
- Current animation frame
- Next animation frame (the frame to blend to)
- Current lifetime of the particle (value from 0.0 to 1.0)
- Frame percent (how far we are on this particular frame; value from 0.0 to 1.0)
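As a sketch of how these parameters could travel from the Geometry Shader to the Pixel Shader, the GS output might carry two UV sets plus the blend factor. The struct and semantic names here are my own, not necessarily what the final source uses:

struct GSOutput
{
    float4 position     : SV_POSITION;
    float2 uvCurrent    : TEXCOORD0; // UV into the current animation frame
    float2 uvNext       : TEXCOORD1; // UV into the next animation frame
    float  framePercent : TEXCOORD2; // 0.0 to 1.0 progress within the current frame
};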
We’re going to store the next animation frame in a separate UV channel. Using the regular UV, the next frame’s UV, and our current frame percent, we can interpolate between the two samples and get a color value based on how far we’ve progressed through the current frame before the next one is displayed. Calculating the current frame is rather easy:
int totalframes = rows * columns;
int currentframe = min((int)(totalframes * lifetime), totalframes - 1); // lifetime is a normalized float from 0.0 to 1.0; clamped so lifetime == 1.0 doesn't step past the last frame
Next frame:
int nextframe = min(currentframe + 1, totalframes - 1); // clamped to the last valid frame index so we never read past the sheet
Frame percent:
float framePercent = rescale(lifetime, (float)currentframe / totalframes, (float)(currentframe + 1) / totalframes); // deliberately currentframe + 1 rather than nextframe, so the last frame doesn't divide by zero
The rescale function looks like this:
float rescale(float oldVal, float min, float max)
{
    return (oldVal - min) / (max - min);
}
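As a quick sanity check: with a 6×6 sheet (36 frames) and lifetime = 0.51, we get currentframe = (int)(36 × 0.51) = 18, nextframe = 19 and framePercent = rescale(0.51, 18/36, 19/36) ≈ 0.36. In other words, we’re about a third of the way through frame 18, so the final blend will be roughly 64% frame 18 and 36% frame 19.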
Mapping these values into UV coordinates requires a bit more math:
float2 uv; // top-left corner of the current frame's cell in the sheet
uv.x = (float)(currentframe % columns) / (float)columns; // column index
uv.y = (float)(currentframe / columns) / (float)rows;    // integer division gives the row index
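Exactly the same mapping, applied to nextframe, gives us the second UV set to blend toward (calling it nextUV here):

float2 nextUV; // top-left corner of the next frame's cell
nextUV.x = (float)(nextframe % columns) / (float)columns;
nextUV.y = (float)(nextframe / columns) / (float)rows;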
After this you need to build your own quad and place the UVs on its corners accordingly.
I’ll leave the details to further reading (the full source is on my GitHub), but a rough sketch follows below.
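Still, here’s a rough, self-contained sketch of what the quad expansion could look like, assuming the GSOutput struct from earlier and the rescale function above. The constant buffer and input layout are hypothetical, my own invention; the actual source will differ in the details:

// Hypothetical constant buffer and particle input; these names are mine
// and almost certainly differ from the real source.
cbuffer PerFrame : register(b0)
{
    float4x4 viewProjection;
    float3   cameraRight;   // camera right axis, world space
    float3   cameraUp;      // camera up axis, world space
    int      columns;
    int      rows;
};

struct ParticleIn
{
    float3 position : POSITION;  // particle center, world space
    float  size     : TEXCOORD0; // half-size of the billboard
    float  lifetime : TEXCOORD1; // normalized 0.0 to 1.0
};

[maxvertexcount(4)]
void GSMain(point ParticleIn input[1], inout TriangleStream<GSOutput> stream)
{
    // Frame math, exactly as in the snippets above.
    int totalframes = rows * columns;
    int currentframe = min((int)(totalframes * input[0].lifetime), totalframes - 1);
    int nextframe = min(currentframe + 1, totalframes - 1);
    float framePercent = rescale(input[0].lifetime,
                                 (float)currentframe / totalframes,
                                 (float)(currentframe + 1) / totalframes);

    float2 cellSize = float2(1.0 / columns, 1.0 / rows); // UV extent of one frame
    float2 uv     = float2((float)(currentframe % columns) / columns,
                           (float)(currentframe / columns) / rows);
    float2 nextUV = float2((float)(nextframe % columns) / columns,
                           (float)(nextframe / columns) / rows);

    // Corner offsets for a triangle strip, with matching UVs inside the cell
    // (D3D convention: v grows downward, so UV (0,0) is the top-left corner).
    static const float2 offsets[4]   = { float2(-1, 1), float2(1, 1), float2(-1, -1), float2(1, -1) };
    static const float2 cornerUVs[4] = { float2(0, 0),  float2(1, 0), float2(0, 1),   float2(1, 1) };

    [unroll]
    for (int i = 0; i < 4; ++i)
    {
        GSOutput v;
        float3 worldPos = input[0].position
                        + (cameraRight * offsets[i].x + cameraUp * offsets[i].y) * input[0].size;
        v.position     = mul(float4(worldPos, 1.0), viewProjection);
        v.uvCurrent    = uv + cornerUVs[i] * cellSize;
        v.uvNext       = nextUV + cornerUVs[i] * cellSize;
        v.framePercent = framePercent;
        stream.Append(v);
    }
}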
Later on in the Pixel Shader stage, we’re going to use the framePercent value to produce this:
float4 color = lerp(particleTexture.Sample(smp, uv0), particleTexture.Sample(smp, uv1), framePercent);
This is the final step that allows us to have a smooth RGB+A transition between two frames.
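Put together, a minimal Pixel Shader for this might look like the following, again using the hypothetical GSOutput struct from earlier:

Texture2D particleTexture : register(t0);
SamplerState smp          : register(s0);

float4 PSMain(GSOutput input) : SV_TARGET
{
    // Sample both frames and blend by how far we are through the current one.
    float4 currentColor = particleTexture.Sample(smp, input.uvCurrent);
    float4 nextColor    = particleTexture.Sample(smp, input.uvNext);
    return lerp(currentColor, nextColor, input.framePercent);
}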
Conclusion:
The upside of all this is that we get way smoother transitions and far less laggy-looking playback, completely independent of the game’s timescale. We can also get away with something like a 2×2 texture of random-looking blobs, and it won’t necessarily look bad; it can sometimes look really cool even though you might not think it would.
The downside is that it costs a bit more performance-wise: you have to send one extra float and one extra UV set to the Pixel Shader, and you have to sample your texture twice, no matter what. If you want to use this technique, you might want to keep two shaders, one interpolated and one regular, and only use the interpolated one where it makes sense.