The solution is to animate meshes on the GPU (in shaders) and use the batchable MeshRenderer as a substitute for its skinned variant. Unity doesn't have a built-in tool for this, so I made one myself. I wrote a rough editor script that reads the vertex positions of the skinned mesh on every frame and stores them into a texture. A custom shader then repositions the vertices at runtime by reading the positions from that texture.
As I used a shader graph and my baker utility class is very specialized for my needs, it's hard to share any useful code just yet. But if you're interested, I found this article and this repository helpful.
As a starting point, the function AnimationClip.SampleAnimation sets the mesh into the proper pose at a given time, and SkinnedMeshRenderer.BakeMesh can be used to extract the mesh data at that moment. In the shader, the vertex index is available in the built-in vertexID variable (the Vertex ID node in Shader Graph). An interesting finding is that if you store each keyframe on a separate texture row, bilinear filtering will automatically handle the animation blending (see the article linked above).
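Since my actual baker is too entangled with the project to share, here's a minimal sketch of the idea under my own assumptions (the name BakeAnimationToTexture, the frame rate parameter, and the per-pixel SetPixel loop are all illustrative, not the real tool):

using UnityEngine;

public static class AnimationBaker {
    // Sketch only: bakes one clip into a texture where x = vertex index
    // and y = keyframe. One row per keyframe lets bilinear filtering
    // interpolate between poses when the shader samples between rows.
    public static Texture2D BakeAnimationToTexture(
            GameObject pGo, AnimationClip pClip,
            SkinnedMeshRenderer pRenderer, int pFrameRate = 30) {
        int frameCount = Mathf.CeilToInt(pClip.length * pFrameRate);
        int vertexCount = pRenderer.sharedMesh.vertexCount;
        var mesh = new Mesh();

        // Half-float format so negative positions keep decent precision.
        var texture = new Texture2D(vertexCount, frameCount, TextureFormat.RGBAHalf, false);
        texture.filterMode = FilterMode.Bilinear;

        for (int frame = 0; frame < frameCount; frame++) {
            // Pose the model at this time, then snapshot the deformed mesh.
            pClip.SampleAnimation(pGo, frame / (float)pFrameRate);
            pRenderer.BakeMesh(mesh);

            var vertices = mesh.vertices;
            for (int v = 0; v < vertexCount; v++) {
                var p = vertices[v];
                texture.SetPixel(v, frame, new Color(p.x, p.y, p.z, 1f));
            }
        }
        texture.Apply();
        return texture;
    }
}

The shader side then just samples this texture at (vertexID, time) and uses the result as the vertex position.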
Resizing the texture to a power of two (not strictly required in desktop development) was a surprisingly complex task that required an external render texture (or I've just missed some all-too-obvious utility function). Here's what I'm doing for texture scaling (note the texture format, which is needed for storing negative values with decent precision):
public static void ResizeTexture(Texture2D pTexture, int pWidth, int pHeight) {
    // Blit the source into a temporary half-float render texture at the target size.
    var rt = RenderTexture.GetTemporary(pWidth, pHeight, 0, RenderTextureFormat.ARGBHalf, RenderTextureReadWrite.Default);
    RenderTexture.active = rt;
    Graphics.Blit(pTexture, rt);

    // Reinitialize the source texture to the new size, then read the scaled pixels back.
    pTexture.Reinitialize(pWidth, pHeight, TextureFormat.RGBAHalf, false);
    pTexture.filterMode = FilterMode.Bilinear;
    pTexture.ReadPixels(new Rect(0f, 0f, pWidth, pHeight), 0, 0);
    pTexture.Apply();

    RenderTexture.active = null;
    RenderTexture.ReleaseTemporary(rt);
}
In the end, animation baking wasn't nearly as complex as I feared, but it did produce some hilarious glitches. During the first attempts, models distorted heavily and made the audience look like a horde of Lovecraftian monsters; sadly I didn't record a video of it. Of course there's still lots to do. Instead of repeating identical gestures, each spectator needs individual variation, so blending different animations at different speeds per individual is required; this will probably introduce some batching issues. I also need to bake normals and tangents, and solve smaller issues with camera frustum culling (probably related to AABB calculation). But at least I now have a solid base to build on, already able to run thousands of animated spectators with no noticeable FPS drop.
I could write a lengthy blog post about the subject. Perhaps I will at some point. Writing about the process seems to be slower than the actual implementation, so hopefully someone finds these notes useful.
Besides the GPU magic, I've also improved the background of the shipyard, experimented with GUI design and ACES tone mapping (I have to boost all those baked lights…), and so on. I'll come back to these in upcoming posts.