Playing with physics in games is something of a pastime for a lot of people. Half-Life 2, a game that came out in 2004, is well known for making much of its world interactable, moveable, and sometimes breakable. Where have we gotten since then? Destructible environments, realistic cloth physics, and not much else as far as I can tell. One of the neglected parts of game physics, though, is destructible food. Or, more broadly, soft bodies that can be manipulated by the physics engine, broken apart, and molded by impacts in a mostly realistic way. This is a big ask, so I decided to take things step by step. First, we need to pick a 3D engine and some basic object to turn into a soft body. In this case, we go with Unreal Engine 4 and a cube that should feel like gelatine. After that, we want to see if we can make it smaller by “breaking off” pieces of it or compressing it. Being able to modify properties like stiffness and weight would make this even more powerful, and if we can make it look pretty in the end, that would be great as well. But was I actually able to do all that? Not really, but I can at least tell you what I did manage, what I struggled with, and what you could potentially improve upon.
How does one go about making soft bodies in Unreal Engine? I learned the unfortunate reality that it is not as simple as instantiating a soft body object in the engine. Unreal Engine 4 does come with a Clothing Tool that makes planes behave as soft body objects, so what happens when you paint a whole cube mesh with it? Not a lot of good, as it turns out. The physics engine doesn’t know what to do with the mesh’s vertices, and it breaks in ways that are hard to describe. The Clothing Tool is, unfortunately, not usable for anything other than cloth-like objects.
If the Clothing Tool can’t handle this kind of mesh physics, what other options are there? Of the free options out there, the one with the most (or really, only) community usage is NVIDIA’s GameWorks FleX package. FleX uses particles to manipulate a mesh away from its original state. FleX has multiple uses, including soft bodies, cloth simulation, liquid simulation, and even rigid bodies. This makes it a very versatile tool, even though all I want it for is soft body meshes. The only unfortunate part of this process is having to compile a version of the engine that NVIDIA provides on their GitHub, which is currently stuck on version 4.19.2 of Unreal. Some members of the community have made their own forks of the project to use FleX with newer engine versions, but NVIDIA doesn’t seem to respond to any pull requests for their GameWorks projects as of late. I decided to use a fork by Unreal community member 0lento, available on GitHub, which fixes some small bugs.
The workflow of FleX makes it very appealing for this project, as almost any 3D mesh created for use in Unreal can be converted to a FleX Mesh in-engine. Once you create it, all you have to do is change a setting to convert it to a FleX Soft Asset, FleX’s soft body asset type. This generates particles inside the shape of the mesh; the user can control the particle size and the stiffness of the links between them. The default settings can create too many particles for a smaller mesh, and larger meshes can be almost unusable if you do not know how the settings affect different meshes. These settings don’t have much documentation, so users will likely have to find what is right for their meshes through trial and error.
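To make the particles-and-links idea concrete, here is a minimal mass-spring sketch in plain C++. This is not FleX’s actual solver (FleX is a GPU particle solver); the structs, the explicit Euler integration, and all the names here are illustrative assumptions about what “particles linked with a stiffness” means:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical particle with position and velocity.
struct Particle { float x, y, z; float vx = 0, vy = 0, vz = 0; };

// A link between two particles with a rest length and a stiffness,
// loosely analogous to the link stiffness exposed on a FleX Soft Asset.
struct Spring { std::size_t a, b; float rest, stiffness; };

// One explicit-Euler step: each spring pulls its endpoints back
// toward the rest length, scaled by stiffness (Hooke's law).
void step(std::vector<Particle>& ps, const std::vector<Spring>& springs, float dt) {
    for (const Spring& s : springs) {
        Particle& p = ps[s.a];
        Particle& q = ps[s.b];
        float dx = q.x - p.x, dy = q.y - p.y, dz = q.z - p.z;
        float len = std::sqrt(dx * dx + dy * dy + dz * dz);
        if (len < 1e-6f) continue;                    // avoid divide-by-zero
        float f = s.stiffness * (len - s.rest) / len; // spring force per unit offset
        p.vx += f * dx * dt; p.vy += f * dy * dt; p.vz += f * dz * dt;
        q.vx -= f * dx * dt; q.vy -= f * dy * dt; q.vz -= f * dz * dt;
    }
    for (Particle& p : ps) { p.x += p.vx * dt; p.y += p.vy * dt; p.z += p.vz * dt; }
}
```

A higher stiffness makes the body snap back faster, which is the same trade-off the FleX settings expose: too stiff and it behaves like a rigid body, too loose and it collapses.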
Once you have these set up though, you can have soft body meshes thrown around your screen! This was the bare minimum I needed to start on this project, so where do we go from here?
As I quickly learned, these soft bodies were a lot more limited than I expected. FleX moves the mesh around the scene separately from the Actor’s position, even if the mesh is the root of the Actor. You can see this in-engine: if a soft body FleX Mesh is spawned with a child Collision Mesh, the Collision Mesh stays at the point where the soft body was originally positioned while the mesh itself moves away. This wouldn’t be much of a problem if the mesh had its own collision, which could be used to check whether the soft body hits something like a Static Mesh in the level. However, for some inexplicable reason, FleX-based soft bodies cannot send any sort of collision events to the engine (FleX’s cloth objects share this limitation). This took more time than it should have to discover, and it all but forced me to find some other way of making soft bodies that I can manipulate.
Where does one go from here, then? One potential way to create a jelly-like object is through shaders or materials. In Unreal, Materials tend to do all the work one might associate with Unity’s shaders, and thanks to Roel, a member of the Unreal Engine Forums, I was able to use their old example of Distance Field Soft Bodies to create a very simple soft body simulation. This is not as realistic as what FleX can give, though: it only works well with spheres and other rounded objects, and it is a little “hacky” to set up. For the effect to work, a Collision Mesh should sit inside the Static Mesh, with the Static Mesh set to Overlap or Ignore all collisions. Combined with a custom Physics Material, this can give a great bouncing ball effect, or a flattened piece of food. Now that we have something working, and collision events are no longer a worry since we are using Unreal’s own Static Meshes, the next thing to do is simulate breaking these objects apart.
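To show the idea behind the squashing effect without the material graph, here is a CPU analogue of the same deformation: any vertex that would sink below a floor plane gets clamped to the floor and pushed outward, which is roughly the flattening the distance-field material produces on the GPU. This is an illustrative assumption, not Roel’s actual material; the function name and the `spread` parameter are made up:

```cpp
#include <vector>

struct Vec3 { float x, y, z; };

// Flatten any vertex below the floor plane z = 0: clamp it to the
// floor and bulge it outward from the mesh centre, proportional to
// how deep it sank. This mimics the squashed-jelly look.
void squashAgainstFloor(std::vector<Vec3>& verts, float spread) {
    // Find the centre in x/y so we know which way "outward" is.
    Vec3 c{0, 0, 0};
    for (const Vec3& v : verts) { c.x += v.x; c.y += v.y; }
    c.x /= verts.size();
    c.y /= verts.size();
    for (Vec3& v : verts) {
        if (v.z < 0.0f) {
            float depth = -v.z;                   // how far it sank
            v.z = 0.0f;                           // clamp to the floor
            v.x += (v.x - c.x) * spread * depth;  // bulge sideways
            v.y += (v.y - c.y) * spread * depth;
        }
    }
}
```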
One of the cool things about Unreal is that it ships experimental features in stable engine versions, just disabled by default on new projects. The one that matters for this project is Procedural Meshes. A Procedural Mesh can be created manually by the user from an array of vertices, represented as vectors, and built by the engine; it can also be built by copying a Static Mesh. This is great, as we can easily test out the most important feature of Procedural Meshes: slicing them. Thanks to the “procedural” nature of these meshes, we are able to modify them at runtime.
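At its core, the data a procedural mesh is built from is just a vertex array plus a triangle index array. A tiny engine-agnostic sketch (the struct and function names are mine, not Unreal’s API):

```cpp
#include <array>
#include <vector>

struct Vec3 { float x, y, z; };

// The minimal data a procedural mesh is assembled from: positions
// plus index triples, each triple naming one triangle's corners.
struct MeshData {
    std::vector<Vec3> vertices;
    std::vector<std::array<int, 3>> triangles;
};

// Build a unit quad in the XY plane out of two triangles.
MeshData buildQuad() {
    MeshData m;
    m.vertices = { {0, 0, 0}, {1, 0, 0}, {1, 1, 0}, {0, 1, 0} };
    // Two counter-clockwise triangles covering the quad.
    m.triangles = { {0, 1, 2}, {0, 2, 3} };
    return m;
}
```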
The only annoying part is that most of these features exist only as Blueprint scripting nodes and not C++ methods, so it can make more sense to build these objects entirely in Blueprints even in a mostly C++ project. A slice is defined by a position and a normal, and you have the option not to create a separate mesh for the cut-off half. This opens up some creative options for how your mesh looks after, say, it collides with the floor beneath it.
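The position-plus-normal slice can be sketched in plain C++. This simplified version only sorts whole triangles into the two half-spaces by centroid; Unreal’s actual slice node also clips triangles that straddle the plane and can cap the cut with a new surface, which is omitted here. All names are illustrative:

```cpp
#include <array>
#include <vector>

struct Vec3 { float x, y, z; };

// Signed distance of point p from the plane through `o` with normal `n`.
float planeSide(const Vec3& p, const Vec3& o, const Vec3& n) {
    return (p.x - o.x) * n.x + (p.y - o.y) * n.y + (p.z - o.z) * n.z;
}

// Simplified slice: assign each triangle to the front or back
// half-space by its centroid, producing two index lists over the
// same shared vertex array.
void sliceByPlane(const std::vector<Vec3>& verts,
                  const std::vector<std::array<int, 3>>& tris,
                  const Vec3& planeOrigin, const Vec3& planeNormal,
                  std::vector<std::array<int, 3>>& front,
                  std::vector<std::array<int, 3>>& back) {
    for (const auto& t : tris) {
        Vec3 c{ (verts[t[0]].x + verts[t[1]].x + verts[t[2]].x) / 3.0f,
                (verts[t[0]].y + verts[t[1]].y + verts[t[2]].y) / 3.0f,
                (verts[t[0]].z + verts[t[1]].z + verts[t[2]].z) / 3.0f };
        if (planeSide(c, planeOrigin, planeNormal) >= 0.0f) front.push_back(t);
        else back.push_back(t);
    }
}
```

Whether the `back` list becomes a new mesh or is simply discarded is exactly the choice the engine exposes when slicing.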
“Procedural meshes sound cool!” I hear you say. “Surely there has to be some catch?” Well, you’re right, there is a catch: collisions are hard to control on procedural meshes. In my testing, either collision would only be generated for the newly created mesh when requested, or the existing collision mesh would end up strangely positioned. This means the Distance Field effect I used for the jelly look isn’t going to work here, and at this point most of the Unreal Engine options I was willing to work with had been exhausted. What is left?
If I wanted more control over the mesh, why not just manipulate the mesh’s vertices directly? Unfortunately, Unreal doesn’t seem to allow that sort of manipulation by the end user. Neither Static Meshes nor Procedural Meshes expose their vertices in Blueprints or C++, so that idea got thrown out the window fairly quickly. For me, that left only one way to see if this project was doable: move to Unity.
Unity gives the user a surprising amount of control over its Game Objects, which makes it a perfect candidate for something like this. Jelly-like shaders have been done before in Unity, but since I lacked the necessary shader knowledge, a different approach was needed. Any mesh used in Unity comes with a Mesh Filter component on its Game Object, and through it the user can access the vertices of a mesh and manipulate them as they please. A great example of this was shown by the YouTube channel Binary Lunar, which demonstrated 2D soft bodies in Unity using 3D meshes in 2D space. By attaching a small game object with a rigidbody and collider to each vertex and linking them with joints, a simple soft body simulation can be achieved fairly easily.

However, problems arose when a mesh got more complex than several triangles put together. Whether due to how programs like Blender and Maya export their models, or how Unity imports them, the vertex array seems to list vertices in an arbitrary order. This was a problem in 2D, and it only gets worse in 3D. Not only that, but almost every fully 3D mesh I threw at the engine produced multiple vertex entries at what appears to be a single vertex. You can see this even on Unity’s built-in cube, where each corner contains three vertex points, one for each adjoining face. So to use this type of soft body simulation right now, I would need to manually connect the vertices to each other and ensure the correct vertex count, which rules out any complex mesh if the work is to be done in a reasonable amount of time. The connections could potentially be made with raycasts, but that would require each point to have a set number of connections and to connect only to points within a certain distance. This could get messy real quick, and I unfortunately didn’t have enough time to test whether it was even possible.
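One way to tackle the duplicated-corner problem is to weld vertices that share a position before attaching rigidbodies, as a preprocessing step rather than anything the engine does for you. A hedged sketch in plain C++ (the names and the O(n²) approach are my own; a spatial hash would be needed for large meshes):

```cpp
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

// Imported meshes often store one vertex per face corner (a cube ends
// up with three entries per physical corner, one per face normal).
// Map every index to the first index within `epsilon` of it, so a
// joint or spring can be attached once per physical corner.
std::vector<int> weldVertices(const std::vector<Vec3>& verts, float epsilon) {
    std::vector<int> remap(verts.size());
    for (std::size_t i = 0; i < verts.size(); ++i) {
        remap[i] = static_cast<int>(i);  // default: maps to itself
        for (std::size_t j = 0; j < i; ++j) {
            float dx = verts[i].x - verts[j].x;
            float dy = verts[i].y - verts[j].y;
            float dz = verts[i].z - verts[j].z;
            if (dx * dx + dy * dy + dz * dz <= epsilon * epsilon) {
                remap[i] = remap[j];     // duplicate: reuse earlier index
                break;
            }
        }
    }
    return remap;
}
```

After welding, simulation runs on the unique points and the full vertex array is written back from them each frame, so the renderer still sees the split normals it expects.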
Where Do We Go Now?
In the end, I ended up with a bunch of solutions that could not solve my problem, each with its own limitations. I was not able to explore other options for soft bodies and breakable objects, such as AMD’s FEMFX project. NVIDIA’s FleX is also available for Unity, but I went into Unity assuming the same limitations from the Unreal implementation applied there as well. While my work here shows many of the pitfalls of these approaches, I hope that anyone who finds this has the time and technical ability to push it farther than I could. In the future, someone else or I could try to fix how Unity handles vertices, or build a solution for finding neighboring vertices (something could be done with raycasts, but that is as far as I was able to take that idea). Modifying Unreal’s source code to allow easier manipulation of mesh vertices could also be a good starting point. Or maybe Unreal and Unity just aren’t suited for this kind of work, and some other engine or a custom solution is needed for the task. At any rate, all of the work I have done is available as open source on my GitHub. For the work in Unreal, check the repos here and here, and for the small amount of work done in Unity, check this repo.