Inverse kinematics for fingers: an explanation and example in Unreal Engine 4.21

Inverse kinematics seems like a daunting subject, but in Unreal Engine 4 it's surprisingly straightforward! I'll give an overview of how I applied it in VR, and provide an example project where you can look into the code itself and experiment.

Basic Inverse Kinematics for Fingers

Here's a video:



Short background story, feel free to skip ahead to the implementation. I'm a student, and for a school project we're making a VR horror game using the HTC Vive Pro. I wanted to get more experience with creating VR interactions, so I was assigned to the player character implementation. At some point during development, I saw a video showing inverse kinematics for placing the feet correctly when walking up stairs, and I thought: hey, I can do that too, right?


First of all, in case you don't know what inverse kinematics are, I recommend watching this video:

Oversimplified, it is a way to procedurally pose skeletal meshes. In Unreal Engine, you can use a node called "Two Bone IK" in your animation blueprints to make this happen. You provide two locations, a so-called "effector" and a "joint target", and together they let you manipulate a chain of two bones.
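To make the node less of a black box, here is a minimal sketch of the math a two-bone IK solve boils down to: the law of cosines gives the bend angle at the middle joint from the two bone lengths and the distance to the effector. This is plain C++ with a toy vector type, not the engine's implementation; `SolveMiddleJointAngle` and `Vec3` are illustrative names, not UE4 API.

```cpp
#include <cmath>

struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    double Length() const { return std::sqrt(x * x + y * y + z * z); }
};

// Given the root bone position, the effector target, and the two bone
// lengths, return the interior angle at the middle joint (in radians).
// pi means the chain is fully straight; smaller values mean more bend.
double SolveMiddleJointAngle(const Vec3& root, const Vec3& effector,
                             double upperLen, double lowerLen) {
    double reach = (effector - root).Length();
    // Clamp so an out-of-range effector fully straightens the chain.
    double maxReach = upperLen + lowerLen;
    if (reach > maxReach) reach = maxReach;
    // Law of cosines: reach^2 = a^2 + b^2 - 2ab*cos(theta)
    double cosTheta = (upperLen * upperLen + lowerLen * lowerLen -
                       reach * reach) / (2.0 * upperLen * lowerLen);
    return std::acos(cosTheta);
}
```

The joint target (covered later) then only has to pick which side of the root-to-effector line that bend folds toward.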


I've created a simple system that lets the fingers wrap around objects when you're grabbing them. There are many ways to do inverse kinematics; this document shows what the problem is and how I solved it, and then briefly justifies why I chose this solution.

1. The problem

For immersion, we need the hands to behave as realistically as possible. To that end, we've determined that we need the fingers to wrap around objects and interact with walls, so that we don't get clipping or very static-looking hands.

Here is a rough sketch of what needs to happen.

This is the problem: we have a bunch of fingers that need to detect collision and then rest on the first object they encounter.

2. The Solution

In the end, I’ve gone for a system that uses a combination of regular blueprint logic and animation blueprint logic.

You need a couple of things to get inverse kinematics working:

  • A skeletal mesh that has been correctly rigged.
  • An animation blueprint for the rig.
  • A blueprint using the skeletal mesh.


First off, we’ll need to find the locations where our fingers should rest. I decided to create UE4 arrow components that are hidden in game, and to calculate line traces based on the transform of each arrow. To the right is a visualization of the system.
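The arrow-based traces can be sketched as follows: take the arrow's world position as the trace start and extend along its forward vector to get the trace end. This is a plain C++ stand-in for the blueprint logic (in UE4 you would read the position and forward vector off the arrow component's transform); the type and function names here are illustrative assumptions, not engine API.

```cpp
struct Vec3 { double x, y, z; };

// Toy stand-in for an arrow component's world transform.
struct ArrowTransform {
    Vec3 position;   // where the arrow sits on the hand
    Vec3 forward;    // unit-length facing direction of the arrow
};

// Compute the start and end points the line trace would use.
void GetTraceEndpoints(const ArrowTransform& arrow, double traceLength,
                       Vec3& outStart, Vec3& outEnd) {
    outStart = arrow.position;
    outEnd = { arrow.position.x + arrow.forward.x * traceLength,
               arrow.position.y + arrow.forward.y * traceLength,
               arrow.position.z + arrow.forward.z * traceLength };
}
```

Because the arrows are parented to the hand mesh, the traces automatically follow wherever the hand moves.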



Below is another image meant to visualize what this looks like in the blueprint viewport using the default Unreal Engine 4 hand mesh.


Third and last, the inverse kinematics itself. Unreal Engine allows you to set up two-bone inverse kinematics. In an animation blueprint, you can set up each finger to use two positions to apply inverse kinematics to two bones.

The biggest hurdle is setting up the effector location and the joint target location. The effector location is where the IK is trying to place the bone, while the joint target location indicates how the bones will rotate. "Joint target location" is actually a misleading name, as it is placed opposite of where you want the bones to rotate around, which took me some time to figure out.
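When the bend folds the wrong way, one way to flip it is to mirror a candidate joint target across the line from the root bone to the effector, putting it "on the opposite side" as described above. Below is a toy helper for experimenting with that placement; plain C++ with illustrative names, not part of the example project.

```cpp
#include <cmath>

struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
    double Dot(const Vec3& o) const { return x * o.x + y * o.y + z * o.z; }
    double Length() const { return std::sqrt(Dot(*this)); }
};

// Reflect `point` across the line running from `root` to `effector`,
// moving a joint target to the opposite side of the bone chain.
Vec3 MirrorAcrossChain(const Vec3& point, const Vec3& root, const Vec3& effector) {
    Vec3 axis = effector - root;
    Vec3 dir = axis * (1.0 / axis.Length());   // unit axis of the chain
    Vec3 rel = point - root;
    Vec3 proj = root + dir * rel.Dot(dir);     // closest point on the line
    return proj + proj - point;                // reflect through that point
}
```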

We use a joint to rotate the bones around, and a target to rotate the bones towards. Functions in the regular blueprint set all the values in the animation blueprint correctly. There are three functions in the example project, called in the following order.

1. SetAllFingerIK: A function that calls SetSingleFingerIK() five times (once for each finger). It’s meant to combine the code into one big function.

2. SetSingleFingerIK: Each finger ends up running three traces, so this is a collection of line traces and their results. It calculates and sets the variables in the animation blueprint.

3. LineTraceAlongArrow: Returns a hit result of a line trace based on the transform of an arrow.


This is the result of doing it on the index finger.


3. The alternatives and justification for my solution

I couldn't find any alternatives except for animating everything by hand, which is, of course, an incredibly time-consuming task. Animating the fingers dynamically was the only other viable option, and inverse kinematics does exactly that.


The reason this was so difficult was that there was very little information available about this specific problem, which meant I had to figure it out myself. In the end, I got this self-made system working.

Here is the link again to the example project if you feel like seeing it in Unreal Engine:
