Kinect Mocap Animator

Kinect Mocap Animator is a simple-to-use motion-capture tool that records user motions as FBX animation clips. It works with Kinect-v2 or Kinect-v1 (aka ‘Kinect for Xbox One’ and ‘Kinect for Xbox 360’), and optionally with a LeapMotion sensor, if finger animation is needed as well. You can run the tool directly in the Unity editor, to record animations for the humanoid models in your own projects and scenes. It saves the recorded animation to an FBX file of your choice. The recorded animation may be retargeted to other humanoid models in your Unity projects and scenes, or processed further in an external 3D modeling tool, such as Maya or 3dsMax. The tool can be used in both Unity Pro and Unity Personal editors.

Free for education:
The package is free for academic use (i.e. in schools, colleges and universities, by students, teachers or researchers). If you meet this criterion, please e-mail me and I will send you the K2-MoCap animator directly.

One request:
My only request is NOT to share the package or its scenes in source form with others without my explicit consent, regardless of whether you purchased it at the Unity Asset Store or got it from me free of charge for academic use. Please respect my work.

How to run ‘Kinect MoCap Animator’:
1. Install the Kinect for Windows SDK v2.0 or Kinect Runtime. The download link is below.
2. Download and install the Speech Platform Runtime v11 (both x86 and x64 versions). The download link is below.
3. Download and install the en-US Kinect language pack. The download link is below.
4. If finger tracking is needed, and you possess a LeapMotion-sensor, please download, unzip and install the LeapMotion Orion Setup. The download link is below.
5. Import ‘Kinect Mocap Animator’ into a new Unity project.
6. Open ‘File / Build settings’ and switch to ‘PC, Mac & Linux Standalone’, Target platform: ‘Windows’.
7. Open and run the Assets/KinectMocapAnimator-scene.

More information can be found in the Online documentation, as well as in the Readme pdf-file in the package.

* Kinect for Windows SDK 2.0 (Windows-only) can be downloaded here.
* Please mind that Kinect-v2 requires Windows 8 or later, as well as a USB 3.0 port on your machine.
* MS Speech Platform Runtime v11 (both 32- and 64-bit versions) can be downloaded here.
* Kinect for Windows SDK 2.0 language packs can be downloaded here.
* LeapMotion Orion Setup can be downloaded here.
* If you use Kinect-v1 with Kinect SDK 1.8, but have also installed the Kinect SDK 2.0, please look at this tip.

How to use the LeapMotion-sensor for finger tracking:
You can use the LeapMotion-sensor to track the user’s fingers and record the full finger motion in the animation. To do it, follow the steps here.

How to utilize the recorded animations:
Please mind that you don’t need to replace the models in the Mocap Animator with your own just to utilize the recorded animation(s). Instead, use the retargeting capabilities of Mecanim, Unity’s animation system, like this:

1. Copy the model with the recorded animation (usually Assets/Models/Recorded.fbx) to the Unity project, where you plan to utilize the animation.
2. Select the copied model in Unity editor, switch to the Rig-tab of its Import settings, and make it Humanoid.
3. Go to the Animations-tab of the Import settings now, and make the needed changes, like setting the Start & End frames, Loop time, etc. Apply the changes.
4. Open the scene where your model should play the animation, and select it in Hierarchy. Its rig in Import settings should also be Humanoid.
5. Make sure the model has an Animator-component, and that an animator controller is assigned to it. If not, create a new animator controller in Assets, and assign it to the Controller-setting of the Animator-component.
6. Open the Animator-window (menu Window / Animator).
7. Drag the recorded animation (or the entire model with animation from step 1) and drop it into the Animator window. The recorded animation should appear in the Animator’s state machine. That’s it.
8. Run the scene and check if your model plays the recorded animation correctly. This way you can retarget the recorded animation(s) to all other humanoid models in your project, without physically recording the animation clips into them.
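If you prefer to start the retargeted clip from a script rather than rely on the default state, a minimal sketch could look like the one below. The state name ‘Recorded’ is an assumption – use whatever name the animation state has in your animator controller.

```csharp
using UnityEngine;

// Minimal sketch: plays a named state of the attached Animator when the scene starts.
// Assumes the recorded clip was dropped into the controller as a state called "Recorded".
public class PlayRecordedClip : MonoBehaviour
{
    void Start()
    {
        Animator animator = GetComponent<Animator>();
        if (animator != null)
        {
            animator.Play("Recorded");  // hypothetical state name - adjust to your clip
        }
    }
}
```

Attach this to the model with the Animator-component; on play, the recorded animation starts immediately instead of waiting for a transition.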

* The online documentation of the ‘Kinect Mocap Animator’-package is available here.

* The official release of ‘Kinect Mocap Animator’-package is available at the Unity Asset Store. All updates are free of charge.

Output File Formats:
To get the output file in the format you need, change the ‘Output file format’-setting of KinectFbxRecorder (component of the KinectController game object in the scene). Don’t forget to change the extension of the saved file in the ‘Save file path’-setting, if needed. Keep in mind that in this case you may not see the results of the mo-cap properly in the scene. Here are the available output file formats:

  • FBX binary – saves fbx-file in binary format;
  • FBX ascii – saves fbx-file in text format;
  • FBX 6.0 binary – saves fbx-file v6.0 in binary format. This is the default setting;
  • FBX 6.0 ascii – saves fbx-file v6.0 in text format;

What’s new in version 1.5:
1. Added ‘Display Leap Motion camera’-setting, to turn the LM camera image in the background on or off.
2. Fixed finger tracking with the LeapMotion sensor, when the user turns left or right.
3. Updated UI canvas, to make it visible in VR mode. Distance from the camera is adjustable.
4. Updated ‘Joint distance threshold’ and ‘Joint angle threshold’-settings to 0 by default.
5. Updated the package to utilize the latest KinectScripts – v2.14.


77 thoughts on “Kinect Mocap Animator”

  1. Pingback: Motion Capture with Kinect-v2 Mocap Animator | php Technologies

    • If you mean the Mocap Animator, it is fairly simple to use. Just see the ‘Readme-Kinect2-Mocap-Animator.pdf’ in the package. If you have any expert questions, please ask Ricardo Salazar, who posts his works here from time to time. He’s the best expert in design and animation I know so far.

  2. Please send me your email. I will buy the assets, but what software do I need to use? Are there lessons on how to actually program, using Visual Studio and the programming aspects? Can you help me please? I want to build a game or learn how to use the capture.

    • My e-mail is on the ‘About’-page. The software you need to use is Unity-3D. The output FBX is a 3D-scene exchange format, commonly used to transfer 3D geometry and animations between software packages like Unity, Maya, 3dsMax, Blender and the like.

  3. Hi
    I just bought your Mocap editor yesterday on the Asset Store. It is perfect and solved a problem I was struggling with. I saw you are advertising the updated version (1.1). Could you please suggest where I can download it? And do I need to pay for the update?

    • Hi, I’m not sure what advertising you mean. If you bought it yesterday, you should already have v1.1. You could check the version in the What’s-New pdf-file in the package. All further updates are free of charge and can be downloaded from the Unity Asset Store. This applies to the other Kinect-v2 assets, too.

    • Hi, to make HD face-model data available in the scene, you only need to enable the ‘Get face model data’-setting of the FacetrackingManager (component of the KinectController game object). But how this data will be used, exported or added to the fbx-file is up to you. There are some use-cases, demonstrated in the face-tracking-demos of the K2-asset.

      • Thanks for the tip.

        I couldn’t find face-tracking demos in either KinectScript/Samples/ or StandardAssets/Microsoft/Kinect/Face/.

        In order to add head rotation data when saving to a new FBX file, it seems that one must change KinectMocapFbx/Scripts/KinectFbxRecorder.cs::Update() or SaveAnimFrame(). Is that correct?

      • Rumen said:
        > I meant you can use the public API of FacetrackingManager.cs to get the face data. It is a component of KinectController-game object in MocapFbxAnimator-scene. But I’m not sure what you can do with this data afterwards.

        I want to record to fbx a humanoid animation that involves both head rotations and skeletal movement (e.g. shrugging while nodding no). During cleanup, I plan to mask-off the lower torso and legs. I want the fbx to have as little model data as possible, so it’s easy to map to different models and keep the Unity build as small as possible (for WebGL).

        I’ve looked at the package contents at!/content/18708 but it’s not clear that any of them show how to do what I want. Would you clarify?

      • > “I want to record to fbx a humanoid animation that involves both head rotations and skeletal movement.”

        Isn’t it the way you want it at the moment? For getting full head orientation, make sure the FacetrackingManager-component is enabled.

        > “I plan to mask-off the lower torso and legs.”

        You would need to modify the code of Update()-method in KinectFbxRecorder.cs, to save only the joints you want. The full source is there, so you’re free to optimize the saved data as you want it.

        > “I’ve looked at the package contents at!/content/18708 but it’s not clear that any of them show how to do what I want. Would you clarify?”

        Forget about it. It’s not related to saving orientations to fbx animations.

      • Yes, when ‘Get Face Model Data’ is selected, I can capture some head rotations.

        However, I’ve done something more and recordings are no longer previewable. In Hierarchy > UChar0 > Inspector, I renamed UChar0 to LiveReplay, and renamed UChar1 to Recorded. When I record, UChar1.fbx is overwritten instead of creating Recorded.fbx. Do you support renaming these two game objects?

        Also, in Assets > KinectMocapFbx > Models > UChar1.fbx, the preview has no play button now. It seems the cause might be related to the Rig being set to Generic, so I set it to Humanoid / CopyFromOtherAvatar / UChar0Avatar (so I can mask-out the lower body). Do you support applying a Humanoid rig?

      • There is no need to enable ‘Get face model data’, in my opinion. This turns on the HD face-data tracking, like the face model, etc. For simple head rotations, just enabling the FacetrackingManager should suffice.

  4. Hi there, I am a student at Falmouth University, England, studying Digital Games Animation. I have been using your software to have a quick play with my Kinect sensor, and I am really impressed with the results. I am having great fun, and was wondering if there is any info on changing the character in the scene to one that I have made myself? A tutorial or a written guide would be awesome. Many thanks.

  5. Sorry, I just found this in the Readme pdf:
    > Optionally, look at the KinectFbxRecorder-component of KinectController-game object. You may customize
    its settings, if you like to change (for instance) the saved animation clip name, the name of the input or
    output fbx-files, the output file format, tracked player index, etc.

    But when I change the Load Path to Assets/KinectMocapFbx/Models/LiveReplay.fbx, the Save Path to Assets/KinectMocapFbx/Models/Recorded.fbx, and rename the fbx files in Assets to match, then after recording, Hierarchy shows UChar0 instead of LiveReplay. And the rigging of Recorded is Generic.

    • Yes, you need to do both:
      1. Change the file names in KinectFbxRecorder-settings.
      2. Rename the respective fbx-files in the specified Assets folder.
      3. Optionally rename the animator controller for recorded file in the assets-folder, too.
      4. Run the scene, and you will get the object LiveReplay(clone) in Hierarchy.
      5. The rigging of Recorded after mocap is set to Generic, because the file gets overwritten, but you can duplicate the recorded file in Assets, and change its rigging to Humanoid.

      • For step 4, Hierarchy shows Recorded and UChar0, which should be LiveReplay.

        To reduce the size of the fbx as much as possible to just the animations, I’ve set Model > Import Materials = off. Can you suggest a way to exclude the mesh also?

        When I change the rigging (on a duplicate file) to Humanoid/CreateFromThisModel, and then try to mask-off the lower body, I have to click Fix Mask. After masking-off and clicking Apply, the Fix Mask button reappears and the masking is gone. (FWIW, when I try to create cropped clips, that also fails.)

      • These are model specific questions that have little to do with the Mocap Animator. Please see the respective Unity documentation. By the way, is there anything that succeeds, just for a change?!

      • I’d like to help make this plugin as easy to use as possible, then give it a very positive review. I can’t answer yet whether it’s working because my fbx isn’t yet in a usable state, although much of it seems to be on the right track.

        As a tool for creating animations, this plugin would be easier to use if:

        1) The output fbx contained the captured animation clip for a Humanoid avatar and as little else as possible

        2) The game objects were labelled by their purpose, e.g. LiveReplay and Recorded

        These changes would greatly reduce the how-to steps I’m assembling to share with others. Today I’m going to explore starting with a minimal Humanoid model and rigging to see if that results in an FBX that allows masking and sub-clips; if it works, my how-to steps will be longer yet.

        I’m not saying this to criticize your work but to suggest that we can work together to help others.

      • Thank you for the explanation! Of course you are right. And sorry, maybe I was too harsh. I have to answer a dozen questions each day, and sometimes I get tired of solving all kinds of issues without hearing a single good word for months. My other problem is that I’m not a model designer, and just used the model I had at hand. My goal was to provide a simple, easy-to-use and affordable solution for capturing humanoid animations. If you have a simpler model that is not license protected, please send it over to me by e-mail and I may use it in future Animator releases. You could also explain to me in the e-mail (if needed, with some screenshots) what exactly happened, and what you expected instead. I recently added an option for specific joints not to be saved to the animation clip, as well as saving animation frames at a specific fps-rate. This release is not yet published, but I can share it with you, if you like.

  6. Hello! I want to record some sitting animations. Can this mocap animator be set to capture the motion even when I sit in front of a desk, or does it just capture motions from a fixed distance?

    Also, I have a question about head orientation. Although you answered this before, I can’t get what you mean by “For simple head rotations, just enabling FacetrackingManager should suffice”. Can you explain it in more detail? For example, should I change any parameters in FacetrackingManager or elsewhere?

    Thank you for your assistance!

    • You can mo-cap whatever motions are detected by the sensor. You can see what gets detected on the avatar on the left. What you see is what you will get in the recording, too. So, my advice is to experiment a bit.

      You can also disable recording of specific joints. This is a list-setting of KinectFbxRecorder-component. This way you could exclude leg joints from the animation recording, which would be desirable in your case, I think.

      Regarding the head rotations: As you can see (again, on the avatar on the left), KinectManager captures head rotations around the X and Z axes, but cannot detect rotations around Y (i.e. around the neck-head bone). When you enable the FacetrackingManager (you don’t need to change any of its parameters, just enable the FTM-component of KinectController), the rotations around all axes get detected, hence they can be recorded. The FTM-component is disabled by default, because the face-tracking subsystem of Kinect SDK 2.0 has caused crashes of the Unity editor in the past, on some specific machines. Hope this helps you understand.

      • > You can also disable recording of specific joints. This is a list-setting of KinectFbxRecorder-component.

        This would be a good way to do masking. However, I can’t find this list in v1.2 under Hierarchy tab > KinectController > Inspector > Kinect Fbx Recorder, nor under Kinect Fbx Controller or Kinect Manager.

  7. First, thanks for this great work. I purchased it on the Asset Store, but I think I don’t know how to use it. The problem is that when I start the scene, I have a grey square on the right side, but I can’t see anything of my body. I see the characters (T-Pose and Walking), but I don’t know what I need to do… My Unity version is 5.5.2 and I installed everything that is described in the tutorial. Can you help me?

    • According to your description, I suppose the Kinect sensor is not working, for whatever reason. Please start ‘SDK Browser 2.0’ (part of the Kinect SDK 2.0 you should have installed), and run ‘Kinect Configuration Verifier’ to check if the sensor is running and its data is received correctly.

      Otherwise, see the Readme-file in the package, on how to use the Mocap animator. Generally, you can start/stop animation recording with voice commands or key presses, and then retarget the saved animations to the characters in your Unity projects, or edit them in external 3d tools. I’ll post more info regarding the Mocap animator on the k2docs-online documentation soon.

  8. Dear Rumen,

    Thank for your excellent work for the motion capture plugin!

    When I utilized the Mocap plugin, I ran into a problem. When I recorded a squatting motion, the avatar’s hip remained fixed, so the feet were above the ground, which looks very weird.

    Could you give me some suggestions to solve this issue?

    Thank you!

    • I suppose you mean the recorded avatar. Anyway, I think the root motion is saved into the recorded animation, so when you retarget the animation to a humanoid model in your real scene, and enable the ‘Apply root motion’-setting of its Animator-component, the squat should look OK. This may be seen in the animation preview window as well, when you select the recorded model in Assets/Models-folder.

      • Thank you for your suggestion. When I retargeted the animation to another avatar, the squatting motion looked more natural. One more question: I enabled the KinectRecorderPlayer script attached to the KinectController object. I want to check the data in the file (BodyRecording.txt), however I can’t find the file in the Assets. Is this txt file generated when I finish the recording? Where can I find it? Is the data in the file the joint position data?

        Thank you!

  9. Dear Rumen

    I love your asset and it works very well.
    This time I am trying to get the quaternion of each joint and display them live on the game scene.
    So I am wondering if I could get the data from the kinect and use that data to check whether the user is doing the right motion.

    Will it be possible and will there be a way to do this?

    Thank you!

    • Calling KinectManager.Instance.GetJointOrientation() should do the job. The last parameter ‘flip’ is the opposite of ‘mirrored movement’, i.e. should be ‘true’ for normal facing avatar and ‘false’ for mirrored one.

      • long userId = KinectManager.SetPrimaryUserID();
        /// User ID
        /// Joint index
        /// If set to true, this means non-mirrored rotation
        KinectManager.Instance.GetJointOrientation(userId, (int)KinectInterop.JointType.Head);

        Is this the right way?
        I keep getting compile errors when I play the scene.

        Thank you.

      • As far as I see, the last parameter of GetJointOrientation()-method invocation is missing in the code above. Maybe this causes the compile errors. I think the last parameter (flip) should be true in your case.
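A corrected sketch of the call from the snippet above might look like the following. The method names GetPrimaryUserID() and IsJointTracked() are assumptions based on the K2-asset’s public API and should be verified against KinectScripts/KinectManager.cs; the ‘flip’ value follows the reply above.

```csharp
// Hedged sketch: read the head-joint orientation of the primary user.
KinectManager manager = KinectManager.Instance;
if (manager != null && manager.IsUserDetected())
{
    long userId = manager.GetPrimaryUserID();
    int jointIndex = (int)KinectInterop.JointType.Head;

    if (manager.IsJointTracked(userId, jointIndex))
    {
        // flip = true for a normal (non-mirrored) avatar, per the reply above
        Quaternion headRotation = manager.GetJointOrientation(userId, jointIndex, true);
        Debug.Log("Head rotation: " + headRotation.eulerAngles);
    }
}
```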

  10. This was our original code that we were going to use:

    using UnityEngine;
    using System.Collections;
    using UnityEngine.UI;

    public class CheckBodyAngle : MonoBehaviour
    {
        public GameObject objJoint;
        string strState = "Ready";
        string strResult = "Check";
        public Text textState;
        public Text textRotate;

        void OnGUI()
        {
            GUI.Label(new Rect(0, 150, Screen.width, 50), strState);
            GUI.Label(new Rect(0, 200, Screen.width, 50), strResult);
        }

        void Update()
        {
            if (objJoint != null)
            {
                // alternatively: transform.rotation, localEulerAngles or eulerAngles
                strState = objJoint.transform.localRotation.ToString();
                textRotate.text = strState;

                // flag the pose when the joint's local X-rotation is within the danger range
                if (objJoint.transform.localRotation.x > 0.6f && objJoint.transform.localRotation.x < 0.7f)
                {
                    strResult = "Danger";
                }
                else
                {
                    strResult = "Check";
                }

                textState.text = strResult;
            }
        }
    }

    But in your asset, I can't find the avatar's joint to drag to the Inspector window.

    Is there any way to solve this?

    Thank you.

    • Do you mean the Mocap animator or a scene from the K2-asset? In the Mocap animator, the LiveReplay-avatar is instantiated at runtime, so you should assign the objJoint-reference at runtime, too.
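A sketch of assigning the joint reference at runtime might look like the one below. The object name ‘LiveReplay(Clone)’ and the bone name ‘Head’ are assumptions – check the actual names in the Hierarchy while the scene is running.

```csharp
using UnityEngine;

// Hedged sketch: finds a joint transform on the runtime-instantiated avatar
// and assigns it to the CheckBodyAngle script from the comment above.
public class AssignJointAtRuntime : MonoBehaviour
{
    public CheckBodyAngle checkBodyAngle;

    void Update()
    {
        if (checkBodyAngle != null && checkBodyAngle.objJoint == null)
        {
            // hypothetical object name - verify it in the Hierarchy at runtime
            GameObject avatar = GameObject.Find("LiveReplay(Clone)");
            if (avatar != null)
            {
                Transform joint = FindDeepChild(avatar.transform, "Head");
                if (joint != null)
                {
                    checkBodyAngle.objJoint = joint.gameObject;
                }
            }
        }
    }

    // Searches the whole child hierarchy for a transform with the given name
    static Transform FindDeepChild(Transform parent, string name)
    {
        foreach (Transform child in parent.GetComponentsInChildren<Transform>())
        {
            if (child.name == name)
                return child;
        }
        return null;
    }
}
```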

  11. Dear Rumen,

    After I enabled the KinectRecorderPlayer script attached on the KinectController game object, I can’t find the BodyRecording.txt file. I want to check the data of the file. How can I get the file? Could you give me some hints?


    • Sorry, I missed answering this question before. I personally use the KinectRecorderPlayer-component to test animation recording. The input text file should be present in the project folder. By default it is called ‘BodyRecording.txt’, and contains the body movements of all available users. This file is normally saved by the KinectRecorderDemo-scene of the K2-asset. But you can also use the KinectRecorderPlayer-component in the mocap animator to save one. To do it, create a script with a reference to the KinectRecorderPlayer-component, and call its StartRecording()-method to start file recording and StopRecordingOrPlaying() to stop it. Then you will have a file to replay in the mocap animator.
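The steps above can be sketched like this. StartRecording() and StopRecordingOrPlaying() are the methods named in the reply; verify their exact signatures in KinectScripts/KinectRecorderPlayer.cs. The key bindings are illustrative.

```csharp
using UnityEngine;

// Hedged sketch: drives KinectRecorderPlayer from key presses,
// so a BodyRecording.txt file gets saved for later replay.
public class BodyRecordingControl : MonoBehaviour
{
    public KinectRecorderPlayer recorderPlayer;  // assign in the Inspector

    void Update()
    {
        if (recorderPlayer == null)
            return;

        if (Input.GetKeyDown(KeyCode.R))
        {
            recorderPlayer.StartRecording();          // begins writing the file
        }
        else if (Input.GetKeyDown(KeyCode.S))
        {
            recorderPlayer.StopRecordingOrPlaying();  // stops and closes the file
        }
    }
}
```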

  12. Thanks to your help, I solved the problem.
    Now, I am wondering if I could give a range to the rotation angle of each joint.
    So if the user gets into the range, the word “Check” will appear.

    Thank you.

  13. Dear Rumen,

    After I got the recorded motion using the MoCap asset, I wanted to edit the animation. In Unity 5, I found the animation under Assets/Models. Actually I DON’T want the animation in looping mode, so I unchecked “Loop Time” and then clicked “Apply”. However, after I clicked the “Apply” button, “Loop Time” got automatically checked again. So in my scene, the avatar still plays the animation in loop mode. It seems that I can’t uncheck the “Loop Time” setting. Could you tell me how to cancel the loop setting?

    Thanks a lot!

    • Yes, this is so by design. The looping animation in the scene is only there to demonstrate what has been recorded. It is not always possible to see the animation in full, if it plays only once.
      Actually, you don’t need this demo-animation. Instead, copy the model with the recorded animation to your project, and then retarget the animation to the humanoid model in your scene. To do it, follow the steps in the ‘How to utilize the recorded animations’-section above.

  14. Dear Rumen,

    I utilized the MoCap asset to record the motion and record the position data at the same time. I found the avatar’s motion is much slower than my real motion, especially when the real motion is a little bit fast. And when I checked the data recorded at the same time, only 6 frames of data were recorded per second. I guess the low frequency caused the huge delay.

    But in comparison, in the avatar demo of your K2-asset the avatar can follow the real motion well; there is no obvious delay. How do I deal with the delay and increase the data rate in the MoCap asset? Six frames of data per second is too little.

    Thank you for your help

    • Hi, please set both ‘Joint distance threshold’ and ‘Joint angle threshold’-settings of KinectFbxRecorder (component of KinectController game object) to 0, and then try again.

  15. Hello Everyone:

    Is it possible in Maya or Modo v10.4 to save the animations
    in the bvh file format?

    Thank you,


  16. Hello Rumen,

    is it possible to get your animator for free, to try some things out with mocap? I just want to get some little shots for my project (around 3) and don’t want to spend money on those few shots.

    Thank you in advance

    Yours Faithfully

    • Hello Simon, are you a student, lecturer or researcher? If you are, you are eligible to get any of the packages here free of charge for academic use. Just send me your e-mail request from your university e-mail address, or provide some other proof you are really studying there.

  17. Dear Rumen,

    when I utilize MoCap, I found that if I don’t check “Grounded Feet” in the “Kinect Fbx Recorder” script, the squat motion is very strange. I mean, from standing to squat, the feet go into the ground; and from squat to standing, the feet end up above the ground. Actually, my feet didn’t move.

    I also checked the recording data, and the position data of the feet seems fine. I know “Grounded Feet” makes the feet stick to the ground. So my question is: when converting the Kinect data to an FBX animation file, why is the feet motion so strange when doing a squat?

    Thank you!

    • It is like this because, by default, the AvatarController-component (responsible for avatar movements) applies forward kinematics. It starts from the position of the waist and then applies the rotations of the different joints down the hierarchy. The feet come last. So, depending on the accuracy of user tracking and your own height and bone lengths, it is possible for the model’s feet to go into or over the ground. That’s why there is the ‘Grounded feet’-setting, which keeps the model’s feet on the ground. But keep in mind that even with the setting turned on, there may be short periods when the feet are in or above the ground. This is because there is a time threshold of 0.2 seconds, during which the feet are allowed to be off the ground (for instance, when the user jumps). If you want to lower or clear this threshold, open KinectScripts/AvatarController.cs and look for the definition of the ‘maxFootDistanceTime’-constant.
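For reference, the constant mentioned above might look roughly like this in AvatarController.cs. The exact declaration and modifiers are assumptions – only the name and the 0.2-second value come from the reply; check your copy of the script.

```csharp
// In KinectScripts/AvatarController.cs (declaration assumed):
// the time, in seconds, the feet may be off the ground before
// 'Grounded feet' correction kicks in. Lower it for stricter grounding.
protected float maxFootDistanceTime = 0.2f;
```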
